Simplification of the colors of a picture

No, the image can be arbitrary, of any size, with up to 256×256×256 (about 16.7 million) colors.
I understand what I wrote in my first post: I want to reduce the number of colors in an image to those of a defined palette. For each pixel we look for the closest color in the palette. But for certain colors, like a very light sky blue, the simplification gives us white.

And that may well be correct, just as a very, very dark bluish gray, almost black, will end up plain black.

Yes. It will.
So you either need to increase the number of colors,
or dither,
or accept that the nearest color to very pale blue is white.
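For reference, the mapping being described can be sketched in a few lines (Python here purely for illustration; the palette is made up, and the distance is plain Euclidean in RGB):

```python
# Hypothetical 4-entry palette for the sketch.
PALETTE = [
    (0, 0, 0),        # black
    (255, 255, 255),  # white
    (0, 0, 255),      # blue
    (135, 206, 235),  # sky blue
]

def nearest(color, palette=PALETTE):
    """Return the palette entry with the smallest squared RGB distance."""
    r, g, b = color
    return min(palette, key=lambda p: (p[0]-r)**2 + (p[1]-g)**2 + (p[2]-b)**2)

# A very pale blue maps to white unless the palette has a pale-blue entry:
print(nearest((230, 240, 255), [(0, 0, 0), (255, 255, 255), (0, 0, 255)]))
# -> (255, 255, 255)
```

This is exactly why the result looks "wrong": the nearest-color rule is doing what it was told, and the fix is in the palette, not in the search.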

There is no ‘solution’ to your question, because what you do, is the cause of what you see.

The palette shown in the first post contains quite a few near-duplicates, which suggests that a linear approach is not optimal. I would experiment with some non-linear mappings to try to resolve this.

Doing this leaves 192 as 192. I don’t understand how you’re getting a different value.

Using a fixed linear palette does a quick-and-dirty job (you save just the mapped pixels and discard the palette, since it is always the same), but for optimal results one must use adaptive palettes: color quantization algorithms that analyze the image and extract the optimal palette for it. Then we need both pieces of information, the extracted palette and the pixels mapped to it. Octree quantization and NeuQuant come to mind, but with them comes complexity too…
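Octree and NeuQuant are indeed involved; to get a feel for what adaptive palette extraction does, here is a sketch of a much simpler relative, median cut (Python, illustrative only, operating on a flat list of RGB tuples):

```python
def median_cut(pixels, n_colors):
    """Classic median-cut sketch: repeatedly split the box with the widest
    channel range at its median, then average each box into a palette entry."""
    boxes = [list(pixels)]
    while len(boxes) < n_colors:
        # Pick the box whose widest channel range is largest.
        box = max(boxes, key=lambda b: max(
            max(c[i] for c in b) - min(c[i] for c in b) for i in range(3)))
        if len(box) < 2:
            break  # cannot split further
        # Split that box along its widest channel, at the median.
        ch = max(range(3), key=lambda i:
                 max(c[i] for c in box) - min(c[i] for c in box))
        box.sort(key=lambda c: c[ch])
        mid = len(box) // 2
        boxes.remove(box)
        boxes += [box[:mid], box[mid:]]
    # One averaged color per box.
    return [tuple(sum(c[i] for c in b) // len(b) for i in range(3)) for b in boxes]
```

It is far from NeuQuant's quality, but it already adapts the palette to the image's actual color distribution instead of imposing a fixed grid.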

I agree that there is no palette that’s perfect for all image types, but I’m sure that there would be a single palette that’s far better than the one shown in the OP. As a first step, I’d read up on Gamma Correction, and try a gamma correction function to generate a more suitable reduced palette.

The “Color Quantization” article is useful, but it barely touches on how humans perceive colour and brightness. It seems to suggest that the quantization algorithms are being applied after gamma correction has already been done. So, gamma correction is still important.

Gamma Correction tries to account for the difference between the luminance the human eye perceives (non-linear) and what the camera captures (linear); that causes a small but subtle difference in color, and yes, that's important. Color Quantization tries to define a limited set of colors able to represent an image, discarding part of the full set of original colors and mapping all pixels to this limited set without much loss. While gamma correction is important, Color Quantization algorithms are the specific subject when it comes to producing reduced color palettes.
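As an aside on the gamma point: if distances for the nearest-color search are computed on raw sRGB values, dark tones are treated as closer together than they look. A minimal sketch (Python, using a simple 2.2 power-law approximation rather than the exact piecewise sRGB curve):

```python
def srgb_to_linear(v):
    """Decode an 8-bit sRGB channel to linear light (2.2 power-law approximation)."""
    return (v / 255.0) ** 2.2

def linear_to_srgb(x):
    """Encode linear light back to an 8-bit sRGB channel value."""
    return round(255.0 * x ** (1 / 2.2))

# Mid-gray in *linear light* is not 128 in sRGB:
print(linear_to_srgb(0.5))                  # -> 186
print(linear_to_srgb(srgb_to_linear(128)))  # round-trips back to 128
```

Converting to linear light (or a perceptual space like Lab) before measuring color distance is usually the first cheap improvement.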

Tim:
192 remains 192, but if you have, for example, 177: 177/64 = 2.766, Round(2.766) = 3, and 3 × 64 = 192.
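That calculation, as a quick sketch (Python, with the step of 64 from the example):

```python
def quantize_channel(v, step=64):
    """Round an 8-bit channel to the nearest multiple of `step`, clamped to 255."""
    return min(255, round(v / step) * step)

print(quantize_channel(192))  # 192 / 64 = 3 exactly, so 192 stays 192
print(quantize_channel(177))  # 177 / 64 = 2.766 -> rounds to 3 -> 3 * 64 = 192
```

So 177 also lands on 192: values well below a level still snap up to it, which is the same effect that sends very pale blues to white.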
Robert and Rick:
I’m studying color quantization and gamma correction, but I’m afraid the computation times may be significant.
That’s why I am trying to optimize my code using HSV mode.
The H component gives us the hue: even if the blue is very pale, it still tells us that it is blue (in RGB, the B component is at its maximum, 255).
The V component gives us the brightness: the higher it is, the lighter the color.
And finally the saturation (S) tells us that the color is even lighter when the value of S is low.
But here I have to test for all colors.
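That HSV idea can be sketched roughly like this (Python's colorsys for the conversion; the thresholds are purely hypothetical and would need tuning per image):

```python
import colorsys

def classify(r, g, b):
    """Classify a pixel by hue/saturation/value (hypothetical thresholds)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    if v < 0.15:               # very dark: call it black regardless of hue
        return "black"
    if s < 0.10 and v > 0.90:  # almost no saturation and bright: white
        return "white"
    if 0.50 <= h < 0.75:       # hue range covering cyan-blue through blue
        return "blue"
    return "other"

print(classify(200, 220, 255))  # a very pale blue still reads as "blue"
```

The appeal is that hue survives even when the color is washed out, which is exactly the information the plain RGB nearest-color search throws away.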

Rust source of NeuQuant: you, perhaps together with others, could try a translation to Xojo.

Rick,
I am not an expert, and even less so with libraries, but would it be enough to bring this external library into Xojo in order to use it? If so, I will try to get started on understanding all that.
I have tried using external methods in Delphi or machine language, but with no results so far.

Do you have the MBS plugins?

No Jeff, as much as I appreciate Christian as a person and everything he does, I have always wanted to try to create my own methods, which costs me a lot of wasted time, I know, but also gives me a lot of pleasure in creating.
I develop all my programs first in RealBasic (Xojo), for ease of writing and organizing methods, and then I translate them into Delphi for computational performance.
But I am surely wrong.

It’s a very good idea Michel.
For a non-degraded image it works very well, since we work with 256 colors, but otherwise the light pixels become white.
To give you an idea of what I am trying to do, I have attached 2 images.
The initial picture is not of good quality, but we can clearly see that the background is sky blue.
What is curious is that we can read “Dunlop” perfectly, yet when it is enlarged it becomes unreadable.

Your pictures are too small, Benoit. I don’t see the differences.

I know, Michel, but I want to start from this small image because there we can clearly see the predominant colors, and I want to simplify them into a few main colors, as our eye sees them.
However, when I open these 2 images in Paint, we can clearly see the light pixels that have become white.

What’s wrong with millions of colors anyway?
What are you actually trying to achieve?
Why do you need just a few colors?