quantize() reduces to far fewer colors than requested in some cases #5204
Comments
The reason why an identical RGB works fine is because […]. If you change your code to do […].

This looks like a limitation in the FastOctree implementation. As far as I can tell it has very little to do with the octree algorithm described on Wikipedia. Your example image really has only 4 RGB colors: red, green, blue and black, where the red, green and blue each occur with multiple alpha values. The best the FastOctree implementation can do is subdivide those alpha values into 8 buckets. So the result is 8 reds + 8 greens + 8 blues + 1 transparent black = 25.

I don't think this is going to be easy to fix. It would be possible to increase the number of buckets, but that is going to affect performance pretty badly before you get near 256 colors. I don't think that would be acceptable for a general fix, so it would probably mean implementing a new quantization method.

If you just want to fix your own problem, you could build Pillow yourself and fiddle with the first 4 numbers in Pillow/src/libImaging/QuantOctree.c, line 377 in c377d8c.
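The 8-bucket arithmetic above can be sketched in plain Python. This is a simplified model of the behavior described, not the actual QuantOctree.c logic: it assumes that, for an image with only a handful of distinct RGB values, the fast octree can effectively distinguish colors only by the top 3 bits of alpha (8 buckets per hue). The synthetic color list (red, green and blue at every nonzero alpha, plus transparent black) is illustrative, not the reporter's actual image:

```python
# Simplified model of why FastOctree collapses such an image to 25 colors.
# Assumption: with only 4 distinct RGB values present, colors can only be
# separated by the top 3 bits of their alpha channel (8 buckets per hue).

def alpha_bucket(color):
    r, g, b, a = color
    return (r, g, b, a >> 5)  # a >> 5 keeps the top 3 bits of alpha

# Red, green and blue at every nonzero alpha, plus fully transparent black.
colors = [(0, 0, 0, 0)]
for hue in [(255, 0, 0), (0, 255, 0), (0, 0, 255)]:
    colors += [hue + (a,) for a in range(1, 256)]

buckets = {alpha_bucket(c) for c in colors}
print(len(colors), "->", len(buckets))  # prints: 766 -> 25
```

However many distinct alphas the input has, each hue can land in at most 8 buckets, so the output tops out at 3 × 8 + 1 = 25 colors.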
You could also try building Pillow with […].
What did you do?
This image, named test-in.png, has 508 distinct colors according to getcolors(). I tried to quantize it to 256 colors with the following code.
What did you expect to happen?
Reduced to 256 colors, i.e. prints
colors 508 -> 256
What actually happened?
Reduced to only 25 colors, i.e. prints
colors 508 -> 25
What are your OS, Python and Pillow versions?
Other remarks
I'm guessing it's something about the image having only a few distinct RGB values but with a wide range of alpha values. A test image with over 256 distinct RGB values had no such issue.
Why don't I use pngquant or imagemagick instead? I depend on Pillow because I need a quantization method that preserves the colors of fully transparent pixels. (I'm modding a game whose texture filtering causes the colors of fully-transparent pixels to be faintly but noticeably visible around the edges of the fully-transparent regions.)
Edit: Using im.convert("P") on an image that already has 256 or fewer colors (according to len(im.getcolors())) can cause it to lose more colors unnecessarily during the conversion.