I've written a few scripts that manipulate images and shift their colors using PIL, and I noticed that when I load an RGB image and convert it to palette mode, it uses the correct colors, but the palette it stores is padded with black entries for the color slots that aren't used.
from PIL import Image
test = Image.new('RGB', (1, 1), color='red')
test = test.convert('P', palette=Image.Palette.ADAPTIVE, colors=1)
test.save('test.png')
This creates an image of a single red pixel. Since I asked for a palette of only 1 color, I'd expect the palette to contain just one entry, red, but if you open this image in something like Aseprite, you can see that the indexed palette includes 255 extra black entries. This means black is effectively added to my color palettes for no reason.
Is there a way to remove all of these unused blacks from the color palette? Is this the correct behavior in the first place?
There's an open issue on Pillow regarding this behaviour, with a pull request that truncates the palette on quantize().
You can truncate the palette yourself:
from PIL import Image, ImagePalette
test = Image.new('RGB', (1, 1), color='red')
test = test.convert('P', palette=Image.Palette.ADAPTIVE, colors=1)
test.save('test.png') # 847 bytes, ugh
def truncate_palette(im, colors):
    # Keep only the first `colors` palette entries; each entry has
    # len(mode) channels (e.g. 3 for 'RGB'), then reassign the palette.
    mode = im.im.getpalettemode()
    palette = im.im.getpalette(mode, mode)[: colors * len(mode)]
    im.palette = ImagePalette.ImagePalette(mode, palette)
truncate_palette(test, 1)
test.save('test2.png') # 82 bytes, aw yiss