I'm trying to read and modify a png image using ImageIO, and read it as a drawable resource in Android.
Now the java code looks like this:
BufferedImage im = ImageIO.read(new File(inFilePath));
WritableRaster raster = im.getRaster();
DataBufferByte buffer = (DataBufferByte) raster.getDataBuffer();
byte[] imBytes = buffer.getData();
This gives me a byte array of about 140 KB for the sample file. Next I try to read it on Android:
Drawable drawable = context.getResources().getDrawable(getImageResourceIdWithKeys());
Bitmap bitmap = ((BitmapDrawable) drawable).getBitmap();
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream); // quality must be 0-100; it is ignored for PNG, which is lossless
byte[] imBytes = stream.toByteArray();
This gives me a byte array of about 52 KB; reading the file as a raw resource gives about 24 KB, which is the size on disk. I understand that the PNG file is compressed on disk, but I can't understand why I get different byte arrays for the uncompressed version. Is there some way to read the image the same way on Android and with the standard Java API?
You are comparing apples and oranges here.
In your first (ImageIO/BufferedImage) example, imBytes is the uncompressed pixel data (probably in RGB or ARGB format, but it depends on the input).
In your second (Android/Bitmap) example, imBytes is PNG-compressed data, compressed by Android using default settings. After reading/decoding on Android, the image data is probably expanded to ARGB, so your output will be an (A)RGB PNG. This may require more bytes than your original input, especially if the original on disk was in palette format.
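You can see this effect with Java2D alone: encoding the same picture from a palette-based image and from an ARGB image usually yields PNGs of different sizes, even though the decoded pixels are identical. A minimal sketch (the exact byte counts depend on the encoder, so don't expect specific numbers):

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import javax.imageio.ImageIO;

public class PngSizeDemo {
    static byte[] encodePng(BufferedImage img) throws Exception {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(img, "png", out);
        return out.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        // Same picture, two pixel layouts: indexed (palette) vs. packed ARGB
        BufferedImage indexed = new BufferedImage(100, 100, BufferedImage.TYPE_BYTE_INDEXED);
        BufferedImage argb    = new BufferedImage(100, 100, BufferedImage.TYPE_INT_ARGB);
        for (BufferedImage img : new BufferedImage[] { indexed, argb }) {
            Graphics2D g = img.createGraphics();
            g.setColor(Color.RED);
            g.fillRect(0, 0, 100, 100);
            g.dispose();
        }

        byte[] indexedPng = encodePng(indexed);
        byte[] argbPng    = encodePng(argb);
        // The encoded sizes typically differ even though the pixels are the same
        System.out.println("indexed: " + indexedPng.length + " bytes, argb: " + argbPng.length + " bytes");

        // Decoding both and comparing ARGB values shows the image content is identical
        BufferedImage backIndexed = ImageIO.read(new ByteArrayInputStream(indexedPng));
        BufferedImage backArgb    = ImageIO.read(new ByteArrayInputStream(argbPng));
        boolean samePixels = true;
        for (int y = 0; y < 100 && samePixels; y++)
            for (int x = 0; x < 100 && samePixels; x++)
                samePixels = backIndexed.getRGB(x, y) == backArgb.getRGB(x, y);
        System.out.println("pixels match: " + samePixels);
    }
}
```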
Java2D and the Android platform have different graphics primitives, so I'm not sure how you would satisfy "read the image in the same way"... Anyway, BitmapFactory is probably the closest you get to an equivalent of ImageIO on Android. In the same way, Bitmap is roughly equivalent to BufferedImage.
I.e., ImageIO/Java2D:
BufferedImage im = ImageIO.read(new File(inFilePath));
int[] argbPixels = new int[im.getWidth() * im.getHeight()]; // Could also let getRGB allocate this array, but doing it like this to make it more similar to Android
im.getRGB(0, 0, im.getWidth(), im.getHeight(), argbPixels, 0, im.getWidth());
Android:
Bitmap bm = BitmapFactory.decodeFile(inFilePath);
int[] argbPixels = new int[bm.getWidth() * bm.getHeight()];
bm.getPixels(argbPixels, 0, bm.getWidth(), 0, 0, bm.getWidth(), bm.getHeight());
Not sure about Android, but at least Java2D will return ARGB values in sRGB color space. If Android does the same, the argbPixels
arrays should be identical.
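If you want to relate the packed int[] from getRGB/getPixels back to a per-channel byte layout like the one DataBufferByte exposes, the packed format is 0xAARRGGBB (note that the byte order inside the raster depends on the BufferedImage type; e.g. TYPE_4BYTE_ABGR stores bytes in ABGR order). A small sketch of unpacking one pixel:

```java
public class ArgbUnpack {
    public static void main(String[] args) {
        int argb = 0x80204060; // alpha=0x80, red=0x20, green=0x40, blue=0x60

        // Shift each channel down and mask off the low 8 bits
        int a = (argb >>> 24) & 0xFF; // 0x80
        int r = (argb >>> 16) & 0xFF; // 0x20
        int g = (argb >>> 8)  & 0xFF; // 0x40
        int b =  argb         & 0xFF; // 0x60

        System.out.printf("a=%02x r=%02x g=%02x b=%02x%n", a, r, g, b);
        // prints: a=80 r=20 g=40 b=60
    }
}
```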