I have an image of size 160x175 pixels. I loaded it into a byte array with this code:
var file = await StorageFile.GetFileFromApplicationUriAsync(
    new Uri("ms-appx:///Assets/Image.png"));
using (var stream = await file.OpenAsync(FileAccessMode.Read))
{
    var decoder = await BitmapDecoder.CreateAsync(stream);
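    // pixelWidth / pixelHeight hold the target dimensions (160 x 175 here)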
    var myTransform = new BitmapTransform
    {
        ScaledHeight = (uint)pixelHeight,
        ScaledWidth = (uint)pixelWidth
    };
    var pixels = await decoder.GetPixelDataAsync(
        BitmapPixelFormat.Rgba8,
        BitmapAlphaMode.Straight,
        myTransform,
        ExifOrientationMode.IgnoreExifOrientation,
        ColorManagementMode.DoNotColorManage);
    var bytes = pixels.DetachPixelData();
}
Now I want to check each pixel to see whether it is transparent or not. My byte array has a length of 112000, which I assume is 160px * 175px * 4 bytes (RGBA). So is it right that every 4th byte in the byte array is the transparency (alpha) byte of the pixel, and that when its value is 255 the pixel is transparent?
And how are the pixels ordered in the byte array? Horizontally or vertically?
I can't find anything on MSDN about how a byte array of pixels is laid out.
Bitmap data is usually written as scanlines, i.e. one horizontal row of pixels at a time. Each row is followed by the next, moving down (or up) through the image.
Each scanline holds Width pixels, but for efficient access there is usually padding at the end of each scanline to bring the data to a processor-friendly memory alignment. However, 32-bit pixel data usually does not need this padding, so the next scanline follows immediately in the next byte.
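With no padding, the byte offset of any pixel follows directly from its coordinates. A minimal sketch in C#, assuming tightly packed 4-byte RGBA pixels:

// Byte offset of the pixel at (x, y) in tightly packed 32-bit RGBA data.
// The channels follow in the order R, G, B, A, so alpha sits at offset + 3.
static int PixelOffset(int x, int y, int width)
{
    const int bytesPerPixel = 4; // Rgba8: one byte each for R, G, B, A
    return (y * width + x) * bytesPerPixel;
}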
So for the conditions you describe, it is most likely that every 4th byte (the bytes at indices 3, 7, 11, ...) is the alpha value. Note that an alpha value of 0 indicates full transparency; 255 means fully opaque.
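As a rough sketch of how you could scan for transparency (the names bytes, width and height stand in for your own variables):

static bool HasTransparentPixels(byte[] bytes, int width, int height)
{
    // Every 4th byte, starting at index 3, is the alpha channel.
    // With BitmapAlphaMode.Straight: 0 = fully transparent, 255 = fully opaque.
    for (int y = 0; y < height; y++)
    {
        for (int x = 0; x < width; x++)
        {
            byte alpha = bytes[(y * width + x) * 4 + 3];
            if (alpha < 255)
            {
                return true; // found a (partially) transparent pixel
            }
        }
    }
    return false; // every pixel is fully opaque
}

For your 160x175 image, width is 160 and height is 175, so the loop visits exactly 112000 / 4 = 28000 pixels.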