I'm trying to get the size of a generic .bmp file. However, when the image is taller than 127 pixels, the bytes become weird. (I'm opening the images in a text editor and pasting the contents into an online character-to-code converter, because in this situation my C# program just prints a "0".) Here are the relevant bytes for a 127-pixel-tall image:
127 032 032 032
Clearly, the image is 127 pixels tall.
Here are the relevant bytes for a 128-pixel-tall image:
226 130 172 032 032 032
What happened here? If 128 is the limit, shouldn't it just roll over to "0 1 032 032"? My thinking was that the stored value would be something like 226*(base^0) + 130*(base^1) + 172*(base^2), but that would be at least 226, which seems far too large. Thank you for reading.
Edit: I checked the hex values instead (with Notepad++), and the number looks correct there. So why does it get mangled when converted to an integer? (For that I use what I believe C# calls a cast: "(int)".)
// Requires: using System; using System.IO;
string line;
try
{
    StreamReader sr = new StreamReader("E:\\Test15.bmp");
    line = sr.ReadLine();
    while (line != null)
    {
        for (int i = 0; i < line.Length; i++)
        {
            // Print a label when i reaches the offset of each BMP header field.
            if (i == 0)
            {
                Console.WriteLine("Header");
                Console.Write("\tSignature: ");
            }
            if (i == 2)
            {
                Console.WriteLine();
                Console.Write("\tFile size: ");
            }
            if (i == 6)
            {
                Console.WriteLine();
                Console.Write("\tReserved: ");
            }
            if (i == 10)
            {
                Console.WriteLine();
                Console.Write("\tData offset: ");
            }
            if (i == 14)
            {
                Console.WriteLine();
                Console.WriteLine();
                Console.WriteLine("Info header:");
                Console.Write("\tHeader size: ");
            }
            if (i == 18)
            {
                Console.WriteLine();
                Console.Write("\tImage width: ");
            }
            if (i == 22)
            {
                Console.WriteLine();
                Console.Write("\tImage height: ");
            }
            if (i == 26)
            {
                Console.WriteLine();
                Console.Write("\tNumber of color planes: ");
            }
            if (i == 28)
            {
                Console.WriteLine();
                Console.Write("\tBits per pixel: ");
            }
            if (i == 30)
            {
                Console.WriteLine();
                Console.Write("\tCompression: ");
            }
            if (i == 34)
            {
                Console.WriteLine();
                Console.Write("\tImage size: ");
            }
            if (i == 38)
            {
                Console.WriteLine();
                Console.Write("\tX resolution: ");
            }
            if (i == 42)
            {
                Console.WriteLine();
                Console.Write("\tY resolution: ");
            }
            if (i == 46)
            {
                Console.WriteLine();
                Console.Write("\tPixels per meter: ");
            }
            if (i == 50)
            {
                Console.WriteLine();
                Console.Write("\tNumber of colors: ");
            }
            if (i == 54)
            {
                Console.WriteLine();
                Console.WriteLine();
                Console.WriteLine("----------------------------------------------------------------------------------");
            }
            if (i <= 53)
            {
                // Header bytes. 65533 is U+FFFD, the Unicode replacement
                // character that StreamReader substitutes for bytes it
                // cannot decode as text.
                if ((int)line[i] == 65533)
                {
                    Console.Write("0 ");
                }
                else
                {
                    Console.Write((int)line[i] + " ");
                }
            }
            else
            {
                // Pixel data: print bytes i and i+2 (byte i+1 is skipped),
                // padding single-digit values with a space, then advance to i+3.
                if (i + 2 >= line.Length)
                {
                    break; // avoid indexing past the end of the line
                }
                if ((int)line[i] < 10)
                {
                    Console.Write(" ");
                }
                Console.Write((int)line[i]);
                Console.Write(" ");
                if ((int)line[i + 2] < 10)
                {
                    Console.Write(" ");
                }
                Console.Write((int)line[i + 2]);
                Console.Write(" ");
                if ((i + 1) % 3 == 0)
                {
                    Console.WriteLine();
                }
                i = i + 2;
            }
        }
        line = sr.ReadLine(); // read the next line so the loop can terminate
    }
    sr.Close();
    Console.ReadLine();
}
catch (Exception e)
{
    Console.WriteLine("Exception: " + e.Message);
}
finally
{
    Console.WriteLine("Executing \"finally\" block.");
}
Okay, so I just tried your code with a white BMP image of size 339x103, and this is what I got:
Header
    Signature: 66 77
    File size: 0 69 0 0
    Reserved: 0 0 0 0
    Data offset: 118 0 0 0

Info header:
    Header size: 40 0 0 0
    Image width: 83 1 0 0
    Image height: 103 0 0 0
    Number of color planes: 1 0
    Bits per pixel: 4 0
    Compression: 0 0 0 0
    Image size: 52 69 0 0
    X resolution: 0 0 0 0
    Y resolution: 0 0 0 0
    Pixels per meter: 0 0 0 0
    Number of colors: 0 0 0 0
Considering that "All of the integer values are stored in little-endian format (i.e. least-significant byte first)", this line:
Image width: 83 1 0 0
should be read: 83 + 1*256 + 0*256^2 + 0*256^3 = 339
which is the expected width...
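For illustration, here is the same reconstruction written out in C# (a minimal sketch; widthBytes just holds the four values printed above):

byte[] widthBytes = { 83, 1, 0, 0 }; // least-significant byte first
int width = widthBytes[0]
          + widthBytes[1] * 256
          + widthBytes[2] * 256 * 256
          + widthBytes[3] * 256 * 256 * 256;
Console.WriteLine(width); // prints 339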
However, there is indeed a problem when a byte's value is between 128 and 255: such a byte is not valid on its own in the default UTF-8 encoding, so StreamReader replaces it with the Unicode replacement character U+FFFD, and (int)line[i] resolves to 65533. I suggest reading bytes from the file instead of lines in order to get rid of this problem; see File.ReadAllBytes.
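For example, here is a minimal sketch of that approach (assuming a little-endian platform such as x86/x64, where BitConverter's byte order matches the BMP format; the path and field offsets are the ones from your code):

// Requires: using System; using System.IO;
byte[] data = File.ReadAllBytes("E:\\Test15.bmp"); // raw bytes, no text decoding

// Offsets match the labels in your code: file size at 2,
// width at 18, height at 22, bits per pixel at 28.
int fileSize = BitConverter.ToInt32(data, 2);
int width = BitConverter.ToInt32(data, 18);
int height = BitConverter.ToInt32(data, 22);
short bitsPerPixel = BitConverter.ToInt16(data, 28);

Console.WriteLine("File size: " + fileSize);
Console.WriteLine("Width: " + width + ", height: " + height);
Console.WriteLine("Bits per pixel: " + bitsPerPixel);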