Edit 1: I've identified the two-byte hex combination that triggers this problem; see the edits at the bottom for the details and the eventual fix.
Original Post: I'm trying to pass data from a sensor to a binary file. I'm using the manufacturer's example code and DLL to grab 4096 bytes at a time and then write them to a file. My final file size varies between 4100 and 4114 bytes, and the extra bytes are randomly distributed throughout the file.
The data from the sensor ends up in an unsigned char array 4096 elements long. When I send each char to std::cout the values are correct (so the sensor and the DLL that communicates with it are working). However, writing the entire buffer to the binary file (using ofstream::write) fails, as does writing each character one at a time (using ofstream::put). In the code below I've removed the error checking for the file creation, etc.
#include <fstream>
#include <iostream>
using namespace std;

unsigned int uiread = 4096;
unsigned char ccdbuf[4096];
ofstream ofile;
/* DLL call stuff removed since it's hardware-specific */
ofile.open("camdata.bin");
// ofile.write(reinterpret_cast<const char*>(ccdbuf), uiread); // 4100 - 4114
for (unsigned int ii = 0; ii < uiread; ii++)
{
    std::cout << (int)ccdbuf[ii] << "\n"; // values print correctly
    ofile.put(ccdbuf[ii]); // 4100 - 4114
    // ofile.put(5); // 4096
}
ofile.close();
The 'ofile.write' line that's commented out was provided to me by the sensor manufacturer. The comment at the end notes that the resulting file length varies from 4100 to 4114 bytes, with the extra bytes strewn throughout the file.
The 'std::cout' line in the for loop shows me the correct values.
If I write '5' 4096 times, the file is exactly what I expect it to be (4096 bytes). But writing the char array one element at a time results in a variable-length binary record (longer than 4096 bytes) with the extra bytes at random locations.
I suspect my problem is something in the conversion from unsigned char to the type expected by ofstream::write (const char*?), but I just don't know how to work around it. Thanks in advance.
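To rule the cast in or out, here is a minimal sketch (the testcast.bin name and the constant fill are mine, purely for the experiment, not from the manufacturer's code) that pushes a known buffer through the same reinterpret_cast:

#include <fstream>

int main()
{
    unsigned char testbuf[4096];
    for (unsigned int ii = 0; ii < 4096; ii++)
        testbuf[ii] = 5; // constant fill, mirroring the put(5) test above
    std::ofstream tfile("testcast.bin"); // opened the same way as the failing code
    tfile.write(reinterpret_cast<const char*>(testbuf), 4096);
    tfile.close();
    return 0; // testcast.bin comes out exactly 4096 bytes, so the cast itself is fine
}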
Edit 1: I've identified that the two-byte sequence that initially triggers this behavior always ends in 0x0A, but the first improperly-written data is actually the first byte of the pair. So the hex equivalent of the text output sent to std::cout might be 0x890A, 0xC00A, or 0xC20A, but when the output breaks, that pair is always written out containing the sequence 0x0D 0x0A.
Looking at the bits that make up the first byte doesn't seem to reveal any pattern preceding the 0x0A, and not every two-byte pair that ends in 0x0A triggers an error. Since the ofstream::put is inside a for loop, it seems odd to me that the first error in writing would occur on the iteration before a byte of value 0x0A.
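For reference, this is roughly the kind of byte-by-byte readback comparison that surfaces the first divergence; the findFirstDivergence name and the binary-mode readback are illustrative, not part of the capture code:

#include <fstream>
#include <iostream>

// Read the file back in binary mode and report the first offset where it
// no longer matches the buffer that was written.
void findFirstDivergence(const unsigned char* buf, unsigned int len, const char* path)
{
    std::ifstream ifile(path, std::ios_base::binary); // raw bytes, no translation on read
    for (unsigned int ii = 0; ii < len; ii++)
    {
        int byte = ifile.get();
        if (!ifile || static_cast<unsigned char>(byte) != buf[ii])
        {
            std::cout << "first divergence at offset " << ii
                      << ": expected 0x" << std::hex << (int)buf[ii]
                      << ", got 0x" << byte << std::dec << "\n";
            return;
        }
    }
    std::cout << "no divergence in the first " << len << " bytes\n";
}

Called as findFirstDivergence(ccdbuf, uiread, "camdata.bin") right after the write, it points straight at the first modified byte.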
As pointed out in the comments to the question, because the output file was not explicitly opened in binary mode, Windows was modifying the writes: in text mode, every 0x0A (LF) byte is expanded to the two-byte sequence 0x0D 0x0A (CR LF). Modifying the ofstream::open call to include the binary flag has fixed the problem.
ofile.open("camdata.bin", ios_base::binary);
to replace
ofile.open("camdata.bin");