Tags: c++, bitmap, bit-manipulation, sdl, bitvector

How to diagnose bizarre behavior of saving and loading a bit vector (std::vector<bool>)?


I'm writing a one-off utility to edit a monochrome bitmap format for a game. There are 0x10000 "slots" for 8x8 monochrome sprites. I store each 8x8 sprite in eight bytes, each representing a horizontal line of on- or off-pixels.
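
For concreteness, here is a minimal sketch of how one horizontal line packs into a byte under my "leftmost pixel goes into the most significant bit" convention (the row itself is made up, it's not from any of my drawings):

#include <cassert>
#include <cstdint>

int main()
{
    // A hypothetical row with pixels lit only in columns 4 and 5 (counting from the left).
    bool row[8] = {0, 0, 0, 0, 1, 1, 0, 0};

    uint8_t byte = 0;
    for (int k = 0; k < 8; ++k)
        byte |= row[k] << (7 - k);   // leftmost pixel ends up in bit 7

    assert(byte == 0x0C);            // 0b00001100
}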

Everything was fine while I drew the characters A through Y in slots 0 through 24. They all survived a round-trip of saving and loading with exactly the same pattern of bits. But then the Z drawing, in slot 25, lost one of its horizontal lines through the round-trip. Worse still, this happens no matter which slot I put the Z in, and the missing line makes all the lines below it shift up by one! I noticed similar behavior with other patterns in slots after 25.

My code looks like it only ever examines single pixels at a time, so I am lost on how to diagnose this problem.

As far as I can tell, the problem is the deletion of 0x0C bytes. It seems unlikely to be a coincidence that 0x0C is the ASCII code for form feed (^L or '\f').

I didn't find any Google results about missing form-feed characters, so I'm guessing it's a bug in my code.
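
A minimal sketch of the round-trip with all the bitmap indexing stripped out (the probe.bin file name is just for this test) still shows the byte disappearing:

#include <cstdint>
#include <fstream>
#include <iostream>

int main()
{
    {
        std::ofstream out("probe.bin", std::ios::binary);
        uint8_t bytes[3] = {0x41, 0x0C, 0x42};
        for (uint8_t b : bytes)
            out << b;                        // same formatted output as save()
    }

    std::ifstream in("probe.bin", std::ios::binary);
    uint8_t b;
    while (in >> b)                          // same formatted input as load()
        std::cout << std::hex << int(b) << ' ';
    std::cout << std::endl;                  // prints "41 42" -- the 0x0C byte is gone
}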

Here are the saver and loader. (This is not how I write published or production code! 😳)

#include <iostream>
#include <fstream>
#include <vector>
#include <string>
#include <SDL.h>
#include <stdint.h>

static std::vector<bool> bitmap(0x400000, 0);

void save(const char *path)
{
    std::ofstream f(path, std::ios::binary);
    for (int i = 0; i < 0x10000; ++i)
    for (int j = 0; j < 8; ++j) {
        uint8_t byte = 0;
        for (int k = 0; k < 8; ++k)
            byte |= bitmap[8 * (8 * i + j) + k] << (7 - k);
        f << byte;
    }
    f.close();
    std::cout << "Wrote charmap to " << path << std::endl;
}

void load(const char *path)
{
    std::ifstream f(path, std::ios::binary);
    for (int i = 0; i < 0x10000; ++i)
    for (int j = 0; j < 8; ++j) {
        uint8_t byte;
        f >> byte;
        for (int k = 0; k < 8; ++k)
            bitmap[8 * (8 * i + j) + k] = !!(byte & (1 << (7 - k)));
    }
    f.close();
    std::cout << "Read charmap from " << path << std::endl;
}

int main(int argc, char *argv[]) { /* ... snip ... */ }

I expect 0x0C bytes to be preserved, but they are deleted. Thanks for any pointers!


Solution

  • Don't use the formatted stream operators (f << ...; and f >> ...;) when dealing with binary files, even when they are opened in binary mode. You don't want formatted input/output; you want the bytes written/read as-is. Use the ofstream::write() and ifstream::read() methods instead, e.g.:

    //f << byte;
    f.write(reinterpret_cast<char*>(&byte), sizeof(byte));
    
    //f >> byte;
    f.read(reinterpret_cast<char*>(&byte), sizeof(byte));
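
    Put together, a sketch of the corrected pair might look like this (same structure as the question's code, with the logging and explicit close() calls omitted for brevity):

    void save(const char *path)
    {
        std::ofstream f(path, std::ios::binary);
        for (int i = 0; i < 0x10000; ++i)
        for (int j = 0; j < 8; ++j) {
            uint8_t byte = 0;
            for (int k = 0; k < 8; ++k)
                byte |= bitmap[8 * (8 * i + j) + k] << (7 - k);
            f.write(reinterpret_cast<char*>(&byte), sizeof(byte));  // raw byte, no formatting
        }
    }

    void load(const char *path)
    {
        std::ifstream f(path, std::ios::binary);
        for (int i = 0; i < 0x10000; ++i)
        for (int j = 0; j < 8; ++j) {
            uint8_t byte;
            f.read(reinterpret_cast<char*>(&byte), sizeof(byte));   // raw byte; "whitespace" values like 0x0C survive
            for (int k = 0; k < 8; ++k)
                bitmap[8 * (8 * i + j) + k] = !!(byte & (1 << (7 - k)));
        }
    }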