Tags: c++, stringstream, bitset

bitset<4> converted to wrong value on Win7 Embedded


I have the following piece of code to convert from Sixbit ASCII to an ASCII string:

    std::string SixBitToASCII(char *buffer)
    {
        std::stringstream ss;
        std::stringstream out;

        for (int index = 0; index < 16; index++)
        {
            std::bitset<4> bset(buffer[index]);

            std::cout << buffer[index] << " - " << bset << std::endl;

            ss << bset;
        }

        std::cout << ss.str() << std::endl;

        for (int index = 0; index < 60; index += 6)
        {
            std::string s = ss.str().substr(index, index + 6);
            std::bitset<6> bits;
            std::istringstream is(s);
            is >> bits;

            int asciiCode = bits.to_ulong() + 32;

            out << (char) asciiCode;
        }

        return out.str();
    }

It compiles fine. I'm compiling with VS2012 on Windows 7 Professional 32-bit.

When I run it on Windows 7 Embedded I get the following output:

8 - 1000
f - 0110 <<< ??? PROBLEM HERE
2 - 0010
9 - 1001 
2 - 0010
3 - 0011
4 - 0100
1 - 0001
0 - 0000
4 - 0100
1 - 0001
3 - 0011

100001100010100100100011010000010000010000010011

What is going on at the line marked as a problem? Converting f to 0110? Isn't it supposed to be 1111?

Of course the final conversion is wrong because of this error converting f. I've tried std::bitset<4> bset((unsigned char) buffer[index]) with the same results.

Help appreciated.


Solution

  • To see better what's going on, change

    std::cout << buffer[index] << " - " << bset << std::endl;
    

    to

    std::cout << +buffer[index] << " - " << bset << std::endl;
    

    That will show you the numeric value of buffer[index] instead of whatever character that numeric value represents. I don't know what "Sixbit ASCII" refers to, but with straight ASCII the result you're seeing is exactly what I'd expect: the ASCII code for the letter f is 0x66, so the low 4 bits are, indeed, 0110.

    You need to convert those character codes to digits. Again for ASCII (and for all standard character codes), values in the range '0' through '9' can be converted by subtracting '0'; values in the range 'a' through 'f' and 'A' through 'F' will require a more sophisticated lookup.
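
    A minimal sketch of the lookup described above (the hexDigitValue helper name is my own, not from the original post):

    ```cpp
    #include <bitset>
    #include <iostream>

    // Hypothetical helper (not in the original code): map one hex digit
    // character to its numeric value, or -1 if it is not a hex digit.
    int hexDigitValue(char c)
    {
        if (c >= '0' && c <= '9') return c - '0';
        if (c >= 'a' && c <= 'f') return c - 'a' + 10;
        if (c >= 'A' && c <= 'F') return c - 'A' + 10;
        return -1;
    }

    int main()
    {
        const char buffer[] = "8f29";
        for (const char *p = buffer; *p != '\0'; ++p)
        {
            // With the lookup, 'f' now prints 1111 instead of 0110
            // (the low four bits of its ASCII code 0x66).
            std::cout << *p << " - "
                      << std::bitset<4>(hexDigitValue(*p)) << '\n';
        }
    }
    ```

    With this in place, the first loop of SixBitToASCII would feed the digit's value, rather than the low bits of its character code, into each std::bitset<4>.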