#define CHAR_BIT 8
#include <bitset>

int main()
{
    union
    {
        float input; // assumes sizeof(float) == sizeof(int)
        int output;
    } data;

    data.input = 122.5;
    std::bitset<sizeof(float) * CHAR_BIT> bits(data.output);

    int ieee[32];
    for (int i = 0; i < 32; ++i)
    {
        ieee[i] = (int)bits[i]; // bits[0] is the least significant bit
    }
}
My intention is to fill the ieee array with the IEEE representation of a float, and that works (I've used code from another question), but there are 2 things I don't understand:

1) Why do I have to use #define CHAR_BIT 8 for a correct output?

2) How can I fill the ieee array with the correct bit values?
1) Because you need to convert a size in bytes (that's what sizeof(float) is) to a size in bits (which is what bitset expects).
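For example, on a typical platform where a float is 4 bytes, you can see the conversion directly (a small sketch; note that CHAR_BIT is normally provided by <climits>, so you only need your own #define if you don't include that header):

#include <bitset>
#include <climits>  // defines CHAR_BIT (8 on virtually every modern platform)
#include <iostream>

int main()
{
    std::cout << sizeof(float) << '\n';            // size in bytes, typically 4
    std::cout << sizeof(float) * CHAR_BIT << '\n'; // size in bits, typically 32

    // bitset's template parameter is a number of bits, so it needs the second value
    std::bitset<sizeof(float) * CHAR_BIT> bits;
    std::cout << bits.size() << '\n';              // 32 on such a platform
}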
2) Looks to me like you are already putting the correct bit values into ieee. Can you give an example and say why you think it's wrong?
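If it helps, here is one way to check what you actually get (a sketch assuming a 32-bit IEEE-754 float and the same union trick as in your question; keep in mind that bits[0] is the least significant bit, while to_string() prints the sign bit first):

#include <bitset>
#include <climits>
#include <iostream>

int main()
{
    union
    {
        float input;
        int output;
    } data;
    data.input = 122.5f;

    std::bitset<sizeof(float) * CHAR_BIT> bits(data.output);

    // Most significant bit first: sign, exponent, mantissa.
    // For 122.5f this should print 01000010111101010000000000000000.
    std::cout << bits.to_string() << '\n';

    // Indexing goes the other way: bits[31] is the sign bit and bits[0]
    // the lowest mantissa bit, so walking i from 31 down to 0 matches
    // the string above.
    for (int i = 31; i >= 0; --i)
        std::cout << bits[i];
    std::cout << '\n';
}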