We can look at the representation of an object of type `T` by converting a `T*` that points at that object into a `char*`. At least in practice:
```cpp
#include <iomanip>
#include <iostream>

int x = 511;
unsigned char* cp = (unsigned char*)&x;
std::cout << std::hex << std::setfill('0');
for (std::size_t i = 0; i < sizeof(int); i++) {
    std::cout << std::setw(2) << (int)cp[i] << ' ';
}
```
On my system this outputs the representation of 511: `ff 01 00 00`.
There is (surely) some implementation-defined behaviour occurring here. Which of the casts allows me to convert an `int*` to an `unsigned char*`, and which conversions does that cast entail? Am I invoking undefined behaviour as soon as I cast? Can I cast any `T*` like this? What can I rely on when doing this?
Which of the casts is allowing me to convert an `int*` to an `unsigned char*`?
In this case, that C-style cast is equivalent to `reinterpret_cast<unsigned char*>`.
Can I cast any T* type like this?
Yes and no. The yes part: you can safely cast any object pointer type to `char*` or `unsigned char*` (with the appropriate `const` and/or `volatile` qualifiers). The byte values you then observe are implementation-defined (byte order, padding), but the cast and the access are legal.
The no part: the standard explicitly allows `char*` and `unsigned char*` as the target type. However, you cannot (for example) safely cast a `double*` to an `int*` and read through it. Do this and you've crossed the boundary from implementation-defined behaviour to undefined behaviour: it violates the strict aliasing rule.