We can look at the representation of an object of type T by converting a T* that points at that object into a char*. At least in practice:
#include <iomanip>
#include <iostream>

int main() {
    int x = 511;
    unsigned char* cp = (unsigned char*)&x;  // the cast in question
    std::cout << std::hex << std::setfill('0');
    for (std::size_t i = 0; i < sizeof(int); i++) {
        std::cout << std::setw(2) << (int)cp[i] << ' ';
    }
}
This outputs the representation of 511 on my system: ff 01 00 00.
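Incidentally, I get the same output if I copy the bytes out with std::memcpy instead of casting, which I assume sidesteps any aliasing concerns, though I'm not certain that's actually equivalent:

#include <cstddef>
#include <cstring>
#include <iomanip>
#include <iostream>

int main() {
    int x = 511;
    unsigned char buf[sizeof x];
    std::memcpy(buf, &x, sizeof buf);  // copy the object representation out
    std::cout << std::hex << std::setfill('0');
    for (std::size_t i = 0; i < sizeof buf; i++) {
        std::cout << std::setw(2) << (int)buf[i] << ' ';
    }
}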
There is (surely) some implementation-defined behaviour occurring here. Which cast is allowing me to convert an int* to an unsigned char*, and which conversions does that cast entail? Am I invoking undefined behaviour as soon as I cast? Can I cast any T* like this? What can I rely on when doing this?
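To make that last question concrete, here is the generalized pattern I have in mind (dump_bytes is just a name I've made up for this post, not anything standard). I'd like to know for which types T this is well-defined:

#include <cstddef>
#include <iomanip>
#include <iostream>

// Print the object representation of an arbitrary T, byte by byte.
template <typename T>
void dump_bytes(const T& obj) {
    const unsigned char* cp = (const unsigned char*)&obj;
    std::cout << std::hex << std::setfill('0');
    for (std::size_t i = 0; i < sizeof(T); i++) {
        std::cout << std::setw(2) << (int)cp[i] << ' ';
    }
    std::cout << '\n';
}

int main() {
    dump_bytes(511);    // int, same as above
    dump_bytes(3.14);   // double
    dump_bytes("abc");  // array of char
}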