I have
const uint8_t longByteTable[16][256][16] = { { { 0x00, ... } } };
declared as a three-dimensional 16x256x16 array of hardcoded octet values.
For optimisation purposes and various other reasons I need this array to be interpreted as a three-dimensional 16x256x2 array of uint64_t values:
const uint64_t reinterpretedTable[16][256][2];
What I need is a valid way to cast longByteTable to reinterpretedTable within strict ISO/ANSI C. Is this:
const uint64_t (*reinterpretedTable)[256][2] =
    (const uint64_t (*)[256][2])longByteTable;
a proper way to do that?
P.S. I can't declare longByteTable with the latter type directly, because then it would not work properly across different endiannesses: I would either need to declare separate tables for each endianness, or perform runtime checks and byte rotations. And yes, all further transformations of the reinterpreted array are endianness-invariant.
Because of the pointer aliasing rules of C, you cannot make such casts: accessing the uint8_t array through a uint64_t lvalue is undefined behaviour. The only safe way is to use a union:
#include <stdint.h>

typedef union
{
    uint8_t  longByteTable[16][256][16];
    uint64_t reinterpretedTable[16][256][2];
} table_t;

const table_t table = { .longByteTable = { { { 0x00, /* ... */ } } } };
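You then write the hardcoded octets through the byte member (as above) and read through the other member of the same object. A minimal sketch of such an access, given the table_t object declared above (the accessor name is only illustrative):

/* Illustrative accessor: reads the storage that was written through
   longByteTable via the uint64_t view of the same union object. */
uint64_t get_word(unsigned i, unsigned j, unsigned k)
{
    return table.reinterpretedTable[i][j][k];
}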
Though note that this will still make your code depend on endianness. The only way to make the code endianness-independent is to assign values to/from the larger integer type using bit shifts.
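For example, each 64-bit value can be assembled from eight consecutive bytes of the original longByteTable at the point of use; because the shifts fix the byte order explicitly, the result is the same on every host. A minimal sketch, assuming a most-significant-byte-first layout (the helper name and that ordering are my assumptions, not taken from your code):

#include <stdint.h>

/* Combine 8 bytes into one uint64_t, most-significant byte first.
   The result does not depend on the host's byte order. */
static uint64_t load_be64(const uint8_t b[8])
{
    uint64_t value = 0;
    for (int i = 0; i < 8; i++)
    {
        value = (value << 8) | b[i];
    }
    return value;
}

/* e.g. the k-th 64-bit word of entry [i][j]:
   uint64_t w = load_be64(&longByteTable[i][j][k * 8]); */

With this approach the byte table can keep its original declaration; no union or cast is needed at all.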