Tags: c, casting, char, uint32_t

Understanding uint32_t char typecast (Bytes)


Let's say we have this:

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

int main()
{
    uint32_t* value = (uint32_t*)malloc(sizeof(uint32_t));
    uint32_t array[9] = {1, 2, 3, 4, 5, 6, 7, 8, 9};

    *value = *(uint32_t*)((char*)array + 8);

    printf("Value is: %u\n", *value);

    free(value);
    return 0;
}

The value in this case would be 3. Why exactly is that? If we cast the uint32_t array to char, does that mean each uint32_t element takes up 4 chars (bytes), so that in char terms the elements sit at the byte offsets

array[9] = {0, 4, 8, 12, 16, 20, 24, 28, 32};

with 8 being the offset used above? Could someone try to explain this?


Solution

  • When you initialize an array, each initializer sets an element of the array regardless of how many bytes each element takes up.

    Your machine is probably using little-endian byte ordering. That means that array looks like this in memory:

    -----------------------------------------------------------------
    | 1 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 4 | 0 | 0 | 0 | ...
    -----------------------------------------------------------------
    
    |      [0]      |      [1]      |      [2]      |      [3]      | ...
    

    Each value of type uint32_t is 4 bytes long with the least significant byte first.
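
    If you want to see that layout on your machine, here is a minimal sketch (assuming the same array as in your question) that prints the bytes exactly as the char * view sees them:

    #include <stdint.h>
    #include <stdio.h>

    int main()
    {
        uint32_t array[9] = {1, 2, 3, 4, 5, 6, 7, 8, 9};
        unsigned char *bytes = (unsigned char *)array;

        /* Dump the first 16 bytes; on a little-endian machine this
           prints: 1 0 0 0 2 0 0 0 3 0 0 0 4 0 0 0 */
        for (int i = 0; i < 16; i++)
            printf("%u ", (unsigned)bytes[i]);
        printf("\n");

        return 0;
    }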

    When you write (char*)array, the array (which decays to a pointer to its first element) is cast to a char *. Pointer arithmetic on a char * then advances the address in units of sizeof(char), which is 1 byte.

    So (char*)array + 8 points here:

    (char*)array + 8 ------------------
                                      v
    -----------------------------------------------------------------
    | 1 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 3 | 0 | 0 | 0 | 4 | 0 | 0 | 0 | ...
    -----------------------------------------------------------------
    
    |      [0]      |      [1]      |      [2]      |      [3]      | ...
    

    That pointer is then converted to a uint32_t * and dereferenced, so it reads the value 3.
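
    In other words, (char*)array + 8 is the same address as &array[2], so the whole expression is just a byte-level way of reading array[2]. Here is a hedged sketch of that equivalence (it reuses the array from your question and adds a memcpy variant, which is the safer idiom when the byte offset might not be aligned for uint32_t):

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main()
    {
        uint32_t array[9] = {1, 2, 3, 4, 5, 6, 7, 8, 9};

        /* 8 bytes past the start = 2 elements of 4 bytes = &array[2]. */
        uint32_t direct = *(uint32_t *)((char *)array + 8);

        /* The same 4 bytes read via memcpy, with no pointer cast back
           to uint32_t *; this avoids alignment and aliasing concerns. */
        uint32_t copied;
        memcpy(&copied, (char *)array + 8, sizeof copied);

        printf("%u %u %u\n", direct, copied, array[2]);   /* prints: 3 3 3 */
        return 0;
    }

    The offset 8 happens to be a multiple of sizeof(uint32_t). If it were not, the cast-and-dereference form would be undefined behavior (a misaligned read), while the memcpy form would still be well defined.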