According to c_faq,
typedef struct {
    char a[3];
    short int b;
    long int c;
    char d[3];
} T;
sizeof(T) should give 16 on a 32-bit GCC. The strictest member alignment is 4 (from the long int), so the calculation is 3 + (1) + 2 + (2) + 4 + 3 + (1) = 16, where the numbers in brackets are padding bytes. GCC concurs.
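The per-member offsets can be double-checked with offsetof from <stddef.h>; the commented output is what the layout above predicts for gcc -m32:

#include <stdio.h>
#include <stddef.h>

typedef struct {
    char a[3];
    short int b;
    long int c;
    char d[3];
} T;

int main(void) {
    printf("a=%zu b=%zu c=%zu d=%zu sizeof=%zu\n",
           offsetof(T, a), offsetof(T, b), offsetof(T, c),
           offsetof(T, d), sizeof(T));
    /* gcc -m32: a=0 b=4 c=8 d=12 sizeof=16 */
    return 0;
}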
Based on that logic, I did some practice on my own with another example.
typedef struct {
    char buf1[9];
    unsigned short s;
    double d;
    char buf2[3];
} S;
sizeof(S) should give 32 on a 32-bit GCC. The strictest member alignment would be 8 (from the double), so my calculation is 9 + 2 + (5) + 8 + 3 + (5) = 32. However, GCC tells me that sizeof(S) is 24.
I reckoned it was because of optimization; however, even after compiling with the -O0 flag, sizeof(S) still comes out as 24.
I was using gcc -m32 main.c to compile.
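Dumping the member offsets shows where GCC actually puts each field (the commented output is from the same gcc -m32 setup that gives me 24):

#include <stdio.h>
#include <stddef.h>

typedef struct {
    char buf1[9];
    unsigned short s;
    double d;
    char buf2[3];
} S;

int main(void) {
    printf("buf1=%zu s=%zu d=%zu buf2=%zu sizeof=%zu\n",
           offsetof(S, buf1), offsetof(S, s), offsetof(S, d),
           offsetof(S, buf2), sizeof(S));
    /* gcc -m32: buf1=0 s=10 d=12 buf2=20 sizeof=24,
     * i.e. d sits at a 4-byte boundary, not an 8-byte one */
    return 0;
}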
What is going on here?
If I understood properly, the reason my calculation and GCC's do not match is that each compiler has its own way of laying out struct data. Then what is a universal way of calculating the size of a struct? Or rather, what is the original way of calculating the size of a struct?
"Then what is a universal way of calculating the size of a struct? Or rather, what is the original way of calculating the size of a struct?"
There is no universal way. Nor is there a real "original" way, given that C was created in the 1970s, back when PDP-11s were a thing, and wasn't formally specified until 1989. Struct layout is implementation-defined, and I'm not just being flippant: it really does vary between platforms, between implementations, and even between compiler flags.
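Your own structs demonstrate this. With gcc -m32 you are targeting the 32-bit x86 System V ABI, which aligns double to only 4 bytes (GCC even has a -malign-double flag to change that), so S lays out as 9 + (1) + 2 + 8 + 3 + (1) = 24. A quick way to watch the alignments move with the target is to print them; the commented results are what I would expect on a typical x86 Linux setup, not a guarantee:

#include <stdio.h>

int main(void) {
    /* _Alignof is C11; it reports the ABI-required alignment */
    printf("short:  %zu\n", _Alignof(short));
    printf("long:   %zu\n", _Alignof(long));
    printf("double: %zu\n", _Alignof(double));
    /* Typically: gcc -m32 -> short: 2, long: 4, double: 4
     *            gcc -m64 -> short: 2, long: 8, double: 8 */
    return 0;
}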
If your professor wants you to compute the size of a structure, then they must provide the alignment required for each data type. If they do not, their question is underspecified (and a virtually infinite number of different answers could be defended as "correct").
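Once the alignments are given, the mechanics are simple: round the running offset up to each member's alignment, add the member's size, and finally round the total up to the strictest member alignment. A minimal sketch, using the sizes and alignments the i386 System V ABI happens to use for your S (plug in whatever numbers you are actually given):

#include <stdio.h>
#include <stddef.h>

/* Round x up to the next multiple of a (any a > 0). */
static size_t align_up(size_t x, size_t a) {
    return (x + a - 1) / a * a;
}

int main(void) {
    /* buf1, s, d, buf2 under the i386 System V ABI (illustration) */
    size_t sizes[]  = { 9, 2, 8, 3 };
    size_t aligns[] = { 1, 2, 4, 1 };
    size_t offset = 0, max_align = 1;

    for (size_t i = 0; i < 4; i++) {
        offset = align_up(offset, aligns[i]); /* padding before member */
        offset += sizes[i];                   /* the member itself */
        if (aligns[i] > max_align)
            max_align = aligns[i];
    }
    printf("sizeof = %zu\n", align_up(offset, max_align)); /* prints 24 */
    return 0;
}

Feed it the numbers from your first struct instead (sizes 3, 2, 4, 3; alignments 1, 2, 4, 1) and it prints 16, matching GCC again.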