Why do C and gdb output something different?
And how can some of these types even have equal sizes? Wikipedia, at least, says they all have different value ranges.
I have a 64-bit machine.
#include <stdio.h>

int main() {
    /* sizeof yields a size_t, so %zu is the correct conversion specifier */
    printf("int: %zu, long int: %zu, long long int: %zu\n",
           sizeof(int), sizeof(long int), sizeof(long long int));
}
$ gcc -g test.c
$ ./a.out
int: 4, long int: 8, long long int: 8
$ gdb -q
(gdb) p sizeof(int)
$1 = 4
(gdb) p sizeof(long int)
$2 = 4
(gdb) p sizeof(long long int)
$3 = 8
The sizes printed by the compiled C program are defined by the compiler and the platform's data model, not by the language itself: the C standard only guarantees minimum ranges for these types. On 64-bit Windows (LLP64) long int is 4 bytes, while on 64-bit Linux and macOS (LP64) it is 8 bytes, which matches what your compiled program printed. gdb was started here without loading your a.out, so it most likely fell back to its own default type sizes rather than your compiler's, which is why it reports 4 for long int. A quick way to see which data model is in effect is sketched below.
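A minimal sketch (assuming a hosted implementation with <limits.h>) that prints the widths in bits rather than bytes, so you can see at a glance which model your compiler uses:

#include <stdio.h>
#include <limits.h>

int main(void) {
    /* CHAR_BIT is the number of bits in a byte (8 on practically every platform) */
    printf("int:       %zu bits\n", sizeof(int) * CHAR_BIT);
    printf("long:      %zu bits\n", sizeof(long) * CHAR_BIT);
    printf("long long: %zu bits\n", sizeof(long long) * CHAR_BIT);
    printf("LONG_MAX:  %ld\n", LONG_MAX);
    return 0;
}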
If you need exact widths in C, use the fixed-width types from the <stdint.h> header, for example int8_t, uint32_t, and so on.
These guarantee the expected sizes regardless of the compiler or the machine you run on (as long as the implementation provides them).
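A minimal sketch of that approach (the variable names are only for illustration); <inttypes.h> supplies the matching printf macros such as PRIu32:

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void) {
    int8_t   small = -5;                  /* exactly 8 bits */
    uint32_t count = 4000000000u;         /* exactly 32 bits, unsigned */
    int64_t  big   = INT64_C(9000000000); /* exactly 64 bits */

    printf("sizeof(int8_t)   = %zu\n", sizeof(int8_t));
    printf("sizeof(uint32_t) = %zu\n", sizeof(uint32_t));
    printf("sizeof(int64_t)  = %zu\n", sizeof(int64_t));

    /* PRId8, PRIu32, PRId64 expand to the right conversion specifiers */
    printf("%" PRId8 " %" PRIu32 " %" PRId64 "\n", small, count, big);
    return 0;
}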