On Apple clang version 12.0.5 (clang-1205.0.22.11) with gcc -ansi
the following produces a segfault:
#include <stdlib.h>
#define ARG_MAX 1024 * 1024
struct S { const char *array[ARG_MAX]; };
int main(void) {
struct S as[] = {{NULL}};
return EXIT_SUCCESS;
}
ARG_MAX is defined in sys/syslimits.h as 1024 * 1024; it is defined explicitly above to match.
How do I avoid the segfault?
Large arrays with automatic storage duration (a.k.a. local variables) are to be avoided on most implementations, as they typically use a stack with a fixed and rather limited size, e.g. in the range of 1-8 MB. If the object's size exceeds the available stack, all kinds of things may happen, including a segfault.
How do I avoid the segfault?
Use dynamic allocation, like:
#include <stdlib.h>

#define ARG_MAX (1024 * 1024)
struct S { const char *array[ARG_MAX]; };

int main(void) {
    struct S *as = calloc(1, sizeof *as);
    if (as == NULL) exit(EXIT_FAILURE);
    /* ... use as[0] ... */
    free(as);
    return EXIT_SUCCESS;
}
Most implementations have much more memory available for dynamically allocated objects; this is typically called heap memory.
BTW: It's legal but a bit strange that your code makes an array with a single element.