I know that local arrays are created on the stack and have automatic storage duration: they are destroyed when control leaves the block they are declared in. In standard C++ their size must be a compile-time constant:
{
int foo[16];
}
Arrays created with operator new[]
have dynamic storage duration and are stored on the heap. They can have varying sizes.
{
const int size = 16;
int* foo = new int[size];
// do something with foo
delete[] foo;
}
The size of the stack is fixed and limited for every process.
My question is: Is there a rule of thumb when to switch from stack memory to heap memory, in order to reduce the stack memory consumption?
Example:
double a[2]
is perfectly reasonable;
double a[1000000000]
will most likely result in a stack overflow if the stack size is 1 MB (a billion doubles needs roughly 8 GB).
Where is a reasonable limit to switch to dynamic allocation?
See this answer for a discussion about heap allocation.
Where is a reasonable limit to switch to dynamic allocation?
In several cases, including:
too large automatic variables. As a rule of thumb, avoid call frames of more than a few kilobytes, and a call stack of more than a megabyte. That limit may be relaxed if you are sure your function is never used recursively. On many small embedded systems the stack is much more limited (e.g. to a few kilobytes), so each call frame must be kept even smaller (e.g. to about a hundred bytes). BTW, on some systems you can raise the call-stack limit considerably (perhaps to several gigabytes), but this is also a sysadmin issue.
a non-LIFO allocation discipline, i.e. when the data must outlive the call frame that created it; this happens quite often.
Notice that most C++ standard containers allocate their data on the heap, even when the container object itself is on the stack. For example, an automatic variable of vector type, e.g. a local std::vector<double> autovec;
has its data heap-allocated (and released when the vector is destroyed). Read more about RAII.