Why does the following code produce an error, and how do I fix it? I have been working on this for two days, so I would appreciate any advice. Basically, I do not know how to allocate and deallocate a very large array on a cluster node with 200 GB of memory. What are the best practices for this? Thank you!
#include <complex>
#include <lapacke.h>   // defines lapack_complex_double

using std::complex;

const unsigned long int array_size_1 = 4032758016;
const unsigned long int array_size_2 = 2800526400;
const unsigned long int array_size_3 = 2800526400;

complex<double>* kinetic;
complex<double>* potential;
lapack_complex_double* hamiltonian;

int main (int argc, char *argv[]) {
    kinetic = new complex<double>[array_size_1];
    potential = new complex<double>[array_size_2];
    hamiltonian = new lapack_complex_double[array_size_3];

    delete[] kinetic;
    delete[] potential;
    delete[] hamiltonian;
}
The code compiles, but at run time it aborts with:
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
Aborted
I don't believe you actually need a contiguous array of this size. The solution is std::deque: it looks very similar to std::vector, but the elements are stored in fixed-size chunks rather than in one contiguous block. Random access is still constant time, and push_back is even faster than the vector's, because a deque never has to reallocate and copy its entire contents when it grows.

This container serves two major purposes: as a queue with constant-time push/pop at both ends, and as storage for extremely large buffers that would not fit in a single contiguous allocation.
The code would look like:
#include <complex>
#include <deque>
#include <lapacke.h>   // defines lapack_complex_double

const unsigned long int array_size_1 = 4032758016;
const unsigned long int array_size_2 = 2800526400;
const unsigned long int array_size_3 = 2800526400;

std::deque<std::complex<double>> kinetic;
std::deque<std::complex<double>> potential;
std::deque<lapack_complex_double> hamiltonian;

int main (int argc, char *argv[]) {
    kinetic.resize(array_size_1);
    potential.resize(array_size_2);
    hamiltonian.resize(array_size_3);
}