Here I'm allocating 10^9 bits:
#include <bitset>
#include <iostream>
const int N = 1000000000;
std::bitset<N> b;
int main()
{
    std::cout << sizeof(b) << std::endl;
}
I get: cc1plus.exe: out of memory allocating 268439551 bytes.
But when I do
#include <bitset>
#include <iostream>
const int N = 1000000000;
int l[N/32];
int main()
{
    std::cout << sizeof(l) << std::endl;
}
the 125000000 bytes (125 MB) are allocated fine. If I change N to a different power of 10, both programs report the same sizeof. I also don't see where the 268439551-byte figure comes from: that's about 268 MB, and I have about 4 GB of RAM free. Even on a 32-bit system an allocation of that size should not fail, yet the limit is hit anyway. What's causing the problem here?
Using gcc 4.8.3 on Windows 8.1 with 8 GB RAM.
This seems to be a bug in GCC's C++11 mode; see "Gcc uses large amounts of memory and processor power with large C++11 bitsets". Note that the error comes from cc1plus.exe, i.e. the compiler itself runs out of memory while building the program, apparently while trying to constant-evaluate the huge bitset (its constructor is constexpr in C++11), so the amount of RAM the program would need at run time is not the issue. Compiling with -std=c++98 was a temporary workaround for me.