I was testing the limits of std::vector, and the code below always throws bad_alloc at around 200 million elements, even though the vector's max_size() is far larger than that, and I'm confused why it fails so early. I have 16 GB of RAM, and by my math the elements should only occupy about 2 GB of it.
#include <iostream>
#include <vector>

using namespace std;

int main()
{
    vector<int*> tab;
    for (int i = 0; i < 500000000; i++)
    {
        tab.emplace_back(new int);
    }
    //cout << tab.max_size() << endl;
    return 0;
}
You're testing your allocator more than the vector implementation. A vector reallocates every time its capacity is reached, doubling the previous capacity in most implementations. Each reallocation returns the old block to the malloc implementation, but the interleaved calls to new will tend to fragment what's left, and every growth step needs one contiguous block of memory for the new storage. On top of that, each new int typically costs far more than sizeof(int) once allocator bookkeeping and alignment padding are counted, so the real footprint is much bigger than your 2 GB estimate.
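To see the growth pattern concretely, here is a minimal sketch that prints every capacity change (the exact growth factor is implementation-defined: libstdc++ and libc++ double, MSVC grows by roughly 1.5x):

    #include <iostream>
    #include <vector>

    int main()
    {
        std::vector<int*> tab;
        auto cap = tab.capacity();
        for (int i = 0; i < 1000; i++)
        {
            tab.emplace_back(nullptr);
            if (tab.capacity() != cap)   // a reallocation just happened
            {
                cap = tab.capacity();
                std::cout << "size " << tab.size()
                          << " -> new capacity " << cap << '\n';
            }
        }
        return 0;
    }

At the scale of your test, each of those late reallocations has to find a fresh contiguous block of several gigabytes while the old block and all the individually allocated ints are still live.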
You may find that you get a very different result if you fill the vector with nullptr.
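For example, a variant of your program with the per-element new removed, so the allocator only ever services the vector's own reallocations:

    #include <vector>

    using namespace std;

    int main()
    {
        vector<int*> tab;
        // No interleaved allocations: nothing fragments the heap
        // between the vector's growth steps.
        for (int i = 0; i < 500000000; i++)
        {
            tab.emplace_back(nullptr);
        }
        return 0;
    }

You can also take reallocation out of the picture entirely with tab.reserve(500000000) before the loop, which requests the whole block (about 4 GB on a typical 64-bit system, where sizeof(int*) is 8) up front.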