I have the following code:
#include <random>
#include <vector>
#include <iostream>
#include <array>
#include <cmath>
#include <ctime>
using namespace std;
const int N = 100000;
int main() {
    // seed the engine once from the current time
    default_random_engine Generator(time(0));
    // nextafter(1.0f, 2.0f) is the smallest float greater than 1.0f, so the
    // range is [0, 1] inclusive; nextafter(1.0f, DBL_MAX) is computed in
    // double and rounds back down to 1.0f when narrowed to float
    uniform_real_distribution<float> dist(0.0f, nextafter(1.0f, 2.0f));
    array<float, N> a{0};
    //vector<float> a(N, 0.0f);
    for (auto it = a.begin(); it != a.end(); ++it) {
        *it = dist(Generator);
    }
    return 0;
}
The problem I have is that the array can be generated when N is 100000, but when N becomes 1 million, the program immediately exits with a non-zero exit value; in other words, it crashes. However, when I use a vector instead of an array, even millions of elements can be generated in exactly this way. Can anyone explain this? Does the array have some kind of size limitation?
Because the array allocates its memory on the stack, while the vector allocates its memory on the heap. You can allocate the array on the heap as well, with new or by using a shared_ptr or unique_ptr, like so:
shared_ptr<array<float, N>> a(new array<float, N>{0}); // the array itself lives on the heap
float v = (*a)[10]; // dereference the pointer first, then use operator[]
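For completeness, here is a minimal sketch of the unique_ptr variant mentioned above. It assumes the N, Generator, and dist already defined in the question's code, needs #include <memory> (as does the shared_ptr version), and requires C++14 for make_unique:

// a minimal sketch, assuming N, Generator, and dist from the code above,
// plus #include <memory> for the smart pointers
auto a = make_unique<array<float, N>>(); // heap allocation; elements value-initialized to 0
for (auto &x : *a) {
    x = dist(Generator); // fill the heap-allocated array through the pointer
}
float v = (*a)[10]; // same dereference-then-index access as with shared_ptr

Unlike shared_ptr, unique_ptr adds no reference-counting overhead and makes the single-owner intent explicit. That said, as the question itself shows, std::vector gets the data onto the heap with less ceremony.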