I have written a program that reads data from binary files and performs calculations based on the values it reads. Execution time is the most important metric for this program. To validate that it operates within the specified time limits, I log all the calculations by storing them in a std::vector<std::string>. After the time-critical execution is done, I write this vector to a file.
For each entry I record the execution time (std::chrono::steady_clock::now()) and the current wall-clock time (std::chrono::system_clock::now(), formatted with Howard Hinnant's date.h).
While analyzing the results I stumbled over the following pattern: independent of the input data, the mean execution time of 0.003 ms per operation explodes to ~20 ms for a single operation at one specific, reproducible index. After this, the execution time of all operations goes back to 0.003 ms. The index of the spike is 2097151 every time. Since 2^21 equals 2097152, something happens at 2^21 that slows down the entire program. The same effect can be observed at 2^22 and 2^23, and, even more interesting, the lag roughly doubles each time (2^21 → ~20 ms, 2^22 → ~43 ms, 2^23 → ~81 ms). Googling this specific number only turned up some Node.js internals that use C++ under the hood.
My assumption was that at index 2^21 a memory area must be expanded, and that this is why the delay occurs: the std::vector reallocates its storage. Is there a way to prevent this lag, or an alternative to std::vector that supports > 10,000,000,000 elements?

I was able to solve my problem by reserving memory with std::vector::reserve() before the time-critical part of my program. Thanks to all the comments.
Here is the working code I used:

std::vector<std::string> myLogVector;
myLogVector.reserve(12000000); // allocate storage once, up front
//...do time critical stuff, without reallocating storage