I'm trying to analyze how effectively Node.js handles async functions.
I have the Node.js script below, which initiates 10 million Promises that each sleep for 2 seconds to simulate intensive backend API calls. The script ran for a while (~30 s), consumed up to 4096 MB of RAM, and then threw a "JavaScript heap out of memory" error.
const sleep = async (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const fakeAPICall = async (i) => {
  await sleep(2000);
  return i;
};

const NUM_OF_EXECUTIONS = 1e7;

console.time(`${NUM_OF_EXECUTIONS} executions:`);

[...Array(NUM_OF_EXECUTIONS).keys()].forEach((i) => {
  fakeAPICall(i).then((r) => {
    if (r === NUM_OF_EXECUTIONS - 1) {
      console.timeEnd(`${NUM_OF_EXECUTIONS} executions:`);
    }
  });
});
ERROR
<--- Last few GCs --->
[41215:0x10281b000]    36071 ms: Mark-sweep (reduce) 4095.5 (4100.9) -> 4095.3 (4105.7) MB, 5864.0 / 0.0 ms (+ 1.3 ms in 2767 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 7190 ms) (average mu = 0.296, current mu = 0.
[41215:0x10281b000]    44534 ms: Mark-sweep (reduce) 4096.3 (4104.7) -> 4096.3 (4105.7) MB, 8461.4 / 0.0 ms (average mu = 0.140, current mu = 0.000) allocation failure scavenge might not succeed
<--- JS stacktrace --->
FATAL ERROR: MarkCompactCollector: young object promotion failed Allocation failed - JavaScript heap out of memory
1: 0x100098870 node::Abort() [/usr/local/opt/node@14/bin/node]
2: 0x1000989eb node::OnFatalError(char const*, char const*) [/usr/local/opt/node@14/bin/node]
3: 0x1001a6d55 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/usr/local/opt/node@14/bin/node]
4: 0x1001a6cff v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/usr/local/opt/node@14/bin/node]
5: 0x1002dea5b v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/usr/local/opt/node@14/bin/node]
6: 0x100316819 v8::internal::EvacuateNewSpaceVisitor::Visit(v8::internal::HeapObject, int) [/usr/local/opt/node@14/bin/node]
Node.js has a default heap memory limit, which can be raised with the --max_old_space_size=<memory in MB> option.
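For example, to give the original script an 8 GB heap (the 8192 figure is arbitrary; pick whatever your machine can spare, and replace script.js with your actual file name):

node --max_old_space_size=8192 script.js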
I have the Node.js script below to initiate 10 million Promises
Not even close. There are about 50 million of them.
const sleep = async (ms) => { // redundant async - Promise#1
  return new Promise((resolve) => setTimeout(resolve, ms)); // Promise#2
};

const fakeAPICall = async (i) => { // async - Promise#3
  await sleep(2000); // await - Promise#4
  return i;
};

const NUM_OF_EXECUTIONS = 1e7;

console.time(`${NUM_OF_EXECUTIONS} executions:`);

for (let i = 0; i < NUM_OF_EXECUTIONS; i++) {
  fakeAPICall(i).then((r) => { // then - Promise#5
    if (r === NUM_OF_EXECUTIONS - 1) {
      console.timeEnd(`${NUM_OF_EXECUTIONS} executions:`);
    }
  });
}
In each iteration you actually create at least 5 promises plus one generator-like object for the async function, so you end up with roughly 50 million promises and a huge number of other objects in memory. That is a lot: these are plain JS objects implemented in JS, and they naturally consume more memory than equivalents in low-level precompiled languages. Node is not about low memory consumption, but here memory becomes the bottleneck. Promises are made for ease of use; if you need to optimize memory, pure callbacks can be cheaper.
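To illustrate that last point, here is a minimal callback-only sketch of the same benchmark (my own illustration, not part of the original answer). It allocates no Promise objects at all; each setTimeout still creates a Timeout object internally, but all the Promise machinery is gone:

const NUM_OF_EXECUTIONS = 1e7;

// One shared callback, reused for every timer. setTimeout forwards the
// extra argument (i) to it, so no per-iteration closure or Promise is created.
const onDone = (i) => {
  if (i === NUM_OF_EXECUTIONS - 1) {
    console.timeEnd(`${NUM_OF_EXECUTIONS} executions:`);
  }
};

console.time(`${NUM_OF_EXECUTIONS} executions:`);

for (let i = 0; i < NUM_OF_EXECUTIONS; i++) {
  setTimeout(onDone, 2000, i);
}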
Here we create 10M promises (2 per iteration over 5 million iterations):
const NUM_OF_EXECUTIONS = 5_000_000;

console.log(`Start `, NUM_OF_EXECUTIONS);

const sleep = (ms, i) => new Promise((resolve) => setTimeout(resolve, ms, i)); // Promise#1

console.time(`${NUM_OF_EXECUTIONS} executions`);

for (let i = 0; i < NUM_OF_EXECUTIONS; i++) {
  sleep(2000, i).then((r) => { // then - Promise#2
    if (r === NUM_OF_EXECUTIONS - 1) {
      console.timeEnd(`${NUM_OF_EXECUTIONS} executions`);
    }
  });
}
Memory (2.6 GB):
Start 5000000
{
rss: '2.72 GB',
heapTotal: '2.68 GB',
heapUsed: '2.6 GB',
external: '308 kB',
arrayBuffers: '10.4 kB'
}
5000000 executions: 24.776s
Process finished with exit code 0
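For reference, a snapshot like the one above can be produced with process.memoryUsage(); the formatting helper below is my own sketch, since the original answer does not show how it printed these numbers:

// Format a byte count as a human-readable SI string, e.g. 2720000000 -> "2.72 GB"
const formatBytes = (bytes) => {
  const units = ['B', 'kB', 'MB', 'GB'];
  let i = 0;
  while (bytes >= 1000 && i < units.length - 1) {
    bytes /= 1000;
    i++;
  }
  return `${Number(bytes.toFixed(2))} ${units[i]}`;
};

// Print every field of process.memoryUsage() (rss, heapTotal, heapUsed, ...)
const printMemory = () => {
  const usage = process.memoryUsage();
  const formatted = Object.fromEntries(
    Object.entries(usage).map(([key, value]) => [key, formatBytes(value)])
  );
  console.log(formatted);
};

printMemory();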