I am facing the following error when I try to compute a SHA-256 hash of a ~300 MB buffer using the hash.js library:
#
# Fatal error in , line 0
# Fatal JavaScript invalid size error 169220804
#
#
#
#FailureMessage Object: 0x7ffd6dc623c0
1: 0xbe6ad1 [/mnt/user-data/jan/.nvm/versions/node/v18.14.2/bin/node]
2: 0x1e42f64 V8_Fatal(char const*, ...) [/mnt/user-data/jan/.nvm/versions/node/v18.14.2/bin/node]
3: 0xf06978 [/mnt/user-data/jan/.nvm/versions/node/v18.14.2/bin/node]
4: 0x10b48c2 [/mnt/user-data/jan/.nvm/versions/node/v18.14.2/bin/node]
5: 0x10b4b82 [/mnt/user-data/jan/.nvm/versions/node/v18.14.2/bin/node]
6: 0x12c3a5b v8::internal::Runtime_GrowArrayElements(int, unsigned long*, v8::internal::Isolate*) [/mnt/user-data/jan/.nvm/versions/node/v18.14.2/bin/node]
7: 0x1700579 [/mnt/user-data/jan/.nvm/versions/node/v18.14.2/bin/node]
I tried increasing the available memory by raising the --max-old-space-size
parameter to an absurdly high value of 256 GB, and it didn't help (the code was executed on a machine with half a terabyte of RAM):
node --max-old-space-size=262144 --trace-gc-verbose ./dest
This is the memory usage before the error gets thrown:
[1864454:0x6bfe770] Memory allocator, used: 1662256 KB, available: 266822352 KB
[1864454:0x6bfe770] Read-only space, used: 241 KB, available: 0 KB, committed: 0 KB
[1864454:0x6bfe770] New space, used: 0 KB, available: 16109 KB, committed: 32768 KB
[1864454:0x6bfe770] New large object space, used: 0 KB, available: 587572 KB, committed: 0 KB
[1864454:0x6bfe770] Old space, used: 15238 KB, available: 1485 KB, committed: 17024 KB
[1864454:0x6bfe770] Code space, used: 1152 KB, available: 235 KB, committed: 1484 KB
[1864454:0x6bfe770] Map space, used: 512 KB, available: 246 KB, committed: 776 KB
[1864454:0x6bfe770] Large object space, used: 1610146 KB, available: 0 KB, committed: 1610204 KB
[1864454:0x6bfe770] Code large object space, used: 0 KB, available: 0 KB, committed: 0 KB
[1864454:0x6bfe770] All spaces, used: 1627291 KB, available: 267428000 KB, committed: 1662256 KB
[1864454:0x6bfe770] Unmapper buffering 0 chunks of committed: 0 KB
[1864454:0x6bfe770] External memory reported: 329446 KB
[1864454:0x6bfe770] Backing store memory: 329643 KB
[1864454:0x6bfe770] External memory global 0 KB
[1864454:0x6bfe770] Total time spent in GC : 444.9 ms
The trace shows that plenty of memory is still available. Does anyone have any idea what might be wrong? Thank you.
This looks like an array allocation error, not a lack of memory. hash.js is a pure JavaScript implementation that expands its input into a plain JavaScript array, and V8 enforces a hard cap on how many elements an array can hold, independent of the heap size (note the Runtime_GrowArrayElements frame in your stack trace). No matter how much memory you throw at node, it can't grow the array past that limit, so --max-old-space-size has no effect here.
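You can reproduce the cap in isolation; this fails even with a huge heap (here as a catchable RangeError, whereas hash.js grows its array incrementally and trips a fatal V8 check instead):

// Throws "RangeError: Invalid array length" regardless of --max-old-space-size:
// a JavaScript array can hold at most 2**32 - 1 elements.
const tooBig = new Array(2 ** 32)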
Use node crypto's Hash class and streams for large files instead:
import { pipeline } from 'node:stream/promises'
import { createReadStream } from 'node:fs'
import { createHash } from 'node:crypto'

// Drains a readable stream and concatenates its chunks into one string.
async function collectString(readable) {
  let shasum = ''
  for await (const chunk of readable) {
    shasum += chunk
  }
  return shasum
}

// The Hash object is a duplex stream: file bytes go in on the writable side,
// and with setEncoding('hex') the digest can be read out as hex text.
// pipeline() resolves with the return value of the last function.
const res = await pipeline(
  createReadStream('test.js'),
  createHash('sha256').setEncoding('hex'),
  collectString,
)

console.log('shasum: %s', res)
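Since your data is a ~300 MB Buffer already in memory (not a file), you don't even need streams: crypto's Hash accepts a Buffer directly, and Buffers live in external memory rather than in a JavaScript array, so the array cap never comes into play (that is presumably the ~330 MB "External memory reported" line in your GC trace). A minimal sketch, assuming your data is in a variable named buf:

import { createHash } from 'node:crypto'

// buf: your existing ~300 MB Buffer (hypothetical name)
const shasum = createHash('sha256').update(buf).digest('hex')
console.log('shasum: %s', shasum)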