EDIT: A bit of debugging shows that the problem lies in lru-cache:
```typescript
public async get(key: string, fetcher: () => Promise<T>, refresh?: boolean): Promise<T | Error> {
    if (!this.usedKeys.includes(key)) {
        this.usedKeys.push(key);
    }
    const value = this.cache.get(this.namespace + "_" + key);
    if (value && !refresh) {
        return value as T;
    }
    try {
        const result = await fetcher();
        if (result instanceof Error) return result;
        console.log('caching');
        this.cache.set(this.namespace + "_" + key, result);
        console.log('cached');
        return result;
    } catch (e) {
        // `e` is typed `unknown`; wrap non-Error values to satisfy the return type
        return e instanceof Error ? e : new Error(String(e));
    }
}
```

where `cache: LRUCache<string, any>` comes from the lru-cache lib.
It breaks right after logging 'caching'. HOWEVER, a previous version of this application stored the same number of rows/columns and worked fine; the only change is that rows used to be `{[p: string]: any}[]` and are now `any[][]` (headers are stored in a separate variable to save memory).
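For context, a minimal sketch of the two row representations (the variable names here are illustrative, not from the actual codebase):

```typescript
// Previous format: one object per row, keys repeated in every row.
const objectRows: { [p: string]: any }[] = [
  { id: 1, name: "a" },
  { id: 2, name: "b" },
];

// Current format: headers stored once, rows become plain value arrays.
const headers = Object.keys(objectRows[0]);                // ["id", "name"]
const arrayRows: any[][] = objectRows.map(Object.values);  // [[1, "a"], [2, "b"]]
```

The second form holds the same data, so the crash has to come from how its size is measured, not from the data itself.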
PREVIOUS QUESTION:
I'm getting an error when trying to .map() exactly 235886 rows queried by node-pg:
```
#
# Fatal error in , line 0
# Fatal JavaScript invalid size error 169224256
#
#
#
#FailureMessage Object: 0x7ffdf0e186e0
 1: 0xbee3e1  [node]
 2: 0x1e6b0b4 V8_Fatal(char const*, ...) [node]
 3: 0xf07f88  [node]
 4: 0x10b7452  [node]
 5: 0x10b7712  [node]
 6: 0x12c6adb v8::internal::Runtime_GrowArrayElements(int, unsigned long*, v8::internal::Isolate*) [node]
 7: 0x17035b9  [node]
Trace/breakpoint trap (core dumped)
```
in:
```typescript
const query = await this.pool.query(queryString, params);
if (query) {
    console.log(query.rowCount);
    this.queryDate = new Date();
    return {
        data: query.rows.map(row => Object.values(row)),
        data_headers: query.fields.map((field) => {
            return {
                column: field.name,
                datatype: PG_TYPE_MAP[field.dataTypeID] || 'text'
            };
        }),
        localized_headers: query.fields.map((field) => {
            const localized = localizedNames[field.name];
            return localized ?? field.name;
        }),
        original_length: query.rowCount
    };
}
```
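PG_TYPE_MAP isn't shown in the post; presumably it maps pg's `field.dataTypeID` (a Postgres type OID) to a simple type name. A hypothetical stand-in, using the standard Postgres OIDs (the names on the right are my guess at the intended values):

```typescript
// Hypothetical stand-in for the PG_TYPE_MAP referenced above.
// node-pg exposes each column's type OID as field.dataTypeID.
const PG_TYPE_MAP: { [oid: number]: string } = {
  23: "integer",     // int4
  25: "text",        // text
  701: "float",      // float8
  1043: "text",      // varchar
  1114: "timestamp", // timestamp without time zone
};

// Unknown OIDs fall back to 'text', as in the query code above.
const datatype = PG_TYPE_MAP[9999] || "text";
```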
I need to turn each row object into a values array because that's how my current system works: the result is cached, paginated, and sent to the React client. It works well, but only for smaller tables ;)
Any ideas how to resolve this? I already set --max-old-space-size to 8 GB, but the problem still occurs.
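For the pagination step over the cached `any[][]`, a plain slice is enough; a minimal sketch (`getPage` is a name I made up, not from the codebase):

```typescript
// Hypothetical page extractor over the cached rows; `page` is 0-based.
function getPage(rows: any[][], page: number, pageSize: number): any[][] {
  const start = page * pageSize;
  return rows.slice(start, start + pageSize);
}

const cachedRows = [[1], [2], [3], [4], [5]];
const secondPage = getPage(cachedRows, 1, 2); // [[3], [4]]
```

Since `slice` copies only the page being sent, the full 235886-row array stays in the cache and only a small chunk travels to the client per request.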
Found the cause: size estimation of any[][] triggers this error, while {}[] for the same dataset (or even a larger one, since it includes the headers) does not. Weird, but I'll have to find a replacement for the object-sizeof lib:
```typescript
this.cache = new LRUCache({
    max: 20,
    ttl: 1000 * 60 * 60, // 1h live time
    sizeCalculation: (obj) => {
        return sizeof(obj); // <---- HERE
    },
    maxSize: 1.5 * 1024 * 1024 * 1024
});
```
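One possible replacement, assuming an approximate size is acceptable for the cache's maxSize accounting: estimate bytes from the JSON serialization instead of walking the object graph the way object-sizeof does. This is a sketch, not a drop-in equivalent:

```typescript
// Rough byte estimate: JSON string length * 2 (strings are UTF-16,
// so ~2 bytes per code unit). Far from exact, but deterministic and
// good enough to keep maxSize accounting in the right ballpark.
function roughSizeOf(obj: unknown): number {
  const json = JSON.stringify(obj);
  // JSON.stringify(undefined) returns undefined at runtime; guard it.
  return json ? json.length * 2 : 0;
}
```

Note that serializing 235886 rows on every cache insert is itself costly; an even cheaper heuristic would be `rowCount * columnCount * averageCellBytes` computed once from the query result.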