javascript · node.js · puppeteer · heap-memory · exceljs

NodeJS: invalid array length Allocation failed - JavaScript heap out of memory


I'm trying to export data from a file and add this data to an Excel file using ExcelJS.

worksheet.addRows(freedomRawData.records);

'records' is an array containing more than 165,000 elements. While writing the data to the Excel file, I get this error:

FATAL ERROR: invalid array length Allocation failed - JavaScript heap out of memory

The same script works when 'records' has 115,000 elements. Searching online, I found the following suggestion:

node --max-old-space-size=8192 <file_name>.js

This didn't work, and neither did raising the limit to the full capacity of my RAM (16384 MB). I'm a newbie in Node.js, and any help would be much appreciated.


Solution

  • ExcelJS has a streaming writer interface:

    async function writeExcel(rows_array) {
      // The streaming WorkbookWriter flushes committed rows to disk
      // instead of holding the entire workbook in memory.
      const workbook = new Excel.stream.xlsx.WorkbookWriter({ filename: 'stream.xlsx' })
      const worksheet = workbook.addWorksheet('data')
      for (const row of rows_array) {
        worksheet.addRow(row).commit() // commit releases the row from memory
      }
      await workbook.commit() // finalize and close the file
    }
    
    

    In this case, the data set could still present a memory problem due to the way the Node/V8 garbage collector works. Garbage collection requires a tick of the event loop to actually clean up "freed" memory. If you run enough synchronous code, like a for loop, and that loop allocates more memory on each iteration, then the GC can't run until after a pause in the code (i.e. the final workbook.commit()). If you still run into OOM issues, you can force an async pause every so often, allowing the row memory freed by each .commit() to actually be collected.

      for (const [i, row] of rows_array.entries()) {
        worksheet.addRow(row).commit()
        // Yield a full event-loop tick every 10,000 rows so the GC can run
        if (i % 10000 === 0) await new Promise(resolve => setImmediate(resolve))
      }
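
    Note that the kind of pause matters: an awaited, already-resolved promise only yields to the microtask queue, while setImmediate yields a full event-loop tick. A minimal sketch (plain Node, no ExcelJS) showing the difference:

```javascript
// Sketch: microtask pause vs. full event-loop tick.
// A setImmediate callback runs in the event loop's check phase,
// so it is a proxy for "did a full tick happen?".

async function microtaskPause() {
  let tickHappened = false;
  setImmediate(() => { tickHappened = true; });
  await Promise.resolve();   // resumes in the microtask queue, same tick
  return tickHappened;       // still false
}

async function fullTickPause() {
  let tickHappened = false;
  setImmediate(() => { tickHappened = true; });
  await new Promise(resolve => setImmediate(resolve)); // waits a full tick
  return tickHappened;       // true
}
```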