Error: Invalid response body - Error running Bulk Writer in a firebase function

For a couple of days I have been getting the following error when uploading a .csv file to a folder in Firebase Storage (where a Firebase function reads the contents and saves them to Firestore).

Error: Invalid response body while trying to fetch$alt=json%3Benum-encoding=int: read ECONNRESET

After that, I receive another error notifying me that the function crashed: "Function execution took 90053 ms, finished with status: 'crash'", followed by multiple errors like this: "Exception from a finished function: Error: request to failed, reason: Client network socket disconnected before secure TLS connection was established".

What is strange is that the function was working fine just a few weeks ago. I may have updated the Firebase SDK and firebase-tools recently, but I'm not sure whether that could be the cause. This is the code of my function:

// Create new documents from file on firebase storage
export const createDocumentsFromCSV = functions.region(functionLocation).runWith({
  timeoutSeconds: 540,
  memory: '8GB'
}).storage.bucket().object().onFinalize(async (object: any) => {
  // Check that the file is a CSV file and located in the specified folder
  if (!'.csv') || !'carga-documentos/')) {
    return null;
  }

  const bulkWriter = firestoredb.bulkWriter();
  let writeCount = 0;
  let batchCount = 0;

  const file = bucket.file(;
  functions.logger.log(`Starting, fileName: ${}`);
  const headers = ['id', 'client', /* ... */ 'creationTime', 'modificationTime'];

  // Read the CSV file from Firebase Storage
  const stream = file.createReadStream();
  return new Promise<void>((resolve, reject) => {
    stream.pipe(csv({ headers, skipLines: 1, separator: ';' }))
      .on('data', (row: any) => {
        try {
          row.client = JSON.parse(row.client);
          row.creationTime = Timestamp.fromDate(new Date(row.creationTime));
          row.modificationTime = Timestamp.fromDate(new Date(row.modificationTime));
          writeCount++;
          if (writeCount % 500 === 0) {
            batchCount++;
            functions.logger.log(`Batch ${batchCount} committed with ${writeCount} writes`);
          }
          const docRef = firestoredb.collection('clients').doc('xxxxx').collection('xxxxx').doc(row.xxxxx).collection('xxxxx').doc(;
          bulkWriter.set(docRef, row);
        } catch (error) {
          functions.logger.log(`Row causing error: ${JSON.stringify(row)}`);
          functions.logger.log(`Error: ${error}`);
        }
      })
      .on('end', async () => {
        functions.logger.log(`Estimated number of batches: ${Math.ceil(writeCount / 500)}`);
        functions.logger.log(`Number of documents: ${writeCount}`);
        await bulkWriter.close();
        functions.logger.log(`Finished, fileName: ${}`);
        resolve();
      })
      .on('error', (error: any) => {
        functions.logger.log(`Stream error: ${error}`);
        reject(error);
      });
  });
});
I tried updating the Firebase SDK to the latest version and added the current try-catch block to get more info about the error, but the problem persists and I'm out of ideas. If I had to guess, I would say that maybe something is wrong with my BulkWriter code or the Node version (v16), but I'm not sure.

I would be grateful if someone could spot an error in my code or give me an idea of why this might be happening. I wasn't able to find another question similar to this one (about background trigger functions and Firebase BulkWriter). Thanks!

Just to clarify:

  1. Each file has at most 100,000 rows, whose contents are saved as individual documents. I think I could probably upload 300,000 rows and the function would still have enough time to save the documents to Firestore before the timeout.
  2. No document is saved multiple times, and every document has a different id.
  3. The files are uploaded sequentially; each waits for the function to finish before the next one is processed.
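Unrelated to the crash itself, but with files this size it may help to apply backpressure: pause the read stream every N rows and await `bulkWriter.flush()` before resuming, so pending writes don't pile up in memory. A rough sketch of the pattern — `BulkWriter.flush()` is a real Firestore API, but `flushEveryN` is a made-up helper and `FakeBulkWriter` is a stand-in so the sketch runs without Firestore; the wiring is an assumption, not something I've verified against the real SDK:

```typescript
import { Readable } from 'stream';

// Stand-in for Firestore's BulkWriter: set() queues a write, flush() drains
// the queue. The real BulkWriter exposes the same two method names.
class FakeBulkWriter {
  pending = 0;
  flushed = 0;
  set(_doc: unknown): void { this.pending++; }
  async flush(): Promise<void> { this.flushed += this.pending; this.pending = 0; }
}

// Pause the stream every `n` rows, await a flush, then resume. Resolves with
// the total number of rows written once the stream ends.
function flushEveryN(stream: Readable, writer: FakeBulkWriter, n: number): Promise<number> {
  let count = 0;
  return new Promise((resolve, reject) => {
    stream.on('data', (row) => {
      writer.set(row);
      count++;
      if (count % n === 0) {
        stream.pause();
        writer.flush().then(() => stream.resume(), reject);
      }
    });
    stream.on('end', () => writer.flush().then(() => resolve(count), reject));
    stream.on('error', reject);
  });
}
```

In the real function, the same `pause()`/`flush()`/`resume()` steps would go inside the `.on('data')` handler of the CSV pipeline.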


  • I will leave my solution here in case anyone has the same issue. Updating from Node 16 to Node 18 got rid of the problem, which is strange because supposedly the Firebase Admin SDK only requires Node 14.
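For anyone applying the same fix: the functions runtime is selected via the `engines` field in the `package.json` of the functions directory (the rest of your `package.json` will of course differ; this is just the relevant fragment):

```json
{
  "engines": {
    "node": "18"
  }
}
```

After changing it, redeploy the functions so they run on the new runtime.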