javascript · node.js · postgresql · multer

Inserting multiple rows into the database from a CSV file using node-postgres


I am trying to insert multiple rows from a CSV file into my Postgres database using node-postgres, multer, and Papa Parse. I tried the following:

const createTrackingCSV = (request, response) => {
    // storing the csv file using multer memoryStorage() 
    let upload = multer({ storage: multer.memoryStorage(), fileFilter: fileFilter }).single('fileUploaded');


    upload(request, response, function(err) {
        
        var results = Papa.parse(String(request.file.buffer), {
            header: true
        });

        var uploadObj = results.data;

        uploadObj.forEach(function(obj) {
            pool.query('INSERT INTO tracking (col1, col2) VALUES ($1, $2)', [obj.test1, obj.test2], (error, results) => {
                if(error) {
                    console.log(error);
                    response.sendStatus(500);
                    return;
                }
            })  
        });
        response.status(201).send('Successfully inserted');

    });
}

This approach results in the following error when I try to insert the contents of the CSV file:

Error [ERR_HTTP_HEADERS_SENT]: Cannot set headers after they are sent to the client

What mistake am I making, and is there a different approach I could try to insert multiple rows from a CSV file into the database?


Solution

  • The pool.query(...) call is asynchronous, and you call it in a loop. If any query fails, its error callback tries to send a 500 response. But the queries take some time to complete, and by then the loop has already finished and the 201 response has already been sent, hence the ERR_HTTP_HEADERS_SENT error.

    See How to make a synchronous query (blocking call) with node-postgres? for how to synchronise (blocking) query calls.

    Probably the simplest option, if you are used to a sequential style, is to await each pool.query(...) call and only send the response once all inserts have finished (or one has failed), as in the sketch below.
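
    For example, a minimal sketch of that approach, assuming the same pool, fileFilter, field name ('fileUploaded') and tracking table columns as in the question:

    const multer = require('multer');
    const Papa = require('papaparse');

    const upload = multer({ storage: multer.memoryStorage(), fileFilter: fileFilter }).single('fileUploaded');

    const createTrackingCSV = (request, response) => {
        upload(request, response, async function(err) {
            if (err) {
                return response.status(400).send('Upload failed');
            }

            const results = Papa.parse(String(request.file.buffer), {
                header: true
            });

            try {
                // Await each insert so any error is caught before a response is sent.
                for (const obj of results.data) {
                    await pool.query(
                        'INSERT INTO tracking (col1, col2) VALUES ($1, $2)',
                        [obj.test1, obj.test2]
                    );
                }
                response.status(201).send('Successfully inserted');
            } catch (error) {
                console.log(error);
                response.sendStatus(500);
            }
        });
    };

    With this structure exactly one response is sent per request: 201 after every row has been inserted, or 500 as soon as an insert fails.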