node.js · postgresql · node-postgres

Node-Postgres: Merge transaction statements into one to improve speed?


I am currently using node-postgres to INSERT and UPDATE large amounts of data in my Postgres database. Transactions can be up to 15k statements long.

I am using transactions as outlined in the documentation:

client.query('BEGIN', (err) => {
  if (shouldAbort(err)) return
  client.query('INSERT INTO users(name) VALUES($1) RETURNING id', ['brianc'], (err, res) => {
    if (shouldAbort(err)) return

    const insertPhotoText = 'INSERT INTO photos(user_id, photo_url) VALUES ($1, $2)'
    const insertPhotoValues = [res.rows[0].id, 's3.bucket.foo']
    client.query(insertPhotoText, insertPhotoValues, (err, res) => {
      if (shouldAbort(err)) return

      client.query('COMMIT', (err) => {
        if (err) {
          console.error('Error committing transaction', err.stack)
        }
        done()
      })
    })
  })
})

The problem, however, is that each statement in the transaction is sent to the database individually. That makes sense in the example, but in our use case we never need the result of a previous statement.

I am therefore tempted to put all statements into one query and execute it at once, in order to reduce the overall duration of the transaction.

I would be interested to hear whether this could cause any undesired behavior. Thanks!
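To illustrate what I mean by "putting all statements into one": something like the following sketch, where `joinStatements` is a hypothetical helper (not part of node-postgres) that concatenates the SQL text so the whole batch travels in one round trip. Note that this only works for statements without bind parameters, since all values would have to be inlined into the SQL text.

```javascript
// Hypothetical helper: join many SQL statements into one string so the
// whole batch can be sent to the server in a single round trip.
function joinStatements(statements) {
  return statements
    .map((s) => s.trim().replace(/;$/, '')) // normalize trailing semicolons
    .join(';\n') + ';'
}

const batch = joinStatements([
  "INSERT INTO users(name) VALUES ('brianc')",
  "UPDATE users SET name = 'brian' WHERE name = 'brianc'",
])
// The idea would then be a single client.query(batch) call.
```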


Solution

  • This resolved itself: when multiple commands are put into a single statement, the query fails with "Error: cannot insert multiple commands into a prepared statement".
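Since the error only rules out multiple *commands* per parameterized statement, one common workaround (a sketch under the assumption that many of the rows target the same table) is to batch them into a single multi-row INSERT, which is one command and therefore allowed. `buildMultiInsert` below is a hypothetical helper, not part of node-postgres:

```javascript
// Sketch: batch many rows into ONE parameterized multi-row INSERT.
// Returns the query text and the flattened parameter list, ready to be
// passed to client.query(text, params) inside the transaction.
function buildMultiInsert(table, columns, rows) {
  const params = []
  const tuples = rows.map((row, r) => {
    const placeholders = row.map((value, c) => {
      params.push(value)
      // number placeholders $1, $2, ... across all rows
      return '$' + (r * columns.length + c + 1)
    })
    return '(' + placeholders.join(', ') + ')'
  })
  const text =
    `INSERT INTO ${table}(${columns.join(', ')}) VALUES ` + tuples.join(', ')
  return { text, params }
}

const { text, params } = buildMultiInsert('users', ['name'], [['brianc'], ['alice']])
// text   -> "INSERT INTO users(name) VALUES ($1), ($2)"
// params -> ['brianc', 'alice']
```

This keeps the safety of bind parameters while cutting a long run of single-row INSERTs down to far fewer round trips.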