google-bigquery, etl, apache-nifi

How to use Apache NiFi to transfer large amounts of data (GBs) without a memory error?


I am using Apache NiFi to transfer a PostgreSQL database to BigQuery, but some of the tables are very large (29 GB) and I am getting a memory error on the VM. Can I get around this limit? Do I need to upgrade my desktop to handle this transfer, or is there another way to do it?

error: java.lang.OutOfMemoryError: GC overhead limit exceeded



Solution

  • You should set the Auto Commit property to false when executing queries against large PostgreSQL tables, because the PostgreSQL JDBC driver does not honor fetchSize while auto commit is set to true (the default). A plain-JDBC sketch of this behaviour is shown after this answer.

    Unfortunately, the SQL execution processors in current NiFi versions do not expose that property.

    See the related issue.
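
To illustrate why auto commit matters, here is a minimal plain-JDBC sketch of cursor-based fetching against PostgreSQL. The connection URL, credentials, table name, and fetch size are placeholders for illustration only; this is not a NiFi processor, just the driver behaviour the answer refers to.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class StreamingFetchExample {
        public static void main(String[] args) throws Exception {
            // Placeholder connection details -- replace with your own.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://localhost:5432/mydb", "user", "password")) {

                // The PostgreSQL JDBC driver only uses a server-side cursor
                // (and therefore honors fetchSize) when auto commit is off.
                conn.setAutoCommit(false);

                try (PreparedStatement stmt = conn.prepareStatement(
                        "SELECT * FROM big_table")) {
                    // Fetch rows in batches of 10,000 instead of loading the
                    // entire 29 GB result set into memory at once.
                    stmt.setFetchSize(10_000);

                    try (ResultSet rs = stmt.executeQuery()) {
                        while (rs.next()) {
                            // Process one row at a time; memory use stays bounded.
                        }
                    }
                }
                conn.commit();
            }
        }
    }

With auto commit left at its default of true, the same query would materialize the whole result set in the JVM heap, which is what produces the GC overhead limit exceeded error on large tables.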