Tags: azure-sql-database, oracle-data-integrator, azure-synapse

Exceeded the memory limit of 20 MB per session for prepared statements. Reduce the number or size of the prepared statements


I am trying to insert records into Azure SQL Data Warehouse using Oracle ODI, but I am getting an error after some of the records have been inserted.

NOTE: I am trying to insert 1000 records, but the error occurs after 800.

Error Message: Caused By: java.sql.BatchUpdateException: 112007;Exceeded the memory limit of 20 MB per session for prepared statements. Reduce the number or size of the prepared statements.


Solution

  • While Abhijith's answer is technically correct, I'd like to suggest an alternative that will give you far better performance.

    The root of your problem is that you've chosen the worst possible way to load a large volume of data into Azure SQL Data Warehouse. A long list of INSERT statements will perform badly no matter how many DWUs you throw at it, because it is always going to be a single-node operation.

    My recommendation is to adapt your ODI process in the following way, assuming that your Oracle database is on-premises.

    • Write your extract to a flat file
    • Invoke AzCopy to move the file to Azure Blob Storage
    • Use CREATE EXTERNAL TABLE to map a view over the file in storage
    • Use CREATE TABLE AS or INSERT INTO to read from that view into your target table (steps 2–4 are sketched after this list)

    This will be orders of magnitude faster than your current approach.
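
    To make steps 2–4 concrete, here is a rough sketch rather than a definitive implementation. Assuming AzCopy v10, step 2 is a single command along the lines of `azcopy copy "C:\odi\extract\orders.csv" "https://<account>.blob.core.windows.net/staging/orders.csv?<SAS-token>"`, where the local path, storage account, container, and SAS token are all placeholders. Steps 3 and 4 then look roughly like the T-SQL below; every object and column name is hypothetical, and the file format options must match your actual extract:

    ```sql
    -- One-time setup (all names and secrets are placeholders).
    -- A database master key must already exist:
    --   CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong-password>';
    CREATE DATABASE SCOPED CREDENTIAL BlobCredential
    WITH IDENTITY = 'storageuser',
         SECRET   = '<storage-account-key>';

    CREATE EXTERNAL DATA SOURCE BlobStaging
    WITH (
        TYPE       = HADOOP,
        LOCATION   = 'wasbs://staging@<account>.blob.core.windows.net',
        CREDENTIAL = BlobCredential
    );

    CREATE EXTERNAL FILE FORMAT CsvFormat
    WITH (
        FORMAT_TYPE = DELIMITEDTEXT,
        FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2)
    );

    -- Step 3: map a view over the file AzCopy landed in storage.
    CREATE EXTERNAL TABLE dbo.Orders_ext (
        OrderId   INT,
        OrderDate DATE,
        Amount    DECIMAL(18, 2)
    )
    WITH (
        LOCATION    = '/orders.csv',
        DATA_SOURCE = BlobStaging,
        FILE_FORMAT = CsvFormat
    );

    -- Step 4: CTAS reads the external table in parallel via PolyBase.
    CREATE TABLE dbo.Orders
    WITH (DISTRIBUTION = ROUND_ROBIN)
    AS SELECT * FROM dbo.Orders_ext;

    -- Or, if the target table already exists:
    -- INSERT INTO dbo.Orders SELECT * FROM dbo.Orders_ext;
    ```

    Because the external table is read through PolyBase, the load is spread across all compute nodes instead of being funnelled through a single connection, which is why this scales where batched INSERTs do not.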