We are attempting to import data from NetSuite into Azure Synapse Analytics via SuiteAnalytics Connect (ODBC). This works on all but one table (transactionLine), where the connection always times out at 2 hours.
From speaking to Oracle, it appears that the two-hour limit is fixed and cannot be increased.
I have a Copy Data activity set up in a pipeline. When I set "Use Query" to "Table" it times out, but when I set it to "Query" and limit it to a small number of rows (e.g. 100) it works fine (see screenshot).
Is there a built-in way to import this data in batches, say 1000 rows at a time, without creating dozens of separate Copy Data activities?
To make the pipeline work with a large number of records by importing the data in batches, you can use the following approach.
First, take a variable (here called `left`) that holds the total number of records. You can get it from a Lookup activity or set it manually:

# to get from lookup
@string(activity('Lookup1').output.count)

# to add manually
@string(5000)

Also create two variables, `start` and `end`, for the current batch range, with these initial values:

# start
@string(1)

# end
@string(1000)
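If you do take the Lookup route, one option (not shown in the original; the column alias cnt and the use of "First row only" are my assumptions) is to point the Lookup at a count query so that only one row comes back, and read the value from firstRow when setting `left`:

# Lookup1 source query against the NetSuite ODBC connection
SELECT COUNT(*) AS cnt FROM transactionLine

# Set Variable value for left, reading the single row returned by the Lookup
@string(activity('Lookup1').output.firstRow.cnt)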
Then use an Until activity that keeps processing batches until the `end` value is greater than `left`:

# Until expression
@greater(variables('end'), variables('left'))
Inside the Until loop, add an Execute Pipeline activity and pass the `start` and `end` variable values to that pipeline's parameters; with the help of these parameters the child pipeline can query the data and import it in batches (a sketch of such a query is below).
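The original answer does not show the child pipeline's query, so the following is only a sketch. It assumes the child pipeline has parameters named start and end, and that transactionLine exposes a numeric, increasing key you can filter on; uniquekey here is a placeholder, so substitute whatever unique numeric column your schema actually has:

# Copy Data source query in the child pipeline (hypothetical key column)
SELECT *
FROM transactionLine
WHERE uniquekey BETWEEN @{pipeline().parameters.start} AND @{pipeline().parameters.end}

A count-driven loop (1-1000, 1001-2000, ...) only lines up with a key filter like this if the key values are dense; if they are not, drive the loop from the minimum and maximum key values instead of the row count.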
After the Execute Pipeline activity, increment the value of `start` and also increment the value of `end`. Because a Set Variable activity cannot reference the variable it is setting, I have taken a `changestart` variable with the following value:

# changestart
@string(add(int(variables('start')), 1000))

and then assigned the `changestart` value back to the `start` variable. Do the same for `end` as well (a sketch of those Set Variable values is shown below). The following is a debug output when there are 5000 records.
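Doing the same for `end` needs a second intermediate variable; changeend is my name for it, since the original only names changestart:

# changeend: increment end by the batch size
@string(add(int(variables('end')), 1000))

# end: copy the incremented value back
@variables('changeend')

# start: likewise copied back from changestart
@variables('changestart')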