The Logic App executes a SQL stored procedure that generates a JSON file, and a third party uses Azure Data Factory (ADF) to handle the result of the API call. Because ADF has a 4 MB file size limit, the API call fails if the output JSON file generated by the Logic App is larger than 4 MB.
How can we paginate or split the output file in the Logic App so that the third party can make successful API calls even when the generated file is larger than 4 MB?
You can try the following workarounds.
In the Logic App, after generating the output JSON file, check its row count, then call an intermediate ADF pipeline and pass that count to it as a parameter. In the pipeline, use a data flow with the generated output file as the source, backed by a temporary blob JSON dataset. In the data flow sink, use the partition option and set the number of partitions so that each generated file holds about 5,000 rows; calculate this number from the row count passed by the Logic App (see the sketch below). For this sample, I am using 4.
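As a minimal T-SQL sketch of that calculation, assuming the Logic App passes the row count and you target roughly 5,000 rows per file (`@rowCount` and `@rowsPerFile` are illustrative variables, not part of the original setup):

```sql
-- Illustrative only: compute the partition count for the data flow sink.
DECLARE @rowCount int = 20000;   -- row count passed from the Logic App (assumed value)
DECLARE @rowsPerFile int = 5000; -- target rows per output file

-- Ceiling division: 20000 / 5000 = 4 partitions, matching the sample above
SELECT CEILING(CAST(@rowCount AS float) / @rowsPerFile) AS PartitionCount;
```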
In the sink settings, set the File name option to Pattern and give a pattern such as smallfile[n].json, where the [n] token is replaced by the partition number. The smaller files will then be generated with names like smallfile1.json, smallfile2.json, and so on.
Then use the smaller files one by one in the next ADF pipeline as per your requirement.
Alternatively, if you have access to the ADF pipeline, use a Copy activity with the API call as the source and your target as the sink dataset. Apply the `skip` and `take` approach, as suggested in the comments, in the Copy activity's source dataset; see this SO answer by @Pratik Lad to learn more (a sketch follows below).
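To illustrate the skip-and-take idea on the SQL side, here is a hedged sketch of a paginated version of the stored procedure; the procedure name dbo.GetResultPage, the table dbo.SourceData, and the Id column are all hypothetical placeholders:

```sql
-- Hypothetical paginated variant of the stored procedure. The caller
-- (Copy activity or Logic App) invokes it repeatedly, advancing @skip
-- by @take each time, until fewer than @take rows come back.
CREATE OR ALTER PROCEDURE dbo.GetResultPage
    @skip int,   -- number of rows to skip (page offset)
    @take int    -- maximum number of rows to return (page size)
AS
BEGIN
    SET NOCOUNT ON;

    SELECT *
    FROM dbo.SourceData            -- placeholder for the real source table
    ORDER BY Id                    -- OFFSET/FETCH needs a deterministic ORDER BY
    OFFSET @skip ROWS
    FETCH NEXT @take ROWS ONLY
    FOR JSON PATH;                 -- return the page as JSON, like the original output
END
```

With @take chosen so that a single page of rows serializes to well under 4 MB, each call stays within the ADF limit.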