azure-logic-apps, azure-log-analytics, microsoft365-defender

How to chunk JSON output efficiently to Azure Log Analytics workspace


I have data (650k rows) that I want to store using the Send data to Azure Log Analytics workspace module.

This is my workflow:

[screenshot of the Logic App workflow]

But I get this error:

    "message": "The response is not in a JSON format.",
    "innerError": "The page was not displayed because the request entity is too large."

When I send 20 rows' worth of data, it works.

I believe somewhere in the 650k rows a string breaks the JSON, causing this error.

I believe Advanced Hunting is only giving me ~95k rows as output, even though 650k rows exist. Maybe I have to structure it to return 50k rows at a time and send each batch through.

What's the most efficient way to send this data in chunks using Logic Apps?


Solution

  • Here's a possible solution.

    1. Determine the max length of the data, i.e. the total number of rows the query returns.

    2. Store this in a variable, e.g. totalLength.

    3. Initialize a variable for the number of chunks we want. Here I've divided the total length by 10,000, so each chunk holds 10,000 rows.

    4. Next, create an array of chunk indexes, i.e. [1, 2, 3, 4, 5, ...], with one entry per chunk, using length(variables('numberArray')) and add(variables('lengthofarray'),1).

    5. Now, for every index in the array, we send one chunk of data from the table. The between statement uses sub(mul(int(items('send_logs_to_LAW')),10000),10000) as the lower bound and mul(int(items('send_logs_to_LAW')),10000) as the upper bound, so each iteration reads and sends only 10,000 rows and we stay under the limit of the send data module (see the query sketch below).
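    For concreteness, here is a sketch of what the per-chunk Advanced Hunting query could look like. The table name DeviceEvents is only a placeholder for whatever your hunting query reads, and {lower}/{upper} stand for the two Logic Apps expressions from step 5. Note that KQL's between is inclusive at both ends, so a strict lower bound avoids sending each boundary row twice:

        // {lower} = sub(mul(int(items('send_logs_to_LAW')),10000),10000)
        // {upper} = mul(int(items('send_logs_to_LAW')),10000)
        DeviceEvents
        | serialize RowNum = row_number()               // stable 1-based row numbers
        | where RowNum > {lower} and RowNum <= {upper}  // keep only the current window
        | project-away RowNum                           // drop the helper column before sending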

    This is a solution that uses only Logic Apps. However, there may be a better solution using DCRs (data collection rules)?
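    As a quick sanity check of the boundary arithmetic, this standalone KQL snippet lists the window each iteration would request (chunk 1 → rows 1–10,000, chunk 2 → rows 10,001–20,000, and so on):

        // Emit the (lower, upper] window for the first few chunk indexes.
        range ChunkIndex from 1 to 5 step 1
        | extend Lower = ChunkIndex * 10000 - 10000   // sub(mul(i,10000),10000)
        | extend Upper = ChunkIndex * 10000           // mul(i,10000)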