I have a JSON document representing an array of 100 objects, and I need to process it in batches, e.g. 10 objects per batch.
def text = '[{1st},{2nd},{3rd},{4th},...{100th}]'
def json = new groovy.json.JsonSlurper().parseText(text)
Now I need to take the first 10 elements from text ([{1st},{2nd},...,{10th}]) and post them to a web service, then the next 10 ([{11th},{12th},...,{20th}]), and so on.
I've done this in C#, but I can't work out how to do it in Groovy. Can anyone suggest the best way to send batches of JSON objects when the total number of objects changes dynamically?
Groovy adds an Iterable.collate(int size) method via the DefaultGroovyMethods class, and you can use it to split your input list into chunks of size n, e.g.
['a','b','c','d','e'].collate(3) == [['a','b','c'], ['d','e']]
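collate also handles totals that are not a multiple of the chunk size (the last chunk simply holds the remainder), so it does not matter that the number of objects in your document changes dynamically:

assert (1..7).toList().collate(3) == [[1, 2, 3], [4, 5, 6], [7]]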
Consider the following example:
import groovy.json.JsonOutput
import groovy.json.JsonSlurper

final String text = '[{"id": 1}, {"id": 2}, {"id": 3}, {"id": 4}, {"id": 5}, {"id": 6}]'

// Parse the whole document into a list of maps
final List<Map> json = new JsonSlurper().parseText(text) as List<Map>

// Split into chunks of 2 and serialize each chunk back to a JSON string
json.collate(2).each { part ->
    final String out = JsonOutput.toJson(part)
    println "Sending following JSON object: ${out}"
}
Here we have a JSON array of 6 objects. We parse this JSON into a List<Map>, split it into chunks of size 2, and serialize each chunk back to a JSON string ready to be sent. I used only 6 objects as an illustration; the algorithm stays the same if the initial list contains 100 objects and we split it into chunks of size 10.
It can be generalized and described in the following steps:

1. Parse the JSON document into a List<Map> with JsonSlurper.
2. Split the list into chunks of the desired batch size with collate(batchSize).
3. Serialize each chunk back to JSON with JsonOutput.toJson(chunk).
4. Post each serialized chunk to the web service (see the sketch after the output below).
The example shown above produces the following output:
Sending following JSON object: [{"id":1},{"id":2}]
Sending following JSON object: [{"id":3},{"id":4}]
Sending following JSON object: [{"id":5},{"id":6}]