I would like to inject N rows from my CSV file into a Gatling feeder. Gatling's default approach is to read and inject one row at a time, but I cannot find anywhere how to inject, e.g., an array into a template.
I came up with creating a JSON template that uses Gatling Expressions for some of the fields. The issue is that I have a JSON array with N elements:
[
{"myKey": ${value}, "mySecondKey": ${value2}, ...},
{"myKey": ${value}, "mySecondKey": ${value2}, ...},
{"myKey": ${value}, "mySecondKey": ${value2}, ...},
{"myKey": ${value}, "mySecondKey": ${value2}, ...}
]
And my CSV:
value,value2,...
value,value2,...
value,value2,...
value,value2,...
...
I would like to make this as efficient as possible. My data is in a CSV file, so I would like to use the csv feeder. The file is also large, so readRecords is not an option: it runs out of memory.
Is there a way to put N records into the request body using Gatling?
From the documentation:
feed(feeder, 2)
Old Gatling versions:
Attribute names will be suffixed. For example, if the columns are named “foo” and “bar” and you’re feeding 2 records at once, you’ll get “foo1”, “bar1”, “foo2” and “bar2” session attributes.
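As a minimal sketch of how that could look in the old Scala DSL, assuming the quoted “foo”/“bar” columns; the file name data.csv and the endpoint are placeholders:
feed(csv("data.csv"), 2) // creates foo1, bar1, foo2, bar2 session attributes
  .exec(
    http("post two records")
      .post("/endpoint") // hypothetical endpoint
      .body(StringBody(
        """[
          |  {"myKey": "${foo1}", "mySecondKey": "${bar1}"},
          |  {"myKey": "${foo2}", "mySecondKey": "${bar2}"}
          |]""".stripMargin
      ))
  )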
Modern Gatling versions:
Values will be arrays containing all the values of the same key. In this latter case, you can access a value at a given index with the Gatling EL: #{foo(0)}, #{foo(1)}, #{bar(0)} and #{bar(1)}.
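Applied to the template from the question, here is a minimal sketch of a modern (Gatling 3.x) Scala simulation, assuming a file data.csv with columns value and value2, a hypothetical endpoint, and N = 4 records per request. The feeder's batch mode should also address the readRecords memory issue, since it only keeps a buffer of the file in memory rather than the whole file:
import io.gatling.core.Predef._
import io.gatling.http.Predef._

class BatchedFeedSimulation extends Simulation {

  // batch mode reads the CSV incrementally instead of loading it all
  val feeder = csv("data.csv").batch

  val scn = scenario("N records per request")
    .feed(feeder, 4) // each column becomes an array of 4 values
    .exec(
      http("post batch")
        .post("/endpoint") // hypothetical endpoint
        .body(StringBody(
          """[
            |  {"myKey": "#{value(0)}", "mySecondKey": "#{value2(0)}"},
            |  {"myKey": "#{value(1)}", "mySecondKey": "#{value2(1)}"},
            |  {"myKey": "#{value(2)}", "mySecondKey": "#{value2(2)}"},
            |  {"myKey": "#{value(3)}", "mySecondKey": "#{value2(3)}"}
            |]""".stripMargin
        )).asJson
    )

  setUp(scn.inject(atOnceUsers(1)))
    .protocols(http.baseUrl("http://localhost:8080")) // hypothetical base URL
}
Note that the EL indices are fixed in the template, so if N varies per request you would need to build the body dynamically from the array attributes, e.g. in a session function.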