I'd like to skip parsing some JSON returned by my DB when I just turn it straight back into JSON in the response from my Padrino app.
i.e. I have:
get :data, provides: :json do
  Record.order(:day).map do |r|
    { r.day.to_s => JSON.parse!(r.data) }
  end.reduce({}, :merge!).to_json
end
and I'd like something like the following (inspired by String#html_safe):
get :data, provides: :json do
  Record.order(:day).map do |r|
    { r.day.to_s => r.data.json_literal }
  end.reduce({}, :merge!).to_json
end
I know I can move the hash creation into the model with #as_json, but that doesn't address the unnecessary performance hit from parsing and re-encoding the JSON.
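(For reference, that refactor would look roughly like the following, assuming Record is a Sequel::Model with an ActiveSupport/json_serializer-style #as_json hook; it still pays for the JSON.parse on every request, which is exactly the cost I want to avoid.)
class Record < Sequel::Model
  # Sketch only: moves the hash building into the model,
  # but still parses the stored JSON string.
  def as_json(*)
    { day.to_s => JSON.parse(data) }
  end
end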
Example output:
{
  "2010-01-01":{"linux64":12186213,"mac":24131170},
  "2010-01-02":{"linux":10650417,"mac":24139611,"win":12210218},
  "2010-01-03":{"linux":10628353,"linux64":12184435,"win":12229263}
}
where the object that is the value of each key/value pair is available as a JSON string in r.data, e.g.
'{"linux":10650417,"mac":24139611,"win":12210218}'
which is why I want to avoid parsing r.data and just inline it.
I tried bypassing the JSON parse/dump altogether with the following:
get :data, provides: :json do
  "{" + Record.order(:day).map do |r|
    "\"#{r.day}\":#{r.data},"
  end.reduce(&:+).delete(' ').chop + "}"
end
but performance was even worse than the version with the unnecessary parsing. (Which is weird; I'm not sure whether the slow part is the string concatenation or the string interpolation...)
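In hindsight the reduce(&:+) is a more likely culprit than the interpolation itself: String#+ allocates a new string each time, so the growing accumulator gets copied on every step, whereas join builds the result once. A rough micro-benchmark sketch to separate the suspects (the pairs array is made-up stand-in data, not my actual rows):
require 'benchmark'

pairs = 1_000.times.map { |i| ["2010-01-#{i}", '{"linux":10650417,"mac":24139611}'] }

Benchmark.bm(13) do |x|
  x.report('concatenation') { pairs.map { |day, data| '"' + day + '":' + data + ',' }.reduce(&:+) }
  x.report('interpolation') { pairs.map { |day, data| "\"#{day}\":#{data}," }.reduce(&:+) }
  x.report('join')          { pairs.map { |day, data| "\"#{day}\":#{data}" }.join(',') }
end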
It turns out that the best way to do this is actually at the DB layer, since Postgres can inline json columns into json objects and aggregate strings far faster than Ruby can. Doing that dropped my latency from 30ms to 8ms. The body of the method is below:
Sequel::Model.db.fetch("select json_object_agg(day,data) as jsobj from records;").first[:jsobj]
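One caveat: json_object_agg only exists from PostgreSQL 9.4 on, and it only inlines data verbatim when the column is json/jsonb; if data is stored as text it gets re-escaped as a JSON string value, so it would need a cast, something like:
Sequel::Model.db.fetch("select json_object_agg(day, data::json) as jsobj from records;").first[:jsobj]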