I have a reduce call in JavaScript that combines the contents of an array of objects that share the same key.
For example:
const data = [{
    "stage": "AT_CALLCENTER",
    "content": "Hello",
  },
  {
    "stage": "AT_CALLCENTER",
    "content": "Bye",
  },
  {
    "stage": "AT_SITE",
    "content": "Good",
  },
  {
    "stage": "AT_SITE",
    "content": "Morning",
  }
]
const result = data.reduce((val, item) => ({ ...val,
  [`${item.stage}`]: [...(val[item.stage] || []), item.content]
}), {});
console.log(result)
A friend told me that a filter can perform better. This matters because I'm expecting much more data in production, and this code is executed twice per row.
How can I achieve the same functionality with filter instead of reduce? Or is filter not actually the better option?
There is documentation on why reduce/spread is not considered an optimal pattern: spreading the accumulator builds a new object and re-copies every key collected so far on every iteration, which is wasted work that grows with the size of the accumulator.
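For comparison, a minimal sketch (reusing the data array from the question, with my own variable name grouped) that keeps reduce but mutates the accumulator instead of spreading it, so no new object is allocated per iteration:
const grouped = data.reduce((acc, item) => {
  // Create the array for this stage once, then push into it;
  // the accumulator object itself is never copied.
  if (!acc[item.stage]) acc[item.stage] = [];
  acc[item.stage].push(item.content);
  return acc;
}, {});
console.log(grouped);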
But don't use filter either - your result is an object, not an array (which is what filter returns). You can create your new data with a simple for...of loop instead, so you don't introduce that unnecessary performance hit, and the code is easier to read too.
Note: ??= below is the logical nullish assignment operator (ES2021); it assigns the right-hand side only when the property is null or undefined.
const data=[{stage:"AT_CALLCENTER",content:"Hello"},{stage:"AT_CALLCENTER",content:"Bye"},{stage:"AT_SITE",content:"Good"},{stage:"AT_SITE",content:"Morning"}];
const result = {};
for (const { stage, content } of data) {
result[stage] ??= [];
result[stage].push(content);
}
console.log(result);
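For the sample data this logs { AT_CALLCENTER: ["Hello", "Bye"], AT_SITE: ["Good", "Morning"] }, the same result the original reduce produced.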