I'm working with array chunking on long datasets. I need to create a new array of chunks of a certain size. Currently I use this solution, but it performs badly:
function array_to_chunks(data, size) {
    let chunks = []
    let d = data.slice()
    while (d.length >= size) chunks.push(d.splice(0, size))
    return chunks
}
I'd like to understand why my code performs poorly and find a faster way to do this.
This is more performant because it avoids copying the whole input array up front and avoids splice(), which re-indexes the remaining elements on every call; it only slices out each chunk:
const createGroupedArray = function (arr, chunkSize) {
    if (!Number.isInteger(chunkSize)) {
        throw new Error('Chunk size must be an integer.');
    }
    if (chunkSize < 1) {
        throw new Error('Chunk size must be greater than 0.');
    }
    const groups = [];
    let i = 0;
    while (i < arr.length) {
        // slice copies only the current chunk; i advances by chunkSize
        groups.push(arr.slice(i, i += chunkSize));
    }
    return groups;
};
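For example, with a seven-element array and a chunk size of 3, the trailing remainder is kept as a shorter final group:

createGroupedArray([1, 2, 3, 4, 5, 6, 7], 3);
// => [[1, 2, 3], [4, 5, 6], [7]]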
If you are doing I/O, then use Node.js streams, which hand you the data in chunks as it arrives instead of building everything in memory:

const { Writable } = require('stream');

const strm = new Writable({
    write(chunk, enc, cb) {
        // process the Buffer chunk here, then signal you're ready for the next one
        cb();
    }
});
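To feed it data, you can pipe any readable stream into it. A minimal sketch, assuming a hypothetical file data.bin; the highWaterMark option controls the size of the chunks handed to write():

const fs = require('fs');

// each chunk passed to write() will be at most 64 KiB
fs.createReadStream('data.bin', { highWaterMark: 64 * 1024 }).pipe(strm);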