I want to merge chunks of a file into another file, but the writeStream is emitting close without writing or appending the readStream content to the output file.
I have an async function with a for-of loop that iterates through the array of blocks. Within the loop I create a writeStream with the truncate/write flag ('w') if the iteration index is 0, otherwise the append flag ('a'), then create a readStream and pipe it into the writeStream.
const { createReadStream, createWriteStream } = require('fs')

async function mergeBlocks(op) {
  for (const [index, block] of op.blockObjects.entries()) {
    // truncate the output on the first block, append afterwards
    const flags = index === 0 ? 'w' : 'a'
    const w = createWriteStream(op.output, { start: block.start, flags })
    const r = createReadStream(block.path, { start: block.start, end: block.end })
    console.log(index)
    r.pipe(w)
    // wait for the write stream to finish before moving on to the next block
    await new Promise(res =>
      w.on('finish', () => {
        console.log('close')
        res()
      })
    )
  }
}
But it only writes the first block; on every other iteration the writeStream closes without writing anything to the output file. What am I doing wrong?
EDIT: Check the answer below.
The start and end options I was passing to createReadStream were not necessary. I only have to set those when reading ranges out of a single file. With them set, only the first block's range matched its chunk's actual start and end positions, which explains why only the first chunk was being written to the output file while the others produced empty content.
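To illustrate (a minimal sketch, assuming a small standalone chunk file named chunk-1 on disk): a start offset taken from the merged file's coordinates points past the end of the small chunk file, so the read stream ends without emitting any data.

const { createReadStream } = require('fs')

// 'chunk-1' is assumed to be a small chunk file (e.g. a few KB).
// A start offset from the merged file's coordinate space (e.g. 5000) lies
// beyond the end of that chunk, so the stream ends without emitting data.
createReadStream('chunk-1', { start: 5000, end: 6000 })
  .on('data', d => console.log('got', d.length, 'bytes')) // never fires
  .on('end', () => console.log('ended with no data'))

// Without start/end the whole chunk file is read, which is what merging needs.
createReadStream('chunk-1')
  .on('data', d => console.log('got', d.length, 'bytes'))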
As for the code, I ended up using a promisified stream.pipeline, which simplifies it further.
const { createReadStream, createWriteStream } = require('fs')
const { join } = require('path')
const { promisify } = require('util')
const stream = require('stream')

const pipeline = promisify(stream.pipeline)

async function mergeBlocks(op) {
  for (const [index, block] of op.blockObjects.entries()) {
    await pipeline(
      createReadStream(join(block.bucket, block.hash)),
      createWriteStream(op.output, { flags: index === 0 ? 'w' : 'a', start: block.start })
    )
  }
}
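For reference, a hypothetical shape of the op object (field names assumed from the snippets above, not from a real API) and how the merge would be invoked:

// Hypothetical example data: each block lives in its own file at
// join(block.bucket, block.hash) and records where it starts in the output.
const op = {
  output: 'merged.bin',
  blockObjects: [
    { bucket: 'blocks', hash: 'abc123', start: 0 },
    { bucket: 'blocks', hash: 'def456', start: 1048576 },
  ],
}

mergeBlocks(op).then(() => console.log('merge complete'))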