
Streams not writing complete chunks in Node.js


AIM

I am working on a small project that chunks an audio file and then uploads the chunks to a cloud server.

PROCESS

  • upload the audio file to the server
  • chunk the audio file with ffmpeg in a child process, then write the chunks to a folder on the server
  • upload the chunked files to a cloud server (AWS S3)

This works well with audio files smaller than 15 MB, but on larger files the chunking doesn't run to completion. For example, when uploading a 40+ MB file the chunking stops at file 153, yet when I run the same ffmpeg command from my command line I get 328 chunked files. What could be wrong?

Here is my code:

    const muxer = child_process.spawn('ffmpeg', transformArgs, {
       cwd: `${cwd}/transforms`,
    });

    console.log('creating stream (read)...');

    fs.createReadStream(
       `${cwd}/transforms/stream${path.extname(fileSrc.filename)}`,
    )
       .pipe(muxer.stdin)
       .on('error', (e) => {
          console.log(e);
       })
       .on('close', async () => {
          console.log('ended');
          muxer.stdin.end();
          muxer.kill();
       });
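(`transformArgs` isn't shown in the question. For splitting an audio stream into fixed-length chunks with ffmpeg's segment muxer, it might look something like the sketch below; every value here is illustrative, not the asker's actual arguments.)

```javascript
// Hypothetical transformArgs for chunking audio piped in on stdin.
const transformArgs = [
  '-i', 'pipe:0',          // read the input from the piped stdin
  '-f', 'segment',         // use ffmpeg's segment muxer to split output
  '-segment_time', '10',   // cut into roughly 10-second chunks
  '-c', 'copy',            // copy the stream as-is, no re-encoding
  'chunk%03d.mp3',         // output pattern: chunk000.mp3, chunk001.mp3, ...
];
```

With 328 chunks from a 40+ MB file the real arguments presumably differ, but the shape is the same: input from `pipe:0`, the segment muxer, and a numbered output pattern.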

Solution

  • Finally got it to work by adding:

    muxer.stderr.on('data', (data) => {
       console.log(data.toString());
    });
    

    But I still don't know exactly what this does...