I'm trying to process video files with ffmpeg running on Google Cloud Functions. The video files are downloaded from Google Cloud Storage, processed as a stream by fluent-ffmpeg, and streamed back into a new Google Cloud Storage file. It works on smaller files but throws an "Output stream error: Maximum call stack size exceeded" on larger files.
I tried running the code on a normal PC and didn't encounter this error, even with larger files.
These are the parameters I deploy the function with:
gcloud functions deploy $FUNCTION_NAME --runtime nodejs8 --trigger-http --timeout=180 --memory 256
This is the code that processes the video:
const ffmpeg = require('fluent-ffmpeg');

// Cuts a section out of the source video without re-encoding (stream copy).
function cutVideo({videoUrl, startTime, duration, dist}) {
  return ffmpeg(videoUrl)
    .outputOptions('-movflags frag_keyframe+empty_moov')
    .videoCodec('copy')
    .audioCodec('copy')
    .format('mp4')
    .setStartTime(startTime)
    .setDuration(duration);
}
const sectionStream = cutVideo({
  videoUrl,
  startTime,
  duration,
  dist: tempFilePath,
});

const outputStream = bucket.file(sectionPath)
  .createWriteStream({
    metadata: {
      contentType: config.contentType,
    },
    public: true,
  });
The actual error stack looks like this:
Error: Output stream error: Maximum call stack size exceeded
at Pumpify.<anonymous> (/srv/node_modules/fluent-ffmpeg/lib/processor.js:498:34)
at emitOne (events.js:121:20)
at Pumpify.emit (events.js:211:7)
at Pumpify.Duplexify._destroy (/srv/node_modules/duplexify/index.js:191:15)
at /srv/node_modules/duplexify/index.js:182:10
at _combinedTickCallback (internal/process/next_tick.js:132:7)
at process._tickDomainCallback (internal/process/next_tick.js:219:9)
RangeError: Maximum call stack size exceeded
at replaceProjectIdToken (/srv/node_modules/@google-cloud/projectify/build/src/index.js:28:31)
at replaceProjectIdToken (/srv/node_modules/@google-cloud/projectify/build/src/index.js:37:30)
at replaceProjectIdToken (/srv/node_modules/@google-cloud/projectify/build/src/index.js:37:30)
at value.map.v (/srv/node_modules/@google-cloud/projectify/build/src/index.js:30:32)
at Array.map (<anonymous>)
at replaceProjectIdToken (/srv/node_modules/@google-cloud/projectify/build/src/index.js:30:23)
at replaceProjectIdToken (/srv/node_modules/@google-cloud/projectify/build/src/index.js:37:30)
at replaceProjectIdToken (/srv/node_modules/@google-cloud/projectify/build/src/index.js:37:30)
at value.map.v (/srv/node_modules/@google-cloud/projectify/build/src/index.js:30:32)
at Array.map (<anonymous>)
What could cause this error on a Google Cloud Function?
Apart from memory/CPU-based restrictions, long-running processes like this are simply not workable in Google Cloud Functions because of the execution timeout.
The only way to achieve this is to use Google App Engine Flex. By design it has the longest available timeout, which can be set at two levels: in app.yaml/gunicorn (or whichever web server you intend to use) and in the actual GAE request timeout.
The other services, GAE Standard and Google Cloud Functions, have strict request timeouts that you cannot raise beyond their hard limits (Cloud Functions tops out at 540 seconds). Those timeouts will not be sufficient for ffmpeg transcoding workloads.
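A minimal sketch of what that could look like as a Node.js/Express service on App Engine Flex (the /cut route, the BUCKET_NAME environment variable, and the request body fields are hypothetical; the ffmpeg options are taken from the question):

const express = require('express');
const ffmpeg = require('fluent-ffmpeg');
const {Storage} = require('@google-cloud/storage');

const app = express();
app.use(express.json());

const storage = new Storage();
// BUCKET_NAME is a hypothetical environment variable for this sketch.
const bucket = storage.bucket(process.env.BUCKET_NAME);

// Hypothetical endpoint doing the same cut-and-upload as the Cloud Function,
// but without the Cloud Functions execution-time ceiling.
app.post('/cut', (req, res) => {
  const {videoUrl, startTime, duration, sectionPath} = req.body;

  const outputStream = bucket.file(sectionPath).createWriteStream({
    metadata: {contentType: 'video/mp4'},
    public: true,
  });

  outputStream
    .on('error', (err) => {
      if (!res.headersSent) res.status(500).send(err.message);
    })
    .on('finish', () => {
      if (!res.headersSent) res.status(200).send(sectionPath);
    });

  ffmpeg(videoUrl)
    .outputOptions('-movflags frag_keyframe+empty_moov')
    .videoCodec('copy')
    .audioCodec('copy')
    .format('mp4')
    .setStartTime(startTime)
    .setDuration(duration)
    .on('error', (err) => {
      if (!res.headersSent) res.status(500).send(err.message);
    })
    .pipe(outputStream, {end: true});
});

const port = process.env.PORT || 8080;
app.listen(port, () => console.log(`Listening on ${port}`));

On the Flex side you would then also size the instance (CPU/memory) in app.yaml and raise any web-server-level timeout, as mentioned above.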