Tags: node.js, firebase, google-cloud-functions, youtube-data-api

Firebase Cloud Function [Error: memory limit exceeded. Function invocation was interrupted.] on YouTube video upload


I am trying to upload videos to YouTube using a Firebase Cloud Function.

What I need: when a user uploads a video to Firebase Cloud Storage, the functions.storage.object().onFinalize event is triggered. In that handler I save the file to a temporary location, upload it from there to YouTube, and delete both copies once the upload finishes.
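Roughly, the wiring looks like this (a minimal sketch; uploadToYouTube and downloadToTemp are placeholder names, and the oauth2Client/requestData setup is omitted):

    const functions = require('firebase-functions');
    const admin = require('firebase-admin');
    const os = require('os');
    const path = require('path');

    admin.initializeApp();

    exports.uploadToYouTube = functions.storage.object().onFinalize((object) => {
        const filePath = object.name;
        const tempLocalFile = path.join(os.tmpdir(), path.basename(filePath));
        const bucket = admin.storage().bucket(object.bucket);

        // 1. Copy the uploaded video into /tmp (see the stream code below).
        // 2. Upload it to YouTube from /tmp (see insertVideo below).
        // 3. Delete the Storage copy; insertVideo removes the /tmp copy itself.
        return downloadToTemp(bucket, filePath, tempLocalFile)
            .then(() => insertVideo(tempLocalFile, oauth2Client, requestData))
            .then(() => bucket.file(filePath).delete());
    });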

It works fine for small files.

But if I upload a large file, the function is terminated with this error:

Error: memory limit exceeded. Function invocation was interrupted.

Code for uploading the video:

    var requestData = {
        'params': {
            'part': 'snippet,status'
        },
        'properties': {
            'snippet.categoryId': '22',
            'snippet.defaultLanguage': '',
            'snippet.description': docdata.shortDesc, // was mistakenly quoted as the literal string "docdata.shortDesc"
            'snippet.tags[]': '',
            'snippet.title': docdata.title,           // likewise unquoted so the actual title is used
            'status.embeddable': '',
            'status.license': '',
            'status.privacyStatus': 'public',
            'status.publicStatsViewable': ''
        },
        'mediaFilename': tempLocalFile
    };

    insertVideo(tempLocalFile, oauth2Client, requestData);

The insertVideo function:

    function insertVideo(file, oauth2Client, requestData) {
        return new Promise((resolve, reject) => {
            google.options({ auth: oauth2Client });

            var parameters = removeEmptyParameters(requestData['params']);
            parameters['auth'] = oauth2Client;
            // Stream the file from disk so it is not buffered in memory all at once.
            parameters['media'] = { body: fs.createReadStream(requestData['mediaFilename']) };
            parameters['notifySubscribers'] = false;
            parameters['resource'] = createResource(requestData['properties']);

            console.log("INSERT >>> ");
            google.youtube('v3').videos.insert(parameters, (error, received) => {
                // Remove the temp file whether or not the upload succeeded.
                try {
                    fs.unlinkSync(file);
                } catch (err) {
                    console.log(err);
                }
                if (error) {
                    console.log(error);
                    reject(error);
                } else {
                    console.log(received.data);
                    resolve(received.data);
                }
            });
        });
    }

Code for creating the temp local file:

    bucket.file(filePath).createReadStream()
        .on('error', (err) => {
            reject(err);
        })
        .on('response', (response) => {
            console.log(response);
        })
        .on('end', () => {
            console.log("The file is fully downloaded");
            resolve();
        })
        .pipe(fs.createWriteStream(tempLocalFile));
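For reference, the same download can also be done with the client library's download helper, which wraps this piping internally (a sketch, assuming the same bucket, filePath, tempLocalFile, resolve, and reject):

    // Streams the object to tempLocalFile and resolves once the write finishes.
    bucket.file(filePath).download({ destination: tempLocalFile })
        .then(() => {
            console.log("The file is fully downloaded");
            resolve();
        })
        .catch(reject);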

Every file read and write is handled by streams, so does anyone have an idea why the memory issue is happening?


Solution

  • The only writable part of the filesystem in Cloud Functions is the /tmp directory. As per the documentation:

    This is a local disk mount point known as a "tmpfs" volume in which data written to the volume is stored in memory. Note that it will consume memory resources provisioned for the function.

    This is why you hit the memory limit with bigger files.

    Your options are:

    • Allocate more memory to your function (currently up to 2 GB); see the runWith sketch after this list.
    • Execute the upload from an environment where you can write to the filesystem. For example, your Cloud Function could call an App Engine Flexible service to execute the upload.
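
    For the first option, memory (and the timeout) are raised with runWith when the function is defined. A minimal sketch, assuming a 1st-gen function:

        const functions = require('firebase-functions');

        // Request 2 GB of memory (the current maximum) and the
        // maximum 540-second timeout for this function.
        exports.uploadToYouTube = functions
            .runWith({ memory: '2GB', timeoutSeconds: 540 })
            .storage.object()
            .onFinalize((object) => {
                // ...download to /tmp and upload to YouTube as before
            });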