We've been using Firebase Functions for 2+ years and have amassed well over 120 HTTP, callable, triggered, and scheduled functions, all exported from a single `functions/index` and managed by a single `package.json`, probably like a lot of you. As you can imagine, we have some old dependencies in there that we're hesitant to update because it's an awful lot of code to go through, test, etc. So it got me thinking, and I'm asking if any of you have done this or know why this wouldn't work...
Looking at the GCP dashboard, each function is a separate, stand-alone service. But if you download the code from there, you end up with the full build of all 120+ functions, node modules, etc. So if I ran `firebase deploy` on my single `functions` directory (if quotas weren't an issue), it looks like every one of those services would be redeployed carrying the full build.
That got me thinking - considering I can't and don't want to build my entire project and deploy all functions at once, do I have to have them all in a single `functions` directory, sharing a single `package.json` and dependencies, and exported from a single `functions/index`?
Is there any reason I couldn't have, for example:
- functions
  - functionSingleA (running on node 10)
    - lib/index.js
    - package.json (stripe 8.92 and joi)
    - src/index.ts
    - node_modules
  - functionGroupB (running on node 12)
    - lib/index.js
    - package.json (stripe 8.129 and @hapi/joi)
    - src/index.ts
    - node_modules
I know that I lose the ability to deploy all at once, but I don't have that luxury any more due to quotas. Beyond that, is there any reason this wouldn't work? After all, as best I can tell, Firebase Functions are just individual serverless Cloud Functions with Firebase credentials built in. Am I missing something, or do you do this and it works fine (or breaks everything)?
A Firebase engineer confirmed through support that this is absolutely possible, but also check out the discussion between me and @samthecodingman. You can break up your functions into completely self-contained modules or groups with different `package.json` files and dependencies, and deploy each one (individually or as groups) without affecting other functions.

What you lose in return is the ability to deploy everything with a single `firebase deploy --only functions` command (though @samthecodingman presented a solution), and you lose the ability to emulate functions locally. I don't have a workaround for that yet.
It should be possible by tweaking the file structure to this:
- functionProjects
  - deployAll.sh
  - node10
    - deploy.sh
    - firebase.json
    - functions
      - lib/index.js
      - package.json (stripe 8.92 and joi)
      - src/index.ts
      - node_modules
  - node12
    - deploy.sh
    - firebase.json
    - functions
      - lib/index.js
      - package.json (stripe 8.129 and @hapi/joi)
      - src/index.ts
      - node_modules
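Each subproject can then pin its own runtime independently, since the Firebase CLI reads the Node version from the `engines` field in `package.json`. As a minimal sketch for the node10 project (the `functions-node10` name and exact dependency versions are illustrative, not from the original setup):

```json
{
  "name": "functions-node10",
  "main": "lib/index.js",
  "engines": { "node": "10" },
  "dependencies": {
    "stripe": "8.92.0",
    "joi": "*"
  }
}
```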
As a rough idea, you should be able to use a script to perform targeted deployments. A targeted deploy leaves the other functions untouched (i.e. it won't ask you to delete the functions that are missing from the local source).
Each `deploy.sh` should change the working directory to where it is located, and then execute a targeted deploy command:
```bash
#!/bin/bash
# Change the working directory to where this script resides,
# so the deploy runs against this folder's firebase.json.
SCRIPTPATH=$(readlink -f "$0")
SCRIPTPARENT=$(dirname "$SCRIPTPATH")
pushd "$SCRIPTPARENT"
firebase deploy --only functions:function1InThisFolder,functions:function2InThisFolder,functions:function3InThisFolder,...
popd
```
The `deployAll.sh` file just executes each 'child' folder's `deploy.sh`:
```bash
#!/bin/bash
# Run each subproject's targeted deploy in sequence.
/bin/bash ./node10/deploy.sh
/bin/bash ./node12/deploy.sh
```
This requires maintaining the list of functions in each `deploy.sh`, but I don't think that's too tall an ask. You could mock the `firebase-functions` library so that calls to `functions.https.onRequest()` (along with the other function exports) just return `true`, and use that to get a dynamic list of functions if you so desire; a rough sketch of that idea follows.
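As a minimal sketch, assuming the compiled `lib/index.js` can be loaded outside the Functions runtime without side effects (if it can't, you'd stub out `firebase-functions` first as described above), the target list can be derived from the entry point's export names, since each exported name corresponds to a deployable function:

```bash
#!/bin/bash
# Sketch: build a "functions:a,functions:b,..." target list from the
# export names of the compiled entry point, then deploy only those.
TARGETS=$(node -e "
  const fns = require('./functions/lib/index.js');
  console.log(Object.keys(fns).map(n => 'functions:' + n).join(','));
")
firebase deploy --only "$TARGETS"
```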
You could also flatten the file structure so that `./node10` and `./node12` are the deployed function directories themselves (instead of the nested `functions` folders) by adding `"functions": { "source": "." }` to their respective `firebase.json` files.
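With that flattening, each subproject's `firebase.json` would look something like this minimal sketch (any other hosting or firestore config you use would sit alongside it):

```json
{
  "functions": {
    "source": "."
  }
}
```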