Tags: go, google-cloud-functions, protocol-buffers, bazel

Google Cloud Functions with Bazel-built protobuf dependencies


If I use Bazel to build my protobuf-dependent Go serverless functions, Bazel makes the protobuf-generated Go code available at the import path I specify.
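For example, a target like the following — a sketch using rules_go, where the target names and the import path `example.com/myfuncs/gen/greeter` are made up:

```python
# BUILD.bazel (sketch; names and the import path are hypothetical)
load("@rules_proto//proto:defs.bzl", "proto_library")
load("@io_bazel_rules_go//proto:def.bzl", "go_proto_library")

proto_library(
    name = "greeter_proto",
    srcs = ["greeter.proto"],
)

go_proto_library(
    name = "greeter_go_proto",
    # Bazel serves the generated Go code at this import path,
    # even though no such package exists outside the Bazel build.
    importpath = "example.com/myfuncs/gen/greeter",
    proto = ":greeter_proto",
)
```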

Google Cloud Functions for Go requires the use of Go modules.

How can I add the dummy import path created by Bazel to my go.mod file? Deploying the function to Google Cloud fails because the dummy import cannot be resolved. (Google Cloud requires me to upload my Go source; AWS Lambda would let me upload a binary, which would work fine.)

I'm guessing I'll have to either go with AWS Lambda, use serverless containers, or write a genrule that copies the proto-generated code into my source directory, but I'd like to avoid that ugliness.
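For reference, that genrule workaround would look roughly like this. It assumes rules_go's `go_generated_srcs` output group and hypothetical target names; note that a genrule writes into bazel-bin, so a small script would still have to copy the result back into the source tree:

```python
# BUILD.bazel (sketch of the copy-back workaround; names are hypothetical)

# Pull the generated .pb.go files out of the go_proto_library target.
filegroup(
    name = "greeter_pb_go",
    srcs = [":greeter_go_proto"],
    output_group = "go_generated_srcs",
)

# Stage the generated source under a stable name in bazel-bin;
# copying it into the workspace itself has to happen outside Bazel.
genrule(
    name = "copy_greeter_pb_go",
    srcs = [":greeter_pb_go"],
    outs = ["greeter.pb.go"],
    cmd = "cp $(SRCS) $@",
)
```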


Solution

  • I work at Google on Go and Google Cloud Functions.

    I see a few options for using Cloud Functions:

    • Publish the generated code publicly. You may not want to do this for a variety of reasons.
    • Copy the generated code into your source directory. This is the easiest option. When you deploy your function, the current directory gets zipped up and sent to be built; we don't copy any dependencies from outside your current directory. If you do this, you can import the generated code by prefixing its package path with the module path declared in your go.mod (see the Go sketch after this list).
    • Use vendoring. If you run go mod vendor and have it grab your generated code (at whatever path you choose), it will create a vendor directory with all of your dependencies. The Cloud Functions builder prefers go.mod over vendor, though, so you would have to .gcloudignore the go.mod and go.sum files to make sure they don't get uploaded when you deploy your code (see the sketch after this list). https://cloud.google.com/functions/docs/writing/specifying-dependencies-go has more information.
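A sketch of the copy-into-source option (the second bullet), assuming go.mod declares `module example.com/myfuncs` and the generated code was copied under gen/greeter/ — all names, including the generated type, are hypothetical:

```go
// function.go — lives in the same directory as go.mod.
package myfuncs

import (
	"fmt"
	"net/http"

	// Resolves locally because gen/greeter sits under the module path.
	greeter "example.com/myfuncs/gen/greeter"
)

// Hello is the HTTP Cloud Function entry point.
func Hello(w http.ResponseWriter, r *http.Request) {
	req := &greeter.HelloRequest{} // hypothetical generated type
	fmt.Fprintf(w, "received %T", req)
}
```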
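And a sketch of the vendoring option (the third bullet); the function name, runtime, and trigger flags are illustrative, and it assumes go.mod already resolves the generated code (for example via a replace directive pointing at a local copy):

```sh
# Vendor all dependencies, including the generated code, into ./vendor.
go mod vendor

# Keep go.mod and go.sum out of the upload so the builder uses vendor/.
cat > .gcloudignore <<'EOF'
go.mod
go.sum
EOF

# Deploy; only the current directory (minus ignored files) is uploaded.
gcloud functions deploy Hello --runtime go111 --trigger-http
```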