google-cloud-platform, eventarc

EventArc trigger on multiple types


I'm trying to create an Eventarc trigger that listens for multiple CloudEvent types. However, all I'm seeing is that a trigger has exactly one CloudEvent type and can only filter further on attributes.

I do control the publishing code, so I could add logic that clones every published CloudEvent and republishes it with a generic type, but that feels pretty dirty and the error paths are brutal.

I could also subscribe directly to the Pub/Sub topic behind the Eventarc trigger, but GCP could change how Eventarc is powered at any moment and break my subscriber.


Solution

  • Answering this for anyone who's looking for the same solution.

    As of now the docs state "Create an Eventarc trigger so that your Cloud Functions service receives notifications of a specific event or set of events", but navigating through the docs' pages you can't find out how.

    Simply adding multiple --trigger-event-filters lines for type in the cloudbuild.yaml will cause the build to fail, because only one event type per trigger can be deployed. Setting the arg like an array fails too:

    e.g. --trigger-event-filters='type=google.cloud.firestore.document.v1.written,google.cloud.firestore.document.v1.deleted', with or without [ ], will cause the build to fail.

    What I did was separate the second trigger into another Cloud Build step. This results in 1) first deploying your Cloud Function with the first trigger, and 2) then creating the second trigger with your CF as its destination.

    Here's the result:

      steps:
      - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk:alpine'
        entrypoint: 'bash'
        id: 'deploy-func-and-first-trigger'
        args:
          - "-c"
          - |
            gcloud functions deploy your-cf-name \
            --gen2 \
            --trigger-event-filters='type=google.cloud.firestore.document.v1.written' \
            --trigger-event-filters='database=(default)'  \
            --trigger-location=europe-west3 \
            --trigger-service-account='[email protected]' \
            --runtime python310 \
            --entry-point hello_firestore \
            --region=europe-west3 \
            --source=.
      - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk:alpine'
        waitFor: ['deploy-func-and-first-trigger'] 
        entrypoint: 'bash'
        args:
          - "-c"
          - |
            gcloud eventarc triggers create your-second-trigger \
            --destination-run-service=your-cf-name \
            --destination-run-region=europe-west3 \
            --location=europe-west3 \
            --event-filters='type=google.cloud.firestore.document.v1.deleted' \
            --event-filters='database=(default)'  \
            --event-data-content-type='application/protobuf' \
            --service-account='[email protected]'
    

    Notes: the waitFor arg ensures the second step runs only after the first one has completed successfully, so you won't risk attaching the second trigger to "nothing", to a failed deploy, or to an outdated version of your CF.
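Since both triggers now invoke the same function, the entry point has to tell the two event types apart. Below is a minimal, hypothetical sketch of how the dispatch inside `hello_firestore` could branch on the CloudEvent `type` attribute; the handler names and return values are placeholders I made up for illustration, not part of the original setup.

```python
# Hypothetical dispatch sketch for a shared entry point. With both triggers
# pointing at the same Cloud Function, the handler can branch on the
# CloudEvent "type" attribute that Eventarc sets on every delivery.

EVENT_WRITTEN = "google.cloud.firestore.document.v1.written"
EVENT_DELETED = "google.cloud.firestore.document.v1.deleted"

def handle_written(data):
    # Placeholder: react to a document create/update.
    return f"written: {data}"

def handle_deleted(data):
    # Placeholder: react to a document delete.
    return f"deleted: {data}"

def dispatch(event_type, data):
    """Route a CloudEvent payload to the handler matching its type attribute."""
    handlers = {
        EVENT_WRITTEN: handle_written,
        EVENT_DELETED: handle_deleted,
    }
    handler = handlers.get(event_type)
    if handler is None:
        # Fail loudly on types neither trigger should deliver.
        raise ValueError(f"unexpected CloudEvent type: {event_type}")
    return handler(data)
```

In a gen2 function the real entry point would receive a CloudEvent object (for example via the Functions Framework's `@functions_framework.cloud_event` decorator) and read its `type` attribute before dispatching; that wiring is omitted here to keep the sketch self-contained.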

    Hopefully by the end of summer Google will improve the docs regarding Eventarc integrations and make the process smoother.