google-cloud-storage, eventarc

Target multiple buckets with Eventarc?


Currently we are trying to use Eventarc to send us all finalized files for our buckets. This works great; however, it looks like Eventarc can only target a single bucket, so we would need to enable it for every bucket on its own. Is there a way to target multiple buckets?

Currently we use the following to create the Eventarc trigger:

gcloud eventarc triggers create storage-events \
--location="$LOCATION" \
--destination-gke-cluster="CLUSTER-NAME" \
--destination-gke-location="$LOCATION" \
--destination-gke-namespace="$NAMESPACE" \
--destination-gke-service="$SERVICE" \
--destination-gke-path="api/events/receive" \
--event-filters="type=google.cloud.storage.object.v1.finalized" \
--event-filters="bucket=$BUCKET" \
--service-account=$SERVICEACCOUNT-compute@developer.gserviceaccount.com

The problem is that we generate a bucket per customer, so we would need to create the trigger for each bucket (which is a lot). Is there a simpler way?


Solution

  • You have several options.

    If you want to use the native event google.cloud.storage.object.v1.finalized, you must select one and only one bucket. Therefore, you have to create one Eventarc trigger per bucket.
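    If you go this route, the per-bucket triggers can at least be scripted. A minimal sketch, assuming the variables from the question are set and all buckets in the project should be covered (trigger names must be unique, so the bucket name is appended; adapt the bucket listing if you only want a subset):

    ```shell
    # List every bucket in the current project and create one trigger per bucket.
    # Trigger names must be unique, hence the "${BUCKET}" suffix.
    for BUCKET in $(gcloud storage buckets list --format="value(name)"); do
      gcloud eventarc triggers create "storage-events-${BUCKET}" \
        --location="$LOCATION" \
        --destination-gke-cluster="CLUSTER-NAME" \
        --destination-gke-location="$LOCATION" \
        --destination-gke-namespace="$NAMESPACE" \
        --destination-gke-service="$SERVICE" \
        --destination-gke-path="api/events/receive" \
        --event-filters="type=google.cloud.storage.object.v1.finalized" \
        --event-filters="bucket=${BUCKET}" \
        --service-account="$SERVICEACCOUNT-compute@developer.gserviceaccount.com"
    done
    ```

    Note that you'd also have to run this (or the single-trigger command) whenever a new customer bucket is created.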

    If you can use the audit-log event storage.objects.create instead, you have to activate the Data Access audit logs for Cloud Storage, but you cannot filter on buckets: ALL buckets are listened to. If you don't want that, you can play with the Cloud Logging router to discard the logs you don't want (especially the audit logs of the buckets you don't care about).
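    For this audit-log route, the trigger filters on the service and method instead of a single bucket. A sketch, reusing the destination flags from the question (trigger name is an example; Data Access audit logs for Cloud Storage must already be enabled):

    ```shell
    # One trigger catches storage.objects.create audit-log entries
    # from every bucket in the project.
    gcloud eventarc triggers create storage-audit-events \
      --location="$LOCATION" \
      --destination-gke-cluster="CLUSTER-NAME" \
      --destination-gke-location="$LOCATION" \
      --destination-gke-namespace="$NAMESPACE" \
      --destination-gke-service="$SERVICE" \
      --destination-gke-path="api/events/receive" \
      --event-filters="type=google.cloud.audit.log.v1.written" \
      --event-filters="serviceName=storage.googleapis.com" \
      --event-filters="methodName=storage.objects.create" \
      --service-account="$SERVICEACCOUNT-compute@developer.gserviceaccount.com"
    ```

    Your receiver then reads the bucket and object name out of the audit-log payload rather than the native storage event.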

    A last solution, if you really want to use Eventarc (especially for the CloudEvents format of the messages), is the following:

    • Create a Cloud Storage Pub/Sub notification for every bucket that you want to listen to. Use the same Pub/Sub topic every time.
    • Create a custom Eventarc trigger on Pub/Sub and catch the messages published on that topic.
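    A sketch of those two steps, with a hypothetical topic name (run the notification command once per bucket, e.g. from your bucket-provisioning code):

    ```shell
    # 1. One topic shared by all buckets (name is an example)
    gcloud pubsub topics create storage-finalize-events

    # 2. Per bucket: publish OBJECT_FINALIZE events to the shared topic
    gsutil notification create -t storage-finalize-events -f json \
      -e OBJECT_FINALIZE "gs://$BUCKET"

    # 3. A single Eventarc trigger on the shared topic
    gcloud eventarc triggers create storage-pubsub-events \
      --location="$LOCATION" \
      --destination-gke-cluster="CLUSTER-NAME" \
      --destination-gke-location="$LOCATION" \
      --destination-gke-namespace="$NAMESPACE" \
      --destination-gke-service="$SERVICE" \
      --destination-gke-path="api/events/receive" \
      --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
      --transport-topic="storage-finalize-events" \
      --service-account="$SERVICEACCOUNT-compute@developer.gserviceaccount.com"
    ```

    Keep in mind the event your service receives is then a Pub/Sub messagePublished CloudEvent whose data field carries the Cloud Storage notification JSON, not the native object.v1.finalized event, so the handler has to unwrap it.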