Eventarc enables you to read events from Google Cloud sources (via its Audit Logs integration) and custom sources (via its Pub/Sub integration) and then route them to Cloud Run services.
The event routing rules are defined with a trigger. In a trigger, you specify event filters such as service name, method name, and resource (which together define the event source), as well as the target of the events (which can only be a Cloud Run service as of today).
Let’s take a closer look at how to configure these triggers with the right event filters.
Triggers with Cloud Console
The easiest way to create triggers with the right filters is from Cloud Console. If you go to Cloud Run > Triggers and attempt to create a trigger, you’ll see the list of services supported via Pub/Sub or Audit Logs integration:
Once you choose a service (e.g., Cloud Storage), you can see the list of events that can originate from that service:
Then, it’s simply a matter of picking the event you’re interested in. The console creates the trigger with the right filters (service name, method name, and resource) for you:
This is all good, but what if you want to create triggers from the command line or automatically from the API? This is possible but you’ll need to do a little bit of work to figure out the right filters.
Pub/Sub triggers with gcloud
Creating a Pub/Sub trigger from the command line with gcloud is pretty straightforward:
gcloud eventarc triggers create events-pubsub-trigger \
What might not be obvious is that this trigger creates a new Pub/Sub topic under the covers. It then sets up a rule to route all messages from that topic to the Cloud Run service in the specified region. You can read more about how to find that topic ID in the Eventarc docs.
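For reference, a complete version of the command might look like the following. The service name hello and the region us-central1 are placeholders; substitute your own Cloud Run service and region:

```
gcloud eventarc triggers create events-pubsub-trigger \
  --destination-run-service=hello \
  --destination-run-region=us-central1 \
  --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished"
```

The single type filter is all a Pub/Sub trigger needs, since every message on the topic is routed to the service.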
At this point, you might wonder: what if I have an existing Pub/Sub topic? You can certainly read events from an existing topic by creating a Pub/Sub trigger with the --transport-topic flag:
gcloud eventarc triggers create events-pubsub-trigger-existing \
The event filters are the same for both new and existing topics.
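A complete version of this command might look like the sketch below; the service name, region, and topic path are placeholders for your own values:

```
gcloud eventarc triggers create events-pubsub-trigger-existing \
  --destination-run-service=hello \
  --destination-run-region=us-central1 \
  --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
  --transport-topic=projects/my-project/topics/my-topic
```

The only difference from the previous trigger is the --transport-topic flag pointing at the existing topic, so no new topic is created.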
Audit Log triggers with gcloud
Audit Log triggers are more complicated, as you’re dealing with multiple event source types, not just Pub/Sub. This means that you need to have more sophisticated event filters to capture what you want.
For example, if you want to capture Cloud Storage events when a new object is created in a bucket, you can create a trigger as follows:
gcloud eventarc triggers create events-quickstart-trigger \
Note that the event type is Audit Log (not Pub/Sub) and that’s specified as an event filter. There is also a service account. You need to define both the event type and service account for Audit Log based event sources.
What about the other two event filters (service name and method name)? These event filters further narrow down your event source to trigger only for certain services and under certain operations (i.e. methods) of those services.
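Putting the three filters together, a complete Audit Log trigger command might look like the following sketch. The service name, region, and service account are placeholders:

```
gcloud eventarc triggers create events-quickstart-trigger \
  --destination-run-service=hello \
  --destination-run-region=us-central1 \
  --event-filters="type=google.cloud.audit.log.v1.written" \
  --event-filters="serviceName=storage.googleapis.com" \
  --event-filters="methodName=storage.objects.create" \
  --service-account=PROJECT_NUMBER-compute@developer.gserviceaccount.com
```

The rest of this post walks through how to discover the right serviceName and methodName values for any service.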
How do you find these exact event filters for all services? It’s a three-step process:
- Make sure Audit Logs are enabled for the service.
- Perform an operation on the service.
- Check Audit Logs and convert them to filters.
Let’s go through these steps in detail.
How to find event filters using Audit Logs
Step 1: Make sure Audit Logs are enabled
Before you start creating an Audit Log trigger, you first need to make sure that the service you’re interested in has Audit Logs enabled. You do this by going to IAM & Admin > Audit Logs in Google Cloud Console. Here, you see a list of services:
Once you identify your service, you enable all Audit Logs on that service and save:
One tricky part here is that some services (e.g., BigQuery) have Audit Logs enabled by default, and you cannot disable them. Those services will not show up in the list, but that’s OK: you can consider them Audit Log enabled and move on to the next step.
Step 2: Perform an operation on the service
Next, perform an operation on the service that you’re interested in. For example, if you’re interested in object creation events in Cloud Storage, then, create/copy an object into a bucket:
gsutil cp random.txt gs://events-storage-atamel/random.txt
This not only creates the object in the bucket but also generates an Audit Log entry for it. In the next step, you will check this Audit Log.
Step 3: Check Audit Logs and convert to event filters
In the last step, check Audit Logs for the event you generated in the previous step. Go to Logging > Logs Explorer in Cloud Console. The Query results section lists the most recently generated logs, and the Query builder section can be used to filter through logs.
In this example, we’re interested in Cloud Storage events and only from a certain bucket. This is the query we can run to filter the results:
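The original post shows the query as a screenshot; a query along these lines, written in the Logging query language with the bucket name from this example, narrows the results to Cloud Storage Audit Log entries for that bucket:

```
protoPayload.serviceName="storage.googleapis.com"
protoPayload.resourceName:"events-storage-atamel"
```

Here = is an exact match and : is a substring ("has") match, which is handy when the resource name embeds the bucket in a longer path.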
We see logs for that bucket only:
In the logs, you will see entries of the storage.objects.create type. This is the log type that’s generated on new object creations. When you expand an entry, you can see the fields you can filter on under protoPayload, such as serviceName and methodName. These are the fields that can be used as event filters in the trigger. There’s also a resourceName field that can be useful. This field is optional and refers to the resource or collection that is the target of the operation. If omitted, all events for the given methodName will be delivered. In our particular example, we didn’t filter on resourceName, as we don’t know the names of created objects ahead of time. Event filtering is currently an exact match, which can be limiting, but more sophisticated regex/wildcard matching is planned for upcoming releases.
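On the receiving side, these same fields appear in the Audit Log payload that Eventarc delivers to your Cloud Run service. As a rough sketch (the exact envelope your service sees depends on the delivery format, so treat the payload shape as illustrative rather than a formal schema), extracting the filterable fields could look like this:

```python
# Sketch: pulling the filterable fields out of an Audit Log event payload.
# The dictionary layout mirrors the protoPayload structure seen in
# Logs Explorer; it is an assumption for illustration, not an official schema.

def extract_event_filters(event: dict) -> dict:
    """Return the serviceName/methodName/resourceName of an Audit Log event."""
    proto = event.get("protoPayload", {})
    return {
        "serviceName": proto.get("serviceName"),
        "methodName": proto.get("methodName"),
        "resourceName": proto.get("resourceName"),
    }

# Example payload, shaped like the log entry inspected in Step 3:
sample = {
    "protoPayload": {
        "serviceName": "storage.googleapis.com",
        "methodName": "storage.objects.create",
        "resourceName": "projects/_/buckets/events-storage-atamel/objects/random.txt",
    }
}

print(extract_event_filters(sample)["methodName"])  # storage.objects.create
```

This mirrors the trigger definition: the same serviceName and methodName values you filtered on appear in each delivered event, with resourceName identifying the specific object.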
Now that you have a deeper understanding of how to create triggers for event sources in Eventarc, try your hand at creating your own. Here are some links to get you started:
- Eventarc documentation
- Quickstart: Receiving Cloud Storage events
- Codelab: Trigger Cloud Run with events from Eventarc
As always, feel free to reach out to me on Twitter @meteatamel for any questions or feedback.
By Mete Atamel (Developer Advocate)
Source: Google Cloud Blog