
Integrating Eventarc And Workflows

I previously talked about Eventarc for choreographed (event-driven) Cloud Run services and introduced Workflows for orchestrated services.

Eventarc and Workflows are very useful in strictly choreographed or orchestrated architectures. However, you sometimes need a hybrid architecture that combines choreography and orchestration. 

For example, imagine a use case where a message to a Pub/Sub topic triggers an automated infrastructure workflow or where a file upload to a Cloud Storage bucket triggers an image processing workflow. In these use cases, the trigger is an event but the actual work is done as an orchestrated workflow.

How do you implement these hybrid architectures in Google Cloud? The answer lies in Eventarc and Workflows integration. 

Eventarc triggers

To recap, an Eventarc trigger enables you to read events from Google Cloud sources via Audit Logs and from custom sources via Pub/Sub, and route them to Cloud Run services.

One limitation of Eventarc is that it currently only supports Cloud Run as a target. This will change in the future as more event targets are supported. It’d be nice to have a future Eventarc trigger that routes events from different sources to Workflows directly. 

In the absence of such a Workflows-enabled trigger today, you need to do a little bit of work to connect Eventarc to Workflows. Specifically, you need to use a Cloud Run service as a proxy in the middle to execute the workflow. 

Let’s take a look at a couple of concrete examples.

Eventarc Pub/Sub + Workflows integration

In the first example, imagine you want a Pub/Sub message to trigger a workflow. 

Define and deploy a workflow

First, define a workflow that you want to execute. Here’s a sample workflow.yaml that simply decodes and logs the Pub/Sub message body:

main:
  params: [args]
  steps:
    - init:
        assign:
          - headers: ${args.headers}
          - body: ${args.body}
...
    - pubSubMessageStep:
        call: sys.log
        args:
            text: ${"Decoded Pub/Sub message data is " + text.decode(base64.decode(args.body.message.data))}
            severity: INFO

Deploy the workflow with a single command:

gcloud workflows deploy ${WORKFLOW_NAME} --source=workflow.yaml --location=${REGION}
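To see what the workflow’s logging step does, here is the same transformation expressed in Node.js (a sketch; the payload below is an illustrative Pub/Sub push body, not output captured from the deployed service):

```javascript
// The Pub/Sub push body wraps the message data in base64, so the workflow
// applies base64.decode followed by text.decode. The Node.js equivalent:
const body = {
  message: {
    // What Pub/Sub would carry for --message="Hello there"
    data: Buffer.from('Hello there').toString('base64')
  }
};

// Equivalent of text.decode(base64.decode(args.body.message.data))
const decoded = Buffer.from(body.message.data, 'base64').toString('utf8');
console.log(`Decoded Pub/Sub message data is ${decoded}`);
```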

Deploy a Cloud Run service to execute the workflow

Next, you need a Cloud Run service to execute this workflow. Workflows has an execution API and client libraries that you can use for your favorite language. Here’s an example of the execution code from a Node app.js file. It simply passes the received HTTP request headers and body to the workflow and executes it:

const {ExecutionsClient} = require('@google-cloud/workflows');
const client = new ExecutionsClient();

const execResponse = await client.createExecution({
  parent: client.workflowPath(GOOGLE_CLOUD_PROJECT, WORKFLOW_REGION, WORKFLOW_NAME),
  execution: {
    argument: JSON.stringify({headers: req.headers, body: req.body})
  }
});
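For an illustrative Pub/Sub push request, the argument string handed to the workflow would look like this (a sketch; the header and message values are assumptions, not captured traffic):

```javascript
// Shape of the argument the Cloud Run proxy passes to the workflow.
// The workflow then reads args.headers and args.body (see workflow.yaml).
const req = {
  headers: { 'content-type': 'application/json' }, // illustrative header
  body: {
    message: {
      data: Buffer.from('Hello there').toString('base64')
    }
  }
};

const argument = JSON.stringify({ headers: req.headers, body: req.body });
console.log(argument);
```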

Deploy the Cloud Run service with the Workflows name and region passed as environment variables:

gcloud run deploy ${SERVICE_NAME} \
  --image gcr.io/${PROJECT_ID}/${SERVICE_NAME} \
  --region=${REGION} \
  --allow-unauthenticated \
  --update-env-vars GOOGLE_CLOUD_PROJECT=${PROJECT_ID},WORKFLOW_REGION=${REGION},WORKFLOW_NAME=${WORKFLOW_NAME}

Connect a Pub/Sub topic to the Cloud Run service

With Cloud Run and Workflows connected, the next step is to connect a Pub/Sub topic to the Cloud Run service by creating an Eventarc Pub/Sub trigger:

gcloud eventarc triggers create ${SERVICE_NAME} \
  --destination-run-service=${SERVICE_NAME} \
  --destination-run-region=${REGION} \
  --location=${REGION} \
  --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished"

This creates a Pub/Sub topic under the covers that you can access with:

export TOPIC_ID=$(basename $(gcloud eventarc triggers describe ${SERVICE_NAME} --location=${REGION} --format='value(transport.pubsub.topic)'))

Trigger the workflow

Now that all the wiring is done, you can trigger the workflow by simply sending a Pub/Sub message to the topic created by Eventarc:

gcloud pubsub topics publish ${TOPIC_ID} --message="Hello there"

In a few seconds, you should see the message in the Workflows logs, confirming that the Pub/Sub message triggered an execution of the workflow.

Eventarc Audit Log-Storage + Workflows integration

In the second example, imagine you want a file creation event in a Cloud Storage bucket to trigger a workflow. The steps are similar to the Pub/Sub example with a few differences.

Define and deploy a workflow

As an example, you can use this workflow.yaml that logs the bucket and file names:

main:
  params: [args]
  steps:
...
    - log:
        call: sys.log
        args:
            text: ${"Workflows received event from bucket " + bucket + " for file " + file}
            severity: INFO

Deploy a Cloud Run service to execute the workflow

In the Cloud Run service, you read the CloudEvent from Eventarc and extract the bucket and file names in app.js using the CloudEvents SDK and the Google Events library:

const {HTTP} = require('cloudevents');
const {toLogEntryData} = require('@google/events/cloud/audit/v1/LogEntryData');

const cloudEvent = HTTP.toEvent({ headers: req.headers, body: req.body });
// "protoPayload": {"resourceName": "projects/_/buckets/events-atamel-images-input/objects/atamel.jpg"}
const logEntryData = toLogEntryData(cloudEvent.data);
const tokens = logEntryData.protoPayload.resourceName.split('/');
const bucket = tokens[3];
const file = tokens[5];
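The extraction above can be sketched as a small pure function (the helper name parseResourceName is hypothetical, not part of the sample code):

```javascript
// An Audit Log resourceName looks like
// "projects/_/buckets/<bucket>/objects/<object>"; splitting on "/" puts the
// bucket name at index 3 and the object name at index 5.
function parseResourceName(resourceName) {
  const tokens = resourceName.split('/');
  return { bucket: tokens[3], file: tokens[5] };
}

const { bucket, file } = parseResourceName(
  'projects/_/buckets/events-atamel-images-input/objects/atamel.jpg');
console.log(bucket, file); // → events-atamel-images-input atamel.jpg
```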

Executing the workflow is similar to the Pub/Sub example, except you don’t pass in the whole HTTP request but rather just the bucket and file name to the workflow:

const execResponse = await client.createExecution({
  parent: client.workflowPath(GOOGLE_CLOUD_PROJECT, WORKFLOW_REGION, WORKFLOW_NAME),
  execution: {
    argument: JSON.stringify({bucket: bucket, file: file})
  }
});

Connect Cloud Storage events to the Cloud Run service

To connect Cloud Storage events to the Cloud Run service, create an Eventarc Audit Logs trigger with the service and method names for Cloud Storage:

gcloud eventarc triggers create ${SERVICE_NAME} \
  --destination-run-service=${SERVICE_NAME} \
  --destination-run-region=${REGION} \
  --location=${REGION} \
  --event-filters="type=google.cloud.audit.log.v1.written" \
  --event-filters="serviceName=storage.googleapis.com" \
  --event-filters="methodName=storage.objects.create" \
  --service-account=${PROJECT_NUMBER}-compute@developer.gserviceaccount.com

Trigger the workflow

Finally, you can trigger the workflow by creating and uploading a file to the bucket:

echo "Hello World" > random.txt
gsutil cp random.txt gs://${BUCKET}/random.txt

In a few seconds, you should see the workflow log the bucket and object name.

Conclusion

In this blog post, I showed you how to trigger a workflow from two different event types via Eventarc. It’s certainly possible to go the other way as well: from Workflows, you can trigger a Cloud Run service via Eventarc by publishing a Pub/Sub message (see connector_publish_pubsub.workflows.yaml) or by uploading a file to a bucket. 


All the code mentioned in this blog post is in eventarc-workflows-integration. Feel free to reach out to me on Twitter @meteatamel for any questions or feedback.

By: Mete Atamel (Developer Advocate)
Source: Google Cloud Blog


