Google Cloud | AppDev

3 Common Serverless Patterns To Build With Workflows

In January 2021, our Workflows orchestration and automation service reached General Availability. At the same time, we updated Workflows with a preview of Connectors, which provide seamless integration with other Google Cloud products. Workflows plus Connectors are a great way to design common architecture patterns that can help you build advanced serverless applications.

Workflows is a serverless product designed to orchestrate work across Google Cloud APIs as well as any HTTP-based API available on the internet. It requires no infrastructure management and generates no charges while workflows are waiting for operations to complete. You can learn more about Workflows’ core capabilities in our previous blog post.

In this blog post, we will take a look at a few useful architecture patterns including scheduling recurring workflow executions, handling long-running API requests by polling for results, and iterating through an array of database entries. 

Scheduled workflows

Let’s consider an e-commerce website or a gaming application that requires the support team’s intervention whenever user traffic falls outside an expected, normal range. For example, an exceptionally low number of online users may indicate an outage, while a higher-than-expected number of concurrent users may cause scalability issues. 

The number of concurrent online users is stored in a Firestore database as a distributed counter updated by log-in/log-out transactions. Our workflow needs to periodically check the value of the counter and react accordingly, depending on its value. 

Consider the following workflow:

The workflow is triggered every 5 minutes and retrieves the value of the current user counter from a Firestore database using the Firestore Connector. Along with the counter value, it also retrieves the last state of the traffic, e.g., “Low”, “Normal”, “High”, that was saved during the workflow’s previous run. 

Workflows’ built-in switch step, combined with a custom formula, determines whether the current counter value falls into a different state than the one saved during the workflow’s previous run. If so, the workflow saves the new state in the Firestore database and uses the Pub/Sub Connector to push a message to the support team, informing them about the state change. Because the workflow checks the last recorded state as well as the current counter value, only state changes result in notifications. 
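A sketch of what such a workflow definition might look like in Workflows’ YAML syntax. The project ID, document paths, field names, and traffic thresholds below are all hypothetical placeholders, and the Firestore document shapes are simplified:

```yaml
main:
  steps:
    # Read the current concurrent-user counter (hypothetical document path).
    - read_counter:
        call: googleapis.firestore.v1.projects.databases.documents.get
        args:
          name: projects/my-project/databases/(default)/documents/metrics/concurrentUsers
        result: counterDoc
    # Read the traffic state saved by the previous run.
    - read_previous_state:
        call: googleapis.firestore.v1.projects.databases.documents.get
        args:
          name: projects/my-project/databases/(default)/documents/metrics/trafficState
        result: stateDoc
    - extract_values:
        assign:
          - count: ${int(counterDoc.fields.value.integerValue)}
          - lastState: ${stateDoc.fields.state.stringValue}
    # Classify the counter value into a state; thresholds are examples only.
    - classify:
        switch:
          - condition: ${count < 100}
            next: set_low
          - condition: ${count > 10000}
            next: set_high
        next: set_normal
    - set_low:
        assign:
          - newState: "Low"
        next: compare
    - set_high:
        assign:
          - newState: "High"
        next: compare
    - set_normal:
        assign:
          - newState: "Normal"
    # Only a state change should produce a notification.
    - compare:
        switch:
          - condition: ${newState == lastState}
            next: end
    - save_new_state:
        call: googleapis.firestore.v1.projects.databases.documents.patch
        args:
          name: projects/my-project/databases/(default)/documents/metrics/trafficState
          body:
            fields:
              state:
                stringValue: ${newState}
    # Pub/Sub message payloads must be base64-encoded.
    - notify_support:
        call: googleapis.pubsub.v1.projects.topics.publish
        args:
          topic: projects/my-project/topics/traffic-alerts
          body:
            messages:
              - data: ${base64.encode(text.encode("Traffic state changed to " + newState))}
```

A Cloud Scheduler job can then invoke this workflow on a five-minute cron schedule.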

With only a few steps, the workflow above becomes a reliable serverless application with full tracking of execution history. Built-in Identity and Access Management (IAM) integration reduces the complexity of interacting with other Google Cloud products, like Firestore or Pub/Sub.

Learn how to schedule workflow executions using Cloud Scheduler, similar to the example above, in this guide.

Workflows with API polling 

Consider a workflow that requests an execution of a long-running job using an external API. The external API accepts a job execution request and returns a unique JobID that can be used to poll for this job’s execution status. The job can take hours, and the workflow can proceed to its next steps only once the job is completed. Because this API offers no way to notify the workflow when a job completes, the workflow needs to periodically poll for the job status.

The workflow presented below implements this pattern, checking the status of the job every 2 minutes. Note that Workflows’ pricing model is based on the number of executed steps and there is no time-related charge for a sleep operation. Workflows can run for up to a year, so you can be confident that they will follow through on even the longest-running jobs.
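The polling loop can be sketched as follows. The external API’s URL, its JSON response fields (jobId, status), and the “DONE” status value are hypothetical; a real workflow would also handle failed jobs:

```yaml
main:
  steps:
    # Submit the long-running job to the (hypothetical) external API.
    - start_job:
        call: http.post
        args:
          url: https://example.com/api/jobs
        result: startResult
    - assign_job_id:
        assign:
          - jobId: ${startResult.body.jobId}
    # Poll for the job status using the returned JobID.
    - check_status:
        call: http.get
        args:
          url: ${"https://example.com/api/jobs/" + jobId}
        result: statusResult
    - decide:
        switch:
          - condition: ${statusResult.body.status == "DONE"}
            next: job_finished
    # Sleeping incurs no time-based charge; loop back after two minutes.
    - wait:
        call: sys.sleep
        args:
          seconds: 120
        next: check_status
    - job_finished:
        return: ${statusResult.body}
```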

In real-life scenarios, you may need to add an extra step at the beginning of this workflow to retrieve an authentication key for the external API from a secure storage system. We recommend using Secret Manager as a key or password storage system and reading key values with the Secret Manager Connector.
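Such a step might look like the following sketch, assuming a hypothetical secret named external-api-key; the connector returns the secret payload base64-encoded:

```yaml
    # Fetch the latest version of the secret (hypothetical secret name).
    - get_api_key:
        call: googleapis.secretmanager.v1.projects.secrets.versions.access
        args:
          name: projects/my-project/secrets/external-api-key/versions/latest
        result: secretResult
    # The payload is base64-encoded; decode it into a plain string.
    - decode_api_key:
        assign:
          - apiKey: ${text.decode(base64.decode(secretResult.payload.data))}
```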

Iterating through an array of database records

In this example, the application needs to check customer records once a day, and send email reminders to customers with overdue invoices. 

The workflow below uses a Firestore Connector to run a query that retrieves entries for all customers with overdue payments. The workflow then iterates through this set and sends an email reminder about the pending payment to each customer, using an external Email API like SendGrid.

The above example uses Workflows’ ability to process arrays and perform tasks for every element of an array. By specifying error handling and retries in the workflow, you can be sure that intermittent failures, or errors with a particular entry, will not prevent the remaining customer messages from being sent successfully.
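An index-based iteration with a per-element retry might be sketched like this. The collection name, field names, sender address, and query shape are hypothetical, the Firestore runQuery result handling is simplified, and the sketch assumes a SendGrid API key was already loaded into an apiKey variable (for example, from Secret Manager):

```yaml
main:
  steps:
    # Query Firestore for customers flagged with an overdue invoice.
    - query_overdue_customers:
        call: googleapis.firestore.v1.projects.databases.documents.runQuery
        args:
          parent: projects/my-project/databases/(default)/documents
          body:
            structuredQuery:
              from:
                - collectionId: customers
              where:
                fieldFilter:
                  field:
                    fieldPath: invoiceOverdue
                  op: EQUAL
                  value:
                    booleanValue: true
        result: customers
    - init_loop:
        assign:
          - i: 0
    - check_done:
        switch:
          - condition: ${i >= len(customers)}
            next: end
    # Send one reminder per customer; retry transient HTTP failures so a
    # single flaky call does not stop the rest of the batch.
    - send_reminder:
        try:
          call: http.post
          args:
            url: https://api.sendgrid.com/v3/mail/send
            headers:
              Authorization: ${"Bearer " + apiKey}
            body:
              personalizations:
                - to:
                    - email: ${customers[i].document.fields.email.stringValue}
              from:
                email: billing@example.com
              subject: Payment reminder
              content:
                - type: text/plain
                  value: Your invoice is overdue. Please arrange payment.
        retry:
          predicate: ${http.default_retry_predicate}
          max_retries: 3
          backoff:
            initial_delay: 2
            max_delay: 60
            multiplier: 2
    - next_customer:
        assign:
          - i: ${i + 1}
        next: check_done
```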

Similar to the previous example, this workflow may need to be extended with a connector call to Secret Manager to retrieve an access key for the email service. 

Get ready to Workflow

Real-life, line-of-business applications often need to use a combination of architecture patterns. While your actual use cases may differ from the examples above, the patterns of periodic scheduling, polling, and array iteration are universal and the building blocks of countless implementations. Workflows’ support for serverless and API-based architectures allows you to minimize ongoing operational overhead while maintaining full control over your business logic, and Connectors to Google Cloud products like Pub/Sub, Firestore, Compute Engine, Secret Manager, and Cloud Tasks make it easy to integrate Workflows into your environment.

Now that Workflows is generally available, you can feel confident using it for production line-of-business applications, while built-in error handling for API calls further improves reliability of your applications. To learn more, visit the Workflows landing page today, or go directly to the Cloud Console to try it out.

By Filip Knapik(Workflows Product Manager)
Source: Google Cloud Blog
