Guillaume Laforge

Introducing New Connectors for Workflows

Workflows is a service to orchestrate not only Google Cloud services, such as Cloud Functions, Cloud Run, or machine learning APIs, but also external services. As you might expect from an orchestrator, Workflows allows you to define the flow of your business logic, as steps, in a YAML or JSON definition language, and provides an execution API and UI to trigger workflow executions. You can read more about the benefits of Workflows in our previous article. Read more...
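
As a quick taste of the definition language, here is a minimal sketch of a step invoking the Pub/Sub connector to publish a message; the topic name my-topic is a placeholder, not something from the article:

    # Publish a message to a Pub/Sub topic via the Workflows connector
    - publishMessage:
        call: googleapis.pubsub.v1.projects.topics.publish
        args:
          topic: ${"projects/" + sys.get_env("GOOGLE_CLOUD_PROJECT_ID") + "/topics/my-topic"}
          body:
            messages:
              # Pub/Sub message data must be base64-encoded bytes
              - data: ${base64.encode(text.encode("Hello from Workflows!"))}
        result: publishResult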

Orchestrating the Pic-a-Daily serverless app with Workflows

Over the past year, we (Mete and Guillaume) have developed a picture sharing application, named Pic-a-Daily, to showcase Google Cloud serverless technologies such as Cloud Functions, App Engine, and Cloud Run. Into the mix, we’ve thrown a pinch of Pub/Sub for interservice communication, a zest of Firestore for storing picture metadata, and a touch of machine learning for a little bit of magic. We also created a hands-on workshop to build the application, and slides with explanations of the technologies used. Read more...

Day 15 with Workflows — Built-in Cloud Logging function

In the two previous episodes, we saw how to create and call subworkflows, and we applied this technique to making a reusable routine for logging with Cloud Logging. However, there’s already a built-in function for that purpose! So let’s have a look at this integration. To call the built-in logging function, just create a new step, and make a call to the sys.log function:

    - logString:
        call: sys.log
        args:
          text: Hello Cloud Logging!

Read more...
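
Note that sys.log also accepts a severity argument; a small sketch, with INFO being one of the standard Cloud Logging severities:

    # Log a message with an explicit severity level
    - logWithSeverity:
        call: sys.log
        args:
          text: Something noteworthy happened
          severity: INFO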

Day 14 with Workflows — Subworkflows

Workflows are made of sequences of steps and branches. Sometimes a particular sequence of steps is repeated, and it’s a good idea to avoid error-prone repetition in your workflow definition (in particular when you change it in one place but forget to change it in another). You can modularize your definition by creating subworkflows, a bit like subroutines or functions in programming languages. For example, yesterday, we had a look at how to log to Cloud Logging: if you want to log in several places in your workflow, you can extract that routine into a subworkflow. Read more...
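
Here is a minimal sketch of the structure, with a main workflow calling a greet subworkflow; the names are illustrative, not taken from the article:

    # The main workflow invokes the subworkflow like a function call
    main:
      steps:
        - greetUser:
            call: greet
            args:
              name: Workflows
            result: message
        - returnMessage:
            return: ${message}

    # A subworkflow with one declared parameter
    greet:
      params: [name]
      steps:
        - buildGreeting:
            return: ${"Hello, " + name + "!"}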

Day 13 with Workflows — Logging with Cloud Logging

Time to come back to our series on Cloud Workflows. Sometimes, for debugging purposes or for auditing, it is useful to be able to log some information via Cloud Logging. As we saw last month, you can call HTTP endpoints from your workflow. We can actually use Cloud Logging’s REST API to log such messages! Let’s see that in action.

    - log:
        call: http.post
        args:
          url: https://logging.googleapis.com/v2/entries:write
          auth:
            type: OAuth2
          body:
            entries:
              - logName: ${"projects/" + sys.

Read more...
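
Since the excerpt is cut off, here is a hedged sketch of what the complete call could look like, based on the Cloud Logging v2 entries.write API; the log name myWorkflowLog and the payload text are assumptions, not the article’s exact code:

    # Write a log entry through the Cloud Logging REST API
    - log:
        call: http.post
        args:
          url: https://logging.googleapis.com/v2/entries:write
          auth:
            type: OAuth2
          body:
            entries:
              - logName: ${"projects/" + sys.get_env("GOOGLE_CLOUD_PROJECT_ID") + "/logs/myWorkflowLog"}
                resource:
                  type: global
                textPayload: Hello from Workflows!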

Day 12 with Workflows — Loops and iterations

In previous episodes of this Cloud Workflows series, we’ve learned about variable assignment, data structures like arrays, jumps and switch conditions to move between steps, and expressions to do some computations, including potentially some built-in functions. With all these previous learnings, we are now equipped with the tools to create loops and iterations, for example iterating over the elements of an array, perhaps to call an API several times with different arguments. Read more...
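
Here is a minimal sketch of such a loop, combining a counter variable, a switch condition, and next jumps; the array contents are illustrative:

    # Initialize the array and the loop counter
    - init:
        assign:
          - items: ["one", "two", "three"]
          - i: 0
    # Loop condition: keep iterating while i is within bounds
    - checkCondition:
        switch:
          - condition: ${i < len(items)}
            next: processItem
        next: done
    # Body of the loop: here, just log the current element
    - processItem:
        call: sys.log
        args:
          text: ${items[i]}
    # Increment the counter and jump back to the condition
    - increment:
        assign:
          - i: ${i + 1}
        next: checkCondition
    - done:
        return: ${i}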

Day 11 with Workflows — Sleeping in a workflow

Workflows are not necessarily instantaneous, and executions can span a long period of time. Some steps may launch asynchronous operations which take seconds or minutes to finish, but you are not notified when the process is over. So when you want to wait for something to finish, for example before polling again to check the status of the async operation, you can introduce a sleep operation in your workflows. Read more...
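
The built-in sys.sleep function pauses an execution for a given number of seconds; a minimal sketch, where the pollStatus step is just a stand-in for checking on your async operation:

    # Pause the workflow execution for one minute
    - waitBeforePolling:
        call: sys.sleep
        args:
          seconds: 60
    # Stand-in for a real polling step
    - pollStatus:
        call: sys.log
        args:
          text: Polling again after sleeping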

Day 10 with Workflows — Accessing built-in environment variables

Google Cloud Workflows offers a few built-in environment variables that are accessible from your workflow executions. There are currently 5 environment variables defined:

- GOOGLE_CLOUD_PROJECT_NUMBER: The workflow project’s number.
- GOOGLE_CLOUD_PROJECT_ID: The workflow project’s identifier.
- GOOGLE_CLOUD_LOCATION: The workflow’s location.
- GOOGLE_CLOUD_WORKFLOW_ID: The workflow’s identifier.
- GOOGLE_CLOUD_WORKFLOW_REVISION_ID: The workflow’s revision identifier.

Let’s see how to access them from our workflow definition:

    - envVars:
        assign:
          - projectID: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          - projectNum: ${sys.get_env("GOOGLE_CLOUD_PROJECT_NUMBER")}
          - projectLocation: ${sys.get_env("GOOGLE_CLOUD_LOCATION")}
          - workflowID: ${sys.

Read more...
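
Since the excerpt is truncated, here is a sketch retrieving all five variables listed above with sys.get_env; the variable names on the left are illustrative:

    # Read each built-in environment variable into a workflow variable
    - envVars:
        assign:
          - projectID: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          - projectNum: ${sys.get_env("GOOGLE_CLOUD_PROJECT_NUMBER")}
          - projectLocation: ${sys.get_env("GOOGLE_CLOUD_LOCATION")}
          - workflowID: ${sys.get_env("GOOGLE_CLOUD_WORKFLOW_ID")}
          - revisionID: ${sys.get_env("GOOGLE_CLOUD_WORKFLOW_REVISION_ID")}
    - output:
        return: ${workflowID + " / " + revisionID}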

Day 9 with Workflows — Deploying and executing Workflows from the command-line

So far, in this series on Cloud Workflows, we’ve only used the Google Cloud Console UI to manage our workflow definitions and their executions. But it’s also possible to deploy new definitions and update existing ones from the command line, using the gcloud command-line tool. Let’s see how to do that! If you don’t already have an existing service account, you should create one by following these instructions. I’m going to use the workflow-sa service account I created for the purpose of this demonstration. Read more...
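
As a hedged sketch of the two commands involved, using current gcloud command names; myFirstWorkflow, workflow.yaml, and MY_PROJECT are placeholders, and workflow-sa is the service account mentioned above:

    # Deploy (or update) a workflow definition from a local YAML file
    gcloud workflows deploy myFirstWorkflow \
        --source=workflow.yaml \
        --service-account=workflow-sa@MY_PROJECT.iam.gserviceaccount.com

    # Trigger an execution and wait for its result
    gcloud workflows run myFirstWorkflow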

Day 8 with Workflows — Calling an HTTP endpoint

Time to do something pretty handy: calling an HTTP endpoint from your Google Cloud Workflows definitions. Whether you’re calling GCP-specific APIs such as the ML APIs, the REST APIs of other products like Cloud Firestore, or your own services and third-party external APIs, this capability lets you plug your business processes into the outside world! Let’s see calling HTTP endpoints in action in the following video, before diving into the details below: Read more...
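
A minimal sketch of an HTTP call step; the URL is a placeholder for your own endpoint, and the result variable makes the response available to later steps:

    # Call an external HTTP endpoint and capture the response
    - getCurrentTime:
        call: http.get
        args:
          url: https://example.com/api/datetime
        result: currentTime
    # The response body is accessible on the result variable
    - returnResult:
        return: ${currentTime.body}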