Guillaume Laforge

workflows

Day 15 with Workflows — Built-in Cloud Logging function

In the two previous episodes, we saw how to create and call subworkflows, and we applied this technique to make a reusable routine for logging with Cloud Logging. However, there’s already a built-in function for that purpose! So let’s have a look at this integration. To call the built-in logging function, just create a new step and make a call to the sys.log function:

    - logString:
        call: sys.log
        args:
          text: Hello Cloud Logging!

Read more...
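The sys.log function also accepts a severity argument, which maps to Cloud Logging’s log levels. A minimal sketch (the step name and message text here are placeholders of mine):

    - logWarning:
        call: sys.log
        args:
          text: Something looks odd
          severity: WARNING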

Day 14 with Workflows — Subworkflows

Workflows are made of sequences of steps and branches. Sometimes a particular sequence of steps is repeated, and it’s a good idea to avoid error-prone repetition in your workflow definition (in particular if you change the logic in one place but forget to change it in another). You can modularize your definition by creating subworkflows, a bit like subroutines or functions in programming languages. For example, yesterday we had a look at how to log to Cloud Logging: if you want to log in several places in your workflow, you can extract that routine into a subworkflow. Read more...
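As a minimal sketch of the subworkflow syntax (the say_hello name and its parameter are illustrative, not from the original post), a subworkflow is defined alongside main with its own params and steps, and the caller captures its return value with result:

    main:
      steps:
        - greet:
            call: say_hello
            args:
              name: "Workflows"
            result: message
        - finish:
            return: ${message}

    say_hello:
      params: [name]
      steps:
        - buildMessage:
            return: ${"Hello " + name + "!"}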

Day 13 with Workflows — Logging with Cloud Logging

Time to come back to our series on Cloud Workflows. Sometimes, for debugging or auditing purposes, it is useful to log some information via Cloud Logging. As we saw last month, you can call HTTP endpoints from your workflow. We can actually use Cloud Logging’s REST API to log such messages! Let’s see that in action.

    - log:
        call: http.post
        args:
          url: https://logging.googleapis.com/v2/entries:write
          auth:
            type: OAuth2
          body:
            entries:
              - logName: ${"projects/" + sys.

Read more...
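The excerpt is cut off mid-expression above. For context, an entries:write request body needs a full log name, a monitored resource, and a payload; here is a minimal sketch, assuming the log name is built from the project ID (the workflow_logger log name is my own placeholder):

    - log:
        call: http.post
        args:
          url: https://logging.googleapis.com/v2/entries:write
          auth:
            type: OAuth2
          body:
            entries:
              - logName: ${"projects/" + sys.get_env("GOOGLE_CLOUD_PROJECT_ID") + "/logs/workflow_logger"}
                resource:
                  type: global
                textPayload: Hello from the workflow!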

Day 12 with Workflows — Loops and iterations

In previous episodes of this Cloud Workflows series, we’ve learned about variable assignment, data structures like arrays, jumps and switch conditions to move between steps, and expressions to do some computations, including potentially some built-in functions. With all of this under our belt, we’re now equipped with the tools to create loops and iterations: for example, iterating over the elements of an array, perhaps to call an API several times, each time with different arguments. Read more...
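As a minimal sketch of this pattern (all step and variable names are mine), a loop can be assembled from an index variable, a switch condition, and next jumps:

    - init:
        assign:
          - items: ["a", "b", "c"]
          - i: 0
          - result: ""
    - checkCondition:
        switch:
          - condition: ${i < len(items)}
            next: iterate
        next: done
    - iterate:
        assign:
          - result: ${result + items[i]}
          - i: ${i + 1}
        next: checkCondition
    - done:
        return: ${result}

Each pass through iterate appends the current element and increments the index, until the switch condition fails and control jumps to done.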

Day 11 with Workflows — Sleeping in a workflow

Workflows are not necessarily instantaneous, and executions can span a long period of time. Some steps may launch asynchronous operations, which might take seconds or minutes to finish, but you are not notified when the process is over. So when you need to wait for something to finish, for example before polling again to check the status of the async operation, you can introduce a sleep operation in your workflows. Read more...
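Pausing is done with the built-in sys.sleep function, which takes the number of seconds to wait. A minimal sketch (the 60-second duration is an arbitrary choice of mine):

    - waitBeforePolling:
        call: sys.sleep
        args:
          seconds: 60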

Day 10 with Workflows — Accessing built-in environment variables

Google Cloud Workflows offers a few built-in environment variables that are accessible from your workflow executions. There are currently 5 environment variables defined:

- GOOGLE_CLOUD_PROJECT_NUMBER: The workflow project’s number.
- GOOGLE_CLOUD_PROJECT_ID: The workflow project’s identifier.
- GOOGLE_CLOUD_LOCATION: The workflow’s location.
- GOOGLE_CLOUD_WORKFLOW_ID: The workflow’s identifier.
- GOOGLE_CLOUD_WORKFLOW_REVISION_ID: The workflow’s revision identifier.

Let’s see how to access them from our workflow definition:

    - envVars:
        assign:
          - projectID: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          - projectNum: ${sys.get_env("GOOGLE_CLOUD_PROJECT_NUMBER")}
          - projectLocation: ${sys.get_env("GOOGLE_CLOUD_LOCATION")}
          - workflowID: ${sys.

Read more...
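The excerpt stops mid-assignment; based on the list of variables above, a complete sketch reading all five values and returning a summary might look like this (the local variable names are mine):

    - envVars:
        assign:
          - projectID: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          - projectNum: ${sys.get_env("GOOGLE_CLOUD_PROJECT_NUMBER")}
          - projectLocation: ${sys.get_env("GOOGLE_CLOUD_LOCATION")}
          - workflowID: ${sys.get_env("GOOGLE_CLOUD_WORKFLOW_ID")}
          - revisionID: ${sys.get_env("GOOGLE_CLOUD_WORKFLOW_REVISION_ID")}
    - output:
        return: ${projectID + " / " + projectLocation + " / " + workflowID + " (rev " + revisionID + ")"}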

Day 9 with Workflows — Deploying and executing Workflows from the command-line

So far in this series on Cloud Workflows, we’ve only used the Google Cloud Console UI to manage our workflow definitions and their executions. But it’s also possible to deploy new definitions and update existing ones from the command line, using the gcloud command-line tool. Let’s see how to do that! If you don’t already have an existing service account, you should create one following these instructions. I’m going to use the workflow-sa service account I created for the purpose of this demonstration. Read more...
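As a sketch of the command-line flow (the workflow name, source file, and service account email are placeholders, and the exact flags may vary with your gcloud version):

    # Deploy (or update) a workflow definition from a local YAML file
    gcloud workflows deploy my-workflow \
        --source=my-workflow.yaml \
        --service-account=workflow-sa@my-project.iam.gserviceaccount.com

    # Create an execution of the workflow and wait for its result
    gcloud workflows run my-workflow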

Day 8 with Workflows — Calling an HTTP endpoint

Time to do something pretty handy: calling an HTTP endpoint from your Google Cloud Workflows definitions. Whether you’re calling GCP-specific APIs such as the ML APIs, the REST APIs of other products like Cloud Firestore, or your own services and third-party external APIs, this capability lets you plug your business processes into the external world! Let’s see calling HTTP endpoints in action in the following video, before diving into the details below: Read more...
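A minimal sketch of such a call (the URL is a placeholder of mine): an http.get step stores the response in a variable via result, and the body field of that variable holds the payload.

    - callApi:
        call: http.get
        args:
          url: https://example.com/api/status
        result: apiResponse
    - returnBody:
        return: ${apiResponse.body}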

Day 7 with Workflows — Pass an input argument to your workflow

All the workflow definitions we’ve seen so far in this series were self-contained: they were not parameterized. But we often need our business processes to take arguments (the ID of an order, the details of the order, etc.), so that we can process those input values and act on them. That’s where workflow input parameters become useful! Let’s start with a simple greeting message that we want to customize with a first name and a last name. Read more...
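A minimal sketch of a parameterized workflow, assuming the execution input is a JSON dictionary with firstname and lastname keys (the args parameter name is a common convention, not mandated):

    main:
      params: [args]
      steps:
        - assembleGreeting:
            assign:
              - greeting: ${"Hello " + args.firstname + " " + args.lastname + "!"}
        - returnGreeting:
            return: ${greeting}

You would then pass something like {"firstname": "Ada", "lastname": "Lovelace"} as the execution’s input.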

Day 6 with Workflows — Arrays and dictionaries

So far in this series of articles on Cloud Workflows, we have used simple data types, like strings, numbers, and boolean values. However, it’s possible to use more complex data structures, like arrays and dictionaries. In this new episode, we’re going to use those structures. Arrays can be defined inline (like anArray) or spanning several lines (like anotherArray):

    - assignment:
        assign:
          - anArray: ["a", "b", "c"]
          - anotherArray:
              - one
              - two
    - output:
        return: ${anArray[0] + anotherArray[1]}

The output step will return the string "atwo". Read more...
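The title also mentions dictionaries; as a sketch (the person structure and its fields are my own example, not from the original post), a dictionary is defined with nested keys and its values are accessed with dot notation:

    - assignment:
        assign:
          - person:
              firstname: "John"
              lastname: "Doe"
    - output:
        return: ${person.firstname + " " + person.lastname}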