Guillaume Laforge

Sending an email with SendGrid from Workflows

Following up on the article on writing and reading JSON files in cloud storage buckets, we saw that we could access the data of the JSON file and use it in our workflow. Let's have a look at a concrete use of this. Today, we'll take advantage of this mechanism to avoid hard-coding the URLs of the APIs we call from our workflow, which makes the workflow more portable across environments. Read more...
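As a quick illustration of the idea, here is a minimal sketch (the bucket, file name, and SERVICE_URL key are hypothetical, not taken from the article): a first step fetches a JSON configuration file from a bucket with an authenticated HTTP GET, and a later step reads the API URL from the parsed body instead of hard-coding it.

    - read_env_details:
        call: http.get
        args:
            url: https://storage.googleapis.com/storage/v1/b/my-config-bucket/o/env-config.json?alt=media
            auth:
                type: OAuth2
        result: env_details
    - call_api:
        # The JSON response is parsed automatically, so keys are available under .body
        call: http.post
        args:
            url: ${env_details.body.SERVICE_URL}

With this approach, deploying the same workflow definition in another environment only requires changing the JSON file in the bucket, not the workflow itself.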

Smarter applications with Document AI, Workflows and Cloud Functions

At enterprises across industries, documents are at the center of core business processes. Documents store a treasure trove of valuable information, whether it's a company's invoices, HR documents, tax forms, and much more. However, the unstructured nature of documents makes them difficult to work with as a data source. We call this "dark data": unstructured data that businesses collect, process, and store, but do not utilize for purposes such as analytics or monetization. Read more...

Open sourcing the App Engine Standard Java Runtime

One year after Google App Engine was released in 2008, Java became the second language runtime available on the platform. Java developers were able to deploy and scale their servlet-based web applications easily, without worrying about infrastructure management. Not only was Java able to run there, but alternative JVM languages, like Apache Groovy and Kotlin, are also part of the game. Fast forward to today: we're pleased to announce that the Java Runtime for App Engine is now available as open source, in the GoogleCloudPlatform/appengine-java-standard repository on GitHub. Read more...

Reading in and writing a JSON file to a storage bucket from a workflow

Workflows provides several connectors for interacting with various Google Cloud APIs and services. In the past, I've used, for example, the Document AI connector to parse documents like expense receipts, or the Secret Manager connector to store and access secrets like passwords. Another useful connector I was interested in using today is the Google Cloud Storage connector, to store and read files in storage buckets. Those connectors are auto-generated from their API discovery descriptors, but some current limitations prevent, for example, downloading the content of a file. Read more...
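To give a flavor of the workaround, here is a minimal sketch, with a hypothetical bucket and file name: since the connector can't transfer file content directly, the workflow calls the Cloud Storage JSON API with the built-in http.post and http.get functions, authenticated with OAuth2.

    - write_json_file:
        call: http.post
        args:
            url: https://storage.googleapis.com/upload/storage/v1/b/my-data-bucket/o?uploadType=media&name=data.json
            auth:
                type: OAuth2
            body:
                name: "workflow"
                count: 42
    - read_json_file:
        # Appending ?alt=media returns the file content instead of its metadata
        call: http.get
        args:
            url: https://storage.googleapis.com/storage/v1/b/my-data-bucket/o/data.json?alt=media
            auth:
                type: OAuth2
        result: file_content

The map passed as body is serialized to JSON on the way in, and the JSON content comes back parsed under file_content.body.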

How to get the project ID in a Java Cloud Function

As I was working with my colleague Sara Ford on testing the Cloud Functions runtimes for the upcoming “second generation” of the product, rebased on the Cloud Run platform, I wrote a few simple functions for the Java runtime. In one of those Java functions, I wanted to use Google Cloud Storage, to download a file from a bucket. I took a look at the existing sample to download an object: Read more...

Introducing Workflows callbacks

With Workflows, developers can easily orchestrate various services together, on Google Cloud or third-party APIs. Workflows connectors handle long-running operations of Google Cloud services until completion, and workflow executions can also wait for time to pass with the built-in sys.sleep function, until some computation finishes or some event takes place. But what if you need some user input or some approval in the middle of the workflow execution, like validating an automatic text translation? Read more...
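Here is a minimal sketch of the callback pattern, using the events.create_callback_endpoint and events.await_callback built-ins (the timeout value and the approved field in the request body are hypothetical):

    - create_callback:
        call: events.create_callback_endpoint
        args:
            http_callback_method: "POST"
        result: callback_details
    - await_approval:
        # The execution pauses here until the endpoint receives a POST request
        call: events.await_callback
        args:
            callback: ${callback_details}
            timeout: 3600
        result: callback_request
    - use_input:
        assign:
            - approved: ${callback_request.http_request.body.approved}

In between, the callback_details.url endpoint can be sent to a human validator, for instance by email, so they can approve or reject the translation.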

Introducing New Connectors for Workflows

Workflows is a service to orchestrate not only Google Cloud services, such as Cloud Functions, Cloud Run, or machine learning APIs, but also external services. As you might expect from an orchestrator, Workflows allows you to define the flow of your business logic, as steps, in a YAML or JSON definition language, and provides an execution API and UI to trigger workflow executions. You can read more about the benefits of Workflows in our previous article. Read more...
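As a taste of what a connector call looks like, here is a minimal sketch using the Pub/Sub connector (the topic name is hypothetical): each connector call is a regular step, with the connector name in call and the API parameters in args.

    - init:
        assign:
            - project: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
    - publish_message:
        call: googleapis.pubsub.v1.projects.topics.publish
        args:
            topic: ${"projects/" + project + "/topics/my-topic"}
            body:
                messages:
                    # Pub/Sub message payloads must be base64-encoded
                    - data: ${base64.encode(text.encode("Hello from Workflows!"))}
        result: publish_result

The connector authenticates with the workflow's service account, so no explicit auth block is needed.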

Orchestrating the Pic-a-Daily serverless app with workflows

Over the past year, we (Mete and Guillaume) have developed a picture sharing application, named Pic-a-Daily, to showcase Google Cloud serverless technologies such as Cloud Functions, App Engine, and Cloud Run. Into the mix, we’ve thrown a pinch of Pub/Sub for interservice communication, a zest of Firestore for storing picture metadata, and a touch of machine learning for a little bit of magic. We also created a hands-on workshop to build the application, and slides with explanations of the technologies used. Read more...

Day 15 with Workflows — Built-in Cloud Logging function

In the two previous episodes, we saw how to create and call subworkflows, and we applied this technique to making a reusable routine for logging with Cloud Logging. However, there's already a built-in function for that purpose! So let's have a look at this integration. To call the built-in logging function, just create a new step, and make a call to the sys.log function:

    - logString:
        call: sys.log
        args:
            text: Hello Cloud Logging!

Read more...

Day 14 with Workflows — Subworkflows

Workflows are made of sequences of steps and branches. Sometimes a particular sequence of steps is repeated, and it's a good idea to avoid such error-prone repetition in your workflow definition (in particular, if you change it in one place but forget to change it in another). You can modularize your definition by creating subworkflows, a bit like subroutines or functions in programming languages. For example, yesterday we had a look at how to log to Cloud Logging: if you want to log in several places in your workflow, you can extract that routine into a subworkflow, as in the sketch below. Read more...
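As an illustration, here is a minimal sketch with a hypothetical log_message subworkflow, called twice from the main workflow:

    main:
        steps:
            - first_message:
                call: log_message
                args:
                    msg: "First message"
            - second_message:
                call: log_message
                args:
                    msg: "Second message"

    log_message:
        params: [msg]
        steps:
            - log:
                call: sys.log
                args:
                    text: ${msg}

The subworkflow declares its inputs in params, and call sites pass them via args, just like calls to built-in functions.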