☁ Welcome back to Serverless Expeditions Extended! How are you using Cloud Workflows?
Let us know in the comments and be sure to subscribe for updates! → goo.gle/GoogleCloudTech
Workflows are next level and really set GCP apart from other cloud providers. I don't even have to write code to orchestrate these workflows!
It would be great to see more complex real-world workflow examples, for example parallel executions, returning a response from an HTTP function and passing it to the next step, etc.
That is an excellent suggestion; thank you!
I would really like to see more detail on how to approach testing in Workflows, especially in these more complicated interactions. Also how to include logging, monitoring, and alerting, preferably without ending up with large and unwieldy workflow sources.
Unfortunately, testing is not that great in Workflows, as it cannot run locally, so you can't unit test it. However, you can deploy a test version of your workflow in a staging project and use that in integration testing, for example.
Workflows logs all of its calls to the usual Google Cloud logging. In terms of monitoring and alerting, I assume there's a way to set up monitors and alerts for workflow executions, but I haven't set it up myself, so I'm not sure of the details.
Might be a silly question, but are Workflows a good fit for use cases where a frontend expects a response from the workflow, e.g. a complex registration flow with several backend/business checks?
That's not a silly question at all. This can be a hard problem. You can of course notify the user out of band. For example, many online stores send an email to the user once their credit card has been authorized and the order approved.
But what if the user is waiting with a web page open in their browser? My favorite way of notifying users is to open a connection to a Firestore database from the client-side JavaScript. As the last step in your workflow, you can update the Firestore record that the user's browser is watching. The client-side JavaScript will receive a notification that the record changed and can update the web page without requiring a page reload.
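As a rough sketch, the final step of such a workflow might update the watched document using the Firestore connector for Workflows. The project, collection, field, and variable names here are illustrative, not taken from the video:

```yaml
# Final workflow step (sketch): mark the order document as approved so the
# browser's Firestore listener fires. "my-project", "orders", "status", and
# orderId are placeholder names.
- notify_client:
    call: googleapis.firestore.v1.projects.databases.documents.patch
    args:
      name: ${"projects/my-project/databases/(default)/documents/orders/" + orderId}
      updateMask:
        fieldPaths: ["status"]
      body:
        fields:
          status:
            stringValue: "APPROVED"
```

On the client side, a Firestore snapshot listener on that same document picks up the change and updates the page without a reload.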
Why use Workflows over Airflow? Is it more cost effective? Additionally, if I have a complex flow within Airflow and I want to connect it to a workflow after two DAGs finish successfully, what's the best approach?
Airflow is more focused on data orchestration use cases, whereas Workflows is more focused on HTTP service orchestration on Google Cloud. Workflows has an API, so you can try calling the Workflows Executions API at the end of your Airflow pipeline (although I haven't tried this myself).
I have a couple of questions:
1. What if the compensation call failed to cancel order even with retries?
2. How can we implement a circuit breaker in case retries fail?
3. Is there a GUI or automatic YAML generation tool to help design the services' dependencies on each other?
Best,
Great questions!
1. If the compensation call fails, then the saga basically breaks. There's no guarantee in any of this, but with a compensation step and retries, it should be a very rare occurrence. If it does happen, I'd probably move the order to a dead-letter queue and have a separate service keep retrying it until it is cancelled.
2. Currently, there's no out-of-the-box support for a circuit breaker in Workflows, so you'd have to implement the circuit breaker logic and state machine yourself in Workflows.
3. The only GUI right now is the visualization you get in Google Cloud console for your workflow definition. But it doesn't really show you the state of the services the workflow depends on.
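The dead-letter idea in point 1 can be sketched in plain Python. Here `cancel_fn` and the `dead_letter` list stand in for your real cancellation endpoint and, say, a Pub/Sub dead-letter topic; names and retry counts are illustrative:

```python
import time

def cancel_with_dead_letter(order_id, cancel_fn, dead_letter,
                            max_attempts=3, delay=0.0):
    """Try to compensate (cancel) an order. If every attempt fails,
    park the order in a dead-letter queue so a separate service can
    keep retrying it until it is cancelled."""
    for _ in range(max_attempts):
        try:
            cancel_fn(order_id)
            return True  # compensation succeeded
        except Exception:
            if delay:
                time.sleep(delay)  # back off before the next attempt
    dead_letter.append(order_id)  # e.g. publish to a dead-letter topic
    return False
```

The caller can then alert on anything that lands in the dead-letter queue, since those are exactly the broken sagas that need attention.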
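For point 2, the logic you'd have to hand-roll is a small state machine (closed → open → half-open). A minimal Python sketch to illustrate the states you would encode, whether in workflow steps or in a fronting service; thresholds and names are illustrative, and this is not a built-in Workflows feature:

```python
import time

class CircuitBreaker:
    """Minimal circuit-breaker state machine (sketch). After
    `failure_threshold` consecutive failures the circuit opens and calls
    fail fast; after `reset_timeout` seconds one trial call is allowed
    (half-open) and, if it succeeds, the circuit closes again."""

    def __init__(self, failure_threshold=3, reset_timeout=30.0,
                 clock=time.monotonic):
        self.failure_threshold = failure_threshold
        self.reset_timeout = reset_timeout
        self.clock = clock          # injectable for testing
        self.failures = 0
        self.opened_at = None       # None means the circuit is closed

    @property
    def state(self):
        if self.opened_at is None:
            return "CLOSED"
        if self.clock() - self.opened_at >= self.reset_timeout:
            return "HALF_OPEN"      # timeout elapsed: allow one trial call
        return "OPEN"

    def call(self, fn, *args, **kwargs):
        if self.state == "OPEN":
            raise RuntimeError("circuit open: failing fast")
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.failure_threshold or self.state == "HALF_OPEN":
                self.opened_at = self.clock()  # (re)open the circuit
            raise
        self.failures = 0           # success resets the breaker
        self.opened_at = None
        return result
```

The injectable clock is just there so the timeout behavior can be tested without actually sleeping.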
How do you expose a workflow as an API that can be called from the frontend?
You'd normally have a Cloud Function or Cloud Run service in front of a workflow and expose that as an API; the Cloud Functions/Run service then executes the workflow on the user's behalf.
Thanks 🙏
Just to clarify, sagas are about distributed transactions rather than failure paths.
Right and it's also about how to handle failed distributed transactions with compensation steps that we showed here
And how do you trigger execution of the workflow?
You can execute a workflow via the console, the REST API, or a client library.
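For the REST route, this sketch builds the request for the Workflows Executions API (v1). The project, location, and workflow names are placeholders, and the actual POST needs an OAuth2 access token (or use the google-cloud-workflows client library instead):

```python
import json

EXECUTIONS_HOST = "https://workflowexecutions.googleapis.com/v1"

def execution_request(project, location, workflow, argument=None):
    """Build the URL and JSON body for creating a workflow execution.
    Send it with an authenticated POST, e.g. `requests` plus a
    Bearer access token."""
    url = (f"{EXECUTIONS_HOST}/projects/{project}/locations/{location}"
           f"/workflows/{workflow}/executions")
    body = {}
    if argument is not None:
        # The API expects the runtime argument as a JSON-encoded string.
        body["argument"] = json.dumps(argument)
    return url, body
```

From the CLI, `gcloud workflows execute my-workflow --data='{"orderId": "123"}'` does the same thing.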
6:12 "I read every single comment". Reply to this or your word is void.
I read every single comment when I work, but sometimes I'm on vacation 🙂
👌👌👌👌👌☺️☺️☺️☺️☺️