I wish they would add the ability to make one workflow depend on other workflows. As a data engineer, you need this 100% of the time.
Not sure what you mean. Could you elaborate?
@BryanCafferky I meant it would be immensely helpful if Databricks Workflows offered the ability to set a trigger mode based on the completion or state of other workflows, given the limit of 100 tasks per workflow.
You can use Terraform to provision your Workflows, Tasks, Clusters, Notebooks, etc. programmatically. Then Terraform scripts (*.tf, *.hcl) can be uploaded to Git and used in CI/CD as well.
Thanks for your comment. Terraform is no longer open source, which gives me pause about its future; OpenTofu is the open-source fork of Terraform. You can also use Python with the Databricks Python SDK, plain Python against the Databricks REST API, or the new Databricks Asset Bundles.
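For anyone curious what the plain REST API route looks like, here is a rough sketch of creating a one-task job with the Jobs 2.1 create endpoint. The workspace URL, token, notebook path, and cluster ID are all placeholders, not values from the video:

```python
import requests

# Placeholders -- substitute your own workspace URL, token, notebook path, and cluster ID.
host = "https://<your-workspace>.azuredatabricks.net"
token = "<personal-access-token>"

job_spec = {
    "name": "nightly_load",
    "tasks": [
        {
            "task_key": "ingest",
            "notebook_task": {"notebook_path": "/Repos/project/ingest"},
            "existing_cluster_id": "<cluster-id>",
        }
    ],
}

# The Jobs 2.1 create endpoint returns the new job's ID.
resp = requests.post(
    f"{host}/api/2.1/jobs/create",
    headers={"Authorization": f"Bearer {token}"},
    json=job_spec,
)
resp.raise_for_status()
print(resp.json()["job_id"])
```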
Can you please share any reference for creating Workflows, tasks, and running notebooks using Terraform? So far I'm using Terraform only to create the Databricks workspace and cluster.
Love this video. The dashboard refresh is super cool.
When creating a workflow, does it allow you to drag and drop tasks?
No. In the UI you select a task and set its properties; the workflow graph then updates to reflect properties like dependencies.
Can you help with how we can create a drop-down for task parameters in a workflow?
You use widgets. Doc here learn.microsoft.com/en-us/azure/databricks/notebooks/widgets
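A minimal sketch of a dropdown widget in a notebook (the widget name and choices here are made up):

```python
# Create a dropdown widget at the top of the notebook and read its value.
# dbutils is available implicitly inside a Databricks notebook.
dbutils.widgets.dropdown("run_mode", "incremental", ["incremental", "full"], "Run mode")

run_mode = dbutils.widgets.get("run_mode")
print(f"Running in {run_mode} mode")
```

When the notebook runs as a workflow task, a task parameter with the same name overrides the widget's default value.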
Hi @bryan, why are these videos still not in the playlist on your website? It's been 2 weeks since you posted them here. I'm looking under the Databricks section and can't find them. I think your website should be a first-class citizen for locating your videos as well. Cheers and thanks for the helpful videos.
Hi @Baravindk, they are in the YouTube playlist, and the GitBook points you to the playlist rather than listing every video in it. To make new videos easier to find, I added a New Videos menu to the GitBook and added these. These videos are in the YouTube Master Data Lakehouse playlist. Thanks
Thank you for sharing your knowledge! One question: is there a way to create this workflow using some type of CI/CD? For example, creating a development branch and a pull request to merge into a master branch? The main idea is to build the workflow in a development environment and promote it to the production environment.
Yes, there are several ways. I am using the Databricks Python SDK from an Azure DevOps pipeline to do this. However, workflows are not stored in Repos, so you'll need to use the UI, get the JSON, and paste it into a file in your repo. learn.microsoft.com/en-us/azure/databricks/dev-tools/sdk-python You can also use the new Databricks Asset Bundles: learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/
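As a rough sketch of the SDK route (the notebook paths and cluster ID below are placeholders, not from the video), a pipeline step could run something like this after checking out the repo:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

# Auth is picked up from the environment (e.g. DATABRICKS_HOST / DATABRICKS_TOKEN set by the pipeline).
w = WorkspaceClient()

created = w.jobs.create(
    name="demo_workflow",
    tasks=[
        jobs.Task(
            task_key="ingest",
            notebook_task=jobs.NotebookTask(notebook_path="/Repos/project/ingest"),
            existing_cluster_id="<cluster-id>",
        ),
        jobs.Task(
            task_key="transform",
            depends_on=[jobs.TaskDependency(task_key="ingest")],
            notebook_task=jobs.NotebookTask(notebook_path="/Repos/project/transform"),
            existing_cluster_id="<cluster-id>",
        ),
    ],
)
print(created.job_id)
```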
Great summary!!
I have a workflow with task A, task B, and 10 more. I would like to have widgets or parameters like A: True, B: False... that decide whether a task should be skipped or not. Is it possible? How?
Hi Bryan, excellent material :) One question: would you still stick to Workflows if you have to upload/download files using SFTP or import data by consuming REST APIs?
Thanks. It depends on how the SFTP transfer is being done. If some other external process drops files to ADLS via SFTP, that seems fine; a workflow could pick the file(s) up. If the SFTP transfer itself needs to happen within the workflow, this blog suggests using ADF to land the data in ADLS and then letting the workflow do the rest. Notebooks can call REST APIs, so it is possible to get data that way, but whether it works in a given scenario would need to be evaluated by the data engineer.
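To illustrate the REST API point, here is a minimal notebook sketch; the endpoint, JSON shape, and target table are hypothetical:

```python
import pandas as pd
import requests

# Pull JSON from an external REST API (hypothetical endpoint).
resp = requests.get("https://api.example.com/v1/orders", timeout=30)
resp.raise_for_status()

# Flatten the payload and land it as a Delta table the rest of the workflow can use.
pdf = pd.json_normalize(resp.json()["orders"])
df = spark.createDataFrame(pdf)  # spark is available implicitly in a Databricks notebook
df.write.format("delta").mode("append").saveAsTable("raw.orders")
```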
This is an excellent video. Do you have any videos where you pass a parameter into a SQL statement instead of Python code?
I don't think I did a video about that yet. Good thought. Thanks.
Hi Bryan, wondering if you could do a video on Databricks and dbt? Would be interested in your thoughts :)
I have not used dbt but from what I have seen it is very powerful. Thanks
Invaluable. Thank you 🙏
Agree on the limitations. For some reason a Databricks Workflow cannot contain more than 100 tasks. Luckily there is now a new feature where a workflow can contain a new kind of task that triggers another job. So you can at least subdivide your job into multiple smaller ones and then have a master job that triggers all the sub-jobs. But still, it would be way easier to just not have that limitation. It feels kinda artificial :/
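The "triggers another job" task the comment describes is the Run Job task type. A hedged sketch of wiring a master job to two existing sub-jobs with the Databricks Python SDK (the job IDs are placeholders):

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

# Master job: each task simply triggers an existing sub-job by ID (placeholder IDs).
w.jobs.create(
    name="master_job",
    tasks=[
        jobs.Task(task_key="run_sub_job_1", run_job_task=jobs.RunJobTask(job_id=111)),
        jobs.Task(
            task_key="run_sub_job_2",
            depends_on=[jobs.TaskDependency(task_key="run_sub_job_1")],
            run_job_task=jobs.RunJobTask(job_id=222),
        ),
    ],
)
```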
How do you pass variables from one task to the following task?
See the docs: docs.databricks.com/en/jobs/parameters.html
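In short, task values are the mechanism: the upstream task sets a value and the downstream task reads it. A minimal sketch (the task key and value key are made up):

```python
# In the upstream task's notebook (task_key = "task_a"):
dbutils.jobs.taskValues.set(key="row_count", value=42)

# In a downstream task's notebook:
row_count = dbutils.jobs.taskValues.get(
    taskKey="task_a", key="row_count", default=0, debugValue=0
)
print(row_count)
```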
Brilliant !! Thank you so much
You're Welcome!
How do you use if/else branch logic?
It helps a lot, thanks!
You're Welcome!
19:25 That's not a future option, that's just the category?!
Yep :) thought the same thing
How do you delete a task from a workflow?
Click on the task in the Workflow editor and click the trash can icon.
Could you dedicate a video to Unity Catalog?
It's on my list. Thanks!