Five Things You Didn't Know You Could Do with Databricks Workflows

  • Published on Jul 26, 2023
  • Databricks Workflows has come a long way since the early days of orchestrating simple notebooks and JAR/wheel files. Today it can orchestrate multi-task jobs, chaining tasks with lineage into a DAG with fan-in, fan-out, and many other patterns, and it can even run one Databricks job directly inside another.
    Databricks Workflows takes its tagline, "orchestrate anything anywhere," seriously: it is a truly fully managed, cloud-native orchestrator for diverse workloads such as Delta Live Tables, SQL, notebooks, JARs, Python wheels, dbt, Apache Spark™, and ML pipelines, with excellent monitoring, alerting, and observability capabilities as well. In short, it is a one-stop product for all the orchestration needs of an efficient lakehouse. Even better, it gives you the flexibility to run your jobs in a cloud-agnostic, cloud-independent way and is available across AWS, Azure, and GCP.
    In this session, we will discuss and dive deep into some of these very interesting features and showcase end-to-end demos that will let you take full advantage of Databricks Workflows for orchestrating the lakehouse. A minimal sketch of such a multi-task job definition appears below.
    Talk by: Prashanth Babu
    Connect with us: Website: databricks.com
    Twitter: / databricks
    LinkedIn: / databricks
    Instagram: / databricksinc
    Facebook: / databricksinc
  • Science & Technology
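For anyone who wants to try the multi-task DAG and job-in-job patterns mentioned in the abstract, here is a minimal sketch using the Databricks SDK for Python. The job name, notebook paths, cluster ID, and downstream job ID are hypothetical placeholders, not values from the talk.

```python
# Minimal sketch: a fan-out/fan-in DAG plus a "job as a task", created with
# the Databricks SDK for Python (pip install databricks-sdk).
# All names, paths, cluster IDs, and job IDs below are placeholders.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # picks up auth from the environment / .databrickscfg

created = w.jobs.create(
    name="lakehouse-demo-dag",
    tasks=[
        # Root task: the two transform tasks fan out from it.
        jobs.Task(
            task_key="ingest",
            existing_cluster_id="<cluster-id>",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/demo/ingest"),
        ),
        jobs.Task(
            task_key="transform_silver",
            depends_on=[jobs.TaskDependency(task_key="ingest")],
            existing_cluster_id="<cluster-id>",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/demo/silver"),
        ),
        jobs.Task(
            task_key="transform_gold",
            depends_on=[jobs.TaskDependency(task_key="ingest")],
            existing_cluster_id="<cluster-id>",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/demo/gold"),
        ),
        # Fan-in, then run another Databricks job directly inside this one.
        jobs.Task(
            task_key="publish",
            depends_on=[
                jobs.TaskDependency(task_key="transform_silver"),
                jobs.TaskDependency(task_key="transform_gold"),
            ],
            run_job_task=jobs.RunJobTask(job_id=123456789),
        ),
    ],
)
print(f"Created job {created.job_id}")
```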

Comments • 3

  • @CoopmanGreg • 5 months ago

    Well done!

  • @shankhadeepghosal731 • 24 days ago

    I want to build a workflow where the 2nd notebook runs only when a certain table count is greater than 0 in notebook 1. How?

    • @lostfrequency89 • 16 days ago

      Just pass the task values from inside notebook1, then add an If/else condition task and link its true branch to notebook2.
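      A minimal sketch of that pattern, assuming the first task is named notebook1 and using a placeholder table name (`spark` and `dbutils` are provided automatically inside Databricks notebooks):

      ```python
      # --- Inside notebook1 (task key "notebook1"; names are placeholders) ---
      # Compute the row count and publish it as a task value for downstream tasks.
      row_count = spark.table("main.demo.my_table").count()
      dbutils.jobs.taskValues.set(key="row_count", value=row_count)

      # --- In the Workflows UI ---
      # Add an "If/else condition" task that depends on notebook1, with the condition:
      #     {{tasks.notebook1.values.row_count}} > 0
      # Then make notebook2 depend on the condition task's "true" branch, so it
      # runs only when the table has rows.
      ```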