Simplify ETL pipelines on the Databricks Lakehouse

  • Published 30 Mar 2023
  • In this excerpt from The Best Data Engineering Platform is a Lakehouse, you’ll learn why the lakehouse is the best place to build and run modern data pipelines, along with best practices for Delta Live Tables. Watch the full webinar: dbricks.co/3nCP3Jm
    Watch the full webinar to learn how to:
    - Leverage lakehouse architecture for a simple, unified approach to data engineering
    - Meet your needs for real-time data streaming
    - Improve the latency, throughput, accuracy and cost-effectiveness of your data pipelines
    - Serve your entire data team with one solution
  • Science & Technology

Comments • 9

  • @thedatamaster • 1 year ago • +2

    Michael, thank you very much for creating Delta Live Tables, which is one of the most innovative features.
    Thank you, Databricks. ❤

  • @lostfrequency89 • 1 year ago • +1

    Best presentation! Love the Databricks team ❤

  • @akhtarmiah • 1 year ago

    So does Databricks provide an ingestion process like ADF? ADF has 90+ data connectors to pull data from data sources.
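
    For context, Databricks handles file ingestion primarily through Auto Loader rather than a connector catalog like ADF's. A minimal sketch of an Auto Loader stream, with the paths and target table name hypothetical:

        # Incrementally ingest new CSV files as they land (paths are hypothetical).
        df = (spark.readStream
              .format("cloudFiles")                                 # Auto Loader source
              .option("cloudFiles.format", "csv")                   # incoming file format
              .option("cloudFiles.schemaLocation", "/tmp/schemas")  # schema inference tracking
              .load("s3://my-bucket/landing/"))

        (df.writeStream
           .option("checkpointLocation", "/tmp/checkpoints")  # stream progress tracking
           .toTable("bronze_events"))                         # hypothetical target table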

  • @ChirgalHansdah • 11 months ago • +1

    What is the difference between a live table and a streaming live table?
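
    A minimal sketch of the distinction, assuming the Delta Live Tables Python API: a live table is recomputed in full from its inputs on each pipeline update, while a streaming live table reads its source incrementally, processing only records that arrived since the last update. Table names here are hypothetical:

        import dlt

        @dlt.table  # live table: fully recomputed on every pipeline update
        def daily_summary():
            return dlt.read("events").groupBy("event_date").count()

        @dlt.table  # streaming live table: consumes the source incrementally
        def events_cleaned():
            return dlt.read_stream("raw_events").where("event_id IS NOT NULL")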

  • @fb-gu2er • 11 months ago • +1

    All good, but the one thing I don’t like is the limitations. 100 concurrent pipeline updates per workspace is very low. I have many more than that. It’s never going to work out with such a limit.

  • @shanhuahuang3063 • 1 year ago

    After all these ads, can APPLY CHANGES be applied more than once in a single one-way data pipeline?
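
    For context, APPLY CHANGES here refers to DLT's CDC API. A minimal sketch of one such step, assuming the DLT Python API, with table, source, and column names all hypothetical; each target streaming table takes a single apply_changes definition, though one pipeline may define several targets:

        import dlt

        # Target table that APPLY CHANGES maintains (name is hypothetical).
        dlt.create_streaming_table("customers_silver")

        # Merge CDC records from the source feed into the target.
        dlt.apply_changes(
            target="customers_silver",
            source="customers_cdc",    # hypothetical CDC feed
            keys=["customer_id"],      # primary key(s) to match rows on
            sequence_by="updated_at",  # ordering column for out-of-order events
        )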

  • @shahzadb • 4 months ago

    Michael, can we parameterize input paths for Delta Live Tables? It appears Delta Live Tables enforce a static path. How can DLTs work against data partitioned over a file structure, e.g. yyyy-mm-dd/hh/*.csv? All the examples I have seen in the documentation assume a fixed input path. What if the input path changes over time, e.g. hourly?
    Given data partitioned over day-hours in storage, can a triggered DLT pipeline update its path through parameterization?
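
    On parameterization: DLT pipeline settings accept key/value configuration that code can read with spark.conf.get, so an input prefix can be supplied per pipeline rather than hard-coded. A minimal sketch, assuming a hypothetical configuration key input.path defined in the pipeline settings:

        import dlt

        @dlt.table
        def raw_csv():
            # Read the base path from pipeline configuration instead of hard-coding it.
            base_path = spark.conf.get("input.path")  # e.g. "s3://bucket/2023-03-30/"
            return (spark.readStream
                    .format("cloudFiles")               # Auto Loader inside DLT
                    .option("cloudFiles.format", "csv")
                    .load(base_path))

    Also worth noting: when Auto Loader is pointed at the parent directory above the dated folders, it discovers new hourly subdirectories incrementally, which may remove the need to change the path per run at all.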
