All good, but the one thing I don’t like is the limitations. 100 concurrent pipeline updates per workspace is very low. I have many more than that. It’s never going to work out with such a limit.
Michael, can we parameterize input paths for Delta Live Tables? It appears DLT enforces a static path. How can DLT work against data partitioned over a file structure, e.g. yyyy-mm-dd/hh/*.csv? All the examples I have seen in the documentation assume a fixed input path. What if the input path changes over time, e.g. hourly, given data partitioned by hour of day on storage? Can a triggered DLT pipeline update its path through parameterization?
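One possible approach, sketched under the assumption that the pipeline reads a base path from configuration (DLT does support key-value pipeline configuration readable via `spark.conf.get`): build the hourly glob from the run timestamp rather than hard-coding it. The `base` value and the `hourly_input_path` helper here are hypothetical names for illustration, not a documented DLT API.

```python
from datetime import datetime, timezone

def hourly_input_path(base: str, ts: datetime) -> str:
    """Build a glob like <base>/yyyy-mm-dd/hh/*.csv for the given hour."""
    return f"{base}/{ts:%Y-%m-%d}/{ts:%H}/*.csv"

# Example: resolve the path for a specific hour of a triggered run.
path = hourly_input_path(
    "s3://bucket/raw",  # in DLT this could come from spark.conf.get("my.base_path")
    datetime(2023, 6, 1, 7, tzinfo=timezone.utc),
)
print(path)  # → s3://bucket/raw/2023-06-01/07/*.csv
```

A table definition could then pass `path` to its reader; alternatively, Auto Loader with a glob over the whole date hierarchy sidesteps per-hour paths entirely by discovering new files incrementally.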
Michael, thank you very much for creating Delta Live Tables, which is one of the most innovative features.
Thank You Databricks. ❤
Best presentation! Love the Databricks team ❤
What is the difference between a live table and a streaming live table?
After all these ads, could APPLY CHANGES be applied more than once in a single one-way data pipeline?
So does Databricks provide an ingestion process like ADF? ADF has 90+ data connectors to pull data from data sources.
❤
❤