Very clear explanation, thanks!
Any practical implementation of this design?
If batch & stream get merged in Silver, why do I need batch views, and how is that going to help me? A delay in processing means a delay in making informed decisions. Batch processing is good for viewing status, or for reporting without tight time limits.
I still go back to some of the old SSAS whitepapers every now and then, LOL.
Great content! Thanks for sharing. I couldn't see any Microsoft logo in the partners section for Delta Lake. There has long been support for integration between Azure Data Factory and Azure Databricks, but I can't find any support for Delta Lake in Data Factory. Is that not necessary, or am I thinking about it wrong?
Azure Data Factory works with Apache Spark and Azure Databricks but does not work with Delta Lake directly yet. Because ADF works with Spark and Azure Databricks, it can still talk to Delta Lake as the underlying storage of a Spark table (for example). HTH!
@@dennyglee Not working with Delta Lake directly "yet"? Is there work being done to make it work directly?
@@baatchus1519 Apologies, allow me to clarify. At the time of the recording, Microsoft themselves hadn't officially supported Delta Lake per se; only Azure Databricks itself was talking to Delta Lake natively. You are correct that any Spark 2.4.2+ instance can work with Delta Lake. That said, Synapse recently called out that they officially support Delta Lake, and in subsequent presentations we've updated the partner logos to include them.
Denny Lee, thanks for the update, appreciate it. Do you know if Data Factory will have native support for reading and writing to Delta Lake soon? And do you have any links to the Azure Synapse Delta Lake support?
Commits look like a blockchain.
I think it's the other way around: blockchains were inspired by transaction logs, which are a super old idea in DBs / distributed systems, and that's pretty much what they're doing here.
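For anyone curious, the "transaction log, not blockchain" point is easy to see in miniature. Below is a hypothetical Python sketch, not the real Delta Lake implementation (the real _delta_log also has checkpoints, protocol/metadata actions, optimistic concurrency, etc.): the core idea is just an ordered series of versioned JSON commit files, and replaying them in order reconstructs the table's current set of data files. No hashes chaining blocks, just an append-only sequence.

```python
import json
import tempfile
from pathlib import Path

# Hypothetical sketch of a Delta-style transaction log: each commit is a
# JSON file named by zero-padded version, and state is rebuilt by replay.

def commit(log_dir: Path, version: int, actions: list) -> None:
    # Delta names commits like 00000000000000000000.json so lexicographic
    # sort equals version order; one JSON action per line.
    (log_dir / f"{version:020d}.json").write_text(
        "\n".join(json.dumps(a) for a in actions)
    )

def replay(log_dir: Path) -> set:
    # Apply every commit in version order: 'add' registers a data file,
    # 'remove' tombstones it -- plain ordered appends, no block hashes.
    files = set()
    for commit_file in sorted(log_dir.glob("*.json")):
        for line in commit_file.read_text().splitlines():
            action = json.loads(line)
            if "add" in action:
                files.add(action["add"]["path"])
            elif "remove" in action:
                files.discard(action["remove"]["path"])
    return files

log_dir = Path(tempfile.mkdtemp())
commit(log_dir, 0, [{"add": {"path": "part-000.parquet"}}])
commit(log_dir, 1, [{"add": {"path": "part-001.parquet"}}])
commit(log_dir, 2, [{"remove": {"path": "part-000.parquet"}}])
print(replay(log_dir))  # -> {'part-001.parquet'}
```

Same shape as a database write-ahead log: the log is the source of truth, and readers derive a consistent snapshot by replaying it up to some version, which is also what makes time travel cheap.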