No words to describe your explanation!
Thanks
Thanks 💓 Please make sure to share this with your network over LinkedIn
You are so good at explaining this! And your drawings enhance my understanding 10 times further! Thank you.
Thank you ❤️ Please make sure to share with your network over LinkedIn 👍
Excellent work, waiting for more videos of this series
Thanks 👍 Please make sure to share with your network over LinkedIn ❤️
So glad you started this, thank you very much.
Thanks 😊 Please make sure to share this with your network over LinkedIn ❤️
Great Series sir
Thank you ❤️ Please make sure to share with your network over LinkedIn 👍
Very good explanation
Thank you ❤️ Please make sure to share with your network over LinkedIn 👍
Hi, Thank you for your efforts in making wonderful content videos.
I have a question, are these clusters created and running on a data plane, but cluster configuration is managed by a control plane, is that correct?
Yes, correct. Clusters are created in the customer's cloud account where the data lies (the data plane); the control plane only manages them.
If you liked the content, Please make sure to share with your network over LinkedIn 👍
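The split described in the reply above can be sketched conceptually. This is illustrative Python only, not the real Databricks SDK or API — every class, field, and method name here is made up to show the idea that the control plane holds configuration while the actual clusters run in the customer's cloud account:

```python
# Conceptual sketch (NOT real Databricks APIs): the control plane stores
# cluster configuration, while compute resources launch in the data plane,
# i.e. the customer's own cloud account, next to the data.

from dataclasses import dataclass, field

@dataclass
class DataPlane:
    """Lives in the customer's cloud account; runs clusters next to the data."""
    cloud_account: str
    running_clusters: list = field(default_factory=list)

    def launch(self, config: dict) -> str:
        # Compute (VMs) is created here, inside the customer account.
        cluster_id = f"cluster-{len(self.running_clusters) + 1}"
        self.running_clusters.append({"id": cluster_id, **config})
        return cluster_id

@dataclass
class ControlPlane:
    """Databricks-managed; keeps configs and tells the data plane what to do."""
    cluster_configs: dict = field(default_factory=dict)

    def create_cluster(self, name: str, config: dict, data_plane: DataPlane) -> str:
        # The control plane only persists metadata/configuration...
        self.cluster_configs[name] = config
        # ...while the cluster itself runs in the customer's data plane.
        return data_plane.launch(config)

control = ControlPlane()
plane = DataPlane(cloud_account="customer-aws-account")
cid = control.create_cluster("etl", {"workers": 4}, plane)
print(cid)                  # id of the cluster now running in the data plane
print(plane.cloud_account)  # the account where the compute actually lives
```

The point of the toy model: deleting the `ControlPlane` object would lose the configuration, but the data and compute would still be sitting in the customer's account — which mirrors why this architecture eases data-residency concerns.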
@@easewithdata Sure bro.
Hi bro, so if I'm not mistaken, this tutorial seems more focused on Databricks admin stuff like setting up catalogs, schemas, and external storage. It doesn't dive too deep into the developer side of things.
As a developer, I don't think I need to worry about creating catalogs or schemas myself. But should I still understand the basics of those concepts?
What I'm really interested in is creating tables, working with Parquet and Delta formats, version control, performance tuning, parameterizing queries, building jobs, and orchestration. Is that enough to prep for the DP-203 exam or land a data engineer job? Thank you.
@@hanumantharaochalla7443 This tutorial is designed specifically for Databricks platform knowledge. It also has examples of the utilities that developers would need. In case you want to learn PySpark, check out the other series more focused on that on the same channel, Ease With Data.
th-cam.com/play/PL2IsFZBGM_IHCl9zhRVC1EXTomkEp_1zm.html
@@hanumantharaochalla7443 And this series is not designed for DP-203, which is for the Azure cloud.
Hi bro, thanks for the awesome videos! If I'm understanding this right, the main difference between traditional BI and the modern Databricks approach is that Databricks brings everything together in one place. You've got your data warehouse, big data processing, and even ML/AI tools, all under one roof. This makes data governance a lot easier, right?
In the old way, data was often scattered across different databases (like Oracle for apps, Teradata or Netezza for DW, and Hadoop/HDFS for big data), each with its own data model. But with Databricks, it seems like the client has more control over the overall data landscape, correct? Pls clarify if I missed anything.
Yes, Databricks gives you a single platform for all needs, so governance also becomes easier.