Emil, finally I have found one comprehensive tutorial about Spark architecture. I have subscribed to your channel. Thanks a lot for your invaluable support!
The pleasure is mine. I am very happy you like it and find it useful!
The best spark tutorial on youtube!!
Thank you!!
I came across your video while searching for Spark architecture, and it's incredibly well explained. The content is delivered at a great pace, not overly complicated but perfect for a beginner. I'm expecting more videos like this in the future. Great job!
Thank you for the feedback! A video about Delta Lake is coming in 1-2 days. I hope you will like it as well.
@@DatabricksPro Can you please provide the link to your git repo for the code?
Amazing content! You've done absolute justice to explaining Spark! The breakdown of concepts is spot on and super easy to follow. Keep up the great work! Thank you!
Thanks for the feedback. I am really happy to see this being useful!
That was really a great explanation, covering each topic in depth. I don't see anyone else talk about the Spark UI and the power of it. Thanks again!
Very well explained, and the best part was the inclusion of examples for all the concepts and explanations. Thanks for such great effort.
Thanks! I am very happy that you liked it:)
Brilliant content, thank you!
You nailed it. Super useful video!!!
You nailed it perfectly Dimitri 😇
Too good explanation!!
Thanks for the great explanation! It would be perfect to also hear about your production deployment experiences in Databricks Delta Lake projects, including CI/CD, deploying bundles, monitoring, etc.
Yes, I also think it's a good idea. It will come.
This channel will explode in due time, great job! Looking forward to more such videos (hopefully on Spark optimisations soon). Thanks a lot!!
Thanks! Next week there will be a Delta Lake video with a bit of an optimization focus.
🤙🤙💥 Worth spending time on this video. Feels like zero to hero.
thanks for your efforts
Hi!
Thanks … indeed, super thanks for such a learning session on Spark. I wonder how many months of learning and understanding have been put into 1 hour! I recommend this to all #databricks users: having this understanding of #spark makes life easy with the #development of #datapipeline or #dataanalytics or #dataanalysis or #optimizing your #codes.
Thanks !!
Thank you :)
You are welcome!
Great work and thanks
Could you please make a similar video on Delta Lake?
I am glad you like it :) Delta Lake is actually a really good idea for the next video.
Query regarding cluster configuration: when configuring a cluster, we specify node capacity as, for example:
- 64 GB memory (RAM)
- 8 cores
How does Spark allocate storage (disk)? From this 64 GB, how does it split between RAM and disk?
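For context on this question, here is a minimal sketch of how open-source Spark's unified memory model divides an executor heap, assuming the default settings (`spark.memory.fraction = 0.6`, `spark.memory.storageFraction = 0.5`); the 64 GB figure and the ~300 MB reserved amount are illustrative, and Databricks may tune these defaults. Note that disk is not carved out of the 64 GB of RAM at all: spill and shuffle files go to the node's local disk (configured via `spark.local.dir`), which is sized independently of memory.

```python
# Sketch of Spark's unified memory model for a 64 GB executor heap.
# Defaults from open-source Spark; exact numbers are illustrative.
# Disk is NOT allocated from RAM: spill/shuffle go to local disk
# (spark.local.dir), which is a separate resource.

heap_gb = 64.0
reserved_gb = 0.3                      # ~300 MB reserved for Spark internals

usable = heap_gb - reserved_gb
unified = usable * 0.6                 # execution + storage (spark.memory.fraction)
storage = unified * 0.5                # storage's protected share (spark.memory.storageFraction)
user_memory = usable - unified         # user objects, UDF data structures, etc.

print(f"unified: {unified:.1f} GB, storage: {storage:.1f} GB, user: {user_memory:.1f} GB")
# → unified: 38.2 GB, storage: 19.1 GB, user: 25.5 GB
```

Execution and storage borrow from each other inside the unified region, so these are soft boundaries rather than hard partitions.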
Great video! Where can I get the 5 GB dataset? Please provide a link 🙏🙏🙏 I want to use the same data you used so I can follow along easily 🙏 Please provide all 5 GB of data.
I've watched many videos and tutorials, and none of them are as detailed. Awesome work 🫡