Very intuitive way of explaining things. It will help data people remember this through the shop example. Teaching by example is the only way to teach effectively.
Is "normalized = better write performance, denormalized = better read performance" always true, or just a rule of thumb? I have been exploring efficiency gains from normalizing, sometimes due to better indexing performance. Is it possible to actually increase read efficiency by normalizing and allowing better indexing?
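To sketch the tradeoff this comment asks about, here is a minimal SQLite example with a hypothetical shop schema (`customers`/`orders` are made-up names, not from the video): a normalized read needs a join, while the denormalized table answers the same question with a single-table scan.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized: the customer name is stored once; orders reference it by id.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, item TEXT)")
cur.execute("INSERT INTO customers VALUES (1, 'Alice')")
cur.executemany("INSERT INTO orders VALUES (?, 1, ?)", [(1, "tea"), (2, "milk")])

# Denormalized: the customer name is repeated on every order row.
cur.execute("CREATE TABLE orders_wide (id INTEGER PRIMARY KEY, customer_name TEXT, item TEXT)")
cur.executemany("INSERT INTO orders_wide VALUES (?, 'Alice', ?)", [(1, "tea"), (2, "milk")])

# Read path: normalized requires a join; denormalized does not.
joined = cur.execute(
    "SELECT c.name, o.item FROM orders o JOIN customers c ON o.customer_id = c.id ORDER BY o.id"
).fetchall()
flat = cur.execute("SELECT customer_name, item FROM orders_wide ORDER BY id").fetchall()
assert joined == flat  # same answer, different amount of work
```

Whether the join or the wide scan wins in practice depends on indexes, row width, and cache behavior, which is exactly why the commenter's "rule of thumb, not law" reading seems right.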
Thank you for the video! I’d also like to put forward document-based DBs like MongoDB as a sweet spot between traditional RDBMSs and NoSQL DBs (like the wide-column store you described in the denormalized scenario). Mongo data models are typically more denormalized than not, and it can ingest millions of writes/sec without issue.
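A tiny sketch of the denormalized document shape this comment describes, as a plain Python dict (the field names are hypothetical, not from any real collection): the customer data is embedded in the order, so one lookup returns everything a read needs.

```python
# Hypothetical order document in the shape a document store like MongoDB
# typically holds: related data embedded rather than joined at read time.
order_doc = {
    "_id": 101,
    "item": "tea",
    "quantity": 2,
    "customer": {          # embedded sub-document, not a foreign-key reference
        "name": "Alice",
        "city": "London",
    },
}

# Single fetch, no join: the read walks the one document it already has.
assert order_doc["customer"]["name"] == "Alice"
```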
Very nice explanation. Really interesting facts, easy to understand and remember. You really make IT interesting, as you say. This goes not only for this video but for all your videos. Thanks a lot for spreading knowledge.
Sir, I connected a person's mobile to my WiFi. Now he turns on his mobile hotspot and gives internet to others. I want him to use the internet himself, but the hotspot should not work. If there is any setting for this, please help me. I would be grateful.
I may not have money to donate, but I will not skip the ads on any video I watch on your channel. Great video! 😊
Thanks Marky, that means a lot man, thank you 🙏🙏☺️
Until you have to ingest all that deeply nested crap into a relational database
Nice explanation! I have learnt about DATA ENGINEERING by watching your videos. I truly appreciate your work... keep making videos!
Amazon DB2 --- best example of denormalized data
finished watching
Awesome video
Thank you man! It was a helpful video for my coming interview! Subscribed + Liked! 🤙🏼
Thank you for this video. Simple and Easy to follow!
👌👌👌
great video thank you
Thank you for this clear explanation.
Thanks
nice explanation sir...
thanks man
nice
Well explained, Sir. Thank you.
thanks
👏👏👏
Thanks 😊
very well explained man
Superb🙏
Brilliantly explained
Nice explanation, Sir. Thank you.
Thank you buddy ..... 😊
Great video 🤙🏽
New fan. I like your style.
Thanks Sidney
Great video!!
Great video! I’ve been watching for a while and I’ve learnt so much. Thanks a lot, you’re an amazing teacher, never stop.
Topic starts from 1:28
Sir, why are writes easy in a normalized DB? Any example, please? Thanks for your video. 🙏
Read this 👉 link.medium.com/K7ivyOiTsxb
Good stuff for beginners, obviously every aspect could not be covered in a limited timeframe.
Can we say the denormalization process follows normalization, instead of asking which is better?
Video starts at @1:08
Can you please explain the SD-WAN concept?
thanks for suggestion