Great Work. Keep it up.
Thanks.
Saptadeep.
Thank you so much, highly appreciated 😀🙌
brooo, this is crazy donation
Sir send me some money also
@@salman1098😂😂😂
Bro, I haven't eaten anything in two days, please send 200 rupees
Bro.. this guy is crazyyyyyyyyy. This was way better than other code videos I watched
The way you explained made me learn this topic *Kafka*, which I had never heard of before this random video showed up …. hats off to you for teaching like a friend....👏🙌👏
Piyush, I am Usman from Pakistan and I really love your videos. The way you explain things from the very basics and use drawings for visualization and clarity is literally amazing and easy to follow. Thank you so much for such valuable content. Lots of love from Pakistan to you💖
I have watched the full video from start to finish and enjoyed it. It's amazing to see how clearly you understand the concepts, and your command of Node syntax 🎉
Was waiting for the answer to "Why does Kafka have such high throughput?"
Great explanation, thank you for the effort.
Although I don't know Node.js, the way you taught Kafka is simply awesome.
It came up randomly in my recommendations and I clicked on it… before, I had only heard of Kafka, but you explained it so well. I don't know if I'll be using this stuff anytime soon, but I sure as hell won't forget how it works ❤
same
The way you explained a complex concept like kafka with such simple and clear examples is really fascinating. Thank you!
What a true sentence "Half knowledge is very dangerous". Good work buddy. Sweetly swallowing the bitter medicine.
What a brilliant explanation! I wasn't familiar with Kafka before, and you explained it in layman's language superbly! Kudos to you. Keep creating videos like this!
Not just for the coding, I come here for his personality ❤. Most cool and calm. Great channel. ❤
Great lecture….. in one hour you can easily visualise and learn Kafka in and out. The best part of the video is when he codes along while visualising what he is coding.
What a video you've made, brother.. Top notch.. The full hour was well utilised..
TLDW;
Databases have low throughput while Kafka has high throughput, but databases are better at long-term storage and querying than Kafka.
Kafka is a messaging system that can act as both a message queue and a pub-sub system; it was originally developed at LinkedIn and is now an Apache project.
Kafka has traditionally required ZooKeeper (another Apache service) for cluster coordination; newer versions can run without it using KRaft mode.
Kafka organises messages into topics, and each topic is further split into partitions.
One consumer can subscribe to multiple topics, and partitions are balanced across consumers automatically by Kafka, but one partition can't be consumed by multiple consumers in the same group; this way Kafka acts as a message queue.
To act as pub-sub, Kafka has the concept of consumer groups: consumers in the same group that subscribe to a topic share its partitions, balanced by Kafka, but every group receives all the partitions; this way it acts as pub-sub as well.
Kafka has an admin client whose job is creating topics and partitions etc., producers that produce data, and consumers that consume data.
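The queue vs pub-sub behaviour summarised above can be sketched in plain Node.js (this is a simulation only, not the real Kafka client; all group and consumer names are made up):

```javascript
// Simulation of Kafka's delivery semantics (illustrative only, no real broker).
// Within a consumer group, each message goes to exactly ONE consumer (queue
// behaviour); across groups, EVERY group receives every message (pub-sub).

function deliver(message, groups) {
  const deliveries = [];
  for (const [groupName, consumers] of Object.entries(groups)) {
    // Pick one consumer per group; here a simple rotation by offset stands in
    // for Kafka's partition assignment.
    const consumer = consumers[message.offset % consumers.length];
    deliveries.push({ group: groupName, consumer, value: message.value });
  }
  return deliveries;
}

const groups = {
  "email-service": ["email-1", "email-2"], // two consumers share the work
  "analytics": ["analytics-1"],            // a second group gets its own copy
};

const d0 = deliver({ offset: 0, value: "order-created" }, groups);
const d1 = deliver({ offset: 1, value: "order-shipped" }, groups);

console.log(d0.length);                        // one delivery per group
console.log(d0[0].consumer, d1[0].consumer);   // load balanced within a group
```

Running this shows both groups receiving every message, while within "email-service" consecutive messages alternate between the two consumers.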
Are these topics involved in system design?
@@growmoreyt4192 yes sir
@@growmoreyt4192 yes
There can be partitions within a topic; refer to this ---> www.openlogic.com/sites/default/files/image/2022-03/Kafka%20Cluster%203.jpg
What a great explanation! Hats off to you, sir. I'm wondering why you have only 11K subscribers. You deserve millions!
Not many people interested in Kafka know Hindi, and not many people who know Hindi want to know Kafka.
he is already at 49k in 2months
@@DanielSmith-hd9iq 60k when i joined
68k when I joined .. nice explanation
@@sandeep-rai07 72k when I joined
This is the best Kafka course I have watched on YouTube
The way @Piyush teaches is amazing. I just opened the video to skim through it but ended up watching it completely. I'm working as a Sr. Software Engineer and can say that this is better than many paid videos.
Thanks man.
@15:16 Just a correction: Kafka storage is not temporary. You can store data in Kafka forever, since Kafka persists data on disk; it is not in-memory storage like Redis.
Yes, but we can't rely on Kafka storage for the long term like a database system
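For context, retention is configurable in Kafka; a rough sketch with the stock tooling (the topic name `orders` and the broker address are placeholders):

```shell
# Broker-wide default in server.properties: keep data for 7 days (168 hours).
#   log.retention.hours=168

# Per-topic override: retain messages in the "orders" topic forever.
bin/kafka-configs.sh --bootstrap-server localhost:9092 \
  --alter --entity-type topics --entity-name orders \
  --add-config retention.ms=-1
```

With `retention.ms=-1` Kafka never deletes messages for that topic; log compaction (`cleanup.policy=compact`) is another option when only the latest value per key matters.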
Bro..... hats off to your explanation. You explained Kafka in the simplest way. Thanks bro
Got this channel randomly, you’re a gem ❤
Really, you literally took me into the world of kafka. Seriously your explanation is superb.
I'm so happy that I found your channel. I was browsing through your channel and found you have created really good content on the things which are currently trending. Thank you ❤
Great Work. You have provided the example which will be never forgotten.
Your visual way of teaching is too good, thanks mate
I've watched the full video through to the finish and really enjoyed it. Your command of node syntax and your obvious comprehension of the subject are outstanding. 🎉
this channel is a gold mine.
Great in-depth knowledge of Kafka from Piyush; he puts in (hard + smart) work to spread the best knowledge......
Bro, I rarely comment on YouTube videos.. I am a mechanical engineer by profession and recently got inspired to learn the data engineering stack... And you made my day.. what a great video for understanding Kafka fundamentals... God bless you man with lots of happiness and success... You are a great teacher.... For anything new I plan to learn in future, your crash course would be the first one I'd look for....
Bro, just watched 7 mins of your video and I had to come to the comment section to say that the language you are using is so simple and easy to understand. "Database fat jayega" (the database will burst) 😃. Perfect style of explanation for a core developer like me. Good job.
Thank you so much bro, you really don't know how much help you have made by explaining this concept so easily 🙏🙏
I've never opened this many terminals in my whole life... It was a great video..
Lemme try:
Kafka is a distributed message queue, generally used to decouple microservices where one API produces some data and many others consume it. Simple.
I don't know why he confused people with DB vs Kafka. Kafka is really a queue; you only read data sequentially. Kafka CAN hold data for an infinite period if you want, but it's not made for 'querying' data. Instead, it holds the data temporarily so that the producer API doesn't need to wait for the consumer API to 'read' this data; it can simply dump the data onto the queue and let the consumer API read it whenever it finds time.
If you're even a little confused between Kafka and a DB, this guy is to blame for that.
KafkaStreams and ksqldb, built on top of kafka. But yes you are right!
He is right; he is explaining how Kafka works. It seems that you only use the technology without understanding how it works.
I assume you have better videos on YouTube explaining Kafka, if you are blaming him for confusing others.
If not, then appreciate his efforts in teaching for free.
@@BANANAS2011 Appreciate efforts who give will give you alcohol when thirsty? Why are you braindead?
He was trying to explain why we need Kafka at all instead of inserting the data directly into the database. It is because the throughput of the database is not high enough to handle fast streaming data. So, what if we only used Kafka and discarded the database? That's not a good option either, because Kafka is not a good solution for storage and querying. So, I thought his explanation was perfect.
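The buffering argument can be made concrete with a toy model (plain Node.js, no real broker; the numbers and names are arbitrary):

```javascript
// Toy model: a fast producer dumps events into a buffer (a stand-in for a
// Kafka topic), while a slow "database" consumer drains them in batches at
// its own pace. Nothing is lost; the DB just catches up later.

const buffer = []; // plays the role of a Kafka topic

// Producer: emits 10,000 events instantly -- far faster than the DB inserts.
for (let i = 0; i < 10000; i++) {
  buffer.push({ id: i, event: "order-update" });
}

// Consumer: drains the buffer in batches of 500, as DB throughput allows.
let inserted = 0;
function drainBatch(batchSize) {
  const batch = buffer.splice(0, batchSize); // take up to batchSize events
  inserted += batch.length;                  // pretend bulk-insert into the DB
  return batch.length;
}

while (buffer.length > 0) drainBatch(500);

console.log(inserted); // every produced event eventually reaches the database
```

The producer never waited on the database; the buffer absorbed the burst, which is exactly the decoupling being described.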
Bro, you have a special talent for making difficult topics easy. It was my first Kafka tutorial and I understood it well enough. Thanks a lot ❤❤
agree
Totally agree ❤❤😥😥
😮 The way you explain is so cool even at 1.75x. I never knew what Kafka was; now I feel like a Kafka expert.
All my Kafka concepts are finally clear today!
Really awesome video, good understanding. Since it's in Hindi, my attention span was good too! Great job
I tried many YouTube courses for Kafka, but your way of explaining is so great.. please keep up the good work..
Very Clear and Concise explanation! The way you dumb down complex topics for your audience is remarkable! Keep up the good work :)
Great job Piyush on explaining the details both visually and with the code approach 💯
While getting the hang of Kafka as we went further, I started to wonder how Kafka does the message transfer, and when you talked about the queue and pub/sub architecture approach, it clicked. It's similar to how the Amazon SNS and SQS services work, and from that point things got a lot more relatable.
I had a chance to work with the Pusher messaging service, which had a similar pub/sub messaging architecture.
This Kafka session turned out to be really amazing.
Looking forward to more of something like this in future.
Best video on Kafka ever. Loved your simple explanation and demo with visualization.
One of the best channels I've found on YouTube. You are doing great work bro ❤❤
OMG, I've never watched this kind of video before; it's so helpful for anyone who wants to make a career in this sector. Superb! I like your explaining technique. Thanks a lot!
Bro!!! Very well explained; the first CS lecture I have watched with excitement till the end. Thank you
WOW!!!!!!!!!!!!! Excellent. You deserve more than what I'm writing.....
BEST EXPLANATION. I'm so sad that I discovered this channel so late.
What a great explanation! I don’t think I will forget it after watching this. Thanks!
Amazingly simplified explanation, superb!
This is the first video I discovered in this channel.
I loved it, bro.
Thank You!
Thank you. You teach the Kafka concepts in a crystal clear way. Love from Bangladesh 🤍🇧🇩
One of the greatest crash course videos on Kafka. Great work Piyush 🏆
HI Piyush,
This is indeed a crystal clear video about Apache Kafka. Thanks for sharing this wonderful knowledge with the IT developer fraternity, and for making the concept so clear.
Loved it! After a long time I've watched such a good technical explanation video. You have a teaching style similar to Brad Traversy.
Bro, you are the only Indian YouTuber explaining/teaching real-world tech.
Excellent video on Kafka's purpose, use cases, and producer/consumer demo. Well done.
Please do this whole scenario with a database as well. Thanks in advance for this sound knowledge....
Thanks for the crash course; this may be the first course I have watched completely and understood completely.
Thanks for this content. For the first time I attempted to listen to tech content while I was busy with mundane work, and I got a wonderful idea and understanding of Kafka. Tomorrow morning I shall implement this and also read further on event-driven architecture.🎉
Thanks again
I have no words; this video is amazing. Please make a whole series on Kafka please...........
In an Iron Man movie, Kafka could have been useful in a scene where Tony Stark (Iron Man) needs to process and manage a large volume of real-time data or communications. Here's a hypothetical scenario where Kafka could play a role:
Scene: Tony Stark is in his high-tech lab, and he's remotely controlling his Iron Man suit, which is deployed in a distant location to handle a crisis. He needs to receive and process real-time data from various sensors on the suit, such as vital signs, telemetry data, and external environmental data, while also receiving live video feeds.
How Kafka could be useful:
1. **Real-time Data Ingestion**: Kafka could be used to ingest data from these sensors and video feeds in real-time. Each type of data (vital signs, telemetry, video) could be treated as a separate Kafka topic.
2. **Data Processing**: Tony needs to process this data for real-time decision-making. Kafka Streams, a component of Kafka, could be used to perform real-time data processing, such as analyzing vital signs for signs of distress, stabilizing the suit's functions, and identifying threats in the video feed.
3. **Reliability**: In a high-stakes situation like this, Kafka's reliability ensures that no data is lost. If there are network interruptions or delays, Kafka can buffer and replay messages, ensuring that Tony has access to all the critical data.
4. **Scalability**: If the crisis intensifies and more data needs to be processed, Kafka can scale horizontally by adding more Kafka brokers, allowing Tony to handle the increased data flow without performance issues.
5. **Monitoring**: Kafka provides extensive monitoring capabilities, which could be depicted in the movie as Tony monitoring the health of the data pipeline in real-time, ensuring that he has a clear view of the suit's status.
In this scenario, Kafka would enable Tony Stark to efficiently manage and respond to real-time data, enhancing his ability to control the Iron Man suit and handle the crisis effectively. It would add a layer of realism to the technological aspects of the movie.
Team Iron Man!!!
Please mark this to @marvel as they need this data :0
Very clear and concise explanation - I watched at 2x speed and was still able to grasp everything :D
Hi Piyush,
This is just to let you know that you are doing great work. Your teaching style is unmatchable; you keep everything very simple and don't repeat yourself, which makes your videos so enjoyable to watch.
May all your dreams come true, as you are helping us to achieve our goals.
Greatly appreciated. You are one of my favourite YouTubers. Please keep making this type of video regularly. Thank you so much.
This video is a really great crash course to learn the fundamentals, and the hands-on practice is so good... 👏🏻
This man is way better than most of the Edtech Academies out there.
This is one of the best tutorials on Kafka, thank you :)
I work remotely from India as a web developer for a US company; this is a very informative video. I am dealing with large data using Kafka & Elasticsearch.
Bhaiya, you are a great teacher. Your videos are way better than paid courses. Love your videos.
Amazing explanation, nothing can be better than this.
Thanks for the clear and detailed explanation of Apache Kafka. Your insights really helped me understand its architecture and use cases better!
Such a basic explanation with basic examples; it's really easy to understand
Genuinely one of the best crash courses on YouTube I have come across in a while.
Thank you.
Can you also make further videos on topics like data indexing and sharding?
I was tired at the beginning of the video,
but by the end I felt fresh.
Masha Allah, great explanation sir.
Love from Pakistan
Thank you for the hard work. DB throughput is low because of ACID properties, which are not maintained in the case of bulk inserts (done without redo/undo logging), so there is a real risk of data loss. For low-stakes data (Facebook, Discord, etc.) you accept that risk. For financial transactions, you need to come up with a strategy where, when Kafka's storage fills up or the database fails, producers stop producing until the consumer errors are fixed, to avoid data loss.
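The stop-on-failure strategy described above can be sketched as a toy pipeline (purely illustrative; the class and method names are made up, and a real setup would use Kafka's consumer pause/resume and transactional writes):

```javascript
// Sketch of "stop producing until the consumer error is fixed".
// On a failed bulk insert, the pipeline pauses and refuses new records,
// so nothing is silently dropped; the pending batch is retried after resume.

class SafePipeline {
  constructor(insertFn) {
    this.insertFn = insertFn; // bulk insert into the DB; may throw
    this.paused = false;
    this.pending = [];
  }
  produce(record) {
    if (this.paused) return false; // backpressure: refuse new records
    this.pending.push(record);
    return true;
  }
  flush() {
    try {
      this.insertFn(this.pending); // attempt the bulk insert
      this.pending = [];
      return true;
    } catch (e) {
      this.paused = true; // DB failure: stop accepting records, keep the batch
      return false;
    }
  }
  resume() { this.paused = false; } // call once the DB side is fixed
}

// Usage: a fake DB that fails once, then recovers.
let failOnce = true;
const db = [];
const pipeline = new SafePipeline((batch) => {
  if (failOnce) { failOnce = false; throw new Error("db down"); }
  db.push(...batch);
});

pipeline.produce({ txn: 1 });
pipeline.flush();                               // fails -> pipeline pauses
const accepted = pipeline.produce({ txn: 2 });  // rejected while paused
pipeline.resume();
pipeline.flush();                               // retried; txn 1 lands in the DB
```

The key property is that the failed batch stays in `pending` and new records are rejected until `resume()`, which mirrors the no-data-loss requirement for financial transactions.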
I really learn from your videos. Can you please make videos on these topics:
1. Design patterns in JS
2. React with TypeScript
3. TypeScript standalone course
4. Next.js
5. Full-stack application built from scratch
6. Asynchronous processing
7. Message queues
8. Redis
9. Jenkins
10. Advanced React.js
You don't need a course to learn TypeScript/React with TypeScript; you just need to use it
For the 6th and 7th, I would highly recommend the Namaste JavaScript playlist by Akshay Saini
@@caresbruh Bro, I need a job but don't have the skills. Can you tell me how I can get them, and suggest a Hindi YouTube channel for freshers? I want to get into frontend fast. Please reply
One of the greatest videos I have ever watched!!!! Thanks Piyush bhai
Great watch just before an interview!
Thank you so much Piyush!! For such a great explanation; I think for the first time I didn't think for a second before subscribing to someone!!
OMG! This is a lifesaver. I am interning at Goldman Sachs from IIT Delhi and didn't have a web dev background at all, so I had to quickly learn about Kafka, and this is the best thing one could get.
Bro, thank you so much. I am a non-IT person; I practice algo trading in the stock market, and all these concepts are solving many of my problems.
You make it very simple to understand.
Thank you bro!
Crisp explanation. A great way of describing each concept
I always wanted to learn Kafka.
Thank you so much for putting out such great content.
You made me understand every topic where I was struggling.
Many thanks 🙏
Bro, that was so much fun!! Quality content, man
The way of explaining was extremely good, idk why your Channel is so underrated 💥
Please appreciate this guy, so goooooooood explanation!!!!!!!!!
Your content is so engaging.... I kept watching till the end, and at last I was like, wow, such an easy explanation.
Subscriber++; likes++;
Just 7 min into your video and you are awesome already bro
You have explained it in a very simple way. Thanks to you, man ☺
Thank you; I am learning Spring Boot and this is very helpful.
This is really a wonderful explanation. Easy to understand for non-tech person as well. Thanks😀
Thanks so much for this amazing course. Your way of teaching makes it super easy to understand the concept.
Excellent, mind blowing explanation THE BEST you can say
amazing! best crash course video on kafka.
Very nicely articulated internals of Kafka and how it works with a DB. Thanks a lot Piyush, keep up the good work.
2 questions:
1) Why and when should one do a bulk insert of Kafka records into the DB (Zomato, Uber) instead of record-by-record updates?
2) What is the basic essence of Kafka preferring consumer groups over standalone consumers?
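On question 2: consumer groups buy you parallelism within a topic plus automatic failover. A toy sketch of partition assignment inside one group (simplified round-robin; real Kafka uses pluggable assignors such as range, round-robin, and sticky, and the consumer names here are made up):

```javascript
// Simplified round-robin partition assignment within one consumer group.
// Each partition goes to exactly one consumer; when a consumer leaves,
// a rebalance redistributes its partitions so none are orphaned.

function assignPartitions(partitions, consumers) {
  const assignment = Object.fromEntries(consumers.map((c) => [c, []]));
  partitions.forEach((p, i) => {
    assignment[consumers[i % consumers.length]].push(p);
  });
  return assignment;
}

const partitions = [0, 1, 2, 3];

// Two consumers: the work is split evenly across them.
const before = assignPartitions(partitions, ["c1", "c2"]);

// "c2" crashes -- the rebalance hands everything to "c1".
const after = assignPartitions(partitions, ["c1"]);

console.log(before.c1, before.c2); // [0, 2] and [1, 3]
console.log(after.c1);             // [0, 1, 2, 3]
```

A standalone consumer gives you neither property: it must read every partition itself and nothing takes over when it dies, which is why groups are the default way to scale consumption.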
Bro, what an explanation you give, man. I loved it!
Super interesting and one of the simplest and consumable explanation. 👏
Very good explanation. Good example. Good theory. Good practical. Keep it up, bro.
Explained in a very easy and smooth manner !!!
It was very easy to understand Kafka for beginners like me. Thanks a lot !!
Great work. Understood every bit. One of the best crash courses. Thanks