Kafka Architecture | Kafka Tutorial Interview Questions
- Published on 27 Sep 2024
- As part of this video we cover Kafka architecture: Kafka brokers, topics, partitions, consumers, and the benefits of Kafka.
Please subscribe to our channel.
Here is a link to other Spark interview questions
• 2.5 Transformations Vs...
Here is a link to other Hadoop interview questions
• 1.1 Why Spark is Faste...
Please add a few questions which are most often asked, like:
>> Consumer groups, consumers, and offsets
>> Role of ZooKeeper
>> In-sync replicas (ISR)
>> Data retention
>> General performance issues and debugging
>> Monitoring of services
>> Back pressure [how it is handled in Kafka + Spark Streaming]
Superb, thanks for continuing to upload videos.
Gave me a good understanding. Thank you.
Nice video, clear
Thanks Bhargav
Nice explanation
Thanks Rishi
Nicely explained. You made it very simple, especially the portion on segmentation; I am going to read more on it. Any more videos on ZooKeeper etc.?
I am happy it is useful for you. Suggest a topic and I will create more videos on it.
Sir, kindly do more videos on interview questions.
We have a series on Spark, Hadoop, and Hive questions. Do you have any other interview questions we should cover? Please suggest, and please subscribe to and share this channel.
Is the offset ID assigned in sequence, so the first message gets offset ID 1, the second message gets offset ID 2, and so on?
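A note on the question above: in Kafka, offsets are assigned sequentially per partition, but the sequence starts at 0, not 1. A minimal in-memory sketch of this behavior (illustrative only, not the real Kafka API):

```python
# Illustrative model of a single Kafka partition log:
# each appended message receives the next sequential offset, starting at 0.

class PartitionLog:
    def __init__(self):
        self.messages = []  # the list index doubles as the message offset

    def append(self, message):
        offset = len(self.messages)  # next sequential offset for this partition
        self.messages.append(message)
        return offset

log = PartitionLog()
print(log.append("first message"))   # 0
print(log.append("second message"))  # 1
print(log.append("third message"))   # 2
```

Note that the sequence is per partition: two partitions of the same topic each maintain their own independent offset counter.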
Suppose a source and a sink are configured. If I update a record, the update reaches the sink side. But if I delete a record from the source, that record is now invalid in the sink database. How does Kafka handle the delete operation? This is a question I was asked in a Bosch interview and I was not aware of the answer, so can you explain it?
You have 3 brokers, not 3 topics.
When you say batch posting: how does the producer send messages in a batch, and can the consumer read messages in a batch? I understood only the producer writing one message and the consumer reading one message. Could you please explain this?
The producer can send a group of messages to a topic instead of one message at a time, and the consumer can likewise read a group of messages together.
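The batching idea above can be sketched with an in-memory simulation (illustrative only; the real Kafka clients expose this via settings such as the producer's batching buffer and the consumer's maximum records per poll, with names and defaults that vary by client library):

```python
# Illustrative simulation of batching: the producer buffers messages and
# flushes them to the broker as one batch; the consumer polls and receives
# multiple messages per call.

class Broker:
    def __init__(self):
        self.log = []

    def append_batch(self, batch):
        self.log.extend(batch)      # one request carries many messages

class BatchingProducer:
    def __init__(self, broker, batch_size=3):
        self.broker = broker
        self.batch_size = batch_size
        self.buffer = []

    def send(self, message):
        self.buffer.append(message)
        if len(self.buffer) >= self.batch_size:
            self.flush()            # ship the whole buffer in one batch

    def flush(self):
        if self.buffer:
            self.broker.append_batch(self.buffer)
            self.buffer = []

class BatchingConsumer:
    def __init__(self, broker):
        self.broker = broker
        self.position = 0           # next offset to read

    def poll(self, max_records=10):
        batch = self.broker.log[self.position:self.position + max_records]
        self.position += len(batch)
        return batch                # many messages returned per poll

broker = Broker()
producer = BatchingProducer(broker, batch_size=3)
for i in range(6):
    producer.send(f"msg-{i}")
producer.flush()

consumer = BatchingConsumer(broker)
print(consumer.poll(max_records=4))  # ['msg-0', 'msg-1', 'msg-2', 'msg-3']
```

Batching amortizes network round-trips: six messages above reached the broker in two requests instead of six.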
I got an interview question but could not answer it properly.
Question: How does Kafka get data from an external application?
That can be done using Kafka connectors (the Kafka Connect framework).
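To expand on the answer above: Kafka Connect source connectors pull data from external systems into Kafka topics without custom producer code. A hypothetical configuration for Confluent's JDBC source connector is sketched below; the `connector.class` and property names come from that connector, while the connection URL, column, and topic prefix values are made-up placeholders:

```json
{
  "name": "example-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": "1",
    "connection.url": "jdbc:postgresql://example-host:5432/exampledb",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "example-db-"
  }
}
```

Kafka Connect also supports sink connectors for the reverse direction, pushing data from topics out to external systems such as databases or search indexes.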
good content
Thanks :)