Excellent tutorial on real-time scenarios. Avro serialization is one of the real-time scenarios used in projects and is also an important topic in Kafka interviews. Thanks a lot, Basant, for the effort put in, much appreciated!!
Thanks Basant. Appreciate your efforts. God Bless you.🙂🙏 Every week waiting for your videos. You are my Guru. 😊 The way of your explanation is awesome.
Thanks for making such a detailed video about schema registry!!
Thanks for this tutorial.. your contribution to the developer community is invaluable.
Surprisingly, I had this kind of requirement using an Avro object. Many thanks!
Glad it helped!
Thank you Basant bhai.. Lucid explanation, loved it👍
Thank you.
I learned more about Kafka and Docker Compose.
Nice explanation, lots of other things to learn as well, such as Docker, etc. Thank you as always.
Bro, your content is exceptionally great.
Great content, waiting for Kafka Connect and Kafka Streams.
Kafka Connect I struggled with like anything, but no results. Once I successfully run one source and sink POC, I will update you, buddy 👍
Excellent, sir, no words to describe your knowledge sharing.
"Let's get started" has a separate fan base..
Wowww
Awesome
Thank you very much, so helpful!
Nice explanation
Nice video, you are putting in a lot of effort.
Thanks, I was looking for this one. Please cover Kafka Connect also 😊
Sure buddy
Thanks for your useful tutorials.
Short question, please: what is the difference between your first change to the Avro schema, where sending Employee information without age/dob was validated successfully, and your second change, where adding the middle name failed validation? I didn't notice any difference in the process between the two changes...
Check the video, I have explained the difference in schema evolution in the Control Center UI.
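A plausible explanation, assuming the registry is running with its default BACKWARD compatibility mode: deleting a field is backward compatible (a reader on the new schema simply ignores data that old records still carry), but adding a new field is only compatible if it declares a default, because a reader on the new schema has to invent a value when decoding old records that don't contain the field. A sketch of the two changes (field names are illustrative, not necessarily the exact ones from the video):

```jsonc
// Change 1 — removing age/dob from the Employee record: accepted under
// BACKWARD compatibility; new readers just ignore those fields in old data.

// Change 2 — adding middleName as a required field: rejected (HTTP 409),
// since a new reader has no value to use when reading old records.
{ "name": "middleName", "type": "string" }

// Change 2, fixed — a default makes the addition backward compatible:
{ "name": "middleName", "type": ["null", "string"], "default": null }
```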
Thank you so much. Why does the Avro-generated DTO have @Deprecated on its attributes?
Thanks!
Thanks bro, please do more videos like this, and please keep up the good work.
Thank you for a wonderful tutorial
Nice tutorial, but if you demonstrated this concept by creating two different projects (producer and consumer) it would be clearer and perfect.
Yes, I agree, but to reduce the length of the video I kept both in the same project. What's the challenge here in separating the consumer into a different project?
@@Javatechie I think we need to generate the bean classes using the Avro schema in both the producer and consumer applications and then run both apps. I thought that if we separate producer and consumer and change the schema in the producer, it should reflect in the consumer app without a re-run, but it seems that's not possible; we have to re-run both apps. Please correct me if I'm wrong?
You are correct, you only need to build your app again; that's the one thing you need to do.
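The rebuild both apps need is just the Avro code-generation step in the build; regenerating the classes from the updated .avsc is all it takes. A minimal Maven sketch using the standard Apache plugin (the version and directory layout are assumptions):

```xml
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.11.3</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <!-- "schema" generates Employee.java from the .avsc file -->
        <goal>schema</goal>
      </goals>
      <configuration>
        <sourceDirectory>${project.basedir}/src/main/resources/avro</sourceDirectory>
        <outputDirectory>${project.basedir}/target/generated-sources/avro</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```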
Excellent tutorial
Sir, I have one question. If we put producer and consumer in separate projects, then we also have to keep the .avsc file in the consumer project. Then for any change in the payload structure on the producer side, we need to make the same change in the consumer-side Avro file. So in the end the consumer jar needs to be redeployed to the server. So what is the benefit of the schema registry? Please answer.
Actually, this file should be placed in some common or centralized place like a config server or S3.
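Another option that removes the redeploy problem entirely: let the consumer read GenericRecord instead of the generated class, so the writer's schema always comes from the registry and the consumer needs no .avsc at all. A minimal sketch with Confluent's KafkaAvroDeserializer (topic name, group id, and URLs are assumptions):

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import io.confluent.kafka.serializers.KafkaAvroDeserializer;

public class GenericEmployeeConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "employee-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
        props.put("schema.registry.url", "http://localhost:8081");
        // false => GenericRecord; the writer's schema is fetched from the
        // registry, so no .avsc file or generated Employee class is needed here.
        props.put("specific.avro.reader", false);

        try (KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("employee-topic"));
            while (true) {
                for (ConsumerRecord<String, GenericRecord> rec : consumer.poll(Duration.ofSeconds(1))) {
                    System.out.println(rec.value().get("firstName"));
                }
            }
        }
    }
}
```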
❤❤ Thank you so much sir 🙏
Nice explanation. Just have one doubt: can we use this when we are storing employee info in a database like Cassandra, as the database schema is fixed there? And if we want to accommodate new fields we will have to manually add them.
But in the db schema you need to modify the field.
@@Javatechie yes, I just wanted to know a couple of real-life use cases for the Avro schema
@@letsCherishCoding the purpose of Avro is to avoid failures during schema evolution. Next, you can think of any e-commerce or food delivery app that frequently makes changes to its payload
@@Javatechie got it. Thanks
Hello sir,
I have already followed you for the last two years and you are awesome.
Could you please help me with one interview question on threading:
in your current project, where are you using threads, with an example?
Great tutorial!
One question: do we need to add any configuration related to SSL when using the Confluent Cloud schema registry (https)? I'm getting the below error when my Spring Boot app tries to connect to the schema URL.
SSLHandshakeException - PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
Yes, you need to add the keystore and truststore certificates in the producer and consumer configuration
@@Javatechie Thanks for the quick response
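For reference, a sketch of what that typically looks like in a Spring Boot application.properties; the property keys are Confluent's standard client settings, while endpoints, paths, and passwords are placeholders. The PKIX error usually means the JVM's truststore doesn't contain the CA that signed the registry's certificate (a corporate proxy is a common culprit), so pointing the client at an explicit truststore is the usual fix:

```properties
spring.kafka.properties.schema.registry.url=https://<your-sr-endpoint>
# Confluent Cloud schema registry authentication
spring.kafka.properties.basic.auth.credentials.source=USER_INFO
spring.kafka.properties.basic.auth.user.info=<sr-api-key>:<sr-api-secret>
# Only needed when the JVM's default truststore lacks the CA certificate
spring.kafka.properties.schema.registry.ssl.truststore.location=/path/to/truststore.jks
spring.kafka.properties.schema.registry.ssl.truststore.password=changeit
```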
If the consumer is in a different microservice, I need to generate the Avro class for all the consumers, right? And in the future, if we need to add a new field, we first need to add it with a default such as empty; is my understanding correct?
Yes, that's correct, but if your consumer is a different project then you only need to get the updated schema from the producer; that's it
Yes, exactly, because whatever version of the Avro object the producer is sending, the consumer is going to consume that object as per the version in the schema registry when deserializing
@@Javatechie if the producer sends a new schema, does the consumer need to regenerate the DTO files on the consumer end before it consumes? Until the new changes are deployed to prod, does it use the old schema? How exactly does schema versioning work for the consumer without any break if the consumer is a separate external service?
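For what it's worth, the consumer does not need regenerated DTOs just because the producer registered a new version. Every Avro message carries the writer's schema ID; the deserializer fetches that schema from the registry and Avro resolves it against whatever reader schema the consumer was built with, ignoring removed fields and filling added ones from their defaults. A sketch of the resolution step the deserializer performs internally (schemas and names here are illustrative):

```java
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;

public class SchemaResolutionSketch {

    // writerSchema: the version the producer used (looked up in the registry by ID).
    // readerSchema: the version this consumer was built against.
    // Avro's schema resolution bridges the two, which is why an old consumer
    // keeps working when the producer evolves the schema compatibly.
    static GenericRecord decode(byte[] avroPayload, Schema writerSchema, Schema readerSchema)
            throws IOException {
        GenericDatumReader<GenericRecord> reader =
                new GenericDatumReader<>(writerSchema, readerSchema);
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(avroPayload, null);
        return reader.read(null, decoder);
    }
}
```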
This is awesome content, thanks. One question: how do you share the schema across multiple microservices?
It should be stored in the registry
@@Javatechie Sorry, I didn't get which registry you are talking about. Could you elaborate further? Can we store it in something like a config server? Just thinking...
Very helpful!
Hi Basant,
what is the default compatibility for the Avro schema registry?
Could you please discuss more on Forward and Backward compatibility configurations.
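For anyone else wondering: Confluent Schema Registry's default compatibility level is BACKWARD (a new schema must be able to read data written with the previous one); FORWARD means old readers can read data written with the new schema, and FULL requires both. The level can be inspected and changed per subject over the REST API; a sketch, assuming the subject name employee-topic-value:

```bash
# Global default level (BACKWARD unless it has been changed)
curl http://localhost:8081/config

# Set a per-subject level
curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
     --data '{"compatibility": "FORWARD"}' \
     http://localhost:8081/config/employee-topic-value
```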
Thanks for the detailed information. But how do we use different versions on the consumer side while deserialising?
Nice video ❤
Thanks, Basant
Hi Sir, I need to convert an input XML file to an Avro schema dynamically, instead of having it under the resources folder, and then build the Avro object and send it to a Kafka topic. Can you please help me with this?
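Not covered in the video, but one possible approach, sketched under assumptions about the XML shape (flat child elements, all treated as strings): parse the XML, derive a Schema at runtime with Avro's SchemaBuilder, and populate a GenericRecord that a producer configured with KafkaAvroSerializer can send. Real inputs would need type inference, nesting, and name sanitising:

```java
import javax.xml.parsers.DocumentBuilderFactory;

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;

public class XmlToAvroSketch {

    // Builds a flat, string-typed Avro record from the children of the XML root.
    public static GenericRecord convert(java.io.File xmlFile) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(xmlFile);
        Element root = doc.getDocumentElement();
        NodeList children = root.getChildNodes();

        // Derive the schema from the element names at runtime.
        SchemaBuilder.FieldAssembler<Schema> fields =
                SchemaBuilder.record(root.getTagName()).fields();
        for (int i = 0; i < children.getLength(); i++) {
            Node n = children.item(i);
            if (n.getNodeType() == Node.ELEMENT_NODE) {
                fields = fields.requiredString(n.getNodeName());
            }
        }
        Schema schema = fields.endRecord();

        // Fill the record with the element text values.
        GenericRecord record = new GenericData.Record(schema);
        for (int i = 0; i < children.getLength(); i++) {
            Node n = children.item(i);
            if (n.getNodeType() == Node.ELEMENT_NODE) {
                record.put(n.getNodeName(), n.getTextContent());
            }
        }
        return record; // send via a producer configured with KafkaAvroSerializer
    }
}
```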
Which data type can we use for a date in the schema? Please suggest.
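Avro's usual answer is a logical type layered on a primitive, which serializes compactly and still round-trips as a date. A sketch of the two common choices (field names are illustrative):

```jsonc
// date: days since the Unix epoch, carried as an int
{ "name": "dateOfBirth", "type": { "type": "int", "logicalType": "date" } },

// timestamp-millis: epoch milliseconds, carried as a long
{ "name": "createdAt", "type": { "type": "long", "logicalType": "timestamp-millis" } }
```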
Any help on this is appreciated, sir. I have subscribed to your channel now and would like to continue the membership; thank you very much for all the knowledge you are sharing with the community. How do I make it work with a build.gradle file, sir?
Hello Srinivas, thank you for following Javatechie. What's the problem you are facing with Gradle?
@@Javatechie Hello, can you please demonstrate how to set these dependencies using Gradle instead?
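Until there's a dedicated video, a build.gradle sketch that mirrors the Maven setup from the tutorial; the code-generation plugin here is the commonly used community one, and all versions are assumptions to check against your project:

```groovy
plugins {
    id 'java'
    id 'org.springframework.boot' version '3.2.0'
    id 'io.spring.dependency-management' version '1.1.4'
    // community plugin that generates Java classes from .avsc files
    // (it picks them up from src/main/avro by default)
    id 'com.github.davidmc24.gradle.plugin.avro' version '1.9.1'
}

repositories {
    mavenCentral()
    // Confluent artifacts (kafka-avro-serializer) are not on Maven Central
    maven { url 'https://packages.confluent.io/maven/' }
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-web'
    implementation 'org.springframework.kafka:spring-kafka'
    implementation 'org.apache.avro:avro:1.11.3'
    implementation 'io.confluent:kafka-avro-serializer:7.5.1'
}
```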
Awesome
Can you also cover Kafka Connect and Streams?
Hi Sir,
How can we achieve this kind of versioning in open-source Kafka?
Not possible, Supriya
I have a doubt: what if we are not sure what kind of schema is coming from my database, as my app might be handling different types of JSON data? In that case, do I need to define a schema for every JSON payload one by one, or is there another way?
A schema is nothing but the contract between producer and consumer, buddy. For example, to work as an IAS officer you must clear the UPSC exam; that's the rule, or you can say a contract set by the government. If you have multiple exams then you need to prepare for different subjects, just like schemas
Currently we are using it in our application.
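If genuinely different record types have to flow through the pipeline, each type still gets its own schema, but they don't all need one subject per topic. One option is Confluent's record-name subject strategy (the strategy class is Confluent's real one; the Spring pass-through property shown is an assumption about your setup):

```properties
# Register one subject per record type rather than per topic, so a single
# topic can carry Employee, Order, etc., each evolving independently.
spring.kafka.properties.value.subject.name.strategy=io.confluent.kafka.serializers.subject.TopicRecordNameStrategy
```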
1. In the Confluent dashboard we are able to see the sensitive data; how do we secure it? An interviewer asked me this question, please suggest. Also, is Confluent free or paid?
2. If the consumer is down while the producer is producing messages continuously and the hardware resources are exhausted, how can Kafka handle this situation?
Use a secure Kafka cluster, and yes, it's open source
@@Javatechie Thanks for the reply, Basant sir. Actually, content such as credit card numbers, social security numbers, etc. should be displayed in a masked manner; please suggest
Make sure the consumers are configured with appropriate heartbeat settings
From 1:00 on you explain that in the plain Kafka world we need to build a new producer and consumer app once a class, whose objects we send to Kafka, changes.
From 41:00 on you generate a new Employee class based on the changed schema and then rebuild producer and consumer as well. From my understanding, using schemas we still need to build a new producer and consumer app; so this is not an advantage of the whole schema overhead.
The only advantage I see is that we do not manually change classes.
Do I overlook something?
I'm stuck on the same thing, still don't get it...
No, your understanding is correct. It means we don't need to rely on data classes like POJOs/events; even if the producer modifies the payload structure, my consumer will not break
@@Javatechie I used everything the same as you, but when it came to evolving the schema and dropping some of the fields, I got a schema-change error 409. I am wondering why, because I used the same properties as you. Also, another thing: on my dashboard I am not seeing the messages even though the schemas and the topics are there. I am talking about port 9092/clusters/**, but 8081/subjects/** is showing
These are the errors: org.apache.kafka.common.errors.InvalidConfigurationException: Schema being registered is incompatible with an earlier schema; error code: 409
2024-04-23T09:29:46.988+03:00 ERROR 12236 --- [apache-schema-registry] [nio-8888-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed: org.springframework.kafka.KafkaException: Send failed] with root cause
org.apache.kafka.common.errors.InvalidConfigurationException: Schema being registered is incompatible with an earlier schema; error code: 409
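The 409 means the new schema failed the compatibility check against the latest version registered for that subject, so it's the registry doing its job rather than a misconfiguration. It's worth inspecting what's actually registered and which compatibility level the subject is on; for a disposable dev setup you can also relax the check entirely. A sketch, assuming the subject name employee-topic-value (also note that 9092 is the broker port, not a UI; the Control Center dashboard normally runs on 9021):

```bash
# What versions exist, and what does the latest registered schema look like?
curl http://localhost:8081/subjects/employee-topic-value/versions
curl http://localhost:8081/subjects/employee-topic-value/versions/latest

# Dev-only escape hatch: disable compatibility checking for this subject
curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
     --data '{"compatibility": "NONE"}' \
     http://localhost:8081/config/employee-topic-value
```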
Everything's working well, but my Control Center often shuts itself down after some time and I need to start it again from Docker Desktop. What could be the cause?
Can you guide me on how to do Kafka clustering in a Spring Boot application and how to handle it?
Hi sir, can you please create a video on employee biometric login using Spring Boot?
Can you please make a video on how to make a class immutable in Java?
I need the same in a Gradle build
Thanks a lot
14:44