Spring Boot | Kafka Schema Registry & Avro with Practical Example and Implementation

  • Published Nov 16, 2024

Comments • 85

  • @sureshnamala6776
    @sureshnamala6776 9 months ago +11

    Excellent tutorial on real-time scenarios. Avro serialization is one of the real-time scenarios used in projects and also an important topic in Kafka interviews. Thanks a lot Basant for the effort put in, much appreciated!!

  • @gopisambasivarao5282
    @gopisambasivarao5282 10 months ago +5

    Thanks Basant. Appreciate your efforts. God bless you.🙂🙏 Every week I wait for your videos. You are my Guru. 😊 Your way of explaining is awesome.

  • @kalaiselvankesavel2971
    @kalaiselvankesavel2971 10 months ago +2

    Thanks for making such a detailed video about schema registry!!

  • @ManishSingh-dj4yu
    @ManishSingh-dj4yu 10 months ago +1

    Thanks for this tutorial.. your contribution to the developer community is invaluable.

  • @user-0987-a
    @user-0987-a 10 months ago +1

    Surprisingly I had this kind of requirement using an Avro object, many thanks

    • @Javatechie
      @Javatechie 10 months ago

      Glad it helped!

  • @aadiraj6126
    @aadiraj6126 9 months ago +2

    Thank you Basant bhai.. Lucid explanation, loved it👍

  • @2RAJ21
    @2RAJ21 2 months ago +1

    Thank you.
    I learned more about the Kafka Docker Compose setup.

  • @rajyahoob
    @rajyahoob 4 months ago +1

    Nice explanation, and a lot of other things to learn as well, such as Docker. Thank you as always.

  • @858871
    @858871 4 months ago +1

    Bro your content is exceptionally great..

  • @suman8528
    @suman8528 1 month ago +1

    Great content, waiting for Kafka Connect and Kafka Streams

    • @Javatechie
      @Javatechie 1 month ago

      Kafka Connect I have struggled with like anything, but no results yet. Once I successfully run a source and sink POC, I will update you buddy 👍

  • @malleswarrao3887
    @malleswarrao3887 6 months ago +1

    Excellent sir, no words to describe your knowledge sharing

  • @manjunathbabu6609
    @manjunathbabu6609 6 months ago +3

    "Let's get started" has a separate fan base..

  • @ilronin804
    @ilronin804 5 months ago +2

    Wowww
    Awesome
    Thank you very much, so helpful!

  • @sgr7ss
    @sgr7ss 1 month ago +1

    Nice explanation

  • @mysavingclub
    @mysavingclub 10 months ago +2

    Nice video, you are putting in a lot of effort

  • @anushbabu5023
    @anushbabu5023 10 months ago +2

    Thanks, I was looking for this one. Please cover Kafka Connect also😊

    • @Javatechie
      @Javatechie 10 months ago

      Sure buddy

  • @avisulimanoff1231
    @avisulimanoff1231 9 months ago +1

    Thanks for your useful tutorials.
    A short question please: what is the difference between your first change to the Avro schema, when you sent Employee information without age/dob and it validated successfully, and your second change, when you added the middle name and validation failed? I didn't notice any difference in the process between the two changes...

    • @Javatechie
      @Javatechie 9 months ago

      Check the video, I explained the difference in schema evolution in the Control Center UI
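
The schema-evolution question in this thread comes down to the registry's compatibility rules. A minimal sketch (record and field names here are illustrative, not necessarily the exact ones from the video): under the registry's default BACKWARD compatibility mode, a new optional field with a default can be registered as a new version.

```json
{
  "type": "record",
  "name": "Employee",
  "namespace": "com.example",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "firstName", "type": "string"},
    {"name": "middleName", "type": ["null", "string"], "default": null}
  ]
}
```

Registering this version succeeds because old records that lack `middleName` can still be read (the default fills the gap). Declaring the same field as a plain required `"type": "string"` with no default makes the change backward-incompatible, and the registry rejects the registration (HTTP 409).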

  • @utbhargav
    @utbhargav 3 months ago

    Thank you so much. Why does the Avro-schema-generated DTO have @Deprecated on the attributes?

  • @ivanhomziak
    @ivanhomziak 3 days ago +1

    Thanks!

  • @Imjamalvali
    @Imjamalvali 10 months ago +1

    Thanks bro, please do more videos like this, please keep up your good work

  • @AAlphonse682
    @AAlphonse682 7 months ago

    Thank you for a wonderful tutorial

  • @LakshmiprasadKota
    @LakshmiprasadKota 6 months ago +1

    Nice tutorial, but if you demonstrated this concept by creating two different projects (producer and consumer) it would be clearer and perfect

    • @Javatechie
      @Javatechie 6 months ago

      Yes, I agree, but to reduce the length of the video I kept both in the same project. What's the challenge here in separating the consumer into a different project?

    • @mithileshchandra2072
      @mithileshchandra2072 5 months ago

      @@Javatechie I think we need to generate the bean classes using the Avro schema in both the producer and consumer applications and then run both apps. I thought that if we separated producer and consumer and changed the schema in the producer, it would be reflected in the consumer app without a rerun, but it seems that's not possible; we have to rerun both apps. Please correct me if I'm wrong?

    • @Javatechie
      @Javatechie 5 months ago

      You are correct, you only need to build your app again, that's all you need to do

  • @itsmeibrahimm
    @itsmeibrahimm 2 months ago

    Excellent tutorial

  • @pradeeptamishra9225
    @pradeeptamishra9225 1 month ago +1

    Sir, I have one question. If we put producer and consumer in separate projects, then we keep the .avsc file in the consumer project as well. Then any change in the payload structure on the producer side also needs to be made in the consumer-side Avro file, so in the end the consumer jar needs to be redeployed to the server. So what is the benefit of the schema registry? Please answer

    • @Javatechie
      @Javatechie 1 month ago

      Actually this file should be placed in some common or centralized place like a config server or S3

  • @Deepakblg97
    @Deepakblg97 10 months ago +1

    ❤❤ Thank you so much sir 🙏

  • @letsCherishCoding
    @letsCherishCoding 9 months ago +2

    Nice explanation. Just have one doubt: can we use this when we are storing employee info in a database like Cassandra, as the database schema is fixed there? And if we want to accommodate new fields we will have to add them manually.

    • @Javatechie
      @Javatechie 9 months ago

      But in the db schema you need to modify the field.

    • @letsCherishCoding
      @letsCherishCoding 9 months ago

      @@Javatechie yes, I just wanted to know a couple of real-life use cases of the Avro schema

    • @Javatechie
      @Javatechie 9 months ago

      @@letsCherishCoding the purpose of Avro is to avoid failures during schema evolution. Next, you can think of any e-commerce or food delivery app that frequently changes its payload

    • @letsCherishCoding
      @letsCherishCoding 9 months ago +1

      @@Javatechie got it. Thanks

  • @tusharrawal885
    @tusharrawal885 9 months ago

    Hello sir,
    I have already followed you for the last two years and you are awesome.
    Could you please help me with one interview question on threading:
    in your current project, where are you using threads, with an example?

  • @Jothianand-g1v
    @Jothianand-g1v 4 months ago +1

    Great tutorial !
    One question: do we need to add any configuration related to SSL when using the Confluent Cloud schema registry (https)? I'm getting the below error when my Spring Boot app tries to connect to the schema URL.
    SSLHandshakeException - PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

    • @Javatechie
      @Javatechie 4 months ago +1

      Yes, you need to add the keystore and truststore certificates in the producer and consumer configuration

    • @Jothianand-g1v
      @Jothianand-g1v 4 months ago

      @@Javatechie Thanks for the quick response
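
For the PKIX error discussed in this thread, the schema registry client has to trust the registry's certificate. A minimal application.yml sketch, assuming the Confluent serializer property names and hypothetical file paths and passwords; the keystore entries are only needed when the registry enforces mutual TLS:

```yaml
spring:
  kafka:
    properties:
      schema.registry.url: https://my-registry.example.com:8081
      # Truststore so the client can validate the registry's certificate chain
      schema.registry.ssl.truststore.location: /etc/kafka/secrets/client.truststore.jks
      schema.registry.ssl.truststore.password: changeit
      # Keystore only if the registry requires client (mutual) TLS
      schema.registry.ssl.keystore.location: /etc/kafka/secrets/client.keystore.jks
      schema.registry.ssl.keystore.password: changeit
```

Alternatively, importing the registry's CA certificate into the JVM's default truststore with `keytool -importcert` resolves the same error without extra application properties.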

  • @arunvijay2279
    @arunvijay2279 10 months ago +2

    If the consumer is in a different microservice, I need to generate the Avro class for all the consumers, right? In future, if we need to add a new field, first we need to add a default as empty, is my understanding correct?

    • @Javatechie
      @Javatechie 10 months ago +1

      Yes, that's correct, but if your consumer is a different project then you only need to get the updated schema from the producer, that's it

    • @saravanakumars52
      @saravanakumars52 10 months ago +1

      Yes, exactly, because whatever version of the Avro object the producer sends, the consumer is going to consume that object as per the version in the schema registry when deserializing

    • @santhoshg5342
      @santhoshg5342 6 months ago

      @@Javatechie if the producer sends a new schema, does the consumer need to regenerate the DTO files on its end before it consumes? Until the new changes are deployed to prod, does it use the old schema? How exactly does schema versioning work for the consumer without any break if the consumer is a separate external service?

  • @deekandau4596
    @deekandau4596 4 months ago +1

    This is awesome content, thanks. One question: how do you share the schema across multiple microservices?

    • @Javatechie
      @Javatechie 4 months ago

      It should be stored in the registry

    • @deekandau4596
      @deekandau4596 4 months ago

      @@Javatechie Sorry, I didn't get which registry you are talking about. Could you elaborate further? Can we store it in something like a config server? Just thinking..

  • @kanaillaurent526
    @kanaillaurent526 3 months ago

    Very helpful!

  • @arunabhtiwari4771
    @arunabhtiwari4771 5 months ago

    Hi Basant,
    what is the default compatibility for the Avro schema registry?

  • @vverma011
    @vverma011 4 months ago

    Could you please discuss more on forward and backward compatibility configurations?
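
As a stopgap for the request above: compatibility is configured per subject (or globally) in the Schema Registry itself and can be read or changed over its REST API. A sketch, assuming a local registry on port 8081 and an example subject name `employee-value`:

```sh
# Read the compatibility level for one subject
curl http://localhost:8081/config/employee-value

# Switch that subject to FORWARD compatibility
curl -X PUT \
     -H "Content-Type: application/vnd.schemaregistry.v1+json" \
     --data '{"compatibility": "FORWARD"}' \
     http://localhost:8081/config/employee-value
```

BACKWARD (new schema can read data written with the old schema) is the registry's default; FORWARD means old consumers can read data written with the new schema; FULL combines both.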

  • @suprajainamadugu7083
    @suprajainamadugu7083 8 months ago

    Thanks for the detailed information. But how do we use different versions on the consumer side while deserializing?

  • @dattatraybharde2902
    @dattatraybharde2902 10 months ago +1

    Nice video ❤

  • @armandoruizgonzalez
    @armandoruizgonzalez 9 months ago +2

    Thanks Basant

  • @rathinmaheswaran
    @rathinmaheswaran 7 months ago

    Hi sir, I need to convert an input XML file to an Avro schema dynamically instead of having it under the resources folder, and then build the Avro object and send it to a Kafka topic. Can you please help me with this?

  • @shivamkhare-p5z
    @shivamkhare-p5z 7 months ago

    Which data type can we use for a date in the schema? Please suggest.
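
On the date-type question above: Avro has no primitive date type; the usual approach is a logical type layered on an int or long. A sketch of two field definitions (the field names are illustrative):

```json
{"name": "dateOfBirth", "type": {"type": "int",  "logicalType": "date"}},
{"name": "createdAt",   "type": {"type": "long", "logicalType": "timestamp-millis"}}
```

`date` stores the number of days since the Unix epoch in an int, and `timestamp-millis` stores milliseconds since the epoch in a long; with logical-type conversions enabled, recent Avro code generators map these to `java.time.LocalDate` and `java.time.Instant` in the generated classes.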

  • @srinivastadimalla1232
    @srinivastadimalla1232 3 months ago +1

    Any help on this is appreciated, sir. I have subscribed to your channel now and would like to continue the membership; thank you very much for all the knowledge you are sharing with the community. How do I make it work with a build.gradle file, sir?

    • @Javatechie
      @Javatechie 2 months ago

      Hello Srinivas, thank you for following Javatechie. What is the problem you are facing with Gradle?

    • @CanalGeekDev
      @CanalGeekDev 1 month ago

      @@Javatechie hello, can you please demonstrate how to set these dependencies using Gradle instead?
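
For the Gradle requests in this thread: the video's avro-maven-plugin step has a community Gradle equivalent. A build.gradle sketch, assuming the davidmc24 Avro plugin; the version numbers are illustrative and should be checked against your setup:

```groovy
plugins {
    id 'java'
    id 'org.springframework.boot' version '3.2.5'
    id 'io.spring.dependency-management' version '1.1.4'
    id 'com.github.davidmc24.gradle.plugin.avro' version '1.9.1'
}

repositories {
    mavenCentral()
    // kafka-avro-serializer is published to Confluent's repository, not Maven Central
    maven { url 'https://packages.confluent.io/maven/' }
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-web'
    implementation 'org.springframework.kafka:spring-kafka'
    implementation 'org.apache.avro:avro:1.11.3'
    implementation 'io.confluent:kafka-avro-serializer:7.5.1'
}
```

The plugin picks up `*.avsc` files from `src/main/avro` by default and generates the Java classes during `./gradlew build`, mirroring what the Maven plugin does in the video.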

  • @mohammadturabali3870
    @mohammadturabali3870 5 months ago +1

    Awesome

  • @aditvikramsingh1929
    @aditvikramsingh1929 9 months ago +1

    Can you also cover Kafka Connect and Streams?

  • @supriyakavuri1988
    @supriyakavuri1988 7 months ago +1

    Hi Sir,
    How can we achieve this kind of versioning in open-source Kafka?

    • @Javatechie
      @Javatechie 7 months ago

      Not possible, Supriya

  • @riyakumari8377
    @riyakumari8377 8 months ago +1

    I have a doubt: what if we are not sure what kind of schema is coming from my database, as my app might be handling different types of JSON data? In that case, do I need to define a schema for every JSON payload one by one, or is there another way?

    • @Javatechie
      @Javatechie 8 months ago

      A schema is nothing but the contract between producer and consumer, buddy. For example, to work as an IAS officer you must clear the UPSC exam; that's the rule, or you could say the contract, set by the government. If you have multiple exams then you need to prepare for different subjects, just like schemas

  • @talhaansari5763
    @talhaansari5763 5 months ago

    Currently we are using it in our application.

  • @sudheerkumar-tp1mg
    @sudheerkumar-tp1mg 9 months ago

    1. In the Confluent dashboard we are able to see sensitive data; how do we secure it? An interviewer asked me this question, please suggest. Also, is Confluent free or paid?
    2. If the consumer is down while the producer keeps producing messages and the hardware resources are exhausted, how can Kafka handle this situation?

    • @Javatechie
      @Javatechie 9 months ago

      Use a secure Kafka cluster, and yes, it's open source

    • @sudheerkumar-tp1mg
      @sudheerkumar-tp1mg 9 months ago

      @@Javatechie Thanks for the reply Basant sir. Actually, content such as credit card numbers, social security numbers etc. should be displayed in a masked manner, please suggest

    • @GMSGAANAMANI
      @GMSGAANAMANI 8 months ago

      Consumers are configured with appropriate heartbeat settings

  • @amsfuy
    @amsfuy 8 months ago +1

    From 1:00 on you explain that in the plain Kafka world we need to build new producer and consumer apps once a class whose objects we send to Kafka changes.
    From 41:00 on you generate a new Employee class based on the changed schema and then rebuild producer and consumer as well. From my understanding, using schemas we still need to build new producer and consumer apps, so this is not an advantage of the whole schema overhead.
    The only advantage I see is that we do not manually change classes.
    Do I overlook something?

    • @РоманПивоваров-ф7ш
      @РоманПивоваров-ф7ш 7 months ago

      I'm stuck on the same thing, still don't get it..

    • @Javatechie
      @Javatechie 7 months ago

      No, your understanding is correct. It means we don't need to rely on data classes like POJOs/events; even if the producer modifies the payload structure, my consumer will not break

    • @kevinameda2711
      @kevinameda2711 6 months ago

      @@Javatechie I used everything the same as you, but when it came to evolving the schema and dropping some of the fields, I got a schema-change error 409. I am wondering why, because I used the same properties as you. Also, another thing: I am not seeing the messages in the dashboard even though the schemas and the topics are there. I am talking about port 9092/clusters/** but 8081/subjects/** is showing

    • @kevinameda2711
      @kevinameda2711 6 months ago

      These are the errors: org.apache.kafka.common.errors.InvalidConfigurationException: Schema being registered is incompatible with an earlier schema; error code: 409
      2024-04-23T09:29:46.988+03:00 ERROR 12236 --- [apache-schema-registry] [nio-8888-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed: org.springframework.kafka.KafkaException: Send failed] with root cause
      org.apache.kafka.common.errors.InvalidConfigurationException: Schema being registered is incompatible with an earlier schema; error code: 409

  • @SurajKumar-l8d8y
    @SurajKumar-l8d8y 6 months ago

    Everything's working well, but my Control Center often shuts down by itself after some time and I need to start it again from Docker Desktop. What could be the cause?

  • @ramanav3139
    @ramanav3139 8 months ago

    Can you guide me on how to do Kafka clustering in a Spring Boot application and how to handle it?

  • @gauravjaiswal9808
    @gauravjaiswal9808 6 months ago

    Hi sir, can you please create a video on employee biometric login using Spring Boot

  • @attrayadas8067
    @attrayadas8067 9 months ago

    Can you please make a video on how to make a class immutable in Java?

  • @2000riddick
    @2000riddick 3 months ago

    I need the same in a Gradle build

  • @el_yisusT
    @el_yisusT 10 months ago

    Thanks a lot

  • @GR-gk8dh
    @GR-gk8dh 9 months ago

    14:44
