I just finished your entire playlist on Kafka. Amazing content, thank you for helping me!
The Kafka series is well structured with great theoretical explanation. Superb!!! Keep it up!!
For the first time I completed a tutorial. Thank you so much for this amazing content. Now I have a good idea of Kafka.
Amazing content. I have followed you since I started my career. Thanks for your support.
Amazing series, Please continue this series with Kafka Streams.
Yes, I will continue from the coming weekend.
Thank you so much, I learned Apache Kafka within a few hours.
As the course progresses we see fewer and fewer views 😀😀😀. But seriously, this course is good; for those of you who complete it, it helps a lot.
Agree
Amazing❤
Big follower of your content brother😊
Dear Sir,
Namaskaram, and first of all thank you for teaching us so well. I have been learning a lot from you.
The way you said "IT IS CRYING" while changing the String message to the Customer object at 5:32, I laughed HARD, man. Thank you so much, sir. It's a little thing, but I laughed a lot, clapping for 5 minutes. THANK YOU AGAIN, SIR.
Hello Vivek, thank you so much for your kind words 😊 and I am so happy to hear that you enjoy the content 😀. Keep learning, buddy!
Thanks so much Basant, appreciate your efforts!❤
as usual -- awesome.. Thanks Basant
Yeah great content, keep up the good work!!!👍
Can you please make a video on how to use a common POJO/DTO class instead of creating the same Customer class in both the producer and consumer projects, as you mentioned at 5:33?
Can you please explain why you create the producer factory and KafkaTemplate beans at 19:30, and how both of them work in the configuration? I was able to understand the KafkaTemplate and Kafka listener when you created the bean in the service class in the previous section of the Kafka series.
Amazing content!! Thank you so much ❤
Good Content 👍
Hello sir!! What approach can we use so that consumers do not connect to Kafka directly but go through some API instead? Also, if the producer wants to move to a new message broker, how can the consumer remain unaffected by that change on the producer side?
No, if there is a configuration change on the producer side then the consumers need those changes too. We can use some centralized configuration approach to manage it and avoid manual effort.
Thanks for your efforts!
Hi, could you please explain how to handle deserialization errors? For example, if the producer publishes a Message class but the consumer is expecting a User class.
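One common way to handle this (a sketch, not from the video — the class and package names such as com.example.consumer.dto.User are placeholders) is Spring Kafka's ErrorHandlingDeserializer, which wraps the real deserializers so a record that cannot be deserialized is handed to the container's error handler instead of crashing the listener loop:

```yaml
spring:
  kafka:
    consumer:
      key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      properties:
        # Delegates that do the actual deserialization work
        spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
        spring.deserializer.value.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
        # The type this consumer expects; records that cannot be mapped fail into the error handler
        spring.json.value.default.type: com.example.consumer.dto.User
        spring.json.trusted.packages: com.example.consumer.dto
```

With this in place a DefaultErrorHandler (optionally combined with a DeadLetterPublishingRecoverer) can skip or dead-letter the poison record rather than retrying it forever.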
Another great video! Can you cover authentication and authorization in Kafka?
Thank you
So it's Crying... got me 😂😂
Hello sir, you are really doing a good job. I am watching your videos and they are really amazing.
Can you please make a video on Kafka interview questions?
Yes buddy, I will.
Thank you so much, sir. Can you please upload the second part of the Spring Boot annotations video?
Yes buddy, I will, but I need a bit more time to prepare the PPT.
Thank you sir
Thanks 🙏❤
Great thanks
Great job, on to the next topic. Can you cover Kafka Streams or ksqlDB? Thanks for everything.
I will definitely do that.
I have one doubt: suppose I'm using 10 consumers in a consumer group; every consumer needs to implement the producer's data-type class (DTO) for deserialization. Is there a better way to do that? The code is repeated in each service. What I mean is: if I produce order data to a topic on order creation, then every consumer service (invoice, email, inventory) must define the same Order DTO class to deserialize it. Is there a better way? Can we use GraphQL?
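One option (a sketch, assuming Spring Kafka's JSON type-mapping support; all package names below are placeholders) is to stop sharing the producer's class entirely: both sides map a logical type name to their own local class, so each consumer service keeps its own slim DTO containing only the fields it actually uses:

```yaml
# Producer side: publish the Order payload under the logical type name "order"
spring:
  kafka:
    producer:
      properties:
        spring.json.type.mapping: order:com.example.orderservice.dto.Order

---
# Consumer side (e.g. the invoice service): map "order" onto its own local DTO,
# so no producer class needs to be copied into this project
spring:
  kafka:
    consumer:
      properties:
        spring.json.type.mapping: order:com.example.invoiceservice.dto.Order
        spring.json.trusted.packages: com.example.invoiceservice.dto
```

Another common approach is publishing a shared contract (e.g. an Avro or JSON schema in a schema registry) rather than a shared Java class. GraphQL doesn't really fit here, since it addresses API querying, not message deserialization.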
Sir, have you changed this playlist? Before it was 21 videos, but now only 13.
No, it's 13 only. I haven't uploaded 21 videos. Please check again.
Great, please continue.
How do we fetch 1,000,000,000 records from one DB and save them to another DB using Kafka?
Bro, why didn't we use the KafkaTemplate in choreography?
I used Spring WebFlux there, so a sink acts as the consumer. If you implement the same thing with the traditional approach, you can go with the KafkaTemplate.
Got it thanks👍
On the consumer side, if I use a Java config class instead of the yml file, with the same configuration and classes as in the yml (the server, StringDeserializer, and JsonDeserializer), I get an org.springframework.messaging.converter.MessageConversionException saying: cannot convert from java.lang.String to com.example.dto.Customer. This would occur when serialization and deserialization do not match, or maybe it's something else. What bugs me is that with the yml file everything runs smoothly, but when I switch to a config class the consumer application cannot convert the message to the object type. Is there any solution?
Please share your GitHub link
I have com.producer.dto.Customer for Producer application.
And com.consumer.dto.Customer for Consumer application.
Since these fully qualified names do not match, the process does not work.
And it is not possible to specify in the .yml configuration that the incoming com.producer.dto.Customer should be mapped to com.consumer.dto.Customer.
So I have to rewrite one of the applications completely to match the package structure, or create a custom JSON deserializer.
Could you please send the rewritten code here?
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Deserializer;

// Custom deserializer so the consumer maps the payload to its own Customer class,
// regardless of the package name the producer used
public class CustomCustomerDeserializer implements Deserializer<Customer> {

    private final ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public Customer deserialize(String topic, byte[] data) {
        try {
            // Ignore any type headers; always map the JSON bytes to the local Customer class
            return objectMapper.readValue(data, Customer.class);
        } catch (Exception e) {
            throw new RuntimeException("Failed to deserialize Customer", e);
        }
    }
}
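To actually use such a custom deserializer, it still has to be wired in on the consumer side (a sketch; com.consumer.config is a placeholder package for the deserializer class shown above):

```yaml
spring:
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      # Point the value deserializer at the custom class instead of Spring's JsonDeserializer
      value-deserializer: com.consumer.config.CustomCustomerDeserializer
```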
The explanation is not clear about why we move everything from the properties file into a class, and you seem in such a hurry to make the video that you are just copy-pasting code you have done somewhere else; that same code is also not working in my IDE. A little more patience is required to give a clear explanation and make the video. Not up to the mark.
Thank you 😊 for your feedback, buddy. Your comment suggests you haven't gone through the previous video; please check it once and then let me know if you still have the same issue.
Sorry to bother you too much. I am using the same deserializer, and I am getting an exception that it's not a trusted source even after configuring it:
properties:
  spring:
    json:
      trusted:
        packages: com.example.kafka.consumer.dto
It keeps giving me this error:
Caused by: org.apache.kafka.common.errors.RecordDeserializationException: Error deserializing key/value for partition TestTopic-1 at offset 0. If needed, please seek past the record to continue consumption.
at org.apache.kafka.clients.consumer.internals.CompletedFetch.parseRecord(CompletedFetch.java:309)
at org.apache.kafka.clients.consumer.internals.CompletedFetch.fetchRecords(CompletedFetch.java:263)
at org.apache.kafka.clients.consumer.internals.AbstractFetch.fetchRecords(AbstractFetch.java:340)
at org.apache.kafka.clients.consumer.internals.AbstractFetch.collectFetch(AbstractFetch.java:306)
Caused by: java.lang.IllegalArgumentException: The class 'com.example.kafka.producer.dto.Customer' is not in the trusted packages: [java.util, java.lang, -java.util-java.lang-com.example.kafka.consumer.dto.*]. If you believe this class is safe to deserialize, please provide its name. If the serialization is only done by a trusted source, you can also enable trust all (*).
Posted on Stack Overflow:
stackoverflow.com/questions/77804089/unable-to-deserialize-events-from-spring-boot-consumer-getting-an-error-even-aft
gitlab.com/kishore87jetty/kafkaproducer
gitlab.com/kishore87jetty/kafkaconsumer
Could you please share your code in GitHub? I will take a look and figure it out
@@Javatechie Added details in my first comment itself
There is one more branch, ProducerConfig and ConsumerConfig, in the respective repositories. The config ones are also throwing some errors.
Error: Can't serialize data [com.spring.kafkaproducer.example.dto.Customer@4ad91b0e] for topic [demoCust]
I am getting this error, please help. Below is the config:
spring:
  kafka:
    producer:
      bootstrap-servers: localhost:9092
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
Please configure the trusted package.
But the producer is throwing the error, though I added the trusted package in the consumer yml.
@@prajulakottai1338 you need to add it in both producer and consumer, buddy.
@@Javatechie still not working, please help.
org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'
Caused by: java.lang.IllegalStateException: java.lang.ClassNotFoundException: com.spring.kafkaproducer.example.dto.Customer
Below is the config:
producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
producerProps.put(JsonDeserializer.TRUSTED_PACKAGES, "com.spring.kafkaproducer.example.dto");

consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "group4");
consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
consumerProps.put(JsonDeserializer.TRUSTED_PACKAGES, "com.spring.kafkaproducer.example.dto");
consumerProps.put(JsonDeserializer.USE_TYPE_INFO_HEADERS, "false");
consumerProps.put(JsonDeserializer.VALUE_DEFAULT_TYPE, "com.spring.kafkaproducer.example.dto.Customer");
consumerProps.put(JsonDeserializer.TYPE_MAPPINGS, "customer:com.spring.kafkaproducer.example.dto.Customer");
@@prajularao I faced the same issue but solved it defining in consumer application the Customer object with the same package as defined in producer's application.
How do we handle this, sir?
KafkaMessageListenerContainer : Consumer exception
java.lang.IllegalStateException: This error handler cannot process 'SerializationException's directly; please consider configuring an 'ErrorHandlingDeserializer' in the value and/or key deserializer
at org.springframework.kafka.listener.DefaultErrorHandler.handleOtherException(DefaultErrorHandler.java:198)
Can you share your git link. I'll try to see if I can help.
@@janyajoshi rajeshoo7/Kafka-project/tree/master
did you find the solution ?
I found the solution. It's because of type mapping. JavaTechie doesn't hit it because his producer DTO and consumer DTO are in the same package. You should provide it like this:
Producer (mention your producer DTO's package and class name as the type mapping):
properties:
  spring:
    json:
      type:
        mapping: customer:com.gnanayakkara..dto.Customer
Consumer (mention where the mapped class is available on the consumer side):
properties:
  spring:
    json:
      trusted:
        packages: com.gnanayakkara.kafkaproducer.dto
      type:
        mapping: customer:com.gnanayakkara..dto.Customer
Awesome, and I appreciate your findings, but just keep in mind this is not the recommended way to send raw objects as a Kafka message. I will upload a video on it.
Hint: Kafka gives us the flexibility to deal with records of any data type.
Thank you