Kafka Streams using Spring Cloud Stream | Microservices Example | Tech Primers
- Published on Oct 13, 2024
- This video covers how to leverage Kafka Streams using Spring Cloud Stream by creating multiple Spring Boot microservices.
📌 Related Links
=============
🔗 Github code: github.com/Tec...
🔗 Kafka setup: docs.confluent...
🔗 Public Domain API: domainsdb.info/
📌 Related Videos
===============
🔗 Spring Boot with Spring Kafka Producer example - • Spring Boot with Sprin...
🔗 Spring Boot with Spring Kafka Consumer example - • Spring Boot with Sprin...
📌 Related Playlist
================
🔗Spring Boot Primer - • Spring Boot Primer
🔗Spring Cloud Primer - • Spring Cloud Primer
🔗Spring Microservices Primer - • Spring Microservices P...
🔗Spring JPA Primer - • Spring JPA Primer
🔗Java 8 Streams - • Java 8 Streams
🔗Spring Security Primer - • Spring Security Primer
💥 Join TechPrimers Slack Community: bit.ly/JoinTec...
💥 Telegram: t.me/TechPrimers
💥 TechPrimer HindSight (Blog): / techprimers
💥 Website: techprimers.com
💥 Slack Community: techprimers.sl...
💥 Twitter: / techprimers
💥 Facebook: TechPrimers
💥 GitHub: github.com/Tec... or techprimers.gi...
🎬Video Editing: FCP
---------------------------------------------------------------
🔥 Disclaimer/Policy:
The content/views/opinions posted here are solely mine and the code samples created by me are open sourced.
You are free to use the code samples on GitHub after forking, and you can modify them for your own use.
All the videos posted here are copyrighted. You may not redistribute videos from this channel on other channels or platforms.
#KafkaStreams #SpringCloudStream #TechPrimers
nice explanation. Thank you
Great video, explained both ways: the vanilla Kafka way as well as Kafka Streams using the Spring Cloud binder.
Cool. I like how you explain and go directly to what is needed to make things work. Not too many speeches, because we can find that in the documentation ourselves.
Cheers bud #TimeIsPrecious
Just what I was looking for!!! Thank you for posting this helpful content!
Glad it was helpful!
tremendous effort to make this stuff so good and understandable
Glad you found it useful
Wow awesome video! Learned a lot, Kafka Streams API is very powerful
Great tutorial! Good to see in practice the difference between Kafka and Kafka Stream!
Never seen this explained so clearly. Great job!
Glad it was helpful
Ajay!! You are simply incredible... Thanks a ton for sharing such a valuable Kafka usage pattern. You rock!!
Cheers Ankit. Glad it's helpful
Nicely explained, thanks for such a great tutorial!
Excellent Kafka tutorial thanks
Damn! you really coded it all up under 30 mins. Sr. Dev spotted I guess XD
man this is the best I have seen on Kafka Streams.
Very Nice tutorial for Kafka Streams
Really liked it. Great efforts made for clear understanding.
Glad you liked it
Hi Ajay..I am a regular follower of your tutorial.. Could you please make a complete video on project reactor (Spring webflux) covering every concept? If already available, kindly provide me the link.
Another concise and helpful AV. Appreciate it, buddy.
Great session👍nice use case with kafka stream processing with spring
Very 🔥 nice video clearly explained waiting for more such video and thanks for amazing content 🙏
Thank you for such an easy explanation 🙏
Glad it was helpful!
Great educational video bro. It will be great if you can add Kafka Connect and ksql db too.
Great tutorial thankyou 🧿
Great , this is super awesome and v complete and realistic one
Thank you so much 😀
thanks a lot. it is very helpful for kafka beginner
Good one buddy.. please add error/exception handling
Best educational video! But what if there is a deserialization error? How to pre-log a message that caused the application to crash?
You can add a Spring interceptor to log those.
Clean 💥👏👏. Can you provide a list of the Mac shortcuts you used in IntelliJ? I couldn't get them from the bottom hints.
Thank you Ajay. This is very informative and makes it easy to compare Kafka and Kafka Streams. Can you please make a video on Kafka Connect with other messaging systems like ActiveMQ and with other data sources like Oracle?
Hey, thanks for the explanation, I really learnt something today :). I have a question: what if I want to read from two Kafka topics and send to two Kafka topics? In that case, what will my application.yaml look like? Thanks in advance.
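For reference, a hedged sketch of how such a multi-binding application.yml could look with Spring Cloud Stream's functional binding convention; the function and topic names below are hypothetical:

```yaml
spring:
  cloud:
    function:
      # Two independent processors, separated by ';'
      definition: processOrders;processPayments
    stream:
      bindings:
        # Convention: <functionName>-in-<index> / <functionName>-out-<index>
        processOrders-in-0:
          destination: orders-input
        processOrders-out-0:
          destination: orders-output
        processPayments-in-0:
          destination: payments-input
        processPayments-out-0:
          destination: payments-output
```

Each `Function` bean then gets its own input and output topic pair.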
Great tutorial. Thanks!
Glad it was helpful!
Much needed .. Stay safe Ajay .. thank you for the awesome work
You too stay Safe Sai. Thank you👍
Thank you for the tutorials.
What is the difference between Spring Kafka, Spring Cloud Stream, and Spring Cloud Stream Kafka?
When should we go for Spring Kafka and when for Kafka Streams?
Awesome explanation!! Very clear
Glad it was helpful!
Great flow of explanation. Loved it ❤️
Great video. In extension to this can we get error handling, retry, rollback, transaction. It would be really helpful 🙏🙏🙏
@ Hi, did you get any chance to check error handling, retry, rollback etc.?
@ Ravi Gupta
No
Thank you!!! Very useful tutorial!!
Great stuff sir. Excellent video.
great video, as always. One question: if we were to replace the Kafka producer (service 1) with Kafka cloud streams (just like the other 2 services), should we have to create a method that returns a Supplier?
Please confirm :)
Nope. Supplier takes KStreams as input; it won't work for the producer. Kafka Streams doesn't come into that microservice since there is no Kafka Streams usage on the producer side.
@@TechPrimers thanks for your reply. Does that mean Kafka Streams can be used only when we want to consume data? And is that why we are not able to use it in the producer service?
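To illustrate the distinction this thread is about, here is a plain-JDK sketch of the three functional shapes Spring Cloud Stream binds: a Supplier emits without any input (producer side), a Function transforms one stream into another (processor side), and a Consumer only receives (sink side). The class name and values are illustrative only:

```java
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Supplier;

public class FunctionalBindings {
    // Producer side: a Supplier has no input, it only emits records.
    static Supplier<String> producer = () -> "google.com";

    // Processor side: a Function takes a stream in and emits a stream out.
    static Function<String, String> processor = String::toUpperCase;

    // Sink side: a Consumer only receives and emits nothing.
    static Consumer<String> sink = domain -> System.out.println("persisted " + domain);

    static String pipeline() {
        String processed = processor.apply(producer.get());
        sink.accept(processed);
        return processed;
    }

    public static void main(String[] args) {
        System.out.println(pipeline()); // prints GOOGLE.COM
    }
}
```

This is why a `Function<KStream<...>, KStream<...>>` bean fits the processor service but not the producer: the producer has no incoming stream to bind to.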
This was good. Although I am confused. Why would we use Kafka Streams for consuming a message? Should we not be using a normal consumer? Kafka Streams would be used only when we are consuming from and producing to a Kafka topic. Particularly useful when we need Exactly Once.
Hello, great tutorial. One question: did we define consumer groups? How is it handled in KSP?
Very useful video! Thank you for your amazing work.
Glad it was helpful!
Hi Ajay, would you mind explaining why you are using the spring kafka + spring cloud stream combination instead of the reactor kafka?
It's just for the ease of use. If we are using Spring Boot, the first obvious option is to try Spring-native libraries due to jar dependencies and their version maintenance.
Thank you very much
thank you! But I have a curious question: can you explain what you did at 8:30?
Can you do a video on NATS as a complete end-to-end message broker?
Very nice explanation. What if we have many autowired classes? How can we replace them with constructor injection? Do we need to keep adding those classes to the class constructor? If yes, it will become too long when more classes are autowired.
If there are too many, that means the class needs to be broken down.
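A minimal sketch of constructor injection with plain Java; the collaborator interfaces here are hypothetical stand-ins for real Spring beans. With a single constructor, Spring (4.3+) injects the dependencies without `@Autowired`, and final fields keep them explicit:

```java
import java.util.ArrayList;
import java.util.List;

public class DomainProcessor {
    // Hypothetical collaborators standing in for real Spring beans.
    interface DomainValidator { boolean isActive(String domain); }
    interface DomainRepository { void save(String domain); }

    private final DomainValidator validator;
    private final DomainRepository repository;

    // Single-constructor injection: no @Autowired needed on the only
    // constructor, and the dependency list doubles as documentation.
    public DomainProcessor(DomainValidator validator, DomainRepository repository) {
        this.validator = validator;
        this.repository = repository;
    }

    public void process(String domain) {
        if (validator.isActive(domain)) {
            repository.save(domain);
        }
    }

    public static void main(String[] args) {
        List<String> saved = new ArrayList<>();
        DomainProcessor p = new DomainProcessor(d -> d.endsWith(".com"), saved::add);
        p.process("google.com");
        p.process("example.org");
        System.out.println(saved); // [google.com]
    }
}
```

If the constructor grows past a handful of parameters, that is usually the class taking on too many responsibilities, and splitting it is the fix rather than reverting to field injection.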
Thanks a lot 😊👍
awesome!!,
great tutorial from a great teacher
What's the tool that you used for creating arch visualization? It looks clean and nice.
Google slides
@@TechPrimers Thanks, Can you share the google slide link (or download link) for this architecture diagram.
Thank you very much!
Very useful demo
Nicely explained.
Any example reference for a Spring Cloud Stream Kafka binder producer which sends a stream to a topic? As we don't have a producer in the Kafka Streams binder, I need a producer from the Kafka binder to send data as key-value pairs.
Great job.. Is there any video or source code link for the same Kafka cloud stream consumer class with retry and a dead letter queue? If so, can you please share it with me?
Hi Ajay, very informative. As you already showed us Kinesis, can you try an example using Google Cloud Dataflow?
Sure will try soon sanjay
Awsome content and great effort. Thank you so much Ajay. ♥️
Thank you aswartha. Glad you liked it
Thank you very much Ajay. I am a big fan of your content, and this is one topic I have been looking for. Is it possible to explain why we should use Kafka Streams when we can use traditional Kafka consumers, aka Kafka listeners? Any advantage?
Like any other framework/library, it's the ease of use for transforming data once it's consumed. KStreams provides a rich SDK which makes the transformations much simpler and more readable.
Hi, can you explain how you would handle exceptions, as the default behavior is FAIL if there is an exception while processing?
Is it possible to do this in a single project, instead of creating distinct projects for each case? Thank you for the video.
If you do this in a single project why do we need queues for interaction then?
If you are referring to source code, that's up to you. You can have it in a single repo.
Hi Ajay can you make some videos examples like this one on hexagonal pattern.
Sure Himanshu
How to do error handling for publishing or consuming with KStreams?
Great question Palani. I will try making a video with examples
@@TechPrimers Thank you.. looking forward to your video! It would help me to implement it... currently I'm not using streams since I don't know how to handle errors.
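For the deserialization side of error handling, the Kafka Streams binder exposes a binder-level property for choosing what happens on a bad record; a minimal sketch (property name as documented for recent Spring Cloud Stream versions, so verify against the version you use):

```yaml
spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            # logAndContinue skips the poison record and keeps processing;
            # alternatives are logAndFail (the default) and sendToDlq.
            deserialization-exception-handler: logAndContinue
```

Errors thrown inside your own processing logic are a separate concern and still need handling in the application code.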
how about a KTable example and a web controller serving requests from the KTable directly? :)
Hi Ajay, how can we send the inactive domains to an inactive topic? Do we need to use StreamBridge?
Does kafka streams pull one message at a time from the topic?
awesome
excellent .. :)
Hi... Just having a small doubt..
In my project scenario, for example, there is a café where purchase transactions take place, and any transactions are immediately published using Kafka. Now suppose there are multiple cafés in different regions. In that case, how many topics should I use, and how should I split the data based on regions? How do I design my Kafka architecture using Spring Boot? Kindly provide your answer.
If you are segregating by regions, i.e. you have decided that data from one café region cannot be shared with another region, then you will need separate Kafka topics and processes for consuming and persisting them.
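One hedged way to express that per-region split in Spring Cloud Stream, with one input binding per regional topic; all function and topic names here are hypothetical:

```yaml
spring:
  cloud:
    function:
      definition: processUsTransactions;processEuTransactions
    stream:
      bindings:
        processUsTransactions-in-0:
          destination: cafe-transactions-us
        processEuTransactions-in-0:
          destination: cafe-transactions-eu
```

If regions may share data, a single topic partitioned by a café or region key is the usual alternative.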
Awesome Thanks
SUPER & Thanks...
How can we trace the messages in this Kafka architecture using Spring Cloud Stream? I want to store every message in my own database, i.e. I do not want to use Zipkin or Jaeger.
Please create a video addressing Kafka Streams with Sleuth integration, like a standalone application which can pull the data and insert it into a DB.
Hi Ajay, Could you make a video on Kafka Connect and ksql db??
Sure Sanjay. Will do
@@TechPrimers Thanks Ajay for making a video on Kafka Connect. It would be great if you could explain how to create custom connectors with a hands-on example, and how to create custom transformers and use them in Kafka Connect.
Nice one thank you. But I think you hacked the Get method to perform some operation!
Yes subbu. POCs are always hacked 🤓
IntelliJ in action after a long period :)
Yes kunal. 😁
public domain API is not up
This is feedback, bro: from next time, before showing any demo related to Spring, please show its version, because sometimes when we try it, it's difficult to create the same project at our end.
Many thanks :)
Hey bro, I have added cacheable static assets with Spring MVC, but now I want to give the user a button by which he can clear the website's cache.
Get the CacheManager instance, get your cache from it, and call clear(). Take reference here: stackoverflow.com/questions/56458748/reload-refresh-cache-in-spring-boot
Are you sure it'll work with MVC?
Because here I'm talking about the browser's caching of static resources.
Oh, you mean the images and stuff. That will not work with the above solution. You need to clean up the storage on the client side via React/Angular, whichever you use.
Kafka Streams cannot deserialize if there are two processors active. How to fix this?
Also my other question: in the crawler you said you could use KStreams, but how can KStreams read data from anything other than a topic (KStreams reads data to and from topics)?
Please provide an example if KStreams can read from a different source.
Correct me if my questions are wrong ❌
Nope. The crawler doesn't read from KStreams; it can't use KStreams. KStreams is a Kafka-dependent library from Kafka. Only the processor and service use KStreams.
So the value that you get after processing the data could be saved in a DB, right?
Yes you are right
However, with plain KStreams APIs saving to a DB is not possible, and Spring Kafka Streams is the solution. Am I right?
Sir, one video on the Play Framework for Java please.
I keep getting the following on the processor:
The class '[B' is not in the trusted packages:
Any ideas? The consumer config also never seems to pick up the deserializer specified in the config (JsonDeserializer), so I think it is because of this?
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
Yes. Serialization is the key, Benny. I had faced similar issues while trying it out before recording the video. You have to serialize and deserialize with the same library.
@@TechPrimers I believe I am? I am using the same serializer and deserializer that you specify in your application.yml files. It is just not picking them up, so it falls back on the ByteArrayDeserializer?
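For the "'[B' is not in the trusted packages" error above, one common spring-kafka configuration is to set the JSON (de)serializers explicitly and whitelist the payload package; the package name below is a placeholder for your own model classes:

```yaml
spring:
  kafka:
    producer:
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
    consumer:
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        # Replace with the actual package of your payload classes
        spring.json.trusted.packages: com.example.model
```

The `'[B'` in the error is the JVM name for `byte[]`, which is usually a sign the message was produced with a byte-array serializer while the consumer expected JSON, matching the reply above about using the same library on both sides.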
How do you save the data on the consumer side to a DB? In KStreams it's not possible, right? KStreams is all about topic to topic.
If so can you tell us the way to save data to db?
You have to leverage Spring JPA or the traditional way of storing to a DB. But you can hook those methods into the KStream consumer like any other class implementation.
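A minimal plain-JDK sketch of that idea, with an in-memory list standing in for a real Spring Data repository; in the actual consumer bean, the same method reference would be passed to something like `kStream.foreach((key, value) -> save(value))`. All names here are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;
import java.util.stream.Stream;

public class StreamToDb {
    // Stand-in for a Spring Data repository's save method.
    static List<String> database = new ArrayList<>();
    static void save(String record) { database.add(record); }

    public static void main(String[] args) {
        // In Kafka Streams this sink would be wired as:
        //   kStream.foreach((key, value) -> save(value));
        Consumer<String> sink = StreamToDb::save;
        Stream.of("google.com", "example.org").forEach(sink);
        System.out.println(database); // [google.com, example.org]
    }
}
```

The point is that KStreams itself only moves records between topics; persistence happens in ordinary application code that you attach as a terminal operation.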
❤️❤️❤️
Intro ❤️
Another point: don't use the same domain model. Microservices should be independent. If a shared domain model changes, it forces all of the other dependent models to apply changes, and this blocks the whole product build pipeline.
Do you publish this repo on your github guys ?
Yes. The link is in the description
Use white color for your background bro
Thanks for making the video. Kindly choose websites that are active worldwide. I tried to use your domain website to access the domains, but it is not working in Canada. Anyway, the content was fine. Thank you.
Also, please share the proper link to the branch on GitHub. Your GitHub link was to the GitHub home page.
Did not notice the missing link, Maninder. Thanks for notifying me. I have updated the GitHub code in the description. Here is the link: github.com/TechPrimers/kafka-streams-microservices-example
Awesome work 👍🏻 Could you please try playing with multiple streams: configuring, merging, joining, processing etc.? Because I had an issue after merging 2 KStreams: records weren't flowing to the processor.
cdcStreams.merge(edcStreams).process((ProcessorSupplier) () -> new CCCPStreamProcessor(config, esRepo), ConstantUtils.CCCP);
❤️❤️