I really liked how you clearly defined the problem and how you solved it.
Everything in detail, that's what every tech listener expects. Good going, Quix!
This Kafka series is awesome 🙌 🎊 🎉 Congrats, keeeeeep posting!
Thank you! Will do!
Another wonderful video👏. You make things simpler with your explanations.
The way you explain, go thru each step of the code. Just beautiful, what a public service right here. Thank you and hoping to see much more from you. Cheers!
Are you on drugs? These idiots haven't said anything 😂
Thank you very much for this series
Thank you very much! Watching this clarified a lot of topics for me 😊
Vim's regex search and replace is wonderful.
Hope to see more use cases from you!
love ❤your video!
Awesome video! What I am wondering, though, is how is streaming into a topic different from the previous consumer video? They're both reading from a producer, right?
Would it be reasonable to say I would stream and transform the producer data onto the reader's topic?
You can architect your services however you wish.
You can certainly combine different processes into one, then later on if you have a need to split them out you can do that too.
Hello, please what font is used on the thumbnail text?
From our designer: "The heading font is Cy SemiBold (with contextual alternates turned off)."
Hope that helps!
Can you do one tutorial on the transformation bit, especially the group-by transformation? I am having trouble implementing the group-by shown in the documentation.
Will do. Watch this space.
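In the meantime, here's a toy batch sketch of the idea behind a group-by count (plain Python for intuition, not the actual Quix Streams `group_by` API — in a streaming group-by the same bucketing happens per incoming message, with counts kept in a state store instead of a local dict):

```python
from collections import defaultdict

def group_count(messages, key_field):
    """Bucket messages by one field and count each bucket."""
    counts = defaultdict(int)
    for msg in messages:
        counts[msg[key_field]] += 1
    return dict(counts)
```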
What’s the Vim plugin that shows the Python syntax prompts?
That's `nvim-cmp` (in Neovim).
How can we get Redpanda running locally?
Good question! I've just pushed a docker compose file to the github repo. If you `docker compose up` that should give you a local instance. 👍
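For reference, a single-node Redpanda compose file looks roughly like this (image tag, flags, and ports are illustrative — the file in the repo is the source of truth):

```yaml
services:
  redpanda:
    image: docker.redpanda.com/redpandadata/redpanda:latest
    command:
      - redpanda
      - start
      - --overprovisioned
      - --smp=1
      - --memory=1G
      - --kafka-addr=PLAINTEXT://0.0.0.0:9092
      - --advertise-kafka-addr=PLAINTEXT://localhost:9092
    ports:
      - "9092:9092"
```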
How come it is running continuously without a loop?
Quix Streams! `sdf.run` listens for the termination signals and keeps the listeners open for business.
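Conceptually, a blocking `run` does something like this under the hood (a toy stand-in for intuition, not the actual Quix Streams internals):

```python
import signal

class Runner:
    """Toy stand-in for a blocking stream `run()`: loop forever,
    processing messages, until a termination signal flips the flag."""

    def __init__(self, poll, process):
        self.poll = poll        # fetches the next message (or None)
        self.process = process  # handles one message
        self.running = False

    def stop(self, *_):
        self.running = False

    def run(self):
        # Register handlers so Ctrl-C / SIGTERM exit the loop cleanly.
        signal.signal(signal.SIGINT, self.stop)
        signal.signal(signal.SIGTERM, self.stop)
        self.running = True
        while self.running:          # <- the loop you don't see
            msg = self.poll()
            if msg is not None:
                self.process(msg)
```

So there *is* a loop — it just lives inside the library, together with the signal handling.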
What's the meaning of `...transform...`?
Any transformation you can do on top of the data.
Yes that's correct. It could be determining a new value based on existing values in the data set or some average based on previous values that you have in state. Really any processing you can achieve using Python.
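As a sketch, a stateful running average could look like this — the `State` class here is a toy dict-backed stand-in for a real streaming state store, and the `"temperature"` field is just an assumed example:

```python
class State:
    """Toy key-value state store, standing in for a real one."""
    def __init__(self):
        self._store = {}
    def get(self, key, default=None):
        return self._store.get(key, default)
    def set(self, key, value):
        self._store[key] = value

def running_average(value, state):
    """Update count/total in state and attach the running average."""
    count = state.get("count", 0) + 1
    total = state.get("total", 0.0) + value["temperature"]
    state.set("count", count)
    state.set("total", total)
    return {**value, "avg_temperature": total / count}
```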
How to scale this app? Would it be common to scale such pipelines horizontally?
Hi, use consumer groups to scale your app. e.g. if both are using the same consumer group they will handle different messages from the topic and therefore allow you to get higher throughput. Check out our docs or join Slack to get more in depth answers.
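As a toy illustration of why two consumers in the same group never handle the same message: the broker assigns each topic partition to exactly one group member. Real brokers use pluggable assignment strategies; this is just round-robin for intuition:

```python
def assign_round_robin(partitions, consumers):
    """Split partitions across consumers so each partition is
    owned by exactly one consumer in the group."""
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(partitions):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment
```

Adding a consumer to the group just re-slices the partitions, which is what lets you scale horizontally.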
imo venv is better because env is used to store environment variables
.venv is even better if you use fd/ripgrep, so that files in .venv are skipped from search.
"Farenheit is not an easy thing for a European to spell" -- looks like "celsius" is not easy either. :) Love the video!
I wish I could claim that was a deliberate joke, but... 🤦‍♂️