Rare to come across a session on YouTube where the instructor is so clean in their delivery that there's a 1:1 match between every instruction and the viewer's experience (at least as of this comment). This was a treat!
First one of these I’ve watched. Beautifully explained, with all the required detail, really bringing it to life. Great work!
This was brilliant, perfect and fun. What a clear instructor and perfect way to explain and introduce Kafka. And that Python generator solution was great.
I liked this video very much. Just the relevant parts, leaving out all the fluff but with a lot of humour!
Fantastic video -> event-driven architecture, data pipeline, notifications, Kafka, Telegram, code, best practices, and humour all in a single package🙂
Finally, I have found the best video about Kafka. Thank you for this video!
love that you do everything in the terminal, no need for an IDE. Respect 😆
Thank you so much for this video. Made getting acquainted with Kafka as a beginner a pleasure.
Great video, great energy, very didactic. I really enjoyed every minute of it. Also, the way you talk transmits really nice vibes.
Fabulous video. Very well taught, coded and explained.
Outstanding and extremely clean explanation
Great video! I really appreciated the smooth and intuitive coding process. If you could, please consider refactoring for further improvements. In the meantime, I'll continue exploring Kafka Streams and the various applications that can be developed using it. Cheers! :)
Incredible session! Woow!
I would LOVE to see a vim tutorial from Kris!!
Thanks for the session Kris, very interesting and touching many topics
a lot of functionality with so little code!
Looking forward to see this next refactoring video.
Congrats!
This is the first video that I have watched on your channel, and I just loved it, especially because I have started learning Kafka lately. I would love to know how we could deploy such kinds of applications.
This is very well explained!! I really enjoyed it.
One of the best tutorials ever! Thanks for making this.
This is a wonderful tutorial!
Really touching video ;). Thanks!
Such a great video in less than one hour!
For those who cannot find API Credentials:
1. Select your environment (probably 'default')
2. You'll see the "endpoint" in the bottom-right corner. This is what you're looking for.
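If it helps, here's a minimal sketch of where that endpoint ends up in the Python client, assuming the confluent-kafka package; the endpoint and credentials below are placeholders, not real values:

from confluent_kafka.schema_registry import SchemaRegistryClient

# Placeholder endpoint and credentials -- substitute the values from
# your own Confluent Cloud environment's API credentials page.
sr_client = SchemaRegistryClient({
    "url": "https://psrc-xxxxx.us-east-2.aws.confluent.cloud",
    "basic.auth.user.info": "SR_API_KEY:SR_API_SECRET",
})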
Thank you so much!
Loved the way you explained each and every part of the process. Hope you get a notification on your bot!! 😂😊
Great content as always! Looking forward to the follow-up refactoring video!
Thanks for the video!
Would love to see the higher-order function refactoring, and to see whether any more neat features have been added to Confluent Cloud since this video was made. Retrospectives in action ;)
That is a great tutorial, thank you!!
Really enjoyed the video, Thanks Kris
Great video! Thanks 👍
Thanks for the extremely useful video. Did you ever make the refactoring video for the things you mentioned about optimizing the code?
This was good. I think I am going to use what's demoed here.
SQL optimization video would be CLUTCH.
It was a very interesting video. I did not know that I could build such a useful application with Kafka and ksqlDB. I will definitely wait for the refactoring and deployment video.
Awesome work
thank you!
Great video! Wondering how much this would roughly cost you in a year, i.e. how many discounted synths you'd need to buy before breaking even.
Is it possible to do the project with the basic Confluent tier, without paying?
Thanks a lot for the video!
I'm completely new to Kafka and I ran into an issue - it looks like the Confluent UI has changed and I can't find the schema registry URL. Could someone help, please?
enjoyed it indeed!
Awesome tutorial! I’m definitely interested in a follow-up that refactors and deploys the watcher script in the cloud (I’m guessing as some kind of cron job).
Question - you defined your schema by creating a stream in ksql and pulling down the resulting schema from Schema Registry using the schema registry client in Python. Would it be better to define the schema in Python and source control it there as the source of truth for the schema? That way, the developer has control over schema changes as the requirements evolve. What do you think?
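For what it's worth, a minimal sketch of that schema-as-code approach, assuming the confluent-kafka Python client; the subject name and Avro fields below are illustrative, not the ones from the video:

from confluent_kafka.schema_registry import SchemaRegistryClient, Schema

# Hypothetical Avro schema, kept in source control as the source of truth.
VIDEO_SCHEMA = """
{
  "type": "record",
  "name": "youtube_video",
  "fields": [
    {"name": "video_id", "type": "string"},
    {"name": "title", "type": "string"},
    {"name": "likes", "type": "long"},
    {"name": "comments", "type": "long"}
  ]
}
"""

sr_client = SchemaRegistryClient({"url": "https://psrc-xxxxx.confluent.cloud",
                                  "basic.auth.user.info": "KEY:SECRET"})

# Re-registering an identical schema is a no-op; a changed schema creates a
# new version, checked against the subject's compatibility settings.
schema_id = sr_client.register_schema("youtube_videos-value",
                                      Schema(VIDEO_SCHEMA, "AVRO"))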
I’m also wondering if you could use an HTTP source connector to get the YouTube data into Kafka more easily. Then you wouldn’t even have to deploy the Python script. I’m not sure if the HTTP source connector can handle paging like the Python generator solution does, though (see the sketch below).
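On the paging point, the generator pattern from the video looks roughly like this; a sketch assuming the YouTube Data API v3 commentThreads endpoint, with an illustrative function name:

import requests

def fetch_comment_pages(api_key, video_id):
    """Yield comment-thread items one at a time, following nextPageToken."""
    page_token = None
    while True:
        payload = requests.get(
            "https://www.googleapis.com/youtube/v3/commentThreads",
            params={"key": api_key, "videoId": video_id,
                    "part": "snippet", "pageToken": page_token},
        ).json()
        yield from payload.get("items", [])
        page_token = payload.get("nextPageToken")
        if page_token is None:
            break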
Now, imagine a world where YouTube’s API (and the rest of the APIs in the world) supported push (e.g. over Server-Sent Events) rather than pull. Your producer script could just subscribe to changes and publish to Kafka, like Change Data Capture in the database world. I’m looking forward to more APIs “thinking in streams” rather than just synchronous request/response.
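Purely as a thought experiment, that push model might look like this in Python, assuming the sseclient package and an imaginary SSE endpoint (YouTube offers no such stream today):

from confluent_kafka import Producer
from sseclient import SSEClient  # pip install sseclient

producer = Producer({"bootstrap.servers": "localhost:9092"})

# Imaginary endpoint -- stands in for an API that pushes changes.
for event in SSEClient("https://api.example.com/video-events"):
    if event.data:
        producer.produce("youtube_changes", value=event.data)
        producer.poll(0)  # serve delivery callbacks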
Thanks again for an awesome video!
Thanks a lot, Kris. Can we get the code snippets, please?
brilliant!
Is it safe to show your Google API ID in the clear?
@krisjenkins8110 nice one
I was thinking about developing this idea into a real-life project, deployed online with NLP processing, but I was wondering: how can I process replies to the reply you send to the commenter?
For those checking if the production video is out, here is one of Confluent's videos I found: th-cam.com/video/S204ya8eObE/w-d-xo.html
Good video, thank you!
At the same time, I feel bad about your approach: it's too "Confluent-oriented", and can vendor-lock newbie programmers into those proprietary technologies.
Quick question: why is the title of this video 'Build a REACTIVE data streaming app'?
I don't see any reactive code in the script you wrote.
Is the whole data pipeline reactive?
In my understanding, a reactive system should be driven by events (for example, likes and comments on the YouTube video you scraped).
That means every time a like or comment occurs on the video you're watching, it should be sent to the Kafka cluster immediately,
not polled every 10 minutes to notice that the number of likes and comments has changed.
Can you give a detailed explanation of why this demo is reactive?
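For context, the watcher in the demo is a poll-and-diff loop: only actual changes become events downstream. A sketch of that idea against the YouTube Data API, with an illustrative function name:

import time
import requests

def watch(api_key, video_id, interval_s=600):
    """Poll video statistics and report only when they actually change."""
    last = None
    while True:
        stats = requests.get(
            "https://www.googleapis.com/youtube/v3/videos",
            params={"key": api_key, "id": video_id, "part": "statistics"},
        ).json()["items"][0]["statistics"]
        if stats != last:
            print("change detected:", stats)  # in the demo, this becomes a Kafka produce
            last = stats
        time.sleep(interval_s)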
Wow, the subscribe button blew me away
No way I can make what you made, sir. I would like to see, one day, a tutorial where nothing is hidden and everything is shown clearly. Well, I have time to die three times before that happens...
ImportError                               Traceback (most recent call last)
Cell In[7], line 3
      1 import logging
      2 import sys, requests
----> 3 from config import config

ImportError: cannot import name 'config' from 'config' (c:\Users\flosr\Engineering\Data Engineering\YouTube API Project\config.py)
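That ImportError just means your config.py doesn't define a top-level name called config. A minimal sketch of what the import expects; the keys here are illustrative, not the tutorial's exact ones:

# config.py -- must define an object literally named `config`
config = {
    "google_api_key": "YOUR_YOUTUBE_API_KEY",
    "youtube_video_id": "YOUR_VIDEO_ID",
}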