Build a Scalable Realtime Chat App with Kafka and PostgreSQL

  • Published Jan 1, 2025

Comments • 108

  • @prashlovessamosa
    @prashlovessamosa 1 year ago +13

    You come up with things that no one else makes.
    You are awesome, Piyush.

  • @CuriousAnonDev
    @CuriousAnonDev 1 year ago +9

    Hey, you are doing great work! A request: can you please continue such tutorials and also teach about scalability, microservices, chat servers with rooms, video calls, or deployment on Docker/k8s, etc.?
    Like what software engineering looks like IRL. People on YT are just doing Next.js stuff and I can't understand anything...

  • @abdussamad0348
    @abdussamad0348 10 months ago

    Amazing tutorial, helping us do better engineering. Industry-level standards!!

  • @debasishdutta9073
    @debasishdutta9073 1 year ago +1

    Thanks Piyush, I was waiting for this video. I love your scalable system design videos.

  • @riteshKumarWebDeveloper
    @riteshKumarWebDeveloper 2 months ago

    What an amazing video it is. 😮 Greatly done.
    Thank you so much.

  • @Aitool-r3q
    @Aitool-r3q 9 months ago

    Finally completed this project and gained a lot of knowledge. Thank you Piyush Sir ❤️👍🎉

    • @vikashgupta3305
      @vikashgupta3305 3 months ago

      Do provide a GitHub link.

  • @mdmusaddique_cse7458
    @mdmusaddique_cse7458 6 months ago

    Man! So much learning in these 2 videos. Thanks!

  • @souravkumar3553
    @souravkumar3553 8 months ago

    Really something awesome. It practically answers every system design question.

  • @aniruddhadas2953
    @aniruddhadas2953 1 year ago

    Just wow 🤩🤩. I have learned something new that no one teaches us. Highly appreciable work. Thank you. 🙏

  • @niraz9701
    @niraz9701 1 year ago +14

    Please make a full video about RabbitMQ 🙏

  • @aadarshgurug
    @aadarshgurug 1 year ago

    That is something I was planning to build and had a lot of confusion about; now everything is clear. Thank you, brother.

  • @shravan2891
    @shravan2891 1 year ago +2

    Fresh, unique stuff; no one is teaching this on YouTube.
    Also, can you teach more about Turborepo in detail, like testing, linting, etc. in a Turborepo?

  • @harshgautam6260
    @harshgautam6260 1 year ago

    Man, I love how professionally you do your work ❤

  • @shubhtalk7073
    @shubhtalk7073 3 months ago

    You are the best, Piyush bhaiya ❤

  • @ARSHADKHAN-hc6pb
    @ARSHADKHAN-hc6pb 1 year ago +1

    Love you brother ❤❤❤.
    And one request: please make a series on a microservices project, showing how to build a project using that architecture.

  • @TechSpot56
    @TechSpot56 9 months ago

    Great video, please continue this series.

  • @PrantikNoor
    @PrantikNoor 1 year ago

    I like your teaching style. ❤

  • @sagar7929
    @sagar7929 1 year ago

    Awesome course, and thank you so much.
    Please make more valuable content with monorepo architecture in Node.js.
    God bless you, sir, and thank you once again.

  • @snehasish-bhuin
    @snehasish-bhuin 1 year ago

    Very nice learning ❤🎉 It will definitely have an impact on the community.

  • @rajukadel1007
    @rajukadel1007 11 months ago

    Great content!! Keep sharing your experience ❤

  • @shubhtalk7073
    @shubhtalk7073 3 months ago

    Better than a paid course ❤

  • @akash-kumar737
    @akash-kumar737 4 months ago

    Thanks, man. Learned a lot.
    Love you ❤❤❤

  • @abhidevlops
    @abhidevlops 1 year ago

    Awesome content, brother. You could extend this project further ❤

  • @fromhousesheoran
    @fromhousesheoran 1 month ago

    Love you man, you are just amazing.

  • @vishalkhoje
    @vishalkhoje 1 year ago +2

    Hi, great work Piyush. Can you please create a video on how to deploy a Turborepo project like this scalable realtime chat app to a host such as Vercel?

  • @FarazAhmad-t6g
    @FarazAhmad-t6g 6 months ago

    Bro is literally building his own empire in backend mastery.

  • @work7066
    @work7066 1 year ago +4

    Why are we using Redis along with Kafka? Can't we simply use Kafka's pub/sub for the two servers to communicate instead of Redis? Can someone please explain the advantages or tradeoffs?

    • @AryanRaj-td4vs
      @AryanRaj-td4vs 11 months ago

      I'll give an example based on Google Pub/Sub or a storage queue, which I use in place of Kafka. If one consumer takes a message and ACKs it, that message is gone from the topic and other instances on the same subscription won't get it. Of course we could create a separate subscription for each instance, but that is a manual process, unlike Redis pub/sub.

  • @swarajgandhi
    @swarajgandhi 4 months ago +1

    Very helpful videos, Piyush.
    Just one doubt: how do we retrieve data when the user refreshes the page and we have to fetch the last few messages, given that we can't query the DB in that case?

  • @saksham_1612
    @saksham_1612 1 year ago

    Awesome video ❤🎉

  • @Support-Phalestine
    @Support-Phalestine 1 year ago +1

    Piyush, please make an introduction video: what is Turborepo? I'm confused by it.

  • @Chanchal_Sen
    @Chanchal_Sen 5 months ago

    You got a new subscriber.

  • @kamleshbachani8132
    @kamleshbachani8132 1 month ago +1

    What if the user wants to see old messages while they are still inside Kafka?

  • @PrashantSaini-i6j
    @PrashantSaini-i6j 11 months ago

    Keep posting such content, please.

  • @fromhousesheoran
    @fromhousesheoran 24 days ago

    Awesome content, Piyush bhai.
    One question: why produce the message to Kafka when it is received from Redis? Why not produce it when publishing to Redis?

  • @mohammedgazi786
    @mohammedgazi786 1 year ago +1

    Bring a video on MVC too, brother.

  • @digitalTechspace
    @digitalTechspace 9 months ago +1

    Why do you use both Redis and Kafka? Can we use Kafka only?

  • @ankitpradhan3327
    @ankitpradhan3327 5 months ago

    You are making a new entry in the database every time a message is produced, and as we know, databases have low throughput. Could we run a consumer (or a second consumer) that consumes the data at a certain interval and stores it in the DB in batches?

  • @sreesen3159
    @sreesen3159 1 month ago

    If Kafka too is inserting messages into the DB one at a time, where is the benefit, apart from the fact that when the DB goes down the messages will still be in Kafka and will be inserted once it is up again?

  • @deezwhat2791
    @deezwhat2791 1 year ago

    Top-notch content.

  • @riteshKumarWebDeveloper
    @riteshKumarWebDeveloper 1 month ago

    One question: if we refresh the browser, the values are removed. So can we fetch the values again from the database with a simple GET request, to keep the messages even after refreshing the browser?

  • @anassaif3181
    @anassaif3181 7 months ago +2

    You said earlier that the consumer is a separate Node.js server, but you defined the consumer in the primary server itself. Why? Is that for the sake of simplicity? If so, how will a standalone consumer server get the same Prisma instance?

    • @opsingh861
      @opsingh861 6 months ago

      You can just create a new Prisma instance, and if you are using Turborepo you can simply import it.

    • @anassaif3181
      @anassaif3181 6 months ago

      @@opsingh861 A new instance would probably mean a new connection, I guess, and possibly new migrations... Yeah, Turbo might be the better option.

  • @akhiltej7q8
    @akhiltej7q8 1 year ago

    Please make more videos continuing this!

  • @sohanbafna2282
    @sohanbafna2282 8 months ago

    I was curious whether Kafka could have replaced Redis here. I am not sure Redis is required if we are using Kafka. Please let me know your thoughts.

  • @physicsakhada592
    @physicsakhada592 10 months ago

    If it is a group chat, then we have to make a different Postgres database to store all the data, like the room ID and all the users of that group.

  • @shubhasheeshkundu6040
    @shubhasheeshkundu6040 1 year ago +1

    Sir, please make a backend project using microservices.

  • @arnavranjan7881
    @arnavranjan7881 2 months ago

    Sir, why didn't you use a Redis list to store the messages?

  • @krishangopal2475
    @krishangopal2475 11 months ago

    Hi Piyush, thanks for the amazing video!!!
    Just one question: couldn't we use Kafka directly as a pub/sub instead of using Redis separately, where all the servers and the processing server (running write queries in Postgres) subscribe to the 'MESSAGES' Kafka topic?

  • @_PiyushKumar-vv7ui
    @_PiyushKumar-vv7ui 1 year ago

    Amazing video.

  • @skzahirulislam2820
    @skzahirulislam2820 1 year ago

    Brother, one request: please also make a video on pagination and infinite scroll. This topic is asked about a lot in React.js and Node.js interviews. Also explain which one to use when; please make it in-depth, brother.

  • @shi-nee7966
    @shi-nee7966 1 year ago

    Where is the video Piyush sir mentioned, i.e. the first part of this video?

  • @aliimranadil2
    @aliimranadil2 1 year ago

    Love it, brother.

  • @AdityaGhode-l2z
    @AdityaGhode-l2z 6 months ago

    I have one doubt, please help, anyone: where is the complete frontend and deployment part of this project?

  • @harsh_dagar
    @harsh_dagar 9 months ago

    Hey @piyushgargdev,
    Is Kafka's consumption (per consume interval) incremental, or will all the data be delivered each time? If so, how can we handle only the incremental data and not the whole set?
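
    [Editor's note] Kafka delivery is incremental: each consumer group tracks a committed offset into the partition log, and a fetch only returns records past that offset. A minimal in-memory sketch of that bookkeeping (illustrative only; a real client such as kafkajs manages offsets for you):

```javascript
// Minimal in-memory model of a partition log with per-group committed offsets.
class PartitionLog {
  constructor() {
    this.records = [];
    this.committed = new Map(); // groupId -> next offset to read
  }
  append(value) {
    this.records.push(value);
  }
  // Return only records past the group's committed offset (incremental),
  // then commit the new offset so the same records are not delivered again.
  poll(groupId) {
    const from = this.committed.get(groupId) ?? 0;
    const batch = this.records.slice(from);
    this.committed.set(groupId, this.records.length);
    return batch;
  }
}

const log = new PartitionLog();
log.append('m1');
log.append('m2');
const first = log.poll('chat-consumers');  // delivers m1, m2
log.append('m3');
const second = log.poll('chat-consumers'); // delivers only the new record, m3
```

So a consumer never re-reads the whole topic on each poll; it resumes from its committed offset.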

  • @saqibhussain2843
    @saqibhussain2843 2 months ago

    Hello Piyush,
    I hope you are doing well.
    Can you please make a video on using AWS MSK with Kafka? I am trying really hard to make Kafka work with AWS MSK, but things are not getting solved.

  • @nitinjain4519
    @nitinjain4519 1 year ago

    Can I run two databases in Prisma in the same project, like PostgreSQL and MySQL?

  • @sachinsingh2104
    @sachinsingh2104 1 year ago

    Can you help us learn how to deploy monorepo applications?

  • @wahabkazmi7486
    @wahabkazmi7486 4 months ago

    Piyush, you are sending data into the database one by one, not in bulk, right?

  • @riyanshbiswas
    @riyanshbiswas 1 year ago

    Just one question: why are you using cloud services for Postgres and Kafka? Isn't a local Docker container free and less time-consuming as well?

  • @jbrocksfellas4147
    @jbrocksfellas4147 1 year ago

    Awesome value.

  • @indentCoding
    @indentCoding 1 year ago

    A doubt: Kafka can receive messages at high velocity because it is built for that, but when you insert into the DB inside eachMessage, how does that make any difference? At high message velocity, the eachMessage handler runs one insert query per message, so if you receive 100,000 messages in a 1-2 second interval, your DB takes 100,000 insert operations, which will bring it down. And if that happens, what is the benefit of using Kafka? I understand there will be no downtime because Kafka will still be active, but there should be something that reduces the insert load on the DB.

    • @xcoder420
      @xcoder420 10 months ago

      Actually, the consumer should be an altogether separate microservice. It would consume the messages in batches and do batch insertion. Say we configured the DB to support 10k writes per second; then we would consume 10k messages at a time and insert them into the DB. This is known as async processing.

  • @FaisalKhan-oy4zz
    @FaisalKhan-oy4zz 1 year ago

    When the DB is down or unable to insert a message, will the consumer resume from that message or from the next one?

    • @skzahirulislam2820
      @skzahirulislam2820 1 year ago +2

      From that message, because the message is stored inside Kafka.
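
    [Editor's note] This is the usual at-least-once pattern: the consumer commits its offset only after a successful insert, so a failed write is retried from the same message rather than skipped. A minimal in-memory simulation (not real kafkajs code; the flaky `insert` stands in for a DB outage):

```javascript
// Simulate a consumer that commits its offset only AFTER a successful DB
// insert, so a failed message is retried -- not skipped.
function consumeWithRetry(messages, insert, maxAttempts = 5) {
  const stored = [];
  let offset = 0; // committed offset: next message to process
  while (offset < messages.length) {
    let attempts = 0;
    for (;;) {
      try {
        insert(messages[offset]);
        stored.push(messages[offset]);
        offset++; // commit only on success
        break;
      } catch (e) {
        if (++attempts >= maxAttempts) throw e;
        // a real consumer would pause here (e.g. consumer.pause()) and retry
      }
    }
  }
  return stored;
}

// Fake DB that is "down" for the first two insert attempts.
let failures = 2;
const flakyInsert = (msg) => {
  if (failures-- > 0) throw new Error('DB down');
};

const stored = consumeWithRetry(['a', 'b', 'c'], flakyInsert);
// 'a' is retried until the DB recovers; nothing is skipped or duplicated.
```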

  • @Sandbox-coder
    @Sandbox-coder 1 year ago

    One more like... hey mate, I have a query: how does Kafka know which consumer the reply needs to be sent to? I did see your Kafka video but am struggling to understand this. And how can I build the same thing for a mobile app?

  • @arafatislam4648
    @arafatislam4648 1 year ago

    Can you show the deployment process?

  • @_034_divyanshusrivastava6
    @_034_divyanshusrivastava6 8 months ago +1

    Brother, please also deploy the projects; otherwise it causes a lot of problems.

  • @tanvir4748
    @tanvir4748 1 year ago

    Can anyone give me the previous video's link?

  • @rohitpandey4411
    @rohitpandey4411 1 year ago

    Kafka itself has a pub/sub model, so rather than saving data in two places (Redis and Kafka), can we create a data aggregation function that caters to the user messaging service, working alongside yours to handle the DB update queries?

    • @amardeepsingh1168
      @amardeepsingh1168 1 year ago

      We should use either Kafka or Redis, right? Not both. @rohitpandey4411

  • @Dark-nt8hh
    @Dark-nt8hh 4 months ago

    One thing I didn't understand is why you ran the consumer function in the init function of the index.ts file. I couldn't understand the logic behind it. Everything else is top-notch.

  • @Ankit-01-01
    @Ankit-01-01 1 year ago

    Make a video on PostgreSQL.

  • @lakshyasyal
    @lakshyasyal 10 months ago

    This is a beneficial video, but I didn't like the ending. With the try/catch, if the DB crashes or something goes wrong with it, we pause for 1 minute and restart from the beginning. Can we do something else that would always be good for the DB?

    • @Derrick-f8m
      @Derrick-f8m 25 days ago

      A DB crash is possible but not likely. If your DB is managed by a cloud provider like AWS, then you have 99.99999% durability. Cloud providers usually have robust infrastructure, so that should be the least of your worries.

  • @iUmerFarooq
    @iUmerFarooq 1 year ago

    How do you handle realtime notifications in Vue and Node, the way FB handles its post notifications? For example, I have an assignment management system and I'm logged in as an admin; when someone uploads/sends a new assignment, the notification should arrive in realtime and show in a toast. How can that be done?

  • @krishnanand_yadav
    @krishnanand_yadav 4 months ago

    When the messages are consumed, shouldn't they be deleted from the topics in Kafka? But they are still present there. Is it supposed to be like this?

    • @Derrick-f8m
      @Derrick-f8m 25 days ago

      Yes, the messages stay there for the event of a failure; you need them to replay events back to the current state. The events remain stored in Kafka, but the consumer group's committed offset records how far it has read, so events that were already consumed won't be read twice.

  • @ankush9117
    @ankush9117 2 months ago

    Hi Piyush, since we are already using Redis, would it be better to use Redis Streams instead of Kafka? Please reply.

    • @Derrick-f8m
      @Derrick-f8m 25 days ago

      Yes, Redis Streams is an alternative to Kafka, because it also uses consumer groups and offsets. However, Kafka can handle very large throughput, on the order of 1 million requests per second.

  • @growwithriu
    @growwithriu 1 year ago

    Hi Piyush and everyone, I have a doubt. When there are multiple servers, each of them will consume the message from Kafka and write it to Postgres, thereby creating as many entries in the DB per message as there are servers. Is that desired?
    Spin up one more server on a different port and send a message: there will be two entries for that message in the DB.

    • @shantanubhise9288
      @shantanubhise9288 1 year ago +1

      Yes, you are correct. If the logic for consuming messages and writing to PostgreSQL is placed directly in the message-receiving event, it can lead to duplicated message entries. In my implementation, I used Redis for inter-server communication (I have not used Kafka yet). When a message is sent (the "send" event), I publish it to the "MESSAGES" channel in Redis, and on the "receive" event I broadcast it to all connected clients.
      For storing messages in PostgreSQL, I introduced a global array named "messageBatch". When a message is sent, I push it into this array. The important part is using setInterval to periodically process the array (take a copy of messageBatch and empty it so it can collect new messages), writing its contents to PostgreSQL. The data is stored successfully.
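
    [Editor's note] The copy-and-clear batching described in this reply can be sketched as follows (an illustrative in-memory buffer; in a real app `flush` would run on a `setInterval` and `writeBatch` would be something like Prisma's `createMany`):

```javascript
// Buffer incoming messages and write them to the DB in batches instead of
// one INSERT per message. `writeBatch` stands in for a bulk write such as
// prisma.message.createMany({ data: batch }).
function createBatcher(writeBatch) {
  let messageBatch = [];
  return {
    add(msg) {
      messageBatch.push(msg);
    },
    // Called periodically, e.g. setInterval(() => batcher.flush(), 5000).
    flush() {
      if (messageBatch.length === 0) return;
      const batch = messageBatch; // keep a reference to the current batch...
      messageBatch = [];          // ...and reset so new messages keep queuing
      writeBatch(batch);          // one bulk write instead of N inserts
    },
  };
}

const writes = [];
const batcher = createBatcher((batch) => writes.push(batch));
batcher.add({ text: 'hi' });
batcher.add({ text: 'hello' });
batcher.flush(); // one write containing both messages
```

This trades a small delay (up to one flush interval) for far fewer DB round trips.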

    • @anagnaikgaunekar9081
      @anagnaikgaunekar9081 4 months ago

      This is a valid issue. It won't occur if we put the produce-to-Kafka logic after publishing to Redis, instead of inside the subscribe handler.

  • @sagarbhatt3346
    @sagarbhatt3346 1 year ago

    Please make the same videos in Python.

  • @effectsofmp3927
    @effectsofmp3927 1 year ago +1

    First comment 😁

  • @catchroniclesbyanik
    @catchroniclesbyanik 1 year ago

    I have one question. Since you introduced Kafka into the project, couldn't we remove Redis from it? Redis was being used for pub/sub, which Kafka can do as well.

    • @abdulrehmanjaved-rt8jq
      @abdulrehmanjaved-rt8jq 1 year ago

      You mean we can subscribe the servers to Kafka topics?

    • @abdulrehmanjaved-rt8jq
      @abdulrehmanjaved-rt8jq 1 year ago +1

      Redis supports push-based delivery: messages published to Redis are delivered to subscribers automatically and immediately. Kafka supports pull-based delivery: messages published to Kafka are never pushed directly to consumers; consumers subscribe to topics and ask for messages when they are ready to process them.

    • @shantanubhise9288
      @shantanubhise9288 1 year ago

      Yes, one might think of using only Kafka or only Redis, but here we need both.
      We have two requirements:
      1. Inter-server communication.
      A message sent by user1 on server1 should be received by all users connected to the other servers. Here we can use the Redis pub/sub model: the Redis publisher publishes the message to the "MESSAGES" channel, and all Redis subscribers of this channel receive it, including the server that sent the message. Thus inter-server communication is achieved.
      If we used Kafka in this case, the producer would produce the message to the "MESSAGES" topic, but all the Kafka consumers (on all servers) belong to the same consumer group, because they share a groupId. Hence only one consumer would receive the message, and the other servers would not.
      2. Storage of messages in the database.
      Here we can use Kafka. The producer produces the message to the "MESSAGES" topic, and only one consumer of this topic receives it and stores it in the database. As I said, all the consumers share a groupId, so only one of them can receive each message.
      If we used Redis here, all the subscribers would receive the message and store it, resulting in duplicate rows in the database.
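
    [Editor's note] The two delivery modes contrasted in this reply can be simulated in a few lines (an in-memory sketch of the semantics, not a real Redis or Kafka client):

```javascript
// Redis-style pub/sub: every subscriber receives every message (fan-out),
// which is what each chat server needs to forward messages to its sockets.
function publishToAll(subscribers, msg) {
  subscribers.forEach((deliver) => deliver(msg));
}

// Kafka-style consumer group: each message goes to exactly ONE member of
// the group (round-robin here; real Kafka assigns by partition), which is
// what the DB writer needs to avoid duplicate inserts.
function deliverToGroup(group, msg, i) {
  group[i % group.length](msg);
}

const server1 = [], server2 = [];
const subscribers = [(m) => server1.push(m), (m) => server2.push(m)];
publishToAll(subscribers, 'hello'); // both servers get it

const dbWriter1 = [], dbWriter2 = [];
const group = [(m) => dbWriter1.push(m), (m) => dbWriter2.push(m)];
deliverToGroup(group, 'hello', 0); // only one consumer gets it
```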

    • @catchroniclesbyanik
      @catchroniclesbyanik 1 year ago

      Here, all the server instances subscribe to a Redis channel for incoming messages. I think we could simply remove Redis and make every server long-poll Kafka for messages.

    • @abdulrehmanjaved-rt8jq
      @abdulrehmanjaved-rt8jq 1 year ago +1

      @@catchroniclesbyanik Yes, because Kafka also provides a pub/sub mechanism. I think Piyush bhai made the first video just to solve the problem, and this one for full scalability.

  • @HimanshuDhiman-v9c
    @HimanshuDhiman-v9c 10 months ago

    Can I do this with Mongo?

  • @Derrick-f8m
    @Derrick-f8m 25 days ago

    Please provide English subtitles 😢

  • @ankitkapoordirector6087
    @ankitkapoordirector6087 1 year ago

    Please, can anyone reply: can I make a chat app using Java networking concepts? Is it possible? Please reply.

    • @xcoder420
      @xcoder420 10 months ago

      Yes. Explore Netty.

  • @mystic_monk55
    @mystic_monk55 1 year ago

    👏👏

  • @Abdulwasay_Hashmi
    @Abdulwasay_Hashmi 2 months ago

    A small problem turned into a very big one 😂

  • @rishiraj2548
    @rishiraj2548 1 year ago

    🙏👍

  • @Sagarkun
    @Sagarkun 10 months ago

    Didn't enjoy it, bro.

  • @RiskyMeal
    @RiskyMeal 10 months ago +2

    Video title in English... video audio in Hindi... no offense, but bruh... what are you doing?

    • @Derrick-f8m
      @Derrick-f8m 25 days ago

      But at least the comments are in English 😊