Enhancing ChatGPT with Long-Term Memory

  • Published Jun 10, 2024
  • github.com/msuliot/long-term-...
    I'm super excited to show you how to add long-term memory to our chat system. This means that every time you return, it will remember all the previous conversations you've had, making it better at answering your questions. Trust me, this is going to be awesome, and you're going to love it!
    To give you a quick overview, I define short-term memory as the current conversation, which can be one or multiple question-answer exchanges. When you finish a conversation, I take that short-term memory, summarize it, and store it as long-term memory. Then, when you start a new conversation, I retrieve all those long-term memory summaries and incorporate them into the current conversation. This way, if you ask a question related to something we've previously discussed, you'll get an informed answer. It's super cool and very powerful.
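The flow described above can be sketched in a few lines. This is a minimal illustration, not the repo's actual code: a plain dict stands in for MongoDB, and `summarize` is a stand-in for the LLM summarization call.

```python
# Sketch of the short-term / long-term memory flow. The dict store and the
# summarize() stand-in are illustrative assumptions; the real app persists
# to MongoDB and summarizes with an LLM.

def summarize(conversation):
    """Stand-in for an LLM summarization call: joins the Q/A pairs."""
    return " | ".join(f"Q: {q} A: {a}" for q, a in conversation)

long_term_store = {}  # profile_id -> list of conversation summaries

def end_conversation(profile_id, short_term):
    """Summarize the finished conversation and store it as long-term memory."""
    long_term_store.setdefault(profile_id, []).append(summarize(short_term))

def start_conversation(profile_id):
    """A new conversation begins with all prior summaries as context."""
    return list(long_term_store.get(profile_id, []))

# Usage: finish one conversation, then start a new one with its summary.
stm = [("What is the largest ocean on Earth?", "The Pacific Ocean.")]
end_conversation("msulio", stm)
context = start_conversation("msulio")
```

The key design point is that short-term memory is raw Q/A exchanges while long-term memory is compressed summaries, which keeps the context passed into each new conversation small.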
    So, let's get started. First, we need a MongoDB account, which can be cloud-based or local. I'm using the cloud version. I've created a cluster called "long-term memory cluster," and currently, there are no collections. We'll start by adding some information.
    The first step is to create a profile. When you create one, MongoDB generates a profile ID. Let's check that it's updating correctly. Now we have a collection called "profiles," and my user ID is "msulio." I'll run the app using this ID. The app looks up the profile, retrieves the profile ID, and starts a conversation, checking whether there's any long-term memory associated with my profile. If there were previous long-term-memory chats, they would be included.
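The profile step might look like the following hypothetical sketch, where a plain dict stands in for the MongoDB "profiles" collection and the field names are assumptions:

```python
import uuid

# Hypothetical sketch of profile lookup/creation. A dict stands in for the
# MongoDB "profiles" collection; field names are illustrative assumptions.

profiles = {}  # user_id -> profile document

def get_or_create_profile(user_id):
    """Return the existing profile, or create one with a fresh profile ID."""
    if user_id not in profiles:
        profiles[user_id] = {"_id": str(uuid.uuid4()), "user_id": user_id}
    return profiles[user_id]

first = get_or_create_profile("msulio")
second = get_or_create_profile("msulio")  # a return visit finds the same profile
```

Because the lookup is keyed by user ID, a returning user always maps to the same profile ID, which is what ties their long-term memories together across sessions.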
    Now, let's ask a question: "What is the largest ocean on Earth?" and get the response. I've also added a feature to check your long-term memory with the command "LTM." Since long-term memory is currently empty, only the short-term memory of our current conversation is available. I'll ask another question, "How about the second?" which only makes sense if the previous conversation is considered. There you go, good stuff! The short-term memory is intact.
    I'll end the conversation by saying "thanks," which saves the long-term memory. Let's check MongoDB now. Refreshing the page, you'll see new entries. The conversations are stored as arrays, and you can recall previous conversations anytime. The summary includes the questions asked and their answers, ready for the next interaction.
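A stored long-term-memory document might be shaped roughly like this; the field names are assumptions for illustration, and the actual schema in the repo may differ:

```python
# Illustrative shape of a long-term-memory document as described in the video:
# conversations stored as an array, each with its exchanges and a summary
# ready for the next interaction. Field names are assumptions.
memory_doc = {
    "profile_id": "example-profile-id",
    "conversations": [
        {
            "exchanges": [
                {"question": "What is the largest ocean on Earth?",
                 "answer": "The Pacific Ocean."},
                {"question": "How about the second?",
                 "answer": "The Atlantic Ocean."},
            ],
            "summary": "User asked about the largest and second-largest "
                       "oceans; answered Pacific and Atlantic.",
        }
    ],
}
```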
    Let's try a different example. I'll use "msulio" again and ask, "Show me a Python example of if-else." The system provides a nice response and knows that the long-term memory doesn't pertain to this new question. My short-term memory only includes the current question. I can ask, "What did I ask?" and it accurately retrieves the previous question and answer.
    After exiting, you'll see two summary locations in MongoDB. There are now two conversations stored. A quick code walkthrough shows the main app setup, where it configures long-term and short-term memories, simulates a login, retrieves profile information, initiates a conversation, pulls in long-term memory, and loops through question and answer exchanges.
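The main loop from the walkthrough can be sketched as follows. The answer function is a stand-in for the LLM call, and the question source is a list so the session can be simulated without user input:

```python
# Minimal sketch of the main loop: iterate over question/answer exchanges,
# keep short-term memory, and stop on an exit word. answer_fn is a stand-in
# for the real LLM call.

EXIT_WORDS = {"exit", "quit", "bye", "end", "done", "thanks"}

def run_chat(questions, answer_fn, long_term):
    """Loop through exchanges, accumulating short-term memory."""
    short_term = []
    for q in questions:
        if q.lower() in EXIT_WORDS:
            break
        short_term.append((q, answer_fn(long_term, short_term, q)))
    return short_term

# Simulated session with a canned answer function.
stm = run_chat(
    ["What is the largest ocean on Earth?", "thanks"],
    lambda ltm, stm, q: "The Pacific Ocean.",
    long_term=[],
)
```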
    When you type exit words like "exit," "quit," "bye," "end," "done," or "thanks," the conversation ends. If you type "STM" or "LTM," it prints the respective memory. Before sending anything to the model, the app combines long-term memory, short-term memory (the current conversation), and your current question; the model's response comes back and the conversation is updated accordingly.
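The combining step can be sketched as a single prompt-assembly function. The exact prompt format used in the repo is an assumption:

```python
# Sketch of the assembly step: combine long-term memory summaries,
# short-term memory, and the current question into one prompt. The exact
# prompt layout is an assumption, not the repo's actual format.

def build_prompt(long_term, short_term, question):
    parts = []
    if long_term:
        parts.append("Previous conversation summaries:\n" + "\n".join(long_term))
    if short_term:
        parts.append("Current conversation:\n" +
                     "\n".join(f"Q: {q}\nA: {a}" for q, a in short_term))
    parts.append(f"Question: {question}")
    return "\n\n".join(parts)

prompt = build_prompt(
    ["User asked about oceans; answered Pacific and Atlantic."],
    [("Show me a Python example of if-else", "Here is an if-else example...")],
    "What did I ask?",
)
```

Because the summaries and the current exchanges both travel with every question, a follow-up like "What did I ask?" can be answered from context the model is actually given.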
    It's straightforward but unbelievably powerful. The applications for this are going to be fantastic. Have a great day!
  • Science & Technology

Comments • 2

  • @duyhung89
    @duyhung89 • a day ago

    Hello Michael, I love this video very much and watched it twice, but for someone with no coding experience like me, it would be quite difficult to build! Could you make another video for no-code folks like me? I think many, many people would love that, because it's super cool! Thanks in advance.