The only place where I found a good explanation of llama prompts and how to supply chat history.
Appreciate the video, but I'm just a novice trying to learn something, and those import statements just don't work any more. Here's what I've found so far:
* "from llama_index.llms import OpenAI" is now "from llama_index.llms.openai import OpenAI"
* if you're playing with the ChatMessage at 2:31, the import is "from llama_index.core.llms import ChatMessage", but the "print(response.text)" should simply be "print(response)"
* the templates example around 3:16 "from llama_index import Prompt" should be "from llama_index.core import PromptTemplate"
etc., etc. I strongly recommend checking the documentation for the correct import statements if you're trying to follow this, or this video will drive you insane. I'm surprised LlamaIndex links to these outdated tutorials in their beginner examples, because I nearly gave up completely after all the issues I was having!
Very straightforward, thanks.
Please correct me if I'm wrong, but the chat example didn't actually use the user's own data. The chat history prompt doesn't contain the additional ingested context (the "starter_example.md" content). That's why the generated response is so general.
Right, so how is this achieved though?
The context is manually given to the LLM in this case. With a retriever, the context would instead be the relevant retrieved documents, right?
Great content; unfortunately very poor resolution, only 720p in 2024...
Exactly
Sorry, but the repo for this lesson is hopelessly out of date. I'm spending more time than it's worth chasing down new syntax for simple concepts like the Prompt module imports. If you were just a third party, OK, nbd. But you're the vendor!