Love your channel bro - working on learning a lot of this stuff right now
Thanks man appreciate that! If you have questions, let me know
Awesome stuff bro.
Thank you 🙏
Nice vid. OK, what if I want to generate C# instead of Python? Can it do that? Might be a good vid for you.
This is a really smart move! Question: if I wanted to use this agent in one of my apps instead of using the llm directly with an api call (like localhost:1234/v1/chat/completions), how would you do that? I'm looking for something that looks and behaves like the regular api-endpoint but behind there's an agentic framework, maybe even rag. Could be a very noob question, but would be thrilled to hear if that exists or is possible.
Gotcha, so something you could do is use FastAPI. Put an initiate_chat call inside an endpoint, then you can just call that endpoint like localhost:5000/agent/get-code. This would start the chat, and you could have it return the last message from the chat.
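To make the idea concrete, here's a minimal sketch of that pattern: an HTTP endpoint that kicks off an agent chat and returns only the last message. To keep it dependency-free it uses Python's standard-library `http.server` rather than FastAPI, and `run_agent_chat` is a hypothetical placeholder for whatever your agentic framework exposes (e.g. AutoGen's `initiate_chat`, possibly with a RAG step in front). The port and the `/agent/get-code` path are just the examples from the comment above.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# Hypothetical stand-in for an agentic-framework call such as
# AutoGen's initiate_chat; replace with your real agent / RAG pipeline.
def run_agent_chat(prompt: str) -> str:
    return f"agent reply to: {prompt}"

class AgentHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/agent/get-code":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        # Start the chat and keep only the last message for the caller.
        last_message = run_agent_chat(body.get("prompt", ""))
        payload = json.dumps({"last_message": last_message}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        # Silence per-request logging for this sketch.
        pass

def serve(port: int) -> HTTPServer:
    # Run the server on a background thread so the caller isn't blocked.
    server = HTTPServer(("127.0.0.1", port), AgentHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = serve(5001)
    req = Request(
        "http://127.0.0.1:5001/agent/get-code",
        data=json.dumps({"prompt": "write hello world"}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req) as resp:
        print(json.loads(resp.read())["last_message"])
    server.shutdown()
```

From your app's point of view this looks just like calling `localhost:1234/v1/chat/completions`, except the response comes from the agent workflow instead of the raw LLM. With FastAPI the handler shrinks to a single decorated function, and you'd return the last message from the chat history it produces.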
@TylerReedAI you are a legend, thank you very much! Will try! Really appreciate your content, thank you for what you do for the community.