Build a Perplexity-Inspired Answer Engine Using Groq, Mixtral, Langchain, Brave & OpenAI in 10 Min
- Published Sep 12, 2024
In this video, I walk you through the step-by-step process of creating your own answer engine, much like Perplexity, but using cutting-edge technologies including Groq, Mistral AI's Mixtral 8x7B, LangChain, OpenAI Embeddings, and the Brave Search API. This tutorial is designed for those interested in implementing such a system in a JavaScript or Node.js stack. I show you how to configure the engine to deliver not just answers but also sources and potential follow-up questions in response to queries.

The journey begins with the initial setup of our project, where I guide you through managing API keys from OpenAI, Groq, and the Brave Search API. From there, we move on to initializing an Express server to handle incoming requests. I place a strong emphasis on the speed of the inference process and share insights on optimizing various components: the embeddings model, how we handle search engine requests, the text-chunking method, and the details of query processing.

As we progress, I demonstrate how to curate response content carefully, introduce streaming for more dynamic answers, and automate the generation of insightful follow-up questions. The tutorial rounds off with the final touches needed to get our server up and running smoothly.
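The text-chunking and embedding-similarity steps mentioned in the description can be sketched in plain JavaScript. This is a minimal illustration under stated assumptions, not the repo's actual code: `chunkText` and `cosineSimilarity` are hypothetical helper names, and a real pipeline would embed the chunks via OpenAI's embeddings API before ranking them against the query.

```javascript
// Hypothetical helpers sketching the chunking + similarity-ranking idea.
// Names and parameter values are illustrative, not from the repo.

// Split a document into overlapping character chunks so each piece stays
// small enough to embed while preserving context across chunk boundaries.
function chunkText(text, chunkSize = 400, overlap = 50) {
  const chunks = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
  }
  return chunks;
}

// Compare a query embedding against a chunk embedding; higher means
// more relevant. Both arguments are plain arrays of numbers.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

In a pipeline like the one described, you would chunk each search result's page text, embed every chunk plus the query, then keep only the top-scoring chunks as context for the model.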
For those eager to dive in and start experimenting on their own, I'll be providing a link to download the entire repository in the video description soon.
This is your chance to get hands-on experience and truly understand the ins and outs of building an advanced answer engine. And if you find this video helpful, don't forget to support the channel by subscribing and sharing it with others who might benefit from this tutorial. Stay tuned for more updates and happy coding!
Repo is now live: github.com/developersdigest/llm-answer-engine !🤗
Thanks a ton for this, I hope you get to 100k subs soon. Looking forward to your repo on this one.
That is kind of you to say - thank you so much, Todd!
Repo is now live: github.com/developersdigest/llm-answer-engine
Thank you so much for putting this out! I am a Node dev by day, and this helps accelerate learning about LLMs and all the killer tech around them right now!
Not to mention I have been wanting to play with Groq, so win-win all around! Wicked cool content, thanks again!
Thank you so much for your comment - I love hearing that!
extremely impressive! thank you!
Glad you liked it! Thank you for watching!
Really excellent demo !
Thanks so much 😊!
DD, another good video. Out of curiosity, is the description LLM generated? It's not often that people pay attention to proper prose in their descriptions.
Nice
Cheers! 🥂
Hey Bud thanks for sharing.
Cheers mate 🥂
Great video, learned so much! I had a question concerning the answer portion of the app. Is Groq only referencing sources, or is it able to access those sources and produce an enhanced response? It seems currently that the sources are links Groq can only present to the user.
I think it makes more sense when you get the streaming in the response as well. Otherwise the client doesn't get any benefit from the streaming.
Absolutely - I will have more full-stack examples with exactly this coming soon... Stay tuned
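The streaming idea discussed here can be sketched as follows. This is a minimal illustration with made-up names (`streamTokens`, `fakeCompletion`), not the actual implementation: in a real Express handler, `write` would be `res.write`, the token source would be the LLM's streaming API, and you would call `res.end()` once the iterator is exhausted.

```javascript
// Push each token to the client as soon as the model produces it,
// instead of buffering the whole completion.
async function streamTokens(tokenIterable, write) {
  for await (const token of tokenIterable) {
    write(token);
  }
}

// A fake token source standing in for an LLM streaming API.
async function* fakeCompletion() {
  for (const token of ["Hello", ", ", "world", "!"]) {
    yield token;
  }
}
```

Because `streamTokens` only needs an async iterable and a write callback, the same function works whether the sink is an HTTP response, a WebSocket, or a test harness.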
Thank you Sir!
Thanks for watching!
Can this also accept uploading files? That would be a great feature.
Wonderful, thanks.
What will be the best frontend for this?
Building an example now. Should be out this week. Stay tuned! :)
@@DevelopersDigest That'll be great. Thanks.
@@DevelopersDigest a complete guide to running this live would be awesome; your content is fantastic for a newbie, I'm learning a ton
any update on this?
@@AIEntusiast_ It's taking a little bit longer than I expected... hoping to have it ready and recorded this week!
Thanks. Any chance to get the promised repo?
Yes! Repo is now live: github.com/developersdigest/llm-answer-engine :)
How do you get the POST to localhost and everything else running there?
I use Thunder Client as a VS Code plugin for my POST requests when testing endpoints; something like Postman can be used as well!
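As a scripted alternative to Thunder Client or Postman, a plain `fetch` call also works (it is global in Node 18+). The port, route, and request body shape below are assumptions for illustration, not necessarily what the video uses.

```javascript
// Hypothetical helper for hitting a local answer-engine endpoint.
// Adjust the URL and body shape to match your server's actual route.
async function askEngine(question) {
  const res = await fetch("http://localhost:3000/", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message: question }),
  });
  return res.json();
}
```

Calling `askEngine("What is Groq?")` from a small script lets you test the endpoint without leaving the terminal.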
When will you drop the code?
Repo is now live: github.com/developersdigest/llm-answer-engine :)
Can you provide the code please?
Repo is now live: github.com/developersdigest/llm-answer-engine :)
@@DevelopersDigest thank you!