Really appreciate you creating this content and giving me credit. I have to admit that your term "nested tool" is much better than my "function dependency." 😅 I was trying to find a good term, and yours is definitely better. Again, thank you so much for reviewing this. Hopefully, it can help your audience. I'm a fan of your content and channels, and I wish you and everyone contributing to the open-source community the best. May the force be with you.
Thank you for putting this together. Glad you approve of "Nested Tool" :)
Looking forward to showcasing some of your other work as well. Keep up the great work. I also love the open-source community because of its collaborative nature and what it brings to the table :)
I have been building agents with the old Mixtral 8x7B and Mistral 0.2, and I have to say Mixtral often outperforms models that have been trained for function calling. Eager to see how this new Mistral performs.
Seems like it's much smarter than what we have been able to achieve so far. I think specialized models are the way to go.
Looking forward to a LangGraph implementation of this
That would actually be good. Let me see what I can put together.
Maybe this question isn't about this video's topic, but I want to know how I can create a smart search on my PHP website based on a local LLM with local storage.
I have a video on chatting with your website that might be a good start: (th-cam.com/video/RBnuhhmD21U/w-d-xo.html). It's a very old one, though.
GPU out of memory... please help
It's not going to run on the free Colab. You need an A100 for this one.
@@engineerprompt thanks 🥲🥲
Can this model have a conversation with the user, call the appropriate functions when necessary, and use those function results to answer the user's question in a conversational manner?
Yes, that is the goal. If you ask a question that doesn't require a function call, it will act like a normal LLM; when the input does require a function call, it will use the function to get the information and then reply to the user with the relevant info.
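The flow described above (reply directly when no tool is needed, otherwise call a function and fold its result into a conversational answer) can be sketched roughly as follows. This is a toy illustration, not the model's actual API: `model_decide`, the `TOOLS` registry, and the weather tool are all hypothetical stand-ins for the LLM's routing decision and real tool schemas.

```python
# Toy registry of callable tools; a real setup would pass these schemas to the model.
TOOLS = {
    "get_weather": lambda city: {"city": city, "temp_c": 21},  # fake data for the sketch
}

def model_decide(user_input):
    """Stand-in for the LLM's routing decision (hypothetical keyword logic).

    Returns a tool call when the input needs external info, else a direct reply.
    """
    if "weather" in user_input.lower():
        city = user_input.rstrip("?").split()[-1]
        return {"tool": "get_weather", "args": {"city": city}}
    return {"reply": "Happy to chat! " + user_input}

def chat_turn(user_input):
    decision = model_decide(user_input)
    if "tool" in decision:
        result = TOOLS[decision["tool"]](**decision["args"])
        # Feed the tool result back so the final answer stays conversational.
        return f"It's {result['temp_c']}°C in {result['city']}."
    return decision["reply"]
```

With a real function-calling model, the routing step is done by the model itself (it emits a structured tool call), but the loop shape is the same: decide, optionally call, then answer in natural language.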
@@engineerprompt Thanks! Would you be able to do a video on how to do that exactly?
Cool, does it also work with GGUF models?
Yes, it should
chainlink
Great video
Thanks 😊