Great work! Thanks! I saw in an official LangChain tutorial that you can use "bind_tools" to give an LLM access to a tool directly. I tried to bind a tool on a ChatHuggingFace object and it did work, but only on the condition that you use the HuggingFace endpoint for the LLM... It doesn't work when the LLM is loaded locally. I wonder if I should create an issue. Have you tried any of that?
Thank you - you bet! ChatHuggingFace doesn't support function calling at this point unfortunately, I REALLY wish it did!
I like the memes
But we only have to use instruct models right?
@@samadislam4458 Could you clarify what you are asking? This setup will allow you to use any model with function calling - not just instruct ones!
You understood it correctly, but you also used the Llama instruct model, and all the other examples I have seen also use instruct models only.
@@samadislam4458 Ohhh got it! Yeah those are just my favorite models because they generally work the best as a chatbot for me but that doesn't mean those are the only ones you can use well!
@@ColeMedin But if the task is very complex and requires calling multiple functions in any sequence or order based on the output of a function, then writing the parser is going to be hell 😂
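For anyone wondering what that hand-rolled parser looks like, here's a minimal sketch in plain Python. The tools (`add`, `multiply`) and the JSON call format are just assumptions for illustration, not what the video actually uses, but the loop shows the multi-step case being discussed: one call's output feeding the next call's arguments.

```python
import json

# Hypothetical tools the LLM is allowed to call (names/signatures are illustrative).
def add(a, b):
    return a + b

def multiply(a, b):
    return a * b

TOOLS = {"add": add, "multiply": multiply}

def dispatch(model_output: str):
    """Parse a JSON function call like
    {"name": "add", "arguments": {"a": 1, "b": 2}}
    out of the model's raw text and execute the matching tool."""
    call = json.loads(model_output)
    fn = TOOLS.get(call["name"])
    if fn is None:
        raise ValueError(f"Unknown tool: {call['name']}")
    return fn(**call["arguments"])

# The multi-step case from the comment: the result of one call is fed
# into the arguments of the next, so the parser runs in a loop.
first = dispatch('{"name": "add", "arguments": {"a": 2, "b": 3}}')
second = dispatch(json.dumps({"name": "multiply", "arguments": {"a": first, "b": 4}}))
print(second)  # 20
```

Real model output is messier than this (extra prose around the JSON, malformed calls), which is exactly why the parser gets painful as the task grows.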
@@ColeMedin Not true: in fact you can, but they will give you big problems. You need to use a messaging-type (chat) model if you're using a messaging-type prompt setup! But if you're not using that type, i.e. you're using a normal Mistral prompt, an input/response format, Alpaca, etc., then you need a model trained on the same setup your function caller will use. In this example he is using the messaging-type setup, so look for models that have already been trained that way!
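To make the two setups being contrasted above concrete, here's a rough sketch of each (the exact template strings vary per model; these are illustrative, not any specific model's official format):

```python
# Messaging-type (chat) setup, as used in the video: a list of role-tagged
# messages that the model's chat template turns into the final prompt.
messages = [
    {"role": "system", "content": "You can call tools by emitting JSON."},
    {"role": "user", "content": "What is 2 + 3?"},
]

# Alpaca / instruction-response style setup: a single formatted string,
# for models trained on that plain-text format instead of chat messages.
alpaca_prompt = (
    "### Instruction:\n"
    "You can call tools by emitting JSON.\n"
    "What is 2 + 3?\n\n"
    "### Response:\n"
)
```

The point of the comment stands either way: whichever format the model was trained on is the one your function-calling prompt needs to follow.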