Function Calling with ANY LLM for Local AI Agents (Feat. LangChain, HuggingFace, and Llama 3)

  • Published Nov 2, 2024

Comments • 10

  • @antoinelaborde5697
    @antoinelaborde5697 6 days ago +1

    Great work! Thanks! I saw in an official LangChain tutorial that you can use "bind_tools" to give an LLM access to a tool directly. I tried to bind a tool on a ChatHuggingFace object, and it did work on the condition that you use the HuggingFace endpoint for the LLM... It doesn't work when the LLM is loaded locally. I wonder if I should create an issue. Have you tried any of that?

    • @ColeMedin
      @ColeMedin  5 days ago +1

      Thank you - you bet! ChatHuggingFace doesn't support function calling at this point unfortunately, I REALLY wish it did!

  • @rogerbusse5169
    @rogerbusse5169 3 months ago +1

    I like the memes

  • @samadislam4458
    @samadislam4458 2 months ago +1

    But we only have to use instruct models right?

    • @ColeMedin
      @ColeMedin  2 months ago

      @@samadislam4458 Could you clarify what you are asking? This setup will allow you to use any model with function calling - not just instruct ones!

    • @samadislam4458
      @samadislam4458 2 months ago

      You understood it correctly, but you also used the Llama instruct model, and all the other examples I have seen also use instruct models only.

    • @ColeMedin
      @ColeMedin  2 months ago +1

      @@samadislam4458 Ohhh got it! Yeah those are just my favorite models because they generally work the best as a chatbot for me but that doesn't mean those are the only ones you can use well!

    • @samadislam4458
      @samadislam4458 2 months ago

      @@ColeMedin But if the task is very complex and requires multiple function calls in any sequence or order based on the output of a function, then writing the parser is going to be hell 😂
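
      The parser the commenter is worried about can at least be sketched compactly. The following is a minimal illustration (not the video's actual code) of the parse-and-dispatch loop such a setup needs: the tool names, the `run_tool_call` helper, and the assumed `{"tool": ..., "args": {...}}` output format are all hypothetical choices for this sketch.

```python
import json

# Hypothetical tool registry -- names and signatures are illustrative only.
TOOLS = {
    "add": lambda a, b: a + b,
    "get_weather": lambda city: f"Sunny in {city}",
}

def run_tool_call(model_output: str):
    """Parse a JSON tool call emitted by the LLM and dispatch it.

    Assumes the model was prompted to answer with
    {"tool": "<name>", "args": {...}} whenever it wants to call a function.
    Returns None when the output is a plain-text answer instead.
    """
    try:
        call = json.loads(model_output)
    except json.JSONDecodeError:
        return None  # not JSON -> treat as a normal text reply
    if not isinstance(call, dict):
        return None  # JSON, but not a tool-call object
    tool = TOOLS.get(call.get("tool"))
    if tool is None:
        return None  # unknown or missing tool name
    return tool(**call.get("args", {}))

# Simulated model output exercising the parser:
result = run_tool_call('{"tool": "add", "args": {"a": 2, "b": 3}}')
print(result)  # -> 5
```

      Chaining multiple calls, as the commenter describes, amounts to running this in a loop: feed each tool result back into the conversation until the model returns plain text instead of another tool call.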

    • @xspydazx
      @xspydazx 2 months ago

      @@ColeMedin Not true: in fact you can, but they will give you big problems. You need to use a messaging-type model if you're using a messaging-type prompt setup! But if you're not using that type — i.e. you're using a normal Mistral prompt, or input/response, or Alpaca, etc. — you need the prompt format to match the setup you'll be using for your function caller. Here in this example he is using the messaging-type setup, so look for models which have already been trained that way!
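
      The distinction this commenter draws can be shown with plain strings: a chat-tuned model expects a messaging-style input, while an Alpaca-style model expects one flat instruction/response prompt. Both templates below are simplified sketches, not the exact templates shipped with any particular model.

```python
# Messaging-style setup -- what chat-tuned models such as Llama 3 Instruct
# are trained on (a list of role-tagged messages):
messages = [
    {"role": "system", "content": "You are a helpful assistant with tools."},
    {"role": "user", "content": "What is the weather in Paris?"},
]

# Flat Alpaca-style setup -- a single instruction/response prompt string:
alpaca_prompt = (
    "### Instruction:\n"
    "What is the weather in Paris?\n\n"
    "### Response:\n"
)
```

      The commenter's point is that a model fine-tuned on one of these formats will generally follow tool-calling instructions poorly when prompted in the other, so the prompt setup and the model's training format have to match.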