Function Calling using Open Source LLM (Mistral 7B)

  • Published 28 Sep 2024
  • In this tutorial, I delve into the fascinating world of function calling using the open source Large Language Model (LLM), Mistral 7B. Function calling is a powerful tool that can significantly enhance the capabilities of Gen AI applications. It allows for the integration of external web APIs, the execution of custom SQL queries, and the development of stable, reliable AI applications. By leveraging function calling, we can extract and leverage relevant information from diverse data sources, opening up a plethora of possibilities for developers and researchers alike.
    Throughout this video, I demonstrate how to effectively utilize Mistral 7B for function calling. Whether you're looking to integrate external data into your AI project, execute complex queries, or simply explore the potential of open-source LLMs, this tutorial has got you covered.
    Your support is invaluable to the continuation and improvement of content like this. If you found this tutorial helpful, please don't forget to like, comment, and subscribe to the channel.
    GitHub: github.com/AIA...
    To further support the channel, you can contribute via the following methods:
    Bitcoin Address: 32zhmo5T9jvu8gJDGW3LTuKBM1KPMHoCsW
    UPI: sonu1000raw@ybl
    Join this channel to get access to perks:
    / @aianytime
    Your contributions help in sustaining this channel and in the creation of informative and engaging content. Thank you for your support, and I look forward to bringing you more tutorials and insights into the world of Gen AI.
    #llm #mistral #ai
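
The flow the description outlines (a schema in the prompt, a JSON reply from the model, a dispatched Python function) can be sketched end-to-end. Everything here is illustrative rather than taken from the video's notebook: the `get_stock_price` tool, the schema layout, and the `{"function": ..., "arguments": ...}` reply shape are assumptions, and the model call is replaced by a hard-coded reply since running Mistral 7B inline isn't practical.

```python
import json

# Hypothetical tool the model can call; name and schema are illustrative,
# not from the video's notebook.
def get_stock_price(symbol: str) -> dict:
    # Placeholder data instead of a real market-data API call.
    return {"symbol": symbol, "price": 123.45}

TOOLS = {"get_stock_price": get_stock_price}

TOOL_SCHEMA = {
    "name": "get_stock_price",
    "description": "Get the latest price for a stock ticker symbol.",
    "parameters": {"symbol": {"type": "string"}},
}

def build_prompt(question: str) -> str:
    # Instruct the model to answer with a single JSON object.
    return (
        "You can call this function by replying with JSON only:\n"
        f"{json.dumps(TOOL_SCHEMA)}\n"
        'Reply as {"function": ..., "arguments": {...}}.\n'
        f"Question: {question}"
    )

def dispatch(model_reply: str) -> dict:
    # Parse the model's JSON reply and invoke the matching function.
    call = json.loads(model_reply)
    func = TOOLS[call["function"]]
    return func(**call["arguments"])

# Stand-in for the model's completion; a real run would send
# build_prompt(...) to Mistral 7B and read its output.
fake_reply = '{"function": "get_stock_price", "arguments": {"symbol": "GOOG"}}'
answer = dispatch(fake_reply)
print(answer)  # {'symbol': 'GOOG', 'price': 123.45}
```

In a real run the only change is that `fake_reply` comes from the model; the schema, dispatch table, and parsing stay the same.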

Comments • 46

  • @hanantabak
    @hanantabak 2 months ago

    Thanks for being both informative and also honest without cutting the attempts that didn’t work at first. It makes this video spontaneous and organic.

  • @aveek4902
    @aveek4902 7 months ago +6

    Hey, awesome work! However, I'm wondering about this function-calling solution: will it be as robust as function calling through the OpenAI API / LangChain / other APIs like the new Mistral one? It seems like all it involves is telling the LLM to output the information in a JSON schema format according to the specified functions, and then playing around with the string output to extract the functions. Let me know if you have any thoughts on whether this approach is fully reliable in forcing the model to maintain the correct format. Otherwise, really appreciate the video and tutorial; it was very helpful!
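
The "play around with the string output" step this comment describes can at least be made defensive. A minimal sketch, with the `get_weather` function name and the reply shape made up for illustration: returning `None` instead of raising lets the caller re-prompt whenever the model drifts from the format.

```python
import json
import re

def extract_function_call(text: str):
    """Best-effort extraction of a {"function": ..., "arguments": ...}
    object from free-form model output. Returns None when nothing
    parseable is found, so the caller can re-prompt the model."""
    # Grab the first {...} span; open models often wrap the JSON in prose.
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if not match:
        return None
    try:
        call = json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
    # Validate the shape before trusting it.
    if not isinstance(call, dict) or "function" not in call:
        return None
    return call

good = 'Sure! {"function": "get_weather", "arguments": {"city": "Pune"}}'
bad = "I cannot call any functions."
ok = extract_function_call(good)
print(ok)                        # parsed dict
print(extract_function_call(bad))  # None
```

This doesn't *force* the model to keep the format (only constrained decoding or a retry loop can approach that), but it turns format drift into a detectable condition instead of a crash.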

    • @xspydazx
      @xspydazx 4 months ago

      Yes ... sorry, I just read your comment after posting.

  • @vedarutvija
    @vedarutvija 5 months ago

    I followed the same code and approach, but I don't see any output other than this:
    Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
    Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.

  • @EmilioGagliardi
    @EmilioGagliardi 7 months ago

    New to this technology, so I'm trying to wrap my brain around it. How does this apply to building agents in AutoGen or CrewAI? Does this video imply that you can't use locally hosted LLMs for agents? My limited understanding is that agents use skills, and skills are Python functions. Do I need the concepts you discussed here in order to use local open-source models for agents, or am I missing something? Thanks!

    • @Stewz66
      @Stewz66 7 months ago

      He isn't speaking to using open-source LLMs in AutoGen/CrewAI; his tutorial isn't directed toward those frameworks. I suppose there might be a way to do function calling in those environments, but it's beyond my skill or experience to comment on.

  • @skeptomai97
    @skeptomai97 7 months ago

    Excellent work!

    • @AIAnytime
      @AIAnytime 7 months ago

      Glad you like it!

  • @tech-genius
    @tech-genius 1 month ago

    Holy shit, is it just me or this guy looks like Sundar Pichai???
    I thought this was a Google demo or smth for a second

  • @SonGoku-pc7jl
    @SonGoku-pc7jl 7 months ago

    thanks! :)

    • @AIAnytime
      @AIAnytime 7 months ago

      Welcome!

  • @parkersettle460
    @parkersettle460 7 months ago

    Hey are you still active, could I get some help?

    • @AIAnytime
      @AIAnytime 7 months ago

      Sure

    • @parkersettle460
      @parkersettle460 7 months ago

      What’s your discord?

    • @parkersettle460
      @parkersettle460 7 months ago

      @@AIAnytime what is your discord?

  • @TheFxdstudios
    @TheFxdstudios 7 months ago +4

    Appreciate this, was just starting this journey.

    • @AIAnytime
      @AIAnytime 7 months ago +1

      Thank you sir 🙏

    • @TheFxdstudios
      @TheFxdstudios 7 months ago +1

      Yw, function calling isn’t exactly straightforward to get working properly. Vid definitely helps!

  • @xspydazx
    @xspydazx 4 months ago

    I think we also need to capture the call for the function and execute it ... with Open Interpreter?
    Instructor to create templates for an expected output, and the interpreter to execute the detected function...
    I see that we can add these functions to the OpenAI wrapper ... so if we use Ollama or LM Studio to host, we can add the functions to the OpenAI client....
    I noticed the model acted differently when using the Hugging Face weights versus the GGUF with llama.cpp, and versus calling it via the API server (llama.cpp / LM Studio)?
    But for function calling it worked best on the LM Studio API ... with the OpenAI client.
    With the Open Interpreter / Instructor combo you can build your own method,
    as Open Interpreter is also a wrapper that intercepts the responses (same as you did) and executes these functions after removing them from the input, passing the message along the wrapper chain ... (hence you don't need chains either, as you're creating your own chains)...

    • @xspydazx
      @xspydazx 4 months ago

      Nice: it seemed to work (not great), but it's the edge of tech!
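
The wrapper-chain idea this thread describes, a layer that intercepts each response, executes any function call it finds, and passes the result along, can be sketched roughly as follows. The `fake_generate` stand-in and the JSON reply shape are assumptions for illustration, not Open Interpreter's or LM Studio's actual API.

```python
import json

def make_intercepting_wrapper(generate, tools):
    """Wrap a text-generation callable so that function-call JSON in its
    output is executed and replaced by the function's result before the
    message travels further along the chain."""
    def wrapped(prompt: str) -> str:
        reply = generate(prompt)
        try:
            call = json.loads(reply)
        except json.JSONDecodeError:
            return reply  # plain text passes through untouched
        func = tools.get(call.get("function"))
        if func is None:
            return reply  # unknown function: leave the reply alone
        result = func(**call.get("arguments", {}))
        return json.dumps({"result": result})
    return wrapped

# Hypothetical stand-in for a locally hosted model's completion call.
def fake_generate(prompt: str) -> str:
    return '{"function": "add", "arguments": {"a": 2, "b": 3}}'

wrapped = make_intercepting_wrapper(fake_generate, {"add": lambda a, b: a + b})
out = wrapped("what is 2 + 3?")
print(out)  # {"result": 5}
```

Because the wrapper has the same call signature as the model it wraps, several can be stacked, which is the "creating your own chains" point made above.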

  • @artsofpixel
    @artsofpixel 7 months ago +1

    This was exactly what I was wondering just a few days back.
    Thanks man, appreciate it ❤

    • @AIAnytime
      @AIAnytime 7 months ago

      Any time! 😊

  • @VijayDChauhaan
    @VijayDChauhaan 7 months ago +2

    Been waiting for a function-calling tutorial from your channel 🦁 Today this lion will just watch the video; tomorrow he'll implement it 😅

    • @AIAnytime
      @AIAnytime 7 months ago +1

      Haha, nice! I already have one created when OpenAI launched function calling: th-cam.com/video/aKkr_lgmihw/w-d-xo.html

    • @VijayDChauhaan
      @VijayDChauhaan 7 months ago +1

      @@AIAnytime I meant to say with open source... But you are the one keeping me up to date in this field, bro ❤❤❤

    • @akashaia
      @akashaia 7 months ago

      Thanks for this

    • @akashaia
      @akashaia 7 months ago

      Waiting for the next part: RAG-based & API-based function calling

    • @AIAnytime
      @AIAnytime 7 months ago

      Sure Akash. That's on the cards!

  • @lalluyoutub
    @lalluyoutub 6 months ago

    Your content is helpful. Thank you.
    Is there an implementation for DataFrame lookup using LLM function calls?

  • @vedarutvija
    @vedarutvija 5 months ago

    Can you please show a function-call example using an API? For example, a weather API.

  • @ChopLabalagun
    @ChopLabalagun 7 months ago

    🤫

  • @jatinkashyap1491
    @jatinkashyap1491 5 months ago

    Appreciate the content. Can't access the Google Colab notebook. Thanks.

    • @xspydazx
      @xspydazx 3 months ago

      Download it from the GitHub repo.

  • @user4-j1w
    @user4-j1w 7 months ago

    Wow... This is amazing

  • @TheRamseven
    @TheRamseven 7 months ago

    Thanks for the video. Is it possible to add external APIs, or only through libraries?

    • @AIAnytime
      @AIAnytime 7 months ago

      Yes, absolutely

  • @muhammadadribmahmud5012
    @muhammadadribmahmud5012 7 months ago

    Is it possible to do function calling with the Qwen 0.5B model?

    • @xspydazx
      @xspydazx 3 months ago

      Yes ... if you intercept the output first, you can check whether it's a function call; if so, execute the function first, then return the result to the model, and when it returns the response it will be complete ... check your response messages for a function-call tag or a final-output tag ...
      Or create the template for a single function and pass only that function:
      use it to execute the Python script on the system or in a Jupyter cell (remote) and return the result ... all cells executed in a notebook have the same format, so the model can just write any function it needs and return the output ... bash or Python ...
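
The intercept-execute-return loop described in this reply can be sketched roughly as follows. The scripted `fake_model`, the `word_count` tool, and the JSON call format are placeholders for a real hosted model, not any particular library's API: each turn the model's output is checked for a function call; if one is found it is executed and the result is fed back, otherwise the output is treated as the final answer.

```python
import json

def run_with_tools(model_step, tools, user_msg, max_turns=5):
    """Loop: call the model, execute any requested function, feed the
    result back, and stop when the model returns a final answer."""
    messages = [{"role": "user", "content": user_msg}]
    reply = ""
    for _ in range(max_turns):
        reply = model_step(messages)
        messages.append({"role": "assistant", "content": reply})
        try:
            call = json.loads(reply)
        except json.JSONDecodeError:
            return reply  # no function-call JSON: this is the final answer
        result = tools[call["function"]](**call["arguments"])
        messages.append({"role": "tool", "content": json.dumps(result)})
    return reply

# Scripted model: first turn requests a tool, second turn answers.
replies = iter([
    '{"function": "word_count", "arguments": {"text": "a b c"}}',
    "The text has 3 words.",
])
def fake_model(messages):
    return next(replies)

tools = {"word_count": lambda text: {"count": len(text.split())}}
final = run_with_tools(fake_model, tools, "How many words in 'a b c'?")
print(final)  # The text has 3 words.
```

A real loop would also need the robustness handling discussed earlier in the thread (malformed JSON, unknown function names), which is elided here for brevity.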

  • @Cam0814
    @Cam0814 7 months ago

    Can you show an example of using agents with LangGraph?

    • @AIAnytime
      @AIAnytime 7 months ago

      Look at my LangGraph video.

    • @Cam0814
      @Cam0814 7 months ago +1

      @@AIAnytime Looks like you're using OpenAI in that video. I would like to do it with Mistral 7B.

    • @kai-yihsu3556
      @kai-yihsu3556 6 months ago

      @@Cam0814 Me too; we want to use Mistral for LangGraph function calling

  • @nikitkashyap9192
    @nikitkashyap9192 7 months ago

    Hi sir,
    I am Nikit. I'm having some trouble choosing the best RAG pipeline. Can you help me with that?

    • @AIAnytime
      @AIAnytime 7 months ago

      Yes, sure

    • @WalkthroughOfHell
      @WalkthroughOfHell 6 months ago

      Hello Nikit, I'm working on a RAG system now. Do you mind if I get your contact to discuss it?