How does OpenAI Function Calling work?

  • Published 30 Nov 2024

Comments • 36

  • @heythere7528
    @heythere7528 several months ago +9

    You are the only one who explained that the functions are not actual functions; they're just references.

    • @learndatawithmark
      @learndatawithmark several months ago +1

      yeh - it's not the best piece of naming!
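
To make the point in this thread concrete, here is a minimal sketch of what actually gets sent and returned, assuming the weather example used in the video and the current OpenAI Python SDK; the model name, the schema fields, and the prompt are illustrative assumptions, not taken from the video.

```python
from openai import OpenAI
import json

client = OpenAI()

# The "function" is only a JSON description of a callable -- the API never runs it.
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a location",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {"type": "number"},
                "longitude": {"type": "number"},
            },
            "required": ["latitude", "longitude"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "What's the weather in Berlin right now?"}],
    tools=tools,
)

# The model only tells us which "function" it wants called and with which arguments;
# actually executing something with those arguments is entirely up to our own code.
tool_call = response.choices[0].message.tool_calls[0]
print(tool_call.function.name)                   # e.g. "get_current_weather"
print(json.loads(tool_call.function.arguments))  # e.g. {"latitude": 52.52, "longitude": 13.4}
```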

  • @bibstha
    @bibstha 5 months ago +4

    As a developer, this is great. Thanks for clarifying right at the beginning that it's us who have to make the function call, and 👍 to the sequence diagram.

  • @AlexX-xtimes
    @AlexX-xtimes 5 months ago +4

    The best simple explanation I have found about Function Calling. Thanks for making it so easy to understand!

  • @Pan_ai_art
    @Pan_ai_art 6 days ago

    Nice video, with understandable real-time examples

  • @florentchif4551
    @florentchif4551 several months ago

    Jesus, I spent one day reading the OpenAI documentation and I could not figure it out. Even their example, I was like... how the heck is their function working? Thanks!!

  • @chenghung0510
    @chenghung0510 4 months ago +1

    Very clear introduction to OpenAI function calls.
    This video is super useful for me to understand function calls.

    • @learndatawithmark
      @learndatawithmark 4 months ago +1

      Great! Glad it was useful - let me know if there are any other topics in this area you'd like me to cover next.

  • @terryliu3635
    @terryliu3635 3 days ago

    Thanks for the explanation. Got a question... with tools being passed to the OpenAI API, how does the API handle the "tools" when calling the LLM?

  • @jorgesanabria6484
    @jorgesanabria6484 6 months ago

    Perfect explanation, all other videos were long-winded

    • @learndatawithmark
      @learndatawithmark 6 months ago +2

      haha, thanks! I'm trying my best to explain things in under 5 minutes :)

  • @abderahimmazouz2088
    @abderahimmazouz2088 3 months ago

    I had to set the playback speed to 0.75 to keep up with you, lol.
    Thank you, very good explanation sir

  • @michaelvilarino5464
    @michaelvilarino5464 4 months ago +1

    I have several functions implemented in my project, each responsible for a specific task. However, frequently, when the user requests the extraction of information that should trigger function A, other functions (like B or C) are called instead. Although function A is correctly triggered on some occasions, this does not happen consistently. Why is this happening? I am using OpenAI's GPT-4 model.

  • @rahulagalcha-rs6kg
    @rahulagalcha-rs6kg 5 months ago +1

    Thanks for the great explanation :)
    I have one query: how many API calls does it support? For example, in AWS (Bedrock) it supports 5 APIs per agent.

  • @vuquangtruong5950
    @vuquangtruong5950 3 months ago

    Your video is the best. I am just wondering, is there any way to set a fixed number of items in an array from an OpenAI response when you are using function calling?

  • @CaribouDataScience
    @CaribouDataScience 7 months ago +2

    Thanks

  • @vladimirnicolescu1342
    @vladimirnicolescu1342 5 months ago

    If I have multiple functions, how does the "for tool_call in tool_calls" loop work with different functions? The function_response parameter has the latitude and longitude hardcoded as arguments. What if my other functions deal with other stuff and don't require lat and long as arguments, but other arguments? I'm really confused by that bit. If I have more functions, do I just add a "match case" check to pass the correct arguments for each function?

    • @learndatawithmark
      @learndatawithmark 5 months ago +1

      Yeh this is a bit hardcoded for a single function. The arguments are in that function_args variable, so you could pass them in using the kwargs syntax. Maybe I'll make another video to show how to do that.
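
For anyone hitting the same question, a minimal sketch of the kwargs idea from the reply above: look each tool call up in a name-to-function map and unpack its arguments. The local function names are hypothetical, and tool_calls / messages are assumed to come from an earlier chat completion, as noted in the comments.

```python
import json

# Hypothetical local implementations -- your real functions go here.
def get_current_weather(latitude, longitude):
    return {"temperature_c": 18}

def get_stock_price(ticker):
    return {"price": 123.45}

# Map each tool name (as declared in the tools schema) to the matching callable.
available_functions = {
    "get_current_weather": get_current_weather,
    "get_stock_price": get_stock_price,
}

# Assumed to exist already:
#   tool_calls = response.choices[0].message.tool_calls
#   messages   = the conversation so far, ending with that assistant message
for tool_call in tool_calls:
    function_to_call = available_functions[tool_call.function.name]
    function_args = json.loads(tool_call.function.arguments)

    # **function_args unpacks whatever arguments the model produced for *this*
    # call, so nothing is hardcoded -- each function receives only its own kwargs.
    function_response = function_to_call(**function_args)

    messages.append({
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": json.dumps(function_response),
    })
```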

  • @amanvojha
    @amanvojha 5 months ago

    How did you get the longitude and latitude to pass to the function? Were they in the response from the LLM?

    • @learndatawithmark
      @learndatawithmark 5 months ago

      Yes - the LLM works out the lat/long based on the location that we used in our prompt.

  • @chriswooohoo4518
    @chriswooohoo4518 5 months ago

    Hi Mark. I have an array of about 60 objects. Each object has a length property with a non-null value. Each object has a text property with a null value. I've been trying for weeks after work to craft a prompt that will force OpenAI to generate a word - any English word for now - that has a character count equal to the length property. I have about an 80% error rate - rarely can it generate a word with a character count equal to the length value. The error rate goes down to about 60% when I say it doesn't have to be a word. Do you feel that this function calling double API request is the only way to fix this issue and guarantee a 100% success rate? Thanks.

    • @learndatawithmark
      @learndatawithmark 5 months ago +1

      I wonder whether something like Instructor might be better for trying to get structured output. I haven't tried it with something like what you're trying, but I might play around with it over the weekend - github.com/jxnl/instructor

    • @chriswooohoo4518
      @chriswooohoo4518 5 months ago

      @@learndatawithmark Thanks for the reply Mark. I should have been clearer: the structure is always intact, the issue is that LLMs don't seem to be able to count. Very ironic - the technology that will eventually "take over our planet" can't tell you how many characters are in this message, nor how many objects are in a given array, let alone create strings with lengths equal to a value *you* supply. Hopefully that issue gets solved before we hand over military systems to our AI bots ;)

    • @learndatawithmark
      @learndatawithmark 5 months ago +1

      @@chriswooohoo4518 I know, it's not intuitive at all. It's got me thinking whether the problem could be solved with Guidance, another tool that tries to control the output of LLMs - github.com/guidance-ai/guidance
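
A rough sketch of the Instructor idea suggested a couple of replies up, for anyone curious: a Pydantic validator rejects outputs whose text doesn't match the length, and Instructor re-asks the model when validation fails. This hasn't been verified against this exact use case; the model name, field names, and retry count are assumptions.

```python
import instructor
from openai import OpenAI
from pydantic import BaseModel, field_validator

class Word(BaseModel):
    length: int
    text: str

    @field_validator("text")
    @classmethod
    def text_matches_length(cls, value, info):
        # "length" is validated before "text", so it is available in info.data.
        expected = info.data.get("length")
        if expected is not None and len(value) != expected:
            raise ValueError(f"expected {expected} characters, got {len(value)}")
        return value

# Instructor wraps the OpenAI client so responses are parsed into the Pydantic model,
# and validation errors are fed back to the model for another attempt.
client = instructor.from_openai(OpenAI())

word = client.chat.completions.create(
    model="gpt-4o-mini",   # illustrative
    response_model=Word,
    max_retries=3,         # re-ask the model if the validator rejects the output
    messages=[{"role": "user",
               "content": "Pick any English word that is exactly 7 characters long. "
                          "Return length=7 and the word as text."}],
)
print(word)
```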

  • @fullstackailab
    @fullstackailab 7 months ago

    Couldn't we do the same approach with LangChain's Agents?

    • @learndatawithmark
      @learndatawithmark 6 months ago

      Yes I think so. I don't know for sure, but I would think the agents might use this API if/when they make calls to OpenAI?

    • @eddymison3527
      @eddymison3527 3 months ago

      I think you can. But I like this approach more.

  • @ihebakermi943
    @ihebakermi943 5 days ago

    gooooooood

  • @RanaAnas-s2h
    @RanaAnas-s2h 6 months ago

    Then why LangChain agents?
    You get more control by using OpenAI function calling.

    • @learndatawithmark
      @learndatawithmark 5 months ago

      I haven't played with LangChain agents yet. I assume they are more powerful than what we showed in this video.

    • @RanaAnas-s2h
      @RanaAnas-s2h 5 months ago

      ​@@learndatawithmark
      I am developing an app where users can choose a topic or upload a book/PDF for conversation and create multiple personas to get responses. I am not using LangChain; I am using OpenAI function calling and high-level prompting.

  • @GG-uz8us
    @GG-uz8us 3 months ago

    For real AI, I don't think prompt engineering is needed. So that brings a question: are those GPTs real AI?

    • @learndatawithmark
      @learndatawithmark 2 months ago

      The definition of AI seems to evolve to be whatever we can't currently do! So I dunno, I suppose not.

  • @imnbsp
    @imnbsp 5 months ago

    Why the fuck is everybody on the web giving just this weather example?
    Can't you be more creative and authentic?
    What the hell do I do if, for example, I need to translate an array of strings and always return a consistent and formatted result?

    • @learndatawithmark
      @learndatawithmark 5 months ago +2

      Hey - I thought it'd be easier to explain if I used the example that people are familiar with.
      I did make another video a while back where I had it generate an array of objects and their sentiment. You can see that here - th-cam.com/video/lJJkBaO15Po/w-d-xo.html