Function Calling via ChatGPT API - First Look With LangChain

  • Published on Jun 1, 2024
  • Twitter: / gregkamradt
    Newsletter: mail.gregkamradt.com/signup
    Code: github.com/gkamradt/langchain...
    Blog Post: openai.com/blog/function-call...
    0:00 - Intro
    0:57 - Blog Post Overview
    3:27 - OpenAI Simple Example
    7:10 - LangChain Simple Example
    8:59 - LangChain In Depth Example

Comments • 110

  • @hadijannat4821 11 months ago +23

    I appreciate your quick coverage of the latest updates from OpenAI.

  • @felipejaramillo124 11 months ago

    Wow, great video. I'm impressed by how fast you are putting out amazing content. This just came out today!

  • @mrwadams 11 months ago +2

    Great work pulling those examples together so quickly. Really useful to get an idea of what's possible. Thanks. 👍

    • @DataIndependent 11 months ago

      Awesome thanks Mr. Adams!

  • @jessaco.8653 11 months ago +9

    Another topical and beautifully executed video. How is your content always so good and so up to date in this ever changing space?! Stay classy, Greg.

    • @DataIndependent 11 months ago

      Wow! What a great comment - thank you Jessa!

  • @georgesanchez8051 11 months ago +4

    Kor punching the air lol. Always a lovely day when OpenAI releases an update, but man it’s an amazing day when you get a context window upgrade + cost cut at the same time

  • @popothebright 11 months ago

    Mind blown. This is great stuff, beautifully explained.

    • @DataIndependent 11 months ago

      Awesome thanks Pop - what're you building?

  • @sachinreddy2836 2 months ago

    Thanks for sharing the code. Great video man

  • @henkhbit5748 11 months ago

    Great video and thanks for the update👍

  • @gr8tbigtreehugger 11 months ago +1

    That's great news! Most of my code is managing the transformation to/from JSON.

  • @developer_george 11 months ago

    Thank you ! This is the channel to follow right now

    • @DataIndependent 11 months ago

      Nice! Thank you for the support

  • @micbab-vg2mu 11 months ago

    Great information - thank you.

  • @chrisalmighty 11 months ago

    That was quick man.

  • @rajivraghunathan3710 11 months ago

    Excellent!!! Well explained.

  • @yubogao1531 11 months ago

    That's amazing; I appreciate your speed. Only reading OpenAI's documentation is boring, but your video was clear enough that I could make a small demo in a Jupyter notebook to test function calling. Furthermore, if LangChain updates to support this new version of ChatGPT, I hope to see your video about it.

  • @MrDannesboe 11 months ago

    Wow! Really good tutorial

  • @MrDocFP 11 months ago +1

    Thank you so much @Greg! Great videos! Just one question: in which situations would you recommend using Function Calling rather than LangChain Agents? Aren't those different approaches to solving the same problem?

    • @DataIndependent 11 months ago +2

      Function calling is a tool that langchain agents can use.
      If you don’t want to use langchain then you’ll need to deal with the raw openai library which has different ergonomics.
      I would recommend sticking with langchain agents as they provide more support for function calling

    • @MrDocFP 11 months ago

      @@DataIndependent thank you! Just confirmed my idea!

  • @osamammursleen 11 months ago

    This is awesome.

  • @nitinverma6878 7 months ago +1

    Great job explaining function calling. The only thing I'm struggling to understand is the use case. Currently, I'm using a semantic kernel with a stepwise planner. When I provide a one-line prompt like "show me a list of all the tables," the system creates a plan and goes through my functions, often finding the correct function on its own and passing the right parameters to obtain the desired result. However, this process is slow and requires many iterations to achieve the result. So, does this mean we should manually call the functions one by one to achieve the desired result? If so, would I need to select the appropriate function for every prompt or use case, a task that the planner used to handle automatically, or am I completely misunderstanding the process?

    • @nitinverma6878 7 months ago +1

      I tried implementing it and cleared up a few of my doubts. Function calling decides by itself which function to call based on the function list provided. It also retries once or twice in case of an issue. I found it faster than SK, but SK's chain of thought still produces better results, which matters more.
      I'm struggling to get correct results with function calling: in case of any issue, even if I do a manual retry, it gets stuck on the same result.
      Any suggestions?

  • @DylanHumphreys 11 months ago

    Great video, but 4:14 is a bit confusing. enum is (in this example) providing ChatGPT with the list of all possible values for "unit", e.g. celsius or fahrenheit. If you wanted your users to be able to request the temperature in kelvin, and you'd already updated your function to return it, then you would add "kelvin" to that enum list.
    Further, for clarification on the "required" parameter: that lists the values that MUST be supplied to your function. There could be additional parameters which aren't required ... to keep with our example, the elevation, which if not supplied assumes sea level, but is not required for the function to actually run successfully and return something.
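For readers following along, the schema described above can be sketched as a plain Python dict in the OpenAI "functions" format. The field names mirror the video's get_current_weather example; the "kelvin" value and the optional elevation parameter are the commenter's hypothetical extensions, not part of the original demo:

```python
# A sketch of a function definition in the OpenAI "functions" JSON-schema
# format. "kelvin" and "elevation" are hypothetical additions from the
# comment above, shown only to illustrate enum vs. required.
get_current_weather_schema = {
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state, e.g. San Francisco, CA",
            },
            "unit": {
                # enum: the closed set of values the model may pick from
                "type": "string",
                "enum": ["celsius", "fahrenheit", "kelvin"],
            },
            "elevation": {
                # not listed under "required", so the model may omit it
                # (the function then assumes sea level)
                "type": "number",
                "description": "Elevation in meters; defaults to sea level",
            },
        },
        # required: parameters the model MUST supply
        "required": ["location"],
    },
}
```

The distinction in one line: enum constrains *which values* a parameter may take, while required constrains *which parameters* must be present at all.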

    • @DataIndependent 11 months ago

      Thanks for the comment and yep I’m aligned with you.

  • @cholst1 11 months ago +1

    This makes me think of the LLMs as Tool Makers/Users paper from a week or so back.

  • @TomanswerAi 11 months ago +1

    Nice!

  • @greendsnow 11 months ago +1

    That's a beautiful green and that purple suits it so well.
    I did not understand anything from the presentation.

  • @shuntianli9651 9 months ago

    Hello, I have watched most of your videos. Great info. I have a question about how to chat with CSV files containing a large amount of data. Any suggestions?

  • @MiguelFernando 11 months ago +1

    Curious how you see this as different from the OpenAPI agent?

  • @lucasamadsen 11 months ago

    Here's an idea, Greg: explore/finish updating agents and re-do your podcast with Weaviate!!!
    And thanks for the class!

    • @DataIndependent 11 months ago +1

      Nice! thanks for the tip and an agent video should be on the docket.
      I'm not 100% happy with the agent ecosystem right now and rather than putting out a complicated and fragile example I wanted to wait till the tech settles a bit more.

  • @afternuwn 11 months ago

    Appreciate this example, the blog post alone didn't do it for me!
    - One question: with LangChain, can I use my own tools in `format_tool_to_openai_function(t)`? I'm guessing I need to make them a BaseTool type from the docs.

    • @DataIndependent 11 months ago

      Yes! You can make your own custom tools with langchain

  • @MoonKun-tj5rr 11 months ago

    Holy. This opens endless opportunities.

    • @DataIndependent 11 months ago

      totally - what're you building?

    • @MoonKun-tj5rr 11 months ago

      @DataIndependent I'm in the R&D phase of building microenvironments for classification systems mapping while managing token limitations and output volatility by framing prompting through the leverage of functions

  • @codediporpal 11 months ago

    This is *exactly* the issue I've been struggling with, as I'm sure so many others have. The Google recommendation comes through!

  • @brofessorsbooks3352 11 months ago

    During the demo for the weather in SF, there seems to be no real call to a weather api to acquire the weather details. Is the new version of GPT-4 somehow linked to the internet? Furthermore, something that still confuses me is that although GPT-4 is calling a "function", the get_current_weather(..) function doesn't seem to do much aside from make the outputs legible. I thought functions had to be like the Tools we use in Langchain? Thank you!

    • @DataIndependent 11 months ago +1

      This code didn’t actually call an API; it was just a fake function for demo purposes.
      And yes, that’s a big part of it: properly formatting a response as JSON so you can handle it elsewhere.
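For readers curious what such a demo stub might look like, here is a rough sketch in the spirit of the video's example. The values are made up; nothing here calls a real weather API:

```python
import json

# A stand-in "tool" like the one in the demo: no real API call,
# just hardcoded values shaped the way a real function would return them.
def get_current_weather(location: str, unit: str = "fahrenheit") -> str:
    weather = {
        "location": location,
        "temperature": "72",  # fake value for demo purposes
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }
    # Return JSON so the string can be passed straight back to the model
    # as the "function" role message in the follow-up request.
    return json.dumps(weather)
```

Swapping this stub for a real API client is the only change needed to make the demo "live"; the surrounding function-calling flow stays the same.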

  • @rudivonstaden 11 months ago

    Very cool. Is it possible to use these functions in an agent yet? In other words, to get them to execute the function with the prescribed parameters?

    • @bryandoherty2078 11 months ago +3

      Latest version of LangChain supports this:
      agent_chain = initialize_agent(tools, llm, agent=AgentType.OPENAI_FUNCTIONS, verbose=True, memory=memory)

    • @DataIndependent 11 months ago +1

      Yes! To what Bryan said. I made this video before they came out with that update

  • @Rundik 11 months ago +2

    Would you trust a system which can't tell the difference between Celsius and Fahrenheit to make decisions for you?

    • @DataIndependent 11 months ago

      Good question! For certain tasks I would! It's not copy and paste ready yet but it offloads a lot of intelligence for you.

  • @user-js1ce9dr2t 11 months ago

    Great Explanation ❤
    I have a doubt here: how is this function calling feature different from the Agents that we already have in LangChain?
    Agents are also capable of deciding on a tool (a function) based on its description. (Please note that I am a complete beginner in LangChain.)

    • @DataIndependent 11 months ago +2

      Functions are a tool that agents take advantage of. Functions make it much easier to get structured data out of a piece of text.
      We used to have to prompt our way into structured data but this takes care of it for us
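The "structured data out of text" point above can be illustrated with a short sketch. The message literal below is hand-written in the shape of an OpenAI chat completion response (an assumption for illustration; a real one would come back from the API), showing why no prompt-engineered parsing is needed:

```python
import json

# Simulated assistant message containing a function_call, shaped like
# the OpenAI chat completion format. This literal is an illustration,
# not a real API response.
message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_current_weather",
        "arguments": '{"location": "Boston, MA", "unit": "celsius"}',
    },
}

# The arguments arrive as a JSON string: one json.loads call yields
# structured data, with no output-format prompting required.
args = json.loads(message["function_call"]["arguments"])
```

Compare this with the pre-functions approach, where you had to instruct the model to "respond only in JSON" and hope it complied.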

    • @harikirankante3391 11 months ago

      @@DataIndependent OK, got it

  • @user-tz1jb4lh1r 11 months ago +1

    What's the difference between it and a LangChain agent? I think the LangChain agent is even better, since we don't need to continue the interactions manually.

  • @IamalwaysOK 11 months ago +2

    This is truly impressive. However, I notice that you manually executed the request to the language model three times. I wonder how the plugin architecture determines when and how many times to make those requests automatically. Moreover, how do you think it discerns the appropriate stopping point?

    • @DataIndependent 11 months ago +2

      Yes! I did it manually because the conventions aren't quite there yet to do it cleanly w/ automation. I imagine that LangChain will support this first class by the end of the week. Thanks for the call out.
      I think it knows the stopping point by simply looking at the requests and seeing if they were satisfied or not. I would need to play with it more to see how it'll do at scale w/ more requests.

    • @IamalwaysOK 11 months ago

      @@DataIndependent I really appreciate your response. Thank you for the awesome video again!

  • @NikS-jo5pu 11 months ago

    I am assuming that the function_call in OpenAI returns the name of the function to be called, and if we had multiple functions we would use that to call the right one. Right now in your example it is just hardcoded to call get_current_weather. Is that correct?

    • @DataIndependent 11 months ago

      For the top example yes, but check out the bottom example when I have two functions. It returns back w/ which one to pick.
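The pick-between-functions step mentioned above can be sketched as a simple dispatch on the returned name. Only get_current_weather is confirmed from the video; get_stock_price and both bodies are invented here for illustration:

```python
import json

# Hypothetical local implementations standing in for registered tools.
def get_current_weather(location, unit="fahrenheit"):
    return json.dumps({"location": location, "temperature": "72", "unit": unit})

def get_stock_price(ticker):
    return json.dumps({"ticker": ticker, "price": "123.45"})

# With multiple functions registered, dispatch on the name the model
# returns in its function_call instead of hardcoding a single function.
AVAILABLE_FUNCTIONS = {
    "get_current_weather": get_current_weather,
    "get_stock_price": get_stock_price,
}

def dispatch(function_call: dict) -> str:
    fn = AVAILABLE_FUNCTIONS[function_call["name"]]
    args = json.loads(function_call["arguments"])
    return fn(**args)
```

The model only ever returns a name and an arguments string; the lookup table and the actual execution remain entirely on your side.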

    • @NikS-jo5pu 11 months ago

      @@DataIndependent I see, thanks! I stopped after the vanilla example as I don't know much about LangChain. Thanks for the video again.

  • @Patrirque 9 months ago

    Can parameters have more complex types? I mean, like arrays or objects.

  • @mwaikul 11 months ago

    How do you do function calling from a QnA chain? Some answers from the user might require function calling.

    • @DataIndependent 11 months ago

      You could set up your own custom chain and put it at the tail end.
      Or else just take the output from the user and manually write code to pass it to a function

    • @mwaikul 11 months ago

      @@DataIndependent I guess I am trying to ask: can we decide what kind of chain to call dynamically based on user input or the ChatGPT response?

  • @prasenjitgiri919 9 months ago

    I see what you are doing, but do you have an example where the program knows how to generate the content and figure out the recursive manner in which it has to work? Right now it looks like we are invoking it multiple times, but how do you automate this in a chatbot? And how do you keep sending data back to the chatbot from a key-value DB?

    • @AIonRails 6 months ago

      This is chain of thought. The problem you're describing is something I've been facing for the last three months. My solution is the AutoGen project, which helps me handle retries when irrelevant content comes back from the vector database, create tests, and progress step by step.

    • @prasenjitgiri919 6 months ago

      @@AIonRails Yes, I was using the LangChain agents, and I did read about AutoGen, but it is not working as advertised. OpenAI has now come out with their own agent framework, but that is still not available on Azure OpenAI.
      Do you have any study guide or open-source implementation that I may refer to as a guide?

    • @AIonRails 6 months ago

      @@prasenjitgiri919 What do you plan to achieve using AutoGen? I have made a few agents, and its power is inspiring. Right now I'm actually trying to implement an Obsidian-based AutoGen framework.
      And yes, the OpenAI agent framework sounds very interesting. As a backend developer, I just need to provide a REST API for agent interaction...

  • @AshnaImtiyaz 7 months ago

    Based on this video, it seems that the "gpt-0613" model was used. So which model should I use now? Please guide me.

    • @DataIndependent 7 months ago

      You can still use that one; it should still be live

  • @Jjjabes 11 months ago

    How would you go about setting up GPT to try to get the information that it needs for the functions to work? Thanks :)

    • @DataIndependent 11 months ago

      Could you explain your question more? What use case are you solving for?

    • @Jjjabes 11 months ago +1

      @@DataIndependent Thanks for responding. So for example, in the video the function requires year, category, and amount props, but is there a way to get the AI to ask the user for this information?

  • @vesper8 11 months ago

    This is so powerful!
    Is there any chance you could make a video about how to use GPT functions with a RetrievalQA agent?

    • @DataIndependent 11 months ago +1

      Oh cool - How come you want to see that combo together? What is the business use case?

    • @vesper8 11 months ago

      @@DataIndependent Well, my understanding has evolved in the last 48 hours since I posted this comment, but it's just a bit more complex now. My specific use case is that I've trained an LLM with custom data using a few documents and langchain. And I've now seen the limitations of doing that. My chatbot is now able to answer many different types of questions on my small data set which consists of 50 or so festival events with speaker name, title, description, date and time. My chatbot is able to answer questions such as "tell me of an event that talks about AI" or "when is xxx speaking". But it fails to answer "list all events on the 28th" or "how many events are there in total". And this is where a functions agent comes in handy. A functions agent can identify certain questions that pertain to a date or to asking an aggregate question about events, and when that happens it can use a custom tool that will query the data, a csv or pandas dataframe, return all the events for the requested date, and pass that back to the llm for processing. But in the event that the user does not ask such a question, then it needs to be able to fall back to a QA agent for answering more general questions about the data. I hope that wasn't too much all at once! Hope it makes sense now. It would be AWESOME if you made a video with such an example in mind!! Thanks! Love the content!

  • @kajasheriff 11 months ago +1

    What is the difference between OpenAI Plugins and OpenAI Function Calling?

    • @DataIndependent 11 months ago

      You don’t need to define a manifest file for OpenAI functions.
      Plus, extraction and tagging are super easy with functions

  • @Ryan-yj4sd 11 months ago

    Why would I use LangChain now that OpenAI takes care of the agent bit for me?

    • @DataIndependent 11 months ago

      LangChain is an entire framework for building AI apps; they use GPT functions to augment the tools they already had.
      You could do it manually, but why?

  • @codewithbrogs3809 11 months ago

    king

    • @DataIndependent 11 months ago

      Thanks Brogs

    • @codewithbrogs3809 11 months ago

      @@DataIndependent You should do a video on few-shot prompting with the new function calls. That's one of the benefits of Kor.

  • @pauldavis5760 11 months ago

    Is there any reason why I'm getting a traceback error saying that gpt-4-0613 doesn't exist?

    • @DataIndependent 11 months ago

      Nah, you shouldn’t be, unless you don’t have access to GPT-4 yet?

  • @matemarschalko4768 11 months ago

    Interestingly, the assistant seems to just use the function output almost "as is". I'm unable to get the assistant to use the function output as context only and focus on the user's actual question...
    Example: my function returns all the information about a property, and the assistant seems to just output the whole block of information instead of using it as context/knowledge to answer the user's question. Nothing I put in the system message affects how the assistant uses the information from the function call.
    Any idea how this could be achieved?

    • @DataIndependent 11 months ago +1

      I would do a two step process
      1) Have it return the information you want and then
      2) Have it use that information in another response
      If you give me more specifics via contact@dataindepentent.com I can check it out

  • @balajikatukuti5511 7 months ago

    I am getting this error: "NameError: name 'user_location' is not defined" in function_response. What should I do?

    • @DataIndependent 7 months ago

      Are you running the code exactly as it stands on the notebook?

  • @luismartins8598 1 month ago

    How can we do this via the REST API?

  • @user-ul8uk1hj3o 11 months ago +1

    Pretty cool!! But I fear that OpenAI is now starting to compete directly with LangChain and AutoGPT, and it looks like functions are just missing a longer-term memory store (Pinecone?) to handle more tasks with more steps.

  • @ashlynnantrobus5029 11 months ago

    "Uploaded 4 hours ago"
    I only first heard the news 3 hours ago 😂

    • @DataIndependent 11 months ago +3

      I went for the speed run! This announcement felt worthy; not all do.
      I learned about it from the tweet

  • @florianhonicke5448 11 months ago

    required is not defining what parameters should be returned; it is just saying which parameters are required

    • @DataIndependent 11 months ago +1

      Good call - got this out in a rush and didn’t crisp that part up.
      What’re you building w/ functions?

  • @greenhoodie 11 months ago

    Could we use it to call GPT again in a different function? 🤔

    • @greenhoodie 11 months ago

      With a specific prompt and purpose (no idea, this is falling out of my brain in real time)

    • @greenhoodie 11 months ago

      I guess that would just make it a LangChain agent, wouldn't it?

    • @DataIndependent 11 months ago

      Ya, this is where the agent work from LangChain will help out

  • @pulkitgarg189 11 months ago

    Hey @greg, it would be very helpful if you could make the same series for LlamaIndex as well. This series is very good, keep it up!!

    • @DataIndependent 11 months ago

      Wonderful! Thank you! What's the business case you want to apply with them? I like connecting code to practical examples

    • @pulkitgarg189 11 months ago

      @@DataIndependent I know we can use LangChain for building indexes for semantic search, but I think LlamaIndex is more efficient and gives the user a lot of flexibility. Thus it can be used where we are adding custom memory to a chatbot/application.

  • @DabnisUK 8 months ago +1

    Content is excellent, but please, could you not speak so fast? The number of times I have to go back and listen to what you say several times is making the ergonomics difficult. BTW, English is my native language (Nottingham, UK).

    • @DataIndependent 7 months ago

      This is the first time I've heard this comment and it's important to get community feedback - thank you!

  • @MaxMustermann-up4qe 11 months ago

    Somehow I just don't see it. For straightforward tasks I don't get LangChain nor this function call feature. Of course, when it comes to chatbots, it makes total sense, I understand that. But apart from that, I'd use vanilla Python. The examples I see are all like this one here, where it solves a non-problem. The closest to making sense was an HR assistant. But even there, a structured input like a "get current vacation days" dropdown would have killed the whole AI part. Not to be misunderstood: I'm only talking about the chaining tools, not the AI itself.

    • @DataIndependent 11 months ago

      It really helps with reliability when pulling information out. Before, we had to pass instructions on how to respond. Now these models have been fine-tuned on JSON output, which helps a lot

  • @sgramstrup 11 months ago

    How sad that one single corporation is setting the standard for all future frameworks. They could've used open standards like LMQL or similar, but like every scumcorp, they just want to control the developer ecosystem, and too many smart developers don't care at all about open standards or our common development ecosystem.

    • @featureflag 11 months ago

      This is how it's always been... one company takes the reins and leads the way. The others follow suit and introduce iterations that attempt to surpass the original, sometimes failing. I'm not sure if your comment is directed at OpenAI or not, but what you are frustrated about is not a unique phenomenon.

  • @dievardump 11 months ago

    In the Ad Hoc Financial Forecast example, you are answering (ChatMessage) the function_call with human language: "Just made the following updates...".
    However, those actions should be carried out automatically by your scripts, so they don't necessarily need a human-language response.
    Could you instead just do a back and forth of:
    AIMessage(content=str(first_response.additional_kwargs)),
    ChatMessage(role='function', additional_kwargs={'name': first_response.additional_kwargs['function_call']['name']}, content="OK"),
    AIMessage(content=str(second_response.additional_kwargs)),
    ChatMessage(role='function', additional_kwargs={'name': second_response.additional_kwargs['function_call']['name']}, content="OK"),
    AIMessage(content=str(third_response.additional_kwargs)),
    ChatMessage(role='function', additional_kwargs={'name': third_response.additional_kwargs['function_call']['name']}, content="OK"),
    acknowledging you did what was asked, without having to put it into words? Would GPT-4 get it?

    • @DataIndependent 11 months ago

      It’s a good question; I’ll need to test that. But I would imagine that a more explicit confirmation of the action taken would help.
      If you test it out, let me know