Tool Calling with LangChain is awesome!
- Published Jul 8, 2024
- In this video I will explain what Tool Calling is, how it differs from Function Calling, and how it is implemented in LangChain.
Code: github.com/Coding-Crashkurse/...
Timestamps
0:00 Introduction
1:39 Basics & Tool Decorator
4:00 Tool with Pydantic Classes
6:03 Perform Tool Calling
11:06 Tool Calling with API
#langchain
Excellent video! You made life simple for non-coders like us to actually solve complex tasks. Kudos 🎉
If you watch this you are probably a Coder
Amazing video. I had been looking for this information for some time; it was hard to find a clear explanation. Thank you for the summarized info and code.
excellent as always🎉
Great tutorial!
I've got one question, though:
at 6:02 and 6:54: does the model decide which tool to use based on the docstring?
Yes, that docstring is the explanation for the LLM :)
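A minimal plain-Python sketch of the idea from this exchange (not the video's exact code): the docstring attached to a tool function is what gets sent to the model as the tool's description, which is how the model decides which tool fits a request. The `add` function and the `tool_description` helper here are hypothetical illustrations; LangChain's `@tool` decorator derives the description from the docstring in a similar way.

```python
import inspect

def add(a: int, b: int) -> int:
    """Add two integers and return the sum."""
    return a + b

def tool_description(fn):
    # getdoc() returns the cleaned docstring -- the text that would be
    # handed to the LLM as the tool's description.
    return inspect.getdoc(fn)

print(tool_description(add))  # Add two integers and return the sum.
```

So if the docstring is vague, the model has little to go on when choosing between tools; writing a clear one-line description is part of defining the tool.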
Awesome video! Would you consider adding a module to discuss how to do tool calling with other LLMs (such as Llama3 70B via Groq or Mistral)?
Question upfront: does it not work with other models? LangChain normally provides a standardized interface for all models.
@@codingcrashcourses8533 - Thanks for the reply. Perhaps I was doing something incorrectly because it is working with Groq now.
FYI your videos are probably the best I've found. Seriously great work. Thanks so much for creating this channel!
@@b18181 No worries, those questions are totally fine. That's just the biggest benefit of using LangChain: you don't have to worry about APIs, you can just switch classes and it will (should) work ;-).
Thank you for your kind comment
Thank you very much for the explanation. Does it apply only to OpenAI models (ChatOpenAI)? I tried using your code with the Ollama-powered local Llama3-8B model and it looks like the tools are not bound to the model or another issue - the response does not contain "tool_calls"
From the docs: Many LLM providers, including Anthropic, Cohere, Google, Mistral, OpenAI, and others, support variants of a tool calling feature.
To be honest, I don't know if Llama supports tool/function calling. I would also have to google that :)
@@codingcrashcourses8533 Thank you for your response. Meta-Llama-3-8B-Instruct is #28 in the Berkeley Function-Calling Leaderboard, but indeed it does not have that "FC" (native support for function/tool calling) indicator. I guess I'll have to try Gorilla-OpenFunctions-v2 (FC), which is Apache 2.0 licensed and ranked #5, just behind the GPT-4 models.
Excellent demo as usual. Just curious: is the tool_mapping dict mandatory to create, or can't we just use tool_call['name']?
Let me ask you: what would happen if you called tool_call['name'] without the mapping? ;-)
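The point behind this reply can be sketched in plain Python (hypothetical names, not the video's exact code): `tool_call['name']` is only a string the model returns, so you still need some mapping from that string to the actual callable before anything can be executed.

```python
# Two hypothetical tools the model could choose between.
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# The mapping turns the model's string answer into a real function.
tool_mapping = {"add": add, "multiply": multiply}

# Shape of a tool call as the model might emit it.
tool_call = {"name": "add", "args": {"a": 2, "b": 3}}

selected = tool_mapping[tool_call["name"]]  # string -> callable
result = selected(**tool_call["args"])
print(result)  # 5
```

Without the dict, `tool_call['name']` just evaluates to the string `"add"`; there is nothing to invoke.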
🎉🎉 excellent as always
Great video.
I’m developing something where I have a database of courses and general info, with prices, availability and bookings.
I was trying to build a hybrid RAG pipeline with sql and semantic search, but perhaps this could replace it all together?
Also, I bought a few of your courses a while ago, but I'm missing a full-fledged implementation of SQL. In your previous RAG video you have one table; but how would you implement something connected to a Postgres db with dozens or hundreds of tables? Perhaps using Supabase, which is pretty newbie friendly.
Happy to buy a course where you go in more details about keeping dbs in sync / updated and also working with Langsmith Evals.
My 2 cents on what to use when:
RAG: Text Data
SQL: Tabular data
Functions/Tools: Call third party tools/APIs
How would you use it with LCEL?
As far as I understand, this does not work with Ollama at the moment, does it?
Not sure to be honest.
Can you provide the code link so we can test it for our use case?
sorry - added it
It would have been better if you had shown it with the other LLMs.
Why? OpenAI currently offers the best support, and LangChain provides a standardized interface for function calling.