Finally, some Gemini love! I've been getting sick results from Gemini for months while people ignore it. There's valuable stuff 1.5 can do that none of the others can, and way more than you think. Now Flash is going to have a great use case as well.
I agree, they have done a really good job with Flash.
How can we use function calling the same way we do in OpenAI, using a JSON schema?
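You can pass a JSON-schema-style declaration to Gemini much like OpenAI's tools format. A minimal sketch of the dict shape the google-generativeai Python SDK accepts via its `tools` argument; the tool name and fields here are made up for illustration, so check the SDK docs for exact details:

```python
# A function declaration in JSON-schema style. The tool itself
# ("get_order_status") is a hypothetical example.
get_order_status = {
    "name": "get_order_status",
    "description": "Look up the status of a customer order.",
    "parameters": {
        "type": "object",
        "properties": {
            "order_id": {
                "type": "string",
                "description": "The unique order identifier.",
            },
        },
        "required": ["order_id"],
    },
}

tools = [{"function_declarations": [get_order_status]}]

# Passing it to the model would look roughly like:
# import google.generativeai as genai
# model = genai.GenerativeModel("gemini-1.5-flash", tools=tools)
```

The main difference from OpenAI is the wrapping: Gemini groups declarations under a `function_declarations` key instead of a flat list of `{"type": "function", ...}` entries.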
I can't help but notice this produces a crazy amount of sub-communication just to get a final response to the user. Or did I get something wrong?
Do you know if it is possible to stream the function calling outputs?
This is the most impressive function call I've encountered to date! We definitely require an open-source version of this. The function call is more crucial than any other task an LLM can perform. It forms the basis for automation. It is imperative that we find a reliable open-source equivalent. I am determined to make it happen.
I agree, function calling is really critical for these models to be useful in real-life. Gemini Flash is surprisingly good at it. Would be great to see open source alternative.
Yes, great. But what about handing your data over to Google? I'm shocked.
Google is playing competitively. Gemini Pro is also priced competitively at $3.5/million tokens.
Is there any way to specify a default value for function arguments? What does it do when we tell it to get the status of an order but don't specify an order number?
This realistic customer service example is super useful and super interesting thanks!
This automation of the extra step normally required by function calling is a really great feature. Does Gemini do the same thing with "tool calling" (identifying functions as 'tools' through a decorator)? I hope OAI can get on the ball and automate this step as well, especially since the initial response object always gives the same suggested tool or function object in the response, so it should be possible to automate its evaluation.
I agree, it's a really neat implementation. Gemini follows a very different format compared to other providers. It would be interesting if both OAI and Claude did the same.
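For context, the "extra step" being automated is the dispatch loop: read the suggested call out of the response, run the matching Python function with the model's arguments, and send the result back. The Gemini Python SDK can do this for you (via automatic function calling on a chat session); below is a sketch of the manual loop other SDKs require, using a stand-in dataclass rather than the real response object:

```python
from dataclasses import dataclass

@dataclass
class FunctionCall:
    """Stand-in for the function_call part of an API response."""
    name: str
    args: dict

# Map function names the model may suggest to real implementations.
REGISTRY = {
    "get_weather": lambda city: f"Sunny in {city}",
}

def dispatch(call):
    """Look up the suggested function and run it with the model's args."""
    return REGISTRY[call.name](**call.args)

# The model's first response suggests a call; we evaluate it here and
# would send the result back to the model in a follow-up turn.
suggested = FunctionCall(name="get_weather", args={"city": "Berlin"})
print(dispatch(suggested))   # Sunny in Berlin
```

Since the suggested call always arrives in the same shape, this loop is entirely mechanical, which is exactly why the SDK can automate it.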
Nice video.
What about calling the same function multiple times? Can it handle that, e.g. if I need to call an external API multiple times from the same prompt with different parameters?
If it can do sequential calls, I would imagine it can do that within the same prompt.
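Gemini can return several function-call parts in a single response (parallel function calling), including repeats of the same function with different arguments. A sketch of handling that case, using plain dicts as stand-ins for the SDK's response parts; `get_price` and its values are made up:

```python
# Stand-in for the function_call parts of one model response:
# the same hypothetical function suggested twice with different args.
calls = [
    {"name": "get_price", "args": {"ticker": "GOOG"}},
    {"name": "get_price", "args": {"ticker": "MSFT"}},
]

def get_price(ticker):
    # Stand-in for an external API call; fabricated prices.
    return {"GOOG": 175.0, "MSFT": 420.0}[ticker]

# Run every suggested call and collect the results, which would then
# be sent back to the model as a list of function responses.
results = [get_price(**c["args"]) for c in calls if c["name"] == "get_price"]
print(results)   # [175.0, 420.0]
```

So the answer is yes in principle: you iterate over all suggested calls instead of assuming there is exactly one.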
I wish I knew how to actually use this with CrewAI or AutoGen etc., because it seems to work differently. How do we use it as an agent in a framework?
My recommendation is to build your own task-specific agents. I think most of the frameworks are not ready for prime time.
Great great video
Thank you so much
will see what I can do.
First it made George Washington black, then it told me to eat rocks and put glue on my pizza, NOW ITS GOING TO FLASH ME!? no thanks!