Thank you so much for sharing this. It has been extremely helpful!! Nicely done!
Your video was so great, thanks for sharing your knowledge with us!! thanks from Guatemala
Amazing value delivered in under 20 minutes. Thank you and congratulations on the great content and instructions! I wish every instruction video on YouTube had that same kind of quality.
Great Video! Exactly what i was looking for! Please make more of these Videos
I'm glad you found it useful :) May I ask what you mean specifically by "these videos"?
Hi!! Excellent video, thanks a lot. It works perfectly. I have a question: I want to add a node with a "Single Choice" card between the Start button and the node containing the "index" workflow, so that users can choose between chatting with the Assistant or filling in additional contact information that will be inserted into a table. I've tried this, but the Assistant enters a loop and doesn't move past the greeting; it doesn't respond to new questions. Thank you for your help.
Exactly what I was looking for. Thank you brother!
Glad I could help!
It is really nice. Question: is it possible to request a consultation to help create this kind of bot with information collection? Thanks in advance.
This is great. I found your channel just today, automatic subscribe!
I saw your comment in Matthew Berman's video on the Vertex AI agent builder. I'm curious if you'd be able to make a video (or point me to a good documentation) explaining how to use the Function Tools with a Dialogflow Messenger integration.
I have not yet seen any test case of it working... I know it's a preview feature and maybe that's why, but from your comment I thought maybe you'd know how to.
Anyways thanks for the video, might need to use OpenAI for the time being, thanks!
I don't have as much bandwidth these days to build a full tutorial, but I think what you're looking for is this documentation page > cloud.google.com/dialogflow/vertex/docs/concept/tools#function
What programming language is that in the execute tasks? Are other languages also supported?
Wow! Borna, thank you for the great explanation, rich in details!! Have you updated this template for use with OpenAI Assistant v2?
No, however there's nothing here in this basic flow that uses the v2 features!
Hey I just updated the details so it works with v2 out of the box!
This was very helpful.. Thank you so much
Amazing dude! 👏
Is this possible with Voiceflow?
Absolutely. The same concept applies, and it may even be easier, since you can use the Functions feature for your various calls and it won't look as messy as Botpress!
Hello sir, thanks for sharing. Could you help me find your last video about the Assistants API that you mention at the beginning of this video? Thanks.
Hi Fang!
Here's the last youtube video I was referring to: th-cam.com/video/ihRlCgsTR20/w-d-xo.htmlsi=kObiSOr3pvOPynVV
Hi, your guide is great! I've noticed Botpress tends to echo what I'm typing instead of giving an answer directly from the AI assistant. Any ideas why it's doing that?
Ah, I think I see what's happening. One of the variables used in the response essentially outputs "the last message" (event.preview) because it assumes the bot successfully responded to the user's utterance. If something goes wrong with the API call, you would see this behaviour, for example if the OpenAI API is down for some time... I could improve the design in a future iteration.
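In case it helps, here's a rough sketch of that guard. `callAssistant` and the event shape are hypothetical stand-ins for the template's actual execute-code card and the Botpress event object, but the idea is the same: never fall back to re-displaying the user's own message.

```javascript
// Sketch: avoid echoing the user's utterance when the OpenAI call fails.
// `callAssistant` is a hypothetical helper standing in for the template's
// API call; `event.preview` mirrors Botpress's "last message" variable.
async function buildReply(event, callAssistant) {
  try {
    const answer = await callAssistant(event.preview);
    // Only use the assistant's text if we actually got something back.
    if (answer && answer.trim().length > 0) {
      return answer;
    }
  } catch (err) {
    // e.g. the OpenAI API is down or the run failed
  }
  // Explicit fallback instead of showing event.preview (the user's message).
  return "Sorry, I couldn't reach the assistant. Please try again.";
}
```

With that shape, an API outage produces an apology message instead of the echo behaviour described above.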
Thank you for the video! Maybe you can help us customise the appearance of the chat window. Where are those settings?
You can edit the CSS manually or use the styler web app! styler.botpress.app/
You can then supply that CSS URL to your Botpress bot in the settings.
Why do I get good answers when I ask the assistant something through the OpenAI playground, but terrible answers when I ask the same assistant the same question through Botpress?
Excellent!
Thank you for the tutorial. Is there any cost we should consider with this method?
Depends on the model. You actually end up not using AI credits from Botpress because you're using OpenAI instead, so there are savings there, but OpenAI is not that cheap. I recommend using GPT-3.5 Turbo for your assistant model unless it's not meeting your needs.
I think the Botpress interface has changed; the template does not work.
Cool concept, but it seems like you have to put every callable function inside this "execute code" card. It's going to get big fast. How do you manage this?
You're right. Botpress doesn't have a "function definition" feature (like Voiceflow does), but one workaround is to define global functions in the hooks section by setting the function as a bot variable (upon every message received, lol). I'm fairly certain they're working on a better way around this, though.
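That workaround can look roughly like this. The helper names and the plain `bot` object here are illustrative stand-ins, not the template's actual code; in Botpress the equivalent would run inside a hook so every execute-code card can reuse the helpers instead of redefining them.

```javascript
// Sketch of the hook workaround: stash shared helpers on a bot-scoped
// object once per incoming message, so execute-code cards stay small.
// `bot` is a plain object standing in for Botpress bot variables.
function registerHelpers(bot) {
  bot.helpers = {
    // Hypothetical helper: normalize an email before passing it around
    normalizeEmail: (raw) => String(raw).trim().toLowerCase(),
    // Hypothetical helper: shared headers for OpenAI API calls
    openaiHeaders: (apiKey) => ({
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
      'OpenAI-Beta': 'assistants=v2',
    }),
  };
  return bot;
}
```

Execute-code cards can then call `bot.helpers.normalizeEmail(...)` instead of carrying their own copy of every function.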
Hi, thanks for the video, it was very useful. Could you help me, please? The first message I send works perfectly, but if I try to continue the conversation I get an error. I added a raw input in the main workflow, right before the index flow, in the same node as the index. If I take the input out, I can't even send the first message to the thread.
It seems like it can't find the thread. Are you using the emulator? If so, just reset the session and a new conversation ID will be assigned
cool !!
Hey! So, a quick question. I have set up a "hi hello welcome" greeting, then I am running this flow to answer questions. It always reverts to the "hi hello welcome" and doesn't keep the conversation looping. Is there any way to say "hi hello welcome" if a thread doesn't exist, and to skip that bit if it does?
You can use transitions to check the thread variable (if it doesn't exist, say hello and create a new thread; otherwise continue). It seems like either you have your hello logic before the threads are checked, or you're failing to fetch the threads and it's starting over. Follow the logic from the index through to where the message is posted to the API. It should look for a thread ID locally; if it finds one, it sets the associated OpenAI thread and continues the conversation using that thread ID. If you're not seeing that, look at the event log row by row to see where the logic is thrown off.
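The check described above can be sketched like this. The `store` object and `createThread` call are stand-ins for the template's actual conversation variables and the OpenAI `POST /v1/threads` call, so treat this as the shape of the logic rather than drop-in code.

```javascript
// Sketch of "greet only when no thread exists yet".
// `store` stands in for Botpress user/conversation variables;
// `createThread` stands in for the OpenAI thread-creation API call.
async function resolveThread(store, createThread, sayHello) {
  if (store.threadId) {
    // Existing conversation: skip the greeting and reuse the thread.
    return store.threadId;
  }
  // No thread yet: greet once, create a thread, remember its id.
  sayHello();
  const thread = await createThread();
  store.threadId = thread.id;
  return store.threadId;
}
```

Because the greeting only fires on the no-thread branch, repeated turns reuse the saved thread ID instead of restarting at "hi hello welcome".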
@@show-me-the-data Thank you!
I was wondering if you had a version that can use GPT-4o?
I have successfully used your template, but I am facing a problem. I have defined a function in the OpenAI assistant. Very simple: ask for username and email. But I cannot pass those variables to Botpress, although they are also defined there 😢
Did you add the newly defined functions to the registration JSON object in the code after defining the functions?
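For reference, a tool definition for a username/email function looks roughly like this in the Assistants API. The function name and descriptions here are made up; what matters is that the same definition is registered on the OpenAI side and matched by name in your Botpress code.

```javascript
// Sketch of an Assistants API tool definition for collecting user details.
// The name "save_contact_details" is illustrative -- it must match whatever
// name your Botpress handler looks for when the run requires action.
const tools = [
  {
    type: 'function',
    function: {
      name: 'save_contact_details',
      description: 'Store the username and email the user provided',
      parameters: {
        type: 'object',
        properties: {
          username: { type: 'string', description: "The user's name" },
          email: { type: 'string', description: "The user's email address" },
        },
        required: ['username', 'email'],
      },
    },
  },
];
```

When the run pauses with `requires_action`, the arguments for this function arrive as a JSON string you can parse and copy into your Botpress variables.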
@@show-me-the-data Perhaps I am asking too much, but could you explain what you mean, please? :) I am very new at this, and any help is more than welcome. Would you be able to make a small video about this (perhaps silly) question?
Why use an OpenAI Assistant if Botpress already uses OpenAI? And isn't the OpenAI Assistant built on GPT-3?
Good question. If you go with the built-in LLM, you have to program the conversation flow manually and do your own retrieval on documents, not to mention you can't do the LLM-driven, code-based analysis that assistants can. With assistants you can say something as simple as "act like Snoop Dogg" and you're off to the races; no need for conversation design. Add your own files to the assistant and now you've got a pretty powerful RAG agent that's way better than one built on Botpress alone.
How do you display charts in botpress from OpenAI?
Markdown works, I believe. You need to extract the image URL first from the run step, structure it as markdown in a workflow variable, and then display it.
@@show-me-the-data That is the problem: I can't get the URL. The image in OpenAI can't be displayed; all I can get back is the content of the image.
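That matches how the API works: Assistants messages carry `image_file` content blocks with only a `file_id`, not a URL, so you have to download the file content yourself and host it (or serve it through your own endpoint) to get something displayable. A rough sketch of the extraction step, with `toUrl` as a hypothetical helper for that hosting part:

```javascript
// Sketch: pull image file ids out of an Assistants API message and render
// them as markdown. `toUrl` is a hypothetical helper -- OpenAI returns only
// a file_id, so you must fetch the file content and host it somewhere
// to obtain a URL the chat widget can actually display.
function imagesAsMarkdown(message, toUrl) {
  return message.content
    .filter((block) => block.type === 'image_file')
    .map((block) => `![chart](${toUrl(block.image_file.file_id)})`)
    .join('\n');
}
```

The resulting markdown string is what you'd put in the workflow variable mentioned above.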