This was exactly what I needed, thanks Dewain. You've pretty much taught me everything I know about Copilot. Now I'm able to use it to blow my boss's socks off and, more importantly, help people in my company be better at their jobs and get new people trained much quicker. You're a legend, thanks so much, dude!!!
Awesome! Glad you liked it. I had many people ask if this was possible.
Hello Dewain, I have been watching your videos to create my copilot. Thank you for the knowledge you've shared. May I ask you to create a video where a user can upload documents in the chatbot and the documents are saved to SharePoint using Power Automate?
Hi Dewain! Excellent video! Thanks for sharing! I'm currently developing a copilot that uses SharePoint as a knowledge source. We will upload 1,800 documents (most of them .DOCX files within the allowed file size). How can I make the copilot generate answers only from documents related to a certain subject? How does the copilot index the documents in SharePoint? Is there a limit on the number of files? How can I improve response time and performance?
Why not just use a SharePoint site?
Several reasons... a SharePoint site doesn't make the content part of the publishing process, meaning content changes directly impact the copilot without any testing being done. Secondly, SharePoint content isn't indexed/vectorized/chunked, so the answers are typically not as good.
Another benefit is that you can give a business user access to manage the document library in SharePoint without giving them access to the Copilot or Dataverse environment. @Dewain27
This is exactly my question as well. What about security? Does SharePoint not give us the ability to auto-index?
@Dewain27 So in theory it is best to use this method for better generative AI responses rather than SharePoint?
I'm also interested in a comparison between connecting the chatbot directly to a SharePoint site and the approach described here. Is the claim that "SharePoint doesn't index/vectorize/chunk the data, so the answers are not typically as good" correct? And thank you for all the videos you publish 👍
Great video. One extra point to note: your document's file name cannot contain spaces, otherwise this is the error you get:
Action 'Add_a_new_row_to_selected_environment' failed: Export key attribute {0} for component {1} must begin with a letter and only consist of alpha-numeric and _.{}! characters. Data[0] = "schemaname" Data[1] = "botcomponent"
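For anyone hitting this: the value the flow writes into the Schema Name column has to satisfy exactly that rule, so sanitize the file name before the 'Add a new row' step. Here's a minimal sketch of the rule in Python (the "new_" prefix is an assumed publisher-style prefix, not something from the video):

```python
import re

def to_schema_name(filename: str, prefix: str = "new_") -> str:
    """Turn a SharePoint file name into a valid Dataverse export key.

    Per the error message: it must begin with a letter and contain only
    alphanumeric characters plus _ . { } !
    """
    # Replace every disallowed character (including spaces) with "_".
    candidate = re.sub(r"[^A-Za-z0-9_.{}!]", "_", filename)
    # Ensure it starts with a letter; prepend a prefix if it does not.
    if not candidate[:1].isalpha():
        candidate = prefix + candidate
    return candidate

print(to_schema_name("Employee Handbook 2024.docx"))  # Employee_Handbook_2024.docx
```

Inside the flow itself, a replace() expression on the file name before it is used as the schema name achieves the same thing for the common case of spaces.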
Hi Dewain, very useful video. Is it possible to use Dataverse as a data source for generative answers?
Thank you, this is very helpful, and we have already implemented it. The only issue I'm encountering is that this approach requires the 'Publish' button to be pressed in the chatbot before an uploaded file becomes accessible to users. Is there any way to automate the publishing process?
Thanks for the video, Dewain. It is very detailed and easy to follow and learn. I have a question based on this: what if the files are already in SharePoint? I built the same Power Automate flow, but when I click 'All runs', nothing happens. Does this method only work when you upload a new file to SharePoint?
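A note for anyone with the same question: the 'When a file is created' trigger only fires for files added after the flow is switched on, which is why pre-existing files never appear under 'All runs'. To backfill, enumerate the library once and feed each file through the same create-row step the flow already uses. A rough sketch via Microsoft Graph; the site path and token acquisition are placeholders, not part of the video's setup:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"  # hypothetical: acquired via MSAL with Sites.Read.All
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Resolve the site, then list the files already in its default document library.
site = requests.get(
    f"{GRAPH}/sites/contoso.sharepoint.com:/sites/KnowledgeBase",  # placeholder site
    headers=HEADERS,
).json()

items = requests.get(
    f"{GRAPH}/sites/{site['id']}/drive/root/children",
    headers=HEADERS,
).json().get("value", [])

for item in items:
    if "file" in item:  # skip folders
        # Push each existing file through the same "Add a new row"
        # (botcomponent) step your flow performs for newly created files.
        print(item["name"], item.get("@microsoft.graph.downloadUrl"))
```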
A downside of uploaded documents vs. SharePoint is that the reference links show unformatted text instead of opening the document itself.
Can you show me how the trigger/flow must look when a file is deleted from SharePoint, so that the deletion is also reflected in Copilot Studio?
Hi Dewain, this is very useful. Thank you so much for sharing how to do this. I wish this could be added as a native feature in Copilot Studio. It would also be useful to have a Power Automate flow that deletes files from the copilot's Dataverse when they are no longer in the SharePoint folder, so the two stay in sync.
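One way to approximate that sync today is a recurrence-scheduled flow or a small script: list the SharePoint folder, list the copilot's file components in Dataverse, and delete the rows that no longer have a matching file. A hedged sketch against the Dataverse Web API; the assumption that the component's name column matches the file name mirrors the flow in the video, but verify it in your environment before deleting anything:

```python
import requests

TOKEN = "<access-token>"                  # hypothetical Dataverse token
ORG = "https://yourorg.crm.dynamics.com"  # placeholder environment URL
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"}

# 1) File names currently in the SharePoint folder (gather via Graph as in
#    the earlier sketch; hard-coded here for brevity).
sharepoint_files = {"Employee_Handbook_2024.docx", "Travel_Policy.pdf"}

# 2) Components the copilot currently holds in Dataverse. In practice,
#    narrow this query (e.g., by component type or parent bot) so topics
#    and other non-file components are never touched.
rows = requests.get(
    f"{ORG}/api/data/v9.2/botcomponents?$select=botcomponentid,name",
    headers=HEADERS,
).json()["value"]

# 3) Delete any component whose source file is gone from SharePoint.
for row in rows:
    if row["name"] not in sharepoint_files:
        requests.delete(
            f"{ORG}/api/data/v9.2/botcomponents({row['botcomponentid']})",
            headers=HEADERS,
        ).raise_for_status()
        print("deleted", row["name"])
```

As with uploads, the removals only reach users after the copilot is published again.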
I am struggling with this. If I don't have a solution for the bot, should I create the table?
This is awesome! This is definitely next on my to-do list once I get the SharePoint site working for my generative answers. I've already set authentication to manual and added Files.Read.All and Sites.Read.All to my API permissions in the Azure portal, but it's still not returning any answers; the only sources working are uploaded documents and public sites. Hoping you can upload a video to enlighten us on that one. It seems pretty straightforward in the Microsoft documentation, but something is missing.
I will see about putting this on my backlog of videos. Keep in mind that many people don't want to have to log on to their bot to have files available to answer questions.
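For the Files.Read.All / Sites.Read.All issue above, one debugging step worth trying: confirm the identity behind those permissions can actually read the site through Graph before suspecting the copilot configuration. A quick hedged check (the site path is a placeholder):

```python
import requests

TOKEN = "<access-token>"  # hypothetical: token from the same Azure app registration
resp = requests.get(
    "https://graph.microsoft.com/v1.0/sites/contoso.sharepoint.com:/sites/HR",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
# 200 plus the site's displayName means Sites.Read.All is effective;
# 401/403 points at the app registration or admin consent, not Copilot Studio.
print(resp.status_code, resp.json().get("displayName"))
```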
Another question is why this would be needed. Doesn't it duplicate the file? What happens if a change is made in SharePoint? And can't the same be achieved simply by adding an additional topic that specifically refers to the SharePoint site where the original documents are located?
See my answer above. The key is that data publishing should be thought through, since when you change the content you change the behavior and the responses. Imagine you want the files indexed and chunked the way Dataverse does it, and you want the files to be part of the solution. Also, what if you want the files to be accessible outside the walls of your org? You have to be authenticated to use SharePoint, but not everyone wants that.
I do not have the option for Chatbot subcomponents. Is that a premium feature?
I don't have it either.
Hi @dewain, I need to get the same response in the form of an adaptive card. Can we do it through generative AI by uploading a DOC, PDF, or Excel file? If yes, please show how to do it.
Hello Dewain, amazing video. But I am getting the error below; I tried changing the schema name, but that didn't work. Please advise. Export key attribute {0} for component {1} must begin with a letter and only consist of alpha-numeric and _.{}! characters. Data[0] = "schemaname" Data[1] = "botcomponent"
Getting the error below; not sure what is wrong, as I followed the same naming convention. Hope someone can help.
Action 'Add_a_new_row_to_selected_environment' failed: Export key attribute {0} for component {1} must begin with a letter and only consist of alpha-numeric and _.{}! characters. Data[0] = "schemaname" Data[1] = "botcomponent"
Hi Dewain, thank you so much for posting this video. This is exactly what I want; it is a very useful video. I was able to create the flow and run it successfully with my Microsoft developer account. But when I try the same with my company account (an Office 365 user account), I get a "Forbidden" error. Can you please help with this?
This is awesome, thank you so much for putting it together. I am wondering if there is a benefit to doing it this way vs. adding a SharePoint site directly. I look forward to your response.
Imagine you want to control when the content is released to the user and not just when you edit the file at its destination... you also might want to publish the content to people who don't have access to SharePoint.
@Dewain27 thank you for taking the time to reply, that makes perfect sense
Hi Dewain, how many files can you upload this way? Does this work with, let's say, 500 files?
"The number of documents you can upload is only limited by the available file storage for your Dataverse environment." And of course the 3MB limit per document
@jidi10 OK. Then 500 1 MB files should be fine if I have 500 MB of free space in Dataverse, right?
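The arithmetic works out under those assumptions; a trivial sanity check (the 3 MB cap and the free-space figure are just the numbers quoted above):

```python
files, avg_size_mb = 500, 1.0
per_file_limit_mb = 3.0    # per-document cap quoted above
free_dataverse_mb = 500.0  # assumed available Dataverse file storage

assert avg_size_mb <= per_file_limit_mb, "each file must fit the per-document cap"
total_mb = files * avg_size_mb
print(f"{total_mb:.0f} MB needed of {free_dataverse_mb:.0f} MB free:",
      "fits" if total_mb <= free_dataverse_mb else "does not fit")
```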
How do you replace an old file with a new one, for example when the employee handbook is updated for the new year? How do you delete the old one programmatically?
So there are two ways... you can run the same process on an update to the file, or you can simply delete the file from your solution and from the SharePoint site and then upload the new version.
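For the update route, here is a hedged sketch of the programmatic swap against the Dataverse Web API. It assumes, as in the flow, that the component's name column matches the file name and that the file body lives in the content column; verify both in your environment:

```python
import base64
import requests

TOKEN = "<access-token>"                  # hypothetical Dataverse token
ORG = "https://yourorg.crm.dynamics.com"  # placeholder environment URL
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

def replace_document(new_file_path: str, name: str) -> None:
    # Find the existing component by its name column.
    rows = requests.get(
        f"{ORG}/api/data/v9.2/botcomponents"
        f"?$select=botcomponentid&$filter=name eq '{name}'",
        headers=HEADERS,
    ).json()["value"]
    if not rows:
        return  # nothing to replace; fall back to the normal create flow
    with open(new_file_path, "rb") as f:
        body = base64.b64encode(f.read()).decode()
    # Overwrite the stored content in place (assumed column: content).
    requests.patch(
        f"{ORG}/api/data/v9.2/botcomponents({rows[0]['botcomponentid']})",
        headers=HEADERS,
        json={"content": body},
    ).raise_for_status()

replace_document("Employee_Handbook_2025.docx", "Employee_Handbook.docx")
```

Either way, remember the change only reaches users after the next publish.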
Hi, is it possible to delete docs from the chatbot with Power Automate?
Yes, but can you tell me what would trigger it?
By a scheduler, not a trigger.