Maxim Khoruzhiy
Joined 16 Apr 2023
Sharing IT-consultant experience: solutions, cases, scenarios, and challenges from real life, for real developers.
Unlock the Power of Generative AI & RAG in Azure: Build Smarter Solutions Fast with Logic Apps!
Discover how to build intelligent, AI-powered solutions using Generative AI and Retrieval-Augmented Generation (RAG) in Azure!
This step-by-step tutorial takes you through the full implementation process, covering everything from foundational concepts like embeddings, vectors, and vector databases to creating advanced Ingestion and Chat Workflows using Azure AI Services and Logic Apps.
You'll explore Azure AI Search and Azure OpenAI models for embedding and chat.
You'll see hands-on demos, including ingesting documents and real-time interaction with ingested content.
Whether you're new to RAG or looking to enhance your Azure skill set, this video offers a comprehensive guide to mastering intelligent workflows in the cloud.
GitHub code repo with source files:
github.com/Azure/logicapps/tree/master/ai-sample
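For orientation, here is a minimal Python sketch of the RAG chat path the video builds with Logic Apps: embed the question, retrieve similar chunks from Azure AI Search, and ground the chat model with them. The endpoints, keys, index and field names, and model deployment names are placeholders, not the ones used in the repo.
```python
# Minimal RAG sketch; all names below are placeholder assumptions.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery
from openai import AzureOpenAI

openai_client = AzureOpenAI(
    azure_endpoint="https://<your-openai>.openai.azure.com",
    api_key="<openai-key>",
    api_version="2024-02-01",
)
search_client = SearchClient(
    endpoint="https://<your-search>.search.windows.net",
    index_name="documents-index",                  # placeholder index
    credential=AzureKeyCredential("<search-key>"),
)

question = "What does the ingestion workflow do?"

# 1. Embed the question with the embedding deployment.
embedding = openai_client.embeddings.create(
    model="text-embedding-ada-002",                # embedding deployment
    input=question,
).data[0].embedding

# 2. Retrieve the most similar ingested chunks by vector search.
results = search_client.search(vector_queries=[
    VectorizedQuery(vector=embedding, k_nearest_neighbors=3,
                    fields="contentVector"),       # placeholder vector field
])
context = "\n".join(doc["content"] for doc in results)

# 3. Ground the chat deployment with the retrieved context.
answer = openai_client.chat.completions.create(
    model="gpt-4o",                                # chat deployment
    messages=[
        {"role": "system",
         "content": f"Answer using only this context:\n{context}"},
        {"role": "user", "content": question},
    ],
)
print(answer.choices[0].message.content)
# Ingestion is the mirror image: chunk the document, embed each chunk,
# and push the chunks into the same index with upload_documents().
```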
Timing
00:00-01:58 Introduction
01:58-03:00 Presentation content
03:00-04:25 RAG concept
04:25-07:06 Embeddings, vectors, vector databases
07:06-08:48 Ingestion & chat workflows concept
08:48-10:25 Ingestion & chat workflows built on Azure Logic App and Azure AI Services
10:25-13:54 Dive deeper into Azure Ingestion Logic App Workflow implementation
13:54-18:24 Dive deeper into Azure Chat Logic App Workflow implementation
18:24-19:48 Azure OpenAI models - embedding & Chat
19:48-20:32 Azure AI Search Service
20:32-23:50 Demo: Chatting with an empty Vector Database
23:50-25:48 Demo: Ingesting a document
25:48-27:40 Demo: Chatting, requesting information from the ingested document
27:40 Conclusion
Views: 326
Videos
Azure Integrations: Effective Monitoring Techniques for Azure Logic Apps.
354 views · 3 months ago
In this video, I explore how to effectively monitor your Logic Apps using the Logic Apps Management Solution. This comprehensive solution helps you gain insights and maintain the health of your workflows with ease. You'll learn: - The benefits of using the Logic Apps Management Solution. - How to set up and configure the Logic Apps Management Solution. - Monitoring workflows with...
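As a rough companion to the video, here is a hypothetical Python sketch that pulls Logic App run telemetry from a Log Analytics workspace. It assumes diagnostic settings already stream WorkflowRuntime logs there; the AzureDiagnostics field names follow the usual schema but should be verified against your workspace.
```python
# Hypothetical sketch: count failed Logic App runs per workflow over 24h.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())
query = """
AzureDiagnostics
| where Category == "WorkflowRuntime" and status_s == "Failed"
| summarize failures = count() by resource_workflowName_s
"""
# "<workspace-id>" is a placeholder for your Log Analytics workspace ID.
response = client.query_workspace("<workspace-id>", query,
                                  timespan=timedelta(days=1))
for table in response.tables:
    for row in table.rows:
        print(row)
```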
Azure Logic Apps Basics: Implementing Simple Data Mapping
799 views · 6 months ago
Welcome to my Azure Logic Apps tutorial! In this video, I will guide you through the basics of implementing simple data mapping using Azure Logic Apps. Learn how to effectively transform and map data within your workflows. In this video, you'll learn: - Defining data mapping in a JSON variable. - Using the Compose action to restructure incoming data. - Converting source data values to destinati...
Mastering Integration with Azure: Beginner's Guide to Azure Integration Services
267 views · 8 months ago
Unlock the power of integration with Azure. In this beginner's guide, I explore the fundamentals of integration solutions and dive into the world of Azure Integration Services. From understanding basic integration architecture to practical examples showcasing Azure Logic Apps, Azure Service Bus, Azure API Management, and more, this presentation equips you with the knowledge needed to succeed in...
Azure Integration: Managed Identity & MS Graph Authentication Tutorial | Step-by-Step Guide
310 views · 11 months ago
Unlocking Seamless Integration: Mastering Managed Identity & MS Graph Authentication for Secure SharePoint Access. Dive into this developer-focused tutorial, guiding you through precise steps to implement robust Azure solutions. Gain insights on the intricacies of authentication setup, tailored for accessing SharePoint sites with unparalleled security. Whether you're a seasoned developer or new...
Unlocking the Power of Azure AI: Real-World Applications with Azure Document Intelligence Services
336 views · 1 year ago
In this insightful video, I delve into the world of Azure AI and explore the practical applications of Azure Document Intelligence Services in real-world scenarios. Discover the cutting-edge capabilities and features of Azure Document Intelligence Services, and learn how to harness its potential to drive innovation and create transformative solutions.
Azure Integration Solutions: A Guide to Logging, Troubleshooting, and Streamlined Prod Maintenance.
211 views · 1 year ago
In this video, I delve into the intricacies of logging and troubleshooting within your Azure integration solution. Learn valuable insights on how to efficiently handle errors in a production environment while ensuring seamless operations. If you are getting started, my step-by-step guide will empower you with the skills needed to maintain and troubleshoot your Azure integration solution effecti...
How To: Improve the efficiency of using JSON schemas in your deployment projects.
145 views · 1 year ago
How to make your Azure deployment projects more efficient by using the Bicep function loadJsonContent(). Follow me on Twitter: maximkhoruzhiy
How to: Key vault access policy. Bicep for-loops.
199 views · 1 year ago
How to set up a Key Vault access policy for Logic Apps using Bicep, and how to use Bicep for-loops for deployment optimization. See Bicep code examples: github.com/mkhoruzhiy/TH-cam-AzureIntegrationServices/tree/main/DevOps/part-04 Follow me on Twitter: maximkhoruzhiy
Logic Apps: Dataverse connector error - the supplied reference link is invalid.
996 views · 1 year ago
In this video, I explain why you may get the error "0x80060888 | The supplied reference link is invalid. Expecting a reference link of the form /entityset(key)" while using a Dataverse connector in a Logic App.
Azure Integration Services - part-07 "Uploading files to SharePoint Online"
2.9K views · 1 year ago
Seventh part of the techniques-and-scenarios guide to using Azure Integration Services. How to develop an integration solution using Azure Integration Services: what to start with, which Azure services to use, and how to make them work together as a robust, reliably functioning data workflow. In this part: using a Logic App as an HTTP endpoint for uploading files to a Sh...
How to set up a deployment solution for the Azure infrastructure. Part-03 "templates, multiple env."
180 views · 1 year ago
Infrastructure as Code: This is a guide on how to set up a deployment solution for the Azure infrastructure. Part-03. In this part: using YAML templates and multi-environment deployment. Use VS Code to compose the deployment code and Azure DevOps to set up build & release flows. Part-01: th-cam.com/video/IsEWp6HsbIs/w-d-xo.html Part-02: th-cam.com/video/zow4qiQdLTw/w-d-xo.html Th...
Optimizing Azure Deployment: A Practical Guide to Logic App and CosmosDB Configuration | Part 02
208 views · 1 year ago
Infrastructure as Code: This is a guide on how to set up a deployment solution for the Azure infrastructure. Part-02. Use VS Code to compose the deployment code and Azure DevOps to set up build & release flows. Part-01: th-cam.com/video/IsEWp6HsbIs/w-d-xo.html Part-03: th-cam.com/video/a30fUMMVQPk/w-d-xo.html This video is about deploying an Azure integration solution described in ...
Mastering Azure Infrastructure Deployment: Part 1 - "Logic App Deployment Guide"
807 views · 1 year ago
Infrastructure as Code: This is a guide on how to set up a deployment solution for the Azure infrastructure. Part-01. Use VS Code to compose the deployment code and Azure DevOps to set up build & release flows. Source code on GitHub: github.com/mkhoruzhiy/TH-cam-AzureIntegrationServices Part-02: th-cam.com/video/zow4qiQdLTw/w-d-xo.html Part-03: th-cam.com/video/a30fUMMVQPk/w-d-xo....
Azure Integration Services: Part 6 - "Mastering OAuth 2.0 for API Access Authorization"
447 views · 1 year ago
Sixth part of the step-by-step guide to an integration solution built with Azure Integration Services. How to develop an integration solution using Azure Integration Services: what to start with, which Azure services to use, and how to make them work together as a robust, reliably functioning data workflow. In this part: protecting an API operation with the OAuth 2.0 protocol. Part 01 - th-c...
Exploring Azure Integration Services: Part 5 - "Dividing Message Flow, adding Dataverse Destination"
432 views · 1 year ago
Mastering Azure Integration Services - part-04 "Optimizing Message Retrieval"
539 views · 1 year ago
Mastering Azure Integration Services: Part 03 "Messaging-Supported Flow"
801 views · 1 year ago
Mastering Azure Integration Services: Part 02 - Unveiling the Power of 'API Facade'
1K views · 1 year ago
Azure Integration Made Easy: "Part 1 - Getting Started Simply"
3.5K views · 1 year ago
Great explanation, sir. If you could make a video for an AI/ML course, it would be much appreciated.
Thank you :) I'll put this into my plan.
DataMap is not available in my action list. Any ideas?
Yeah, the "DataMap" is only a custom name to action "Initialize variable" of "Variables" group of actions. This is basically definition of a json object which serves as a data-mapping structure.
Great video! I have a requirement to upload files from an Azure file share to SharePoint. There are 3 file shares where the application uploads data based on job runs. How do I set up an integration so that any time a file is created, it is picked up and uploaded to SharePoint, maintaining the folder structure or placing files in their respective folders? Can you please provide some suggestions? Thank you.
Hi Jackson. Thanks for the comment! 😊 As for your question, the answer depends on your capabilities. If you are limited to Logic Apps only (no-code), you should shift to Blob Storage instead of File Storage. In this case you can use Event Grid subscribed to events in the Blob Storage (blob created); files are blobs in Blob Storage: BlobStorage -> Event Grid -> Logic App -> SharePoint. If you are skilled in programming, you can develop an Azure Function triggered when a file is created in the File Storage. If you want to minimize programming, your Function should right away push the file, fetched from the File Storage, to the HTTP trigger of your Logic App, which will finalize the handling. The rest is the same as described in my video: you take the file and upload it to SharePoint: FileStorage -> Azure Function -> Logic App -> SharePoint. (See the sketch below.)
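Assuming the Function route, here is a hypothetical Python sketch. Azure Functions has no built-in File Share trigger, so this approximates one by polling the share (the body could run inside a timer-triggered Function); the connection string, share name, and Logic App trigger URL are placeholders.
```python
# Hypothetical sketch of FileStorage -> Azure Function -> Logic App:
# list files on the share and forward each one to the Logic App's
# HTTP trigger, which finishes the upload to SharePoint.
import requests
from azure.storage.fileshare import ShareClient

share = ShareClient.from_connection_string("<storage-connection-string>",
                                           share_name="jobs")  # placeholder
logic_app_url = "https://<logic-app-http-trigger-url>"          # placeholder

for item in share.list_directories_and_files():
    if item["is_directory"]:
        continue
    # Download the file content and forward it to the Logic App.
    content = share.get_file_client(item["name"]).download_file().readall()
    requests.post(logic_app_url,
                  files={"file": (item["name"], content)},
                  timeout=30)
```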
very helpful
thank you!
Hi, this is an excellent video. I have a requirement to upload a folder (present on my local machine) to a SharePoint library with dynamic file properties. How can I achieve this?
Thank you, glad it helps. The solution depends on the specific issue you're facing; let me know which part is the challenging one. Generally speaking:
1. Don't upload files from your local PC; the Logic App needs to access your files. I recommend uploading them to a Storage Account in Azure (File Storage or Blob Storage). Then give the Logic App access to the Storage Account: enable a system-assigned identity in the Logic App and grant that identity access in the Storage Account (using Role-Based Access Control). You can then use the Logic App Storage Account actions to read files from the file/blob storage and transfer them to SharePoint. A hedged sketch of this idea follows below.
2. I'm not sure I understand what you mean by dynamic properties. If you need to upload your files to SharePoint with specific file properties, you should define a Content Type or columns in the SharePoint library where these files will be stored (this is metadata). That way you can upload each file along with its properties, and each property will be displayed as a column in the library's list view.
Any questions about a specific part of the solution?
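For the metadata part, here is a hedged Python sketch of the same idea outside Logic Apps: read the file with the managed identity and upload it to the library via Microsoft Graph, then set its columns. The site/drive IDs, container, blob, and the "Department" column are placeholders, and your identity needs the corresponding Graph permissions.
```python
# Hedged sketch: Blob Storage -> SharePoint library via Microsoft Graph,
# setting a metadata column on the uploaded file. All names are placeholders.
import requests
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

credential = DefaultAzureCredential()  # system-assigned identity when deployed
token = credential.get_token("https://graph.microsoft.com/.default").token
headers = {"Authorization": f"Bearer {token}"}

blob = BlobClient(account_url="https://<account>.blob.core.windows.net",
                  container_name="invoices", blob_name="report.pdf",
                  credential=credential)
data = blob.download_blob().readall()

# Upload into the library's drive, then patch the list-item columns.
base = "https://graph.microsoft.com/v1.0/sites/<site-id>/drives/<drive-id>"
item = requests.put(f"{base}/root:/report.pdf:/content",
                    headers=headers, data=data, timeout=60).json()
requests.patch(f"{base}/items/{item['id']}/listItem/fields",
               headers=headers, json={"Department": "Finance"}, timeout=30)
```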
I need to read all files from Blob Storage and move them to a SharePoint document library, so if you have any reference, please let me know.
Unfortunately, I don't have a scenario like that covered on my channel yet. However, you might find this video helpful: th-cam.com/video/SSfNhSnQo_g/w-d-xo.html. It focuses on using Azure AI, and I demonstrate transferring files using Azure Blob Storage. And it looks like your scenario is an interesting case for one of my future videos.
Hi Maxim, I've been following your channel and getting valuable info around Azure (integration); may I know if you provide any coaching with real-time scenarios? If so, I would like to subscribe to it.
Hi! I appreciate your feedback, and I'm glad you find my videos useful. I aim to make my videos a form of coaching by selecting topics and scenarios from my own experience as a full-time IT consultant. All my videos are practical, discussing things I've used in my projects. I encourage you to leave your questions in the comments. Depending on the complexity of the problem, I'll either respond with a solution or create my next video to address it. Feel free to ask me anytime.
@MaximKhoruzhiy Yep, we get extensive knowledge from your videos; I appreciate it.
Awesome
Thank you. Your feedback inspires me to work further.
Facing a similar situation in Power Automate, where the code view of the Dataverse connector is read-only. How can I tackle this in Power Automate? Any suggestions? Thanks.
There is no standard way to edit workflow code in Power Automate, but I have found a copy-paste approach that serves as a useful workaround:
1. Click the action's ellipsis (three dots on the right side) and select "Copy to My Clipboard".
2. Paste the content into your text editor, make the necessary changes, and copy the edited JSON.
3. Click the "+" button and select "Add an action".
4. Choose "My Clipboard" and press Ctrl-V. Your edited action will be added to the group.
5. Select it and delete the old action.
A word of advice: avoid using Power Automate for enterprise-scale automations and integrations; this technology is best suited for personal automations. I hope this helps. Please let me know if you have any further questions.
One of the best integration tutorials has been posted on TH-cam. Many thanks for sharing this knowledge and experience. The explanations were clear and easy to follow, making complex concepts more understandable. This will definitely help many people in their learning journey. Great work, and please keep up the excellent content!
Thank you for your feedback. I'm glad that my videos help you. Unfortunately, I can't produce videos often, as I'm a full-time IT consultant, not a video blogger, but the videos I produce are based on real-life scenarios and experience, from a developer's point of view. I hope this gives more value to the information I share.
I want to edit SharePoint documents using a Python function in an Azure Function App. I passed the credentials of a SharePoint account that has full access (the email and password) in the environment variables, but it doesn't work; it gives me a 502 Bad Gateway Server Error after two minutes of execution! Can you help me with that?
The status code 502 means that there is a gateway between your Azure Function and the SharePoint site, and the communication with SP fails. Your case is complicated by a specific network topology that affects the communication. Is the SP site in the cloud (SharePoint Online)? Then your Azure Function is most probably in a VNet with specific routing rules and/or network devices, and that is the cause of the problem. Is your SharePoint on-premises? Then check the communication between the Azure Function and your local network. I would start with an analysis of the network setup in your organisation's Azure tenant.
Thank you for this nice tutorial, Maxim. It would be wonderful to see a complex scenario (if possible) which includes query fields, data validation, polygon data to highlight something in the invoice, and storing the data in a database.
Thanks for the feedback! I'll start with the last one: I stored some extracted content data in a SharePoint library along with the image file, which is very similar to storing it in a database. Which DB did you mean? Query fields: the fields are in the JSON object. Are you interested in querying JSON? Something else? Validation: yes, sounds reasonable. I'll put it into my plan. "Highlight something in the invoice": highlighting the results of validation? Yes, sounds great. Do you have a similar use case? I'll put it into my plan. 👍
@MaximKhoruzhiy Hi Maxim, I apologise for any confusion; I mentioned those points in the context of document evaluation (a good use case for many business processes that may help a lot of developers). One scenario is employees submitting multiple invoices for the expense-claim process:
1) Invoice data is extracted (optionally including some query fields in the invoice prebuilt model to extract custom/additional data).
2) The data is stored in a database to be compared: the total amount entered by the employee versus the total calculated from the invoices (I don't know how easy this may be using SharePoint data).
3) Errors (mismatches) are highlighted using a bounding box (polygon data) on the invoice.
I know this may take a lot of effort and may not be part of your tutorial roadmap, and I totally understand that. Truly, your tutorials are informative and act as a good foundation for beginners.
Audio is very low
Thanks for the feedback. I'm working on it.
I have a case where a customer is trying to do exactly that. Thanx man ❤
My pleasure :)
At 4:38, how did you add the ForEach step? Is that a custom step?
No, it is a standard action. The For Each loop is an action in the "Control" action group, which you can choose and set up according to your needs. However, the Logic App designer will help you: if you handle a data element that is a member of a list/set/array (for example, you assign this element to a variable), the designer will automatically add the For Each loop so that the handling is done for each element of the set, in iterations. Let me know if it is still unclear.
Thanks! One question: since you are moving back to a Logic App HTTP trigger, what is your recommendation regarding securing this public endpoint?
Sorry, I'm afraid I don't understand "...since you are moving back to a logic app http trigger", but if you are asking about securing the Logic App in general, there are multiple options. First off, the HTTP trigger is protected by default by a SAS key; you will find it in the HTTP trigger URL. In most cases this is enough to secure your Logic App.
- If you want to apply more security measures, in the "Workflow settings" you can specify a range of IP addresses the Logic App will accept requests from.
- You can enable an Azure AD authorization policy for your Logic App so that it analyses the security token for the claims you specify in the policy (an alternative to the SAS key).
- You now have the option of using Logic Apps of the Standard plan type, as opposed to the Consumption plan, which gives you better networking-isolation capabilities.
- And you can protect your multi-tenant Logic App with an ISE (Integration Service Environment), which moves your Logic App out of the multi-tenant environment.
The last two solutions come at a higher price, so you should find a balance between the security requirements and the selected measure. I hope this gives you an overview of the options to start with.
@@MaximKhoruzhiy thanks for your in depth answer. Really appreciated! 👍
My pleasure, @MyNameIsDr. Feel free to ask questions.
Great session, Sr. Maxim, thanks a lot!
My pleasure. You may find something else helpful on my channel, and it is alive; there is more to come.
Perhaps I am misunderstanding 'auto-complete', but is it not 'safer' to only release the msg from the Q after it has been successfully processed by the Logic App (i.e. use 'peek lock')?
Thanks for the question. There are two approaches: the "release" controlled by the consumer (effectively, the connector to the queue/topic) and the "release" controlled by your own code running in the consumer. Both work equally well if your custom code complies with the rules. Btw, "release" means returning the message to the queue to be processed again. "Auto-complete" will always return the message to the queue/topic if an unhandled error occurred. With "peek-lock", your code is more flexible about deciding when a message has to be released or completed; you may release the message back into the queue even though there was no error in the consumer handler. Having said that, it depends; you decide. Auto-complete is simpler to use. (See the sketch below.)
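As an illustration, here is a minimal Python sketch of the peek-lock pattern with the azure-servicebus SDK; the connection string, queue name, and handler are placeholders.
```python
# Peek-lock sketch: receive under a lock, then explicitly complete (remove)
# or abandon (release back to the queue) each message.
from azure.servicebus import ServiceBusClient, ServiceBusReceiveMode

def process(msg):
    # Placeholder for your actual handler.
    print(str(msg))

with ServiceBusClient.from_connection_string("<connection-string>") as client:
    receiver = client.get_queue_receiver(
        queue_name="orders",                           # placeholder queue
        receive_mode=ServiceBusReceiveMode.PEEK_LOCK,  # not auto-complete
    )
    with receiver:
        for msg in receiver.receive_messages(max_message_count=10,
                                             max_wait_time=5):
            try:
                process(msg)
                receiver.complete_message(msg)   # success: remove from queue
            except Exception:
                receiver.abandon_message(msg)    # failure: release for retry
```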
Do you have the code in a GitHub repo to follow along?
I'll provide the link to the code in GitHub in a while. I'll drop a message here.
The link to the GitHub repo is now available in the description of this video.
@MaximKhoruzhiy Thank you... great video!
🙏🏻🙏🏻🙏🏻
Very well explained, and one of the best channels for Azure Integration Services.
Thank you :) My plan is to deliver more guidance content like this. Stay tuned.
Great content. Subscribed.
Thank you. I hope you'll find more interesting content; I share my work experience, and it's an ongoing process.
👍🏻👍🏻👍🏻
very detailed analysis👍🏻