LinoTV
United States
Joined Jul 6, 2018
LinoTV is excited to produce videos that explain several technologies and help with the adoption of these wonderful products:
Sitefinity, GenAI, Azure, AI, Prompt Flow, Machine Learning, IoT, Microsoft Fabric, Data Analytics, and many others that are close to our heart.
Part 06A - Setting up Azure API Management for Azure OpenAI
In this video, we will demonstrate how to set up and use Azure API Management (APIM) to manage Azure OpenAI APIs and set the tokens-per-minute (TPM) limit. We will also demonstrate monitoring token usage with Application Insights.
Views: 89
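For readers who want to try this before watching, here is a minimal Python sketch of calling an Azure OpenAI chat-completions endpoint that sits behind APIM. The gateway URL, deployment name, and API version are hypothetical placeholders, and the subscription-key header name depends on how the API was imported into APIM.

```python
# Minimal sketch: calling Azure OpenAI through an APIM gateway.
# The gateway URL, deployment name, and api-version below are placeholders;
# adjust them to match your own APIM and Azure OpenAI configuration.
import os
import requests

APIM_GATEWAY = "https://my-apim-instance.azure-api.net"   # hypothetical APIM base URL
DEPLOYMENT = "gpt-4o"                                      # hypothetical model deployment name
API_VERSION = "2024-02-01"

url = f"{APIM_GATEWAY}/openai/deployments/{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
headers = {
    # APIM subscription key; the header name depends on how the API was configured in APIM.
    "Ocp-Apim-Subscription-Key": os.environ["APIM_SUBSCRIPTION_KEY"],
    "Content-Type": "application/json",
}
payload = {"messages": [{"role": "user", "content": "Hello from behind APIM"}]}

resp = requests.post(url, headers=headers, json=payload, timeout=30)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

When the TPM limit configured in the APIM policy is exceeded, the gateway typically answers with HTTP 429, which is the behavior the limit in the video is meant to enforce.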
Videos
05B Using Semantic Kernel to Build Agents and Use Plugins
Views: 19 · 2 hours ago
In this video, we will demonstrate how to build Agents using Semantic Kernel and inject Plugins that the Agent uses to save data to a Cosmos DB container.
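As a rough sketch of the plugin idea described above (not the exact code from the video; the class name, connection settings, and container names are made up, and the Semantic Kernel Python API can vary slightly by version):

```python
# Sketch of a Semantic Kernel plugin that an agent can call to save data to Cosmos DB.
# Endpoint, key, database, and container names are placeholders.
import os
import uuid

from azure.cosmos import CosmosClient
from semantic_kernel.functions import kernel_function

class CosmosWriterPlugin:
    def __init__(self) -> None:
        client = CosmosClient(os.environ["COSMOS_ENDPOINT"], credential=os.environ["COSMOS_KEY"])
        self._container = client.get_database_client("agentdb").get_container_client("notes")

    @kernel_function(name="save_note", description="Saves a note for the user into Cosmos DB.")
    def save_note(self, user_id: str, note: str) -> str:
        item = {"id": str(uuid.uuid4()), "userId": user_id, "note": note}
        self._container.upsert_item(item)
        return f"Saved note {item['id']}"

# The plugin is then registered on the kernel so the agent can invoke it, for example:
# kernel.add_plugin(CosmosWriterPlugin(), plugin_name="cosmos_writer")
```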
Part 5A - Using Semantic Kernel for Embedding and Vectorization
Views: 13 · 2 hours ago
In this video, we will replace the direct injection of Azure OpenAI models for embedding with Semantic Kernel's built-in functionality to accomplish the same thing.
Part 4C - Saving Audio Vectors in Cosmos DB
Views: 3 · 2 hours ago
In this video, we will demonstrate embedding and vectorizing the transcribed content of audio files into Cosmos DB, then asking questions against the vectorized data in Cosmos DB with a maximum number of results returned and the vector distance.
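For context, a Cosmos DB for NoSQL vector query can cap the number of results with TOP and return the vector distance for each match. This is a small sketch with the azure-cosmos Python SDK, assuming a container whose vector path is /embedding and a query vector that has already been computed (for example, by embedding the user's question with Azure OpenAI).

```python
# Sketch: querying vectorized transcripts in Cosmos DB with a max result count
# and the vector distance returned per match. Names below are placeholders.
import os
from azure.cosmos import CosmosClient

client = CosmosClient(os.environ["COSMOS_ENDPOINT"], credential=os.environ["COSMOS_KEY"])
container = client.get_database_client("audio").get_container_client("transcripts")

query_vector = [0.01] * 1536  # in practice, the embedding of the user's question

query = (
    "SELECT TOP @k c.id, c.text, VectorDistance(c.embedding, @vec) AS score "
    "FROM c ORDER BY VectorDistance(c.embedding, @vec)"
)
results = container.query_items(
    query=query,
    parameters=[{"name": "@k", "value": 5}, {"name": "@vec", "value": query_vector}],
    enable_cross_partition_query=True,
)
for item in results:
    print(item["id"], round(item["score"], 4))
```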
Part 4B - Using Azure AI Speech Extraction and Sentiment Methods
Views: 6 · 2 hours ago
In this video, we will demonstrate how to use the Azure AI Speech and Language SDKs to perform Extractive, Abstractive, and Query-Based summarization, then build a Sentiment JSON response to follow the progress of the call.
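A condensed sketch of the same idea with the azure-ai-textanalytics package. The endpoint, key, and transcript are placeholders, and the summarization actions require a recent SDK version, so treat this as an outline rather than the video's exact code.

```python
# Sketch: extractive and abstractive summarization plus sentiment with Azure AI Language.
import os
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import (
    TextAnalyticsClient,
    ExtractiveSummaryAction,
    AbstractiveSummaryAction,
)

client = TextAnalyticsClient(
    endpoint=os.environ["LANGUAGE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["LANGUAGE_KEY"]),
)
transcript = ["Agent: Thanks for calling support. Customer: My router keeps dropping the connection..."]

poller = client.begin_analyze_actions(
    transcript,
    actions=[ExtractiveSummaryAction(max_sentence_count=3), AbstractiveSummaryAction()],
)
for doc_results in poller.result():
    for result in doc_results:
        if result.kind == "ExtractiveSummarization":
            print("Extractive:", " ".join(s.text for s in result.sentences))
        elif result.kind == "AbstractiveSummarization":
            print("Abstractive:", result.summaries[0].text)

# Sentence-level sentiment that can be assembled into a JSON timeline of the call.
for doc in client.analyze_sentiment(transcript):
    for sentence in doc.sentences:
        print(sentence.text, sentence.sentiment, sentence.confidence_scores.positive)
```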
Part 4A - Using Azure AI Speech to transcribe Customer Service Conversations
Views: 6 · 2 hours ago
In this video, we will demonstrate how to transcribe phone calls and audio files using Azure AI Speech Services and check compliance to make sure the call meets the company's requirements.
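A minimal one-shot sketch with the azure-cognitiveservices-speech package. The key, region, and file name are placeholders, and a full customer-service call would normally use continuous or conversation transcription rather than a single recognize_once call.

```python
# Sketch: one-shot transcription of a short audio file with Azure AI Speech.
import os
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(
    subscription=os.environ["SPEECH_KEY"], region=os.environ["SPEECH_REGION"]
)
audio_config = speechsdk.audio.AudioConfig(filename="customer_call.wav")  # placeholder file
recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config, audio_config=audio_config)

result = recognizer.recognize_once()
if result.reason == speechsdk.ResultReason.RecognizedSpeech:
    print("Transcript:", result.text)
else:
    print("No speech recognized:", result.reason)
```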
Part 3C - Using Vectorization in the UX with Max results and Similarity Score
Views: 10 · 2 hours ago
In this video, we will demonstrate how to test our embedding and vectorization using Swagger and then implement the technique in the Streamlit Python-based application to perform the vector search.
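As an illustration only (the endpoint URL and JSON shape below are assumptions, not the actual contract from the series), a Streamlit page that sends the user's question to a vector-search API and renders the matches could look like this:

```python
# Sketch: a small Streamlit page that calls a vector-search API and shows the results.
import requests
import streamlit as st

st.title("Vector search demo")
question = st.text_input("Ask a question")
max_results = st.slider("Max results", 1, 20, 5)

if st.button("Search") and question:
    # Hypothetical endpoint exposed by the API built earlier in the series.
    resp = requests.post(
        "http://localhost:7071/api/vector-search",
        json={"query": question, "top": max_results},
        timeout=30,
    )
    resp.raise_for_status()
    for hit in resp.json().get("results", []):
        st.write(f"score {hit.get('score', 0):.4f}: {hit.get('text', '')}")
```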
Part 3B - Create an Azure Function to perform the Embedding and Vectorization
Views: 95 · 4 hours ago
In this video, we will demonstrate how to create and deploy an Azure Function to perform the embedding and vectorization of data in Cosmos DB.
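A trimmed-down sketch of such a function using the Python v2 programming model and the openai package. The route, deployment name, database, and container below are placeholders, not necessarily what the video uses.

```python
# Sketch: an HTTP-triggered Azure Function that embeds incoming text and upserts it to Cosmos DB.
import json
import os
import uuid

import azure.functions as func
from azure.cosmos import CosmosClient
from openai import AzureOpenAI

app = func.FunctionApp()

@app.route(route="embed", auth_level=func.AuthLevel.FUNCTION)
def embed(req: func.HttpRequest) -> func.HttpResponse:
    text = req.get_json().get("text", "")

    aoai = AzureOpenAI(
        azure_endpoint=os.environ["AOAI_ENDPOINT"],
        api_key=os.environ["AOAI_KEY"],
        api_version="2024-02-01",
    )
    vector = aoai.embeddings.create(input=[text], model="text-embedding-ada-002").data[0].embedding

    cosmos = CosmosClient(os.environ["COSMOS_ENDPOINT"], credential=os.environ["COSMOS_KEY"])
    container = cosmos.get_database_client("ragdb").get_container_client("chunks")
    container.upsert_item({"id": str(uuid.uuid4()), "text": text, "embedding": vector})

    return func.HttpResponse(json.dumps({"dimensions": len(vector)}), mimetype="application/json")
```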
Part 3A - Initializing Cosmos DB for Embedding and Vectorization
Views: 11 · 4 hours ago
In this video, we will demonstrate how to set up Cosmos DB for embedding and vectorization and how to set the Container Vector Policy that establishes the rule for how Cosmos DB will embed and vectorize content in its containers.
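For reference, the container vector policy can be supplied when the container is created. This sketch uses the azure-cosmos Python SDK (recent versions expose a vector_embedding_policy parameter) with placeholder names and a 1536-dimension cosine policy.

```python
# Sketch: creating a Cosmos DB container with a vector embedding policy and a vector index.
import os
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(os.environ["COSMOS_ENDPOINT"], credential=os.environ["COSMOS_KEY"])
database = client.create_database_if_not_exists("ragdb")

vector_embedding_policy = {
    "vectorEmbeddings": [
        {"path": "/embedding", "dataType": "float32", "distanceFunction": "cosine", "dimensions": 1536}
    ]
}
indexing_policy = {
    "vectorIndexes": [{"path": "/embedding", "type": "quantizedFlat"}]
}

container = database.create_container_if_not_exists(
    id="chunks",
    partition_key=PartitionKey(path="/id"),
    vector_embedding_policy=vector_embedding_policy,
    indexing_policy=indexing_policy,
)
print("Container ready:", container.id)
```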
Part 2B - Using Semantic Kernel for Function Calling into APIs
Views: 30 · 7 hours ago
In this video, we will demonstrate the ability to use the Semantic Kernel framework to make Function Calls into an API based on retrieval from Azure SQL Database.
Part 2A - Deploying an API with Azure SQL Database Access
Views: 28 · 7 hours ago
In this video, we will run scripts to create and populate multiple tables in an Azure SQL Database and use C# to implement an API layer to access the data both locally and in Azure.
Part 1C - Azure Deployment of the API and Dashboard using GitHub Secrets and Variables
Views: 18 · 7 hours ago
In this video, we will demonstrate how to turn on GitHub Actions with two workflows, one for the Python dashboard app and one for the C#-based API application, to run and test the chat in Azure.
Part 1B - Vectorizing Documents in Azure AI Search
Views: 25 · 7 hours ago
In this video, we will demonstrate embedding and vectorizing documents from Azure Blob Storage into an Azure AI Search index to start chatting with an LLM using the vectorized data.
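A small sketch of the two sides of that flow with the azure-search-documents package: uploading a document that already has an embedding, then running a vector query. The index name and field names are placeholders, and the index with its vector field is assumed to exist already.

```python
# Sketch: uploading a pre-embedded document to an Azure AI Search index and running a vector query.
import os
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery

search = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],
    index_name="docs-index",  # placeholder index
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)

doc_vector = [0.01] * 1536  # in practice, the embedding returned by Azure OpenAI
search.upload_documents(documents=[{"id": "1", "content": "Hello world", "contentVector": doc_vector}])

results = search.search(
    search_text=None,
    vector_queries=[VectorizedQuery(vector=doc_vector, k_nearest_neighbors=3, fields="contentVector")],
)
for r in results:
    print(r["id"], r["@search.score"])
```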
Part 1A - Deploying all the resources needed using Bicep
Views: 27 · 7 hours ago
In this first video, to start our series, we will fork and clone a repository on GitHub and start deploying all the resources needed using a Bicep script. You can follow along using this repo: microsoft.github.io/TechExcel-Integrating-Azure-PaaS-and-AI-Services-for-AI-Design-Wins/
Merry Christmas and Happy 2025 to all
Views: 2 · 10 days ago
Wishing all our viewers, friends and family a glorious holiday season, Merry Christmas and Happy New Year 2025
FoundationaLLM - Why the need for FoundationaLLM?
Views: 95 · 1 month ago
FoundationaLLM - 10k Excel File Analysis
Views: 136 · 1 month ago
FoundationaLLM - Home Equity Loan Brochure Quiz Generation - Part2
Views: 10 · 1 month ago
FoundationaLLM - Recognizing a Home Equity Loan brochure - Part 1
Views: 22 · 1 month ago
Azure AI Studio - Safety Evaluation - Part 7
Views: 90 · 2 months ago
Azure AI Studio - Quality Evaluation - Part 6
Views: 108 · 2 months ago
Azure AI Studio - Prompt Evaluation - Part 5
Views: 97 · 2 months ago
Azure AI Studio with Code - Debugging - Part 3
Views: 87 · 2 months ago
Azure AI Studio - Phi-3.5 Mini Deployment
Views: 178 · 2 months ago
FoundationaLLM - Contextual Image Upload
Views: 36 · 2 months ago
FoundationaLLM - Inline Context Agents with OpenAI Assistants
Views: 52 · 2 months ago
FoundationaLLM - Inline Context Agents
Views: 46 · 2 months ago
Azure AI Studio with Code - Execution - Part 2
Views: 264 · 2 months ago
Hi Lino, I have done the setup following your instructions, but I am getting an error in Prompt Flow. In the lookup section an error occurs every time, saying: Run failed: Exception: Exception occured in search_function_construction. Here is the traceback:
Traceback (most recent call last):
  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/pydantic/_internal/_generate_schema.py", line 864, in _resolve_forward_ref
    obj = _typing_extra.eval_type_backport(obj, globalns=self._types_namespace)
  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/pydantic/_internal/_typing_extra.py", line 279, in eval_type_backport
    return _eval_type_backport(value, globalns, localns, type_params)
  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/pydantic/_internal/_typing_extra.py", line 303, in _eval_type_backport
    return _eval_type(value, globalns, localns, type_params)
  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/site-packages/pydantic/_internal/_typing_extra.py", line 332, in _eval_type
    return typing._eval_type(  # type: ignore
  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/typing.py", line 292, in _eval_type
    return t._evaluate(globalns, localns, recursive_guard)
  File "/azureml-envs/prompt-flow/runtime/lib/python3.9/typing.py", line 554, in _evaluate
    eval(self.__forward_code__, globalns, localns),
  File "<string>", line 1, in <module>
NameError: name 'AzureSearch' is not defined
The above exception was the direct cause of the following exception:
Could you please tell me how I can fix it? I have looked at a lot of documents but none of them are working.
Can you do this with a curl API call? I mean, upload an image from an API call to this chatflow.
Yes you can, you just need to include a "data" key and assign the base64 value of the image in the JSON request. I will record a video for that soon. Thanks for asking.
@LinoTadrosTV Niceee, it would be great if you could also cover other curl calls, such as using the text-to-speech feature.
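Following up on the reply above, here is a minimal Python sketch of such a call. The endpoint URL and the exact placement of the base64 payload are assumptions (the reply only says the image goes under a "data" key), so adjust it to your chatflow's API.

```python
# Sketch: sending a question plus a base64-encoded image to a chatflow's prediction endpoint.
import base64
import requests

with open("invoice.png", "rb") as f:  # placeholder image file
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {
    "question": "What is the total amount on this invoice?",
    # Per the reply above, the image is passed as a base64 value under a "data" key;
    # the exact nesting depends on how your chatflow expects uploads.
    "data": f"data:image/png;base64,{image_b64}",
}

resp = requests.post("http://localhost:3000/api/v1/prediction/<chatflow-id>", json=payload, timeout=60)
print(resp.json())
```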
Hey this is great! My only problem is, when I go to create a page, it says there are no templates. When I try to make a template, it says there are no base templates. Any thoughts?
Thanks Keith, try it again, but don't forget to run npm install and make sure the starter repo you are starting with contains the sample template; otherwise nothing will show up. I just tried all the steps again and the NextJS template showed up right away.
Thanks for the video. It would be great if you could answer two questions I had regarding this feature. Does Fabric automatically detect any changes in the source file and update the customer table? Is there any significant latency in processing big data queries when the data is accessed over S3 and is cross-platform?
Microsoft Fabric does not automatically detect file changes in external sources without configuring a refresh or monitoring mechanism. To enable automatic updates, you typically need to set up pipelines with triggers or schedule refreshes. You can configure these pipelines to detect incremental changes using techniques like watermarking, timestamps, or change data capture (CDC). For the second question, remember that Fabric's compute resources are hosted in Azure regions, and accessing Amazon S3 involves cross-cloud data transfer. Network latency between Azure and AWS regions can impact query performance, especially for large datasets. There is no doubt it will be faster if you move the data to OneLake first, but it all depends on the volume of data we are talking about. Hope that helps!
Wow! thank you for this awesome tutorial series! Well explained and easy to understand!
Ciao Lino, please keep going, as these tutorials are amazing, complete, and clear! Greetings from Roma 🇮🇹
Hi Lino, I have a few questions about indexing. Hope you can clarify them for me. I have uploaded my documents. My documents are of mixed types, like PDF, Excel, Word, and images. Also, some of these, like the images and PDFs, have text in non-English languages, so when I choose to do the indexing (considering vectors), will it automatically handle OCR, document intelligence, language detection, and translation to the target language? Would you please tell me about it?
Excellent question! No, the OCR part will not be done automatically, but that is the good part of Prompt Flow: it allows you to add a Python node to do the OCR beforehand using a 3rd-party tool or Azure AI Document Intelligence to extract the images from the documents, apply an image embedding model like Azure AI Vision to generate vectors for each extracted image, then store the extracted text and image vectors in an Azure AI Search index; the rest is the same. Hope that makes sense. Happy Holidays!
@LinoTadrosTV Yes, that makes sense. Would it be possible for you to make a video about it? The other way I was thinking about was storing the data in Blob Storage, then using custom skillsets to build indexes, and later using that index to vectorize. However, the approach you suggested is the most convincing one to me. I would kindly request you to make a tutorial about it, covering images and then how in Prompt Flow we add code to do the OCR, mainly with 3rd-party/MS tools. My sincere request, and I hope you will consider this. Thanks in advance!
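To make the suggestion above concrete, a Prompt Flow Python node could call Azure AI Document Intelligence's prebuilt-read model to pull the text out of a scanned document before embedding. This is only a sketch with placeholder environment variables, not the requested tutorial.

```python
# Sketch: a Prompt Flow Python tool that OCRs a document with Azure AI Document Intelligence
# (prebuilt-read) so the extracted text can be embedded and pushed to Azure AI Search afterwards.
import os
from azure.core.credentials import AzureKeyCredential
from azure.ai.formrecognizer import DocumentAnalysisClient
from promptflow import tool

@tool
def ocr_document(file_path: str) -> str:
    client = DocumentAnalysisClient(
        endpoint=os.environ["DOCINTEL_ENDPOINT"],
        credential=AzureKeyCredential(os.environ["DOCINTEL_KEY"]),
    )
    with open(file_path, "rb") as f:
        poller = client.begin_analyze_document("prebuilt-read", document=f)
    # result.content holds the full text extracted across all pages.
    return poller.result().content
```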
clear! thanks!!!
I really appreciate how you introduce a new aspect of a problem by saying, 'but there is a problem.' It’s such an engaging way to guide us through the thought process and transition into exploring solutions. Thanks!
Thanks for the video
Awesome, thank you! This helped with my master's health assistant LLM.
short & sweet, to the point. Thank you!
Thank you so much, saved a lot of time with this!
Lino, great video. Do you have something similar but for Fabric with Visual Code?
Visual code???
@LinoTadrosTV Yes. For example, a Visual Code project for a Medallion Architecture in Fabric. Thx!!
Lino, great video (and all your videos)... keep pushing, please!
can i check how we include a page to allow user to use to enter the question ?
Did not understand the question, sorry
Azure AI is really a pain in the ass. Errors, synchronization issues, and warnings everywhere. And super expensive. Does Microsoft know that this product is garbage?
Truth be told, it is unstable. But they are working very hard to bring it to a level where it can be solid and productive. Some days I see us moving forward and some days it goes backward. That's the price we pay when GenAI is moving at the speed of light and everyone is trying to catch up, even Microsoft.
that was helpful, thanks
Thank you for this super helpful video!!
Sitefinity documentation isn't always clear. I appreciate you clarifying things as you go. Even the mistakes and server errors are very useful. Keep up the great work!
Could you do a detailed video on this? Every tutorial out there just shows SQLite, and you're the first one I've ever found using SQL Server, which I appreciate. I never knew about the driver until I watched this and got it to work for my RAG application. Could you make a video on using an agent to talk to all your tables with a custom prompt and memory, instead of making it zero-shot?
Excellent video. I have followed the exact same procedure, except I did the indexing in AI Search and used that in Prompt Flow. When I ran it by asking a question, the bot kept loading forever without replying to the question. Any idea or suggestion?
Not sure, have to look into it
You have great videos. I bought your Udemy course about Widgets. Do you have more courses? I'm interested in the Sitefinity API: how to create configs for Widgets, Pages, Scheduled Tasks, etc. Do you have more videos?
Yes, I have several Sitefinity courses on Udemy, but what you are looking for is my 2-day training class offered on my website at thetrainingboss.com, which covers all of that.
Why have you not used an env variable to fetch your API key? It is publicly available to everyone, right?
Yes, in my videos I show it both ways. The key was deleted before the video was released anyway. Thank you.
@LinoTadrosTV All right. Really insightful and simple-to-understand video. Thanks for that.
very nice ❤❤
Hi Lino, thanks for your videos! I wish I had seen them earlier. Could you give me some advice on how to easily and securely deploy my prompt flow endpoint into a web app? I’m looking for something similar to the easy deployment in the chat playground, but with my custom prompt flow endpoint. Thanks a lot!
Will do
Very nice. I have one question: what should I do if I change the chunk size after the AI Search index has been created?
You will have to recreate the index, as the 1536 floats that represent each chunk will be different during the embedding stage.
Awesome video, thanks for this great content.
Thanks Lino. All your Flowise tutorials have been fabulous! Appreciate your calm way of explaining things.
great tutorial, thanks!
Great overview of Multi Agents, thanks!
Thank you!
Thanks for showing this, it looks like a really cool feature that will save a lot of time. Great video!😁
Thanks for the series about Azure AI, it sure is better than Microsoft Learn's material. Any plans for more episodes? :)
Yes, just waiting for the product to stabilize a bit more
@LinoTadrosTV Good to hear. Glad I am not the only one who thinks the product is not completely ready yet.
Amazingly clear tutorial please please do more!
Absolutely brilliant tutorial. Best I've seen!
Please make a video on Prompt Hub.
Do you ever get a dependency error? I am trying to set up an instance using "Progress.sitefinity.headless" and the install keeps failing because of dependencies from the DLL mentioned in the package.
No unfortunately I don’t.
🎉🎉🎉🎉
Thank you for uploading this video...
Subscribed🎉🎉
Please upload more videos
Very informative
Thank you for making Sitefinity videos like these. I've always felt the Sitefinity docs are not great, and well-explained, accessible information like this is very rare for Sitefinity online in general. I've been working with Sitefinity for over 8 years now, yet I still check YouTube as I learn so much quicker from videos.
Nice video!
Glad you enjoyed it
Thanks @LinoTV for this great tutorial. The default number of tokens for chunking is 1024 tokens. Do you have any idea how to change this value, for example to 256 tokens?
Chunking does not require any tokens. The embedding does.
I have a question. I'm trying to deploy the same standard prompt flow as you. However, I'm on a free-tier subscription and apparently only have 1K tokens available. Given that I have a GPT-4 model already deployed that takes up that 1K capacity, will I be able to deploy a flow that is connected to that model, or will I then need 2K capacity? I'm currently receiving a 400 error code when trying to deploy the same flow as you did (following your videos from parts 1 and 2, but on a free-tier sub).
Yes, actually the correct error number you are getting is probably 429, not 400 :) and yes, that means you do not have enough capacity between the two. It is difficult to do this on a free tier. Also, you can request an increase to your capacity for the deployed model you are using; it can take a few days to get it from Microsoft.
@LinoTadrosTV Thank you. I might go pay-as-you-go then. I think it isn't possible to request a quota increase with a free-tier sub. I tried and was denied an increase.
Please do more videos on this. I like how you ran through the different methods
Hi Sir, I'm a designer looking to learn web development. I looked up Sitefinity and came across your video. I'm wondering if you have any videos for beginners to Sitefinity or any course of your own? I'm really keen to learn.
www.udemy.com/courses/search/?src=ukw&q=Sitefinity+