Rob Kerr AI
Joined Sep 29, 2023
Welcome to my AI channel!
I'm a Principal Consultant with DesignMind, specializing in AI and Data solutions. I post videos here relating to AI technology, especially as it relates to using AI in data analytics and software development.
This channel is primarily used to host video content included in my blog, so please visit the main blog site at robkerr.ai too!
Deploy an Azure Web App via an Azure DevOps CD Pipeline with a custom App Registration
Learn how to deploy an Azure Web App using a DevOps pipeline and a custom app registration and service profile secret.
In this video we deploy an app from a DevOps Git Repo via a deployment pipeline. We'll cover the following configuration steps:
1. Create a new app registration in Entra
2. Create a client secret
3. Add deployment permissions to the Azure web app
4. Create a DevOps Service Connection
5. Create the DevOps Pipeline
6. Run and test the Pipeline
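The steps above might be sketched in a minimal azure-pipelines.yml like this. The service connection and app names here are placeholders, not the ones used in the video:

```yaml
# Minimal sketch of a CD pipeline deploying an Azure Web App.
# 'my-service-connection' (the DevOps Service Connection from step 4)
# and 'my-web-app' are hypothetical names.
trigger:
  - main

pool:
  vmImage: ubuntu-latest

steps:
  - task: AzureWebApp@1
    inputs:
      azureSubscription: 'my-service-connection'
      appName: 'my-web-app'
      package: '$(System.DefaultWorkingDirectory)'
```

The service connection authenticates using the Entra app registration and client secret created in steps 1-2.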
0:00 Introduction
0:18 Deployed Environment
0:55 Configuration Steps
1:56 App Registration
2:15 Client Secret
3:09 Authorize Service Principal
4:30 Service Connection
5:35 DevOps Pipeline
6:25 Run Pipeline
7:40 Test Python App
57 views
Videos
Windows Fine Tuning Combined Streams
1K views · 8 months ago
LLM Fine-Tuning is one of the go-to techniques for making LLMs perform better in specific scenarios. In this post I'll show you how to prepare a local Windows-based machine with an Nvidia GPU to serve as a development-scale fine-tuning workstation. This video is based on my blog post you can find here: robkerr.ai/fine-tuning-llms-using-a-local-gpu-on-windows/ 0:00 Intro 2:33 Install WSL 6:01 In...
Mirroring Snowflake data to Microsoft Fabric using Change Data Capture (CDC)
1.1K views · 9 months ago
This video is a quick walk-through using Fabric mirroring to replicate data from a Snowflake Data Warehouse into a Fabric Data Warehouse. Mirroring with Change Data Capture provides near real-time replication of Snowflake data. 0:00 Introduction 1:16 Enable Mirroring 1:58 Create the Mirror 3:10 Select Tables 4:05 Mirror Monitoring 4:55 Review Data 6:03 Add new Data 6:11 Summary
Importing Snowflake data to Microsoft Fabric using a Pipeline
1K views · 9 months ago
This video is a quick walk-through using a Fabric pipeline to import data from a Snowflake Data Warehouse to a Lakehouse Delta Table. Snowflake data sources are supported out-of-the-box in Fabric, making this type of integration straightforward! 0:00 Introduction 0:25 Create a Pipeline 0:55 Create a Snowflake Connection 2:13 Select Snowflake Data 3:00 Choose Destination 3:51 Run Pipeline 4:24 R...
Unleash Your Model's Potential: Guide to Deploying a Fabric ML Model on Azure ML Inference Endpoint
539 views · 11 months ago
Microsoft Fabric's Data Science workload leverages Synapse ML for training, and the resulting models are ideally suited to enrich data stored in a data lake. But models developed in Fabric can also be deployed in other environments like Azure Machine Learning to take advantage of real-time inference endpoint deployment and other techniques. This video walks through the steps to export a model from Fabric, import it ...
Deploying and Consuming Large Language Models from the Azure ML Model Catalog
461 views · 11 months ago
Explore Azure Machine Learning's selection of pre-trained models from leading sources like OpenAI, Microsoft, and Hugging Face. In this video I'll guide you through choosing a model, deploying it on Azure's computing services, and accessing it through a REST API. This video is a demo of this blog post content: robkerr.ai/deploy-azure-model-catalog-endpoint 0:00 Create Resource...
How to create and install custom Python libraries in Microsoft Fabric
2.4K views · 11 months ago
You've probably used %pip install to include open source libraries in your Fabric notebooks, but did you know you can also easily install your own custom Python libraries? This end-to-end walkthrough shows you how to create and package your own code in custom libraries in a Fabric Lakehouse. Packaging your code in libraries encourages reusability and reduces the need to paste commonly used cod...
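As a rough sketch of the packaging side (the package name here is hypothetical, not taken from the video), a minimal pyproject.toml for a custom library might look like:

```toml
# Hypothetical minimal metadata for a custom library wheel.
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "my_fabric_utils"   # placeholder package name
version = "0.1.0"
```

Building with `python -m build` produces a .whl file; once it's uploaded to the Lakehouse Files area, a notebook can typically install it with something like `%pip install /lakehouse/default/Files/my_fabric_utils-0.1.0-py3-none-any.whl` (path shown as an assumption; adjust to where you store the wheel).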
How to incorporate AI-Driven Sentiment Analysis models in a Fabric Data Lake Solution
217 views · 1 year ago
This video shows you how to use Azure AI sentiment analysis machine learning models within a Fabric Jupyter notebook to add user sentiment features to Data Lake tables, providing rich new quantitative data elements that help data analysts discover new insights in unstructured data. For links to source code and similar articles, visit the blog version of this post: robkerr.ai/fabric-ai-sentiment-...
Incorporating Azure OpenAI Services with Fabric to create insights from unstructured data
1.5K views · 1 year ago
In this video we'll use Azure OpenAI Services from a Fabric Jupyter notebook to automatically summarize large collections of semi-structured user review data. Topics covered include prompt engineering, secure secret storage in Azure Key Vault, and reading/writing data to Delta tables in a Fabric Lakehouse. For links to source code and similar articles, visit the blog version of this post: robke...
Using Azure AI Document Intelligence with Microsoft Fabric
2.1K views · 1 year ago
This video shows you how to build a Document Intelligence solution within Microsoft Fabric that incorporates Azure AI to extract text from scanned documents. Using a Spark Notebook and Synapse ML, we can easily ingest scanned document data into a Data Lake solution. For links to source code and similar articles, visit the blog version of this post: robkerr.ai/fabric-azure-ai-document-intelligen...
Incorporating Azure AI Translation into a Fabric Data Lake Solution
441 views · 1 year ago
Using Fabric's embedded Synapse ML features, we can leverage Azure AI services directly in Spark notebooks to enrich and transform data using Language AI Services. Learn how to use these powerful services using a Python Jupyter notebook stored in a Fabric workspace. For links to source code and similar articles, visit the blog version of this post: robkerr.ai/fabric-azure-ai-translation-notebook/
Calling Azure AI Document Intelligence using the REST API
9K views · 1 year ago
Azure AI Document Intelligence can read images and PDF scans of forms, extracting data for later use in data solutions. While various language SDKs are available, it's also possible to call these services directly using the REST API. This tutorial walks through the REST API process. For links to source code and similar articles, visit the blog version of this post: robkerr.ai/azure-ai-document-...
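As a rough sketch of the REST flow described above: the client POSTs a document to the analyze endpoint, then polls the Operation-Location URL returned in the response headers. The endpoint, key, and api-version values below are placeholders; check the current API version, since an outdated one is a common cause of 404 errors.

```python
"""Sketch of the Document Intelligence REST flow (assumed values, not the
exact ones from the video): POST to the analyze endpoint, then poll the
Operation-Location header with GET until the analysis completes."""
import json
import urllib.request

# Placeholder API version -- verify the currently supported one.
API_VERSION = "2023-07-31"

def build_analyze_url(endpoint: str, model_id: str,
                      api_version: str = API_VERSION) -> str:
    # endpoint is the resource base URL, e.g. https://<resource>.cognitiveservices.azure.com
    return (f"{endpoint}/formrecognizer/documentModels/{model_id}:analyze"
            f"?api-version={api_version}")

def analyze_document(endpoint: str, key: str, model_id: str,
                     document_url: str) -> str:
    """Submit a document by URL; returns the Operation-Location URL to poll."""
    req = urllib.request.Request(
        build_analyze_url(endpoint, model_id),
        data=json.dumps({"urlSource": document_url}).encode(),
        headers={"Ocp-Apim-Subscription-Key": key,
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # performs the network call
        return resp.headers["Operation-Location"]
```

The GET poll against the returned Operation-Location uses the same Ocp-Apim-Subscription-Key header and returns JSON with a status field and, when done, the extracted results.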
Committing a Microsoft Fabric Workspace to a Git Repository
1.6K views · 1 year ago
Microsoft Fabric supports connecting Fabric Workspaces with Git repositories. Watch this video to learn how to set up and use Git repos with Fabric. We'll cover configuration, syncing notebooks and other content, as well as how to use Git to recover unintended changes to notebooks. This video is a demonstration of a related post at robkerr.ai/fabric-git-repo-configuration 0:00 Introduction 0:5...
Create Gemini Chatbot
4.7K views · 1 year ago
Last week Google shipped the first version of its Gemini large language model and made APIs available to developers. In this post we'll build a fully functional Gemini Chatbot using Streamlit! This video is a walk through of the content in my blog post: robkerr.ai/how-to-create-a-google-gemini-chatbot/ 0:00 Introduction 1:05 Anaconda 1:21 Install Dependencies 2:43 Initialize Gemini SDK 3:05 Cre...
Creating a Custom Generative AI Microsoft Teams Copilot using Copilot Studio
8K views · 1 year ago
This video is an end-to-end walkthrough using Microsoft Copilot Studio to create a custom Microsoft Teams copilot that includes a range of features, including: • Generative answers using a web site as grounding knowledge • Generative answers using PDF documentation as grounding knowledge • Custom topics with branching logic The walkthrough starts from scratch and ends with a finished Copilo...
Display a Power BI Report in a Microsoft Fabric Jupyter Notebook
545 views · 1 year ago
Generative AI using Azure AI Integrated Search Vector Embeddings
510 views · 1 year ago
Generative AI in action: how I use Bing Image Creator and Photoshop Generative Fill together
311 views · 1 year ago
Fabric Semantic Link: Power BI as a Machine Learning Data Source
4.8K views · 1 year ago
Creating a Machine Learning Model with IBM Watson Studio and Jupyter
950 views · 1 year ago
Build a Generative AI Chatbot Using a Vector Database for Custom Data (RAG)
2K views · 1 year ago
Machine Learning with Fabric Data Science and Jupyter Notebooks
605 views · 1 year ago
Bring your own data to Azure OpenAI Service Large Language Models
549 views · 1 year ago
Exporting Azure AI Vision Models to run on iOS Devices
202 views · 1 year ago
Using AI Vision to Detect Packaging Defects
356 views · 1 year ago
Enriching Images with Azure AI Vision for Power BI Analysis
126 views · 1 year ago
Hey Rob, I understand where to get the general endpoint, but where does Microsoft provide the endpoints for specific services? Say instead of Form Recognizer, I want to generate captions on an image.
Hi Rob, what will be the URL for custom extraction models? I am getting a 404. I mentioned the API version correctly and followed all of your steps.
Did you find where to get the URLs for this? I tried looking in the documentation but couldn't find it.
Great presentation! Will that work with the lowest Fabric SKU? Thanks, Matt
It's so tiring when everything fails just because they changed their API calls.
I am trying to follow along with the example, but what is the whole line when you want to split the data: "X, y = df[['Pregnancies','PlasmaGlucose',...."? It's line 6 in the second code section, in the section on training the model, here: th-cam.com/video/XE3PWXB260U/w-d-xo.html
Rob, good overview. If we put this into production and started doing this activity on a scheduled basis, an enterprise-level system would expect connections to access a vault for credentials, and have those credentials be a service account rather than a user login. Finally, the authentication would probably be driven by an IdP. Any thoughts, or a possible future video on what that looks like for Snowflake connections in Fabric?
Cool, thanks Rob, this helps me a lot!
How do I pass a path to a PDF file? Thank you for the video.
Hi, can you please share the data table? I want the table with this data to try the demo. Thank you so much!
All steps worked, but I am getting a "ModuleNotFoundError: No module named 'shared_utils'" when importing the package.
Did you find a solution? I am having the same issue. I see my package when looking in the notebook with "%pip list" but get an error when doing "import" or "!pip show my_package_name". FYI @robkerrai
Great video, where can I find the notebook and pbi file to play with it. Thanks in advance ☺️
Very informative, thank you for sharing Rob!
Hi! Great video! I followed your steps but keep getting a 404 error on the GET request. Has there been an update to the GET syntax?
The first thing to look at when you get a 404 is whether the API version parameter is still valid. The video, I believe, uses a 2022 API version. Check what the current one is and use that in the parameters. Likely that will be the root cause.
@@robkerrai thank you so much for your answer! it was very helpful <3
If i have a custom model?
Thanks for this.... although I had one problem that I finally found a solution to. Currently you need to specify pytorch=2.2.0, otherwise you get an error due to new torch libraries not being compatible. ( github.com/unslothai/unsloth/issues/73#issuecomment-1998826713 )
Thanks for contributing that tip!
Thank you. Please do a video for batch and webapp endpoints too. Thanks
Thanks for the suggestion!
Hi Rob, Great video, very easy to follow. 😄
Thanks!
You didn't mask the key
Thanks for pointing that out! I regenerate keys after making recordings--less work than masking in post :-)
Hi Rob, thank you very much for the tutorial! I followed every single one of your steps. I am stuck in the inference part. I am getting the following KeyError. line: outputs = model.generate(input_ids = inputs, max_new_tokens = 64, use_cache = True) KeyError: 'Cache only has 0 layers, attempted to access layer with index 0'
Hmm...haven't seen that error. You might try going through the process in a Google Colab and see if the script you're using is working.
Super helpful and straightforward, thank you! I'm just getting started with Fabric - would love to see some content on managing libraries across environments (dev, test, prod) and also your own style of splitting a Fabric platform over multiple workspaces and into various lakehouses and warehouses!
Great content suggestions -- thanks!
great video
Glad you enjoyed it
Can I use another LLM like Meta Llama-3 or another model from Hugging Face instead of GPT-3.5?
You could, but you would need to deploy that LLM somewhere that you can reach it from Python. Azure OpenAI Services is hosting the model compute (and you pay per token). You can also pay per token by deploying models in Azure ML. You can also host the LLM(s) in other services on other cloud platforms and send them requests from a notebook in Fabric.
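To make the reply above concrete, a sketch of calling an externally hosted model from a notebook might look like the following. The endpoint URL and payload shape are assumptions (the OpenAI-style chat-completions convention many hosting services use), not a specific service's documented API:

```python
"""Sketch of calling an externally hosted chat model over REST.
The endpoint URL, key, and payload shape below are assumptions following
the common OpenAI-style chat-completions convention."""
import json
import urllib.request

def build_chat_request(endpoint_url: str, api_key: str, prompt: str,
                       max_tokens: int = 256) -> urllib.request.Request:
    # Assemble an OpenAI-style chat payload and an authorized POST request.
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return urllib.request.Request(
        endpoint_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {api_key}"},
        method="POST",
    )

# To actually send it (not executed here):
#   with urllib.request.urlopen(build_chat_request(...)) as resp:
#       reply = json.load(resp)
```

Whatever service hosts the model, the pattern is the same: store the key securely (e.g. Azure Key Vault), build the request, and parse the JSON response in the notebook.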
Is it possible for the user to trigger the run of the Fabric notebook directly from the Power BI report and then see the output of the notebook (for example, a feature importance chart) again in the Power BI report?
I'm not aware of how to do this, sorry.
This is very helpful. It's like I found your website (a needle) in the sea!
Glad it was helpful!
Thanks Rob, good demonstration!
Thanks!
I can add the App in my Teams, but it's not writing an introduction or answering anything :/
Not sure what could be wrong, but hope you figure it out!
AWESOME
Thank you! Cheers!
Keep it up really nice explanations in short
Thanks!
As I am very new to Microsoft universe, I was trying to do it the hard way. With your guide I saved several days of pain !! many many thanks !!
Glad it helped!
This really helped. I couldn't find a good video that showed me how to do this all weekend. But I would recommend not cutting out scenes, because people are looking at the video and following along, and in literally a glance left and right you've already done another step that I didn't even notice. I had to watch it like 4 times super slow to catch all the steps. Also, I don't know Linux, so skipping the part of making the code/finetune folders had me thinking I missed something again, and that had me turned around for a while. Also, when people pause, the red bar, play/pause buttons, subtitles, etc. cover up the bottom of the screen, so we can't even see if you do something quick at the bottom because it's covered up when we pause. Thanks again.
Thanks for the feedback! Sorry you got tripped up in some places, I'll work harder to make sure the flow is clean and doesn't get interrupted in the future.
Thanks Rob! I have 2 questions: 1. Can I create a ticketing system in Copilot Studio? 2. When I publish via Teams to a whole office of 60+ colleagues, they don't have to pay extra, right? Many thanks.
Possibly! It depends on the complexity you need. I'd suggest trying to prototype the functionality you need to see how it can fit. Regarding cost, when using Copilot Studio, it's billed based on the number of messages that flow through the copilot app you create, so I don't think you would see per-user charges for Teams users. Take a look at: www.microsoft.com/en-us/microsoft-copilot/microsoft-copilot-studio#Pricing
Hey Rob, how much time does it take to spin up the Spark cluster when running a new job? Or did you already have a cluster running, and when you ran the job, the job simply requested resources from that running Spark cluster?
If you use the default pool (I almost always do) a cluster will start processing Fabric jobs in 4-5 seconds since Fabric has cluster nodes already running and waiting to be assigned.
@@robkerrai I never had a chance to work with Fabric, but by 'default pool', do you mean that a cluster is always up and running in your environment and your job simply takes resources from there? Or do you mean that the default pool is serverless and it will only charge me for the time I use it, and also that it will spin up within 4-5 seconds? That would be extremely helpful if true. We are working with HDInsight and need to have our nodes running all the time only because we can't afford the 10-15 minutes it takes for a node/cluster to spin up when a new user starts a notebook.
Yes to both questions. Fabric is SAAS (not PAAS), so it works differently in that you pay for a "capacity" that you choose (your peak usage/second), and it is a relatively fixed cost rather than clusters you manage yourself billed as a variable cost. The default pool is a large group of always-running nodes waiting to be assigned to your work as needed--you don't pay for compute per se, you pay for the maximum nodes you can use at any given time. The default pool probably would solve the cluster startup latency issue you have. It is possible in Fabric to design your own cluster configuration, which would probably have the same startup delay as a PAAS environment, but needing to do that is the exception, not the rule. The default pool meets most general computing needs.
Perfect, thank you! That clarified a lot of things. Thanks for your time, really appreciate it
Thanks for sharing. You sure made it look easy.
Thanks for watching!
thanks for the video!
My pleasure!
Nice share! Is there a way to decrypt a wheel file obtained from elsewhere so we can see the .py scripts locally? Would you have some resources on this?
If I'm not mistaken, .whl files aren't encrypted. Try changing the extension to .zip and using unzip to get at the code within the .whl file.
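Since a .whl really is just a zip archive, Python's standard zipfile module can inspect one directly, no renaming needed. A minimal sketch (it builds a tiny stand-in wheel in memory purely for demonstration; substitute your real wheel's path):

```python
"""Sketch: a .whl file is a zip archive, so zipfile can list and read the
.py files inside. A tiny in-memory stand-in wheel is built here just to
demonstrate; open your real wheel by path instead."""
import io
import zipfile

# Build a stand-in wheel (a zip containing one module) for demonstration.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as whl:
    whl.writestr("my_pkg/__init__.py", "VERSION = '0.1.0'\n")
buf.seek(0)

# Inspect any wheel the same way: list members, then read the source files.
with zipfile.ZipFile(buf) as whl:
    members = whl.namelist()
    source = whl.read("my_pkg/__init__.py").decode()

print(members)  # ['my_pkg/__init__.py']
print(source)
```

For a real wheel, `zipfile.ZipFile("some_package-1.0-py3-none-any.whl")` works the same way (the filename here is hypothetical).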
Hey Rob, this was really helpful. I was looking to apply information from a form like this onto a prebuilt data entry tool with certain sections for each of the matching sections on the form. Would this be possible? Could you possibly make a video on this? Thanks
You can use Azure AI to train a custom model for your own forms. I'd like to make a video about this in the future so stay tuned!
Any chance you could reply to this comment with the fetch? Thank you very informative video and useful so far
Sorry Dylan, I don't understand what you mean by "fetch". Could you clarify?
Hi Rob, the content was very useful. Do you have a complete tutorial to learn Fabric? If you've already created one, please share the link. Thank you :)
I don't have a complete series. I'd recommend Will's series on his @LearnMicrosoftFabric channel: www.youtube.com/@LearnMicrosoftFabric
Thank you for your work, Rob. I appreciate this video!
So glad it was helpful!
What are the use cases for this? Can you help with that?
It's a good question. I see the benefit with this feature is to view Power BI output within a notebook during development. For example, if the data in a Direct Lake model is being updated in the notebook, the visual is available within the notebook context.
Hi Rob, thanks for this video. How would you go about separate lakehouses for dev and prod in relation to the notebook you sync? I tried doing exactly what you did here, but then in prod the notebook is trying to talk to the dev lakehouse. It seems the default lakehouse for the notebook is not put in Git?
Microsoft's vision as presented at FabCon 24 was that different branches in a CI/CD environment are connected to different workspaces. The Git features as I write this are pretty basic, but the design set out at FabCon is moving toward a more automated approach where creating a branch creates a workspace with all artifacts. In that scenario you probably would have a lakehouse in the Dev workspace, another lakehouse (with the same name) in a Prod workspace, etc.
Nice video
Thanks
Good video. I hope this flow happens automatically behind the scenes when mirroring happens.
This is a pipeline (not mirroring). You are correct: when using mirroring, the integration runs behind the scenes.
This video has been fantastic, but I have a possibly stupid question: when you ran the code, a job was automatically created; what did you do to get that? When I run mine, that does not happen.
Not sure I understand. Which portion of the video are you referring to?
thank you, very helpful
Glad it was helpful!
Thanks Rob
You're welcome!
Great video! I had a question: how do I call a custom model that I created in Azure Document Intelligence?
You can use Azure AI to train a custom model for your own forms. I'd like to make a video about this in the future so stay tuned!
Thank you for this tutorial, sir!
Glad it was helpful!
Thank you this helped a lot. How would it work with custom prompts for the bot?
For a more customized experience, Azure ML PromptFlow can provide more customization, so you might look into that.
Hi Rob, great video about how to use packages in Fabric. In fact, this video makes me think we could create a new workspace and lakehouse called SourceCode or SharedCode, as in the example, so we would have the code isolated from the Bronze, Silver, and Gold data.
I think you could accomplish this using Fabric environments. Each environment can have different .whl files attached to it, so when you choose an environment to run your code, that code will use whatever .whl files are pre-installed in the environment.