Hi,
can you please share the data table? I'd like the table with this data to try the demo. Thank you so much!
Can I use another LLM, like Meta Llama-3 or another model from Hugging Face, instead of GPT-3.5?
You could, but you would need to deploy that LLM somewhere you can reach it from Python. Azure OpenAI Service hosts the model compute for you (and you pay per token). You can also pay per token by deploying models in Azure ML. Alternatively, you can host the LLM(s) in other services on other cloud platforms and send them requests from a notebook in Fabric.
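As a minimal sketch of that last option: if your self-hosted model exposes an OpenAI-compatible chat endpoint (as many Hugging Face / vLLM deployments do), you can call it from a Fabric notebook with plain Python. The endpoint URL, API key, and model id below are placeholders, not real values:

```python
# Sketch: calling a self-hosted, OpenAI-compatible LLM endpoint from Python.
# The URL, key, and model id are assumptions -- substitute your deployment's values.
import json
import urllib.request


def build_chat_request(endpoint_url: str, api_key: str, prompt: str) -> urllib.request.Request:
    """Build an HTTP POST request for an OpenAI-compatible chat completions endpoint."""
    payload = {
        "model": "meta-llama/Meta-Llama-3-8B-Instruct",  # assumed model id
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        endpoint_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


# Usage (this part actually sends the request, so it needs a live endpoint):
# req = build_chat_request("https://<your-endpoint>/v1/chat/completions", "<key>", "Hello")
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same pattern works whether the model runs in Azure ML, another cloud, or your own server, as long as the endpoint is reachable from the notebook.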
Nice, keep making videos!
Thanks!
Genius! Can you please do a video on how the OpenAI model in Azure was configured to give you these results? Thanks!