THIS IS REALLY BIG!!! I HAD BEEN WONDERING ABOUT THAT. Great and simple video instructions! Very POWERFUL !!!!
Hi @TeamCykelhold, thank you for watching! It depends on what you are looking to do. One thing you could do is use traditional Azure Data Factory. The PG added Lakehouse tables and Lakehouse Files to Data Factory, so if you need the ability to communicate with private endpoints, to use some of the items ADF supports, or to use data pipelines, this may be your go-to approach!
It seems strange that ADF in Fabric doesn't support on-prem sources, as there is no way to use a data gateway or application gateway to ingest the data. Does that mean you have to go back to using Azure ADF instead? It just seems like a strange oversight not to have that ability.
Hi, do you have any videos coming about using Azure HCI (private cloud) with the Azure public cloud as a single pane of glass? Extending SaaS offerings on-prem?
Hi @user-gf4oy6qo4u, sorry, we do not have any videos planned on HCI devices at this time. I checked with the team, and we work against a variety of services, but not that one.
We'd like to know more about the licensing costs for a data warehouse. Fabric seems very expensive for small/medium businesses, even with on-prem.
Hi @mnhworkdev1652, Neeraj Jhaveri did a video for us on this: th-cam.com/video/2rETPwBLnnY/w-d-xo.html The link jumps to the part of the video that covers licensing. Hope this helps; if you have more questions, let me know!
Thanks for sharing, this is very helpful. I wonder, is it possible to do real-time data ingestion into a data lakehouse in Fabric?
Hi @sansophany4952, @SQLBalls posted a video today showing how you can use real-time streaming to write data from an API to a Microsoft Fabric Lakehouse: th-cam.com/video/_RFvcPBNiOk/w-d-xo.html
@@Tales-from-the-Field thank you 🙏
Hi, is there a way to connect to an on-prem SQL Server using a Fabric notebook?
Hi @vishwassee per Bradley, "This is a really interesting question. I don't want to say no, but it may not be possible. The only way I could see this working would be if you have a site-to-site VPN connecting your on-premises network to Azure, a network gateway set up to that main network, an F64 provisioned, and a managed private endpoint for your workspace attached to that network. Provided you have end-to-end connectivity and can hit a source using JDBC or pyodbc (probably utilizing SQL Authentication), maybe this would work. Keep in mind I'm not a networking expert, and I do not have the means to set up this environment and proof it out... but MAN do I WISH I could, because if it did work it would be really cool!"
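For what it's worth, Bradley's speculative path could be sketched from a Fabric notebook roughly like this. This is a minimal, untested sketch under his stated assumptions (site-to-site VPN, network gateway, F64 capacity, managed private endpoint, end-to-end connectivity); the server name, database, and credentials are placeholders, and pyodbc would need to be installed in the notebook environment.

```python
# Hypothetical sketch only: assumes the site-to-site VPN, network gateway,
# and managed private endpoint described above are already in place.
# Server, database, and credentials below are placeholders, not real endpoints.

def build_connection_string(server: str, database: str,
                            user: str, password: str) -> str:
    """ODBC connection string using SQL Authentication, as Bradley suggests."""
    return (
        "DRIVER={ODBC Driver 18 for SQL Server};"
        f"SERVER={server};DATABASE={database};"
        f"UID={user};PWD={password};"
        "Encrypt=yes;TrustServerCertificate=yes;"
    )

def fetch_rows(conn_str: str, query: str):
    """Would only succeed with live end-to-end connectivity to the on-prem server."""
    import pyodbc  # third-party driver; must be available in the notebook environment
    with pyodbc.connect(conn_str, timeout=30) as conn:
        return conn.cursor().execute(query).fetchall()

conn_str = build_connection_string("onprem-sql.contoso.local", "SalesDB",
                                   "fabric_reader", "<password>")
# rows = fetch_rows(conn_str, "SELECT TOP 10 * FROM dbo.Orders")  # needs connectivity
```

If the networking pieces aren't in place, the on-premises data gateway route covered in the video remains the supported path.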
ADLS Gen2 not being available in Fabric's Warehouse view, and only in pipelines, made me think this wasn't an option...
As with all Azure products, I don't think the file menu system works well across the different views; Lake/Warehouse/workbooks/file storage each seem to have entirely different GUI teams, which is confusing and apparently hides viable options, tricking you into thinking there are limitations.
On top of that, the workbooks and the separate Databricks GUI have a horrible file explorer and DB explorer that expand to the right instead of downward (good luck knowing where you are three levels down when the screen is full of expanded menus), unlike the more suitable layout in Visual Studio or Windows itself.
Hi @dagg497 per Bradley, "Hi dagg497, I'm not sure which part of the video this comment refers to. You wouldn't really use Dataflow Gen2 to load ADLS Gen2, as it is a transform mechanism to land data in OneLake, a Data Warehouse, or a Data Lake. Pipelines do have ADLS Gen2 as a destination, and they are used more for data movement."
@SQLBalls Can this also be done without a virtual machine?
Hi @martijnolivier4553 per Bradley, "Hi @martijnolivier4553, not for an on-premises data source. There must be a physical or virtual machine where the Power BI Data Gateway can be installed. That creates a tunnel allowing secure communication and data transfer from on-premises to Microsoft Fabric / Azure."
I am getting this error after I publish: "GatewayCannotAccessSqlError: Couldn't refresh the entity because of an internal error". Any idea how to fix it?
Oh @SQLBalls, your ears must be burning this morning!
Hi @vishwassee, looking at the error, I found a similar issue and a solution on the Fabric help forum. It looks like they needed to add a firewall rule in their on-premises system to allow access to the SQL endpoint. I'll get the @Tales-from-the-Field account to post the link to the forum post.
@@SQLBalls here is the link you sent me! community.fabric.microsoft.com/t5/Dataflows/OnPremise-Gateway-cannot-write-to-lakehouse/m-p/3417668 hope this helps @vishwassee
How is new data in SQL Server shown in OneLake? Do you have to import it into Microsoft Fabric again?
Hi @ricardoabella867 per Bradley, "Hi Ricardo, when you import your data, you load it either as files or as a table. If you load it as files, you have to do additional work to land it as a table; if you load it to a table, you can select a Lakehouse or a Warehouse table. At that point you don't need to do any more work, and the data is accessible to you."
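The "additional work" Bradley mentions for data landed as files is typically a short notebook step. A hedged sketch, assuming a CSV landed under the default lakehouse's Files area (the lakehouse name, file path, and table name are illustrative):

```python
# Minimal sketch of landing Files data as a table from a Fabric notebook.
# The lakehouse name, file path, and table name are illustrative assumptions.

def files_path(lakehouse: str, relative: str) -> str:
    """Local mount path Fabric notebooks expose for a lakehouse's Files area."""
    return f"/lakehouse/{lakehouse}/Files/{relative}"

# Inside a Fabric notebook (where a `spark` session is provided), the
# files-to-table step is roughly:
#
#   df = spark.read.option("header", True).csv(files_path("default", "raw/sales.csv"))
#   df.write.mode("overwrite").format("delta").saveAsTable("sales")

print(files_path("default", "raw/sales.csv"))
# -> /lakehouse/default/Files/raw/sales.csv
```

Loading straight to a table skips this step entirely, which is why it's the simpler option when no file-level staging is needed.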
Interesting, but it seems inefficient for streaming data from key on-prem tables. Easy enough if you need something updated every few hours, but not great for much else(?)
Paging @SQLBalls !
How can we connect directly to SQL Server from Fabric and bring data into a Fabric workspace, not into a Lakehouse?
Hi @Rider-jn6zh per Bradley, "Hi @Rider-jn6zh, there are a couple of different methods to connect to SQL Server. This video was for Dataflow Gen2; we also have videos using Azure Data Factory and Microsoft Fabric Data Pipelines. However, if you are asking about the target location, we must land the data either in a Microsoft Fabric Data Warehouse or in the files or tables of a Lakehouse. The storage that contains files is currently only accessible via a Lakehouse in the GUI. You can write to storage directly for a Fabric capacity by using the APIs; however, to access that data via the GUI or in another form, you need a Lakehouse as of right now."
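The "write to storage directly using the APIs" route Bradley mentions can be sketched against OneLake's ADLS Gen2-compatible endpoint. This is a hedged sketch, not a definitive implementation: the workspace, lakehouse, and file names are placeholder assumptions, and the azure-identity and azure-storage-file-datalake packages are assumed to be installed.

```python
# Hedged sketch of writing directly to OneLake via its ADLS Gen2-compatible
# endpoint. Workspace, lakehouse, and file names below are placeholders.

ONELAKE_URL = "https://onelake.dfs.fabric.microsoft.com"

def lakehouse_file_path(lakehouse: str, relative: str) -> str:
    """Path within the workspace file system: <lakehouse>.Lakehouse/Files/<relative>."""
    return f"{lakehouse}.Lakehouse/Files/{relative}"

def upload_bytes(workspace: str, lakehouse: str, relative: str, data: bytes) -> None:
    """Upload raw bytes into a lakehouse's Files area; needs a real Fabric tenant."""
    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    service = DataLakeServiceClient(account_url=ONELAKE_URL,
                                    credential=DefaultAzureCredential())
    fs = service.get_file_system_client(workspace)  # workspace acts as the container
    file_client = fs.get_file_client(lakehouse_file_path(lakehouse, relative))
    file_client.upload_data(data, overwrite=True)

# upload_bytes("MyWorkspace", "MyLakehouse", "raw/orders.csv", b"id,qty\n1,2\n")
```

As Bradley notes, files written this way still need a Lakehouse to be browsable in the Fabric GUI.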