Can I use a dynamic path from a Lakehouse folder in Dataflow Gen2 to append data from multiple files into one table and remove duplicate rows? So if a new file is uploaded to the folder I choose, the Dataflow Gen2 picks up the newer file(s). If you know a way, please enlighten me.
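A minimal sketch of the pattern this question describes, shown outside Dataflow Gen2 for illustration (e.g. the kind of logic you could run in a Fabric notebook): append rows from every file in a folder and drop duplicate rows, keeping the occurrence from the most recently modified file. The file names, timestamps, and rows below are made-up example data, not real Lakehouse content.

```python
# Simulated folder listing: (file_name, modified_timestamp, rows).
# In a real Lakehouse you would list the files under /Files/<your folder> instead.
files = [
    ("sales_2024_01.csv", "2024-01-31", [("A", 10), ("B", 20)]),
    ("sales_2024_02.csv", "2024-02-29", [("B", 20), ("C", 30)]),  # ("B", 20) is a duplicate
]

def append_and_dedupe(files):
    """Append rows from all files, oldest file first, and drop duplicate
    rows, keeping the copy that came from the newest file."""
    ordered = sorted(files, key=lambda f: f[1])  # sort by modified timestamp
    seen = {}
    for _, modified, rows in ordered:
        for row in rows:
            seen[row] = modified  # newer files overwrite earlier duplicates
    return list(seen.keys())

result = append_and_dedupe(files)
```

In Dataflow Gen2 itself the equivalent would be a "combine files from folder" query followed by a remove-duplicates step; the sketch above only illustrates the append-then-dedupe logic.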
Hello sir, thank you so much for providing these productive videos. Today I faced a challenge whose solution I couldn't find elsewhere: how to extract data from SAP HANA Cloud into Microsoft Fabric (cloud-to-cloud connectivity). Could you please help me here?
Brilliant session 👍. Thank you Gilbert and Matthias.
Thanks for the great live session.
How can we implement incremental loading?
How do we trigger the pipeline using the API?
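A hedged sketch of one way to do this: Fabric exposes a REST API for running an item's job on demand, which can start a pipeline given its workspace and item IDs plus a Microsoft Entra ID bearer token. The endpoint shape below reflects my understanding of that API; verify it against current Microsoft documentation before relying on it. The code only builds the request; a real call would POST it.

```python
def build_run_pipeline_request(workspace_id, pipeline_id, token):
    """Build the URL and headers for Fabric's run-on-demand job API.
    Endpoint shape is an assumption based on the Fabric REST API;
    check the official docs for the current contract."""
    url = (
        "https://api.fabric.microsoft.com/v1/"
        f"workspaces/{workspace_id}/items/{pipeline_id}/jobs/instances"
        "?jobType=Pipeline"
    )
    headers = {
        "Authorization": f"Bearer {token}",  # Entra ID token with a Fabric scope
        "Content-Type": "application/json",
    }
    return url, headers

# "ws-123", "pl-456", and "TOKEN" are placeholder values.
url, headers = build_run_pipeline_request("ws-123", "pl-456", "TOKEN")
# A real trigger would then be: requests.post(url, headers=headers)
```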