Microsoft Fabric: What are the options to load local files in Lakehouse

  • Published 13 Dec 2024

Comments • 13

  • @trone_tip • 1 year ago +1

    Thanks for this 😊 Please also cover how to connect SSMS.

    • @AmitChandak • 1 year ago

      I covered connecting with SSMS in this video: th-cam.com/video/Mn1GB8rSeoQ/w-d-xo.html
      Take the SQL endpoint URL, use the Database Engine connection type with that URL as the server name, and log in with your user ID and password via MFA.
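
      For anyone scripting the same connection outside SSMS, here is a minimal pyodbc sketch, assuming ODBC Driver 18 for SQL Server is installed; the server and database names are hypothetical placeholders for the values shown in Fabric:

      ```python
      # Connect to a Fabric Lakehouse SQL endpoint with MFA, mirroring the
      # SSMS steps above. Server/database names are hypothetical.
      import pyodbc

      server = "yourworkspace.datawarehouse.fabric.microsoft.com"  # copy from Fabric
      database = "YourLakehouse"

      # ActiveDirectoryInteractive opens the same browser-based MFA login SSMS uses.
      conn = pyodbc.connect(
          "Driver={ODBC Driver 18 for SQL Server};"
          f"Server={server};Database={database};"
          "Authentication=ActiveDirectoryInteractive;"
          "Encrypt=yes;"
      )

      cursor = conn.cursor()
      cursor.execute("SELECT TOP 5 * FROM dbo.my_table")  # hypothetical table
      for row in cursor.fetchall():
          print(row)
      conn.close()
      ```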

  • @martinbubenheimer6289 • 1 year ago

    Hi Amit, thank you for this introduction. With the approaches shown, I need a web browser and a manual upload each time I have new data. If I want to automate an update schedule for local files with Fabric, do I need the Power BI Gateway, as with Power BI, or an Azure Integration Runtime, as with Synapse?

    • @AmitChandak • 1 year ago

      As of now, the on-premises gateway does not support that, but we can expect it soon.

    • @AmitChandak • 1 year ago

      For Azure sources, you should be able to schedule the pipeline and dataflows.

  • @matiasbarrera6959 • 1 year ago +1

    Thanks for the video. Can I upload files using Python in a Fabric notebook?

    • @AmitChandak • 1 year ago +1

      If the file is in some cloud storage and you can access it with a URL and credentials, you can (see the sketch below).
      As of now, I doubt the on-premises gateway is supported for that.
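
      A minimal sketch of that pattern in a Fabric notebook, assuming a default Lakehouse is attached to the notebook and the source URL is a hypothetical, publicly reachable file:

      ```python
      # Pull a file from a cloud URL and land it in the Lakehouse Files area.
      # The URL is hypothetical; add auth headers if the source requires them.
      import requests

      url = "https://example.com/data/sales.csv"
      resp = requests.get(url, timeout=60)
      resp.raise_for_status()

      # With a default Lakehouse attached, its Files section is mounted at
      # /lakehouse/default/Files/ inside the notebook session.
      with open("/lakehouse/default/Files/sales.csv", "wb") as f:
          f.write(resp.content)
      ```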

  • @bloom6874 • 6 months ago +1

    Great explanation, sir. Is it possible to download CSV files from a Fabric lakehouse?

    • @AmitChandak • 6 months ago

      Thanks. For uploading, we have the option in the Lakehouse itself. For downloading, we can use OneLake Explorer; once it is installed, you have access to the files much like OneDrive. Another method is Python code: PySpark in notebooks can help, or local Python code, which I discuss in later videos of the series, can download the files (see the sketch below).
      I also checked that you can create a pipeline to move data from the Lakehouse to a local folder. You need an on-premises gateway for that, and the data is downloaded onto the on-premises gateway machine.
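
      A minimal sketch of the local-Python route, using OneLake's ADLS Gen2-compatible endpoint via the azure-identity and azure-storage-file-datalake packages; the workspace, lakehouse, and file names are hypothetical:

      ```python
      # Download a Lakehouse file to the local machine through OneLake.
      from azure.identity import InteractiveBrowserCredential
      from azure.storage.filedatalake import DataLakeServiceClient

      credential = InteractiveBrowserCredential()  # signs in with your Fabric account
      service = DataLakeServiceClient(
          account_url="https://onelake.dfs.fabric.microsoft.com",
          credential=credential,
      )

      # OneLake paths follow <workspace>/<lakehouse>.Lakehouse/Files/<path>.
      fs = service.get_file_system_client("MyWorkspace")  # hypothetical workspace
      file_client = fs.get_file_client("MyLakehouse.Lakehouse/Files/sales.csv")

      with open("sales.csv", "wb") as local_file:
          local_file.write(file_client.download_file().readall())
      ```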

  • @DanielWeikert • 6 months ago +1

    Why is the destination type always string, and why can't it be changed? Thanks.

    • @AmitChandak • 6 months ago +1

      When the UI first came out, we had an option on the left to change the data type. That has since changed, and we can now change it on the destination side.

  • @nies_diy986 • 1 year ago

    @amit, great tutorial. How can we transform multiple CSVs in a single dataflow and then sink them all into a lakehouse or warehouse using that same dataflow? Is it even possible? I tried it, but when I select the destination, only one CSV is taken to sink at the destination, so I am stuck. Does that mean I have to create multiple dataflows to handle multiple CSV transformations? I know that in pipelines we can handle this via a ForEach activity, but I am not able to sort this out for transforming multiple CSVs using dataflows.
    Scenario: I have Emp.csv and Dept.csv, provided by a client, and I want to transform them (changing the ID column from string to whole number) and then sink them to the lakehouse. The problem is that I can ingest the files automatically using a pipeline ForEach but cannot do the same for the transformations.

    • @AmitChandak • 1 year ago +1

      You can add multiple files to the same dataflow and set a destination for each.
      When you start a dataflow from the Lakehouse Explorer, it will automatically set the destination for all tables.
      In the above case, you should be able to add all the files to one dataflow, transform them, and load them to the lakehouse or warehouse. (For a notebook alternative, see the sketch below.)
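
      If the dataflow route ever falls short, the same two-file transformation can be scripted in a Fabric notebook instead. A minimal PySpark sketch, assuming Emp.csv and Dept.csv already sit in the Lakehouse Files area; the ID column names and table names are hypothetical:

      ```python
      # Cast the ID column of each CSV from string to whole number and save
      # each as a Lakehouse Delta table. `spark` is the session that Fabric
      # notebooks predefine; file paths and column names are hypothetical.
      from pyspark.sql.functions import col

      for file_name, id_col, table_name in [
          ("Emp.csv", "EmpID", "emp"),
          ("Dept.csv", "DeptID", "dept"),
      ]:
          df = spark.read.option("header", True).csv(f"Files/{file_name}")
          df = df.withColumn(id_col, col(id_col).cast("int"))
          df.write.mode("overwrite").format("delta").saveAsTable(table_name)
      ```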