How to Load Multiple CSV Files to a Table in Azure Data Factory - Azure Data Factory Tutorial 2021

  • Published on 4 Jul 2024
  • How to Load Multiple CSV Files to a Table in Azure Data Factory - Azure Data Factory Tutorial 2021. In this video, you will learn how to load multiple CSV files from blob storage into a single Azure SQL Database table. Once we load the files into the table with Azure Data Factory, we can move all the CSV files to an Archive folder, also with Azure Data Factory. Once the files are moved to the Archive folder, we can delete them from the Source folder. (A Python sketch of this load-archive-delete flow follows the description below.)
    Timecodes
    00:00 Introduction
    01:26 How to create Containers in Azure Data Factory
    02:08 How to Upload Multiple CSV Files in Azure Data Factory
    03:15 How to Create a Linked service in Azure Data Factory
    05:20 How to Create a Pipeline and use copy data activity and then delete activity in Azure Data Factory
    #AzureDataFactoryTutorial
  • Science & Technology
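
For reference, the pipeline in the video does three things in sequence: copy every CSV in the source container into one Azure SQL Database table, copy the same files to an archive container, and delete them from the source container. The sketch below mirrors that flow in Python with the Azure Storage SDK, pandas, and SQLAlchemy instead of ADF activities; the connection strings, the source/archive container names, and the dbo.TotalSale table name are placeholders rather than values taken from the video.

```python
from io import BytesIO

import pandas as pd
import sqlalchemy
from azure.storage.blob import BlobServiceClient

BLOB_CONN_STR = "<storage-account-connection-string>"  # placeholder
SQL_URL = "mssql+pyodbc://user:password@yourserver.database.windows.net/yourdb?driver=ODBC+Driver+18+for+SQL+Server"  # placeholder

service = BlobServiceClient.from_connection_string(BLOB_CONN_STR)
source = service.get_container_client("source")    # assumed source container name
archive = service.get_container_client("archive")  # assumed archive container name
engine = sqlalchemy.create_engine(SQL_URL)

for blob in source.list_blobs():
    if not blob.name.lower().endswith(".csv"):
        continue

    # 1. Load the CSV into the single SQL table (the Copy Data activity in the video).
    data = source.download_blob(blob.name).readall()
    df = pd.read_csv(BytesIO(data))
    df.to_sql("TotalSale", engine, schema="dbo", if_exists="append", index=False)  # assumed table name

    # 2. Copy the file to the archive container (the second Copy activity).
    archive.upload_blob(blob.name, data, overwrite=True)

    # 3. Remove the file from the source container (the Delete activity).
    source.delete_blob(blob.name)
```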

Comments • 27

  • @niteshsoni2282
    @niteshsoni2282 1 year ago +2

    BEST CHANNEL FOR LEARNING................

  • @zohebsyed3333
    @zohebsyed3333 10 months ago +2

    Thanks for the effort. These videos are amazing and very detailed.

  • @mr.prasadyadav
    @mr.prasadyadav 2 years ago +1

    Nice effort, thank you Sir ❤️

  • @kumareskumares-ug2ys
    @kumareskumares-ug2ys 3 years ago +1

    Really helpful... appreciate your efforts!

  • @narendrakishore8526
    @narendrakishore8526 2 years ago +1

    Great explanation ‼️

  • @zakariiabadis7203
    @zakariiabadis7203 3 years ago +2

    Hello man!
    Please, we want this training to be like the SSIS one.
    I love you man!

  • @srinubathina7191
    @srinubathina7191 1 year ago

    Thank you sir

  • @rohitsingh8334
    @rohitsingh8334 3 years ago +1

    I like your style of teaching. 👍

  • @clouddataodyssey
    @clouddataodyssey 2 years ago +1

    Thanks for the video. Can you please record a video on the "List of files" option (which we configure in the source, the blob storage account, when reading files) from the source system?

  • @okkerb1
    @okkerb1 2 years ago +2

    I agree that this is a simple but certainly not a safe solution. The problem I have with this approach is that the processing, copy, and delete steps for the files are completely disconnected, so there is no guarantee that the files that were written to the DB are in fact the ones that are moved to the archive folder. Let's say an error is swallowed during the processing stage: the files that get moved would still be the full set. Or, if there is an error in the moving step, we will end up with duplicates in the SQL table. Or, if more files land in the incoming folder after the load to SQL but before the copy, we will move files to Archive that were never processed. So, sorry to say, but for me this approach is not a good architecture for a fully production-ready solution.
    But having said that, thank you for your contributions, and know that I've put your suggestions from previous videos to good use.

    • @TechBrothersIT
      @TechBrothersIT  2 years ago +3

      Hi Okker, I totally agree with you. The intention of this video was to provide ways to practice different options; for a fully functional solution, I would not suggest this. I would put in a lot more checks if I had to fully design the ETL, and if I get time, I will add a video on that. I did look into issues like the load failing halfway through for some files, which can end up producing bad data, duplicate data, or even failures on retry. I really appreciate that you took the time to analyze this and educate me and others. Have a great day. (One way to tie the per-file load and archive steps together is sketched after this thread.)

    • @okkerb1
      @okkerb1 2 years ago +1

      @@TechBrothersIT Thanks for your response. The purpose of my message was certainly not to educate you. :) It was rather to highlight the danger of this approach and, as you mentioned, that there would need to be a lot of checks and balances to make this work.
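
One way to close the gap described above, sketched in Python outside of ADF: snapshot the file list first, then load, archive, and delete each file as a single unit, so a file is only moved after its rows have committed, and late arrivals simply wait for the next run. The connection strings, container names, and table name are placeholders, not values from the video.

```python
from io import BytesIO

import pandas as pd
import sqlalchemy
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-account-connection-string>")  # placeholder
source = service.get_container_client("source")    # assumed source container name
archive = service.get_container_client("archive")  # assumed archive container name
engine = sqlalchemy.create_engine("<sqlalchemy-connection-url>")  # placeholder

# Snapshot the file list up front: only files present now are processed,
# so anything that lands later waits for the next run instead of being
# archived without ever having been loaded.
snapshot = [b.name for b in source.list_blobs() if b.name.lower().endswith(".csv")]

for name in snapshot:
    data = source.download_blob(name).readall()
    df = pd.read_csv(BytesIO(data))

    # Load inside a transaction so a failed file leaves no partial rows behind.
    with engine.begin() as conn:
        df.to_sql("TotalSale", conn, schema="dbo", if_exists="append", index=False)  # assumed table name

    # Only the file that just loaded successfully is archived and removed;
    # a failure above stops the loop before this file is touched.
    archive.upload_blob(name, data, overwrite=True)
    source.delete_blob(name)
```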

  • @prashantsonawane7730
    @prashantsonawane7730 1 year ago

    Hello Sir, please make a video on how to load files with different extensions, such as CSV, Parquet, and TXT, into the database.

  • @rohitsethi5696
    @rohitsethi5696 1 year ago

    Hi, I'm Rohit. Can we use all of these activities in one copy command, for example by putting them all in a variable and using a ForEach loop to run this process?

  • @heliobarbosa2
    @heliobarbosa2 1 year ago

    How do I get the file name to insert into the table?

  • @Mgiusto
    @Mgiusto 2 years ago +1

    I need something like this, but I need to load each file one at a time to a SQL DB, not everything that is found all at once, which is what the pipeline in this video does.
    How can I cycle through the files one at a time in the order they arrived in my storage folder? Any help would be greatly appreciated, thank you!

    • @TechBrothersIT
      @TechBrothersIT  2 years ago

      Can you please explain your requirement in a little more detail, step by step? Sorry, I could not understand what you are trying to do, but I may be able to help you. Thanks for the feedback.

    • @brandonelliott2263
      @brandonelliott2263 2 years ago

      Based on what you are describing... you want to be able to ingest files into the target SQL DB based on the latest arrival of data in your storage folder. It sounds to me like you need to set up a message queue and a streaming service that feeds into the pipeline and transmits your latest data to your target SQL DB.

    • @Mgiusto
      @Mgiusto 2 years ago

      @@brandonelliott2263 I was able to solve my issue by using a Get Metadata activity to build a file list of the source blob storage files. Then I run a ForEach loop over this listing to process each file one at a time with a Copy Data activity. (A sketch of this pattern follows below.)
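
A rough Python analogue of the Get Metadata + ForEach pattern described in the reply above: list the blobs, order them by last-modified time, and load them one at a time, which is roughly what the ForEach loop does with one Copy Data activity per file. The connection strings, container name, and table name are placeholders, not values from the video.

```python
from io import BytesIO

import pandas as pd
import sqlalchemy
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-account-connection-string>")  # placeholder
source = service.get_container_client("source")  # assumed source container name
engine = sqlalchemy.create_engine("<sqlalchemy-connection-url>")  # placeholder

# Rough equivalent of the Get Metadata "Child Items" output, plus an ordering step.
blobs = [b for b in source.list_blobs() if b.name.lower().endswith(".csv")]
blobs.sort(key=lambda b: b.last_modified)  # oldest arrival first

# Rough equivalent of the ForEach loop running one Copy Data activity per file.
for blob in blobs:
    df = pd.read_csv(BytesIO(source.download_blob(blob.name).readall()))
    df.to_sql("TotalSale", engine, schema="dbo", if_exists="append", index=False)  # assumed table name
    print(f"loaded {blob.name} (last modified {blob.last_modified})")
```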

  • @kalaivanan.s5042
    @kalaivanan.s5042 7 months ago

    How does it read all the files without a ForEach?

  • @Boss56641
    @Boss56641 2 years ago

    What if it fails after copying half of the files and we need to continue with the remaining files? How do we do that?

    • @TechBrothersIT
      @TechBrothersIT  2 years ago

      I would say save the completion status in some table and process only the files which have not completed. (See the sketch below.)
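
A small sketch of the completion-status idea in Python: record each file in a load-log table after its load commits, and skip anything already logged when the pipeline reruns after a partial failure. The dbo.FileLoadLog table, connection strings, container name, and target table name are made up for illustration.

```python
from io import BytesIO

import pandas as pd
import sqlalchemy
from sqlalchemy import text
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-account-connection-string>")  # placeholder
source = service.get_container_client("source")  # assumed source container name
engine = sqlalchemy.create_engine("<sqlalchemy-connection-url>")  # placeholder

# Create the (made-up) log table if needed and read back what has already been loaded.
with engine.begin() as conn:
    conn.execute(text(
        "IF OBJECT_ID('dbo.FileLoadLog', 'U') IS NULL "
        "CREATE TABLE dbo.FileLoadLog "
        "(FileName NVARCHAR(400) PRIMARY KEY, LoadedAt DATETIME2 DEFAULT SYSUTCDATETIME())"
    ))
    already_loaded = {row[0] for row in conn.execute(text("SELECT FileName FROM dbo.FileLoadLog"))}

for blob in source.list_blobs():
    if not blob.name.lower().endswith(".csv") or blob.name in already_loaded:
        continue  # skip files loaded by an earlier (possibly half-failed) run

    df = pd.read_csv(BytesIO(source.download_blob(blob.name).readall()))

    # Commit the rows and the log entry together, so a rerun sees a consistent picture.
    with engine.begin() as conn:
        df.to_sql("TotalSale", conn, schema="dbo", if_exists="append", index=False)  # assumed table name
        conn.execute(text("INSERT INTO dbo.FileLoadLog (FileName) VALUES (:f)"), {"f": blob.name})
```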

  • @RameshPatel-fg9qk
    @RameshPatel-fg9qk 2 years ago

    Nice video.
    My source is a CSV file folder behind an SFTP linked service.
    My destination is a SQL DB.
    - A new file is uploaded to the SFTP source folder regularly, every day.
    - I want the latest modified file transferred from the SFTP folder to the SQL DB on a daily basis.
    It is an automatic process.
    Please share this type of video.
    I also have a problem with mapping: there are 172 columns in total in the source and destination.
    Please share a video.