48. How to Copy data from REST API to Storage account using Azure Data Factory

  • Published 31 Dec 2024

Comments • 36

  • @cosmocracy2000
    @cosmocracy2000 10 months ago +8

    You only got HALF of your data. You ignored the real-world challenge of pagination. The page/per_page/total/total_pages attributes indicate that you need to perform subsequent fetches to get all 12 rows.
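The pagination shape this commenter describes (page/per_page/total/total_pages in the response body) can be handled inside ADF via the REST connector's pagination rules, or by looping over pages in code. A minimal Python sketch of that loop, assuming a response body with `data` and `total_pages` fields (the field names mirror the comment; the page-fetch callable stands in for the real HTTP call):

```python
def fetch_all_pages(get_page):
    """Collect every record from a paginated API.

    get_page(n) must return the parsed JSON body of page n, containing
    'data' (the records on that page) and 'total_pages' (how many pages exist).
    """
    rows, page, total_pages = [], 1, 1
    while page <= total_pages:
        body = get_page(page)
        rows.extend(body["data"])          # this page's records
        total_pages = body["total_pages"]  # refresh from the response
        page += 1
    return rows

# Simulated two-page response (12 rows total, 6 per page), matching the
# shape described in the comment above.
pages = {
    1: {"page": 1, "per_page": 6, "total": 12, "total_pages": 2,
        "data": [{"id": i} for i in range(1, 7)]},
    2: {"page": 2, "per_page": 6, "total": 12, "total_pages": 2,
        "data": [{"id": i} for i in range(7, 13)]},
}
all_rows = fetch_all_pages(lambda n: pages[n])
print(len(all_rows))  # 12
```

With a single fetch you would indeed get only the first 6 of the 12 rows; the loop re-reads `total_pages` on each response so it also copes with a total that changes between calls.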

  • @AHMEDALDAFAAE1
    @AHMEDALDAFAAE1 1 year ago

    Thank you so much! I learnt a lot from this short video.

  • @UmarKhan-bx7jy
    @UmarKhan-bx7jy 1 year ago +1

    Thanks brother, it was really useful. I really appreciate it.

  • @tanushreenagar3116
    @tanushreenagar3116 11 months ago

    Perfect 👌 explanation sir

  • @saketsrivastava84
    @saketsrivastava84 1 year ago

    Thanks buddy for such an insightful video

  • @renan6827
    @renan6827 5 months ago

    That is a great video!!!!👏👏👏👏

  • @saisudha5565
    @saisudha5565 1 year ago

    Make a video on interview questions, and also on the challenges you have faced in ADF, with examples please.

  • @ibm_etl_datastage8590
    @ibm_etl_datastage8590 1 year ago

    You have great content explaining ADF! Can you please create a video on how we can pull multiple Excel files from Microsoft Teams, and another video with SharePoint as the source, loading to a file or database after doing some transformation? There are videos available, but I like the way you make it simple.

  • @bharatia4826
    @bharatia4826 4 months ago

    Informative! Many thanks

  • @mayanksharma5981
    @mayanksharma5981 1 year ago

    Awesome tutorial. Is there a way to divide a PDF into chunks while copying from source to sink? Like copying a 500-page PDF into 5 PDFs of 100 pages each?

  • @stoyyeti3671
    @stoyyeti3671 1 year ago +1

    Is there any option to dynamically specify the unroll column in the flatten transform? Please reply

    • @WafaStudies
      @WafaStudies 1 year ago

      Never tried. But see if the "Add dynamic content" option is available there. If yes, then using the byName() function you can add your column name
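For readers outside ADF, the effect being asked about, unrolling (exploding) a nested-array column chosen dynamically by name, which is what byName() would select inside the flatten transformation, can be sketched in plain Python. The row shape and column name here are hypothetical, not ADF's actual expression syntax:

```python
def flatten_by_name(rows, unroll_col):
    """Unroll a nested-array column chosen by name at runtime: emit one
    output row per array element, carrying the parent columns along."""
    out = []
    for row in rows:
        for item in row.get(unroll_col, []):
            flat = {k: v for k, v in row.items() if k != unroll_col}
            flat.update(item)  # merge the unrolled element's fields
            out.append(flat)
    return out

orders = [{"id": 1, "items": [{"sku": "A"}, {"sku": "B"}]}]
flat_rows = flatten_by_name(orders, "items")
print(flat_rows)  # [{'id': 1, 'sku': 'A'}, {'id': 1, 'sku': 'B'}]
```

The key point matches the reply: the column to unroll arrives as a plain string parameter, so it can be supplied from a pipeline variable rather than hard-coded.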

  • @BharatHiralal-lb2pq
    @BharatHiralal-lb2pq 1 year ago

    Hi, the requirement is to truncate the Salesforce table first using ADF and then load the table using ADF. How can we do this? Any idea?

  • @a2zhi976
    @a2zhi976 10 months ago

    How can we control dev and prod variables in ADF?

  • @chubsmash7602
    @chubsmash7602 6 months ago

    Brilliant! Thanks

  • @itversityitversity7690
    @itversityitversity7690 8 months ago

    I have a requirement on this topic: I want to call an API that returns 5 CSV-format files and move these 5 files into the destination (table format). Can you please help me with how to do this?

  • @eldrigeampong8573
    @eldrigeampong8573 4 months ago

    great video......thanks a lot

  • @MaDmAxLoCaL
    @MaDmAxLoCaL 7 months ago

    We got a request in our project to egress data from an ADLS location to a REST API using ADF. Is that possible? If so, please provide your guidance

  • @alexfernandodossantossilva4785
    @alexfernandodossantossilva4785 1 year ago

    How can I decode the API response?

  • @siddheshamrutkar8684
    @siddheshamrutkar8684 1 year ago +1

    Nice Video.. 👍

  • @yashwanthv5604
    @yashwanthv5604 1 year ago

    How to copy a table from one storage account to another storage account in Synapse?

  • @vinuthab-zw4nj
    @vinuthab-zw4nj 9 months ago

    Same requirement, but with incremental data: a watermark table with a column 'ApiUrl', where data should load incrementally based on 'LastExportDt'. Please make a video

  • @saritapatole6030
    @saritapatole6030 1 year ago +1

    Helpful

  • @ravindranathchowdaryalapat9072
    @ravindranathchowdaryalapat9072 9 months ago

    I have 10 JSONs in the same scenario as above. How can I load them in a single ForEach activity using dynamic column mapping? Can you please explain?

  • @p.rajeshkhanna8489
    @p.rajeshkhanna8489 2 months ago

    Is it possible to dynamically copy multiple json files to CSV files without manually importing schemas

    • @MrRegaschardijn
      @MrRegaschardijn 15 days ago

      You should define an input file (this could be a .txt file) and define the sources and destinations. Use this file as input for your ADF Lookup activity

    • @NeumsFor9
      @NeumsFor9 10 days ago

      Absolutely you can, especially if you store source and target metadata as well as the file format options. Then have a stored procedure or function dynamically output the same JSON that ADF would output upon reading source and target metadata
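As a sketch of the metadata-driven idea in these replies: generate the copy activity's column-mapping JSON (a `TabularTranslator`) from stored metadata so no schema has to be imported by hand. The column names below are hypothetical; in a pipeline you would feed the resulting JSON into the mapping property via "Add dynamic content":

```python
import json

def build_translator(column_pairs):
    """Build an ADF copy-activity 'translator' (dynamic column mapping)
    from (source_column, sink_column) metadata pairs."""
    return {
        "type": "TabularTranslator",
        "mappings": [
            {"source": {"path": f"$.{src}"}, "sink": {"name": dst}}
            for src, dst in column_pairs
        ],
    }

# Hypothetical metadata rows, e.g. read by a Lookup activity or returned
# by a stored procedure as suggested above.
meta = [("first_name", "FirstName"), ("last_name", "LastName")]
translator = build_translator(meta)
print(json.dumps(translator, indent=2))
```

The same metadata table can also carry file-format options per source, as the reply suggests, so one parameterized pipeline serves every file.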

  • @DilipKumar-s9f1y
    @DilipKumar-s9f1y 1 year ago

    thanks for sharing

  • @saipraveen9320
    @saipraveen9320 2 months ago

    How to achieve incremental load from a REST API or through OData?

    • @NeumsFor9
      @NeumsFor9 10 days ago

      You would need to know your watermark column name or names. Most likely, you would then need to include those columns and values as key-value pairs in your API call. At the end of a successful load, I would recommend taking the maximum value of your watermark columns and logging it. Depending on the source application, some people take the max watermark value from their target table and some take the maximum value of what was extracted; both have their pluses and minuses.
      😊 If it is a first-time load, I would suggest a metadata framework that sets an arbitrary start watermark value for any first-time initial loads.
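The watermark pattern in this reply can be sketched in Python. The URL, query parameter, and column name below are assumptions for illustration, not anything from the video:

```python
# Arbitrary start watermark for first-time loads, as the reply suggests.
ARBITRARY_START = "1900-01-01T00:00:00Z"

def build_incremental_url(base_url, last_watermark=None):
    """Pass the last logged watermark as a key-value pair in the API call;
    fall back to the arbitrary start value on a first-time load."""
    since = last_watermark or ARBITRARY_START
    return f"{base_url}?modified_since={since}"

def next_watermark(extracted_rows, ts_field):
    """After a successful load, take the max watermark seen in the extract
    and log it for the next run (the 'max of what was extracted' option)."""
    return max(row[ts_field] for row in extracted_rows)

rows = [{"LastExportDt": "2024-01-02"}, {"LastExportDt": "2024-03-05"}]
url = build_incremental_url("https://api.example.com/items")
print(url)
print(next_watermark(rows, "LastExportDt"))  # 2024-03-05
```

Taking the max from the extract (rather than from the target table) risks re-reading rows if a load partially fails, which is one of the trade-offs the reply alludes to.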

  • @Sarmoung-Biblioteca
    @Sarmoung-Biblioteca 11 months ago

    Make use of the Web activity and Set Variable to copy data!
