Processing Slowly Changing Dimensions with ADF Data Flows

  • Published Nov 16, 2024

Comments • 19

  • @vishal259
    @vishal259 4 years ago +4

    It would be good if you could upload the scripts you used in the demo. They would be a good reference for the audience.

  • @reckitt9173
    @reckitt9173 1 year ago

    16:42 Correct me if I am wrong, but this step will prove to be a bottleneck when the row count is higher. Instead, it could have been added at 24:04 as a lookup. We need logic here so that records that are found are treated as updates and the rest as inserts.
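
    A minimal Python sketch of the split this comment suggests: join the incoming batch against a keyed lookup of the existing dimension once, then route each row to the insert or update branch. All names (`customer_id`, `existing_dim`, `incoming`) are illustrative, not from the video.

    ```python
    # Keyed lookup of the existing dimension (business key -> current row).
    existing_dim = {101: {"customer_id": 101, "city": "Austin"},
                    102: {"customer_id": 102, "city": "Boston"}}

    incoming = [{"customer_id": 101, "city": "Dallas"},   # key exists -> update
                {"customer_id": 103, "city": "Chicago"}]  # new key   -> insert

    # One pass over the batch; membership tests against the lookup are O(1),
    # instead of probing the target table once per row.
    updates = [row for row in incoming if row["customer_id"] in existing_dim]
    inserts = [row for row in incoming if row["customer_id"] not in existing_dim]
    ```

    In an ADF data flow the same shape is typically a Lookup (or Exists) transformation followed by a Conditional Split.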

  • @abelfelleke7290
    @abelfelleke7290 5 years ago +2

    Thank you, it was neatly and briefly explained.

  • @moesmael
    @moesmael 3 years ago +1

    It was a very good explanation, but in the Insert sink it will not insert newly updated rows with new keys and a new ActiveEndDate. Please double-check that part of the process. Still, I would like to thank you very much; it was a very good and helpful demo.

    • @rohitnath5545
      @rohitnath5545 1 year ago

      Why do you think it won't work?

  • @terryliu3635
    @terryliu3635 4 years ago

    Great intro! Thanks!

  • @shivanityagi1073
    @shivanityagi1073 2 years ago

    Can you please explain how to implement Type 3 and Type 2 SCDs in ADF?

  • @Sarabjitsethi
    @Sarabjitsethi 3 years ago +1

    How are you taking care of the case where the same hash already exists?
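
    To the question above: in the usual hash-diff pattern, an identical hash means the tracked attributes are unchanged, so the row is simply skipped; only a differing hash routes the row to the update branch. A short Python sketch, with made-up column names and a strong hash so accidental collisions are negligible:

    ```python
    import hashlib

    def row_hash(row, cols):
        """Hash the tracked attribute values, joined with a delimiter."""
        raw = "|".join(str(row[c]) for c in cols)
        return hashlib.sha256(raw.encode("utf-8")).hexdigest()

    cols = ["name", "city"]
    stored = {"name": "Ana", "city": "Austin"}
    incoming_same = {"name": "Ana", "city": "Austin"}
    incoming_changed = {"name": "Ana", "city": "Dallas"}

    # Equal hashes -> unchanged row, ignore it.
    unchanged = row_hash(stored, cols) == row_hash(incoming_same, cols)
    # Differing hash -> attributes changed, send to the SCD update branch.
    changed = row_hash(stored, cols) != row_hash(incoming_changed, cols)
    ```

    If you need to be strict about collisions, compare the underlying columns as well whenever the hashes match.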

  • @theownmages
    @theownmages 4 years ago

    How would I query to get a view of what a dataset looked like at a specific date, if I structured it like this Type 2?
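
    The usual answer for Type 2 is to filter on the validity window: a row is active on a given date when its start date is on or before that date and its end date is after it. A Python sketch with illustrative dates and columns (current rows often carry a far-future end date such as 9999-12-31):

    ```python
    from datetime import date

    # Type 2 history for one customer; every version keeps its own window.
    history = [
        {"customer_id": 7, "city": "Austin",
         "start": date(2020, 1, 1), "end": date(2021, 6, 1)},
        {"customer_id": 7, "city": "Dallas",
         "start": date(2021, 6, 1), "end": date(9999, 12, 31)},
    ]

    def as_of(rows, when):
        """Return the version(s) active on `when`: start <= when < end."""
        return [r for r in rows if r["start"] <= when < r["end"]]

    snapshot = as_of(history, date(2020, 12, 31))
    ```

    In SQL the same filter is a `WHERE StartDate <= @AsOf AND EndDate > @AsOf` predicate.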

  • @srinin6252
    @srinin6252 4 years ago +1

    It's a great explanation. Can you provide me the source file you used for the current SCD Type 1? I will try it myself.

  • @YTEEsenior
    @YTEEsenior 5 years ago +1

    On insert, for uniqueidentifier columns, how can you create a new GUID?

    • @bobrubocki5305
      @bobrubocki5305 5 years ago +1

      Assuming you are loading into a SQL Server database, you could create a default constraint that uses the NEWID() function.
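
      The reply above can be sketched in T-SQL; the table and column names here are hypothetical, not from the demo:

      ```sql
      -- With this default constraint in place, SQL Server fills RowGuid
      -- with a fresh GUID whenever an INSERT omits the column.
      ALTER TABLE dbo.DimCustomer
          ADD CONSTRAINT DF_DimCustomer_RowGuid
          DEFAULT NEWID() FOR RowGuid;
      ```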

  • @abelfelleke7290
    @abelfelleke7290 5 years ago

    Can you make a video that shows how you transferred those dataset files?

    • @bobrubocki5305
      @bobrubocki5305 5 years ago

      For this demo, I manually uploaded the files with Azure Storage Explorer.

  • @kal1leo
    @kal1leo 4 years ago

    How do I export and import a data factory, including all of its pipelines, data flows, and datasets?

    • @rameshchowdary6390
      @rameshchowdary6390 4 years ago

      You can go to the resource group where you created your data factory and click Export template. It generates JSON; click Download and the ARM template will be saved to your local machine.
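
      The same export can be done from the Azure CLI instead of the portal; a sketch, where `my-rg` is a placeholder resource group name:

      ```shell
      # Print the ARM template for everything in the resource group
      # (including the data factory) and save it to a local file.
      az group export --name my-rg > datafactory-template.json
      ```

      The downloaded template can then be redeployed to another resource group or subscription.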

  • @jagadishparale8632
    @jagadishparale8632 4 years ago

    Can you share the files used in the demos?

  • @amitnavgire8113
    @amitnavgire8113 4 years ago

    It's just too slow and requires a lot of patience to watch this video...