Thanks a lot WafaStudies, this is really a great explanation of Type 1, and we are eagerly waiting for the SCD Type 2 part. Please post it soon :) !!
Thanks!
Welcome 😊
Very clearly explained. I've always loved your videos, thank you Wafastudies 😊
Thank you Priya 😊
This is good, not too many columns used in the jobs.
Good use case with detailed explanation
Really cool stuff, and it’s very useful.
Can you make a tutorial on Azure Service Bus & datagrid?
Hi, this is good. I followed these steps and did this using delta files in the Data Lake, and it’s doing inserts & updates as expected. But one thing we are missing is the RecordCreatedDate & ModifiedDate columns. If we add a created date as getdate(), it updates each and every row, since the time in that column has now changed. Can you please suggest how to capture the updated time in this scenario?
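A minimal T-SQL sketch of one common way to handle this, assuming a SQL-style sink (the table and column names here are hypothetical, not from the video): set CreatedDate only in the insert branch and ModifiedDate only in the update branch, and only touch matched rows when a tracked column actually changed. The same idea carries over to a mapping data flow by deriving the created-date column only on the insert path.

```sql
-- Hypothetical sketch: stg.Customer is the incoming data, dbo.Customer is the sink.
MERGE dbo.Customer AS tgt
USING stg.Customer AS src
    ON tgt.CustomerID = src.CustomerID
WHEN MATCHED
    -- only stamp ModifiedDate when a tracked column really changed
    AND (tgt.Name <> src.Name OR tgt.City <> src.City)
THEN
    UPDATE SET
        tgt.Name         = src.Name,
        tgt.City         = src.City,
        tgt.ModifiedDate = GETDATE()              -- updated rows get a new modified time
WHEN NOT MATCHED BY TARGET
THEN
    INSERT (CustomerID, Name, City, CreatedDate, ModifiedDate)
    VALUES (src.CustomerID, src.Name, src.City, GETDATE(), GETDATE());  -- CreatedDate set once, on insert
```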
Hi wafa!!
Can I use the upsert option available under the Copy activity for the same purpose?
Very nice & well explained.
Thanks for liking
Hi Wafa, thanks for the great videos. I have a question: what if you have 1 million rows in the source and, let's say, 20,000 rows in the sink that have the same values as the source? In this scenario, will it update those 20,000 rows again as well (even if they match)?
It should update only the changed rows.
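Whether unchanged rows get rewritten depends on how the update condition is built. A hedged T-SQL sketch of one way to skip rows that already match (names are hypothetical, not from the video) is to add a change check to the matched branch, so the 20,000 identical rows are left untouched:

```sql
-- Hypothetical sketch: only rows whose tracked columns differ are updated.
MERGE dbo.Customer AS tgt
USING stg.Customer AS src
    ON tgt.CustomerID = src.CustomerID
WHEN MATCHED
    -- NULL-safe change check: EXCEPT returns a row only when something differs
    AND EXISTS (SELECT src.Name, src.City
                EXCEPT
                SELECT tgt.Name, tgt.City)
THEN
    UPDATE SET tgt.Name = src.Name,
               tgt.City = src.City
WHEN NOT MATCHED BY TARGET
THEN
    INSERT (CustomerID, Name, City)
    VALUES (src.CustomerID, src.Name, src.City);
```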
Sir, please share a video related to Azure data warehouse and external tables.
Hi bro, I have one scenario: I have to copy files from Blob to SQL DB where the file size is greater than 20 MB. How can I achieve that?
I have a problem at the step where we are checking whether the sink already has that data.
In SCD Type 1: if I want to update only 2 columns and leave the remaining 4 columns as they are, is it possible?
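It should be, since the update branch only overwrites the columns you list. A minimal T-SQL sketch under that assumption (Email and Phone are hypothetical column names, not from the video):

```sql
-- Hypothetical sketch: only Email and Phone are overwritten on a match;
-- the remaining columns in the target keep their current values.
MERGE dbo.Customer AS tgt
USING stg.Customer AS src
    ON tgt.CustomerID = src.CustomerID
WHEN MATCHED THEN
    UPDATE SET tgt.Email = src.Email,
               tgt.Phone = src.Phone;
```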
Good video Wafa. The only point I'd note here is that you are hard-coding the file name and table name; if you parameterize them, it would be a more realistic real-time scenario.
Hi, can you please post an SCD Type 2 video? It would be really helpful, and thanks in advance.
Thank you for this video. In the SCD Type 2 video, could you please also try to explain the SCD Type 2 data flow template already provided by Microsoft?
Hi, is there any other way to get in touch with you regarding doubts?
Thank you :)
Hi Wafa Studies, please show an incremental loading video as well.
🙏🏻