I hope you find this video useful. If you have learned something or found it helpful, please consider giving it a like.
What topic should I cover in the next end-to-end project video? Please let me know your thoughts in the comments.
Follow my LinkedIn profile to get regular updates:
www.linkedin.com/in/mrk-talkstech/
Thank you!😊😊😊
Please do a project on an Autoloader incremental pipeline with dimensions, facts, quality checks, and phases.
You could do a video migrating on-premises to Synapse, for example, loading facts and dims using serverless and Delta tables, using Databricks… Congrats on your videos; your video is better than Microsoft's official courses.
Please do a project for a data warehouse.
Instead of all tables, how do we choose specific tables and push only those from the on-prem SQL Server to ADLS?
Hello sir, what will the cluster configuration be for real-time projects?
This is the best end-to-end project I have ever seen. I was able to set up a whole data engineering solution and personally implement it in my company based on this video. Please add more end-to-end projects when you get time. Amazing tutor and amazing channel.
Thank you so much :)
Hey, can you assist me with something? I am stuck badly on something
This channel is definitely my go to place for End to End Projects . Absolutely love your content sir . Keep it coming . Thank you 😊
Thank you so much :)
This is the video I have been searching for a long time to kick off my databricks journey, now Mr. K you offered it for public free, thank you very much. :) Please keep it up.
Folks - make sure your SQL Server allows standard authentication and not only Windows authentication
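To see why that tip matters, here is a small illustrative sketch of the two SQL Server connection styles; the server, database, and user names are placeholders, not values from the video:

```python
# Illustrative sketch only: server, database, and user names below are
# placeholders, not values from the video.
def sql_auth_conn(server, database, user, password):
    # SQL (standard) authentication -- what the ADF linked service via the
    # self-hosted IR typically uses: an explicit login and password.
    return ("DRIVER={ODBC Driver 17 for SQL Server};"
            f"SERVER={server};DATABASE={database};UID={user};PWD={password}")

def windows_auth_conn(server, database):
    # Windows (integrated) authentication -- fine locally in SSMS, but ADF
    # cannot reuse your logged-in Windows session, so enable a SQL login too.
    return ("DRIVER={ODBC Driver 17 for SQL Server};"
            f"SERVER={server};DATABASE={database};Trusted_Connection=yes")

print(sql_auth_conn("localhost", "AdventureWorksLT2017", "etl_user", "secret"))
print(windows_auth_conn("localhost", "AdventureWorksLT2017"))
```

If the SQL-auth string works from the IR machine but the linked service still fails, the issue is usually that the server only permits Windows authentication.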
Yeah well said.
I was unable to connect, so I used Azure SQL Database instead.
How did you do that, @Armwrestling_Invasion?
Please bring some more End-to-End projects on Azure
I just want to say THANK YOU VERY MUCH, MR. K. I was able to complete the project end to end and learned so much from your video. I would really like you to make more projects with this clarity. Thank you, General! 🍻🍻
Crispy clear, no-nonsense, dense, structured. Definitely subscribed.
Thank you so much :)
Hi, great tutorial and indeed good learning for starters like me. Can you also please make an end-to-end Azure data engineering real-time project with a continuous data stream and readily available big data (so that we can download it directly from your link)? It would be of great help to us.
Can I become a data engineer in 3 hours? Is it bullshit or serious?
Great video! Your explanation of End-to-end Azure Data Engineering was clear, simple and easy to follow. I appreciate your expertise in breaking it down for us. God bless!!
Thank you so much :)
Excellent and well described end to end data pipeline. Waiting for more videos on real time and incremental batch processing using ADF and databricks. Thanks
First time, I've encountered a comprehensive and practical explanation from start to finish. Thank you very much for all your efforts.
Thank you so much :)
The BEST tutorial about data engineering that I have Found. I'm a data analyst, but in my company I'm having to do a LOT of data engineering job and I was totally lost till I found this tutorial. THANK you for helping me.
Bro, same situation for me as well.
But does the company count this as data engineer experience, and is data analyst experience worth anything for a data engineer role?
Bro, you should have shown how to create those resources also. Can you create a separate video showing how to create these different resources?
I already have videos for Databricks and ADF; please watch them via my ADF and Databricks playlists :)
The BEST tutorial about data engineering that I have found. You have explained it in a perfect way; it was the easiest way to follow along and learn the project. Best video on an end-to-end project.
What a powerful first 10 minutes... Looking forward to following along! I am just a SQL DB to Power BI person, but I want to push the company forward, and this is just the knowledge I need to begin :)
Thank you so much :)
Mr. K, you are a gem, man. Your way of explaining things is superb. Thanks a lot for this free content.
Thank you so much :)
Clear explanation with ADF, BUT one needs to build more real-time projects as well as scenarios, bro.
Always welcome, brother...
Brilliant. Clearly explained and well structured.
Thank you so much :)
How awesome is this video! I spent probably like 20h watching, rewatching and redoing certain parts on my own :D I learned so much, thank you!
Your presentation and explanation were excellent, and Microsoft owes you appreciation for it!!
Thank you so much :)
Amazing video, @Kishore. You have explained it in a perfect way; it was the easiest way to follow your video and learn the project. Best video on an end-to-end project. Hoping for more videos to come. Thanks again, man, keep shining.
Hi, thank you so much for the amazing project. I was wondering, would the process be the same if it were JSON files instead of SQL Server? I need a connection from a JSON files API to the Azure cloud and finally to Power BI. Please let me know. Thank you so much. Regards
Great session 👍.
Just want to know: can we do the same activity with just Azure Synapse, as you said it can do both ingest and transform as well?
Please make a Part 2 of this with Azure Synapse only, if possible.
Hello Mr. K. As I follow this one, I have some problems connecting with my on-premises server. In your tutorial, you successfully connected to on-premises, but I'm having a hard time. On the IR installed on my local machine, under diagnostics, I tried my SQL Server settings and got success, but in the Azure portal > ADF linked service, it still can't connect.
Mr. K sirji, I went through the first part of your video - environment setup. You mentioned that after going through it end to end, one can have the architecture on one's resume.
I was surprised that basics like Linked Services and Datasets were just created on the fly. These are extremely important components of ADF, and no concept was provided. I am lucky that I had previously gone through certain ADF data movement tutorials.
All essential components were already created, and the creation of services forms the basis of any tutorial.
You create a SHIR for the on-premises SQL Server, but the most important thing is the configuration of security lists/groups (virtual cloud network firewall).
Anyway, the presentation is good, the voice volume is optimal, and the speed is also okay.
Namaste 🙏
Its a very good skill to have as a Data Engineer :)
Awesome teaching!! Magnificent!!
I have one doubt: if the source is CSV or an Oracle DB (instead of the SQL Server source DB), can we do all this activity using ADF?
Thank you so much for this video. I'm following it, but I cannot find the resource files that you previously created. Can you please tell me where I can find them?
Best end-to-end project I have ever seen, with a real-time explanation. Nice explanation and great content; I really appreciate your efforts.
Can this project be worked out in a Fabric environment in a much simpler way, without many configuration setups?
Please let me know if you have a video for that.
At 32:28 I am not getting the same page. It says: "Access policies not available. The access configuration for this key vault is set to role-based access control. To add or manage your access policies, go to the Access control (IAM) page." What should I do? Please help.
Hi, this is a very useful project. However, I would like to ask how you would do it for real-time streaming pipelines, when data updates occur.
Good job, Mr. K. I would like to ask: will it be possible to replicate this end-to-end project in the Azure 30-day free trial subscription? Thanks
A query regarding SHIR: after installation of the integration runtime on the local machine, how does it connect to the SQL Server? Does it detect it automatically, or is there any connection setting we need to do?
My ForEach activity does not succeed (stays in progress), but the Copy activity inside it succeeds... why does this happen? 😢
Love your vids, man!
Is there any difference between this video and your Udemy video?
I'm about to purchase from Udemy but don't know if they are the same thing.
I hope he replies you. I have the exact same question.
It's the same one with a few added pieces of information and resources; I have the course on Udemy for wider reach :)
Can you provide a code/GitHub link for the same?
For anyone on M1+ Macs: the database setup for AdventureWorks is not possible due to SHIR limitations (self-hosted integrated runtime -> only possible on Windows). You cannot install SSMS with SQL Server. As an alternative to SSMS, you can use a dockerized version of Azure SQL Edge and install Azure Data Studio that connects to your localhost (the dockerized edge container). This way you can query/see the table on Mac M1. HOWEVER, you cannot connect to Azure Data Factory with SHIR in the step to get the data from your 'on-premise' (laptop) database to ADF, since SHIR is not installable on ARM Macs. I have simply installed a SQL server + Database in Azure itself and used that as a reference to continue the tutorial. Good luck!
Thanks for this, appreciate it :)
I am also using an M1 Mac. If we go with this, we can directly use the default runtime instead of the self-hosted runtime, correct? As both resources are in the Azure cloud.
Thanks for this end to end project
Can you also create a video for incremental data loading in all systems?
Hats off to your patience and the detailed explanation of the end-to-end pipeline; even paid courses will not give you this level of explanation. Grateful and thankful for this amazing end-to-end project video.
Can someone advise where we can get the input data sets? I mean all the required input tables in SSMS.
What is the need for Synapse? Can't we connect Power BI directly to the gold container in the Data Lake?
Hello sir, from which repository did you pick the sales data? Can you please mention the link to the data set?
My SQL Server connection failed at the source creation level (timestamp 31:44); can someone help?
Can I add this to my resume and present it as my project at a company, as I am moving my career from Python developer to Azure data engineer?
If I can, up to how many years of experience can a candidate add this?
Amazing work, appreciate the help. Just one question: can we use Azure Databricks instead of Synapse to avoid vendor lock-in?
Please come up with more videos. Thanks a lot for your effort.
In real world, how many days/months/ man hours does this project take?
If anyone is getting the "Access policies not available" problem: in the left pane of Azure Key Vault, search for "Access configuration". There you will find the permission model; tick "Vault access policy" and apply it. After refreshing the Access policies page, your problem will be solved.
How do you create this kind of video? Please help with the tools used to build this type of video.
Crystal clear explanation and amazing content, Mr. K. Looking forward to more hands-on projects.
Thank you for this contribution.
Thank you so much :)
Can you add a video on how you created the resource groups?
Very well crafted, the best possible way to create an end to end project for someone who is a beginner to Azure.
A very good job, Keep going buddy🎉
Thank you so much :)
How much does it cost to do this on the pay-as-you-go model in Azure for practice?
Wonderful explanation. Best on YouTube.
Just a small doubt: you can do the entire ETL in Azure Synapse, so why did you create separate ADF and ADB resources when Azure Synapse alone contains all of it?
Thank you :) Yes, using Synapse we can do the complete ETL process, but I wanted to include as many resources as possible in this architecture to help people understand how each resource is used in Azure and its integrations :)
For ETL, ADF and Synapse are code-free platforms, but you won't get everything needed for big data; working on the Databricks platform with a Spark cluster is the best option.
May I know how you created the resource group used in this video?
Can someone tell me: is this the day-to-day routine task of a data engineer?
Once data is loaded to the bronze layer, what about incremental load?
Hi @Mr. K, if possible please make a video on CI/CD.
Sure, will be uploaded soon :)
One more end-to-end project with different scenarios would be great.
At 14:27, where did you get that script? Please help me out.
Hi, is this a merged version of this project playlist?
When you store data from bronze to silver, is there any way to apply a naming convention? I know how to do it for one file, but you saved using a loop; how do I achieve this?
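One common answer, sketched in plain Python under the assumption that each item in the loop carries the schema and table name (as the lookup-driven bronze copy in the video does; the layer and path format here are illustrative, not from the video):

```python
# Hypothetical sketch: derive one silver-layer folder per table inside a
# loop, assuming each lookup item carries SchemaName and TableName.
tables = [
    {"SchemaName": "SalesLT", "TableName": "Customer"},
    {"SchemaName": "SalesLT", "TableName": "Product"},
]

def layer_path(item, layer):
    # e.g. "silver/SalesLT/Customer/": a predictable, per-table name
    # even though all writes happen inside the same loop
    return f"{layer}/{item['SchemaName']}/{item['TableName']}/"

silver_paths = [layer_path(t, "silver") for t in tables]
print(silver_paths)  # ['silver/SalesLT/Customer/', 'silver/SalesLT/Product/']
```

In a notebook you would build this string per iteration and pass it to the write call, so the loop itself enforces the naming convention.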
Appreciate it Mr. K ! Thank you ❤
A senior colleague recommended this video to me and I must say I am beyond grateful. Thanks for compiling such an amazing content. A Game Changer!!!
Thank you so much :)
Thank you, Mr. K. I'm a project manager with a not-so-technical background, and I look after these kinds of projects. Your explanation gave me a perfect functional understanding of the flow and made me aware of data engineering projects end to end. Thank you!
Thank you so much :)
Can you please provide the notebooks?
Can anyone create a resume based on this project?
Awesome project and really well explained. I learned a lot. I built the same as the video progressed, although I used a more comprehensive database with multiple schemas. That forced me to learn how to create the pipelines slightly differently. Really enjoyed every minute of the video. I like the end-to-end project approach; you learn so much more, and it's closer to a real-world data flow.
Thank you so much :)
Can you share the dataset link or script?
Great video! Your explanation of end-to-end Azure Data Engineering was clear and easy to follow. I appreciate your expertise in breaking it down for us.
Thank you so much :)
Amazing content. Thank you so much for your efforts!
I have a question about the tech stack that you chose. I see a lot of different combinations of Azure-native tools at companies: some only use ADF & Databricks (or even only Databricks); others, like this demo, use ADF, Data Lake Gen2, Synapse Analytics, and Databricks. Could you please explain why you chose those tools for the demo? For me, ADF for ingestion and Databricks for ETL alone would also work for this end-to-end, right? Or did you choose them just for training purposes? It would be very much appreciated! Thank you.
Hi @Mr. K Talks Tech could you please make a project on delta live tables.
Sure :)
Mr. K, this is top-notch content! But I have been avoiding Azure for 5 years, and I think I should avoid it more. Why do they make the process so repetitive? I am a fan of AWS, and I can accomplish the same pipeline in a few lines of code with some Python knowledge. But your tutoring is exceptional. I've subscribed!
I am following this tutorial on an M1 MacBook Pro. Please provide any advice, as I can't seem to install the ADF integration runtime. I am currently using Azure Data Studio with Docker. I tried to install SSMS on UTM and Parallels, but they all failed.
Watched the tutorial in one go. When I decided to become an Azure data engineer and tried to watch your video, it looked difficult to me because Azure has so many services, with all that authentication. Then I practiced ADF, Data Lake, Databricks, and Power BI for months, taking the DP-203 and PL-300 exams and making some small projects, and then I understood this project. For a beginner, this might be difficult.
Does it copy all the data from SQL Server each time, and will it overwrite in ADLS Gen2?
Yes, that's right; I haven't implemented an incremental load in this project :)
Please make a video on the same.
@mr.ktalkstech
df.write.format("delta").mode("append").save("path") - you can implement an incremental load with the mode option in PySpark. Note that mode("overwrite") replaces all existing data (a full load), so use append or a Delta MERGE for incremental.
Wow!! What a clear, knowledge-filled video! Thank you so much!!
One "when to use what" question: between Synapse Analytics and Data Factory, which should I use if both offer pipeline creation and I want to use Databricks? Thank you in advance!
Please make more end-to-end real-time projects like this using Azure. The video is very informative, and I have learned so many things.
Thank you so much, Sure :)
Thanks for this project; expecting more such end-to-end projects.
Hello Mr. K, thanks for this informative video. However, I am facing some connectivity issues at the beginning when trying to create a new linked service to connect to the database. I have tried several options and couldn't succeed. Can you please help?
Can you please make a video showing how the ADF pipelines and ADB notebooks should be created to be CI/CD-ready, i.e., able to be promoted to a higher environment?
Great video, thanks! One question: couldn't you do all the views etc. (the loading part after the gold layer) in Databricks, and then connect Power BI directly to them? So, would Synapse really be needed here at all?
Very clear explanations at each step. Thank you very much. Would you do a video on Change Data Capture (CDC) and SCD types with numerical examples? Thank you.
Thanks for giving such good knowledge! Can you tell me the logic for incremental load? Do we have to add one extra Lookup to find the maximum date of the full data and then apply the condition new_date >= max_date?
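For anyone puzzling over this, here is a tiny pure-Python sketch of the high-watermark pattern the comment describes; the column name, dates, and rows are made up for illustration and are not from the video:

```python
from datetime import date

# Hypothetical high-watermark sketch: one extra "lookup" finds the max date
# already loaded, then only newer source rows are copied.
loaded = [
    {"id": 1, "ModifiedDate": date(2024, 1, 1)},
    {"id": 2, "ModifiedDate": date(2024, 1, 5)},
]
source = [
    {"id": 2, "ModifiedDate": date(2024, 1, 5)},
    {"id": 3, "ModifiedDate": date(2024, 1, 9)},
]

watermark = max(r["ModifiedDate"] for r in loaded)  # the extra lookup step
# Strictly greater-than avoids re-copying the boundary row; using >= would
# need a dedup/merge step afterwards.
new_rows = [r for r in source if r["ModifiedDate"] > watermark]
print([r["id"] for r in new_rows])  # [3]
```

In ADF this maps to a Lookup activity for the watermark plus a parameterized source query with a WHERE clause on the date column.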
Hello Mr. K,
At 1:08:45 I am not getting the option to enable Credential Passthrough, and because of that the Databricks notebook code is not working and I am stuck there. Can you please help me with this?
This video is very helpful: good hands-on experience with Azure services. The way you explained the step-by-step process is very interesting. Great content.
Thank you so much :)
Thank you! Great, excellent video. Any suggestions, ideas, or videos for a scenario where the source data ingestion is CSV files? The database is an old legacy system and data lands in CSV format (11 different files of data for different functions of the business, with one unique ID), so the starting point is to read from a directory that has the CSV files instead of on-prem SQL.
Nice explanation; thanks for the wonderful session.
I tried to create a Data Factory and ASA through the Azure Sandbox, but unfortunately I could not. It says "RequestDisallowedByPolicy". Do you have a solution for this? If so, please tell me.
Excellent video for anyone who wants to learn Azure data engineering. Do you know, in the real world, what the volume of on-prem data in the source system would be, and what the number of nodes in Azure Databricks in the target system would be? Thanks in advance.
Although it's a good playlist, you should have started the video by creating the resources from scratch. As a beginner, I faced challenges building the resources with the specific config.
Hi bro, I have never seen a person explain like this. Though you explain quickly, I was able to grab the valuable information. A small request from my side: by any chance, can you please explain how to do the ETL validations here, like count and duplicate checks, column mapping, and the business logic of records in source and target?
Thank you so much :) Sure, I'll do that in the future :)
Make another project and add CDC, surrogate keys, and SCD Type 2 to it.
Just one question: what's the use of Azure Synapse, as we just loaded views? Can Power BI be connected directly to ADLS?
Hi, I don't know why my pipeline run failed in the ForEach step; it says "Out Of Memory" although I followed the same steps. I don't know how to fix this.
Thank you for the video, Mr. K. I just wonder what made you use an ADF pipeline rather than a DLT one?
This is one way to do it - yeah, of course we can use DLT as well; I'll cover that in another end-to-end project :)
@mr.ktalkstech Thank you so much for your reply. Your videos are amazing. Well done, sir! I am looking forward to watching the DLT pipeline end to end and the rest of your videos.
Getting the error "loading failed" when connecting to Key Vault in the integration runtime... not able to connect to Key Vault for the password.
Has anyone been charged for using Azure Key Vault? How much will it cost, as I am using a pay-as-you-go subscription?
Can anyone help with the dataset?