The scenario you showed works for that particular pipeline, but it does not log the values if we run other pipelines from ADF. Could you please let me know why that is?
That's an awesome explanation, thanks. My question is: in this scenario we always hardcode the status as 'Success'. But what if the activity in the pipeline fails?
Thanks for creating these wonderful real-time ADF use cases. The clear presentation is highly appreciated; looking forward to more relevant videos.
Thank you 🙂
Nice video, explained in simple steps....
Great job bruh.... 👍👍👍👍
Thank you 😊
Can we change the location we selected after creating a data factory?
number one class bro
Thank you 😊
@WafaStudies How can we add the status of the pipeline run dynamically to the audit file?
Can you please add source table name, data read, data written, rows inserted/updated/deleted, I/O, and that kind of information to the audit log? It would be great.
Can you please help me with the question below.
What is the difference between a Data Flow expression and a Pipeline expression in Azure Data Factory?
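For anyone comparing the two: pipeline expressions are evaluated by the ADF control flow and use the @-prefixed function syntax, while Data Flow expressions run per row inside a mapping data flow on Spark and use a different function library with no @ prefix. A minimal sketch contrasting the two (the parameter and column names here are hypothetical):

```
// Pipeline expression (control flow): @-prefixed, evaluated once per run
@concat(pipeline().parameters.folderName, '/', formatDateTime(utcnow(), 'yyyyMMdd'), '.csv')

// Data Flow expression (mapping data flow): no @ prefix, evaluated per row on Spark
iif(toInteger(age) >= 18, 'adult', 'minor')
```

The pipeline expression would typically be set on an activity or dataset property, while the Data Flow expression would go in a transformation such as a Derived Column or Filter.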
Hi bro, my scenario is: all retailers' data is uploaded to an app service, and from there the data should be uploaded to Blob storage. Is that possible?
Were you able to implement this scenario? If so, can you please let me know, as I have to implement the same scenario... your response will help.
What if the Pipeline fails?
Why is the status parameter you are passing always 'Success', @wafa?
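Since the hardcoded 'Success' question comes up a few times in this thread, here is a minimal sketch of one common way to capture the status dynamically: wire two logging activities to the success and failure outputs of the main activity and pass the status (and error message) through pipeline expressions. The activity name CopyData and the stored procedure dbo.usp_LogPipelineRun are hypothetical placeholders:

```
{
  "activities": [
    {
      "name": "LogSuccess",
      "type": "SqlServerStoredProcedure",
      // fires only when CopyData succeeds
      "dependsOn": [
        { "activity": "CopyData", "dependencyConditions": [ "Succeeded" ] }
      ],
      "typeProperties": {
        "storedProcedureName": "dbo.usp_LogPipelineRun",
        "storedProcedureParameters": {
          "PipelineName": { "value": "@pipeline().Pipeline", "type": "String" },
          "RunId":        { "value": "@pipeline().RunId",    "type": "String" },
          "Status":       { "value": "Success",              "type": "String" }
        }
      }
    },
    {
      "name": "LogFailure",
      "type": "SqlServerStoredProcedure",
      // fires only when CopyData fails
      "dependsOn": [
        { "activity": "CopyData", "dependencyConditions": [ "Failed" ] }
      ],
      "typeProperties": {
        "storedProcedureName": "dbo.usp_LogPipelineRun",
        "storedProcedureParameters": {
          "PipelineName": { "value": "@pipeline().Pipeline", "type": "String" },
          "RunId":        { "value": "@pipeline().RunId",    "type": "String" },
          "Status":       { "value": "Failed",               "type": "String" },
          "ErrorMessage": { "value": "@activity('CopyData').error.message", "type": "String" }
        }
      }
    }
  ]
}
```

Each branch still passes a literal status, but only the branch that actually fires writes the audit record, so the log ends up with the real outcome, and the failure branch also captures the error message via @activity('CopyData').error.message.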
That was fun!