.. I learned ADF watching your videos .. thank you so much
Glad to know that 😇 All the best for the future!
Very well explained
Great! This was what I was looking for.
Nice nice
Explanation is very good Annu
Keep up the good work ✌️😊
Thank you so much, ma'am, for the video. It was very helpful. But could you please make a video or guide me on how to use a stored procedure in the pipeline to first check if the table exists, then truncate it and load it.. or, if the table does not exist, create the table and then load it?
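[Editor's note: a minimal sketch of the pattern this comment asks about, not from the video. The same T-SQL could be wrapped in a stored procedure and invoked with the Stored Procedure activity; here it is shown inline in a Script activity. The activity shape is from memory, and the table, column, and linked-service names are placeholders.]

```json
{
  "name": "EnsureTableAndTruncate",
  "type": "Script",
  "linkedServiceName": {
    "referenceName": "AzureSqlLS",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "scripts": [
      {
        "type": "NonQuery",
        "text": "IF OBJECT_ID('dbo.StagingTable','U') IS NULL CREATE TABLE dbo.StagingTable (Id INT, Name NVARCHAR(100)) ELSE TRUNCATE TABLE dbo.StagingTable;"
      }
    ]
  }
}
```

A Copy activity placed on this activity's success path would then load into a table that is guaranteed to exist and be empty.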
Thanks Annu, very nicely explained
I wanted to loop through 5 layers of subfolders and copy the JSON to Azure SQL DB; this example was useful.
Very glad to know it was helpful 😊 Thanks for watching and commenting
Great video, ma'am.
I have a doubt: after using a Filter activity, why don't we use a Copy activity directly to load data into the destination?
Please clear my doubt.
Thanks for the video, this has made my job simple. I am really thankful, ma'am..
Thank you for watching the videos. Glad to know it was helpful.
Very nice video
Thank you
Hi Annu, thanks for your videos and for sharing knowledge. Could you please make videos on incremental load, SCD1, and SCD2?
Kindly watch parts 9 to 12 of the Synapse playlist; a similar implementation can be done in ADF: th-cam.com/video/FXw1gPaa2-M/w-d-xo.html
Thanks Annu
Great video! Can you please show how to copy these multiple files into a database? Not just the number of files but the content of the files.
Thank you so much🤗
This is an impressive use case, but now consider keeping everything up to the Filter activity, then copying all the files you get from the filter to blob storage.
Hello, as an alternative to the filter activity, could we instead use the length function over the childItems array of the second get metadata to get the number of files inside the folder?
Yes, that can be another possible solution 😊 Thanks for sharing.
@@azurecontentannu6399 You’re welcome. Nice video! This is a good use case for metadata management.
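[Editor's note: the alternative described in this thread comes down to a single expression. Assuming the second Get Metadata activity is named GetMetadata2 and has Child items in its field list, the file count per folder would be:]

```
@length(activity('GetMetadata2').output.childItems)
```

This could go, for example, in a Set Variable activity inside the ForEach, replacing the Filter-based counting.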
Hello, I have one similar case but I need to iterate each container and each subfolder something like recursively, so I need to identify all the subfolder nested
You can try wildcard file path too
Can you please explain when we use the Get Metadata activity versus the Lookup activity, and the difference between the Get Metadata activity and the Lookup activity?
Hi, thank you for your video. Could you tell me how to output the actual folder path (example: 2023/11/23....n subfolders) and the actual file name, instead of the count of files?
You can use the $$FILEPATH option in the additional columns of the Copy activity, but note that it gives the file path relative to the dataset folder as the output.
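[Editor's note: for anyone trying this, the source-side setting in the Copy activity would look roughly like the fragment below. The column name is a placeholder; $$FILEPATH is the reserved variable that resolves to each source file's path relative to the dataset's folder.]

```json
"additionalColumns": [
  {
    "name": "SourceFilePath",
    "value": "$$FILEPATH"
  }
]
```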
Hi Annu, first of all, good job on making such videos. One thing I didn't get: in the Lookup activity you have used '@{item().name}' for the folder name. But how will the pipeline know that this comes from GetMetadata1, since we have one more activity, GetMetadata2, where @{item().name} is also present? Please explain.
Can you please explain?
Hi Chittaranjan, thank you for watching the video. The GetMetadata2, Filter, and Lookup activities are all inside the ForEach. The ForEach is where we have defined the item as GetMetadata1's output, which means that wherever inside the ForEach you refer to item().name, it will refer to the folder names, one by one, for each iteration. Now, if you look at the Filter activity's expression, there we are referring to GetMetadata2's output as the item. So in the Filter expression, item().name means GetMetadata2's output. But coming back to the Lookup, item().name means GetMetadata1's output.
Now, suppose your requirement was to take all the file names coming out of GetMetadata2 and copy all the files as tables into SQL. Then we would need another ForEach connected to GetMetadata2. Since it is not possible to nest a ForEach directly within a ForEach, you have to add an Execute Pipeline activity, put the second ForEach in the child pipeline, pass GetMetadata2's output as the item of ForEach2, and add a Copy activity within ForEach2. In that Copy activity, if you refer to item().name, it will refer to GetMetadata2's output, not GetMetadata1's, because you are within the direct umbrella of ForEach2.
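[Editor's note: a rough sketch of the parent/child wiring described in this reply, assuming a child pipeline named ChildPipeline with an array parameter named fileList — both names are placeholders.]

```json
{
  "name": "RunChildPipeline",
  "type": "ExecutePipeline",
  "typeProperties": {
    "pipeline": {
      "referenceName": "ChildPipeline",
      "type": "PipelineReference"
    },
    "parameters": {
      "fileList": {
        "value": "@activity('GetMetadata2').output.childItems",
        "type": "Expression"
      }
    }
  }
}
```

In ChildPipeline, ForEach2 would then use `"items": "@pipeline().parameters.fileList"`, so that item().name inside it resolves to the file names from GetMetadata2.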
Can we load the data from files in different folders into an Azure SQL table?
Hello, after 15 minutes this video gets stuck and I am not able to follow it afterwards. Please check.
This is good, but if we need to copy folders with files inside those folders, how do we do that? Please suggest.
Hi ma'am, I have 10 files like file1, file2, .... I want to load only the first 5 files to SQL. Can you do a video on this scenario?
Sure 😊
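[Editor's note: one possible sketch of the "first 5 files" scenario, not from the video. Assuming a Get Metadata activity named GetMetadata1 returns the file list in childItems, the ForEach items can be limited with the take() collection function:]

```json
"items": {
  "value": "@take(activity('GetMetadata1').output.childItems, 5)",
  "type": "Expression"
}
```

The Copy activity inside the ForEach then runs only for those first 5 entries.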
If I have an empty folder before deployment, it shows folder 1 - count 8, folder 2 - count 5, folder 3 - count 0. After that, if I copy a file into folder 3, it shows folder 3 twice: folder 3 - count 0 and folder 3 - count 1. Please help me.
How do I merge them?
OK, I've found several examples of this, NONE of which show a kind of recursion that then goes back into the level-2 folders to get or count the files in the subfolders.
The use case I have is to start at a parent folder and extract the path and name of every file in all the subfolders. The folder structure is such that there may be any number of subfolders, with files in the last child folder. The subfolders may be 6 or 7 levels deep 😑
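[Editor's note: one hedged workaround for this recursive case, not shown in the video: a Copy activity source with recursive listing plus a $$FILEPATH additional column can write every file's relative path into a sink, regardless of nesting depth, avoiding explicit recursion in the pipeline. The store-settings type and column name below are placeholders for a blob-based dataset.]

```json
"source": {
  "type": "DelimitedTextSource",
  "storeSettings": {
    "type": "AzureBlobStorageReadSettings",
    "recursive": true,
    "wildcardFileName": "*"
  },
  "additionalColumns": [
    {
      "name": "FullFilePath",
      "value": "$$FILEPATH"
    }
  ]
}
```

Pointing the sink at a SQL table or a single CSV then yields the full inventory of paths and file names in one run.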
gopi - 0, krish - 2, nag - 8
gopi - 1, krish - 2, nag - 8