Thank you, it helped me debug another issue I am trying to solve.
You are really my favorite YouTuber.
Thank you 🙂
But how do we handle it if it's an object instead of an array?
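If the goal is to iterate, ForEach only accepts arrays, so one trick (a sketch, with a hypothetical Lookup activity named 'Lookup1') is to wrap the single object in an array first:

@createArray(activity('Lookup1').output.firstRow)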
Appreciate your work. Keep it up 👍
Thank you, Nitish Raju 🙂
Please give me an idea of how to kill an infinite loop created inside the Until activity in Azure Data Factory.
The Until loop will break when the defined condition becomes true.
If you're talking about killing it manually, then cancel the pipeline execution.
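Also worth noting: the Until activity has a timeout property, so a runaway loop can be bounded even if the condition never turns true. A sketch of the relevant JSON, with a hypothetical 'done' variable and a one-hour limit:

"type": "Until",
"typeProperties": {
    "expression": { "value": "@equals(variables('done'), true)", "type": "Expression" },
    "timeout": "0.01:00:00"
}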
I want to pass the JSON output of the Web activity into a Databricks activity. I need the whole JSON output, since Databricks will process the entire output. How do we do that in ADF?
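One approach, as a sketch assuming the Web activity is named 'Web1' and the Databricks Notebook activity has a base parameter called, say, 'payload': stringify the whole output and pass it through as the parameter value:

@string(activity('Web1').output)

Inside the notebook you can then read the parameter (e.g. dbutils.widgets.get('payload')) and parse it back into JSON.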
I want to store all the children data from the childItems array returned by the Get Metadata activity in MongoDB. How can I do that?
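As a sketch, assuming the Get Metadata activity is named 'Get Metadata1' and has 'Child Items' in its field list, the array is available as:

@activity('Get Metadata1').output.childItems

You could loop over it with a ForEach and write each item, or stringify the whole array with @string(...) and land it wherever your MongoDB load picks it up.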
Thanks for the video. I have a use case: I'm trying to load JSON (simple format) from an API into a single column (data type VARIANT) in a SQL table through a Copy activity in ADF. When I try that, the keys get split and a 'number of columns mismatch' error appears. Is there a solution for this use case?
Thank you Wafa!
Welcome 😀
Thanks a lot, brother, for your tutorials. I have a question: how do I handle special characters in JSON (source) field names? The pipeline is failing with an error saying the input JSON format has special characters, and I am calling the JSON data using an API (GET). Please advise.
E.g.:
{"customer (name) " : "abc"}
Wonderful ❤️
Thank you 😊
How do I parse without a ForEach activity? I run Get Metadata inside a ForEach, and Get Metadata always returns one file.
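A common pattern, as a sketch assuming a Get Metadata activity named 'Get Metadata1' pointed at the folder with the 'Child Items' field selected: run it once before the ForEach; it returns every file in a single call, and the ForEach items become

@activity('Get Metadata1').output.childItems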
Fantastic tutorial. Thank you. How do I write the failed output of a copy activity into a file?
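One option, sketched with a hypothetical activity name: attach a step to the Copy activity's Failure path and capture the error details from its output, e.g.

@string(activity('Copy data1').output.errors)

which can then be written out to a file through a parameterized sink.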
Great video!
Thank you 😊
Hi, I want ADF to extract the content of a text file (from storage) and, based on its value, trigger another pipeline.
Use the Execute Pipeline activity to trigger another pipeline.
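A sketch of the whole flow, assuming a Lookup activity named 'Lookup1' reading the text file through a headerless delimited-text dataset (so the first column surfaces as Prop_0): follow it with an If Condition whose expression is something like

@equals(activity('Lookup1').output.firstRow.Prop_0, 'expected-value')

and place the Execute Pipeline activity on the True branch.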
I am unable to read the Web activity's JSON response; I'm getting 'property selection not supported on values of type String'.
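That error usually means the value being dotted into is still a string rather than a parsed object. The usual fix, sketched with a hypothetical activity name 'Web1' and field name 'someProperty': wrap the string in json() before selecting a property:

@json(activity('Web1').output.Response).someProperty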
Excellent
Thank you ☺️
Hello techies, I am generating JSON using Teradata SQL and I want to send this data to a web application via an API call. How should I go about it? I am completely new to ADF and have just started exploring it, so any help would be of great value.
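A sketch with hypothetical names: run a Lookup activity ('Lookup1') against a Teradata dataset to execute the SQL that produces the JSON, then follow it with a Web activity using method POST, your application's URL, a Content-Type: application/json header, and a body of

@activity('Lookup1').output.firstRow

adjusted to whatever shape your query actually returns.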
Thank you for all the hard work. Please show how to store this JSON output in Blob Storage. Thank you.
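A common workaround, sketched with hypothetical names: add a Copy activity with a small dummy source, define an additional column (say 'payload') in the source whose value is

@string(activity('Web1').output)

and map only that column to a Blob Storage sink, so the stringified JSON lands in the file.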
Please add a video on upsert in ADF.
Sure
Hello respected guru,
how do I loop through this JSON for ingestion? I tried using the Lookup activity but was unable to access the child items. Can you please help, brother?
I want to use a Copy activity and pass the items inside tablelist below to ingest the data.
{
"tablelist": [
{
"tblname": "rbf",
"secretkey": "xyz",
"container": "migration",
"cpath": "dev01"
},
{
"tblname": "abcd",
"secretkey": "abcd",
"container": "migration",
"cpath": "dev02/"
}
]
}
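A sketch, assuming the Lookup activity reading this file is named 'Lookup1': Lookup returns a single-file result under firstRow, so set the ForEach items to

@activity('Lookup1').output.firstRow.tablelist

and inside the loop read each field off the current item, e.g. @item().tblname, @item().container and @item().cpath, feeding them to a parameterized Copy activity.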
Very good content. Nice videos. I am looking for more such videos.
How can we log pipeline errors in a blob storage file?
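One sketch, with hypothetical names: hang a logging step off each activity's Failure path, capture the failure with

@activity('Copy data1').error.message

and write it to a blob, e.g. into a run-specific path like errors/@{pipeline().RunId}.txt built from system variables.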
The BEST
Thank you ☺️
I have only one JSON file in Azure Blob Storage, and it contains millions of JSON objects.
My question is: how do I read that file and extract and upload millions of individual JSON documents instead of one?
Any ideas?
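One direction to explore (a sketch, not a full solution): a Copy activity with JSON source and sink can emit one object per line if the sink format settings use the setOfObjects file pattern, e.g.:

"formatSettings": {
    "type": "JsonWriteSettings",
    "filePattern": "setOfObjects"
}

If you genuinely need millions of separate files rather than lines, a Mapping Data Flow with a partitioned sink is the more usual route.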
@WafaStudies, are you available to do an ADF POC?
Good video!
Can you make a video on the Copy Data activity? Copying JSON from one activity to another, please.
I have stored all my expressions in one config table to make everything dynamic. Here you are fetching only one type of variable; I have thousands of types of variables.
How do I fetch each of them into a pipeline variable? Does anybody have that logic? I don't want to use the Filter activity and Set Variable multiple times.
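One idea, sketched with a hypothetical Lookup named 'Lookup1' that selects the whole config table ('First row only' unchecked): skip Set Variable entirely and index into the result inline wherever each expression is needed, e.g.

@activity('Lookup1').output.value[3].ExpressionText

where 'ExpressionText' stands in for your actual column name; a ForEach over output.value covers the case where every row must be processed.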
Thanks
Please do tell us about yourself as well; we would love to get to know you.
Sure. I will plan for it
The videos are very good, but as I'm from Brazil and my English isn't great, I need subtitles. Would that be possible? Thanks.
Thank you. I will try to add subtitles.
Need captions...
Hi bro, I have 100 tables; when I create datasets, is it mandatory to create 100 of them? Also, I want to talk with you personally; please provide your mobile number.
Do you want to copy data from 100 tables or do something else?
Parameterize the dataset.
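A sketch of that, with a hypothetical parameter name: define a string parameter tableName on one dataset, point the dataset's table at

@dataset().tableName

and pass the real name in from the pipeline, e.g. @item().name inside a ForEach over your table list; a single dataset then serves all 100 tables.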
Thanks
Welcome 😁