Hi bro, glad to see your classes, but if you could put the links to the documentation you mentioned in the video in the description, that would be helpful.
If we have a file with more than 50 column names, is there any option while importing to mark the first row as the header, so that at least all the source columns are generated automatically?
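For reference: the delimited text dataset supports exactly this. Tick "First row as header" on the dataset, then use "Import schema" so all 50 columns are generated from the file. A minimal sketch of the dataset JSON, assuming blob storage and illustrative linked service and file names:

```json
{
    "name": "SourceCsvDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "AzureBlobStorageLS",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "input",
                "fileName": "wide-file.csv"
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}
```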
Please do some more practical sessions on ADF related to real-time projects. It will be helpful to everyone.
Yesss exactly
For multiple files in a nested folder, what do we have to do? In the source dataset, leave the file name empty, and enable the recursive option in the copy activity's source settings in the pipeline.
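For reference, a hedged sketch of the copy activity source settings that pick up every file under a nested folder (blob storage assumed; folder and file patterns are illustrative):

```json
"source": {
    "type": "DelimitedTextSource",
    "storeSettings": {
        "type": "AzureBlobStorageReadSettings",
        "recursive": true,
        "wildcardFolderPath": "rootfolder",
        "wildcardFileName": "*.csv"
    }
}
```

With recursive set to true and the file name left as a wildcard (or empty in the dataset), the copy activity walks all subfolders.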
Thanks for your effort. It would be better to do a complete lab on each topic instead of just showing it in the ADF window.
I am getting the error "Invalid configuration value detected for fs.azure.account.key" when I use the Delta Lake connector. When I preview the data in DBFS (Hive) I am able to view it, but when I try to copy the data I get the error. Can you please help me?
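For anyone hitting the same thing: that error usually means the Spark cluster behind the Delta connector has no account key configured for the storage account. A hedged sketch of one common fix in a Databricks notebook, where the storage account, secret scope, and key names are placeholders:

```python
# Register the storage account key so the fs.azure driver can authenticate.
# <storage-account>, <scope> and <key-name> are placeholders for your setup.
spark.conf.set(
    "fs.azure.account.key.<storage-account>.dfs.core.windows.net",
    dbutils.secrets.get(scope="<scope>", key="<key-name>"))
```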
Hey, do you have a video on the Copy activity where we load a CSV into Azure Data Lake?
Hi, I have seen all your videos. I am working along with Microsoft on an upcoming project with Providence Healthcare in the US. The videos were really helpful. Can you comment on whether the copy data activity will work if we want to copy files across different subscriptions?
How can I copy tables from a Hive database and sink them into CSV?
Can we connect? I have an issue with the HTTP connector.
Actually, I have created an FTP server on my Windows 10 machine using IIS Manager, and while creating a linked service to connect to it I get the error "connection failed".
Could you please do one thing: instead of storing these files directly in cloud blob storage, create a local FTP server on your local machine, then use Data Factory to connect to that local FTP server and copy the data to blob storage.
You are awesome bro.
Thank you 🙂
Why did you choose a CSV file when you started your discussion with a table (around 11:38)? Please explain.
Hello, I want to compare the values of four fields to see whether they are the same or not, i.e. if a = b = c = d. Can we do that using the derived column transformation? Thanks in advance.
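For reference, this is doable in a derived column. A minimal sketch in the data flow expression language, assuming the columns are literally named a, b, c, and d:

```
iif(a == b && b == c && c == d, 'same', 'different')
```

The iif function returns the second argument when the condition holds, otherwise the third.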
How do I copy data to a computed column in the sink? It is giving an error.
Bro, can you please make a video on using the UPSERT option in the copy data activity when loading into multiple tables at a time?
How do I load the top 10 records from a CSV file into an Azure SQL Database using ADF?
You can use a surrogate key transformation and a filter condition to achieve this task; see the sketch below.
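A minimal data flow script sketch of that idea (stream names are illustrative): the surrogate key transformation numbers the rows, and the filter keeps the first ten. Note that "top 10" is only meaningful if a sort is applied before the key is generated:

```
source1 keyGenerate(output(rowNo as long),
    startAt: 1L) ~> SurrogateKey1
SurrogateKey1 filter(rowNo <= 10) ~> Filter1
```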
Hi Wafa,
Could you please help me store the data below into a single row of a SQL table?
Table Schema
MPR,ReadDate,Block1,ReadType1,Block2,ReadType2,Block3,ReadType3,Block4,ReadType4,Block5,ReadType5,Block6,ReadType6
Explanation:
Block1 ==> the kWh value for 00:30 goes into this column
Block2 ==> the kWh value for 01:00 goes into this column
Block3 ==> the kWh value for 01:30 goes into this column
Block4 ==> the kWh value for 02:00 goes into this column
Block5 ==> the kWh value for 02:30 goes into this column
Block6 ==> the kWh value for 03:00 goes into this column
Here is the data:
MPR,ReadDatetime,kWh,ReadType
111,04/11/2020 00:30,100,Actual
111,04/11/2020 01:00,100,Actual
111,04/11/2020 01:30,100,Actual
111,04/11/2020 02:00,100,Actual
111,04/11/2020 02:30,100,Actual
111,04/11/2020 03:00,100,Actual
Please suggest how I can achieve this.
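One hedged way to approach this: copy the CSV as-is into a staging table, then pivot with conditional aggregation in T-SQL on the sink side. StagingReads is a hypothetical staging table name, and the time literals follow the blocks in the post (Block3 through Block5 repeat the same pattern):

```sql
SELECT
    MPR,
    CAST(ReadDatetime AS date) AS ReadDate,
    MAX(CASE WHEN CONVERT(char(5), ReadDatetime, 108) = '00:30' THEN kWh      END) AS Block1,
    MAX(CASE WHEN CONVERT(char(5), ReadDatetime, 108) = '00:30' THEN ReadType END) AS ReadType1,
    MAX(CASE WHEN CONVERT(char(5), ReadDatetime, 108) = '01:00' THEN kWh      END) AS Block2,
    MAX(CASE WHEN CONVERT(char(5), ReadDatetime, 108) = '01:00' THEN ReadType END) AS ReadType2,
    -- ... Block3-Block5 / ReadType3-ReadType5 follow the same pattern ...
    MAX(CASE WHEN CONVERT(char(5), ReadDatetime, 108) = '03:00' THEN kWh      END) AS Block6,
    MAX(CASE WHEN CONVERT(char(5), ReadDatetime, 108) = '03:00' THEN ReadType END) AS ReadType6
FROM dbo.StagingReads  -- hypothetical staging table loaded by the copy activity
GROUP BY MPR, CAST(ReadDatetime AS date);
```

CONVERT with style 108 formats the datetime as hh:mi:ss, and char(5) truncates it to the hh:mi block label used above.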
Is it possible to transfer PDF and JPG files?
Hey, can you help me with unzipping a password-protected ZIP file using Azure Data Factory?
Awesome videos, bro. Do you have the C# code to create a copy activity programmatically? I'm trying to add additional columns to the source dataset but couldn't find a way. It would be great if you could help me with that. Thank you
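Not the C# itself, but for reference: whatever the SDK code builds ultimately has to serialize to a copy source with an additionalColumns list, which is where the extra columns are declared. A hedged sketch of that JSON fragment (names and values are illustrative; $$FILEPATH is the built-in token for the source file path):

```json
"source": {
    "type": "DelimitedTextSource",
    "additionalColumns": [
        { "name": "SourceFile", "value": "$$FILEPATH" },
        { "name": "LoadedBy", "value": "adf-pipeline" }
    ]
}
```

In the .NET SDK there should be a corresponding AdditionalColumns property on the copy source model, but check your SDK version, as it was added relatively late.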
Thank you!
Awesome. Could you also cover configuring the SSIS IR with a VNet?
Can you please share a video on copying an on-premises SQL Server database to Azure Blob Storage using the Copy Data tool?
Thank you.
Welcome 🤗
Is it possible to view the column pattern expression in the window transformation?
I dream of a world where 99% of IT tutorials are not Indian.
Hehe 😅
Thanks !
Thanks a lot
Welcome 😊
Too many repeats. If you eliminated the repetition, the information would amount to only 20%-30% of the total runtime. Try to avoid it, as it makes the listener bored.
What is the difference between a file and a table?
Links should be placed in the description, not on the video 🤷🏻
Please share your presentation
Why are you splitting such small topics across this many videos and repeating content in each one? You could cover 30 videos' worth of content in a single 40-minute video.
okay