Great Tutor! I enjoyed watching! ❤
Thank you so much for your appreciation.
Can you please help me understand how to handle NULL values in the Conditional Split? I don't want my staging table to get an incorrect row count because of NULL values.
You can use the `REPLACENULL(column, "")` function in a Derived Column transformation before the split, or test for NULLs directly in the Conditional Split condition with `ISNULL(column)`.
Why can't we use Lookup or SCD? Is it because MySQL won't support this type of operation, or is there another reason?
The SSIS Lookup and Slowly Changing Dimension components work against OLE DB (SQL Server) connections; they can't be pointed directly at a MySQL source over ADO.NET, which is why we use the Merge Join and Conditional Split approach here.
How can I dynamically move incremental records (insert, update, delete) for multiple SQL tables from one server to another using SSIS?
Hi, I don't have code for that. Below is the code for incremental load using SQL queries, but the table and column names are hard-coded.
th-cam.com/video/-rMDmD7GNtE/w-d-xo.html
If you want to make the code dynamic then you would need to write dynamic SQL; maybe you can find something on Google for your requirement.
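As a minimal sketch, dynamic SQL in SQL Server could look like this. The table name, the `ModifiedDate` column, and the query body are placeholders for illustration, not code from the video:

```sql
DECLARE @TableName sysname  = N'dbo.Customers';   -- placeholder name
DECLARE @Sql       nvarchar(max);

-- Build the statement with QUOTENAME so the object name is safely quoted.
SET @Sql = N'SELECT MAX(ModifiedDate) FROM '
         + QUOTENAME(PARSENAME(@TableName, 2)) + N'.'
         + QUOTENAME(PARSENAME(@TableName, 1)) + N';';

EXEC sys.sp_executesql @Sql;
```

The same pattern can be looped over a list of table names stored in a control table, which is the usual way to make an incremental-load package table-agnostic.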
Can we update an Oracle table from SSIS? I tried with an Execute SQL Task but it did not work.
I have not tried it, but it should work.
Is there any way I can delete data from the destination database table when it is deleted in the source table, particularly using Merge Join and Conditional Split? Source: MySQL, destination: SQL Server.
For deleting the data you can only use a delete query; Merge Join and Conditional Split won't delete anything by themselves. But yes, you can use Merge Join and Conditional Split to figure out which records got deleted from MySQL, insert those records into a staging table in SQL Server, and then delete from the destination table based on a join with that staging table.
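A rough sketch of that final delete step in SQL Server, run from an Execute SQL Task after the data flow; the table names (`dbo.Destination`, `dbo.DeletedStaging`) and the `Id` key column are assumptions for illustration:

```sql
-- After the data flow has written the IDs of rows that no longer
-- exist in MySQL into dbo.DeletedStaging (names are placeholders):
DELETE d
FROM dbo.Destination AS d
INNER JOIN dbo.DeletedStaging AS s
    ON s.Id = d.Id;

-- Empty the staging table for the next run.
TRUNCATE TABLE dbo.DeletedStaging;
```

Doing the delete as one joined statement keeps it set-based, so it stays fast even when many rows were removed at the source.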
@@learnssis Thank you for the response. Love from Bhutan 🩷💜💜
Incredible Sir!
A question: I have a complex package and I need the next task to execute only if the previous one (a SQL script task) has fully completed and succeeded. Is this possible?
This is the default behavior of SSIS: the next task only executes if the previous task succeeded. You don't need to make any special changes to get this, just connect the previous task to the next one with the default (green, Success) precedence constraint.
Is there any particular reason that the update query is in an Execute SQL Task and not part of the data flow? I assume you could have a condition in the Conditional Split to add data to the email table for updated rows, or am I missing something?
To update a record in a MySQL table we have to use an update query. There are two ways to update the data:
1. We can use the OLE DB Command to update one record at a time inside the data flow task. But this method is very slow: to update 10k records, the update query runs 10k times.
2. Another method is to insert the records to be updated into a staging table and then, outside the data flow task, update the main table from the staging table based on a join on id. This is called a set-based update and it is very fast compared to the previous method using the OLE DB Command transformation. I am also not sure whether the OLE DB Command works with MySQL at all; it works with SQL Server.
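As a sketch, the set-based update described in point 2 could look like this in MySQL; the table and column names (`main_table`, `staging_table`, `name`, `email`) are made up for illustration:

```sql
-- One statement updates every changed row at once, instead of the
-- OLE DB Command firing one UPDATE per row.
UPDATE main_table AS m
INNER JOIN staging_table AS s
    ON s.id = m.id
SET m.name  = s.name,
    m.email = s.email;

-- Clear the staging table for the next run.
TRUNCATE TABLE staging_table;
```

This is why the update lives in an Execute SQL Task after the data flow: the data flow only fills the staging table, and the database engine does the heavy lifting in one set-based statement.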
@@learnssis Got it
So, I have 1 million rows in MySQL
and I want to replicate them to MSSQL every 10 seconds.
What method should I take?
If I go with this tutorial, it will update 1 million rows every 10 seconds, which will be a disaster.
What am I supposed to do here? Please help.
Currently there is no technology available that can replicate 1 million records from MySQL to MSSQL every 10 seconds.
OK, your channel is superb, but now please do a complete ETL: remove columns, remove rows that contain NULL data, add one or two columns from another table, insert only updated records and only new records, and do aggregations (for example, I have sales province-wise, so save the data in a new column with the name of the province). Save only valuable data in the warehouse, almost ready for building an analysis dashboard. I want to say: please do some extraordinary, fully professional work that would be helpful in a big organization.
Thank you for your suggestions; I will take care of them.
How do I delete incremental load records in MySQL from SQL Server using SSIS?
Take a look at this video; here I have shown the steps to delete incremental load records from SQL Server.
th-cam.com/video/bZaizKvRA8o/w-d-xo.html
@@learnssis Thanks
Honestly, I don't see any incremental approach here: you needed to download all the data from the SQL Server table and all the data from the MySQL table, then costly sort them, join them, and split them with logical operations. I'd call it heavy and intensive processing outside the SQL Server memory pool, without the SQL engine's ability to perform fast and efficient set operations. On larger datasets this approach will be slower than truncating the destination and fetching all the data again. Incremental means: fetch only the data that should be fetched, then do the rest (update, insert, and the very often omitted delete). Your table doesn't have any obvious marker that the data has changed. My approach:
1) On the source, create an additional table with an id and a binary checksum column (leave it empty before the first run).
In SSIS:
2) Left join with that table, taking only rows with a different checksum (updated) or rows not present in it (insert).
3) Split them and do the rest.
4) Recalculate the additional table.
If you don't have the luxury of modifying the source... you are, to be honest, doomed.
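A sketch of steps 1, 2, and 4 on the MySQL side, using `CRC32` over concatenated columns as the checksum; the helper table name `row_checksums` and the columns of `source_table` are assumptions for illustration:

```sql
-- 1) Helper table on the source, empty before the first run.
CREATE TABLE row_checksums (
    id       INT PRIMARY KEY,
    checksum INT UNSIGNED
);

-- 2) Source query for the SSIS data flow: only new or changed rows.
SELECT t.*
FROM source_table AS t
LEFT JOIN row_checksums AS c
    ON c.id = t.id
WHERE c.id IS NULL                                          -- new row: insert
   OR c.checksum <> CRC32(CONCAT_WS('|', t.name, t.email)); -- changed: update

-- 4) After the load succeeds, recalculate the helper table.
REPLACE INTO row_checksums (id, checksum)
SELECT t.id, CRC32(CONCAT_WS('|', t.name, t.email))
FROM source_table AS t;
```

With this in place the data flow only ever receives the delta, so the expensive sort/merge-join over full tables disappears.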
Thank you so much for suggesting this approach. This seems very promising to me. Maybe I can test this approach as well to check how it works in terms of performance. Anyway thanks again.
Really nice 👍
Thank you Naveen Raja.