Looking for help with your team's data strategy? → www.kahandatasolutions.com
Looking to improve your data engineering skillset? → bit.ly/more-kds
Straight to the point, informative, and easy to consume content. Thanks
Thanks for watching!
congratulations on the initiative
Hey Kahan, very informative video. I had a few doubts. If I'm using DMS for CDC, then each change will create a new object in S3, thereby raising costs (because of the huge number of S3 writes). Is there a way to mitigate this?
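For anyone with the same concern, one thing worth looking at (not covered in the video, so treat it as an assumption about your setup) is the S3 target endpoint's CDC batching settings, cdcMaxBatchInterval and cdcMinFileSize, which make DMS accumulate changes into fewer, larger files instead of writing frequent small objects. A rough boto3 sketch with placeholder names and ARNs; the trade-off is higher latency before changes land in S3:

import boto3

dms = boto3.client("dms")

# Hypothetical S3 target endpoint that batches CDC changes into larger,
# less frequent files (attribute values here are illustrative).
dms.create_endpoint(
    EndpointIdentifier="s3-cdc-target",  # placeholder name
    EndpointType="target",
    EngineName="s3",
    S3Settings={
        "BucketName": "my-dms-target-bucket",  # placeholder bucket
        "ServiceAccessRoleArn": "arn:aws:iam::123456789012:role/dms-s3-access",  # placeholder role
    },
    # Wait up to an hour, or until ~64 MB of changes accumulate,
    # before writing a CDC file to S3.
    ExtraConnectionAttributes="cdcMaxBatchInterval=3600;cdcMinFileSize=64000",
)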
Hi Kahan, any idea where the binary file is located, and are there cases where changes to the source aren't captured by DMS? Where would be the best place to troubleshoot that?
How can I use this approach to store all of the update changes in one CSV file (the updated result), rather than storing just the updates in individual CSVs?
damn, it helps me a lot
too easy! thanks
Thanks for the video. Please, how did you create that RDS endpoint?
Good afternoon, I have a question about the DMS.
I have a source table with 50 columns and I want to generate a table with only 10 columns at the destination. Can I configure this within DMS?
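For reference, DMS table mappings support transformation rules that can drop columns during the task, so something along these lines is possible without touching the source. A minimal sketch, where the schema, table, and column names are placeholders and you would add one remove-column rule per column you want to exclude:

import json

# Placeholder table mapping: include one source table, then drop an
# unwanted column so it never reaches the target.
table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-source-table",
            "object-locator": {"schema-name": "public", "table-name": "orders"},
            "rule-action": "include",
        },
        {
            "rule-type": "transformation",
            "rule-id": "2",
            "rule-name": "drop-unwanted-column",
            "rule-target": "column",
            "rule-action": "remove-column",
            "object-locator": {
                "schema-name": "public",
                "table-name": "orders",
                "column-name": "internal_notes",
            },
        },
    ]
}

# The JSON string can be pasted into the DMS console's table mapping
# editor or passed as TableMappings when creating the replication task.
print(json.dumps(table_mappings, indent=2))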
Hi Kahan, amazing content. How would you feel about making a video on setting up DMS with Terraform? In my experience, companies tend to use Terraform code to organize their whole stack, which makes setting this up through the console really uncommon in real life. I've got some code examples if you want. This is content that isn't found on TH-cam, so I think it would be great for your channel as well.
My DMS source endpoint is failing with the error: cannot connect to ODBC provider, ODBC general error.
AWESOME Sir !!!
Thanks for your video. If my MySQL instance is on-premises, is there any solution to move the data to S3 like this?
Is there a free way to set this up for small databases? Even the cheapest option costs ~$150 a year ($0.018 an hour).
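For reference on that figure: $0.018/hour × 24 hours × 365 days ≈ $158/year for an always-on replication instance, which lines up with the ~$150/year estimate.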
What's your monitor's refresh rate?
I'm not sure (sorry!).
But I'm a huge Family Guy fan. Can't believe Peter Griffin is also an aspiring data engineer!
@@KahanDataSolutions lol, no worries.
How do you handle performance issues impacting your DB when you have a lot of transactions replicating to S3?
You could ramp up your database specs to give it more memory/compute power. Or perhaps switch from event replication via DMS to more of a batch loading process, where it can replicate data during times of less activity (e.g. overnight).
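A rough sketch of that batch-style alternative, assuming an existing full-load replication task (the ARN below is a placeholder); this could be kicked off overnight from a cron job or a scheduled Lambda:

import boto3

dms = boto3.client("dms")

# Placeholder ARN for an existing full-load replication task.
TASK_ARN = "arn:aws:dms:us-east-1:123456789012:task:EXAMPLE"

# "reload-target" re-runs the full load so the target is refreshed in
# one batch during a low-activity window, instead of streaming every
# change as it happens.
dms.start_replication_task(
    ReplicationTaskArn=TASK_ARN,
    StartReplicationTaskType="reload-target",
)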
Can you use DMS for business logic (for a specific use case) as well, or should it only be used for data migration?
Hi Richa - I'm not 100% sure if this is possible. However, I would still recommend keeping this focused only on data migration.
Adding logic at this step will add more complexity to your overall architecture.
You may end up having logic happening in various places, making it difficult for other developers/stakeholders to put the pieces together.
@@KahanDataSolutions Thanks for the quick reply. I have a use case where I need to capture data changes in a Postgres DB and push the messages to SQS. This can easily be achieved with Debezium and Kafka, but is there any alternative to do the same thing, preferably with AWS services? Like pushing data changes in Postgres to SQS?
@@richa6695 Perhaps you could use Lambda functions. The trigger could be the migration, and then the endpoint would be something in SQS.
I have not tried this personally, but that's my first thought.
Let me know if you end up implementing a solution!
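A minimal sketch of that idea, assuming the DMS task writes change files to S3 and an S3 object-created notification triggers the Lambda; the queue URL below is a placeholder:

import json
import boto3

sqs = boto3.client("sqs")

# Placeholder URL for the destination SQS queue.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/cdc-changes"

def handler(event, context):
    # An S3 "ObjectCreated" notification fires whenever DMS writes a new
    # change file to the target bucket; forward a pointer to that file so
    # downstream consumers can fetch and process it.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({"bucket": bucket, "key": key}),
        )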
Can you give a list of the services, policies, and roles that are used?
I want to implement this migration using an IAM user.
This video will walk you through the permissions that were used for this - th-cam.com/video/9n28d8ezrLQ/w-d-xo.html
Legend🔥🤜🤛
Can you help me with this?
Hi, unfortunately I won't be able to help you here. This sounds like an issue specific to your set-up. Try searching on Google or other forums for the same error. Good luck!