Informatica Cloud Data Integration: I am running into a scenario where I have to tweak the data types of 30-40 fields coming from Salesforce to Teradata. The issue is that Salesforce defines the string length as 1500 or even 3500 for certain fields. I am creating the target table at run time; how can I make the process cap the column size for the Teradata target table dynamically? Or is there a better way to update all the string source fields together, instead of manually tweaking them one at a time and setting the target Varchar data type to 255? Example: a Salesforce string field is defined as 3500, and I need to cap around 30-40 incoming fields at a maximum of 255 for the Teradata target table definition.
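One workaround, instead of letting the mapping create the target at run time, is to pre-create the Teradata table from the Salesforce field metadata with every string length capped. The sketch below is only an illustration of that idea, not an Informatica feature: the field names, the metadata dictionary, and the type mapping are all assumptions you would replace with your actual source metadata.

```python
# Hypothetical sketch: generate Teradata DDL from Salesforce field metadata,
# capping every string field at a fixed VARCHAR length. All field names and
# the source_fields dict below are illustrative assumptions.

MAX_VARCHAR = 255  # cap applied to every incoming string field

# Example Salesforce metadata: field name -> (type, declared length)
source_fields = {
    "AccountName":   ("string", 3500),
    "Description":   ("string", 1500),
    "AnnualRevenue": ("double", None),
}

def teradata_column(name, ftype, length):
    """Map one Salesforce field to a Teradata column definition,
    capping string lengths at MAX_VARCHAR."""
    if ftype == "string":
        return f"{name} VARCHAR({min(length, MAX_VARCHAR)})"
    if ftype == "double":
        return f"{name} FLOAT"
    # Fallback for types not covered in this sketch
    return f"{name} VARCHAR({MAX_VARCHAR})"

def build_ddl(table, fields):
    """Assemble a CREATE TABLE statement from the capped column list."""
    cols = ",\n  ".join(
        teradata_column(name, ftype, length)
        for name, (ftype, length) in fields.items()
    )
    return f"CREATE TABLE {table} (\n  {cols}\n);"

print(build_ddl("SF_ACCOUNT_STG", source_fields))
```

Running this once produces a DDL script you can execute on Teradata before the task runs; the mapping then loads into the existing table instead of creating it, so no per-field tweaking is needed in the designer. Note that capping to 255 truncates any source values longer than that, so you may also want a truncation-handling rule in the mapping.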
Nice explanation anna
Thank you and keep watching
This is already available on the web; please do something special.
Sir, can you please make one scenario for persistent cache and shared cache?
Will try
Please answer the question below.