Can you also load the CSV type below? There is a CSV file with comments in the initial 2-3 lines, then the header starts, and at the end there is one comment line stating the total number of records in the file. How can we crawl such a CSV file and load it into Snowflake?
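Glue crawlers and Snowflake's `COPY INTO` can skip a fixed number of header lines (e.g. Snowflake's `SKIP_HEADER` option), but neither handles a trailing record-count line, so a common workaround is to preprocess the file (for example in a Lambda triggered by the S3 upload) before crawling it. A minimal sketch, assuming comment lines start with `#`:

```python
def strip_comments_and_trailer(raw: str, comment_prefix: str = "#") -> str:
    """Drop leading comment lines and a trailing record-count line,
    leaving only the header row and the data rows."""
    lines = raw.splitlines()
    # Skip leading comment lines until the header row is reached.
    start = 0
    while start < len(lines) and lines[start].startswith(comment_prefix):
        start += 1
    end = len(lines)
    # Drop the final line if it is the trailer with the record count.
    if end > start and lines[end - 1].startswith(comment_prefix):
        end -= 1
    return "\n".join(lines[start:end]) + "\n"
```

The cleaned object can then be written back to a separate S3 prefix that the crawler or Snowflake stage points at.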
Superb, this helped me a lot. Thank you!
Can we do an update operation in a MySQL DB with Glue?
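Glue's JDBC sink appends or overwrites rather than updating rows in place, so a common pattern (an assumption here, not something shown in the video) is to have Glue land the data in a staging table and then run an upsert into the target. A sketch that builds the MySQL upsert statement (table names are illustrative):

```python
def build_mysql_upsert(target: str, staging: str, key_cols, update_cols) -> str:
    """Build an INSERT ... ON DUPLICATE KEY UPDATE statement that merges
    a staging table (loaded by Glue) into the target table. Assumes the
    key columns form a PRIMARY or UNIQUE key on the target."""
    cols = list(key_cols) + list(update_cols)
    col_list = ", ".join(cols)
    updates = ", ".join(f"{c} = VALUES({c})" for c in update_cols)
    return (
        f"INSERT INTO {target} ({col_list}) "
        f"SELECT {col_list} FROM {staging} "
        f"ON DUPLICATE KEY UPDATE {updates};"
    )
```

The statement could be executed from the Glue job itself via a JDBC connection, or from a small follow-up job.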
Can we also load incremental data automatically by scheduling a Glue job?
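Yes: Glue job bookmarks track which files have already been processed, and a scheduled trigger runs the job on a cron expression. A sketch of the `create_trigger` parameters (the trigger and job names are hypothetical):

```python
# Parameters for a scheduled Glue trigger (illustrative names).
# With --job-bookmark-option set to job-bookmark-enable, each run only
# processes S3 objects Glue has not seen before, i.e. an incremental load.
trigger_params = {
    "Name": "nightly-incremental-load",   # hypothetical trigger name
    "Type": "SCHEDULED",
    "Schedule": "cron(0 2 * * ? *)",      # every day at 02:00 UTC
    "Actions": [{
        "JobName": "s3-to-rds-job",       # hypothetical Glue job name
        "Arguments": {"--job-bookmark-option": "job-bookmark-enable"},
    }],
    "StartOnCreation": True,
}
# boto3.client("glue").create_trigger(**trigger_params)  # actual API call
```

The `create_trigger` call is commented out so the sketch stays self-contained; in a real setup it would need Glue permissions on the caller's IAM role.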
Why do you create an S3 endpoint to access RDS, which is in a private subnet? Shouldn't it be an RDS endpoint instead?
Hey, can I do it with SQL on EC2? I mean S3 to an EC2-hosted SQL instance using Glue.
Very informative video. Thank you very much for sharing :)
Amazing, thank you very much
thank you very much!!!
In our case, files are going to be loaded into S3 from an on-prem file system. S3 then has to pass a file-integrity check (MD5 checksum and row-count comparison between the on-prem FS and S3). The files that pass this integrity check have to move from S3 to Postgres. Can you please suggest a way to do this?
Hey, did you get any solution for this?
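One approach to the integrity-check question above (a sketch, not a tested pipeline): have a Lambda or Glue Python shell job compare the MD5 and row count recorded on-prem with values computed from the S3 object, and only trigger the Postgres load on a match. The comparison itself could look like this:

```python
import csv
import hashlib
import io

def integrity_check(data: bytes, expected_md5: str, expected_rows: int) -> bool:
    """Compare the MD5 checksum and data-row count of a CSV payload
    against the values recorded on the on-prem file system."""
    actual_md5 = hashlib.md5(data).hexdigest()
    reader = csv.reader(io.StringIO(data.decode("utf-8")))
    row_count = sum(1 for _ in reader) - 1  # exclude the header row
    return actual_md5 == expected_md5 and row_count == expected_rows
```

On success the file could be copied into Postgres, e.g. with `COPY` via `psql`, or with the `aws_s3` extension if the target is RDS/Aurora PostgreSQL.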
How can we load data from S3 to Amazon Aurora PostgreSql
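Aurora PostgreSQL supports importing directly from S3 through the `aws_s3` extension (`CREATE EXTENSION aws_s3;` plus an IAM role attached to the cluster). A sketch that builds the import statement; the table, bucket, and key names are placeholders:

```python
def build_aurora_s3_import(table: str, columns: str,
                           bucket: str, key: str, region: str) -> str:
    """Build the aws_s3.table_import_from_s3 call that Aurora PostgreSQL
    uses to COPY a CSV object from S3 into an existing table."""
    return (
        "SELECT aws_s3.table_import_from_s3("
        f"'{table}', '{columns}', '(format csv, header true)', "
        f"aws_commons.create_s3_uri('{bucket}', '{key}', '{region}'));"
    )
```

The resulting statement would be run against the Aurora cluster with any PostgreSQL client.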
Good question, please share the info if you are able to. Thanks!