What an enthusiastic explanation. Excellent, bro!!!
Thank you so much for your kind words!
The way you explain this concept is amazing.
Thanks for the content.
You are most welcome! Please subscribe to our channel; it motivates us a lot.
Crystal clear, you did a good job in explaining things.
Thank you so much! Please subscribe to our channel for regular updates; it motivates us a lot.
You are a great teacher, explaining English-language pronunciation in a beautiful way (not like others, whose explanations only confuse) and presenting it systematically and simply, which helps me learn and test myself at the same time. I hope to undertake something in this field. I love your teaching methods; they help someone from a non-technical background like me. MUHAMMAD from North America.
Thank you! 😃
The way you explain, wow!!!!!
Thank you so much! Please subscribe to our channel for regular updates.
Good one. It gave me a clear view of data loading from AWS. Thank you. Please make more videos about Snowflake DB.
Thank you, Nisha! Sure, we will come up with more videos as soon as possible.
Thanks for this video. You explained it in a very nice manner, and it's working.
Thank you for your kind words, they mean a lot to me!
Nice explanation. Keep making videos on Snowflake.
Thank you, I will. Please subscribe to our channel for regular updates.
Two thumbs up!
Thank you so much! 🙌 I'm glad you enjoyed the video. Your support means a lot! Don’t forget to subscribe for more content like this. 😊👍
It helped a lot. Can you please make a video on continuous data loading in Snowflake?
Sure, sir. Please subscribe to our channel for regular updates.
How do I upload multiple CSV files from a single S3 bucket?
How can I make it a continuous load without manual intervention?
Using Snowpipe.
You have to use Snowpipe.
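To sketch what that looks like: Snowpipe with AUTO_INGEST loads new files automatically as they land in S3. The database, schema, pipe, table, and stage names below are hypothetical, and the stage is assumed to already point at your bucket:

```sql
-- Assumes my_stage already points at the S3 bucket and my_table matches the CSV layout.
CREATE OR REPLACE PIPE my_db.my_schema.my_pipe
  AUTO_INGEST = TRUE  -- triggered by S3 event notifications
AS
COPY INTO my_db.my_schema.my_table
FROM @my_db.my_schema.my_stage
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- Then configure the bucket's event notifications to send to the
-- SQS queue ARN shown in the notification_channel column of:
SHOW PIPES;
```

Once the bucket notifications are wired up, every new file in the stage location is ingested without manual intervention.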
Thank you
Awesome explanation 🎉🎉❤❤
Thank you so much! I'm glad you found the explanation helpful.
Thank you for this video
Thank you so much for watching, Rohit! 😊 I'm glad you found the video helpful. If you enjoyed it, feel free to subscribe for more content like this!
Best,
Team KSR
Good explanation
Thank you so much for your comment! Please like, share, and subscribe to our channel for regular updates; it motivates us a lot.
Clear explanation. Bro, can you please tell us what pre-validations we do while loading from AWS into a Snowflake table?
Thank you! Please subscribe to our channel for regular updates. All the best.
Thanks for making this video; it provides good knowledge. Please make more videos.
You're welcome! Please subscribe to our channel for regular updates.
I heard that Snowflake is now available on AWS as a service. In that case, are all these stages and public-access blocking still required? Please let me know.
Though Snowflake is available there, it has some limitations; AWS and Snowflake are still updating the integration.
Nice explanation
Thanks for checking out the video and leaving a comment! Please subscribe to our channel for regular updates; it motivates us a lot. Thanks again.
Thank you, sir. Clear explanation.
You are welcome
Amazing✌
Thanks ✌️
Thanks for the video; it really helped.
Glad it helped, thank you! Please subscribe to our channel for regular updates; it motivates us.
Good Explanation
Thanks for liking! Please subscribe to our channel for regular updates.
Do you have any training?
Great, but it needs a little more practice...
I will try my best. Thank you for your suggestion.
Thank you so much for this video
Most welcome 😊. Please subscribe to our channel for regular updates.
How do I use one stage for several schemas under one database? Please answer my question, and provide a query for this if possible.
You can use one stage for several schemas by creating the stage once in one schema. Then, reference it from any other schema by fully qualifying its name, like @my_database.my_schema.my_stage. Just ensure the correct privileges are granted for access. Let me know if you need more details!
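As a sketch of that pattern, with hypothetical database, schema, stage, role, and bucket names:

```sql
-- Create the stage once, in a shared schema (names are illustrative).
CREATE STAGE my_database.common.my_stage
  URL = 's3://my-bucket/data/';

-- Grant usage so roles working in other schemas can reach it.
GRANT USAGE ON STAGE my_database.common.my_stage TO ROLE analyst_role;

-- Reference it from any schema via the fully qualified name.
COPY INTO my_database.sales.orders
FROM @my_database.common.my_stage
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```

The key point is that a stage lives in one schema, but any object in the database can reference it as long as the name is fully qualified and the privileges are in place.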
awesome
Thank you
If i have a file size of 5GB , how long it will take to load data into snowflake from S3
It's difficult to predict the exact time it will take to load a 5GB file from S3 to Snowflake due to several factors:
Network bandwidth: The speed of your internet connection will significantly impact the transfer time. Higher bandwidth allows for faster data transfer.
Warehouse size: The size of the virtual warehouse that runs the COPY INTO command affects processing speed. A larger warehouse consumes more credits but provides more parallelism and faster loading.
File format: Compressed formats like Parquet or Avro can load faster than uncompressed formats like CSV due to less data to transfer.
Stage type and layout: The type of stage you load from (internal vs. external, e.g., S3) and how the files are organized within it can impact performance.
However, here are some resources that can help you estimate the loading time:
Snowflake documentation: The Snowflake documentation provides information on COPY INTO: docs.snowflake.com/en/sql-reference/sql/copy-into-location.html and best practices for loading data.
Snowflake sizing calculator: This tool (available through your Snowflake account) can help you estimate the resources needed for your workload, potentially impacting load times.
Testing: The best way to determine the actual load time for your specific scenario is to perform a test load. You can upload a smaller sample file (e.g., 100MB) from S3 to Snowflake and measure the time it takes. This will give you a more accurate estimate for the 5GB file based on your environment.
Here are some additional tips for potentially improving load times:
Use efficient file formats: Consider using compressed formats like Parquet or Avro for faster transfer and processing.
Optimize your stage: Ensure your Snowflake stage is configured with appropriate compute resources and storage type.
Parallelize the load: Snowflake loads multiple files in parallel, so split the large file into smaller chunks (Snowflake's guidance is roughly 100-250 MB compressed per file) before staging it; COPY INTO can then distribute the pieces across the warehouse's threads, potentially improving speed.
By considering these factors and potentially running a test load, you can get a better idea of how long it will take to load your 5GB file from S3 to Snowflake.
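To illustrate the parallel-loading tip: assuming the 5GB file has been split into gzip-compressed part files in the bucket (all names here are hypothetical), a single COPY INTO with a pattern lets Snowflake load the pieces in parallel:

```sql
-- Assumes part files like orders_000.csv.gz ... orders_049.csv.gz in the stage.
COPY INTO my_db.my_schema.orders
FROM @my_db.my_schema.my_s3_stage
PATTERN = '.*orders_.*[.]csv[.]gz'
FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP' SKIP_HEADER = 1);
```

Each matching file becomes an independent unit of work, which is why many medium-sized files generally load faster than one large file.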
I practiced with just my S3 credentials, and it's giving me an error, sir. It told me:
Unexpected 'STORAGE_AWS_ROLE_ARN'. (line 6)
Yes
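A likely cause of that error: STORAGE_AWS_ROLE_ARN is a parameter of CREATE STORAGE INTEGRATION, not CREATE STAGE, so Snowflake rejects it inside a CREATE STAGE statement. A sketch of the usual pattern (all names, ARNs, and URLs below are placeholders):

```sql
-- STORAGE_AWS_ROLE_ARN belongs on the storage integration...
CREATE STORAGE INTEGRATION my_s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my_snowflake_role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/data/');

-- ...and the stage then references the integration instead.
CREATE STAGE my_db.my_schema.my_stage
  STORAGE_INTEGRATION = my_s3_int
  URL = 's3://my-bucket/data/';
```

If you only have raw S3 access keys rather than an IAM role, the stage would instead use CREDENTIALS with the key pair and no STORAGE_AWS_ROLE_ARN at all.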
Can I know the instructor's name, please?
Nizamuddin
Thank You
Hi Nizamuddin, I think it's you.
Thank you