Wonderful explanation. You are helping so many people without them having to spend money on course training. Great efforts, beyond measure.
Excellent
Thank you! Cheers!
This is my favourite topic, so easily explained, and that too in 30 minutes with all the demos. Excellent... Thanks!
Thank you 🙏 for watching my video and your word of appreciation really means a lot to me.
I just finished going through Chapter 1 to Chapter 14. The Time Travel one was so cool, and this chapter was interesting with all the hands-on tutorials. Thank you so much!
Glad you enjoyed it.
Just finished chapters 1-14...GREAT TUTORIALS! THANK YOU!!!!
😀
Glad you like them!
Great tutorial :) Would you be able to share the SQL scripts used? I'm not able to see them in the link you provided. Thanks :)
I have seen all 14 of your videos. Now I am waiting for the rest of the videos you mentioned in the course curriculum. Thanks again for your wonderful video content.
Ch-15 is out (th-cam.com/video/cAyM-Nj9WOc/w-d-xo.html), hope you like it and learn from it.
Hi,
Thanks for your excellent videos.
Can you please share the SQL scripts used for the time travel tutorial?
Thank you for the detailed tutorial. I really appreciate it. It looks like the SQL scripts link is not working. It is giving an error: "You don't have permission to access the resource."
Excellent hard work I can see from this team in making us experts in Snowflake.
Now I would like to get all the SQL scripts used in all the videos for practice.
I would be thankful if I could get all the scripts, including all the Python scripts, for practice and complete understanding.
Thanks in advance
Ramu M
Yes, sure. Let me see how I can make all of them available.
Nice content and your efforts are superb. Eagerly waiting for the 15th part; when can we expect it?
Will upload soon
Nicely explained. Thank you
Glad it was helpful!
And yes, if you want to manage Snowflake more programmatically, you can watch my paid content.. many folks don't know the power of Snowpark... these 2 courses will help you broaden your knowledge..
They are available at a discounted price for a limited time (one for JSON and one for CSV).. they can automatically create DDL and DML and also run the COPY command...
1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=SPECIAL50
2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=SPECIAL35
The sessions are great and worth spending time on. Can you please share the link for the sample tests?
Here is the entire playlist; look for Ch-14 in it (th-cam.com/play/PLba2xJ7yxHB5X2CMe7qZZu-V4LxNE1HbF.html)
Thanks a lot for your videos :)
thanks a lot .. good to know that these videos are adding value...
I have completed snowpro core certification. These videos are great and helped a lot. I am planning to do Snowflake Architecture Certification. Are there separate videos for that?
Great to hear!.. I have not yet published anything for the architect level.. but will do it soon..
Another excellent video. Thank you so much.
Please make a video on the advanced Snowflake certifications if possible. I'm planning to go for another certification after SnowPro, but there is very little information on the internet.
You are welcome...
It is on my list, but it takes time, as I want to make sure I cover as much as possible for my audience.
Let me share my quick experience from my advanced certification.
1. Questions are very descriptive and it takes time to read them.
2. You will have a time crunch, as 2 of the options will look exactly the same.
3. Lots of scenario-based questions from every topic.
4. 3-4 questions will come from the basics, so make sure you read the basics too.
5. Lots of questions on data loading, micro-partitions, stored procedures (especially JavaScript), transactions, and performance; most of them scenario-based.
6. You will not get time to go back and review, so don't jump around; go in sequence.
7. Read the documentation and the new features too.. like masking, API calls, etc..
4-6 weeks of preparation would be enough.. and since you have the SnowPro certification, you can appear for the Advanced Data Engineer or Architect exam.
I will surely make a detailed video, but you will have to wait..
@@DataEngineering Thank you so much for your inputs. Your videos helped a lot in clearing SnowPro.
Is time travel achieved through immutable micro-partitions?
great content, thank you!
Welcome!
Excellent explanation. Really enjoyed it a lot.
Glad it was helpful!
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
I have already published other knowledge series and snowflake certification videos and if you are interested, you can refer them.
🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI
🌐 SnowPro Guide ➥ bit.ly/35S7Rcb
🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq
🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E
⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
All your videos are great. Where can I find the scripts used in your videos?
I am working on it.. will update you soon.
Hi, I have a question regarding "data_retention_time_in_days". Is this attribute inheritable? If a database is created with this attribute set to, say, 10 days, would any schema or table created within the database without this attribute defined get the value 10, or the default value of 1?
Yes, it is inherited. A schema or table created inside that database without its own setting will get 10 days, not the default of 1.
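A minimal sketch of that inheritance, for anyone who wants to try it (the database, schema, and table names here are hypothetical):

-- Retention set at the database level becomes the default for everything inside it.
CREATE DATABASE demo_db DATA_RETENTION_TIME_IN_DAYS = 10;
CREATE SCHEMA demo_db.demo_schema;            -- inherits 10, not the account default of 1
CREATE TABLE demo_db.demo_schema.t1 (id INT); -- inherits 10 as well
-- Verify the effective value; the "level" column shows where it was set:
SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS' IN TABLE demo_db.demo_schema.t1;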
Thanks for the nice content. Would love to see more!
I have a question: I set time travel to 4 days at the schema level, but for a table inside it I changed it to 10 days. If the schema gets dropped and stays dropped beyond its time travel period, can I still get the data for the table that has the 10-day time travel period?
Thanks !
Hi Govardhan Yadav, this was a good question, and it helped me clear up the concept, as I was also confused. I would like to share my learnings with you.
Say you have defined a schema with a time travel retention period of 2 days. This means any tables in this schema will, by default, keep historical data for 2 days.
However, for a specific table in this schema, you have overridden the retention period and set it to 10 days instead.
In this case, the table-level setting takes precedence. So even though the schema retention is 2 days, that table will keep historical data for 10 days.
The schema-level retention period acts as a default that can be overridden at the table level, but any other tables in the schema that don't override it will still follow the 2-day retention.
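A minimal sketch of that behaviour (the schema and table names are hypothetical):

CREATE SCHEMA demo_db.sales DATA_RETENTION_TIME_IN_DAYS = 2;
-- This table overrides the schema default, so the table-level value wins:
CREATE TABLE demo_db.sales.orders (id INT) DATA_RETENTION_TIME_IN_DAYS = 10;
-- This table sets nothing, so it follows the schema default of 2 days:
CREATE TABLE demo_db.sales.customers (id INT);
SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS' IN TABLE demo_db.sales.orders;    -- 10
SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS' IN TABLE demo_db.sales.customers; -- 2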
Hi, it would be great if you could mention which of your quiz videos we should go through after finishing each one in this series.
From Ch-11 onwards they map one-to-one...
Hi, thanks for making such detailed videos; they are really helpful in understanding Snowflake. I have a question here: I created a table on day 1, and from day 1 onwards it receives updates. I can recover the old data up to 90 days back using time travel. What about updates that happened more than 90 days ago (beyond the fail-safe stage)? How can I recover the data after fail-safe? And why is it only 90 days? Please explain. Thanks in advance.
Thank you 🙏 for watching my video and your word of appreciation really means a lot to me.
Time travel costs a lot for a churning table (if enabled for 90 days), and if it were kept forever, it would cost a fortune on your Snowflake account. If you need to store historical data for a longer period, time travel is probably not the right feature.
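A minimal sketch of both points, for anyone following along (the archive table is hypothetical, not something from the video; '<query_id>' is a placeholder for a real query ID):

-- Within the retention window, historical versions are queryable:
SELECT * FROM order_details AT(OFFSET => -60*60*24);           -- state 24 hours ago
SELECT * FROM order_details BEFORE(STATEMENT => '<query_id>'); -- state before a given statement
-- Beyond retention (and fail-safe), the history is simply gone; for long-term
-- history you would snapshot periodically into your own archive table:
INSERT INTO order_details_archive
SELECT *, CURRENT_TIMESTAMP() AS snapshot_ts FROM order_details;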
Thank you for taking the time to respond. But what happens to the old micro-partitions of a table for which updates have been received? And to those micro-partitions for which the retention period and fail-safe are completed? Thank you.
Will fail-safe kick in after 1 day on a standard Snowflake account, since the standard edition has only 1 day of time travel?
Fail-safe is a table-level feature, not an edition-level feature... so yes, it applies there too...
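If you want to see fail-safe on your own account, a quick sketch (the database and schema names are placeholders):

-- Permanent tables get 7 days of fail-safe in every edition; you can watch the
-- storage move between the active, time travel, and fail-safe buckets per table:
SELECT table_name, active_bytes, time_travel_bytes, failsafe_bytes
FROM   demo_db.information_schema.table_storage_metrics
WHERE  table_schema = 'DEMO_SCHEMA';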
Hi bro. Please help us by providing the SQL scripts. The link is not working.
We reduced the retention period from 3 days to 2, so the changes made in the past (the first day's changes) were lost. Isn't it possible to bring those changes back using fail-safe?
Yes, it can be done, but it is too much effort unless the data is very critical to you and your team... if it is a permanent table and you have lost data due to the time-travel configuration, you can contact the Snowflake team and they will recover it.... I assume you are not using the free trial edition and your company has a contract with Snowflake.
What happens to time travel if a table is renamed? And if I alter the table and add a new column? Or create or replace the table?
Good question.. but I would suggest trying it out; it is easy to test and share the behaviour.. a sketch of how you might test it is below..
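A rough sketch of the experiment (the table names are made up; treat the comments as expected behaviour to verify, not a guarantee):

CREATE TABLE tt_demo (id INT) DATA_RETENTION_TIME_IN_DAYS = 1;
INSERT INTO tt_demo VALUES (1);
ALTER TABLE tt_demo RENAME TO tt_demo_renamed;
-- after some time has passed, history should follow the renamed table:
SELECT * FROM tt_demo_renamed AT(OFFSET => -60);
-- CREATE OR REPLACE drops the old table and creates a brand-new one;
-- the dropped version should be recoverable if you first rename the new table:
CREATE OR REPLACE TABLE tt_demo_renamed (id INT, note STRING);
ALTER TABLE tt_demo_renamed RENAME TO tt_demo_v2;
UNDROP TABLE tt_demo_renamed;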
I would be more than happy to try it out. But I use an individual personal account, and unfortunately I have consumed all the available credits. The billing has emptied my pockets 😅
I've a question: when I create a table, by default it is created as a transient table. Is there any way to change it to a permanent table?
The default table type is permanent; you might be making a mistake somewhere.
Could you share the DDL script? That will help me understand the problem.
I generally use transient tables in my SQL, as they cost less compared to permanent tables.
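A quick way to check what kind of table was actually created (using the schema and table names from this thread; the two CREATE statements are just illustrations):

SHOW TABLES LIKE 'ORDER_DETAILS' IN SCHEMA POPWAREHOUSE;
-- the "kind" column says TABLE for permanent and TRANSIENT for transient;
-- permanent is the default unless you add the TRANSIENT keyword:
CREATE OR REPLACE TABLE my_permanent_table (id INT);           -- permanent
CREATE OR REPLACE TRANSIENT TABLE my_transient_table (id INT); -- transient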
@@DataEngineering Here's the same table script that you used in this tutorial:
CREATE OR REPLACE TABLE POPWAREHOUSE.ORDER_DETAILS (
O_ORDERKEY NUMBER(38),
O_CUSTKEY NUMBER(38,0),
O_ORDERSTATUS VARCHAR(1),
O_TOTALPRICE NUMBER(12,2),
O_ORDERDATE DATE,
O_ORDERPRIORITY VARCHAR(15),
O_CLERK VARCHAR(15),
O_SHIPPRIORITY NUMBER(38,0),
O_COMMENT VARCHAR(79)
)
data_retention_time_in_days = 50;
Actually, I'm using a free trial account (Enterprise Edition). Could that be the reason?
Do I need to capture the statement ID every time I update or delete rows?
Guys, can anybody share the scripts? The site given by the author says not everyone has access.
Hi.. I have one question on time travel... Assume we have created a table with time travel of 90 days, and no transactions happened in between. If I drop the table on the 92nd day, will I be able to undrop it?
Yes, you will be able to undrop it. The 90 days is a moving window, not fixed from the day the table is created... so if you created the table on 01-Jan and dropped it on 02-Apr, UNDROP is still available for 90 days after the drop. For point-in-time recovery, the window reaches back 90 days from the drop, i.e. to 03-Jan (in a leap year): any changes made on 01-Jan and 02-Jan are lost, but every state from 03-Jan onwards can be recovered.
I hope this is clear.
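A minimal sketch of that recovery (hypothetical table name):

CREATE TABLE orders_demo (id INT) DATA_RETENTION_TIME_IN_DAYS = 90;
-- ... the table lives for 92 days, then:
DROP TABLE orders_demo;
-- UNDROP works within 90 days of the drop, regardless of the table's age:
UNDROP TABLE orders_demo;
-- but point-in-time reads only reach back 90 days from now:
SELECT * FROM orders_demo AT(TIMESTAMP => DATEADD(day, -89, CURRENT_TIMESTAMP()));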
@@DataEngineering Thank you so much... your videos are awesome. Thank you for your efforts to help others learn..
@@raghumajji2326 thanks..
Unable to open the SQL scripts.
How does Matillion work?
It is a big topic; do you have any specific requirement or learning expectation? Matillion has its own learning university.