The easiest-to-understand video about Snowpark. Thank you
Glad it was helpful!
I thought of asking you about the snowpark, But you started, Good to see sir
Yes, thanks
This is the best video so far describing what Snowpark is, thank you for sharing 🙏🏼
Glad it was helpful!
Thanks! - 13 Feb 2024 - 08:41 PM IST
Perfect compilation of information. Thank you.
Thanks for your note..
Great effort in educating people with your videos. I can't appreciate your effort enough in words. I have been watching all your playlists and I must say I'm hooked on your channel. Thanks again for these wonderful knowledge-sharing videos.
So nice of you
Very informative video, with information that isn't easily available on the internet. Loved the first episode; eagerly waiting for the following episodes.
Thanks a ton
Very nice video playlists, Sir. I am learning every day from your videos. It would be very helpful if you also uploaded videos on data structures and algorithms using Python.
Thanks for your note.. for now my focus is on Snowflake..
Great content again. Thanks a lot. Eagerly waiting for all 15 episodes. I have watched the Snowflake videos many times, learnt a lot, and am still watching.
Can we expect one Snowpark video every week? 🙂
Glad you liked all my videos... will try to publish one video every week...
What is the USP of Snowpark? Why should I use it if I am well versed in SQL? And if I am not, and am rather a PySpark developer, then I will anyway not be able to use the full power of Snowflake. Please advise.
How do you extract the .gz file from the stage to a Snowflake folder? And how are you maintaining the .csv files in a separate folder in Snowflake? Thanks.
Great video.. But a couple of questions:
1. When the data is already present in Snowflake, the easiest way to transform it is using SQL. Why do we need Snowpark here?
2. Can we use Snowpark as an ETL tool like Talend or IICS (taking data from other RDBMS tables or file systems), or is this just for ELT?
There are a couple of reasons:
1. If a team is migrating their work from PySpark or Spark to Snowpark, they can reuse the same code without writing fresh SQL statements and save tons of effort.
2. DevOps-based ETL development in a strict environment is easier with a Python-based tool than with SQL.
3. If your team is not good with SQL, then programming is the better approach (this may surprise you, but there are many who are very good at programming but poor at SQL).
4. Programming gives a lot of code reusability and library integration, and is much easier to work with than large SQL statements.
Follow my channel; I am going to publish a video on end-to-end ETL using Snowpark for data engineers.
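A minimal sketch of point 1: because Snowpark's DataFrame API mirrors PySpark's (`filter`, `group_by`, `select` take the same shape), pipeline logic can often move over with little rewriting. The table name `ORDERS`, the column `AMOUNT`, and the function below are illustrative assumptions, not from the video.

```python
# Illustrative sketch (assumed names): the same filter logic works on either a
# PySpark or a Snowpark DataFrame, since both expose .filter() taking a column
# expression built with df["COL"].

def high_value_orders(df, threshold=100):
    """Keep only rows whose AMOUNT exceeds the threshold.

    Runs unchanged on a PySpark or Snowpark DataFrame.
    """
    return df.filter(df["AMOUNT"] > threshold)

# With Snowpark (assuming an open session):
#   df = session.table("ORDERS")
#   high_value_orders(df).show()
```

The point is that only the session setup differs between engines; the transformation code itself is shared.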
@Data Engineering Simplified Thank you. What do you suggest? How important is dbt for working on Snowflake and creating data models?
@@subhojyotiganguly5013 dbt is getting very popular and yes, you can certainly use dbt. It is one of the tools of the future, becoming very popular alongside Snowflake.
Great content. Thanks.
Glad you liked it!
Where is your Snowflake installed? Or can you explain how I can integrate Snowflake and Snowpark on my local Windows system?
If you are using Python, then you need Python 3.8.x (not a higher version, else it will not work). Then use pip to install the snowflake-snowpark-python library and it will automatically be available for use. Once that is done, you can use the Session API to connect (I am releasing a video on this soon).
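To make the reply concrete, a hedged sketch of the setup. The package name is the PyPI distribution `snowflake-snowpark-python`; every value in the dictionary below is a placeholder you would replace with your own account details, and the role/warehouse/database names are assumptions.

```python
# Install first (Python 3.8.x at the time of this reply):
#   pip install "snowflake-snowpark-python"

# Placeholder connection parameters; replace with your own account details.
connection_parameters = {
    "account":   "<your_account_identifier>",
    "user":      "<your_user>",
    "password":  "<your_password>",
    "role":      "SYSADMIN",      # assumed role; adjust to yours
    "warehouse": "COMPUTE_WH",    # assumed warehouse name
    "database":  "DEMO_DB",
    "schema":    "PUBLIC",
}

# With the package installed, the Session API opens the connection:
#   from snowflake.snowpark import Session
#   session = Session.builder.configs(connection_parameters).create()
#   print(session.sql("SELECT CURRENT_VERSION()").collect())

# Sanity check: the minimum keys needed to authenticate are present.
assert {"account", "user", "password"}.issubset(connection_parameters)
```

The `Session.builder.configs(...).create()` call is the standard Snowpark entry point; everything else here is account-specific configuration.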
I am almost done with the Snowflake video series. I have started giving interviews, but as far as I understand there are very few calls for Snowflake right now. Can we expect a good number of interview calls at least from mid-May and June? Could you please share your thoughts on this? Thanks in advance.
Snowflake is a cloud data warehouse platform.. and you must have data warehouse or data engineering knowledge alongside Snowflake.. else it will be hard to get a call... it also depends on your current years of experience...
Very great knowledge.. but even being in the UK, there are calls for Snowflake.. you should know other tools too to crack the jobs..
If I'm a SQL developer familiar with writing SQL, does that mean I can just write my code in SQL without needing to use Snowpark?
Thank you sir!
Welcome
Hi
How much of the syllabus does the Snowflake Advanced Data Engineer certification cover?
A lot, including scenario-based questions..
@@DataEngineering Could you please share the playlist link here for the Snowflake advanced certification? I am not able to find it on your channel.
Are you the same tutor who had udemy course on aws redshift and related
No.. here are my Udemy contents.
These courses are available on Udemy.. (one for JSON and one for CSV).. they can automatically create DDL and DML and also run the COPY command...
1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/
2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/
where to find this code ?
Please check this medium blog for code.. medium.com/@data-engineering-simplified/588b19591b47
Why do you write Snowpark code that again converts to SQL? That would be SLOW.
Why not directly use Snowflake Scripting with an external table? That would be FASTER.
If you have existing Spark or Python code that interacts with a database, and now your DB or DWH is Snowflake, you don't need to migrate all your codebase to SQL. There are other use cases where doing it programmatically is much easier than writing SQL, like processing an Excel file and then putting it into Snowflake; such use cases cannot be done using SQL alone, and that is where Snowpark shines.
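On the "converts to SQL, so it's slow" worry: Snowpark DataFrames are lazy, and a chain of calls is compiled into a single SQL statement that runs inside Snowflake's own engine (pushdown), so the conversion is not a per-row slowdown. A conceptual sketch, with assumed table and column names:

```python
# Conceptual sketch: a Snowpark DataFrame chain like
#   session.table("ORDERS").filter(col("AMOUNT") > 100)
# is compiled into one pushed-down SQL statement, roughly:
table, predicate = "ORDERS", "AMOUNT > 100"
generated_sql = f"SELECT * FROM {table} WHERE {predicate}"
print(generated_sql)

# With a real session you can inspect what Snowpark actually sends to the
# server via the DataFrame's `queries` property (per the Snowpark docs).
```

So the work still happens in Snowflake's SQL engine either way; Snowpark mainly changes how you author the pipeline.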