This was so good, thank you!
This was very informative!
Glad it was helpful!
Great Content
Thank you!
Amazing!!
If the raw files are in CSV format, is there any way to infer the schema without explicitly defining it with a StructType?
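For anyone with the same question, here is a minimal sketch of one approach. It assumes a recent Snowpark Python version where the INFER_SCHEMA and PARSE_HEADER options are supported for CSV reads; the connection values, stage, and file name are all placeholders:

    from snowflake.snowpark import Session

    # Placeholder credentials -- swap in your own account details.
    session = Session.builder.configs({
        "account": "<your_account>",
        "user": "<your_user>",
        "password": "<your_password>",
    }).create()

    # Ask Snowflake to infer the schema instead of passing a StructType.
    df = (
        session.read
        .option("INFER_SCHEMA", True)   # infer column types from the data
        .option("PARSE_HEADER", True)   # treat the first row as column names
        .csv("@my_stage/raw_data.csv")  # hypothetical stage path
    )
    df.show()

If your Snowpark version predates CSV schema inference, running the SQL INFER_SCHEMA table function against the staged files is another route.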
I think this video is helpful, but I'd appreciate it if you could share other videos or online documentation with a more detailed explanation. I'm asking about the architecture section that starts at 07:19.
Is there documentation on the TOML file?
Hi @treyhannam3806, check this out: docs.snowflake.com/en/developer-guide/python-connector/python-connector-connect#connecting-using-the-connections-toml-file
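For quick reference, this is roughly what that file looks like per the linked doc (all values below are placeholders); by default it lives at ~/.snowflake/connections.toml:

    [myconnection]
    account = "myorganization-myaccount"
    user = "jdoe"
    password = "<your_password>"
    warehouse = "my_wh"
    database = "my_db"
    schema = "my_schema"

The Python connector can then pick up the named connection without any credentials in your script:

    import snowflake.connector

    # Reads the "myconnection" entry from connections.toml
    conn = snowflake.connector.connect(connection_name="myconnection")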
Do we need an external cloud storage account to implement this project?
No, not at all! Try out the lab today and let us know what you think!
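In case it helps anyone: one way to load local raw files without any external S3/Azure/GCS account is a Snowflake internal (Snowflake-managed) stage. A rough sketch, where the connection values, stage, and file path are placeholders:

    from snowflake.snowpark import Session

    # Placeholder credentials -- replace with your own.
    session = Session.builder.configs({
        "account": "<your_account>",
        "user": "<your_user>",
        "password": "<your_password>",
    }).create()

    # Create an internal stage and upload a local file to it.
    session.sql("CREATE STAGE IF NOT EXISTS raw_stage").collect()
    session.file.put("data/raw.csv", "@raw_stage",
                     auto_compress=False, overwrite=True)

From there the staged file can be read or COPY'd into a table entirely inside Snowflake.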
I would appreciate a video on how to set up the environment, since I get stuck trying to run the Python script for loading the raw data. Edit: I found this video very useful: th-cam.com/video/68I-oxeJ4HE/w-d-xo.html&ab_channel=NeuralNine
While not quite a video, you may find Steps 1, 2, and 3 of the associated Quickstart helpful: quickstarts.snowflake.com/guide/data_engineering_pipelines_with_snowpark_python/#0
Just a follow-up to this: the associated GitHub repo for this demo now uses GitHub Codespaces to automatically provision and set up the environment for you. Check out the updated instructions in Step 2 here: quickstarts.snowflake.com/guide/data_engineering_pipelines_with_snowpark_python/#1