Is there a benefit to using Airflow instead of Snowpipe for this purpose?
Imagine needing to consume and migrate not just a single table, but over 100. With Snowpipe, you'd have to create 100 pipes for inserting the data. With Airflow, it's easier to customize and scale this process.
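To make that concrete, here is a minimal sketch (table names and the loading logic are hypothetical, not from the video) of generating one load task per table from a single loop:

from datetime import datetime

from airflow.decorators import dag, task

# Hypothetical table list; imagine 100+ entries here
TABLES = ["orders", "customers", "payments"]

@dag(start_date=datetime(2024, 1, 1), schedule="@daily", catchup=False)
def load_all_tables():
    for table_name in TABLES:
        @task(task_id=f"load_{table_name}")
        def load(table: str):
            # Hypothetical loader: copy this table's file from S3 into Snowflake
            print(f"loading {table}")
        load(table_name)

load_all_tables()

Adding table 101 is then a one-line change, versus creating and maintaining another pipe.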
For those who don't see the host anymore, in the account field, make sure you add:
youraccountnumber.yourregion.yourcloud
For example: nb71231.eu-west-3.aws
Basically, take everything in your account URL before snowflakecomputing.com
and leave the region field empty
Enjoy
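Spelled out with the example above:

URL:     https://nb71231.eu-west-3.aws.snowflakecomputing.com
Account: nb71231.eu-west-3.aws
Region:  (leave empty)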
Still works 😄. Really cool pipeline
Good to know 🥹
I was struggling with the Airflow installation, so I purchased your Udemy course. Hoping I will get some better support.
Keep me posted ;)
@@MarcLamberti It did not really help. I asked Udemy for a refund. I had issues with the installation your way: my CPU was maxing out, and nothing really worked even after I managed to install Airflow the recommended way. It was a bad first impression of the course, so I had to ask for a refund. Sorry.
@@datalearningsihan You don't have to be sorry. I believe your issue is more related to Docker than to Airflow or the course. Check that you have enough memory. Otherwise, you can still install Airflow manually with pip install
Thanks Marc! Great Tutorial!
You’re welcome 🫶
Thanks man, very much appreciated.
You’re welcome
How do we manage connection credentials outside the UI? I mean, deploy them as code with a reference to a secrets manager.
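One way to do that (a minimal sketch, assuming the Amazon provider is installed and your secrets live in AWS Secrets Manager; the prefix is hypothetical) is to point Airflow's secrets backend at the manager through configuration, so connections are resolved by name at runtime instead of being stored via the UI:

AIRFLOW__SECRETS__BACKEND=airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend
AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_prefix": "airflow/connections"}

With that in place, a conn_id like snowflake_default is looked up as the secret airflow/connections/snowflake_default.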
The easiest way to build a dev env for an Airflow data pipeline. Great!!
What is the best way to pass a CSV between tasks?
For example: one function parses a JSON to CSV,
and a second function takes the CSV to an S3 bucket.
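One common pattern (a minimal sketch, not from the video; the path, bucket, and connection id are hypothetical) is to write the CSV to storage both tasks can reach and pass only its path through XCom, since XCom is meant for small values, then upload it with the Amazon provider's S3Hook:

import csv
import json
from datetime import datetime

from airflow.decorators import dag, task
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def json_to_s3():
    @task
    def parse_json_to_csv() -> str:
        path = "/tmp/data.csv"  # hypothetical path; works when both tasks run on the same worker
        records = json.loads('[{"id": 1, "name": "a"}]')  # stand-in for the real JSON input
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["id", "name"])
            writer.writeheader()
            writer.writerows(records)
        return path  # only the small path string goes through XCom, not the file itself

    @task
    def upload_to_s3(path: str):
        S3Hook(aws_conn_id="aws_default").load_file(
            filename=path, key="data.csv", bucket_name="my-bucket", replace=True
        )

    upload_to_s3(parse_json_to_csv())

json_to_s3()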
Thanks Marc, I am facing this error when connecting to Snowflake from Airflow; Airflow is running in Docker Compose (the file you provided in the Udemy course): ERROR - 250001: 250001: Could not connect to Snowflake backend after 2 attempt(s).Aborting
I checked all the parameters but am still facing this issue (Airflow version: v2.8.1)
Hi Marc, do I have to run astro dev start again when I create a new DAG in the dags folder?
Nope
Anyone else having issues with the Snowflake connection? I followed everything but it doesn't seem to work. I'm not even sure how to find out what went wrong
Yeah, same problem for me:
250001: 250001: Could not connect to Snowflake backend after 0 attempt(s).Aborting
Still facing connection issues; getting the error snowflake.connector.errors.OperationalError: 250001: 250001: Could not connect to Snowflake backend after 2 attempt(s).Aborting. Please assist
I'm unable to see Amazon S3 on the Airflow localhost. Can you please help me with that?
Did you install the Amazon provider?
Hi Ruchi, I'm facing the same issue.
Did you manage to solve this?
Thanks
@@Yonatanx3 Use Amazon Web Services for the connection type ;)
Can you check on creating the connection from Airflow to Snowflake? The interface has changed slightly, and now I'm unable to create a connection. I've verified that all parameters are correct, and yet the test is still failing
Yes I have the same issue
I’ve just released a new video that shows how to make that connection: th-cam.com/video/YZTcIi5o7FI/w-d-xo.htmlsi=8-8-Q8LUasYfz2V0
Hi, when creating connections in Airflow, the Test button is greyed out and says 'Testing connections is disabled in Airflow configuration. Contact your deployment admin to enable it'. Please can you help with this so the test is enabled? I can see in the config that it's set to disabled; I just need to know how to switch it. Thanks
Yes. That was introduced in 2.7. Change the configuration setting AIRFLOW__CORE__TEST_CONNECTION to Enabled
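For a Docker Compose setup, that's an environment variable on the Airflow services (a sketch, assuming the standard compose layout):

AIRFLOW__CORE__TEST_CONNECTION: 'Enabled'

It ships disabled by default for security reasons, so only enable it if everyone who can reach the UI is trusted.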
At the SQL Requests step, I had to execute the queries to create the warehouse and schema separately, since I ran into a "No active warehouse selected in the current session" error later when trying to insert values into the table.
Also, in the Airflow UI, under connections, I don't have the Amazon S3 option!
Use the Amazon Web Services option instead of the Amazon S3 connection. Thanks for sharing
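On the warehouse error: one way to avoid it (a minimal sketch, assuming the Snowflake provider is installed; warehouse and table names are hypothetical) is to set the warehouse explicitly on the operator, so the session always has an active warehouse:

from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG("snowflake_insert_demo", start_date=datetime(2024, 1, 1), schedule=None, catchup=False):
    insert_rows = SnowflakeOperator(
        task_id="insert_rows",
        snowflake_conn_id="snowflake_default",
        warehouse="MY_WH",  # hypothetical name; sets the active warehouse for the session
        sql="INSERT INTO MY_TABLE VALUES (1, 'example');",  # hypothetical table
    )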
For me, there is no option to add Snowflake as a connection type with the host URL... please suggest something
You need to install the apache-airflow-providers-snowflake==4.4.0 provider
@@MarcLamberti I tried, but it is still not working. Could you please share the Git repo for the entire process? That would be of great help for us
@@kkampassi4820 Look at the pinned comment :) I will release a video tomorrow that uses Snowflake as well with the updated way
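If the Snowflake connection type is still missing after that, a typical sequence (a sketch, assuming a pip-based setup; with the Astro CLI, add the package to requirements.txt instead) is:

pip install apache-airflow-providers-snowflake==4.4.0

then restart the webserver and scheduler so the new connection type shows up in the UI.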
I can't see the Amazon S3 connection type in the Airflow web UI
It’s AWS now
@@MarcLamberti ok
Thank you, but I'm sorry, English with a French accent is terrible
Eating a 🥐 while listening makes it better ❤️
I just can't stand that accent
Me too 🤢
@@MarcLamberti I love your accent. Dont listen to ungrateful morons.
@@akj3344 Thank you🙏
@@akj3344 eat deek
I couldn't find the "Amazon S3" connection type in the Airflow UI. What's going on?
Can someone explain how I can install the S3 provider package?
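A minimal sketch, assuming a pip-based setup (with the Astro CLI, add the package to requirements.txt instead):

pip install apache-airflow-providers-amazon

then restart Airflow. Note that in recent provider versions the connection type shows up as Amazon Web Services, not Amazon S3.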