Great job, thanks for sharing it.
To the point, no fluff 👍🏽
Amazing. Thanks Marc!
My pleasure!
This is exactly what I was looking for
Awesome... thanks for such a wonderful knowledge share.
Thank you. You've helped me a lot.
Thanks Marc for this very useful video! Highly appreciated! Keep it up!!
Awesome. Thanks a lot, Marc.
Excellent, thanks a lot... I have to check an Oracle table for whether a record exists for the current date, and only if it exists call the downstream task. How can I achieve this?
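One common pattern for this (a sketch, not from the video: the table and column names, the connection id, and the hook wiring are all assumptions) is a ShortCircuitOperator whose callable runs the existence check; when it returns False, all downstream tasks are skipped. The check itself can be kept as a plain function:

```python
import datetime

def record_exists_for_today(run_query):
    """Return True when the table has a row for today's date.

    `run_query` is any callable that executes SQL and returns the first
    row or None -- in a real DAG this would be something like
    OracleHook(oracle_conn_id="oracle_default").get_first.
    The table and column names below are placeholders.
    """
    sql = (
        "SELECT 1 FROM my_table "
        "WHERE TRUNC(record_date) = TO_DATE(:run_date, 'YYYY-MM-DD')"
    )
    today = datetime.date.today().isoformat()
    return run_query(sql, {"run_date": today}) is not None

# In the DAG, wire the check into a ShortCircuitOperator so downstream
# tasks only run when a record exists:
#   check = ShortCircuitOperator(
#       task_id="check_record_exists",
#       python_callable=lambda: record_exists_for_today(hook.get_first),
#   )
#   check >> downstream_task
```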
That was very useful. Thanks :)
you are the best
How can I send an object from GCS by email using Airflow? If possible, please share some code.
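Not the video's code, but a sketch of the moving parts: in a DAG you would typically download the object's bytes with GCSHook and attach them to an outgoing message. The MIME-building step is plain standard library; the addresses, bucket, and filenames below are made up:

```python
from email.message import EmailMessage

def build_email_with_attachment(sender, recipient, subject, body,
                                attachment_bytes, filename):
    """Build a MIME email carrying one file attachment.

    In a real DAG, `attachment_bytes` would come from something like
    GCSHook().download(bucket_name="my-bucket", object_name=filename),
    and the finished message would be handed to your SMTP connection.
    """
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    # Attach the raw bytes as a generic binary attachment
    msg.add_attachment(
        attachment_bytes,
        maintype="application",
        subtype="octet-stream",
        filename=filename,
    )
    return msg

# Example usage (all values hypothetical):
msg = build_email_with_attachment(
    "me@example.com", "you@example.com",
    "Daily report", "See attached.", b"col1,col2\n1,2\n", "report.csv",
)
```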
this is great. thanks!
Very good video!!!
Whenever I first run the airflow test command, it throws an error saying 'sqlalchemy.exc.NoReferenceTableError: Foreign key associated with column 'dag_run_note.user_id' could not find table 'ab_user' with which to generate a foreign key to target column 'id'.
Do you know what it is about? And how do I fix it?
On the second run it doesn't occur...
Hmmmmm interesting, if it works on the second run, maybe it needs to generate the table first? Initializing the metadata database before the first test run (airflow db migrate, or airflow db init on older versions) should create the missing ab_user table.
Can I run 2 DAGs with different Python versions and different pandas versions? For now I see Airflow offers only one Python version at a time, and only predefined dependencies,
and all my DAGs must match those specs.
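It is actually possible to isolate tasks: Airflow's PythonVirtualenvOperator builds a throwaway virtualenv per run (with its own pandas pin), and ExternalPythonOperator (Airflow 2.4+) runs a callable under a pre-installed interpreter, which can be a different Python version. Both boil down to executing code in another interpreter process, which this standard-library sketch illustrates (the second interpreter path mentioned in the docstring is hypothetical):

```python
import subprocess
import sys

def run_in_interpreter(python_path, code):
    """Execute `code` under the interpreter at `python_path` and return
    its stdout -- the same isolation idea ExternalPythonOperator uses
    with a pre-built environment such as
    /opt/venvs/py38-pandas1/bin/python (path is hypothetical).
    """
    result = subprocess.run(
        [python_path, "-c", code],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# With a second interpreter installed, each "task" could report its own
# Python/pandas version; here we just ask the current interpreter:
print(run_in_interpreter(sys.executable,
                         "import sys; print(sys.version_info.major)"))
```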
Instead of calling the script's function, is there an operator I can use to execute the script in its entirety?
How do I enable this includes module?
did you resolve it?
I am trying to pull a value from another task in a different folder; however, I only see the current task instance. Is this new behaviour, or am I unable to pass values between different directories?
So your DAGs are in different folders?
@@Astronomer I have no idea what I was talking about😄
nice man
import airflow cannot be resolved
Hey, while using includes I get the error - ModuleNotFoundError: No module named 'includes'
I'm facing the same error. How can I resolve it?
@@leomax87 You can use the sys module in Python to put the folder that contains includes (the dags folder) on the import path before importing from it:
import os, sys
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))  # adds this file's parent directory, i.e. one level up into the dags folder
from includes.my_dags.functions import process