5 MUST KNOW Airflow debug tips and tricks | Airflow Tutorial Tips 1
- Published on Jul 10, 2024
#Airflow #AirflowTutorial #Coder2j
========== VIDEO CONTENT 📚 ==========
You guys rock! We have hit more than 1k likes on the 2-hour Airflow Full Course. As promised in this tutorial, I am going to show you the 5 most common airflow errors and must-know tips and tricks to debug and fix them.
Video Request: forms.gle/UMp4GA3krcSMMWzy9
Subscribe and Smash the like button to unlock our 5k likes bonus tutorial videos!!
2-hour Airflow FULL Course: • Airflow Tutorial for B...
1000 likes 👍 - Bonus videos about how to debug Airflow DAG (Achieved!)
5000 likes 👍 - Bonus videos about Airflow Docker Operator (Keep going!!)
========== T I M E S T A M P ⏰ ==========
Throughout the course, you will learn:
00:00 - Intro
00:47 - DAG Import Error
06:26 - DAG shows up too slow
08:49 - DAG/Task not running
10:22 - DAG/Task run failed
12:26 - Scheduler is not running
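The first issue above, the DAG import error, can be reproduced outside the UI by simply executing the DAG file with the Python interpreter. A minimal self-contained sketch of that check using only the standard library (no Airflow required; the helper name and the temp DAG file are illustrative, not from the course):

```python
import importlib.util
import os
import tempfile
from typing import Optional

def try_import_dag(path: str) -> Optional[Exception]:
    """Import a DAG file the way the scheduler would, and return the
    exception (if any), mirroring the 'Broken DAG' banner in the UI."""
    spec = importlib.util.spec_from_file_location("dag_under_test", path)
    module = importlib.util.module_from_spec(spec)
    try:
        spec.loader.exec_module(module)
        return None
    except Exception as exc:
        return exc

# A DAG file with a misspelled import fails just like it would in the UI.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("from airflow.operatorz import bash\n")  # deliberate typo
    dag_path = f.name

err = try_import_dag(dag_path)
print(type(err).__name__)  # ModuleNotFoundError
os.remove(dag_path)
```

Running the real file the same way (or just `python dags/my_dag.py`) surfaces the full traceback that the UI banner truncates.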
========== L I N K S 🔗 ==========
Airflow FULL COURSE 👉 • Airflow Tutorial for B...
Airflow Books 👉 amzn.to/3N43rlI
Airflow Documentation 👉 bit.ly/3wbTqv4
Course GitHub Repo 👉 github.com/coder2j/airflow-do...
========== Connect with me 👏 ==========
Twitter 👉 / coder2j
Website 👉 coder2j.com
GitHub 👉 github.com/coder2j
Thanks for this great content 🙏
You are welcome!
greetings from Egypt
you are awesome
For the scheduler in Docker, alias the start command. Then you can start/stop it whenever needed for an instant refresh.
Valid point. 👍
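For reference, the kind of aliases the comment above describes might look like this in a shell rc file (the alias names are made up; the `airflow-scheduler` service name assumes the official docker-compose setup):

```
# ~/.bashrc (fragment) — convenience aliases for restarting only the
# scheduler container; assumes the official docker-compose service names.
alias af-sched-start='docker compose start airflow-scheduler'
alias af-sched-stop='docker compose stop airflow-scheduler'
alias af-sched-restart='docker compose restart airflow-scheduler'
```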
The "DAG shows up too slow" section is for a local Airflow setup only. For an Airflow Docker setup, use the following command in the Airflow project directory, wherever you've created it:
export AIRFLOW__SCHEDULER__DAG_DIR_LIST_INTERVAL=60
In airflow docker, you can set the corresponding environment variable in the yaml file.
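Concretely, that might look like the following sketch against the official docker-compose.yaml, which shares one environment block across services via `x-airflow-common` (the interval value is an example):

```yaml
# docker-compose.yaml (fragment)
x-airflow-common:
  environment:
    # rescan the dags folder every 60 seconds (default is 300)
    AIRFLOW__SCHEDULER__DAG_DIR_LIST_INTERVAL: '60'
```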
Can you explain about triggers?
Could you please explain what parent DAGs and child DAGs are?
Regards
I have started the Docker container. Now when I create DAG files in my folder (airflow_docker), 'import airflow' is not recognised. What should I do?
In your dev tool, select the Python env in your Docker container as the interpreter.
- Hello, everybody!
- Hello, Cleveland Jr.!
Thank you so much for the excellent content!
In the airflow UI (docker), I could not see log history and got the following messages:
*** Log file does not exist: /opt/airflow/logs/dag_id=........
*** Fetching from: :8793/log/dag_id=..../run_id=manual__2022-12-17T18:49:13.457497+00:00/task_id=data_crawl/attempt=1.log
*** Failed to fetch log file from worker. Request URL is missing an 'http://' or 'https://' protocol.
Do you know if I am missing anything in setup? Thanks a lot in advance!
Can you see your log in the logs folder?
@coder2j I could not find the relevant log for the run in the logs folder, whether the run succeeded or failed 😇
Hi, this is a great video. I spent all day today trying to fix an error. My Airflow is running and my DAG is in place, but when I hit refresh, nothing happens. I triggered it manually, but it's like the task is not running. The date I wrote in the code is tomorrow. Any advice? Many thanks
What are the start date and schedule interval?
@coder2j start_date = (2023, 3, 4)
schedule_interval="@daily"
I don't see an error, but it's like the task is not running. I am not able to see the details of the task, and when I go to the graph I cannot click on the task either.
It will only run once your local time passes 2023-03-04 00:00. Try changing the start date to 2023-03-03.
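The timing rule behind this answer can be illustrated without Airflow at all: a run for a given logical date is only triggered once that date plus the schedule interval has passed. A minimal sketch with plain `datetime` (the helper name is made up):

```python
from datetime import datetime, timedelta

def first_run_fires_at(start_date: datetime, interval: timedelta) -> datetime:
    """An @daily DAG's first run covers [start_date, start_date + interval)
    and is only triggered after that window closes."""
    return start_date + interval

# start_date 2023-03-04 with @daily: nothing fires until 2023-03-05 00:00.
# Moving start_date back to 2023-03-03 lets it fire after 2023-03-04 00:00.
print(first_run_fires_at(datetime(2023, 3, 4), timedelta(days=1)))
```

So a start_date set to "tomorrow" means the first scheduled run is the day after tomorrow, which is why the DAG appears to do nothing.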
@coder2j Thanks for your time. I changed the date and now I can access the log, but it shows me the following error: *** Log file does not exist: /opt/airflow/logs/dag_id=our_first_dag/run_id=manual__2023-03-03T23:25:26.672754+00:00/task_id=first_task/attempt=1.log
*** Fetching from: :8793/log/dag_id=our_first_dag/run_id=manual__2023-03-03T23:25:26.672754+00:00/task_id=first_task/attempt=1.log
*** Failed to fetch log file from worker. Request URL is missing an 'http://' or 'https://' protocol.
By the way, this is the example from your Airflow tutorial.
I did it! It was an issue with the Python version. I was using Python 3.10; I changed to 3.8 and now it's working. Thanks!
I'm getting a "No task found" error. How do I debug this?
The Airflow scheduler keeps going down due to a deadlock error. What do I need to do to fix it?
Did you try restarting the airflow scheduler?
Are subdags covered in this tutorial?