Finally dynamic DAG creation made simple in this video! Thank you!
Glad it was helpful!
thanks for the great talk. We'll be happy to hear more about dynamic tasks creation, especially ones that are effected from the results of previous tasks in the DAG.
Really useful and very clearly explained. Thanks!
Thank you so much
This is what I was looking for
Can we have more sessions like these?
Thanks for this great content. Much Appreciated :)
This was helpful. Thanks
Glad it was helpful!
Thanks for the talk, great content.
I saved all my DAGs into a single config file; each DAG has multiple tasks with dependency definitions and its own schedule, and each task has its own custom handler. I can generate all the DAGs dynamically just fine. The only challenge is that my tasks stay queued even when I set schedule_interval to None or @once.
Hmmmm, are you manually triggering the dags?
Say I have a DAG run in progress and I dynamically update the DAG's tasks. Will that break the existing run? For example, if that run has 10 tasks to do and I update the DAG while it's on the 1st task, will that run finish with the old tasks, and will only newly started runs pick up the updated tasks?
Thank you for this tutorial! I followed the 1st method of generating the DAGs (single file) and I see the DAGs being generated on the UI, but when I try to run one, I get an error from the executor: "airflow.exceptions.AirflowException: dag_id could not be found: . Either the dag did not exist or it failed to parse."
Even though the code below adds the DAG to the global scope, I can't figure out why the executor is not able to find the generated DAG when I try to run it:
globals()[dag_id] = create_dag(dag_id, schedule, dag_number, default_args)
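For context, the single-file pattern in question works by binding each generated DAG to a module-level name, because the scheduler and workers each re-parse the file and only pick up DAG objects found in the module's globals. Below is a minimal, dependency-free sketch of that registration pattern; the `create_dag` body, the `configs` dict, and the schedules are illustrative assumptions, and the stand-in `DAG` class replaces `airflow.models.DAG` only so the sketch runs without Airflow installed:

```python
# Stand-in for airflow.models.DAG, purely for illustration; the real
# pattern is identical, only the DAG class and its tasks come from Airflow.
class DAG:
    def __init__(self, dag_id, schedule_interval=None):
        self.dag_id = dag_id
        self.schedule_interval = schedule_interval

def create_dag(dag_id, schedule, default_args=None):
    # In real code this is where operators/tasks would be attached.
    return DAG(dag_id, schedule_interval=schedule)

# Hypothetical config driving the loop; the key point is that the same
# set of dag_ids must be produced on every parse of this file.
configs = {"dag_a": "@daily", "dag_b": "@once"}

# Each generated DAG must be bound to a module-level name: the scheduler
# and workers only discover DAG objects present in globals() at parse time.
for dag_id, schedule in configs.items():
    globals()[dag_id] = create_dag(dag_id, schedule)
```

One common cause of the "dag_id could not be found" error with this pattern is that the file does not produce the same set of DAG ids on every parse, e.g. when the loop's input comes from a random value or a source that changes between the webserver's parse and the worker's parse, so the worker never creates the id it was asked to run.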
thank you
Is there any way to get a list of DAG ids using Python?
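In case it helps: if the DAGs are driven by a single config file, as described earlier in the thread, the DAG ids are just that config's keys. A dependency-free sketch, with a hypothetical inline JSON config standing in for the real file:

```python
import json

# Hypothetical config mapping dag_id -> schedule, standing in for the
# single config file described in the thread.
config_text = '{"dag_a": "@daily", "dag_b": "@once"}'
config = json.loads(config_text)

# The DAG ids are simply the config's keys.
dag_ids = sorted(config)
print(dag_ids)  # ['dag_a', 'dag_b']
```

Within Airflow itself, `airflow dags list` on the command line, or `DagBag().dag_ids` from `airflow.models` in Python, returns the ids of every DAG the scheduler has successfully parsed.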