Cloud Composer: Good development practices
- Published Sep 14, 2020
- This session is aimed at helping Cloud Composer (Apache Airflow) developers understand best practices for writing workflows (DAGs) and avoid common pitfalls. Some Airflow development patterns are more optimal than others, and proper design choices influence the reliability of workflow execution.
Learn both proper patterns and anti-patterns to help Cloud Composer users maximize their productivity and lower the chance of problems with workflow executions.
This content is designed for developers who have interacted with Cloud Composer or Apache Airflow before.
Speaker: Filip Knapik
Watch more:
Google Cloud Next ’20: OnAir → goo.gle/next2020
Subscribe to the GCP Channel → goo.gle/GCP
#GoogleCloudNext
DA305
event: Google Cloud Next 2020; re_ty: Publish; product: Cloud - Data Analytics - Cloud Composer; fullname: Filip Knapik; - Science & Technology
Can you please share the link that contains all DAG operators? Also, when using PythonOperator with custom data-wrangling scripts (e.g., loading data into a DataFrame), is the memory used the "Airflow database"? Is it recommended to trigger other services (e.g., Spark) through Airflow for this kind of data handling?
Any examples of connecting to an on-premises database via SSH from Cloud Composer?
Perfect
'don't fall behind composer versions' also 'we're 6 years behind the latest python version'