Getting Started With the Official Airflow Helm Chart
- Published Jun 14, 2021
- The official helm chart (airflow.apache.org/docs/helm-...) of Apache Airflow is out! 🥳 The days of wondering what Helm Chart to use in production are over. Now, you only have one chart maintained and tested by Airflow PMC members as well as the community. It’s time to get your hands on it and take it for a spin! At the end of the webinar, you will have a fully functional Airflow instance deployed with the Official Helm Chart and running within a Kubernetes cluster locally.
In this webinar we’ll show you how to:
1. Create a local Kubernetes cluster with KinD
2. Deploy Airflow in a few seconds with the Official Helm Chart
3. Discover the first parameters to configure in the Helm Chart
4. Synchronize your DAGs with a Git repository
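The four steps above can be sketched roughly as follows. This is a hedged outline, not the webinar's exact script: the cluster name, release name, and namespace are illustrative, while the chart repo URL and command names are the official ones.

```shell
# 1. Create a local Kubernetes cluster with KinD (cluster name is illustrative)
kind create cluster --name airflow-cluster

# 2. Deploy Airflow in a few commands with the official Helm chart
kubectl create namespace airflow
helm repo add apache-airflow https://airflow.apache.org
helm repo update
helm install airflow apache-airflow/airflow --namespace airflow

# 3. Dump the chart's configurable parameters so you can explore them
helm show values apache-airflow/airflow > values.yaml

# 4. Reach the webserver locally to verify the deployment
kubectl port-forward svc/airflow-webserver 8080:8080 --namespace airflow
```

From there, DAG synchronization with a Git repository is configured through the chart's `dags.gitSync` values.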
MATERIALS: github.com/marclamberti/webin...
#helmchart #airflow #learnwithastronomer
Thank you very much for the great presentation and hands-on session. We are going to use Airflow in EKS, and our development Team needed a way to simulate their local environment to test their DAGs during development and become familiar with airflow on Kubernetes. Your guide was extremely helpful.
Thank you, Marc, for the awesome demonstration and tips!
This content is unreal given that it's available for free on youtube :)
Appreciate it Marc and @astro
Amazing, thank you so much for the great content! Really appreciate it Marc and @astro
thanks guys awesome demo
very interesting! Thanks a lot!
Great and valuable content, thanks!
My pleasure!
Hi Marc! In a real case, should we define a values.yaml file and pass it as an argument to the install command?
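For context, the common pattern with any Helm chart (not specific to this webinar) is exactly that: export the chart's defaults, edit the file, and pass it with `-f`. A minimal sketch, assuming the release is named `airflow` in the `airflow` namespace:

```shell
# Dump the chart's default values to a local file, then edit as needed
helm show values apache-airflow/airflow > values.yaml

# Install (or upgrade in place) using the customized values file
helm upgrade --install airflow apache-airflow/airflow \
  --namespace airflow -f values.yaml
```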
I am getting a "CrashLoopBackOff" error when I use the git-sync option.
Thanks
Hi Marc. If possible, can you show how to integrate Airflow, Kubernetes, and ArgoCD end-to-end, please?
Besides that, how do I change the default port '8080' in 'webserver > startupProbe' for the healthcheck? I am running 'k port-forward svc/airflow-webserver 8081:8080 --namespace airflow' and I would like the healthcheck to check this port as well.
Also, how do I configure the DAG folder on localhost without git-sync?
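On the git-sync questions in this thread: a minimal values fragment for the official chart usually looks like the sketch below. The repository URL, branch, and subPath are placeholders; a wrong repo URL or missing credentials for a private repo is a common cause of the git-sync sidecar crash-looping.

```shell
# Write an illustrative values fragment enabling git-sync
# (repo, branch, and subPath below are placeholders, not real settings)
cat > gitsync-values.yaml <<'EOF'
dags:
  gitSync:
    enabled: true
    repo: https://github.com/your-org/your-dags-repo.git
    branch: main
    subPath: dags
EOF

helm upgrade --install airflow apache-airflow/airflow \
  --namespace airflow -f gitsync-values.yaml
```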
For those of us that missed this, could you show how to get the logs? Thank you, this was incredible!
Did you figure out how to get the logs?
Did you manage to figure out how to get the logs?
First of all, great content! Quick question: with the KubernetesExecutor, I would like my worker pods to run on a preemptible VM node pool (in the case of GKE). How can I set up this Helm chart to be aware of that?
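One possible approach, sketched under assumptions: the chart exposes `workers.nodeSelector` and `workers.tolerations`, the `cloud.google.com/gke-preemptible: "true"` label is the standard GKE label for preemptible nodes, and the toleration below only applies if you have tainted the pool yourself with that key. Verify all of this against your own node pool.

```shell
# Illustrative values fragment pinning worker pods to a preemptible pool;
# the toleration assumes you added a matching taint to the pool yourself
cat > workers-values.yaml <<'EOF'
workers:
  nodeSelector:
    cloud.google.com/gke-preemptible: "true"
  tolerations:
    - key: cloud.google.com/gke-preemptible
      operator: Equal
      value: "true"
      effect: NoSchedule
EOF

helm upgrade airflow apache-airflow/airflow \
  --namespace airflow -f workers-values.yaml
```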
Did you figure this out? I have the same problem.
It looks like kind won't be able to load the custom image into a real multi-node Kubernetes cluster spread across multiple VMs. Any solution to that? Thanks!
Bring the custom image you want to run on the other nodes as part of your base image, and then deploy using a docker in docker strategy
Hello Marc, thank you for the video. I'm facing some problems with the new version of Airflow (today, when I run all the scripts, the version is 2.5.1). I'm not able to get the airflow-custom image running in the pods. Can you help me, please? Or has anyone managed to reproduce the scripts recently? Thanks!
Could you give me some more specifics?
How do I use the DockerOperator when Airflow is running on Kubernetes via Helm? I keep getting an error about a missing docker.sock file.
Is your Airflow running on Kubernetes using Docker?
How do I mount a volume to the worker pods that get spun up for tasks?
If you're using the KubernetesPodOperator, check out this link: docs.astronomer.io/astro/kubernetespodoperator#mount-a-temporary-directory. It should help you figure it out!
Hi, does anyone know how to view logs?
Go into the Airflow UI, and then select a task, and you'll have the option to view logs for that task!
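If the UI shows nothing, it can also help to look at the pods directly. A generic sketch, assuming the release is named `airflow` (so the chart names the scheduler deployment `airflow-scheduler`); exact resource names may differ in your deployment:

```shell
# List all Airflow pods to find the one you care about
kubectl get pods --namespace airflow

# Tail the scheduler's logs (deployment name assumes a release called "airflow")
kubectl logs --namespace airflow deploy/airflow-scheduler --tail=100

# With the KubernetesExecutor, each task runs in its own pod,
# so you can also tail that task pod directly by name
kubectl logs --namespace airflow <task-pod-name>
```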
How do I make Airflow available on a public IP?
It should be available by default through its port 8080, unless you have some kind of firewall!
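For a cluster that isn't local, the usual options are a temporary port-forward or changing the webserver Service type. A hedged sketch (the `webserver.service.type` key is from the chart's values; in production you would more likely put an Ingress with TLS in front instead of a bare LoadBalancer):

```shell
# Quick, temporary access from your own machine
kubectl port-forward svc/airflow-webserver 8080:8080 --namespace airflow

# Or expose the webserver Service publicly via a LoadBalancer (cloud clusters)
cat > expose-values.yaml <<'EOF'
webserver:
  service:
    type: LoadBalancer
EOF
helm upgrade airflow apache-airflow/airflow \
  --namespace airflow -f expose-values.yaml
```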
Hello guys. When I run the command "kind load docker-image airflow-custom:1.0.0 --name airflow-cluster",
I get the following error:
ERROR: command "docker save -o /tmp/image-tar642177335/image.tar airflow-custom:1.0.0" failed with error: exit status 1
Command Output: failed to save image: invalid output path: directory "/tmp/image-tar642177335" does not exist
Any idea how to solve this?
All steps are done, but I cannot open the Airflow UI installed on the server at :8080/. Airflow is deployed and running, but the UI is not accessible.
Use localhost:8080!
@Astronomer how do I access the UI on a public IP/domain using an Ingress?