Excellent job! Finally, Airflow DAGs have been demystified.
Thank you so much for the clear, simple, and very useful video.
Thank you.
It's short and covers the core concepts nicely. Expecting more videos, and thanks for taking the time to prepare this valuable content.
You are welcome! I am glad that you liked it. 😉
Boom! It's really interesting and so useful! Thanks so much!!! 💕
I am glad that you like it. Thanks for your warm comment!! :-)
Finally!! Thanks for your hard work. Would love to see more too.
Working on it!
Really useful. Thank you!
Very easy to understand and practice.
Very, very good. I am inching my way along; this is helpful.
Thanks for watching.
I have studied easily thanks to your video. Thank you very much!
You are welcome!
Great job!
Boom, super... nice explanation of Airflow for beginners... please do more videos on this.
Thanks, will do!
Thank you very much from the Korean Peninsula.
You are welcome!
Loving your tutorials! It would be great if you could share more about Airflow on Kubernetes
Amazing work
Thanks a lot 😊
Great
good
Hi, if XCom is not suitable for large data, how can we share data between tasks? If it is covered in another video, could you tell me which one?
Sir, why is it that every time I save a new DAG, the previous DAG I saved disappears from the DAGs list in the webserver? I mean, v01, v02, and v03 no longer appear soon after I save v04 and v05. Did I miss something? Please answer.
That's normal. When you save it as a new version, the scheduler refreshes and picks up the latest version only.
Very nice and concise. I have a dict object that I want to pass as kwargs to the PythonOperator callable function along with ti. How can I do that? Do I have to add 10 dictionary keys as function parameters and an 11th parameter as ti, or is there some other way?
You can just pass the dict object as a single Python parameter. Something like this: op_kwargs={'some_dict': {'a': 1, 'b': 2}} for the python_callable def greet(some_dict, ti):
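Here's a minimal runnable sketch of that pattern (the dag_id, task_id, and dict values are made up for illustration, assuming Airflow 2.x):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def greet(some_dict, ti):
    # some_dict arrives as a regular keyword argument via op_kwargs;
    # ti (the task instance) is injected by Airflow because it is in the signature.
    print(f"a={some_dict['a']}, b={some_dict['b']}, task={ti.task_id}")


with DAG(
    dag_id="op_kwargs_example",  # hypothetical dag_id
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    greet_task = PythonOperator(
        task_id="greet",
        python_callable=greet,
        op_kwargs={"some_dict": {"a": 1, "b": 2}},
    )
```

This way the whole dict is one parameter, so you don't have to spell out each key in the function signature.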
If not using XCom for larger values, what should we use instead?
You can save intermediate large files to local disk or remote storage like S3.
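One common pattern (sketched below with a hypothetical path and dag_id, assuming Airflow 2.x with pandas and pyarrow installed) is to write the large payload to storage and only pass its path through XCom:

```python
from datetime import datetime

import pandas as pd

from airflow.decorators import dag, task


@dag(start_date=datetime(2023, 1, 1), schedule_interval=None, catchup=False)
def large_data_example():  # hypothetical dag_id
    @task
    def produce_data() -> str:
        # Write the large payload to shared storage; an S3 URI would work
        # the same way with the appropriate hook or credentials.
        path = "/tmp/large_dataset.parquet"  # hypothetical path
        pd.DataFrame({"value": range(1_000_000)}).to_parquet(path)
        return path  # only the small path string goes through XCom

    @task
    def consume_data(path: str):
        df = pd.read_parquet(path)
        print(f"read {len(df)} rows from {path}")

    consume_data(produce_data())


large_data_example()
```

Note that a local path only works if all tasks run on the same machine; with the Celery or Kubernetes executors, remote storage like S3 is the safer choice.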
Hi, do you know why 'docker-compose up -d' doesn't start my localhost:8080? Instead I simply use 'docker compose up' and it works. Is there a reason for this?
It should work. Can you check whether the containers are up after you run 'docker-compose up -d'? Keep in mind, it may take a few minutes to bring the containers up.
@coder2j Yes, it does, thank you. If you don't mind me asking, what's the difference between the two?
I have seen a Python method like def get_spark_config(**context): with ti = context['ti'] inside. What is the purpose of context? Why is there a **? And what does ti = context['ti'] mean?
context is a dictionary that contains a lot of DAG info that Airflow provides. With ** in the function signature, all the keyword arguments Airflow passes in are collected into the single dictionary context. So context['ti'] simply gets the ti (task instance) value from context under the key 'ti'.
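A minimal sketch of that pattern (the dag_id and printed keys are illustrative, assuming Airflow 2.x):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def get_spark_config(**context):
    # Airflow calls the function with its context as keyword arguments
    # (ti, ds, dag, run_id, ...); **context collects them all into one dict.
    ti = context["ti"]  # the TaskInstance object
    print(f"running {ti.task_id} on {context['ds']}")


with DAG(
    dag_id="context_example",  # hypothetical dag_id
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="get_spark_config", python_callable=get_spark_config)
```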
@coder2j Understood now. Thanks for your clear videos and explanation!