Your content is very helpful for me; I gained more knowledge here than from the institutes where I took courses. Your channel is more than enough to learn PySpark and Databricks.
Keep up the good work.❤
Glad to hear that! Thanks for your comment.
Your content is really awesome; the way you explained it really impressed me. Thanks, Raja, for your efforts.
Hi Ashok, thanks for your kind words. Glad to know it helps data engineers.
CI/CD with Git and Azure DevOps also, please.
Sure, will create a video on this requirement.
Good one, bro!
Thanks!
Kindly make a video on pushing and pulling notebooks from a Git account, referring to a real project. 👍👍
Noted. Will create a video on this concept. Thanks for the suggestion!
Can you create a video on CI/CD in Databricks?
Sure, will create a video on this requirement.
If we have more than one notebook, say three notebooks, and I want to run them one by one, how do I do that? Also, how do I add or connect a notebook to an ongoing data pipeline?
We can use the %run command multiple times. But for this requirement, it's better to go with a Workflow. Using Workflows, we can create an orchestration schedule with serial/parallel executions, etc.
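To illustrate the Workflow approach from the reply above, here is a minimal sketch of a Databricks Jobs configuration (Jobs API 2.1 format) that chains three notebooks serially via `depends_on`; the job name and notebook paths are hypothetical placeholders, and to run notebooks in parallel you would simply drop the `depends_on` entries:

```json
{
  "name": "run-three-notebooks-serially",
  "tasks": [
    {
      "task_key": "notebook_1",
      "notebook_task": { "notebook_path": "/Workspace/pipeline/notebook_1" }
    },
    {
      "task_key": "notebook_2",
      "depends_on": [ { "task_key": "notebook_1" } ],
      "notebook_task": { "notebook_path": "/Workspace/pipeline/notebook_2" }
    },
    {
      "task_key": "notebook_3",
      "depends_on": [ { "task_key": "notebook_2" } ],
      "notebook_task": { "notebook_path": "/Workspace/pipeline/notebook_3" }
    }
  ]
}
```

The same chain can also be built in the Workflows UI without JSON; the config form is just convenient for version control.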
Can you please create AWS videos as well, based on AWS data engineering?
Sure, will start AWS as well once I've completed most of the topics in Azure data engineering.
Will you teach Azure online?