complete DVC playlist: th-cam.com/play/PLwFaZuSL_mfr5MM2QAxkvXK29oGEFFxGu.html
As always, the video has crisp and precise content. Your videos are among the best in the MLOps area. Earlier, I followed your MLflow videos, which were also very helpful.
One question though: can we use DVC and MLflow together? For example, if one wants to do data versioning using DVC and the rest of the pipeline activities (experiment tracking, model registry, etc.) in MLflow, how would that work?
Yes, you can. I have already created separate videos for both: how to do data versioning using DVC and how to do experiment tracking using MLflow. Just follow those. If you have any further questions, let me know.
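A minimal sketch of combining the two, assuming a repo already initialised with `git init` and `dvc init`, a DVC remote configured, and a training script `train.py` that calls the MLflow API (all file and script names here are illustrative):

```shell
# Version the data with DVC; git tracks only the small .dvc pointer file
dvc add data/train.csv
git add data/train.csv.dvc .gitignore
git commit -m "train.csv: data version 1"
dvc push                       # upload the data itself to the DVC remote

# Run training; inside, train.py logs params/metrics/models via MLflow
python train.py
```

Logging the current git commit hash as an MLflow parameter (e.g. with `mlflow.log_param`) is one way to tie each MLflow run back to the exact DVC data version it used.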
First of all, this is an absolute gem of a video on DVC, which I found after a lot of struggle. Thank you very much!!
I have one doubt: if we push different data versions to the storage, won't there be many versions piling up if we don't replace them each time? Especially when the data is very large, we could run out of storage. But if we replace it every time, how can we get the older data versions back? We have a use case where we need to run the model daily for personalization and load the predictions into the DB. Our data is already huge, with 100 million rows for just one month of data. How does data versioning help in these cases?
Awesome content sir !!! 🎉 Thank you
You are welcome
Good
Thank you Mummy 🙏
How can DVC be used with Azure ML Studio? Any idea, or can anyone share a link about it?
How can we add MLflow code here for experiment tracking?
Hi Ashutosh, can you please create a video on how to automate ML models using Airflow?
Yes, planning to upload it in the next two weeks. Please stay tuned.
Sir, after pushing the data/model to the remote, in my case the split data and model are still present in the data dir.
It will be present in your DVC remote storage directory.
Hi sir, how can we do this in Jupyter Notebook and Databricks?
Thank you
You can create .py files from your Jupyter notebook and then execute them; it will work the same way. There should not be any difference. Or you can even run the code in chunks in the notebook. Let me know if you face any specific issue running it in the notebook.
I have not made a Databricks video yet. Will make and upload it soon.
Sir, how many videos are left in the DVC playlist?
One video. Are you looking for any specific topic in DVC?
@@AshutoshTripathi_AI I was just asking, sir. I am following all the videos. Which tool will you start after DVC?
I am thinking about DASK. Do you have something in mind?
@@AshutoshTripathi_AI OK, please make it on DASK then.