Check out our new video on saving tweets to a CSV file! 🐦 th-cam.com/video/rfqEJpJtCzw/w-d-xo.html
@Mustafa Dholkawala Did you set up all the environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, MLFLOW_S3_ENDPOINT_URL, MLFLOW_TRACKING_URI) on the machine you're running the code from?
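For reference, the client-side setup in Python looks roughly like this (the addresses and credentials below are placeholders, not the exact values from the video):

```python
import os

import mlflow

# Placeholder values -- use the credentials and host from your own deployment.
os.environ["AWS_ACCESS_KEY_ID"] = "minio-access-key"
os.environ["AWS_SECRET_ACCESS_KEY"] = "minio-secret-key"
os.environ["MLFLOW_S3_ENDPOINT_URL"] = "http://your-vm-ip:9000"  # Minio endpoint
os.environ["MLFLOW_TRACKING_URI"] = "http://your-vm-ip:5000"     # MLflow server

mlflow.set_tracking_uri(os.environ["MLFLOW_TRACKING_URI"])

with mlflow.start_run():
    mlflow.log_param("alpha", 0.5)   # any parameter of your training run
    mlflow.log_metric("rmse", 0.42)  # any metric you want to track
```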
Timestamps
1:45 - Azure VM creation
4:51 - Docker installation
6:02 - MLFlow server setup
6:52 - MLFlow configuration
7:57 - Creating buckets with minio
9:43 - Client side configuration
11:12 - Running mlflow on the developer machine
Hi, I get an error:
Version in "./docker-compose.yml" is unsupported. You might be seeing this error because you're using the wrong Compose file version. Either specify a supported version (e.g "2.2" or "3.3") and place your service definitions under the `services` key, or omit the `version` key and place your service definitions at the root of the file to use version 1.
I see AWS-specific configuration in the env file... This doesn't appear to be compatible with Google Cloud or Azure hosted Docker servers.
It's not AWS. Minio is a drop-in replacement for S3 (the storage protocol/API that AWS/Amazon created), hence the name of the variable. It's compatible with any environment that can run Docker. For more information about Minio and S3, see min.io/
I did not understand the AWS part. Why did we use AWS (just as a dummy)?
Where is there any "AWS" part in the video? There are two places where you have a choice: the hosting provider (could be Azure, DigitalOcean, AWS, whatever) and the S3 bucket creation - you can do that using the native AWS CLI or using the script, just as in the video.
I think a big motivation for setting up a remote tracking server is to run your code locally (not from the VM) and then view the results in the remote MLflow UI. If I ran my code locally, or had multiple users submitting runs in Python on their local machines, could they just type the URL and view the results from their own machines?
Yes, that's definitely the case. You don't get the URL in the local run, but when you open the web interface you can share a link with others ;)
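The pattern looks more or less like this on each developer's machine (the server address and experiment name here are made up):

```python
import mlflow

# Point the local client at the shared remote tracking server (placeholder address).
mlflow.set_tracking_uri("http://your-vm-ip:5000")
mlflow.set_experiment("shared-team-experiment")  # example name, pick your own

with mlflow.start_run() as run:
    mlflow.log_metric("accuracy", 0.91)

# Anyone can then open the same run in the remote UI, typically reachable at
# http://your-vm-ip:5000/#/experiments/<experiment_id>/runs/<run_id>
print(run.info.experiment_id, run.info.run_id)
```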
How could I do this on an AWS EC2 Windows instance? Also, my application doesn't have any parameters: it captures an image every 10 seconds and uses Flask to create a server. Please guide me on this.
You should install Docker and then create a docker-compose.yml file for your application, just like we did here. Here's the Docker Compose tutorial that uses Flask: docs.docker.com/compose/gettingstarted/
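If it helps, here's a very rough sketch of how such a Flask service could still log something useful (the captured images as artifacts) to the remote tracking server -- the capture helper, port, and addresses are all hypothetical:

```python
import time

import mlflow
from flask import Flask

app = Flask(__name__)
mlflow.set_tracking_uri("http://your-vm-ip:5000")  # placeholder tracking server


def grab_frame():
    # Stand-in for the real camera capture; just writes a placeholder file.
    path = f"/tmp/frame_{int(time.time())}.txt"
    with open(path, "w") as f:
        f.write("captured frame placeholder")
    return path


@app.route("/capture")
def capture():
    image_path = grab_frame()
    with mlflow.start_run():
        # No params needed -- logging artifacts (and metrics) alone is fine.
        mlflow.log_artifact(image_path)
    return "logged"


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```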
Any plan to update this for mlflow 2.0?
Many thanks for your work.
But I have 2 questions:
1) Can you add a script to remove deleted experiments (from the MLflow trash) via cron?
2) I'm using your current version, but after 3 months of usage I get an error:
BAD_REQUEST: (pymysql.err.IntegrityError) (1062, "Duplicate entry 'mlflow.runName-a171d72f207b4cb38afb5d918a778c9e' for key 'PRIMARY'")
Do you have any idea how to fix this?
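For the first question, MLflow ships a `gc` command that permanently removes runs already marked as deleted in the UI; a minimal sketch of a cleanup script you could schedule with cron (the database URI is a placeholder for your own backend store):

```python
# mlflow_gc.py -- e.g. run weekly from cron: 0 3 * * 0 python /opt/scripts/mlflow_gc.py
import subprocess

# Placeholder: the same backend store URI your MLflow server is started with.
BACKEND_STORE_URI = "mysql+pymysql://user:password@localhost:3306/mlflow"

# `mlflow gc` permanently deletes entities in the "deleted" lifecycle stage.
subprocess.run(
    ["mlflow", "gc", "--backend-store-uri", BACKEND_STORE_URI],
    check=True,
)
```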
How can we do this for GCP?
Follow this tutorial to create the VM: th-cam.com/video/A9Ulrn98DPs/w-d-xo.html
then this one to open up the ports: docs.aws.amazon.com/AWSEC2/latest/UserGuide/authorizing-access-to-an-instance.html
then continue with this channel's video :)
This is great, thanks! Quick question, however: is there a reason why you don't include the installation of Minio inside your Dockerfile?
Precisely because it is a separate system - that's why Minio lives in the docker-compose file (so "docker compose up" starts it up). Not sure if I got your question right :) You certainly can run Minio and MLflow in the same container, but that's not how Docker was designed to be used. If the containers are separate, you can easily update one independently of the other. Here you have the app, database, and storage completely independent, so you can use Minio for other projects too.
Is there a way to enable a password-protected login to the MLflow web interface?
Yes, you can try nginx basic auth: newbedev.com/how-to-run-authentication-on-a-mlflow-server, but it will depend on your needs.
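On the client side, the MLflow tracking client can pass HTTP basic-auth credentials through environment variables once something like nginx sits in front of the server; a sketch with placeholder values:

```python
import os

import mlflow

# Placeholder credentials -- these must match whatever you configure in nginx.
os.environ["MLFLOW_TRACKING_USERNAME"] = "team-user"
os.environ["MLFLOW_TRACKING_PASSWORD"] = "change-me"

mlflow.set_tracking_uri("http://your-vm-ip:5000")  # placeholder server address

with mlflow.start_run():
    mlflow.log_metric("accuracy", 0.9)  # requests now carry the basic-auth header
```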
Please do this on AWS by running it on Fargate.
There's an in-depth tutorial here: towardsdatascience.com/deploying-a-docker-container-with-ecs-and-fargate-7b0cbc9cd608. The magic of Docker is that if you are able to create a container, you can deploy it anywhere you like - any environment that supports Docker (and, in this case, networking, so the service is available from the internet).
Awesome tutorial, thank you
Please help with model serve
What's the problem there? I would suggest you post a question on stackoverflow.com and share the link here.
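If the issue is consuming a model that was logged to this tracking server, the usual pattern looks something like this (the run id, model path, and input columns are placeholders):

```python
import mlflow
import mlflow.pyfunc
import pandas as pd

mlflow.set_tracking_uri("http://your-vm-ip:5000")  # placeholder server address

# Replace <run_id> with the run id shown in the MLflow UI; "model" is the
# artifact path the model was logged under.
model = mlflow.pyfunc.load_model("runs:/<run_id>/model")

sample = pd.DataFrame({"feature_1": [1.0], "feature_2": [2.0]})  # placeholder input
print(model.predict(sample))

# Alternatively, serve it over HTTP with the CLI:
#   mlflow models serve -m "runs:/<run_id>/model" -p 1234
```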
Today we've updated the GitHub repo (it wasn't working because of some breaking changes in Minio). Check this out: github.com/Toumash/mlflow-docker/commit/b6ecfe7d0c0b8c3d3674b1d5a1a63c4d6f6249df