Run custom training job with custom container in Vertex AI

  • Published on Oct 14, 2024

Comments • 14

  • @varshasahasrabuddhe1652 (1 year ago)

    Hi, this video was very helpful. I have a question: how do we use this virtual environment?

  • @sushmitarai4471 (1 year ago)

    Hi, great tutorial. I have a question: is it possible to run a custom training pipeline from a workflow service?

    • @cloud4datascience772 (1 year ago)

      Hey, as long as the workflow service supports any of the languages compatible with the Vertex AI custom training service, you should be able to do it: cloud.google.com/vertex-ai/docs/training/create-custom-job#create
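      For example, with the Vertex AI Python SDK a submission could look roughly like the sketch below (the project, region, image URI, and bucket values are placeholders, not from the video):

      from google.cloud import aiplatform

      # Initialize the SDK with a project, region, and staging bucket (placeholders).
      aiplatform.init(
          project="my-project",
          location="us-central1",
          staging_bucket="gs://my-staging-bucket",
      )

      # Define a training job that runs your own container image.
      job = aiplatform.CustomContainerTrainingJob(
          display_name="custom-container-training",
          container_uri="us-central1-docker.pkg.dev/my-project/my-repo/trainer:latest",
      )

      # Submit the job; the args are forwarded to the container's entrypoint.
      job.run(
          args=["--data_gcs_path=gs://my-bucket/data.csv"],
          replica_count=1,
          machine_type="n1-standard-4",
      )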

  • @xyz-jn4oj (7 months ago)

    Hey, what about model deployment? Can you make a video on it?

  • @thinhnguyenhoang6058 (1 year ago)

    Hello, great tutorial. I want to ask: if I start the CustomContainerTrainingJob from a Colab shell and then close my browser, would the job run fine (not be canceled when the Colab session disconnects) if my model takes days to train?

    • @cloud4datascience772 (1 year ago)

      Hi, thanks! Regarding your question: once you submit your training job, it runs on compute resources inside Vertex AI for as long as it takes to finish. There is no need to keep your Colab session up and running the entire time; it only serves as a starting point, and once you have submitted your job and can see it in Vertex AI, you can close your current session.
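      If you would rather have the Colab cell return immediately instead of streaming logs until training finishes, the SDK's run() call can also be made non-blocking; a minimal sketch, reusing a placeholder dataset path:

      # Submit the job without blocking the notebook; training keeps running
      # on Vertex AI even after the Colab session disconnects.
      job.run(
          args=["--data_gcs_path=gs://my-bucket/data.csv"],
          replica_count=1,
          machine_type="n1-standard-4",
          sync=False,  # return right away; monitor the job in the Vertex AI console
      )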

  • @prateeksingh6036 (1 year ago)

    Hey, great tutorial. I have a question: how do I add environment variables to custom jobs (through the CLI)?

    • @cloud4datascience772 (1 year ago)

      Hey, I am glad to hear that you liked the video. There are a couple of ways to work with environment variables in Docker:
      - the first and simplest way is to define them inside the Dockerfile with the ENV instruction (docs.docker.com/engine/reference/builder/#environment-replacement),
      - you can also pass a .env file to the docker compose command,
      - you can also pass them with the -e flag: docker run -e "env_var_name=another_value" ${IMAGE_URI} --data_gcs_path=...
      I think you should generally find what you are looking for here: vsupalov.com/docker-arg-env-variable-guide/#frequent-misconceptions
      Good luck :)
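      Inside the container, the training script can then read the variable with the standard library; a minimal sketch (env_var_name is just the placeholder name from the example above):

      import os

      # Read the value set via ENV in the Dockerfile or via `docker run -e`;
      # fall back to a default if the variable is not set.
      env_value = os.environ.get("env_var_name", "default_value")
      print(f"env_var_name={env_value}")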

    • @prateeksingh6036 (1 year ago)

      Hey, actually I want to pass the env variables while running the Vertex custom job; the Docker image will remain the same and the env variables will change with each job run. Is there any way to pass them?

    • @cloud4datascience772 (1 year ago)

      @prateeksingh6036 I am not sure why you would want to pass env variables to the custom job if you want to keep the Docker image the same. If you would like to adjust the behavior of a Python script, you can always do it by providing the necessary values as arguments to the script with the args parameter of the job.run() command.

    • @prateeksingh6036 (1 year ago)

      @cloud4datascience772 I tried passing the arguments, but then I'm getting "The replica workerpool0-0 exited with a non-zero status of 127" and there is nothing useful in the logs.

    • @cloud4datascience772 (1 year ago)

      @prateeksingh6036 You can try to pass additional arguments in the same way that I am passing the dataset:
      args=['--data_gcs_path=gs://datasets-c4ds/healthcare-dataset-stroke-data.csv',
            '--your_next_argument=value']
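      Roughly, the full call and the matching parsing in the training script could look like this sketch (everything except --data_gcs_path is a placeholder):

      # In the notebook or submission script:
      job.run(
          args=[
              "--data_gcs_path=gs://datasets-c4ds/healthcare-dataset-stroke-data.csv",
              "--your_next_argument=value",  # placeholder extra argument
          ],
          replica_count=1,
      )

      # In the training script inside the container:
      import argparse

      parser = argparse.ArgumentParser()
      parser.add_argument("--data_gcs_path", type=str, required=True)
      parser.add_argument("--your_next_argument", type=str, default="value")
      parsed_args = parser.parse_args()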

  • @NicolasVallot (1 year ago)

    Hello, can you please help me create a model registry and an endpoint with your code?

    • @cloud4datascience772 (1 year ago)

      Hi, unfortunately I do not have the time to provide individual assistance on these topics. Google provides documentation on the various topics related to custom training on their platform, and there is also material on predictions: cloud.google.com/vertex-ai/docs/predictions/custom-container-requirements. I hope this is helpful, and in the future I might have a tutorial on this topic as well.
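      For reference, the registry-and-endpoint flow with the Python SDK looks roughly like the sketch below (bucket, display name, and serving image are placeholders; check the prediction docs for the exact prebuilt image URI):

      from google.cloud import aiplatform

      aiplatform.init(project="my-project", location="us-central1")

      # Register the trained model artifacts in the Vertex AI Model Registry.
      model = aiplatform.Model.upload(
          display_name="stroke-model",
          artifact_uri="gs://my-bucket/model-artifacts/",
          serving_container_image_uri="us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest",
      )

      # Deploy the registered model to a new endpoint for online predictions.
      endpoint = model.deploy(machine_type="n1-standard-2")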