Leave your comments and questions for Martin and Karolína down below!
It was hard to find, so I’m going to share with everyone how to use overrides with Cloud Scheduler.
The Message Body of a schedule gives us this capability, but it’s not documented anywhere. So, let me show you how!
--message-body='{
  "overrides": {
    "containerOverrides": [
      {
        "env": [
          {"name": "ENV1", "value": "VALUE A"},
          {"name": "ENV2", "value": "VALUE B"},
          {"name": "ENV3", "value": "VALUE C"}
        ]
      }
    ]
  }
}'
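For context, here is a sketch of how that body might be attached to a full schedule. The schedule name, region, project, and service account below are placeholders, not from the original comment, so adjust them to your setup:

```shell
# Sketch: trigger a Cloud Run Job on a schedule with per-execution env overrides.
# my-job-trigger, REGION, PROJECT_ID, my-job, and the service account are placeholders.
gcloud scheduler jobs create http my-job-trigger \
  --location=REGION \
  --schedule="0 3 * * *" \
  --http-method=POST \
  --uri="https://run.googleapis.com/v2/projects/PROJECT_ID/locations/REGION/jobs/my-job:run" \
  --oauth-service-account-email=SA_NAME@PROJECT_ID.iam.gserviceaccount.com \
  --message-body='{"overrides":{"containerOverrides":[{"env":[{"name":"ENV1","value":"VALUE A"}]}]}}'
```

The URI is the v2 Jobs API `:run` endpoint, and the service account needs permission to invoke the job.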
Thanks for sharing what you found!
Cloud Run Rules! Martin, please make a firebase blocking functions video. Thanks a lot!
Thank you for the suggestion!
So close! I was expecting to see how override environment variables in the Cloud Scheduler configuration (Job Trigger).
Really useful thank you!
I am using cloud run jobs to run AI pipelines
@@techwithitchris Happy to hear you're finding Cloud Run Jobs useful! AI pipelines sounds like a great use case for this technology.
Thanks! Would cloud run jobs be appropriate for event-based triggers as well?
If you need to trigger code from events, a Cloud Run *service* (instead of job) is the way to go. If you search for "cloud run trigger with events" you will find the documentation for setting up triggers which will invoke Cloud Run services.
Cloud run jobs are awesome!
Is it possible to change the entrypoint of my job from the default index.js? I'm using the node library
Yes, in your package.json file, point the "start" script to a file other than index.js. For example:
"start": "node my_file.js"
How do we override the arguments and command for an execution of a Cloud Run Job in Java code? We are using the JobsClient class to execute the job. Could you please share a code example with clear steps for sending new arguments to the job for execution?
I believe it's done like this (the ContainerOverride usage below is my reading of the v2 Java client, so double-check it against the current library):
import com.google.cloud.run.v2.Execution;
import com.google.cloud.run.v2.JobName;
import com.google.cloud.run.v2.JobsClient;
import com.google.cloud.run.v2.RunJobRequest;

try (JobsClient jobsClient = JobsClient.create()) {
  RunJobRequest request =
      RunJobRequest.newBuilder()
          .setName(JobName.of("[PROJECT]", "[LOCATION]", "[JOB]").toString())
          // Override the container args for this execution only.
          .setOverrides(
              RunJobRequest.Overrides.newBuilder()
                  .addContainerOverrides(
                      RunJobRequest.Overrides.ContainerOverride.newBuilder()
                          .addArgs("--arg1=value1")
                          .addArgs("--arg2=value2"))
                  .build())
          .build();
  Execution response = jobsClient.runJobAsync(request).get();
}
You'd set your overrides with the setOverrides() call above. I found this code snippet by searching for "google cloud run java jobsclient". You will find more documentation of these classes there.
Best of luck with your project!
I remember when we were evaluating Cloud Run. We rejected it because there was a hard timeout on Cloud Run execution time. Is that now removed? We have jobs that may run for 5 hours. Can we use Cloud Run for them?
You set the timeout of your Cloud Run Job when you create it. You can set it to anything up to 24 hours.
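In gcloud terms, a 5-hour timeout could look like the following (the job name is a placeholder):

```shell
# Set a 5-hour per-task timeout on an existing job (the maximum is 24h).
# "my-job" is a placeholder name.
gcloud run jobs update my-job --task-timeout=5h
```

The same `--task-timeout` flag works on `gcloud run jobs create`.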
That really looks awesome - quick and configurable. There's only one thing I wonder - how does this feature compare to the other available execution runtimes? For instance, can't the same be achieved with Cloud Functions? And if it can, what does this new approach give us more?
You can use Cloud Functions, but this use case calls for containerized units of code, whereas Cloud Functions wraps your function in an HTTP server and supports only a fixed set of languages (such as Java, Python, Go, and Node). It's a matter of choice based on requirements: do you use one of those languages and want HTTP as your transport? Then Cloud Functions is probably all you need. But if you want to run code (or already have existing code) outside that language set, say Rust for example, you can package it into a container (or rely on Buildpacks to do it) and deploy it to Cloud Run Jobs.
@@jbellero Yep, that's the kind of perspective I was missing in the video. Thanks!
Can you edit the env variable using the code in the job? Let's say I have an env variable that keeps track of the last processed ID, so when the job starts it uses that variable to know where the last job finished, and right before the job ends, it updates that variable. The idea here would be triggering the job on a schedule. Thank you
Good question! Environment variables are useful for read-only settings. In the case of an ID that needs to be updated and then persisted between executions, I would use a database. If you want to keep it simple you could use Firestore. It requires no connection strings or other settings, and reading or writing a value only requires a single line of code.
You should give a link to the code used also you should give working examples in all supported languages
Good point! I will do that for future videos. In the meantime, you can find working examples and how to deploy Cloud Run Jobs if you do a web search for "Quickstart: Create and execute a job in Cloud Run". Best of luck with your project!
How can we debug Cloud Run Jobs? Set a breakpoint, step in, step over?
A Cloud Run Job is simply a program that runs from top to bottom. This means that you can run your code on your local machine with your regular debugger. If your code is using services in the cloud, like databases, make sure you run "gcloud auth application-default login" at your command prompt before you run your code. That way your code will have the same access level to cloud services as you do.
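As a sketch, that local debug workflow might look like this (the entrypoint file name is a placeholder, and the example assumes a Node job):

```shell
# Give your local code the same cloud access you have, via Application Default Credentials.
gcloud auth application-default login

# Then run the job's program locally under your usual debugger, for example:
node inspect my_job.js
```

Any debugger works here, since the job is just an ordinary program; the only cloud-specific step is the auth command.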
Awesome, thanks.
How to pass the parameter externally from the cloud run job?
Are you asking about returning a value from a Cloud Run to the caller? That is not possible. If you want to return a value from your Cloud Run code, there are two possibilities:
1. Either your Cloud Run Job writes any return values in a database, to Cloud Storage, or sends a Pub/Sub message.
2. Or you put your code in a Cloud Run *service* (instead of a Job). Cloud Run services are triggered by HTTP calls, so they can return values when they respond to such a call.
How can i give my own execution_id while calling cloud run job?
I don't believe you can. Could you share what you are trying to accomplish? There may be another way to implement what you need.
Could someone please explain the part they skipped over: adding args for the run. On the command line you'd have something like --arg1=value1 --arg2=value2. I can't figure out how to pass them in the container arguments shown at 04:27. The job throws an error saying --arg1 not found.
Hi, to run a job overriding the args, you run something like this: "gcloud run jobs execute JOB_NAME --args ARGS". The args go in the same format as when you're specifying them in the "gcloud run jobs create" command.
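A concrete sketch of that command (the job name and values are placeholders; as I understand it, `--args` takes a comma-separated list):

```shell
# Execute an existing job, overriding its container args for this run only.
# "my-job" and the arg values are placeholders.
gcloud run jobs execute my-job --args="--arg1=value1,--arg2=value2"
```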
Cloud Run Jobs via Workflows: run the same job in parallel with overriding parameters.
Thank you for the suggestion!
Can we override the memory and CPU of the Cloud Run Job? If so, how do we do it using the client libraries?
I don't believe that's possible.
Google AI is amazing, look they are making videos already.
Okey
Understand me money all go problem
Ok