Serverless kinda sucked without this...
- Published May 19, 2024
- DISCLOSURE: VERCEL PAYS ME BUT NOT FOR THIS SPECIFIC VIDEO. I just wanted to rant about this
waitUntil and its benefits for us serverless diehards are huge. I've been asking for this forever, so I'm pumped I finally have it.
Check out my Twitch, Twitter, Discord and more at t3.gg
S/O Ph4se0n3 for the awesome edit 🙏 - Science & Technology
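Since the description never shows the pattern, here's a minimal sketch of what `waitUntil` buys you. The `waitUntil` below is a hand-rolled stand-in for the real export from `@vercel/functions`, and the `pending` array plus the drain step are invented to simulate what the platform does: the response goes out immediately while background work (logging, cache updates) finishes afterward.

```javascript
// Minimal stand-in for Vercel's waitUntil: collect background promises
// so the "platform" can keep the function alive until they settle.
// (pending/drain are hypothetical; the real API is waitUntil from
// @vercel/functions.)
const pending = [];
function waitUntil(promise) {
  pending.push(promise);
}

const log = [];

async function handler(request) {
  // Kick off non-blocking background work: the user never waits on this.
  waitUntil(
    new Promise((resolve) =>
      setTimeout(() => {
        log.push(`logged ${request.url}`);
        resolve();
      }, 50)
    )
  );
  // Respond immediately.
  return { status: 200, body: "ok" };
}

async function main() {
  const res = await handler({ url: "/hello" });
  console.log(res.body);             // "ok", returned before the log write
  console.log(log.length);           // 0: background task still pending
  await Promise.allSettled(pending); // platform drains pending work
  console.log(log[0]);               // "logged /hello"
}
main();
```

The point of the whole feature is that last `Promise.allSettled` step: without the platform doing it for you, a frozen Lambda simply drops whatever is still in flight.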
Sounds like you should just use a server
why don't you just use a 5 dollar vps
great for 99% of projects, but it’s not scalable.
@@Om4r37 how about a second five dollar vps?
@@kevgoeswoof idk how nobody thought of this before, you’re a genius 😂
And if you blow up? What happens if many users want to access your site at the same time? Your company dies if you can't handle a spike in user traffic.
It does not scale
waitUntil(...) -> but don't wait too long.
Go and ask her out.
Thank you, man, now that I've read this in the place I least would have expected to I will
Classic example of first creating the problem and then "fixing" it. 😂
So like Cloudflare Workers?
Theo loves to sell Kool-Aid after it's boiled in the sun for three summers.
Theo the type of guy to create a schedule for wearing each of his clothes
Worst of all...
#NotAVercelShill
Only talks hype, no technical insight...
@@drprdcts 🥶
Lambda ( response ) -> SQS -> Lambda ( wait until task ) = 🎉🎉
Skip the SQS, just invoke another Lambda asynchronously.
Ah, but you'd want to have retries, and maybe a dead-letter queue
Works for your own Lambdas and server-side logic, but if you're using Next.js and Server Actions, they control the infra
So, like Cloudflare Workers has had for years? Or, y'know, just invoke a second Lambda, or (at least IMO) just fire off a SendMessage to SQS, since that's far more reliable for background processing anyway, with only the minor overhead of having to await that put request. AWS also supports putting directly into SQS/EventBridge/SNS/etc. from API Gateway without you having to write even a single Lambda function (great for receiving spikes in traffic from submitting forms and such)
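To make concrete what the SQS suggestion buys over plain waitUntil, here's a toy in-process sketch of retry-plus-dead-letter behavior. All names are invented; a real deployment would configure this declaratively on SQS with `maxReceiveCount` and a DLQ redrive policy rather than writing the loop yourself.

```javascript
// Toy sketch of what an SQS redrive policy gives you: retry a task
// a few times, then park it in a dead-letter queue instead of
// silently losing it. (processWithRetry/deadLetter are hypothetical.)
const deadLetter = [];

async function processWithRetry(task, handler, maxAttempts = 3) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await handler(task); // success: done, no redelivery
    } catch (err) {
      if (attempt === maxAttempts) {
        // Out of attempts: park the task for inspection, don't drop it.
        deadLetter.push({ task, error: String(err) });
        return null;
      }
      // Otherwise loop and retry, like a message becoming visible again.
    }
  }
}
```

A fire-and-forget waitUntil callback that throws gives you none of this: the failure is just gone, which is why the queue crowd in this thread has a point for anything that must eventually succeed.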
Whatever service you are wondering about, it's most likely Cloudflare.
I had so many issues with that. Stale-while-revalidate made no sense on Vercel because you had to pay the cost of waiting for revalidation while it still gave you the stale response in the end.
Why not just use a $5 vps?
I always wanted something like this, but I didn't even think about making my own architecture on top of lambda, I'm a noob.
It just makes sense to do what you need to do, send the response, and then finish any extra stuff like updating the cache, logs, etc. Not to mention streaming.
❤ love that it didn't take you long to come around that feature, my main use case is logging and internal processing that the user doesn't/shouldn't care about
For the vast majority of devs, serverless is a massive headache with absolutely garbage ROI.
That streamData issue seems like it's going to bite many people in the ass.
logging,
sending emails,
writing stuff to the db that the user doesn't need,
There are so many times I wanted those, and it frustrated me because "it worked on local dev though"
Stop using JavaScript, problem solved. This was already solved 1000 times over in other backend languages.
@@RaZziaN1 what do you use now?
@@RaZziaN1 you are right that's why I use typescript
it's crazy seeing this video because I just spent my entire weekend figuring this out.
- Vercel should keep this same behaviour in dev (drove me crazy)
- Also having issues with adding this to multiple requests
Laravel has queues built in. You can then interface with a number of drivers. There's also a dispatch-after-response function you can use in your controllers to run arbitrary code.
The video is not sponsored by vercel BTW
I currently have a project that could use this. I'm holding back the client response artificially to wait for something it shouldn't need to wait for. Cool.
So... Like cloudflare workers had since forever?
I believe the Lambda function doesn't die after it serves a sync invocation request. It's kept around for ~15 mins to potentially serve more requests.
The deployment lives but you can't run code after you've returned a response. If you want to do that you have to invoke another lambda no?
Well, for example with dotnet runtime lambda is just a regular dotnet app so it can run what it wants in the background. I would assume it's the same for node.
But yeah, that still doesn't guarantee that secondary logic will actually eventually succeed. I wonder if the vercel invention can provide some actual guarantees
@@MegaMage79 It's definitely not hence the video. Once you return a response in a JS lambda the lambda is killed even if it's still running some other async task
@@gusryan What's the downside of just letting it continue running the other async task?
@@Yxcell it doesn't, that's the whole point of the video lol. Once the lambda returns the whole process is killed. If you want something to keep running you have to call to a second lambda before you return from the first one
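To make the fix from this thread concrete, here's a toy sketch of the "invoke a second Lambda before you return" pattern. `invokeAsync` is an invented stand-in for an asynchronous Lambda invoke (`InvocationType: "Event"`), which resolves once the event is queued, not when the worker finishes, so the first function can return without killing the work.

```javascript
// Sketch of the thread's point: to keep work running past a Lambda
// response, hand it to a *second* invocation and await only the
// hand-off, never the work itself. (invokeAsync and the function
// names are hypothetical stand-ins for the AWS SDK call.)
const invoked = [];

function invokeAsync(fnName, payload) {
  // Real code would do something like:
  //   lambda.send(new InvokeCommand({ FunctionName: fnName,
  //     InvocationType: "Event", Payload: ... }))
  // which resolves once the event is queued, not when it completes.
  invoked.push({ fnName, payload });
  return Promise.resolve({ statusCode: 202 }); // accepted, not finished
}

async function firstLambda(event) {
  const slowWork = { kind: "send-email", to: event.user };
  // Await only the hand-off; the slow work runs in the second Lambda,
  // so freezing this one after the return loses nothing.
  const ack = await invokeAsync("background-worker", slowWork);
  return { statusCode: 200, ack: ack.statusCode };
}
```

This is essentially what waitUntil papers over: instead of you wiring up a second invocation, the platform keeps the same one alive.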
The ending demo gave me nice jQuery nostalgia. The waitUntil pattern idea itself seems universally awesome though: good for the client, good for the data, a tiny bit rougher on servers (more long-running processes). Pretty good value-add from Vercel, from the end-customer and dev perspective. If this is just something we can nab and tinker with without needing Vercel hosting, as it sounds like, I'd be down to tinker with it for a couple nights and see what happens.
please promise me that you will never stop doing content!
Continuations still newsworthy in 2024. Maybe they should replace "off-by-one-errors" in the top 3 hard problems list.
My way of handling tasks after the Lambda response has been to set up a Lambda extension registered to invokes. It's been honestly pretty painless for me (Rust Lambdas).
I think they’ve been lowkey doing this with revalidateTag. Remember how they removed the await from the docs.
For JS, Deno Deploy is strictly superior in this regard
woah this is very nice
Lambda isn't that much different from CGI of old, right?
lambda is basically php
I’ve never thought of it like that before. But I guess spinning up containers to run a program is like a distributed version of CGI in a way
Seconding a few comments below. I don't think serverless itself is bad
I think CloudFlare workers is better designed for frontend SSR workloads than AWS lambda.
Dax is anti-Vercel because he suggests there's already an alternative in place? oof
What color scheme does he use?
Poimandres
Vercel is good at copying Cloudflare Workers :D
A queue would make way more sense, especially for retries. I'm sure there is a use for this, but the example you used is not the best in my opinion.
4:30 - of course it will die, you can't write to files on lambda 🤓
You can write files at /tmp 🤓
`.repeat(1024)` to fudge gzip: wasn't setting the header 'accept-encoding: identity' an option?
Why not just use a $500,000 VPS?
vercel shilling lol
Solving a pain point that doesn't really exist. Coolify + VPS = Freedom. Screw these VC-funded companies