Wow, Vercel Edge Functions are so impressive, and this TwitterBio app is the best way to dive into Vercel Edge Functions and the OpenAI API. Thank you for sharing 🙏🙏🙏🤩🤩🤩
I agree!
Visually showing the difference between the edge and serverless functions was a perfect demo!
You can stream from serverless functions too. That makes the short timeout not as big of a problem.
I implemented the same thing in an edge function at first, but needed auth and some other stuff which was kind of a pain with Edge functions, so I switched over to serverless instead.
Overall, good video man 👍
WOW 🤯 Edge functions are crazy. Thank you for sharing.
Every day Vercel blows my mind.
Inspiring tutorial! I can't wait to build something.
Funny how you guys promote something that prevents users from upgrading to Pro plan, but it's great. :D
That's so awesome! 🤩
The app is amazing but I’m wondering how much Vercel are spending on the OpenAI API
Just learnt something new. Thank you!
love it
For the edge functions frontend file at lines 61 & 62, why are you renaming the variable done twice? Why not just use it as it was destructured?
is there a way to add multiple inputs to the useChat function?
Unlock! Thanks dude.
Hmm, I'm just curious whether the solution with the while loop is okay to use in production applications? Should I treat it as a best practice, or is there a more elegant way to do it? Btw, very interesting video and topic. Until now I didn't know that something like this was even possible :D It shows the power of edge functions, thanks!
Really glad you enjoyed the video! Yep, it's okay to be used in production applications and a bunch of people have already implemented this in their apps. The only thing you may want to add to make it more robust is more error handling, but so far, I haven't run into any issues even with 10k+ people using the site.
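For context, here is a minimal sketch of the kind of while loop being discussed: reading a streamed response body chunk by chunk and accumulating the decoded text. The function name and the hand-built demo stream are illustrative, not taken from the video's source.

```javascript
// Read a streamed fetch() body to completion with a while loop.
async function readStream(body) {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  let text = "";
  let done = false;
  while (!done) {
    // Each read() resolves with { value, done }; the destructured `done`
    // is renamed so it doesn't shadow the loop variable.
    const { value, done: doneReading } = await reader.read();
    done = doneReading;
    if (value) text += decoder.decode(value, { stream: true });
  }
  return text;
}

// Stand-in for fetch(...).body — in the real app this would be the
// streamed response from the edge function.
const demo = new ReadableStream({
  start(controller) {
    const encoder = new TextEncoder();
    controller.enqueue(encoder.encode("Hello, "));
    controller.enqueue(encoder.encode("stream!"));
    controller.close();
  },
});
readStream(demo).then((t) => console.log(t)); // prints "Hello, stream!"
```

For production you would typically wrap the loop in a try/finally that calls `reader.releaseLock()` or `reader.cancel()` on error, which matches the "add more error handling" advice above.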
@@HassanElMghari That's perfect, thanks for the answer!
@@HassanElMghari Hey man, thanks for your video. Just to be clear, the "stream" part is something that isn't feasible with serverless functions, only edge functions, right? Thanks a lot!
Do you know how to fix this error: "error - Provided runtime "edge" is not supported. Please leave it empty or choose one of: experimental-edge, nodejs"?
I see this on my server and I am on Next.js v12.3.0. My route is defined in pages/api/*
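As the error message itself suggests, on Next.js 12.x the edge runtime for `pages/api/*` routes was still behind the experimental name; plain `"edge"` is only accepted from Next.js 13 onward. A sketch of the route config that should work on v12.3.0:

```javascript
// Exported from the pages/api route file. On Next.js 12.x use the
// experimental name; after upgrading to Next.js 13+ this can become
// runtime: "edge".
export const config = {
  runtime: "experimental-edge",
};
```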
So why can't you use the OpenAI streams on the lambda version? Because it looks to me like most of the speed improvements come from the streaming part, not the edge functions.
You definitely can use streaming, and you're right. In this case the major improvement in UX comes from streaming the response.
Is streaming not working in serverless functions?
After I define the API as edge using the 3-line config, I get a 500 error. Is there anything else I need to configure?
And what about the Next.js 13 stable release? Talk about that too: what you can and can't do with the new features.
I dread to think what your OpenAI bill will be.
The hype will probably die off at the end of this year.
@@erickheredia8910 nope)
I thought Next.js was using the app folder now.
I get stream responses out of order. Anyone know how to fix this?
I get a stream like this: "1. I1'm an. aspiring comedian I with'm a a passion professional for clown pun whos loves and to cheese make. people laugh2. . I'm an2 astronaut that. moon Ilights'm as a a master stand jugg-lerup and yoga comedian instructor.."
Ah, I was using a useEffect, which was causing issues (likely React 18 Strict Mode running the effect twice in development, so two readers were consuming the stream concurrently and interleaving chunks). Just make it an async button call ;)
Why aren't you talking about the new Next.js hooks like useSelectedLayoutSegment and how to use them in a project, even in TypeScript?
Question:
This video th-cam.com/video/yOP5-3_WFus/w-d-xo.html talks about how edge is really bad when you need other data requests, such as the GPT API here. Did you benchmark against a non-edge function? You're talking about speed, but I don't see any comparisons of time to first byte (TTFB). Could you elaborate?
NextJS is the BEST way to build a front end for OpenAI. Thank you very much for your contributions Hassan.
Seriously? It's literally the worst way. What other framework has hardcoded timeouts?
Serverless functions are stupid in general, but absolutely detrimental for working with slow APIs such as OpenAI. Why not have a server with a 100% uptime instead?
Deprecated.
getting this error while trying this code, any help on this?
uncaughtException: Error: aborted
at connResetException (node:internal/errors:717:14)
at abortIncoming (node:_http_server:744:17)
at socketOnClose (node:_http_server:738:3)
at Socket.emit (node:events:525:35)
at TCP. (node:net:317:12) {
code: 'ECONNRESET'