Aaron's tutorials are so good that he actually does reverse clickbait and uses "5 times faster" instead of the more accurate title, which would be "10 times faster".
Hahaha that's my goal!
Thank you for this video. We need more practical real world insight on Octane like this. Please don't stop.
I don't even use Laravel at work and still I find your videos valuable enough to watch and learn from. I love how you describe each concept in a simple manner. Great video as always.
Thank you!!
This video actually made me want to jump back on an older project, just to switch to using Octane and test it out. Your videos are really inspiring! Thanks! :)
I haven't worked with PHP for years, and I don't plan to. But I enjoy your videos so much, I still watch them.
Really good stuff! Was recently looking into Octane, and seeing these kinds of examples of how to utilize it in a clever way is very interesting. Thank you!
This is a really nice approach! I will definitely consider it in the near future.
Great video Aaron!
This is a great example of how small changes have a really big impact when they are deployed at scale!
Something that costs nothing: make it once, and in return the reward applies to every single user and request that comes in!
p.s. polite question - Are these teeth veneers or crowns and if so what brand - eMAX? Your smile is brighter than SuperNova! :)
Haha nope, just my teeth! 🙊
@@aarondfrancis Wow, DNA Supremacy! ;)
Oh my... I'd love to use that in my newest project, where I have a huge number of API connections.
Don't know much about how to code, but I love watching your videos!
Oh heck yeah! Glad you're here.
Remember that Octane has official support for first-party packages. Other packages may not work or may have memory leaks.
This will go a long way towards mitigating the terrible latency and memory costs of using multiple microservices to replace monolithic APIs. Especially where those microservices are implementing zero-trust and operating in tiers. So this is not just a performance feature; it also enables better security and maintainability by making those microservices feasible at scale.
Yup I think so too
@@aarondfrancis This shifts the friction towards the good old C10K problem, if you're old enough to remember that one ;-)
Aaron is sooooo underrated
I’m even scared to imagine how many Laravel developers have now rushed to connect Octane to their projects and what problems they will encounter due to memory leaks 😊
Doesn't need to be Octane; this should also work well for long-running PHP processes like a CLI command script.
Correct!
It's a really useful resource for Laravel. Please make more videos like this.
This might be a basic question, but when an application runs on a standard web server, where the environment is reset with each request, the connections are terminated, correct? I'm curious whether, in the case of using Octane, the connection between the server and the API server remains open continuously. Is that the case? If so, could this have any impact on the server's performance, especially when handling thousands of different requests?
Nice video.
Does this have the same effect as binding a Guzzle client instance to the DI container by doing app()->singleton(… if I use Octane?
Love this explanation👍 You just gained a subscriber.
Thank you 🥹
This also works for queue workers in a non-octane configuration
Yup!
Love your helpful and interesting content ❤
Interesting to leverage Octane like this
Aaron is best!
The question is: what if the connection was still busy? How do we know when we need to create a new one or use the existing one?
I know there must be some sort of property to indicate that but just wanted to bring up some considerations.
Thanks Aaron!
I'm sure that Guzzle handles all of that transparently!
@@aarondfrancis If so, that will be great.
Thanks once again Aaron💯🚀
With a single request you know whatever you cache belongs to a logged-in user. From my understanding, with Octane it's all shared.
So how would you go about caching logged-in user-specific output?
One way is using a request ID and then flushing the cache (memorize), creating a cache for a user within Redis or something. But I am curious if you had a nicer approach.
Also, given that everything is shared, how would you go about tracking down memory issues - if there is a leak it is bound to keep eating up RAM.
Btw, great videos, hope to get my hands on your screencasting course one day 😅
I think you can always just prepend the user ID to the cache key prefix. But this isn't necessarily caching the data, just keeping the connection open!
Do you mean memoize, as in memoization? Octane natively resets Laravel's internal state between each request. Anything else could be in a key/value cache prefixed by the user ID, enforced at the application level.
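If it helps, here's a minimal sketch of the prefixed-key idea (the key name, TTL, and endpoint are made up for illustration, not from the video):

```php
use Illuminate\Support\Facades\Auth;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;

// The user ID is baked into the cache key, so a worker that serves
// many users can never hand user A a value that was cached for user B.
$settings = Cache::remember(
    'settings:'.Auth::id(),
    now()->addMinutes(10),
    fn () => Http::get('https://api.example.com/settings')->json()
);
```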
Very good video, but why specific to laravel octane tho? I mean couldn't we apply this with octane also?
If you have a single long running process, yes! Or you make multiple requests to the same service during a single request, yes!
@@aarondfrancis sorry, I wanted to say without Octane?
Any suggestions on running octane on a shared hosting server?
There doesn’t seem much point. Octane improves performance through parallelisation of requests, but it doesn’t increase the overall speed of any one individual process. If you’re at the point where this matters, you’ve already outgrown shared hosting.
Yeah I agree with the other comment. First step would be to move to a VPS.
Awesome, there is a book on Laravel Octane? Did you read it?
Nope! I just stumbled upon this one by accident
Excellent
I think I will use octane just because of this 😂 as my app only does http calls (and a lot of them)
Nice Video.
Is there any chance of getting a connection memory leak, since we are holding on to it?
Nah, as long as you're not increasing what is stored over time. In this case the client is set once and then never again. If you had a static array you were pushing to each time then it would grow and grow and become a leak.
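To make that concrete, here's a rough sketch of the difference (the class name, base URI, etc. are just illustrative, not the code from the video):

```php
use GuzzleHttp\Client;

class ThirdParty
{
    // Fine under Octane: created once, reused on every subsequent
    // request, so memory usage stays flat.
    protected static ?Client $client = null;

    public static function client(): Client
    {
        return static::$client ??= new Client([
            'base_uri' => 'https://api.example.com',
        ]);
    }

    // A leak: this array grows on every request and is never cleared,
    // so the worker's memory climbs until the process is restarted.
    protected static array $responses = [];

    public static function remember(array $response): void
    {
        static::$responses[] = $response;
    }
}
```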
Great content, thanks!
Oh my god, I lead an internal Laravel Octane API that gets about 2000 RPS at peak hours, and half of its operations are API calls to a third party. I am going to implement this immediately and benchmark at scale.
Um please do and please report back!
@@aarondfrancis that’s the plan, currently upgrading our image to attempt to use frankenphp static builds… it’s been a bit of a process haha.
Hoping I can figure that out today and then try this tonight.
Btw: Love your videos, new viewer here and have been binging. Used to write off PHP like many others but started using modern PHP/Laravel at my new job a couple of years ago and have fallen in love.
Our tech stack is Next.js/TypeScript frontends and Laravel Octane backends.
Thank you! Love to hear that. Keep me posted! 🤞🤞
have u implemented that?
If I offer my users an API, can they just leave connections held open on my server like this? From the server side, is this kind of behaviour opt-in or opt-out?
I'm certain you could close it from your side but I'm not sure what the specifics are
This is so cool!
Is the client thread-safe? I think you can have concurrent requests with Octane
Sounds irrelevant, but I would love to know what Chrome colorscheme you're using.
I... have no idea! Haha. Just whatever is standard I'd have to imagine?
@@aarondfrancis It looks a little different than the standard ones
Does this mean Laravel is on par with NodeJS or faster when using Octane?
Impossible to say! Node is just a runtime, Laravel is a full framework
Isn't this leaking memory? Or does Guzzle clean up its connections/state at some point?
Guzzle / curl will clean them up or close them at some point
Awesome!
Is it possible to use this same approach in a Job, where we're using a worker to run the queue also? Would be interested in that greatly, since I won't be able to use Octane professionally anytime soon heh
Yes! Totally possible
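For anyone curious, here's a rough sketch of what that might look like in a queued job (the class name and endpoint are made up). A queue worker is itself a long-running process, so the same set-once idea carries over:

```php
use GuzzleHttp\Client;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class SyncOrder implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // Static, so the client (and its open connection) survives across
    // every job this worker process handles, not just this one job.
    protected static ?Client $client = null;

    public function __construct(public int $orderId)
    {
    }

    public function handle(): void
    {
        self::$client ??= new Client(['base_uri' => 'https://api.example.com']);

        self::$client->post("/orders/{$this->orderId}/sync");
    }
}
```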
I wonder, can it be used for database/Redis connections as well to gain better performance too?
Frameworks already use connection pools
RoadRunner and Swoole are not working on Windows. Please, how can I solve this problem? It requires PHP 7.1.
I could have used FrankenPHP, but it requires Docker Ubuntu and that takes space on my PC. I just have 117 GB in total and 23 GB available.
Did you create that ApiClient as service?
I'm not exactly sure what you mean "as service." It's just a plain ol class
@@aarondfrancis Like as singleton? I'm trying to figure out how to avoid memory leaks when using Octane.
Thinking of starting to use it instead of good old fpm.
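In case it's useful, here's a rough sketch of the singleton variant (this assumes Octane keeps container singletons alive between requests, which is its documented behavior; the base URI is just a placeholder):

```php
namespace App\Providers;

use GuzzleHttp\Client;
use Illuminate\Support\ServiceProvider;

class AppServiceProvider extends ServiceProvider
{
    public function register(): void
    {
        // Bound as a singleton, the same Client instance (and its open
        // connections) lives for the lifetime of the Octane worker, not
        // just a single request as it would under php-fpm.
        $this->app->singleton(Client::class, function () {
            return new Client(['base_uri' => 'https://api.example.com']);
        });
    }
}
```

Either way, the leak question comes down to growth: one fixed object that's set once is fine; it's the collections that keep getting appended to across requests that eat memory.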
Does it work with the Laravel HTTP client?
I have a video idea: measure the DB pool connection time
Is it beneficial for the Google server APIs 🤔
You had me at 10x faster
Whew, glad I put that in the title! 5x is laaaaaame
@@aarondfrancis true, I don't get out of bed for anything less than 10x these days.
Just Woow!
I don't think this is true. PHP will leave connections open for the next process to reuse. You can see it in the "connection left intact..." log message
it is true
I thought this was the octane video, but retitled.
Nope! Brand new, just for you
Very nice, my best teacher. What about an advanced course implementing a full project about anything, from writing code, to testing code, to uploading to a VPS server? It would be paid, of course.
I like that idea!
@@aarondfrancis Waiting for this surprise to be announced here or on your other links
I can only say, thank you for sharing, bro! Really cool video 😎😎👌👌
Woosh
Haha one secret trick that Java devs don’t want you to know
It's great when communities learn from each other!
So people just rediscovered Web sockets?
More like HTTP keep-alive
Perhaps there's a kinder way to have said that? Saying nothing is also an option!
@@aarondfrancis Sorry
Suffixing with " in a single specific case" would not yield the same amount of engagement, huh
Dunno
Trouble with this is... the connection remains, and many APIs usually don't like that (and can possibly ban you for doing so).
For PHP to perform better, use asynchronous calls. And do multithreading on top (more calls allowed, or you can process a call on multiple threads at once), or use microservices.
Also, your example is purely for API calls, but what about the app itself? That needs to be optimized too, e.g. DB queries, not loading things you don't need, the async/threading as mentioned.
Most sites won't make API calls to third parties, but stand on their own. Laravel is, by default, not very fast, because of the many layers/abstractions (fact); optimizing is required. The ORM often makes queries too complex, thus slowing the process down (use raw queries when this happens). I know you are a Laravel fan, and that's fine of course. It's a nice framework, but it's just not very fast out of the box. Ease of development is more important than execution speed, it seems. E.g. Symfony is much faster using the same libs (Laravel uses Symfony libs as a foundation, yet often heavily modified -> slower execution).
Also, running a webserver through PHP's built-in server is a no-no in production. Never do that (don't say I didn't warn you when things go south). Use dedicated techniques, like a real webserver with php-fpm, e.g.
This is just showing one technique for one situation. I can't cover everything in every video 😂
That's a bad design. Connections to third parties should be handled in jobs, not synchronously in-process
Eh, not always.