Great example to show that choosing the right technology will not automatically make the website fast. You have to write good code.
good code in bad tech is often faster than bad code in good tech
@@Z4KIUS Trivial performance gains like this rarely matter to begin with. Spend your time addressing issues that cost real money or adding features that make money.
Chasing tiny page load speeds is just mindless busywork.
@@JohnSmith-op7ls A good feature beats minuscule speed improvements, but at some point big speed regressions beat any feature.
@ But this isn’t about addressing relevant performance issues, it’s about pointlessly squeezing out a bit more, in a contrived demo, just for the sake of it.
why not make these features part of the framework instead?
Interestingly enough, the edge McMaster has is not that their website is so insanely fast, it's that everything you order will be delivered to your company in a couple of hours. So if you think the page loading is fast, check out their delivery, lol
Better than Amazon!
Their other edge is that they have every conceivable product. They're an all-around premium-quality service with high prices to match. When you need something specific, fast, to exact specifications and perfect every time, you use this company. When price matters more, you try your luck on Ali.
@thewhitefalcon8539 I'll say they have a massive selection, but I often do not find what I am looking for there.
A couple of hours' delivery is quite slow for Russia. Delivery here is usually 15 to 30 minutes.
@@Ginto_O You clearly don't live in a rural area of Russia. McMaster delivers ANYWHERE in the Continental US that fast.
Worked at McMaster for a few years. This kind of glosses over how we're also able to perfectly sort, filter, and serve up data on over half a million different part numbers. There's a looooot of stuff going on in the backend for this.
It’s very very impressive stuff, especially for how long it’s existed and worked. I wish more of that info was public so I could have talked in depth about it 🙃
I like how theo thinks McMaster's competitive edge is their website and not that they knock on your door with your parts 3 minutes after you complete the order. 😄
I live right next to a warehouse so for me it's more like 1 minute 😂
The craziest shit with McMaster-Carr though... is it's even fast for Australians. And we can't even buy shit from them without shenanigans.
No one cares about Australia, it's irrelevant to world affairs. Shoo.
I've used this as my go-to pat response to "can you give me an example of good web design/UX/UI" in interviews for years; it's great that it's getting attention now 🎉
McMaster-Carr Speedrun (100%, glitchless)
jquery baby, oh yaah, yo heard me right
13:45 Prefetching is great! When I started experimenting with HTMX, I immediately turned that on there as well (it supports both on mousedown and on hover, depending on your preference). Great to see that Next.js also supports it.
McMaster-Carr, shouldering the weight of America's industrial might since 1901.
So one of the things you seemed to miss was that this was a classic .NET 4.5 ASP website, so the tech for this is about 15 years old. All that JavaScript at 4:45 is auto-generated. The code behind it is much simpler.
As an engineer, McMaster is the greatest website known to man
The real magic: accept 2h delay for every change and you can cache *everything* for 2h.
the realer magic: 300ms delay for every change and caching things only after they're requested
@@PraiseYeezus realest magic: cache everything in the browser in IndexedDB and store a hash, so when the hash sent from the server to the client is different, the client downloads everything over again (rough sketch below)
@@PraiseYeezus is this actually how McMaster works??
@@xiaoluwang7367 no that's how Vercel's infra works
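Not McMaster's or Vercel's actual approach, but a rough sketch of that hash-based IndexedDB idea; the /cache-manifest endpoint that returns the current content hash is made up for illustration:

```ts
// Sketch only: cache page HTML in IndexedDB keyed by URL, plus a server-provided
// content hash. When the hash changes, drop the whole cache so everything gets
// re-downloaded. "/cache-manifest" is a hypothetical endpoint.
const DB_NAME = "page-cache";
const STORE = "pages";

function openDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open(DB_NAME, 1);
    req.onupgradeneeded = () => req.result.createObjectStore(STORE);
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

function getCached(db: IDBDatabase, url: string): Promise<string | undefined> {
  return new Promise((resolve, reject) => {
    const req = db.transaction(STORE).objectStore(STORE).get(url);
    req.onsuccess = () => resolve(req.result as string | undefined);
    req.onerror = () => reject(req.error);
  });
}

function putCached(db: IDBDatabase, url: string, html: string): Promise<void> {
  return new Promise((resolve, reject) => {
    const tx = db.transaction(STORE, "readwrite");
    tx.objectStore(STORE).put(html, url);
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}

function clearCache(db: IDBDatabase): Promise<void> {
  return new Promise((resolve, reject) => {
    const tx = db.transaction(STORE, "readwrite");
    tx.objectStore(STORE).clear();
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}

export async function loadPage(url: string): Promise<string> {
  const db = await openDb();
  const { hash } = await (await fetch("/cache-manifest")).json();

  // New deploy: invalidate the whole local cache.
  if (localStorage.getItem("content-hash") !== hash) {
    await clearCache(db);
    localStorage.setItem("content-hash", hash);
  }

  const cached = await getCached(db, url);
  if (cached) return cached; // served straight from IndexedDB

  const html = await (await fetch(url)).text();
  await putCached(db, url, html);
  return html;
}
```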
25:32 Good sir, everything here is magical if you think back to the days of vanilla JS and jQuery, but I get your point.
What about server/CDN and network costs for this amount of prefetching? How does it work on mobile clients, where there's no hover event?
There's no free lunch. The original project does the same. You can choose not to preload the images if you're worried about that, only the HTML content.
I'm gonna tell you for my company the price of traffic is easily covered by improved user experience.
Also, on mobile you can track the viewport and prefetch once an item has been visible for a certain amount of time, or use some other metric; you'd need to research it for your particular use case. Or don't prefetch images at all and only prefetch the HTML for everything (see the sketch below).
Trade-offs are always there.
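A minimal sketch of that viewport-based prefetch for touch devices; the 300ms dwell time, the selectors, and the "warm the HTTP cache with a plain fetch" approach are my own assumptions, not NextFaster's code:

```ts
// Prefetch a link's destination once it has stayed visible in the viewport
// for a short while. Thresholds and selectors here are illustrative.
const prefetched = new Set<string>();
const timers = new WeakMap<Element, number>();

const observer = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    const link = entry.target as HTMLAnchorElement;
    if (entry.isIntersecting) {
      // Only prefetch after the link has been visible for 300ms.
      const timer = window.setTimeout(() => {
        if (!prefetched.has(link.href)) {
          prefetched.add(link.href);
          fetch(link.href).catch(() => {}); // warm the HTTP cache, ignore errors
        }
      }, 300);
      timers.set(link, timer);
    } else {
      const timer = timers.get(link);
      if (timer !== undefined) window.clearTimeout(timer);
    }
  }
});

document.querySelectorAll<HTMLAnchorElement>("a[href^='/']").forEach((a) => observer.observe(a));
```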
It's an eCommerce site, so network and bandwidth costs are very very low compared to the revenue generated from sales. However, load speed is crucial. I've seen a 30% drop in CTR/visitors when my website's page load time is slow.
except for image flickering, pretty smooth UX
As a purchase manager that orders from McMaster CONSTANTLY, it's wild to me every time their website gets talked about. Worlds colliding or something lol
Looks like they are using good ol' ASP.NET Web Parts technology, which is considered dead nowadays.
Love how at 3:57 he goes off topic to compliment how pretty the nuts on this website are.
We all did things like this back in the '90s/'00s and it worked like a charm, no frameworks, no jQuery.
which site did you build that performs this well?
@@PraiseYeezus Brazilian Channel 5 (Rede Globo) covering the Olympics in 2004 is a good example; we had super tight requirements on the size of the CSS and imagery.
Basically, back in the day, at the end of the '90s and beginning of the '00s, you had to make websites that performed well because broadband wasn't widespread, especially in South America.
So it was expected that designers would know how to compress images and videos to the maximum degree possible.
Often, internet banners had incredibly low size limits, so everyone back in the day would squeeze as many KB as possible out of every single file.
Nowadays, a lot of "designers" and "developers" will put images online without even trying to compress them or make them the correct size before slapping them online.
@@PraiseYeezus for some reason my comment keeps being deleted, so I'll rewrite it briefly: I wrote the main Brazilian website (by our main TV channel, which was "the official channel") covering the Athens Olympics in the early '00s, plus many other websites. At the time broadband wasn't so popular, and everyone on the team (designers, developers, and project managers) was well aware of file sizes and compression types; most projects had strict rules for file sizes and page load times. The XMLHttpRequest API was standard, and so were different if-conditions for the popular browsers; jQuery wasn't there yet.
No jQuery before HTML5 and ES6 sounds like an awfully bad decision
This is really good to implement in an ecommerce website... it makes shopping online really, really fast.
3:57 >> that's pretty nuts
Yes, those are pretty nuts. And bolts.
This is an interesting intersection between web development and ux. The site has amazing ux and software engineering.
Tbh I don't think it feels that fast, especially for a page that is all text aside from some tiny black and white images. Getting a simple page like that to behave like the NextFaster example isn't that difficult, preloading, caching, and not doing full page reloads will get you most of the way there. The reason most websites are slower is because A. they're loading much more data, and B. their focus is on adding features and developing quickly, not trying to get page loads down to milliseconds.
95% of React (or any other JS framework) developers can't build a website as fast as the McMaster site.
The same thing is implemented in SoundCloud: when you hover over a track it loads the buffer, and once you click, the already-loaded buffer starts playing.
I noticed a small but significant tweak that probably helps a lot: B&W images. They probably get a lot of savings from compression, on top of the fact that the images here are all small. The result: the browser is done loading and rendering the images quicker.
Just about every YouTube streamer has covered this already.
The images on McMaster are actually sprites.
Great point. With sprites you are fetching far fewer images and then just using offsets.
Used to work in a machine shop and would pick up hardware from one of their warehouses regularly. Great customer service and hardware, great company.
The depths that you go to is honestly, unreal. I can only imagine what it takes to put these videos out. Kudos to you, my man!
Sid, I agree, but he conveniently missed a few key points, as he is a React/Next shill.
Few key points Theo is missing:
1) The real site is most likely making DB calls, hitting a cache (think Redis/Memcached), etc. on the backend, whereas this "fake" site is most likely only mocking data.
2) Theo conveniently missed pointing out the pricing. At scale, Vercel will suck all the money out of your pocket, whereas for the real site they'd likely just need to spin up a new EC2 instance.
@mohitkumar-jv2bx as I mentioned in my reply above, the only one "conveniently missing" points here is you.
1) All of these examples use real databases. The DB for NextFaster has millions of entries. It's a fair comparison.
2) I've seen the bill for this project. Hosting costs are under $20 a month despite the absolutely insane amounts of traffic.
@@t3dotgg just $20 for that many requests 💀. I mean, I get it, most of these are just favicons/really small requests which don't take a lot of bandwidth, but the number of requests a single user generates on this site is just absurd. So that low price is indeed shocking.
Back when websites were built by code veterans optimizing for 1ms
I wonder how this project would perform on a self-hosted environment. We all know Vercel does a bunch of special optimizations for Next hosted in their cloud. I'm guessing it will still run pretty fast, but some of these optimizations will not work out of the box or not work at all
Man... I absolutely love your honesty when doing ads! Seriously!
These videos are always fun to watch but I'd really like it if you were to put chapters in a video.
I have learned a lot from this video, more videos like this would be awesome
Too!
"Choosing a specific technology won't necessarily make your website faster"...Nextjs makes all optimizations default..."Choosing Nextjs will definitely make your website faster"
Google's page speed tool has nothing to do with site speed for the user, and everything to do with first page load. Optimizing for first page load and optimizing for general site speed are two different kettles of fish. Google has to assume the user is loading the site for the first time.
If your ceo/manager asks you to rank higher on Pagespeed Insights, show them this video.
First, the comparison between McMaster and NextFaster is not fair: McMaster actually queries the database on each product, while NextFaster downloads 10MB on the first page. This is not going to work if you have a bigger database.
McMaster tech:
1. jQuery
2. Styled Components
This proves that the slowness problems all the newcomer frameworks want to fix weren't there originally; bad coding and added dependencies are what we don't need.
Did you watch the video? McMaster loads more data than NextFaster. Next also queries a database with literally millions of items in it.
@@t3dotgg even if it does, it's not that simple. What kind of enhancements are on the database? Is it in memory? How big is it? Is it redundant? Knowing that NextFaster is all about speed, I'm 100% sure they did some hacks to make it look that good, but in the real world, hello darkness my old friend...
@@hqcart1 Why don’t you take a look? It’s all open source and they’re very transparent about how it works.
The database is Neon, which is a serverless ready Postgres provider. They provide most of what you’d hire a db admin for (backups, pooling, sharding etc)
People in the comments seriously overestimate how slow database queries are. In reality accessing a database is nothing compared to, say, network latency.
This was a good video, learnt a lot, thanks!
Great breakdown Theo!!
I'm really interested to hear why you're coming around to mousedown for interactions. I'm still in the mouseup camp but I haven't dug into it much and would love to hear what the arguments are! Future video?
Prefetching is something I'm shocked isn't more common. It used to be on lots of websites but then disappeared.
Some of those optimizations are already in Next (2 weeks later)
Wes Bos made a video on the same thing two weeks back, then Codedamn hopped on it, and a dozen others.
Um... loading a lot of JS is not always fine. At that point it only works quickly with high internet speeds, which is not something everybody has across the world. If your target customers are in the US / EU / Australia and other areas where internet bandwidth is fast, then sure, you can send more data to avoid more requests, but if your target customers are in every country, or in Africa / LATAM, then you really have to think about every byte sent to the customer.
Can Theo just appreciate a good website without dunking on it and shilling NextJS? He doesn't need to be so defensive all the time.
It's good until you want SEO; Google doesn't see the tag.
the problem is always between the chair and the screen
PEBKAC
My marketing team needs to know when images were loaded, for some reason. I need to set unoptimized on the Next Image tag, because when images are optimized by Next.js the URL has some params for getting the optimized image.
Also, they ask why the image loading feels slow :(
If you assume no malicious actors, then maybe the clients could keep track of page loads and dump them to the server in batches later on?
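Something along these lines would be one way to do it; the /track-batch endpoint and the payload shape are invented for illustration:

```ts
// Collect page-load events locally and flush them to the server in batches,
// so tracking doesn't fire a request on every single navigation.
// The endpoint name and event shape are hypothetical.
type LoadEvent = { url: string; loadedAt: number };

const queue: LoadEvent[] = [];

export function recordPageLoad(url: string): void {
  queue.push({ url, loadedAt: Date.now() });
}

function flush(): void {
  if (queue.length === 0) return;
  const payload = JSON.stringify(queue.splice(0, queue.length));
  // sendBeacon survives page unloads and doesn't block navigation.
  navigator.sendBeacon("/track-batch", payload);
}

// Flush periodically and when the page is being hidden/closed.
setInterval(flush, 30_000);
document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden") flush();
});
```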
hey blind, at 12:40 Theo shows the Next one uses DB calls
@@t3dotgg I mean, realistically the site would have had that much traffic for a few days only, no?
Appreciate you Theo, thanks for the video! 😄👍
Does fast mean more opportunities for vulnerabilities or less? Just curious your input on it.
Fast usually means simple. Simple usually means less surface area. Less surface area usually means less room for exploits. There's no hard rules here, but generally speaking, simpler = better
Will it be as fast as it is now if the 2-hour cache invalidation is removed?
Or is it playing a major role in the time reduction?
I'm not very familiar with JS, and I don't know if he showed this in the video, but I wonder what exactly this 2-hour cache invalidation timeout affects.
If things like stock and price can't update on every load, or even update live, then I get the reasons for suspecting the cache misrepresents the comparison, but I lack the immediate skills to check without outpacing my own interest.
But like, images only updating every 2 hours.
Sure, why not?
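For what it's worth, a 2-hour window like that is usually just a revalidation setting on the cached page. In Next.js App Router terms it looks roughly like this (my assumption of how it's wired up, not copied from the NextFaster repo; the API URL is a placeholder):

```tsx
// app/category/[slug]/page.tsx (sketch)
// The rendered page is cached and regenerated in the background at most every
// 2 hours, so prices/stock/images can be up to 2h stale, but navigations stay instant.
export const revalidate = 7200; // seconds

export default async function CategoryPage() {
  // Hypothetical API; the point is only that this fetch happens at most once
  // per revalidation window, not on every request.
  const products: { id: string; name: string }[] = await fetch(
    "https://example.com/api/products"
  ).then((r) => r.json());

  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```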
When the McMaster-Carr website was first created it was not fast. Back then it was faster to pull out the huge book than to access the website.
Thanks for this, it's awesome.
From Europe NextFaster doesn't feel fast. I would say McMaster-Carr feels much faster from here.
htmx prefetch can do similar hover prefetching pretty easily, including the images of the prefetched page
5 years ago I made a SPA website for my college using just Django and vanilla js and that was fast as f 😅.
I made a router: the first request downloads the full page, then for any click it downloads only the changed part of the page and the head scripts, which I attach/replace without changing the layout (sketch below).
/first-page (full page with layout)
/next-page?req=spa (only changed content not full layout)
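A stripped-down version of that pattern, as I read the description; the ?req=spa convention follows the comment above, while the #content selector and everything else are assumed:

```ts
// Minimal client-side router: intercept same-origin link clicks, fetch only
// the changed content via ?req=spa, and swap it into the existing layout.
async function load(url: string): Promise<void> {
  const res = await fetch(`${url}?req=spa`);
  const partial = await res.text();
  const target = document.querySelector("#content"); // layout stays untouched
  if (target) target.innerHTML = partial;
}

async function navigate(url: string): Promise<void> {
  await load(url);
  history.pushState({}, "", url); // keep the address bar in sync
}

document.addEventListener("click", (e) => {
  const link = (e.target as Element).closest<HTMLAnchorElement>("a[href^='/']");
  if (!link) return;
  e.preventDefault();
  void navigate(link.getAttribute("href")!);
});

// Back/forward buttons re-render without pushing a new history entry.
window.addEventListener("popstate", () => void load(location.pathname));
```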
HTMX sounds like it’s up your alley
I'm sure it feels amazing to use this site on your fiber-optic internet connection.
I’m not on fiber sadly :( I also tried it with a very slow vpn and it still felt great!
How much load can prefetching all links generate on the server? What about server and bandwidth costs?
Thanks for finally doing this
Does NextFaster pregenerate all product pages and everything? I wonder how long that takes to build. I don't think it's a fair comparison to the original site, since I don't think they are pregenerating all product pages.
The website also looks pretty good
This is awesome
The way NextFaster's images flicker in makes me feel bad.
I'm kinda scared of that request bombardment
I clicked like 5 links and got 1.5k requests
I think NextFaster missed compressing the images with Brotli.
I'm currently convincing my principal engineers to rewrite our whole website because I saw the NextFaster page the other day… wish me luck.
SvelteKit can do that if you use the default behavior that loads a link on hover. Prefetching images is cool.
I don't think it's default behavior. You do have to explicitly set data-sveltekit-preload-data="hover" on either the anchor or the body tag, don't you?
@ ok, newer versions of SvelteKit require this. I haven't generated a new project in some time. Anyway, it's dead simple to make it load content on hover.
Ah yes, be prepared for a lot of bandwidth cost, especially when using the AWS wrapper Vercel.
2:10 Isn't preloading just because someone hovers a bit wasteful? I'd want to see stats on preloads vs. clicks first.
Yes, it will be a waste if you don't click; that's the trade-off of choosing prefetching. Your traffic and billing can skyrocket if you're not careful. They can afford the prefetch to provide a better UX for their clients.
Hence, there are lots of times you don't want to prefetch.
@ a waste is a waste; to the environment it's a big deal. What's the carbon footprint of those trillions of unnecessary preloads combined, I wonder?
To be honest, hovering doesn’t exist on mobile devices which is where the concern about wasteful request network bill is mostly relevant so I think it’s a good trade off for desktop devices.
Yeah, yeah. Hover might technically exist on mobile too, but if you disable it the trade off is only on desktop.
@@m12652 Really 😅. Humans are quite wasteful too, if you're going to that length about environmental concerns.
Should we remove all toilets in the world because it's inconvenient, every time someone takes a dump, to recycle it as manure? I don't think so, and I hope humanity is not heading that way.
I think it would be in humanity's best interest not to sacrifice convenience, but to make up for the things we've been a little wasteful of by other means.
@ very educated and totally relevant... you must be one of those people that thinks israel is a country 🙄
11:20 I am not a fan of loading tons of data before a user gets to a page. Yes, it is nice for user experience, but it is not nice for users' data usage or the company's server costs.
I did see stopLoading if the mouse moves out, which is nice.
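For anyone curious, the hover-prefetch-with-cancel pattern looks something like this in the abstract; it's a generic sketch using AbortController, not NextFaster's actual Link component:

```ts
// Start prefetching a link's destination on hover, and abort the request if
// the pointer leaves before it finishes, so abandoned hovers waste less.
const controllers = new Map<HTMLAnchorElement, AbortController>();

document.querySelectorAll<HTMLAnchorElement>("a[href^='/']").forEach((link) => {
  link.addEventListener("mouseenter", () => {
    const controller = new AbortController();
    controllers.set(link, controller);
    fetch(link.href, { signal: controller.signal }).catch(() => {
      // aborted or failed; the real navigation will simply fetch it again
    });
  });

  link.addEventListener("mouseleave", () => {
    controllers.get(link)?.abort();
    controllers.delete(link);
  });
});
```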
So why did they do all of that? Wouldn't it be better to just use Next.js's built-in prefetch?
Super useful video!
Can someone please explain to me: if it's 1 million products, that means 1 million photos, which, if you are using Vercel image optimization, is around 5,000 dollars. Which of these enthusiasts paid that much?
The only reason I don't use Vercel's image optimization is that my side project makes no money, and it's not worth spending 5 dollars per 1,000 images.
You do understand that if a legit shop has a million products, it's probably way too profitable to bother about $5k
@ the profit margin in e-commerce is 30% on average.
5k is not a small amount.
@@KlimYadrintsev uh.. yes it is
I'm just really curious why we can't use an SPA version with a RESTful API instead of Next.js, especially if we're going to fetch all the endpoints in advance. I feel like we always reinvent the same wheel again and again. I remember my website that fetched HTML with sync AJAX in 2013 at exactly the same speed. Surely it wasn't as complicated to build as in Next.js with dozens of optimizations.
IMHO, there are many ways to build a website that loads faster. Surely 99% of them are easier than implementing it in Next.js.
Sorry, I just don't understand. Maybe I'm not nerd enough to get the point.
While you are objectively correct in saying that SPA + REST is superior, the fact is that Next has a significant footprint in the industry and as a result there will be content made around it
Show me one faster website built in a similar way with simpler tech. If it doesn’t have millions of pages, it doesn’t count.
And if you anticipate using mobile, having a REST API would be a big win
Sponsor? I feel Vercel (Next.js) is a long-term sponsor of the channel.
Why it's so fast: it preloads the link when the link is hovered, so when the user clicks, the page shows up immediately because it's already loaded. Thank you, I'm adding this to my knowledge now.
This is also an example that the fastest website doesn't really matter that much. We care because we look at the numbers, but what does that mean to the website's consumers?
Sometimes more functionality is more helpful than micro-optimizing stuff.
On mobile I assume they do some kind of intersection observers?
You just gave me an idea to promote some things I work on because... I write things that are both minimal and fast. I'm sure I could attain that speed, and with lower load size.
Putting it to use on my local server for an 11ty site I took navigating after initial load down to ~25ms. Mostly only took 4 lines for setup, but I had to refactor some things to deal with adding event listeners on page load. Added < 6kb to my bundle size, before compression.
Could probably get it down to like 4ms and even reduce bundle size, while making it easier to maintain, but that'd basically mean a complete rewrite of things.
That’s how the old days worked
Alright, but how do you deal with the huge weight of Google Analytics, Tags, and Facebook Pixels????
I wish they could do a fast version of jiiiraaa
A better idea is to preload cached pages with blurhashes and lazily load images afterwards. It's even faster and uses fewer resources (bandwidth, CPU). (Rough sketch a couple of comments down.)
You don't need blurhashes with progressive JPEG.
@@saiv46 not all images are jpegs
@@ac130kz but they can be.
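A rough sketch of the blurhash-plus-lazy-images idea from a few comments up, assuming the blurhash npm package and data attributes I've made up (data-blurhash, data-src):

```ts
import { decode } from "blurhash"; // assumes the "blurhash" npm package

// Paint a tiny blurhash placeholder right away, then swap in the real image
// only when it scrolls near the viewport.
function renderPlaceholder(canvas: HTMLCanvasElement, hash: string): void {
  const size = 32; // placeholder resolution; the real image replaces it later
  const pixels = decode(hash, size, size);
  const ctx = canvas.getContext("2d");
  if (!ctx) return;
  const imageData = ctx.createImageData(size, size);
  imageData.data.set(pixels);
  ctx.putImageData(imageData, 0, 0);
}

document.querySelectorAll<HTMLCanvasElement>("canvas[data-blurhash]").forEach((canvas) => {
  renderPlaceholder(canvas, canvas.dataset.blurhash!);
});

// Lazy-load the real images as they approach the viewport.
const io = new IntersectionObserver(
  (entries) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      img.src = img.dataset.src!; // real URL lives in data-src until now
      io.unobserve(img);
    }
  },
  { rootMargin: "200px" } // start loading a bit before the image is visible
);

document.querySelectorAll<HTMLImageElement>("img[data-src]").forEach((img) => io.observe(img));
```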
On its face prefetching looks good, but in most cases the number of avoidable requests and the extra load they put on the server isn't worth it, unless it's just static data that can be cached easily. If any DB requests or computation happen on the backend for those requests, then it's just a waste of resources.
Most websites that do not have UGC would benefit from this. Correct me if I'm wrong.
The HTTP response indicates that McMasterCarr uses the Akamai CDN for caching.
The Rollercoaster Tycoon of HTML
It's not 1.6 MB of JS transferred. It's only 784 KB at 4:03.
What would be cool is if it's fast on GPRS.
How do we measure the speed?
I use a stopwatch, personally
this is why a lot of web technologies feel pointless to- _OH_
Couldn't help noticing you're using a terminal called "Ghostty"; what is that?
All of this prefetching, is it intensive on a server (assuming a live production environment)?
Seems like it would be, no?
A business doesn't care if they can deliver fast products.
What font are you using in vs code?
Designers are the enemy of web performance.
Cheers Wes Bos
I used Brave Browser's Leo AI:
0:00 - Introduction to the video and the McMaster website
1:40 - Analyzing the network requests and performance of the McMaster website
5:00 - Introducing the "Next Faster" website as a comparison
7:00 - Analyzing the performance and optimizations of the Next Faster website
12:00 - Diving into the code of the Next Faster website
16:00 - Discussing the custom Link component and image prefetching
20:00 - Comparing the performance of McMaster vs Next Faster with throttling
23:00 - Discussion of potential improvements to Next.js to incorporate the optimizations used in Next Faster
26:00 - Conclusion and final thoughts
Htmx preload extension ftw
this video is sponsored by nuts and bolts !
Faster than my Figma prototype