It's not really "this generation" though, Nvidia kept their new design so under wrap that 90%+ of all RTX 5090 cards out in the wild, the non-FE ones, will be as massive as 4090s.
there is an argument to be made that it shrunk in the only direction that does not matter. slot height is kinda whatever in most cases but less wide and shorter would have been nice.
well, if you run games at lets say 120-200 fps but have a gaming 500hz monitor, you could use FG to get 500+ fps and you will have ultra smooth gameplay at low latency (even though FG *MAY* add latency, the time diference between 2 frames even at 100 fps is 10ms so it doesnt matter)
I use frame generation on my RX 7600 to run CEMU (specifically Xenoblade X) at 60fps. That way I enjoy the games locked to 30fps at a higher framerate and don't have to spent 2k grand on a new GPU that will skyrocket my electricity bill and probably trip the breakers every time I turn my PC on. Seriously, power consumption is a big factor when buying new GPUs, NVidia ones use ALL of the power to get those results, I'd rather stick with AMD for now because of the lower power consumption.
the fact that the 1080 Ti is still put on these graphs, is a testament to how many gamers are still running around with one and that it can still run with high end gaming (not incredibly well, but, at all)
That's what I still use along with 7700K, at 1440p. It does the job at low-medium settings in newer titles. Either way, building a whole new rig soon with a 5090 and possibly the new 9950x3d unless I decide to stick with Intel as I have for over 2 decades.
Upgraded from a 1070 to an RX6800 a couple months ago so I could play BeamNG drive in VR better, but the 1070 was perfectly fine for other VR titles and flat screen games. i7-6700K is the bottleneck now, but that won't be changing unless I find a CPU/Mobo deal second hand. Which won't happen in my price range I don't think.
I'm still using a 1070 Quick Silver with dual 1440 (admittedly 60Hz) monitors, and I can consistently run anywhere from medium to ultra settings on games even released in the past couple years.
@@Zeemyaohh you would be surprised. If you leave RT off, it gives similar fps to a 4060. And I am sure most people dont play at 4k and that is enough for 1080ti to run nearly all games with good enough fos to play.
It's for gaming, and productivity but certainly not AI focused. People that wanna game and/or edit videos and have lot of money to spend would buy it. For AI it is significantly worse than H100 which is also exceptionally expensive.
I have a Samsung QN900B 8K TV and I am impressed with the limited 8K results I have seen with this card. Looks to be 50 to 60% better than the 4090. I'm sure if it got this same uplift at 4K, this card would be getting really positive reviews
I always find these data-heavy videos interesting, but often find my attention strays away by the third or fourth graph. This format with different presenters covering different areas is a fantastic way to alleviate that! The regular change of pace, voice and presentation style really kept my attention - you absolutely nailed the format here, which isn't easy with a video like this!
Yeah, I'm the same. All I ultimately care about is the FPS difference, which they usually show at the start. Any other bits of info beyond that usually becomes noise to me.
I actually find it the opposite the change of presenters is ruining the flow causing me to have a hard time focusing because the change happening so often between people.
10 วันที่ผ่านมา +14
Absolutely! The different tones and delivery make you think it's a completely new section worth attention. That's why Apple is very effective with its keynotes featuring segmenting.
Ooh interesting! For me it's the other way around 😅 especially if the thumbnail shows one talent (in this case, Linus, but it can be with whomever) and then throw in a few more on screen talents, i don't like that
I think this essentially explained the differences in observed generation improvements. 30 series to 40 series: Samsung 8nm (8LPH) to TSMC 4nm (4N) 40 series to 50 series: TSMC 4nm (4N) to TSMC 4nm (4NP)
Actually, since at least 1997, "process nodes" have been named purely on a marketing basis, and have no direct relation to the dimensions on the integrated circuit.
Let me get this straight... So, Nvidia says we should want the 5090 because we get 25-30% more performance while consuming 36% more power, with a tiny chance at the the privilege to pay just 25% more than when compared to the 4090 launch? It's been almost 2.5 years since the 4090 and in that time Nvidia essentially gives a bigger die with 33% more of the same 4090 GPU cores another 33% more slightly faster VRAM, basically 33% more 4090 for 25-30% more performance and more fake/inferred frames? Nvidia didn't innovate or improve the GPU hardware one bit, every added capability or real generational improvement is the AI specific hardware, they took a bit more of their other Enterprise AI tech and bolted them on, switched to the latest Gen VRAM, jacked up the price and tells us we'll take what they give us and we'll like it. Seriously, the comments about this being a 4090 Ti are really quite accurate. This is little more than a 4090 with some more cores and newer VRAM which Nvidia had little to do with outside of deciding to use it. If this isn't proof that Nvidia is done with gaming GPUs then I don't know what else to tell you. Why waste time and resources on the consumer gaming GPU products when the same Blackwell silicon can be used in enterprise AI solutions and results in many times higher profit margins than when sold in products for gamers?
It's not proof that nvidia is done with gaming GPUs (although we already knew that), it's proof that we are reaching the end of what is even possible with silicon unfortunately. I _wish_ this was as simple as nvidia being a greedy ai company (which they are).
yeah... I skipped the 40's still have a 1080ti, and a laptop with 3070, (not by choice) with a z1 extreme handheld. bottom line, I won't buy new graphics for years. AI it's the most annoying s!
@@ShaXCwalk i went from a 2080Ti to a 4080... but what i am finding.. i dont game enough anymore to warrant 2K to upgrade to a 5090 from a 4080.. maybe the 6000 series will tickle me in the nethers
The clips of Linus suffering were honestly his best parts of this video, everyone else did so great. Let's see more short parts by him for the lulz and only apply his rabid energy to videos where professionalism isn't as important. Don't get me wrong, he can be professional, it's just that sometimes he's not.
@@AaronBlankenship Agreed kinda but I think he did that purposefully. He's talked about this a few times where he does this purposefully. Kinda pumps everyone else up if you will
You know, something that almost everyone is ignoring... The thing just has 33% more Shaders... That means it's 0% faster, just more shaders. It doesn't get more basic than that. This actually sounds like progress is at a halt. It's just as basic as AMD bringing out a 7600 variant ("even slower!") recently...
@@voydesvelado neither, buy an amd gpu like the 7800xt or the 6900xt going to be much cheaper and the only loss is ray tracing performance but you can use a tool called MorePowerTool to unlock the wattages and get alot more performance and often out perform the competing nvidia gpu
This is why I got the 4090 a year ago. When the dude from mores law is dead casually mentioned that ada was a bigger improvement to the prev gens akin to the 1080ti and Jay from J2C mentioned that Nvidia was gonna focus more on AI development over pure rasterization. This tech aint advancing much further. If there is a way to improve rasterization it will be with new revolutionary tech - which none of the manufacturers are working on. Nvidia doesnt need to, amd has given that up, and Intel only wants to be second best.
@@IdleWorker I think it’ll be reasonable to expect a bigger uplift in pure rasterisation when the next 3nm or whatever drops. But yeah, for now I think 4090 is the goat, that’ll keep on giving for years. Most games will focus on running on most other affordable cards and as Hardware Unboxed often does - some setting don’t need to be ultra. The difference between high and ultra on some setting is negligible, but comes at a big perf cost.
actually its a joke, what they mean is that Ava Lovelace is 11 letters while Blackwell is 9 letters, therefore a difference of about 18%. However I believe it is not a fair comparison since "Ada Lovelace" is a full name, while "Blackwell" is only a surname, not exactly apples-to-apples comparison.
If the upcoming M4 Ultra Mac Studio’s graphics don’t beat this, then the Mac Pro M4 Extreme will, at far lower power usage, but at least these have far better BOINC and Folding@Home volunteer computing project support, cause they’re running on Windows, so hopefully they will be sold out, cause people will use them for this. And for benchmarking this card, you do 8K ultra, no upscaling, full RT, to have no CPU bottlenecks at all.
@@renji35 The DF guys were calling it the "yass-ification" effect, lol. Nvidia is apparently aware that the AI is doing too many changes to the underlying artwork and they're working towards it complementing what's already there instead.
pretty much my thoughts, I bought a 3090 for about 700 at auction a while back and if this keeps up I guess it'll be like 8000 series before I upgrade.
Unfortunately given we’ve hit the limit for the size a silicon transistor can be ~4nm, we’re never going to see the generational jumps we used to, the engineers are limited by quantum mechanics. Not until we switch to a different semi conductor or method of computation
4090 was never stupid, it was 100% faster than the gen before it for the same price. 5090 is 30% faster for 50% more cost (Over 100% for 3rd party cards). This is not a gaming card, it's an AI card.
Its not even that great of a AI card. If you actually want to run high end LLM models you would need a lot more VRAM, which Nvidia is artificially lowering.
that's why everyone should be buying 5080 instead 5090, only if you work with ai that card is viable, but if it only for gaming, 5080 is the way to go for half of the price
Even 20 series top cards still cost a liver and a leg, and mostly just rich people buy them just to boast "I have the xx90 card!!!", while in reality 1060 would be more than enough for their needs.
21:25 The 4090 was an outlier for 40 series with lower tier cards offered a much lower performance bump... Yeah, but the 50 series will offer a much lower performance bump for it's lower tier cards... Again! (Hypothesised from the fact that 5090 = 30% jump over 4090, using 30% more cores and 30% more power, but the lower tier cards have around 7-10% more cores, so they'll offer.... erm.... 7-10% more performance?! Duh! Meaning that the 5090 will just be even more "outlier" than the 4090. Boo :(
One of the few groups of people actually need this kind of GPU processing power is the VR Flight Sim guys, a VR headset is eqvivalent to running a 6-8K monitor, and you need >60fps to make it smooth, >90fps is preferred.. So: Please add some testing with DCS in VR for future videos, thank you!
We just going to ignore other vr users? Also, the only vr head sets that have the same level of pixels as a 6k-8k monitor is like pimax or meganeX. The rest like the index are sitting at 4k equivalent of pixels or under
I clicked your timestamp and was disappointed it wasn't a 50-60Hz hum. Fewer people would have picked up on it but for me it would have been the perfect transformer sound effect.
This year (including a bit of Q4 of 2024) feels like the release of tech that is marketted as generational leaps while being more updated iterations at best. Playstation pro, Galaxy S25, 5000 Series cards, etc. I get why they need to do this as they can't give space to competitors to have better specs and also lose their "top of the line" statuses but technology doesn't always leap forward in time for releases. But it's pretty annoying when it's getting to a point where getting the latest version of something isn't just a question of cost, but even quality compared to the previous one.
Does AMD have anything equivalent to Nvidia's NVEC? I render a lot of 4k video and NVEC is a lot faster than CPU processing while maintaining the highest possible quality. I even have a 10 core 20 thread I9 and Nvidia NVEC blows it out of the water by like 20x the speed in Vegas Pro over just CPU rendering... I would really hate to give up such great 4k video rendering speed if I were to switch to AMD cards.
@@e5m956 they don't sadly, at least not until you switch to AV1. Funny enough some people will use AMD for their main GPU then pick up some old cheap very base level Nvidia card with NVENC that they can keep in a second x16 slot just for encoding as needed, lol.
@winnah9000 Thanks for the reply. I'd rather just stick to a single card solution. AMD should consider something similar. I do a lot of GoPro filming on the weekends ATV riding and drone flying. I'm pretty much a 50/50 split with gaming and video editing so both are important to me. I miss the ATI 9700 pro days when we had real strong competition with Nvidia. 😢
@@e5m956 10850k or 10900k? Those got a lot of hate, too. I try to tell people that nVidia cards are more aimed at professionals than casual gamers. These cards are 100% worth it for people that need real computational power rather than chasing FPS in a video game.
Elimination of the Moire patterns 12:15 could be a pretty big deal. That specific type of artifact is easily the most noticeable and distracting for me.
The 7900XTX lags behind in many games by only around 20 FPS, and that's AMD's last generation? Hard pass on the 5090 for now. Since I'm not a 4K gamer or interested in AI, I'll wait for AMD's next release, compare it to the RTX 4090, and pick one of those as my upgrade.
There really is “good enough” for most people. Good enough for me, at least right now, is a 7700x with a 7800xt. But there’s some great gaming ahead for people that aren’t rocking a 5090.
$2,000 is fucking insane for a card, let alone that the performance increase doesn’t meet the price increase. I can’t believe NVIDIA even released this thing 😂
Ay. I extremely appreciate the section with the video editor! I was shopping for a CPU recently and found a distinct lack of mentions for productivity and too much focus on “gaming performance”.
The 50 series is a huge letdown. They're good cards of course, but the way Nvidia talks about this stuff, you always expect a revolution, and you get an evolution. These are 40 series cards with a crapload of power pumped into them to keep stable. This is not exactly ideal. Nvidia is way too distracted with side projects. They've got 5000 features they're working on, and only a small handful of them ever pan out. Don't get me wrong, I love Nvidia. If it weren't for them, we'd be in a world of hurt. However, when it comes to just raw 3d rendering, we're getting screwed. I bought a GTX1080 for $500 in 2017, and today that will buy me a 4070. The 4070S is another $120. 8 years and only 85% improvement. That's insane. There was almost the same improvement from the 900 series to the 10 series, and that was 2 years. People always ask me "dude how can you still run a GTX1080 and a 7700k?!?!" and it's because I have a Gsync 144hz display, and in competitive games we all turn down graphics settings so I get lots of frames, and in games that aren't competitive, I have Gsync. So really nothing will change yet. Not until I can afford a card that will drive 1440p at 240hz+ for affordable prices. I'm not paying a thousand dollars for a graphics card. I have other hobbies.
It would take physics hardware acceleration and improvements, multi-die GPU's, FPGA scaling built-in, large DisplayPort bandwidth increases (at least 4X), super-sampled output mode (3DFX style), lag free chequerboard rendering for 4-8K, native rolling-scan modulation at the GPU frame buffer level, as well as frame-stacking, hyper low latency mode with native rendering, features for legacy PC games, I mean there are so many useful features that could benefit people, but these that would require actual R&D and actual features for your thousands of pounds, when they can just tack A.I. on instead and convince people that passive frames equal your monies worth.
It looks like it has all been said already but "our biggest review" is no lie! What a phenomenal job you guys have done here! The sponsor segments, the different hosts with their own bite-sized subjects, the great differentiation between raw performance and DLSS performance, the great graphs including very solid breadcrumbs on where to look, the exact right amount of humor, objective fact and opinion. Genuinely fantastic, great to see!
The production value on this video was fantastic. Loved including all of the different hosts, the sponsor segment was great and it kept you engaged. Great job!
The Anti-Trust law needs to kick in and NVIDIA needs to be broken up so that there is no fleecing of the customers and no shortage of supplies. This is getting ridiculous. Why was the government sleeping till now?
You know what would be a cool experiment: power limit all GPUs to one share value (200W or 300W or whatever would be to have at least 8 GPUs in test) and see how much you can get. I do not know if LTT can pull out such a complex test, but it would be very interesting.
How is this any different to just monitoring how much power each gpu uses at peak and dividing down frames by watts? Making the Watts equal in real life is pointless, thats what Maths is for.
Unfortunately I believe power limiting isn't as simple as that, having a 500W+ card running at less than half the power would have massive stability issues
@Alt-gy7se but the power usage in general can be real different. one GPU can be using 500 watts continuously throughout a scene, but another can be using a varying 200 to 400 and spike to 600 a couple times during the same scene. now it will rarely be that extreme, but I can see very clear differences in power consumption with my cards, and I'd love to see some more comprehensive testing to that, especially since power costs and GPU power consumption are on the rise!
This is why I love Linus tech tips, they don’t milk Gpu and cpu news for 2 months straight with 2 videos a day with no useful info. Instead they post the day of and post a banger video
Gamers Nexus puts out the best hardware review data and they are out now as well. this site is more tech news and entertainment than serious data points.
@@toddblankenship7164true, but no one is falling in love with tech via tech Jesus. Steve’s crowd is us bedroom dwelling number nerds. LTT has a format that invites new interest. It’s good that both have their niche
@@toddblankenship7164 TECH JUDAS is no longer relevant. We have LTT and Hardware Unboxed for all our Benchmarking needs. And Steve's lack of professionalism now makes me question everything he does, not to mention he is annoying and insufferable. So NO THANKS!
@@ronniebots9225Welcome to the future of Internet comments, where anything that obliquely references AI being used for graphics will be accompanied by a lame, low intellect response talking about how games will no longer be optimized ever.
No matter how much data is used or how many times the AI model is trained, it will never be flawless. Issues like ghosting, flickering, and others will always persist, as game content is dynamic and unpredictable.
Of course, but also keep in mind that it doesn't necessarily need to be flawless - only good enough that people either don't notice, or think that the tradeoff is worth it. I'm not saying that it's there yet, but there have been appreciable generation over generation improvements with these models and different people are going to have different points at which that can happen.
true for frame gen, but the new upscale method in DLSS4 is a massive improvement. At this point I just don't care about Path Tracing, I just want to play at 1440p with dlss quality above 90 fps, no ray tracing if possible. I'm only upgrading my 3070 because of the vram, it still holds up in raterization tho
The 4000D test with the new cooler style is such a thoughtful addition to the review. Its not something that was 100% necessary but really was a nice tidbit to know and consider. It really shows the thoroughness of the review.
@minus3dbintheteens60 I think it's probably enough time to get at least an idea of what that style of cooler could do to performance. I think people who are running that style of case and cooler are pretty unlikely to hit the CPU and GPU at that level in daily life for most workloads and games. Of course more data is better but for the conclusions they make with the information and the intended audience of this review it's enough. It answers the basic question on whether you should be worried about this cooler style affecting thermals. For people who are likely to hit the CPU and GPU that hard then It informed them that they should keep that in mind. They now have something to consider and look into when they look at other reviews of the product. Of course more in depth data is nice but realistically this is plenty for the type of person this review is intended for. For a review that needs to hit a release deadline it gives enough information to be useful while not slowing them down too much.
I'm genuinely curious, who's tradition is this? I personally don't care about 8k because it won't matter to my eyes and mostly doesn't exist but I'm not going to yuck your yums
I'll be honest, these tech videos don't really make sense to me but I still enjoyed these. I'm learning little by little with each video. As well this is the first time I see Oliver Cao in a video, but he talks like he's been doing these videos for years.
One way to explain the tensor cores and neural shading is as follows: The neural networks can be used to approximate extremely complicated functions for which we can't just write simple equations. However, we can create those networks with a simple procedure merely by showing them many examples. So for example we can show it what rocky terrain looks like at all levels of detail, and the network will learn how to do that, using less data (just what the network's values are) than trying to store all the terrain (what the network computes as output). It's sort of like procedural terrain, but learned instead of programmed. The tensor cores are what are used to efficiently compute those trained neural networks on the GPU. They're not the same as geometry or pixel shaders, so they need their own specialized cores for the task.
That's exactly why they always say to check out different reviewers. No one can test everything in such a short time and there are different perspectives out there
I love how watching GN, LTT and HUB will give you 3 completely different reviews, with some overlap that says the same things but all 3 have such different perspectives that I feel like they were all necessary. Some to get confirmation of the main results, some other to get morenrelevant information. Good job everyone! And then there's my country's version which is also very good, and covered some more stuff but mostly how it fits in the Italian market. Shoutout to Prodigeek, despite the annoying clickbaity title, thumbnail and first 10 to 20 seconds of their videos, the rest is fun, informational and cool.
It's what life and politics etc should be like. We agree on the facts, our opinions on what is good and bad about them can differ. But the facts stay the same.
"The 5090 is overkill when you play in 1440p" -I fully get it and partially agree, but in many games, it cannot get even the 144Hz of the current monitors, let alone 165Hz/240Hz (in full preset without RT). Now imagine 4K. We just started getting into that resolution. My 144Hz monitor supports 165Hz OC via DSC, but I have it at comfortable 100Hz which I love. Plus, capped frame rates and the GPU not only stays in cool levels, but the capping also helps in smoother gameplay since the GPU can be better in synchronization with the CPU. You can look it up, here, on YT. Capping the frame rates is good, end of story. Power consumption way less and the sync with the GPU culminates in smooth response times. It's like one straight line.
THANK YOU for finding ways to take all of this incredibly technical and complicated techno jargon and turn it into an easily digestible 20 minute video. This is one aspect that I've found *many* other outlets are having a hard time doing for this release specifically, either simplifying is way too much that It becomes almost uninformative, or going way too into the weeds and confusing the hell out of me.
The AI performance comparation at 5:36 is wrong! For 4090 value is provided for 8 bit quantization, but for 5090 it is 4bit quantization. For 8bit I would expect half of presented 5090 AI performance.
To be fair, the 4090s fp4 performance would be the same as it's fp8 performance due to not having it natively. Ultimately Nvidia is going to advertise the largest possible numbers, which for this generation is fp4. Ultimately, from my (admittedly fairly quick) research, both numbers are kind of BS since most models use FP16, but that's why we have independent reviewers to give more realistic performance comparisons.
@@BelacDarkstorm it’s real marketing bs unfortunately. Most likely the uplift in fp8 is only core counts where as fp4 is where the money is at. So inference in fp4 when the model fits within one card is gonna be where most benefit is at. During training a good chunk is in fp16 still for accuracy so there’s less gain in raw speed, though the added vram can allow for bigger batch size in training or bigger models. Not all bad but really depends on the use case
@@BelacDarkstorm I thought the term TOPS is used by default for integer operations (and TFLOPS for floating point). Both cards supports int8 and int4 nativelly, but 4090 performance is specyfied in int8 and 5090 in int4. Anyway this is another ugly manipulation from nvidia. The same story like comparing "performance" between fg 2x and fg 4x.
The guy who ran through the benchmarks made me feel super uncomfortable with secondhand embarrassment. He was trying way too hard. Muted it for his section.
True! Also nice seeing you here Phil, i suspect you'll be doing a magnolia county stream/video testing the 5090's performance once you get your hands on it?
I love that all the writers had a bit to cover as part of the review. But my favorite by far is that Alex just has a hammer for some reason? Perfect, never change.
@Random_dud31 There's a section in the video that cuts to him and he says "to break down the details" and he sort of raises the hammer, so I think that's why he had it, not sure if there are other hammer related lines.
The biggest problem with all the gaming focused features and tech, is that I simply don't need or care about them when actually playing a game. They may look cool in tech demos, but I never want to turn the settings on because it doesn't look or feel better to play an overly polished, shiny or blurry game.
With scalper prices it will be $1,000 cheaper but the up charge for the 5080 will be $2,000 and the 5090 will be $3,000. We aren’t going to be getting these for MSRP due to scalpers.
I think the simplest way to explain convolutional neural networks is: for every pixel, a new pixel value is calculated based on its nearby neighbours. E.g. if every pixel is replaced with the average of its neighbours, you get a blur effect. Think of a visual illusion where you see some broken lines and your mind fills in the missing parts of the line, that's how these CNNs would be used to preserve detail as they upscale. However, it's a lot of computation, and a challenge to preserve scene integrity because the neighbourhoods are small compared to the entire scene. Transformer models allow arbitrary areas to serve as context when computing new pixel values; think of it as like having configurable neighbourhoods to input to that computation. E.g. when trying to upscale the left side of a road in the game's scene, that computation can also refer to the right side of the road all the way across the screen.
I appreciate the AI benchmarks but why are all the LLM tests with 7B and 13B models? I can run those on my 4060, no one is buying a 32GB card to run models that small
Absolutely. They should highlight that this card can run bigger models. Until we get new quadro released, this card can be the sweet spot for running big models like llama 70b at a 'value price' at home
@ Sure, but it's priced like Quadro with some Quadro software features missing. And let's be real, this is "gamer" card, so you are still only supposed to run games, not AI models with it and without some dumb trickery it can't even run latest titles at stable 4k 60 fps. It's beyond pathetic release and IMO massive L for nV.
You're looking at it wrong. You're paying 30% more for 30% more performance. For people on that level of budget, it's totally worth it! And it will sell very well, just like the 4090 did.
@@costafilh0do you have any understanding of how technology works? Over time it gets better. So you can expect better performance for the same money after two years. If they had simply put the price up every year on GPUs based on performance increases since the 90s then a 5090 would cost tens of thousands of dollars!
One aspect that often goes overlooked is the quality of software, the card comes with cutting-edge CUDA cores, RT cores, and Tensor cores, the potential of these advancements can be undermined if the accompanying software is not properly optimized.
Corrections:
6:14 - Jake says "GDDR6 uses Pam4" when instead it should be "GDDR6x uses Pam4"
@@LinusTechTips great now I have the wrong data
too late now i'm not buying (i'm broke)
Uh oh, Steve gonna blow a gasket.
Time to find a new editor
5:43 +66% TLFOPS
the biggest advancement this generation was the cooler shrinking to 1/2 the 40 series sizes
It's not really "this generation" though, Nvidia kept their new design so under wrap that 90%+ of all RTX 5090 cards out in the wild, the non-FE ones, will be as massive as 4090s.
So I should get a 4090 ong New PC
there is an argument to be made that it shrunk in the only direction that does not matter. slot height is kinda whatever in most cases but less wide and shorter would have been nice.
@internet2oup650 they were able to shrink it because they are pushing fake frame.
now they just need to work on the power efficiency so it doesnt need to draw 600 watts by itself
"frame generation work's the best when it makes the least sense to use it"
Go figure
well, if you run games at lets say 120-200 fps but have a gaming 500hz monitor, you could use FG to get 500+ fps and you will have ultra smooth gameplay at low latency (even though FG *MAY* add latency, the time diference between 2 frames even at 100 fps is 10ms so it doesnt matter)
True but as long as you’re really above 60 it’s still a good-great tool. It’s a net benefit to majority of people still.
I use frame generation on my RX 7600 to run CEMU (specifically Xenoblade X) at 60fps.
That way I enjoy the games locked to 30fps at a higher framerate and don't have to spent 2k grand on a new GPU that will skyrocket my electricity bill and probably trip the breakers every time I turn my PC on.
Seriously, power consumption is a big factor when buying new GPUs, NVidia ones use ALL of the power to get those results, I'd rather stick with AMD for now because of the lower power consumption.
That's how life works. A bank will only lend you money if you don't need it.
My entire PC costs this much, lmao
You can get 5 of my rtx 2060 i7 9750h gaming laptops (i got it for 400 secondhand), all in AUD which is more inflated then USD lmao
@@nahby holy send the link
Built a rig with a 3070, 5700X3D, 32Gb DDR4, and a custom hardline watercooling loop for less than RTX5080 retail prices
I know right, lol, i could build like 3 PCs with that
@@nahby I mean that's a pretty old system
the fact that the 1080 Ti is still put on these graphs, is a testament to how many gamers are still running around with one and that it can still run with high end gaming (not incredibly well, but, at all)
That's what I still use along with 7700K, at 1440p. It does the job at low-medium settings in newer titles. Either way, building a whole new rig soon with a 5090 and possibly the new 9950x3d unless I decide to stick with Intel as I have for over 2 decades.
Upgraded from a 1070 to an RX6800 a couple months ago so I could play BeamNG drive in VR better, but the 1070 was perfectly fine for other VR titles and flat screen games. i7-6700K is the bottleneck now, but that won't be changing unless I find a CPU/Mobo deal second hand. Which won't happen in my price range I don't think.
I'm still using a 1070 Quick Silver with dual 1440 (admittedly 60Hz) monitors, and I can consistently run anywhere from medium to ultra settings on games even released in the past couple years.
The 1080 Ti has been incredible value for me
Mate im still runnin a 980ti with no issues and have played rust tournament's. These prices are fucking stupid
Great now I can finally afford a 1080ti
@@priyanshusharma1812 what could that even play now adays 😂
@@Zeemya most games
@@Zeemya Nothing that focuses on graphics over gameplay but from experience, everything fun.
@@Zeemya you would be quite surprised😁
@@Zeemyaohh you would be surprised. If you leave RT off, it gives similar fps to a 4060.
And I am sure most people dont play at 4k and that is enough for 1080ti to run nearly all games with good enough fos to play.
Don't forget, those cards never sell at MSRP. Even the cheapest 4090s are still selling above the announced price today.
That's what might make me buy one if I see one at msrp. Seems you'll be able to get your money back two years later unlike on the lesser cards.
@ali-volcano well depends if you get the founders card the price should be at msrp but other model prices will vary
That's why I buy directly from Nvidia. I also don't buy from third party sellers on Amazon.
Because they aren't making them anymore. Go back to November when it was still in production and prices were lower.
I was like "Really?" And yeah. In germany buying a new 4090 sets u back about 2600€ today
Unironically the video editor might be the best presenter in this video :D clear spoken and with a nice tempo
Right? He was so clear
but please give him a new chair lokk at his headrest xD
I would love to see the editor more, as a host even more so.
Okay I noticed this too but I thought I might’ve thinking too much into it glad I’m not the only one he should be on more!
Thought the same thing. His voice is so articulated and enunciated. VERY easy to listen to. Very calming.
Let’s all agree this ain’t a gaming card. Is a grab for AI market.
That's not a bad thing. I'm less interested in gaming, and more interesting in doing some interesting science computation on it.
@@marc.lepage its marketed towards gamers for gamers
It's for gaming, and productivity but certainly not AI focused. People that wanna game and/or edit videos and have lot of money to spend would buy it. For AI it is significantly worse than H100 which is also exceptionally expensive.
I have a Samsung QN900B 8K TV and I am impressed with the limited 8K results I have seen with this card. Looks to be 50 to 60% better than the 4090. I'm sure if it got this same uplift at 4K, this card would be getting really positive reviews
It's not AI focused. Nvidia just isn't focused on the gaming market because amd isn't putting in enough effort
5:33 “Change: -18% letter count” lmaooo that’s awesome
Twice the money and Less Letters?!?!
Twice the money? @@JJVernig
Yeah I missed the entire chart, I was just laughing at that bit.
Hahaha, I was just going to comment on that
So it still cannot play Wukong at 60 FPS on 4k... got it.
Ouch!
to be fair that's just 1 game so far : p
@@TheRafflesboy for $2000
Another Souls-like game made with badly optimized UE5 🤢🤮🤮
What is that never heard of it
I always find these data-heavy videos interesting, but often find my attention strays away by the third or fourth graph. This format with different presenters covering different areas is a fantastic way to alleviate that! The regular change of pace, voice and presentation style really kept my attention - you absolutely nailed the format here, which isn't easy with a video like this!
Yeah, I'm the same. All I ultimately care about is the FPS difference, which they usually show at the start. Any other bits of info beyond that usually becomes noise to me.
I actually find it the opposite the change of presenters is ruining the flow causing me to have a hard time focusing because the change happening so often between people.
Absolutely! The different tones and delivery make you think it's a completely new section worth attention. That's why Apple is very effective with its keynotes featuring segmenting.
Agreed this is so much better than monotone steve burke going over lame graphs for 45 mins
Ooh interesting! For me it's the other way around 😅 especially if the thumbnail shows one talent (in this case, Linus, but it can be with whomever) and then throw in a few more on screen talents, i don't like that
I think this essentially explained the differences in observed generation improvements.
30 series to 40 series: Samsung 8nm (8LPH) to TSMC 4nm (4N)
40 series to 50 series: TSMC 4nm (4N) to TSMC 4nm (4NP)
4N is not 4nm its actually 5nm
40 series was actually 5nm, Nvidia just lies about it's node sizes (just like it claimed 3000 was 7nm)
Actually, since at least 1997, "process nodes" have been named purely on a marketing basis, and have no direct relation to the dimensions on the integrated circuit.
But it's easy to see the gap in manufacturing processes is greater between 30 series & 40 series.
You're only right about the Samsung part but I'll wait for 6090Ti with 2nm EUV and CoWoS packaging~
That sponsor segment is an absolute gem
True Fr Fr
I just shidididididid all over the corridor ☹️ (I wasn't quick enough) but anyway, I agree, that segment was great
The Dennis sponsors are always amazing
My brother in the background spat his water
Dennis Kills it everytime😂
Let me get this straight... So, Nvidia says we should want the 5090 because we get 25-30% more performance while consuming 36% more power, with a tiny chance at the the privilege to pay just 25% more than when compared to the 4090 launch?
It's been almost 2.5 years since the 4090 and in that time Nvidia essentially gives a bigger die with 33% more of the same 4090 GPU cores another 33% more slightly faster VRAM, basically 33% more 4090 for 25-30% more performance and more fake/inferred frames?
Nvidia didn't innovate or improve the GPU hardware one bit, every added capability or real generational improvement is the AI specific hardware, they took a bit more of their other Enterprise AI tech and bolted them on, switched to the latest Gen VRAM, jacked up the price and tells us we'll take what they give us and we'll like it.
Seriously, the comments about this being a 4090 Ti are really quite accurate. This is little more than a 4090 with some more cores and newer VRAM which Nvidia had little to do with outside of deciding to use it.
If this isn't proof that Nvidia is done with gaming GPUs then I don't know what else to tell you. Why waste time and resources on the consumer gaming GPU products when the same Blackwell silicon can be used in enterprise AI solutions and results in many times higher profit margins than when sold in products for gamers?
Brokie
It's not proof that nvidia is done with gaming GPUs (although we already knew that), it's proof that we are reaching the end of what is even possible with silicon unfortunately. I _wish_ this was as simple as nvidia being a greedy ai company (which they are).
My only problem with your comment is the vram being "slightly faster" because it isn't, the vram is quite a considerable amount faster
I think similar to phones it makes less and less sense to upgrade each generation since the performance bump is plateauing.
@@dnegel9546 ur rage bait is shit. put my fries in the bag lil bro
I guess my left kidney can wait until the RTX 6090 launches, not worth the upgrade
Nik arent u on a 2015 IPS u got bigger issues
whats wrong with an IPS panel from 2015?
more fps = more gaming tho
@@ifox11CZ performance per dollar with overpriced cards tho is not worth it
What do you have now for a GPU?
With every new gen of graphics cards, the poorer i feel. These prices are getting out of hand.
yeah... I skipped the 40's still have a 1080ti, and a laptop with 3070, (not by choice) with a z1 extreme handheld. bottom line, I won't buy new graphics for years. AI it's the most annoying s!
The cheapest 4090 on Amazon right now is $2,497.
ololol
First time PC Game rig buyers and ppl updating from outdated PCs might find value in a 4090 prebuilt
Who buys anything off Amazon? 😁
Everything is 20-50% more expensive..
Yeah because Nvidia stopped making them. It's a complete ripoff.
That's why I'm saying People don't realize. These 5090's will not be sold for less than $5k a pop.
I remember when the 20 series dropped and we were like "oh wow it's over $1000 but I guess the ray tracing is a big generational upgrade". lol
Me coming from a 2080Ti and wanting to upgrade to 5090... I feel like I will do the same mistake again. But it's time I have to upgrade
@@ShaXCwalk i went from a 2080Ti to a 4080... but what i am finding.. i dont game enough anymore to warrant 2K to upgrade to a 5090 from a 4080.. maybe the 6000 series will tickle me in the nethers
@@donywahlberg who is we
Well, it was a huge generational upgrade
Nobody said that for the 2000 series.
theres something so epic when ltt make multi-host per segment videos, each host talking about the things that we know they are best at.
@@ViewportPlaythrough I agree. I really liked the style of the video.
The clips of Linus suffering were honestly his best parts of this video, everyone else did so great.
Let's see more short parts by him for the lulz and only apply his rabid energy to videos where professionalism isn't as important. Don't get me wrong, he can be professional, it's just that sometimes he's not.
@@AaronBlankenship Agreed kinda but I think he did that purposefully. He's talked about this a few times where he does this purposefully. Kinda pumps everyone else up if you will
TBH - this really just reminded me of why I unsubscribed from LTT a few years ago lmao
@@ratherlazy It's very jarring switching between presenters/voices. Don't like it. I like to watch LINUS only or go elsewhere.
You know, something that almost everyone is ignoring... The thing just has 33% more Shaders... That means it's 0% faster, just more shaders. It doesn't get more basic than that. This actually sounds like progress is at a halt. It's just as basic as AMD bringing out a 7600 variant ("even slower!") recently...
so basically at scalper prices these are gonna be completely not worth it coming from the 40 series
Its never worth it to upgrade everytime a new generation comes out
They aren't worth it at rrp.
There will be zero perceivable benefit with the fps counter on screen.
@@Mario-fg6ns should I buy 4 or 5th generation for my first pc?
@@voydesvelado neither, buy an amd gpu like the 7800xt or the 6900xt going to be much cheaper and the only loss is ray tracing performance but you can use a tool called MorePowerTool to unlock the wattages and get alot more performance and often out perform the competing nvidia gpu
@@voydesvelado5th gen. Why would you buy 4th gen for similar prices on a new build.
~20-30% more REAL frames on average for like 30% more power usage isn't great...
The 4090 was like ~80-90% more for the same power as a 3090ti.
@@MrXaniss 4090 ti bassicaly
more like 60% lol
The 4090 was an anomaly to be fair, at how much better it was than last gen. Usually the uplift is less.
This is why I got the 4090 a year ago. When the dude from mores law is dead casually mentioned that ada was a bigger improvement to the prev gens akin to the 1080ti and Jay from J2C mentioned that Nvidia was gonna focus more on AI development over pure rasterization.
This tech aint advancing much further. If there is a way to improve rasterization it will be with new revolutionary tech - which none of the manufacturers are working on. Nvidia doesnt need to, amd has given that up, and Intel only wants to be second best.
@@IdleWorker I think it’ll be reasonable to expect a bigger uplift in pure rasterisation when the next 3nm or whatever drops. But yeah, for now I think 4090 is the goat, that’ll keep on giving for years. Most games will focus on running on most other affordable cards and as Hardware Unboxed often does - some setting don’t need to be ultra. The difference between high and ultra on some setting is negligible, but comes at a big perf cost.
Now, this is a proper review not just a collection of graphs. Every aspect of the new GPU is covered. Excellent job guys.
Clearly you haven't seen a Hardware Unboxed review. It literally has all the metrics anyone interest in such a product could wish for
@@ThatNorma Does this change what he said or make it wrong?
@@ThatNormawho asked
Or Ray Tracing Foundry's absolute Nvidia shilling. Balanced criticism and praise I would say.
Linus Tech hardware reviews are flawed, so take them with a grain of salt
wow i really thought the 5090 was gonna do better compared to the 4090 .. this video saved me from buying the 5090 lol
The intro with that 5090 cake and optical inclusion is insane!
I think it's an AI tool or something that Dennis uses
I think it was AI, but still funny
And creative
Unless he managed to unleash the ghostblade technique, it's an AI video.
it was AI gen as a joke, geez c'mon people, get with the AI program.
@5:33 the “-18% Letter Count” got me so good. well played editors!
they needed to add for all of them lol
actually its a joke, what they mean is that Ava Lovelace is 11 letters while Blackwell is 9 letters, therefore a difference of about 18%. However I believe it is not a fair comparison since "Ada Lovelace" is a full name, while "Blackwell" is only a surname, not exactly apples-to-apples comparison.
@@donttouchthewatch645 NO SHIT SHERLOCK
@@donttouchthewatch645 Good point, Lovelace to Blackwell is a slight increase.
the video editor slayed in this video - the flow, the quality and the animations are spot on. Well done anonymous person
I would hope so. Nvidia is paying $$$$$ in order to market this GPU.
If the upcoming M4 Ultra Mac Studio’s graphics don’t beat this, then the Mac Pro M4 Extreme will, at far lower power usage, but at least these have far better BOINC and Folding@Home volunteer computing project support, cause they’re running on Windows, so hopefully they will be sold out, cause people will use them for this. And for benchmarking this card, you do 8K ultra, no upscaling, full RT, to have no CPU bottlenecks at all.
@ncard00 M4 graphics will NOT beat the RTX 5090. You're smoking crack.
4:53 I do not think that Neural Face is an improvement. It just looks more fake and more uncanny.
it looks like those ai singing videos where they take a picture and make it sing lol
Oh I thought it was a big improvement! it's not perfect, but way better than those jittery face jerks before?
@@renji35 The DF guys were calling it the "yass-ification" effect, lol. Nvidia is apparently aware that the AI is doing too many changes to the underlying artwork and they're working towards it complementing what's already there instead.
it's so uncanny valley, mostly in the face. It's very odd and a bit blurry
@@renji35 Dame da ne... dame yo... dame na no yo...
that's exactly 1300$ more than i am willing to pay
*for the FE
than you can afford*
@@-1solrak shh... its a bad deal... better poor than wasteful
pretty much my thoughts, I bought a 3090 for about 700 at auction a while back and if this keeps up I guess it'll be like 8000 series before I upgrade.
@@-1solrak go pay nvidia 20 grand with that same logic
Unfortunately given we’ve hit the limit for the size a silicon transistor can be ~4nm, we’re never going to see the generational jumps we used to, the engineers are limited by quantum mechanics. Not until we switch to a different semi conductor or method of computation
~4:06 I don't know why Alex has a hammer..... but I approve !
LOL i was wondering the same thing. They should have had him hit the 5090 cake at some point for a little extra comedy.
I assume since he was talking about the hardware, he had tools and hardware lol
I think that's why linus is speaking weird at the end of the vid lmao
HE'S BREAKING IT DOWN FOR US...get it?
For breaking it down of course!
The "multiple presenters" is something LTT should consider more often, amazing video guys , great job !
Ya Linus is cool, but the whole team is awesome imo, always happy to see them on the vids too
Eh…..
@@guiklering they were cringe AF
@@zggggg. and yet they are probably happier than you ever will be
Yaaa no. Lol
4090 was never stupid, it was 100% faster than the gen before it for the same price.
5090 is 30% faster for 50% more cost (Over 100% for 3rd party cards).
This is not a gaming card, it's an AI card.
Its not even that great of a AI card. If you actually want to run high end LLM models you would need a lot more VRAM, which Nvidia is artificially lowering.
@@strelocl True, but It's more an AI card than a gaming card.
Because most hardcore gamers won't buy it on the price and Rasta performance.
that's why everyone should be buying 5080 instead 5090, only if you work with ai that card is viable, but if it only for gaming, 5080 is the way to go for half of the price
Even 20 series top cards still cost a liver and a leg, and mostly just rich people buy them just to boast "I have the xx90 card!!!", while in reality 1060 would be more than enough for their needs.
exactly , its more suited for LLM then just gaming...
21:25 The 4090 was an outlier for 40 series with lower tier cards offered a much lower performance bump... Yeah, but the 50 series will offer a much lower performance bump for it's lower tier cards... Again! (Hypothesised from the fact that 5090 = 30% jump over 4090, using 30% more cores and 30% more power, but the lower tier cards have around 7-10% more cores, so they'll offer.... erm.... 7-10% more performance?! Duh! Meaning that the 5090 will just be even more "outlier" than the 4090. Boo :(
I'm really disappointed with the -18% letter count 05:37
These are the things which really matter smh.
An unfortunate consequence of the smaller cooler size 😔
xD
Such malicious attempt smh/s
One of the few groups of people actually need this kind of GPU processing power is the VR Flight Sim guys, a VR headset is eqvivalent to running a 6-8K monitor, and you need >60fps to make it smooth, >90fps is preferred..
So: Please add some testing with DCS in VR for future videos, thank you!
We just going to ignore other vr users? Also, the only vr head sets that have the same level of pixels as a 6k-8k monitor is like pimax or meganeX. The rest like the index are sitting at 4k equivalent of pixels or under
Yeah but most people are using a quest or index kind of thing, so those might be more like running a 4k monitor
@ I'm aware, that's why I mentioned them. They account for like 95%+ vr user base
Same for sim racing peeps, we also have people running triple 4k setups.
@@njdotson2800x2800 per eye, isn’t exactly a 4k monitor tbh
The person that edited the transformer audio effect at 11:03 has my entire heart
I clicked your timestamp and was disappointed it wasn't a 50-60Hz hum. Fewer people would have picked up on it but for me it would have been the perfect transformer sound effect.
Nice catch! I missed it.
This year (including a bit of Q4 of 2024) feels like the release of tech that is marketted as generational leaps while being more updated iterations at best.
Playstation pro, Galaxy S25, 5000 Series cards, etc. I get why they need to do this as they can't give space to competitors to have better specs and also lose their "top of the line" statuses but technology doesn't always leap forward in time for releases.
But it's pretty annoying when it's getting to a point where getting the latest version of something isn't just a question of cost, but even quality compared to the previous one.
This gpu makes AMD 7900 XTX look like an even better value prop than it already is.
Really the thinking man's gpu
yeah exactly, 5090 not fe gonna cost $3000 USD
Does AMD have anything equivalent to Nvidia's NVEC? I render a lot of 4k video and NVEC is a lot faster than CPU processing while maintaining the highest possible quality.
I even have a 10 core 20 thread I9 and Nvidia NVEC blows it out of the water by like 20x the speed in Vegas Pro over just CPU rendering... I would really hate to give up such great 4k video rendering speed if I were to switch to AMD cards.
@@e5m956 they don't sadly, at least not until you switch to AV1. Funny enough some people will use AMD for their main GPU then pick up some old cheap very base level Nvidia card with NVENC that they can keep in a second x16 slot just for encoding as needed, lol.
@winnah9000 Thanks for the reply. I'd rather just stick to a single card solution. AMD should consider something similar. I do a lot of GoPro filming on the weekends ATV riding and drone flying. I'm pretty much a 50/50 split with gaming and video editing so both are important to me.
I miss the ATI 9700 pro days when we had real strong competition with Nvidia. 😢
@@e5m956 10850k or 10900k? Those got a lot of hate, too.
I try to tell people that nVidia cards are more aimed at professionals than casual gamers. These cards are 100% worth it for people that need real computational power rather than chasing FPS in a video game.
This video had the greatest sponsored segment I've ever seen. 0:54
the cringiest...
Can people pls just stop saying everything is cringe and allow themselves to see something as fun xd. This is just silly fun and im all for it !
i will skip every ad even if it’s “great.” you should too
@@sommelierofstench Literally zero people care
@@memethanYT enjoy your ads
30% faster, 37% more energy, 33% more VRAM, 25% more expensive after 27 months! What a great deal!
/s
21:01 David summed it up PERFECTLY with a single sentence.
Ain’t even been 21 minutes since the vid released 😭😭😭😭
Their "Wait & See" conclusion is enough for me to skip this and wait for 6090.
Rich gamers; not like us
Yup. If one can wait ... just wait for next-gen.
@@eddieyongmengfai2845next gen will be even more….
Elimination of the Moire patterns 12:15 could be a pretty big deal. That specific type of artifact is easily the most noticeable and distracting for me.
0:40 Jumpscare Dont click it
Lmfao dark linus
"A little over two fifths"
Brother, just say around half.
The 7900XTX lags behind in many games by only around 20 FPS, and that's AMD's last generation? Hard pass on the 5090 for now. Since I'm not a 4K gamer or interested in AI, I'll wait for AMD's next release, compare it to the RTX 4090, and pick one of those as my upgrade.
Lets see what the new 700 series from Intel also does
There really is “good enough” for most people.
Good enough for me, at least right now, is a 7700x with a 7800xt. But there’s some great gaming ahead for people that aren’t rocking a 5090.
@@jaredbaker5964heard that. my 9800x3d and 7900xt is gonna have me set for a while. unless the 9000 series looks really good
@@Jacobe_Wan_Kenobi take these words lightly but apparently (APPARENTLY) the 9070 comes VERY close to 7900xtx
7900xtx 7 fps compared to 80 for 5090 in black myth is wild 😂
$2,000 is fucking insane for a card, let alone that the performance increase doesn’t meet the price increase. I can’t believe NVIDIA even released this thing 😂
well to be fair, its 25% price increase for 30%~ performance increase.. so it does.. but.. it's still pretty shit lol
@@TroublesomeOwl 2000$ to go from 60 fps to 78 lmao
@@MihaiCornean if all you’re doing is gaming on the 5090, you’re doing something wrong
@ like i said, it's pretty shit lol
They literally being like "wow 70fps"
Ay. I extremely appreciate the section with the video editor!
I was shopping for a CPU recently and found a distinct lack of mentions for productivity and too much focus on “gaming performance”.
look up @theTechNotice he is doing productivity focused benchmarks - no gaming at all
The 50 series is a huge letdown. They're good cards of course, but the way Nvidia talks about this stuff, you always expect a revolution, and you get an evolution. These are 40 series cards with a crapload of power pumped into them to keep stable. This is not exactly ideal. Nvidia is way too distracted with side projects. They've got 5000 features they're working on, and only a small handful of them ever pan out. Don't get me wrong, I love Nvidia. If it weren't for them, we'd be in a world of hurt. However, when it comes to just raw 3d rendering, we're getting screwed. I bought a GTX1080 for $500 in 2017, and today that will buy me a 4070. The 4070S is another $120. 8 years and only 85% improvement. That's insane. There was almost the same improvement from the 900 series to the 10 series, and that was 2 years. People always ask me "dude how can you still run a GTX1080 and a 7700k?!?!" and it's because I have a Gsync 144hz display, and in competitive games we all turn down graphics settings so I get lots of frames, and in games that aren't competitive, I have Gsync. So really nothing will change yet. Not until I can afford a card that will drive 1440p at 240hz+ for affordable prices. I'm not paying a thousand dollars for a graphics card. I have other hobbies.
It would take physics hardware acceleration and improvements, multi-die GPU's, FPGA scaling built-in, large DisplayPort bandwidth increases (at least 4X), super-sampled output mode (3DFX style), lag free chequerboard rendering for 4-8K, native rolling-scan modulation at the GPU frame buffer level, as well as frame-stacking, hyper low latency mode with native rendering, features for legacy PC games, I mean there are so many useful features that could benefit people, but these that would require actual R&D and actual features for your thousands of pounds, when they can just tack A.I. on instead and convince people that passive frames equal your monies worth.
Yeah sure buddy whatever you say. You don't have to lie to make yourself feel better about your potato PC.
5 big booms 0:17
Erm, my calculations have concluded that it was 6 booms
@gamernugget20 Glad I wasn't the only one who caught that lol
Only good part
So sorry that your 4000 graphics cards are now outdated they get 5 big booms
Reminded me of "The Super Inframan (1975)" review by cinemassacre (angry video game nerd)
0:22 guys Linus gave the 5090 5 big booms.
It looks like it has all been said already but "our biggest review" is no lie!
What a phenomenal job you guys have done here! The sponsor segments, the different hosts with their own bite-sized subjects, the great differentiation between raw performance and DLSS performance, the great graphs including very solid breadcrumbs on where to look, the exact right amount of humor, objective fact and opinion.
Genuinely fantastic, great to see!
Yes!❤
The production value on this video was fantastic. Loved including all of the different hosts, the sponsor segment was great and it kept you engaged. Great job!
This costs more than what my parents paid for our house
Do you live in an overturned bucket
Hilarious 😂
@@lpnp9477 Houses used to be dirt cheap a couple decades ago.
Brother lives in a shared ditch
Where do you live bro? I'm already packing
The Anti-Trust law needs to kick in and NVIDIA needs to be broken up so that there is no fleecing of the customers and no shortage of supplies. This is getting ridiculous. Why was the government sleeping till now?
You know what would be a cool experiment: power limit all GPUs to one share value (200W or 300W or whatever would be to have at least 8 GPUs in test) and see how much you can get.
I do not know if LTT can pull out such a complex test, but it would be very interesting.
ooh yes, the true frames per watt test
How is this any different to just monitoring how much power each gpu uses at peak and dividing down frames by watts? Making the Watts equal in real life is pointless, thats what Maths is for.
@@Alt-gy7sejust because the math says one thing doesn’t mean it’s going to translate perfectly into the real world
Unfortunately I believe power limiting isn't as simple as that, having a 500W+ card running at less than half the power would have massive stability issues
@Alt-gy7se but the power usage in general can be real different. one GPU can be using 500 watts continuously throughout a scene, but another can be using a varying 200 to 400 and spike to 600 a couple times during the same scene. now it will rarely be that extreme, but I can see very clear differences in power consumption with my cards, and I'd love to see some more comprehensive testing to that, especially since power costs and GPU power consumption are on the rise!
This is why I love Linus tech tips, they don’t milk Gpu and cpu news for 2 months straight with 2 videos a day with no useful info. Instead they post the day of and post a banger video
@@wyattsklee6769 yeah, they tend to milk the fun aspects of a tech launch (good), not by gating access to useful info (bad)
At this level, they can't afford to do a shitty job.
Gamers Nexus puts out the best hardware review data and they are out now as well. this site is more tech news and entertainment than serious data points.
@@toddblankenship7164true, but no one is falling in love with tech via tech Jesus. Steve’s crowd is us bedroom dwelling number nerds. LTT has a format that invites new interest. It’s good that both have their niche
@@toddblankenship7164 TECH JUDAS is no longer relevant. We have LTT and Hardware Unboxed for all our Benchmarking needs. And Steve's lack of professionalism now makes me question everything he does, not to mention he is annoying and insufferable. So NO THANKS!
NVIDIA next marketing :
You don't need food!
I'm sorry but is wrong with Wukong's RT? The 5090 can't even hit 60fps at 1440p at Max without upscaling? Wtf?
UE5 is Severely Un Optimised. So games that use it will tend to have lower performance
Welcome to the future of gaming where to demanding graphics or lack of optimization will be compensated by fake frames.
The optimization is Unreal
@@ronniebots9225Welcome to the future of Internet comments, where anything that obliquely references AI being used for graphics will be accompanied by a lame, low intellect response talking about how games will no longer be optimized ever.
@@chrisdpratt rasterization is the true mark of generational improvement post processing gimmicks are just gimmicks.
No matter how much data is used or how many times the AI model is trained, it will never be flawless. Issues like ghosting, flickering, and others will always persist, as game content is dynamic and unpredictable.
I beg to differ
@@thedurtylemur2282 if you cant see the artifacting from frame gen and up scaling don't look it up just enjoy it cant be unseen
Which is why traditional frame gen will always be better!
Of course, but also keep in mind that it doesn't necessarily need to be flawless - only good enough that people either don't notice, or think that the tradeoff is worth it. I'm not saying that it's there yet, but there have been appreciable generation over generation improvements with these models and different people are going to have different points at which that can happen.
true for frame gen, but the new upscale method in DLSS4 is a massive improvement. At this point I just don't care about Path Tracing, I just want to play at 1440p with dlss quality above 90 fps, no ray tracing if possible. I'm only upgrading my 3070 because of the vram, it still holds up in raterization tho
The 4000D test with the new cooler style is such a thoughtful addition to the review. Its not something that was 100% necessary but really was a nice tidbit to know and consider. It really shows the thoroughness of the review.
But do you think 15 minutes is long enough to hot-box that large case?
@minus3dbintheteens60 I think it's probably enough time to get at least an idea of what that style of cooler could do to performance. I think people who are running that style of case and cooler are pretty unlikely to hit the CPU and GPU at that level in daily life for most workloads and games.
Of course more data is better but for the conclusions they make with the information and the intended audience of this review it's enough. It answers the basic question on whether you should be worried about this cooler style affecting thermals.
For people who are likely to hit the CPU and GPU that hard then It informed them that they should keep that in mind. They now have something to consider and look into when they look at other reviews of the product.
Of course more in depth data is nice but realistically this is plenty for the type of person this review is intended for. For a review that needs to hit a release deadline it gives enough information to be useful while not slowing them down too much.
2k USD....yea how about Eff Off.
At this rate the RTX 6090 is gonna cost your firstborn child and 3 years of hard labour😭😭
And the RTX9090 will be a pod they will install you into. yes, the tube will be going where you think it will be going, 😉
Are you that poor?
And it will be 20% faster in very niche applications and still unable to run all games at 4k 60fps
@@ParTeTai Bro's alright with being ripped off by billion dollar companies 😂
@@Raghav-lq3qh Sell the house. You dont actually need it. But a 6090 is a must have. Priorities...
Hey LTT. It's a tradition to test it out on 8k for the fun of it. Don't skip that out y'all.
Does 8k monitors even exist? Even if they do they most be like 60hz.
I'm genuinely curious, who's tradition is this? I personally don't care about 8k because it won't matter to my eyes and mostly doesn't exist but I'm not going to yuck your yums
@@emilfrederiksen.1622 More for 8k TVs I think
Who's tradition?
For the last two graphic cards, they tested them at 8k for funsies. @@sevenofzach
I'll be honest, these tech videos don't really make sense to me but I still enjoyed these. I'm learning little by little with each video. As well this is the first time I see Oliver Cao in a video, but he talks like he's been doing these videos for years.
One way to explain the tensor cores and neural shading is as follows: The neural networks can be used to approximate extremely complicated functions for which we can't just write simple equations. However, we can create those networks with a simple procedure merely by showing them many examples. So for example we can show it what rocky terrain looks like at all levels of detail, and the network will learn how to do that, using less data (just what the network's values are) than trying to store all the terrain (what the network computes as output). It's sort of like procedural terrain, but learned instead of programmed. The tensor cores are what are used to efficiently compute those trained neural networks on the GPU. They're not the same as geometry or pixel shaders, so they need their own specialized cores for the task.
Can't believe they didn't test the 5090 on Vampire Survivors.
And HoloCure.
That's exactly why they always say to check out different reviewers. No one can test everything in such a short time and there are different perspectives out there
can't believe they didn' test 5090 on minesweeper
@@sevenofzach /woosh
@@smileychess 😂 yup
I love how watching GN, LTT and HUB will give you 3 completely different reviews, with some overlap that says the same things but all 3 have such different perspectives that I feel like they were all necessary. Some to get confirmation of the main results, some other to get morenrelevant information.
Good job everyone!
And then there's my country's version which is also very good, and covered some more stuff but mostly how it fits in the Italian market. Shoutout to Prodigeek, despite the annoying clickbaity title, thumbnail and first 10 to 20 seconds of their videos, the rest is fun, informational and cool.
It's what life and politics etc should be like. We agree on the facts, our opinions on what is good and bad about them can differ. But the facts stay the same.
Oliver Cao's segment was really great! Congrats! Voice, tone, tempo → Chefs kiss.
I hope he gets more screen time!
"The 5090 is overkill when you play in 1440p" -I fully get it and partially agree, but in many games, it cannot get even the 144Hz of the current monitors, let alone 165Hz/240Hz (in full preset without RT). Now imagine 4K. We just started getting into that resolution. My 144Hz monitor supports 165Hz OC via DSC, but I have it at comfortable 100Hz which I love.
Plus, capped frame rates and the GPU not only stays in cool levels, but the capping also helps in smoother gameplay since the GPU can be better in synchronization with the CPU. You can look it up, here, on YT. Capping the frame rates is good, end of story. Power consumption way less and the sync with the GPU culminates in smooth response times. It's like one straight line.
THANK YOU for finding ways to take all of this incredibly technical and complicated techno jargon and turn it into an easily digestible 20 minute video. This is one aspect that I've found *many* other outlets are having a hard time doing for this release specifically, either simplifying is way too much that It becomes almost uninformative, or going way too into the weeds and confusing the hell out of me.
The AI performance comparation at 5:36 is wrong! For 4090 value is provided for 8 bit quantization, but for 5090 it is 4bit quantization. For 8bit I would expect half of presented 5090 AI performance.
To be fair, the 4090s fp4 performance would be the same as it's fp8 performance due to not having it natively. Ultimately Nvidia is going to advertise the largest possible numbers, which for this generation is fp4. Ultimately, from my (admittedly fairly quick) research, both numbers are kind of BS since most models use FP16, but that's why we have independent reviewers to give more realistic performance comparisons.
@@BelacDarkstorm it’s real marketing bs unfortunately. Most likely the uplift in fp8 is only core counts where as fp4 is where the money is at. So inference in fp4 when the model fits within one card is gonna be where most benefit is at. During training a good chunk is in fp16 still for accuracy so there’s less gain in raw speed, though the added vram can allow for bigger batch size in training or bigger models. Not all bad but really depends on the use case
i know the pro had more 8 bit tops than a 3090 from working on it, how many does a 5090 have? over 1k surely
@@jeffrey5602 Btw. 4090 supports int4 with 2642 TOPS which provide the same memory benefits and similar accuracy
@@BelacDarkstorm I thought the term TOPS is used by default for integer operations (and TFLOPS for floating point). Both cards supports int8 and int4 nativelly, but 4090 performance is specyfied in int8 and 5090 in int4. Anyway this is another ugly manipulation from nvidia. The same story like comparing "performance" between fg 2x and fg 4x.
5:39 -18% letter count? Unacceptable!
PLEASE let Dennis do more sponsor segments 😭😭😭
5:18 - close your eyes, and you have Kermit the Frog explaining complex graphical technology to you.
Patrick Mahomes reviews the 5090
Cool can't stop hearing now.
"it ain't easy being green"
1:02 lmao 😂😂
GYAAT
your wallet just passed away, it gets 5 big booms,
0:27
wait, so it can't even play cyberpunk at max settings with over 60fps .. I'll wait.
As someone who only listens to the WAN show, not watch it, this was my first introduction to Dan.
Dan is the best
@@Chunkosaurus It's in the name. Other hosts are good, Dan is Besser (German for better ;))
Ive never even seen this guy on WAN
@junyaiwase He's behind the camera producing usually, I know sometimes they'll show him, but I never see it since I just listen to it as a podcast
The guy who ran through the benchmarks made me feel super uncomfortable with secondhand embarrassment. He was trying way too hard. Muted it for his section.
Legit one of the best sponsor spots I have seen. Never do I go back and watch again... but three times for this one, haha!
True! Also nice seeing you here Phil, i suspect you'll be doing a magnolia county stream/video testing the 5090's performance once you get your hands on it?
@@Zoshiao haha, it’s such a bad idea…. It I do have another computer to build….
I love that all the writers had a bit to cover as part of the review. But my favorite by far is that Alex just has a hammer for some reason? Perfect, never change.
@@zubairmotha6436 He was "breaking down" the details.
@@ricardoamendoeira3800i thought he was hammering his point down
@Random_dud31 There's a section in the video that cuts to him and he says "to break down the details" and he sort of raises the hammer, so I think that's why he had it, not sure if there are other hammer related lines.
its official, NVIDIA sold gamers to the highest bidder... AI
The biggest problem with all the gaming focused features and tech, is that I simply don't need or care about them when actually playing a game. They may look cool in tech demos, but I never want to turn the settings on because it doesn't look or feel better to play an overly polished, shiny or blurry game.
Just like anything in world, you gotta show useless slop to make it appealing to people
I like DLSS W/o frame gen in "quality" mode in most games. It seems to anti-alias pretty well with out the smearyness of TAA
4:40 ok but why do you have a hammer.
The bills might be high bud 😔🙏
To smash the RTX 5090 because it's so bad
Question: is it better to have a hammer .. or not?
Linus: I'm not a workaholic.
Also Linus: 21:45
LOL
5080 seems actually pretty nice and 1000 bucks cheaper than the 5090. Rather invest that extra money in a nice 4k monitor
This series is still not good enough for a great 4k experience imo
still 16g VRAM😅
With scalper prices it will be $1,000 cheaper but the up charge for the 5080 will be $2,000 and the 5090 will be $3,000. We aren’t going to be getting these for MSRP due to scalpers.
Recommends 4k monitor
Suggests buying 16 gig card
*Laughs in stuttering*
It's only 15% faster than 4080. In no world is good.
I think the simplest way to explain convolutional neural networks is: for every pixel, a new pixel value is calculated based on its nearby neighbours. E.g. if every pixel is replaced with the average of its neighbours, you get a blur effect. Think of a visual illusion where you see some broken lines and your mind fills in the missing parts of the line, that's how these CNNs would be used to preserve detail as they upscale. However, it's a lot of computation, and a challenge to preserve scene integrity because the neighbourhoods are small compared to the entire scene. Transformer models allow arbitrary areas to serve as context when computing new pixel values; think of it as like having configurable neighbourhoods to input to that computation. E.g. when trying to upscale the left side of a road in the game's scene, that computation can also refer to the right side of the road all the way across the screen.
I appreciate the AI benchmarks but why are all the LLM tests with 7B and 13B models? I can run those on my 4060, no one is buying a 32GB card to run models that small
I think that it makes sense for the sake of comparing performance to get an idea of generation-over-generation uplift?
Also this is just a "gamer" card, not a Quadro
Absolutely. They should highlight that this card can run bigger models. Until we get new quadro released, this card can be the sweet spot for running big models like llama 70b at a 'value price' at home
@ Sure, but it's priced like Quadro with some Quadro software features missing. And let's be real, this is "gamer" card, so you are still only supposed to run games, not AI models with it and without some dumb trickery it can't even run latest titles at stable 4k 60 fps. It's beyond pathetic release and IMO massive L for nV.
3:25 : "Making the 5090 look truly less like a next generation product and more of a 4090 super gt Zeeekai BUDOKAAAAII"😂😂😂😂
Paying 2000 for a card that gets 60 fps in 1440p ultra in ANY game is absolutely insane.
Wouldn't it also be insane to spend 1000 or 750 or even 500?
No, because path tracing is insane.
@@DonaldJPump24 That's a bit silly. It's not hard to make A game that's ridiculously demanding or poorly optimized.
You're looking at it wrong. You're paying 30% more for 30% more performance. For people on that level of budget, it's totally worth it! And it will sell very well, just like the 4090 did.
@@costafilh0do you have any understanding of how technology works? Over time it gets better. So you can expect better performance for the same money after two years. If they had simply put the price up every year on GPUs based on performance increases since the 90s then a 5090 would cost tens of thousands of dollars!
One aspect that often goes overlooked is the quality of software, the card comes with cutting-edge CUDA cores, RT cores, and Tensor cores, the potential of these advancements can be undermined if the accompanying software is not properly optimized.
8:31 why am I getting mean girl vibes 💅🏼 😂
It gave me a lot of insight to know that this architectures name is 18% shorter. ty ltt
By 2050, the GPU itself will be AI generated.
They already do use ai to generate the layout of the die
Eh, why not. Better is better
can't even play Wukong at 60 fps on 4k
16:26 Can we get Oliver a new chair?
NVIDIA 6090 predictions
Cuda cores: 29000
Process node: 3nm TSMC
Power usage: 500 watts
VRAM: 32GB
Functions: DLSS 5 & reflex 3
Price: 1799$
Release date: October 27th 2027
1799$? Cute.
Edit to: 9971$.
They whipped out the LTT avengers for this video
See what I dont understand is, we got rid of the fans in storage units, but cant do the same here, or its taking uncharacteristically long.