This! RIGHT HERE 2:13 is exactly what I want. All the fluff cut out and a condensation of the reviews, instead of having to plow through all the professionals' videos to form an opinion.
I have the MSI 4090 Suprim Liquid X and that card stays hella cool. Playing the new update for Cyberpunk with ultra RT and path tracing I hit 52C at 1440p and 59C at 4K. It's a great card.
I thought in the US the maximum power through an outlet was 1800 watts, or 2400 watts if it is a 20 amp circuit? So we are getting closer, but not quite to the maximum yet.
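A quick sanity check of those outlet numbers, as a minimal sketch (assuming 120 V nominal circuits and the NEC guidance that continuous loads stay under 80% of the breaker rating; illustrative, not electrical advice):

```python
# Rough US branch-circuit math (assumes 120 V nominal; illustrative only).
VOLTS = 120
for amps in (15, 20):
    peak_w = VOLTS * amps          # breaker limit: 1800 W / 2400 W
    continuous_w = peak_w * 0.8    # sustained-draw budget: 1440 W / 1920 W
    print(f"{amps} A circuit: {peak_w} W peak, {continuous_w:.0f} W continuous")
```

So a ~575 W flagship GPU plus the rest of a system still fits within a 15 A circuit's continuous budget, but the headroom keeps shrinking.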
@@markterek6232 yes, and they are doing it again. The 4060 was really a 4050. Think about it: why do you think there were no 4050 desktop cards, only the 3050? No competition. That way it was only a little bit faster than a 3060.
Because the 30 series was on Samsung's shit node, the 40 series on TSMC's then-current best, and the 50 series is on TSMC's yester-year node. The 30 series made Nvidia a shiiiit ton of money, the 40 series less so, and now the 50 series is back to milking.
I have been using a 1660ti gaming laptop for the last 5 years. I think I may go desktop AMD, or continue using what I have and get an Xbox and use Game Pass.
@@jesusbarrera6916 Depends on the performance of course. If the 5080 can't deliver what I want for the price, then I'll probably go with the 5070ti and save some money.
Most of us RTX 4090 owners will be skipping the 50 series completely. I don't want fake frames or ai in my games. I want more raw power. Maybe I will get an RTX 6090 if it's any good.
Yeah, well, Mom's being a total bitch, and Dad's gone and built a new shed on the edge of the property! And neither of them ask me how *I'M* feeling! [breaks out journal and purple glitter pens] LOL
OMG Paul, your ending was priceless! We sometimes say that at work when people get into a big debate over whatever..... "I hate it when Mom and Dad fight". 🤣
The aftermarket OC versions are $2200, so it's just $2,000 for a standard 5090. Plus, Nvidia GPUs hold value: the 3090's MSRP was $1,500 and they still sell for $800. The 5090 is twice as powerful. So powerful it's CPU-bottlenecked in gaming.
The “S” in MSRP stands for suggested. This doesn’t mean you will pay that. And we all know this is the case w/ anything. The market will determine how much they can charge.
Let's not forget that AMD is already a powerhouse in CPUs, and they're running a big enough show as it is, so it would be hard for them to own both the GPU and CPU markets when you think of it. Nvidia has to focus on one asset, gaming, and Intel mostly CPUs plus some decent GPUs. Nonetheless, across the board AMD is the winner, because their CPUs are top tier and their graphics cards are a notch or two below that. But who really needs more than an AMD RX 9000 series GPU when they come out? You can still play 1440p easily, and 4K at 60 fps.
The hard truth is I'm locked into Nvidia because of their proprietary software that only works with their language and frameworks (TensorFlow and the like) without using ROCm. For all the people who only play games, I don't know why they don't just buy less expensive GPUs. AMD doesn't help with their mostly piss-poor pricing. Nvidia left the door wide open in the mid-tier market, the largest market. And what did AMD do? Like they always do in the GPU market of late, fall flat on their face. Well, at least their CPUs are good. But they're also pretty expensive now that Intel has fallen far from their perch.
@imbatman3537 because AMD is never at fault when they launch their competitor GPUs at $30 less, 3 months after everyone bought Nvidia cards. Nooooo... it's the customer who is wrong.
@@jesusbarrera6916 it's both, to be fair. There are like 3 camps imo: >People locked into software packages >People loyal to brands >AMD dropping the ball. AMD have the better GPUs in the bigger market due to pricing, but as you mentioned they just set their price off Nvidia's. They need some cojones; they dropped out of the enthusiast market only to keep giving us dogwater, and now we have $2k cards again.
@dr_diddy they only have better pricing months after they come out, when everyone already bought NVIDIA. People have to stop making excuses for AMD. This time the market will buy 5090s and 5080s, while the rest buy last gen cheap first, BEFORE AMD even shows off their only GPU.
This is going to be the best chance AMD will have in years to gain market share. FSR4 looks promising and the 50 series cards are basically just expensive sidegrades with (even more) fake frames as their main selling point. If AMD can deliver a solid upgrade for a reasonable price I'll probably switch.
@@natedisrud why are you all complaining? what are you going to do? NOT BUY IT?!?! LOL RIGHT.... What are you going to buy? AMD? INTEL? In the end, you'll buy it anyway! LOL! THANKS FOR THE LEATHER!!!!!!!!!!! AHAHAHAHHAHAHAHAA
$2,000 is NOT a good price for any consumer PC component, full stop. "mom and dad are fighting" is also a gross oversimplification, but sure let's go with that and not call out any of the disgusting behavior
@@arha-z1v Pickup trucks are very much designed as consumer cars. Modern pickup trucks aren't working vehicles, they are status symbols for people trying to overcompensate.
oh wow! I was waiting for next years water-cooled-double-power-railer-1000-giga-chad-watt 6090 (Nice!) but I didn't know about that 5090 Ti coming :) thanks for the news @Paul! :)
As far as I can tell, these cards will be going for upwards of $4,000 to $4,500 here in Australia. Maybe even more. Just looking at them from a rasterisation perspective, these cards are far from justifiable when compared to the 40 series, or even the 30 series, and the machine learning stuff isn't a selling point for a majority of people out there, because barely anyone actually cares about it.
Here to drink the tears of gamers that spend 74.6 hours a week enjoying leisurely video gaming activities and can't afford the latest technological marvels that allow them to play 74.6 hours a week enjoying leisurely video gaming activities at a higher framerate because they have no skills and are poor because they spend 74.6 hours a week enjoying leisurely video gaming activities.
This is why we are starting to hate even you reviewers. "It's worth it, because of something you'll never utilize." That's everything wrong with Nvidia's GPUs and pricing: selling people on stuff that is either A. completely fake and not worth it (DLSS) or B. software features they will never use. The whole PC industry atm is just horribly disappointing for this reason. Motherboards: overpriced for a feature no one will utilize. GPUs: overpriced for features they don't even want. Reviewers: "no, it's worth the money for people NOT you." Stop it.
It's basically the same story as every GPU launch at the end of the day. If you're on the GPU from last generation, don't upgrade. If you're on the GPU from two or more generations ago, upgrade. The 5090 isn't a good upgrade from the 4090 but it is a good upgrade from the 3090.
I have an RTX 4090 and I got it at MSRP, so I'm going to hang on to it. And my work rig, which I use for 3D modeling and rendering, has two RTX 3090s, which will render very nearly as fast as a single RTX 5090, so I'm keeping that too.
0:50 Yeah, as someone invested in Nvidia the last few years: they do not care at all about gaming anymore. It's the last thing talked about in their quarterly reports, for good reason. Each of the last 2-3 years they've done 100-200% more business year over year, ALL in data centers and AI chips, deals with car manufacturers, etc.
Sorry, but Paul is buying propaganda. Nvidia shut off its production many months ago of the 4090. The only reason the price is so high is the supply is basically nonexistent. You do that with any card and the price goes up. I'll admit there is some demand, but the idea that the price is surging because of that is absurdly wrong. It's basically only because of the production cuts to set up the market so that people like Paul will declare the $2000 5090 is a great value. You don't have to be a paid shill to work for Nvidia. You just have to buy their bullshit and repeat it.
I passed on the 3090, but when going for the 4090, I purchased it early in its lifecycle to get the longest use and thus the best return on investment... I do use it for gaming, but I'm a software engineer, so having it for development of CUDA/tensor models is very nice.
He clearly stated that it was for AI consumers, and not gamers. It's Nvidia misleading everybody by marketing these towards gamers, not Paul. These aren't cards designed with gamers in mind. They're made to take advantage of the massive AI bubble. And hey, as long as there are gamers willing to pay that much for GPUs, it's Nvidia's job to keep selling them at that price. Nvidia's only reason to exist is to make money, not to be a charity act. Blame human stupidity that allowed Nvidia to price GPUs this ridiculously high, not Paul stating a sad fact.
What are you moaning about? 5090 is not a gamer GPU, never has been (previously Titan), which is why it isn't priced as such. These will sell out to people who will waste them on "AI" and a few of richer gamers. People would easily pay over 3000 for these. During the plague we saw people would pay anything for a high-end GPU.
Ya, you're actually wrong. Even with AI the 5090 still isn't worth it, as Nvidia markets the 5090's AI in terms of DLSS4, which is a huge lie about what the 5090 really does. If you wanna see what the 5090 really does, turn off DLSS and run the card at its max; then and only then will you see what the 5090 actually does. When you turn on DLSS4 you're getting fake frames and increased latency, aka less performance.
@@ColdVenom159 You do grasp that people use these cards for AI work that has ABSOLUTELY NOTHING AT ALL TO DO WITH GRAPHIC RENDERING, right? DLSS is AI, but not all AI is DLSS. When Paul was speaking about its value in AI work, he wasn't talking about DLSS, games, frame-rates, "fake frames", latency, or anything about graphics whatsoever. Just the raw value for people wanting the 5090 for its generative compute performance.
@sigmahyperion955 you do grasp that with the 5090, DLSS4 is based on AI, right? Did you even listen to Nvidia when they spoke about how DLSS4 actually works? Without AI the GPU is a rip-off in terms of performance.
When the 5000 series was revealed, I was FOMO-ing hard over having bought my 4070TS during Black Friday, but it seems like with every day that goes by I feel better and better about my purchase.
One of my favorite videos of the week, thanks Paul 👏👏 Steve vs Linus - Steve is NOT the angel most viewers think he is; he should just focus on what he's best at, reviews of hardware!
I love that you reviewed every other channel's review of the RTX 5090; I actually preferred that to watching 6 videos about the same thing :)
I agree. Every review was 30-40 mins and I wanted the TLDW
bahahhaa, Linus and Steve as Zoolander and Hansel were PERFECT 🤣🤣
So friggen hilarious!
So funny!! whoever made that did an amazing job. Even with the flashes and shadows LOL
What’s Linus and Steve fighting over?
That was too funny
No joke, I wish he had the source. Whoever did this is an artist.
So glad that the 5090 is well priced for AI, but for some reason I believe GeForce is a gaming series of GPUs and I can't recall when gamers asked for any of this AI crap.
but fake frames
Ncores help with frame generation and maybe even upscaling now.
Too bad 5090 retail is $2500 and not $1500 like 4090 ;)
It's a lot of money to pay for upscaling and inbetweening with artefacts.
DLSS4 transformer model boosts my 4090 performance in cyberpunk from 24fps to over 100fps. Performance mode looks better than the old quality mode. The days of pure brute force raster performance are over unless you want to play your games at sub 30fps.
@@NecroMoz sez u
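A rough sketch of the arithmetic behind fps claims like the one above, with assumed multipliers purely for illustration: DLSS upscaling raises the truly rendered frame rate, frame generation then multiplies the presented rate, and input latency still tracks the rendered rate (which is what the "fake frames" replies are pointing at):

```python
# Illustrative sketch only: how upscaling plus frame generation changes presented fps.
def presented_fps(native_fps: float, upscale_gain: float, gen_factor: int) -> float:
    rendered = native_fps * upscale_gain  # DLSS renders fewer pixels, so more frames
    return rendered * gen_factor          # frame gen inserts generated frames in between

# Assumed ~2x gain from performance-mode upscaling and 2x frame generation:
print(presented_fps(24, 2.0, 2))  # 96.0 presented fps; latency follows ~48 rendered fps
```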
that ending was absolutely perfect Paul
It was priceless. It took me a second to realise zoolander was Linus* lol
*edit
@@fuzzylumpkins6034 Zoolander was Linus, not Steve
@ fixed- early morning 😅
"Dad and Dad, please stop fighting" was my response at the beginning of WAN show on Friday.
10/10 perfect deepfake
I wish AMD could do to NVIDIA what they've done to INTEL
Youd just be trading one monopoly for another. Be careful what you wish for
@@atomic3325 intel still has majority market share for cpus so its not really a monopoly
They can't. Because Nvidia has better technology and they've never stopped innovating. The software stack also plays a much bigger role for GPUs and Nvidia hardware is just backed by better software.
@@atomic3325 at least AMD and INTEL occasionally trade punches, but NVIDIA has been crushing everyone for sooo long
That’d require nvidia goofing up big time, and for as much dog food as they offer gamers to eat, it seems they haven’t missed a beat bad enough for that to happen yet.
Imagine paying $2000+ dollars and not even getting the full die.
Nvidia has lost their minds.
Sorry to tell you, but the best yields always go into the business products, to then be sold for 10x the price.
@ for over a decade, we used to get ~90% for $699.
In two generations we now get 85% for +$2000.
Please explain to me how that is reasonable.
@@jasonhemphill8525 because they have a monopoly and they can do what they like.
It sucks.
@@jasonhemphill8525 but that's an entirely different topic!
@@Mglunafh it isn't
"Mum and Dad are fighting". Love it, you cheeky bastard.
But which is which?
More like QVC and PBS are fighting
That gif was hilarious
@@C_C- this is so accurate
@@exxmodel It's even better with music!
youtube.com/watch?v=SY_KeoLnx8I
"Mom and Dad are fighting again" boy that feels so true
The hard truth is that, if you are just a gamer, you absolutely do not need this thing.
Well if the Nvidia fanboys don't buy this card, Jensen might show up in a sweater or something. They need to get him a fresher jacket.
I know morons that bought this to play league and overwatch all day.
@@AlkaVirus worthy upgrade then
More like if gamers buy this then they need to actually be realistic and keep it for 5+ years. The card is great but if you are a carrot chaser who buys every generation you aren't getting the full value of buying such a card.
I disagree
What if you sim race on 4k triples?
This is actually still going to get maxed out in some titles at 1440p and 4K
*Cries in 1080TI* I know he's no longer the youngest or fastest kid on the block, but I still love him for who he is, okay?!
Me, with a 1060TI. 😂
@@truecaliber1995 you know, even working at McDonald's, putting 50 to 100 dollars away a month you could buy a better GPU, or a better PC all round.
@@toddblankenship7164 how do you pay rent, feed yourself and save 50 to 100 dollars a month for a gpu working at McDonald's lol
Really ? right in front of my 730 GT ? :)
Dude, same. My 10 yo laptop chugs along with a 1070 and I´m still able to play everything. Not on max settings. Not in 4k. But seriously, for a gamer dad it´s enough, I don´t have the time for most AAA titles anyways.
that's what happens when there's no competition.
nobody stops AMD from releasing 5080 (4080) competition mid-year. Most gamers will buy $1k cards; only a few will get $2.5k ones
@unotoli I don't know anybody who even buys 1K GPUs, and I'm a software engineer. $400 to $800 seems to be the sweet spot.
AMD can make something earth shattering like the x3D CPUs but not manage to do the same for Radeon, it's unfortunate cause competition in the high end would benefit all consumers. Currently, Nvidia can do whatever they want. And I like Nvidia GPUs, but it's still problematic.
@@fillei I paid 900 euros for my RTX3080 at launch. Still using it, skipping the RTX50 series, and holding out for 2 years until the RTX60 series
linus and steve?
The 5090 is an over priced space heater. The performance is fine, but 99% of the people out there don't need it. You have to play AAA titles that focus on next generation graphics and you need to run it at 4k minimum with ray tracing on. This is such a narrow group of hard core gamers with money to burn. Personally I would like to see hard caps at 125w for CPUs and 250w for GPUs and instead of just cranking up the power to get more performance they might actually work on making it more efficient instead.
Make that 65W and 125W and I'm on board.
To play devil’s advocate, I guess I’m one of those enthusiast gamers. I have a 4k/144hz monitor but even my 4090 often fails to reach FR cap with RTX enabled at ultra settings.
@@Pembo-vn7qq I'm in the same boat and was trying to justify the 5090 to finally get the few games like Indiana Jones, unobtanium-level Avatar, and UE5 games to run at a playable level with less AI and DLSS. It's not worth it though, to have to sell the 4090, be without it till I find a 5090, and then spend so much for so little. 6090, here we come! Hopefully...
I think you're minimizing the market to support your personal complaints about the card. There are very many people with deep pockets who are actively searching for the best 4K experience possible (for both gaming and work), and who like to play with AI. The 5090 satisfies that demand more than any other card by a significant amount. If it's too expensive for you, or you don't find it enough to merit an upgrade from whatever you have, then just don't buy it and move on.
Want efficient? Just buy an Intel Arc or Amd Radeon.
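Side note on the power-cap wish a few comments up: you don't have to wait for vendors; NVIDIA's driver tools already let you set a board power limit yourself. A minimal sketch (needs admin rights; 250 W is just that comment's figure, and each card exposes its own supported range):

```python
# Minimal sketch: cap an NVIDIA GPU's board power limit via the nvidia-smi CLI.
import subprocess

subprocess.run(["nvidia-smi", "-pl", "250"], check=True)  # power limit in watts
```

Reviewers have repeatedly found that large dies give up relatively little performance when power-limited, which is essentially the efficiency argument being made in that comment.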
If only 0.08% of the gaming market is buying a 5090 or 4090 and the rest are being purchased in bulk by the AI market, then shareholders are being misled. Nvidia has already been fined for this during the mining bubble but with such a weak monetary penalty, they'll continue duping customers and investors alike.
^ This.
That being said, there is nothing fake about commercial/industrial GPU demand. While AI is overinflated, it will not implode like crypto mining did because it is not a scam to begin with and there is room in the market for quite a few companies that train models for a living. Many startups will still go bust, but their share will be picked up by others as consolidation ensues. And, of course, the technology evolves rapidly, potentially resulting in unprecedented breakthroughs and even more demand for powerful hardware.
@@Reirainsong "it is not a scam to begin with" oh dear, now that's statement.
the best thing it can be said about AI is that it does something unlike crypto
They got fined for misleading disclosures in their SEC statements because they kept saying the crypto market wasn’t material - while at the same time they knowingly sold billions of dollars worth of product to crypto miners. And part of those sales were products designed and made specifically for the crypto market and had no other use case. And when the crypto bubble popped, nVidia had to backtrack and explain why sales were down, and they retroactively changed their story on how big the crypto market really was. THAT’s why they got fined.
Right now nVidia is explicitly positioning themselves as an AI hardware company. There’s no pretense that AI isn’t their primary revenue driver. There won’t be any further penalties to them as a result because they’re already telling everyone how material AI is to their business.
Nope Nvidia isn't lying, they just allocate 0.000001% of their silicon to the 5090s and say they launched it. While saving the rest of their silicon allocation for the upcoming H200 AI GPUs which will sell for the price of a car EACH.......
Don't worry, Paul told me in secret he is on the same side of the beef you're on.
Oh good. I'd hate to have to unsub.
I am on nobody's side because nobody is on mine.
@@Rickles I'm on the side of "Linus, please shut up so people don't have to keep responding to the stupid things you say"
Everything anyone else has said - be it Steve at GamersNexus, Steve at HardwareUnboxed, Louis Rossmann - it's all been in response to some stupid crap said by someone at Linus Media Group. Usually Linus, but the really terrible comments at LTX about LTT Labs doing better benchmarking than everyone else, which kicked off GN Steve's big video and HUB Steve's small Twitter fight with Linus, came from another employee if I remember correctly. But still, it stands - everyone's just responding to stupid stuff LMG keeps saying.
So if LMG could just STFU, that'd be great. Like Will Smith said to the bug at the end of Men In Black: "Don't start nuthin, won't be nuthin!"
@mjc0961 Despite what his fan base insists, Steve at GN is not Tech Jesus, who walks on water and whose channel is always without fault.
I've seen some reviews over there that weren't in line with other reviews, or that made recommendations I don't think the data supported, and comments pointing out concerns get ruthlessly downvoted in favor of "Thanks Steve" and "This flawless work is why I watch only your reviews and don't get data from anyone else". I was really hoping that Steve might take a moment and reflect on some things, but he seems to have rejected out of hand the premise that his journalism has room for improvement and that he may not be handling and presenting things in the impartial way he presents himself as doing.
LMG is a large organization that has its share of faults, and Linus has been unable to detach and has taken things personally, but the issue is a lot more complex than "GN should be allowed to punch at LTT/LMG and they should sit there and thank Steve for it and ask for more because he is the north star of Tech YouTube".
@mjc0961 Except this round was kicked off by Gamer's Nexus mentioning and criticizing LMG in an otherwise unrelated video for a decision years ago and withheld context. I'm not gonna act like I'm even on the "Linus" side, but don't be misleading now.
I feel like the 50 series will go down as the TRUE (Ti/Super) variants of the 40 Series.....Change my mind....LOL
Hope gamers don't buy this card. 50 series cards are for farming AI stuff or bitcoin.
That's actually a really good take on this launch.
That is exactly what this 'gen' is. A glorified second refresh.
sarcasm:
Just like the 30 series went down as the true variants of the 20 series.
Just like the 10 series went down as the true variants of the 900 series.
Going from 20 to 30: dual issue compute, GDDR6X
Going from 30 to 40: die shrink, L2 cache
Going from 40 to 50: GDDR7, INT4
Multi frame gen means you are wrong... the 4000 series doesn't have this
3k for a damn video card that will surely be replaced in the video card wars in a year... NO NO NO! Why are we playing this game? I certainly can afford it but NOOOOO! This shouldn't be tolerated!
It will be superseded whenever TSMC improve their nodes. Nvidia doesn't have a magic wand to make things that much better without die shrinks
Quoting The Princess Bride in the intro is the kind of hard truth I came here for, Paul. Thank you.
My problem with the $2000 price for a 5090 is the FACT that Nvidia told their investors that $1200 of the price is JUST profits.
Good for people who held on to their 4090s. But the reality is, stock is dead. People looking for high-end/midrange will have to jump onto the 50s or used 40s, or hold off upgrading altogether. Lots of people are taking advantage, selling older gen, especially HEAVILY USED cards, at almost near-launch prices. What a dog-eat-dog world.
same folks selling the heavily used 4090s are the same ones saying 5090 isn't that great
There's always AMD. The XTX is a solid card, and performance is pretty good even in Indiana Jones with RT. And it basically destroys any raster game.
yeah, 4090s are non-existent for a relevant price. I kinda fear for this year; GPUs are gonna rise in price again. That's why I bought a 4070ti super at retail even though the new gen was coming out.
@@StatusQuo209 "yh but it's AMD, icky."
AMDs biggest competitor is their reputation tbh, I don't know how they'll ever recover to the ATI days.
I think spending that kinda money for gaming while for less than half you can get an excellent GPU is… insane?
That said. It ain’t my money. I don’t care. 🤷
The hard truth about the RTX 5090 is that whether it's good or bad...I can't afford it
I can't afford a Porsche 911 GT3 R but I want one. Have to settle for a Golf GTD; it still does the same thing but slower. That's what lower tier cards are for.
@@jondonnelly3 You can get a 5090 if you save up for months. But for a 911 it would take decades.
Remember when the 3090 Ti was selling for under $1000 when the 4090 was released? They are still charging $2400 for the soon-to-be "obsolete" 4090.
That is because the $800 4070 Ti = 3090 Ti. Anything above $1000 for a 3090 Ti in that situation was absurd.
This time even the 5080 will not be able to match the 4090. So the 4090 still remains in its own league, slightly below the 5090.
@@stangamer1151 5080 is 1k while the 4090 is (hovering around) 1.3k which I think isn't that bad tbh since the 40 series also get DLSS 4. 4090 with double the VRAM for that price isn't a bad buy imo.
The next best card increases power draw and is $700 more while giving 30% improvement.
4090 really is in the perfect position.
*I should note, 4090 prices depend on your location, those prices are around me.
@@stangamer1151 RTX 5080 was going to be always worse than RTX 4090 because we shouldn't forget it has only 16 GB of VRAM vs 24 GB on RTX 4090.
@@stangamer1151 eg. they are using the 4090 to make the 5090 look better in value. classic nvidia move.
Totally agree, to me the selling point isn't that I'll get high FPS in games, but that I would have 32gb of VRAM to run models locally with
I wish they had a 64gb option with the same chip.
Currently using a Mac Studio with 64gb of unified memory for some tools built on 70b models.
The 50 series should run laps around the Mac Studio, but it can’t as it doesn’t have enough memory. The huge compute power just can’t be harnessed.
Just get a second GPU; there are lots of 16GB cards for a good price, and two of those is 32GB for probably 1/5th of what a 5090 costs. Alternatively you could run two 3090s for 48GB for less than it costs to buy one 5090. Pure value wise I'd just rock a couple of 16GB cards.
@@jamieknight326 Check out the new RTX 6000 Blackwell, 96 GB GDDR7, GB202 chip. Gonna cost you a few bucks though 💀
@@mybaIIz Most models aren't optimized to run well, if at all, in multi-GPU systems. A small handful are, but most aren't.
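To put numbers on the 70B-model point above, a back-of-envelope, weights-only VRAM estimate (it ignores KV cache and activation overhead, which add more on top):

```python
# Weights-only VRAM estimate for an LLM; real usage is higher (KV cache, overhead).
def weights_gb(params_billion: float, bits_per_weight: int) -> float:
    # params_billion * 1e9 params * (bits/8) bytes each, expressed in GB
    return params_billion * bits_per_weight / 8

for bits in (16, 8, 4):
    print(f"70B @ {bits}-bit: ~{weights_gb(70, bits):.0f} GB")
# ~140 GB, ~70 GB, ~35 GB: even a 4-bit 70B model overflows a single 32 GB card
```

That's why the 32 GB card can't harness its compute here, and why the two-card suggestion only helps with inference frameworks that can shard layers across GPUs (Hugging Face's device_map="auto" is one example), at the cost of cross-GPU transfers.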
Time was there were different pro models for people wanting to compute with them, instead of just charging gamers pro model prices.
Guess I'm playing 2D indie games on my igpu until march.
Based
Sigma move i approve of
I mean, that's not a bad idea. IGPUs are pretty strong these days. Strong enough to handle e-sports titles, anything 2d, a surprising amount of modern 3d games with lower requirements, even Call Of Duty, if you're willing to beat the settings into the dirt.
@@sleeplessindefatigable6385 I bought a Ryzen 7840U based laptop to replace my 15 year old "coffee table laptop".... I was royally surprised at how much can be squeezed out of this little thing. It's a 39W TDP design with a 1200p screen, and it runs everything my RX 580 did.
The rtx 5050 will give you so-so performance to the dollar
You should do standup
X4 frame gen it'll be as good as what? A 4070 /s
nvidia gonna have to measure how worth it a card is by performance per 100 dollars soon
The rtx 5050 will give you -so so- SHIT performance to the dollar
I guess the joke went over most peoples heads
The ×90 cards are basically the Titans of each generation. I've been saying this since the 3090 release. It's been like that for a long while: 780Ti and Titan / Titan Black, 980Ti and Titan X, 1080Ti and Titan Xp / Volta, 2080Ti and Titan RTX. Then nVidia went and rebranded the Titan lineup into ×90 and suddenly they became the gaming highlight. Just like the Titans of yesteryear that had the ability to turn on increased fp64 precision in drivers for workload purposes, ×90 cards excel at AI processing today. Yet since they went from a separate lineup into the general GeForce RTX category, everyone now thinks they are exclusively gaming-oriented.
I agree. My take is that marketing just got annoyed with having to come up with monikers for them, so they switched to the 90 label. And the tech influencers bought it...
@@_Ekaros true, the Titan class was always kind of an experiment class; the prices have always wildly fluctuated. The Titan V from 2017, iirc, was 3k and designed for AI research and other industries, etc.
Jensen probably went, "find a way to sell this to both markets, boyos" (Jensen is Irish in my mind), so now they've rebranded back to the 90 series for consumers.
It's great business, but now we have it being used as a crutch by game devs, plus supply/demand issues. But my Nvidia stock is going to the moon, boyos.
@@Simp_Supreme I think they became 90s because Nvidia used to throw a fit about people calling Titans gaming cards and having them in gaming benchmarks while they were trying to start an AI buzz.
The 5080 makes the most sense as the flagship for 99% of gamers IF they are on the 10/20 series, or maybe a mid-to-lower-end 30 series card, if they are picking up the FE card. ASUS has lost its mind with the leaked $1800 for their Astral 5080. If you have a 40 series card then you should honestly skip the 50 series.
The RTX 5080 is really an RTX 5070 Ti for $999. If you look at past generations, the 70-class card always has about half the CUDA cores of the 90-class card. Ex: 5090 = 21000 CUDA cores, so according to that formula a 70-class card would have ~10500 CUDA cores.
Now that we've got that out of the way: a real RTX 5080 would have ~16800 CUDA cores, since 80-class cards have always been about 20% below the 90-class card.
So now let's do some math:
RTX 5090 = 21000
-20% = -4200
= 16800 = REAL RTX 5080
THIS IS THE BIG LIE AND NO YOUTUBERS ARE TALKING ABOUT IT, WHICH IS MAKING ME MAD.
Nvidia is trying to sell you a 70-class card for 80-class card money.
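For anyone who wants to sanity-check that claim, the 80-class share of the flagship's CUDA cores is easy to compute. A minimal sketch (the core counts below are the commonly cited spec-sheet figures; verify against NVIDIA's pages before quoting):

```python
# 80-class CUDA cores as a share of the 90-class flagship (commonly cited figures).
pairs = {
    "RTX 3080 / 3090": (8704, 10496),
    "RTX 4080 / 4090": (9728, 16384),
    "RTX 5080 / 5090": (10752, 21760),
}
for name, (eighty, ninety) in pairs.items():
    print(f"{name}: {eighty / ninety:.0%} of the flagship's cores")
# ~83%, ~59%, ~49%: the 80-class share has roughly halved in two generations
```

By that measure the 5080 sits roughly where 70-class cards historically sat relative to the flagship, which is the comment's point.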
You are wrong.
My 2080S is working flawlessly.
It will remain in my 300 euro case for a loooong time to come.
I do not see any reason to buy a new GPU, and even fewer reasons to buy Nvidia.
@ Fair! I've heard that the 2070 super and 2080 cards are still holding strong. It sucks that Nvidia doesn't have enough competition on the higher end to promote better prices
@@Dstryrr 7900 XTX: 👁👄👁
I dunno what res you guys game at but AMD have a 4080s competitor for cheaper.
@dr_diddy well, yeah, but Nvidia's suite of software features does beat out AMD. Although in terms of raw performance they aren't too different at certain levels, a big reason I still lean towards Nvidia is the software. Plus for myself, at least, my PC is around the $3500 range, so 100 bucks isn't much of a big deal. I'm considering a 5090, but honestly $1000 more for the jump between the 5080 and 5090 is not worth it at all for me.
Bought a 4090 FE when it came out two years ago thinking "I can't justify this, I'm an idiot, but I'm so happy!"
Sometimes my genius... It's almost frightening.
you're less of an idiot than 5090 buyers. congrats?
the 50xx series with only 8gb ram still is a hard no for me
im still only interested in pure raster perf no RTX needed
8GB VRAM wasn't even enough on the 20xx series. Honestly, the lack of VRAM on the non-5090 cards is the most shocking reveal to me. Pure raster means MORE VRAM is needed; ironically, if you're only interested in pure raster you should want more VRAM.
Get a 6800XT. 16gb, 3080 / 4070 TI performance, $350
If they pass down FSR4 to the 7000 series, the 7900 xt(x) will be a good option that should have staying power
@@Lotus_River It's actually wild how GPUs are deliberately designed worse to boost prices. I remember when the 2060 had RTX... RTX for what? Games were a slideshow and it was pointless in 3D software.
Budget GPUs shouldn't have RTX, since with it they're barely playable, and using DLSS and FG causes more issues at 30fps and below.
16GB on a 4K card... Nvidia is out of their mind, surely 😭
What are you going to do when most games have ray tracing by default and you can’t turn it off?
looks like im staying with my 7900 xtx
25% faster, 50% more expensive in Europe and Australia... For AI you need more VRAM, that's why NVIDIA sells $7000 professional cards with professional feature sets and more VRAM, the AI VERSION OF THIS CARD has 96GB VRAM.
There is definitely a lot of AI you can do with 32 GB. Not all models will fit in, but the ones that fit will be blazingly fast with 5090
the only "professional feature set" those cards have is now the same thing the consumer cards have... ridiculously inflated prices. Everything else is just marketing B.S (if you weren't born yesterday you'd realise soft quardo was a thing. hint: its all 98% software limitations).
see you in 2 years, rtx 6090 will cost 4000 USD
I see you're an optimist
If you bought a 4090, you spent the extra money to skip generations.
The 5090Ti will be the new 6090, 2 years down the line ;)
Exactly, no need for Nvidia to drop it since AMD and Intel are sleeping...
I waited for the 5090. Evaluated the specs for the price. In my country's currency, most likely $4500+ for AIB partner cards. Couldn't justify that amount of money for a GPU, even though I could afford it. I purchased a refurbished Zotac 4090 for $2900. Just waiting for the 9950X3D or 9900X3D to be launched so I can build a new PC after 10 years. The 1080ti was... IS a GOAT GPU for the money!
Same. Was waiting for the 5090 to upgrade my 1080ti, but decided to go with a 4070 Super rather than try to buy an MSRP 5090 and deal with scalpers. Now I just need that 9950X3D.
@@HailAzathoth u dont need the 9950x3d for 4k gaming.
4090 is an amazing card. Congrats!
I love it, an assessment of Nvidia that leaves commenters with little to complain about
9:40 - Zoolander! 🤣
$2k+ is never a good price for a GPU...
Not even for an rtx 6000?
80 bucks and requires a raytracing gpu making it "doom - not gonna play it for ages".😂
Even consoles have raytracing now though. You can't even play console games on your rig? Lol
@@kyler247 Whatever raytracing consoles can do with their RDNA2 APUs is hardly usable. Forced raytracing in crossplatform releases is doomed for mass adoption until the generation after the next at least, because PS6 (RDNA 3.5 equivalent, finalized before upcoming 4th gen cards) is not going to run it well either.
@Reirainsong blah blah blah console gamers will be able to play Doom the Dark Ages, which requires a raytracing GPU. That's the truth. No mental gymnastics about how bad the raytracing is, they can do it and they can run the software. PC gamers crying about it need to upgrade their PC to bare minimum console standards.
@@kyler247 This means that console version raytracing will be cut down to such an extent it would run without dedicated hardware support.
Thanks Joe and Paul for the Tech News! Now I can snooze😂
I guess I'll wait till the 7090. That number is probably the amount of pounds it will cost. Given the UK economy, it'll actually be about 10,000 quid.
7:00 "Why can't we buy them, AMD?"
Let me answer that rhetorical question: "Because Nvidia have hogged/will hog all of Jan and Feb 2025 with their 50xx release".
Re the 5080: Call me overly demanding, but a new GPU that doesn't beat or at least match the previous gen's next-higher-tier card is more than a little disappointing. The 5080 should be at least as fast as the 4090, and if it's not, it should at least be priced accordingly to make up for the relative lack of uplift - something Nvidia also doesn't seem to be considering.
So far it looks like there won't be any real "sweet spot" or hidden gem-cards in the 50xx range. 5080 will be too slow compared to both 50xx and 40xx and probably get a "healthy" increase in MSRP compared to the 4080 Super. 5090's uplift vs 4090 is ... there ... but not overly impressive plus you have to pay for it through the nose. 5070 is allegedly slower than the 4070 Ti Super and still only has 12 GB. The only candidate left that *might* turn out as a mild positive surprise is the 5070 Ti, but I'm not getting my hopes up for that one, either.
The 3080 was 30% faster than the 2080ti, stock vs stock. The 2080ti was underclocked and needed more power; give it 400w and it's 3070ti perf. The 4080 was 15% faster than the 3090ti because of the node shrink.
The biggest issue is how heavily cut down each tier is, plus using the same node.
@JakeySurani Speaking of which: 3070 was about as fast as the 2080 Ti.. :)
And yes: The gap between 5090 and 5080 WRT CUDA cores, VRAM, etc is immense and thus it seems like a deliberate design-choice on the part of Nvidia that the 5080 will be pretty underwhelming when compared to both the 5090 and the 4090/4080.
The 5070 Ti does not look good either compared to the 4070 Ti Super. Same 16GB buffer and just 5% more SMs. The only major difference is faster VRAM, but since the Ti Super already had 672 GB/s of bandwidth (a very decent amount), the roughly 30% increase here will not provide any significant improvement either. (Quick math below.)
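For anyone who wants that spelled out, here's the napkin math. The pin speeds are my spec-sheet assumptions (21 Gbps GDDR6X vs. 28 Gbps GDDR7, both on a 256-bit bus), so treat this as a sketch:

# Memory bandwidth = per-pin speed (gigabits/s) * bus width (bits) / 8
def bandwidth_gb_s(pin_speed_gbps: float, bus_width_bits: int) -> float:
    return pin_speed_gbps * bus_width_bits / 8

old = bandwidth_gb_s(21, 256)  # 4070 Ti Super: 672 GB/s
new = bandwidth_gb_s(28, 256)  # 5070 Ti:       896 GB/s
print(f"4070 Ti Super: {old:.0f} GB/s | 5070 Ti: {new:.0f} GB/s (+{new / old - 1:.0%})")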
@@stangamer1151 Yeah.. and for *some* reason neither the 4070 Super nor the 4070 Ti Super were mentioned as reference points in any of Nvidia's marketing-material. If my 4070 Super wasn't staring me in the face right now, I could almost think neither of these two cards ever existed.. :D
@@1SaG I bet 5070 will not be able to match 4070 Super. That is why Nvidia pretends there were no Supers at all.
The 5070's die is so heavily cut down compared to the 5090 (just 28% of it, to be exact) that this card obviously should have been the 5060, or at the very least the 5060 Ti.
I agree with the thumbnail Paul. I look at that card with disgust too. This is anarchy!
Imagine paying the price of an entire PC just for the GPU alone. This is getting ridiculous. Even the lower-end cards cost twice as much as they need to. It's hard to get excited for new PC hardware anymore when I know more BS no one wants, needs, or asked for is being added to justify drastically higher prices.
I remember when AMD launched the 295X2 for $1,500 and people thought charging that much, even for what was an absolute monster of a card for its time, was insane. Now here we are pushing clean past $2,000 for the base model, and everyone acts like this is normal.
Paying $2,000 and not even getting the full die, plus the branding of a 70-class card as an 80-class card CUDA-core-wise, is turning me off. Skipping the 5000 series again with my 1080 Ti :)
Ty so much for the lovely weekly tech news you do so well; no need to sit through a super long video! Much love to the channel, I have been watching you since Newegg!
A bad pour on an expensive beer is the absolute worst!
Honestly, the amount of beer I waste yearly due to bad pours will pay for my 5090.
Beer tastes like shit anyway. How can people drink that garbage? I will never understand.
@@Cenkolino I see you don't understand that everyone has their own taste preferences. I don't like IPAs, but I do like many other types of beer. Yet I can't stand coffee and tea, which most of the world seems to love. It is what it is.
@@Cenkolino I think peanut butter, almond milk and maple syrup taste like shit. I don't need to understand why other people like it.
@@Cenkolino I didn't like beer when I was young. With age I like sweet things less and less. In my mid-30s beer started tasting "ok, but nothing amazing". Now I really like the complex mix of bitter, sour, bready, and mildly sweet. Your tastebuds will change with age too...
If I were ready to drop 2000+ USD for a "small" AI accelerator, I'd rather wait for 3000 USD "Project Digits" in May. Those at least will provide 3 times more memory for AI needs and are stackable. And will not need all the electric power in the world. So are 5090 GPUs useful? Maybe, for the people Huang was talking about, those with "10000 USD entertainment centers".
Holding out for the 6090, because it's almost RTX 69.
nice!
Nice
Nice!
Nice!
lame
i see nvidia is still smoking its own supply.
Imagine if this industry wasn’t filled with monopolies.
Something can't be filled with monopolies; that's the opposite of a monopoly.
9:39 - Brilliant edit, and tasteful segue. You, sirs, are scholars and gentlemen. Thank you.
Ah, that old meme of linus and steve is pure gold
This! RIGHT HERE 2:13 is exactly what I want: all the fluff cut out and a condensation of the reviews, instead of having to plow through all the professionals' videos to form an opinion.
The 20 series was bad compared to the 10 series; the 50 series is gonna be the bad one in hindsight.
I have the MSI 4090 Suprim Liquid X and that card stays hella cool. Playing the new update for Cyberpunk with ultra RT and path tracing, I hit 52C at 1440p and 59C at 4K. It's a great card.
800 watts for a graphics card? Can't wait until it's common to have a dedicated graphics-card power supply =D
I thought in the US the maximum power through an outlet was 1800 watts, or 2400 watts if it's a 20-amp circuit? So we are getting closer, but not quite to the maximum yet.
@@EdDale44135 Well you can get 30, 40, 50 Amp receptacles if needed though 🙂
@ is Nvidia / Intel / AMD paying for the electrician to replace the breakers?
@ You would also need to change the wiring to meet code when changing amperage, but no, that's on you.
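For the curious, here's the outlet math in one spot. I'm assuming standard US 120V circuits and the NEC 80% rule for continuous loads (which a long gaming session arguably is):

# Back-of-envelope US outlet math.
VOLTS = 120
for amps in (15, 20):
    peak = VOLTS * amps        # breaker limit: 1800W / 2400W
    continuous = peak * 0.8    # NEC continuous-load limit: 1440W / 1920W
    print(f"{amps}A circuit: {peak}W peak, {continuous:.0f}W continuous")
# An 800W GPU plus CPU, monitor, and PSU losses already crowds a shared 15A circuit.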
I laughed so loud at the "mom and dad are fighting" joke I scared the dogs...
The RTX 5080 is really an RTX 5070 Ti for $999. If you look at past generations of graphics cards, half the CUDA cores of the 90-class card has always made a 70-class card. Ex: the 5090 has ~21,000 CUDA cores, so according to that formula a 70-class card would have ~10,500 CUDA cores.
So now that we've got that out of the way: a real RTX 5080 would have about 16,800 CUDA cores, since 80-class cards have always had about 20% fewer cores than the 90-class card.
So now let's do some math:
RTX 5090 = 21,000
-20% = -4,200
= 16,800 = the REAL RTX 5080
THIS IS THE BIG LIE AND NO YOUTUBERS ARE TALKING ABOUT IT, WHICH IS MAKING ME MAD.
Nvidia is trying to sell you a 70-class card for 80-class money.
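To sanity-check that math in code (a rough sketch using the round numbers above; the real counts are 21,760 CUDA cores for the 5090 and 10,752 for the 5080):

# Tier math from the comment above, using its round numbers.
flagship = 21_000                   # "RTX 5090"
expected_80 = int(flagship * 0.80)  # 20% below flagship -> 16,800
expected_70 = flagship // 2         # half the flagship  -> 10,500
actual_5080 = 10_752
print(f"expected 80-class: {expected_80}")
print(f"expected 70-class: {expected_70}")
print(f"actual 5080:       {actual_5080}  <- lands at the 70-class mark")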
This happened in the 4000 series too, or not? 😁
@@markterek6232 Yes, and they are doing it again. The 4060 was really a 4050. Think about it: why do you think there were no 4050 desktop cards, only the 3050? No competition that way; it was only a little bit faster than a 3060.
The 4080 was 25% faster than the 3090.
The 5090 is 25% faster than the 4090.
Because the 30 series was on Samsung's shit node, the 40 series on TSMC's then-current best, and the 50 series is on TSMC's yesteryear node. The 30 series made Nvidia a shiiiit ton of money, the 40 series less so, and now the 50 series is back to milking.
"RIP 1080ti holdouts. I guess we're all going to have to upgrade now". I felt that. To be fair, I was also planning to buy a 5080 or 5070ti.
Eeeh, still running my 980 Ti. New games are complete dogshit anyway.
I have been using a 1660 Ti gaming laptop for the last 5 years. I think I may go desktop AMD, or continue using what I have, get an Xbox, and use Game Pass.
You buy a 5080 just to fill Jensens pockets
@@jesusbarrera6916 Sparkly black leather pockets
@@jesusbarrera6916 Depends on the performance of course. If the 5080 can't deliver what I want for the price, then I'll probably go with the 5070ti and save some money.
Great tech news as always! Thank you Paul and team!
Greatly appreciated the skipping of the LTT and GN news.
This ripoff society we've set up is really lame at this point. I miss the good ole days.
The irony of having the 🍊 as a pfp.
@@DagobertX2 even though biden ripped us all off?
@@DagobertX2 No?
You're so right. But I'm sure tariffs will totally help these sky-high prices!
@ Yes
Most of us RTX 4090 owners will be skipping the 50 series completely. I don't want fake frames or AI in my games; I want more raw power. Maybe I will get an RTX 6090 if it's any good.
0:41 $2000 is indeed a good price, but will you see one selling at that price? 4090 sales make me doubt it.
Basically, the Chinese have been stripping 4090s of their GPU chips and memory for AI ever since they were banned from buying the chips.
I've become one of those "people", but I'm waiting for the 5080 and 5070 Ti.
If mom and dad are fighting again, do I get more presents for my birthday?
Yeah, well, Mom's being a total bitch, and Dad's gone and built a new shed on the edge of the property! And neither of them ask me how *I'M* feeling! [breaks out journal and purple glitter pens] LOL
I think you're on to something, there!
OMG Paul, your ending was priceless! We sometimes say that at work when people get into a big debate over whatever..... "I hate it when Mom and Dad fight". 🤣
Saying that the RTX 5090 costs $1,999 is misleading; you know full well that you can't buy one at that price and that it will be around $3,000.
😭
The aftermarket OC versions are $2,200, so it's just $2,000 for a standard 5090. Plus, Nvidia GPUs hold value: the 3090's MSRP was $1,500 and they still sell for $800. The 5090 is twice as powerful. So powerful it's CPU-bottlenecked in gaming.
@@Conumdrum An RTX 4090 sells for $2,400; are you really claiming an RTX 5090 will be cheaper?
The “S” in MSRP stands for suggested. This doesn’t mean you will pay that. And we all know this is the case w/ anything. The market will determine how much they can charge.
@@Bob_Smith19 The 40 series sold for MSRP; there's no reason to think the 50 series will be any different. Bitcoin mining is dead.
AMD: Never misses an opportunity to miss an opportunity.
Intel: Always misses when it has the opportunity to dominate.
NVIDIA: Gives zero f**ks.
Let's not forget that AMD is already a powerhouse in CPUs, and they're running a big enough show as it is, so it would be hard for them to own both the GPU and CPU markets when you think about it. Nvidia gets to focus on one asset, GPUs and gaming, and Intel mostly on CPUs plus some decent GPUs. Nonetheless, across the board AMD is the winner, because their CPUs are top tier and their graphics cards are only a notch or two below. And who really needs more than an AMD RX 9000 series GPU when those come out? You can easily play at 1440p, and 4K 60 fps.
I genuinely love Paul's channel. I always wait until things come down and just come here, drink something, and enjoy the news.
The hard truth is I'm locked into Nvidia because of their proprietary software: CUDA-based frameworks like TensorFlow just don't work without it, short of fighting with ROCm. (Rough sketch below.)
For all the people who only play games, I don't know why they don't just buy less expensive gpus.
AMD doesn't help with their mostly piss-poor pricing. Nvidia left the door wide open in the mid-tier market, the largest market. And what did AMD do? Like they always do in the GPU market these days, they fell flat on their face.
Well, at least their CPUs are good. But those are also pretty expensive now that Intel has fallen far from its perch.
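Re: the lock-in comment above, a minimal sketch of what it looks like in practice. This assumes PyTorch; most ML code targets the torch.cuda API, so Nvidia is the implicit default (ROCm builds of PyTorch do re-expose the same namespace, but many libraries still hard-code CUDA-only paths):

import torch

# Typical device selection: code like this quietly assumes an Nvidia stack.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = torch.nn.Linear(512, 512).to(device)
x = torch.randn(8, 512, device=device)
print(f"running on {device}:", model(x).shape)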
They don't buy them because of brand tribalism.
@imbatman3537 Because AMD is never at fault, and they launch their competitor GPUs at $30 less, three months after everyone has already bought Nvidia cards.
Nooooo.... it's the customer who is wrong
@@jesusbarrera6916 it's both to be fair, there are like 3 camps imo.
>People locked into software packages
>People loyal to brands
>AMD dropping the ball
AMD has the better GPUs in the bigger market due to pricing, but as you mentioned, they just set their prices off Nvidia's. They need some cojones; they dropped out of the enthusiast market only to keep giving us dogwater, and now we have $2k cards again.
@dr_diddy They only have better pricing months after the cards come out, when everyone has already bought Nvidia.
People have to stop making excuses for AMD
This time the market will buy 5090s and 5080s, while the rest buy last-gen cards cheap first, BEFORE AMD even shows off its only GPU.
This is going to be the best chance AMD will have in years to gain market share. FSR4 looks promising and the 50 series cards are basically just expensive sidegrades with (even more) fake frames as their main selling point. If AMD can deliver a solid upgrade for a reasonable price I'll probably switch.
The truth is, the more you buy, the more you save.
And the more you own nothing, and be happy with fake frames lol
@@natedisrud why are you all complaining? what are you going to do? NOT BUY IT?!?!
LOL RIGHT.... What are you going to buy? AMD? INTEL?
In the end, you'll buy it anyway! LOL!
THANKS FOR THE LEATHER!!!!!!!!!!! AHAHAHAHHAHAHAHAA
@jensenhuangnvidiaCEO I'm sticking with "YOUR" RTX 4080s. 🤣
$2,000 is NOT a good price for any consumer PC component, full stop.
"mom and dad are fighting" is also a gross oversimplification, but sure let's go with that and not call out any of the disgusting behavior
Don't buy it lol. Simple as that. I'll be buying one day one at Microcenter.
the entire point is that this is not a consumer component, kinda like pickup trucks are not consumer cars but some people buy them as such anyway
@@arha-z1v Pickup trucks are very much designed as consumer cars. Modern pickup trucks aren't working vehicles, they are status symbols for people trying to overcompensate.
@ i think you are on a good path to getting my point
@ I disagree with your point that pickup trucks aren't consumer cars. They are designed, from the ground up, to be consumer cars.
oh wow! I was waiting for next years water-cooled-double-power-railer-1000-giga-chad-watt 6090 (Nice!) but I didn't know about that 5090 Ti coming :) thanks for the news @Paul! :)
As far as I can tell, these cards will be going for upwards of $4,000 to $4,500 here in Australia. Maybe even more.
Just looking at them from a rasterisation perspective, these cards are far from justifiable when compared to the 40 series, or even the 30 series, and the machine-learning stuff isn't a selling point for the majority of people out there, because barely anyone actually cares about it.
Here to drink the tears of gamers that spend 74.6 hours a week enjoying leisurely video gaming activities and can't afford the latest technological marvels that allow them to play 74.6 hours a week enjoying leisurely video gaming activities at a higher framerate because they have no skills and are poor because they spend 74.6 hours a week enjoying leisurely video gaming activities.
yay not going to buy a 5090.
This is why we are starting to hate even you reviewers. "It's worth it, because of something you'll never utilize." That's everything wrong with Nvidia's GPUs and pricing: selling people on stuff that is either A. completely fake and not worth it (DLSS) or B. software features they will never use. The whole PC industry atm is just horribly disappointing for this reason. Motherboards: overpriced for features no one will utilize. GPUs: overpriced for features people don't even want. Reviewers: "no, it's worth the money, for people NOT you." Stop it.
Tech Enthusiasts: “RTX 5090 is a good buy at $2,000 because of AI”.
*Deepseek enters chat*
Tech Enthusiasts: 😮
I mean, it still is. DeepSeek only gets better the more VRAM you throw at it.
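A rough rule of thumb for what more VRAM buys you with a local model (ballpark assumptions on my part, not official DeepSeek requirements):

# Ballpark VRAM to run a local LLM: weights at the chosen quantization,
# plus ~20% headroom for KV cache and activations.
def vram_gb(params_billion: float, bits_per_param: int, overhead: float = 0.20) -> float:
    weights_gb = params_billion * bits_per_param / 8  # 1B params at 8-bit ~ 1 GB
    return weights_gb * (1 + overhead)

for size in (14, 32, 70):
    print(f"{size}B model @ 4-bit: ~{vram_gb(size, 4):.0f} GB")
# A 32 GB 5090 fits a 4-bit 32B distill (~19 GB); a 4-bit 70B (~42 GB) does not.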
It's basically the same story as every GPU launch at the end of the day. If you're on the GPU from last generation, don't upgrade. If you're on the GPU from two or more generations ago, upgrade. The 5090 isn't a good upgrade from the 4090 but it is a good upgrade from the 3090.
Nvidia doesn't care about the gaming market anymore; they just do whatever they can to make their AI-focused chips work in games.
❤
I have an RTX 4090 that I got at MSRP, so I'm going to hang on to it. And my work rig, which I use for 3D modeling and rendering, has two RTX 3090s, which render very nearly as fast as a single RTX 5090, so I'm keeping that too.
That Zoolander/Hansel AI shot
“Nope. Stop it. Not gonna discuss it. Mom & dad is fighting again.”
😂
0:50 Yeah, as someone invested in Nvidia the last few years: they do not care at all about gaming anymore. It's the last thing talked about in their quarterly reports, for good reason. They've done 100-200% more business year over year for the last 2-3 years, ALL in data centers and AI chips, deals with car manufacturers, etc.
Pro tip: try to snatch a 4090 for a good price before the 5090 is out.
4090s have been going for more than 2 grand for months now.
I've been searching; they're fucking expensive again 😅
Linus and Steve in the same FRAME. Only Paul could do something like this.
*SAINT APOSTLE PAUL, THE PEACEMAKER*
Sorry, but Paul is buying propaganda. Nvidia shut off production of the 4090 many months ago. The only reason the price is so high is that supply is basically nonexistent; do that with any card and the price goes up. I'll admit there is some demand, but the idea that the price is surging because of demand alone is absurdly wrong. It's basically only because of the production cuts, made to set up the market so that people like Paul will declare the $2,000 5090 a great value. You don't have to be a paid shill to work for Nvidia. You just have to buy their bullshit and repeat it.
I passed on the 3090, but when going for the 4090, I purchased it early in its lifecycle to get the longest use and thus the best return on investment... I do use it for gaming, but I'm a software engineer, so having it for CUDA/tensor model development is very nice.
I'm saddened to hear someone like you say $2k for a 5090 is acceptable; you're part of the problem.
He clearly stated that it was for AI consumers, not gamers. It's Nvidia misleading everybody by marketing these towards gamers, not Paul. These aren't cards designed with gamers in mind; they're made to take advantage of the massive AI bubble. And hey, as long as there are buyers willing to pay that much for GPUs, it's Nvidia's job to keep selling them at that price. Nvidia exists to make money, not to be a charity. Blame the human stupidity that allowed Nvidia to price GPUs this ridiculously high, not Paul for stating a sad fact.
@@davidepannone6021 We can say whatever we want; at the end of the day the card is advertised for gaming, not AI, and Nvidia knows that.
What are you moaning about? The 5090 is not a gamer GPU and never has been (previously the Titan), which is why it isn't priced as such. These will sell out to people who will waste them on "AI" and a few richer gamers. People would easily pay over $3,000 for these; during the plague we saw people pay anything for a high-end GPU.
@@thewingedringer that's what I said lmao. So who's to blame, Paul or Nvidia? Lmao.
“Part of the problem” = words always said by a POOR
Uses more power than my clothes dryer.
Ya, you're actually wrong. Even with AI the 5090 still isn't worth it, because Nvidia markets the 5090's AI in terms of DLSS4, which is a huge lie about what the 5090 really does. If you wanna see what the 5090 really does, turn off DLSS and run the card at its max; then and only then will you see what the 5090 actually does. When you turn on DLSS4 you're getting fake frames and increased latency, aka less performance.
He did not mean DLSS when he said AI. He meant FP4, NPU count and generative compute performance. Not for gaming.
@@RalakFrozenDeath You do grasp that AI is part of DLSS, right?
@@ColdVenom159 You do grasp that people use these cards for AI work that has ABSOLUTELY NOTHING AT ALL TO DO WITH GRAPHIC RENDERING, right? DLSS is AI, but not all AI is DLSS. When Paul was speaking about its value in AI work, he wasn't talking about DLSS, games, frame-rates, "fake frames", latency, or anything about graphics whatsoever. Just the raw value for people wanting the 5090 for its generative compute performance.
@sigmahyperion955 You do grasp that on the 5090, DLSS4 is based on AI, right? Did you even listen to Nvidia when they spoke about how DLSS4 actually works? Without AI the GPU is a rip-off in terms of performance.
When the 5000 series was revealed, I was FOMO-ing hard over having bought my 4070 TS during Black Friday, but it seems like with every day that goes by, I feel better and better about my purchase.
Glad you’re happy. It’s hard not to get FOMO
One of my favorite videos of the week, thanks Paul 👏👏
Steve vs. Linus: Steve is NOT the angel most viewers think he is. He should just focus on what he's best at, hardware reviews!
Linus is a literal criminal and got caught being that.
*me who likes both sides*
*eating popcorn furiously while watching both sides destroy each other*
And here in Denmark, pricing for the 5090 so far is roughly $3,000-3,500 for the Asus, Gigabyte, and MSI cards 😅
Greenland will be ours soon
Keeping my 4090 for sure
If I recall correctly, the 4080 was around 25-30% faster than the 3090. It's kind of disappointing that the 5080 doesn't keep that trend going.