@@deanal-jackson4593 Perhaps AMD should be? They've conceded the high-end market to Nvidia because they can't compete. Their ray tracing still drags performance in the dirt. Nvidia offers many more features, and I hope FSR4 is an improvement, but after so many iterations without getting it right it's probably just another AM5-type event.
I doubt it, Nvidia learned with the 4080 super. They're not gonna overcharge like last time, it's probs gonna be $999 and the 5090 is probs gonna be $1799.
Unless... 5090 is a paper launch and not truly available for 4 months and people lose patience and buy 5080s. Which seems very plausible. I'm waiting for the 5090 one way or the other. I've never built myself a top of the line PC and this is a divorce present to myself. Do I need to spend $5k on a PC? No. Do I WANT to spend $5k on a beastly rig and use it to play D2R and Genshin? Absofuckinglutely. The upgrade from 2060 and a 3600 to 5090 and 9800x3d will be the best present I've ever given myself.
They want to compress the textures in some way, plus less material cost for higher profits on the lower end low margin parts. They want you to buy the 5080 for more vram. Even with more vram, amd can't compete on the high end.
@Krypto121 even entry level GPUs can already utilize more than 8 GB in modern demanding titles. Nvidia is limiting the VRAM more than would be sensible for gaming purposes, because they do not want these cards to be used instead of their workstation counterparts... In general, more VRAM enables better visuals at all GPU classes. Of course, there is a point of diminishing returns, but current Nvidia GPUs are far from that.
They probably will... like a year later. Tbh it's not really worth buying into Nvidia until launches have had time to shake out, because that brand-new 80-series card you bought for a thousand? Guess what? A few months later, a 5080 Ti Super Duper will come out.
Probably 24GB, 500W and 15k cores. So right in the middle. The 5090 is already cut down by 11%, so the 5080ti would be cut by about 35%. TSMCs yields are so good that this should make every die salvageable.
@@ExtremeGamersBenchmarks I’m not talking about using a different connector. I’m talking about using two of them. In case you haven’t heard, even ones plugged all the way in properly burned. Putting two of them on the board will stop that from happening by splitting the current up between the two so each one only has to deal with half the current. Since the problem is that sometimes these connectors are not properly handling all the current they were supposed to, using two will solve the problem completely.
I can't see PCI-E 5.0 having much if any FPS difference compared to 4.0. We only saw around 3 fps of difference in the previous gen, and those cards barely saturate the throughput of 4.0. Won't be enough of a difference to make me upgrade my mobo.
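For rough context, the x16 link bandwidth roughly doubles each generation; here's a quick back-of-the-envelope sketch (per-lane transfer rates are from the published PCIe specs, while the ~3 fps figure above is the commenter's own number):

```python
# Rough PCIe x16 bandwidth comparison. Per-lane rates come from the PCIe specs;
# 128b/130b line coding applies to Gen 3 and newer. Protocol overhead is ignored.
RATES_GT_S = {"PCIe 3.0": 8.0, "PCIe 4.0": 16.0, "PCIe 5.0": 32.0}
LANES = 16
ENCODING = 128 / 130  # usable fraction after 128b/130b encoding

for gen, rate in RATES_GT_S.items():
    gb_per_s = rate * LANES * ENCODING / 8  # bits -> bytes
    print(f"{gen} x16: ~{gb_per_s:.0f} GB/s")
# -> PCIe 3.0 x16: ~16 GB/s, PCIe 4.0 x16: ~32 GB/s, PCIe 5.0 x16: ~63 GB/s
```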
No way the 5080 only has 16GB. The gap between the 80 and 90 is too massive. Plus, 16GB on an 80 card in 2024 is pathetic. It's gotta be like 20+. I don't believe it... But it's also Nvidia, so it's absolutely something they would do.
12:05 I don't know why, but I want this to be reviewed by Steve1989MRE. I want to hear his take on it because it looks like something you would find in an MRE.
ok, caffeinated ramen I kinda get... but once you reach the "we need to put ramen in a squeeze pouch" point you really need to just stop & reconsider your life choices
Wow, a whopping 16GB of VRAM on a graphics card which will likely have an MSRP of about 1200 USD or more, and most models will likely go for 100-300 dollars MORE than the MSRP. Sure, that will be plenty of VRAM most of the time for maybe a year or two after it releases, but the VRAM will likely become an increasingly limiting factor for the RTX 5080 over time. That 16GB of VRAM is going to be a problem for ray tracing, and also frame generation, in an increasingly large number of games each year after it first launches. I think 16GB will probably be pretty good for next-gen mid-range graphics cards, but it's just not enough for a high-end one in that price range.
The 4090 and 4080 weren't that far off in performance, bud; I literally owned both 😂. Also, most of y'all complaining don't even play games in 4K, so you won't even need it 😂😂
Yep, the funniest part is that people expect the 4090, or in this case the 5090, to be accessible and affordable to everyone... while also being the best GPU on the market by a mile lol. The highest-end GPU is not made for people who can't afford it and it will never be that way, so the people crying that it's 2k or whatever are exactly the people Nvidia is NOT targeting to begin with.
@@unearthlynarratives_ fr doe... a lot of people with room temp IQ be angry at Nvidia... A high end GPU ain't a right or sum... like how you gon get mad because some of the world's most modern and capable GPUs are a bit expensive xD Might as well complain that a McLaren or Bugatti is too expensive....
Well since they are already shutting off the 4090 pipeline I kinda doubt it other than the used market. And that would be a lot of money to gamble on a used card.
Afraid that's not happening. Nvidia has stopped production of the 4080/90 to run stock low, so prices are set to rise very soon, and used cards will rise to nearly as high as they are now, especially with the 5080 only having 16GB of VRAM. But yeah! Wait.
@@aldermediaproductions695 Well, I have a 2080/3090/4090, and I must say, I got the 2080 from my brother so I've been playing around with it. It's still a decent little card, yet the 8GB of VRAM... dear god, it's a total nuisance. 24GB... what freedom. Yeah, I need to get some use out of the cards I've got, but it's always nice having the best; can't do the crappy 5070/5080 nonsense, get the best always.
Man, I really want a 5090, but I know it won't be any less than $2000 or $2500.... Man, I really don't want to spend that much money on a computer part. I know bleeding edge is pricey, but that's too much. Please keep it to like $1000 max.
What makes you think it'll drop for anything less than $2k when the 4090 did the same thing and insta-sold out? Especially with the artificial inflation rn.
@@Stubbies2003 Yep, I'm essentially daydreaming, wishing it would somehow turn true. For the past few years I've been avoiding buying any Nvidia GPUs in hopes of decreasing the number of units sold, but I alone am too weak and people bought more of them anyway, so here I am crying about it while it's $2k+. I own a 2080 Ti and now we have a GPU that costs more than twice that 😔
Oh man, Paul just reminded me of the play by play guy from 'Major League'. When 5090 pricing is announced, I'm pretty sure that beer will have transformed into Jack Daniels.
It looks like somebody combined a mountain dew and canned spaghetti and put it in an applesauce squeeze pouch. Might as well just blend your food and bottle it up and drink that lol
You don't need a 5090. The cycle of people whining nonstop about high end GPUs being expensive and then collectively making them sell out the day they are released when no one even needs this much power is very tiring, I really hate the internet.
Yet another Nvidia launch where all the juice is put into the 90 model, and the other ten models are crap and heavily overpriced. I am so tired of Nvidia. We gamers need some love from our dear GPU manufacturers. PC gaming will slowly die if these prices continue.
Our dear GPU manufacturers are what we have made them into. If 95% of gamers will only buy Nvidia, and "the best" happens to always be Nvidia, and then AI happens and "the best" is a pretty good AI card, then gl hf kthxbye. I have 5 real-life acquaintances who are into PC gaming, and they will all only purchase Nvidia. I've asked, and the reasons have been "I don't know AMD so well", "I've heard that AMD is worse" and "Well, everyone buys Nvidia". To someone who likes to research before a purchase those reasons suck, and they also show the mindspell Nvidia holds over the sheep.
Meanwhile AMD just said "We give up." Thus dooming the consumers to whatever Nvidia wants. It's bad enough that we have a duopoly but it's made worse when one side flat out gives up. People want to blame Nvidia for everything but the blame rests with AMD and their absolute failure to put any effort into their products over this last decade. Hopefully Intel gets off the ground enough to start threatening AMD in the budget and mid tier markets to force them into a bind. Maybe then they will decide to actually do something for a change.
@@zam1007 When you have no reason to hold back it gets hard to just leave money on the table. The market is out of balance and AMD just made it worse by noping out of it effectively.
@@Vantrakter We have only two GPU producers in the market, and AMD's products consistently perform worse in general than Nvidia's. So by the numbers, when you actually look at the performance reviews, Nvidia is the better buy unless your use case happens to fall into one of the places AMD is edging ahead. We have gotten here because for the last decade AMD has refused to bother competing in the space; they need only price themselves a little cheaper and claim it's "value" pricing regardless of actual performance. That's going to burn customers and, by word of mouth, condemn the brand, and those customers will go to the only other game in town, Nvidia, which by being in first place in a two-horse race is automatically the "premium" brand, with the prices and performance expectations to boot.
If you think the 5090 specs are unbelievable, just wait for the pricing.
I really really hope they don’t keep pushing the price of the flagship up, but I’m pretty sure they will.
The pricing will also be unbelievable! And that's not a positive "unbelievable"!
5090 8GB will start at $1999.
50 Cent, jensen would be kind enough this time
I would say $2000 for the founders edition, as the 4090s sold like hotcakes.
The RTX 5090 won't come with a price, just a monthly subscription price of $99/month for 5 years.
With cheese Mr. Squidward....
New GeForce Now subscription plan
Dude. Dude. No giving the suits more ideas.
If that also includes a month's supply of Boost Noodle... then count me in!
Hahaha in your dreams
That gap between 5080 and 5090 is so wide, they can fit 5080 super super, 5080 ti super, 5080 super ti, 5080 super ti super in there at least.
It should have been called the Super Duper.@@sys-administrator
Yes, each with an accompanying price increment but likely without a VRAM increment. During the 5000-series launch parade of glory, Jensen will tell us that GDDR7 is so fast that his new GPUs can get away with having a negative amount.
Still not enough room for Jensen's ego.
They'd make a 5080 super, 5080 TI, and a 5080 TI Super within that gap
@@JayMaverick💀
Video starts at 1:54
There’s no downward price pressure on them AT ALL for the 5090. I wouldn’t be surprised even by a 3000 price point, it essentially has no competition. Though I hope it’s more sensible, I do doubt it.
There was no pressure on them even with previous gens. People buying the best want the best. Amd doesn't compete on features and sucks for productivity which is a big segment of 4090 buyers
@@mojojojo6292 It's true, I have a 4090 for productivity, exporting a 4K 60fps HDR video in Resolve with a complex grade and effects will use something like 18GB VRAM, so there's no other option really unless you want to slow stuff down drastically. Before that I had twin 3090s, I just sort of pay the Nvidia tax sadly, but this time I'm wondering if even I may baulk at the price for work uses... how far will they go?
The main reason I can't use AMD is all the software I use is quickest with CUDA, which is Nvidia only. There are some people trying to create third-part from-scratch ways to interpret CUDA on other cards, but it probably won't happen in consumer land for many reasons.
No one normal is gonna spend $3K on a graphics card, the 5090 is going to end up being $1800.
NVIDIA knows what it’s doing and it’s not gonna try to inch every profit of it to lose a massive audience.
@@HyperionZero TBH I didn't know anyone "normal" who bought a 4090 just for gaming though. They were all except for one, people buying it for productivity, and adding another thousand to, for example, save cumulative hours of render time is a small investment over a year or two. Everyone using them for production buys the top RTX because the step up to the Quaddro cards is gigantic cost increase and not usually beneficial.
@@mojojojo6292 yes i bought a 4090 for production, I use 18GB or more VRAM in Resolve regularly, and all 24GB in Substance Painter. It's very useful having 24GB! I'd never have bought it for gaming.
600W, with a good CPU, 2/3 monitors and other minor things... we are reaching the limit of a single 15 A circuit in a normal household in NA....
1440 watts on a 15 amp breaker. Yeah, we are cutting it fine. Better add rewiring the circuit so you can SAFELY install a 20 amp breaker. How long before these rigs need a commercial (3-phase) panel?!?!?
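For anyone checking that figure, 1440 W is just the usual 80% continuous-load rule applied to a nominal 120 V / 15 A North American branch circuit; a minimal sketch:

```python
# Continuous-load headroom of a North American branch circuit
# (NEC-style 80% continuous rule; assumes nominal 120 V).
VOLTS = 120

def continuous_limit(breaker_amps: int, derate: float = 0.8) -> int:
    """Watts the circuit can safely supply for a continuous load."""
    return int(VOLTS * breaker_amps * derate)

print(continuous_limit(15))  # 1440 W on a 15 A breaker
print(continuous_limit(20))  # 1920 W on a 20 A breaker
# A rumored 600 W GPU plus CPU, board, and monitors eats a big chunk of 1440 W,
# and wall draw is higher still once PSU efficiency (~90%) is factored in.
```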
This was why I was one of the few (apparently) that was really happy with the new AMD CPUs. Getting really tired of more performance through more power. Sure it was only a slight performance gain, but at half the power. If the 5090 was a 10% boost over the 4090 but half the power draw I’d be thrilled. It’s getting completely out of hand.
And that's why I rewired my office with 12-2 w/ground and put them on a dedicated 20A breaker. Ready for dual PSUs!
@@crcp4886 Wdym one of the few? Most people think intel is a joke lol, nobody in their right mind would pick a CPU that uses 300% more power just to compete or sometimes slightly outperform the other competitor...unless of course you have nothing upstairs that is capable of logical thinking.
@unearthlynarratives_ because even with all of Intels problems, don't nobody want no AMD lol.
Rumored prices for the 5090: "first-born child", one entire leg, a left testicle... the list continues.
Just get a second job for a month or so ha
Living homeless on the street, but I'll own my RTX 5090!
significant brain tissue sample
sell your Soul
@@rainerbehrendt9330 your soul is just the down payment...
5090 refering to the year when you have finally payed off the loan you took to buy it.
Why are so many people misspelling "paid"?
@@AndrewB23 you made a grammar mistake yourself
lmao
@@AndrewB23 payed is a valid spelling tho ✋️😭
While I love the meme: if you had saved $3/day for the entirety of the 4000-series, you'd have more than enough to afford a 5090 at launch.
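A quick sanity check on the meme (the dates here are assumptions: the 4090 launched in October 2022, and an early-2025 5090 launch is still just rumor):

```python
from datetime import date

# Assumed window: 4090 launch (2022-10-12) to a rumored early-2025 5090 launch.
gen_start = date(2022, 10, 12)
assumed_5090_launch = date(2025, 1, 30)

days = (assumed_5090_launch - gen_start).days
print(days, days * 3)  # ~840 days -> roughly $2,520 saved at $3/day
```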
Sad to hear about Intel ARC. I'm pretty sure they just got over the majority of driver issues. You can play most games now.
@@jackthatmonkey8994 agreed, I really hope they can put out a strong mid-range card, having 3 competitive players in the space would be nice for buyers.
Yeah, I have an ARC A750, and it runs really well, despite being the cheaper option. 100% worth the money. I really hope they continue it, because the cards look very nice and they run well too.
If you're eating caffeinated ramen noodles out of a squeeze tube, stop it. Get some help. You have a serious addiction.
Nah, that's just Darwinism. Let nature take its course.
Think about how regular, properly prepared ramen noodles taste. That stuff has to taste like lukewarm ass.
So deliciously dystopian!
@@bliglum Aaaand energetically dystopian!
Michael Jordan. "Stop it... get some help"
16 gigs on a 5080 is an absolute farce.
It may be if true. But we don't know anything for sure yet.
bUT iTs GDdr7 wAM
GDDR7 vs GDDR6. If it doesn't need the RAM, why add it? To make people feel better? Then GDDR7 RAM costs money, prices go up, more complaining?
Nvidia puts in better RAM, gamers complain it's not enough and they want it for $200 cheaper than it costs to manufacture....... I am starting to get why Nvidia stopped trying to make gamers happy. You just can't.
@@Beezkneez187 16GB means you're way more likely to run into VRAM issues, especially on a card that costs this much. The buyers are more likely to have higher-resolution displays. Also, if you have a 4K display and wanna play the latest/upcoming games, you're basically forced to buy a 5080 Ti/Super (or whatever higher-memory variant comes out) or a 5090.
The bus size is, what, 384-bit?
You're right. I don't believe it.
Human beings have believed in all sorts of crazy shit for thousands of years! What's so hard to believe about a damn graphics card?!
(Y'know, other than the unbelievable price tags these days)
I waited so long, im going to get it :)
Is nobody remembering that they did this with the 4070 Ti? Trying to call it the 4080 and charge $100 more???
Everybody simply called out Nvidia and they dropped the price and the card label...
Now everyone needs to do this again, but it's like everyone's sleeping on it, trying to think themselves smart about how "this makes sense"...
THIS DOES NOT MAKE SENSE! IT'S PROTEST TIME AGAIN FOLKS! WAKE UP!! THAT'S OUR 5070 TI THERE!!!! 😡
5060 Ti/5070
@@mewoozy2word
It worked when there was competition and options, protest away, but I have my doubts it will work this time.
@@matilija older cards are half the price and not that far off in performance. it's not gonna fly
@@charleyweinhardt I agree, I wouldn't buy one, but the question is: will Nvidia agree or give in? They've already deliberately reduced 4xxx-series distribution to ensure those older cards aren't in good enough supply to be bought over the 5xxx series.
2:30 the 5080 specs look incredibly underwhelming. It appears quite a bit slower than the 4090.
Thought the exact same, very sad looking, they'd have to price it near the same as the 4080 for it to sell.
Nice Gallagher "messy" insert. I think some tech companies could use a good whack from a Sledge 'O Matic!
The 5090 specs are believable, it's the 5080 specs that are unbelievable...
I saw the price! ( 🎶a-do do do 🎶)
Now I'm a believer! ( 🎶 a-do do do 🎶)
That's not a 5080, more like a 5070. They're doing that to get consumers to buy the 5080 Ti/Super variants at a higher price.
The specs suggest it would be a little faster than a 4090, but with 16GB. That sounds EXACTLY like how Nvidia would dice it: "Look! You can now get better than 4090 performance for just $1200! That's a 40% frames per dollar improvement in just one generation! Isn't that just AMAZING?".
@@veritassyfer1185 AMD is not coming to the party this time around.... What makes you think Nvidia won't take full advantage of that like they did the last time?
@@andersjjensen Problem is, we have decades of this being the norm for a 70-class card. Charging an 80-class price for this is just ew.
I hate that the 5080 is half a 5090. Up untill the 40xx-series the xx80 sku was somewhere from 10-30% cut down from the 90. The chip that was half of the top sku was between xx60-70 sku.
You can thank the 5090 D for that ... I believe Nvidia is using the 5080 as the 5090 D to sell to China instead of making it closer to the actual 5090 for consumers.
I owned one before upgrading to a 4090, but performance wasn't too far off: basically a locked 60fps at native 4K with everything maxed, including ray tracing, while a 4090 is 80-90 fps most of the time, with some games going into 120fps territory. Actually playing doesn't feel that different.
That's because all cards under the 4090 were a lie; they are 1 to 2 spots off their real class. The 4060 is basically 10% faster than last gen; it is a 4050 in reality.
Nvidia is running on two things:
1) they know AMD is trying to accelerate RDNA5 which unlike RDNA4 is likely to have a halo-card in the lineup, so they need to make the 5090 as powerful as possible in the event it ends up having to compete against the hypothetical “9900XTX” (or whatever) for very long at all
2) they have enough fanboys who make buying them their identity no matter how expensive it is and how few games the extra hardware has any benefit for…so no market impediment to dissuade them from developing and releasing the product
The 5070Ti *
400W is disappointing if true, I guess we can undervolt, they are simply relying on beefy coolers to gain performance, that's lazy, make them more efficient.
Real efficiency gains only come from reducing transistor size, and they hit that limit many years ago. Hence artificial upscaling and frame generation.
Not really,
UV is a saving grace though, because the perf-to-watt ratio plateaus. Multiple sources after the release of the 4090 pointed out that these new Nvidia cards look like they've been "overclocked from the factory" or something. You could realistically drop it down to 500 and still keep 90-95% of the performance (the numbers are speculation, just for the sake of example), and looking at the specs of the 4080 AND the fact that AMD has withdrawn from the high-end market, I expect to see nothing less than a repeat of the comical difference between the 4090 and anything else in any benchmark, be it gaming or productivity, so UV should be fine.
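For what it's worth, you don't even need a full undervolt to ride that plateau; a rough sketch of power-limiting with the stock tooling (assumes an NVIDIA GPU with nvidia-smi on PATH, and setting the limit usually needs admin rights; a power cap is not the same as a true undervolt, but it exploits the same perf-per-watt curve):

```python
import subprocess

def set_power_limit(watts: int) -> None:
    """Cap the board power limit via nvidia-smi (requires admin privileges)."""
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

def read_power_draw() -> str:
    """Return the currently reported board power draw."""
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader"],
        check=True, capture_output=True, text=True,
    )
    return result.stdout.strip()

# Example: cap a hypothetical 600 W card at 450 W, then check the draw under load.
# set_power_limit(450)
# print(read_power_draw())
```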
glass chips coming soon-ish and maybe that will help
More efficient = smaller transistor size = more expensive.... Then you would be here complaining about price instead lol
Thanks for giving my build a shoutout at the end! ❤
Can't wait for $2999.99 pricing 👀
After my 1070 can't play games anymore, im becoming a Mobile Gamer lol.
Thats Awful.
@@kellywilson137 I was lucky: I got my gaming laptop right after the 3000-series Nvidia GPUs came out, got a deal on a 2070 laptop, and it works great for what I play. My desktop is great and should last a long time.
Why are they always so stingy with the video memory?
because people keep giving them $$$ regardless.
because they sell 60-class cards as 70-class; there is physically no space on the die to put more memory controllers
Upselling to the next tier up, hopefully to xx90
Money. They don't want to cut into the profits from their professional GPUs that come with more VRAM. If they give you too much VRAM, they know professionals won't buy their massively marked up pro GPUs.
AI
wtf 600W
save on your heating bill with a 2 in 1 GPU.
Nvidia truly thinks of the little guy 😭
You can just UV it. 4090 is 450W yet I only use 300W at WORST. I usually draw 150W.
600 watts is probably a stress-test figure, not typical gaming draw.
My overclocked 3090Ti hits 480 watts at full load.
@@TillTheLightTakesUsYou Dope. Right now it may not matter, but in the future the card will be worthless when games push the 4090 to the max…
Nobody is going to want to buy a high-wattage card.
That ending was superb LMAO
That rascal Joe!
I nearly didn't watch till the very end until I saw your comment😂
Def record scratch, I had to go back and re-watch it.
Almost had a spit take
What people should be more focused on is the 10% CUDA core count increase of the 5080 vs the 4080, compared to the 50% increase for the 5090 vs the 4090, and the 4090's 60% increase over the 3090, which was the sole reason that performance increased by 30% at native 4K.
I bought an Intel Arc A380 for my server, which has no integrated GPU. It's pretty popular for video transcoding. The homelab community is keeping the Arc dream alive.
5080 so weak!
You play games at 1080p you couldn't use it anyway 😂
16 gigs of VRAM is a joke; most mid-tier cards have more than that today, even if they have GDDR7. Clearly a deliberate decision so they can sell the Ti/Super ones with higher VRAM...
Can't beat the 4090
They'll re-release an updated version right after release...
@@Arcadiez The thing that worries me is: if the 5080 has 16GB, what will the 5070/5060 have? Are we about to see another x060 card with 8GB?
The RTX 5080 is disappointing, so I have no hopes for the RTX 5070 or 5070 Ti; probably 12GB of VRAM again.......
Do people who buy x70-class cards really need more than 12GB? Honest question.
I always bought x60 and x70-class cards when I ran 1600x1000 - 2304x1440. I had to move to x80 Ti / x90-class when I got a 4K OLED. What are some examples of games which max 12GB at 2560x1440 or less?
@@bricaaron3978 I have an RTX 3060 12GB, and at 1080p in newer games I get texture pop-in because sometimes it goes over that amount. That's why 16GB of VRAM should start being the minimum, even at 1080p.
@@bricaaron3978 Nope, they don't. They complain about the GPU costing a lot, but so does all the other gear needed to even game at its capabilities. They are just complaining to complain; that's why Nvidia doesn't listen to gamers anymore. Even if they gave them what they want, AMD would be done overnight and prices would go back up after AMD shuts down. People only buy AMD cards because they are the affordable option; they are not as good. If prices come down, AMD will have to drop prices so hard to even stay relevant.
@@Beezkneez187 I've never bought ATI/AMD, but it looks like AMD is the only one producing lower-end cards now (beside Intel, I suppose?).
@@bricaaron3978 Cyberpunk, Alan Wake, Avatar, Star Wars Outlaws, Horizon Forbidden West, RE4...
... I bought Now That's What I Call Music 1-3 from the music aisle when I was a kid.
I don't appreciate him saying the people buying those have kids now.
He's not wrong, I had them as a kid and I have kids now.
But I still don't appreciate it.
The ending video got me, hahah. Thanks Paul, hope you and your family have some fun adventures out in the world :D
You know, the sad thing is that this is kind of like the mining craze: we're basically going to have to wait until AI-specific devices come out and handily outclass using a GPU, because right now (just like during the mining craze) there is a legit reason for people to buy 5090s and 4090s to use for AI.
The 5080 is going to be a glorious flop again. I'm actually excited to see a 9950X3D with dual X3D, since it's actually different enough and may not require core parking to function properly. Let's hope they don't charge $800-900 for it though.
Well doesn't core parking kinda go hand-in-hand with any processor packing multiple chiplets on the same package? So you just kinda accept that if you're willing to have your cores split between two dies.
-edit I agree, in that it's lame that if the parking isn't handled properly it noticeably impacts performance and that you _need_ the bastard Game Bar for it to work. Whatever Game Bar does is something that could be implemented in any game launcher like Steam or (bllleeech) uPlay. I think I'd rather have a few more cores on a single die than double the cores split between two.
As the 9950X requires core parking, I do not see the benefit of a dual 3D V-Cache Chip. 😅
@@JensAllerleiNah, that's not quite right. Core parking and 3D V-Cache are totally different things. It's like saying since your car has power steering, it can't have a turbocharger! 😂
AMD only put 3D V-Cache on one of the 9950X's chiplets at first, but rumor has it they're working on a version with it on both! Imagine a twin-turbo supercar! 🤯
Dual 3D V-Cache could make the 9950X insanely fast, especially for gaming.
@@411DL No. It's only because the 3D cache is only on one of the chiplets. If it was on both, they wouldn't have to park cores, because they wouldn't have to share cache and increase latency.
Look, Nvidia isn't interested in the desktop market; let's admit it's oversaturated.
did I correctly hear paul say the intel battlemage card will launch in 2022?
He did! He makes a comment about it in the closed captions.
I'm sure it will redefine 'glass cannon'.
That's what the original roadmap said. But then Alchemist got delayed forever, and then Battlemage got delayed forever. It's pretty much Intel 10nm all over again.
@@andersjjensen Paul is still a silly goose
2222, yes, sometime in the future.
I too am hopeful that Battlemage can come out before the end of 2022!
Maybe even before Alchemist!
I don't game too much due to my preferences generally being "If an assault rifle is 90% of the trailers and gameplay, I'm out", "If I feel like I've seen this in the last 5 years, I'm out", "If the colour palette and design is dull, I'm out" and "If I can get the same or more from a Doom mod, then you're not trying".
So by the time a good game like Psychonauts 2 comes out, this will have dropped to an investment cost worth paying. Likely at the end of 2026 or after, if I'm being optimistic.
But 2025 will be a no-go on pricing. Luckily games don't need this to run at 60 FPS or even 80. You're good with lower ones, even for several years.
Your standards are pretty much superfluous considering what the AAA industry has been excreting for the last several years.
I'm just thankful that I have my favorite games to replay, and to re-experience with mind-blowing immersion on my "High FoV Gaming" setup. That's in addition to the huge number of PC games from the last 40+ years that I look forward to trying for the first time.
I can’t wait to sell my house to fight scalpers and buy a 5090
5080 only 16GB? That's bs, 16GB should be the standard for low end GPUs not high end
Hear, hear
THAT'S OUR 5070 TI!
it's a repeat of the last 4080 drop, people need to start the protest and make them drop the name and price down again!
yeah, and it's gddr7, not gddr6 like the 40 series.
you barely ever need more than 12gb vram
You kids today. 😂😂😂😂
@@Digikidthevoiceofreason Yep. I remember 1GB being science fiction.
How did you explain why your little one cannot write that word?? Also that had me rolling. 😂
Bet the mom had to explain. " Honey, you got this one?"
All I really want is an affordable GPU. New cards are great, but premium prices are not.
AMD might come in rescue
@@megapet777 🤣😂🤣😂🤣😂
@@megapet777 Oh wait! That wasn't a joke?
@@oldtimergaming9514 havent you been following the news? Amd is not gonna even compete in the top end next generation, they will only focus on budget gpus
If you stick to 1080p then gaming is not so expensive. A 6600XT or 3060 is exactly as fast (in pure raster) at 1080p as a 4090 is at 4K. Here's the breakdown chart of the 16:9 resolutions and their computational "cost":
3840x2160p = 100%
2560x1440p = 45%
1920x1080p = 25%
So a 7700XT, RX 6800, 3070ti or 4070 does at 1440p what a 4090 does at 4k.
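Those percentages are just the pixel-count ratios (raster cost scales roughly, not exactly, with pixel count); a quick check:

```python
# Pixel-count ratios behind the "computational cost" chart above.
RESOLUTIONS = {"2160p": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}
BASE = 3840 * 2160  # 4K pixel count as the 100% reference

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w * h / BASE:.0%}")
# -> 2160p: 100%, 1440p: 44%, 1080p: 25%
```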
This looks like they're intentionally positioning the 5090 as the "solo AI researcher" card by limiting the 5080's total VRAM, same as with the 4xxx series. They want you to buy the xx90 series or pay a lot more for the AI-branded models. My guess is the 5090 will be priced between $2000 and $2499 because of this.
You saw this with the lower-tier 4xxx generation, as those cards didn't have a lot of VRAM compared to their AMD counterparts (ROCm isn't up to the same parity as CUDA).
That 5090 is going to need its own power brick with all that power draw.
5090 will be crazy expensive 😢
And crazy good😂
@Krypto121 That's not true at all; there's a whole other set of non-gaming GPUs that Nvidia releases.
@Krypto121 Press X to doubt
Jokes aside, you're probably correct about productivity also being its intended purpose, but gaming is also its purpose, because time and time again people chronically underestimate just how far people in the gaming space are willing to go when it comes to their cards. Nvidia knows it, which is why they have been targeting gamers even with their high-end cards for decades, and also why they've been ramping their prices up since the scalping shortage a few years ago. Not because they don't care about gamers anymore (they do, because we still make up 30-40% of their revenue), but rather because they know they have a monopoly and can ask for whatever the hell they want.
@Krypto121 Your friend is right, depending on circumstances. The 4090 is on average ~20% faster in gaming than the 4080 Super, and considering both of them are really a cut above the rest, that 20% is a MASSIVE advantage over the 4080 Super. Quite frankly, the 4090 is crushing any and all competition in gaming by a long shot. That said, the 4080 Super will be enough depending on what you're playing and what specs you are looking for. There are cases where it's not enough, though. For example, the PC release of God of War Ragnarok does not run at 1080p 144 fps on any card other than the 4090. Most newer games don't run at 1440p 144 unless you're running a 7800X3D with a 4090 either. If you are playing mostly indie games or esports titles, yeah, the 4090 is massive overkill. There is a legitimate market niche for the 4090 in the gaming space other than Nvidia scamming people, that much is undeniable, which is why I responded to you in the first place. That's my axe to grind here, essentially.
As for your second comment, sure, people are getting scammed for overpurchasing; some might even be stupid enough to buy a Threadripper thinking it's better for their use case when it's actually not.
@Krypto121 That's the thing though, it's priced in a way that targets a specific demographic: people who either use it for professional work or people who have the expendable income to afford it. They are not targeting Jimmy, who was just barely able to afford his 3060 Ti lol... yet the people who cry the most are the Jimmys who could never afford the card even if they dropped the price by $400.
5090 priced at the low low price of 2200+
That wouldn't do, it's pretty much the same price the 4090 went for. Nvidia needs that sweet year over year infinite growth. Probably will be 2700-3000+
Only the 8GB version will be as low as 2200.
This is misinfo. We all know the first born is entry price point.
@@0Sicarii0 5090 (32GB, 512-bit): $1999-2399, or about $2199-2599; 5080 (384-bit, 24GB) will be almost the same price as a 4090, around $1399-1599; and 5080 (256-bit, 16GB): $999-1199.
Man i was watching a vid on the 5090 specs when this notification popped up. I switched videos😂😂
You bet that got a 'love' from Paul!
I bought a 4090 and a 7900xtx. I tried them both in my computer. I saw 0 difference. I returned the 4090 and kept the 7900 xtx.
You didn't look much then
I have an XTX at 1440p165Hz myself. If you're on a 1080p144Hz display, sure I believe you, because you'd be CPU capped all the time. Or if you did something stupid like pairing them with a Core 9600k or Ryzen 3600. But outside of completely stupid setups you're talking verifiable nonsense.
@@andersjjensen Samsung 57" 8K Ultra HD 240Hz 1ms GTG Curved VA LED FreeSync Gaming Monitor (LS57CG952NNXZA) - Black/White. 12900k cpu and ddr4 3600mhz 32GB ram.
From a very reliable source at nVidia (a buddy I used to race with, high up in the nVidia executive chain): the 5090 will arrive in early Q1 2025 and will start at $2499, with FEs at $2699, and once again very limited supply so as to provide "opportunity" for resellers to increase the price and margins even higher ... so early adopters should expect to be paying around $3000-$3200 US.
I don't doubt that's the plan.
12:32 --> time out to heat water.. no need, just put your noodle cup on your new RTX 5090 for a few minutes...
NVidia:
5090 has dual 16-pin 12VHPWR connectors for double the fire hazards*
*Fire extinguisher sold separately
They don't
They don’t, but they should, as it would eliminate the fire hazard by splitting the current load between the two plugs. You’d think they’d have learned from the mistakes of the 40-series, especially when the remedy is just that simple.
@@stadizdiaz848 I hope it is the dual-connector design this time, along with DP 2.1 UHBR20 or whatever it's called
Worst case scenario: The xx90 cards will become more unobtainable with the launch of the 5090. Given the specs, I wouldn't be surprised if that thing launched for 2000 USD or considerably more. Meaning the 5080 would then be sitting at around the USD 1000 mark (at least) ... for a high-end 2024/2025-card with 16 GB.
Best case (aka "La-La-Land")-scenario: Nvidia have seen the error of their ways (4080 -> 4080 Super price correction seems to point to this) and will launch the 5090 at roughly the same MSRP as the 4090 - which would almost be justifiable given its specs. 5080 will see a slight price correction to ~USD 900 and everything below it will be calculated based on its price-tag. Again: While this is probably totally wrong, I just don't see how they could justify a 1000 (or more) MSRP for a 16GB card.
The only "upside" my pessimistic self can see in all of this is for 4090 owners ... who don't really have a reason to get rid of their existing card in the near future. That GPU is still crazy fast and should last well into 2025 and perhaps beyond.
That's the key point that people who poo-pooed the 4090 miss: it's a great card that you'll probably use for 5-7 years or beyond with little to no performance issues.
That spreads the cost out to the point that, per year, it's cheaper than most mid-range GPUs and delivers a much better experience over its life (rough math below).
We already saw this with all the 1080 Ti folks who "overpaid" back then and are still often sitting pretty with their cards on everything except the most demanding games at the highest possible settings or 4K.
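Rough back-of-the-envelope for that per-year claim (every number below is an assumption for illustration, not a quote; plug in your own):

    # Hypothetical prices and lifespans, purely illustrative
    flagship_price, flagship_years = 1599, 6      # e.g. a flagship kept until it struggles
    midrange_price, midrange_years = 549, 2.5     # e.g. a mid-range card replaced more often

    print(f"Flagship:  ${flagship_price / flagship_years:.0f} per year")
    print(f"Mid-range: ${midrange_price / midrange_years:.0f} per year")

Whether the flagship actually comes out cheaper per year depends entirely on those assumed prices and lifespans, which is kind of the whole argument.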
Anyone who wants to replace their 4090 or even 3090 is a carrot chaser.
@@Hybris51129 They didn't even have DLSS upscaling and frame gen yet with the 1080 Ti either.
@@squirrelsinjacket1804 Correct, you don't have the fancy software improvements of the newer cards. But even without those options you can play at native resolution, keep most if not all of the detail settings cranked up, and more than likely maintain 60+ fps.
We're reaching the point where people with that generation of card will eventually need to make more and more tweaks to run stuff, but right now it's still very serviceable within reason.
Thankfully I never need the newest card on the market, so I don't have to care about these things.
The second Nvidia found out that AMD is exiting the high-end market, Nvidia decided to replay the idea that backfired with RTX 40: shift everything down a tier. GB202 will have the 5090 (which is already cut down 11%) and then eventually a 5080 Ti (probably 15k cores and 24GB). The "70-class die" will have the 5080 in its full configuration (which is expected to be about 10% faster than a 4090) and cost $1200 if we are VERY VERY lucky. The 5070 Ti will likely also be 16GB (and $900 or something stupid) and sit between a 4090 and 4080 in performance. It's not until we get to the 5070 (probably 12GB again) that AMD will have an answer. If Nvidia launches the 5070 for $700 and AMD launches the 8800XT for $500, people will complain a lot... and then buy the 5070.
thanks Paul and Joe! great stuff
I wasn't surprised to see Mr. Glasscock in the channel members list.
I think we could all see him coming.
It really feels like 16GB of VRAM is lagging behind for the 5080. They should have pushed it to 20GB and maybe upped the CUDA cores a bit more too (the 5080 only got a 10% generational uplift in CUDA cores, vs a 33% uplift for the 5090 over the 4090 by comparison).
They do it on purpose.
I wouldn't go near the 5090 because of power. 600 watts is insane. You're paying for the card and then for your electricity bill. It has great specs, but I wouldn't be surprised if it costs more than $2.5k.
@@yourfriend4104 "I wouldn't approach the 5090 because of power. 600watts is insane."
This is still conflicting info, but for some reason you all believe kopite. There have literally been like 3 or 4 leaks and they all differ; the other 2 leaks have the 5090 as a 500-550W card. Regardless, people who are buying these cards obviously don't care about their electricity bill...
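For what it's worth, the electricity side is easy to ballpark (wattage, hours, and rate below are all assumptions, not measurements):

    # Assumed: 600 W board power while gaming, 4 hours a day, $0.15 per kWh
    watts, hours_per_day, rate_per_kwh = 600, 4, 0.15
    kwh_per_day = watts / 1000 * hours_per_day
    print(f"~${kwh_per_day * rate_per_kwh * 30:.2f} per month")  # roughly $10-11/month with these numbers

So for the kind of buyer dropping $2000+ on the card, the power bill really is a rounding error.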
My prediction: 5090 for 2200 usd.
I was thinking $2299, and 4090 ceases production, driving prices of remaining 4090s up, which will convince people to get the 5090.
If you predict it Nvidia will think you find it reasonable. Only thing anyone should be saying in these comments sections is I won't pay a penny over $1000 for a 5090.
Double
2800 in norway 😂
$2400, and ultra low stock to make sure the price won't drop.
A 5080 jumping from 320 watts for the 4080 to 400 watts should indicate a significant performance boost. The leaked numbers don't add up.
When you factor in DisplayPort 2.1 and GDDR7 memory it kinda makes sense... the RX 7900 XTX has a 355W TDP and it's only slightly better than a 4080 Super.
@@deanal-jackson4593 You're comparing AMD to Nvidia; I'm comparing Nvidia to Nvidia.
@@freedomearthmoon1 The problem with comparing Nvidia to Nvidia is you see how shite each generation is compared to the previous.
@@freedomearthmoon1 as if AMD is in the business of making water dispensers or something
@@deanal-jackson4593 Perhaps AMD should be? They've conceded the high-end market to Nvidia because they can't compete. Their ray tracing still drags performance through the dirt. Nvidia offers many more features, and I hope FSR 4 is an improvement, but after so many iterations without getting it right it's probably just another AM5-type event.
Ha ha - Boost Noodles. What in the what?!?
Thanks for the update Paul - long time viewer of your channel and still loving it. Rock on, good Sir!
Back from 5 days of power and internet outage. First video had to be Paul. Did not let me down. Thanks man. Great stuff.
Selling my house to buy the new 5090
Roof and walls are not needed anyway, just live inside the bubble of superheated air surrounding your new 5090.
I'll see you on the street!
You have a house!?
@@gameeverything816 yes, do you have a house?
5080 dead before launch
I doubt it. Nvidia learned with the 4080 Super; they're not gonna overcharge like last time. It's probs gonna be $999 and the 5090 is probs gonna be $1799.
Unless... 5090 is a paper launch and not truly available for 4 months and people lose patience and buy 5080s. Which seems very plausible.
I'm waiting for the 5090 one way or the other. I've never built myself a top of the line PC and this is a divorce present to myself. Do I need to spend $5k on a PC? No. Do I WANT to spend $5k on a beastly rig and use it to play D2R and Genshin? Absofuckinglutely. The upgrade from 2060 and a 3600 to 5090 and 9800x3d will be the best present I've ever given myself.
@@michaeltorrisi7289 let this man cook 👨‍🍳
@@michaeltorrisi7289 we are the same person
No, that's our "70 Ti" again, like last time when they tried to call it the 4080...
Protest time starts now!
If true, this leaves some OBVIOUS gaps for a 5080 Ti with 24 gigs of VRAM and maybe 18k CUDA cores.
I was playing a medieval survival game while listening to this... and somehow that MGS alert sound still made me jump and check for enemies.
I was NOT ready for that acronym at the end 😂
Why not just give the GPUs VRAM like this, and consumers would be happy...
5090 32gb
5080 20gb
5070 16gb
5060 12gb
5050 10gb
They'll just add some L2 cache and claim 4GB is all you need.
They don't care about consumers until the AI bubble bursts.
@@Hu55a1n I am still hoping that the rumoured 5080 turns out to be the 5070 xD
They want to compress the textures in some way, plus less material cost means higher profits on the lower-end, low-margin parts. They want you to buy the 5080 for more VRAM. Even with more VRAM, AMD can't compete on the high end.
@Krypto121 even entry level GPUs can already utilize more than 8 GB in modern demanding titles. Nvidia is limiting the VRAM more than would be sensible for gaming purposes, because they do not want these cards to be used instead of their workstation counterparts...
In general, more VRAM enables better visuals at all GPU classes. Of course, there is a point of diminishing returns, but current Nvidia GPUs are far from that.
Hoping that the 5080 at 16GB means that Nvidia will make a 5080 Ti with 20GB of VRAM
They probably will... like a year later. Tbh it's not really worth buying into Nvidia until the launches have had time to play out, because that brand-new 80-series card you bought for $1000? Guess what, a few months later a 5080 Ti Super Duper will come out.
Yeah, but the Ti comes one year later, not at launch.
Probably 24GB, 500W and 15k cores, so right in the middle. The 5090 is already cut down by 11%, so the 5080 Ti would be cut by about 35%. TSMC's yields are so good that this should make every die salvageable.
Please, for the love of the Flying Spaghetti Monster, PUT TWO POWER PLUGS on the 50-series cards!
>Flying Spaghetti Monster
Soy and cringe.
@@yourneighbourhooddoomer Well, I don't think other gods would care if your GPU melts. The FSM is cool, and can make your GPU cool too.
@@yourneighbourhooddoomer lol Christians 😂
No thanks, I prefer the 12V connector as I know how to plug my stuff in all the way.
@@ExtremeGamersBenchmarks I’m not talking about using a different connector. I’m talking about using two of them. In case you haven’t heard, even ones plugged all the way in properly burned. Putting two of them on the board will stop that from happening by splitting the current up between the two so each one only has to deal with half the current. Since the problem is that sometimes these connectors are not properly handling all the current they were supposed to, using two will solve the problem completely.
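For anyone wondering why two connectors would help so much, here's a rough per-pin sketch. I'm assuming a 600 W draw on the 12 V rail and six current-carrying +12V pins per 16-pin connector; treat those numbers as assumptions rather than official spec:

    # Assumptions: 600 W total draw, 12 V rail, 6 power pins per connector
    watts, volts, pins = 600, 12.0, 6
    total_amps = watts / volts                 # 50 A total
    print(f"One connector:  {total_amps / pins:.1f} A per pin")        # ~8.3 A per pin
    print(f"Two connectors: {total_amps / (2 * pins):.1f} A per pin")  # ~4.2 A per pin

Halving the per-pin current is exactly the kind of margin that keeps a slightly imperfect connection from cooking itself.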
I can't see PCIe 5.0 having much if any FPS difference compared to 4.0. We only saw around a 3 fps difference in the previous gen, and cards barely saturate the throughput of 4.0. It won't be enough of a difference to make me upgrade my mobo.
No way the 5080 only has 16GB. The gap between the 80 and 90 is too massive. Plus, 16GB on an 80 card in 2024 is pathetic. It's gotta be like 20+. I don't believe it... But it's also Nvidia, so it's absolutely something they would do.
greedy bastards will never stop
16GB is enough, but only barely. That's how NV likes it.
How much will the 5090 cost ? $5000 ?
🤔$5090
That's what having no competition means!
$9001
$1800-$2000
$1799
Remember guys, vote with your wallet
Amen
And my axe!
As if i can afford any of these cards to begin with
Not everyone is broke LOL
@@challenger516 Well, more money than brains I guess.
Wait until they realize they have to use a 3-foot DisplayPort 2.1 cable.
Why 3 foot?
@@Decadent_Descent The signal rates needed for that bandwidth possibly can't be maintained on longer cables.
"It is irreversible...like my raincoat". That shit made me laugh out loud as it reminded me of Spaceballs. Bravo, Paul.
I can't wait for the 3rd-party benchmarks for the 5080 and the 5090.
12:05 I don't know why, but I want this to be reviewed by Steve1989MRE. I want to hear his take on it because it looks like something you would find in an MRE.
Let's get this out on to a tray, nice
Ok, caffeinated ramen I kinda get... but once you reach the "we need to put ramen in a squeeze pouch" point, you really need to just stop and reconsider your life choices.
They make getting cancer so much more accessible now. Thank you Food Science!
Wow, a whopping 16GB of VRAM on a graphics card that will likely have an MSRP of about 1200 USD or more, with most models likely going for 100-300 dollars MORE than MSRP. Sure, that will be plenty of VRAM most of the time for maybe a year or two after it releases, but the VRAM will likely become an increasingly limiting factor for the RTX 5080 over time. That 16GB is going to be a problem for ray tracing, and also frame generation, in an increasingly larger number of games each year after it first launches.
I think 16 GB will probably be pretty good for next gen mid-range graphics cards, but it's just not enough for a high-end one in that price range.
The 4090 and 4080 weren't that far off in performance, bud, I literally owned both 😂. Also, most of y'all complaining don't even play games in 4K, so you won't even need it 😂😂
16GB is not yet confirmed.
.. cannot wait for the RTX 9090 Review 🥱
Hell yeah I'm glad to see that build getting more attention. Thank you, Paul
People cry about Nvidia pricing
Goes out and buys anyways 😂
Yeah, you taught Nvidia a lesson alright.
New on YouTube: noobs cry here!
They only want AMD to be competitive so that Nvidia drops pricing.
Ever notice that all the tech YouTube channels have Nvidia in their personal rigs?
Yep, the funniest part is that people expect the 4090, or in this case the 5090, to be accessible and affordable to everyone... while also being the best GPU on the market by a mile lol. The highest-end GPU is not made for people who can't afford it and it never will be, so the people who cry that it's $2k or whatever are exactly the people Nvidia is NOT targeting to begin with.
@@unearthlynarratives_ fr doe... alot of people with room temp iq be angry at Nvidia.... A high end gpu aint a right or sum... like how you gon get mad because some of the worlds most modern and capable gpu's are a bit expensive xD
Might as well complain that a mclaren or bugatti is too expensive....
@@FacialVomitTurtleFights can you write that again in English, instead of ebonics?
I'm still running a 980ti just fine because i don't play new games.
And?
People commenting under this video do play new games. So your comment is useless to us.
@@DingleBerryschnapps And so is your comment.
I love playing and replaying older games too. What OS are you running with the 980 Ti?
Cue many more 12VHPWR fires with that 5090 series. Those plugs should be banned.
Thanks for adding a link to that custom case.
You're giving off CraftComputing vibes with your in-video beverage 😊
Waiting for the new gen to drop so old cards on the used market drop a cent lower 😂
Well since they are already shutting off the 4090 pipeline I kinda doubt it other than the used market. And that would be a lot of money to gamble on a used card.
Afraid that's not happening. Nvidia have stopped production of the 4080/90 to run stock low, so prices are set to rise very soon, and used cards will climb to nearly what new ones go for now, especially with the 5080 only having 16GB of VRAM. But yeah! Wait.
I'll have a 2024 4090 with limited use up for sale once I get my hands on a 5090.
@@aldermediaproductions695 Well, I have a 2080/3090/4090, and I must say, I got the 2080 from my brother so I've been playing around with it. It's still a decent little card, but the 8GB of VRAM... dear god, it's a total nuisance. 24GB... what freedom. Yeah, I need to get some use out of the cards I've got, yet it's always nice having the best; can't do the crappy 5070/5080 nonsense, get the best always.
@@aldermediaproductions695 Good luck with that; you'll probably need to have everything set up in advance to grab one.
Nice outro 😅
Watched the entire video and got insulted at the end.
In some cultures, that's a sign of affection (mate).
I think the phrase 'noodle-like substance' is permanently seared into my brain now, thanks Paul. *shudder*
lmfao... I was not prepared for that outro. well played.. (:
Don't worry. Their fanboys are extremely fine with it.
Man, I really want a 5090 but I know it won't be any less than $2000 or $2500....
Man, I really don't want to spend that much money on a computer part. I know bleeding edge is pricey, but that's too much. Please keep it to like $1000 max.
What makes you think it'll drop for anything less than $2k when the 4090 did the same thing and insta-sold out?
Especially with the artificial inflation rn.
Yeah, you are crazy dreaming if you think the 5090 is going to be anywhere close to $1k.
They have a monopoly on high end GPUs, there's no reason for them not to drain your bank account in exchange for one.
@@Stubbies2003 yep, I'm essentially day dreaming, wishing it to somehow turn true.
For the past few years I've been avoiding buying any Nvidia GPUs in hopes of decreasing the number of units sold, but I alone am too weak and people bought more of them anyway, so here I am crying about it while it's $2k+.
I own a 2080ti and now we have a GPU that costs more than twice that 😔
@@Hiperultimate If you don't wanna pay for a 5090 then why are you here crying about it? Why not find a new gpu you can actually pay for?🤦♂
Nvidia : Introducing rtx 5090! Too poor to buy?! Don't you have kidneys?!
I only have phone 😞
People are throwing money at them right now... quite frankly, they don't need your sales.
I only have one left. Because I used one for the 4090 already.
The saying these days is either MooDeng or Don't be GPU Poor
Confucius say, "If it sounds too good to be true, it probably is." Glad to hear you've found a way to boost your Noodle.
Oh man, Paul just reminded me of the play by play guy from 'Major League'. When 5090 pricing is announced, I'm pretty sure that beer will have transformed into Jack Daniels.
well boost noodle looks disgusting lmao
What about a "noodle-like substance" doesn't sound appealing? lol
It looks like somebody combined Mountain Dew and canned spaghetti and put it in an applesauce squeeze pouch. Might as well just blend your food, bottle it up, and drink that lol
You don't need a 5090. The cycle of people whining nonstop about high-end GPUs being expensive, and then collectively making them sell out the day they're released when no one even needs this much power, is very tiring. I really hate the internet.
Tons of people need that much power(?) Graphics cards aren't just for gaming.
Yet another Nvidia launch where all the juice is put into the 90 model, and the rest of the models are crap and heavily overpriced. I am so tired of Nvidia. We gamers need some love from our dear GPU manufacturers. PC gaming will end up dying slowly if these prices continue.
Our dear GPU manufacturers are what we have made them into. If 95% of gamers will only buy Nvidia, and "the best" happens to always be Nvidia, and then AI happens and "the best" is a pretty good AI card, then gl hf kthxbye. I have 5 real-life acquaintances who are into PC gaming, and they will all only purchase Nvidia. I've asked, and the reasons have been "I don't know AMD so well", "I've heard that AMD is worse" and "Well, everyone buys Nvidia". To someone who likes to research before a purchase those reasons suck, and they also show the mindspell Nvidia holds over the sheep.
Meanwhile AMD just said "We give up." Thus dooming the consumers to whatever Nvidia wants.
It's bad enough that we have a duopoly but it's made worse when one side flat out gives up.
People want to blame Nvidia for everything but the blame rests with AMD and their absolute failure to put any effort into their products over this last decade.
Hopefully Intel gets off the ground enough to start threatening AMD in the budget and mid tier markets to force them into a bind. Maybe then they will decide to actually do something for a change.
@@Hybris51129 Yeah, it's AMD's fault that Nvidia jacks up prices and the consumer can't wait to pay the inflated prices.
@@zam1007 When you have no reason to hold back it gets hard to just leave money on the table. The market is out of balance and AMD just made it worse by noping out of it effectively.
@@Vantrakter We have only two GPU producers in the market, and AMD's products consistently perform worse, in general, than Nvidia's.
So by the numbers, when you actually look at the performance reviews, unless your use case happens to fall into one of the places where AMD is edging ahead, Nvidia is the better buy.
We have gotten here because for the last decade AMD has refused to bother competing in the space: they need only price themselves a little cheaper and claim it's "value" pricing, regardless of actual performance.
That's going to burn customers and, by word of mouth, condemn the brand; those customers will go to the only other game in town, Nvidia, which, by being in first place in a two-horse race, is automatically the "premium" brand, with the prices and performance expectations to boot.
Almost p***** my pants laughing at the end clip...
LOL at the end bit. I had to back up to double-check what I had seen!
Amazing video, thank you, I subscribed. :D
I really liked the Vmin shafted joke, no need for the greyscale lmfao
Can’t wait to get a 5090! I’ll camp outside Microcenter again, had a blast for the 4090 launch.
That ending, Paul... 😆
That ending. Why Paul will forever be my favorite tech bro.