I think the 5090 will be priced high enough that scarcity shouldn't be an issue after 6 months. It's the 5080 that's gonna sell out if the price is right. The 5080 looks like the card that can max out 4K.
I just wanna know if I should sell my 4070 for $480 and use the rest of the year to save up for the 5080, or just buy the 4080 Super. I paid $688 for the 4070 when I bought it.
My RTX 4090 now looks better and more appealing than the RTX 5080. 16 GB of VRAM, that is a problem, a big problem, in these new demanding games, and 400 watts tells me it can't draw the power it needs. Anyway, I'm obviously going to get the 5090 given I have the 4090 now, and the 5090 looks like the only true next-gen card here. Looking like a beast. Will need a whole new PSU for it! 😁
I’m on a 1660 super right now and was going to get a 4090, seeing these leaks though I’m definitely going to wait for a 5090 and ride that out for 5+ years if the gaming market learns how to optimise their games.
@@cyrus1411 Well, it's not so much about optimising their games. It's how it should be: pushing games to their limits, and top-end graphics should continually be pushed. Alan Wake 2 really put a 4090 through its paces, and if it didn't and the 4090 were hardly used, then we'd be doing something wrong. It's like those console owners that go "my console is barely being tested, it's the games." No, your console can't even run some of these games at 1080p now; it's just upscaling them into a blurry mess, and that's as it should be. The consoles are 4-5 years old, on aging hardware that was barely mid-tier tech at the time.
With zero competition along with the insane specs of the 5090 we can expect a $2500 price tag. The days of Nvidia giving gamers anything decent ended when people climbed all over themselves for a $1600 card.
would it be better to stick with a 4090 if i don’t wanna deal with scalpers and stuff for a 5090? or hypothetically if someone wanted to build a new gaming pc, would it be better for them to get a 4090 over a 5080 in terms of performance? (assuming they didn’t wanna pay the 5090 price of who knows what)
The only 4090s available will be on the used market after this quarter, the remaining new stock are being cleaned off the shelves now so you need to decide very soon, scalper prices are already appearing for ‘new’ 4090s being resold on eBay and Amazon.
You need to know the final prices & specs. All of this is subject to change. My guess would be the 5080 over the 4090...depending on what you are planning on using it for. Heck, I have the 4090 and the powerpin has me so nervous.... if they solved that with the 5080... the peace of mind would be enough to push the decision in the 5080's favor.
@@Recuper8lol I can't support the 5080 with 16GB of VRAM, it seems like going backwards as a 4090 owner. Unfortunately the only upgrade that makes sense is the 5090.
I'm ok with my GTX 1080 Ti. These new cards aren't worth the high prices they sell for, considering that price-to-performance has worsened compared to the older generations.
What happened to that "Nvidia RTX 5090 Could Be Cheaper Than You Think! Here's Why - Are They Finally Listening?" video? The one with the $1,299 price tag in the thumbnail.
Just so unappealing to new buyers. Looks like I won't be getting an Nvidia GPU, as they pretty much shame anyone who doesn't get the absolute highest-tier card they have.
Idk, if the ban on 4090s in China is still a thing and they are discontinuing the 4090, I feel like they're purposely making the 5080 a little weaker so they can sell it there.
Yh, no way I'm spending on a 5090 when companies don't even try to optimize their games anymore and just slap on frame gen/DLSS as needed. Optimize your games and we won't even need a 5090.
I'm just waiting to see what the 5060 will look like since I'm squarely in that market bracket. I can't afford nor can I fit the likes of 5080 and 5090 in my case.
5080 is half the specs of the 5090... another x80 class GPU with x70 specs 😂, watch Nvidia pull another "4080 12GB" move.
Why do people keep saying the 5080 should cost $1400? Nvidia is going to keep their margins constant. They are spending an extra 20-33% on the 5090 over the 4090, which they pass on to the consumer by increasing the MSRP, while the 5080 is designed to cost about the same as the 4080 Super or less, with higher clock speeds and architectural improvements delivering the extra performance.
@@jouniosmala9921 i mean... if the 5080 ends up at the same price as the 4080S it wouldn't be that bad actually. We will see, cuz the trend from Nvidia these last years is to increase the price with each new generation, more or less like Intel or AMD with CPUs too now that I think about it 🤔
@juanford55 The 5090 is the ultimate king GPU, it will do 8K gaming.
@@juanford55 Wafers cost four times as much as back in the good old days. These days TSMC gives a nice bump in transistor density, but with a cost increase that's only a bit less than the density improvement. I'm hopeful that this time it's just a tweak of the same process, so the cost increase doesn't happen and we get a price/perf improvement from TSMC.
In the old days, nearly all price/perf improvements came from TSMC, and Nvidia and ATI were just utilizing them as well as they could.
Price is the only thing that matters. If it costs $700, I do not care how the 5080 compares to the 5090... The only thing that matters then is that the 5080 is still faster than the 4080.
The RTX 5080 is more like an RTX 5070 with those specs. These specs are absolutely ridiculous for a next gen 80 class GPU.
The specs are fine. The 4080 has just a tad more CUDA cores than the 3080, yet it is 50% faster than its predecessor.
If the 5080 is also ~50% faster than the 4080, it will be a great product.
The only thing that matters is price. If it costs
@@stangamer1151 The only problem I see is the VRAM. 16 GB on what will be a $1000+ card is really not great. Of course we still have to see how good and widespread NTC will be and what the new iteration of DLSS will bring to the table. But as of now, 16GB on a top-of-the-line card is not great.
Damn, what is the 5060 💀
@@massimovolpe1343 20GB would be excellent, but I do not think it is a major issue. According to some rumors, new gen consoles will only release in 2028. This means 16GB will be enough even for 4K res until that time. I mean, 12GB is still perfectly fine for 1440p. 4K usually needs just about 2GB more.
@@stangamer1151 It's not. The 5080 has half of the 5090, which means they're selling a 5070 named as a 5080.
We saw what they did with the 4080 12GB xD
The 70 tier has always been half of the 90 tier, and now they will sell that half under a higher name/price.
Nvidia seems dead set on making their "the more you buy, the more you save" catchphrase true by making sure that all their mid-to-high-range cards are overpriced and underperforming. Only the highest tier gets decent products.
and what makes it worse... is that they NEVER have enough available to combat the scalpers and bulk-AI-farm purchases until it's almost half-way through the life-cycle. So only the high-end cards are really a single-gen upgrade... AND you can't F****ING buy them even if you have the money!
Except with these practices they actually disprove their own point: the RTX 2060 Super is a 5-year-old card and the RTX 4060 is only 30% faster with the same VRAM, so the 2060 Super maintained its value better relative to how much it cost to buy. The 2080 Ti, on the other hand...😂😂😂😂
@@Gustaviustwinkelberry That's just the fact that if you're going to the high end, then gen-over-gen gains are 4x what the bottom of the market is doing. Most people see value in a speed x cost + features kind of way, so the 4060 represents a somewhat bigger jump in perceived value, because the average entry-level buyer is more feature-driven than a mid-range buyer would be, where the equation gets harder. 4070S vs 7900 GRE is a decent example: if you don't care about ray tracing, the 7900 GRE does it cheaper.
@@elysian3623 It's just funny to me that the $1000 RTX 2080 Ti is now worth sub-$300 😂😂😂, a $700 loss!!! This is why I always buy low end, I just can't afford that amount of money bleed, especially with Nvidia's garbage practices screwing over previous-gen buyers with VRAM and features. Lmao, it's better economically to buy low end, save the money, and upgrade more often.
Well, if people weren't hell-bent on upgrading every year, their slogan is correct. A 4090 for $1600 could easily last you the next 6 years at 4K gaming. That's about $266 a year, which is cheaper than if you got a cheaper card from the get-go and needed to upgrade later on. These cards are actually the best value they've ever been, it's just that people WANT to upgrade every year.
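For what it's worth, that per-year figure checks out; here is a tiny sketch of the same math, where the lifespan values are just assumptions to show how the number moves:

```python
# Simple cost-of-ownership math from the comment above: purchase price / years of use.
price = 1600  # 4090 launch MSRP in USD
for years in (4, 5, 6):
    print(f"${price} over {years} years = ${price / years:.0f}/year")  # 6 years comes out to ~$267
```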
Everyone seems to focus on price patterns, but volume often gets overlooked. I've been learning it more and believe that mastering volume with price action really enhances technical analysis!
Can you share what you focus on in volume analysis?
Historical Volume, Price Movements, Support and Resistance, etc.. Pay attention to volume when a stock approaches key support or resistance levels. It is also crucial to understand market movements, i got some great help from my advisor on this, which really boosted my experience
Would you please provide the details of this expert who is guiding you? been considering financial advising for some time, but I'm having trouble locating a fiduciary.
*Layan* *Talia* *Chokr*
Just surf the web and make your research
Thanks for the pointer. I searched her name and her info came up
They neutered the 60 series in the 2000 gen. Then they neutered the 70 series in the 3000 and 4000 gens. Now they're doing it even to the 80 series this next gen.
Whilst costing more :/. The 4070 is like $800 here, tf.
The 2060 is 60% faster than the 1060 6GB while still having a 192-bit bus, so you are quite wrong about the 2000 gen. The 3000 gen is a disappointment, but the 3060 12GB is still good; the 3060 8GB and the Ti are not.
@@ridleyroid9060 in my country 4070super is around $600, but I cannot afford it rn
The 3070 was insane performance wise what do you mean?
@@massimovolpe1343 Yeah but the 8gb vram absolutely killed its future proofing.
The 512 bit bus pleases me, I hated the leaks saying it was going to be 448 bit. I agree, I think NVIDIA have cut the 5080 way too much and are trying to upsell.
Sounds like it's 5090 or nothing. The 5080 is neutered to hell
5090 is likely overkill and VERY expensive based on those specs. RTX 5080 and RTX 4090 will be great 4k GPUs IMO
5080Ti incoming
@@johnc8327 Nothing is ever overkill with the games we've got today. We NEED more power to run high frame rates.
The RTX "5080" is actually the RTX 5070. Just like with the 4000 series, there will be no real 80-series card, just a sad 256-bit, 16GB, puny GB203 die impostor. Also, power usage is beyond absurd. I also bet the 5090 "monster" will still not be fast enough to run the games maxed out at 4K without upscaling and fake frames.
@Shadowsmoke11 High frame rates? More like 4k 60 fps with pathtracing without upscaling would be revolutionary at this point.
It's typical Nvidia: upselling, pro max! The 4090 will be faster than the 5080 so prices don't drop. The 5090 will be massive for those who have enough money, and everything else will be a POS with a lot of AI upscaling tacked on to it, to somehow make the graphs bigger.
Yup, make the 5080 way below the 5090 so as to make the 5090 look like the better deal, just like they did with the 4090. Plus, the doubling of VRAM is really interesting when it comes to running AI models locally.
The 5080 also won't kill the second-hand market for the 4090, because it will probably be samey performance, but the lower VRAM kills it when it comes to AI. So Nvidia doesn't offend their current high-end 4090 users, basically keeping 90% of the card's value even when the 5080 is out.
If this is the top of the lineup, the rest will be pretty crappy by the way, same as it is today. Hopefully AMD will produce something for the midrange, but they'll limit it at 16GB anyways. We're so doomed.
Yep and 5090 will likely be $4000 aud
Newsflash: the 5080 is replacing the 4080, not the 4090.
They have already stopped making 4000 series cards, any you buy now are old stock being cleared out.
Looking like the ol' 3080 + 5800x3D combo just bought herself at least 2 more years
Not a bad combo. I had a 3080 and 5800x and loved it. I did end up going to a 5700x3d then returned it and went with a full upgrade to a 7800x3d and a 7900xt. My 3080 was great and I loved it. I didn’t “need” to upgrade but it sure was nice. I don’t blame you for waiting.
That's a rocking setup
5080 50% less chip? WTF NVIDIA!
Yes, Nvidia should have made a $1300 5080 instead of an $800-1000 5080.
More seriously, the 5090 seems to be a "let's spend 33% more than the 4090 to get a 33% bigger performance increase over the 4090 than architectural and clock speed improvements alone would give" design.
And Nvidia isn't going to cut their margins. The 5090 is a peak-performance-at-any-cost design, while the 5080 fixes the issue of the second-fastest die still needing to land in a reasonable price range for consumers.
what 2500 dollars?
I am expecting it to be 50% better than the 4090 for me to give $1600.
No deal.
So, no deal for you?
@@nossy232323 I am already locked in on lower GPUs. I'm waiting for them to officially disappoint me, and then I will click-buy so fast (e.g. a used 6900 XT) that my thumb will go through the screen.
In case you want to start the "don't be poor" crap... I can afford 2x 5090s whatever the price, but I am not a) a victim and b) an idiot.
It's a big boy card.
@@jamesm568 Then you should not have it at all! People succumbing to whatever demands are not "big boys" but wimps.
You act as if this is the best GPU in the world, but without competition how can you be so sure?
It's as if there were only one car company in the world saying "my car is the best!"
@@eawblablatron9161 Let's see. It probably will be the best GPU for the average person, since there is no other option out there that provides that type of performance. I would say it is a big boy GPU since it will be mainly adults that are able to buy it. If you want to play with the big boys, you pay the big boy price.
Me waiting for my 5070 upgrade looking at the 5080 specs 🤡
more and more people will find out
5080 is a complete joke. I’m happy with my 6900xt thanks nvidia. It’s almost like they want me to look at a 7900xtx with the price tag of a 5080 lmfao.
total waste of time...
The 5090 is an AI card at this point. The highest-end card for gamers is the 5080, which probably won't even beat the 4090.
I'm opting out of Nvidia stuff for at least the next 5 years, probably much longer. I got a 4070 Ti Super and it's sufficient for AI tinkering; I don't game much anymore, plus many AAA games are complete trash.
My previous card was a 2012 GTX Titan that lasted me 12 years; this one should get 5-6 years of use.
Frankly I'm done with the PC space and chasing hardware, my old 2012 tower PC is still running well.
The next one will probably be a laptop. People who want to buy $2500 GPUs for gaming are welcome to keep living on the upgrade treadmill; I'm done with companies like Nvidia that want to be the new Apple.
You need to be a whale to even care about that garbage.
Enjoy your 400W Intel CPU and your 600W Nvidia GPU.
They already make non-heat-producing optical photonic light wave CPUs and GPUs that are 1000x faster than the consumer electronic CPUs and GPUs on the public market today. They don't want to give the public the fastest, best tech overnight because they would lose billions in future sales schemes. The entire computer and gaming industry is a scam run by mafia criminals; don't fall victim to the hype, all your computer tech and gear is trash!
I skipped the 4090 cuz I thought it was too expensive, but I'm getting the 5090 no matter what. The games coming out of the UE5 engine feel truly next gen and it has only just begun; the idea of training your own data locally is extremely tantalizing; the FULL upscale of ANY video on your screen is taxing on the GPU; and 240Hz/4K are becoming the norm. Adobe has all but FULLY integrated GPU hardware encoding and real-time special effects into ALL of their software stacks, which, if you care, means 10x the work speed with NO exaggeration. CPU rendering is PAINFULLY slow in comparison. I'm pretty excited for the 5090 TBH!
Don't electric heaters come with 600W and 1000W elements? Honestly, no... just no. As good as the performance sounds, these are meme-level requirements.
Yeah Jensen is going to be on stage telling us all how a 5090 will actually save us money because we won't need to turn our heating on.... And remember "the more you buy the more you save" 🤣😭😭😭
@@timothywells8589 An RTX 5080 for 899 dollars and 400W seems okayish, but the 5090, goddamn...
Heard that after the last RDNA from AMD next year they might go for high end again. Rumors suggest a chiplet design, a 512-bit bus and 32 GB of RAM at 500W, which is also bullshit honestly. Maybe they will do a decent 24 GB on a 384-bit bus though. Either way the top-tier cards seem unreasonable, but in their defence... if you are willing to spend that much, it isn't like you have any claim to bitching about the bills...
If it's anything like the 40 series, the 4090 has a 450W+ power limit but almost never uses anywhere close to that much power. My 4090 is currently using 287W playing FF16 at 4K DLSS Quality. I almost never even hit 450W in most games, and I have a Galax 666W vBIOS flashed onto my MSI Suprim.
@@jjlw2378 There are games where I have seen the power draw reach 450-plus watts. In some systems, and in footage online, the GPU was drawing 450 watts in demanding games that put a strain on it, like Cyberpunk, Wukong, and especially the new Frostpunk game I think. You can limit the power too, but we all have our concerns with these GPUs; it seems the current technology is reaching its limits in power efficiency.
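If anyone wants to check their own card the same way, here's a minimal sketch using NVIDIA's NVML Python bindings (assumes the nvidia-ml-py package is installed and a single GPU at index 0; it only reads what the driver reports):

```python
# Minimal power-draw logger using NVIDIA's NVML bindings (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000  # NVML reports milliwatts

try:
    while True:
        draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000
        print(f"power draw: {draw_w:6.1f} W / limit {limit_w:.0f} W")
        time.sleep(1)  # sample once per second while a game is running
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```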
600W, 900W and 1500W... my old trusty oil heater
If Nvidia wants a significant performance gap between the RTX 5090 and RTX 5080, there should be a corresponding price gap as well. Price the RTX 5080 at $900, and it is a good deal. I think it is possible as Nvidia is using the same 5/4nm node.
5nm will be more expensive in 2025. Back in 2022 a 5nm wafer started at $17,000, 3nm at $20,000. Next year TSMC will increase the price of 5nm to $20,000. TSMC said something like all their partners have agreed to this price increase.
@arenzricodexd4409 That's 300mm wafer cost 🤔, don't you think 5nm is mature enough to have high yield and economies of scale ?
@@neti_neti_that's what I'm thinking - 4nm should be mature enough by now that there will be very few defects
@@tourmaline07 Yeah, I hope Nvidia passes these benefits on to us consumers. I'm not paying a dime more than $900 for an RTX5080.
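Rough context for what a wafer price means per chip: a back-of-the-envelope dies-per-wafer estimate. The die size and yield below are placeholder assumptions for illustration, not known Blackwell figures:

```python
# Back-of-the-envelope cost per good die from a 300 mm wafer.
# All inputs are illustrative assumptions, not confirmed figures.
import math

wafer_cost_usd = 20_000      # assumed 2025 price for a 5nm-class wafer (per the thread above)
wafer_diameter_mm = 300
die_area_mm2 = 380           # hypothetical large-die area, roughly "x90 class"
yield_fraction = 0.7         # assumed fraction of dies that are usable

# Standard approximation: usable wafer area minus dies lost around the edge.
radius = wafer_diameter_mm / 2
dies_per_wafer = (math.pi * radius**2 / die_area_mm2
                  - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))
good_dies = dies_per_wafer * yield_fraction
print(f"~{dies_per_wafer:.0f} candidate dies, ~{good_dies:.0f} good dies")
print(f"silicon cost per good die: ~${wafer_cost_usd / good_dies:.0f}")
```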
By using 600W it means they have done nothing in terms of innovation; they are just using the same old motor and supercharging it.
It's fine by me, but don't expect me to pay more than $1199 for old technology you have overclocked, Nvidia.
Don't expect them to sell it for 1199 lol. They should for once lose money on this power draw insanity, otherwise next time it will be a 1200-watt card. Insanity has no limits if you allow it.
@@gn2727 I won't, mate. And you are right, next time it's gonna be 800W, like an air heater.
What about "save the environment", make more efficient CPUs and GPUs (like the motto 5-6 years ago)?
Good to see nVidia screwing their fans more and more 😂😂😂 that 5080 is absolutely embarrassing if these specs are true 😂😂😂😂
Looks like things are only getting worse
Bro, in what way other than price? Performance is only getting better lol
@@NJMazani performance increase doesn't equal price increase
@@SD-vp5vo even better means there are even fewer disadvantages
You mean better? Faster and cheaper
@@xpodx cheaper?
512-bit 32GB? I wasn't expecting that. That's insane. I ain't complaining, though. This should provide a massive performance increase at resolutions such as 4K. I think the 5090 will have anywhere between 168-176 SMs.
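For a rough sense of what 512-bit buys: peak memory bandwidth is just bus width times per-pin data rate. A quick sketch, where the 28 Gbps GDDR7 figure is an assumed rumor, not a confirmed spec:

```python
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Assumed data rates: 21 Gbps GDDR6X (4090-like) vs 28 Gbps GDDR7 (rumored).
print(peak_bandwidth_gbs(384, 21.0))  # ~1008 GB/s on a 384-bit bus
print(peak_bandwidth_gbs(512, 28.0))  # ~1792 GB/s on a 512-bit bus, roughly 78% more
```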
Yh, I'm buying it just for the bus width alone, it's gonna help for AI work.
Virgins
With this card I expect my enemies to be killed before I see them around corners.
Man, I just really want good specs and a decent price on the 5070 so bad. I've already put 900 dollars aside to wait for the reveal, but if it's not going to be much better than the 4070 Ti Super then I'll just go with the latter.
You'd better wait for 5070 Ti. It will probably replace 4070 Ti Super in terms of price, but will be much faster (+30-40%).
@stangamer1151 Yeah, I think you're right. I'm currently using a Gigabyte RTX 3070, so a jump to the 5070 Ti would be enormous. Where I'm from the price of GPUs is insane: I can't find a 4070 Ti Super for under 1050 dollars, and even with the shipping cost from Amazon it would come to around the same price (a 4090 is 2600 dollars btw).
@@polobruh7255 or wait for a 5070 ti super version cause a 3070 is still powerful
have fun staying poor
@Uthleber Well what do you use then?
5080 probably only 20-25% better than 4080
That's very strange, don't you think, with the new memory type they're using in the 50 series?
just got a monster 4070ti last week but can't use it to its full power. Waiting on my RMA PSU, should I keep my card until next year and sell when the 5070 comes out or should I keep the card?
you are repeating yourself, sucker
Can't wait for Nvidia to say "we hear you gamers! More VRAM!" and release a 5080 Ti with the specs the 5080 should have launched with, and then there will be zero or negative price/performance improvement.
5090 2000?
5080ti 1500?
5080 1099?
Hotel?
Trivago 💯
@@dsayiasdad8099 and this will be like the 4080 S - too little, too late.
shrinkflation is insane
Should I get a 1200W PSU or a 1600W PSU if I get the 5090??? Anyone know?
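Nobody knows the real requirement until official specs land, but the usual rule of thumb is to sum worst-case component draw and leave 30-40% headroom for transient spikes. A rough sketch with placeholder wattages (the 600W GPU number is the rumored limit from the video, the rest are assumptions):

```python
# Rough PSU sizing: sum worst-case draw, then add headroom for transient spikes.
# All wattages are illustrative assumptions, not measured figures.
parts_w = {
    "GPU (rumored 5090 limit)": 600,
    "CPU (high-end, boosted)": 250,
    "Motherboard/RAM/SSDs": 75,
    "Fans/pump/USB devices": 50,
}
total_w = sum(parts_w.values())
headroom = 1.35  # ~35% margin for spikes and PSU efficiency sweet spot
print(f"steady-state estimate: {total_w} W")
print(f"suggested PSU size: ~{total_w * headroom:.0f} W")  # ~1316 W with these numbers
```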
If the 5080 is, let's say, the same price as the 4080S ($1000), that would be ok, but I can see them keeping it at $1200-1300 and the 5090 bumping up to $2000+.
Judging by how big the difference between the 5090 and 5080 specs is, I believe the price gap between these two cards will also be huge. $2000 and $1000 sounds pretty realistic.
@@stangamer1151 and it should, also could leave room for a 5080 Ti too.
$1000 for a 70-class card is not okay. Sorry to say that. With these specs this disguised 80-class card isn't worth more than $600, and even then I was being quite generous. The 4090's total production cost is not more than $300, and they started it at $1600. So it's not profitable, right? Noooooooo..........
I have no idea how strong the 5080 is compared to the 4090. But it looks like the 12 GB 4080 that Nvidia canceled in the past.
If NVIDIA delivers an RTX 5090 with a 512-bit bus and a 50%+ performance improvement over the RTX 4090, Jensen will go for a $3000 MSRP just so it can be the first gaming GPU to hit that price.
Thanks for this update man. Well done. Totally missed this leak.
So basically, I'm gonna have to lower many settings initially to keep the 5090 cool, and then I'm gonna have to spend at least $1200 AUD on a water loop to cool it properly, and that's not to mention I will need a new case. I'm not spending more than $3200 AUD + $200 shipping.
The only way I'm getting this GPU is if Nvidia has made a major breakthrough in thermodynamics and air cooling, and also given that it's not $2499 USD, which at 1.67x is about $4100 AUD.
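The currency math there roughly checks out; a small sketch of the conversion, where the 1.67 rate is the comment's assumption and the 10% GST is just an illustrative local markup:

```python
# Straight USD -> AUD conversion at the assumed 1.67 rate, plus 10% GST as an example markup.
usd_price = 2499
aud_rate = 1.67          # assumed exchange rate from the comment
gst = 1.10               # Australian GST; retail margin would push it higher still

converted = usd_price * aud_rate
print(f"straight conversion: ~A${converted:,.0f}")        # ~A$4,173
print(f"with 10% GST:        ~A${converted * gst:,.0f}")  # ~A$4,591
```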
With this power draw I will buy it if it's 3x faster compared to 4090.
Good thing I didn't pull the trigger on an SFF case yet; seems like I'll keep my EATX case. The wattage scares me, so I'm waiting to hear official numbers for power consumption, as my office gets toasty in summer.
What's your guess on the release date, Danny?
i would most likely get a used 4090 at those prices
just be careful with the connector
I'll just stick with my 6900XT, thanks
great card
It's a good card, but it sucked azz when it first came out due to poor driver support, though it did get better. I have one on the shelf myself collecting dust though. That card did make me switch to Nvidia because of AMD's poor driver support.
@@jamesm568 Not unless you were on Linux. Linux support is still better, though, things may change relatively soon as Nvidia is introducing open kernel modules. We'll have to wait and see what happens on the userspace side.
@@electronix6898 This was when it first released; AMD never updated the drivers for MSFS2020 till over a year later.
@@jamesm568 nice try nVidia Marketing team. you are so ridiculous.
$2,499? I have the money easily, but I also have to buy another expensive PSU to support the two 600-watt connectors, which puts it above $3,000 for the two. HELL FKIN NO!!!
5090 sounds like a nuclear reactor
Oh man I really hope that 5090 will cost $2499 it will be so funny 😂😂😂😂😂 please be true
I want to upgrade to it, but it's going to have such a stupid price point in Europe. The 4090 was 2300 euros when it released. Then it shot up to almost 3000 euros.
You either get it at launch and sell it for a profit so you can get it later down the line for cheaper, or not get it at all. The prices are ridiculous.
And I bet that the pricing and the fire hazard is also going to be quite aggressive for the 5090🤣🤣🤣
16 gigs of VRAM on 5080 = No Buy from me.
What blows my mind is that an almost two-year-old AMD card gives you 24GB of VRAM but NVIDIA can't even give you that on their new 5080. I get that most people don't need more than 16GB, but come on. At least give us 20GB.
This GPU needs to be on a show called "My 600 watt life"
"Prepare Your Wallets!!!!💰💸💰" is right 👍👍
half of 90 tier means 70 tier and Nvidia will try to sell us 5080 with 5070 specs xD
another rtx 4080 12gb move
Double the specs for more than double the price.
And almost double power draw lol.
Thank you very much for this video as I was on the fence on upgrading from the 4090 to 5090. I will definitely upgrade now.
How you summarized at the end is perfect. I'm finally leaving the console hellhole and getting ready to build a PC within the next year to year and a half.
With the new gen of GPUs on the horizon I've been excited to see how everything shakes out, but it's definitely feeling like I'll be getting one of these new RDNA 5 GPUs. Nvidia is looking to be way too rich for my blood.
It's early though, time will tell, and I'm glad I'm learning everything when I am; a year from now will be a much better time to build a modern first-time rig, as everything will be a lot more clear.
Not everyone has $5,000+ to build a PC, dawg. Fuvk Nvidia and their price gouging, I'll stay with my PS5 that still plays games good at 60fps.
No sort of preparation can make my poor wallet ready for those prices.
the 5090 is clearly two 5080 chips conjoined, 4090 to 5090 is likely a bigger perf jump than 3090 to 4090
So a 5090 could potentially offer at least 1.6 times the raster performance of a 4090 and with newer DLSS and AI technologies it can double this performance. That's a huge uplift since a 4090 is more than enough for the majority of the games at 4K.
Don't worry, just wait for the 5080 super :p
I refuse to entertain Nvidia. 600W is utterly stupid, especially when so many are still struggling with high electricity bills, and I absolutely refuse to use that power connector when there are still loads of issues and returns for what appears to be board burnout.
Even if you are not struggling, 600w as a precedent is very bad. Buyers should not encourage this inefficient approach or next time they will be selling you 1200 watt cards. The only reason they are doing this is cause they got away with 450 watts. 300 should be an absolute maximum and I don't care what everyone else says.
@@gn2727 Totally agree, maybe a max of 450W, but 400W was typical of older Nvidia cards in the early years, such as the excellent GTX 295 and GTX 400 series; even the epic 1080 Ti only pushed 320W and is still viable for 1080p on high. I still feel that this 5000 series is just an AI-overclocked pro card with the pro features burned out and stupidly sized cooling. At this rate, given that Nvidia have talked about their own PC company, I wouldn't be surprised if the 6090 ditched PC motherboards completely and did everything in their own slab PC with M.2 slots, using their own RISC chips into CUDA, compatible with x86: think a super Steam Deck in a triple mini PC.
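On the electricity-bill angle, the running cost is easy to estimate: watts times hours times your tariff. A quick sketch with assumed usage (20 hours of gaming a week at $0.30/kWh, GPU only):

```python
# Rough monthly electricity cost of the GPU alone: watts * hours * price per kWh.
# Gaming hours and tariff are assumptions; adjust for your own usage and rates.
def monthly_cost(gpu_watts: float, hours_per_week: float = 20, price_per_kwh: float = 0.30) -> float:
    kwh_per_month = gpu_watts / 1000 * hours_per_week * 52 / 12
    return kwh_per_month * price_per_kwh

for watts in (300, 450, 600):
    print(f"{watts} W -> ~${monthly_cost(watts):.2f}/month")  # ~$7.80, ~$11.70, ~$15.60
```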
TSMC doesn't give the same kind of price/perf improvements that they used to in the good old days, when wafers were cheaper and structures scaled better. But they do give smaller price/perf increases and the ability to spend more to get more. The 5090 is using that "spend more to get more", while everything below it is just using the price/perf improvements they got.
TSMC isnt the only one that makes wafers ASML and Zeiss (and a few others i think)
There goes my friggn tax return. I'm still rocking my 2080 Ti, waiting for prices to fall below 1k, even if I have to wait several years for a used card.
16gb of vram will be fine for a pretty long time at least until the next gen consoles which won’t be anytime soon. Adding more VRAM when you don’t really need it is pointless and will up the cost.
3 weeks ago this channel was talking about the 5090 being cheaper than the 4090 at launch. Now it's talking about it being almost $1000 more than where the 4090 started. Leaks are nothing but clickbait at this stage; until official figures and specs come out it's pointless to speculate as if they will be fact. I've heard them all: 5090s will have 32GB VRAM, then it's 28, then back to 24 again, then back to 32GB.
I was really hoping it would only be a small increase, basically a die shrink and a more power-efficient 4090 that's cheaper. They need to take a break; these graphics are getting way too advanced for the games we have available, unless you want to run at 4K/8K.
Nvidia's GPU designs are no longer driven by gaming requirements but by the needs of AI; these overpriced and overpowered flagship cards are basically budget data center cards with less memory and fewer CUDA cores.
If Nvidia really wanted to they could release a 48GB VRAM card with the 5090, but they don't want to cannibalize their workstation sales.
As Jensen Huang is now so fond of saying, Nvidia is now an AI company; gamers will get AI features whether they want them or not.
“The more you buy, the more you save!”
WE ARE OVERCLOCKING CONNECTORS WITH THIS ONE.
Come on AMD... we need some competition. There is no way they couldn't come out with a more powerful 7900 XTX.
Rumor is a 7900xt equivalent at $500.
Amd at the moment is not targeting the flagship market but the mid range.
AMD could. They have the means to. But they've already publicly stated that the premium high end is not their main focus. They want to dominate the mid and low range segment and build market share, which is definitely the right move IMO.
5080 16GB $1,299; 5090 32GB $1,799; RTX 5090 32GB TITAN AI $2,599 with NVLink in late 2025; 5080 24GB later in 2025 for $1,499.
Does the 50 series really require a PCIe 5.0 x16 slot?
My motherboard doesn't have it. Didn't think I would need it this soon 😞
No, it does not need it. It's backward compatible.
The "Republic of Gamers" all-WHITE ROG Strix 4090 I purchased was $2,500. I'll just get a reverse mortgage on my house in California to buy the all-white ROG Strix 5090 when it's released.
That's what happens when you've got a 10% state tax. Move to a better state. That should help.
@@goalieman206 maybe try leaving America altogether at that point
I'm definitely getting an Asus TUF/Strix or FE. The waterblock support for the MSI Suprim is so lackluster. I just want a Heatkiller block for my 5090.
nvidias motto: "double the price for half the specs"
I would concentrate on the 4080 vs the 5080, since that is where the realistic improvement is. When I look at the specs of the 5090, it is pretty much double the 5080, to the point where the first thought that comes to mind is that the 5090 has two 5080 GPUs on the same card. The fact that 600W is the supported limit/max and that it has the same 12-pin power connector also points to two GPUs on one card. And because those GPUs would not be pushed as hard as the 5080 (2 x 300W for the 5090 vs 1 x 400W for the 5080), there is less heat per GPU, so a similar-size cooler can fit the 5090. Performance, specs, temperature and cooling all indicate two GPUs instead of one in the 5090. Old SLI comes to mind, but on one card.
Talking about competition.
The Intel Arc A770 has 32 Xe cores, and the top Battlemage part is predicted to have more than 32 Xe2 cores. Xe vs Xe2 alone is already a ~50% jump in performance, and I've seen rumors of 40 Xe2 cores, if not more, for the top Battlemage. So we're getting close to double the performance of the first-gen Intel Arc GPUs.
The A770's GPU clock is 2.1 GHz. The upcoming BMG-G21, predicted to replace the A750, has rumored clocks reaching 2.85 GHz, so we should expect 3 GHz and above from the top Battlemage performer. Then there is the memory speed/bandwidth: from 17.5 Gbps / 560 GB/s on the A770 to 19 Gbps / 608 GB/s on the top Battlemage.
Now add Intel Application Optimization (APO), which promises a 10-50% gain with a top Intel CPU. Intel's XeSS XMX is also close to equal to DLSS.
Don't forget that Intel Arc, via ReBAR, uses a two-silicon approach, taking advantage of both pieces of silicon and doubling the performance. For example: i7-13700K 253W (max) + A770 225W (max) = 478W (max). In general, wattage = performance. It's a good indication that the rumored clock speeds now match the competition's.
Plus, with the two-silicon architecture there is no queuing or waiting, meaning every frame is clean. The A770 in reality already gives you a better 4K experience than a 4070 Ti. Current testing doesn't capture the combined performance of the two pieces of silicon, or recognize an architecture without queuing or waiting, counting every error and broken frame. To beat it, you would need at least double the FPS from AMD/Nvidia's single-silicon architecture in demanding high-resolution tasks where the CPU plays only a very small part. Hence the insane wattage, like 450W on the 4090 or the predicted 600W on the 5090.
Intel is cooking something big for sure, and don't forget that Arrow Lake also plays a huge part. Battlemage's predicted wattage is 250-275W, so it looks like it will sit close to the 4080/5080 cards with a combined wattage of 500-550W. Arrow Lake's max wattage is also predicted at 250-275W, and it would make sense to match the GPU and CPU wattage to avoid bottlenecks. So when talking about Battlemage you should also factor Arrow Lake into the picture, because it is going to be a big part of the performance. Also, new motherboards for Arrow Lake will most likely have enhanced ReBAR for pairing over the PCIe 5.0 interface. Would make sense.
4090: 24GB VRAM, 384-bit bus, 450W
4080: 16GB VRAM, 256-bit bus, 320W
4070 Ti: 12GB VRAM, 192-bit bus, 285W
A750: 8GB VRAM, 256-bit bus, 225W
A770: 16GB VRAM, 256-bit bus, 225W
My price prediction for the top Battlemage is $500 (for reference, the 4080 is $1300+). But here is the catch: you need the top upcoming Arrow Lake CPU to get all the performance out of Battlemage, which means a new build, roughly $500 for the CPU and $500 for the motherboard. That's about $1500 as a rough estimate (see the sketch after this comment), so it doesn't look that cheap anymore. But if this combo competes with the 4080/5080, it starts to look like extremely good value if you are building new anyway. The power cost also ends up very similar once you add the CPU: Arrow Lake's predicted wattage is 250-275W, the top AMD CPU is 170W, and Battlemage is predicted at 250-275W, against the 4080's 320W and the 5080's predicted 400W. Top Intel combo: 275W + 275W = 550W, vs top AMD CPU + 4080 (5080): 170W + 320W (400W) = 490W (570W).
As a summary, I think we are looking at doubling the Alchemist / 13th-gen / 14th-gen performance with a Battlemage / Arrow Lake combo that would compete with the 4080/5080. The potential is even there to beat those cards. It will come down to price, and to building new vs keeping the old for top performance. So watch for the Arrow Lake CPU launch, and you can be sure Battlemage won't be far behind.
Going back to the dual-GPU 5090 prediction, or maybe it's just a hunch: it would do a similar job to the Intel CPU/GPU pairing. There is talk about Nvidia getting into the CPU market, and I think this might just be the reason to have two pieces of silicon instead of one. This is just guessing and drawing conclusions from the specs.
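A minimal sketch of the cost and wattage comparison above, plugging in only the rumored numbers from that comment (the $500 Battlemage/Arrow Lake/motherboard prices and every wattage figure are speculation, not confirmed specs):

    # Rumored build cost for a new top Battlemage + Arrow Lake system (all speculative)
    gpu_usd, cpu_usd, board_usd = 500, 500, 500
    print("New Intel build:", gpu_usd + cpu_usd + board_usd, "USD")        # 1500 USD

    # Combined CPU + GPU power draw, using the predicted/rumored wattages above
    intel_combo = 275 + 275      # top Arrow Lake + top Battlemage
    amd_plus_4080 = 170 + 320    # top AMD CPU + 4080
    amd_plus_5080 = 170 + 400    # top AMD CPU + rumored 5080
    print(intel_combo, "W vs", amd_plus_4080, "W vs", amd_plus_5080, "W")  # 550 vs 490 vs 570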
Great video Danny!
Brace for another round of 16-pin 12VHPWR connectors melting/burning
lol, we are getting one next-gen GPU; the rest are old crap for a slightly better price.
Nvidia is missing an opportunity to mess up names even more. Call the 5090 the 5090 T (Titan), then release a 5090 TTI Super.
I think the 5090 will be priced high enough that scarcity shouldn't be an issue after 6 months. It's the 5080 that's gonna sell out if the price is right; the 5080 looks like the card that can max out 4K.
There is no way the 5080 can max out 4K. In fact even the 5090 won't without upscaling.
@@nossy232323 What the hell are you trying to play bro? 😑
The biggest jump was RTX 3000, when a 3060 was trading blows with the RTX 2080. This will never happen again.
You can use an Ikea Pax closet as your case.
I just wanna know if I should sell my 4070 for $480 and use the rest of the year to save up for the 5080, or just buy the 4080 Super. Paid $688 for the 4070 when I bought it.
do it, to stay a poor nvidia sucker.
How many TFLOPS will the 5080 and 5090 have?
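One rough way to estimate that while we wait for official specs: theoretical FP32 throughput is CUDA cores x 2 FLOPs per clock (FMA) x boost clock. The core counts and clocks below are only the rumored figures floating around these leaks, not confirmed:

    # Theoretical FP32 TFLOPS = CUDA cores * 2 FLOPs/cycle (FMA) * boost clock (GHz) / 1000
    def fp32_tflops(cuda_cores, boost_ghz):
        return cuda_cores * 2 * boost_ghz / 1000

    # Rumored figures only (placeholders, not official)
    print(round(fp32_tflops(10752, 2.6), 1))   # 5080: ~55.9 TFLOPS
    print(round(fp32_tflops(21760, 2.5), 1))   # 5090: ~108.8 TFLOPS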
I don't believe they're just gonna jump $1k from the 4090. I wanna say $2k for the 5090 and $1,500 for the 5080.
My RTX 4090 now looks better and more appealing than the RTX 5080. 16GB of VRAM is a problem, a big problem, in these new demanding games, and 400 watts tells me it can't draw the power it needs.
Anyway, I'm obviously going to get the 5090, given I have the 4090 now and the 5090 looks like the only true next-gen card here. Looking like a beast. Will need a whole new PSU for it! 😁
I'm on a 1660 Super right now and was going to get a 4090, but seeing these leaks I'm definitely going to wait for a 5090 and ride that out for 5+ years, if the gaming market learns how to optimise its games.
I've got a 1500W PSU... hope that's enough.
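Rough power-budget math suggests 1500W is plenty, assuming the rumored figures (600W is the rumored 5090 board power, and the CPU and rest-of-system numbers below are just ballpark assumptions):

    # Rough sustained system draw - all numbers are assumptions, not measurements
    gpu_w = 600       # rumored 5090 max board power
    cpu_w = 250       # high-end desktop CPU under load (assumption)
    rest_w = 150      # motherboard, RAM, drives, fans, peripherals (assumption)
    total = gpu_w + cpu_w + rest_w
    print(total, "W sustained, leaving", 1500 - total, "W of headroom on a 1500W PSU")

That still leaves margin for transient power spikes, which is the main reason to oversize the PSU in the first place.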
@@cyrus1411 Well, it's not so much about optimising their games; it's how it should be. Games should push hardware to its limits, and top-end graphics should keep being pushed. Alan Wake 2 really put a 4090 through its paces, and if it didn't, and the 4090 were hardly used, then we'd be doing something wrong.
It's like those clueless console owners who go "my console is barely being stressed, it's the games". No, your console can't even run some of these games at 1080p now; it's just upscaling them into a blurry mess, which is what you'd expect, because the consoles are 4-5 years old, on hardware that was barely mid-tier at the time.
32GB of GDDR7 is a game changer; looking forward to building an Arrow Lake + Blackwell PC on a G4 OLED.
With zero competition along with the insane specs of the 5090 we can expect a $2500 price tag. The days of Nvidia giving gamers anything decent ended when people climbed all over themselves for a $1600 card.
glad you call AMD competition with 1/10 market share :D
I definitely disagree with you comparing this to the 20-series generation. The 5090 is an absolute monster, while the 2080 Ti was trash.
Somehow, I'm not surprised about the wattage and price increases. 😝
If the 5080 specs are that low, there is definitely a 5080 Ti and Ti Super coming. So basically the 5080 is just another 5070 Ti.
Would it be better to stick with a 4090 if I don't wanna deal with scalpers and stuff for a 5090?
Or hypothetically, if someone wanted to build a new gaming PC, would it be better for them to get a 4090 over a 5080 in terms of performance? (Assuming they didn't wanna pay the 5090 price of who knows what.)
The only 4090s available will be on the used market after this quarter; the remaining new stock is being cleared off the shelves now, so you need to decide very soon. Scalper prices are already appearing for 'new' 4090s being resold on eBay and Amazon.
@@glenyoung1809 that’s true, i do agree that the 4090s will probably be scarce pretty soon, and people will see that as an opportunity to make money
You need to wait for the final prices and specs; all of this is subject to change. My guess would be the 5080 over the 4090, depending on what you are planning to use it for. Heck, I have the 4090 and the power connector has me so nervous... if they solved that with the 5080, the peace of mind would be enough to push the decision in the 5080's favor.
@@Recuper8 lol, I can't support the 5080 with 16GB of VRAM; it seems like going backwards as a 4090 owner. Unfortunately the only upgrade that makes sense is the 5090.
@@Jigaboo1929 The 16GB 5080 for $1,200 is a clear ripoff, but Nvidia and Jensen Huang now have the same attitude as Apple: either buy it or get lost.
Guess I’ll hold onto my 3080 or get the 4080ti once the price drops and I get the rest of my stuff upgraded
Hope it comes out sooner than expected.
I'm OK with my GTX 1080 Ti; these new cards aren't worth the high prices they sell for, considering how price-to-performance has worsened compared to older generations.
Correction, those specs belong to the RTX 5070.
Now all we need is a 9800X3D
My guess is that 2slot coolers are going to be A4 format
What happened to that "Nvidia RTX 5090 Could Be Cheaper Than You Think! Here's Why - Are They Finally Listening?" video? The one with the $1,299 price tag in the thumbnail.
Just so unappealing to new buyers. Looks like I won't be getting an Nvidia GPU, as they pretty much shame anyone who doesn't get the absolute highest-tier card they have.
I guess we have to wait for RDNA 5 or Celestial to fix the market.
Steam Deck 2, with triple-A games that are actually optimized.
@@MrEdioss The Steam Deck is an APU; I don't think it can run a lot of games without compromises.
Idk, if the ban on 4090s in China is still a thing and they are discontinuing the 4090, I feel like they're purposely making the 5080 a little weaker so they can sell it there.
Yeah, no way am I spending on a 5090 when companies don't even try to optimize their games anymore and just slap on "frame gen/DLSS required". Optimize your games and we won't even need a 5090.
PC hardware is becoming a luxury item.
The RTX 5080 has the same CUDA core count as a 3090 Ti, just on a better node, and it even has less memory bandwidth and 33% less VRAM. No way should this be $1k; $800 max.
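The bandwidth part of that comparison is easy to sanity-check: bandwidth = (bus width in bits / 8) x per-pin data rate in Gbps. The 30 Gbps GDDR7 figure for the 5080 below is only the rumored speed, not a confirmed spec:

    # Memory bandwidth (GB/s) = bus width (bits) / 8 * data rate (Gbps per pin)
    def bandwidth_gbs(bus_bits, gbps):
        return bus_bits / 8 * gbps

    print(bandwidth_gbs(384, 21))   # 3090 Ti (GDDR6X @ 21 Gbps): 1008.0 GB/s
    print(bandwidth_gbs(256, 30))   # rumored 5080 (GDDR7 @ 30 Gbps): 960.0 GB/s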
I'm just waiting to see what the 5060 will look like, since I'm squarely in that market bracket. I can't afford, nor can I fit, the likes of a 5080 or 5090 in my case.
= a 3080, just with 9GB and a 96-bit bus
@@Uthleber A 96-bit bus? You mean 192? Because if we get another 8GB, 128-bit card I'll be fairly disappointed. At least make it 12GB; 128-bit I guess is fine.
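The bus-width/VRAM pairings in this exchange follow from how GDDR works: each memory chip has a 32-bit interface, so capacity is (bus width / 32) x GB per chip. A small sketch, assuming standard 2GB chips and the larger 3GB GDDR7 modules that memory vendors have announced:

    # VRAM capacity = (bus width / 32 bits per chip) * GB per chip
    def vram_gb(bus_bits, gb_per_chip):
        return (bus_bits // 32) * gb_per_chip

    print(vram_gb(96, 3))    # 9 GB  - the 96-bit + 3GB-chip combo mentioned above
    print(vram_gb(128, 2))   # 8 GB  - the configuration the reply is worried about
    print(vram_gb(192, 2))   # 12 GB - what a 192-bit bus with 2GB chips would give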