Expect the RTX 5080 Ti to get 24 GB of VRAM, as that would make sense, but the price might be higher than the RTX 4090's, with better performance because of GDDR7.
There's still the core count to consider. Faster memory won't really matter for games unless the 80 Ti's core count goes way higher than the 80's.
744 mm² means fewer dies per wafer. Accounting for other material costs, including the new GDDR7, someone broke it down in a video and put it at a little over $1,900 if the 4090 is the baseline for what that card costs to make. This card is easily going to be over $2k. Sadly, I hate it, and I hate even more that the 5080 is basically a 70-class card, since it has half the cores of the 90.
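The "fewer dies per wafer" point checks out with the standard gross-die estimate. A rough sketch, assuming a 300 mm wafer, the rumored ~744 mm² die, and the 4090's ~608 mm² die for comparison (yield losses not included):

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # First-order industry estimate: wafer area divided by die area,
    # minus an edge-loss correction for partial dies at the rim.
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(gross_dies_per_wafer(744))  # ~70 candidate dies (rumored GB202 size)
print(gross_dies_per_wafer(608))  # ~89 candidate dies (AD102 / 4090 size)
```

Roughly 70 vs. 89 candidates per wafer, about 20% fewer before yield is even counted, which is where the higher per-card cost comes from.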
The 5090 probably will cost that much, but the 5080 simply cannot sell past $1000. It's a boosted 5070 Ti... and the 5070 Ti will probably be trading blows with a 4080 Super. Unless they have the gall to charge more for the 5070 Ti, which is comparable to a card you can buy for $900 on the used market... there's NO WAY a $1200 5080 would sell. Even the most dense Nvidia fanboy can see to either get the 5070 Ti or the 5090.
No, it's an 80-series card. The 90 series used to be dual-GPU cards, so it makes sense to use that nomenclature for a card that has roughly twice of everything. The difference is that, unlike in the past, companies can now produce massive dies somewhat economically, so there's no need for two GPUs or any of the associated downsides, like extra interconnect latency or the issues of CrossFire/SLI.
I own a 3090, bought at launch. I refused to buy the 4090 at launch, and that was a mistake on my part given the performance for $100 more. I looked at the prices on Newegg today, and the cheapest was $2,800.
The more technology advances, instead of sizes shrinking and us getting a decent compact video card, we get another giant slab, like a Dodge Challenger instead of a Ferrari. What a sad disappointment.
The one takeaway I noticed here is that if those numbers are accurate, it puts Intel's B580 in a very competitive spot with the 5070 right off the bat.
That additional 50W of power the 5070 is going to suck down could power an M4 Mac running at full load (CPU, RAM, SSD *and* GPU) with 10W still left over for external devices.
@@rev3489 You're right, I'll actually get some work done. But feel free to pay $2,000 for a sand brick that eats 600W for a few extra FPS. Best wallet-opening device ever!
Well, even the M4 Max is stuck at 40-50 FPS at medium settings in 1080p for the few games that run on it. GPU performance is a whole different thing from CPU performance, and to this day no one has created a low-power competitive GPU, not even Apple, so your comment is invalid.
@@Max-9871 Well thanks for making that up on the spot, Max-9871, but I think we all prefer actual facts. The M4's GPU cores are on the same die as the CPU, along with the neural engine cores. All 3 share the same RAM. So how did Apple manage this in under 40 watts while Intel and nVidia are pushing systems closer to 1,000W?
These insane component prices and limited stock remind me of the pandemic days... unfortunately, having a "high-end" PC has now grown out of reach for most.
I find it interesting that so many consumer tech channels are hyping the 5090 when that overpriced card isn't even marketed for or priced to be purchased by enthusiasts.
If the 9070XT is more than $450, it's DOA...plain and simple. AMD wants to 'be aggressive' with their pricing, to lure away NVIDIA customers, well...here's their chance. DON'T BLOW IT, AMD!!
I'd say the 5070 vs 4070 hardware comparison is a bit misleading. The 4070 was already an alright card, despite the price. The undisclosed clock speed increase, paired with the much faster memory for texture and physics calculations, should have a somewhat noticeable effect on FPS.
I have a 7900 XTX on order for $800, and I'm not feeling any reason from AMD or Nvidia not to just stick with that. Considering I'm not a hardcore gamer and don't care as much about ray tracing, I'm thinking the 24 GB of VRAM is the better future-proofing.
The 5090 will be ~15-25% faster than the 4090 because it's on the same process node, so the IPC gain will be small. Instead they push clock speed and power draw to increase performance. That's also why they never launched a 4090 Ti: it would make the 5090 look like a small upgrade.
As an AMD fan, I find the 9070's performance kind of disappointing. I already own an RX 7900 XTX, and it looks like I'll have to stick with it for quite a long time.
@@ilbro7874 What is this thing about the 7900 XTX being a room heater? I have one and it doesn't run hot at all; even in the most demanding games it doesn't go above 70°C.
So much conflicting information. The last leak said the 9070 XT was close to the 7900 GRE in performance, though to be fair that was Time Spy, and games always perform differently. Still, a gap this huge would be surprising; we'll see. This card could make at least a little sense at $650, though I'm still hoping for closer to $550. If this is true, it could mean there's a $500 card close to the 7900 GRE and 4070 Ti in performance.
In my opinion the 5070 Ti should be compared to the 4070 Ti Super, and when you do, it's an even smaller increase in specs, almost negligible, yet it will cost 20% more by the looks of it.
So either Nvidia charges $900 for the 5070 Ti (where it won't sell well against just getting a used 4080/Super), or they charge $1200 for the 5080, which would then be DOA because it's just a boosted 5070 Ti. Save $300 and sacrifice 15%, half of which you can claw back with an overclock.
I would have loved to see the 9950X3D with the 3D V-Cache on both dies. Based on videos I've seen, it's a pain to jump through the hoops to get the CPU to park the proper cores. I'd pay extra to just have the whole chip stacked.
Nvidia charges what they want because nothing else is even close in performance. The 9070 XT will have to be less than $450 to even be viable, even less if they want market share; it barely competes with the 4070 Ti.
I think the 50-series cards should be compared to the 40-series Super cards. The 4070 Super has 7168 cores and the 5070 has 6144; what a leap, minus 1024 cores. The 5070 is a joke.
I was really hoping AMD's flagship would beat a 4080. It seems both AMD and Nvidia are holding back in the midrange market. They have GDDR7, which is way faster than GDDR6, yet a 1440p gamer has to sell a kidney to play at good frame rates.
I don't want to pay A$2799 for a 5080 after paying A$999 for a 3080. I'll be getting a 9070XT, or a 7900XT or 4070 Ti Super. When they don't do the job I guess I'll never play PC games again.
We've had the rough specs of RDNA4 and Blackwell for most of the year. People are scrounging for every scrap of info in the leadup to CES in just 11 days.
@@TheShaunydog Funny, I'm in the exact same boat and I'm considering the exact same set of options. I want to lean AMD, but they just make themselves so damn difficult to support.
@@TheShaunydog tbh I love my Suprim X; it overclocks well, runs cool and quiet, and still handles so many of my games well. But the next gen of games (a bit unoptimized, maybe) is coming, and it just can't keep up unless I lower settings. I want to get into ray tracing and up the visuals of my single-player experience at 1440p, so it's time.
@@gavinferguson2938 I've got an RX 6800, but I need upscaling in a bunch of games and FSR looks kind of bad. If AMD could just get an upscaler on DLSS's level, or preferably more raw performance, I wouldn't mind buying them. But I'm planning on keeping my card for years and know upscaling will inevitably be needed; Nvidia just has better quality and features for now.
Everyone is stuck on this stupid minutia of the naming scheme. It's not an Nvidia thing: when people talk about these cards in terms of CLASS, we don't say "700/600", it's always a "70/60" GPU. Why keep the confusion when people clearly prefer to talk this way even when Nvidia hardware is completely out of the conversation? That is to say, people talk like this when comparing AMD GPUs to each other. It's pointless mental gymnastics, and annoying.
BIIIIIIG IF: IF the 9070 XT is close to a 4080, day-one purchase. If it's below a 7900 XT, I might explore Nvidia. 4080 performance would be incredible if it's priced well.
@@trentongardner2106 Yeah, I've seen those, but here we are with another video claiming near-4080 performance. I expect GRE-plus, but 4080-ish performance would be a day-one buy if it's in the $600 range.
Sorry for some of the issues with the audio. It somehow got switched to the wrong framerate, and there were some errors I didn't notice when changing it back.
I am more excited for the price drops on the cards below it. Unless 50-series prices land in the 800-euro range.
Ask them why the memory controllers on their I/O die are so bad, and why they don't support faster RAM frequencies, which seem to make quite a difference, especially in gaming and AI. Having chiplets is no excuse: Intel has chiplet-design CPUs that support higher-frequency RAM. Also ask whether they plan to include a better memory controller in the next generation or a refresh.
Edit: also ask them why they aren't using two X3D cache dies on the 9950X (and on any X3D CPU with more than one CCD). "The extra memory doesn't make a difference" is a BS excuse; tell them (because that's what they will respond with) that it doesn't have to be extra memory. They could just halve the size of each X3D cache die and put one on each CCD, so again e.g. 128 MB total, but instead of one CCD having 96 MB and the other 32 MB, both would have 64 MB (or, god forbid, 76 MB each), so there's no scheduler issue over which thread goes to which CCD when gaming...
From your point of view, how likely is it that RX 9070 TI models will come out that don't exceed 305 mm in length or two slots in width?
PLEASE PLEASE PLEASE ASK AMD WHY THEIR PRODUCT NAMING IS SO CONVOLUTED, CHANGES SO MUCH, AND IS NEARLY IDENTICAL TO THEIR COMPETITOR'S PRODUCT NAMES?
(or something along those lines that doesn't get you kicked out of CES lol.... but you HAVE TO bring up them changing their names SO MUCH 🙏🙏 PLEASE!!)
yeah... my audio is in italian. and I can't change it, so I have to read the closed captions instead...
DONT BUY ANY RTX CARDS UNTIL THEY FIX THE PRICING!!!!!!!!!
I'm buying one day one
@@buzzkill8214 me too.....5090 baby!!!!!!!
nice try I’m getting one as well
Sorry, I gotta have it 😂
Haha get ur money up. The people who can afford it don't mind bro.
if that 9070xt is priced correctly then nvidia is in trouble.
I bet it will be more expensive than the GRE: same raster, more ray-tracing performance; otherwise they wouldn't discontinue the GRE. So $600-$700 at least, so it will be bad at best lol.
@@cin2110 It's said to be as fast as the 4080. The GRE did not compete with the 4080; it'll be around 7900 XT performance. $600-700 sounds like the right area. Even then, the rumor is that it'll be at a good price point. They're aiming for more market share, and in theory the only way to do that is by offering higher performance at lower prices. If the 9070 XT drops at around $500-600, then Nvidia is in trouble. That's if they hold true to what they've said. Can't trust big corps; only time will tell on the price point.
@@moldetaco2281 A Time Spy score was leaked, and it was barely faster than a 7900 GRE.
@@cin2110 absolutely
From your lips to God's ears.
“We want to distance ourselves from nvidia “
*uses the same naming scheme as nvidia
Lol okay
That leak sounds fake as fuck though. Seriously? 9070? Who's begging for their leak to get attention here?
It's to help consumers find their way by borrowing the known names of the market leader, and honestly it makes things easier for me, like with processors.
KOPITE7KIMI
It makes it easy to compare and find out which GPUs from the two companies are the same tier. It's good for the consumer IMO. This 9070 XT equals a 5070 or 5070 Ti.
they can say what they want but if they feel it can make them more money they will do it
That Intel B580 is starting to look better and better right now for $250; hope AMD or Nvidia has a better option for the price once the new gen hits.
just wait till the even cheaper ARC B570 comes out in a few days....
almost RTX 4060 performance in most games, and even matching the nvidia card in Cyberpunk.
for like a tad over $200 on launch.
hehehehe go intel!!!
Def will
and the possible B770 will darn sure be a better choice for less money
THEY DO NOT EXIST AT $250. Stop praising a fake price, and channel that energy into being pissed it no longer exists.
The performance of the B580 is pretty bad in most games; I thought it was supposed to be a 4060 Ti competitor, but it's nowhere close. A used 3090 is your best bang for the buck atm.
3:10 I think you got this a little wrong. AMD did not have a competitor for the 4090, and the 7900 XTX was a competitor for the 4080, if even that.
They just didn't match their 7X00 numbers to Nvidia's 40X0 numbers, and honestly, why should they?
IMO you have to look at performance at a similar price range and nothing else.
The 7900 XTX is between a 4090 and a 4080 in performance. The ray tracing probably isn't as good as the 4080's, but it's only about 20% slower than a 4090 in raw rasterization.
And a gen later they still don't have a real competitor for the 4080. That is pathetic. They are literally almost two generations behind, and I only see that gap widening over the next few years. Intel will likely catch up to AMD next generation and then surpass them. I see AMD tanking like 3dfx did.
@TheJackelantern ragebait used to be believable lmao
7900 xtx gets real close to the 4090 in raster with a good overclock
I mean... $1000 for the 7900 XTX, and double that amount for the 4090. Nvidia's pricing is atrocious.
@@TheJackelantern you sound like one of those idiots that’s actually gaming on like a 4060 or some shit
Yk what else is massive??
your mom
Don’t
MY MOM
Ahem, JUNGLIST MASSIVE
THIS LOW TAPER FADE
People are saying that if the 9070 XT is priced well, then Nvidia is in "trouble".
Look... even if it sells well, that's still far from "trouble" for the likes of Nvidia. It's teetering on the edge of delusion to suggest Nvidia is anywhere near "trouble" in 2025/26.
True. As Warren Buffett said, the market can stay irrational longer than you can stay solvent. Nvidia is the new Apple. A cult. The fact that new iPhones with gimped, three-generation-old features are still selling says it all.
I would assume they're talking about the gaming GPU market. They'd probably lose some market share, but they don't care about their gaming revenue at all; it's tiny compared to their AI revenue. That's why they're always priced ridiculously high. They know they are the Supreme of PC gaming, and it doesn't even matter if people aren't willing to pay that premium: every company in the world is getting on its knees to buy their AI cards and software.
Nah, even if the 9070 XT does phenomenally well as far as GPU sales go, it could capture what, maybe 15% market share at the absolute most? That would also mean a LOT more people buying in this price range than normal. So yeah, that's a best-case scenario. More realistic would be between 5 and 8% again... if it does really well. But hey, it would be a move in the right direction for a change.
Obviously, man, Nvidia's market isn't gamers XDDD, that's 20% or less hahahaha
It's about which card Internet Cafes choose...
I think whoever says they buy overpriced GPUs are bots created by these companies. Don't buy overpriced products.
I'm buying a 5090. I'm from Ohio, not a bot. I'm 20 years old, turning 21 in May.
I'm buying the 5080 day one
"Overpriced" is a nonexistent concept. I have tons of money. For me, they're cheap.
What about KOPITE7KIMI?
@@tnh.tenshi With current AI, giving that response isn't hard, bot.
AMD figured that an "8800xt" would be compared to a 5080 and suffer in comparative performance. AMD's "Navi 48" card may be such a better value in terms of memory & price/performance than a 5070 that AMD changed the name at the last minute to "9070" to invite that comparison instead.
A gamble that will totally work Now that we have confirmation of a garbage 12GB 5070.
From all the leaks so far, the 9070 XT looks highly disappointing. It's no good comparing it with current-gen cards; it's supposed to be the next-gen card. It's looking barely better than a 7800 XT (which is what I already have) so far. Nvidia is going to absolutely dominate this generation in terms of performance. And that's tragic for everyone's wallets!
The leaks have put the 9070 XT no lower than the 7900 GRE, with the ray-tracing performance of the 4070 Ti and a max price of $600... That's the worst-case scenario, not even including FSR4.
@@akathenugget That's not the worst-case scenario, as the leaked price range went all the way up to $650.
Of course Ngreedie is trying to upsell their higher-end models. Why pay for a GPU with a marginal performance increase when you can spend 5x the money and get 10x the performance? So glad I got a 7900 XT.
I mean, that would be amazing value if 5x the cost got you 10x the performance. With higher tiers it's usually a 100 percent price increase for 50 percent more performance.
See, I don't get how the new GPU comes with fewer cores than my 6900 XT.
If I'm not mistaken, I have 5120 cores (Toxic Sapphire 6900 XT, liquid-cooled). While I'm at it, even the 7900 XTX has 6144 or something close to that, so why go from ~6k down to ~4k? I don't get it.
Fewer cores, higher speeds is my guess.
Same or better performance with fewer cores. Better efficiency, optimizations, etc. More cores doesn't automatically equal better performance.
@kirilbarbov6949 no but more cores mean more processing
Please ask AMD, "When are you going to fire your Marketing department?"
"We might get regression at the XX60 series next gen"
We already did, the RTX 3060 matched and sometimes beats the RTX 4060.
5060 will match the 4060ti.
What do y'all think Nvidia is gonna price their flagship card at? $2.5k? $3.5k??? lmao
$1.8k for the Founders Edition, $2k and up to $3k for custom 5090s.
@@YueZhuang-pt6ff So similar to 4090 at launch price.
They could make it $7,000 and I'm still buying it
It will only cost one. One Bitcoin.
@@djofulll Buy a fkn server rack at that point, dude. Saying stuff like that is the reason the rest of us have to suffer. Besides, you're playing CS; you can do that with a 4070.
The price is also gonna be MASSIVE!! 😆
And you know What Else is massive?
@razzor7861 LOOOOOOOOW TAPER FADE
And?
If you keep telling consumers not to buy Nvidia, to tell them these prices are not acceptable, then you need to walk the walk and not hype Nvidia...
more like boycott ...
It's career suicide to ignore NVIDIA this January.
It’s also $2500 plus. Total insanity. I’m done with Nvidia.
You were slightly mistaken at 1:04 stating 148MB but the screenshot lists 128MB of L3 Cache. Just in case someone was only listening and not watching.
You just saved my life bro, thanks
what if that someone is only listening, not watching, not reading ?🤣
Yea i dont think anyone cares lol
6:13 Of course, this comparison totally ignores the 4070 Ti Super and 4070 Super cards we are buying now when making those claimed improvements.
I've never been an AMD GPU fan; over the years I witnessed a lot of bugs, bad drivers, and huge frame drops on friends' machines.
I was always glad I had my Nvidia GPUs. I love RTX and DLSS.
But nowadays? The pricing is hugely odd, the VRAM is lacking at WQHD, and performance-wise you only get good numbers with DLSS and frame gen in some cases.
Looking forward to the new RX 9xxx; maybe it's the new meta after this point.
Let's hope Intel comes in and gives Nvidia some much needed competition to hopefully drive down prices again.
Not going to happen. Intel is new in the graphics card market, and Nvidia is never going to lower the prices of their GPUs. The 5090, for example: some will cost almost $3k.
I wouldn't be surprised if, by the D or E series cards, Intel is matching xx70-80-tier Nvidia cards; they have some incredible engineers working on the project.
You have to pray to the KOPITE7KIMI. He can fix this! KOPITE7KIMI is a hero!
They can barely compete with AMD in CPUs, what makes you think they can beat Nvidia if not even AMD can beat Nvidia?
The only thing they'll do is give tons of vram for now, lol.
U wish 🤣
I just put together a new PC this June. I have an AMD Ryzen 9 7950X3D 16-core CPU and an MSI GeForce RTX 4080 Super 16GB Slim. There is no reason for me to get a 5090. Always remember the golden rule of computers I made up two decades ago: "A computer is only as good for what you want to use it for."
The 9070 XT has basically the same specs as a 6800 XT; it will rival a 5060 Ti at $399-499 most likely.
The 7900 XT has 5376 shaders. So unless there's a performance boost from something else, it'll be slower than the 7900 XT.
25,300 Time Spy graphics in OC mode, which is less than a 7900 XT. 23,000 in normal mode, which is 7900 GRE / overclocked 7800 XT territory. A really weak high-end AMD card.
I would assume at least SOME level of per-clock performance increase; there are multiple metrics to improve performance with. So if I'm being realistic, I'd guess somewhere between the 7900 XT and XTX for ~$550ish.
@@tomaspavka2014 Time Spy doesn't really mean that much; games scale very differently.
With that said, I do think the 7900 xt will be faster.
If it comes in at a cheap price, then it's a winner.
@@varcaic9170 Based on these leaked benchmarks, it's not looking to be anywhere near that.
We'll have to wait and see! Gamers Nexus and Hardware Unboxed will surely let us know :)
It's like CPUs, you know: having the same core count doesn't mean the same performance, since each individual core can be faster.
Dont buy the new cards guys! Let them take a hit so we can all get a better market
Oh, they will buy; tons of people on 2060/2070-class cards are waiting to upgrade, so they won't wait for the 6xxx series.
@@jordaniansniper934 I'm in the 2070 super boat unfortunately.
@@carolebaskin138 Well, my RX 5600 XT broke last year and I had to pick something fast. I had budget for a 4060 or 3060 Ti, so I took the 4060. It was decent, but in the newest games it's unbearable because of the VRAM, even at 1080p. Now I'm selling it, and I've picked a 4070 Ti Super. Hopefully it will be enough for at least 4 years at 1440p. The 2070 Super is pretty much the same thing as a 4060, so I assume you want to upgrade as well.
@jordaniansniper934 yeah I do
XD XD XDXDXDXDXDXDXD The 5070 is gonna be like 15% more expensive for a 4% gain. Coming from the company that wants us to believe Moore's law is dead, while other manufacturers don't seem to have reached the same conclusion. At this point Nvidia isn't just betting on their fans being idiots; they've already accepted it to be the case.
All you kids can't wait to spend $2k so you get 400 fps on your 144Hz 1440p monitors.. 😂
VR gaming is much more demanding, especially with high res VR headsets. 5090 can easily be tapped out with this.
Nah I'm just trying to get a solid 120 fps with max settings in the newest games on my 4k 120hz monitor without having to use upscaling or frame generation. Do you think the 5090 will be capable of that? I bet I would still have to use DLSS.
360hz QD-OLED*
A 5070 for 1440p 120fps path tracing works for me.
Current triple A games are so unoptimised on PC these days. Even the best hardware can struggle.
It is gonna be massive, but my budget might be low, so I hope the price comes down.
If the 9070 turns out to be that close to Nvidia's counterpart, I'll be even more pissed that they don't release the high-end cards as well...
It's funny to see some commenters who own a 4080 want to upgrade to 5080
It's like Nvidia is relying on RayTracing and DLSS to sell 5070 and below. They're not putting much effort into improving those cards compared to previous generations. Memory clocks go up with the addition of GDDR7, but they're not increasing the amount of memory or the cores (meaningfully). And all this with a 25% increased power draw? For what? A little bit of base clock?
So how high is AMD gonna go? Will there be an RX 9080 XT? Or not?
I built my first PC in 2018 with a Ryzen 3700X and an RTX 2070, switched to a 5800X3D with a 7900 XT in 2023, and I am honestly happy to see Intel's relative success with the B580 covering entry-level systems. I will probably wait this generation out to strike a deal, since my system is doing just fine at 3440×1440.
You think that’ll make Nvidia lower their prices a bit? Really tryna get a 5080 with my new 9800x3d
People need to stop buying nvidia cards so they can fix their pricing.
They have full market control.
So they are overpricing everything.
Expect the RTX 5080 Ti to get 24 GB of VRAM, as that would make sense, but the price might be higher than the RTX 4090's, with better performance because of GDDR7.
There is still the thing with the core count: having faster memory won't really matter for games unless the core count of the 80 Ti goes way higher than the 80's.
260W TBP? It's basically the same as the 7900 GRE...
Come on, AMD, bring down this damn power consumption.
744 mm² means fewer dies per wafer. Accounting for other material costs, including the new GDDR7, someone broke it down in a video and estimated a little over $1,900 if the 4090 is the baseline for how much that card costs to make. This card is easily going to be over $2k. Sadly, I hate it, and I hate even more that the 5080 is basically a 70-class card because it has half the cores of the 90.
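The "fewer dies per wafer" part of that argument can be sketched with the textbook dies-per-wafer approximation. The 744 mm² figure is the rumor from the comment, and 609 mm² is the 4090's AD102 for comparison; wafer price, scribe lines, and yield are deliberately left out, so this is only a rough illustration:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: wafer area over die area, minus an edge-loss
    term for partial dies around the wafer's circumference."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius**2 / die_area_mm2                       # area ratio
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

print(dies_per_wafer(744))  # rumored next-gen die -> 70 candidates per wafer
print(dies_per_wafer(609))  # 4090's AD102        -> 89 candidates per wafer
```

A ~22% larger die yields ~21% fewer candidate dies per wafer before yield losses, which is the core of the per-unit cost argument.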
Anything below the 4090/5090 is basically one tier lower than what they claim: the 4070 is actually a 4060, the 4060 is a 4050 renamed, etc.
@@cin2110 Did I miss the part where they told people that those cards are supposed to have hardware that makes you feel good?
The 5090 probably will cost that much but the 5080 simply cannot sell past $1000.
It's a boosted 5070 Ti... and the 5070 Ti will probably be trading blows with a 4080 Super.
Unless they have the gall to try and charge more for the 5070 Ti, which is comparable to a card you can buy for $900 on the used market... there's NO WAY a $1200 5080 would sell. Even the densest Nvidia fanboy can see the move is to get either the 5070 Ti or the 5090.
No, it's an 80-series card. The 90 series were dual-GPU cards, so it makes sense to use this nomenclature for a card that roughly has twice of everything.
The difference is that unlike in the past, companies can actually produce massive dies somewhat economically so there's no need for using 2 GPUs and any of the associated downsides like extra latency from the interconnect or any of the issues of Crossfire/SLI.
I own a 3090, bought it at launch. I refused to buy the 4090 at launch, and that was a mistake on my part given the performance for $100 more. I looked at the prices on Newegg today and the cheapest was $2,800.
Why was it a mistake? What games are you failing to get the performance you need from the 3090? $100 more than the price of a 3090 is obscene.
So 7900xtx level of performance with better RT performance for $600? Not a bad card IMO
The more technology advances, instead of shrinking things down and giving us a decent compact video card, we get another giant bathtub of a card, like a Dodge Challenger instead of a Ferrari.
What a sad disappointment
The one takeaway I noticed here is that if those numbers are accurate, it puts Intel's B580 in a very competitive spot against the 5070 right off the bat.
That additional 50W of power that the 5070 is going to suck down could power an M4 Mac running at full load, CPU, RAM, SSD *and* GPU with 10W still left over to power external devices.
Mac and nvidia are for idiots
And what are you gonna do with it? I doubt the same.
@@rev3489 You're right, I'll actually get some work done.
But feel free to pay $2,000 for a sand brick that eats 600W for a few extra FPS. Best wallet-opening device ever!
Well, even the M4 Max is stuck at 40-50 FPS at medium settings in 1080p gaming for the few games that run on it. GPU performance is a whole different thing from CPU performance, and to this day no one has created a low-power competitive GPU, not even Apple, so your comment is invalid.
@@Max-9871 Well thanks for making that up on the spot, Max-9871, but I think we all prefer actual facts.
The M4's GPU cores are on the same die as the CPU, along with the neural engine cores. All 3 share the same RAM.
So how did Apple manage this in under 40 watts while Intel and nVidia are pushing systems closer to 1,000W?
Why are the 50 series cards not compared to Nvidia's latest Super versions, but to the release versions?
7:47 Not to foreshadow the 4060 getting fewer cores than the 3060.
V-cache on 9950X3D would guarantee no bottleneck due to the IO die. That would actually have been quite helpful!
Did you mean to say 128MB L3 Cache?
The X3D CCD's total L3 is 96 MB: 64 MB of stacked V-Cache on top of the regular 32 MB per CCD.
L3 same as RDNA3
These insane component prices and limited stock remind me of the pandemic days... It's unfortunate that having a "high-end" PC has now grown out of reach for most.
I find it interesting that so many consumer tech channels are hyping the 5090 when that overpriced card isn't even marketed for or priced to be purchased by enthusiasts.
I think the MSRP of the RTX 5070 will be priced at almost 700 USD. If this happens then I think the RTX 4070 SUPER at less than 600 USD is a banger.
If the 9070XT is more than $450, it's DOA...plain and simple.
AMD wants to 'be aggressive' with their pricing, to lure away NVIDIA customers, well...here's their chance.
DON'T BLOW IT, AMD!!
I'd say the 5070 vs 4070 hardware comparison is a bit misleading.
The 4070 was already an alright card, despite the price.
The undisclosed clock speed increase, paired with the much faster memory for texture and physics calculations, should have a somewhat noticeable effect on FPS.
I would like you to ask AMD why their production schedule for the 9800X3D is so out of kilter with demand. Have they really dropped the ball here?
I have a 7900 XTX on order for $800, and I'm not seeing any reason from AMD or Nvidia not to just stick with that. Considering I'm not a super gamer and don't care as much about ray tracing, I'm thinking the 24 GB of VRAM is the better future-proofing.
Really makes the hanging question that much heavier: how much will they cost?
The 5090 will be ~15-25% faster than the 4090, because on the same process node the IPC gains will be small. So they push clock speed and power draw to increase performance. That's why they didn't launch a 4090 Ti: it would make the 5090 look like a small upgrade.
I bought an extremely large PC case for this very reason. So far it’s housed my 3090, 4090, and soon to be 5090. 👍🏻
If the 5090 is bigger than the 4090, I need a bigger case.
As an AMD fan, I find the 9070's performance kinda disappointing. I already own an RX 7900 XTX, and it looks like I'll have to stick with it for quite a long time.
So, people might as well forego the newest card and just go for the 7900XTX.
Disappointing.
7900xtx is trash for Ray Tracing and that terrible FSR
Nah fam, that shit will make my room so fucking hot. Plus i do want a lil RT yea
@@ilbro7874 what is this thing with the 7900xtx being a room heater? I have one and it doesn’t run hot at all, even in the most demanding games it doesn’t go above 70°c
@@TTx04xCOBRA **for those of us who don't care about RT.
Nvidia be like: "We don' trade our cards in cash no mo, we trade em in Organs"
I hope the 9070 XT has a driver option to set the base clock so it runs at 260W. I'd rather choose lower wattage than higher performance.
do you know what else is massive other than the 5090?
Low taper fade!!
The numbers when he steps on a weighing scale?
@@Maboidatboi thats crazy😂
Regardless of the RTX 5090 specs, I'll still buy one. It would be a nice 2025 Christmas gift.
Wonder what the RX 9090 XT would look like?
Why do you think only one chiplet has access? Both chiplets could have access to ONE 3D V-Cache; it doesn't need to be double the size.
Will there be 9600X3D??
So much conflicting information. The last leak said the 9070 XT was close to the 7900 GRE in performance, though to be fair that was Time Spy, and games always perform differently; a gap this huge would be surprising. We'll see. This card could make at least a little sense at $650, though I'm still hoping for closer to $550, and if this is true it could mean there's a $500 card close to the 7900 GRE and 4070 Ti in performance.
In my opinion, the 5070 Ti should be compared to the 4070 Ti Super, and when doing that it's an even smaller increase in specs, almost negligible, but it will cost 20% more by the looks of it.
the 7900gre abuses the 7800xt
So either Nvidia charges $900 for the 5070 Ti (where it won't sell well against just getting a used 4080/S),
or they charge $1200 for the 5080, which would then be DOA because it's just a boosted 5070 Ti. Save $300 and sacrifice 15%, which you can cut in half with an OC.
Nvidia needs to raise the price even higher.
I would have loved to see the 9950X3D have the 3D V-Cache on both dies. Based on videos I've seen, it's a pain to jump through the hoops to get the CPU to park the proper cores. I'd pay extra to just have the whole chip stacked up.
Your first 3 seconds are just banging, how do you do that intro sequence?!
Ask AMD if they plan on releasing an APU that can run VR well enough to not need a GPU, for PCs, not just consoles and laptops.
Have there been any pricing leaks on the 50 series cards?
Nvidia charges what they want because nothing else is even close in performance. The 9070 XT will have to be less than $450 to even be viable, even less if they want some market share; it barely competes with the 4070 Ti.
I think it might be time to retire my 1080ti(SLI) setup...bring on the 5090.
The 9070 is gonna suck arse. AMD cares less about their discrete cards now than they ever have. They've still got stock of their last generation to sell.
I think the 50 series cards should be compared to the 40 series Super cards. The 4070 Super has 7168 cores and the 5070 has 6144 cores; what a leap, minus 1024 cores. The 5070 is a joke.
I was really hoping AMD's flagship would beat a 4080. It seems both AMD and Nvidia are holding back somewhat in the mid-range market. They have GDDR7, which is way faster than GDDR6. A 1440p gamer has to sell a kidney to play at good frame rates.
I don't want to pay A$2799 for a 5080 after paying A$999 for a 3080. I'll be getting a 9070XT, or a 7900XT or 4070 Ti Super.
When they don't do the job I guess I'll never play PC games again.
The RX 9070 XT performance leaks are one big pile of bullshit.
It won't be close to the RTX 4080.
Idk why people still believe it.
I usually upgrade every two generations, but this time it would be a downgrade from my 6950 XT. Maybe it's about time to switch to team green again?
Can’t wait for the 9070XT to be an idiotic $599 and be DOA
Oh, 9070XT will be DOA if it's more than $450.
@@muaddib7356 and 90% sure it will be more than $450
People need to relax; the 5090 won't show you any magic in-game. Your 30/40 series card will do just what the 50 series will do.
We've had the rough specs of RDNA4 and Blackwell for most of the year. People are scrounging for every scrap of info in the leadup to CES in just 11 days.
you know what else is massive? Low taper fade
It’s gonna be the 5070ti for me, will be a nice upgrade from my 3070
The 3070 is absolute garbage now. 5070 Ti or a 7900 XTX, maybe a 9070 XT.
I wanna see bench marks before i buy tho
@@TheShaunydog Funny, I'm in the exact same boat and I'm considering the exact same set of options. I want to lean AMD, but they just make themselves so damn difficult to support.
im also looking at going a 5070ti from a 3070ti or a 5080 depends on the price
@@TheShaunydog Tbh I love my Suprim X; it overclocks well and is just so cool and quiet, and it still runs so many of my games well. But the next gen of games (a bit unoptimised, maybe) is coming and it just can't keep up unless I lower settings, and I want to get into ray tracing and up the visuals of my single-player experience at 1440p, so it's time.
@@gavinferguson2938 I've got an RX 6800, but I need upscaling in a bunch of games and FSR looks kinda bad. If AMD could just get an upscaler on DLSS's level, or preferably offer more raw performance, I wouldn't mind buying them, but I'm planning on keeping my card for years and know upscaling will inevitably be needed. Nvidia just has better quality and features for now.
If I want to save money and buy AMD, will the AMD card still be as powerful as a 5090?
A 9070 XT at $500 would be crazy.
Everyone is stuck on this stupid minutiae of naming scheme. It's not an nVidia thing.
When people talk about these cards in terms of CLASS we're not saying "700/600" it's always "70/60" GPU.
Why keep the confusion when people clearly prefer to talk about these in such a way even when nVidia hardware is completely out of the conversation?
That is to say, people talk like this when comparing AMD GPUs. It's pointless mental gymnastics and annoying.
BIIIIIIG IF: IF the 9070 XT is close to the 4080, day 1 purchase. If it's below the 7900 XT, I might explore Nvidia. 4080 performance would be incredible if it's priced well.
All the other sources I've seen point to sub-7900 XT performance.
There are raster leaks for it. It's a 7900 GRE with better ray tracing. That's why they pulled the 7900 GRE from production.
@@mrg2039 I know, and this channel peddles hype, but the dream is some secret switcharoo where its actually competitive with a 4080.
@@trentongardner2106 Yeah, I've seen those, but here we are with another video claiming near-4080 performance. I expect a GRE+, but 4080-ish performance would be a day-1 buy if it's in the $600 range.
@balthorpayne I can tell you why that is. They're getting paid to say that.
Nvidia is cheaping out on RAM; I'm not touching their products again.
Why exactly do you need so much power for such insane prices?
Day by day hype for 50 series is dying.
I want my 5090 now!!!! 😊
Does the RTX 5090 have electrical problems, just like the 4090???
9800x3d + 5090 gonna be amazing.
If the leaks are true, a 5090 is going to be around $5k here in Australia, considering 4090s are selling for $4k.