I think we've seen enough banking-gaming apps that fall through and substantially hurt users to know that this is not a legitimate app. And while I understand the necessity for you to take these sponsorships, I hope that few people actually use this app just so they aren't severely negatively impacted through something Benjamin does
Hmmm............not sure how it makes sense that NVIDIA literally stopped production of RTX4090 and RTX4080 good 3-4months ago and pushed in full TSMC fab allocations of respective chips to produce RTX5000 series GPUs respectively. And GDDR7 been in full swing production for months now. So this low shortage on cards mean they were making less than or around 500 cards per day if not lower than that. How is that possible. And if that is not the case and there been thousands produced where did all the chips end up going.
From what Hardware Unboxed mentioned after their press meeting with AMD last week, they admitted that they messed up pricing with the 7000 series and a lot of it was due to the high costs associated with those chips. They have optimized Navi 48 to be far more cost efficient. Plus the reduced TDP will mean a big savings when it comes to board design and cooler design. That should help keep prices lower and undercut Blackwell.
People will always find a way to hate on them. I honestly think it's cope from sunken cost fallacy. People want to justify why they spent double/triple on GPUs over the past 10 years. "raYtRaCiNG pErForMaNce and AMD bad" They've done amazing things with CPUs and people still don't really give them credit there either. They just pretend they don't exist after they buy one.
@ It's not glaze. Why do they need good marketing? How does Nvidia have good marketing? How does intel have good marketing? They, and many more staple tech companies don't market at all because they don't have to. People aren't magically swayed to buy a gpu. You buy one when you know you want one and you know what you want already before you buy it.
This is becoming a bit of a nail biter. Both the 3950X and 5800X3D were launched without any prior hype, and both hit Intel like a ton of bricks. When AMD hypes they're about to do a belly flop. When they stay quiet they are dangerous.... and they're hella quiet right now.
People who make statements like this come across like they want AMD to fail. if you have no faith go throw your money at Nvidia, a future with no competition looks bleak.
I mean 479 reference and 489-550$ usd for aib models I been hearing from very reliable sources would be amazing for performance these look like they'll give
I want AMD to have good pricing at launch AND A BOATLOAD OF CARDS.... so people who want to buy them can buy them... since nvidia cards will be nowhere to be seen for months.
Given their history it doesn't matter if they price it correctly.... they will lower it unofficially after disappointing reviews, making it easy for you to get one. I got my XTX for $900 5 weeks after launch... The only reason to hope AMD makes it a crowd pleaser from the get-go is if you take personal pleasure in seeing Nvidia lose market share. I happen to be one of those, but that's another matter.
Given you comment it could be 30% faster and 30% cheaper like last Gen and you still bought ngreedia and it's why next Gen you will get ngreedia 8gb cards for 650$
Watch the Hardware Unboxed video they made after meeting with AMD. They cover all of the pricing. Short story is AMD admitted the pricing was bad and was primarily due to how expensive that gen was to produce, and that Navi 48 is much cheaper so cards will be much more aggressively priced to gain marketshare.
@@hochhaul yes, but the worry is their profit margin. Since thr 5070 and 5070ti suck, they could theoretically price it the same, since it's better and has more vram, but they don't take into account the amd tax. Nobody will pay the same price for an amd Gpu even if it's faster.
@@aeromotive2Because their PC/GPU business is much much smaller in revenue and profit than their AI/datacenter business. They are likely diverting resources there which makes sense.
Correction, I worked at intel for 5 years. We had the internal employee store where every cpu was approx half off MSRP! though it would usually take many months after a product launch for the stock to be available on our internal employee store. For reference I bought my 13700K for about $200 when 13th Gen was within 6 months of launch, iirc the 13600k was about $170. Internal employee store was great for cpu's
@@rodrigoferreiramaciel4815I feel like the person above just extrapolates their personal experience into a broader, more general statement about employee stores not getting the product AT launch necessarily, so the video’s overarching statement about **overall** 5090 FE supply based on employee store leak might not be entirely accurate.
I'll upgrade to 9070 XT from my 6800 XT for $500 or less. If it's any more, it's not worth it; I'd instead look at used XTXs in a few months. But price this aggressively, and the market will accept it with open arms. Come on, AMD. Provide a good GPU launch at a killer price.
@@pointlessless2656 You're telling me, my 6800 XT has been at 1040mV for 2 years (reference btw). But I'm only biting for 9070 XT if it's priced right. Please AMD...
@@cjeffcoatjr $479 would be great, though I'd probably get a aib card for $530+ just for the better power limiter, the 5070 being 12gb is already a deal breaker no matter what dlss 4 offers
I think this is unrealistic, they should give you the GPU for free, why would they want money? Right? 9070xt 550$ is a great price, and 600$ is a good price, anything more is just AMD not wanting money. 9070 should be 420$-480$. Anything more is crazy.
i really hope they price the 9070 xt close to the 7800 xt msrp, since it is not a much bigger card. that would give them an insane gen on gen price to performance increase, but if they instead decide to increase their margins for it, i doubt it would gain much mindshare
@@arkyr1570 To be fair you can buy the 7900xtx on newegg for $869 brand new, 7900xt for $659. I doubt they'll be at $500 dollars but I could see 690 to 750 for sure. Especially if it ends up having less VRAM than either of those cards.
Do people think that? I mean look the 4090 sold very well for a bunch of reasons I don't want to regurgitate...but the 4080 didn't. That's why the 5080 is $200 cheaper. Now, that's not necessarily "enough" of a discount, but it does show gamers DO care....and maybe we wait and see how this gen goes before we assume everyone's still ok with the status quo...
@@MooresLawIsDead I think the 5080 was priced $1000 because its no more than %5-%10 faster than the current $1K 4080S . nvidia just admitted that the 5080 is only %15 faster than the non-S 4080 in raster. if your numbers end up the same as reviews then basically the 5080 is not going to be "meaningfully" faster than the 9070xt.
@@hunternewberry5860 Agree graphics cards are cheap compared to other hobbies, i pay 1500€ every 25000km to replace the breaks from my Stinger GT. Despite that i only buy a 9070XT or a 5070ti because i dont like the pattern and dont see the valure of overpriced high end cards being locked out on next generations big feature.
Same but from my 3080 ( pretty similar performance to your card ) should be around a 50% uplift which is awesome Imo. I hope they don't do something stupid and over price these. $499 and its a no brainer
@@PCgamingbenchmarktime Yeah it makes more sense for you too for the VRAM, shame about the 3070 and 3080, they're still very capable cards, but they're hamstrung. Also I don't know if you've had an AMD card before, but the move from Nvidia to AMD for gaming at least is a non issue, the experience is largely the same, don't let the haters put you or anyone else reading this off man.
@StarkR3ality i had a 6800xt for a week. I had no issues with software, I had a large issue with coil whine though so I had to return it....but it was when stock was impossible to get and the prices went up so I couldn't get a replacement. Lol I just sold my 3080 and just been using my 9800x3d cpu to much around with ( it can actually manage 80-120fps in halo 4 at 1080p low settings 🤣) and i realised that vrr works much with amd...I play with a TV screen. 144hz that has freesync premium pro. I had issues with my 3080 in a lot of games where it looked jittery even though vrr was working, some games it good but I'd say it was more bad then good. I tested a couple of games out that I knew were problems and they worked perfectly with vrr. So Amd is the way to go for people playing with a TV screen imo. So that's already a huge bonus, the only other thing that put me off Amd was Fsr3 since I play at 4k. But fsr4 looked fantastic from what I've seen so I'm ready to give Amd another shot 👏 they just need to price it properly lol. A lot are saying it will be $600, that makes no sense at all to me lol I can't imagine them fumbling again after they lost marketshare with the 7000 series
I might go insane checking Twitter and reddit to see if AMD has formally announced 9000 series. Please AMD, put me out of my misery and let me pre-order already!
They are scrambling HARD to get as much info on the actual performance of the 5070 and 5070Ti as possible. Pricing is going to be everything this time around. Initially I was under the impression that the 9070XT would be slower than the 5070Ti, but with what we've seen so far... it might actually be faster. If AMD can match it in RT, beat it in raster, and knock $200 off the price, while at least showing off FSR4 in a few games at launch, that's going to be a massive win for gamers.
Retailers have been getting stock for 7 days. They typically get stock for 2 to 4 weeks before launch. You won't have long to wait! It may launch in as little as 7 days!
Nope, those 5090’s were pre ordered by enterprise companies. For months, system integrators have been allowing pre sale of workstation PC’s including up to four, FOUR, 5090’s. nothing paper about this launch, they could just care less about us peons unwilling to shell out $8000 at a time.
@@nipa5961if they had 9070XT priced at $500 with 7900XTX performance the one that will be in trouble will be AMD themselves. How they going to clear 7900XT and XTX stock? Nvidia care less if their 5070 and 5070Ti are not being bought by gamer. There are tons of semi pro waiting to buy those cards because to them they still saves thousands by not buying Quadro equivalent to those card.
The 5080 doesn't seem to be offering much that the 4080 and 4080S aren't already offering. People who can afford a 5080 have likely already gotten a 4080 Super, and I think some of them are also going to be very well aware that that's an awful lot to pay for a graphics card with 16GB of vram in 2025. What good is all that horsepower, and ray-tracing performance, if you don't have enough vram to properly use it?
Yeah the 80's class card are a joke right now, I have a 4080 but the vram on this thing is killing it. I really want that 5090 for the vram but if the supply leaks are true, I won't be able to get it on launch day. I use my GPU as a work card that's why I need tons of vram.
_"that's an awful lot to pay for a graphics card with 16GB of vram in 2025..."_ Why? There is absolutely *zero* chance of 16GB of VRAM being a gaming "issue" before a 5080 is e-waste. Poorly ported console games (which will be pretty much *all* of them) will invariably exist, but they are always patched _after_ the rushed, profit-grasping, launches.
Seriously considering 9070xt over 5080. I got a 1080 eight years ago and it has been a great card, but it's time to back AMD up. Nvidia is disappointing.
I'm on a 1060 and plan to get 5080. 20series was bad value with RTX early adopters tax, Scalpers ruined 30 series. Nvidia scalped 40series themselves and amd failed to take advantage. I'm not gonna wait any longer. 1060 was fine for cago but with cs2 it struggles and in ACC in vr as well.
whats the deal with low supply? I thought everyone was hoarding and that blackwell has been done for a long time and they were just waiting on 40 series to dry up.
2 possible reasons. 1. Intentional supply side leaks to expose leakers. 2. A supply bottleneck that didn't get resolved in the timeframe people thought it would be. Possibly memory. It's entirely possible there are gpu parts sitting on warehouse shelves waiting to be assembled once memory shows up.
@@Uncle-LemonXY GPU overclocking has been dead for 3 generations now. With GTX 1080 you could actually get some good results, but with 3080 or 6800xt... just no. Like 5% gains
@@berke__ usually the automatic boost clock is so close to overclock that it doesn't matter. Same was already true in 1080, but in same cases the FE cards where at 1700mhz and you could get them to 2100mhz.
Now that Nvidia announced prices that were way more aggressive than we anticipated they will bring back artificial scarcity to counter that. Exactly what I expected. Thats why the official MSRP is always worthless as it does not represent actual pricing.
Artificial "scarcity" only works if it somehow benefits the *manufacturer* . What possible benefit does nVidia derive from selling fewer cards (basically chipsets to AIBs, at at _fixed_ price) than the market demands? Answer: *ZERO* . Short-term scarcity might increase perceived desirability, but long-term it serves no purpose whatsoever. Companies exist to make *profits* and if they don't sell product, the don't make *profit* . Tough concept, I know...
@@awebuser5914you’re forgetting that Nvidia effectively operates as a monopoly. Who cares if their overpricing leads to AMD sales going up 30%? AMD is 10% of the market right now, a 30% increase for them is Nvidia losing 3% of their buyers. How concerned are they going to be about that if they squeeze the remaining 97% for 10-20% more money?
@@awebuser5914 ROFL. Do you realize that over 90% of NVIDIA profit comes from AI and data centers? Selling to gamers is mere few % of their total profits. They can literally create artificial scarcity to make people buy at inflated prices cause there is always someone who wants a better card. MSRP is just a propaganda placeholder price that almost never happens.
So basically Nvidia claims the 1000$ MSRP to make consumers believe they learned their lesson but then they constrain supply and let AIBs enfore the actual planned 1200$ price point. Let's hope people see through this. And AMD.....please, please, PLEASE don't f**** this up again by releasing your cards at a ridiculous MSRP. One more thing though: while I massively appreciate the leaks it's honestly a bit surprising to me how often you manage to recieve full internal documentation from AMD compared to Nvidia. They really should get a grip on that.
Stored supply is high, release supply is low to maximize profits, like stocking up on toilet paper when it's cheap and then selling them at high prices coz it's in high demand. then when tariffs kick in those stored supply prices aren't affected but you can still make the same profit when you increase the prices again due to the tariffs
If AMD's primary goal for RDNA 4 is to take market share in the mainstream, then these cards MUST have mainstream prices. We're talking no more than $399 for the 9070 and - at MOST - $549 for the 9070 XT. And even then, you could argue they should actually go lower...
Because NOBODY wants 4080 raster for under $600? And NOBODY wants 7900xt performance for under $500? You're an idiot .. of course, everyone wants this at reasonable prices - NOT YOUR CRAZY BS ...
The 9070xt sounds like a 5070 with significantly better raster, and 16GB. The non-xt looks pretty quick too. AMD should charge $500 or $529 for AIB models, and the non-xt $400 or $429 for AIB models if the plan is to gain market/mindshare. Those prices would make the 5070 and 5060ti pointless. The 5070ti would also be in a strange place, definitely not $250 'better' than a 9070xt. The 5060 will also need to be dirt cheap if the 9070 is anywhere near the 9070xt lol
The only exclusive thing that they bring to the older model from DLSS4 is the transformer model. What people need to aware of is the fact that using transformer model is more expensive thus potentially bringing transformer model to older GPUs will make those GPUs run slower vs the old CNN model. While yes, I believe in Nvidia driver you can pick to use what model, in the end most people will just use whatever the game uses. If the game only offer DLSS4 SR (which uses transformer model), then the uplift from using DLSS4 on older GPUs will not be as high vs when using DLSS in other game with similar quality. This will make older GPUs feel dated in terms of performance more quickly when in fact the older GPUs should still perform well if not being bottlenecked by the tensor core performance. Again, just be aware of that.
I've been wondering about that. There's been a general assumption that dlss improvements will be free from any additional performance costs. That seems highly unlikely. The value proposition of 'improvements' for pre-existing nvidia hardware may be less attractive than people are assuming.
@@MaxIronsThirdI don't think they're specified on what gen though. I'd be concerned about the overhead on Turing since those Tensor Cores are quite a bit more limited
@@MaxIronsThird Wait until the actual review. 5% in what game? What DLSS settings were used? Even if you look at the current games using DLSS3, not all games reacted the same. I can easily imagine a situation on a game with high enough FPS that this quoted 5% penalty turn into something much worse. Also there is a difference between the high tier GPU and lower tier GPU with less tensor performance to play with. If the test was done only using the highest tier GPU then it can mask the penalty since obviously highest tier GPU in their respective generation have more performance to spare. Remember that in terms of compute for upscaling, it doesn't matter if you tested it on a 3090 or on a 3060, the compute cost (in terms of operation per second) for doing it is the same assuming the resolution is the same, but obviously since lower tier GPU has less tensor, to run that same upscaling it will require more time, thus costing more milliseconds. I believe Nvidia have shown this before, about the cost of doing upscaling, which is different on lower end GPU vs higher end GPU. Edit: basically what I'm saying is that the 5% penalty (which I actually read that on reddit before posting) is pretty much useless without any context. I need to know the GPU being used, the game, resolution, and upscaling quality settings. Also obviously instead of 5% it would be better to present it as ms. How is the fps difference vs DLSS3 and convert that to ms. If it was 100fps using DLSS3 and now 95fps using DLSS4 (which is 5%), then the extra cost is 0.5ms. But if it was 30fps using DLSS3 and 28.5fps using DLSS4, that is 1.8ms more. That extra 1.8ms more if being applied to a 100fps game will turn into 84.7fps. Nvidia had a technical briefings and they said this transformer model use 4x more compute.
Guys, the 50 series is the greatest Nvidia series ever. Every card, from the 5050 to 5080 will have the same performance as the 4090! On a more serious note. Feels like it may be worth upgrading to 9070xt this year. I hope AMD will have the supply and get good press this time around.
People hate on AMD simply out of sheepishly following the pack. I've had 2 AMD cards, the only problem I ever had was with a driver for a 18 year old game. You get more for your money, and the cards last longer. Name a card from 2017 that you can still game on that Nvidia sold for $250 or less. I still see people say they game just fine on their RX 580. It's lame cause Nvidia pulls the wool over the consumer's eyes and upcharges cause they know they have a grip on the market, AMD and Intel go out of their way to cater to the common man like *we all say we want* but they always end up getting clowned on for whatever reason. Seems like a circular argument to me.
If 9070xt msrp will be higher then rtx 5070 msrp - AMD will not gain market share. To gain market share, the 9070xt must be significantly faster and cheaper. Only this combination will change consumer behavior.
Exactly, 10% price cut from the 5070 while beating in raster by over 30%. That's the only way they're going to convince people. $499 is the magic price. Anything over that and its less tempting and people will start looking towards nvidia likes always. Surely they've learned this by now lol
@@PCgamingbenchmarktime $499 wont be enough to get most plebs from buying the 5070 at $550 regardless of performance or vram. $449 is the market slayer price if they actually want to gain market share.
@haukikannel no it can't. It's only matching it in RT. if its $599 you can bet most people will just spend the extra to get the 5070ti which will have much better RT and more gimmicks to wow people with. I'm sure Amd knows this and will price it correctly. Anything above $549 and its not going to do very well. $499 and its a must buy
Same, I'd like to see AMD's solution slaughter my card even if it's just in terms of price, The entire market has become a joke and never recovered from crypto mining days.
@@ocerco93...Which is why people are hopeful about the 9070 and 9070xt since they've made clear improvements towards FPS w/ RT enabled compared to previous iterations.
LOL, sold my 4090 for $1,900 but on eBay (no local bites), so I netted a bit over $1,600. Same boat though. I got a 3090 and 4090 on launch day, I might pull it off again... but.. I'm not as sure I WANT to. Ugh. Well, we'll see what the benchmarks are in a few days. Well, in my case I have a 3080 as a backup. Words better than expected. Worst case I end up purchasing a used 4090 locally (selling $1,200 - $1,700 range)
man... you really should have wait to get the 5090 first, in my memories getting a 4090 wasn't so hard at release day, but the 5090 could be way harder, considering how concerning the leaks about supplies are. and for the price $2000 is going to be the minimum, you probably gonna find 5090 for more than $2700 with AIB
@@andersjjensen I am the same but my current excitement now lean more towards 9070 xt which was not the case during CES. I hope those leaks are true so we can at least enjoy the midrange.
As a 7900xtx owner I’m cheering on that 9070XT and hoping it can out kick its coverage. Please please please make it cheap. Not just affordable but cheap. 🎉
@@nimaahmadi-s7o That's going to be the actual price though. nvidia prices are bullshit for a long time, it's not what you are going to be paying for not even close.
AMD has to go $400-500 range on these cards. They just don't have a choice at this point. If they loose more market share this gen. Lisa would be a fool, and shareholders would not allow her to be a fool. TO continue a dGPU line. Essentially that has already happened with the UDMA announcement. If the people in the GPU dept at AMD want to be able to sell cut down CDMA cards for consumers post RDNA4... they NEED to gain market share. If these benches are real and the GPU dept can talk Lisa into $399 and $499 (or even better if they can manage it) they have a chance to grab enough market share to continue the consumer line. They loose another 5-10% market share over '25 what is the point of supporting consumer dGPU anymore.
@@dgillies5420 I'm smoking reality. AMD has lost mind share almost completely it doesn't matter if their hardware is 10% faster then Nvidia at 80% the price. Its not ENOUGH. So being 10% slower at 80% even 70% the price is really not going to work. Its go big or go home time. The point of this gen... was to design an affordable product. They won't loose money. They can't afford to NOT gain market share. Not loosing share isn't good enough anymore. They NEED to turn Nvidia 2060, 2070, 3060, 1660 owners into 9070 owners. If they fail at that. Lisa is going to end dGPU. It isn't enough to get the dwindling number of current AMD users to upgrade. They need to start attracting Nvidia customers to AMD.
I think the rumors regarding $479 standard and up to $549 AIB cards sounds about right for this card so far, definitely doesn't need to be aggressier at least, at this kind of pricing I would definitely pick one up in a heartbeat and I haven't considered AMD for like ever (neither Nvidia since 2000 series why I'm on an Intel Arc atm lol but would go for AMD RDNA4 if pricing is right). I can say they'd definitely get market share whit that kind pricing, there's more to it than just repeating history here as FSR4 looks like a huge step up and RT performance has a huge leap as well. Also add to that, more and more customers are getting fed up with Nvidia pricing / "strategic supply issues" and fake frames marketing as they are the same as real ones etc. and are often more prepared to support the opponent this time around if enough value is presented.
@@mddunlap03 Yeah he shouts it from the rooftop every time he's right and then it's nothing but crickets when he's wrong. Observant viewers will know that lots of what he says amounts to little more than gossip.
FSR 4 on RDNA 3, at least on some level is amazing. I thought I was out of luck and had to wait for UDNA to get FSR 4 or 5 because I owned a 7900 XTX, now I'm really excited to see FSR 4. I don't really care if it's only like 50% of FSR 4, if there's improvements, that's a w for me.
@@FrostyBud777 dude is like the guy stupidly waiting at the coach station, unaware that the coach has already pulled in 15mins ago and waiting for him to board.
When AMD announce the card I'd like to see a Steve Race style presentation where whoever the Scott Herkelman replacement is at Radeon Technologies Group just picks up the reference 9070XT card from a plinth , holds it up in front of the camera and says "499" , then he puts the card back down . In the background is a polished slide with these benchmark results . He repeats the act for the 9070 , saying "399" this time around, and then walks off the stage with "available 3rd February 2025" as the final slide behind. Should last 1 minute tops. Everyone knows everything about the card so no need for info 🤣
Hearing that RDNA 4 performance is encouraging. It looked like after CES that Nvidia was going to completely curbstomp AMD into the ground, so that's encouraging I guess. Also LOVE that you're going to have Asianometry back on. He's probably my favorite guest. Guy is so knowledgeable.
Linus said that basically everyone at CES, all the aib's and SI's said they all pretty much sold out of 5090's, and it was mostly buyers for companies placing orders for new workstations to replace the workstations that they currently are running.
AMD knows how to overthrow a giant. They did it in the CPU sector by offering a way more performant CPU at a way cheaper price. As for GPUs however RDNA4 can't compete with Blackwell. So i am really curious about AMD's strategy for this generation.
B570 ASROCK and Sparkle has been up on Newegg at MSRP for sale for 2 days, stock's still holding. Edit - Amazon has them today now too. So not too bad a launch.
My local Microcenter still has B570 stock... But the site showed they only had one model with 5 cards in stock as of 40 minutes after opening on launch day, and as of now they still have only that one model but the quantity has dropped to 2. I've not seen the number go up between check-ins, suggesting no restock yet. Seems like inventory was low but demand was even lower.
@ It is, but the Rx 7600 used to be even cheaper.. Now it went up in price a lot over the last few days, probably because people realized the B570 won't be as cheap as reviewers told them lol
Is it a bad idea to go ahead and sell the 4070 TiSuper and get a 9070 XT? I really want to support AMD and it seems it will be similar or much faster depending on the game. Could get $700 for the GPU and guy the XT for 600 likely. I am very tempted. This is looked impressive AF going from a 384bit to a 256bit GPU both on gddr6.. wow.
Radeon has to nail the combination of price, value and supply to gain market share. Their best chance to do so in recent memory was RDNA 2. RX 6800xt and RX 6800 were fantastic but sadly AMD was not willing to supply enough until the end of the gen. So they ended up losing market share despite a fantastic product.
They would have been dumb to sacrifice Zen 3 supply (which they make a lot better margins on) for RDNA2 supply. And TSMC was fully booked because of the whole pandemic thing. Currently TSMC is not at full capacity on N5 and derivatives, so if AMD needs to book more wafers, they can.
@rareapuenjoyer which country are you talking about? Globally there was chip shortage and AMD prioritized CPUs as they have a higher margin. Only later in the gen they were sitting on shelves.
I sincerely think AMD is going to price it cheaper. AMD is probably seeing how people are reacting to the "fake frames" issue with nvidia having little to no generational uplift. AMD smells blood, and they are going for it. AMD aggressively priced their CPU chips against Intel and look at where it got them. I believe they would do the same with Nvidia.
Let's hope so. If they price it at $499 while smashing the 5070 in raster and matching in RT it will make the 5070 look like the joke it is lol. $599....no one will care, most will just get the 5070ti for the better ray tracing at that point. Hope they don't stuff this up, first time I've been excited for an AMD card lol
Deja Vu, I swear I heard this at the launch of RDNA 3. We all know how that turned out. Going by history, not by hope, AMD will price the RX 9070 XT $100-$150 more than we want - aka $600-$650.
Looks pretty amazing. Even the lower RX 9070 model (56CU) is beating the RX 7800XT by up to 30 - 40% in some of those benchmarks. Wow - I was expecting the cut down RX 9070 model to about equal the RX 7800XT.😄
Well if intel is selling the B580 at cost at the employee store, and they are losing money on each card, wouldn't it cost more than it cost at retail? This looks like it's going to be a good generation for video cards.
That sentence makes no sense, if Intel was eating the cost to get rid of a terrible product instead of just having it sit around it would be more expensive? What?
@@DebasedAnon He's saying Intel's employee store would have to raise the price to sell "at cost" since they are selling their retail copies at less than cost. Like if the card costs 300$ to make and they sell it retail for 250 they lose 50 dollars. So the employee store has to sell it for the higher price to NOT lose money. What is likely happening is that manufacturing costs are less than 250 but the R+D, shipping, marketing, overhead, retailer cut etc make the overall costs a net loss per purchase.
Look at the BOM on the 4070 vs MSRP ($550). Then look at the BOM on the B580. EXACTLY THE SAME ~EVERYTHING (trivial difference in GPU die size). WHAT A HORRIBLE DESIGN (33% slower than the 4070). Intel ARC is from the 100% nonprofit division of Intel!
That 220w 9070 gives me a lot of hope for a solid cut down 12gb N48 card focused on efficiency with ~48CUs, I'm really hoping to see a resurgence in good sub 200w GPUs as I need a replacement for the poor RX 6600 in my TV rig deskmeet X300, it has trouble even performance upscaling to 4k sometimes and alleviating this would make for amazing, tiny systems.
@kushy_TV 4060 ti is an awful option since the 16gb is the only viable one, but scales awfully due to that shitty memory bus and costs $450, there's 0 chance that a card coming out within the next couple months won't beat that in multiple ways.
i think that's the plan, 9060XT based on N48 with cut down 192bit bus/12GB of VRAM, then the 9060(and lower models on laptop) are based on N44, which means a 128bit bus/8GB of VRAM.
Realistically AMD has little need to price the XT above $499. It's probably around the size of the 7800 XT. But with this kind of performance they have the legitimacy to price it right next to the 5070 Ti (even though gamers would snub it still).
Those prices aren't great...lol $599 is a little too close to the 5070ti. It would be DOA. $499 is what makes the most sense . Same price as the 7800xt. They're around the same size too , it would be foolish of them to price it $100 higher lol
@@PCgamingbenchmarktime Plus it's 70s class card (right? This is why they changed their name scheme), so why 70 class card should cost 100$ more than previous 80s class tier?
honestly doesn’t feel like a large enough price gap between the two cards it would just kill the 9070 at launch the 9070 needs to be atleast 100 dollars cheaper or they might as well only launch the xt
These are some nice performance charts for the new AMD cards. Now all they have to do it price these cards competitively and they will sell like crazy! The power reduction is a very welcomed bonus in my book.
If the 9070XT is less than 10% slower than the XTX on average in raster, and gets close to a 4070Ti Super in RT I will sell my XTX and get a 9070XT if it's $550 or less.
I think it will be closer to 5-7% difference. Remember those drivers are over a month old. A few of those games seemed quite oddly low compared to the others. AMD themselves said things are better with their latest drivers
@entreri76x its in line with the other benchmarks and Amd themselves have said its going to perform better than we expect. Seems pretty likely . Probably has improved since these benchmarks too since the driver's are over 1 month old. Some of the games didn't perform like they should, assuming there's more work been done since then
If a 64CU 9070 XT is almost as fast as a 96CU 7900XTX, AMD could have easily released a 96CU 9080 XT to tackle 5080 or even a 5080Ti. But wait.... Its AMD... They are not happy without a bullet in thier foot... shot by themselves.
They could, but AMD chose to prioritize next-gen Multi-chiplet GPUs since Nvidia also wants to bring theirs faster to the market. This gen should be short-lived, -1.5 years.
Agreed. Given everything I’ve heard and seen, it would have been trivial for them to just make a 96CU, 24GB 9080XT for $800-1000. They can still credibly say they’re not worrying about the “high-end” when their top card is less than half the cost of Nvidia’s.
If those benchmarks are accurate, then AMD isn't pricing the 9070 XT any lower than $599. In fact, $649 to $679 is more likely since it'll be a competent 5070 Ti competitor. Anyone hoping AMD would launch the 9070 XT in the $449 to $499 range are on a lot of hopium.
Ridiculous 😂😂 it would be DOA. Max price would be $549, no chance they're dumb enough to botch another launch with such a sure thing. The Card probably cost the same to produce as the 7800xt. They have no need to price above $499. Who the hell would buy it at $649😂😂
while i can see your point, the BOM cost of N48 is the same or lower than N32 (7800xt) so at 550 they are making more money than they have on any product since the 6950xt. and they will take market share at that price.
@AltRockKing exactly, the bom costs would be very similar to the 7800xt. Even at $499 they're like making over 60% margins on these cards, they can afford to price aggressively and go after marketshare. I have a feeling they'll be stupid and price at $549 which won't be enough to sway people. $499 is 10% less than the 5070 while performing like a 5070ti in raster and probably still slightly beating the 5070 in RT. That would dominate the midrange, i really hope they play if more aggressively and really stick it to nvidia lol
It doesn't take that much imagination to predict that a company that has in the past severely limited supply in order to keep demand, and therefore prices high will do so again. 🙄
When he told "even if amd chooses to 599$ for 9070xt and 499$ for 9070" thats big mistake from Tom, there can't be even hope for Amd to get those prices, it's has to be 449$(max 479$) 9070XT and 399$(max 429$) for 9070. We need Amd to stay in market and keep greedy nvidia down.
As long as it's under $499 it will sell well. It's 30% faster in raster at least compared to the 5070, but it will probably only be on par or slightly ahead in RT. $499 makes a lot of sense, I think they'll probably match the price of the 5070 though to be honest which would be a bit less exciting. $499 is midrange, anything over that I'd consider upper midrange prices
@@letsplay1097 if i would smoke same thing as you, next generation we would need to pay 850$. 9070XT 450-479$ for reference, ~500$ AIB. nvidia already made you think that price increase is ok. Amd is in bad pasition, they need to gain market share, get in to our minds, there is no other way to go back (unless they have some monolithic 9080XT which would beat 5090, but not this genaration)
@@devilheadas don’t get me wrong i would love to see it being priced below 500 but the thing is the market has changed forever and yeah amd targets market share but that wouldn’t mean pricing the 9070xt at 300$ less than the competitor it just doesn’t make sense profits would be zero , how else would they finance they’re r&d
@@letsplay1097 Yeah but AMD doesn't have the feature set Nvidia have, not to mention the mindshare so they need to price it way lower so that it can grasp market share. 499$ should be max for RX 9070XT
The benchmarks for the 9070 look awesome, like so close to the 7900xt, even $500 would be a good price, $450 would be a crazy good deal, like it would be madness to buy anything but that GPU, I've a 6800 non xt, and really not thinking about an upgrade as it is still too good for the games i play, but if it comes near that $450.... It's a no brainier it will be the deal of the decade
Seems like the 50 series will be a performance whimper as opposed to a bang. I think nVidia was leaning into this with their fake frames nonsense. They'll claim higher performance with all their AI stuff turned on, but in reality it'll be incremental. Synthetic benchmarks or productivity will be more interesting, not expecting a huge gaming bump.
I'm thinking the same thing as the whole reason they renamed these cards to 9070/xt was that they were targeting the 5070/5070ti. The 5070ti will prob be about 5-10% slower than a 4080 and I don't see these cards besting that or they would have went with 9080.
650,550 are not good prices even with more vram and better raster, the only way they will truly claw back market share is with 550,450.Looking at leaked specs, 9070 XT vs 7900 GRE has 20% less cuda cores while boosting approx 25-30% more [2.2 vs 2.9 ghz boost] shouldn't mean a lot of performance difference unless the rdna 4 cores are better than rdna 3 cores in some way.Seems very sketchy they decided to end 7900 GRE production as it looks like its the best competitor to the 9070 non-xt atleast which boosts to 2.5ghz with a little less cuda cores.Just seems a little too good to be true
AMD ray tracing is catching up to 4070 ti-ish, and 50-series are going to have low stock and aren't a huge improvement in raw performance. I'll probably just stick with my 4070 ti for now.
Tom in 2020: Sell your 2080ti, get a 3070 Tom in 2021: No I was right, if you had gotten a 3070 at launch you would have saved money Tom in 2025: I told you, you couldn't have gotten a 3070 at launch, there was no supply *bruh*
No, I said a few things your attention span missed: 1) If you sold your 2080 Ti for $1200 when I told you to, you could have bought it back for $700 a few weeks later. Fact. 2) Also, a 6700 XT 12GB was EASILY bought for $600-$700 from scalpers. Again - a $500 profit for 2080 Ti sellers, and now you get a real card with 12GB! Fact. 3) The 3070 didn't actually have the worst supply. Go watch my videos from then bruh - it's public lol. The 3070 had decent supply, and even though demand was still insane, you could get one - I did! It's the 3090 and 3080 that had paper launches. Facts. As always - facts are facts, and they don't care about your feelings.
Prices can be decided at the last moment. AIBs can't just suddenly conjure up a million VRM components without lead times. So when the AIBs say they aren't expecting much supply, you can be pretty confidant that situation can't suddenly chance in less than 6-8 weeks.
Prices can be decided at last given moment or given fake price beforehand to throw competition off. Supply is far harder to change. Especially when some parts are not manufactured in USA (and some are not, which is why so many techtubers talks about Trump tarrifs influencing future pricing)
@@andersjjensen Nvidia setup RTX 5090 to be the most attractive with huge profit of all the RTX 5K; there’s no way they would make short supply of them. All the wrong very high price leaks and short supply leaks are just propagandas to convince people to spend the money on the massive selling of 4090, so those sellers can move those money over to the 5090. Same cases for the 5070, 5070 Ti and 5080.
If AMD wants to reach the goal that you talked about a few months ago as soon as possible, they should make an event, come out and say: 399 and they go... Like Sony for PS1. and that's it. Everything else is prolonging the agony. Sacrifice those few million earnings on the product, not on ads.
I hope I can get one from the nvidia website, but I dont think ill beat the scalpers. It sucks that nvidia cant do a reservation thing, its first come first serve, meaning people can bot the site and etc, it sucks
So all I get from the comments is that people want AMD to give these GPUs for free, or they won't buy 9070xt for 550$-600$ that is faster than a 5070ti that is priced at 750$.
@@LvcIvsLIcInIvsLvcvllvs You actually don't know that. 5070 and 5070 Ti are not leaps, they are small improvements on the last generation. AMD could easily wipe the ground with them.
Here's what I find interesting. Out of all the online EU shops i normally look at, there is not a single 9070 card listing. Meanwhile, all those shops do have the rtx 5000 series listing. Now Nvidia claims they won't have a very high supply. AMD hasn't made a statement in that regard. We are still awaiting pricing; no idea about its supply... I think this could be a magic AMD moment in the making. I'm rooting for AMD. But now, so many gamers have the fear of missing out (FOMO) on the multi-gen 200fps+ ray-traced potential of Nvidia. My friends, it's only a gimmick. Usable ray-tracing is coming to the 6000 series and RDNA 5. For now, be content with raster performance. In which case, I'd recommend to either wait for 6000 / rdna 5, or grab a 4000 series on sale. There's a lot of uncertainty in the markets with Trump coming to Presidency, don't forget that. We're about to witness a huge shell-shock effect. peace
@@godnamedtay You're either new to PCs or have a short (or selective?) memory. The RX580 was a sales hit for AMD, and so was the 5700XT. RDNA2 sold reasonably well, but was not priced well at launch. In RDNA3 only the 7800XT was well received, and lo and behold, it sold so well there was price creep for two months before supply caught up.
@@godnamedtay Look at Nvidia's list of "known issues" in their latest driver update. When AMD has driver issues it's "AMD baaad!" and when Nvidia has them it's a completely normal part of software development. RDNA1 had the dreaded black-screen issue that took entirely too long to fix. RDNA2 and RDNA3 were completely normal. Stop sucking Jensen off. He doesn't even like you.
I think we are missing the 80 Ti class so frikking much right now. Those 102 dies are like unobtanium. All we need now is the anouncement and release date. Also if this is how good RDNA 4 is and its meant to be a "fixed" RDNA 3 with a few optimisations imagine how good actual RDNA 3 could have been. Heck imagine how good RDNA 4 high end might have been too.
So will they never release a higher end model this cycle? Wish they at least tried to compete for 2nd place instead of being a few tiers below Nvidias best cards.
All they did for RDNA4 was bus tweaks (RDNA3.5) and they finished RDNA 3 which was supposed to clock at 3 gigahertz+. But yields were too inconsistent and so they had to clock at 2.5 Ghz in RDNA3. Some overclockers beat the 4090 with a 7900 XTX clocked @ 3.3 Ghz!
I think a 96CU RDNA 4 monolithic chip with 24Gbps gddr6x would be extremely interesting - another 30-35% above 9070XT would edge the 4090 in raster and comfortably beat the 4080S in RT . If they could price that at $899 to take on the 5080 they'd clean house on nvidia's stack (without using fake frames😅)
@MaxIronsThird i don't think so even with rdna 3 the 7900xt and xtx get better performance over time. Check out ancient gameplay comparison for xt, xtx and 4080 super, games like cyberpunk increased on average 10% from a year ago.
That makes no sense, nvidia also improves over time. Also they are adding DLSS4 transformer model with better RT and DLSS for the RTX2000+ series, AMD isnt adding FSR4 for their older gen cards.
@nuddin99 yes, stated that in the video for rdna 3 and 3.5 some version if fsr4 will be supported. Also it does make sense they improve over time, the evidence shows that I didn't say anything about Nvidia, tell me you're a Nvidia fanboy without telling me you're a fanboy.
I've been watching MLID videos probably for 1 year or so to get some information about how would the new gen amd cards behave vs the current 7900xtx. MLID has been consistently saying for literaly months that the 9070xtx would be something between the 7900xtx and 7900xt for raster but with better raytracing, this awesome leak proves his accuracy was dead on! Thank you for helping me out not wasting 1000€ on a 7900xtx and wait for this new generation! Cant wait for the full reviews to come out!
[SPON: Ditch Dull Banking Apps, Get Rewarded Every Day w/ Benjamin: benjaminone.onelink.me/rky7/MooresLawIsDead ]
I think we've seen enough banking-gaming apps that fall through and substantially hurt users to know that this is not a legitimate app. And while I understand the necessity for you to take these sponsorships, I hope that few people actually use this app just so they aren't severely negatively impacted through something Benjamin does
Can't wait for your podcast with Jon!
@@ModernOddity728 suck up
Hmmm............not sure how it makes sense that NVIDIA literally stopped production of RTX4090 and RTX4080 good 3-4months ago and pushed in full TSMC fab allocations of respective chips to produce RTX5000 series GPUs respectively. And GDDR7 been in full swing production for months now. So this low shortage on cards mean they were making less than or around 500 cards per day if not lower than that. How is that possible. And if that is not the case and there been thousands produced where did all the chips end up going.
@@KING_DRANZER Hmmm... AI DATACENTERS?!?
All AMD has to do is price this correctly which is a big ask considering they have the marketing smarts of a toddler.
From what Hardware Unboxed mentioned after their press meeting with AMD last week, they admitted that they messed up pricing with the 7000 series and a lot of it was due to the high costs associated with those chips. They have optimized Navi 48 to be far more cost efficient. Plus the reduced TDP will mean a big savings when it comes to board design and cooler design. That should help keep prices lower and undercut Blackwell.
People will always find a way to hate on them. I honestly think it's cope from sunken cost fallacy. People want to justify why they spent double/triple on GPUs over the past 10 years. "raYtRaCiNG pErForMaNce and AMD bad"
They've done amazing things with CPUs and people still don't really give them credit there either. They just pretend they don't exist after they buy one.
@@I_am_Dad_SonOh stop glazing they truly have awful marketing
@ It's not glaze. Why do they need good marketing? How does Nvidia have good marketing? How does intel have good marketing? They, and many more staple tech companies don't market at all because they don't have to. People aren't magically swayed to buy a gpu. You buy one when you know you want one and you know what you want already before you buy it.
@@Frozokenthey won't sell at a loss.
No way AMD is going to miss this opportunity to miss this opportunity… Right guys? Right????
Depends on how they price it now.
Surely
This is becoming a bit of a nail biter. Both the 3950X and 5800X3D were launched without any prior hype, and both hit Intel like a ton of bricks. When AMD hypes they're about to do a belly flop. When they stay quiet they are dangerous.... and they're hella quiet right now.
People who make statements like this come across like they want AMD to fail. if you have no faith go throw your money at Nvidia, a future with no competition looks bleak.
@@andersjjensen The quiet part is a very good analysis. Hoping its true.
I want AMD to price the 9070XT at a point that I feel silly for buying an RX-7900GRE last year.
I mean 479 reference and 489-550$ usd for aib models I been hearing from very reliable sources would be amazing for performance these look like they'll give
You will
Doubt that’s gonna happen based on how stupid they are when it comes to pricing video cards.
you can either sell it and buy the 9070xt or keep the gre because the gre is still a good card, unless you're a 4k, ultra, ray tracing kinda guy.
I want AMD to have good pricing at launch AND A BOATLOAD OF CARDS.... so people who want to buy them can buy them... since nvidia cards will be nowhere to be seen for months.
If AMD prices the 9070's right I will be getting one. Given their history I'm getting ready for disappointment.
Given their history it doesn't matter if they price it correctly.... they will lower it unofficially after disappointing reviews, making it easy for you to get one. I got my XTX for $900 5 weeks after launch... The only reason to hope AMD makes it a crowd pleaser from the get-go is if you take personal pleasure in seeing Nvidia lose market share. I happen to be one of those, but that's another matter.
if and AMD don't go well
@@andersjjensen well, AMD will be AMD
Given you comment it could be 30% faster and 30% cheaper like last Gen and you still bought ngreedia and it's why next Gen you will get ngreedia 8gb cards for 650$
@@mddunlap03 was that a attempt at English ?
" Given you comment " it could be 30 % faster it could not be 30 % faster- 30 % faster than what a 6600 ?
Watch amd completely destroy their chance by pricing it at $649 and $549
😭😭😭
Watch the Hardware Unboxed video they made after meeting with AMD. They cover all of the pricing. Short story is AMD admitted the pricing was bad and was primarily due to how expensive that gen was to produce, and that Navi 48 is much cheaper so cards will be much more aggressively priced to gain marketshare.
@@hochhaul yes, but the worry is their profit margin. Since thr 5070 and 5070ti suck, they could theoretically price it the same, since it's better and has more vram, but they don't take into account the amd tax. Nobody will pay the same price for an amd Gpu even if it's faster.
@elpato3190 then let the fools be separated from thier money no way should amd have to be 30% faster and 30% cheaper
$649 is too high. $550 for AIB model 9070XT is the way to go. That would really destroy mr. Leather Jacket
$500 and $400.
If they want to get market share, this is it
They don't, they want money.
$550 and $400
@@badpuppy3 everyone want man what do you mean man???
550 and 479
No need $600 is enough it to be just fine
Nvidia knows when the plebs get their taxes.
😂
NVIDIA knows how to make a good gpu…unlike the company all u fanboys drool over & who believe anything this clown says.
@godnamedtay talk about a fanboy. You got triggered over a business joke. 😂🤣🤡🤡🤡
@@godnamedtay 🤡, You are the reason why we got a 12gb RTX 5070 and 16gb RTX 5080 in 2025.
@@godnamedtay thats what they said about intel when amd was catching up there. its not like making a gpu is a mystery.
So unsurprisingly Jensen lied about producing them "as large scale".....lmao
for scalpers including Nvidia ?- don't music event organisers now sell X amount on scalper markets = I'm sure Nvidia wants some of theat pie
i dunno why the heck they're not making enough
He didn't technically lie, it was Blackwell but not consumer cards
@@aeromotive2Because their PC/GPU business is much much smaller in revenue and profit than their AI/datacenter business. They are likely diverting resources there which makes sense.
Jensen's gonna be the new Todd.
Correction, I worked at intel for 5 years. We had the internal employee store where every cpu was approx half off MSRP! though it would usually take many months after a product launch for the stock to be available on our internal employee store. For reference I bought my 13700K for about $200 when 13th Gen was within 6 months of launch, iirc the 13600k was about $170. Internal employee store was great for cpu's
your comment doesn't contradict what he has said in the video, he was talking about nvidia, not intel.
@@rodrigoferreiramaciel4815I feel like the person above just extrapolates their personal experience into a broader, more general statement about employee stores not getting the product AT launch necessarily, so the video’s overarching statement about **overall** 5090 FE supply based on employee store leak might not be entirely accurate.
I'll upgrade to 9070 XT from my 6800 XT for $500 or less. If it's any more, it's not worth it; I'd instead look at used XTXs in a few months. But price this aggressively, and the market will accept it with open arms. Come on, AMD. Provide a good GPU launch at a killer price.
Imagine the 9070xt undervolt performance tho
@@pointlessless2656 You're telling me, my 6800 XT has been at 1040mV for 2 years (reference btw). But I'm only biting for 9070 XT if it's priced right. Please AMD...
@@cjeffcoatjr $479 would be great, though I'd probably get a aib card for $530+ just for the better power limiter, the 5070 being 12gb is already a deal breaker no matter what dlss 4 offers
I think this is unrealistic, they should give you the GPU for free, why would they want money? Right?
9070xt 550$ is a great price, and 600$ is a good price, anything more is just AMD not wanting money.
9070 should be 420$-480$. Anything more is crazy.
@@atalkhan6698 It might be a good price but the goal is to get people to buy a Radeon card for the first time. Most people just buy nVidia.
AMD will price just high enough to lose 2% market share
i really hope they price the 9070 xt close to the 7800 xt msrp, since it is not a much bigger card. that would give them an insane gen on gen price to performance increase, but if they instead decide to increase their margins for it, i doubt it would gain much mindshare
They should price it to significantly undercut the 5070.
$499 is the sweet spot. They need another rx480 win!
579.99 FE aib 600-679
@@odaharryif it matches a 7900xtx that aibs sold for up to 1200$ don’t expect it to be 500$….
@@arkyr1570 To be fair you can buy the 7900xtx on newegg for $869 brand new, 7900xt for $659. I doubt they'll be at $500 dollars but I could see 690 to 750 for sure. Especially if it ends up having less VRAM than either of those cards.
Incredible that there are still so many that think 1200-2000$ for a GPU is perfectly fine 🤡🤡🤡
Do people think that? I mean look the 4090 sold very well for a bunch of reasons I don't want to regurgitate...but the 4080 didn't. That's why the 5080 is $200 cheaper.
Now, that's not necessarily "enough" of a discount, but it does show gamers DO care....and maybe we wait and see how this gen goes before we assume everyone's still ok with the status quo...
Bro its your primary hobby for 5+ years. 2,000$ is almost nothing.
@@hunternewberry58602000 is 2 weeks of pay … people need to get a job and stop living in their parents basement
@@MooresLawIsDead I think the 5080 was priced $1000 because its no more than %5-%10 faster than the current $1K 4080S . nvidia just admitted that the 5080 is only %15 faster than the non-S 4080 in raster.
if your numbers end up the same as reviews then basically the 5080 is not going to be "meaningfully" faster than the 9070xt.
@@hunternewberry5860 Agree graphics cards are cheap compared to other hobbies, i pay 1500€ every 25000km to replace the breaks from my Stinger GT. Despite that i only buy a 9070XT or a 5070ti because i dont like the pattern and dont see the valure of overpriced high end cards being locked out on next generations big feature.
Wasn’t going to upgrade from my 7800 XT, but for 4080 performance for £499, FSR4 and actually capable ray tracing, I might have to.
Same but from my 3080 ( pretty similar performance to your card ) should be around a 50% uplift which is awesome Imo. I hope they don't do something stupid and over price these. $499 and its a no brainer
@@PCgamingbenchmarktime Yeah it makes more sense for you too for the VRAM, shame about the 3070 and 3080, they're still very capable cards, but they're hamstrung.
Also I don't know if you've had an AMD card before, but the move from Nvidia to AMD for gaming at least is a non issue, the experience is largely the same, don't let the haters put you or anyone else reading this off man.
@StarkR3ality i had a 6800xt for a week. I had no issues with software, I had a large issue with coil whine though so I had to return it....but it was when stock was impossible to get and the prices went up so I couldn't get a replacement. Lol I just sold my 3080 and just been using my 9800x3d cpu to much around with ( it can actually manage 80-120fps in halo 4 at 1080p low settings 🤣) and i realised that vrr works much with amd...I play with a TV screen. 144hz that has freesync premium pro. I had issues with my 3080 in a lot of games where it looked jittery even though vrr was working, some games it good but I'd say it was more bad then good. I tested a couple of games out that I knew were problems and they worked perfectly with vrr. So Amd is the way to go for people playing with a TV screen imo. So that's already a huge bonus, the only other thing that put me off Amd was Fsr3 since I play at 4k. But fsr4 looked fantastic from what I've seen so I'm ready to give Amd another shot 👏 they just need to price it properly lol. A lot are saying it will be $600, that makes no sense at all to me lol I can't imagine them fumbling again after they lost marketshare with the 7000 series
I might go insane checking Twitter and reddit to see if AMD has formally announced 9000 series.
Please AMD, put me out of my misery and let me pre-order already!
I'm in the same boat 🙏🏿
They are scrambling HARD to get as much info on the actual performance of the 5070 and 5070Ti as possible. Pricing is going to be everything this time around. Initially I was under the impression that the 9070XT would be slower than the 5070Ti, but with what we've seen so far... it might actually be faster. If AMD can match it in RT, beat it in raster, and knock $200 off the price, while at least showing off FSR4 in a few games at launch, that's going to be a massive win for gamers.
Get a 5090 instead, king!
Retailers have been getting stock for 7 days. They typically get stock for 2 to 4 weeks before launch. You won't have long to wait! It may launch in as little as 7 days!
23/24 january pre-order.
So nvidia is basically doing a paper launch with that supply
Jensen did say that last year. Most of their production is going to AI gpus. So the supply will be smaller for gaming GPUs… That was from last year!
Nope, those 5090’s were pre ordered by enterprise companies. For months, system integrators have been allowing pre sale of workstation PC’s including up to four, FOUR, 5090’s. nothing paper about this launch, they could just care less about us peons unwilling to shell out $8000 at a time.
@@user-mh6ie9wm6m I don't hear any mention if the 9070XT will be a paper launch or not, in order to keep their prices inflated.
scalpers
Wow amd has such a huge opportunity here based on these benchmarks
it looks very good but the games in those benchmarks are a lot of amd favored games.
4080S/7900XTX performance for $500 would be mindblowing. 5070 and 5070TI would be DOA.
@@nipa5961if they had 9070XT priced at $500 with 7900XTX performance the one that will be in trouble will be AMD themselves. How they going to clear 7900XT and XTX stock? Nvidia care less if their 5070 and 5070Ti are not being bought by gamer. There are tons of semi pro waiting to buy those cards because to them they still saves thousands by not buying Quadro equivalent to those card.
@@rotm4447wasnt nvidia games on their charts nvidia sided aswell? Lol they alwaysdo the same e ery company
AMD and Opportunity are phenomena from parallel worlds.
The 5080 doesn't seem to be offering much that the 4080 and 4080S aren't already offering. People who can afford a 5080 have likely already gotten a 4080 Super, and I think some of them are also going to be very well aware that that's an awful lot to pay for a graphics card with 16GB of vram in 2025. What good is all that horsepower, and ray-tracing performance, if you don't have enough vram to properly use it?
Yeah the 80's class card are a joke right now, I have a 4080 but the vram on this thing is killing it. I really want that 5090 for the vram but if the supply leaks are true, I won't be able to get it on launch day. I use my GPU as a work card that's why I need tons of vram.
@@apricreed9580maybe a used 3090 no? 24gb of vram and a lot of raw performance
In which games the 16GB are not enough?
@@plasmahvh raw performance is kinda meh, considering that they are quite overpriced even today.
_"that's an awful lot to pay for a graphics card with 16GB of vram in 2025..."_ Why? There is absolutely *zero* chance of 16GB of VRAM being a gaming "issue" before a 5080 is e-waste. Poorly ported console games (which will be pretty much *all* of them) will invariably exist, but they are always patched _after_ the rushed, profit-grasping, launches.
Seriously considering 9070xt over 5080. I got a 1080 eight years ago and it has been a great card, but it's time to back AMD up. Nvidia is disappointing.
Well its certainly not gonna be over 5080...
Keep with the 1080 dude, its still a perfectly usable card. Don't bother with these prices, its just a game.
@@melski9205 well 1080 sucks in most new games. i got an rtx 2070 which is slightly faster and it sucks in new games. even in games that are years old
I'm on a 1060 and plan to get 5080. 20series was bad value with RTX early adopters tax, Scalpers ruined 30 series. Nvidia scalped 40series themselves and amd failed to take advantage. I'm not gonna wait any longer. 1060 was fine for cago but with cs2 it struggles and in ACC in vr as well.
5080 is a terribly priced GPU. $1000 for a GPU with only 16GB of VRAM in 2025?
whats the deal with low supply? I thought everyone was hoarding and that blackwell has been done for a long time and they were just waiting on 40 series to dry up.
2 possible reasons.
1. Intentional supply side leaks to expose leakers.
2. A supply bottleneck that didn't get resolved in the timeframe people thought it would be. Possibly memory. It's entirely possible there are gpu parts sitting on warehouse shelves waiting to be assembled once memory shows up.
FE cards are always low supply…..
@@djnes2k7yeah but Jensen literally said they would have a ton of them at their keynote...
@ these leaks are AIB not FE.
@@LiveTypeGDDR7 shortage would make total sense, that sounds the most plausible reason
Would be icing on the cake if EVGA decided to return and join team Red to make GPUs
If that happens, Sapphire might decrease their Nitro+ series prices because of competition.
Not happening. Would be nice but not happening
@@Uncle-LemonXY GPU overclocking has been dead for 3 generations now. With GTX 1080 you could actually get some good results, but with 3080 or 6800xt... just no. Like 5% gains
@@DuBstep115 7900 gre 🤔
@@berke__ usually the automatic boost clock is so close to overclock that it doesn't matter. Same was already true in 1080, but in same cases the FE cards where at 1700mhz and you could get them to 2100mhz.
Me thinks the GDDR7 production numbers are really low
No more like Nvidia has learned to artificially restrict the supply so they can milk the consumer for as much money as they can.
Maybe
Nvidia is an AI company. Their useful idiots get the inflated scraps.
It's for sure that a 750 mm square chip has a very very low yield! The 5090 is a recipe for disappointment!
Now that Nvidia announced prices that were way more aggressive than we anticipated they will bring back artificial scarcity to counter that. Exactly what I expected. Thats why the official MSRP is always worthless as it does not represent actual pricing.
It really pisses me off how youtube reviewers ignore fake MSRP bullshit. You could never buy a 4090 for $1600. Every time I checked they were $2000.
@@SirMo it's because they dont buy anything or get stuff as companies directly from suppliers so they get it at the proper price.
Artificial "scarcity" only works if it somehow benefits the *manufacturer* . What possible benefit does nVidia derive from selling fewer cards (basically chipsets to AIBs, at at _fixed_ price) than the market demands? Answer: *ZERO* . Short-term scarcity might increase perceived desirability, but long-term it serves no purpose whatsoever. Companies exist to make *profits* and if they don't sell product, the don't make *profit* . Tough concept, I know...
@@awebuser5914you’re forgetting that Nvidia effectively operates as a monopoly. Who cares if their overpricing leads to AMD sales going up 30%? AMD is 10% of the market right now, a 30% increase for them is Nvidia losing 3% of their buyers. How concerned are they going to be about that if they squeeze the remaining 97% for 10-20% more money?
@@awebuser5914 ROFL. Do you realize that over 90% of NVIDIA profit comes from AI and data centers? Selling to gamers is mere few % of their total profits. They can literally create artificial scarcity to make people buy at inflated prices cause there is always someone who wants a better card. MSRP is just a propaganda placeholder price that almost never happens.
my kidneys are ready
Can a brotha borrow one? I'm fresh out 😭
These aren't Nvidia man, no need to get them kidneys ready, maybe just a pinky finger
Leave the kidneys alone 😂
It won't be that expensive.
You just need a liter of blood
@@FLMKane dont give AMD any ideas!
damn, that 9070XT performance looks promising as hell, I'm really hoping they don't screw up the pricing on it
So basically Nvidia claims the 1000$ MSRP to make consumers believe they learned their lesson but then they constrain supply and let AIBs enfore the actual planned 1200$ price point. Let's hope people see through this. And AMD.....please, please, PLEASE don't f**** this up again by releasing your cards at a ridiculous MSRP.
One more thing though: while I massively appreciate the leaks it's honestly a bit surprising to me how often you manage to recieve full internal documentation from AMD compared to Nvidia. They really should get a grip on that.
I'd be surprised if Blackwell could do playable path tracing on the 70 class cards.
You get 15 fps but then with the magic of 4x framegen you get 60 fps that still feels like 15 fps!
@Flyon86 yEaH bUt ReFlEx mAkEs It FaStEr!
You can already do that with the 4070 super using upscaling only
how can supply be so low? I thought they were scrambling to get in inventory before the tarrifs kick in
Stored supply is high, release supply is low to maximize profits, like stocking up on toilet paper when it's cheap and then selling them at high prices coz it's in high demand. then when tariffs kick in those stored supply prices aren't affected but you can still make the same profit when you increase the prices again due to the tariffs
@@scarletspidernz god i hate theses companys and how they treat their customers with artificial scarcity
They need supply for the datacenter GPUs which are way more profitable.
If AMD's primary goal for RDNA 4 is to take market share in the mainstream, then these cards MUST have mainstream prices.
We're talking no more than $399 for the 9070 and - at MOST - $549 for the 9070 XT. And even then, you could argue they should actually go lower...
There are no new games that are worth a gpu upgrade ... maybe use the extra ram for LLM smut?
Mainstream is not what it used to be…
$600 for 9070XT would be cheap… $500 for 9070…
And it would be competative!
@@SDKSeizO Not everyone have 3090 or 4080. There are a lot of people who have only 8gb vram card
Because NOBODY wants 4080 raster for under $600? And NOBODY wants 7900xt performance for under $500? You're an idiot .. of course, everyone wants this at reasonable prices - NOT YOUR CRAZY BS ...
The 9070xt sounds like a 5070 with significantly better raster, and 16GB. The non-xt looks pretty quick too. AMD should charge $500 or $529 for AIB models, and the non-xt $400 or $429 for AIB models if the plan is to gain market/mindshare. Those prices would make the 5070 and 5060ti pointless. The 5070ti would also be in a strange place, definitely not $250 'better' than a 9070xt. The 5060 will also need to be dirt cheap if the 9070 is anywhere near the 9070xt lol
This is getting crazy with their supply... less and less cards are made to inflate prices. I think this is intentional made by Nvidia.
The only exclusive thing that they bring to the older model from DLSS4 is the transformer model. What people need to aware of is the fact that using transformer model is more expensive thus potentially bringing transformer model to older GPUs will make those GPUs run slower vs the old CNN model. While yes, I believe in Nvidia driver you can pick to use what model, in the end most people will just use whatever the game uses. If the game only offer DLSS4 SR (which uses transformer model), then the uplift from using DLSS4 on older GPUs will not be as high vs when using DLSS in other game with similar quality. This will make older GPUs feel dated in terms of performance more quickly when in fact the older GPUs should still perform well if not being bottlenecked by the tensor core performance. Again, just be aware of that.
I've been wondering about that. There's been a general assumption that dlss improvements will be free from any additional performance costs. That seems highly unlikely. The value proposition of 'improvements' for pre-existing nvidia hardware may be less attractive than people are assuming.
they said it's about 5% performance penalty.
@@MaxIronsThirdI don't think they're specified on what gen though. I'd be concerned about the overhead on Turing since those Tensor Cores are quite a bit more limited
@@MaxIronsThird Wait until the actual review. 5% in what game? What DLSS settings were used? Even if you look at the current games using DLSS3, not all games reacted the same. I can easily imagine a situation on a game with high enough FPS that this quoted 5% penalty turn into something much worse. Also there is a difference between the high tier GPU and lower tier GPU with less tensor performance to play with. If the test was done only using the highest tier GPU then it can mask the penalty since obviously highest tier GPU in their respective generation have more performance to spare. Remember that in terms of compute for upscaling, it doesn't matter if you tested it on a 3090 or on a 3060, the compute cost (in terms of operation per second) for doing it is the same assuming the resolution is the same, but obviously since lower tier GPU has less tensor, to run that same upscaling it will require more time, thus costing more milliseconds. I believe Nvidia have shown this before, about the cost of doing upscaling, which is different on lower end GPU vs higher end GPU.
Edit: basically what I'm saying is that the 5% penalty (which I actually read that on reddit before posting) is pretty much useless without any context. I need to know the GPU being used, the game, resolution, and upscaling quality settings. Also obviously instead of 5% it would be better to present it as ms. How is the fps difference vs DLSS3 and convert that to ms. If it was 100fps using DLSS3 and now 95fps using DLSS4 (which is 5%), then the extra cost is 0.5ms. But if it was 30fps using DLSS3 and 28.5fps using DLSS4, that is 1.8ms more. That extra 1.8ms more if being applied to a 100fps game will turn into 84.7fps. Nvidia had a technical briefings and they said this transformer model use 4x more compute.
Which works in NVIDIA's favor as ppl will then upgrade since they may not know exacly
Guys, the 50 series is the greatest Nvidia series ever. Every card, from the 5050 to 5080 will have the same performance as the 4090!
On a more serious note. Feels like it may be worth upgrading to 9070xt this year. I hope AMD will have the supply and get good press this time around.
a 9070 XT, really
take the meds
People hate on AMD simply out of sheepishly following the pack. I've had 2 AMD cards, the only problem I ever had was with a driver for a 18 year old game. You get more for your money, and the cards last longer. Name a card from 2017 that you can still game on that Nvidia sold for $250 or less. I still see people say they game just fine on their RX 580.
It's lame cause Nvidia pulls the wool over the consumer's eyes and upcharges cause they know they have a grip on the market, AMD and Intel go out of their way to cater to the common man like *we all say we want* but they always end up getting clowned on for whatever reason. Seems like a circular argument to me.
You are dumb.
Very original..
Muppet!
If 9070xt msrp will be higher then rtx 5070 msrp - AMD will not gain market share. To gain market share, the 9070xt must be significantly faster and cheaper. Only this combination will change consumer behavior.
Exactly, 10% price cut from the 5070 while beating in raster by over 30%. That's the only way they're going to convince people. $499 is the magic price. Anything over that and its less tempting and people will start looking towards nvidia likes always. Surely they've learned this by now lol
@@PCgamingbenchmarktime $499 wont be enough to get most plebs from buying the 5070 at $550 regardless of performance or vram. $449 is the market slayer price if they actually want to gain market share.
A gpu that is faster than 5070… can be more expensive…
@@haukikannelwrong.
@haukikannel no it can't. It's only matching it in RT. if its $599 you can bet most people will just spend the extra to get the 5070ti which will have much better RT and more gimmicks to wow people with. I'm sure Amd knows this and will price it correctly. Anything above $549 and its not going to do very well. $499 and its a must buy
Took me 11 months to get my 3080 , i orderd it the day it was realeased.
6 for me ordered it on the day 2.
i scored 5x3080 at msrp im better than fucking scalpers man
Wtf how is that even possible. I ordered my strix 4090 and took a week
@@godnamedtay that really happened with lots of owners. There were queues that were in the 100K just to get a 3000 series.
@@godnamedtay because it was a 4090. No where near as bad as 30 series.
i have a 4080 and even i want that 9070xt to be 500$ to finally push raytracing forward and competition
Yes we all want a 500$ 5080 but may as well be a 400$ 5090 competitor
Same, I'd like to see AMD's solution slaughter my card even if it's just in terms of price, The entire market has become a joke and never recovered from crypto mining days.
I aint no specialist, but amd card are bad about Ray tracing, or maybe will be outdated within a year a two.
@@ocerco93...Which is why people are hopeful about the 9070 and 9070xt since they've made clear improvements towards FPS w/ RT enabled compared to previous iterations.
@@AbsoleteAim i hope you right, i doubt it tho.. 7900xtx was supposed to be good with RT enabled lol
I just sold my 4090 for 1900 thinking getting a 5090 would be a cheap upgrade. Looks like no computer for a while
LOL, sold my 4090 for $1,900 but on eBay (no local bites), so I netted a bit over $1,600. Same boat though. I got a 3090 and 4090 on launch day, I might pull it off again... but.. I'm not as sure I WANT to. Ugh. Well, we'll see what the benchmarks are in a few days. Well, in my case I have a 3080 as a backup. Words better than expected.
Worst case I end up purchasing a used 4090 locally (selling $1,200 - $1,700 range)
man... you really should have wait to get the 5090 first, in my memories getting a 4090 wasn't so hard at release day, but the 5090 could be way harder, considering how concerning the leaks about supplies are.
and for the price $2000 is going to be the minimum, you probably gonna find 5090 for more than $2700 with AIB
😂😂😂 you f up. Tbh the 4090 is plenty.
@@stevetb7777 ive never bought a card on launch day... whats the trick to getting one?
@@nahbro3240 It`s not for those with more money than brains...
I am now leaning more towards 9070 xt as my 1440p GPU if these leaks are true after 3rd party reviews.
Steve from Hardware Unboxed will get to the bottom of it. I'm not moving until we have a 40 game average from him.
You want the 5090 bro, trust me
@@badpuppy3 If you have the money for it, sure. It would definitely be the best especially that AMD will not compete on the high-end.
@@andersjjensen I am the same but my current excitement now lean more towards 9070 xt which was not the case during CES. I hope those leaks are true so we can at least enjoy the midrange.
So a 5080 is a 1440p card
So, I might be able to pick myself up a 5090 by my birthday in August? Sounds about right...
As a 7900xtx owner I’m cheering on that 9070XT and hoping it can out kick its coverage. Please please please make it cheap. Not just affordable but cheap. 🎉
He only talked to the cart pushers in the parking lot. ...upper management 😂
Yeah just like when he was telling us that the 5090 was gonna be 2500 and 5080 1350, Unsub now
@@nimaahmadi-s7o That's going to be the actual price though. nvidia prices are bullshit for a long time, it's not what you are going to be paying for not even close.
If the 9700XT is so incredible, why wouldn’t AMD at least show something before the Nvidia launch?
And video cards that compete with it are gonna launch in february.So amd has a lot of time
I also want to mention my cheapest option for a 5080 for my DISTRIBUTOR is $1083.78 so they will NOT BE $999
AMD has to go $400-500 range on these cards. They just don't have a choice at this point. If they loose more market share this gen. Lisa would be a fool, and shareholders would not allow her to be a fool. TO continue a dGPU line. Essentially that has already happened with the UDMA announcement. If the people in the GPU dept at AMD want to be able to sell cut down CDMA cards for consumers post RDNA4... they NEED to gain market share. If these benches are real and the GPU dept can talk Lisa into $399 and $499 (or even better if they can manage it) they have a chance to grab enough market share to continue the consumer line. They loose another 5-10% market share over '25 what is the point of supporting consumer dGPU anymore.
No need to go that low!
Because nobody wants 4080 performance for $600? That would be ick? WTF are you smoking?
@@dgillies5420 I'm smoking reality. AMD has lost mind share almost completely it doesn't matter if their hardware is 10% faster then Nvidia at 80% the price. Its not ENOUGH. So being 10% slower at 80% even 70% the price is really not going to work. Its go big or go home time. The point of this gen... was to design an affordable product. They won't loose money. They can't afford to NOT gain market share. Not loosing share isn't good enough anymore. They NEED to turn Nvidia 2060, 2070, 3060, 1660 owners into 9070 owners. If they fail at that. Lisa is going to end dGPU. It isn't enough to get the dwindling number of current AMD users to upgrade. They need to start attracting Nvidia customers to AMD.
@@dgillies5420it's slower than a 4080
I think the rumors regarding $479 standard and up to $549 AIB cards sounds about right for this card so far, definitely doesn't need to be aggressier at least, at this kind of pricing I would definitely pick one up in a heartbeat and I haven't considered AMD for like ever (neither Nvidia since 2000 series why I'm on an Intel Arc atm lol but would go for AMD RDNA4 if pricing is right). I can say they'd definitely get market share whit that kind pricing, there's more to it than just repeating history here as FSR4 looks like a huge step up and RT performance has a huge leap as well. Also add to that, more and more customers are getting fed up with Nvidia pricing / "strategic supply issues" and fake frames marketing as they are the same as real ones etc. and are often more prepared to support the opponent this time around if enough value is presented.
How do you only have 200k subscribers? You have been THE tech inside information guy for years.
His ego
He’s an arrogant cocky sob that’s why
@@mddunlap03 Not ego you are just stupid person and you have more estrogen it seems matches, its very simple to understand him.
@@mddunlap03it's not really an ego problem when he's right.
@@mddunlap03 Yeah he shouts it from the rooftop every time he's right and then it's nothing but crickets when he's wrong. Observant viewers will know that lots of what he says amounts to little more than gossip.
Man, my RDNA1 5700 is looking ancient.
This was such an attractive card back then.
You just weren't able to get one... xD
My 6750XT isn't looking much better
It could be worse. I'm still on a GTX 1070. 😅 I'm waiting for AMD to launch the 9070 so I can finally play modern games.
@@jaxonswain3408i have a 1050ti, i want the 9070 to do the same😭😭 we in the same boat
@@jaxonswain3408 same here dude Vega 56
FSR 4 on RDNA 3, at least on some level is amazing.
I thought I was out of luck and had to wait for UDNA to get FSR 4 or 5 because I owned a 7900 XTX, now I'm really excited to see FSR 4. I don't really care if it's only like 50% of FSR 4, if there's improvements, that's a w for me.
im done waiting 2 years for frs 3.1 in games. i going for 5080 from 7900xtx just for dlss. no more waiting on fsr...
Have you tried Fluid Motion Frames 2?
@@FrostyBud777 dude is like the guy stupidly waiting at the coach station, unaware that the coach has already pulled in 15mins ago and waiting for him to board.
i think i will keep my 7900xtx for the next generation after the coming up ones. Performance increase is not worth the price at all.
9070xt needs to be 500$ for AMD to gain any market share. Anything else is just a tossup between the two
hope AMD is smart and price this right, and i hope people support them if they do the right thing.
When AMD announce the card I'd like to see a Steve Race style presentation where whoever the Scott Herkelman replacement is at Radeon Technologies Group just picks up the reference 9070XT card from a plinth , holds it up in front of the camera and says "499" , then he puts the card back down . In the background is a polished slide with these benchmark results . He repeats the act for the 9070 , saying "399" this time around, and then walks off the stage with "available 3rd February 2025" as the final slide behind. Should last 1 minute tops. Everyone knows everything about the card so no need for info 🤣
Hearing that RDNA 4 performance is encouraging. It looked like after CES that Nvidia was going to completely curbstomp AMD into the ground, so that's encouraging I guess.
Also LOVE that you're going to have Asianometry back on. He's probably my favorite guest. Guy is so knowledgeable.
wait fsr 4 support for rdna 3+3,5 means strix and strix halo with fsr4???? thats the even bigger thing for me here
Linus said that basically everyone at CES, all the aib's and SI's said they all pretty much sold out of 5090's, and it was mostly buyers for companies placing orders for new workstations to replace the workstations that they currently are running.
AMD knows how to overthrow a giant. They did it in the CPU sector by offering a way more performant CPU at a way cheaper price. As for GPUs however RDNA4 can't compete with Blackwell. So i am really curious about AMD's strategy for this generation.
they can compete in value.
If these raw performance leaks are true and it's priced at $499 I will buy it.
B570 ASROCK and Sparkle has been up on Newegg at MSRP for sale for 2 days, stock's still holding. Edit - Amazon has them today now too. So not too bad a launch.
My local Microcenter still has B570 stock... But the site showed they only had one model with 5 cards in stock as of 40 minutes after opening on launch day, and as of now they still have only that one model but the quantity has dropped to 2. I've not seen the number go up between check-ins, suggesting no restock yet.
Seems like inventory was low but demand was even lower.
Only for the US though. Everywhere else in the world they're either unavailable or $280.
@stuartthurstan 280€ here in Germany which would be a little under $250 without tax... which was supposed to be the MSRP of the B580 lol
@ Well, that's pretty decent. To be fair Germany is probably one of the best places in europe to perhaps get hold of PC hardware at honest prices.
@ It is, but the Rx 7600 used to be even cheaper.. Now it went up in price a lot over the last few days, probably because people realized the B570 won't be as cheap as reviewers told them lol
Is it a bad idea to go ahead and sell the 4070 TiSuper and get a 9070 XT? I really want to support AMD and it seems it will be similar or much faster depending on the game. Could get $700 for the GPU and guy the XT for 600 likely. I am very tempted. This is looked impressive AF going from a 384bit to a 256bit GPU both on gddr6.. wow.
Scalpers set to get the largest percentage increase from the 50xx generation. Congratulations!
so basically, why sell blackwell as a 5090 for $2000 when you can put that silicon towards AI accelerators and sell it for 10-20x the price
Yep
I think that Jensen did tweat last year that the supply for next gen gaming GPUs will be smaller… So this is old news… in that regard.
Radeon has to nail the combination of price, value and supply to gain market share.
Their best chance to do so in recent memory was RDNA 2. RX 6800xt and RX 6800 were fantastic but sadly AMD was not willing to supply enough until the end of the gen. So they ended up losing market share despite a fantastic product.
They would have been dumb to sacrifice Zen 3 supply (which they make a lot better margins on) for RDNA2 supply. And TSMC was fully booked because of the whole pandemic thing. Currently TSMC is not at full capacity on N5 and derivatives, so if AMD needs to book more wafers, they can.
There was plenty of supply. No one wanted to buy them. Ever visit a store during COVID? The AMD cards were staying on the shelves.
@rareapuenjoyer which country are you talking about?
Globally there was chip shortage and AMD prioritized CPUs as they have a higher margin. Only later in the gen they were sitting on shelves.
I sincerely think AMD is going to price it cheaper. AMD is probably seeing how people are reacting to the "fake frames" issue with nvidia having little to no generational uplift. AMD smells blood, and they are going for it. AMD aggressively priced their CPU chips against Intel and look at where it got them. I believe they would do the same with Nvidia.
Let's hope so. If they price it at $499 while smashing the 5070 in raster and matching in RT it will make the 5070 look like the joke it is lol. $599....no one will care, most will just get the 5070ti for the better ray tracing at that point. Hope they don't stuff this up, first time I've been excited for an AMD card lol
30% is not little to no uplift and the fake frames “issues” for now is manufactured. Nobody but Nvidia knows how it performs yet
@@proggz39 that's only for the 5090 mate. The 5080 only has around a 15% raster uplift. The rest are around 20% at best
@@proggz39its 30% for a 2000$ msrp
Deja Vu, I swear I heard this at the launch of RDNA 3. We all know how that turned out.
Going by history, not by hope, AMD will price the RX 9070 XT $100-$150 more than we want - aka $600-$650.
You also said the 5080 will be 1200-1500. Even a broken clock is right twice a day.
@@ryanvasei8412 you will pay 1200€ for one, don't worry about it
Obviously he's going to cherry pick and emphasize only on leaks he got correct and hope everyone forgets all the other things he got way wrong.
@MegaStupidMonkeys your name is kinda right
Yo did AMD just drop a card of the fucking decade? Looks like a 1080ti style card. I am definitely picking one up.
No they didn't
Looks pretty amazing.
Even the lower RX 9070 model (56CU) is beating the RX 7800XT by up to 30 - 40% in some of those benchmarks. Wow - I was expecting the cut down RX 9070 model to about equal the RX 7800XT.😄
It all depends on the price. If the 9070XT is anywhere near $650 it's DoA. I personally wouldn't pay a dime more than $500
@ This is stupid considering the 7900xtx is 900+ dollars right now.
220w? I'm sold! That's my card!
Well if intel is selling the B580 at cost at the employee store, and they are losing money on each card, wouldn't it cost more than it cost at retail? This looks like it's going to be a good generation for video cards.
That sentence makes no sense, if Intel was eating the cost to get rid of a terrible product instead of just having it sit around it would be more expensive? What?
@@DebasedAnonIntel would only eat the cost when selling to a customer.
@@DebasedAnon He's saying Intel's employee store would have to raise the price to sell "at cost" since they are selling their retail copies at less than cost. Like if the card costs 300$ to make and they sell it retail for 250 they lose 50 dollars. So the employee store has to sell it for the higher price to NOT lose money. What is likely happening is that manufacturing costs are less than 250 but the R+D, shipping, marketing, overhead, retailer cut etc make the overall costs a net loss per purchase.
@jojoq.3961
Fair enough but i thought that was an obvious assumption right? Obv manufacturing itself wont be 250...
Look at the BOM on the 4070 vs MSRP ($550). Then look at the BOM on the B580. EXACTLY THE SAME ~EVERYTHING (trivial difference in GPU die size). WHAT A HORRIBLE DESIGN (33% slower than the 4070). Intel ARC is from the 100% nonprofit division of Intel!
I'm looking forward to preordering the 9950x3d and 9070 xt to build a new simulation gaming pc this year
Fascinating if priced correctly AMD has a real chance at market disruption.
That 220w 9070 gives me a lot of hope for a solid cut down 12gb N48 card focused on efficiency with ~48CUs, I'm really hoping to see a resurgence in good sub 200w GPUs as I need a replacement for the poor RX 6600 in my TV rig deskmeet X300, it has trouble even performance upscaling to 4k sometimes and alleviating this would make for amazing, tiny systems.
if you want a good gpu under 200mm for your deskmeet, i doubt amd will have a good option. your only good options are a 4060ti or a zephyr 4070
Those would be near unobtainable like the 6700-non-XT.
@kushy_TV 4060 ti is an awful option since the 16gb is the only viable one, but scales awfully due to that shitty memory bus and costs $450, there's 0 chance that a card coming out within the next couple months won't beat that in multiple ways.
N44 you mean?
i think that's the plan, 9060XT based on N48 with cut down 192bit bus/12GB of VRAM, then the 9060(and lower models on laptop) are based on N44, which means a 128bit bus/8GB of VRAM.
pricing is everything amd. i would go lower, 9070xt 548 if 599 is a real price idea imo. make the shiny man look stupid.
Realistically AMD has little need to price the XT above $499. It's probably around the size of the 7800 XT. But with this kind of performance they have the legitimacy to price it right next to the 5070 Ti (even though gamers would snub it still).
Those prices aren't great...lol $599 is a little too close to the 5070ti. It would be DOA. $499 is what makes the most sense . Same price as the 7800xt. They're around the same size too , it would be foolish of them to price it $100 higher lol
599 is a really stupid price though?
@@PCgamingbenchmarktime Plus it's 70s class card (right? This is why they changed their name scheme), so why 70 class card should cost 100$ more than previous 80s class tier?
$599 for 9070xt is going to be too much. Its not going to sell at that price. It should be max $549
But the pricing is the important factor here
Wonder if they do $500 for the 9070xt and $450 for the 9070
honestly doesn’t feel like a large enough price gap between the two cards it would just kill the 9070 at launch the 9070 needs to be atleast 100 dollars cheaper or they might as well only launch the xt
Not so low… $600 for 9070XT and $450 for 9070 sounds more plausible.
This isn’t surprising. Nvidia would rather allocate wafer supply as much as they can contractually with TSMC to AI
That is the point.
These are some nice performance charts for the new AMD cards. Now all they have to do it price these cards competitively and they will sell like crazy! The power reduction is a very welcomed bonus in my book.
If the 9070XT is less than 10% slower than the XTX on average in raster, and gets close to a 4070Ti Super in RT I will sell my XTX and get a 9070XT if it's $550 or less.
I think it will be closer to 5-7% difference. Remember those drivers are over a month old. A few of those games seemed quite oddly low compared to the others. AMD themselves said things are better with their latest drivers
I’m just gonna say you might wanna take this with a grain of salt..
@entreri76x its in line with the other benchmarks and Amd themselves have said its going to perform better than we expect. Seems pretty likely . Probably has improved since these benchmarks too since the driver's are over 1 month old. Some of the games didn't perform like they should, assuming there's more work been done since then
If a 64CU 9070 XT is almost as fast as a 96CU 7900XTX, AMD could have easily released a 96CU 9080 XT to tackle 5080 or even a 5080Ti. But wait.... Its AMD... They are not happy without a bullet in thier foot... shot by themselves.
They could, but AMD chose to prioritize next-gen Multi-chiplet GPUs since Nvidia also wants to bring theirs faster to the market. This gen should be short-lived, -1.5 years.
Agreed. Given everything I’ve heard and seen, it would have been trivial for them to just make a 96CU, 24GB 9080XT for $800-1000. They can still credibly say they’re not worrying about the “high-end” when their top card is less than half the cost of Nvidia’s.
If those benchmarks are accurate, then AMD isn't pricing the 9070 XT any lower than $599. In fact, $649 to $679 is more likely since it'll be a competent 5070 Ti competitor. Anyone hoping AMD would launch the 9070 XT in the $449 to $499 range are on a lot of hopium.
Ridiculous 😂😂 it would be DOA. Max price would be $549, no chance they're dumb enough to botch another launch with such a sure thing. The Card probably cost the same to produce as the 7800xt. They have no need to price above $499. Who the hell would buy it at $649😂😂
XT for $600 would be good deal and non xt $450… even $500
while i can see your point, the BOM cost of N48 is the same or lower than N32 (7800xt) so at 550 they are making more money than they have on any product since the 6950xt. and they will take market share at that price.
@AltRockKing exactly, the bom costs would be very similar to the 7800xt. Even at $499 they're like making over 60% margins on these cards, they can afford to price aggressively and go after marketshare. I have a feeling they'll be stupid and price at $549 which won't be enough to sway people. $499 is 10% less than the 5070 while performing like a 5070ti in raster and probably still slightly beating the 5070 in RT. That would dominate the midrange, i really hope they play if more aggressively and really stick it to nvidia lol
It doesn't take that much imagination to predict that a company that has in the past severely limited supply in order to keep demand, and therefore prices high will do so again. 🙄
When he told "even if amd chooses to 599$ for 9070xt and 499$ for 9070" thats big mistake from Tom, there can't be even hope for Amd to get those prices, it's has to be 449$(max 479$) 9070XT and 399$(max 429$) for 9070. We need Amd to stay in market and keep greedy nvidia down.
As long as it's under $499 it will sell well. It's 30% faster in raster at least compared to the 5070, but it will probably only be on par or slightly ahead in RT. $499 makes a lot of sense, I think they'll probably match the price of the 5070 though to be honest which would be a bit less exciting. $499 is midrange, anything over that I'd consider upper midrange prices
450$ ?? Bro it’s competing against a 750$ card are you smoking ?
@@letsplay1097 if i would smoke same thing as you, next generation we would need to pay 850$. 9070XT 450-479$ for reference, ~500$ AIB. nvidia already made you think that price increase is ok. Amd is in bad pasition, they need to gain market share, get in to our minds, there is no other way to go back (unless they have some monolithic 9080XT which would beat 5090, but not this genaration)
@@devilheadas don’t get me wrong i would love to see it being priced below 500 but the thing is the market has changed forever and yeah amd targets market share but that wouldn’t mean pricing the 9070xt at 300$ less than the competitor it just doesn’t make sense profits would be zero , how else would they finance they’re r&d
@@letsplay1097 Yeah but AMD doesn't have the feature set Nvidia have, not to mention the mindshare so they need to price it way lower so that it can grasp market share. 499$ should be max for RX 9070XT
The benchmarks for the 9070 look awesome, like so close to the 7900xt, even $500 would be a good price, $450 would be a crazy good deal, like it would be madness to buy anything but that GPU, I've a 6800 non xt, and really not thinking about an upgrade as it is still too good for the games i play, but if it comes near that $450.... It's a no brainier it will be the deal of the decade
Seems like the 50 series will be a performance whimper as opposed to a bang. I think nVidia was leaning into this with their fake frames nonsense. They'll claim higher performance with all their AI stuff turned on, but in reality it'll be incremental. Synthetic benchmarks or productivity will be more interesting, not expecting a huge gaming bump.
The leak says Navi 48 XTX, are you sure that's not an undisclosed 9070 XTX, and not the suggested 9070 XT?
I'm thinking the same thing as the whole reason they renamed these cards to 9070/xt was that they were targeting the 5070/5070ti. The 5070ti will prob be about 5-10% slower than a 4080 and I don't see these cards besting that or they would have went with 9080.
650,550 are not good prices even with more vram and better raster, the only way they will truly claw back market share is with 550,450.Looking at leaked specs, 9070 XT vs 7900 GRE has 20% less cuda cores while boosting approx 25-30% more [2.2 vs 2.9 ghz boost] shouldn't mean a lot of performance difference unless the rdna 4 cores are better than rdna 3 cores in some way.Seems very sketchy they decided to end 7900 GRE production as it looks like its the best competitor to the 9070 non-xt atleast which boosts to 2.5ghz with a little less cuda cores.Just seems a little too good to be true
AMD ray tracing is catching up to 4070 ti-ish, and 50-series are going to have low stock and aren't a huge improvement in raw performance. I'll probably just stick with my 4070 ti for now.
Tom in 2020: Sell your 2080ti, get a 3070
Tom in 2021: No I was right, if you had gotten a 3070 at launch you would have saved money
Tom in 2025: I told you, you couldn't have gotten a 3070 at launch, there was no supply
*bruh*
No, I said a few things your attention span missed:
1) If you sold your 2080 Ti for $1200 when I told you to, you could have bought it back for $700 a few weeks later. Fact.
2) Also, a 6700 XT 12GB was EASILY bought for $600-$700 from scalpers. Again - a $500 profit for 2080 Ti sellers, and now you get a real card with 12GB! Fact.
3) The 3070 didn't actually have the worst supply. Go watch my videos from then bruh - it's public lol. The 3070 had decent supply, and even though demand was still insane, you could get one - I did! It's the 3090 and 3080 that had paper launches. Facts.
As always - facts are facts, and they don't care about your feelings.
Omg yeah you're right. I completely forgot he was the one who started the craze of making people sell their 2080ti's. YIKES 😬
Man, I've already planned for April or May being the "real date" to get either a 9800x3d or whatever new gpu reviews really well.
well may still be true, 2026 ... :D i ordered a 9800x3d yesterday in hope the 5090 might follow soon, my hopes have been crushed haha
The RTX 5000’s supply leaks are as credible as their price leaks.
Prices can be decided at the last moment. AIBs can't just suddenly conjure up a million VRM components without lead times. So when the AIBs say they aren't expecting much supply, you can be pretty confidant that situation can't suddenly chance in less than 6-8 weeks.
Prices can be decided at last given moment or given fake price beforehand to throw competition off.
Supply is far harder to change. Especially when some parts are not manufactured in USA (and some are not, which is why so many techtubers talks about Trump tarrifs influencing future pricing)
@ Jensen already made a statement of large scale production for the RTX 5K during the CES 2025 presentation.
@@andersjjensen Nvidia setup RTX 5090 to be the most attractive with huge profit of all the RTX 5K; there’s no way they would make short supply of them. All the wrong very high price leaks and short supply leaks are just propagandas to convince people to spend the money on the massive selling of 4090, so those sellers can move those money over to the 5090. Same cases for the 5070, 5070 Ti and 5080.
I have a good feeling the sapphire nitro oc is going to be an overclocking demon. No news on it, maybe 350w oc limit?
I thought the rumor was Nvidia was filling warehouses with cards to avoid tariffs
They probably have a heap in warehouses but with no and cards and they can limit supply and keep prices high
If AMD wants to reach the goal that you talked about a few months ago as soon as possible, they should make an event, come out and say:
399
and they go... Like Sony for PS1.
and that's it.
Everything else is prolonging the agony. Sacrifice those few million earnings on the product, not on ads.
I hope I can get one from the nvidia website, but I dont think ill beat the scalpers. It sucks that nvidia cant do a reservation thing, its first come first serve, meaning people can bot the site and etc, it sucks
i beat scalpers back in 2020 had 4xfe cards and few aib ones at msrp
@@lordzed83 are you trying for a few 5090s? maybe you can try and get one for me
You just know they're gonna overprice it.
If they don't, I'll snag one, been about 4 years since my last upgrade
So all I get from the comments is that people want AMD to give these GPUs for free, or they won't buy 9070xt for 550$-600$ that is faster than a 5070ti that is priced at 750$.
Nah bro, if AMD wants to gain market share they must price the 9070 xt at 300 usd or less!
/s
Lol, fools. Just go get a job or work some extra hours.
It wont be faster than the 5070 ti and certainly not in RT and with no magic 4xframe gen
@LvcIvsLIcInIvsLvcvllvs I was speaking in a hypothetical sense, in accordince to this leak, but you have clearly tested both, so you know better.
@@LvcIvsLIcInIvsLvcvllvs You actually don't know that. 5070 and 5070 Ti are not leaps, they are small improvements on the last generation. AMD could easily wipe the ground with them.
Here's what I find interesting. Out of all the online EU shops i normally look at, there is not a single 9070 card listing. Meanwhile, all those shops do have the rtx 5000 series listing. Now Nvidia claims they won't have a very high supply. AMD hasn't made a statement in that regard. We are still awaiting pricing; no idea about its supply... I think this could be a magic AMD moment in the making. I'm rooting for AMD. But now, so many gamers have the fear of missing out (FOMO) on the multi-gen 200fps+ ray-traced potential of Nvidia. My friends, it's only a gimmick. Usable ray-tracing is coming to the 6000 series and RDNA 5. For now, be content with raster performance. In which case, I'd recommend to either wait for 6000 / rdna 5, or grab a 4000 series on sale. There's a lot of uncertainty in the markets with Trump coming to Presidency, don't forget that. We're about to witness a huge shell-shock effect. peace
AMD don't let us down!!!
you joking or your stupid lol
They will, they always have, & always will.
@@godnamedtay You're either new to PCs or have a short (or selective?) memory. The RX580 was a sales hit for AMD, and so was the 5700XT. RDNA2 sold reasonably well, but was not priced well at launch. In RDNA3 only the 7800XT was well received, and lo and behold, it sold so well there was price creep for two months before supply caught up.
@@andersjjensen the rx580 was 100 years ago. Every gpu after that for the most part had so many driver issues there’s too many to count. Stop it lol.
@@godnamedtay Look at Nvidia's list of "known issues" in their latest driver update. When AMD has driver issues it's "AMD baaad!" and when Nvidia has them it's a completely normal part of software development. RDNA1 had the dreaded black-screen issue that took entirely too long to fix. RDNA2 and RDNA3 were completely normal. Stop sucking Jensen off. He doesn't even like you.
I think we are missing the 80 Ti class so frikking much right now.
Those 102 dies are like unobtanium.
All we need now is the anouncement and release date.
Also if this is how good RDNA 4 is and its meant to be a "fixed" RDNA 3 with a few optimisations imagine how good actual RDNA 3 could have been. Heck imagine how good RDNA 4 high end might have been too.
The chiplet did not work… That is why they did dump the big chiplet version!
So will they never release a higher end model this cycle? Wish they at least tried to compete for 2nd place instead of being a few tiers below Nvidias best cards.
All they did for RDNA4 was bus tweaks (RDNA3.5) and they finished RDNA 3 which was supposed to clock at 3 gigahertz+. But yields were too inconsistent and so they had to clock at 2.5 Ghz in RDNA3. Some overclockers beat the 4090 with a 7900 XTX clocked @ 3.3 Ghz!
@ It was a thought experiment.
I think a 96CU RDNA 4 monolithic chip with 24Gbps gddr6x would be extremely interesting - another 30-35% above 9070XT would edge the 4090 in raster and comfortably beat the 4080S in RT . If they could price that at $899 to take on the 5080 they'd clean house on nvidia's stack (without using fake frames😅)
Plus remember amd usually gets a 5-10% improvement over time with driver updates, so it could eventually be equal to the xtx in most games.
@MaxIronsThird i don't think so even with rdna 3 the 7900xt and xtx get better performance over time. Check out ancient gameplay comparison for xt, xtx and 4080 super, games like cyberpunk increased on average 10% from a year ago.
That makes no sense, nvidia also improves over time. Also they are adding DLSS4 transformer model with better RT and DLSS for the RTX2000+ series, AMD isnt adding FSR4 for their older gen cards.
@@MaxIronsThird No. My 6700XT was 4-5% faster on average when I sold it than when I bought it, and my XTX has seen some good improvements too.
@nuddin99 yes, stated that in the video for rdna 3 and 3.5 some version if fsr4 will be supported. Also it does make sense they improve over time, the evidence shows that I didn't say anything about Nvidia, tell me you're a Nvidia fanboy without telling me you're a fanboy.
@@nuddin99nah nvidia cards usually slow down quite a bit with newer drivers.
Thanks for the great info, Tom! You're awesome! I hope to support the channel someday
I've been watching MLID videos probably for 1 year or so to get some information about how would the new gen amd cards behave vs the current 7900xtx. MLID has been consistently saying for literaly months that the 9070xtx would be something between the 7900xtx and 7900xt for raster but with better raytracing, this awesome leak proves his accuracy was dead on!
Thank you for helping me out not wasting 1000€ on a 7900xtx and wait for this new generation! Cant wait for the full reviews to come out!
I’ve been watching the dude since 2019 and he’s a coin flip on accuracy.
when are they going to price/announce them exactly?