I am terrified at the thought of the 5090 price... Between AMD saying they're mid end now and the 4070 GDDR6 having no price reduction for worse performance is disheartening to say the least.
4070 isn't exactly memory bandwidth limited so going from GDDR6X to GDDR6 isn't going to hurt performance (unless they've changed other stuff). It will however, lower power consumption. GDDR6X is a weird Frankenstein experiment between NVIDIA and Micron that should've only been on the 4090 because its a halo product. I do agree about price though. GDDR6 is cheaper so it should've gone down to consumers as well.
RX 6800 has the potential to be a decent classic as well. It sits in a nice price point secondhand these days, and if you run Linux, it is a no brainer.
I really hope they nail the price to performance ratio when it comes to rasterization. I don't care much about raytracing. I want bang for the buck and stability.
Raytracing is where rendering is moving towards, games are already being built with raytracing that CANNOT BE TURNED OFF, it's not going to be optional. You'll be shooting yourself in the foot by ignoring RT performance for a supposedly future proof upgrade. Also saying you "don't care" about something is useless in the face of developers caring very much about it, they build the games and you buy the hardware to run them, you can't stop progress.
@@ybned-ed8rz some UE5 games like avatar and black myth have some amount of ray tracing on at all times and the "ray tracing" mode is actually path tracing
@@steel5897 I really hope that this is not the case. As gamers we should be looking for fun games that are worth our time not the most realistic and graphical cutting edge. As graphical fidelity increases, the scope of games is also increasing which means longer development times. This is why AAA is dying. They do not see the value in making a fun game and would rather pump out 10 year projects with cutting edge technology, when a fun and enjoyable experience is all gamers actually care for. Also the UE5 look is absolutely horrid that is spitting out mirror copies of the same thing. Indie studios will undermine these mega studios if this continues on. And as for bigger studios, look at games like Elden Ring. Graphically speaking its nothing to brag about, with its bare minimum textures at best but with lighting that carries. Despite the mixed graphical fidelity it is easily one of the best games of modern.
RDNA 3 and Zen 5 have similar weird vibes. They both perform pretty much like the previous generation and cost more. I think RDNA 3 underperformed expectations, given that they chose to call the 60 CU model '7800 XT', vs. the 72 CU 6800 XT. In some games, the older card wins. This is the sort of weirdness that makes consumers trust and understand Nvidia, even if they are being ripped off. To be fair though, I did buy a 7800 XT because they were going for £429 and for that price it's solid.
RDNA 3 was a bit like Intel's Meteor Lake- moving stuff all over to a modular design, which itself is an achievement. But it's boring for us buying it even if it bodes well for the future? Maybe? If the modular design is scaleable and actually leads to stuff like more performance? Which it doesn't seem to be for some reason? etc
@@ShinDMitsuki the "trust" here means it will always have better performance and margin than previous gen. Also the "trust" count with driver stability, longetivity and the future proofing. I dunno if Nvidia is Open source like AMD.
@@azravalencia4577 I wouldn't say the 7600XT is dogshit, it fills the niche that the 6600XT filled and that card was an underdog 1440p performer. I'd say the 7600XT is in the same boat, it competes with the 4060, has double the VRAM, and is almost an underdog 4K card at medium settings in a few games
Of course, why would they charge less if people are still willing to pay? People love to complain about Nvidia pricing, but they are still selling GPUs like hot cakes. I guess people don't really care about their money and don't want to get more for their money.
@@rand0mtv660You mean rich people? I’m not allowed to be mad at the fact people who don’t care and just buy the best drive up the prices for everyone else?
@@ausden9525 I think you misunderstood my comment. I'm saying judging by how much people complain about these prices I would expect that Nvidia wouldn't really be selling that much, but people buy their stuff anyway. It's just crazy consumerism.
Well if you're not stupid you'll understand that they tried to mimic the success of Ryzen againt Intel by making the chiplet architecture on RDNA3, something that indeed made Ryzen powerful and affordable. Their GPUs are indeed Powerful in raster for the price, better than Nvidia. But it's easier to say than to make AMD superior, it's the first Chiplet GPUs it's not mature and it's complex, they're going back to mostly monolothic for RDN4, apparently. Also devs optimisation on Nvidia using Optix and GameWorks makes games less optimized on AMD hardware, for example Alan Wake using optix, and that's one of the problems pointed out by the Radeon executive, which is why he's targeting the mid-range, selling as much as possible to have a base that would attract developers... PS5/PS4 games run superbly on AMD GPUs because they're RDNA2-optimized games. but most PC games aren't... and gamers have this mania for not thinking Cartesianly and in industry and hardware detail, devs use Optix/GameWorks but you'll very rarely see the Radeon ProRender logo which optimizes for amd GPUs raytracing etc. you're being duped by NVidia's marketing and showcases in partnership with studios like Remedy etc to sells card. you are being fooled by NVidia's marketing and showcases in partnership with studios like Remedy etc who have contracts that require them not to use more powerful or similar tools, which is why Alan Wake does not use FSR3 and DLSS3 is implemented, these games are consciously under-optimized.
@@technite5360 it’s already a fact that AMD is just worse at ray tracing and even AMD knows this. Even AMD sponsored titles with Ray tracing often use lower fidelity ray tracing than those who are Nvidia sponsored. Plus you can’t ignore that AMD’s way of calculating Ray tracing absolutely sucks at path tracing. The calculations necessary make it extremely difficult to process as while some aspects their fine, others they suck at. There’s a reason why AMD hasn’t shown off path tracing at all while Nvidia does. The consoles don’t show anything different. You just have different developers who are very good at optimizing for specific features on one set of hardware. The PS5 doesn’t really do any better in ray tracing than the similar PC GPU, the RX 6700. The games run as expected and you get the games that run great from the great developers that Sony has had in house that put out the best looking Sony games last gen. It’s just a case of the advantages of developing for one console so you can make hyper specific optimizations that you won’t get on PC
Not really, it's what gives them a niche in the first place. They can't produce something identical or better than NVIDIA and also price it cheaper. That's simply not going to happen. So they have to look at the market and see where they can fit in, which is being "diet nvidia" as you put it lol. Producing stuff that is slightly behind nvidia, with pricing that reflects it. If you want low pricing, they have to give up in other areas.
They still fulfill the niche in the market, better price-performance for non-raytracing games. With the way that games are heading right now, with pretty much no AAA games set to release for 2025 and indie games dominating the market, I think its a growing niche. But you're right they need to advertise better and price even better if they want to beat nvidia
We need great value $150-$200 GPUs again. This pci ex 4x nonsense with the 6400 and 6500 XT killed AMD on the low end. What use is there being 11 options in the $350-$550 range if most people don't spend that much on a GPU??? How the hell were they thinking this would grow their market share?
@@zoltankramer3383 More so, there are budget cards in the 250-350 price range... 7600XT is around 300-350 bux and maybe it doesn't run everything at 60fps on 1440p ultra, but neither did the 570 or 580 lol. People's standards have just gotten absurd. With Nvidia inflation is pretty obvious though. Adjusted for inflation the $379 GTX 1070 launched as a $497 card and the RTX 3070 launched at $499. Obviously it doesn't bring much comfort if prices have only risen according to inflation but your wages haven't....(and obviously if wages also rose then the GPU prices would rise faster than inflation for the sake of profit) but that's definitely a larger issue with causes completely disconnected from GPUs...
Isn't the RX 6600 150$? Maybe i didn't understand your comment, but the RX 6600 is better than both you mentioned. It's not even "low end", yet still around 150$. What the hell do you consider mid end?
I've given up on hoping that AMD will actually launch a good value product, instead of barely undercutting Nvidia. I am FAR more interested in Intel's Arc Battlemage (gen2) than RDNA 4, as Intel has made it clear they are willing to offer really good prices to sell products and the drivers have made huge progress to the point where Arc GPUs are recommended now.
Unfortunately Intel is in serious trouble. They fired bunch of people, their stock has gone down and they just aren't doing very well. We'll see if this whole GPU division they have will live long term.
Intel are currently making GPUs twice as large as AMD just to get the same performance level, that's a very unsustainable strategy given their recent financial woes. Sure the Arc cards are priced well currently, but i doubt they're making any money from them. I'm hoping Battlemage improves things, but they'll need to seriously improve their drivers and architecture if they want to remain competitive.
I have no faith in Intel anymore, they have that "declining company" aura similar to what Boeing has. I get the feeling it's all downhill for them. Already rumors of them potentially selling their manufacturing division altogether though admittedly I don't know if that has any truth to it.
the last super notable amd card that really made waves was the 580 series of cards, they weren't powerhouses but damn were they good value and gave good enough performance, should've kept that train rolling when they had the chance
I loved the 4870. It was released in 2008 at $300. Which is $438 in today's dollars. Today, this money gets you RX 7700 XT 12GB, 20% slower than 6900 XT.
Current MLID leaks suggest that it'll have 7900 XT-level performance rather than XTX, with RT performance equivalent to the 4070 Ti Super. Take it with a grain of salt, of course.
@@Vorexia imma suggest not to suggest any leaker and keep your expectations in the sand because the last 2 times we have did this it ended with everyone being dissapointed
Watch radeon release same perfromance as nvidias card but 50$ cheaper as they have for several generations... So the question becomes is better upscaling and other software features worth extra 50$. Want higher fps turn on DLSS2 (thats miles better than FSR2), want eye candy turn on raytracing that runs better on nvidia cards...
Main issues right now is alot of the time when a new game is out it doesn't work on amd cards and just crash for the first week of release, one of the main reasons I have not bought one.
@@currentlyatlas5074 think you're mixing up intel gpus with radeon. Radeon drivers are fine. Im not asking much just for FSR2 to match DLSS2 picture quality.
@@Takashita_Sukakoki Wukong, BG3, Witcher next gen update all were unplayable on launch on 7000 series gpus. WoW DX 12 dosent work to this day on AMD cards too. Theres plenty of problems with radeon drivers.
100%. DLSS is much more mature than FSR and significantly looks better when upscaling 1080p to 4k screen. Not only that, but I'd gladly pay $50 if I don't have to deal with AMD driver issues (seriously why do they keep breaking their own driver and release it anyway).
I remember about 2 decades ago I've bought some program to download sites. Internet was on per-minute tariff, so you could just spent like 15-30 minutes and entire site, with gifs and flash animations was soved locally.
I also expect pricing to be not great. AMD's manufacturer is TSMC, and they are expensive, since they're the only high end chip manufacturer. The RX470 was manufactured by Global Foundries, which made them worse at power efficiency, but allowed AMD to keep their pricing low. PS: Funnily I didn't click until now partly because I didn't want to hear speculation about performance etc, as you've mentionend in the video. It was a great watch and I'm glad you've managed to make something I didn't expect!
Shout-outs to the Fiji Fury series for being a wild bonkers card that (in the Fury X) was way too huge, hot, and power hungry but was still an amazing experiment.
AMD's pricing really is a problem, one what is compounded by the feature set disparity. RX 7700XT launched at 489€, sold poorly, and is now down to 389-399€. Had it launched at or below that price point, they would've had a card faster than the RTX 4060 ti 8gb on the market, with sufficient VRAM and performance lead to make it a genuine option even with FSR being what it is. But no, AMD had to try and upsell while having the inferior upscaler and worse raytracing performance, against a competitor with market share and mind share dominance. And I'm saying this as someone who owns an AMD GPU myself!
not only are you comparing the prices of amd gpu that is head neck and shoulders above the nvidia card LITERALLY in another tier, you're also talking about raytracing on anything short of a rtx 4080. incredible. over here in the uk anything nvidia is overpriced by at least £100 but it can easily go over £300 more for the same tiered card. a 4060ti 16gb is priced like a 7800xt. the time of year does not matter. i agree launch prices were ridiculous but what prices werent? a radeon 6750xt was £463 2 years ago because we left covid, lived in a crypto mining crisis and still continue live in a crisis because covid. what happens now, is you can buy a current gen at discounts while always having the option to buy a high tier previous gen for cheap. you'd also get to do yourself a favour by running native. why because dlss and fsr are trash, dont delude yourself that playing with vaseline over the screen is fantastic. they're there to mask how terribly rushed new games are. games made from unreal engine come to mind as they print photorealistic games like never before, while adding dlss and fsr as the only optimisation. wait until people remember/find out we had photorealistic games back in 2015 rivalling or beating todays games, running off r9 290/gtx 970's.
I ordered but not paid yet for rx 7900xt at 661 euro (including tax and shipping). I already have rtx 4070 and don't know if I should sell it and pay for that order, as I would need to add 200 euro or more. I'm not convinced because of bad fsr and poor rt performance. Yes the card would be faster and 8gb vram more, but in rt+fsr it won't run much better and won't look the same.
@@gameison4813 TH-cam comments aren't the best place for any advice, let alone monetary advice. Reddit's buildapc for example would be a better place to go to. With that caveat away, if the RTX 4070 performs well enough that you can afford to wait six months or more, I'd recommend waiting until the RTX 5000 series launch is complete, which itself is expected to occur in late 2024 or early 2025. At minimum I'd wait until Black Friday sales. RX 7900XT at 660€ is a fairly decent price and the raw performance in non raytraced situations and very light raytraced games like Resident Evil 4 Remake is faster or nearly as fast as 4070ti super. Of course, the moment you try to run say Alan Wake 2 with raytracing the performance craters. At 1440p native the 7900xt would have the same or worse performance as the RTX 4070 at 40fps or so, with an upscaler that isn't as good. It all depends on what games you're likely to play. Honestly? I wouldn't bother upgrading until I'd get +60% performance boost at around the same price or if the GPU I had couldn't get 60+ fps at the resolution of my monitor. The 4070 only launched last year, so I'd wait until next year at minimum before even considering upgrade. If you had a 2070/3070, it would make sense to upgrade, but I'd rather save the money and see how the market develops if I was in your position. Again, I will stress that this is the youtube comment section, so take all of what I said with a shovelful of salt.
I hope we go back to the RX 400/500 series days. You even got options for entry-level cards. My first GPU was the 550, which I upgraded to a 580 a year later. I still have it on my testbench system. It's honestly the best AMD's GPU division has ever been
Still rocking the old Polaris RX 590. The GPU market has been terrible ever since. It's a shame that great range never took off. This event probably cascaded into the sad state of affairs we have today.
I think you should have talked about battlemage along with rdna 4 since it will be coming around or after rdna 4 , and it seems they have addressed the issues with alchemist and older game compatibility will be better and newer titles will also be running great from day one.....
@@bmqww223 Yeah. And Intel also has more commitment to upscaling (XeSS). I bet they already work on AI frame generation. I would be surprised if AMD adds something like DLSS or XeSS to RDNA part 4.
@@cube2fox AMD should partner with the Lossless Scaling guy for framegen. LS is fucking incredible for laptops or other dual-GPU (iGPU + GPU) setups and supporting something like that on the hardware level would be cray cray.
@@cube2fox they are working on a fsr AI or fsr 4 , upscaling for sure, rdna 3 was a wakeup call for them when they saw xess 1.3 drop and beat it easily , and they were a bit cocky with the intel driver situation, but know if you follow the intel engineers presentation they are have thrown the microsoft directx 9 to 12 compatibility layer and running in the metal which shows that older game would run fine, and for the modern game, just see their indirect xi metrics its almost 12 times more(fancy marketing way of saying that they have got it up and running) which will help in modern titles and even help with shitty day one launches which gets fixed, so i am 100 percent sure battlemage is the reason amd is singing different tune right now and the threat is looming over amd right now .......
I bought a RX 7800 XT to replace my GTX 1070, 1st time I ever went with AMD in 9 years and changing the GPU brand. And honestly I'm more than satisfied with it since I can run all games no problem, just a shame people would buy the more expensive and inferior RTX 4070 just because it's from Nvidia and that people still think "AMD has really bad drivers" and is "unstable" but I've been using my 7800 XT for 6 months and had 0 issues. Just wish people would do their research.
Same. I got my 7900 XT as an upgrade to my 6700 XT since I was upgrading from a HP Reverb G2 to a Bigscreen Beyond, and I wanted the extra performance for VR and my 32:9 1440p monitor. I was looking at the 4070 ti, but the extra VRAM on the 7900 XT sold me on it and I couldn't be happier. There are games that I have played that used more than 16gb of VRAM, so it definitely helps.
@@smallbutdeadly931 I just couldn't justify buying the 4070, since in my country I paid 520 Euros for the Asus Dual 7800 XT, and the cheapest Inno3D RTX 4070 that probably was horrible was 635 Euros, at which I could've bought the 7900 XT lol, so I just got the 7800 XT and im really happy with it :D
Yes, people should do their research. Trust the guy who says "just buy it I have had no problems." If you think the 4070 is inferior to the 7800XT, then the one who should be doing their research is you.
would be nice if they can ofer rtx 4080 ( 5070) level performance card but much cheaper but I doubt it they would sell it much cheaper, maybe like 10% lower price at best
@@pathway4582 I truly cannot comprehend spending hundreds or thousands on something as frivolous as a gaming PC and then worrying about a power bill like a penny pinching old man. It seems common online, but I just can't fully get my head around it. .16c per kilowat-hr is normal pricing for electricity 100 watts means if you used it 10hrs , it'd cost you an extra 16 cents on your bill compared to a card 100 watts more efficient. So we are literally talking about a couple dollars even if you game ridiculous amounts. Let's say you game like it's your job. 40hrs a week would be 160hrs a month. Even if we say the electricity is .25c per kwh.. that's a whopping 4 dollars on your bill! An extra 50 dollars a year? Tops?
the pascal/polaris era of gpus were absolute powerhouses, its sad how things turned out not that long after their releases. I'm not ultra confident rdna 4 will be the shakeup we were hoping for
Decent at good prices is not good enough - its needs to be the xx70 Ti but like no more than $399. Are they going to do that? They were rather vague, and moreso than if this really meant blowing nvidia away on mid performance and pricing - or just equaling the new stuff at way lower prices. So, really, its "no high end card, nothing else changes, but sell it as something good and watch influencers think we meant something else because stock price".
This, so much this. a 4070 ti or 4070 super + 24GB vram for 400$ would hit the jackpot. AMD don't need to aim at the 4090, they need a great card at a great price. If they fail to deliver at least this then they will fail again.
The GTX 1050 Ti had the huge advantage of having low-profile models that had no external power connectors. Cramming a 1050 Ti into a small form factor office PC (like Dell Optiplex or Lenovo Thinkcentre) from eBay was an unbeatable deal. Hell, in my country the 1050 Ti is still quite expensive on the second hand market because of it. It would be nice to have a modern GPU that: 1) is low profile, 2) has no external power connectors, 3) is based on the newest architecture, 4) is as cheap as possible. While APUs are great, their iGPUs are often not good enough. We need dedicated GPUs for ordinary people and we need them ASAP.
I'm trying to jump to Linux and told myself "the next card I want is the top-shelf AMD card" since I don't use the RTX features while in Windows, the performance is similar, if not better in some areas, and the compatibility with Linux is much better (ongoing progress in that area, but still not there for my Linux choice). I'm still going to consider AMD when the new generation(s) release and I feel the need to upgrade, but I don't really see that happening until maybe 2025.
I bought the 1050Ti when it came out. I had 150€ to spend and quite frankly didn't even know AMD existed. Dumb highschoolers not doing market research are to blame for the 1050ti selling like hot cakes I reckon, based on my own experience.
2 หลายเดือนก่อน +1
As Linux users AMD is the only real option for a high-end card that doesn't depend on drivers compiled from dubious sources.
Hey, a small addition. RX 6800 was a cutdown version of 69xxXT. Not sure why they decided to use Navi 32 instead of Navi 31 for 7800XT, but AMD used Navi 21 for 6800 series. It was priced kind of competitively for a flagship card, and the remaining stock of it is an unbeatable value card for what it is, as they are sold for $350~
@@2kliksphilip I felt like it was worth mentioning, as you went through 6700XT AND 7800XTs being different dies than their top-end counterparts. It felt like you either didn't know or intentionally omitted the existence of 6800s, and them being the same die as their top-end products. I know that it's just a tid-bit piece of really unimportant part, but RX 6800 and their reduced prices(being the release MSRP of RX 6600, oh the times) are kind of what you are looking for when it comes to GPUs, from my understanding that is. Plenty VRAM, same die as powerful counterparts, energy efficient for what it is and cheap compared to other stuff Which is why I kinda made an edit under the original comment to further say that you probably already knew it.
@@2kliksphilip The transcript : "And the whole idea that their cards will be smaller and cheaper and more efficient just because they're only targeting the mid-range doesn't really make sense when you consider that cards like the Radeon 6700XT and 7800XT are already kind of like this. They're not cut-down versions of a high-end card, they are their own smaller die" This kind of directly describes RX 6800. More efficient than RTX 3070, performs near 3080. Hardware Unboxed and GamersNexus praised the card for its efficiency. They had a good card that should have dominated the mid-range with stats on paper, but as usual, people wanted NVIDIA. While you'd be correct to call me out on RX 6800 not being mid-range or very accessible, the MSRP was priced a bit above 3070, and should have been a $500 card, just like RX 7800 XT. You are also correct on the front that AMD didn't advertise the card's upcomings like they did in Polaris. After all, RX 6800 is an intentionally shrunk down top-end SKU that was priced tad above Navi 22 GPUs. Just like 7900 GRE's pricing compared to 7800XTs nowadays. Though it sucks that they now do top-end priced like high mid-end instead of top-end to low-mid end.
I want ITX / reasonably sized / space-competitive cards, i.e. single fan ~200W card like 1080/2070/3060Ti (notice there is no 4070) And the single Powercolor RX 5700, but nothing for the 6650 / 7600
I'll be starting on a new system this black week, hoping to catch some sales, but I'll be holding off a few months on upgrading graphics card awaiting the next generation. I'm hoping AMD can offer good performance and raytracing at 120FPS+ 1440p along with an updated tech stack that doesn't suck ass. Right now the best option for my goals seems to be the 4070 Ti Super, which is a whopping $200 more expensive than the 7900 XT for the same rasterized performance and LESS VRAM! I'm paying a huge price premium for what's essentially just a few RT cores and better software.
I do hope RDNA 4 is good. I've been getting that upgrade itch the past few weeks and I figure the best time to do so is next generation or maybe the generation after that with whatever the most valuable midrange cards are.
i think AMD will do pretty well. Im pretty optimistic overall about AMD as a brand and im hopeful that they can deliver their goal of a mid range champion. That being said, a few years back I bought a 6900 xt and its been holding up great. Eventually I'm hopeful that mid range cards will be good enough to do 4k gaming with lighter frame generation similar to how they went from being 1080p champions to 1440p ones.
Honestly at that price point you won't make that big of a loss as there's not that much value to lose. Waiting for half a year for a budget card doesn't seem like the greatest of ideas. If you're looking for something like a 7900XT or 7900XTX then I'd argue that waiting is smart if you want, but anything below. Nah.
@@byperx You can probably get like 120$ for it second hand. New 7700XT's go for about 400$s which is great value and will more than double your performance. If you want a quick indication of performance difference google for " TPU" and look at the relative performance. With a 1660Ti I am, however, somewhat concerned that your CPU is underpowered. If you're at 1080p you should take that into consideration.
@@Spentalei If, instead of considering how much value the 7700 XT will lose, you consider what else you'll be able to get by then at its current price point, then things do change a bit. The 7800 XT or even the 7900 GRE will be likely be available for a very minor price increase by that point, if not a similar price altogether. With their memory configs, those cards have a much more positive future ahead of them and can even fiddle with 4K. The best time to buy current-gen is to wait until it's last-gen. A bonus is that by then you'll probably be able to compare the upcoming feature sets, notably FSR 4 vs DLSS 4 vs XeSS 2 but most importantly whether the state of Intel ARC drivers are good enough for you.
They need 7900XTX raster with 4080 RT at 500-600$ because that's what the 5070 will probably offer for 600$, otherwise they're out of the market. I would like to see: 8700XT = 7900xtx raster and 4080 RT for 600$ 8700 = 7900xt raster 4070 ti super RT for 500$ 8600XT = 7800XT raster 4070 super RT for 350$ 8600 = 7700XT raster 4060ti RT for 275$. 8500XT = 7600 raster 4060 RT for 200$.
@@rogoznicafc9672 I think he's overbidding still. 8700XT = 7900xtx raster and 4080 RT for 600$ = 5070 "will probably offer for 600$"? If 8700XT = 5070 and both = 600$ guess what people will go for? nvidia.... Needs to be 8700XT = 5070 perf and 80% of 5070 price (480-500?) well more likely 5070 will start at 650, so 500$ max 8700XT is the way to go to employ the best value strategy. Nvidia cant raise more without loosing market share, even if to older gen (famous last words).
They won't hit XTX level raster but the 5070 is more than a year in the future so it doesn't matter what Nvidia may release then if AMD can still release their cards this year.
there is another probably positive side effect to leaving the "high end" more people will be able to afford to buy the top of their new line up card and it actually does feel good to have the top of the line thing, even if some other manufacturer is making even higher tier things.
I blame cryptobubble for 1050Ti marketshare. At one point almost all AMD GPUs were bought by miners. Shops in Poland were selling them directly to miners without even putting them on stock for normal people to buy. Even 200 and 300 series were selling for absurd prices because they had good computing performance. 1050Ti was cheapest new card avaible because it sucked at mining.
I think now they're focusing on the software side of things trynna catch up with nvidia's DLSS and AI features , RT too , it's a good decision tbh , raw performance effeciency and vram are not the only factor anymore , you need to factor in producivity use and media engine and nvidia seems to have an edge in those areas most of the time
this video sums up my feelings about the video card market right now. I think there needs to be more affordable cards for the masses not high end cards for the few. and if AMD is repeating a strat they did with Radeon HD 3000 series and then Radeon HD 4000 series I'd be ok with that because that worked at the time
The RX 480 8GB aged better than the 6GB GTX 1060, not only because it had more vram, but because the performance ended up being a lot closer to that of the 1070 in a lot of games later on, even when 6GB of vram was all you needed. It did take a long time for that 8GB version to become better than the 4GB version. At the time, 4GB actually wasn't that bad for a mid-range graphics card, but more and more games started offering higher resolution textures which you needed more than 4GB of vram for, even at 1080p.
I'd argue from then until recently, Nvidia cards have had more "additional features" while AMD has had more raster performance and RAM at a price. With the 10 series, it was a much better video encoder vs the 4xx, 5xx and vega lineup. The 5000 and 6000 series closed that gap, and with the 7000 series there is no longer that difference. The same thing happened again with RTX (given you ignore the first 3 years where there was no support), or drivers where AMD has generally had more issues, see vega launch or rx5xxx launch. If I wanted to make clips from my gameplay, Nvidia looked way better than what AMD had, especially at lower bitrates. For the longest time Nvidia has had the most powerful GPU (for those chasing every frame), and at lower tiers of performance, the most compelling set of additional features. Small things like this keep people buying Nvidia, and while some are more gimmicky than others, they add up and can have a huge impact in what a person choses to buy. I feel like with the launch of the 7000 series, AMD has caught up with all the actually important features Nvidia has. They have similar features in frame gen, upscaling, video encoding, VRR, power draw etc. The only real difference at this point is VRAM, which AMD has an advantage at most price points over Nvidia, RT and CUDA, which are niche and few people actually need. AMD has the capability of making competitive products, with competitive features that won't keep a loyalist buying green if AMD can offer a compelling price.
@@faranocks the quantity if features maybe is there, put there is no quality parity - I have RX 6800XT and i miss DLSS from when I had RTX 2060 - FSR is just not as good, and that is the main reason i will upgrade to nvidia :/
It's not really that close to 1070 lmao don't get delusional also 480 i'd say is about equal to 1060 it's the 580 which is slightly better than 1060 6G
I'd love the RDNA4 product stack to br varied and interesting like RDNA2. Rx 6000 series was so awesome to me. I'd love: -Rx 8500 = rx 6600 xt-rx 6650xt /rtx 3060 12gb - Rx 8600 = Rx 6700 10gb -Rx 8700= Rx 6750 xt - Rx 8700 xt = Rx 7700xt/rx 6800 -Rx 8800 = Rx 7800xt -Rx 8800 xt = Rx 7900gre. And the pricing should be around the CURRENT pricing, as off Novmber 2024. Max GPU cost of £700 for highest RDNA4 sku. Imagine if AMD did that. With better production features , more flesher out AV1 codec, power efficiency of Rtx 40 series. Even if the ray-tracing is say, 80%-90% as good as 40 series ray-tracing. That'll be quite nice. And olease, at least FSR 3.1 in dozens to dozens more games please (minimum 3.0).
AMD's mistake is that they're selling Hardware while NVidia is selling software RTX and DLSS are their strong points, and they have cut on improved raster performance generationally to sell their software features. Not only that, but they have their Halo product further, making the rest of their product stack look powerful by association. AMD needs their own exclusive software features set to push numbers. They can make their own software like a Super DSR or a graphical enhancer for frame rate or visual fidelity. Now that they own both consoles, they can further push it that software to consoles and get developers using this software on both Consoles and PCs. AMD needs a strong Halo product AND a software, unless they max out a card and have it at a VERY competitive price to counter the $2000 5090 price at better cost per frame.
No. That's utterly stupid. AMD should NOT contribute to the insanity of hardware specific software. What NEEDS to happen, is for people to stop being blinded by 'RTX and DLSS' and use their brains to understand the only reason NVIDIA has an advantage is because people keep rewarding them for anti-competitive practices.
@@jtnachos16 DLSS is better than FSR. Nvidia does ray tracing better. And encoding, too. These are things I use, and if i want my money's worth for these aspects, Nvidia is better.
@@jtnachos16 bro you think that ppl really are that stupid? do you even even have a high end 4000 card or are you just sitting there talking shit? nvidia's pricing is ridiculous, everybody knows that, but the absolute fact that dlss is simply that good is why ppl still prefer nvidia.
Halo product that nobody will buy, because "green card better". They were neck and neck with 30 series on each performance tier and did cost less. Guess who sold much more cards still
You can claim its fine but that top 25 on the steam hardware survey being what it is, they really need to tell us more "focus on where most people are". Really for now, it feels like they dont want to bother competing so they wont and we will all suffer. i kind hate that we have to hope on Intel to eventually scale up. PS - the markup is high on cards, so they could have done high end for a good price, but chose the stupid thing that of course meant they didn't sell - and we can thank the stock market for it being bad to not make ALL the money and try to actually compete on price.
I remember helping my brother buy a HD 3870. That was such a good GPU for the price. I was very jealous. Thankfully I did not have much time for gaming at the time, Think I had a 6600 GT at the time.
My only issue with AMD is that you run into odd edge cases depending on the software you use and if you do any on the fly encoding (Game streaming or wireless VR). For most people I recommend AMD, but if you use Blender or do any streaming I have to recommend Nvidia. Which sucks because I hate they're anti-consumer practices, lack of Opensource drivers (Until recently) and over priced hardware.
I expect some things since I tried the GRE... Better RT and Better power efficiency IDLE. I'm not asking for crazy spec but the best of both world, Razter and RT and maybe IA (I expect it in video games more and more and I used it for assistance)
3:00 tech channels back then were liking gtx1060 and gtx1050ti by extension way more than anything from AMD. I doubt it will change when we had very similiar situation with rx6700xt and 6800xt vs rtx alternatives
I just want a new RX 6400 single slot low profile GPU to pair with the 8600G, to slightly boost the iGPU performances. Micro-Atx low power is for my future. Graphics are almost like the 480... I dont need more than that, really. 115 Eur for a RX6400 Pulse 4gb, just upgrade to 8gb and relaunch it with Zen 3. If it can play Forza at 60 FPS, is good enough for me. The RTX A2000 Quadro is sweet, I wish AMD could make a similar card, and slightly cheaper, 500 is a lot, but I'm super excited by the low power cards. Arc 380, RX 6400, and now RTX A2000....
The problem with AMD aiming to compete at the mid range is they have to compete with themselves. The 6000 and 7000 series will drop in price and offer insane value.
I'm a bit confused, so no 8900 xtx, is that what they mean? I don't need the strongest graphics card I just want an upgrade from my 3090, so that I can run Linux easier.
tbh amd's doing the right thing because my rx 6800 xt leaves no room for upgrades. its just the BEST card ever made. if amd focusses on midrange once again i may MAYBE upgrade. but i definitely wont need to for the next several years
I want amd specifically because amd freesync premium monitors are cheaper and i already own one if there is unironically anything i will give up last is LFC not even raytracing or ai upscaling which my rx 7900 xt can run cyberpunk at high (not ultra cuz theres almosy no difference but the missing fps) settings fsr quality capped at 60 fps (and with stable consistent 1% and 0.1% partially due to my dram overclock) but man i do want amd to have better raytracing and ai upscaling for the niche games that benefit even tho most gaming I do is rastorized which is primarily why I also prefer AMD
I hope AMD's new series does well and has good prices, but after years of having constant driver-related issues on RX 580, I honestly don't trust AMD's software support enough anymore and will just go for Nvidia for my next GPU.
Radeon just needs its own Zen-moment and as stated in the video pricing is everything for this. My guess is that they can but they don't want to. Must earn them enough money even with low sales.
I will recommend everyone to keep expectations low and take every leak with a grain of salt. Zen 5 and RDNA 3 must be set as prime examples of when we as consumers but our excitement over reality.
5:20 people weren't tricked into thinking the card is good. At the time the card was released there was a very high price because of chip shortage, which is not AMD's fault and the card is quite capable for the price it is being sold right now.
marketing-wise AMD has done poorly these last two generation, but looking at actual market pricing I honestly think they have been the best option out there. Here in Sweden I was able to scoop up an 6800XT for the same price as an 8GB 3060Ti. People still buy nvidia. But in the all-popular mid-range pricepoint at around 250-550EUR Radeon has just offered much more value. I can't even say that AMD was worse at RT, because RT performance on the 6800XT is far better than the 3060Ti. :)
The only saving grace for them can be rtx 3070 performance with 12GB of vram in the price of rx6600. RX7600 is like a smaller brother to rx6700, but overclocked so it may be a bit faster, when it already should be rx6700xt performace with at least 10GB of vram! and XT version with 12GB and ~15-20% faster. I still prefer Nvidia, coz I can use Phsyx in older games and DLSS is a more stable look. But if they would give me rtx3070 performance with PCIE x16, with lower power draw and price a bit lower than rtx4060, then I would buy it someday. RT has a point only with rtx4060ti performane and +10GB of vram. This is why I wait for RTX5060 or used rtx4070.
I'm using a 6650 xt 8gb. I wanted to upgrade to an 7900 xtx for 4k gaming but it's too pricey. If the next gen rx 8000 will have a normal price then for me it's good enough. Even if the best rx 8000 is in reality a 7900 xtx. If the price is about 500€ for the best 8000 series and has the performance of an 7900 xtx then i'll buy it at day one.
Glad to see you be the voice of reason again. Was starting to question my sanity with so many people around me accepting and even defending the pricing of modern video cards.
AMD needs to deliver previous generation flagship performance for 500$ in order to take market share. If they deliver a 7900xt equivalent with better raytracing for 600$ like some outlets are reporting, then RDNA4 will be DOA.
Userbenchmark continues to bemuse
*Busemarchmenk
@@matti...*Userbarkmench
@@matti... *Ubermensch
Loserbenchlark
@@miguelrosete8930 Untermenschen!!!
AMD: It will be hot and ready!
Philip: Will it be good?
AMD: It will be HOT! And it will be READY!
I better hope their card are not HOT. 110 degrees hotspot toasty...
@@ner0718 Hotspot meh
@@ner0718 But will it be *nice*?
H-O-T-T-O-G-O you can get it hot to go!
ive been having the same problem with my xfx 6600
should I rma this card or is this normal?
@@ner0718
I am terrified at the thought of the 5090 price...
Between AMD saying they're mid end now and the 4070 GDDR6 having no price reduction for worse performance is disheartening to say the least.
$2500
Maybe Ngreedia will show us there great sense of humor again and the RTX 5090 will be $5090
@@PubEnemylike why they would care about gaming
4070 isn't exactly memory bandwidth limited so going from GDDR6X to GDDR6 isn't going to hurt performance (unless they've changed other stuff).
It will however, lower power consumption. GDDR6X is a weird Frankenstein experiment between NVIDIA and Micron that should've only been on the 4090 because its a halo product.
I do agree about price though. GDDR6 is cheaper so it should've gone down to consumers as well.
We dug this hole ourselves by not voting with our money and supporting NGreedia.
The RX 480/580 8GB was peak AMD, it was fast enough and had enough ram to max out textures for many years after launch.
Good times
Yes it was an outstanding GPU 580 8GB and got just around 5 year out of it
RX 6800 has the potential to be a decent classic as well. It sits in a nice price point secondhand these days, and if you run Linux, it is a no brainer.
I still have my 580 8 gb in a computer running Fedora. It works very well still.
@@BjørjaBear yes we group friends are working on hand me down build for friends child use old parts and my old rx580 is earmarked for build
I really hope they nail the price to performance ratio when it comes to rasterization. I don't care much about raytracing. I want bang for the buck and stability.
i absolutely do care about raytracing, but i dont think the actual software using it is good enough that we need it just yet
Raytracing is where rendering is moving towards, games are already being built with raytracing that CANNOT BE TURNED OFF, it's not going to be optional. You'll be shooting yourself in the foot by ignoring RT performance for a supposedly future proof upgrade.
Also saying you "don't care" about something is useless in the face of developers caring very much about it, they build the games and you buy the hardware to run them, you can't stop progress.
@@steel5897 which games?
@@ybned-ed8rz some UE5 games like avatar and black myth have some amount of ray tracing on at all times and the "ray tracing" mode is actually path tracing
@@steel5897 I really hope that this is not the case. As gamers we should be looking for fun games that are worth our time not the most realistic and graphical cutting edge. As graphical fidelity increases, the scope of games is also increasing which means longer development times. This is why AAA is dying. They do not see the value in making a fun game and would rather pump out 10 year projects with cutting edge technology, when a fun and enjoyable experience is all gamers actually care for. Also the UE5 look is absolutely horrid that is spitting out mirror copies of the same thing. Indie studios will undermine these mega studios if this continues on. And as for bigger studios, look at games like Elden Ring. Graphically speaking its nothing to brag about, with its bare minimum textures at best but with lighting that carries. Despite the mixed graphical fidelity it is easily one of the best games of modern.
RDNA 3 and Zen 5 have similar weird vibes. They both perform pretty much like the previous generation and cost more. I think RDNA 3 underperformed expectations, given that they chose to call the 60 CU model '7800 XT', vs. the 72 CU 6800 XT. In some games, the older card wins. This is the sort of weirdness that makes consumers trust and understand Nvidia, even if they are being ripped off. To be fair though, I did buy a 7800 XT because they were going for £429 and for that price it's solid.
RDNA 3 was a bit like Intel's Meteor Lake- moving stuff all over to a modular design, which itself is an achievement. But it's boring for us buying it even if it bodes well for the future? Maybe? If the modular design is scaleable and actually leads to stuff like more performance? Which it doesn't seem to be for some reason? etc
4060 Ti also sometimes underperformed the 3060 Ti and launched at the same price. At least the 7800 XT MSRP was cheaper than the 6800 XT lol.
@@2kliksphilip I've heard they're already thinking about ditching chiplet design for GPUs lol but we'll see
How does that make you trust Nvidia when they do the exact same thing you are talking about but sometimes even worse?
@@ShinDMitsuki the "trust" here means it will always have better performance and margin than previous gen. Also the "trust" count with driver stability, longetivity and the future proofing. I dunno if Nvidia is Open source like AMD.
It just needs to be affordable and better than 4060 ngl
This, 7600XT is dogshit and folks still better take 6700XT for it at just slight margin with 3060.
@@azravalencia4577 I wouldn't say the 7600XT is dogshit, it fills the niche that the 6600XT filled and that card was an underdog 1440p performer. I'd say the 7600XT is in the same boat, it competes with the 4060, has double the VRAM, and is almost an underdog 4K card at medium settings in a few games
This is supposed to compete with the 5000 series not 4000.
@@Toma-6216600 is so good tho
@@Toma-621 im being too harsh, i should say it does got overglanced by better priced 6700 XT.
The genie is out of the bottle: Nvidia saw that they can charge any price they want and they aren't gonna stop.
Of course, why would they charge less if people are still willing to pay? People love to complain about Nvidia pricing, but they are still selling GPUs like hot cakes. I guess people don't really care about their money and don't want to get more for their money.
@@rand0mtv660 Complaining costs nothing, and they are too lazy to search better alternatives.
@@rand0mtv660You mean rich people? I’m not allowed to be mad at the fact people who don’t care and just buy the best drive up the prices for everyone else?
@@ausden9525 I think you misunderstood my comment. I'm saying judging by how much people complain about these prices I would expect that Nvidia wouldn't really be selling that much, but people buy their stuff anyway. It's just crazy consumerism.
Hopefully Intel can bring in some competition since AMD doesn't seem to want to.
R.I.P. Anandtech :(
F
R.I.P Bit-Tech too :(
AMD being diet Nvidia but with worse features is really hurting them. They need to be like what they did to intel when they released Ryzen
That's easier said than done I'm afraid. Nvidia is head and shoulders ahead of the competition right now.
Well if you're not stupid you'll understand that they tried to mimic the success of Ryzen againt Intel by making the chiplet architecture on RDNA3, something that indeed made Ryzen powerful and affordable.
Their GPUs are indeed Powerful in raster for the price, better than Nvidia.
But it's easier to say than to make AMD superior, it's the first Chiplet GPUs it's not mature and it's complex, they're going back to mostly monolothic for RDN4, apparently.
Also devs optimisation on Nvidia using Optix and GameWorks makes games less optimized on AMD hardware, for example Alan Wake using optix, and that's one of the problems pointed out by the Radeon executive, which is why he's targeting the mid-range, selling as much as possible to have a base that would attract developers... PS5/PS4 games run superbly on AMD GPUs because they're RDNA2-optimized games.
but most PC games aren't... and gamers have this mania for not thinking Cartesianly and in industry and hardware detail, devs use Optix/GameWorks but you'll very rarely see the Radeon ProRender logo which optimizes for amd GPUs raytracing etc.
you're being duped by NVidia's marketing and showcases in partnership with studios like Remedy etc to sells card.
you are being fooled by NVidia's marketing and showcases in partnership with studios like Remedy etc who have contracts that require them not to use more powerful or similar tools, which is why Alan Wake does not use FSR3 and DLSS3 is implemented, these games are consciously under-optimized.
@@technite5360 it’s already a fact that AMD is just worse at ray tracing and even AMD knows this. Even AMD sponsored titles with Ray tracing often use lower fidelity ray tracing than those who are Nvidia sponsored. Plus you can’t ignore that AMD’s way of calculating Ray tracing absolutely sucks at path tracing. The calculations necessary make it extremely difficult to process as while some aspects their fine, others they suck at.
There’s a reason why AMD hasn’t shown off path tracing at all while Nvidia does. The consoles don’t show anything different. You just have different developers who are very good at optimizing for specific features on one set of hardware. The PS5 doesn’t really do any better in ray tracing than the similar PC GPU, the RX 6700. The games run as expected and you get the games that run great from the great developers that Sony has had in house that put out the best looking Sony games last gen. It’s just a case of the advantages of developing for one console so you can make hyper specific optimizations that you won’t get on PC
Not really, it's what gives them a niche in the first place. They can't produce something identical or better than NVIDIA and also price it cheaper. That's simply not going to happen.
So they have to look at the market and see where they can fit in, which is being "diet nvidia" as you put it lol.
Producing stuff that is slightly behind nvidia, with pricing that reflects it.
If you want low pricing, they have to give up in other areas.
They still fulfill the niche in the market, better price-performance for non-raytracing games. With the way that games are heading right now, with pretty much no AAA games set to release for 2025 and indie games dominating the market, I think its a growing niche. But you're right they need to advertise better and price even better if they want to beat nvidia
We need great value $150-$200 GPUs again. This pci ex 4x nonsense with the 6400 and 6500 XT killed AMD on the low end. What use is there being 11 options in the $350-$550 range if most people don't spend that much on a GPU??? How the hell were they thinking this would grow their market share?
Never gonna happen. Inflation, production costs etc. Prices will only rise.
@@zoltankramer3383 More so, there are budget cards in the 250-350 price range... 7600XT is around 300-350 bux and maybe it doesn't run everything at 60fps on 1440p ultra, but neither did the 570 or 580 lol. People's standards have just gotten absurd.
With Nvidia inflation is pretty obvious though. Adjusted for inflation the $379 GTX 1070 launched as a $497 card and the RTX 3070 launched at $499.
Obviously it doesn't bring much comfort if prices have only risen according to inflation but your wages haven't....(and obviously if wages also rose then the GPU prices would rise faster than inflation for the sake of profit) but that's definitely a larger issue with causes completely disconnected from GPUs...
would be in the 3-400$ ball park
Isn't the RX 6600 150$? Maybe i didn't understand your comment, but the RX 6600 is better than both you mentioned. It's not even "low end", yet still around 150$. What the hell do you consider mid end?
I've given up on hoping that AMD will actually launch a good value product, instead of barely undercutting Nvidia. I am FAR more interested in Intel's Arc Battlemage (gen2) than RDNA 4, as Intel has made it clear they are willing to offer really good prices to sell products and the drivers have made huge progress to the point where Arc GPUs are recommended now.
Go play Space Marine 2 a week after release, that runs on a750 at half the fps of 4060/7600 and tell me more about great Intel drivers
Unfortunately Intel is in serious trouble. They fired bunch of people, their stock has gone down and they just aren't doing very well. We'll see if this whole GPU division they have will live long term.
@@necrotic256wait, space marine 2 runs at half FPS compared to a 4060? Isn't that card half the price of a 4060? Seems pretty good to me
Intel are currently making GPUs twice as large as AMD just to get the same performance level, that's a very unsustainable strategy given their recent financial woes. Sure the Arc cards are priced well currently, but i doubt they're making any money from them. I'm hoping Battlemage improves things, but they'll need to seriously improve their drivers and architecture if they want to remain competitive.
I have no faith in Intel anymore, they have that "declining company" aura similar to what Boeing has. I get the feeling it's all downhill for them. Already rumors of them potentially selling their manufacturing division altogether though admittedly I don't know if that has any truth to it.
I remember when I got my HD7770, that price/performance was insane back in the day.
I hope they manage to do something similar again.
The focus on the elephant in the room, price, is appreciated here.
the last super notable amd card that really made waves was the 580 series of cards, they weren't powerhouses but damn were they good value and gave good enough performance, should've kept that train rolling when they had the chance
And they’re still relatively high in deployment in the Steam user hardware survey for the AMD stack.
I loved the 4870. It was released in 2008 at $300. Which is $438 in today's dollars. Today, this money gets you RX 7700 XT 12GB, 20% slower than 6900 XT.
If 8800xt delivers 7900xtx performance, it has to be hella efficiency gains in a single generation.
Nah, doubt it. 7900 XTX power draw is a hella chomp. But if 8800 draw lower, that's true hella efficient there.
Current MLID leaks suggest that it'll have 7900 XT-level performance rather than XTX, with RT performance equivalent to the 4070 Ti Super. Take it with a grain of salt, of course.
7900gre ... top RDNA4 card ... 7900XT - my guess.
@@Vorexia imma suggest not to suggest any leaker and keep your expectations in the sand because the last 2 times we have did this it ended with everyone being dissapointed
@@KrYPTzx Did you read my last sentence?
even tho you're an nvidia enjoyer, i respect that you also make amd videos.
What kinda corporate braindead shill are you? wtf
Watch radeon release same perfromance as nvidias card but 50$ cheaper as they have for several generations... So the question becomes is better upscaling and other software features worth extra 50$. Want higher fps turn on DLSS2 (thats miles better than FSR2), want eye candy turn on raytracing that runs better on nvidia cards...
Main issues right now is alot of the time when a new game is out it doesn't work on amd cards and just crash for the first week of release, one of the main reasons I have not bought one.
@@currentlyatlas5074 think you're mixing up intel gpus with radeon. Radeon drivers are fine. Im not asking much just for FSR2 to match DLSS2 picture quality.
Devs these days using upscalers to wipe their ass doesn't bode well for the future.
@@Takashita_Sukakoki Wukong, BG3, Witcher next gen update all were unplayable on launch on 7000 series gpus. WoW DX 12 dosent work to this day on AMD cards too.
Theres plenty of problems with radeon drivers.
100%. DLSS is much more mature than FSR and significantly looks better when upscaling 1080p to 4k screen. Not only that, but I'd gladly pay $50 if I don't have to deal with AMD driver issues (seriously why do they keep breaking their own driver and release it anyway).
1:23 Anandtech's article will be no more soon, so much history will be just gone😢
They said they'd keep it up
@@2kliksphilip If you really want to trust them.
@@crazybeatrice4555 They are probably making enough from adverts to keep the site running - just not enough to buy and review products.
I remember about 2 decades ago I've bought some program to download sites. Internet was on per-minute tariff, so you could just spent like 15-30 minutes and entire site, with gifs and flash animations was soved locally.
What about the way back machine?
I'm looking forward to being dissapointed
heh, me too
We all are
I also expect pricing to be not great. AMD's manufacturer is TSMC, and they are expensive, since they're the only high end chip manufacturer.
The RX470 was manufactured by Global Foundries, which made them worse at power efficiency, but allowed AMD to keep their pricing low.
PS: Funnily I didn't click until now partly because I didn't want to hear speculation about performance etc, as you've mentionend in the video. It was a great watch and I'm glad you've managed to make something I didn't expect!
Shout-outs to the Fiji Fury series for being a wild bonkers card that (in the Fury X) was way too huge, hot, and power hungry but was still an amazing experiment.
At one point the r9 nano was sort of a "midrange" card... that was actually really good too. Undervolt them slightly, and get even better performance.
AMD's pricing really is a problem, one what is compounded by the feature set disparity. RX 7700XT launched at 489€, sold poorly, and is now down to 389-399€. Had it launched at or below that price point, they would've had a card faster than the RTX 4060 ti 8gb on the market, with sufficient VRAM and performance lead to make it a genuine option even with FSR being what it is. But no, AMD had to try and upsell while having the inferior upscaler and worse raytracing performance, against a competitor with market share and mind share dominance. And I'm saying this as someone who owns an AMD GPU myself!
not only are you comparing the prices of amd gpu that is head neck and shoulders above the nvidia card LITERALLY in another tier, you're also talking about raytracing on anything short of a rtx 4080. incredible.
over here in the uk anything nvidia is overpriced by at least £100 but it can easily go over £300 more for the same tiered card. a 4060ti 16gb is priced like a 7800xt. the time of year does not matter.
i agree launch prices were ridiculous but what prices werent? a radeon 6750xt was £463 2 years ago because we left covid, lived in a crypto mining crisis and still continue live in a crisis because covid.
what happens now, is you can buy a current gen at discounts while always having the option to buy a high tier previous gen for cheap. you'd also get to do yourself a favour by running native. why because dlss and fsr are trash, dont delude yourself that playing with vaseline over the screen is fantastic. they're there to mask how terribly rushed new games are. games made from unreal engine come to mind as they print photorealistic games like never before, while adding dlss and fsr as the only optimisation.
wait until people remember/find out we had photorealistic games back in 2015 rivalling or beating todays games, running off r9 290/gtx 970's.
An honest AMD owner, rare.
I ordered but not paid yet for rx 7900xt at 661 euro (including tax and shipping). I already have rtx 4070 and don't know if I should sell it and pay for that order, as I would need to add 200 euro or more. I'm not convinced because of bad fsr and poor rt performance. Yes the card would be faster and 8gb vram more, but in rt+fsr it won't run much better and won't look the same.
@@gameison4813
TH-cam comments aren't the best place for any advice, let alone monetary advice. Reddit's buildapc for example would be a better place to go to.
With that caveat away, if the RTX 4070 performs well enough that you can afford to wait six months or more, I'd recommend waiting until the RTX 5000 series launch is complete, which itself is expected to occur in late 2024 or early 2025. At minimum I'd wait until Black Friday sales. RX 7900XT at 660€ is a fairly decent price and the raw performance in non raytraced situations and very light raytraced games like Resident Evil 4 Remake is faster or nearly as fast as 4070ti super. Of course, the moment you try to run say Alan Wake 2 with raytracing the performance craters. At 1440p native the 7900xt would have the same or worse performance as the RTX 4070 at 40fps or so, with an upscaler that isn't as good. It all depends on what games you're likely to play.
Honestly? I wouldn't bother upgrading until I'd get +60% performance boost at around the same price or if the GPU I had couldn't get 60+ fps at the resolution of my monitor. The 4070 only launched last year, so I'd wait until next year at minimum before even considering upgrade. If you had a 2070/3070, it would make sense to upgrade, but I'd rather save the money and see how the market develops if I was in your position.
Again, I will stress that this is the youtube comment section, so take all of what I said with a shovelful of salt.
I hope we go back to the RX 400/500 series days. You even got options for entry-level cards. My first GPU was the 550, which I upgraded to a 580 a year later. I still have it on my testbench system. It's honestly the best AMD's GPU division has ever been
Still rocking the old Polaris RX 590. The GPU market has been terrible ever since. It's a shame that great range never took off. This event probably cascaded into the sad state of affairs we have today.
Got a 5700XT for like $150 a few years ago and it's still running all my games perfectly
What a vague statement
@@VinniePaz it's an anecdotal internet comment. I'm not writing you a dissertation
@@someonesomewhere8658 Perfect 👌
My favorite food review channel
I think you should have talked about battlemage along with rdna 4 since it will be coming around or after rdna 4 , and it seems they have addressed the issues with alchemist and older game compatibility will be better and newer titles will also be running great from day one.....
Lunar Lake has a Battlemage inside and it seems to be pretty powerful.
@@cube2fox i know thats why i said so...
@@bmqww223 Yeah. And Intel also has more commitment to upscaling (XeSS). I bet they already work on AI frame generation. I would be surprised if AMD adds something like DLSS or XeSS to RDNA part 4.
@@cube2fox AMD should partner with the Lossless Scaling guy for framegen. LS is fucking incredible for laptops or other dual-GPU (iGPU + GPU) setups and supporting something like that on the hardware level would be cray cray.
@@cube2fox they are working on a fsr AI or fsr 4 , upscaling for sure, rdna 3 was a wakeup call for them when they saw xess 1.3 drop and beat it easily , and they were a bit cocky with the intel driver situation, but know if you follow the intel engineers presentation they are have thrown the microsoft directx 9 to 12 compatibility layer and running in the metal which shows that older game would run fine, and for the modern game, just see their indirect xi metrics its almost 12 times more(fancy marketing way of saying that they have got it up and running) which will help in modern titles and even help with shitty day one launches which gets fixed, so i am 100 percent sure battlemage is the reason amd is singing different tune right now and the threat is looming over amd right now .......
I bought a RX 7800 XT to replace my GTX 1070, 1st time I ever went with AMD in 9 years and changing the GPU brand. And honestly I'm more than satisfied with it since I can run all games no problem, just a shame people would buy the more expensive and inferior RTX 4070 just because it's from Nvidia and that people still think "AMD has really bad drivers" and is "unstable" but I've been using my 7800 XT for 6 months and had 0 issues. Just wish people would do their research.
Same. I got my 7900 XT as an upgrade to my 6700 XT since I was upgrading from a HP Reverb G2 to a Bigscreen Beyond, and I wanted the extra performance for VR and my 32:9 1440p monitor. I was looking at the 4070 ti, but the extra VRAM on the 7900 XT sold me on it and I couldn't be happier. There are games that I have played that used more than 16gb of VRAM, so it definitely helps.
@@smallbutdeadly931 I just couldn't justify buying the 4070, since in my country I paid 520 Euros for the Asus Dual 7800 XT, and the cheapest Inno3D RTX 4070 that probably was horrible was 635 Euros, at which I could've bought the 7900 XT lol, so I just got the 7800 XT and im really happy with it :D
Even on VR I have no driver issues with my 7800XT.
Yes, people should do their research. Trust the guy who says "just buy it I have had no problems." If you think the 4070 is inferior to the 7800XT, then the one who should be doing their research is you.
@@mag3slay4 Nvidia shill.
Who in the hell are you to decide if someone is having problems?
would be nice if they can ofer rtx 4080 ( 5070) level performance card but much cheaper but I doubt it they would sell it much cheaper, maybe like 10% lower price at best
The rx 7900xtx is already rtx 4080 levels and 10% cheaper.
@@sierra5065 without ray tracing
@@sierra5065 It also consumes 100 watts more power, so depending on where you live you will pay for it in the long term.
@@sierra5065 wow, 10% cheaper - needs to be more like roughly 25% - they cannot price that close if they want the price to be such a selling point.
@@pathway4582 I truly cannot comprehend spending hundreds or thousands on something as frivolous as a gaming PC and then worrying about a power bill like a penny pinching old man. It seems common online, but I just can't fully get my head around it.
.16c per kilowat-hr is normal pricing for electricity
100 watts means if you used it 10hrs , it'd cost you an extra 16 cents on your bill compared to a card 100 watts more efficient.
So we are literally talking about a couple dollars even if you game ridiculous amounts.
Let's say you game like it's your job. 40hrs a week would be 160hrs a month. Even if we say the electricity is .25c per kwh.. that's a whopping 4 dollars on your bill!
An extra 50 dollars a year? Tops?
when I built my first pc in 2013 I used an r9 290 gpu, it was the top of the line card and it only ran me about $350, crazy how the times have changed
the pascal/polaris era of gpus were absolute powerhouses, its sad how things turned out not that long after their releases. I'm not ultra confident rdna 4 will be the shakeup we were hoping for
Nice to see AMD is working to improve FSR with FSR 4
Decent at good prices is not good enough - its needs to be the xx70 Ti but like no more than $399. Are they going to do that? They were rather vague, and moreso than if this really meant blowing nvidia away on mid performance and pricing - or just equaling the new stuff at way lower prices. So, really, its "no high end card, nothing else changes, but sell it as something good and watch influencers think we meant something else because stock price".
This, so much this. a 4070 ti or 4070 super + 24GB vram for 400$ would hit the jackpot. AMD don't need to aim at the 4090, they need a great card at a great price. If they fail to deliver at least this then they will fail again.
The GTX 1050 Ti had the huge advantage of having low-profile models that had no external power connectors. Cramming a 1050 Ti into a small form factor office PC (like Dell Optiplex or Lenovo Thinkcentre) from eBay was an unbeatable deal. Hell, in my country the 1050 Ti is still quite expensive on the second hand market because of it.
It would be nice to have a modern GPU that:
1) is low profile,
2) has no external power connectors,
3) is based on the newest architecture,
4) is as cheap as possible.
While APUs are great, their iGPUs are often not good enough. We need dedicated GPUs for ordinary people and we need them ASAP.
I've been waiting for a mid-range Radeon card. Let's hope these are good cards because I desperately need an upgrade.
I'm trying to jump to Linux and told myself "the next card I want is the top-shelf AMD card" since I don't use the RTX features while in Windows, the performance is similar, if not better in some areas, and the compatibility with Linux is much better (ongoing progress in that area, but still not there for my Linux choice). I'm still going to consider AMD when the new generation(s) release and I feel the need to upgrade, but I don't really see that happening until maybe 2025.
I bought the 1050Ti when it came out. I had 150€ to spend and quite frankly didn't even know AMD existed.
Dumb highschoolers not doing market research are to blame for the 1050ti selling like hot cakes I reckon, based on my own experience.
As Linux users AMD is the only real option for a high-end card that doesn't depend on drivers compiled from dubious sources.
RDNA 1 was also only targeting the mid-range, with the highest performing card being the 5700 XT, which was similar in performance to the 2070.
Hey, a small addition. RX 6800 was a cutdown version of 69xxXT. Not sure why they decided to use Navi 32 instead of Navi 31 for 7800XT, but AMD used Navi 21 for 6800 series. It was priced kind of competitively for a flagship card, and the remaining stock of it is an unbeatable value card for what it is, as they are sold for $350~
Edit: though you probably already know this now that I think about it, as you deliberately omitted it while mentioning 6700xt and 7800xt.
Why would I mention the 6800 XT when referring to intentionally smaller dies made for lower tiered parts?
@@2kliksphilip I felt like it was worth mentioning, as you went through 6700XT AND 7800XTs being different dies than their top-end counterparts. It felt like you either didn't know or intentionally omitted the existence of 6800s, and them being the same die as their top-end products.
I know that it's just a tid-bit piece of really unimportant part, but RX 6800 and their reduced prices(being the release MSRP of RX 6600, oh the times) are kind of what you are looking for when it comes to GPUs, from my understanding that is. Plenty VRAM, same die as powerful counterparts, energy efficient for what it is and cheap compared to other stuff
Which is why I kinda made an edit under the original comment to further say that you probably already knew it.
Why would I mention top end dies when mentioning intentionally shrunk mid end dies? It's literally completely different lol
@@2kliksphilip The transcript : "And the whole idea that their cards will be smaller and cheaper and more efficient just because they're only targeting the mid-range doesn't really make sense when you consider that cards like the Radeon 6700XT and 7800XT are already kind of like this. They're not cut-down versions of a high-end card, they are their own smaller die"
This kind of directly describes RX 6800. More efficient than RTX 3070, performs near 3080. Hardware Unboxed and GamersNexus praised the card for its efficiency. They had a good card that should have dominated the mid-range with stats on paper, but as usual, people wanted NVIDIA.
While you'd be correct to call me out on RX 6800 not being mid-range or very accessible, the MSRP was priced a bit above 3070, and should have been a $500 card, just like RX 7800 XT. You are also correct on the front that AMD didn't advertise the card's upcomings like they did in Polaris.
After all, RX 6800 is an intentionally shrunk down top-end SKU that was priced tad above Navi 22 GPUs. Just like 7900 GRE's pricing compared to 7800XTs nowadays.
Though it sucks that they now do top-end priced like high mid-end instead of top-end to low-mid end.
I want ITX / reasonably sized / space-competitive cards, i.e. single fan ~200W card like 1080/2070/3060Ti (notice there is no 4070)
And the single Powercolor RX 5700, but nothing for the 6650 / 7600
I've been thinking about upgrading from my Sapphire Radeon XR 5700 XT lately and I'm waiting to see what the 8000 series are going to be about.
20% more performance at same power or nothing.
I'll be starting on a new system this black week, hoping to catch some sales, but I'll be holding off a few months on upgrading graphics card awaiting the next generation. I'm hoping AMD can offer good performance and raytracing at 120FPS+ 1440p along with an updated tech stack that doesn't suck ass. Right now the best option for my goals seems to be the 4070 Ti Super, which is a whopping $200 more expensive than the 7900 XT for the same rasterized performance and LESS VRAM! I'm paying a huge price premium for what's essentially just a few RT cores and better software.
I do hope RDNA 4 is good. I've been getting that upgrade itch the past few weeks and I figure the best time to do so is next generation or maybe the generation after that with whatever the most valuable midrange cards are.
Sadly, we all know RDNA 4 won't be as cheap as we hope. Also, release the RTX 4090 8K followup where you add DLSS in the mix pls.
i think AMD will do pretty well. Im pretty optimistic overall about AMD as a brand and im hopeful that they can deliver their goal of a mid range champion. That being said, a few years back I bought a 6900 xt and its been holding up great. Eventually I'm hopeful that mid range cards will be good enough to do 4k gaming with lighter frame generation similar to how they went from being 1080p champions to 1440p ones.
i was thinking of buying the 7700, but now i will wait for the 8000 series
Honestly at that price point you won't make that big of a loss as there's not that much value to lose. Waiting for half a year for a budget card doesn't seem like the greatest of ideas.
If you're looking for something like a 7900XT or 7900XTX then I'd argue that waiting is smart if you want, but anything below. Nah.
Buy the 6800 or go save for 7900GRE, that's the only good thing with Red rn.
@@Spentalei rn i have t he 1660ti so upgrading to one of those will be a huge leap
@@byperx You can probably get like 120$ for it second hand. New 7700XT's go for about 400$s which is great value and will more than double your performance. If you want a quick indication of performance difference google for " TPU" and look at the relative performance.
With a 1660Ti I am, however, somewhat concerned that your CPU is underpowered. If you're at 1080p you should take that into consideration.
@@Spentalei If, instead of considering how much value the 7700 XT will lose, you consider what else you'll be able to get by then at its current price point, then things do change a bit. The 7800 XT or even the 7900 GRE will be likely be available for a very minor price increase by that point, if not a similar price altogether. With their memory configs, those cards have a much more positive future ahead of them and can even fiddle with 4K. The best time to buy current-gen is to wait until it's last-gen. A bonus is that by then you'll probably be able to compare the upcoming feature sets, notably FSR 4 vs DLSS 4 vs XeSS 2 but most importantly whether the state of Intel ARC drivers are good enough for you.
They need 7900XTX raster with 4080 RT at 500-600$ because that's what the 5070 will probably offer for 600$, otherwise they're out of the market.
I would like to see:
8700XT = 7900xtx raster and 4080 RT for 600$
8700 = 7900xt raster 4070 ti super RT for 500$
8600XT = 7800XT raster 4070 super RT for 350$
8600 = 7700XT raster 4060ti RT for 275$.
8500XT = 7600 raster 4060 RT for 200$.
oh you are mighty hopeful my friend
@@rogoznicafc9672 I know right? 🤣
@@rogoznicafc9672 I think he's overbidding still.
8700XT = 7900xtx raster and 4080 RT for 600$ = 5070 "will probably offer for 600$"?
If 8700XT = 5070 and both = 600$ guess what people will go for? nvidia....
Needs to be 8700XT = 5070 perf and 80% of 5070 price (480-500?)
well more likely 5070 will start at 650, so 500$ max 8700XT is the way to go to employ the best value strategy.
Nvidia cant raise more without loosing market share, even if to older gen (famous last words).
matching a 4080 in RT is completely pointless. when it would be a 5070 next generation.
They won't hit XTX level raster but the 5070 is more than a year in the future so it doesn't matter what Nvidia may release then if AMD can still release their cards this year.
there is another probably positive side effect to leaving the "high end" more people will be able to afford to buy the top of their new line up card and it actually does feel good to have the top of the line thing, even if some other manufacturer is making even higher tier things.
I blame cryptobubble for 1050Ti marketshare.
At one point almost all AMD GPUs were bought by miners. Shops in Poland were selling them directly to miners without even putting them on stock for normal people to buy.
Even 200 and 300 series were selling for absurd prices because they had good computing performance.
1050Ti was cheapest new card avaible because it sucked at mining.
I think now they're focusing on the software side of things trynna catch up with nvidia's DLSS and AI features , RT too , it's a good decision tbh , raw performance effeciency and vram are not the only factor anymore , you need to factor in producivity use and media engine and nvidia seems to have an edge in those areas most of the time
Finally a good video on my feed
A competitive 200$ gpu would go so crazy
this video sums up my feelings about the video card market right now. I think there needs to be more affordable cards for the masses not high end cards for the few. and if AMD is repeating a strat they did with Radeon HD 3000 series and then Radeon HD 4000 series I'd be ok with that because that worked at the time
The RX 480 8GB aged better than the 6GB GTX 1060, not only because it had more vram, but because the performance ended up being a lot closer to that of the 1070 in a lot of games later on, even when 6GB of vram was all you needed.
It did take a long time for that 8GB version to become better than the 4GB version. At the time, 4GB actually wasn't that bad for a mid-range graphics card, but more and more games started offering higher resolution textures which you needed more than 4GB of vram for, even at 1080p.
I'd argue from then until recently, Nvidia cards have had more "additional features" while AMD has had more raster performance and RAM at a price. With the 10 series, it was a much better video encoder vs the 4xx, 5xx and vega lineup. The 5000 and 6000 series closed that gap, and with the 7000 series there is no longer that difference. The same thing happened again with RTX (given you ignore the first 3 years where there was no support), or drivers where AMD has generally had more issues, see vega launch or rx5xxx launch. If I wanted to make clips from my gameplay, Nvidia looked way better than what AMD had, especially at lower bitrates. For the longest time Nvidia has had the most powerful GPU (for those chasing every frame), and at lower tiers of performance, the most compelling set of additional features. Small things like this keep people buying Nvidia, and while some are more gimmicky than others, they add up and can have a huge impact in what a person choses to buy.
I feel like with the launch of the 7000 series, AMD has caught up with all the actually important features Nvidia has. They have similar features in frame gen, upscaling, video encoding, VRR, power draw etc. The only real difference at this point is VRAM, which AMD has an advantage at most price points over Nvidia, RT and CUDA, which are niche and few people actually need. AMD has the capability of making competitive products, with competitive features that won't keep a loyalist buying green if AMD can offer a compelling price.
@@faranocks the quantity if features maybe is there, put there is no quality parity - I have RX 6800XT and i miss DLSS from when I had RTX 2060 - FSR is just not as good, and that is the main reason i will upgrade to nvidia :/
It's not really that close to 1070 lmao don't get delusional also 480 i'd say is about equal to 1060 it's the 580 which is slightly better than 1060 6G
I'd love the RDNA4 product stack to br varied and interesting like RDNA2. Rx 6000 series was so awesome to me.
I'd love:
-Rx 8500 = rx 6600 xt-rx 6650xt /rtx 3060 12gb
- Rx 8600 = Rx 6700 10gb
-Rx 8700= Rx 6750 xt
- Rx 8700 xt = Rx 7700xt/rx 6800
-Rx 8800 = Rx 7800xt
-Rx 8800 xt = Rx 7900gre.
And the pricing should be around the CURRENT pricing, as off Novmber 2024. Max GPU cost of £700 for highest RDNA4 sku.
Imagine if AMD did that. With better production features , more flesher out AV1 codec, power efficiency of Rtx 40 series.
Even if the ray-tracing is say, 80%-90% as good as 40 series ray-tracing. That'll be quite nice. And olease, at least FSR 3.1 in dozens to dozens more games please (minimum 3.0).
Thank you Mr Philip
Press “X” to doubt.
AMD's mistake is that they're selling Hardware while NVidia is selling software
RTX and DLSS are their strong points, and they have cut on improved raster performance generationally to sell their software features. Not only that, but they have their Halo product further, making the rest of their product stack look powerful by association.
AMD needs their own exclusive software features set to push numbers. They can make their own software like a Super DSR or a graphical enhancer for frame rate or visual fidelity. Now that they own both consoles, they can further push it that software to consoles and get developers using this software on both Consoles and PCs.
AMD needs a strong Halo product AND a software, unless they max out a card and have it at a VERY competitive price to counter the $2000 5090 price at better cost per frame.
No. That's utterly stupid. AMD should NOT contribute to the insanity of hardware specific software. What NEEDS to happen, is for people to stop being blinded by 'RTX and DLSS' and use their brains to understand the only reason NVIDIA has an advantage is because people keep rewarding them for anti-competitive practices.
@@jtnachos16 DLSS is better than FSR. Nvidia does ray tracing better. And encoding, too. These are things I use, and if i want my money's worth for these aspects, Nvidia is better.
@@jtnachos16 bro you think that ppl really are that stupid? do you even even have a high end 4000 card or are you just sitting there talking shit? nvidia's pricing is ridiculous, everybody knows that, but the absolute fact that dlss is simply that good is why ppl still prefer nvidia.
Halo product that nobody will buy, because "green card better". They were neck and neck with 30 series on each performance tier and did cost less. Guess who sold much more cards still
@@necrotic256omg. Calm down, go buy an ATI gpu and wait, because I swear this is the year of Linux desktop.
You can claim its fine but that top 25 on the steam hardware survey being what it is, they really need to tell us more "focus on where most people are". Really for now, it feels like they dont want to bother competing so they wont and we will all suffer. i kind hate that we have to hope on Intel to eventually scale up. PS - the markup is high on cards, so they could have done high end for a good price, but chose the stupid thing that of course meant they didn't sell - and we can thank the stock market for it being bad to not make ALL the money and try to actually compete on price.
3870 was a pretty sweet card back in the day. Only upgraded to a 8800gtx since i got one for nearly free.
it will be best option for budget to 600$ (yes 600 dollars is budget nowadays)
I remember helping my brother buy a HD 3870.
That was such a good GPU for the price. I was very jealous. Thankfully I did not have much time for gaming at the time, Think I had a 6600 GT at the time.
AMDnubis: There will be gaming.
Me: Will there be be better performance?
AMDnubis: There will be gaming.
4870 was insane apart from the vrm which got stupid hot, watercooling fixed that for me though :D
my favourite hardware channel
Thanks for stopping with the insane stuff and actually just making a normal informative video. Hopefully the sanity holds.
My only issue with AMD is that you run into odd edge cases depending on the software you use and if you do any on the fly encoding (Game streaming or wireless VR).
For most people I recommend AMD, but if you use Blender or do any streaming I have to recommend Nvidia.
Which sucks because I hate they're anti-consumer practices, lack of Opensource drivers (Until recently) and over priced hardware.
I have 6700XT and it's a Beast playing at 1440p high/ultra settings 130-144 FPS :)
i bought a 7700 xt recently because its price had dropped to about £380, making it a reasonable price point for this build im doing.
I expect some things since I tried the GRE... Better RT and Better power efficiency IDLE.
I'm not asking for crazy spec but the best of both world, Razter and RT and maybe IA (I expect it in video games more and more and I used it for assistance)
3:00 tech channels back then were liking gtx1060 and gtx1050ti by extension way more than anything from AMD.
I doubt it will change when we had very similiar situation with rx6700xt and 6800xt vs rtx alternatives
I just want a new RX 6400 single slot low profile GPU to pair with the 8600G, to slightly boost the iGPU performances. Micro-Atx low power is for my future. Graphics are almost like the 480... I dont need more than that, really. 115 Eur for a RX6400 Pulse 4gb, just upgrade to 8gb and relaunch it with Zen 3. If it can play Forza at 60 FPS, is good enough for me. The RTX A2000 Quadro is sweet, I wish AMD could make a similar card, and slightly cheaper, 500 is a lot, but I'm super excited by the low power cards. Arc 380, RX 6400, and now RTX A2000....
Would love a rx7800xt performance card for $400 in sweden. Its $650 now.
The problem with AMD aiming to compete at the mid range is they have to compete with themselves. The 6000 and 7000 series will drop in price and offer insane value.
I'm a bit confused, so no 8900 xtx, is that what they mean? I don't need the strongest graphics card I just want an upgrade from my 3090, so that I can run Linux easier.
If AMD can release a card that performs as good if not better than the rumored 5070 but with way more VRAM and lower price then I'm sold.
tbh amd's doing the right thing because my rx 6800 xt leaves no room for upgrades. its just the BEST card ever made. if amd focusses on midrange once again i may MAYBE upgrade. but i definitely wont need to for the next several years
At the top of the 8000 i’m really hopping for performance between a 4080-4090 at $1k usd but that might be hoping for too much.
i suspect it will be nice
2:36 Return of the MuserBenchark. Yay!
i think at rt it will be really a hard competetor for nvidia, not at highend but on entry and midtier gaming and at creators :)
I want amd specifically because amd freesync premium monitors are cheaper and i already own one if there is unironically anything i will give up last is LFC not even raytracing or ai upscaling which my rx 7900 xt can run cyberpunk at high (not ultra cuz theres almosy no difference but the missing fps) settings fsr quality capped at 60 fps (and with stable consistent 1% and 0.1% partially due to my dram overclock) but man i do want amd to have better raytracing and ai upscaling for the niche games that benefit even tho most gaming I do is rastorized which is primarily why I also prefer AMD
I hope AMD's new series does well and has good prices, but after years of having constant driver-related issues on RX 580, I honestly don't trust AMD's software support enough anymore and will just go for Nvidia for my next GPU.
Radeon just needs its own Zen-moment and as stated in the video pricing is everything for this. My guess is that they can but they don't want to. Must earn them enough money even with low sales.
I will recommend everyone to keep expectations low and take every leak with a grain of salt. Zen 5 and RDNA 3 must be set as prime examples of when we as consumers but our excitement over reality.
5:20 people weren't tricked into thinking the card is good. At the time the card was released there was a very high price because of chip shortage, which is not AMD's fault and the card is quite capable for the price it is being sold right now.
That's what tricked people into thinking it IS good now, even though there isn't a chip shortage any more.
I just got a 7900gre and called it a day for now. More than enough for what I do and down the road I’ll just buy a rtx card again I guess
marketing-wise AMD has done poorly these last two generation, but looking at actual market pricing I honestly think they have been the best option out there. Here in Sweden I was able to scoop up an 6800XT for the same price as an 8GB 3060Ti. People still buy nvidia. But in the all-popular mid-range pricepoint at around 250-550EUR Radeon has just offered much more value. I can't even say that AMD was worse at RT, because RT performance on the 6800XT is far better than the 3060Ti. :)
The only saving grace for them can be rtx 3070 performance with 12GB of vram in the price of rx6600.
RX7600 is like a smaller brother to rx6700, but overclocked so it may be a bit faster, when it already should be rx6700xt performace with at least 10GB of vram! and XT version with 12GB and ~15-20% faster.
I still prefer Nvidia, coz I can use Phsyx in older games and DLSS is a more stable look. But if they would give me rtx3070 performance with PCIE x16, with lower power draw and price a bit lower than rtx4060, then I would buy it someday.
RT has a point only with rtx4060ti performane and +10GB of vram. This is why I wait for RTX5060 or used rtx4070.
I'm using a 6650 xt 8gb. I wanted to upgrade to an 7900 xtx for 4k gaming but it's too pricey. If the next gen rx 8000 will have a normal price then for me it's good enough. Even if the best rx 8000 is in reality a 7900 xtx. If the price is about 500€ for the best 8000 series and has the performance of an 7900 xtx then i'll buy it at day one.
Everyone should support amd , if they stop making it , you best believe the mid range price for a GPU would be expensive.
I'll upgrade my rx480 when I'm offered the same relative value.
Thankfully the AAA offerings from the last 4 years haven't been enticing at all.
Glad to see you be the voice of reason again. Was starting to question my sanity with so many people around me accepting and even defending the pricing of modern video cards.
AMD needs to deliver previous generation flagship performance for 500$ in order to take market share. If they deliver a 7900xt equivalent with better raytracing for 600$ like some outlets are reporting, then RDNA4 will be DOA.
We need something like the 6800 xt again. Thats going on Amazon for 350 usd, fucking amazing value