AMD just made a nonsense excuse, since hilariously the 7900XTX has the highest ownership share on the Steam survey right now out of the entire 7*** series. It's not 10% market share vs. 80% market share as claimed at all. They have the entire console market. They have the devs, since nearly all games are made on consoles; all nVidia features have to be tacked on afterwards. It's just mental gymnastics to claim it's a reasonable, conscious move. They are just covering for the low-performance products they will present later on. "They're bad in benchmarks? Oh well, it's intended," they will say. I wish AMD well and want to buy from them again, but the main issue is that they aren't making competitive products at all, just shaving 10% off their rivals' prices with a far worse feature set in the 2020s. They're acting like their CPU department, as if they'd already won the battle, when they've stagnated. They should probably throw RDNA out of the window and make a hail mary for Radeon like they did with Ryzen. nVidia can only hold massive market share while being more expensive because AMD dropped the ball and never picked it up over the last 5 years.
While AMD has for sure messed up in the past, some things are overwhelmingly difficult to overcome. Money talks, as we all know, and I assume Nvidia has several times more of it lying around. No matter what AMD does, Nvidia could just outspend them. On top of that, the money buff is so big that AMD would need some type of genius breakthrough in computing to catch up to Nvidia. Not impossible, I suppose, since as you mentioned the Ryzen thing worked out great for them. But even then, Intel CPUs were still generally better the entire time. Ryzen just had price to performance locked down, and I don't think CPU loyalty is as big as GPU loyalty. People just see CPUs like memory or other parts, to an extent: whichever costs the least and works the best is what people get. With the 7900XTX thing, I believe what AMD was saying is that they can't be as profitable as they want with high-end cards. Even though the high end is what sold well this time around, it's still minuscule, along with their mid-tier offerings not being as good as Nvidia's this time around. So they're like, hey, screw it, we'll just put more effort into the midrange and below and leave the small high-end market to others. Let's just say that AMD's high end is Nvidia's midrange in terms of money spent on development and such (I know it's not, I'm just saying): then it's easier for AMD to just market those cards and that development in the middle tier, rather than fight at a level they have little chance of winning at. Not saying it's a good idea, that's just how I interpreted it.
@@noodles9345 People talk about money as if nVidia had been a $3 trillion company for decades. nVidia's R&D never stopped innovating even when they were nearly broke; it's only because they didn't drop the ball while everyone else did that we're talking about nVidia today instead of 3dfx. Intel also dropped the ball, which is how AMD could catch up. Even then, AMD has for decades been printing uncontested money through consoles. Console wars? No problem, because AMD arms both companies. Even next gen will still be AMD, no matter how badly they're losing in the PC GPU scene. You can't just throw money at this and succeed anyway; look at Intel. They had been the giant for half a century and they couldn't make any inroads into the market. But they saw the trends and at least went for them from the outset, like an AI upscaler, sold pretty much at a loss. AMD is following about 3 generations behind in feature set. And what I'm saying here is that this is an excuse, not a strategy. They'll probably make their best stuff and it'll be at 5080 level. Then instead of calling it a flagship, they'll call it their mid-tier. Pretty much what happened this generation. They just don't want the same people who hyped the 7900XTX to high heavens before it arrived doing the same again.
AMD is a decade behind Nvidia in AI cores. There is no future where they catch up, ever; the lead is way too big unless they come up with some crazy innovation instantly.
@@sapphyrus I get you with the money thing, money certainly isn't everything. But it certainly matters. When you can pay the best and brightest engineers on the planet to work for you, they are naturally going to come up with new technologies and refine them at a faster rate. AMD can of course make products that are just as good as Nvidia's, but the disadvantage and current gap in knowledge is severe. For a goofy example, it's like trying to fly to another galaxy before you try getting to another planet in the solar system. AMD practically has to speedrun through everything and immediately learn how to travel insanely far to catch up to Nvidia. They even said in the video that in the past AMD had a superior card to Nvidia's but people just kept buying Nvidia anyway. So even when AMD does, on the rare occasion, have a better product, people just don't care.
@@noodles9345 What I'm saying is that nVidia didn't become what it is today with money. ATI/AMD has been around just as long, and they actually had other products to fall back on, like CPUs. Transform & lighting, CUDA, physics, AI upscaling, frame generation etc.: nVidia has been pretty much the sole innovator in this business over the last two decades. And that was all while selling graphics cards to gamers; they certainly didn't have big backers to prop them up. So no, money doesn't really mean anything when the vision isn't there. It's definitely not a matter of "here's the money, come up with something". If anything, it's been mentioned that nVidia shares going to the moon means all those engineers paid in shares never need to work another day in their lives; they're set for life. That might be a problem down the road for them. In any case, AMD at this stage has demonstrated that they have zero vision, which is mentioned in the video. The guy says "I told our team this is not where the market was going so we switched a year ago". Wow, captain obvious, I could have told your team that back in 2020. What are you even getting paid for? If Intel is coming back to GPUs after a 25-year break and stomping on your feature set, you'd better change your entire game plan. I bought many AMD cards over the last 25 years, but there is no reason whatsoever to buy them now. They used to be cheap; now they're too expensive for what they offer. I got that 380 mentioned in the video, and afterwards they just lost their way. I hope they can compete, but without any vision they can't.
Oh Radeon, remember when it was an ATI product and went head-to-head with Nvidia: back and forth and back and forth the two of them would go, taking the top spot. I mean, it never quite reached the consistent heights of Nvidia, but many times it would beat their flagship. WTF is with these huge companies that gobble up another one and then basically run it into the ground to the point where it's barely a shadow of its former self. Shame on AMD for fkn it up.
I'm mostly worried about Nvidia hitting the brakes on (releasing) innovation. I think it would be fair to say Nvidia caught AMD with their pants down when they released the RTX 2080, and suddenly it wasn't just about rasterizing more pixels, faster. And then AMD's efforts to catch up have sucked so badly that even Intel, dipping their toes into discrete GPUs, has managed to upstage Radeon in modern games when it comes to the voodoo of increasing visual quality & framerate. But if the PS6 and the Xbox series finale are both AMD SoCs..... 1. Everything will be held back to accommodate console hardware as the game-development target 2. Nvidia will probably reallocate resources and staff to focus on datacenter GPUs, helping Microsoft's Copilot efforts create the biggest computer security disaster that will ever happen 3. ....? Nvidia does have more lucrative business going on in the AI bubble, but what if they found the time and effort to add something to the Switch 2 SoC that is as game-changing as DLSS (past & present), and that AMD/Intel aren't working on because they had no idea?....
I doubt NVIDIA will hit the brakes, but we have to realize that transistor improvement and cost reduction are slowing down dramatically. But I agree with you, NVIDIA being pushed out of the home console market really sucked; it killed competition there, with MS and Sony comfortably enjoying the no-innovation/low-risk situation, especially on the old gen
I've been getting a new card every generation for a while now, but with how prices were this generation, I'm thinking about skipping a generation or 2 and staying one behind each generation. So I have RTX 40 series now. Wait for RTX 60 series and buy used RTX 50 series. That's going to be my new way of doing it.
Since AMD released the RDNA cards, the only people I've seen hyping up anything RDNA-related have been console fanboys, in particular Xbox fanboys raving about the Series X being full RDNA 2
Literally, upscaling is good and is the future. Same for frame gen. The faster you realize this and come to understand it, the sooner you'll be content with AI upscaling like DLSS. It's really good.
@@christophermullins7163 Just tried XeSS 1.3 in Cyberpunk, and even though it's the DP4a path, not the full XMX path of the AI upscaler, I'm blown away at how good it really is. If AMD makes an AI upscaler that runs natively on my 7900xt, I could get better performance and much better image quality. It's such a shame that they don't have this available.
Keeping the analogy with cars: there was a time when winning races sold cars. Did they sell the racing car? Absolutely not, just the semblance of it, as the racing car would be unattainable for the regular person, but buying a product of the winning team made you feel better even if the product was worse than the others. Ferraris were a horrendous car but a good sports car and a great-looking one: no space for luggage, crooked buttons, but you were on the winning team. Going back to video cards... most people don't buy the 4090, but they buy the winning team's cards, because the alternative is to buy AMD, which you buy because?... It's the budget option, the cheaper option, you could not afford to be on the winning team; why not shell out 20 bucks more for nVidia and be on the winning team. They did this with the 5700XT and 580 already. Great cards? Absolutely! But... the guys that make Ferraris also make a similarly priced option; would you not like to have a piece of heritage? This is what a big percentage of that 80% market share thinks... AMD's decision makers need to be shaken up a bit and really change the marketing and engineering strategy. In a two-horse race, fighting to be a strong 2nd place isn't going to change anything.
Everything will have an NPU in a bit, so not necessarily; plus the GPUs can gain acceleration in the future. I think of it as an Intel XeSS situation, where you can have multiple models for their respective applications, mobile/desktop. They kinda shot themselves in the foot taking this long to put dedicated AI engines in their GPUs; it took Sony to wake these dummies up.
That's what it sounds like to me as well. FSR4 will require an NPU, and that's why it's coming to mobile first: only the mobile APUs have the XDNA NPU. And if AMD decides to continue making FSR compatible with Intel hardware, Meteor Lake and Lunar Lake have an NPU as well. I guess that also means the XDNA NPU is going to be added to future AMD discrete graphics cards?
The 960 vs 280, the 780/780ti/970 vs 290x... there are many more examples where AMD actually had the better product but still got massively outsold. Even with the advent of social media and sources like DF who can tell people flat out "this product is better than that one", it still hasn't moved the needle. If AMD abandoned the consumer discrete market for anything other than midrange, it wouldn't be shocking. The people who will complain the most will be those who only wanted Nvidia but wanted it cheaper, hence "we need competition."
It's bad news. Nvidia already has no competition; wait until the 5090 comes out, the price is going to be insane. If you leave RT off the table the 7900xtx is a great card, and they're catching up with FSR 3.1, but Nvidia can throw whatever budget it wants at developing a GPU and AMD can't do that. We'll see how RT on RDNA 4 performs.
One big issue is that AMD had terrible drivers for a long time, and I see to this day that people still think this is the case. Honestly, I've had fewer issues with AMD's GPU drivers vs Nvidia's 🤷🏻‍♂️ Tack on DLSS, and I think people just go Nvidia because of brand reputation.
Someone suggested that a rule when investing in consumer electronics companies is to choose the company chasing the high end in their category because, *"The high end subsidizes the low end."* How true is this? Can we evaluate it empirically?
Someone explain to me why you would want to spend big money on a card with 60-70% more performance than the 4090...... like, what game are you buying that level of performance for? The entire AAA gaming market has crashed and is only going to get worse in the future.
@@stem7141 Better value? Are you serious? Did you watch benchmarks of the latest games? All games run better on RTX cards because every Radeon has an obsolete architecture, and you're still hung up on VRAM 🤷🏻‍♂️ The next gen of videogames has been here since the beginning of 2024, and it means RT + AI upscaling. Even Sony with the PS5 Pro moved to this type of rendering, and you got a 7900xtx LOL
@@OmnianMIU Most of the time, games run better on the 7900XTX in rasterization while it costs 200 less. I will not spend $200 more to get 4% less performance. And I don't mind about RT, because even if it's better on Nvidia it's absolutely not satisfying there either, so my choice goes to the 7900XTX. We've seen the RX 7900XTX run God of War Ragnarok at 40 fps in 8K while the 4080 is VRAM-bound at 28-30 fps. Maybe 8K is overkill, but that does not bode well for the long run with a 4080. And if I wanted to play at fake 4K I'd go to console, not PC; I'd rather have a mix of high/medium settings and play at native 4K instead.
@@stem7141 🤣🤣🤣🤣🤣 you really, really miss the point, buddy. You don't understand that from the beginning of 2024 gaming is RT + AI upscaling. Enjoy your piece of garbage 😉
Can we use AI to generate fake hair for Richard? It's like you interpolate the guy on the left and the guy on the right to create fake hair on Richard's frame.
It'll be several decades before (if ever) people are primarily gaming on handhelds. Even then, a good chunk (50%) of gamers will be using big power hogs.
Not mobile-only, lmao, that would be a shot in the foot. Makes zero sense. Bad take. Likely 6000/7000 will see this. Anything with AI accelerators would.
@@OmnianMIU I don't choose sides, I own a 40 series as well lol. I'm using my noggin. You should try it. Why would AMD not include it in the last couple of gens of GPUs, when they include the same AI accelerators as the mobile chips? Come on now ;)
That's what we all want, but they simply can't keep up with Nvidia. The last time AMD was competitive in high-end GPUs was over a decade ago. The Ryzen route was possible because Intel kept releasing the same CPUs year after year, not bothering to develop better ones because they had no competition. Nvidia is not making that mistake.
Ryzen took several generations to catch up to and pass Intel, and they are still only around 24% of the CPU market. Consumers pick Intel even when AMD is better. Same for Nvidia. An AMD Ryzen moment would be celebrated because people would assume it means Nvidia would get cheaper.
@@balthorpayne AMD CPUs outsell Intel in desktop. AMD laptops are hard to get, especially the powerful ones. Intel owns the big data centers because big companies don't like to change everything. I own a small internet cafe and all of it runs on AMD CPUs and GPUs.
A few new titles already implement RT (software Lumen) without an option to turn it off. Similar to hardware T&L back in the day, tessellation, SSAO and other tech that people deemed superfluous and gimmicky became the norm within a few generations, and now no one even mentions them, nor do they incur the heavy performance penalty they used to. RT is here to stay, and it's the only way to maintain or drive down development costs as visual fidelity increases, because doing good lighting by hand takes a crap ton of talent, time and dev input to get to Uncharted levels of quality.
@@christianr.5868 With AMD's track record, I don't have confidence that they will. PSSR can be trained like DLSS, and training models for console games sounds like it should be the holy grail of upscaling.
What is anti-lag? Don't get me wrong, I'm personally very anti lag. Pretty sure not even the great Nvidia will manage to violate the space-time continuum by the time they release DLSS 1000. I'd absolutely pay an extra $50/month to my ISP if my data could wormhole to game servers instead of being routed down the best non-timey-wimey route
@thelonejedi538 Anti-Lag is for input lag; basically it's AMD's version of Nvidia Reflex. On Nvidia it makes such a big difference that with GeForce Now, Destiny 2 is more responsive in the cloud than on a Series X natively at 60 fps.
@@deathtrooper2048 Ah, my bad. When I hear lag, I just assume it's referring to network connection and not input latency. But yeah, I agree. I use a wired fightstick for many games on my PS4 and Series X and wish it was as responsive as my PC on the same screen. Honestly I doubt much will be done though. I'm sure heavy use of dynamic res isn't helping the situation either
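[Editor's note: for anyone wondering what Reflex/Anti-Lag actually do mechanically, the core idea is to stop the CPU from queueing frames ahead of the GPU, so input is sampled as late as possible before render. A minimal sketch of that idea, an illustration only, not AMD's or Nvidia's actual implementation:]

```python
import time

TARGET_FRAME_S = 1 / 60  # assumed 60 fps cap for this example

def run_frame(sample_input, simulate, render_and_present):
    """One frame of a latency-aware loop: sample input as late as
    possible and never let the CPU run ahead and buffer frames."""
    start = time.perf_counter()
    state = simulate(sample_input())   # input -> simulation right before render
    render_and_present(state)          # submit immediately; GPU queue stays short
    # Idle out leftover frame time instead of starting the next frame early;
    # racing ahead just builds a queue, and each queued frame adds input lag.
    leftover = TARGET_FRAME_S - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)
```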
"When you hear 'RTX 4090' you think 'Ferrari' or 'Lamborghini'..." A terrible waste of resources that only assholes with too much money buy? "... the best of the best" Oh... No, I think the other thing is more correct. I also think that 'the strategy has been tried before and it failed' isn't a great argument when the market has changed pretty significantly in the past few years. When the RX580 launched, the top-end of the price range for video cards was $700 (MSRP). Now it's more than twice that. If AMD comes out with a line of cards where the top-end is priced at around what the 7900 GRE launched at, with similar comparative performance, and integrating some of the power efficiency gains that they've learned from working in the handheld space, that can be a pretty compelling sell to the mass market.
Clearly you are not an engineer or an engineering-minded person; those cars are a marvel of human ingenuity and craftsmanship. If you prefer McDonald's or factory food over something made by a real cook, that shouldn't prevent you from recognizing its value
@@Stef3m If by 'marvel of human ingenuity and craftsmanship' you mean 'vehicles with the worst fuel economy of anything with four wheels, by weight' and 'not actually fit for purpose as a vehicle.' They get the same mpg as a Hummer while weighing half as much and being incapable of transporting more than two people or any cargo. They're fine as art pieces. They're useless and wasteful as cars, and only assholes with too much money buy them and drive them.
@@Fafhrd42 How convenient it is for you to completely ignore performance in order to protect your belief from reality. Take F1 cars, for example, whose engines are by far the most efficient piston engines ever made, reaching 26% conversion of fuel-specific energy to kinetic energy; it's just that the scope is different. A small low-power engine could be produced if people were willing to pay the extra cost. And this is nothing compared to a turbine engine, which consumes a lot of fuel but has an efficiency that is simply unthinkable for any piston engine.
Basic pc gamer :
"i want AMD to make high end gpu so that we have competition, allowing me to buy an Nvidia at lower price"
People like this are the reason why AMD stopped making high-end GPUs
But to be honest, AMD was lacking in a lot of other areas. For 3D rendering, for example, you kinda could use AMD (sometimes)... but Nvidia was always 2-3x faster and much more reliable/stable.
It is all those things adding up, not just the gaming performance, which also lagged behind. That is the reason why Nvidia is the must-have. Nvidia is known to be much more versatile, and AMD is therefore considered the 2nd pick (if you want to game). I think it all started with Nvidia CUDA being very successful while OpenCL was not accepted on the same level. Suddenly GPUs became useful to a broader audience beyond gamers. And I think this is also a huge factor... even if you are only a gamer. Support!
It's not bad people cheating on poor AMD; it's competition, and AMD is behind and has been known for that for longer than just a few years now. Otherwise people would not pay those insane prices and give Nvidia a clear market lead.
Braindead comment
It's AMD's fault. Nvidia "dials" go to 11 with features like ray tracing. AMD is at best a step behind.
Yes, exactly, people say this and then go buy Intel and Nvidia. Heard this a lot from my friends, sadly!
@@Isjan-og4ph Well, Intel seems to be on the way down... AMD is doing well on CPUs. Always putting out CPUs that are good and still have very moderate power consumption.
I mean... Nissan, Toyota etc. have more revenue than Ferrari
Yeah but people settle for those cars, nobody settles for a ferrari
@@B2gboi as people settle for 4060s and not 4090s
@@JackTheBeast88 they buy a Maserati instead, see the difference?
@@B2gboi not really
@@JackTheBeast88 I'm settling with a 4090 and I'm not complaining lol
Handhelds would benefit the most from upscaling, but the current implementations work more poorly on them due to the lack of pixel density. With a more ground-up approach it could serve everyone now.
Same with frame gen.
I guess this means Nvidia will continue charging ludicrous amounts for their cards as there will be no-one to challenge them.
AMD is complicit because they benefit from allowing prices to swell!
@@aberkae In case you didn't notice, AMD isn't selling nearly as much. They are not benefiting or they would keep up the "10% cheaper" strat.
@balthorpayne Everyone is assuming AMD is going to price RDNA 4 at $500. Let's wait and see.
@@aberkae Everyone is assuming it due to rumors. None of us really know but yes, I am waiting. My choices are between the 7900xtx and 4080 Super personally but I froze that purchase to see what is coming soon. $500 with 70ti level RT is an instabuy. Even $600 may be doable.
@balthorpayne For reference, the VRAM on the 8800xt is GDDR6 at a 2438 MHz memory clock (19.5 Gbps effective), vs the 7900xt's GDDR6 at a 2500 MHz memory clock (20 Gbps effective), as per Techpowerup's placeholder.
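[Editor's note: those "effective" figures follow directly from the listed clocks, since GDDR6 transfers 8 bits per pin per memory-clock cycle. A quick sanity check of the arithmetic; note the 8800xt clock is still just TechPowerUp's placeholder and its bus width is unconfirmed, so the 256-bit figure below is an assumption:]

```python
def gddr6_effective_gbps(clock_mhz: float) -> float:
    # GDDR6 moves 8 bits per pin per memory-clock cycle.
    return clock_mhz * 8 / 1000

def bandwidth_gb_s(effective_gbps: float, bus_width_bits: int) -> float:
    # Total bandwidth = per-pin rate * bus width / 8 bits per byte.
    return effective_gbps * bus_width_bits / 8

print(gddr6_effective_gbps(2438))   # 19.504 -> the quoted ~19.5 Gbps
print(gddr6_effective_gbps(2500))   # 20.0 Gbps
print(bandwidth_gb_s(20.0, 320))    # 800.0 GB/s -- the 7900xt's 320-bit bus
print(bandwidth_gb_s(19.504, 256))  # ~624 GB/s IF the 8800xt used a 256-bit bus (assumption)
```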
FSR 4 is unlikely to be mobile-only. There's a bit of a chance that it will start there, but AMD doesn't tend to lock technologies to specific architectures, and there's no real need for that. A Radeon 7600 should be able to run inference about as well as an NPU. The benefit of the NPU is that it can do it with less power, which is why it's good for mobile.
Really hope Steam Deck can make use of FSR 4 in their next revision. Would be huge and make the upgrade even more worth it
AMD's problem is their GPU architecture isn't fit for purpose. Adding more features to the CUs instead of dedicated hardware just isn't efficient. FSR 4 only working on APUs (with dedicated ML units) is simply further proof of how far behind they've fallen.
True. It's about cutting cost and power. You'd be paying a lot more for AMD cards, up at NVIDIA prices, if they did that, and I think AMD wants to undercut and make their GPUs more affordable, which is their strong point.
@exoticspeedefy7916 This would be true IF the only cards with AI hardware were the high-end cards, which isn't the case. Tensor cores are on cheap cards too.
It's plenty efficient for rasterization specifically.
"FSR 4 only working on APUs ... is simply further proof" - for people who see speculation as proof.
7900 xtx is pure power
AMD needs to compete with 60, 70, 70Ti tier cards, with performance, reliability, feature set.
Plus they need to advertise that they are in the consoles and in games, to move the mindshare.
On top of that they need to be cheaper by at least 20%, to make people jump from what they know.
Why would I buy an RX 5000, 6000 or 7000 at a similar price, when nvidia does it better….
AMD made their bed, let them sleep on it.
They should focus on mobile and the mid to upper-mid range, and leave enthusiast grade aside.
They need to invest, and keep investing after that, to catch up.
Nvidia is not Intel; they don't take breaks, nor do they sleep.
Yeah. They need to sell competitive low and mid range cards with way lower price than nvidia to gain any form of share.
So far their genius plan has been to release overpriced cards and then, later down the line when no one cares anymore, slightly lower the price to a more decent point. Obviously that hasn't been working.
Based on the current market, at least in the UK, the best-value card right now is the 7800xt; it can be had for £410-430, priced within £10 of a 4060Ti. But the overwhelming sentiment is that people hate these cards and crap on them so hard because of silly stuff like FSR & RT, when in a given price bracket, at least below the 7900s, they're vastly superior cards, with actually good fat coolers and plenty of headroom. It just won't sell, because the people who care about value enough have already bought one, or a 7900gre. The lack of decent prebuilds featuring Radeon hardware is alarming as well; people would buy if the cost savings were passed on, but any that exist are priced according to performance, which is a bit of a red flag.
Most of their cards are 15% to 20% cheaper than their Nvidia counterparts
AMD needs to make better drivers for their GPUs or they'll never raise their GPU market share, NEVER!!
@jabrilwilliams1752 Yes, and Trashdeon cards are still overpriced for what they offer!
Steam deck 2 baby
Glad I waited. And it was hard to pass on that OLED SD. Whew!
@@XiLock_Alotx hope it comes out next year, knowing valve though, it could still be a number of years. Tbh deck OLED is pretty great still
@@XiLock_Alotx the OLED refresh was fantastic though. 90hz, amazing blacks/color, moved to a more efficient & performant node, bigger battery, better cooling solution, all while having a very reasonable price. And Valve's custom Zen 2 SoC is somehow still by far the best at 10-15w, even in the face of the much newer SoC in stuff like the latest Rog Ally.
I own a 4090 because I am economically secure after decades of work. If I were just out of college and starting out in the working world, then I'd own a 7800xt. It's a lot of GPU for around $500 when you turn down the "bells & whistles".
RX 7000 series GPUs have an AI chip inside, I think, right?
Yeah they do
They don't. Their shader cores can do AI workloads, but it comes at the cost of being able to do other compute workloads. Nvidia can render one frame and upscale another concurrently; AMD would have to pick one or the other
Better put those ai accelerators to work and give us proper ai upscaling
They don't, AMD has no Nvidia tensor core equivalent as of the RX7000 series. Theirs is just a band-aid solution, like every other AMD GPU tech.
Yes. RX 7000 has 2 AI Accelerators per compute unit
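[Editor's note: for context on what these units compute, a tensor core op, and the WMMA instructions behind RDNA 3's "AI Accelerators", is a small fused matrix multiply-accumulate, D = A×B + C, over a tile (16×16 is typical), usually FP16 inputs with FP32 accumulation. A rough illustration of just the arithmetic, not the hardware:]

```python
import numpy as np

# One tile-sized multiply-accumulate, D = A @ B + C: FP16 inputs,
# FP32 accumulation. Dedicated units retire this in a few cycles;
# doing the same math on general shader ALUs occupies hardware that
# would otherwise be rendering, which is the trade-off described above.
A = np.random.rand(16, 16).astype(np.float16)
B = np.random.rand(16, 16).astype(np.float16)
C = np.zeros((16, 16), dtype=np.float32)
D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D.shape)  # (16, 16)
```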
People buying 4060 because the 4090 exists is completely insane, people can't be that stupid. I think the actual reason people buy the 4060 is because AMD is reputed to suck at RT and upscaling.
Ray tracing also doesn't make any sense on most graphics cards, especially the ones at that level of performance.
People are influenced by FOMO way too much.
Trust me people are that stupid. Still, it is true that DLSS is what makes lower end cards stand out.
@jonas_bento No, you're the one not making sense. The entire videogame industry is moving to always-on RT rendering. Get a life, AMDelusional fangirl
It's also the fact that they buy Nvidia because Nvidia is always at the cutting edge of any new GPU tech. Also, for the most part their GPUs just tend to work pretty flawlessly. If you are more than satisfied with a brand's product, then you tend to stick to that brand's products. There's no real incentive to venture elsewhere if you have always been happy with every product you've bought from that company. Even if it costs more, you'll pay that extra just for the peace of mind of knowing that you'll likely get a decent-quality product which will meet your needs. Most people simply stick with Nvidia because they usually deliver decent GPUs with the latest cutting-edge features.
As for AMD, it doesn't help them having a long tradition of comically tripping over their own two feet, never delivering on their pre-launch hype train, and having a history of releasing some pretty shoddy products (overheating GPUs, terrible drivers left unfixed for years etc.). So it's kind of understandable why some people give AMD GPUs a wide berth. Especially when they are always playing second fiddle when it comes to new tech and simply never seem to deliver on their own overhyped promises. It's even worse if that customer has also been burnt by a problematic AMD GPU in the distant past.
Many people honestly and rightfully think that it's always a risk and a gamble going for an AMD GPU. And some will only take that risk if saving money is the deciding factor in a purchase.
It's been said a million times before, but AMD has always been its own worst enemy. And most of its undoing has always been by its own inept hands.
Ray tracing on a 4060 is pointless plus the competing amd card beats the 4060.
The problem with comparing with RX6000 series is that for most of their life they were unobtanium. AMD simply did not have enough supply from TSMC. Every graphics card they made for almost 2 years was sold. That they had such low marketshare was because they weren't making many.
@@naheemsays5140 No, it's because people in the world don't want junk inside their PC
@@OmnianMIU if that was the case it wouldn't have been impossible to buy.
The only reason rx6000 didn't sell better was that AMD didn't contract enough of TSMC's supply lines.
AMD clearly prioritized products with higher margins, like CPUs. I always found it ironic that so many complain about NVIDIA being greedy while AMD and Intel have way higher margins on consumer CPUs. I also wonder if the high-end RDNA 2 parts were designed as a halo product from the beginning
If this is for mobile,
AMD might want to upmarket the Ryzen MAX APU with a 6750XT-like iGPU.
They could consolidate 4K120 gaming, AI-upscaled from 1080p.
FSR4 + AFMF 2 + in-game framegen might be able to play all games easily without any dGPU, which would be a blessing for $499-999 laptops/mini PCs
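[Editor's note: the 1080p-to-4K120 arithmetic in that comment works out as follows, a back-of-envelope sketch assuming a 2x-per-axis upscale plus 2x frame generation:]

```python
# Rendered vs. presented pixel rates for "4K120 from 1080p".
internal = (1920, 1080, 60)    # internally rendered resolution and fps
output   = (3840, 2160, 120)   # presented resolution and fps

rendered_px_per_s  = internal[0] * internal[1] * internal[2]
presented_px_per_s = output[0] * output[1] * output[2]

# 4x of the pixels come from upscaling (2x per axis) and 2x of the
# frames from frame generation, so only 1/8 of what reaches the
# screen is natively rendered.
print(presented_px_per_s / rendered_px_per_s)  # 8.0
```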
Even if FSR 4 releases tomorrow and is just as good as DLSS, what about all the other features? Each generation Nvidia releases new features like RTX HDR and VSR, extending the lead over AMD.
I don't think AMD will ever compete with Nvidia in dGPUs. Nvidia has far more money and is already leading. They will only extend their lead. Nvidia is not Intel
By the time AMD release a 4090 beater NVidia will be on the 5090.
Best for them to produce great mid-range gpus and target the budget market nVidia seems to have forgotten about.
Did they really forget about it, though? If you're not gonna compete with the 4090, you're now competing with the 4070 super and 4060 Ti and such, and we've seen AMD get slaughtered in those market tiers too. I don't see things getting any better for them unless they can compete with DLSS which is doubly important in midrange cards.
In design, scaling gpus is a relatively easy task (in the context of gpu design) but whether you could get it at a reasonable area and power profile and be price competitive is another issue.
The "midrange" here is not just add in cards but future APUs.
The moment I'm forced to buy an Nvidia GPU, is the moment I stop buying new games.
First, I'm on Linux; my distro of choice is NixOS and I run a window manager under Wayland. AMD's support here is essentially first class: everything works, thanks to their drivers being part of the kernel and their user-space drivers being open and packaged by every distro under the sun in the form of mesa / radv. This isn't the case for Nvidia. Their drivers aren't open and their Wayland support is at best in beta.
Secondly, Nvidia has done so much anti-competitive and anti-consumer stuff it's honestly hard to count. From GameWorks, to the whole tessellation era, to stuff like the 3.5GB 970, there are plenty of examples to choose from.
Thirdly, they seldom release open technologies and instead actively segregate tech that could easily work on their older generations. Frame gen and G-Sync are good examples of this. PhysX is another.
Finally, Nvidia is to blame for this frame generation and upscaling hellhole we find ourselves in. Their whole marketing hinting that DLSS is better than native is straight-up BS. The only thing it's actually better than is TAA, by far the worst method of AA out there, which somehow makes developers forget that alpha channels exist; thanks for the dithering effect when I disable TAA, very cool.
I'm not saying you should buy GPUs / hardware based on blind loyalty, that's never good. Instead you should buy based on your price point and feature requirements; I personally also choose not to buy from companies I hate. I once made the mistake of buying an Nvidia GPU in the form of a 3070 in the hope that DLSS was actually good, only to be disappointed and plagued by a lack of VRAM for high-res textures. I ended up selling it, getting a 6900XT, and never looking back.
You didn't mind AMD paying companies to implement FSR only?
@@skinscalp222 I'd rather not use any form of upscaling, period.
I'm already fed up with games forcing TAA with no alternative AA method. Then when you get around to disabling TAA, you're left with the most horrid dithering artifacts, since developers forgot what an alpha channel is. TAA and anything based on it (every upscaler) can go to hell for all I care. Motion clarity has been going down the gutter for years now; I might as well smear vaseline on my screen.
Worst of all, these technologies went from being optional to being required for any decent performance at higher res. Monster Hunter Wilds seems to be the next game to go this route.
So NVIDIA is to blame for that FG bullshit, but we should praise AMD for capitalizing on it now with AFMF and FSR 3.1 because it's usable on a lot more GPUs? (Not even that many more, because the recommended specs essentially exclude Turing/RDNA 1 as well.)
If you hate FG/RT/upscaling so much, you shouldn't use any of these technologies regardless of your GPU.
I've seen way too much hypocrisy from AMD fanboys; I'm just disgusted at how AMD gets a pass for these "annoying things" NVIDIA tried to implement first. It's literally like politics, where people don't want bills/laws to pass for the sole reason that they weren't made/proposed by their party, even though they essentially want to pass the same bill/law as part of their program.
It was the case for FG, it was the case for DLSS vs. FSR, and it'll be the case for RT performance once AMD gets up to Nvidia/Intel level.
@@rinsenpai135 Of course they're the ones to blame; Nvidia is the market leader and they introduced these technologies, while also hinting that they were better than native and so on. Where do you think "DLSS is better than native" came from? Which it is not, BTW.
I'm not saying AMD are angels, they've done their fair share of BS, just look at their entire marketing department, but at least their tech is open.
And no, I don't use AFMF, since I'm on Linux. People claiming that AFMF's existence is a godsend are lying to themselves, and yes, it being exclusive to the 7000 series is straight-up BS.
And yes, I try to avoid using these technologies whenever I can, but these tools went from being an optional upgrade to straight-up required. Black Myth Wukong, Remnant 2, Lords of the Fallen and a bunch of other games expect you to enable upscaling, and some even expect you to have FG.
Now even Monster Hunter Wilds, one of the titles I'm most excited about, recommends enabling upscaling and frame gen for 60fps. Not only do we have to deal with TAA being forced down our throats, now upscaling and FG are slowly becoming mandatory.
Gaming is in a worse place thanks to these technologies ruining motion clarity.
And fair enough, ultimately it's not the companies that drive a trend, it's the consumers, and the average gamer has been voting for these features, so inevitably they're to blame.
Not making a high end doesn't mean their midrange will suddenly sell more. All this does is save them money. Their problem is price and software, and they can fix that while still having a high end. He mentions ATI tried this; well, ATI tended to have worse drivers and support. Their problem is themselves. They are also constantly late on hardware and software features, and sometimes their cards use way more power at idle or on multi-monitor setups. AMD cards don't even do HDR1000 correctly on my Alienware DWF. This has nothing to do with high end.
All this does is save YOU the consumer money.
@exoticspeedefy7916 How? By buying the midrange cards both companies already make, while letting nvidia run away with the price of anything higher than what amd makes?
@@anthonyrizzo9043 Because it costs AMD more to make a competitive card, since they don't have access to resources like NVIDIA does. If they made a card that was on par with a 4090, it would cost more than a 4090 and also use more power. Remember, NVIDIA has much more market share, so it would cost AMD more to buy the same resources.
@@exoticspeedefy7916 so how does that give them more market share by only making midrange? They already do this.
@@anthonyrizzo9043 They are realizing that their customer base cares more about price to performance. So that's their goal
Coming from the bottom up makes a ton of sense, but AMD will be back in the high end when they figure out how to make their AI cards able to trickle down to the consumer market. Imo AMD's marketing and execution have been their biggest flaws. Imagine a review cycle where the 7900xt/x had launched at $750 and $900.
Yup. The way AMD does things, being able to sell at a price point that undercuts NVIDIA, is by sacrificing performance at that level. It's not that AMD can't compete on performance; they want to make more affordable products for the consumer, they said it themselves. That's why we don't see dedicated AI cores, RT cores, etc. in their GPUs; they just make do with what can already be utilised, things like compute units and shaders, for these tasks, albeit with a bigger performance hit. But it's easier on ya wallet.
Handheld is the future. I want my PC in the living room, I want it in the car, on the bus, in a hotel room, I want my PC while I poop.
That's because only the mobile APUs have the NPU, which was added to meet the requirements for Microsoft Copilot+ PCs 😅
Finally some use for these "for Copilot" NPUs, less gimmicky now 😅
Fingers crossed that Battlemage is up to the task (or still exists). Though AMD should just be glad that Nvidia largely ignored mobile chips after Tegra phones and tablets mostly failed. But with the success of the Switch, I wouldn't be all that surprised if a chip based on the Switch 2's SoC gets put out into the world for PC handhelds.
Battlemage is projected to MAYBE catch up with 40 Series and RDNA 3, years later. Did y'all forget Intel already said they're not interested in the high-end?
There is a very high chance everything console-gaming-wise moves to ARM over the next decade. It makes no sense from a development standpoint to have to port games to multiple architectures. There is also the chance of a real-time translation layer similar to the Steam Deck's; either way, there's going to be a big change in the landscape.
Why? If there were a market for AAA games on mobile you'd have a point, but there isn't.
@@oo--7714 If you could use Steam on a phone, there'd be a market. The issue is that people aren't used to buying games on their phones, but if they already have a Steam library, every purchase they make on Steam Deck or PC could carry over to their phones. Assuming that's why Steam is making the ARM translation layer.
I switched over to AMD; it sucks to hear this. I hope they support the old GPUs.
The lies of AMDemential fanbase
Unfortunately, unless your old card can do INT8 and DP4a instructions, I doubt it. That's RDNA 2 and up.
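(For context: DP4a is a packed dot-product instruction. It treats each 32-bit register as four int8 lanes, multiplies them pairwise and adds the result to an accumulator in a single operation, which is what lets plain shader cores run quantized ML inference at usable speed. A minimal Python sketch of what the instruction computes, with made-up packed values for illustration:

def dp4a(a: int, b: int, acc: int) -> int:
    # Decode four signed 8-bit lanes from each 32-bit operand.
    def lanes(x):
        return [((x >> (8 * i)) & 0xFF) - ((x >> (8 * i)) & 0x80) * 2
                for i in range(4)]
    # Multiply lane-wise and add the sum to the accumulator.
    return acc + sum(la * lb for la, lb in zip(lanes(a), lanes(b)))

a = (4 << 24) | (3 << 16) | (2 << 8) | 1  # packs the int8 values (1, 2, 3, 4)
b = (8 << 24) | (7 << 16) | (6 << 8) | 5  # packs the int8 values (5, 6, 7, 8)
assert dp4a(a, b, 0) == 1*5 + 2*6 + 3*7 + 4*8  # = 70

A GPU with DP4a does all of that in one instruction per lane group; a card without it would have to emulate the int8 math and would likely be too slow for a real-time upscaler.)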
I think at least the 6000 and 7000 series would get it. Makes zero sense not to.
@@dingleberries360 you will be too much delusional
@@OmnianMIU You don't even make sense lol. What I said lines up with reality.
The rumor is they are sitting the high end out this generation so that they can better optimize performance on a complex multi-GPU chiplet design that can surpass Nvidia.
That's kinda something I was wondering about the Switch 2: would it simply get an older version of DLSS, or a custom one, maybe one specifically made to maximize power efficiency on the Switch 2? I guess it wouldn't need AI if it's a fixed platform like a console.
It's likely their algorithm uses the NPU, so it needs a CPU that has one, and desktop Ryzen doesn't.
Wouldn't a lot of raw data need to be pushed from the GPU through the PCIe bus to the CPU, and back out to HDMI/DP after processing?
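(A rough back-of-the-envelope, with assumed numbers: a 4K RGBA8 frame is 3840 × 2160 × 4 bytes, about 33 MB, so at 60 fps that's roughly 2 GB/s in each direction, well within the ~32 GB/s of a PCIe 4.0 x16 link. A quick sanity check in Python:

width, height, bytes_per_pixel, fps = 3840, 2160, 4, 60  # assumed 4K RGBA8 at 60 fps
frame_mb = width * height * bytes_per_pixel / 1e6        # ~33.2 MB per frame
gb_per_s = frame_mb * fps / 1e3                          # ~2.0 GB/s each way
print(f"{frame_mb:.1f} MB/frame, ~{gb_per_s:.1f} GB/s each way")

So raw bandwidth wouldn't be the bottleneck; the added per-frame round-trip latency would be the harder problem, which is a big part of why upscalers prefer to stay on the GPU, or on an NPU that sits in the same SoC, as on the mobile APUs.)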
If AMD can deliver good GPUs at an affordable price, then that would be very exciting news.
John says competition is nice, yet would never be caught dead using the competition. He's an Nvidia/Intel fanboy.
They need a better upscaler and they need to compete at the high end.
Well, I'm happy for them going back to an RDNA 1-style lineup. I want better bang for my buck and don't want to spend more than 600 euros, so for me it's better.
I own both an AMD and an NVIDIA gaming PC. DLSS is Nvidia's biggest advantage, but there's also a lot of misinformation about AMD having bad drivers, which simply isn't true. If AMD can make FSR look as good as DLSS does, especially at the performance setting, then I don't see why someone wouldn't consider them, as long as they don't believe the people who lie about their products.
Alex's webcam game is lit.
Where are the pricing and performance gains from switching to a chiplet design in RDNA 3 and 4?
Sooooo we are going to talk about this every day, eh? AMD made that statement and ever since, there has been a daily discussion about it here... yawn. AMD is keeping content creators' lights on right now... they are the topic every day even though there isn't any real news.
Could you guys do a video about what gaming chairs you use?
Honestly, it's unacceptable if the 5080 isn't a big jump compared to last gen.
AMD just made a nonsense excuse, since hilariously the 7900XTX has the highest ownership share on the Steam survey right now out of the entire 7*** series. It's not 10% market share vs. 80% market share as claimed at all. They have the entire console market. They have the devs, since nearly all games are made on consoles; all nVidia features have to be tacked on afterwards. It's just mental gymnastics to claim it's a reasoned, conscious move. They are just covering for the low-performance products they will present later on. "They are bad at benchmarks? Oh well, it's intended," they will say.
I wish AMD well and want to buy from them again, but the main issue is that they aren't making competitive products at all, just shaving 10% off their rivals' prices with a far worse feature set, all through the 2020s. They're acting like their CPU department, as if they'd already won the battle, when they've stagnated. They should probably throw RDNA out the window and attempt a hail mary like they did with Ryzen, but for Radeon. nVidia can only hold this massive market share while being more expensive because AMD dropped the ball and never picked it up over the last 5 years.
While AMD has for sure messed up in the past, some things are overwhelmingly difficult to overcome. Money talks, as we all know, and I assume Nvidia has several times more of it lying around. No matter what AMD does, Nvidia can just outspend them. On top of that, the money buff is so big that AMD would need some kind of genius breakthrough in computing to catch up to Nvidia. Not impossible, I suppose, since as you mentioned the Ryzen thing worked out great for them. But even then, Intel CPUs were still generally better the entire time; Ryzen just had price to performance locked down, and I don't think CPU loyalty is as strong as GPU loyalty. People see CPUs like memory or other commodity parts to an extent: whichever costs the least and works the best is what people get.
With the 7900XTX thing, I believe what AMD was saying is that they can't be as profitable as they want with high-end cards. Even though the high end is what sold well this time around, it's still minuscule, and their mid-tier offerings weren't as good as Nvidia's this generation. So they're like, hey, screw it, we'll just put more effort into the midrange and below and leave the small high-end market to others. Let's say AMD's high end costs them what Nvidia's midrange costs in development and such (I know it's not, I'm just saying); then it's easier for AMD to just market cards and development in that middle tier rather than fight at a level they have little chance of winning. Not saying it's a good idea, that's just how I interpreted it.
@@noodles9345 People talk about money as if nVidia had been a $3 trillion company for decades. nVidia's R&D never stopped innovating even when they were about broke; it's only because they never dropped the ball while everyone else did that we're talking about nVidia today instead of 3dfx.
Intel also dropped the ball, which is how AMD could catch up. And AMD has been printing uncontested money through consoles for decades. Console wars? No problem, because AMD arms both companies. Even next gen will still be AMD, no matter how badly they're losing in the PC GPU scene.
You can't just throw money at this and succeed anyway; look at Intel. They had been the giant for half a century and they couldn't make real inroads into the market. But they saw the trends and at least chased them from the outset, like an AI upscaler, sold pretty much at a loss. AMD is following about 3 generations behind in feature set.
And what I'm saying here is that this is an excuse, not a strategy. They'll probably make their best stuff and it'll be at the 5080 level. Then, instead of calling it a flagship, they'll call it their mid-tier. Pretty much what happened this generation. They just don't want the same people who hyped the 7900XTX to high heavens before it arrived doing the same again.
AMD is a decade behind Nvidia in AI cores. There is no future where they catch up, ever. The lead is way too big unless they come up with some crazy innovation instantly.
@@sapphyrus I get you with the money thing, money certainly isn't everything. But it certainly matters. When you can pay the best and brightest engineers on the planet to work for you, they are naturally going to come up with new technologies and refine them at a faster rate. AMD can of course make products that are just as good as Nvidia's, but the disadvantage and the current gap in knowledge are severe. For a goofy example, it's like trying to fly to another galaxy before you try getting to another planet in the solar system. AMD practically has to speedrun through everything and immediately learn how to travel insanely far to catch up to Nvidia. They even said in the video that in the past AMD had a superior card to Nvidia but people just kept buying Nvidia anyway. So even when AMD does, on the rare occasion, have a better product, people just don't care.
@@noodles9345 What I'm saying is that nVidia didn't become what it is today with money. ATI/AMD has been around just as long, and they actually had other products to fall back on, like CPUs. Transform & lighting, CUDA, physics, AI upscaling, frame generation, etc.: nVidia has been pretty much the sole innovator in this business over the last two decades. And all of this was done while selling graphics cards to gamers; they certainly didn't have big backers propping them up either.
So no, money doesn't really mean anything when the vision isn't there. It's definitely not a matter of "here's the money, come up with something". If anything, it's been mentioned that nVidia's shares going to the moon means all those engineers paid in shares never need to work another day in their lives; they're set for life. That might become a problem for them down the road.
In any case, AMD at this stage has demonstrated that they have zero vision, which is mentioned in the video. The guy says "I told our team this is not where the market was going, so we switched a year ago". Wow, captain obvious, I could have told your team that back in 2020. What are you even getting paid for? If Intel is coming back to GPUs after a 25-year break and stomping on your feature set, you'd better change your entire game plan.
I bought many AMD cards over the last 25 years, but there is no reason whatsoever to buy them now. They used to be cheap; now they're too expensive for what they offer. I got that 380 mentioned in the video, and afterwards they just lost their way. I hope they can compete, but without any vision they can't.
Oh Radeon, remember when it was an ATI product and went head-to-head with Nvidia, the two of them going back and forth taking the top spot? I mean, it never quite reached the consistent heights of Nvidia, but many times it beat their flagship. WTF is with these huge companies that gobble up another one and then basically run it into the ground to the point where it's barely a shadow of its former self? Shame on AMD for fkn it up.
It's easier that way
I'm mostly worried about Nvidia hitting the brakes on (releasing) innovation.
I think it would be fair to say Nvidia caught AMD with their pants down when they released the RTX 2080, and suddenly it wasn't just about rasterizing more pixels, faster.
And then AMD's efforts to catch up have sucked so badly that even Intel, dipping their toes into discrete GPUs, has managed to upstage Radeon in modern games when it comes to the voodoo of increasing visual quality and framerate.
But if the PS6 and the Xbox Series finale are both AMD SoCs...
1. Everything will be held back to accommodate console hardware game development targets
2. Nvidia will probably reallocate resources and staff to focus on datacenter GPUs to help Microsoft with their Copilot efforts, creating the biggest computer security disaster that will ever happen
3. ...? Nvidia does have more lucrative business going on in the AI bubble, but what if they found the time and effort to add something to the Switch 2 SoC that is as game-changing as DLSS (past & present), something AMD/Intel aren't working on because they have no idea?...
I doubt NVIDIA will hit the brakes, but we have to realize that transistor scaling and cost reduction are slowing down dramatically.
But I agree with you, NVIDIA being pushed out of the home console market really sucked. It killed competition there, with MS and Sony comfortably enjoying the no-innovation/low-risk situation, especially last gen.
I've been getting a new card every generation for a while now, but with how prices were this generation, I'm thinking about skipping a generation or 2 and staying one behind each generation. So I have RTX 40 series now. Wait for RTX 60 series and buy used RTX 50 series. That's going to be my new way of doing it.
Since the RDNA cards were released by AMD, the only people I've seen hyping up anything RDNA-related have been console fanboys, in particular Xbox fanboys raving about the Series X being full RDNA 2.
give me real pixels or give me death
Death it is lol
Upscaling literally is good and is the future. Same for frame gen. The faster you realize this and come to an understanding of it, the sooner you'll be content with AI upscaling like DLSS. It's really good.
@@christophermullins7163 Just tried XeSS 1.3 in Cyberpunk, and even though it's the DP4a path, not the full XMX path of the AI upscaler, I'm blown away at how good it really is. If AMD made an AI upscaler native for my 7900 XT, I could get better performance and much better image quality. It's such a shame they don't have this available.
@@christophermullins7163 You will eat ze bugs and enjoy it .
Keeping the analogy with cars: there was a time when winning races sold cars. Did they sell the racing car? Absolutely not, just the semblance of it, as the racing car was unattainable for the regular person, but buying a product of the winning team made you feel better even if the product was worse than the others. Ferraris were horrendous cars but good sports cars, and great-looking ones: no space for luggage, crooked buttons, but you were on the winning team.
Going back to video cards... most people don't buy the 4090, but they buy the winning team's cards, because the alternative is to buy AMD, which you buy because...? It's the budget option, the cheaper option, proof you couldn't afford to be on the winning team. Why not shell out 20 extra bucks for nVidia and be on the winning team?
They did this with the 5700 XT and 580 already. Great cards? Absolutely! But... the guys who make Ferraris also make a similarly priced option, so wouldn't you like to have a piece of that heritage? This is what a big percentage of that 80% market share thinks...
AMD's decision makers need to be shaken up a bit and change the marketing and engineering strategy, really. In a race with two places, fighting to be a strong 2nd isn't going to change anything.
Comparing the 4080 and 4090 to a Ferrari is disingenuous.
Middle-class enthusiasts can easily afford the former.
If console gaming is becoming a middle-class hobby, then PC gaming is definitely for upper-middle-class people.
Everything will have an NPU in a bit, so not necessarily; plus the GPUs can get acceleration in the future. I think of it as an Intel XeSS situation, where you can have multiple models for their respective applications, mobile/desktop. They kinda shot themselves in the foot taking this long to put dedicated AI engines in their GPUs; it took Sony to wake these dummies up.
That's what it sounds like to me as well. FSR 4 will require an NPU, and that's why it's coming to mobile first: only the mobile APUs have the XDNA NPU.
And if AMD decides to keep FSR compatible with Intel hardware, Meteor Lake and Lunar Lake have NPUs as well.
I guess that also means that the XDNA NPU is going to be added to future AMD discrete graphics cards?
@@khkl_23 NPU is a really loose term, NVIDIA and Intel GPUs already contain NPUs
Intel is our only hope.
The 960 vs the 280, the 780/780 Ti/970 vs the 290X... there are many more examples where AMD actually had the better product but still got massively outsold. Even with the advent of social media and sources like DF who can tell people flat out "this product is better than that one", it hasn't moved the needle. If AMD abandoned the consumer discrete market for anything other than midrange, it wouldn't be shocking. The people who will complain the most will be those who only wanted Nvidia but wanted it cheaper, hence "we need competition."
It's bad news. Nvidia already has no competition; wait until the 5090 comes out, the price is going to be insane... If you leave RT off the table, the 7900 XTX is a great card, and they're catching up with FSR 3.1, but Nvidia can throw whatever budget they like at developing a GPU and AMD can't do that... We'll see how RT on RDNA 4 performs.
One big issue is that AMD had terrible drivers for a long time, and I see to this day that people still think this is the case, when honestly I've had fewer issues with AMD's GPU drivers than Nvidia's 🤷🏻♂️ Tack on DLSS, and I think people just go Nvidia because of brand reputation.
Nvidia also has problems. Their architecture is not evolving enough. They will have to switch to chiplets, otherwise it will always cost more.
AM-what? Those guys build GPUs? Never heard of them; I've always bought Nvidia GPUs, ever since the NV1 STG2000.
Someone suggested that a rule when investing in consumer electronics companies is to choose the company chasing the high end in their category because,
*"The high end subsidizes the low end."*
How true is this? Can we evaluate it empirically?
Someone explain to me why you would want to spend big money on a card with 60-70% more performance than the 4090... like, what game are you buying this level of performance for? The entire AAA gaming market has crashed and is only going to get worse in the future.
Just ordered my RX 7900 XTX
You are really crazy then
@@OmnianMIU Why? It's way better value than the 4080 with its 16 GB of VRAM.
@@stem7141 Better value? Are you serious? Did you watch benchmarks of the latest games? All games run better on RTX cards because every Radeon has an obsolete architecture, and you're still going on about VRAM 🤷🏻♂️ The next gen of video games has been here since the beginning of 2024, and it means RT + AI upscaling. Even Sony with the PS5 Pro moved to this type of rendering, and you got a 7900 XTX LOL
@@OmnianMIU Most of the time, games run better on the 7900 XTX in rasterization while it costs $200 less.
I will not spend $200 more to get 4% less performance. And I don't care about RT, because even if it's better on Nvidia it's still not satisfying, so my choice goes to the 7900 XTX.
We've seen the RX 7900 XTX run God of War Ragnarök at 40 fps at 8K while the 4080 is VRAM-bound at 28-30 fps. Maybe 8K is overkill, but that doesn't bode well for the 4080 in the long run.
Yeah, if I wanted to play at fake 4K I'd go to a console, not a PC; I'd rather have a mix of high/medium settings and play at native 4K instead.
@@stem7141 🤣🤣🤣🤣🤣 You really, really miss the point, buddy. You don't understand that since the beginning of 2024, gaming is RT + AI upscaling. Enjoy your piece of garbage 😉
Can we use AI to generate fake hair for Richard? It's like you interpolate the guy on the left and the guy on the right to create fake hair on Richard's frame.
I hope my 6800m gets support
LoL just wait 😂
The takes on here are why those folks are on a podcast and not running a billion-dollar company.
Vague, nothing-burger opinions like this comment are why you're commenting on a YouTube channel and not running a billion-dollar company.
He's not wrong though. It gets pretty armchair in here.
@@natedagreat19 try to think on your own instead of relying on podcasters to tell you what to think mate.
@@ThePoushal Ironic, considering you gave no thoughts of your own on the matter.
Mobile gaming really does seem to be the Future.
People need to buy AAA mobile ports first
@@christianr.5868 Mobile includes handhelds (which AMD has essentially cornered so far), not just phones and tablets.
@@holy-g2334 that makes sense, especially for PC handhelds
The Steam Deck barely sold 3 million units... Nintendo is the only one that can sell a handheld as well as sell software for it.
It'll be several decades before (if ever) people are primarily gaming on handhelds. Even then, a good chunk (50%) of gamers will be using big power hogs.
Not mobile-only lmao, that would be a shot in the foot. Makes zero sense. Bad take. Likely the 6000/7000 series will see this; anything with AI accelerators would.
@@dingleberries360 sure 🤣🤣🤣 still coping harder AMDemential boy
@@OmnianMIU I don't choose sides; I own a 40 series as well lol. I'm using my noggin, you should try it. Why would AMD not include it on the last couple of GPU generations when they include the same AI accelerator as the mobile chips? Come on now ;)
@@dingleberries360 Because the AI units of RDNA 4 are useless! No one in the world uses this stuff.
Instead of dropping the high end, I wish AMD would go the Ryzen route and offer high end performance for significantly less than the competition.
I would like them to give high-performance GPUs away for free; actually, I would like them to pay me for using one of their products.
So lose money and go broke lmao
That's what we all want, but they simply can't keep up with Nvidia. The last time AMD was competitive in high-end GPUs was over a decade ago. The Ryzen route was possible because Intel kept releasing the same CPUs year after year, not bothering to develop better ones because they had no competition.
Nvidia is not making that mistake.
Ryzen took several generations to catch up to and pass Intel, and they are still only around 24% of the CPU market. Consumers pick Intel even when AMD is better, same as with Nvidia. An AMD Ryzen moment would be celebrated because people would assume it means Nvidia getting cheaper.
@@balthorpayne AMD CPUs outsell Intel in desktop.
AMD laptops are hard to get,
especially the powerful ones.
Intel owns the big data centers because big companies don't like changing everything.
I own a small internet cafe and all of it is AMD CPUs and GPUs.
The 8800 XT will be a 7800 XT with better RT and AI.
It is going to be a disappointment for many people, especially those who don't care about RT.
@killermoon635 From 2024 on, RT is always on in more and more titles, so RT will matter for everyone.
A few new titles already implement RT (software Lumen) without an option to turn it off. Much like hardware T&L back in the day, tessellation, SSAO and other tech that people deemed superfluous and gimmicky became the norm within a few generations; no one even mentions them anymore, nor do they incur the heavy performance penalty they used to. RT is here to stay, and it's the only way to keep development costs down as visual fidelity increases, because doing good lighting by hand takes a crap ton of talent, time and dev input to reach Uncharted levels of quality.
I hope Xbox switches to Auto Super Resolution and skips FSR 4.
xbox is dead dude
@@QuantumChrist cool story attention seeking kid
@@QuantumChrist you are the poor hopelessly white loner who hides behind a black icon.
FSR image reconstruction is a bad joke. Too little, too late. I can't wait for PSSR on the PS5 Pro.
@@AJJJ-co2vd FSR 4 will have dedicated hardware, it should be better than pisser
@@christianr.5868 With AMD's track record, I don't have confidence that it will be. PSSR can be trained like DLSS, and training models per console game sounds like it should be the holy grail of upscaling.
AMD sucks at everything; no wonder Sony had to come up with their own upscaling tech.
Instead of giving us more fake frames and resolutions, give us anti-lag on consoles.
What is anti-lag? Don't get me wrong, I'm personally very anti lag.
Pretty sure not even the great Nvidia will manage to violate the space-time continuum by the time they release DLSS 1000.
I'd absolutely pay an extra $50/month to my ISP if my data could wormhole to game servers instead of being routed down the best non-timey-wimey route.
@thelonejedi538 Anti-Lag is for input lag; basically, it's AMD's version of Nvidia Reflex. On Nvidia it makes such a big difference that with GeForce Now, Destiny 2 is more responsive in the cloud than on a Series X natively at 60 fps.
Whether or not anti-lag comes to consoles is up to Sony and Microsoft, sadly.
@@deathtrooper2048 Ah, my bad. When I hear lag, I just assume it refers to the network connection and not input latency.
But yeah, I agree. I use a wired fightstick for many games on my PS4 and Series X and wish it was as responsive as my PC on the same screen.
Honestly, I doubt much will be done though. I'm sure heavy use of dynamic res isn't helping the situation either.
"When you hear 'RTX 4090' you think 'Ferrari' or 'Lamborghini'..."
A terrible waste of resources that only assholes with too much money buy?
"... the best of the best"
Oh... No, I think the other thing is more correct.
I also think that 'the strategy has been tried before and it failed' isn't a great argument when the market has changed pretty significantly in the past few years. When the RX580 launched, the top-end of the price range for video cards was $700 (MSRP). Now it's more than twice that. If AMD comes out with a line of cards where the top-end is priced at around what the 7900 GRE launched at, with similar comparative performance, and integrating some of the power efficiency gains that they've learned from working in the handheld space, that can be a pretty compelling sell to the mass market.
Clearly you are not an engineer or an engineering-minded person; those cars are a marvel of human ingenuity and craftsmanship. If you prefer McDonald's or factory food over something made by a real cook, that shouldn't prevent you from recognizing its value.
@@Stef3m If by 'marvel of human ingenuity and craftsmanship' you mean 'vehicles with the worst fuel economy by weight of anything on four wheels' and 'not actually fit for purpose as a vehicle.' They get the same mpg as a Hummer while weighing half as much and being incapable of transporting more than two people or any cargo.
They're fine as art pieces. They're useless and wasteful as cars, and only assholes with too much money buy them and drive them.
@@Fafhrd42 How convenient it is for you to completely ignore performance in order to protect your belief from reality. Take F1 cars, for example, whose engines are by far the most efficient piston engines ever made, reaching 26% conversion of fuel energy into kinetic energy. The scope is just different, but a small low-power engine could be produced if people were willing to pay the extra cost. And this is nothing compared to a turbine engine, which consumes a lot of fuel but has an efficiency that is simply unthinkable for any piston engine.
First 🥳🎉
whoa 😦
Bot