The problem isn't the graphics cards, it's the developers who have stopped optimizing games and are patching everything with dlls and frame gen. Even the game requirements state that to run a game you need X card with dlss enabled to play at 30 fps 1080p.
100% agree. We're going down a road where game devs are praying that everyone buys flagship gpus so that they don't have to spend time optimizing their blurry TAA-ridden games.
it's insane yes i'm stupid for having a 4060 but the new monster hunter title minimum specs requires a 4060 like what? these games are poorly optimized af will be upgrading with the 50's series
Blink three times if you have Nvidia Stockholm syndrome and absolute LOVE being hostage of Nvidia and enjoy all the awesome software progresses from Tessellation to Raytracing and a dozen other goodies! ;D
Don't forget G-sync, the proprietary hardware based variable refresh anti tearing solution, requiring a Nvidia GPU and an expensive G-sync monitor. AMD solved that one tho!
Nvidia have me hostage, as my monitor is was £900 when i bought it and only has a G sync module in it so i always have to upgrade to Nvidia or go AMD and buy a monitor too.
Not only AMD fixed it, but Vesa added standards for it too. And now, Nvidia GPUs can use freesync (the AMD version of it) as well, which it couldn't before and was a huge mess.
the tech to make VRR have been always been there but no one really thought using it on PC especially on games until nvidia come up with it. and when nvidia did it they are not doing half assed solution that might or might not work. early freesync was a mess despite AMD have one year and a half to study what nvidia has done with Gsync. anandtech have a good article about this issue before when AMD finally realized there is no such thing as "free" if you want to come up with a proper product with good experience.
"Neural rendering" is just the newest marketspeak. People are skeptical of the term AI because it is constantly misused in marketing, so here is a new one.
true and all Neural rendering does is make the edges really smooth and pop certain textures to make it look EVEN MORE realistic... feels like a rebranded Tessellation.
Are you planning to continue to call this feature marketspeak when AMD announces their own version of it in 12 months, or will it stop being marketspeak all of a sudden?
While true many companies have abused the word Ai. However when it comes to a GPU, I can believe it being the actual terminology (similar to how DLSS IS actually using ai trained models)
@VintageCR In a sense, you're spot on. From what I've seen, it's pretty much to Tessellation+Hairworks/PhysX what DLSS is to native resolution. Not as accurate, but faster. It's what Satya Nadela calls agent in terms of comparison to large models. Smaller, narrow and specific usecases. (just pointing out Nvidia isn't even inventing any of this, just first to leverage it's hardware to implement this stuff).
Nvidia is not n suffocating amd. They just arenproving quality of image and beautiful graphics are NOT on their priorities. Amd images are sharper and colors are accurate and bolder
It's damn successful not as a beta test, but as a marketing effort because in the process they've conveniently drill it into our minds to call it 'RTX' instead of ray tracing, as evidenced by this comment. In MGSV terms, Nvidia earned tyranny status and made their subbrands, the 'lingua franca' of cutting edge graphics.
yeah and I doubt 50 series will change that much. The 4090 was a nice attempt but really we would need DOUBLE that RT/PT performance JUST to be able to enable it at 4k without terrible upscaling and framegen solutions (they blur/smear the image a lot in many cases)
@@gerarderloper It is not "terrible" in any way, in fact it looks really good for the level of extra performance it gives and makes something unplayable, actually smooth. Without framegen you simply wouldnt be able to afford a GPU that puts out that level of performance. Upscaling is the future. The technology needed to give a raw double a 4090's RT/PT performance in a consumer size GPU without upscaling simply doesn't exist. Even if it did exist, it would cost a ludicrous amount to the point that developers wouldn't even bother implementing that level of graphics if only people with a 10k USD gpu can run that setting, which will be a very small segment of the market.
PhysX was originally physics acceleration on separate add-on cards made by a company named Aegia. nVidia bought Aegia, integrated the acceleration into their GPUs. And then proceeded to be an evil corp as per usual.
^This. Imagine if physx was never bought out , stayed a separate add in card that didn't hurt fps and was available to more game developers without exclusivity rights from Nvidia. Could have been a game changer.
@@DragonOfTheMortalKombat Ew, not this argument. You obviously pick the less evil company instead of just going "HURRR DURR NO ETHICAL CONSUMPTION UNDER CAPITALISM" and buying from the people who eat babies instead of the ones who just punch them
wasn't it the thing where you could buy an extra component to put in your computer on top of your GPU that would handle just PhysX for more performance.
Actually ATI introduced Tessellation to Radeon before Nvidia did. It is just that when Nvidia implemented in a way that was more performant. PhysX was special hardware, that was eventually bought by Nvidia and then they killed the hardware and then vendor locked acceleration to there GPUs, then eventually they made it all CPU based. Hairworks was just a crappy unoptimized version of hair simulation, which was done much better by other tech. DLSS and all that jazz is suffocating PC gaming, simply because its being pushed on the devs to implement, which is a problem for PC gaming being an open platform in the long term, pushing APIs that only work on one set of vendor hardware is really bad, and resembles back to the 3Dfx days when Glide API ruled the roost, making it super difficult for competing GPUs to compete on a even play field.
PhysX since the very beginning have both CPU and hardware accelerated part. but Aegia want to sell those PPU so the highlight is mostly on the accelerated part. and the initial CPU physx also being accused to purposely using things like x87 instruction to make it slow running on CPU. nvidia over haul PhysX development with PhysX 3 making the CPU part more performant and compete head to head with havok. this is when some big game engine start ditching havok in favor of much cheaper PhysX in term of licensing price. nvidia eventually make the CPU PhysX open source in 2015.
@@arenzricodexd4409 Yeah they did open source, when it became useless to do the vendor locking bullshit. Also it didn't end up helping them capture more GPU market anyway.
Every company is anti consumer. If amd where at nvidias level they would do the exact same thing. Its scary people actually think companies have morality and are on the consumers side.
@@dante19890i fully believe that AMD would be just as bad if on top, but that doesn't mean we can't expect better from these brands. There are still brands out there that don't rip off people like Nvidia or Apple do, even if they are on top.
@@interrobangingsamd is not good company as well cuz they fuck up non stop but at least they are not that greedy as nvidia i have 3 generation of nvidia gpus and im not plán Ing to buy another and if you ask why well i no longer wanna pay for premium when it nôt give mé whats been promised.
AMD doesn't need to keep up - they can do million of things. They have got 3 different tech for path tracing ready and fsr is co developed with Sony and Samsung. AMD needs to start playing their own game - maybe chasing nvidia is just cheaper.
@@michahojwa8132 so you are saying they somehow have the tech that they are too complacent to develop? huh??? You are not making any logical sense. Sounds like major cope.
Fun fact: The Witcher devs coded hair works wrong in the original copy of the game the one before they released the next gen patch. It use to run on CPU and not GPU that is why it was a big performance hit
they can afford to, but the strategy is even if you buy higher end to make you feel like the highest end is better. you could slap 16GB or more vram on every card in the series without a major price difference.
Even with a 4090, you need DLSS Balanced with Frame Gen to do path tracing and DLSS only looks good at Quality for 4K to me. NVIDIA does this so you want to buy the next gen 5090 hoping it will finally do path tracing with minimal DLSS. If it does, then NVIDIA will introduce the next tech into the games weeks later so the 5090 will not be good enough again.
They likely don't want anybody turning down the DLSS... That's likely their substitute for any real upgrades from here on out. If the DLSS is not needed, it becomes 1 less marketing point for them.
Thats why we have raytraced global illumination so the dark scenes actually look fantastic reacting realistically to ambient light in the scene. People who think RT is just reflections are clueless.
@@WayStedYou ya, because you still need to calculate rays... So it doesn't matter, when you will have the sky, fps will be the same because rtx doesn't care about complexity. Also, he using not top card in 4k in one of the most demanding games, what was he expecting? Why not path traycing in 8k and 240fsp on 4050M? Just switch to QHD or turn on dlss and it will be playable. Change raytracing from ultra to mid or low, it will be still better than not having RT at all and fps will be more than comfortable.
2:38 "Whether or not the developers could have done better on the left side is questionable." No, it's not questionable at all! They could have added some normal mapping to make it less than 1% difference between them! 😠 That comparison is typical Nvidia BS full of lies. They are liars and shouldn't be trusted. That game came out 2 months after the GTX 970 was released - which had 3.5GB VRAM! Technically 4GB but the last 0.5GB was SO SLOW that if a game hit that region it brought the card to its knees. Gimpvidia doing gimpvidia things, gimping them. Now they vomit these PCIe 8x + 128-bit for $400 (and wanted $500 for another 8GB VRAM). 😠😠
Honestly, I expect if there is no change in the market that AMD will stop producing consumer cards. It doesn't make them much money. Even when they release cards with good performance and priced well, people still don't by them. The only reason most normal consumers want AMD around is for competition. But even if they make the best bang for the buck card, they will never actually chose AMD. If Intel sees the same, they will drop graphic cards as well and we'll just have a monopoly. If you want diversity and variety in your markets, the consumer actually needs to support that. Nvidia knows now that there is no price too high, since most of their customers refuse to purchase other companies cards.
Well I ordered B580 (but I am not sure when I will get it), but man, this card is overpriced in Europe. I accept that I am buying a slightly worse product, but I will have QuickSync 🙂 That matters for me, since I already has AMD CPU and not gonna switch to Intel.
That's crap, Amd haven't been pricing their cards aggressive enough. They've been to far behind with upscaling and RT. They needed to price lower to compensate. They're catching up a lot in RT with the 9070xt apparently and they have fsr 4 coming which is now a.i based. If nvidia keep raising prices Amd have a chance to step in and take markershare if they're aggressive enough. I'd happily buy the 9070xt if it was 4080 performance and $499....but will they actually price it properly or match those performance levels? It needs to beat the 5070 by at least 10% in raster imo while being $100 cheaper. Otherwise it will be the same old story, this is their best bet so far so it would be such a fail if they botch this lol. We'll see in 8 hours i guess
@@PCgamingbenchmarktime Well, let's not forget that AMD actually needs to make money on those cards to keep making them. They can only drop their margins so low before they may as well not bother with making GPUs at all.
@HunterTracks who said anything about them not making money on them. The 7800xt was $499 at launch, later dropped down too $449. They're still using gddr6, the same amount of ram, i heard rumours that die size isn't that far off the 7800xt. The profit margins would still be there, its about making the card as efficiently as possible. Nvidia just has ridiculously high margins let's be honest. Cards can be cheaper than what they're selling and still be making big profits. I see no reason why the costs would be anymore than the 7800xt. The 2080 to 3080 had a 50% uplift for the same price, so a 40% uplift from the 7800xt for the same price seems very reasonable
It doesn't work like that... Prior to Ryzen gen 1 AMD had far less money compared to Intel. Yet after a few generations of Ryzen they eventually surpassed Intel in spite of them originally having less money. The reason Intel and AMD and struggling to catch up to Nvidia is simply because Nvidia has the best engineers and best feature set.
AMD is doing quite well. Contracts for both xbox and playstation, including their new ones coming out. Also int he PC hand held world, they are about to release some next level chips for that. And their CPUS are on fire. Their stock is only growing. AMD doens't need to focus on super high end. Not important.
Dude reality check for you. Nvidia dominates the PC GPU market with 88% marketshare....AMD has only 12% and this gap has only been widening year after year. on CPU side their situation is without a doubt much better and they are taking ground from Intel, but on GPU side they are not even outperforming RTX 4060 and 4060 ti, even though their GPUs at same price range are more affordable and offer better performance than 4060/4060ti. If you dont take my word for it, just go look at Steam hardware surveys. AMD has mountain to climb just to get more consumers on their side with low and mid tier GPUs.
@@Totenglocke42 You're welcome to share your opinions :) Nothing is set in stone, high end is determined by the individual and their needs. How cool is that.
@@DaveTheeMan-wj2nk Except it's not. High end is determined by what exists. A top of the line GPU that has all the features available and runs games at the best resolutions and frame rates is high-end. A GPU that's using almost ten year old tech and can't run any of the modern graphics settings properly is low-end at best.
AMD was ripping off Intel untill they started to simply make better products and not only performance wise, but also price was affordable to most of us casual pc users/gamers. AMD did so well that Intel is in the shadows now. If they gonna start to copy and beat Nvidias insanely stupid prices then we should'nt complain but be happy. Because I don't know in what world you guys living but to pay 1,2k or more for a gpu is stupid. Little % of people sees that price as affordable, to most it's few months savings that's left after paying taxes and rent.
This is a complete misunderstanding of 40 out of the last 50 years of CPU history! AMD invented 64-bit x86, STARTLING INTEL. AMD crushed Intel with Athlon. Intel only survived by breaking the law MANY TIMES! AMD is very amall and made a disastrous design mistake with excavator CPUs - NOT COPIES OF ANY INTEL COU, EVER! Now they have Ryzen and pioneered chiplets and vcache while Intel sits on its fat lazy ASS! AMD were only a second source (printing licensed copies of 80xx cpus) in the late 1970s and early 1980s when nobody would buy your chips without a second source foundry to make more chips in case you went bankrupt!
@@thechurchofsupersampling that x2 4600+ was pretty decent if I remember correctly. And yea they reverse engineered pentium cpu's at first, and ended up improving them. So a amd 486 would be faster than the intel one.
Biggest PhysX titles back in the day: Bordelrands 2, Batman Arkaham Origins, Metro 2033 (original), Mafia II (original), Arma 3, Cryostasis, Medal of Honor Airborn, Just Cause 2, Assassin's Creed IV Black Flag, Dragon Age II.
oh man the smoke and debris effects in AC4 were INSANELY cool at the time. I still had my GTX 770 after I upgraded to a 980TI and used it as a dedicated physx card
Bro Dont Forget ther 1st Mirrors Edge! I remember playing that and Arkham Asylum with my GTX260 and was blown away how well the glass and smoke effects looked!
AMD should focus on software improvements before starting to compete with 80 and 90 class cards. Nvidia is thriving now because they took the pain and effort to make these technologies decades ago.
Yes, AMD is better value for 16 - 25 years old kids who are only doing gaming and NVIDIA for 25+ years old adults who needs to game and WORK. That's one of the reasons Nvidia dominate so much. Nobody cares to spend 300/400 euro more for a polyvalent GPU. AMD fans are coping " Nooo AMD is better value, it cost 50 euro less and have 10 + FPS 🤓🤓🤓🤓 so it's better !!! "
Both of this guys above me spit nothing but facts. AMD should fr compete with Nvidia on technology, cause fsr stil sucks ass so bad 😂, the fact that tim from hardware unboxed say's that Xess is out right better is sad for Lisa su & her whole GPU departement.
@@zzephi bro its just a corporation that want's your money. Their not your friend or lover. They make a good product, but dominate the market and can make up whatever pricing they want. Corporate shills are gross as hell.
i do not understand the raytracing hype. yes it looks good. but why the fuck would i use it when you have to put DLSS in Performance mode so the textures look way worse so i can get playable fps. i just ignore RT in every Game. just a look what i can do feature to get sales for people with weak character.
It's still in its early stages, and tech needs time to develop and become mainstream. It also took quite a while for 1440p to become playable at an affordable price, and yet most people still play on 1080p. Meanwhile, there are 4k monitors.
Lighting is the most important part of game graphics. You might have a shitty GPU now, but in 10 years you will sit here in the comment and gush over RT and you cant even turn it off cuz it wouldnt have a rastorised fallback by that time. Its the future.
@@dante19890 people said the same thing when the 20 series came out and it still makes games run like shit plus i shouldn't have to wait a decade for something that's currently out right now
@@utopic1312it’s a future not to far off I’d say probably in 2 years or so since game devs give no fucks about how well their games actually run so they force rt on with no way to turn it off👍
You used to be able to have a separate Nvidia card on top of your AMD card and dedicate the Nvidia card to PhysX. So you could pair a cheap Nvidia with a powerful AMD card. Good old days.
15:30 They do that so the consumer can easily understand the naming of them, they've been this for the Ryzen 3,5 and 7 in comparison of Intel's i3,5 and 7. They match the names to make it easier to understand for everyone, that's actually really clever.
The truth is: only a few games are Nvidia sponsored. AMD has AFMF2, which Nvidia hasn't (yet). I hope that AMD gives us something like to Microsofts AutoSR and makes it as accessible as RSR. That would be a great feature for older games, I think
@@roller4312How are they supposed to capitalize on that? ”Buy our stuff, it's like a PS5 but better!” SURELY wouldn't piss off their business partner, Sony /s
Tesselation is not about giving Y value to textures, parallax occlusion mapping does that. Tesselation triples and even quadruples the polygon count to applied textures to create higher level of detail, which is why it was used sparingly, only for some objects for a long time. Now, try lowering tesselation to minimum, for example, in Forbidden West, see what you get. Then, there's soft PCF shadows, which later down the line turned into PCSS, unified shaders, driver forced FXAA for games with no AA, then driver SSAA in DSR. Nothing's changed, RT is the new SSAO and soft shadows, that need unified shaders, accelarated by CUDA, and therefore run better on Nvidia. Somehow, ATI and then AMD always play catch-up, making competition for Nvidia either in features or raw performance.
PhysX was previously a physical physics card that worked more or less like graphics accelerators like VooDoo did at the time. The company was bought by NVidia and its components were added to graphics cards for physics calculations.
Just a couple days ago I had a short argument with someone who claimed that Raytracing TOTALLY makes a huge difference. Another guy in that comment section also said that Pathtracing gives games a "hypnotizing quality" which is the most batshit insane level of falling for marketing I've ever seen. For some people it reaches almost pathologically obsessive levels. But anyway that's besides the point. The dude I was arguing was hilariously, basically already cartoonishly arrogant. His first statement - as per usual for NVIDIA hardcore fanboys - was that if I don't think of RT as great i must be running on an old shitty GPU. In his words he wrote: "I don't know about anything else but my 5080 is going to run that ray tracing beautifully cuck boy" lmao.
Often these people arguing on social media don’t even own high-end gpus that can do this stuff. They base it off of watching youtube videos and pretending they know. Hypnotized by the marketing. Like poor people who defend billionaires because they believe they will someday be one. 😬
I dont care about RT but I do care about upscaling, and FSR is completely unusable. You act like AMD fanboys arnt in complete denial about all of the driver issues. They all claim to have "zero driver issues" but AMD's own adrenaline patch notes list all of the games that constantly crash on all amd cards. It takes months for them to fix crashes in games like wukong, helldivers 2, lords of the fallen, etc.
@@Dempig you'll be surprised, but having issues in some apps is unavoidable. Did an update on an Nvidia machine recently, and you know what was there in a changelog? "Fixed an issue..." Will you call out Nvidia on "bad drivers"? Nah, cause you are a sheep, apparently.
the AMD naming is a good thing. previous naming could confuze where is the GPU and where is CPU. example R7-7700x or RX-7700xt pretty similar for someone who is shoping for a gift or planning to upgrade their PC
@@gerarderloper Depends on, when they switch from RDNA to UDNA they will more than likely do another fucking naming change but it is what it is. Now RDNA5 could go the Vega rout and just be its own thing then UDNA1 would be whatever it would be
It's not about nvidia. It's about consummers being stupid. Do you need tesselation at max settings ? Do you need hairwork ? Do you need physX ? The answer is always no because these are settings above ultra on a few select games. And by the time the tech become mainstream, AMD runs them as well as nvidia. I am playing without DLSS and without ray tracing today and i don't miss it. Nobody is forced of anything, it's not like you had to pay to unlock those settings and now you are traped in the nvidia ecosystem because you don't want to lose all that money you spent. It's not apple.
Well nowadays we’ve got games that force rt on with no way to turn it off so soon enough we may actually need good rt performance to play the latest games, we have a not so bright future
Apparently Hairworks can work on CPU and it was glorious on my old rig, cause it allowed me to play Witcher 3 on ultra on shitty GTX960, cause I had decent i7 CPU. And then they patched it... It's literally a feature lock. And due to IP laws AMD cannot even propose serious alternative most of the time. I'm now on full AMD rig, but it's a bit stupid that I get worse experience on top of the line AMD than on middle of the road NVidia, just cause hairworks is locked behind hardware.
@@PyromancerRift Do you need them to get the game to run? No. Do you want them for the best experience? Yes. If clowns like you had your way, we'd all still be playing on a 486 with no 3D graphics cards because "Ugh, it can run Doom and Command & Conquer just fine! We don't need new tech!"
yup. its AMD cope. Do i like the pricing? nope but Nvidia invests a boatload in not only developing tech but they go to game dev studios to help implement said tech. AMD needs to compete like they do in the CPU market. Intel was dominant but AMD made moves. I switched from a 8700K to a 7900X3D then sold it and got a 9800X3D And all this crying about ray tracing is so dumb! IT'S OPTIONAL! You don't have to switch it on
@@samson7294Well no your missing the whole point of the video and here you go again with this, dlss was not meant to be a technology of injection lf fake frames to play at 60 fps in most games secondly all the technology enables is slop developers to un optimize their triple a or modern slop 😂
@@samson7294 That's missing the point. Getting tech implemented to make games run worse on the competition is the kind of stuff that gets the government involved in other industries. Heck Microsoft was forced to give the option to install browsers on setup because MS coming bundled with Internet Explorer was considered unfair competition.
16:00 Don't know why but it sounded like you said Ketchup instead of catch-up. Love your content! - And I don't like the new naming scheme. It feels like what AMD did back in the Athlon days and they had to put the Performance-Rating (PR) on their processornames to compate with Intel. It was first when they threw all that away and made an entirely new name (Ryzen) and architecture that they could establish themselves and distant them from Intel. Hope they can do the same with UDNA in the future against nVIDIA.
my RX 6800 over a year old now bought it 2nd hand in 2023 ex mining card...still going strong. Paired with my 5800X3D. I'm waiting for high performance APUs so I can downsize my rig to a MINI PC>
@@Epic3032 Extremelly lucky it even works for sure, but i personally wouldn't take the bet not only because the odds of it being in a good state are horribly low, but also because crypto scum are amongst the worst people, the entire thing being extremelly tightly linked to organised crime of the worst kind
@@LeNathDuNet a mining card thats been on 24/7 and likely undervolted in a clean server rack will more than likely be better than a 2 year old gaming card on a PC hat's been booted on and off power cycles for more than 3 years...
@@UncannySense ho because you thought they treated those like professional server racks? Let's just ignore the hundreds of thousands of totalled cards because these idiots powerwashed them with water to resell to other idiots
The first time I downloaded PhysX was when I installed ghost recon advanced warfighter 2 for PC. It was really cool for that time, which was early 2007 iirc. Explosions in a destructible house would send debris flying all over in a way that just wasn't possible without PhysX.
I bought RX6800XT almost 3 yrs ago because 3080 was 2.5x more expensive than this GPU for which I was able to buy for less than 500$ with full waranty so... 3 years ago that was extremely cheap. Is it bad? No, everything I play runs great cannot RT enough CP2077 though but buying overpriced nvidia shit for 1 game doesn't justify it for me. Suprisingly stalker 2 runs decent but that's probably thanks to my X3D CPU. We don't have like good options right now. My budget for GPU is like 700$ max anything above since I play games rarely due to time constraints I'd rather invest such moneh in my oldtimer motorcycles collection which gains value with time.
AMD should focus on their driver stability. I still have issues with their Adrenaline app crashing randomly every few hours and crashing both games and games crashing the app. Incredible. I'm not even one of those that just update apps, since I am comfortable with using ddu and safe mode and such.
Raytracing is just another Phyx, hair works ect. I believe Nvidia wanted to add something that would cripple AMD. And honestly idk how raytracing even got as far as it did. It's another useless gimmick that no game can run well with. It being use as an argument is stupid as well.
No raytracing is actually the future of rendering. You never gonna get rid of it. Its here to stay. Next console generation with ps6 every new game is gonna be using RT from the ground up.
Ray tracing is an incredibly useful tool for developers. If and when ray tracing capable (and I mean fully capable, so probably not for a while) cards are wide spread, developers will essentially never have to bother with manually setting up lighting tricks the way they have to do now. It's an insane cost and time cutting feature, it's just not able to shine quite yet
Sorry but proper raytraced global illumination or even better pathtracing is NOT a gimmick. It absolutely transforms the games graphics positively while also reducing workload on environment designers. Its rather obvious that you never actually play raytraced titles and much prefer insane framerates. DLSS is so good at quality dettings that it results in better image quality than the best AA solutions, while giving you a massive performance boost. Back in the days we were VERY happy hitting mid /high settings at 720p in crysis. I hope nvidia keeps pushing tech, because AMD sure as shit wont..
If you enable PathTraicing, you have to disable rasterization options that do not affect the final visual quality but that reduce FPS. They are not automatically disabled when using PT, in most games. DLSS in quality mode or DLAA are too much for most if PT is used, with lower resolutions it looks very good and with more FPS when using PT. Pathtracing replaces many things in rasterization so they must be disabled.
It's not imposible to keep up, AMD has to focus on mid tier user like they said they would, high end users prefer nvidia low end and mid end could go with AMD if properly educated, in almost all aspects AMD offers more for less money, but on low to mid end people still pick nvidia over AMD and that is something AMD has to change if they want to keep up the pace
If Amd doesn't botch the launch I'll happily switch to the 9070xt from my 3080. FSR has been the main thing holding me back since i play at 4k and fsr looks terrible in most games, But if FSR 4 looks promising and the 9070xt is actually very very close to a 4080 for $499 I'll buy that. But Amd have a bad habit of disappointing so we'll see how it goes. I'd rather not get another nvidia card but if Amd fails I'll probably get a second hand 4080. Fingers crossed for Amd lol
@@MyouKyuubi this is where AMD is not superior. brand loyalty. even 4070 at $600 outsells 7600 at $300. 7600 compete with 4060 and yet it's outsold from 4070. 4060 laptop and desktop outsell 4070. imagine how much 4060 outsells 7600. there's not even a contest. there's more contest with 4070 than 7600. Nvidia wins from brand loyalty, and most people don't even know this. it's like trying to outsell Iphone. not gonna happen. Nvidia is the Iphone of gpu, but unlike in phones where there's Samsung. there's no Samsung in gpu.
I was an Intel/nVidia fanboy from my early gaming days in the late 90's. Bought my 2nd AMD system ever back in 2023 and went with a AMD GPU as well. The Intel/nVidia prices got utterly insane at some point. AMD got better price/performance ratio if you disregard RT performance.
Upscaling is a misleading marketing term. It should be called rescaling since it lowers the resolution then uses AI to upscale it back to the target resolution.
Considering Nvidia’s GPUs keep becoming more expensive than previous generations or even have worse power consumption, Nvidia’s domination can’t last forever right?
They're currently frontlining the market's current obsession with AI. The massive price jumps is just piggybacking off of their recent success in that area. So long as shareholders demand they profit further from the hype, nothing will change.
Worse power consumption? Maybe higher end but the 4060 is one of the most efficient gpus this generation and does wonders in the laptop and mobile space. Any competing amd gpu swallows up power like a whale. You can't beat 115 watts.
Just in the process of jumping ship. Sick of the prices now. Managed to get a 7900xtx on a great deal, so went team red. Arrives tomorrow, and hoping the switch goes smoothly
I used Tessellation with ATi Radeon 9000 cards in Unreal Tournament. That was over 20 years ago. It was nice allowing you to have more imaginary polygons between vertexes, but yeah that Tessellation wasn't the same deal that was later introduced that worked more like parallax effect.
One big thing about buying AMD Graphics in Germany is the Wattage of the GPU. Yeah i paid like 100€ more for a 4070ti instead of buying a 7900XT but like this i can play my games with raytraycing and dlss + undervolted to 165-185 watts on max instead of nearly needing twice as much while paying 33 cents per KWh. So as long as u keep ur GPU for more than 2 Years, or play more than 2h daily on 300 days of the year Nvidia will be cheaper than going AMD while offering more. So its hard to not go Nvidia over here if u have this in mind.
The 7900XT undervolts so well, it can play Elden Ring fully maxed out with max RT at a locked *native* 1440P 60FPS while using 125w. Less than a 4070Ti Super ever could at those settings. You punked yourself by not researching AMD tuning. It's wild on the 7900 cards, you can choose to be super efficient, balanced or Ballz2thewall performance.
I don't understand why you think the 4070 Ti S is meant for 4k with RT. Regarding Ray Reconstruction, even NVIDIA itself said that for now it will only work at an acceptable FPS with DLSS 3 (unless you have a 4090 for 1080p gaming). I hope that I will live to see the moment when turning on RT will not drop FPS by 2 times, when there will be enough RT cores for processing. And we are likely to see the 6070 series consume power at the level of the 4090.
They can. But not in 4k. Like DLSS was created more for 4k gaming than 1080p, but use it now with the words "medium settings, upscale balance, 1440p at 30 FPS". 4k monitors are more expensive, rendering such a resolution requires almost 4 times more performance, which the graphics accelerators of that time could not do (without DLSS). Even the "jacket" says "the more you buy, the more you save". So what am I getting at? These are the times of AI, when the performance per watt has increased significantly, which cannot be said about the gaming industry. For me, the 90 series has always been either for 4k or for VR (which is essentially the same 4k+) at maximum settings with the DLSS. In addition, 4k became popular too early, and video card manufacturers were not ready for it. RT / Ray Reconstruction / Path Tracing is still very "expensive" processing for GPU and you should not expect 60 FPS in 4k on the same 4070 Ti Super.
So Tessellation was used by Nvidia at a crazy high level, so high that is was next to impossible to notice it. Cutting the Tessellation level by half caused no change to video quality but suddenly AMD cards ran it fine. Hairworks in Witcher was another example of something like this being done.
yes this is so correct, thanks for spitting truths Vex. Many many Games do hidden optimization (deoptimization for others) in favor of their sponsorshipped brand. Thus not only market share affected also competitive gamers are affected.
Nvidia giving consumers innovative features is pro consumer. But it is very anti-AMD. If you are upset that Nvidia innovates you probably have an irrational emotional attachment to Intel or AMD.
Pro-consumer is giving away the technology so that the consumer is not locked down. For example PhysX was not coded to run on x86 so that CPUs couldn't run PhysX without massive performance losses. Not letting the consumer run PhysX on the CPU is an anti-consumer measure.
@@ChucksSEADnDEAD Lmao your example is awful. Nvidia purchased Ageia, who owned and created PhysX. At the time CPUs could run PhysX but not very well at all. You had to buy a dedicated Ageia PhysX card to run PhysX but they were expensive. When Nvidia bought PhysX they integrated PhysX into Geforce meaning users didnt need to buy a second GPU. it was quite a consumer friendly move. I remember it first hand, im 38 and ive been doing this since I was teenager. Even today PhysX isnt handled well on a CPU and you would want it on the GPU. Of course money would need to be spent to get it to work on Radeon cards. Why should Nvidia spend that money and help its competition? Also, If you look Nvidia spends a far higher percentage of its profits on R&D than AMD does. Why should Nvidia give the competition the fruits of its labour? This has nothing to do with the consumer and the consumer doesnt benefit if AMD get their hands on Nvidias technology. You appear to be mistaking healthy competition for anti-consumerism. AMD make billions in profits every year. They are not short of cash to compete with Nvidia. They merely choose not to spend it. If they were broke and getting crushed by a massive multi national then id understand but that is not the case. AMD are just keeping the money.
Borderlands 2 was THE PhysX Game back in 2012. Hella unoptimized, but those dynamic cloth sheets that could tear was some awesome tech to me and the fluid effects of stuff like toxic goo looked really fitting to the Borderlands art style.
@@mradsi8923 seems it really likes 1440p though and gets beat by the 4060 on 1080p......... That is a thing to think about as many budget gamers still game on 1080p, or use a 1440 card for multiple years and game 1080p in the last few of those years.
today we would like to introduce drum roll please... THE BRAND NEW RTX 6090 TI SUPER WITH 12GB VRAM GDDR8X HALF THE CUDA CORES HALF THE SM'S SHITTIER OVERALL 20% BETTTER DLSS, UPSCALING AND RAY TRACING FOR THE SMALL PRICE OF $5500 USD AND YOUR GRANNY, remember the more you buy THE MORE YOU SAVE!!! (features sold separately)
Nvidia tried to do the same thing with RT that they did with CUDA, be the first out of the gate and do it better than AMD so devs would feel forced to use their shit and no one else's. Thing is this time AMD is in the consoles and they are a huge portion of the market so ngreedia couldnt cut them out of the market. It is making RT slower to adopt but you can really see the devs that are in nvidia's pocket (CDPR-etarded). RT hardware can't be copywritten but ya know what can? DLSS so that was their plan this whole time. Try and force studios to put RT in their games and then ngreedia sells you "secret sauce" to fix its performance hit... especially with path tracing. The only way to fix this is for people to get off of ngreedia's dick and at the same time AMD has to stop fucking up each and every GPU launch. FSR4 is coming out when DLSS is going to get its next big update so AMD's software is still 2 years behind on the GPU front. They need to price their cards accordingly yet they never fucking do.
Wukong being heavily nvidia biased made me stunned by the overfocus of the online community on the culture war stuff They robbed you of your gpu's extra performance A pc 6500 XT is 30% faster than my laptop's RTX 2050. Yet both has similar performance according to HU's charts and my laptop's benchmark
I am building my first PC right now from scratch, on my own. I haven't built one in 15 years or so and never without help. It is absolutely wild what I have come back to. I have a MOBO and a CPU (670X & 7950 X3D) and now I am waiting for the new cards. I wanted to do Nvidia since I have never had one before but I do not like the way they're doing things. I'm thinking of grabbing an 7900 XTX or something newer, if AMD makes anything worth checking out around that performace and a lower price than the green team. Otherwise I am "doomed" to grab a 4080/5080. Anyway, I have been checking out your videos here and there along with GamersNexus (
The 7900xtx may actually be the most powerfull gpu from AMD in the next 3 years since amd said they wouldnt try to make a high end gpu this year/gen. Just to inform you que you may need to brace yourself for disapointment
You dont buy by brands morale, you buy a brand becsue they provide with the product you need or want, this is not a multi year relation, this is a one time deal wish you make a one day transition, you get the good they get their reward. Period.
I don't care how good Nvidia will get, i'll never buy from them, until they stop their scumbaggery. Nvidia have done this kind of thing for the last 20+ years. Research new tech, make it proprietary, pay off game devs to optimise for their cards and put the new tech in their games, now gamers have to buy their cards because it runs better on Nvidia and it has exclusive features, since they partnered with the devs. Don't care for this mentality. In the mid range (where my budget is), AMD is still king in price to performance anyway.
Well, personally, i wont buy Nvidia again, because they treat their own customers like trash... I'm one of the suckers that fell for the RTX 2080 TI scam... First they burned me, by making my Sick SLI-setup invalid, by discontinuing SLI-support, i could only run my games on ONE card, which was a huge performance hit... But i lived with it for many years, until i decided to upgrade to that 2080, and then although my performance improved, i still had crazy stuttering issues... So that was the second time i got burned... And then 1 month later, the 3000 series released, making me feel like i wasted my money on the 2080 TI. I'm not buying from Nvidia again. :) I'm buying AMD, and, if AMD falls off a cliff, i'd rather buy Intel GPU's, than Nvidia... And if Intel AND amd falls off a cliff somehow, i'd rather buy and play IRL board-games than buy an Nvidia product, or heck, even give those obscure chinese knock-offs as try! :P
@@Codyslx From a business standpoint, sure. But if they wanted, they could at least make it so that at least previous gen cards can use some of the features of the new ones, but they lock that away too. They are anti consumer even with their own customers. Not to mention the outrageous prices, or how they sell you a worse card for more money year after year(3060 vs 4060for example). And if AMD can give all gamers FSR and frame gen for free, so could Nvidia. Most of their money is from the AI, crypto and big tech side, not gamers, but they nickel and dime everyone. They still sell 8gig cards for more money than AMD does their faster 12gig cards.
@@misterlobsterman How will previous gens run software that require specifc hardware to work? There is a reason why amd is moving to hardware level machine based upscaling, that their other gpus most definitely won't have access to. The 4060 is also better than the 3060 even removing dlss3 and dlssfg all while only using 115 watts max.
@@Codyslx Hmm, i think that means Nvidia isn't confident they could stay on top, performance-wise, and was afraid a competitor would pull it off better than they could. If you make something propriatery, you are only showing you fear what your competitors might accomplish with it. It's cowardice, is what it is... If i was confident i could remain on top, i would share the tech, and still remain on top, everybody wins! Heck, my competitors might even invent something new off of what i invented, which i can proceed to innovate on, in turn... perhaps we'll accidentally discover something that can reduce production cost by 50%, allowing us to sell as the same price, but still increasing profits.... etc. So actually yes, it is smart to "supplement" your competitors... Only someone who isn't smart enough to see the benefits, would think it isn't smart. :P
Cuda and other tech like tensor cores is huge. Cuda acceleration is huge for both gaming and also productivity. Funnily enough, a lot of what nvidia has implemented as propriotary stuff, is stuff that also was introduced in the PS2 tech demo, showing the new possibilities the PS2 had, like better simulations.
i don't get your point. Are you frustrated that Nvidia keeps introducing new features for their hardware? because it makes developers lazy? it's like saying cars are bad, It makes people not want to walk!
Also this thing about DLSS and AI features that nvidia has. They're always saying "DLSS BAD" "Ray tracing is a gimmick" "DLSS is cheating and fsr looks the same", istg it's like they prefer a giant bulky steam engine instead of a modern v8
@@M4x505. FSR looks objectively worse, lmao... And i'm using a 6800 XT for my main system... I'd rather use proper anti-aliasing if i can, FSR looks awful... DLSS does look better, and would be the preferred AA-tech, if developers weren't abusing it as an excuse to avoid doing the optimization work they're supposed to do... : / Ray tracing IS a gimmick though, is only looks SLIGHTLY better, but it costs 90% of your frames, as well as 90% of your bank-savings... It is so not worth it, AT ALL! 🤣 That's why i went with AMD, because AMD does the basic raster stuff, really, REALLY WELL, so i get MORE frames than Nvidia products get with all the RTX stuff turned off... That's why AMD is better, imho... AND is also cheaper, literally win-win. I only get sad when games, like Hogwarts Legacy, doesn't have normal anti-aliasing techniques, at all, and ONLY has TAA, and then FSR and DLSS, those are the only choices you have on Hogwarts legacy, is so silly. :P
@MyouKyuubi I'm an ai researcher. AMD is waaaaaaaay behind in the business side. developers don't even optimize around their hardware anymore. i hope they step it up
@@AmrAbdeen Oh you mean market share? Sure, i guess... However, Whenever you buy a console or a laptop, what CPU does it have? AMD, what is the best gaming CPU on the market right now? AMD... What's the cheapest gaming hardware? AMD! Only reason Nvidia, and Intel have the most market share, is because they're partnered up with Microsoft and local PC assemblers to promote and sell their stuff, before anything else... Trying to create a closed system where they can each have a monopoly on whatever they're each selling... Like an ouroboros of parasites. The fact that AMD ISN'T engaging in monopolistic practices like that, makes me respect them more. :) After all, you AI-researchers will soon be out of a job, once everyone realizes AI is just the latest buzzword and snake-oil, to get people to buy ewaste. I haven't heard of a single individual bragging about how much they love AI, everyone, even the target demographic for this AI nonsense, absolutely f***ing HATES AI, lmao, to the point where they're literally sick of hearing the word "AI"! xD
@MyouKyuubi why so defensive? out of job.. buy ewaste? That's the most incoherent take i have seen so far on research. you know that mostly everything tech you use day to day depends on machine learning one way or another, right?
I genuinely do not understand the loyalty to Nvidia.. I've been using AMD GPUs for a long time, and they've been legitimately customer friendly with their practices since way back. The drivers are more user friendly, and they require less overhead. I'm not sure about the newest revision of Nvidia's driver software, but AMD's has been more feature packed for years. 🤷 *It really has to be because Nvidia makes the top tier card. If AMD took the crown, we'd see a massive shift just like with CPUs.* I truly believe that, but I don't think it's going to happen any time soon, since the rumors are AMD plans to only target 4080 or 5080 levels for their top card (whichever it was).
Nvidia copers fill the comments section with their budget 30s that can't even use DLSS 3 and still they cope. This is the reason why we can't never leave Nvidia is because of idiotic consumerism.
Yep, idiotic consumerism is the problem here, not Nvidia. Although Nvidia has done a lot of shady s**t, this, in particular, is a consumerism issue, nothing else. : / I went with AMD, i got burned by Nvidia's actual wrong-doings three times, and i called it there... First they discontinued SLI-support, rendering my fresh PC setup invalid, but i gritted my teeth and stuck with it for many years... Then i bought the 2080 Ti, the long-awaited upgrade, which stuttered like crazy and was barely an upgrade at all, and then 1 month later 3000 series comes out, making me realize i got scammed, twice, by Nvidia. Never buying Nvidia again after that, lmao, i'd sooner buy Intel, or even an abscure chinese knock-off before i ever consider buying scamvidia products again! 🤣
Just enough people buying everything what NVIDIA throws on the table. Just waiting for real vendor lock in by NVIDIA, where games can only run NVIDIA cards.
physx was primarily about fluid and particle simulaiton and was a big deal for about 10 years, it was tech developed by a company named ageia and is now open sourceware
The fact that AMD and Intel are not catching up in the GPU field hurts us consumers because of the lack of competition. Competition drives innovation and drives prices down, both being good for us consumers.
The fact he's never seen PhysX in a game, blows my mind. So he basically never played any game that came out in the golden age of computer games (2004-2018). Modern gaming is is a trap, it's 90% fiddling with settings and it never looks good. Older Games even look a lot better imo. Nothing beats clear, sharp graphics on a 2k Monitor. Modern games are like: "Okay it's sharp, but not if you stand so far away, we can't handle good Textures anymore. Also have fun with transparencys, depth of field with DLSS together, it's a mess, good luck...".
@@derbigpr500 even then, I would rather get lower tier Nvidia card than an AMD one. Never again, drivers suck (and yes I did ALL the ddu and everything). Plus, if you want to do anything productive with your system there's no choice but green team.
Yeah, i ran the original Half Life game a while ago, and i noticed there's no anti-aliasing settings in in the menu... So i decided to inspect the anti-aliasing the game just runs, natively... and it's fking amazing, literally flawless anti-aliasing, it looks spectacular
I actually just really appreciated the history lesson for the early stuff that was before I got into PC gaming and building. The whole video was good though.
this is monopolistic, we should really class action nvidia and make them open source cuda, and allow vram to be upgraded, the only reason for any of it is to protect a monopoly thats sad its millions wasnt enough
Why would the government make CUDA open source.. NVIDIA developed hardware and software for it. Its their intellectual property. Lmao. Or are we becoming socio-communist now??
AMD should never have played catch up for these stupid tracing features. If I was them, I would have collaborated with developers to maximize performance without the need for an upscaler. Also, I want to criticize the devs on the same topic: I think they wasted time on implementing ray tracing and path tracing, plus this need for realism is silly when the game's art style was always more important. I don't want a game that's 100 GB, needs the latest hardware, latest features, looks like real life. For realism and path tracing, I just go outside (and it doesn't even cost anything).
@@TopShot501st Median market price for Battlemage-580 is 380 Bucks/Euros, if you can find it. And you also need HighEnd CPU (5800x3D and upwards) to run Alchemist/Battlemage (Intel GPU overhead). How is Intel the cheap option?
The old benchmark of 60 fps, dlss off, ultra quality, 4k res was such a good time for progress. Now we got this ray tracing/path tracing garbage that requires dlss to hit 60 at medium/high instead of ultra at 1080p. The whole ray tracing tech has caused this backwards trend and until people start giving backlash on this change in testing methods, its only gonna get worse
AMD performs much better than Nvidia when there is a CPU limitation. Nearly all the stupid graphics card comparison tests are done with the fastest CPU available, which hides this fact.
The problem isn't the graphics cards, it's the developers who have stopped optimizing games and are patching everything with dlls and frame gen. Even the game requirements state that to run a game you need X card with dlss enabled to play at 30 fps 1080p.
100% agree. We're going down a road where game devs are praying that everyone buys flagship gpus so that they don't have to spend time optimizing their blurry TAA-ridden games.
And who do you think pulls the strings? It is not the game publishers but nvidia and amd.
It is cuz developers are making game with upscaling in mind and it comes from the consoles.
it's insane yes i'm stupid for having a 4060 but the new monster hunter title minimum specs requires a 4060 like what? these games are poorly optimized af will be upgrading with the 50's series
@@thewingedringer Put yout aluminiom foil no head. What amd or nvidia have to game optimisation
Old McJensen had a farm AI-AI-ooo
lol wtf
LOL²
I love this haha
This is stupidly funny 😂
haha
Blink twice if Jensen is holding you hostage 😥
😐
😌
LMAO
It’s okay i have an AMD cpu 😢
😖😖
Blink three times if you have Nvidia Stockholm syndrome and absolute LOVE being hostage of Nvidia and enjoy all the awesome software progresses from Tessellation to Raytracing and a dozen other goodies! ;D
Don't forget G-sync, the proprietary hardware based variable refresh anti tearing solution, requiring a Nvidia GPU and an expensive G-sync monitor. AMD solved that one tho!
Yup.
Nvidia have me hostage, as my monitor is was £900 when i bought it and only has a G sync module in it so i always have to upgrade to Nvidia or go AMD and buy a monitor too.
Not only AMD fixed it, but Vesa added standards for it too.
And now, Nvidia GPUs can use freesync (the AMD version of it) as well, which it couldn't before and was a huge mess.
the tech to make VRR have been always been there but no one really thought using it on PC especially on games until nvidia come up with it. and when nvidia did it they are not doing half assed solution that might or might not work. early freesync was a mess despite AMD have one year and a half to study what nvidia has done with Gsync. anandtech have a good article about this issue before when AMD finally realized there is no such thing as "free" if you want to come up with a proper product with good experience.
You dont even need AMD, HDMI 2.1 solved it. Nvidia was just first, just like with everything.
"Neural rendering" is just the newest marketspeak. People are skeptical of the term AI because it is constantly misused in marketing, so here is a new one.
true and all Neural rendering does is make the edges really smooth and pop certain textures to make it look EVEN MORE realistic... feels like a rebranded Tessellation.
Are you planning to continue to call this feature marketspeak when AMD announces their own version of it in 12 months, or will it stop being marketspeak all of a sudden?
While true many companies have abused the word Ai. However when it comes to a GPU, I can believe it being the actual terminology (similar to how DLSS IS actually using ai trained models)
@VintageCR In a sense, you're spot on. From what I've seen, it's pretty much to Tessellation+Hairworks/PhysX what DLSS is to native resolution. Not as accurate, but faster. It's what Satya Nadela calls agent in terms of comparison to large models. Smaller, narrow and specific usecases. (just pointing out Nvidia isn't even inventing any of this, just first to leverage it's hardware to implement this stuff).
You nailed it.
Let's spend 300% more money for 10% improvement, that makes sense!
Most viewers: 😡omg this is unnacceptable
General public regardless all tech youtuber guidance: 🤩😍
(rinse and repeat for each generation 😭)
@@akosv96 That is exactly why people are still buying cards like the 4060
@@akosv96 To be fair, tech youtuber guidance is pretty useless to most consumers.
@@duxnihilothat’s why we watch vex because other people are lying to us to make money
well, you gotta remember, the more you buy the more you save ;)
CES Nvidia / AMD keynote script leak:
AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, GAMING, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI...
Yeaaaaa, you know what's coming.
Nvidia is not n suffocating amd. They just arenproving quality of image and beautiful graphics are NOT on their priorities. Amd images are sharper and colors are accurate and bolder
RTX has just been a 6/7 year long beta test as far as gaming is considered
It's damn successful not as a beta test, but as a marketing effort because in the process they've conveniently drill it into our minds to call it 'RTX' instead of ray tracing, as evidenced by this comment.
In MGSV terms, Nvidia earned tyranny status and made their subbrands, the 'lingua franca' of cutting edge graphics.
Pretty nice for a beta test.
yeah and I doubt 50 series will change that much. The 4090 was a nice attempt but really we would need DOUBLE that RT/PT performance JUST to be able to enable it at 4k without terrible upscaling and framegen solutions (they blur/smear the image a lot in many cases)
@@gerarderloper It is not "terrible" in any way, in fact it looks really good for the level of extra performance it gives and makes something unplayable, actually smooth. Without framegen you simply wouldnt be able to afford a GPU that puts out that level of performance. Upscaling is the future. The technology needed to give a raw double a 4090's RT/PT performance in a consumer size GPU without upscaling simply doesn't exist. Even if it did exist, it would cost a ludicrous amount to the point that developers wouldn't even bother implementing that level of graphics if only people with a 10k USD gpu can run that setting, which will be a very small segment of the market.
RTX is branding, not a feature!
PhysX was originally physics acceleration on separate add-on cards made by a company named Aegia. nVidia bought Aegia, integrated the acceleration into their GPUs. And then proceeded to be an evil corp as per usual.
^This. Imagine if physx was never bought out , stayed a separate add in card that didn't hurt fps and was available to more game developers without exclusivity rights from Nvidia. Could have been a game changer.
"Evil corp" lol tell me one for profit publicly corp that ain't evil lol
@@DragonOfTheMortalKombat Ew, not this argument. You obviously pick the less evil company instead of just going "HURRR DURR NO ETHICAL CONSUMPTION UNDER CAPITALISM" and buying from the people who eat babies instead of the ones who just punch them
@@DragonOfTheMortalKombat It's graded on a curve.
wasn't it the thing where you could buy an extra component to put in your computer on top of your GPU that would handle just PhysX for more performance.
Actually ATI introduced Tessellation to Radeon before Nvidia did. It is just that when Nvidia implemented in a way that was more performant. PhysX was special hardware, that was eventually bought by Nvidia and then they killed the hardware and then vendor locked acceleration to there GPUs, then eventually they made it all CPU based. Hairworks was just a crappy unoptimized version of hair simulation, which was done much better by other tech. DLSS and all that jazz is suffocating PC gaming, simply because its being pushed on the devs to implement, which is a problem for PC gaming being an open platform in the long term, pushing APIs that only work on one set of vendor hardware is really bad, and resembles back to the 3Dfx days when Glide API ruled the roost, making it super difficult for competing GPUs to compete on a even play field.
PhysX since the very beginning have both CPU and hardware accelerated part. but Aegia want to sell those PPU so the highlight is mostly on the accelerated part. and the initial CPU physx also being accused to purposely using things like x87 instruction to make it slow running on CPU. nvidia over haul PhysX development with PhysX 3 making the CPU part more performant and compete head to head with havok. this is when some big game engine start ditching havok in favor of much cheaper PhysX in term of licensing price. nvidia eventually make the CPU PhysX open source in 2015.
@@arenzricodexd4409 Yeah they did open source, when it became useless to do the vendor locking bullshit. Also it didn't end up helping them capture more GPU market anyway.
@@diablosv36 After all its nvidia. They will try to milk as much of money as they can from the consumer.
Dude the fake fps trent right now is making me consider console gaming -.-
Exactly.
this is why I haven't bought an nvidia product since 10 series, they're cartoonishly anti-consumer with their behaviour and pricing
Every company is anti consumer. If amd where at nvidias level they would do the exact same thing.
Its scary people actually think companies have morality and are on the consumers side.
@dante19890 sorry, where did my post say that AMD is inherently pro-consumer?
it's scary how you're either illiterate or jumping to conclusions
@@dante19890i fully believe that AMD would be just as bad if on top, but that doesn't mean we can't expect better from these brands. There are still brands out there that don't rip off people like Nvidia or Apple do, even if they are on top.
@@interrobangingsamd is not good company as well cuz they fuck up non stop but at least they are not that greedy as nvidia i have 3 generation of nvidia gpus and im not plán Ing to buy another and if you ask why well i no longer wanna pay for premium when it nôt give mé whats been promised.
cuz they are innovating?
I couldnt care less for raytracing and all this high end stuff. Only want stable high fps
More developers are putting raytracing into the foundation of their games. It will soon be inescapable.
My cheap ass 4050 runs every game i play at constant 60 fps.
sadly games are coming with it as default, no way to turn it off.
Ask game devs and publishers for that
@@gerarderloper mods
AMD doesn't need to keep up - they can do million of things. They have got 3 different tech for path tracing ready and fsr is co developed with Sony and Samsung. AMD needs to start playing their own game - maybe chasing nvidia is just cheaper.
they should try making their own features. They are just copying what nvidia does. Even in PS5 Pro they think PSSR can't even compete with dlss.
what are you talking about, they are behind in every metric.
"They have got 3 different tech for path tracing ready " - source?
@GreyDeathVaccine GI1, brixels and path tracing optimizations. amdgpuopen has the papers and videos. AMD has options - it's just complacent.
@@michahojwa8132 so you are saying they somehow have the tech that they are too complacent to develop? huh??? You are not making any logical sense. Sounds like major cope.
Thank you for the shout out!
Very glad to see you you bringing this topic closer to the surface!
I love how he pauses for a brief moment on the rabbit liquid scene
He had serious Zootopia fandom flashbacks
Fun fact: The Witcher devs coded hair works wrong in the original copy of the game the one before they released the next gen patch. It use to run on CPU and not GPU that is why it was a big performance hit
I didn't know that. That's actually funny :D.
Irony is all these nvidia's features require more vram 😄
& nvidia is too greedy to give vram
That’s their sales model; for nvidia not a bug but a bottom line feature. Copying Apple.
I think they are just afraid of building cards that are going to last for too long, like 4-8 years.
they can afford to, but the strategy is even if you buy higher end to make you feel like the highest end is better. you could slap 16GB or more vram on every card in the series without a major price difference.
that's how nvidia being smart with it.
It's too much greed coz 8gb vram costs around $40-$50 for nvidia to manufacture
Even with a 4090, you need DLSS Balanced with Frame Gen to do path tracing and DLSS only looks good at Quality for 4K to me. NVIDIA does this so you want to buy the next gen 5090 hoping it will finally do path tracing with minimal DLSS. If it does, then NVIDIA will introduce the next tech into the games weeks later so the 5090 will not be good enough again.
They likely don't want anybody turning down the DLSS... That's likely their substitute for any real upgrades from here on out. If the DLSS is not needed, it becomes 1 less marketing point for them.
a 4090 runs marvel rivals at 90fps on 4k........that is all kinds of sad.
We need better optimization in games.
Wait, so they keep introducing better and better tech as hardware becomes capable of handling it? Oh the horror.
@@Robsham1 I wish it were that simple.
@@transformerstuff7029 90fps with 1/4th of the frames lacking any user input given the game runs isn't actually at 90fps lol
Soon RayPathSuperTracing will be available for just 4k$ on 480p... and this will be midrange gpu because top range will start from 20k$...
Damn, choom...
RPST , it sounds like PTSD lmao 🤣
@MidnightDrake the translation is wild lol
"We've got ray tracing and sky reflections all turned on. Now let me go to this skyless, dark pit to demonstrate absolutely none of it."
and yet even when none of it is shown it still more than halved his perfromance
Sky reflections require 5k cards.
Thats why we have raytraced global illumination so the dark scenes actually look fantastic reacting realistically to ambient light in the scene. People who think RT is just reflections are clueless.
@@WayStedYougood point
@@WayStedYou ya, because you still need to calculate rays... So it doesn't matter, when you will have the sky, fps will be the same because rtx doesn't care about complexity.
Also, he using not top card in 4k in one of the most demanding games, what was he expecting? Why not path traycing in 8k and 240fsp on 4050M? Just switch to QHD or turn on dlss and it will be playable. Change raytracing from ultra to mid or low, it will be still better than not having RT at all and fps will be more than comfortable.
2:38 "Whether or not the developers could have done better on the left side is questionable."
No, it's not questionable at all! They could have added some normal mapping to make it less than 1% difference between them! 😠 That comparison is typical Nvidia BS full of lies. They are liars and shouldn't be trusted. That game came out 2 months after the GTX 970 was released - which had 3.5GB VRAM! Technically 4GB but the last 0.5GB was SO SLOW that if a game hit that region it brought the card to its knees. Gimpvidia doing gimpvidia things, gimping them. Now they vomit these PCIe 8x + 128-bit for $400 (and wanted $500 for another 8GB VRAM). 😠😠
Honestly, I expect if there is no change in the market that AMD will stop producing consumer cards. It doesn't make them much money. Even when they release cards with good performance and priced well, people still don't by them. The only reason most normal consumers want AMD around is for competition. But even if they make the best bang for the buck card, they will never actually chose AMD. If Intel sees the same, they will drop graphic cards as well and we'll just have a monopoly. If you want diversity and variety in your markets, the consumer actually needs to support that. Nvidia knows now that there is no price too high, since most of their customers refuse to purchase other companies cards.
Well I ordered B580 (but I am not sure when I will get it), but man, this card is overpriced in Europe. I accept that I am buying a slightly worse product, but I will have QuickSync 🙂 That matters for me, since I already has AMD CPU and not gonna switch to Intel.
That's crap, Amd haven't been pricing their cards aggressive enough. They've been to far behind with upscaling and RT. They needed to price lower to compensate. They're catching up a lot in RT with the 9070xt apparently and they have fsr 4 coming which is now a.i based. If nvidia keep raising prices Amd have a chance to step in and take markershare if they're aggressive enough. I'd happily buy the 9070xt if it was 4080 performance and $499....but will they actually price it properly or match those performance levels? It needs to beat the 5070 by at least 10% in raster imo while being $100 cheaper. Otherwise it will be the same old story, this is their best bet so far so it would be such a fail if they botch this lol. We'll see in 8 hours i guess
@@PCgamingbenchmarktime Well, let's not forget that AMD actually needs to make money on those cards to keep making them. They can only drop their margins so low before they may as well not bother with making GPUs at all.
@HunterTracks who said anything about them not making money on them. The 7800xt was $499 at launch, later dropped down too $449. They're still using gddr6, the same amount of ram, i heard rumours that die size isn't that far off the 7800xt. The profit margins would still be there, its about making the card as efficiently as possible. Nvidia just has ridiculously high margins let's be honest. Cards can be cheaper than what they're selling and still be making big profits. I see no reason why the costs would be anymore than the 7800xt. The 2080 to 3080 had a 50% uplift for the same price, so a 40% uplift from the 7800xt for the same price seems very reasonable
@@PCgamingbenchmarktime and yet they tried lower pricing a decade ago until recently and still didnt gain market share.
4:49 sums up RT pretty well. You can actually see a difference with the lighting .. probably ..
Yep. It's so sad how people fall for gimmicks. Even when they don't see a difference. Branding is fucking powerful
amd has not as much money to throw at their gpu branch unlike nvidia. lets just hope they dont f up the prices this time
It doesn't work like that...
Prior to Ryzen gen 1 AMD had far less money compared to Intel. Yet after a few generations of Ryzen they eventually surpassed Intel in spite of them originally having less money.
The reason Intel and AMD and struggling to catch up to Nvidia is simply because Nvidia has the best engineers and best feature set.
AMD is doing quite well.
Contracts for both xbox and playstation, including their new ones coming out.
Also int he PC hand held world, they are about to release some next level chips for that.
And their CPUS are on fire.
Their stock is only growing.
AMD doens't need to focus on super high end. Not important.
Dude reality check for you. Nvidia dominates the PC GPU market with 88% marketshare....AMD has only 12% and this gap has only been widening year after year. on CPU side their situation is without a doubt much better and they are taking ground from Intel, but on GPU side they are not even outperforming RTX 4060 and 4060 ti, even though their GPUs at same price range are more affordable and offer better performance than 4060/4060ti. If you dont take my word for it, just go look at Steam hardware surveys. AMD has mountain to climb just to get more consumers on their side with low and mid tier GPUs.
@@DaveTheeMan-wj2nk Having basic features of a modern GPU is not "high end".
@@Totenglocke42 You're welcome to share your opinions :)
Nothing is set in stone, high end is determined by the individual and their needs.
How cool is that.
@@DaveTheeMan-wj2nk Except it's not. High end is determined by what exists. A top of the line GPU that has all the features available and runs games at the best resolutions and frame rates is high-end. A GPU that's using almost ten year old tech and can't run any of the modern graphics settings properly is low-end at best.
@@Totenglocke42 I'm happy with my previous comment.
AMD was ripping off Intel untill they started to simply make better products and not only performance wise, but also price was affordable to most of us casual pc users/gamers. AMD did so well that Intel is in the shadows now. If they gonna start to copy and beat Nvidias insanely stupid prices then we should'nt complain but be happy. Because I don't know in what world you guys living but to pay 1,2k or more for a gpu is stupid. Little % of people sees that price as affordable, to most it's few months savings that's left after paying taxes and rent.
Intel still makes better products than AMD. Only AMD fanboys who overdose on copium daily believe AMD is better now.
This is a complete misunderstanding of 40 out of the last 50 years of CPU history! AMD invented 64-bit x86, STARTLING INTEL. AMD crushed Intel with Athlon. Intel only survived by breaking the law MANY TIMES! AMD is very amall and made a disastrous design mistake with excavator CPUs - NOT COPIES OF ANY INTEL COU, EVER! Now they have Ryzen and pioneered chiplets and vcache while Intel sits on its fat lazy ASS! AMD were only a second source (printing licensed copies of 80xx cpus) in the late 1970s and early 1980s when nobody would buy your chips without a second source foundry to make more chips in case you went bankrupt!
Wasn't AMD producing like a faster 486 CPU back in the day?
@@transformerstuff7029More the Athlon XP series, AMD was on top then, a long time ago though
@@thechurchofsupersampling that x2 4600+ was pretty decent if I remember correctly.
And yea they reverse engineered pentium cpu's at first, and ended up improving them. So a amd 486 would be faster than the intel one.
Biggest PhysX titles back in the day: Bordelrands 2, Batman Arkaham Origins, Metro 2033 (original), Mafia II (original), Arma 3, Cryostasis, Medal of Honor Airborn, Just Cause 2, Assassin's Creed IV Black Flag, Dragon Age II.
Bordelrands
oh man the smoke and debris effects in AC4 were INSANELY cool at the time. I still had my GTX 770 after I upgraded to a 980TI and used it as a dedicated physx card
Bro Dont Forget ther 1st Mirrors Edge! I remember playing that and Arkham Asylum with my GTX260 and was blown away how well the glass and smoke effects looked!
@@marceladex9529 😂😂😂😂
Fear too
AMD should focus on software improvements before starting to compete with 80 and 90 class cards. Nvidia is thriving now because they took the pain and effort to make these technologies decades ago.
Yes, AMD is better value for 16 - 25 years old kids who are only doing gaming
and NVIDIA for 25+ years old adults who needs to game and WORK.
That's one of the reasons Nvidia dominate so much. Nobody cares to spend 300/400 euro more for a polyvalent GPU.
AMD fans are coping " Nooo AMD is better value, it cost 50 euro less and have 10 + FPS 🤓🤓🤓🤓 so it's better !!! "
Both of this guys above me spit nothing but facts. AMD should fr compete with Nvidia on technology, cause fsr stil sucks ass so bad 😂, the fact that tim from hardware unboxed say's that Xess is out right better is sad for Lisa su & her whole GPU departement.
@@zzephi bro its just a corporation that want's your money. Their not your friend or lover. They make a good product, but dominate the market and can make up whatever pricing they want. Corporate shills are gross as hell.
Tessellation, hairworks and phsyx wasn't developed by Nvidia, they bought them or basically stole them
What's wrong with AMD's software? Explain.
i do not understand the raytracing hype. yes it looks good. but why the fuck would i use it when you have to put DLSS in Performance mode so the textures look way worse so i can get playable fps. i just ignore RT in every Game. just a look what i can do feature to get sales for people with weak character.
It's still in its early stages, and tech needs time to develop and become mainstream. It also took quite a while for 1440p to become playable at an affordable price, and yet most people still play on 1080p. Meanwhile, there are 4k monitors.
@callmenoodles8955 going into the 7th year RT not that new anymore and is demanding same as it Has before.
Lighting is the most important part of game graphics. You might have a shitty GPU now, but in 10 years you will sit here in the comment and gush over RT and you cant even turn it off cuz it wouldnt have a rastorised fallback by that time. Its the future.
@@dante19890 people said the same thing when the 20 series came out and it still makes games run like shit plus i shouldn't have to wait a decade for something that's currently out right now
@@utopic1312it’s a future not to far off I’d say probably in 2 years or so since game devs give no fucks about how well their games actually run so they force rt on with no way to turn it off👍
3:00 Nvidia didn't introduce ray tracing. That technology is probably as old as your grandma.
they made it work feasibly in real time for games
You used to be able to have a separate Nvidia card on top of your AMD card and dedicate the Nvidia card to PhysX. So you could pair a cheap Nvidia with a powerful AMD card. Good old days.
They had to eliminate multi gpu tech, made too much sense
@@evaone4286 it had constant issues
@@utopic1312right. Not to mention that nowadays just buying a single good GPU is expensive, imagine two
@@utopic1312 developers weren't implementing it properly
@@potatonite2340 it was expensive for 2 then as well
15:30 They do that so the consumer can easily understand the naming of them, they've been this for the Ryzen 3,5 and 7 in comparison of Intel's i3,5 and 7.
They match the names to make it easier to understand for everyone, that's actually really clever.
The truth is: only a few games are Nvidia sponsored.
AMD has AFMF2, which Nvidia hasn't (yet).
I hope that AMD gives us something like to Microsofts AutoSR and makes it as accessible as RSR. That would be a great feature for older games, I think
AFMF is literally just lossless scaling which costs 4$. AMD is being a stubborn donkey and i hate it
@@Efsaaneh It's also worse than Lossless Scaling which is both hilarious, and depressing.
@@SeasoningTheObeseworse? In my case it depends on the game but in most of them AFMF2 is much better than LS
it's even funnier when you think how AMD has basically all the console market and still can't capitalize on that.
@@roller4312How are they supposed to capitalize on that?
”Buy our stuff, it's like a PS5 but better!” SURELY wouldn't piss off their business partner, Sony /s
Since the GeForce 256 Nvidia has been the king of dirty tricks. The most soulless evil corporation I have known for decades now.
Tesselation is not about giving Y value to textures, parallax occlusion mapping does that. Tesselation triples and even quadruples the polygon count to applied textures to create higher level of detail, which is why it was used sparingly, only for some objects for a long time. Now, try lowering tesselation to minimum, for example, in Forbidden West, see what you get. Then, there's soft PCF shadows, which later down the line turned into PCSS, unified shaders, driver forced FXAA for games with no AA, then driver SSAA in DSR. Nothing's changed, RT is the new SSAO and soft shadows, that need unified shaders, accelarated by CUDA, and therefore run better on Nvidia. Somehow, ATI and then AMD always play catch-up, making competition for Nvidia either in features or raw performance.
I recommend everyone to find the book titled The Elite Society's Money Manifestation, It changed my life.
I honestly hope AMD kicks some nVidia asses with the RX9700XT value/performance.
PhysX was previously a physical physics card that worked more or less like graphics accelerators like VooDoo did at the time. The company was bought by NVidia and its components were added to graphics cards for physics calculations.
Just a couple days ago I had a short argument with someone who claimed that Raytracing TOTALLY makes a huge difference. Another guy in that comment section also said that Pathtracing gives games a "hypnotizing quality" which is the most batshit insane level of falling for marketing I've ever seen. For some people it reaches almost pathologically obsessive levels. But anyway that's besides the point. The dude I was arguing was hilariously, basically already cartoonishly arrogant. His first statement - as per usual for NVIDIA hardcore fanboys - was that if I don't think of RT as great i must be running on an old shitty GPU. In his words he wrote: "I don't know about anything else but my 5080 is going to run that ray tracing beautifully cuck boy" lmao.
Often these people arguing on social media don’t even own high-end gpus that can do this stuff. They base it off of watching youtube videos and pretending they know. Hypnotized by the marketing. Like poor people who defend billionaires because they believe they will someday be one. 😬
I dont care about RT but I do care about upscaling, and FSR is completely unusable. You act like AMD fanboys arnt in complete denial about all of the driver issues. They all claim to have "zero driver issues" but AMD's own adrenaline patch notes list all of the games that constantly crash on all amd cards. It takes months for them to fix crashes in games like wukong, helldivers 2, lords of the fallen, etc.
@@Dempig you'll be surprised, but having issues in some apps is unavoidable. Did an update on an Nvidia machine recently, and you know what was there in a changelog? "Fixed an issue..." Will you call out Nvidia on "bad drivers"? Nah, cause you are a sheep, apparently.
@@Dempig took 2 weeks for helldivers, idk about wukong
uh you know some people do think it does and you dont. But they are insane and you are not? Bruh is digital foundry insane then?
the AMD naming is a good thing. previous naming could confuze where is the GPU and where is CPU. example R7-7700x or RX-7700xt pretty similar for someone who is shoping for a gift or planning to upgrade their PC
what is RDNA5 going to be called? 10000 series?
@@gerarderloper they could skip 10xxx and go to 11xxx
@@gerarderloper Depends on, when they switch from RDNA to UDNA they will more than likely do another fucking naming change but it is what it is. Now RDNA5 could go the Vega rout and just be its own thing then UDNA1 would be whatever it would be
@@gerarderloper there isnt going to be rdna 5 they are going to udna
@@Acrylick42seems like that would snowball into another naming problem eventually
It's not about nvidia. It's about consummers being stupid. Do you need tesselation at max settings ? Do you need hairwork ? Do you need physX ? The answer is always no because these are settings above ultra on a few select games. And by the time the tech become mainstream, AMD runs them as well as nvidia.
I am playing without DLSS and without ray tracing today and i don't miss it.
Nobody is forced of anything, it's not like you had to pay to unlock those settings and now you are traped in the nvidia ecosystem because you don't want to lose all that money you spent. It's not apple.
Well nowadays we’ve got games that force rt on with no way to turn it off so soon enough we may actually need good rt performance to play the latest games, we have a not so bright future
@AstroBot-bc0213 just the indiana jones game so far right?
Apparently Hairworks can work on CPU and it was glorious on my old rig, cause it allowed me to play Witcher 3 on ultra on shitty GTX960, cause I had decent i7 CPU. And then they patched it... It's literally a feature lock. And due to IP laws AMD cannot even propose serious alternative most of the time. I'm now on full AMD rig, but it's a bit stupid that I get worse experience on top of the line AMD than on middle of the road NVidia, just cause hairworks is locked behind hardware.
I think Outlaws does software RT, but I can see more games following Indie. Which I abhor since they want to bury the mistake that is the 1080ti.
@@PyromancerRift Do you need them to get the game to run? No. Do you want them for the best experience? Yes. If clowns like you had your way, we'd all still be playing on a 486 with no 3D graphics cards because "Ugh, it can run Doom and Command & Conquer just fine! We don't need new tech!"
The kind of advantage you have when you invest in research and development.
yup. its AMD cope. Do i like the pricing? nope but Nvidia invests a boatload in not only developing tech but they go to game dev studios to help implement said tech.
AMD needs to compete like they do in the CPU market. Intel was dominant but AMD made moves. I switched from a 8700K to a 7900X3D then sold it and got a 9800X3D
And all this crying about ray tracing is so dumb! IT'S OPTIONAL! You don't have to switch it on
@@samson7294Well no your missing the whole point of the video and here you go again with this, dlss was not meant to be a technology of injection lf fake frames to play at 60 fps in most games secondly all the technology enables is slop developers to un optimize their triple a or modern slop 😂
@@samson7294 That's missing the point. Getting tech implemented to make games run worse on the competition is the kind of stuff that gets the government involved in other industries. Heck Microsoft was forced to give the option to install browsers on setup because MS coming bundled with Internet Explorer was considered unfair competition.
@@samson7294 " IT'S OPTIONAL! You don't have to switch it on" my honest Indiana Jones and The Great Circle reaction
Watching this from the future a couple days after the beggining of CES. Your predictions were SPOT ON. Gotta say that I'm impressed.
funny enough i was thinking this exact thing with how raytracing is the current tessellation experience
16:00 Don't know why but it sounded like you said Ketchup instead of catch-up. Love your content! - And I don't like the new naming scheme. It feels like what AMD did back in the Athlon days and they had to put the Performance-Rating (PR) on their processornames to compate with Intel. It was first when they threw all that away and made an entirely new name (Ryzen) and architecture that they could establish themselves and distant them from Intel. Hope they can do the same with UDNA in the future against nVIDIA.
my RX 6800 over a year old now bought it 2nd hand in 2023 ex mining card...still going strong. Paired with my 5800X3D. I'm waiting for high performance APUs so I can downsize my rig to a MINI PC>
Buying an ex mining card has to be one of the most idiotic moves ever
@@LeNathDuNet haha well its over a year and mines still going strong so jokes on you mate. I got lucky when I bought mine.
@@Epic3032 Extremelly lucky it even works for sure, but i personally wouldn't take the bet not only because the odds of it being in a good state are horribly low, but also because crypto scum are amongst the worst people, the entire thing being extremelly tightly linked to organised crime of the worst kind
@@LeNathDuNet a mining card thats been on 24/7 and likely undervolted in a clean server rack will more than likely be better than a 2 year old gaming card on a PC hat's been booted on and off power cycles for more than 3 years...
@@UncannySense ho because you thought they treated those like professional server racks?
Let's just ignore the hundreds of thousands of totalled cards because these idiots powerwashed them with water to resell to other idiots
The first time I downloaded PhysX was when I installed ghost recon advanced warfighter 2 for PC. It was really cool for that time, which was early 2007 iirc. Explosions in a destructible house would send debris flying all over in a way that just wasn't possible without PhysX.
7900 XTX is what I have and it’s absolutely great. There is no need for anyone to give Nvidia all of that money for that bullshit they’re selling.!!!
Getting my new build with this exact GPU in a couple of weeks! Can't wait to experience AMD GPUs after 2 decades of NGridia's.
@@NicolasCharly don't forget CES is this week and they'll be revealing new cards
@@NicolasCharly Same!
I bought RX6800XT almost 3 yrs ago because 3080 was 2.5x more expensive than this GPU for which I was able to buy for less than 500$ with full waranty so... 3 years ago that was extremely cheap. Is it bad? No, everything I play runs great cannot RT enough CP2077 though but buying overpriced nvidia shit for 1 game doesn't justify it for me. Suprisingly stalker 2 runs decent but that's probably thanks to my X3D CPU. We don't have like good options right now. My budget for GPU is like 700$ max anything above since I play games rarely due to time constraints I'd rather invest such moneh in my oldtimer motorcycles collection which gains value with time.
7900xtx is unusable for me. Upscaling is required at 4k and fsr looks so bad its unusable
AMD should focus on their driver stability. I still have issues with their Adrenaline app crashing randomly every few hours and crashing both games and games crashing the app. Incredible. I'm not even one of those that just update apps, since I am comfortable with using ddu and safe mode and such.
Very unlucky for you i havent had any issues since switching from nvidia
@@Annihilation99this
adrenaline works fine for me
Had a 6950xt for a while now and had no driver issues what so ever.
Did you use dddu for a clean driver install when swapping GPUs?
7800 xt, no issues for me, since i stoped windows update and using ddu for uninstalling drivers.
Raytracing is just another Phyx, hair works ect. I believe Nvidia wanted to add something that would cripple AMD. And honestly idk how raytracing even got as far as it did. It's another useless gimmick that no game can run well with. It being use as an argument is stupid as well.
No raytracing is actually the future of rendering. You never gonna get rid of it. Its here to stay.
Next console generation with ps6 every new game is gonna be using RT from the ground up.
RT can be good but on powerful GPUs and 1440p, not 4k.
Ray tracing is an incredibly useful tool for developers. If and when ray tracing capable (and I mean fully capable, so probably not for a while) cards are wide spread, developers will essentially never have to bother with manually setting up lighting tricks the way they have to do now. It's an insane cost and time cutting feature, it's just not able to shine quite yet
It's not useless if you have the money
Sorry but proper raytraced global illumination or even better pathtracing is NOT a gimmick. It absolutely transforms the games graphics positively while also reducing workload on environment designers.
Its rather obvious that you never actually play raytraced titles and much prefer insane framerates.
DLSS is so good at quality dettings that it results in better image quality than the best AA solutions, while giving you a massive performance boost.
Back in the days we were VERY happy hitting mid /high settings at 720p in crysis.
I hope nvidia keeps pushing tech, because AMD sure as shit wont..
If you enable PathTraicing, you have to disable rasterization options that do not affect the final visual quality but that reduce FPS. They are not automatically disabled when using PT, in most games.
DLSS in quality mode or DLAA are too much for most if PT is used, with lower resolutions it looks very good and with more FPS when using PT. Pathtracing replaces many things in rasterization so they must be disabled.
It's not imposible to keep up, AMD has to focus on mid tier user like they said they would, high end users prefer nvidia low end and mid end could go with AMD if properly educated, in almost all aspects AMD offers more for less money, but on low to mid end people still pick nvidia over AMD and that is something AMD has to change if they want to keep up the pace
If Amd doesn't botch the launch I'll happily switch to the 9070xt from my 3080. FSR has been the main thing holding me back since i play at 4k and fsr looks terrible in most games, But if FSR 4 looks promising and the 9070xt is actually very very close to a 4080 for $499 I'll buy that. But Amd have a bad habit of disappointing so we'll see how it goes. I'd rather not get another nvidia card but if Amd fails I'll probably get a second hand 4080. Fingers crossed for Amd lol
" if properly educated, in almost all aspects AMD offers more for less money" - Keep coping. AMD isn't ahead in anything, even in value.
I mean, it makes no sense for low-enders to choose Nvidia, because AMD products are superior on the low end, like... Universally superior. xD
@@derbigpr500 You have no idea how wrong you are about that, lmao! xD
@@MyouKyuubi this is where AMD is not superior. brand loyalty. even 4070 at $600 outsells 7600 at $300. 7600 compete with 4060 and yet it's outsold from 4070. 4060 laptop and desktop outsell 4070. imagine how much 4060 outsells 7600. there's not even a contest. there's more contest with 4070 than 7600. Nvidia wins from brand loyalty, and most people don't even know this. it's like trying to outsell Iphone. not gonna happen. Nvidia is the Iphone of gpu, but unlike in phones where there's Samsung. there's no Samsung in gpu.
I was an Intel/nVidia fanboy from my early gaming days in the late 90's. Bought my 2nd AMD system ever back in 2023 and went with a AMD GPU as well. The Intel/nVidia prices got utterly insane at some point. AMD got better price/performance ratio if you disregard RT performance.
Reality check on 1:07
Upscaling is a misleading marketing term. It should be called rescaling since it lowers the resolution then uses AI to upscale it back to the target resolution.
I can when ever i want. I luckely have no stockholm syndrom with gaming and nVidia so. But good Video as always! Time to subscribe
I remember when AVP came out with tessellation in 2010 and being blown away at how smooth the Xenomorph heads were compared to the previous games.
01:35
Our brains are totally messed up.
0:01 ATI TruForm was the first tessellation effect before even NVIDIA came out with their version way back on Radeon 8500
Considering Nvidia’s GPUs keep becoming more expensive than previous generations or even have worse power consumption, Nvidia’s domination can’t last forever right?
this question might not age good- it's a bet at this time the way I see it
You underestimate the stupidity of consumers.
They're currently frontlining the market's current obsession with AI. The massive price jumps is just piggybacking off of their recent success in that area. So long as shareholders demand they profit further from the hype, nothing will change.
Look at Intel CPUs, they were considered unbeatable when Ryzen launched.
Worse power consumption? Maybe higher end but the 4060 is one of the most efficient gpus this generation and does wonders in the laptop and mobile space. Any competing amd gpu swallows up power like a whale. You can't beat 115 watts.
Just in the process of jumping ship. Sick of the prices now. Managed to get a 7900xtx on a great deal, so went team red. Arrives tomorrow, and hoping the switch goes smoothly
3:20 a 4070tiS is 1150 euros here...i wish it was only 800...
1150€ crazy.. where do u live? In Germany Ti Super is 830-900€
@@r0xa18 France, GPU are overpriced
@@dragonmares59110 mate, I know it's hard, but you need to NOT accept those prices, 4070 Ti should be around MAX 600€, and that's already pushing it.
@@r0xa18 In Croatia is 940e up,its just insane
@@JcsP3D Well i don't need to accept it, i just can't afford it anyway xD Never putting more than 500 euro in a gpu
I used Tessellation with ATi Radeon 9000 cards in Unreal Tournament. That was over 20 years ago. It was nice allowing you to have more imaginary polygons between vertexes, but yeah that Tessellation wasn't the same deal that was later introduced that worked more like parallax effect.
One big thing about buying AMD Graphics in Germany is the Wattage of the GPU. Yeah i paid like 100€ more for a 4070ti instead of buying a 7900XT but like this i can play my games with raytraycing and dlss + undervolted to 165-185 watts on max instead of nearly needing twice as much while paying 33 cents per KWh. So as long as u keep ur GPU for more than 2 Years, or play more than 2h daily on 300 days of the year Nvidia will be cheaper than going AMD while offering more. So its hard to not go Nvidia over here if u have this in mind.
The 7900XT undervolts so well, it can play Elden Ring fully maxed out with max RT at a locked *native* 1440P 60FPS while using 125w. Less than a 4070Ti Super ever could at those settings.
You punked yourself by not researching AMD tuning. It's wild on the 7900 cards, you can choose to be super efficient, balanced or Ballz2thewall performance.
33 cents/kWh in germany, damn. I live in Denmark our avg price in 2024 was 0,07 eur. So I saved money and bought the 7900xt.
‘Omg I’m getting old’
You are in for a shock my friend.
Don't worry mate, we are all old.
Gonna get a 5070 because I don’t have GPU, only to play games from 5-15 years ago.
I don't understand why you think the 4070 Ti S is meant for 4k with RT. Regarding Ray Reconstruction, even NVIDIA itself said that for now it will only work at an acceptable FPS with DLSS 3 (unless you have a 4090 for 1080p gaming). I hope that I will live to see the moment when turning on RT will not drop FPS by 2 times, when there will be enough RT cores for processing. And we are likely to see the 6070 series consume power at the level of the 4090.
So all cards except of overpriced one can’t do any of nvidia features? Whats the point of them then?
They can. But not in 4k. Like DLSS was created more for 4k gaming than 1080p, but use it now with the words "medium settings, upscale balance, 1440p at 30 FPS". 4k monitors are more expensive, rendering such a resolution requires almost 4 times more performance, which the graphics accelerators of that time could not do (without DLSS). Even the "jacket" says "the more you buy, the more you save". So what am I getting at? These are the times of AI, when the performance per watt has increased significantly, which cannot be said about the gaming industry.
For me, the 90 series has always been either for 4k or for VR (which is essentially the same 4k+) at maximum settings with the DLSS. In addition, 4k became popular too early, and video card manufacturers were not ready for it. RT / Ray Reconstruction / Path Tracing is still very "expensive" processing for GPU and you should not expect 60 FPS in 4k on the same 4070 Ti Super.
So Tessellation was used by Nvidia at a crazy high level, so high that is was next to impossible to notice it. Cutting the Tessellation level by half caused no change to video quality but suddenly AMD cards ran it fine. Hairworks in Witcher was another example of something like this being done.
Looking forward to your CES content👍
yes this is so correct, thanks for spitting truths Vex. Many many Games do hidden optimization (deoptimization for others) in favor of their sponsorshipped brand. Thus not only market share affected also competitive gamers are affected.
Nvidia giving consumers innovative features is pro consumer. But it is very anti-AMD. If you are upset that Nvidia innovates you probably have an irrational emotional attachment to Intel or AMD.
Pro-consumer is giving away the technology so that the consumer is not locked down.
For example PhysX was not coded to run on x86 so that CPUs couldn't run PhysX without massive performance losses. Not letting the consumer run PhysX on the CPU is an anti-consumer measure.
@@ChucksSEADnDEAD Lmao your example is awful. Nvidia purchased Ageia, who owned and created PhysX. At the time CPUs could run PhysX but not very well at all. You had to buy a dedicated Ageia PhysX card to run PhysX but they were expensive. When Nvidia bought PhysX they integrated PhysX into Geforce meaning users didnt need to buy a second GPU. it was quite a consumer friendly move. I remember it first hand, im 38 and ive been doing this since I was teenager. Even today PhysX isnt handled well on a CPU and you would want it on the GPU. Of course money would need to be spent to get it to work on Radeon cards. Why should Nvidia spend that money and help its competition?
Also, If you look Nvidia spends a far higher percentage of its profits on R&D than AMD does. Why should Nvidia give the competition the fruits of its labour? This has nothing to do with the consumer and the consumer doesnt benefit if AMD get their hands on Nvidias technology. You appear to be mistaking healthy competition for anti-consumerism.
AMD make billions in profits every year. They are not short of cash to compete with Nvidia. They merely choose not to spend it. If they were broke and getting crushed by a massive multi national then id understand but that is not the case. AMD are just keeping the money.
Borderlands 2 was THE PhysX Game back in 2012. Hella unoptimized, but those dynamic cloth sheets that could tear was some awesome tech to me and the fluid effects of stuff like toxic goo looked really fitting to the Borderlands art style.
My RTX 2060 is dying, am hanging on until next gen launches, and planning on going AMD.
Literally me can barely run red dead with smooth frames at 60 😭 even with DLSS
Have you considered ARC? B580 looks really promising for a mid-cost gpu
I went AMD and haven't looked back.
@@mradsi8923 seems it really likes 1440p though and gets beat by the 4060 on 1080p.........
That is a thing to think about as many budget gamers still game on 1080p, or use a 1440 card for multiple years and game 1080p in the last few of those years.
@@Pazo139 same, gotta buy the competition if you want them to stay relevant.
today we would like to introduce drum roll please... THE BRAND NEW RTX 6090 TI SUPER WITH 12GB VRAM GDDR8X HALF THE CUDA CORES HALF THE SM'S SHITTIER OVERALL 20% BETTTER DLSS, UPSCALING AND RAY TRACING FOR THE SMALL PRICE OF $5500 USD AND YOUR GRANNY, remember the more you buy THE MORE YOU SAVE!!! (features sold separately)
Nvidia tried to do the same thing with RT that they did with CUDA, be the first out of the gate and do it better than AMD so devs would feel forced to use their shit and no one else's. Thing is this time AMD is in the consoles and they are a huge portion of the market so ngreedia couldnt cut them out of the market. It is making RT slower to adopt but you can really see the devs that are in nvidia's pocket (CDPR-etarded).
RT hardware can't be copywritten but ya know what can? DLSS so that was their plan this whole time. Try and force studios to put RT in their games and then ngreedia sells you "secret sauce" to fix its performance hit... especially with path tracing.
The only way to fix this is for people to get off of ngreedia's dick and at the same time AMD has to stop fucking up each and every GPU launch.
FSR4 is coming out when DLSS is going to get its next big update so AMD's software is still 2 years behind on the GPU front. They need to price their cards accordingly yet they never fucking do.
Wukong being heavily nvidia biased made me stunned by the overfocus of the online community on the culture war stuff
They robbed you of your gpu's extra performance
A pc 6500 XT is 30% faster than my laptop's RTX 2050. Yet both has similar performance according to HU's charts and my laptop's benchmark
5:15 did I hear that correctly ?
Yes sir
I am building my first PC right now from scratch, on my own. I haven't built one in 15 years or so and never without help. It is absolutely wild what I have come back to. I have a MOBO and a CPU (670X & 7950 X3D) and now I am waiting for the new cards. I wanted to do Nvidia since I have never had one before but I do not like the way they're doing things. I'm thinking of grabbing an 7900 XTX or something newer, if AMD makes anything worth checking out around that performace and a lower price than the green team. Otherwise I am "doomed" to grab a 4080/5080.
Anyway, I have been checking out your videos here and there along with GamersNexus (
The 7900xtx may actually be the most powerfull gpu from AMD in the next 3 years since amd said they wouldnt try to make a high end gpu this year/gen. Just to inform you que you may need to brace yourself for disapointment
You dont buy by brands morale, you buy a brand becsue they provide with the product you need or want, this is not a multi year relation, this is a one time deal wish you make a one day transition, you get the good they get their reward. Period.
@@gsst6389 That is okay if you think that. I disagree and will spend my money how I want.
I don't care how good Nvidia will get, i'll never buy from them, until they stop their scumbaggery.
Nvidia have done this kind of thing for the last 20+ years. Research new tech, make it proprietary, pay off game devs to optimise for their cards and put the new tech in their games, now gamers have to buy their cards because it runs better on Nvidia and it has exclusive features, since they partnered with the devs.
Don't care for this mentality. In the mid range (where my budget is), AMD is still king in price to performance anyway.
Well, personally, i wont buy Nvidia again, because they treat their own customers like trash... I'm one of the suckers that fell for the RTX 2080 TI scam...
First they burned me, by making my Sick SLI-setup invalid, by discontinuing SLI-support, i could only run my games on ONE card, which was a huge performance hit... But i lived with it for many years, until i decided to upgrade to that 2080, and then although my performance improved, i still had crazy stuttering issues... So that was the second time i got burned... And then 1 month later, the 3000 series released, making me feel like i wasted my money on the 2080 TI.
I'm not buying from Nvidia again. :)
I'm buying AMD, and, if AMD falls off a cliff, i'd rather buy Intel GPU's, than Nvidia... And if Intel AND amd falls off a cliff somehow, i'd rather buy and play IRL board-games than buy an Nvidia product, or heck, even give those obscure chinese knock-offs as try! :P
Why would you spend money on R & D and not make it proprietary? It's not smart to supplement your competitors.
@@Codyslx From a business standpoint, sure. But if they wanted, they could at least make it so that at least previous gen cards can use some of the features of the new ones, but they lock that away too. They are anti consumer even with their own customers. Not to mention the outrageous prices, or how they sell you a worse card for more money year after year(3060 vs 4060for example). And if AMD can give all gamers FSR and frame gen for free, so could Nvidia. Most of their money is from the AI, crypto and big tech side, not gamers, but they nickel and dime everyone. They still sell 8gig cards for more money than AMD does their faster 12gig cards.
@@misterlobsterman How will previous gens run software that require specifc hardware to work? There is a reason why amd is moving to hardware level machine based upscaling, that their other gpus most definitely won't have access to.
The 4060 is also better than the 3060 even removing dlss3 and dlssfg all while only using 115 watts max.
@@Codyslx Hmm, i think that means Nvidia isn't confident they could stay on top, performance-wise, and was afraid a competitor would pull it off better than they could.
If you make something propriatery, you are only showing you fear what your competitors might accomplish with it.
It's cowardice, is what it is...
If i was confident i could remain on top, i would share the tech, and still remain on top, everybody wins! Heck, my competitors might even invent something new off of what i invented, which i can proceed to innovate on, in turn... perhaps we'll accidentally discover something that can reduce production cost by 50%, allowing us to sell as the same price, but still increasing profits.... etc.
So actually yes, it is smart to "supplement" your competitors... Only someone who isn't smart enough to see the benefits, would think it isn't smart. :P
Title doesn't make sense.....it should say "doesn't" not "don't"
Cuda and other tech like tensor cores is huge. Cuda acceleration is huge for both gaming and also productivity. Funnily enough, a lot of what nvidia has implemented as propriotary stuff, is stuff that also was introduced in the PS2 tech demo, showing the new possibilities the PS2 had, like better simulations.
i don't get your point. Are you frustrated that Nvidia keeps introducing new features for their hardware? because it makes developers lazy?
it's like saying cars are bad, It makes people not want to walk!
Also this thing about DLSS and AI features that nvidia has. They're always saying "DLSS BAD" "Ray tracing is a gimmick" "DLSS is cheating and fsr looks the same", istg it's like they prefer a giant bulky steam engine instead of a modern v8
@@M4x505. FSR looks objectively worse, lmao... And i'm using a 6800 XT for my main system... I'd rather use proper anti-aliasing if i can, FSR looks awful... DLSS does look better, and would be the preferred AA-tech, if developers weren't abusing it as an excuse to avoid doing the optimization work they're supposed to do... : /
Ray tracing IS a gimmick though, is only looks SLIGHTLY better, but it costs 90% of your frames, as well as 90% of your bank-savings... It is so not worth it, AT ALL! 🤣
That's why i went with AMD, because AMD does the basic raster stuff, really, REALLY WELL, so i get MORE frames than Nvidia products get with all the RTX stuff turned off... That's why AMD is better, imho... AND is also cheaper, literally win-win.
I only get sad when games, like Hogwarts Legacy, doesn't have normal anti-aliasing techniques, at all, and ONLY has TAA, and then FSR and DLSS, those are the only choices you have on Hogwarts legacy, is so silly. :P
@MyouKyuubi I'm an ai researcher. AMD is waaaaaaaay behind in the business side. developers don't even optimize around their hardware anymore. i hope they step it up
@@AmrAbdeen Oh you mean market share? Sure, i guess...
However, Whenever you buy a console or a laptop, what CPU does it have? AMD, what is the best gaming CPU on the market right now? AMD... What's the cheapest gaming hardware? AMD!
Only reason Nvidia, and Intel have the most market share, is because they're partnered up with Microsoft and local PC assemblers to promote and sell their stuff, before anything else... Trying to create a closed system where they can each have a monopoly on whatever they're each selling... Like an ouroboros of parasites.
The fact that AMD ISN'T engaging in monopolistic practices like that, makes me respect them more. :)
After all, you AI-researchers will soon be out of a job, once everyone realizes AI is just the latest buzzword and snake-oil, to get people to buy ewaste.
I haven't heard of a single individual bragging about how much they love AI, everyone, even the target demographic for this AI nonsense, absolutely f***ing HATES AI, lmao, to the point where they're literally sick of hearing the word "AI"! xD
@MyouKyuubi why so defensive? out of job.. buy ewaste? That's the most incoherent take i have seen so far on research. you know that mostly everything tech you use day to day depends on machine learning one way or another, right?
I genuinely do not understand the loyalty to Nvidia..
I've been using AMD GPUs for a long time, and they've been legitimately customer friendly with their practices since way back.
The drivers are more user friendly, and they require less overhead. I'm not sure about the newest revision of Nvidia's driver software, but AMD's has been more feature packed for years. 🤷
*It really has to be because Nvidia makes the top tier card. If AMD took the crown, we'd see a massive shift just like with CPUs.*
I truly believe that, but I don't think it's going to happen any time soon, since the rumors are AMD plans to only target 4080 or 5080 levels for their top card (whichever it was).
Nvidia copers fill the comments section with their budget 30s that can't even use DLSS 3 and still they cope. This is the reason why we can't never leave Nvidia is because of idiotic consumerism.
Yep, idiotic consumerism is the problem here, not Nvidia.
Although Nvidia has done a lot of shady s**t, this, in particular, is a consumerism issue, nothing else. : /
I went with AMD, i got burned by Nvidia's actual wrong-doings three times, and i called it there... First they discontinued SLI-support, rendering my fresh PC setup invalid, but i gritted my teeth and stuck with it for many years... Then i bought the 2080 Ti, the long-awaited upgrade, which stuttered like crazy and was barely an upgrade at all, and then 1 month later 3000 series comes out, making me realize i got scammed, twice, by Nvidia.
Never buying Nvidia again after that, lmao, i'd sooner buy Intel, or even an abscure chinese knock-off before i ever consider buying scamvidia products again! 🤣
Just enough people buying everything what NVIDIA throws on the table. Just waiting for real vendor lock in by NVIDIA, where games can only run NVIDIA cards.
physx was primarily about fluid and particle simulaiton and was a big deal for about 10 years, it was tech developed by a company named ageia and is now open sourceware
Yes and my brother used a software and with it he played bordeland 2 with an R9 390 with physx on and it was playable
The fact that AMD and Intel are not catching up in the GPU field hurts us consumers because of the lack of competition.
Competition drives innovation and drives prices down, both being good for us consumers.
The fact he's never seen PhysX in a game, blows my mind. So he basically never played any game that came out in the golden age of computer games (2004-2018). Modern gaming is is a trap, it's 90% fiddling with settings and it never looks good. Older Games even look a lot better imo. Nothing beats clear, sharp graphics on a 2k Monitor. Modern games are like: "Okay it's sharp, but not if you stand so far away, we can't handle good Textures anymore. Also have fun with transparencys, depth of field with DLSS together, it's a mess, good luck...".
He's a clueless AMD fanboy, what do you expect.
99% of the people that praise AMD and absolutely hate Nvidia will buy an nvidia card.
@@M4x505. They would if they could afford to. But their parents won't give them the extra allowance, hence the fact they're so mad about it online.
@@derbigpr500 even then, I would rather get lower tier Nvidia card than an AMD one. Never again, drivers suck (and yes I did ALL the ddu and everything). Plus, if you want to do anything productive with your system there's no choice but green team.
Yeah, i ran the original Half Life game a while ago, and i noticed there's no anti-aliasing settings in in the menu... So i decided to inspect the anti-aliasing the game just runs, natively... and it's fking amazing, literally flawless anti-aliasing, it looks spectacular
I actually just really appreciated the history lesson for the early stuff that was before I got into PC gaming and building. The whole video was good though.
this is monopolistic, we should really class action nvidia and make them open source cuda, and allow vram to be upgraded, the only reason for any of it is to protect a monopoly thats sad its millions wasnt enough
Why would the government make CUDA open source.. NVIDIA developed hardware and software for it. Its their intellectual property. Lmao. Or are we becoming socio-communist now??
Why would I spend a shit load of money on research and development and partnerships and then make all my competition use it?
@@youssefmohammed5456 exactly.
@@youssefmohammed5456 intel was made to do the same by your parents or grandparents because they understood monopoly is bad
AMD should never have played catch up for these stupid tracing features. If I was them, I would have collaborated with developers to maximize performance without the need for an upscaler. Also, I want to criticize the devs on the same topic: I think they wasted time on implementing ray tracing and path tracing, plus this need for realism is silly when the game's art style was always more important. I don't want a game that's 100 GB, needs the latest hardware, latest features, looks like real life. For realism and path tracing, I just go outside (and it doesn't even cost anything).
Just give us cheap graphics cards AMD.
They won't, you only have Intel for that.
If they are in stock@@TopShot501st
@@TopShot501st Median market price for Battlemage-580 is 380 Bucks/Euros, if you can find it.
And you also need HighEnd CPU (5800x3D and upwards) to run Alchemist/Battlemage (Intel GPU overhead).
How is Intel the cheap option?
wdym? AMD has plenty of viable cheap cards for you to peruse. :O
@@MyouKyuubi Where is the RX 580 successor?
The old benchmark of 60 fps, dlss off, ultra quality, 4k res was such a good time for progress. Now we got this ray tracing/path tracing garbage that requires dlss to hit 60 at medium/high instead of ultra at 1080p. The whole ray tracing tech has caused this backwards trend and until people start giving backlash on this change in testing methods, its only gonna get worse
Radeon better
🎯
If you bought that stock back in 2016, you’re loving life right about now.
AMD performs much better than Nvidia when there is a CPU limitation. Nearly all the stupid graphics card comparison tests are done with the fastest CPU available, which hides this fact.
Interesting
that's true!
I think nvidia needs to be broken up for being a monopoly.
Unrelated but have you seen the B580 overhead issue testing done by HCN and HBU?
Much respect for the ROR2 music
sounds to me like AMD and Intel need to start innovating on their own and not follow what nvidia does all the time.