How Nvidia is Suffocating AMD (and won't let us leave)

  • Published 10 Feb 2025

Comments • 1.6K

  • @Leelareso
    @Leelareso several months ago +890

    The problem isn't the graphics cards, it's the developers who have stopped optimizing games and are patching everything over with DLSS and frame gen. Even the game requirements state that to run a game you need X card with DLSS enabled to play at 30 fps 1080p.
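
A rough sketch of what "needs DLSS to hit 30 fps at 1080p" means in practice: the game is shaded at a fraction of the output resolution and then upscaled. The scale factors below are the commonly reported per-axis values for DLSS's quality modes, not an official Nvidia API:

```python
# Commonly reported DLSS per-axis render-scale factors
# (community figures, not an official Nvidia API).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width, height, mode):
    """Resolution the GPU actually shades at before upscaling."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# "1080p with DLSS Performance" is really a 540p render:
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)
```

So a minimum-spec line like "4060 with DLSS at 1080p" is really a statement about sub-1080p rendering performance.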

    • @smallbutdeadly931
      @smallbutdeadly931 several months ago +149

      100% agree. We're going down a road where game devs are praying that everyone buys flagship GPUs so that they don't have to spend time optimizing their blurry TAA-ridden games.

    • @thewingedringer
      @thewingedringer several months ago +16

      And who do you think pulls the strings? It is not the game publishers but Nvidia and AMD.

    • @dante19890
      @dante19890 several months ago +11

      It's cuz developers are making games with upscaling in mind, and it comes from the consoles.

    • @PrayToChrist
      @PrayToChrist several months ago

      It's insane. Yes, I'm stupid for having a 4060, but the new Monster Hunter title's minimum specs require a 4060? Like, what? These games are poorly optimized af. Will be upgrading with the 50 series.

    • @Leelareso
      @Leelareso several months ago +34

      @@thewingedringer Put your aluminium foil on your head. What do AMD or Nvidia have to do with game optimisation?

  • @getawaydance
    @getawaydance several months ago +345

    Old McJensen had a farm AI-AI-ooo

    • @jahithber1430
      @jahithber1430 several months ago +26

      lol wtf

    • @zf8496
      @zf8496 several months ago +20

      LOL²

    • @icarus3874
      @icarus3874 several months ago +7

      I love this haha

    • @rem_0
      @rem_0 several months ago +10

      This is stupidly funny 😂

    • @tezy0193
      @tezy0193 several months ago +4

      haha

  • @akosv96
    @akosv96 several months ago +675

    Blink twice if Jensen is holding you hostage 😥

    • @wikwayer
      @wikwayer several months ago +22

      😐
      😌

    • @GreyDeathVaccine
      @GreyDeathVaccine several months ago +7

      LMAO

    • @TopDplays
      @TopDplays several months ago +7

      It’s okay i have an AMD cpu 😢

    • @ridleyroid9060
      @ridleyroid9060 several months ago +5

      😖😖

    • @juliashenandoah3965
      @juliashenandoah3965 several months ago +10

      Blink three times if you have Nvidia Stockholm syndrome and absolutely LOVE being a hostage of Nvidia and enjoy all the awesome software progress from tessellation to ray tracing and a dozen other goodies! ;D

  • @mRibbons
    @mRibbons several months ago +398

    Don't forget G-Sync, the proprietary hardware-based variable-refresh anti-tearing solution, requiring an Nvidia GPU and an expensive G-Sync monitor. AMD solved that one tho!

    • @HanSolo__
      @HanSolo__ several months ago +8

      Yup.

    • @sexyassstephen
      @sexyassstephen several months ago

      Nvidia have me hostage, as my monitor was £900 when I bought it and only has a G-Sync module in it, so I always have to upgrade to Nvidia, or go AMD and buy a monitor too.

    • @espertalhao041
      @espertalhao041 several months ago +65

      Not only did AMD fix it, but VESA added standards for it too.
      And now Nvidia GPUs can use FreeSync (the AMD version of it) as well, which they couldn't before, and it was a huge mess.

    • @arenzricodexd4409
      @arenzricodexd4409 several months ago

      The tech to make VRR has always been there, but no one really thought of using it on PC, especially for games, until Nvidia came up with it. And when Nvidia did it, they didn't do a half-assed solution that might or might not work. Early FreeSync was a mess, despite AMD having a year and a half to study what Nvidia had done with G-Sync. AnandTech had a good article about this back when AMD finally realized there is no such thing as "free" if you want to come up with a proper product with a good experience.
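
The core VRR behaviour the replies above are debating can be sketched in a few lines (a deliberate simplification, not any vendor's actual firmware): the display only syncs within its supported refresh window, and below it each frame has to be repeated, the Low Framerate Compensation trick that narrow-range early FreeSync monitors struggled with:

```python
# Simplified VRR + Low Framerate Compensation (LFC) logic.
# Hypothetical illustration only, not any vendor's real firmware.
def vrr_refresh(fps, vrr_min=48, vrr_max=144):
    """Refresh rate the panel would run at for a given frame rate."""
    if fps > vrr_max:
        return vrr_max          # out of range high: cap (or tear)
    if fps >= vrr_min:
        return fps              # in range: one refresh per frame
    # LFC: display each frame n times so the effective rate is in range.
    n = 2
    while fps * n < vrr_min:
        n += 1
    return fps * n

print(vrr_refresh(60))  # 60 (inside the window)
print(vrr_refresh(30))  # 60 (each frame displayed twice)
```

Note the sketch assumes vrr_max is at least twice vrr_min; panels with narrower windows can't do LFC at all, which is one reason early FreeSync experiences varied so much.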

    • @dante19890
      @dante19890 several months ago +14

      You don't even need AMD, HDMI 2.1 solved it. Nvidia was just first, just like with everything.

  • @Rastloese
    @Rastloese several months ago +141

    "Neural rendering" is just the newest marketspeak. People are skeptical of the term AI because it is constantly misused in marketing, so here is a new one.

    • @VintageCR
      @VintageCR several months ago +6

      True, and all neural rendering does is make the edges really smooth and pop certain textures to make them look EVEN MORE realistic... feels like a rebranded tessellation.

    • @Speejays2
      @Speejays2 several months ago +4

      Are you planning to continue to call this feature marketspeak when AMD announces their own version of it in 12 months, or will it stop being marketspeak all of a sudden?

    • @legendp2011
      @legendp2011 several months ago +1

      While it's true many companies have abused the word AI, when it comes to a GPU I can believe it being the actual terminology (similar to how DLSS actually IS using AI-trained models).

    • @marciusnhasty
      @marciusnhasty several months ago +1

      ​@VintageCR In a sense, you're spot on. From what I've seen, it's pretty much to tessellation + HairWorks/PhysX what DLSS is to native resolution: not as accurate, but faster. It's what Satya Nadella calls an agent in comparison to large models: smaller, narrower, more specific use cases. (Just pointing out Nvidia isn't even inventing any of this, just first to leverage its hardware to implement this stuff.)

    • @nickhtk6285
      @nickhtk6285 several months ago

      You nailed it.

  • @lukepeterson110
    @lukepeterson110 several months ago +396

    Let's spend 300% more money for a 10% improvement, that makes sense!
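
The complaint above is really one about performance per dollar, which is easy to put in numbers (the prices and scores below are invented for illustration, not benchmarks):

```python
# Toy perf-per-dollar comparison; numbers are made up for illustration.
def perf_per_dollar(perf_score, price_usd):
    return perf_score / price_usd

baseline = perf_per_dollar(100, 500)   # hypothetical current card
upgrade = perf_per_dollar(110, 2000)   # +10% perf for +300% price

# The baseline delivers ~3.6x the performance per dollar:
print(round(baseline / upgrade, 1))  # 3.6
```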

    • @akosv96
      @akosv96 several months ago +68

      Most viewers: 😡 omg this is unacceptable
      General public, regardless of all tech youtuber guidance: 🤩😍
      (rinse and repeat for each generation 😭)

    • @M1szS
      @M1szS several months ago +32

      @@akosv96 That is exactly why people are still buying cards like the 4060

    • @duxnihilo
      @duxnihilo several months ago +17

      @@akosv96 To be fair, tech youtuber guidance is pretty useless to most consumers.

    • @Ricardolivechannel
      @Ricardolivechannel several months ago

      @@duxnihilo That’s why we watch Vex, because other people are lying to us to make money

    • @D2Mephisto
      @D2Mephisto several months ago +8

      Well, you gotta remember: the more you buy, the more you save ;)

  • @JcsP3D
    @JcsP3D several months ago +62

    CES Nvidia / AMD keynote script leak:
    AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, GAMING, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI, AI...
    Yeaaaaa, you know what's coming.

    • @metabang03
      @metabang03 several months ago +1

      Nvidia is not suffocating AMD. They're just proving that image quality and beautiful graphics are NOT among their priorities. AMD's images are sharper, and its colors are accurate and bolder.

  • @JettzCG
    @JettzCG several months ago +428

    RTX has just been a 6/7-year-long beta test as far as gaming is concerned

    • @fanchiuho1
      @fanchiuho1 several months ago +59

      It's damn successful not as a beta test, but as a marketing effort, because in the process they've conveniently drilled it into our minds to call it 'RTX' instead of ray tracing, as evidenced by this comment.
      In MGSV terms, Nvidia earned tyranny status and made their sub-brands the 'lingua franca' of cutting-edge graphics.

    • @NeovanGoth
      @NeovanGoth several months ago +9

      Pretty nice for a beta test.

    • @gerarderloper
      @gerarderloper several months ago +7

      Yeah, and I doubt the 50 series will change that much. The 4090 was a nice attempt, but really we would need DOUBLE its RT/PT performance JUST to be able to enable it at 4K without terrible upscaling and framegen solutions (they blur/smear the image a lot in many cases)

    • @michaelblue4619
      @michaelblue4619 several months ago +4

      @@gerarderloper It is not "terrible" in any way; in fact, it looks really good for the level of extra performance it gives, and it makes something unplayable actually smooth. Without framegen you simply wouldn't be able to afford a GPU that puts out that level of performance. Upscaling is the future. The technology needed to deliver double a 4090's raw RT/PT performance in a consumer-size GPU without upscaling simply doesn't exist. Even if it did exist, it would cost a ludicrous amount, to the point that developers wouldn't bother implementing that level of graphics if only people with a $10k GPU could run that setting, which would be a very small segment of the market.
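
The framegen trade-off being argued here is easy to quantify (a hypothetical model, not measured data): generated frames raise the displayed frame rate, but input is only sampled on rendered frames, so responsiveness tracks the base rate:

```python
# Hypothetical framegen model: displayed fps vs. the cadence at which
# the game actually samples input (illustration, not measured data).
def framegen(rendered_fps, generated_per_rendered=1):
    displayed_fps = rendered_fps * (1 + generated_per_rendered)
    input_interval_ms = 1000 / rendered_fps  # cadence of real frames
    return displayed_fps, round(input_interval_ms, 1)

# 2x frame generation: 45 rendered fps shown as "90 fps", while
# input is still only sampled every ~22 ms:
print(framegen(45, 1))  # (90, 22.2)
```

This is why both sides can be right: the motion really is smoother, and the game really does respond like a 45 fps game.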

    • @mitsuhh
      @mitsuhh several months ago +5

      RTX is branding, not a feature!

  • @MrFerisko
    @MrFerisko several months ago +334

    PhysX was originally physics acceleration on separate add-on cards made by a company named Ageia. Nvidia bought Ageia and integrated the acceleration into their GPUs. And then proceeded to be an evil corp as per usual.

    • @alwaysrise
      @alwaysrise several months ago +41

      ^This. Imagine if PhysX was never bought out, stayed a separate add-in card that didn't hurt fps, and was available to more game developers without exclusivity rights from Nvidia. Could have been a game changer.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat several months ago +25

      "Evil corp" lol, tell me one for-profit public corp that ain't evil lol

    • @interrobangings
      @interrobangings several months ago +1

      ​@@DragonOfTheMortalKombat Ew, not this argument. You obviously pick the less evil company instead of just going "HURR DURR NO ETHICAL CONSUMPTION UNDER CAPITALISM" and buying from the people who eat babies instead of the ones who just punch them

    • @ElderGamerX
      @ElderGamerX several months ago +11

      ​@@DragonOfTheMortalKombat It's graded on a curve.

    • @sylvainh2o
      @sylvainh2o several months ago +2

      Wasn't it the thing where you could buy an extra component to put in your computer on top of your GPU that would handle just PhysX for more performance?

  • @diablosv36
    @diablosv36 several months ago +169

    Actually, ATI introduced tessellation to Radeon before Nvidia did; it's just that Nvidia implemented it in a way that was more performant. PhysX was special hardware that was eventually bought by Nvidia; then they killed the hardware, vendor-locked the acceleration to their GPUs, and eventually made it all CPU-based. HairWorks was just a crappy, unoptimized version of hair simulation, which was done much better by other tech. DLSS and all that jazz is suffocating PC gaming simply because it's being pushed on devs to implement, which is a problem for PC gaming as an open platform in the long term. Pushing APIs that only work on one vendor's hardware is really bad, and harks back to the 3dfx days, when the Glide API ruled the roost and made it super difficult for competing GPUs to compete on an even playing field.

    • @arenzricodexd4409
      @arenzricodexd4409 several months ago +1

      PhysX has had both a CPU and a hardware-accelerated part since the very beginning, but Ageia wanted to sell those PPUs, so the highlight was mostly on the accelerated part. The initial CPU PhysX was also accused of purposely using things like x87 instructions to make it slow on the CPU. Nvidia overhauled PhysX development with PhysX 3, making the CPU part more performant and able to compete head to head with Havok. This is when some big game engines started ditching Havok in favor of the much cheaper PhysX in terms of licensing price. Nvidia eventually made CPU PhysX open source in 2015.

    • @diablosv36
      @diablosv36 several months ago +2

      @@arenzricodexd4409 Yeah, they did open-source it, when it became useless to do the vendor-locking bullshit. Also, it didn't end up helping them capture more GPU market anyway.

    • @cvxiscvc
      @cvxiscvc several months ago

      @@diablosv36 After all, it's Nvidia. They will try to milk as much money as they can from the consumer.

    • @transformerstuff7029
      @transformerstuff7029 several months ago +2

      Dude, the fake-fps trend right now is making me consider console gaming -.-

    • @albal156
      @albal156 several months ago

      Exactly.

  • @interrobangings
    @interrobangings several months ago +298

    This is why I haven't bought an Nvidia product since the 10 series; they're cartoonishly anti-consumer with their behaviour and pricing

    • @dante19890
      @dante1989 several months ago +51

      Every company is anti-consumer. If AMD were at Nvidia's level they would do the exact same thing.
      It's scary people actually think companies have morality and are on the consumers' side.

    • @interrobangings
      @interrobangings several months ago +99

      @dante19890 Sorry, where did my post say that AMD is inherently pro-consumer?
      It's scary how you're either illiterate or jumping to conclusions

    • @angeltzepesh1
      @angeltzepesh1 several months ago +15

      ​@@dante19890 I fully believe that AMD would be just as bad if on top, but that doesn't mean we can't expect better from these brands. There are still brands out there that don't rip people off like Nvidia or Apple do, even if they are on top.

    • @eikbolha5883
      @eikbolha5883 several months ago +1

      ​@@interrobangings AMD is not a good company either, cuz they fuck up non-stop, but at least they are not as greedy as Nvidia. I have three generations of Nvidia GPUs and I'm not planning to buy another, and if you ask why: I no longer wanna pay a premium when it doesn't give me what's been promised.

    • @Alfi-x6b
      @Alfi-x6b several months ago +3

      Cuz they are innovating?

  • @eduardo3331
    @eduardo3331 several months ago +136

    I couldn't care less about ray tracing and all this high-end stuff. I only want stable, high fps

    • @jamesp877
      @jamesp877 several months ago +10

      More developers are putting ray tracing into the foundation of their games. It will soon be inescapable.

    • @GoldenEDM_2018
      @GoldenEDM_2018 several months ago +4

      My cheap-ass 4050 runs every game I play at a constant 60 fps.

    • @gerarderloper
      @gerarderloper several months ago +1

      Sadly, games are coming with it as the default, with no way to turn it off.

    • @foxx8x
      @foxx8x several months ago +1

      Ask game devs and publishers for that

    • @nahual7x62
      @nahual7x62 several months ago +4

      @@gerarderloper mods

  • @michahojwa8132
    @michahojwa8132 several months ago +87

    AMD doesn't need to keep up; they can do a million things. They have three different techs for path tracing ready, and FSR is co-developed with Sony and Samsung. AMD needs to start playing their own game; maybe chasing Nvidia is just cheaper.

    • @deathtoinfidelsdeusvult2184
      @deathtoinfidelsdeusvult2184 several months ago +3

      They should try making their own features. They are just copying what Nvidia does. Even on the PS5 Pro, they think PSSR can't even compete with DLSS.

    • @dante19890
      @dante19890 several months ago +19

      What are you talking about? They are behind in every metric.

    • @GreyDeathVaccine
      @GreyDeathVaccine several months ago +9

      "They have got 3 different tech for path tracing ready" - source?

    • @michahojwa8132
      @michahojwa8132 several months ago +13

      @GreyDeathVaccine GI-1.0, brixels, and path tracing optimizations. AMD's GPUOpen has the papers and videos. AMD has options; it's just complacent.

    • @dante19890
      @dante19890 several months ago +12

      @@michahojwa8132 So you are saying they somehow have the tech but are too complacent to develop it? Huh??? You are not making any logical sense. Sounds like major cope.

  • @ThreatInteractive
    @ThreatInteractive several months ago +13

    Thank you for the shout-out!
    Very glad to see you bringing this topic closer to the surface!

  • @uniqueprogressive9908
    @uniqueprogressive9908 several months ago +57

    I love how he pauses for a brief moment on the rabbit liquid scene

    • @Igor369
      @Igor369 several months ago +1

      He had serious Zootopia fandom flashbacks

  • @j-cuts9396
    @j-cuts9396 several months ago +26

    Fun fact: The Witcher devs coded HairWorks wrong in the original copy of the game, the one before they released the next-gen patch. It used to run on the CPU and not the GPU; that is why it was a big performance hit

    • @Trashloot
      @Trashloot several months ago +1

      I didn't know that. That's actually funny :D.

  • @LoneWolf-tk9em
    @LoneWolf-tk9em several months ago +101

    The irony is that all these Nvidia features require more VRAM 😄
    & Nvidia is too greedy to give VRAM

    • @RiderOfKarma
      @RiderOfKarma several months ago +10

      That’s their sales model; for Nvidia it's not a bug but a bottom-line feature. Copying Apple.

    • @mkeyx82
      @mkeyx82 several months ago +12

      I think they are just afraid of building cards that are going to last for too long, like 4-8 years.

    • @q1337
      @q1337 several months ago +4

      They can afford to, but the strategy, even if you buy higher end, is to make you feel like the highest end is better. You could slap 16GB or more of VRAM on every card in the series without a major price difference.

    • @arenzricodexd4409
      @arenzricodexd4409 several months ago

      That's Nvidia being smart about it.

    • @LoneWolf-tk9em
      @LoneWolf-tk9em several months ago +2

      It's too much greed, coz 8GB of VRAM costs around $40-$50 for Nvidia to manufacture

  • @rgstroud
    @rgstroud several months ago +35

    Even with a 4090, you need DLSS Balanced with frame gen to do path tracing, and DLSS only looks good at Quality for 4K to me. Nvidia does this so you want to buy the next-gen 5090, hoping it will finally do path tracing with minimal DLSS. If it does, then Nvidia will introduce the next tech into games weeks later, so the 5090 will not be good enough again.

    • @slumpedmage
      @slumpedmage several months ago

      They likely don't want anybody turning down the DLSS... That's likely their substitute for any real upgrades from here on out. If DLSS is not needed, it becomes one less marketing point for them.

    • @transformerstuff7029
      @transformerstuff7029 several months ago +1

      A 4090 runs Marvel Rivals at 90 fps in 4K... that is all kinds of sad.
      We need better optimization in games.

    • @Robsham1
      @Robsham1 several months ago

      Wait, so they keep introducing better and better tech as hardware becomes capable of handling it? Oh the horror.

    • @slumpedmage
      @slumpedmage several months ago

      @@Robsham1 I wish it were that simple.

    • @APunishedManNamed2
      @APunishedManNamed2 several months ago

      @@transformerstuff7029 90 fps where 1/4th of the frames lack any user input isn't actually 90 fps lol

  • @WujoFefer
    @WujoFefer several months ago +99

    Soon RayPathSuperTracing will be available for just $4k at 480p... and that will be the midrange GPU, because the top range will start from $20k...

    • @MidnightDrake
      @MidnightDrake several months ago +6

      Damn, choom...

    • @ahmedzid1119
      @ahmedzid1119 several months ago +4

      RPST, it sounds like PTSD lmao 🤣

    • @ELpeaceonearth
      @ELpeaceonearth several months ago

      ​@MidnightDrake the translation is wild lol

  • @Martial-Mat
    @Martial-Mat several months ago +77

    "We've got ray tracing and sky reflections all turned on. Now let me go to this skyless, dark pit to demonstrate absolutely none of it."

    • @WayStedYou
      @WayStedYou several months ago +52

      And yet even when none of it is shown, it still more than halved his performance

    • @mkeyx82
      @mkeyx82 several months ago +3

      Sky reflections require 5k cards.

    • @dante19890
      @dante19890 several months ago +7

      That's why we have ray-traced global illumination, so the dark scenes actually look fantastic, reacting realistically to ambient light in the scene. People who think RT is just reflections are clueless.

    • @potatonite2340
      @potatonite2340 several months ago +3

      ​@@WayStedYou good point

    • @d-ima8313
      @d-ima8313 several months ago +1

      @@WayStedYou Yeah, because you still need to calculate the rays... So it doesn't matter; when you have the sky, fps will be the same, because RT doesn't care about complexity.
      Also, he's using a not-top card at 4K in one of the most demanding games; what was he expecting? Why not path tracing at 8K and 240 fps on a 4050M? Just switch to QHD or turn on DLSS and it will be playable. Change ray tracing from ultra to mid or low; it will still be better than not having RT at all, and fps will be more than comfortable.

  • @zdspider6778
    @zdspider6778 several months ago +28

    2:38 "Whether or not the developers could have done better on the left side is questionable."
    No, it's not questionable at all! They could have added some normal mapping to make it less than a 1% difference between them! 😠 That comparison is typical Nvidia BS, full of lies. They are liars and shouldn't be trusted. That game came out 2 months after the GTX 970 was released, which had 3.5GB of VRAM! Technically 4GB, but the last 0.5GB was SO SLOW that if a game hit that region it brought the card to its knees. Gimpvidia doing gimpvidia things, gimping them. Now they vomit out these PCIe x8 + 128-bit cards for $400 (and wanted $500 for another 8GB of VRAM). 😠😠

  • @dragonhero14
    @dragonhero14 several months ago +29

    Honestly, if there is no change in the market, I expect that AMD will stop producing consumer cards. It doesn't make them much money. Even when they release cards with good performance, priced well, people still don't buy them. The only reason most normal consumers want AMD around is for competition, but even if AMD makes the best bang-for-the-buck card, they will never actually choose AMD. If Intel sees the same, they will drop graphics cards as well, and we'll just have a monopoly. If you want diversity and variety in your markets, the consumer actually needs to support that. Nvidia knows now that there is no price too high, since most of their customers refuse to purchase other companies' cards.

    • @GreyDeathVaccine
      @GreyDeathVaccine several months ago +6

      Well, I ordered a B580 (but I am not sure when I will get it), but man, this card is overpriced in Europe. I accept that I am buying a slightly worse product, but I will have Quick Sync 🙂 That matters to me, since I already have an AMD CPU and am not gonna switch to Intel.

    • @PCgamingbenchmarktime
      @PCgamingbenchmarktime several months ago +12

      That's crap. AMD hasn't been pricing their cards aggressively enough. They've been too far behind with upscaling and RT, and needed to price lower to compensate. They're catching up a lot in RT with the 9070 XT, apparently, and they have FSR 4 coming, which is now AI-based. If Nvidia keeps raising prices, AMD has a chance to step in and take market share if they're aggressive enough. I'd happily buy the 9070 XT if it was 4080 performance at $499... but will they actually price it properly or match those performance levels? It needs to beat the 5070 by at least 10% in raster, imo, while being $100 cheaper. Otherwise it will be the same old story. This is their best shot so far, so it would be such a fail if they botch it lol. We'll see in 8 hours, I guess

    • @HunterTracks
      @HunterTracks several months ago +8

      @@PCgamingbenchmarktime Well, let's not forget that AMD actually needs to make money on those cards to keep making them. They can only drop their margins so low before they may as well not bother with making GPUs at all.

    • @PCgamingbenchmarktime
      @PCgamingbenchmarktime several months ago +6

      @HunterTracks Who said anything about them not making money on them? The 7800 XT was $499 at launch and later dropped to $449. They're still using GDDR6 and the same amount of RAM, and I heard rumours the die size isn't that far off the 7800 XT's. The profit margins would still be there; it's about making the card as efficiently as possible. Nvidia just has ridiculously high margins, let's be honest. Cards can be cheaper than what they're selling for and still make big profits. I see no reason why the costs would be any more than the 7800 XT's. The 2080 to 3080 was a 50% uplift for the same price, so a 40% uplift from the 7800 XT for the same price seems very reasonable

    • @WayStedYou
      @WayStedYou several months ago +2

      @@PCgamingbenchmarktime And yet they tried lower pricing for a decade until recently and still didn't gain market share.

  • @CrazyShiftee
    @CrazyShiftee several months ago +19

    4:49 sums up RT pretty well. You can actually see a difference with the lighting... probably...

    • @Soccercrazyigboman
      @Soccercrazyigboman several months ago

      Yep. It's so sad how people fall for gimmicks, even when they don't see a difference. Branding is fucking powerful

  • @peanut93able
    @peanut93able several months ago +25

    AMD doesn't have as much money to throw at their GPU branch, unlike Nvidia. Let's just hope they don't f up the prices this time

    • @RegalPixelKing
      @RegalPixelKing several months ago +1

      It doesn't work like that...
      Prior to Ryzen gen 1, AMD had far less money than Intel. Yet after a few generations of Ryzen they eventually surpassed Intel, in spite of originally having less money.
      The reason Intel and AMD are struggling to catch up to Nvidia is simply because Nvidia has the best engineers and the best feature set.

  • @DaveTheeMan-wj2nk
    @DaveTheeMan-wj2nk several months ago +5

    AMD is doing quite well.
    Contracts for both Xbox and PlayStation, including their new ones coming out.
    Also, in the PC handheld world, they are about to release some next-level chips.
    And their CPUs are on fire.
    Their stock is only growing.
    AMD doesn't need to focus on the super high end. Not important.

    • @Balnazzardi
      @Balnazzardi several months ago +2

      Dude, reality check for you. Nvidia dominates the PC GPU market with 88% market share... AMD has only 12%, and this gap has only been widening year after year. On the CPU side their situation is without a doubt much better, and they are taking ground from Intel, but on the GPU side they are not even outselling the RTX 4060 and 4060 Ti, even though their GPUs in the same price range are more affordable and offer better performance than the 4060/4060 Ti. If you don't take my word for it, just go look at the Steam hardware surveys. AMD has a mountain to climb just to get more consumers on their side with low- and mid-tier GPUs.

    • @Totenglocke42
      @Totenglocke42 several months ago

      @@DaveTheeMan-wj2nk Having the basic features of a modern GPU is not "high end".

    • @DaveTheeMan-wj2nk
      @DaveTheeMan-wj2nk several months ago

      @@Totenglocke42 You're welcome to share your opinions :)
      Nothing is set in stone; high end is determined by the individual and their needs.
      How cool is that.

    • @Totenglocke42
      @Totenglocke42 several months ago

      @@DaveTheeMan-wj2nk Except it's not. High end is determined by what exists. A top-of-the-line GPU that has all the features available and runs games at the best resolutions and frame rates is high-end. A GPU that's using almost ten-year-old tech and can't run any of the modern graphics settings properly is low-end at best.

    • @DaveTheeMan-wj2nk
      @DaveTheeMan-wj2nk several months ago

      @@Totenglocke42 I'm happy with my previous comment.

  • @DrChill-eo4dq
    @DrChill-eo4dq several months ago +19

    AMD was ripping off Intel until they started to simply make better products, and not only performance-wise; the price was also affordable to most of us casual PC users/gamers. AMD did so well that Intel is in the shadows now. If they start to copy and beat Nvidia's insanely stupid prices, then we shouldn't complain but be happy. Because I don't know what world you guys are living in, but paying $1.2k or more for a GPU is stupid. A small % of people see that price as affordable; to most it's a few months of savings, whatever's left after paying taxes and rent.

    • @derbigpr500
      @derbigpr500 several months ago

      Intel still makes better products than AMD. Only AMD fanboys who overdose on copium daily believe AMD is better now.

    • @dgillies5420
      @dgillies5420 several months ago

      This is a complete misunderstanding of 40 of the last 50 years of CPU history! AMD invented 64-bit x86, STARTLING INTEL. AMD crushed Intel with Athlon. Intel only survived by breaking the law MANY TIMES! AMD is very small and made a disastrous design mistake with the Excavator CPUs - NOT COPIES OF ANY INTEL CPU, EVER! Now they have Ryzen and pioneered chiplets and V-Cache while Intel sits on its fat lazy ASS! AMD was only a second source (printing licensed copies of 80xx CPUs) in the late 1970s and early 1980s, when nobody would buy your chips without a second-source foundry to make more chips in case you went bankrupt!

    • @transformerstuff7029
      @transformerstuff7029 several months ago +2

      Wasn't AMD producing, like, a faster 486 CPU back in the day?

    • @thechurchofsupersampling
      @thechurchofsupersampling several months ago +2

      ​@@transformerstuff7029 More the Athlon XP series; AMD was on top then, a long time ago though

    • @transformerstuff7029
      @transformerstuff7029 several months ago +2

      @@thechurchofsupersampling That X2 4600+ was pretty decent, if I remember correctly.
      And yeah, they reverse-engineered Pentium CPUs at first and ended up improving them. So an AMD 486 would be faster than the Intel one.

  • @thatzaliasguy
    @thatzaliasguy several months ago +62

    Biggest PhysX titles back in the day: Bordelrands 2, Batman: Arkham Origins, Metro 2033 (original), Mafia II (original), Arma 3, Cryostasis, Medal of Honor: Airborne, Just Cause 2, Assassin's Creed IV Black Flag, Dragon Age II.

    • @marceladex9529
      @marceladex9529 several months ago +6

      Bordelrands

    • @interrobangings
      @interrobangings several months ago +6

      Oh man, the smoke and debris effects in AC4 were INSANELY cool at the time. I still had my GTX 770 after I upgraded to a 980 Ti and used it as a dedicated PhysX card

    • @the_unded6173
      @the_unded6173 several months ago +7

      Bro, don't forget the first Mirror's Edge! I remember playing that and Arkham Asylum with my GTX 260 and was blown away by how good the glass and smoke effects looked!

    • @Yorgarazgreece
      @Yorgarazgreece several months ago

      @@marceladex9529 😂😂😂😂

    • @davidepannone6021
      @davidepannone6021 several months ago +1

      FEAR too

  • @Riyozsu
    @Riyozsu several months ago +184

    AMD should focus on software improvements before starting to compete with 80- and 90-class cards. Nvidia is thriving now because they took the pain and effort to build these technologies decades ago.

    • @zzephi
      @zzephi several months ago +32

      Yes, AMD is better value for 16-25-year-old kids who are only doing gaming,
      and Nvidia is for 25+-year-old adults who need to game and WORK.
      That's one of the reasons Nvidia dominates so much. Nobody cares about spending 300/400 euros more for a polyvalent GPU.
      AMD fans are coping: "Nooo AMD is better value, it costs 50 euros less and has 10+ FPS 🤓🤓🤓🤓 so it's better!!!"

    • @SG-Megatron
      @SG-Megatron several months ago +15

      Both of the guys above me spit nothing but facts. AMD should fr compete with Nvidia on technology, cause FSR still sucks ass so bad 😂; the fact that Tim from Hardware Unboxed says that XeSS is outright better is sad for Lisa Su & her whole GPU department.

    • @Bulkyninja
      @Bulkyninja several months ago +101

      @@zzephi Bro, it's just a corporation that wants your money. They're not your friend or lover. They make a good product, but they dominate the market and can make up whatever pricing they want. Corporate shills are gross as hell.

    • @diablosv36
      @diablosv36 several months ago +32

      Tessellation, HairWorks and PhysX weren't developed by Nvidia; they bought them or basically stole them

    • @sidewinder86ify
      @sidewinder86ify several months ago +12

      What's wrong with AMD's software? Explain.

  • @TheAnalgetikum
    @TheAnalgetikum หลายเดือนก่อน +45

I do not understand the ray tracing hype. Yes, it looks good. But why the fuck would I use it when I have to put DLSS in Performance mode, making the textures look way worse, just to get playable FPS? I just ignore RT in every game. It's just a "look what I can do" feature to drive sales from people with weak character.

    • @callmenoodles8955
      @callmenoodles8955 หลายเดือนก่อน

      It's still in its early stages, and tech needs time to develop and become mainstream. It also took quite a while for 1440p to become playable at an affordable price, and yet most people still play on 1080p. Meanwhile, there are 4k monitors.

    • @TheAnalgetikum
      @TheAnalgetikum หลายเดือนก่อน +12

@callmenoodles8955 Going into its 7th year, RT isn't that new anymore, and it's just as demanding as it was before.

    • @dante19890
      @dante19890 หลายเดือนก่อน

Lighting is the most important part of game graphics. You might have a shitty GPU now, but in 10 years you will sit here in the comments and gush over RT, and you won't even be able to turn it off, cuz there won't be a rasterized fallback by then. It's the future.

    • @utopic1312
      @utopic1312 หลายเดือนก่อน +7

@@dante19890 People said the same thing when the 20 series came out, and it still makes games run like shit. Plus I shouldn't have to wait a decade for something that's out right now

    • @AstroBot-bc0213
      @AstroBot-bc0213 หลายเดือนก่อน

@@utopic1312 It's a future not too far off, I'd say probably in 2 years or so, since game devs give no fucks about how well their games actually run, so they force RT on with no way to turn it off 👍

  • @Crecross
    @Crecross หลายเดือนก่อน +3

    3:00 Nvidia didn't introduce ray tracing. That technology is probably as old as your grandma.

    • @fxncy2566
      @fxncy2566 หลายเดือนก่อน +1

      they made it work feasibly in real time for games

  • @ShiroKage009
    @ShiroKage009 หลายเดือนก่อน +10

    You used to be able to have a separate Nvidia card on top of your AMD card and dedicate the Nvidia card to PhysX. So you could pair a cheap Nvidia with a powerful AMD card. Good old days.

    • @evaone4286
      @evaone4286 หลายเดือนก่อน +7

      They had to eliminate multi gpu tech, made too much sense

    • @utopic1312
      @utopic1312 หลายเดือนก่อน +1

      @@evaone4286 it had constant issues

    • @potatonite2340
      @potatonite2340 หลายเดือนก่อน +1

      ​@@utopic1312right. Not to mention that nowadays just buying a single good GPU is expensive, imagine two

    • @evaone4286
      @evaone4286 หลายเดือนก่อน +1

      @@utopic1312 developers weren't implementing it properly

    • @utopic1312
      @utopic1312 หลายเดือนก่อน

      @@potatonite2340 it was expensive for 2 then as well

  • @chrls.1093
    @chrls.1093 หลายเดือนก่อน +4

15:30 They do that so the consumer can easily understand the naming; they did the same with Ryzen 3, 5 and 7 against Intel's i3, i5 and i7.
They match the names to make it easier for everyone to understand, which is actually really clever.

  • @rob4222
    @rob4222 หลายเดือนก่อน +25

The truth is: only a few games are Nvidia sponsored.
AMD has AFMF2, which Nvidia doesn't (yet).
I hope AMD gives us something like Microsoft's AutoSR and makes it as accessible as RSR. That would be a great feature for older games, I think

    • @Efsaaneh
      @Efsaaneh หลายเดือนก่อน +4

AFMF is literally just Lossless Scaling, which costs $4. AMD is being a stubborn donkey and I hate it

    • @SeasoningTheObese
      @SeasoningTheObese หลายเดือนก่อน +6

      @@Efsaaneh It's also worse than Lossless Scaling which is both hilarious, and depressing.

    • @grzywa7401
      @grzywa7401 หลายเดือนก่อน +7

@@SeasoningTheObese Worse? In my case it depends on the game, but in most of them AFMF2 is much better than LS

    • @roller4312
      @roller4312 หลายเดือนก่อน +5

      it's even funnier when you think how AMD has basically all the console market and still can't capitalize on that.

    • @interrobangings
      @interrobangings หลายเดือนก่อน +9

@@roller4312 How are they supposed to capitalize on that?
"Buy our stuff, it's like a PS5 but better!" SURELY wouldn't piss off their business partner, Sony /s

  • @neilbrideau8520
    @neilbrideau8520 หลายเดือนก่อน +2

Since the GeForce 256, Nvidia has been the king of dirty tricks. The most soulless, evil corporation I have known for decades now.

  • @БоднарРостислав
    @БоднарРостислав หลายเดือนก่อน +5

Tessellation is not about giving a Y value to textures; parallax occlusion mapping does that. Tessellation triples or even quadruples the polygon count of the geometry the textures are applied to, creating a higher level of detail, which is why for a long time it was used sparingly, only on some objects. Now try lowering tessellation to minimum, for example in Forbidden West, and see what you get. Then there's soft PCF shadows, which later down the line turned into PCSS, unified shaders, driver-forced FXAA for games with no AA, then driver SSAA in DSR. Nothing's changed: RT is the new SSAO and soft shadows, needing unified shaders, accelerated by CUDA, and therefore running better on Nvidia. Somehow ATI, and then AMD, always play catch-up, competing with Nvidia either on features or raw performance.
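The polygon-count math behind the tessellation point above is easy to sketch. A toy uniform subdivision pass in Python (illustrative only; GPU tessellation hardware does not work like this, and the function names here are made up):

```python
# Toy sketch of why tessellation multiplies polygon counts so fast.
# One uniform subdivision pass splits every triangle at its edge midpoints
# into 4 smaller ones, so n passes multiply the triangle count by 4**n.

def midpoint(a, b):
    # Component-wise midpoint of two 3D vertices.
    return tuple((a[i] + b[i]) / 2 for i in range(3))

def subdivide(triangles):
    """Replace each triangle (a, b, c) with four smaller triangles."""
    out = []
    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return out

tri = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
once = subdivide(tri)    # 4 triangles
twice = subdivide(once)  # 16 triangles
print(len(once), len(twice))
```

The geometric blow-up is the point: a handful of extra subdivision levels means orders of magnitude more triangles, which is why a "crazy high" tessellation factor can tank performance with no visible benefit.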

  • @taomahNEGEV
    @taomahNEGEV หลายเดือนก่อน +4

    I honestly hope AMD kicks some nVidia asses with the RX9700XT value/performance.

  • @DomikDomik
    @DomikDomik หลายเดือนก่อน +13

PhysX previously ran on a dedicated physics card that worked more or less like graphics accelerators such as the 3dfx Voodoo did in their day. Nvidia bought the company (Ageia) and its components were added to graphics cards for physics calculations.

  • @kaystephan2610
    @kaystephan2610 หลายเดือนก่อน +8

Just a couple days ago I had a short argument with someone who claimed that ray tracing TOTALLY makes a huge difference. Another guy in that comment section also said that path tracing gives games a "hypnotizing quality", which is the most batshit insane level of falling for marketing I've ever seen. For some people it reaches almost pathologically obsessive levels. But anyway, that's beside the point. The dude I was arguing with was hilariously, almost cartoonishly arrogant. His first statement - as per usual for Nvidia hardcore fanboys - was that if I don't think RT is great, I must be running an old shitty GPU. In his words: "I don't know about anything else but my 5080 is going to run that ray tracing beautifully cuck boy" lmao.

    • @RiderOfKarma
      @RiderOfKarma หลายเดือนก่อน +3

      Often these people arguing on social media don’t even own high-end gpus that can do this stuff. They base it off of watching youtube videos and pretending they know. Hypnotized by the marketing. Like poor people who defend billionaires because they believe they will someday be one. 😬

    • @Dempig
      @Dempig หลายเดือนก่อน

I don't care about RT, but I do care about upscaling, and FSR is completely unusable. You act like AMD fanboys aren't in complete denial about all of the driver issues. They all claim to have "zero driver issues", but AMD's own Adrenalin patch notes list all of the games that constantly crash on all AMD cards. It takes months for them to fix crashes in games like Wukong, Helldivers 2, Lords of the Fallen, etc.

    • @chacharealsmooth941
      @chacharealsmooth941 หลายเดือนก่อน

@@Dempig You'd be surprised, but having issues in some apps is unavoidable. I did an update on an Nvidia machine recently, and you know what was in the changelog? "Fixed an issue..." Will you call out Nvidia on "bad drivers"? Nah, cause you are a sheep, apparently.

    • @incognita112
      @incognita112 หลายเดือนก่อน +2

      @@Dempig took 2 weeks for helldivers, idk about wukong

    • @be0wulfmarshallz
      @be0wulfmarshallz หลายเดือนก่อน

Uh, you know some people do think it does and you don't. But they are insane and you are not? Bruh, is Digital Foundry insane then?

  • @R3I3ELLI0N
    @R3I3ELLI0N หลายเดือนก่อน +17

The AMD naming is a good thing. The previous naming could confuse which is the GPU and which is the CPU. For example, R7-7700X and RX 7700 XT look pretty similar to someone who is shopping for a gift or planning to upgrade their PC

    • @gerarderloper
      @gerarderloper หลายเดือนก่อน +3

      what is RDNA5 going to be called? 10000 series?

    • @Acrylick42
      @Acrylick42 หลายเดือนก่อน

      @@gerarderloper they could skip 10xxx and go to 11xxx

    • @Azureskies01
      @Azureskies01 หลายเดือนก่อน +2

@@gerarderloper Depends; when they switch from RDNA to UDNA they will more than likely do another fucking naming change, but it is what it is. RDNA5 could go the Vega route and just be its own thing, then UDNA1 would be whatever it would be

    • @WayStedYou
      @WayStedYou หลายเดือนก่อน +1

      @@gerarderloper there isnt going to be rdna 5 they are going to udna

    • @potatonite2340
      @potatonite2340 หลายเดือนก่อน

@@Acrylick42 Seems like that would snowball into another naming problem eventually

  • @PyromancerRift
    @PyromancerRift หลายเดือนก่อน +30

It's not about Nvidia. It's about consumers being stupid. Do you need tessellation at max settings? Do you need HairWorks? Do you need PhysX? The answer is always no, because these are settings above ultra in a few select games. And by the time the tech becomes mainstream, AMD runs it as well as Nvidia does.
I am playing without DLSS and without ray tracing today and I don't miss them.
Nobody is forced into anything. It's not like you had to pay to unlock those settings and are now trapped in the Nvidia ecosystem because you don't want to lose all that money you spent. It's not Apple.

    • @AstroBot-bc0213
      @AstroBot-bc0213 หลายเดือนก่อน

Well, nowadays we've got games that force RT on with no way to turn it off, so soon enough we may actually need good RT performance to play the latest games. We have a not-so-bright future

    • @transformerstuff7029
      @transformerstuff7029 หลายเดือนก่อน +1

      @AstroBot-bc0213 just the indiana jones game so far right?

    • @Ussurin
      @Ussurin หลายเดือนก่อน

Apparently HairWorks can run on the CPU, and it was glorious on my old rig: it allowed me to play Witcher 3 on ultra on a shitty GTX 960, because I had a decent i7 CPU. And then they patched it... It's literally a feature lock. And due to IP laws AMD cannot even propose a serious alternative most of the time. I'm now on a full AMD rig, but it's a bit stupid that I get a worse experience on top-of-the-line AMD than on middle-of-the-road Nvidia, just cause HairWorks is locked behind hardware.

    • @GamingLovesJohn
      @GamingLovesJohn หลายเดือนก่อน

I think Outlaws does software RT, but I can see more games following Indiana Jones. Which I abhor, since they want to bury the mistake that is the 1080 Ti.

    • @Totenglocke42
      @Totenglocke42 หลายเดือนก่อน +1

      @@PyromancerRift Do you need them to get the game to run? No. Do you want them for the best experience? Yes. If clowns like you had your way, we'd all still be playing on a 486 with no 3D graphics cards because "Ugh, it can run Doom and Command & Conquer just fine! We don't need new tech!"

  • @leoni7649
    @leoni7649 หลายเดือนก่อน +10

    The kind of advantage you have when you invest in research and development.

    • @samson7294
      @samson7294 หลายเดือนก่อน +3

Yup, it's AMD cope. Do I like the pricing? Nope, but Nvidia not only invests a boatload in developing tech, they also go to game dev studios to help implement said tech.
AMD needs to compete like they do in the CPU market. Intel was dominant but AMD made moves. I switched from an 8700K to a 7900X3D, then sold it and got a 9800X3D.
And all this crying about ray tracing is so dumb! IT'S OPTIONAL! You don't have to switch it on

    • @FBI_Agent_
      @FBI_Agent_ หลายเดือนก่อน +4

@@samson7294 Well no, you're missing the whole point of the video, and here you go again with this. DLSS was not meant to be a technology for injecting fake frames just to reach 60 fps in most games. Secondly, all the technology enables is slop developers un-optimizing their triple-A or modern slop 😂

    • @ChucksSEADnDEAD
      @ChucksSEADnDEAD หลายเดือนก่อน

@@samson7294 That's missing the point. Getting tech implemented to make games run worse on the competition is the kind of stuff that gets the government involved in other industries. Heck, Microsoft was forced to offer a browser choice at setup because Windows coming bundled with Internet Explorer was considered unfair competition.

    • @nightmarepotato5000
      @nightmarepotato5000 3 วันที่ผ่านมา

      @@samson7294 " IT'S OPTIONAL! You don't have to switch it on" my honest Indiana Jones and The Great Circle reaction

  • @raynel8495
    @raynel8495 หลายเดือนก่อน +1

Watching this from the future, a couple days after the beginning of CES. Your predictions were SPOT ON. Gotta say I'm impressed.

  • @johnsonnguyen1374
    @johnsonnguyen1374 หลายเดือนก่อน +5

Funny enough, I was thinking this exact thing about how ray tracing is the current tessellation experience

  • @CRBarchager
    @CRBarchager หลายเดือนก่อน +2

16:00 Don't know why, but it sounded like you said ketchup instead of catch-up. Love your content! And I don't like the new naming scheme. It feels like what AMD did back in the Athlon days, when they had to put the Performance Rating (PR) in their processor names to compete with Intel. Only when they threw all that away and made an entirely new name (Ryzen) and architecture could they establish themselves and distance themselves from Intel. Hope they can do the same with UDNA in the future against Nvidia.

  • @Epic3032
    @Epic3032 หลายเดือนก่อน +8

My RX 6800 is over a year old now; bought it 2nd hand in 2023 as an ex-mining card... still going strong. Paired with my 5800X3D. I'm waiting for high-performance APUs so I can downsize my rig to a mini PC.

    • @LeNathDuNet
      @LeNathDuNet หลายเดือนก่อน +2

      Buying an ex mining card has to be one of the most idiotic moves ever

    • @Epic3032
      @Epic3032 หลายเดือนก่อน +1

@@LeNathDuNet haha well it's been over a year and mine's still going strong, so joke's on you mate. I got lucky when I bought mine.

    • @LeNathDuNet
      @LeNathDuNet หลายเดือนก่อน

@@Epic3032 Extremely lucky it even works, for sure, but I personally wouldn't take the bet. Not only because the odds of it being in a good state are horribly low, but also because crypto scum are among the worst people, the entire thing being extremely tightly linked to organised crime of the worst kind

    • @UncannySense
      @UncannySense หลายเดือนก่อน +1

@@LeNathDuNet a mining card that's been on 24/7 and likely undervolted in a clean server rack will more than likely be in better shape than a 2-year-old gaming card in a PC that's been power cycled on and off for more than 3 years...

    • @LeNathDuNet
      @LeNathDuNet หลายเดือนก่อน

@@UncannySense Oh, because you thought they treated those like professional server racks?
Let's just ignore the hundreds of thousands of totalled cards because these idiots power-washed them with water to resell to other idiots

  • @ForestTekkenVideos
    @ForestTekkenVideos หลายเดือนก่อน

The first time I downloaded PhysX was when I installed Ghost Recon Advanced Warfighter 2 for PC. It was really cool for that time, which was early 2007 iirc. Explosions in a destructible house would send debris flying all over in a way that just wasn't possible without PhysX.

  • @tak2ulata
    @tak2ulata หลายเดือนก่อน +22

The 7900 XTX is what I have and it's absolutely great. There is no need for anyone to give Nvidia all of that money for the bullshit they're selling!!!

    • @NicolasCharly
      @NicolasCharly หลายเดือนก่อน +3

      Getting my new build with this exact GPU in a couple of weeks! Can't wait to experience AMD GPUs after 2 decades of NGridia's.

    • @interrobangings
      @interrobangings หลายเดือนก่อน +7

      ​​@@NicolasCharly don't forget CES is this week and they'll be revealing new cards

    • @XNyvedX
      @XNyvedX หลายเดือนก่อน +1

      @@NicolasCharly Same!

    • @MrHimer12
      @MrHimer12 หลายเดือนก่อน

I bought an RX 6800 XT almost 3 years ago because a 3080 was 2.5x more expensive than this GPU, which I got for less than $500 with full warranty, so... 3 years ago that was extremely cheap. Is it bad? No, everything I play runs great. I can't run RT well enough in CP2077, but buying overpriced Nvidia stuff for 1 game doesn't justify it for me. Surprisingly Stalker 2 runs decent, but that's probably thanks to my X3D CPU. We don't have good options right now. My budget for a GPU is like $700 max, and since I play games rarely due to time constraints, I'd rather invest such money in my oldtimer motorcycle collection, which gains value with time.

    • @Dempig
      @Dempig หลายเดือนก่อน

The 7900 XTX is unusable for me. Upscaling is required at 4K, and FSR looks so bad it's unusable

  • @PillowOfEvil
    @PillowOfEvil หลายเดือนก่อน +7

AMD should focus on their driver stability. I still have issues with their Adrenalin app crashing randomly every few hours, crashing games, and games crashing the app. Incredible. And I'm not someone who just updates over the top, since I'm comfortable with using DDU and safe mode and such.

    • @Annihilation99
      @Annihilation99 หลายเดือนก่อน +17

Very unlucky for you; I haven't had any issues since switching from Nvidia

    • @interrobangings
      @interrobangings หลายเดือนก่อน +11

@@Annihilation99 This.
Adrenalin works fine for me

    • @danielainger
      @danielainger หลายเดือนก่อน +7

Had a 6950 XT for a while now and have had no driver issues whatsoever.

    • @guilhermedecarvalhofernand1629
      @guilhermedecarvalhofernand1629 หลายเดือนก่อน +4

Did you use DDU for a clean driver install when swapping GPUs?

    • @sergiu6650
      @sergiu6650 หลายเดือนก่อน +1

7800 XT, no issues for me since I stopped Windows Update and started using DDU for uninstalling drivers.

  • @banner7310
    @banner7310 หลายเดือนก่อน +52

Ray tracing is just another PhysX, HairWorks etc. I believe Nvidia wanted to add something that would cripple AMD. And honestly idk how ray tracing even got as far as it did. It's another useless gimmick that no game can run well with. Using it as an argument is stupid as well.

    • @dante19890
      @dante19890 หลายเดือนก่อน +6

No, ray tracing is actually the future of rendering. You're never gonna get rid of it. It's here to stay.
Next console generation, with the PS6, every new game is gonna be using RT from the ground up.

    • @GreyDeathVaccine
      @GreyDeathVaccine หลายเดือนก่อน +7

RT can be good, but on powerful GPUs and at 1440p, not 4K.

    • @kawaiipiggy9143
      @kawaiipiggy9143 หลายเดือนก่อน +7

Ray tracing is an incredibly useful tool for developers. If and when ray tracing capable cards (and I mean fully capable, so probably not for a while) are widespread, developers will essentially never have to bother with manually setting up the lighting tricks they have to use now. It's an insane cost- and time-cutting feature; it's just not able to shine quite yet

    • @PriyansuBhagabati
      @PriyansuBhagabati หลายเดือนก่อน

      It's not useless if you have the money

    • @snakeace0
      @snakeace0 หลายเดือนก่อน +6

Sorry, but proper ray-traced global illumination, or even better path tracing, is NOT a gimmick. It absolutely transforms a game's graphics while also reducing the workload on environment designers.
It's rather obvious that you never actually play ray-traced titles and much prefer insane framerates.
DLSS is so good at Quality settings that it results in better image quality than the best AA solutions, while giving you a massive performance boost.
Back in the day we were VERY happy hitting mid/high settings at 720p in Crysis.
I hope Nvidia keeps pushing tech, because AMD sure as shit won't..

  • @a-job7276
    @a-job7276 หลายเดือนก่อน

If you enable path tracing, you have to disable the rasterization options that no longer affect final visual quality but still reduce FPS. In most games they are not automatically disabled when using PT.
DLSS in Quality mode or DLAA is too heavy for most systems if PT is used; at lower internal resolutions it still looks very good and gives more FPS with PT. Path tracing replaces many things in rasterization, so those must be disabled.

  • @michelemugno6878
    @michelemugno6878 หลายเดือนก่อน +7

It's not impossible to keep up. AMD has to focus on mid-tier users, like they said they would. High-end users prefer Nvidia; the low end and mid range could go with AMD if properly educated. In almost all aspects AMD offers more for less money, but at the low to mid end people still pick Nvidia over AMD, and that is something AMD has to change if they want to keep up the pace

    • @PCgamingbenchmarktime
      @PCgamingbenchmarktime หลายเดือนก่อน

If AMD doesn't botch the launch I'll happily switch to the 9070 XT from my 3080. FSR has been the main thing holding me back, since I play at 4K and FSR looks terrible in most games. But if FSR 4 looks promising and the 9070 XT is actually very, very close to a 4080 for $499, I'll buy that. But AMD have a bad habit of disappointing, so we'll see how it goes. I'd rather not get another Nvidia card, but if AMD fails I'll probably get a second-hand 4080. Fingers crossed for AMD lol

    • @derbigpr500
      @derbigpr500 หลายเดือนก่อน

      " if properly educated, in almost all aspects AMD offers more for less money" - Keep coping. AMD isn't ahead in anything, even in value.

    • @MyouKyuubi
      @MyouKyuubi หลายเดือนก่อน +1

      I mean, it makes no sense for low-enders to choose Nvidia, because AMD products are superior on the low end, like... Universally superior. xD

    • @MyouKyuubi
      @MyouKyuubi หลายเดือนก่อน +1

      @@derbigpr500 You have no idea how wrong you are about that, lmao! xD

    • @SapiaNt0mata
      @SapiaNt0mata หลายเดือนก่อน +2

@@MyouKyuubi This is where AMD is not superior: brand loyalty. Even the 4070 at $600 outsells the 7600 at $300. The 7600 competes with the 4060, and yet it's outsold by the 4070. The 4060 laptop and desktop outsell the 4070, so imagine how much the 4060 outsells the 7600. There's not even a contest; there's more of a contest with the 4070 than with the 7600. Nvidia wins on brand loyalty, and most people don't even realize it. It's like trying to outsell the iPhone. Not gonna happen. Nvidia is the iPhone of GPUs, but unlike phones, where there's Samsung, there's no Samsung in GPUs.

  • @simon_a_s
    @simon_a_s หลายเดือนก่อน +1

I was an Intel/Nvidia fanboy from my early gaming days in the late 90s. Bought my 2nd AMD system ever back in 2023 and went with an AMD GPU as well. The Intel/Nvidia prices got utterly insane at some point. AMD has a better price/performance ratio if you disregard RT performance.

  • @metashawn
    @metashawn หลายเดือนก่อน +4

Reality check at 1:07

  • @garyshearer0
    @garyshearer0 หลายเดือนก่อน +2

Upscaling is a misleading marketing term. It should be called rescaling, since the game first renders at a lower resolution and then uses AI to upscale it back to the target resolution.
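The render-cost side of that comment is easy to sketch. A Python toy (the per-axis scale factors are Nvidia's commonly published DLSS ratios; `internal_resolution` is a made-up helper for illustration):

```python
# Sketch of what "upscaling"/"rescaling" means in pixel terms: the game
# renders at a lower internal resolution, then the upscaler reconstructs
# the target resolution. Scale factors are the commonly published DLSS
# per-axis ratios; internal_resolution() is an invented name.

MODES = {
    "Quality": 2 / 3,            # ~0.667 per axis
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

for mode in MODES:
    w, h = internal_resolution(3840, 2160, mode)
    shaded = w * h / (3840 * 2160)   # fraction of pixels actually rendered
    print(f"{mode:17s} -> {w}x{h} ({shaded:.0%} of the 4K pixels)")
```

Performance mode at 4K renders 1920x1080 internally, i.e. only a quarter of the output pixels, which is where most of the FPS gain comes from.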

  • @CHAstaroth
    @CHAstaroth หลายเดือนก่อน +7

I can switch whenever I want. Luckily I have no Stockholm syndrome with gaming and Nvidia. But good video as always! Time to subscribe

  • @Saieden
    @Saieden หลายเดือนก่อน

    I remember when AVP came out with tessellation in 2010 and being blown away at how smooth the Xenomorph heads were compared to the previous games.

  • @Vitaroart
    @Vitaroart หลายเดือนก่อน +10

    01:35
    Our brains are totally messed up.

  • @younasqureshi9179
    @younasqureshi9179 หลายเดือนก่อน +1

0:01 ATI TruForm was the first tessellation effect, before NVIDIA even came out with their version, way back on the Radeon 8500

  • @infernal-toad
    @infernal-toad หลายเดือนก่อน +7

Considering Nvidia's GPUs keep getting more expensive than previous generations, or even have worse power consumption, Nvidia's domination can't last forever, right?

    • @icarus3874
      @icarus3874 หลายเดือนก่อน

This question might not age well; it's a bet at this point, the way I see it

    • @Ablequerq
      @Ablequerq หลายเดือนก่อน +4

      You underestimate the stupidity of consumers.

    • @Guy-McPerson
      @Guy-McPerson หลายเดือนก่อน

They're currently frontlining the market's obsession with AI. The massive price jumps are just piggybacking off of their recent success in that area. So long as shareholders demand they profit further from the hype, nothing will change.

    • @RiderOfKarma
      @RiderOfKarma หลายเดือนก่อน

      Look at Intel CPUs, they were considered unbeatable when Ryzen launched.

    • @Codyslx
      @Codyslx หลายเดือนก่อน

Worse power consumption? Maybe at the high end, but the 4060 is one of the most efficient GPUs this generation and does wonders in the laptop and mobile space. Any competing AMD GPU swallows up power like a whale. You can't beat 115 watts.

  • @JustDave_GG
    @JustDave_GG หลายเดือนก่อน

Just in the process of jumping ship. Sick of the prices now. Managed to get a 7900 XTX on a great deal, so I went team red. It arrives tomorrow; hoping the switch goes smoothly

  • @dragonmares59110
    @dragonmares59110 หลายเดือนก่อน +9

3:20 a 4070 Ti S is 1150 euros here... I wish it was only 800...

    • @r0xa18
      @r0xa18 หลายเดือนก่อน +1

1150€, crazy... where do you live? In Germany the Ti Super is 830-900€

    • @dragonmares59110
      @dragonmares59110 หลายเดือนก่อน

      @@r0xa18 France, GPU are overpriced

    • @JcsP3D
      @JcsP3D หลายเดือนก่อน +2

      @@dragonmares59110 mate, I know it's hard, but you need to NOT accept those prices, 4070 Ti should be around MAX 600€, and that's already pushing it.

    • @BladeRunner031
      @BladeRunner031 หลายเดือนก่อน

@@r0xa18 In Croatia it's 940€ and up, it's just insane

    • @dragonmares59110
      @dragonmares59110 หลายเดือนก่อน

@@JcsP3D Well I don't need to accept it, I just can't afford it anyway xD Never putting more than 500 euros into a GPU

  • @Kankipappa
    @Kankipappa หลายเดือนก่อน

I used tessellation with ATI Radeon 9000-series cards in Unreal Tournament. That was over 20 years ago. It was nice, allowing you to have more imaginary polygons between vertices, but that tessellation wasn't the same deal as the one later introduced, which worked more like a parallax effect.

  • @disturebd
    @disturebd หลายเดือนก่อน +7

One big thing about buying an AMD graphics card in Germany is the wattage of the GPU. Yeah, I paid like 100€ more for a 4070 Ti instead of buying a 7900 XT, but this way I can play my games with ray tracing and DLSS, undervolted to 165-185 watts max, instead of needing nearly twice as much while paying 33 cents per kWh. So as long as you keep your GPU for more than 2 years, or play more than 2 hours daily on 300 days of the year, Nvidia will be cheaper than going AMD while offering more. So it's hard not to go Nvidia over here with this in mind.

    • @BKYoutube-zq3yv
      @BKYoutube-zq3yv หลายเดือนก่อน +2

The 7900 XT undervolts so well it can play Elden Ring fully maxed out with max RT at a locked *native* 1440p 60 FPS while using 125W. Less than a 4070 Ti Super ever could at those settings.
You punked yourself by not researching AMD tuning. It's wild on the 7900 cards; you can choose to be super efficient, balanced, or Ballz2thewall performance.

    • @crm484
      @crm484 หลายเดือนก่อน

33 cents/kWh in Germany, damn. I live in Denmark; our avg price in 2024 was €0.07. So I saved money and bought the 7900 XT.

  • @sgtcaco
    @sgtcaco หลายเดือนก่อน +1

    ‘Omg I’m getting old’
    You are in for a shock my friend.

  • @m-copyright
    @m-copyright หลายเดือนก่อน +5

    Don't worry mate, we are all old.

  • @CrazyBrosCael
    @CrazyBrosCael หลายเดือนก่อน +1

Gonna get a 5070 because I don't have a GPU, only to play games from 5-15 years ago.

  • @Serpic
    @Serpic หลายเดือนก่อน +3

I don't understand why you think the 4070 Ti S is meant for 4K with RT. Regarding Ray Reconstruction, even Nvidia itself said that for now it will only work at acceptable FPS with DLSS 3 (unless you have a 4090, for 1080p gaming). I hope I live to see the moment when turning on RT doesn't cut FPS in half, when there are enough RT cores for the processing. And we are likely to see the 6070 series consume power at the level of the 4090.

    • @YuokoII
      @YuokoII หลายเดือนก่อน +2

So all cards except the overpriced ones can't do any of Nvidia's features? What's the point of them then?

    • @Serpic
      @Serpic หลายเดือนก่อน

They can, but not in 4K. DLSS was created more for 4K gaming than 1080p, yet it's now used with the words "medium settings, upscale balanced, 1440p at 30 FPS". 4K monitors are more expensive, and rendering that resolution requires almost 4 times more performance, which the graphics accelerators of the time could not deliver (without DLSS). Even the "jacket" says "the more you buy, the more you save". So what am I getting at? These are the times of AI, when performance per watt has increased significantly, which cannot be said of the gaming industry.
For me, the 90 series has always been either for 4K or for VR (which is essentially the same 4K+) at maximum settings with DLSS. In addition, 4K became popular too early, and video card manufacturers were not ready for it. RT / Ray Reconstruction / Path Tracing is still very "expensive" processing for the GPU, and you should not expect 60 FPS at 4K on a 4070 Ti Super.

  • @ELCrisler
    @ELCrisler หลายเดือนก่อน

So tessellation was used by Nvidia at a crazy high level, so high that it was next to impossible to notice. Cutting the tessellation level in half caused no change in visual quality, but suddenly AMD cards ran it fine. HairWorks in The Witcher was another example of something like this being done.

  • @71janas
    @71janas หลายเดือนก่อน +6

    Looking forward to your CES content👍

  • @ufukbakan4741
    @ufukbakan4741 หลายเดือนก่อน +2

Yes, this is so correct, thanks for spitting truths Vex. Many, many games do hidden optimization (de-optimization for others) in favor of their sponsoring brand. Thus not only is market share affected, competitive gamers are affected too.

  • @scubasausage
    @scubasausage หลายเดือนก่อน +3

    Nvidia giving consumers innovative features is pro consumer. But it is very anti-AMD. If you are upset that Nvidia innovates you probably have an irrational emotional attachment to Intel or AMD.

    • @ChucksSEADnDEAD
      @ChucksSEADnDEAD หลายเดือนก่อน

Pro-consumer is giving away the technology so that the consumer is not locked in.
For example, PhysX was deliberately not optimized for x86, so CPUs couldn't run PhysX without massive performance losses. Not letting the consumer run PhysX on the CPU is an anti-consumer measure.

    • @scubasausage
      @scubasausage หลายเดือนก่อน

@@ChucksSEADnDEAD Lmao, your example is awful. Nvidia purchased Ageia, who owned and created PhysX. At the time CPUs could run PhysX, but not very well at all. You had to buy a dedicated Ageia PhysX card to run PhysX, but they were expensive. When Nvidia bought PhysX they integrated it into GeForce, meaning users didn't need to buy a second card. It was quite a consumer-friendly move. I remember it first hand; I'm 38 and I've been doing this since I was a teenager. Even today PhysX isn't handled well on a CPU and you would want it on the GPU. Of course money would need to be spent to get it to work on Radeon cards. Why should Nvidia spend that money and help its competition?
Also, if you look, Nvidia spends a far higher percentage of its profits on R&D than AMD does. Why should Nvidia give the competition the fruits of its labour? This has nothing to do with the consumer, and the consumer doesn't benefit if AMD gets its hands on Nvidia's technology. You appear to be mistaking healthy competition for anti-consumerism.
AMD makes billions in profit every year. They are not short of cash to compete with Nvidia. They merely choose not to spend it. If they were broke and getting crushed by a massive multinational then I'd understand, but that is not the case. AMD are just keeping the money.

  • @TeeJaey
    @TeeJaey months ago

    Borderlands 2 was THE PhysX game back in 2012. Hella unoptimized, but those dynamic cloth sheets that could tear were some awesome tech to me, and the fluid effects on stuff like toxic goo looked really fitting for the Borderlands art style.

  • @erroneousbosch
    @erroneousbosch months ago +15

    My RTX 2060 is dying. I'm hanging on until next gen launches, and planning on going AMD.

    • @suouco
      @suouco months ago

      Literally me. I can barely run Red Dead at a smooth 60 😭 even with DLSS

    • @mradsi8923
      @mradsi8923 months ago +1

      Have you considered Arc? The B580 looks really promising for a mid-cost GPU

    • @Pazo139
      @Pazo139 months ago

      I went AMD and haven't looked back.

    • @transformerstuff7029
      @transformerstuff7029 months ago +1

      @@mradsi8923 Seems it really likes 1440p though, and gets beaten by the 4060 at 1080p...
      That is something to think about, as many budget gamers still game at 1080p, or use a 1440p card for multiple years and game at 1080p in the last few of those years.

    • @transformerstuff7029
      @transformerstuff7029 months ago

      @@Pazo139 same, gotta buy the competition if you want them to stay relevant.

  • @GhostProductionsGP
    @GhostProductionsGP months ago +2

    today we would like to introduce drum roll please... THE BRAND NEW RTX 6090 TI SUPER WITH 12GB VRAM GDDR8X HALF THE CUDA CORES HALF THE SM'S SHITTIER OVERALL 20% BETTER DLSS, UPSCALING AND RAY TRACING FOR THE SMALL PRICE OF $5500 USD AND YOUR GRANNY, remember the more you buy THE MORE YOU SAVE!!! (features sold separately)

  • @Azureskies01
    @Azureskies01 months ago +6

    Nvidia tried to do the same thing with RT that they did with CUDA: be first out of the gate and do it better than AMD, so devs would feel forced to use their stuff and no one else's. Thing is, this time AMD is in the consoles, and those are a huge portion of the market, so ngreedia couldn't cut them out of the market. It is making RT slower to adopt, but you can really see which devs are in Nvidia's pocket (CDPR).
    RT hardware can't be copyrighted, but you know what can? DLSS. So that was their plan this whole time: try to force studios to put RT in their games, and then ngreedia sells you "secret sauce" to fix its performance hit... especially with path tracing.
    The only way to fix this is for people to get off of ngreedia's dick, and at the same time AMD has to stop fucking up each and every GPU launch.
    FSR4 is coming out when DLSS is going to get its next big update, so AMD's software is still two years behind on the GPU front. They need to price their cards accordingly, yet they never fucking do.

    • @3dcomrade
      @3dcomrade months ago

      Wukong being heavily Nvidia-biased made me stunned by the online community's overfocus on the culture-war stuff.
      They robbed you of your GPU's extra performance.
      A desktop 6500 XT is 30% faster than my laptop's RTX 2050, yet both have similar performance according to HU's charts and my laptop's benchmarks.

  • @Dawnkeekong
    @Dawnkeekong months ago +6

    5:15 Did I hear that correctly?

  • @RebieRebs
    @RebieRebs months ago

    I am building my first PC right now from scratch, on my own. I haven't built one in 15 years or so, and never without help. It is absolutely wild what I have come back to. I have a mobo and a CPU (670X & 7950X3D) and now I am waiting for the new cards. I wanted to go Nvidia, since I have never had one before, but I do not like the way they're doing things. I'm thinking of grabbing a 7900 XTX or something newer, if AMD makes anything worth checking out around that performance at a lower price than the green team. Otherwise I am "doomed" to grab a 4080/5080.
    Anyway, I have been checking out your videos here and there along with GamersNexus (

    • @ct2651
      @ct2651 months ago +1

      The 7900 XTX may actually be the most powerful GPU from AMD for the next 3 years, since AMD said they wouldn't try to make a high-end GPU this year/gen. Just to inform you, you may need to brace yourself for disappointment.

    • @gsst6389
      @gsst6389 months ago

      You don't buy by a brand's morals; you buy a brand because they provide the product you need or want. This is not a multi-year relationship, this is a one-time deal in which you make a one-day transaction: you get the goods, they get their reward. Period.

    • @RebieRebs
      @RebieRebs months ago

      @@gsst6389 That is okay if you think that. I disagree and will spend my money how I want.

  • @misterlobsterman
    @misterlobsterman months ago +9

    I don't care how good Nvidia gets, I'll never buy from them until they stop their scumbaggery.
    Nvidia have done this kind of thing for the last 20+ years: research new tech, make it proprietary, pay off game devs to optimise for their cards and put the new tech in their games. Now gamers have to buy their cards, because it runs better on Nvidia and has exclusive features, since they partnered with the devs.
    Don't care for this mentality. In the mid range (where my budget is), AMD is still king in price-to-performance anyway.

    • @MyouKyuubi
      @MyouKyuubi months ago

      Well, personally, I won't buy Nvidia again, because they treat their own customers like trash... I'm one of the suckers that fell for the RTX 2080 Ti scam...
      First they burned me by discontinuing SLI support, making my sick SLI setup invalid; I could only run my games on ONE card, which was a huge performance hit... But I lived with it for many years, until I decided to upgrade to that 2080, and then, although my performance improved, I still had crazy stuttering issues... So that was the second time I got burned... And then one month later the 3000 series released, making me feel like I wasted my money on the 2080 Ti.
      I'm not buying from Nvidia again. :)
      I'm buying AMD, and if AMD falls off a cliff, I'd rather buy Intel GPUs than Nvidia... And if Intel AND AMD fall off a cliff somehow, I'd rather buy and play IRL board games than buy an Nvidia product, or heck, even give those obscure Chinese knock-offs a try! :P

    • @Codyslx
      @Codyslx months ago

      Why would you spend money on R&D and not make it proprietary? It's not smart to supplement your competitors.

    • @misterlobsterman
      @misterlobsterman months ago +1

      @@Codyslx From a business standpoint, sure. But if they wanted, they could at least make it so that previous-gen cards can use some of the features of the new ones, but they lock that away too. They are anti-consumer even with their own customers. Not to mention the outrageous prices, or how they sell you a worse card for more money year after year (3060 vs 4060, for example). And if AMD can give all gamers FSR and frame gen for free, so could Nvidia. Most of their money is from the AI, crypto and big-tech side, not gamers, but they nickel-and-dime everyone. They still sell 8-gig cards for more money than AMD does their faster 12-gig cards.

    • @Codyslx
      @Codyslx months ago

      @@misterlobsterman How will previous gens run software that requires specific hardware to work? There is a reason why AMD is moving to hardware-level machine-learning upscaling, which their other GPUs most definitely won't have access to.
      The 4060 is also better than the 3060 even without DLSS 3 and DLSS frame gen, all while only using 115 watts max.

    • @MyouKyuubi
      @MyouKyuubi months ago

      @@Codyslx Hmm, I think that means Nvidia isn't confident they could stay on top performance-wise, and was afraid a competitor would pull it off better than they could.
      If you make something proprietary, you are only showing you fear what your competitors might accomplish with it.
      It's cowardice, is what it is...
      If I was confident I could remain on top, I would share the tech and still remain on top; everybody wins! Heck, my competitors might even invent something new off of what I invented, which I can proceed to innovate on in turn... Perhaps we'll accidentally discover something that can reduce production cost by 50%, allowing us to sell at the same price but still increase profits... etc.
      So actually, yes, it is smart to "supplement" your competitors... Only someone who isn't smart enough to see the benefits would think it isn't smart. :P

  • @fishjohn014
    @fishjohn014 months ago +7

    The title doesn't make sense... it should say "doesn't", not "don't"

  • @verebellus
    @verebellus months ago +1

    CUDA and other tech like tensor cores is huge. CUDA acceleration is huge for both gaming and productivity. Funnily enough, a lot of what Nvidia has implemented as proprietary stuff was also introduced in the PS2 tech demo showing the new possibilities the PS2 had, like better simulations.

  • @AmrAbdeen
    @AmrAbdeen months ago +3

    I don't get your point. Are you frustrated that Nvidia keeps introducing new features for their hardware? Because it makes developers lazy?
    It's like saying cars are bad because they make people not want to walk!

    • @M4x505.
      @M4x505. months ago +1

      Also this thing about DLSS and the AI features that Nvidia has. They're always saying "DLSS bad", "ray tracing is a gimmick", "DLSS is cheating and FSR looks the same". Istg it's like they'd prefer a giant bulky steam engine over a modern V8.

    • @MyouKyuubi
      @MyouKyuubi months ago +2

      @@M4x505. FSR looks objectively worse, lmao... And I'm using a 6800 XT in my main system... I'd rather use proper anti-aliasing if I can; FSR looks awful... DLSS does look better, and would be the preferred AA tech if developers weren't abusing it as an excuse to avoid doing the optimization work they're supposed to do... : /
      Ray tracing IS a gimmick though. It only looks SLIGHTLY better, but it costs 90% of your frames, as well as 90% of your bank savings... It is so not worth it, AT ALL! 🤣
      That's why I went with AMD: because AMD does the basic raster stuff really, REALLY WELL, so I get MORE frames than Nvidia products get with all the RTX stuff turned off... That's why AMD is better, imho... AND it's also cheaper; literally win-win.
      I only get sad when games like Hogwarts Legacy don't have normal anti-aliasing techniques at all, and ONLY have TAA, and then FSR and DLSS. Those are the only choices you have in Hogwarts Legacy; it's so silly. :P

    • @AmrAbdeen
      @AmrAbdeen months ago +1

      @MyouKyuubi I'm an AI researcher. AMD is waaaaaaaay behind on the business side; developers don't even optimize around their hardware anymore. I hope they step it up.

    • @MyouKyuubi
      @MyouKyuubi months ago

      @@AmrAbdeen Oh, you mean market share? Sure, I guess...
      However, whenever you buy a console or a laptop, what CPU does it have? AMD. What is the best gaming CPU on the market right now? AMD... What's the cheapest gaming hardware? AMD!
      The only reason Nvidia and Intel have the most market share is because they're partnered up with Microsoft and local PC assemblers to promote and sell their stuff before anything else... Trying to create a closed system where they can each have a monopoly on whatever they're selling... Like an ouroboros of parasites.
      The fact that AMD ISN'T engaging in monopolistic practices like that makes me respect them more. :)
      After all, you AI researchers will soon be out of a job, once everyone realizes AI is just the latest buzzword and snake oil to get people to buy e-waste.
      I haven't heard of a single individual bragging about how much they love AI. Everyone, even the target demographic for this AI nonsense, absolutely f***ing HATES AI, lmao, to the point where they're literally sick of hearing the word "AI"! xD

    • @AmrAbdeen
      @AmrAbdeen months ago

      @MyouKyuubi Why so defensive? Out of a job... buy e-waste? That's the most incoherent take on research I have seen so far. You know that mostly everything tech you use day to day depends on machine learning one way or another, right?

  • @LukeHimself
    @LukeHimself months ago +1

    I genuinely do not understand the loyalty to Nvidia.
    I've been using AMD GPUs for a long time, and they've been legitimately customer friendly with their practices since way back.
    The drivers are more user friendly, and they require less overhead. I'm not sure about the newest revision of Nvidia's driver software, but AMD's has been more feature packed for years. 🤷
    *It really has to be because Nvidia makes the top tier card. If AMD took the crown, we'd see a massive shift just like with CPUs.*
    I truly believe that, but I don't think it's going to happen any time soon, since the rumors are AMD plans to only target 4080 or 5080 levels for their top card (whichever it was).

  • @jordanlok365
    @jordanlok365 months ago +6

    Nvidia copers fill the comments section with their budget 30-series cards that can't even use DLSS 3, and still they cope. The reason we can never leave Nvidia is idiotic consumerism.

    • @MyouKyuubi
      @MyouKyuubi months ago +2

      Yep, idiotic consumerism is the problem here, not Nvidia.
      Although Nvidia has done a lot of shady s**t, this in particular is a consumerism issue, nothing else. : /
      I went with AMD. I got burned by Nvidia's actual wrongdoings three times, and I called it there... First they discontinued SLI support, rendering my fresh PC setup invalid, but I gritted my teeth and stuck with it for many years... Then I bought the 2080 Ti, the long-awaited upgrade, which stuttered like crazy and was barely an upgrade at all, and then one month later the 3000 series came out, making me realize I got scammed, twice, by Nvidia.
      Never buying Nvidia again after that, lmao. I'd sooner buy Intel, or even an obscure Chinese knock-off, before I ever consider buying scamvidia products again! 🤣

  • @ChaosFraktal
    @ChaosFraktal months ago +2

    Just enough people buying everything NVIDIA throws on the table. Just waiting for real vendor lock-in from NVIDIA, where games can only run on NVIDIA cards.

  • @chrispaul7595
    @chrispaul7595 months ago +6

    PhysX was primarily about fluid and particle simulation and was a big deal for about 10 years. It was tech developed by a company named Ageia and is now open source.

    • @kostaskourkoulos6790
      @kostaskourkoulos6790 months ago +1

      Yes, and my brother used a piece of software that let him play Borderlands 2 on an R9 390 with PhysX on, and it was playable.

  • @MagiconIce
    @MagiconIce 9 days ago

    The fact that AMD and Intel are not catching up in the GPU field hurts us consumers because of the lack of competition.
    Competition drives innovation and drives prices down, both being good for us consumers.

  • @UnBknT
    @UnBknT months ago +6

    The fact he's never seen PhysX in a game blows my mind. So he basically never played any game that came out in the golden age of computer games (2004-2018). Modern gaming is a trap; it's 90% fiddling with settings and it never looks good. Older games even look a lot better imo. Nothing beats clear, sharp graphics on a 2K monitor. Modern games are like: "Okay, it's sharp, but not if you stand so far away; we can't handle good textures anymore. Also have fun with transparencies and depth of field together with DLSS, it's a mess, good luck...".

    • @derbigpr500
      @derbigpr500 months ago +1

      He's a clueless AMD fanboy, what do you expect.

    • @M4x505.
      @M4x505. months ago

      99% of the people that praise AMD and absolutely hate Nvidia will buy an nvidia card.

    • @derbigpr500
      @derbigpr500 months ago

      @@M4x505. They would if they could afford to. But their parents won't give them the extra allowance, hence the fact they're so mad about it online.

    • @M4x505.
      @M4x505. months ago

      @@derbigpr500 Even then, I would rather get a lower-tier Nvidia card than an AMD one. Never again; the drivers suck (and yes, I did ALL the DDU stuff and everything). Plus, if you want to do anything productive with your system, there's no choice but the green team.

    • @MyouKyuubi
      @MyouKyuubi months ago

      Yeah, I ran the original Half-Life game a while ago, and I noticed there's no anti-aliasing setting in the menu... So I decided to inspect the anti-aliasing the game just runs natively... and it's fking amazing, literally flawless anti-aliasing; it looks spectacular.

  • @Lynx_Melynx
    @Lynx_Melynx months ago

    I actually just really appreciated the history lesson for the early stuff that was before I got into PC gaming and building. The whole video was good though.

  • @nßultz1440
    @nßultz1440 months ago +8

    This is monopolistic. We should really class-action Nvidia and make them open-source CUDA, and allow VRAM to be upgraded. The only reason for any of it is to protect a monopoly. It's sad its millions weren't enough.

    • @GoldenEDM_2018
      @GoldenEDM_2018 months ago +1

      Why would the government make CUDA open source? NVIDIA developed the hardware and software for it. It's their intellectual property. Lmao. Or are we becoming socio-communist now??

    • @youssefmohammed5456
      @youssefmohammed5456 months ago +3

      Why would I spend a shit load of money on research and development and partnerships, and then let all my competition use it?

    • @Codyslx
      @Codyslx months ago

      ​@@youssefmohammed5456 exactly.

    • @nßultz1440
      @nßultz1440 months ago

      @@youssefmohammed5456 Intel was made to do the same by your parents or grandparents, because they understood monopoly is bad

  • @emirmasinovic
    @emirmasinovic months ago +1

    AMD should never have played catch-up on these stupid tracing features. If I were them, I would have collaborated with developers to maximize performance without the need for an upscaler. Also, I want to criticize the devs on the same topic: I think they wasted time implementing ray tracing and path tracing, plus this need for realism is silly when the game's art style was always more important. I don't want a game that's 100 GB, needs the latest hardware and the latest features, and looks like real life. For realism and path tracing, I just go outside (and it doesn't even cost anything).

  • @raychii7361
    @raychii7361 months ago +5

    Just give us cheap graphics cards AMD.

    • @TopShot501st
      @TopShot501st months ago +2

      They won't, you only have Intel for that.

    • @raychii7361
      @raychii7361 months ago

      @@TopShot501st If they are in stock

    • @IcecalGamer
      @IcecalGamer months ago

      @@TopShot501st The median market price for a Battlemage B580 is 380 bucks/euros, if you can find it.
      And you also need a high-end CPU (5800X3D and upwards) to run Alchemist/Battlemage (Intel GPU overhead).
      How is Intel the cheap option?

    • @MyouKyuubi
      @MyouKyuubi months ago

      wdym? AMD has plenty of viable cheap cards for you to peruse. :O

    • @raychii7361
      @raychii7361 months ago +2

      @@MyouKyuubi Where is the RX 580 successor?

  • @PerciusLive
    @PerciusLive months ago

    The old benchmark standard of 60 fps, DLSS off, ultra quality, 4K res was such a good time for progress. Now we've got this ray-tracing/path-tracing garbage that requires DLSS to hit 60 at medium/high instead of ultra, at 1080p. The whole ray-tracing tech has caused this backwards trend, and until people start pushing back on this change in testing methods, it's only gonna get worse.

  • @KianFloppa
    @KianFloppa months ago +6

    Radeon better

    • @MyouKyuubi
      @MyouKyuubi months ago

      🎯

  • @geneevans7885
    @geneevans7885 months ago

    If you bought that stock back in 2016, you’re loving life right about now.

  • @Y0Uanonymous
    @Y0Uanonymous months ago +11

    AMD performs much better than Nvidia when there is a CPU limitation. Nearly all the stupid graphics card comparison tests are done with the fastest CPU available, which hides this fact.

    • @evaone4286
      @evaone4286 months ago +1

      Interesting

    • @zf8496
      @zf8496 months ago +1

      that's true!

  • @DaveGamesVT
    @DaveGamesVT months ago +1

    I think Nvidia needs to be broken up for being a monopoly.

  • @ridleyroid9060
    @ridleyroid9060 months ago +4

    Unrelated, but have you seen the B580 overhead-issue testing done by HCN and HBU?

  • @miljanvideo
    @miljanvideo months ago

    Much respect for the ROR2 music

  • @rozzbourn3653
    @rozzbourn3653 months ago +5

    Sounds to me like AMD and Intel need to start innovating on their own and not just follow what Nvidia does all the time.