You Won't Need a New GPU Ever Again...

  • Published on May 28, 2024
  • ==JOIN THE DISCORD!==
    Graphics cards are becoming a lot more like smartphones these days - most people go a long time without upgrading their GPU. With the pace of PC hardware and GPU technology slowing down while games demand more than ever, we're entering an era where GPUs are getting increasingly expensive and feel increasingly pointless to upgrade. To remedy this, Nvidia, AMD, and Intel have made tech like DLSS, FSR, and XeSS to bypass the demands of games. We've also seen many devs simply not care about the latest tech and just make games that aren't demanding. Regardless, it all adds up to barely ever upgrading the GPU.
    What do we do? idk
    just be aware ig, lmk what you think
    __subscribe___if____you_____DAREEEEE______ooooooooooo__
    Digital Foundry: • Inside DLSS 3.5 Ray Re...
    • Cyberpunk 2077 2.0 - P...
    Daniel Owen: • RTX 2060 vs RX 6600: T...
    • GTX 1060 vs RTX 2060 v...
    Testing Games: • GTX 1080 vs RTX 2080 v...
    The Verge: • Google Pixel 8 event i...
    blogs.nvidia.com/blog/2018/03...
    0:00- GPUs and Smartphones are pretty similar now
    0:52- GPU tech is slowing down
    3:38- DLSS and FSR as a response
    4:55- Just don't make games demanding :/
    7:45- GPUs have been getting expensive
    11:12- The good and bad
  • Science & Technology

Comments • 3.7K

  • @MyRkAcc
    @MyRkAcc 7 หลายเดือนก่อน +4196

    With 50% of Steam users on 900 and 1000 series Nvidia GPUs, devs should know not to make RT/Lumen-only games if they want to sell their games 😂

    • @chielvandenberg8190
      @chielvandenberg8190 7 หลายเดือนก่อน +325

      With those games being heavily overpriced, they know they will get rich enough off the 15% that's able to run them

    • @atnfn
      @atnfn 7 หลายเดือนก่อน +143

      Well if they are only gonna make games for 900 and 1000 series I guess they should stop making games using UE5 and go back to using UE4. (or equivalent)

    • @Realnamenogimmicksdude
      @Realnamenogimmicksdude 7 หลายเดือนก่อน +81

      Don't you worry son, they will sell you a remake of those RT/Lumen games as soon as your phone can run it, probably for double the price

    • @lionmuller2680
      @lionmuller2680 7 หลายเดือนก่อน +1

      And stupid enough to buy that crap. Especially in presale.

    • @Alpine_flo92002
      @Alpine_flo92002 7 หลายเดือนก่อน +47

      @@chielvandenberg8190 Games these days are basically the same price as, if not cheaper than, AAA games from like 5-10 years ago. Inflation, my guy

  • @rahyrammartinez835
    @rahyrammartinez835 7 หลายเดือนก่อน +599

    Games are striving for ultra-realistic graphics these days while sometimes forgetting the core fundamental aspect of a game, which is that it should be fun. Nowadays, games often demand PC upgrades, consuming a significant amount of space on our SSDs, and, in the end, they are becoming less enjoyable. Games like BattleBits and Vampire Survivors come out to remind us gamers of what a game should be: fun above all else.

    • @MyouKyuubi
      @MyouKyuubi 7 หลายเดือนก่อน +58

      *Monkey hear XP pickup noises.... Neuron activation.*

    • @holymonk293
      @holymonk293 7 หลายเดือนก่อน +3

      @@MyouKyuubi lmfao

    • @elgonzo7239
      @elgonzo7239 7 หลายเดือนก่อน +49

      I am sorry, but many of these oh-so ultra-realistic RT-based visualizations in games don't look much more realistic than well-done rasterization-only visualizations...

    • @pokiblue5870
      @pokiblue5870 7 หลายเดือนก่อน +18

      They need to make more solo games like Rareware did in the past, or like the Tomb Raider series or The Last of Us. But they only care about loot boxes, battle passes, DLC, and microtransaction p2w.

    • @elgonzo7239
      @elgonzo7239 7 หลายเดือนก่อน

      @@pokiblue5870 Loot boxes, season passes, p2w and microtransactions make a lot of money for the companies -- really a lot. This money comes from somewhere, right? Frankly, this being a lucrative practice for so many years already, it is rather unfair to put the blame on the companies alone. If one is honest, a part of the blame has to be put also on the enablers of such corporate behavior -- the consumers themselves. It's the gamers who tell with their money again and again what a great idea p2w, loot boxes, and the likes are...

  • @skl345
    @skl345 5 หลายเดือนก่อน +139

    I've been rocking a GTX 1080 Ti for a month now. And honestly, with tech like FSR 3, I sincerely think we can squeeze another two to three years out of this graphics card before games get completely unplayable.

    • @coutou5302
      @coutou5302 4 หลายเดือนก่อน +10

      All hail the gtx 1080 ti

    • @SaradominOSRS
      @SaradominOSRS 3 หลายเดือนก่อน +5

      U realize a 4060 ti is better and cheaper?

    • @tabushka292
      @tabushka292 3 หลายเดือนก่อน +21

      @@SaradominOSRS Not if you are buying used, which at this point is kinda the only way to buy a 1080 Ti. They go for around $220, give or take; a 4060 Ti is at least double that even used, since it's new and hasn't dropped in "value" yet (rough cost-per-frame math after this thread). And the "value" of that card is debatable when it can't even outperform the card it's supposed to replace. The 16 GB version is just a joke, some sort of sick mockery from Nvidia: "Here is the solution to the VRAM problem we created, but we will only give you 16 gigs with slower memory and only on our worst card, enjoy sucker lmao"

    • @tourmaline07
      @tourmaline07 3 หลายเดือนก่อน +3

      @@tabushka292 I'd go for a used 2080 Ti instead now and overclock it - that thing can get within 15% of a 4070/3080 and it crushes the 4060 Ti as long as it doesn't run out of VRAM (which will be an issue for some titles such as Ratchet and Clank). You also get DLSS Super Resolution. Just check the thermal paste and pads - my copy burnt itself out after two years and the coverage wasn't very good.

    • @drek9k2
      @drek9k2 3 หลายเดือนก่อน

      @@SaradominOSRS Why tf would you buy a "new" 1080ti
      Why THE F would you buy an alleged 4060ti either wtf is wrong with you
      Absolutely the fuck not do you have any idea how ridiculously price gouged that piece of shit card actually is? You should be able to get a GTX 1080ti or a 5700XT Nitro+ or Taichi for like $200 on ebay and they're generally working and good cards. I've gotten GPUs on ebay before, it's fine. So like, why would I even consider paying $500 for nvidia's overpriced crap when I can pay not even half for a 1080ti. Actually wonder what OP is using tbh because I didn't think FSR worked beyond Maxwell, but at Maxwell it's not a big enough upgrade. Unless he has a GTX 970 3.5gb the VRAM is the issue there, many games work fine enough on 970, and at 1080p if OP bought a 1080ti off of ebay he's going to be able to play literally any game at 1080p. Even Cyberpunk is 60fps at 1080p ultra with a 1080ti or 5700XT. That's 3060 tier.
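
A quick way to put numbers on the used-card value argument above is cost per frame: price divided by average framerate. A minimal sketch in Python; the prices and framerates below are placeholder assumptions, not benchmark results, so substitute a real used price and fps you have measured yourself.

```python
# Rough cost-per-frame comparison for a GPU purchase.
# All prices and framerates here are placeholder assumptions, not measured data.

def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per average frame per second delivered."""
    return price_usd / avg_fps

cards = {
    # name: (assumed price in USD, assumed average fps at your settings)
    "used GTX 1080 Ti": (220, 60),
    "RTX 4060 Ti":      (440, 75),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${cost_per_frame(price, fps):.2f} per fps")
```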

  • @jjwhittaker93
    @jjwhittaker93 4 หลายเดือนก่อน +96

    Prices for GPUs are absolutely insane; at this point I'm just going to wait till my 1060 breaks before I upgrade. Hopefully they will come down. No reason a GPU should cost over $500.

    • @the_golden_s5682
      @the_golden_s5682 3 หลายเดือนก่อน +2

      $500 is too low; I'd say there's no reason a consumer gaming GPU should cost more than 800 USD.
      But a GPU like a Titan, I'd say we let them charge whatever they want for a Titan

    • @jjwhittaker93
      @jjwhittaker93 3 หลายเดือนก่อน +10

      It simply doesn't cost them that much to manufacture, nor should a graphics card cost more than the entire rest of the PC. There is a market, and it's theft after a certain point.
      You wouldn't pay $100 for a box of Cheerios because you know better. They are overpricing GPUs by a lot and it's simply not worth what they are asking.

    • @wpdlatm5576
      @wpdlatm5576 3 หลายเดือนก่อน +4

      @@jjwhittaker93 While I agree it is overpriced, your reasoning regarding manufacturing cost is wrong. It is also extremely cheap to make medicine, but what pays for the decades of research done by all those scientists? Likewise, who pays the bills to gather the most brilliant minds and motivate them to come up with the next generation of technology? If companies don't make money, who is going to fund research? R&D is not funded on charity; it has been and always will be funded by profit. A market that cannot profit will not do R&D, and with no R&D, technology advancement will slow down.

    • @jjwhittaker93
      @jjwhittaker93 3 หลายเดือนก่อน

      @wpdlatm5576
      Greed pays for decades of research. It doesn't cost money to pick up a book and gain experience. It takes time and effort 👌. Companies will exploit this by paying those "scientists" pennies on the dollar and then have their market analysts figure out how much they can screw us without people getting upset, and look, they got you.

    • @drek9k2
      @drek9k2 3 หลายเดือนก่อน +1

      @@jjwhittaker93 All companies are doing this and frankly even cologne has suffered, like they'll rerelease stuff that's had the core ingredients cut out, reformulate it with the cheapest garbage like lavender, and then proceed to jack the price twice as high and in some cases cut the perfume oils so bad, like they went from 20.5% parfum in Initio to now it's just 17%. That doesn't seem like much I know but that's huge, that's 1/6th of the product being shaved off. I've seen some designer brands introduce crap that's nearly a joke or a scam, with insanely low longevity we're talking 2 hours and the scent just vanished. EVERYTHING is a race to the bottom now, no one even bothers making anything good anymore, I think Capitalism, the corporate consumer neoliberal kind, is a Soviet tier failure that is cannibalising itself at this point and there's no way out other than debt cannibalism.

  • @samcrdx8016
    @samcrdx8016 7 หลายเดือนก่อน +941

    That's what everybody thought when FSR and DLSS first landed. Fast forward to 2023, and devs showcase these upscaling technologies as part of their game's reveal. It won't be long before we're averaging 44 FPS even with frame gen on (quick arithmetic on that after this thread).

    • @Benny-tb3ci
      @Benny-tb3ci 7 หลายเดือนก่อน +151

      On the flip side, these things need a good base frame rate to feel "right". 240Hz 4K OLEDs are hitting the market already. People will eventually want to run their games at that resolution and FPS. No matter how good these upscaling/frame generating technologies get, you will always need a good base frame rate. Poorly optimized games will just get a bad rep.

    • @emirobinatoru
      @emirobinatoru 7 หลายเดือนก่อน +44

      @@Benny-tb3ci Exactly, if I buy 144 Hz I want 144 fps and such

    • @pf100andahalf
      @pf100andahalf 7 หลายเดือนก่อน +40

      But then they're going to give you two fake frames for every real frame, then 3, then 4, and eventually every frame will be fake and the gpu will just draw what it thinks should be on the screen

    • @mostlygoodamericans
      @mostlygoodamericans 7 หลายเดือนก่อน

      interesting point will be cool to see how it all plays out @@pf100andahalf

    • @user-ru8dz8rq3y
      @user-ru8dz8rq3y 7 หลายเดือนก่อน +65

      @@pf100andahalf Sooo, basically the same as how GPUs work now? 😂

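The 44 FPS worry above is easy to put rough numbers on: frame generation multiplies a base framerate, so a decent-looking output number can hide a much lower rendered rate (and the latency that goes with it). A minimal sketch, assuming a simple multiplier model; the "two fake frames, then 3, then 4" scenario from this thread maps to higher multipliers.

```python
# Back out the rendered (real) framerate behind a frame-generated output.
# Assumes a simple multiplier model: displayed_fps = base_fps * (1 + fake_per_real).

def base_fps(displayed_fps: float, fake_frames_per_real: int) -> float:
    return displayed_fps / (1 + fake_frames_per_real)

displayed = 44.0  # the "averaging 44 FPS even with frame gen on" case
for fake in (1, 2, 3):  # 1 = today's 2x frame generation; 2 and 3 = the hypothetical future
    print(f"{fake} generated per real frame -> ~{base_fps(displayed, fake):.0f} FPS actually rendered")
```
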
  • @mjn5016
    @mjn5016 7 หลายเดือนก่อน +341

    I sometimes do prefer rasterisation over raytracing because the latter often darkens games way too much for my liking.
    I don't mind realism but I also want to be able to SEE.

    • @squid.4349
      @squid.4349 5 หลายเดือนก่อน +19

      correctly configured HDR fixes this

    • @remus907
      @remus907 5 หลายเดือนก่อน

      Not everyone has a good HDR panel@@squid.4349

    • @jakeola10
      @jakeola10 5 หลายเดือนก่อน +15

      @@squid.4349 yeah hdr really needs to become the standard going forward if we want actual good looking games. cyberpunk 2077 with path tracing and hdr on an oled screen is mindblowing. i can imagine the visibility on an lcd screen is really bad though.

    • @OatmealTheCrazy
      @OatmealTheCrazy 5 หลายเดือนก่อน +3

      I just honestly never cared. I've never wanted realism out of my graphics, at least not to the point where I don't think any improvements in the last 10 years have actually mattered.

    • @brando4526
      @brando4526 5 หลายเดือนก่อน +1

      You mean pure rasterization. The real-time ray tracing in games is actually a hybrid between raytracing and rasterization. Pure raytracing takes hours to render a single frame and that is used in CG for most movies.

  • @Faziel1807
    @Faziel1807 4 หลายเดือนก่อน +39

    does that mean i never have to upgrade my ati radeon hd5870 ever anymore wow thats cool man

  • @neonshadow5005
    @neonshadow5005 6 หลายเดือนก่อน +30

    I bought a really cheap, low-end card as a stand-in until I could afford to upgrade to something better, but in the end I never did, because until just recently that stubborn little GPU handled everything I threw at it. It was only $110 brand new, and I've had it for years. I need to replace it *now*, but it held its own for a long time.

    • @theantil7
      @theantil7 4 หลายเดือนก่อน +4

      What GPU was it and what are you replacing it with?

  • @Hat-san
    @Hat-san 7 หลายเดือนก่อน +511

    The simple fact that Nvidia pushed the price from the 3080 to the 4080 up by more than $500 (almost double) was a red flag, and people took notice, which is why we saw 4080s not coming off the shelves. Nvidia tried to get greedy and they need to come back down to earth.

    • @PFnove
      @PFnove 7 หลายเดือนก่อน +9

      I don't think you realize how much faster a 4080 is compared to a 3080

    • @Hat-san
      @Hat-san 7 หลายเดือนก่อน +154

      @@PFnove That is not a justification for a $500+ price increase in the span of 1 generation leap... The gtx 980 vs gtx 1080 is about the same jump as the 30 to 40 series and yet the price difference for the 980 to 1080 was $50.

    • @jaronmarles941
      @jaronmarles941 7 หลายเดือนก่อน +89

      @@PFnove Even after inflation, that's a 31% price hike. Which is unacceptable

    • @jaronmarles941
      @jaronmarles941 7 หลายเดือนก่อน +46

      @@Hat-san 980 to 1080 was only about a 7% hike after inflation ($550 in 2014 was about $558 in 2016 dollars). 3080 to 4080 is 31%, or almost 5 times as bad for the same performance jump (the math is sketched out after this thread).

    • @TheBontekraai
      @TheBontekraai 7 หลายเดือนก่อน +57

      Tried to get greedy? They were always greedy. It's the same with apple.
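
The thread's inflation adjustment is easy to reproduce. A minimal sketch using launch MSRPs as commonly reported (GTX 980 $549, GTX 1080 $599, RTX 3080 $699, RTX 4080 $1199) and rough assumed cumulative inflation factors for each gap; the exact percentage you get for the 30-to-40 series jump depends heavily on which launch price and CPI window you plug in, so treat the output as illustrative.

```python
# Inflation-adjusted generation-over-generation price increase.
# MSRPs are commonly reported launch prices; the inflation factors are rough
# cumulative assumptions for each gap -- substitute exact CPI data for precision.

def real_increase(old_price: float, new_price: float, inflation_factor: float) -> float:
    """Percent increase after expressing the old price in new-generation dollars."""
    old_in_new_dollars = old_price * inflation_factor
    return (new_price / old_in_new_dollars - 1) * 100

print(f"GTX 980 ($549, 2014) -> GTX 1080 ($599, 2016): {real_increase(549, 599, 1.02):.0f}% real increase")
print(f"RTX 3080 ($699, 2020) -> RTX 4080 ($1199, 2022): {real_increase(699, 1199, 1.14):.0f}% real increase")
```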

  • @slapnut892
    @slapnut892 7 หลายเดือนก่อน +238

    The fact that a 1080 Ti can match an RTX 4060 in terms of raw performance is proof that NVIDIA was and still is holding back.

    • @Munenushi
      @Munenushi 6 หลายเดือนก่อน +20

      it wasn't long ago that NVIDIA and AMD/ATi were in trouble for price-and-tech 'fixing' so that they barely stepped things up and released in agreed increments, to milk the end users the most.... I remember reading it in a PCXL (Pc Accelerator) magazine years ago

    • @nightvision3182
      @nightvision3182 6 หลายเดือนก่อน +8

      Look a bit deeper into where they buy the semiconductors and their board members, then look into the Nvidia and Amd board members.....and it will become evident they do exactly what you describe.@@Munenushi

    • @nemiw4429
      @nemiw4429 6 หลายเดือนก่อน +17

      ​@@nightvision3182are u Batman hiding in a dark corner whispering, or can u say it out loud like a normal person?

    • @nightvision3182
      @nightvision3182 6 หลายเดือนก่อน

      go play with your playstation, it's not a topic for lemmings....@@nemiw4429

    • @Tommo020788
      @Tommo020788 6 หลายเดือนก่อน +10

      You are dreaming... I used to have a 1080 ti, and when I upgraded to a 3060ti (not even a 4060 yet) I noticed a big difference in performance, AND quality in modern games that utilise DLSS. Using a 3060 ti will net you about 15-30 frames per second (depending on the title) more than the 1080 ti on most games which can make quite a big difference to player experience, and a 4060 improves on that further.

  • @lencox2x296
    @lencox2x296 6 หลายเดือนก่อน +29

    Thank you for your insights. The old days when we used to upgrade every CPU or GPU generation passed years ago.
    I made the same point about longer usage times for smartphones, laptops, and CPUs several years ago. For 99% of people in a daily-use scenario, there is simply little or no benefit to getting a newer one.
    The same now goes for GPUs: for 99% of people, a solid GPU may last more than 5 years.

  • @bztube888
    @bztube888 4 หลายเดือนก่อน +7

    I am still using a GTX 970, and with new games too. If a game is very demanding, I just trade off framerate against quality. It has always worked out so far, though I am not a "gamer".

  • @voteDC
    @voteDC 7 หลายเดือนก่อน +446

    The problem ray tracing has, besides the performance hit, is that you only really experience the benefit when you stop to look for it.

    • @Coffeeandacigarette
      @Coffeeandacigarette 7 หลายเดือนก่อน +46

      I honestly think rdr2 looks better than cyberpunk 2077 with Ray tracing. The volumetrics are incredible

    • @shalomsam5362
      @shalomsam5362 7 หลายเดือนก่อน +18

      same, i think rdr2 is still one of the most stunning games ever produced, cant wait to see what rockstar has to offer later on ;)
      @@Coffeeandacigarette

    • @nimrodery
      @nimrodery 7 หลายเดือนก่อน +1

      That makes it more lifelike, occasionally you have to stop and look around or you miss it.

    • @Remu-
      @Remu- 7 หลายเดือนก่อน +15

      My GF always plays RDR2 on the TV right next to my PC setup and every now and then I just stop to admire the scenery and graphics on that game and it's not even the PC version.

    • @MyouKyuubi
      @MyouKyuubi 7 หลายเดือนก่อน

      @@shalomsam5362 rdr2 is one of those games where, regardless of what you do with the graphics settings, the game will still look blurry af. xD
      Sure would like to enjoy the details of my black, gold patterned, ivory handled LeMat revolvers...

  • @DrekRivers
    @DrekRivers 7 หลายเดือนก่อน +19

    the best games out are not about graphics but gameplay. devs nowadays pretty much always focus on appearance over gameplay and wont show real gameplay on steam, just fuckin cinematics and stuff like that. buy a game day 1 and you're sure to have a mediocre experience (server lag, unoptimized game, bugs, lack of options) and you're sure to pay for beta testing the game. the game industry these days just feels like a big huge scam

    • @freelancerthe2561
      @freelancerthe2561 5 หลายเดือนก่อน

      I'm of the mind that both should be in service of the game experience, and that lacking in either is a problem. There is a huge myth that pixel graphics means a game is more focused on game play. This is totally dumb, because modern pixel games have incredibly high readability, which is 100% a graphics focused problem.
      Dwarf Fortress simultaneously represents both extremes at the same time, because despite the game being unrealistically game play dense, the vast majority of people wouldn't even touch it because of the ASCII graphics were considered incomprehensible. And too few took the time to learn the graphics and its obtuse UI system to be able to play it.
      On another extreme end, I've been arguing since the early 2000s that "fancy graphics don't matter" is because we've (as in gamers and developers) never bothered to take full advantage of graphics as a standard game play element. Things like light and shadow being utterly worthless in most MP FPS games, because people would just turn them down to get better performance. And we don't allow it to matter, because we're afraid of bad performance.
      But what infuriates me the most..... Is that we FINALLY reached a point where the hardware and graphics engines can sustain excellent levels of detail at stable performance...... and all of that is completely wasted on a modern UX design paradigm, where everything is highlighted and outlined, so we don't even need to process the game's visuals or art style. We live and interact with most games almost entirely in an AR layer within the game.
      So congratulations, graphics don't matter, because the UX now bypasses them. Why not just complete the journey and make the game the UI layer?

    • @TechTusiast
      @TechTusiast 2 หลายเดือนก่อน

      I don't see it that way. I think looks are important too, and updates fix issues when there are any. You can wait before buying a game if bugs are a big deal for you. I have not had terrible experiences with games being overly buggy or troublesome. In The Last of Us I had one troublesome bug which almost stopped me from advancing, and in Cyberpunk 2077 I fell through stairs once and had some other glitch, but nothing too bad. People sometimes complain about stutters or performance issues, but just look at the fps, then set an fps cap just below what the game typically runs at and the framerate evens out nicely.

  • @Ricelord4
    @Ricelord4 3 หลายเดือนก่อน +5

    I've always been the kind of person who's bought mid to high tier stuff, only to hold onto it for 5 years or more. I had my Samsung Galaxy S7 phone for 5 years and only replaced it when the screen separated from the frame. My most recent purchases from 2 years ago - an HP OMEN 17 laptop (with an Intel Core i7 12700H CPU and an NVIDIA RTX 3070 Ti laptop GPU) and Samsung Galaxy S22 Ultra - will hopefully continue the trend and last 5 years or more. Like you said, with technologies such as upscaling (DLSS, FSR, XeSS, etc) and rasterization optimization, I'm hoping to really squeeze the utility out of my gear. And it's not like stuff becomes unusable in even 5 years... I still use a 2009 Apple MacBook Pro 5,5 to do simple web browsing, listen to music, and watch DVDs.

  • @A.I.5000
    @A.I.5000 5 หลายเดือนก่อน +7

    You're right in principle, because hardware doesn't improve as quickly as games' demands do. My tip is to change the processor only every 10-15 years, because it's rare for a game to really need CPU power; most often what's needed is a good video card.

  • @sirderpsalot
    @sirderpsalot 7 หลายเดือนก่อน +119

    I think the main issue is that many (particularly AAA) games are so often poorly optimised that you need more GPU power than should be necessary for the performance you get. If this continues, just lowering a couple of settings may not be enough to achieve visuals and FPS comparable to when you got the card. I wouldn't be surprised if some of these game companies are intentionally doing this in agreement with the GPU companies to generate more sales.

    • @Zarkil
      @Zarkil 7 หลายเดือนก่อน +4

      The optimization problem comes from games being made for consoles first. Console hardware is streamlined for gaming, so even though PC specs may be higher, they're not as efficient. This generation of consoles has had 12 GB of VRAM available since launch; scaling that down to 8 GB or less is a huge pain in the ass for what is generally a less popular market. Although PCs accounted for 70% of Cyberpunk 2077 sales, so we will see.

    • @ocularcavity8412
      @ocularcavity8412 7 หลายเดือนก่อน +16

      @@Zarkil It is not just VRAM that makes most PC ports poorly optimized. I have had 12 GB of VRAM in my system since Nvidia Maxwell, the 900 series (the 2080 Ti did have 11 GB) - that is 4 generations of video cards I bought - and I have seen performance go WAY down without any visuals to make up for it. Dying Light is a great example: DL1 in my opinion looks noticeably better than DL2, and yet I could run DL1 on a SINGLE Titan Xp at 4K max settings while my 2080 Ti STRUGGLED to run DL2 with NO raytracing at 4K and NEEDED DLSS to get decent framerates - and it was UGLIER! Personally I think FSR and DLSS have become a crutch that publishers use as an excuse to cut the time and budgets they used to allocate to optimization, and I started noticing this trend when the 20 series launched, BEFORE the PS5 and Xbox Series; it was slow but it was definitely there.

    • @trevorveillette8415
      @trevorveillette8415 7 หลายเดือนก่อน

      @@ocularcavity8412 How is Dying Light 2 uglier? I played it on a 4090 and the game looks way better than DL1.
      Also, the 20 series was a gimmick generation. Very little improvement in raster over the 1000 series, but it could support RTX.

    • @ocularcavity8412
      @ocularcavity8412 7 หลายเดือนก่อน +5

      If you look at texture quality, environmental detail, body and cloth physics, and mesh/model detail, they all look better in DL1; the only thing that looks better in DL2 is the lighting. Turn off raytracing and it looks last-gen and BLURRY as $#IT thanks to TAA, and it still runs pretty badly without DLSS; even with RT on it looks like an ugly game with good lighting. And for REAL, as much as people dogged Cyberpunk at launch, version 1.0 had WAY better performance than DL2 on my 2080ti and hands DOWN stomps DL2 in visual quality, RT on or off.

    • @MiniDevilDF
      @MiniDevilDF 7 หลายเดือนก่อน +4

      @@Zarkil I disagree, there's so many PC-only games that are not optimized. It's not because of platform ports, it's because of lazy devs.

  • @Raums
    @Raums 7 หลายเดือนก่อน +53

    I only upgrade every 4 or 5 generations. Been gaming since 3DFX in the 90s and there's simply no need to upgrade earlier unless you go for a higher fps or res monitor tbh.
    We have settings in games for a reason...

    • @michaelthompson9798
      @michaelthompson9798 7 หลายเดือนก่อน +5

      Same here … gaming since the Atari 2600 days, and I grew up astounded by 640 x 480 gaming 😳🤯! I think modern gamers believe that if their GPU can't do High / Ultra settings out of the box, it's time for an upgrade. Graphical sliders and settings are a street directory to modern gamers and they can't navigate the menus! Understand how the settings work and what taxes which resource … CPU / RAM / GPU core or VRAM etc. I grew up altering my game .ini files / console commands etc. to get a game running; these days, if it can't just run out of the box at the highest settings, apparently the hardware needs upgrading. A great example was in this video about smartphone upgrading: there have been no major advancements anywhere in the past 10 or so years requiring you to upgrade, other than not being able to replace your mobile's battery after 5-7 years because batteries aren't made replaceable. I actually have an iPhone XS Max and have no need to upgrade it. It has plenty of storage, a basic 2x optical and 5x digital zoom, 512 GB of storage, no cosmetic issues, and I only charge the battery every 2nd night on a slow charge, so my battery is still great at holding a charge for a couple of days.

    • @elitepauper7400
      @elitepauper7400 7 หลายเดือนก่อน +3

      Ah yes, let's drop all the settings to low and enjoy a pixelated mess

    • @Saif0412
      @Saif0412 7 หลายเดือนก่อน

      ​@@elitepauper7400😂

    • @PKmuffdiver
      @PKmuffdiver 7 หลายเดือนก่อน

      Love it. I was a Voodoo2 junkie. My last card was a GeForce 680 Mobile. Lol. This round I ended up with a 4090. : ) It was a rip-off but I do like the jump in performance. Blew my mind.

    • @borgy7085
      @borgy7085 7 หลายเดือนก่อน

      Well, some games, like Cyberpunk, are actually so CPU heavy that my i7 7700 is just dying under them, no matter my 6700 XT AMD card and its 12 GB of VRAM... Low graphics: 50 fps, sometimes 60; max settings: almost the same, but never 60 XD

  • @play_by_ear
    @play_by_ear 4 หลายเดือนก่อน

    Love your video. There’s also the problem of diminishing returns when it comes to the insanely high power needed to power some GPUs. I switched from a 3090 to a 4070 and my room is physically so much cooler when I’m gaming or rendering a video.

  • @traewatkins931
    @traewatkins931 6 หลายเดือนก่อน +6

    I went from a 660 -> 1060 -> 4060 (last week). Prior to that I always spent around $200-300 every 4-6 years, with the exception of when 3DFX died and everything stopped supporting them almost overnight.

  • @3makakram115
    @3makakram115 7 หลายเดือนก่อน +126

    The big GPU companies should stretch out the every-two-years "new" GPU release and take 4 years to actually make a new GPU gen. Not only would we actually get a proper upgrade, but I can see it being cheaper for them, which could make GPUs more affordable. Overall, with more time they could make a cheaper, better product and just a good experience for everyone

    • @emirobinatoru
      @emirobinatoru 7 หลายเดือนก่อน +29

      This^
      If gpu manufacturers took double the time to focus on generations we would get a better design, better performance etc

    • @0rdyin
      @0rdyin 7 หลายเดือนก่อน +13

      I strongly agree with your opinion.. We need at least a 4-year gap between GPU generations, maybe with a mid-gen refresh in between (2 years after the initial release) to keep the business going..

    • @oliversmith2129
      @oliversmith2129 7 หลายเดือนก่อน +21

      @@0rdyin "mid-gen refresh" that can be the super variants/ Ti after initial launch.

    • @0rdyin
      @0rdyin 7 หลายเดือนก่อน +5

      @@oliversmith2129 That's exactly what I had in my mind.. In the end the GPU industry might be able to have a consistent product cycle similar to console industry..

    • @Technicellie
      @Technicellie 7 หลายเดือนก่อน +1

      This is also where the industry is moving. It used to be every half a year for each gen to come out, then a year, then 1.5 years, and now it is 2 years. So it is actually a trend that has existed for a long time.

  • @Skylancer727
    @Skylancer727 7 หลายเดือนก่อน +123

    If people are gonna keep bringing up the trickle-down effect, well, I'll just point out that the GTX 980 and GTX 1080 were both advertised as 4K GPUs. Turns out that as GPUs get stronger, 4K just gets harder to run, and it has continued to do so almost linearly (pixel-count math after this thread).

    • @z.k.5259
      @z.k.5259 7 หลายเดือนก่อน +18

      Yea, what is the deal with that? I remember getting 160-200 fps in Warzone, then the next time I logged in I got about 50-70 on my 1080ti, and it was a super card. Now I go back to games where I had like 150 fps and they run at 50 fps. I just don't understand, and the GPU max temps in game are like 60°. Did they downgrade my GPU with drivers or something?

    • @Louie-ji8xn
      @Louie-ji8xn 7 หลายเดือนก่อน +21

      @@z.k.5259 not your gpu, it's the constant updates to the games and bad optimization smh

    • @Lifty83
      @Lifty83 7 หลายเดือนก่อน +5

      Totally true. This also applies to 1080p. Games are just getting more demanding visually (and less optimized). So cards that could do 4K 8 years ago with (now) relatively poor graphics can't keep up with the UE5-type titles coming out. It sucks but it was the same back then: 60-series cards were for 1080p, Ti/Titan cards were for 4K.

    • @rcane6842
      @rcane6842 7 หลายเดือนก่อน +1

      @@z.k.5259 have you tried playing those old games with their old updates? Just to see if your GPU is not really at fault (maybe include an old GPU driver as well)

    • @jocap3837
      @jocap3837 7 หลายเดือนก่อน +2

      HEAR ME OUT 4K Medium-Low Settings 🤣🤣🤣
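
One reason 4K "keeps getting harder" is plain arithmetic: 4K is exactly four times the pixels of 1080p, so any extra per-pixel cost a new game adds (heavier materials, RT, etc.) is paid four times over. A quick check:

```python
# Pixels per frame for common gaming resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = resolutions["1080p"][0] * resolutions["1080p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels per frame ({pixels / base:.2f}x 1080p)")
```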

  • @OzzyFan80
    @OzzyFan80 21 วันที่ผ่านมา

    I just found your channel and I really love the information you provide. This video in particular is extremely helpful. They all are, but this one really opens your eyes as to how not to waste money by thinking you have to upgrade every cycle.

  • @POVGOD
    @POVGOD 4 หลายเดือนก่อน +27

    I recently got back into PC gaming. When i was around 14 i had a GeForce 780, now i've gone for an amd 7900xtx so im really looking forward to 4k gaming!

    • @EwanJP2001
      @EwanJP2001 3 หลายเดือนก่อน +2

      Hey! Sorry to bother you but how are you finding the 7900xtx? I'm attempting to build my first PC and figured I'd build 4k powerhouse that'll keep me from upscaling games for a while. If you wouldn't mind, could you also share what CPU and motherboard you've paired it with? I'd greatly appreciate it!

    • @JaggedElk
      @JaggedElk 3 หลายเดือนก่อน +2

      @@EwanJP2001 okay i know you didnt ask me but i paired mine with a 12900k overclocked to 4.9Ghz just to make sure there aint no processor bottlenecking. the motherboard i used is the asus z790 prime wifi, and 32 gb of g.skill ddr5 ram. i aint tested it yet but ill be back in a couple of days tellin u how it works once i install my os

    • @JaggedElk
      @JaggedElk 3 หลายเดือนก่อน +1

      @@EwanJP2001 okay so steam aint workin yet it just wont fetch my user data, but coolmath games is running like a dream

    • @EwanJP2001
      @EwanJP2001 3 หลายเดือนก่อน +1

      @@JaggedElk Instantly sold! Cheers 🤣

  • @kirilivanov5852
    @kirilivanov5852 6 หลายเดือนก่อน +302

    Lol.. First time watching you and I'm pretty surprised to see a YouTuber with an actual functional brain telling the truth that nobody else is telling, just because most of them are paid to advertise new products.. I like the way you play, you've got my subscription here 😏

  • @alextaras7176
    @alextaras7176 7 หลายเดือนก่อน +88

    The first issue is that you usually can't disable Nanite, and there are recent tests by developers where Nanite seems to have worse performance than traditional LODs. The second issue is that devs seem to use UE5 out of the box and rely on upscalers and frame gen instead of optimizing their games. Optimizing the engine you're using used to be the norm some years ago; now everyone chases great graphics and sacrifices performance. It's good that NV pushes the limits - evolution needs to happen (just like in the past we evolved from text to 2D graphics to 3D graphics) - but it also feels like both NV and AMD are holding back tech for profit nowadays.

    • @ethanwasme4307
      @ethanwasme4307 7 หลายเดือนก่อน +3

      i guarantee you that studios aren't just turning nanite on without profiling or shipping their games with default settings...
      rest are fair points

    • @Technicellie
      @Technicellie 7 หลายเดือนก่อน +2

      That is one of the reasons why I vouch for different author systems... Okay, Unity sucks hard thanks to that "fucking idiots"-CEO.
      But there are alternatives out there, and we might see some other engines that run games, while looking beautiful in the future. (looking at you, Godot)

    • @Greybell
      @Greybell 7 หลายเดือนก่อน

      idk how game dev and optimization work but I've always wondered how well Nanite optimizes a game. It sounds good on paper but from reviews so far, UE5 games aren't as smooth. Would be great if there's a slider for how extreme polygons are decimated.

    • @More_Dread
      @More_Dread 7 หลายเดือนก่อน +1

      unreal engine 4 brought deferred rendering to the table, which allowed for a lot more dynamic light sources than previous methods but came at a higher base cost in terms of performance. the same thing is happening now with unreal engine 5: the base requirements are going up, but you can add way more detail without as much of a performance drop compared to older methods. it's not worse performance than traditional LODs overall, it's an insane improvement, but it won't be immediately obvious, especially on older hardware.
      large parts of the industry have operated on a 30fps cinematic paradigm for 2 decades, but the push for higher framerates has gotten bigger and bigger since then, and it's gotten to a point where we sometimes get both: 'cinematic' graphics and 60, 120, 240+ fps.
      readily available complex 3d engines allow less competent developers to use more advanced techniques they would otherwise not have access to. yes, that means some of them will optimize poorly... and yes, that sometimes happens with bigger studios because of... let's say bad priorities.
      but that doesn't mean the absolute number of studios who know how and care to optimize is going down, just that there are more games being made in total.

    • @rremnar
      @rremnar 7 หลายเดือนก่อน

      In Unreal Engine, I turned off Lumen and Nanite for my simple arcade game. It isn't just an option click; you also have to go into the engine console and type out the command to turn it off. Make sure your shaders and models have Nanite turned off as well.

  • @gael3259
    @gael3259 3 หลายเดือนก่อน +1

    It's the first time I've seen a hardware/GPU video with the Pokemon Black and White soundtrack in it; it fits really well

  • @Jakalwarrior
    @Jakalwarrior 5 หลายเดือนก่อน +10

    I think they've gone up so much because they've had to go to huge die designs that probably have terrible yields just to get increased performance.

    • @TechTusiast
      @TechTusiast 2 หลายเดือนก่อน

      Nope, not really. Die sizes are actually quite small with modern techniques. Also, fewer memory lanes are needed thanks to cache memory and faster memory chips, which also makes GPUs cheaper to make. They simply charge a hefty profit on them.

  • @toufusoup
    @toufusoup 7 หลายเดือนก่อน +69

    Even though I’m running a 4070 (upgrade from 1660), I can definitely appreciate FSR3 existing, especially when my pals aren’t as well off as me and they have to sacrifice so much for a game night with the bois. Props to AMD! 🎉

    • @michahojwa8132
      @michahojwa8132 7 หลายเดือนก่อน

      Thanks man

    • @jalapeno.tabasco
      @jalapeno.tabasco 7 หลายเดือนก่อน +5

      yeah, they share their tech and they don't have dirty business practices like intel and nvidia

    • @michahojwa8132
      @michahojwa8132 7 หลายเดือนก่อน +4

      @@jalapeno.tabasco Every corpo is dirty, competition is good.

    • @tumultoustortellini
      @tumultoustortellini 7 หลายเดือนก่อน +4

      @@jalapeno.tabasco In all fairness, intel is making good gpu products. They're sharing tech, and bringing prices back to old-market standard. Once their drivers become standard on out-of-the-box windows, I think they'll be the most consumer friendly.

    • @jalapeno.tabasco
      @jalapeno.tabasco 7 หลายเดือนก่อน +1

      @@michahojwa8132 intel and nvidia were engaging in ANTI competitive practices
      do you know what that means?

  • @Otacanthus
    @Otacanthus 7 หลายเดือนก่อน +15

    The pricing is a self-inflicted wound. Far more people used to upgrade every single generation of graphics card before the new prices. They're pricing out many who had thought about upgrading, so fewer people buy.

    • @Andytlp
      @Andytlp 7 หลายเดือนก่อน

      Upgrading every generation has its benefits. You sell the old card just before the new generation comes out and recoup probably at least 75% of the money spent, and that's after a whole year+ of gaming on it. Then with that 75% plus some cash you get a brand new GPU. You can do the same with the rest of the system, but it's more of a hassle selling the entire PC except for storage. It's easier to replace the GPU every gen and upgrade the CPU every 2-3 gens.

  • @blackout5560
    @blackout5560 3 หลายเดือนก่อน +2

    just upgraded from a 1660 to a 4060ti 16gb. my friend accidentally killed my motherboard when going from 16gb to 32gb ram, and trying to troubleshoot blue screens. Used it as an excuse to upgrade to a brand new motherboard, cpu, and doubled my ssd.

  • @kllause6681
    @kllause6681 6 หลายเดือนก่อน +33

    I feel like the introduction of ray tracing is VERY similar to the introduction of LCD monitors. Yes, it absolutely SUCKS now due to the requirements and cost needed to use it effectively, and it's likely that most people are going to barely use it, if at all, for the next 5-10 years at least, due to the cost. But once we break that threshold I genuinely think it will be something that no one is going to be able to live without, especially once it gets much more advanced (or at the very least easier to run).
    EDIT: corporate greed also absolutely plays a factor here, but when I'm talking about costs I'm not just talking about the 40 series, but also 20 and 30 series cards. Especially with how half the 30 series cards perform on Alan Wake 2 LMAO

    • @bilalbaig8586
      @bilalbaig8586 5 หลายเดือนก่อน +5

      Raytracing is going to revolutionise game development more than the actual gaming experience. Rasterisation may be able to match and even exceed raytracing, but a ton of work is required to achieve a good result, which means only large studios can afford AAA graphics. Raytracing, on the other hand, allows the same level of visuals for much less effort, which means even a small studio can achieve AAA graphics. Today the only advantage big players like Blizzard, Ubisoft, etc. have is that indie studios cannot produce games at scale. Raytracing combined with AI will allow small teams of indie game devs to make large AAA games within their budgets. Raytracing + AI would mean the end of corporate cash grabs and microtransactions. I believe in a few years' time we will see AAA masterpieces coming out of small indie studios.

    • @esotericjahanism5251
      @esotericjahanism5251 4 หลายเดือนก่อน

      @@bilalbaig8586 This guy gets it. Ray tracing actually makes it much easier for devs to create lighting within their games. Raster might require much less compute to actually render, but it requires the devs to fake all these reflections and such, and that's not easy. Post-processing like ENB or Reshade can make this easier, but again it takes a lot of compute, it's a layer added on top of the rasterization, and post-processing can really fuck with upscalers. If devs could work from the assumption that everyone has an RT-capable card, they could focus solely on RT for lighting and make massive software optimizations so it renders much more easily, rather than just using brute-force compute.

    • @Je-kg8up
      @Je-kg8up 3 หลายเดือนก่อน

      People were saying that at RT's release, and of course it is still not really viable except at the top end. There comes a point where you have to ask whether it was really worth all the time, money, effort, and performance cost to get RT working, and not even well. If Nvidia and AMD had focused all that time and effort on traditional rasterization graphics tricks and hardware performance, I think we would be in a much better position than we are now.

  • @thingsiplay
    @thingsiplay 7 หลายเดือนก่อน +9

    My mom upgraded her iPhone 6 recently because the battery was swollen. It was actually dangerous! I'm not an Apple fan, but it's still amazing how well old iPhones hold up to this day.

    • @emirobinatoru
      @emirobinatoru 7 หลายเดือนก่อน +2

      Yea, old iPhones were made with innovation in mind

    • @GenericPast
      @GenericPast 7 หลายเดือนก่อน +3

      I'd still be using my Galaxy S9+ if the battery life wasn't so horrible.

    • @ThePC007
      @ThePC007 7 หลายเดือนก่อน +1

      I recently upgraded from my Note 2 to an XCover 5 after 10 years. Technology can be quite long-lived as long as the battery is replaceable.

  • @Bigi444444
    @Bigi444444 7 หลายเดือนก่อน +245

    Happy to hear such rational thoughts from a young man. You see, it's not that big of a deal to turn a setting or two down. Graphics are so realistic today anyway. Especially for people like me who started gaming on systems like the C64, it shouldn't even be considered a compromise. At the end of the day, you've spent so many hours on games with light-years-worse graphics and still had a great time. Yes, progress is a good thing, but graphics aren't the only thing.

    • @ComeonmenID10T
      @ComeonmenID10T 6 หลายเดือนก่อน +5

      remember "Davids Midnight Magic" ? my C64 experienced the biggest "Interrupt" ever with this Pinball Game, because i had all the lights on, Multi ball going and all 3 Balls went straight out in the middle, took me 5 minutes to find all the Parts / Keys to put it back together

    • @andersonsachetto169
      @andersonsachetto169 6 หลายเดือนก่อน

      I'm sorry, but in my opinion those comments at the end of the video were really bad, because they sounded like some kind of excuse for big tech companies to set their exorbitant and abusive prices.
      They earn billions every month/year and they want even more. That's the big problem.
      Not to mention that those monopoly companies own the entire industry by themselves.
      We cannot say that, just because consumers are buying fewer GPUs every year, they need to raise prices 100% or more. This is just bullshit.

    • @andersonsachetto169
      @andersonsachetto169 6 หลายเดือนก่อน +3

      And of course, I agree with almost everything you're saying in this comment, apart from that first part. It was not a 100% rational thought (my opinion again)

    • @jamsquan9415
      @jamsquan9415 6 หลายเดือนก่อน +5

      now if only turning down graphics actually gave performance increases in modern games. so many are so broken that nothing makes a difference.

    • @MarkLikesCoffee860
      @MarkLikesCoffee860 5 หลายเดือนก่อน +3

      There are not enough settings in my opinion. There should be Ultra Ultra settings and Super Ultra Ultra settings above those. It keeps games relevant for longer, so you can go back to 10-year-old games and max out the settings. That's what ultra settings were invented for: future-proofing. Instead, people get angry if they can't use ultra at release.

  • @apparixion
    @apparixion 3 หลายเดือนก่อน

    I like this dude. keep em coming my man. subscribed! :D

  • @OTBASH
    @OTBASH 5 หลายเดือนก่อน +6

    I was able to get 8 years out of my r9 290 before it finally crapped out and joined the gpu afterlife. The jump to a 7900 xt blew my damn mind. I plan on keeping this card for the next 10 years, but I suppose that will change if games continue to get too demanding to the point where I'LL HAVE to upgrade again sooner rather than later.

    • @ichwalsadam9179
      @ichwalsadam9179 4 หลายเดือนก่อน

      Im curious how do you know that ur graphic card.... Just dead? Like is there a warning pop up or what happened

    • @OTBASH
      @OTBASH 4 หลายเดือนก่อน

      @@ichwalsadam9179 Was playing a game, game started stuttering like crazy, then the picture cut out. Tried restarting my pc, still nothing. I switched hdmi cable to my mobo and got a signal no problem. Made the conclusion my gpu finally gave out.

    • @TheAtkinson118
      @TheAtkinson118 4 หลายเดือนก่อน +1

      @@ichwalsadam9179 Your FPS counter doesn't go above 60 fps on the settings you desire and the games you want to play... At that point the card is useless, I would assume??
      This is me taking a wild guess at someone in this position, as I'm fortunate enough to just do what I like, thankfully, being a grown-arse man.

  • @drCox12
    @drCox12 7 หลายเดือนก่อน +14

    When it comes to RT, Nanites, Lumen... whatever the new feature may be (hardware or software doesn't matter), I remember the good ol' days of the "tessellation wars".
    AMD (then still using the ATI brand) was the first company that sold consumer GPUs with hardware tessellation: The Radeon HD 2000 series. But it didn't catch on because it was proprietary and we all had to wait a little longer for DirectX 11 as the standard API for tessellation.
    Don't hold your breath: DirectX tessellation performance on Nvidia GPUs was huge at that time, and AMD's tessellation performance was relatively "meh" in direct comparison.
    The problem back then was: It was a new feature and the few showcase benchmarks that made use of tessellation just massively overdid it. Those benchmarks weren't really representative of how tessellation was (and is) used in real world games. Today, of course, nobody even asks for tessellation performance of GPUs anymore - we just take it for granted.
    And I think what we see today is that game developers mostly forgot about the art of making reasonable use of new features. They could make new features run on mid-range GPUs (in terms like RT even the last gen GPUs). All they need to do is to limit the scope of RT etc. and only to use it where it really makes a visual difference. But then, as you said in your video: RT often isn't really that much visually superior to modern rasterization.
    In the end it all comes down to managing your expectations. Gamers have always been waiting for the "next big thing" in graphics - the thing that will revolutionize our gaming experience and boost us to whole new levels of immersion. And GPU companies, especially Nvidia, know this and exploit that desire in their marketing. They just start the hype train and give free tickets to the press in order to ensnare as many gamers as possible.
    Btw. PhysX anyone?
    But instead of learning from disappointments in the past, what do many of us do? - Yeah right: Gaze at Cyberpunk which was born as a tech demo and needed years of patching to become an acceptable game.
    /end rant ;)

    • @SianaGearz
      @SianaGearz 7 หลายเดือนก่อน +1

      The tessellation shader has become increasingly slow in relative terms; you're better off doing your geometry generation in a compute shader, and have been since DX11.1-grade hardware. It was a badly designed feature from the get-go.

    • @azenyr
      @azenyr 7 หลายเดือนก่อน +2

      Oh god, I agree with this so much. This is why I currently hate ray tracing in almost all the games that use it, like Cyberpunk, Fortnite etc. It's just extremely overused - reflections and shadows overload. Everything is too dark inside houses, too bright and blown out outside, too many reflections everywhere, too many puddles on the ground on purpose; everything is just too much. Devs should use ray tracing as a small complement to rasterization and only use it in a few cases, keeping the performance impact to a minimum. Games where you almost can't notice the ray tracing are the best ones - they used it as a complement instead of puking ray-traced stuff all over the screen.
      When games finally stop obsessing over ray tracing and start using it responsibly, that's when ray-traced games will become "easy to run". Just like it happened with every past 3D technique before this

    • @SianaGearz
      @SianaGearz 7 หลายเดือนก่อน +1

      @@azenyr I've been enjoying Spiderman Remaster with raytracing on lowest quality. The environment is fundamentally reflective, you can't not represent all that building glass, and grounding it in a low resolution representation of actual environment as opposed to some abstract skybox/probe really helps.
      But arguably the technique has been misapplied there as well. Because for the most part, you can really see just two reflective planes at 90° to each other both vertically aligned. If you have a low resolution environment bake in like sub 30k polygons you can just render two scissor-limited planar reflections and you'd be good with effectively zero performance hit. You can then enhance it with a handful entities rendered in, for example the player character.
      I have also been annoyed at the prevalence and misapplication of a raycasting effect which predates RT hardware - screen space reflections - some of it is good, but seeing the wrong side of player character in head on reflections gets old fast.
      A more interesting take at computational raycasting was in 2012 game Remember Me, and this ran full speed 720p30 on PS3 and Xbox360, and obviously just easily gets into 4k HRR territory on anything more modern. There the reflective surfaces were raycast in the pixel shader from a simplified scene representation consisting of a single inverse convex body or parallax corrected environment map plus additionally billboards for some environmental objects, and player and enemy characters. Character entities used ellipsoid bone representation also treated as billboards for the reflection. This was so good! Obviously you can't do it with open world games quite exactly like that, but i mean it's interesting.

  • @I2ed3ye
    @I2ed3ye 7 หลายเดือนก่อน +11

    Maybe it's just me, but I'm so disillusioned now after years of hype marketing, empty promises, increased pricing, disappointing releases, half-apologies, excuses, and aggressive patch cycles.

  • @FandangosBR
    @FandangosBR 5 หลายเดือนก่อน +6

    How do I use rasterisation?
    I'm a newbie when it comes to frame generation technologies. I have an RX 6750 XT and use a 144 Hz 1080p monitor with an i7 6700K (yeah, I get some bottleneck but it's ok). What should I use? FSR? (Which version? Can you change the version of FSR? Is it Super Resolution?) Rasterisation? Ray tracing? Sorry for so many questions, I just want to balance looks and performance for the best experience. Any help is welcome

    • @kidreaper9360
      @kidreaper9360 4 หลายเดือนก่อน +3

      Raster is just normal rendering; FSR and DLSS use a different technique to render the image at a lower resolution than raster would and then upscale it so it looks as close to native as possible while reducing the performance cost (see the render-resolution sketch after this thread). Since you are using an AMD card, I recommend using FSR when possible to get the best performance

    • @TechTusiast
      @TechTusiast 2 หลายเดือนก่อน

      Get a 27" 1440p monitor; image quality is much better. FSR, DLSS and XeSS are scaling techniques to get a higher framerate at the cost of image quality. Rasterization is when you do not use raytracing, pathtracing or Lumen.
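
To make the upscaling point above concrete: FSR 2 and DLSS quality modes render internally at a fraction of the output resolution and reconstruct the rest. A minimal sketch using the per-axis scale ratios commonly documented for FSR 2 (DLSS uses very similar ones); treat them as approximate, since individual games can expose different values.

```python
# Internal render resolution for common upscaler quality modes at 1080p output.
# Ratios are the per-axis scale factors commonly documented for FSR 2;
# individual games may override them.

output_w, output_h = 1920, 1080   # e.g. the 144 Hz 1080p monitor from the question above

modes = {
    "Quality":           1.5,
    "Balanced":          1.7,
    "Performance":       2.0,
    "Ultra Performance": 3.0,
}

for mode, ratio in modes.items():
    w, h = round(output_w / ratio), round(output_h / ratio)
    print(f"{mode:>17}: renders at {w}x{h}, then upscales to {output_w}x{output_h}")
```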

  • @shiftyeffect6597
    @shiftyeffect6597 5 หลายเดือนก่อน +4

    Good video, but you miss one crucial thing: many devs don't optimize games like they used to. I don't know whether it's due to schedules and release dates or because it takes too long to reach optimization goals, but it happens way too often. Only a handful of AAA games per year should be considered demanding with the hardware we have at hand, and yet many more are released unoptimized.

  • @markothevrba
    @markothevrba 7 หลายเดือนก่อน +26

    I like waiting multiple generations because it feels so nice to bump those settings from low-medium to ultra.
    Recently went from RX 580 to RX 7800XT and man does it feel good. Games that were struggling on low before now run on ultra no problem. I wish it wasn't so expensive sure, but it feels good when the upgrade is such a huge leap.

    • @noLKWD_vital
      @noLKWD_vital 7 หลายเดือนก่อน

      I still enjoy my RX 580 🙂

    • @alinn.4341
      @alinn.4341 7 หลายเดือนก่อน

      I'm thinking of moving from rx580 to RX6800/xt/7800 myself. Problem is I need a 2 slot one xD can't find one anywhere.

    • @insomniacgr1
      @insomniacgr1 7 หลายเดือนก่อน

      Which CPU do you use with the 7800XT?

    • @theradgaming
      @theradgaming 7 หลายเดือนก่อน

      @@noLKWD_vital sold mine last month, card is still a beast even in this day and age

    • @allgunsblazed9106
      @allgunsblazed9106 7 หลายเดือนก่อน

      How are your frames in competitive titles? I play at 1440p and have a Red Devil 7800 XT arriving soon.

  • @peterwstacey
    @peterwstacey 7 หลายเดือนก่อน +38

    Very good points. Graphics have been "good enough" in rasterization for about 5 years now, which is why there's a push towards ray tracing. I upgraded from a GTX 660 to an RTX 3080 last year, but the most recent game I play is from 2018, and it will keep me going for another 10 years or so. I'm happy to play games from 1993 to the present, so there's a huge catalogue I can get through before needing to upgrade

    • @MyouKyuubi
      @MyouKyuubi 7 หลายเดือนก่อน +4

      TES 3: Morrowind.

    • @antonsiberian
      @antonsiberian 7 หลายเดือนก่อน +1

      Interesting. Recently I started replaying the best old games starting from 1997; now I'm up to 1999. And some of them are really great even today, I mean the gameplay (Half-Life, Fallout 2, Total Annihilation etc). I also bought a Sony PS3 for old-school console games as well. I won't need upgrades for a long time 😅
      Classic games for me are now like a cultural phenomenon, not just entertainment or something.

    • @peterwstacey
      @peterwstacey 7 หลายเดือนก่อน +3

      I just finished a game of Civilization II :) just started on CoD WW2. After that, probably do Baldur's Gate 1. Let's see!

    • @antonsiberian
      @antonsiberian 7 หลายเดือนก่อน

      @@peterwstacey I also played Baldur's Gate 1 a month ago, quite liked it! But it's quite hard for me though. Civilization 2 was one of the best in my childhood:)

    • @BastyTHz
      @BastyTHz 7 หลายเดือนก่อน

      The biggest benefit is RT global illumination - that's what makes a real visual difference. You can see things in shade more easily than with normal rasterization.

  • @IfritBoi
    @IfritBoi 5 หลายเดือนก่อน +3

    Part of me just feels like either upgrading by a couple more gens (AMD's FSR3 seems pretty good and their version of CUDA cores seems even better, but NVIDIA's DLSS is super good rn, too) or crossfiring with the 5700 XT I have, because my PC just needs a little push for max presets and 4K. Other than that, I'd upgrade my RAM to 64GB or 128GB depending on the gaming landscape

  • @mohamedbe4942
    @mohamedbe4942 5 หลายเดือนก่อน +2

    I've been using a 1060 up until 2023. I swapped it for a 3060 a few months ago and I don't plan on changing it for a couple of years, at least until I can't run new games at 60fps on medium settings.

  • @PoeLemic
    @PoeLemic 7 หลายเดือนก่อน +16

    Your video is exactly how I see things. I never upgrade my phone; it is probably 3 or 4 generations back. I only moved from a Note 5 to a Note 9 when mine got stolen, and since the Note 9 was the same price as the Note 5, I went with it. But after the Note 5, I don't need many more capabilities. I could still use a Note 5 and be hunky-dory happy. And I still have a GTX 1080 Ti, and I am still 100% happy with it.

    • @LocalDiscordCatgirl
      @LocalDiscordCatgirl 7 หลายเดือนก่อน +1

      I'll probably jump to a Pixel when my current iPhone dies, purely because of camera quality.

  • @flori4551
    @flori4551 7 หลายเดือนก่อน +54

    The major problem is that none of the current midrange graphics cards actually feel future-proof. Nvidia has been very stingy with VRAM, and AMD seems to be years behind in features (FSR is worse than DLSS, no ray reconstruction). When you buy a 4070/7800 XT now, you cannot reasonably expect it to last another 7 years. What's even worse is that all the allegedly life-prolonging features are, or may be, exclusive to current hardware generations, and who knows whether the next big thing (e.g. AI texture compression) will be available on current gen?
    If a hypothetical DLSS 4 and FSR 4 "require" new specialized AI cores to run, both the 4070 and 7800 XT will fall off quickly. We saw a similar development with the 3070: after only two years on the market, its 8GB of VRAM has already become insufficient for certain games on higher settings, and it did not get DLSS 3 support. While a 3070 is perfectly adequate for the time being, I don't see it lasting another 5 years. Nvidia (and AMD to a lesser extent) build planned obsolescence into their GPUs by limiting VRAM (and bandwidth on the current 60-class cards) and locking older cards out of current features.

    • @vitordelima
      @vitordelima 7 หลายเดือนก่อน +19

      And all of those issues could be easily prevented. It's planned obsolescence.

    • @borgy7085
      @borgy7085 7 หลายเดือนก่อน +1

      BS

    • @SanctusBacchus
      @SanctusBacchus 7 หลายเดือนก่อน +14

      >mid-range
      >future-proof
      Pick one.

    • @chrisk3127
      @chrisk3127 7 หลายเดือนก่อน +5

      @@SanctusBacchus My 1070 Ti is starting to struggle with newer games, and I run at 1440p for all but 2 games I've played. Six years for a mid-tier card to start struggling is pretty good lol. I feel like the 40 series will be on the outs in 3-4 years.

    • @PrefoX
      @PrefoX 7 หลายเดือนก่อน +1

      If it's really your hobby, you have the money to spend, and who the fuck is using a GPU for 7 years? If you do, just play old games... stop holding back new games.

  • @TheThorns
    @TheThorns 4 หลายเดือนก่อน

    FSR plugins and MSI Afterburner got me through the video card shortage during the pandemic. I was stuck on a 1050 Ti with 4GB of VRAM and refused to pay flipper prices.

  • @orionmec
    @orionmec 5 หลายเดือนก่อน

    Good explanation and analysis. I just built a new computer to be able to fly combat jets in DCS, and boy did I pay for it. I see this PC lasting way down the road, barring a meltdown.

  • @jerrycurls88
    @jerrycurls88 7 หลายเดือนก่อน +78

    Precisely why I bought a 6800 XT last fall. Rasterization is king to me, and I fully expect to be able to play games on very high settings @ 1440p for 5 years. Then I'll see what the GPU landscape looks like. Hopefully Intel gets their sh*t together and we have a three-horse race for our dollars.

    • @kilroy5680
      @kilroy5680 7 หลายเดือนก่อน +2

      literally me fr but in February

    • @MyouKyuubi
      @MyouKyuubi 7 หลายเดือนก่อน +12

      I bought a whole new PC with the 6800 XT, lol... Upgraded from a 2080 Ti, which was kind of a waste of money, to be honest. The 6800 XT runs like a dream, and runs as cold as a viking winter, regardless of what game I play or what graphics settings I use... the fans are at minimum speed, and it's just chillin' out.
      RT was not worth it, raster looks good enough, and gaming finally feels smooth and fun again... 'cus that 2080 Ti was performing poorly even with RT disabled, because it was basically a stripped-down GTX 1080 with fewer tensor cores, to make space for RT cores that were barely used, since most games don't have ray tracing... -.-'
      I paid 1200 for that 2080 Ti, lol... absolute scam. And before that, I was running dual GTX 680s in SLI... but since SLI support was discontinued, I had to deal with really bad performance for several years, 'cus I could only run ONE 680 at a time before the 2080, and then still deal with poor performance on the 2080 because of the problems I mentioned... With the 6800 XT, gaming finally feels blissful again... fck, I've been dealing with shtty performance for 10 years now, holy crap. xD

    • @jimch.9961
      @jimch.9961 7 หลายเดือนก่อน +2

      Intel delivered on the A580, so Battlemage is gonna be pretty good, since they're putting in the work needed to compete.

    • @cyberbillp
      @cyberbillp 7 หลายเดือนก่อน +3

      @@MyouKyuubi Yup, 6900 XT here. AMD is king of 1080p displays. I use a 21:9 ultrawide and it runs like a dream: 100 to 150+ fps on ultra in every game. The price is 30% of what it was a year ago. Great time to buy a great card.

    • @ladyinwight
      @ladyinwight 7 หลายเดือนก่อน +1

      Intel's focus on DX12 and some other things makes me really hopeful, but right now Intel GPUs are pretty much just enthusiast early-adopter stuff and unfortunate inclusions in pre-built PCs lmao

  • @armoredsquirrel946
    @armoredsquirrel946 7 หลายเดือนก่อน +54

    I love how every time we talk about how hard it is to run "modern gaming" with current hardware, we always end up watching mostly bad-to-mid-af games for the benchmarks 😂

    • @LordOfChaos.x
      @LordOfChaos.x 7 หลายเดือนก่อน +36

      Elden Ring is not a next-gen graphics game, but it looks prettier than all the games of the last 2 years just because it has a consistent art style and good use of color.
      That's proof we should give up on this bullshit ultra-realistic graphics chase and focus on gameplay.

    • @armoredsquirrel946
      @armoredsquirrel946 7 หลายเดือนก่อน

      ​@@LordOfChaos.xAGREEEEEE

    • @armoredsquirrel946
      @armoredsquirrel946 7 หลายเดือนก่อน +6

      @@LordOfChaos.x Also, there are things like the RE4 remake, which both looks awesome and runs better than other games WHILE being a really fun and polished experience.

    • @LordOfChaos.x
      @LordOfChaos.x 7 หลายเดือนก่อน

      @armoredsquirrel946 That's how a game should be, and then it's definitely worth 70 euros.

    • @Old_Gregg
      @Old_Gregg 7 หลายเดือนก่อน

      @@LordOfChaos.x Totally agree! Having both as an option would be nice too tho 😅

  • @XTSonic
    @XTSonic 6 หลายเดือนก่อน +1

    Makes sense. Upgraded from 1080 to 4090 recently, and hope to run that one for 6-7 years as well

    • @poopoppy
      @poopoppy 5 หลายเดือนก่อน +2

      That is a mega-upgrade. It must have felt amazing... for about a week, and then you got used to it :D

    • @XTSonic
      @XTSonic 5 หลายเดือนก่อน

      @@poopoppy Yea, definitely. Also upgraded from a 28" 1080p monitor to a 51" 5120x1440 (Neo G9).
      Still love it, both for work and for games :) Can't go back now.

    • @LuciferZMorningStar
      @LuciferZMorningStar หลายเดือนก่อน +1

      @@XTSonic Brother, that card will hopefully run for a decade. I'm trying to do the same with my laptop to see if I can keep it going that long.

  • @angelosfyrigos5404
    @angelosfyrigos5404 6 หลายเดือนก่อน +3

    The problem comes as rasterization techniques get less and less polished while more games incorporate ray tracing. You can already see it in many recent titles, e.g. the shadows in COD. Developers are starting to give less priority to making a flawless rasterized world and are relying on ray tracing to do the job. The setting to disable ray tracing will be here for many years, but I think games will start to look even worse with it deactivated.

    • @RandomFandomOfficial
      @RandomFandomOfficial 6 หลายเดือนก่อน +1

      Games already look worse with modern AA techniques, hence the existence of subreddits like r/fucktaa

  • @PaulRoneClarke
    @PaulRoneClarke 7 หลายเดือนก่อน +51

    I used to upgrade every gen. I've done it since the 3dfx Voodoo in the 1990s; from 1998 to probably 2018 I upgraded maybe 15 times, mostly Nvidia, but I've had a couple of AMD cards as well.
    But since 2018 the selling point isn't "You need to upgrade to play this"; instead the selling points are specific uplifts in areas I'm already perfectly happy with:
    "More FPS" - I'm happy with 60.
    "Support higher resolutions" - I'm happy with 2560x1080; my eyesight can't appreciate much higher pixel density than that even on a huge screen.
    "Ray tracing" - Meh!!
    All the games I want to play run well on my now 3-year-old graphics card, and I see no games I want to play coming in the next year or two to change that. Three generations now seems about right.

    • @RmX.
      @RmX. 6 หลายเดือนก่อน +3

      Oh man, a 3dfx Voodoo costs like $10,000 now

    • @siRrk1337
      @siRrk1337 6 หลายเดือนก่อน

      I recently got my first 165Hz monitor and it's night and day. You're missing out, friend :D I'm running my 5-year-old 2080 with it. Ditch high settings; high FPS is THE SHIT.

    • @PaulRoneClarke
      @PaulRoneClarke 6 หลายเดือนก่อน +1

      @@siRrk1337 My son has an Iiyama 240Hz monitor and I've used it. Above about 90 fps I can't see or sense anything. I know people can, but I can't.
      But I am almost 57 years old. It might be night and day for someone younger with better reactions, senses and eyesight, but for me it offers no value whatsoever.

    • @siRrk1337
      @siRrk1337 6 หลายเดือนก่อน +1

      @@PaulRoneClarke When I played on my friend's 144Hz monitor, I also thought there wasn't much of a difference, but somehow when I switched on my own PC, the difference was stark. I even noticed animations I'd never noticed before, like Chrome tabs bouncing a little and so on. My PC just feels more responsive, everything is more fluid. It's still new for me, so I'm still hyped :D
      I always had good vision though, so I can't really relate; my intuition was that bad vision would be more about spatial "resolution" than temporal. But now that I think about it, it makes sense :) TIL!

    • @PaulRoneClarke
      @PaulRoneClarke 6 หลายเดือนก่อน +3

      @@siRrk1337 Yes - I had good vision until I was almost 50, so I can relate. I've also tried a 27" 4K screen; the extra detail is wasted at that pixel density for me.
      Part of the issue is also the cost-benefit.
      A PC that runs games at my favoured res (2560x1080 - ideal for 2 documents at work, and I love ultrawide gaming) on a 33" screen at medium or high settings and 60 fps is, by and large, about £1000 in the UK.
      To get 144Hz on a 1440p screen at medium and high settings I'd need a more expensive monitor and a far more expensive PC. Double the price, if not more.
      So the value proposition of "Hmm, it might feel a little bit better if I look closely and really think about it" isn't worth an extra £1000. In fact, in my use case, it's not worth an extra £100.
      But the "me" of 20 years ago would have gone for it, and been happy with it. :) So I completely get where people are coming from.
      Also, I play single-player games only. I'm from a generation who turned on the PC to get away from things. I have a very active social, family and sporting life; my PC is a refuge away from people. So the competitive advantage in multiplayer also doesn't apply to me, as I don't play multiplayer games.
      Again though, I realise I'm a minority and understand the edge a faster frame rate might give to good players in multiplayer games.
      I get it.
      It's just not for me.

  • @AndehX
    @AndehX 7 หลายเดือนก่อน +60

    This is hilarious to me, because I called this all the way back in 2018 when RT first appeared. My first reaction when it was announced was along the lines of "There's no way graphics cards are going to be able to do real-time lighting effects at any kind of playable performance." That opinion came from playing Quake source ports (yes, Quake 1) like DarkPlaces, which have real-time lighting effects; even on the fastest cards at the time (1080 Ti), DarkPlaces was bringing those cards to their knees, and that was just Quake 1. The vast majority of gamers are just going to bypass RT effects until GPUs can run them at the same, or similar, performance to standard rasterized lighting. That's just the facts.

    • @cy-one
      @cy-one 6 หลายเดือนก่อน +6

      Yes and no. It's just a matter of time. Considering what's happening, I'd assume 5-10 years tops.
      But yeah, currently it's just not worth it. I went with a 7900 XTX instead of a 4080/4090 for a reason :D

    • @nigosan8311
      @nigosan8311 6 หลายเดือนก่อน +4

      Minecraft with RT looks great though.

    • @tstager1978
      @tstager1978 5 หลายเดือนก่อน +3

      @@cy-one My 7900 XTX handles RT pretty well and its rasterization is second best. You can't lose for $600+ less. Its rasterization performance lifts its RT performance to acceptable levels.

    • @pontiacgtx4801
      @pontiacgtx4801 5 หลายเดือนก่อน +1

      Ehm, BF1 was struggling so much to achieve the expected performance that they had to reduce the amount of rays for the game.

    • @brando4526
      @brando4526 5 หลายเดือนก่อน +2

      The ray tracing in games is not pure ray tracing; it's a hybrid between rasterization and ray tracing.

  • @indosuprem2296
    @indosuprem2296 6 หลายเดือนก่อน +3

    4:55 I am in the camp of holding graphics at the level of PS4-era RDR2 and just improving gameplay, animation, optimization, and interesting game mechanics.

  • @flyingturret208thecannon5
    @flyingturret208thecannon5 6 หลายเดือนก่อน

    This reminds me of a conversation I had with my teacher; I was wondering why serial transmission won out over parallel, despite parallel communicating more bits at a time.

    • @freelancerthe2561
      @freelancerthe2561 5 หลายเดือนก่อน

      Same reason we have multi-threaded CPUs, but less than 25% of the software we use takes advantage of them effectively. Parallel systems have a lot more synchronization problems than serialized ones. Ergo, it's a lot easier to scale up serialized jobs, or brute-force your way to success. But there's still a wall (usually physics) that sets an upper limit on what a serialized system can achieve, which then forces you to go wide to keep advancing.
      But in an ironic twist, what we've ended up doing is using parallel systems to enhance a serialized one. For example, USB has 2 data lines but only one data stream: the same data is sent out on both lines, encoded in a way that lets the hardware detect and deal with errors.
      If there's any place where parallel transmission is used most heavily, it's in radio systems. And that's largely because you can't just get "faster speed" in a given frequency band; you can only make it more information-dense.
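      To make the synchronization point above concrete, here is a tiny, hedged sketch (mine, not from the video or this thread): one shared counter split across threads forces every update through a lock, so the "parallel" version can easily lose to the plain serialized loop.

```python
# Minimal sketch of synchronization overhead: a serialized loop vs. the same
# work split across threads that must coordinate on shared state via a lock.
import threading
import time

N = 1_000_000

def serial_sum():
    total = 0
    for _ in range(N):
        total += 1
    return total

def parallel_sum(workers=4):
    total = 0
    lock = threading.Lock()  # shared state forces every thread to synchronize
    def work(count):
        nonlocal total
        for _ in range(count):
            with lock:       # each increment waits its turn on the lock
                total += 1
    threads = [threading.Thread(target=work, args=(N // workers,))
               for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return total

for name, fn in [("serial", serial_sum), ("parallel with lock", parallel_sum)]:
    start = time.perf_counter()
    fn()
    print(f"{name}: {time.perf_counter() - start:.2f}s")
```
      (Going wide only pays off when the workers rarely have to coordinate, which is exactly why scaling serialized jobs is the easier path until you hit the physical wall mentioned above.)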

  • @aeonmazer9662
    @aeonmazer9662 7 หลายเดือนก่อน +262

    I personally feel RTX 30 and RDNA 2 are gonna be the last two great GPU generations.

    • @tourmaline07
      @tourmaline07 7 หลายเดือนก่อน +14

      RDNA was good for AMD, but Ampere was a terrible gen in terms of performance uplifts, notwithstanding the fact that video cards were money printers then too.

    • @CodeStrife7
      @CodeStrife7 7 หลายเดือนก่อน

      @@tourmaline07 lmao that's such a lie, Ampere (apart from the 3050) smoked Turing; the 3070 and 3080 were significantly stronger than the 2070 and the 2080, so quit pulling information out of your ass

    • @jibcot8541
      @jibcot8541 7 หลายเดือนก่อน +22

      An RTX 4090 is great, but only if you are in the wage bracket that can afford to spend that much on what is basically a toy.

    • @tonyz1121
      @tonyz1121 7 หลายเดือนก่อน +8

      @@jibcot8541 A Lamborghini or a Ferrari is a great car, but I wouldn't drive one to work for a basic point-A-to-B trip.
      Of course everyone can afford some GPU, but the issue people are talking about now is price-to-performance.

    • @MrPhyxsyus
      @MrPhyxsyus 7 หลายเดือนก่อน

      I own a 3070; I was lucky to buy it at MSRP. The 4000 series made me go full Linus Torvalds: fuck you, Nvidia. It's clear that the 2000 and 3000 RTX series could use DLSS 3; Nvidia is just being assholes. FSR3 FTW.

  • @IAmNotJustJess
    @IAmNotJustJess 7 หลายเดือนก่อน +4

    Not only that, but the lower graphical settings are getting to look quite incredible too! With the exception of texture resolution, though!

  • @HokeyBugle
    @HokeyBugle 4 หลายเดือนก่อน

    Been running the same card for about a decade (although it was flagship at the time)
    Finally considering an upgrade

  • @ickn2005
    @ickn2005 7 หลายเดือนก่อน +10

    I'm still using a Zotac 1080; it has been a great card for me. I don't play too many demanding AAA games, mostly the Diablo series and Division 2. Even Dead Island 2 I was able to play at 1440p on mostly ultra. I wanted to get a 3080 back in the day, but, well, scalpers jaded me. Then I wanted a 4080, but I refuse to pay those MSRPs. So now I just stay in my lane and don't feed the beast. But I would eventually love to upgrade into ray tracing one day. Solid video, friend.

  • @kevinrowley55
    @kevinrowley55 7 หลายเดือนก่อน +78

    I went from 7th gen to 13th gen Nvidia, from two GTX 760s in SLI to an RTX 3060. I'm totally fine with mid-range cards; I think they're just right for me, and I can get about 4-5 generations out of one, maybe less, like 3-4, but still decent.

    • @Purplehain
      @Purplehain 6 หลายเดือนก่อน +6

      I just went from 9th gen to 14th gen since I encountered the first games that couldn't even be played lag-free on the lowest settings 😂

    • @Purplehain
      @Purplehain 6 หลายเดือนก่อน +5

      My CPU is even worse; I went from a 4th gen i7 to a 14th gen i7,
      and from 16GB of DDR3 to 32GB of DDR5.
      I'm curious how long that shit will last this time; technology develops faster and faster each year.

    • @DomenG33K
      @DomenG33K 6 หลายเดือนก่อน +3

      @@Purplehain It is slowing down and has been for a while. What happened with the GTX 900/1000 series is crazy by today's standards...

    • @npip99
      @npip99 6 หลายเดือนก่อน

      Wow. I built a PC in 7th grade with a GTX 760 and I'm in my mid-20s now. That's a big gap for a card, but I won't lie, I tried that PC a few years ago and had no issue playing all of my games on reasonably high settings.

    • @Purplehain
      @Purplehain 6 หลายเดือนก่อน

      @@npip99 Then I guess all your games are from before DirectX 12 was a thing 😂

  • @ExecutionerDan
    @ExecutionerDan 4 หลายเดือนก่อน

    Love the pkmn white music in the background

  • @hesyo1656
    @hesyo1656 2 หลายเดือนก่อน +1

    I went from a GTX 750 Ti to an RX 5600 XT and it's great. With the new FSR coming, I feel like I'm gonna keep it for another 2-3 years.

  • @System0Error0Message
    @System0Error0Message 7 หลายเดือนก่อน +29

    The reason for the performance problem is both the proprietary nature of Nvidia's features and the lack of RT cores. Intel, on the other hand, took a more balanced approach with more hardware-specific units per shader than Nvidia or AMD. AMD spams shaders first and adds hardware-specific units later. I like the Intel GPU path for its more balanced approach, which is similar to what some mobile SoCs do: dedicated processing units and a more balanced split of performance across every part, while some SoCs spam CPU and GPU but lack dedicated cores for other tasks. The gaming experience can differ, but it also depends on how easy it is to use these dedicated features in software.

  • @Art_Vandelay_Industries
    @Art_Vandelay_Industries 7 หลายเดือนก่อน +16

    Some good points. But with regard to the upgrade cycle and pricing, I feel it's a bit of a chicken-and-egg thing. I suspect GPU makers increased prices in the midst of the crypto mining craze followed by the corona lockdowns, and used that opportunity to keep prices high in order to make more profit. This, combined with inflation and less money available to spend on things like computer upgrades, leads to consumers being hesitant to upgrade as frequently as they maybe did in the past. And this in turn is a good reason for them to stretch the time between meaningful releases for a higher markup.

    • @Bojeezy
      @Bojeezy 7 หลายเดือนก่อน +2

      I believe it is just overall greed by Nvidia and AMD. If they come out with a decently priced card, people will buy it.
      Look at the recently released 7800. It is actually sold out. What if AMD had priced the 6000 series a little better? They would have more sales and market share, but this is AMD we are talking about; it took them almost a whole product stack to get the price correct.
      At least, unlike Nvidia, AMD is more willing to lower prices or adjust to market demand.

    • @MyouKyuubi
      @MyouKyuubi 7 หลายเดือนก่อน

      Well, the initial price boom was primarily because of the silicon shortage caused by the quarantine that forced all the silicon miners to stay at home, NOT mining.
      That meant LESS silicon in circulation, making it more expensive to purchase AND harder to make graphics cards, which bumped up prices... And there was no telling when the silicon would start circulating again, so they had to adjust prices for that as well, in order to soften the blow in case the silicon ran out completely at any point.
      And of course, on top of all of this, scalpers and crypto miners were finding ways to bypass the normal purchasing procedure in order to secure truckloads, LITERALLY TRUCKLOADS, of graphics cards for themselves before they even hit the shelves, via back-door deals and sht.
      ALL of these factors come together to form a perfect recipe for disaster. :P

    • @markbrettnell3503
      @markbrettnell3503 7 หลายเดือนก่อน +2

      You're 100% on the money with this statement. We all forgot that when the 40 series came out, people weren't buying them, so they decreased production to artificially keep GPU pricing up, rather than producing more to lower prices like they have to now, because no one's really buying them still 😂

    • @Bojeezy
      @Bojeezy 7 หลายเดือนก่อน

      @@MyouKyuubi I am sure not having enough silicon around may have been a factor, but there isn't a mining boom anymore and GPUs like the 4090 are still priced at $1,600 if not more.

    • @MyouKyuubi
      @MyouKyuubi 7 หลายเดือนก่อน

      @@Bojeezy Uh, yeah, that's normal... the 90 models are the equivalent of the "Titan" models from before the RTX cards came out; they're not gaming cards, they're game-developer cards, and they cost exponentially more than the regular gaming cards do. :P
      You want a 4090 for 500 dollars? You're delusional... And even then, the 4090 is 400 dollars CHEAPER than the old Titan cards were. : /
      The 4080 and below are the cards meant for actual consumer gaming... Those are the prices you should be looking at, and they have objectively decreased since the quarantine.

  • @indus3270
    @indus3270 4 หลายเดือนก่อน

    My oldest gaming rig still sports a Radeon R9 390, which incidentally has been used for crypto mining, and though I wish I hadn't done that, it can still run relatively modern games like RDR2 and Satisfactory at "fairly high" to highest settings, though I now have to crank up the airflow in my case quite a bit so as not to throttle my entire system with lingering heat. The card just doesn't do ray tracing, which really isn't an issue for me, since I'm usually too focused on my in-game goals to notice how pretty the light looks... What bothers me is that I was looking into a custom water-cooling loop for my system, which I might have been able to build for less money than a high-end AIO cooler, if it weren't for the fact that I can't find any GPU blocks that fit my card anymore. Most of the companies that offer custom water-cooling parts seem to have jumped on the bandwagon of making perfectly capable hardware redundant, because the spare parts just aren't being made available anymore...

  • @rolac6375
    @rolac6375 หลายเดือนก่อน

    First time I've heard this viewpoint on GPU prices. It is spot on and convincing.

  • @thingsiplay
    @thingsiplay 7 หลายเดือนก่อน +7

    Last month I built a new computer and upgraded my GTX 1070 to a new AMD RX 7600. Not the biggest jump, even after all those years and generations. It is meant to be a stopgap solution only, and I will probably upgrade to a faster card next year.

    • @navisoul-oi8mo
      @navisoul-oi8mo 7 หลายเดือนก่อน +6

      Not the biggest jump, but noticeable. However, the 1070 lasted quite well.

  • @springbokspringbok3249
    @springbokspringbok3249 7 หลายเดือนก่อน +7

    I went from a 1070 to a RX 7800 XT, and I'm definitely not going to upgrade for the foreseeable future, it's a huge chunk of money. The only thing that might make me upgrade is energy efficiency. I would have to run the numbers, though, and see what energy prices do.

    • @LocalDiscordCatgirl
      @LocalDiscordCatgirl 7 หลายเดือนก่อน

      I jumped from a 965M to a 5700 non-XT, then to a 3080 Ti due to driver issues.
      The 3080 Ti has been undervolted and still decimates whatever I throw at it. I plan on testing the 5700's drivers again; I might pass it down to one of my brothers.

    • @Rain1
      @Rain1 7 หลายเดือนก่อน

      @@LocalDiscordCatgirl you just reminded me, I need to undervolt my recently purchased used 3080

    • @LuciferZMorningStar
      @LuciferZMorningStar หลายเดือนก่อน

      What does undervolting exactly do? I have read many threads but I'm still confused about its process, pros and cons. @@LocalDiscordCatgirl

  • @ISAK.M
    @ISAK.M หลายเดือนก่อน

    I have been thinking about this for a very, very long time; rasterization is just at its limit now. That's just how it is; development in pretty much everything is reaching its limit, really.

  • @IsDaveGaming
    @IsDaveGaming 3 หลายเดือนก่อน +1

    I updated my PC for the first time in 5 years: went from two 1080 Tis in SLI to an RTX 4090, and from an i9-7920X to a 14900K.

  • @GD-zd4tj
    @GD-zd4tj 6 หลายเดือนก่อน +12

    One more thing I have an issue with: they should focus on delivering similar performance while using less power or resources, and/or more performance for similar power draw. Right now both power draw and performance go up, and that shouldn't be the case in the name of "next gen", and of course at the same price as the last gen.

    • @disketa25
      @disketa25 3 หลายเดือนก่อน

      Who cares about power draw in the liquid-cooled era, if we aren't talking about laptops? For me, anything below 2000W peak consumption would be okay.

    • @greatexpectation6456
      @greatexpectation6456 2 หลายเดือนก่อน

      @@disketa25 Most of us aren't even using 360mm or 240mm liquid coolers because we don't really use overclocked CPUs; a non-overclocked i7 is more than enough. However, for AMD you need a 360mm for every CPU, even for a Ryzen 5 to some extent.

    • @greatexpectation6456
      @greatexpectation6456 2 หลายเดือนก่อน

      @@disketa25 Bro, you are rich, therefore you have a 2000-watt power supply. A 2000W PSU is also rare and expensive, the kind of thing someone like Linus owns.

  • @ProtossOP
    @ProtossOP 7 หลายเดือนก่อน +3

    9:26 A 4080 for $12,000, damn that's expensive (I know it's a typo, still funny though).

  • @tunnlrat3
    @tunnlrat3 3 หลายเดือนก่อน

    You talk a lot about RT/Lumen and rasterization, but you are running everything at 1080p. Once you move to 4K gaming, will the older cards still keep up? I don't know if DLSS or FSR etc. will help if you are looking to move beyond 1080p.

  • @SohaibBeddi
    @SohaibBeddi 4 หลายเดือนก่อน

    My first GPU was an XFX Nvidia 5200... I have an RX 580 now and was going to upgrade this year, and you just saved me a lot of money, ty... I feel smarter after watching this video.

  • @davk
    @davk 7 หลายเดือนก่อน +5

    We upgrade devices less often because the prices have increased, not the other way around! It has nothing to do with how many new features they have.

    • @SianaGearz
      @SianaGearz 7 หลายเดือนก่อน

      It's both. Like if the new GPU or device isn't giving you any fundamentally new capability, like it's maybe 20% faster, but it doesn't unlock any software that you couldn't use otherwise, doesn't give you something new to admire either with new features, do you buy it, do you buy it at any price? You already spent those let's say $300 many years ago on a previous generation device. You say charging $500 for such a device today is too much, and you're absolutely right. But will you spend $300 again today to get an experience that is effectively the same? Absolutely not, why throw money to the wind, even though it would have not gotten more expensive. What if you had to pay $150 or even $100? Still kind of a pointless expense, still very unattractive, you'll just continue using your old device, because there isn't enough upgrade pressure. Sure if they gave it to you for free, you would upgrade, or for very little money, but it's fundamentally impossible, there is too much of an underlying cost to the device.

  • @titan64xl48
    @titan64xl48 7 หลายเดือนก่อน +14

    Graphics are the least important thing to me, because what I see as important is the BALANCE between graphics, art style, and performance. If those three things aren't handled well, then we have a problem.

    • @destroreact5706
      @destroreact5706 7 หลายเดือนก่อน +1

      Exactly. I don't want to look at a PowerPoint slide with my $500+ GPU. I want smooth as well as good-looking gameplay, not an extremely stunning still image.

  • @erzfeind4151
    @erzfeind4151 6 หลายเดือนก่อน +1

    Chillin' on a 7-year-old GPU and a 5-year-old smartphone, and still no need to upgrade.

  • @Boosted0ne
    @Boosted0ne 5 หลายเดือนก่อน +1

    I still use a GTX 980 in one of my gaming rigs. That card still runs well, even on an older CPU. Pretty impressive, actually, considering the card is 10 years old now.

    • @freelancerthe2561
      @freelancerthe2561 5 หลายเดือนก่อน

      I had an Asus G20 where the only upgrade was a GTX 1060. That's a 4th gen, circa-2014 Intel CPU in service for around 8 years. I only upgraded to a 12th gen Intel and an RTX 4080 THIS year, because I got tired of waiting for prices to hit sane levels. This upgrade was 5 years on the back burner. I fully expect this system to last me another 8, because practical game performance in modern engines is ass-backwards in how the bottlenecks happen.

  • @Shinesart
    @Shinesart 7 หลายเดือนก่อน +16

    The only reason I upgraded to an RTX 2060 was for rendering with UE5 and Twinmotion, and it's just the minimum requirement. I don't usually play games with RT on. I'd rather play at a higher frame rate with nice-enough looks than at a lower frame rate with the bells and whistles.

    • @sourpenguin360
      @sourpenguin360 7 หลายเดือนก่อน +4

      I upgraded my GT 210 to a 2060 lol

    • @onkarmohite7560
      @onkarmohite7560 7 หลายเดือนก่อน

      @@sourpenguin360 get G 100 from 2008

  • @strawberrysunburst6113
    @strawberrysunburst6113 7 หลายเดือนก่อน +6

    With all the crazy recommended settings new games require, I believe only the rich will be able to play newer titles in the near future. I just hope they don't forget the people still stuck using 1000 series Nvidia GPUs and 500 series Radeon RX GPUs.

    • @Brothers-rv7ht
      @Brothers-rv7ht 7 หลายเดือนก่อน +2

      Bruh, 1000 series.... I get that the hardware charts say it's the most popular, but you just have to accept those people aren't playing new games.

    • @TheQoogle
      @TheQoogle 7 หลายเดือนก่อน

      The 1000 series can still pull off some newer games pretty well; you know settings menus in video games are a thing, right? I bought a new PC only this month; I was on a 1060 since 2016. @@Brothers-rv7ht

    • @strawberrysunburst6113
      @strawberrysunburst6113 7 หลายเดือนก่อน

      I know, they are probably from Chinese internet cafes specializing in esports games. I do want to play triple-A, newer titles, but I am unable to because of hardware limitations. The only recently made modern game I have played on my RX 580 was Mafia: Definitive Edition, and it ran smooth as butter, though it isn't that modern anymore since it's about 3-4 years old. But damn, I hope they turn those graphics down a bit so we can play these games and they don't turn out to be exclusive to high-end components. @@Brothers-rv7ht

    • @Koleuz2
      @Koleuz2 7 หลายเดือนก่อน

      @@Brothers-rv7ht Wrong.
      I have a 1000 series card, and I've been playing new games consistently over the past years. Doom Eternal in 2020, Elden Ring in 2022, BG3 and ACVI more recently, and it all runs fine (Doom Eternal runs on 60+FPS on Ultra settings as well as ACVI and Elden Ring if I take the steps necessary to bypass EAC - problem is, if I do that I can only play offline. But even with EAC I can still run on Ultra, only at 40-50FPS tops - only BG3 requires me to lower my settings to mid-low for increased performance). The idea that the only people playing the newer games are those that are buying new graphics card every year or so to stay "up to date" is nothing but exactly the kind of misinformation these corpos want you to believe. 1000 series cards are still perfectly fine to play most games and it wasn't until most recently, where some companies and developers are pushing for underperforming but extremely power consuming "features" that some games are getting pretty taxing on older systems (and, in some cases, even on current high-spec systems as well).
      What we are seeing is called planned obsolescence. Plenty of games in the early 2010's looked just as good if not better than many of the games that are releasing today, and they run infinitely better, even by those times standards (having the early 2010's equivalent of a 4070 or 4080 netted you better performance on a new game with top of the line graphics than it does nowadays, and back then, no upscaling was needed to even run them at their highest settings and resolutions).
      What the GPU companies and AAA developers want is to force all these new "features" down your throat (basically making newer games extremely hard to play or even outright unplayable on older hardware), forcefully phasing out old tech to force you to upgrade, basically taking a page from Apple's disgusting commercial practices. And they rely on the so-called "gamers" who treat a hobby as more of a drug, willing to constantly waste hundreds if not thousands of $ every year for the next big top-dog hardware for that extra 5 FPS in upscaled 4K resolution (yaaaay), because not being able to play the newest game at 200FPS is enough reason to jump off a building (of course, I'm exaggerating here, but you get the gist of it). I already read some comments on other videos saying that gaming is a luxury hobby, like wtf? Are these people on drugs? Games have never been a luxury, and way back in the day anyone could have a Famicom or Super Famicom. I had a friend in high school who got his PS2 2 years earlier than I did, and his family was poorer than mine. Gaming has always been more of a niche hobby, but a very accessible one, and now we have "gamers" trying to pass this hobby off as some kind of "rich and elite" pastime, lol.
      All it takes for these practices to fail extremely fast and hard is for people to just not do what they want: don't upgrade your system just for the sake of the new underperforming, power-taxing "features" that do jack-shit, only upgrade if you really have to, and definitely don't upgrade just because a new game requires you to do so to even be playable. If everyone, or the vast majority of people, did this, the profits of these GPU companies and AAA developers would go into the red pretty quickly, and they would have to go into damage-control mode and attempt to optimize said games to be playable on older hardware so as not to lose too much profit (some of these newer games requiring insane recommended hardware specs could easily be playable on older hardware if only the devs actually worked on optimizing their games).
      Problem is, many "gamers" are so up their own asses on this whole "stockholm syndrome" situation of actually applauding these practices that they will outright feed the monster that wants to milk them for all their worth rather than treating them as a customer. Pretty much the same with the players that go "but developers need to eat" when they run in defense of microtransactions in games that already sold enough to rake in millions of $ (looking at you Diablo IV).
      The good side to all this is that there are plenty of indie games out there being made that don't require you to keep upgrading your PC constantly like a "gaming slave", and plenty of old games are still there and are awesome to play (most are even better than what we have been getting recently, which is also a huge reason I haven't upgraded in years - most newer games seem to be taking steps backwards in pretty much every sense, with only the rough-diamond-in-the-dirt type of games sticking out. Funnily enough, plenty of these "rough diamonds" are perfectly playable on older hardware, almost as if the developers treated their game as a passion project rather than a way to force their playerbase to spend hundreds just for the privilege of playing their newest garbage dump).

  • @imnot4chriss
    @imnot4chriss 4 หลายเดือนก่อน

    I have no plans to upgrade mine at all. I guess if I start running into performance issues and feel like I really need some more frames, I'll probably get one that's a little bit better.

  • @tutacat
    @tutacat 6 หลายเดือนก่อน +1

    The reason it's demanding is RT.
    Raster has been hacked on so much, which is why it's so optimized now.
    RT takes some shortcuts, but it simulates light physically, so it can get much better results with less effort (i.e. no baked lighting required). Raster has been stylized to what people want, and it's easier to control.
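    To make the cost of "simulating light physically" concrete, here is a minimal, hand-rolled sketch (my own illustration, not from the video or any engine): even the simplest ray tracer shoots a ray per pixel and solves an intersection test for each one, which is exactly the per-pixel work that raster's baked and hacked lighting avoids.

```python
# Tiny ASCII ray tracer: one ray per pixel, one sphere, simple Lambert shading.
import math

WIDTH, HEIGHT = 80, 40                  # tiny "framebuffer" so it runs instantly
SPHERE_C, SPHERE_R = (0.0, 0.0, 3.0), 1.0
LIGHT_DIR = (-0.5, 1.0, -0.5)           # assumed directional light

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def hit_sphere(origin, direction):
    """Distance along the ray to the sphere, or None if the ray misses."""
    oc = tuple(o - c for o, c in zip(origin, SPHERE_C))
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - SPHERE_R * SPHERE_R
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

light = normalize(LIGHT_DIR)
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # One primary ray per pixel, shot from a camera at the origin.
        u = (x / WIDTH) * 2 - 1
        v = 1 - (y / HEIGHT) * 2
        d = normalize((u, v, 1.0))
        t = hit_sphere((0.0, 0.0, 0.0), d)
        if t is None:
            row += " "
        else:
            # Shade by how directly the surface faces the light (Lambert).
            p = tuple(t * c for c in d)
            n = normalize(tuple(pc - sc for pc, sc in zip(p, SPHERE_C)))
            brightness = max(0.0, sum(nc * lc for nc, lc in zip(n, light)))
            row += " .:-=+*#%@"[int(brightness * 9)]
    print(row)
```
    A real game adds many bounces, many lights and denoising on top of this, which is why dedicated RT hardware (and upscalers) exist at all.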

  • @tojiroh
    @tojiroh 7 หลายเดือนก่อน +29

    Vex, always glad to hear your honest take. Seeing how much of an investment it is to afford a decent GPU nowadays, it's important to have a clear idea of what you need (a rational purchase) versus what you want (an impulsive buy). Yeah, I'm definitely waiting for Battlemage to see how things end up. 😉

    • @gabrielaleactus9932
      @gabrielaleactus9932 7 หลายเดือนก่อน

      Don't get your hopes up

    • @tatianaes3354
      @tatianaes3354 6 หลายเดือนก่อน

      I like that this Vex lad is not in the bubble of the latest GPU tricks (even though he knows them all well), unlike the delusional crowd at Digital Foundry, who have been endlessly praising Nvidia for years and pretending those overpriced cards are a must-have deal, even though only a few games can actually utilise the new tricks and only like 5% of cards can perform them.

    • @Bambi-er6fd
      @Bambi-er6fd 6 หลายเดือนก่อน

      @@tatianaes3354 The vocal part of the PC community is the normies who buy into it for the hype and end up playing Warzone and Minecraft. If you've been around for at least a generation, you start to realize how to be smart with your money.

  • @polarbear3262
    @polarbear3262 7 หลายเดือนก่อน +5

    The prices of PC components are insane. I wasn't able to use my PC for 2 years after my 1050 Ti died. A few months ago I got an AMD 6600. It is nice to be able to play Cyberpunk and AC Odyssey, but I'm afraid we will be pushed more and more towards tech that still performs badly and uses a lot of power to run, not to mention how much it costs. All the while, games look amazing with the "old school" tech. I think the rise of indie games is connected: giving players what they want, not a demanding game but a game with cool gameplay mechanics, great stories, a unique art style and so on. It is also the reason more and more people get some sort of console.

  • @HazFrostYT
    @HazFrostYT 5 หลายเดือนก่อน +1

    I always thought Ray Tracing was one of those things that you go "Oh wow, this sure gives really cool visuals!" but never actually think about literally playing a game with it.
    And not to mention, most competitive games need high FPS, so some people with even the most powerful computers still play on the lowest graphics.
    Overall, I totally agree with your points made here!

  • @standly5477
    @standly5477 3 หลายเดือนก่อน

    I'm fearing an online-only, large AI graphics optimization feature with a subscription model being released in the future, forcing graphics card users to rely on it for more performance.

  • @ZeroIQ2
    @ZeroIQ2 7 หลายเดือนก่อน +4

    Another reason graphics card prices have gone up is because of things like LLMs and local Stable Diffusion tools (like Automatic1111 and ComfyUI).
    Games might need 8GB, but LLMs and image AI see 8GB as the lowest tier, so people want cards with way more memory, and demand for the top-end cards has increased.
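    For a sense of scale, here is a back-of-the-envelope sketch (my own assumed numbers, purely illustrative): LLM weights alone take roughly parameter count times bytes per parameter, before activations or KV cache, which is why 8GB is the floor rather than the ceiling.

```python
# Rough VRAM needed just to hold model weights at different precisions.
def weights_gib(params_billion: float, bytes_per_param: float) -> float:
    return params_billion * 1e9 * bytes_per_param / 1024**3

for params in (7, 13, 70):                       # common open-model sizes
    for label, bpp in (("fp16", 2), ("int4", 0.5)):
        print(f"{params}B @ {label}: ~{weights_gib(params, bpp):.1f} GiB for weights alone")
```
    Even a 7B model at fp16 wants around 13 GiB before anything else, so the VRAM pressure from local AI tools sits well above what most games ask for.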

    • @erythreas34
      @erythreas34 6 หลายเดือนก่อน

      Those people usually go for Quadros, not RTX cards.

  • @dirkjewitt5037
    @dirkjewitt5037 7 หลายเดือนก่อน +17

    I think AMD is going about it right. The next gen isn't going to be near what we've seen as far as clocks go, but I'm banking on my water-cooled 6950 XT lasting at least 2 more gens.

    • @DMPLAYER1000
      @DMPLAYER1000 7 หลายเดือนก่อน +3

      Same. My 6950 XT is holding strong. Probably gonna wait until at least the RX 9000 series to upgrade.

    • @vladimirdosen6677
      @vladimirdosen6677 7 หลายเดือนก่อน

      Intel is gonna blast them all away if they really follow through...

    • @matrix8847
      @matrix8847 7 หลายเดือนก่อน +1

      There are rumors that RDNA 4 won't have high-end GPUs, as AMD wants to focus on RDNA 5 and will compete in the mid-range market with RDNA 4 in the meantime.

    • @alienorificeinvestigation
      @alienorificeinvestigation 7 หลายเดือนก่อน

      It's already been one gen, so you could probably go 3.

  • @kubiku
    @kubiku 5 หลายเดือนก่อน

    I was thinking about changing my GPU from a 1660 Super to a 6600 XT. But now that I think about it, and after seeing your video, I will not be changing my GPU.

  • @Charlemagne89
    @Charlemagne89 5 หลายเดือนก่อน

    I got 5 years out of my 970 at 1080p. I bought a used 1080 Ti for $350 right before the pandemic silicon shortage, and now I'm at 1440p. Between XeSS and FSR I plan to use it as long as I can! Maxwell and Pascal cards are GOATed at this point.

  • @dmaxcustom
    @dmaxcustom 7 หลายเดือนก่อน +5

    The main factor in any purchase decision is price. If cards become pricey, then you're gonna save for what you can get and make it last as long as you can.
    After all, you have to make the investment worth it.

    • @possiblyinsane6995
      @possiblyinsane6995 6 หลายเดือนก่อน

      People just need to stop being cheap and get what they want. You only live once; spend the money, don't eat or something. If you're a gamer you're probably fat and don't need to eat anyway. I'm serious, you don't need to eat for at least 3 weeks; you may be hungry, but hey, don't you want that shiny RTX 4090? I know I do.

    • @EvilUmagon
      @EvilUmagon 6 หลายเดือนก่อน

      Not all gamers only play games. Gaming is so mainstream now that most gamers have other things in life to spend money on as well. Plus, AAA games have not been that amazing in recent years, especially the graphically intensive ones. Indies are booming right now, for example. I don't need a 4090 for those. @@possiblyinsane6995

  • @Cptraktorn
    @Cptraktorn 7 หลายเดือนก่อน +6

    IMO frame gen is only good if you reach around 60fps without it; anything lower and the input lag is way too much (I'm using a 4090).
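    Rough arithmetic (assumed, not measured) behind that rule of thumb: interpolation-style frame generation has to hold back roughly one native frame before it can show the in-between frame, so the latency penalty scales with the native frame time.

```python
# Approximate extra delay from holding back one native frame for interpolation.
for native_fps in (30, 60, 120):
    frame_time_ms = 1000 / native_fps        # time per *real* rendered frame
    print(f"{native_fps} fps native -> roughly {frame_time_ms:.1f} ms of added delay")
```
    At 30fps native that is ~33ms on top of an already sluggish feel, while at 60fps it is ~17ms, which matches the "get to 60 first" advice.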

  • @opafritzsche
    @opafritzsche 4 หลายเดือนก่อน

    THIS is the MOST intelligent, most truthful video I have seen on the whole "gaming" craziness in the last 10 years!
    THANK YOU!

  • @RingRingRingBananaPhone
    @RingRingRingBananaPhone 6 หลายเดือนก่อน

    I'm still on a 1080 Sea Hawk EK X all these years later and it still runs games flawlessly. I might have to play some on medium settings, but it doesn't bother me. I will only upgrade if I can double the performance for $500 or under; otherwise it just isn't worth it.

  • @dutchdykefinger
    @dutchdykefinger 7 หลายเดือนก่อน +6

    I've been running the same GPU forever (RX 480).
    It's getting a bit long in the tooth these days, but it still does the job if you throw out ambient occlusion and some other stuff it doesn't do well with :D
    And if you were lucky enough to buy a 1080 Ti at the time, the entire generation of GeForce cards after that was pretty much pointless; it still holds up today, although yeah, the TDP is a bit toasty.

  • @Blind-Matter
    @Blind-Matter 7 หลายเดือนก่อน +7

    I'll be honest, I was on a 1080 Ti 11GB and a 2700X, but I just upgraded to a 4080 and a 5800X3D, so hopefully that will set me up for many years ahead (hopefully).

    • @SaltHuman
      @SaltHuman 7 หลายเดือนก่อน

      Until the 5080 and the 6080 come out in the next 5 years with 5x the performance.

    • @vladvah77
      @vladvah77 7 หลายเดือนก่อน +1

      @@SaltHuman and 5x the price /s

    • @SaltHuman
      @SaltHuman 7 หลายเดือนก่อน

      @@vladvah77 Doubt it. No one would pay for it. Companies like Nvidia need commercial success; a million people paying $1,000 for hardware trumps 10k people paying $100,000.

  • @Xeros_VII
    @Xeros_VII 6 หลายเดือนก่อน

    I might wait at LEAST 5-6 years until I upgrade, especially seeing as I have a laptop and I want to wait a little until the Framework laptop has a couple more options, like hopefully a Radeon 10900M or something.

  • @JovianStone_
    @JovianStone_ 3 หลายเดือนก่อน +1

    I've been saying this all along. Ray Tracing (at this point in time) is just another way for corporations to put more costs onto the retail consumer.

  • @87crimson
    @87crimson 7 หลายเดือนก่อน +5

    GPUs also used to get years of driver support. My GTX 670 got something like 8 years of drivers from NVIDIA. AMD and NVIDIA thought the mining craze was going to be the new normal; it will still take a bit more time for them to come back to their senses and offer cards at a reasonable price. Rooting for Intel to gain market share quickly and put some pressure on them. This reminds me of the CPU era between the FX series and Ryzen, when Intel was fleecing us for quad-core processors with meager incremental gains.
    One thing I noticed is that since 2019 all these tech advancements have been targeted at DEVELOPERS instead of customers/gamers. Ray tracing is an ask from their end, to make life easier by relying less on tricks and on manually adding light sources and shadows. Also, I don't quite understand why they seem to be forced to adopt UE5; there's no visual gain and a significant performance loss. Batman: Arkham Knight is a modified UE3 title and still one of the best-looking games on the market. For all the talent they have at coding, they also seem to have very little business sense: release demanding and broken games and they will bomb (Forspoken, Lords of the Fallen, The Medium and many more).
    I get that there are diminishing returns, now more than ever, and that graphics are also tied to the budget and talent a studio has. But we can't help making the comparison between the best-looking 8th-generation games on decade-old consoles and the newest games. There is nothing next-generation about them: no wow factor, nothing but gigantic asks of our hardware and wallets. A really terrible tradeoff. No wonder it's 2023 and it feels like this generation hasn't started yet.

    • @phattjohnson
      @phattjohnson 7 หลายเดือนก่อน

      It's not just Nvidia and AMD... everything from body massages to the construction industry has gone profit-hungry since Covid and shows no signs of returning to what we remembered as 'normal'.