8 GB VRAM is a Problem. Is 10G any Better?

  • Published Aug 25, 2024

Comments • 1.9K

  • @livedreamsg
    @livedreamsg 1 year ago +1730

    If Intel can afford to put 16GB VRAM on a $350 card, I don't want to hear any excuses for Nvidia.

    • @BeatmasterAC
      @BeatmasterAC 1 year ago

      NVidia: "but...but...mUh TeNsOr CoReS...mUh DlLs...MuH FrAmE gEnErAtIoN...mUh iNfLaTioN...mUh HigHeR tSmC pRiCeS...mUh CuDa CoReS..."

    • @shanksisnoteventhatstrongbruh
      @shanksisnoteventhatstrongbruh 1 year ago +284

      Agreed. I mean, Nvidia has a $350 card with at least 12GB (the 3060); the 3060 Ti, 3070, 3070 Ti and 3080 having less VRAM than the 3060 is so stupid. Ngreedia at its best.

    • @KermenTheFrog
      @KermenTheFrog 1 year ago +20

      The difference is Intel was selling those cards at or near cost.

    • @sorinpopa1442
      @sorinpopa1442 1 year ago +44

      Exactly. Puck Ngreedya. They've kept the gaming industry frozen these last few years because their GPUs are severely lacking in VRAM (the 3090 being the only exception).

    • @karlhungus545
      @karlhungus545 1 year ago +25

      Unfortunately Nvidia couldn't care less what you think...or anyone else on YT for that matter. They own the GPU market, and will for the foreseeable future. You don't need that VRAM anyway, unless you only play crap console ports at 4K with the 1% that have a 4K monitor 🙄😂 Buy AMD then (you won't), or better yet, have a brain and just get a console...

  • @LucidStrike
    @LucidStrike 1 year ago +119

    I mean, the latest AAA games eventually become affordable Steam Sale games, and so the same problem eventually hits you even if you're not buying at launch.

    • @77wolfblade
      @77wolfblade 1 year ago +20

      Remember no preorders!

    • @SpinDlsc
      @SpinDlsc 1 year ago +9

      True. I also think his Steam argument only has a limited degree of validity, because if you look at the Steam Hardware Survey and see what kind of graphics cards most people have, it's 50, 60 and 70-class cards, and a lot of those are still in the 10, 16 and 20 series. A big reason many people haven't wanted to upgrade in the last couple of years is the recent pricing problem in the GPU space and the current recession, so by that metric, most of those people aren't exactly going to try running any of the newer, shinier games.
      Also, if "VRAM isn't needed" is the argument we're going to make, then we also have to ask why NVIDIA is marketing ray-tracing and now path-tracing so hard in the first place, when they aren't adding enough VRAM to help that feature run well on some of these cards in the long term. By the logic of NVIDIA "not needing" to add more VRAM than is necessary for most users, we should also argue that they shouldn't be marketing ray-tracing to begin with.

    • @OrjonZ
      @OrjonZ 1 year ago +2

      A lot of people bought and played Hogwarts.

    • @ZoragRingael
      @ZoragRingael 1 year ago +2

      + there are steam sales

    • @DeepfriedBeans4492
      @DeepfriedBeans4492 1 year ago +1

      @@SpinDlsc idk, I think ray tracing is a good thing to push, but it's not there yet, so Nvidia should be taking a hit to their profit margins to keep the prices actually making sense, as opposed to doing the complete and utter opposite like they currently are.

  • @Tubes78
    @Tubes78 1 year ago +591

    I remember choosing the 6800xt 16gb over the 3080 10gb because of this.

    • @eldritchtoilets210
      @eldritchtoilets210 1 year ago +90

      Same, I guess it's the "fine wine" effect starting to settle in.

    • @OffBrandChicken
      @OffBrandChicken 1 year ago +45

      Same, starting to see my purchase was the correct one overall.
      While some of the people saying "you don't need 8GB" are throttling their games, I'm turning everything up to ultra.

    • @gruiadevil
      @gruiadevil 1 year ago +68

      @@OffBrandChicken It's the same people that said "YoU dOn't nEeD a 4770K. What are you gonna do with 8 threads? An E8400, 2 cores/2 threads, can run any game."
      Meanwhile, over 10 years, they swapped 6 CPUs + mobos + RAM kits while I held on to my i7.

    • @OffBrandChicken
      @OffBrandChicken 1 year ago +18

      @@gruiadevil Same, it's not about whether tech is "technically fast right now in this specific use case" but "is this going to fulfill my use cases for the next X years?"

    • @OffBrandChicken
      @OffBrandChicken 1 year ago +26

      @@gruiadevil I just love how when Nvidia adds more than is needed for the time, they're seen as the almighty Jesus bestowing greatness, while when AMD does something extra that frankly is very beneficial,
      the Nvidia users are like, "yes, but I could get 10% more speed in games that used less than 8GB 10 years ago, and that's what truly matters," instead of thinking about the 20% they'll gain in the future.
      The copium is so hard that they don't even see it. Even the youtubers.

  • @Barryhick186
    @Barryhick186 1 year ago +643

    VRAM isn't the problem. Nvidia's VRAM is the problem.

    • @ForceInEvHorizon
      @ForceInEvHorizon 1 year ago +12

      Lol more like AMD

    • @RationHans
      @RationHans 1 year ago +51

      I thought it was the game devs that don't optimize xd

    • @ForceInEvHorizon
      @ForceInEvHorizon 1 year ago +6

      @@RationHans If AMD hadn't released their FSR we wouldn't have this problem. Nvidia's DLSS isn't the problem since it's exclusive to RTX cards, but once AMD released FSR, which is available on every card, the devs got lazy optimizing their games since they know we can just use DLSS/FSR.

    • @V1CT1MIZED
      @V1CT1MIZED 1 year ago

      @@ForceInEvHorizon you sound like a child

    • @Rivexd
      @Rivexd 1 year ago +150

      @@ForceInEvHorizon that’s like saying “if guns weren’t invented, people wouldn’t kill each other”

  • @MrSwallows
    @MrSwallows 1 year ago +127

    I remember when 512MB was enough for gaming.
    Thank you for your service, GeForce 8800 GT.

    • @ro_valle
      @ro_valle 10 months ago +2

      I remember asking my dad for an 8800GTS 320MB and he surprised me with an 8800GTS 640MB. I was amazed by the amount of VRAM.

    • @davidrenton
      @davidrenton 10 months ago

      My first PC had 4MB of RAM (yep, MB, not GB), and not in the GFX card, it didn't have one; 3D acceleration didn't exist.
      I think my hard disk was 20MB.
      4MB system RAM total, but hey, that wasn't the problem; it was trying to get all the DOS drivers like CD-ROM, sound and mouse into the first 640K.
      Doom 1's final boss was stuttery, but then I spent an insane amount and went to 8MB. Doom's final boss: smooth as butter.

    • @m.i.l9724
      @m.i.l9724 8 months ago

      @@davidrenton That means if you'd committed to being a father when you were 18, your child could have a child my age, I guess. Damn.

    • @davidrenton
      @davidrenton 8 months ago +1

      @@m.i.l9724 Not yet, my child would have had to be a parent at 13, which is unlikely. I'm 49, so at 36 I would have had an 18-year-old kid if I'd had them at 18. Hence my hypothetical grandkid could be 13.
      That said, it's not impossible: people have kids at, say, 15, their children at 15, and they're a grandparent by the time they're 30.
      I recently watched a TV bit from the '80s about 6 generations alive in one family: baby, mother, grandmother, great-grandmother, great-great-grandmother, and the great-great-great-grandmother was still alive. They were all together in the studio.

    • @valentinvas6454
      @valentinvas6454 7 months ago

      In 2012 I thought the 2GB in the GTX 660 would be enough for a decade, but even 4-5 years later it was easily overwhelmed. Nvidia has often screwed us over with VRAM capacity. The popular GTX 970 only had 3.5 GB of usable VRAM, and as soon as you used the last 0.5 GB, which was much slower, you started to see huge stutters.
      With Pascal and Turing they were quite generous, but after that it's downhill once more. The 3070 Ti and 4060 Ti with 8GB are such jokes.

  • @madrain3941
    @madrain3941 1 year ago +255

    As soon as I heard that the RTX 4060 was gonna release with 8GB of VRAM, I instantly went ahead and purchased the RX 6700 XT with 12GB of VRAM, and honestly, it is a HUGE game changer, at least for me.

    • @weshouser821
      @weshouser821 1 year ago +11

      What I don't understand is that we have systems with 64GB of memory; I really don't see why we can't have a card that has 32/64GB of VRAM instead of messing around with it. Why can't we just make cards that have upgradable VRAM slots? I don't know... it's over my head, but I really think it's because of "planned obsolescence".

    • @LeoMajor1
      @LeoMajor1 1 year ago +23

      @@weshouser821 Yes, it is a bit over your head, because GPU VRAM and system RAM are not the same..... and a card with 64GB of VRAM would be HUUUUUUUUUUUUUUUUUUUUUGE and need a heavy power draw and more cooling. It's even over my head, so someone else can add to that.

    • @weshouser821
      @weshouser821 1 year ago

      @@LeoMajor1 Would it really though?

    • @brkbtjunkie
      @brkbtjunkie 1 year ago +3

      @@weshouser821 have you seen the prices of ddr5? Gddr6x is a whole different ballgame as well. Apples and oranges.

    • @Elinzar
      @Elinzar 1 year ago +5

      Why we don't have 64GB cards in the consumer space is simply because GDDR6 is still not dense enough; the enterprise version of the 3090 Ti card had 48GB of VRAM, and I think this gen might have an 84GB one or something like that.
      Also, capacities like that were only achieved by HBM3 in the past gen.
      So yeah, I would say even 20GB+ midrange cards are still miles away; 16GB will become the norm tho. Something I love about what AMD did with RDNA2 (absolutely underrated cards this gen) is that everything from the 6800 all the way to the top tier got 16GB. I mean, the 6950 XT should have gotten 24 at least, but you get the point: everything from the upper midrange to the high end got a fair bit of VRAM, and the 6700 XT got 10.
      Only the entry level got 8GB.
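
The density point above is the real constraint: each GDDR6 chip sits on a 32-bit slice of the memory bus and ships in 1 GB or 2 GB densities, so a card's possible capacities are fixed by its bus width. A minimal sketch of that arithmetic (illustrative only; the bus widths in the comments are publicly documented, the function name is mine):

```python
# Why consumer VRAM sizes are quantized: each GDDR6 chip occupies a 32-bit
# slice of the memory bus, and chips ship in 1 GB or 2 GB densities.
# Capacity = (bus width / 32) * chip density (clamshell mode doubles it).

def vram_options_gb(bus_width_bits: int) -> list[int]:
    """Possible VRAM capacities for a given bus width (GDDR6, non-clamshell)."""
    chips = bus_width_bits // 32                      # one 32-bit chip per channel
    return [chips * density for density in (1, 2)]   # 1 GB or 2 GB chips

print(vram_options_gb(256))  # RTX 3070-class, 256-bit -> [8, 16]
print(vram_options_gb(192))  # RTX 4070-class, 192-bit -> [6, 12]
print(vram_options_gb(128))  # RTX 4060-class, 128-bit -> [4, 8]
```

This is why a 256-bit 3070 could only ever have been an 8 GB or 16 GB card, never 12: 12 GB would require a 192-bit bus or mixed chip densities.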

  • @liberteus
    @liberteus 1 year ago +191

    I own a 3080 10GB and my favorite game received tons of graphical updates, to the point where 10GB isn't enough anymore. I had to cut all settings from ultra to a mix of medium/high to get it over 60fps, down from 90 two years ago.

    • @clownavenger0
      @clownavenger0 1 year ago +19

      so they added more demanding graphical settings and that hurt performance. okay cool.

    • @sirab3ee198
      @sirab3ee198 1 year ago +111

      @@clownavenger0 The GPUs are limited by VRAM, nothing to do with the demanding graphics. This is the same as the R9 290X, which had 4GB of VRAM, versus the 780 Ti, which had 3GB, and the R9 290X aged much better than its Nvidia counterpart. Nvidia is playing the planned obsolescence game, forcing you to upgrade your GPU because it is VRAM starved, not because it is slow. I can bet you my RX 6800 with 16GB will run games smoother in 2 years than your 10GB RTX 3080. When we told people 2 years ago about the limitations of 8GB VRAM on the RTX 3070 they called us Nvidia haters ....

    • @nazdhillon994
      @nazdhillon994 1 year ago +5

      Which game?

    • @weakdazee
      @weakdazee 1 year ago

      literally same

    • @Tubes78
      @Tubes78 1 year ago +3

      ​@clownavenger0 that's always going to have an impact but I can't help but think that the card would lose less performance with more VRAM.

  • @ConfusionDistortion
    @ConfusionDistortion 1 year ago +57

    Working adult here that buys AAA games, so yeah, this affected me. It was sobering to start up Company of Heroes 3 and find I couldn't max it out due to a VRAM limit on my 2070 Super. I've had this card for 3 years, and I still like it, but yeah, sign of the times. So now it sits in a secondary PC and I dropped the cash on a 7900 XT. Problem solved, and now I am back to running everything again and not sweating VRAM issues in The Last of Us, COH 3, etc.

    • @kirbyatethanos
      @kirbyatethanos 1 year ago +7

      Same boat as me. Recently had an RTX 2060 6GB. Upgraded to an RTX 4070 Ti(the 7900XT is more expensive where I live).

    • @xTurtleOW
      @xTurtleOW 1 year ago

      Same, man. My 3080 was running out of VRAM very fast, so I dropped a 3090 Ti into my second rig and a 4090 into my main. Now no VRAM issues anymore.

    • @xkannibale8768
      @xkannibale8768 1 year ago +4

      So go from ultra to high? Like lmao. There isn't even a difference you'd notice 90% of the time and it uses half the vram 😂

    • @legionscrest143
      @legionscrest143 1 year ago +1

      ​@@xkannibale8768 really?

    • @JonesBeiges
      @JonesBeiges 1 year ago

      @@legionscrest143 Yes, really. I did a CPU upgrade and can still play the newest titles with my 6-year-old GPU above console graphics.....
      Many people like you seem to fall for those youtube salesmen....... Only idiot game devs create games for that 1% of elitist PC nerds who need the 4K ultra settings at 144 Hz because ......

  • @trr4gfreddrtgf
    @trr4gfreddrtgf 1 year ago +248

    This is also why I'm going to take a 7900 XT over a 4070 Ti; 20GB seems a lot more future-proofed than 12.

    • @VaginaDestroyer69
      @VaginaDestroyer69 1 year ago +40

      Yeah, and with FSR 3.0 coming out soon AMD is really closing the gap on Nshitia. RT is still not in a place where I would be willing to base my entire GPU purchase on just ray tracing performance alone. I can see AMD and Intel offering outstanding value to gamers in the future compared to Nshitia's extortionate pricing.

    • @trr4gfreddrtgf
      @trr4gfreddrtgf 1 year ago +13

      @@VaginaDestroyer69 Even the 7900xt and 7900xtx have made massive improvements on ray tracing, the 7900xt is only a little bit behind the 4070 ti (with raytracing) and I think we can expect the gap to close as the 4070 ti runs out of VRAM over time.
      I can't wait to see how FSR3 compares to DLSS3, it probably won't beat DLSS in terms of visual quality but hopefully it gives a similar performance boost.

    • @CRF250R1521
      @CRF250R1521 1 year ago +10

      The 7900 XT had poor 1% lows. I returned it for a 4080.

    • @chickenpasta7359
      @chickenpasta7359 1 year ago +2

      @@VaginaDestroyer69 You're acting like AMD is the hero in this situation. They literally wait until Nvidia drops their MSRP and then undercut them by a little.

    • @WackyConundrum
      @WackyConundrum 1 year ago +4

      @@CRF250R1521 Interesting! Do you remember a particular benchmark with these results?

  • @rajagam5325
    @rajagam5325 1 year ago +90

    I got a 6800 XT for 3070 money, and I'm really happy with it. The video export times are really good too.
    (Don't support companies, support the better product :))

    • @gruiadevil
      @gruiadevil 1 year ago +2

      This is the best attitude!
      If NZXT or Corsair make a GPU and it's good price/performance compared to the other ones, I'm buying it.

    • @shanksisnoteventhatstrongbruh
      @shanksisnoteventhatstrongbruh 1 year ago +2

      yep, right now the 6950XT is CHEAPER than the 3070ti while having DOUBLE the VRAM (8gb vs 16gb) and being 36% faster!!! CRAZY

    • @IchiroSakamoto
      @IchiroSakamoto 1 year ago +4

      +1. I've bought Nvidia products all my life, but went for the 6800 XT for better value for money, as my 2070S really disappointed me in RT. I couldn't care less who the maker is, but I'm disappointed there are so many fanboys around.

    • @abdulqureshi2803
      @abdulqureshi2803 1 year ago +1

      I got a 6800 XT for MSRP right from AMD, upgraded from a 3070 (which I sold for the price of the 6800 XT during the crypto boom), and it's so much better. I ran into the VRAM problem in FH5 a few times on the 3070; never had that issue with the RX 6800 XT, tho I had the black screen of death a couple of times when I first got it.

    • @Fluskar
      @Fluskar 8 months ago

      Facts. I will never support brand loyalty, it's plain stupid.

  • @KobeLoverTatum
    @KobeLoverTatum 1 year ago +112

    Nvidia: “Here gaming studio, $$ to use more VRAM”
    Also Nvidia: “Higher VRAM costs $$$$$$$$$$$$$”

    • @hardrock22100
      @hardrock22100 1 year ago +11

      You do realize the last of us and RE4 are AMD sponsored titles, right?

    • @gruiadevil
      @gruiadevil 1 year ago +29

      @@hardrock22100 You do realize they use more VRAM precisely because AMD packs their GPUs with more VRAM, and Nvidia doesn't.

    • @hardrock22100
      @hardrock22100 1 year ago +13

      @@gruiadevil
      1. This person was trying to claim that Nvidia is paying devs to use more VRAM in games that are sponsored by AMD.
      2. It's interesting that AMD-sponsored titles are running like hot garbage.
      3. The company that ported The Last of Us was the same one that ported Arkham Knight.
      4. The Last of Us crashes when you run out of VRAM. That should not be happening. I've seen it even BSOD some PCs.

    • @AntiGrieferGames
      @AntiGrieferGames 1 year ago

      @@hardrock22100 The Last of Us is just a piece of shit port, for it to be getting BSODs.

    • @vaguedreams
      @vaguedreams 1 year ago +4

      @@hardrock22100 3. The company that ported Arkham Knight is also the same company that ported the Uncharted: Legacy of Thieves Collection.

  • @stratuvarious8547
    @stratuvarious8547 1 year ago +36

    When Nvidia released the 3060 with 12 GB of VRAM, everything up to the 3070 Ti should have also had 12 GB, with the 3080 and 3080 Ti getting 16 GB. I just hope this is the straw that costs them enough market share to change their ways, instead of always thinking they can do whatever they want and people will just buy it.

    • @MarcoACto
      @MarcoACto 1 year ago +1

      The thing is that the 12 GB version of the 3060 was obviously aimed at crypto mining, which required a lot of VRAM and was hot at the time. It was never designed with gaming in mind.

    • @stratuvarious8547
      @stratuvarious8547 1 year ago

      @@MarcoACto Yeah, it's true, but that doesn't change the fact that the SKUs above it should have still been increased. Making GPUs obsolete 3 years after their release is inexcusable, and that's all that giving those cards 8 GB of VRAM has done.

    • @naturesown4489
      @naturesown4489 1 year ago

      @@MarcoACto Yeah that crypto thing is a myth. NotABotCommenter has the correct reason.

    • @r3tr0c0e3
      @r3tr0c0e3 1 year ago +1

      3060 will still have 30fps less than 3070/80 regardless of how much vram it has lol
      you people are clueless

    • @stratuvarious8547
      @stratuvarious8547 1 year ago +1

      @@r3tr0c0e3 Of course it'd have less FPS, it's a lower class GPU, I was talking about the longevity of the purchase. Maybe before calling someone "clueless", look at the context of the conversation.

  • @sirab3ee198
    @sirab3ee198 1 year ago +11

    So AAA games are a niche now????? :))))))))))))) RE4 sold 5 million copies, Elden Ring sold 20 million, Witcher 3 sold 40 million, etc....... I hate it when people misuse Steam charts to prove their point. Hogwarts Legacy is a single-player game, same as Elden Ring (kinda), but a month after launch people move on, because it is a single-player game!!! People finish it and move on. Nvidia giving you 8GB of VRAM for the x70 series was a slap in the face for consumers; now they are doing the same with 12GB. People who bought the RTX 3070, and who will buy the RTX 4070, will want to play the latest AAA games.

  • @ngs2683
    @ngs2683 1 year ago +67

    I just want to say one thing. I got a 1060 6GB in 2016 and spent nearly 7 years with it. Then finally I took my hard-earned money and bought a 3080 Ti in November, this Black Friday. 12GB of VRAM. Then IMMEDIATELY new AAA games became this crazy demanding, and devs are saying that 12 GB is the minimum. On top of that, NVIDIA is effectively implementing planned obsolescence. The 4070 Ti, the superior card to my 3080 Ti, had no evolution in VRAM. It's a 12 GB card. I just gotta say...it hurts to get an upgrade after 6.5 years only to end up immediately becoming the new low tier for this future they speak of. And I do blame NVIDIA. No card above the 3060 should have only 8 GB, and the 3080 Ti should have been a 16 or 20 GB card. 3070 owners have all the right in the world to be mad. NVIDIA KNEW this was an issue but they don't care. They still don't.

    • @aquatix5273
      @aquatix5273 1 year ago +5

      The cards would be completely fine with this list of VRAM. The VRAM on these cards would be how much they actually would need to stay efficient to their compute power:
      RTX 3050: 6 GB
      RTX 3060 + RTX 3060 Ti: 8 GB
      RTX 3070 + RTX 3070 Ti: 10 GB
      RTX 3080: 12 GB
      RTX 3080 Ti + RTX 3090 + RTX 3090 Ti: 16 GB
      RTX 4050: 6 GB
      RTX 4060: 8 GB
      RTX 4060 Ti: 10 GB
      RTX 4070: 12 GB
      RTX 4070 Ti: 16 GB
      RTX 4080: 20 GB
      RTX 4090: 24 GB

    • @dimintordevil7186
      @dimintordevil7186 11 months ago +1

      @@aquatix5273
      RTX 3070: 12 GB (let's be honest, it's cheap for the factory)
      RTX 3070 Ti: 12 GB
      RTX 3080: 16 GB
      RTX 3080 Ti: 16 GB
      RTX 3090: 20 or 24 GB
      RTX 4050: 8 GB
      RTX 4060: 10 GB
      RTX 4060 Ti: 12 GB
      RTX 4070: 16 GB
      RTX 4070 Ti: 16 GB
      RTX 4080: 20 GB (at $1200)
      RTX 4090: 24 GB

    • @ozgurpeynirci
      @ozgurpeynirci 11 months ago +1

      @@aquatix5273 The 4060 Ti is WAY MORE capable than 8 GB; that's why they made a 16GB version. The 4070 should be 16 as well, if not more. As a 10GB 3080 owner, this hurts.

    • @aquatix5273
      @aquatix5273 11 months ago +1

      @ozgurpeynirci Yeah, doubt it; the 4060 Ti is barely better than the 3060 Ti, and both cards don't have the performance.

    • @dimintordevil7186
      @dimintordevil7186 11 months ago +2

      @@aquatix5273
      The 3060 Ti is faster than the 1080 Ti. The 1080 Ti was faster than the 2080 Super. Nowadays the 2070 is as fast as a 1080 Ti. Therefore, the 3060 Ti is a great card.

  • @Obie327
    @Obie327 1 year ago +39

    Very good observation, VEX. The older Pascal cards with 8 gigs of VRAM utilize only the feature sets they have baked in. The problem now is all these new advanced DX12 features plus higher resolutions become more taxing on limited VRAM buffers in niche situations. There's a car analogy here: it's fast but runs out of gas (tiny tank), or it can get to sixty really quick but tops out at 80 mph (low gearing). I really think everyone wants something that performs great and has future potential/practicality, or value, hoping their GPU will last a good while for their current pricey investment. Limiting the RAM only limits the possibilities for game developers.

    • @user78405
      @user78405 1 year ago +6

      Limiting RAM should force game developers to open doors... not milk them. That is John Carmack's philosophy of good-quality work ethics: quality over quantity. A company always becomes sloppy when it bites off more than it can chew, and Ion Storm was a good example back then.

    • @Obie327
      @Obie327 1 year ago +1

      @@user78405 I totally agree with you. But as in Moore's Law Is Dead's interview with the game developer... the modern consoles are using around 12 gigs of VRAM. I hate sloppy, lazy code, but I do like to experience everything that the developers have to offer. Maybe AI can help clean this up? I feel that if more adopt higher RAM limits this issue won't be a problem going forward. I feel like we are in a weird transition period and Nvidia could be more generous with their specs. Have a great weekend and Easter!

    • @David_Raab
      @David_Raab 1 year ago +1

      Some people like to buy a newly released graphics card (3070) for $600-$700 and like that they have problems with already-released games because of too little VRAM. People who criticize this are obviously AMD fanboys.

    • @Obie327
      @Obie327 1 year ago +2

      @@David_Raab It's been years that we've had 8 gigs of VRAM on a premium GPU. My GTX 1080 is 7 years old, and AMD's been there even longer. The latest consoles "were" the warning sign that more RAM was going to be needed, and now they are using 12+ gigs for their new console releases. I just think it's a damn shame to pay $500+ for anything new with only 8 gigs and call it acceptable going forward. I think Nvidia just missed the mark with their current product stack. Also, Nvidia's new stuff still has the older display connector standard, which has me scratching my head since they have the same display tech on the $1600 RTX 4090 as well. Intel's ARC A770 LE is only $350 and has the latest display connectors, DX12 Ultimate/Vulkan/XeSS feature sets, and 16 gigs of VRAM. Is video RAM that expensive to put more of on an $800 4070 Ti? I just think the whole current rollout of GPUs is off on many levels. Time will rapidly tell how fast our cards become obsolete. Crossing fingers, peace!

    • @David_Raab
      @David_Raab 1 year ago +1

      @@Obie327 I agree with all of that. I find it a shame currently. I've been buying Nvidia for nearly 20 years, and now I'm at the point of buying AMD instead. Nvidia now sells overpriced cards; the 4070 in my opinion should have been a 4060. I could live with such a card and 12GB if it cost 300€, but not 600€. And yeah, they can't tell me that 4GB or 8GB more GDDR6 RAM can be so expensive. Any card over 300€ should have 16GB VRAM at least.

  • @veda9151
    @veda9151 1 year ago +52

    It is very true that most aren't actually affected by the VRAM issue now. The real controversy is Nvidia not providing enough VRAM while pricing their GPUs as high-end models. No one is complaining that the 3050 or the 6600 only got 8GB. It's the 3070 and 3080 (10GB) that attract all the attention.

    • @MATRIX-ERROR-404
      @MATRIX-ERROR-404 1 year ago +1

      RTX 3070 / RTX 3070 Ti / RTX 3060 Ti = 8 GB VRAM

    • @Iridiumcosmos
      @Iridiumcosmos 1 year ago +6

      Because the 3050 & 6600 are entry level cards priced accordingly. The 3070/ti is around a whopping $500 with the same amount of VRAM as the entry level GPUs. Hence why people are calling out Nvidia’s stupidity.

    • @vectivuz1858
      @vectivuz1858 1 year ago +3

      @@Iridiumcosmos That is exactly his point though. Price accordingly and people will understand.

    • @r3tr0c0e3
      @r3tr0c0e3 1 year ago

      System RAM like DDR4 or DDR5, or a fast NVMe, can easily be used to compensate for the lack of VRAM; devs just need to implement it, but they are lazy af.

    • @vectivuz1858
      @vectivuz1858 1 year ago +1

      ​@@r3tr0c0e3 Uhm yes some games do that, and it causes major lagging.

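The lagging mentioned in the reply above has a simple back-of-envelope explanation: once data spills past VRAM, it has to cross the PCIe link, which is an order of magnitude slower than on-board GDDR6. A rough sketch with approximate, illustrative numbers (not measured from any specific card):

```python
# Why spilling textures to system RAM stutters: on-board GDDR6 bandwidth
# dwarfs the PCIe link the spillover traffic must cross.
# All figures are approximate and for illustration only.

gddr6_gbps_per_pin = 14      # a common GDDR6 data rate, in Gb/s per pin
bus_width_bits = 256         # e.g. a 3070-class card

vram_bandwidth = gddr6_gbps_per_pin * bus_width_bits / 8   # GB/s on-board
pcie4_x16_bandwidth = 32                                   # GB/s, theoretical

print(f"VRAM:  ~{vram_bandwidth:.0f} GB/s")                # ~448 GB/s
print(f"PCIe:  ~{pcie4_x16_bandwidth} GB/s")               # ~32 GB/s
print(f"ratio: ~{vram_bandwidth / pcie4_x16_bandwidth:.0f}x")
```

Roughly a 14x gap, which is why assets that miss VRAM cause frame-time spikes rather than a gentle slowdown.
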
  • @clownavenger0
    @clownavenger0 1 year ago +33

    Hogwarts was patched and works fine on a 3070 now. RE has a bugged implementation of RT, so if you turn that off the 3070 does not have any issue in that game either. TLOU has PS1 textures on medium and other graphical issues regardless of hardware. If the question was "Is 8 GB on the edge for the 3070?" I would say yes, but games are releasing completely broken, which increases the need for overpowered hardware. Some very good-looking 3D titles with high-quality textures use 4-6GB on ultra (Atomic Heart, for example), while TLOU uses 8 while looking like a PS2 game at medium settings. I run a 3080 10GB myself and play everything at 1440p or 1440p ultrawide, using DLSS Quality whenever offered, to push about 100 FPS. I have not had a single issue, but I only buy games on sale, so the game might be finished by the time I buy it. It seems like people just want to make excuses for game developers.

    • @DenverStarkey
      @DenverStarkey 1 year ago +2

      Well, these games were also designed around cards that had 16 gigs (the Radeon cards), so the devs got sloppy with VRAM usage.

    • @jorge86rodriguez
      @jorge86rodriguez 1 year ago +4

      Just buying the game on sale avoids a lot of headaches jajjajaja, early buyers are beta testers xD

    • @tyisafk
      @tyisafk 1 year ago

      I played through RE on an RTX 2070 and an Arc A750, both 8GB cards. I agree that RT (and hair strands on the Intel) was the main issue with the game. To be fair though, both reflection implementations aren't good at all, so it's worth just having both RT and screen-space reflections turned off. I even used higher-quality textures than the game suggested with those disabled, as per Digital Foundry's suggestion, and the game ran flawlessly on both cards. I'm glad I don't often care for big AAA titles, and I have a PS5 if I'm that desperate to play one that isn't optimized properly on PC, but I do feel bad for regular AAA game fans who exclusively play on PC. PC used to be the main go-to for long-term savings if you didn't mind paying more up front, but now a current-gen console is definitely the better option if you just want to be able to play anything decently.

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago

      @@DenverStarkey Those are high-end cards. How many people actually own such a GPU? All the talk surrounding this "not enough VRAM", mostly if not all of it, is about max settings. Years ago I read an interview with a dev (forget which tech outlet did it); they said that on PC they try to optimize their game even for Intel iGPUs because of how big the user base is. And back then Intel iGPUs were considered super crap.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago +2

      10GB will soon not be enough. Dying Light 2 maxes it out easily with RT. You'll have to increase your reliance on DLSS and lower texture quality in the upcoming years. Even Flight Simulator maxes out 10GB of VRAM at 1440p. So....................
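
For a sense of why texture settings dominate VRAM budgets in threads like this, here's a rough, hypothetical estimate of a single texture's footprint: base RGBA size, plus about a third for the mip chain, divided by the block-compression ratio. The function and numbers are illustrative only, not taken from any game mentioned above:

```python
# Rough VRAM footprint of one texture: base RGBA size plus ~1/3 extra for
# the mip chain; block compression (e.g. BC1/BC7) cuts it by 4-8x.
# Illustrative estimate only.

def texture_mib(width: int, height: int, bytes_per_pixel: int = 4,
                compression_ratio: int = 1) -> float:
    base = width * height * bytes_per_pixel / compression_ratio
    return base * 4 / 3 / (1024 ** 2)    # mip chain adds ~33%

print(f"4K RGBA, uncompressed: {texture_mib(4096, 4096):.0f} MiB")       # ~85 MiB
print(f"4K RGBA, BC7 (4:1):    {texture_mib(4096, 4096, 4, 4):.0f} MiB")  # ~21 MiB
```

A few hundred unique 4K materials at these sizes quickly eats several gigabytes, which is why dropping one texture tier often halves VRAM use as some commenters note.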

  • @denerborba4994
    @denerborba4994 1 year ago +29

    Digital Foundry recently made a review of the RE4 remake, and there they tested a 2070 Super. From their analysis it seems that RE4 will never crash with textures at 8 GB unless you are using ray tracing. I have also been playing the game with an 8GB card and haven't faced any crash so far either, about 13 hours in.

    • @viktor1master
      @viktor1master 1 year ago

      I tested the RE4 remake with a 3070, a Ryzen 9 3950X and 16 GB of RAM, with the texture setting at 8 GB and the game reporting a total limit of 12 GB of VRAM. Funny tho, in the settings it showed my card as having only 7 GB of VRAM, so I don't know; in other games it's the normal 8 GB. But I was impressed: 135 fps dropping to 70/80, lowest slightly over 60, in 30 minutes of testing the demo. Now I know I must have it, the game looks so good 🤣

    • @JoeL-xk6bo
      @JoeL-xk6bo 1 year ago +4

      It still has issues loading in high-res textures. Stop and look at certain surfaces and the texture will go in and out.

    • @r3tr0c0e3
      @r3tr0c0e3 1 year ago

      RT doesn't look particularly accurate or appealing in this game anyway; besides, it's just another condom they placed on a game that looks perfectly good without it, so it stayed off, just like in any other unoptimized garbage they've tried to sell us.
      Unless a game goes full path tracing it's simply not worth it, and as we can see, even the 4090 struggles to do that at playable fps.

  • @stratuvarious8547
    @stratuvarious8547 1 year ago +18

    I expected when the games designed for the current gen consoles (Xbox Series, PS5) started releasing on PC, this was gonna start to be a problem. That's why when I was looking to upgrade my 2070, I was looking for something with a minimum of 12GB of Vram. Since I couldn't get a new 3080 (or Ti) for a reasonable price, I went with the RX 6900 XT and it's massive 16 GB of Vram. Since it was $650, It felt like the best price to performance in the price range I was looking at.

    • @latlanticcityphil
      @latlanticcityphil 1 year ago +1

      Man, I love my RX 6900 XT; I have no problems, and it was a great investment too. I can play all the games and have a great experience, even with Cyberpunk. 16 GB of VRAM DOES MAKE A DIFFERENCE!

  • @Doric357
    @Doric357 1 year ago +27

    The 6800 XT with its 16GB seems to be a sweet spot. I'm a casual enthusiast, so I won't claim to know the ins and outs of all this, but creators have been talking about VRAM forever, and I always believed more is better. However, I don't believe it should come at such a high premium.

    • @tokki2490
      @tokki2490 ปีที่แล้ว

      if you have a 6800xt... you are not a casual enthusiast lol

  • @Sinflux420
    @Sinflux420 ปีที่แล้ว +35

    Just got a 20GB 7900 XT. Being able to run RE4 at max with ray tracing and have 6GB left over is pretty nice, ngl. Didn't realize this was an ongoing issue until after getting the card; glad it's well-equipped!

    • @Drake1701
      @Drake1701 ปีที่แล้ว

      Out of curiosity, what resolution do you play at?

    • @Austrium1483
      @Austrium1483 ปีที่แล้ว

      What did you pay

  • @themadnes5413
    @themadnes5413 ปีที่แล้ว +26

    I had a 1080 Ti, and the main reason I didn't get a 20 or 30 series was VRAM. The 3080 has less, and the 3080 Ti had 12GB; €1200 for a 12GB GPU is kinda stupid and in that regard a sidegrade. Now I have a 4080. 16GB is still a bit on the low side for a €1200+ GPU, but I can live with that. I know AMD is an option too, and I was about to get a 7900 XTX, but the 4080 was only about €50 more. So I chose the Nvidia GPU; I also like RT and DLSS a lot.

    • @dededede9257
      @dededede9257 ปีที่แล้ว +4

      I think 16GB is still fine. Yeah, it could be more for this price, but I don't think you'll get VRAM-limited.

    • @zdspider6778
      @zdspider6778 ปีที่แล้ว +11

      1200€+ is the price of a "decent enough" second-hand car.
      The MSRP of the 1080 Ti was $699.
      Ngreedia is laughing all the way to the bank every time a schmuck buys one, lol. They're sitting comfortably on the shelves, not even scalpers are touching them. But enjoy it, I guess. LOL. You got a step-down, btw. From a "Ti" to a "non-Ti" 80-class, for much more money.

    • @paranikumarlpk
      @paranikumarlpk ปีที่แล้ว +3

      You could have easily chosen the 7900 XT, but you just made an excuse to stick with Nvidia lol, ggs

    • @dededede9257
      @dededede9257 ปีที่แล้ว +6

      @@paranikumarlpk He made the right choice; for almost the same price the RTX 4080 is better than the XTX and doesn't have issues like 100W idle power with multiple monitors

    • @vaghatz
      @vaghatz ปีที่แล้ว +2

      ​@@paranikumarlpk DLSS

  • @metroplex29
    @metroplex29 ปีที่แล้ว +8

    that's why i preferred to go for the 6800xt with 16GB vram

  • @friendofp.24
    @friendofp.24 ปีที่แล้ว +28

    Heavily regretting buying the 3070 now. I camped outside of a Best Buy for 20 hours and had the option to choose any 30 series. I didn't understand at the time how much VRAM mattered.

    • @HUNK__S
      @HUNK__S ปีที่แล้ว

      😂😂 suck to be you

    • @soumen8624
      @soumen8624 ปีที่แล้ว +16

      It's not your fault; VRAM really didn't matter until very recently.

    • @Stephan5916
      @Stephan5916 ปีที่แล้ว

      @friend You live and you learn. Vram always mattered.

    • @Stephan5916
      @Stephan5916 ปีที่แล้ว +2

      @clockworknick9410 It's still Nvidia's fault. Before the 3080, their flagship card was the 2080 Ti, which had 11GB of VRAM. They should have at least matched or bettered that VRAM with the base 3080 model.

    • @naturesown4489
      @naturesown4489 ปีที่แล้ว +1

      @Clockwork Nick There were people saying at the time of 3070 release (hardware unboxed) that the VRAM wouldn't be enough in a couple of years. The 1070 had 8GB, the 2070 had 8GB.. so why would they not have put more on the 3070?

  • @Tripnotik25
    @Tripnotik25 ปีที่แล้ว +5

    5:52 During this interview the dev brings up stuff like "using VRAM to have high quality on all the body parts, like the eyes, in case someone looks closer", and I'm thinking we're supposed to pay $500+ for the sake of having 4K resolution on 30 body parts of an NPC. You're not kidding when you say niche. How about these devs make good games with gripping writing and stop shipping crap ports that rely on DLSS/FSR to cover up laziness? 95% of the market is going to continue using 8-12GB products, and indies will thrive.

  • @tech6294
    @tech6294 ปีที่แล้ว +133

    Great video! We need more people talking about this. If Nvidia and AMD made 24GB the midrange standard and 48GB the high end tomorrow, games would look photorealistic overnight. And no, VRAM doesn't cost that much; going from 12GB to 24GB is probably only about a $40 hike in price. These companies could easily make a $600 24GB card. They simply choose not to.

    • @Clashy69
      @Clashy69 ปีที่แล้ว +23

      AMD has already put enough VRAM on their cards, even the low-end 6000 series. Nvidia should do the same and put at least 12GB on their lower-end cards; I'd even be fine with just 10GB, but we'll see, since they gave the 4070 12GB.

    • @bronsondixon4747
      @bronsondixon4747 ปีที่แล้ว +27

      It'd make no difference if 24GB was the minimum; it just needs to be more VRAM than the current console generation has.
      Game developers won't take advantage of more than 16GB since that's all they have available in the PS5.

    • @66kaisersoza
      @66kaisersoza ปีที่แล้ว +8

      ​@@bronsondixon4747 the console ram is shared with the OS.
      Around 10gb is for the games and the other 6gb is dedicated to the OS

    • @luisgarciatrigas3651
      @luisgarciatrigas3651 ปีที่แล้ว +22

      ​@@66kaisersoza 13.5 for games, 2.5 for OS 👍

    • @retrofizz727
      @retrofizz727 ปีที่แล้ว +13

      24GB is an overreaction, wtf. You won't need 24GB for 4K before like 2030.

  • @franciscoc905
    @franciscoc905 ปีที่แล้ว +16

    I was definitely waiting to see what you'd contribute to the discussion. I see this as a negative for people wanting to play AAA games in 2023 with high details, but it will mean a fire sale of great deals on second-hand graphics cards for competitive gaming.

  • @seaspeakss
    @seaspeakss ปีที่แล้ว +11

    Tbh I was expecting this. When Nvidia decided to put 8 gigs into the 1070, I was amazed and looked forward to the future. But after the release of the 1080 Ti, Nvidia got really comfortable and hasn't really come out with a great card in terms of cost-to-performance (the 90-series cards are powerful, but super expensive, unlike what the 1080 Ti was back in its day). The 3070 STILL having 8 gigs of VRAM is what holds it back, and the 3080 only having 10 is also a major dealbreaker for me.

    • @Mr.Stalin116
      @Mr.Stalin116 ปีที่แล้ว

      Tbh I feel like 10GB is enough for the 3080 given its performance. I recently got a 4070 Ti 12GB, and I'm playing at 4K just fine. It does run out of VRAM in some games at 4K ultra with RT, like Cyberpunk with RT Overdrive, but those games would run at 20-30 fps anyway with more VRAM, so there isn't really any point in having more. And it sucks that there aren't many other options if you want to experience RT, since AMD just doesn't run RT well. After trying RT in Cyberpunk I was amazed by how much better it looks.

    • @y0h0p38
      @y0h0p38 9 หลายเดือนก่อน +1

      @@Mr.Stalin116 Right now, 10GB is plenty. What about the future, though? It's a higher-end card; you should be able to use it for years without any issues.

    • @KillerBsan_
      @KillerBsan_ หลายเดือนก่อน

      ​@@y0h0p38Just buy our new product u slave!!! - Nvidia.

  • @VisibleVeil
    @VisibleVeil ปีที่แล้ว +10

    VRAM is not niche, because those AAA titles will be on sale for the rest of us in 1-2 years' time. When that happens, how will we play them with the limited VRAM on these cards?

    • @gruiadevil
      @gruiadevil ปีที่แล้ว +1

      You won't. Lol. You'll buy a new, better, much more expensive card and thank big Daddy nVidia for giving you another shite product.

  • @Jerad2142
    @Jerad2142 ปีที่แล้ว +7

    One bright side of laptop gaming is that a lot of these chips have way more VRAM on them; for example, my 3080 laptop has 16GB of VRAM.

    • @dededede9257
      @dededede9257 ปีที่แล้ว +4

      So for the first time in history, laptop gaming will age better than desktop.

    • @spaghettiupseti9990
      @spaghettiupseti9990 ปีที่แล้ว +2

      @@dededede9257 Probably not; 3080 mobile GPUs don't perform like 3080 desktop cards, not even close.
      A 3080 mobile is about 40-50% slower than a 3080 desktop.

    • @whohan779
      @whohan779 ปีที่แล้ว +1

      Correct, @@spaghettiupseti9990, their naming is hugely misleading. Even a 3080 Ti mobile may be trounced by a 3060 Ti desktop depending on clocks (it's realistic).
      This is mostly because the RTX 3080 mobile is almost identical to 3080 Ti mobile, so they need the same memory bandwidth. While I'm sure Nvidia could explain away 8 GB for all 3080 mobiles (as they do for some), this wouldn't fly for the Ti models, hence they always have 16 GB on mobile.
      The mobile 3070 and up are (according to somewhat unreliable UserBenchmark) just 20% apart vs. 38% on desktop, so the only reason to pay up for an 80(👔) SKU (apart from higher power-limit) is the additional VRAM.

    • @UNKNOWN-li5qp
      @UNKNOWN-li5qp ปีที่แล้ว

      But a laptop with a 3080 will be like 3000 dollars, and at that point just buy a 3090 or 4090 lol

    • @Jerad2142
      @Jerad2142 ปีที่แล้ว

      @@UNKNOWN-li5qp Don't even know if you can get "new" ones with a 3080 anymore, one with a 3080ti and a 12800HX is about 2149.99 if you went Omen though. My 4090 laptop was about $3,300. But yea, definitely paying a premium for portability.

  • @rebelblade7159
    @rebelblade7159 ปีที่แล้ว +40

    I remember buying the GTX 960 4GB in 2015 for the equivalent of about $200 brand new. That amount of VRAM was considered overkill by many, but it allowed me to use the card all the way up to 2021. VRAM matters a lot if you want to use a GPU for a long time.

    • @FatheredPuma81
      @FatheredPuma81 ปีที่แล้ว +3

      At the time you gained almost nothing for whatever extra you paid, and now you're gaining around 15% extra performance for it.
      You could have just put the money saved into a savings account, waited a few years, sold your 960 2GB, and gotten a 970 3.5GB for the same overall cost.

    • @jacobhargiss3839
      @jacobhargiss3839 ปีที่แล้ว

      @@FatheredPuma81 That assumes the price actually does drop and you can find the cards.

    • @FatheredPuma81
      @FatheredPuma81 ปีที่แล้ว

      @@jacobhargiss3839 Always has always will.

    • @FatheredPuma81
      @FatheredPuma81 ปีที่แล้ว

      @@DeepfriedBeans4492 Looking at the Wayback Machine, it was around $40 more. Toss that into a half-decent savings account (not your local garbage bank) and that turns into $44 minimum in 5 years.
      The GTX 970 was under $100 just before the mining craze, and I'd guess the GTX 960 2GB was at the very least above $55. I actually sold a GTX 960 that summer and bought an RX 580 8GB for $120, but can't remember how much I sold it for. (Sold the RX 580 for $120 a year later though, lol.)
      Sucks to be you, I guess, if you're too terrified of potentially being mugged at a McDonald's in broad daylight, with people around, over $60. Sounds like you live in Ghetto-Siberia or something. I'd suggest moving.
      P.S. Do Ghetto-Siberian shippers not let you pay $2 to print a label? Do Ghetto-Siberians not order things online and have loads of boxes and packing materials lying around? Does Ghetto-Siberian eBay not give you free shipping for broken items?

    • @DeepfriedBeans4492
      @DeepfriedBeans4492 ปีที่แล้ว

      @@FatheredPuma81 please tell me what savings account you use that gives 110% returns in 5 years because the only kind of accounts I know of with that much potential are not what I would call a ‘savings account’, and most certainly do not come without large amounts of risk, and are also called ponzi schemes and are illegal.

  • @brandons9138
    @brandons9138 10 หลายเดือนก่อน +4

    Nvidia doesn't design these cards in a vacuum; they're talking to developers and working with them. I'm thinking Nvidia has something cooking that will help mitigate the VRAM issues. I can't see them making 8GB the baseline knowing full well that games won't run well on it.

    We're all used to having cards with 12-24GB of VRAM, because in the past GPU performance was based solely on brute-force rasterization. With technology like DLSS and FSR I think we'll see that change somewhat. I just benchmarked my base-model 4060 with 8GB, and it runs Cyberpunk 2077 at 1080p on ultra with DLSS at an average of 90 FPS; even without DLSS it was over 60 FPS.

    I'm thinking both Nvidia and AMD are working to make DLSS and FSR a way to keep performance high without resorting to massive, power-hungry chips that are too expensive to make and don't sell as well because of the price. The reason we're seeing some games crush these cards is that they were not developed with DLSS and FSR in mind. It may have been patched in, but who knows how well optimized that is.

  • @Saltbreather
    @Saltbreather ปีที่แล้ว +23

    If you go into the settings for MSI afterburner/RTSS, you can enable dedicated and allocated VRAM. That’ll give you a more accurate number when looking at how much VRAM a game is actually using.

    • @r3tr0c0e3
      @r3tr0c0e3 ปีที่แล้ว

      By that account, if you had 48GB the game would allocate that much if needed, so yeah, we need 64GB of VRAM now lol.
      Funny how all the recent RE games supposedly use more than 8GB of VRAM, yet they run smooth and without any stutters, even while the settings screen indicates the VRAM limit is exceeded.
      RE4 Remake didn't crash because of that (it was eventually fixed anyway); you'll get small stutters and dips if you enable RT, even on a 4090, so VRAM is not the issue in this case. Far Cry 6, however, will destroy your performance if you enable ultra textures with only 8GB of VRAM, and they look pretty much like high anyway lol. The RE games will use system RAM to compensate; many games actually do that, simply because recent consoles have shared memory (GDDR6, but still). Lazy devs simply can't be bothered to port games properly, hence the VRAM rage.

  • @Einygmar
    @Einygmar ปีที่แล้ว +54

    VRAM is a problem, but optimization affects this issue as well. Better texture/asset streaming and more optimized BVH structures for ray-tracing acceleration would fix a lot of problems. I think the bigger issue is memory bandwidth on modern cards, which limits throughput and hurts streaming capabilities.

    • @Vasharan
      @Vasharan ปีที่แล้ว +6

      Yes, but games will continue to be unoptimized, as long as every developer isn't John Carmack* or Takahisa Taura (Platinum Games), and as long as studios have deadlines and cashflow constraints.
      As a consumer, you can either not buy unoptimized games, or not buy underprovisioned hardware, or some combination of both.
      * And even Carmack's Rage was a buggy mess for years as he tried to get texture streaming to work seamlessly

    • @gagec6390
      @gagec6390 ปีที่แล้ว +4

      @@SkeleTonHammer That's just not true though. 4K monitors and especially TVs have become very affordable; 1080p is only still the standard among e-sports players and extreme budget builds or laptops. Most people who are even somewhat serious about PC gaming have at least a 1440p monitor, and the fact is that anything under 12GB of VRAM just isn't enough even for the near future, much less the 4-5 years most people keep their graphics cards. If you paid more than $300 for an 8GB card recently, then you got fucking scammed. (I would know; I unfortunately bought a 3060 Ti a year ago instead of a 6700 XT.)

    • @r3tr0c0e3
      @r3tr0c0e3 ปีที่แล้ว

      @@gagec6390 1440p and even 1080p look playable on OLED; upscaling made it possible.
      I'm never going back to a garbage TFT panel.

  • @bladimirarroyo8513
    @bladimirarroyo8513 ปีที่แล้ว +8

    Man, I'm about to buy my first GPU and your videos are answering all my doubts.
    Thank you so much 🤗

    • @Ghostlynotme445
      @Ghostlynotme445 ปีที่แล้ว

      @@z3r009 if you got a console buy another console

  • @ComradeChyrk
    @ComradeChyrk ปีที่แล้ว +6

    I have a 3070 and I was concerned at first about the 8GB of VRAM, but so far I haven't had any issues. I play at 1440p, but I was never really interested in things like ray tracing or playing at ultra settings. As long as I can play 1440p at 100 fps, I'm happy with it.

    • @David-ln8qh
      @David-ln8qh ปีที่แล้ว

      I bought my 3070 for $1000 deep into the card-pocalypse, when 3080s were in the $1500-$1600 range. I'm frustrated about the situation, but I still feel like I didn't really have many options and was probably better off pocketing that $500-600 for my next card, which I'm hoping is at least a couple years out.
      For the record, I also play at 1440p at 80-120 fps and haven't yet run into problems.

    • @ComradeChyrk
      @ComradeChyrk ปีที่แล้ว

      @@David-ln8qh I'm glad I waited cause I got my 3070 at sub 600$. I was holding out with a 970 until the prices dropped. I got my 3070 about a year ago.

    • @Trainboy1EJR
      @Trainboy1EJR ปีที่แล้ว

      @@ComradeChyrk Still, wouldn't it have made more sense to go with a 16GB AMD card if you weren't going to bother with ray tracing?

    • @ComradeChyrk
      @ComradeChyrk ปีที่แล้ว

      @@Trainboy1EJR I wanted DLSS. Plus it was in my price range; the AMD equivalent (6700 XT) was roughly the same price but didn't perform as well.

  • @j.rohmann3199
    @j.rohmann3199 ปีที่แล้ว +4

    I've actually never had problems so far with my 3060 Ti... it does amazing for me at 1080p and decent at 1440p. I'll still be using it in like 4 years (if I live that long).

    • @j.rohmann3199
      @j.rohmann3199 ปีที่แล้ว +1

      @@VulcanM61 Damn, I was going to do the same thing!
      From a 5600X to a 5800X3D... but maybe I'll just go for a 5900X instead. I haven't decided yet.

    • @j.rohmann3199
      @j.rohmann3199 ปีที่แล้ว +1

      @@VulcanM61 Epic!
      Yeah the X3D versions are crazy good. And pretty future proof!

  • @stephenpourciau8155
    @stephenpourciau8155 ปีที่แล้ว +4

    One little flaw: you didn't turn on the setting that shows "memory usage \ process". That option in Afterburner/RTSS shows the ACTUAL VRAM usage of the application, not what is allocated on the whole card.

  • @konstantinlozev2272
    @konstantinlozev2272 ปีที่แล้ว +3

    The real problem with high VRAM requirements, and even with ray-tracing requirements, is not (!) that they tax your hardware; it's that the visual output is very underwhelming for the uptick in hardware requirements.
    You seem to be younger, but I remember the Crysis WOW moment, when we saw what kind of visual fidelity was possible.
    I fired up Titanfall 2 yesterday and on high it is a stunning game. Fake reflections and all, but you know what? It runs on 6-7-year-old mid-range hardware and looks just gorgeous.

  • @lukasbuhler1359
    @lukasbuhler1359 ปีที่แล้ว +8

    Planned obsolescence go crazy

  • @NootNoot.
    @NootNoot. ปีที่แล้ว +54

    I have to agree with you on the whole 'VRAM niche' point. I don't usually play AAA games that tax my GPU, although I do run workloads other than gaming that need a lot of VRAM. Still, I think this whole VRAM fiasco is a very important thing to discuss. Nvidia's planned obsolescence should be put to a stop; give consumers what they NEED for what they PAID for. Like you said, performance is what should scale with VRAM.
    The 1070 doesn't need any more VRAM because of how it performs, unlike the 3070, which should be able to play at 1440p+ and shouldn't be bottlenecked by memory, causing stutters, instability, or games that won't even boot. It's a business move, and it totally sucks. While these videos may seem 'repetitive' or 'controversial', I appreciate you making this.

    • @johnny_rook
      @johnny_rook ปีที่แล้ว +5

      Define "niche".
      AAA games sell by the millions on PC and new consoles have at least, 12GB VRAM addressed to GPU with a 4yo RTX 2070 tier GPU. People (both devs and players) will use high res. textures if they can, regardless of GPU power and textures are the biggest contributor to VRAM usage.
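      The claim above, that textures dominate VRAM, is easy to sanity-check with back-of-the-envelope arithmetic: an uncompressed RGBA texture costs width × height × bytes-per-pixel, plus roughly one third extra for its mipmap chain. (Real games use block compression such as BC1/BC7, which cuts this by 4-8x, so treat these numbers as an upper bound.) A minimal sketch:

```python
def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM footprint of one uncompressed texture.

    A full mip chain adds a geometric series 1 + 1/4 + 1/16 + ... ~= 4/3
    of the base level's size.
    """
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# One uncompressed 4096x4096 RGBA texture with mips, in MiB:
print(texture_vram_bytes(4096, 4096) / 2**20)  # ~85.3 MiB
# A hundred such textures resident at once, in GiB:
print(100 * texture_vram_bytes(4096, 4096) / 2**30)  # ~8.3 GiB
```

      A few hundred unique 4K-resolution textures is entirely plausible for a detailed scene, which is why compression and streaming, not just raw capacity, decide how far 8GB actually goes.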

    • @Lyka-clock
      @Lyka-clock ปีที่แล้ว

      So far my 3060 Ti works well for mostly PC-type games like RTS and some RPGs. I'll use a console for AAA games, but these days most of it is trash really, and the ports haven't been that great to begin with. I tried Returnal on PC and it was stuttering no matter what settings I used, and it wasn't a VRAM issue either. I already played Dead Space, RE4, and TLOU. Maybe there needs to be a focus on making new IPs with great gameplay; that would be a wonderful idea! Let's hope Diablo 4 is actually fun. The demo was good but not great, and it isn't like a new IP or anything.

    • @arenzricodexd4409
      @arenzricodexd4409 ปีที่แล้ว +2

      @@johnny_rook Millions of those PCs don't have a GPU with 12GB of VRAM. In fact, more than half of them are probably only using an iGPU.

    • @johnny_rook
      @johnny_rook ปีที่แล้ว

      @@arenzricodexd4409 Yeah, and not having enough VRAM is the issue, isn't it?
      How do you know that, exactly? Isn't it funny when people pull numbers out of their asses without a shred of evidence to support them?

    • @alanchen7757
      @alanchen7757 ปีที่แล้ว +1

      @@johnny_rook The proof is that AMD's GPUs competing against the 3070, 3080, etc. outlast Nvidia's due to having more VRAM.

  • @ner0718
    @ner0718 ปีที่แล้ว +5

    Spending a lot of time in Blender (3D modelling and rendering), being stuck on a 6GB card is incredibly frustrating, as I can't render most of my scenes on the GPU because my VRAM fills up. I can't upgrade, as I'm still a student and don't have the money for a new GPU.

  • @VDavid003
    @VDavid003 ปีที่แล้ว +7

    I'm just glad that my 3060 that I bought used has 12gb of VRAM.
    I actually wanted to have as much vram for the money as possible, since last time I went with a 3gb 1060, and in the end that 3gb bottlenecked the card in some cases.

    • @0xEF666
      @0xEF666 5 หลายเดือนก่อน

      same

  • @capnmoby9295
    @capnmoby9295 ปีที่แล้ว +4

    It's probably the devs becoming more and more complacent and the games becoming more and more complicated.

  • @terkiestorlorikin5958
    @terkiestorlorikin5958 ปีที่แล้ว +8

    7:39 The people who get affected by VRAM are the ones who like to run games at max settings. Seriously, in 90% of games the differences between High and Ultra are barely noticeable; Alex from Digital Foundry does amazing videos showing optimized graphics settings, and most of the time you have to zoom in 200% to spot the difference between High and Ultra. I understand that VRAM might be an issue in the future, but some people should chill a little bit and ask themselves: "Do I really need to run clouds at Ultra settings? Do I really need to run this specific setting at Ultra?"

  • @16xthedetail76
    @16xthedetail76 ปีที่แล้ว +5

    My GTX 980ti will be going hopefully for another 2 years...

  • @Skylancer727
    @Skylancer727 ปีที่แล้ว +11

    I completely disagree that this is a niche issue. I do agree AMD advertising higher VRAM hasn't helped them, but this is an incredibly serious issue. People bought 3070s, 3080s, and 3060s only a couple years ago; those cards have the power to play newer games, yet it won't work. Remember that some even bought 3070s for over $800 during the mining gold rush. That's an incredibly rough place to be in, especially since most people keep a GPU for over two generations and even the 40 series is low on VRAM. Even at MSRP the 3070 was $500, the same as the current-gen consoles that are running these games, and again, this GPU is objectively faster. This could scare people away from PC gaming just after it recently took off again.
    And yes, it's only AAA games, today. But when games start adding features like DirectStorage (which they most likely will), even 12GB will be in a tough spot. Hell, even Satisfactory announced moving to UE5 for its new map loading systems and Nanite; more games are going to continue doing this. And many people play at least one AAA game. Did you see the player counts for Hogwarts Legacy? They announced over 3 million sales on PC alone in the first week. Games like COD will also likely become similar in the next 2 years once they drop the PS4 and Xbox One, likely with the next COD game.

    • @DenverStarkey
      @DenverStarkey ปีที่แล้ว +2

      I just bought a used 3070 in October of '22, and I already feel like Nvidia is standing on my dick. RE2R and RE3R both go over the VRAM limit and crash when ray tracing is on.

  • @laszlomiko9085
    @laszlomiko9085 ปีที่แล้ว +2

    I'm running Hogwarts Legacy on an RX 6600 XT at 1080p. With no upscaling and the high preset with RT off, VRAM usage was around 7.x GB; after changing textures to medium, it's usually under 7GB. At 1080p you can't notice anyway; the game was made with 4K in mind. That's where you need ultra textures (and also where you need a high-end GPU); high is probably fine at 1440p.

    • @TheTeremaster
      @TheTeremaster ปีที่แล้ว +1

      TBH I feel like if you're shelling out for a high-refresh 4K monitor, you can afford an XTX or a 90-class card

  • @Lev-The-King
    @Lev-The-King ปีที่แล้ว +18

    Hope AMD takes the opportunity to clown Nvidia for this... They probably won't.

    • @nombredeusuarioinnecesaria3688
      @nombredeusuarioinnecesaria3688 ปีที่แล้ว +8

      They did it at the time of the gtx 970 and its 3.5GB of Vram.

    • @lotto8466
      @lotto8466 ปีที่แล้ว

      @@nombredeusuarioinnecesaria3688 My 6750 XT has 12GB and plays Hogwarts at ultra perfectly

    • @jay-d8g3v
      @jay-d8g3v ปีที่แล้ว +1

      16GB on the 6900 XT, yum yum, and the newer 7000 series is pushing 24

  • @simon6658
    @simon6658 ปีที่แล้ว +7

    It's always good to force GPU makers to add more VRAM.

  • @ItsFreakinJesus
    @ItsFreakinJesus ปีที่แล้ว +4

    Adjust your settings and it's a manageable issue, even with AAA games. Shadows and lighting have massive VRAM hits with little to no visual difference at the higher settings, for example.

  • @voteDC
    @voteDC ปีที่แล้ว +2

    The VRAM issue only affects those who absolutely must run at top settings. I have a secondary PC I use just for media: an i7-4770K, GTX 970, and 8GB of DDR3 (in single channel), so not exactly a modern system. It runs Hogwarts Legacy at 1080p, all low settings (draw distance on high) with Quality FSR, and rarely drops below 60FPS. Sure, it doesn't look as good as it does on my main gaming PC, but it still looks and runs great.

  • @Sybertek
    @Sybertek ปีที่แล้ว +6

    No regrets with the 6800XT.

  • @StuffIThink
    @StuffIThink ปีที่แล้ว +5

    Just got a 6650xt. Don't really care if I can run games on ultra in the future. As long as it holds out a few years playing on medium or higher I'll be happy.

    • @gruiadevil
      @gruiadevil ปีที่แล้ว +4

      Yes. But you bought it cheap.
      Look at how much a 3070, 3070 Ti, 3080, or 3080 Ti costs. Those are the ones discussed here, not the cheap products.
      When you buy cheap, you tell yourself: "If I can play new games at a combo of medium/high settings so I can enjoy the game, I'm satisfied. If it lasts me for the next 2-3 years, I'm satisfied."

    • @StuffIThink
      @StuffIThink ปีที่แล้ว +6

      @@gruiadevil he asked people with 8 gb cards what they thought. Just answering his question.

    • @vanquishhgg
      @vanquishhgg ปีที่แล้ว

      Haven't had any issues with my 6650 XT Hellhound. It'll last me another year until I do a full rebuild.

  • @hyxlo_
    @hyxlo_ ปีที่แล้ว +2

    Devs are not optimizing their games and we are blaming GPU manufacturers 🤦‍♂️

  • @DeadlyKiller54
    @DeadlyKiller54 ปีที่แล้ว +7

    Just seeing this makes me super glad I got my 6700 XT for $360, with the 12GB of VRAM it has. Yeah, not an Nvidia card, but she still performs well and streams decently.

    • @Trainboy1EJR
      @Trainboy1EJR ปีที่แล้ว

      Seeing this makes me even happier to have a $240 12gb RTX2060. XD Although it is almost certainly going to be my last Nvidia card, Intel is looking super good with ARC right now. Hopefully they teach Nvidia the lesson AMD hasn’t been able to. And with EVGA out of the scene, I completely understand not wanting to touch the 40xx series! Honestly I’m surprised more board partners didn’t “nope” out of this generation. XD

    • @GrainMuncher
      @GrainMuncher 4 หลายเดือนก่อน

      ​​@@Trainboy1EJRVRAM doesn't matter if the card is too weak to even use it. There's a reason the 3060 Ti 8gb destroys the 3060 12gb

    • @Trainboy1EJR
      @Trainboy1EJR 4 หลายเดือนก่อน

      @@GrainMuncher Destroyed? HA! 65fps vs 71fps is just the 192-bit bus vs the 256-bit bus. I've learned to go with the card with the most VRAM and have never been disappointed! 1GB GT 220, 2GB GT 640, 4GB 1050 Ti (laptop), 12GB RTX 2060. Let me repeat that: NEVER DISAPPOINTED!!! I will never sacrifice textures to play a game, because textures have zero impact on performance if you have the VRAM for them.

  • @admiralcarrot756
    @admiralcarrot756 ปีที่แล้ว +9

    Meanwhile, the AMD 6000 series offers VRAM based on card tier: the 6500 gets 4GB, the 6600 8GB, the 6700 12GB, the 6800 16GB, and lastly the 6900 16GB too.
    The Nvidia 3000 series is like... 3050 8GB, 3060 8GB, 3070 8GB, 3080 10GB, 3090 24GB. See the problem there?

    • @user78405
      @user78405 ปีที่แล้ว +2

      Having 8GB can feel like having 12GB in games, while 10GB can feel like 16GB; the difference is in methodology, between engines that use memory conservatively and don't tax your system, and developers who lazily take more VRAM to cover up a bad game's flaws, like Forspoken, which I find embarrassing for a supposedly quality AAA title. The Doom brand is still carefully respected and runs on all cards today, even 4GB GPUs, thanks to the genius work Carmack put into idTech; wish every developer had that kind of brains for making games. With something like Forspoken, you can tell a rude, super-lazy developer can bring a whole team down with the bad energy they spread. It reads like employees who don't like their job.

    • @VDavid003
      @VDavid003 ปีที่แล้ว +1

      Actually, the regular 3060 is 12gb which is even weirder.

    • @silvershines
      @silvershines ปีที่แล้ว

      Overall the line-up isn't too weird once you realize the original plan was for an RTX 3080 20GB. But then the crypto boom happened, and Nvidia decided to chase sales volume for that sweet crypto cash: better to produce more cards than a good product.
      The past few crypto booms (which were isolated to AMD cards) also showed that regardless of what you do, you'll have to deal with a bunch of cheap second-hand cards cannibalizing your new-card sales. So since your company is going to be the baddie anyway, might as well raise the price and condition people to accept higher prices.

  • @Ben256MB
    @Ben256MB ปีที่แล้ว +12

    I don't think it's a problem, because games are graphically more realistic and the texture sizes are bigger too.
    Remember the Unreal Engine 5 tech demo on PS5? I knew then that 1440p might consume all 8GB or more.
    Let's keep it extra real: in most games there's very little difference between ultra and high settings.
    Just drop the settings to high at 1440p or 4K on an 8GB card. People are too sensitive!!

    • @OffBrandChicken
      @OffBrandChicken ปีที่แล้ว +3

      Or just add more VRAM and don't have the issue to begin with.
      See, I'm not even remotely worried about my card anytime soon, and I'll be able to run my games at higher settings than Nvidia owners because of it.
      I can crank up settings without issue.
      I just find it ironic that a worse-performing GPU is performing better now.

    • @gruiadevil
      @gruiadevil ปีที่แล้ว +4

      You can't ask people to crank down settings.
      I paid the Nvidia tax.
      I paid the 3000/4000-series tax.
      I expect to play a game at the highest possible settings at the resolution of its tier.
      You can't charge extra and deliver less just so people buy another GPU in 2 years' time, because you want to sell more GPUs.
      It's the same mentality General Motors had in the '70s and '80s, when they started making cars that break down in 10-15 years. And everyone followed suit.
      If you buy a BMW made before 2005, it's a tank.
      Any BMW made after that will start breaking piece by piece.

    • @Ben256MB
      @Ben256MB 1 year ago

      @@OffBrandChicken Lol, VRAM can't be added later because it's soldered onto the GPU's PCB!!

    • @Ben256MB
      @Ben256MB 1 year ago

      @@gruiadevil Lol bruh!! Because you bought a low-end or mid-range GPU, you don't get the same benefits as someone who paid $1700 for a 4090.
      You have the choice of buying an AMD GPU, which has more VRAM for a lower price at slightly lower performance.

    • @OffBrandChicken
      @OffBrandChicken 1 year ago +1

      @@Ben256MB Are you serious? I'm saying add more VRAM to begin with. How was that hard to understand?

  • @Electric_Doodie
    @Electric_Doodie 1 year ago +2

    Unpopular opinion: VRAM isn't the issue, the consumer is.
    We can all talk about how bad Nvidia's pricing is, and it is bad. But so is AMD's with the current 7000 series.
    Back in the day, far more people thought of the different 10xx/20xx Nvidia cards in "tiers".
    The 3070 Ti was pretty powerful, as was the 3060 Ti and so on, but by that standard they are all low-to-mid-tier cards. We are now about two years past their release; playing a AAA game at ULTRA settings at 1440p/4K isn't their purpose anymore, even if some influencers still test them that way.
    Does the extra VRAM of AMD's 5000/6000 series do them any favors, though? Sure, you can probably set them to 4K/ultra, unlike Nvidia, and avoid the VRAM bottleneck in some games, but at what frame rates?
    The 6750 XT, even with more VRAM than a 3070 Ti, performs worse in HUB's Hogwarts Legacy testing (average and low FPS), and the 6800 XT performs worse than a 3080, their rough counterparts on the green side.
    There are obviously other interesting things to note: power consumption, performance outside gaming (DaVinci, Blender, etc.), drivers.
    12GB of VRAM on a 4070 Ti doesn't seem like plenty, but it's a 1440p card rather than the 4K card so many people hype it as, which they completely forget.
    If you don't care about anything besides gaming, don't need special features (DLSS, Reflex, ray tracing, etc.), and don't mind the power consumption (which means more $$$ spent over time), AMD might be the right choice for you, especially if you can get the cards cheaper (here in Germany, a 7800XT is ~100€ cheaper than a 4070 Ti).
    Always vote with your wallet, not for red or green because someone said so or influenced you.
    People just blindly follow their favorite influencer's choice and don't think much anymore, it seems.

  • @Hakeraiden
    @Hakeraiden 1 year ago +6

    At this point I'm scared to buy the 4070, which will be released soon. Not sure if I should wait for reviews of AMD's 7800 XT or 7700 XT instead. I recently upgraded my PC completely after more than 9 years; I finally want to play games and not rely on streaming (which is still awesome as a fallback). For now I'll play at 1080p, but I'm considering upgrading to 1440p in a year or less.

  • @ElladanKenet
    @ElladanKenet 1 year ago +6

    I upgraded from a 4GB GTX 960 to a 3060 Ti in early 2021, and went from 720p to 1080p. The improvements were staggering, and it's still mostly impressive two years later, but there are a few games, like HL, that punish my system.

    • @jayclarke777
      @jayclarke777 1 year ago +1

      Went from a 1050Ti to a 3060Ti. It was like going from VHS to Blu-ray

    • @Trainboy1EJR
      @Trainboy1EJR 1 year ago

      @@jayclarke777 As someone who had a 2GB GT 640, I gotta say that textures high, shadows low, everything else off looked really good. I never went past 40% GPU usage because of a CPU bottleneck. XD Played Lost Planet 2 totally cranked at 17fps; had the game memorized from PS3, and it looked AMAZING on PC. Just Cause 2 ran at 24fps in the city at only ~15% GPU usage. XD
      Upgraded to a 12GB 2060. The most VRAM at the minimum price, and high textures are what matter. Can't wait to finish going through all my favorites in 4K 120fps! XD

  • @StraightcheD
    @StraightcheD 1 year ago +4

    10:33 It obviously depends on what you want to play, so I think you're technically right. I think people worry because nobody likes to end up in a niche by accident and be told that the game they want to play next just happens to be one of the dozen titles to avoid on their $600 card.

    • @dashkataey1740
      @dashkataey1740 1 year ago +1

      This. When you spend that much on a card, you kind of expect it to last you a few years and be able to handle new titles for a while.

    • @r3tr0c0e3
      @r3tr0c0e3 1 year ago

      @@dashkataey1740 The problem is it's not 2016 anymore, and big corps don't care (if they ever did); they will just double down at this point.
      Consoles might be the best option for triple-A players. It will run like a turd, but at least you didn't pay $1000 instead of $500 for a fake-4K 30fps experience. All this DLSS and FSR is fake resolution and fake frames at this point.
      The rest can be played on a potato anyway.

  • @FrankCrz-pg1mu
    @FrankCrz-pg1mu 1 year ago +2

    I'm having this problem now with my 8GB RTX 3070: I have enough power to play at 1440p ultra/high settings but not enough VRAM for the game to be stable. Tbh, I'm starting to think even 16GB of VRAM won't be enough for next-gen games.

    • @Kage0No0Tenshi
      @Kage0No0Tenshi 1 year ago +1

      16GB is going to be the new mainstream VRAM for 1440p, like 8GB became 9 years ago. I'm going to sell my RTX 3070 under price just to be rid of it fast, around 440€ with free shipping here in Sweden, and buy a new RX 6750 XT for 550€ or a used RX 6800. The only game struggling for me now is MW2: at 1440p with mixed medium/high settings it hits 8GB easily, and I'm basically at 6 to 7GB usage.

  • @LordAshura
    @LordAshura 1 year ago +2

    The problem is that when people pay good money for a 3070/3080, they expect it to do high resolutions with RT. But high resolutions and RT bring high VRAM usage, and it is increasingly clear that these cards will not perform to expectations in the future.

  • @hyena-chase2176
    @hyena-chase2176 1 year ago +7

    I find 12GB of VRAM about right and wouldn't go any lower when buying a new GPU in 2023. I play a few games like Rust that use 10+ GB at 1440p, so the more VRAM the better, imo.

    • @Pimp_Shrimp
      @Pimp_Shrimp 1 year ago

      Jesus, Rust got chunky. I used to play it on a 970 (albeit at 1080p) just fine many years ago.

  • @AntiGrieferGames
    @AntiGrieferGames 1 year ago +10

    Stop the VRAM drama and tell the devs to optimize their AAA games like indie games do!

    • @naipigidi
      @naipigidi 1 year ago +5

      I'm putting you on the list of people with a brain.

    • @AntiGrieferGames
      @AntiGrieferGames 1 year ago

      @@naipigidi I don't get what that means, lmao.

  • @DonreparD
    @DonreparD 1 year ago +1

    I noticed the Obsidian dev who was on MLID wasn't complaining.
    It's typically the VFX developers who want more VRAM, in the hope that more people will be impressed with their work on the explosions and fire effects. As far as I'm concerned, those developers would be happier working for _Scanline VFX_ .
    If people want to be "immersed" in visual effects, watch a movie or play one on a PlayStation.
    *This is just my opinion. I feel like the "EVERYONE & EVERYTHING NEEDS MORE VRAM" narrative is getting a bit ridiculous.

  • @gorytarrafa
    @gorytarrafa 1 year ago +2

    My setup: an i7-10700K CPU, an RTX 3070 GPU and 16GB of RAM. The Last of Us Part 1 won't even start on my computer; an error message appears saying there's not enough VRAM. "Modern gaming."
    The Last VRAM of Us!

  • @aeneasoftroy9706
    @aeneasoftroy9706 1 year ago +3

    Next gen is here, and that's it. Moving forward, you're just going to need high-end hardware to play the latest games on PC. That doesn't mean you can't PC game; it just means you need to have proper expectations for your hardware.

  • @OffBrandChicken
    @OffBrandChicken 1 year ago +4

    You know, the only people who try to even remotely justify 8GB nowadays, even on lower-end graphics cards, tend to be Nvidia users.
    Notice that? I think some people are coping, hoping they didn't make a bad decision going with Nvidia this time.

    • @gruiadevil
      @gruiadevil 1 year ago

      The amount of copium is high in these comments :))
      I noticed that too.
      And they're not just nVidia users. They're nVidia users who bought a 3070/3080 card during the boom. Of course they're going to lower settings; they're still paying off the loan they took out to purchase that card :)))

    • @OffBrandChicken
      @OffBrandChicken 1 year ago +1

      @@gruiadevil It's crazy how predictable their responses are, like "most top Steam games don't even require it."
      As if the reason those games are still the most played isn't that the players' graphics cards can't handle more.
      Most "gamers" are gaming on laptops or prebuilts with lower-end graphics.
      People who are building or upgrading are doing so with the intent of playing modern games, because you wouldn't spend that kind of money otherwise.

    • @David-ln8qh
      @David-ln8qh 1 year ago

      Don't you need 8GB to say 8GB works for you?

  • @adventtrooper
    @adventtrooper 1 year ago +1

    My theory is that Nvidia is betting on BaM (direct GPU access to the SSD) for the 50 series, and doesn't want to set a precedent by putting more VRAM on the 40 series and then cutting it again on the 50.

    • @Apurbo_
      @Apurbo_ 1 year ago

      Good one

  • @hasnihossainsami8375
    @hasnihossainsami8375 1 year ago +1

    The problem with the example at 8:35 is that most of those are either multiplayer games that, besides needing little memory, require very little power to run, or far older games that likewise need very little power and memory on modern hardware. That's why these games top the Steam charts: the GPUs needed to run them at acceptable quality are easily accessible and/or affordable for the vast majority of people. You don't need to spend an entire PC's worth of money on just the GPU to get a decent experience out of them.
    Yes, people who play AAA games are a niche; it's an expensive hobby. But then again, people who buy GPUs for $600-700 or more are just as much a niche. They buy these cards *because* they expect to be able to run games that the vast majority cannot. It isn't unreasonable to think that the majority of people who own a recent $600+ GPU would want to try out at least one AAA game during its relevancy, and this is where the argument falls apart.
    If I'm spending that kind of money on hardware, I expect to be able to play games comfortably without resorting to unacceptable graphics quality, and the hardware should meet those expectations. If the argument against this is "don't play AAA games," then one might as well say "don't buy expensive GPUs." Let's all just stick to our GTX 1060s and play Apex and Dota forever.
    Or buy AMD.

  • @MrGalax00
    @MrGalax00 1 year ago +3

    I wanted a 3070 or 3060 Ti for my HTPC but ended up getting a 6700 XT because it was cheaper and had more VRAM. On my main PC I'm using a 3090 FE, so I don't have to worry about VRAM usage there.

  • @c523jw7
    @c523jw7 1 year ago +5

    You bring up some good points here. All that matters is your personal experience and your card fitting your purpose. 10GB has been more than enough in the games that I play, and I'm really happy with my experience. Now, I do think Nvidia sucks for not giving their cards enough VRAM; there's no excuse for that. Four-year upgrades seem about right; it's just a shame VRAM usage has really spiked in the last few games. Best to future-proof any card purchase moving forward, though.

  • @nyrahl593
    @nyrahl593 1 year ago +2

    The EVGA 3090s had their VRAM in clamshell mode, and it's frustrating, because those cards clearly did not have issues running that way. So I really wonder how much more it would cost vendors to double the VRAM on current chips, since the x070 series must (JEDEC standards and all) support up to 16GB, and the x080s 20GB.

  • @Andypooh37
    @Andypooh37 1 year ago +2

    8GB of VRAM for 1440p is starting to die out in 2023, but it'll be fine for 1080p for years to come. And video game graphics haven't improved that much in the last decade.

    • @zdspider6778
      @zdspider6778 1 year ago +1

      Bruh. You're okay with paying $600 for a GPU that can only game at 1080p?
      Anything above $250-300 for 1080p should be laughed at.

    • @Andypooh37
      @Andypooh37 1 year ago

      @@zdspider6778 I didn't say anything about GPU prices. Paying $600 for a GPU just to play at 1080p from the start is not okay, though some people may be fine with it. You're expecting 8GB of VRAM to hold up at 1440p for years to come; I wonder what Ngreedia would say about that.

  • @tylerstokes6722
    @tylerstokes6722 1 year ago +3

    I don't see VRAM being a huge problem, but I do feel all new GPUs this generation should have been at least 12GB, or 10GB for entry level.

  • @user78405
    @user78405 1 year ago +3

    A lot of people said that adding an extra 16GB of system RAM to their 3070 systems fixed the majority of frame stuttering in The Last of Us and the RE4 remake. I noticed TLOU using 24GB of system RAM; what a shocker. Now I know why my 3070 dipped to 20fps: 16GB of RAM was bottlenecking the data being processed. With 32GB the stutter and mid-game shader loading stopped, and my minimum fps went up to 41.

    • @OffBrandChicken
      @OffBrandChicken 1 year ago +3

      When your GPU can't fit any more in VRAM, it spills over into system RAM. It's still a VRAM issue at the end of the day.

    • @Kage0No0Tenshi
      @Kage0No0Tenshi 1 year ago

      My system can reach 16GB of combined video memory usage, but only 8GB of it comes from my RTX 3070. Games crash or drop fps when VRAM isn't enough, and in your case neither your VRAM nor your RAM was enough xD

  • @warlock_r
    @warlock_r 1 year ago +1

    I miss the days when I'd open a game on my potato PC, get 15fps, turn down the graphics and just play. Now games are crashing on me while I have 100+ fps.

    • @bikechan9903
      @bikechan9903 1 year ago

      You can ...
      Turn down...
      The graphics still

    • @warlock_r
      @warlock_r 1 year ago

      @@bikechan9903 Because I haven't thought of that, lol. I have, actually; how do you think I'm getting 100fps? I'm talking about TLOU. It was crashing regardless.

  • @a2raya772
    @a2raya772 1 year ago +1

    The RE4 remake issue is RT causing it to crash. Even if you max VRAM past 14GB it won't crash unless RT is on. That's with everything set to max at 3440x1440, resolution scale pushed to 130%, no RT, and DLSS set to Quality. Not a single crash on my second playthrough on my 3080.

  • @nixboox
    @nixboox 1 year ago +2

    The issue isn't that deep. You're talking about needing... let me rephrase that... "needing" to upgrade your graphics card when you have one from the LAST generation, because it doesn't "quite" run all the latest games at their maximum settings. For most of the real world this idea is insane. I have a pair of Nvidia GeForce 980s running in SLI that I have used since new. They have run every. single. game. I've wanted to play for the last ten years. It is only in the last year that I've found games I'm incapable of playing because the 4GB of VRAM on the cards is too little. No one who has a 30-series card needs to upgrade for any valid reason; those cards will run the latest games for the next 8 years with no problems. At this point I would consider upgrading to a 40-series card because my 980s are, what, five generations old? The problem you ALL have is that chasing the next newest thing will always leave you unfulfilled. Learn to be happy with what you have and be thankful you don't have less. That's a general rule of thumb for living your best life.

    • @3rd.world.eliteAJ
      @3rd.world.eliteAJ 1 year ago

      Amazingly put! These people want all the latest and greatest RTX features with 20GB of VRAM for the lovely price of $200... while simultaneously ignoring the fact that developers are just milking consumers with terrible ports and remakes year after year.
      These same people say that developers are now targeting console VRAM budgets. Yet even new console games are running into performance issues: Redfall is running at 30fps on PS5 & XSX without a 60fps performance mode available at launch. LOL... Somehow this makes 8-10GB obsolete? No, the developers are making PC gaming obsolete. Requiring 400W+ GPUs just to run games at 1440p 60fps is absolutely hilarious.

  • @Verpal
    @Verpal 1 year ago +9

    Considering only a single GPU was ever released with 10GB of VRAM, if the devs have already decided "screw 8GB, let's put in more textures!", I don't see how or why they would stop themselves at 10GB instead of the much more popular 12GB threshold.

    • @scythelord
      @scythelord 1 year ago +6

      They literally didn't even consider PC graphics cards when making the game. It's made to use what the Xbox Series X and PlayStation 5 can use, which is a lot more than 8GB. This isn't a result of them upscaling anything for the PC release or intentionally screwing anyone; they're just straight-porting the same game over.

  • @dazdaz2050
    @dazdaz2050 1 year ago +1

    @Vex nice video, I was hoping for this, especially as I have the 10GB 3080.
    Sorry if this has already been mentioned below, but I don't have the patience to read all the comments, and I have a lot to try to get across, so this post might be a bit disjointed, lol. The lack of VRAM is only half the story, and people with powerful cards like the 3080 and 3070 Ti should consider these facts well before upgrading:
    1. the memory bus/controller width on the GPU, and 2. the PC's system RAM speed.
    When you run out of VRAM, all that happens is the extra game assets spill over into system RAM. That's it. Yes, system RAM is traditionally a lot slower than VRAM, but a card with a large memory bus can swap data in and out of VRAM so fast it somewhat compensates for the lack of capacity. Tuning your system RAM will also help dramatically, and it's free to try.
    The 3080 has two things going for it: GDDR6X and a 320-bit bus. I personally overclocked my dual-rank, dual-channel DDR4 from 3200 CL16 to 3766 CL16 (the maximum my CPU's IMC could handle with four sticks), which I recommend everyone try well before upgrading something as expensive as a graphics card.
    On an Intel system my RAM speed could be 4000+. Faster RAM might set you back $100 once you factor in selling your old kit, compared to an extra $600+ for a new high-end GPU with 20GB+ of VRAM.
    Finally, cap your fps with RivaTuner to 60, or at least 10fps under what your card hits consistently, and you'll most probably find you're still good to go.
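The bus-width argument above is easy to put numbers on: peak memory bandwidth is just bus width times per-pin data rate. A rough sketch (the 19 Gbps figure is the RTX 3080's stock GDDR6X data rate, and the DDR4 line assumes a standard 128-bit dual-channel bus):

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gtps

# RTX 3080: 320-bit bus of 19 Gbps GDDR6X
print(peak_bandwidth_gbs(320, 19))     # 760.0 GB/s

# Dual-channel DDR4-3766: 128-bit bus at 3.766 GT/s
print(peak_bandwidth_gbs(128, 3.766))  # ~60 GB/s
```

The order-of-magnitude gap between the two results is why spilling assets into system RAM hurts so much, and why a wide bus only softens, rather than solves, a VRAM shortfall.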

  • @MrWasian
    @MrWasian 1 year ago +1

    As someone who has been building PCs for two decades, it's insane to me that a vocal minority has caused the majority to think the VRAM issue is major. It isn't a factor for most gamers out there; it isn't going to impact them AT ALL. Most people aren't trying to game with everything maxed at 4K.
    We've been so spoiled with consistent releases that people think you absolutely have to have the most of everything. Someone who purchased a 3080 or even a 3060 Ti can still get 5 years out of their card easily, especially if they OC up to 20%, all while still having settings at high (ultra is negligible and should be used just for screenshots).
    I've built PCs for many customers, and it's honestly tiring having to quell these unfounded claims, when I'm able to save them literally HUNDREDS on their build so they can buy a better monitor or something else, instead of them thinking they absolutely have to buy an 80 Ti/90 variant for "future proofing." Most of them don't care about gaming at 4K, and A LOT of people can't even tell the difference between mixed medium/high and high settings in most games. Just wild the amount of hogwash there is around the PC building world. Great video though! It's channels like this, Gamers Nexus, and HU that keep consumers properly informed and educated!

  • @soup-not-edible
    @soup-not-edible 1 year ago +3

    When I bought a 16GB RX 6800 I wasn't thinking much about its VRAM, but it has been a godsend.
    I prefer having more RAM that's slower over less RAM that's faster.
    (Definitely laughing at Nvidia over the backlash to this "supposed" planned obsolescence)

    • @r3tr0c0e3
      @r3tr0c0e3 1 year ago

      Ironically, a 3080 with less VRAM will still be 30% faster than your 6800, and no amount of VRAM will change that, lol.
      30fps with stuttering is just as bad as 30fps without; by the time it comes to that, these cards will just be e-sports cards, lol.

  • @VisibleVeil
    @VisibleVeil 1 year ago +5

    When the PS4 is no longer being considered, 16GB will be the standard, because that's what's in the new-gen consoles.

    • @03chrisv
      @03chrisv 1 year ago +6

      12GB will be the new standard. These new consoles only have about 13GB available for games (3GB is reserved for the OS), and out of that 13GB only up to 10GB is allocated as VRAM, while the rest is used for other in-game processes.

    • @VisibleVeil
      @VisibleVeil 1 year ago +2

      @@03chrisv I admit I'm not knowledgeable, but my understanding is that because the hardware and OS are dedicated to gaming, these consoles can make better use of their resources than a PC can.
      There are APIs that let devs use both system memory and GPU memory as unified memory.

    • @03chrisv
      @03chrisv 1 year ago +4

      @@VisibleVeil Yes, you're correct: consoles are more efficient with the hardware they have, and there are benefits to their design over PC. However, they have limits despite this. A PS5 with its 16GB of unified memory is roughly equivalent, for gaming, to a PC with 16GB of main system RAM and 12GB of VRAM on the GPU.
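The unified-memory budget discussed in this thread works out like this (a napkin sketch; the 3GB OS reservation and the GPU/CPU split are the figures quoted in the comments above and vary by title):

```python
# Unified-memory budget on a current-gen console, using the numbers above
total_gb = 16
os_reserved_gb = 3
game_budget_gb = total_gb - os_reserved_gb    # memory a game can actually touch
gpu_share_gb = 10                             # rough upper bound used like "VRAM"
cpu_share_gb = game_budget_gb - gpu_share_gb  # game logic, audio, streaming buffers

print(game_budget_gb, gpu_share_gb, cpu_share_gb)  # 13 10 3
```

So a PC port targeting console texture budgets lands right around the 10-12GB mark, which is why 8GB cards are the first to struggle.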

  • @zdspider6778
    @zdspider6778 1 year ago +2

    Dude, what are you talking about? 16GB should be standard by now! Instead we get GPUs that are way more expensive than they should be, with shoddy memory buses and crappy VRAM capacity. You now get less hardware for more money. The whole GPU market is disgusting right now. Intel is the only one (God help us) that seems to make any sense, but only because they're trying to enter the market. If they had a better product with better drivers, they would have fleeced us too.

  • @bmo61950
    @bmo61950 1 year ago +1

    So, is no one going to talk about the consistently bad game optimization we get? I wouldn't use The Last of Us as the reason to explain how bad 8-12GB cards are.

  • @AlchemyfromAshes
    @AlchemyfromAshes 1 year ago +4

    I'm on the fence on this being a niche problem. I would agree saying it's a bit more of a niche that people buy AAA PC titles on day one or immediately after release vs. console. I think there are a fair number of PC gamers who are like me though. I won't pay the initial AAA price for a game. They are bound to have serious bugs and I'm a bit older and used to older pricing, so the current standard AAA price just seems crazy to me. I will look at them seriously 1-2 years from release though when the price is cut in half in a steam sale etc. If Hogwarts Legacy has an OK sale any time this year for example I'll be picking it up. Definitely at half off, but maybe even if it just drops down to 75% retail. I picked up CyberPunk as soon as it went to 50%.
    I upgrade my video card every 4-6 years which I think is relatively common, so there is a fair chance that VRAM issues are going to impact me within the next year, and before I upgrade again (picked up a 3080 about a year and half ago). So, to me, the problem isn't niche as much as it's just delayed a bit. AMD has shown that VRAM cost doesn't have to be a serious factor in being competitive pricing wise. NVidia is just making up their own rules at this point and testing what the market will bend over and accept it seems. AMD is happy enough to do the same. Just my opinion, but at the obscene prices that NVidia and AMD are charging for cards right now, RAM shouldn't have ever been an issue. It should be plentiful on anything but a bargain bin card. It's like buying a luxury car and the dealer trying to stick the crappiest tires and no accessories on the thing, all while telling you that you should just shut up and be happy they're selling it to you. You don't expect the dealer to skimp on bells and whistles when you're paying well. It seems the video card manufacturers have lost touch with the average consumer and don't realize or care that, for most of us, video cards ARE luxury items. Very few of us can treat them like throw away electronics and just upgrade every time a new model comes out to keep up.
    From experience with friends and acquaintances, I would venture to say there are a fair number of people this already affects, even at the AAA price. For example, to people who are also console gamers, or coming to PC gaming from consoles, the AAA price is just the normal price of a game; it's not expensive. Most people probably aren't buying a AAA title a month, but it seems likely that a large number of gamers would pick up a AAA title or two a year, especially if they have been waiting on the title for a long time. I think this could at least hurt momentum for interest in PC gaming in the near future.

  • @afgncap
    @afgncap 1 year ago +3

    AMD's approach worked for me in the past, when I couldn't afford a top-shelf GPU and upgraded once every 6 years. Their cards have aged fairly well until now. However, I agree that having a ridiculous amount of VRAM at the moment of buying doesn't really help you. I now have a 7900 XTX and I doubt I will ever be able to use its 24GB of VRAM before I upgrade.

    • @scythelord
      @scythelord 1 year ago

      I've already used the 24 gigs of VRAM my 3090 has. It isn't difficult to do.

    • @afgncap
      @afgncap 1 year ago

      @@scythelord In a gaming scenario, unlikely. Workloads, sure.

  • @theftking
    @theftking 1 year ago +1

    No, it's not. After playing RE4R at 1440p on my freakin' $700 3070 Ti, I won't buy a card with less than 16GB of VRAM ever again. I was hovering at 6.5-7.9GB of VRAM utilization basically the entire time.

  • @Ebilcake
    @Ebilcake 1 year ago +1

    The issue is only really being highlighted because of The Last of Us. I'm on a 3080 FE / 5800X3D and it runs just fine: just below the 10GB limit with DLSS and high textures. I'm 7 hours in since the hotfix and it's been perfectly stable. 3440x1440 HDR with DLSS Quality at 80-120fps. No issues at all, butter smooth. The game looks stunning at times.
    If Nvidia had gotten a day-one driver out that addressed the crashing, I wouldn't even know there was a VRAM issue, and that's been the case in all the games you mentioned.

  • @axxessdenied
    @axxessdenied 1 year ago +10

    Seeing how things are unfolding makes me pretty happy that I picked up a 3090. I've managed to hit 23+GB of usage in CP77 with a bunch of texture mods.

    • @scythelord
      @scythelord 1 year ago +5

      Yep, same here, but I knew this was coming. You can't stagnate with VRAM levels that were available 10 years ago. The Radeon R9 290X had 8 gigs of VRAM back in 2013. 8 gigs was good for 2016. Today it's practically minimum tier. Double or triple that is just more sensible.

    • @r3tr0c0e3
      @r3tr0c0e3 1 year ago

      @@scythelord Depends on what you play; 8GB is still enough for games from 2017/18, and most people play older games like CS:GO, MMOs, etc.
      So only a minority play unoptimized garbage triple-A titles.

  • @sololoquy3783
    @sololoquy3783 1 year ago +4

    I don't think it's overstated. It is a big issue and it's only going to get bigger, simply because game devs are designing games to run on the next-gen consoles. All of them have 16GB of memory, the only exception being the Series S, which has just 10GB; then again, the Series S is barely running games at 1440p.
    It's a different story for mid-range Nvidia cards, which will be expected to run 1440p-4K with ray tracing using less memory than a Series S. RIP.

    • @dante19890
      @dante19890 1 year ago +3

      Yeah, 100%. When the current-gen consoles become the lead platforms, PC spec requirements are gonna go up significantly.

    • @118Shadow118
      @118Shadow118 1 year ago +2

      That 16GB is memory shared between the system and the GPU; it's not gonna use all 16 just for the GPU.

    • @77wolfblade
      @77wolfblade 1 year ago +1

      Not quite, since the consoles use shared memory: normal RAM and video RAM are one and the same in a console. Some of the RAM handles simple tasks and the rest is for graphics, so it could be half and half for all we know.

    • @OffBrandChicken
      @OffBrandChicken 1 year ago +1

      @@118Shadow118 Yeah, but given that system memory is used once VRAM runs out, that part doesn't really make a difference; it's more or less about the speed of the RAM at that point.
      Technically we could just load the entire game into VRAM, but we don't, because we don't have enough VRAM.

    • @sololoquy3783
      @sololoquy3783 1 year ago

      16GB shared could realistically mean 12GB used for graphics data. That is still more than a 3070, a 3060 Ti, or even the 10-gig 3080.
      The Series S's 10GB could realistically mean 6-8GB of graphics data.
      The Series S might be the saving grace, forcing game devs to allocate more efficiently, but even that is a stretch, considering the Series S targets a very modest resolution and graphical feature set compared to these PC graphics cards.

  • @duskdrummer1667
    @duskdrummer1667 1 year ago +1

    What you are seeing in Afterburner is allocation, not actual usage!

  • @vitor900000
    @vitor900000 1 year ago +1

    One thing people overlook is that more VRAM = higher cost.
    If you think Nvidia is overpricing their GPUs with suboptimal amounts of VRAM, imagine how they will price their GPUs if people keep asking for more VRAM.
    An extra 4GB of high-bus GDDR6X adds $20-40 in cost alone. Add profit plus the usual overpricing and you have an extra $60-80 on the final price.
    The only real solution would be making GPU memory modular, like it is for our CPUs. You would be able to add as much VRAM as you need instead of having a fixed amount per product tier. It would also increase the lifespan of GPUs, since they would become upgradable.
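The cost-to-shelf-price multiplication described above can be sketched with hypothetical numbers (the 60% gross margin is an assumption for illustration, not a published figure):

```python
def retail_delta(bom_delta_usd, gross_margin=0.6):
    """Retail increase needed to keep a target gross margin on extra BOM cost.
    retail = cost / (1 - margin), so every BOM dollar is multiplied on the shelf."""
    return bom_delta_usd / (1 - gross_margin)

for bom in (20, 40):
    print(f"${bom} of extra GDDR6X -> ~${retail_delta(bom):.0f} at retail")
```

Under this assumption, $20-40 of extra memory turns into roughly $50-100 at retail, the same ballpark as the $60-80 guess above.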

  • @brkbtjunkie
    @brkbtjunkie 1 year ago +4

    Something to be aware of is cached memory vs. actual memory use. Many games load up VRAM to the hilt, but only a portion of it is actively being used. 8GB was fine for me at 1440p/165Hz and 4K/60Hz on my 2070, and 10GB is fine on my 3080. I have zero issues with VRAM, and the hitching I do sometimes get is not a VRAM issue; it's a shader compilation or engine frame-time issue.
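The allocated-versus-used distinction above can be illustrated with a toy model (purely hypothetical numbers): engines often reserve a large pool up front and keep evicted assets cached in it, so monitoring overlays report the whole pool rather than the working set.

```python
class TexturePool:
    """Toy model of a streaming texture pool: a big up-front allocation
    versus the subset of textures actually referenced this frame."""

    def __init__(self, budget_mb):
        self.allocated_mb = budget_mb  # reserved from the driver immediately
        self.resident = {}             # textures touched by the current frame

    def touch(self, name, size_mb):
        self.resident[name] = size_mb

    def working_set_mb(self):
        return sum(self.resident.values())

pool = TexturePool(budget_mb=8000)   # shows up as "8 GB used" in an overlay
pool.touch("albedo_atlas", 2100)
pool.touch("normal_atlas", 1400)
print(pool.allocated_mb, pool.working_set_mb())  # 8000 3500
```

An overlay reading the driver's allocation would report 8000 MB here even though the frame only needed 3500 MB, which is why "VRAM usage" numbers from Afterburner overstate what a game strictly requires.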

  • @ICDedPeplArisen
    @ICDedPeplArisen 1 year ago +3

    I had just bought an 8GB 3070 and I was loving it, but suddenly all the games I want to try need more VRAM. It's insane.

  • @OnikMod
    @OnikMod 1 year ago +1

    I have concerns now about my RTX 4070 Ti. Anyway, we'll see as time passes. It's already obvious that minimum and recommended specs don't match the actual graphics; take Gollum, for example.
    Now it looks like the 7900 XT is the better choice.

  • @duayen_genc
    @duayen_genc 1 year ago +1

    The sector most affected by VRAM is AI. As an AI developer I use an RTX 3090 Ti at home (it has 24GB of VRAM, and its price has dropped so much recently that I bought it for 24,000 Turkish lira, about $1,245). At the factory where I work, we bought an RTX A6000 because it has 48GB of VRAM. And I must say my card is only sufficient for training many image-processing models (object detection, segmentation, diffusion models, etc.). As is well known, NLG models have become very popular in recent years and great strides have been made in this field (GPT-3, ChatGPT, GPT-4, BERT models and Google Bard), and those models require far more VRAM than 48GB. Even 3D rendering, game development and filmmaking can't come close to AI's VRAM usage.
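The scale gap the commenter describes is easy to check with the standard rule of thumb: model weights alone need parameter count times bytes per parameter (a rough sketch; real usage adds activations, KV cache, and framework overhead):

```python
def weights_vram_gb(n_params_billion, bytes_per_param=2):
    """Minimum VRAM just to hold the weights, e.g. fp16 = 2 bytes per parameter.
    Billions of params times bytes per param conveniently equals gigabytes."""
    return n_params_billion * bytes_per_param

# A GPT-3-sized model (175B parameters) in fp16:
print(weights_vram_gb(175))      # 350 GB of weights alone, vs. a 48 GB RTX A6000
# Training state (gradients + Adam optimizer moments) multiplies that again:
print(weights_vram_gb(175) * 4)  # 1400
```

Even a small 7B-parameter model needs ~14 GB in fp16 just to load, which is exactly where consumer cards with 8-12GB fall short.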