VRAM - is 8GB or 12GB really enough to play Games in 2023?

  • Published on 28 Aug 2024

Comments • 1.2K

  • @MarcoGPUtuber
    @MarcoGPUtuber 1 year ago +933

    I still run 4 GB of VRAM. It works so well. All my games run flawlessly on my 800x600 CRT!

    • @philscomputerlab
      @philscomputerlab 1 year ago +144

      4MB Voodoo is all you need ☺

    • @MarcoGPUtuber
      @MarcoGPUtuber 1 year ago +52

      @@philscomputerlab that's right! My voodoo 3 2000 should be all that's enough!
      Now time to play some DOOM!

    • @Po5itivemind5et
      @Po5itivemind5et 1 year ago +5

      @@MarcoGPUtuber loooool

    • @pf100andahalf
      @pf100andahalf 1 year ago +37

      640x480 is where it's at.

    • @ShinyHelmet
      @ShinyHelmet 1 year ago +18

      @@philscomputerlab Yeah, mine served me well playing the original Half Life at 640x480 on a 14 inch monitor. 🤩

  • @MovoSt
    @MovoSt 1 year ago +178

    A follow-up test at 1080p Ultra and 1440p Low/Ultra would be great.

    • @matthewIhorn
      @matthewIhorn 1 year ago +6

      Exactly!

    • @dgonsilver-fs5gf
      @dgonsilver-fs5gf 1 year ago +3

      100

    • @SomeSloan
      @SomeSloan 10 months ago

      Would be great to see a vid on VRAM usage at these resolutions! Based on this video's results, I think it's safe to assume that if you want to play games at high-ultra settings at 1080p without worry, 10-12GB of VRAM should be a safe bet

    • @upfront2375
      @upfront2375 7 months ago

      @@SomeSloan For today, 8GB is enough at 1080p. For years to come, you'd need to either turn down the textures OR upscale anyway to stay at 60fps, both of which will greatly lower VRAM usage. There are two reasons why the 3060 Ti and 4060 have only 8GB: 1. It's enough for their performance in the lower-res, high-fps use case. 2. So people who need more go buy much more expensive cards. The 12GB on the 3060 and 16GB on the 4060 Ti aren't useless overall for some creative scenarios, but for gaming?? It's gonna be exactly like the 4GB 750 Ti... like a big D on a Catholic priest! lol

  • @DJ_Dopamine
    @DJ_Dopamine 1 year ago +37

    I game on a 1080p display. Not having issues with 8GB. Anyway, I'm always happy to turn things down a notch (or two) from Ultra if necessary. The visual difference is usually marginal.

  • @SuperConker
    @SuperConker 1 year ago +40

    This is what I think nVidia should have done for Vram on the entire 4000-series:
    -4050/4050 ti 12 GB
    -4060/4060 ti 12 GB
    -4070/4070 ti 16 GB
    -4080 16 GB
    -4090 24 GB
    Basically not a single model with under 12GB of Vram.

    • @Terry1212
      @Terry1212 1 year ago

      The 4070 and 4070 Ti have 12GB of VRAM

    • @SuperConker
      @SuperConker 1 year ago +10

      @@Terry1212 I know, i'm just saying that's what nVidia SHOULD have done with the Vram.

    • @SherLock55
      @SherLock55 1 year ago +4

      What's the point of 12gb on a 4050 LMFAO, it's not even fast enough to run at higher resolutions and settings where it would be needed.

    • @SuperConker
      @SuperConker 1 year ago +3

      @@SherLock55
      The 4050 would perform about the same as the 3060,
      which has 12 GB of VRAM.
      The 3060 again, performs the same as the good old 1080 ti
      (with its 11 GB of VRAM).
      There are already games out in 2023 that can eat through 12 GB of VRAM at 1080p
      (and 1080p is not even a high resolution).
      So to release new cards in 2023 with as little as 8 GB of VRAM is a joke.
      Starting the lower-end models at 12 GB is perfectly fine.

    • @SherLock55
      @SherLock55 11 months ago +3

      @@SuperConker The only games eating 12gb of VRAM at 1080p are unoptimized pieces of trash not worth playing, don't get it twisted. Just because some devs are lazy or incompetent doesn't mean you actually need so much VRAM at such a low resolution.

  • @sc337
    @sc337 1 year ago +136

    Hi Bryan, hope to see a follow-up video using real 6GB, 8GB and 12GB cards on the exact same games. I bet the VRAM utilization will be much different from the results of this video. Love your vids. Peace

    • @techyescity
      @techyescity 1 year ago +46

      No worries man, will definitely be doing that for you! This for me personally is a whole journey that I want to uncover and learn about. I will make this a whole series.
      However, starting out with the two 'unlimited' VRAM cards gives me a base case to then infer further data from.

    • @naturesown4489
      @naturesown4489 1 year ago +2

      There are a lot of channels and sources that have done those comparisons, they're very similar

    • @sc337
      @sc337 1 year ago +3

      @@techyescity much appreciated Bryan. Keep up the good work! 👍👍

    • @laszlodajka5946
      @laszlodajka5946 1 year ago +3

      Yeah. I'm on a 10GB 3080 and The Last of Us warns me about it when I push all the settings up to Ultra, but it still runs well. So 10GB may still fall into the OK zone for now. It may be interesting to see what settings you can get away with on less VRAM.

    • @Peter.H.A.Petersen
      @Peter.H.A.Petersen 1 year ago +2

      @@techyescity Also, I don't think even a 3070 Ti could run 4K Ultra with ray tracing on at proper frame rates even if it had 24GB, so isn't it irrelevant whether it has enough VRAM for it if it can't do it anyway?

  • @wireless1235
    @wireless1235 1 year ago +121

    I think there needs to be a part 2, which includes the impact of specific settings on VRAM usage and also includes the use of lower-tier cards.

    • @dreamcat4
      @dreamcat4 1 year ago +5

      If there is a part 2, it would be nice if Bryan could bring up a table showing which of the games being tested are re-releases from the latest console generation (which has *almost* 16GB of addressable VRAM, minus the game and the operating system... so more like 14GB max),
      versus modern game releases which are either originally last-generation console releases, or PC-only releases? Do such games even exist anymore? I mean, ever since the Xbox division purchased pretty much the entire AAA games industry (near enough).
      And how much addressable VRAM did the last-gen consoles have? Like the PS4... ah, only 5.5GB. About the same for the Xbox One.
      Then the Xbox One X is 12GB max, and the PS5 is 16GB max, but minus the shared memory overheads.

    • @gozutheDJ
      @gozutheDJ 1 year ago +3

      bro, it's called, do it yourself.

    • @simsdas4
      @simsdas4 1 year ago

      I second this, sure you can run high settings but expect to drop textures and shadows for example.

  • @peterkeller7880
    @peterkeller7880 1 year ago +25

    It's great to see you do this. This needed to be seen. Not everyone games at 4K Ultra. This helps people make a really informed decision when getting a new GPU, especially when coming from an older generation of cards. Thank you for your hard work.

    • @thejollysloth5743
      @thejollysloth5743 1 year ago

      I’m gonna grab a 16GB RX 6800 off EBay for £350. Since I don’t give a toss about RT it will last me at 1080p until the next console generation comes out.
      I don’t mind having to turn down a couple of settings like shadow quality, fog and other weather effects in 5 years as long as I get the top levels of anti aliasing, render distance, and texture quality.
      I’ve got a feeling a used RX 6800 16GB will be fine for that at 1080p for many years to come.
      I also think that a Ryzen 7 5800x will be more than good enough for 1080p ultra or high settings for 5 years or so, and they are so cheap now with a decent B550 board and 32GB of 3600 CL16 RAM.
      1080p is fine for me. I only use a 24 inch screen so I don’t really notice the pixelation I would with a 27 inch or larger screen.
      A lot of my friends still play CSGO at 900p to get the most FPS they can and the lowest latency. And these are pro level players who have 360hz or higher monitors downscaled from 1080p to 900p.
      A I really can’t notice much of a difference between 1080p and 1440p, but that could just be me…or the fact I’m so used to that resolution.

  • @Kapono5150
    @Kapono5150 1 year ago +278

    So happy to see Nvidia users stand up for themselves on the 4070. Even fake frames doesn’t get them to open the purse.

    • @LeJimster
      @LeJimster 1 year ago +74

      Honestly, the frame generation and even upscaling tech feel scammy to me. Especially when they're advertising it in their benchmarks. I much prefer running my games at native resolution.

    • @Verpal
      @Verpal 1 year ago +55

      @@LeegallyBliindLOL TBH I do know some people prefer the native with jaggies, but I felt most people who claim upscaling is a scam is simply saying that because FSR 2 isn't remotely competitive for now, they don't hate the tech, they just hate NVIDIA.

    • @LeJimster
      @LeJimster 1 year ago +14

      @@LeegallyBliindLOL Well I can't use DLSS because I'm on AMD. But I'm pretty sure both DLSS and FSR have weird ghosting issues in movement and also strange artifacts. I only use FSR for performance reasons and at the resolutions I'm using it I notice a big visual fidelity drop (edited, because of brain performance failure). DLSS may be better, but I still think it's getting to the point where they aren't producing faster GPU's but faking performance through these techs and artificially locking the software to newer cards. The only way I would like to use this tech would be in reverse, since taking higher resolution and downscaling it produces a crisper image.

    • @mr.obeydoge5266
      @mr.obeydoge5266 1 year ago +1

      Only one thing I can say. These companies dont care about us and I dont know why bother defending but I digress. Here is the main thing people, fake frames is fake frames. Native is definitely the way to go since it truly measures the raw capability of the component. There I said it. Dont allow them to control the market for so long and support their bad habit of setting insane value on products that are supposed to be produced in reasonable prices.

    • @LeegallyBliindLOL
      @LeegallyBliindLOL 1 year ago +18

      @@LeJimster so, in reality, you don't have a real-world reference point. From my experience, even at 4K, FSR is noticeably worse (usually blurry) and I don't notice any artifacts with DLSS unless I use Performance mode in some titles. I modded DLSS into RE4 and it was a night and day difference. You can believe me or not. But in the end, YouTube doesn't convey the differences well enough.

  • @timberwear369
    @timberwear369 1 year ago +54

    I definitely would like you to include 1440p High Settings. You only tested the two extremes, 1080p vs 4K and Low vs Ultra. 1440p High for me makes much more sense. But maybe with highest texture settings.

    • @MsNyara
      @MsNyara 1 year ago +2

      Aiming for a high frame rate at 1440p Ultra tends to end up being the same discussion as 4K High/Ultra at 60FPS.

  • @pkpnyt4711
    @pkpnyt4711 1 year ago +82

    I think what we're kind of missing with this test is that we're using the top 2 high-end cards from red and green. These cards have the highest bandwidth and throughput compared to mid and lower-end cards. They might be fast enough that they don't have to keep as much loaded in VRAM, since they can deal with it on the fly. How would VRAM usage look with a 4070 Ti and a 6950 XT, or an even more mid-range offering? I'm not so sure, but it's a legit question I have in mind.

    • @pf100andahalf
      @pf100andahalf 1 year ago +10

      Faster cards don't use less vram.

    • @SirJohnsonP
      @SirJohnsonP 1 year ago +8

      Yeah, it can only get worse with those. But yeah, Nvidia is now focusing on AI GPUs; Musk ordered 10,000 units for Twitter, and OpenAI ordered more than 30,000. So now they have a perfect excuse to lower PC GPU production and focus on the AI GPU market... Too bad latex and leather fans are still buying Nvidia GPUs, giving them one more reason to produce 8GB $500 GPUs in 2023

    • @standarsh8056
      @standarsh8056 1 year ago +6

      Not how it works. Faster memory = more frames, but if you lack the memory buffer in the first place it will still tank performance

    • @sc337
      @sc337 1 year ago +3

      From what I observed, some games use less VRAM on lower-VRAM cards. For example, in the same game with the same settings, a 4GB card may show 3.5GB utilization while an 8GB card may show 4.5GB utilization. I think it makes more sense to test the games with real 6GB, 8GB & 12GB cards. Anyway, still appreciate Bryan's effort

    • @pf100andahalf
      @pf100andahalf 1 year ago +5

      @@sc337 Vram will overflow into system ram. In your example of a lower vram card using less vram, it's using a hell of a lot more system ram.

  • @philscomputerlab
    @philscomputerlab 1 year ago +22

    For Windows XP Retro Gaming, less is more. Some games have issues with large VRAM, best to have 1 GB (GT 710 FTW) 😅

    • @Rabbit_AF
      @Rabbit_AF 1 year ago +1

      What if video card companies made cards that shared system memory again? I was a bit thrown off when an S3 Graphics card I was testing was doing this. ATI had a feature like this called HyperMemory.

    • @ShinyHelmet
      @ShinyHelmet 1 year ago

      I've still got a 256mb 7600 GT for all that retro malarky! 🥰

    • @devilzuser0050
      @devilzuser0050 1 year ago

      I sold a GTX Titan because Heroes 2 & NFS2SE wouldn't start on it under XP. (6GB VRAM)

    • @necuz
      @necuz 1 year ago +1

      @@Rabbit_AF That's exactly what the Windows Video Memory Manager is doing, that's why games typically only run really poorly instead of crashing when you run out of VRAM.

    • @bryanwages3518
      @bryanwages3518 1 year ago

      @Rabbit_AF AMD Vega cards can do this. It's called HBCC. You can expand your VRAM with your system RAM.

  • @Beezzzzy_
    @Beezzzzy_ 1 year ago +37

    Any reason 1440p is left out? I think this will be a good database to put together. Nobody really discusses VRAM; the 1080 Ti is still a solid card because of its 11GB of VRAM despite being released 6 years ago, and we're still having cards come out with 8GB or less. 12GB should be the baseline sold in 2023, not 6GB-8GB anymore.

    • @HxR-eSports
      @HxR-eSports 1 year ago +1

      Yeah, there's a reason. It would have shown results that went against his agenda.

    • @edeka3
      @edeka3 1 year ago +1

      @@HxR-eSports do you think 8GB is enough to run at 1440p or 1600p? A little future proof?

  • @Thezuule1
    @Thezuule1 1 year ago +42

    I use my GPU for VR and even with the relatively low memory requirements of most VR games, you still need to render above 4K and even 12gb is likely not enough now, and certainly won't be in a few years.

    • @gozutheDJ
      @gozutheDJ 1 year ago +5

      Vr is its own thing.

    • @darkkillex7220
      @darkkillex7220 1 year ago +1

      Same here, I got a 3080 thinking I would be able to easily run VR with it. Except I bought it back when they still had only 10GB of VRAM and as soon as I try to run any game that's not a dedicated VR game in VR I'm just limited by the VRAM

    • @anthonylong5870
      @anthonylong5870 1 year ago +2

      Bro Vr is at best 1080 lol , most is only 720 per eye....Your not rendering 4K

    • @Bos_Meong
      @Bos_Meong 1 year ago +4

      @@anthonylong5870 Actually it's very close to 4K. The Quest 2 has a resolution totaling 3664x1920 vs 4K's 3840x2160. Despite all of this, HL Alyx only consumes 7GB of VRAM with all settings maxed out. It's all about how optimized the game is; we need to stop supporting shitty ports
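
(A rough back-of-the-envelope check of the pixel counts quoted in the comment above; the Quest 2 and 4K figures are the ones from that comment, and real VR workloads usually render at a supersampled resolution above the panel's, which is ignored here.)

# Rough pixel-count comparison: Quest 2 combined panel resolution vs a 4K frame.
quest2_pixels = 3664 * 1920      # ~7.0 million pixels across both eyes
uhd_pixels    = 3840 * 2160      # ~8.3 million pixels in a single 4K frame

print(f"Quest 2: {quest2_pixels/1e6:.1f} MP, 4K: {uhd_pixels/1e6:.1f} MP, "
      f"ratio: {quest2_pixels/uhd_pixels:.2f}")   # ratio ~0.85, i.e. close to 4K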

    • @Thezuule1
      @Thezuule1 1 year ago +2

      @@anthonylong5870 each eye is more than 1080p dude..

  • @Art_Vandelay_Industries
    @Art_Vandelay_Industries 1 year ago +66

    What's crazy to me is that the graphical fidelity doesn't actually look that good, considering the requirements. I think optimization should be more of a focus nowadays. That would also help with the insane prices for hardware atm.

    • @ShinyHelmet
      @ShinyHelmet 1 year ago +19

      The thinking seems to be that they develop for the hardware available on the new consoles and then just port it to PC.....and hopefully patch it later!

    • @sven957
      @sven957 1 year ago +15

      They optimize for consoles which have 16GB combined memory.
      Yes, they COULD optimize it better but that costs a lot of dev time and money which they would rather invest into other parts of the game which makes total sense, considering consoles make up most of their revenue. Although there are titles like TLOU which are actually REALLY badly optimized.
      But other than that the only party to blame here is nvidia who decided to build planned obsolescence into their cards.

    • @grlmgor
      @grlmgor 1 year ago +3

      @@sven957 Well if they don't optimize, then don't buy their game.

    • @sven957
      @sven957 1 year ago +4

      @@grlmgor Sure if you dont want to play uh - pretty much all of the major upcoming titles. Again you cant blame the devs in most cases (you can in TLOU) - 8 GB was first seen on a GPU in 2015. Nvidia did this to make you upgrade when their next overpriced generation drops.
      12GB right now is barely enough to max out games (if I'm paying that much for a fucking GPU you better let me max those settings!) just like how 8GB two years ago was barely enough. The 40 series cards will run into the same issues in max 2 years.

    • @peterpan408
      @peterpan408 1 year ago +1

      For 1080P there is certainly a fidelity limit set by the pixels, that could be optimized for in the engine.

  • @kasmidjan
    @kasmidjan 1 year ago +9

    Ngreedia can pay their investors with cheap leather jackets
    if they keep being stingy with VRAM

  • @altun8310
    @altun8310 1 year ago +20

    Hi from Canada. Thank you for the video. Timely analysis, and I'll keep an eye out for your future ones on this topic. My suggestion is that you should add 1440p high settings as a benchmark. That represents the upgrade path for the majority of people who still play at 1080p. I've also noticed in YouTube videos the difference in VRAM usage/allocation between Nvidia and AMD. It would be interesting if you could investigate and explain that!

  • @bctoy2779
    @bctoy2779 1 year ago +6

    DLSS3 Frame Generation also requires more VRAM. So with 4070Ti, you can already run into a situation where the card can do 4k60 or better but runs out of VRAM and stutters.

    • @JustGaming24
      @JustGaming24 1 year ago

      its not a 4k card tho

    • @brunoutechkaheeros1182
      @brunoutechkaheeros1182 1 year ago

      @@JustGaming24 so why the hell people say 4070 ti beats 3090? wasnt 3090 a 4K card? lmao

    • @JustGaming24
      @JustGaming24 1 year ago

      @@brunoutechkaheeros1182 More or less the same performance, but it's not considered a 4K GPU because of the 12GB of VRAM; the 3090 has double the amount.

  • @stratuvarious8547
    @stratuvarious8547 1 year ago +18

    I know I didn't spend hundreds on a new GPU to play at low settings, which is why I bought a RX 6900 XT. Reasonable price, 16 GB of VRam, it was the best choice in my price range.

  • @f.ferenc88
    @f.ferenc88 1 year ago +4

    1080p then 4K? Where the fuck's 1440p??? That is today's gold standard....

  • @IamMarkSmith
    @IamMarkSmith 1 year ago +12

    In my opinion, Nvidia is using tech like DLSS to be able to pinch on their physical hardware specs to not only charge more relative to the mindshare they have in the GPU market, but increase their profit margins all around. AMD is the closest they have ever been as competition to Nvidia with their current crop of RDNA 3 cards. If they can get the price to performance right on the forthcoming 7800 and 7700 models we will see them make inroads into Nvidia’s market share. We all win when there’s competition in the marketplace. I’m not a fanboy of either company, but I am a fanboy of better value for my money.

  • @Rizzlas
    @Rizzlas 1 year ago +13

    I have a 3070 8GB, and I can confirm that with Hogwarts Legacy at 1080p Ultra settings I have some trouble getting smooth gameplay all the time (I have to use an optimization tool to do it), but my friend has a 3060 with 12GB of VRAM; he runs it at high-ultra settings and gets a much smoother experience than me.
    Pretty sad to be honest :/ I'm thinking about maybe selling my 3070 to get an AMD card of the same grade with more VRAM :)

    • @95928225
      @95928225 1 year ago +5

      I sold my 3070 for $400 and bought a new 6700 XT and it is wayy better. No VRAM crashes like in RE4, Deathloop, Forza Horizon, Hogwarts, The Last of Us

    • @Rizzlas
      @Rizzlas 1 year ago

      @@95928225 yeah, but I'm doing a lot of rendering in Adobe Premiere and losing CUDA acceleration is not an option

    • @simon89oi
      @simon89oi 9 months ago

      @@Rizzlas a 2080 Ti should fit your needs then

  • @bledboost
    @bledboost 1 year ago +7

    There is still one way to play all the latest games even with 4GB of VRAM. Just play at 720P! In many cases you will get a better experience playing at 720P High than 1080P Low. This is especially true on gaming laptops since the screen is smaller so the difference in resolution is less noticeable.

    • @roadrash2005
      @roadrash2005 1 year ago

      I have a 4K tv, I can’t go backwards lol I tried it was painful

    • @sololoquy3783
      @sololoquy3783 1 year ago +1

      but you effectively gimped your card at that point... so yay nvidia!

    • @bledboost
      @bledboost 1 year ago

      ​@@InnerFury666 Well you wouldn't need to play pixelart games or even the average game at 720P because they don't have high VRAM requirements. I'm talking about the latest high end games that choke when you don't have enough VRAM. I'm just saying 720P is still an option to make those games very playable.

  • @RFLCPTR
    @RFLCPTR 1 year ago +2

    VRAM usage and how much is reserved by the game adjusts to the amount of VRAM present on your GPU.
    You would notice that when testing with an actual 4 GB VRAM card, instead of using a 24 GB VRAM card...

  • @maxdema115
    @maxdema115 1 year ago +9

    Too few games tested (and one, TLOU, very broken) to have a reliable analysis. It would also have been great to see 1440p results, since the RTX 4070 has been designed for that target.

  • @TheSakrasta
    @TheSakrasta 1 year ago +5

    I did not expect the impact of resolution on vram usage to be that small compared to the quality settings.
    It might be very interesting to have a bunch of tables for some of the newest titles, which show vram usage at 1080/1440/4k + low/medium/high/ultra settings. Because a lot of games look very similar at high instead of ultra. So with such a table you could find the sweetspot settings for your personal vram amount. If dropping from 1440 to 1080 only saves you 1GB of vram, but going from ultra to high saves you 2GBs, you would most likely have a better looking game at 1440 high compared to 1080 ultra, while also using less vram.
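
(A minimal sketch of the "sweet spot table" idea suggested above, in Python. The GB figures and the quality ordering below are made up for illustration, not measurements from the video.)

# Given a (hypothetical) table of VRAM usage per resolution/preset combo, pick the
# best-looking combination that still fits a card's VRAM budget.
usage_gb = {                      # (resolution, preset) -> VRAM used in GB (made up)
    ("1080p", "high"): 6.5, ("1080p", "ultra"): 8.5,
    ("1440p", "high"): 7.0, ("1440p", "ultra"): 9.5,
    ("2160p", "high"): 8.5, ("2160p", "ultra"): 11.0,
}
# Combos ranked by perceived quality, best first (a subjective ordering).
quality_order = [("2160p", "ultra"), ("2160p", "high"), ("1440p", "ultra"),
                 ("1440p", "high"), ("1080p", "ultra"), ("1080p", "high")]

def sweet_spot(vram_budget_gb):
    """Return the highest-quality combo whose listed usage fits the budget."""
    for combo in quality_order:
        if usage_gb[combo] <= vram_budget_gb:
            return combo
    return None

print(sweet_spot(8.0))    # -> ('1440p', 'high') with these made-up numbers
print(sweet_spot(12.0))   # -> ('2160p', 'ultra')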

    • @angrydragonslayer
      @angrydragonslayer 1 year ago +3

      The textures are still the same quality, you just use less of it on the screen

    • @bigturkey1
      @bigturkey1 1 year ago

      just use dlss. i never have to turn down settings. i just changed dlss settings.

    • @dreamcat4
      @dreamcat4 1 year ago +1

      Yeah, I agree. If somebody out there is doing tables, include the primary platform each game was targeted for: whether a console, and what usable VRAM that console had, or whether it was a PC-only game. That is clearly useful information to include in such tables, since each new console generation is the common underlying reason for these escalating VRAM requirements.
      [edit] and let's hope the Sony PS6 will not come with greater than 24GB of GDDR.... or we will all be in trouble! ha

  • @HairyScrambler
    @HairyScrambler 1 year ago +6

    I think for the vast majority of games worth playing, 4 GB will hold up for the near future, as long as devs can optimize their games. It's a shame the 1060 3 GB is already starting to become unplayable at 1080p in games like 2042.

  • @masterkalel06
    @masterkalel06 1 year ago +11

    Hey Brian. Any chance on the VRAM tests you can do the 1080 Ti and 2080 Ti against similarly performing 8 GB cards? Since you're all about used price-to-performance, I'm curious if the 11 gigs make those cards perform better going forward.

  • @trr4gfreddrtgf
    @trr4gfreddrtgf 1 year ago +6

    I don't think 12gbs is going to last long at all, I wouldn't be surprised if it runs out in 2 years or so. I think 16gbs is a much safer bet, most people want their GPUs to last 4-5 years and 16gbs should do that just fine.

    • @paranikumarlpk
      @paranikumarlpk 1 year ago

      Yeah, 16GB for 1440p and 20GB for 4K is fine for 2 to 3 years... I really hate the 3080 10GB for these latest games.. it sucks even for 1440p, but people still argue 8GB is enough for 1440p for 5 more years lol, and they mindlessly support greedy Nvidia. They don't understand the quality of high textures and their impact on VRAM. How can they expect top-quality visuals to run on potato GPUs with 12GB or less VRAM

    • @bigturkey1
      @bigturkey1 1 year ago

      12GB of VRAM should last you until they start porting PS6 games

    • @pdmerritt
      @pdmerritt 1 year ago

      it doesn't matter all that much imho. If you're like me and coming from a 1070ti even the 4070 would be a huge uplift. If it only lasted for 2yrs because of vram issues...I could sell the card with 1yr of warranty on it (so for a decent price) and buy from the newer generation that would, hopefully, have a better price performance ratio than this disappointing generation.

    • @trr4gfreddrtgf
      @trr4gfreddrtgf 1 year ago

      @@pdmerritt True, I don't think warranties carry over if you sell them used though. Pretty sure it's for the original owner only, might be wrong though.

    • @pdmerritt
      @pdmerritt 1 year ago

      @@trr4gfreddrtgf how would anyone know? Even if I purchased with a credit card and my name is on the receipt... all the person would have to say is that it was a gift. I also don't have to register for the warranty..... the receipt will suffice.

  • @RobertJianu
    @RobertJianu 1 year ago +39

    I still run 4GB VRAM, both on my gtx 1050ti PC and my rtx 3050ti laptop. The 3050 has decent performance (like a watercooled and overclocked gtx 1070 desktop that a friend of mine has) but the lack of vram is starting to show. At least I don't game that much anymore. My next gpu will probably be from AMD tho

    • @Verpal
      @Verpal 1 year ago +5

      Ampere desktop is suffering from borderline insufficient VRAM already, and yet for some reason NVIDIA decided to squeeze the low-end Ampere laptops even harder. People who buy low-end stuff need it to last longer, yet NVIDIA decided to screw them in particular.

    • @sergeleon1163
      @sergeleon1163 1 year ago

      Yeah, I was on a GTX 1050 Ti and the 4GB really started to limit me. I upgraded this week, for €250, to an RTX 3070 8GB, and even though in specific games like those shown here it could be limited, I will drop settings since I'm aware the 8GB can be hampering (in the future), while in many games it will still be okay. But for people on a budget, both NVIDIA and AMD are charging gamers too-high prices and forgetting about them.

    • @Killersnake432
      @Killersnake432 1 year ago

      I upgraded last month from a 1050 Ti to an RTX 3060. I was GPU and VRAM limited; now I can play the stuff I used to play far better and have the VRAM space for games like RE4 Remake, where I can use crazy high settings. I would have gone for the 3060 Ti but that 8GB VRAM buffer turned me away from it.

    • @TechHarmonic
      @TechHarmonic 1 year ago +4

      I remember briefly having a 3050 ti laptop and it got on my nerves fast. Even with older games, maxing them out at higher resolutions I would get horrible frame drops because of the vram running out. I returned the Legion s7 since it was $1k and it felt super overpriced for that performance.

    • @RobertJianu
      @RobertJianu 1 year ago +1

      ​​@@TechHarmonic damn, I know how it feels. The 3050ti is decent for 1080p even with most recent games. You can't go higher than 1080p or bump the graphics too high because the vram will make your experience horrible. I kept it because I needed portability and the good part is that I got it for around 500$ at the time and it has a Ryzen 7 5800H, 16gb ram, 512gb nvme ssd and a 10 bit 165hz display. It's pretty good for my not so demanding games (FH5, RDR 2, God of War, Sons of the forest etc) and media creation (mostly editing in Photoshop since the screen has excelent colors, Sony Vegas and making documents) but I wouldn't recommend this GPU for a true gamer, 4gb vram is just unacceptable. Always aim for at least a xx60 series card since they age pretty good or just go with AMD (lower prices and higher vram than nvidia)

  • @CameraObscure
    @CameraObscure 1 year ago +5

    This test was almost meaningless. Using cards with plenty of VRAM only shows the maximum VRAM the game would load into the VRAM buffer. A better test would be two equivalent cards for the respective resolution; a good comparison would be an RTX 3070 vs an RX 6700 XT, an 8GB versus a 12GB card, to see how the frame times and texture quality differ at 1440p and how the VRAM limitation actually affects what that card can do with Ultra/High textures. That is where you will really see the difference for the higher requirements of newer games going forward. It seems that 4K textures are going to be the norm for many newer games, with no lower texture packs for lower resolutions. Upcoming OS changes, such as streaming textures directly from storage, will supplement lower VRAM capacities when they eventually make it into the OS. It's not as simple as you're making out in this video.

  • @sapphyrus
    @sapphyrus 1 year ago +2

    If someone's on 4K, they can shave off about 1-2GB by using performance DLSS. It's what I have been doing with 3070 and it worked alright (high even if not ultra textures) so far with newer games without RT.
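
(Why performance-mode upscaling trims VRAM: the internal render targets are allocated at the lower input resolution. DLSS Performance uses a 0.5x scale per axis; the 1-2 GB saving mentioned above is the commenter's own observation and will vary per game. A quick check of the ratio:)

# DLSS Performance renders internally at half the output resolution per axis,
# so a 4K output is rendered at 1080p, i.e. a quarter of the pixels per frame buffer.
out_w, out_h = 3840, 2160
scale = 0.5                               # Performance-mode per-axis scale factor
in_w, in_h = int(out_w * scale), int(out_h * scale)

print(f"internal render: {in_w}x{in_h}, "
      f"pixel ratio: {(in_w * in_h) / (out_w * out_h):.2f}")   # 1920x1080, 0.25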

    • @Lordssr
      @Lordssr 1 year ago

      Dlss 3 can save 4

  • @WTBMrGrey
    @WTBMrGrey 1 year ago +32

    Nvidia is charging top dollar for their products, pushing DLSS, RTX, AI, Reflex etc., but skimping on VRAM. The RX 470 came with 8GB of VRAM, and how old is that now?

    • @Kryptic1046
      @Kryptic1046 1 year ago +9

      It's a pretty counterintuitive thing Nvidia is pushing. On the one hand, they really want to sell you resource-intensive features like raytracing/path tracing but then they don't want to give you enough VRAM in the mid-range to actually use it along with decent textures. DLSS can only do so much. In the near future, you'll probably have to choose between either having high textures with RT off or lower textures with RT on. You simply won't get to do both due to VRAM constraints in newer games.

    • @NostalgicMem0ries
      @NostalgicMem0ries 1 year ago +3

      wanna compare 3060 3070 performance vs rx470?

    • @WTBMrGrey
      @WTBMrGrey 1 year ago +6

      @@NostalgicMem0ries well, obviously the 3060/3070 are a lot more powerful. That's the point, though. The 3070 is a decent 1440p GPU, but it only has 8GB of VRAM, which is pathetic. Even the 3060 has 12GB.

    • @mikeymaiku
      @mikeymaiku 1 year ago +1

      @@WTBMrGrey i guess you dont understand "why" it had 12gb

    • @mr.ihabissa8442
      @mr.ihabissa8442 1 year ago +1

      ​@@WTBMrGrey
      The 3060 has 12GB of VRAM because of its 192-bit bus; they can do either 6 or 12, not 8.. Regardless, the 3060 is a shit card even at 1080p vs the 3060 Ti / 3070.
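
(The bus-width point above, worked through as a sketch: each 32-bit memory channel carries one GDDR6 chip, and chips commonly come in 1 GB or 2 GB densities, so capacity comes in multiples of the channel count. Clamshell designs that double the chips per channel, like the 3090, are ignored here for simplicity.)

def capacity_options(bus_width_bits, chip_sizes_gb=(1, 2)):
    """Possible VRAM capacities given the bus width and common chip densities."""
    channels = bus_width_bits // 32          # number of 32-bit memory channels
    return [channels * size for size in chip_sizes_gb]

print(capacity_options(192))   # -> [6, 12]  (RTX 3060 class)
print(capacity_options(128))   # -> [4, 8]   (128-bit cards)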

  • @nukedathlonman
    @nukedathlonman 1 year ago +5

    I was thinking most games would be optimized for 2K these days... Now I know it's only a snapshot, and the accuracy has been called into question numerous times, but Steam's hardware survey does indicate 1080p is the most common resolution and it's on a very slow decline, with the next large chunk being 1440p and growing strongly.

    • @nukedathlonman
      @nukedathlonman 1 year ago

      @El Cactuar No, 2K is 2560x1440

    • @nukedathlonman
      @nukedathlonman 1 year ago

      @El Cactuar 1080 is HD (or as some companies call it "FHD")

    • @nukedathlonman
      @nukedathlonman 1 year ago

      @El Cactuar Oh, you're going by cinema resolution for the 2K labeling. Monitor manufacturer's will use QHD or 2K to describe 2560x1440.

    • @nukedathlonman
      @nukedathlonman 1 year ago

      @El Cactuar No, that's HD... Or FHD if you're going by manufacturers since they insist on calling 720 "HD"

  • @buda3d2007
    @buda3d2007 1 year ago +16

    I use Blender, where VRAM is king on larger scene files. Once you run out of VRAM, your card might as well be a great sports car spinning its tyres, working 10 times as hard to get the job done when it would only need to do it once had it had more VRAM.

    • @furynotes
      @furynotes 1 year ago +1

      Even with character portraits 12gb is recommended.

  • @prosecanlik4296
    @prosecanlik4296 1 year ago +5

    This is ONLY for those singleplayer titles, where you only aim for 60 fps at high or ultra at 1080p or higher. I usually don't play those games, only multiplayer shooters, esports like CSGO, so for me, 8gb vram would do just fine. Planning to get RX 6600 for that matter

    • @danielkowalski7527
      @danielkowalski7527 1 year ago +2

      rx6600 undervolted eats only 80w ^^
      Idk why but colours are way better on rx6600 than my old 1650

    • @prosecanlik4296
      @prosecanlik4296 1 year ago +1

      @@danielkowalski7527 will try to undervolt it if I get it one day

  • @jqwright28
    @jqwright28 1 year ago +4

    I'd say you're right about games being designed for 4k and then scaled down. Also games like TLOU remake that are next gen ps5 only titles, probably also are designed for 16gb of unified system memory or whatever it's called, so they probably will run best on anything that can give them 12-13gb on the gpu.

    • @gozutheDJ
      @gozutheDJ 1 year ago +3

      ALL games have been this way for a while. Doom 3 had a setting for maximum-quality, uncompressed textures, and then all the lower quality settings were scaled down from there. That's why games don't look like literal mud on low settings these days.

  • @ChiekoGamers
    @ChiekoGamers 1 year ago +4

    I'm still enjoying video games at 1080p high settings. I don't see the point of Ultra graphics.

  • @barrysloas277
    @barrysloas277 1 year ago +4

    1% of gamers are playing at 4K. Where are the 1440p stats, which more people are gaming at?

    • @Willbme4EVA
      @Willbme4EVA 1 year ago +1

      The gamer side of me really does not want to see grass swaying in the wind, I want to look at my opponents from a distance, before they see me. Take that shot and move on. The only time I would like to see a shadow is when an opponent is on the roof and casts a shadow on the ground outside. Higher HZ, not the p's or K's is my thing.

  • @darkkillex7220
    @darkkillex7220 1 year ago +27

    I've definitely noticed the VRAM on my 3080 10G being a limiting factor in quite a few games recently...

    • @greenbow7888
      @greenbow7888 1 year ago +5

      The 3080 10GB card could not even run Far Cry 6 HD textures, within a month of the 3080 release.

    • @thomassmith9362
      @thomassmith9362 1 year ago +3

      Well it is now almost 3 years old, you shouldn’t be expecting to max out games on it. I’m going on along nicely with that card, 1440p@high on the last of us works great.

    • @Bos_Meong
      @Bos_Meong 1 year ago

      try to run msi afterburner and see if your vram is actually eating up or not, dont just "noticed" lmao

    • @kaythree8302
      @kaythree8302 1 year ago +3

      @@Bos_Meong any decently competent person would assume that’s what he meant by “noticed”.

    • @Bos_Meong
      @Bos_Meong 1 year ago

      @@kaythree8302 decently competent? That's my line for you. I bet it was 100% placebo; he was just assuming and didn't really test it out himself. Because I'm running Cyberpunk at Overdrive right now and it only eats 9GB of VRAM, so how is this a limiting factor? Also, a 3080 can't do Overdrive anyway, so at Ultra it's going to consume far less VRAM, probably 7GB. Maybe he was playing Trash of Us, which is a badly optimized game

  • @Cogglesz
    @Cogglesz 1 year ago +8

    I'm rocking 8GB on my 3060 Ti; performance is fantastic for the price. I've noticed my only cap seems to be 4K: Doom Eternal with the Ultra Nightmare texture pool (everything else can be 1440p Ultra Nightmare), and Forza 5 eats it all up despite being able to run at 120 v-synced. The game would pause and complain of low bandwidth (much more than the Series X, funnily enough); I've wanted to just play at 60 with higher geometry to match the X. Turns out I'll always bump into this issue. It's annoying that my 64GB of RAM is basically doing nothing. (The Last of Us managed to hit 21GB of usage though, so gg's)
    Honestly, I think 12GB is the new 8GB card. The mid-range VRAM amount seems to be inflating like our currencies. I feel some blame on porting has to be stated: RE4's medium textures are worse than the PS4's somehow and it'll happily eat up an 8GB card.
    It's kinda sad when you've got a lot of resources but what holds you back is 8GB of VRAM. Call me a boomer, but I always felt 8 would be perfect for gaming; in the past we only really saw 12+ in professional cards until a few years back.

    • @Toulkun
      @Toulkun 11 months ago +1

      It comes down to trash optimizations too

  • @YouOnlyLiveOnce...
    @YouOnlyLiveOnce... 1 year ago +5

    Good data. Please include 1440p settings next time.

  • @telekarma
    @telekarma 1 year ago +4

    Devs can use more hardware resources with new/current gen only titles and this is what we get. From what I've seen 4k ultra and 1080p ultra VRAM usage delta isn't too big, about 1-2GB difference. That doesn't bode well for lower VRAM cards even if they are otherwise fast enough.

  • @kartikguha
    @kartikguha 1 year ago +1

    You missed the most important data point Brian, 1080P ultra. That's what people are most concerned about.
    Also, it's evident that in new titles higher texture/graphic quality is much more taxing than higher resolution.

  • @jasperfianen3431
    @jasperfianen3431 1 year ago +2

    I have a 1080 TI with 11GB Vram that thing is such a beast

  • @Hostile2430
    @Hostile2430 1 year ago +3

    I only recently upgraded my GPU to a 1660 Super 6GB and I already feel outdated: trying to run some current games at high settings, I exceed or consume most of my VRAM and suffer from stuttering.
    Crazy to think just a few years ago 8GB of VRAM was considered overkill, and now it's becoming the bare minimum requirement to run most modern AAA titles at acceptable framerates.

    • @zicksee0
      @zicksee0 7 months ago

      dawg 1660 super isn’t meant to run games at high settings lol.

  • @tomtomkowski7653
    @tomtomkowski7653 1 year ago +6

    We have had 8GB for so long that I would say it is obsolete. I mean, would you buy a new $400 GPU like the 4060ti and be already forced to lower settings at 1080p?
    12GB very soon will be the standard for 1080p gaming and if you want 1440p with RayTracing then you should have 16GB which should be the standard right now.
    12GB should be a standard for sub $500 GPUs and 16GB should be a standard for GPUs for more than $500 and 8GB should be some entry-level cards for sub $150.

    • @klanas40
      @klanas40 1 year ago +1

      It should, but it doesn't mean that will happen soon.

    • @brettlawrence9015
      @brettlawrence9015 1 year ago +2

      Yeah buying a brand new gpu for the current prices and having to reduce settings is a joke. At 4k I could understand but not 1080p 1440p.

    • @Willbme4EVA
      @Willbme4EVA 1 year ago

      If we are making requests to GPU makers, that says a lot. They do not seem to be listening. But if they are? Give me a supplemental plug-and-play alternative for VRAM. Preferably a slot stuffer for Xmas.

    • @bigturkey1
      @bigturkey1 1 year ago

      12GB of VRAM should last you until they start porting PS6 games

    • @brettlawrence9015
      @brettlawrence9015 1 year ago

      @@bigturkey1 depends if you want ultra settings then no. 12gb will be for medium to high settings.

  • @sebastienhebert6457
    @sebastienhebert6457 1 year ago +1

    New to your channel and I love it. You hit the sweet spot: pragmatic, technical, useful information. Thanks for that last part on 1080p high settings.

  • @AndyBarber1981
    @AndyBarber1981 1 year ago +2

    I was on Warzone yesterday (I play on a 3440x1440 ultrawide), put the game to Extreme on Al Mazrah to see what my 4070's performance was like, and it hit 11.2GB of VRAM

  • @HyperBawl
    @HyperBawl 1 year ago +12

    Amazing content as always ! I'm sure my 6700xt will last loooooong

  • @projectc1rca048
    @projectc1rca048 1 year ago +3

    LOL! When you said "VRAM-mageddon" I literally laughed out loud, love it. Only @Tech Yes City, man. I imagine with all these latest and greatest AAA titles raising the minimum PC requirements to run their games, especially the games that will be using Unreal Engine 5, 12GB of VRAM will be the new minimum/standard. Of course it will depend on the resolution and settings people play at. Great topic for a video and appreciate all the hard work, my guy. Keep up the great Tech Yes City content.

  • @YoStu242
    @YoStu242 1 year ago +1

    The same formula seems to apply in software development as in life in general, that if there is space in the apartment, it is lazily allowed to fill up with junk and you don't bother to clean it, let alone think about whether you even need all that junk

  • @ruxandy
    @ruxandy 1 year ago +7

    Great video! I would say that 12 GB VRAM at 1440p should be more than enough for the foreseeable future. I mean, sure, in the next couple of years there will probably be a new game which might require more than that for the absolute Ultra settings (Ultra textures, in particular), but I don't think we'll see a game where 12 GB of VRAM is unusable for High details anytime soon (it might happen when the next-gen consoles come out, but that's still a long way from happening). I for one have played The Last of Us with Ultra details @ 1440p on a Ryzen 7 5800X3D + RTX 4070Ti, and the experience has been absolutely flawless (had no crashes, and no stutters -> 65+ FPS for the 1% lows). So if this game runs great (and, as we all know, this title is the 'best' example of poor optimization), then I am not worried at all for the next 2 - 3 years. Fun fact: I've actually also played TLoU (in its entirety) on my backup PC, with an RTX 2060 @ 1080p/High details, and the experience, while not flawless, it still wasn't bad at all and very playable (and I didn't experience any crashes with this card either - must've been very lucky).
    On the other hand, 8 GB of VRAM is a whole different discussion. There were multiple signs throughout the past few years that... yeah, 8 GB VRAM wasn't gonna cut it anymore (especially considering the fact that new consoles came with 16 GB of unified memory). It's unfortunate that a lot of people did not listen and, even worse, they ended up spending 800+ euros for RTX 3070s and other cards like these.

    • @itsmorbintime6833
      @itsmorbintime6833 1 year ago

      Do you think 12gb vram will be enough for 1080p for a long time?

    • @Ruslan-Night
      @Ruslan-Night 11 months ago

      @@itsmorbintime6833 ?

    • @Ruslan-Night
      @Ruslan-Night 11 months ago

      ?

  • @destrike702
    @destrike702 1 year ago +4

    Currently have a RX 6600 8gb, can run the games on mid to high and quite happy on the performance.

    • @SlowHardware
      @SlowHardware 1 year ago

      I just sold my 6600 xt and bought a radeon vii, similar performance just 16gb vram. I did it because I got the radeon vii for $250 nzd 😅 sold the 6600 xt for $350 nzd

    • @destrike702
      @destrike702 1 year ago

      @@SlowHardware I hope i can find that on the same price, in my country, even the 2nd hand gpu market is overpriced.

    • @SlowHardware
      @SlowHardware 1 year ago

      @destrike702 oh for sure it's usually way over priced for what it is. I just started bidding and got a good deal :) I'd just save a search on a couple sites and check occasionally and one may show up cheap :)

    • @evilleader1991
      @evilleader1991 1 year ago

      @@SlowHardware What about power draw

    • @SlowHardware
      @SlowHardware 1 year ago

      @@evilleader1991 I have a 1000w psu it's fine

  • @necuz
    @necuz 1 year ago +8

    Much better VRAM management than what these games are doing is possible on more recent hardware, but PC is lagging behind the upgrade cycle a lot for obvious reasons. Using a DX12 Ultimate feature called Sampler Feedback you can figure out which parts of which textures need to be loaded at what quality in order to render a scene, this would massively cut down on VRAM usage especially in open world games. That could further be combined with DirectStorage 1.1 to quickly load textures on demand. The kicker is you then need to set your minimum requirements to RDNA2 or Turing, since that was when support for these was introduced. That would be a bold move, but I do wonder how many of the people still rocking their old 1060 are actually buying $70 AAA releases in 2023?
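
(A purely illustrative sketch of the sampler-feedback idea described above, not the actual D3D12 Sampler Feedback or DirectStorage API: it only estimates how much texture memory would need to stay resident if you kept just the mip levels actually sampled last frame, using made-up texture sizes and feedback values.)

def mip_chain_bytes(width, height, bytes_per_texel=4):
    """Total size of a full mip chain (each level halves the previous per axis)."""
    total = 0
    while width >= 1 and height >= 1:
        total += width * height * bytes_per_texel
        width //= 2
        height //= 2
    return total

def resident_bytes(width, height, finest_mip_needed, bytes_per_texel=4):
    """Size if only mips from `finest_mip_needed` downward are kept resident."""
    return mip_chain_bytes(width >> finest_mip_needed,
                           height >> finest_mip_needed, bytes_per_texel)

# Hypothetical feedback: most textures in view were only ever sampled at mip 1-2.
feedback = {"wall_albedo": 2, "ground_albedo": 1, "hero_character": 0}
full = sum(mip_chain_bytes(4096, 4096) for _ in feedback)
needed = sum(resident_bytes(4096, 4096, mip) for mip in feedback.values())
print(f"full chains: {full/2**20:.0f} MiB, feedback-driven: {needed/2**20:.0f} MiB")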

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago

      AFAIK RDNA 1 did not even support any DX12 ultimate feature.

    • @necuz
      @necuz 1 year ago

      @@arenzricodexd4409 Ack, you're right. For some reason had the impression RDNA1 also had rudimentary support for this.

    • @MrMeanh
      @MrMeanh 1 year ago

      The issue is that DirectStorage 1.1 will increase the load on the GPU (asset decompression done by the GPU etc.), this will 100% reduce the available compute for rendering the game. From what I've heard it's something like at least a 10-20% performance hit if you want to use DS+Sampler Feedback while rendering the game. This all means that I'm sceptical of Sampler Feedback being a good solution for reducing VRAM usage at the moment.

    • @necuz
      @necuz 1 year ago +1

      ​@@MrMeanh It certainly isn't going to be free, however 20% sounds more like a situation like trying to run TLoU at Ultra on a 8 GB card where you're constantly running out of memory. So the question becomes, would you rather have smooth 20% less fps or the current stuttery mess?
      Additionally, almost all of the usual suspects among released games that have triggered this debate struggle to be GPU bound, so in many cases it might actually end up being essentially free...

  • @ElladanKenet
    @ElladanKenet 1 year ago +2

    You can tell that graphics cards and games have improved as far as vram compression and utilization go, so kudos to the developers and graphics card makers for that at least. That said, for NEW stuff, I think 4gb and 6gb have clearly had their day, and 8gb is the new minimum for 1080p on AAA titles, with 12gb being preferred.
    Obviously for games that are not as graphically intense, or a couple of years old, you won't need that much vram. But now Cyberpunk isn't the only 'canary in the coalmine' here, as it is joined by Miles Morales, Last of Us, HL, Halo Infinite, and other titles, all wanting more vram. 8gb just isn't enough if you want to push higher resolutions AND max settings.

  • @garipoter6336
    @garipoter6336 1 year ago +1

    Next time when you're testing, please include 1440p low/high/ultra and 1080p low/high/ultra. Why am I asking for high and ultra separately? Because a lot of games see a minimal difference in visuals between high and ultra but a big gap in fps, so it would be nice to know what the difference is there. Also, 1440p is a pretty common resolution, even more mainstream than 4K...

  • @fVNzO
    @fVNzO 1 year ago +6

    I think it's important to note that these figures are indicative of what game developers *tolerate* in order to make their games work on popular graphics cards. Had the average been higher (Nvidia spending 20 bucks more on their GPU's etc.) games would indeed look better, grander or load quicker - and more vram would be needed. The mere second the average VRAM count goes up to 16+ game devs will just eat it up as they can finally make more intricate game environments. So, as data points these are fun benchmarks to run but they are ultimately a product of the limitations involved with producing software for customers who have been well frankly scammed for the past 6 years with no discernible increase in graphics VRAM in the mid range.

    • @TheAkashicTraveller
      @TheAkashicTraveller 1 year ago +4

      Except that's not what we're seeing. They're clearly targeting the consoles and ngreedia just isn't keeping up.

    • @soniofficial6017
      @soniofficial6017 1 year ago

      ​@@TheAkashicTraveller 😂😂😂

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago +1

      @@TheAkashicTraveller the issue on PC is those Ultra (plus RT) settings that make people think 8GB and 12GB are no longer enough, because consoles have 16GB. But consoles most often only use what equals medium on PC. In reality, developers on console most likely don't use as much of that memory as VRAM as many people think. In a game like Returnal, for example, the developers wanted 120FPS, so they actually render the game at 1080p and then use their own upscaling tech to upscale the final image to 4K. In a game like Spider-Man, the ray-traced reflections are rendered at 1080p instead of 4K, and in some comparisons the reflections on water puddles are completely disabled on console.

    • @fVNzO
      @fVNzO 1 year ago

      @@TheAkashicTraveller This is exactly what we are seeing. Consoles are proving my argument. The PS5 has 16GB of shared memory with a basically infinitely large cache behind it, right? It's the cheapest way to just give developers more VRAM. And they've been completely hamstrung on desktop since Nvidia just decided to stop at Pascal. This video is showing you that usage is around 12GB max for a lot of games; that is telling you precisely that developers are completely stuck, and they have little ability to give us better games when there is no standard way to cache more data like on consoles.

    • @LeegallyBliindLOL
      @LeegallyBliindLOL 1 year ago

      What is up with all these comments claiming Apples to apples comparisons with consoles.
      A) the PS5 for example uses GDDR6.
      B) Quite a bit of it is reserved.

  • @mattfarrar5472
    @mattfarrar5472 1 year ago +4

    Would have been good to see you run a 6 or 8gb card in those titles to see what it would do on different settings...

  • @BeniBalak
    @BeniBalak 6 months ago +1

    FYI, I had to buy a 4090 to be able to run my new 2x4K PCVR HMD (Bigscreen Beyond) bc my 12gb 3080ti was choking on vram at high quality setting, e.g. Alyx at ultra setting. The GPU was able to keep reasonable FPS but would choke on large texture files which I found out using a utility with a detailed data overlay.
    Just FYI for the tiny minority of PCVR users who do need at least 16g today :).

  • @narfsc2657
    @narfsc2657 1 year ago +1

    Most games are just console ports designed and optimized at 4K. This is not only an issue with VRAM usage, but also image quality at lower resolutions, because there are, for example, no real 1080p textures anymore, just downscaled 4K.

  • @Verpal
    @Verpal 1 year ago +5

    Generally NVIDIA seems to do texture/color compression more aggressively, thus the difference in utilization, still doesn't justify how stingy NVIDIA have been on VRAM though.

    • @arenzricodexd4409
      @arenzricodexd4409 1 year ago +2

      Delta color compression, to my knowledge, does not lower VRAM usage that much. In Nvidia's presentations, delta color compression helps reduce the bandwidth requirement quite significantly; that's why Nvidia cards usually have much lower bandwidth vs AMD cards of the same performance tier. AMD, for their part, decided not to optimize this because initially they thought HBM was going to totally replace GDDR memory on GPUs in the future.

  • @SlowHardware
    @SlowHardware 1 year ago +10

    Brian can you do a video on the radeon vii vs the 2080 in 2023? I'm curious how it stacks up now with it having 16gb vram

    • @mateyv
      @mateyv 1 year ago +4

      or 2080 ti vs 3070

    • @grlmgor
      @grlmgor 1 year ago

      A 2080 only has 8GB

    • @Kryptic1046
      @Kryptic1046 1 year ago +2

      @@mateyv - I've seen at least one channel (I don't remember which) comparing the 2080ti vs the 3070 in some recent titles and the 3070 was getting trounced in some of the tests by the 2080ti due to the 3070's VRAM limitation. The 3070 was as fast as the 2080ti, until it wasn't, due to the 8GB of VRAM.

    • @bryanwages3518
      @bryanwages3518 1 year ago

      I have a radeon vii and my cousin has a 2080. In most new games I can play at higher settings than he can and I don't have the frame drops like he does. We play a ton of warzone and his frame times are awful.

    • @SlowHardware
      @SlowHardware 1 year ago

      @bryanwages3518 thanks for the info, looks like I made a good choice getting one :)

  • @gozutheDJ
    @gozutheDJ 1 year ago +1

    You can really see where The Last of Us is allocating that VRAM, which is something no one ever points out.
    If you just take a second to look at a swath of wall or ground texture: where most games will use the same texture over large swaths of space, that's not the case with The Last of Us. It's one of the most insanely detailed games I've ever seen. No two sections of wall will look the same, tons of grime and overgrowth and filth. It's really incredible

  • @ilkerdemirci3720
    @ilkerdemirci3720 1 year ago +1

    Not even 12 is enough; 16 should be the minimum for future proofing. The 7900 XT is actually at the sweet spot

  • @inmypaants
    @inmypaants 1 year ago +13

    Thanks for testing these on GPUs with sufficient VRAM Brian. People arguing that you don’t need more than 8 and using 8 to test don’t realise games dynamically scale down textures. It’s fine if you have 8 and don’t notice or mind, but don’t argue that 8 is enough for midrange new GPUs, it simply is not.

    • @adriancioroianu1704
      @adriancioroianu1704 1 year ago +1

      It is if you adjust some video settings here and there, because only at 4K do you see 8+ required even on low, and mid-range is not targeted for 4K; people don't buy a 3070 to play at 4K, it's ridiculous. On the other side there are people (mainly RX 580 users) who try to convince new buyers that 12GB cards are trash and they should go for 16GB or more even at 1440p, because they saw a 4K benchmark (max settings) where the VRAM went over 12. It's funny and sad at the same time.

    • @FenrirAlter
      @FenrirAlter 1 year ago

      @@adriancioroianu1704 Yes, I'm most definitely buying a $600 GPU so I can just barely scrape by at 1440p while playing on high settings with RT on.

    • @inmypaants
      @inmypaants 1 year ago +1

      @@adriancioroianu1704 Brian didn’t show 1440p, plenty of people buy 3070 class GPUs to game at 1440p. That resolution also runs into issues at 6GB and it won’t be long until 8GB is saturated too. I don’t think people should sell and buy bigger GPUs mind you, I just think people should be vocalising to these companies that 12 is the minimum for midrange and really 16 is the value Nvidia and AMD should offer to really entice buyers.

  • @Tubes78
    @Tubes78 1 year ago +4

    12GB is enough for most high end gaming for now.. But so was 8GB just a few months ago. I'm not sure what's needed in 2-3 years. I can imagine people spending 800+ US don't want to start lowering a lot of settings that fast.

    • @peterpan408
      @peterpan408 1 year ago +1

      16GB of course.

    • @brettlawrence9015
      @brettlawrence9015 1 year ago +1

      I can see 16gb being used in the next few years.

  • @basbas63
    @basbas63 1 year ago +2

    I think there is a big chance that optimization for current-gen consoles is making this issue happen.
    Given they have a predictable data streaming speed, they can stretch the available memory much further by swapping stuff in and out than would be realistic on a PC. On PC it is not possible to predict what storage speeds people are using.
    With last-gen consoles the slow hard drive kept this in check for the PC space; however, with the storage and memory speeds & architecture of current-gen systems..
    I think that, in a sense, we either need minimum storage speed requirements, or massive VRAM buffers if we want to keep using SATA for the storage games are installed on. Or developers should take PC limitations into consideration in the design of their games.

  • @martytube821
    @martytube821 ปีที่แล้ว +1

    It just seems that the vast majority of gamers worldwide won't be able to play some newer games, since most people have 8GB of VRAM or less - so who are these game companies selling to?

  • @ABaumstumpf
    @ABaumstumpf ปีที่แล้ว +8

    With VRAM it also depends heavily on the engine and the game itself how memory is treated.
    Some engines by default just try to keep more assets in memory if more is available, some games (Hogwarts, The Last of Us) just waste memory left, right and center, and some games dynamically adjust LoD to stay within memory (a rough sketch of that idea is below).
    Hogwarts can run on a GTX 970 at medium settings, 1080p - just not in the state the game is delivered in. They really need to fix that when even the community has already fixed it with mods.
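    To illustrate what "dynamically adjust LoD to stay within memory" can mean in practice, here's a deliberately simplified, hypothetical sketch - the function name, the texture sizes and the single global bias are all made up, not taken from any real engine:

# Hypothetical sketch: pick a global texture mip bias so the streaming pool
# fits a VRAM budget. Dropping one mip level roughly quarters a texture's size.
def choose_mip_bias(texture_sizes_mb, vram_budget_mb):
    for bias in range(0, 5):                      # 0 = full res, then 1/4, 1/16, ...
        pool_mb = sum(size / (4 ** bias) for size in texture_sizes_mb)
        if pool_mb <= vram_budget_mb:
            return bias, pool_mb
    return 4, sum(size / (4 ** 4) for size in texture_sizes_mb)

# Example: ~10 GB of full-res textures squeezed into a 6 GB budget.
textures = [512] * 20                             # twenty 512 MB texture sets
bias, pool = choose_mip_bias(textures, vram_budget_mb=6144)
print(f"mip bias {bias}, pool ~{pool:.0f} MB")    # -> mip bias 1, pool ~2560 MB

    A real engine would do this continuously and per texture group rather than with one global bias, but the budget-driven idea is the same.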

    • @winebartender6653
      @winebartender6653 ปีที่แล้ว

      "Fixed with mods" lmfao no. The only thing any of those mods, that even "worked", did was adjust the culling distance, which made texture pop in and texture resolution hilariously bad.
      Let's also stop pretending that they are "hogs" when they are developed with a vram buffer of 12gb for consoles.
      And the reality of the situation is that you shouldn't be limited by vram, you should be limited by the chips performance to then adjust settings that fit your fps needs. You shouldn't need to crank down settings because your vram buffer is too small.

    • @ABaumstumpf
      @ABaumstumpf ปีที่แล้ว

      @@winebartender6653 ""Fixed with mods" lmfao no. The only thing any of those mods, that even "worked", did was adjust the culling distance, which made texture pop in and texture resolution hilariously bad."
      nah, people have shown that the game keep full resolution textures loaded for background objects that are only rendered at lowest LoD.
      "Let's also stop pretending that they are "hogs" when they are developed with a vram buffer of 12gb for consoles. "
      aka - "we know it runs with 12 GB so just cram in everything even if that degrades performance on all hardware cause it still runs okish".
      "You shouldn't need to crank down settings because your vram buffer is too small."
      And you shouldn't need to crank down settings cause a tree 760 meters away rendered at lowest LoD is using more Vram than an NPC standing right next to your character.

  • @TheSleppy
    @TheSleppy ปีที่แล้ว +7

    I think if this type of test were run for longer - for example, 10 minutes of play vs. 30 minutes - the VRAM utilization would be even higher.
    A 5-minute benchmark sometimes doesn't tell the whole story (a simple way to log this yourself is sketched below).
    I agree overall that 12GB is the new minimum; great video.
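    If you want to check this on your own long sessions, here's a minimal logging sketch - it assumes an Nvidia card with nvidia-smi on the PATH (AMD users would need a different tool), and the 10-second interval is arbitrary:

# Poll dedicated VRAM use every 10 s and print the running peak (Nvidia only).
import subprocess, time

def vram_used_mb():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True)
    return int(out.splitlines()[0])          # first GPU only

peak = 0
while True:
    used = vram_used_mb()
    peak = max(peak, used)
    print(f"{time.strftime('%H:%M:%S')}  used={used} MB  peak={peak} MB")
    time.sleep(10)

    Leave it running while you play; the peak it reports after 30+ minutes is the number that matters, not the reading from the first five.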

    • @mnemonic8757
      @mnemonic8757 ปีที่แล้ว +1

      Exactly, and I don't think every reviewer has time for such tests. I remember FFXV's open world: with the best textures it ran smoothly right after launching, but once you drove around the map and visited a city, the VRAM filled up. I think this problem can occur in other titles too, especially open-world games where a huge amount of data has to be loaded.

    • @greggmacdonald9644
      @greggmacdonald9644 ปีที่แล้ว +1

      Hardware Unboxed addressed this recently; their video on it is well worth watching. I agree that 12GB is the new minimum, and I'd also say that 16GB is better.

    • @RicochetForce
      @RicochetForce ปีที่แล้ว

      @@greggmacdonald9644 Yeah, I'd say 16GB is what mid range cards should have. 8GB VRAM is stone dead and 16GB of system RAM is much the same.

  • @kay_a_s
    @kay_a_s ปีที่แล้ว +1

    Sporting 8GB of VRAM at the moment with my 5700 XT. The few games I'm playing are already eating 6-7GB of VRAM, and after upgrading to an ultrawide monitor (AW3423DW) when my old budget monitor died, that went up to 7-7.5GB. I have to tone down some graphics settings just to play without lag. 8GB is starting to hit its limit, and 16GB should be the next standard for years to come.

  • @rathstar
    @rathstar ปีที่แล้ว +2

    Absolutely great video and very informative.
    I'm sitting here with a 1080p monitor and an 8GB GTX 1080; the GPU market at the moment is a pain for anyone wanting to upgrade at a reasonable price. When upgrading from a 10-series GPU I feel I should be significantly increasing the VRAM, but then the prices start to get crazy. I'm thinking an RX 6700 or a second-hand 2080 Ti, but I may just hang on.

    • @user-jj4bv5yk5k
      @user-jj4bv5yk5k ปีที่แล้ว

      You're right, a 6700 / 6700 XT / 2080 Ti is the way to go.

  • @benjaminmaher8896
    @benjaminmaher8896 ปีที่แล้ว +4

    My 1070 is still pushing along, but not for much longer, I feel, given the trend of games shipping wildly unoptimized.

  • @danieljayasiri7739
    @danieljayasiri7739 ปีที่แล้ว +2

    Upgraded my old 6600K platform but I'm still holding onto my EVGA 1080 Ti 😅 Upgrading soon, but this time I'll probably go AMD and take advantage of SAM, since I've moved to a Ryzen 7 anyway... ❤ for all the 1080 Ti users still holding on

    • @firexz4185
      @firexz4185 ปีที่แล้ว +2

      I believe you shouldn't upgrade your GPU now, because this 40-series / RX 7000 generation is horrible value.

    • @blinksone2768
      @blinksone2768 ปีที่แล้ว

      @@firexz4185 7000 is decent.

    • @Willbme4EVA
      @Willbme4EVA ปีที่แล้ว

      The 1080 Ti is no joke; it's really hard to upgrade without selling a great-great-grandchild at a 10% loss.

    • @danieljayasiri7739
      @danieljayasiri7739 ปีที่แล้ว +1

      @Willbme4EVA It's even harder when you plan on passing it down to your kid and you need to part with over NZ$1.5k for your next GPU 😆

    • @firexz4185
      @firexz4185 ปีที่แล้ว

      @@blinksone2768 For the price, they're not worth it.

  • @surfx4804
    @surfx4804 ปีที่แล้ว +1

    I have seen 15GB+ of VRAM used on a 4090 in Cyberpunk: Ultra settings, 4K, DLSS Quality, Path Tracing, Frame Generation.

  • @PotatMasterRace
    @PotatMasterRace ปีที่แล้ว +2

    12:47 Basically, games use the same 4K texture packs at both resolutions.

    • @ms3862
      @ms3862 ปีที่แล้ว

      Exactly. This was even confirmed by two developers on MLID's podcast: games don't even use properly high-resolution textures yet. Enabling 4K just outputs the image at a higher resolution, but the underlying asset textures are the same at all resolutions, and they're low quality. Take even the best-looking game you know of, move the camera right up against a texture, and it will look very blurry and low quality. (A rough calculation of why output resolution barely changes VRAM use is below.)
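      A quick back-of-the-envelope calculation shows why output resolution barely changes total VRAM use when the texture assets are identical - every number below is an illustrative assumption, not a measurement from the video:

# Back-of-the-envelope: render-target memory grows with resolution, the texture
# pool does not, so total VRAM barely moves between 1080p and 4K. Numbers assumed.
def render_target_mb(width, height, targets=6, bytes_per_pixel=8):
    # e.g. a G-buffer plus a few post-process buffers, ~8 bytes/pixel each (assumed)
    return width * height * targets * bytes_per_pixel / 1024**2

texture_pool_mb = 7000            # same assets regardless of output resolution

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    rt = render_target_mb(w, h)
    print(f"{name}: render targets ~{rt:.0f} MB + textures ~{texture_pool_mb} MB "
          f"= ~{rt + texture_pool_mb:.0f} MB")
# 1080p: ~95 MB + 7000 MB;  4K: ~380 MB + 7000 MB  -> the texture pool dominates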

  • @puddingfoot
    @puddingfoot ปีที่แล้ว +4

    Hey! Love your content. I've also been living in Japan for the last 17 years or so and sometimes check out Janpara for deals, and I totally agree with your stance on buying used parts for viable builds for 95% of games.
    However, with VRAM requirements hitting 12GB for high settings in new AAA titles, I'm torn between a few cards:
    RX 6800 - ~48,000 yen, used
    RX 6700T - ~38,000 yen, used
    RTX 3070* - ~43,000 yen, used
    *I'm only considering the 3070 for the AI video upscaling feature in VLC/Chrome/Edge (Video Super Resolution). What a killer feature! Does AMD offer anything similar, or is it in the works? I'd go with AMD in a heartbeat if so.
    Nvidia recommends the 3070 for the highest level (level 4) of Video Super Resolution, but some 3060 Ti users report using VSR at level 4 without issues. Do you have an opinion on this? Maybe the technology is too young.
    Hope your allergies aren't so bad now. This year has sucked for cedar allergies in Japan, but we're in the home stretch to Golden Week and less pollen in the air! Ganbare 🤘

  • @hartsickdisciple
    @hartsickdisciple ปีที่แล้ว +3

    I don't see any 1440p results, but based on the 1080p and 4K numbers, it looks like 12GB should be enough for 1440p high/ultra.
    There's no solid reason to believe VRAM requirements will increase substantially from where we are now during this console generation. 8GB is the new comfortable minimum, 12GB for 1440p, and 16GB for 4K if you want a little headroom.

  • @keyput415
    @keyput415 ปีที่แล้ว +1

    I don't understand why you wouldn't test 1440p. Very few people game at 4K; even if I had a 4K monitor I'd probably drop the resolution to 1440p for intensive games.

  • @YIFF_PUP
    @YIFF_PUP ปีที่แล้ว +1

    Thank god I went for a 16GB 6800. I never thought I'd get even remotely close to that amount of VRAM usage at 1440p.

  • @mechanicalpants
    @mechanicalpants ปีที่แล้ว +4

    The real issue is that these specific titles have really poor development/optimisation, IMO. Dead Island 2 just released, looks just as good, and runs very well on GPUs with 4GB of VRAM, even with nice-looking textures. I saw a GTX 1060 6GB running a mix of Ultra and High settings at around 60fps (including Ultra textures).
    So why can't The Last of Us etc. do that? Because of shoddy programming! It's happened many times before with PC ports, but this time the neglect was centered on VRAM, so some have jumped to the conclusion that we all must have way more VRAM to solve the issue (and progress in that area of GPU tech obviously has benefits), but the main problem at the moment is devs doing a bad job.
    Many other devs have done the work on their games properly, and there are plenty of examples (Dead Island 2, RE4 Remake, etc.): they look fantastic and don't have these VRAM issues. But some devs don't care, or simply can't manage because of how the company is run, and the consumer is expected to just deal with it.

    • @nombredeusuarioinnecesaria3688
      @nombredeusuarioinnecesaria3688 ปีที่แล้ว +1

      RE4 Remake crashes when you enable ray tracing with 8GB of VRAM - on the RTX 3070, for example - but on the 12GB RTX 3060 it doesn't happen. And when you play a game with too little VRAM, the textures become blurry. Something similar happens with Doom Eternal (another well-optimized game). Everyone is talking about it now because it's more and more evident that 8GB is not enough for a card with the power of the RTX 3060 Ti or higher.

    • @mechanicalpants
      @mechanicalpants ปีที่แล้ว +1

      @@nombredeusuarioinnecesaria3688 Yes, the 3060 Ti and many similar cards should have more VRAM; NVIDIA is just ripping people off at the moment, and folks should buy GPUs with more VRAM now and in the future - on this we agree.
      But the reason TLOU and some other games are a stuttery mess even on 8GB cards is bad programming. This was happening without ray tracing at 1080p in TLOU, and yet other games have textures that look just as good but run smoothly on much less VRAM, e.g. Dead Island 2 on 4GB and 6GB GPUs that are considered old now.
      RE4 Remake may crash with ray tracing, but at least it runs acceptably at high texture levels on older GPUs like the GTX 1060 6GB, a $300 card from 7 years ago, just the way it still should. Certain games like TLOU will have a meltdown if you attempt the same thing, yet they still don't look any better than RE4 Remake.
      This clearly indicates that some devs don't know what they're doing or simply don't care.
      More VRAM is great, especially for 1440p and 4K, but an 8GB GPU essentially collapsing because you want to run decent-looking textures (i.e. ones that don't look like blurry crap from the PS3 or even PS2 era), and only at 1080p, is really ridiculous.
      It's mostly happening with console ports because the devs aren't bothering to optimise for PC and make their games run well on the platform. It can be done, but whether or not they want to, or are allowed to by the people who run the company, is the question.
      If PC gamers put up with this BS, it will likely keep happening.

  • @ellypsis603
    @ellypsis603 ปีที่แล้ว +9

    16GB should be standard these days - Nvidia sold the 10-series with 8GB back in 2016!
    And that's why those cards aged so well.

  • @wayland7150
    @wayland7150 ปีที่แล้ว +2

    The RX Vega has 8GB, but there's an option in the driver to add 4GB of system RAM to the VRAM. Battlefield 2042 seemed to expand itself into the now-12GB pool, but I can't say whether it really helped.

  • @cogthusiast1150
    @cogthusiast1150 ปีที่แล้ว +1

    Makes me feel like my 4080 with 16GB will be outdated in 12 months.
    It seems like developers do their testing on 24GB cards and don't think much about optimization.

  • @MinosML
    @MinosML ปีที่แล้ว +3

    Bryan yet again listening to what the community is preoccupied with and giving us a banger video with tons of useful info! Hope to see more videos on this subject so people buying GPUs right now are aware of the compromises they'll have to make with lower amounts of VRAM. Devs are definitely focusing more and more on the current consoles/4K, and it shows. Thanks again for the quality content!

    • @Willbme4EVA
      @Willbme4EVA ปีที่แล้ว +1

      Totally agree - more, more, more. Just one vid can't cover the full spectrum, but he does try to jam a lot into this one. A Tech YES City series, please!

  • @vaggeliskosiatzis5487
    @vaggeliskosiatzis5487 ปีที่แล้ว +4

    It's simple: developers on current consoles have access to 13.5GB of VRAM, whereas on the previous gen they had 5.5GB. When the PS4 Pro and Xbox One X were out, you could buy a brand-new RX 580 8GB for $240 - a lot more VRAM than the total amount those consoles could use - with the same memory bus and bandwidth as the consoles, faster than the Pro and about on par with the One X. Now that the consoles can utilise 13.5GB in total, where are the $300 16GB GPUs with the same memory bus and bandwidth, delivering similar or slightly better performance than the consoles? That's the question PC gamers should be asking, instead of making bad decisions by buying $450 8GB or $800 12GB cards and then blaming the games for needing more VRAM than they supposedly should to look good. That's a given when a $500 console can use up to 13.5GB of VRAM while running a game. Accept reality and make better purchasing decisions by looking at the used market too, and consider both companies, Nvidia and AMD, before you buy a GPU.

    • @brettlawrence9015
      @brettlawrence9015 ปีที่แล้ว

      They should have bought a card from AMD. You could tell from the next-gen console specs what would happen. 16GB will cover this whole generation.

    • @bigturkey1
      @bigturkey1 ปีที่แล้ว

      The PS5 has 12GB of VRAM; the Xbox has 11.

    • @vaggeliskosiatzis5487
      @vaggeliskosiatzis5487 ปีที่แล้ว

      @@bigturkey1 No, that's incorrect. Both consoles have 16GB of memory in total, and 2.5GB is reserved for the OS they run; the rest can be used by the devs.

    • @bigturkey1
      @bigturkey1 ปีที่แล้ว

      @@vaggeliskosiatzis5487 I think it's 4GB for the OS.

  • @jamesbolho
    @jamesbolho ปีที่แล้ว +2

    Your video and analysis confirm one thing: although many recent cards should have launched with more VRAM, this sudden issue is also down to a lack of optimization.

  • @RonBurrgundy
    @RonBurrgundy ปีที่แล้ว +1

    If you want to play without stuttering at 1080p, set your desktop resolution to 1080p and set scaling to "Display" in the graphics driver, then run the game in a borderless window. That way both the desktop and the game run at 1080p, so you're not downscaling from 2160p to 1080p, which saves VRAM. Don't forget to set scaling to Display instead of GPU. Now I can run The Last of Us and Hogwarts with High textures.

  • @BigLadGreen
    @BigLadGreen ปีที่แล้ว +5

    New games are made to be unoptimised so you keep buying expensive upgrades. Gaming in 2023 is a scam. Old games are the future.

  • @oldmanwithers4565
    @oldmanwithers4565 ปีที่แล้ว +3

    Test a 4GB card to see how much it actually affects things in the real world.

  • @bradb2012
    @bradb2012 ปีที่แล้ว +2

    Suggestion for a part 2: what impact on visuals does going from low texture detail to ultra have? Is it better to have 4K with low texture detail, or 1080p with ultra texture detail... where is the happy medium?

  • @soldier9927
    @soldier9927 8 หลายเดือนก่อน +2

    I've played titles like God of War, Elden Ring, Fallout 4, Atomic Heart and RDR2 at high-to-ultra settings, 4K DLSS Quality, at a stable 40-50 fps on an RTX 2060 laptop (Legion 5, 115W, 6GB VRAM) with a 6-core Ryzen CPU, output to an LG OLED TV in full screen.

  • @SKHYJINX
    @SKHYJINX ปีที่แล้ว +4

    Wouldn't it be more informative to use different generations and compare baselines, since different architectures buffer more than others in various engines?
    A 4090 on low textures still seems to buffer more than, say, a 6GB 980 Ti at the same settings... so differing architectures might hold revelations, whereas using only flagship GPUs might skew results with their huge caching buffers.
    I hope for another round using 8GB GPUs, not the flagships, where game engines might buffer more just because they see a more powerful GPU device ID.

  • @kurilrick2207
    @kurilrick2207 ปีที่แล้ว +9

    8 gigs of VRAM is plenty for 1080p, except of course when you run poorly optimized PC ports like The Last of Us or Hogwarts Legacy.

    • @nombredeusuarioinnecesaria3688
      @nombredeusuarioinnecesaria3688 ปีที่แล้ว +3

      Or when you play well-optimized games with ray tracing (RE4 Remake, Doom Eternal). 8GB is not good for a card that costs more than $300.

    • @brettlawrence9015
      @brettlawrence9015 ปีที่แล้ว +1

      You do realise it will become more prevalent now that games are built for next-gen consoles?

    • @kurilrick2207
      @kurilrick2207 ปีที่แล้ว

      @@nombredeusuarioinnecesaria3688 Haven't played the RE4 Remake yet, but I played Doom Eternal with RT enabled (1080p) and have had no issues so far. I agree that 8 gigs of VRAM is too little for an expensive card, though; I really hope Nvidia's next 70-class card (the 5070) will have at least 16 gigs of VRAM, or else it's going to be overpriced garbage.

  • @italianoinca
    @italianoinca ปีที่แล้ว +1

    The right question many people are asking is not whether 12GB is enough in 2023 (we know it is), but how long before it's obsolete at Ultra settings and high resolutions - and according to many, that's probably no more than 2 years or so...

  • @gavetapenis8896
    @gavetapenis8896 7 หลายเดือนก่อน +1

    Just bought a 4070 Super with 12GB of VRAM; honestly, it's more than enough. People in forums do a lot of fearmongering around VRAM quantity. I expect my new GPU to run fine at 1080p/1440p on high settings with good frame rates for at least 4 years.

    • @sirdan357
      @sirdan357 7 หลายเดือนก่อน +1

      I just bought one too. Worst case scenario I can sell it for $400 in a year or two and upgrade to a better card.

  • @AmitKolay
    @AmitKolay ปีที่แล้ว +3

    Good luck, 4070 and 4070 Ti users; even the 16GB 4080 may be an issue next year.

    • @Jaml321
      @Jaml321 ปีที่แล้ว +4

      If they paid $600/$800 for those POS cards, they deserve to get screwed. No sane person would buy a graphics card with less than 16GB of VRAM in 2023, especially at those ridiculous prices.

    • @raul1642
      @raul1642 ปีที่แล้ว

      @@Jaml321 $666 in my country; not nice, Nvidia.

  • @ghosttheoremproductions5469
    @ghosttheoremproductions5469 ปีที่แล้ว +1

    Newer games don't ship lower-resolution texture packs for the lower settings, so you're using a lot of VRAM regardless of resolution - hence the non-linear usage scaling. I expect the industry will backpedal on this trend, though: too much of the market is still gaming at 1080p on modest hardware for this choice to make sense. World market conditions will likely cause hardware upgrades to stagnate further, and DLSS won't offer a backwards-compatible solution, since the lower render resolution is still paired with the same high-resolution textures.

  • @peterjansen4826
    @peterjansen4826 ปีที่แล้ว +2

    It's well known that Nvidia uses more aggressive compression; that's why its memory utilization is lower. Is that good or bad? Hard to tell. I know that in the past Nvidia's compression produced more visible artefacts; I don't know whether that's still the case. It's a matter of critically comparing the output against uncompressed video recordings.

  • @pcmaravilla
    @pcmaravilla ปีที่แล้ว +2

    Man, I play with 6GB (RTX 2060) at 1080p and I don't have major problems; even The Last of Us at 1080p medium runs at 60 fps with no stuttering... I think people are just obsessed with 4K.