Not Enough VRAM!!!

  • Published Aug 25, 2024
  • Is 8GB of VRAM enough? In some recent games, the answer is a clear NO. I began by testing Resident Evil 4 Remake and noticed that the RTX 3070 Ti cannot even run the game without crashing unless textures are turned down to keep VRAM in check, which is a shame because it clearly has enough performance to run the game maxed out but simply runs out of VRAM. I then discuss other recent games that use a lot of VRAM, like Hogwarts Legacy and Forspoken. I also believe this will become even more of an issue over time as game developers start to target their 1440p max settings toward the newer generation of cards like the 4070 Ti and 4070, which feature 12GB of VRAM. (A minimal VRAM-monitoring sketch, for readers who want to watch usage themselves, follows the spec list below.)
    Test system specs:
    CPU: Ryzen 7700X amzn.to/3ODM90l
    Cooler: Corsair H150i Elite amzn.to/3VaYqeZ
    Mobo: ROG Strix X670E-a amzn.to/3F9DjEx
    RAM: 32GB Corsair Vengeance DDR5 6000 CL36 amzn.to/3u563Yx
    SSD: Samsung 980 Pro amzn.to/3BfkKds
    Case: Corsair iCUE 5000T RGB amzn.to/3OIaUsn
    PSU: Thermaltake 1650W Toughpower GF3 amzn.to/3UaC8cc
    Monitor: LG C1 48 inch OLED amzn.to/3nhgEMr
    Keyboard: Logitech G915 TKL (tactile) amzn.to/3U7FzA9
    Mouse: Logitech G305 amzn.to/3gDyfPh
    What equipment do I use to make my videos?
    Camera: Sony a6100 amzn.to/3wmDtR9
    Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
    Camera Capture Card: Elgato CamLink 4K ‎amzn.to/3AEAPcH
    PC Capture Card: amzn.to/3jwBjxF
    Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
    Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
    Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
    Greenscreen: Emart Collapsable amzn.to/3AGjQXx
    Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
    RGB Strip Backlight on desk: amzn.to/2ZceAwC
    Sponsor my channel monthly by clicking the "Join" button:
    / @danielowentech
    Donate directly to the channel via PayPal:
    www.paypal.com...
    Disclaimer: I may earn money on qualifying purchases through affiliate links above.
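
    For anyone who wants to watch VRAM usage while reproducing this kind of test, here is a minimal spot-check sketch (my addition, not the setup used in the video). It assumes an NVIDIA GPU with the driver's nvidia-smi tool on PATH; run it while the game is open.

    # One-shot VRAM reading via nvidia-smi (assumed to be available on PATH).
    import subprocess

    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used_mib, total_mib = (int(x) for x in out.strip().split(", "))
    print(f"VRAM used: {used_mib} MiB of {total_mib} MiB "
          f"({100 * used_mib / total_mib:.0f}%)")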

Comments • 3.2K

  • @danielowentech
    @danielowentech  ปีที่แล้ว +91

    I wanted this video to be a bit more general than just RE4, but I made a follow up video showing more details about the crashing in RE4, best ways to fix it, more details on what the texture settings mean and how they work, etc: th-cam.com/video/8fMB62F9_7I/w-d-xo.html

    • @tommypearson9260
      @tommypearson9260 ปีที่แล้ว +2

      I agree with this game not needing features like RT or DLSS turned on. It's amazing to me that people want them when, at 1080p-1440p, this game still runs at over 160fps.
      I tried to say this in a live stream but the person didn't understand; I was only talking about this specific use case, RE4, not every game on the planet lol.........

    • @chun1324
      @chun1324 ปีที่แล้ว +2

      Capcom be like: You could lower the setting to use less vram, you can always fix the issue with a 4090 🎉

    • @VMaster1997
      @VMaster1997 ปีที่แล้ว +2

      I also think developers are really lazy about optimizing their games, because I can't really tell that a game needing more VRAM looks any better than a game from 4 years ago that needed less. I think they could do a lot with optimization on the VRAM side.

    • @alephnole7009
      @alephnole7009 ปีที่แล้ว +1

      Try turning ray tracing off and shadow quality down.
      Then push the texture quality up and see how that works, because DF's testing showed it's RT that's the issue. Something breaks when RT is combined with texture quality above 4GB.
      They also mention shadows affecting the crashes.

    • @alephnole7009
      @alephnole7009 ปีที่แล้ว +1

      @@ConfinedSpiral-xy8qz Yeah, it's the ray tracing that's actually causing crashes. People can go into the red on VRAM without it and not have any issues.

  • @emeritusiv1366
    @emeritusiv1366 ปีที่แล้ว +2571

    The fact that they gave 12GB to the 3060 instead of the 3070, 3070 Ti or 3080 is mind-blowing.

    • @PeterPauls
      @PeterPauls ปีที่แล้ว +903

      The original RTX 3060 has a 192-bit memory bus and each memory module uses 32 bits, so 192/32 = 6: you can fit 6 modules. Nvidia decided to go with 6x2GB modules because 6GB would have been too little. The RTX 3060 Ti, 3070 and 3070 Ti have a 256-bit memory bus, which means 256/32 = 8, so they just went with 8x1GB modules, and this was the biggest mistake IMO; they should have put 2GB modules on those cards, so the 3060 Ti, 3070 and 3070 Ti would have been 16GB cards. As for the RTX 3080, which has a 320-bit wide memory bus, 320/32 = 10, so they went with 10GB; and because they released the RTX 3080 Ti with a 384-bit bus, where 384/32 = 12, that one got 12GB. They f'd up their lineup. It should have been: RTX 3090 with 24GB, RTX 3080 with 20GB, RTX 3070 Ti, 3070 and 3060 Ti with 16GB, RTX 3060 with 12GB and RTX 3050 with 8GB. I hope my logic is understandable even though English is not my best.
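
      The arithmetic in that comment can be sketched in a few lines. This is just my reading of the comment (one 32-bit channel per GDDR6/6X module, module density of 1GB or 2GB), not an official Nvidia spec sheet:

      # Capacity options implied by bus width, assuming one module per 32-bit channel
      # and 1GB or 2GB module densities (the comment's assumptions, not a spec).
      def vram_options(bus_width_bits, densities_gb=(1, 2)):
          modules = bus_width_bits // 32
          return [modules * d for d in densities_gb]

      for name, bus in [("RTX 3060", 192), ("RTX 3070 / 3070 Ti", 256), ("RTX 3080", 320)]:
          low, high = vram_options(bus)
          print(f"{name}: {bus}-bit bus -> {bus // 32} modules -> {low}GB or {high}GB")
      # RTX 3060: 192-bit bus -> 6 modules -> 6GB or 12GB
      # RTX 3070 / 3070 Ti: 256-bit bus -> 8 modules -> 8GB or 16GB
      # RTX 3080: 320-bit bus -> 10 modules -> 10GB or 20GB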

    • @od1sseas663
      @od1sseas663 ปีที่แล้ว +309

      @@PeterPauls Stop with the excuses. We know how it works with the bus thing. Just change the bus and put 12Gigs of VRAM or just make it 16.

    • @DanielOfRussia
      @DanielOfRussia ปีที่แล้ว +220

      @@od1sseas663 You do realize changing the bus impacts the performance as memory bandwidth either gets higher or lower?

    • @od1sseas663
      @od1sseas663 ปีที่แล้ว +26

      @ロシアのダニエル That's why I said "just put 16Gigs". Same bus.

    • @od1sseas663
      @od1sseas663 ปีที่แล้ว +22

      @Blue Yeah, that's why I said "or just put 16Gigs".

  • @winj3r
    @winj3r ปีที่แล้ว +985

    NVidia knows very well what they are doing by offering GPUs with such low vram.
    It's a good way to force people to upgrade and spend more money, much sooner.

    • @AAjax
      @AAjax ปีที่แล้ว +68

      Entirely this. As Jensen said, Moore's Law is dead, so how else is Nvidia going to get everybody on-board for a 50 series upgrade? Especially since CPUs are now bottle-necking the top end GPUs.

    • @LarsH0NEYtoast
      @LarsH0NEYtoast ปีที่แล้ว +30

      Yeah and it worked lol
      Played the demo with my 3060 Ti and it struggled to maintain 1440p 60fps in the demanding areas, so I ended up feeling compelled to upgrade. I got a 4070 Ti and now I'm running the game at 1440p 120fps with even higher textures. To think there are rumors of the new 4060 still coming out with 8GB of VRAM is ridiculous.

    • @Toxicin2
      @Toxicin2 ปีที่แล้ว +80

      @@LarsH0NEYtoast News just in: gamers get played and even pay for it.
      That's simping, my son.

    • @Hombremaniac
      @Hombremaniac ปีที่แล้ว +22

      It's actually an ingenious move from Nvidia. You manufacture quite potent GPUs, but you gimp them with not enough VRAM. Nvidia still sold those for extra-high prices (and saved costs on VRAM) and used DLSS as a sort of excuse for the low VRAM and demanding ray tracing. They seem to be continuing this trend, since the 4070 Ti has a measly 12GB, and in order to produce good FPS with ray tracing on you are yet again forced to use DLSS.

    • @gameordeath9464
      @gameordeath9464 ปีที่แล้ว +13

      @@LarsH0NEYtoast The 4060 being rumored to still have 8GB should just be a shame to Nvidia. AMD is out here with the 6700/6750 XT at 10GB for the non-XT and 12GB for the XT, which is just crazy to me. I surely hope they don't actually give it 8GB, as it will definitely fall out of the "trend" soon with these new, very demanding games coming out. I'm just glad I game on my AMD-card PC and keep my Nvidia for Adobe Suite creator stuff. It's just so dumb, and they know people will still buy it because it's Nvidia. I've seen people on Reddit talk about how their 3080 (10GB) and 3080 Ti (12GB) are struggling.

  • @Bestgameplayer10
    @Bestgameplayer10 ปีที่แล้ว +275

    I tried telling people early on that this was going to be an issue very soon. This is part of the reason why I'm on AMD. My 6800XT has 16GB of VRAM.

    • @agusvelozo3189
      @agusvelozo3189 ปีที่แล้ว +3

      I was thinking abt buying a 6800, what model did u buy? I'm thinking abt buying the phantom gaming.

    • @Bestgameplayer10
      @Bestgameplayer10 ปีที่แล้ว +8

      @@agusvelozo3189 I just have the AMD Radeon (the reference model). I didn’t get an AIB this time around.

    • @agusvelozo3189
      @agusvelozo3189 ปีที่แล้ว +2

      @@Bestgameplayer10 and is the performance good?

    • @Bestgameplayer10
      @Bestgameplayer10 ปีที่แล้ว +2

      @@agusvelozo3189 I suppose so. I haven’t looked into the performance of the other brand 6800XTs to know for comparison but I ain’t had no performance issues.

    • @agusvelozo3189
      @agusvelozo3189 ปีที่แล้ว

      @@Bestgameplayer10 nice, and drivers wise did u have any problem?

  • @Hypershell
    @Hypershell ปีที่แล้ว +933

    Well, I have a 3070, and Nvidia is doing their best to make sure my next upgrade is to AMD.

    • @83Bongo
      @83Bongo ปีที่แล้ว +82

      You only have yourself to blame. You knew the card had 8gb when you bought it.

    • @Mike0200000000
      @Mike0200000000 ปีที่แล้ว +21

      My 3070 handles all games perfectly, along with fhd and qhd. If I have to lower the textures from ultra to high, it runs great. Ray tracing and dlss make for nicer graphics and the card is better than the 6700xt

    • @Hypershell
      @Hypershell ปีที่แล้ว +86

      @@83Bongo Yes, I did. I'm happy with it, I'm just saying it's something to think about in the future.
      I'm more a jack-of-all-trades user than a hardcore gamer, so Nvidia having the better feature set was a draw (especially that NVENC encoder). But AMD is getting better and Nvidia is habitually going nuts with the power draw lately, so I didn't expect to be sticking to one brand forever.

    • @realpoxu
      @realpoxu ปีที่แล้ว +71

      @@Mike0200000000 Well, the 6700 XT can run this at 1440p and max settings + RT at over 60 FPS, and FSR 2.1 does look good indeed. 3070 < 6700 XT.

    • @Steel0079
      @Steel0079 ปีที่แล้ว +23

      Same, me too. Once I got 3070, I started running every game at max. The games that give me problem are the ones needing more VRAM. Sad part is that these games run out of VRAM, at 1080p. FFS.

  • @giuseppeabate7342
    @giuseppeabate7342 ปีที่แล้ว +169

    Missed the opportunity to name the video "Not Enough VRAM, Stranger!"

    • @nishant5655
      @nishant5655 ปีที่แล้ว +6

      🤔🤔

    • @maximmk6446
      @maximmk6446 ปีที่แล้ว +6

      Lmfao mate 😂

    • @rohanchooramun7288
      @rohanchooramun7288 ปีที่แล้ว +5

      Good one... lmoa🤣🤣

    • @gamingbillyo333
      @gamingbillyo333 ปีที่แล้ว

      Broo, thank you Giuseppe! Gave me the title for my next video! My 4080 16GB was upgraded to a 4090 24GB to play this with everything maxed out! Not a proud moment supporting Nvidia's insane prices twice! Smh

  • @RAaaa777
    @RAaaa777 ปีที่แล้ว +31

    The patch just automatically lowers the textures so that the game doesn't lag.
    If you pay attention, in some areas the textures change.

  • @mirzaaljic
    @mirzaaljic ปีที่แล้ว +35

    After making a huge mistake of buying a 3GB version of GTX 1060 back in the day, I said that the first thing I'll be looking for in the next graphics card will be VRAM. Which is why I went for AMD this time around.

    • @happybuggy1582
      @happybuggy1582 ปีที่แล้ว

      Agreed, with tears, from a 2GB GTX 960.

    • @Shatterfury1871
      @Shatterfury1871 11 หลายเดือนก่อน +1

      Welcome to team RED, we have cookies, enough VRAM, raw raster and a bit worse ray tracing.

    • @Crazical
      @Crazical 10 หลายเดือนก่อน +1

      ​@@Shatterfury1871and the funky niche things like RSR and AFMF yeahhh

    • @zazoreal5536
      @zazoreal5536 9 หลายเดือนก่อน

      Gtx 1060 3gb club. I have it running in my backup PC.

    • @jagjot1697
      @jagjot1697 9 หลายเดือนก่อน +1

      I'm in the same boat. Yet another disappointed owner of the 1060 3GB. I'm thinking of going with the RX 6750XT now

  • @Ladioz
    @Ladioz ปีที่แล้ว +390

    Running out of VRAM is probably the saddest thing ever. Having a fast and expensive card, with all the components in the build up to date and on point, and you still have to worry about a VRAM meter in game........

  • @wakingfromslumber9555
    @wakingfromslumber9555 ปีที่แล้ว +817

    NVIDIA: "We have ray tracing, path tracing, DLSS 3.0, but you need to pay 1200 dollars for it..."
    Consumers: "Can we have high quality textures too?"
    Nvidia: "No, you need to pay at least 1500 bucks for that."
    Consumer: "But I don't need all this ray tracing stuff."
    Nvidia: "I don't care, you are having it."

    • @arfianwismiga5912
      @arfianwismiga5912 ปีที่แล้ว +16

      that's why Nvidia give you DLSS

    • @Mike80528
      @Mike80528 ปีที่แล้ว +70

      Nvidia: "And tons of AI cores that you also don't really need, but our profitable endeavors require."

    • @winj3r
      @winj3r ปีที่แล้ว +96

      The funny thing is that RT also requires more vram. So people eventually will have to turn off RT, because of vram limitations, despite the GPU being capable of RT.

    • @user-wq9mw2xz3j
      @user-wq9mw2xz3j ปีที่แล้ว +7

      not really. If you disable raytracing, it uses much less vram.

    • @dreamcrabjr7408
      @dreamcrabjr7408 ปีที่แล้ว +83

      That’s why you get a rx 6800 xt with 16gbs of vram 😎

  • @ookiiokonomiyaki
    @ookiiokonomiyaki ปีที่แล้ว +5

    Some folks on the internet complained about the small VRAM pool on Ampere RTX cards, but 95% of gamers claimed 8 gigs was enough and thought they were just hating. Now, with developers adapting to 9th-generation consoles, it seems 8 gigs is in fact not as future-proof as we had assumed.

    • @infini_ryu9461
      @infini_ryu9461 7 หลายเดือนก่อน

      Lol. Yeah. Not only are games so unoptimized for PC, the graphics cards don't have enough VRAM anymore.
      I mean, I have a 4090, so it's never gonna be a problem, but you shouldn't need that to play a game your graphics card could run normally if allowed to.
      Honestly, I think people are taking too much advice from 720p/1080p high fps competitive gamers. We all know to get beautiful games you're gonna want way more VRAM. 12GB will not be enough very soon.

  • @aapknaap
    @aapknaap ปีที่แล้ว +13

    About a year ago I bought a 3060 12gb because i thought 8 gigs of vram wouldn't be future proof and here we are.

    • @christopherbrewer2154
      @christopherbrewer2154 ปีที่แล้ว

      me too man

    • @verrat3219
      @verrat3219 ปีที่แล้ว +2

      The 3060 has decent VRAM but it's a weak card due to its performance. So even though it's more future-proof than 8GB cards, the card's performance itself is going to be low in taxing games like the RE4 remake. The good thing is that at least it will still be playable, unlike the 3070, where it's unplayable/crashing.

    • @aapknaap
      @aapknaap ปีที่แล้ว +4

      The 3060 aint that bad. Never had performance issues so far

    • @dinakshow924
      @dinakshow924 ปีที่แล้ว +1

      @@aapknaap People draw their conclusions from 120fps, others from 60fps... For example, my GPU is an RX 6800 and the box says "4K gaming", and I said "WTF, 4K?" This GPU runs games at 4K, but at an average of 60fps; for 144fps I need to play at 1080p. What I mean is that if they say the "RTX 3060 is a bad card", it's very likely they say that because it doesn't reach 144fps at 4K.

    • @aapknaap
      @aapknaap ปีที่แล้ว

      Yeah

  • @DrearierSpider1
    @DrearierSpider1 ปีที่แล้ว +159

    Nvidia has a history of this. Recall that the 780 Ti launched for $700 at the same time as the PS4 & XB1. While that card is substantially more powerful than the 8th gen consoles, it didn't have the VRAM to keep up. In 2013/2014, developers were still developing games for cross-gen platforms, which meant they were still constrained by the 512MB of RAM in the 360/PS3, so the 3GB wasn't initially a problem. While most 8th gen & PC versions of games featured higher quality textures, those earlier games could still work within a 3GB limit.
    But once games started being developed for the 8GB consoles exclusively, the VRAM limits on Kepler really became an issue. We're in the same boat with Ampere, and probably the "midrange" Lovelace models if rumors are to be believed.
    The Radeon R9 390 launched 8 years ago with 8GB of VRAM for $330. It's long past time 8GB was a thing of the past for all but display-adapter-tier GPUs.

    • @frippyfroo6064
      @frippyfroo6064 ปีที่แล้ว

      I upgraded to a 3070 ti from a GTX 660 lmao I have 0 complaints

    • @therockwizard.9687
      @therockwizard.9687 ปีที่แล้ว +12

      @@frippyfroo6064should have bought a 6800

    • @andersjjensen
      @andersjjensen ปีที่แล้ว +5

      @@frippyfroo6064 If you're on 1080p it's not going to be a problem for the foreseeable future. If you're on 1440p you're going to have to adjust settings in mostly every new title by the end of this year. If you're on 4K and using the highest DLSS quality mode to get there, it's going to be the same story. If you're trying for native 4K you'll already be adjusting settings, because the card lacks both the grunt and the VRAM.

    • @Ben256MB
      @Ben256MB ปีที่แล้ว +6

      I remember when the Radeon VII came out with 16GB for 699, a lot of Nvidia fans were dissing the card, but now the Radeon VII is actually going to age well.
      An 8GB VRAM GPU shouldn't cost 400+ dollars at all, but Nvidia are so selfish.
      I just wish AMD priced their GPUs cheaper; then they would have a big market share.

    • @od13166
      @od13166 ปีที่แล้ว +2

      Back then people pointed out the R9 290X had 4GB, so it didn't have that issue.
      But due to the 780 Ti missing several DirectX features, it has aged horribly.

  • @Burdman660
    @Burdman660 ปีที่แล้ว +326

    I skipped last generation because the 3070 only had 8gb of VRAM. I surpass 8gb VRAM in A LOT of titles. I ended up picking up the 7900XT, and it has been fantastic. Middle finger to you Nvidia!

    • @Hombremaniac
      @Hombremaniac ปีที่แล้ว +28

      Some folks (me included) just recently grabbed an RX 6800 XT for a good price and are going to ignore the current gen. Also a feasible solution, and a more budget-wise one I guess, since previously we had covid/crypto-inflated prices and only now are they slowly coming down.

    • @alsoyes3287
      @alsoyes3287 ปีที่แล้ว +1

      How are the temps and the power draw? I was considering 7900xt, but it seems to have power draw issues with multiple monitors, draining 90w when idle at times.

    • @likeagrape57
      @likeagrape57 ปีที่แล้ว +8

      I got a 3070 in early 2021 but ended up upgrading to a 7900xt just recently because I was tired of vram constraints. You made a good choice skipping the 3070

    • @oxfordsparky
      @oxfordsparky ปีที่แล้ว +1

      7900XT is such garbage value, literally bought the worst option.

    • @likeagrape57
      @likeagrape57 ปีที่แล้ว +15

      @@oxfordsparky that's true. But at least it's garbage value that should last a while.

  • @OMaMYdG
    @OMaMYdG ปีที่แล้ว +61

    As a 3070 ti owner myself I'm starting to see this happen in certain games. It's very frustrating but of course there's ways around it.

    • @webtax
      @webtax ปีที่แล้ว

      could you summarize some tips to go around this, so i can know what to look for when searching?

    • @OMaMYdG
      @OMaMYdG ปีที่แล้ว +9

      @webtax basically just have to mess with some of your graphics settings. Turn down the resolution or ray tracing.

    • @nape_soot_asd
      @nape_soot_asd ปีที่แล้ว

      @@OMaMYdG What does it look like when you don't use ray tracing
      and don't play at 4K?

    • @gonlyhlpz
      @gonlyhlpz ปีที่แล้ว +2

      The best option would be to use DLSS Balanced or Performance, or FSR 2.0; you can't run native 4K with this card. I have been using this card for more than a year, and it's a solid card for 1080p.

    • @OMaMYdG
      @OMaMYdG ปีที่แล้ว +5

      @gonlyhlpz I'm playing at 1440p and with most games it's very good. Just the recent games like RE4 have issues, even with DLSS. I just turn the settings down a bit and I'm fine.

  • @stusen6153
    @stusen6153 ปีที่แล้ว +203

    While people slate Nvidia (and rightly so), a big part of this issue is the unoptimized mess that companies release just to meet deadlines. The recent Steam survey shows that the majority of people are still on 8GB or less of video RAM. Having anything less than 4K require more than 8GB of VRAM is just poor work by the devs. As is often the case, VRAM issues are fixed by patches down the line, so in most cases they are entirely avoidable. Both manufacturers and game studios need to do better!

    • @francescobenelli2707
      @francescobenelli2707 ปีที่แล้ว +3

      Exactly

    • @MelindaSordinoIsLiterallyMe
      @MelindaSordinoIsLiterallyMe ปีที่แล้ว

      Two points:
      - If you buy an Nvidia GPU, suck it. No seriously, if you buy one and then complain that you are "forced" to upgrade to a new overpriced Nvidia GPU, you absolutely deserve it, sucker.
      - I do agree that Nvidia has always skimped on VRAM, but I also don't think it is that serious. Everyone references The Last of Us Part I regarding that VRAM topic. That's basically the only big example and The Last of Us Part I is an awfully unoptimized PC port. Shader caching would lower the VRAM need by a lot. I don't want to play that game, so I don't f***** care either way.

    • @remmykun8315
      @remmykun8315 ปีที่แล้ว +52

      The real problem is that consoles are and have always been a priority in the gaming industry. And since they are; you as a PC gamer always should pay attention to how those consoles are made in order to decide which PC parts you should choose. PS5 and Xbox Series X have 8 CPU cores and 16 gigs of unified memory. So, if a company like nvidia releases a new product with just 8 gigs of vram, they know it won’t age well. And sometimes (depending on price or the fact that we cannot wait for whatever reason) they force you to buy a product with clear planned obsolescence. But if you have the option to skip the generation, go with the competition or ultimately buy the “halo” product (24 gigs) then maybe you should do it based on the new consoles’ features that tells you how the games are going to be… sooner than later.

    • @waifuhunter9709
      @waifuhunter9709 ปีที่แล้ว +5

      @@remmykun8315 Really good mindset.
      I will remember your advice for my next purchase.

    • @noctus9731
      @noctus9731 ปีที่แล้ว +8

      RE4R is greatly optimized and the 8GB still can't handle it mate.

  • @josh4434
    @josh4434 ปีที่แล้ว +81

    My 6950XT is crushing this game in 1440p. Mostly sits at 100-130FPS on Max settings RT ON. I'm at Chapter 5 now and the heavy rain makes it dip down to 80+fps. Still, awesome experience with settings maxed out. Not a single crash so far.

    • @tomhanke7703
      @tomhanke7703 ปีที่แล้ว +1

      Yes, if you played at 4K you would have crashes like me, because it then sometimes uses more than 16GB of VRAM.

    • @josh4434
      @josh4434 ปีที่แล้ว +1

      @Tom Hanke that sucks. My 5800X3D/6950XT system is set up for 1440p. The highest I've seen it hit is 14GBs in a couple of games

    • @garyb7193
      @garyb7193 ปีที่แล้ว +3

      @@tomhanke7703 Almost everything in life comes with compromises and limitations. He says he's gaming at 1440p 120fps with RT. I say he's gained more (in playability) than what he's lost. I made the same choice, which is why I picked up a used RX 6800 XT 16GB for $425.

    • @DaddyHensei
      @DaddyHensei ปีที่แล้ว +9

      4k is not all that worth it anyways. Monitors are over priced for a decent refresh rate. Meanwhile 1440p you can get a nice one for way cheaper and still look pretty amazing. Josh hit the jack pot.
      Ya can confirm that the 6950xt actually can do RT pretty well at 1440p. I run a 6950xt and a 7700x at 1440p. My gaming experience has been nothing but butter smooth with ultra settings across every game I've thrown at it. Callisto, Deadspace, Cyberpunk, heck even hogwarts (Don't recommend rt though, not because fps but because the reflections will blind you. Hogwarts simply isn't made for rt haha) All of those games have been wonderful for me at 1440p

    • @tomhanke7703
      @tomhanke7703 ปีที่แล้ว

      @@DaddyHensei Well, I can't even play at 2K anymore; it's too unsharp and doesn't look very good in my opinion, so I would rather play at 4K without RT with everything maxed instead of 2K with RT. And the money thing is not an issue for me: 800€ for a 32-inch 4K 144Hz 1ms HDR monitor is OK for the price.

  • @DrearierSpider1
    @DrearierSpider1 ปีที่แล้ว +115

    I bought a GTX 1080 at launch and with its 8GB buffer the card was able to last me almost 5 years. Now with my 3080 I'm being VRAM limited only 2-2.5 years into its life.

    • @AlexanderMuresan
      @AlexanderMuresan ปีที่แล้ว +18

      I ended up going from a GTX 1080 to a 7900 XTX. The 4080 was 50% more expensive and the 4090 was double the price of the 7900 XTX. And I don't have any kind of brand loyalty. The 1080 served me incredibly well for 6 years, but I needed something to handle modern games at 1440p and 4K resolutions. Team red seemed like the obvious choice this time around.

    • @Azurefanger
      @Azurefanger ปีที่แล้ว +14

      Even the RX 6700 XT and 6800 XT have more VRAM and are cheaper. Nvidia just wants people to upgrade more often with this poor VRAM provisioning on the high-end GPUs. I mean, even the 3060 got 12GB of VRAM; why did the 3070 and 3080 get less? Only for this reason: so people need to buy another one.

    • @Hippida
      @Hippida ปีที่แล้ว +4

      Same with my Vega 64. To this day, its main limitation is the 8GB of VRAM.

    • @crimson7151
      @crimson7151 ปีที่แล้ว +9

      I have a 3080 and this issue is killing me, the gpu itself is extremely powerful, with overclocking it can get really close to the 3090 and they decide to give it only 10gb of vram.

    • @Obie327
      @Obie327 ปีที่แล้ว

      @@AlexanderMuresan I had the same problem running 1440p with my GTX 1080, I decided to give Intel a try and switch to Team Blue. ARC A770 16 gb easily handles all my quality settings and gaming frame rates. It's a damn shame Nvidia's greed has pissed off their loyal fan base.

  • @parzival3632
    @parzival3632 ปีที่แล้ว +24

    Same with RX 6600 XT. When I set it to prioritize graphics, my GPU is only used at 60 - 70% max. Meaning I can set the graphics to higher. But as soon as I do and exceed the 8GB limit waaay too much, the game just crashes. I kinda regret not getting the RX 6700 XT now. It had 12 GB VRAM. I never thought VRAM would be an issue this quickly at 1080p.

  • @fredericothordendjent1408
    @fredericothordendjent1408 ปีที่แล้ว +13

    Definitely would love for you to do a comparison between the rtx 3080 12gb and the 6800 xt all maxed out with this game, I feel like the same thing could happen with the 3080..

  • @alchemira
    @alchemira ปีที่แล้ว +261

    The game runs perfectly fine on my rx6800 with 16GB VRAM @ 1440p. I chose 6800 over 3070(ti) because of VRAM. 4060(ti) will be DoA with 8GB VRAM.

    • @theelectricprince8231
      @theelectricprince8231 ปีที่แล้ว

      Nvidia are delusional if they think they are apple

    • @muresanandrei7565
      @muresanandrei7565 ปีที่แล้ว +6

      Why 1440p? I have the same card and I do 4K, fully maxed with ray tracing, fine at 60fps. 1440p looks like shit once you've played at 4K.

    • @atnfn
      @atnfn ปีที่แล้ว +103

      @@muresanandrei7565 Maybe he doesn't have a 4k monitor, I don't. I figure in fact most people don't.

    • @xiyax6241
      @xiyax6241 ปีที่แล้ว +9

      Yee bro, same. A lot of games I play use up to 10-12GB of VRAM. AMD over Nvidia for GPUs right now.

    • @xiyax6241
      @xiyax6241 ปีที่แล้ว +2

      @@muresanandrei7565 ye man have same card its insane no limitations like nvidia 8gb cards

  • @Jefferson-he5rb
    @Jefferson-he5rb ปีที่แล้ว +77

    I think Nvidia is doing that on purpose. By including less VRAM, they force their customers to upgrade more often than they should. They add lots of useful features: DLSS, ray tracing, CUDA cores for streaming and 3D modelling. It feels future-proof, making you want to join the green team. But after 1-2 years a fact hits you: low VRAM... For example, the RTX 3070 is a great card, like an all-in-one. Great raw performance, good for 3D modelling and streaming, better-than-competitors RT performance. However, when you render high-poly projects in Blender, the RTX 3060 outperforms the RTX 3070, because with its lack of VRAM the RTX 3070 crashes. You try to play on high textures with the RTX 3070, but it gives less performance, or crashes, compared to the RTX 3060 because of the VRAM. Even if it is so clear why Nvidia is doing this, adding 2-4GB more VRAM shouldn't be that expensive. The RX 6700 XT is a more future-proof card than the RTX 3070 and 3060 Ti in my eyes.

    • @Frank-fg4jx
      @Frank-fg4jx ปีที่แล้ว +15

      It's the reason I will never buy another gpu from nvidia

    • @caribbaviator7058
      @caribbaviator7058 ปีที่แล้ว +5

      ​@@Frank-fg4jx Same here. The RX6800 gives me way better performance than any Nvidia card I own.

    • @DerpDerpson
      @DerpDerpson ปีที่แล้ว +6

      You hit the nail on the head. They started this practice with the GTX 970, but it went under the radar for most people because back then it wasn't causing that much of an issue. It is a big one today when nearly all developers can't into texture compression (due to everybody and their mother using UE4 and now 5), so they just let it rip and max out VRAM allocation even when the game looks like it's from 2010.

    • @MelindaSordinoIsLiterallyMe
      @MelindaSordinoIsLiterallyMe ปีที่แล้ว

      Two points:
      - If you buy an Nvidia GPU, suck it. No seriously, if you buy one and then complain that you are "forced" to upgrade to a new overpriced Nvidia GPU, you absolutely deserve it, sucker.
      - I do agree that Nvidia has always skimped on VRAM, but I also don't think it is that serious. Everyone references The Last of Us Part I regarding that VRAM topic. That's basically the only big example and The Last of Us Part I is an awfully unoptimized PC port. Shader caching would lower the VRAM need by a lot. I don't want to play that game, so I don't f***** care either way.

    • @pooriaroohie
      @pooriaroohie ปีที่แล้ว

      @@DerpDerpson They actually started it with the GTX 780 and GTX 780 Ti, which both had 3GB of VRAM, and then they made another variant of the GTX 760 that had 4GB of VRAM, and that was the most absurd thing I ever saw.

  • @j.c.denton2312
    @j.c.denton2312 ปีที่แล้ว +21

    The main reason I bought the 6800 XT over the 3080 was the increased VRAM; kinda wild to see that paying off already. This game runs great! My only complaint is that the upscaling methods are poorly implemented.

    • @clownavenger0
      @clownavenger0 ปีที่แล้ว

      How good does it run raytracing in this game?

    • @scylk
      @scylk ปีที่แล้ว +3

      ​@@clownavenger0 ray tracing is a joke anyway. Coming from someone who bought a 4070ti

    • @clownavenger0
      @clownavenger0 ปีที่แล้ว +1

      @@scylk Yeah mostly. I like some RT GI in games with dynamic lighting when its done well. shadows and reflections can be faked pretty well

    • @valenrn8657
      @valenrn8657 ปีที่แล้ว

      @@clownavenger0 Refer to Resident Evil Village.

    • @clownavenger0
      @clownavenger0 ปีที่แล้ว

      @@valenrn8657 okay. that was kinda poo

  • @BattlecryGWJ
    @BattlecryGWJ ปีที่แล้ว +12

    I bought a 3070 and a 3070 TI once I could get them at MSRP after prices crashed, wound up returning both and picking up a 6800 XT because I thought with more vram it would probably have better legs. Would love to see a comparison with a 6800 XT in these titles now.

    • @WalkerIGP
      @WalkerIGP ปีที่แล้ว +1

      You are not the only one. I am going to trade up my 3070 Ti in a few days to a 6950 XT or 7900 XTX.

    • @user78405
      @user78405 ปีที่แล้ว

      You should go with the 7900 XT; a 6800 XT is not that much more powerful, and more VRAM alone doesn't help when it's slower in other games that the 3070 Ti runs best in, even with its small VRAM... odd choice to pick last gen... And for others I would recommend holding on: Microsoft has already addressed the DirectX 12 issue of copying between VRAM and system RAM (via upload heaps), so devs can patch games with new code that benefits 8GB VRAM cards by allowing the CPU to access VRAM directly rather than adding latency from system RAM the old-fashioned way, which has been around for over 30 years of processing.

    • @WalkerIGP
      @WalkerIGP ปีที่แล้ว +1

      @@user78405 In my opinion a 6950XT would be better, at just 600 to 700 USD, that is what I choose, on its way now. The 7900XT is not good value when you can get a XTX for just 100 USD more. This means that there is no reason to go for the 7900XT as a more powerful card is within reach. If the 7900XT was 700 to 800 USD I would have gone for it. I do agree that a 6800XT is not enough of an uplift from 3070ti, you need at least 6900XT or better to notice a difference. I also realized in my earlier comment I wrote XT, meant to write XTX.

    • @madtech5153
      @madtech5153 ปีที่แล้ว

      time to watch hardware unboxed's vid then

  • @IrocZIV
    @IrocZIV ปีที่แล้ว +50

    The 1070 already had 8GB; I always thought Nvidia was being stingy with VRAM on their newer cards.

    • @sofyo2bib
      @sofyo2bib ปีที่แล้ว

      But not the same bandwidth; the 3070 has double the 1070's, but it's still not enough hhhh

    • @tankdjsims
      @tankdjsims ปีที่แล้ว +3

      Yeaaa, I just had a 1070 and upgraded to a 3070; sad that they both have the same VRAM. That never made sense to me when upgrading, but I still went and bought a 3070 for $300.

    • @Deernailt
      @Deernailt ปีที่แล้ว

      @@tankdjsims I had a 1070, upgraded to a 2080 in 2018-2019 (same 8GB of VRAM, more fps), then upgraded to a Gigabyte 4090 Gaming OC in 2023, so I've got 24GB of VRAM now and skipped the 30 series.

    • @tankdjsims
      @tankdjsims ปีที่แล้ว

      @@Deernailt even though I just got this maybe two weeks ago, im really thinking about selling it and upgrading to a 40s 😩

    • @JudeTheYoutubePoopersubscribe
      @JudeTheYoutubePoopersubscribe ปีที่แล้ว

      I went from 1070 to 4070ti very recently.

  • @diablosv36
    @diablosv36 ปีที่แล้ว +153

    This is what I had with the 970 back in the day: I was playing Tomb Raider and it would stutter like crazy. This is why I rejected the 3070 and got an RX 6800 instead. And now, 2 years later, a game like RE4 is something I can comfortably max out on this card.

    • @ml_serenity
      @ml_serenity ปีที่แล้ว +1

      Did you reject it because you knew the 6800 has 16GB of VRAM vs the 3070's 8GB, or simply because you had a bad experience with the 970?

    • @azraeihalim
      @azraeihalim ปีที่แล้ว +23

      @@ml_serenity I think both

    • @zentar2646
      @zentar2646 ปีที่แล้ว +34

      @@ml_serenity Because he had experience with a VRAM-gimped product like the 970 before, he knew he shouldn't repeat that and should instead get the offering with more VRAM.

    • @EyeofValor
      @EyeofValor ปีที่แล้ว

      I'm maxed at a 10gb card. The op uploader is a moron and knows very little.

    • @xwar_88x30
      @xwar_88x30 ปีที่แล้ว +10

      @@ml_serenity He's basically saying he's future-proofing by going for a GPU with beefy VRAM so he doesn't end up with a card gimped by the lack of it. This happened last gen with consoles: people saying "aw, 2GB and 4GB is enough", which lasted like a year, and then the GPUs struggled. Now that the PS5 etc. are out, games are just going to get more VRAM-hungry as time goes on. Always future-proof when it comes to PCs, especially when it comes to VRAM. The 6800/6800 XT are going to just dominate the 3070 and 3080 as time goes on. It's sad because, like Owen said, the 3070 and 3070 Ti are easily able to max these games out, but the lack of VRAM has gimped those cards.

  • @dinakshow924
    @dinakshow924 ปีที่แล้ว +4

    For that reason I like AMD. Nvidia is too expensive and yet doesn't have enough VRAM.

  • @sL1NK
    @sL1NK ปีที่แล้ว +1

    I use a 3070 Ti at 3440x1440, and I have never had any issues with running out of VRAM. People really exaggerate the "not having enough VRAM" thing. Hogwarts Legacy has VRAM management issues, and it can be fixed with a few extra lines in the config file.
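
    The comment doesn't say which config lines it means, so the sketch below only illustrates the kind of tweak commonly passed around for Unreal Engine games: raising the texture streaming pool in Engine.ini. The file path and the pool size are assumptions; back up the file and adapt to your own install before trying anything like it.

    # Illustrative only: append a texture streaming pool override to Engine.ini.
    # The path and value are assumptions, not a verified fix from the comment above.
    import os

    engine_ini = os.path.expandvars(
        r"%LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini"
    )  # assumed default location on Windows
    override = "\n[SystemSettings]\nr.Streaming.PoolSize=3072\n"  # MB; size to fit your card

    with open(engine_ini, "a", encoding="utf-8") as f:
        f.write(override)
    print("Appended streaming pool override to", engine_ini)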

  • @gromm225
    @gromm225 ปีที่แล้ว +73

    The main reason I went with a 6700 XT was VRAM. The new Xbox/PS consoles use 16GB of RAM shared between the CPU and GPU, so essentially 12GB of VRAM assuming 4GB is reserved for the CPU. Since game developers tend to focus on consoles and their capabilities, it's probably a good idea to have AT LEAST 12GB of VRAM. I could be wrong, but I always thought the 30-series cards didn't have enough VRAM, and it seems to be starting to have an effect.

    • @pk417
      @pk417 ปีที่แล้ว +15

      The truth is, even consoles don't use ultra settings.
      So imagine what is going to happen in 2024 with 8GB GPUs.

    • @victorsegoviapalacios4710
      @victorsegoviapalacios4710 ปีที่แล้ว +2

      That is not how it works; the real maximum VRAM available is around 10GB in the best case, usually even less. There is a reason for the memory config of the XSX.

    • @larion2336
      @larion2336 ปีที่แล้ว +1

      Same here, also 6700 XT and it's been great. Runs a lot cooler than my old 1070 TI as well.

    • @saricubra2867
      @saricubra2867 ปีที่แล้ว

      But you are ignoring the 3080Ti and 3090.

    • @TankoxD
      @TankoxD ปีที่แล้ว +6

      @@saricubra2867 they are 2 times more expensive

  • @Avarent01
    @Avarent01 ปีที่แล้ว +58

    I am very happy that I chose my 6900XT over the 3070/3080. Not only is it super power efficient, but yes, the 16GB Vram make it potentially more future proof as well.

    • @renoputra5219
      @renoputra5219 ปีที่แล้ว +20

      I sold my 6900 XT for a 3080 10GB, the most stupid decision I've ever made in my life 😢

    • @vladimirdosen6677
      @vladimirdosen6677 ปีที่แล้ว +1

      @@renoputra5219 Lowering textures can be compensated for by using DLSS, which is superior in image quality to FSR. I would argue ray tracing and DLSS are worth more.

    • @rubsfern
      @rubsfern ปีที่แล้ว

      Same here :)

    • @nicane-9966
      @nicane-9966 ปีที่แล้ว +10

      @@vladimirdosen6677 DLSS and FSR are cheap techs that decrease your image quality. Besides, the 6900 XT is at 3090 level, so it was already more powerful anyway.

    • @dampintellect
      @dampintellect ปีที่แล้ว +6

      @@vladimirdosen6677 That's all fine and good if you think so. But I would like to point out that Raytracing itself uses additional VRAM when turned on, so you would have to lower textures even further.

  • @chariothe9013
    @chariothe9013 ปีที่แล้ว +5

    Dunno why I chose 2060super with 12gb for my second desktop pc at that time but I feel lucky finally able to put it to use lol😅

  • @pRick7734
    @pRick7734 ปีที่แล้ว +45

    The reason why this happens is because the game loads textures into the vram and loads even more textures when you're traversing a loading zone from one area of the map to another, there's a clear spike in vram usage when that happens. Digital foundry covered this in a video, it's also the reason why traversal stuttering exists in the game. They need to fix this asap.
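
    A toy model of that behavior: when the player crosses a loading zone, the new area's textures are brought in while the old area's are still resident, so usage briefly spikes before anything is evicted. The sketch below is pure illustration under that reading of the comment; the names, sizes and LRU policy are made up, not RE Engine internals.

    from collections import OrderedDict

    class TextureCache:
        """Resident textures under a fixed VRAM budget, evicting least-recently-used."""
        def __init__(self, budget_mb):
            self.budget = budget_mb
            self.resident = OrderedDict()          # name -> size in MB, in LRU order

        def used(self):
            return sum(self.resident.values())

        def load_zone(self, zone_textures):
            for name, size in zone_textures.items():
                self.resident[name] = size
                self.resident.move_to_end(name)    # mark as most recently used
            print(f"after load: {self.used()} MB resident (budget {self.budget} MB)")
            while self.used() > self.budget:       # evict until back under budget
                evicted, size = self.resident.popitem(last=False)
                print(f"  evict {evicted} ({size} MB)")

    cache = TextureCache(budget_mb=8000)
    cache.load_zone({"village": 5000, "shared_props": 2000})
    cache.load_zone({"castle": 4000})              # transient overshoot, then eviction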

    • @mimimimeow
      @mimimimeow ปีที่แล้ว +13

      This is how the PS5/XS are expected to work, since they have dedicated SSD I/O stream decompression that goes directly into VRAM. Devs will stream data continuously, thus sidestepping the VRAM limit. PC DirectStorage isn't at that level yet, and what we see is basically the result of MS and Nvidia not caring enough to push radical new things to PC. After all, most PC gamers only care about getting 500 fps in CSGO *eats popcorn*

    • @brottochstraff
      @brottochstraff ปีที่แล้ว +9

      @@mimimimeow Also we are seeing more and more shitty console->pc ports...again. Last of Us, RE4, Plague and many more. There are clear technical flaws like the one you mention above where shader precompilation is needed and texture streaming is just not made for how PC hardware works best. If you look at games that kept PC in mind from start like Dying Light 2, Cyberpunk and others, there are no such issues. Im maxing out Dying Light 2 on RTX 3070 and it looks way better than RE4

    • @kitezzz360
      @kitezzz360 ปีที่แล้ว +1

      actually traversal stutter is loading the upcoming area into system memory, it has nothing to do with vram

    • @danielosetromera2090
      @danielosetromera2090 ปีที่แล้ว +4

      They are clearly doing things inefficiently. RDR 2, a game that looks a metric ton better than Hogwarts Legacy on every level, uses just 5,6 gb of vram at 4K. I repeat, AT 4K.

  • @tristanweide
    @tristanweide ปีที่แล้ว +262

    When the VRAM is more limiting than the performance of the card, you have an issue. I think the 3080 10GB is going to be running into the same issues here soon. If you think this is bad, you should see the RTX 3050TI trying to run anything with 4GB of VRAM. My poor laptop has to play at 1080P medium most games because of textures, not performance.

    • @danielowentech
      @danielowentech  ปีที่แล้ว +109

      Exactly! If a card runs out of VRAM, but it is at settings that wouldn't make sense to run anyway, it isn't that big of a deal. But cards like the 3070 Ti are frustrating because it is powerful enough to max out some games but runs into its VRAM issue like we are seeing here.

    • @GewelReal
      @GewelReal ปีที่แล้ว +38

      The 3080 10GB already is an issue at 4K.
      Also, no way Nvidia made a 3050 with 4GB of VRAM, bruh

    • @mitanashi
      @mitanashi ปีที่แล้ว +18

      @@GewelReal The 3050 Ti is a laptop variant, I think.

    • @LeegallyBliindLOL
      @LeegallyBliindLOL ปีที่แล้ว +14

      @@GewelReal interesting claim. Used 3080 10gb since launch at 4k. Never had any issues

    • @lifemocker85
      @lifemocker85 ปีที่แล้ว +21

      That's why they gimped the GPUs, so that you'd have to buy another one soon.

  • @adi6293
    @adi6293 ปีที่แล้ว +54

    My wife recently upgraded her 3070 to a 7900 XTX due to VRAM issues in some games that she plays, and I will be doing the same. We were going to get a 4080 and a 4090, but fuck Nvidia. I'm aware of AMD's shortcomings, but at least they aren't stingy with the VRAM on their cards. The 3070 should have had 12GB, full stop, and my 3080 only having 10GB is just a disgrace.

    • @GhOsThPk
      @GhOsThPk ปีที่แล้ว +8

      The 7900XT has 20GB, the RTX 4080 has 16GB same amount AMD gave to the RX 6800/XT last generation, NVIDIA does it because they know people will buy, they gimp it to make a little more profit.

    • @NXDf7
      @NXDf7 ปีที่แล้ว +2

      7900xtx is a great card, just make sure you get a good AIB like sapphire nitro. I have had nitro cards for 4 years and have had 0 problems.

    • @adi6293
      @adi6293 ปีที่แล้ว

      @@NXDf7 That is what she's got 😜 I'm also getting Nitro+

    • @igorrafael7429
      @igorrafael7429 ปีที่แล้ว +1

      The 7900 XTX's ray tracing performance is on par with the RTX 3080~3090, which is pretty good, with considerably stronger raster performance though. My 3080 is a beast but often limited by the VRAM, especially in RT games (RT is VRAM-hungry, which is ironic), which makes me want to upgrade after less than 2 years of using it, unfortunately.

    • @saricubra2867
      @saricubra2867 ปีที่แล้ว

      @@igorrafael7429 But that is quite mediocre for a flagship, on par with 3080 and 3090 means that it's the same as a 3080Ti from previous gen, therefore RTX4070Ti.
      A 4090 blows that Radeon out of the water. GTX 1080Ti 2.0

  • @TheGameBench
    @TheGameBench ปีที่แล้ว +5

    Had a feeling the 3070/3070 Ti were going to age poorly due to VRAM limitations. Going to be a 1080p card sooner than it should be.

    • @alrizo1115
      @alrizo1115 ปีที่แล้ว +1

      It will cost

    • @NyangisKhan
      @NyangisKhan ปีที่แล้ว +2

      Sure the vram of not only the 3070/3070ti but the 3080 doesn't make much sense. But this isn't a GPU issue. This is strictly a developer issue. RE4 remake has a poor PC port. It looks like absolute horseshit compared to games like "The vanishing of Ethan Carter" which was released a decade ago. And funny enough that game used to run with a stable 60FPS at max settings on low end GPUs at the time like the GTX650ti. Basically these recent incidents are just Capcom and Sony being absolute clowns.

    • @hircine92h
      @hircine92h ปีที่แล้ว

      Even at 1080p it would struggle on some games and you'll have to turn some settings down lmao

  • @Phoboskomboa
    @Phoboskomboa ปีที่แล้ว +4

    Even my 4080 at 4K required lowering the textures to 4GB to stop crashes. I was still in orange at 6GB, but it did crash a couple times, presumably when it hit a scene that used more VRAM. I've never been particularly sensitive to texture resolutions. I don't usually notice a difference between launch textures and the 4K fan-made ones that come out for a lot of games, so I don't really mind only allocating 4GB, but it's not really a great feeling that the $1200 video card I just bought is already hitting a hardcap on what game settings it can run.

    • @badimagerybyjohnromine
      @badimagerybyjohnromine ปีที่แล้ว +3

      Yeah, this is unacceptable. If you pay a thousand dollars+ for a GPU, you should be able to play whatever game you want on ultra settings.

    • @Phoboskomboa
      @Phoboskomboa ปีที่แล้ว

      @@badimagerybyjohnromine I've noticed load times increasing and more stable behavior in the last few days. I think they may have quietly patched it to keep less in active VRAM at the cost of needing to load more per area. Or, it could be in my head.

    • @abhirupkundu2778
      @abhirupkundu2778 3 หลายเดือนก่อน

      you all here with rtx cards, while I play the game with my gtx 1650 GDDR6, textures at 0.5 GB only, hair strands off, 1080p, and yet face lag in intense areas due to low VRAM

  • @Dark88Dragon
    @Dark88Dragon ปีที่แล้ว +53

    Good that I went with AMDs 12Gigs instead of the 8 Gigs from Nvidias counterpart in 2021, this was one of the crucial aspects for me why I chose Team Red

    • @RS_Redbaron
      @RS_Redbaron ปีที่แล้ว

      A 1080Ti had 11 so what is 12 hahaha

    • @adi6293
      @adi6293 ปีที่แล้ว

      @@RS_Redbaron 1080Ti was a $699 high end card you idiot , 6700XT is a mid range card

    • @RS_Redbaron
      @RS_Redbaron ปีที่แล้ว

      @Khel easy !

    • @silverwatchdog
      @silverwatchdog ปีที่แล้ว +1

      @@RS_Redbaron And the 1080ti was a flagship overkill GPU, but somehow for a good price. It's not surprising it had 11GB in 2017 because it was meant to be overkill.

    • @mirzaaljic
      @mirzaaljic ปีที่แล้ว +1

      Same.

  • @theboostedbubba6432
    @theboostedbubba6432 ปีที่แล้ว +69

    I tested my friends RX 6600 XT against my GTX 1080 Ti in RE4 remake and even at 1080p the Max preset option simply doesn’t work on the 8gb 6600 XT. They delivered similar experiences when lowering the settings a bit but I just find it funny that 6 years later, the 1080 Ti’s Vram capacity is starting to help it out.

    • @user-eu5ol7mx8y
      @user-eu5ol7mx8y ปีที่แล้ว +34

      1080 Ti is legend

    • @fwef7445
      @fwef7445 ปีที่แล้ว +22

      1080ti was the last decent product nvidia made

    • @steveleadbeater8662
      @steveleadbeater8662 ปีที่แล้ว +12

      @@fwef7445 It would be the 3080 if it actually stayed at its $649 MSRP. Better to say it was the last Value for Money product.

    • @puppy.shorts
      @puppy.shorts ปีที่แล้ว +4

      same with my rtx 3060 12gb 😊

    • @TheEvilPorkchop03
      @TheEvilPorkchop03 ปีที่แล้ว +3

      I had my 1080 Ti from a couple of months after launch till 2 months ago, and for that reason it's got to be the best. An absolute unit, that card is. Definitely better money spent than on any computer component I've ever bought.

  • @GoldNugget
    @GoldNugget ปีที่แล้ว +2

    After a lot of thinking and comparison, last month I bought an RTX 2080 Ti rather than an RTX 3070. It's a shame the 3000-series Nvidia mid-tier GPUs have the same VRAM as the last gen... The 11GB of VRAM in the RTX 2080 Ti is a deal-changer.

  • @oktc68
    @oktc68 ปีที่แล้ว +7

    You make an excellent point. I've been struggling along with my 2070, and very nearly bought a 3070 Ti, as in many respects it was better than its predecessor; it was the 8GB of VRAM that ultimately stopped me from buying that GPU. I'm glad I waited for the 4070 Ti; it's a great 1440p card, and the extra 50% of VRAM is just about OK for now and the next couple of years. 🤞🏻 Nvidia need to stop being tight with the VRAM on their cards. 8GB is only adequate for 1080p gaming; for the prices they charge they should have doubled it to 16GB for 1440p cards (I don't know anything about 4K gaming; until you can run games at 144-165 FPS it doesn't really matter IMO). When paying $1,000-$1,600 for a GPU, I think it's fair that you get a card with some headroom. VRAM usage has been climbing in new titles, and I'd argue that Nvidia are strangling the performance of some of their cards by being so tight with the VRAM.

    • @katieadams5860
      @katieadams5860 11 หลายเดือนก่อน

      The 4070 ti would be a good card if it was $600 instead of $800

  • @resonator7728
    @resonator7728 ปีที่แล้ว +28

    Just to think that the 1080 Ti had 11GB of VRAM way back then. Glad I held onto mine; I just put it in a second computer with a 6700K.

    • @tourmaline07
      @tourmaline07 ปีที่แล้ว

      Same here with 2080ti , given the card's age I'm not too fussed if I now need to play high instead of ultra textures :)

    • @user-eu5ol7mx8y
      @user-eu5ol7mx8y ปีที่แล้ว +1

      Looks like high-end cards are best value after all.

    • @saricubra2867
      @saricubra2867 ปีที่แล้ว

      @@user-eu5ol7mx8y They ALWAYS have been (with the exception of the 2080Ti, Turing sucked so much). Graphics card get obsolete very quickly unlike CPUs.

    • @saricubra2867
      @saricubra2867 ปีที่แล้ว

      @@user-eu5ol7mx8y 1080Ti is doing fine, 3080Ti chilling at 4K despite NOT being a 4K card, 4090 being a 1080Ti 2.0 (a mistake). AMD entry level, Intel fighting with Raja while they keep losing money by the failure of Arc launch.

    • @otto5423
      @otto5423 ปีที่แล้ว

      @@saricubra2867 4090 is not 1080 ti 2.0 because 4090 is very expensive card compared to 1080 ti launch price. 2000€ vs 700€ lol

  • @TechnoTurtle9974
    @TechnoTurtle9974 ปีที่แล้ว +41

    This game is the one that let me know my 5700xt is already outdated because of the VRAM. Normally, lowering settings is fine, but there's something about lowering texture quality that hits different 😅

    • @raresmacovei8382
      @raresmacovei8382 ปีที่แล้ว +4

      We always knew that, with just 8GB of VRAM (and lacking Vega's HBCC option).
      Sticking to high-enough textures gives 80-120 fps at 1080p, maxed out, Hair Strands on, FSR2 Quality. It's fine.

    • @Jacob-hl6sn
      @Jacob-hl6sn ปีที่แล้ว +1

      my rx 5700 xt was already obsoleted by performance in new games, you need a rx 6000 upgrade like I did

    • @demetter7936
      @demetter7936 ปีที่แล้ว +1

      Turn shadows from max to high and disable raytracing, it's only on water anyway.

    • @raresmacovei8382
      @raresmacovei8382 ปีที่แล้ว +1

      @@Jacob-hl6sn Which part of 80-120 fps feels like "lacking performance"?

    • @Jacob-hl6sn
      @Jacob-hl6sn ปีที่แล้ว

      @@raresmacovei8382 some new games i only got 60 fps with that gpu but that fps is fine if that's what it is in this game

  • @Azurefanger
    @Azurefanger ปีที่แล้ว +2

    One more reason to love the RX 6700 XT and RX 6800 XT: more VRAM for these exact problems. I don't understand why people waste money on lower-VRAM GPUs and hope they will work like one with 4GB of extra VRAM.

  • @MarioGoatse
    @MarioGoatse ปีที่แล้ว

    That knocking at the end of the video freaked me out. Thought it was coming from my house at 3am!

  • @DeadphishyEP3
    @DeadphishyEP3 ปีที่แล้ว +39

    Great video. We are about to see a 4060 TI with 8gb of Vram running just as fast as a 3070 Ti and it will have the same issues.

    • @Mortalz2
      @Mortalz2 ปีที่แล้ว +3

      need rtx 4060 ti 16gb

    • @fwef7445
      @fwef7445 ปีที่แล้ว +1

      because the mid 3000 cards have an identity crisis, they have the raw power for 1440p but the vram for only 1080p, therefore they are badly designed. if you want to play at 1440p you ideally want 12gbs of vram. Absolutely nobody should be buying a 3070 or 3070ti, these cards suck. The 4060ti is going to suffer the same fate, it's just going to turn into yet another overpriced 1080p card

  • @zahnatom
    @zahnatom ปีที่แล้ว +182

    I'm glad i chose the 7900 XT over the 4070 Ti when there's memory issues like this

    • @Gh0st_91
      @Gh0st_91 ปีที่แล้ว +6

      me too

    • @gloomy935
      @gloomy935 ปีที่แล้ว +17

      The 4070 TI doesn't have 8gb of vram.

    • @andrewsolis2988
      @andrewsolis2988 ปีที่แล้ว +21

      Early adopters of the 7900 xt are now laughing at team green!! 😎

    • @shebeski
      @shebeski ปีที่แล้ว +3

      I've been playing just fine. I've had to turn down the textures slightly but can't notice the difference the vast majority of the time.
      The benefits of this card vs the 7900XT show in a variety of other titles. You win some, you lose some.

    • @Dragonlord826
      @Dragonlord826 ปีที่แล้ว +59

      ​@@gloomy935 12gb is pretty pathetic for the price

  • @invasivesurgery4595
    @invasivesurgery4595 ปีที่แล้ว +11

    I was Intel/Nvidia for over 10 years. I made the switch to AMD for my CPU in 2021 and was so happy. When I saw the 4000 series Nvidia cards, I'm happy to announce that my next GPU will be an AMD one. Currently rocking the 3060ti, but I'll be getting the 6800xt next

    • @nobody1101
      @nobody1101 ปีที่แล้ว +7

      the 6800XT isn't much faster than the 3060ti. Instead you should buy the 7800XT or the 6950XT. With 7800XT you will also have a way better ray tracing performance.

    • @82_930
      @82_930 ปีที่แล้ว +2

      @@nobody1101 bro it’s nearly 45% faster what are you smoking💀

    • @TFXZG
      @TFXZG 11 หลายเดือนก่อน +1

      Turn on ray tracing. 3060ti be faster than 6800xt.

  • @glebati_hlebati
    @glebati_hlebati ปีที่แล้ว +2

    I don't know if a lot of people have the same issue, but I experienced some memory issues in the Dead Space remake. It feels like my 3070's VRAM is not enough for this game even at 1080p, because after I start the game I have (at 1440p) ~90 fps, while after 10 minutes of running back and forth my VRAM consumption increases from 6GB to 7.9GB and my fps drops down to 25 and even lower. After restarting the game I get good fps again, but I cannot actually play the game :/ It feels like I need to conduct surgery and increase the VRAM to be able to play.
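
    To check whether VRAM really creeps up over a session like that, a small logger can record a timestamped sample every few seconds and write it to a CSV for graphing. A minimal sketch, assuming an NVIDIA card and the pynvml bindings (pip install nvidia-ml-py):

    import csv, time
    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)        # first GPU

    with open("vram_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "vram_used_gb"])
        start = time.time()
        for _ in range(600):                          # ~30 minutes at one sample per 3 s
            used = pynvml.nvmlDeviceGetMemoryInfo(gpu).used / 1024**3
            writer.writerow([round(time.time() - start), round(used, 2)])
            f.flush()
            time.sleep(3)

    pynvml.nvmlShutdown()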

  • @radite2177
    @radite2177 ปีที่แล้ว +38

    This is the exact same reason why I switched to AMD (RX 6700 XT) after almost 9 years of using Nvidia GPUs (my last Nvidia GPU was the RTX 2060). I almost continued "the tradition" and planned to upgrade to an RTX 3060 Ti, but I'm very glad now that I didn't do it and made the switch to an AMD GPU.

    • @lordmorbivod
      @lordmorbivod ปีที่แล้ว

      I took this one too; the game runs flawlessly at 60fps even with ray tracing on, but it keeps crashing with a D3D error related to VRAM, even after lowering settings.

    • @steadygaming7737
      @steadygaming7737 ปีที่แล้ว

      ​@GnarboK the d3d fatal error is due to a problem with the emplimentation of Ray tracing. I also have an rx 6700xt in my system, I can run the game at 1440p max settings with more than 60fps. When running the game with Ray tracing on, it crashes within 5 minutes, even though I have more than sufficient vram.

    • @radite2177
      @radite2177 ปีที่แล้ว +2

      @@lordmorbivod Ah, too bad.. I myself have never liked the ray tracing feature, since first trying it on the RTX 2060 and now with the 6700 XT.. I just think it's not worth the performance impact and doesn't add anything substantial to my enjoyment of the game itself. But that's me; to each their own preferences. For me, normal reflection settings/features like SSR are more than enough.

    • @serphvarna4154
      @serphvarna4154 ปีที่แล้ว +1

      6700xt to 3060ti is a downgrade

    • @radite2177
      @radite2177 ปีที่แล้ว +1

      @@serphvarna4154 yeah glad I choose 6700XT instead of 3060 Ti

  • @sjuric2435
    @sjuric2435 ปีที่แล้ว +16

    This is the reason why I chose a used RTX 2080 Ti instead of a used RTX 3070; it was even €50 cheaper, no matter what others said about the 3070 being newer, more modern, with lower power consumption and better ray tracing. It turns out the RTX 2080 Ti with its 11GB will outlast both the 3070 and the 3070 Ti. It's a pity: strong graphics chips, but limited VRAM. Consoles are the baseline; they have 16GB of RAM, of which approximately 12-13GB is reserved for games. I think 12GB is the norm for upcoming games, especially if they're played with ray tracing.

    • @ZratP
      @ZratP ปีที่แล้ว +3

      Yup, you pointed out something there. The current gen of consoles (PS5 and Xbox Series X) is a much more capable baseline than the PS4 and Xbox One were relative to PCs three years after their release.
      I think on the Xbox Series X you have 10 GB of RAM that is faster than the other 6 GB, and those 10 GB are clearly used mainly for the GPU and textures. Meaning that 8 GB may start to become a bottleneck for some GPUs in the upcoming years for high texture quality and 1440p+.
      Actually, a lot of PC gamers will start to feel the effects of that new baseline in the upcoming years as the cross-gen period is almost over and more and more games move exclusively to PS5 and Xbox Series (even if the existence of the Series S may control the damage). Their fast SSDs, 8 fairly recent cores, plenty of memory and decent GPUs may push up minimum/recommended requirements.
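
      For a rough sense of the budget being described, here is a back-of-the-envelope sketch in Python. The 10 GB / 6 GB split is the one mentioned above; the ~2.5 GB OS reserve and the bandwidth figures are commonly quoted numbers and should be read as assumptions, not official specs.

        # Rough Series X memory budget, using the split described above.
        total_gb       = 16.0   # unified GDDR6
        gpu_optimal_gb = 10.0   # faster pool (~560 GB/s), mostly GPU/texture data
        slower_gb      = 6.0    # slower pool (~336 GB/s), CPU/game/OS data
        os_reserved_gb = 2.5    # assumed OS reserve, not stated in the comment above

        game_total_gb = total_gb - os_reserved_gb
        print(f"Memory a game can touch:       ~{game_total_gb:.1f} GB")
        print(f"GPU-optimal pool for textures: ~{gpu_optimal_gb:.1f} GB")
        print(f"Headroom over an 8 GB card:    ~{gpu_optimal_gb - 8.0:.1f} GB")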

    • @PeterPauls
      @PeterPauls ปีที่แล้ว

      Ah, no, on consoles they are optimizing, and to be fair the textures look very bad at 0.5GB, much worse than in other, older games. I'm sure the new consoles don't use more than 6-8GB of memory for textures.

    • @sjuric2435
      @sjuric2435 ปีที่แล้ว +3

      @@ZratP That's right. I also remember the PS4 era from 2013, when it launched with 8gb of RAM and people shouted that 2gb of VRAM was enough during the cross-gen period. I remember reading articles where programmers said to buy a graphics card with as much VRAM as possible; the consoles back then used 3gb for the OS and 5gb for games. When cross-gen ended in 2014, I bought a used R9 280x with 3gb. A few months later a number of games released: Far Cry 4 required 3gb for High textures, Rise of the Tomb Raider 3gb for High, Middle-earth: Shadow of Mordor 3gb for High, and I had no problem. And back then Nvidia only sold the GTX 780/780ti with 3gb, while AMD had, as it does now, the HD 7950/7970 with 3gb, the r9 280/280x with 3gb and the r9 290/290x with 4gb. Nothing has changed at Nvidia regarding VRAM since then; they don't give it away easily.

    • @sjuric2435
      @sjuric2435 ปีที่แล้ว +7

      @@PeterPauls We are not in 2017 anymore. In 2016 I had a GTX 1070 with 8GB, and for $230 you could buy a new RX 480 with 8GB of VRAM. It's time to move on from 8gb. People today play at higher resolutions, textures are getting bigger and more detailed, ray tracing is there, etc.

    • @mimimimeow
      @mimimimeow ปีที่แล้ว +1

      @@PeterPauls the reason console textures look bad is the CPU and GPU competing for the limited RAM bandwidth, not the capacity. It's also why consoles often use inferior texture filtering. They can still load a shitton of VRAM assets for a bigass level.
      There is no excuse for Nvidia skimping on VRAM. Looking at console RAM trends alone, 4MB-64MB-512MB-8GB, the current gen would have 64GB. Sure, they only have 16GB, but these consoles are also betting on something we in the PC ecosystem lack:
      On the PS5, the dedicated SSD I/O compression can swap data between VRAM and SSD at up to 20-ish GB/s effective, which is PS3-era VRAM bandwidth. Now what's the point of this? A creative dev could theoretically keep buffering data constantly so it would seem like unlimited VRAM. We have no equivalent of this on Windows; DirectStorage has some ways to go.
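
      As a back-of-the-envelope illustration of why that streaming rate matters (the 20 GB/s figure is the one quoted above; the pool size and the plain NVMe rate are assumptions):

        # How quickly a resident texture pool could be refilled from storage.
        pool_gb = 8.0  # assumed size of the texture set kept resident in VRAM
        effective_rates_gb_s = {
            "plain PCIe 4.0 NVMe read": 7.0,    # assumed typical sequential rate
            "PS5 I/O with compression": 20.0,   # effective figure quoted above
        }
        for label, rate in effective_rates_gb_s.items():
            print(f"{label}: refill {pool_gb:.0f} GB in ~{pool_gb / rate:.2f} s")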

  • @ericwheelhouse4371
    @ericwheelhouse4371 ปีที่แล้ว +2

    The issue isn’t just the capacity. It’s also how assets are loaded and managed. You don’t NEED all that vram if they optimized it to scale to what’s available. Adding more doesn’t fix the problem it just avoids it.

    • @WalkerIGP
      @WalkerIGP ปีที่แล้ว

      True, but I will take what solutions I can get. If the devs won't solve the problem then we gamers are forced to do it ourselves.

    • @cptnsx
      @cptnsx ปีที่แล้ว

      new games require MORE VRAM

    • @WalkerIGP
      @WalkerIGP ปีที่แล้ว

      @@cptnsx Yup, and that is because devs are not optimizing their games. A well-designed game uses fewer resources than are available. The fact that all these games require so much VRAM points to devs getting lazier. They know the VRAM is there, so why bother optimizing? I just bought the 6950XT, no performance or VRAM issues for me!

  • @GraaD-87
    @GraaD-87 ปีที่แล้ว +1

    You should have mentioned that the problem does not exist with RT off. You can go all the way up to ~14 GB VRAM usage out of 8 GB available, and it will run just fine. No errors, no stutters, perfectly well. Turn ray tracing on and it instantly crashes.

  • @hirakchatterjee5240
    @hirakchatterjee5240 ปีที่แล้ว +22

    I have an RTX 3060, and ultra textures at 1080p can take over 10 gigs of VRAM if you enable RT. It can handle it with 60 fps locked. So the higher-end cards definitely need more VRAM.

    • @Anto-xh5vn
      @Anto-xh5vn ปีที่แล้ว +8

      Me with my 3060 Ti 8GB seeing it getting fucked because of vram 🗿🗿🗿🗿🗿🗿🗿🗿🗿🗿🗿🗿

    • @Simpleman1995
      @Simpleman1995 ปีที่แล้ว +1

      nvidia : say it louder i dont hear🗿🗿

    • @silverwatchdog
      @silverwatchdog ปีที่แล้ว

      You really don't need ultra textures in new games for 1080p. Sure, in older games like RDR2 anything other than ultra textures looked horrible, but now even low textures will work at 1080p. Ultra textures are mainly for 8K and high for 4K. Depends on the game though. However, if you have the spare VRAM, there is no reason not to put textures as high as they can go.

    • @nabilellaji6509
      @nabilellaji6509 ปีที่แล้ว

      @@silverwatchdog The RTX 3060 is supposed to work best at 1080p anyway, so you just cannot go wrong with this card if you don't mind playing at 1080p or even 1440p at most.

    • @stygiann
      @stygiann ปีที่แล้ว

      I have a 3060 and yeah, the game performs even better (75-80fps) at 1080p, but I get the same VRAM crash, so it's either an Nvidia or a Capcom issue

  • @GuyManley
    @GuyManley ปีที่แล้ว +16

    Yes sir. I have a 3080 10gb, and I HAD to turn off ray tracing to stop the crashes. Played the opening level 3 times on hardcore and crashed at the village fight 3 times.

    • @Grim_Jesus
      @Grim_Jesus ปีที่แล้ว

      For me, putting textures on 1gb, and shadows on 'high' with everything else maxed out stopped the crashing.

    • @unclehowdy409
      @unclehowdy409 ปีที่แล้ว

      Nvidia pushing a feature you can't even use properly on the flagship card. Really glad I went amd after my 2070 super

  • @granolafunk6192
    @granolafunk6192 ปีที่แล้ว +1

    It's even worse with the 3080 ti. Them trying to save a few $ and only giving it 12GB has basically created an early EOL date.

  • @gustavo8647
    @gustavo8647 ปีที่แล้ว +20

    AMD knew what was going to happen, that's why they equipped their top of the line equipment with 16 gb of memory 😌

    • @jpa3974
      @jpa3974 ปีที่แล้ว

      Probably because they work with the new consoles, so they know what specs will be needed in the near future. But the question is... how was Nvidia "unable" to see something so obvious? I know that a lot of TH-cam "tech" guys were until recently saying that VRAM is not important, but they are either shills or idiots. Nvidia is not a company run by idiots, right? So why did they decide that 8GB of VRAM, the same as a GTX 1070 Ti, would be enough on the RTX 3070?

    • @Biggrittz
      @Biggrittz ปีที่แล้ว +2

      @@jpa3974 It was Nvidia's plan all along to cut the VRAM short on some of these cards. That way you will be forced into upgrading. Which is unfortunate if you own a 3070 or 3070 Ti 8GB and were looking to play at 1440p or, in some cases at the time, 4K. I have an R7 5800X/RX 6900XT 16GB tower, and to run games in 4K I sometimes need over 12GB of VRAM; Far Cry 6 is a good example. So at 4K even the 3080 and 3080 Ti are unfortunately also cut short.
      Meanwhile, I have an Aorus i7 12700H/RTX 3070 mobile 8GB as my main laptop. I play at 1080p on there, where 8GB is fine for now. The FPS on the 3070 and 3070 Ti is still exceptional when the card is given proper VRAM headroom.
      P.S. - Assuming the 3060 is powerful enough, I wonder if the 3060 12GB will be able to do things the 3070 and 3070 Ti couldn't. If so, that would be a crazy sight to see. It may be too weak to run 1440p max settings at a fair enough frame rate; I have not checked the benchmarks so I'm not so sure.

  • @ConfusionDistortion
    @ConfusionDistortion ปีที่แล้ว +12

    Resident Evil 4 tells you that you don't have enough VRAM even when you do, and crashes at around the 16gb mark after saying you are in the orange warning range (the 7900 XT has 20gb of VRAM). It's a known bug that has carried over from RE 2 and RE 3; they both do the same thing. It seems to have been introduced when they added ray tracing to the engine they use.

    • @Spr1ggan87
      @Spr1ggan87 ปีที่แล้ว +5

      That update came out ages ago and they still haven't fixed it in those games; they just re-released the DX11 version of the game via a beta branch and called it a day.

    • @MistyKathrine
      @MistyKathrine ปีที่แล้ว +3

      @@Spr1ggan87 I hate half-assed Steam ports of games. It's a huge problem in a lot of games. Like, is it really that hard to release games that don't have serious bugs?

    • @saricubra2867
      @saricubra2867 ปีที่แล้ว +1

      I have the Gamecube original and I would rather play that instead of the foggy and bland Unreal Engine art look of the remake, and we are talking about the Gamecube; the optimization is perfect.

    • @Spr1ggan87
      @Spr1ggan87 ปีที่แล้ว +2

      @@MistyKathrine Try the PC version of RE4 HD with the HD Project mod installed; iirc it even has a set of lighting options to choose from, with one mimicking the style of the Gamecube version. Imho it's better than the GC version, as in that version everything more than an arm's length away from Leon looks like a pixelated mess.
      th-cam.com/video/WAf9R8_cLPo/w-d-xo.html

    • @MistyKathrine
      @MistyKathrine ปีที่แล้ว

      @@Spr1ggan87 Yeah, there are a lot of games on Steam that pretty much require mods to make them playable. This reminds me of another Gamecube title that's on Steam, Tales of Symphonia. What a disaster that port was; thankfully there were eventually mods that fixed most of the issues, but that game was quite literally broken and unplayable at launch. I just don't understand how companies think it's okay to release games in that state, as doing so just leads to bad reviews and poor sales.

  • @mikapeltokorpi7671
    @mikapeltokorpi7671 ปีที่แล้ว +14

    This is why I have always considered the amount of VRAM the most important feature of a GPU (I bought a very obscure FireGL workstation dGPU 20 years ago and the system was still a top performer in benchmarking services 5 years later).

  • @airplaneB3N
    @airplaneB3N ปีที่แล้ว +3

    The fact that all these big content creators were saying to "just get a 3070" and "you don't need to buy a better card" makes this even more sad.

    • @MistyKathrine
      @MistyKathrine ปีที่แล้ว

      3070 is fine for the vast majority of people.

    • @airplaneB3N
      @airplaneB3N ปีที่แล้ว

      @@MistyKathrine Maybe for 1080p. I own one, and like in the video, I found that 8gb of VRAM is not enough for modern games at 1440p.

    • @MistyKathrine
      @MistyKathrine ปีที่แล้ว

      @@airplaneB3N I never had any problems with mine in games, and I was running at 2160p /shrug.

    • @saricubra2867
      @saricubra2867 ปีที่แล้ว +1

      I was defending the 3080Ti, ignore these people saying a lot of bullsh*t.

  • @UnN3wBie
    @UnN3wBie ปีที่แล้ว +4

    I had the same problem with the RE2/RE3 Remake with the new ray tracing update; turning it off fixes the issue. There are people reporting crashes with a 4070 Ti or even a 4090, so it's not about 8GB of VRAM.

    • @ddd45125
      @ddd45125 ปีที่แล้ว

      4090, zero crashes in 44 hours of gameplay, RT on or off, 130% resolution scale. RT is currently off as it is kind of broken.

    • @UnN3wBie
      @UnN3wBie ปีที่แล้ว

      @@ddd45125 good for you, in any case rt is broken on this game (go check digital foundry's vid for details)

    • @valentds
      @valentds ปีที่แล้ว

      @@ddd45125 it's a 4090, bro; even with RT broken, 24 gb of vram compensates for that

  • @tonac13
    @tonac13 ปีที่แล้ว +9

    Excited for 4060 with 4GB of ram ❤gogo nVidia !

    • @renoputra5219
      @renoputra5219 ปีที่แล้ว +4

      I'm looking forward to 4050 Ti 1GB

    • @deaneyle3940
      @deaneyle3940 ปีที่แล้ว +2

      ​@@renoputra5219 and a standard 4050 with 500mb vram

    • @IISocratesII
      @IISocratesII ปีที่แล้ว +1

      This is why people used to call NVidia "NoVideo" lmao

  • @Metical1312
    @Metical1312 ปีที่แล้ว +17

    I predicted this would happen. It made no sense that Nvidia released a GPU with 8gb in 2020 when they had a GPU in 2016 with 8gb of VRAM. This was their plan to force an upgrade before you really needed one. Whenever you're buying a video card you should always buy a higher tier than what the current-gen console has, and in this case both the Xbox and PS5 have more VRAM than the 3070/70 Ti. The best part about it is that the incoming 4060 is rumored to have 8gb, and it will probably land somewhere between 3070 Ti and 3080 performance but be kneecapped out of the gate by the low VRAM.

    • @greggreg2458
      @greggreg2458 ปีที่แล้ว +1

      That level of performance with 6GB is so wrong...

    • @Metical1312
      @Metical1312 ปีที่แล้ว +2

      @@greggreg2458 I have been on Nvidia ever since going back to school in 2013, not by choice but because of the need for CUDA. Throughout this whole time, for almost every card I've purchased from Nvidia, the AMD competitor has aged better: my GTX 780 6gb lost driver support as soon as the 900 series hit, meanwhile the R9 290 only stopped getting updates last year and is still viable for 1080p low gaming. GTX 1060 vs RX 480/580 is no longer a debate today because the 1060 can't really hang with its 6gb in 2023. I'm still a slave to CUDA, and because of that I now own a 3080 10gb instead of a 6800xt 16gb. I'm sure AMD fine wine will win here too and the 6800xt will be viable longer than my 3080, even though they are roughly the same speed currently. Hopefully AMD gets their act together on the 7800/7700 lines with decent performance, VRAM and pricing to really apply pressure to Nvidia.🤞🏾

    • @Nightceasar
      @Nightceasar ปีที่แล้ว +3

      Not to mention the 4060 will probably have 3070 performance but cost the same as the 3070 did as well. So basically you are just getting the same 3070 again for the same price.

    • @niebuhr6197
      @niebuhr6197 ปีที่แล้ว

      4060 might not even match 3060ti. Unless they release something different than the laptop version with more specs but that looks unlikely.

  • @portman8909
    @portman8909 10 หลายเดือนก่อน +1

    How many of you can tell apart medium or high textures from ultra though? I'm willing to wager almost none unless pixel peeping.

  • @johnkaczynski1905
    @johnkaczynski1905 ปีที่แล้ว +2

    This is truly saddening. I too am a 3070 Ti owner and I will probably switch to Team Red for my next GPU. I noticed the VRAM problem in Cyberpunk 2077 years ago and simply thought it was the game's poor optimization causing a memory leak; oh how wrong I was... Such a shame though, the 3070 Ti is actually a powerful GPU but can't do anything because of the VRAM limit.

  • @od1sseas663
    @od1sseas663 ปีที่แล้ว +58

    That's exactly what Nvidia wanted. Go ahead, upgrade to the $1000 12GB 4070 Ti that is also gonna run out of VRAM in 2 years!

    • @ALM_Relaxed
      @ALM_Relaxed ปีที่แล้ว +6

      That's why you need to spend those 1600 bucks on the 4090 to stay relaxed for the upcoming games

    • @albertoalves1063
      @albertoalves1063 ปีที่แล้ว +1

      @@ALM_Relaxed But even the 4090 will not hold on for much longer either; if the 4070 Ti is good for 2 years, the 4090 is about 4

    • @ALM_Relaxed
      @ALM_Relaxed ปีที่แล้ว +1

      @@albertoalves1063 well, 4 years is more than enough… then you can change the GPU/entire PC

    • @albertoalves1063
      @albertoalves1063 ปีที่แล้ว +1

      @@ALM_Relaxed Bro, the 4090 is like 8 times more powerful than the PS5, and in 4 years the PS5 will still be running games that the 4090 will not be able to, because of VRAM usage

    • @od1sseas663
      @od1sseas663 ปีที่แล้ว +3

      @@ALM_Relaxed 1600$? Imma gonna buy 3!

  • @MrKevlad100
    @MrKevlad100 ปีที่แล้ว +28

    Glad I went with the 6700xt over the 3070, had a gut feeling vram would be an issue

    • @barrontrump3943
      @barrontrump3943 ปีที่แล้ว +2

      @@blue-lu3iz gaming in general nowadays is a total rip-off and scam. Just bought a new rig that's 3 generations into RTX? Gotta have that turned off, fam, or you ain't playing at 120 fps natively. It's a joke, especially with ReMaKeS that are completely unnecessary, and people just gobble it up because they have nothing else meaningful to live for

    • @lifemocker85
      @lifemocker85 ปีที่แล้ว +1

      No need for gut feelings when you use brains

    • @cripmeezy
      @cripmeezy ปีที่แล้ว

      Same here, my guy, no issues with Resident Evil 4 on high settings with ray tracing set to normal

    • @thelazyworkersandwich4169
      @thelazyworkersandwich4169 ปีที่แล้ว +1

      @@barrontrump3943 Facts. I'm playing games on integrated graphics, have a PS4 for more demanding games and doing just fine. There are tons of legacy titles that provide way better content and value than what's coming out.

    • @AAjax
      @AAjax ปีที่แล้ว

      @@blue-lu3iz gotta fund the leather jackets somehow

  • @zinarmagadan3751
    @zinarmagadan3751 ปีที่แล้ว +2

    I still have a 1080 Ti and I tried out RE4 Remake with settings well over the VRAM limit and it never crashed. I wonder why that's the case for the 3070?

    • @thepizzaboy7904
      @thepizzaboy7904 4 หลายเดือนก่อน

      Ray tracing could be the factor causing the crash

  • @rebelblade7159
    @rebelblade7159 ปีที่แล้ว +13

    That's why I'm currently staying at 1080p with my 3070. The VRAM limitation is a shame because if it had 12GB or more, it (and the 3060ti) could have had great longevity for 1440p gaming and even 4k to some extent. My old GTX 960 4GB that I got in 2015 lasted up to mid 2021 thanks to its VRAM. 4GB was considered overkill then, but it let me be future proof. For those of us who take our time upgrading, stuff like VRAM can matter a lot. Is it possible that Nvidia made this bad decision because they thought everyone would use DLSS?

    • @andrerocha3998
      @andrerocha3998 ปีที่แล้ว +5

      no, nvidia made this decision so you need to upgrade your gpu sooner, simple as that

    • @brettlawrence9015
      @brettlawrence9015 ปีที่แล้ว +1

      The fact that the 1070, 2070 and 3070 all have the same amount of vram is a joke.

  • @theyoungjawn
    @theyoungjawn ปีที่แล้ว +19

    Nvidia should’ve put at least 12GB VRAM on this card and 16GB on the 3080. Can’t believe how much they skimped last gen. Planned obsolescence.

  • @A_Potato610
    @A_Potato610 ปีที่แล้ว +21

    The diablo 4 beta also uses crazy amounts of vram. It was using 21gb on my 3090.

    • @theyoungjawn
      @theyoungjawn ปีที่แล้ว +14

      Pretty sure it’s been found the game has a memory leak. Hopefully they fix it during launch.

    • @MrSnake9419
      @MrSnake9419 ปีที่แล้ว +1

      changing the textures to medium should fix this

    • @tahustvedt
      @tahustvedt ปีที่แล้ว +6

      Some games will use whatever's available (up to a point of course) even though the performance benefit rolls off at a much lower point.

    • @MrEditsCinema
      @MrEditsCinema ปีที่แล้ว +1

      @Grumpy RC Modeler Call of Duty Cold War, for example.. it will allocate 80% of all your VRAM for the game even though it doesn't need it.
      It doesn't make the game more stable or improve frame times etc.
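
      If you want to watch this yourself on an Nvidia card, here is a minimal sketch using the nvidia-ml-py (pynvml) bindings; note that "used" here includes memory a game has merely allocated, which is exactly the allocation-versus-need distinction being made above.

        import time
        import pynvml  # pip install nvidia-ml-py

        pynvml.nvmlInit()
        gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
        try:
            while True:
                mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
                used_gb = mem.used / 1024**3
                total_gb = mem.total / 1024**3
                print(f"VRAM in use: {used_gb:.2f} / {total_gb:.2f} GB")
                time.sleep(1.0)  # sample once per second while the game runs
        finally:
            pynvml.nvmlShutdown()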

  • @KoenvdW88
    @KoenvdW88 ปีที่แล้ว +9

    This makes me really happy I got a 3080 12GB back in August - I thought 10GB would be plenty for some time, but it looks as though I made the right call to get the 12GB card.

    • @KryssN1
      @KryssN1 ปีที่แล้ว +1

      People having an opinion does not make them right; people are often just idiots or uneducated.
      I kept telling my friends that a 3070 with 8GB would only last two years, same as the 3080 with 10gb, but they still went and bought them.
      Now they say they will be selling their cards to upgrade again after only two years; it's a repeat of the 700 and 900 series.

    • @ToWatchWhileEating
      @ToWatchWhileEating ปีที่แล้ว

      @@KryssN1 I have a 3080 10gb and it's more than enough, what are you talking about? It's like an extra $200 for 2gb.

    • @killersberg1
      @killersberg1 ปีที่แล้ว +2

      @@ToWatchWhileEating It is only enough NOW. It was pretty clear that 8gb was already outdated when the RTX 30 series came out. 10gb is better, but in a year it will not be enough.

  • @icantthinkofagoodusername4464
    @icantthinkofagoodusername4464 ปีที่แล้ว

    You've probably heard this by now, but apparently the crashing issue that occurs when going over the VRAM limit only happens with ray-tracing enabled. My GTX1080 can max everything out including texture quality, and while it doesn't run at what I'd call a stable 60fps at 1080p, it certainly doesn't crash like that despite memory usage being well over the limit. Not being able to even turn on real-time raytracing is apparently a benefit here. From what I understand the raytracing implementation here isn't anything actually transformative anyway, so it's not that big of a loss to keep it turned off.

  • @lflyr6287
    @lflyr6287 ปีที่แล้ว +5

    Daniel Owen: the 8 GB VRAM buffer, standard since 2015 (at least on AMD's R9 300 series GPUs), was already an issue in Resident Evil 2 Remake (2019) and Resident Evil 3 Remake (2020) at 1080p with Ultra details. Resident Evil 2 Remake used/allocated slightly above 13 GB of VRAM at those settings.

  • @grimsaladbarvods8586
    @grimsaladbarvods8586 ปีที่แล้ว +15

    My 6700xt will be the best purchase I'll make for the next 4-5 years.
    Near 3070 levels of performance(minus RT) with 4GB more vram.

  • @chrisclayton6788
    @chrisclayton6788 ปีที่แล้ว +14

    I've got a 6600xt playing at 1080p. There are only a few games maxed out that produce stuttering for me (Hogwarts Legacy and Forza Horizon 5). The fact that Nvidia is producing such powerful cards with VRAM capacities that were nice on a midrange GPU 5 years ago is beyond me

    • @dammaguy1286
      @dammaguy1286 ปีที่แล้ว

      The 6600 XT is a powerful GPU; it should not stutter in FH5

  • @TinMan445
    @TinMan445 ปีที่แล้ว +1

    I’m here learning why my 3070 ti was on sale for SO cheap

  • @AdrianMuslim
    @AdrianMuslim ปีที่แล้ว +13

    This is why I game at 1080p with a high-end GPU and never worry about FPS or lowering settings.

    • @AndyHDGaming
      @AndyHDGaming ปีที่แล้ว +1

      When you have a 4K screen, you will find that the picture quality of a 1080p screen is really poor, and the pixelation is very noticeable

    • @trenchcoats4life891
      @trenchcoats4life891 ปีที่แล้ว

      @@AndyHDGaming agreed. I jumped from 1080p to 1440p and I cannot stand 1080p now.

    • @AdrianMuslim
      @AdrianMuslim ปีที่แล้ว +1

      @@AndyHDGaming When I have a 4K screen, I will also find that framerates are really poor. I barely get 60FPS in some games at 1080p, let alone 4K. lol.

  • @emma6648
    @emma6648 ปีที่แล้ว +3

    Your video's title is invisible to Nvidia.
    The RTX 5070 will have a whopping 3gb of vram, can't wait

  • @steve9094
    @steve9094 7 หลายเดือนก่อน

    I wasn't aware of the vram issue before I upgraded from a 1070 to the 6650 XT during Cyber Monday, and now I'm finding that games like Hitman 3 run fine with ray tracing on my card but start becoming unplayable once the VRAM gradually fills up. So close, yet so far...

  • @Killa-rd6hj
    @Killa-rd6hj ปีที่แล้ว

    The windy, rainy section in the village is the best place to test your fps in-game without benchmarking

  • @cebuanostud
    @cebuanostud ปีที่แล้ว +28

    Intel's Arc A770 is starting to look really good

    • @UmbraWeiss
      @UmbraWeiss ปีที่แล้ว +2

      I have a 3060ti; the next upgrade I make will probably be AMD or Intel. In 2-3 years, when this card is really outdated at 1080p, Intel will probably be a really good upgrade option.

    • @gubbica99
      @gubbica99 ปีที่แล้ว +1

      @@UmbraWeiss I had an AMD card for 2 years before it broke (what a joke, only 2 years and the card is dead). I bought an Nvidia card and after only a year it's not working as it should; I would say the performance of the card is at most 50 percent after games like The Witcher, Elden Ring and such. Thinking about an Intel card in like 2 years.

    • @xxvelocity_0439
      @xxvelocity_0439 ปีที่แล้ว

      @@gubbica99 GPUs breaking down after a year of use (even intensive use) is not common. If your AMD card died after two years and your Nvidia card is on the same road, then I'm not so sure that the Intel card will be any better.

  • @sm4sh3d
    @sm4sh3d ปีที่แล้ว +5

    It's funny, I have a 3080 mobile, so a GPU barely on par with a standard desktop 3070 and clearly behind a 3070 Ti. But I got 16GB of VRAM, and I can max everything here, hitting close to 14GB of VRAM, and it's butter smooth. VRAM was really underestimated last gen, especially by Nvidia

  • @MarkSinister
    @MarkSinister ปีที่แล้ว +11

    That's why I picked up a 3060 12GB. My programs kept running out of VRAM for rendering and editing when I was using the 1070ti, and I wasn't going to upgrade to another 8GB GPU no matter how fast it was. The only other options were the 3080 and the 3090.

    • @KRZStudios
      @KRZStudios ปีที่แล้ว

      I get your point, but still, the 3060ti or 3070 with 8GB are more powerful.

    • @MarkSinister
      @MarkSinister ปีที่แล้ว

      @@KRZStudios Yeah. But when you're using applications and you run out of RAM... I'd much rather have the RAM to work with than the speed.
      What's the point of going fast if you can't carry the load?
      In some of the 3D applications I use, if you run out of RAM it means you have to start deleting objects in your scene or start rendering stuff in parts, which in turn doubles the amount of time it takes to do something.
      For me it's RAM first.

  • @pwestonclark4729
    @pwestonclark4729 ปีที่แล้ว +3

    I literally just sold my 3070ti and upgraded to a 3080 12gb for this exact reason. I want my 30 series card to last as long as possible and don't want this to be an issue for future releases

    • @scylk
      @scylk ปีที่แล้ว

      Any reason you got the 3080ti over the 4070ti?

    • @pwestonclark4729
      @pwestonclark4729 ปีที่แล้ว

      @@scylk I bought used so it was significantly cheaper.

  • @Orain88
    @Orain88 ปีที่แล้ว +12

    The textures in this game don't work the way they do in a lot of games. It basically has 3 texture sets: low, medium, and high. Low is the setting at zero, and then the medium and high categories are subdivided into texture pool sizes that the game dynamically draws from. So when set to high 0.5 gb, you might see some lower textures, but if you were to focus on one area the higher textures will load in.
    I generally find that if you can tune your other settings to allow for a 2gb high texture pool, the game will typically keep up with showing you the game's highest-quality textures at distances that actually make a difference to the experience at 1440p.
    Also, the RT in this game is totally not worth it for the performance hit. If it were RTGI, or RTAO, or even RT shadows, I would be more interested. But simply RT reflections? Not worth it imo.
    Edit: you can run everything 100% maxed out, ignoring the VRAM limit, so long as RT is off. The D3D crash only happens with RT on and VRAM overloaded.
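
    A toy sketch of how a fixed texture pool like that behaves (purely illustrative; the pool size, texture names and sizes below are made up, and this is not Capcom's actual streaming code):

      from collections import OrderedDict

      class TexturePool:
          """Fixed-size texture cache with least-recently-used eviction."""
          def __init__(self, budget_mb):
              self.budget_mb = budget_mb
              self.resident = OrderedDict()  # texture name -> size in MB

          def request(self, name, size_mb):
              if name in self.resident:
                  self.resident.move_to_end(name)  # already loaded, mark as recently used
                  return
              # Evict least-recently-used textures until the new one fits the budget.
              while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
                  evicted, _ = self.resident.popitem(last=False)
                  print(f"evict {evicted} (re-streaming it later can cause a visible swap)")
              self.resident[name] = size_mb

      pool = TexturePool(budget_mb=2048)  # a "High (2 GB)"-style setting
      for tex in ["village_walls", "leon_face", "foliage", "village_walls", "castle_bricks"]:
          pool.request(tex, size_mb=700)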

    • @Keivz
      @Keivz ปีที่แล้ว +6

      Yup. Odd this wasn't mentioned, but it should be well known. Same with many other Resident Evil games. And I bet if he set the VRAM setting to 1gb, it wouldn't have crashed and it would have looked fine.
      Meanwhile, his example #2 just showed how much it is the fault of developers and how such problems can be fixed.
      As for Forspoken, the demo has not been updated. Their latest patch notes say under optimization:
      Reduced the VRAM usage. (PC version).
      This video is much ado about nothing.

    • @Orain88
      @Orain88 ปีที่แล้ว

      @@Keivz yes indeed. I'd also add that this crash only seems to happen when RT is on in combination with too much VRAM usage. I haven't been able to cause this crash with RT off and all other settings maxed out.

    • @Spr1ggan87
      @Spr1ggan87 ปีที่แล้ว +2

      @@Keivz Nah, it just takes longer to crash at 1Gb; the same thing is happening to me with RE2 Remake. Maxing textures leads to a crash within minutes of playing, while setting it to 1Gb lets me play for around 4 hours before I get the D3D Fatal Error crash.

    • @MistyKathrine
      @MistyKathrine ปีที่แล้ว

      Yeah, certainly not worth running RT at high. You can run it at low or medium to reduce the performance hit and give more room for textures.

    • @ChaosAlter
      @ChaosAlter ปีที่แล้ว +1

      If only he had done some googling for a couple of minutes

  • @valeera5415
    @valeera5415 ปีที่แล้ว +8

    Makes me really glad I went with the 3090 with its 24 gb of vram

    • @MrEditsCinema
      @MrEditsCinema ปีที่แล้ว +2

      I've got a 4080; I don't think 16gb of vram will be a problem for quite a while, even at 4K

    • @joewhite8079
      @joewhite8079 ปีที่แล้ว

      @@MrEditsCinema I nearly got a 4080, but hopefully 12gb will hold out for 2-3 years at 4k with my 4070 ti.

    • @Anon1370
      @Anon1370 ปีที่แล้ว +1

      I remember when I was trying to get a 3090 and someone said to just buy the 3070. Yeah, now look; glad I didn't take their advice.......

    • @saricubra2867
      @saricubra2867 ปีที่แล้ว

      ​@@Anon1370 3080Ti also works, same die as the 3090 (same RT as well) but 12GB is the sweetspot.

    • @valeera5415
      @valeera5415 ปีที่แล้ว +1

      @@Anon1370 I got mine brand new for $840 plus taxes and $20 shipping off eBay 2 weeks ago. Here's a video about 12 gb of vram though: th-cam.com/video/RQA3cTPNicg/w-d-xo.html

  • @mayconrodrigues2867
    @mayconrodrigues2867 ปีที่แล้ว +2

    All AMD-sponsored games are VRAM-heavy because the RX 5000 and RX 6000 series have more VRAM than the NVIDIA RTX 2000 and 3000 series; that's just to force us to go to AMD.

  • @eugenebebs7767
    @eugenebebs7767 ปีที่แล้ว +1

    I hope DirectStorage is gonna alleviate the VRAM limitations.

  • @gabriel_ramon
    @gabriel_ramon ปีที่แล้ว +17

    That's why I went with AMD instead of Nvidia this time. I had bought a 4070 Ti for R$7,300 ($1,391.38 USD) here in Brazil. It had issues 2 weeks later and unfortunately died.
    I sent it for RMA and now they will refund me. I bought another graphics card to replace it for a much cheaper price, as I couldn't wait too long for the refund, and decided on a 6750 XT, which is priced at R$3,400 ($648.04 USD).
    I'm quite happy with the performance of the 6750 XT, as it runs games smoothly in 2K resolution. I can run everything on max settings except ray tracing in Resident Evil 4 Remake, which causes the game to crash after a while. However, I don't really care about that, as it doesn't make that much of a difference, and I prefer to have good performance rather than fancy lighting effects that drastically reduce it.
    My next card will be a 7900 XTX with 24GB of VRAM. I'm tired of Nvidia; this problem I had with the 4070 Ti was unforgivable, such an expensive card having issues after 2 weeks of use and still having only 12GB of VRAM.

  • @varmastiko2908
    @varmastiko2908 ปีที่แล้ว +13

    I've been talking about this issue since Turing launched in 2018. It's refreshing that it's becoming more widely acknowledged now.

    • @pk417
      @pk417 ปีที่แล้ว +5

      2024 is going to be the year when people will realize this

    • @IgorBozoki1989
      @IgorBozoki1989 ปีที่แล้ว +3

      12gb of vram should be the absolute minimum. 16gb should be the standard by now.

    • @earthtaurus5515
      @earthtaurus5515 ปีที่แล้ว

      I had this debate with a friend years ago when I bought my RX480, as I was being told off for spending a little more (about £20 or so) on the 8GB version over the 4GB version while flat broke but needing a GPU, just before the cryptomining boom. In summary, I told him I needed the VRAM for textures and that it would be an issue in the future, so I was future-proofing by spending a bit more now as opposed to buying a new GPU later. It sucks to see this is still going to be an issue several years later.. I paid £250ish for the RX480 and that was 6 years ago, in Feb 2017. A few short months later the price had tripled on the second-hand market; some would say I could have sold it and got a better GPU, but everything went up and everything I could get had less VRAM, so it was not worth the trade-off. I've been happily gaming with high texture mods. Now running a 6750XT which was supposed to go into a relative's PC, but they changed their mind, and the GPU's price has since shifted compared to what I paid (the pricing is fluctuating +/- ~£30 excluding extended warranty), so I might as well use it instead of having it gather dust.

    • @clownavenger0
      @clownavenger0 ปีที่แล้ว

      @@pk417 so these cards will be 4 years old when it becomes an issue?

    • @pk417
      @pk417 ปีที่แล้ว

      @@clownavenger0 no, 3 years old.. the 3070ti released in 2021
      The irony is that the 5-year-old 1070ti is still relevant today.. the 3070ti still costs 600 dollars even after the launch of the 4000 series
      I really don't understand what went wrong with Nvidia's GPUs after the 1000 series
      I mean, the 3070ti is a strong GPU but is bottlenecked only by its vram
      Because of DLSS the 3070ti would have lasted much longer, if only it had more than 8gb of vram

  • @Hitchens91
    @Hitchens91 ปีที่แล้ว +3

    Textures are the one setting that I will never compromise on, in any game. If I can’t play with max textures, I won’t play at all. Otherwise I’ll be forever wondering what could have been.

    • @Psevdonim123
      @Psevdonim123 ปีที่แล้ว

      Well, these days the difference is not as huge as it used to be; it's blurrier, but they're not compromising on details that much, and at least you're not missing out on important things anymore (unless you set them REALLY low). Though yeah, it's really frustrating that you have to limit it.

    • @madtech5153
      @madtech5153 ปีที่แล้ว +1

      @@Psevdonim123 Only high vs ultra has a slight difference that you could disregard, but even at High settings it's already maxing out 8gb of vram.
      You can go and watch the Hardware Unboxed video, 16gb vs 8gb.

  • @tealuxe
    @tealuxe ปีที่แล้ว +2

    That's why I got the 3090 and decided to skip at least 2 or 3 generations of GPUs. I figured the 10GB on the 3080 wasn't going to be enough; even the 1080ti had more VRAM. The 3090's 24GB of VRAM should allow me to skip until the 6000 or 7000 series.

  • @dcgerard
    @dcgerard ปีที่แล้ว +8

    I have a 3070 ti and I ran into this very problem in the chainsaw RE4 demo. The card has been great in just about everything else, but yeah, I do wish it had at least 12GB

    • @jacketofthe80s13
      @jacketofthe80s13 ปีที่แล้ว

      you don't, man. the highest texture quality means absolutely nothing; you're just using all that vram to store shaders in a cache so it loads textures without hiccups. so no, it's not your gpu, it's capcom's backwards way of handling shaders

  • @Xilent1
    @Xilent1 ปีที่แล้ว +6

    I found out RE4 Remake loads shaders during the first loading screens and intro. Let it run and the game runs better, no stutters. Great and informative video.

    • @abhirupkundu2778
      @abhirupkundu2778 3 หลายเดือนก่อน

      Mine starts lagging when very intensive scenes are playing, such as the village near the lake in chapter 4 or 5, I think. It's a GTX 1650, and it runs the game smooth af otherwise with the texture setting at High 1 GB. wtf is wrong with Nvidia overpricing every card.

  • @MerolaC
    @MerolaC ปีที่แล้ว

    Missed the chance to quote the merchant.
    "Not enough VRAM, Stranger!"

  • @nabilellaji6509
    @nabilellaji6509 ปีที่แล้ว +1

    I've never been happier with my 3060. This VRAM thing has recently become an issue with big titles; Hogwarts Legacy, RE4 and now TLOU are just the beginning!

  • @iAmLanar
    @iAmLanar ปีที่แล้ว +34

    Hello Daniel! Thanks for sharing this with us. Now I'm a bit worried about my 3070. Was gonna play the game at 1440p/Max settings.
    Anyway, this is not supposed to work this way I guess. The game should not crash. We might expect to experience some stutters or visual glitches, such as popping textures and objects, at most. Data that cannot fit in VRAM should be stored in RAM and loaded on demand instead. For instance, there was a problem with the HD textures in Far Cry 6, where textures appeared blurry on GPUs that lacked sufficient VRAM. I'm not a game developer though...
    I hope it won't get worse in the future, because we now have technologies such as DirectStorage available on PC.

    • @danielowentech
      @danielowentech  ปีที่แล้ว +19

      Most games I've seen behave more like the Forspoken benchmark I showed. More stutters, but not a crash. Not sure why RE4 completely crashes. Maybe it could be patched.

    • @iAmLanar
      @iAmLanar ปีที่แล้ว +2

      @@danielowentech I hope so.

    • @mck8292
      @mck8292 ปีที่แล้ว +12

      3070 owner too :/
      Sad I didn't go for a 6800 instead..

    • @adi6293
      @adi6293 ปีที่แล้ว +4

      @@mck8292 Don't make that mistake again :P

    • @PineyJustice
      @PineyJustice ปีที่แล้ว +5

      @@mck8292 If only the warning signs were there since nvidia has been doing this garbage for over a decade...
      They sell upfront with "oh all we need is 2 or 4gb, that's fine for all current games at max settings" yet you're not buying a new card just to play old games, you want to use it for at least a year or two.

  • @kosmosyche
    @kosmosyche ปีที่แล้ว +13

    If this game's preferences act the same way as in the RE2 remake (which is highly likely, since they use the same RE Engine), texture cache at high 0.5gb and high 8gb should give the same result visually, but the lower setting may or may not result in a bit more stuttering, because the game needs to swap textures to and from VRAM more frequently. That said, I completely agree that selling mid-to-high-end cards for that amount of money without providing them with at least 12 GB of VRAM is extremely bad on Nvidia's part. Hell, I had 11GB of VRAM on my GTX 1080Ti, and that card came out like 7 years ago. Imagine the cruel irony of paying hundreds of bucks for a modern-ass GPU and having to play on lower settings than a person who bought his GPU 7 years ago or for peanuts second-hand! Seems like Nvidia is doing everything to completely ruin their reputation this gen. Not that it was especially high to begin with, but still.😮

  • @ElonMuskT
    @ElonMuskT ปีที่แล้ว +2

    Sure, VRAM could be higher, but people need to realize that the real problem here is the poor optimization devs are doing; for new games this has just become the norm. I hope they realize that optimization is more important than they think, because in the current state of the world economy I don't think the average person can just go out and comfortably afford a 4090 or 7900 XT for 1000+ bucks

  • @garyaudine3327
    @garyaudine3327 ปีที่แล้ว

    It's funny that the limiting factor for a "modern" remake of an old game to run smoothly on high-tier cards is the VRAM of the GPU. What a world we live in.