So far, what the leaks suggest is that the entry level Radeon and Nvidia cards will have a 128-bit memory bus once again, so the possible memory configurations are the following:
Intel B570?:
-(160-bit) 10GB G6
Intel B580:
-(192-bit) 12GB G6 (almost released by now)
Nvidia RTX 5060:
-8GB G6
-16GB G6 (clamshelled, which adds extra cost)
-12GB G7 (possible refresh with 3GB memory chips)
Radeon RX 8500 XT/8600 XT:
-8GB G6
-16GB G6 (clamshelled)
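For anyone wondering how those capacities follow from the bus width, here is a quick sketch of the arithmetic: each GDDR chip occupies a 32-bit slice of the bus, and clamshell mode puts two chips on a slice. The chip sizes and helper name below are illustrative, not confirmed specs.

```python
# Rough sketch: how bus width and chip density determine VRAM capacity.
def vram_options(bus_width_bits):
    channels = bus_width_bits // 32          # one GDDR chip per 32-bit slice
    for chip_gb in (1, 2, 3):                # 3 GB modules are a GDDR7 thing
        for clamshell in (False, True):      # clamshell = two chips per slice
            yield chip_gb, clamshell, channels * chip_gb * (2 if clamshell else 1)

for chip_gb, clamshell, cap in vram_options(128):
    mode = "clamshell" if clamshell else "normal"
    print(f"128-bit, {chip_gb} GB chips, {mode}: {cap} GB")
# 2 GB chips normal -> 8 GB, 2 GB clamshell -> 16 GB, 3 GB normal -> 12 GB,
# which is exactly the 8/16/12 GB spread in the leaked configurations above.
```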
$300 for a GPU that can't be used in all games is a bit too much. We used to have $250 GPUs that lasted for years and had the same VRAM as the x80 class. 8GB GPUs are old news right now; they aren't worth anything more than $180.
I sort of want to know... if, say, you had 8gb of GDDR7X on, I don't know, a 512-bit bus or whatever, would that do the trick? Or does the speed/bus not matter? (Not that I think nvidia would do this, but I've always wondered if fast RAM is better than more RAM.)
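One hedged way to reason about it: speed and capacity fail differently. A sketch with assumed data rates, not real product specs; the PCIe figure is the rough PCIe 4.0 x16 ceiling.

```python
# Bandwidth = bus width (bits) / 8 * per-pin rate (Gbps). Assumed numbers.
fast_8gb   = 512 / 8 * 32   # hypothetical 512-bit GDDR7X-class card -> 2048 GB/s
slow_16gb  = 128 / 8 * 18   # typical 128-bit GDDR6 card             ->  288 GB/s
pcie_4_x16 = 32             # ~GB/s, rough PCIe 4.0 x16 ceiling

print(f"fast 8 GB card:  {fast_8gb:.0f} GB/s while data fits in VRAM")
print(f"slow 16 GB card: {slow_16gb:.0f} GB/s while data fits in VRAM")
print(f"either card once VRAM overflows: ~{pcie_4_x16} GB/s over PCIe")
# Fast VRAM only helps for data that is resident. Once the working set
# exceeds 8 GB, the overflow streams over PCIe at a tiny fraction of
# VRAM speed, and that's the stutter. Speed can't substitute for capacity.
```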
This will keep happening until people stop buying nvidia cards. Unfortunately nvidia has had extreme success in anti-AMD viral marketing (drivers nonsense, exaggerating benefits of dlss/rt). They're never going to stop screwing you unless you stop buying their crap.
Minimum vram:
1080p | 12GB
1440p | 16GB
2160p | 20GB
$200 and below can be 8GB
$250 and above: 12GB
$400 and above: 16GB
$800 and above: 20GB
$1000 and above: 24GB+
Those are the minimums for me. Nvidia is known for cutting vram for profit; they've been running this strategy for decades.
I'm still divided about 8GB VRAM. In a lower tier card, it's totally fine. These are not meant to run every game on max texture settings or at the highest resolution. An RTX 3050 or RX 6600, for example, is a compromise. But everything with a Ti in its name or above 250€ should have more VRAM. The most disappointing card for me personally was the RTX 3070 Ti, the biggest rubbish card nVidia has made since the dreaded GeForce "4" MX scam series in the 2000s.

As someone with a 4K monitor, I always went with 70-tier cards, beginning with the GTX 770 Ti 4GB, then the GTX 1070, RTX 2070 Super and now the RTX 4070. The 70-class cards are not perfect for 4K, but they have a far better price to performance ratio than the 80-series ones; the RTX 4080 is almost double the price. I'm fine with medium/high textures. The 4060 8GB would be fine if it was 250€ and not 320€, and the 8GB Ti costs around 400€, which is nonsense. The best 8GB card in my opinion is still the 3060 Ti, especially on the used market, which has solid offers around 200€, which is what a card of this performance level should cost. My sister-in-law has an RTX 3060 Ti and usually only plays Asian "2.5D" games, or tactical/milsim games with me on a 4K TV. Perfectly capable. But she also wants to try out Stalker, and in 1080p the game works fine. In 4K, DLSS Performance is needed, which makes everything blurry, and especially the very transparent anomalies and enemies behind foliage are very difficult to see with all the AI noise. 1440p is not tested.

And then there is optimisation. A lot of games are just terribly optimised, especially Unreal Engine 5 games. The shadow, reflection and texture settings are broken in many UE5 games, because the devs think that UE5 is the holy grail medicine that solves all their technical problems. Metro Exodus is an example of a well optimised game, which has stellar graphics with RT on without needing more than 8GB of VRAM. I played the Enhanced Edition, which is RT-only, on an RTX 2070 Super in 4K at above 60fps. The engine was carefully developed for this specific game from the ground up, not just some all-around tool that tries to suit everyone but in the end does nothing perfectly.
The 1050 Ti 4GB back in 2016 was 150€. Almost 10 years later, prices have skyrocketed after the PLANNED gpu crisis. If it was not planned, prices would have returned to normal. PERIOD.
4060-class cards need to be $140. Something like the 6600 needs to be $100, if GPU prices were realistic. The 4070 needs to be $300, the 7700 XT $230, the 4080 $500, the 4090 $600. The vram standard should be 12gb, perhaps even 16gb two years from now.
Let's not forget that upscaling and framegen use vram. Imagine being able to run ultra 1080p at 90Hz, or upscale from 1080p to 1440p and framegen to 120fps, if you have the vram. "Framegen and upscaling suck"? That is subjective.
Steelrising (an nvidia-sponsored title) requires 13.5gb to run at ultra 1440p, Suicide Squad requires 11gb of vram to run at 1080p, and the RE4 remake with ray tracing requires 12gb of vram at 1440p. Most of my games from 2022 onward need more than 8gb of vram.
I like that 8gb of vram is becoming an anchor point. It's ridiculously bad, but it's good that game devs can target 8gb and make the most of it. Obviously don't buy an 8gb card in 2020 if you're looking forward at future stuff, but 8gb being a low-rent target is good.
I agree, new cards should have no less than 12gb. Otherwise they may as well just put 4gb on there, as there's no difference from how it runs with 8gb.
8gb is the new 3-4gb imo. $180-250 is the price I have set in my mind. 12-16gb is the new midrange, with a price up to $400-450. Then there are the 90-class cards, which can be $900-1000+.
Another point to look at: Nvidia isn't focused on gaming gpus now, it's all about AI, because that's what made them filthy rich over the last year or two. The best chips are going to be earmarked for AI and not gaming cards.
Nvidia released a low-end card with 12GB of vram nearly 4 years ago, though it was pretty pricey at $330, kind of masquerading as a mid-range card, and AMD released a better 12GB card about a month later... although those prices weren't exactly widely available until maybe a year later, after the GPU shortage ended. Those two cards really set the bar for the minimum amount of vram for a graphics card in this price range, and yet more 8GB graphics cards were released in the following generation, with MSRPs as high as $400 for the 4060 Ti!

It's honestly baffling that in the current generation neither AMD nor Nvidia released a single graphics card with 12GB of vram in the $250-400 price range. Instead, all we get are a bunch of cards with not enough vram, and a couple of overpriced and underpowered versions of those 8GB cards with too much vram (the 7600 XT and the 16GB 4060 Ti). The 7700 XT was never designed as a high volume product, and launched at a price that was too high relative to the 7800 XT. It finally started looking like decent value recently with price cuts, but even then, it's not hard to look like decent value next to either version of the 4060 Ti. The 7800 XT has been seeing good discounts as well, and has remained the better value if you could find it at a similar percentage discount from MSRP.

Oddly, the best value graphics cards in the $250-400 price range have generally been the 6700 XT, 6750 XT and RX 6800 from the previous generation, though availability of the 6800 has often not been great, and as the 6700 XT and 6750 XT have gotten older, their prices have mostly stayed stagnant over the past two years, making the value proposition increasingly less compelling, though I have seen the 6700 XT as low as about $260-270 in some recent sales, which definitely made it more exciting again.
8GB of VRAM should be about $700 or $800 when using the Nvidia math calculator. Thus far the Nvidia math calculator has not been wrong in a way that some simple rebranding couldn’t fix.
I find that I really don't need more than 8gb of video ram. I've never had a card fast enough to warrant more than 8gb. I would like to see faster cards with a larger memory bus. I have a Radeon RX 580 in my system; it's too slow for me to take full advantage of the vram. It's not that the ram is too slow, it's that the cores are too slow. I would really like to see faster cores; however, they're not getting faster or more plentiful compared to vram speeds. I predict that within 10-15 years video cards will be made obsolete by improvements to integrated graphics. This takes into account the fact that GDDR5 is plenty fast for video memory, and AMD's R&D efforts in higher end integrated graphics. Integrated graphics has much lower latency than using a dedicated graphics card even now.
Well, here in Romania I got my RX 7600 for below $250 a year ago, but for the same price here you might get a 3050, an RX 6600, a GTX 1630 or a GTX 1650, so yeah.
They shouldn't cost more than $150. 8gb has been around for a long time; it went mainstream with the R9 390, and it's not like those cost $600. So we've had the R9 390, the 480s (which truly brought 8gb to everyone for $240), then the 5000 XT series, the 6000 series, the 7000 series and soon the 8000. We're going into the 6th generation since 8gb went mainstream, and nvidia is still trying to fuck everyone over with expensive 8gb cards, and people still defend this crap.
Repeating my comment from the Q&A: 8gb cards shouldn't be more than $200. No one should be buying a new 8gb card in 2024. If you're playing games that actually fit in 8gb comfortably, you're playing either indie games or AAAs from before 2020. And if that's the case, frankly, you can just buy a used card. A 3060, 3060 Ti, 6600 XT or 6700 XT are great at that level; ~$250 for a used 6700 XT on ebay on Dec 1st, and that's actually giving you 12gb.
They will push 8 gigs as long as people buy whatever slop nvidia pushes out.
Ngreedia at its finest.
Stop blaming hardware when the problem is software.
No amount of vram will solve anything.
I can make one single sculpture in Blender that will eat any GPU.
GPUs already have more than enough VRAM; the problem is developers and Unreal.
And guess what, lots of VRAM makes GPUs more expensive, and prices will never go down.
9gb 96bit 5060 i guess
@@markdove5930 Both.
@@markdove5930 Actually, removing the RT/Tensor cores from the shaders and giving them their own ROPs would be considerably faster. Which also means you would need double the VRAM for the increased processing power. There are bottlenecks in their designs, and it was done on purpose.
An 8GB VRAM card in 2025 should cost no more than what a 1050 Ti did when it was new ...
Just wait until the post-tariff 5070 becomes a new 2025 card with 12gb of ram for over $1000...
and the 200$ rx 580 8GB in 2017 :D
Dear friend @@commanchi7
Good luck to nvidia trying to sell me any graphics card with less than 12 GB of vram for $1000.
First of all, I never buy overpriced products. Secondly, any graphics card with less than 16 GB of vram is an instant "hard pass" for me.
16 GB of vram is not even "future proof", it is just "never without", meaning "bare minimum".
The "nvidia/amd/intel" trio rigged the market so much, probably from 2017-2018 onwards; we need competition to return and the price-fixing crime to end.
I am still using an R9 380 (4 GB) from 2015 and don't intend to buy another graphics card until I see true competition return to the market.
True but inflation.
In my country the RTX 4060 8gb is still $310+ 🗿
and it's the lowest tier of the RTX 40 series,
while the GTX 1050 Ti was not the lowest tier of the GTX 10 series:
there were the GT 1010, GT 1030 and GTX 1050.
Even Intel equipped their B580 with 12gb. 8gb for 2025 is unacceptable unless it's cheap.
8GB of VRAM should be under $200. For a couple of years already the RX 6600 8GB, which delivers damn good performance for its low TDP (130W), has been $180 to $200 brand new. Over $200 is ridiculous at this point.
@@natejennings5884 The 3060 12gb is a much better value than the base 4060 with 8gb.
Overclocked, it's around the same performance as the 4060 as well.
That extra 4gb is a lifesaver.
In my country the 6600 is the only good option from AMD; every other card is priced like nvidia's.
aint the b580 rumored to be $250usd? so its cheap AND has more than 8.
@@tehgendo The problem with Intel is that its drivers are even worse than AMD's. World of Warcraft just crashes randomly, and many games don't even start. Lots of videos have been done on it recently.
I would never buy Intel lol!! That's madness.
Watch Nvidia release the 5060 with only 8GB. AMD better not fumble this chance for an easy win between $200-$400.
AMD will fumble because they always do.
The 7000 series is already much better; it's not AMD's fault, it's the braindead consumers.
Since RDNA2, where they offered the same performance for half the money, I'm actually certain that it doesn't matter what AMD offers. I don't actually know why people want an Nvidia monopoly that would make the insane current pricing look like a joke, but I hope there actually is something rational behind it, which I don't see.
Cant wait for amd to focus on the midrange market and still only have like 10% market share because people would rather pay the Nvidia tax
Amd is colluding with nvidia
I would not buy an 8GB card at any price in 2025.
What if it was 299 USD and offered 3080 performance but with 8gb GDDR6X?
@@Orcawhale1 $160 max for a 3060 performance
@@Orcawhale1 I will/did just buy a 4090
1080 ti with 11 gigs of vram still goin strong baby!!!
it has 11? wtf
still 4060 with 8gb is faster😅
@@alexandruciordas4941 Just barely by 7% and 4060 still sucks compared to 1080 FE TI in some areas.
@@mejdlocraftci Yeah. Ti used to mean weird af Vram amounts back then😂
@@alexandruciordas4941 It took an unhealthy amount of years of bad products to reach its greatness.
They will probably keep milking gamers dry with 8GB till 2030.
You're probably right.
2024 Nvidia still releases GPUs with 6gb Vram.
I think 8gb Vram will last until 2050
@@penonton4260shhhh. 🤐
Well, that's what gamers wanted. They put them into that semi-monopoly position.
@@TheDude50447 Gamers want to pay too much for not enough vram?
Nvidia explaining themselves with "It is designed this way, it can only use 8 gigs" should not be listened to but voted out with wallets: "You designed it this way, eat the costs." But the notion that AMD does not offer a competitive solution boggles my mind.
China, India and South America will buy those 8GB cards regardless.
So north american consumers are smart and would never buy 8GB cards? You are a funny man.
@@RFKG It's because the pricing of amd cards sucks in india.
@@RFKG This could be true, but not because North Americans are smart lol. Odds are they're running higher resolutions and would literally need more vram. Maybe not, though, lol. Panels are getting so affordable that it won't be long until most of the world plays at 4k, and then 8gb will just not do...
I would buy an 8gb card if I couldn't afford 1440p. But then it should cost less than 200 USD to match the rest of the system.
$200 maximum for an 8GB VRAM GPU. $250-300 for a 16GB version of the same card.
$50 for 8 GB of GDDR7 is already a big margin. More is a scam.
16gb on a 128 bit bus is bad tho id much rather have 12gb on 192 bit
Why is it bad?
16gb on a 128-bit bus isn't bad, it's just suboptimal. In some situations the extra vram might be better, but usually not for games at lower resolutions. I would agree that at the same price I would rather have 12gb on a 192-bit bus instead of 16gb on a 128-bit bus, but if the 16gb card is cheaper and pretty much the same performance, I'd buy the cheaper one. Also, if nvidia used 3gb memory chips they could have 12gb of vram on a 128-bit bus.
@@justmatt2655 If the game or application renders a lot of objects
(for example, lots of trees, NPCs and items),
8gb of Vram will not be enough; it has been proven in some of the latest games.
With Vram below 12gb, many objects are removed to fit within the Vram capacity.👻
Zero dollars.
Because such a card should not be sold at this point
I would be OK with it if it's very cheap and both the ads and the box clearly state it is for older or lightweight games only.
No. The issue is 100% the price point. 8GB is perfectly fine for competitive titles. And some AAA at 1080p. Not everyone, especially those constrained by a budget, buys a gpu to play AAA titles at ultra settings at 4K resolution.
@@MM-vs2et People constrained by a budget are better served by existing 8GB cards. A brand new product is no boon to them.
@@agentoranj5858 I have a friend who would never buy used, even if the card just came out recently. Some people just do not want to buy used, for whatever reason. Part of the reason Nvidia owns so much market share in the GPU space is that they have 3 times AMD's stock to sell in retail. A new GPU has warranty, ongoing support, and a feeling of security from buying new; it is what it is. I think it's dumb, but I'd rather they not completely omit the 8GB options on newer cards while jacking up the entry cost by another $50.
If you end up with not enough Vram and can't crank down the settings, you are DONE... Mostly it will make the game stutter heavily, unplayably heavily... So no, 8Gb even for 1080p is a NO GO, I repeat, NO GOOOO
It's true. This "8gb is enough for 1080p med textures" stuff has been shut the fuq down by Stalker 2 and other recent releases. Just stop with the 8gb copium. Lol
8gb belongs in the xx50 class cards. that is why the 4060 had it, and the 5060 probably will too 🤷🏻♂
A new 8gb gpu in 2024-25 should cost no more than $150.
Glad I got 20gb; I've played too many games that use 10-12 gb of vram.
AMD GPUs use more VRAM by default.
@ not true but go crazy
@@47battles Radeon cards do tend to allocate more VRAM than their nVidia counterparts, and _technically_ you could say that allocating more VRAM is also using more VRAM, even if that's not quite how it works. But this is due in part to more aggressive color and texture compression used by nVidia to conserve VRAM, and also down to the fact that Radeon counterparts often just have more VRAM available to allocate in the first place. This also often makes comparisons of VRAM usage between the vendors irrelevant and inconclusive.
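For what it's worth, the numbers people compare usually come from tools that report allocation rather than a true working set. A minimal sketch of where such figures come from on the nVidia side, using the nvidia-ml-py bindings; Radeon needs a different API, so this is illustrative only.

```python
# Minimal sketch: what overlays and monitoring tools actually report.
# NVML's "used" figure is memory *allocated* on the device, not the
# working set a game touches each frame, which is why a card with more
# VRAM can show higher "usage" without actually needing it.
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"allocated: {info.used / 2**30:.1f} GiB of {info.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```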
This 8GB "issue" has apparently been a thing for years. Even back when the 3070 cards came out it was a potential issue. I've had a 3070 Ti for 3 years now and still haven't run out of VRAM in any title. I argued back then that by the time I upgrade to a GPU with more VRAM, my card will be struggling with fps regardless. And that is actually starting to become a reality. I'm planning to upgrade in the coming months if the prices are not ridiculous, and that is for performance reasons, not VRAM.
The whole thing is highly dependent on the games you play: Eco, Foundation, Timberborn, IXION, Satisfactory, Cities Skylines 2, Manor Lords, New Cycle, Medieval Dynasty and Farming Sim 25 are some of the games I've bought after I got the 3070 Ti, and none of them have had any issues with VRAM.
Only the most demanding games that are not optimized well will run out of 8gb of vram at 1080p. At 1440p I guess it's more common, but still not that bad.
@@Margeletto Run Doom Eternal on max settings with RT on at 1440p and you'll see how great 8gb of vram is lol.
@@bp-it3ve As I said, it depends on the game. I don't do FPS games anymore; it's probably 15 years since I last played a game like that.
I also tend to stay away from the more expensive AAA titles when they are new. Saves money on both the game and the hardware needed. I might buy them during a sale some years later, and by that time I probably have a good GPU as well without breaking the bank.
@@Gazer75 Most people don't only play indies. The 3070 Ti was outdated on release; I know, I bought one on release day and sold it a year later. Even back then it was running out of vram in games like Control, Far Cry 5, Final Fantasy 7 Remake etc. If you only play the games you listed, you could do fine with onboard graphics and no gpu at all.
@@Dempig Not if you want the game to look good and not be at
5060 has 8 GB of VRAM.
Gonna rock my 3060 Ti for years to come huh
Get a 4070 Super; 60-class cards suck anyway.
@@WhoIsLost If I wanted to upgrade from a 3060 Ti after 4 years, I would get a card that has 16GB and is at least twice as fast. Even if the RTX 4070 Super is an upgrade, it wouldn't be enough for me. Only a 4070 Ti Super would be a worthy successor, but the price is crazy.
An 8GB VRAM card for 2025 simply shouldn't even exist. Even an RTX 5050M running at 30W in something like an XPS 14 should have 12GB. But going back to the question, my honest opinion is that if an 8GB GPU has to exist, it should be sub-$200. Anything above $200 for an 8GB GPU in 2025 is a literal scam, due to how badly these cards perform as soon as the VRAM hits its limit. 1080 Ti users are going to be set until the 6000 series if Nvidia gives users a $400 RTX 5060 Ti 8GB.
Interesting alternative universe you live in
@@tysopiccaso8711 Giving an RTX 5050 8GB of VRAM is like giving an RTX 4080 12GB of VRAM, wait...
That is too extreme. A 5050 with 10gb is alright; the 5060 should have 12gb.
@@turtleneck369 Honestly I'm really wanting 12GB, despite knowing that won't happen, because games will continue to get more VRAM intensive, especially with new versions of DLSS and FG hogging up even more.
@@RobloxianX Well, even 10gb would be a big difference for an entry level rtx card compared to 8gb. While it might not seem like a huge difference, it would actually make a lot of games way more playable at 1080p; it would still be like a 27% increase in VRAM. 8gb is kinda useless, especially considering the prices they are going to launch at; it's not going to be cheap, and I fear it's going to be even more expensive than people expect and predict.
Realistically speaking, it doesn't matter how much we think it should cost; as long as there are still people buying Nvidia's overpriced 8GB GPUs, they will keep charging more for them.
$225-250, I think that's the maximum an 8GB GPU should cost at this point; anything higher than that feels like a scam. The $300-400 ones, like the future 5060/5060 Ti or RX 8600, should have at least 10-12GB.
A 5060-ish for 400-ish is the same as a 4070-ish for 400-ish or a 3080-ish for 400-ish, so it's no good: literally 0 value over the old gen. We need the 5060 (Ti) to outperform the 4070/3080 and cost more than $50 less than the current price. And of course 12G minimum.
@@jermygod And we will probably get 5060 Ti = 4070 and 5060 = 4060 Ti. I want to believe it won't be this way... But if the rumors about the 5070 having 4070 Ti raster performance are true, this is the most likely situation, sadly. Dunno if it's because it's getting harder to make performance leaps between generations or a lack of competition, but it kinda sucks.
@@juanford55 Let's hope that at least the prices for older generations will go down
True, but I don't think we're gonna see those prices anymore. Inflation sucks, and gpu corpos aren't our friends. After all, they are here for $$, and they saw in 2020-2021 that people would pay high prices like a joke.
We already know what the prices should be depending on VRAM, as our current generation has the following prices:
8GB RAM - $200
12GB RAM - $300~$400 (7700 XT; the 7600 XT is actually 16GB)
16GB RAM - $450~$600 (7800 XT up to 7900 GRE)
20GB RAM - $650~$800 (7900 XT)
24GB RAM - $800~$1,200 (7900 XTX)
32GB RAM - anything above that (RTX 5090)
But nVidia thinks that they can sell an 8GB card for 400€
7600 XT is actually 16gb
@@hyperturbotechnomike Because people keep buying them. It's not even the average person's fault, because almost all pre-built systems use Nvidia's cards. If all the pre-builts changed to AMD, that alone would spike the usage charts. People knowledgeable about this stuff don't generally buy a 4060 or 4060 Ti over the AMD counterparts unless they have a specific use case for Nvidia.
@@z3roo0 Oh my bad, I must have mixed it up with some other model. I recall some AMD card having 12GB; which one was it?
@@darkfeeels 7700 XT has 12 GB.
Wait till you see the laptop 5070, it will still have 8gb of vram. For context, the laptop 1070 had a 256-bit bus and 8gb of vram. They are gonna milk gamers dry with this shit.
5060 ti desktop will be 192bit 12gb.. supposedly. I would guess laptop 5070 will be 10 or 12. If it's still 8.... 💀💀💀💀
Ray tracing is slow when you don't tweak. It's possible to run path tracing on a 6700 XT in Cyberpunk; tinkerers can already game with RT on 4060-level cards. Steve and Tim must test on generic settings because of how much work they've got. Example: in Cyberpunk, turn on RT, but only reflections and GI at medium. Most people won't see the difference from ultra, apart from the 2x fps, and the effect multiplies with FSR + FG. On a 6700 XT it should be 1440p, FSR Quality, FG, 110fps.
This is also true for many newer rasterized games. Stalker 2 is another great example right now, where GSC just checked all the boxes for unnecessarily computationally heavy shader effects. With a few optimization mods from the Nexus I have the game maxed out with a humble 5800X + 6800 XT at 1440p with FSR native AA and can maintain 60 FPS in villages and 70-90 FPS out exploring. My much cheaper rig is now only slightly behind what a 9800X3D + RTX 4090 gets on the stock max settings, and the only cost was a barely noticeable hit to shadowing and ambient occlusion quality. So many games have settings that come with a 50% performance hit for what amounts to 10% better visuals, and they're designed to make fools part with their money on overpriced hardware.
@K31TH3R Great example. For me, with a 5800X3D + 7900 XT, I've played Wukong, and the delta between high and cinematic settings is like 2x; meaning average fps is similar but lows are halved. Something tweaked for nvidia must be in very high and cinematic; the HU guys say GI or water effects. I can bet non-RT Lumen is using some nvidia acceleration.
All cards should start at 16G: the 5070 16G, 5070 Ti 16G, 5080 20G, 5090 24G. This is what Nvidia should put out.
The 5090 might be 32GB; the 5080 and 5080 Ti need to be 20-24GB, the 5070 and 5070 Ti 18-20GB, the 5060 and 5060 Ti 14-16GB, and the 5050 and 5050 Ti 12-14GB. They can use GDDR6X for the lower tier cards (5060 Ti and below) to allow higher capacities. That leaves room for a Super refresh with more or faster GDDR7 once that RAM's price comes down.
12gb is a good enough starting point for the 5060, 16gb for the 5070, and more as you go up.
Nvidia does it on purpose to force you to buy expensive cards. Hence why I buy AMD on that alone. For the same price, you get double the video memory. FTC should be going after Nvidia for their prices
It's on the consumers too; if they keep buying, Nvidia will keep doing it. It's great business.
I did a quick check of spot price for GDDR6, and it's currently about $2.3 per gig. GDDR6X would be about 50%-ish more. So that's about $18 for 8 GB GDDR6, maybe $27 for 8 GB GDDR6X. That's the raw price to the producers, if I understand the market right.
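A quick sanity check of that arithmetic; spot prices move around, so treat these constants as illustrative figures taken from the comment above, not authoritative pricing.

```python
# Back-of-the-envelope memory BOM cost from the spot prices quoted above.
GDDR6_PER_GB = 2.30   # USD per GB, quoted spot price
GDDR6X_MARKUP = 1.5   # "about 50%-ish more"

for capacity_gb in (8, 12, 16):
    g6 = capacity_gb * GDDR6_PER_GB
    g6x = g6 * GDDR6X_MARKUP
    print(f"{capacity_gb} GB: GDDR6 ~${g6:.0f}, GDDR6X ~${g6x:.0f}")
# 8 GB: ~$18 GDDR6 / ~$28 GDDR6X. So doubling 8 GB to 16 GB adds only
# about $18-28 of memory cost before margins, routing, and clamshell work.
```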
If you PC owners think it's bad, wait till you get to laptops. You have to go to an 80-class card to get above 8.
I don't think you should buy an 8GB card in 2025 at all, regardless of the price. You're buying a card that's obsolete the day you install it. There are also lots of options with 12+ GB VRAM even at entry level prices, so there's just no reason to go with an 8 GB card.
If you're buying a GPU, delay your purchase by 3 or 6 months or even a year to save up and get a higher tier.
This generation of GPUs has to be well ahead of the RX 6000 series and RTX 3000.
The 1080 Ti was matched by the 6600 XT, landing right between the 3060 and 3060 Ti, but those are the lowest acceptable end cards.
Get yourself a GPU that is at least 20 to 30% better than a 6950 XT and 3090 Ti.
It really is a shame that games get demanding at a much faster rate while GPUs are stagnating. They also don't look as good, and devs are not optimizing.
Just play older classics, horror games and even multiplayer until the market changes.
RTX 5000 will not see a big step up in raster, only in raytracing, and it will use more power. I do not expect a better price/performance rating. The only hope is that AMD can provide 7900 XTX performance for $600-ish.
1080ti does not match the rx6600XT, 1080ti does not even match the rx 6600, it is slower.
No, I will not; I'm already happy with my RX 6600 at $186, bought last week and playing all my games without issues NOW. This advice only works if you live in a country with a strong economy, have enough spare money in your wallet in case the card's stock in your area either stays strong or goes dry and overpriced the next month, or care about the newest trends and AAA titles.
Then again, if you find a good discount on anything above an RX 6700 or a 3060 Ti and care about the next Cyberpunk sequel or whatever the heck they invent, get it.
It costs as much as people will pay. And people will pay.
How much should it cost? They should not exist in the first place. Having an 8GB GPU in 2025 is an absolute joke. I wouldn't take one even for free.
I'm still hanging onto my GTX 1080 Ti, until it dies or it becomes absolutely necessary to replace it.
Love my 3060 12gb. Oc'd it's close to a 4060.
1440p gaming with dlss quality and sometimes none.
Fantastic card for the price.
No sense in lowering prices when current gen 8gb GPUs are selling well. They can only be more expensive..😊
I bought 2 RX 6700 XTs in the past 2 weeks: one XFX 319 and one Red Devil edition, both for 180€ used, and the Red Devil came with invoice papers and 2 years of warranty left. It didn't make sense, at least for me, to spend the same or even more money on an 8gb card.
An 8gb card is a sub $200 card only now.
A 12 gb card should be a sub $400 card now.
There should be no more 8GB shit in 2025. It's really that simple.
I won't complain if they release a 150 dollar card with 8GB of vram, but it seems pretty obvious that Nvidia won't be doing that. AMD will likely release one or two very low-end 8GB cards, but those will likely be a pretty poor value.
However, it seems all but certain that AMD will also be releasing at least one low end card with 12GB of vram somewhere in the 200-300 ish dollar price range. An actual purpose-built 12GB card would allow them to make them in large quantities at a relatively low cost, unlike the 7700 XT, which was never designed or intended to be produced or sold in large quantities, and kind-of only existed to make the 7800 XT look good, and to fill a gap in their product lineup.
@@syncmonism There is a chance that RDNA 4 and Battlemage could significantly improve the gaming GPU market if they deliver powerful low-end and midrange GPUs at aggressive prices. A lot of gamers want good price to performance, and if AMD and Intel can manage that, I can see them gaining a considerable amount of market share. It might take Intel a few years, but AMD could definitely gain quite a bit with RDNA 4. RDNA 4 and Battlemage are said to be very competitive in the low-end and midrange segments. The top end RDNA 4 GPU is rumoured to land between the 7900 XT and 7900 XTX in rasterized games and to be roughly on par with the 4080 in ray-traced games for around $400-$500. If AMD can pull that off, it would be impressive and should be an appealing option for midrange gamers.
@@syncmonism Navi 48 will be the top end GPU for RDNA 4. It's meant to have 16GB of VRAM on a 256-bit bus. Those are decent specs for a midrange graphics card.
@@syncmonism Do not be shocked if next year we still get 4GB cards from both. Remember, AMD had already said 8GB was the minimum a card should have, and then they released the 6500 XT followed by the 6400.
And while in the 7000 series and 4000 series we didn't get less than 8, that's mostly because the 6400 and 6500 XT are both still readily available. And nvidia didn't make anything under the 4060 because they're still selling the 3050, which has a 6gb variant.
8gb on the most budget cards is fine as long as they are cheap enough. Nvidia is still selling the 6gb 3050, and it's not great, but it's not a complete ripoff. I'd rather they just clamshell all the cards and double the VRAM, or give us a bigger bus with more, but I know they won't do that. Pricing is the big point for how much VRAM is acceptable.
While 8gb is enough for my usage, paying $300 USD at the end of 2024 (basically 2025) for 8gb is scammy.
Max 150 EUR... at that price this is basically a video adapter, no matter the features.
250 max for 8gb of vram, 400 max for 12gb, and 700 max for 16gb; those are the right prices, but no one would do it. 20 and 24gb can cost more than 900.
My hopes and prayers, for other folks and me too, waiting for a midrange 5060 or 5060 Ti at less than a $499 price point.
It depends on:
1 - Whether there will be at least 16 GB of Vram or not
2 - Whether or not they offer at least "rtx 4070" and "rtx 4070 ti" performance
3 - An affordable and reasonable price.
Simple as that...
@ Only tomorrow will tell... what awaits us midrange folks.
Cheers 🥂 to better tomorrows
I'm still waiting to see 8GB substantially limit a card at the price points where it's been put on cards. The 4060 Ti is the only one that springs to mind, and in your own testing it was like a 4% difference at 1080p, which isn't exactly a game changer, and it was closer at 1440p because the card isn't fast enough to play at settings where the difference would actually show. 8GB is fine for a card targeted at 1080p, as shown by yourselves with the 4060 Ti.
I don't think it's a hardware issue; just look at DLSS and framegen targeting 30fps at 480p.
If you made every gpu 16gb overnight, they would start to struggle anyway, as bad, unoptimizing devs would put 17gb of junk there.
It's always the devs' fault.
Very interesting point. It doesn't matter how much vram you have if games are always this unoptimized.
The more vram we have the worse optimization games will get
I bought a 4060 for around $200 on a big sale from one of our local retailers, and I am very happy with the performance coming from a 1650. But when asked, I tell my friends not to buy it and to save more money, because it's fairly annoying to have to think about this. Not that I have often encountered vram issues, but that's only because I check vram usage at the beginning so as not to get frivolous with the settings. And I just don't want to do this anymore. My next gpu will be 16 gigs minimum.
At a certain point, you get what you pay for. You could never expect high end for 200 bucks. But $200 for the RTX 4060 is a fantastic price.
I am just considering the 4060 Ti 16 GB to accompany my 6800 XT, but its prices are close to or even the same as the 4070 12 GB. 8 GB of GPU VRAM in 2025 doesn't make sense to me, even if I only play 1080p. Whatever the 5060 price, I'll just wait for the 4070 price to go down when the 5070 comes.
" I just wait for the 4070 price to go down when the 5070 comes." That's very unlikely to happen. It's not going to get any cheaper. It's probably going to get more expensive because people will buy up the card to get away from the bad 50 lineup. Plus NVIDIA stopped making the 70 cards. Just buy it. The performance of the 4070 is worth 100 times more than the higher VRAM of the 4060 Ti. And if you can scrounge up some extra cash, the 4070 Super is even better. Trust me.
8GB of VRAM is a 720p card.
If they're going to use 8GB of GDDR7, then the bus should be a recommended minimum of 384-bit. Then, and only then, you might not see such bad framerates, because the card would at least have some bandwidth to work with instead of being bottlenecked; all they're doing is choking off the performance on purpose.
For the higher-end video cards, and typically even the midrange, it should be 512-bit. If you remember (and I know I do), there were 8GB cards with very wide buses and enormous bandwidth that performed so well they were pulled from the market; finding those is extremely, ultra rare now, and that's what needs to come back.
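The bandwidth arithmetic behind comments like this is simple: bus width in bits times effective per-pin data rate in Gbps, divided by 8, gives GB/s. A quick sketch with illustrative data rates, not tied to any confirmed card:

```python
# Peak memory bandwidth in GB/s from bus width and per-pin data rate.
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gb_s(128, 18))  # 288.0  -- a typical 128-bit GDDR6 card
print(bandwidth_gb_s(384, 21))  # 1008.0 -- a wide high-end bus
print(bandwidth_gb_s(512, 18))  # 1152.0 -- the 512-bit config argued for above
```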
Where the hell were all these kinds of videos when I was buying a GPU?
8GB shouldn't even be a thing anymore lol
I bought a 4060 8GB prebuilt mini-ITX, and I'm fine with it. But I mostly play older games. Not everybody needs top of the line. The problem is memory leaks and poor optimization in new releases, not video RAM, especially if you play multiplayer games at lower resolutions and higher fps.
Meanwhile I am still rocking the $250 Zotac RTX 3050 Twin Edge OC 8GB that I got in the middle of 2022 (during crypto mining). I don't need to upgrade because I run older titles, and I had a GT 1030 2GB GDDR5 for 5 years before getting the RTX 3050. I have catching up to do with my games. I haven't even finished Witcher 3, Metro: Last Light, Oxygen Not Included, Torchlight 2, Warframe, Astroneer, and Shadow of the Tomb Raider. Those are the ones I am currently busy with, and then there are ones I have not even started or installed yet. There are so many old classics a person can play. Then there are the PS2/PS3 emulators and those games I have not finished yet. My AM4 CPU and mobo died, so I just got an i5-12400F and a Gigabyte B760M Gaming DDR4 motherboard recently (installed this week).
Same here, brotha. I respect your taste and lateness in games; I just started playing RDR2 on my newly purchased RX 6600 last week and still have a backlog to run through along with emulators lol.
Well said. I play GTA 5, Guild Wars 2, God of War, Horizon Zero Dawn, No Man's Sky... I game at 1080p with a 3070 Ti. It's working well at high to very high settings. But since I plan to upgrade my display within the next 6 months to a 1440p IPS monitor, I will need to upgrade my GPU for sure. 16GB of VRAM will be the bare minimum, but if I want to run my games smoothly for the next 5 years, I am looking for a GPU with 20GB instead. And Nvidia just does not give me that option. So I will sacrifice DLSS for more VRAM. Maybe my visuals won't be as flashy as they would be with DLSS, but at least my gaming sessions won't be stuttering.
You guys (HUB) are actually part of the problem here. The 60 series cards are not "entry level cards" they are the popular gaming cards. Check the steam charts. The even worse "50" series cards are the entry level cards...
$150 or less; 8GB is DOA. Games are asking for more VRAM. Stalker needs more than 8GB of VRAM to run well.
If the video cards Intel plans to release on the 12th show up, they're already setting the bar higher, following the RX 6700 10GB and RX 6700 XT 12GB standard in memory layout at a lower price.
But how are they going to make you upgrade if you have enough VRAM?
Also worth noting that 8GB of VRAM on Nvidia is different from 8GB of VRAM on AMD.
You can still get away with 8GB on Nvidia in most cases, but not on AMD, because AMD just gulps down more VRAM playing the same game for some effing reason.
I wouldn't recommend 8GB on an AMD card, even if the performance might be a little better than the Nvidia counterparts.
Source? Where did you get that info from? Curious, because I haven't seen an explanation with data on that.
@@turtleneck369 I have both AMD and Nvidia cards. I did a comparison. AMD indeed uses more VRAM.
If 8GB is for 1080p, it should be priced for a 1080p setup: not more than 25k.
7:33 my ocd is tingling omg, look at that bent fin
That is not OCD, but ridiculousness.
8GB cards would be sufficient if Unreal Engine 5 weren't used anymore ☠️
Am I missing something? Why were the cards being tested using pcie 2.0?
So far, the leaks suggest the entry-level Radeon and Nvidia cards will have a 128-bit memory bus once again, so the possible memory configurations are the following (a quick sketch of the capacity math follows the list):
Intel B570?:
-(160bit) 10GB G6
Intel B580:
-(192bit) 12GB G6 (almost released by now)
Nvidia RTX5060:
-8GB G6
-16GB G6 (clamshelled, which adds extra cost)
-12GB G7 (possible refresh with 3GB memory chips)
Radeon RX8500XT/8600XT
-8GB G6
-16GB G6 (clamshelled)
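All of these configurations fall out of one simple rule: one 32-bit GDDR chip per 32 bits of bus, with clamshell mode putting two chips on each channel at extra board cost. A minimal sketch of that arithmetic, using the leaked (unconfirmed) specs above:

```python
# VRAM capacity from bus width and chip density: one 32-bit chip per
# 32-bit channel; clamshell puts two chips on each channel.
def vram_gb(bus_bits: int, chip_gb: int, clamshell: bool = False) -> int:
    chips = bus_bits // 32
    return chips * chip_gb * (2 if clamshell else 1)

print(vram_gb(160, 2))                  # 10 -- rumored B570
print(vram_gb(192, 2))                  # 12 -- B580
print(vram_gb(128, 2))                  # 8  -- rumored RTX 5060 / RX 8600 XT
print(vram_gb(128, 2, clamshell=True))  # 16 -- clamshelled variant
print(vram_gb(128, 3))                  # 12 -- with 3GB GDDR7 chips
```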
I miss when games were fully optimized 😭
Welp, that's just a dream that'll never come true after all 😋
$300 for a GPU that can't be used in all games is a bit too much. We had $250 GPUs that lasted for years and had the same VRAM as the x80 class. 8GB GPUs are old news now; they aren't worth anything more than $180.
I sort of want to know... if, say, you had 8GB of GDDR7X on a 512-bit bus or whatever, would that do the trick?
Or does the speed/bus not matter? (Not that I think Nvidia would do this, but I've always wondered if faster RAM is better than more RAM.)
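For what it's worth, a back-of-envelope answer: faster memory raises bandwidth, but capacity is a hard ceiling; once a game's working set exceeds 8GB, the overflow spills over PCIe into system RAM, which no amount of bus width fixes. A toy comparison with illustrative (not measured) numbers:

```python
# Toy comparison of "faster RAM" vs "more RAM". The data rates and the
# 11GB working set are illustrative assumptions, not measured figures.
configs = {
    "8GB GDDR6, 128-bit @ 18 Gbps": (128, 18, 8),
    "8GB GDDR7X, 512-bit @ 32 Gbps": (512, 32, 8),  # hypothetical card
    "12GB GDDR6, 192-bit @ 18 Gbps": (192, 18, 12),
}
game_working_set_gb = 11  # e.g. a heavy 2023+ AAA title at high settings

for name, (bus_bits, gbps, capacity_gb) in configs.items():
    bandwidth = bus_bits * gbps / 8  # GB/s
    spills = game_working_set_gb > capacity_gb
    print(f"{name}: {bandwidth:.0f} GB/s, spills to system RAM: {spills}")
```

The hypothetical 512-bit 8GB card has roughly seven times the bandwidth of the 128-bit one, yet both spill once the working set passes 8GB, while the slower 12GB card does not.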
Why does NV not put a socket on the GPU for a VRAM extension board as an extra? Then you could buy an additional 8GB for each card.
This could be done on 90s GPUs. You got the card with empty RAM slots.
@ It would also make entry-level GPUs more affordable, since you could add the module at a later date. Keeps initial costs down for new PC builders.
Why would they undermine their own deliberate product segmentation?
8GB - 1080p, $200
12GB - 1440p, $350
16GB+ - 4K, $500 w/ great ray tracing
This will keep happening until people stop buying nvidia cards. Unfortunately nvidia has had extreme success in anti-AMD viral marketing (drivers nonsense, exaggerating benefits of dlss/rt). They're never going to stop screwing you unless you stop buying their crap.
minimum vram:
1080p | 12GB
1440p | 16GB
2160p | 20GB
$200 and below can be 8GB
$250 and above: 12GB
$400 and above: 16GB
$800 and above: 20GB
$1000 and above: 24GB+
those are the minimums for me
Nvidia is known for cutting VRAM for profit; they've run this strategy for decades.
People forget that productivity users need more VRAM.
I'm still divided about 8GB of VRAM. On a lower-tier card, it's totally fine. These are not meant to run every game on max texture settings or at the highest resolution. An RTX 3050 or RX 6600, for example, is a compromise. But everything with a Ti in its name, or above 250€, should have more VRAM.
The most disappointing card for me personally was the RTX 3070 Ti. The biggest rubbish card Nvidia has made since the dreaded GeForce "4" MX scam series in the 2000s. As someone with a 4K monitor, I always went with 70-tier cards, beginning with the GTX 770 Ti 4GB, then the GTX 1070, the RTX 2070 Super, and now the RTX 4070. The 70-class cards are not perfect for 4K, but they have a far better price-to-performance ratio than the 80-series ones; the RTX 4080 is almost double the price. I'm fine with medium/high textures.
The 4060 8GB would be fine if it were 250€ and not 320€; the 8GB Ti costs around 400€, which is nonsense. The best 8GB card in my opinion is still the 3060 Ti. The used market especially has solid offers, around 200€, which is what a card of this performance level should cost.
My sister-in-law has an RTX 3060 Ti and usually only plays Asian "2.5D" games, or tactical/milsim games when I play with her, on a 4K TV. Perfectly capable. But she also wants to try out Stalker, and at 1080p the game works fine. In 4K, DLSS Performance is needed, which makes everything blurry; the very transparent anomalies and enemies behind foliage are especially difficult to see with all the AI noise. 1440p is untested.
And then there is optimisation. A lot of games are just terribly optimised, especially Unreal Engine 5 games. The shadow, reflection and texture settings are broken in many UE5 games, because the devs think UE5 is the holy-grail medicine that solves all their technical problems.
Metro Exodus is an example of a well-optimised game, with stellar graphics and RT on, without needing more than 8GB of VRAM. I played the Enhanced Edition, which is RT-only, on an RTX 2070 Super in 4K at above 60fps. The engine was carefully developed for this specific game from the ground up, not some all-round tool that tries to suit everyone but in the end does nothing perfectly.
The 1050 Ti 4GB back in 2016 was 150€. Almost 10 years later and prices have skyrocketed, after the PLANNED GPU crisis. If it were not planned, prices would return to normal. PERIOD.
4060-class cards need to be $140.
Something like the 6600 needs to be $100.
If GPU prices were realistic:
The 4070 needs to be $300.
The 7700 XT, $230.
The 4080, $500; the 4090, $600.
The VRAM standard should be 12GB.
Perhaps even 16GB two years from now.
Let's not forget that upscaling and frame gen use VRAM. Imagine being able to run ultra 1080p at 90Hz, or upscale from 1080p to 1440p and frame-gen to 120fps, if you have the VRAM. "Frame gen and upscaling suck" is subjective.
my rx 7600 8gb would be fine if fsr 3 was properly implemented in games.
Steelrising (an Nvidia-sponsored title) requires 13.5GB to run at ultra 1440p, Suicide Squad requires 11GB of VRAM at 1080p, and the RE4 remake with ray tracing requires 12GB at 1440p. Most of my games from 2022 onward need more than 8GB of VRAM.
I like that 8GB of VRAM is becoming an anchor point. It's ridiculously bad, but it's good that game devs can target 8GB and make the most of it.
Obviously don't buy an 8GB card in 2020 if you're looking ahead to future stuff, but 8GB being a low-rent target is good.
I agree. New cards should have no less than 12GB. Otherwise they may as well just put 4GB on there, as there's no difference from how it runs with 8GB.
The only card that should have 8GB in the next gen is a 5050, if they make one.
Why are you testing with a PCIe 2.0 connection? O.o
8GB is the new 3-4GB, IMO; $180-250 is the price I have set in my mind. 12-16GB is the new midrange, with a price up to $400-450.
Then there are the 90-class cards that can be $900-1000+.
Why aren't there more 12GB cards?
Another point to look at: Nvidia isn't focused on gaming GPUs now; it's all about AI, because that's what made them filthy rich over the last year or two. The best chips are going to be earmarked for AI, not gaming cards.
It's quite simple. If the coming Nvidia/AMD x6x class has only 8GB, I'll buy Intel (provided the B580 really does get 12GB).
Nvidia released a low-end card with 12GB of vram nearly 4 years ago, though it was pretty pricey at 330, kind-of masquerading as a mid-range card, and AMD released a better 12GB card about a month later... although, those prices weren't exactly widely available until maybe a year later, after the GPU shortage situation ended. These two cards have really set the bar for the minimum amount of vram for a graphics card in this price range, and yet more 8GB graphics cards were released in the following generation, with MSRPs as high as 400 for the 4060 ti!
It's honestly baffling that neither AMD nor Nvidia released a single graphics card with 12GB of vram in the price range of 250-400 dollars. Instead, all we get are a bunch of cards with not enough vram, and a couple of overpriced and underpowered versions of these 8GB cards with too much vram (the 7600 XT and the 16GB 4060 ti). The 7700 XT was never designed as a high volume product, and launched at a price which was too high relative to the 7800 XT. The 7700 XT finally started looking like a decent value recently with price cuts, but even then, it's not hard to look like a decent value when compared to either version of the 4060 ti. The 7800 XT has been seeing good discounts as well, and has remained the better value if you could find it at a similar percentage discount from the MSRP.
Oddly, the best value graphics cards in the 250 to 400 dollar price range have generally been the 6700 XT, 6750 XT, and RX 6800 from the previous generation, though availability for the 6800 has often not been great, and as the 6700 XT and 6750 XT have gotten older, the price has mostly stayed fairly stagnant over the past two years, making its value proposition seem increasingly less compelling, though I have seen it for as low as about 270 or 260 in some recent sales, which definitely made it more exciting again.
8GB of VRAM should be about $700 or $800 when using the Nvidia math calculator. Thus far the Nvidia math calculator has not been wrong in a way that some simple rebranding couldn’t fix.
Well, in 2015 it cost €330.
In 2016-17 it cost €230.
By that math, in 2025 it should cost around 3 × 10^-112.
I mean if they're so stingy with their vram can't they at least settle with 10GB? Like c'mon🤣
You wouldn't need to replace it if it had 10GB. Manufacturing costs in the graphics card market have become irrelevant.
Also they want to make sure it can't be used for AI workloads, because that would lose them millions
It's like $30 for 8GB of GDDR6. They have no damn excuse not to put 16GB on a card.
AMD will be sorry they did not make N44 with 12GB as Intel does with the B580.
I find that I really don't need more than 8GB of video RAM. I've never had a card fast enough to warrant more than 8GB. I would like to see faster cards with a larger memory bus. I have a Radeon RX 580 in my system; it's too slow for me to take full advantage of the VRAM. It's not that the RAM is too slow, it's that the cores are too slow. I would really like to see faster cores; however, they're not getting faster or more plentiful compared to VRAM speeds. I predict that within 10-15 years video cards will be made obsolete by improvements to integrated graphics. This takes into account the fact that GDDR5 is plenty fast for video memory, plus AMD's R&D efforts in higher-end integrated graphics. Integrated graphics has much lower latency than a dedicated graphics card even now.
AI is coming to games, it also requires VRAM!
Well, here in Romania I got my RX 7600 below $250 a year ago, but for the same price here you might get a 3050, an RX 6600, a GTX 1630, or a GTX 1650, so yeah.
Don't spend money on a 8GB card in 2025. Save a little more money and buy at least 12GB.
Couldn't they at least do 10? I know 10GB isn't a huge jump, but there has to be a compromise on price or something?
8GB should be max $200…maybe $225. From $250 up should be 12 minimum.
12GB VRAM is the minimum for modern AAA games at 1080p medium high settings and stable FPS
16gb is the new minimum.
50 series should be average 24gb
The 5070 is gonna be 12GB for sure, and then if AMD is able to compete, they'll do slightly better Super variants.
They shouldn't cost more than $150. 8GB has been around for a long time and was mainstream with the R9 390, and it's not like those cost $600. So we've had the R9 390, the 480s (which truly brought 8GB to everyone for $240), then the 5000 XT series, the 6000 series, the 7000 series, and soon the 8000. So we are going into the 6th generation since 8GB went mainstream, and Nvidia is still trying to fuck everyone in the ass with expensive 8GB cards, and people still defend this crap.
Repeating my comment from the Q&A: 8GB cards shouldn't be more than $200. No one should be buying a new 8GB card in 2024.
If you're playing games that actually use 8gb comfortably, you're playing either indie games or AAAs before 2020. And if that's the case, frankly, you can just buy a used card. A 3060, 3060ti, 6600XT or 6700XT are great at that level. $250~ for a used 6700XT on ebay on Dec 1st. And that's actually giving you 12gb.
Intel's new Battlemage Arc B750 is 11GB for $250, and it's coming early 2025. So you tell me.