The xx60 class of GPU is low end. Always has been. IMO it's the lowest anyone should consider buying new, before looking at something used that's the same price or cheaper with just as much, if not more, performance.
I wish AMD could offer something with 4070 Ti-like performance, very good efficiency, and a two-slot design. I had a 7800 XT, but once I moved to an ITX case I couldn't use it anymore.
@@nicane-9966 That's their ideal, because of the tariffs that are coming. Pricing will be set to deter low- and mid-range seekers into buying only the high-end GPUs (the decoy effect). At the same time they will slash 40-series prices in first-world countries to make sure they stay dominant in the GPU market where it matters (it also serves as great PR, like the un-launch of the 4080 12GB).
These are my system specs:
Motherboard: MSI MAG X570S TORPEDO MAX
CPU: 5800X3D
GPU: ASRock Challenger Pro RX 6800
RAM: G.Skill Ripjaws (2x16GB) 32GB 3600MHz
Storage: 2x 2TB Samsung 980 Pro
Cooler: Corsair Hydro Series H150i PRO RGB 360mm AIO (stock fans replaced with 3x 120mm Silent Wings 3 as exhaust)
Case: be quiet! Silent Base 802
PSU: Corsair TX850M 850W 80 Plus Gold Semi-Modular
Fans: 2x 140mm Silent Wings 2 front intake, 1x 120mm Silent Wings 2 rear exhaust
Monitor: LG 32GP750-B 1440p 165Hz
I'll be waiting for the Ryzen 10000 series to upgrade, as rumors suggest the 800X3D variant will be getting a core upgrade. Hopefully I won't need to upgrade my PSU, but I will need to upgrade my AIO... also hoping Intel releases their B700 series next year to see how it compares with my RX 6800.
What if NVIDIA is planning to use 3GB modules on the RTX 5060? Meaning 12GB on 128-bit. I really doubt that's the case, but you never know. I really hope people stop buying 8GB $300+ GPUs in 2025.
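For reference, the module math behind that speculation, as a minimal Python sketch using the 32-bits-of-bus-per-chip rule mentioned further down the thread (the function name and values are just for illustration):

```python
def vram_gb(bus_bits: int, module_gb: int, clamshell: bool = False) -> int:
    # One memory chip per 32 bits of bus width;
    # a clamshell layout doubles the chip count.
    chips = bus_bits // 32
    return chips * module_gb * (2 if clamshell else 1)

print(vram_gb(128, 2))                  # 8  -> the usual 128-bit config
print(vram_gb(128, 3))                  # 12 -> the 3GB-module scenario above
print(vram_gb(128, 2, clamshell=True))  # 16 -> how a 16GB 5060 Ti would be wired
print(vram_gb(192, 2))                  # 12 -> why a 192-bit 5070 lands on 12GB
```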
The 5060 is dead on arrival, no one should be getting an 8GB card in present day. The 5060Ti seems like it could be a good configuration, the memory headroom is nice although I think 12GB would be a better match for that class of card. The question is how much of an uplift it will be over the super cheap B580 which is already pretty much tied with the 4060Ti. The 5070 dropping back to 12GB is weird, as now we're reaching the point where the higher memory could actually matter. Above that it's disappointing to see nothing above 16GB until the 5090. Some games today already recommend that amount of memory for medium-high settings, so you definitely won't be future-proofed.
Considering how many GPUs there are in the stack, I always consider the 60s of the world (1060, 2060, 3060, 4060, etc.) to be in the 'low end' bracket, and the Ti versions to MAYBE be lower midrange. I mean, if those are midrange, does that make the 70s High End, the 80s Ultra High End, and the 90s MEGA Super Insane High End? There has to be a Low End somewhere, even if Nvidia doesn't price it as Low End and tries to convince people the cards are more powerful than they are with DLSS and Frame Gen.
If 16GB is good for midrange...
then explain the 5080's 16GB, Nvidia!
5060 8GB
5060Ti 16GB
5070 12GB
5070Ti 16GB
Nvidia's product stack is a mess in terms of VRAM capacity.
@@stennan The 5060 Ti will be 8 or 12GB, not 16.
Capping the VRAM is the most convenient way to artificially limit the productive life cycle of GPUs. Remember 2020, with the EUR 500 Radeon VII that came with 16 GB of expensive HBM2 memory?
They already said the 5080 is only for professionals.
The 5080 is now midrange; NVIDIA wants to force people to pay $2,000+ for a 5090 by making the 5080 look mediocre. The performance difference between the 5080 and 5090 will easily be 50%, and on top of that there's the VRAM limitation: 16GB won't be enough for 4K path tracing.
Finally! We needed a good GPU for 480p gaming with ray tracing. That RTX 5060 will be a beast.
Yeah, my CRT monitor needs some dustin' but it'll run.
@@verigumetin4291 Yup, nothing like a good old heavy 89Hz CRT monitor that draws 150 watts.
Weeeeell, 480p with DLSS of course. 360p native
@@alkoyyy Pfft! You're an amateur. I game at native 144p.
128-bit is not "midrange," it is flat-out low end!
Bus width alone doesn't tell you much. 128-bit in 2005 and in 2025 are not the same; as memory speed keeps increasing, you don't need a wide bus for a high data rate. Of course, 8GB is not enough no matter its speed.
96-bit is low end!
Just wait for the 6050 in 2026 with a 96-bit memory bus and 9GB of VRAM! Only $350!
my 1080 is 256 bit
@@haukionkannel i want my triple core back
It would not be an issue if the cards had a large cache, similar to AMD's Infinity Cache.
Keep buying nvidia people. You deserve it.
It's still the best there is. AMD is getting close, but until they surpass Nvidia people will continue buying the best on the market.
@@theonlyjinx-420 Only if you're one of the 2% of people with a very top end GPU.
For the absolute majority of gamers Radeon's the better buy in the low to midrange.
@@theonlyjinx-420 It really depends where you live, buddy, but I bought an RX 7900 XTX for 870 euros, which is better than an RTX 4080 Super at 1200 here. Idk, I don't feel like spending another 700 or so euros for an RTX 4090. Yeah, AMD cards need a little more power, but since power prices are relatively cheap here, I think AMD is the better pick either way, price- and power-wise.
@@theonlyjinx-420 Not for the low/midrange it isn't. The 30 series was the last time that was true.
I will keep buying Nvidia and I do deserve it.
When the 60 Ti is considered midrange, I'd say we all lost again lol. It's freaking ridiculous imo.
60ti is mid range. 70 class is upper midrange.
Come on we lost more than half a decade ago.
Huh? 60 Ti has always been mid-range. It was certainly never high-end.
@@oktusprime3637 As it stands they are 1080p lvl cards and are the low end imho.
@@TakaChan569 Lmfao no. Any card that can run the newest games on the highest settings is NOT low end, regardless of the resolution. The 50 and 60 ranges have always been lower midrange, but never low end. Despite that 8GB VRAM limitation, even the 3060 Ti still holds up for HIGH settings at 1440p gaming and EXTREME (or highest) settings at 1080p.
California, USA. I run a systems integration biz, and the numbers on the 4060 GPU are terrible. Furmark scores under 10k, while 4060 Ti scores are around 14k. Then the 4070 is around 17-18k depending on model and OC capabilities. How can the Ti version be almost a 40% increase in score? I say the 5060 will be entry level come 2025, probably with a $350-400 price tag to boot. If that's the case, I will write off all lower-end 50-series cards and go with Battlemage for entry builds, since they start at $250.
You're delusional
It's pretty obvious that the 4060 is really a 4050, while the 4060 Ti is the real 4060. It's all trickery on Nvidia's side.
The code names and chip names for the GPUs are completely different: the 4060 is AD107-400 and the 4060 Ti is AD106-350.
Just to reinforce what I've already said, there is no 4050 for desktop use, but there is one for laptops, and guess what... it uses the AD107.
The 107 is the designation for an xx50-series card.
The 4060 was $300; their $400 GPUs didn't sell well, so what makes you think they're going to sell you a $400 8GB card in 2025 as their cheapest GPU?
The 5060 will most likely be an 8GB GPU costing $300 with RTX 4060 Ti performance, and the 5060 16GB at $400-450.
@@insector2093 It's actually a bit different: higher-up SKUs saw colossal increases in CU count etc., but the midrange has remained stagnant. At this point, though, it's only splitting hairs.
Aaaaand here it comes, the 2000-dollar video card with 8GB of RAM.
Hahahaha, another broke guy from the AMD trash gang. Where are the AI and tensor cores on AMD? Even Intel already has them in its low end...
Exactly like shintel before Ryzen! Small increments all the time, but huge increases in price all the time.
Really, nshitia needs the same treatment Ryzen gave shintel!
Garbage comment.
@@riannair7101 I think Intel did that to themselves lol. Sure, Ryzen brought competition, but Intel had that mindshare and still competed well until 10nm was crazy late (the 11th series had fewer cores than the 10th series), and then came the whole 13K, 13KS, 14K, 14KS parade, where power increased 10% each time with 2% performance increases. They had such an easy time in the early 2010s that when AMD's mediocre competition came... they shit themselves. Not to take away from AMD... leading the way in chiplets and V-Cache was some foresight that Intel completely missed.
@@riannair7101 intel has literally kept prices fixed to inflation for close to 2 decades.
Intel's Battlemage B580 is $250, has 12GB of VRAM, and keeps up with the $400 4060 Ti.
The 8GB 4060 Ti should have been the $300 card, and the 16GB variant no more than $400.
It's more in line with the base 4060.
@@tonydavis8696 Performance- and CUDA-core-wise it's only a 50 Ti anyway, and the bus still holds the 16GB version back.
@@tonydavis8696 Crazy, 'cause for about $100 more you can get a 4070 Super brand new in some areas. FYI, that card is on par with an RTX 3090.
@@tonydavis8696 The B580 did not exist two years ago.
I made the switch to AMD (7900 XT) two years ago so I would never have to deal with Nvidia's "memory crap" again.
Listening to this video confirmed I made a good decision.
But you have to deal with not getting top performance or PT, and a lack of support in certain games. And if you're a sucker for visuals, AMD's FSR just can't compare.
@@chrisking6695 If he cared so much about visuals, I don't think he would've bought an AMD card. And the 7900 XT can easily run games at high performance at 1080p and 1440p, and it's also decent for entry-level 4K gaming.
@@chrisking6695 He obviously doesn't care about RT and PT.
@@chrisking6695 What's the point of having faster RT performance if Nvidia doesn't give you the necessary amount of VRAM to use it? Nvidia wants to sell gamers on these VRAM hungry features but won't give you the appropriate VRAM to take advantage of it.
It's funny how mad Nvidia fanboys get when folks who'd rather spend their money elsewhere AND still have a great gaming experience get to do so at around 70 percent of the cost. Excepting the 40-series TITAN (which Nvidia fooled folks into clamoring for by changing the TITAN branding to 90-series), with the 7900 XTX (half the cost of a 4090) they'd still get a great experience. It absolutely tears their souls to shreds, haha. It's gonna be hilarious when AMD disappears and Nvidia can charge folks 5 grand for a piece of crap that's more of a humiliation ritual than a product worth spending your hard-earned money on. But please, continue to buy "the best" with the gimped features they always, hilariously, use to convince you that you're living a lie by missing out. Here, I'll make it simple: you wasted a ton of money chasing specs, and you're angry that nobody really cares, haha.
So you have the RTX 5060 Ti and the RTX DOA. The 5060 Ti is probably going to cost 499 or 450... and the RTX DOA will be around 350 to start... The sad part is that even the 4060 core could do a lot more if it had the VRAM... Now the 5060 is theoretically even more powerful and will probably still lose to the 3060 once the VRAM buffer spills over... That's a two-generation-old GPU beating a new-generation one... Good job, Nvidia... bravo...
And this, right here, is exactly why you shouldn't blindly buy stuff off brand recognition alone...
Meanwhile, Intel's B580 has 12 GB of VRAM and is trading blows with the 4060 Ti for only $250.
Meanwhile, AMD's 8800 XT is launching soon, trading blows with the 4080 for an estimated $499.
NVIDIA is turning into the Apple of GPUs: meaningless updates, built-in bottlenecks with RAM to entice upgrades, planned obsolescence, selling glorified software at a ridiculous premium because they know a good slice of the market has been trained to lap up whatever they squirt out. When the 6090 comes out and costs $3500, y'all can thank yourselves.
@@AlexHonger-fj3nx The RTX 4060 beats the B580 at 1080p; the Intel card is garbage.
I have a 3070. The card is a beast, I get like 130fps in most games, but that 8GB of VRAM... Nvidia has really screwed us over, and I'd bet the 5060 non-Ti still won't be faster than the 3070. The point is, we have to keep buying these damn GPUs because we can NEVER have our cake and eat it too!!
@@4evahodlingdoge226 Have you actually watched benchmarks? The B580 matches or outperforms the 4060 in most games at 1080p and almost always outperforms it at 1440p (which it can also play confidently, because it has the VRAM to do so). The only upsides to the 4060 in gaming are that it sometimes has better 1% lows, which you will never notice, and that the B580 takes more power to run.
@4evahodlingdoge226 it's actually the other way around, plus B580 is cheaper.
Nice try, troll
Nvidia just doesn't give a shit anymore at this point. Since people keep buying them, what's the point of trying to make them have value?
Exactly 💯
Yet another generation from Nvidia to boycott. Oh well. Go with AMD or Battlemage.
Even though I mainly buy Nvidia I'll probably go with AMD this time(RDNA 4 most likely).
@masterlee1988 I've bought lots from both sides, and recommend both for friends and family with their builds, but I won't be supporting Nvidia until they change their ways. They can charge lots for AI cards, but the gaming market should not be gouged.
Yup, the upcoming RX 8800 XT looks like it fits me the most. If its launch price is too expensive, I'll just get an RX 7800 XT when the price drops. I do like what I hear about Battlemage so far, but I think I'll give them another generation or two before I really consider them.
Yep, got a 7900 GRE to pass my 2070 Super on to the wife. I can run highly modded games with high-max settings on ultrawide 1440p and I've been very happy, especially for the money I got it for. AMD has its little bugs and quirks software-side compared to Nvidia, but it's very livable and manageable tbh.
Are we just going to pretend that AMD didn't have an 8GB card? Also, actually buy the competition instead of just yapping in the comment section while AMD and Intel together hold 10% of the market. RDNA 4 will have an 8GB card too; call them out as well... no shifting goalposts this time.
It seems like Nvidia is going for GDDR7 for marketing purposes only while keeping the bus size small. It's probably cheaper to stay with GDDR6X with a bigger bus and more capacity, but no, that would make the card too capable, and Nvidia can't allow that.
Bus width is entirely dependent on how many memory chips are on the card: 32 bits per chip. Memory bandwidth is the important number, not bus width, and GDDR7 being significantly faster increases the bandwidth. A 16GB 5080 with a 256-bit bus using GDDR7 would still have nearly as much memory bandwidth as a 4090 with a 384-bit bus. There's also cache size as a variable, which increases effective bandwidth even more: the 3090 only had 6MB of L2 cache, while the 4090 massively increased that to 72MB. It might increase again with the 5000 series.
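A quick sketch of the arithmetic in that comment (Python): bandwidth is bus width times per-pin data rate, divided by 8 to convert gigabits to gigabytes. The 30 Gbps GDDR7 figure is an assumption, since final clocks weren't confirmed at the time:

```python
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    # Each bus bit carries gbps_per_pin gigabits per second;
    # divide by 8 to convert gigabits to gigabytes.
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gb_s(384, 21.0))  # 1008.0 -> RTX 4090, 384-bit GDDR6X @ 21 Gbps
print(bandwidth_gb_s(256, 30.0))  # 960.0  -> rumored 5080, 256-bit GDDR7 @ 30 Gbps
```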
@mojojojo6292 Bus width matters because bus saturation happens, and that causes latency. Just because the module itself is faster doesn't mean it can't be saturated. It's all cost margins; any other excuse for not even competing with first-gen HBM is sad and pathetic. It's all about saving on a per-module basis: cheaper modules, higher prices for the consumer. Mainstream GPUs should be on HBM at this point. With the bandwidth and capacities at current market prices, it should be a requirement.
@christopherfortineux6937 Last time we got HBM memory (from AMD) on consumer GPUs it was a big fat nothingburger; the only good thing that came from the rumours was the 1080 Ti...
@@christopherfortineux6937 No, memory bandwidth saturation happens. People need to stop conflating the two.
Additional memory bandwidth beyond a certain point is entirely pointless. Also, the majority of data needs to be transferred sequentially, and a wider bus does literally nothing for that scenario. With Vega and HBM, you got very impressive gaming performance uplifts when lowering the HBM's timings, and next to nothing when increasing its frequency. The latency matters; the width and "theoretical bandwidth" didn't matter anywhere near as much (which goes to prove my point).
I wouldn't get excited over a 16GB 5060 Ti; this just means the MSRP will be $499 again, which is atrocious.
Midrange price for a midrange GPU!
😂😂😂
There's always the option of buying the competition instead of whining in comment sections. Below the 5080, AMD will compete, so buy RDNA!
@@mikelay5360 I already have a GPU, but no, I won't stop "whining" about greedy companies ripping off fellow hard-working people. None of these companies seem to care about entry level other than Intel right now.
Looking back, I most likely made a bad decision, but I got a 4060 Ti 16GB for $430. If I want something in the near future, I would like to see what AMD's 8000 series can offer.
I'm all AMD GPUs now. Not giving my money to Nvidia. Three PCs: one with an RX 7800 XT, one with a 7700 XT, and one with a used 6700 XT I bought recently. Sensible VRAM and memory bandwidth configs at a sensible price. The fact that Nvidia is still keeping 12GB for the 5070 is kinda disappointing; it will probably be very expensive, and it should have had 16GB. Also, a 5060 with 8GB is ridiculous at this point, and it will probably be expensive as well.
2 PCs, one with a 7800 XT and the other with a 7700 XT. I've been all AMD for 7 years now.
Me too. Bought a used 6700 XT and decided to trade in my 4060 8GB for a 7800 XT. Sadly I still have a 4060 Ti 16GB and a 3060.
Considering how crazy the scalping is on the Arc B580, that is my biggest concern for the 50 series
Welp, Intel did better with the B580.
Intel sells the B580 at a loss, or at no profit. NVIDIA sells the 5060 at 70% profit... So...
@@haukionkannel What?
@@haukionkannel Is someone at Nvidia paying you to say dumb shit like this?
@@haukionkannel Ok? How does that affect me, the consumer?
@@haukionkannel I don't think any company sells anything at a loss. Especially AIB partners; why would they make partner cards if they weren't making money?
Such a shame they are gimping the 60 Ti again... I might just keep using my 3060 Ti with its 256-bit bus. There was already no point in going to the 4060 Ti...
What exactly do you want from an entry-level GPU? The 60 class is entry level; if it doesn't look enticing to you, then maybe it's time to go up a tier.
@@wolfstorm5394 The issue is that these cards are usually more expensive than their competitors, and Nvidia keeps gimping its fans on memory. Clearly they want you to spend more for a 70, 80, or 90 series card. Their 60 series needs a rework or a price reduction. Just Ngreedia at it again.
@@Fry000 Well geee, I wonder why that might be the case. The thing is, there's way more demand for Nvidia GPUs than for any other; even just looking at the Steam hardware survey you can see that over 70% of users are on Nvidia GPUs. Prices go up when there's high demand... that's how it works.
Edit: just checked the market share again, and it looks like Nvidia is clocking in at 88% of the desktop GPU market as of 2024. There's literally no competition at this point.
@@wolfstorm5394 I believe he expects generational improvement, not a downgrade in die and bus width. Nvidia has been relying on architectural improvements to pick up the slack, but in doing so the performance of the xx60 cards has been stagnant. How is it that the 2060 Super is still a relevant comparison, lol? The 5060 needs to see actual improvement.
@@itwasntoptional2513 And you're saying there haven't been any improvements? You're missing the bigger picture; a 4060 is way better than a 2060 Super.
The 5060 should have been 12GB.
Nah doesn't make sense
Why did the RTX 3060 base model have 12GB?
Impossible, 8GB of VRAM is enough for all new modern games, stop being biased.
-nvidia
@@arlynnfolke Didn't Nvidia put out a statement admitting that 8GB was inadequate for modern games?
@@arlynnfolke The RX 580 had 8GB years ago.
It's so easy, but Nvidia just doesn't care. VRAM configurations should be as follows:
5060 12GB
5060 Ti 12GB
5070 16GB
5070 Ti 16GB
5080 20GB
5090 24GB
They keep skimping at the bottom end whilst giving the flagship card far more than it's ever likely to need. Considering the stupid prices of the 90-class cards, I suppose a case could be made for 32GB. Just to be sure you definitely never, ever run out.
Surprisingly, I have run out of VRAM on a 4090 when playing ray-traced War Thunder.
12GB for 1080p wtf.
5060 is fine with 8GB
5060Ti should be 10GB and 4070 14GB
That's more than enough and you can't prove otherwise (I don't care about the 3 poorly optimized games that need 16GB for 1080p Ultra Textures+DLSS+FG+Path Tracing)
People are speaking nonsense about how much VRAM you need.
@@mafi978 The 4060 is already lacking with its 8GB in newer games, even at 1080p. The 3060 was performing better than the 4060 in the new Indiana Jones game.
@@mafi978 There are plenty of benchmarks showing 8GB is not enough for 1080p. 10GB is the bare minimum, and high-end 1440p cards should have 16GB for the future.
@@sheepmasterrace The most I have used is around 20GB, in Indiana Jones recently, I think.
50-class = entry
60-class = low end
70-class = midrange
80-class = high-end
90-class = enthusiast
The problem is, each class is named one class higher than what they actually are.
Realistically...
50-class = garbage
60-class = entry
70-class = high end
80-class = enthusiast
90-class = micro-epeen
Rather than buy a 5060 Ti or whatever, buy a B580 and call it a day. Time for Jensen to learn a lesson. The 5080 only having 16GB is a joke. Can't wait to see the prices for this gen; I'm betting it'll be laughable.
And it will sell more than anything else... Most likely NVIDIA's market share goes up from the 88% they have now, and they will increase prices for the 6000 series!
I'm curious how much VRAM a B770 might have. If they ship with 24GB, that could be a great workhorse.
Dear Nvidia. 70 class GPUs are 256-bit or higher. 60 class GPUs are 192-bit or higher. 50 class GPUs are 128-bit. Anything smaller is barely a GPU.
KTHX - BAI
If people didn't buy so many 4060s last gen, Nvidia would have actually given more performance to the 5060.
In 2025 we will have $300 8GB cards. Eight years before that we had $300 8GB cards, and eight years before that we had $300 512MB cards (VRAM went up 16x in those eight years).
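For scale, a quick sketch of the growth rates implied by that comment (Python; the years and capacities are as given above, not independently verified):

```python
def annual_growth(start_gb: float, end_gb: float, years: int) -> float:
    # Compound annual growth rate of VRAM capacity at a fixed price tier.
    return (end_gb / start_gb) ** (1 / years) - 1

# ~2009 -> ~2017: 512 MB -> 8 GB at the $300 tier (16x, per the comment)
print(f"{annual_growth(0.5, 8, 8):.0%}")  # ~41% per year
# ~2017 -> ~2025: 8 GB -> 8 GB at the same tier
print(f"{annual_growth(8, 8, 8):.0%}")    # 0% per year
```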
One of the big lies from commenters and influencers is comparing end-of-life sale prices to next-gen MSRP, especially when MSRP isn't real for any country other than the USA. And even there, they have been called out several times for making like 100 cards at MSRP and then completely stopping sales, whether officially or just through no availability. The truth is, I've bought every 80-class Nvidia card since the 580 except the 2080, and each and every time the price rose by roughly $150. It was never "the same" between two gens, never.
Yeah. Look at the graphics in games. Same for the last 10 years.
Nvidia is not an option for me. I don't want a 5060 with 16GB, nor the next SKU up with only 12GB.
Looking forward to seeing what AMD and Intel have to offer.
Releasing a card with 8GB of VRAM will make it obsolete on day one 😂😂
The 50 series should've been the generation where 8GB is nonexistent.
5080 with 16 gig is a scam!
It's not a scam, it's profit
For nvidia
If they sell it at or below $1200 it's okay, assuming gaming performance is close to the 4090.
@mikelay5360 It won't go anywhere close to the 4090 with that small a number of CUDA cores; 15-30% faster than the 4080 Super at best.
@wwk279 😂 I won't argue, we've got less than a month to put speculations to bed.
@@mikelay5360 How is it okay at $1,200? If it's 4090 performance, that's only around a 30% uplift, so you would be paying 20% more for 30% more performance. That's barely moving the performance-per-dollar mark at all. That would be a terrible upgrade, LOL. It needs to stay below $1000.
The Nvidia CEO is an AI! That's why he always wears the same jacket! 😂
A 60-class card being called midrange is one of the terrible things wrong with the current GPU market. Facepalm.
60-class cards had always been midrange until recently: the 10-50 class cards were low end, the 60-70 class cards midrange, and the 80+ class cards high end. That has shifted now.
The GTX 560 Ti was a really decent midrange card. It ran most games at 1080p maxed out back when it launched, and in 2011, 1080p was in the same position 1440p is in these days.
@@stangamer1151 Yeah it was; my brother had one and I had a 570. The 560 Ti held its own really well back then.
Just wait for the 6050 in 2026 using a 96-bit memory bus with 9GB of VRAM... that will be the low end!
Screw this. I'm going AMD. RX8000 where you at.
Wow, you are fast. I just read about one of the items today in my news feed.
Wtf is the point of 8GB GPUs? In 2025 and beyond, games will use more than 8GB even at 1080p low/medium settings. We are already seeing this with Indiana Jones.
web browsing
@@juggernaut316 you don't need a gpu for web browsing lol
I don't understand the huge, insane jump from the 5080 to the 5090 when it comes to VRAM. So you're trying to tell me the 5060 Ti and 5080 are both gonna have 16GB? That's crazy.
To upsell as many potential 5080 buyers as possible to the 5090. It's the 4080 all over again. Ironically, it mostly results in a downsell to the xx70 Ti (4070 Ti Super), given the terrible cost-per-frame proposition of the xx80 and the huge jump in absolute cost to the flagship.
They are just inviting AMD to release a 5080 killer by offering only 16GB.
Considering the 60 class has been Nvidia's lowest desktop tier, I can't in good conscience call those cards "mid" range. They have been closer to the low end, with a midrange price tag.
My thoughts exactly. The 60 class is barely able to deliver a stable native 60 fps in modern games at 1080p, and I'm sorry, but Full HD in 2024 is not "mid" range. Also, why do gamers keep accepting 30 fps? C'mon guys... it's sluggish as heck. I used to play at around 45 fps on my first PC back in 2005, and it's almost 2025. With today's fast monitors capable of 480Hz or more, once you experience 90+ fps even 60 feels a bit choppy. GPUs in general aren't cheap. We pay $300 and have to rely on upscaling tricks in order to play with decent fps, but in the process we get a shimmering image, full of artifacts and ghosting in motion? Wth is this?
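To put numbers on that choppiness, here's the frame-time arithmetic for the refresh rates mentioned above (a trivial sketch; the conversion is just 1000 ms divided by fps):

```python
def frame_time_ms(fps: float) -> float:
    # How long each frame stays on screen at a given frame rate.
    return 1000.0 / fps

for fps in (30, 45, 60, 90, 480):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
# 30 fps holds each frame for 33.3 ms, twice as long as 60 fps (16.7 ms),
# which is why the step from 60 up to 90+ fps (11.1 ms) is still clearly visible.
```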
@@Patryk122PL "$300 for upscaling tricks to play with decent fps." Game optimization is a big problem these days. It's crazy.
@@Patryk122PL 100% agree. I hope for our sake that no one buys the 5060 cards. I truly hope the Intel cards take over the low end, and that AMD gets its act together with RDNA/UDNA 5 and brings some much-needed competition back to the high end. I can't argue with Nvidia's top-end GPUs, but people need to stop buying the bottom-of-the-barrel garbage that Nvidia is pushing out with the 60- and 70-class cards.
They will release a low-end 5050 GPU in 2026 with a 96-bit memory bus and 9GB of VRAM at $350. That will be the low end, for 1080p with low settings!
😂😂😂
The 60 class was considered midrange because those cards used to deliver half the performance of the flagship, hence midrange: a 1060 was 50% of the performance of a 1080 Ti, but nowadays a 4060 is only 30% of a 4090, and you need to move up to a 4070 to get 50% of the 4090's performance.
Matching the current-generation console in VRAM is the minimum requirement for any PC 'gaming' card. Games get made for that generation's consoles, so any 'gaming' card needs to match them at a minimum, or it's not a 'gaming' card.
For this generation that's 16GB. Assuming nothing larger comes out by the time these cards launch, if it's not 16GB it's not a gaming card.
I guess with the 50-series cards, that makes the 5060 16GB version the entry-level gaming card... makes me wonder what Nvidia's entry-level PC gaming card pricing is going to be.
The 5060 is entry level and should be considered for 1080p gaming. Even given the increase in texture sizes in recent titles, I still couldn't possibly see ever needing a frame buffer larger than 16 gigs at the intended resolution.
These specs are underwhelming, to say the least.
3:42 I love how Nvidia made 139 fps with DLSS off look laggy, as if 139 fps was not smooth.
So Blackwell is another DOA generation.
And it will sell like hotcakes... so no wonder NVIDIA does this!
@@haukionkannel The 5060 will also be slower than the 3060, then.
Another trillion dollars of profit... huge win for Nvidia and a major blow to AMD again. AMD lost 100 billion in stock market value over the whole year. Big OUCH!!!
@@nipa5961
?
In some cases, but do you think the people who buy NVIDIA no matter the cost would care?
Well, with Nvidia still doing an 8GB DOA version, it looks pretty bad for us gamers.
A 12GB minimum was a must. Look at Intel.
The B580 is about the size of a 5070... so it has the same amount of VRAM as the 5070.
They just sell it at no profit because it is so bad...
XX50 - Budget Entry
XX60 - Low-end
XX70 - Mid-end
XX80 - High-end
XX90 - Enthusiast.
This is what I conclude from their SKUs.
I have two options: buy the RTX 5070 Ti or the RX 8800 XT. I can't wait to see the official specs and benchmarks.
Same here, though I'm more likely to go for the 8800 XT (depending on price and performance).
Buy a B580 instead.
Not buying anything below 16GB. Indiana Jones and the Great Circle is a good example for the ones who used to say 8GB is enough for gaming.
5060 Ti = 16GB
5070 = 12GB
5070 Ti = 16GB
🤨🤨🤨🧐🧐🧐
Is it just me, or does this make no sense? Why cap the 5070 at 12GB if you're gonna give the 5060 Ti 16GB?
Because the 5060 Ti has a 128-bit bus and slower memory, while the 5070 has a 192-bit bus.
@@ZackSNetwork It still makes no sense... why not make the 5060 Ti, 5070, and 5070 Ti all 16GB, and give the 5080 20GB?
NVIDIA has been doing the same shit since the RTX 30 series, where they made the 3060 12GB but the 3060 Ti and 3070 both 8GB.
The 4060 Ti 16GB only exists because NVIDIA wanted to prove AMD wrong when they got called out for gimping VRAM.
RX 7000 had the same thing.
It means Nvidia is once again making fun of clowns who bank everything on VRAM, because at the end of the day 12GB on a 70-class GPU will go way further than 16GB on an entry-level 60-class GPU. And besides, most people will be buying from the 60 class anyway; everybody gets excited about more VRAM on high-end GPUs, but we all know that at the end of the day the majority settles in the 60 class.
@@Ad_Dystopia That's because AMD copied what NVIDIA did and released the 7600 XT with a clamshelled 16GB... AMD would never have made the 7600 XT 16GB if NVIDIA had not released the 4060 Ti 16GB.
5060: 128-bit, 12GB
5060 Ti: 192-bit, 16GB
That's what it would have been in a sane world.
If they don't have 16GB for the 5070, then screw Nvidia. Intel Battlemage is good enough now.
4060? 5060? This video is confusing.
It's a 3060 Ti Super.
If you don't know anything about all the tech words, this vid is 8:45 of gibberish.
😢 I am out. Too expensive. Let's hope the AI craze gets some real competitors so they can lower prices across the board.
Suspiciously large gap in performance between the 5090 and 5080... 5080 Ti or Super next year with 24GB of VRAM? 🙂 It is almost as if the 5070 Ti was relabelled as the 5080, all the cards below got a name upgrade, and the real 5080 will release next year. It seems at every release, a lot of the increase comes from the new GDDR speed and frame gen.
The 5070 Ti should be 15% faster than a 4090 at $1200.
It'd be cheaper to get a used 4090 then.
A near-high-end chip like an xx07 never closes the gap with a super-high-end one like an xx09 in just one generation; it takes at least 2-3 generations. Welcome to the monopoly game.
@@imtrash2812 Blackwell is a new architecture, we can expect big improvements. You don't know what you're talking about...
It won't be; 5% at best.
@@DasJev The 3090 came in late 2020 for $1500; the 4070 Ti beat it in late 2022 for $800.
I'm happy with my 7900 GRE purchase.
I don't think the 5060 Ti will be a 16GB-only card; it'll have the same configurations as the 40 series, i.e., both 8GB and 16GB versions of the 5060 Ti.
The thing is: nVidia can do whatever they want, people will buy it.
Simple as.
And if their "oopsie" causes trouble, like the Memory Configuration of the 30 series, people will go online and defend it with their lives and blame everyone but nVidia and themselves for it.
*NVIDIA, not "nVidia".
That 5090 with a 33%+ TBP increase is going to need a nuclear cooling tower at this rate. It's crazier than the Pentium 4 at this point.
600W for the GPU alone.
The 5090 is not a consumer card. Nvidia noticed people wanted to buy them for games, at the cost of three PlayStation 5 Pros 🤷♂
The 4060 Ti has a 16GB version too, and the 5060 Ti is low range. The 70 series, like the 5070 and up, is mid-tier.
The 16GB version of the 4060 Ti is ABSOLUTELY POINTLESS. It has the same memory bus as the 8GB version, just bigger modules, so no real improvement for the extra 100 dollars.
@@TheIgor449 Well, that is a regarded take. The 4060 Ti literally has more RAM; how is that pointless?
@@ssl3546 Both versions of the 4060 Ti give the same FPS; just look at some random videos comparing the two cards.
@@TheIgor449 So it reminds me of the GTX 960 2GB and 4GB versions. Is that a fair comparison?
@@TheIgor449 Not when you start adding things like higher-resolution textures and frame generation. HUB have shown videos where the 4060 Ti 8GB tanks due to running out of VRAM while the 16GB version is still giving playable framerates. The only bad thing about the 4060 Ti 16GB was the extra $100 on an already overpriced card.
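If you want to catch that VRAM spill on your own card, here's a minimal monitoring sketch. It assumes the `pynvml` package (NVIDIA's Python bindings for NVML) is installed; run it in the background while gaming:

```python
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        # A near-full buffer is when textures start spilling into system RAM
        note = "  <-- near the limit" if mem.used / mem.total > 0.95 else ""
        print(f"VRAM: {used_gb:.1f} / {total_gb:.1f} GB{note}")
        time.sleep(2.0)
finally:
    pynvml.nvmlShutdown()
```

One caveat: allocated VRAM is not the same as VRAM a game actually needs, so treat sustained near-100% readings combined with frame-time spikes as the real signal.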
Do you think it’s likely that Nvidia comes out with some sort of 5090 AI or 5090 Titan card with the 3GB modules?
24GB should be start for 5080s 16GB should be start for 5070s, 12GB should be start for 5060s. Come on Nvidia 8GB is not cutting it in games, they're demanding over 8GB right now or the games stutter whilst swapping out with the PCs main memory. If you're going to charge premium prices then you can't deliver a non-premium product and that includes 5060s. Otherwise people might as well by an intel or AMD GPU
At least give us 20GB for the 5080 since we are being overcharged anyway. OK, maybe even 18, Jesus.
A 128-bit bus is a joke, that's it. I am not buying their mid-range until they get their shit together
They don't care when 80% of the market buys from them
These mid-range cards have more VRAM than my current build has RAM lmao.
Anyone paying any serious money for less than 16GB of VRAM and a low bus width is mental - won't be me
I have a working theory that Nvidia are deliberately lowballing their releases to prevent the public from being able to develop AI unrelated to the giant companies. It all makes sense if you think of it this way
You could be right. Note they removed the ability to bridge cards (NVLink) with the 4090. Being able to do that on the 3090 allowed small businesses to run models that wouldn't fit within 24GB of VRAM.
They would rather sell their limited supply of TSMC silicon to AI corps for 40x more so make the gaming cards look as unattractive as possible.
@@JimmyChino55 100%
The GTX 1060 had a 192-bit bus. It was one of the best "midrange" cards ever and held its own for quite some time. Hearing that the 50 series equivalent is going to have only a 128-bit bus shows that Nvidia is purposely hobbling these cards, as they did with the 40 series, because they're afraid of another "10 series" situation where folks don't need to upgrade their cards for several years. At the end of the day, it's all about greed.
I'm running a 3080ti FE that has a 384 bit bus.
I'm not paying more than $1k for a 5080, and these lower models just look like a bit of a joke at this point.
Nvidia is outright cutting hardware power and trading it for their own proprietary features. They're swapping hardware for software and still want to price gouge their customers.
I sat out the 40 series, and I'm more than happy to sit this one out too if they don't produce something with a good price to performance ratio.
A 5080 with 84 SMs next to a 5090 with 170 SMs makes the 5080 a scam. The 5080 should have been the 5070 - I mean, the 5070 should have had the 84 SMs, and the 5080 should have had 128 SMs.
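To put that complaint in perspective, here's a rough ratio check using the rumored Blackwell SM counts from the video plus the published 30 and 40 series numbers (treat the 50 series figures as rumor, not spec):

# SM counts: 80-class card vs the flagship of the same generation
pairs = {
    "3080 / 3090": (68, 82),
    "4080 / 4090": (76, 128),
    "5080 / 5090": (84, 170),  # rumored, not confirmed
}
for name, (eighty, flagship) in pairs.items():
    print(f"{name}: {eighty / flagship:.0%} of the flagship's SMs")
# 3080: 83%, 4080: 59%, 5080: 49% - the 80-class keeps sliding down the stack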
How to read Ngreedia's language:
5060 ti --> 5060
5060 --> 5050
600W TBP for the 5090 and 400W TBP for the 5080: the power draw is a lot for both cards. Are we going to see any issues with the melting cable once more, or has that been fixed?
Sad to see people buy Nvidia cards like a herd and support this, while the competition offers much superior products at lower prices
The lowest half-decent card is the 5070 Ti. The rest are way too gimped. And, again, they are coming out with this crap of pushing 16GB on low-end cards while putting 12GB on a higher-tier card. Why? It doesn't make any kind of sense beyond "selling VRAM to people who don't know any better".
Nvidia hasn't listened to anything but their own greed, seemingly doing their best to make PC gaming something only the affluent can afford... the only value to be found is at the highest end.
If anything I'll try getting a used 4080 Super off someone upgrading
They know VRAM is a valuable asset, so they ration it instead of letting it eat into profit. Because of their monopoly, they can dictate what the consumer gets.
Yeah, onward to rooting for AMD's and now Intel's GPU lineups... Nvidia is crazy if they think I'm buying this at those prices
It’s very strange that they are shipping the 5070 with 12GB when the lower 5060Ti model has 16GB
The only thing crazy about nvidia is the prices...
I don't care what Nvidia does. Didn't care before either. I'm only looking at AMD and Intel for my gpu needs.
Looks like we are getting 50-class cards masquerading as 60-class cards again. The 60 Ti cards used to have a 256-bit bus, and the 5060 Ti will have the same memory bandwidth as the 3060 Ti at 448GB/s
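That 448GB/s claim is easy to verify, and it shows why bus width alone is misleading: a narrow bus on fast memory can match a wide bus on slow memory. A sketch, assuming the rumored 28 Gbps GDDR7 for the 5060 Ti against the 3060 Ti's published 14 Gbps GDDR6 (bandwidth_gbs is just an illustrative helper):

def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps  # GB/s

print(bandwidth_gbs(256, 14))  # 448.0 GB/s - 3060 Ti, 256-bit GDDR6
print(bandwidth_gbs(128, 28))  # 448.0 GB/s - rumored 5060 Ti, 128-bit GDDR7

Two generations later, the exact same bandwidth - which is precisely the stagnation being complained about here.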
Stop buying outdated 8GB GPUs!
The xx60 class of GPU is low end. Always has been. IMO it's the lowest anyone should consider when buying new, before looking into something used at the same price or cheaper with just as much, if not more, performance.
I wish AMD could offer something with 4070 Ti-like performance, very good efficiency, and a 2-slot design. I had a 7800 XT, but once I moved to an ITX case I couldn't use it anymore
Here's hoping AMD's next cards are kick-ass. Sick of the NVIDIA tax....
My guess is $499 for the 5060 and $649 for the 5060 Ti, both as PCIe 5.0 x4 cards, so no real upgrade path for older boards (see the bandwidth sketch after this thread)
You can put them in PCIe 4.0 without losing performance, I don't know why you're lying
xd
I'm sure that because of your comment Nvidia will sell two million fewer, hahahaha
If that happens then they will sell nothing; people would rather get a 4070 or 4070 Super lol
@@nicane-9966 That's their ideal. Because of the tariffs that are coming, pricing will be set to deter low- and mid-end seekers and push them into buying only the high-end GPUs (the Decoy Effect). At the same time they'll slash 40 series prices in first-world countries to make sure they stay dominant in the GPU markets that matter (it also serves as great PR, like the unlaunch of the 4080 12GB)
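To put numbers on the PCIe x4 question a few comments up: a Gen 5 x4 link carries roughly the same bandwidth as Gen 4 x8, so dropping a 5.0 x4 card into a 4.0 board halves the link to about 8 GB/s; whether that costs real FPS mostly depends on how often the card spills out of VRAM. A rough sketch (per-lane figures are the standard PCIe spec rates after encoding overhead; PER_LANE is just an illustrative table):

# Approximate usable bandwidth per lane, in GB/s
PER_LANE = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969, "PCIe 5.0": 3.938}

for gen, lane in PER_LANE.items():
    print(f"{gen} x4: {lane * 4:.1f} GB/s")
# PCIe 3.0 x4:  3.9 GB/s - painful for an 8GB card that swaps textures
# PCIe 4.0 x4:  7.9 GB/s - usually fine until VRAM overflows
# PCIe 5.0 x4: 15.8 GB/s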
These are my system specs:
Motherboard: MSI MAG X570S TORPEDO MAX
CPU: 5800X3D
GPU: ASRock Challenger Pro RX 6800
RAM: G.Skill Ripjaws (2x16GB) 32GB 3600MHz
Storage: 2x 2TB Samsung 980 Pro
Cooler: Corsair Hydro Series H150i PRO RGB 360mm Performance Liquid CPU Cooler (fans replaced with 3x 120mm Silent Wings 3 as exhaust)
Case: be quiet! Silent Base 802
PSU: Corsair TX850M 850W 80 Plus Gold Semi-Modular Power Supply
Fans: 2x 140mm Silent Wings 2 at the front for intake, and 1x 120mm Silent Wings 2 as rear exhaust
Monitor: LG 32GP750-B 1440p 165Hz
I'll be waiting for the Ryzen 10000 series to upgrade, as rumors suggest the 800X3D variant will be getting a core upgrade. Hopefully I won't need to upgrade my PSU, but I will need to upgrade my AIO.... Hoping Intel releases their B700 series next year to see how it compares with my RX 6800
So Ti means at least 16GB VRAM. At least there is one common thing within the 5000 series. It's been a mess of names so far..
What if NVIDIA is planning to use 3GB modules on the RTX 5060, meaning 12GB on 128-bit? I really doubt it's the case, but you never know. I really hope people stop buying 8GB $300+ GPUs in 2025.
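The module arithmetic behind that guess: GDDR chips attach on 32-bit channels, so a 128-bit bus takes four chips, and capacity is channels times module size (doubled in a clamshell layout with chips on both sides of the board). A sketch, with the 3GB GDDR7 modules being the speculative part and capacity_gb an illustrative helper:

# Each GDDR module occupies one 32-bit channel
def capacity_gb(bus_bits, gb_per_module, clamshell=False):
    channels = bus_bits // 32
    return channels * gb_per_module * (2 if clamshell else 1)

print(capacity_gb(128, 2))                  # 8GB  - 2GB modules, the status quo
print(capacity_gb(128, 3))                  # 12GB - the speculative 3GB-module case
print(capacity_gb(128, 2, clamshell=True))  # 16GB - how the 4060 Ti 16GB was built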
Congratulations! 5060 is just as good as the B580 for twice the price. lol
An 8GB card is DOA in 2025; waiting on Battlemage to arrive.
The 5060 is dead on arrival, no one should be getting an 8GB card in present day.
The 5060Ti seems like it could be a good configuration, the memory headroom is nice although I think 12GB would be a better match for that class of card. The question is how much of an uplift it will be over the super cheap B580 which is already pretty much tied with the 4060Ti.
The 5070 dropping back to 12GB is weird, as now we're reaching the point where the higher memory could actually matter.
Above that it's disappointing to see nothing above 16GB until the 5090. Some games today already recommend that amount of memory for medium-high settings, so you definitely won't be future-proofed.
8GB GPUs should be sub-$200. $150-170 is all you'd get out of me for an 8GB GPU.
Considering how many GPU's there are in the stack, I always consider the 60's of the world (1060, 2060, 3060, 4060, etc) to be in the 'low end' bracket, and the Ti versions to MAYBE be the lower mid range. I mean, if those are mid range, does that make the 70's High End, 80's Ultra High End and 90's to be MEGA Super Insane High End? There has to be a Low End somewhere, even if Nvidia doesn't price it as Low End and tries to convince people they're more powerful than they are with DLSS and Frame Gen.
So the 5060 Ti is gonna get 16GB of VRAM, but the 5070 is gonna get 12!? Thanks Jensen, very fu***ing useful!
Nvidia's chase for profit margin will be their downfall.
Memory is expensive? That's really not an excuse given how expensive Nvidia cards are across the entire range.
I will be upgrading from a GTX 1660 Super to the RTX 5090. I can't wait!!!
My 4060 has 16GB. What do you mean at 3:27?
Nvidia just don't care; a 128-bit bus is wasting the GDDR7