No, it won't drive me crazy, because I was totally expecting it. And as long as Nvidia dominates that market, they won't have ANY incentive to change their ways.
If you look at raw numbers and what GPUs can do in theory, 40-series cards and the 7900 XT SHOULD last 5 years MINIMUM. But games these days are so poorly optimized, and the graphics don't do the system spec requirements justice. Cyberpunk with RT off runs really well. HZD Forbidden West. God of War. But name a UE5 game besides Fortnite and BAM, performance sucks. The Last of Us port? Lol, that port was a joke.
Does anyone have any idea which would cost more:
a) Slightly smaller chip with a 128-bit bus and 8 x 2GB modules of GDDR7
b) Slightly bigger chip with a 192-bit bus and 6 x 2GB modules of GDDR7 (actually on 192-bit they could even use GDDR6)
I'm really curious.
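For what it's worth, you can at least put numbers on what each option gives you on paper. A rough sketch; the per-pin data rates (28 Gb/s for GDDR7, 20 Gb/s for GDDR6) are assumptions based on commonly reported figures, and option (a) assumes a clamshell layout, since 8 chips on a 128-bit bus means two per 32-bit channel:

```python
def peak_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth = (bus width in bits / 8 bits per byte) * per-pin rate in Gb/s."""
    return bus_width_bits / 8 * pin_rate_gbps

# Option a: 128-bit bus, 8 x 2GB GDDR7 modules (clamshell -> 16GB total)
a_capacity_gb = 8 * 2
a_bw = peak_bandwidth_gb_s(128, 28.0)

# Option b: 192-bit bus, 6 x 2GB modules (12GB total), GDDR7 or GDDR6
b_capacity_gb = 6 * 2
b_bw_gddr7 = peak_bandwidth_gb_s(192, 28.0)
b_bw_gddr6 = peak_bandwidth_gb_s(192, 20.0)

print(f"a) {a_capacity_gb} GB at {a_bw:.0f} GB/s")
print(f"b) {b_capacity_gb} GB at {b_bw_gddr6:.0f} (GDDR6) to {b_bw_gddr7:.0f} (GDDR7) GB/s")
```

Which die is cheaper to make is a separate question (die area for the wider memory controller vs. two extra memory packages), but the capacity/bandwidth tradeoff at least is easy to pin down.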
If it needs 16GB for 1080p, which is where the bulk of PC gamers play, it will get very few sales, so no, that's not going to happen. 8GB of VRAM will run it just like it runs every other game.
Shameful behaviour from Ngreedia. Hopefully they sell like shit and maybe they'll learn their lesson, but I find it unlikely considering the masses love team green.
Nice review. Building a new PC but holding off on the GPU. My 2080 Super and 6600 XT are fine for now. Do you see any value in SLI with a 2080 and a 3090 or 40 series? The used high-end 24GB units are great value now, and 24GB + 11GB with SLI, or a 6600 XT and a 24GB card with CrossFire, has to be worth something? Don't see value in the 50 series as it will be a transition to a new chip. Unless they are ready to announce and they do an Apple with M4-style performance?
$300 is not the price of an entry-level GPU. Under $150 is the price of an entry-level/budget GPU. Imagine paying $300+ for a mid-range GPU just to play the latest games at 1080p low 🤡
Only cards that make sense: a) 5090 b) 5070 Ti. The rest are a joke. There is no point getting the 5080 over the 5070 Ti unless it comes with 20 or 24 GB.
It's a wonder that Nvidia did not put a 64-bit bus on this thing to mostly negate the advantage of the GDDR7 RAM, so that it would be just barely faster than the 4060.
"you keep buying them so GPU prices remain high" - your average gamer that thinks GPUs are only useful for games and is completely missing the AI/ML applications transforming the tech and research industry helping to develop medicine and sequence genomes 95% faster than previously possible.
If the 5070 Ti is equal to the 4080 or 4080 Super, why call it entry-level 4K? Or is the 4080 Super entry-level 4K too? I just find it curious how, when a new gen comes, suddenly the new ones are entry or mid level when matching the most powerful ones from just a few days ago.
For current-gen gaming, yes, that is entry-level 4K performance. Honestly it is not even a true 4K card for UE5; you will need DLSS upscaling. You have to base performance tiers on current-gen gaming, not old games.
The 5060 and 5080 will both likely be dead on arrival. It's 2025. 1080p graphics are technically still relevant, but for anyone who has already made the transition to 1440p and above, 8GB GPUs are now totally out of the equation. Even at 1080p, some new game titles are exceeding 8GB of VRAM at high/ultra settings. It's also going to be a tight squeeze for the 5070 at 1440p, especially on ultrawide monitors. The target market for the 5080 is probably 4K gamers, but again, that's double the pixels of 1440p with an even smaller VRAM-to-CUDA ratio.
If Nvidia put more VRAM in their cards, developers would just get lazier and do even less optimization. 16GB of VRAM will be the new 8 sooner than you might think.
But Nvidia _do_ need to put more VRAM in the 5000 series. Newer cards need to be better than older cards, or there's no point to buying them. It's that simple. It doesn't matter if it makes developers lazy.
I would like to point out something that you said and several others say, even Valve/Steam, without context. Yes, the 4060 and 3060 sold tons of units and show that on platforms like Steam. But there is a reality no one covers. Internet cafes in China (which outnumber every household PC owner combined in all 50 US states and all 4 countries of the UK) order the 3060/4060, and next the 5060, in bulk. My point is that if a site showed AMD/Nvidia sales by series and region, the 3060 and 4060 would be nowhere near those 95% numbers shown on Steam, as China has quadruple+ the Steam audience of the Americas, UK and Europe. It is a numbers thing at the end of the day. I will also add that the Steam figures count individual users by login, not owners. A single PC in a wifi shop can have 10 unique users per day, 7 days a week. The number is inflated. When we were in Beijing last year, I almost fell over seeing 100s of wifi cafes packed, all of them using 3060/4060 GPUs and having bins of used 1060s/2060s for sale. The numbers are grossly inflated, and what is sold is sold in bulk in China at very different prices to the cafes.
Hi Dannyz, I've been into PC gaming for over twenty years. I predict getting my hands on a new 5090 will be in the third or fourth quarter of 2025 at the earliest. I like this card, but I like my money more.
I bought the RTX 4060 Ti 16GB a year ago as an upgrade from my RX 570. The RTX 5060 I will have to pass on, and I'll wait for the RTX 4070 Ti Super if prices come down.
Nvidia keeps charging these stupid prices because consumers keep making stupid purchases. You deserve 8GB of VRAM if you keep supporting Nvidia. Why would they change when Jensen keeps getting away with taking his customers to the cleaners?
I think the problem is political and economic, because adding more VRAM is purely a human decision, not a technical limitation. Nvidia's VRAM depends on Samsung, the leader in memory tech; my guess is EVERYONE depends on Samsung, so it must have been a not-so-smooth deal with Korea. Someone should look into this.
@ It is not a technical limitation on mid-range cards. They can decide to mix and match VRAM to match their price points, ultimately to maximize profit. They are not going to give each tier its maximum VRAM capacity because they want to control the supply and pricing. Since we have yet to need more than 32GB of VRAM, I'll say we don't have a technical limitation, relatively speaking.
I hope there will be some kind of power consumption tax on AI companies in the near future and AI turns out not to improve companies' bottom line enough, the bubble has to burst before Jensen gets out of his straitjacket and starts thinking about gamers again. Godspeed, Intel (never thought I'd hear myself say that - oh well "normal" stopped happening since the 2019 bug).
There is no point in releasing the 8GB version given Intel's new card. If Nvidia is not stupid, they just won't release it and will only release the 12GB version later on.
@ Why would anyone want them to quit the GPU market? 🤷♂️ This competition is great and benefits us, the consumers. I won't be buying a GPU from Intel any time soon since I want the best performance possible; I have a 4090 and will be buying a 5090. Actually, as a loyal CPU customer for many years, I finally decided to leave Intel and go 9800X3D (upgrading to the 9950X3D soon), and it's the best thing I ever did. The performance difference is just mind-boggling! Intel's CPUs suck soooo much for gaming, and they just stubbornly refuse to listen to consumers about extra cache. Would I prefer them to focus more on CPUs and compete there? Idk 🤷♂️. I think the company is big enough to have both a GPU and a CPU department.
@@pvdgucht Because if Nvidia gives the 5060 at least 12GB, that will decimate Intel's sales. A few quarters ago Intel already said in their earnings reports that Arc did not make any money, and Tom Petersen's interview with HUB further confirmed this is still the case. No matter how big the company is, no sane company will want to keep an unprofitable business.
@ Yeah, probably. Anyway, Intel doesn't even want to sell the new Arc cards because they straight up lose money on them. Intel sucks, but I would rather have them not go bankrupt, for competition's sake.
I just ordered ASRock Arc A770 Phantom Gaming OC 16GB 😅 been dying to get one to play around with it 😂 hopefully B770 comes out too so I can get that too 😅 ( already have RX 7900XTX )
The 5060 should be 12GB minimum and the Ti should be 16GB. I play at 1440p on an RTX 3080, and there are games out there that run right up against the 10GB buffer.
I hope Nvidia completely loses the low-end market to Intel and AMD.
Nvidia will just release 5 different SKUs for the gen after: 6090, 6090 Super, 6090 Ti, 6090 Super Ti, 6090 Titan. $700-3000 per card :P
Pretty sure they'll stop making it. NVIDIA doesn't care about us.
Bro, Nvidia couldn't give a shit. Desktop GPUs are just side money for them at this point.
If only people would actually buy AMD or Intel instead of seeking competition just to buy more Nvidia at cheaper prices...
Same
8gb lol.
I'll stick with my overclocked B580 Intel gpu.
8gb isn't enough. Don't care how powerful it is.
My 480 has 8gb..what year did that come out lol
@@johnsmith-i5j7i The B580 is releasing in January 2025. It has 12GB of VRAM and clowns all other cards in its price bracket, and it still has room for 10-15% more power with a little overclocking.
@@Mersoh B580 is out now. What country are you in?
@@johnsmith-i5j7i that came out in 2016 :)
Almost 9 years ago.
Yeah man, you should stick with the GPU that's done you well for the last three hours that you've owned it. Come back and say comments like this in three years dude
Honestly that 5080 looks like a 5070 Ti... I think in our minds we need to knock each and every card's number down one step.
5060 = 5050
5060 Ti = 5060
5070 = 5060 Ti
5070 Ti = 5070
5080 = 5070 Ti
This is the "real" naming.
Yup. Same thing happened this generation.
From the 4050 being called the 4060.
To the 4070 being a 4060 Ti. Then they released the 4070 Super that should have been there at launch.
They already did this. And now they are doubling down.
Basically this 80 series doesn't exist
No, with 8GB these days it's not a 5050, it's a 5030.
@@snakeinabox7220 well then just call the 4060 a 4060 and the 5060 a 5060 lmao.
@mahshshsrklingfa7031 my point is that the name doesn't matter.
Nvidia doesn't respect its previous generational naming scheme, especially since they started mixing Super and Ti Super.
The 60 class used to be mid range. Now it's barely even low end, struggling to run 60fps in same-gen games without stutters due to VRAM limitations.
In the past they gave a 192-256 bit bus to 60-class cards.
Now they are not even putting 256-bit on 70-class cards.
The 4060, 4070, 4070 Super and 4070 Ti are all starved of VRAM.
The 4060 is a tiny, tiny chip. They could have easily made it 10% better if they spent $6-8 more on silicon.
And from current knowledge, Nvidia won't be increasing VRAM without making a later Ti/Super model with 3GB GDDR7 chips and upcharging the hell out of it.
Y'all keep buying those Nvidia cards, so that sends them a strong message that it's OK. So congrats on your new 50 series.
Just bought 10,000 4090s just to keep GPU prices stupidly high. 😂
@johnsmith-i5j7i right on. Good job bro.
Agree
@@johnsmith-i5j7i Hero!
They are too cheap, I hope the xx60 class cost $999 in the future.
The only word for 5060 is "shit"
As much as I love Nvidia (literally a lifelong user), deep inside I want the RTX 5000 series to fail. Nvidia is far too large to suffer the financial problems of Intel, and a failed GPU generation would probably get them back in touch with reality.
More like it's gamers that need a dose of reality.
@arenzricodexd4409 I don't quite understand why you're blaming the gamers in all this mess. Care to explain?
@@cristianroth8524 Never did I blame gamers. All I said was that the ones who need to wake up to reality are gamers, not Nvidia.
Nvidia not too long ago was far smaller than Intel, Intel was a giant for decades.
@@cristianroth8524 Gamers ARE to blame for indulging Nvidia and paying their high prices....
Only the 5070 Ti and 5090 make sense; the others are just plain bad.
Even 5070? I was invested in it and wanted to buy it.
The actual 4070 consumes 200W while the 4070 Super did 220W, so no surprise for them to put this at 250W. But that is the 205 chip; where did the 204 chip go?
Wait for the 5070 Super. The cheapest of the good ones, combining decent power and efficiency.
Everything below the 5090 is cut down a lot; I'd be surprised if there's a price/performance sweet spot anymore. Nvidia likes milking everything for all it's worth these days, hence why we get Super/Ti cards like 12 months later with more VRAM and performance.
@Laconic-Spartan-GR and maybe they'll give it 16GB, as it should have had already.
Well done Intel. Godspeed!!! 5080 = 1/2 of a 5090. Sounds like next-level mockery from the Jacket...
They did the testing by releasing a 4070-power GPU under the 4080 name; people took it, so they're gonna run with it for the next gens. The gap between the 5080 and 5090 will be like nothing we've ever seen in previous generations. It started with the 3090 series though.
tbf the 3080 vs 3090 was even worse
@ChickenGMD Not really. The performance difference was pretty small (around 6%), while the RTX 5080 literally has half the specs of the RTX 5090.
@@nossy232323 Yeah, I know performance-wise they were not too far from each other. I was talking VRAM, since 10GB was way too little. I'd argue that was worse than the 16GB on the 5080, even though 16 is too little too. It should have been 20.
A shame AMD won't be doing high end this generation.
@ChickenGMD The worst part is that I don't see AMD, Intel or anyone else being capable of competing at the high end for a long time. Even the mid-level performance cards will be a challenge.
This range of GPUs, with almost the same production cost, should be insulting to the consumer. It's a monopoly game right in our faces.
Nvidia has taken the Apple route for pricing. The 5060 is like a base model config that's just adequate enough to tempt you to jump up a tier to a config that actually makes sense.
Base 5060 should have been 12gb
It's just speculation; we don't actually know for sure yet.
B580 it is then
I dont mind buying B580. Intel changed the playing field.
Well, not for another month. The scalpers bought them out and have put a $150 upcharge on them.
@@srobeck77 True... Don't pay scalper prices....
I'm going to get the B580 for the 2 computers that I built for my girls. They only play The Sims.
I'll keep my rtx 3060 12gb for now.
AMD better not fumble the 8000 series....but they probably will
This is an easy layup. Which means it's all but guaranteed AMD is going to screw this up.
AMD in the GPU sector never misses the chance to miss.
@@WesternHypernormalization it breaks my heart. They could lower the price of the 6750XT to $300 or less and get a win against the B580. BUT it's AMD and they like losing for some reason
When do those come out?
Naw, frankly anyone who thought for even a nanosecond Nvidia weren't going to screw gamers was already crazy.
Nvidia is too entrenched in the minds of gamers, who think AMD GPUs are bad just because. They will lap up whatever Nvidia gives them. It's sad. The 3060 Ti will probably be the last Nvidia card I buy unless they really improve.
The B580 has 456GB/s of memory bandwidth while the 5060 will get 448GB/s. So even though the bus is narrower, the bandwidth is about the same.
RTX 5060 will come 6 months later
GDDR6 vs GDDR7
@@tin9759 First of all, it doesn't matter whether it's GDDR6 or GDDR7. Second, bandwidth doesn't mean anything at that performance level; it won't change anything because these cards are too slow even for 448GB/s. And what is "Bps"? No clue what BS the guy is talking about, but all I know is that he doesn't understand anything.
The Intel still has more VRAM. Most game engines aren't designed to assume swapping; if it was a console it would be fine, but it isn't, and PC games are optimized like garbage. Maybe Decima engine and RAGE engine games will cope, or Doom.
Memory speed and bandwidth are not linear. You can have less bandwidth and faster speed.
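For anyone curious, the two figures being argued about in this thread do check out if you assume the commonly reported per-pin rates (19 Gb/s GDDR6 on the B580, 28 Gb/s GDDR7 on the rumored 5060; treat both rates as assumptions):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gb/s."""
    return bus_width_bits / 8 * pin_rate_gbps

# B580: 192-bit bus with GDDR6 at ~19 Gb/s per pin
print(peak_bandwidth_gb_s(192, 19.0))  # 456.0 GB/s

# Rumored RTX 5060: 128-bit bus with GDDR7 at ~28 Gb/s per pin
print(peak_bandwidth_gb_s(128, 28.0))  # 448.0 GB/s
```

So a narrower bus with faster memory really can land at nearly the same bandwidth; capacity is the spec that doesn't come out in the wash.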
It's all about the RTX 5090 being pushed; I saw this a mile away. Guys, if you have an RTX 4090, just stay parked. This is pretty much pointless.
Pretty sure nobody with an xx90-class card is asking for more VRAM... Nvidia really needs to start rethinking the VRAM issues in their mid-range segment...
Look man, times are rough, Nvidia is a small poor corporation, you have to pay the premium. Why do you even need more than 8GB at $400? Just use low textures; it's not like texture quality is the most impactful setting.
If the 5060 is only 8GB, it's dead on arrival.
I made the mistake of buying the 3070 Ti for my 3440x1440 monitor, and what a mistake it was. The GPU has the power, but its 8GB of VRAM just fills up in modern games, so I have to run low textures and shadows. So frustrating. My next GPU will be 16GB and no less.
Gonna grab a used 4070 for a cheap price to game at 1080p next year rather than buying the 5060 with 8 gigs... And I doubt the 5060 can even reach the level of the 3070 Ti.
what about the super version? 4070super seems like a good card for around $400 if they ever get that low lol
Yea i doubt it too
1080p sucks though man, 2K is the way; easy on a 4070 even if you use DLSS Quality.
@@lawnmanGman 1440p sucks 4k or bust
I'll stick with the 4070 Super. If you already have a 4000-series card, you can skip the 5000 series except for the 5090.
ima hold out with my 4070 ti super, when i decide to upgrade i just hope AMD is offering something better
This is a GTX 1050 level GPU. Even in 2016 that tier was gimped. You just have to buy higher up the stack. If you want a GPU to last, just buy the flagship. The 2080 Ti is still a monster with no need to upgrade.
Depends on which games you play and at what resolution. The 2080 Ti certainly won't play newer games at 2K or 4K on high settings, and it's also missing a lot of newer software features.
"just buy the flagship bro"
The 5060 should be 12GB. Shame on you, Nvidia. We do not need faster memory. We need more memory.
Honestly, I think it is a great thing. Many will (hopefully) not be stupid enough to buy an 8GB 5060 in that price segment and will go for an AMD/Intel card with 12GB instead. That is a great opportunity for those two to gain market share, which will ultimately benefit everyone.
Unfortunately the 5060 will sell like hot cakes, since most of them will be sold in prebuilts. Go to any website that sells "gaming PCs" and sort from cheapest first, and 99% of the time it's an Nvidia 50/60 series. Parents buying their kids a computer don't gaf if it has 8GB. If I were to guess, I would say Nvidia makes a LOT more GPUs than the competition, so they can give a healthy discount for bulk purchases, which can net shops more money.
The share of future GPU buyers who even know what VRAM is, is like 5%; parents will buy their children the cheapest new prebuilt.
Some said GDDR7 memory will save the day even at 8GB. I hope not.
Not gonna happen, memory speed won't do much when there's not enough memory to load assets into.
Maybe in the short term, but give it 3 years and they will be all but useless for AAA games, just in time for the 60 series to be sold to you.
That's just it: they can gift wrap it nicely, but that doesn't mean it will perform how it should. Try it, it's your money.
I'm glad the 5060 only has 8GB of VRAM. I want Nvidia sales to suffer in that segment, not to punish Nvidia but to make the market healthier.
These specs make me feel a lot better about buying a 7900 XTX in a pre-Black Friday sale. The people who own the 4090 are the real winners, since it's probably still going to be faster than a 5080 and will end up being the only card in the performance gap between the 5080 and 5090.
I can't wait for all of the game development studios to show actual gameplay trailers using a GPU that's twice as powerful as the second most powerful card, and then leave kids frustrated that their 5060 prebuilt doesn't play games like that.
Which one did you get, and at what price? I'm like 90% sure I'm gonna get a Phantom Gaming one. How does it perform?
Given that the 5080 will only have 16GB, the rest of the lineup will just suck. And the 5080 *will* suck when it comes to price/performance. AMD will disappoint, because that's what they do continually by overpricing their products.
The only GPU that will make sense? The 5090. 32GB of VRAM means it can do tons of AI-related workloads and will never run out of VRAM in any scenario for its lifetime.
Lol, the 5090 will be £2000. And AMD overcharging? Hahahaha 🤣
@@johnsmith-i5j7i Sold my 4090 for 2k shortly after release. I paid €1,700 😂. So it's a good investment. While waiting for the new 5090 I'm using my old 6900 XT. Don't regret the investment.
@@johnsmith-i5j7i yes, MSRP will be whatever it is, and then the real price will be 20% on top of that for sure.
I'll probably try to get a 5090 though, cause that's the ONLY card in the lineup that will still be valuable when the 6000 series comes out. It won't be a waste of money.
@liberteus spending that amount of money on a GPU is insane
@@liberteus absolutely
At this point they should just stop half-assing the lower-end GPUs and leave that market to Intel.
£350 GPU is low end? Lol.
The B580 is €330; for the same amount of money you can get an RX 6750 XT that has more performance and the same amount of VRAM. Congrats, you can finally get a little bit closer to the RX 6750 XT users who bought the card years ago.
@allxtend4005 6750xt is about £50 more.
the prices are crazy...
Will dlss 4.0 be available for 4000 series?
Lol, cheapest? Don't say that... The Intel B580 is better than this trash... And Nvidia's game is over with this card... No one is gonna buy that shit with 8GB... Nvidia is dead already and the 50 series isn't even released yet... Dead.
Why are you crying? I will buy an 8GB GPU and avoid every unoptimized garbage game like Indiana Jones.
@dvornikovalexei OK, if you have more money you will... But an 8GB card in 2025, really bro? 🤣 You will regret it... 🤣
@@dvornikovalexei 2025 games won't be kind to your 8 GB GPU...
@@dvornikovalexei And one thing for your knowledge... Intel is way ahead of Nvidia in upscaling now, and ray tracing has gotten better on the Intel card than on your Nvidia 4060 Ti... Lol. The B580 benchmarks are insane. You keep your 8GB card for the extra money 🤣🥲
cry moar
My prediction is that the 5060 will come later with 12GB of VRAM on a 128-bit bus, perform like the 4060 Ti for $300, and sell the most, as always.
8:36 I've said it somewhere else, but I think Blackwell will be a letdown for gamers. This architecture seems rushed by Nvidia just to capitalize on AI. The fact that they had to stitch two dies together for AI servers just to get big speed improvements says it all. If Blackwell were so revolutionary, so efficient, so fast, then they wouldn't have to increase power consumption, and for servers they wouldn't have to glue two chips together. And look at the RTX 5090: a 512-bit bus?! It's this big for a reason. If there was no need for it, they wouldn't go that route. All Titan cards topped out at a 384-bit bus, excluding the Titan V, which used HBM memory.
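The bus-width point above is just arithmetic: peak memory bandwidth scales with bus width times per-pin data rate, so a 512-bit bus is a brute-force way to feed a huge die. A minimal sketch, where the per-pin rates are illustrative assumptions (ballpark GDDR6X/GDDR7 figures, not confirmed specs for any card):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s:
    (bus width / 8 bits per byte) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed per-pin rates: ~21 Gbps for GDDR6X, ~28 Gbps for GDDR7.
print(peak_bandwidth_gbs(384, 21))  # 384-bit GDDR6X-class card -> 1008.0
print(peak_bandwidth_gbs(512, 28))  # 512-bit GDDR7-class card  -> 1792.0
```

Under those assumptions, the jump from a 384-bit to a 512-bit bus (plus the GDDR7 data rate) is a ~78% bandwidth increase, which is why the bus width alone hints at how large the chip behind it is.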
Nvidia is selling software at this point, not cards. 8 GB of VRAM for entry level will never change because they don't care. The AI business is too good for them, to the point that they don't even care about GPUs anymore. For Nvidia, GPUs are just a prestige thing now; they release them for the sake of releasing them. They could shut down the entire GPU department and still wouldn't lose anything significant lol
I remember once seeing a graphics setting in a game that said "my card is worth two cars". I guess we're starting to see that reality.
Why's everyone crying that a low end GPU is less low end? Don't buy it
To NVidia, even the buyers of the 5090 are mere beggars receiving alms at the large table of AI.
Correct, it's milking our pockets.
The 8 GB 5060 is actually GOOD for gamers (who have lower-end cards such as the 4060 / 3060 Ti, let me explain).
Think about it: if the 5060 had 12 GB, that pushes the 4070 etc. closer to becoming obsolete. At least 8 GB gives us more time to enjoy 12 GB cards, because developers will have to make 1080p use roughly 8 GB of VRAM if they want to sell to a larger player base. We learned from the 4060 that even when it's bad value, the 60-class cards always sell the most, meaning 8 GB has to stay around the maximum for developers at 1080p. If the 5060 had 12 GB, newer games would start moving toward that at 1080p even sooner, which would be worse for all of the other cards.
And no, even though AMD and Intel are offering more VRAM, the market shows that NVIDIA always leads in market share, and they have the most influence on video game graphics requirements. It's just facts.
The only thing I have to say about the entire RTX 50 series could be summed up with just one word: Nshittyuh
It should be a 5050 with 8 GB and a 5060 with 10 or 12 GB, but Ngridia probably isn't going to release a 5050.
It's just that simple: don't buy Nvidia. It's useless to keep complaining when almost everyone has an Nvidia GPU.
No, it won't drive me crazy, because I was totally expecting it. And as long as Nvidia dominates the market, they won't have ANY incentive to change their ways.
Basically, pay more, and you'll get that 16gb.
Yeap. Apple.
Heck, AMD could drop the prices of the 7700 XT, 7800 XT, and 7900 GRE and not even come out with new GPUs and they'd win
The GRE is out of stock or the same cost as the XT 😅 right now.
They won't. People just want AMD to drop prices so Nvidia also drops theirs, and then they buy Nvidia. Always the same.
If everyone looked at the raw numbers and what the GPUs can do in theory, 40-series cards and the 7900 XT SHOULD last 5 years MINIMUM.
But games these days are so poorly optimized, plus the graphics don't do the system spec requirements justice.
Cyberpunk with RT off runs really well. HZD Forbidden West. God of War.
But name a UE5 game besides Fortnite and BAM, performance sucks. The Last of Us port? Lol, that port was a joke.
Does anyone have any idea which would cost more:
a) a slightly smaller chip with a 128-bit bus and 8 × 2 GB modules of GDDR7
b) a slightly bigger chip with a 192-bit bus and 6 × 2 GB modules of GDDR7 (actually on 192-bit they could even use GDDR6)
I'm really curious.
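Leaving aside die cost, the two options above can at least be compared on capacity and bandwidth. A rough sketch, assuming each GDDR module presents a 32-bit interface (so more modules than bus_width/32 implies clamshell mounting, i.e. two modules sharing one channel) and an illustrative ~28 Gbps GDDR7 data rate (not a confirmed spec):

```python
def config_summary(bus_width_bits: int, modules: int, module_gb: int,
                   data_rate_gbps: float) -> dict:
    """Summarize a VRAM config: total capacity, peak bandwidth,
    and whether clamshell mounting is required."""
    channels = bus_width_bits // 32  # one 32-bit channel per module
    return {
        "capacity_gb": modules * module_gb,
        "bandwidth_gbs": bus_width_bits / 8 * data_rate_gbps,
        "clamshell": modules > channels,  # more modules than channels
    }

# Option a: 128-bit with 8 x 2 GB modules (needs clamshell for 16 GB)
print(config_summary(128, 8, 2, 28))
# -> {'capacity_gb': 16, 'bandwidth_gbs': 448.0, 'clamshell': True}

# Option b: 192-bit with 6 x 2 GB modules (one module per channel, 12 GB)
print(config_summary(192, 6, 2, 28))
# -> {'capacity_gb': 12, 'bandwidth_gbs': 672.0, 'clamshell': False}
```

So under these assumptions, option (a) buys more capacity but less bandwidth, and option (b) the reverse; the actual cost difference then comes down to die area versus the extra modules and board complexity of clamshell.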
Don't buy the RTX 5060 with 8 GB of VRAM; GTA 6's minimum VRAM is 16.
The only interesting cards are the 5090 and the 5070 Ti, depending on price.
Minimum VRAM and recommended VRAM are different things.
5060 could run GTA VI just fine at 1080p
Lol, if GTA 6 needs 16 GB minimum then that is ultra-garbage optimization.
GTA 6 will be the same crap as the RTX 5060.
If it needs 16 GB for 1080p, which is where the bulk of PC gamers play, it will get very few sales, so no, not going to happen. 8 GB of VRAM will run it just like it runs every other game.
Shameful behaviour from Ngreedia. Hopefully these will sell terribly and maybe they'll learn their lesson, but I find that hard to believe considering the masses love team green.
Green zombies
Why does the 5070 have 12 GB and the 5060 Ti 16? Shouldn't it be the opposite? O_o 5070 - 16 GB / 5060 Ti - 12?
In their eyes the 5070 is for 1440p gaming, where you only need 12 GB, and the 5060 Ti is for entry-level productivity work.
Nice review. Building a new PC but holding off on the GPU. My 2080 Super and 6600 XT are fine for now.
Do you see any value in SLI with a 2080 and a 3090 or 40 series? The used high-end 24 GB cards are great value now, and 24 GB + 11 GB with SLI, or a 6600 XT plus a 24 GB card with CrossFire, has to be worth something? I don't see value in the 50 series since it will be a transition to a new chip. Unless they are ready to announce and they do an Apple with M4-style performance?
$300 is not the price of an entry-level GPU.
Under $150 is the price of an entry-level/budget GPU.
Imagine paying $300++ for a mid range GPU just to play the latest games at 1080p low~ 🤡
5070 with 12 GB... end of the stand-up routine, Nvidia won.
Only cards that make sense: a) 5090 b) 5070 Ti.
The rest are a joke. There is no point getting the 5080 over the 5070 Ti unless it comes with 20 or 24 GB.
It's a wonder Nvidia didn't put a 64-bit bus on this thing to mostly negate the advantage of the GDDR7 RAM, so that it would be just barely faster than the 4060.
"You keep buying them, so GPU prices remain high" - your average gamer who thinks GPUs are only useful for games, completely missing the AI/ML applications transforming the tech and research industries, helping to develop medicine and sequence genomes 95% faster than previously possible.
If the 5070 Ti is equal to the 4080 or 4080 Super, why call it entry-level 4K? Or is the 4080 Super entry-level 4K? I just find it curious how, when a new gen comes out, suddenly the new cards are entry or mid level while matching the most powerful ones from just a few days ago.
For current-gen gaming, yes, that is entry-level 4K performance. Honestly it is not even a true 4K card for UE5; you will need DLSS upscaling. You have to base performance tiers on current-gen gaming, not old games.
The 5060 and 5080 will both likely be dead on arrival. It's 2025. 1080p is technically still relevant, but for anyone who has already made the transition to 1440p and above, 8 GB GPUs are now totally out of the equation. Even at 1080p, some new titles are exceeding 8 GB of VRAM at high/ultra settings. It's also going to be a tight squeeze for the 5070 at 1440p, especially on ultrawide monitors. The target market for the 5080 is probably 4K gamers, but again, that's double the pixels of 1440p with an even smaller VRAM-to-CUDA ratio.
Don't see the point of "entry level 4k gaming". If you are going 4k, just go all the way.
The 970 lasted me almost 10 years. Then again I was playing 2015 games most of the time
Decided i'm skipping the 50 series. Too expensive, i'll stick to my 4070S for the next 5 years.
How much are the 5060/5060 Ti again...?
We should help Intel. Don't buy the RTX 5060... so we have competition... don't buy Nvidia.
Nvidia is great at screwing people over.
Screwing people? You're only screwing yourself when you buy Nvidia lol. Other options exist, like the newly released B580.
@@wwk279 Never again... learned my lesson with that 3060 Ti junk purchase... useless for anything Unreal Engine 5 above 1080p low.
Buying an Nvidia card = supporting their greed, and they can make you suffer even more with your next purchase.
@@wwk279 Everyone has their own greed, including gamers.
Just buy whatever super variant there is now and wait for the next super refresh. That's where the actual value is.
If the 5060 has 8GB it's literally *DOA* 😂😂
If you buy a prebuilt pc, they pair an i9 14900k with a 4060 and a 1 tb ssd. It’s weird
8 GB in 2025 is not acceptable. I bought an RX 6700 XT with 12 GB for £220. You think I'll upgrade to a 5060 with 8 GB? Err, no.
Next year we will have super cards from 5080 to 5060 😂
If Nvidia puts more VRAM in their cards, developers will just get lazier and do even less optimization. 16 GB of VRAM will be the new 8 sooner than you might think.
Yep
But Nvidia _do_ need to put more VRAM in the 5000 series. Newer cards need to be better than older cards, or there's no point to buying them. It's that simple. It doesn't matter if it makes developers lazy.
Poor take. What kind of cope is it where you think they do it to make a point to developers?
I would like to point out something that you and several others say, even Valve/Steam, without context. Yes, the 4060 and 3060 sold tons of units, and it shows on platforms like Steam.
But there is a reality no one covers. Internet cafés in China (which outnumber every household PC owner in all 50 US states and all 4 countries of the UK combined) order the 3060/4060, and next the 5060, in bulk.
My point is that if a site showed AMD/Nvidia sales by series and region, the 3060 and 4060 would be nowhere near those numbers shown on Steam, as China has more than quadruple the Steam audience of the Americas, UK, and Europe.
It is a numbers thing at the end of the day. I will also add that the Steam figures count individual users by login, not owners. A single PC in a wifi café can have 10 unique users per day, 7 days a week. The number is inflated. When we were in Beijing last year, I almost fell over seeing hundreds of wifi cafés packed, all of them using 3060/4060 GPUs and with bins of used 1060s/2060s for sale.
The numbers are grossly inflated, and what is sold is sold in bulk in China at very different prices to the cafés.
Maybe it will have 8 GB because Nvidia still has a lot of 3060s to sell; they were hoarding badly during the mining fever.
My 6800 is doing very well. I haven't had a game struggle at all. 🤷♂️
That Intel 12 GB card is starting to look better than the Nvidia 5060 any day of the week.
Why would it drive me crazy? Let's be honest, we all knew it would be like this. :D
Hi Dannyz, I've been into PC gaming for over twenty years. I predict getting my hands on a new 5090 will be in the third or fourth quarter of 2025 at the earliest. I like this card, but I like my money more.
Screw Nvidia. I'm going intel.
Wow, it's as if Nvidia let Intel take the lower-price market.
I bought the RTX 4060 Ti 16 GB a year ago as an upgrade from my RX 570. I'll pass on the RTX 5060 and wait for the RTX 4070 Ti Super if prices come down.
8 gb of vram brings all the fan boys to the yard.
Nvidia keeps charging these stupid prices because consumers keep making stupid purchases. You deserve 8 GB of VRAM if you keep supporting Nvidia. Why would they change when Jensen keeps getting away with taking his customers to the cleaners?
Nvidia's new slogan "Buy more get less"
I think the problem is political and economic, because adding more VRAM is purely a business decision, not a technical limitation. Nvidia's VRAM supply depends on Samsung, the leader in memory tech, and my guess is everyone depends on Samsung, so there must have been a not-so-smooth deal with Korea. Someone should look into this.
In reality, VRAM is a technical limitation.
@ It is not a technical limitation on mid-range cards. They can mix and match VRAM to hit their price points, ultimately to maximise profit. They are not going to give each tier its maximum VRAM capacity because they want to control supply and pricing.
Since we have yet to need more than 32 GB of VRAM, I'd say we don't have a technical limitation, relatively speaking.
It will not drive me crazy as I never buy such a low end card.
The 5080 is a 4080 Super with GDDR7 and higher clock speeds, same CUDA cores. Sounds bad.
Yup. Proper fucked me, mate. I was banking on a decent uplift; guess I should have known better. Gotta wait for a 5080 Super now...
5:20 Yup. Y’all gonna eat the 5060 up. A race to the bottom. “PC Master Race”
5070Ti will be the sweet spot this Gen. just like 4070Ti Super was last Gen.
The 5060 at 300 bucks... Nvidia is phasing out the 4060 Ti. Pay up, people... and we're happy. We need to change this milking of the consumer.
I hope there will be some kind of power-consumption tax on AI companies in the near future, and that AI turns out not to improve companies' bottom lines enough. The bubble has to burst before Jensen gets out of his straitjacket and starts thinking about gamers again. Godspeed, Intel (never thought I'd hear myself say that; oh well, "normal" stopped happening after the 2019 bug).
5060 is DOA but if they name it 5030 it might sell.
5060 Ti, probably DOA, due to price/perf.
Oh well... Intel B580 it is then.
I’m getting the rx8800xt
@@phoenixrising4995 I mean, you're talking a $600-700 AMD GPU vs a $250 Intel GPU, so not really an apples-to-apples comparison.
There is no point in releasing the 8 GB version given Intel's new card. If Nvidia is smart they just won't release it and will only release the 12 GB version later on.
so you want intel to continue to exist as a competitor or quit dGPU market?
@ Why would anyone want them to quit the GPU market? 🤷♂️ This competition is great and benefits us, the consumers. I won't be buying a GPU from Intel any time soon since I want the best performance possible. I have a 4090 and will be buying a 5090.
Actually, as a loyal Intel CPU customer for many years, I finally decided to leave and go 9800X3D (upgrading to a 9950X3D soon), and it's the best thing I ever did. The performance difference is just mind-boggling! Intel's CPUs suck so much for gaming, and they just stubbornly refuse to listen to consumers about extra cache. Would I prefer them to focus more on CPUs and compete there? Idk 🤷♂️. I think the company is big enough to have both a GPU and a CPU department.
@@pvdgucht Because if Nvidia gives the 5060 at least 12 GB, that will decimate Intel's sales. A few quarters ago Intel already said in their earnings reports that Arc did not make any money, and Tom Petersen's interview with HUB further confirms this is still the case. No matter how big the company is, no sane company will want to keep an unprofitable business.
@ Yeah, probably. Anyway, Intel doesn't even want to sell the new Arc cards because they straight up lose money on them. Intel sucks, but I would rather they not go bankrupt, for the sake of competition.
My eyes are only on the 5080 ti/super. If it's a better median between the 5080 and 5090... it might be really good
I just ordered ASRock Arc A770 Phantom Gaming OC 16GB 😅 been dying to get one to play around with it 😂 hopefully B770 comes out too so I can get that too 😅 ( already have RX 7900XTX )
Why do you buy other cards, if you already have a much stronger 7900xtx?
wow nvidia are a joke
Lol, it's just a +20% uplift from last gen, except the 5090, which will be like +50% performance.
3060 Ti
Comes with 8 GB
Downgraded from the 3060 12 GB version
Nvidia needs to stop
The 5060 should be 12 GB minimum and the Ti should be 16 GB. I play at 1440p, I have an RTX 3080, and there are games out there that get right up against the 10 GB buffer.
I'm buying a whole new PC this February. I'm thinking either Battlemage or the Radeon 8800 XT. I'm not going for Nvidia.
battlemage!
Nvidia fans are the best customers: dumb and willing to buy the worst cards out there, like a 60-class card, at ridiculous prices.