For budget options Intel might be the best choice, but if you want a higher-priced card the only other option is AMD, which is similarly priced but without DLSS, with far, far worse FG and RT performance, and much worse performance in workstation tasks.
Vex, here is something no other YouTubers have done yet: the B580 has a PCIe 4.0 x8 interface. Can you test this card in a PCIe 3.0 system and post the results? AMD would be easiest; just get an A520, B450, X370, B350, or A320 motherboard for PCIe 3.0. PCIe 4.0 can be X570 or B550.
@@Uthleber I'll list one game: the 4060 Ti in Far Cry 6. You notice more than a 2% FPS difference; see Hardware Unboxed when they put the 4060 Ti on PCIe 3.0. The question is whether I buy the Arc B580 or spend the extra money on an RX 7700 XT to have that full x16. I prefer to spend less money and not take a hit in performance.
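Rough math on why an x8 card can pinch in an older slot - a back-of-the-envelope sketch in Python (the helper name and rounding are mine; the per-lane rates and 128b/130b encoding are the published PCIe spec figures):

```python
# Approximate per-direction PCIe bandwidth for a given lane count and generation.
def pcie_bandwidth_gb_s(lanes: int, gen: int) -> float:
    gt_per_s = {3: 8.0, 4: 16.0, 5: 32.0}[gen]  # transfer rate per lane
    encoding = 128 / 130                         # PCIe 3.0+ line-code efficiency
    return lanes * gt_per_s * encoding / 8       # bits -> bytes

print(pcie_bandwidth_gb_s(8, 4))  # ~15.8 GB/s: an x8 link on a PCIe 4.0 board
print(pcie_bandwidth_gb_s(8, 3))  # ~7.9 GB/s: the same card dropped into a PCIe 3.0 slot
```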
Okay, don't forget that CoD BO6 uses 10 GB there, but I play at 1080p high (a bit near ultra, but mostly high) and run it at 100-144 fps while having Discord and other stuff open, on an RTX 3070, which has 8 GB of VRAM.
@@goregejones7248 Didn't and won't. I had the money to buy a 4070 Ti, which costs like $1000 in my country btw lol. I was thinking about it, but instead built a whole-a$$ PC for my sisters for little more than half the cost: AM5, 32 gigs of RAM, 1 TB Gen 4 SSD, R5 7600, 6750 XT, 750 W gold PSU, and a decent mobo with good VRMs too. Just found the discounts, because people aren't buying non-Nvidia or Intel cards in my country lol. I'm still rocking an RX 580.
@@goregejones7248 u say as if he or most ppl here is gonna buy it smh i literally have 2060 12gb, and no im not buying shit, this 12gb can last me 10 years
@@urnoob5528 you don't know how gamers are. I'm sure the RTX 5060 will be purchased by many people; look at the RTX 4060, which was criticised for its VRAM and price, yet it's in 4th position on the Steam hardware survey.
We had 6 GB on a 60-class card in 2016. Nine years later we're still getting an 8 GB 60-class card with the 5060. 16 GB should be the minimum VRAM nowadays; 8 GB should only be for 50-class cards or lower. It's not like people still game on 8 GB of system RAM anymore either, because it's just not enough.
Even if AMD releases a decent affordable card, people will sadly just buy Nvidia. The RX 6600, one of the best budget cards, doesn't even reach 1% on the Steam survey 😢
Nvidia has 88% AIB (MSI, Asus, etc.) market share, so of course this is what will happen. Normies know nothing about VRAM, etc. Nvidia is popular, the x060 is cheap and can do 1080p, which is the most used resolution, so the 5060 will be the best-selling GPU.
Nvidia doesn't give a fak because people will buy Nvidia regardless. Of course I'm talking about the majority, the normies who do 0% research and buy what's popular = Nvidia. And unfortunately Nvidia knows this; they see the sales = 4060 best-selling new GPU. The 4060 is 8 GB, so why increase the VRAM? No need.
@@daddysamosa I call them that because they keep whining about 8GB of VRAM when most AA and Indie games can run on potatoes, including THAT graphics card.
The VRAM limitation is NOT a problem with hardware. It is the failure of the game developers. None of them are building these games on that IbuyPower PC you bought for $1500 with the RTX badge visible through that tempered glass. Chase them if you like. You never catch up.
Not joking... it is true. 8 GB is going to be a problem. 12 GB should be the LEAST for that level of card; midrange cards need 12 GB, or better 16 GB, because games will use it! I got an RTX 3060 a few years ago with, yes, 12 GB. If I spared a little bit more money I could get a 4060 8 GB, but I would hit that memory wall. So I'd rather have 10 fps less but have it smooooooth than 10 fps more and stuttering.
Not everyone plays AAA garbage. I play single player strategy and indie games. My 3070 is more than enough for this requirement at 1440p, 60fps, on high settings.
Bought my B580 (the Sparkle Titan OC, since the LE is "unobtainium" here in Poland) literally 30 minutes after the reviews came out. I've been using it since Friday and I'm damn happy with it; it's a great upgrade over my A770 LE. I've been on 1440p daily since Alchemist, so you can believe me when I say that yes, the B580 is a really great card, and at that price it's just a no-brainer.
I'm super happy for Intel. Their GPU division clearly showed their passion and their hard work, and made a great card. If the corpos don't fuck it up, Intel has the potential to grab a ton of market share just by making better cards affordable. Something AMD could have done for the last 5-8 years. Sucks to suck AMD.
I'm also seeing a very bad flaw in game design where games hog resources like crazy. With all the wonderful advancements achieved in the tech industry, there are so many areas left in neglect. I've had to open the Nvidia Control Panel just to use the "System Fallback Policy" to get extra VRAM, but with performance costs. One more note: Intel makes CPUs and now GPUs; I wonder if Intel put CPU design ideas into the GPU's architecture?
+15% performance for +33% power... EH. EEEEEEEHHHHH. I'm more impressed by the base performance; it's rather efficient at that point. Intel's B580 is a fair attempt at a second-generation product: it pretty much crushes the old A770, outside of a few exceptions, which I'm sure drivers will fix. An eventual B770 might be a real beast, but will also probably cost closer to $499, though given Intel needs good press, they might also sell those at a loss at $399 or possibly less. If they lay a smackdown, even if it means losing money, it's an absolute publicity stunt and means Celestial will be priced higher, but also offer good performance. If they manage another +15% or +20% (preferably) performance on the next gen while keeping it cool and efficient, a $300 price might be acceptable for a C580. Then again, the B580 still has some driver issues; I expect a lot of them to keep improving. Either way, Intel made a damn banger on the second go. Unlike their CPUs, this is actually exciting. I'm on a 7800 XT, so I'm fine, but like god damn, I'm excited for a 3-way showdown when the next generation (Celestial vs 6000 vs 9000) rolls around.
Imagine playing those new unoptimized walking simulators without actual gameplay or shallow RPGs. That's why I am replaying older classics recently with my RTX 3060ti.
Ngreedia will release a 5060 with 8 GB of VRAM, and the price I think will be $400. This card will have performance around the 4060 Ti/3070 to 3070 Ti. For other countries, add 20% tax and profit margin, and OEM versions will be 10% more expensive, so we'll have a 5060 for $480-520? WTF, for 8 GB of VRAM on a 128-bit bus?
Thanks for making this great review. The B580 is currently at the top of my list for my next build. I don't know where everyone gets the idea that 4060 cards only come with 8GB of RAM, though. I've seen 12GB versions, and Gigabyte even makes a 16GB card: "Gigabyte GeForce RTX 4060 Ti WINDFORCE OC 16G".
@@cz5836 Realistically, for most of them: 8 GB RX 480/580 at the super low end, then the 12 GB 6750 XT at the low end (because who buys a 7600 when that exists), and 16 GB for the 6000-series mid range.
Can't hear you over my 24GB of VRAM, soon to be 5090 32GB of VRAM and 2500Watt PSU in January 2025 because I'm gonna sell my left kidney and soul for it.
The game doesn't want 10GB of VRAM; the memory manager allocated 10GB of VRAM for that game because that is what was available... The game might only be using 4, 5, or 6 GB of the VRAM it reserved.
@@StricklandLifts Your knowledge is so low it's unreal. If you think RAM and VRAM are the same and give the same performance and speed, then I don't know what to tell you. You can have 64 GB of RAM with an RTX 3070 8 GB; watch how the game stutters if you run out of VRAM.
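The allocated-versus-actually-used distinction above, and the stutter when the working set spills past physical VRAM, can be shown with a toy model. This is not how any real driver or engine works - just a sketch with made-up names:

```python
# Toy illustration only: "reserved" VRAM vs. what frames actually touch.
class ToyVramPool:
    def __init__(self, physical_gb: float, reserved_gb: float):
        self.physical_gb = physical_gb   # what the card really has (e.g. 8)
        self.reserved_gb = reserved_gb   # what the game grabs up front (e.g. 10)
        self.resident_gb = 0.0           # what recent frames have actually touched

    def touch(self, asset_gb: float) -> str:
        self.resident_gb += asset_gb
        if self.resident_gb <= self.physical_gb:
            return "hit: read from VRAM at full speed"
        # Anything past physical VRAM gets shuffled over PCIe from system RAM -> stutter.
        return "miss: spilled to system RAM, expect a frame-time spike"

pool = ToyVramPool(physical_gb=8, reserved_gb=10)  # "10 GB allocated" on an 8 GB card
print(pool.touch(5.0))  # fine: only 5 GB actually in use so far
print(pool.touch(4.0))  # now 9 GB touched -> the spill, and the stutter, begins
```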
Who killed them? Lazy damned coders. You wouldn't need 4 Gb with the latest titles if they actually practiced some restraint. They throw crap in there that doesn't need to be there.
This is pure cope. Big open worlds with high resolution textures will easily take up more than 4 GB. I'm sure you're probably cool with 8 GB of DDR3 RAM too then? Times change man
I was a big PC nerd when I was a teen, but honestly it's gotten exhausting. I'm broke and can't even afford to upgrade in the first place. I've been running a laptop with a GTX 1650 since 2019, and every time I see some new tech news I go "oh, good for them I guess". It's wild to me how stupidly fast computing is advancing, but at the same time I find that having such abundant graphical and processing power is making people lazier.
Unoptimized slop is not a good way to benchmark cards. The developers being lazy at optimisation is the real problem, not the VRAM or the cards' general performance. 8GB is perfectly fine in non-AAA games; it's not fine in AAA games because the devs don't know how to optimise their games anymore. They just think higher texture = better, but that's simply not the case: no one is going to notice your 4K bark textures in a third-person game, no one is looking that close or cares. 4K textures in general are pretty useless; they look almost the same as HD textures, since the human eye simply can't discern something like that at a glance, which is how players will be looking at the environments - they're not going to be watching paint dry, so to speak. Most indie developers are better at optimisation than AAA studios; look at Deep Rock Galactic, which can be played on a 2GB VRAM card from like 12 years ago. Photorealism doesn't = better either; in fact it can make a game worse, since people play games to have fun and escape reality. Not every game needs to be super realistic looking - that's just dull, and photorealism hogs lots of performance.
This doesn't matter; the 5000 series will sell out in all tiers, because people have sheep minds and Nvidia today is exactly like Apple: it's all about the brand, not the product. People are braindead set on "ooh, new Nvidia GPUs, I have to buy it because everyone else does the same!" Even though AMD has better raw performance at a lower cost with the best GPU software ever made, and the Intel Arc B580 is actually a very good entry-level mid card.
I think Intel is testing the market with the B580, because if they want to make a profit they have to sell a lot of B580s at that price. If the supply were too large and the B580 got criticized, it would be a big loss for them; that's why production of the B580 isn't high. The good thing is that everyone likes it, but because of the low supply everything outside the US is marked up quite a bit over MSRP.
@ there are none available in the USA either. They lose money on every B580, selling more will lose more. They have made a dent in the market, and its a big improvement. But its a paper launch.
@@tin9759 Bro, the GPU sucks. If I'm building a $1k desktop PC I don't care if the GPU is $250 or $350; it's within range, and I'd rather get a worse case, power supply, less RAM, etc., since the GPU is the main component that drives performance. Yet this thing is competing with older-gen GPUs and is weak as hell. Intel should've released a better-performing card at $300-350, more in line with 4070 performance. This GPU will flop imo.
No, the 3060 Ti too. Only 8 GB, but so fast for the price; it literally beat the 80-class Super card from the previous gen. Imagine if the upcoming 5060 Ti beat the 4080 Super; it would be unfathomable.
I just ordered ASRock Arc A770 Phantom Gaming 16GB 😅 been dying to get one to play around with it 😂 hopefully B770 comes out too so I can get that too 😅 ( I already have a 7900XTX )
Don’t forget it’s highly rumored that the 5060 will have 8gb of vram
No bro it will have 6gb vram what u smoking
Those rumours are good for clicks tho
It will; it's Nvidia's way of saying "this is our 1080p budget card."
Yet it's not budget-priced when they launch it.
Or they like to keep that card bad for one reason: to push gamers to purchase more expensive hardware to pair with their 1440p monitor.
If this is true, I'm never buying an nvidia gpu. I'm just going to pray intel releases the b770/b780 soon.
It's not even "highly" rumoured; the further you are from the actual launch date, the less accurate it gets. It's likely that they'll change something along the way.
Furthermore, it's just a rumour for now, with the actual product likely to launch mid-2025, unlike the 4090, 4080 and 4070, which were just months away.
Imagine telling somebody 5 years ago that intel GPUs and Ryzen CPUs are the best way to game on PC XD
Intel GPUs aren't the best way to game on PC though. They're just a good value proposition against the 4060 - which was a widely criticised card that has been out a while. Intel drivers are also behind both Nvidia and AMD.
Lmao
Why would you do that Intel GPUs are a dumpster fire
You forgot to add a qualifier. It's the best way to game 'on a low budget'. The best way to game is still Ryzen and Nvidia.
For a budget build*
I hope they sell a TON of B580's.
We need competition in the GPU market so badly.
They sold so much they are out of stock in most places
@Baabli_ultimate_Gamer yes, I don't think they expected the level of interest.
Heck, I'm thinking about building a mid-range PC with one of them.
Scalpers are f*cking everything up
Hell yeahh ill use a B580 now
There's an invisible Chinese market that AMD and Nvidia capitalized on by making specific GPUs for that region only; Intel is still lagging way too much in the GPU market tbh.
It's just a matter of time until someone defends 8 GB by saying "8 GB for Nvidia is the same as 12 GB for AMD/Intel"; they are the Apple of the GPU market.
See, that argument could never stand, because Apple has its own ecosystem where every dev on an Apple product is forced to optimize and use the full potential of every technology in their devices. Nvidia doesn't have that. While some gaming partners will take their time to optimize for RTX cards, the majority just try to optimize for all graphics cards, leaving most of the extra performance features of said cards by the wayside. So anyone using that argument, you can tell them they're smoking something and it ain't legal.
@@Quintessenza Nvidia tried its hardest to make its own ecosystem with ray tracing brainwashing videos and cloud gaming, but it didn't work.
8gb for Nvidia is the same as 12gb for AMD/Intel
In terms of gaming, no. But for running LLMs, Nvidia is so much better that even the pathetic 8gb is better than a 12gb AMD/Intel in that field specifically. Which sucks, I have an RX 6750XT with 12gb and an RTX 3070 with 8gb. And the 3070 beats it for that usecase
Nvidia = Apple
Amd = Samsung
Intel = just literally any other phone xD
Nvidia is like “hold ma beer, gonna launch another 400 usd 8gb card”
@@Made1nGreece you say that like it isn't a tragedy.
@@Made1nGreeceain't that the most tragic part of it all
Or they will just drop the price of existing cards before 50 series comes out. Maybe even revise their 50 series card prices. Competition is good.
@@Made1nGreece Even the 4060 could barely run RT medium in Cyberpunk 2077. That's a clown level of GPU for an x060 card. Just get a 3050 8 GB instead; the value is better. You could save up for a better GPU in the future, because if we're talking about 1080p, we're talking about performance, not fancy graphics.
@@Made1nGreece wrong
Westerners complaining about paying 200-300 USD for GPUS when I have to pay triple that price whilst having quarter of their salary... us 3rd worlders get bullied too much.
IKR here the 4060 is selling for 377.68 USD without the taxes LMAO
I worked like a maniac to buy a 4060 this year in this god-forsaken country. Some things just aren’t fair, man.
@@bruno0898 tbf there's no game that didn't run well at 1080p
Hits home, I live in Brazil, import taxes on GPUS are HUGE here, I'm talking more than double the price, and we get paid less
@@bruno0898 Go for AMD if you're tight for money. Nvidia cards are kinda overpriced at this point.
Should also note that the B580 is on launch drivers, as it just came out. The RTX 4060 has had 18 months of post-launch driver development.
18 months and 20+ years of nvidia drivers.
If we're talking about the Arc B580, then yes, Intel has only had around two weeks for GPU drivers, but technically not, because Intel has been developing integrated GPUs for 15+ years as well. So technically Intel has had much more than just about 2 years of experience in the GPU market from the Arc dGPUs.
Yeah i imagine we can expect another 10 or even 20% performance increase as drivers improve
@@random_person618 yes, because integrated GPUs have NO driver issues and never have massive amounts of artifacts or anything else lol
@@random_person618 He's talking about the drivers of this specific card. Nvidia had 18 months to fix and patch any issues, while the B580 just launched and could get driver updates that increase the performance even more.
People with rtx 3060 12GB: They called me a madman...
3060 owner here 🤣
It was an extremely good budget card for the time. It's aging fairly well. It may do better than my 3070 Ti 8GB in the long run.
i have a single fan 12gb 3060 lol, accidentally overheated it once when i didnt realize a cord jammed into the fan when i was messing with the wifi card... still running strong
I have a 2070 but I game at 900p and in Cyberpunk with the FSR 3 mod that enables Frame Generation for RTX 20 and 30 cards and DLSS set to Quality, I get over 120 fps. Not the most practical, but I couldn't be happier.
@@rvgr420 you should try DLDSR + DLSS
Not watching this; the real issue is the lack of optimization on developers' part. They offload the cost of this to the consumer by putting the extra load on the GPU. Games now look worse with better tech than several titles from the early-to-mid 2010s. This is the entire controversy surrounding new game engines and the apparent lack of progress and performance for the leaps in technology. It is laziness for the sake of capital gain.
This is mostly true. However, it also has to do with the fact that NVIDIA relies almost entirely on AI to do everything because they know it's constantly getting hyped. With everyone saying that AI is improving by leaps and bounds, I understand why NVIDIA focuses on it so much. I'm pretty sure that NVIDIA already knows that AI isn't the future of GPUs, but they know that most of their consumers will buy the AI-nonsense and go with the flow. So they crank the prices really high (knowing full-well that they spent little to no money on actually improving the GPU itself, and just poured all of their time into the AI aspects of it). With all of that in mind, not all GPUs are focusing everything they have on AI. Some companies are more transparent than NVIDIA and won't dive into AI as much as NVIDIA has for the sake of profit over progression. For example, some other GPUs generally perform better with raw performance and get a slight boost with AI. However, with the reliance on NVIDIA pushing out more AI-nonsense, devs know they don't need to optimize their games because the cards will do most of the work, which is unfortunate.
Nvidia: Guess what suckers , 5060 is also 8gb vram
NVIDIA doing Apple things
Nividia can suck on this, i aint buying anything thats over $300 and only has 8gb
i mean it won't be as bad as its gddr7 so the bandwidth will massively increase but it'll still be quite shit
Does Vex live in a warehouse that doesn't have a closed ceiling?
@@srobeck77 just don't buy Nvidia new XD
This is how it should have been:
- 5060 12GB
- 5070 16GB
- 5080 20GB
- 5090 32GB
Nah, how are we supposed to sell more by adding Ti & Super behind their numbers?
It's Business
This is what we'd get if amd was the gpu top dog, until that happens nvidia couldn't care less about making high longevity cards.
5080 should have 24gb. At the prices they're charging, vram is cheap
@@kurtwinter4422 I remember when the 4060 Ti came out and there was the $100 difference between the 8 GB and 16 GB variants. Turns out the RAM price difference was just a bit over $20. So the premium was exorbitant. Until Nvidia gets their heads out of their collective rears I'm not going to purchase another card from them.
I'm gonna sell my left kidney to buy the 5090 32GB when it comes out. Heh
Me chillin in almost 2025 with my 1080ti with 11gbs of vram😂
Hope to be of a similar mind in 2035 with my 4090.
BASED tbh
Me with the RX 580 with 4gb still being able to play any game I want to.
me chillin with my i3 4th gen and Integrated graphics 🗿
The 5070 coming out with 12gb is criminal!
Buying more VRAM isn't the solution.
If you keep chasing it, development will only get lazier.
There is no video card that can't be saturated.
Also you kill older GPUs.
Stop playing this game.
@@markdove5930 EXACTLY. These people are playing the GPU manufacturers game for them. It's super dumb. Focus on devs, and proper optimization first. 8GB of vram is still PLENTY in 2024... if games are actually made well, and I'm not talking about just slapping on heavy upscaling either.
@@markdove5930 fatal plot hole: not enough VRAM isnt the answer you seek and 8gb for over $500, aint it, fella.
@@brando3342 no it definitely is not "PLENTY" unless you play medium settings at 1080p, otherwise no
@@srobeck77 Clearly, you don't have the slightest clue what you're talking about.
So basically to sum this up for everyone who doesn't wanna watch the video....
the Nvidia 4060 is you, and the Intel B580 is the guy she told you not to worry about.
and the AMD 8700 is brad pitt
😂😂😂
And 6700XT is that chill middle aged man that still looks fine despite his age. 😂
man love you!
All Empires rise and fall.
NVIDIA: why would we care? 90% of the people will buy our potato anyway
Nvidia users have the Call of Duty consumer syndrome. People should only ever buy Nvidia if they're in the market for the "all-out best top-performing GPU", aka the 4090/4080 Super in this case. The rest is terrible price-to-performance; just go buy an AMD card.
The problem is not necessarily that we like to buy Nvidia GPUs. A lot of us use GPUs for workstation purposes (3D rendering, animation, ML model training and inference, CUDA and simulations) where the only option is Nvidia, and we are forced to buy Nvidia because there is no alternative that is as good. AMD is still far behind Nvidia when it comes to workstations, so we are stuck with Nvidia because of their monopoly on hardware and software; every workstation development pipeline has support for Nvidia anyway.
AMD ROCm is now an alternative, but it's not yet good enough to be used in production.
@@exphised4515 No thanks. AMD can't use DLSS so that's a huge reason they're inferior. I'm looking to upgrade my 2060 Super and I'll likely buy a used 4060 TI for $300
@StricklandLifts lmfao waste of money dude, I went over to red team with a 7900xt on 1440p and I couldn't be happier.
@@FavGZ Lmfao omfg so funny dude, yeah spending a whopping ~$200 after I sell my current GPU for all around 25% faster performance and an extra 2GB of VRAM is so hilarious! Cool bro, great job comparing your $1K GPU to an entry level $300 GPU you're a really smart guy. Talking about a waste of money when you spent $1K on a graphics card, that can't even use the best technologies available like DLSS, that my 5 year old GPU can do lmfaooo
Wow no one has mentioned the overclocking in all the reviews I've seen. That's quite a significant uplift in performance. B580 the value king.
Yeah, it's basically a 2080 Ti when overclocked, which is fire for $250. Plus you get an extra gig of VRAM and better power efficiency, even when overclocked.
@@Frozoken also better RT in newer games, and XeSS 2.0 is good
8 gigs is not enough anymore, me with 6 gigs 💀
I'm with 4gb of vram
Love they 1060
Who cares, if the game runs, f*ck it! I am not giving more money to Nvidia or Intel or AMD. By now the only game we are playing is work simulator to pay for the new GPU every f*cking time!
@@delresearch5416 Swapped from a 1060 to a 7900 XT this month for Path of Exile 2 :D I love it, and it even came with a video card bracket to support it so it doesn't sag.
My 3050 laptop with 4GB of Vram still runs my fav games just fine at 1080p 144hz. Tho said games are primarily mobile games that can also be played on PC so it's not as heavy as those AAA titles.
Yet still feels like yesterday when R9 390(X) launched and I was like "whoa, 8GB, who even needs that much of VRAM"
How time flies
At the end of the day, IMO as a 20+ year graphics programmer, these games should be abiding by the user's graphics quality setting and dynamically scaling the LOD according to how much VRAM their GPU has. This means that everyone can have ultra-quality textures up close, but the quality drop-off with distance will be greater on GPUs with less VRAM. There could also be an LOD balance where an end user can sacrifice the up-close quality of material textures for more consistent texel-to-pixel density across the scene. There definitely should not be shipped games that just blindly use up all the VRAM and then start shuttling resources from system RAM to render frames. That is at least better than over-committing VRAM and causing a system crash, but the framerate should never be able to drop to a slideshow because resources overflowed out of VRAM into system RAM. The engine should be doing everything it can to keep performance up - otherwise what's the point of things like dynamic resolution scaling? What's the priority here, graphics or performance? They need to make up their mind.
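A minimal sketch of the idea in that comment - purely illustrative, not from any shipping engine, with made-up asset sizes and a naive greedy policy: keep the closest assets at full-resolution mips and step the distant ones down until the total fits the card's VRAM budget.

```python
# Sketch: pick a mip level per asset so the total fits a VRAM budget,
# preferring full resolution for assets closest to the camera.
def assign_mips(assets, budget_mb):
    # assets: list of (name, distance, size_mb_at_mip0); each mip step halves
    # width and height, so memory drops roughly 4x per level.
    plan = {name: 0 for name, _, _ in assets}
    sizes = {name: size for name, _, size in assets}

    def total_mb():
        return sum(sizes[n] / (4 ** plan[n]) for n in plan)

    # Drop the farthest assets to lower mips first until we fit the budget.
    for name, _, _ in sorted(assets, key=lambda a: a[1], reverse=True):
        while total_mb() > budget_mb and plan[name] < 4:
            plan[name] += 1
    return plan

scene = [("hero", 2.0, 256), ("wall", 10.0, 256), ("mountain", 300.0, 256)]
print(assign_mips(scene, budget_mb=400))  # roomier budget: only distant assets drop
print(assign_mips(scene, budget_mb=200))  # tight budget: even near assets lose a mip
```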
Oh my God 100% this! I am so frustrated with everybody obsessing over vram.
I think game developers are being spoilt and getting lazy with optimization. They work with what, 24 or 48 GB of VRAM on their workstations, and just expect consumers to keep up with their VRAM game instead of optimizing for the lowly peons with 6 GB of VRAM.
But how will developers launch unreal engine 5 slop for 70$ 2 times a year if they have to optimize it?!?!
UE5 becoming a drag and drop development already killed game optimization
@@kenhew4641 Right, they just get whatever the latest top tier hardware is - which was fine 20 years ago because a lot of people would be able to afford it too when the game launched, or within a year of it launching, but nowadays most people do not have a 4090 and probably won't have 4090 power for at least 5+ years. Building all these high-end features is fine, but they need to be realistic and cater to the mainstream market and what most people will have on launch day, and strive to improve the graphics on that while also keeping things performant. The worse hardware your game supports, without looking like a complete potato, the wider the audience, period. *Everyone* loves a game that looks good AND runs good. Making a game that only CAN look good if you have a $3000 computer ain't it.
Got a used RX6800 last year for less...That's how bad the new GPU situation is when the B580 looks like great value. I hope this pushes all 8gb GPUs under $200USD
Intel cards don't do ANYTHING to fix the problem. They needed a 4070 competitor for $399 or less.
@themodfather9382 They are doing something: they are showing people that you can have more VRAM than 8 GB at under $300. It's showing that Nvidia is being scummy and intentionally making their lower-end cards obsolete compared to their nearly two-gen-old 12 GB 3060.
@@themodfather9382 Just wait for the B770, because I've heard about it pretty often and it might be unleashed on the market as a revolution (take this with a grain of salt however; it's still a rumour, but very possible).
NEW 8gb gpu SHOULD NOT EXIST IN 2025
The problem with 8GB GPUs is not the price; the problem is that we already bought them!! I have a 3070, and in my country it was the same price as a non-Ti 3060. The 3070, even though it's only 8 GB, WIPES THE FLOOR with the crappy 3060. So sure, you have 12 GB of VRAM with the 3060, but you don't even have the power to use it, while with the 3070 you DO have the power to use more VRAM, which of course you don't have! hahahha. But as I was saying, the problem is that a HUGE number of gamers already purchased these cards!
A funny thing to note about Arc cards: they intended to release the Alchemist (A series) cards shortly after the RTX 3000 series launch, but issues caused a delay, resulting in them competing with the RTX 4000 series cards. Granted, the A770 kinda competes with the RTX 4060/4060 Ti, but they were aiming to show their dominance over the 3060/3060 Ti cards, which it definitely out-specs. The Battlemage (B series) cards are going to have the same offset in terms of their generational comparison. The interesting part is that Intel is aiming at the RTX 4070/4070 Ti, so in the future we'll have a reasonably specced 4070-class card from Intel without Nvidia's premium brand tax.
Wait till Nvidia Drops the 5060 with 8gb🤣💀
I am interested in 5060 ti 16 GB or 5070 ti 16 GB
But wait, 8 GB DDR7, let's wait and see what these memories can do
@@JuniorTech-q1k It don't matter if the memory is faster; knowing Nvidia, they're gonna nerf it to a 128-bit bus. HECK, EVEN FASTER MEMORY IS A PERFECT EXCUSE FOR THEM TO PUT A 64-BIT BUS IN THAT CARD.
😆😆😆
@@JuniorTech-q1k Faster memory speed will improve performance/FPS, but it will not affect capacity; if the game needs more than 8 GB it will struggle all the same.
Doesn't matter. The product stack has leaked and we will get the RTX 5060 with 8 GB of VRAM and a 5060 Ti with 16 GB of VRAM. Nvidia is a monopoly and they now go really hard on the upselling strategy by creating artificially bad products. Maybe the MSRP for the RTX 5060 will be "just" $299, but Nvidia's MSRPs are worthless numbers anyway. But gamers will buy it anyway because they like being screwed over.
Mfs say this and then buy an iphone
So my 3060 is better than the 4060 and 5060? What? Good thing I bought it.
its just people not being informed correctly and assuming things based on brands. for people interested in pc hardware and who don't just buy prebuilts its obviously different
They are not a monopoly. Not at all.
The more you buy, the more you save. I'll always take Nvidia over this non-optimized drivers intel trash.
Thanks Vex for the great content this year ✊. May you and your family have a great Christmas 🎄🎅, an awesome new year and a terrific 2025. Cheers 🍻 and looking forward to more awesome content in 2025.
Here before Vex
How likely is it that a $300+ RTX 5060 will be gimped enough in 8GB of memory that a $250+ B580 will manage to remain competitive?
Really hope Intel is at least breaking even with these cards and not selling them at a loss.
I mean they would; the B570 basically exists to use subpar B580 chips, cutting them down and still selling them.
damn, prices are that good in the us?? Here the basic 4060 is 330 dollars and the arc is 500
@@pudimy And they are only gonna get worse cuz of tariffs 😢
@@milkylul8868 Y'know, I was about to say, "wouldn't that only affect Americans, like my fellow people because we pay the tariffs?"
Then I remembered there's a reason trade wars happen because of tariffs.
You can be assured that the 5060 will be $400+, not 300. Nvidia has become insanely greedy.
Life is so much easier when you're a 1080p gamer
I really see no point in going higher than that. Sure it looks better, but it's like watching a movie. If the escapism is doing its job, then you won't notice in the heat of the action
Before anyone says, yes I have used better monitors before
This guy is speaking my language
Totally agree, I bought 4 new 240Hz 1080p monitors on sale because they are going out of production soon, you know, greedy corporations that are trying to force the sheeps to buy buy buy.
This is also a reason of mine why I went with a 7900 GRE so I can play at 1080p high fps for a while even when games become more demanding and VRAM hungry.
yeah, 1080p gaming is much more forgiving.
Cope more 😂
This title gets more hilarious when the 5060 is rumored to still have 8GB
Rare Intel W
common Intel ARC W (if we disregard launch A series drivers)
What W? The die size is similar to a 4070 while being on a significantly smaller node and offering 4060 performance. Intel is losing money on these cards and only launched them at all to keep investors from suing. Stock will basically be non-existent. It doesn't even offer value when I can get an RX 6700 XT for $265 shipped. Both AMD and Nvidia can shut this down with two-year-old designs if they feel like it, because those are cheaper to produce. That's not even taking into account AMD's and Nvidia's next gen. Intel on a technical level is at least two gens behind.
@@pastelink6767 it still offered huge bang for buck and a cheap way for av1 encoding and it didn't have 12gb like the 4070
@@pastelink6767 dude hates intel to his heart dam, can't appreciate anything.
But fps is so low. So it is an L for the user.
I'm so happy I was able to get my sparkle b580 from my microcenter before it sold out, upgrading my summer 2019 build from a amd rx570 to the b580
Modern PC gaming is so unbelievably bad it’s actually kind of insane to think about. I’ll keep playing games from 2016 back.
there aren't good games now anyway
Funny how the requirements keep skyrocketing yet the graphics either remain the same if not getting worse
for real, im only playing old games & going through my backlog lol
It's bad??? PC gamers are so entitled, it's insane… PCs are so insanely powerful now. Everyone has RTX GPUs now. I bet you haven't even reached 10 years old.
@@deathhunter1029 What are you talking about? It's bad that games aren't being optimized correctly; instead they're just expecting consumers to all have brand-new, top-of-the-line hardware.
The world doesn't revolve around the US. AMD never really took off because their cards are super overpriced outside the US, and this Intel card is already $330 in Europe and Asia after taxes while the RTX 4060 is available for $280. More VRAM and better price/performance mean nothing when not everyone can buy them.
Correct. I think the new Intel card is some 360-380 in euros. There goes the budget price angle.
it's surprising. in what country do you live? here in russia the 4060 costs 400-450 dollars, and whenever b580 comes here it'll cost 270 dollars
Agree
@@SimplCup I'm not the OP, but a 4060 here in the UK costs 381 dollars and the B580 costs 330 dollars. The 4060 Ti with its 16 GB of RAM is 635 dollars. So yeah, the B580 is still amazing value for the amount of VRAM it has.
@@faequeenapril6921 yeah. even if b580 ends up slightly more expensive than 4060 for some people it's still better value lmao.
YOU KNOW THE RTX 5060 WILL COME WITH 8 GB OF VRAM🙃
no way really????
@VZR_ not confirmed, but most likely it will, this Nvidia we're talking about.
Nvidia better make it $200 since B570 coming out next month with 10GB of VRAM
@@DailyThingsInLife WHAT!!
B570?
Then when will they release the 700 series?
Dude, it's kinda lame they're holding back their own GPU and their own money.
Look at the 700 series from last gen:
it has 32 cores.
@@whizp261 b570 is the same die as the b580 but with more manufacturing defects and therefore underclocked.
The B580 is really lookin like the 1080 ti for the people! O_o
No, actually it looks like the 2016 mid-range cards, the RX 480/580 with 8GB. Back then everybody said that 8GB was excessive and would never be useful on those cards, so better to get the 4GB versions.
Four years later, I laughed a lot watching their faces in 2020 when games started consuming 8GB of VRAM at 1080p.
@geralt_silverhand 8 GB isn't enough in plenty of cases to be able to push higher textures. I bought the Red Devil RX 480 at launch, and around 2022 it became more apparent than ever that even at 1080p it's no longer able to push high textures. A good example would be RDR2's textures.
The RX 480/580 had a good run, but since 2022 it's been showing its age trying to push high textures; pretty much every AAA game you play today makes that apparent. In the used market I would go no lower than the 1070 Ti in terms of horsepower, while in the new market you should be looking for more than 8 GB of VRAM to get the most bang for your buck 👍
Intel drivers beg to differ. It sucks
@@fightnight14 theyve gotten better wdym?
The lower-capacity 12GB 192-bit B580 beats the higher-capacity 16GB 128-bit RX 7600 XT; skimping on the memory bus also shortens the longevity of cards these days.
@@OxMilky Yeah, except the 7600 XT uses slower memory too, so total bandwidth is about 55% higher on the B580.
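The bandwidth gap is easy to sanity-check - a quick sketch, where the 19 Gbps and 18 Gbps figures are the commonly listed GDDR6 speeds for these two cards, so treat them as assumptions:

```python
# Memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

b580 = bandwidth_gb_s(192, 19)      # ~456 GB/s (192-bit, 19 Gbps GDDR6)
rx7600xt = bandwidth_gb_s(128, 18)  # ~288 GB/s (128-bit, 18 Gbps GDDR6)
print(b580, rx7600xt, b580 / rx7600xt)  # ratio ~1.58, i.e. the "~55%+" mentioned above
```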
Each time I have upgraded my graphics card, I have always ensured that there was a substantial increase in vRAM. So far 2GB>4GB>8GB>12GB. There is a reason the 6700XT was chosen over the RTX 3070 (or slower+price comparable 3060) as a step up from my GTX 1070.
my last 3 GPUs had 8 gb vram, should have upgraded the last time.
My journey looks kind of goofy as a PC builder for extra funds every now and again. I went from 3070 to 3080 to 5700xt currently. Swapping that 5700xt into another build and I’ll be on my trusty dusty 1060 until the new AMD cards drop.
Im going from a 3GB to a 16GB Possible 20
I went with the 3060 because the vast majority of my GPU time is spent on generative AI, and at the time AMD support was still a work in progress. It has improved massively, and I knew it would, but I didn't want to wait six months.
Big thanks for making this video!
These are the B580 insights we all wanted to see.
Me who is still playing games on 4 GB of VRAM 😅
I am on 2GB 😂🤣
@ACOnetwork 😂😂
I'm on 6gb 😂
Same and I will for the next 5-6 years
I'm on Vega 8, and will definitely upgrade to this. The only disadvantages of this card are availability and performance in old games.
I traded my 10GB 3080 for a 16GB 6900 because I was running out of VRAM, nvidia's corner cutting ruins a lot of their cards.
i traded my 4070 for a 7900 GRE for similar reasons.. or well rather i bought a GRE and gifted my brother the 4070
How about we stop licking the nuts of these GPU companies, and start imploring developers to actually start optimizing their games properly, instead of doing a shit job, and then slapping AI upscaling onto it after the fact? That would be a much better solution.
@@brando3342 yeah I hate the over-reliance of devs on upscaling, it's extremely disgusting.
@@brando3342 Sadly CONSOOOMERS actually buy nvidia for that slop they think it's magic.
@@brando3342 force GPU makers to make better cards at a fair price and force devs to optimise their games
My RX 590 from years ago has 8GB. It's crazy that the standard isn't at least 12GB by now.
You are helping Intel develop their cards at this point LOL
Keep doing it man, Intel is off to such a good start; they are our hope to bring balance to this market and make PC gaming doable again.
RTX 2060 12GB -> RTX 3060 12GB -> RTX 4060 8GB -> RTX 5060 8GB?
this is what happens when one corporation has a monopoly on the market
A VRAM cripple ends up being an FPS cripple sooner or later. Remember this, you 6, 8, 10 and 12GB Nvidia customers.
At the point when a behemoth company no longer cares whether a particular segment purchases its products or not, that basically grants it the license to do whatever it wants. Nvidia is going mad with prices, giving you less for more.
Nvidia’s strategy became known with the 40 series. Make the top product extremely powerful and jack the price up.
They "could" have used 3GB chips, they "could" have just made each tier's memory bus 32 bits wider, and with the extra bandwidth they "could" have used 4GB chips and cut the bus down 'up to' 50% and still been ahead. They didn't...
lol I get your point, but the 2060 had 6GB and the 2060 Super had 8GB. I think after the 30 series launched there was an extremely niche 2060 12GB variant, but almost no one has it.
Also, no, they couldn't have used 3GB chips, and they still can't despite wanting to; they'll have to wait about another 6 months, and it'll only be a thing on GDDR7. They absolutely should've just increased the memory bus like the 30 series instead, though.
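For anyone wondering how bus width ties into capacity: each 32-bit memory channel drives one GDDR chip, so capacity is roughly (bus width / 32) x GB per chip. A tiny sketch of the options being argued about here (chip densities are my assumption: 2GB being the common GDDR6 chip, 3GB only arriving with newer GDDR7):

```python
# VRAM capacity = number of 32-bit channels * GB per chip
def vram_gb(bus_width_bits, gb_per_chip):
    return (bus_width_bits // 32) * gb_per_chip

print(vram_gb(128, 2))  # 8 GB  -> what the 4060 actually shipped with
print(vram_gb(160, 2))  # 10 GB -> "just make the bus 32 bits wider"
print(vram_gb(128, 3))  # 12 GB -> 3GB chips on the same bus (a GDDR7-only option)
```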
I'm so glad that I was able to return my 4060 Ti 8GB for a 16GB one. Gave me all the longevity I'd need.
They should just optimize the games
And then they'd actually have to work? No, we live in a society of lazy people.
There's no need for this: turn on the upscaler and frame generation. Monster Hunter Wilds literally writes this in their system requirements; even for 1080p on medium settings they say to enable frame generation. This is the future. People thought they'd use these features to go from 1080p to 1440p with a $300-range GPU instead of a $700 one like the 4070; little did they know they'd need them just for 1080p. And we warned that this would happen, because it makes devs get lazy and skip optimization.
that would require the publisher to not rush out the game as fast as possible, and we can't have that now can we?
And subject the devs to the inhumane punishment of **checks notes** Doing their job?
"optimize" = polishing a tird with wax. Well guess what, it still stinks
Me playing CS2 on 1280x960 with my 4060: "Heh.. plebs."
The perfect resolution.
Cope harder
Cheap 1080p displays have dropped below $50, so try that one day 😁
Why are you playing at such a low resolution on a 4060?
@@ogolow570 "stretched res", it's some tryhard BS tbh
I love the no upscaling testing
IKR
Yeah, upscaling should be considered a bonus not baseline.
I hate it when people use the DLSS argument to buy nvidia
... ESPECIALLY since AMD is faster, cheaper, and has more VRAM
Because it's a legit argument.
For budget options Intel might be the best choice, but if you're wanting to buy a higher-priced card the only other option is AMD, which is similarly priced without DLSS, has far, far worse FG and RT performance, and is much worse in workflow tasks.
Me when I buy my graphics card for the fancy motion blur. No I am not an idiot I swear.
@JasonAtlas What are you referring to here? DLSS is fancy motion blur? Or did I catch you wrong
That just about sums it up, yes.
You can play a game with low settings and high/ultra textures, and you'll get more fps than with medium/high settings, and it will look better most of the time.
The B580 card is fantastic, but let's wait and see if it's reliable first, don't let them fool us like they did with their processors...
Congrats on 100K! I know I'm late in saying this, but better late than never! 👊
Vex, here is something no other YouTubers have done yet.
B580 has PCIE 4.0 x8. Can you test this card in a PCIE 3.0 system and post results? AMD would be easiest. Just get an a520, B450, X370, B350, or A320 motherboard for PCIE 3.0. PCIE 4.0 can be x570 or B550.
It already got tested and the performance is pretty much the same.
@@Honigball Tell me where.
@@Ale-ch7xx because even the 4090 barely needs 4.0 x16
All its competition does too, though, so it doesn't really matter on a relative scale.
@@Uthleber I'll list off one game.
The 4060 Ti in Far Cry 6. You notice more than a 2% FPS difference. See Hardware Unboxed's results when they put the 4060 Ti on PCIe 3.0.
The question is whether I'll buy the Arc B580 or spend the extra money on an RX 7700 XT to have that full x16.
I prefer to spend less money and not take a hit in performance.
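For context on the x8 worry, here are rough one-direction link numbers (approximate; real-world protocol overhead shaves a bit off, so treat these as ballpark figures):

```python
# Approximate one-direction PCIe bandwidth per lane, in GB/s
def pcie_gbs(gen, lanes):
    per_lane = {3: 0.985, 4: 1.969, 5: 3.938}[gen]
    return per_lane * lanes

print(f"Gen4 x8:  {pcie_gbs(4, 8):.1f} GB/s")   # ~15.8 GB/s (B580 on a modern board)
print(f"Gen3 x8:  {pcie_gbs(3, 8):.1f} GB/s")   # ~7.9 GB/s  (B580 on a PCIe 3.0 board)
print(f"Gen3 x16: {pcie_gbs(3, 16):.1f} GB/s")  # ~15.8 GB/s (a full x16 card on the same board)
```

So on a PCIe 3.0 board an x8 card gets roughly half the link of an x16 card, which mostly only matters once a game spills past VRAM, which is exactly the 4060 Ti / Far Cry 6 case mentioned above.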
1:29 “exasturbated” 😂 you may have created a new word there!
The less VRAM you buy... the more RTX/DLSS you get. -- Some CEO's math probably
Man I just bought a used 4060, you are not making me feel better about it... 😂
bro same I just bought a 3060ti like 2 months ago 😭🙏
Same 🙂
Same XD, I just bought an RTX 4060 for my first PC (on offer, for sure).
@jesusayalaherrera4304
Me too, I found it cheaper than the 3060.
@@S10hype nice bro, I think I will have my card for at least 2 more years; where I live it's very hard to sell something like a graphics card.
Aye I’m rocking the rx 6750 xt. I plan on upgrading to a 7900xtx next year real soon. Amd has treated me well for a while now.
Okay, don't forget that CoD BO6 uses 10GB there, but I play at 1080p high, a bit near ultra but mostly high, and I run it at 100-144 fps while having Discord open and other stuff, on an RTX 3070, which has 8GB of VRAM.
RTX 5060 drops with 8GB VRAM 🤡
Don't buy then
@@goregejones7248 Did not, will not. I had the money to buy a 4070 Ti, which in my country costs like $1000 btw lol. I was thinking about it, but instead built a whole a$$ PC for my sisters for a little more than half the cost: AM5, 32 gigs of RAM, 1TB Gen 4 SSD, R5 7600, 6750 XT, 750W Gold PSU, and a decent mobo with good VRMs too. I only found those discounts because people aren't buying non-Nvidia or Intel cards in my country lol. I'm still rocking an RX 580.
@@goregejones7248 You say that as if he or most people here are going to buy it.
smh I literally have a 2060 12GB, and no, I'm not buying shit; this 12GB can last me 10 years.
@@urnoob5528 You don't know how gamers are. I'm sure the RTX 5060 will be purchased by many people; look at the RTX 4060, which was criticised for its VRAM and price, yet on the Steam hardware survey it's in 4th position.
@@goregejones7248 The people who say that can't buy anything; they're digital paupers with 2-gig AMD APUs XP
"if you wanna use 4k textures..."
Cool, I don't give a shit about new games, but I hope this causes Nvidia and AMD to lose ground and start actually putting in some fucking effort.
New games aren't worth playing, let alone spending thousands on lol
That Stalker stutter brought me back to 90s me playing Doom on an underpowered laptop.
Well yeah, those benchmarks are at 1440p native high; when you're using a 1080p card, of course the performance is lackluster. 0:10
Yeah Jensen you're right 😂
There's a RTX 5050 chance we might get 6 gigabytes of VRAM in the future.
🤣🤣
We had 6GB on a 60-class card in 2016. 9 years later we've still only got an 8GB 60-class card with the 5060. 16GB should be the minimum VRAM nowadays; 8GB should only be for 50-class cards or lower. It's not like people still game on 8GB of system RAM anymore either, because it's just not enough.
We have gotten to the point where companies are purposely trying to take us backwards, but the price.. that only goes forwards..
Congrats on your channel growing and thanks for sharing your info without dragging out the watch length.
0:01 broo keeep it down my GTX1660 will hear u
AMD should take notes from Intel on how to make the best affordable graphics card. And as for Nvidia... they will never LMAO
And yet NV dominates sales at this (and every) price point. Funny, that.
Even if AMD releases a decent affordable card, people will just buy Nvidia sadly. The RX 6600 is one of the best budget cards, and it doesn't even reach 1% on the Steam survey 😢
AMD is too busy striving to be 2nd best, soon to be 3rd the way things are going.
why would Nvidia learn when Steam's most popular GPU is the RTX 3060 and RTX 4060 lol..
@@schikey2076 because they've got the best price-to-performance outside the US. The world doesn't revolve around the States.
Evildia's plan will still work; people are drawn to the green eye logo, and it has become the tradition to spend a lot of money on Evildia.
rtx 5060 will be 8gb and it will be the best selling gpu.
Nvidia has 88% AIB (MSI, Asus, etc.) market share, so of course this is what will happen. Normies know nothing about VRAM and so on; Nvidia is popular, the x060 is cheap and can do 1080p, which is the most used resolution, so the 5060 will be the best-selling GPU.
I wouldn't say they're dead. 8GB of VRAM should be used for low-profile SFF-type GPUs at this point.
Love your videos! Grats on passing the 100k milestone!
This just might pressure Nvidia and AMD into offering more VRAM, or it might do nothing at all.
Less than 12h ago, updates suggested the 5060 will still be at 8GB, while the 5060 Ti might be 8 or 16.
soo... yeaa
Nvidia doesn't give a fak because idiots will buy Nvidia regardless. Of course I'm talking about the majority, the normies who do 0% research and buy what's popular = Nvidia. And unfortunately Nvidia knows this; they see the sales, the 4060 is the best-selling new GPU, and it's 8GB, so why increase the VRAM? No need.
@@Made1nGreece
Yeah, from a business perspective, why improve your products when you know very well that people are going to buy them anyway?
Yeah, but isn't that because modern games have bad optimisation?
These anti-8GB VRAM people only play games that need a 5090 in order to run them properly.
@@FrostclawTheGatomon bro wtf is an anti-8gb person 💀
@@daddysamosa I call them that because they keep whining about 8GB of VRAM when most AA and Indie games can run on potatoes, including THAT graphics card.
@@daddysamosa 8GB of VRAM is not bad.
You only need a decent computer to play the new Indiana Jones game.
8gb is dog shit in nearly 2025 @@FrostclawTheGatomon
The VRAM limitation is NOT a problem with the hardware. It is a failure of the game developers. None of them are building these games on that iBuyPower PC you bought for $1500 with the RTX badge visible through the tempered glass. Chase them if you like; you'll never catch up.
Not joking... it is true. 8GB is going to be a problem.
12GB should be the LEAST for that level of card. Midrange cards need 12GB, or better 16GB, because games will use it!
I got an RTX 3060 a few years ago with, yes, 12GB. If I'd spent a little more money I might have gotten a 4060 8GB, but then I would hit that memory wall. So I'd rather have 10fps less but have it smooooooth than 10fps more and stuttering.
Not everyone plays AAA garbage. I play single player strategy and indie games. My 3070 is more than enough for this requirement at 1440p, 60fps, on high settings.
They're just people forcing companies to have more VRAM for their precious AAA garbage games.
Good show dude. I appreciated the card comparisons. Congrats on your subs.
Bought my B580 - though the Sparkle Titan OC, since the LE is "unobtainium" here in Poland - literally 30 minutes after the reviews came out.
I've been using it since Friday and I am damn happy with it. It is a great upgrade over my A770 LE, and I have been daily-driving 1440p since Alchemist, so you can believe me when I say that yes, the B580 is a really great card, and for that price it is just a no-brainer.
I'm super happy for Intel. Their GPU division clearly showed their passion and their hard work, and made a great card. If the corpos don't fuck it up, Intel has the potential to grab a ton of market share just by making better cards affordable. Something AMD could have done for the last 5-8 years.
Sucks to suck AMD.
I spent 300€ on a 2070 with 8gb... 4 years ago lol. this is just unacceptable
I'm also seeing a very bad flaw in game design where they resource-hog the daylights out of machines. With all the wonderful advancements achieved in the tech industry, there are so many areas that have been left neglected. I've had to open up the Nvidia Control Panel just to use the "Sysmem Fallback Policy" to get extra VRAM, at a performance cost. I also need to make a note: Intel makes CPUs and now GPUs, and I'm wondering if Intel put CPU design ideas into the GPU's architecture?
Wait, wait, He's standing now, He's serious!!?? 0:18
Nice work as always man! Cheers from Argentina
+15% performance for +33% power..
EH. EEEEEEEHHHHH.
I'm more impressed by the base performance. It's rather efficient at that point.
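If those two figures are right (treating the +15% performance and +33% power quoted above as given), the efficiency hit works out like this:

```python
# Perf-per-watt change for +15% performance at +33% power (figures assumed from the comment above)
perf_gain = 1.15
power_gain = 1.33
print(f"perf/W: {perf_gain / power_gain:.2f}x")  # ~0.86x, i.e. roughly 14% worse efficiency at the OC point
```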
Intel's B580 is a fair attempt at a second generation product: it pretty much crushes the old A770, outside of a few exceptions, which I'm sure drivers will fix.
An eventual B770 might be a real beast, but would also probably cost closer to $499; given Intel needs good press, though, they might also sell those at a loss at $399 or possibly less.
If they lay a smackdown, even if it means losing money, it's an absolute publicity stunt and means Celestial will be priced higher, but also offer good performance.
If they manage another +15% or +20% (preferably) performance on the next gen while keeping it cool and efficient, a $300 price might be acceptable for a C580.
Then again, the B580 still has some driver issues: I expect a lot of them to keep improving.
Either way, Intel made a damn banger on the second go. Unlike their CPUs, this is actually exciting.
I'm on a 7800 XT, so I'm fine, but like god damn, I'm excited for a 3-way showdown when the next generation (Celestial vs 6000 vs 9000) rolls around.
Killed by intel is crazy
Imagine playing those new unoptimized walking simulators without actual gameplay or shallow RPGs. That's why I am replaying older classics recently with my RTX 3060ti.
Ngreedia will release a 5060 with 8GB VRAM, and the price, I think, will be $400. This card will have performance around the 4060 Ti/3070 to 3070 Ti. In other countries add 20% tax and profit margin, not to mention OEM versions will be 10% more expensive, so we will have a 5060 card for $480-520? WTF, for 8GB of VRAM on a 128-bit bus?
They be telling us 1080p is dead since 2018. Clowns.
Thanks for making this great review. The B580 is currently at the top of my list for my next build.
I don't know where everyone gets the idea that 4060 cards only come with 8GB of RAM, though. I've seen 12GB versions and Gigabyte even make a 16GB card.
"Gigabyte GeForce RTX 4060 Ti WINDFORCE OC 16G"
Sorry can't hear you over my 20 GIGABYTES OF VRAM
AMD GANG
Most of AMD gang don't have 20gb though.
@@cz5836 Realistically most of them have 8GB RX 480s/580s at the super-low end, then 12GB 6750 XTs at the low end (because who buys a 7600 when this exists), and 16GB 6000-series cards in the mid-range.
If it's a question of VRAM you can just buy a Nvidia Tesla GPU.
But you have to use FSR which basically gives you $400 console style visuals. Gross.
Can't hear you over my 24GB of VRAM, soon to be 5090 32GB of VRAM and 2500Watt PSU in January 2025 because I'm gonna sell my left kidney and soul for it.
2:34 The game wants 10GB of VRAM, but there is no performance difference between the 8GB and 16GB cards? So the VRAM amount doesn't matter?
you can also live with one arm or one leg.
@@Uthleber boom
Exactly. If you don't have enough vram then it will just use system ram instead. PC gamers can really be some crybabies lmao
The game doesn't want 10GB of VRAM; the memory manager allocated 10GB of VRAM for that game because that is what was available...
The game might only be using 4, 5 or 6 GB of the VRAM that it reserved.
@@StricklandLifts Your knowledge is so low it's unreal. If you think RAM and VRAM are the same and give the same performance and same speed, then I don't know what to tell you.
You can have 64GB of RAM with an RTX 3070 8GB; watch how the game will stutter if you run out of VRAM.
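If anyone wants to watch this on their own card, here's a minimal sketch using pynvml (NVIDIA only, and it assumes the nvidia-ml-py/pynvml package is installed). Keep in mind it reports driver-level allocation, which, as pointed out above, is not the same as the VRAM a game is actively touching each frame:

```python
# Minimal VRAM-allocation monitor (NVIDIA only).
# Note: this shows driver-level *allocation*, not the working set a game
# actually touches each frame, so allocated != actively used.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"VRAM allocated: {used_gb:.1f} / {total_gb:.1f} GB")
        time.sleep(2)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```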
This is why I went back to AMD with a 20GB RX 7900 XT for $620. It has a lot more worth to it.
Who killed them? Lazy damned coders. You wouldn't even need 4 GB with the latest titles if they actually practiced some restraint. They throw crap in there that doesn't need to be there.
This is pure cope. Big open worlds with high resolution textures will easily take up more than 4 GB. I'm sure you're probably cool with 8 GB of DDR3 RAM too then? Times change man
if 8gb of vram is dead then what do i do with my 2gb of vram
You're talking shit.
I was a big PC nerd when I was a teen, but honestly it's gotten exhausting. I'm broke as and can't even afford to upgrade in the first place. I've been running a laptop with a GTX 1650 since 2019, and every time I see some new tech news I go 'oh, good for them I guess'. It's wild to me how stupidly fast computing is moving, but at the same time I find it's making people lazier, having an audience with such abundant graphical and processing power.
Unoptimized slop is not a good way to benchmark cards. The developers being lazy at optimisation is the real problem, not the VRAM or the cards general performance.
8GB is perfectly fine in non-AAA games; it's not fine in AAA games because the devs don't know how to optimise their games anymore. They just think higher-resolution textures = better, but that's simply not the case: no one is going to notice your 4K bark textures in a third-person game, no one is looking that close or cares. 4K textures in general are pretty useless; they look almost the same as HD textures, since the human eye simply can't discern something like that at a glance, which is how players will be looking at the environments - they're not going to be watching paint dry, so to speak.
Most indie developers are better at optimisation than AAA studios. Look at Deep Rock Galactic: that can be played on a 2GB VRAM card from like 12 years ago. Photorealism doesn't = better either; in fact it can make a game worse, since people play games to have fun and escape reality. Not every game needs to be super realistic looking, that's just dull, and photorealism hogs lots of performance.
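Rough numbers on why texture resolution eats VRAM so fast, for anyone curious. Assumptions: RGBA8 at 4 bytes per texel uncompressed, BC7 at 1 byte per texel, and a full mip chain adding roughly a third on top:

```python
# Rough VRAM cost of a single texture at different resolutions.
def texture_mib(side_px, bytes_per_texel, mipmaps=True):
    base = side_px * side_px * bytes_per_texel
    total = base * 4 / 3 if mipmaps else base  # full mip chain ~= +1/3
    return total / 2**20

for side in (2048, 4096):
    print(f"{side}x{side}: "
          f"uncompressed ~{texture_mib(side, 4):.0f} MiB, "
          f"BC7 ~{texture_mib(side, 1):.0f} MiB")
# 2048x2048: uncompressed ~21 MiB, BC7 ~5 MiB
# 4096x4096: uncompressed ~85 MiB, BC7 ~21 MiB
```

Point being, each step up in resolution roughly quadruples the memory per texture, which is why ultra texture packs blow past 8GB so quickly.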
Sorry, can't hear you over my 12 gigs of vram, ahhhhhhahahahahahhhaha
Keep crying, textures are the most important component to game visuals that don't affect performance.
@@MrEditorsSideKick shut your stinker, it really is an optimization issue.
If 8 GB is enough then how about 4 GB? 2 GB? Should 512 MB GPUs be capable of pathtracing?
@@ypro3102 Funny thing is that a number of them probably are capable of it, just not fast.
that argument you made was dumb anyway
This doesn't matter; the 5000 series will sell out in all tiers, because people are sheep-minded and Nvidia today is exactly like Apple: it's all about the brand, not the product. People are brain-dead set on 'ooh, new Nvidia GPUs, I have to buy them because everyone else does the same!'
Even though AMD has better raw performance at a lower cost with the best GPU software ever made, and the Intel Arc B580 is actually a very good entry-level/mid-range card.
are you including the 50-100% scalper mark up in the price?
Too bad the B580 is a paper launch with no stock in sight
I think Intel is testing the market with the B580, because if they want to make a profit they have to sell a lot of B580s at that price. If the supply were too large and the B580 got criticized, it would be a big loss for them; that's why B580 production isn't high. The good thing is that everyone likes it, but because of the low supply everything outside the US is marked up quite a bit over MSRP.
@ There are none available in the USA either. They lose money on every B580; selling more will lose more. They have made a dent in the market, and it's a big improvement, but it's a paper launch.
@@tin9759 Bro, the GPU sucks. If I'm building a $1k desktop PC I don't care if the GPU is $250 or $350, it's within range, and I'd rather get a worse case, power supply, less RAM, etc., since the GPU is the main component that drives performance; yet this one is competing with older-gen GPUs and is weak as hell.
Intel should've released a better-performing card at $300-350, more in line with 4070 performance. This GPU will flop imo.
rtx 3060 is the last good gpu from nvidia
I would add the RTX 4070, the GDDR6X version, for around $500.
1080ti was
You mean 1080ti
What Nvidia released AFTER the GTX 1080 Ti was complete garbage! Period.
No, the 3060 Ti too. Only 8GB, but so fast for the price; it literally beat the 80-class Super card from the last gen. Imagine if the upcoming 5060 Ti beat the 4080 Super, it would be unfathomable.
I just ordered ASRock Arc A770 Phantom Gaming 16GB 😅 been dying to get one to play around with it 😂 hopefully B770 comes out too so I can get that too 😅 ( I already have a 7900XTX )
0:15 BRO HERE IN INDIA IT IS $447.46 WHAT WOULD YOU SAY !
import and export moment from EndVida
Import taxes and freight 🤯
@@MadarDoraeMon import and export moment 🥲🫡
I'd be happier if someone just robs my house and steals $200 worth of things than buying such a bad GPU for THAT high of a price IMO.
@@nathanhayball Freight? Last I checked India is closer to China than to the US… or do you actually think these GPUs are assembled in the US 😂