I believe this is the ideal channel to test a GPU like this.
Always will be, mate. He's been our budget PC part YouTuber for quite a while now.
Not really. Europeans pay more for certain GPUs, so their view of performance per dollar is skewed and creates a bias.
@@Josh-cw8by That's the easiest factor to adjust for yourself though. He never focuses on the specific price, just the price bracket. That's what normal people do as well.
@@Josh-cw8by Yes, biased by reality, unlike many US reviewers who get biased by... money.
Thank you, that means a lot :)
Intel pushing 12GB of VRAM on the more budget side of cards is so nice, and it's a beautiful bonus on top of its very capable raster and RT performance... hopefully they arrive in Argentina one day so I can get one...
ha, same
_meanwhile, at Nvidia, currently preparing the announcement of the 5060 8GB_
...shit.
I hope you can find one soon!
Intel and AMD understand the power of 12GB of memory.
Nvidia, though, will give you an RTX 5060 with... 8GB on a 128-bit memory bus???
Last time I went to Mendoza the PS5 was double the price it is in Chile lol
I don't think it would be profitable for official importers to bring that card in.
Yes, the hottest GPU on the planet right now - in a certain familiar British back garden! Thanks Steve!
Thanks for watching!
it's also the most scalped GPU
@@RandomGaminginHD Hey, great videos. I was wondering about these cards, and they seem really nice, especially with lots of VRAM for futureproofing. One thing to suggest with your testing: try the exact same runs as usual but bump textures and AF to max instead of medium/high or 8x, since there's plenty of VRAM to spare and it won't negatively affect FPS. Keep up the good work, hope to see more videos on Intel's new cards, especially with the new upscaling they're bringing out soon.
Meanwhile the 4090 prices are scalding LOL
The day an Intel GPU is the hottest GPU in the news feels like I woke up in an alternate reality. Hell yeah though, we need more players in the game so we don't get more 8GB of VRAM in $1000 cards.
I watched Steve's review of the B580, then watched Steve's review, and now I'm watching Steve's review.
Hmm, maybe I should buy one and review it.
Thanks Steve
Haha, in which order? I watched the Australian Steve first, then the American one.
@@Voyajer. Back to you, Steve.
Thanks
Is it just me or does he sound more enthusiastic and energetic in this video than in the others?
These new releases get me excited haha
I mean, it's a fun and different kind of release.
It's finally an exciting GPU.
The last ones were HD7000, Pascal, Polaris, RX5700, and RX6600.
100% this is pushed sponsored shit.
@@marisbarkans9251 Why would you say that? This is a genuinely exciting card.
Now that it's becoming viable to run a full blue system, hopefully green starts making CPUs, and you'll have 3 choices: red, green and blue. That will unlock the full potential of RGB lighting.
And some blue screens
@@wrench_in Joke's on you, I run SteamOS.
Dude, I got a full green system. Athlon 64 X2, Nvidia 9600 GT.
@masterkamen371 I have an 8800 GTS 768 MB 😎
8 cores, 4 threads at 5.7GHz for 500 bucks
The old Escort pun cracked me up! 😂
cheeky 😉
To be fair, those are still quite nice rides 😊😅
or as some might say... the Cozeh.
Intel truly is Schrodinger's company.
When you get to these large companies, each of the Business Units is so vast that they are effectively separate companies. Intel's Networking BU is about the same size as AMD's whole Processor Group, for example (which is itself a huge division). Intel's CPUs are clearly struggling in games now, but remember that gaming corresponds to about 15% of their CPU sales, or about 3% of their overall sales. Their GPUs are literally a rounding error, which may be a good thing for escaping the cuts that are coming Intel's way.
Intel's Arc department and Core department are effectively separate entities under the same corporation.
You can't even find these GPUs anywhere. So I think we have our answer.
@@peterwstacey Just like that lawsuit between Sony Entertainment and Sony Music. Funny as it may seem from the outside, both are distinct enough to be not just separate companies but separate conglomerates in their own right, under the bigger conglomerate wearing a Sony trench coat.
Its because intel gpu department head engineers are indians, if it was americans they would spend their funds on dei trainings
The fact that being a budget gamer even gives you the option of 60fps on Black Myth Wukong is certainly appreciated
Thanks for actually testing older games, it helps to see them.
Especially when some are still intense or popular
Yeah, I play old games, I need to see the performance in those too.
The other tech channels barely talk about the DX10 stuff
Thank god this one did
YES! Finally!
Somebody is addressing it!
PRAIZE JEBUS!
It's basically like this: anything that applies to DX11 applies to DX10. It's not an entirely different API like DX9, DX12 or Vulkan;
DX10 and DX11 share a common base.
Yes, some people actually want to play some older games too.
I like to mess around with various GPUs / CPUs. Lately, I've been playing around with Chinese domestic stuff, Zhaoxin and Innosilicon. The Zhaoxin c960 and Innosilicon Fantasy No.2 are fine at desktop stuff and can even run some DX11 games/benchmarks.
However, I always end up playing BF 1942, which is DirectX 8. 😅
Take my thumbs up now.
Haha thanks
Great move adding Indiana Jones and Stalker 2; it'll help with the YT algorithm and engagement. Since yesterday I've searched for a benchmark but could only find a spreadsheet.
I really enjoy both of the games too so I'll keep them in the benchmarks for a while :)
@RandomGaminginHD thank you! and highly appreciated!
Thank you for mentioning how quiet the card is and, even more importantly, that there is no noticeable coil whine. I mostly dread coil whine when getting a new card, so that info was most welcome.
Last card I had with coil whine was an AMD R7 360. The drivers were so bad anyway that I have no trust in AMD drivers ever being free of serious reliability issues.
Would be great to see a similar depth of testing on PCIe Gen 3, as at this price, it's really going to appeal to those running older hardware
Ah good point I may run a few tests with rebar disabled too
@RandomGaminginHD Great idea, a bit of A/B testing. It was basically mandatory on the first-gen cards, wasn't it? 👌 Great video as always, by the way.
No ReBAR might be problematic on older systems though.
@06dpa It would; I think that's the point, to see how problematic it is and whether it's better to stick to Gen 3 GPUs and ones that don't need ReBAR until people can upgrade their systems.
I'd be curious about gen 3 as well, although I do have rebar enabled on my system, old as it is.
So glad Intel came out with something super competitive, competition is key right now with Nvidia having outrageous prices and insufficient VRAM in cards.
Currently running an RTX 2070, I haven't felt the need to upgrade, but this card is really intriguing and has me considering it. I don't necessarily think it will be the biggest uplift in raw performance in games, but at $250 it's a very good chance to get my foot in the door on some newer hardware and get myself 12GB of VRAM - as someone who's been enjoying the new Indiana Jones game, I'm finally starting to see the need for more VRAM lol. Great video as always! 6:02
Compare prices to RX 6750 XT before you do this upgrade.
This card scales really well to 1440p, but what got me excited is the higher-end GPUs Intel has coming.
I don't think it's a worthwhile upgrade, unless you can get a decent price for your 2070. But idk, I'd go for a bigger jump than that.
Honestly try to save up and go for a 1440p card
If you're upgrading, upgrading to a better experience is always good
I upgraded from a 2070 to a 4070, eventually decided I didn't like the 4070 (and after ASUS tried to dodge out of a warranty claim over coil whine and fan noise), gave it to my brother, and replaced it with a 7900 GRE. I saw a big uplift with either card, but the GRE, after overclocking, saw a bigger uplift because I mostly play rasterized games. It was about 83% from the 2070 to the 4070 and probably around 102% from the 2070 to the GRE. Depends what you want from your games, really. I'd tell you to wait until next gen, but we don't know what the prices will be like. All I'll say is I'm happy with my GRE, and my brother's happy with the 4070.
This card is sold out in South Africa and I would love to get my hands on it. It has everything a player needs: good performance, good VRAM, not too much power consumption, stable drivers from what I've seen so far, and a good price.
What's the price in rand?
@@User1-b6m The card in rand at Evetech (manual search) is R6700. The RTX 4060 costs the same and the RX 7600 also costs the same.
@@User1-b6m Looks like Evetech has them in stock at time of reply. R6.5k for dual fan and R7k for 3 fan one.
It's out of stock everywhere on the planet right now... hopefully it will get back on the shelves soon.
Perfect, I trust your opinion more than any overhyped youtuber. Looking forward to this.
Thank you that means a lot :)
Also as an FYI, GTA 5 has real engine troubles past 180 FPS and introduces stutter. Might be why you're not seeing full GPU utilisation.
I think it's 187 fps, and then the fps craters to the low teens before recovering and repeating.
But in his footage the fps didn't drop and stuck at 187-188, which is why utilization is in the 50s.
@@TheJaffaMeme Yeah, better off turning all the advanced settings up to max and getting 150-160 or whatever fps you get.
Pretty much. Of all the benchmark channels I've come across, this guy, zworms, and budget builds official are my go-to channels for everything PC benchmarking.
Great coverage of the card, definitely one of the biggest "budget" graphics card releases in a long time. Planning on using it in my main gaming/content creation rig; I wish I could even buy it right now 😭 It literally went out of stock in the States in a minute or so and hasn't come back in stock yet. Newegg is kind of the only viable option for it at the moment, Gunnir models are the only ones available for around $400, and I'm not gonna support those prices lol, so just playing the waiting game now. Most US models will come with AC Shadows, which I was going to buy anyway, so that helps "offset" the cost of the card, or makes it a better value, however you want to look at it.
Good luck with your search I hope you find one (at a reasonable price) soon!
Love the look of this card, so simple and clean.
Great to see a midrange card with adequate VRAM at an affordable $249 again, just like the GTX 1060 6GB in 2016. This can definitely ace the Steam charts if the driver issues are handled for newer titles in the immediate future.
For a 6700XT owner though, nothing to see here, except that competition is good. This should have been the MSRP of 6700XT in an ideal world at launch if the industry was not spoiled by NVIDIA with $1200 for 2080 Ti.
6800 XT owner here. I'm enjoying the fanfare. If I didn't already have a GPU, I'd grab a B580 easily.
Well, gamers only have themselves to blame for paying those ridiculous prices and creating a new normal. Same goes for pre-ordering games. It is destroying the industry, because developers who have been paid in advance don't have any motivation to optimize their games.
The use of RT as the only lighting system in the Indy game was a decision made out of pure laziness. It was an awful business strategy because it gave the finger to half the people who wanted to play it. In the real world, most people can't just go spend $600 on a new GPU at the drop of a hat. It impacts their sales tremendously, and I am hoping that, with the $300m price tag and the steep hardware requirements, they won't make their money back, because the industry needs a high-profile example of what happens when you alienate a huge portion of your customer base.
It is too bad that Nintendo failed to learn from Metallica's mistakes. They sued people for piracy and hardly sold any albums for 20 years, effectively getting canceled. The problem with the Nintendo issue is that their games are as addictive as crack cocaine.
New PC builder here. I'm seeing a good refurbished 6800 XT below the price of the B580 in my country. Since you own a 6800 XT, should I go for it?
In Germany they cost 335€.
Way too expensive ☹️
@@Lurch-Bot Nintendo can go smd after the stunt they pulled with Palworld.
Also, GPU prices being ridiculously expensive cannot be pinned on gamers alone. If you look at the Steam hardware surveys, you can see that most people do not own a higher-end GPU, so clearly gamers are not buying $1000+ GPUs en masse. Don't forget that we are in the middle of an AI and machine learning boom, and these Nvidia cards are also valued for their capabilities beyond just pure gaming.
I would appreciate a 1080p vs 1440p on this card. From other reviews it scales extremely well.
Doubt we'll see 1440p lol, he's still terrified of using MSAA in GTA V "because it'll hurt performance" while running it at 175fps. Like yeah, I wanna see what this card can do lol.
"Let's cut to me inside an old escort...hmm probably should rephrase"
6:01
Has this even released yet? I can't find any stock of it anywhere?! Great vid as usual!
Yeah but they seem to be disappearing fast
There seems to be pretty high demand for it, it's sold out everywhere despite retailers saying they had good supply. Intel finally nailed it.
Intel says their stocks are replenished weekly
8:23 Half-Life 2 (and most other Source engine games) actually has a framerate cap of 300FPS. I may be wrong about this next part, but I BELIEVE it's done this way because past 300fps the tickrate starts becoming an issue, and physics, as well as player movement, start to become very wonky. Nearly every single modern system, with the right settings, can hit that 300FPS cap in HL2.
Unrelated anecdote, though: Intel Arc in Half-Life 2 doesn't currently support DirectX 9.0c (or DirectX 9.0+, or DirectX 9 Shader Model 3) like AMD and Nvidia do, so you don't get access to the improvements that came with Shader Model 3, or HDR rendering (not an issue for most of HL2, but it does affect the Episodes more substantially, since they were released after HDR was introduced in Day of Defeat: Source and Half-Life 2: Lost Coast).
Ah ok thanks for the info :)
Correct me if I'm wrong, but Intel Arc hardware only natively supports DX12 & Vulkan, while all older DX versions are supported via an emulation / translation layer?
@@kmieciu4ever I don't know much about how Battlemage handles it, but Alchemist definitely did handle everything else through translation, and I believe the same issue applied to HL2 on A-series cards.
Unless Valve changed the way HDR works in Source since the release of Lost Coast and the Episodes, they still use a very different means of producing HDR than most games.
Most games use high-precision floating point to render HDR, which requires a minimum of DX 9.0c (emphasis on the c) with 32-bit precision per subpixel to do properly. Valve wanted folks with older DX9 cards to be able to experience HDR as well (like my old X800XL from back in the day, which was capped at DX 9.0b and still 24 bits per subpixel precision like the original DX9 spec, and what I believe was DX 9.0a support on the FX 5000 series of Nvidia cards). So instead of using floating point data for HDR, Valve used integer data, and the results turned out pretty well in my opinion (the inside of the church with the stained glass windows towards the end of Lost Coast in particular). So yeah, the original (and maybe still current) method of rendering HDR in these Source-based HL games worked with DX9 Shader Model 2.0 and didn't require the Shader Model 3.0 precision available in DX 9.0c.
@@jtenorj Oh, really?! Wow, I didn't know that, but that makes so much sense. Thanks for the clarification!
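To make the float-vs-integer trade-off described above concrete, here is a minimal Python sketch. The exposure value and the simple scale-then-quantize encoding are illustrative assumptions, not Valve's actual shader math.

```python
import numpy as np

# Toy illustration of the two HDR storage strategies described above.
radiance = np.array([0.02, 0.5, 1.0, 4.0, 16.0])  # values above 1.0 are the "HDR" part

# Float path (DX 9.0c / SM3 era): store linear radiance in fp16 render targets
# and tone-map later; detail above 1.0 survives.
fp16_stored = radiance.astype(np.float16)

# Integer path (roughly the SM2-friendly trick): bake an exposure in while
# rendering, then quantize to 8-bit; anything outside the window clamps.
exposure = 0.25  # assumed value, purely for illustration
int8_stored = np.round(np.clip(radiance * exposure, 0.0, 1.0) * 255).astype(np.uint8)

print(fp16_stored)  # [ 0.02  0.5   1.    4.   16.  ]
print(int8_stored)  # [  1  32  64 255 255] -> 4.0 and 16.0 both saturate
```

The integer path keeps old SM2 hardware happy but clips anything outside the chosen exposure window, which is exactly why the float approach needs that 9.0c-class precision.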
I got an RX 7600 XT for $289 a week before this card released. For my real-world scenario - as Steve said, 1080p - the i3-12100, 16GB, and 7600 XT are absolutely crushing most games I've played lately.
Nice :)
Pretty sweet budget system you've got there. Fits together well.
There are way more games that are more than worthy to play through, than you could play during a lifetime, on that system of yours 😊.
I got a nice deal 7600XT/Ryzen 7 5800X 32GB a few weeks back.
Plays Doom Eternal in 4K60 Ultra settings like a breeze.
Plugged into my 65" TV, what a blast.
You should go for the i5-12400F; also a very cheap CPU but much better, with little price difference.
You'll definitely enjoy those 16 GB VRAM for years to come
I still remember this channel from the GTX 1060 Mini video years ago; it has grown so much and the quality is up as well. Keep going, you're the best.
I was waiting for your opinion since I know your tests are more equivalent to what I play daily.
I really like the blue of that card's shroud.
Ironically reminds me of those blue shrouds in those old Radeon cards that I also liked for their pleasingly monocolor simplicity.
Thanks for testing CS2!
This was a wonderful showcase of what the B580 can do! I thoroughly enjoyed watching this review over and above some of the other tech youtubers because you had a wider array of games tested which is exactly what I was hoping for! Thank you so much!
You could use max textures instead of presets thanks to the 12GB of VRAM.
You're just the best YouTuber for things like these. Your voice is soothing, the video is informative, plenty of games are tested, and you even test obscure cards. Love your content.
Thanks a ton for including a DX9 title, you're the first one to do so. It would be really interesting to see a comparison between the last-gen Intel card, the B580 and some close competitor like the 3060 Ti 12GB in those older titles.
3060ti is a league above and has 8gb. Think you mean 3060.
DX9 is still relevant because of the sheer number of people running legacy Skyrim.
@@rejectxz Yea and sadly, a lot of people sleep on the 3060 Ti Ventus 3X 8G OC
@@jeke8413 Raw-power-wise it's good, but the VRAM is limiting in newer games. I wouldn't buy an 8GB GPU today. At 1080p you might get away with it, though.
THANK YOU for running the BG3 benchmark in Act 3!!! So many other channels run it in Act 1 or even worse in the nautiloid. Act 3 performs like an entirely different game so this was really refreshing to see.
I think Intel has a winner with cards like this. If I were in the market for a new card, it would be near the top of my list.
At the moment, yes, but the 50 series and 8000 series are around the corner and will most likely beat this.
@@Qelyn Intel could still discount the cards to undercut them in price
How do you not get sick of testing GPUs all the time? You are really lucky to be able to focus on the same type of thing for so long without getting bored out of your mind. Great videos.
Judging by the reviews I’ve seen so far, it seems like Intel has mostly solved the hardware-level inefficiency problems found in their Arc lineup with Battlemage. I’m still holding out on getting the B580, though, because I’m unsure whether Linux performance would be adequate. Would you mind testing this card on Linux using older games and emulators to showcase its capabilities? I appreciate your hard work in providing straightforward gaming benchmarks.
Intel has fantastic Linux support (unlike Nvidia).
Thanks a lot man for testing DX9 games.
No one seems to care about the legacy game issues on Arc cards.
Love your videos, always 🖤
Very interesting card, close to 6700XT performance for £250 is amazing, I paid £399 for my 6700XT in January 2023 :)
Yeah this is slightly slower but impressive for the price :)
Outside the US market, a price of $250 is only a dream. The US market is not the whole world! For example, in Croatia or Austria the Arc B580 costs 350-370 euro!
Here in Italy I haven't found it for less than €390. With 4060s starting around €320 it makes very little sense here.
I paid £270 for mine, used in April this year. AMD and Nvidia really made the market (and prices) stagnant for years.... unbelievable.
Currently here I've seen the B580 at 250-280 GBP. I paid 250 for a 1660 Ti in 2019, so at that price point the card is nice if you can get it; anything more and you're better off searching for alternatives.
I’ve been waiting for your specific review on this card. The budget gaming king!
When both CPU and GPU utilization drop along with framerate, it means memory or system bus bandwidth is the bottleneck: there's too much data being transferred around for it to keep up, which leaves the CPU and GPU sitting there waiting for data. This isn't always a bandwidth issue though, because sometimes games/engines don't minimize data transfers as much as they could, particularly in geometry/texture streaming situations. If standing in the middle of the town in RDR2 just sits consistently at low framerates when everything is resident on the GPU, then it's almost surely memory bandwidth that's limiting things.
There are also state changes for rendering different aspects of the G-buffer and different materials, and I know how busy Rockstar's engine gets when rendering a frame. For those heavy deferred rendering engines a lower resolution usually helps a lot, and not just because you're cutting down on rasterization (though that helps too) but because it's not having to write to and read from as much G-buffer. A typical G-buffer in a deferred renderer can be dozens of megabytes, which must be written to and read from (sometimes multiple times) in a single frame, all while doing a bunch of other stuff like changing render state and caching different resources for different rendering commands.
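A quick back-of-envelope makes the scale of that G-buffer traffic concrete. The target count and pixel format below are assumptions for illustration, not RDR2's actual layout:

```python
# Rough estimate of per-frame G-buffer traffic in a deferred renderer.
width, height = 1920, 1080
targets = 4                # assumed: e.g. albedo, normals, material params, depth
bytes_per_pixel = 8        # assumed: RGBA16F-ish render targets

gbuffer_mb = width * height * targets * bytes_per_pixel / 1e6
print(f"G-buffer size: {gbuffer_mb:.0f} MB")                 # ~66 MB

# If it's written once and read twice per frame at 180 fps:
traffic_gbs = gbuffer_mb / 1000 * 3 * 180
print(f"G-buffer traffic alone: {traffic_gbs:.0f} GB/s")     # ~36 GB/s
```

And that is the G-buffer alone; textures, geometry and everything else compete for the same bus, which is why dropping resolution helps these engines so much.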
Hell yeah! Here's to hoping they keep in steady supply, I'll be waiting for it to hit the second hand market in a year or two.
G'day Random,
for any Aussies interested & anyone else wondering about comparing World Pricing down here B580s where I shop are...
intel LE $439AUD (PCCG Sold Out) - ASRock Challenger $449AUD (Scorptec) - Sparkle Titan OC (PCCG) & ASRock Steel Legend (Scorptec) $479AUD
[Edit] All Aussie List prices include our 10% GST (=Sales Tax)
I keep checking msy/umart every day. They have the listing, just not the options yet. I don't like the look of the asrocks, I'd prefer the intel or the sparkle one.
In Brazil we didn't even get listings for it 🥲
@@jilherme WOW! Thanks, Intel.
@@Retro-Iron11 Being cheaper but not looking cheap like the ASRock Challenger, I think the intel LE ones are gonna keep selling really well as soon as they're in stock.
intel has done well against the 4000 & 7000 series for perf/$$$; it will be interesting to see how Battlemage fares against the 5000 & 8000 series to see which is the best value then.
@@shaneeslick By the looks of it I may have to get the Steel Legend, as it seems the intel LE is sold out everywhere. $30 more, but you get more cooling, a physically larger heatsink and a higher boost clock, although benchmarks put that at maybe 1fps faster. I do like the RGB on/off switch on it. I may be changing my mind the more I look at these things. The ASRock Challenger is just outright gross, and nope.
Thank you for this video. All the major tech journalists just provided frame graphs for ultra 1440p and 4K, while this card is a spiritual successor to the 1060 and RX 580 and the new best 1080p bang-for-buck card.
Please do a test on a slightly older system with PCIe Gen 3 so we can see the difference between Gen 3 and Gen 4, since the card only uses 8 lanes and not 16.
Definitely something I wanna see.
I love your benchmark videos, thank you for the content
Would love to see this paired with your 7500F... Maybe you did in this video, I haven't made it that far yet. Man, talk about a value winner with those 2.
Oh for sure. Honestly, performance shouldn't be much different; the card will still reach its max potential with that CPU.
I was waiting for your video on it, definitely seems like a card you'd comfortably review
I would have probably got this card over my 4060 if I had the choice. Intel is doing good things in their graphics department.
They’ve come a long way over the last couple of years with their GPUs for sure
At least with the RTX 4060 I can turn on frame gen and have 90-120 frames in Stalker 2...
@@kmieciu4ever I'd wait till driver updates before saying anything final.
@@kmieciu4ever Intel's own frame gen is coming down the pipeline as well.
I got this too, with an i5-14600K, and I'm very happy with the performance. I already tested Forza Horizon 5, Halo: Infinite, Forza Motorsport, Starfield, Fallout 76, The Elder Scrolls Online, Age of Empires 4, Doom Eternal, Call of Duty: Black Ops 6, Wolfenstein 2 and Age of Mythology: Retold.
7:34 A quick side note: GTA 5 has an engine framerate cap around 185 fps, so it's normal to see lower utilization. Also, around that cap frametimes kinda go haywire, so either implement a framerate cap via RTSS or crank up the settings to avoid it 😄
The fps cap is actually 188FPS (set by RAGE's internal audio frame limiter), plus GTA 5 also has issues when going over 140FPS, with the game randomly stuttering in menus (because the audio has to be synced with the main threads). Both were fixed by a mod (GTAV.AudioFixes) which raises the FPS limit to 270FPS (limited by mission scripts).
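For anyone wondering what the RTSS-style cap suggested above actually does, here is a toy Python sketch of the idea. RTSS itself hooks the game's frame presentation; the 188 figure is just the cap discussed in this thread.

```python
import time

def run_capped(render_frame, cap_fps=188, frames=1000):
    """Toy frame limiter in the spirit of an external capper like RTSS.
    Illustrative only: we just sleep away whatever is left of each
    frame's time budget, which evens out frame pacing."""
    budget = 1.0 / cap_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                      # the game's work for one frame
        left = budget - (time.perf_counter() - start)
        if left > 0:
            time.sleep(left)                # burn leftover budget -> even pacing

run_capped(lambda: None, frames=10)         # trivial stand-in "game"
```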
Great video as always, very informative! One thing I'd like to point out: it seems you record your voice in post and lay it over the prerecorded benchmark footage, which is fine, but then you write the script as if you were doing it live, which throws me off, like at 11:56. I'm definitely being nitpicky here, but it's something I've noticed :) Reacting to your footage is fine imo, but perhaps playing it off as if it were live is too much for me.
Yeah… there's just no need to pretend you're speaking while playing, like, just why.
Man, upscaling has done untold damage to the gaming sector. It's just a crutch for companies to get their games to market faster while looking "better" than the last gen.
With the optimization efforts of 5-8 years ago, cards like this could pull 1440p, probably maxed settings.
I love that you test new games and older popular games.
A person who disables motion blur, depth of field (though this one depends heavily how it's used in the game), film grain and chromatic aberration is a man of culture.
Haha every time. All of those are switched off!
What about disabling anti-aliasing, especially TAA?
@@RandomGaminginHD Depth of field, chromatic aberration, motion blur and lens flare are the first things I disable.
Motion blur is good in some cases, especially per-object
@AntiGrieferGames Many modern games have TAA built in (as far back as Battlefield V) and it cannot be switched off, as certain other image processing depends on it. Some others let you switch it off, but it's obvious it's meant to be on, since the other AA options give an aliased mess (Resident Evil 3 Remake, for example). TAA is honestly fine as long as some kind of sharpening filter is applied internally or via a third party.
I was waiting for your test of this masterpiece, because you're the best in this category, my bro. Allah bless you.
Would love you to test this GPU with way older games, since I heard that's the main issue with Intel GPUs.
Seconding this; this GPU looks like great value but I can't commit if it's not playing nicely with my retro stuff :D
Intel made great improvements on those with the Alchemist cards; the biggest issues are with new titles nowadays.
This. I'm interested to see if it can run older games from the 360/PS3 era.
@@Tomazack Older games are still an issue imo. As another comment said, it doesn't support DirectX 9.0c, so in Half-Life 2 you don't see some options. For new games requiring DX12 and RT, Intel will be better, and Intel will proactively work on drivers for the latest games rather than older ones.
And maybe try some older games which are less mainstream. I'm thinking Intel may have done some bespoke work to make the most common older games run well, but won't do it for the obscure stuff.
What a great time to launch an actually affordable and good graphics card. The end of 2024 becomes better thanks to this.
Try overclocking it since you have a third-party one; an OC adds so much performance.
Hearing your voice brings me back my man! Found this video in my recommended. Turns out you've been uploading lol. Cheers mate
The B770 might be the card of the year.
That smiley on the arm at 0:59 indicates that budget gaming is back 😊
Literally the day the B580 reviews came out, I ordered my Sparkle Titan OC B580 instantly... have been running it since Friday - amazing card.
And it really is a great UPGRADE over the A770.
Should have waited for B770
@@fightnight14 Thing is, we don't know anything about the B770 - at least about a potential release.
Of course I could wait, but the B580 is already 22% better on average at 1440p than the A770 - so it is still an upgrade.
If the B770 is announced, I will save money for it - but I wanted to play it safe and bought what I wanted now.
Plus, let's be fair, when CES 2025 hits... we will have another scalper pandemic just like we had in 2020-2021...
As always, thank you for the video! Always appreciate your reviews for the "common" gamer.
PLEASE do a Kingdom Come: Deliverance 2 video with this card vs other GPUs in its class when it comes out!
Nearly first! Forever a faithful RGINHD viewer
That escort bit had me creasing 😂
How much did you pay for this card?
Because pricing in most online stores is north of $350. It's simply not worth it at that price.
Different countries have different economies, so the MSRP of a GPU doesn't really say much; if that's the case for you, you should get a used 6800 16GB as an alternative.
The reason this is such an exciting card is solely the VRAM and the performance; no other company would be generous enough to offer 12GB at 250.
12GB of VRAM at $350 is very compelling imo, especially in the games I play.
Really nice touch from Sparkle when they included that awesome looking sag bracket
You said you capped your i9 at 125 watts so it performs like a 14700K??? In what world does a 14700K draw 125 watts??? Mine draws 253 at the new microcode stock settings.
Nope, 125W base and 253W in boost. You can also mess around with that in the BIOS. Hard to tell if he has taken PL2 down to 125W; sounds strange, like you say.
@@FastTcX Wrong, 253 under high load. I should know, I have one.
@@freedomearthmoon1 That's under boost on the P-cores, but you can modify that value. I also had 253W, but changed to a 14600K, and even there you can modify it if it's a K version. In my BIOS they're called PL1 and PL2.
Pre-ordered the B580 here in Canada. Am very hyped; I got the Arc A750 and it killed my Windows install at launch, so I'm excited to see how far they have come.
2:28 yep, I always turn off depth of field
Been waiting for you to do this card
0:57 nice little drawn on smiley face :)
=)
Where?
@@yaltschuler Look at his wrist.
Nice Reviews, Subscribed.
0:22 that GPU is half the length of the heatsink
Get inside that Escort! LOL! Love your videos. I'm glad we're getting options now; reminds me of back in the early 2000s when we had multiple GPU manufacturers.
Question, why is the minimum fps higher than both the 1% and 0.1% lows? Shouldn't it be the global minimum of the benchmark session? As in
1% low = average of the lowest 1% fps,
0.1% low = average of the lowest 0.1% fps,
Minimum = minimum fps ever reached in the entirety of the benchmark session
Otherwise great video, your commentary is really nice.
Wow, that does seem quite weird. Wondering this too now that you've pointed it out.
I've been wondering about that too. This is my guess (maybe wrong):
1% and 0.1% lows - from the whole benchmark, take the slowest frame times, convert to fps (1000 / frame time in ms), and average.
Minimum fps - take the slowest frame time in every second of the benchmark, convert to fps, and average.
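One way to test that guess is to compute the metrics from raw frame times yourself. Below is a minimal Python sketch using common definitions; these are assumptions, since we don't know exactly how this overlay computes its numbers.

```python
import numpy as np

def benchmark_stats(frame_times_ms):
    """Average, 1% low, 0.1% low and minimum fps from per-frame times.
    One plausible set of definitions -- not necessarily what this
    particular capture tool does."""
    fps = 1000.0 / np.asarray(frame_times_ms)        # instantaneous fps per frame
    fps_sorted = np.sort(fps)
    n = len(fps_sorted)
    return {
        "avg":      fps.mean(),
        "1% low":   fps_sorted[: max(1, n // 100)].mean(),   # mean of slowest 1% of frames
        "0.1% low": fps_sorted[: max(1, n // 1000)].mean(),  # mean of slowest 0.1% of frames
        "min":      fps_sorted[0],                           # single slowest frame
    }

# With these definitions, min <= 0.1% low <= 1% low always holds. If an overlay
# reports "minimum" ABOVE the lows, it is probably averaging fps per second
# first, so one slow frame inside an otherwise fast second never shows up.
print(benchmark_stats([5.0] * 990 + [50.0] * 10))
```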
You should test some more old(er) games, whichever ones you think of, to see if there's any driver issues. They don't even need to be benchmarks, just tests to see if they run without issues.
Did anyone test these cards on older games like Skyrim or FO4 or anything that came out before 2016?
Sparkle is back 🤘🏽 love the Sparkle Caliber
For HL2, open the console and type fps_max 0 to get a 1000 FPS cap.
Ah of course, thanks :)
@RandomGaminginHD Thanks to you man! You make great videos. I will probably replace the 1060 with this B580 thanks to your test.
@@luna6153 Yeah, RX 580 and GTX 1060 owners finally have something new to upgrade to... and it has 12GB of VRAM and good raster!
I am an early adopter; I've had my A770 for 2 years, and the improvements make me want a B770 whenever I can make that happen.
The price here in Luxembourg (Germany) is 300-310€ right now; you can get a used 6750 XT for cheaper.
A new 6750 XT also costs only 20€ more, so it's great that it exists for competition, but the price is slightly too high here. :')
The cheapest B580 I found available in Germany is more expensive than a *new* 6750 XT...
Yeah, it's worth it for the American market, but for Europe not so much; it would need to be 250 euro.
Europeans getting exploited as always lol. The GPU market is messed up here.
The announcement of an ARC B770 would be very, very interesting indeed...
Will you be doing any PCIE 3.0 testing?
That would be neat especially with a 10th gen or 11th gen CPU.
Or Ryzen 3000/4000 series?
@@chomper720 Also if you're rocking an AM4 motherboard other than the X470, B550, or X570
@@Ale-ch7xx I have a B450 with 5900X :p
@@chomper720 B450 and 5950x here
@@chomper720 B450 + 5950x here
The escort joke reminds me of old gaming magazines from back in the day, subtle but you'll catch it if you know =) well played Sir
3:19 "no space on my hard drive" - welp, Steve just downloaded CoD... (/s)
As always, excellent reviewing from you yet again my friend, superb. You are far more relevant to a lot of gamers than some of the so-called 'big' YouTube channels. Hats off to you! 👍
As someone who's been thinking of building their first PC within the next year or two (as opposed to buying another gaming laptop), these new cards have me very excited. The first generation of ARC cards were so iffy, it was really a turn off for most people, hell, even AMD still hasn't completely shaken off the 'crap drivers' stigma. It looks like Intel GPUs may finally be approaching parity with other vendors, so it doesn't have that 'only for enthusiasts of troubleshooting' label.
Cool to see! I've been trying to find DX 9 tests because I still mostly play those games... they're all indie games so I guarantee they won't get any attention from the driver developers if there are issues.
I wonder how this runs on PCIe 3.
same here.
Thank you for testing with older games.
It's RX 6700 XT performance and an RX 6700 XT amount of VRAM for an RX 6700 XT price.
I'm glad it exists, but it is really f-ing sad that we are excited about stagnation at this point.
At least it is something compared to the 8GB scams released by their competitors
isn't that the PS5 GPU? Surely it's more powerful than that
Can you really buy a RX 6700XT for £250 ? Brand new?
@@dudemanismadcool it's more powerful in ray tracing, other than that the raster performance is pretty similar to the 6700 xt
6700xt is around $350. B580 is $250 and it’s better at 1440p and ray tracing. I’m excited to see if they make a B770 that can handle 4k for $350-$400.
Respect for showing Half-Life 2 as a baseline.
This looks like a great card if you play new games and have a low to medium budget. I would not trust it to play gta IV, freelancer, warcraft 3, c&c games etc that are old favorites I play along with a cod or fortnite. Nvidia and usually AMD GPUs tend to do the retro gaming better than Intel.
I’ll try gta 4 soon :)
... my intel laptop runs gta 4
Thanks for the review, bro. Going to make my first GPU purchase and this card seems perfect for me.
Why test on 1080p? I need 1440p performance 😢
Get an RTX 4060 Ti, RTX 3070, RX 6700 XT or higher.
you only get the vram for 1440p
It's a really good 1440p card; there are a lot of videos you can watch.
@@harrislawrence5250 Don't get 8GB cards for 1440p 🤐🤐 The B580 is great entry-level 1440p honestly; it scales super well (lower fps drop at higher res; can't quite fight the 6700 XT, which is great).
are you not using upscaling at 1440p?
This reminds me of the days when RAM and other components could be various colours like blue or red instead of just RGB. Nothing against RGB, I like it too. But those vibes hit different for some reason.