Hey everyone!
Check out our full benchmarks over on the forum (link below); any and all edits, typos, etc. will be fixed. ~ Amber
forum.level1techs.com/t/l1-benchmarking-the-new-intel-b580-is-actually-good/221580
Update to clarify: The CPU in the older system is a 10700K, which is limited to PCIe 3.0. The motherboard it was in, a Z590, supports PCIe 4.0, but the CPU does not. So the "older system" benchmarks are at PCIe 3.0 speeds.
Is it reasonable for Star Citizen gameplay? I don't see any content creator doing what I have offered your company for us gamers.
Thank you Wendell. ASRock sold something called the DeskMax in China. It was slightly bigger than the DeskMeet but had better GPU support. I'd really like it if you could get your hands on one.
Thanks ~Amber
I would like to see some sort of SYCL benchmark on GPUs (oneAPI or AdaptiveCpp [formerly OpenSYCL/hipSYCL]).
As a multi-platform, open-source alternative to CUDA, it seems like a reasonably good candidate for GPU-compute benchmarking.
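For reference, the kind of SYCL microbenchmark being asked for doesn't have to be elaborate. Here is a minimal sketch of a vector-add that times one kernel and reports a rough effective bandwidth; it should build with oneAPI's icpx (-fsycl) or AdaptiveCpp. The buffer size and the bandwidth math are illustrative choices, and the timed span includes the implicit host-to-device copies, so treat the output as a ballpark figure rather than a proper benchmark.

```cpp
// Minimal SYCL vector-add "benchmark": times one kernel launch plus the
// implicit host-to-device copies, then reports a rough effective bandwidth.
// Build (oneAPI):      icpx -fsycl vadd_bench.cpp -o vadd_bench
// Build (AdaptiveCpp): acpp vadd_bench.cpp -o vadd_bench
#include <sycl/sycl.hpp>
#include <chrono>
#include <iostream>
#include <vector>

int main() {
    constexpr size_t n = 1 << 26;  // ~67M floats per buffer, ~256 MiB each
    sycl::queue q{sycl::gpu_selector_v};
    std::cout << "Device: "
              << q.get_device().get_info<sycl::info::device::name>() << "\n";

    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);
    {
        sycl::buffer<float> ba(a.data(), sycl::range<1>(n));
        sycl::buffer<float> bb(b.data(), sycl::range<1>(n));
        sycl::buffer<float> bc(c.data(), sycl::range<1>(n));

        auto t0 = std::chrono::steady_clock::now();
        q.submit([&](sycl::handler& h) {
            sycl::accessor A(ba, h, sycl::read_only);
            sycl::accessor B(bb, h, sycl::read_only);
            sycl::accessor C(bc, h, sycl::write_only, sycl::no_init);
            h.parallel_for(sycl::range<1>(n),
                           [=](sycl::id<1> i) { C[i] = A[i] + B[i]; });
        });
        q.wait();
        auto t1 = std::chrono::steady_clock::now();

        double secs = std::chrono::duration<double>(t1 - t0).count();
        double gib  = 3.0 * n * sizeof(float) / (1024.0 * 1024.0 * 1024.0);
        std::cout << "Effective bandwidth (incl. copies): " << gib / secs
                  << " GiB/s\n";
    }  // bc writes the results back into c when it goes out of scope
    return 0;
}
```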
Please stop endorsing temporal methods of real-time image processing.
Ghosting and blurriness are so annoying, it's unacceptable, and it's solving a problem that shouldn't exist.
"What Nvidia wants to sell you at $250..."
I don't think Nvidia wants to sell you anything at $250.
2022: me loading PCPP everyday to see if the 3050 drops below $250
Or you could get useless AMD and get scammed
Something second hand maybe
Maybe some Jensen merch. Like a green button for your leather jacket, perfect for die-hard NVDA fans.
Add a 0 that's their desired entry level. Good ole Jensen gatekeeping gamers.
"twas we had GT580"
"once we had RX580"
"now we've got B580"
The circle has been completed.
RGB580!
Piss up my shit wicket @@MegaManNeo
God finally listened to our prayers, brothers. Never thought I'd run an AMD CPU with an Intel GPU. 😂
UM ACTUALLY it's GTX580
Hahah, I see what you did there
I really hope, even with the whole crisis thing Intel is going through, that the GPU division will be kept alive. We need competition, and the speed they've iterated with on Arc is amazing.
If they keep focusing on solid midrange performance at a decent price point, reduce power consumption while increasing performance, and keep offering non-gimped memory versions that are appealing to developers who want to dip their toes in ML without breaking the bank... this could really go somewhere.
They won't ditch the GPU division. They rightfully recognize that they just can't afford to lose out in the GPU segment. Intel Arc was created originally to gain a foothold in the dedicated GPU market, and it was expanded to the iGPU market because AMD had been running circles around Intel mobile processors. Arc has become the successor to Iris Xe as a whole. Technically speaking, Tiger Lake (Iris Xe) had a superior graphical hardware design held back by its dated architecture, low clock speeds, and underdeveloped GPU drivers.
Agreed. The speed that they've done this all in is really notable. I know Intel is no stranger to GPU work, but dGPU work is a slightly different beast, as they've certainly noticed. I think that right now, given the economic climate in the US, this is absolutely Intel's chance for a commercial win. They've done the hard part. If they market this properly and build a ROCm/CUDA equivalent, I think this might find a massive audience amongst PC peeps.
How does it compare to 6800XT?
They have one called Intel oneAPI, which also supports Nvidia and AMD @@gavination_domination
Why do people keep saying this?! The only thing they're targeting is brand recognition. They don't care about the midrange GPU market. They want to be selling AI cards, just like Nvidia is doing. Corporations are not your friend.
Can't wait for Level1Linux vid on it, as it sounds like a nice upgrade this time around.
Came here to say this. I also wonder about SR-IOV...
Same here. I've been looking for a decently priced card to upgrade from my GTX 1080. This looks like a really good card, but I want to know how its feature set is implemented on Linux.
I was eyeing this card because of the new Xe kernel driver Intel is developing.
The i915 kernel driver is great and legendary, but I'm gambling on Xe becoming as good as amdgpu one day, so fingers crossed! 🤞
We're working on it!
@@Level1Techs Would love to see passthrough on Proxmox (SR-IOV), plus performance in Immich and Plex (or other small server workloads).
I really love how you can present something like this with neither hype nor derision, but still be enthusiastic the whole time. A neutral review that is nice to watch, even entertaining. Your talent for this kinda stuff is admirable
I been watching Wendell since his face stayed hidden behind a sick set of monitors like Mr Wilson on Home Improvement. Wendell literally knew everything. Wise AF just like Mr Wilson. I thought that was cool AF. I have learned soo much from Wendell. I don't even know this dude but I legit grew up with him and he taught me things. You a cool dude Wendell. You done earned the respect of millions of bros. Freaking Kudos
Same, was always disappointed when they panned away from Wendell on Tek Syndicate
really happy to hear that you found such good idle power draw. that was my final hangup about arc
Just a heads-up, Gamers Nexus found a lot higher idle power draw. Better than last gen but still 35w on the GPU alone.
@gondil07 It needs settings changed in BIOS and OS. Intel laid out the changes; with them it's about 12-15W, not as good as Nvidia but pretty close.
@@gondil07 I actually watched their review first, which had me worried, but this one leads me to believe that it was a configuration or software issue rather than a hardware problem. I would like to see more testing from more people, but I'm confident enough that GN had something wrong there.
ASPM?
It's weird that GN had such high power draw then. I'm sure they'll compare with other reviewers and figure out what's wrong, because that looks like a serious bug if nobody catches it.
I'm happy to see a card in the 250 dollar range that isn't poo poo... Great news for average gamers!
I don't think these are actually going to be widely available, at least not for very long. Intel is losing money selling them at this price, so they are not going to want to make a lot of them. But, maybe they will be able to make enough that they can keep up with demand for a significant amount of time. It's not like the demand is actually going to be that high for these, especially once the next gen competing cards from AMD and Nvidia are released.
@@syncmonism what they’re looking for now is market share, that’s why they’re willing to sell it at such a low price
@@Dell-ol6hb Too late, it's up to $500. Thank you, scalping.
The joke is that you can't get it, it's sold out.
6:20 A note for buying a used system: you will want to ensure the system has Resizable BAR support, or else Arc GPU performance will be very poor.
Also, if the system is a PCIe 3 platform, you'll be limited to 8 lanes for the new GPU.
@@wargamingrefugee9065 PCIe 3 doesn't matter. PCIe 2, maybe.
@@wargamingrefugee9065 With the B580 you're always limited to 8 lanes, no matter if you have PCIe 2.0, 3.0 or 4.0.
@@adammarks4491 Yes, which might be more of an issue with PCIe 3 (or older, lol) with less bandwidth per lane.
@@adammarks4491 Yeah, but 8 lanes of 4.0 is the same bandwidth as 3.0 x16, which is nowhere near a bottleneck.
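To put rough numbers on that: after line-code overhead, PCIe 3.0 moves about 0.985 GB/s per lane and 4.0 roughly double that, so a 4.0 x8 link lands in the same ~15.8 GB/s ballpark as 3.0 x16, while 3.0 x8 (what a 10700K-era board gives the B580) is about half. A quick back-of-the-envelope sketch using those approximate per-lane figures:

```cpp
#include <cstdio>

// Approximate usable throughput per lane in GB/s, after line-code overhead
// (PCIe 3.0/4.0 use 128b/130b encoding). Real-world numbers are a bit lower.
double lane_gbps(int gen) {
    switch (gen) {
        case 2: return 0.5;    // 5 GT/s, 8b/10b
        case 3: return 0.985;  // 8 GT/s, 128b/130b
        case 4: return 1.969;  // 16 GT/s, 128b/130b
        default: return 0.0;
    }
}

int main() {
    printf("PCIe 3.0 x16: %.1f GB/s\n", 16 * lane_gbps(3)); // ~15.8
    printf("PCIe 4.0 x8 : %.1f GB/s\n",  8 * lane_gbps(4)); // ~15.8, same ballpark
    printf("PCIe 3.0 x8 : %.1f GB/s\n",  8 * lane_gbps(3)); // ~7.9, the 10700K case
    return 0;
}
```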
I recall that with last year's Intel GPU there was a discussion about using the Intel enterprise GPU software with the consumer card (in particular, GPU sharing among multiple VMs without any expensive licenses). Is there an update on this with the new generation?
They made a new one, it seems; not the Arc thing anymore but an Intel Graphics Software app.
@@deadfool179 Does it work with this card? If that's the case I'll buy it; one card to share between VMs would be ideal.
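Whether Battlemage's consumer driver exposes SR-IOV at all is exactly the open question here, but checking on a Linux host is cheap: the kernel's generic PCI interface is the sriov_totalvfs/sriov_numvfs sysfs attributes. A rough sketch, where the PCI address is a placeholder for whatever lspci reports for the card, and writing sriov_numvfs needs root:

```cpp
#include <fstream>
#include <iostream>
#include <string>

// Reads the generic Linux PCI SR-IOV attributes for a device.
// Replace the address below with the Arc card's address from lspci;
// if sriov_totalvfs is missing or 0, the driver/firmware isn't exposing SR-IOV.
int main() {
    const std::string dev = "/sys/bus/pci/devices/0000:03:00.0"; // placeholder address

    std::ifstream total(dev + "/sriov_totalvfs");
    int vfs = 0;
    if (!total || !(total >> vfs) || vfs == 0) {
        std::cout << "No SR-IOV virtual functions exposed on this device.\n";
        return 0;
    }
    std::cout << "Device reports up to " << vfs << " virtual functions.\n";

    // Enabling VFs (needs root): write the desired count into sriov_numvfs.
    std::ofstream numvfs(dev + "/sriov_numvfs");
    if (numvfs) {
        numvfs << vfs;  // each VF can then be handed to a VM via VFIO
        std::cout << "Requested " << vfs << " VFs.\n";
    } else {
        std::cout << "Could not write sriov_numvfs (need root?).\n";
    }
    return 0;
}
```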
Highlighting the CPU difference on older CPUs vs new is the kind of stuff that is really valuable about the L1T approach. It's a good ramble for sure, but it's also a deep dive. Looking forward to the Linux and AI deep dive that will come later :D
My smooth brain read the title as " It's ...Goop". It broke my head.
It's that little reflection off the GPU and it made me read the same thing 😂
I read 19:34 as (Canada, Wet Fart, Chase)
Same 😅
Isn't that the stuff Gwyneth Paltrow sells?
Genuinely the most exciting tech product in a while
Dude, this and the AMD CPUs being affordable and fast is amazing stuff. Who would've guessed we could play 1440p games at something like 800 bucks again in 2025!
Finally, a decent improvement to price for perf. It's basically my ~700 quid nvidia card from 5 years ago, for 250. Noice
Depending on how midrange RDNA 4 and RTX 5000 are priced (yeah, right.), I might recommend this one for midrange builds with some gaming in mind.
The charts have the B580 with 8GB of RAM; it has 12GB, right?????
Oops, that's a typo if so; it's 12GB on the GPU.
@@Level1Techs Then fix it...
@@gaav87 ....
@@gaav87 While I'm unable to change parts of a video that is live, you can find all revised graphs on our forum post:
forum.level1techs.com/t/l1-benchmarking-the-new-intel-b580-is-actually-good/221580
@@gaav87 Yes SIR... when did you get voted in as God and ruler of all humans?
I am so relieved this is a good card. Kudos to the team at Intel.
I will go blue for my kids' upcoming upgrades to do my part as a consumer.
Thanks for giving competitors a chance. If I have the opportunity to build an entry-level system for someone soon, I'll do the same.
Battlemage seems like such a fun card: it has the bus, the VRAM, and software to let you tweak and OC the card. What I'm really excited about, though, besides the hugely more mature drivers, are the AI features Intel has included, such as running LLMs and AI image generation locally - that's huge for content creators. Hence, I hope it comes to the EU region for 250 EUR.
Me too, man. I'm also investigating Xe2, the NPU4, and the AI capabilities of the NPU in the mobile processors (Core Ultra) for LLM workloads. Looking forward to optimizing my llama.cpp fork for that.
In Finland it's 330€. I was hoping for 250€ too.
level1linux video when?
Including Games plz😊
From the Phoronix review it looks to me like it's still a few steps behind. Still, if they improve just a little bit, my media PC has an A380 in it that might get replaced...
I want to know too. I really hope they go all in on open-source drivers too; then it's plug and play to greatness.
@@ThePirateParrot Phoronix definitely had some issues with his testing. His GTA V results are abysmally poor. I would wait for someone else to test it.
🫡
The flowthrough fan thanks to the smaller PCB is honestly so sick.
It's a good design
👍 love to see more competitors in the GPU segment
Just remember, if going used, the system MUST support Resizable BAR, so nothing older than about 10th gen Intel (there may be some exceptions).
The only hardware reviewer advice that I actually listened to.
What is interesting is Intel's long history in graphics. They focused on commoditizing graphics a very very long time ago. First they did MMX extensions in the processor core. Then they added vector processing (AVX). They were tightly coupled with Microsoft in providing what the desktop graphics needed for acceleration, focusing on their bread and butter and protecting their "moat" by assisting DirectX. Eventually - to make laptops cooler and cheaper they integrated 3D graphics into the CPU chip thus moving the bar for a sub $1000 capable Windows Laptop. So they know what they are doing on a technical level and achieve some very impressive power numbers etc. They are still catching up on the non-DirectX graphics engine support in their drivers. That is a tough obstacle as the target is constantly evolving. New leadership coming in. Let's see how they plant their feet.
Ahem. What they really wanted was getting multimedia and gaming on IBM PC compatibles locked to their hardware and software. Libraries, promotional campaigns, supporting Microsoft's One and Only True Way of doing everything through Direct* libraries provided by the system (and written in part by Intel) were all born out of that. And it was a valid strategy in mid-'90s (remember that Doom and Quake were software rendered super hits, and needed a fast CPU - guess who profits from people buying top of the line processors). Then serious 3D accelerators appeared, and in a couple of years Direct3D became “interface to what nVidia and ATi wanted to offer”. Not Intel.
However, they still managed to make themselves important for big multimedia businesses... through DRM. HDCP? Intel owns it. AACS? Intel participated in making it. Not to mention SGX, uncontrollable and unobservable execution of something only Intel and its clients understand at the firmware level, which can only be used for a single task: implement DRM systems for the big guys. (It is now deprecated, so Intel didn't have enough power to shove it into everyone's throat, but, famously, even before that government security agencies demanded for their systems to be provided with firmware from which such “features” were completely cut.)
This kind of price/performance reminds me of good old times when the green was greener - the times of GTX 960, 1060.
If I were building a system with some 12400F right now I wouldn't have to think for too long I guess.
I'm very interested in how well these cards perform for entry-level content creation, especially rendering videos in DaVinci Resolve and the HEVC/AV1 encoder for live streaming.
I never would have considered Intel to be the budget GPU of choice even 5 years ago.
Will you be doing any PCIe 3.0 testing for us users still on PCIe 3.0, such as B450 and A520 motherboards?
Second this. The results would be useful for eGPU users as well.
It requires Resizable BAR. Is that even going to be a thing on motherboards with only PCIe 3?
Edit: Hope it is, because nothing I've got has PCIe 4.
@@williamp6800 I have used a B450 with ReBAR enabled, and Hardware Unboxed's podcast had one of Intel's guys on; according to him, PCIe 3 should not have any issues.
@@williamp6800 Yes, my board has it.
@@williamp6800 ReBAR has worked since PCIe 2.0. Newer PCIe 3.0 boards should have it by default, and older ones need an update (or you mod the option in yourself).
Looking forward to Level1Linux!!
Same, I want to see how these cards perform under Linux, and whether it's near plug-and-play like AMD has been for me.
Especially because I am going to switch to Linux very soon.
@@Aerobrake I've personally been using Linux since the late '90s, when Red Hat Deluxe and Corel Linux came out, but I only made the full switch on all my systems around 2015, when AMD's open-source GPU drivers and Proton for Steam got good enough to run most of my games. I was fed up with Windows 10 being a resource hog, and I was one of the victims of the update that deleted stuff off people's HDDs without their permission. So it's been almost a decade free of MS, and life could not be better for me running Manjaro GNOME on my various AMD CPU/GPU and Intel CPU/iGPU systems. I avoid Nvidia on purpose because it's more expensive and the user experience on Linux is overall worse: they have been anti-open-source for so long that the Linux community had to develop the open-source Nouveau drivers for Nvidia hardware from scratch (they have done amazing work, but shouldn't have had to), while AMD and Intel have their own teams contributing their hardware drivers back to the Linux kernel, which is a big 👍 in my book.
Excellent video as usual, Wendell. I love the analysis that goes far beyond game-testing numbers.
Phoronix showed some impressive compute benchmarks. This good showing is surprising. I hope the new Intel management won't kill off their dGPUs though.
Got sidetracked by your benchmarks background. How. Cool. Thank you for your insight once again good sir!
I'm using an i7-8700 with my B580. I have a Strix Z390-E mobo. With ReBAR off, games are unplayable, but with ReBAR on it is awesome, even with a CPU that isn't "compatible".
I'd love one of these. Hope it'll fit inside my Nuc12 extreme!
The GPU market needs more players so that consumers are not robbed! Highly experienced companies like Intel should survive in the market so that there is no monopoly. I really hope this generation of Intel GPUs cracks the market!
Thanks for the awesome review. Love your work ❤️.
I'm interested in a B570 for my home lab with that sweet, sweet transcode.
I know the sales margins are not impressive for shareholders, but Intel needs to keep aggressively fixing its shattered reputation among customers.
They should have done this a decade ago to sow salt in Nvidia's bread and butter. They decided the money wasn't there. They were wrong.
They're losing money on these, don't expect them to make many.
They won't be around much longer at this rate: no margins on these, no margins on their CPUs, no sales in servers.
@@jmwintenn Gotta love it when people read baseless info on Reddit or Twitter from so-called "experts" and just regurgitate that misinformation every chance they get.
Would love to see how Arc compares to Nvidia and AMD in streaming and recording, especially when using OBS.
Question for the room: does it do SR-IOV?
Very interesting, I wonder what the difference is between your monitor setup and Gamers Nexus'. They were measuring 35W at idle versus yours at 13W. Perhaps it's still some weirdness with ASPM? Alchemist had issues with never falling under 40W at idle without properly configured ASPM settings and even then, it depended on total display resolution. With my dual 1440p displays, it can't seem to sleep the main GPU and so it still refuses to fall below 40W at idle. Did you test those multi-monitor higher resolution scenarios?
I would guess it's ASPM. I had no end of errors from my Samsung 960 after a BIOS update, and a later BIOS update turned ASPM right back off. I had no idea what to look for until months after the fact.
Sometimes you can get massive power draw with a new monitor because it keeps its memory clocked high for no apparent reason.
Like, my 6700 XT will pull 30W if I turn on my 2nd monitor at 60Hz with the 120Hz main panel, but it will idle at like 5-8W otherwise.
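If ASPM is the suspect, it's quick to sanity-check. On Windows it's the "Link State Power Management" setting under PCI Express in the active power plan; on Linux the kernel reports its current policy in /sys/module/pcie_aspm/parameters/policy, with the active choice shown in brackets. A tiny sketch that just reads that file:

```cpp
#include <fstream>
#include <iostream>
#include <string>

// Prints the kernel's current PCIe ASPM policy, e.g.
// "default performance [powersave] powersupersave"; the bracketed entry is active.
// If it reads [performance], the GPU likely never drops into its low idle states.
int main() {
    std::ifstream f("/sys/module/pcie_aspm/parameters/policy");
    if (!f) {
        std::cerr << "ASPM policy not exposed (old kernel, or ASPM disabled in firmware).\n";
        return 1;
    }
    std::string line;
    std::getline(f, line);
    std::cout << "ASPM policy: " << line << "\n";
    return 0;
}
```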
I like this card's reviews; it looks solid for my kids' PC, and I hope they make a bigger one that I can use.
The way Wendell talks about it, it makes you feel like Intel really listened to the consumer base and tried to deliver what we wanted, rather than try and dictate what we want while helping themselves to our wallets.
2:20 If we are thinking about the same display, it only supports 480Hz in 1080p mode, so there is no 4K 480Hz display on the market.
I'll be waiting for a B770/780.
Go intel! Can’t wait to build a budget PC with this gem.
I love this card - for price / performance - drivers much improved / lower power draw
Can’t wait to check 1 out
Liking this card, may just go ahead and order it. I'm looking at a new build for creative work (video editing etc. using DaVinci Resolve), likely pairing it with a 15th-gen Core Ultra 7 and a Gigabyte Aero G Z890 MB for the new socket. If they bring out an updated 770-type card, I'll likely get that eventually.
All this to update a 7-year-old Dell OptiPlex SFF Core i5 with an equally ancient GT 1030, as that's all I could find to work with that setup.
Waiting on your Linux "where does this lie" video. Also, curious if we can SteamOS with this.
We're all still waiting on Valve to support anything other than their AMD hardware right now, but distros like Bazzite are atomic like SteamOS with support for Nvidia. With either some tinkering or support from some distro maintainer it should be possible
Man, that's what I wanted to hear...an honest take on Intel's new cards. I'm going to buy one now, after waiting for an all-clear...thanks.
I think it's impressive what Intel has achieved here in just 2 generations. Really incredible. I hope they also bring out bigger chips like a B770.
I wonder how high prices will go here in Europe
It's good bc I want intel to compete
yeah well 😢 it's either the LAST Intel GPU or the penultimate Intel GPU from what I hear. I live over in Hillsboro, OR near D1X and Ronler Acres, and have A LOT of friends at Intel because I live like right here haha, though I don't know anybody directly on the GPU team tbh. But the rumor from Intel employees is that almost the entire GPU team has been fired and the department is being kept alive ONLY to keep the drivers working while they manufacture and release the last few finalized designs. 🙃 I want Intel to compete as well... but I think Intel would rather sit back & collect subsidies while they pray for a merger deal. At least that's what the board wants... and so far there's no Nelson Peltz-like character ready to come in and make war with the board, so they're probably going to win.
@@robnobert Meanwhile the words of Intel directly:
The software team is currently refining Xe3, while the hardware team has begun working on its successor, Xe4.
@PSXman9 🙄 Irrelevant. Intel isn't stopping ALL GPU dev. They're stopping DISCRETE GPU dev, which is what we're talking about.
@@robnobert That sounds weird given their guy said on the Gamers Nexus channel that they were already working on the next one.
It would be weird to scrap a division that finally did something worthwhile.
"but I think Intel would rather sit back & collect subsidies while they pray for a merger deal."
No. Just no. Like......no. wow....not even.
@@robnobert When Pat Gelsinger referenced discrete GPUs, it was in the wider context of portable devices (i.e. laptops); he basically said Intel should shift priority from making laptop dGPUs to APUs that can compete with AMD's Z1.
Nice to see Wendell in the new setup! :D
vm sharing? 👀
Only as good as vendor support
Great review. I would love to see you review this card with the new Intel 265 CPU. Thanks
Great review, hoping for the Linux one!
Perfect card if you want to upgrade from your 1080p monitor
Interesting to see you hit 13 watts idle with this card when every other review hits a whopping 30 watts at best. Wonder where the discrepancy is coming from?
Hooray! Long form video! (Not sarcasm, am genuinely happy)
Highlighting for prospective buyers: you need a motherboard with Resizable BAR to make this sing, or even for light chirping.
Very important point I haven't seen often enough in the comments.
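If you want to confirm Resizable BAR actually took effect once the card is installed (on Linux, at least), each PCI device's BAR ranges are listed in its sysfs resource file, one "start end flags" line per region; with ReBAR active, one of the GPU's memory BARs should span (nearly) the whole VRAM rather than the classic 256 MiB window. A rough sketch, with the PCI address as a placeholder:

```cpp
#include <cstdint>
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

// Lists the size of each memory region of a PCI device by parsing its sysfs
// `resource` file (one "start end flags" line per region, BARs first). With
// Resizable BAR active, one of the GPU's BARs should be multiple GiB, e.g.
// close to the B580's 12 GiB of VRAM, instead of a 256 MiB window.
int main() {
    const std::string path =
        "/sys/bus/pci/devices/0000:03:00.0/resource"; // placeholder address from lspci
    std::ifstream f(path);
    if (!f) { std::cerr << "Cannot open " << path << "\n"; return 1; }

    std::string line;
    for (int region = 0; std::getline(f, line); ++region) {
        std::istringstream iss(line);
        uint64_t start = 0, end = 0, flags = 0;
        iss >> std::hex >> start >> end >> flags;
        if (end > start)
            std::cout << "Region " << region << ": "
                      << (end - start + 1) / (1024.0 * 1024.0) << " MiB\n";
    }
    return 0;
}
```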
What about this card for gaming in Linux??
This is some astounding progress for Intel. If they keep this up, they'll be trading blows with nvidia in most segments within a few years!
If you're aiming for a good Linux machine, this is the card, because compared to any Radeon out there it has oneAPI fully supported now. Especially for those who consider Blender a must.
Excited to buy the same thing with blessed drivers as the Arc Pro B60 for twice the price
This may be my next GPU. I will be waiting for team red and green's next low and midrange models, but I at least need a card with more than 8GB of VRAM for my 1440p ultrawide. I will have to see how the new and used options compare price-to-performance-wise about 6 months from now.
How does it work in Linux? X11 and Wayland both good? Can it do 4k@120 (HDMI 2.1) unlike AMD?
It would be super interesting to see how the b580 would perform with rebar off
Any info on whether Intel is planning a successor to the A310/A380? Single slot, no extra power connector, passively cooled if possible, with 4+ monitor support and the latest codecs and technologies would be nice to have for 150-ish euro. Not everybody runs local LLMs or games all the time; think photo and video editing with the occasional FHD game, where the Nvidia 720/730/1050/1650/1660 or AMD 6400 are a bit obsolete or unnecessarily crippled.
That's quite the ambitious wishlist
@@EidolonKaos jingle bells, jingle bells....wishlist delivered, product incoming???
How's the performance on PCIE gen 3?
The 10700K that was used to test is limited to PCIe 3.0. ~ Amber
I don't see new Flex GPUs. Have we seen anything about SR-IOV on these?
7:35 And this is a big thing for me in the tropics: low(er) power consumption.
Intel is doing it with their recent CPUs; they're late to the game, but they will get there.
I'm waiting for the B7x0 series, but I hope they can continue the path they have chosen.
12:12 - The second scene has background noise and music, and I think it's the combination that made me struggle disproportionately. I have a relatively minor audio processing issue, but I mention it because it might trip up others too. I don't generally need subtitles to watch your content, but this bit was hard going.
It's gonna be weird having "AMD CPU and Intel GPU" builds, but that seems to be where the value is at the moment. Crazy times!
Intel fighting for scraps in the low-end segment, both with CPUs and GPUs, is exactly the wake-up call that company needed. It's good for them, and it's good for us consumers. This card is an unexpected victory, and we should cheer Intel on; credible competition is sorely needed. I can't help but think this only weakens AMD's GPU sales, however, since Nvidia pinches its nose as it walks past the low-end GPU marketplace. Nvidia buyers will not be tempted by this card, but potential AMD buyers will.
If AMD finds a way to make their RT stuff more competitive, I could see this turning into a true three way split of the market Intel/AMD/Nvidia -> low-mid/mid-high/stupidmoney.
@@johnathanmcdoeThe rumors are that RDNA 4 should be about equivalent to Lovelace in ray-tracing, iirc. Excited to see how RDNA 4 shakes out.
@@johnathanmcdoe Intel vs AMD in the low to low-mid tier, AMD vs Nvidia in the mid tier, Nvidia vs nobody in the high end? It does seem it's AMD fighting the war on both fronts, and Intel isn't taking Nvidia on at all .. but that's my silly opinion.
@@dtsdigitalden5023 Why fight someone else's battle if you haven't got the ammunition to fight it? It will be interesting to see how this scales to a B7xx. Looking forward, Celestial is already taped out, as is Druid. Could we have a C9xx or a D9xx? Things will be interesting in the next few years.
@@benjaminoechsli1941 Why do people always believe bogus AMD rumors? They're always false.
Wondering how that would work in video production and capture.
I thought I was going crazy when the driver overhead thing came up as a brand-new problem, but nope, Wendell had pointed it out here in his day-one review.
1:10 also assuming there’s enough supply and it’s not scalped to the moon.
Thank you very much for your hard work.
Haven't been able to find whether there were any QuickSync improvements this generation. Would love to see some ffmpeg or HandBrake benchmarks.
Awesome review!
Your total board power at idle is way lower than GN measured. Any idea why?
So, my question is how is the card with virtualization?
That's a bummer, but I should have expected it. I was maybe going to buy it for one of my old systems (7700K, 2600K), but with the Resizable BAR requirement I should have known that was not gonna fly.
GN had idle consumption at 30W. Does anyone have any theories?
Intel already told reviewers how to address this. One change in BIOS and one in Power settings. Then idle is about 12W versus the 4060's 8W or so. It's not a big difference.
Is this a significant upgrade from GTX1080 (non TI)?
For 1440p gaming plus frame generation support yes
Amazing review. But way too many ads.
Level1 needs to eat too. :-)
5:30 The all-in-one boards with mobile CPU are interesting to be sure, but as soon as you get a GPU a $125 Ryzen 8700F starts to make a lot of sense (Or dare I say it an 8400F). Even adding $30-40 for an AXP120 cooler and $100 for a Night Devil B650i you are still in a better position by enough to afford 2 other components: Case/PSU/RAM/SSD.
I want to see a B770 card, but I keep hearing that it might or might not happen, which would be a shame since all I hear is decent stuff about this card.
Whether we see the B770 is still unknown. But what is certain is we only have 1-2 GPUs left from Intel.
Apparently nearly the entire department has been let go and the only people they kept were to keep drivers working for a few years.
They're going to release the few cards that were already in the pipeline and ready for manufacturing... and then close up shop.
GPUs were Pat Gelsinger's thing. And the corporate power players at Intel HATED that department.
And now we know who won that battle 😢
This was debunked over and over. It was confirmed that they're working on the 4th gen already. Go see the Gamers Nexus videos with Tom Petersen. @@robnobert
Hi Wendell, you mentioned testing the 10700K and the performance being lower, and that possibly being driver overhead. The B580 is limited to x8 PCIe; could that have caused the lost performance on the 10700K, which only supports PCIe 3.0? Maybe a quick check of performance at PCIe 4.0 vs PCIe 3.0 would confirm?
One has to wonder what the BOM cost is on this 12 GB card. What is the price of GDDR6 at the moment? This is definitely a fire sale, and they will only sell an amount of cards based on the TSMC contract fulfillment. Still, it's the 6nm 7600 8 GB vs the 5nm B580 12 GB: a node difference and not necessarily a node-level performance improvement. You did a good job of framing this review by not having the 4060 8 GB to compare. Depending on how the RX 8600 performs with RT, it could be the nail in the coffin for Battlemage, but I guess B580 pricing is to clear these out before that happens.
Do you think they will be releasing something like the A310? Maybe a B310?
Hard to say; they have financial difficulties right now, so maybe they'll concentrate on the midrange for the time being.
I really want to buy a B580 when I build a new machine, but I most probably won't. The simple reason is that I'm looking for quite high gaming performance, more in line with an RTX 4080 or RX 7900. The B580 would just be something I wanted so I could try it out. The sad thing is it won't work in my old machine either, even though it has a lot better performance than my old GPU, because that machine doesn't support Resizable BAR. These Battlemage GPUs really require Resizable BAR to perform anywhere near reasonably, just like the Alchemist GPUs do.
I also thought my older Intel machine (8th series) didn't support Resizable BAR, but to my surprise it has a BIOS update available that adds it. Maybe worth checking if that's also the case with your platform.
How good are these Intel Arc cards in Linux? Any good or bad personal experience out there?
I've had the card for a week, but there are no drivers. Can you share them with me? 😢
So basically it has the same power as the PS5 GPU, the 6700, for $250. Impressive.
It's on par with the RX 6700 XT, which is 10 to 20% better in games than the RX 6700.
@Level1Techs Could you please make a comprehensive test video on the overhead issue when pairing the B580 with older CPUs? I think it's very important because after so many positive reviews published online, many people want to upgrade their old GPUs to this one, but they most likely are still using an old CPU right now and thus will not get the full performance out of this GPU.
7:18 How did you measure 13 watts of usage if Gamers Nexus was measuring over 30 watts on the card just idling?