The best high refresh rate 1080p gaming build in 2025 is an Intel GPU with an AMD CPU 😂😂
I was thinking the same thing because I recently bought a 5700X3D and need to upgrade my GPU. Very ironic, the world's turned upside down.
Which AMD CPU?
@@harishvyas2165 X3D chips, or any 7000 or 9000 series CPU, can deliver high frame rates. It doesn't really matter which one.
It doesn't seem that way, though, considering the inconsistency in frame rates...
It's good for 1440p rather than 1080p.
13:00 Correction: x264/x265 are not formats; they are software encoders. I was briefly confused about why x265 would run on a GPU at all, until I realised it's H.265 the format being tested, not x265 the encoder, which runs on the CPU.
Yeah, that's our mistake. Thanks for bringing it up!
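To make the encoder-versus-format distinction concrete, here's a minimal ffmpeg sketch; it assumes a build with libx265 and Intel Quick Sync (QSV) enabled, and the file names and quality values are placeholders:

```bash
# Software (CPU) encode: the x265 encoder producing H.265/HEVC video
ffmpeg -i input.mp4 -c:v libx265 -crf 28 output_cpu.mp4

# Hardware (GPU) encode: Intel Quick Sync producing the same H.265 format
ffmpeg -i input.mp4 -c:v hevc_qsv -global_quality 28 output_gpu.mp4
```

Both commands produce H.265/HEVC bitstreams; only the second one touches the GPU, which is what an encode benchmark on a graphics card is actually measuring.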
Best review out there. Kudos for covering the productivity workloads as well. It's been a very, very long time since GPUs moved away from being dedicated gaming chips (also known as graphics accelerators) to general-purpose graphics processing units, and it's crazy to me that most reviewers still don't cover these things even today. I think it's because most of them don't edit their own videos, but at least have some respect for the fellow video editors helping in your video production and show some data that would help them choose their GPUs. Again, thank you very much for including the productivity and AI workload results.
I appreciate the early review. It does look promising. I'll be really interested to see what the B770 winds up doing and whether they can shore up the drivers by then. It is very encouraging to see another player on the field.
Most useful review because you have covered productivity apps and AI.
Exactly!
Thank you guys for showing Davinci Resolve numbers! No other channel I watched gave me this info.
We have three manufacturers now. That's great. And the fact that Intel found a niche where it beats the competition is a very good thing, because a slightly cheaper but otherwise ordinary card won't find many customers.
If Intel can fix the frametime inconsistency and 1% lows, which I think is plausible, then it's the value GPU king 👑
Thanks for including AI benchmarks, the vast majority of other review channels don't seem to be interested in this area.
It's one of the fastest growing segments of the graphics market and we want to make sure folks know how these cards perform.
👍@@HardwareCanucks
Thanks for this video. I'm super excited to see Intel shake things up in the GPU market.
Really looking forward to the B700 cards. Probably my next upgrade!
Thanks for productivity and AI benchmarks, other reviewers haven't bothered doing it.
Our thought is these GPUs are no longer one-dimensional devices. So, may as well take the time to add a few other real world usage scenarios.
Because AI for productivity is kind of lame and overrated.
@@averytucker790 Not at all. I use AI to translate at work because it's much better than Google Translate or older ML translators like DeepL.
@@averytucker790 Guess you didn't watch the video.
Crazy good and clear charts. So easy "at a glance" clarity. HUGE Thumbs UP! 👍👍
Thanks! We've worked really hard to get as much information into as condensed a manner as possible.
Great review guys, arguably one of the best. Well done!
Yippee! Hopefully this will drive down prices; $1000-2000 for the top tier cards is just straight up stupid.
Unfortunately, if the market is willing to pay $1000+ then that's where the top flight cards will stay. The main problem is that Intel isn't launching their A770 replacement until (maybe) mid next year. If that card can almost double up performance like this one did, then NVIDIA and AMD would have been in a spot of trouble right from the jump since it would technically put pressure on the RTX 5070 / RX 8700 if that's what they end up being called. But instead we get an entry level card.
@@HardwareCanucks And that's if they launch a replacement for the A770 at all.
@@HardwareCanucks :(
Nvidia will charge whatever they want, mate. It's up to the people to vote with their wallets.
@@PureRushXevus That will only happen if people buy these newer GPUs.
About Warhammer 40K: it's the new hardware unit emulation on the Arc B-series (adding SIMD16 fixes a lot of games; sadly it isn't possible to do that without manual work on the A-series).
Imagine how much performance will increase after further global driver updates.
Thanks for including productivity!
Great tests for productivity apps. Thx.
That's what I call phenomenal detail. Thank you!
I appreciate all the info man, cheers!
Solid review. Hopefully they see this and collaborate with software vendors a little more for optimization.
Thank you for including productivity and AI workloads!
Is there somewhere we can look up the testing process or benchmarks for your STEM applications? I'd be really interested to see exactly how the SolidWorks tests are performed.
Holy shit thank you. I have been searching for productivity benchmarks
Thank you for including video editing and AI benchmarks too, I use my PC for everything so always get frustrated when I only see gaming benchmarks.
I was really excited at first UNTIL I watched the 3D apps section. Do you think it's still a win for the 4060?
Great video Mike, top HC quality as always, I just can't go without watching your reviews.
Here's hoping Intel succeeds with Battlemage and we get a serious third player, at least in the price brackets that matter.
We can only hope!
Excited for the entry class!
Does it also have an open-source software stack so we can start building apps, or is it closed-source like Nvidia's?
Yeah it does
It's called oneAPI.
It's on GitHub.
Here's hoping that the driver support keeps improving. It seems like a solid effort for the price point.
I think Intel Arc A-series cards are going to need a good GPU upgrade to properly compete, or they're going to need cheaper prices.
0:39 It's PCIe 4.0 x8. Why only x8?!
Thank you for including AI benchmarks
Will there be a B790 with 16-20GB of VRAM?
Better review than Hardware Unboxed's, I can't believe I'm saying this.
They just want to hype it up; they didn't test any esports games, which most people play.
Don't think one is better than another. They have their methodology, we have ours. Plus, it's best to watch way more than one piece of content to get a balanced view of any product. :) Thanks for the compliment nonetheless.
As always a great review by Mike. Despite problem areas in development I'm glad to see Intel making such great strides. A year ago I very much included AMD and Intel cards in my research, wanting to support them, but for my purposes they couldn't quite seal the deal. So I went with Geforce again, putting aside things like Nvidia's stinginess with VRAM. The current competitive playing field gives us fresh hope!
For AI benchmarking, it would be interesting to see how common platforms like ComfyUI and A1111 perform.
We will check those out!
@@HardwareCanucks thanks! Looking forward to more GPU reviews with AI benchmarks in the future.
Would it be possible to do this test again comparing it to an RTX 4060 12GB?
There is no such thing.
Is Resizable BAR on for all GPUs?
Yes
Best review we have so far, even for LocalLLaMA I think. Thanks for testing Llama 3.1 and Qwen 2.5.
Though, can you tell us which quant you used for each model? And the context size (I suppose it's the 2048 default), FlashAttention, and quant size (I bet it's Q4_0? If so, try Q4_K_M or _S, which are more widely used).
Sure!
Llama 3.1 4K Context, Instruct Q4_K_M
Qwen2.5 4K Context, Instruct Q4_K_M
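For anyone who wants to reproduce roughly those settings locally, here's a minimal sketch using the llama-cpp-python bindings; the model file name is a placeholder, and on Arc cards GPU offload assumes a build with the SYCL or Vulkan backend:

```python
from llama_cpp import Llama  # pip install llama-cpp-python

# Hypothetical local file; any Q4_K_M GGUF of Llama 3.1 8B Instruct would do.
llm = Llama(
    model_path="Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf",
    n_ctx=4096,       # 4K context, matching the settings quoted above
    n_gpu_layers=-1,  # offload every layer to the GPU backend
    flash_attn=True,  # enable FlashAttention where the backend supports it
)

out = llm("Explain GGUF quantization in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```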
@@HardwareCanucks Aye, thanks! So this review aligns with LocalLLaMA use cases too, nice.
Now I really want one, but thanks to our braindead president the KRW is in the trashcan, so... guess I'll buy a few once we impeach that bastard haha
Hope there will be another benchmarking run in mid-2025 to see whether driver optimisation has been done on the card, especially for professional workloads.
JayzTwoCents has the B580 wiping the floor with the 4060.
....how? Even Intel's own numbers don't show that. Must be the games being tested.
@@HardwareCanucks It's beating the 4060 in 90% of the games, and by like 20%.
@@STEALTHREX He tested four games. The 4060 was within margin of error in the other two; the Intel wins were Cyberpunk and Borderlands 3.
Not only that, it mops the floor with the 4060 Ti, the 7700 XT, and the 3070 in some games.
@@HardwareCanucks Yeah bro, I've seen the reviews from Linus Tech Tips and JayzTwoCents; it's real. It even murders the 4060 Ti, the 7700 XT, and the 3070 in some games too.
Seems that the B580 has really good potential.
With some more driver work I think this card can grow a lot more.
Very good review: not only games, but also AI, Blender, video, and photo. The B580 is pretty similar to the RX 7600 and RTX 4060, better in some workflows and games and worse in a few. So it's not a bad choice if you can buy it for $250 or less.
I'm wondering if the lower performance of the Intel GPUs in creative apps like Blender has to do with the lower core count. I use Resolve and Blender almost daily, so I was looking to upgrade from my A750, but it looks like the improvements are only on the gaming side. I'll have to wait for a B750/770 or whatever AMD and Nvidia have coming. I'm not touching anything from them that's under 12GB though. No sir!
I really want Intel to succeed in the GPU market. What it needs now is better driver support for its cards and better collaboration with game and app/software developers to efficiently utilize their GPUs. The Battlemage GPU family is very promising. Intel please don't quit the GPU market. We need players like you to disrupt the market.
Worth upgrading from my RX 6600? 🙏🏻 Please helppppp
Easily
Lol, your gaming charts have Baldur's Gate 2 (Two!) in them. Man, I still love that game, still playing it more than 20 years later.
Ah man, that's a mislabel. :(
@HardwareCanucks I know. Also, don't worry about it - we all know what you meant.
Amazing video. Thank you for making it.
@@HardwareCanucks You're just a proud Canadian, that's all!
Can't wait for the Gunnir AIB card designs for the B7xx or whatever it's going to be called!
Would people recommend this as an upgrade coming from an RX 6600 XT?
Better to sell your old card and push for something a little bit stronger, at least a 4060 Ti. Also check this card's performance in the games you want; Intel may have driver and performance problems in certain games, and that may affect the ones you want to play.
@danr8011 The main problem at the moment is VRAM; most games seem to be using 7 to 7.5GB of VRAM even at 1080p medium.
@danr8011 Warhammer 40,000: Darktide, The Outer Worlds, and Marvel's Midnight Suns have the most issues with VRAM.
@@atifghafoor6258 Yeah, that is true. Radeon 7700 XT then? :)
When are we going to see the B770?
Supposedly mid next year...maybe.
Huh?
@@HardwareCanucks I really hope so
Wow, you're fast to get an actual review out.
The growing pains will be ironed out with time as Intel works to improve the drivers. I don't doubt this card will be exponentially better in a few months.
Okay Nvidia, let's sell the 5060 for $250 😅
Nice dream… then they'll screw it up with 8GB of VRAM.
Nah, 90% market share means it's $350 at launch with 8GB VRAM.
How does it behave in Linux? I would 100% buy one if it works well.
Great review 👍
To be fair, GPU shopping is always based on the games you play. Some favor Nvidia, some favor AMD, and now we have Intel. People have games they like and play a lot, or every day; not many people play everything on Steam (unless you're a game reviewer, tester, or something). So just choose what works best with your games.
The same goes for productivity:
Some people use Adobe, some do video editing/encoding, some do 3D modeling, some use AI. I don't know what kind of job you'd have to have to use everything; if your job really needs all of this, you should have a bigger budget for a GPU anyway.
Hi, thanks a lot for the AI benchmarks. Just a suggestion: next time, how about also testing AI code in Python itself, rather than apps that have specific optimisations for hardware other than Nvidia? The former is more representative of actually working on and training/fine-tuning AI models; the latter (what you mostly test) is more about running an existing model, not modifying it.
I would recommend checking out something like the Unsloth notebooks (which come with everything you need, even sample training data; on a properly configured system with an Nvidia GPU, which isn't hard to set up apart from having to run in WSL or Linux, "Run All" should just run and train everything). Try running those on every system and see which one is the easiest to get going, whether they even run on that GPU (or have a WSL config) at all, etc.
Thanks for the suggestions! I've sent them to our lab team because we hear you...our AI testing is still very much in the exploratory phases and we're always looking for new tests. - Mike
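In that exploratory spirit, here's a minimal, hedged sketch of the kind of raw-PyTorch micro-benchmark the suggestion points at; it assumes a PyTorch build with Intel XPU support (2.4+, or intel-extension-for-pytorch on older versions), and a real training test would time an optimizer step rather than a bare matmul:

```python
import time
import torch

# Pick the best available accelerator: Intel XPU, NVIDIA CUDA, or CPU fallback.
# torch.xpu ships natively in PyTorch 2.4+; older builds need intel-extension-for-pytorch.
if hasattr(torch, "xpu") and torch.xpu.is_available():
    device = torch.device("xpu")
elif torch.cuda.is_available():
    device = torch.device("cuda")
else:
    device = torch.device("cpu")

def sync() -> None:
    # GPU kernels launch asynchronously; synchronize before reading the clock.
    if device.type == "cuda":
        torch.cuda.synchronize()
    elif device.type == "xpu":
        torch.xpu.synchronize()

a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

_ = a @ b  # warm-up, so one-time kernel compilation doesn't skew the timing
sync()

start = time.perf_counter()
for _ in range(50):
    _ = a @ b
sync()
elapsed = time.perf_counter() - start

# A matmul of two N x N matrices costs roughly 2 * N^3 floating-point operations.
tflops = 50 * 2 * 4096**3 / elapsed / 1e12
print(f"{device.type}: {elapsed:.3f} s, ~{tflops:.1f} TFLOPS")
```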
I'm waiting on the B770, as well as a B380 Low-Profile to replace the A380 in my NAS/server.
The consistency may be improved by driver updates. Still a very promising GPU.
Does this run Bloodborne at 60fps?
What a good performing lower cost GPU
It makes sense that Nvidia gets the best optimization support, but seems this is another solid step forward for better competition.
A bit weird: why is this benchmark pairing the card with a high-end CPU like the 9800X3D?
Btw, nice video! 12GB of VRAM, a few features similar to Nvidia's, great at productivity, and more importantly, it's cheaper! And it can get even better if Intel keeps updating their drivers and fixing stuff!
Sold out on Day 1!
Thanks for the STEM benchmarks. I don't know why so many other reviewers just focus on games. I hope Intel's driver developers continue at the same pace as the last few years and fix the underperformance in programs like Blender. They've done a better job than team red, so I have high hopes; older folks may remember how long team red needed to fix OpenGL performance. If Intel can keep delivering, in the upper mid-range too, AMD will be in big trouble in the GPU market.
Workloads, workloads never change...
I'll be getting one, at least for testing. I'm going to wait for the B570 to come out though; I just want to see the reviews of that first.
Again, I can't stress this enough: HC makes the best graphs! Makes LMG look like PowerPoint 95 by comparison!
So give me a link to a store in the Netherlands where I can buy it for 230 euros.
Hardware Canucks is now my preferred reviewer. The non-game app and AI coverage is useful for engineers/tinkerers who really need the processing power, along with a nice overall editorial. Agreed that GPU makers need to stop bottlenecking their RAM sizes in an AI world.
The biggest draw of the B580, at least from the reviews I'm seeing, is that consumers are finally getting a 1440p-capable graphics card that launches at an actually affordable price point of $250. It can even do a little ray tracing to top it off, and do it better than AMD in most games. It also looks like it'll be good for someone doing production work with video editing, rendering, and/or streaming, since the Intel Arc cards offer a wide array of video codecs to work with in editing programs or with recording programs like OBS Studio, along with fast render times for their price class.
Pair the B580 with an affordable, high-performance gaming CPU like the Ryzen 5700X3D, and you can feasibly put together a 1440p-capable computer for around $800 USD, depending on pricing where you live.
Would the next logical step for Intel be to release a B770 for $450 to $500 at the level of a 4080? As I and many others have a 3060 12GB or a 4060, it's currently not worth the upgrade, so I will wait for a B770 or equivalent.
Checking the prices between the 7600 XT and the B580, it's a $100-150 difference; I didn't even bother checking Nvidia, we all know it will be a trillion-dollar difference...
I have to say it's impressive. I'd still be too afraid to buy one, but I like what I see; sadly, I'm not in a position where I can afford to gamble on these things.
This is the price range a *060 card used to sit at... $250-300, but that's long gone now...
The 4060 is about the same price, though. It would even be a good deal if it were an actual 4060 with 12GB of memory and 16 PCIe lanes...
Intel needs to invest heavily in their drivers; people are not going to adopt their products if Intel can't maintain drivers for new games and even legacy titles, while also supporting previous-gen products.
They are investing more heavily than you may think. Getting this right as a new discrete GPU player will simply take time and several generations. People may forget, but Nvidia took a long time, and ATI->AMD even longer, to get as stable drivers-wise as they are now, through countless collaborations with developers. That being said, Intel does need time to iron out the quirks, but because they don't have that luxury on their side, they are actually fixing drivers faster than AMD has done over the past decade and change (and AMD still has issues, albeit smaller ones).
Also, many games have been tuned to vendor hardware over the years through vendor extensions, whether as part of big engines such as Unreal or through vendor-developer collaboration. Expect this to become more common on a larger scale for Intel over the next few years; it's just very time-consuming regardless of the resources allocated. The more games move onto new graphics API versions, the narrower the driver gap may become over the following years.
Imagine 10 years ago we’d be saying that Intel has the best budget GPU and AMD has the best CPU. Unreal.
So it's a solid upgrade if you're on a 20-series or older. But for 1080p gaming it's kind of 50/50 whether to switch out my 3060 12GB.
Great review!
However, please refrain from overusing extraordinary words and expressions like "demolished the competition", "night and day", "way ahead", and the like.
A 5-15% difference is, in normal terms, nothing.
No sane person will choose one product over another for differences like these. This is nitpicking.
Remember the days when a new product was 50% faster...
This is so tiring... If you do benchmarks and spend tons of time being scientific, please don't be this sensational. Thank you.
Fine. I'll dumb down the verbiage.
@@HardwareCanucks Great you responded! I did not mean "dumbing down", just meant more realistic adjectives :)
thanks
Hogwarts shows that optimisation is more important than outright VRAM capacity.
Well done Intel
Why are you the only one who talked about Blender? Why won't people just run a Blender benchmark? I need to know if it's time to put the old 3060 12GB to rest, and still, to this day, the answer is no. lol.
So this £250 card easily beats the A750, which is selling at £170-180 now 🤔
Actually, it's the new RX 6700 XT, with lower power consumption and better ray tracing, but cheaper!
Well done, Intel! 👍
The 3D rendering results are just what I expected... guess I will go with Nvidia.
I will buy one instead of a 7700 XT. I just play League anyway, and maybe if I get bored I'll play some triple-A games at 1080p...
What about older games like Age of Empires 2 DE, Portal, Half-Life, Arcanum, Bully, AC2, AC Brotherhood, or Old School RuneScape?
It's their 2nd gen of cards 😂😂😂 Pretty good for a second release, and I'm an AMD card fanboy.
3:20 "...and fares pretty well against the Radeon alternative..." - a few fps difference, sure, "but gets hammered by the rtx 4060..." - dude, it's like, 8 fps. Otherwise, great review.
Good for $250/€250 if those turn out to be real store prices!! Overall it matches the 6700 XT, and in some games beats the 4060 Ti.
They've gotta keep working on the drivers on a game-by-game basis.
So basically it has the same power as the PS5's GPU, the 6700, for $250. Impressive.
Much more powerful than the PS5, bro.
solid
It seems like Intel's new dedicated budget GPU, the B580 with 12GB VRAM, is excellent for QHD story games. The Nvidia RTX 4060 and 3060 (12GB VRAM) are still better for EA shooter games.
The Intel B580 is still giving the Nvidia 4060 tough competition despite having fewer cores, shader units, and RT cores, and a still-immature driver. The only problem with Intel GPUs is high power draw, even at idle. Blender, other DCC apps, and game engines will be pushed to optimise for this new GPU, as it's already out of stock and a smash hit for Intel after a long time.
B700s could be interesting
You nailed it with the inconsistency conclusion. I'm of course buying one for the collection, but probably will wait for the $200 USD mark. The AIBs have lost the plot, and are already pushing north to $350 USD in many regions. Unfortunately, it still cannot compete with the 3070 / 6700 XT consistently, which go in my customer builds for around $225 used right now.
Based on other reviews, the B580 was beating the competition; I wonder why it's different here.
Might be game selection
The TW: Warhammer 3 numbers are insane, which is good as it's the main game I play. 🎉👏
Same actually....- Mike