The Intel Arc B580: Midrange GPU Magic!

  • Published Jan 25, 2025

Comments • 747

  • @Level1Techs
    @Level1Techs  หลายเดือนก่อน +72

    Hey everyone!
    Check out our full benchmarks over on the forum (link below); any and all edits, typos, etc. will be fixed. ~ Amber
    forum.level1techs.com/t/l1-benchmarking-the-new-intel-b580-is-actually-good/221580
    Update to clarify: the CPU in the older system is a 10700K, which is limited to PCIe 3.0. The motherboard it was in, the Z590, supports PCIe 4.0, but the CPU does not. So, the "older system" benchmarks are at PCIe 3.0 speeds.

    • @terrycook2733
      @terrycook2733 หลายเดือนก่อน

      Is it reasonable for Star Citizen gameplay..... I don't see any content creator doing what I have offered your company for us gamers.

    • @jackrabbitping
      @jackrabbitping หลายเดือนก่อน

      Thank you Wendell. ASRock sold something called the Deskmax in China. It was slightly bigger than the DeskMeet but had better GPU support. I'd really like it if you could get your hands on it.

    • @ThatGuy-ht9sp
      @ThatGuy-ht9sp หลายเดือนก่อน

      Thanks ~Amber

    • @mytech6779
      @mytech6779 หลายเดือนก่อน +2

      I would like to see some sort of SYCL benchmark on GPUs (oneAPI or AdaptiveCpp [formerly Open SYCL/hipSYCL]).
      Being a sort of multi-platform, open-source alternative to CUDA, it seems like a reasonably good candidate for GPU-compute benchmarking.
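
      For reference, a minimal sketch of the kind of cross-vendor kernel such a benchmark could time: a SYCL saxpy that should build with oneAPI's DPC++ or AdaptiveCpp (the toolchain and problem size here are assumptions, not anything from the video):

      ```cpp
      // Minimal SYCL 2020 saxpy sketch; device selection and size are illustrative.
      #include <sycl/sycl.hpp>
      #include <iostream>
      #include <vector>

      int main() {
          constexpr size_t n = 1 << 20;
          const float a = 3.0f;
          std::vector<float> x(n, 1.0f), y(n, 2.0f);

          sycl::queue q{sycl::default_selector_v};  // picks a GPU if one is visible
          std::cout << "Device: "
                    << q.get_device().get_info<sycl::info::device::name>() << "\n";

          {
              sycl::buffer<float> bx(x.data(), sycl::range<1>(n));
              sycl::buffer<float> by(y.data(), sycl::range<1>(n));
              q.submit([&](sycl::handler& h) {
                  sycl::accessor ax(bx, h, sycl::read_only);
                  sycl::accessor ay(by, h, sycl::read_write);
                  h.parallel_for(sycl::range<1>(n),
                                 [=](sycl::id<1> i) { ay[i] = a * ax[i] + ay[i]; });
              });
          }  // buffers go out of scope here, copying results back into y

          std::cout << "y[0] = " << y[0] << " (expected 5)\n";
      }
      ```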

    • @Splarkszter
      @Splarkszter หลายเดือนก่อน +3

      Please stop endorsing temporal methods of real-time image processing.
      Ghosting and blurriness are so annoying, it's unacceptable, and it's solving a problem that shouldn't exist.

  • @Anonymous-sb9rr
    @Anonymous-sb9rr หลายเดือนก่อน +473

    "What Nvidia wants to sell you at $250..."
    I don't think Nvidia wants to sell you anything at $250.

    • @slimal1
      @slimal1 หลายเดือนก่อน +20

      2022: me loading PCPP every day to see if the 3050 drops below $250

    • @tilapiadave3234
      @tilapiadave3234 หลายเดือนก่อน +3

      Or you could get useless AMD ,, get scammed

    • @angelg3986
      @angelg3986 หลายเดือนก่อน +7

      Something second hand maybe

    • @Seriouspatt
      @Seriouspatt หลายเดือนก่อน +8

      Maybe some Jensen merch. Like a green button for your leather jacket, perfect for die-hard NVDA fans.

    • @michaelkoerner4578
      @michaelkoerner4578 หลายเดือนก่อน +5

      Add a 0; that's their desired entry level. Good ole Jensen gatekeeping gamers.

  • @Tarodenaro
    @Tarodenaro หลายเดือนก่อน +230

    "twas we had GT580"
    "once we had RX580"
    "now we've got B580"
    The circle has been completed.

    • @MegaManNeo
      @MegaManNeo หลายเดือนก่อน +43

      RGB580!

    • @jdp360
      @jdp360 หลายเดือนก่อน

      Piss up my shit wicket ​@@MegaManNeo

    • @sketchi7306
      @sketchi7306 หลายเดือนก่อน +19

      God finally listened to our prayers, brothers. Never thought I'd run an AMD CPU with an Intel GPU. 😂

    • @tannisroot
      @tannisroot หลายเดือนก่อน +6

      UM ACTUALLY it's GTX580

    • @elxero2189
      @elxero2189 หลายเดือนก่อน

      Hahah I see what you did there

  • @snemarch
    @snemarch หลายเดือนก่อน +268

    I really hope, even with the whole crisis thing Intel is going through, that the GPU division will be kept alive. We need competition, and the speed at which they've iterated on Arc is amazing.
    If they keep focusing on solid midrange performance at a decent price point, keep reducing power consumption while increasing performance, and keep offering non-gimped memory versions that appeal to developers who want to dip their toes into ML without breaking the bank... this could really go somewhere.

    • @josephlh1690
      @josephlh1690 หลายเดือนก่อน +21

      They won't ditch the GPU division. They rightfully recognize that they can't afford to just lose out in the GPU segment. Intel Arc was originally created to gain a foothold in the dedicated GPU market; it was expanded to the iGPU market because AMD had been running circles around Intel mobile processors. Arc has become the successor to Iris Xe as a whole. Technically speaking, Tiger Lake (Iris Xe) had a superior graphical hardware design held back by its dated architecture, low clock speeds, and underdeveloped GPU drivers.

    • @gavination_domination
      @gavination_domination หลายเดือนก่อน +6

      Agreed. The speed that they've done this all in is really notable. I know Intel is no stranger to GPU work, but dGPU work is a slightly different beast, as they've certainly noticed. I think that right now, given the economic climate in the US, this is absolutely Intel's chance for a commercial win. They've done the hard part. If they market this properly and build a ROCm/CUDA equivalent, I think this might find a massive audience amongst PC peeps.

    • @jonathansaindon788
      @jonathansaindon788 หลายเดือนก่อน

      How does it compare to 6800XT?

    • @ZachAR3
      @ZachAR3 หลายเดือนก่อน

      They have one called Intel oneAPI, which also supports Nvidia and AMD @@gavination_domination

    • @jamesbyrd3740
      @jamesbyrd3740 หลายเดือนก่อน +6

      Why do people keep saying this?! The only thing they're targeting is brand recognition. They don't care about the midrange GPU market. They want to be selling AI cards, just like Nvidia is doing. Corporations are not your friend.

  • @tnaxpw
    @tnaxpw หลายเดือนก่อน +296

    Can't wait for Level1Linux vid on it, as it sounds like a nice upgrade this time around.

    • @chamoferindamencha8964
      @chamoferindamencha8964 หลายเดือนก่อน +31

      Came here to say this. I also wonder about SR-IOV...

    • @Aegor1998
      @Aegor1998 หลายเดือนก่อน +19

      Same here. I've been looking for a decently priced card to upgrade my GTX 1080 from. This looks like a really good card, but I want to know how its feature set is implemented on Linux.

    • @msaenz4768
      @msaenz4768 หลายเดือนก่อน +11

      I was eyeing this card because of the new Xe kernel driver Intel is developing.
      The i915 kernel driver is great and legendary, but I'm gambling on Xe becoming as good as amdgpu one day, so fingers crossed! 🤞

    • @Level1Techs
      @Level1Techs  หลายเดือนก่อน +119

      We're working on it!

    • @oistk8956
      @oistk8956 หลายเดือนก่อน

      @@Level1Techs Would love to see passthrough on Proxmox (SR-IOV), and performance in Immich and Plex (or other small server workloads).

  • @user-xp8mm8jx2d
    @user-xp8mm8jx2d หลายเดือนก่อน +25

    I really love how you can present something like this with neither hype nor derision, but still be enthusiastic the whole time. A neutral review that is nice to watch, even entertaining. Your talent for this kinda stuff is admirable

  • @blackryan5291
    @blackryan5291 หลายเดือนก่อน +68

    I been watching Wendell since his face stayed hidden behind a sick set of monitors like Mr Wilson on Home Improvement. Wendell literally knew everything. Wise AF just like Mr Wilson. I thought that was cool AF. I have learned soo much from Wendell. I don't even know this dude but I legit grew up with him and he taught me things. You a cool dude Wendell. You done earned the respect of millions of bros. Freaking Kudos

    • @AJyoutubes
      @AJyoutubes หลายเดือนก่อน +5

      Same, was always disappointed when they panned away from Wendell on Tek Syndicate

  • @Kerosyn
    @Kerosyn หลายเดือนก่อน +52

    really happy to hear that you found such good idle power draw. that was my final hangup about arc

    • @gondil07
      @gondil07 หลายเดือนก่อน +14

      Just a heads-up, Gamers Nexus found much higher idle power draw. Better than last gen, but still 35 W on the GPU alone.

    • @Berserkism
      @Berserkism หลายเดือนก่อน +15

      ​@gondil07 It needs settings changed in the BIOS and OS. Intel laid out the changes; then it's about 12-15 W, not as good as Nvidia but pretty close.

    • @Kerosyn
      @Kerosyn หลายเดือนก่อน +12

      @@gondil07 I actually watched their review first, which had me worried, but this one leads me to believe that it was a configuration or software issue rather than a hardware problem. would like to see more testing from more people, but I'm confident enough that GN had something wrong on that

    •  หลายเดือนก่อน +4

      ASPM?
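
      For anyone chasing this on the OS side, a minimal sketch (assuming a Linux host; the sysfs path is the standard pcie_aspm module parameter, though not every kernel or firmware exposes it) that prints which ASPM policy is currently active:

      ```cpp
      // Print the kernel's PCIe ASPM policy; the active one is shown in brackets,
      // e.g. "default performance [powersave] powersupersave".
      #include <fstream>
      #include <iostream>
      #include <string>

      int main() {
          std::ifstream f("/sys/module/pcie_aspm/parameters/policy");
          if (!f) {
              std::cerr << "pcie_aspm policy not exposed; ASPM may be disabled in firmware\n";
              return 1;
          }
          std::string policy;
          std::getline(f, policy);
          std::cout << "ASPM policy: " << policy << "\n";
      }
      ```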

    • @endless2239
      @endless2239 หลายเดือนก่อน +5

      It's weird that GN had such high power draw then. I'm sure they will compare with other reviewers and figure out what's wrong, because that looks like a serious bug if nobody realizes it's there.

  • @Ztaal3
    @Ztaal3 หลายเดือนก่อน +96

    I'm happy to see a card in the 250 dollar range that isn't poo poo... Great news for average gamers!

    • @syncmonism
      @syncmonism หลายเดือนก่อน

      I don't think these are actually going to be widely available, at least not for very long. Intel is losing money selling them at this price, so they are not going to want to make a lot of them. But, maybe they will be able to make enough that they can keep up with demand for a significant amount of time. It's not like the demand is actually going to be that high for these, especially once the next gen competing cards from AMD and Nvidia are released.

    • @Dell-ol6hb
      @Dell-ol6hb หลายเดือนก่อน +1

      @@syncmonism what they’re looking for now is market share, that’s why they’re willing to sell it at such a low price

    • @catacocamping874
      @catacocamping874 หลายเดือนก่อน

      @@Dell-ol6hb Too late, it's up to $500, thank you scalping

    • @eddgrs9193
      @eddgrs9193 15 วันที่ผ่านมา

      The joke is that you can't get it, it's sold out.

  • @rantz0z
    @rantz0z หลายเดือนก่อน +34

    6:20 A note for buying a used system - you will want to ensure the system has Resizable BAR support, or else Arc GPU performance will be very poor.

    • @wargamingrefugee9065
      @wargamingrefugee9065 หลายเดือนก่อน +3

      Also, if the system is a PCIe 3 platform, you'll be limited to 8 lanes for the new GPU.

    • @ddhelmet
      @ddhelmet หลายเดือนก่อน

      Pcie 3 doesn't matter. Pcie 2 maybe. ​@@wargamingrefugee9065

    • @adammarks4491
      @adammarks4491 หลายเดือนก่อน +6

      @@wargamingrefugee9065 With the B580 you're always limited to 8 lanes, no matter if you have PCIe 2.0, 3.0, or 4.0.

    • @kunka592
      @kunka592 หลายเดือนก่อน

      @@adammarks4491 Yes, which might be more of an issue with PCIe 3 (or older, lol) with less bandwidth per lane.

    • @WayStedYou
      @WayStedYou หลายเดือนก่อน +1

      @@adammarks4491 yeah but 8 lanes of 4.0 is the same as 3.0 x16 which is nowhere near a bottleneck in bandwidth
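
      For context, the rough per-direction numbers behind that claim (standard PCIe figures with 128b/130b encoding, not from the video):

      ```latex
      \[
      \text{one Gen3 lane} \approx 8\,\tfrac{\text{GT}}{\text{s}} \times \tfrac{128}{130} \times \tfrac{1}{8} \approx 0.985\ \text{GB/s},
      \qquad \text{one Gen4 lane} \approx 1.969\ \text{GB/s}
      \]
      \[
      \text{Gen3 x8} \approx 7.9\ \text{GB/s},
      \qquad \text{Gen3 x16} \approx \text{Gen4 x8} \approx 15.8\ \text{GB/s}
      \]
      ```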

  • @aresthedevil
    @aresthedevil หลายเดือนก่อน +64

    I recall that with last year's Intel GPU there was a discussion about using the Intel enterprise GPU software with the consumer card (in particular, GPU sharing among multiple VMs without any expensive licenses). Is there an update on this with the new generation?

    • @deadfool179
      @deadfool179 หลายเดือนก่อน +4

      They made a new one, it seems; not the Arc thing anymore, but an Intel Graphics Software app

    • @rudypieplenbosch6752
      @rudypieplenbosch6752 หลายเดือนก่อน +3

      ​​@@deadfool179 Does it work with this card? If that's the case I'll buy it; one card to share between VMs would be ideal.

  • @VelcorHF
    @VelcorHF หลายเดือนก่อน +4

    Highlighting the CPU difference on older CPUs vs. new ones is the kind of stuff that is really valuable about the L1T approach. It's a good ramble for sure, but it's also a deep dive. Looking forward to the Linux and AI deep dive that will come later :D

  • @dallasangler
    @dallasangler หลายเดือนก่อน +113

    My smooth brain read the title as "It's ...Goop". It broke my head.

    • @loganwalsh
      @loganwalsh หลายเดือนก่อน

      It's that little reflection off the GPU and it made me read the same thing 😂

    • @BBWahoo
      @BBWahoo หลายเดือนก่อน

      I read 19:34 as (Canada, Wet Fart, Chase)

    • @reo1994
      @reo1994 หลายเดือนก่อน

      Same 😅

    • @transformerstuff7029
      @transformerstuff7029 หลายเดือนก่อน

      Isn't that the stuff Gwyneth Paltrow sells?

  • @fundude365
    @fundude365 หลายเดือนก่อน +8

    Genuinely the most exciting tech product in a while

    • @transformerstuff7029
      @transformerstuff7029 หลายเดือนก่อน +2

      dude this and the AMD CPUs being affordable and fast is amazing stuff, who would've guessed we could play 1440p games at something like 800 bucks again in 2025!

  • @Bill-lt5qf
    @Bill-lt5qf หลายเดือนก่อน +16

    Finally, a decent improvement to price for perf. It's basically my ~700 quid nvidia card from 5 years ago, for 250. Noice

  • @GeoStreber
    @GeoStreber หลายเดือนก่อน +30

    Depending on how midrange RDNA 4 and RTX 5000 are priced (yeah, right.), I might recommend this one for midrange builds with some gaming in mind.

  • @azjeep26
    @azjeep26 หลายเดือนก่อน +109

    The charts have the B580 with 8GB RAM - it has 12GB RAM?????

    • @Level1Techs
      @Level1Techs  หลายเดือนก่อน +116

      Oops, this is a typo if so; it's 12GB on the GPU

    • @gaav87
      @gaav87 หลายเดือนก่อน +14

      @@Level1Techs Then fix it...

    • @Raizan-IO
      @Raizan-IO หลายเดือนก่อน +29

      @@gaav87 ....

    • @Level1Techs
      @Level1Techs  หลายเดือนก่อน +48

      @@gaav87 While I'm unable to change parts of a video that is live, you can find all revised graphs on our forum post:
      forum.level1techs.com/t/l1-benchmarking-the-new-intel-b580-is-actually-good/221580

    • @tilapiadave3234
      @tilapiadave3234 หลายเดือนก่อน +20

      @@gaav87 Yes SIR ,,, when did you get voted in as God and ruler of all humans?

  • @Norman_Fleming
    @Norman_Fleming หลายเดือนก่อน +11

    I am so relieved this is a good card. Kudos to the team at Intel.

  • @MrSwedish
    @MrSwedish หลายเดือนก่อน +41

    I will go blue for my kids' upcoming upgrades to do my part as a consumer.

    • @EidolonKaos
      @EidolonKaos หลายเดือนก่อน +7

      Thanks for giving competitors a chance, if I have the opportunity to build an entry level system for someone soon I'll do the same

  • @haroldasraz
    @haroldasraz หลายเดือนก่อน +9

    Battlemage seems like such a fun card: it has the bus, the VRAM, and software to let you tweak and OC the card. However, what I am really excited about, besides the hugely more mature drivers, are the AI features that Intel has included, such as running LLMs and AI image generation locally - that's huge for content generators. Hence, I hope it comes to the EU region for 250 EUR.

    • @AyrtonGomesz
      @AyrtonGomesz หลายเดือนก่อน

      Me too man, I am also investigating the Xe2, the NPU4, and the AI capabilities of the NPU in the mobile processors (Core Ultra) for LLM workloads. Looking forward to optimizing my llama.cpp fork for that

    • @tapioorankiaalto2457
      @tapioorankiaalto2457 หลายเดือนก่อน +2

      In Finland it's 330€. I was hoping for 250€ too.

  • @potatoes5829
    @potatoes5829 หลายเดือนก่อน +83

    level1linux video when?

    • @appi1990
      @appi1990 หลายเดือนก่อน +22

      Including Games plz😊

    • @ThePirateParrot
      @ThePirateParrot หลายเดือนก่อน +4

      From the Phoronix review it looks to me like it's still a few steps behind. Still, if they improve just a little bit, my media PC has an A380 in it that might get replaced...

    • @Legion-495
      @Legion-495 หลายเดือนก่อน +4

      I want to know too. I really hope they go all in with open-source drivers, and then it's plug and play to greatness.

    • @artun42
      @artun42 หลายเดือนก่อน

      @@ThePirateParrot Phoronix definitely had some issues with his testing. His GTA V results are abysmally poor. I would wait for someone else to test it.

    • @ranjitmandal1612
      @ranjitmandal1612 หลายเดือนก่อน

      🫡

  • @MagicManCM
    @MagicManCM หลายเดือนก่อน +9

    The flowthrough fan thanks to the smaller PCB is honestly so sick.

    • @syncmonism
      @syncmonism หลายเดือนก่อน

      It's a good design

  • @lolwtfldr3
    @lolwtfldr3 หลายเดือนก่อน +4

    👍 love to see more competitors in the GPU segment

  • @bjn714
    @bjn714 หลายเดือนก่อน +6

    Just remember if going used, the system MUST support Resizable BAR, so nothing older than like 10th gen Intel (there may be some exceptions).

  • @ilovelimpfries
    @ilovelimpfries หลายเดือนก่อน +15

    The only hardware reviewer advice that I actually listened to.

  • @dennisfahey2379
    @dennisfahey2379 หลายเดือนก่อน +16

    What is interesting is Intel's long history in graphics. They focused on commoditizing graphics a very, very long time ago. First they did MMX extensions in the processor core. Then they added vector processing (AVX). They were tightly coupled with Microsoft in providing what desktop graphics needed for acceleration, focusing on their bread and butter and protecting their "moat" by assisting DirectX. Eventually, to make laptops cooler and cheaper, they integrated 3D graphics into the CPU chip, thus moving the bar for a capable sub-$1000 Windows laptop. So they know what they are doing on a technical level and achieve some very impressive power numbers, etc. They are still catching up on non-DirectX graphics engine support in their drivers. That is a tough obstacle as the target is constantly evolving. New leadership is coming in. Let's see how they plant their feet.

    • @DarkKnight32768
      @DarkKnight32768 หลายเดือนก่อน +6

      Ahem. What they really wanted was getting multimedia and gaming on IBM PC compatibles locked to their hardware and software. Libraries, promotional campaigns, supporting Microsoft's One and Only True Way of doing everything through Direct* libraries provided by the system (and written in part by Intel) were all born out of that. And it was a valid strategy in mid-'90s (remember that Doom and Quake were software rendered super hits, and needed a fast CPU - guess who profits from people buying top of the line processors). Then serious 3D accelerators appeared, and in a couple of years Direct3D became “interface to what nVidia and ATi wanted to offer”. Not Intel.
      However, they still managed to make themselves important for big multimedia businesses... through DRM. HDCP? Intel owns it. AACS? Intel participated in making it. Not to mention SGX, uncontrollable and unobservable execution of something only Intel and its clients understand at the firmware level, which can only be used for a single task: implement DRM systems for the big guys. (It is now deprecated, so Intel didn't have enough power to shove it into everyone's throat, but, famously, even before that government security agencies demanded for their systems to be provided with firmware from which such “features” were completely cut.)

  • @vensroofcat6415
    @vensroofcat6415 หลายเดือนก่อน +5

    This kind of price/performance reminds me of good old times when the green was greener - the times of GTX 960, 1060.
    If I were building a system with some 12400F right now I wouldn't have to think for too long I guess.

  • @HawaiianTuber808
    @HawaiianTuber808 หลายเดือนก่อน +4

    I'm very interested in how well these cards perform for entry-level content creation, especially rendering videos in DaVinci Resolve and the HEVC/AV1 encoder for live streaming.

  • @shadow7037932
    @shadow7037932 หลายเดือนก่อน +2

    I never would have considered Intel to be the budget GPU of choice even 5 years ago.

  • @Ale-ch7xx
    @Ale-ch7xx หลายเดือนก่อน +32

    Will you be doing any PCIe 3.0 testing for those of us still on PCIe 3.0, such as the B450 and A520 motherboards?

    • @KhanhDinh291
      @KhanhDinh291 หลายเดือนก่อน +9

      Second this. The results would be useful for eGPU users as well

    • @williamp6800
      @williamp6800 หลายเดือนก่อน +1

      It requires Resizable BAR. Is that even going to be a thing on motherboards with only PCIe 3?
      Edit:
      Hope it is, because nothing I’ve got has PCIe 4.

    • @sjargo11
      @sjargo11 หลายเดือนก่อน

      @@williamp6800 I have used a B450 with ReBAR enabled, and Hardware Unboxed's podcast had one of Intel's guys on; according to him, PCIe 3 should not have any issues

    • @Ale-ch7xx
      @Ale-ch7xx หลายเดือนก่อน

      @@williamp6800 Yes, my board has it.

    • @crashniels
      @crashniels หลายเดือนก่อน +2

      ​@@williamp6800 ReBAR has worked since PCIe 2.0. Newer PCIe 3.0 boards should have it by default, and older ones need a BIOS update (or you mod the option in yourself)

  • @Aerobrake
    @Aerobrake หลายเดือนก่อน +9

    Looking forward to Level1Linux!!

    • @CommodoreFan64
      @CommodoreFan64 หลายเดือนก่อน +2

      Same, I want to see how these cards perform under Linux, and whether it's just about plug and play like AMD has been for me.

    • @Aerobrake
      @Aerobrake หลายเดือนก่อน +1

      especially because I am going to switch to linux very soon

    • @CommodoreFan64
      @CommodoreFan64 หลายเดือนก่อน

      @@Aerobrake I've personally been using Linux since the late '90s when Red Hat Deluxe and Corel Linux came out, but I only made the full switch on all my systems around 2015, when AMD's open-source GPU drivers and Proton for Steam got good enough to run most of my games. I was fed up with Windows 10 being a resource hog, and I was one of the victims of the update that deleted stuff off people's HDDs without their permission. So almost a decade free of MS, and life could not be better for me running Manjaro Gnome on my various AMD CPU/GPU and Intel CPU/iGPU systems. I avoid Nvidia on purpose because it's more expensive and the user experience on Linux is overall worse, since they have been anti open source for so long, forcing the Linux community to build the open-source Nouveau drivers for Nvidia hardware from scratch (they have done amazing work, but should not have had to), while AMD and Intel have their own teams contributing back to the Linux kernel for their hardware drivers, which is a big 👍 in my book.

  • @gunhaver12
    @gunhaver12 หลายเดือนก่อน +1

    Excellent video as usual, Wendell. I love the analysis that goes far beyond game testing numbers.

  • @seylaw
    @seylaw หลายเดือนก่อน +3

    Phoronix showed some impressive compute benchmarks. This good showing is surprising. I hope the new Intel management won't kill off their dGPUs though.

  • @hando87
    @hando87 หลายเดือนก่อน

    Got sidetracked by your benchmarks background. How. Cool. Thank you for your insight once again good sir!

  • @GNstudios1000
    @GNstudios1000 20 วันที่ผ่านมา +1

    I'm using an i7-8700 with my B580. I have a Strix Z390-E mobo. With ReBAR off games are unplayable, but with ReBAR on it is awesome, even with a CPU that isn't "compatible"

  • @GarethFairclough
    @GarethFairclough หลายเดือนก่อน +1

    I'd love one of these. Hope it'll fit inside my Nuc12 extreme!

  • @ajit78910
    @ajit78910 หลายเดือนก่อน +1

    The GPU market needs more players so that consumers are not robbed! Highly experienced companies like Intel should survive in the market so that there is no monopoly. I really hope that this generation of Intel GPUs cracks the market!
    Thanks for the awesome review. Love your work ❤️.

  • @MusashiOf5Rings
    @MusashiOf5Rings หลายเดือนก่อน +1

    I'm interested in a B570 for my home lab with that sweet, sweet transcode.

  • @moravianlion3108
    @moravianlion3108 หลายเดือนก่อน +24

    I know that sales margins are not impressive for shareholders, but Intel needs to keep aggressively fixing their shattered reputation amongst customers.

    • @aitorbleda8267
      @aitorbleda8267 หลายเดือนก่อน +1

      They should have done this a decade ago to sow salt in Nvidia's bread and butter. They decided that the money wasn't there. They were wrong.

    • @jmwintenn
      @jmwintenn หลายเดือนก่อน +1

      they're losing money on these, don't expect them to make many.

    • @WayStedYou
      @WayStedYou หลายเดือนก่อน

      They won't be around much longer at this rate: no margins on these, no margins on their CPUs, no sales in servers.

    • @marcel-q1m
      @marcel-q1m 24 วันที่ผ่านมา

      ​​@@jmwintenn
      gotta love it when people read baseless info on reddit or Twitter from so called "experts" and just regurgitate that misinformation every chance they get

  • @solidreactor
    @solidreactor หลายเดือนก่อน +2

    Would love to see how Arc compares to Nvidia and AMD in streaming and recording, especially when using OBS.

  • @damiendye6623
    @damiendye6623 หลายเดือนก่อน +4

    Question for the room: does it do SR-IOV?

  • @AlexSchendel
    @AlexSchendel หลายเดือนก่อน +5

    Very interesting, I wonder what the difference is between your monitor setup and Gamers Nexus'. They were measuring 35W at idle versus yours at 13W. Perhaps it's still some weirdness with ASPM? Alchemist had issues with never falling under 40W at idle without properly configured ASPM settings and even then, it depended on total display resolution. With my dual 1440p displays, it can't seem to sleep the main GPU and so it still refuses to fall below 40W at idle. Did you test those multi-monitor higher resolution scenarios?

    • @irwnmfletcher
      @irwnmfletcher หลายเดือนก่อน

      I would guess it's ASPM. I had no end of errors from my Samsung 960 after a BIOS update, and a later BIOS update turned ASPM right back off. I had no idea what to look for until months after the fact.

    • @WayStedYou
      @WayStedYou หลายเดือนก่อน

      Sometimes you can get a massive power draw with a new monitor because the card keeps its memory clocked high for no apparent reason.
      Like my 6700 XT will pull 30 W if I turn on my 2nd monitor at 60Hz alongside the 120Hz main panel, but will idle at like 5-8 W otherwise

  • @FreshJ1v3
    @FreshJ1v3 หลายเดือนก่อน +3

    I like this card's reviews; it looks solid for my kids' PC, and I hope they make a bigger one that I can use.

  • @WilliamOwyong
    @WilliamOwyong หลายเดือนก่อน +1

    The way Wendell talks about it makes you feel like Intel really listened to the consumer base and tried to deliver what we wanted, rather than trying to dictate what we want while helping themselves to our wallets.

  • @adamkamieniarz9223
    @adamkamieniarz9223 หลายเดือนก่อน +2

    2:20 if we are thinking about the same display, it only supports 480Hz in 1080p mode, so there is no 4K 480Hz display on the market

  • @00wheelie00
    @00wheelie00 หลายเดือนก่อน +4

    I'll be waiting for a B770/780.

  • @chadwolf3840
    @chadwolf3840 หลายเดือนก่อน +1

    Go intel! Can’t wait to build a budget PC with this gem.

  • @wiggybends3632
    @wiggybends3632 หลายเดือนก่อน +2

    I love this card - for price / performance - drivers much improved / lower power draw

  • @walkersgamingpcs
    @walkersgamingpcs หลายเดือนก่อน +2

    Can’t wait to check 1 out

  • @johnhpalmer6098
    @johnhpalmer6098 หลายเดือนก่อน +2

    Liking this card, may just go ahead and order it. I'm looking at a new build for creative work (video editing etc. using DaVinci Resolve), likely pairing it with a 15th gen Core Ultra 7 and a Gigabyte Aero G Z890 MB for the new socket. If they bring out an updated 770-type card, I'll likely get that eventually.
    All this to update a 7-year-old Dell Optiplex SFF Core i5 with an equally ancient GT 1030, as that's all I could find to work with this setup.

  • @stephanematis
    @stephanematis หลายเดือนก่อน +5

    Waiting on your Linux "where does this lie" video. Also, curious if we can run SteamOS with this.

    • @EidolonKaos
      @EidolonKaos หลายเดือนก่อน +1

      We're all still waiting on Valve to support anything other than their AMD hardware right now, but distros like Bazzite are atomic like SteamOS with support for Nvidia. With either some tinkering or support from some distro maintainer it should be possible

  • @jrose-xp6tf
    @jrose-xp6tf หลายเดือนก่อน

    Man, that's what I wanted to hear...an honest take on Intel's new cards. I'm going to buy one now, after waiting for an all-clear...thanks.

  • @Met1900
    @Met1900 หลายเดือนก่อน

    I think it's impressive what Intel has achieved here in just 2 generations. Really incredible. I hope they also bring out bigger chips like a B770.

  • @roklaca3138
    @roklaca3138 หลายเดือนก่อน +3

    I wonder how high prices will go here in Europe

  • @disconductorder
    @disconductorder หลายเดือนก่อน +20

    It's good bc I want intel to compete

    • @robnobert
      @robnobert หลายเดือนก่อน

      yeah well 😢 it's either the LAST Intel GPU or the penultimate Intel GPU from what I hear. I live over in Hillsboro, OR near D1X and Ronler Acres, and have A LOT of friends at Intel because I live like right here haha, though I don't know anybody directly on the GPU team tbh. But the rumor from Intel employees is that almost the entire GPU team has been fired and the department is being kept alive ONLY to keep the drivers working while they manufacture and release the last few finalized designs. 🙃 I want Intel to compete as well... but I think Intel would rather sit back & collect subsidies while they pray for a merger deal. At least that's what the board wants... and so far there's no Nelson Peltz-like character ready to come in and make war with the board, so they're probably going to win.

    • @PSXman9
      @PSXman9 หลายเดือนก่อน

      @@robnobert Meanwhile the words of Intel directly:
      The software team is currently refining Xe3, while the hardware team has begun working on its successor, Xe4.

    • @robnobert
      @robnobert หลายเดือนก่อน

      @PSXman9 🙄 Irrelevant. Intel isn't stopping ALL GPU dev. They're stopping DISCRETE GPU dev, which is what we're talking about.

    • @transformerstuff7029
      @transformerstuff7029 หลายเดือนก่อน

      @@robnobert That sounds weird given their guy said on the Gamers Nexus channel that they were already working on the next one.
      It would be weird to scrap a division that finally did something worthwhile.
      "but I think Intel would rather sit back & collect subsidies while they pray for a merger deal."
      No. Just no. Like......no. wow....not even.

    • @Bomkz
      @Bomkz หลายเดือนก่อน

      @@robnobert when pat gelsinger referenced discrete GPUs, it was in a wider context of portable devices (i.e laptops) and basically said intel should shift priority from making laptop dGPUs, to APUs that can compete with AMD's Z1.

  • @camogeko6804
    @camogeko6804 หลายเดือนก่อน

    Nice to see Wendell in the new setup! :D

  • @DavidAlsh
    @DavidAlsh หลายเดือนก่อน +22

    vm sharing? 👀

    • @ezekialsa
      @ezekialsa หลายเดือนก่อน +1

      Only as good as vendor support

  • @davidholder2100
    @davidholder2100 หลายเดือนก่อน +1

    Great review. I would love to see you review this card with the new Intel 265 CPU. Thanks

  • @spuchoa
    @spuchoa หลายเดือนก่อน +1

    Great review, hoping for the Linux one!

  • @Paakli
    @Paakli หลายเดือนก่อน +3

    Perfect card if you want to upgrade from your 1080p monitor

  • @kettusnuhveli
    @kettusnuhveli หลายเดือนก่อน +7

    Interesting to see you hit 13 W idle with this card when every other review hits a whopping 30 W at best. Wonder where the discrepancy is coming from?

  • @jevees01
    @jevees01 หลายเดือนก่อน

    Hooray! Long form video! (Not sarcasm, am genuinely happy)

  • @SwirlingDragonMist
    @SwirlingDragonMist หลายเดือนก่อน +1

    Highlighting for prospective buyers that you need a motherboard with Resizable BAR to make this sing, or even for light chirping.

    • @tontsar91
      @tontsar91 หลายเดือนก่อน +1

      Very important point I haven't seen often enough in the comments.

  • @matiasm.3124
    @matiasm.3124 หลายเดือนก่อน +2

    What about this card for gaming in Linux??

  • @Matt-gi2kn
    @Matt-gi2kn หลายเดือนก่อน +1

    This is some astounding progress for Intel. If they keep this up, they'll be trading blows with nvidia in most segments within a few years!

  • @sitaroartworks
    @sitaroartworks หลายเดือนก่อน +1

    If you aim for a good Linux machine, this is the card, because compared to any Radeon out there it has oneAPI fully supported now. Especially for those who consider Blender a must.

  • @phil.willoughby
    @phil.willoughby หลายเดือนก่อน

    Excited to buy the same thing with blessed drivers as the Arc Pro B60 for twice the price

  • @Dan-Simms
    @Dan-Simms หลายเดือนก่อน +1

    This may be my next GPU. I will be waiting for team red and green's next low- and mid-range models, but I at least need a card with more than 8GB of VRAM for my 1440p ultrawide. I will have to see how the new and used options compare price-to-performance-wise about 6 months from now.

  • @intermarer9145
    @intermarer9145 หลายเดือนก่อน +2

    How does it work in Linux? X11 and Wayland both good? Can it do 4k@120 (HDMI 2.1) unlike AMD?

  • @dustinhipskind7665
    @dustinhipskind7665 หลายเดือนก่อน

    It would be super interesting to see how the b580 would perform with rebar off

  • @radeksparowski7174
    @radeksparowski7174 หลายเดือนก่อน +2

    Any info on whether Intel is planning a successor to the A310/A380? Single slot, no extra power connector, passively cooled if possible, with 4+ monitor support and the latest codecs and technologies would be nice to have for around 150 euros. Not everybody runs local LLMs or games all the time; think photo and video editing with the occasional FHD game, where the Nvidia 720/730/1050/1650/1660 or AMD 6400 are a bit obsolete or unnecessarily crippled.

    • @EidolonKaos
      @EidolonKaos หลายเดือนก่อน +1

      That's quite the ambitious wishlist

    • @radeksparowski7174
      @radeksparowski7174 หลายเดือนก่อน

      @@EidolonKaos jingle bells, jingle bells....wishlist delivered, product incoming???

  • @enihi
    @enihi หลายเดือนก่อน +6

    How's the performance on PCIE gen 3?

    • @Level1Techs
      @Level1Techs  หลายเดือนก่อน +1

      The 10700K that was used to test is limited to PCIe 3.0. ~ Amber

  • @Deveyus
    @Deveyus หลายเดือนก่อน +1

    I don't see new flex GPUs? Do we see anything about SR-IOV on these?

  • @colinmaharaj
    @colinmaharaj หลายเดือนก่อน +1

    7:35 And this is a big thing for me in the tropics: low(er) power consumption.
    Intel is doing it with their recent CPUs; they are late to the game, but they will get there.

  • @Hans-gb4mv
    @Hans-gb4mv หลายเดือนก่อน

    I'm waiting for the B7x0 series, but I hope they can continue the path they have chosen.

  • @Nexxxeh
    @Nexxxeh หลายเดือนก่อน

    12:12 - The second scene has background noise and music, and I think it's the combination that made me struggle disproportionately. I've a relatively minor audio processing issue, but I mention it coz it might trip up others too. I don't generally need subtitles to watch your content, but this bit was hard going.
    It's gunna be weird having "AMD CPU and Intel GPU" builds, but that seems to be where the value is here at the moment. Crazy times!

  • @dtsdigitalden5023
    @dtsdigitalden5023 หลายเดือนก่อน +20

    Intel fighting for scraps in the low end section, both with CPU and GPU, is exactly what that company needed for a wake-up call. It's good for them, and it's good for us consumers. This card is an unexpected victory, and we should cheer Intel on, that credible competition is sorely needed. I can't help but think this only weakens AMD's GPUs sales however, since Nvidia pinches its nose as it walks past the low-end GPU marketplace. Nvidia buyers will not be tempted by this card, but potential AMD buyers will.

    • @johnathanmcdoe
      @johnathanmcdoe หลายเดือนก่อน +4

      If AMD finds a way to make their RT stuff more competitive, I could see this turning into a true three way split of the market Intel/AMD/Nvidia -> low-mid/mid-high/stupidmoney.

    • @benjaminoechsli1941
      @benjaminoechsli1941 หลายเดือนก่อน +3

      ​@@johnathanmcdoeThe rumors are that RDNA 4 should be about equivalent to Lovelace in ray-tracing, iirc. Excited to see how RDNA 4 shakes out.

    • @dtsdigitalden5023
      @dtsdigitalden5023 หลายเดือนก่อน +3

      @@johnathanmcdoe Intel vs AMD in the low to low-mid tier, AMD vs Nvidia in the mid tier, Nvidia vs nobody in the high end? It does seem it's AMD fighting the war on both fronts, and Intel isn't taking Nvidia on at all .. but that's my silly opinion.

    • @clansome
      @clansome หลายเดือนก่อน +2

      @@dtsdigitalden5023 Why fight someone else's battle if you haven't got the ammunition to fight it? It will be interesting to see how this scales to a B7xx. Looking forward, Celestial is already taped out, as is Druid. Could we have a C9xx or a D9xx? Things will be interesting in the next few years.

    • @Freestyle80
      @Freestyle80 หลายเดือนก่อน

      @@benjaminoechsli1941 Why do people always believe bogus AMD rumors? It's always false

  • @Wintelburst
    @Wintelburst หลายเดือนก่อน +1

    Wondering how that would work in video production and capture.

  • @Rabbit_AF
    @Rabbit_AF 15 วันที่ผ่านมา

    I thought I was going crazy when the driver overhead thing was treated as a brand-new problem, but nope, Wendell had pointed that out here in his day-one review.

  • @Pårchmēntôs
    @Pårchmēntôs หลายเดือนก่อน +1

    1:10 also assuming there’s enough supply and it’s not scalped to the moon.

  • @kcanaladag
    @kcanaladag หลายเดือนก่อน

    thank you very much for your hard work

  • @jsclayton
    @jsclayton หลายเดือนก่อน +1

    Haven’t been able to find whether there were any QuickSync improvements this generation. Would love to see some ffmpeg or HandBrake benchmarks.

  • @biniyamwhite3015
    @biniyamwhite3015 หลายเดือนก่อน

    Awesome review!

  • @willfancher9775
    @willfancher9775 หลายเดือนก่อน +1

    Your total board power at idle is way lower than GN measured. Any idea why?

  • @digicorefx5930
    @digicorefx5930 หลายเดือนก่อน +1

    So, my question is how is the card with virtualization?

  • @julesvanlaar
    @julesvanlaar หลายเดือนก่อน +1

    That's a bummer, but I should have expected it. I was maybe going to buy it for one of my old systems (7700K, 2600K), but with the Resizable BAR requirement I should have known that was not gonna fly.

  • @LinusBerglund
    @LinusBerglund หลายเดือนก่อน +2

    GN had idle consumption at 30 W. Does anyone have any theories?

    • @Berserkism
      @Berserkism หลายเดือนก่อน

      Intel already told reviewers how to address this: one change in the BIOS and one in the power settings. Then idle is about 12 W versus the 4060's 8 W or so. It's not a big difference.

  • @ZombieJig
    @ZombieJig หลายเดือนก่อน +3

    Is this a significant upgrade from GTX1080 (non TI)?

    • @chillhour6155
      @chillhour6155 หลายเดือนก่อน +1

      For 1440p gaming plus frame generation support yes

  • @trudeo42
    @trudeo42 หลายเดือนก่อน +2

    Amazing review. But way too many ads.

  • @milescarter7803
    @milescarter7803 หลายเดือนก่อน

    5:30 The all-in-one boards with mobile CPU are interesting to be sure, but as soon as you get a GPU a $125 Ryzen 8700F starts to make a lot of sense (Or dare I say it an 8400F). Even adding $30-40 for an AXP120 cooler and $100 for a Night Devil B650i you are still in a better position by enough to afford 2 other components: Case/PSU/RAM/SSD.

  • @doughy041
    @doughy041 หลายเดือนก่อน +3

    I want to see a B770 card but I keep hearing that this might or might not happen which would be a shame since all I hear is decent stuff about this card.

    • @robnobert
      @robnobert หลายเดือนก่อน

      Whether we see the B770 is still unknown. But what is certain is we only have 1-2 GPUs left from Intel.
      Apparently nearly the entire department has been let go and the only people they kept were to keep drivers working for a few years.
      They're going to release the few cards that were already in the pipeline and ready for manufacturing... and then close up shop.
      GPUs were Pat Gelsingers thing. And the corporate power players at Intel HATED that department.
      And now we know who won that battle 😢

    •  หลายเดือนก่อน

      This was debunked over and over. It was confirmed that they're working on the 4th gen already. Go see the Gamers Nexus videos with Tom Petersen @@robnobert

  • @DMonkeyR
    @DMonkeyR หลายเดือนก่อน +1

    Hi Wendell, you mentioned testing the 10700K and the performance being lower, possibly due to driver overhead. The B580 is limited to x8 PCIe; could that have caused the lost performance on the 10700K, which only supports PCIe 3.0? Maybe a quick check of performance on PCIe 4.0 vs PCIe 3.0 would confirm?

  • @maxwellsmart3156
    @maxwellsmart3156 หลายเดือนก่อน +1

    One has to wonder what the BOM cost is on this 12 GB card. What is the price of GDDR6 at the moment? This is definitely a fire sale, and they will only sell a volume of cards based on the TSMC contract fulfillment. Still, the 6nm 7600 8 GB vs. the 5nm B580 12 GB: a node difference and not necessarily a node performance improvement. You did a good job of framing this review by not having the 4060 8 GB to compare. Depending on how the RX 8600 performs with RT, it could be the nail in the coffin for Battlemage, but I guess B580 pricing is to clear these guys out before that happens.

  • @DennisMarwood
    @DennisMarwood หลายเดือนก่อน +2

    Do you think they will be releasing something like the A310? Maybe a B310?

    • @ibobeko4309
      @ibobeko4309 หลายเดือนก่อน

      Hard to say right now; they have financial difficulties, so maybe they'll concentrate on the midrange for now.

  • @blahorgaslisk7763
    @blahorgaslisk7763 หลายเดือนก่อน +3

    I really want to buy a B580 when I build a new machine, but I most probably won't. The simple reason is I'm looking for quite high gaming performance, more in line with an RTX 4080 or RX 7900. The B580 would just be something I wanted so I could try it out. The sad thing is it won't work in my old machine either, even though it has a lot better performance than my old GPU, because that machine doesn't support Resizable BAR. These Battlemage GPUs really require Resizable BAR to perform anywhere near reasonably, just like the Alchemist GPUs do.

    • @StarsAtNight1
      @StarsAtNight1 หลายเดือนก่อน +3

      I also thought my older Intel machine (8 series) didn't support Resizable BAR, but to my surprise it has a BIOS update available that adds it. Maybe worth checking if that's also the case with your platform.

  • @jarrettlone
    @jarrettlone หลายเดือนก่อน +1

    How good are these Intel Arc cards in Linux? Any good or bad personal experience out there?

  • @lYarmontl
    @lYarmontl หลายเดือนก่อน +1

    I've had the card for a week, but there are no drivers. Can you share them with me? 😢

  • @Boss_Fight_Index_muki
    @Boss_Fight_Index_muki หลายเดือนก่อน +5

    So basically it has the same power as the ps5 gpu, the 6700, for $250. Impressive

    • @golgothgaba904
      @golgothgaba904 หลายเดือนก่อน

      it's on par with the RX 6700 XT, which is 10 to 20% better in games than the RX 6700.

  • @user-suzie1818
    @user-suzie1818 หลายเดือนก่อน

    @Level1Techs Could you please make a comprehensive test video on the overhead issue when pairing the B580 with older CPUs? I think it's very important because after so many positive reviews published online, many people want to upgrade their old GPUs to this one, but they most likely are still using an old CPU right now and thus will not get the full performance out of this GPU.

  • @Adreno23421
    @Adreno23421 หลายเดือนก่อน

    7:18 how did you measure 13 watts of usage if GamersNexus was measuring over 30 watts on the card, just idling?