What the ARC A580 Means for the A770

  • Published Dec 25, 2024

Comments •

  • @JustFeral
    @JustFeral 1 year ago +104

    I really hope Intel doesn't give up on the GPU market. They are sorely needed to keep Nvidia and AMD in check.

    • @SaltCollecta
      @SaltCollecta 1 year ago +8

      They have no choice. They started a core war with AMD and lost. They have to expand into the GPU market to stay relevant.

    • @Trevellian
      @Trevellian 1 year ago +16

      Therein lies Intel's quandary. Many consumers only want Intel (and AMD) to improve their GPUs so that they can buy an Nvidia GPU more cheaply. These consumers would never *actually buy* a non-Nvidia GPU... ewwww.
      But unless a reasonable number of consumers actually *do buy* non-Nvidia GPUs, there will be diminishing justification for Intel (and AMD) to continue with the consumer side of the GPU market. Instead, they'll focus their resources on the A.I. side and its considerably larger margins, leaving the Sisyphean task of optimizing gaming drivers to Nvidia.

    • @Trevellian
      @Trevellian 1 year ago +3

      But they don't have to stay in the *consumer* GPU market. The margins are staggeringly higher for enterprise, A.I. GPUs. And those products don't require the never ending task of optimizing gaming drivers. @@SaltCollecta

    • @slickjim861
      @slickjim861 1 year ago

      I highly doubt that Intel got into all this thinking they were going to just sell to gamers. If you remember, when they got into all this it was when the crypto crap was going on. They were probably just shipping cards off to miners like AMD and Nvidia did, while making money like crazy. Which would explain why their drivers sucked so hard on launch day. Now they have AI to sell cards for thousands, so no matter how you cut this pie, we are always the smallest slice that nobody really cares about.

    • @TAP7a
      @TAP7a 1 year ago +6

      @@Trevellian hell, Nvidia themselves aren't interested in consumers, especially low-margin whiny gamers, when they can make up to $37,000 *per card* selling H100s in units of multiple racks packed with cards to AI/ML hyperscalers and specialists whose only public comments will be case studies and self-promotion

  • @barapatan
    @barapatan 1 year ago +17

    I've owned an Intel A770 LE 16GB since March 2023. It was a frustrating experience in the beginning but it's 100% better as it stands today. The latest retail drivers are almost to the point where I would recommend this card to the budget minded gamer (at my $350 purchase price, even better below). However the Arc control center is... still developing.
    I would recommend just using the driver and avoiding the addon software for now, unless you are interested in providing feedback back to Intel.
    I'm interested in buying Battlemage when it comes out. I fervently hope Intel keeps in mind performance per price in the future and keeps in the game.
    No need to be the best GPU on the block when the lower-to-mid-tier market has so much room for competition.

    • @zk9964
      @zk9964 1 year ago +2

      Just picked up an a750 to replace a 980Ti. Agreed there are some things intel needs to continue to improve on but at $200 this is a deal.

  • @labor4
    @labor4 1 year ago +12

    the tension I myself was pulling through this vid was the expectation of any hint at SR-IOV.
    That would amplify all hopes mentioned by 200%

  • @Onticly
    @Onticly 1 year ago +3

    Getting my 770 tomorrow! I hope they continue developing and improving their gpus. Looking forward to their "battlemage" gpu hopefully coming next year

  • @michaelh9936
    @michaelh9936 1 year ago +24

    WRT adding Arc to older PCIe 3.0 x16 systems, the lack of Resizable BAR is likely to be more of an issue for Arc than running at x8 on Radeon, unless ARC's performance with ReBar off has changed since launch.

    • @danieloberhofer9035
      @danieloberhofer9035 1 year ago +3

      You're absolutely right. Unless the system in question is among the very few that only support PCIe 3 but have been updated to support ReBAR, any Alchemist GPU is "do not buy" territory. Intel confirmed long ago that Alchemist's memory subsystem depends heavily on ReBAR, and fixing that would require changes to the silicon - so that's not happening.
      They said that they'd look into that for Battlemage. We'll see.

    • @PhAyzoN
      @PhAyzoN 1 year ago +5

      Z390 (maybe Z370?) and Z490 are PCIe 3.0 and support ReBAR. Arc could be a reasonable choice for someone with a 9th or 10th gen Intel CPU.

    • @sijedevos2376
      @sijedevos2376 1 year ago

      It has not changed.

    • @Heartsbane055
      @Heartsbane055 1 year ago +3

      Most motherboards and CPUs released in the last 5 years support Resizable BAR, so that's a non-issue.

    • @soldiersvejk2053
      @soldiersvejk2053 10 months ago +1

      It actually is not a big deal. After all, for older platforms the bottleneck is often the CPU. (Like my 1st-gen Ryzen Dell that unfortunately lacks BIOS support for any newer CPU.)
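
The thread above keeps coming back to whether a given board actually has ReBAR enabled. On Linux this is visible in `lspci -vv` output; below is a minimal parsing sketch. The helper name and the sample text are illustrative assumptions, not something from the video.

```python
import re

# Hypothetical helper: given `lspci -vv` text for a GPU, report whether a
# Resizable BAR capability is present and the currently negotiated BAR size.
def rebar_status(lspci_text: str):
    has_cap = "Physical Resizable BAR" in lspci_text
    m = re.search(r"BAR 2: current size: (\d+[A-Z]+)", lspci_text)
    return has_cap, (m.group(1) if m else None)

# Sample output shape (assumed for illustration).
sample = """\
03:00.0 VGA compatible controller: Intel Corporation DG2 [Arc A770]
        Capabilities: [420 v1] Physical Resizable BAR
                BAR 2: current size: 16GB, supported: 1GB 2GB 4GB 8GB 16GB
"""
print(rebar_status(sample))  # (True, '16GB')
```

A full-VRAM-sized BAR (16GB on an A770) is the sign that ReBAR is actually active, not just supported.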

  • @orl2222
    @orl2222 1 year ago +1

    I bought an A380 when it released. I'm not a gamer, but I had bad flickering on YouTube videos using HDMI on my 43" 4K HDR TV. I almost gave up on the card and went back to an old Nvidia 1050. The latest driver fixed HDR, and the flickering, I discovered, was due to not disabling the overlay in the Arc control center. The key thing is that the Intel Arc cards have Intel Quick Sync and AV1 encode and decode built in. I am using a Ryzen 5600G on a B550 board. I do some video editing, and with Adobe Media Encoder and Handbrake this lil card does a good job. I think Adobe support for these cards will get better. So if you need the wider availability of Intel Quick Sync for encoding programs, these Arc cards are the ticket, especially if you're running an AMD system.

  • @Captn_Grumpy
    @Captn_Grumpy 1 year ago +2

    Picked up a discounted A750 a while ago. I love it. I don't do a lot of PC gaming but it generally holds up great (given the price); where the fun is, though, is encoding. AV1 at over 300 FPS and I haven't even tried optimising it. I will have to see if I can throw some AI workloads at it one of these days. Been looking at options for my daughter's PC, but at 1080p (her normal resolution) there is little reason to pay over the odds for AMD/NV solutions when the A-series will hold her over.
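
For anyone wanting to reproduce that kind of AV1 result, ffmpeg exposes Arc's hardware encoder through Quick Sync as `av1_qsv`. A minimal sketch of the invocation follows; the file names and bitrate are placeholders, not values from the comment.

```python
# Build an ffmpeg command line for AV1 hardware encoding on Arc via Quick
# Sync (the `av1_qsv` encoder). Pass the list to subprocess.run() to execute.
def av1_qsv_cmd(src: str, dst: str, bitrate: str = "6M") -> list[str]:
    return [
        "ffmpeg",
        "-hwaccel", "qsv",     # hardware-accelerated decode where possible
        "-i", src,
        "-c:v", "av1_qsv",     # Arc's AV1 hardware encoder
        "-b:v", bitrate,
        "-c:a", "copy",        # pass audio through untouched
        dst,
    ]

print(av1_qsv_cmd("input.mkv", "output.mkv"))
```

This requires an ffmpeg build with QSV support and working Intel media drivers; software encoders like `libsvtav1` are the fallback if the hardware path is unavailable.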

  • @Epsilonsama
    @Epsilonsama 1 year ago +21

    I'm an AMD GPU user and I never had the major issues some say they have but I still root for Intel because more options is better for us consumers.

    • @rawdez_
      @rawdez_ 1 year ago +1

      it's NOT difficult to make gains and advancements. IT'S UNPROFITABLE TO MAKE THEM.
      the main concern corporations have is not "how do we make hardware faster than our old hardware, or than what our 'competition' has, for cheaper" but "how do we avoid making new tech too fast, so our old products keep their value and we can keep milking the market while spending as little as possible on R&D". they are perfectly capable of making WAY FASTER hardware; it's just that MILKING THE MARKET IS WAY MORE PROFITABLE.
      in fact NGREEDIA is not only milking the market with overpriced AF GPUs but is also pushing buyers toward NGREEDIA GaaS services.
      in fact AyyMD is not only milking the market with overpriced AF GPUs but is also pushing buyers toward the consoles it makes chips for.
      ACTUALLY COMPETING IS THE LAST THING THEY WANT TO DO.
      corporations never want their new products to be 2-3-5 times better/faster than their last ones - disrupting markets like that is unprofitable: prices on their older hardware drop too much, margins drop, profits drop.
      so there is NO COMPETITION, BECAUSE ACTUALLY COMPETING IS UNPROFITABLE.
      that's why corporations today price-performance match their new products to existing ones, so there's zero competition and prices don't drop, including the corporation's own prices on its older products. that's why everything is overpriced - GPUs, CPUs, laptops, smartphones etc. - and the performance gain per year per dollar is as low as 0-15%. we're pretty much technologically stuck because of this.
      it's industry-wide price fixing to milk the market maximally. in the 2000s progress was way faster, probably because corporations hadn't bought all the regulators yet and there was actual competition. 15 years ago, when inter-generational advances were insane compared to today, prices weren't climbing at all. now you can literally hold on to a build for more than 5 years while they jack up the prices. it doesn't make sense.
      consumers always pay for R&D; it's just that at current prices consumers have already paid for R&D for something like the next 20 years. just compare these corporations' yearly profits with their R&D expenses and stop talking about R&D costs.
      they do R&D once, cut their products down to deliver low gains (or raise the price so the new product doesn't compete much with the old one), and milk the same tech for years so prices on the market don't drop.
      and people buy into the "Moore's law is dead" line. nah, technically Moore's law is alive and well; it's just unprofitable for corporations to follow it, so they don't. they milk the market instead.
      for corporations it's not about making faster/better/more affordable tech; it's all about milking profits with incremental 0-15% "upgrades" - milking the tech they have to the max.
      EVERYTHING IS OUTDATED because ALL the corporations killed even the possibility of meaningful progress in tech to milk the market with the same crap year after year.
      we are technologically stuck in roughly 2015: 10 years now deliver the progress that 1.5-2 years delivered in the 2000s.
      a 15-year-old Core 2 Duo is only about 8 times slower than a "modern" Ryzen 5700X, but by Moore's law it should be more like 128 times slower. the 4060 Ti is only 5% faster than the 3060 Ti after 3 years.
      GPUs are 3-4 times overpriced compared to pre-mining prices, and ALL the techtubers - GN, LTT, HUB - don't really criticize that at all. they normalize the overpricing instead, talking about the "value" of 3-4-10x overpriced products relative to other overpriced AF products, never comparing against the past 5-10 years, so they have a goldfish perspective on progress - not considering anything from more than 5 minutes ago. techtubers are basically shills now, working for the corporations even if they don't realize it - THEY ARE WHAT DRIVES PRICES UP and what normalizes overpricing for viewers. what they make is ads, not reviews.

  • @Wild_Cat
    @Wild_Cat 1 year ago +2

    Intel Arc FTW

  • @bosstowndynamics5488
    @bosstowndynamics5488 1 year ago +3

    I wouldn't expect nearly as much improvement out of the A580 as the A770, since they're both on the same architecture and, presumably, many of those improvements were architecture-level rather than card-specific.

  • @MazeFrame
    @MazeFrame 1 year ago +2

    Been looking at the Intel Arc Pro A60 for productivity, yet I'm absolutely unable to find one anywhere, even as an option in an OEM system.

  • @kurtwinter4422
    @kurtwinter4422 1 year ago +32

    The $100-200 range has never been so interesting.

    • @rawdez_
      @rawdez_ 1 year ago +4

      used market has never been so interesting

    • @rawdez_
      @rawdez_ 1 year ago +1

      GTX 980 Ti: die size 601 mm², 2015 launch price $650
      GTX 980: die size 398 mm², 15% slower, $550
      GTX 970: die size 398 mm², 30% slower, $330
      RTX 4090: die size 609 mm², launch price $1600
      RTX 4080: die size 379 mm², 30% slower, $1200. it's an xx70-class card.
      the RTX 4090 should cost 600 bucks and the RTX 4080 should cost 300 bucks, because TECHNOLOGICAL PROGRESS - ever heard of it?
      yeah, ngreedia can sell 4 times fewer GPUs for 4 times more money each and make the same profit, but then gamers get 4 times fewer new, faster GPUs - and game devs don't make games for overpriced AF hardware; they make games for hardware MOST people can afford, so there's a player base to sell games to. plus most new GPUs show basically zero progress. that's how ngreedia is killing PC gaming with its 4x overpriced crap.
      when you buy a top overpriced AF GPU, there will be a minimal number of games that actually need such a GPU. and most new "hard to run" games are actually boring AF crap with bad gameplay and stupid storylines. and YOU SUPPORT GREEDY AF CORPORATIONS AND THE DEATH OF PC GAMING through overpricing.
      and ayyymd isn't better - it supports the overpricing so it can sell more obsolete, cheap-to-make, high-margin PS5/Xbox chips to Sony/MS. ayyymd supports ngreedia's overpricing to push PC gamers toward crap consoles. both corporations profit from slowing technological progress: no progress = no price drops.
      people really should ONLY get used GPUs and stop feeding GREEDY AF corporations at this point.

    • @VideogamesAsArt
      @VideogamesAsArt 1 year ago +1

      I mean... new? At that price, new is only the RX 6400, RX 6500 XT, RX 6600 and Arc A580. Of those, only the 6600 and A580 are interesting. Back when the GTX 16-series could play anything, this price range was much, much better.

    • @rawdez_
      @rawdez_ 1 year ago +1

      @@VideogamesAsArt it's kinda funny and sad that ALL the techtubers are basically shills at this point, ignoring the fact that pre-mining top GPUs cost around 600 bucks, not 1600. the PC market is basically dead from overpricing, and "tech tubers" just roll with it, promoting 3-4x overpriced, slow, low-VRAM, basically OBSOLETE crap. what a time to be alive!

    • @slaydog5102
      @slaydog5102 5 months ago

      @@VideogamesAsArt you must be a fun person

  • @adhdengineer
    @adhdengineer 1 year ago +5

    Did you check 1440p? I've noticed a lot of benchmarks with the Intel cards doing disproportionately better at 1440p than the 1080p benchmarks would suggest. Almost like they have driver call overheads more than they do actual rendering overheads.

    • @EbonySaints
      @EbonySaints 1 year ago

      This. There's a few conspiracy theories floating around that Intel's drivers are considerably more CPU bound than Nvidia's, leading to decreased performance on six cores or less, at least on Ryzen 5000 CPUs.

    • @adhdengineer
      @adhdengineer 1 year ago +1

      @@EbonySaints now that would be an interesting set of test data... Intel arc cards benchmarked across a full range of CPUs...

    • @EbonySaints
      @EbonySaints 1 year ago

      @@adhdengineer It isn't much, but this is where I got some of that Ryzen info from. Mind you, it's half a year old now, so it might not be as relevant: th-cam.com/video/wps6JQ26xlM/w-d-xo.html

    • @Berserkism
      @Berserkism 1 year ago

      Agreed. In the benchmarks I have seen, Intel regains ground at 1440p, disproportionate to what the 1080p results would initially suggest.

    • @douglasmurphy3266
      @douglasmurphy3266 1 year ago +1

      1440p will benefit from the 256bit memory bus on the ARC when compared to its 128bit limited competitors.
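
The speculation in this thread - that Arc's 1080p numbers are capped by driver/CPU overhead rather than the GPU - can be framed as a quick sanity check: 1440p has about 1.78x the pixels of 1080p, so a truly GPU-bound card should lose roughly that much framerate. A rough illustrative heuristic follows; the function name and the 0.7 threshold are arbitrary assumptions, not a benchmarking standard.

```python
# Illustrative heuristic, not a benchmarking methodology: if the fps drop
# from 1080p to 1440p is much smaller than the ~1.78x pixel-count increase,
# the card likely wasn't GPU-bound at 1080p (CPU or driver overhead instead).
def likely_cpu_bound_at_1080p(fps_1080: float, fps_1440: float) -> bool:
    pixel_ratio = (2560 * 1440) / (1920 * 1080)   # ~1.78x more pixels
    observed_drop = fps_1080 / fps_1440
    return observed_drop < 0.7 * pixel_ratio      # arbitrary 0.7 fudge factor

print(likely_cpu_bound_at_1080p(120, 110))  # True  (fps barely moved)
print(likely_cpu_bound_at_1080p(160, 90))   # False (fps scaled with pixels)
```

Real scaling is rarely perfectly linear in pixel count, which is why the fudge factor is there at all.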

  • @TuxPeng
    @TuxPeng 1 year ago

    My BiFrost A770 has been flawless on Linux since kernel 6.2. No regrets.

  • @abdullahtiry6394
    @abdullahtiry6394 5 months ago +1

    Hi, which GPU is better for 4K Plex transcoding? I only need transcoding on my server, no gaming.

  • @wjack4728
    @wjack4728 1 year ago

    One thing to keep in mind with the Intel Arc cards is that DOSBox doesn't work with them in Windows. It did at one time, but since about 4 months ago, with driver 4676, DOSBox quit working. From what I've read, Intel is not motivated to fix the problem, so it may never get fixed. The Arcs are great cards in my opinion, but if you need DOSBox to work, go with AMD or Nvidia. I've got an Arc A750 and really like it, but I just tried the latest driver, 5084, and DOSBox is still broken, so I'm going to switch back to my AMD card.

  • @maestro0428
    @maestro0428 1 year ago

    I bought in. For the video editing aspect.

  • @Kendop16
    @Kendop16 1 year ago +1

    I've got the Acer BiFrost A770. To be honest, it's become a bit of a love-hate relationship. Brand new games seem to run quite well on it, but older titles are completely hit and miss! Unfortunately I have a lot of older games that are simply unplayable. What makes my situation worse is that I built a cheap gaming PC for my son using second-hand parts. The only part I got brand new was an AMD RX 6600. It's gutting to see a card that cost £198 beat my Intel Arc that I paid £377 for about 6 months ago. Not all games, but a significant number are better/smoother/less hassle on the RX 6600. I had my eyes open when I bought the Arc, so I knew what I was getting into. I'm still rooting for Intel to do well though! I'd recommend an Arc if you're playing just current titles. However, I wouldn't recommend it if you have lots of older games you want to play.

    • @AshenTech
      @AshenTech 1 year ago

      DXVK-gplasync and dgVoodoo2

  • @MM.x.00
    @MM.x.00 1 year ago

    Wendell, I'm having a good day and your video is helping that 😃👍

  • @LA-MJ
    @LA-MJ 1 year ago +1

    How's that SR-IOV going?

  • @Anthony-Webb
    @Anthony-Webb 1 year ago +3

    I've been looking to upgrade from a Vega 56 that is beginning to artifact to something more modern and less power hungry. I'm interested in these because of the price, but can only seem to find conflicting information for Linux support. I don't game too much, but on the rare occasion that I do I'd like it to run decently.

    • @aeaeaeaeoaeaeaeaeae
      @aeaeaeaeoaeaeaeaeae 1 year ago +3

      Yeah, same here. I really need good Linux support, but it looks like they're only doing drivers on Ubuntu. Shame. Guess I'll keep my RX 580 a while longer lol

    • @EbonySaints
      @EbonySaints 1 year ago

      @@aeaeaeaeoaeaeaeaeae The drivers are actually integrated into the i915 kernel driver, so it was pretty plug and play for the day I used Debian. Played a bit of EU4 with my A380 too.
      However, there's been a bit of whinging over media hardware acceleration not being on that specific driver, leading to a chicken and egg scenario. Like all info relating to Arc, your best bet is to head over to either the subreddit or the Discord and ask around for anyone's experience on the subject. Arc troubleshooting is still in the Wild West days and good solid information is still hard to come by.

  • @Decenium
    @Decenium 1 year ago

    Intel should really also work on pushing XeSS more. It's just so weird how often it's not an option - like the most recent example, Alan Wake 2.

  • @shaunhall6834
    @shaunhall6834 1 year ago

    You can do this Intel! ❤

  • @frallorfrallor3410
    @frallorfrallor3410 1 year ago

    got a 4090 Strix OC, a 7900 XTX Red Devil, a 770 16GB LE and a Sparkle Titan 770 16GB. i love them all like a harmonic family 🤩

  • @Pins33ker
    @Pins33ker 10 months ago

    Interesting info here, but I would be more interested in the encoding (AV1, H.264) performance of these GPUs (maybe adding an Nvidia 4070 or equivalent against an Intel A770) for streaming, not gaming. Nvidia 40-series GPUs are way too expensive; the A770 looks to be the way to go for a content creator like me. Can you do a video on that?

  • @WayStedYou
    @WayStedYou 1 year ago

    Aren't the Intel GPUs entirely useless without PCIe 4.0 resizable BAR?

  • @jierenzheng7670
    @jierenzheng7670 7 months ago

    Still waiting for the Linux xe driver to reach feature parity with the Windows one =( A770 user here.
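
For anyone wondering which kernel driver (the legacy i915 or the newer xe) is actually bound to their Arc card, the driver symlink under sysfs gives the answer on Linux. A small sketch, under the assumption of the standard sysfs layout and a single-GPU system exposing `card0`:

```python
import os

def driver_name(link_target: str) -> str:
    # The sysfs driver symlink points at the driver's directory;
    # its basename is the driver's name.
    return os.path.basename(link_target)

def gpu_kernel_driver(card: str = "card0"):
    link = f"/sys/class/drm/{card}/device/driver"
    try:
        return driver_name(os.readlink(link))  # e.g. "i915" or "xe"
    except OSError:
        return None  # no such card, or not running on Linux

print(driver_name("../../../bus/pci/drivers/i915"))  # i915
```

The same check works from a shell with `readlink /sys/class/drm/card0/device/driver`.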

  • @magoid
    @magoid 1 year ago

    Where are the power numbers?

  • @powerpower-rg7bk
    @powerpower-rg7bk 1 year ago

    The challenge for Intel isn't just being comfortable being second or third in this market but actually holding that position with some margin. The build quality of the A750 and A770 was rather high, but prices on them dropped so steadily that I believe they were often sold at a loss. A company cannot sustain being second/third in the market while losing money for several continuous generations.
    I do think that pricing wouldn't have cratered, nor the A580 been delayed so long, had the drivers launched at their current level of maturity. They are not the embarrassing mess they were at launch. As such, with expected stagnation for GPUs in 2024, if they can launch Battlemage before AMD or Nvidia release new chips (not refreshes of existing lines), they'll be in good shape and position in the market. They just need to execute and be on time.

  • @owomushi_vr
    @owomushi_vr 1 year ago

    the Nov update just made VR possible on the A770 - so happy

  • @donizettilorenzo
    @donizettilorenzo 1 year ago

    Borderlands 3 in D3D11 or D3D12? Mankind Divided? (same question)

  • @profosist
    @profosist 1 year ago

    The issue with those older PCIe Gen 3 systems is that they may not have ReBAR, which is basically required for Arc (and any other new cards)

    • @Heartsbane055
      @Heartsbane055 1 year ago

      Says who? Lots of motherboards with PCIe Gen 3 support ReBAR.

    • @profosist
      @profosist 1 year ago

      @@Heartsbane055 I'd argue the majority don't

    • @Heartsbane055
      @Heartsbane055 1 year ago

      @@profosist Then you don't know anything.

  • @Arejen03
    @Arejen03 5 months ago

    It's an interesting option for someone like me with a B450 mobo (PCIe 3.0 x16) with Resizable BAR. Most of the new entry-level cards are 4.0 x8, so you're gonna lose some performance buying those. Your options are really the used market: 2070 Super, 3060, 3060 Ti, 6750 XT, or higher-tier cards with 16 PCIe lanes. Of course the A580 makes sense depending on the price; around 160-170 euro it's great value, but personally I would wait for Battlemage, and if I needed a card now I'd get a used AMD 6000-series or Nvidia 3000.

  • @tassadarforaiur
    @tassadarforaiur 1 year ago +1

    Intel is intimately familiar with being 4th best. Remember 11th-gen consumer CPUs vs Apple silicon, AMD, and Intel 10th gen?

  • @danx494
    @danx494 1 year ago

    Can somebody please share the solution he mentioned about splitting the GPU into two data-center GPUs?

  • @johndprob
    @johndprob 1 year ago

    You may want to look into ASRock's official stance on warranty before showing their cards off, unfortunately.

  • @Decenium
    @Decenium 1 year ago

    I'm really looking forward to Battlemage

  • @LoganE01
    @LoganE01 1 year ago +1

    It'd be nice to see them release a 30 to 35 watt GPU.

    • @VideogamesAsArt
      @VideogamesAsArt 1 year ago

      Today, Meteor Lake releases with iGPUs that consume around that, maybe even less for some SKUs.

    • @LoganE01
      @LoganE01 1 year ago

      @@VideogamesAsArt well, I meant a single-slot, low-profile dGPU card that consumes that

    • @VideogamesAsArt
      @VideogamesAsArt 1 year ago

      @@LoganE01 I don't think we will see that in many years

    • @LoganE01
      @LoganE01 1 year ago

      @@VideogamesAsArt well, Intel did it for first-gen Arc with the Iris Xe DG1, and AMD has the RX 6300 finished and ready to go, but won't release it to the mainstream public.

  • @QuentinStephens
    @QuentinStephens 1 year ago

    With regards to being x16 cards, how many PCIe3 systems out there support ReBAR?

    • @QuentinStephens
      @QuentinStephens 1 year ago

      @@commanderoof4578 You have it the wrong way around: the GPU is PCIe v4; my question is how many systems that are still PCIe v3 support ReBAR?

  • @anasevi9456
    @anasevi9456 1 year ago +1

    Bethesda has billions of bugged API calls in Starfield, according to a Vulkan dev; that kind of cr@p takes a lot of early insider access (as AMD had), or years of hardening your drivers against it (as Nvidia somewhat has), to mitigate. Hard to really blame Intel, given that.

  • @hayseedfarmboy
    @hayseedfarmboy 1 year ago

    The main problem with older builds and Arc is Resizable BAR support; you're looking below 10th-gen Intel or Zen 2, and anything older is going to be a flop. And the synthetic benchmarks aren't even close to real-world gaming performance. When you look at the hardware in the A770 it's quite a card, but I would assume at best anything below the 16 GB model will be outdated by the time they get their drivers figured out. I tried a Ryzen 1600 build with an A380, which should have been a balanced system as far as performance, but it's just not: an RX 550 is giving a playable experience, and the GTX 1650 that it's supposed to be comparable to outclasses it big time.

  • @sijedevos2376
    @sijedevos2376 1 year ago +1

    The problem is that, paired with a more budget CPU, Arc will perform worse, especially in DX11. Even a Ryzen 5 5600X, for example, can become the limiting factor for an A580 by a decent amount. Horizon Zero Dawn is a great example. I have an A750 and get nowhere near the performance you are showing, or that all the other big reviewers benching with high-end CPUs get. My CPU usage is up at 80-90% while GPU usage is not maxed out in that game. Same for the majority of DX11 games. No big reviewer ever talks about it, unfortunately. Budget cards need to work well in budget systems.

  • @phasechange5053
    @phasechange5053 1 year ago

    How it stacks up against the A750 is where I'm looking.
    God, I want Intel to keep going on the GPU project, because I think not only will it be good for us, it will be good for them when one day they want to try a more integrated high-end platform.
    It will also give them an in on AI compute.

  • @DaemonForce
    @DaemonForce 1 year ago

    I can't wait to see low profile Intel Arc cards that I can drop into my rack, completely cheesing any competition from nVidia and AMD's AWOL stack of workstation GPUs. Cheap AV1 encoder card with a competent amount of PCI-E lanes, vram and modern features means I can look forward to VR/stream server going brrrr.

  • @Decenium
    @Decenium 1 year ago +1

    Intel should drop their prices by 100 dollars so it gets mass adoption, really. If you just match the established cards... you are not going to sell a lot.

  • @terrycook2733
    @terrycook2733 1 year ago

    I hope they run lanes as shorter and longer instead of cramming same lengths... this gives a natural flow of power of how it actually works in physics where it takes the shortest path first... I think some coders can relate to this by the software and its core settings making alot of wierd pileups and errors its because the lanes and flags placed in reasons that are good but takes a lane it can work better in a differnt

    • @Zosu22
      @Zosu22 1 year ago +3

      That… doesn’t make any sense.

    • @terrycook2733
      @terrycook2733 1 year ago

      @@Zosu22 take a pulse of electricty through a circuit board and watch the paths it takes..... if you didnt know this why you commenting like you dont know when you dont know lolol

    • @Zosu22
      @Zosu22 1 year ago +2

      @@terrycook2733 what you’re saying is just incomprehensible. Like “watching the pulse of electricity”…? It’s just nonsense. Nobody knows what you’re trying to say here. What “lanes” are you even referring to in your original comment? PCIE lanes?

    • @bosstowndynamics5488
      @bosstowndynamics5488 1 year ago +1

      ​@@terrycook2733Shorter PCB traces won't do anything to speed up PCIe, and they definitely won't somehow redirect signals through different lanes since each PCIe lane is a more-or-less independent duplex serial channel. The only circumstances where the path lengths are important are either if they're too long and you lose signal integrity (which they aren't on any off the shelf consumer GPU/system combination), or for hyper sensitive applications like RAM, where you specifically want identical path lengths because the minute change in transmission time causes desynchronisation of the parallel channels used for DDR.

    • @terrycook2733
      @terrycook2733 1 year ago

      @@bosstowndynamics5488 wrong lanes there are inter connects to cpu to talk with ram to talk with gpu to talk with via motherboard to via controllers to say the lanes and threads need to co exist not randomly choose

  • @TylerDurden-oy2hm
    @TylerDurden-oy2hm 1 year ago

    Intel's been in the game before, i.e. the i740 (1998). I hope they stick around

  • @axn40
    @axn40 1 year ago

    I hoped you were gonna talk about SRIOV 😂

  • @Pacho18
    @Pacho18 1 year ago

    2023, almost 24. Where's SR-IOV?

  • @csh9853
    @csh9853 months ago

    my A580 will be here tomorrow, but I couldn't care less about making Nvidia GPUs cheaper.

  • @p.d.k.
    @p.d.k. 1 year ago +1

    The performance improvements or lack thereof are only part of Arc's problems. Power consumption can never be fixed across the Arc A770/A750/A580 lineup. The VRAM clock is locked at maximum, which means these cards will never go below ~35-45w at idle, and that's with working ASPM. ASPM does not function on the Acer Bifrost models, and it doesn't function on most motherboards. Without ASPM, you're looking at ~65-70w at idle. My 4090 idles at 18w...
    No one should buy Arc and use it as a daily driver, unless one also needs a space heater. It's fine to fiddle around with, but my Arc systems are unpowered when I'm not testing out the latest driver to see how badly No Man's Sky performs.
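
Since the idle-power figures above hinge on ASPM actually working, it's worth checking which policy the kernel is using. On Linux, `/sys/module/pcie_aspm/parameters/policy` lists the available policies with the active one in brackets; a small parsing sketch (the sample string is illustrative):

```python
import re

# The kernel policy file lists options with the active one bracketed, e.g.
# "default performance [powersave] powersupersave".
def active_aspm_policy(policy_line: str):
    m = re.search(r"\[(\w+)\]", policy_line)
    return m.group(1) if m else None

print(active_aspm_policy("default performance [powersave] powersupersave"))
# powersave
```

Arc's low-idle-power path also needs ASPM L1 enabled in the BIOS and the OS power plan; the kernel policy is only one piece of it.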

  • @godwhomismike
    @godwhomismike 1 year ago +4

    I am really hoping Intel gets better, as I like AMD, but their drivers do the weirdest crap. Boot up, game loads, great frames. Close out the game, leave the computer running, come back in the evening, load the game: 20 fps. Reboot, open the game: excellent high fps.

    • @TuskForce
      @TuskForce 1 year ago

      Yep, I'm gonna switch from AMD to Nvidia. The RX570 has been fine but I've had struggles in VR...
      Not really something I'm gonna put up with any longer

    • @rudysal1429
      @rudysal1429 1 year ago

      @@TuskForce lmao why are you trying to run VR on a video card that came out in 2017 and wasn't even considered a flagship card. Lmao

    • @TuskForce
      @TuskForce 1 year ago

      @@rudysal1429 well, the card still has official driver support, so I expect the bugs to be fixed after a while but it seems like some issues just don't get any love
      Also, there are other things than gaming in life :)

  • @FELiPES101
    @FELiPES101 1 year ago

    I can see intel being right there in a generation or two

  • @vladimirnovakovic3495
    @vladimirnovakovic3495 1 year ago

    Where I live the A580 is a little more expensive than RX 6600 (around the equivalent of 231 US $). Not a good start. Importers and/or retailers are actually sabotaging Intel GPUs by slapping high margins on them.

    • @deathwarrior9294
      @deathwarrior9294 1 year ago

      Well, the RX 6600 is 50 USD more than the A580 in my country

  • @mitch7918
    @mitch7918 1 year ago

    Did Intel ever fix the idle power consumption issue? I remember the A750/A770 were drawing some craaazy amount of power when idle which made them poor for server use

    • @EbonySaints
      @EbonySaints 1 year ago +1

      There's a workaround on /r/IntelArc that involves using the motherboard display out and some monitor/BIOS tweaking. But for normal people purposes, no, it has not been fixed.

  • @schtive81
    @schtive81 1 year ago +1

    The challenger A580 seems like its naming was inspired by the older Radeon RX 580.
    Looking at hardware survey sites (Steam is a good one) shows that the 3060, as well as the 1650/1660 Supers, are still among the top GPUs in use. The average user has 8 GB of VRAM in their rig.
    Below that is Intel integrated graphics, which is still widely used in laptops, obviously.
    I think Intel should focus more on 'that' market above their integrated graphics lines, and maybe take a different angle from the hardcore gamer: appeal to those looking to upgrade their GPUs to something with more contemporary hardware features compared to some of those older 3060s and 1650s still in use. Like this A580? And driver support is still being updated.
    I also think they should focus on productivity cards with current features at an affordable price.

  • @Xero_Wolf
    @Xero_Wolf ปีที่แล้ว

    Please disregard any information regarding Arc GPUs from Moore's Law is Dead. He's been singing songs of their death since before launch and has been completely off the mark ever since.

  • @SatsJava
    @SatsJava ปีที่แล้ว

    Good chip
    Needs better drivers

  • @capsulate8642
    @capsulate8642 ปีที่แล้ว +2

    From what I understand, the Anti-Lag issue was Valve consciously deciding to outlaw it. Realistically, if Nvidia had implemented it, it wouldn't have been banned and there would have been no controversy, just like how there wasn't with Nvidia's existing low-latency software.

    • @sulac4ever170
      @sulac4ever170 ปีที่แล้ว

      And why in the name of Gaben would Valve do that?!

  • @lightward9487
    @lightward9487 ปีที่แล้ว

    Battlemage compete 100%

  • @azjeep26
    @azjeep26 ปีที่แล้ว

    microsoft cpu / gpu that would crush them all

  • @SuperMari026
    @SuperMari026 ปีที่แล้ว

    I need those Wendell emotes as sticker on my phone tbh

  • @Bob-of-Zoid
    @Bob-of-Zoid ปีที่แล้ว

    I would love to see Intel stick with it. Hearing that they dumped it just because they can't rake it in big right off the bat would be devastating, and I would be less willing to buy anything with their name on it, fearing they can't be trusted to keep a good thing going and getting better, as they have already shown with Optane and then some. I have used AMD processors since way back in the 486 era, because even when they weren't as good as Intel, they fought like hell and stayed on track, and now look at Threadripper! I feel good knowing I helped them through it all buying K6's, Athlons and whatnot, and daaaaaaaaaaaaaang, my Ryzen 7 7500X rocks, and I can still upgrade to a Ryzen 9 5950X (16 cores) on my mobo. Even better: I run a lean and mean Linux system (F*ck Windows!). Linux and Linux-native apps know what to do with as many cores as you can throw at them.

  • @Marti77e
    @Marti77e ปีที่แล้ว +1

    Arc A580? Buy the A750 instead, it's a much better option.

  • @Lynnfield3440
    @Lynnfield3440 ปีที่แล้ว

    The thing I like least about this is that it's ASRock. Sadly, they don't even offer the minimum warranty required by law; they have so little confidence in their products that they don't offer any. And I 100% get it, given how bad ASRock products are.

  • @Kurukx
    @Kurukx ปีที่แล้ว +2

    I don't think Steam has the balls to ban an NVIDIA feature that rolled out to, say, the GeForce RTX 3060, which had a 6.27% share in September. It was a terrible flex on AMD over a tiny percentage of gamers and a single game. As Ryan might say, it was a fart in the forest.

  • @zapman2100
    @zapman2100 ปีที่แล้ว

    I'm still astounded that anybody would risk buying junk from ASRock.

  • @dgo4490
    @dgo4490 ปีที่แล้ว

    LOL, by now Intel should know all about being in 2nd and 3rd place. It's been years since AMD beat them, and then Apple flushed them down the toilet for an in-house design.

  • @LA-MJ
    @LA-MJ ปีที่แล้ว

    Ready Player 3

  • @LeonRedfields
    @LeonRedfields ปีที่แล้ว

    Every time I see your face I can't help but think about that guy you used to do videos with, the one with the big chin and goth gf. What happened to him? I can't even remember his name.

  • @Benny-tb3ci
    @Benny-tb3ci ปีที่แล้ว

    I'm honestly hoping both AMD and Intel will eventually get around to offering exclusive features that only all-AMD and all-Intel systems can leverage, and then Nvidia is the odd one out. And here's to hoping Intel makes it in the GPU market, because lately it feels like it's a duopoly: Nvidia chooses its price points and then AMD takes the rest.

  • @andljoy
    @andljoy ปีที่แล้ว +1

    Intel knows how to be second or third, they have been behind AMD and ARM in servers for years now :P

  • @KRAVER_
    @KRAVER_ ปีที่แล้ว

    The only Intel GPU worth the money is the A770 16GB, and even that isn't that great; my 6700 XT 12GB Red Devil beats it easily.
    The 6700 XT is really cheap used now, under $300. I definitely would not buy any lower GPUs, since AMD's new APUs are on par with the 3060 and coming out really soon.

  • @P4NCH1
    @P4NCH1 ปีที่แล้ว

    Still, I can't stand that dang ugly design that ASRock put on that card :S

  • @retrosean199
    @retrosean199 ปีที่แล้ว +1

    Intel needs a GPU that is legitimately faster than a RTX 4070 and way cheaper. Hopefully that becomes a reality.

    • @harrybryan9633
      @harrybryan9633 ปีที่แล้ว

      That is the design goal for Battlemage (4070 in gaming/4080 in compute).
      Alchemist was shooting for 3070 performance. It is getting steadily closer.

  • @rajuhirani9353
    @rajuhirani9353 ปีที่แล้ว

    #### Summary
    👉 The video discusses the features and performance of the Intel Arc A580 and A770 GPUs.
    👉 The A580 has 16 PCIe lanes, a dual fan, and a solid metal backplate, making it appealing for its physical form factor.
    👉 The A580 performs well in synthetic benchmarks but falls behind the RX 7600 in real-life gaming performance.
    👉 The A770, on the other hand, shows significant performance improvements and competes well with the RX 7600.
    👉 Intel needs to stay in the GPU market and continue to improve its Arc GPUs, especially in the lower mid-market segment.
    👉 The video expresses hope for Intel's future in the GPU market but highlights the need for better driver management and communication.
    👉 The overall goal is a three-player ecosystem with well-distributed market share for the benefit of gamers.
    #### Highlights
    - 💡 The Intel Arc A580 offers a compact form factor, promising performance, and 16 PCIe lanes.
    - 💻 The A580's real-life gaming performance lags behind the RX 7600, while the A770 shows significant improvements.
    - 💰 Intel's presence in the GPU market helps drive down prices and offers competition to AMD and Nvidia.
    - 🚀 Intel needs to stay committed, improve driver management, and embrace the gaming market's demands.
    - 🎮 A three-player ecosystem with well-distributed market share is the best outcome for gamers.