And for more details on the software side of things, watch our recent Intel Arc Driver revisit! Some good, some bad for Arc: th-cam.com/video/aXU9wee0tec/w-d-xo.html Grab one of our PC Building Modmats for your new PC build! It's a HUGE, rugged work surface with useful diagrams: store.gamersnexus.net/products/large-modmat-gn15-anniversary Or get one of our GN15 metal emblem glasses! store.gamersnexus.net/products/gn-3d-emblem-glasses Article version is now live! gamersnexus.net/gpus/intel-arc-2024-revisit-benchmarks-a750-a770-a580-a380-updated-gpu-tests
I would be nice to have some productivity tests on GPUs too. Someone who doesnt have tons of money like me, how bought a 3060 12gb to start at 3D render for example, would like to see how a a770 16gb performs in blender, daz and v-ray.
im not sure how your gta-v results are so far off from mine, im on an old 7980xe@4.6ghz, and my gta-v results with the game cranked up...mins are way higher then your max.. very weird since your chip in this test SHOULD be better for gaming then this old beast... this result genuinely has me doing the o.O because... no joke online or sp... the games perfs WAY better then this... and i would really love to figure out why... (bet intel would as well...LOL..) i have submitted bench runs to the intel discord when people have fps issues in hopes it would help figure out wtf is going on... the MSAA in gta does have some bugs though... turn it up and down and off and retest and sometimes at random it will run alot better then others with it on, sometimes huge perf hit sometimes not, this even effects nv and amd hardware from my exp.. just not to this degree..holy shit..
just a suggestion, tossing Quake 2 RTX to the list isnt a horrible idea, it works amazingly on the a770 last i tested i did have to enable a toggle to get the perf up but, that may be a default now its been a while, either way, it also looks amazing for a game from my youth that shipped with both software and accelerated rendering support... better then some modern games... (also, we will be seeing more projects like this as nv have released that kit/project to replace the renderer in old games with one capable of modern features.. ) when we tested it was shocking how well it ran, better then the 3060ti despite it being older drivers from around this time last year now that we went and looked at when we did the tests... another suggestion if you see this.. a video showing arc on a range of cpu's and games to figure out where the sweet spot is.. again i get better results at 1440p and 4k in some of these titles then your benches are showing, and am on a much older cpu.. 7980xe, that i am regularly reminded "isnt a gaming cpu"...in the intel chat... by ppl who think a 5700g would have been better buy for around the same price..heh..) i do know its alot of work, but it could just be the 770 16g cards for the first run and if it looks interesting enough run the other cards, im betting the cut off for cpu will be quite a range depending on the games you play, then you will hit a point where you have enough cores/threads to deal with the cpu doing the "scorebording" for the gpu and still being in the process of heavy optimization, from our testing though a 7900x@5ghz with an acer a770 gets noicably better results on some titles vs his 12700h+a770m Nuc, where a few titles do run better on the nuc, its marginal most of of the time... and a few titles really hate the mixed-core chips and need you to force them to only use p cores.. using process lasso or similar.. anyway good video, i would say these are already aging better then nvidia products they compete against.. not 100% of the time but.. getting closer and closer.... and amd's 7600xt has the same issue as the 4060ti 16gb.. buss limitations, even at 1440p it starts to show.. and thats honestly arcs "Sweet spot" most of the time... sometimes its still faster at 1440p then 1080p... thats down to both drivers and hardware design, i think they might do well to hurry and get a DSR/VSR style toggle in drivers so people on 1080p screens could run at 1440p...thats the ONE feature i genuinely f-ing miss from amd and nvidia drivers... nvidias handling is better but... i just want a downscaling toggle... having to use crap like cru to try and do this shite is just.. frustrating and irritating.. heh..
I noticed that the card has really good specs on paper like even better than the Nvidia on paper how is it now? gaming / video encoding / decoding / AI funtime stuff?
@@blackIce504 You need a $1000 card to beat it in productivity. Also the fame time is lot better so comparing FPS to FPS is not that relevant. Half the FPS and it can beat 4070ti as more playable without stutter in higher resolutions over 1440p. A770 I would say is the entry level for 4k. Most of the times it can in mid and high settings and RT on too. XeSS XMX is as good as DLSS. Don't confuse it with XeSS DP4A that is directed to AMD/Nvidia. Yes on paper it is 6800/4070ti with less Wattage. But it does overclock pretty well. For example 2786MHz at 1.16V at 271W. Base is 2100MHz (2400MHz) 225W.
But it's not your daily driver GPU. This card is not for gamers. This card is meant for web browsing MS Solitaire. So let's stop this nonsense. I will never put this card into someone's new build gaming pc.
I bought an arc770 from an impatient friend, 250€, as a collectable object as the first "serious" Intel GPU, and it has become in my daily GPU for Counter Strike and PUGB, that are the games I play the most. Veeeeery happy with it!
nice, I bought it when it came down to €300 for the first time, also incredibly happy with it. Hope they'll get closer to Nvidias performance with Celestial
This reminds me of the olden days of 1998-1999 when S3 was trying to get somewhere between ATI, 3dfx, Matrox and NVIDIA. It was the old 2D powerhouse that had become the underdog of the early 3D era. Their price-cut Savage 4 had a place in my heart for having given my pocket-money-wielding teenage self access to some 3D PC gaming despite its quirks. They did have a clear edge in texture quality for games supporting S3TC, such as Unreal Tournament. S3TC was later renamed to DXTC as Microsoft made it standard in DirectX. I was really rooting for S3 to break through with the Savage 2000 back then, the only other graphics card to feature hardware T&L like the first GeForce, but it was a flop due to very poor drivers. They got bought by VIA who ended up only using their technology for chipset integrated graphics. Anyway, ramblings of an old guy aside, I hope that Intel finds its place and climbs the ladder to challenge NVIDIA. We need some serious competition, not only on the graphics side, but also the AI side.
Damn, back when there was competition on the market, i remember the S3 it was a decent card, hope that Intel will become stronger, the gpu market since 2020 is horrible.
same, we were super lucky with our ARC A770... as it ran our usual cycle of games just fine from the first. control panel UI sucked, but because we used it just for AAA gaming and super popular indie games like Phasmophobia and it's ilk... it took us a year and a half to find a game it really hated. The Morrowind Engine overhaul OpenMW, and ARC HATES old OpenGL.... You can fix it even on windows by injecting Linux Mesa drivers into the install.. however the multiplayer branch we were playing is on an older openMW version incompatible with that injector. Since youtube is being stupid and someone asked; google:( OpenMW windows Arc better performance Reddit ) if you want a how-to for that injector for single play. It does work.
If the results here are indicative, it would actually be more like 86% of the time they work every time. Starfield is probably a lost cause, but I'd like to see them put some effort into GTA 5. That one has been terrible for a long time and is actually a popular game.
my issue is then why is the prices not dropped by 15-30% to accomodate this and why do people justify the prices currently if this is the case like wtf.@@ryanspencer6778
Under load, that will take time. The architecture is way behind in terms of efficiency. But idle power efficiency should be solved for next generation, and I'd imagine that overall efficiency should see a nice improvement too.
I was not expecting to find out Intel's cards are now matching or outperforming my current card (RX 6600 xt) in a lot of places when I woke up today. Really interested to see how this plays out in the future.
happy to see competition. screw how expensive Nvidia has become, used to be a huge fan. Had 3 generations of Nvidia after AMD cards. Now, I'm just going back to mid level.
I would rather take a chance with an open-boxed or used RX 6700xt with a 15-30-day warranty than deal with A770 driver and game issues. It runs circles around the A770 in everything but Cyberpunk.
In terms of hardware the A580,A750 and especially the 16GB variant A770 (just the fact they offer a 256bit bus for these prices is insane) absolutely demolish anything AMD and especially Nvidia offer at similar prices. It's the drivers that are problematic.
I’m not much of a gamer. I got it for content creation. Pretty early adopter have it for a year plus now. Honestly for content creation, it hasn’t let me down. It’s about as stable (now) as Nvidia and AMD. I’ve had AMD in the past for a decade. Then had 1080ti. I simply love my Arc A770. I think it’s truly a content creation card masquerading as a gaming card. That’s how people should see it. More of that instead of a gaming card masquerading as a content creation card, which I know is how most people who buy it intend to use it. For my use case tho, it serves me wonderfully.
@@_S_P_A_C_E_M_A_N_ I’m strictly video editing now. No longer much a 3D fx or modeling guy. But I know there are other TH-camrs that have gone over it for 3D. Might want to search them.
I put a Intel A750 into my daughters computer. She plays some casual games (Minecraft, Wizard 101, etc), but she also plays Overwatch 2, and Killing Floor 2. The card pretty much worked for all the games she played right out of the box. She got World War Z when it was free at the Epic Store when she still had my old 980 GTX, but it only recently (maybe like 3 driver releases back) became playable. Overall I've been pretty impressed with how far their driver has come. I probably wouldn't have gotten it for her if she had to manage the updates / troubleshooting /etc, but she has me. When a new driver comes out, I make a special trip over to her place to update to and test the new driver. I'm hopeful that Intel will try to fight in the mid to high range of the market with Battle Mage. I just hope Intel's board of directors and Pat Gelsinger have the fortitude to see this business segment through the rough start, and make it to where they are competing. Surely their executive didn't think that they would hit a home run on their first product release???
Not necessarily that their executives didn't think it would hit a home run at first release, but that the sales were probably lower than expected. They run predictions for this and also have to accept the fact that both NVIDIA and AMD will sabotage them however they can. Otherwise the move to ARC branding for even their integrated graphics cards means they won't give up or axe it just suddenly but it will be a very slow and difficult market to break.
@@Aki-ow9hd If their sales were lower than someone calculated, either the engineering staff wasn't communicating well with the bean counters, or the bean counters were dismissing what the engineering staff was telling them. The engineers knew the release was going to be a cluster frack, that's why it was delayed as long as it was. Once reviewers (particularly like Gamer's Nexus and Hardware Unboxed) got the cards/drivers, Intel's engineers working on the drivers and supporting application (like Arc Control) knew it was going to be a media bloodbath. This kind of publicity kills sales. So either there was a communication problem at Intel, or the communication was being ignored if Intel actually had higher sales figure predicted. My point was that I hoped that Intel's executive had the fortitude to stay in it for the long run. They have a history of killing off / selling off divisions that don't make money, sometimes too soon. Especially right now when they are most vulnerable. No time in history has Intel been in this much trouble (not even in the Pentium 4 era) with being behind their competitors. I say Intel executive, but really it comes down to the executive's abilities to keep the shareholders at bay. If shareholders start calling for the graphics division to be culled, the executive has to provide some assurances that they can turn the division around in the long term. Thankfully Arc is many orders of magnitude better than when it released. If they can stay on the same trajectory leading into Battlemage and beyond, they might have a break even turning to profit moment in the not too distant future. I can only imagine a future when Intel hadn't abandoned the graphics market with the i740, and kept in it competing with ATI, 3DFx, nVidia, etc. They had the money to outcompete those companies back then. Now they are up against the giant nVidia and to some extent their CPU nemesis AMD in the graphics arena. I think to succeed (Gamer's Nexus has alluded to it a few times), they need only to beat AMD to stay relevant. AMD's graphics division should be the most worried by Intel's presence in the market.
@@dead-claudia their laptops are a shit show. Unupgradable, locked down parts, and shit thermals(in some instances), cheap components with a 200% markup
My A380 is a wonder card for my Plex server. It's a Dell Poweredge T330 that doesn't support iGPU CPUs or discrete GPUs officially. The A380 works with a power cable adapter and it can handle a ton of concurrent transcodes💪
If you can't use the iGPU or are running an AMD system it's indeed great. However, from a power efficiency standpoint it's much better to go with an Intel CPU with and iGPU. They'll do all that transcoding just as well, without having to power such a large GPU.
I have a friend who gave his Nvidia GPU to his wife and decided to roll the dice on an Intel a770. I've yet to hear any real complaints from him even playing somewhat esoteric stuff like mods and fan games
@@doorsbh actually i have the ARC770 16gb and for most things it works well .. ive got it paired with the 14600kf and the system in general is great .. my main is a 7800x3d 7900xtx but for a budget GPU that intel clearly wants to compete in the market with constant driver updates and support its a huge step in the right direction .. They dont need to be 1st of even 2nd yet they just need to gain consumer confidence and a following the position in the market will change as they become more trusted .. People forget that the majority of GPU's sold are in the budget sector not the 7900xtx's 4090s and higher end !!
I did something similar. The A770 was pretty much a direct competitor minus some software issues (a lot of which have been addressed). The AV1 CODEC and extra VRAM alone are worth it for dabbling in multimedia. Eventually, swapped back to 1080 Tis in my Gaming Rig, because I _do_ still play games that support SLI, but I'm all-in on BattleMage.
I feel like these GPUs are a good lesson for businesses about honesty and providing a valuable product. The GPUs at the start were very inconsistent, but Intel owned up to it and were pretty transparent about what was going on. Now way later, instead of being remembered for their horrible launch, they will mostly remembered for the massive improvements and competitive pricing (of course no one is going to completely forget it, but in 2 years, what matters most is where it ended).
I picked up an A770 on sale about a year ago and I've loved it. It hasn't been without issues, but it's gotten so much better and I'm glad to see that the performance seems to be continuing to improve.
One thing that would help a lot on the graphs readability is if you align every part of a GPU name on the Y axis on a ordered fashion, adding white spaces to align every part of the name, something like - so it would look something like that (Disregard youtube indentation needing multiple whitespaces) EVGA - RTX 3060 12GB Black (1/24) NVIDIA - RTX 4060 Ti 8GB FE (1/24) Gigabyte - RTX 2070 8GB OC (9/23) That way, it is always ordered and you always know where you'll find the Model or VRam size without having to search multiple names that have different sizes and different places where the model can be This is Kinda what already happens with the last VRAM size, Stock / OC and date, but it gets messy when looking for the model and brand of GPU
Yeah, the names have different lengths, that would indeed improve readability. No idea which software they use for the graphs, but it should be possible even in Excel.
It would also be pretty awesome if there were FPS/Price charts. Dividing FPS by Price will show relative value, and it will be awesome to see the best value cards. Obviously higher-end cards will perform poorly on this chart, but for someone trying to pick the best budget cards this would be pretty helpful. I'm certain Intel would have peaks next to similar performing cards in a few of these tests.
I bought the A380 for general computing (web browsing (some websites are pretty demanding on the GPU), video conferencing, some video editing, video playback, etc.) I was initially skeptical if it could handle video conferences, but it's been doing great, especially considering it's only 120 $. No issues with stability for me at all.
If you ever have a chance for an interview, it would be fun to hear more about this drivers optimization process. In a sense, not only it looks like they are reordering hardware calls and optimizing pipelines, but each driver has a specific module for each “supported” game, whereas the unsupported ones fall into the generic bucket. But knowing more about those details would be super. Thanks for all your work!
@@hirakchatterjee5240Nope. If you refer to the nonsense that MLID talked about, he was referring to laptop GPUs and even that I wouldnt trust lol. GUy has been wrong so many times.
@@dominikheinz2297So many people take that guys “information” as gospel. I used to enjoy watching his videos because unlike other shitty channels that fart out videos every 6 hours his presentation seemed more….. believable? Also the smug arrogance when he finally got something right was a nuisance to say the least.
Thanks for the re-visit! The most surprising part of this review to me is just how poorly the nvidia 20 series cards have aged!!!! I’m so glad to see intel making so much progress, and I think I’m finally close to being able to recommend it to savvy enough friends on a budget! I hope they can get closer to the “mid range” of say 4070 or theoretical 5070 on their next launch and I will also switch from my insanely poor value 3069ti that I bought at the height of the GPU drought for $479 :/ The return tracing vs similarly priced cards is especially impressive to me.
I agree with you, I have a RTX 2060 and I am searching right now all these GPUS, I'm currently between ARC A770 vs RTX 4060. What is your opnion on them?
@@rosavitif you’re buying right now some really good deals on 3070’s which I would recommend over 4060 (Newegg for $399) otherwise I’d say stick with nVida for now arc is super close but not all the way there, unless you’re ok with not 100%
yeah I'm waiting for Battlemage to come out too. Hopefully Intel understands they are the bargain brand and don't just copy AMD and set all their prices barley under Nvidia's.
Intel want to have fat margins everywhere… so if battle mage is good, they douple the prices! ARC is cheap because it was junk! And did not sell because of that.
@@haukionkannellol, that's not how it works. they'll probably offer somewhat better performance compared to equivalent AMD cards for a slightly lower price, and AMD either goes even down regarding market share or needs to lower their prices, too
I think we will be in aw if the rumors are true. Xe 64 and Xe 56. with 16gb/24gb and 256 bus for the B980/B970. Predicted launch H1. On these specs it would be challenging 4070ti/4080. But if it comes with APO that will put 10-50% on top of it it will be a killer GPU. (for reference A770 Xe 32/16gb/256 bus) It will be doubling the Alchemist performance.
Having owned an A770 LE since launch month, I'm impressed and pleased w/ how far it's come. Honestly, it's not very far away from being where I'm fine w/ it being, and w/ all that VRAM and generally good specs overall, I can see this one lasting and aging like fine wine.
@@GamersNexus Hey Steve, can you guys make a video about "GPU Hindsight" ? Basically which was the smartest GPU to buy and upgrade in each generation. With your experience and knowledge, it would be a great watch and summarisation to this market !!
I got the Intel Arc Sparkle A770 and it not only is a beautiful card, it runs (most) games absolutely flawless and is a large improvement from my old graphics card.
All of this improvement just increases my hope that Battlemage will be truly competitive in the low to mid-range and put pricing pressure on the market.
bruh if the rumored prices of 499 and 599 and the 10% less performance than the 4080 are true it would kill 30% of the used market gpu's and any mid range rtx 40 series rx7000 series since it could run any fing game and will make 4k gaming affordable again for just around 1100$ ud get a 980+i7 13700k and it will be more than enough for most gamers
i know its waaaay too good to be true but if one of the is true we'd have either a new 500ish gpu for more competition or another high-end gpu to compite with the 4080/7900xtx @@jesusbarrera6916
I'm really interested in the encoding performance of the a310. I have 3 Plex libraries. A 4k, a 1080p important, and a 1080p unimportant. I want my 4k and my 1080p important in av1, but idc about unimportant since I delete it when I'm done with it.
I think that would be interesting to see as well. Personally I've moved away from encoding my libraries and would prefer transcoding performance. I'd love to see what newer options may exist that would replace my venerable 1050ti 4GB with uncapped stream driver. A cheap low cost option that doesn't require quadro money to deliver 5-10 streams @1080 or @4k.
@@GamersNexus Please do, with some idle power consumption numbers, where you enabled ASPM and whatever else is recommended to get idle consumption down on these ARC GPUs.
@@GamersNexus I'm following the wrong crowd, because I've found little to no discussion of the 3D, AI, or hardware encoding capabilities of the Arc series. It's all games, games, games.
This is not something I've tested myself but something I've heard from someone else... The idea was getting better performance in video handling and compressing. The user bought a A310 ARC card and added it to his gaming machine that was running a i5-13600K and a Nvidia RTX-3070. After installing the A310, not connecting a monitor to it, he disabled the graphics integrated into the i5-13600K. The videosoftware that was previously using the quick sync technology would now use the A310 instead of the graphics processor in the CPU. According to what I was told the addition of the A310 increased the encoding performance drastically. Now this is an interesting idea if it actually works. A AMD user might be able to use Quick Sync when handling video encode and such. Or users of Intel processors that doesn't have the integrated GPU might be able to enhance their machines performance.
I've done this myself. The downside is that the Arc driver didn't cooperate well with the Nvidia driver, and software compatibility with the Arc encoder wasn't great. But that was around a year ago since I last tried, so things could definitely have changed.
@@tomaszszupryczynski5453 The quality has improved some, but compared to good software solutions it is worse. However the important thing for most people is that the encode is fast enough the machine doesn't sag even if you are playing a advanced game while capturing the game play at 60 Hz and high resolution and perhaps a video camera at the same time. To do this in real time takes quite some power and if there is a GPU with encoding support that does a lot to help. Now the only ones I've used myself is Nvidia and AMD cards. Both kind of work, though in my opinion the Nvidia solution is way better than AMD, both in stability, speed, quality at a given bitrate and file size at a given quality requirement. I can't say anything about Intel Quick Sync, but it has a lot of people who claims it's good enough. But yes, non of the hardware encoders I've used compares positively when compared to a slow and well configured software encoder.
AMD wasn't competing in the budget-midrange category because, until Arc, there was no competition besides their own APUs (or used cards) Hopefully Intel can make some more headway on this end of the market, and AMD can make gains against Nvidia in the high-end
I would love to see a separate video, or just an add on to these videos on Intel GPUs to see how DXVK changes performance in DX9, DX10, and DX11 games. Considering its just a copy and paste into the folder with the .exe and can change a game from a unstable, buggy and/or unplayable mess to a playable or improved state. Its always worth trying.
@@GamersNexusI'd also love to see you test Siege with Vulkan instead of DX11. Almost everyone I know prefers using Vulkan, and it would be interesting to see if Arc goes up in the rankings compared to DX11.
@@GamersNexus "Nvidia doesn't care at much" Not true. They care. their prices sometimes changed. they have done software work for linux. Intel Arc prices maybe a little bit cheaper. Dx12 slow in many thing. Dx11 Faster in MANY thing or Vulkan. game devs or Vulkan devs or both need to do few things Vulkan not choosable option in many game, Vulkan it could be one of the best thing for gamers. MAYBE yout information is good but i have read THINGS about Intel Arc not has support for Dx12 and Dx12.2 and few game and things still not really works on linux. DON'T MISUNDERSTAND ME, I NOT hate them. i have few product and what i have it's good. here in this countrie, MANY kind of VERY evil things are MANY years! in ALL peoples animal man TYPES are. many evil in basic things also. and my half brother in Usa not friendley poker player, website domain owner he has few. i can't communicate with him. help needed as fast as possible!
i truly hope intel continue to support the team behind the intel GPU division as they seem to be quite dedicated to making the drivers as good as they can. I can understand that they are trying to catch up to teams at AMD and Nvidea that have over many more years been able to refine their offerings and that in todays tech they are regarded as mature. Intel sadly have so often come to market with tech that is astounding but then a year or 2 later simply to cut costs dump the tech and the support.
I'm running 2 different A770 cards in 2 different builds and not having any issues. It was a gamble getting the LE card 7 days after launch, but it paid off rather well. My parents also didn't realize I had gotten the LE right after launch, and I had only mentioned briefly that I wished the Acer version was easier to obtain (it was "Asia-only" at the time). They got me that Predator Bifrost A770 for X-mas (and at the LE MSRP pricing, too) and it's now my daily driver. I had skipped over RTX, the best card I had before Arc was a GTX 1080Ti, and Arc's consistently stronger than that card, so as far as I can tell it's the perfect price-to-performance upgrade GPU for me.
Thanks the website review, was checking it out at work. I personally dont mind ads on your own website, even ads for your own merch. You give alot to a community of pc hardware enthusiast ❤
I've heard where the arc cards really shine is in encode/decode, like with streaming... it'd be interesting if there was a way to measure that kind of performance, as well as just game performance or raw calculation benchmarks.
Yes, Intel GPUs have been really good at media encoding and decoding for a while, and Arc inherited that from the iGPUs. They also get hardware accelerated AV1 encoding. So in theory, Quicksync AV1 should be the best way to record/stream games using a GPU, even better than NVENC AV1. Unfortunately, almost nobody supports AV1 streaming with Arc. Recording works well though.
If I recall correctly, Intel designed one really good hardware video acceleration block and put it in every ARC card, so theoretically all the ARC cards perform about the same there, as long as they aren't constrained by power and VRAM. Generally, hardware codecs have some sort of fixed throughput cap, so the datasheets might tell you everything. Making up an example: maybe the hardware encode block can process 500M pixels/sec. That's just enough for 4K 60Hz, but trying to encode a 4K 61Hz signal will overwhelm it and lead to some dropped frames.
Very intersting. I've been eyeballing an A770 as a replacement for my aging 1660Super, but have yet to pull the trigger given that BattleMage is due to launch later this year. Still, the A770s pricepoint is looking pretty damn nice when also viewed on the bang you get for the buck.
@@redslate Yeah, that's what I'm thinking as well. That being said, it'll be VERY interesting to see the pricepoint for them. Hopefully Intel doesn't go full-on ham with the prices.
I'd say wait for battlemage as well. The A770 is a nice card for recent games, but priced not quite right for it's performance. In games where it runs well, it performs great, but outside of that a RTX 4060 is cheaper and faster. Or a RX 7600, also running faster in those games and costs noticeably less.
I've been using an A770 for a month now at 1440p and I really hope to see more people buying Arc gpus now! It works so well and I rarely have issues with it. It also runs pretty quiet and temps are always in the 50s-60s (I have the Asrock 3-fan 16gb model so that's not surprising). Most of the few issues I have had are in blender, which makes sense since that favors Nvidia much more, but the gpu still works well enough there for me since I'm a beginner. I've also had small screen flickering issues in Persona 4 Golden, which I think might be related to Arc since I couldn't find anything about it online. I do need to update my drivers though...
Still holding on to my 1060 until Intel's cards are a viable upgrade in the games I play. They aren't quite there yet, but the gains so far are impressive.
Sadly the Acer Bifrost A770 is sold out in the UK now but that would have been my go to version looks wise. I have the Sapphire RX 6700 10GB, great card so it's more of a lateral performance change than an upgrade to the A770. I'm waiting to hear about the upcoming Battlemage cards if they release this year, otherwise might grab an RX 6800 (non XT) in a sale.
Owner of said ARK A 770, It's been a blessing and a curse with driver and bug issues. (Jan 18 drivers, Seems stable atm) I'm still using my Intel GPU for 1080p gaming and hoping better driver revisions. Thanks Steve and crew for the recent review.
I was considering an a770 16GB last year (actually I was almost pulling the trigger on it). The problem was, I am busy for trouble-shooting at this point in my life. I want something that "just works" and the 6700XT was priced lower and gave me more performance. I really want Intel to establish itself as the 3rd major player on the GPU market, but I also don't feel like kickstarting a multi-billion company. In 3-4 years latest, I will be on the hunt again - and if Intel is the best value by then, sign me up for team blue.
Super cool to see you keep following up on this. One request: do you think at some point you'll have the time to take a look into more niche games on Arc? Obviously not with a wide range of comparisons, but, say, with the 7600 and 4060 for comparison, looking at various indie titles, smaller "AA" titles, games using less common engines, and so on? Would be super interesting to see.
I would like that as well. Most of the games are basically never tested in reviews, but still need some level of performance. And not everything scales exactly the same, so those games tested barely say anything about performance in those niche games.
I've been following these Intel Arc videos, it's great to see their improvement. Maybe in 2/3 more patches, they'll be ready for the mainstream market.
I have had my Arc a770 for almost a year now. I play Minecraft mostly but when I play other games I haven’t had any non playable problems. Sometimes things don’t like to work but a lot of times all it takes is a reboot or reinstall of drivers and things are back to normal.
In Japan, the Acer Predator BiFrost Intel Arc A770 OC [PCIExp 16GB] - ¥39,800 MSI GeForce RTX 3060 VENTUS 2X 12G OC [PCIExp 12GB] - ¥39,475 (divide price by 150 for USD) Initially I was going to go with the Intel Arc A770 OC because of the 16GB vram since I use a 4k monitor (for video editing), but may go with the GeForce RTX 3060 12GB. (currently using a GTX 1660 6GB, and some games don't run well in 4K with this GPU)
ASRock ARC A380 can be found open box for like $90. I run a Samsung 4K LU28R55 with a secondary 1080P S22E450 while the 4K TH-cam Vids use about 18 Watts.
I may be interested in Battle Mage, when it comes out. What I'd like to know is how well the Arc cards work with popular emulators (Retroarch, Yuzu, etc.) now.
Truly very thankful for my A380 purchase! Its the best Low profile card and it was only $90. It absolutely crushes! Im using it as my encoding gpu for one of my recording pcs and it also does gaming alright!
I almost made the decision to buy an A770. Until you showed the graphs for GTAV. A game I'd want to be playing. So I'll wait a bit longer to see if the drivers keep improving. My GTX1650 isn't cutting it anymore. Good content.
I use lots of Adobe premier pro and would love to see the performance of the A750 and A770, I haven’t seen much info on these cards performance after they gained support for V24
GTA V being broken on Arc is baffling to me, since that is still one of the most popular PC games on Steam and should probably be treated a priority for Intel driver engineers.
I totally understand that the weaknesses of the card have to be mentioned in the review ( as they should ) however, as a consumer you can just Google if the games you play work well with arc gpus. My Arc 770 has been phenomenal for streaming, content creation and gaming.
I think A750 has potential to be the go-to budget card like RX480/580 used to if compatibility issues get fixed. I heard battlemage launch is most likely going to be pushed to 2025 but I really hope Intel keeps up with patching driver!!
It's almost feeling like NVidia is becoming the Apple of GPUs while AMD, and now hopefully intel, are going to be the companies fighting over actual consumers and bringing about honest innovation.
Nvidia has been trying to break out of the discrete gaming GPU market for years. They saw the writing on the wall years ago. Mainstream Discrete GPU SKUs will meet the same fate as other discrete cards. Integrated solutions become adequate for the masses while discrete solutions cater to hobbyist/professional niches. The whole crypto-craze and COVID bubbles only had delayed this by a couple of years.
@@Krogoth512 No idea what you are talking about with "discrete", but as long as AMD is right on NVidia's doorstep in terms of performance, they aren't special. Hence why I said "the Apple of GPUs". They just want to charge stupid prices for the mindless people who will gladly fork over money to them without thinking. AKA, exactly like Apple.
its actually easy if you do mostly video editing and Photoshop and a bit of 1080/1440p gaming u cant go wrong with the a770 or the 4060 (even thoe the 8gb vram will be a big problem in 1440p compared to the 16gb of the arc a770) if u do mostly gaming go with the 7600xt /6700xt both are really good for the price (for rt go with the rx7600 xt if i remember well it gets around 10/20 fps more with rt on) if u do care about rt then go for the rtx 4060 (same thing for those who use blender) and finally if u do all of this go for the a770 on a discount its not the best in any of this things but its better to have a versatile gpu that performs good enought in most tasks than one who performs very well in gaming but very bad in content creation (rx 7600) or one very good for those tasks but really limited in terms of hardware and a bit over priced(4060)
The A770 is in a very interesting spot at this point. Price wise it should perform similar to a RX 7600, which it often does. The RX 6700 XT on the other hand competes with the RTX 4060 Ti in price, and who wins that duel is pretty clear. To me the real winner of the Arc cards is the A750. Sure, it's slower than the A770, but you get RX 7600 performance at RX 6600 cost. And in some titles it runs past even the RTX 4060. But as GTA V and Starfield show, the cards aren't quite finished yet.
@@HappyBeezerStudios fr bro the a750 is underrated but i prefer the a770 for being faster and more vram (in video editing and 3d 8gb would be a problem specially at 4k) the only issue is the drivers they are still in progress
I am so glad I bought EVGA 2070 XC Ultra back in the day even though I thought that my 970 is perfectly fine. Actually, I am in a similar situation as I have been about 5 years ago, maybe it actually is time to upgrade and be covered for next 5 years :D
Intel needs to release something worth talking about. I keep hearing battle mage, battle mage, battle mage, and still we have nothing that can even take on a 7900XT or a 4070Ti. if they can't take on a 3070 now, what the hell are they going to do against nVidia 5000 series and AMD next cards.
I’ve been having a lot of fun testing Arc. Any chance you can do a cpu scaling test on arc? I have notice that arc has a pretty large cpu driver overhead which causes issues in some game like horizon zero dawn and death stranding. Thanks for the work!
I got an A770 for my Linux system. It’s focus on Vulkan has really complemented Linux’s ecosystem quite well (with Zink and DXVK already competitively covering OpenGL and DirectX for most games), and with the upcoming Xe driver it should become even better. It also is nice due to its comparatively low number of binary blobs for the GPU space.
@@GamersNexus The Arc 770 is in the same price range as the 16GB RX7600. Currently even 23 bucks cheaper at 336€ (Both Asrock models) located in Germany
Yeah pricing is rough new, especially the A770 for €380. A750 is around €250 while only being like 5% slower. Used however you can get A770 for pretty cheap, got mine for around €250 used.
@@GamersNexusRx 7600 is around €280, RTX 4060 is around €310, Arc A750 is around €250, A770 8GB is around €300 and A770 16GB is around €380. At the price of a A770 16GB you might as well get a RX 6700 XT.
I cannot believe Arc 770 performs near to a RTX 4060, seems like this NVIDIA gen is the same as the 3XXX, in the other way that's a massive improvement from Intel.
Titan Volta? / 1080 Ti / 2060S-2070S /3060 12 GB / 4060 / Radeon 7 / 5700 XT / 6600 XT / 6650 XT / 6700 10 GB / 7600 / 7600 XT / A750 / A770 There have never been more GPUs at the more or less same performance class, in the history of ever
honestly, competing at the 3XXX level is good enough. that's basically where the budget buyers are gonna be looking. they aren't quite there, but it's also the first generation.
Man I always thought my Arc A770 GTAV issues were user-error. Glad you mentioned that here. I'm sure Intel devs will add that to their fixit list after watching!
Well I just ordered parts to build my 3rd PC and I bought the Sparkle Intel Arc A770 Titan OC Edition • 16GB. I think it will do what I want. I'm not as of a big gamer anymore due to my eyesight. But the few I do play this is more than capable.
Enjoying my A380 that i had since Nov 2022. Initially it would dip to ~20fps in Destiny 2. Now it is consistently at 60 fps with smart vsync. It was broken in the last 6 months of drivers, but works really well on the latest one. 5331 or whatever it is
I have a 7900XTX main GPU, but it has issues with streaming video playback while gaming. (I keep YT or other video streaming on a secondary monitor going while I game.) So, I added a GTX1650 as a secondary GPU specifically to run Firefox, Media Player, and PowerDVD on my secondary monitor. (Originally, used a 750Ti, but that had quality issues, then tried a 1030, but that had performance issues.) It worked great for a long time, until recently. I'm not sure if the issues are coming from the Nvidia or AMD drivers, but just the last month or so, my games have begun having major bad performance for brief times, mostly when first starting games, but there are also random moments that seemingly come out of nowhere. I'm thinking of switching my secondary card to see if the issues go away. Would an Intel card, like an A380 or A580, be a good candidate for this duty?
@@dangingerich2559 I feel like mixing cards from different vendors like that is bound to create problems sooner or later. Are you sure the original playback issues aren't caused by something else entirely? I remember having similar problems with playback while gaming and all it took to fix was turning off hardware accel in the browser. Still if you really want two cards in your system sticking to the same vendor seems like a safer bet than mixing drivers.
I liked the A750, but I had to send it back. HDMI 2.1 support is broken. I have an LG OLED for my main display with no option to use display port. But, if using DP it's a fantastic card for the money.
@@GamersNexus Also try toggling HDR while in 2160p@120hz via HDMI 2.1 When you switch back from HDR, the colors are whacked out. Over saturated. Reboot required. Edit: Adding all info just in case Frequently would lose sync over HDMI 2.1. End up with a black screen. It always recognized my 2nd monitor via DP. If toggling video modes, might lose sync. If toggling HDR could lose sync. I also use ARC audio (not to be confused with the A750) and it would lose sync with audio return channel if using all bells and whistles (4k, 120hz, HDR) Sometimes, I would have a vertical line of artifacts down the OLED. Reboot could fix it, (if it sync'd). That was caused by upgrading the driver that also contained a flash. The early January update. I think that is everything. Via DP, everything works. Except for being able to use ARC audio.
@@sgredsch I could too. But it had lots of other problems. Do you use ARC for audio? Do you switch between HDR? Mine was also an Intel A750, maybe some difference. I 100% for sure had issues though.
I recently bought an A380 ELF for $120 and am extremely happy with it. I have a limited budget for building a gaming PC and the A380 gave me an excellent bang for my buck.
And for more details on the software side of things, watch our recent Intel Arc Driver revisit! Some good, some bad for Arc: th-cam.com/video/aXU9wee0tec/w-d-xo.html
Grab one of our PC Building Modmats for your new PC build! It's a HUGE, rugged work surface with useful diagrams: store.gamersnexus.net/products/large-modmat-gn15-anniversary
Or get one of our GN15 metal emblem glasses! store.gamersnexus.net/products/gn-3d-emblem-glasses
Article version is now live! gamersnexus.net/gpus/intel-arc-2024-revisit-benchmarks-a750-a770-a580-a380-updated-gpu-tests
I would be nice to have some productivity tests on GPUs too. Someone who doesnt have tons of money like me, how bought a 3060 12gb to start at 3D render for example, would like to see how a a770 16gb performs in blender, daz and v-ray.
These improvements make me very excited for battlemage
With the new pico connect on the pico 4 it works 100% in VR on the a770
im not sure how your gta-v results are so far off from mine, im on an old 7980xe@4.6ghz, and my gta-v results with the game cranked up...mins are way higher then your max.. very weird since your chip in this test SHOULD be better for gaming then this old beast... this result genuinely has me doing the o.O
because... no joke online or sp... the games perfs WAY better then this... and i would really love to figure out why... (bet intel would as well...LOL..)
i have submitted bench runs to the intel discord when people have fps issues in hopes it would help figure out wtf is going on...
the MSAA in gta does have some bugs though... turn it up and down and off and retest and sometimes at random it will run alot better then others with it on, sometimes huge perf hit sometimes not, this even effects nv and amd hardware from my exp.. just not to this degree..holy shit..
just a suggestion, tossing Quake 2 RTX to the list isnt a horrible idea, it works amazingly on the a770 last i tested i did have to enable a toggle to get the perf up but, that may be a default now its been a while, either way, it also looks amazing for a game from my youth that shipped with both software and accelerated rendering support... better then some modern games... (also, we will be seeing more projects like this as nv have released that kit/project to replace the renderer in old games with one capable of modern features.. )
when we tested it was shocking how well it ran, better then the 3060ti despite it being older drivers from around this time last year now that we went and looked at when we did the tests...
another suggestion if you see this.. a video showing arc on a range of cpu's and games to figure out where the sweet spot is..
again i get better results at 1440p and 4k in some of these titles then your benches are showing, and am on a much older cpu.. 7980xe, that i am regularly reminded "isnt a gaming cpu"...in the intel chat... by ppl who think a 5700g would have been better buy for around the same price..heh..)
i do know its alot of work, but it could just be the 770 16g cards for the first run and if it looks interesting enough run the other cards, im betting the cut off for cpu will be quite a range depending on the games you play, then you will hit a point where you have enough cores/threads to deal with the cpu doing the "scorebording" for the gpu and still being in the process of heavy optimization, from our testing though a 7900x@5ghz with an acer a770 gets noicably better results on some titles vs his 12700h+a770m Nuc, where a few titles do run better on the nuc, its marginal most of of the time... and a few titles really hate the mixed-core chips and need you to force them to only use p cores.. using process lasso or similar..
anyway good video, i would say these are already aging better then nvidia products they compete against.. not 100% of the time but.. getting closer and closer.... and amd's 7600xt has the same issue as the 4060ti 16gb.. buss limitations, even at 1440p it starts to show.. and thats honestly arcs "Sweet spot" most of the time... sometimes its still faster at 1440p then 1080p... thats down to both drivers and hardware design, i think they might do well to hurry and get a DSR/VSR style toggle in drivers so people on 1080p screens could run at 1440p...thats the ONE feature i genuinely f-ing miss from amd and nvidia drivers... nvidias handling is better but... i just want a downscaling toggle... having to use crap like cru to try and do this shite is just.. frustrating and irritating.. heh..
Having a A770 16GB since launch, and wow they updated drivers and firmware like crazy. Never regret the bought.
I noticed that the card has really good specs on paper like even better than the Nvidia on paper how is it now? gaming / video encoding / decoding / AI funtime stuff?
@@blackIce504 You need a $1000 card to beat it in productivity. Also the fame time is lot better so comparing FPS to FPS is not that relevant. Half the FPS and it can beat 4070ti as more playable without stutter in higher resolutions over 1440p. A770 I would say is the entry level for 4k. Most of the times it can in mid and high settings and RT on too. XeSS XMX is as good as DLSS. Don't confuse it with XeSS DP4A that is directed to AMD/Nvidia. Yes on paper it is 6800/4070ti with less Wattage. But it does overclock pretty well. For example 2786MHz at 1.16V at 271W.
Base is 2100MHz (2400MHz) 225W.
looking to get this for my wife’s PC. looks awesome for the price, and the games she plays seem to do well on this card also
@@paulboyce8537 I'll have what you're having
But it's not your daily driver GPU. This card is not for gamers. This card is meant for web browsing MS Solitaire. So let's stop this nonsense. I will never put this card into someone's new build gaming pc.
I bought an arc770 from an impatient friend, 250€, as a collectable object as the first "serious" Intel GPU, and it has become in my daily GPU for Counter Strike and PUGB, that are the games I play the most. Veeeeery happy with it!
PUGB , best type of pug
nice, I bought it when it came down to €300 for the first time, also incredibly happy with it. Hope they'll get closer to Nvidias performance with Celestial
Intels first GPU is the i740 from 1998.
Hey, what fps do you get in cs2? Wondering if I should upgrade from rtx 3060.
Pubg honestly underrated
A true redemption ARC!
Hopefully that pun doesn't arc to other commenters.
@@GamersNexus it's making my brain ARC out.
Take my Like, you commenter-on-the-internet you.
Let's stay grounded when talking about Intel's GPUs.
Comment of the month
8:05 is the best part of the video. The cute little squeak is a second or 2 after and it's really quiet so it's easy to miss. Thank you.
Nice catch.
And right before the mat and tshirt plug! ;D
This reminds me of the olden days of 1998-1999 when S3 was trying to get somewhere between ATI, 3dfx, Matrox and NVIDIA. It was the old 2D powerhouse that had become the underdog of the early 3D era. Their price-cut Savage 4 had a place in my heart for having given my pocket-money-wielding teenage self access to some 3D PC gaming despite its quirks. They did have a clear edge in texture quality for games supporting S3TC, such as Unreal Tournament. S3TC was later renamed to DXTC as Microsoft made it standard in DirectX. I was really rooting for S3 to break through with the Savage 2000 back then, the only other graphics card to feature hardware T&L like the first GeForce, but it was a flop due to very poor drivers. They got bought by VIA who ended up only using their technology for chipset integrated graphics.
Anyway, ramblings of an old guy aside, I hope that Intel finds its place and climbs the ladder to challenge NVIDIA. We need some serious competition, not only on the graphics side, but also the AI side.
I had Savage 3D back then. Great price/perfomance
Pour one out for 3DFX. 😢
I remember those days, hanging out on alt dot games dot mechwarrior2, watching threads of people debating ViRGE vs Verité vs Voodoo.
Damn, back when there was competition on the market, i remember the S3 it was a decent card, hope that Intel will become stronger, the gpu market since 2020 is horrible.
Oh man, VIA!
60% of the time it works every time. :). I'm hopeful for the next couple of generations of Intel cards.
hahaha, great quote for these.
same, we were super lucky with our ARC A770... as it ran our usual cycle of games just fine from the first. control panel UI sucked, but because we used it just for AAA gaming and super popular indie games like Phasmophobia and it's ilk... it took us a year and a half to find a game it really hated. The Morrowind Engine overhaul OpenMW, and ARC HATES old OpenGL.... You can fix it even on windows by injecting Linux Mesa drivers into the install.. however the multiplayer branch we were playing is on an older openMW version incompatible with that injector.
Since youtube is being stupid and someone asked; google:( OpenMW windows Arc better performance Reddit ) if you want a how-to for that injector for single play. It does work.
If the results here are indicative, it would actually be more like 86% of the time they work every time. Starfield is probably a lost cause, but I'd like to see them put some effort into GTA 5. That one has been terrible for a long time and is actually a popular game.
@@ryanspencer6778Yes, I also did the math... but the quote is from a movie. :)
my issue is then why is the prices not dropped by 15-30% to accomodate this and why do people justify the prices currently if this is the case like wtf.@@ryanspencer6778
Now we just need Intel to break the energy efficiency curse
the moment they do, amd's in BIG trouble
the moment they do, amd's in BIG trouble
That's called setting PL2 to a reasonable value
Under load, that will take time. The architecture is way behind in terms of efficiency. But idle power efficiency should be solved for next generation, and I'd imagine that overall efficiency should see a nice improvement too.
@@nepnep6894 Kneecapping your power limit does not equal a new product matching or exceeding previous performance metrics at a lower energy cost.
I was not expecting to find out Intel's cards are now matching or outperforming my current card (RX 6600 xt) in a lot of places when I woke up today. Really interested to see how this plays out in the future.
happy to see competition. screw how expensive Nvidia has become, used to be a huge fan. Had 3 generations of Nvidia after AMD cards.
Now, I'm just going back to mid level.
I would rather take a chance with an open-boxed or used RX 6700xt with a 15-30-day warranty than deal with A770 driver and game issues. It runs circles around the A770 in everything but Cyberpunk.
The only issue is older titles that's the reason why I didn't buy one because of at the time older game support wasn't as good as AMD
@@dnakatomiuk amd bad sometimes too, that’s why i go for nvidia
In terms of hardware the A580,A750 and especially the 16GB variant A770 (just the fact they offer a 256bit bus for these prices is insane) absolutely demolish anything AMD and especially Nvidia offer at similar prices.
It's the drivers that are problematic.
I’m not much of a gamer. I got it for content creation. Pretty early adopter have it for a year plus now.
Honestly for content creation, it hasn’t let me down.
It’s about as stable (now) as Nvidia and AMD.
I’ve had AMD in the past for a decade. Then had 1080ti.
I simply love my Arc A770. I think it’s truly a content creation card masquerading as a gaming card.
That’s how people should see it.
More of that instead of a gaming card masquerading as a content creation card, which I know is how most people who buy it intend to use it.
For my use case tho, it serves me wonderfully.
Why did u switch your 1080ti?
@@HunterOni av1 encoding. Also faster encoding.
Have you tested it for 3D rendering in something like Blender?
@@_S_P_A_C_E_M_A_N_ I’m strictly video editing now. No longer much a 3D fx or modeling guy. But I know there are other TH-camrs that have gone over it for 3D. Might want to search them.
I put a Intel A750 into my daughters computer. She plays some casual games (Minecraft, Wizard 101, etc), but she also plays Overwatch 2, and Killing Floor 2. The card pretty much worked for all the games she played right out of the box. She got World War Z when it was free at the Epic Store when she still had my old 980 GTX, but it only recently (maybe like 3 driver releases back) became playable. Overall I've been pretty impressed with how far their driver has come. I probably wouldn't have gotten it for her if she had to manage the updates / troubleshooting /etc, but she has me. When a new driver comes out, I make a special trip over to her place to update to and test the new driver. I'm hopeful that Intel will try to fight in the mid to high range of the market with Battle Mage. I just hope Intel's board of directors and Pat Gelsinger have the fortitude to see this business segment through the rough start, and make it to where they are competing. Surely their executive didn't think that they would hit a home run on their first product release???
Not necessarily that their executives didn't think it would hit a home run at first release, but that the sales were probably lower than expected. They run predictions for this and also have to accept the fact that both NVIDIA and AMD will sabotage them however they can. Otherwise the move to ARC branding for even their integrated graphics cards means they won't give up or axe it just suddenly but it will be a very slow and difficult market to break.
@@Aki-ow9hd If their sales were lower than someone calculated, either the engineering staff wasn't communicating well with the bean counters, or the bean counters were dismissing what the engineering staff was telling them. The engineers knew the release was going to be a cluster frack, that's why it was delayed as long as it was. Once reviewers (particularly like Gamer's Nexus and Hardware Unboxed) got the cards/drivers, Intel's engineers working on the drivers and supporting application (like Arc Control) knew it was going to be a media bloodbath. This kind of publicity kills sales. So either there was a communication problem at Intel, or the communication was being ignored if Intel actually had higher sales figure predicted.
My point was that I hoped that Intel's executive had the fortitude to stay in it for the long run. They have a history of killing off / selling off divisions that don't make money, sometimes too soon. Especially right now when they are most vulnerable. No time in history has Intel been in this much trouble (not even in the Pentium 4 era) with being behind their competitors. I say Intel executive, but really it comes down to the executive's abilities to keep the shareholders at bay. If shareholders start calling for the graphics division to be culled, the executive has to provide some assurances that they can turn the division around in the long term. Thankfully Arc is many orders of magnitude better than when it released. If they can stay on the same trajectory leading into Battlemage and beyond, they might have a break even turning to profit moment in the not too distant future.
I can only imagine a future when Intel hadn't abandoned the graphics market with the i740, and kept in it competing with ATI, 3DFx, nVidia, etc. They had the money to outcompete those companies back then. Now they are up against the giant nVidia and to some extent their CPU nemesis AMD in the graphics arena. I think to succeed (Gamer's Nexus has alluded to it a few times), they need only to beat AMD to stay relevant. AMD's graphics division should be the most worried by Intel's presence in the market.
Nvidia now has an Apple mentality.
and apple's only still leading in mobile perf bc it hasn't stupidly fallen victim to complacency like intel has
@@dead-claudia their laptops are a shit show. Unupgradable, locked down parts, and shit thermals(in some instances), cheap components with a 200% markup
Yes and No. Yes head in the clouds but No because it keeps expanding into new markets and dominating, actually, that's a Yes and Yes then !!
Apple phones are fine. Every other product and acessory? Overpriced junk
When you are essentially in a monopolistic situation, you tend to have the same mentality as any other company in the same monopolistic situation.
My A380 is a wonder card for my Plex server. It's a Dell Poweredge T330 that doesn't support iGPU CPUs or discrete GPUs officially. The A380 works with a power cable adapter and it can handle a ton of concurrent transcodes💪
Good use case. Saw another comment about transcoding also - we need to set up some testing for that.
If you can't use the iGPU or are running an AMD system it's indeed great. However, from a power efficiency standpoint it's much better to go with an Intel CPU with and iGPU. They'll do all that transcoding just as well, without having to power such a large GPU.
Unfortunately you still can’t monitor temperature on linux properly and future support of a card with new intel driver on linux is on question
I just sold my 1050 Ti and now plan on buying an A310 just for my server that I’m building and I’m super excited
@@KaRaTeLoRd11PS3 Very similar here with mine. I just finally sold my 1050 and picked up one of the Sparkle single-slot A310s.
I have a friend who gave his Nvidia GPU to his wife and decided to roll the dice on an Intel a770. I've yet to hear any real complaints from him even playing somewhat esoteric stuff like mods and fan games
😄 He can't admit it and ask his wife to give back his gpu.
@@doorsbh😂😂😂
The cards actually work really well on those title that it works with.
I love the smoothness.
Zero hitching...even in VR titles I run.
@@doorsbh actually i have the ARC770 16gb and for most things it works well .. ive got it paired with the 14600kf and the system in general is great .. my main is a 7800x3d 7900xtx but for a budget GPU that intel clearly wants to compete in the market with constant driver updates and support its a huge step in the right direction ..
They dont need to be 1st of even 2nd yet they just need to gain consumer confidence and a following the position in the market will change as they become more trusted ..
People forget that the majority of GPU's sold are in the budget sector not the 7900xtx's 4090s and higher end !!
I did something similar.
The A770 was pretty much a direct competitor minus some software issues (a lot of which have been addressed). The AV1 CODEC and extra VRAM alone are worth it for dabbling in multimedia.
Eventually, swapped back to 1080 Tis in my Gaming Rig, because I _do_ still play games that support SLI, but I'm all-in on BattleMage.
@@volvot6rdesignawd702 7900xtx's 4090s you can't get higher end right now lol
I feel like these GPUs are a good lesson for businesses about honesty and providing a valuable product.
The GPUs at the start were very inconsistent, but Intel owned up to it and were pretty transparent about what was going on.
Now way later, instead of being remembered for their horrible launch, they will mostly remembered for the massive improvements and competitive pricing (of course no one is going to completely forget it, but in 2 years, what matters most is where it ended).
I picked up an A770 on sale about a year ago and I've loved it. It hasn't been without issues, but it's gotten so much better and I'm glad to see that the performance seems to be continuing to improve.
One thing that would help a lot on the graphs readability is if you align every part of a GPU name on the Y axis on a ordered fashion, adding white spaces to align every part of the name, something like - so it would look something like that (Disregard youtube indentation needing multiple whitespaces)
EVGA - RTX 3060 12GB Black (1/24)
NVIDIA - RTX 4060 Ti 8GB FE (1/24)
Gigabyte - RTX 2070 8GB OC (9/23)
That way, it is always ordered and you always know where you'll find the Model or VRam size without having to search multiple names that have different sizes and different places where the model can be
This is Kinda what already happens with the last VRAM size, Stock / OC and date, but it gets messy when looking for the model and brand of GPU
Seconding this. Would make the graphs a lot quicker to read.
100% this
@@nicholasthesilly agreed
Yeah, the names have different lengths, that would indeed improve readability.
No idea which software they use for the graphs, but it should be possible even in Excel.
It would also be pretty awesome if there were FPS/Price charts. Dividing FPS by Price will show relative value, and it will be awesome to see the best value cards. Obviously higher-end cards will perform poorly on this chart, but for someone trying to pick the best budget cards this would be pretty helpful. I'm certain Intel would have peaks next to similar performing cards in a few of these tests.
So glad to hear Snowflake's little mew in your store ad here 😻
I am honestly optimistic that battlemage will kick some serious butt.
Hopefully the executives understand this will be a long and slow road to build the drivers.
Can’t wait for it. Having a lot of fun with my A750, A770 and A380
why no 580?@@IntelArcTesting
@@NANOSPERMATICO It was released rather late.
@@NANOSPERMATICO580 kills the vibe
Thanks for the review guys! I grabbed the A750 and A770 now that pricing matches performance much better.
I bought the A380 for general computing (web browsing (some websites are pretty demanding on the GPU), video conferencing, some video editing, video playback, etc.) I was initially skeptical if it could handle video conferences, but it's been doing great, especially considering it's only 120 $. No issues with stability for me at all.
If you ever have a chance for an interview, it would be fun to hear more about this drivers optimization process. In a sense, not only it looks like they are reordering hardware calls and optimizing pipelines, but each driver has a specific module for each “supported” game, whereas the unsupported ones fall into the generic bucket. But knowing more about those details would be super. Thanks for all your work!
Great technical insight!
Great benchmarks, mixed results but overall, Intel is actually becoming viable and that's good for all
Unfortunately the news is that they have discontinued their GPU units.
@@hirakchatterjee5240no they havent....,
@@hirakchatterjee5240 Asrock is still pumping them out and Battlemage is hopefully still coming out.
@@hirakchatterjee5240Nope. If you refer to the nonsense that MLID talked about, he was referring to laptop GPUs and even that I wouldnt trust lol. GUy has been wrong so many times.
@@dominikheinz2297So many people take that guys “information” as gospel. I used to enjoy watching his videos because unlike other shitty channels that fart out videos every 6 hours his presentation seemed more….. believable? Also the smug arrogance when he finally got something right was a nuisance to say the least.
8:07 Best moment of this past week for me.
Mew!
Thanks for the re-visit! The most surprising part of this review to me is just how poorly the nvidia 20 series cards have aged!!!! I’m so glad to see intel making so much progress, and I think I’m finally close to being able to recommend it to savvy enough friends on a budget! I hope they can get closer to the “mid range” of say 4070 or theoretical 5070 on their next launch and I will also switch from my insanely poor value 3069ti that I bought at the height of the GPU drought for $479 :/ The return tracing vs similarly priced cards is especially impressive to me.
I agree with you, I have a RTX 2060 and I am searching right now all these GPUS, I'm currently between ARC A770 vs RTX 4060.
What is your opnion on them?
@@rosavitif you’re buying right now some really good deals on 3070’s which I would recommend over 4060 (Newegg for $399) otherwise I’d say stick with nVida for now arc is super close but not all the way there, unless you’re ok with not 100%
@@rosavitwait for RTX 5000 series to buy a new GPU. Otherwise, get something used. 3070 8GB is fine for 1080p. 6700XT 12Gb is good for 1440p.
I'll definitely get the Battlemage series, seeing how serious Intel put effort into their GPU.
yeah I'm waiting for Battlemage to come out too. Hopefully Intel understands they are the bargain brand and don't just copy AMD and set all their prices barley under Nvidia's.
Intel want to have fat margins everywhere… so if battle mage is good, they douple the prices!
ARC is cheap because it was junk!
And did not sell because of that.
@@haukionkannel even if they would double the prices, getting a 4070ti+ performance at a 7700xt price is good for everyone
@@haukionkannellol, that's not how it works. they'll probably offer somewhat better performance compared to equivalent AMD cards for a slightly lower price, and AMD either goes even down regarding market share or needs to lower their prices, too
I think we will be in aw if the rumors are true. Xe 64 and Xe 56. with 16gb/24gb and 256 bus for the B980/B970. Predicted launch H1. On these specs it would be challenging 4070ti/4080. But if it comes with APO that will put 10-50% on top of it it will be a killer GPU. (for reference A770 Xe 32/16gb/256 bus) It will be doubling the Alchemist performance.
Intel ARC news are the only GPU news i enjoy reading/watching :D
it's a strange ray of hope in a bleak industry.
Having owned an A770 LE since launch month, I'm impressed and pleased w/ how far it's come. Honestly, it's not very far away from being where I'm fine w/ it being, and w/ all that VRAM and generally good specs overall, I can see this one lasting and aging like fine wine.
shout out to the 1080 ti
The GOAT!
I swear that card came from an alternate reality where GPUs were priced sanely.. oh wait
An elegant weapon for a more civilized age...
I worry they'll bury me with it!
@@GamersNexus Hey Steve, can you guys make a video about "GPU Hindsight" ? Basically which was the smartest GPU to buy and upgrade in each generation. With your experience and knowledge, it would be a great watch and summarisation to this market !!
I got the Intel Arc Sparkle A770 and it not only is a beautiful card, it runs (most) games absolutely flawless and is a large improvement from my old graphics card.
Proud user of an ARC 770, playing cyberpunk on RT psycho with xess balanced arround 40fps. Really want to continue my journey using gpu with Intel
B580 next gen
All of this improvement just increases my hope that Battlemage will be truly competitive in the low to mid-range and put pricing pressure on the market.
bruh if the rumored prices of 499 and 599 and the 10% less performance than the 4080 are true it would kill 30% of the used market gpu's and any mid range rtx 40 series rx7000 series since it could run any fing game and will make 4k gaming affordable again for just around 1100$ ud get a 980+i7 13700k and it will be more than enough for most gamers
@@malik-mahdi I have no idea why anybody would believe any of those rumours....
i know its waaaay too good to be true but if one of the is true we'd have either a new 500ish gpu for more competition or another high-end gpu to compite with the 4080/7900xtx @@jesusbarrera6916
@malik-mahdi I still think that's too hopeful. ARC is likely targeting 4070 with their flagship, like they did with the 3070 last Gen.
i dont hope for both just for one of them either good performance or good pricing
I'm really interested in the encoding performance of the a310. I have 3 Plex libraries. A 4k, a 1080p important, and a 1080p unimportant. I want my 4k and my 1080p important in av1, but idc about unimportant since I delete it when I'm done with it.
Great point! We can set up a test of some kind for that.
@@GamersNexus that would be super awesome!
I think that would be interesting to see as well. Personally I've moved away from encoding my libraries and would prefer transcoding performance. I'd love to see what newer options may exist that would replace my venerable 1050ti 4GB with uncapped stream driver. A cheap low cost option that doesn't require quadro money to deliver 5-10 streams @1080 or @4k.
@@GamersNexus Please do, with some idle power consumption numbers, where you enabled ASPM and whatever else is recommended to get idle consumption down on these ARC GPUs.
@@GamersNexus I'm following the wrong crowd, because I've found little to no discussion of the 3D, AI, or hardware encoding capabilities of the Arc series. It's all games, games, games.
This is not something I've tested myself but something I've heard from someone else...
The idea was getting better performance in video handling and compressing. The user bought a A310 ARC card and added it to his gaming machine that was running a i5-13600K and a Nvidia RTX-3070. After installing the A310, not connecting a monitor to it, he disabled the graphics integrated into the i5-13600K. The videosoftware that was previously using the quick sync technology would now use the A310 instead of the graphics processor in the CPU. According to what I was told the addition of the A310 increased the encoding performance drastically.
Now this is an interesting idea if it actually works. A AMD user might be able to use Quick Sync when handling video encode and such. Or users of Intel processors that doesn't have the integrated GPU might be able to enhance their machines performance.
I've done this myself. The downside is that the Arc driver didn't cooperate well with the Nvidia driver, and software compatibility with the Arc encoder wasn't great. But that was around a year ago since I last tried, so things could definitely have changed.
amd doesnt have quicksync and there support is god aweful
Damn, why have I not thought of that besides Crossfire/SLI.
gpu encode is fast but poor quality and big files
@@tomaszszupryczynski5453 The quality has improved some, but compared to good software solutions it is worse. However the important thing for most people is that the encode is fast enough the machine doesn't sag even if you are playing a advanced game while capturing the game play at 60 Hz and high resolution and perhaps a video camera at the same time. To do this in real time takes quite some power and if there is a GPU with encoding support that does a lot to help.
Now the only ones I've used myself is Nvidia and AMD cards. Both kind of work, though in my opinion the Nvidia solution is way better than AMD, both in stability, speed, quality at a given bitrate and file size at a given quality requirement.
I can't say anything about Intel Quick Sync, but it has a lot of people who claims it's good enough.
But yes, non of the hardware encoders I've used compares positively when compared to a slow and well configured software encoder.
Thank goodness that Arc GPUs are only broken on Starfield, and NOT a game that matters or that anyone would want to play.
Danke!
I brought Arc A770 basically at launch. Hadn't regretted it. It really does run cyberpunk well.
Thanks Steve!
Wow! That was unexpected - thank you!
@@GamersNexus Back to you, Steve!
AMD wasn't competing in the budget-midrange category because, until Arc, there was no competition besides their own APUs (or used cards)
Hopefully Intel can make some more headway on this end of the market, and AMD can make gains against Nvidia in the high-end
Thanks!
I am happy to have been an A770 user for the last 6 months. Thank you for videos like this!
Currently using the sparkle 380 for a small medical image processing PC. Does the job perfectly.
I would love to see a separate video, or just an add on to these videos on Intel GPUs to see how DXVK changes performance in DX9, DX10, and DX11 games. Considering its just a copy and paste into the folder with the .exe and can change a game from a unstable, buggy and/or unplayable mess to a playable or improved state. Its always worth trying.
Working on that soon! Hoping to get some more engineering insight first.
@@GamersNexusI'd also love to see you test Siege with Vulkan instead of DX11. Almost everyone I know prefers using Vulkan, and it would be interesting to see if Arc goes up in the rankings compared to DX11.
In many cases DXVK achieves 50+% performance.
@@lorsheckmolseh3345 it's amazing how much koolaid devs drank to still be writing games in directX
@@GamersNexus "Nvidia doesn't care at much" Not true. They care. their prices sometimes changed. they have done software work for linux. Intel Arc prices maybe a little bit cheaper. Dx12 slow in many thing. Dx11 Faster in MANY thing or Vulkan. game devs or Vulkan devs or both need to do few things Vulkan not choosable option in many game, Vulkan it could be one of the best thing for gamers. MAYBE yout information is good but i have read THINGS about Intel Arc not has support for Dx12 and Dx12.2 and few game and things still not really works on linux. DON'T MISUNDERSTAND ME, I NOT hate them. i have few product and what i have it's good. here in this countrie, MANY kind of VERY evil things are MANY years! in ALL peoples animal man TYPES are. many evil in basic things also. and my half brother in Usa not friendley poker player, website domain owner he has few. i can't communicate with him. help needed as fast as possible!
i truly hope intel continue to support the team behind the intel GPU division as they seem to be quite dedicated to making the drivers as good as they can. I can understand that they are trying to catch up to teams at AMD and Nvidea that have over many more years been able to refine their offerings and that in todays tech they are regarded as mature. Intel sadly have so often come to market with tech that is astounding but then a year or 2 later simply to cut costs dump the tech and the support.
I'm running 2 different A770 cards in 2 different builds and not having any issues. It was a gamble getting the LE card 7 days after launch, but it paid off rather well.
My parents also didn't realize I had gotten the LE right after launch, and I had only mentioned briefly that I wished the Acer version was easier to obtain (it was "Asia-only" at the time). They got me that Predator Bifrost A770 for X-mas (and at the LE MSRP pricing, too) and it's now my daily driver.
I had skipped over RTX, the best card I had before Arc was a GTX 1080Ti, and Arc's consistently stronger than that card, so as far as I can tell it's the perfect price-to-performance upgrade GPU for me.
Thanks the website review, was checking it out at work. I personally dont mind ads on your own website, even ads for your own merch. You give alot to a community of pc hardware enthusiast ❤
I've heard where the arc cards really shine is in encode/decode, like with streaming... it'd be interesting if there was a way to measure that kind of performance, as well as just game performance or raw calculation benchmarks.
Yeah you can stream with lower bitrate and get more quality.
I have questions about this too. I think a lot of people do more video production and encoding than gaming.
Yes, Intel GPUs have been really good at media encoding and decoding for a while, and Arc inherited that from the iGPUs. They also get hardware accelerated AV1 encoding. So in theory, Quicksync AV1 should be the best way to record/stream games using a GPU, even better than NVENC AV1. Unfortunately, almost nobody supports AV1 streaming with Arc. Recording works well though.
If I recall correctly, Intel designed one really good hardware video acceleration block and put it in every ARC card, so theoretically all the ARC cards perform about the same there, as long as they aren't constrained by power and VRAM. Generally, hardware codecs have some sort of fixed throughput cap, so the datasheets might tell you everything. Making up an example: maybe the hardware encode block can process 500M pixels/sec. That's just enough for 4K 60Hz, but trying to encode a 4K 61Hz signal will overwhelm it and lead to some dropped frames.
@@DrowningInTea "I think a lot of people do more video production and encoding than gaming"
then you must live in LA LA Land
Very intersting. I've been eyeballing an A770 as a replacement for my aging 1660Super, but have yet to pull the trigger given that BattleMage is due to launch later this year. Still, the A770s pricepoint is looking pretty damn nice when also viewed on the bang you get for the buck.
I'd wait for BattleMage later this year.
@@redslate Yeah, that's what I'm thinking as well. That being said, it'll be VERY interesting to see the pricepoint for them. Hopefully Intel doesn't go full-on ham with the prices.
I'd say wait for battlemage as well.
The A770 is a nice card for recent games, but priced not quite right for it's performance.
In games where it runs well, it performs great, but outside of that a RTX 4060 is cheaper and faster. Or a RX 7600, also running faster in those games and costs noticeably less.
Thank you for revisiting the Arc lineup. I both my 750 after whishing your video's on the cards and I haven't regretted the perches.
I've been using an A770 for a month now at 1440p and I really hope to see more people buying Arc gpus now! It works so well and I rarely have issues with it. It also runs pretty quiet and temps are always in the 50s-60s (I have the Asrock 3-fan 16gb model so that's not surprising).
Most of the few issues I have had are in blender, which makes sense since that favors Nvidia much more, but the gpu still works well enough there for me since I'm a beginner. I've also had small screen flickering issues in Persona 4 Golden, which I think might be related to Arc since I couldn't find anything about it online. I do need to update my drivers though...
Still holding on to my 1060 until Intel's cards are a viable upgrade in the games I play.
They aren't quite there yet, but the gains so far are impressive.
Sadly the Acer Bifrost A770 is sold out in the UK now but that would have been my go to version looks wise. I have the Sapphire RX 6700 10GB, great card so it's more of a lateral performance change than an upgrade to the A770. I'm waiting to hear about the upcoming Battlemage cards if they release this year, otherwise might grab an RX 6800 (non XT) in a sale.
I'm enjoying the Intel A380 that I got for $99. It runs games at 1080p on my Mini-ITX system along with AV1 support.
At $99, it's a great card if you're using it for encoding and light gaming.
Owner of said ARK A 770, It's been a blessing and a curse with driver and bug issues. (Jan 18 drivers, Seems stable atm) I'm still using my Intel GPU for 1080p gaming and hoping better driver revisions. Thanks Steve and crew for the recent review.
I was considering an a770 16GB last year (actually I was almost pulling the trigger on it). The problem was, I am busy for trouble-shooting at this point in my life. I want something that "just works" and the 6700XT was priced lower and gave me more performance. I really want Intel to establish itself as the 3rd major player on the GPU market, but I also don't feel like kickstarting a multi-billion company. In 3-4 years latest, I will be on the hunt again - and if Intel is the best value by then, sign me up for team blue.
Super cool to see you keep following up on this. One request: do you think at some point you'll have the time to take a look into more niche games on Arc? Obviously not with a wide range of comparisons, but, say, with the 7600 and 4060 for comparison, looking at various indie titles, smaller "AA" titles, games using less common engines, and so on? Would be super interesting to see.
I would like that as well. Most of the games are basically never tested in reviews, but still need some level of performance. And not everything scales exactly the same, so those games tested barely say anything about performance in those niche games.
Intel’s GPUs are the only thing in the tech space I’m excited about anymore.
I bought the ARC A580 yesterday and i am very happy with it.
Thanks, Steve.
I've been following these Intel Arc videos, it's great to see their improvement. Maybe in 2/3 more patches, they'll be ready for the mainstream market.
I have had my Arc a770 for almost a year now. I play Minecraft mostly but when I play other games I haven’t had any non playable problems. Sometimes things don’t like to work but a lot of times all it takes is a reboot or reinstall of drivers and things are back to normal.
In Japan, the
Acer Predator BiFrost Intel Arc A770 OC [PCIExp 16GB] - ¥39,800
MSI GeForce RTX 3060 VENTUS 2X 12G OC [PCIExp 12GB] - ¥39,475
(divide price by 150 for USD)
Initially I was going to go with the Intel Arc A770 OC because of the 16GB vram since I use a 4k monitor (for video editing), but may go with the
GeForce RTX 3060 12GB. (currently using a GTX 1660 6GB, and some games don't run well in 4K with this GPU)
Just buy a rx 6700xt
@@Celatra not in Japan, but right now in this part of the world the RX 6700 XT is not worth the money. Costs about as much as a RTX 4060 Ti.
Thanks Steve
Fingers crossed Battlemage launches a bit smoother. I wouldn't mind using it in small form factor boxes connected to TVs.
ASRock ARC A380 can be found open box for like $90.
I run a Samsung 4K LU28R55 with a secondary 1080P S22E450 while the 4K TH-cam Vids use about 18 Watts.
I may be interested in Battle Mage, when it comes out. What I'd like to know is how well the Arc cards work with popular emulators (Retroarch, Yuzu, etc.) now.
Truly very thankful for my A380 purchase! Its the best Low profile card and it was only $90. It absolutely crushes!
Im using it as my encoding gpu for one of my recording pcs and it also does gaming alright!
I almost made the decision to buy an A770. Until you showed the graphs for GTAV. A game I'd want to be playing. So I'll wait a bit longer to see if the drivers keep improving. My GTX1650 isn't cutting it anymore. Good content.
I bought my a750 for $180 during the holidays brand new and haven’t regretted it one bit.
Battlemage's gonna be interesting
I use lots of Adobe premier pro and would love to see the performance of the A750 and A770, I haven’t seen much info on these cards performance after they gained support for V24
Thx for that test.
I love to see a 1080Ti in the test, just for the perspective!
A 11 GB card from 2017 is still competitive to a certain degree...
GTA V being broken on Arc is baffling to me, since that is still one of the most popular PC games on Steam and should probably be treated a priority for Intel driver engineers.
12:00 Drivers can't be understated.
You should add some productivity tests to the GPU benchmarks, especially AI related. Like a stable diffusion test bench.
The Intel ARC cards have come a long way and are definitely worth buying.
Thanks for always having the absolute best deep dives on specific hardware readily available when we go to look for it
I totally understand that the weaknesses of the card have to be mentioned in the review ( as they should ) however, as a consumer you can just Google if the games you play work well with arc gpus. My Arc 770 has been phenomenal for streaming, content creation and gaming.
Please continue to make more videos about arc this is really helpful!
I really love how the A750 looks and I hope they keep the look going into the next gen if at all (I do hope this goes past 1st gen)
I think A750 has potential to be the go-to budget card like RX480/580 used to if compatibility issues get fixed. I heard battlemage launch is most likely going to be pushed to 2025 but I really hope Intel keeps up with patching driver!!
I can see that already. price/performance is great.
What is the beige box right in front of Steve?!? It looks like either an old external DAT or an external optical cartridge. This is going to bug me.
So lucky to snag an A750 for $145 with tax and shipping on those Newegg TikTok deals. Works great in my 14 year old’s gaming computer.
That's an insane deal right there. I wish I knew about that... 🥲
It's almost feeling like NVidia is becoming the Apple of GPUs while AMD, and now hopefully intel, are going to be the companies fighting over actual consumers and bringing about honest innovation.
Nvidia has been trying to break out of the discrete gaming GPU market for years. They saw the writing on the wall years ago. Mainstream Discrete GPU SKUs will meet the same fate as other discrete cards. Integrated solutions become adequate for the masses while discrete solutions cater to hobbyist/professional niches. The whole crypto-craze and COVID bubbles only had delayed this by a couple of years.
@@Krogoth512 No idea what you are talking about with "discrete", but as long as AMD is right on NVidia's doorstep in terms of performance, they aren't special. Hence why I said "the Apple of GPUs". They just want to charge stupid prices for the mindless people who will gladly fork over money to them without thinking. AKA, exactly like Apple.
its actually easy if you do mostly video editing and Photoshop and a bit of 1080/1440p gaming u cant go wrong with the a770 or the 4060 (even thoe the 8gb vram will be a big problem in 1440p compared to the 16gb of the arc a770) if u do mostly gaming go with the 7600xt /6700xt both are really good for the price (for rt go with the rx7600 xt if i remember well it gets around 10/20 fps more with rt on) if u do care about rt then go for the rtx 4060 (same thing for those who use blender) and finally if u do all of this go for the a770 on a discount its not the best in any of this things but its better to have a versatile gpu that performs good enought in most tasks than one who performs very well in gaming but very bad in content creation (rx 7600) or one very good for those tasks but really limited in terms of hardware and a bit over priced(4060)
The A770 is in a very interesting spot at this point. Price wise it should perform similar to a RX 7600, which it often does. The RX 6700 XT on the other hand competes with the RTX 4060 Ti in price, and who wins that duel is pretty clear.
To me the real winner of the Arc cards is the A750. Sure, it's slower than the A770, but you get RX 7600 performance at RX 6600 cost. And in some titles it runs past even the RTX 4060.
But as GTA V and Starfield show, the cards aren't quite finished yet.
@@HappyBeezerStudios fr bro the a750 is underrated but i prefer the a770 for being faster and more vram (in video editing and 3d 8gb would be a problem specially at 4k) the only issue is the drivers they are still in progress
I am so glad I bought EVGA 2070 XC Ultra back in the day even though I thought that my 970 is perfectly fine.
Actually, I am in a similar situation as I have been about 5 years ago, maybe it actually is time to upgrade and be covered for next 5 years :D
As an owner of an arc a750 I have no complaints. Paired it with a 5800x3d bundle from microcenter and haven’t looked back.
Intel needs to release something worth talking about. I keep hearing battle mage, battle mage, battle mage, and still we have nothing that can even take on a 7900XT or a 4070Ti. if they can't take on a 3070 now, what the hell are they going to do against nVidia 5000 series and AMD next cards.
Its actually insane how good the 1080ti still is, it came out what, 7 years ago? Holy
An elegant weapon for a more civilized age...
My GTX970 still holds very well too
I’ve been having a lot of fun testing Arc. Any chance you can do a cpu scaling test on arc? I have notice that arc has a pretty large cpu driver overhead which causes issues in some game like horizon zero dawn and death stranding. Thanks for the work!
Ah man I love these. Saving this one when going to bed❤
I got an A770 for my Linux system. It’s focus on Vulkan has really complemented Linux’s ecosystem quite well (with Zink and DXVK already competitively covering OpenGL and DirectX for most games), and with the upcoming Xe driver it should become even better. It also is nice due to its comparatively low number of binary blobs for the GPU space.
If only Arc pricing was more competitive in Europe, I'd gladly consider it...
What's the pricing like there compared to an RX 7600 or RTX 4060?
@@GamersNexus The Arc 770 is in the same price range as the 16GB RX7600. Currently even 23 bucks cheaper at 336€ (Both Asrock models)
located in Germany
@@GamersNexus
Here in Germany/Austria
Rx7600 259€
RX6650XT 249€
RTX4060 299€
A750 219€
A770 16GB 319€
Yeah pricing is rough new, especially the A770 for €380. A750 is around €250 while only being like 5% slower. Used however you can get A770 for pretty cheap, got mine for around €250 used.
@@GamersNexusRx 7600 is around €280, RTX 4060 is around €310, Arc A750 is around €250, A770 8GB is around €300 and A770 16GB is around €380. At the price of a A770 16GB you might as well get a RX 6700 XT.
I cannot believe Arc 770 performs near to a RTX 4060, seems like this NVIDIA gen is the same as the 3XXX, in the other way that's a massive improvement from Intel.
Titan Volta? / 1080 Ti / 2060S-2070S /3060 12 GB / 4060 / Radeon 7 / 5700 XT / 6600 XT / 6650 XT / 6700 10 GB / 7600 / 7600 XT / A750 / A770
There have never been more GPUs at the more or less same performance class, in the history of ever
@@raresmacovei8382 It wasn't about the other GPU's, is about how much Intel has improved.
honestly, competing at the 3XXX level is good enough. that's basically where the budget buyers are gonna be looking. they aren't quite there, but it's also the first generation.
Man I always thought my Arc A770 GTAV issues were user-error. Glad you mentioned that here. I'm sure Intel devs will add that to their fixit list after watching!
DXVK could help.
Well I just ordered parts to build my 3rd PC and I bought the Sparkle Intel Arc A770 Titan OC Edition • 16GB. I think it will do what I want. I'm not as of a big gamer anymore due to my eyesight. But the few I do play this is more than capable.
Enjoying my A380 that i had since Nov 2022. Initially it would dip to ~20fps in Destiny 2. Now it is consistently at 60 fps with smart vsync. It was broken in the last 6 months of drivers, but works really well on the latest one. 5331 or whatever it is
I have a 7900XTX main GPU, but it has issues with streaming video playback while gaming. (I keep YT or other video streaming on a secondary monitor going while I game.) So, I added a GTX1650 as a secondary GPU specifically to run Firefox, Media Player, and PowerDVD on my secondary monitor. (Originally, used a 750Ti, but that had quality issues, then tried a 1030, but that had performance issues.) It worked great for a long time, until recently. I'm not sure if the issues are coming from the Nvidia or AMD drivers, but just the last month or so, my games have begun having major bad performance for brief times, mostly when first starting games, but there are also random moments that seemingly come out of nowhere. I'm thinking of switching my secondary card to see if the issues go away. Would an Intel card, like an A380 or A580, be a good candidate for this duty?
That's really weird. I've never experienced that on my 7900XTX and I've been a user since launch with those wacky driver bugs.
Update your driver's. Lol
@@UKKNGaming I did update my drivers, but I can't tell which driver update it was that cause the issue.
You probably should not be mixing vendors like that on windows(this does not apply to laptops where vendors supply custom drivers)
@@dangingerich2559 I feel like mixing cards from different vendors like that is bound to create problems sooner or later.
Are you sure the original playback issues aren't caused by something else entirely?
I remember having similar problems with playback while gaming and all it took to fix was turning off hardware accel in the browser.
Still if you really want two cards in your system sticking to the same vendor seems like a safer bet than mixing drivers.
I liked the A750, but I had to send it back. HDMI 2.1 support is broken. I have an LG OLED for my main display with no option to use display port.
But, if using DP it's a fantastic card for the money.
Good to know -- will test that!
@@GamersNexus Also try toggling HDR while in 2160p@120hz via HDMI 2.1
When you switch back from HDR, the colors are whacked out. Over saturated. Reboot required.
Edit: Adding all info just in case
Frequently would lose sync over HDMI 2.1. End up with a black screen. It always recognized my 2nd monitor via DP.
If toggling video modes, might lose sync. If toggling HDR could lose sync. I also use ARC audio (not to be confused with the A750) and it would lose sync with audio return channel if using all bells and whistles (4k, 120hz, HDR)
Sometimes, I would have a vertical line of artifacts down the OLED. Reboot could fix it, (if it sync'd). That was caused by upgrading the driver that also contained a flash. The early January update.
I think that is everything.
Via DP, everything works. Except for being able to use ARC audio.
i can run my lg c9 via hdmi at full resolution and refresh rate with my a770.
@@sgredsch I could too. But it had lots of other problems. Do you use ARC for audio? Do you switch between HDR? Mine was also an Intel A750, maybe some difference. I 100% for sure had issues though.
@@sgredsch Oh, I should add that my EVGA 3080FTW3 does everything that HDMI 2.1 can do flawlessly. Not a cable or B9 problem.
Hang in there Intel, show us what Battlemage and later Celestial can do!
I recently bought an A380 ELF for $120 and am extremely happy with it. I have a limited budget for building a gaming PC and the A380 gave me an excellent bang for my buck.
I have a A750 LE and I love it, one of the most interesting GPUs in recent years
What still impresses me is the 6700 XT.
You're impressed by a $480 at-launch card? Cool, I guess. It does get beaten by the $430 at-launch 3060 Ti, though.