🟨 Too late for a question?? I watched all of your ARC A770 videos and just bought one! 😀 However, I used your score charts to decide, but I realize now that I missed that your test bench had an Intel iGPU as well (Hyper-Encode boost?). I'm on AMD Ryzen. My A770 arrives in 2 days. 📦 Am I returning it and shelling out $180 more for a 7800 XT instead? I'm all H.264 and H.265. No RED, no RAW, no 3D. 🎬 Thanks!! Love your channel! ⭐⭐⭐⭐⭐
@@theTechNotice- Woohoo! 😃 Great news! Thanks for the clarification! Can't wait to rerun the Puget benchmarks when I install the A770 to see the difference! Thanks for the awesome videos!! 😃
@@theTechNotice I entered a longer comment on another video like this one but it keeps erasing. I don't know where it went, but the bottom line is I'm not having great luck here. Everything from bios, to chipset, to drivers, all updated. Beforehand, I tested with my RTX 2060 KO Ultra. Then I installed and tested the A770. (how do you post a link on here?) Trying to post a screenshot of the side-by-side comparison somehow.
I have this card, and so far I love it! I don't use the RGB lighting (turned it off), but I have to say, in playing with it... I very nearly kept it on. There is this one setting where you can change it to red and, take your pick, it looks like either KITT from Knight Rider or the Cylons from the original Battlestar Galactica TV show... lol. That was cool, and honestly I will probably turn it on a few times just to show people. One complaint I have: I had to turn off some sort of on-screen overlay thing by editing the Windows registry. It doesn't seem to be a common problem, but DDU and DDU and DDU, I couldn't figure it out. YouTube was the only program affected, but I love YouTube, so it did cause me to pull out some hairs for like 2 days until I stumbled on a post about someone having the same issue. Not sure how common it is (seems pretty rare), but turning that setting off in the registry by adding it in with some numbers like 00000005 fixed it. So, happy camper now :) If someone could add that setting to the GPU control panel app, that would be nice, something like "If YouTube is flickering for you, uncheck this box", as I read that I will need to edit that registry setting every time Windows updates.
I got one pretty much just for video transcoding. While they're not the best for gaming, for H.265 or AV1 encoding Intel's chips render the highest-quality video in the smallest file sizes, usually beating the software/CPU encoder. For live transcoding on my Jellyfin setup and just archival encoding for my media library, the A310 has been great. That's on a separate computer, though; I use an RX 6650 XT for actual gaming.
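For an archival encode like the one described above, the usual route is ffmpeg's Quick Sync (QSV) path. A minimal sketch, assuming an ffmpeg build with QSV support; the file names and quality value are placeholders, not from the comment:

```python
# Sketch of an AV1 archival encode on an Arc card's media engine.
# Assumes ffmpeg was compiled with Quick Sync (QSV) support; "input.mkv"
# and "output.mkv" are hypothetical file names.
cmd = [
    "ffmpeg",
    "-hwaccel", "qsv",        # decode on the Quick Sync engine
    "-i", "input.mkv",        # hypothetical source file
    "-c:v", "av1_qsv",        # Quick Sync AV1 encoder (Arc A-series)
    "-global_quality", "28",  # quality-targeted rate control (lower = better)
    "-c:a", "copy",           # pass audio through untouched
    "output.mkv",
]
print(" ".join(cmd))
```

To actually run it, hand `cmd` to `subprocess.run`; on a build without QSV, ffmpeg should fail with an unknown-encoder error rather than silently fall back to CPU encoding.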
Well, there is the Hyper-Encode feature it would add, but that's a very specific thing. In most cases I'd say no additional performance, just an extra GPU to troubleshoot :)
Hello. Maybe my comment will be outdated, but I really need help. Right now I'm looking for a dedicated streaming PC using AV1, and I have two options at almost the same price. First is to build a PC with 16GB RAM, an i5-12400F and an Arc A380. The second is a Lenovo LOQ 15IAX9I laptop with an i5-12450HX, 8GB RAM and an Intel Arc A530M 4GB. The RAM can be changed on the laptop; it has 2 slots. The only reason I haven't chosen the laptop already is that I'm not sure the A530M with 4GB is enough for 1080p/60 or 1440p/60 AV1 streaming. Do you have any thoughts on this? The laptop is preferred because I'm always on the go.
I almost pulled the trigger on an Intel Arc A770 this past week whilst Amazon had it for £298 but then the price jumped all the way up to £380! 😮🤬 Waiting for it to drop sub-£300 again…
Sooo, for someone who only uses DaVinci Resolve Studio and Lightroom/Photoshop and has the budget for an 8GB RTX 4060: would the Arc A580 8GB or an A750 8GB be a good choice? I run an i7-8700K and 32GB of DDR4 on an Asus Z390-A as the rest of my system. No gaming and no 3D work.
I have the ASRock B450M Steel Legend motherboard. I was planning to upgrade the CPU to a 5700X and the GPU to an Arc A770. Will I have access to the Resizable BAR feature?
So if they don't upgrade the dual media engines inside the GPU, would the performance in Premiere Pro be the same on a newer GPU? I'm waiting for a newer Intel GPU, but I don't know if I should, as there are no rumours about it.
Some people say the A770 crashes in Lightroom's AI Denoise. Other people comment about poor performance in the same task (AI Denoise). Can you tell us more about this? Thank you
So for now the main issue isn't the card, it's the software? Sucks that NVIDIA has had the market share for so long that all the software is coded to optimize for NVIDIA GPUs.
I'm not sure whether to purchase now or wait for the holidays. Rumors are saying Battlemage will drop in the fall/winter, and that a B580 with 12GB of VRAM would be equivalent to an A770, but at a price of $200 USD and with lower power consumption.
@@survivor303 Maybe, but you have to remember that Intel is new to the GPU market, and if they want to gain market share, they have to be competitive against Nvidia and AMD. Remember, nobody believed a 6 core 12 thread CPU would be half the price of Intel's i7 6800k, but then Ryzen happened.
Let's be honest. People aren't going to buy ARC. It's going to be really hard to get them to leave Nvidia or AMD. Even if people avoid the 50 series with subpar specs, people are likely going to buy the 30 or 40 series for the next upcoming years before ever considering getting an ARC. If Intel has any hope of making any sales, they're going to have to release their GPUs at SIGNIFICANTLY lower prices than their established competitors.
I would love it if you made your bar charts mobile-viewer friendly. With so many arranged that way, at that size, they're impossible to read even on an iPhone Pro Max. Check out how Max Tech does it. Very mobile friendly.
My buddy has an RTX 4060 and a Ryzen 7 7700 paired with 16GB DDR5, while at the time I had an Arc A580. We ran some 7 Days to Die; he had significant FPS drops at the same scenes, big cities and all that. He would bottom out at 90, sometimes even 46, while my A580 sat locked at 110 in those same environments. I don't rightly know why his got outpaced, considering his build was a month old at the time. Because of this, my next GPU is an Arc BiFrost A770; I've had nothing but good experiences with Arc.
The A310 and A380 are both pretty good for media servers or boxes where you need a GPU but don't need a ton of GPU horsepower. Back before iGPUs, office desktops used to come with low-end Quadro cards, for example, and the Xeon E-2400s sometimes get used for workstations and don't have an iGPU.
Thanks for the info! Good effort. It is difficult for me to use the ranked card performance statistics, though, when the charts have mistakes, like the exact same card appearing twice in a list with different performance numbers (and also a few mistakes in the script, I think). Also, the info with old cards at the top of the list has to be obsolete data; a 2060 is not better than a 4080 in Blender. Trying to make a purchase decision on confusing info is not so easy. =/ Thanks for the good info otherwise, though. Good topic!
Linux is a platform where they really could shine, if only Intel got the hang of it. They need to stop this gaming nonsense and make things for creators. Sure, they have those ACL cards to give more performance for render speeds, but that thing is for movie makers, not "creators". We need working profiles for these Arc GPUs for Kdenlive.
The support depends on the motherboard. But since Arc cards rely a lot on the motherboard's Resizable BAR feature, I don't think Arc is the one for your build. If you are planning to change the build, though, you can buy it now and then do the upgrade. Keep in mind you lose at least 20–30% performance without Resizable BAR.
@@TonyRush21 The 10th gen is quite new and it should work fine with Resizable BAR enabled, but I would go with an 11th+ gen CPU in order to take advantage of the Hyper-Encode feature, which utilizes both the iGPU's Quick Sync and the Arc's Quick Sync to render (export) faster, which it does (tried it).
My Arc A770 cannot compete with my RTX 3090 when it comes to Super Scale upscaling in DaVinci Resolve. It will take an hour or more to do what the RTX 3090 does in minutes.
Is there any benefit from the 16GB of VRAM on the Arc A770 in DaVinci Resolve, or is it not worth it? I can't choose between the RTX 4060 and the Arc A770 (the higher models are out of my budget) and I plan on working in DaVinci and Adobe Premiere. I could use some advice!
Why are you constantly talking about the ASRock Arc A770 when the Sparkle Arc A750 is always faster AND CHEAPER? Ok, while I was writing this I saw the "sponsored" message. Kind of sad, as one could save 100 bucks if one would read and not just listen.
They're too slow, unless you're a budget gamer and don't mind some games not running, high idle power usage and inconsistent performance vs the competition. I've used an A770 and an A750. 1440p in newer titles? Good luck with that if you want a high frame rate. They're a 1080p GPU, and low settings for the majority of demanding recently released games. Older games, yeah, sure, go ahead.
I run 1440p medium settings in newer games no problem. Bought a Sparkle A770 16GB and the only issue is the stock fan control is set to 50C before the fans come on, but you can change that with Arc Control. But ya, NVIDIA is the way to go if you want max FPS, just the way it is for now. BTW I run the card at stock settings; you can change them in Arc Control but I've never needed to.
Why are you showing $ dollars? It's kind of pointless for a channel labelled as United Kingdom. You cannot get that card in the UK for anywhere near 300 US dollars converted to pounds, same as why I can't get this card for €300...
Yes, of course. Put the T1000 in the first slot as the main display GPU (to get 10-bit color) and the A380 for the Quick Sync HW acceleration; it does miracles. In Resolve you can set the A380 to handle all the decoding in the timeline, or the AV1 encoding during export.
Disclaimer: Sponsored by ASRock - What does that mean?👇
Friends from ASRock reached out to sponsor a video about their Arc GPUs for gaming etc...
I said: "I'm not talking about gaming, but I could talk about/test what they're like for creators." They said: "Go for it."
I did my independent testing (like with all GPUs) and shared the results. ASRock did preview the video before it was published, and the only thing they asked to change was: "Can you put the Intel Arc [TM] logo in the top left corner please".
What I'm trying to say is they didn't tell me what to say; in fact it was the other way round, I said why don't we talk about X, Y & Z. But I did get paid for focusing on ASRock Arc GPUs and not mentioning or showing [but still talking about] other competitor brands in the video.
Wanted to be upfront about it, coz I know how 'sponsored videos' can come across, and this was a collaboration with very 'loose' sponsorship guidelines. 😊
Thank you for the honest review. I find a lot of the sponsored channels leave the most important things out to shift the blame or just to make their sponsored product look good. 99% seem to fall into misinformation when something is left out on purpose. Mind you, there are some channels that will give you the truth no matter what. But not many. One point I think you missed is Intel Application Optimization (APO), which is upcoming and should get more attention, because when Arc is paired with a high-end Intel CPU there is potential for a lot more in the near future. Also, I think a lot is still on the table for Intel pairing, as we're starting to see a gap in performance between AMD and Intel CPUs.
I built a computer for one of my friends and she complained that the card wasn't stable. We tried a bunch of things, like a clean Windows reinstall and different drivers; it always crashed randomly.
So I bought the Limited Edition A770 from her, she bought herself a 3070 Ti, and I put the Arc in a test bench. It was still crashing randomly.
So one day, out of the blue, I poured some courage into my coffee and decided to crack open the card.
The thermal paste looked factory-applied, but it hadn't touched the core at all.
I rebuilt the card, tested it, and threw it in my main rig to do a 30-day Intel challenge.
Rock solid since then. I hope I've helped someone.
The one issue with Arc cards that no one is pointing out, which will matter to users with high-end monitors, is that they cannot output native 10-bit depth over DisplayPort.
Ah... I didn't know that. I have three LG 32QN650-B 32-inch QHD monitors, which are 2560 x 1440. I don't think they qualify as "high end", but you've given me something to think about.
@@nicktayloriv310 Since you have a 10-bit panel, you wouldn't be able to send a native 10-bit signal to your display, so you'd have to rely on turning on HDR in Windows, which tends to look like crap. I built my PC mainly for photo work, so I have a 4K color-calibrated monitor, and not being able to send native 10-bit to my display defeated the purpose of having such an expensive display in the first place. I sent my A770 back and ended up getting a refurbished RTX 3090 from Micro Center for $600.
What's so dumb from Intel is that they advertise DP 2.0 ports but then don't support 10-bit, which is so backwards.
@@GustavoSanchez64 try with 4:2:0 :)
This is why my setup will be an A750 + 4070 Ti Super. I should check if the monitor I got can do 10-bit lol; I didn't check for that, I focused on color accuracy, PPI, QHD, 27". Had a rough time deciding between the HP Z and the ASUS ProArt (the ruler is interesting, but I don't do as much print anymore). My OCD made me choose the HP Z because my laptop is HP, though the mobo I bought is ProArt.
@GustavoSanchez64 My Asus 4080 Super arrived yesterday. Started to get a 4070 Ti Super but changed my mind at the last minute. $999 direct from Asus.
The most interesting thing about my A770 phantom lately is how often I simply forget I'm running it. It works just fine in everything that I tried. Hopefully Intel will reward their early adopters with drivers and improvements for many years.
Thanks for your transparency in the first comment. 👍🏻
It's crazy that Rob Dyrdek got into YT computer vids. Guy does it all!
Lol
I have subscribed and followed you since my first viewing of an earlier video... the $600 Apple wheels, which had just come out at that time. You made a video introducing yourself and what you could do. You put together a computer for the same price as the Apple wheels. Impressive.
You lose me when you go the 'expensive' route... but you get me back when you include my price range. "Budget friendly." The reality of a money situation is sometimes a rude awakening. You want the Best, Fastest, Greatest... and in reality... you cannot even save $100. So... please keep up with what you are doing! Keep both, though... your high-end money builds with the budget ones... I enjoy both!
Thanks for the comment!
I really wanted this to work for me but have had to give up. I primarily work in Premiere Pro. I've got a Gigabyte Z790 motherboard, an i5-14600K CPU, all-SSD storage, 64GB of DDR4 RAM and an ASRock Phantom Gaming Arc A770 16GB. Since around March, every time there is either a Premiere Pro upgrade or an Arc driver upgrade, the system goes to crap... and it is always something different. Honestly, I don't know what will work from one day to the next. I've tried so many different things. I'll appear to fix it, but then something new will happen. Plus, even after completely rebuilding my machine from scratch (fresh Windows installation, clean C: drive), the regular Intel Arc installer doesn't get past the start screen. The only way to upgrade the driver is by opening the zip file and doing it manually, and even then there is no guarantee it will work... and yes, I've used DDU and started in Safe Mode. I've literally lost days of creative work due to these issues. Yesterday was the last straw and I pulled the trigger on an Asus ProArt RTX 4070 Ti Super. Sorry Intel Arc, I wanted to love you but you just couldn't be trusted.
Thank you very much for your report. I was going to buy an A770, but after your report I'm going for the RTX 4060 Ti
@@agenciatuttimarketing7937 thanks for the response. Stretch to the 4070 Super if you can.
Idk, maybe he has a broken GPU, because mine has worked totally without errors for over a year 🤷‍♂️@@agenciatuttimarketing7937
I was deciding between the RTX 3050 6GB and the Intel A750. Thanks for your POV; as a budget beginner, safety & reliability are also my concern.
Hope that DaVinci Resolve will update their app to take full advantage of these beasts so we are not compelled to use Premiere Pro when we have these cards.
They likely won't do that; they've even dropped some perfectly good AMD GPUs from the Linux version of the product. About Windows, I don't know :)
@@survivor303 Linux has no licences for HEVC and many other codecs, while on windows you can purchase or install them.
DaVinci Resolve will take some time but will support the Intel media encoder/decoder eventually🤞
The Arc 770 is honestly a good card. They've come a looooong way with the drivers, and you can find one for like $325 online. The A310 and A380 are great too just for work or old PCs that just need a little boost to be functional again. And those are only like $99!
I got an Arc A380, it has worked great so far for games like MK1, Re4R and Silent Hill.
I think the most optimal budget setup would be a dual card setup with an AMD card for gaming and then an ARC for editing if that was what you were looking for.
I tried it on the threadripper build, didn't work so well...
@@theTechNotice Did you make a YouTube video on this dual card experiment?
How's buying two cards "on a budget"? And which two were you thinking of?
@@TC-hl1ws Yes he did. Put Tech Notice Threadripper dual GPU in the search bar and... there it is.
I Built the Ultimate ALL AMD HEDT PC build in 2023... | Threadripper 7980x + RX7900XTX + ARC A380 is the title of the video.
To be honest, it's good to see more competition. Prices are crazy for anything decent from the big two GPU makers. Great review, thanks.
I'm a newb hobbyist video editor and maybe streamer, and I'm using an A770 paired with a Ryzen 5600X (which I already had) to learn. The A770 is more than good enough for both use cases, and I suspect it will keep getting better. My most recent
Ofc you need 16 gb for 4k
Hey, how is the Arc A770 in productivity work like 3D animation, game dev (Unreal Engine 5), video editing and AI stuff compared to the RTX 4070? Like, the 4070 might be better, but which one would be better value for money?
After my intensive study of current GPUs, the 4070 Super at $600 is the best purchase for overall performance/price. That is, if you want optimized frame gen. If you don't care about AI upscaling, go AMD.
I'm starting to think I should reimburse you because you're saving me a lot
I have had the A750; it didn't work in Premiere Pro because I used a 5800X CPU! Getting the A580 for a 12700K build for a friend of mine! Can't wait to see the results in 3 months' time, since I've only got the PSU, RAM and a cabinet for now!
Intel GPU doesn't work well with a AMD CPU???
@@TC-hl1ws It needs CUDA cores; I tried OpenCL but it would take sooo long to render! It works with AMD, but it wasn't good in games back then; I couldn't play properly with a 5800X and a 4K monitor! Some wrote that the A750 worked nicely on an X570 mobo and a 5950X!
@@TC-hl1ws My A750 LE has been working fine for a year now with my Ryzen 5600 on a B350 MoBo.
@@magnusnilsson9792 For Premiere Pro too? Or DaVinci?
@@TC-hl1ws Some AMD CPUs won't work with the Arc series; it's listed on Intel's site which CPUs work with the Arc series.
As an encoder for Plex/Jellyfin/Emby there is no better value. Ordered my first Arc (A380). Can't wait to play with it. If only the higher models didn't have such a high TDP.
When does the next-generation card come out? If they release a new card in the $500–700 price range, that would be cool for content creators.
Thanks for the review,
but do you recommend it regardless of the price?
Not sure what you mean...
@@theTechNotice What I mean is
Is it recommended more than high-end graphics cards for video editing?
@@mo79ch Get the A770 16GB, the VRAM goes a long way for video editing, especially in Davinci Resolve
@@mo79ch he made a whole video about why and when he would recommend these cards... your question should have been answered by the video.
Did you watch it?
Intel should be able to improve a lot in the next GPU releases, and I sure hope they do, especially in the 3D department for my particular use case.
Even though I'm Team DaVinci Resolve, I think I'm going to buy an A770 and just sit on it. I'm holding out until the new 15th-gen Intel CPUs come out and will get one with onboard graphics to help with the editing and encoding. I wanted AMD to address the onboard CPU issues that us editors face (the gamers, not so much) with some software updates or something, but it looks as if it's not going to happen, as they've announced they're going to pull back for a while to concentrate on other endeavors. I have a 4070 Ti Super now (got it 2 weeks ago, replaced my 3080 Ti with it) and a Ryzen 9 3900X that I'm going to run until things sort themselves out. Plus I've been tinkering with Blender for a few weeks, as I want to incorporate 3D into my platform. For editing, I didn't see going up to the AM5 platform helping any, so I'll wait for Intel. Sometimes I stream from this same machine as well.
So, you're going to replace the 4070 Ti with an A770? You know you really don't need to do that ;)
@@survivor303 That's not what I said..
Availability is the biggest factor. If I only have $300 at the one time and I won't have that money again for the rest of the year, I'll buy what's there. If the shop doesn't stock this, I can't even consider it
It's weird when Intel offers more value to users than the other two tech giants. Let's face it, the PC industry is overpriced for budget-oriented people. Even the 13th/14th gen i5s are a great deal for the performance. Thanks for the review.
Thank you for this video. I now feel I can replace my 5700 XT with the ASRock Phantom Gaming A770 OC, and feel confident that with my 5900X CPU I can handle my transcoding and make AI work for me, my way.
1:35 In the last video you did on GPUs, YOU made me decide to build with 2 GPUs: an Arc (I think I've settled on the A750 LE) for my Photoshop & Premiere Pro workload (aka daily driver), and when it's time to hit some UE5, SD/ML, or gaming... it's the 4070 Ti Super (either the ASUS ProArt or MSI Expert).
If you are rich, sure.
@mrbabyhugh how do you connect monitor(s) to both cards?
thanks sir
I just wanna know: can I choose the A770 LE over the 4060?
I mean, I'm learning editing and do a little motion graphics. I'm so confused because it's huge money for me (as an investment).
Get the A770 LE 16GB; the VRAM will help a lot and give you headroom for experiments. 8GB in 2024 is a no-go for CG graphics anymore.
@@xarisathos Thanks a lot!
Since I can't wait for the upcoming Battlemage GPUs, I guess for right now the Arc A770 is my best option on a budget.
Thanks again 🌸
Excellent review. I'm looking for something that will fit in my MS-01, which is part of my portable livestreaming rig. Looks like I'll have to hack the coolers to make it fit. Same for the RTX A2000 (much more expensive), which has the realtime AI video upscaler (so I can upscale HD sources to UHD livestreams).
nice to see someone trying to bring back the 80's shoulder pads
I thought Quick Sync was a feature built into the CPU? You're saying it's built into the Arc card now?
It's built into Intel's media engines.
QuickSync is part of the media engine that's in the integrated graphics. Arc GPUs are an evolution of the integrated graphics, so they inherit the evolved media engine.
@@ryanspencer6778 So one could pair an AMD CPU with an Intel GPU (Arc or Battlemage) and you would get QuickSync without needing to have an Intel CPU?
@@squatch545 yes
@@ryanspencer6778 Good to know, thanks.
🟨 Too late for a question?? I watched all of your ARC A770 videos and just bought one! 😀However, I used your score charts to decide, but I realize now that I missed that your test bench had an Intel iGPU as well (HyperSync boost?). I'm on AMD Ryzen. My A770 arrives in 2 days. 📦 Am I returning it and shelling out $180 more for a 7800 XT instead? I'm all h.264 and h.265. No RED, no RAW, no 3D. 🎬 Thanks!! Love your channel! ⭐⭐⭐⭐⭐
Nope, when testing GPUs I turn off the iGPU on the CPU!
@@theTechNotice- Woohoo! 😃 Great news! Thanks for the clarification! Can't wait to rerun the Puget benchmarks when I install the A770 to see the difference! Thanks for the awesome videos!! 😃
@@theTechNotice I entered a longer comment on another video like this one but it keeps erasing. I don't know where it went, but the bottom line is I'm not having great luck here. Everything from bios, to chipset, to drivers, all updated. Beforehand, I tested with my RTX 2060 KO Ultra. Then I installed and tested the A770. (how do you post a link on here?) Trying to post a screenshot of the side-by-side comparison somehow.
I have this card. So far I love it! I don't use the RGB lighting (turned it off), but I have to say, in playing with it... I very nearly kept it on... There is this one setting where you can change it to red and, take your pick, it either looks like KITT from Knight Rider or the Cylons from Battlestar Galactica, the original TV show... lol. That was cool, and honestly I will probably turn it on a few times just to show people how cool that is.
One complaint I have: I had to turn off some sort of on-screen overlay thing by editing the Windows registry. It doesn't seem to be a common problem, but even after DDU after DDU after DDU I couldn't figure it out. YouTube was the only program affected, but I love YouTube, so it did cause me to pull out some hairs for about 2 days until I stumbled on a post about someone having the same issue. Not sure how common it is (seems pretty rare), but turning that setting off in the registry by adding it in with some numbers like 00000005 fixed it. So, happy camper now :)
If someone could add that setting to the GPU control panel app, that would be nice, something like "If YouTube is flickering for you, uncheck this box", as I read that I will need to edit that registry setting every time Windows updates.
By this card, I mean the ASRock Phantom Gaming OC edition of the A770.
I got one pretty much just for video transcoding. While they're not the best for gaming, for H.265 or AV1 encoding Intel's chips render the highest quality video at the smallest file sizes, usually beating the software/CPU encoder. For live transcoding on my Jellyfin setup and archival encoding for my media library, the A310 has been great. That's on a separate computer, though; I use an RX 6650 XT for actual gaming.
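For anyone curious what that kind of archival encode looks like in practice, here's a minimal sketch that builds an ffmpeg command using Intel's Quick Sync AV1 encoder (`av1_qsv`). The quality value and file names are hypothetical examples, and it assumes an ffmpeg build with QSV support; adjust for your own library.

```python
import shlex

def qsv_av1_cmd(src: str, dst: str, quality: int = 28) -> list[str]:
    """Build an ffmpeg command line for a hardware-accelerated
    AV1 archival encode via Intel Quick Sync (Arc media engine)."""
    return [
        "ffmpeg",
        "-hwaccel", "qsv",                # hardware-accelerated decode
        "-i", src,
        "-c:v", "av1_qsv",                # Quick Sync AV1 encoder
        "-global_quality", str(quality),  # lower = better quality, bigger file
        "-c:a", "copy",                   # keep the original audio untouched
        dst,
    ]

# Print the command instead of running it, so the sketch works anywhere;
# on a machine with an Arc card you'd pass the list to subprocess.run().
print(shlex.join(qsv_av1_cmd("episode.mkv", "episode_av1.mkv")))
```

To actually transcode, run the printed command (or `subprocess.run(cmd, check=True)`) on a box where the Arc GPU and QSV-enabled ffmpeg are available.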
One day we might have a PC with nVidia CPU and Intel GPU 🙂
Does the integrated GPU in Intel CPUs add performance to an Intel dedicated GPU, or is it disabled?
Well, there is the Hyper Encode feature it would add, but it's a very specific thing. In most cases I'd say no additional performance, just an extra GPU to troubleshoot :)
@@theTechNotice thank you
Thank you my friend.
Would be VALUABLE to see SolidWorks results!!
I use DaVinci Resolve. Should I get the 7900 XT Ultra or the 4070 Super? I mainly edit Sony FX3, FX30, and FX6 codecs (the XT is $125 more).
Hello. Maybe my comment will be outdated, but I really need help. Right now I'm looking for a dedicated streaming PC using AV1, and I have two options at almost the same price. The first is to build a PC with 16GB RAM, an i5-12400F, and an Arc A380. The second is a Lenovo LOQ 15IAX9I laptop with an i5-12450HX, 8GB RAM, and an Intel Arc A530M 4GB. The RAM can be upgraded on the laptop; it has 2 slots. The only reason I haven't chosen the laptop already is that I'm not sure whether the A530M with 4GB is enough for 1080p/60 or 1440p/60 AV1 streaming. Do you have any thoughts on this? The laptop is preferred because I'm always on the go.
I almost pulled the trigger on an Intel Arc A770 this past week whilst Amazon had it for £298 but then the price jumped all the way up to £380! 😮🤬
Waiting for it to drop sub-£300 again…
wait for battlemage bro
Yup, wait for Battlemage; either it'll be better for a reasonable price or the A770 will get cheaper.
Currently £220 at Currys!
@@scorpionfpv6412 - Yes, an absolute *STEAL!*
That is why I don't wait anymore when there is a good deal. Sure, you could always hold out for more. Or not...
Sooo, for someone who only uses DaVinci Resolve Studio and Lightroom/Photoshop and has the budget for an 8GB RTX 4060: would the Arc A580 8GB or A750 8GB be a good choice? The rest of my system is an i7-8700K and 32GB of DDR4 on an Asus Z390-A. No gaming and no 3D work.
The Arc A770 could probably be a good option due to its 16GB of VRAM, since from what I hear DaVinci uses lots of VRAM.
Hoping Battlemage grows and the 15th gen Intel CPUs will be a great combo 😊
I've smoked benchmarks numerous times and my 1% lows increased. 10/10, would recommend.
I have the ASRock B450M Steel Legend motherboard. I was planning to upgrade the CPU to a 5700X and the GPU to an Arc A770. Will I have access to the Resizable BAR feature?
Would this be a good option for a dual-card setup with the i9-14900K and the ProArt Z790?
Nope, don't recommend dual arc :)
Thanks for the quick reply 😁
Zero benefit
So if they don't upgrade the dual media engines inside the GPU, the performance in Premiere Pro would be the same on a newer GPU? I'm waiting for a newer Intel GPU, but I don't know if I should, as there are no rumours about it.
Computex is on the 4th, hold out (:
Can I get one of these on my old AMD build and still get the advantage of Quick Sync for Premiere Pro?
Watching this on my new ASRock X670E AMD build with an ASRock Intel Arc A770 16GB Challenger... love it! Set up for video editing...
I wonder how good Battlemage will be, especially for Unreal Engine.
Which is best for a trading computer running software like MT5 with multiple chart windows?
Some people say the A770 crashes in Lightroom's AI Denoise. Others report poor performance in the same task (AI Denoise).
Can you tell us more about this? Thank you!
So for now the main issue isn't the card, it's the software? It sucks that NVIDIA has had the market share for so long that all the software is optimized for NVIDIA GPUs.
I'm not sure whether to purchase now or wait for the holidays. Rumors are saying Battlemage will drop in the fall/winter, and that a B580 with 12GB of VRAM would be equivalent to an A770, but at a price of $200 USD and with lower power consumption.
Nobody believes that $200 :D Perhaps that's Intel's list price, but you're not going to see it with 12GB of RAM on the shelves :)
@@survivor303 Maybe, but you have to remember that Intel is new to the GPU market, and if they want to gain market share, they have to be competitive against Nvidia and AMD. Remember, nobody believed a 6-core 12-thread CPU would be half the price of Intel's i7-6800K, but then Ryzen happened.
Let's be honest. People aren't going to buy ARC. It's going to be really hard to get them to leave Nvidia or AMD. Even if people avoid the 50 series with subpar specs, people are likely going to buy the 30 or 40 series for the next upcoming years before ever considering getting an ARC.
If Intel has any hope of making any sales, they're going to have to release their GPUs at SIGNIFICANTLY lower prices than their established competitors.
I'd like to upgrade my RX 6600, but I don't know what I want to go with, lol. I wouldn't mind an Arc for my Plex/HDHomeRun TV viewing acceleration.
I would love it if you made your bar charts mobile viewer friendly. Having so many arranged that way and size is impossible to see even on an iPhone pro max. Check out how max tech does it. Very mobile friendly.
If you use the YouTube app, just pinch to zoom :)
Yeah, but Max Tech has only 2-3 items in the chart, not ALL the GPUs :) I'd love any advice on how to do it with a 20-30 item chart :)
Some numbers in the DaVinci table (13:19) are off; look at the Acer Arc A770: 2171/3363 = 0.645.
In every video comparing Intel Arc to NVIDIA, the Arc cards just look so much sharper than the NVIDIA cards.
My buddy has an RTX 4060 and a Ryzen 7 7700 paired with 16GB DDR5, while I had an Arc A580 at the time. We ran some 7 Days to Die, and he had significant FPS drops in the same scenes, big cities and all that. He would bottom out at 90, sometimes even 46, while my A580 sat locked at 110 in those same environments. I don't rightly know why it got outpaced, considering his build was a month old at the time. Because of this, my next GPU is an Arc BiFrost A770. I've had nothing but good experiences with Arc.
Who is the Arc A310 for, by the way? I'm not interested in gaming performance.
The A310 and A380 are both pretty good for media servers or boxes where you need a GPU but don't need a ton of GPU horsepower. Before iGPUs, office desktops used to come with low-end Quadro cards, for example, and the E-2400 Xeons sometimes get used for workstations and don't have an iGPU.
@@nadtz Thank you for your reply! :)
The A750 is a perfect choice.
You really need to update the Intel graphics drivers and test them again.
I'm building an extra machine just for fun and to play with Arc. Just for gaming; I'm not a creative.
Thanks for the info! Good effort. It's difficult for me to use the ranked card performance statistics, though, with charts that have mistakes, like the exact same card appearing twice in a list with different performance numbers (and also a few mistakes in the script, I think). Also, the info with old cards at the top of the list has to be obsolete data; a 2060 is not better than a 4080 in Blender. Trying to make a purchase decision on the confusing info you list and talk about is not so easy. =/ Thanks for the good info otherwise, though. Good topic!
Linux is a platform where they really can shine, if only Intel gets the hang of it. They need to stop this gaming nonsense and make things for creators. Sure, they have those ACL cards to give more performance for render speeds, but that thing is for movie makers, not "creators". We need working profiles for these Arc GPUs in Kdenlive.
Why is the A580 outperforming the A770? It doesn't make much sense, does it?
It's quite good, eh... getting better in terms of drivers, eh :D
Question: will an Intel i7-3770K CPU support the Intel Arc A580?
Support depends on the motherboard. But since Arc cards rely a lot on the motherboard's Resizable BAR feature, I don't think Arc is the one for your build. If you're planning to change the build, though, you can buy it now and then do the upgrade. Just know you'll get at least 20-30% less performance without Resizable BAR.
No, it doesn't work. I tried an A380 and an A770 to revive the system, but it doesn't boot at all; it requires a fairly modern CPU/motherboard.
@@xarisathos I figured as much. I hope the intel i7 10th Gen works. thank you.
@@TonyRush21 The 10th gen is quite new and should work fine with Resizable BAR enabled, but I would go with an 11th+ gen CPU in order to take advantage of the Hyper-Encode feature, which utilizes both the iGPU's Quick Sync and the Arc's Quick Sync to render (export) faster, which it does (I tried it).
@@xarisathos Thank you very much. I will definitely take that into consideration.
Does Intel have a low-latency mode like NVIDIA Reflex?
If AMD can't get their act together and Intel does start making better GPUs, I might consider switching... lol
Can someone tell me how the Arc A770 is for SolidWorks?
Damn the intro is 🔥
;)
So a low-profile GPU beat the RTX 4090 in one test! The irony.
My Arc A770 cannot compete with my RTX 3090 when it comes to superscaling in DaVinci Resolve. It will take an hour or more to do what the RTX 3090 does in minutes.
Is there any benefit from the 16GB of VRAM on the Arc A770 in DaVinci Resolve, or is it not worth it? I can't choose between the RTX 4060 and the Arc A770 (the higher models are out of my budget), and I plan on working in DaVinci and Adobe Premiere. So I could use some advice!
Why are you constantly talking about the ASRock Arc A770 when the Sparkle Arc A750 is always faster AND CHEAPER?
OK, while I was writing this I saw the "sponsored" message.
Kind of sad, as one could save 100 bucks if one would read and not just listen.
Just please turn down that intro volume... holy sh... ;D
The volume of this video is too loud '__') no?
Pretty sure other videos are not this loud.
I like my A750.
The A770 has better RT performance than the RX 6650 XT.
Too bad the price is not good where I live.
Yes, now, better support on Linux please.
I'm on Mint, Intel GPU support on the latest kernel 6.5 is solid.
This guy is talking out his ass when saying it's sometimes better than a 4090. Yeah, no, it's not.
They're too slow, unless you're a budget gamer and don't mind issues with some games not running, high idle power usage, and inconsistent performance vs the competition. I've used an A770 and an A750. 1440p in newer titles, good luck with that if you want a high frame rate. They're a 1080p GPU at low settings for the majority of demanding, recently released games. Older games, yeah, sure, go ahead.
I run 1440p medium settings in newer games no problem. I bought a Sparkle A770 16GB, and the only issue is that the stock fan control is set to 50C before the fans come on, but you can change that with Arc Control. But yeah, NVIDIA is the way to go if you want max FPS; that's just the way it is for now. BTW, I run the card at stock settings; you can change them in Arc Control, but I've never needed to.
@@rotnbazturd7569 Yeah, on newer UE5 engine games?
Aren't you on the wrong channel?
Arc is still a better choice than Radeon. Whatever GPU you buy, don't buy AMD; even the Chinese GPUs are a better choice.
Intel is catching up to Radeon.
no games
Why are you showing dollars? It's kind of pointless for a channel labelled as United Kingdom. You cannot get that card in the UK for anywhere near $300 converted to pounds, same as why I can't get this card for €300...
Nice
intel's leadership is filled with DEI hires, hence the current state of intel
uh lol no
No gaming fps numbers = dislike
Plenty of them out there, check out GN ;)
deus vult
Is it possible to use two GPUs if I have 2 PCIe slots... an Arc A380 6GB GDDR6 + NVIDIA T1000 8GB together???
Yes, of course. Put the T1000 in the first slot as the main display GPU (to get 10-bit color) and the A380 for Quick Sync HW acceleration; it does miracles. In Resolve you can set the A380 to handle all the decoding in the timeline or the AV1 encoding during export.
No thanks😊😊😊😊
It seems like a sponsored video