This took like 6 hours to make, so please don't skip straight to the averages!
If you'd like to support the channel and help fund new game releases and future graphics cards, I would greatly appreciate it if you visit the link below.
Buy me a Coffee! - buymeacoffee.com/intelarctesting
Thanks for the video, but you should test with a better Intel CPU! I guess the A770 is bottlenecked by the CPU. Also, is the GPU running PCIe 4.0 x16?
@@noway5013 The whole reason I test with a 5600 is that Arc cards have some trouble with older APIs and CPU driver overhead (Death Stranding). I could just brute force past all the driver issues with my 7800X3D, but that wouldn't be realistic for most people out there buying these. I'm using a B450, so it's PCIe 3.0 x16.
I already subscribed to your channel, bro. Good videos. I think the A750 is the same as the A580.
That is what smart creators do, not just upload averages.
@@IntelArcTesting Are you using PBO for better boosts with the 5600? I also have a 5600 with a PBO curve offset of -28 on all cores, CPU and motherboard phases set to Extreme in the BIOS, and the power limit at 100W. I can sustain 4550 MHz on all cores under full load, and in games some cores go up to 4750 MHz (stock is 4450, I believe), so you can easily get 5-10% more raw performance. In gaming it often doesn't make much difference, but it's still useful; it's basically the easiest way to squeeze the most out of these Ryzen CPUs right now.
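As a rough sanity check on those numbers, here is a minimal back-of-the-envelope sketch of the clock uplift the quoted figures imply (the 4450 MHz "stock" value is the commenter's own estimate, not an official spec):

```python
# Clock-uplift arithmetic for the Ryzen 5 5600 PBO figures quoted above.
# The 4450 MHz "stock" all-core number is the commenter's estimate, not a spec.
stock_all_core = 4450  # MHz, claimed stock sustained all-core clock
pbo_all_core = 4550    # MHz, claimed with PBO -28 curve offset and a 100 W limit
pbo_peak = 4750        # MHz, claimed peak boost on favored cores

all_core_gain = (pbo_all_core - stock_all_core) / stock_all_core * 100
peak_gain = (pbo_peak - stock_all_core) / stock_all_core * 100
print(f"All-core clock gain: {all_core_gain:.1f}%")   # ~2.2%
print(f"Peak boost clock gain: {peak_gain:.1f}%")     # ~6.7%
```

So the quoted 5-10% figure is closer to what the peak boost clocks suggest; as the commenter notes, the gain you actually see in games is usually smaller.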
Retesting the Arc A770 and the Arc A750 a few months later showed massive improvements. The A750 went from 56 frames per second (fps) in Cyberpunk 2077 at 1080p to 76 fps, outperforming Nvidia’s RTX 3060 Ti.
How can you say that? This video is from a month ago, so everybody can safely assume it's recorded on the newest drivers. Personally I'm surprised there's such a disparity between the 3060 and 770 in certain games. I know, I know, DirectX backwards compatibility is still behind, but I'm hoping Intel will keep updating.
Yeah, plus XeSS is actually good. Can't wait to find out what Intel cooks up next.
Yeah. I think so
Love how the Arc A770 was getting 51 fps in Death Stranding at 1080p and 52 fps at 1440p 😂
It's great how you test these cards with a normal rig. Really gives an interesting view of how these cards compare for those of us not running crazy fast PCs.
I think other channels have reported very low fps in Starfield, Death Stranding, and other games, so it's not just your system. People who suggest switching to a 7800X3D don't get it; it's such an unrealistic setup to pair that CPU with midrange GPUs like the Arcs.
The A770 has so much potential. 16GB of VRAM and a 256-bit bus mean it has more untapped power. It just needs more driver updates.
After another half a year the power is still untapped xD. Something is wrong with these cards, and that's it.
Unfortunately, the Ryzen 5 5600 isn't a very efficient processor. In one of my PCs, I have the i5-13600K with an Arc 750, and the results are much higher.
In Starfield, I'm getting well over 50fps at 1080p.
Ryzen is more efficient than Intel, but this is not a CPU problem, because as you can see, a lot of the CPU-bound scenarios you get on Arc completely disappear when the same CPU is paired with an RTX 3060; they're all driver-related issues. I'm one of the very few people who benchmarks with a budget system. I could have used my 7800X3D, and that would probably have pushed more games in favor of Arc, but the people buying budget GPUs own budget systems.
I'm getting 50 fps at 1440p on the A750. It must be your AMD CPU.
Yes, if you want better gaming performance, the newer the generation the better, but be careful with Intel's current problem. I bought the 12600KF and it runs wonderfully. Once Intel fixes that current problem with the 13th and 14th gen parts, I'll buy another, more powerful Intel CPU.
@@dbod4866 Blame it on the AMD CPU all you want, but the 3060 doesn't face the same issue with the same CPU; it's obviously driver related.
@@IntelArcTesting Is it more efficient than the 13600K? I highly doubt that. By what metric? Is it because some people say that, or because of the lower power consumption on paper? They don't even come close.
One of my fav channels here.
It's funny how the 750 and 770 are so close in performance.
You da ARC man! Thanks as always for these vids.
I'm very glad that Arc is making great progress compared to where it was last year, since back then it could barely even run enhanced Xbox 360-era games above 30fps in the case of Halo MCC, and Halo 2 Anniversary was a goddamn unplayable slideshow. It at least seems to be slowly getting to a point where it is on par with or even better than Nvidia's low-to-mid-range cards in certain titles, which truthfully you would expect or hope for by now. But as shown by games like Starfield and the PC ports of PlayStation games, they still have a ways to go.
It's hard to tell how much of this is down to driver optimizations versus the strength of the cards, since there's no "high end" Arc card beyond the A770, which seems to land anywhere from 3060/3060 Ti levels of performance, to nearly 3070 levels in a rough handful of cases, to well below that.
Thanks to your videos I made up my mind; my Arc A750 Predator BiFrost arrives on the 30th of this month. I don't know if it's the best version of the card, but I knew Arc was going to make a comeback.
I bought the Limited Edition.
Yes, it's the best one.
And take the opportunity to give it a decent overclock: 2700 MHz or more.
Don't be afraid, my A770 LE has been at 2750 since it came out ;-)
(I increased the power limit to 400W in case it needs it)
I bought the ASRock one.
What power supply do you recommend?
🙋♂ I wonder how it would work on an LGA 1700 platform, with an Intel i5 processor 🤔
I'm asking because I play Red Dead Redemption 2 at 4K with some settings tweaked here and there and get over 60 fps using an Arc A770 paired with an i9-12900 and 32GB of DDR4.
Don’t own a 4K monitor but that sounds achievable
I'm thinking I will buy a Ryzen 7 7700 CPU to pair with an Arc A770. Can you make a video on how it performs, please, or tell me which CPU I should choose instead?
Honestly if you are going for AM5 just get the 7800X3D, it’s truly a beast. Have it paired with a 3080 and I’ve never had more consistent fps ever.
Ryzen 9000 series will release this month 👍🏻
Hogwarts Legacy just looks really different on the 3060; it's much sharper and more detailed on the A750 and A770.
WTF is going on? Why is the Arc A770 performing worse than the A750 in GTA V, God of War, and The Last of Us?
Retested multiple times and got the same results every time. I guess because the A770 is more CPU bound than the A750, it drags the performance down a bit. The A750 in GTA V at 1440p is actually better than at 1080p, probably for the same reason.
@@IntelArcTesting I'm currently using the 12700K and A770, and when I saw your results, I got a bit confused, hence why I asked. Thanks anyway.
@@IntelArcTesting My friend has a Ryzen 5600G. What GPU should he buy, the A750 or the A770?
@@rudrodhali1257 The 5600G is about the performance of a Ryzen 5 3600 or 5500. I don't think you'll see much difference between the A750 and A770 on that CPU, so I would probably go for the A750, unless he needs the 16GB of the A770 and it's not significantly more expensive, like it is in my country.
@@IntelArcTesting He wants to play upcoming games at maxed settings, and no other GPU offers extra VRAM in this price segment in my country. More and more games are becoming VRAM hungry, and I think this trend will continue. I'm worried about the bottleneck. Is putting extra cash toward VRAM worth it?
I've had the Arc A750 for over a year, and their drivers have really improved. I hope this card actually ends up like an RTX 4070 in the future.
BRO DO YOU USE RESIZABLE BAR OR NOT?
Always. I also mention it in description.
Just out of curiosity, did these have Resizable BAR enabled in the BIOS?
Yes, of course. All my videos have rebar enabled, besides the one where I compared rebar on vs off.
@@IntelArcTesting I don't have any clue; this is the first time I have ever watched a video of yours...
@@Stenchy333 No worries. Thanks for watching! I did specify rebar was on at the start of the video with the specs, and I also mention it in the description.
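For anyone who wants to verify ReBAR themselves rather than rely on the overlay: GPU-Z shows a "Resizable BAR" field on Windows, and on Linux you can check the GPU's BAR size directly. A minimal sketch of the latter, assuming `lspci` is installed (with ReBAR active, a 16GB A770 typically exposes a prefetchable region around the full VRAM size instead of the legacy 256MB window):

```python
import re
import subprocess

# List each VGA device reported by `lspci -vv` with its prefetchable BAR sizes.
# A full-VRAM-sized BAR (e.g. [size=16G] on a 16GB card) indicates ReBAR is active.
out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
for block in out.split("\n\n"):
    if "VGA compatible controller" not in block:
        continue
    sizes = re.findall(r", prefetchable\) \[size=([0-9]+[KMGT]?)\]", block)
    print(block.splitlines()[0])
    print("  prefetchable BAR sizes:", ", ".join(sizes) if sizes else "none found")
```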
Very good work man.. Thanks for the video.
Hi, can you test The Last of Us with old and new drivers on the A750? It struggled a bit at launch.
How... is the Arc 750 doing better than the 770 in some benches?
I can only assume there's some misconfiguration in there.
18:00 the RTX behaving "lazy" again, weird.
THANK YOU FOR YOUR HARD WORK!!!!
Maybe I won the "silicon lottery", but my 750 does at least 40 fps at 1440p with all ray tracing enabled in Cyberpunk (minus the RTX-only stuff, of course).
Also, I have an i7-9700, so maybe Arc plays nicer with Intel than AMD?
It just dawned on me that it's probably because I use some XeSS.
Bro can you share your arc control settings plz
Stock
What drivers are installed for Arcs?
First line of the overlay is driver version
Do you think the performance would be better for the A750/A770 with an Intel CPU?
Yeah, ARC Cards do perform slightly better when paired with an Intel CPU, similar to how AMD Cards perform better when paired with an AMD CPU.
@@auritro3903 source?
11:54 Hmm, interesting. Why is the RTX acting "lazy" in this "old" game? I say it's acting lazy because it's the first title where it's running cooler and at much less power while giving much lower frames than the Arcs.
Please make sure Resizable BAR is enabled in the BIOS. The Intel Arcs aren't at 100% usage.
Rebar is enabled in all of my videos. The general public just doesn't realize Arc also has issues in newer DX12 games, not just DX11 or older. These results are 100% accurate and are what you're going to get with a 5600 or the Intel equivalent.
I watched the full video. It was amazing!
Awesome video, thanks for this! Could you do an Intel Arc ray tracing performance comparison? I heard Intel Arc GPUs are totally cracked at ray tracing; would love to see it.
I was doing fine with my RX 5700, but for fun I got a new A770 16GB for €230. We'll see further driver progress with time.
Can you test the Intel Arc A750 on RVC okada, please?
Can you test the A750 or A770 again with these games? Thanks, btw.
I'm planning to do an A580 vs A750 vs A770 comparison in 10 games at 1080p and 1440p when a new driver drops (so I don't risk a new driver dropping halfway through making the video and having to retest).
@@IntelArcTesting greatest intel arc testing channel that ever lived 🫂
The fact that the Arc A750 is cheaper than the RTX 3060 yet performs better in DX12 games...
I believe old games are still struggling, but I'm guessing this will be sorted out with new updates.
Great viddd !!
Excellent video, I watched all of them. Can you compare the 3060 vs the A750 using DXVK?
Nvidia would perform worse with DXVK. I've tried it on my 3080 a few times. Currently trying to sell a PC with the 3060 and already have a few interested buyers.
@@IntelArcTesting Do you think that if you pit the 3060 using the native API in games against the A750 using DXVK, the performance would be the same?
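For context on what such a comparison involves: DXVK translates D3D9/10/11 calls to Vulkan, and on Windows it is normally applied per game by dropping its replacement DLLs next to the game's executable. A minimal sketch, with hypothetical paths for the extracted DXVK release and the game folder:

```python
import shutil
from pathlib import Path

# Hypothetical paths; point these at the extracted DXVK release and the game folder.
dxvk_x64 = Path(r"C:\tools\dxvk-2.3\x64")
game_dir = Path(r"C:\Games\SomeDX11Game")

# DXVK's drop-in DLLs for a 64-bit D3D11 title; delete them later to go back to native D3D11.
for dll in ("d3d11.dll", "dxgi.dll"):
    shutil.copy2(dxvk_x64 / dll, game_dir / dll)
    print(f"installed {dll} into {game_dir}")
```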
The Ryzen 5700X3D is currently on sale on Amazon in the U.S. I picked one up for my wife's computer (she had a Ryzen 5500X) and the performance jump was amazing. Also, the 5700X3D doesn't get as hot as the 5800X3D, so you can run a pretty cheap air cooler on it.
I won't upgrade. When Battlemage comes, I'll just test using my main system, which has a 7800X3D.
*Can anyone tell me which would be best, the RX 6600 or the Arc A750?* I have a 12th gen i5.
The A750 would be the better choice imo. When working optimally the A750 is better than the RX 6600, but there are still issues not only with old games but also with new games like Starfield, so keep that in mind. Overall, I think in the majority of games you'll get a better experience on Arc. Also, XeSS looks much better than FSR.
@@IntelArcTesting Bro, do you have an RX 6600 card? If so, can you do a gaming test review? On YouTube there are almost no videos about the RX 6600 vs the Arc A750. There are some, but not with the latest Arc A750 drivers, so can you make one with the latest driver?
Because in our country both GPUs are the same price.
Bro, I think there's some problem with your cards, because on many trusted channels I have seen even an Arc A580 destroying the 3060 by 10-12 fps average in some games.
Might be cherry-picked benchmarks, and those people are probably using a €600 CPU that overcomes the CPU driver overhead Arc has in games.
@@IntelArcTesting Umm I think so, have you updated your drivers to the latest?
@@IntelArcTesting Just asking because I'm thinking of buying an Arc A580.
A580 destroying 3060?
Doubtful.
Were those games sponsored by Intel?
Anyway, such things can happen I guess, but I've seen the A580 trading blows with the RX 6600, which, outside of ray tracing, is behind the 3060 by 10-15% (or so).
@@GameslordXY Ahh sorry, not your problem actually; I forgot to write "in some games". Those guys tested the A580 on the latest driver updates from Intel, which is why it was fast. And actually, the reason I picked an A580 is those dual media encoders, which even beat the 4060 Ti and 4070 in encoding and decoding; you can search YouTube for it.
This testing is flawed for not pairing the Intel GPUs with Intel CPUs. The A750 and A770 perform way better when paired with Intel CPUs because they spread the workload and then combine to make the frames in the end. I know this might sound weird, but pair these GPUs with something like a 12400F and you'll see the difference yourself.
There have been people testing whether it makes a difference, and apart from 1 or 2 outliers like Ratchet & Clank, the difference is just the difference in CPU power itself. It doesn't matter.
You could have used the maxed-out preset on both GPUs, though. Why didn't you?
How come The Last of Us takes 5.5GB on the 3060 and 9.7GB on the 770?
That would have just crippled the A750 and turned it into a VRAM test.
@@franticgamer3804 because it has more vram, so it will allocate more.
Idk what to buy; the Arc A750 and the RX 6650 XT are the same price in Brazil. I think the 6650 XT is way better now, but the Arc hardware could be wayyyyy better in the future. But I don't want to wait like 1 or 2 years for that potential.
The 6650 XT is better in most cases and in older games. Arc might surpass it in the future, but don't expect that any time soon.
I bought an Intel Arc 750 😊😊😊😊😊
It's the best purchase I've made; I game a lot at 2K and 4K with it, paired with an Intel i5 12600KF.
Me too. 😉
Why is the temperature of the RTX 3060 always higher than the Arc GPUs?
It's a pretty budget cooler. The ASRock Challenger cooler is really good; I can overclock it quite a bit and it will stay under 80°C, and you can barely hear the fans.
Intel 1st gen cards are really impressive! Great video!
Considering that the 3060 is 330 euros and the A750 is 210, I think it's clear which one I'm getting as an upgrade.
All this time and it still can't beat a 3060... Thanks for testing!
There is a $100 price difference.
Price bro, price.
The Arc A750 costs around $100 less on average in most places, so that's literally half the card's price in difference; no sh*t the 3060 had better perform better. Considering this is Intel's first GPU attempt, maybe in a couple of months or years we'll see even better results from Intel. I hope so; competition in the GPU market is always good for consumers.
@@turtleneck369 I wouldn't get a 3060 either, tbh. Better to get a used 5700 XT or 1080 Ti for $100, which will do better in most games. Performance is way worse on my A770 in my case, since none of my PCs support rebar.
@@SteelTumbleweed Well, of course, but the used market is a totally different game. I got myself an RX 6700 XT a year ago for €240, while around that time a new one was €350+ and the 3060 was around €300 where I lived. So yeah, used is always best for budget, but with no warranty and some risk, of course.
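For what it's worth, a quick calculation with the prices quoted earlier in this thread (a minimal sketch; the €330/€210 figures are just the ones mentioned above, so swap in local prices):

```python
# Price-gap arithmetic using the euro prices quoted above (RTX 3060 vs Arc A750).
price_3060 = 330.0
price_a750 = 210.0

gap = price_3060 - price_a750
print(f"Absolute gap: {gap:.0f}")                                     # 120
print(f"Gap as a share of the A750's price: {gap / price_a750:.0%}")  # ~57%
print(f"A750 discount relative to the 3060: {gap / price_3060:.0%}")  # ~36%
```

With those numbers, the gap really is more than half of what the A750 itself costs, which is the point being made above.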
Do a comparison video like this after the recent update, please.
@@nahidhasan6541 don’t have the 3060 anymore unfortunately
It's an anomaly; the A750 and A770 got the same fps at 1080p and 1440p.
Intel has one of those GPUs which aged like fine wine.
b450 and rebar on ?
Perhaps
I'm surprised about the games where the A770 goes slower than the A750
hi, can you test with a580?
Still not back from RMA and I don’t have the 3060 anymore.
thank you for your videos and effort
You sure things are okay here? I have a 5600G, 16GB of 3200 MHz Corsair RAM, and a different mobo, but I get way more fps in Starfield, Alan Wake, and some others.
Do you have upscaler enabled?
Man the performance is way behind the 3060 in some games. I wish they would fix this..
If you want a headache-free experience, go with the 3060; at least you can be sure games are going to work.
I'm glad I bought an A750, it's soooo good.
good work
Please compare the Intel Arc A580 on the new driver against the RX 6600. If the Intel Arc is good, I'll buy it.
The bad thing is that the Intel Arc cards work terribly with AMD processors; they can't get the full potential out of the GPU.
Where is there info about that? Asking in good faith.
Can you please do this with an Intel CPU?
The only Intel CPUs I have are dual-core and lack Hyper-Threading.
Intel GPUs will give their best performance with Intel processors (12th & 13th gen).
What exactly happens in AW2 that justifies these low frame rates? I mean, it doesn't look much different from GTA. Guess it's just lazy programming.
It looks like they fixed GTA.
kinda
Just don’t use MSAA
New driver out today fixing Dragon's Dogma, btw.
Intel cards are obviously not optimized for gaming.
Price to performance, sorry, but Intel is absolutely terrible. Don't buy these.
A disgrace and a disappointment. I have the Intel Arc A750 Limited Edition and have used it for 8 months; it's a piece of garbage.
When Arc works properly it's definitely better than a 3060; it just still has issues to iron out. It will get better with time.
thank you
Intel Arc A750 price range = RTX 3050
The 3060 has a bit of micro stutter.
Arc has more stuttering in general.
@@IntelArcTesting I own both an A580 and a 4060 in separate builds. My 4060 has slightly more micro stutter, whilst my A580 has lower fps but no stutter at all. The builds complement each other well; there's no bottleneck. Overall I can't fathom why your A770 has such low fps; it doesn't add up with my Acer BiFrost A770. Keep in mind I'm AMD biased here.
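Since "micro stutter" impressions like the ones in this thread are hard to compare by eye, here is a minimal sketch of how it is usually quantified from a frametime log (the sample values are made up for illustration; a capture tool such as PresentMon, or the overlay's frametime export, would supply real ones):

```python
# Average fps, a rough 1% low, and a simple stutter count from frametimes in milliseconds.
frametimes_ms = [16.6, 16.9, 17.1, 16.7, 38.2, 16.8, 17.0, 16.5, 33.9, 16.6]  # made-up sample

fps_sorted = sorted(1000.0 / ft for ft in frametimes_ms)    # ascending: slowest frames first
avg_fps = len(frametimes_ms) * 1000.0 / sum(frametimes_ms)
one_percent_low = fps_sorted[int(len(fps_sorted) * 0.01)]   # fps at roughly the 1st percentile

# Frames taking far longer than the average read as visible stutter or hitching.
avg_ft = sum(frametimes_ms) / len(frametimes_ms)
stutter_frames = sum(1 for ft in frametimes_ms if ft > 1.5 * avg_ft)

print(f"avg fps: {avg_fps:.1f}, 1% low: {one_percent_low:.1f}, stutter frames: {stutter_frames}")
```

Two cards with the same average can feel very different if one has a much lower 1% low or more outlier frames, which is why "lower fps but no stutter" can still be the smoother experience.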
My next graphics card will be Intel!
PS exclusives don't perform well.
Arc 🫶🏽
👍
For god's sake, overclock the Arc 750 and test it.
The majority of his audience suggested not overclocking the GPU; he ran a poll. You are in the minority.
The majority of viewers are here to see how the card performs at stock settings, not the maximum fps it can get.
I used to, but I did a poll and most people prefer stock.
As an overclocking user, I sadly remember that poll.
My overclocked a770 has no issues (OC since it came out)
;-)
junk cards for poor homies from india