Meanwhile, Nvidia is pushing 8GB VRAM cards in 2025......
🤣
Rumors have it they will announce a texture compression technology that will be integrated into games like DLSS.
@@starbez It will be mostly for AI. Nvidia is giving scraps to gamers; also, it won't be a night-and-day difference.
Hahaha, who knows what NVIDIA is thinking. By 2025, they might actually release the RTX 5060 with just 8GB VRAM again. Or maybe, after facing backlash, they’ll come up with a 12GB VRAM version later. Let’s see how it all unfolds!
@adity-m2g Probably saving VRAM modules for AI cards
Awesome. Wukong will probably get 60fps after driver updates.
or play at 1440p on medium settings
Finally, a real genuine gamer who tested Battlefield 2042 on the Arc B580! Big thanks.
Edit: I think you're officially the first 3:14
YouTuber who played BF 2042 on the Arc B580
This card is single-handedly going to reset the GPU market.
Competitors will either lower their prices or be forgotten.
I am extremely glad I waited to upgrade my PC.
I have a mobile 4070, and this card performs a lot like it even though my 4070 has 4GB less VRAM. Good to see Intel improving. Nothing better than a third vendor.
RTX 4060 owners crying in the corner
In India the B580 isn't out yet (when it arrives it will be at least USD 270 + 18% GST), while the RX 7600 and RTX 4060 are available at USD 300 including tax (USD 270 with a credit card offer). Why would anyone go for Intel right now, especially when both of those cards will get cheaper with the new generation just around the corner?
Anyway, the B580 is great, but other than the extra 4GB of VRAM, nothing special tbh... I doubt you can get it under $300 in the USA right now 🤔
I really got lucky: I bought the Asus RTX 4060 for €279 and a few days later I managed to get my hands on the Intel Arc B580 for €289. I sent the RTX back when I saw what the Arc could do at the same price 🤪
@@sjgghosh7677 You can get it for less than that in the US, mate
@@sjgghosh7677 Sir, it will be an RTX 4060 + 4GB of VRAM for the same price
@@sjgghosh7677 "Nothing special" the B580 performs around 20% faster at 1440p for cheaper price than the 4060...
I would love to see how older DirectX games perform.
I have played The Forest (Ultra settings) at 90+ FPS, and Watch Dogs 2 (which doesn't have drivers for Intel; it says I have an older GPU and need something better than some 2GB GPU from NVIDIA or AMD) at 1-10 FPS (lagging a lot). But I downloaded DXVK and it runs better, though still not great (around 60+ FPS on low-medium settings, while my GTX 1660S gets 60+ FPS on high-ultra).
Skyrim SE as well as Fallout 4 do pretty well. No issues there. As do all the older Tomb Raider games (the new Remasters, Anniversary, Legend and Underworld).
GTA 4 works great
@Gejstr So playing old games like, idk, Assassin's Creed 2 would be funny
@@Oozaru85 Any chance you're playing modded Skyrim or Fallout? I'm trying to see how well modlists perform on this card, but nobody is going to post gameplay with such a new card.
The budget gamer will thrive with this card. Amazing performance for the value. Wow.
V: I don't ride anymore
*Proceeds to jump off the cliff 😂
Cyberbug… I mean Cyberpunk 😶
Also V: Now I don't
0:19 that hair makes me uncomfortable 😂
Hahahaha 😂😂😂
pubic hair
Yes, I burst out laughing hahaha
It would be great if you could do one video at 1080p resolution and without XeSS turned on, to see the true 1080p power of this card!
I plan to, tbh, within 48-72h.
wow, I've never seen someone die so fast in 34 games
amazing bro ❤
Thanks man 🔥
Naaaaaah!
The founder's edition is the 🐐
Waiting for Arc B770🙂
poop shit bull shit huyndai junk shit poop huyndai
Samee
Just buy the B580. Just a couple of dollars' difference and it gets the job done for 1440p gaming
@godofdestructiondhanasekha893 Looking at the current game optimization situation, it will become a 1080p card in 1.5 years.
Also I need the 16GB too.
When will this card be restocked😭😭😭
January 3rd
For me, in my country I can find it for 350 euros…
I didn't have those problems with it being in stock or not; I preordered.
Intel is trying hard tho
I love how you are the only person who tests the Elden Ring DLC
I am really glad u like it 😁
The power draw is amazing as well 😮 Really nice. Would love to see someone maybe try to overclock it after driver updates and all 👍
New MSI Afterburner glitch: GPU power monitoring tanks your 1% lows while gaming; uncheck that sensor to fix it. Tell this to your friend who has stuttering issues while using MSI Afterburner, thx.
Does it only affect Ryzen or not? 🤔
@Jakiyyyyy Yeah, this issue happens with the 7800X3D and 9800X3D guaranteed; don't know if it happens on Intel.
In my country it will cost me 300 dollars (with taxes) and it's still cheap and the best card, but it's not available as of now.
Will wait; hopefully they optimize the drivers.
Hardware is fire 🔥🔥🔥
Black Myth: Wukong is the only AAA game I own, so this card will suffice! I also got the ASRock Challenger 12GB.
I just don't know why you use FSR in Farming simulator when XeSS is an option.
It doesn't seem like the perfect option to me in terms of quality-to-performance ratio. At least not in FS25.
@@edwardbenchmarks It offers better visuals than FSR because it utilizes the XMX cores. The XMX version gives way better performance than on non-Intel GPUs while eliminating the ghosting present in FSR.
If you use Intel, XeSS automatically switches to the AI/hardware-accelerated version rather than the generic version that's available for all cards. 🤷🏻♀️🥴
@@Jakiyyyyy You can switch between the DP4a and XMX paths in the driver. It makes comparing image quality easier.
THX 4 Gameplay!!!
Thanks for posting this! Any chance you've been able to play modded Skyrim or Fallout? Trying to see how the VRAM handles modlists.
Cyberpunk 2077, 1440p Ultra and XeSS Ultra Q and only consuming 120W?!?! Hoooooly 😍
Hey man, great video! If possible, can you recreate the video with a budget CPU? If a PC builder is on a tight budget, they will also use a budget CPU along with the GPU. The CPU used in this video is high-end.
For 250 USD this is an amazing card.
Glad to see there is a third player now in the mid-range market, aside from Nvidia and AMD.
Can you test it with ReBAR off? Both 1080p and 1440p?
Turn off GPU Power sensor in Afterburner to improve 1% and 0.1% lows.
Joking or for real?
@edwardbenchmarks real. Try it.
@@edwardbenchmarks Definitely real!
How much storage do you have to store this many games???
Those freezes on Elden Ring with
Bro, could you do a test in RPCS3 and Xenia please?
Wondering if it makes sense to get the A770 since you can't buy this anywhere
No, stay away from the A-series
Wait. There will be new stock coming
And what about The Finals???
I would need to run the test in 100 games to cover them all; I will try to do it, but I'm not promising anything.
@@edwardbenchmarks Anyways, thanks for this video
Will this graphics card someday also start coming in laptops?
Is XeSS Ultra Quality like DLSS 3.0?
No, XeSS Ultra Quality is like DLSS 2
@AliSohu-kf8sp Depends on the implementation of TAA and the version of XeSS. E.g. XeSS 1.2 has Ultra Quality through Ultra Performance, while 1.3 has Quality through Ultra Performance. Also, XeSS and DLSS/FSR 2.2/3.1 have different internal rendering resolutions; the latter two map the same quality level to the same internal resolution. Additionally, programs can define custom upscale ratios or rename the existing ones if they want to. XeSS and FSR both need high-quality TAA and motion vectors to work properly. DLSS isn't as sensitive to the input quality, as its ML model is stronger than XeSS's. XeSS is a little faster to compute on Intel Arc than the non-Intel/DP4a path; also, XeSS on Arc is faster than DLSS on Ampere or Ada Lovelace. I implemented and worked with FSR 3.1 (upscaling and frame gen) and DLSS 3.7 upscaling in Vulkan and experimented with XeSS in DX12, as XeSS currently doesn't support Vulkan as a backend.
@limeisgaming Small correction: Intel vendor-locked the DP4a path in 2022, so only Intel iGPUs/dGPUs can use DP4a while other GPUs must use the Shader Model 6.4 path in DX12, reducing performance (FLOAT16 vs INT8).
Intel Alchemist and Battlemage use the XMX version of XeSS, which is indistinguishable from DLSS in games if properly implemented.
DLSS uses a Lanczos-based upscaling technique (from NVIDIA's whitepaper), while XeSS and FSR build on TAA.
@kazuviking Thanks, but why fall back to 6.4? Intel's own DG2/Xe-LPG and HPG architectures are SM 6.6 capable, Xe2-based architectures are 6.8 and 6.9 compliant, and the other vendors' hardware of the last 7 years has SM 6.8. Meaning they could all use wavefront HW intrinsics. Seems kind of like a wasted opportunity for optimisation and HW alignment.
@limeisgaming Ask Intel why they vendor-locked the DP4a path to their own GPUs. From what I know, XeSS 1.3 uses the Shader Model 6.4 path. They may have updated it to SM 6.6 in the 1.3.1 release, but that needs confirmation.
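To illustrate the preset-to-internal-resolution mapping discussed in the thread above, here is a minimal C++ sketch. The scale factors are the commonly cited approximate values for XeSS 1.3 and DLSS/FSR 2.x presets; the authoritative numbers live in each SDK's documentation, so treat this purely as an illustration.

```cpp
// Prints the internal render resolution implied by common upscaler quality
// presets for a given output resolution. Scale factors are approximate,
// commonly cited values; actual values are defined by each SDK version.
#include <cstdio>

struct Preset { const char* name; double scale; };

int main() {
    const int outW = 2560, outH = 1440;  // target (output) resolution

    const Preset xess13[] = {            // XeSS 1.3 presets (approx. factors)
        {"Ultra Quality Plus", 1.3}, {"Ultra Quality", 1.5}, {"Quality", 1.7},
        {"Balanced", 2.0}, {"Performance", 2.3}, {"Ultra Performance", 3.0},
    };
    const Preset dlssFsr2[] = {          // DLSS / FSR 2.x presets (approx.)
        {"Quality", 1.5}, {"Balanced", 1.72}, {"Performance", 2.0},
        {"Ultra Performance", 3.0},
    };

    auto print = [&](const char* family, const Preset* presets, int count) {
        std::printf("%s at %dx%d output:\n", family, outW, outH);
        for (int i = 0; i < count; ++i)
            std::printf("  %-20s -> %dx%d internal\n", presets[i].name,
                        int(outW / presets[i].scale), int(outH / presets[i].scale));
    };
    print("XeSS 1.3", xess13, 6);
    print("DLSS / FSR 2.x", dlssFsr2, 4);
    return 0;
}
```

Running it at 2560x1440 shows, for example, that "Quality" implies roughly 1505x847 internal for XeSS 1.3 but roughly 1706x960 for DLSS/FSR 2.x, which is why same-named presets across vendors are not apples-to-apples comparisons.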
Yo dude can u plz do this same test but every game is maxed out at 1440p no upscale?
What program do you use for monitoring?
The B580 is an urban myth!!
46:12 why Did it crash?
I don't know, I don't have an explanation for that since I didn't get any error. On other GPUs it's fine.
@ And it's a game I definitely want to play on the Arc… beautiful GPU. Are you gonna benchmark classic games as well, like the Batman games, the Borderlands games…
awesome performance, ill buy it.
Which other GPU does its performance most closely compare to?
I just upgraded from a 1050 2GB to a 1650 Super. I want to keep watching your videos; which GPU that you have benchmarked is equivalent to it?
What, 100W power draw 😮
Efficiency on 🔥
@@edwardbenchmarks How much power supply does this graphics card need?
@@edwardbenchmarks MSI Afterburner hasn't been updated for a year; it shows wrong power numbers for Intel's B580. It should be 180W instead of 120W. You should run Intel's OSD at the same time in the upper-right corner.
@@edwardbenchmarks Nope, that's just the compute die's power draw. Intel openly stated that RTSS and their own Intel graphics control software read only the die's power draw, which doesn't include the memory dies or the fans. E.g. on a triple-fan model you should estimate roughly 12W for the fans + 37W for six 16Gb GDDR6 dies, so roughly +50W on top to get the actual TBP value rather than just the GPU die's power draw.
Edit: Depending on the title, it's still more energy efficient and cheaper than an RX 7600, assuming you can find it at MSRP.
@limeisgaming I saw on Reddit that it doesn't add the PCIe slot power consumption; that's why it's showing only 100W.
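To sanity-check the numbers in this thread, here is a tiny C++ sketch of the estimate described above: the software-reported die power plus the commenter's rough memory and fan figures. These inputs are the commenter's estimates, not measurements.

```cpp
// Rough total-board-power estimate from the figures quoted above.
#include <cstdio>

int main() {
    const double reportedDiePowerW = 120.0; // what RTSS/Afterburner shows for the B580 in the video
    const double memoryEstimateW   = 37.0;  // rough figure for 6x GDDR6 dies (assumption from the comment)
    const double fanEstimateW      = 12.0;  // rough figure for a triple-fan cooler (assumption from the comment)

    // Total board power is roughly the die power plus everything the software readout omits.
    double estimatedTBP = reportedDiePowerW + memoryEstimateW + fanEstimateW;
    std::printf("Estimated total board power: ~%.0f W\n", estimatedTBP); // ~169 W
    return 0;
}
```

That lands in the same ballpark as the ~180W figure mentioned earlier in the thread, which is the point being made about the on-screen 120W readout.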
Hey, I'm currently running a 2070 with a 5600X and a 1440p 144Hz monitor; is the B580 a worthy upgrade?
Can u do one with ultra settings and no upscaling? Only native
I'm asking for a test of the T400 4GB
LOL, are the power usage and temps correct? Seems very good to me.
45:53 took me off guard D:
my heart stopped for a second there xd
Can you please review nvidia quadro m1200 it would mean a lot🔥
(Please also try to play ghost of tsushima on it)
Intel u are back my darling let's go😂😂😂
That's a beautiful card
Can you use it with an AMD CPU and motherboard?
O yuh they cooked with this wtf
How often does Intel update their drivers? Also, do they have an app like Nvidia's?
Whats your room temperature?
Around 25-26°C.
What performance will I get with my ryzen 5 3600?
I'll be playing in 1080p
Please try Dragon's Dogma 2
Am I the only person here who doesn't have a bunch of game shortcuts on my desktop? I only have the Recycle Bin icon, Steam, and Microsoft Edge! 😅 I like to see my wallpaper!
I wonder if this card can play BO Cold War with no graphics issues and if the game will even open without crashing
Can you explain how the LED of this GPU works? Can we get temp info through this LED? What does it look like when the temp goes high?
It's just an LED light for aesthetics. There's only a switch to turn it off.
@@lds7104eva On the ASRock website they list it as an "LED Indicator", so I'm wondering what they meant by indicator! Maybe it means the higher the temp goes, the brighter the LED gets!
You would have to add a temp sensor and wire it up to an RGB LED. You would also need a microcontroller like an ATtiny85: connect the LED pins to the ATtiny85's GPIO and program a condition to make it glow red when the temp reaches 80+ and blue otherwise, roughly like the sketch below this thread 😏 Good luck.
@@johndrippergaming Is that really that hard? Do people even use that?
@@ZahidHasan-tj2rl No no, I'm just giving him a solution to add that feature. I think Aura Sync supports that feature natively, and some CPU coolers have that feature too.
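For anyone curious what that DIY temperature LED would look like in code, here is a minimal Arduino-style sketch for an ATtiny85, assuming a generic analog temperature sensor on an ADC pin and the red/blue legs of an RGB LED on two GPIOs. The pin numbers, the 80 °C threshold, and the crude ADC-to-°C conversion are all placeholder assumptions, not a tested design.

```cpp
// Hypothetical ATtiny85 sketch: glow red above 80 °C, blue otherwise.
// Wiring, pin numbers and the sensor conversion are assumptions.
#include <Arduino.h>

const int TEMP_PIN = A1;   // ADC input wired to the temperature sensor (assumed)
const int RED_PIN  = 0;    // GPIO driving the red LED leg (assumed)
const int BLUE_PIN = 1;    // GPIO driving the blue LED leg (assumed)

void setup() {
  pinMode(RED_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
}

// Placeholder conversion: map the raw 10-bit ADC reading to 0-100 °C.
// A real thermistor or analog sensor needs its own calibration curve.
float readTemperatureC() {
  int raw = analogRead(TEMP_PIN);
  return raw * (100.0f / 1023.0f);
}

void loop() {
  float tempC = readTemperatureC();
  bool hot = tempC >= 80.0f;           // red at 80 °C and above
  digitalWrite(RED_PIN, hot ? HIGH : LOW);
  digitalWrite(BLUE_PIN, hot ? LOW : HIGH);
  delay(500);                          // check twice per second
}
```

In practice the sensor would need proper calibration, and as noted above, software such as Aura Sync or some CPU cooler utilities can do temperature-based lighting without any extra hardware.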
And are there any driver issues?
Star Wars Outlaws is crashing
Could you test Hunt Showdown 1898?
Now I am curious how the Intel B580 does in DirectX 8 and lower. Can you test old gramps like Giants: Citizen Kabuto and Industry Giant 2? Would love to see whether Intel cards can avoid crashing at launch 😂
All GPU vendors already struggle with DirectX 8 games on modern Windows... For DX8 games you should rather use dgVoodoo, d3d8to9 or DXVK.
Great price to performance 🗿
Is this an upgrade over the 7700xt?
no
The power usage ?
You can see it in the video; that's the real power usage
@amy-295 I know, it's too low...
0.1% lows in Farming Simulator? Why?
Game optimisation.
GTA wasn't updated with the NVE FREE mod? Spider-Man would easily run at 4K. You should just benchmark the maximum playable resolution for the B580 in each game. Does Intel have an option similar to NVIDIA DSR to run a game at 4K on a 1440p monitor?
Emulation testing is needed; how does the B580 handle emulators?
I second this! I know they're mostly CPU-dependent, but I'm curious about higher graphics settings
How much will it cost, roughly, in Bosnia and Herzegovina?
Let's hope for 600 KM max, although realistically it should be around 480 KM. I've had some tips that it could cost around 480 KM in a couple of months, but I've also been told it will be 600 or even 650 KM… Unfortunately.
Can Intel GPUs do half-refresh V-Sync like Nvidia?
Bro, pls test the GTX 960 4GB in games. It would be better if you test it with the i5-4590 CPU, because I have that CPU. Is my CPU enough for the GTX 960 4GB?
Can you test Tekken 8 on this GPU?
Assassin's Creed Mirage still uses the same knife sound when you kill someone; they didn't even change it after all these games lol
Some newly released games need optimization
A580 8GB to B580 12GB
A750 8GB to B750 12GB or 16GB
A770 16GB to B770 20GB 😈
That may be true. We will see.
So are the models with lower numbers faster, or do they for some reason sell slower cards with more VRAM?
could you kindly test Battlefield franchise on this GPU?
Dude you said my heart and my brain voice!
i5-14400F + Arc B580 or AMD Ryzen 5 7500F + RTX 4060, which should I go for? 1080p gaming
B580
@AshutoshMishra-s7e At the moment both of these setups are the same price, and I worry about the inconsistency of the new Intel graphics cards; most games always go for Nvidia
@@Koray-pn5cd I understand, but so far Intel seems like an option. If you're worried, go with Nvidia, or wait for the new-gen cards; they're around the corner
@@AshutoshMishra-s7e Well, I have 850 USD and I want to pick the best possible option, and I know the 5060 will be overpriced. Right now the Arc B580 is promising, but I didn't like the ReBAR and other requirements; it feels like it's made for programs and such, there is a huge gap between some games, and XeSS isn't as advanced as DLSS. So many things to worry about, so I'll pass and go with the 4060
@@Koray-pn5cd Or you could just get the RX 7700 XT, which is a beast with 12GB VRAM but costs a little more
29:30 The GTA V reverse test was crazy. Somebody donate one to me :C
Me with an RTX 3050 8GB that I upgraded to 6 months ago (I went from a 1050 2GB to a 3050 8GB, and I live in Turkey): 😢
How will it perform in PUBG??
Can I request Mobile Suit Gundam Battle Operation 2 at 1080p and 1440p? This game is unique since the devs are too lazy to do any optimization for any GPU brand.
im gonna use it for 1080p
Within 48-72h there will be a video at 1080p.
Guys can someone test it in Rust?
Can you test it with Lossless Scaling?
I don't see a reason for that. First, you would need a 240Hz monitor; otherwise you can't use it properly, since the GPU already provides enough frames. Going past the maximum refresh rate, which in my case is 144Hz, leads to stutters, at least from what I've observed.
In Europe they adjusted the price of the B580 from €320 to €400 because of the high demand. What a scam.
But can it do 30fps ray tracing?! I think not!
Do one without XeSS and at 1080p
Wow, I thought it would be like Intel Iris or something
This is a dedicated GPU for PCs.
In third-world Turkey, due to the exorbitant taxes levied by the state, this card costs 330 euros.
If Intel releases the B770 sooner and prices it below $400, it will kill the cards in AMD's segment, and Nvidia will be stuck in a difficult situation.
Please benchmark Alan Wake 2. It is GPU-intensive.
I just bought an RTX 4060 two months ago, and seeing this now is making me angry.
Why isn't the processor an AMD 9700X3D? It would have unlocked even more of its potential.
Can you test rust on it?
Don't worry, the RTX 5060 will also come with 8GB VRAM 😂
Yet will still be faster than the B580 😂😂
@fightnight14 Best of luck, Nvidia fanboy 😂
@@GuruKhatri-h7m Nah, not even a fanboy. I buy whatever is the best on the market. Intel drivers are still in their infancy. Compatibility with old games and emulation is rough.
@@fightnight14 No one becomes the best just because you say so. Good luck with your 8GB of VRAM on a 128-bit bus, old pal; enjoy your old games
@@GuruKhatri-h7m Yet it will still be faster lol
Is that gpu real?
There is no way this card is this good. Do you think I should exchange my 6600 for it?
The GPU is generally better than the RX 6600 in all aspects. Maybe I will make a comparison for it.
@@edwardbenchmarks yeah, thanks. All that I'm worried about are the driver issues and incompatibility.
@@mafiartx I think you should wait a little longer. Once the card is fully optimized, there's no need to even think about it; you can buy it right away.
@@SteinRoth0 I know. You're right. That's my worry, Intel cards are usually unoptimized at launch.
@@mafiartx There is no reason not to buy it, but I think it would be better to wait.
Could you make a Fortnite video with all the quality presets at 1080p and with upscaling 🙏