Arc B980 is Wild - Intel Battlemage GPU Leak

  • Published Jan 17, 2024
  • Thank you to Vip-cdkdeals for sponsoring this video! 30% code: GPC20
    ▬ Windows 10 pro ($16):www.vip-cdkdeals.com/vck/GPC2...
    ▬ Windows 11 pro($22):www.vip-cdkdeals.com/vck/gpc2...
    ▬ Windows 10 home ($13):www.vip-cdkdeals.com/vck/gpc20wh
    ▬ Office 2016($27):www.vip-cdkdeals.com/vck/gpc2...
    ▬ Office 2019($47):www.vip-cdkdeals.com/vck/GPC2...
    A new leak suggests Intel may have very good value Battlemage GPUs with the Arc B980 and Arc B970 graphics cards.
    Sources
    Intel Battlemage Leak: • INTEL BATTLEMAGE SPECS...
    FULL DISCLOSURE
    Amazon has given me affiliate links which send me a percentage of each purchase when you use them. Additionally all Vip-cdkdeals links are sponsored.
    END DISCLOSURE
    Specs
    Camera: Canon EOS R6: amzn.to/3V2RnnL
    RAM: 64GB Gskill DDR5: amzn.to/3F1Pa6j
    CPU: Intel i9-13900K: amzn.to/3EaqGpD
    MOBO: EVGA Z690 Kingpin: www.evga.com/products/product...
    GPU: AMD RX 7900 XTX Aqua: click.linksynergy.com/deeplin...
    Storage: Kingston 4TB Renegade SSD: amzn.to/3OXvdlz
    Case: Lian Li O11 EVO: amzn.to/3VUscF3
    Music
    www.davidcuttermusic.com
    Outro: Everyone by Monplaisir
  • Science & Technology

Comments • 488

  • @GraphicallyChallenged
    @GraphicallyChallenged 4 months ago +4

    Thank you to Vip-cdkdeals for sponsoring this video! 30% code: GPC20
    ▬ Windows 10 pro ($16):www.vip-cdkdeals.com/vck/GPC20w10
    ▬ Windows 11 pro($22):www.vip-cdkdeals.com/vck/gpc20w11
    ▬ Windows 10 home ($13):www.vip-cdkdeals.com/vck/gpc20wh
    ▬ Office 2016($27):www.vip-cdkdeals.com/vck/gpc20of16
    ▬ Office 2019($47):www.vip-cdkdeals.com/vck/GPC20w10

    • @zakquack
      @zakquack 4 months ago

      bought the windows 11 pro key. they sent me one that didn't work but I used their live chat and received a new working one immediately.

    • @Deffine
      @Deffine 4 months ago

      @@zakquack Cool story, i was worried.

    • @zakquack
      @zakquack 4 months ago

      @@Deffine I originally posted that they're a scam but then they gave me a working key so I updated...

  • @bart_fox_hero2863
    @bart_fox_hero2863 4 months ago +317

    Actual budget and actual midrange gpus are what most of us want. This new idea of a $600 price point being midrange must be stopped!

    • @user-bj6mv7ok5q
      @user-bj6mv7ok5q 4 months ago +9

      Intel gpus are not going to perform like midrange cards.

    • @ncpv
      @ncpv 4 months ago +8

      $600 is midrange tho?

    • @yasunakaikumi
      @yasunakaikumi 4 months ago +7

      low end of AMD and Nvidia here in Japan cost 600+ tho...

    • @Siiimon426
      @Siiimon426 4 months ago +16

      In Sweden the midrange is like 1200$. And the rtx 4090 costs 2500$..

    • @Neilos-sd6ti
      @Neilos-sd6ti 4 months ago +47

      @@ncpv midrange used to be around the 200-300 dollar mark.

  • @user-bj6mv7ok5q
    @user-bj6mv7ok5q 4 months ago +86

    A 450-500 dollar GPU on par with an 800 dollar GPU is very good for competition.

    • @dominicharvey6048
      @dominicharvey6048 4 months ago +4

      If that is a real thing amd will definitely bring their prices down but nvidia doesn't care and rarely changes their prices

    • @RAM_845
      @RAM_845 3 months ago

      @@dominicharvey6048 Hence their nickname, Ngreedia

    • @owomushi_vr
      @owomushi_vr 3 months ago +1

      I won't get AMD. I'll stick with Intel. VR is amazing on Arc now, so with this of course I'm grabbing Battlemage.

    • @ES-jk9nr
      @ES-jk9nr 3 months ago

      If it can play games ...

    • @Monim_Shorts
      @Monim_Shorts a month ago +1

      @ES-jk9nr can you shut up and welcome new competition

  • @Giovanni-Giorgio
    @Giovanni-Giorgio 4 months ago +201

    I hope Intel can battle Nvidia with this one and launch on time.

    • @denverbasshead
      @denverbasshead 4 months ago +15

      It won't

    • @-INFERNUS-
      @-INFERNUS- 4 months ago +12

      I would like that, but Intel does not even know how to compete. If they were serious about even trying to take a little away from Nvidia, they would have had Battlemage out this month. Even one GPU that can go against the 4070 Super would be great and give us more choices.

    • @montreauxs
      @montreauxs 4 months ago +3

      it will.. @@denverbasshead

    • @denverbasshead
      @denverbasshead 4 months ago +2

      @@montreauxs what has Intel launched on time in the last few years? Nothing lol. Plus Intel overhypes everything.

    • @hunterz69621
      @hunterz69621 4 months ago +2

      It won't, because it will get beaten by the RTX 5090 and it will be expensive as hell

  • @tomtomkowski7653
    @tomtomkowski7653 4 months ago +149

    12GB of VRAM with performance close to the 4070 Ti Super will be the same L as the 4070 Super is.
    $450-$550 would for sure be better than $800, but still... Let's hope that the higher model will have 16GB as it should, to put more pressure on the market.

    • @invictious8373
      @invictious8373 4 months ago +15

      following patterns it'll have 24 prolly, the previous model had 8gb or 16gb, so this one should have 12 and 24

    • @user-bj6mv7ok5q
      @user-bj6mv7ok5q 4 months ago +14

      The actual performance/price matters more than specs in some cases.

    • @siema14123
      @siema14123 4 months ago +22

      24 would be extreme overkill. If its around 4070 ti performance then 16 would be more than enough

    • @G_WOLF
      @G_WOLF 4 months ago +19

      @@invictious8373 exactly, the Arc A770 was 16GB, why would there be less on an enthusiast card!

    • @lightward9487
      @lightward9487 4 months ago +4

      B970 Super 16GB competes 100%

  • @laszlozsurka8991
    @laszlozsurka8991 4 months ago +58

    If those specs turn out to be true then battlemage is gonna be a letdown. 12GB for that level of performance is gonna be just as much of an L as the 4070 SUPER. Why would Intel make the A770 16GB but the B980 12GB, that makes no sense.

    • @sisqobmx
      @sisqobmx 4 months ago +9

      Meanwhile im playing any game fine with my 1060 3gb. What yall need 10+gb for?😂😂

    • @warbearz1337
      @warbearz1337 4 months ago +27

      @@sisqobmx someone isn't playing at max settings, that's for sure

    • @DiamondkeyOwO
      @DiamondkeyOwO 4 months ago +14

      @@sisqobmx probably games that your GPU can't even launch due to lack of VRAM.

    • @tuttuti123
      @tuttuti123 4 months ago +1

      @@sisqobmx aye with a 1650 right now, anything would be an upgrade.

    • @laszlozsurka8991
      @laszlozsurka8991 4 months ago +3

      @@sisqobmx We are talking about current games, not games from 10 years ago.

  • @rozzbourn3653
    @rozzbourn3653 4 months ago +56

    i dont think they will run a 192-bit bus on their top cards.

    • @quatreraberbawinner2628
      @quatreraberbawinner2628 4 months ago +2

      Yeah thats the thing thats sticking out to me, not the 12gb

    • @Amy-kh3zl
      @Amy-kh3zl 4 months ago +5

      Wassup with this 192-bit bus thingy? I don't get it. Please explain?

    • @paulboyce8537
      @paulboyce8537 4 months ago +8

      These specs are just stupid, limiting your resolution to 1440p. And 12GB says it all: 4K is out of the question.

    • @sultanofsauce9816
      @sultanofsauce9816 4 months ago +1

      @@Amy-kh3zl The memory bus is a data line in the chip that basically determines how much data can flow through the GPU/Memory at a time, small bus = slower GPU by default

    • @rozzbourn3653
      @rozzbourn3653 4 months ago +2

      @@Amy-kh3zl The term "192-bit bus" refers to the memory bus width. The memory bus is a pathway that connects the GPU to the video card's memory, allowing data to be transferred between them. The width of this bus is measured in bits and determines how much data can be transferred in a single clock cycle. For cards on the higher end of the product stack, 256-bit or more is usually standard.
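
As a rough illustration of the point made in this thread: peak memory bandwidth scales with bus width times the per-pin data rate. The sketch below uses assumed, typical GDDR6 data rates purely for illustration; they are not leaked Battlemage figures.

# Peak memory bandwidth (GB/s) = (bus width in bits / 8) * per-pin data rate (Gbps)
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

for bus, rate in [(192, 19.0), (256, 19.0), (256, 20.0)]:
    print(f"{bus}-bit @ {rate} Gbps -> {peak_bandwidth_gb_s(bus, rate):.0f} GB/s")
# 192-bit @ 19 Gbps -> 456 GB/s
# 256-bit @ 19 Gbps -> 608 GB/s
# 256-bit @ 20 Gbps -> 640 GB/s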

  • @xtremezone987
    @xtremezone987 4 months ago +23

    Hopefully these leaks are very close to reality when they come out to really drop prices across the market. Drivers will most likely still be an issue but since Intel worked pretty fast updating ARC drivers, Battlemage updated drivers should be even better. Fingaz crossed cuz no true average gamer can afford a $4K midrange gpu in the future.

    • @MoriartyUC
      @MoriartyUC 4 months ago +4

      except you're gonna see a lot of ppl hop on the "intel sucks" bandwagon and complain still about cards being expensive while supporting Nvidia, the single company causing the prices to stay at the pricepoint they hate

    • @xtremezone987
      @xtremezone987 4 months ago +2

      @@MoriartyUC ya unfortunately true

    • @samuelbrogdon9165
      @samuelbrogdon9165 4 months ago

      why not 24gbs ?

    • @visitante-pc5zc
      @visitante-pc5zc 4 months ago

      @MoriartyUC Everyone is rooting for Intel. We need more options. The leather jacket guy needs to be humbled.

  • @Viking8888
    @Viking8888 4 months ago +48

    If these are going to be the specs, I'm no longer interested in Intel. For me (and many others), there has to be 16GB VRAM minimum going forward, and the rumor that the B980 was going to trade blows with an RTX 4080 for less than half of its price is the main reason I have been waiting to get a new GPU.

    • @unknownposter2075
      @unknownposter2075 4 months ago +13

      I mean, if it’s a lot cheaper, is less VRAM really that bad?

    • @jasser6470
      @jasser6470 4 months ago +9

      ​@@unknownposter2075 Yes, the consoles have made a 16-gig minimum going forward because that is how much VRAM they have, meaning developers will use that as the standard. Some games even now won't run well with less than 16

    • @gs-pd5ox
      @gs-pd5ox 4 months ago +6

      I think the ‘half price’ hype is what has messed this whole situation up. I have an A770 now and was looking forward to Battlemage. As of today let’s just call a 4080 $1050. I personally never expected a b980 to come within 90% of a 4080 for under $500. It doesn’t need to. Sign me up for an intel gpu that can trade blows with the 4070 ti super with 20gb of VRAM for $600+. Hell $800 if it’s actually 90% of a 4080/super. $450 was always unrealistic and now Intel is shaving down so as not to make people irate with a $600 and up price tag.

    • @anitaremenarova6662
      @anitaremenarova6662 4 months ago +6

      @@jasser6470 Consoles are also meant to be used with 4K TVs. If you're not playing on 4K you'll survive on 12GB for at least a couple more years.

    • @laszlozsurka8991
      @laszlozsurka8991 4 months ago +12

      @@jasser6470 The PS5 doesn't have 16 GB VRAM, the PS5 has 16 GB unified memory which includes the VRAM and RAM. About 4 GB is reserved by the console so in terms of VRAM the PS5 has 12 GB.

  • @shryko
    @shryko 4 months ago +9

    The most exciting thing about this would be the pressure put on prices, if it's even close to the leaked specs.

  • @zynnel484
    @zynnel484 4 months ago +23

    Im buying battlemage just because it sounds cool.

    • @Auriga4
      @Auriga4 4 months ago +2

      amen to that

    • @garrickadams7962
      @garrickadams7962 4 months ago +5

      Well that's smart 🎉😂

    • @JakeobE
      @JakeobE 4 months ago +2

      Battle Marge Simpson

    • @dragomegaman3711
      @dragomegaman3711 4 months ago +1

      Amen

  • @chrisnesbitt_jr
    @chrisnesbitt_jr 4 months ago +30

    Honestly if this is the case then Intel isn't going to be the type of competition I was hoping they'd be. I really thought they were going to bridge the gap between software, hardware, and price. With AMD being the cheaper, hardware focused giant. And Nvidia being the expensive, software based giant. I hoped Intel would be the best of both sides third player. Already it looks like they're going to start compromising the hardware and immediately that makes me wary that they'll even find a strong foothold in the GPU market.

    • @justinnguyen6772
      @justinnguyen6772 4 months ago +5

      Not many companies want to be referred to as the budget company. Rather stick to the high end with high-end profit margins than some mediumish profits.

    • @TheBann90
      @TheBann90 4 months ago +11

      True, this leak (probably fake) would mean close to 0 sales.
      But the 12GB of memory is a strong indicator of it being fake given the interview we saw with PC World.

    • @DelScully
      @DelScully 4 months ago

      I was really hoping for an Intel 4080/7900 XTX-level card.

    • @sasquatchcrew
      @sasquatchcrew 4 months ago +2

      Intel has almost infinite money, they will catch up, I believe.

    • @Jpilgrim30
      @Jpilgrim30 3 months ago

      @@sasquatchcrew Same. Have to crawl before you walk, and I imagine they are learning a lot through these first efforts.

  • @Pro720HyperMaster720
    @Pro720HyperMaster720 4 months ago +6

    If they truly end up managing to produce a model close to the 4070 Ti they’d better have a 24GB VRAM variant or a wider bus to allow 16GB configurations

  • @ErzbergAdventures
    @ErzbergAdventures 4 months ago +4

    Definitely curious about the new B980 now. Not so much for gaming, but for my workload.

  • @Slane583
    @Slane583 4 months ago +2

    If they choose to go with 12GB of RAM and expect to get more money for it, then I will not be buying a Battlemage card. I went with a 16GB A770 a few months ago to play around with, to see what it was like. So far I've been pleased with it, as everything I play, with the exception of a couple of games, gets very high fps. Why would I go backwards from 16GB of RAM down to 12GB and want to pay more? The speed of the RAM doesn't matter if you're taking away the extra capacity. It's like trying to tell me this new 12oz cup is better than my current 16oz cup because the new cup is made from stronger materials, even though it has less capacity. The durability/speed doesn't mean squat if it can no longer hold 16oz/16GB.

  • @denvera1g1
    @denvera1g1 4 months ago +5

    If the B980 is on the same node as the A770, it might end up being the same cost to manufacture as the A770 when it launched, but the end cost of the card is determined by the volume in which they sell (with a base cost for materials and labor if effectively infinite numbers sell, like for Nvidia).
    The manufacturing line is a small part of what goes into a product, and if you don't sell much of a given product, the design might be the majority of the cost. But economies of scale mean the 4070 Ti could have sold for $549 if Nvidia knew the 40 series would sell as well as it has (they had good info to say it would sell better than it has so far).
    FYI the A770 was likely a loss leader, costing between the 3070 Ti and the 4080, with all 3 cards being of the ~400 mm², 256-bit class of hardware but all on different nodes (the A770's node closer in cost to the 4080's node than the 3070's). What we don't know for sure is whether the 3070 Ti and 4080 are overpriced, or if the A770 is underpriced; most likely the A770 was underpriced with no margin, or even a loss, though unlikely without investor buy-in.

    • @khirasier
      @khirasier 4 months ago +6

      lol the 30 and 40 series cards especially the 40 were way overpriced and gpus are even cheaper to make for intel cause they dont need to lease time from tsmc cause they have their own manufacturing facilities

    • @user-bj6mv7ok5q
      @user-bj6mv7ok5q 4 months ago +1

      @@khirasier wow

  • @laszlozsurka8991
    @laszlozsurka8991 4 months ago +7

    If B980 and B970 comes out late this year like Q3 or Q4 then it'll be bad since the RTX 50 series will be ready to launch and the RTX 5070 and RTX 5060 will make the Battlemage GPUs obsolete.
    RTX 5090 will probably launch Q4 this year
    RTX 5080 will probably launch Q1 2025
    RTX 5070 / 5070 Ti will probably launch Q2 2025
    RTX 5060 / 5060 Ti will probably launch Q3 2025
    So in other words Battlemage will last for 6 months until it'll be destroyed by the RTX 5070.

    • @G_WOLF
      @G_WOLF 4 months ago +4

      I wouldn't say that - I'm expecting Q2 at the latest. And plus, if Nvidia's pricing isn't any different from the 40 Series - the majority will buy 5090 and 5070/Ti

    • @user-bj6mv7ok5q
      @user-bj6mv7ok5q 4 months ago +4

      They will release sooner. Intel should be aware of the RTX 5000 series.

    • @G_WOLF
      @G_WOLF 4 months ago

      @@user-bj6mv7ok5q exactly

    • @AaronDaDon
      @AaronDaDon 3 months ago

      I bet you those 50 series card prices will be crazy high for only a 10% performance gain

    • @laszlozsurka8991
      @laszlozsurka8991 3 months ago

      @@AaronDaDon 50 series will definitely be more than 10% lol. It's a completely new node and architecture. 5070 will mostly be 10% faster than the 4080.

  • @unclej3910
    @unclej3910 4 months ago +14

    My Radeon RX 6800 has 16GB GDDR6 and a 256 bit memory bus. But, only 12GB on the Battlemage cards?

    • @user-bj6mv7ok5q
      @user-bj6mv7ok5q 4 months ago +2

      They are prolly lower end models.

  • @chrisblum8358
    @chrisblum8358 3 months ago

    Good content. I suspect your math on power for the speculative B980 is optimistic by ~10%. 270W seems like a better guess.

  • @pju28
    @pju28 4 months ago +8

    I do not understand why they are not capable of making GPUs with 150+ CUs/Xe cores… RTX path tracing is the main thing in gaming now! As you can see, older games are being remastered pointing to RTX.

    • @frespects9624
      @frespects9624 4 months ago

      RTX is awful. It's still a very new technology and it'll likely be 7-10 years before it's matured enough to be viable for the average consumer to run on average hardware.

    • @user-bj6mv7ok5q
      @user-bj6mv7ok5q 4 months ago +1

      Right the GPU would last longer as new games come out.

  • @Greenalex89
    @Greenalex89 4 months ago +4

    C'mon Intel, don't go full clown mode like Nvidia.. Give your flagship more than 12GB VRAM and a 256-bit bus at least. If it trades blows with a 4070 it's not worth upgrading from my 6700 XT; nevertheless I hope for a serious impact on the upper midrange GPU market. It's not easy to fk this up, Intel!

  • @simonrichard1871
    @simonrichard1871 4 months ago

    Came for the news... Stayed on pause trying to admire (though it's blurred out) the axe/hammer in the background! 😮

  • @thetheoryguy5544
    @thetheoryguy5544 4 months ago +3

    It would make way more sense if they kept the B980 at 16GB like the A770 and made the B970 12GB, up from the 8GB in the A750. So I hope the VRAM leak is wrong. Would be nice as well to have the 64 Xe cores. But them not putting out at least the 56-core version will be a make-or-break moment in whether I buy another Arc card or not.

  • @HEAD123456
    @HEAD123456 4 months ago +5

    12GB VRAM is a dealbreaker for me because I play in 4K (LG C1 48"). Currently on a 6800 XT and there's no way I am buying a 12GB card.

    • @paulboyce8537
      @paulboyce8537 4 months ago

      The A770 also can... but it needs the 256-bit bus and 16GB. Battlemage is not going to have what it needs past 1440p.

  • @Aquaquake
    @Aquaquake 4 months ago

    Hopefully they can tackle power consumption of the new cards too. Original Alchemist had ridiculous idle draw, I'm not sure if they ever fixed it.

  • @ze3bar
    @ze3bar 4 months ago +1

    Powerful gpus are awesome but I think a lot of companies producing games need to focus on optimization because no matter how powerful the graphics cards are they won't be able to keep up with a game that isn't optimized.

  • @GraphicdesignforFree
    @GraphicdesignforFree 4 months ago

    Sounds good. Hope they can do it with these specs / prices. I'm interested.

  • @yissnakklives8866
    @yissnakklives8866 4 months ago +1

    Wonder when we'll start seeing this architecture in mini pcs

  • @blaux
    @blaux 4 months ago +10

    The monstrous cache could make up for lower vram

    • @user-bj6mv7ok5q
      @user-bj6mv7ok5q 4 months ago +5

      True but vram has its advantages.

    • @GeneralLee131
      @GeneralLee131 2 months ago

      Nvidia’s gamble was that you could get around the need for so much extra VRAM with a better cache, and they were right. If ARC has such a unique L2 cache architecture, it could excel in unexpected ways.

  • @ksalbrecht88
    @ksalbrecht88 3 months ago

    I bought an ASRock Phantom Gaming Intel Arc A770 16GB card in December 2023. I am very pleased with the performance I am getting from this card. Now I am currently having a single issue that is a bit frustrating. When Intel Arc Control attempts to find an update for the drivers, the software times out while searching, so I am unable to update the drivers without manually deleting them, and then it somehow finds the most current driver update and downloads it. This isn't a huge issue... though it is frustrating. From what I have read in forums, this issue has been happening for a while. Intel also hasn't made it clear, from what I have found, that they have a fix for this.

  • @corbinknight9702
    @corbinknight9702 4 months ago

    Wanted to thank you for the code for windows 10 pro key!

  • @crzyces1693
    @crzyces1693 4 months ago +7

    I really, really hope that I can pick up an upgrade worth buying this year. I grabbed a 3080 FE right when the mining boom ended a year and a half ago for $525 bucks. I'm surprised it hung in there at 1440P as long as it has considering I play almost exclusively AAA single player games. Thank goodness for super sampling tech I suppose.
    Now money isn't the issue, it really is a principle thing *_for me._* When the 30 series launched mining made both the 3K and 6K series from AMD absurd for almost their entire generation, so I stuck with a 5700 non XT for 2 years longer than expected. Now with OLED's appearing like they will finally come down in price sometime between this Summer and next, I'll most likely be picking up a 32" something or other when they are available at 1440P+/144hz+ w/variable refresh rates for $700 or less. Let's talk GPU's though. Pretty much all of the information about GPUs is _"Haha Haa!!! If you want a high-end card you are stuck with Nvidia!"_ Now unfortunately, _"High-End"_ apparently just means 1440P/High/Ultra Textures at 60 FPS with Raytracing on (at least Global Illumination which seems to be more taxing than reflections) which is absurd considering we are 6 years into the technology and have barely been able to move the needle, going from, say a 15 out of 100 with Turing, to 35 out of 100 with Ampere to 45 out of 100 (100 being full raytraced 2 bounce lighting, not balls to the wall 6 bounce gradually diminishing pathtracing with accurate reflections as we'd need to cut the initial numbers in half were that the case) and the prices are a bit ridiculous.
    So what am I hoping for? Either 50% better raster than the 4090 with 75% stronger RT for $2K or less *OR* 4080 Super levels for $600 to get by for another couple of years as just like I can't bring myself to pay $1500-$2000 for the monitor specs I currently want, I simply cannot justify paying over $2000 for a GPU. I can't even justify $1600 for 4090 levels of performance and they haven't been at MSRP since August? If I needed a GPU for my main source of income it would be no big deal, but for gaming? Idk, that's a 2 week vacation in Columbia including spending money. It's a down payment on a $20K car (I have a 2023 Honda HRV and a 2021 Silverado, both paid so I don't need a new _used_ car, that's just an example) or, well you get the point. So $600 bucks for 4070 Ti'ish performance is _almost_ there if the 5090 under-performs (which it might on the gaming side with more and more focus being put on AI) or comes in at, say $2199.99. And yes, $200 extra on a $2000 would have me saying *_"Nope"_* out of the afore mentioned "principle." Just because I can afford something does not make it a good purchase, and I don't want to be part of the problem by helping Nvidia and AMD with their _"Market Leader Monopoly"_ bullshit that the SEC and FTC obviously don't give a shit about. I normally want government to stay out of things outside of fixing the roads and helping the elderly/sick/infirmed/destitute. I love cops but hate when I know more about the Constitution than they do and firmly believe their qualified immunity is almost as ridiculous as their insistence on illegally obtaining ID's under threat of arrest...but in the case of Nvidia it's another matter altogether. GPU tech is to important to let a company controlled by shareholders and a CEO whose net worth is tied directly to their share price to set said prices. It's would be akin to the government locking up all of the auto patents for decades and allowing Ford or GM to set the prices of cars at whatever they felt like. Or not imposing price limits on standard oil and eventually breaking them up and being much firmer with their price controls moving forward. It ended up making the Rockefellers and the majority of other big players involved more money in the end while also making Cars, or in the latter case oil and gasoline much, much more affordable.
    I don't hate Jensen, Lisa Su, Nvidia, AMD or Intel. I just want fair pricing which Nvidia seems incapable of providing while AMD has been content just *_"Slotting their cards in"_* as opposed to saying _"We've got 6 years. We'll take a 30% for 4 generations and if we can't get to 40% market share we'll have to go back to the drawing board."_ Intel seems like they are trying, but they are just so far behind. AMD is too the moment you turn raytracing on in heavy workloads. Ahh well, enough of my bitchathon essay, I just hope we have a great generation for everyone, especially the people who are in the $200-$500 price range, as let's face it, if you look where we were with Turing for $1200 bucks and where we are now, the progress hasn't been all that incredible.

  • @user78405
    @user78405 4 months ago +1

    There are 3 models... 40 Xe, 56 Xe and 80 Xe, which are not gonna be leaked until the reveal. They have all had last-minute architecture changes, introducing bigger cache and different memory options to make them more affordable, while the 80 Xe is in a separate class and tier for other users, for Intel hasn't thought out the price structure yet for a 4080 and 7900 XTX killer... it's more like the 7900 XTX is gonna underperform in everything against something that might cost $600, best part... there is a rumor it has memory bandwidth of 1 TB/s, closer to the 4080's memory bandwidth, due to a full memory bus going from the cut-down 192-bit bus to 384-bit.

  • @roqeyt3566
    @roqeyt3566 4 months ago

    Whoa whoa whoa, adamantine???
    They put that stuff on here?
    I'm really hoping they're actually using it for performance and not for some security feature. That would be very, very interesting for keeping the card relevant in the second half of its life
    Adamantine is ofc slow for cache, as we know it that is. So it'll be more there for keeping the card above 30 fps or something

  • @RealTechnoPanda
    @RealTechnoPanda 4 months ago +1

    Windows 10 OEM keys making a smooth transition to Windows 11 - love it!

  • @ajbauto
    @ajbauto 4 months ago +2

    I will buy the B980
    We need Arc to succeed

  • @nicksterba
    @nicksterba 4 months ago +20

    If these specs end up being true, that would be pretty disappointing to be honest.

    • @user-bj6mv7ok5q
      @user-bj6mv7ok5q 4 months ago +2

      The specs may not reflect real performance; that can be something entirely different, like outperforming NVIDIA.

    • @wjack4728
      @wjack4728 4 months ago +4

      256 to 192 wtf! Disappointing if you ask me. Intel is being sneaky just like the others.

    • @KoItai1
      @KoItai1 4 months ago

      @@wjack4728 Don't you see the 512 MB of Adamantine cache? Or does that not matter to you

  • @Psychx_
    @Psychx_ 4 months ago

    It would be great if there was also a version that comes with a non cut-down die. Drivers still need to improve though.

  • @AnonymousUser-ww6ns
    @AnonymousUser-ww6ns 4 months ago +4

    I don't mind 12GB VRAM GPUs if the price is right. The max I would spend for a GPU that has 12GB on a 192-bit memory bus is $300. Any more than that and I should expect to see 16GB VRAM on a 256-bit memory bus.

    • @AnonymousUser-ww6ns
      @AnonymousUser-ww6ns 4 months ago +3

      Ideally I would try and get 16GB VRAM, knowing that 12GB is already reaching its limit without downgrading settings.

    • @oswaldmosley6179
      @oswaldmosley6179 4 months ago

      Agreed.

    • @prateekpujara5742
      @prateekpujara5742 2 months ago +1

      @@AnonymousUser-ww6ns Actually 8GB is at its limits, 12GB will get you covered for at least 5 years.

    • @AnonymousUser-ww6ns
      @AnonymousUser-ww6ns 2 months ago

      After doing some research, if you stick at 1080p it should be fine for 5 years, but at 1440p more like a couple of years.
      That's why I'm saying 16GB VRAM on a 256-bit memory bus should be the starting point.

    • @AnonymousUser-ww6ns
      @AnonymousUser-ww6ns 2 months ago +1

      In summary:
      1080p max for 5 years: 12GB VRAM
      1440p max for 5 years: 16GB VRAM
      4K max for 5 years: at least 20GB or more VRAM.
      Do not buy 8GB VRAM GPUs unless you are on an extreme budget and you are okay with lowering settings or playing older titles from before 2022.

  • @TeeqPRO
    @TeeqPRO 4 months ago +3

    BROO JUST GOT ARC A770💀💀

    • @AaronDaDon
      @AaronDaDon 3 months ago

      You can always sell it

  • @sauce777
    @sauce777 3 months ago

    A 56% generation jump is really good. A couple more generations, and they could close the gap with team green, and red.

  • @artsergo
    @artsergo 2 months ago

    I wonder, if they release the B980, what kind of power connector it will have: 2x 8-pin, or the new 12-pin connector Nvidia uses now.

  • @PejicVladimir
    @PejicVladimir 4 months ago

    Thanks for the Windows 10 key recommendation!

  • @pullupucci5215
    @pullupucci5215 4 months ago +2

    looks like a typical 20-30% performance gain

  • @bllacksalt
    @bllacksalt 4 months ago

    Thanks! And get well soon!

  • @rustypotatoes
    @rustypotatoes 4 months ago +1

    Hey, I have a question. I have a gtx 1080, and I'm looking to upgrade at some point. I also play vr games, though I'm not sure that matters for which card I get. I've been wanting a 4080 or an a770, from a 1080 do you think the b980 would be a better upgrade for me? 😅

    • @PixTure
      @PixTure 4 months ago +1

      I own an Arc A770 and it's a pretty enjoyable card that still does need some polishing here and there. I really wouldn't get arc yet if you are planning on using it for VR gaming as there isn't really any native driver support for VR. You might be able to get some Meta headsets or WMR headsets to work through Virtual Desktop but headsets that require the use of Direct Display Mode like the Valve Index are not supported. Really hoping Intel releases a driver soon for native VR support when Battlemage releases as they've been pretty quiet with VR as of late. Should wait and see for Battlemage!

    • @rustypotatoes
      @rustypotatoes 4 months ago +1

      @@PixTure I do have a quest 3, however a few years from now I would like to get a pcvr headset, guess I'll have to wait and see

  • @DEA3HAM
    @DEA3HAM 4 months ago

    Great video and thanks for the windows discount!

  • @mrwang420
    @mrwang420 4 months ago +2

    damn. Intel Pulling up hard with the gpus.

  • @BaconGod.
    @BaconGod. 4 months ago +14

    12GB is not fine, 16GB should be standard. It's 2024, we ain't stuck in the year 2012

    • @melonmusk1274
      @melonmusk1274 4 months ago +5

      “Stuck in 2012” the gtx 680 had only 2gb. No gpu in 2012 was 12gb (not including enterprise gpus)

  • @Perry2186
    @Perry2186 4 months ago +1

    what time are we lookin at for an announcement?

  • @rdnowlin1206
    @rdnowlin1206 4 months ago +2

    If Intel's new GPU (battlemage) can match or beat AMD's RX7700xt at $350 - I would buy it!

  • @enragedbacon470
    @enragedbacon470 4 months ago +3

    just passing through enjoying the vram salt from all these people that dont actually buy gpus and just complain

  • @xzaratulx
    @xzaratulx 4 months ago +1

    ngl, these Arc cards look sexy and their specs sound really potent.
    Hope Intel can bring some life into the stagnant GPU market again.
    I am only worried about their drivers, but Intel seems to be catching up.

  • @stopthefomo
    @stopthefomo 4 months ago +1

    wassup my friend, so I assume Arc is the future?

  • @Falco75
    @Falco75 4 months ago

    Seems like a competitive card, good video btw

  • @wjack4728
    @wjack4728 4 months ago

    I wonder if DOSBox (original) will still not work in Windows, like with Alchemist.

  • @TecoProductions
    @TecoProductions 4 months ago +2

    I don't think they cut down

  • @BareFox
    @BareFox 4 months ago

    Thanks for the CD Key site!

  • @TheBann90
    @TheBann90 4 months ago +3

    The leak is still fake since the Intel guy in the PCWorld interview hinted at all the Battlemage cards being given 16+ GB of ram. And here the 40 core version only has 12GB.

  • @FrankValchiria
    @FrankValchiria 4 months ago

    I have the feeling that this will come out too close to Nvidia's new RTX 50 series; this should be out right now

  • @AndyViant
    @AndyViant 4 months ago +1

    If they come in at that price to performance then no wonder Nvidia did that super refresh.

  • @Joeybag0donuts
    @Joeybag0donuts 4 months ago

    So this card is comparable to the 4070ti-4080????? Apologies if I misread but I’m just curious

    • @prateekpujara5742
      @prateekpujara5742 2 months ago

      Even if it just reaches the 4070 while being 200 dollars cheaper, who doesn't want it?

  • @dean4696
    @dean4696 4 months ago

    really hope they release the 980, if its successful enough it would destabilize the entire midrange of this upside down GPU market. Also would be more than willing to spend an extra 50-100 bucks at those specs to retain 16gb of vram. these games use more and more every year, also modded games use an immense amount.

  • @ComputerFix
    @ComputerFix 4 months ago

    Useful information about Windows licenses!

  • @CHT1992
    @CHT1992 4 months ago +10

    12gb is not fine at all for the kind of performance these cards might deliver. 12GB is the minimum acceptable for midrange cards, with low end being 8gb. These cards are more on the medium to high range performance field based on specs.
    What's the point of getting a B980 if it only offers 12gb vram. Card will die quickly. Games are already pushing beyond 8gb vram at 1080p, so push those 1440p or 2160p in and you might just come across problems in memory hungry games.
    Of course this might not be an issue if people rely on upscalers in the future, but the performance this card here offers is way too high currently, for the low amount of vram.
    It should have been 12GB or 16GB for the B970 and then 16GB or 20GB for the B980. They should actually release versions with more VRAM for a higher MSRP. As it is, if they release cards with good MSRP and this kind of performance, I still won't find it interesting to invest in something like this, knowing I will have to be buying a new GPU in 3-4 years because of VRAM issues.
    I bought my 1080ti at launch, back then, games were pushing for 4gb vram usually. Those extra 7GB the 1080ti included made the card survive this long. If it was 8gb, it would have been dead by now as the rtx 3060 would be kicking it into the shadow realm with those 12gb, despite being slightly slower. Same principle applies here, if someone wants to milk this purchase for like a decade, they won't be able to because of that low VRAM. This is no different than buying a ##60 card from nvidia, which will be relevant for like 5 years at best, then it struggles.
    If intel wants to make any sense of this release and be attractive, they need to release 2 variants of each card with different vram options as i described earlier.

    • @anitaremenarova6662
      @anitaremenarova6662 4 months ago +1

      Let's hope they fix this mistake like they did the 8GB A770.

    • @ababcb3005
      @ababcb3005 4 months ago +1

      I wonder why they don't add more VRAM. As I understand it, the price of the memory itself is only like $30 or so. I can see why NVIDIA would be stingy (to entice people to buy datacenter cards for AI), but since Intel is a newcomer, I'd think they would be more inclined to offer as compelling of a deal as they can, within reason. Is there some kind of technical hurdle that makes it much more expensive to support more VRAM than the bare cost of the memory or something?

    • @anitaremenarova6662
      @anitaremenarova6662 4 months ago +1

      @@ababcb3005 No, just greed. AMD is providing sufficient VRAM just fine. So does Intel so far.

    • @UltimateGattai
      @UltimateGattai 4 months ago +1

      Even my 1080 Ti's VRAM is almost maxed out by games like Resident Evil 2 Remake at 1440p. I'd hate to see how much worse it would have been if they put less VRAM on it.

    • @anitaremenarova6662
      @anitaremenarova6662 4 months ago

      @@UltimateGattai Damn, 1080ti is insane to still be able to run 1440p with games released so recently!

  • @Andrew_572
    @Andrew_572 4 months ago +9

    I’m rooting for a full blue build. I’m tired of NVIDIA business practices.

    • @anitaremenarova6662
      @anitaremenarova6662 4 months ago

      What about AMD?

    • @dex4sure361
      @dex4sure361 4 months ago

      @@anitaremenarova6662 amd is trash

    • @user-bj6mv7ok5q
      @user-bj6mv7ok5q 4 months ago

      @@anitaremenarova6662 AMD may need more time to catch up with NVIDIA. Radeon makes good GPUs but their prices are similar to NVIDIA's.

    • @Andrew_572
      @Andrew_572 4 months ago

      @@anitaremenarova6662 They are cool, but DLSS and RTX are a big thing for me.

  • @JaiArah
    @JaiArah 4 months ago

    My only concern is that when this card finally releases, we'll be looking at 50 series cards

  • @digiscream
    @digiscream 20 days ago

    The 12GB downgrade would be ridiculous - the main selling point of the A770 is that it's the cheapest decent card with 16GB RAM. However, that's not the main problem here - Intel have tweaked the firmware after release to peg the minimum fan speed at 30%, which means that even if you hack the control panel to adjust the fan curves (which can't even be done under Linux, because no control panel) _and_ you have a sound-deadened case, the fans audibly ramp up and down constantly every five seconds or so. Fine if you only run your machine for games, but completely unusable if it's your daily driver and you do any work on it. Worse, they've outright said that this is a new design decision to be carried forward to Battlemage, and will not be changed no matter how much users complain.

  • @ApostelKrieger
    @ApostelKrieger 3 months ago

    Have an Arc A770 LE and it's getting faster from driver to driver, insane!

  • @mikeclardy5689
    @mikeclardy5689 4 months ago +1

    Actually, this is throwing major shots in the market. Most people upgrade or aim for whatever Nvidia's xx60/xx70 SKUs are. So targeting the midrange with such an aggressive price will absolutely take market share from AMD and Nvidia. Why pay more for less when you can get more and pay a lot less?

    • @roklaca3138
      @roklaca3138 2 months ago

      Get more for less money? Thats blasphemy for nvidia fanboy shitshow...

  • @envyvaliente2152
    @envyvaliente2152 3 months ago

    that's definitely wild

  • @reapersasmr5483
    @reapersasmr5483 3 months ago

    For that price IT WOULD be amazing, as Nvidia has been doing us dirty on card specs and prices

  • @Cipotalp
    @Cipotalp 4 months ago +2

    I'm disappointed :S I only bought Intel Arc bc of the 16GB RAM.... :( I hope Battlemage will have at least 16GB.....

  • @mencrypted
    @mencrypted 3 months ago

    I own an Intel Arc A750 and I love it! I am buying Battlemage the moment they touch the market!

  • @anthonylipke7754
    @anthonylipke7754 2 months ago

    Cost matters but I feel the 980 seems like it should have 16gb. 12gb on the 970 seems to make sense. No idea if they can do it but it would be nice. The other guys next gen isn't too far away.

  • @TevisC
    @TevisC 4 months ago +16

    12gb is a no go for me.. I have the A770. Was looking forward to Battlemage.
    Don't cut it down Intel... produce a beast..

  • @Antagon666
    @Antagon666 4 months ago

    A stands for alpha test, B stands for beta test.

  • @Backstage_Politician
    @Backstage_Politician 4 months ago +1

    Mid range should be no larger than the lochness monster.... $350

  • @dekzzx
    @dekzzx 4 months ago

    would love to see intel come in with something seriously competitive and shake this market up dramatically.

  • @AnnHiroCh
    @AnnHiroCh 3 months ago

    Man I like how Intel naming scheme wise started at 700 and is now at 900, now we can say the 980 is relevant.

  • @darthpaulx
    @darthpaulx 4 months ago

    So the 970 will be RX7700xt performance and the 980 will be RX7900xt performance for 450?
    If the performance comes true, i believe the price will be 650 instead of 450.
    If it's 450, i go out and buy one the moment it's launched.

  • @thisisashan
    @thisisashan 4 months ago

    There is a scaling issue at Intel it seems like.
    Perf per watt the B970 is extremely competitive, but it just doesn't have the performance that the 4070 Ti does. Not sure why that is mentioned and other cards are blurred. Weird tbh.
    I will say, with the reg 4070 @500 bucks, this looks like a good cheap alternative to a 4070.
    But, other than that, its annoying no one is trying to make graphics cards that compete at the upper tiers.

  • @MasterHero10000
    @MasterHero10000 4 months ago

    Okay so the B980 wants to compete with the 4070Ti correct?

  • @lifebetweenthelines8576
    @lifebetweenthelines8576 4 months ago

    This is the GPU I'm waiting for

  • @Banedraven
    @Banedraven 4 months ago

    The figures are dropping every time I hear something new.

  • @zunexxx
    @zunexxx 4 months ago

    I have my doubts they will make it this efficient....

  • @ktvx.94
    @ktvx.94 4 months ago +2

    Comparing the leaked specs to cards that have been retired and replaced by more powerful and better value versions is borderline misleading, man.

  • @LevelUPGamingTech
    @LevelUPGamingTech 4 months ago

    At least Intel is trying and doing something with their drivers, something especially AMD should be doing. Honestly I thought the A750 was going to stink before this, but it turns out it's actually good for dirt cheap, especially after a bunch of driver updates!
    Thanks for the Win key man 😉

  • @glitchyboyg7409
    @glitchyboyg7409 4 months ago

    Imma buy it probably

  • @romangregor4552
    @romangregor4552 4 months ago

    It's a fair price, but the 512MB of L4 cache, what is L4 cache for?
    If the B980 is good at gaming and has a good driver for Linux, I'll buy it instantly at these prices if it costs between 350-450€ :D

    • @alexturnbackthearmy1907
      @alexturnbackthearmy1907 4 months ago

      I guess they are trying to use it for card optimisation, kinda like optane did with disks?

  • @ADo_Bad_Idea
    @ADo_Bad_Idea 4 months ago

    I really wanna see a 4070 Ti Super performance competitor with 20GB of RAM but about $100-150 cheaper

  • @JackPecker911
    @JackPecker911 4 months ago

    I feel like the new intel stuff won't be faster than 4070 and it will consume a lot more power than the other options, while matching AMD equivalent card in price. That's my prediction

  • @700mobster
    @700mobster 4 months ago +8

    they already messed up if they launch these cards with a sub 200 bit bus, there are already games where low bandwidth makes them nigh unplayable..let alone 12gb of vram

    • @dankmemes3153
      @dankmemes3153 4 months ago +2

      Ok then, give an example of those games. The 980 held up till late 2020 releases for 1080p.

    • @enragedbacon470
      @enragedbacon470 4 months ago +2

      @@dankmemes3153 They can't; none of these people actually know how memory architecture works.. They would buy a GT 710 if it had 16GB of VRAM.

    • @700mobster
      @700mobster 4 months ago

      @enragedbacon470 I have the arc a770 and im not downgrading to 12gb vram.

    • @enragedbacon470
      @enragedbacon470 4 months ago

      @@700mobster thats nice, considering the performance level of the intel A770 would never need 16GB to begin with, its a mid range card, you would hit a gpu performance bottleneck before you hit a vram problem, even if your A770 had 12GB of ram the performance would be the same.

    • @700mobster
      @700mobster 4 months ago

      @enragedbacon470 Not exactly, I play minecraft with a ton of mods and shader packs and have easily passed the 12gb mark in vram. Games like resident evil 4 and hogwarts have also utilised more than 12gb vram on my card. The point being, personally, I won't be getting a battlemage gpu if they downgrade the vram, the memory bus I might be more flexible on cause it does have more variables.

  • @mrwang420
    @mrwang420 4 months ago

    I bet that b980 will mine really good.

  • @azntactical4884
    @azntactical4884 3 months ago

    I like Nvidia cards and was planning on buying one for my new build. But, this time around, I will be waiting for Battlemage.

  • @jackcapella2707
    @jackcapella2707 4 months ago

    I would completely buy the B980

  • @sidburn2385
    @sidburn2385 3 months ago

    I'm just waiting for the B980.

  • @Jpilgrim30
    @Jpilgrim30 3 months ago

    Imagine if Intel is the company to get the GPU industry back in line price wise.

  • @keblin86
    @keblin86 4 months ago

    I will believe it when I see it, but I hope it's true!

  • @lexustech48
    @lexustech48 4 months ago +5

    With their A770 launching with a 16GB option, a Battlemage with anything less is a loss. Frankly I was really looking forward to a solid offering from Intel to take on the 4070 and win decidedly. As an owner of an A770, I really applaud Intel for jumping in and slinging it. The A770 16GB is a fantastic 1080p card for most everything except MSFS 2020. But you absolutely cannot enter round 2 with a neutered VRAM count to fight in the raster 4K arena. It all matters.

    • @G_WOLF
      @G_WOLF 4 months ago +4

      I play on a 1440p monitor with an A770 and on COD I get 110-120 consistently.
      There's no way Intel thinks 12GB is okay. Only 20GB makes sense

    • @MAJ_T_Bagger
      @MAJ_T_Bagger 4 months ago +4

      I've been using my A770 for 4K upscaling at 60fps on a budget TV and it does really well. The only thing letting it down atm for me is that I mainly play Assetto Corsa, which is DX11 native; yet to try the DXVK mod for Assetto but it should help it a lot (see the sketch at the end of this thread). I often see VRAM near cap on most of the games I play too.. 12GB is not enough VRAM for their B980 flagship card

    • @paulcrocker7347
      @paulcrocker7347 4 months ago

      Be interesting to know what your CPU/DRAM spec is, as I am an avid flight simmer and play MSFS and DCS @1440p, on both with loads of payware and FSLTL on MSFS, at 80% all ultra settings and LOD at 140, and I get amazing FPS (30+ in high-load scenery) and no stutters or CTDs.. it's a very CPU-intensive sim..

    • @MAJ_T_Bagger
      @MAJ_T_Bagger 4 months ago

      @paulcrocker7347 from what I've seen msfs isn't great on any arc card still, I'm running an 11600k and 64gb 3600mhz ddr4, ideally need a 12600k or higher to use deep link features for more fps

    • @paulcrocker7347
      @paulcrocker7347 4 months ago +1

      @@MAJ_T_Bagger oh yeah definitely be your cpu Im afraid..Im on 13900k and fast 7000 32gb ram.. also a fast nvme 4tb m.2 drive also makes a huge difference..
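
Since the DXVK drop-in mod is mentioned earlier in this thread: below is a minimal, hypothetical sketch of how DXVK's D3D11-to-Vulkan DLLs are typically dropped next to a DX11 game's 64-bit executable on Windows. The folder paths and DXVK version are placeholders for illustration only, not instructions from the video or the commenters, and the x64/ layout reflects typical DXVK release archives.

import shutil
from pathlib import Path

# Placeholder paths: point these at your extracted DXVK release and your game folder.
dxvk_x64 = Path(r"C:\Downloads\dxvk-2.3\x64")    # x64 DLLs from a DXVK release archive (assumed layout)
game_dir = Path(r"C:\Games\assettocorsa")        # folder containing the game's 64-bit .exe (assumed path)

# DXVK works by replacing the game's D3D11 entry points with Vulkan-backed ones.
for dll in ("d3d11.dll", "dxgi.dll"):
    shutil.copy2(dxvk_x64 / dll, game_dir / dll)
    print(f"copied {dll} -> {game_dir}")

# To revert, delete the two copied DLLs; the game then loads the system D3D11 again.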