Thanks to Vip-cdkdeals for sponsoring this video. 30% off code: GPC20
▬ Windows 10 Pro ($17): www.vip-cdkdeals.com/vck/GPC20w10
▬ Windows 11 Pro ($23): www.vip-cdkdeals.com/vck/gpc20w11
▬ Windows 10 Home ($14): www.vip-cdkdeals.com/vck/gpc20wh
▬ Office 2016 ($28): www.vip-cdkdeals.com/vck/gpc20of16
▬ Office 2019 ($47): www.vip-cdkdeals.com/vck/gpc20off19
Get access to the discord and ICC profiles: patreon.com/TheDisplayGuy
Arc B980 is Insane - Intel Battlemage GPU Leak
Intel is preparing to launch their next-generation Battlemage GPUs very soon, and the latest leaks suggest they will be very good. Nvidia may want to consider dropping their prices, because if the Intel Arc B970 and B980 leaks turn out to be true, these cards will bring excellent price-to-performance.
Source
Intel Battlemage B980 & B970 Leak: th-cam.com/video/ieJThJke6fE/w-d-xo.html
Please don't parrot RGT bull. He is very confused.
4080 equivalent for half the price could shift the whole market down ... let's see if they pull it off.
Sounds too good to be true tbh
If they pull it off, the entire GPU market will go crazy.
NVIDIA is going to do the same thing if Intel and AMD jump in.
@@DainHunter it does indeed... currently Arc cards are also better at RT than AMD's equivalent cards, so if this thing is really at 4080 levels in raster, that puts it at 4070-level RT. That does indeed sound too good to be true. I'll wait and see how things go.
Personally I don't think it will have this effect at all, only because Nvidia have always said that they are a software company that sells GPUs rather than being a GPU manufacturer. They won't feel threatened by this at all... or at least not to the point where they would be quaking in their boots and drop the prices of their cards. Having said that, I would love to see more competition in the GPU market.
@@ProfShikari I mean, if they don't, they just lose all market share under the 4080. Even if it's not a major part of their business, I think they like having control of the GPU market. I also hope there will be more competition.
$450 for a 4080 competitor? I'll believe it when I see it. Still remaining hopeful.
I think it will boil down to how well the drivers work. We already know that their current cards have a lot of performance but often simply cannot deliver that performance onto the screen, due to imperfect drivers. Really hopeful still; I'm sure Intel has made huge advancements already, and the next gen will only be better.
Tbf, by the time it comes out the Nvidia 5000 series will be here, and a 4080 will be equivalent to a 5060 Ti-5070 and will likely be priced competitively with Intel/AMD.
@@OfficialMyMindset I'm not so confident about the performance increases for the 5000 series, we'll see
If anything, this will at least put pressure on Nvidia's and AMD's GPU pricing and profit margins.
@@jsquallys7889 that was a while ago lol, and it was their first GPU... Their drivers are pretty flawless now.
I'm glad there's a third option. I hope they keep getting better.
Hell yeah! A flagship GPU in the 400's is a win for all of us!
This is possibly a disruptor and I'm here for it.
@5 I'm actually on the train though. Sure, it makes me a guinea pig, but if Intel keeps this up, I'll be glad that I and people like me actually bought Arc GPUs instead of cheering them on from a distance.
I won't hesitate to put one in my daughter's computer. She plays mainly Fortnite and Palworld at the moment.
@@stephengrinaker5085 May god help your daughter. I shall pray
Holy crap, intel is getting compelling
If old lizard people from Microsoft and Intel work together, they might make something out of it. That B980, if this price estimate is correct, is pretty tempting.
In rumors and on paper
Take that with a big grain of salt
I don't even care about ray tracing, just gimme 144 fps of cyberpunk at 1440p for $450 :)
@@jamesrock9446 Good news is that Intel is looking better at 1440p and 4K than 1080p, even with Alchemist.
@@jamesrock9446 yeah, ray tracing is nice and all, but as a feature it is still too expensive for the average customer. There's a reason why there hasn't been any game that only used ray tracing for its lighting.
I'm an ARC A770 LE owner.
It's alright. In the games where it works correctly, it works the 16GB of VRAM to their fullest, conquering almost all challenges. Almost. Ghost Recon Breakpoint is among a few games that either run (on minimum settings) in the low 40s or outright crash, depending on which rendering engine (Vulkan or DX12, both have problems) is being used. However, Elden Ring, Armored Core 6, Halo Infinite, GTA V, even non-community-patched New Vegas all run great! Intel truly just needs to work on drivers for specific games.
Personally, I have no regrets, I just miss having the keybinds to easily start recording from ShadowPlay.
Considering that's their first generation, and drivers have already improved so much since release, it's not hard to be hopeful that Intel really will fuck with AMD and Nvidia.
@@TheGeneReyva Intel is rumored to be slimming down its driver stack in anticipation of Battlemage. So we might expect more features and fewer hiccups than last generation.
consider getting OBS (not Streamlabs) just for the sake of recording. it can do “shadowplay” as well, and you have much more customization in terms of what gets recorded and in what quality, and you can set keybinds within OBS as well. just takes a little more effort to set up, but afterwards it's as simple as ShadowPlay and arguably more reliable.
@@computerpwn You make a totally fair point! I was lamenting the fact that such a setup, specifically recording the past minutes of gameplay, isn't available (as far as I can tell?) in first-party software. Other software was always an option, I just hoped when I had bought mine that the first-party software intended for the hardware was 'as functional' as equivalent first-party software from competitors. NVIDIA's Instant Replay was a fantastic and well-integrated part of the NVIDIA experience, even on an old GTX 960.
@@geargrinder6511 OBS can record a set amount of time in the past, based on however long you want it to, just like Nvidia's Instant Replay. just look up “OBS replay buffer” to figure out how to set it up; it works just as well, if not better!
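The replay-buffer idea being discussed here is simple enough to sketch: keep only the last N seconds of captured frames in memory and dump them to disk on a hotkey. A toy Python illustration of the concept, not OBS's actual implementation; the frame source, sizes, and file format are made up:

```python
from collections import deque
import time

class ReplayBuffer:
    """Keep only the most recent `seconds` worth of frames in memory."""
    def __init__(self, seconds: float):
        self.seconds = seconds
        self.frames = deque()  # (timestamp, frame_bytes) pairs

    def push(self, frame: bytes):
        now = time.monotonic()
        self.frames.append((now, frame))
        # Evict everything older than the window, so memory stays bounded
        # no matter how long the game runs.
        while self.frames and now - self.frames[0][0] > self.seconds:
            self.frames.popleft()

    def save(self, path: str):
        # On the "save replay" hotkey, flush the buffered frames to disk.
        with open(path, "wb") as f:
            for _, frame in self.frames:
                f.write(frame)

buf = ReplayBuffer(seconds=120)      # last two minutes, like Instant Replay
for _ in range(1000):
    buf.push(b"\x00" * 1024)         # stand-in for a captured/encoded frame
buf.save("replay.bin")
```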
They better hit their release window, because once the RTX 5000 series starts rolling out, with the 5090 at the end of the year, this and the 4080 are going to be mid-level cards.
If NVIDIA keeps asking insane prices like they did for the whole 40 series lineup it really doesn't matter if Intel only can come close to the 4080. Battlemage will be preferred by the majority of gamers when it comes to affordability in combination with decent performance.
@@00Beowulf00 especially since all Nvidia will do is make it a quad-slot card requiring a huge water cooler, using 4x as much power, so they can run their existing design at higher performance and charge $3k for it.
The problem is they will only sell to YouTubers and such, whereas a mid-range card will sell millions of units.
The RTX 5000 series rolling out right now wouldn't hurt them at all. Nothing available for the next few years would even need the 5000 series. $400 for a 4080-equivalent card or $1800 for a 5000-series... yeah, people who have actually heard of Arc are gonna go with it.
I just really hope you're right. For half price I'm definitely switching! Unfortunately, you're also right that it will largely depend on whether they can finish ironing out the software
Now we are just waiting. Already got my funding secured.
Same
The ALU count doesn't make sense. According to RedGamingTech, there are 16 ALUs per EU and 8 EUs per Xe core, so by dimensional analysis, 16 * 8 = 128 ALUs per Xe core. Which is the same as Alchemist, as it's still 16 * 8, just with the 8 and the 16 swapped. Which means 2560 (128*20) ALUs for the 20 Xe core model and 4096 (128*32) for the 32 Xe core model, i.e. no increase in ALUs between Alchemist and Battlemage.
But he just said RedGamingTech is going back on his previous rumours
@@Gen0cidePTB This is literally his updated leak from yesterday which was the baseline for performance simulation in the video
I'm glad I'm not going mad, seeing things, and someone else saw that massive error.
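To make the arithmetic in this thread concrete, here is a minimal sketch using the rumored configurations (RGT's leaked numbers and the A770's known layout; nothing here is a confirmed Battlemage spec):

```python
def total_alus(xe_cores: int, eus_per_core: int, alus_per_eu: int) -> int:
    return xe_cores * eus_per_core * alus_per_eu

# Alchemist layout: 16 EUs (Xe Vector Engines) per Xe core, 8 ALUs each.
a770 = total_alus(xe_cores=32, eus_per_core=16, alus_per_eu=8)      # 4096

# Rumored Battlemage layout: 8 EUs per Xe core, 16 ALUs each. The factors
# are swapped, so the product per Xe core is still 128.
b_20core = total_alus(xe_cores=20, eus_per_core=8, alus_per_eu=16)  # 2560
b_32core = total_alus(xe_cores=32, eus_per_core=8, alus_per_eu=16)  # 4096

print(a770, b_20core, b_32core)  # 4096 2560 4096 -> no per-core ALU increase
```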
Just getting into the world of PC building myself, I'm very happy we'll have another option for GPUs to play around with. I only hope it can provide actual competition to green and red.
I say wait and see. If they deliver it will be wonderful.
I really hope they pull this off and completely throw the gpu market into chaos. Gpu prices are way too high and need a good competitor to come in and shake it up.
only way a competitor can shake up prices is to actually be competing.. falling way behind is not gonna help at all.
If they could pull this off, it would put a lot of pressure on the RTX 50 series in the mainstream and mid-range. I want the 5070 to offer ~4080 performance at 200 W, but without direct competition it would be expensive.
The time of the AMD CPU / Intel GPU PC has arrived
Them bots goin crazy
you know, i hope intel is successful in their GPU endeavors. more options in the market is always a good thing.
Other than repeating rumors from other sources, what specific calculations did you perform to verify that the cost is reasonable (as you stated in your video)?
RGT parroting. And that channel changes their facts every second week. Click bait.
I expect it will end up somewhere in the middle, much like Alchemist has. Theoretical performance of a 3070, but in reality it ended up bouncing between the 3060 and 3060 Ti depending on the game. This will probably be theoretically close to a 4080 but in reality bounce around a 4070 to 4070 Ti from game to game.
Still, if they can bring that to market at $450 it will absolutely shake up Nvidia and AMD's profiteering, not to mention Arc drivers just keep getting better. By the time this releases hopefully they'll have gotten through the backlog of old games they've been working on lately and they can get back to focusing on new games again.
Exciting indeed. If the B980 comes in below $500, then I am very interested.
If Battlemage comes out in late Q2 or in Q3, those products on the 5nm process will have a short-lived glory with NVIDIA and AMD probably releasing their next gen on 3nm in Q4.
Doubt it, price to performance is all that really matters
Intel needs some exclusive features of their own. Currently Nvidia sports some impressive DLSS and AI features that make their cards attractive. Intel had something great with their Hyper Encode and Hyper Compute features, though they seem to have fallen by the wayside over the past year.
The beginning of the vid looks like you just got up from doing a bump! LMFAO! 🤣
if it's as good as an OG 4070 Ti I'd definitely pick one up. I need a good cheap GPU for a mobile gaming rig. I've also been wanting to play with an Intel GPU to see what all they have going on.
I have a 4080 already, so something that also has 16 gigs of VRAM, with the speed to actually use all of it, is very relevant to me.
Also, I've been on AMD CPUs for a loooooooong time. Gives me an excuse to make a full Intel machine. I'd deal with the innate software problems at those levels of performance. I really like tweaking things anyway. I've been finding myself playing more with the hardware itself rather than what the hardware is for lol.
Also, it's about time I see what Intel CPUs are offering, because the last one I had was a 4-core DDR3 chip. They've gotten new features since then and I'd love to play with them and learn how to tune them up. I've never OC'd, undervolted, or CO'd on their chips. I have no idea how well their boost algorithm works, while in contrast I know pretty much all there is to know about Ryzen tuning and how to cheese its boosting behavior.
Straight to the point: you already gave the specs a few months ago. The A770 was discontinued more than 6 months ago to make room for Battlemage production. The specs would have already been known at that time. The specs are not a moving target after they are locked in place. Every time you make changes you are undoing previous work and putting the product at risk by delaying it. It would also cost a lot to redesign. So no. The original design specs are known to be:
B980: 64 Xe cores / 24GB (16GB) / 256-bit bus
B970: 56 Xe cores / 16GB / 256-bit bus
Mobile: 40 Xe cores / 12GB / 192-bit bus
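Expressed as data, the claimed lineup would look something like this. All of it is rumor from the comment above; the 20 Gbps memory speed is an assumption borrowed from elsewhere in this thread, used only to turn bus width into a rough bandwidth ceiling:

```python
# Rumored Battlemage lineup -- leak numbers, not official specs.
battlemage_rumored = {
    "B980":   {"xe_cores": 64, "vram_gb": 24, "bus_bits": 256},
    "B970":   {"xe_cores": 56, "vram_gb": 16, "bus_bits": 256},
    "Mobile": {"xe_cores": 40, "vram_gb": 12, "bus_bits": 192},
}

for name, spec in battlemage_rumored.items():
    # Bandwidth = bus width in bytes * per-pin data rate; assume a
    # hypothetical 20 Gbps GDDR6 to get GB/s.
    bandwidth = spec["bus_bits"] / 8 * 20
    print(f"{name}: {spec['xe_cores']} Xe cores, "
          f"{spec['vram_gb']}GB, ~{bandwidth:.0f} GB/s")
# B980/B970: ~640 GB/s on a 256-bit bus; Mobile: ~480 GB/s on 192-bit.
```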
man your videos are great, but the bodily animation you incorporate always hits home! Keep it up (two thumbs up!)
Don't know about these leaks. They're completely different from the ones we had before. I'm gonna wait for Intel to tell us in person how good the cards are
Yea, they are leaks, but if they are somewhat close to the numbers we're seeing, it's still a compelling option.
Just click bait.
Battlemage will not be what people are expecting. Intel needs more time for their drivers to mature. We're most likely stuck with nvidia and amd for another few years
Give it a few years and I reckon that Intel will eventually surpass AMD with their GPUs. I have just bought a 7800XT, so I hope I am wrong 🤣
@@ProfShikari i might grab a 7800xt if prices drop
@@ProfShikari With worsening world conditions, the last thing on a person's mind in a few years is gonna be a GPU!
@@ProfShikari The 7800XT is a beast of a card for its price. Should be auto-included in all builds under 1k.
@@newyorktechworld6492 That's just conspiracy theories. Best to mock it until reality hits you in the face. At least that's what people tell me by the way they act. They'll continue trusting the eXpErTs, even though the eXpErTs are consistently wrong and those crazy conspiracy theorists are consistently right.
I don't want to blame Intel for basically not supporting DX9; this is more an issue of keeping old games alive by remastering fabulous games from the past to Vulkan and/or DX12 Ultimate. If these games were remastered, Intel would not struggle with drivers to support old DX games. Bottom line: thumbs up to Intel 👍! Thanks for the effort and information you put into this video! Keep at it! Greetings from Austria 🇦🇹
I hope these cards meet your expectations, because I would really like to see nVidia humbled.
AMD better get on the ball here
Too late, they are shot ducks
I cringe every time I see a YouTuber shill for software keys, because they're all ripping off their fans. So people, check the net, and you may find the place I'm thinking of. I can't tell you outright because I'm already stepping on this guy's toes.
By the time this card comes out, the 50-series cards will be making an appearance.
Now they just need to start ramping up the vram.
Glad intel hasn't given up, maybe in the near future they will supersede AMD.
The wrong CPU hinders the performance a fair bit. Intel is all about pairing with the CPU. The FPS numbers we have seen for the A770 at 4K are not that far off from 1440p, with mid and high settings, even with RT on.
Another point is that you can't compare Intel FPS to FPS directly. For example, the 4070 Ti gives a great result at 1440p, but the 192-bit bus on the 4070 Ti can't do 4K, nor can the 128-bit bus on the RX 7600 XT 16GB it is often compared to. Yes, on paper the 4070 Ti can do twice the FPS of the A770, but it is a stuttery mess and won't give a good experience, while on the A770 most of it is playable, and with RT. I'm just pointing to a card (the 4070 Ti) that is leaps over the RX 7600 XT 16GB to get some perspective.
XeSS XMX is for Intel cards, and XeSS DP4A is the open-source path for other vendors' cards. They are not the same program. Two different programs, and I have no idea why they would name them so similarly and confusingly. XeSS XMX is as good as DLSS, but XeSS DP4A is not. And when tested with an AMD CPU it will not work optimally. This is often missed, and wrong information gets given out.
Understanding the architectures of Intel vs AMD/Nvidia is vital when comparing. Intel pairs with the CPU, splitting the tasks between GPU and CPU and then bringing them together to make a frame with a lot less queuing, which results in very small frame times. AMD/Nvidia do most of the execution on the GPU, and this results in queuing and inconsistency, longer frame times, and broken frames. The result is that you need twice as much FPS, and even then it will stutter from bad frame times. People and reviewers haven't caught up to this yet because there have always been just two vendors with very similar architectures. Hence Intel's PresentMon tool, to tune, spot bottlenecks, and find the best pairing.
The point is that if Arc is level in a comparison, in reality it is hitting a lot higher. You can see it well if you compare 1080p/1440p/4K: the more you demand, the smaller the drop compared to the competition. You can think of it as follows. At 1080p you get 100 FPS, but when you double that you get 200 FPS, which might be the limit for the game, so you are getting the max the game offers. At 1440p you might get 80 FPS, which doubles to 160 FPS, and likewise at 4K the number would be 70 FPS, which doubles to 140 FPS. This is the weird observation everyone can make: when the load gets bigger and more is demanded, the FPS drops less by comparison. The reason I'm talking about "doubling" is that you also have two elements, CPU and GPU, both working hard because of the different architecture. You get one frame, but compared like-for-like you can count it as two. So if the frame limit of the game is 200 FPS, we only see 100 FPS from Intel, which still matches the performance of 200 FPS from other cards. The proof also lies in RT: as you know, RT drops the FPS a lot less on Intel. That number should also be doubled, and suddenly it matches the competition with the extra workload. It is pretty hard to explain, to be honest, but once you figure it out you also see where Arc is actually hitting. It might not be exactly double, but it wouldn't be far from it. Other proof is encoding and the time it takes: again the tasks are split, and Arc gets close to the $1000 cards that are supposed to be twice as fast but in reality are not that far ahead.
As a summary, it all depends on how you review. If you review two different architectures the same way, you get false results that give you the wrong picture. Maybe this was necessary for sales, as there are two CPU manufacturers, to make it look like it performs the same. But now we know better.
There is a lot more to it than these channels let on. Battlemage is based on Alchemist and is very similar for the drivers, so I think the drivers are going to start from Alchemist's level. Then there is APO, which seems to promise 10-50% gains. Let's see what that's about.
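Whatever one makes of the "doubling" theory above, the frame-time point is measurable: tools like Intel's PresentMon log per-frame times, and an average FPS figure can hide stutter that percentile metrics expose. A minimal sketch of that kind of analysis; the frame-time samples are invented, and PresentMon's real output format differs:

```python
import statistics

# Hypothetical per-frame times in milliseconds, as a frame-time logger
# might export: mostly 120 FPS frames with a few big spikes.
frame_times_ms = [8.3] * 95 + [40.0] * 5

avg_fps = 1000 / statistics.mean(frame_times_ms)

# "1% low": the FPS implied by the slowest ~1% of frames. A card can post
# a high average yet feel like a stuttery mess if this number craters.
worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
one_pct_low_fps = 1000 / statistics.mean(worst)

print(f"average: {avg_fps:.0f} FPS, 1% low: {one_pct_low_fps:.0f} FPS")
# -> average: 101 FPS, 1% low: 25 FPS -- smooth on paper, stuttery in play.
```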
I hope they do it.
It would be great for Intel to actually cause prices to drop rather than inflating CPU pricing :p
What's with the goon bots
i wanted to wait for Battlemage to upgrade from the 2080 i got back in 2019, but my 2080 ended up dying of unknown causes. so i picked up an Acer Phantom Gaming A770 16GB for about $280, and all i can say is I wish i had gotten an A770 LE. Excellent card for the price, at least in the games I play and the other software I use
If it sounds too good to be true, it probably is….
We need more competition in the sub-$300 range.
The 4080 is not relevant at its $1200 price point; the Super is $1000.
Based on the results from the last year of Intel GPUs, it's looking like a value beast of a card. Still, I would like to remind everyone not to buy into promises, but to buy the card for what it is at the moment.
I'm rooting for you intel!
Intel's GPU issues are the drivers, which are actually affecting performance. But since the main GPU engineer is an ex-Nvidia one who allowed a lot of overclocking, I can see Intel being competitive with AMD, which in turn will drive AMD prices down even more, which is a huge W for me.
If it comes out before May, I MAY have a flagship GPU in my first PC.
The end of this year is gonna be packed: Battlemage and RX 8000 battling to see which one takes the low-to-mid-end price range, and then there's gonna be Nvidia, which is gonna be in their own kingdom, lol. I'm rooting for these very powerful cards at an extremely good price.
Bots, more like butts.
If those prices are legit, Nvidia and AMD are going to have some problems... ;o
Still failing at basic arithmetic: 32 Xe slices * 8 EU/Xe slice = 256(!) EU. But still 4096 shading units for G31 (256 EU * 16 shading units/EU)...
Clock speeds seem believable.
I have my ASRock Arc A380 hitting a 2.904GHz boost (1/2 VRAM speed x 3) using about 18W, driving dual 4K monitors.
Wait, so A for Alchemist, B for Battlemage... C for Cleric? Bring on those codenames
C for Celestial
It's confirmed to be Celestial, and D has been confirmed as Druid as well.
Celestial--The ArchAngel of GPUs 😇
Uhh, the B970: other than the ALU and boost clock increases, everything else is reduced. That leads me to see it as an underperforming card; new software is more demanding on VRAM. But the B980 is doubled on everything.
I really hope it will be great and disrupt the market.
I do have good hopes for it, seeing how the A series did. I had actually been kind of waiting to upgrade my PC to see what Intel would do, as well as for the next-gen CPUs which support AI better, since honestly that is a new feature in CPUs, and while GPUs can also do it well, there will be many use cases where it matters that the CPU has efficient AI-capable units as well.
also, the outro seems to not really have been kept up to date with the video content: it talks about Intel GPUs and then only shows AMD and Nvidia GPUs when asking people to like the video.
The original leak suggested a 3080 Ti at $350, so this is really good news imo.
If intel does anything close to this with the suspected pricing, I'm 100% converting over to an all intel build
We really do need a performant GPU, priced down from Nvidia's insane levels.
So much hype... I would love to believe this could be true. If Intel does come out with something that can compete with a 4070 at 1/2 to 2/3 of the price, it will be THE CARD of this generation and the most disruptive GPU ever released. Intel will DESTROY both AMD/Nvidia in terms of value, and that is exactly what MANY people have been waiting for. If Intel can pull this off it will be a HUGE COUP in the GPU market, and they could go to 50% market share overnight (or at least within 1 GPU generation). Of course this presupposes:
1: they can actually compete with the 4070 in terms of performance
2: that they are willing to release that card at 50%-66% of the 4070 price...
3: that they produce enough of the cards that shortages don't end up driving up the price because no one has enough of the cards to satisfy demand...
Those are all BIG FUCKING IFs and I'll believe it when I see it... A lot of people are REALLY FUCKING SICK of Nvidia/AMD price gouging/price collusion and would gladly switch to Team Blue if Intel could deliver on something like this, but so far they haven't... The 770 is 6600 levels of performance with a bigger memory buffer and a wider memory bus, and while the memory is great, it doesn't really matter on a card with 6600 levels of performance... If the 970 comes in with the performance of a 4070 at 90% of the price... that isn't going to change anything either... For this to actually be what we're hoping for, Intel has to:
1. Bring out the 970 with performance to match the 4070
2. Offer a minimum of 12GB of VRAM on the card, and not on a cut-down memory bus (16GB on a 256-bit bus would be better)
3. Do this at at least a 30% discount over the 4070 (50% would be better)
If they can do it, they will have THE CARD EVERYONE WANTS TO BUY and they will destroy Nvidia/AMD in terms of value. It will be the most disruptive GPU ever released and a market coup the likes of which has never been seen, and they will sell as many as they can produce, making major inroads into Nvidia/AMD market share... I would like to believe that this is really possible, but I'll actually believe it when it happens; until then this is a pipe dream...
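Putting rough numbers on conditions 2 and 3 above: taking the 4070's $599 launch MSRP as the reference point, the commenter's thresholds work out as below. The MSRP is the real launch price; the discount levels are the comment's hypotheticals:

```python
# Price targets implied by the comment above.
rtx_4070_msrp = 599  # launch MSRP, USD

for discount in (0.30, 0.34, 0.50):  # 30% = minimum, ~1/3, 50% = "better"
    target = rtx_4070_msrp * (1 - discount)
    print(f"{discount:.0%} under the 4070: ~${target:.0f}")
# 30% -> ~$419, 34% -> ~$395, 50% -> ~$300: i.e. by this argument the card
# only "changes everything" in the $300-$420 window, not at 90% of 4070 pricing.
```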
You're forgetting two factors.
1) Battlemage is not a current gen product. By the time it comes out, Nvidia will be on Blackwell and AMD on RDNA4. 4070-like performance sounds great now, but 5060 levels of performance might not sound that great next year.
2) Intel needs to make money, just like its competitors. Even if they understand that pricing needs to be aggressive for Battlemage to succeed, it will never be that disruptive.
@@looks-suspicious A lot of people won't care about Blackwell if Nvidia makes the 5060 $500 USD, especially if it still has 8GB of VRAM, and that is the course Nvidia has been on.
i'd really like one of these cards. looking forward to it.
I really, REALLY hope that Intel will come up with some high-end GPUs at lower prices, so Nvidia and AMD finally drop the prices of their stupidly expensive GPUs. Will Intel be our savior? Only time will tell.
Building a PC right now, hard stuck on whether i want to give the A770 a chance; the price tag is very compelling lol
Remember, the A770 has 16GB of VRAM and a 256-bit bus. They can pull this off if they try.
Very excited to see this INTC GPU come out!
Hmm... one of my PCs:
Asrock PG Arc A770 16GB (PCIe 4)
Asrock PG 4 z690 (PCIe 4)
32 GB DDR4
I5-13400
I do not bench or test, but I think it runs very well. Of course you cannot compare it with an RTX 4090, but it runs everything I want smoothly and without issues. The driver works well. So I'm looking forward to Battlemage.
But as you can see, I'm limited to PCIe 4, and I guess the next GPUs will all use PCIe 5. Hmmm.......🤔
All Intel gaming systems in the near future! I started gaming on an Intel 8086.
If this happens and Nvidia/AMD don't have any competitive product, the gaming community would get the best gift in a decade.
It doesn't need to be $450. That would be fantastic, but $599 would be acceptable, to me at least. If I can get $800 4070 Ti Super performance for $600, that is a win. Hell, coming within 10-15% of a $1000 4080 Super for $650+ is a deal I would take.
No way anyone would buy a $599 Intel GPU with 4070 Super performance... $499 is the best they can get. Literally 0 (ZERO) AI support on Windows...
@@gaav87 Good thing 99.9% of people don't run AI workloads.
@@gaav87 depends on why you're spending the money. I just game, so I want the best baseline performance for the money. I don't typically use frame generation or ray tracing. I want 100+ fps at 1440p high settings. I paid $299 for my A770 and am averaging mid-80s currently.
if intel actually releases a full lineup with inventory on desktop, I'll lose a lot of trust in MLID.
I used to hate Intel down my spine..
but now... GO INTEL!! YOU CAN DO IT!!
i really hope this releases, it will be a game changer
I'd watch the slinging of the Windows keys. YT tried to ban one channel with 500k subs over it, but they rescinded.
I'm not anti cheap keys and I use them, but channels need to be careful. There are too many Karens out there that will tell on you.
The moustache is a bad idea.
To put this into perspective, this GPU will be better than a 3090 raw-performance-wise, with higher specs, for only $450, maybe even less if Intel decides to lower the MSRP again.
So how would this compare with a dell oem 3080?
I love my 6800xt. Looking forward to loving this one too. 😏
I can finally do some 4K gaming on a budget that isn't an AMD PC (AMD is still goated, but a new competitor that is better). And when XeSS 1.4 or 1.5 releases with Intel's frame gen (ExtraSS, since XeSS is the AI upscaling), anyone on a budget can play at 4K. Can't wait for the release.
Four power plug things? (forgot what it's called)
Wow, seems very power hungry maybe?
Yeah, but I thought Intel said they would not be making their own flagship GPUs anymore. So now we will have ugly bulky ones instead of a sleek-looking one like before.
I’m about to start my first build, and if this leak ends up true I’m def going for the B980 as my first GPU.
Remember, everyone: ALUs do not scale linearly, as GC would have you believe. It is very likely that the Arc B980 will be somewhere between 85-110% faster than the Arc A770. It will almost certainly be on par with the RTX 4070 Super, not the 4070 Ti Super or 4080. Granted, it will still be worth it at $450, and I would recommend it to everyone at that price, but let's be realistic.
Also, this dude's living in the clouds... 24Gbps memory... Intel and AMD are stocking up on 20Gbps GDDR6 from Samsung/Micron, and Nvidia is stocking up on GDDR7.
@@gaav87 Yes, he does make a lot of unrealistic claims, but I think he perceives that to be part of his brand. Unfortunately, we can't be rid of this stuff as long as we're watching videos about hardware that has not been officially announced yet. It will always be part of the "culture".
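A back-of-the-envelope version of the non-linear-scaling argument: take the rumored ALU and clock increases over the A770 and discount them, since games never convert extra shaders 1:1 into FPS. The B980 figures are the "doubled on everything" rumor and the efficiency factors are purely illustrative assumptions, not measurements:

```python
# A770 baseline (known) vs. rumored B980 -- leak figures, not confirmed.
a770_alus, a770_clock_ghz = 4096, 2.1
b980_alus, b980_clock_ghz = 8192, 3.0   # "doubled on everything" rumor

raw_ratio = (b980_alus * b980_clock_ghz) / (a770_alus * a770_clock_ghz)
# raw_ratio ~= 2.86, i.e. ~186% faster if scaling were perfect.

# Assumed (hypothetical) scaling efficiencies for real games:
for eff in (0.65, 0.70, 0.75):
    print(f"efficiency {eff:.2f}: ~{(raw_ratio * eff - 1) * 100:.0f}% faster")
# -> ~86%, ~100%, ~114%: these assumptions land roughly in the 85-110%
#    range the comment above suggests, well short of the raw-spec number.
```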
I hope this ends up being real. Tired of nvidia price gouging us all
Intel should keep that big 256-bit bus.
Insane B980
Lol. The 4090 will be relevant far longer than you think. Yes, Nvidia are moving forward with their performance at a breakneck pace, but what they've done with the 4090 is comparable to what they did with the 1080 Ti. It will serve on in gamer setups and workstations for a decade, if not more.
Waiting for arc b950. It's been A WHILE since anyone released a cheap gaming gpu.
Double the ALUs per EU, but half the EUs per Xe core: 8*16*32 = 16*8*32.
If you double something, then halve it, you're back where you started. Is that bit at 1:40 just wrong?
E: NO WAI will the 980 be $450 for those kinds of numbers. That's completely wild. 7600XT price, VERY NEARLY 4080 performance?!
E2: Ah, there's the catch. Even approximately a 4070 Ti for $450 is delicious.
If its ray tracing performance is good enough I can see this rug pull Nvidia pretty hard.
Nvidia still owns the professional and pro-sumer market, plus they have a pretty dedicated consumer following. I'm really rooting for Intel, but, even with a competitive product, market shift will be slow.
I have two desktops and two laptops, all are entirely AMD systems. The laptops were just coincidence, but I built the desktops to specifically NOT use Nvidia or Intel. I disliked the idea of Nvidia because AMD works better with Linux systems, and I disliked Intel because of the poor power efficiency compared to AMD although I am not sure why I cared about that.
With that being said, I have been very excited about Intel's GPUs, and I am absolutely buying one when Battlemage comes out.
Feels like we are expecting too much from Intel. If the 980 really turns out powerful and cheap but has the same issues the 750/770 had/has, people might still opt for something else.
im so glad intel is making gpus
does intel have vr support?
honestly, if they make it 4080-type power for around 700-900 dollars, so just slightly cheaper than Nvidia, I would buy it instantly. I've been dying to hop off the Nvidia wagon with how shady they are, but I don't want to switch to AMD. Intel are shooting themselves in the foot by releasing this around the same time the 5080 will be released though... if leaks are to be trusted, that card is going to be close to a 4090.
Whatever the pricing, I am purchasing x2 of the B980. Come on, release date.
Intel is trying to make a name for themselves in the GPU market. Looking at Arc, they have been taking value GPUs much more seriously than AMD or Nvidia. I can't say what they will do, but if they do what you are suggesting, with plenty of VRAM, then they will have by far the best value 4K+ card. They need to get Meta to list them as supported for VR. This could be like the 1080 Ti: everyone will suddenly think of Intel as a serious contender, as long as the drivers don't screw it all up.
why didn't they match the RTX 4070 -- 200W and better performance?
The day the B980 releases, I will be purchasing it, lol. No question.
$450 for a GPU that competes with an RTX 4080... yeah right, I'll believe it when I see it...
30% less would be more believable
why are you so animated when you talk. you look like a character from a 1920s silent movie
It's a shame we are all hoping intel can reset the gpu market to something more reasonable.
Let's go Intel, give me a 4080 GPU for $300.