I think Intel will save the '9' class for a more performant part, possibly not even this generation. I think your B770 is accurate, but the one you are calling a B980 will actually be a B780. Maybe the 5120 ALUs get shaved down a little.
Not saying anything that comes out of the Intel camp is untrue, but the way things are, they seem really unstable as a company and as a product line. Until the software and hardware hit shelves and get third-party tested, I take everything from them with a grain of salt. I hope they can somehow pull this off; I have an A770 in my computer, and I'd like to think I can replace it with another Intel card at some point.
But these cards are competing with old Nvidia cards. The 50 series is about to drop when Battlemage drops. Intel has some catching up to do, but the pricing is looking good.
The thing is, you guys, these things won't even compete with the 3000 series... It's a wrap. Buy a G102 in a 3070 Ti or 3080; anything RTX 3000 and up is just a BEAST.
So, you want NVIDIA to raise GPU prices even more? With their current market cap, it's evident that their GPUs are going to be even more expensive than they are now. We seriously need competition.
@@PhyteByte Well, I thought you were smart enough to have a little talk. Seems like I was wrong. No wonder why you're STILL rooting for NVIDIA even after they ditched out the very hand that fed them :)
@@swagyolo8602 YouTube has you thinking there's more and more performance to be gained, that things only get better every year. GPUs are the prices they are because they're the best they've made them. Buy a 3000+ series if you need to save. Simple.
... I mean, how much can video cards be hyped up?! It gets absurd after a while. All this hype about next-gen GPUs is a bunch of crap. Let me know when 8K @ 300Hz and 600 fps renders at room temps, full tilt, for under $500.
Yeah, no one cares about Intel GPUs except for brokies. Tell Intel to make a chip with an extensive amount of L3 cache and we might actually buy their chips again.
@@swagyolo8602 An easy touch... and apparently I'm bragging about it. I didn't know that simply stating the obvious was being an easy touch/gullible... but you just want to be here to call people stupid.
@@BroSomeTV You ain't touching anybody here, edgel0rd. Forget about reasonable price tags; just support their greed and brag about it on YouTube. How edgy is that? LMAO. Any greedy corporation would use you with ease. Btw, you got it wrong. I'm here to mock clowns, not to call random people stupid.
Thanks to Vip-cdkdeals for sponsoring this video. 30% off code: GPC20
▬ Windows 10 pro ($17):www.vip-cdkdeals.com/vck/GPC20w10
▬ Windows 11 pro($23):www.vip-cdkdeals.com/vck/gpc20w11
▬ Windows 10 home ($14):www.vip-cdkdeals.com/vck/gpc20wh
▬ Office 2016($28):www.vip-cdkdeals.com/vck/gpc20of16
▬ Office 2019($47):www.vip-cdkdeals.com/vck/gpc20off19
SUPPORT My Work & Get Discord Access: patreon.com/TheDisplayGuy
If INTEL releases these cards at the prices you posted in this video, you can be sure NVIDIA will respond quickly with price drops of their own to take the wind out of INTEL's sails. I guess that's a good thing overall assuming people choose these INTEL cards over NVDIA if both are nearly priced the same.
Hi. Intel Arc prices may be a little cheaper. Nvidia GeForce has done a few things for Linux. DX12 is slow in many things; DX11 or Vulkan is faster in many things. Game devs, or Vulkan devs, or both need to do a few things: Vulkan isn't a selectable option in many games, and Vulkan could be one of the best things for gamers. Maybe some of my information is bad, but I have read that Intel Arc lacks support for some DX12 and DX12.2 features, and a few games and things still don't really work on Linux. DON'T MISUNDERSTAND ME, I do NOT hate them. I have a few of their products, and what I have is good.
Clickbait Thumbnail.
Typical
As always
What is thumbnail
@@jeppehgsted5228 Look at your hand. Unless you are an AI bot.
@@Giovanni-Giorgio i see
"official" specs and performance. "Based on leaks like those from redgaming tech" and "i estimate performance based from these leaks as..."
Sounds official indeed
Love the irony
Very happy Intel didn't give up. The more competition, the better.
I mean, they just do it to increase market share. Once there market share is enough, they will do the same as AMD and Nvidia; that's a fact, because if they went close to the same price as AMD or Nvidia, there GPU market share would tank. So this is just a business move; they don't do it to be your Main.
@@XeqtrM1 Doesn't matter. More competition = better price/ better products. In the end, the consumers are going to benefit from this.
Look at the market cap of Nvidia. It's 22 times that of intel. To hold that ridiculous market cap up their products are going to be even more expensive than the current outrageous price. However, Intel, with its significantly smaller market cap compared to NVIDIA, faces much less earnings pressure from investors and has much greater flexibility in setting product prices. We need intel to step up. It's good for all of us.
This will still not be even close to high-end Nvidia.
Agreed. Although, no surprise, the amount of whinging is off the chain. Idiots never learn. If you loathe Intel...c'mere a minute...over here...ok listen:
"DON'T BUY IT" ... okay? ok. now scram kid ya bother me.
For me, like you said, competition is almost never bad, and this space hasn't seen it since what? Voodoo cards? Or what was that other system I'm forgetting. IDK, S3 or something that started with an "M" I wanna say, without looking it up. Can't remember what S3 was based on. Anyway, whatever, yes. Competition.
@@XeqtrM1 THEIR, NOT THERE.
Price/TDP/FPS
These are the three important factors in GPUs. This new Battlemage range looks like champs already.
Bought an A770 just for the grins, it being the first serious Intel DGPU. It's not what I'm running now, but I may build a second rig when Battlemage comes out. Just for the grins.
Did the same. Haven’t even plugged it in. Whether it’s the beginning of a great thing or ends up Intel’s first failed major attempt at stabbing into the market, it’ll remain a significant product in the history of the market.
@@kenshirogenjuro873 I agree. It tickled my collector bone. Bought the Acer Predator Bifrost version.
I have a 4090 Strix OC and an Arc 770 FE, and I love them both.
@@kenshirogenjuro873 Same. I have an ASRock A770 and a Sparkle one; the Sparkle hasn't even left the box. I've also got an ASRock A380 in my media PC for AV1 encoding and decoding, and that thing RIPS for encoding! If they release the B980, I'll buy one of those too, if nothing else than to support competition within the GPU market.
@@406Steven💥💥💥
Just got an A580 and I'm really happy with it.
Thumbnail creator: "Graphic Design is my Passion!"
I did like the A770, it's in my media PC and I guess a second hand 3070 or a new like 7600XT would've done similar in games, but the a770 LE design was just so clean and I wanted to support the endavour, I was originally just gonna put it on my showcase plank of old / special hardware, but then I figured, instead of buying a gpu for my media PC, just put that one in.
Honestly if my 4090 ever dies, I don't think I'd mind using it as my backup GPU, since lately I'm actually playing a lot of lighter games which could even run on igpus.
the b580 will be the king of budget builds
Intel should release a new industry standard: a unified compute motherboard (with at least 3 years of socket support) that has a GPU socket, so users can expand their GPU memory using CAMM2 modules up to 384- or 512-bit wide, to change NVIDIA's current rules.
I am guessing the current camm2 socket being shown in laptops is only 128bit? Would you have an APU or separate GPU?
@@christophermullins7163 i7-6700 + GTX 1070. I finally turned off the iGPU, because it consumes my main memory for its reserved area, and its hardware decode is not enough for current video streaming websites.
Yeah, I suggested that quite some years ago, though CAMM2 wasn't really a thing back then, so I used some other standard and optimized it for better bandwidth.
However, what Intel should really also push is adding hardware memory compression to GPUs. Essentially, that makes 16GB of normal VRAM behave like way more VRAM, while dedicated hardware keeps latency, energy usage and compute overhead low.
The big deal is that on a GPU, memory compression in most cases actually makes the GPU faster, since GPUs mostly depend on high bandwidth and not so much on low latency (after all, most people game below 10,000 fps or so).
One of the big problems with using CPU RAM for high-performance GPUs is the reduction in bandwidth. There are ways to make it less impactful, but even normal APUs are bandwidth-limited, which is why they are so much faster at higher memory clocks.
That said, if you combined enough CAMM2 modules to reach the bus width you mention, and then added memory compression, you could actually get good enough bandwidth. The memory compression is more or less required, though. Many people think memory compression gives more RAM than is physically installed at the cost of performance, and with CPUs that is mostly the case since they often depend on low latency; but GPUs need high bandwidth, and good hardware memory compression actually greatly increases effective bandwidth (there are cases where it can multiply bandwidth by more than 8 times, even using old compression methods). So on GPUs, memory compression would make that very much possible. Honestly, we are at a point where it makes little sense for GPUs not to add such hardware memory compression, perhaps along with a small direct-access cache buffer in case something actually needs the lowest possible latency, so that no compromise is made.
I had actually hoped Intel was going to do that this generation, since around half a year ago they made a certain reference which seemed to hint at something like it: they referred to revolutionary new memory improvements. It was uncertain whether they would do it, though, and it wasn't even certain they were talking about hardware memory compression; it might just be that I read it that way because I had already been thinking about that option and why it would actually be great.
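The bandwidth argument above comes down to back-of-the-envelope arithmetic. A minimal sketch; all figures here (per-module bandwidth, module count, compression ratios) are illustrative assumptions, not real hardware specs:

```python
# Toy model of how lossless memory compression raises *effective* bandwidth:
# every byte moved over the bus represents `ratio` bytes of logical data.
# All numbers are illustrative assumptions, not real hardware figures.

def effective_bandwidth_gbps(raw_gbps: float, ratio: float) -> float:
    """Logical GB/s delivered when data is compressed `ratio`:1 on the bus."""
    return raw_gbps * ratio

# Hypothetical four-module CAMM2 setup at 120 GB/s per module.
raw = 4 * 120.0  # 480 GB/s of raw bus bandwidth

for ratio in (1.0, 2.0, 8.0):
    print(f"{ratio:>4}:1 compression -> "
          f"{effective_bandwidth_gbps(raw, ratio):.0f} GB/s effective")
```

Under these assumptions, even the modest 2:1 case would put a CAMM2-backed bus near discrete-card GDDR territory, which is the commenter's point; real-world ratios depend heavily on how compressible the working set is.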
@@ted_van_loon I just think that following NVIDIA's rules (selling exclusive GPU circuit boards with fixed memory quantities) will never catch up with NVDA's technology and market size. The AI data centers NVDA sells are not accessible to ordinary users and players. Once GPU motherboards and expandable VRAM appear, NVDA will either give up this market, or jump in and start over according to INTEL's rules. I just want to do this to NVDA.
When I leave a like, Nvidia and AMD will drop a new graphics card, but I want Intel to do that too.
I JUST got an A750 yesterday, upgrading from a GTX 1660 Super... WOW, what a huge performance boost!! LOVING the card! (Using a Ryzen 2700X CPU with 64GB of DDR4.)
"Official"
Proceeds to spout rumors...
*sighs*
Official Specs and Performance - he does not even know how much VRAM the boards will have, and not a single benchmark or game benchmark is provided... Clickbait.
Except it literally does say how much vram the cards will have? Know what you're talking about before commenting bro 💀
@@stabbyman It does, but he literally says he does not even know if they will release the B980 ( 7:57 ). Also, no game benchmark provided.
@@Adreno23421 This is for Xe2, Kinda hard to benchmark a card that will not release till the end of the year. I didn't think this was clickbait because I'm familiar with AMD, Intel and NVIDIA's roadmaps.
If they get the drivers right and the improvements make it at least as reliable as the competition, I will go for this. I can only afford so much, and the B580 looks great if it has 12GB of VRAM (which is the minimum).
10:12 That's not crazy. It's correct. In fact, the 4060 (labeled as a 4070) should have cost $299-$349 at launch.
B980 looks like it will run pretty close to AMD’s expected N48 chip (8700XT/8800XT?) in compute. If the drivers don’t do too badly and the ray tracing continues to be significantly ahead of AMD, it will be a statement product.
I really hope this is the case. $399 is a really reasonable price.
I have a 4080 in my main PC, and paid too much for it I think for what it is.
If Intel release those Battlemage GPUs at that price, I'll be buying the top model for my other PC at that price.
Cheers.
Keep in mind, people, that the percentage increase is for when games DO work as they should, not some benchmark that looks bad because the gen-1 drivers still aren't quite there. The drivers this time don't seem to be an issue, so those increases might mean more than you would expect.
THAT SAID - even if it's just a price-dropped 4070S or 4070 Ti, then awesome; new, those cards are WAY too expensive for their class in terms of name or performance. You are being fleeced, and with Intel, so far, you won't be. That's a win.
(also, when the 5080 comes out at or near $1000, you will rethink things if you don't agree right now)
Why wouldn't Nvidia just keep it at $1200? Anyone else see the market splitting where Nvidia does high end while AMD and Intel take the mid range?
The RTX 5080 will start out at $1500.
Do Intel GPUs have a feature similar to DSR/VSR?
i hope they do add a B380 or something again. They missed the mark last time but I think a lot of us want a good low-end gpu
The B980 might be a strong competitor to the 7800 XT & RTX 4070 if it's $100 less, at $399, but they will still be one generation behind, since RX 8k & RTX 5k are about to launch.
I think there's no way in hell Intel is releasing the B770 at $299 if it really does give us that kind of performance. Beyond that, I'm really hopeful for Intel's Battlemage lineup. A third serious competitor will be good for consumers and innovation, and I plan to support this release regardless of the actual gains.
Battlemage and RDNA 4 should launch later this year and are said to be very competitive in the midrange segment. I don't think drivers would be nearly as big of a problem for Battlemage as they were for Alchemist. If Battlemage can deliver a massive performance increase over Alchemist with good drivers at reasonable prices, I think there is a good chance Intel will gain quite a bit of market share in the discrete GPU market. The flagship RDNA 4 GPU is rumoured to offer 7900 XT levels of raster performance and 4080 RT performance at half the price, which would be fairly impressive. If Intel and AMD can pull this off, hopefully, it will convince Nvidia to pack more processing power into their next gen midrange GPUs for reasonable prices. With strong competition, that definitely could happen.
It was confirmed rdna 4 will come out next year, Amd is losing ground fast.
@@jasonvors1922 AMD is stupid if they have decided to release RDNA 4 next year. RDNA 4 needs to have at least a few powerful midrange GPUs at reasonable prices. Otherwise, it's going to be dead on arrival. Don't forget about Intel, though. I am fairly certain that it's confirmed that Battlemage is releasing this year for both mobile chips and discrete graphics cards. Judging from a lot of the leaks, Battlemage is shaping up to be impressive in terms of price and performance in the midrange segment, and it should be released with much more stable drivers at launch, apparently. Hopefully, Intel can pull this off, because if they can, there is a good chance that Battlemage will be a massive success, especially in the discrete GPU market.
Interesting fact: if the rumored specs and performance are true, each Battlemage core is 50% faster than Nvidia's current Lovelace core (a 4096-core Intel GPU equals a 6144-core Nvidia GPU in performance).
If Intel is able to double the core count to 8192 for the next-generation Celestial GPUs, it might be able to compete with the RTX 5080. Such good times!
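The per-core claim is simple arithmetic and easy to check; a quick sketch, where the core counts are the rumored figures from the comment, not confirmed specs:

```python
# Per-core performance ratio implied by "4096 Intel cores = 6144 Nvidia cores".
intel_cores = 4096
equivalent_nvidia_cores = 6144

per_core_ratio = equivalent_nvidia_cores / intel_cores
print(per_core_ratio)  # 1.5, i.e. each core 50% faster

# Doubling the core count (the Celestial rumor) at the same per-core speed:
celestial_equiv = 2 * intel_cores * per_core_ratio
print(celestial_equiv)  # equivalent to 12288 Lovelace-class cores
```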
That's disappointing. I remember not too long ago, the rumor was the top end battlemage was supposed to compete with the RTX 4080. That seems to be wrong. We need another player in the high end GPU market. We need the competition, the prices are high enough
I was too, I was hoping for something similar to the 4070 Ti, but Intel doesn't seem to be interested in the high-end GPU market. Still, 196% performance increase is pretty substantial. I'd still buy it.
@@gargamel314 That's three times the performance, so I mean, not good enough for ya? :-P
I don't understand you guys. You expected the performance of a 4080/4070 Ti for less than $600? Keep your expectations in check.
@@xBINARYGODx It's only double, 200% means 2x. If the A770 gets 50 fps, the theoretical B980 would get 100 fps
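The disagreement in this thread comes down to how the percentage is read. A quick illustration, using the commenter's example baseline of 50 fps for the A770 (an example number, not a measured result):

```python
# "196%" can mean two different things, which is exactly the dispute above.
a770_fps = 50.0  # commenter's example baseline

# Reading 1: the new card performs AT 196% of the old one (about 2x).
fps_as_fraction_of_old = a770_fps * 1.96       # roughly 98 fps

# Reading 2: the new card is 196% FASTER (a 196% increase, about 3x).
fps_after_increase = a770_fps * (1 + 1.96)     # roughly 148 fps

print(round(fps_as_fraction_of_old, 1), round(fps_after_increase, 1))
```

So "three times" and "only double" are both defensible readings of the same headline number; the difference is whether the 196% is the total or the increase.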
Looks like Battlemage will pass the 4080 in speed/fps, that is.
Are you related to the ‘Moore’s law is dead’ guy?
You can't underestimate Intel. They're too big, know too much, have too much computing experience, have tons of money, and are just badass.
And still screw up .
Just let Raja Koduri do what he does best: design GPU architectures! He did a nice job at AMD on RDNA/RDNA 2!
The GPU manufacturers aren't going to repeat NVIDIA's mistake with the GTX 1080, which is still in use today.
Every GTX 1080 out there is another card not sold, year after year.
Anus excuse for logic.
If that price and performance happen with Intel, get ready for a new era of PC gaming 😮
I feel (this is subject to change) that Intel takes their feature set more seriously than AMD. I'd imagine that changes too.
Battlemage shooting some fireballs at nvidia
Title - It's official !
Content - just more speculation.
Sorry, but the focus should be pinned on stability and compatibility for older DX games natively, without a translation layer!
The latest drivers are using these layers less and less, if at all.
yes Intel!!!
My guess here is a 15% improvement from clock speed and a 15-20% improvement from IPC. The 980 is a 4070 Super.
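Clock and IPC gains compound multiplicatively, so the guess above would imply roughly a 32-38% overall uplift. A minimal sketch, assuming the commenter's estimated percentages:

```python
clock_gain = 1.15                # ~15% from clock speed (the guess above)
for ipc_gain in (1.15, 1.20):    # ~15-20% from IPC
    total = clock_gain * ipc_gain
    print(f"{(total - 1) * 100:.0f}% overall")  # roughly 32% and 38%
```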
$199?? wtf are you on
I just hope intel doesn't cancel it because they run out of money.
We live in a reality where you expect more from Intel GPUs than from AMD GPUs.
What I wanted to see in Battlemage and Celestial ARC would be: implementation in gaming consoles
I mean that’s not really up to Intel.
@@xKB616 but necessary to break AMD monopoly in gaming console chip. Lunar Lake shows a direction forward
@@jimmyjiang3413 maybe Sony will go for it. Intel would definitely undercut AMD to get in there. I don’t believe that there will be a Next Gen home Xbox console. I’m possibly very wrong but I just don’t see another home console from them. Switch/Steam Deck like hybrid handheld? Maybe. If that does happen, maybe Intel could work on something for MS for their handheld like Nvidia does for the Switch.
@@xKB616 what I wanted to see in an Xbox would be a performance hybrid: combining the performance of a typical desktop RTX 4070S and PS5 with the portability and flexibility of the Switch. This should be possible thanks to Intel Lunar Lake's design direction for all chips, including custom chips. Maybe this time Xbox should consider removing Phil and Sarah in favor of someone else, like one of the former PlayStation bosses taking over on an interim basis, before making its own CFO (fiscal) vice-president and deputy CEO at the same time (e.g. David Zaslav from WBD is a better fit in the CFO role). Maybe it needs to implement the Japanese employment practice of lifetime employment without the possibility of layoffs. And relegate cloud gaming and Game Pass to older titles only, so that newer games are sold as console-only exclusives in an initial release window of two years, to mimic the theatrical film release window as much as possible. Maybe this time the Xbox should be sold at north of $2000 for financial sustainability, profitability, and hardware revenue reasons.
Intel has to keep those fabs busy
Does it have H.266 VVC decoding/encoding?
Why doesn't TFLOPS track with performance?
I think Intel will save the '9' class for a more performant part, possibly not even this generation. I think your B770 is accurate, but the one you are calling a B980 will actually be a B780. Maybe the 5120 ALUs get shaved down a little.
Whatever comes, I hope Intel really does well. It could only be good for the GPU market.
The new Intel cards have a lot of promise, but I will definitely not be an early adopter.
Might be Intel's Maxwell moment.
I'm praying that Battlemage gives us great performance. I'm just tired of NVIDIA and AMD.
Not saying anything that comes out of the Intel camp is untrue, but the way things are, they seem really unstable as a company and as a product. So until the software and hardware hit shelves and are third-party tested, I take everything from them with a grain of salt. I hope they can somehow pull this off. I have an A770 in my computer, and I'd like to think I can replace it at some point with another Intel card.
"Only 12gb of Vram" - lol.
Gonna say I have to see the bugs and stuff first before I buy anything.
Honestly I was thinking about a 7900 XTX, but if the B980 is this good then I might roll with it.
🤡🤡
As soon as Intel can do VR, I'll jump ship.
If the B980 is going to cost 400 USD, I am going to give it a try...
2 8pins and 2 6 pins?
Yep clickbait.
But these cards are competing with old NVIDIA cards. The 50 series is about to drop when Battlemage drops. Intel has some catching up to do, but the pricing is looking good.
Whatever makes you sleep at night.
@@renatoramos8834 take a shower 😂
The thing is, you guys, these things won't even compete with the 3000 series... It's a wrap. Buy a G102 in a 3070 Ti or 3080; anything RTX 3000 and up is just a BEAST.
So, you want NVIDIA to raise GPU prices even more? With their current market cap, it's evident that their GPUs are going to be even more expensive than they are now. We seriously need competition.
@@swagyolo8602 all that talk aint your job just buy and play
@@PhyteByte Well, I thought you were smart enough to have a little talk. Seems like I was wrong. No wonder why you're STILL rooting for NVIDIA even after they ditched out the very hand that fed them :)
@@swagyolo8602 TH-cam HAS YOU THINKING THERE MORE AND MORE PERFORMANCE TO BE GAINED, THAT THINGS ONLY GET BETTER EVERY YEAR GPU'S ARE THE PRICES THEY ARE BECAUSE THEY'RE THE BEST THEY'VE MADE THEM BUY 3000+ SERIES IF YOU NEED TO SAVE SIMPLE.
My next GPU will be Intel
Why aren't you comparing the B580 to the A580 that it's replacing?
Want to dislike this video for clickbait but can't, cos I love Intel 😊
Question: does anyone think the 5090 will bottleneck running on an i7-14700K or Ryzen 9 7950X3D?
It depends. If you use 4K and the highest-quality ray tracing, I think the 5090 will not bottleneck those CPUs.
Any CPU out right now will bottleneck the RTX 5090.
@@jasonvors1922 how so?
Old news, I've seen this already.
lol, if it's official, why all the question marks on the spec sheet?
Ifs, buts and candy nuts.....
😂😅😢
... I mean, how much can video cards be hyped up?!? It gets absurd after a while. All this hype about next-gen GPUs is a bunch of crap. Let me know when 8K @ 300 Hz with 600 fps renders at room temps, full tilt, for under $500.
Silicon ain't free, bro. I doubt they can give 2x the silicon for only a $50 increase.
''Battlemage'' top cringe moment
How is this guy still around? He never has facts.
Why the fk do u act like this is Disney 😂
50 series or bust
lol, never coming out at those prices... they are already losing money at current prices. You want those to be cheaper?
They won't have any cards with less than 16GB of RAM, so this is at least semi-fake.
Edit: saw the whole 'leak' now, and it's all 100% fake.
Yeah, no one cares about Intel GPUs except for brokies. Tell Intel to make a chip with an extensive amount of L3 cache and we might actually buy their chips again.
Being an easy touch is nothing to brag about bro.
@@swagyolo8602 what does that have anything to do with what I said
@@BroSomeTV A lot. You should know if you have a working brain.
@@swagyolo8602 an easy touch.. and apparently I’m bragging about it. I didn’t know that simply stating the obvious was being easy to touch/gullible.. but you just want to be here to call people stupid
@@BroSomeTV You ain't touching anybody here, edgel0rd. Forget about reasonable price tags; just support their greed and brag about it on TH-cam. How edgy is that? LMAO. Any corporate greed would use you with ease.
Btw, you got it wrong. I'm here to mock clowns, not to call random people stupid."
Report -- fake information
no new video for 6 months
I think....