Thanks to Vip-cdkdeals for sponsoring this video. 30% off code: GPC20
▬ Windows 10 pro ($17):www.vip-cdkdeals.com/vck/GPC20w10
▬ Windows 11 pro($23):www.vip-cdkdeals.com/vck/gpc20w11
▬ Windows 10 home ($14):www.vip-cdkdeals.com/vck/gpc20wh
▬ Office 2016($28):www.vip-cdkdeals.com/vck/gpc20of16
▬ Office 2019($47):www.vip-cdkdeals.com/vck/gpc20off19
SUPPORT My Work & Get Discord Access: patreon.com/TheDisplayGuy
Intel releasing bad CPUs but being a legitimate competitor in the GPU space is quite the timeline.
On top of also being in such bad shape that it just got delisted from the Dow
Fall from one height, recover on other hills
Their drivers will still suck; they won't be a huge competitor for a while. It will be just like back in the day when AMD GPUs sucked so much due to their drivers.
It’s literally the opposite of what u said lol, Foh.
@@rakeshmalik5385 NVIDIA kicked them off & took their spot. Considering they're the most valuable company in the world… def a little different from what u said, but ok.
At this point gamers just want cheap options, since Nvidia went crazy and AMD is following the same path. If people can get 4070-level performance and 16GB for $300, or a bit more at $350, that's definitely a good deal. We need more options and competition.
@@estacaotech AMD has specifically said that with the next gen GPUs it is not chasing the high or ultra end, specifically targeting mid tier or lower, and most are expecting them to target value to grow market share.
How is AMD following the same path? They stated they will stop trying to compete in the high-end and start to put more effort into the midrange market.
But I would be upset if they stopped production and development.
@@brvndxnl They too raised their prices.
even 4060ti performance for 300 would be better than now
i want the death of $500+ gpus. we need to go back to reasonable sizes and prices.
Good luck with that. If people continue to buy Nvidia's crap cards marketed as low-end, like the RTX 4060/4070, then Nvidia won't give a FK about reducing prices, and as a market giant that owns like 80% of the market, they dictate how high the price will be.
If Battlemage manages to have the same or even better codec support than NVIDIA, has 16GB of VRAM, and is around $300-$400? I don't mind grabbing one
@@JellowGelo they already do good with codecs
My 3080 broke, and I needed a GPU for video editing and my online class. Went for an A770. Does not compete with higher res stuff but WOW, for 300 bucks and less, amazing stuff. Def excited to see Battlemage.
Remains to be seen if their drivers will even work properly, especially for productive work.
@ My Arc has been great in Premiere so far. I have weird issues updating drivers, but it's stable once updated.
@@KevPez-IS Good for you. I use multiple CAD and simulation programs, and I doubt I would be happy with the card. I am sure after a while it could be fine, but honestly I'd rather pay more and get a product that works out of the box right away. That being said, pricing from NVIDIA is nuts.
Alchemist had architectural limitations, which is why, despite its die size, it struggles in games against competitors with smaller die sizes. Currently, Xe2 found in Lunar Lake is very competitive with the HX 370 @@sierraecho884
@@KevPez-IS is there any game or app that doesn't work with it?
Nvidia burned me badly with the 40 Series. They shut out my favorite GPU manufacturer (EVGA), they underpowered the 4060, and they overpriced the 40 Series. Ada Lovelace herself (if she were still with us) would have taken the whole thing as a slap to the face.
Coming to Intel was a radical change, but I have no regrets.
Yeah, coming from the 30 series, which gave us a 2080 Super equivalent at 400 dollars (the 3060 Ti, which is also my current GPU!), to seeing the 40 series make massive leaps in efficiency but with terrible price-to-performance per tier compared to last gen. The 4070 should've been what the 4070 Super is but kept its current MSRP, and the 4080 should've been priced at what the 4070 Ti Super is.
@@OrenSaidAThing My last Nvidia card was the RTX 3060 12GB. Had Nvidia kept their prices down along with keeping 12GB on a 192-bit bus on the 4060, I might have taken the plunge. But it didn't go that way for me.
And I hate them for what they did to laptops, basically making laptop GPUs a scam. In the 3000 series, when they started cutting them down / giving them wrong names, at least the mid-range was OK, but in the 4000 series they completely removed the proper mid-range and only put either low end or high end in laptops. No desktop 4070 or 4060 Ti 16GB, only 4050, 4060, 4060 Ti 8GB, 4070 Ti and 4080, with the last 3 being named 1 tier above the desktop versions.
At least there’s some competition against NVIDIA… those prices are insane
😆😆😆😆😆😆😆
I’m interested in RX8800XT. RTX4080 performance for 500bucks would be great
@@BlindingWulf the thing is Nvidia doesn't see AMD or Intel as competition, so AMD and Intel having GPUs isn't gonna make Nvidia drop prices. And don't forget Nvidia is controlling the stock of GPUs in a way that keeps prices high.
@@hippytree69 it's not up to Nvidia to decide who is competition and who isn't. If other companies start offering similar performance for substantially lower prices, we have great options for GPUs to buy.
@@BlindingWulf Not in the VR space. I wish gamers would just go AMD or Intel.
Intel will have a great chance to grab some GPU market share if they do deliver. I hope the GPUs are smooth!
Drivers are still an issue though. Intel needs 2 more generations to make sure they're stable (just like how AMD struggled in the past)
That is a badass name for a GPU
for real
If Intel releases new cards at this performance per quid, then I'm going Intel to replace my aging 2060S. It would be an absolute no-brainer.
Congrats on sticking with your 2060S for so long. More of us need to follow suit.
@@David-Kynaston I mean at 1080p med-high settings a 2060 is still a good card. No need to chase the frame rates unnecessarily. I only build/upgrade every 4-6 years. I am literally building a $200 PC for my friend with several used parts, and one of them is a 2060 I had laying around from an e-waste bin. I love older parts.
@@David-Kynaston
Thanks, I don't see the point in spending silly money to game.
@@jimdob6528
On my 2060S I was playing RE4 Remake yesterday, high settings with high ray tracing. To make it smooth locked at 1080p60 I just had to set the texture budget to 1GB; that said, I didn't try it at 2GB, I just turned it down on the advice of the settings section. There are only 3 or 4 games you can't play at high at 1080p on a 2060S, and only one or two I'm even slightly interested in. BTW I love e-waste parts, my sound card is ancient and came out of my brother's scrap bin 5 years ago. Sounds great through my hi-fi, more muscular and less clinical than modern cards, and the legacy driver and software is still on Sound Blaster's website.
I think vram should be intel’s calling card. That seems to be everyone’s current gripe. Battlemage should’ve targeted 4070/4070s/4070ti with 16/20/24 gb of vram. Vram is not the part that balloons price out of control.
I don’t know how much difference in gaming vram actually makes but if that’s what people say they want then be the company that gives it to them.
$1100 5080 16gb or
$449 b970 24gb
@@gs-pd5ox No point in having that much VRAM if it's not going to use it. Also the bigger the memory bus the more power the GPU consumes. Even if the top card is around the 4070 Ti, 16GB is more than enough for that performance. The problem is when you have a card like the 4070/Ti that is a capable 1440p card but runs out of VRAM once you start turning on all of the Nvidia features.
@@26Guenter power draw isn't the gripe. Sure an extra 8 gb of vram at the expense of an extra 100 watts is absurd. But 20 - 24gb at 280 watts or just under? I doubt many would complain. I'm sure battlemage will have upgraded features like raytracing and frame gen that could use it.
I just get irritated when something can be done and doesn't. If comments are to be taken as the gospel people want a 4070 ti performing card with 20 gb of vram or more for under $500. If intel could make that happen, they should make that happen. Practicality and necessity be damned. I have a 770 and I have no complaints with it being a first gen card. It didn't sell well for what it is. The best way to get sales is to make what the consumer wants. If people will give you money for something extra that you can logically explain is unnecessary, it isn't really in your best interest to still not do it.
Let NVIDIA make overpriced cards. Let AMD make value cards. Intel should make cards with the features that the consumer says they want. And they say they want more vram.
For the resolution those cards are going for, 12GB of VRAM is already good; 16 would just be extra future-proofing, but usually people buying those cards would be upgrading before they even run into VRAM problems
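To put rough numbers on the bus-width point above: each GDDR6 chip hangs off a 32-bit channel, so bus width largely dictates which VRAM capacities are even possible. A minimal sketch, assuming standard 1GB/2GB GDDR6 modules and nothing about actual Battlemage configurations:

```python
# Rough sketch: how memory bus width constrains VRAM options on a GDDR6 card.
# Module sizes and the clamshell option are generic assumptions, not Battlemage specs.

def vram_options(bus_width_bits: int, module_gb=(1, 2)) -> list[int]:
    """Each GDDR6 chip uses a 32-bit channel; clamshell mounting doubles chip count."""
    channels = bus_width_bits // 32
    normal = [channels * m for m in module_gb]
    clamshell = [2 * c for c in normal]
    return sorted(set(normal + clamshell))

for bus in (128, 192, 256):
    print(f"{bus}-bit bus -> possible VRAM sizes (GB): {vram_options(bus)}")
# 128-bit -> [4, 8, 16], 192-bit -> [6, 12, 24], 256-bit -> [8, 16, 32]
```

So a 20GB card implies something like a 320-bit bus (or a 160-bit clamshell layout), which is part of why VRAM, bus width, and board power end up being traded off together.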
alchemist was totally legitimate and competes with the low end nvidia and mid range amd stuff (3060ish performance on the a770) - WHILE ALSO using WAAAAY less power AND including av1 encoders (which is amazing for some scenarios, and especially on such vastly less expensive gpus). I'm pretty excited to see where battlemage lands... if it's anywhere in the ballpark of 5060, it'll be a WIN.
I just don't get why people like Nvidia for low-end to low-mid GPUs… you cannot even use their upscaling or ray tracing without SERIOUS issues, so why are they so popular? AMD or Intel slaughters them in low-mid grade cards without any actual competition.
@@jimdob6528 honestly, I wish that nvidia would stop making low end gpus entirely and go with a good-better-best approach to their skus (and keep around 1 generation previous a bit longer for the low end). that way they have way less engineering work to do, way less support work, way fewer boxes to print, etc. etc. and can put all that driver and support into providing way more robust cards as well as putting more engineering into the next generation solution to make it that much better. we don't need 4060's and 4050's, we need a 4070, a 4080, and a 4090, as well as a 3070, 3080, and 3090.
They also added GPU virtualization free of charge if you had the correct BIOS.
This gave the ability to share a GPU between multiple VMs.
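For anyone wanting to poke at that: on Linux, SR-IOV-style GPU sharing shows up through sysfs. A hypothetical sketch, assuming the card, BIOS, and kernel driver actually expose SR-IOV; the PCI address is a placeholder, not a guarantee for any specific Arc SKU:

```python
# Hypothetical check for SR-IOV virtual functions on a GPU under Linux.
# The PCI address below is a placeholder; whether an Arc card exposes this
# depends on the specific SKU, firmware, and driver.
from pathlib import Path

gpu = Path("/sys/bus/pci/devices/0000:03:00.0")
total_vfs = gpu / "sriov_totalvfs"   # how many virtual functions the device supports
num_vfs = gpu / "sriov_numvfs"       # how many are currently enabled

if total_vfs.exists():
    print(f"SR-IOV supported, up to {int(total_vfs.read_text())} VFs")
    # Enabling VFs needs root; each VF can then be handed to a VM via VFIO passthrough.
    # num_vfs.write_text("2")
else:
    print("No SR-IOV capability exposed for this device")
```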
@@jimdob6528 what serious issues are you talking about? I don't have any issues with upscaling.
$300 rtx 4070 equivalent is wild, i'll believe it when i see it, if that's the case 1080p will be phased out in a few years.
amd's equivalent will be 399 and will also beat the 4070. would you pay 100 more for better drivers?
@@sturmgewehr449 if you're not being sarcastic I would
Not if games stay unoptimized
@@flamestoyershadowkill Depends if you play AAA or not. The last triple A I played was Space Marine II which works fine on a 3060ti. Other than that, I've not played triple A in years. Almost a decade. They've no story to tell and microtransactions killed the vibe for me.
@@Thrainite If you mostly play indies you don't even need a GPU that is capable of 4070-level rasterization.
that way of starting videos never gets old
Ricardo Milos feels 🕺
No laptop Battlemage? That's not good for Intel, since its CPUs aren't cutting the mustard. That just means that the AMD Halo line will steamroll the Intel laptop lineup.
I think things will be a lot more interesting when we start seeing the stuff MediaTek and nVidia cook up together, and the next generation of the Snapdragon X Elite.
And if the rumors about Sound Wave turn out to be accurate, Zen5 might be the last x86.
All the OEMs will continue to sell Intel/Nvidia laptops.
the battlemages already exist as igpus.. and they are really good
to be fair, they aren't energy efficient enough for laptops, plus they don't have undervolting on their mobile GPUs, unlike Nvidia. I wouldn't be surprised if they have like half the performance per watt compared to Blackwell.
I certainly hope Intel isn't pulling our leg with this; Intel becoming competition for the green and red teams would be a refreshing option! May even consider doing an Intel build? Who knows. The next PC for myself is going to be fun with the options I have, but I don't intend to upgrade for another 4 years. The 6900 XT has been very good to me, I gotta admit. 3.5 years old just about.
We got the real RGB setup.
I have pretty high expectations for battlemage, look at how far the Alchemist cards have come.
The trajectory is looking good.
A.I. and video production software most often uses CUDA - and that is NVIDIA. AMD and INTEL going after budget gamers, while OK, isn't the money maker that will keep them in business. Getting OpenCL into a lot of A.I. and video production products would amplify the value. Right now - CUDA is king.
Budget cards dominate the best sellers list, so intel know what they are doing. Cuda and all that is niche compared to people who just want a card that plays games
Only AI. Intel has Quicksync for video editing.
AI is a market that neither AMD nor Intel have even tried to compete in, and is therefore irrelevant to the discussion.
Stepping stones. Intel will develop this technology, which already shows editing superiority, and eventually incorporate the tech into CPU packages, making add-in cards a thing of the past and creating an Nvid-who? They are already up to mid current-gen Nvidia on their 2nd-gen attempt. Go back to the first Arc driver release versus now, and then compare Nvidia, even the 10 series, to now. Intel absolutely smashes Nvidia in development and technology improvements using just one generation of silicon against Nvidia's 3 gens. Only those subscribing to FOMO are buying Nvidia "ai" or anything "ai".
I agree with @defnotatroll: both Intel and AMD are targeting the mass market for GPUs. Halo products like the 5090 and 5080 probably make up at most 5% of all sales together. From a cost-per-unit perspective, Nvidia will keep pushing prices up because they are shifting focus away from "the casual gamer" to AI compute, where they get way, way more money for a similar product.
Didn't Pat Gelsinger say the discrete Intel GPU line is ramping down? Intel is focusing more on integrated graphics and server GPUs for AI. Battlemage may be the last we will see from Intel for the next 3 years
@@slimjimjimslim5923 next gen is probably cancelled at this point lol but if battlemage sells better than they expected hopefully they will start working on the gen after :'(
@@slimjimjimslim5923 yeah that's probably because Arc GPU sales were poor, but if Battlemage sales are good then they'll definitely continue
@@pchick Celestial was already announced as being in development; Battlemage definitely wasn't canceled
@@Flomockia celestial isn't cancelled but they're developing that with an integrated graphics focus i've heard. the dedicated stuff is closely related to their integrated work. battlemage didn't achieve what they wanted, but since they had already put so much money into developing it they have to sell something to recoup costs. at least that's how i understand it
They may plan to create a killer APU?
Intel just needs to break even on the cards and grab as much market share as possible
our timeline at some point got so fucked that Intel managed to enter the GPU market, actually become a competitor, and then literally collapse its over-a-decade-old CPU empire, causing them to fully rely on their GPUs... in the span of 2 years
Say anything you want about Nvidia, but the fact is they are pushing forward despite almost total domination of the market. Meanwhile Intel got complacent and gave AMD the chance to come back. Nvidia raised prices to ridiculous levels, but you know the competition followed their price hike despite always being inferior.
So the top-end GPU from Intel will be on the same level as a low-end GPU from Nvidia. I wonder if it will be better than a 5060.
5:03 I'm surprised by Intel. Imagine a card with 20GB VRAM for $350 and 4070 performance ☠️☠️ as a 3D modeling and animation guy I see this as an ABSOLUTE WIN
if it does what you are saying, it's a win simply because it pushes Nvidia to reduce their stupid prices
I've had an A750 since March and mainly use it for Blender on Ubuntu. It was $200 new when no one really wanted it. In my case the drivers are solid. If the price is right for the B970 I'll build another computer around it like I did for the A750.
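If anyone wants to try the same setup, here's a minimal sketch of pointing Cycles at an Arc card from Blender's scripting tab; this assumes a Blender build with oneAPI support (3.6+/4.x) and Intel's compute runtime installed:

```python
# Run inside Blender's Python console / scripting tab (uses Blender's bundled bpy).
# Assumes a Blender build with oneAPI Cycles support and the Intel compute runtime.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "ONEAPI"      # select the oneAPI backend used by Intel Arc
prefs.get_devices()                       # refresh the detected device list

for dev in prefs.devices:
    dev.use = (dev.type == "ONEAPI")      # enable the Arc GPU(s), leave CPU fallback off
    print(dev.name, dev.type, "enabled" if dev.use else "disabled")

bpy.context.scene.cycles.device = "GPU"   # render the current scene on the GPU
```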
I'm doing just fine with my RTX 3080 Ti. Given the price they charged me for it, I swore I wouldn't buy another for 5 years minimum. That's June 3rd, 2026. Never again will I be buying new parts at the rate I used to. Hopefully AMD will continue to improve. I'm over both Intel and Nvidia.
I'm with you. My 3090 Ti/24GB is doing just fine for MSFS. I bought it 3 years ago, and won't buy another GPU for a while.
I might make an exception for an Intel offering that equated to 4090 performance for $800. I can sell my 3090 Ti for that much.
Had a couple versions of the A770 and performance was really good when it worked. However, stability caused major issues. The drivers just aren't in a good place yet.
What do you mean? I've been using an A770 with
an AMD CPU and Intel GPU, and the last problem with drivers was 3 months ago
And usually an update fixes everything
Popular games work perfectly
Spiderman would constantly crash on me
@matgaw123 if it works for your uses, then that is awesome
@@nosajmartinez6175 oh
I would like to see Intel produce more than the few generations they have planned. Competition is very much needed in the GPU space
I hope intel doesn't abandon this market.
This generation has shown us that AMD and Nvidia have no problem stagnating the market. So much GPU regression (7600 XT barely faster than the 6600 XT, the disgusting 4060 Ti 16GB... you get the point)
Intel will/should keep making GPUs. They should just merge the server, business and consumer lines to focus drivers on two cards at A770 and A380 tiers (same chips with mobile power envelopes for embedded) with their Flex GPU VFIO/SR-IOV and ray tracing tech. This would densify the value of their cards and free up sales/marketing resources to expand the feature set, like adding VFIO/SR-IOV over Ethernet (like EtherExp). It should allow them to pierce the veil of the GPU market, going tall on hardware and wide on software/usability rather than wide on hardware and deadlocked on momentum for software to support every iteration. A deep lake vs a desert-sized dew or fog.
I'll bet that if Intel's GPUs are as powerful as a 4070 at the lowest spec, keep getting better similarly to this gen, and they continue this badass naming scheme, then they will continue producing GPUs
Are you saying the B-970 (sounds like a new Star Wars droid name) will be $1100, as that's half of the rumoured 5090 price, whilst also saying it will be 56% faster than a 5090???
Since Pat officially killed the Arc development going forward, I don't think we will see Celestial or Druid, regardless if Battlemage is good or not.
Do Intel GPU drivers have something similar to Nvidia's DSR to allow games to render at 4k and then downscale to 1080p displays for perfect anti-aliasing, for example?
How many years of support can we expect on Arc cards? I want to build a future-proof PC but I'm a bit scared that out of nowhere Intel will end driver support
Just get an AMD card. Intel is too big of a risk right now and Nvidia is sh*tting on the consumer.
Current generation cards are fine, but I would advise waiting on the RX 8000 lineup, since it will offer similar performance to what we have from AMD currently, but with improved technology and hopefully better efficiency too. So something like an 8800 or 8800 XT would be the ideal purchase for people like you and me.
If you want future-proof, buy a used high-end system from FB Marketplace or something. I bought a 7800X3D/4090 system for only $2200 and it's been a crazy upgrade coming from my old 970 system
At least 5 years if they stop making GPUs, and probably up to 10 if they continue.
@@PetruBolocan let's hope so
since Arc continues to exist as iGPUs I wouldn't worry
Thank you for this video! Really excited with the Battlemage GPU pricing, as these prices seem to be more realistic than Ngreedia's ridiculous pricing for their GPUs.
Honestly, even though I am not holding my breath, I would not mind Intel taking a step back with CPUs to get GPUs on track. If they do this and can then get back to good CPU stability, we could see Nvidia make CPUs, and if that happens we have a 3-way race for both component types. That makes a better competitive market. I'm hoping. But not optimistic.
Aren't there rumors that Intel is done with the GPU market? I've seen companies use the "trust me bro, the future upgrade is coming" line, but never "well, maybe we'll just stop, idk." It's a bold strategy, Cotton. Let's see if it pays off for 'em
I missed seeing that whiplash intro.
Hopefully they have a successful launch with this line of products so they will continue to develop GPUs. Intel have said that after this generation GPUs will no longer be a focus of the company, which sucks, but big sales might convince them otherwise
I am very keen to see how these Battlemage GPUs play out, but Intel need to be on their 'A game' with keeping the drivers updated as much as Nvidia, otherwise they will fall behind simply because people won't buy hardware that doesn't have the software/drivers to support it.
These have been ready to launch for at least 3 months now, if not longer. If Intel doesn't figure out their drivers this time around, there's no hope for their GPU division.
Intel releasing a 16GB $400 Arc B890 with the speed of an RTX 4070 Ti SUPER would be great. Basically a 4070 Ti SUPER at half the price. I think it would be great.
I've been super happy with my A770 and would love a more powerful option. Imagine how compelling the flagship would be if it came out with 20 or 24 GB of VRAM for AI workloads.
If these drop with those prices and that much performance then it's gonna be pretty good competition for more budget builds, especially on the lower end with that b550
I've been wanting to upgrade from my 2070S for a while now, but the GPU market is so confusing to me
If Intel manages to bring gpus at reasonable prices then they might disrupt the market because gpu prices these days are just absurd. I see an opportunity for both consumers and Intel. Hope Intel delivers on the value.
Who cares? People buy an RTX 4060/Ti for $500-600 instead of an RX 7700 XT/7800 XT that has way more performance, even in RT. Who cares about DLSS when your card is dead on arrival?
"justs behind" a 4070? The B770 is almost 30% slower accord9ing to these charts. How is 21.5 "just behind" 29.1?
Raw power-wise, would the B970 challenge the 7900 XTX?
btw the 4070 has 32MB of L2... and that's not anything close to it. And how is 24.6 TF faster than 29.1 TF?
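On the TFLOPs confusion here: FP32 TFLOPs is just shader count × clock × 2 (one fused multiply-add per clock), and it's a theoretical peak, so cards from different architectures don't have to rank the same way in games. A quick sketch; the 4070 figures are its public specs, while the Battlemage line is a hypothetical 32 Xe-core part at an assumed 3.0 GHz, not a confirmed spec:

```python
# FP32 TFLOPs = shaders * boost clock (GHz) * 2 (FMA counts as two ops) / 1000.
# This is a theoretical peak; games extract very different utilization per architecture.

def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * boost_ghz * 2 / 1000

print(f"RTX 4070 (5888 shaders @ ~2.48 GHz): {fp32_tflops(5888, 2.475):.1f} TFLOPs")   # ~29.1
# Hypothetical 32 Xe-core Battlemage: 32 * 128 = 4096 ALUs at an assumed 3.0 GHz clock
print(f"Hypothetical B970 (4096 ALUs @ 3.0 GHz): {fp32_tflops(4096, 3.0):.1f} TFLOPs")  # ~24.6
```

So a lower TFLOPs figure can still land "just behind" in game benchmarks if the architecture gets more real work done per theoretical FLOP.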
Even if Battlemage really delivers, I may wait for their next generation. I currently have a 4060 in my first PC and I'm leaning toward the 5070 with GDDR7, but if the 8800 XT is more energy efficient I'll hop to AMD, so I really hope Intel ups their game.
I'd be 100% interested in giving Intel some business. But they are going to at least have to offer me 1080-1440p at 100fps minimum for 500 bucks or less or I'm not interested. I mean I just got a 7900xtx for 790 bucks after taxes at the beginning of the year. That's a hard value to beat, it does all that at 4k without even maxing out power. The G31 looks interesting 256bit and some decent memory if the cores are fast that would be decent in games.
With those prices and better drivers at launch, it could get interesting in the low-to-midrange.
I am curious about the actual picture quality: anti-aliasing, anisotropic filtering and vertical sync on an Intel GPU. Never used one before.
Amd taking over the gpu market (specifically in consoles), while Intel falls off with their CPUs but starts making banger GPUs???
And Nvidia giving up on competing and just charging a kidney for their GPUs???
What kind of timeline is this bruh
😭😭😭
Dang, this looks surprisingly good. Now even I'm looking forward to this.
Does it do ray tracing?
Higher end gpu will be 900
Lower high end gpu will be 700
Mid range gpu will be 500
Budget gpu will be 300
Same as i9,i7,i5,i3
I think
There's rumours that Intel is canning the whole thing after celestial. Qualcomm is rumoured to be considering starting video cards.
I half hope it'll be amazing price to performance and half don't because I just bought a 4070 Super. Amazing GPU at least and I paid $540 for it.
Honestly, right now, I just want a successor for the *GT 1030*
It wasn't great for the time, but in hindsight, and considering how GPU technology scales, having a low-end graphics card that costs less than $100 and uses only 30W, while delivering similar performance to the fastest integrated GPUs, would be extremely useful. Right now, the best processors with integrated GPUs are held back by their low cache capacity. In esports and strategy games where you don't need a high-end graphics card and a decent integrated GPU is good enough, CPUs with more cache still give much better performance. The best integrated GPUs are only available in processors with 16MB L3 cache, which is absolutely awful.
Either Intel, AMD, or Nvidia should release a new low-end graphics card that doesn't suck, or AMD should release a budget APU with a decent amount of cache (retro games don't need 8 cores, but they will still benefit from more cache! Just give us an APU with a good integrated GPU, 4 cores, and 32MB L3!), or Intel should release a desktop CPU with a good integrated GPU (a lot of their laptop iGPUs are pretty good, but they aren't available on desktop CPUs, and their fastest desktop iGPUs are still only available on midrange and high-end CPUs with more cores than a typical iGPU user needs)
the fastest igpu compares to a 4070. they ain't gonna make a 30 watt 4070
the things you said literally exist: a 5700X3D exists, an RX 6600 exists, a used 3060 exists, an A770 exists.
Damn Intel... I want a B970! 💸💸💸
Pricing will matter. I'm a bit skeptical that they will perform well AND be priced to out compete AMD and Nvidia.
I put an A750 in my second PC. Combined with a Ryzen 5600, its performance is close to my 3060 Ti combined with a Core i5 10600K.
So I'm happy with it, especially since it cost 200 euros.
You should change the ending to say "... amd, Nvidia and Intel release new gpu's"
As a 1080ti user, I won't buy a card that is only 30% better than mine. And I won't buy a card that gives out 200 extra watts of heat either (ruling out 4080 or 5080) - so I guess I'll wait for 2026 with my upgrade.
For anyone who may not know, top-end Battlemage was completely scrapped. There was supposed to be another GPU tier, like how the CPUs go i3/i5/i7/i9, but the next step up from the A770 was cancelled, and Celestial is going to be soft-launched with very few GPUs. All info comes from Moore's Law Is Dead. Sucks because I wanted the top-end card. I'm loving Intel Arc with my A770, I just wish they had a higher-end SKU.
From what I understand originally there were plans for a 40 Xe version (15-25% faster) that would have competed with the RTX 4080. This has been cancelled. GPUs were cut in size possibly to reduce cost & risk.
It would need to have 4080 levels of performance for me to ditch nVidia, though I'm thinking it's going to be more like a 4070. Having 4070 levels of performance with 16GB of Vram is a good proposition if it's under $400, but it's not powerful enough for me to make the switch. Let's see what happens.
Didn't Intel promise something like this for their first gen cards too? And then they drastically underperformed?
AMD is getting competition while Nvidia is being the smirking last boss.
I was caught between the 4070 Ti Super & 4080 Super, but they are too expensive to justify, so I finally settled on the 4070 Super, as the 7800 XT isn't quite as good in my opinion... Now, 45 minutes ago, you drop this video stating the flagship is between the 4070 & 4070 Ti but £150-£200 cheaper... FFS... & it's dropping in the next few weeks
Choices Choices Choices
I did the same thing. I really wanted a 4080 Super, but at $1,149.00 for an ASUS TUF version (my favorite; I currently have the 4070 Super model), I'm waiting on a huge price drop, or I'll just wait on the next generation of cards.
The 7800 XT is as good as the 4070 Super, but okay, maybe try to inform yourself a little bit more?
One year after the 7800 XT's release, Nvidia sold the 4070 Super with 12GB of VRAM for $700-800, meanwhile the RX 7800 XT costs around $400-500 and has equal performance; depending on the game one or the other wins, but it has 16GB of VRAM and that will help in the future.
Even the RTX 4070 Ti would be a better deal than the 4070 Super, but okay, you do you.
@@allxtend4005 You've said nothing here; the same argument could be made for the 4070, it's better depending on the game & better if you want to use it for productivity - I can tell you're a fangirl - life goes beyond gaming; you'll realise this when you inform yourself a little more! But you do you
@@allxtend4005 oh s#!t Sherlock
Everyone talks mad about amd drivers but what about intel? They are good?
Hopefully it takes off and we get legitimate competition for Nvidia. They have been price gouging us for way too long
I have been staring at this bushy-ass mustache for a year waiting! I want a gold star.
So the 970 will probably be a little faster than the 5060? Would love to be able to justify this some other way beyond "if they fail too much they'll quit"
So Intel will smash the RTX 5090 for half the price?
whats the point of next gen gpus that are as good as last gen gpus?
They're offering better price to performance.
No chance that Battlemage, while having lower TFLOPs than NV, has better game perf than NV. As I see it, top Battlemage will be slightly slower than the 4070 but have an awesome price, so if the drivers are good then it will be a good low-end and lower-mid generation
How did performance expectations for Battlemage come down from the 4080 to the 4070? Don't forget the 5060 Ti should match 4070 performance at $400. So there won't be much that Intel will be offering except maybe more VRAM and a slightly better price. However, if Battlemage could bring 4080 performance at $400, that would be competitive, especially if Nvidia is selling the 5070 at $600. This could even force Nvidia to lower their price to $500.
Shintel will be at 4060 Ti perf, not 4070. Just checking TFLOPs. How is 25 more than 29?!?! Again, complete garbage in gaming; for content it should be OK
+50% perf is not really exciting imho (RTX 4070 perf at most). Around +75% would have been though.
Intel doing side quests instead of the main quest
if you know Intel, they will build an entire ecosystem: an Intel Core Ultra 9 285 and a Battlemage GPU - a productivity beast and good enough for gaming... Intel is gonna mop the floor with AMD
I want a Battlemage, not for gaming but for running a lot of work displays at 4K for trading.
Hope this is at least half true. It depends on software too, and price, but for once I'll be cheering for Intel.
Nvidia is accepting body parts for the 5000 series.
4070 performance? The 5070 should be out soon. Why does Intel keep targeting the prior generation? This is planning to fail.
56% faster than 8800 Ultra?
Wow so fast
It's too bad Intel is gonna cancel discrete GPUs after Battlemage. I own an Arc card and I like the little thing.
Looking at how expensive the Nvidia 50 series is going to be, I might consider AMD or probably this. Nvidia is getting too greedy because they didn't have much competition for too long.
I have no confidence in Intel. I will be pleasantly surprised if this is true, however, I definitely do not want to be a Beta tester on this product. I will wait at least 6 months before deciding. Intel has disappointed me time and time again.
I'm more keen on APUs... my current desktop rig will be the last desktop PC I'll build, as here in Australia building PCs has gotten too expensive. I'm waiting for mini PCs to come with high-performance Strix Halo APUs; that will be cheaper than building a whole rig, which generates more power and heat.
I agree, Another Aussie here and I'm heading the same way, I say that as an enthusiast as well.
Sick of these rip off prices, Been building pc's for 30+ years and it was fine until crypto destroyed PC DIY.
Am on a 4080, So I'm good for many years yet but when that no longer cuts it, I'm hoping APU's will be all I need.
I don't care for 4k, Actually downgraded from 4k back to 1440p, If we can get an APU that can handle 1440p 60+ fps in any game including AAA then that's good enough for me.
@@ShaneMcGrath. Same here bro, I game at 1440p and couldn't care less about ray tracing or 4K, as the performance isn't worth the hit for the higher res. If anything I'd drop to 1080p. Mini PCs are the future, as is handheld gaming. I own an ALLY X, great device. I'm in my 40s now and building PCs is out of reach for me now.
@@ShaneMcGrath. What hurts us are the GPU pricing.
I've had a mini PC with a discrete Radeon 6600M 8GB for almost two years now, and it plays games just fine at 1080p/60fps. It cost me $630, uses around 180W, practically silent and runs at 70-75C top, and it doesn't have RGB, a glass panel, or any of that dross DIY PCs seem to be plagued by.
@@udasai Oh nice, with the upcoming MINI PCs with 32GB of RAM, high power APUs and storage i'm defo going to get a MINI PC once my desktop dies
It would be upsetting if Intel stopped producing/development.
time to root for Intel, even though I'm in the market for the high-end GPUs, which I keep for 5 years
they should have focused on a tiled gpu to beat amd/nvidia in the consumer market...
Should we care about PCIe 5.0?? Hardly any motherboards have that.
A 7800xt is 200 bucks more and im anticipating that the B970 will beat it
Just in power consumption 😂
I thought they had to shelve these with their financial difficulties....
But for them to be good they will need to make sure their drivers work way better than their prior ones.
The drivers alone set them back double digits and made them unusable for older games... destroying the initial release.
Releasing on a shoestring budget with only 2 SKUs to clear inventory... never something you want to hear about a product you're buying.
Drivers are pretty much fixed for the majority of games now; it's really the old games where you'll notice issues, but only the ones that don't support OpenGL
I am embarrassed for Intel. How hard could it possibly be to put 32GB or even 24GB into an Arc GPU. Intel could steal all of the AI mindshare it wants by providing modern, low cost, high VRAM cards to consumers. There is no downside. Every open source AI project would immediately have volunteers for compatibility development. If Intel made an A770 with 48GB of VRAM at $1000, it couldn't possibly make enough cards to keep them on shelves.
if it was so "easy" then AMD would have already done it.
"By this point why didn't Intel just release CPU's with 1024 cores, they would have easily beat AMD" 🤡🤡🤡
@@KenpachiAjax AMD seems to have the marketing IQ of an ape when it comes to GPUs.
32 and even 24GB are pointless on a card that isn't capable of utilizing it.
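For what it's worth, the VRAM-for-AI argument a few comments up is easy to put rough numbers on: model weights alone need roughly parameter count × bytes per parameter, before activations and KV cache. A back-of-the-envelope sketch with illustrative model sizes, not tied to any particular Arc SKU:

```python
# Rough VRAM needed just for model weights: parameters (billions) * bytes per parameter.
# Illustrative only; real usage adds activations, KV cache, and framework overhead.

BYTES_PER_PARAM = {"fp16": 2, "int8": 1, "int4": 0.5}

def weight_gb(params_billion: float, dtype: str) -> float:
    return params_billion * BYTES_PER_PARAM[dtype]  # billions of params * bytes = GB

for model_b in (7, 13, 34, 70):
    sizes = ", ".join(f"{d}: {weight_gb(model_b, d):.1f} GB" for d in BYTES_PER_PARAM)
    print(f"{model_b}B params -> {sizes}")
# A 13B model is ~26 GB in fp16 but ~6.5 GB at int4, which is where 24-48 GB
# consumer cards would start to look genuinely useful for local AI work.
```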
Drivers?
Quite decent I'd say, just avoid games that are old and don't support OpenGL