I'm worried about Intel's Battlemage GPUs
- Published 11 Oct 2024
Support me on Patreon: / coreteks
Buy a mug: teespring.com/...
My channel on Odysee: odysee.com/@co...
Follow me on Twitter: / coreteks
Footage from various sources, including official YouTube channels from AMD, Intel, Nvidia, TSMC, etc., as well as other creators, is used for educational purposes in a transformative manner. If you'd like to be credited, please contact me.
#battlemage #intel #gpu
The audio quality is not good.
It's perfect.
It's definitely not as good as it normally is, I have to agree, but idc, Coreteks is great.
He sounds like a 78-year-old.
Not as good as in the past: th-cam.com/video/BbFmCwBymkU/w-d-xo.html
The background music is a weird pick and causes issues with the main audio
As a daily Arc user... I won't lie, all the current rumors are a bit gut-hitting... yet I will still buy it, for 2 reasons:
1 - I want to be active in proving that we need a 3rd competitor and that it is not that bad.
2 - The games I play already run great on my A770 on a 1440p screen.
People should not reward a company just for making the product, only if the product is actually good. I remember people buying Bulldozer when Intel dominated, and they said the same. But this does not promote or help competition; it just promotes lackluster products that are sold at very thin margins anyway.
Bailout-like behavior
@@Raivo_Kit It gave them enough capital to create Ryzen 1000 and recover, didn't it? AMD was a few months away from bankruptcy, and if they had failed we would be stuck with an Intel monopoly today, and 4-core CPUs...
@@MrBlacksight I very much doubt that Bulldozer and a few die-hard buyers kept them afloat. What saved them was the console business and hiring Jim Keller to design Zen from the ground up. They have said so themselves.
Hear, hear. I also run an A770 16GB to support a 3rd player, after the green and red teams decided to throw us gamers under any passing bus.
Compressing the audio has been ruining your videos for the past 2 years now. Today it got to the point where, at 10% volume on my IEMs, you are SCREAMING at me the whole time. But as usual this comment gets buried anyway, so why am I even still trying...
On a proper audio setup it's the perfect volume.
@@kazuviking No, it's not. It's extremely compressed and sounds like garbage.
@@looks-suspicious I was talking about volume, not quality. The quality is garbage, but it was not quiet.
Horrible sound quality on this video.
It's AI running on a 14th-gen Intel K processor.
Bro, at least you got audio 💀
My volume is adjusted to max and it's silent; even the ads were quiet 😂😂
Audio is fucked up
The bad audio made this one almost unwatchable, thankfully it was a short one.
EDIT: no, 6 min was all my ears could take.
@@smilo_don Because you are using Intel.
One of my friends has an Arc A770 and it works great, when it works. These days it works almost all the time.
The only reason I will be skipping Battlemage is that I already have an Alchemist. I have had great luck with my A770 and I don't see myself upgrading for a few more long years.
The A770 was 256-bit, about 400 mm², and had 4096 cores. Even if they have a good IPC increase, 192-bit with GDDR6 points to a GPU designed to compete with the 4060 at best. Intel's dGPUs are already quite bad in performance per watt despite Intel focusing on that; the A770 uses more power than a 3070.
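For a rough sense of what the narrower bus means, here is a back-of-the-envelope bandwidth check. Peak GDDR bandwidth is just pins times per-pin data rate; the pin speeds below are illustrative assumptions, not confirmed specs:

```python
def gddr_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (pins / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed pin speeds, for illustration only.
for name, bus, rate in [
    ("A770, 256-bit GDDR6 @ 17.5 Gbps", 256, 17.5),
    ("Rumored Battlemage, 192-bit GDDR6 @ 19 Gbps", 192, 19.0),
    ("RTX 4060, 128-bit GDDR6 @ 17 Gbps", 128, 17.0),
]:
    print(f"{name}: {gddr_bandwidth_gbs(bus, rate):.0f} GB/s")
```

Even at a faster assumed pin speed, 192-bit lands well below the A770's 560 GB/s, which is what puts it in 4060-class territory.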
Yep... Intel is going to have to work *hard* on architecture improvements instead of banking on node improvements, as implied in this video. Their performance-to-area (and power) ratio is much, much worse than Nvidia's and AMD's cards, even accounting for the process node differences.
If the rumored tech projections are true, the Arc B770 16GB model is going to be about 50% faster than the Arc A770 16GB model. That's a pretty good upgrade for most folks, all things considered; if true, I will be supporting it. Not going to lie, though, I'm bummed out that Battlemage was scaled back due to Intel's current struggles and isn't going to be the all-out beast it was initially rumored to be. I will still support them, however, because they stepped in to provide us with a reasonably priced GPU while the other two were price-gouging us all. I will be voting once again with my wallet, just like I did with the 12900K/Arc A770.
One thing the Intel GPUs did very well was video encoding, especially working together with the CPU. That is a major advantage for that workflow.
It's all they worked on from 2013 to 2020, because they laid off all their CPU/GPU design engineers and just sat around watching themselves get older...
Potato level audio
200W is actually a sweet spot if you look at this month's Steam hardware survey. I actually compiled some data on it today: between 75W and 200W, the range where cards run on at most a single PEG 8-pin, you have 62% of the desktop market. I'm preparing an article on card design, so I had to go through this data for one of the points, to explain why the focus is on this segment.
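The aggregation behind a figure like that 62% is simple to reproduce; a minimal sketch, with made-up board powers and survey shares standing in for the real Steam survey rows:

```python
# Hypothetical (board power W, desktop share %) pairs for illustration;
# the real analysis would iterate over the published survey table.
gpu_rows = {
    "RTX 3060":    (170, 4.6),
    "RTX 4060":    (115, 4.1),
    "GTX 1650":    ( 75, 3.6),
    "RTX 3060 Ti": (200, 2.7),
}

lo, hi = 75, 200  # slot power up to a single PEG 8-pin connector
share = sum(pct for watts, pct in gpu_rows.values() if lo <= watts <= hi)
print(f"Share of cards between {lo}W and {hi}W: {share:.1f}%")
```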
Intel Arc's issue is not hardware, it's driver optimization... which has come a long way. Intel Arc Battlemage will be good.
Actually, Intel Arc does in fact have a hardware flaw that's been known for ages. A lot of the work they've done with the drivers has been to allow them to work around the hardware flaw. Battlemage graphics *should* be a huge jump forward.
@@hot_wheelz What's the hardware flaw that impacts performance?
It IS hardware. Do you realize that the Arc GPUs have a transistor budget and power draw up to RTX 3070 levels, yet they're slower than a 3060?
@@JoeL-xk6bo Does that hardware flaw or bug still exist when it outperforms the 3060?
@@Wild_Cat Yes, the bug is present in all current-generation Intel Arc chips.
Bro wtf is this audio? Can we not use a dollar store microphone?
Next GPU generation will be battle of _"The Least Sucky GPU"_
Intel should have booted Raja Koduri and Venkata Renduchintala earlier on and kept Jim Keller for an extra couple of years.
Meanwhile, with their ongoing CPU fiasco, the AI bubble popping right now, and the recession next year, Intel as a company is gonna be in the trenches for the next 3 years.
Keller left due to all the politics; he was leaving one way or another.
I use my Arc A770 for AV1 encoding, along with a 12700K. It was cheap, temps stay low, and I feed my NDI vMix output across the house using 10 Gb fiber. The ONLY issue I had was Windows blowing away the original driver with new ones! So I switched to Fedora 39, and no problems. I'm not a gamer, so what I was really looking for was an encoder box that did not have GPU driver conflicts. An AMD 7950X and 7900 XTX was a nightmare of heat and AMD drivers. A 4090 with the 12700K was incredible, but overkill, and required more power than I cared to put in a machine that sits in a dark room. Intel's Deep Link really works for me: despite having neither a super powerful CPU nor GPU, they work in concert as an encoder. Efficiently, too.
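For anyone wanting to replicate that kind of Arc AV1 encode box without vMix, ffmpeg's Quick Sync path is the usual route. A minimal sketch, assuming an ffmpeg build with QSV support and placeholder file names; exact option support varies by build:

```python
import subprocess

# Transcode to AV1 on the Arc GPU via ffmpeg's av1_qsv encoder.
subprocess.run(
    [
        "ffmpeg",
        "-hwaccel", "qsv",        # decode on the GPU too, where supported
        "-i", "input.mp4",        # placeholder input
        "-c:v", "av1_qsv",        # Intel hardware AV1 encoder
        "-global_quality", "28",  # quality-based rate control (lower = better)
        "output_av1.mkv",
    ],
    check=True,
)
```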
Intel has effectively dismantled the Arc division. I would be surprised if Celestial even ships on desktop; maybe mobile.
They're up against upcoming AMD APUs releasing late next year, with integrated graphics on par with a desktop GTX 1660 Super, GTX 1080, or RTX 4060.
We might see an "efficiency lull" in performance gains in 2026, like this year's, as a recession dulls demand, which might be another chance for the Intel graphics team to catch up.
AMD ain't slowing down though, especially 2027 onwards.
Source?
@@danielmdax trust me bro.
@@thelasthallow Let's see. Raja is gone. Arc was severely delayed. The A580 launched a year after the other models. Battlemage is a no-show in 2024. Yeah, things look real "bright" for Intel...
@@handlemonium The 1660 Super is powered by GDDR6; I don't think an APU with DDR5 would be able to supply enough bandwidth for the iGPU. The newest APUs are already saturating DDR5 bandwidth.
Intel's 15th gen will be successful, unless they don't care anymore.
I don't think they care. I think they're so demoralized right now that they wouldn't be able to fab a 4004 without serious defects.
They need to get rid of their execs and hire a whole new leadership team, with the CEO, CTO, and COO all distinguished engineers, materials scientists, or physicists: people who know the process intimately and can actually lead these teams. If you can't understand the cost of choices, you're going to mess up every time.
They haven't cared for five years already. Intel is doomed.
@@jklappenbach ...can't stand political pressure; just another US company like the rest.
@@jklappenbach They do care. My friend currently works at Intel, and he says the company is taking the situation very seriously. For instance, they've activated their emergency work status for certain departments.
@@swagyolo8602 Right, like they cared about not losing 3 million in silicon they knew was defective, while having billions. It's like an ordinary salesman: he sells you rotten strawberries knowing you will put most of them in the bin. Why would they lose money when they can sell 200 customers a bad product the same day and say they didn't know?
6:10 $350-400 for a B770 performing at ~3070 Ti levels would be insane! At that price it'd need to be faster, with 12-16GB, or be under $300.
And that's for a card releasing in 2025...
Remember, this is a 1st-gen product, and it's already doing rather well, especially in the productivity sector. You can argue that the launch was garbage, because it was, but Intel has gotten up from that stumble in their GPU department and is actively improving it. It's VERY much obvious.
Nvidia still overprices their GPUs, and AMD is taking too long to push RT at all. Intel already has that feature on their iGPUs, and they're intent on pushing it further.
@@PixelatedWolf2077 replied to the wrong comment I think?
The A770 was always targeting 3070+ performance. Didn't you feel the love from Raja? Oh, you didn't? I guess that's because the A770 was a massive flop, selling at two-thirds of the hoped-for MSRP...
@@dazuza95 Oh, whoops! My bad. Thanks for the heads-up.
Maybe AMD has finally discovered the term "efficiency," as they did with the latest 9000-series CPUs.
So, I'm waiting for the Radeon 8000 series with high efficiency, especially in video (media) playback consumption (YouTube, Netflix, VLC, etc.).
40W+ consumption is NOT acceptable just for video playback... (For the last 2 gens, RX 6800 & up models have suffered from this illness!)
Well, RDNA3 should be better, since AMD implemented a way to turn off the unused MCDs. My 6700 XT has always sat at 26W during video playback with a dual-monitor setup. For the Navi 21 die, the IFC is big and required to be active to move data. Nvidia's L2 cache works separately per GPC: even with AD102's 80+ MB of L2, the slices work independently and feed their own GPC only. I also think this is why, when games aren't optimized for AMD, you can get crashes and games not working, while the same games are optimized for Nvidia's arch.
The interesting thing about the A770 is that in applications that actually leverage it properly, it ends up entirely competitive with the other 256-bit cards offered by the competition.
Which is wild, because there are so many others where it suffers and chokes down to being not terribly dissimilar to the 128-bit cards.
Zen 5's problems are not about the TSMC node for the core chiplet. The core itself seems to be 18-30% faster when running, with a 25-200% regression in the time it takes to start executing a thread, and without getting improvements to main memory latency or LLC hit rate to feed the core. AMD probably made some microcode mistake that got revealed by AnandTech's core-to-core latency test, which showed a large regression in communication latency between threads in the same core. A game based on a 2-thread architecture got a 30% boost in FPS from Zen 5 in the Phoronix review of the 9700X.
Unfortunately, now that the microcode is fixed, the core-to-core latency is the same as the 7000 series and the added speedup is only about 1%. Apparently the inter-CCD latency is irrelevant to most games...
@@systemBuilder I wasn't speaking about that latency; I was speaking about the latency between two threads in the same core. And I've realized it wasn't a mistake, it was a security fix against timing attacks.
To prevent process A from having a timing-attack-based way of learning about another process's data, the L1 and L2 caches store the thread ID as part of the tag that gets checked, and a wrong thread ID becomes a cache miss.
But that also means that if thread 1 in a game shares data with thread 2, each time one thread accesses it, that becomes an L1 and L2 cache miss for the other thread. The bigger issue is Windows moving threads between cores in the same cluster: when a thread has done its tasks and starts waiting, it releases that core to be used by any thread in the system. With plenty of threads in a modern game, the probability of the same thread getting back to the same core is low.
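A toy model makes the mechanism described above concrete: once the cache tag includes a thread ID, two threads touching the same line never hit each other's copy. This is an illustration of the idea, not AMD's actual implementation:

```python
class ThreadTaggedCache:
    """Cache where the tag check includes the ID of the thread that filled the line."""

    def __init__(self):
        self.owner = {}  # address -> thread ID whose copy is considered valid

    def access(self, thread_id: int, address: int) -> str:
        hit = self.owner.get(address) == thread_id
        self.owner[address] = thread_id  # refill on miss, retain on hit
        return "hit" if hit else "MISS"

cache = ThreadTaggedCache()
for tid in (0, 1, 0, 1):  # two threads ping-ponging one shared address
    print(f"thread {tid}: {cache.access(tid, 0x1000)}")
# Every access misses, even though the line never left the cache.
```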
Betting on Intel delivering on their node rollout promises seems wildly optimistic...
Also, your next-gen speculation numbers are definitely completely off the rails and need revision. Going by the +268% and the +65% numbers you stated so confidently, that would mathematically make the 4080 2.2x as fast as the 4070. That's a yikes.
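A quick check of that arithmetic: two gains quoted against the same baseline imply a ratio between the two cards, and here it comes out implausibly large:

```python
baseline = 1.0             # the A770 used as the reference point
x = baseline * (1 + 2.68)  # "+268%" -> 3.68x the baseline
y = baseline * (1 + 0.65)  # "+65%"  -> 1.65x the baseline
print(f"implied 4080 / 4070 ratio: {x / y:.2f}x")  # ~2.23x, far above the real gap
```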
Yeah, Intel themselves admit their nodes are no good for GPUs, considering Arc is made on TSMC's N6.
Their Intel 4-based Meteor Lake is supposedly also in short supply, i.e., not enough capacity. It also failed to reach clock targets.
Battlemage was designed to fight RDNA 3 and Nvidia's 4000 series, and it's going to be an underdog even compared to them. RDNA 4 and the 5000 series will make Battlemage hard DOA.
Nah, RTX 5000 will be expensive.
I'll watch when the audio is fixed. Okay?
Feels like I'm watching a 30-year-old video.
If they don't mess up, the new GPU cores should be twice as fast. So it would still not be high-end, but the B770 should compete with the RTX 4080.
With the financial pressure on consumers right now, the price will decide the (mid-range) winner this time.
I really don't think you understand this industry half as well as you think.
Here waiting for Battlemage, but I don't know whether to go with Nvidia; I have an Intel Arc A580.
I recommend the RX 7900 GRE.
Just picked up an RX 6750 XT on Prime Day for $280. I sacrificed some ray-tracing performance but saved $110.
Fact: prior to the cuts, Intel had as many employees as TSMC, AMD, and Nvidia combined. But Intel only had 1/3 the revenue of those three companies and 1/40th of the market cap.
Cuts were needed. I'm surprised it wasn't 25,000, or 1 in 5. Intel could easily cut 20% and still stay in business. But cut smartly: don't offer generous severance to people you need.
What's your background music track there? Very nice.
This boils down to a specific person: Raja Koduri. He's just not good at running a design team, and he has a very good trail of products under his name to prove it. The next two to three years of Intel GPU releases will have his touch on them, so buckle up.
The audio is disturbing.
Is this analysis accounting for Deep Link technology? That could be significantly improved as well.
Tons of poor people need a cheap Intel GPU.
The audio?
At the lower end nothing really changes; we haven't gotten a Polaris-type card (great value and relatively cheap) in a long time.
Main question: will it support virtualization (SR-IOV, vGPU...)? With neither Nvidia nor AMD supporting it (and AMD likely not giving us a meaningful desktop APU anytime soon), being able to share the GPU between a VM and the host OS would bring some fresh air into the desktop space!
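For context on what desktop SR-IOV would look like if Intel exposed it: on Linux, a driver that supports SR-IOV advertises it through sysfs, and writing a VF count spawns the virtual functions. A minimal sketch with a placeholder PCI address; whether Battlemage will expose this at all is exactly the open question above:

```python
from pathlib import Path

dev = Path("/sys/bus/pci/devices/0000:03:00.0")  # hypothetical GPU address

# Present only if the device and its driver support SR-IOV at all.
total = int((dev / "sriov_totalvfs").read_text())
print(f"device offers up to {total} virtual functions")

# Asks the driver to create 2 VFs, each assignable to a VM (requires root).
(dev / "sriov_numvfs").write_text("2")
```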
If Intel can produce something at 7800 XT performance with 16GB of VRAM as well (along with roughly RDNA4-level ray-tracing capability), priced at, say, $325-350, it would be a strong enough showing value-wise, IMO... but on current rumors that sounds unlikely, tbh.
You guys don't really understand why they're slow to release the Battlemage chips, which taped out ages ago: the GPU is too expensive, and it performs way below what they aimed for. Rumors of Intel targeting the RTX 3070 were 100000% true looking at the A770's specs; it just performs 2.5 tiers worse. They hoped to have a competitive $499-$549 GPU on their hands; they only sell at a budget price because they wouldn't sell otherwise. Given the supply they released, the A series was really only a beta test/field test to help driver improvement.
Another massive loss of money for them, if that is all they've got.
If anyone could do that, they would have a winner; I would be happy with 12GB and close to that performance for $350. Your statement does show how screwed Intel is, though, sadly.
They need to release a 7700 XT equivalent for 250 dollars to shake things up.
Such a GPU would never be sold for $325-350. More like around $600 at a minimum.
No need to be worried. Intel has more backing than we all think, and even if it fails with Battlemage it will continue its business and most definitely survive. It's fair to say that what Intel has accomplished with their new GPU cores will also continue to succeed in the integrated-GPU space, so even if the standalone GPUs don't sell, they'll still use the cores and show the world they can do GPUs as well as AMD or Nvidia.
Alchemist has been good for anything but gaming. My A770 performs well in my headless system running my Plex server, transcoding videos with HandBrake. I also use it for the occasional retro game streaming via Moonlight, and it performs okay since I only need it to run 1080p at 60fps.
Something is way off with the percentages. I've upgraded from a 4070 to a 4070 Ti Super due to VRAM issues. The 4070 Ti Super is about 15% slower than the 4080 and 30% faster than the 4070. So instead of 268% vs. the A770 as a reference, it should be somewhere around 50% more than the 4070's percentage number below it. There is not that great a gap between these GPUs.
Well done, you, giving Nvidia money twice 😂😂😂
VRAM issues in what?
@@mitsuhh In games that need more than 12GB of it.
@@adi6293 Which games?
@@mitsuhh Don't know, ask him. When I had a 3080 I had issues in Diablo 4, Far Cry 6, and the RE4 remake, but that card had 10GB of memory.
I worry about Intel, not because I like the company, but because competition is sorely needed. I fear, though, that with the 13th and 14th gen immolating themselves, Intel has lost their last advantage, that being reliability. Still, I hope they get their act together eventually.
@nnm711 We won't need Intel in the desktop CPU market for long. ARM and RISC-V are coming to the PC in 10-20 years. Intel just needs to stay in the market till then; later they can get lost. Until then I'll keep buying AMD CPUs and GPUs, and hope they become better than Nvidia in GPUs.
AMD will be hard-pressed to push past Nvidia unless Nvidia does what Intel did and grows heavily complacent. AMD, on the other hand, fumbled hard by not adding any sort of proper RT cores to their GPUs. For example, they'll start hurting big time in newer games like the new Wukong game that's coming out.
Intel's current architecture, Alchemist, isn't amazing, but I feel it's the start of something good, especially given they've already implemented not only RT cores but also their XMX cores, which do for XeSS what DLSS's hardware does for Nvidia and improve the quality of their own upscaler. The A770 currently does awfully, but that makes sense, as it's just the start.
When AMD bought ATI, they fumbled hard, too.
Idk what rumors you're hearing, but I've yet to hear anything about the 5070 being close to the 4080. The 5070 is going to be 4070 Ti to 7900 XT level. As for top Battlemage, it's going to double the A770's performance, so that would be around a 4070 Ti. The problem is the A770 was supposed to be around the 3070/Ti but wasn't. I think Intel will do better this time but will still miss the mark with drivers and optimization at launch.
According to a leaker, top Battlemage is supposed to be 4070 Super performance at best, and the rest is at the 4060 Ti level.
Thanks for your videos.
I bet Intel will kill its GPU division.
Honestly speaking, Intel Arc, while pretty bold, is just too expensive, especially at a time when Intel is going through a severe financial crisis. I think it would be best for them to at least pause Battlemage until they recover.
Why is this audio so compressed?
I suspect Battlemage will essentially be a paper launch that only gets any volume production as the iGPU component in Intel's CPUs. Next gen there's literally no competition with Nvidia at the top end, so Nvidia can do whatever it wants at that level. For the mid-range we have AMD vs. Nvidia, and AMD will pretty much have to offer better value than Nvidia if they want to sell anything. That leaves Intel with only the low end, which isn't really going to offer much in the way of margins. It would not surprise me at all if both AMD and Nvidia decide not to even bother offering anything at the low end and just let Intel have that segment. Both would rather sell high-margin AI chips. Intel doesn't have that luxury.
I hope you're right about Intel getting its act together on the manufacturing side. That's really the key to their turnaround, IMO. Without that, the only way they come back is to spin off their manufacturing business as a separate company and go fabless like AMD and Nvidia. I don't think they'll do that, but it's a viable option that would allow them to really focus on chip design instead of manufacturing.
I got an ASUS Strix 4070 Ti Super for 700 bucks, and it's borderline usable in VR when using Praydog's UEVR injector mod to play flat Unreal Engine games in VR. I was thinking of getting a 5070 (Ti) for that reason, but I don't see the point if it will only be a 4080 Super equivalent, as that's only 15-20% faster than my current GPU. The 5080 will probably cost 1200 bucks or more again. Lack of competition really sucks 😒
For another thing, I am going with an Intel processor and GPU for the next build.
Why is the thumbnail a projector with two GPU fans pasted on top?
I don't. Their future products actually look really good.
They are investing in fabs like crazy, and the current management doing it is competent, unlike the muppets of the past (Murthy) who caused the steaming turd piles they are dealing with now.
Level1Techs said that Zen 5 wasn't made for x86 to be much better, in favor of huge gains in AI performance. I think Zen 5's issues are not necessarily due to the node.
Hard *agree* that it's not the node: the fact that Zen 5 performs worse than Zen 4 in some workloads is evidence that it's an architectural issue, not a manufacturing issue.
I am not worried, as Arc Battlemage and the upcoming processors are fantastic.
Like some people, I am chomping at the bit.
Only Apple cares about CPUs and GPUs right now.
Their Apple Silicon M4 beats all x86 in single-core score!!!
Guess I won't upgrade once again next gen.
Indeed. The current rumors about Battlemage are somewhat underwhelming at the high end, but I think expectations should be realistic. Intel might work a miracle and have a competitive card around the RTX 4070's level, but Intel's best card is more likely to compete against an RTX 3070 Ti/RX 7700 XT. For Intel's sake, they should bring their best GPUs to market, because they need to salvage their damaged reputation after the accelerated degradation on Raptor Lake CPUs.
If Intel Celestial or Druid can be manufactured in-house on Intel 18A, Intel might have a huge advantage: pumping out as many GPUs as they can with leftover wafer capacity after their CPU runs. Does Intel's GPU engineering have the ability to transition away from TSMC to Intel fabs?
I'm worried about Intel PERIOD. They're cooked.
Intel may have to focus on fabbing other companies' chips.
Meh. A decade ago AMD was cooked; cycles come and go.
Intel is still a giant corporation turning over billions p.a. with 35% gross margin.
@@samuelcrow4331 Recent reports have shown that Intel might try to split its fab and design divisions, or even sell some of its fabs due to earnings woes.
I don't think their 18A cards will be ready for next year... I think this is a huge leap of faith from you... (liked)
I like the audio; makes me feel all Commodore 64-ish inside my 🧠
A number of people are suggesting that this is a lower-end card: think A580 or A750 replacement, not A770 replacement. The higher cards with 16GB of memory are at 4070 Ti to 4080 levels of performance, although the proof will be in the pudding. With XeSS, expected improvements in ray tracing, and those features being available on all cards, it may be more interesting than you are suggesting here.

I put up a response which somehow isn't showing up, so something similar is added here. Although not perfect, the Arc cards have come a long way from the original terrible launch. The drivers were the main issue, and they have now been largely fixed. A recent video covering a number of games at 1440p ultra settings even saw the A770 outperforming the 3060 Ti (especially in the 1% lows) and sometimes keeping up with the 4060 Ti. On top of that, its encoding (AV1, etc.) across the whole range is best at its price point. It is true that it likely has fallen short of expected targets, but it's also not as bad as you are making it out to be.

On to Battlemage. The card you're looking at here is (based on other sources) one of the lower-end cards: think the replacement for the A580/A750 area. Those sources expect the high-end card to be somewhere around the 4070 Ti to 4080 in gaming performance. Only time will tell, and it will be interesting to see if this actually turns out to be the case, and if Intel manages to transfer the hard-earned lessons from the Arc cards' software side to deliver this at a reasonable cost.
Because their GPU is too bad. The A770 was meant to sell at $500 and compete with the RTX 3070, but it performs like a GTX 1080. They probably couldn't fix the issue, and it's no surprise: the Xe architecture has always been bad, even as an iGPU before these desktop cards were made. Leaks of Lunar Lake GPU performance put it right around a 780M... yet Lunar Lake was supposed to have Battlemage, and a larger GPU than Meteor Lake. Intel packaging, shipping, and selling a 400mm²+ GPU that draws 250W, while only being able to charge a base-model 4060 price for it, is financial suicide in their current situation. They probably sold the A770 at a $200 loss or more, since they were on a more advanced TSMC node than the RTX 3000 and RX 6000 cards.
@@JoeL-xk6bo Although the software was a disaster at launch, the actual Arc graphics cards are well made. The drivers work much better now, and in a recent test at 1440p ultra settings the A770 was beating a 3060 Ti (especially in the 1% lows) and even holding its own against the 4060 Ti in some cases. I think you might want to revisit these cards. They have fallen short in gaming compared to what was hoped, but they are well ahead of your suggestions. For encoding, they are an excellent choice right through the product lineup, outstripping anything else in their class. It isn't perfect, but for a first-generation product it has certainly come a long way.

As for Battlemage, I am both hopeful and skeptical. Intel has had a difficult start but has made great strides already, and given the hard work done and the hard lessons learned, I believe they can improve in both efficiency and performance (other sources are currently suggesting this), though AMD's less-than-stellar results so far on the 4nm node give pause. The best thing to do here is see whether the lessons Intel has learned transfer over to Battlemage, as they cannot have a repeat of what happened to Arc at launch.
I don't quite get why people seem to be hoping Arc launches at a low price. Intel is not a charity, and neither is AMD. People always expect them to compete and get Nvidia to lower its prices. Sadly, folks, you won't get that if you're always going to purchase Nvidia either way. So don't get mad at AMD and Intel later for price fixing. The current Arc A series is just a flop, which is why you get it cheaper. If the B770 competes with the 5060, you will be getting it at $400.
I kind of want to give Intel a go, but they have to offer something that's better than what I'm already running, and I'm running a 7900 XT.
I have a hunch that Intel may give better performance on Linux, and if this is the niche they are targeting, I see that as a good thing. GPU performance has been one of the few remaining things keeping me off Linux as my daily driver.
Intel will improve the core technology, not just the node.
Intel is 4 YEARS BEHIND Nvidia! The A770 was intended to compete with the 3070. It has a bigger die, more transistors, and a better process node. It came out 2 years after the 3070 and achieved 3060 performance, i.e., a 2-year generational lag: 2 + 2 = 4 years behind! The only reason Intel makes Arc/Battlemage GPUs is to avoid being ejected from the laptop market! I am pretty sure they lose money on every dedicated GPU card they sell!
There is no need to THINK about Intel GPUs before they get within 2 years of Nvidia's technology; they're a charity case right now...
It makes sense.
If Intel had a node advantage in its own fabs, that would at least start to take the wind out of AMD('s fans') sails. 😅
And I don't deny AMD is doing pretty well, except for Ryzen 9000.
It's as if AMD said: we silently follow Nvidia's AI rush, no high-end GPU, AI already in the DOA RX 7600, and Ryzen 9000 will be fast for AI and face recognition (laptops maybe, but security agencies will buy it like hot cakes 😊).
I am cutting my comment short, as I found everything important in the video here, or in past leaks everyone has access to.
Already looking forward to the next analysis. GG
Don’t ever give opinions with audio quality like this.
You only talk about the B770, but what about the B990?
Nah, Battlemage will be for AI: the same A770 chip with 48GB of VRAM. Then they can get some sweet AI investment.
I think the 5060 Ti at 450 dollars will perform like a 4070 Ti Super but with 16 gigs of VRAM, with the 5060 having 12.
I think the 50 series will actually be good for the low-end market... VERY good, unlike the 40 series.
That's if Nvidia doesn't raise prices again.
TSMC already said they will raise 5nm prices by another 10% in 2025. When RDNA 3 and Nvidia's 40 series came out in late 2022, 5nm was around $17,000. Next year the expected 5nm price will be around $20,000 per wafer.
@@arenzricodexd4409 I mean, honestly, not as bad as each wafer costing as much as a Model X (LINTEL LOL).
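To see why wafer price matters so much for a big-budget die, here is the standard first-order dies-per-wafer estimate applied to the numbers in this thread. The ~400 mm² die size echoes the A770-class figure quoted earlier; yield is ignored, so real costs are higher:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Gross dies: usable wafer area minus a first-order edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

die_mm2 = 400  # roughly A770-class, per the comments above
for wafer_price in (17_000, 20_000):
    n = dies_per_wafer(die_mm2)
    print(f"${wafer_price:,} wafer -> {n} gross dies, "
          f"${wafer_price / n:.0f} per die before yield loss")
```

At roughly 143 gross dies, the jump from $17k to $20k per wafer adds about $20 of silicon cost per die, before yield, packaging, and memory.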
I'm just tired of hearing about this GPU at this point; it seems like an endless circle jerk with no release date in sight.
I read the 15th gen is UP TO 4% faster than the 14900K. Intel: 2-7% per year. Maybe these won't eat themselves alive? Nvidia has literally jacked prices up so high that they created a space for Intel's crap GPUs.
This audio is clearly AI-generated. AI trolls are everywhere now.
The fab business is a solid investment. We need more manufacturers with the ability to make integrated circuits. We can't depend on the graces of TSMC's manufacturing availability, with the US completely dependent on a foreign company. Let's face it: if China decides to become more assertive and takes complete control, our ability to get silicon from TSMC will get crazy expensive, and the fab queues will be atrocious. You saw this during the COVID paranoia response.
There are still cars sitting in lots with missing hardware.
Without TSMC undercutting and taking aggressive postures with clients, Intel would be fine. At this point there is an argument that this is a monopoly. TSMC behaves exactly like Nvidia does with its partners: "If you order from our competitor, well, the queues won't be as available to you."
That is the exact behavior that took EVGA out of the graphics card business. EVGA knew that if it decided to also produce AMD cards, Nvidia would retaliate in some very malicious but semi-legal manner, like not licensing "power connectors" for their new GPUs.
Dividends need to be put toward a solid cause in the U.S.; investors will cry, but too many decisions are made at the board level in the name of dividends, putting quality and customers in the trunk... not even the back seat.
Average midwit comment: "I'm glad we have more competition and choice."
Then proceeds to never buy an Intel GPU and go Nvidia as always.
Most people only buy one GPU per generation. Right now Intel asks for the same price as Nvidia for the same performance as Nvidia.
PROUD ARC A310 OWNER 😤😤😤 (AND RADEON 7600 AND TWO RTX 3060 12G) FIRST CLASS LINUX GAMING SUPPORT 😤😤😤
Well as a laptop buyer I didn't have a choice.
There was one run of Zen 4 16-core + RDNA3 Alienwares that I'm pissed I missed the sale on (would have bought for $2800 AUD); instead I picked up a lightly used 13900HX/4060 Helios Neo for 1400, which was sold last year for 1900, down from a 3000 RRP. AMD laptops hardly ever get discounted.
The main rule of pricing is that there's no such thing as a bad product, only a bad price. However, if Intel can't make money off it at the price it needs to sell at, then it's a bad product. Arc is just not efficient enough: it needs too much silicon and too much memory to do the job of its peers. It's so bad nobody even bothered releasing discrete Arc in laptops.
The intro and outro music is atrocious; it hurts my ears and soul.
Intel Battlemage will be 2x A770 performance.
Certainly not an impossibility if Intel really nails the architectural improvements. Looking at the comparative performance per watt that AMD and Nvidia are achieving, Intel has plenty of room for improvement that cannot be explained by process nodes.
A lot of Intel fanboys engage in very wishful thinking! They are not aware that Intel was 4 years behind Nvidia when they introduced the A770...
All these GPUs use too much power. I'm a huge fan of the 4060's low TDP, but at around $300, absolutely not. It's not worth that, because it has zero future-proof value.
Battlemage specs have been known for a long time already:
- B980: 64 Xe cores / 256-bit bus / 16-24GB / 250-300W (launch H1)
- B970: 56 Xe cores / 256-bit bus / 16GB / 225-250W (launch after the B980)
- Mobile: 40 Xe cores / 192-bit bus / 12GB (launch 2025 with the new mobile CPUs)
Just for reference:
- A750: 8GB VRAM / 256-bit bus / 225W
- A770: 16GB VRAM / 256-bit bus / 225W
A year ago: "Intel has said that it is considering moving some of its manufacturing out of China to the United States to comply with U.S. government regulations."
Today: "Intel is laying off over 15,000 employees."
So, in conclusion, the U.S. is sanctioning China and not giving them the latest; the world situation demands it.
Motherboard brands:
- Asus: 65%, made in China
- ASRock: 4% (parent company Asus)
- Gigabyte: 14%, made in China/Taiwan
- MSI: 14%, made in China/Taiwan
China has its fingers on almost all of the motherboards, with Asus as the biggest; Asus alone controls almost 70%.
Servers/CPUs:
- Intel: 70.9%
- AMD: 20.5% (both are U.S. companies)
- ARM: 8.1% (British company)
CPU market share, Q2 2023:
- Intel: 83%
- AMD: 17%
Conclusion on the internet problems and server failures (Asus W680 motherboards): it is obvious. Looking at the BIOS failures and the history, AMD CPUs burned in their sockets, pushing people towards Intel CPUs that are now failing on Chinese-made (Asus) motherboards. You have to acknowledge where the problem is when "native" settings overpower the CPU by 100 A, for example, and how the maximum damage was created.
Celso, you're really harming your sponsors when you publish videos with bad audio.
That CEO looks like one of those who would make any business end up like Sears/Kmart, after sucking all the blood out and then walking away with millions or billions in free cash.
He's actually a legendary tech CEO who was the architect of important Intel chips, holds various patents, and later built VMware into a massive company.
@@del46_60 VMware was already a massive company before he got there, and it had significant momentum.
One problem with this analysis is that current rumors show Nvidia's 5000 series being only slightly better than the 4000 series. Intel only has to catch up to the 4000 series with lower prices and better drivers.
It's too bad that the 5070 will be gimped at 12GB of RAM, as a 4080 Super for $600 would be received almost as well as the 4090. Given the 7900 GRE's price, I feel AMD won't price their new card above $500 unless it is close to the 7900 XTX. Intel would have to really surprise, but I just don't see how they compete. They should have done this before the Zen generation, when they were dominant and could afford to subsidize a department for 5-10 years.
*It is all GAME OVER for Intel!* Battlemage will be another BIG disaster!!!
Even Intel should advise gamers to buy AMD products!
Spelling is important.
Please improve your audio.
I think I need to warn people that semiconductor silicon tech, in combination with the x86 platform, has REACHED ITS PEAK potential. It will barely gain any more, and this is a MATERIAL problem (the medium/bed used for signal processing), whatever the design changes in the architectures.
In the CPU business, all they can do now is make it more power efficient while adding a "psychological" marginal increase in maximum cycles/clocks. The silicon can't provide more velocity, and adding more transistors only increases the power draw.
And the ingenuity needed now is merely in architectural design, which would probably only gain some tenths of a percent.
This problem is faced not only by Intel but by EVERY single brand/manufacturer of x86-based computing apparatuses, and they are just not telling you about it. AMD, Nvidia, and Intel are all quietly agreeing with each other to keep the public oblivious, but they have reached the material performance limit: the silicon.
And you will see that policy (decreasing power draw and increasing power efficiency) more and more in future batches of new silicon x86 tech, not because of the "green" agenda to fight climate change, but because silicon tech has reached its limit. The same applies to GPUs. Notice why they are now leaning on FRAME GEN? Because the silicon just can't squeeze out any more, and if you add more power you get what happened with the Intel 13th/14th-gen chips. More melting GPUs, everyone?
I have hinted at the WAY OUT, which is to SHIFT to photon/light-based processing, which uses a crystal medium instead of silicon semiconductors. With this, they can keep capitalizing on their x86 designs, because now they have VELOCITY (how fast a signal can go from point to point), and yet they won't have temperature problems, because no electrons need to jump between atoms in the processing cores; only light, photons. But nobody seems to even bother to invest in it.
You would see photon-based crystal chip blocks running at over 300 gigahertz easily.
It is simply about replacing the transistor way of gating signals with electrostatically activated deflectors and splitters (similar in concept to how an LCD works) inside the crystal chip block.
Chips would be blocky and chunky, but since you don't need a gargantuan cooler of any kind, you get the smallest PC possible, running at a mere 50 watts probably, over 300 GHz (terahertz potential, even), using only some 50 watts for the hundreds of photon/laser micro-arrays.
Since you bring up the material properties and manufacturing limitations, you seem to be forgetting that the same issue is present for ALL ISAs. Even Apple's M series of ARM CPUs has quickly met decreased generational gains, and the same phenomenon can be seen in their iPhone SoCs as well as in competitors such as Samsung. Likewise, even Qualcomm's first impressive entry into desktop, the X1 Elite, has now immediately been matched in some aspects by AMD's x86 designs with "Ryzen AI".
Regarding optical computing: I think it will definitely have its place, but I am (like many researchers) skeptical that it will become dominant even in HP computing*; there will likely be a mix of optical, electro-quantum, and exotic architectures.
(I favor devices utilizing a combination of traditional transistor logic with phase-change memory, crossbar-switch-based arithmetic elements, and analog elements to accelerate certain workloads.)
*See McMahon, Peter L., 2023, "The physics of optical computing"; it goes through promising applications, but lists the limitations as well.
@@predabot__6778 I recommend light-based processing units as the main processing units because advanced species USE IT; that's the people who travel thousands of light years to get here. Don't ask me how I know them. You forget that whatever processing unit you are mentioning, so long as it is still using a SILICON BED, it is a SEMICONDUCTOR; it has LIMITS on electrical performance. I specifically mention x86 since it is the big, complex processing solution compared to ARM, meaning technically the more 'advanced' one. ARM is the simpler one and could make a TEMPORARY solution, but it does not change the limitations. Quantum computing technology is not practical: not only would it require changing absolutely everything in digital ecosystems to interface perfectly, the implementations will be EXPENSIVE. It's not for the masses. The crystal chips, on the other hand, will be as easy to manufacture as LCDs. The roughest, simplest laser-pointer tech that kids can buy as a toy is already easy to get; making hundreds of minute laser arrays in a tiny form for the signal sources will be easy manufacturing once the tech has been developed.
It's about WILLINGNESS. So many of these companies have invested so much in the current outdated and dying tech, and invested a lot more in impractical quantum processing tech, that they don't want to spend more on THIS, a MUCH SIMPLER AND MORE VIABLE SOLUTION that could get them to hundreds of gigahertz with the same x86/ARM architecture designs. There are no changes in digital languages required; you could use everything already available. The only thing that changes is how the signal is processed. That's all: the same architecture, using light signals and deflectors/splitters emulating transistors inside the crystal blocks, and interfacing with the chip block using fiber optics. The whole ecosystem is READILY AVAILABLE.
The company that is first to adopt this will sail past all the competitors by hundreds of GHz of performance, with less than 100 watts of power draw, and it is potentially CHEAPER and SIMPLER to produce than carving silicon wafers.
If you can already make super-fine LCDs, you can make nano crystal servos inside a crystal block.
And I share this idea FOR FREE. Anybody can go ahead, claim it, and patent it.
79XT
One of the worst videos I almost watched..
The Xe architecture is bad; it's like a rotten fruit. It was too bad at first even to put into Cannon Lake laptops (lol, the forgotten generation). I used to package device drivers at two previous jobs; Intel iGPUs before the Xe core were almost always reliable. With the new generation came so many problems and software issues, it was a nightmare. They need to move off the Xe architecture as fast as possible.
I'm still too poor to afford this or to game, and my Ryzen 3 2200U still sucks donkey 🫏 😮, but I keep watching these in case I can ever afford them 💀