Some additions: Apple's CPU engineers effectively became Intel's best quality-assurance service, but Intel ignored this, prompting Apple to set up another transition. "Apple came prepared": they had already done this back in 2005.
Hey, just FYI: Intel doesn't only do the equivalent of Arm, Nvidia, and TSMC. They also do the equivalent work of many EDA tool providers (Cadence, Synopsys, Mentor Graphics), wafer and mask manufacturers, and many more. They're in a different league.
The problem was Intel fired all their experienced people in the 2015 and 2016 layoffs. In many ways it was age discrimination; everyone was in their late 40s and 50s. A lot took early retirement, like me, because I could see the writing on the wall. The CEOs after the founders died or retired have not been leaders; they are middle managers at best and bean counters at worst! I am just glad I left when I did. They need a leader with vision!
The old white men had to go as part of their $300m investment in workplace diversity. Just think of how much trouble they would be in now if they hadn't decolonized their workforce.
As much as we all shit on Intel, let's hope they don't exit the graphics card business. It would mean AMD and Nvidia lose competition, and they could jack up prices all they want with no one able to do anything about it.
Intel really can't leave graphics development, even if they killed discrete Arc tomorrow. There's a reason those JPR reports always oversell Intel in terms of shipped graphics processors: a sizable majority of PCs have only integrated graphics, and nobody wants to go back to black-and-white terminals. If it's any consolation, amid all the mostly justified crapping on Arrow Lake for consumers, the iGPU in those chips is leaps and bounds better than anything Intel put out before on desktop. With something like a 10% price cut, I would just tell people to grab a 265K or a 245K over an 8700G, since unlike AMD's APUs, the I/O is actually full-fledged and not "gimped" (PCIe 5.0 is overkill for anyone not running a server farm where SSDs die every six months).
@@rattlehead999 Not sure that's correct. Before the Ryzen architecture, AMD's chips were considered subpar and low-end, so that contributed a lot to their comeback.
I do not envy Intel's engineering and research people. Those people are just expected to perform miracles in physics. They're probably the only people I wouldn't question for earning over $400k a year.
I do not think that most of the problems Intel is facing today can be blamed on Pat Gelsinger. You have to understand that chip architectures take two to three years to develop. The chips that are out now were probably designed in 2021-2022, and while Gelsinger was already CEO, it's not like he was going to come in and upend everything the engineers and managers had already planned. Additionally, he was handed a bloated organization with too many middle managers known to be resistant to change and entitled. When death came knocking was the first time he was able to fire most of them without risking antagonizing the people he depended on for the organization to work. Lastly, any technology cycle for fabs takes 5 to 10 years. This is the first time he has been able to make decisions on the future of his fabs, and Intel is investing more in the next tech than any other fab builder. And yes, that includes TSMC, which has refused to buy the new machines; they have said it is not good bang for the buck. Obviously I do not know if that is true. Oh, and the GPUs came too late for the gaming GPU crunch of 2020 onward and the current AI mania. Thanks.
The GPUs came right on time. It was a good business idea; what failed was the technical execution. The chips were not competitive in performance. Had that been different, everything would have been just fine.
@@rosomak8244 I'd say it's both timing and execution. Broken as it was, people would have embraced Alchemist had it just come a few months earlier, when everyone was desperate for GPUs. And even now, it's not too late to get into the GPU game. Nvidia still hoards everything from gaming to AI and data-center stuff, and AMD still hasn't taken off outside of gaming. There are spots available for anyone who delivers.
@@-Ice_Cold- Not at all. AMD only started recovering in 2013, when Sony started using their processors to power the PlayStation 4; they had lost the desktop market entirely at the time. And they only rose back to glory in 2017, when they released the Ryzen and Threadripper lines. Even so, they are not the dominant player; Intel still is.
@@-Ice_Cold- Yeah no. AMD during the FX era was teetering on the edge of bankruptcy. Intel is probably never going to find themselves in that state, but don't jinx me if they do in the future.
"Why is Apple failing? When Apple first started, we were 10 years ahead of the competition. What happened in 10 years? The competition caught up and Apple stood still. Apple's solution is not to slash and burn but to innovate out of the problem." Steve Jobs, 1997.
This. The Macintosh was 10 years ahead of its time; the only thing that could come close at the same time was an Amiga. But from 1995 to 2005 the Amiga was basically dead (they were really only used for early hackintoshes and legacy media). I 100% agree with this. OS X came out at the same time as Windows XP; it took Windows five to seven years to get Aero with Vista and Seven.
I deeply respect Apple. I just wish I could still buy an x86 Mac Pro that I could hackintosh the hell out of, or the best OS 9-compatible PPC Mac. I just don't really care about Apple Silicon.
IBM once thought the IBM 360/370 architecture would last forever. Architectures have a life span. I have seen numerous ISAs come and go over the last 40 years. x86 is just a very successful one, like IBM 360, but not an immortal one. We are seeing a transition from the x86 era to ARM; x86 will be a thing of the past within 10 years or so.
@@youcantata You really think so? ARM could give competition to x86 but can't replace it fully. Most servers run x86 and might continue to do so, as it offers broad application support compared to ARM. Most companies will not switch to ARM simply because it's so costly to do so. ARM will give x86 competition but will not remove it from computing devices like laptops, PCs, and servers.
Talking about the geopolitical side: Intel can sell some fabs, the others in North America will stay for that specific reason, and everybody is happy, except Intel, which becomes another fab like GlobalFoundries.
0:10 That is such an irrelevant stat. Let's say Intel had a loss of only $1 billion but AMD was also struggling and just about broke even with a profit of only $1 million. Headlines: "Intel has 1,000 times as big a loss as AMD made in profits!"
Indeed. It also doesn't account for big investments and big write-offs that take a bite out of profitability. It would be much more useful to look at profit margins or market share.
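To make the point above concrete, here is a quick sketch, using only the hypothetical numbers from the earlier comment (not real financials), of how dividing a large loss by a near-zero profit produces a scary but meaningless headline ratio:

```python
# Hypothetical figures from the comment above (not real financials):
intel_loss = 1_000_000_000   # a $1B loss
amd_profit = 1_000_000       # a $1M profit, i.e. barely breaking even

# The headline ratio is huge...
ratio = intel_loss / amd_profit
print(ratio)  # 1000.0

# ...but it says nothing about either company's margins or market share;
# it mostly reflects how close to zero the denominator is.
```

The same $1B loss compared against a $500M profit would yield "only" 2x, which is why the ratio tracks the denominator far more than either company's health.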
That didn't happen: Apple's CPU engineers effectively became Intel's best quality-assurance service, but Intel ignored this, prompting Apple to set up another transition. "Apple came prepared": they had already done this back in 2005. An Apple CPU engineer disclosed this in an article back in 2017.
Apple had already been developing their own CPUs for their phones and tablets; those chips just got good enough to use in laptops. That itself is a damning indictment of Intel: a company that designs chips as a side business designed one that crushes Intel's in performance.
I do think they have made the right decision. The world right now literally depends on TSMC for chips; if someone managed to achieve parity, that would be huge, and it's what they need in order to go forward. It's just a matter of whether they can hold on long enough to see their vision realised. For a company this desperate, this isn't the time to play it safe.
Pat inherited this mess. Intel previously was consistently investing in their R&D division but stopped under the previous CEO (who was more marketing and business focused). Now that they have fallen way behind the competition, they have to play catch-up. If the US government wasn't worried about China annexing Taiwan and TSMC (the world's best silicon wafer manufacturer), Intel would have long since disappeared. Intel now represents the American hold on technology and is critical. With their current short-term cash raising (selling their land and leasing it back), they should survive long enough to become at least competitive with the other companies. Pat is engineering and talent focused and should be able to sort them out of this mess, as TSMC is under contract with the government to aid Intel. This is the make-or-break era for them, but they have the right parts to get through it.
How is Incogni so expensive? It's literally an automated bot sending the same request, just changing the name. I could understand if it were a couple of euros, since it's not funded by ads, but €14 per month?! Crazy.
@@cryonuess GDPR doesn't solve the issue Incogni is trying to solve. You are still dealing with many companies who hold your data, each of which you have an individual relationship with.
@@HildeTheOkayish Under the GDPR, companies are not allowed to sell your data to random data brokers. So you know exactly who has your data (companies that you have contracts with, plus credit bureaus that they exchange data with after you explicitly agreed to it). Also, it isn't allowed to collect more data than necessary (i.e., social security numbers may not be randomly collected when there's no need for that). Furthermore, Incogni claims that it needs to be a subscription service because companies will re-add your data shortly after it is deleted, so Incogni needs to stay active all the time. That, however, isn't allowed under the GDPR either.
There's also the fact that people with no idea about chips or computers get to decide who should lead the business. And these companies just prioritise shareholders over everyone else.
No, it didn't. Keep in mind that Intel had something like five fabs under construction, each costing around $20 billion, and as you know those projects were cancelled or suspended. They have debt, three years of bad products, and AMD is eating their lunch in servers and desktops. Compared to the size Intel was four years ago, it has vanished. It is not doing well and will not do better unless it improves soon, gets converted into a fab-only company, or the government takes control of the fabs. It is not looking good for Intel; they aren't doing well in any project.
It's the age-old story that plagues all tech giants. They become giants and then turn complacent because there's a lack of competition. Innovation stagnates, and the rest of the world catches up and eventually overtakes them. By that point the complacent company has gotten rid of most of its original passionate founders, with a management that only cares about shareholders, who only care about profits. In this process, they forget how to make a good product. Without passion, people just fail to see the potential of any startup. Google is going through the same fate, Microsoft something similar. All that said, I think appointing Pat as CEO was the right move. I do believe Intel will pull through in the end. I really want them to develop ARM chips or something similar; x86 is dying for consumer products, and there's no doubt about it.
@@TheSerpentDK Apple may be very successful in multiple product categories, but they're not really dominant in any of them, so I don't think they fit that archetype.
@@Frostbiyt Not dominant in market share, but I'd argue their Mx chips are dominant in performance, as are their Ax chips (at least by 6-9 months compared to Qualcomm et al.). Having said that, nobody can survive 10 years of stagnation...
Slight correction: Intel acquired StrongARM from DEC as part of a patent lawsuit. StrongARM later became the XScale architecture. Great overview of Intel’s woes.
Correlates? Apple told Intel to make chips that did not overheat in the thin laptops Apple was designing. Intel said sure, but nothing happened by the time Apple expected it, in 2014, so Apple went ahead with the ARM project. Intel knew what Apple was asking for and couldn't deliver, or didn't care to, or a mix of both. Intel refused to face ARM, so ARM could keep progressing like this nonstop.
@ I honestly don't know what you are trying to convey. You are just stating what happened, which is exactly my point. Apple was looking forward and Intel couldn't keep up; hence the worst Macs Apple has made are from 2015-2020. 2015 wasn't bad, but it for sure got worse with time.
Also, there is a rumor saying Intel fired a bunch of engineers in charge of the firmware and microcode. Since the chips couldn't really evolve, it was 12th gen but with more watts being pushed through a poor implementation, and there you had the result. It took them two years to fix the problem, and some say it is not really fixed, just patched.
@@shobuddy Not in this case, where it has had to be patched four times so far. That is not a fix; it was a way to calm the masses, who did not understand they should have applied for warranty service, an RMA, or a refund, because Intel screwed them.
Who could have foreseen that knowingly selling CPUs for years that could oxidize and thus be irreparably damaged, and then trying to weasel out of it, could have negative consequences for Intel? Apart from that, I would like to point out that without the IBM PC, Intel would long since have been a forgotten IT company that produced legendarily bad CPUs in the 1980s.
Beg to differ. As a former telecom/cellular engineer who had the opportunity to design in Intel, Hitachi, Motorola, and ARM microcontrollers ranging from 8-bit to 32/64-bit (8080/8051/HC05/HC11, 80386/MC680XX, ARM64), in things ranging from pagers to large central-office switches, my experience was different. I still remember our attempts in the late 80s/early 90s to switch to AMD/OKI/TI parts, only to be bitten by silicon bugs and reliability issues! Intel might not always have been the best, but they were not far from it most of the time!
As an Intel ex-employee, I can affirm that Intel employees are extremely skilled and the emphasis is always on innovation. The problem is always with decision making and leadership. They never envisioned the future or led in any domain; we always used to work to catch up with competitors' products while they continued releasing next versions. The complete leadership ladder needs an overhaul, and they need to bring in outside talent to at least survive in the first place.
19:56 I doubt focusing on AI would have helped, considering everything is written with Nvidia's CUDA programming platform and Intel's equivalent is completely different. Even AMD's is closer to it. Until recently they even helped the ZLUDA project, which emulates CUDA on non-Nvidia GPUs. The funniest part is that the project was initially made by an Intel employee to target an Intel API.
@@timothygibney159 ...CUDA is a software layer for working with Nvidia hardware. "Cloud based" just means that you're using someone else's desktop - you still have to use CUDA to interface with your accelerator hardware, no matter where the computer is.
@ You are uninformed. It's an operating system complete with its own APIs, services, networks, daemons, quantum computing, and a complete datacenter. Python functions call Lambda in AWS via APIs, while Azure does so with Python Functions. Completely abstracted from your hardware. 12 years ago you were correct. We use AWS for RDBMS, web hosting, and DNS services, while only hosting five virtual machines, which they call instances. SaaS and API calls to AI are the cloud now, with enterprise AI hardware.
That is actually a good strategy. By the looks of things, the US will do everything in their power to cut China off from the semiconductor supply chain, and when they push too far, China, the largest consumer of these products, will push back, and it has the capability to take out TSMC. They may even be able to develop their own chips on par with TSMC's in 10 years or less. So it's best to diversify and have the option to make things in-house, because the US is not the global hegemon anymore and it's no longer free rein for US companies.
Not gonna happen... chip making isn't like automotive, where it's primarily an integration problem. Chip making is a technical problem that's constantly evolving. TSMC will tweak their internal fab process every few weeks to improve yield; sometimes the only way a customer finds out is through their own acceptance testing. Also, a company like TSMC is really, really compartmentalized to avoid copy-cats and poaching by other companies, so only a very few people are aware of the entire process, and they lack the hands-on experience (more likely, they haven't done hands-on work for a decade or more) to know the nitty-gritty details. The only way I can see Nvidia pulling it off is to buy a company like GlobalFoundries (which famously failed at making EUV work) and invest hundreds of billions to catch up in 10-15 years.
@@wopmf4345FxFDxdGaa20 Everyone except China can buy EUV machines from ASML. It's not a matter of just buying a machine, else Intel wouldn't be so far behind. The entire manufacturing process is extremely complex and requires specialized knowledge to make things work.
IBM still exists, but it is not what IBM was known for; right now IBM is nothing, and Intel can't be another nothing in the market. IBM wants to push quantum computers, but that is not happening, not at the speed they expected. What will Intel do, be a fab for IBM? I bet they will survive, but not as a company that makes products you want to buy; rather as a company doing projects for someone else, like the IBM you mention, so everyone forgets them. Kinda sad, but Intel had it coming.
PowerPC is still around. IBM slipped up, though, and it can be easy to slip up. IBM had the ability to leapfrog Intel, but Intel eventually pulled ahead again after a few years. This made PowerPC really good for something like a game console. IBM screwed up with the Cell processor. The Cell was ahead of its time but, more importantly, took a hit when it came to general-purpose processing. It's an amazing DSP processor that's probably still used today, but we still need general-purpose processing. IBM wanted the Cell CPU in the Mac, and Apple didn't like it, which is a major factor in why they went with Intel. Intel's Core CPUs at the time were incredible on top of that. If the Cell CPU had been more balanced, it might have been in the Mac. In the PlayStation 3, the Cell was kind of used like a second GPU, but the problem is that it had to compete with the console's dedicated GPU, made by Nvidia.
@18:12, I believe the AI fabric chip that Amazon is interested in having Intel manufacture is just a customized Intel design, just like the custom Xeon chips Amazon will be buying. This is a less rosy indicator for Intel, because it means there's still zero indication of interest among chip designers in using Intel's foundry to manufacture their own designs. In other words, Intel Foundry Services still doesn't have any clients besides Intel itself, which doesn't bode well for that side of the business. Even if the node performs well, if nobody but Intel understands how to design for the node, IFS will continue to struggle.
Behind every single foundry there is a huge software stack that has to support using its services. What Intel has in this regard is a huge pile of home-made dung. Don't bet on them getting any better here.
I dunno how they survive, tbh. Their reputation in the CPU space was demolished by the i9 degradation issues and their denials of it; the Arc GPU line failed, and the announced successors are delayed or cancelled and either way do not challenge AMD or Nvidia. I think overstretching themselves is bad, and they really need to focus on simplifying and doing one thing better than anyone else.
The competition right now is AMD versus the entire ARM world. x86 can hold on for some time; then AMD will jump to ARM, I bet. Intel might survive, but only as a foundry, it seems. Time will tell: Qualcomm might buy just some fabs, others might remain under Intel's control. They are not saying much for now.
The problem with everyone using TSMC isn't only that TSMC's capacity is limited; it's also a single point of failure for global high-end chip manufacturing, with a big red target painted on it. If one day a mentally challenged head of state decides that military support for Taiwan should be limited, it could doom all tech companies in the West. Intel is one of the few companies in the West that can take the pressure off this single point of failure. So it's in everyone's best interest that Intel's manufacturing capacity be the best it can be. IMHO.
Well-made vid! Intel basically got too comfortable being the kind-of-untouchable king of CPUs and woke up way too late, so it still has plenty of catching up to do. It won't be easy for them, as competing in chip manufacturing with TSMC is extremely hard. Let's see where it all goes and if there's light at the end of the tunnel...
While I've been lifelong Team Red since my first K6 rig in the 90s, after seeing Intel biff it this hard, with their CPUs struggling to keep up with AMD's in performance per dollar ever since Ryzen caught them off guard at launch, I feel bad now and want Intel to succeed. That being said, let it be known that I have faith. These new fabs they're building stateside are the ticket. I just hope they can afford to hold out long enough to realize the gainz that having fabs at home will bring. If they can bust out a rad CPU in the next 5 years that competes better with AMD's offerings, I will go Team Blue just to show my support for a company that can bring itself back from the brink.
Important to remember that Intel still holds 2x the market share of AMD, and up until recently Intel was making multiple times AMD's revenue. Even today, Intel has outsold AMD handily in the desktop and laptop markets during 2024. They also didn't end up in this situation randomly; they made bad choices, and are being bankrolled by the American taxpayer.
@@EngineeringNibbles It's an enigma how they keep 2x or more of AMD's market share while making crappy products. People really don't understand how strong brand loyalty is.
My last Intel CPU was the i7-4770K. By the time I needed to upgrade, AMD had Ryzen, and the rest is history. AMD's stuff is so good I don't even bother with any CPU that consumes more than 65W. The stock cooler is fine; I'm done messing with liquid coolers or massive air coolers. Both AMD and Intel need to stop trying to make faster chips and instead focus on power efficiency. A Ryzen 5 5600G is more than powerful enough for 99.99% of users and will be for a decade to come. Give me a CPU with the same performance as a 5600G that consumes 5W, and then you can focus on making things faster again.
Not financial advice, but now is the time to buy or invest in Intel stock. Why? Because in reality Intel is way too big and has only two outcomes: either they get out of their problems, or they get bought out. Either way it's a win for the investor.
If I remember correctly, for some time, while AMD was more consumer-friendly and cheaper, Intel was more prestigious. The chips were noticeably comparable, but Intel clearly behaved like a monopolist, making consumers buy a new motherboard with each generation, while AMD was often compatible with older generations. So the equivalent Intel chip kind of lowered its own value: you knew a new chip would mean a new motherboard, and thus even more cost on an already more expensive product if you wanted to upgrade.
Intel was the frog in the boiling pan of water, and it was the huge number of bugs in Skylake that finally shoved Apple off Intel; half of those bugs were discovered by Apple engineers.
@@GreyDeathVaccine Insanely ambitious (so good, I hope). It was a design for a super core with insane IPC that could dynamically split into 2 or 4 cores to focus on multithreading depending on workload. The project was worked on by Jim Keller, one of the best CPU architects ever (formerly of AMD and Apple, and now ex-Intel as well).
We criticise Google for having two or more products in every category, but I think that's the key to innovation and success. If you're competing with yourself all the time, it's hard to get comfortable with your own success.
Lunar Lake has been so good. It has the best performance-to-efficiency ratio on Windows when you consider compatibility. Just saying it's "fine" seems like an understatement.
Lunar Lake's design, while brilliant, is something Intel doesn't want to keep doing long-term, because it's expensive for Intel to buy chips from TSMC and then package the dies and memory into a single package.
@@sydguitar99 Same thing with Arrow Lake. CPUs do more than just gaming; Arrow Lake is pretty good for productivity, and it consumes less power, which means less heat.
@@terminator. I somewhat agree with this sentiment and think Arrow Lake is a half-step in the right direction. It still doesn't take away from the fact that most public-facing CPU benchmarks are for gaming, and Arrow Lake is fairly overpriced for what it offers on that front. If they could just slash prices by around 10-15%, they probably wouldn't have got the drubbing they got from 90% of outlets. It being more meh news in between dumpster fires and meh news doesn't help matters.
No company is too big. Nokia was one of the biggest companies before with more than 50% of the phone market share. Look at them now. Although they are doing well, that is nowhere near where they were before.
You pointed out that Intel stopped coming out with new nodes, but you didn't cover why. I mention this because I've yet to see a video covering that "why."
Nice summary, but financially flawed. One does not save expenses by eliminating dividends; dividends are an after-tax reduction, deducted from equity/retained earnings.
AMD can't become like Intel. Nvidia, Qualcomm, Samsung, Broadcom, and all the other ARM companies pushing will not let AMD breathe. Things are changing; x86 is becoming the past, very fast.
I remember reading some investing articles from a few years ago (2021/2022, I think) which said Intel was one of the 10 largest companies in the world, not in terms of market cap but in terms of cash reserves, because of how much money they had from years of profitability. If Intel went bankrupt or insolvent, they'd have wasted an insanely large amount of money.
Don't you worry about Intel; they had a similar phase before. I still prefer their CPUs. I've had AMD and Intel over the last few decades, but I keep coming back to Intel.
Because you are a masochist and feel a deep urge to spend the money burning in your pockets. Intel has never before been this big with so few working projects and such bad results, never had so many fab projects around the world troubled, closed, or suspended, never fired so many people, and never lost more than half its value while letting AMD grow this much. Intel might survive, but not as it is now; at best it will be a small fab making AMD CPUs. Gelsinger himself said this year that they are begging other companies to come make their chips in Intel's fabs.
They didn't really have a similar phase. AMD did take the performance crown for a couple of years in the early 2000s, when the Athlon 64 came out after the Pentium 4 Netburst architecture failed to deliver, but Intel remained dominant in market share and profitable.
Intel receives money from the CHIPS Act (TAXPAYER money, who else?). Intel is laying off (TAXPAYING workers, who else?). Shareholders are cheering (there's a billionaire or two there, who else?). Cut TAX (for billionaires, who else?) and raise tariffs (raising prices paid by customers, who else?).
3:25 Don't we have programming languages like C, Java, Python, etc., mainly for this reason? Just compile for another architecture, and it will work almost automatically, especially with higher-level apps. For example, Windows had an ARM problem, but I don't get why. If you have the source for your app and for the libraries (there are a great number of FOSS or source-available libs), you can compile the app for ARM by changing literally a few letters.
It's not just about the code itself. There are no drivers for the ARM chips and their different variants; you can't use the same code there. It will take a while, and catching up to the level of optimization of x86 drivers will take even longer. Just look at ARM notebooks: you won't be able to run Linux properly on them for a while.
It's not that simple. Think about porting something to Linux: you have to think about library management, API compatibility, and driver compatibility. Porting to ARM is similar. *Your* software will work, but the libs, APIs, and drivers may not, and some of them may not have an ARM version. What do you do in that case? You'll have to write your own libs or find an alternative, which is not an easy task, and a waste of time for a market so small on the desktop.
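A minimal sketch of the thread's point, using only the Python standard library: source code is portable across CPU architectures because the language runtime hides the ISA, but anything compiled (C extensions, drivers, vendored binaries) is tagged per architecture and must be rebuilt for each target:

```python
import platform
import sysconfig

# The same .py source runs unchanged on x86-64 or arm64;
# the interpreter abstracts the ISA away from the program.
print("CPU architecture:", platform.machine())

# Compiled artifacts are not portable: binary wheels and C extensions
# carry a platform tag (e.g. 'linux-x86_64' or 'macosx-11.0-arm64')
# and need a separate build per target architecture.
print("binary platform tag:", sysconfig.get_platform())
```

On an x86 machine the second line reports an x86 tag; on an ARM laptop it reports an ARM one, which is exactly where "just recompile" stops being automatic: every native dependency in the chain needs its own build for the new tag.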
I honestly don't see Intel going out of business. They might have to sell their foundries like AMD did, but even in the worst-case scenario, the US government would bail them out to avoid AMD becoming a monopoly.
It was bean-counters that destroyed Intel's innovation culture, like they destroyed Boeing's engineering culture. That sort of thing doesn't seem easy to reverse.
This is what I'm sensing as well. Intel decades ago had weird hiring practices where you'd get a sudden hiring freeze. As soon as they started to dominate AMD, there wasn't as much focus on innovation in the CPUs. Lean manufacturing went too hard, too fast. I'm thinking they also got hit by the interest-rate hikes used to combat inflation.
Market cap is not a valid measure of what a company is worth; stock prices are no longer logical. Intel will survive, I don't have any doubt about that.
Excellent summary of Intel's struggles, better than a ColdFusion documentary I watched. While I feel Intel made many strategic errors, it does show that US manufacturing in itself has lost its competitive edge. We see many of the same faults with Boeing. In reality, financial markets no longer reward the real makers; US makers are no longer the powerful companies that actually make things. Apple will not survive without Foxconn and TSMC, and AMD and Nvidia similarly. This decoupling of US and China tech will likely leave America behind. You can build all the factories you want, but much of the talent is global, and already we are seeing signs of slow progress.
The biggest fail was limiting the consumer platform to 4 cores for such a long time; they just wanted to sell their enterprise workstation stuff to people. Then Ryzen came, first with 8, later with 16 cores on the budget platform.
Intel's foundries will survive; Intel's desktop and laptop CPUs will not. That's why the stock is down: a total shift from one business model to another. Think IBM in the early 2000s.
I don't think anyone wants ARM in desktop systems, or even laptops, but unfortunately both Intel and AMD totally and utterly ignore the mobile space, making at most derivatives of their desktop architectures for laptops instead of taking a bottom-up approach focused on efficiency. Intel literally gave up on the mobile space after their first two decent tries (the Z2xxx and Z3xxx Atoms) didn't pan out as they wanted.
@@e8root Apple moved their desktops and laptops to ARM and they became 10x better. x86 is doomed; that's why Microsoft is trying so hard to move Windows to ARM. Legacy x86 software on Windows might be the only thing keeping Intel alive.
Just a heads up, “stopping to pay dividends” at 15:07 has a completely opposite meaning from what I assume you intended. At least the way I interpret it, it means “stopping [in order] to pay dividends”, not “stopping -to- [the action of] paying dividends”
That was one of the most informative videos I've seen. To the point that I'll be sharing it with fellow tech nerds in my community. Thanks for doing all the research and packaging it into such an enrapturing narrative! If I wasn't already a subscriber, I would do it now in a jiffy.
We have seen this before with other companies like Sun, SGI, IBM, Commodore, Atari, 3dfx, etc. Some of these survived while others either died or got bought by competing companies. Many of these were also top dogs at one point.
XScale wasn't a small company, it was the name of Intel's second Arm microarchitecture and the first one they designed in-house. Intel acquired their first Arm microarchitecture called StrongARM from Digital Equipment Corporation, along with a fab in Massachusetts.
That Asus Zenfone was actually an incredible phone. Its performance was good and the camera quality was actually good too. The biggest problem was that ASUS didn't do enough software support, and it got sluggish over time due to the heavy skin on it. I absolutely loved the customization that phone had, though. Thanks for reminding me about that phone lol
My own comments on Intel's latest designs:
-Xeon 6 is quite competitive against the latest Epyc release, neither completely beating nor losing to the AMD parts, and I think business customers are okay with that (at least they have a second competitive choice once again, after a long drought of competition since the birth of Epyc).
-Lunar Lake trades performance for battery life, and while the battery life results aren't mind-boggling yet, there is still a chance for improvement with further software updates (BIOSes, microcode, drivers, etc.), and cheaper options have started to appear in the form of Ultra 5 variants inside mid-range laptop chassis like the Vivobook S series. The Ultra 5 makes a lot of sense for a lot of people because it retains 85 to 90% of the performance of the Ultra 7 and Ultra 9 parts while being a lot cheaper, making Lunar Lake a lot more attractive compared to the competition.
-I have to agree with you on Meteor Lake; the value is kinda meh all things considered. But at least it finally brought Intel's designs on par with the competition, while Intel literally transformed the way they design and manufacture their chips with the tile design and Foveros packaging. That's not a small step when you do it in one go.
-Arrow Lake desktop is just a mess: not competitive in gaming and still a meh improvement in productivity workloads. But if Intel is able to deliver on its promises to fix Arrow Lake's performance, I think it will become a just-fine product, nothing really special, but at least giving Intel parity in design with the Ryzen desktop lineup. The last surprise will be Arrow Lake mobile (the true next iteration of Meteor Lake). I expect Intel to do well here, since the TSMC node will give Intel a huge advantage in the lower voltage window.
-And one thing you missed is Battlemage, a.k.a. next-gen Intel Arc. Xe2 performs really well in Lunar Lake, so I expect this design to be competitive in discrete graphics form too. It will not be the market leader, but enough to keep Intel's designs up to date with GPUs from other companies.
All in all, they're decent releases. You can't expect them to do well right away. I just hope Intel will not stop here, because their rivals are getting stronger too.
9:27 This is just false. The advanced technology in Intel's 10nm, like using cobalt, wasn't the "wrong direction" at all. Basically every leading-edge fab has since adopted most of the things Intel tried with 10nm. They just tried to do too much all at once. Also, EUV wasn't going to be ready in time for 10nm's initially planned launch date!!! 🤦 THAT'S WHY IT DIDN'T USE IT!!! They only looked stupid for not using it because the process got delayed by so many years that EUV had not only arrived but become commonplace.
Get 60% off an annual Incogni plan here (sponsored): incogni.com/techaltar
Some additions:
Apple's CPU engineers became Intel's best quality-assurance service, but Intel ignored this, prompting Apple to set up another transition; "Apple came prepared," just as they did back in 2005.
Where can i find that shirt?
Hey, just FYI: Intel doesn't only do the equivalent of Arm, Nvidia, and TSMC. They also do the equivalent work of many EDA tool providers (Cadence, Synopsys, Mentor Graphics), wafer and mask manufacturers, and many, many more. They're in a different league.
Incogni is a scam.
The supposedly "private" data is actually public. There is nothing to hide.
@@neonoorian2292 Not really; it was a CPU engineer who worked for Apple and disclosed this back in 2017 in an article.
There is a slight slip of the tongue at 11:40: Apple, not Intel, took on the task of moving macOS to its own ARM chips.
Same, just noticed and went to comment, but you already did! 👍
Wanted to comment the same 😁
Queen's gambit
Apple, after Intel failed to deliver on their promises.
The problem was that Intel fired all their experienced people in the 2015 and 2016 layoffs. In many ways it was age discrimination; everyone was in their late 40s and 50s. A lot took early retirement, like me, because I could see the writing on the wall. The CEOs after the founders died or retired have not been leaders; they are middle managers at best and bean counters at worst! I am just glad I left when I did. They need a leader with vision!
Elon Musk will buy it and lead with fire for humanity...
Or buy linked in and close it 😂
The old white men had to go as part of their $300m investment in workplace diversity. Just think of how much trouble they would be in now if they hadn't decolonized their workforce.
@@mmuller2402 Scavengers are always lurking
@@mmuller2402 Elon Musk has never run any business that didn't depend on government subsidies or fail.
@@scifino1 How dare you point out this sad reality! :)
As much as we all shit on Intel, let's hope they don't exit the graphics card business, as it would mean AMD and Nvidia lose competition, and then they could jack up prices all they want and no one would be able to do anything about it.
It's cute you consider Intel a viable competitor in the GPU business that's making AMD and Nvidia keep pricing low 😂
Intel really can't leave graphics development, even if they killed discrete Arc tomorrow. There's a reason those JPR reports always oversell Intel in terms of shipped graphics processors. They need some form of graphics development because a sizable majority of PCs have only integrated graphics, and no one outside of a few diehards would want to go back to black-and-white terminals.
If it's any consolation, amid all the mostly justified crapping on Arrow Lake for consumers, the iGPU in those chips is leaps and bounds better than anything Intel put out before on desktop. If they had, like, a 10% price cut, I would just tell people to go grab a 265K or a 245K over an 8700G, since unlike AMD's APUs, the IO is actually full-fledged and not "gimped" (even if PCIe 5.0 is overkill for anyone not running a server farm where SSDs die every six months).
Slava TSMC 🇹🇼
I agree omg this price hike was a major money grab like removing gold from the dollar lol.
Intel graphics cards really suck; they are so weak that AMD's own integrated graphics are far better than Intel's.
If AMD came back, I'm sure Intel can too.
AMD came back because Intel had already fallen; there just was no competition. But as long as governments back Intel up, they'll come back.
@@rattlehead999 Don't forget: AMD gave up its foundry business. Not saying Intel should, but tough decisions... etc.
@@rattlehead999 Not sure if that would be correct. Before the Ryzen architecture, they were considered subpar, low-end chips, so that played a big part in their comeback.
@@rattlehead999 The best FX-series chip was beaten by a 4-core part at the time. Intel got complacent and forgot how to compete.
Relax, Intel is fine.
Although I genuinely hope Intel is gone for good. I don't like their policies.
I do not envy Intel's engineering and research people. Those people are just expected to perform miracles in physics. Probably the only people whose earning beyond 400k a year I wouldn't question.
I do not think that most of the problems Intel is facing today can be blamed on Pat Gelsinger. You have to understand that chip architectures take two to three years to develop. The chips that are out now were probably designed in 2021-2022, and while Gelsinger was already CEO, it's not like he was going to come in and upend everything the engineers and managers had already planned.
Additionally, he was handed a bloated organization with too many middle managers who have been known to be resistant to change and entitled. When death came knocking was the first time he was able to fire most of them without risking antagonizing the people he depended on for the organization to work.
Lastly, any technology cycle for fabs takes 5 to 10 years. This is the first time he has been able to make decisions on the future of his fabs, and Intel is investing more in the next tech than any other fab builder. And yes, that includes TSMC, which has refused to buy the new machines. They have said it is not good bang for the buck. Obviously I do not know if that is true.
Oh, and the GPUs came too late for the Gaming GPU crunch of 2020 onward and the current AI mania.
Thanks.
The GPUs came right on time. It was a good business idea. What failed was the technical execution: the chips were not competitive in performance. Had that been different, everything would have been just fine.
@@rosomak8244 I'd say it was both technical and execution. Broken as it was, people would have embraced Alchemist had it just come a few months earlier, when everyone was desperate for GPUs.
And even now, it's not too late to get into the GPU game. Nvidia still hoards everything from gaming to AI and data-center stuff, and AMD still hasn't gotten off the ground outside of gaming; there are spots available for anyone to get in, literally anyone who delivers.
Astute observation. Pat was handed a sinking ship and he needed to make some radical changes.
@@rosomak8244 And the driver situation at Arc's launch was pretty abysmal too.
Slava TSMC 🇹🇼
People already forgot where AMD was until Ryzen released.
Intel surely can fail and die but they really shouldn't, for many reasons.
Intel is in a worse position than AMD was in the FX era.
@@-Ice_Cold- Doubt it.
@@-Ice_Cold- lmao no, Intel still has more market share, which AMD didn't have during the FX era.
@@-Ice_Cold- Not at all. AMD only started recovering in 2013, when Sony started using their processors to power the PlayStation 4; they had lost the desktop market entirely at the time.
And they only rose to glory in 2017, when they released Threadripper and the Ryzen series, for workstations and desktops respectively. And yet, they are not the dominant player; Intel still is.
@@-Ice_Cold- Yeah no. AMD during the FX era was teetering on the edge of bankruptcy. Intel is probably never going to find themselves in that state, but don't jinx me if they do in the future.
"Why is Apple failing? When Apple first started, we were 10 years ahead of the competition. What happened in 10 years? The competition caught up and Apple stood still. Apple's solution is not to slash and burn but to innovate out of the problem." Steve Jobs, 1997.
This. The Macintosh was 10 years ahead of its time; the only thing that could come close at the same time was an Amiga.
But from 1995 to 2005 the Amiga was basically dead (they were really only used for early hackintoshes and legacy media).
I 100% agree with this
OS X came out at the same time as Windows XP; it took Windows 5 to 7 years to get Aero with Vista and Seven.
I deeply respect Apple; I just wish I could still buy an x86 Mac Pro that I could hackintosh the hell out of,
or the best OS 9-compatible PPC Mac.
I just don't really care about Apple Silicon.
Sane thought.
@@nxtvim2521 Both are still obtainable.
Apple was never 10 years ahead of the competition lol. Xerox was, but he was saying that referencing the decline of Apple in the mid-to-late 90s.
Funny how Intel currently runs a big ad campaign in Germany's biggest tech news outlet "c't" explaining why the future lies in x86.
Nah, everyone knows the future is with RISC-V.
The real future is clearly on PowerPC, don't fall for the lies spewed by Intel /s
it is though 😅
@@e8root just wait for teams of RISC-V cores wired together to emulate x86 better than Intel !
@@kayakMike1000 That will take a while if it ever happens. Right now x86 is still printing money because there is so much of it.
Like Churchill once said: X86 will never surrender!
Or as Thomas Arne once sang:
Rule, X86! X86, rule the waves!
X86 never, never, never will be slaves.
as logan paul once said: I like my architecture X86 bruh
@@docilelikewintercatfish I don't really know who I despise more.
IBM once thought that the IBM 360/370 architecture would last forever. Architectures have life spans. I have seen numerous ISAs come and go over the last 40 years. x86 is just a very successful one, like the IBM 360, but not an immortal one. We are seeing the transition from the x86 era to ARM. x86 will be a thing of the past within 10 years or so.
@@youcantata You really think so? ARM could give competition to x86 but can't replace it fully. Most servers run x86 and might continue to do so, as it offers broad application support compared to ARM. Most companies will not switch to ARM just because it's so costly to do so. ARM will give competition to x86 but will not remove it from computing devices like laptops, PCs, and servers.
Why is Intel insisting on the fabs? Geopolitical strategic reasons. The government will support Intel only if Intel has fabs.
Talking about the geopolitical side: Intel can sell some fabs, the others in North America will stay for that specific reason, and everybody is happy, except Intel, which becomes just another fab like GlobalFoundries.
0:10 That is such an irrelevant stat. Let's say Intel had a loss of only $1 billion but AMD was also struggling and just about broke even with a profit of only $1 million. Headlines: "Intel has 1,000 times as big a loss as AMD made in profits!"
Indeed. Also doesn’t account for big investment and big write-offs that take a bite out of profitability. Would be much more useful to look at profit margins or market share.
11:40 Frustrated with Intel's slow progress,
*APPLE* switched to their own chips.
You mistakenly said Intel switched to Intel chips.
that's what I call improvement
That didn't happen:
Apple's CPU engineers became Intel's best quality-assurance service, but Intel ignored this, prompting Apple to set up another transition; "Apple came prepared," just as they did back in 2005. An Apple CPU engineer disclosed this in an article back in 2017.
Yeah noticed that slip too. It was Apple, not Intel.
Another slip at 12:26. It's not i48, but rather i486.
Apple had already been developing their own CPUs for their phones/tablets. Those chips just got good enough to use for laptops. It's a damning indictment of Intel that a company that designs chips as a side business designed one that crushes Intel's in performance.
@@TUXmint Explained in this YT video:
th-cam.com/video/OuF9weSkS68/w-d-xo.html
I do think they have made the right decision. Literally the whole world right now depends on TSMC for chips; if someone managed to achieve parity, that would be huge, and it's what they need in order to go forward.
It's just a matter of whether they can hold on long enough to see their vision realised. For a company this desperate, this isn't the time to play it safe.
Pat inherited this mess. Intel previously was consistently investing in their R&D division but stopped under the previous CEO (who was more marketing- and business-focused). Now that they have fallen way behind the competition, they have to play catch-up. If the US government wasn't worried about China annexing Taiwan and TSMC (the world's best silicon wafer manufacturer), Intel would have long disappeared. Intel now represents the American hold on technology and is critical. With their current short-term cash raising (selling their land and leasing it back), they should survive long enough to become at the very least competitive with the other companies. Pat is currently engineering- and talent-focused and should be able to sort them out of this mess, as TSMC is in contract with the government to aid Intel. This is the make-or-break era for them, but they have the right parts to get through it.
How is Incogni so expensive?
It's literally an automated bot sending the same request, just changing the name.
I could understand if it was a couple of euros, since it's not funded by ads, but €14 per month?! CRAZY
Also, in the EU at least, this service is completely unnecessary, because we have the GDPR.
@@cryonuess The GDPR doesn't solve the issue Incogni is trying to solve. You are still dealing with many companies who hold your data, each of which you have an individual relationship with.
How else could they pay for all these ads? :)
@@HildeTheOkayish Under the GDPR, companies are not allowed to sell your data to random data brokers. So you know exactly who has your data (companies that you have contracts with, plus credit bureaus that they exchange data with after you explicitly agreed to it). Also, it isn't allowed to collect more data than necessary (i.e. social security numbers may not randomly be collected when there's no need for them). Furthermore, Incogni claims that it needs to be a subscription service because companies will re-add your data shortly after it is deleted, so Incogni needs to stay active all the time. This, however, isn't allowed under the GDPR either.
@@cryonuess 1) Hackers exist, so your data can get stolen even if not sold.
2) Just because it's not allowed doesn't mean no one does it.
I often watch your videos in awe of just how focused, clear, and articulate you can be
awe?
Don't get me wrong, I enjoy TechAltar, but maybe you should expand your reading and watching if this gives you "awe", ya boob.
The valley of death is when you get complacent with your leading tech and start hiring MBAs to run your company
19:36 That's why I hate public companies; stock price is irrelevant. Intel still has good revenue, and had good profits.
Furthermore, people with no idea about chips or computers get to decide who should lead the business.
Also, these companies just prioritise shareholders over everyone.
Arguably, they pissed away a lot of cash on share buybacks. The CHIPS Act effectively reimburses them for this.
@@hilal_younus Boeing agrees with you!
No, it didn't.
Keep in mind that Intel had something like 5 fabs under construction, each one costing around $20 billion to build; as you know, those projects were cancelled or suspended.
They have debt, 3 years of bad products, and AMD is eating their lunch in servers and desktops; it has all been eaten now.
The size Intel had 4 years ago has vanished. It is not doing well and will not do better unless it improves soon, or it gets converted into a fab-only company, or the government takes control of the fabs.
It is not looking good for Intel; there is no project in which they are doing well.
@boltez6507 That's why I buy the dip on intel :P
It's the age old story that plagues all tech giants. They become giants and then they turn complacent because there's a lack of competition. Innovation stagnates and the rest of the world catches up and eventually overtakes. The complacent company at that point has gotten rid of most of their original passionate founders with a management that only cares about shareholders who only care about profits. In this process, they forget how to make a good product. Without passion, people just fail to see the potential of any startup. Google is going through the same fate, microsoft something similar.
All that said, I think appointing Pat as CEO was the right move. I do believe that Intel will pull through in the end. I really want them to develop ARM chips or something similar. x86 is dying for consumer products, and there's no doubt about it.
Nah, Intel does do R&D; it's just that in hindsight other technologies turned out to be more popular.
Guess Apple is next
@@TheSerpentDKApple may be very successful in multiple product categories, but they're not really dominant in any of them, so I don't think they fit that archetype.
@@Frostbiyt Not dominant in market share, but I'd argue that their Mx chips are dominant in performance, as are their Ax chips (at least by 6-9 months compared to Qualcomm et al.). Having said that, nobody can survive 10 years of stagnation...
A tech CEO who quotes the Bible was the right appointment?
Slight correction: Intel acquired StrongARM from DEC as part of a patent lawsuit. StrongARM later became the XScale architecture. Great overview of Intel’s woes.
IMHO Intel is a large dinosaur, and large dinosaurs don't evolve and move quickly enough in the face of change.
Large dinosaurs are a popular choice for backing by other rich dinosaurs in the US government.
In its lifetime, Intel has pivoted its main business a few times.
In the chip business, being a dinosaur or not doesn't matter; you just need to have the most cash.
Are you retar/ded IRL?
You forgot that these dinosaurs still work with older apps and systems that are still running 😊
Crazy how the start of Intel's fall from grace correlates with the start of Apple's worst Macs, 2015-2020.
correlates?
Apple told Intel to make chips that did not overheat in the thin laptops Apple was designing. Intel said sure, but nothing happened when Apple expected it, in 2014, so Apple went ahead with the ARM project. Intel knew what Apple was asking for and couldn't or didn't care to deliver, or a mix of both.
Intel refused to face ARM, so ARM could progress like this nonstop.
@ I honestly don't know what you are trying to convey. You are just stating what happened, which is exactly my point. Apple was looking forward and Intel couldn't keep up, hence the worst Macs Apple has made are from 2015-2020. 2015 wasn't bad, but it for sure got worse with time.
So I guess now we kind of know why Intel's home CPUs had a high rate of failure in the past two generations. They rushed them.
Also, there is a rumor saying Intel fired a bunch of the engineers in charge of the firmware and microcode. Since the chips couldn't really evolve, it was 12th gen but with more watts being pushed, with a poor implementation, and there you had the result.
It took them 2 years to fix the problem, and some say it is not really fixed, just patched.
@@arch1107 In IT terms, a patch is generally considered a fix.
@@shobuddy Not in this case, where it has had to be patched 4 times so far.
That is not a fix; it was a way to calm the masses, who did not understand that they should have applied for warranty, an RMA, or a refund, because Intel screwed them.
Who could have foreseen that it could have negative consequences for Intel to knowingly sell CPUs for years that could oxidize and thus be irreparably damaged, and to try to weasel out of it in response?
Apart from that, I would like to point out that without the IBM PC, Intel would have long been a forgotten IT company that produced legendarily bad CPUs in the 1980s.
Beg to differ. As a former telecom/cellular engineer who had the opportunity to design Intel, Hitachi, Motorola, and ARM microcontrollers ranging from 8-bit to 32/64-bit (8080/8051/HC05/HC11 - 80386/MC680XXX - arm64) into things ranging from pagers to large central-office switches, my experience was different. I still remember our attempts in the late 80s/early 90s to switch to AMD/OKI/TI parts, only to be bitten by silicon bugs and reliability issues! Intel might not always have been the best, but they were not far from it most of the time!
As an Intel ex-employee, I can affirm that Intel employees are extremely skilled and the emphasis is always on innovation. The problem is always with the decision-making and leadership. They never envisioned the future or led in any domain. We always used to work to catch up with competitors' products, while they continued to release next versions. The complete leadership ladder needs an overhaul, and they need to bring in outside talent to at least survive in the first place.
19:56 I doubt focusing on AI would have helped, considering everything is written with Nvidia's CUDA programming platform and Intel's alternative is completely different.
Even AMD is closer to making it work. Until recently they even helped the ZLUDA project, which emulates CUDA on non-Nvidia GPUs.
The funniest part is that the project was initially made by an Intel employee to use an Intel API.
CUDA is out. AWS and Azure are where it's at. No one does AI stuff on desktops; it's all cloud-based now.
If Nvidia screws up, everyone may write everything in Vulkan.
@@timothygibney159 ...CUDA is a software layer for working with Nvidia hardware. "Cloud based" just means that you're using someone else's desktop - you still have to use CUDA to interface with your accelerator hardware, no matter where the computer is.
@ You are uninformed. It's an operating system complete with its own APIs, services, networks, daemons, quantum computing, and a complete datacenter. Python functions call Lambda in AWS via APIs, while Azure does so with Python functions. Completely abstracted from your hardware. 12 years ago you would have been correct.
We use AWS for RDBMS, web hosting, and DNS services, while only hosting 5 virtual machines, which they call instances. SaaS and API calls to AI are cloud now, with enterprise AI hardware.
@@oflameo8927 OpenCL is the competitor to CUDA; Vulkan is just a graphics API. CUDA and OpenCL are another thing.
Wow! You did an amazing job explaining this whole topic
The funny thing is that NVIDIA wants to make their own fabs to vertically integrate like Intel.
That is actually a good strategy. By the looks of things, the US will do everything in its power to cut China off from the semiconductor supply chain, and when they push too far, China, the largest consumer of these products, will push back, and it has the capability to take out TSMC.
They may even be able to develop their own chips on par with TSMC's in 10 years or less. So it's best to diversify and have the option to make things in-house, because the US is no longer the global hegemon and it's no longer free rein for US companies.
Not gonna happen... chip making isn't like automotive, where it's primarily an integration problem... chip making is a technical problem that's constantly evolving. TSMC will tweak their internal fab process every few weeks to improve yield; sometimes the only way their customer finds out is through their own acceptance testing. Also, a company like TSMC is really, really compartmentalized to avoid copy-cats/poaching by other companies, so only very few people are aware of the entire process, but they lack hands-on experience (more likely they haven't done hands-on work for a decade or more) to know the nitty-gritty details. The only way I can see Nvidia pulling it off is to buy a company like GlobalFoundries (which famously failed at making EUV work) and invest hundreds of billions to catch up in 10-15 years' time.
@@evangellydonut Doesn't TSMC use ASML's machines as well? That's the core technology, and it's not TSMC's own.
@@wopmf4345FxFDxdGaa20 Everyone except China can buy EUV machines from ASML. It's not a matter of just buying a machine, or else Intel wouldn't be so far behind. The entire manufacturing process is extremely complex and requires specialized knowledge to make things work.
I've not heard anything about Nvidia wanting to make their own fabs. I thought they wanted to make their own CPUs
Intel will probably have to pivot like IBM did to stay alive. The question is what they will pivot into.
IBM still exists, but it is not what IBM was known for; right now IBM is nothing, and Intel can't be another nothing in the market.
IBM wants to push quantum computers, but that is not happening, not at the speed they expected. What will Intel do, be a fab for IBM?
I bet they will survive, but not as a company that makes products you want to buy; rather as a company doing projects for someone else, like the IBM you mention, so everyone forgets them.
Kinda sad, but Intel had it coming.
PowerPC is still around. IBM slipped up, though, and it can be easy to slip up. IBM had the ability to leapfrog Intel, but Intel eventually pulled ahead again after a few years. This made PowerPC really good for something like a game console.
IBM screwed up with the Cell processor. The Cell was ahead of its time but, more importantly, took a hit when it came to general-purpose processing. It's an amazing DSP processor that's probably still used today.
We still need general-purpose processing. IBM wanted the Cell CPU in the Mac, and Apple didn't like it, which is a major factor in why they went with Intel. Intel's Core CPUs then were incredible on top of that. If the Cell CPU had been more balanced, it might have been in the Mac. In the PlayStation 3, the Cell was kind of used like a second GPU, but the problem is that it had to compete with a GPU made by Nvidia.
@18:12, I believe the AI fabric chip that Amazon is interested in having Intel manufacture is just a customized Intel design, just like the custom Xeon chips Amazon will be buying. This is a less rosy indicator for Intel, because it means there's still zero indication of interest among chip designers in using Intel's foundry to manufacture their own designs.
In other words, Intel Foundry Services still doesn't have any clients besides Intel itself, which doesn't bode well for that side of the business. Even if the node performs well, if nobody but Intel understands how to design for the node, IFS will continue to struggle.
Behind every single foundry there is a huge software stack that has to support using its services. What Intel has in this regard is a huge pile of home-made dung. Don't bet on them getting any better here.
They survived their chips not being able to do math. I think they're going to be just fine.
AMD: First time?
I dunno how they survive, tbh. Their reputation in the CPU space was demolished by the i9 degradation issues and their denials of them; the Arc GPU line failed, and the announced successors are delayed or cancelled and either way do not challenge AMD or Nvidia.
I think overstretching themselves is bad, and they really need to focus on simplifying and doing one thing, but better than anyone else.
They will survive for sure, but when they will make a comeback is still uncertain. Competition between the two is what gets us customers the best products.
The competition right now is AMD versus the entire ARM world. x86 can hold for some time; then AMD will jump to ARM, I bet.
Intel might survive, but only as a foundry, it seems. Time will tell; Qualcomm might buy just some fabs, and others might remain under Intel's control. They are not saying much for now.
Missed these longer form videos. Thank you!
The problem with everyone using TSMC isn't only that TSMC's capacity is limited; it's also a single point of failure for global high-end chip manufacturing, with a big red target painted on it. If one day a mentally challenged head of state decides that military support for Taiwan should be limited, it could doom all tech companies in the West. Intel is one of the few companies in the West that can take the pressure off this single point of failure. So it's in everyone's best interest that Intel's manufacturing capacity is the best it can be. IMHO
Well-made vid! Intel basically got too comfortable being the kind-of-untouchable king of CPUs and woke up way too late, so it still has plenty of catching up to do. It won't be easy for them, as competing in chip manufacturing with TSMC is extremely hard. Let's see where it all goes and if there's light at the end of the tunnel...
While I'm a lifelong Team Red since my first K6 rig in the 90s, after having seen Intel biff it this hard with how their CPUs have struggled to keep up with AMD's in terms of performance-to-dollars ever since Ryzen caught them off guard when it was launched, I feel bad now, and want Intel to succeed. That being said, let it be known that I have faith. These new fabs they're building stateside are the ticket. I just hope they can afford to hold out long enough to realize the gainz that having fabs at home will bring. If they can bust out a rad CPU in the next 5 years that competes better with AMD's offerings, I will go Team Blue just to show my support for a company that can bring itself back from the brink.
Important to remember that Intel still holds 2x the market share of amd, and up until recently Intel was making multiple times the revenue AMD was.
Even today, Intel has outsold AMD handily in desktop and laptop markets during 2024. They also didn't end up in this situation randomly, they made bad choices, and are being bankrolled by the American taxpayer
I feel this, I want them to succeed too, would be very interesting to see
It's no longer red vs. blue, now it's x86 vs. Arm, and Arm is winning
@@EngineeringNibbles It's an enigma how they keep 2x or more of AMD's market share while making crappy products. People really don't understand how strong brand loyalty is
My last Intel CPU was the i7-4770K. By the time I needed to upgrade, AMD had Ryzen, and the rest is history. AMD's stuff is so good I don't even bother with any CPU that consumes more than 65W. The stock cooler is fine; I'm done messing with liquid coolers or massive air coolers. Both AMD and Intel need to stop trying to make faster chips and instead focus on power efficiency. A Ryzen 5 5600G is more than powerful enough for 99.99% of users and will be for a decade to come. Give me a CPU with the same performance as a 5600G that consumes 5W, and then you can focus on making things faster again.
When the time comes, we'll need a sequel to this
Not financial advice, but now is the time to buy or invest in Intel Stock.
Why?
Because in reality Intel is way too big and only has two outcomes. Either they get out of their problems, or they get bought out.
Either way it's a win for the investor.
This has been your best effort to date IMHO. Thanks.
If I remember correctly, for some time, while ARM was more consumer friendly and cheaper, Intel was more prestigious.
It was noticeable that the chips were kind of the same, but Intel clearly behaved like a monopolist, making consumers buy a new motherboard with each generation, while AMD often stayed compatible with older generations. So the same chip kind of lowered its own value: you knew a new chip would mean a new motherboard, and thus even more cost on an already more expensive product if you wanted to upgrade.
I like when you dedicate a video to a topic instead of jumbling different topics together
There have been some questionable choices like selling off and renting back some of their buildings. Hopefully they can climb back.
Excellent video! I learn so much from this channel.
Ah I didn't realize that the long march had already begun .....
lol that says it so well, let's hope the chip market turns red... I mean blue
Unfortunately, I don't understand half of the graphs you show, e.g. 10:01, 11:19. A small explanation would be much appreciated.
Intel was the frog in the pan of boiling water, and it was the huge number of bugs in Skylake that finally shoved Apple off Intel; half of those bugs were discovered by Apple engineers
Skylake was my last Intel CPU. I switched camps and not planning to return.
My first time watching one of your videos - and wow, what great reporting - thank you!!
intel allegedly already cancelling their royal core project. so sad
Completely false. 18A which is their “core project” is almost in production
18A node has nothing to do with Royal core which is a CPU design.
And yep the Royal core project is dead, it looked insane though.
@@stefanbucur6472 i heard that before, many times, but here we are
@@aravindpallippara1577 Insane good or insane bad?
@@GreyDeathVaccine Insanely ambitious (so good, I hope) - it was a design for a super core with insane IPC that could dynamically split into 2 or 4 cores to focus on multithreaded workloads, depending on the load. The project was worked on by Jim Keller, one of the best CPU architects ever (former AMD, Apple, and now ex-Intel as well).
We criticise Google for having 2 products (or more) for every category but I think that's the key to innovation and success. If you're competing with yourself all the time it's hard to get comfortable with your own success.
Lunar Lake has been so good. It has the best performance-to-efficiency ratio on Windows when you consider compatibility. Just saying it's "fine" seems like an understatement
Lunar Lake’s design, while brilliant, is something Intel doesn’t want to keep doing long-term, because it’s expensive for Intel to buy chips from TSMC and then package the dies and memory into a single package.
@Jabid21 I mean it's a good starting point, I have the zenbook 14, and it lasts me all day. Just saying it's fine is dishonest
@@sydguitar99 same thing with arrow Lake. CPUs do more than just gaming. Arrow Lake is pretty good for productivity. And consumes less power, which means less heat.
@terminator. pretty much all the reviews for arrow lake and lunar lake have been very positive and they seem to be selling quite well
@@terminator. I somewhat agree with this sentiment and think that Arrow Lake is a semi-step in the right direction. It still doesn't take away from the fact that most public-facing CPU benchmarks are for gaming, and that Arrow Lake is fairly overpriced for what it offers on that front. If they could just slash prices by around 10-15%, they probably wouldn't have gotten the drubbing they got from 90% of outlets. It being more meh news in between dumpster fires and meh news doesn't help matters.
What's crazy is that AMD still hasn't crossed 30% CPU market share, 10 years into Intel's era of ++ nodes.
Intel's too big to fail
I hope so, I don't want to see AMD start being the new Intel again.
which means Congress will keep it afloat with taxpayer money. "free market" when the regulator asks, but not when I'm in deep shit for my own mistakes
I figure Intel can coast for 10 years but by then they're going to have to deliver.
Even after the Lehman Brothers crash, it's funny that people still say this.
No company is too big. Nokia was one of the biggest companies before with more than 50% of the phone market share. Look at them now. Although they are doing well, that is nowhere near where they were before.
You pointed out how Intel stopped coming out with new nodes, but you didn't cover why. I mention this because I've yet to see a video covering that why.
11:40 Little slip up of words
Nice summary, but financially flawed. One does not save expenses by eliminating dividends: dividends are paid out of after-tax profits and deducted from equity / retained earnings, so cutting them conserves cash rather than reducing expenses.
I want Intel to comeback because I don't want AMD to be like Intel 🌚
8 years ago you could have swapped the company names lol
amd can't become like intel; nvidia, qualcomm, samsung, broadcom and all the other arm companies pushing in will not let amd breathe
things are changing, x86 is becoming the past, very fast
I remember reading some investing articles from a few years ago (2021/2022, I think) which said Intel was one of the 10 largest companies in the world, not in terms of market cap but in terms of cash reserves, because of how much money they had from years of profitability. If Intel went bankrupt or insolvent, they'll have wasted an insanely large amount of money.
Don't you worry about Intel, they had a similar phase before. I still prefer their CPUs; I've had AMD and Intel during the last few decades, but I keep coming back to Intel.
And that proves what your opinion is worth.
because you are a masochist and feel a deep urge to spend the money that burns a hole in your pocket
intel was never this big while having so few working projects and doing this badly; they never had so many fab projects around the world in trouble, closed or suspended, never fired so many people, and never lost more than half of their value while letting amd grow this much
intel might survive, but not as it is now.
at best it will be a small fab making amd cpus. gelsinger himself said this year that they're begging other companies to come to intel to make their chips in intel's fabs
They didn't really have a similar phase. AMD did take the performance crown for a couple of years in the early 2000s, when the Athlon 64 came out after the Pentium 4 Netburst architecture failed to deliver, but Intel remained dominant in market share and profitable.
intel receive money from chips act (TAXPAYER money, who else?)
intel laying off (TAXPAYING worker, who else?)
shareholders cheering (there is a billionaire or two there, who else?)
cut TAXES (for billionaires, who else?) and raise tariffs (increasing prices paid by customers, who else?)
I love these amazing in depth analysis from TechAltar. This type of content makes my day!
Thank you!
Glad you enjoy it!
Their fault for trying to just give us quad-core CPUs and forcing motherboard upgrades. Look at how many CPUs AM4 supports.
Bought AM4 platform this year, cause I had 40 gigs of RAM from my Skylake setup. I am very happy with R9 5900X (12C/24T).
*UserBenchmark writers right now:*
STRESS LEVEL:
99% CRITICAL
Their single biggest mistake was not dropping x86 completely and going fully into a RISC architecture; they would be the standard now
I called that in the 90’s. It took a long time to catch up to them but catch up it did.
3:25
Don't we have programming languages like C, Java, Python, etc. mainly for this reason? Just compile for another architecture and it will work almost automatically, especially with higher-level apps.
For example, Windows had an ARM problem, but I don't get why. If you have the source for your app and for the libraries (there are a great number of FOSS or source-available libs), you can compile the app for ARM by changing literally a few letters.
For sure, but they still need optimisation to take advantage of each architecture's unique features
It's not just about the code itself. There are no drivers for the ARM chips and their different variants; you can't use the same code there. It will take a while, and catching up to the level of optimization of x86 drivers will take even longer. Just look at ARM notebooks: you won't be able to run Linux properly on them for a while.
It's not that simple. Think about porting something over to Linux: you have to think about library management, API compatibility, and driver compatibility. Porting to ARM is similar: *your* software will work, but the libs, APIs, and drivers may not, and some of them may not have an ARM version. What do you do in that case? You'll have to write your own libs or find an alternative, which is not an easy task, and a waste of time for a market so small on the desktop.
I honestly don't see Intel going out of business. They might have to sell their foundries like AMD did, but even in the worst-case scenario the US government would bail them out to avoid AMD becoming a monopoly.
Pat is Private Equity. They raped the company.
😂😂😂😂😂😂😂😂
It was bean-counters that destroyed Intel's innovation culture, like they destroyed Boeing's engineering culture. That sort of thing doesn't seem easy to reverse.
This is what I'm sensing as well. Intel decades ago had weird hiring practices where you'd get a sudden hiring freeze. As soon as they started to dominate AMD, there wasn't as much focus on CPU innovation. Lean Manufacturing went too hard, too fast. I'm thinking they also got hit by the interest-rate hikes used to combat inflation
Spot on, especially for a decade which is an eternity in tech
So the problem was manufacturing and the solution was to hold onto it for dear life and sell off half the company? What a genius CEO
Sure they will. I bet that Intel shares in 2025 will go sky rocketing!
Market cap is not a valid method for showing what a company is worth; reality and stock prices are no longer logical. Intel will survive, I don't have any doubt about that.
Excellent analysis!
Excellent summary of Intel’s struggles. Better than a ColdFusion documentary that I watched. While I feel Intel made many strategic errors, it does prove that US manufacturing in itself has lost its competitive edge. We see many of the same faults with Boeing. In reality, financial markets no longer reward the real makers; US makers are no longer the powerful companies that actually make things. Apple will not survive without Foxconn and TSMC, and AMD and Nvidia similarly. This decoupling of US and China tech will likely leave America behind. You can build all the factories you want, but much of the talent is global, and already we are seeing signs of slow progress.
Intel will be fine
The biggest fail was limiting consumer platform to 4 cores for such a long time. They just wanted to sell their enterprise workstation stuff to people. Then ryzen came, first with 8, later with 16 cores on the budget platform.
Intel foundries will survive
Intel desktop and laptop CPUs will not.
That's why the stock is down. A total shift from one business model to another. Think IBM in the early 2000s
I don't think their cpus will die. It's all connected for better or worse. Having a successful foundry helps their cpu profits
I don't think anyone wants ARM in desktop systems, or even laptops. But unfortunately, both Intel and AMD totally and utterly ignore the mobile space, making at most derivatives of their desktop architectures for laptops instead of taking a bottom-up approach focused on efficiency. Intel literally gave up on the mobile space after their first two decent tries (the Z2xxx and Z3xxx Atoms) didn't pan out as they wanted.
@@e8root Apple moved their desktops and laptops to ARM and they became 10x better. x86 is doomed; that's why Microsoft is trying so hard to move Windows to ARM. Legacy x86 software on Windows might be the only thing keeping Intel alive.
I'm not sure on the second - they have 66-70% market share
@@IBMboy Mostly only Microsoft and Apple are doing it; 95% of PC/desktop companies are still relying on x86.
this kind of excites me cuz i know intel is better than this and hopefully this will push them to innovate like they used to
Great insightful episode!
Just a heads up, “stopping to pay dividends” at 15:07 has a completely opposite meaning from what I assume you intended. At least the way I interpret it, it means “stopping [in order] to pay dividends”, not “stopping -to- [the action of] paying dividends”
Very good bedtime story. Watching at 6:30 AM (morning) just before going to sleep
11:40 - Uh I think you mean "frustrated by Intel's slow progress, *APPLE* took on the herculean task..." ?
'cause I've been blastin' and laughin' so long that
even my momma thinks that my mind is gone
yee be treated like a punk, you know that's unheard of
I had never considered before that vertical integration creates a weak spot through inter-dependance but it makes so much sense.
Thank you! I really like your "de-mystification" videos. I know it is a lot of effort, and your effort is genuinely appreciated 👍
What did Intel do wrong? 1: slept on the smartphone market and got left behind by Qualcomm. 2: slept on the AI market and lost it to Nvidia
That was one of the most informative videos I've seen. To the point that I'll be sharing it with fellow tech nerds in my community. Thanks for doing all the research and packaging it into such an enrapturing narrative! If I wasn't already a subscriber, I would do it now in a jiffy.
So Intel is currently stuck in Bane's Prison?
We have seen this before with other companies like Sun, SGI, IBM, Commodore, Atari, 3dfx, etc. Some of these survived while others either died or got bought by competitors. Many of these were also top dogs at one point.
Intel's new motto: "Yeah, though I walk through the valley of the shadow of death, I will fear no evil...."
So Intel is the next Nokia, which didn't board the right bus at the right time.
Wow, that was a great video. Excellent research!
XScale wasn't a small company, it was the name of Intel's second Arm microarchitecture and the first one they designed in-house. Intel acquired their first Arm microarchitecture called StrongARM from Digital Equipment Corporation, along with a fab in Massachusetts.
It's good to see good news like this
7:20 and so, Intel's tick tock model became more like a tiktok model
That Asus Zenfone was actually an incredible phone. Its performance was good and the camera quality was actually good too. The biggest problem was that ASUS didn't do enough software support, and it got sluggish over time due to the heavy skin on it. I absolutely loved the customization that phone had, though. Thanks for reminding me about that phone lol
my own comments on intel's latest designs:
-Xeon 6 is quite competitive against the latest Epyc release; it doesn't completely beat or lose to the AMD parts, and I think business customers are okay with that (at least they have a second competitive choice once again after the long drought of competition since Epyc was born).
-Lunar Lake trades performance for battery life, and while the battery life results aren't mind-boggling yet, there is still a chance for improvement with further software updates (BIOSes, microcode, drivers, etc.), and cheaper options have started to appear in the form of Ultra 5 variants inside mid-range laptop chassis like the Vivobook S series. The Ultra 5 makes a lot of sense for a lot of people because it retains 85 to 90% of the performance of the Ultra 7 and Ultra 9 parts while being a lot cheaper, making Lunar Lake a lot more attractive compared to the competition.
-I have to agree with you on Meteor Lake; the value is kinda meh, all things considered. But at least it finally put Intel's design on par with the competition, while Intel literally transformed the way they design and manufacture their chips with the tile design and Foveros packaging. That's not a small step when you do it in one go.
-Arrow Lake desktop is just a mess: not competitive in gaming and only a meh improvement in productivity workloads. But if Intel is able to deliver on its promises to fix Arrow Lake's performance, I think it will become a just-fine product, nothing really special, but at least giving Intel parity in design with the Ryzen desktop lineup. The last surprise will be Arrow Lake mobile (the true next iteration of Meteor Lake); I expect Intel to do well here, since the TSMC node will give Intel a huge advantage in the lower voltage window.
-and one thing you missed is Battlemage, a.k.a. next-gen Intel Arc. Xe2 performs really well in Lunar Lake, so I expect this design will be competitive in discrete graphics form too. It will not be the market leader, but enough to keep Intel's designs up to date with GPUs from other companies.
all in all, it's a decent set of releases. you can't expect them to do well right away. i just hope intel will not stop here because their rivals are getting stronger too.
9:27 This is just false. The advanced technology in Intel's 10nm, like using cobalt, wasn't the "wrong direction" at all. Basically every leading-edge fab has since adopted most of the things Intel tried with 10nm. They just tried to do too much all at once.
Also, EUV wasn't going to be ready in time for 10nm's initially planned launch date!!! 🤦 THAT'S WHY IT DIDN'T USE IT!!! They only looked stupid for not using it because the process got delayed by so many years that EUV had not only arrived but become commonplace.
Should've sold what stock I have left back in March. That's just great.