Hi all! Thanks for watching the video :) If you're feeling generous and would like to support my work, you can do so via Patreon (link in description) or using the 'Thanks' button underneath the video :) and if you're interested, check out the trailer for the next retro computing documentary on my channel! This project took 6 months to complete and was huge fun to make! If you enjoyed the video(s) then don't forget to subscribe, like, and share the video on social media! It really does make a difference when trying to grow a small channel. Thanks again everyone :) -Archie
if you will plan to get back to this topic and make an errata, you should correct info about the first computer. ENIAC despite american propaganda was not even third ( there was 2 versions of e.g. en.wikipedia.org/wiki/Colossus_computer and several constructions from Konrad Zuse, e.g. en.wikipedia.org/wiki/Z3_(computer) ).
This is technically true, but it ultimately comes down to your definition of “computer” - neither colossus or the z3 were Turing complete, so I ruled these out
It's a great video, very comprehensive. But the first thing I noticed was its exceedingly long time. This was why I almost didn't watch it. It should have been divided into at 3 parts at minimum, each no more than 29 minutes long. Thanks!
I'm an ASIC designer, i worked on Motorola MC68000 design. What your video fails to mention and is worthy of mention, is the ever constant fight between hardware and software. In the 60s,70,80s software developers needed to develop code within CPU constraints. (and memory). Then we saw software drive hardware... that is to say, if you wanted to play the latest games you needed to spend megabucks on the latest PC hardware. Then, a switch back around 2000 to chips being far superior and software not truly making full use of muticore threading. And now, we see CPUs evolution limited by foundries. Its now that we will see software start to drive innovation in CPUs
Software is still driving the hardware design. Just look at the development of GPGPUs over the past 20 years or the specialized processors in mobile devices. With the end of Dennard scaling and Moore's law, what we do with the limited transistors we have on a chip will become more and more important.
This brought back some vivid memories. I was 7 years old in 1979 when our elementary school library got it's first PCs. A pair of Apple II with the green monochrome displays. I joined an after-school class teaching BASIC 2.0 programming, the built-in programming environment that was part of the Apple II ROM. I recall settling on a Death Star related project for my program, as any sane 7 year old would have. I asked the teacher "How do you make a circle?" and his eyes lit up. He was being asked to explain Pi to a 7 year old and he was delighted.
The Apple 2 series never had a BASIC version 2, there was Integer BASIC (written by Steve Wozniak, and it had no floating point support) and AppleSoft BASIC, written by Microsoft, which did have floating point support built-in. I’d get a big smirk on my face to have a 7 year-old asking such questions because that’s a huge recursive rabbit hole taking a kid that age far deeper than most kids several years older ever go in their lives.
@@strictnonconformist7369 I knew that as well, but assumed that they just meant AppleSoft BASIC, since it was the 2nd BASIC version for Apple II computers; and they said 1979, which was the year that AppleSoft BASIC was released, along with the Apple II Plus, which had AppleSoft built-in. They possibly got the "2.0" in their memory from the widely popular Commodore 64, which called it's BASIC "V2" at the top of the screen when you turned it on with no cartridge inserted.
@@RetroDawn an interesting possible memory failure explanation, I can agree. I didn’t have enough access to Commodore 64s to have that burned into my brain.
I got to learn computers too on the Apple lle with green-only display. 1985-87. Fun memories! Never seen a home computer before, we were endlessly fascinated. One unit even had a color display and we would always fight over who got to use it. Many ancient games of Oregon Trail were played on those machines and others like it. Later, my 4th and 5th grade class got it's OWN computer, which felt extremely luxurious, and we discovered the wonders of Carmen SanDiego. 80s/90s, great times for kids, lulz.
Itanium wasn’t a RISC design, it was what’s known as a VLIW (very large instruction word) processor. It also relied on the compiler to optimise for concurrent operations (so, for example, while an arithmetic operation was in progress the compiler was expected to work out other steps that the CPU could handle that didn’t depend on the result), but compiler technology wasn’t really up to the task.
Which is sorta sad (not that I shed any tears for Itanium), because an iterative application of a logic language (like Prolog) probably would have been able to very cleanly encode the instruction sequencing.
That said, the Transmeta Crusoe was also a VLIW processor aimed at the mobile market with a compiler that translated x86 machine code on-the fly. BTW I remember the press release classifying VLIW as a RISC-like architecture
@@glenwaldrop8166 I can believe it. The computer press has a lot of folks who don't know the lower-level details of the technology. VLIW is definitely distinct from RISC, even if it builds off of RISC.
One of the early design goals for Itanium was for it to run HPs PA RISC binaries. Hence the very long instruction word. The reason Itanium failed was that they broke Intel compatibility. Having seen Digital’s, Alpha, technology, stumble, software vendors, like Oracle, Peoplesoft, JD, Edwards, SAP and countless others were not willing to invest in a new platform.
This is a great long form documentary on the history of CPU development. Very interesting and fun to watch, especially for a guy who is old enough to have seen it all (and work with most of it) as it played out. Thanks Archie! Well done!
Same here. When I was doing my doctorate in engineering at the U. of Waterloo, I built my own Apple II+ clone (soldered the whole thing) and used it do all my mathematical modelling of reactor fluid dynamics and heat transfer using MBASIC (LOL) and write my thesis. The PC had a CP/M card and 16k expansion card (Oooooo). The mathematics were so difficult to solve numerically that I had to develop a new mathematical method, and it took weeks to get a single solution. Now with my current PC, X670E MB and Ryzen 9 7950X CPU overclocked to 6.8 GHz, it takes a few hours.
@@jp34604 At the moment I am using the Dark Rock Pro 4. This cooler barely keeps the CPU at or below it maximum continuous operating temperature of 95 C at that clock speed, which I should clarify was achieved on a single core only. I use the AMD Ryzen Master software to optimize the overclocking. I plan on switching to a water cooler. I did not get one originally because the water cooler required for this CPU would not quite fit into my case (Corsair Obsidian 800D FT). I can, however, make some minor mods to the audio/USB and optical drive bays to make one fit.
yea me too. First computer was a single-card with TMS 9900. A neat chip for homebrew. I had to solder all the chips and sockets myself. Launched my career in microprocessors and hardware development.
This is a truly great exploration and documentary of the history of computing. As a child of the 70s I was already aware of lot of new technologies that emerged around that time. Home gaming with PONG, VHS machines and dedicated handheld single game machines. I was aware of the huge cost around 1980 of the PC. I played games in arcades in pre teen years until I received a Spectrum 48k and everything changed and to this day I'm a tech head.. I'm watching this now and I have learnt even more from it thank you.
I was 16 in 1980 and lived through the developement of the home micro and then the PC. I can relate to this video with great fondness having had the ZX81, BBC Electron and its add ons, BBC Micro, Atari STe. Buying my first PC in 97 (IBM Aptiva) and then building my own.... Its been fun watching this video bringing back great memories. Thanks for your hard work Archie...👌
I had the first Pentium in Portugal. 1994 😂 I got a Matrox 4MB from Canada. 😂 The IC had the famous bug from factory, so I changed it in the US, for a clean Pentium. 😂
This is spectacullarly comprehensive and relevant. I'm blown away. I only have one small quibble: During the 80s and 90s the presentation focuses on low cost solutions, while the 2000s focuses on high end x86. This leaves out mips and arm powered tablet computers, and SBCs like the raspberry pi. And they are relevant, especially arm powered SBCs. A new cycle was attempted to drive down cost with the netbook and tablet craze, but the software wasn't there yet, there just wasn't enough incentive to push Android as a new universal OS for home computers, and it wasn't suited to replace wintel. The raspberry pi, and linux distros ported to it, is the new platform.
This is great. Well done. It must have taken a very long time to narrate and animate. This is one of the best summaries of the microprocessor boom of the 70s, 80s, 90s, and beyond.
How the hell do this only have 1434 views and this channel only have 711 subscribers at the time of writing, is beyond me. Very reminiscent of RetroAhoy, and I mean that in best possible way. Keep doing content like this, and look into optimizing for the TH-cam algorithm, most obvious thing you might be missing is a decent video description, you are not giving TH-cam anything to work with there, stick a synopsis of the video in there, to hit a lot more of the juicy keywords - this video should be sitting at at least 100000 views or even way more by now, in my opinion.
This felt like a 90 minute video with 60 minutes of content. It goes slowly with lots of needless five second pauses that I guess were supposed to make the content seem more dramatic.
Should be way shorter, 7-9 minute chunks. And no long waits with a black screen. And also a catchier title would help. ‘Home Microprocessor’ doesn’t really describe the content, to my opinion.
Agreed. Way too long in one chunk. Also some of the early content was wrong, for example Random Access Memory is not "storing data in the same place as code", it's being able to access any element of data without having to first read all the earlier data. Get elementary things like this wrong and combine it with a far too long video and numbers will drop.
Fantastic video! As I was watching it, memories came back from all the computers I've had during my lifetime. From Sinclair, the commodore, the first IBM with DOS to the servers and PCs I'm still building to this day for customers.
Nice history! My first computer in high school was a single Commodore PET for the entire school, when I went to University I saved my pennies to buy my very own Sinclair ZX-81. What a beast. I recall the Pentium 4 years, and Intel's strange relationship with RAMBUS memory, with all the technical and legal issues that it came with.
We used to call it Rambust memory because it was a disaster for PC companies that decided to use Intel chipsets. Not even Intel’s own server division would use Rambus.
Great content! Love it! Did you know IBM wasn't the company that introduced Intel's 386 architecture with their PS/2 Systems? It was Compaq that beat Big Blue by 7 months with their very expensive high-end Deskpro 386, released in september 1986 vs the IBM PS/2 Model 80 which used the same 80386DX-16, released in April 1987. I think Compaq deserves to be mentioned in documentaries like these as it shaped computing history or at least had a vast influence on its development in the sense that the company played a key role in creating open standards which hugely benefitted/influenced the PC (clone) industry, being the quintessential PC clone manufacturer..
So glad that somebody today recognises just how significant the Compaq Deskpro 386 was, in my opinion just below the original 8088-based PC itself. It, and not the IBM PS/2, established the industry standard architecture, using the PC-AT's open 16-bit bus for peripherals. The greatest missed opportunity was Sun Microsystems' not basing their 386i workstation on the ISA; had they made their SunOS (later Solaris) run on the ISA, they would have blown both Microsoft and IBM out of the PC business, and from 1988 we would all be using full Unix and not the MS-DOS and Windows operating systems, which did not make anywhere near full use of the 386 architecture's power until the 2000's, when Windows became NT; Linux appeared in 1993, and by 1995 had the full power of SunOS/Solaris, but on the standard 86x architecture. Sun gave up in the 386i in 1989. (I replaced my Sun 386i with a Pentium based PC running Slackware Linux in 1995).
I don't normally watch videos longer than about 30 minutes, but this was worth every second of it. Most of it was a great trip down Memory Lane for me; I was born in 1960. Outstanding job!
@@TechKnowledgeVideo What a beautiful trip down memory lane for me! I started in 1979 with CP/M on the Z80 and thought I was the last person on Earth to find out about the microcomputer revolution. I then worked 41 years as a professional developer and Software Engineer in GW Basic, Clipper, and finally Java, Visual Basic, and various Javascript libraries. I retired in early 2021 working in Cloud protocols. I loved every minute of it! Every day was something new that no one had ever done before. I am so grateful that I got to work on the long march! Thank you so much for this beautiful presentation!
Despite quite a few inaccuracies with some earlier market products and trends, and market share implications, it was useful overview of the history of the microprocessor, especially. The post 1990 analysis was more accurate. Nice for me to remember some of the kit I worked with. I wrote a traffic light controller in Z80 assembler. I'd forgotten that!
Thank you for this. I'm 63. For half of my adult life I kept this documentary in my head but my brain clicked off when we stopped calling them Pentium. At least now I know how to intelligently shop for a computer again. Your level of research is impressive. V well done.
Good watch. One thing not mentioned was the fact that the desktop market is somewhat limited by the x86 core architecture. The same instructions from 1980 will function on an i7-12900k. ARM never had that hanging around their ankles. It will be very interesting to see how things develop from here.
@@Mr_Meowingtons It doesn't really no. I coded for the ARM1 Dev box that plugged into a BBC Master back in 1986 and ARM2 for the Arc 300/400 series and the A3000 BBC Micro. ARM2 was a chipset with the main CPU, video, memory and I/O controllers. ARM3 improved the CPU with 4k cache while the other chips were upgraded separately. Quite cool really. Those originals were wildly different from today and are not instruction compatible with even the true 32-bit ARM processors as they used as 24/26 bit bus with the rest of the bits used for passing around the equivalent of a status register and four level interrupt. After ARM3 came ARM6 and then the ARM600 series which were all true 32 bit addressing processors. There was also a cut down 16 bit ARM thumb architecture. DEC (and even Intel which bought DEC) released a StrongARM which powered some of the early Windows CE handheld devices like the Compaq iPaq and Dell Journada.
I agree about micro soft keeping their backwards compatibility but it is necessary. A shocking number of the country's businesses use old computer architecture with new interfaces grafted on. Losing backwards compatibility would (they say) cause a financial disaster. I think the softies should keep the Windows 11 and a new Windows system with limited and specific compatibility and complete control of API level architecture for security. They could rewrite the whole architecture to eliminate even the possibility of exploits. So essentially two Windows. I already have a nice new laptop securitized and firewalled to the point of uselessness and an older one used for games and social media only.
very good and thorough chronicle of early processor development - i like that you were able to trace the architectures to their root in von neumann, through the tubes, the IBM 360, the PDP, the 4004, and the altair (didnt catch if you mentioned this was the iconic machine which appeared to bill gates on the cover of a magazine and inspired the founding of microsoft). you did a nice work through the mainframe tube to transistors, and the microprocessor developments. the bascom calculator and the engineer calling for a simplified generalized design resulting in the 4004. thank you for this video. recommended.
And we all know what's happened in the 3+ years since this video was released. x86 is still the architecture of current desktop PCs, and AMD with Ryzen and Threadripper smack Intel around, and compete very closely with Nvidia in the GPU market, even beating them in some pure raster situations, but are behind when it comes to ray tracing. This technological journey that I have been able to watch and be a part of all these years is fascinating. I'm 54 so I have witnessed a lot, especially the juiciest parts starting in the early early 80s. Thank for the video.
You should talk about DEC Alpha as well. In RISC design. Because that was totally insane on the RISC front. Also Intel's 860 and 960. Wildly different architectures, of which 960 survived rather far, even into printers and space applications.
Since Faggin was mentioned, I think it's time for Jim Keller to get one also. Worked on DEC Alpha, Athlon, Zen, and for Apple, Tesla, and many others. Now at Tenstorrent, AI may be the answer to the next big discovery in just about everything.
Excellent documentary and walk down memory lane. This was the world of computing in which I and my peers in the computer business evolved, and I remember the steps and the back and forth chip competition. It's interesting to reflect on what has happened since this video was produced, and an addendum that brings us up to 2024 would be a logical follow-on to this excellent treatise. I was an early fan of the 6502, then the 680X0 for embedded designs, and then the rise of the Intel-based high volume home computers that transformed the landscape. The progress has been truly stunning.
And let's not forget the first 16 bit home computer (severely crippled by an 8 bit bus and closed software and hardware development by TI) the ill fated 99/4a with the TMS9900 (but it did sell close to 3 million units, virtually all at a loss).
I got one for Christmas that year and I loved it. I had the BASIC cart and Assember language. And a bunch of games. I got the BASIC cart by using the limited BASIC on board to show my parents what it could do and convincing my parents that instead of the home budget cart I could program it in BASIC. And I did. ❤
Thanks for keeping this information not only alive, but digestible, for people like me who didn’t experience it first-hand. It’s incredibly fun reading about everyone’s experiences in the comments with the context about the historical events and the progress of technology from the video. Also, the animations and video editing are top notch ❤
Most people especially in USA didn't hear anything about collossus till about 2002, although some of us born in UK with parents in military, knew of it years before the uk government in 1975 started to declassify it, although it was kept a secret as to it's purpose of use during ww2, I knew though cos me dad told me lol
I'm a mechanical engineering student who has a few friends who know WAY too much about computers. I've been slowly trying to learn but this has been tremendously useful in that goal! You explain things very well to someone who barely knows anything about computers. Well done! And thank you
Thank you so much for this. I used to be a software developer back in the 90's to early 00's and knew the hardware development to the dual core CPU's. But after that, lost my interest in hardware side due to becoming a stay at home Dad. So thanks for this. Really filled in a lot of gaps.
I was getting worried that you were not mentioning ARM what so ever until about 80% of the way through. Given the world dominance of this architecture, not one to miss. The early Archimedes Computers from Acorn were a step change in performance over the Intel architecture at the time, but their lack of maths co-processor became a significant disadvantage during the media war as graphics has a high compute demand.
I felt the same way while watching. The ARM did have an FPU (FPA10) and was designed to support coprocessors, but indeed the standard Archimedes didn't ship with it, this meant software didn't really take advantage since it was likely FP instructions would be slow with the trap based floating point emulator. Acorn did try to break into the UNIX workstation market which would have meant much more powerful ARM based computers in the early 90s had it been successful. Even then, Acorn chose to make the FPU optional even on their flagship £3,995 R260 (rebranded Archimedes A540), without an FPU you have to wonder what market Acorn was actually aiming for!
The bit where you talk about the IBM mainframes contains some errors. Mainly, the IBM 360 and 370 series used BJT transistors, not MOSFETS. Mainframes and mini-computers used either discrete BJT transistors or simple IC's like the 74xxx series, using BJT transistors exclusively. MOSFETS came to their own with the microprocessors, like the 4004.
Well made documentary Archie. I've seen many 'history of the cpu' videos and yours is by far the most informative and thorough one. I enjoyed it alot. Thank you
Excellent video! Glad to see you mention the 4004 and the 8008. Beyond the home Microprocessor, there is the growing massive influx of Microprocessors into the automotive industry where multiple RISC devices simultaneously perform complex independent tasks, communicating with each other via evolving Canbus standards to a central cpu which has the infamous Obd port. This application has become a free-for-all in the industry for the sake of bling but rendering owners and auto mechanics constantly second-guessing what is going on. Would be great to see you make a video on this too, thank you!
I still remember my Apple II computer in 1979. Programs were loaded on it with an audio cassette tape recorder. Later, I got a floppy drive and thought that was truly awesome.
I had Phenom II. It was a great budget CPU. I used it for some fairly hefty tasks, like chess analysis. Clock speed wasn't everything back then. Bulldozer was a disaster and I didn't upgrade my CPU until Ryzen generation.
You say it was a disaster but I owned one of those cpus for 10 years and it still plays Triple AAA games to this day. Sure they ran hot and used a lot of electricity but that was all AMD CPU back then
@@golangismyjam Yeah, my brother uses my dad´s old fx8350 and it works fine for games like fortnite, minecraft or roblox. hell it can even run elden ring with a gtx970
@@manuelhurtado9970 The hexa and octa cores FX CPUs can still run modern titles well enough, they sure did age better than the i3/i5s from the same time period that started struggling when games used more than 4 threads.
@@manuelhurtado9970 Yeah, it may had make it easier to develop/manufacture an octa core CPU without being too expensive (An FX 8150 has a die size of 315 mm² vs Intel's Xeon E5 2650 till 2690 all octa cores with a die size of 435 mm² on Intel's 32 nm that was denser than Global Foundries 32 nm process), but it crippled their IPC... AMD hoped software would caught up fast and fully utilize the 8 threads to make it a more compelling option than the competition, sadly it only happened when the hardware was already obsolete.
At the 1 hour mark, you note that the i7-920 didn't often show a performance gain. Having been an early adopter of it, moving from the Q6600, I have to strongly disagree with that. The difference was obvious and noticeable the minute you did more than treat it as a "faster older PC". The multi-tasking performance was astounding and the overall responsiveness was a step up. I ended up buying 2 of them in 2009 to replace both of my Core2Quads at my office, the difference was noticeable enough.
Truly excellent documentary. Well done. Charts the time line of my career from the 8008 to the present day. If you do release an updated version of this video it would be good to add a mention the DEC Alpha risc machine. And also mention Colossus from Bletchley Park. There were various other microprocessors that you could make passing reference to along the way such as the LSI11 implementation of the PDP11 and the microprocessor versions of the VAX architecture. Also HP had some proprietary microprocessors that they incorporated into their own products.
The production quality here is absolutely incredible and you've not even hit 3K subs yet. You deserve 1000x that, easily. I'm also seeing parallels to Ahoy in all the best ways. Thanks to the algorithm for bringing me here, and thank you for making excellent content; you've got a new sub!
Superb history summary! Thanks. I just recently retired from 40 years in this industry, so I remember *_a lot_* of this. What’s wild though, is that the parts of this history I do _not_ remember as well, are probably the parts that most do: I mostly worked in the embedded-compute arena, have been an Apple dude from the Apple ][ days, and am not much of a gamer, so the exact details of the Intel-AMD x86 wars were not something I followed in much detail.
In 1965 I was an EE student at Birmingham University. Dr Wright was one of our lecturers and his speciality was semiconductor devices - particularly heterojunction transistors. In a lecture he mentioned the latest thing - integrated circuits. I asked him if he thought it would be possible to get a complete computer on an integrated circuit. My recollection is that he told me not to ask silly questions. He obviously thought I was taking the piss.
Yes indeed. CP/M was written in 8080 Assembly Language so it was only ever going to use the 8080 opcodes that the Z80 also ran for compatibility sake. When I wrote CP/M utilities back in the 1980s, even though I wrote in Z80 Assembly Languange, I had to make sure to only use those 8080 instructions from within the Z80 set otherwise my software would not have run on a lot of machines!
Very well done, I grew up programming on 6502, Z80, 68000 and the dreaded 386. I have early memories of the Acorn’s ARM desktops finally the ARM has come of age
Curious mark did an interesting visit to Japan to see a relay based computer that is still running. Iirc they were used commercially once. Worth a look if you are interested.
Great job! I was there at there back in the 80's and worked in computer retail through to 2008. It was a fantastic journey and your video brought the memories flooding back. Thank you!
Nitpicking: the ENIAC was not the first digital computer. That has been the Zuse Z3 in May 1941. It wasn’t fully electronical, though, as most of it was relay based electromagnetic parts.
Indeed, others have also pointed to the Zuse machines. The video starts with ENIAC as I didn’t want to go too far back, and as it was programmable, digital, electronic, and Turing complete, it seemed like a good place to begin.
7:33 : A note for anyone reading the comments- the "electric field controlled" transistor (field-effect transistors), in the form of the JFET, had actually been devised and patented _over a decade_ before the transistors otherwise described at this point in the video (the BJT, which was created before WW2) had been created, but they required too much precision to make at the time. In contrast, the then-common form of the BJT (one of the germanium versions, I think) could be created on a lab bench with a capacitor of the right capacitance, a battery of a specific voltage (to accurately charge the capacitor), and a holder for the BJT that was used both to make it and use it, with the final BJT literally looking like it's schematic symbol (the "bar" is the actual germanium). There were even "foxhole" transistors created with razor blades and clothes pins.
Likewise... good grounding for assembly programming.... I used to discuss with a Z80 programmer... he couldn't see how it possible to write programs with only 2 x 8 bit registers available!
@@lazymassHave a peek at the TH-cam Channel of „ChibiAkuma“. Keith will teach you all the things to write Code for 6502 or Z80 or 68000 … based Computers. Among of this he published two books „Learn Multiplatform Assembly Programming with ChibiAkumas“ part 1 & 2… Highly recommended!
Awesome comprehensive video. Kinda surprised you didn’t mention Intel “tick tock”, and AMD releasing the first 5GHz processor. Even if AMD did it vía a technicality.
Typically I'm hard to surprise where it comes to computing and the lot, but now I learnt about maybe dozens of "new" historical machines. Well done, indeed.
This was thoroughly enjoyable and a great trip down memory lane. Why didn’t you include the shift of including graphics processors on the cpu? Great video and very detailed.
Thank you! Ultimately in a video like this, you have to limit your scope somewhere, and since integrated graphics and other dedicated on-die accelerators are a fairly new concept (only really appearing in the last 10 years) they were left out.
Yorkshireman, Geoffrey William Arnold Dummer, MBE, C. Eng., IEE Premium Award, FIEEE, MIEE, USA Medal of Freedom with Bronze Palm was an English electronics engineer and consultant, who is credited as being the first person to popularise the concepts that ultimately led to the development of the integrated circuit, commonly called the microchip, in the late 1940s and early 1950s.
For old geeks like me, this was a pleasant journey down memory lane. You have put such a lot of research into this project and spent a long time with the editing. Your efforts are very much appreciated 😊
This is an excellent overview of how things have changed and for me (active in development from 1991 onward) shows just what a mess things really were, and how essentially things were held back. I wish there were more mention of how anti-competitive practices did much to create the duoply in the desktop sphere which existed until 2020 but this addition would have made the video at least twice as long :) Here in 2022, things still appear "open." I'm sitting here with an unused Linux box with an original FX water-cooled series AMD under my desk, and writing this comment on a 2021 MacBook Air which despite so little RAM outperforms everything I've ever personally owned while making no noise. Will we see more and more ARM, or, might something disruptive and interesting emerge? We'll see.
as a kid i of course had no idea how new computer tech really was. i was born in 1988. i remember the mid 90s playing on a computer, and the games were on flopy disc's. they were all so ghetto, but back then its all i knew. then the late 90s my parents got a whole brand new computer and wow. msn, computer games, surfing the web. obviously its a lot different today, but you could still generally do the same stuff. talking to my friends, playing games, researching cool shit. and ever since its only elaborating. its bizzare that it all came around right as i grew up and was able to just fall into it. insane times. will be interesting to see how far it goes.
This is a great video, well put together and researched. I lived through most of this (from the 60s forward) and it is nice to see it all condensed together in a timeline. I was hoping to see something about Intel buying DECs chip division and gaining knowledge from the Alpha processor design (fast speeds and high power dissipation) but understand that not everything can be included. Near the end of the 8086 era, the NEC V series chips also had an impact with consumers as well to increase performance. Congratulations on some excellent work.
To say that Intel Itanium was a RISC design is a bit of a stretch. Actually, back then it was the RISC crowd that said that Itanium's VLIW approach was doomed to failure. The main difference between VLIW (which Intel calles EPIC) and advanced superscalar RISC designs is that EPIC does not allow for out of order (OoO) and other dynamic execution of instructions. Instead all this burden is put on the compiler. In fact, if you'd ignore dependencies of instructions and thus the order of instructions produces a wrong result, Itanium will deliver a wrong result happily. Itanium does no data dependency checking at all, this has to be done by the compiler. Removing all dynamic execution features presents a dilemma: The compiler, which has to be very very smart, is forced to find every bit of instruction level parallelism (ILP) during compilation. EPIC has no way of any sort of reordering or re-scheduling. If the compiler isn't able to find ILP there isn't any at all, instread a NOP is issued to the pipe, resulting in VLIW instruction bundles which load only 1 of 3 pipes with work, the other 2 just do NOPs. In that case you lose badly. This static scheduling is especially difficult with loads, since the memory hirarchy presents us with practically impossible to predict memory delays. VLIW/EPIC works best with code like matrix multiplication, which is a very static w.r.t. input data. In such cases parallelism is basically given. An easy job for a compiler to parallelize code like this. But such code is rather rare in a typical desktop or server setting. Also such SIMD computations can be done nicely in vector units of non-VLIW processors, like SSE or AVX in the x86 world. In short, VLIW/EPIC is an architecture that is targeted too much towards specific computational classes to be a general purpose CPU architecture. Also writing a compiler for EPIC which is able to extract every bit of ILP, was/is close to impossible. There were other problems specific to Intel's implementation, notably that executing legacy x86 code was painfully slow.
Baring any memory & similar delays (not particularly familiar with the design, so not sure how those would be handled), it shouldn't actually be _too_ difficult to get decent (not necessarily perfect) scheduling done. In essence, you compile & link, but don't output a final executable, instead producing a SSA form or similar. You then take this, and throw it through a scheduler stage- the scheduler defaults to outputting a NOOP for _all_ of the executable code, but _looks for_ instructions that it can grab from the SSA form as _replacements_ for NOOP instructions, marking those SSA instructions as "finished" in the process. The matching would probably be best done by a module written in Prolog or something. Wouldn't be all that fast, but with a decently sized window should be fairly effective.
The production quality on this is so darn high, how is this channel so small?? You've definitely earned a new subscriber, and I hope to see new content from you in the future!
Hi customsongmaker, I’m surprised you were served that many ads - I just rewatched it on another one of my channels and I got less than 1/4 of that so I’m unsure why you got so many. I would add that the video has only been monitized for less than a week and I’ve been playing around with the ads to see what the best ratio is r.e watchability vs revenue. I have received a few comments suggesting that the ad frequency is a little high and I will be adjusting that accordingly when I’m back at my computer in a few days. One final thing to say that as a small creator ad revenue is the only source of income (no patrons or TH-cam members), and looking to the future (I will be finishing my PhD next year so may not have a flexible schedule after) it will be difficult to continue making videos like this without revenue. I appreciate your comment and will take feedback on board - feel free to keep an eye on the community tab for more updates :)
@@TechKnowledgeVideo I counted the ad breaks on the progress bar. I didn't actually spend 20 minutes watching advertisements just for your video, I stopped watching very early. If there had only been 3 ads - 1 at the beginning, 1 in the middle, and 1 at the end - I would have watched 3 ads, which is 3 times as many ads as I actually watched. Try breaking this video into 5-minute or 10-minute videos. "The Complete History of the Home Microprocessor: Part 1". Then you can see how many people will watch 5 minutes or 10 minutes, with 1 ad. If they like it, they will continue to watch the rest. They will also subscribe, since it makes them think of you as someone with many good videos that they don't want to miss.
The ad breaks on the progress bar do not correspond to ads you will actually see - TH-cam will only play about 1/5 of the adverts it displays on the progress bar - which for this video works out at about every 9 minutes. If you look at the channel I have done what you have said - split this video into 5 separate videos, which viewers will typically see 1 mid roll or less per video (with average video lengths of around 20 minutes). However, this combined video has been far more popular than the individual parts. As to why this is I’m not sure, but the algorithm far prefers pushing this video out than the others. I would add that video retention has stayed the same in the week the video has been monitized compared to the prior week - people on the TH-cam partnered subreddit have done loads of testing on a whole range of videos and against logic it really genuinely doesn’t affect watch time. However, having watched the video back myself for the first time, I do think the quality of the video is degraded with the current frequency of adverts and I really want people to have a great viewing experience. Hence I will reduce the number of ads after the weekend. If you do want to watch the video without ads feel free to use an ad blocker or watch the individual parts which are a lot lighter ad wise :)
Bravo! What a great video. Very interesting and held my attention through to the very end. I pretty much have lived through this whole transition in the home computer world. This was spot on and didn't really miss a thing. Well done!
Back in 2004 I bought a super-compact Staples branded laptop with a processor which was a fairly unusual fork and I think wasn't mentioned here. It has a 1GHz Nehemiah C3 processor by Via, designed for low consumption and cooling requirements. It was a "netbook" PC before the name had been coined, and served me well for many years.
I took my undergraduate degree at the University of Pennsylvania, where I did my introductory computer science courses in the Moore School of Engineering building where Eniac had operated. At that time units of the old machine were lying around in open storage on the first floor and when I would walk to class I'd go right past them and be amazed at how large and clumsy they seemed. Two years later I started seeing ads for the Timex ZX81, the American version of the Sinclair machine, appear in magazines like Scientific American. The juxtaposition of those two computers got one thinking about how far the revolution had come, as well as how far it could go.
A great long form documentary, if a bit anglo-centric, but It contained a few errors. As one of the software developers in the late seventies and eighties, I am well aware of the origin of MS-DOS 1. Microsoft did not buy it, it obtained a copy of CP/M-86 with another name and a different disk file structure. In other words, MS-DOS 1 was essentially pirated software. The author of CP/M-86 was a friend who told me the entire sordid story and why he didn't sue -- "Bill (Gates) was my former student." And that was how the greatest fortune in history up to that time was created.
Really awesome video! Regarding the future of ARM on desktop, I think that this will come when some solution to efficiently emulate x86 gets developed. Apple was able to make the jump because of the restricted and curated nature of their ecosystem. But the "Windows world" is so dependent on legacy software (which is also endorsed by Microsoft with their insistence on stable APIs and backward compatibility) in both home and business that I think that this is absolutely crucial. Apple has Rosetta for this purpose, not sure how efficient it is to be honest.
Apple can efficiently emulate Windows 11, and its comparability layer to run even Windows 95 era software on ARM, it already exists... Microsoft is already transitioning and Windows 11 has a hardware comparability layer much like Apple's Rosetta which runs X86/64 code at near native speeds. I have it installed in a VM on my Mac. The only complaint regarding ARM based computers is where the GPU power is going to come from. Due to its current architecture Apple is GPU locked to its Silicon processors that have no ability to access off chip GPUs. Microsoft/Intel will have to come up with an alternative chip solution but that's only IF they want to keep the GPU industry alive... The days of the GPU or even Front Side Bus are severely limited anyway, eventually as Apple is proving with Silicon it can do at least mid level GPU performance on the same chip as the CPU and run most games well at, at least 1080p with playable frame rates (in terms of the games available). Intel is working on a similar chip to the Apple M1, M2, and M3 but it seems they are further away than ever from achieving it, and if they do so, it won''t be x86 based either... Which leaves AMD in a better position, especially as they acquired ATI, and then the dark horse which is Nvidia which is also producing PC like performance with its Sheild based devices. This all leaves Intel in the weakest long term position currently in terms of its long term roadmap with hot, bloated, and slow processors (until you are draining about 500watt/hours which no one really wants to do anymore) and unable to dye shrink due. to either losing performance, or the current heat issues, where at about 100degrees without water cooling you can basically cook an egg on top of the CPU coolers of most Intel based workstations/desktops. Then in the mobile segment, struggling to push battery life up to 4hours (vs 16 on a Mac) while basically using desktop components in what can only be classified under the old term of "sub-desktop" computers. Not withstanding the horrible quality of cases, keyboards, and track pads by comparison to Unibody Macs (which in design principle are now 15 years old). That's a bother. You get a better user experience in terms of tactile functionality and ergonomics from a 2008 Unibody Mac than any current Intel or AMD laptop on the market which is why they dominate the industry in that market space still to this date where if you don't need a stupid API like .NET frameworks, or Direct X every man and his dog is using a MacBook Pro as the user experience is miles ahead.
I hate to add "old-guy" perspectives, but I lived through the transition from tubes to transistors. Modern kids view that transition as a short misdirection in the march toward the integrated circuit. But it was a huge advancement that has nothing comparable in modern times.
Im in the same boat! But it does give us the ‘under the hood’ knowledge that lets us make systems do things better faster with the same hardware! I still write code that doesn’t waste any more resource than necessary!
Very interesting video, I watched the whole 1:26h. I didn't know that actually the first microprocessor was done for F14. Since it is de-classified now could you point to a video with more details?
Thank you! Unfortunately I cannot post links here, however I can recommend a great article by WIRED entitled “The Secret History of the First Microprocessor, the F-14, and Me” which goes into more detail :)
The Encarta cameo brought back some flash back memories. What a time to be alive. I am happy to see what the world looked like before mainstream Personal Computers. I think computers have really revolutionized humanity in many different ways, for better and for worse.
A nice and relatively detailed summary off the home processor up until the discussion of the ARM processor. The ending leaves the taste that ARM will rule the world and yet there is no mention on RISC-V. In my opinion, RISC-V will be the end-game for uP as even royalties will eat into profit. Still, much of the video is relatively accurate and informative; thank you.
The M1 has already been surpassed by the latest offerings from Intel and AMD... That's how the CPU game has always worked - every time a "new" CPU comes along and takes the "King of the Hill" spot, the designers from every competitor simply get back to work, continuing the never-ending cycle which began decades ago.
@@mojoblues66 They sure are fast but that's mostly because of the software ecosystem that Apple developed since they started designing their ARM SOCs, considering that is easy to see how well an ARM SOC can perform when the software is well optimized for it, for now it isn't really the fastest compared to the highest end x86 CPUs but it's overwhelmingly more efficient. Still even considering all of that technology prowess Apple isn't very consumer friendly with it's lack of upgradability, hard repairability and high prices. I don't see that changing until other major vendors start offering good competition with a good software ecosystem (Windows on ARM still lacking on that regard), that's sincerely much harder when you don't design both "in house".
Well done, i really enjoyed even as one that has followed the microprocessor since the 6502 was popular. I still learnt boatloads of details id never normally research about
Hi all! Thanks for watching the video :) If you're feeling generous and would like to support my work, you can do so via Patreon (link in description) or using the 'Thanks' button underneath the video :) and if you're interested, check out the trailer for the next retro computing documentary on my channel!
This project took 6 months to complete and was huge fun to make! If you enjoyed the video(s) then don't forget to subscribe, like, and share the video on social media! It really does make a difference when trying to grow a small channel.
Thanks again everyone :)
-Archie
if you will plan to get back to this topic and make an errata, you should correct info about the first computer. ENIAC despite american propaganda was not even third ( there was 2 versions of e.g. en.wikipedia.org/wiki/Colossus_computer and several constructions from Konrad Zuse, e.g. en.wikipedia.org/wiki/Z3_(computer) ).
This is technically true, but it ultimately comes down to your definition of “computer” - neither colossus or the z3 were Turing complete, so I ruled these out
@@TechKnowledgeVideo can you also make history of gpu
y're welcome & i like what i've seen so far 👍
It's a great video, very comprehensive.
But the first thing I noticed was its exceedingly long time. This was why I almost didn't watch it. It should have been divided into at 3 parts at minimum, each no more than 29 minutes long. Thanks!
I'm an ASIC designer, i worked on Motorola MC68000 design. What your video fails to mention and is worthy of mention, is the ever constant fight between hardware and software. In the 60s,70,80s software developers needed to develop code within CPU constraints. (and memory). Then we saw software drive hardware... that is to say, if you wanted to play the latest games you needed to spend megabucks on the latest PC hardware. Then, a switch back around 2000 to chips being far superior and software not truly making full use of muticore threading. And now, we see CPUs evolution limited by foundries. Its now that we will see software start to drive innovation in CPUs
That’s a very interesting point! Thanks for sharing :)
Okay
Software is still driving the hardware design. Just look at the development of GPGPUs over the past 20 years or the specialized processors in mobile devices. With the end of Dennard scaling and Moore's law, what we do with the limited transistors we have on a chip will become more and more important.
@@charliefoxtrot5001 thanks mike
Softwares still drive hardware price snd performance for budget segments
This brought back some vivid memories. I was 7 years old in 1979 when our elementary school library got it's first PCs. A pair of Apple II with the green monochrome displays. I joined an after-school class teaching BASIC 2.0 programming, the built-in programming environment that was part of the Apple II ROM. I recall settling on a Death Star related project for my program, as any sane 7 year old would have. I asked the teacher "How do you make a circle?" and his eyes lit up. He was being asked to explain Pi to a 7 year old and he was delighted.
The Apple 2 series never had a BASIC version 2, there was Integer BASIC (written by Steve Wozniak, and it had no floating point support) and AppleSoft BASIC, written by Microsoft, which did have floating point support built-in.
I’d get a big smirk on my face to have a 7 year-old asking such questions because that’s a huge recursive rabbit hole taking a kid that age far deeper than most kids several years older ever go in their lives.
I tried rotating a multi-segmented circle (AKA Star Castle arcade game) in Commodore basic 2.0 when I was maybe 12 in 1982. Can you say slow!
@@strictnonconformist7369 I knew that as well, but assumed that they just meant AppleSoft BASIC, since it was the 2nd BASIC version for Apple II computers; and they said 1979, which was the year that AppleSoft BASIC was released, along with the Apple II Plus, which had AppleSoft built-in.
They possibly got the "2.0" in their memory from the widely popular Commodore 64, which called it's BASIC "V2" at the top of the screen when you turned it on with no cartridge inserted.
@@RetroDawn an interesting possible memory failure explanation, I can agree. I didn’t have enough access to Commodore 64s to have that burned into my brain.
I got to learn computers too on the Apple lle with green-only display. 1985-87. Fun memories! Never seen a home computer before, we were endlessly fascinated. One unit even had a color display and we would always fight over who got to use it. Many ancient games of Oregon Trail were played on those machines and others like it. Later, my 4th and 5th grade class got it's OWN computer, which felt extremely luxurious, and we discovered the wonders of Carmen SanDiego. 80s/90s, great times for kids, lulz.
Itanium wasn’t a RISC design, it was what’s known as a VLIW (very large instruction word) processor. It also relied on the compiler to optimise for concurrent operations (so, for example, while an arithmetic operation was in progress the compiler was expected to work out other steps that the CPU could handle that didn’t depend on the result), but compiler technology wasn’t really up to the task.
Which is sorta sad (not that I shed any tears for Itanium), because an iterative application of a logic language (like Prolog) probably would have been able to very cleanly encode the instruction sequencing.
That said, the Transmeta Crusoe was also a VLIW processor aimed at the mobile market with a compiler that translated x86 machine code on-the fly. BTW I remember the press release classifying VLIW as a RISC-like architecture
@@andrejszasz2816 You are correct. I remember several articles about Itanium and describing it as "Intel just doesn't want to call is RISC".
@@glenwaldrop8166 I can believe it. The computer press has a lot of folks who don't know the lower-level details of the technology. VLIW is definitely distinct from RISC, even if it builds off of RISC.
One of the early design goals for Itanium was for it to run HPs PA RISC binaries. Hence the very long instruction word. The reason Itanium failed was that they broke Intel compatibility. Having seen Digital’s, Alpha, technology, stumble, software vendors, like Oracle, Peoplesoft, JD, Edwards, SAP and countless others were not willing to invest in a new platform.
This is a great long form documentary on the history of CPU development. Very interesting and fun to watch, especially for a guy who is old enough to have seen it all (and work with most of it) as it played out. Thanks Archie! Well done!
Thank you for your kind words Dennis, they mean a lot :)
Same here. When I was doing my doctorate in engineering at the U. of Waterloo, I built my own Apple II+ clone (soldered the whole thing) and used it do all my mathematical modelling of reactor fluid dynamics and heat transfer using MBASIC (LOL) and write my thesis. The PC had a CP/M card and 16k expansion card (Oooooo). The mathematics were so difficult to solve numerically that I had to develop a new mathematical method, and it took weeks to get a single solution. Now with my current PC, X670E MB and Ryzen 9 7950X CPU overclocked to 6.8 GHz, it takes a few hours.
@@dovahkiin159
.
How do you cool 6.8 gigs of clock speed, chilled water?
@@jp34604 At the moment I am using the Dark Rock Pro 4. This cooler barely keeps the CPU at or below it maximum continuous operating temperature of 95 C at that clock speed, which I should clarify was achieved on a single core only. I use the AMD Ryzen Master software to optimize the overclocking. I plan on switching to a water cooler. I did not get one originally because the water cooler required for this CPU would not quite fit into my case (Corsair Obsidian 800D FT). I can, however, make some minor mods to the audio/USB and optical drive bays to make one fit.
yea me too. First computer was a single-card with TMS 9900. A neat chip for homebrew. I had to solder all the chips and sockets myself. Launched my career in microprocessors and hardware development.
This is a truly great exploration and documentary of the history of computing. As a child of the 70s I was already aware of lot of new technologies that emerged around that time. Home gaming with PONG, VHS machines and dedicated handheld single game machines. I was aware of the huge cost around 1980 of the PC. I played games in arcades in pre teen years until I received a Spectrum 48k and everything changed and to this day I'm a tech head.. I'm watching this now and I have learnt even more from it thank you.
I was 16 in 1980 and lived through the developement of the home micro and then the PC.
I can relate to this video with great fondness having had the ZX81, BBC Electron and its add ons, BBC Micro, Atari STe. Buying my first PC in 97 (IBM Aptiva) and then building my own.... Its been fun watching this video bringing back great memories.
Thanks for your hard work Archie...👌
Thank you so much! Glad it brought back many happy memories :)
I had the first Pentium in Portugal. 1994 😂
I got a Matrox 4MB from Canada. 😂
The IC had the famous bug from factory, so I changed it in the US, for a clean Pentium. 😂
This is spectacullarly comprehensive and relevant. I'm blown away.
I only have one small quibble: During the 80s and 90s the presentation focuses on low cost solutions, while the 2000s focuses on high end x86. This leaves out mips and arm powered tablet computers, and SBCs like the raspberry pi. And they are relevant, especially arm powered SBCs.
A new cycle was attempted to drive down cost with the netbook and tablet craze, but the software wasn't there yet, there just wasn't enough incentive to push Android as a new universal OS for home computers, and it wasn't suited to replace wintel. The raspberry pi, and linux distros ported to it, is the new platform.
I can't believe I just watched a feature length video about microchips...but you know what - I enjoyed every second of it!
Glad you liked it! :)
This is great. Well done.
It must have taken a very long time to narrate and animate.
This is one of the best summaries of the microprocessor boom of the 70s, 80s, 90s, and beyond.
Thank you so much :)
How the hell do this only have 1434 views and this channel only have 711 subscribers at the time of writing, is beyond me. Very reminiscent of RetroAhoy, and I mean that in best possible way. Keep doing content like this, and look into optimizing for the TH-cam algorithm, most obvious thing you might be missing is a decent video description, you are not giving TH-cam anything to work with there, stick a synopsis of the video in there, to hit a lot more of the juicy keywords - this video should be sitting at at least 100000 views or even way more by now, in my opinion.
Thank you!
This felt like a 90 minute video with 60 minutes of content. It goes slowly with lots of needless five second pauses that I guess were supposed to make the content seem more dramatic.
Commercial interruptions every ~6 minutes - i may continue viewing beyond the 30' mark in a better mood , coz the content is ok
Should be way shorter, 7-9 minute chunks. And no long waits with a black screen. And also a catchier title would help. ‘Home Microprocessor’ doesn’t really describe the content, to my opinion.
Agreed. Way too long in one chunk. Also some of the early content was wrong, for example Random Access Memory is not "storing data in the same place as code", it's being able to access any element of data without having to first read all the earlier data. Get elementary things like this wrong and combine it with a far too long video and numbers will drop.
Fantastic video! As I was watching it, memories came back from all the computers I've had during my lifetime. From Sinclair, the commodore, the first IBM with DOS to the servers and PCs I'm still building to this day for customers.
Glad it brought back many happy memories :)
Nice history! My first computer in high school was a single Commodore PET for the entire school, when I went to University I saved my pennies to buy my very own Sinclair ZX-81. What a beast.
I recall the Pentium 4 years, and Intel's strange relationship with RAMBUS memory, with all the technical and legal issues that it came with.
Thank you! Very interesting to hear about your computing journey :)
We used to call it Rambust memory because it was a disaster for PC companies that decided to use Intel chipsets. Not even Intel’s own server division would use Rambus.
@@picklerix6162 I used to call it RamButt due to all the difficulties.
WOW really good documentary. It's clear,simple to understand and complete thanks
Thank you so much! :)
Great content! Love it!
Did you know IBM wasn't the company that introduced Intel's 386 architecture with their PS/2 Systems? It was Compaq that beat Big Blue by 7 months with their very expensive high-end Deskpro 386, released in september 1986 vs the IBM PS/2 Model 80 which used the same 80386DX-16, released in April 1987. I think Compaq deserves to be mentioned in documentaries like these as it shaped computing history or at least had a vast influence on its development in the sense that the company played a key role in creating open standards which hugely benefitted/influenced the PC (clone) industry, being the quintessential PC clone manufacturer..
Thanks! Interesting stuff :)
So glad that somebody today recognises just how significant the Compaq Deskpro 386 was, in my opinion just below the original 8088-based PC itself. It, and not the IBM PS/2, established the industry standard architecture, using the PC-AT's open 16-bit bus for peripherals.
The greatest missed opportunity was Sun Microsystems' not basing their 386i workstation on the ISA; had they made their SunOS (later Solaris) run on the ISA, they would have blown both Microsoft and IBM out of the PC business, and from 1988 we would all be using full Unix and not the MS-DOS and Windows operating systems, which did not make anywhere near full use of the 386 architecture's power until the 2000's, when Windows became NT; Linux appeared in 1993, and by 1995 had the full power of SunOS/Solaris, but on the standard 86x architecture. Sun gave up in the 386i in 1989. (I replaced my Sun 386i with a Pentium based PC running Slackware Linux in 1995).
oh man, this brought back memories... well done...
Thanks!
I don't normally watch videos longer than about 30 minutes, but this was worth every second of it. Most of it was a great trip down Memory Lane for me; I was born in 1960. Outstanding job!
Glad you enjoyed it!
@@TechKnowledgeVideo What a beautiful trip down memory lane for me! I started in 1979 with CP/M on the Z80 and thought I was the last person on Earth to find out about the microcomputer revolution. I then worked 41 years as a professional developer and Software Engineer in GW Basic, Clipper, and finally Java, Visual Basic, and various Javascript libraries. I retired in early 2021 working in Cloud protocols. I loved every minute of it! Every day was something new that no one had ever done before. I am so grateful that I got to work on the long march! Thank you so much for this beautiful presentation!
My heart went all a-flutter at 28:00 when the Amiga was shown. Great video, fast pace but so thorough!
Glad you liked it!
Despite quite a few inaccuracies with some earlier market products and trends, and market share implications, it was useful overview of the history of the microprocessor, especially. The post 1990 analysis was more accurate.
Nice for me to remember some of the kit I worked with. I wrote a traffic light controller in Z80 assembler. I'd forgotten that!
This may be nearly a year old but it's still absolutely brilliant. Thank you for putting it together!
Thank you so much! :)
Darn tootin'.
@@daveroche6522 THREE CHEERS FOR THE CREATOR!
Thank you for this. I'm 63. For half of my adult life I kept this documentary in my head but my brain clicked off when we stopped calling them Pentium. At least now I know how to intelligently shop for a computer again. Your level of research is impressive. V well done.
Thank you! Glad you enjoyed it :)
Good watch. One thing not mentioned was the fact that the desktop market is somewhat limited by the x86 core architecture. The same instructions from 1980 will function on an i7-12900k. ARM never had that hanging around their ankles. It will be very interesting to see how things develop from here.
Are you saying ARM made in 1985 has nothing to do with todays ARM?
@@Mr_Meowingtons It doesn't really no. I coded for the ARM1 Dev box that plugged into a BBC Master back in 1986 and ARM2 for the Arc 300/400 series and the A3000 BBC Micro. ARM2 was a chipset with the main CPU, video, memory and I/O controllers. ARM3 improved the CPU with 4k cache while the other chips were upgraded separately. Quite cool really.
Those originals were wildly different from today and are not instruction compatible with even the true 32-bit ARM processors as they used as 24/26 bit bus with the rest of the bits used for passing around the equivalent of a status register and four level interrupt.
After ARM3 came ARM6 and then the ARM600 series which were all true 32 bit addressing processors. There was also a cut down 16 bit ARM thumb architecture. DEC (and even Intel which bought DEC) released a StrongARM which powered some of the early Windows CE handheld devices like the Compaq iPaq and Dell Journada.
I agree about micro soft keeping their backwards compatibility but it is necessary. A shocking number of the country's businesses use old computer architecture with new interfaces grafted on.
Losing backwards compatibility would (they say) cause a financial disaster.
I think the softies should keep the Windows 11 and a new Windows system with limited and specific compatibility and complete control of API level architecture for security. They could rewrite the whole architecture to eliminate even the possibility of exploits.
So essentially two Windows. I already have a nice new laptop securitized and firewalled to the point of uselessness and an older one used for games and social media only.
This video was incredible. Thanks a lot for putting all the time and effort into this! really clear and well put together
Glad you enjoyed it!
very good and thorough chronicle of early processor development - i like that you were able to trace the architectures to their root in von neumann, through the tubes, the IBM 360, the PDP, the 4004, and the altair (didnt catch if you mentioned this was the iconic machine which appeared to bill gates on the cover of a magazine and inspired the founding of microsoft). you did a nice work through the mainframe tube to transistors, and the microprocessor developments. the bascom calculator and the engineer calling for a simplified generalized design resulting in the 4004. thank you for this video. recommended.
Thank you! :)
And we all know what's happened in the 3+ years since this video was released. x86 is still the architecture of current desktop PCs, and AMD with Ryzen and Threadripper smack Intel around, and compete very closely with Nvidia in the GPU market, even beating them in some pure raster situations, but are behind when it comes to ray tracing. This technological journey that I have been able to watch and be a part of all these years is fascinating. I'm 54 so I have witnessed a lot, especially the juiciest parts starting in the early early 80s.
Thank for the video.
You should talk about DEC Alpha as well. In RISC design. Because that was totally insane on the RISC front.
Also Intel's 860 and 960. Wildly different architectures, of which 960 survived rather far, even into printers and space applications.
Since Faggin was mentioned, I think it's time for Jim Keller to get one also. Worked on DEC Alpha, Athlon, Zen, and for Apple, Tesla, and many others. Now at Tenstorrent, AI may be the answer to the next big discovery in just about everything.
And he totally skipped over DEC's foray into desktop computing in 1978 - the DECstation!
the DEC Alpha really was the first 64bit architecture. Too bad the managers were too nice for the cut throat IT business.
Excellent documentary and walk down memory lane. This was the world of computing in which I and my peers in the computer business evolved, and I remember the steps and the back and forth chip competition. It's interesting to reflect on what has happened since this video was produced, and an addendum that brings us up to 2024 would be a logical follow-on to this excellent treatise. I was an early fan of the 6502, then the 680X0 for embedded designs, and then the rise of the Intel-based high volume home computers that transformed the landscape. The progress has been truly stunning.
And let's not forget the first 16 bit home computer (severely crippled by an 8 bit bus and closed software and hardware development by TI) the ill fated 99/4a with the TMS9900 (but it did sell close to 3 million units, virtually all at a loss).
I got one for Christmas that year and I loved it. I had the BASIC cart and Assember language. And a bunch of games.
I got the BASIC cart by using the limited BASIC on board to show my parents what it could do and convincing my parents that instead of the home budget cart I could program it in BASIC. And I did. ❤
Thanks for keeping this information not only alive, but digestible, for people like me who didn’t experience it first-hand. It’s incredibly fun reading about everyone’s experiences in the comments with the context about the historical events and the progress of technology from the video. Also, the animations and video editing are top notch ❤
Thanks! :)
What about Colossus in 1943 ?. One of the greatest achievement's overall in computing. And far ahead of anything else in the world at the time.
Most people especially in USA didn't hear anything about collossus till about 2002, although some of us born in UK with parents in military, knew of it years before the uk government in 1975 started to declassify it, although it was kept a secret as to it's purpose of use during ww2, I knew though cos me dad told me lol
Exceedingly well done. Easy for people to connect with. As a college instructor, I will try to get as many students as possible to view this.
Thank you so much!
Great video. Sandy Bridge was a defining moment for Intel - it's why so many of their later CPUs are so similar.
Indeed!
I'm a mechanical engineering student who has a few friends who know WAY too much about computers. I've been slowly trying to learn but this has been tremendously useful in that goal! You explain things very well to someone who barely knows anything about computers. Well done! And thank you
Thanks! :)
Thank you so much for this.
I used to be a software developer back in the 90's to early 00's and knew the hardware development to the dual core CPU's.
But after that, lost my interest in hardware side due to becoming a stay at home Dad.
So thanks for this. Really filled in a lot of gaps.
No worries! Glad you liked it :)
Nice documentary! Thank you. The next leap forward will be photonix. All good wishes.
Thank you! :)
I was getting worried that you were not mentioning ARM what so ever until about 80% of the way through. Given the world dominance of this architecture, not one to miss. The early Archimedes Computers from Acorn were a step change in performance over the Intel architecture at the time, but their lack of maths co-processor became a significant disadvantage during the media war as graphics has a high compute demand.
I felt the same way while watching. The ARM did have an FPU (FPA10) and was designed to support coprocessors, but indeed the standard Archimedes didn't ship with it, this meant software didn't really take advantage since it was likely FP instructions would be slow with the trap based floating point emulator.
Acorn did try to break into the UNIX workstation market which would have meant much more powerful ARM based computers in the early 90s had it been successful. Even then, Acorn chose to make the FPU optional even on their flagship £3,995 R260 (rebranded Archimedes A540), without an FPU you have to wonder what market Acorn was actually aiming for!
math
@@AR15andGODis not a word.
The bit where you talk about the IBM mainframes contains some errors. Mainly, the IBM 360 and 370 series used BJT transistors, not MOSFETS. Mainframes and mini-computers used either discrete BJT transistors or simple IC's like the 74xxx series, using BJT transistors exclusively.
MOSFETS came to their own with the microprocessors, like the 4004.
Well made documentary Archie. I've seen many 'history of the cpu' videos and yours is by far the most informative and thorough one. I enjoyed it alot. Thank you
Glad you enjoyed it! :)
Excellent video! Glad to see you mention the 4004 and the 8008. Beyond the home Microprocessor, there is the growing massive influx of Microprocessors into the automotive industry where multiple RISC devices simultaneously perform complex independent tasks, communicating with each other via evolving Canbus standards to a central cpu which has the infamous Obd port. This application has become a free-for-all in the industry for the sake of bling but rendering owners and auto mechanics constantly second-guessing what is going on. Would be great to see you make a video on this too, thank you!
Thanks! :)
This was excellent. Looking forward to more content from you.
I'm a CS/EE, 20 years in the industry.
Thank you! :)
I’m amazed at how well you built the story. Brilliant educative video about the history of computing.
Thanks!
Amazing series! I'd love to see more like this. Great work man, can't believe this only has this many views
Glad you enjoyed it! I am working on a similar video at the moment - keep an eye on the channel's community tab for updates :)
I still remember my Apple II computer in 1979. Programs were loaded on it with an audio cassette tape recorder. Later, I got a floppy drive and thought that was truly awesome.
I had Phenom II. It was a great budget CPU. I used it for some fairly hefty tasks, like chess analysis. Clock speed wasn't everything back then.
Bulldozer was a disaster and I didn't upgrade my CPU until Ryzen generation.
You say it was a disaster but I owned one of those cpus for 10 years and it still plays Triple AAA games to this day. Sure they ran hot and used a lot of electricity but that was all AMD CPU back then
@@golangismyjam Yeah, my brother uses my dad´s old fx8350 and it works fine for games like fortnite, minecraft or roblox. hell it can even run elden ring with a gtx970
@@manuelhurtado9970 The hexa and octa cores FX CPUs can still run modern titles well enough, they sure did age better than the i3/i5s from the same time period that started struggling when games used more than 4 threads.
@@ismaelsoto9507 yeah, true, the fx series had a faster clock and more cores, the only problem is that cores share some stuff like the FP scheduler
@@manuelhurtado9970 Yeah, it may had make it easier to develop/manufacture an octa core CPU without being too expensive (An FX 8150 has a die size of 315 mm² vs Intel's Xeon E5 2650 till 2690 all octa cores with a die size of 435 mm² on Intel's 32 nm that was denser than Global Foundries 32 nm process), but it crippled their IPC... AMD hoped software would caught up fast and fully utilize the 8 threads to make it a more compelling option than the competition, sadly it only happened when the hardware was already obsolete.
At the 1 hour mark, you note that the i7-920 didn't often show a performance gain.
Having been an early adopter of it, moving from the Q6600, I have to strongly disagree with that. The difference was obvious and noticeable the minute you did more than treat it as a "faster older PC".
The multi-tasking performance was astounding and the overall responsiveness was a step up. I ended up buying 2 of them in 2009 to replace both of my Core2Quads at my office, the difference was noticeable enough.
Truly excellent documentary. Well done. Charts the time line of my career from the 8008 to the present day.
If you do release an updated version of this video it would be good to add a mention the DEC Alpha risc machine. And also mention Colossus from Bletchley Park. There were various other microprocessors that you could make passing reference to along the way such as the LSI11 implementation of the PDP11 and the microprocessor versions of the VAX architecture. Also HP had some proprietary microprocessors that they incorporated into their own products.
Thank you very much! If I ever make an updated version I’ll take your considerations on board :)
ADHESIVE
I've been doing all of my dev and design work on an M1 Mac for about a year now and it's amazing
The production quality here is absolutely incredible and you've not even hit 3K subs yet. You deserve 1000x that, easily. I'm also seeing parallels to Ahoy in all the best ways. Thanks to the algorithm for bringing me here, and thank you for making excellent content; you've got a new sub!
Thank you! :)
Superb history summary! Thanks.
I just recently retired from 40 years in this industry, so I remember *_a lot_* of this.
What’s wild though, is that the parts of this history I do _not_ remember as well, are probably the parts that most do: I mostly worked in the embedded-compute arena, have been an Apple dude from the Apple ][ days, and am not much of a gamer, so the exact details of the Intel-AMD x86 wars were not something I followed in much detail.
Thanks! :)
Great doc, loved all the details throughout the years. You should be proud.... and damn, you deserve more views and subs for this.
Hahaha thank you! :)
In 1965 I was an EE student at Birmingham University. Dr Wright was one of our lecturers and his speciality was semiconductor devices - particularly heterojunction transistors. In a lecture he mentioned the latest thing - integrated circuits. I asked him if he thought it would be possible to get a complete computer on an integrated circuit. My recollection is that he told me not to ask silly questions. He obviously thought I was taking the piss.
This deserves millions of views. Well done.
And you are speeking perfect slow and easy to follow even for non native speakers, thanks!
Also, CP/M ran on 8080, it ran on Z80 because of compatibility.
Yes indeed. CP/M was written in 8080 Assembly Language so it was only ever going to use the 8080 opcodes that the Z80 also ran for compatibility sake. When I wrote CP/M utilities back in the 1980s, even though I wrote in Z80 Assembly Languange, I had to make sure to only use those 8080 instructions from within the Z80 set otherwise my software would not have run on a lot of machines!
This video is probably the absolute best, comprehensive story of the microprocessor. Just amazing.... KUDOS!
Thank you so much :)
Fantastic series, can't wait for the next one!
Very well done, I grew up programming on 6502, Z80, 68000 and the dreaded 386.
I have early memories of the Acorn’s ARM desktops finally the ARM has come of age
Thanks :)
Wrong again. Not "every machine" uses Vacuum Technology. Konrad Zuse created Z1-Z4 computers which were relay-based AFAIK.
Curious mark did an interesting visit to Japan to see a relay based computer that is still running. Iirc they were used commercially once. Worth a look if you are interested.
Great job! I was there at there back in the 80's and worked in computer retail through to 2008. It was a fantastic journey and your video brought the memories flooding back. Thank you!
Thanks! :)
Archie, this was an awesome review of the Home Computing history! Great production! Thank you for sharing!
Thank you so much! :)
Nitpicking: the ENIAC was not the first digital computer. That has been the Zuse Z3 in May 1941. It wasn’t fully electronical, though, as most of it was relay based electromagnetic parts.
Indeed, others have also pointed to the Zuse machines. The video starts with ENIAC as I didn’t want to go too far back, and as it was programmable, digital, electronic, and Turing complete, it seemed like a good place to begin.
1 1/2 years after the fact and this video finally got blessed by the algorithm gods.
Indeed - I now get more views in 6 hours than I did in the first 6 months of the video release!
7:33 : A note for anyone reading the comments- the "electric field controlled" transistor (field-effect transistors), in the form of the JFET, had actually been devised and patented _over a decade_ before the transistors otherwise described at this point in the video (the BJT, which was created before WW2) had been created, but they required too much precision to make at the time. In contrast, the then-common form of the BJT (one of the germanium versions, I think) could be created on a lab bench with a capacitor of the right capacitance, a battery of a specific voltage (to accurately charge the capacitor), and a holder for the BJT that was used both to make it and use it, with the final BJT literally looking like it's schematic symbol (the "bar" is the actual germanium). There were even "foxhole" transistors created with razor blades and clothes pins.
The BJT was invented by Bardeen, Brattain and Shockley at Bell Labs in 1947, so right after WW2, not before it.
The 6502 was not the king......if IC's had a religion the 6502 would be god
It was the 8bit cpu I enjoyed programming the most.
Likewise... good grounding for assembly programming.... I used to discuss with a Z80 programmer... he couldn't see how it possible to write programs with only 2 x 8 bit registers available!
@@TheElectricW Do you happen to have some source i can look at, that shows how the programming goes for such chips?
@@lazymassHave a peek at the TH-cam Channel of „ChibiAkuma“. Keith will teach you all the things to write Code for 6502 or Z80 or 68000 … based Computers. Among of this he published two books „Learn Multiplatform Assembly Programming with ChibiAkumas“ part 1 & 2… Highly recommended!
Indeed. The 6502 simply crushed the competition for affordable home computers.
This video reminds me so much of Ahoy, a channel that I absolutely adore. Everything was concise and easy to follow, alongside some cool synths.
Awesome comprehensive video. Kinda surprised you didn’t mention Intel “tick tock”, and AMD releasing the first 5GHz processor. Even if AMD did it vía a technicality.
I guess he can't cover every detail of everything in one hour.
Thanks! :)
Typically I'm hard to surprise where it comes to computing and the lot, but now I learnt about maybe dozens of "new" historical machines. Well done, indeed.
Thank you so much :) and thanks for the rest of your comments, it’s always good to hear someone else’s insight!
This was thoroughly enjoyable and a great trip down memory lane. Why didn’t you include the shift of including graphics processors on the cpu? Great video and very detailed.
Thank you! Ultimately in a video like this, you have to limit your scope somewhere, and since integrated graphics and other dedicated on-die accelerators are a fairly new concept (only really appearing in the last 10 years) they were left out.
Yorkshireman, Geoffrey William Arnold Dummer, MBE, C. Eng., IEE Premium Award, FIEEE, MIEE, USA Medal of Freedom with Bronze Palm was an English electronics engineer and consultant, who is credited as being the first person to popularise the concepts that ultimately led to the development of the integrated circuit, commonly called the microchip, in the late 1940s and early 1950s.
This is an exceptionally well done video! Fantastic capturing of the entire history from the origins to today. Great job!!!
Glad you enjoyed it!
He is wrong, proving that is how ignorance thrives. Try 1941 in the UK.
For old geeks like me, this was a pleasant journey down memory lane. You have put such a lot of research into this project and spent a long time with the editing. Your efforts are very much appreciated 😊
Glad you enjoyed it!
This is an excellent overview of how things have changed and for me (active in development from 1991 onward) shows just what a mess things really were, and how essentially things were held back. I wish there were more mention of how anti-competitive practices did much to create the duoply in the desktop sphere which existed until 2020 but this addition would have made the video at least twice as long :) Here in 2022, things still appear "open." I'm sitting here with an unused Linux box with an original FX water-cooled series AMD under my desk, and writing this comment on a 2021 MacBook Air which despite so little RAM outperforms everything I've ever personally owned while making no noise. Will we see more and more ARM, or, might something disruptive and interesting emerge? We'll see.
Thanks! :) Indeed, only time will tell.
A PC is silent if you use high-quality fans (i.e. Noctua), and control them properly.
as a kid i of course had no idea how new computer tech really was. i was born in 1988. i remember the mid 90s playing on a computer, and the games were on flopy disc's. they were all so ghetto, but back then its all i knew. then the late 90s my parents got a whole brand new computer and wow. msn, computer games, surfing the web. obviously its a lot different today, but you could still generally do the same stuff. talking to my friends, playing games, researching cool shit. and ever since its only elaborating. its bizzare that it all came around right as i grew up and was able to just fall into it. insane times. will be interesting to see how far it goes.
Sorry about this, but you did'nt mention the worlds first electronic programable computer, Colossus, developed from 1943-45.
Great documentary.
Thanks! Colossus wasn't technically Turing complete so this is why it is not mentioned :)
@@TechKnowledgeVideo Fair enough. :)
A hugely impressive, multi-decade summary. Well done.
Thank you so much for your kind words :)
This is a great video, well put together and researched. I lived through most of this (from the 60s forward) and it is nice to see it all condensed together in a timeline. I was hoping to see something about Intel buying DECs chip division and gaining knowledge from the Alpha processor design (fast speeds and high power dissipation) but understand that not everything can be included. Near the end of the 8086 era, the NEC V series chips also had an impact with consumers as well to increase performance. Congratulations on some excellent work.
Thank you! :)
The background music is awesome....the video is great 👌👍👍👍👍
Thanks!
To say that Intel Itanium was a RISC design is a bit of a stretch. Actually, back then it was the RISC crowd that said that Itanium's VLIW approach was doomed to failure. The main difference between VLIW (which Intel calles EPIC) and advanced superscalar RISC designs is that EPIC does not allow for out of order (OoO) and other dynamic execution of instructions. Instead all this burden is put on the compiler. In fact, if you'd ignore dependencies of instructions and thus the order of instructions produces a wrong result, Itanium will deliver a wrong result happily. Itanium does no data dependency checking at all, this has to be done by the compiler.
Removing all dynamic execution features presents a dilemma: The compiler, which has to be very very smart, is forced to find every bit of instruction level parallelism (ILP) during compilation. EPIC has no way of any sort of reordering or re-scheduling. If the compiler isn't able to find ILP there isn't any at all, instread a NOP is issued to the pipe, resulting in VLIW instruction bundles which load only 1 of 3 pipes with work, the other 2 just do NOPs. In that case you lose badly. This static scheduling is especially difficult with loads, since the memory hirarchy presents us with practically impossible to predict memory delays.
VLIW/EPIC works best with code like matrix multiplication, which is a very static w.r.t. input data. In such cases parallelism is basically given. An easy job for a compiler to parallelize code like this. But such code is rather rare in a typical desktop or server setting. Also such SIMD computations can be done nicely in vector units of non-VLIW processors, like SSE or AVX in the x86 world.
In short, VLIW/EPIC is an architecture that is targeted too much towards specific computational classes to be a general purpose CPU architecture. Also writing a compiler for EPIC which is able to extract every bit of ILP, was/is close to impossible. There were other problems specific to Intel's implementation, notably that executing legacy x86 code was painfully slow.
Very interesting stuff, thank you for your insight!
Baring any memory & similar delays (not particularly familiar with the design, so not sure how those would be handled), it shouldn't actually be _too_ difficult to get decent (not necessarily perfect) scheduling done.
In essence, you compile & link, but don't output a final executable, instead producing a SSA form or similar. You then take this, and throw it through a scheduler stage- the scheduler defaults to outputting a NOOP for _all_ of the executable code, but _looks for_ instructions that it can grab from the SSA form as _replacements_ for NOOP instructions, marking those SSA instructions as "finished" in the process. The matching would probably be best done by a module written in Prolog or something. Wouldn't be all that fast, but with a decently sized window should be fairly effective.
24:44 my first reaction would be to take a pic of it to save as reference, but you would have had to write it down by hand, wow!
The production quality on this is so darn high, how is this channel so small?? You've definitely earned a new subscriber, and I hope to see new content from you in the future!
Thank you so much! I am currently working on a new video, and will be posting updates on the community tab as it develops :)
There are 46 ads in this one video, so maybe that's why
Hi customsongmaker, I’m surprised you were served that many ads - I just rewatched it on another one of my channels and I got less than 1/4 of that so I’m unsure why you got so many.
I would add that the video has only been monitized for less than a week and I’ve been playing around with the ads to see what the best ratio is r.e watchability vs revenue. I have received a few comments suggesting that the ad frequency is a little high and I will be adjusting that accordingly when I’m back at my computer in a few days.
One final thing to say that as a small creator ad revenue is the only source of income (no patrons or TH-cam members), and looking to the future (I will be finishing my PhD next year so may not have a flexible schedule after) it will be difficult to continue making videos like this without revenue. I appreciate your comment and will take feedback on board - feel free to keep an eye on the community tab for more updates :)
@@TechKnowledgeVideo I counted the ad breaks on the progress bar. I didn't actually spend 20 minutes watching advertisements just for your video, I stopped watching very early. If there had only been 3 ads - 1 at the beginning, 1 in the middle, and 1 at the end - I would have watched 3 ads, which is 3 times as many ads as I actually watched.
Try breaking this video into 5-minute or 10-minute videos. "The Complete History of the Home Microprocessor: Part 1". Then you can see how many people will watch 5 minutes or 10 minutes, with 1 ad. If they like it, they will continue to watch the rest. They will also subscribe, since it makes them think of you as someone with many good videos that they don't want to miss.
The ad breaks on the progress bar do not correspond to ads you will actually see - TH-cam will only play about 1/5 of the adverts it displays on the progress bar - which for this video works out at about every 9 minutes.
If you look at the channel I have done what you have said - split this video into 5 separate videos, which viewers will typically see 1 mid roll or less per video (with average video lengths of around 20 minutes). However, this combined video has been far more popular than the individual parts. As to why this is I’m not sure, but the algorithm far prefers pushing this video out than the others.
I would add that video retention has stayed the same in the week the video has been monitized compared to the prior week - people on the TH-cam partnered subreddit have done loads of testing on a whole range of videos and against logic it really genuinely doesn’t affect watch time. However, having watched the video back myself for the first time, I do think the quality of the video is degraded with the current frequency of adverts and I really want people to have a great viewing experience. Hence I will reduce the number of ads after the weekend.
If you do want to watch the video without ads feel free to use an ad blocker or watch the individual parts which are a lot lighter ad wise :)
This is fantastic! Quite the journey that aligns with my time on the earth. Thank you!
You're very welcome :)
Fantastic! Needs more views! What a shame!
Thank you!
Bravo! What a great video. Very interesting and held my attention through to the very end. I pretty much have lived through this whole transition in the home computer world. This was spot on and didn't really miss a thing. Well done!
Thanks!
Really great content. Well put together!
Back in 2004 I bought a super-compact Staples branded laptop with a processor which was a fairly unusual fork and I think wasn't mentioned here. It has a 1GHz Nehemiah C3 processor by Via, designed for low consumption and cooling requirements. It was a "netbook" PC before the name had been coined, and served me well for many years.
Yeah, Via processors generally fall into a "forgot to mention they also ran" category.
I took my undergraduate degree at the University of Pennsylvania, where I did my introductory computer science courses in the Moore School of Engineering building where Eniac had operated. At that time units of the old machine were lying around in open storage on the first floor and when I would walk to class I'd go right past them and be amazed at how large and clumsy they seemed. Two years later I started seeing ads for the Timex ZX81, the American version of the Sinclair machine, appear in magazines like Scientific American. The juxtaposition of those two computers got one thinking about how far the revolution had come, as well as how far it could go.
This is stellar work sir. Yes it gives off Ahoy vibes but your style shows through. Please make more stuff!
Thank you so much! :)
Agreed. VERY informative and interesting - for those of us with an actual (functioning) brain......
This video is the TRUE definition of "epic". Very well done, thank you!
Glad you enjoyed it!
A rather belated "very good".
Such an excellent video with so few views! I must go see what else you've done and maybe add to views.
Glad you enjoyed it! I am in the process of creating the next long form video :)
A great long form documentary, if a bit anglo-centric, but It contained a few errors.
As one of the software developers in the late seventies and eighties, I am well aware of the origin of MS-DOS 1. Microsoft did not buy it, it obtained a copy of CP/M-86 with another name and a different disk file structure. In other words, MS-DOS 1 was essentially pirated software. The author of CP/M-86 was a friend who told me the entire sordid story and why he didn't sue -- "Bill (Gates) was my former student." And that was how the greatest fortune in history up to that time was created.
Really awesome video!
Regarding the future of ARM on desktop, I think that this will come when some solution to efficiently emulate x86 gets developed. Apple was able to make the jump because of the restricted and curated nature of their ecosystem. But the "Windows world" is so dependent on legacy software (which is also endorsed by Microsoft with their insistence on stable APIs and backward compatibility) in both home and business that I think that this is absolutely crucial. Apple has Rosetta for this purpose, not sure how efficient it is to be honest.
Apple can efficiently emulate Windows 11, and its comparability layer to run even Windows 95 era software on ARM, it already exists... Microsoft is already transitioning and Windows 11 has a hardware comparability layer much like Apple's Rosetta which runs X86/64 code at near native speeds. I have it installed in a VM on my Mac. The only complaint regarding ARM based computers is where the GPU power is going to come from.
Due to its current architecture Apple is GPU locked to its Silicon processors that have no ability to access off chip GPUs.
Microsoft/Intel will have to come up with an alternative chip solution but that's only IF they want to keep the GPU industry alive...
The days of the GPU or even Front Side Bus are severely limited anyway, eventually as Apple is proving with Silicon it can do at least mid level GPU performance on the same chip as the CPU and run most games well at, at least 1080p with playable frame rates (in terms of the games available).
Intel is working on a similar chip to the Apple M1, M2, and M3 but it seems they are further away than ever from achieving it, and if they do so, it won''t be x86 based either...
Which leaves AMD in a better position, especially as they acquired ATI, and then the dark horse which is Nvidia which is also producing PC like performance with its Sheild based devices.
This all leaves Intel in the weakest long term position currently in terms of its long term roadmap with hot, bloated, and slow processors (until you are draining about 500watt/hours which no one really wants to do anymore) and unable to dye shrink due. to either losing performance, or the current heat issues, where at about 100degrees without water cooling you can basically cook an egg on top of the CPU coolers of most Intel based workstations/desktops.
Then in the mobile segment, struggling to push battery life up to 4hours (vs 16 on a Mac) while basically using desktop components in what can only be classified under the old term of "sub-desktop" computers. Not withstanding the horrible quality of cases, keyboards, and track pads by comparison to Unibody Macs (which in design principle are now 15 years old). That's a bother.
You get a better user experience in terms of tactile functionality and ergonomics from a 2008 Unibody Mac than any current Intel or AMD laptop on the market which is why they dominate the industry in that market space still to this date where if you don't need a stupid API like .NET frameworks, or Direct X every man and his dog is using a MacBook Pro as the user experience is miles ahead.
I hate to add "old-guy" perspectives, but I lived through the transition from tubes to transistors. Modern kids view that transition as a short misdirection in the march toward the integrated circuit. But it was a huge advancement that has nothing comparable in modern times.
Im in the same boat!
But it does give us the ‘under the hood’ knowledge that lets us make systems do things better faster with the same hardware!
I still write code that doesn’t waste any more resource
than necessary!
Very interesting video, I watched the whole 1:26h.
I didn't know that actually the first microprocessor was done for F14. Since it is de-classified now could you point to a video with more details?
Thank you! Unfortunately I cannot post links here, however I can recommend a great article by WIRED entitled “The Secret History of the First Microprocessor, the F-14, and Me” which goes into more detail :)
The Encarta cameo brought back some flash back memories. What a time to be alive. I am happy to see what the world looked like before mainstream Personal Computers. I think computers have really revolutionized humanity in many different ways, for better and for worse.
wow! how has this got so few views!/?
A nice and relatively detailed summary off the home processor up until the discussion of the ARM processor. The ending leaves the taste that ARM will rule the world and yet there is no mention on RISC-V. In my opinion, RISC-V will be the end-game for uP as even royalties will eat into profit. Still, much of the video is relatively accurate and informative; thank you.
Really cool video, great job!
And nice to see how your prediction about the future turned out to be true with M1 processors being the beasts they are.
Thank you! :)
The M1 has already been surpassed by the latest offerings from Intel and AMD...
That's how the CPU game has always worked - every time a "new" CPU comes along and takes the "King of the Hill" spot, the designers from every competitor simply get back to work, continuing the never-ending cycle which began decades ago.
@@looneyburgmusic Are you on acid? Also, we're on the M2 already.
@@mojoblues66 the M2 has also already been surpassed.
@@mojoblues66 They sure are fast but that's mostly because of the software ecosystem that Apple developed since they started designing their ARM SOCs, considering that is easy to see how well an ARM SOC can perform when the software is well optimized for it, for now it isn't really the fastest compared to the highest end x86 CPUs but it's overwhelmingly more efficient.
Still even considering all of that technology prowess Apple isn't very consumer friendly with it's lack of upgradability, hard repairability and high prices. I don't see that changing until other major vendors start offering good competition with a good software ecosystem (Windows on ARM still lacking on that regard), that's sincerely much harder when you don't design both "in house".
Hear, hear! This is one of the best summary’s I saw about micro computers, it’s history, the differences, from 1947 up till 2022
Thanks! :)
WHAT.A.DOCUMENTARY.
⭐⭐⭐⭐⭐
💚💚💚💚💚
👏👏👏👏👏
Well done, i really enjoyed even as one that has followed the microprocessor since the 6502 was popular. I still learnt boatloads of details id never normally research about
Thanks :)