The Complete History of the Home Microprocessor

  • Published on 24 Dec 2024

Comments • 1.6K

  • @TechKnowledgeVideo
    @TechKnowledgeVideo  4 years ago +150

    Hi all! Thanks for watching the video :) If you're feeling generous and would like to support my work, you can do so via Patreon (link in description) or using the 'Thanks' button underneath the video :) and if you're interested, check out the trailer for the next retro computing documentary on my channel!
    This project took 6 months to complete and was huge fun to make! If you enjoyed the video(s) then don't forget to subscribe, like, and share the video on social media! It really does make a difference when trying to grow a small channel.
    Thanks again everyone :)
    -Archie

    • @jell_pl
      @jell_pl 2 years ago +4

      If you plan to get back to this topic and make an errata, you should correct the info about the first computer. Despite American propaganda, ENIAC was not even third (there were two versions of the Colossus, e.g. en.wikipedia.org/wiki/Colossus_computer, and several machines from Konrad Zuse, e.g. en.wikipedia.org/wiki/Z3_(computer) ).

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +11

      This is technically true, but it ultimately comes down to your definition of “computer” - neither Colossus nor the Z3 was Turing complete, so I ruled them out

    • @masternobody1896
      @masternobody1896 2 years ago +4

      @@TechKnowledgeVideo Can you also make a history of the GPU?

    • @FlockOfHawks
      @FlockOfHawks 2 years ago +2

      You're welcome & I like what I've seen so far 👍

    • @acmefixer1
      @acmefixer1 2 years ago +1

      It's a great video, very comprehensive.
      But the first thing I noticed was its exceedingly long running time. This was why I almost didn't watch it. It should have been divided into at least 3 parts, each no more than 29 minutes long. Thanks!

  • @setdetnet5001
    @setdetnet5001 1 year ago +85

    I'm an ASIC designer; I worked on the Motorola MC68000 design. What your video fails to mention, and is worthy of mention, is the ever-constant fight between hardware and software. In the '60s, '70s, and '80s, software developers needed to develop code within CPU (and memory) constraints. Then we saw software drive hardware - that is to say, if you wanted to play the latest games you needed to spend megabucks on the latest PC hardware. Then came a switch back around 2000, with chips far superior and software not truly making full use of multicore threading. And now we see CPU evolution limited by foundries. It's now that we will see software start to drive innovation in CPUs.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  1 year ago +14

      That’s a very interesting point! Thanks for sharing :)

    • @therealcfiddy592
      @therealcfiddy592 1 year ago +1

      Okay

    • @charliefoxtrot5001
      @charliefoxtrot5001 1 year ago +3

      Software is still driving the hardware design. Just look at the development of GPGPUs over the past 20 years or the specialized processors in mobile devices. With the end of Dennard scaling and Moore's law, what we do with the limited transistors we have on a chip will become more and more important.

    • @therealcfiddy592
      @therealcfiddy592 1 year ago +1

      @@charliefoxtrot5001 thanks mike

    • @Raderade1-pt3om
      @Raderade1-pt3om 10 months ago

      Software still drives hardware price and performance in the budget segments

  • @thehookupiowa
    @thehookupiowa 2 years ago +121

    This brought back some vivid memories. I was 7 years old in 1979 when our elementary school library got its first PCs: a pair of Apple IIs with the green monochrome displays. I joined an after-school class teaching BASIC 2.0 programming, the built-in programming environment that was part of the Apple II ROM. I recall settling on a Death Star related project for my program, as any sane 7 year old would have. I asked the teacher "How do you make a circle?" and his eyes lit up. He was being asked to explain Pi to a 7 year old and he was delighted. (A sketch of that circle arithmetic follows this thread.)

    • @strictnonconformist7369
      @strictnonconformist7369 2 years ago +6

      The Apple II series never had a BASIC version 2; there were Integer BASIC (written by Steve Wozniak, with no floating point support) and AppleSoft BASIC, written by Microsoft, which did have floating point support built in.
      I’d get a big smirk on my face to have a 7 year-old asking such questions, because that’s a huge recursive rabbit hole taking a kid that age far deeper than most kids several years older ever go in their lives.

    • @tarstarkusz
      @tarstarkusz 1 year ago +5

      I tried rotating a multi-segmented circle (AKA the Star Castle arcade game) in Commodore BASIC 2.0 when I was maybe 12, in 1982. Can you say slow!

    • @RetroDawn
      @RetroDawn 1 year ago +6

      @@strictnonconformist7369 I knew that as well, but assumed they just meant AppleSoft BASIC, since it was the 2nd BASIC for Apple II computers; and they said 1979, which was the year AppleSoft BASIC was released, along with the Apple II Plus, which had AppleSoft built in.
      They possibly got the "2.0" in their memory from the widely popular Commodore 64, which called its BASIC "V2" at the top of the screen when you turned it on with no cartridge inserted.

    • @strictnonconformist7369
      @strictnonconformist7369 1 year ago +1

      @@RetroDawn an interesting possible memory failure explanation, I can agree. I didn’t have enough access to Commodore 64s to have that burned into my brain.

    • @YourCapyFrenBigly_3DPipes1999
      @YourCapyFrenBigly_3DPipes1999 1 year ago

      I got to learn computers too, on the Apple IIe with a green-only display, 1985-87. Fun memories! We'd never seen a home computer before; we were endlessly fascinated. One unit even had a color display and we would always fight over who got to use it. Many ancient games of Oregon Trail were played on those machines and others like them. Later, my 4th and 5th grade class got its OWN computer, which felt extremely luxurious, and we discovered the wonders of Carmen Sandiego. 80s/90s, great times for kids, lulz.
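
    A minimal sketch of the circle arithmetic this thread is reminiscing about, in Python rather than Applesoft BASIC (the 280x192 screen size is just the Apple II hi-res mode; the function name is made up for illustration):

    ```python
    import math

    def circle_points(cx, cy, r, steps=64):
        """Approximate a circle as `steps` points on its circumference.

        Pi appears because a full turn is 2*pi radians; each point sits
        at x = cx + r*cos(theta), y = cy + r*sin(theta).
        """
        pts = []
        for i in range(steps):
            theta = 2 * math.pi * i / steps
            pts.append((round(cx + r * math.cos(theta)),
                        round(cy + r * math.sin(theta))))
        return pts

    # e.g. a Death-Star-sized circle centred on a 280x192 hi-res screen
    for x, y in circle_points(140, 96, 80):
        print(x, y)  # a real program would plot these pixels
    ```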

  • @ZnakeTech
    @ZnakeTech 3 years ago +276

    How this only has 1434 views and this channel only has 711 subscribers at the time of writing is beyond me. Very reminiscent of RetroAhoy, and I mean that in the best possible way. Keep making content like this, and look into optimizing for the YouTube algorithm. The most obvious thing you might be missing is a decent video description - you are not giving YouTube anything to work with there. Stick a synopsis of the video in there to hit a lot more of the juicy keywords. This video should be sitting at at least 100,000 views or even way more by now, in my opinion.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  3 years ago +24

      Thank you!

    • @scottlarson1548
      @scottlarson1548 2 years ago +20

      This felt like a 90 minute video with 60 minutes of content. It goes slowly with lots of needless five second pauses that I guess were supposed to make the content seem more dramatic.

    • @FlockOfHawks
      @FlockOfHawks 2 years ago +3

      Commercial interruptions every ~6 minutes - I may continue viewing beyond the 30-minute mark in a better mood, because the content is OK

    • @ReneKnuvers74rk
      @ReneKnuvers74rk 2 years ago +5

      Should be way shorter, 7-9 minute chunks. And no long waits with a black screen. A catchier title would also help. ‘Home Microprocessor’ doesn’t really describe the content, in my opinion.

    • @nickryan3417
      @nickryan3417 2 years ago +7

      Agreed. Way too long in one chunk. Also some of the early content was wrong, for example Random Access Memory is not "storing data in the same place as code", it's being able to access any element of data without having to first read all the earlier data. Get elementary things like this wrong and combine it with a far too long video and numbers will drop.
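
    To make the distinction @nickryan3417 draws concrete, a small hypothetical sketch (not from the video): sequential access has to pass over every earlier element, tape-style, while random access reaches any element directly by address:

    ```python
    def read_sequential(tape, k):
        """Tape-style access: the only way to element k is past 0..k-1."""
        it = iter(tape)
        for _ in range(k):
            next(it)          # wind past every earlier element
        return next(it)

    def read_random(memory, k):
        """RAM-style access: one address computation, independent of k."""
        return memory[k]

    data = list(range(1000))
    assert read_sequential(data, 500) == read_random(data, 500) == 500
    ```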

  • @denniswofford
    @denniswofford 2 years ago +257

    This is a great long form documentary on the history of CPU development. Very interesting and fun to watch, especially for a guy who is old enough to have seen it all (and work with most of it) as it played out. Thanks Archie! Well done!

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +15

      Thank you for your kind words Dennis, they mean a lot :)

    • @dovahkiin159
      @dovahkiin159 1 year ago +13

      Same here. When I was doing my doctorate in engineering at the U. of Waterloo, I built my own Apple II+ clone (soldered the whole thing) and used it to do all my mathematical modelling of reactor fluid dynamics and heat transfer using MBASIC (LOL) and to write my thesis. The PC had a CP/M card and a 16k expansion card (Oooooo). The mathematics were so difficult to solve numerically that I had to develop a new mathematical method, and it took weeks to get a single solution. Now, with my current PC, an X670E MB and Ryzen 9 7950X CPU overclocked to 6.8 GHz, it takes a few hours.

    • @jp34604
      @jp34604 1 year ago +2

      @@dovahkiin159 How do you cool 6.8 GHz of clock speed, chilled water?

    • @dovahkiin159
      @dovahkiin159 1 year ago

      @@jp34604 At the moment I am using the Dark Rock Pro 4. This cooler barely keeps the CPU at or below its maximum continuous operating temperature of 95 C at that clock speed, which I should clarify was achieved on a single core only. I use the AMD Ryzen Master software to optimize the overclocking. I plan on switching to a water cooler. I did not get one originally because the water cooler required for this CPU would not quite fit into my case (Corsair Obsidian 800D FT). I can, however, make some minor mods to the audio/USB and optical drive bays to make one fit.

    • @garymartin9777
      @garymartin9777 1 year ago +4

      Yeah, me too. My first computer was a single-card machine with a TMS 9900 - a neat chip for homebrew. I had to solder all the chips and sockets myself. It launched my career in microprocessors and hardware development.

  • @stevetodd7383
    @stevetodd7383 2 years ago +112

    Itanium wasn’t a RISC design; it was what’s known as a VLIW (very long instruction word) processor. It also relied on the compiler to optimise for concurrent operations (so, for example, while an arithmetic operation was in progress the compiler was expected to work out other steps that the CPU could handle that didn’t depend on the result), but compiler technology wasn’t really up to the task.

    • @absalomdraconis
      @absalomdraconis 2 years ago +7

      Which is sorta sad (not that I shed any tears for Itanium), because an iterative application of a logic language (like Prolog) probably would have been able to very cleanly encode the instruction sequencing.

    • @andrejszasz2816
      @andrejszasz2816 2 years ago +7

      That said, the Transmeta Crusoe was also a VLIW processor aimed at the mobile market, with a compiler that translated x86 machine code on the fly. BTW, I remember the press release classifying VLIW as a RISC-like architecture.

    • @glenwaldrop8166
      @glenwaldrop8166 1 year ago +7

      @@andrejszasz2816 You are correct. I remember several articles about Itanium describing it as "Intel just doesn't want to call it RISC".

    • @RetroDawn
      @RetroDawn 1 year ago +7

      @@glenwaldrop8166 I can believe it. The computer press has a lot of folks who don't know the lower-level details of the technology. VLIW is definitely distinct from RISC, even if it builds off of RISC.

    • @MultiPetercool
      @MultiPetercool 1 year ago +1

      One of the early design goals for Itanium was for it to run HP's PA-RISC binaries. Hence the very long instruction word. The reason Itanium failed was that they broke Intel compatibility. Having seen Digital's Alpha technology stumble, software vendors like Oracle, PeopleSoft, JD Edwards, SAP and countless others were not willing to invest in a new platform.

  • @arn3696
    @arn3696 2 years ago +10

    I can't believe I just watched a feature-length video about microchips... but you know what - I enjoyed every second of it!

  • @electronash
    @electronash 2 years ago +32

    This is great. Well done.
    It must have taken a very long time to narrate and animate.
    This is one of the best summaries of the microprocessor boom of the 70s, 80s, 90s, and beyond.

  • @sundhaug92
    @sundhaug92 10 months ago +1

    1:09:27 Pedantic correction: Ryzen is the consumer-market brand name; Zen is the architecture, which is also in Threadripper and Epyc

  • @PaulSpades
    @PaulSpades 3 years ago +18

    This is spectacularly comprehensive and relevant. I'm blown away.
    I only have one small quibble: during the 80s and 90s the presentation focuses on low-cost solutions, while the 2000s focus on high-end x86. This leaves out MIPS- and ARM-powered tablet computers, and SBCs like the Raspberry Pi. And they are relevant, especially ARM-powered SBCs.
    A new cycle was attempted to drive down cost with the netbook and tablet craze, but the software wasn't there yet; there just wasn't enough incentive to push Android as a new universal OS for home computers, and it wasn't suited to replace Wintel. The Raspberry Pi, and the Linux distros ported to it, is the new platform.

  • @FreihEitner
    @FreihEitner 11 months ago +1

    This was 3 years ago (as of me typing this), so they missed out on Apple's "M" architecture. Their M1, M2 and latest M3 are ARM-based, highly customized to their purpose, and giving Intel's and AMD's highest-end processors a serious thrashing in media/content creation tasks while using far less power (laptops outpacing full-size desktops complete with full-size desktop cooling solutions). Me, I'm still an x86 user and modern PC chips are plenty fast for my needs, but it's interesting to see where companies are continuing to find space to improve performance and capabilities.

  • @raggersragnarsson6255
    @raggersragnarsson6255 1 year ago +10

    This is a truly great exploration and documentary of the history of computing. As a child of the 70s I was already aware of a lot of the new technologies that emerged around that time: home gaming with PONG, VHS machines, and dedicated handheld single-game machines. I was aware of the huge cost of the PC around 1980. I played games in arcades in my preteen years until I received a Spectrum 48K and everything changed; to this day I'm a tech head. I'm watching this now and I have learnt even more from it. Thank you.

  • @CattleRustlerOCN
    @CattleRustlerOCN 9 months ago +2

    And we all know what's happened in the 3+ years since this video was released. x86 is still the architecture of current desktop PCs, and AMD with Ryzen and Threadripper smack Intel around, and compete very closely with Nvidia in the GPU market, even beating them in some pure raster situations, but are behind when it comes to ray tracing. This technological journey that I have been able to watch and be a part of all these years is fascinating. I'm 54, so I have witnessed a lot, especially the juiciest parts starting in the early early 80s.
    Thanks for the video.

  • @EliteGeeks
    @EliteGeeks 1 year ago +4

    Oh man, this brought back memories... well done...

  • @nevrunderstandlada
    @nevrunderstandlada 1 year ago +3

    WOW, really good documentary. It's clear, simple to understand, and complete. Thanks

  • @michaelhawthorne8696
    @michaelhawthorne8696 1 year ago +20

    I was 16 in 1980 and lived through the development of the home micro and then the PC.
    I can relate to this video with great fondness, having had the ZX81, the Electron and its add-ons, the BBC Micro, and the Atari STe, then buying my first PC in '97 (IBM Aptiva) and then building my own... It's been fun watching this video bringing back great memories.
    Thanks for your hard work Archie...👌

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  1 year ago +2

      Thank you so much! Glad it brought back many happy memories :)

    • @franciscorompana2985
      @franciscorompana2985 1 year ago

      I had the first Pentium in Portugal. 1994 😂
      I got a Matrox 4MB from Canada. 😂
      The IC had the famous bug from the factory, so I exchanged it in the US for a clean Pentium. 😂

  • @fretworka3596
    @fretworka3596 11 months ago +1

    Despite quite a few inaccuracies with some earlier market products, trends, and market share implications, it was a useful overview of the history of the microprocessor. The post-1990 analysis was more accurate.
    Nice for me to remember some of the kit I worked with. I wrote a traffic light controller in Z80 assembler. I'd forgotten that!

  • @dj_paultuk7052
    @dj_paultuk7052 1 year ago +3

    What about Colossus in 1943? One of the greatest achievements in computing overall, and far ahead of anything else in the world at the time.

    • @Stef-2U
      @Stef-2U 11 months ago

      Most people, especially in the USA, didn't hear anything about Colossus till about 2002. Some of us born in the UK with parents in the military knew of it years before the UK government started to declassify it in 1975, although its wartime purpose was kept secret. I knew though, cos me dad told me lol

  • @ai_is_a_great_place
    @ai_is_a_great_place 1 year ago +1

    24:44 My first reaction would be to take a pic of it to save as a reference, but you would have had to write it down by hand, wow!

  • @Bduh2
    @Bduh2 1 year ago +6

    Fantastic video! As I was watching it, memories came back of all the computers I've had during my lifetime: from the Sinclair, the Commodore, and the first IBM with DOS to the servers and PCs I'm still building to this day for customers.

  • @pipschannel1222
    @pipschannel1222 2 years ago +20

    Great content! Love it!
    Did you know IBM wasn't the company that introduced Intel's 386 architecture? It was Compaq that beat Big Blue by 7 months with their very expensive high-end Deskpro 386, released in September 1986, vs the IBM PS/2 Model 80, released in April 1987, which used the same 80386DX-16. I think Compaq deserves to be mentioned in documentaries like these, as it shaped computing history, or at least had a vast influence on its development: the company played a key role in creating open standards which hugely benefited the PC (clone) industry, being the quintessential PC clone manufacturer.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +4

      Thanks! Interesting stuff :)

    • @awuma
      @awuma 1 year ago +1

      So glad that somebody today recognises just how significant the Compaq Deskpro 386 was - in my opinion just below the original 8088-based PC itself. It, and not the IBM PS/2, established the industry standard architecture, using the PC-AT's open 16-bit bus for peripherals.
      The greatest missed opportunity was Sun Microsystems not basing their 386i workstation on the ISA; had they made their SunOS (later Solaris) run on the ISA, they would have blown both Microsoft and IBM out of the PC business, and from 1988 we would all be using full Unix and not the MS-DOS and Windows operating systems, which did not make anywhere near full use of the 386 architecture's power until the 2000s, when Windows became NT. Linux appeared in 1993, and by 1995 had the full power of SunOS/Solaris, but on the standard x86 architecture. Sun gave up on the 386i in 1989. (I replaced my Sun 386i with a Pentium-based PC running Slackware Linux in 1995.)

  • @alpaykasal2902
    @alpaykasal2902 2 years ago +4

    My heart went all a-flutter at 28:00 when the Amiga was shown. Great video, fast pace but so thorough!

  • @joshjones3408
    @joshjones3408 1 year ago +1

    At 35:37, the music that is playing has to be the best remake of it. The last time I heard that was in the movie The Last of the Mohicans (probably not spelled right)... but great stuff 👍👍👍👍👍👍👍

  • @tonylewis4661
    @tonylewis4661 2 years ago +3

    And let's not forget the first 16-bit home computer (severely crippled by an 8-bit bus and closed software and hardware development by TI): the ill-fated 99/4A with the TMS9900 (but it did sell close to 3 million units, virtually all at a loss).

    • @RickHansbury
      @RickHansbury 11 months ago

      I got one for Christmas that year and I loved it. I had the BASIC cart and assembly language, and a bunch of games.
      I got the BASIC cart by using the limited BASIC on board to show my parents what it could do, convincing them that instead of the home budget cart I could program it in BASIC. And I did. ❤

  • @TheEvertw
    @TheEvertw 1 year ago +1

    The bit where you talk about the IBM mainframes contains some errors. Mainly, the IBM 360 and 370 series used BJT transistors, not MOSFETs. Mainframes and minicomputers used either discrete BJT transistors or simple ICs like the 74xx series, using BJT transistors exclusively.
    MOSFETs came into their own with the microprocessors, like the 4004.

  • @pssthpok
    @pssthpok 2 years ago +19

    Nice history! My first computer in high school was a single Commodore PET for the entire school; when I went to university I saved my pennies to buy my very own Sinclair ZX81. What a beast.
    I recall the Pentium 4 years, and Intel's strange relationship with Rambus memory, with all the technical and legal issues that came with it.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +4

      Thank you! Very interesting to hear about your computing journey :)

    • @picklerix6162
      @picklerix6162 1 year ago +4

      We used to call it Rambust memory because it was a disaster for PC companies that decided to use Intel chipsets. Not even Intel’s own server division would use Rambus.

    • @pssthpok
      @pssthpok 1 year ago +1

      @@picklerix6162 I used to call it RamButt due to all the difficulties.

  • @sonicboomish
    @sonicboomish 1 year ago +9

    This video was incredible. Thanks a lot for putting all the time and effort into this! Really clear and well put together.

  • @kob8634
    @kob8634 9 months ago +1

    Thank you for this. I'm 63. For half of my adult life I kept this documentary in my head but my brain clicked off when we stopped calling them Pentium. At least now I know how to intelligently shop for a computer again. Your level of research is impressive. V well done.

  • @iVTECInside
    @iVTECInside 1 year ago +9

    Good watch. One thing not mentioned was the fact that the desktop market is somewhat limited by the x86 core architecture: the same instructions from 1980 will function on an i7-12900K. ARM never had that hanging around its ankles. It will be very interesting to see how things develop from here.

    • @Mr_Meowingtons
      @Mr_Meowingtons 1 year ago

      Are you saying the ARM made in 1985 has nothing to do with today's ARM?

    • @AndyGraceMedia
      @AndyGraceMedia 1 year ago

      @@Mr_Meowingtons It doesn't really, no. I coded for the ARM1 dev box that plugged into a BBC Master back in 1986, and for the ARM2 in the Arc 300/400 series and the A3000 BBC Micro. ARM2 was a chipset with the main CPU, video, memory and I/O controllers. ARM3 improved the CPU with 4k of cache while the other chips were upgraded separately. Quite cool really.
      Those originals were wildly different from today and are not instruction compatible with even the true 32-bit ARM processors, as they used a 24/26-bit bus with the rest of the bits used for passing around the equivalent of a status register and a four-level interrupt (see the R15 sketch after this thread).
      After ARM3 came ARM6 and then the ARM600 series, which were all true 32-bit addressing processors. There was also a cut-down 16-bit ARM Thumb architecture. DEC (and even Intel, which bought DEC's chip business) released the StrongARM, which powered some of the early Windows CE handheld devices like the Compaq iPAQ and the HP Jornada.

    • @RickHansbury
      @RickHansbury 11 months ago

      I agree about Microsoft keeping their backwards compatibility, but it is necessary. A shocking number of the country's businesses use old computer architecture with new interfaces grafted on.
      Losing backwards compatibility would (they say) cause a financial disaster.
      I think the softies should keep Windows 11 and add a new Windows system with limited, specific compatibility and complete control of API-level architecture for security. They could rewrite the whole architecture to eliminate even the possibility of exploits.
      So essentially two Windows. I already have a nice new laptop, secured and firewalled to the point of uselessness, and an older one used for games and social media only.
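
    A sketch of the 24/26-bit detail @AndyGraceMedia mentions above: on the early ARMs, R15 packed the program counter and the status flags into one 32-bit word. The field positions below follow the original ARM 26-bit architecture; the decoder itself is only illustrative:

    ```python
    def decode_r15(r15):
        """Split an ARM2-style R15 into PC, flags, and mode.

        In the 26-bit architecture the PC occupies bits 2-25 (instructions
        are word-aligned), the NZCV condition flags sit in bits 28-31, the
        IRQ/FIQ disable bits in 27-26, and the processor mode in bits 0-1.
        """
        return {
            "pc":      r15 & 0x03FFFFFC,   # bits 2-25, low two bits zero
            "n":       (r15 >> 31) & 1,
            "z":       (r15 >> 30) & 1,
            "c":       (r15 >> 29) & 1,
            "v":       (r15 >> 28) & 1,
            "irq_off": (r15 >> 27) & 1,
            "fiq_off": (r15 >> 26) & 1,
            "mode":    r15 & 0b11,         # 0 = user, 3 = supervisor
        }

    print(decode_r15(0xC0008004))  # N and Z set, PC = 0x8004, user mode
    ```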

  • @antonnym214
    @antonnym214 1 year ago +2

    Nice documentary! Thank you. The next leap forward will be photonics. All good wishes.

  • @--fishiiki-
    @--fishiiki- 2 years ago +8

    Amazing series! I'd love to see more like this. Great work man, can't believe this only has this many views

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +2

      Glad you enjoyed it! I am working on a similar video at the moment - keep an eye on the channel's community tab for updates :)

  • @lepatenteux592
    @lepatenteux592 1 year ago +2

    How can this video have so few views??
    This is BBC documentary level material!

  • @thorstenglaubitz1006
    @thorstenglaubitz1006 1 year ago +4

    Well-made documentary, Archie. I've seen many 'history of the CPU' videos and yours is by far the most informative and thorough one. I enjoyed it a lot. Thank you

  •  1 year ago +1

    Nitpicking: ENIAC was not the first digital computer. That was the Zuse Z3 in May 1941. It wasn’t fully electronic, though, as most of it was built from relay-based electromagnetic parts.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  1 year ago

      Indeed, others have also pointed to the Zuse machines. The video starts with ENIAC as I didn’t want to go too far back, and as it was programmable, digital, electronic, and Turing complete, it seemed like a good place to begin.

  • @samposyreeni
    @samposyreeni 1 year ago +19

    You should talk about the DEC Alpha as well, in the RISC section, because that was totally insane on the RISC front.
    Also Intel's 860 and 960: wildly different architectures, of which the 960 survived rather far, even into printers and space applications.

    • @Chordonblue
      @Chordonblue 1 year ago +4

      Since Faggin was mentioned, I think it's time for Jim Keller to get a mention also. He worked on the DEC Alpha, Athlon, and Zen, and for Apple, Tesla, and many others. Now he's at Tenstorrent; AI may be the answer to the next big discovery in just about everything.

    • @JadedArsenic
      @JadedArsenic 1 year ago +2

      And he totally skipped over DEC's foray into desktop computing in 1978 - the DECstation!

    • @BasVossen
      @BasVossen 1 year ago +3

      The DEC Alpha really was the first 64-bit architecture. Too bad the managers were too nice for the cut-throat IT business.

  • @scottadler
    @scottadler 1 year ago +1

    A great long-form documentary, if a bit Anglo-centric, but it contained a few errors.
    As one of the software developers in the late seventies and eighties, I am well aware of the origin of MS-DOS 1. Microsoft did not buy it; it obtained a copy of CP/M-86 with another name and a different disk file structure. In other words, MS-DOS 1 was essentially pirated software. The author of CP/M-86 was a friend who told me the entire sordid story and why he didn't sue -- "Bill (Gates) was my former student." And that was how the greatest fortune in history up to that time was created.

  • @notation254
    @notation254 2 years ago +4

    Great doc, loved all the details throughout the years. You should be proud.... and damn, you deserve more views and subs for this.

  • @petermitchell6348
    @petermitchell6348 2 years ago +2

    Yorkshireman Geoffrey William Arnold Dummer, MBE, C. Eng., IEE Premium Award, FIEEE, MIEE, USA Medal of Freedom with Bronze Palm, was an English electronics engineer and consultant who is credited as being the first person to popularise the concepts that ultimately led to the development of the integrated circuit, commonly called the microchip, in the late 1940s and early 1950s.

  • @NipkowDisk
    @NipkowDisk 1 year ago +12

    I don't normally watch videos longer than about 30 minutes, but this was worth every second of it. Most of it was a great trip down Memory Lane for me; I was born in 1960. Outstanding job!

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  1 year ago +2

      Glad you enjoyed it!

    • @marktwain5232
      @marktwain5232 11 months ago

      @@TechKnowledgeVideo What a beautiful trip down memory lane for me! I started in 1979 with CP/M on the Z80 and thought I was the last person on Earth to find out about the microcomputer revolution. I then worked 41 years as a professional developer and software engineer in GW-BASIC, Clipper, and finally Java, Visual Basic, and various JavaScript libraries. I retired in early 2021, working in cloud protocols. I loved every minute of it! Every day was something new that no one had ever done before. I am so grateful that I got to work on the long march! Thank you so much for this beautiful presentation!

  • @Nehmo
    @Nehmo 1 year ago +2

    I hate to add "old-guy" perspectives, but I lived through the transition from tubes to transistors. Modern kids view that transition as a short misdirection in the march toward the integrated circuit. But it was a huge advancement that has nothing comparable in modern times.

    • @laustinspeiss
      @laustinspeiss 3 months ago

      I'm in the same boat!
      But it does give us the ‘under the hood’ knowledge that lets us make systems do things better and faster with the same hardware!
      I still write code that doesn’t waste any more resources than necessary!

  • @felixbaum48
    @felixbaum48 2 years ago +35

    This may be nearly a year old but it's still absolutely brilliant. Thank you for putting it together!

  • @jondeere5638
    @jondeere5638 11 months ago

    An important innovation in 1970 was the DEC PDP-11/20 computer's UNIBUS, which combined interrupts, memory, and data on one bus structure rather than three separate buses.

  • @kensmith895
    @kensmith895 1 year ago +9

    Truly excellent documentary. Well done. It charts the timeline of my career, from the 8008 to the present day.
    If you do release an updated version of this video, it would be good to add a mention of the DEC Alpha RISC machine, and also of Colossus from Bletchley Park. There were various other microprocessors that you could make passing reference to along the way, such as the LSI-11 implementation of the PDP-11 and the microprocessor versions of the VAX architecture. HP also had some proprietary microprocessors that they incorporated into their own products.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  1 year ago +1

      Thank you very much! If I ever make an updated version I’ll take your considerations on board :)

    • @vicheakeng6894
      @vicheakeng6894 1 year ago

      ADHESIVE

  • @TedApelt
    @TedApelt 1 year ago +1

    I still remember my Apple II computer in 1979. Programs were loaded on it with an audio cassette tape recorder. Later, I got a floppy drive and thought that was truly awesome.

  • @Magnulus76
    @Magnulus76 2 years ago +10

    I had a Phenom II. It was a great budget CPU. I used it for some fairly hefty tasks, like chess analysis. Clock speed wasn't everything back then.
    Bulldozer was a disaster, and I didn't upgrade my CPU until the Ryzen generation.

    • @golangismyjam
      @golangismyjam 2 years ago +1

      You say it was a disaster, but I owned one of those CPUs for 10 years and it still plays AAA games to this day. Sure, they ran hot and used a lot of electricity, but that was every AMD CPU back then

    • @manuelhurtado9970
      @manuelhurtado9970 2 years ago

      @@golangismyjam Yeah, my brother uses my dad's old FX-8350 and it works fine for games like Fortnite, Minecraft or Roblox. Hell, it can even run Elden Ring with a GTX 970

    • @ismaelsoto9507
      @ismaelsoto9507 1 year ago

      @@manuelhurtado9970 The hexa- and octa-core FX CPUs can still run modern titles well enough; they sure did age better than the i3/i5s from the same time period, which started struggling when games used more than 4 threads.

    • @manuelhurtado9970
      @manuelhurtado9970 1 year ago +1

      @@ismaelsoto9507 Yeah, true. The FX series had a faster clock and more cores; the only problem is that the cores share some resources, like the FP scheduler

    • @ismaelsoto9507
      @ismaelsoto9507 1 year ago

      @@manuelhurtado9970 Yeah, it may have made it easier to develop/manufacture an octa-core CPU without being too expensive (an FX-8150 has a die size of 315 mm² vs Intel's Xeon E5-2650 through 2690, all octa-cores with a die size of 435 mm², on Intel's 32 nm process, which was denser than GlobalFoundries' 32 nm process), but it crippled their IPC... AMD hoped software would catch up fast and fully utilize the 8 threads, making it a more compelling option than the competition; sadly that only happened when the hardware was already obsolete.

  • @MrGsteele
    @MrGsteele 1 year ago +1

    Excellent documentary and walk down memory lane. This was the world of computing in which I and my peers in the computer business evolved, and I remember the steps and the back and forth chip competition. It's interesting to reflect on what has happened since this video was produced, and an addendum that brings us up to 2024 would be a logical follow-on to this excellent treatise. I was an early fan of the 6502, then the 680X0 for embedded designs, and then the rise of the Intel-based high volume home computers that transformed the landscape. The progress has been truly stunning.

  • @johnpenner5182
    @johnpenner5182 2 years ago +7

    Very good and thorough chronicle of early processor development. I like that you were able to trace the architectures to their root in von Neumann, through the tubes, the IBM 360, the PDP, the 4004, and the Altair (I didn't catch if you mentioned this was the iconic machine which appeared to Bill Gates on the cover of a magazine and inspired the founding of Microsoft). You did a nice job working through the mainframe tubes-to-transistors shift and the microprocessor developments. The Busicom calculator and the engineer calling for a simplified, generalized design resulting in the 4004. Thank you for this video. Recommended.

  • @ovechkin100
    @ovechkin100 1 year ago +1

    As a kid I of course had no idea how new computer tech really was. I was born in 1988. I remember playing on a computer in the mid 90s, and the games were on floppy discs. They were all so ghetto, but back then it's all I knew. Then in the late 90s my parents got a whole brand-new computer, and wow: MSN, computer games, surfing the web. Obviously it's a lot different today, but you could still generally do the same stuff: talking to my friends, playing games, researching cool shit. And ever since, it's only elaborated. It's bizarre that it all came around right as I grew up and was able to just fall into it. Insane times. Will be interesting to see how far it goes.

  • @mrflamewars
    @mrflamewars 2 years ago +7

    Great video. Sandy Bridge was a defining moment for Intel - it's why so many of their later CPUs are so similar.

  • @trevorjones3755
    @trevorjones3755 4 months ago

    I'm a mechanical engineering student who has a few friends who know WAY too much about computers. I've been slowly trying to learn but this has been tremendously useful in that goal! You explain things very well to someone who barely knows anything about computers. Well done! And thank you

  • @babythorgaming2166
    @babythorgaming2166 2 years ago +17

    The production quality on this is so darn high, how is this channel so small?? You've definitely earned a new subscriber, and I hope to see new content from you in the future!

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +5

      Thank you so much! I am currently working on a new video, and will be posting updates on the community tab as it develops :)

    • @customsongmaker
      @customsongmaker 2 years ago +2

      There are 46 ads in this one video, so maybe that's why

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +4

      Hi customsongmaker, I’m surprised you were served that many ads - I just rewatched it on another one of my channels and I got less than 1/4 of that, so I’m unsure why you got so many.
      I would add that the video has only been monetized for less than a week and I’ve been playing around with the ads to see what the best ratio is re: watchability vs revenue. I have received a few comments suggesting that the ad frequency is a little high and I will be adjusting that accordingly when I’m back at my computer in a few days.
      One final thing to say: as a small creator, ad revenue is the only source of income (no patrons or YouTube members), and looking to the future (I will be finishing my PhD next year so may not have a flexible schedule after) it will be difficult to continue making videos like this without revenue. I appreciate your comment and will take the feedback on board - feel free to keep an eye on the community tab for more updates :)

    • @customsongmaker
      @customsongmaker 2 years ago +1

      @@TechKnowledgeVideo I counted the ad breaks on the progress bar. I didn't actually spend 20 minutes watching advertisements just for your video, I stopped watching very early. If there had only been 3 ads - 1 at the beginning, 1 in the middle, and 1 at the end - I would have watched 3 ads, which is 3 times as many ads as I actually watched.
      Try breaking this video into 5-minute or 10-minute videos. "The Complete History of the Home Microprocessor: Part 1". Then you can see how many people will watch 5 minutes or 10 minutes, with 1 ad. If they like it, they will continue to watch the rest. They will also subscribe, since it makes them think of you as someone with many good videos that they don't want to miss.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +3

      The ad breaks on the progress bar do not correspond to ads you will actually see - YouTube will only play about 1/5 of the adverts it displays on the progress bar - which for this video works out at about one every 9 minutes.
      If you look at the channel, I have done what you have said - split this video into 5 separate videos, where viewers will typically see 1 mid-roll or less per video (with average video lengths of around 20 minutes). However, this combined video has been far more popular than the individual parts. As to why this is I’m not sure, but the algorithm far prefers pushing this video out over the others.
      I would add that video retention has stayed the same in the week the video has been monetized compared to the prior week - people on the YouTube partnered subreddit have done loads of testing on a whole range of videos, and against all logic it really genuinely doesn’t affect watch time. However, having watched the video back myself for the first time, I do think the quality of the video is degraded by the current frequency of adverts, and I really want people to have a great viewing experience. Hence I will reduce the number of ads after the weekend.
      If you do want to watch the video without ads, feel free to use an ad blocker or watch the individual parts, which are a lot lighter ad-wise :)

  • @v8pilot
    @v8pilot 1 year ago +1

    In 1965 I was an EE student at Birmingham University. Dr Wright was one of our lecturers and his speciality was semiconductor devices - particularly heterojunction transistors. In a lecture he mentioned the latest thing - integrated circuits. I asked him if he thought it would be possible to get a complete computer on an integrated circuit. My recollection is that he told me not to ask silly questions. He obviously thought I was taking the piss.

  • @stachowi
    @stachowi 2 years ago +3

    This was excellent. Looking forward to more content from you.
    I'm a CS/EE, 20 years in the industry.

  • @KangoV
    @KangoV 11 months ago +1

    I was the proud owner of the first ARM-based computer... the Acorn Archimedes ;)

  • @frankowalker4662
    @frankowalker4662 2 years ago +5

    Sorry about this, but you didn't mention the world's first electronic programmable computer, Colossus, developed from 1943-45.
    Great documentary.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +2

      Thanks! Colossus wasn't technically Turing complete, which is why it is not mentioned :)

    • @frankowalker4662
      @frankowalker4662 2 years ago +1

      @@TechKnowledgeVideo Fair enough. :)

  • @chemtech90
    @chemtech90 several months ago

    I’m amazed at how well you built the story. A brilliant, educational video about the history of computing.

  • @jparky1972
    @jparky1972 1 year ago +4

    Thank you so much for this.
    I was a software developer back in the 90s to early 00s and knew the hardware development up to the dual-core CPUs.
    But after that, I lost my interest in the hardware side after becoming a stay-at-home dad.
    So thanks for this. It really filled in a lot of gaps.

  • @JeremyWinfreeDev
    @JeremyWinfreeDev 2 years ago +1

    I've been doing all of my dev and design work on an M1 Mac for about a year now and it's amazing

  • @MrPDawes
    @MrPDawes 1 year ago +17

    I was getting worried that you were not mentioning ARM whatsoever until about 80% of the way through. Given the world dominance of this architecture, it's not one to miss. The early Archimedes computers from Acorn were a step change in performance over the Intel architecture at the time, but their lack of a maths co-processor became a significant disadvantage during the media war, as graphics has a high compute demand.

    • @TheUAoB
      @TheUAoB 1 year ago +5

      I felt the same way while watching. The ARM did have an FPU (the FPA10) and was designed to support coprocessors, but indeed the standard Archimedes didn't ship with it. This meant software didn't really take advantage, since FP instructions were likely to be slow with the trap-based floating point emulator.
      Acorn did try to break into the UNIX workstation market, which would have meant much more powerful ARM-based computers in the early 90s had it been successful. Even then, Acorn chose to make the FPU optional even on their flagship £3,995 R260 (a rebranded Archimedes A540); without an FPU you have to wonder what market Acorn was actually aiming for!

    • @AR15andGOD
      @AR15andGOD 1 year ago

      math

    • @ThatSockmonkey
      @ThatSockmonkey 1 year ago +1

      @@AR15andGOD "math" is not a word.

  • @jeffreyjeffrey007
    @jeffreyjeffrey007 1 year ago +1

    Late to the game on this vid, but at least it was recommended so I was able to see it. The algo works in strange ways. I seem to get a lot of tech-tube and retro recommendations in these spaces. Otherwise it's Weird Al's UHF and Pat the NES Punk vids of the Tallarico rant to no end. Enjoyed those vids, but how many times am I supposed to watch them? Either way, quality stuff on a two-year-old vid, d00d. Subbed.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  1 year ago +1

      Thanks! Keep an eye out for the next long form documentary coming out later this year :)

  • @endofthelinejoel
    @endofthelinejoel 2 years ago +9

    This deserves millions of views. Well done.

  • @jeffreyphipps1507
    @jeffreyphipps1507 2 years ago +1

    Exceedingly well done. Easy for people to connect with. As a college instructor, I will try to get as many students as possible to view this.

  • @sophiekempston5152
    @sophiekempston5152 4 years ago +24

    Fantastic series, can't wait for the next one!

  • @energysavingday
    @energysavingday 1 year ago +2

    A hugely impressive, multi-decade summary. Well done.

  • @01chippe
    @01chippe 2 years ago +8

    This was thoroughly enjoyable and a great trip down memory lane. Why didn’t you include the shift to putting graphics processors on the CPU? Great video and very detailed.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +5

      Thank you! Ultimately in a video like this, you have to limit your scope somewhere, and since integrated graphics and other dedicated on-die accelerators are a fairly new concept (only really appearing in the last 10 years) they were left out.

  • @aut0turret
    @aut0turret 2 years ago +1

    A 16-year-old CPU/motherboard, and it still serves me well today. Microshaft won't allow me to install the latest Winblows on it. They lost a paying user because of it; it forced me to get the hang of Linux. Been happy with it.

    • @PaulaXism
      @PaulaXism 1 year ago +1

      I feel so outdated as well, replying to you from my Intel Core 2 Quad Extreme. Mint 20, btw. And my media machine is a very old 3.3 GHz Core 2 job, also running Mint.

    • @aut0turret
      @aut0turret 1 year ago

      @@PaulaXism As long as we're not trying to play the latest video games or frequently needing to do some other CPU-intensive rendering/encoding task, our CPUs are great today. I see no reason for me to upgrade, especially after the sticker shock. Pay thousands, for what? Something that does what the one I already have does, but slightly faster.

  • @TailRecursion
    @TailRecursion 2 years ago +22

    The production quality here is absolutely incredible and you've not even hit 3K subs yet. You deserve 1000x that, easily. I'm also seeing parallels to Ahoy in all the best ways. Thanks to the algorithm for bringing me here, and thank you for making excellent content; you've got a new sub!

  • @Juancheros
    @Juancheros 1 year ago +2

    Excellent video! Glad to see you mention the 4004 and the 8008. Beyond the home microprocessor, there is the massive and growing influx of microprocessors into the automotive industry, where multiple RISC devices simultaneously perform complex independent tasks, communicating with each other via evolving CAN bus standards with a central CPU which has the infamous OBD port. This application has become a free-for-all in the industry for the sake of bling, leaving owners and auto mechanics constantly second-guessing what is going on. Would be great to see you make a video on this too, thank you!

  • @Damjes
    @Damjes 2 years ago +3

    Also, CP/M ran on the 8080; it ran on the Z80 because of compatibility.

    • @fredbear3915
      @fredbear3915 2 years ago +1

      Yes indeed. CP/M was written in 8080 assembly language, so it was only ever going to use the 8080 opcodes that the Z80 also ran for compatibility's sake. When I wrote CP/M utilities back in the 1980s, even though I wrote in Z80 assembly language, I had to make sure to only use those 8080 instructions from within the Z80 set, otherwise my software would not have run on a lot of machines!

  • @absalomdraconis
    @absalomdraconis 2 years ago +2

    7:33 : A note for anyone reading the comments - the "electric field controlled" transistor (the field-effect transistor), in the form of the JFET, had actually been devised and patented _over a decade_ before the transistors otherwise described at this point in the video (the BJT, which was created before WW2) had been created, but they required too much precision to make at the time. In contrast, the then-common form of the BJT (one of the germanium versions, I think) could be created on a lab bench with a capacitor of the right capacitance, a battery of a specific voltage (to accurately charge the capacitor), and a holder for the BJT that was used both to make it and use it, with the final BJT literally looking like its schematic symbol (the "bar" is the actual germanium). There were even "foxhole" transistors created with razor blades and clothes pins.

    • @stevenvanhulle7242
      @stevenvanhulle7242 1 year ago +1

      The BJT was invented by Bardeen, Brattain and Shockley at Bell Labs in 1947, so right after WW2, not before it.

  • @vanlife4256
    @vanlife4256 1 year ago +11

    Archie, this was an awesome review of the Home Computing history! Great production! Thank you for sharing!

  • @joshjones3408
    @joshjones3408 1 year ago +1

    The background music is awesome....the video is great 👌👍👍👍👍

  • @jacobrzeszewski6527
    @jacobrzeszewski6527 2 years ago +8

    Awesome, comprehensive video. Kinda surprised you didn’t mention Intel’s “tick-tock”, and AMD releasing the first 5 GHz processor. Even if AMD did it via a technicality.

    • @toby9999
      @toby9999 2 years ago

      I guess he can't cover every detail of everything in one hour.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +1

      Thanks! :)

  • @sudokatsu
    @sudokatsu several months ago

    Thanks for keeping this information not only alive, but digestible, for people like me who didn’t experience it first-hand. It’s incredibly fun reading about everyone’s experiences in the comments with the context about the historical events and the progress of technology from the video. Also, the animations and video editing are top notch ❤

  • @Conenion
    @Conenion 2 years ago +6

    To say that Intel Itanium was a RISC design is a bit of a stretch. Actually, back then it was the RISC crowd that said Itanium's VLIW approach was doomed to failure. The main difference between VLIW (which Intel calls EPIC) and advanced superscalar RISC designs is that EPIC does not allow for out-of-order (OoO) and other dynamic execution of instructions. Instead all this burden is put on the compiler. In fact, if you ignore dependencies between instructions, and the order of instructions thus produces a wrong result, Itanium will happily deliver that wrong result. Itanium does no data dependency checking at all; this has to be done by the compiler.
    Removing all dynamic execution features presents a dilemma: the compiler, which has to be very, very smart, is forced to find every bit of instruction-level parallelism (ILP) during compilation. EPIC has no way of doing any sort of reordering or re-scheduling. If the compiler isn't able to find ILP, there isn't any at all; instead a NOP is issued to the pipe, resulting in VLIW instruction bundles which load only 1 of 3 pipes with work while the other 2 just do NOPs. In that case you lose badly. This static scheduling is especially difficult with loads, since the memory hierarchy presents practically impossible-to-predict memory delays. (A toy sketch of this bundle-scheduling problem follows this thread.)
    VLIW/EPIC works best with code like matrix multiplication, which is very static w.r.t. input data. In such cases parallelism is basically given - an easy job for a compiler to parallelize. But such code is rather rare in a typical desktop or server setting. Also, such SIMD computations can be done nicely in the vector units of non-VLIW processors, like SSE or AVX in the x86 world.
    In short, VLIW/EPIC is an architecture that is targeted too much towards specific computational classes to be a general-purpose CPU architecture. Also, writing a compiler for EPIC which is able to extract every bit of ILP was/is close to impossible. There were other problems specific to Intel's implementation, notably that executing legacy x86 code was painfully slow.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +2

      Very interesting stuff, thank you for your insight!

    • @absalomdraconis
      @absalomdraconis 2 years ago

      Barring any memory & similar delays (I'm not particularly familiar with the design, so I'm not sure how those would be handled), it shouldn't actually be _too_ difficult to get decent (not necessarily perfect) scheduling done.
      In essence, you compile & link, but don't output a final executable, instead producing an SSA form or similar. You then take this and throw it through a scheduler stage - the scheduler defaults to outputting a NOOP for _all_ of the executable code, but _looks for_ instructions that it can grab from the SSA form as _replacements_ for NOOP instructions, marking those SSA instructions as "finished" in the process. The matching would probably be best done by a module written in Prolog or something. It wouldn't be all that fast, but with a decently sized window it should be fairly effective.
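
    A toy illustration of the static-scheduling problem discussed in this thread: not Itanium's real encoding or a real compiler pass, just a greedy scheduler that packs independent pseudo-instructions into 3-slot bundles and pads with NOPs when the ILP isn't there:

    ```python
    def schedule(instrs, slots=3):
        """Pack (name, inputs, output) instructions into fixed-width bundles.

        An instruction may only be bundled once all of its inputs were
        produced by earlier bundles; an EPIC-style machine does no run-time
        dependency checking, so the compiler must get this right or the
        hardware silently computes a wrong result.
        """
        ready = set()            # values produced by completed bundles
        pending = list(instrs)
        bundles = []
        while pending:
            taken = [ins for ins in pending if ins[1] <= ready][:slots]
            if not taken:
                raise ValueError("dependency cycle")
            names = [ins[0] for ins in taken]
            names += ["NOP"] * (slots - len(names))  # the ILP wasn't there
            bundles.append(names)
            ready |= {ins[2] for ins in taken}
            pending = [ins for ins in pending if ins not in taken]
        return bundles

    prog = [("load a", set(), "a"), ("load b", set(), "b"),
            ("add c=a+b", {"a", "b"}, "c"), ("mul d=c*c", {"c"}, "d")]
    for bundle in schedule(prog):
        print(bundle)   # after the first bundle, most slots are NOPs
    ```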

  • @brianbowcutt249
    @brianbowcutt249 2 years ago +1

    The great and terrible algorithm kicked me here after I purged dozens of recommendations for sub-60-second cat videos and memes. Glad something out there noticed my futile attempts to save my brain. Excellent work, looking forward to seeing more.

  • @maniacfox111
    @maniacfox111 4 years ago +15

    Really great content. Well put together!

  • @TechDeals
    @TechDeals 6 months ago

    At the 1 hour mark, you note that the i7-920 didn't often show a performance gain.
    Having been an early adopter of it, moving from the Q6600, I have to strongly disagree with that. The difference was obvious and noticeable the minute you did more than treat it as a "faster older PC".
    The multi-tasking performance was astounding and the overall responsiveness was a step up. I ended up buying 2 of them in 2009 to replace both of my Core2Quads at my office, the difference was noticeable enough.

  • @Aurange
    @Aurange 2 years ago +4

    1 1/2 years after the fact and this video finally got blessed by the algorithm gods.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +1

      Indeed - I now get more views in 6 hours than I did in the first 6 months of the video release!

  • @BigDaddy_MRI
    @BigDaddy_MRI 1 year ago +1

    I’m 70 now, and I watched a lot of this microprocessor evolution happen (except for the very early development); I remember the Intel 4004 being produced.
    While in the US Navy, I was able to get a sample of a 2.5 MHz Z-80 from Zilog, and using 256 bytes of RAM and 2K bytes of EPROM I wrote a tic-tac-toe game. The 40-pin Z-80 is still being made, and the Z-80 family of chips is still available. Brand new. I’m still writing code and building projects with that 8-bit chip. Back in the early ‘90s, the Z-80 ran almost 100% of all the bank cash tellers, and even today it still does; albeit with much smaller and more advanced architecture and clock speeds, the core compute engine is still the Z-80. My opinion only: it remains the most powerful 8-bit microprocessor ever designed. 178 instructions, 3 interrupt modes, and indirect addressing for memory and I/O - it is an amazing device. And my Heathkit H89 with a Z-80 still boots into CP/M (or MS-DOS) with no problems at all. I write code on that machine for my projects.
    Thank you for an OUTSTANDING video!! Wow, that took me down a gr8 bit of fond memory lane. Pun intended.

  • @jamesbond_007
    @jamesbond_007 2 years ago +3

    This is an exceptionally well done video! Fantastic capturing of the entire history from the origins to today. Great job!!!

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago

      Glad you enjoyed it!

    • @susanhodges9447
      @susanhodges9447 2 years ago

      He is wrong, and that is how ignorance thrives. Try 1941 in the UK.

  • @kgbmmt
    @kgbmmt 9 months ago

    Great job! I was there back in the 80's and worked in computer retail through to 2008. It was a fantastic journey and your video brought the memories flooding back. Thank you!

  • @CobraTheSpacePirate
    @CobraTheSpacePirate 2 years ago +4

    Fantastic! Needs more views! What a shame!

  • @BuddyWudzyn
    @BuddyWudzyn 5 months ago

    This is absolutely amazing and pretty comprehensive, covering so much (and such complicated) information in a relatively short amount of time. Props to you for putting in so much effort when you were apparently only getting small numbers of views. Anyways, subscribed, and I hope you keep making great edutainment!

  • @charlesjmouse
    @charlesjmouse 2 years ago +3

    A rather belated "very good".
    Such an excellent video with so few views! I must go see what else you've done and maybe add to views.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago

      Glad you enjoyed it! I am in the process of creating the next long form video :)

  • @chrishaidinyak4621
    @chrishaidinyak4621 1 year ago +1

    A nice and relatively detailed summary of the home processor, up until the discussion of the ARM processor. The ending leaves the taste that ARM will rule the world, and yet there is no mention of RISC-V. In my opinion, RISC-V will be the endgame for the µP, as even royalties will eat into profit. Still, much of the video is relatively accurate and informative; thank you.

  • @Damjes
    @Damjes 2 years ago +3

    Wrong again. Not "every machine" used vacuum-tube technology. Konrad Zuse created the Z1-Z4 computers, which were relay-based AFAIK.

    • @crabby7668
      @crabby7668 1 year ago

      CuriousMarc did an interesting visit to Japan to see a relay-based computer that is still running. IIRC they were used commercially once. Worth a look if you are interested.

  • @mr88cet
    @mr88cet 11 months ago +1

    Superb history summary! Thanks.
    I just recently retired from 40 years in this industry, so I remember *_a lot_* of this.
    What’s wild, though, is that the parts of this history I do _not_ remember as well are probably the parts most people do: I mostly worked in the embedded-compute arena, have been an Apple dude from the Apple ][ days, and am not much of a gamer, so the exact details of the Intel-AMD x86 wars were not something I followed in much detail.

  • @bazza5699
    @bazza5699 3 years ago +6

    Wow! How has this got so few views!?

  • @NesNyt
    @NesNyt 1 year ago +61

    The 6502 was not the king... if ICs had a religion, the 6502 would be god

    • @toby9999
      @toby9999 1 year ago +2

      It was the 8-bit CPU I enjoyed programming the most.

    • @TheElectricW
      @TheElectricW 1 year ago +3

      Likewise... good grounding for assembly programming. I used to discuss this with a Z80 programmer... he couldn't see how it was possible to write programs with only 2 x 8-bit registers available!

    • @lazymass
      @lazymass 1 year ago

      @@TheElectricW Do you happen to have some source I can look at that shows how programming goes for such chips?

    • @hanspeterbestandig2054
      @hanspeterbestandig2054 1 year ago

      @@lazymass Have a peek at the YouTube channel "ChibiAkumas". Keith will teach you all the things needed to write code for 6502-, Z80-, or 68000-based computers. Among other things, he published two books, "Learn Multiplatform Assembly Programming with ChibiAkumas" parts 1 & 2... Highly recommended!

    • @sharedknowledge6640
      @sharedknowledge6640 11 months ago

      Indeed. The 6502 simply crushed the competition for affordable home computers.

  • @matthagy86
    @matthagy86 2 years ago +1

    Thanks!

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago

      No problem! Thanks for the contribution, it means a lot :)

  • @SquallSf
    @SquallSf 2 years ago +3

    Very interesting video; I watched the whole 1:26h.
    I didn't know that the first microprocessor was actually made for the F-14. Since it is declassified now, could you point to a video with more details?

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +1

      Thank you! Unfortunately I cannot post links here, however I can recommend a great article by WIRED entitled “The Secret History of the First Microprocessor, the F-14, and Me” which goes into more detail :)

  • @markbanash921
    @markbanash921 2 years ago +2

    I took my undergraduate degree at the University of Pennsylvania, where I did my introductory computer science courses in the Moore School of Engineering building where ENIAC had operated. At that time, units of the old machine were lying around in open storage on the first floor, and when I would walk to class I'd go right past them and be amazed at how large and clumsy they seemed. Two years later I started seeing ads for the Timex ZX81, the American version of the Sinclair machine, appear in magazines like Scientific American. The juxtaposition of those two computers got one thinking about how far the revolution had come, as well as how far it could go.

  • @cemacmillan
    @cemacmillan 2 years ago +11

    This is an excellent overview of how things have changed, and for me (active in development from 1991 onward) it shows just what a mess things really were, and how things were essentially held back. I wish there were more mention of how anti-competitive practices did much to create the duopoly in the desktop sphere which existed until 2020, but this addition would have made the video at least twice as long :) Here in 2022, things still appear "open." I'm sitting here with an unused Linux box with an original water-cooled FX-series AMD under my desk, writing this comment on a 2021 MacBook Air which, despite so little RAM, outperforms everything I've ever personally owned while making no noise. Will we see more and more ARM, or might something disruptive and interesting emerge? We'll see.

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  2 years ago +3

      Thanks! :) Indeed, only time will tell.

    • @awuma
      @awuma 1 year ago +1

      A PC is silent if you use high-quality fans (i.e. Noctua), and control them properly.

  • @unityxg
    @unityxg 6 months ago

    The Encarta cameo brought back some flashback memories. What a time to be alive. I am happy to have seen what the world looked like before mainstream personal computers. I think computers have really revolutionized humanity in many different ways, for better and for worse.

  • @mikedrop4421
    @mikedrop4421 1 year ago +3

    This is stellar work sir. Yes it gives off Ahoy vibes but your style shows through. Please make more stuff!

    • @TechKnowledgeVideo
      @TechKnowledgeVideo  1 year ago +1

      Thank you so much! :)

    • @daveroche6522
      @daveroche6522 1 year ago

      Agreed. VERY informative and interesting - for those of us with an actual (functioning) brain......

  • @Avenger24601
    @Avenger24601 1 year ago +1

    This is fantastic! Quite the journey that aligns with my time on the earth. Thank you!

  • @klaxoncow
    @klaxoncow 2 years ago +3

    Hmm, you could have explained why naming the chip "Pentium" allowed them to avoid AMD and Cyrix copying it, as they had done with the 386 and 486.
    Basically, you cannot trademark a number. And "386" and "486" are numbers, so you cannot legally forbid anyone from using those numbers in their names.
    But by inventing a made-up word - "Pentium" (where the "Pent-" suggests five, as this would otherwise have been the 80586) - then you could trademark "Pentium" and disallow others from using it.

  • @DGHecei
    @DGHecei 9 months ago

    Bravo! What a great video. Very interesting and held my attention through to the very end. I pretty much have lived through this whole transition in the home computer world. This was spot on and didn't really miss a thing. Well done!

  • @joelstyer5792
    @joelstyer5792 2 years ago +3

    This is a great video, well put together and researched. I lived through most of this (from the 60s forward) and it is nice to see it all condensed together in a timeline. I was hoping to see something about Intel buying DEC's chip division and gaining knowledge from the Alpha processor design (fast speeds and high power dissipation), but I understand that not everything can be included. Near the end of the 8086 era, the NEC V-series chips also had an impact with consumers, increasing performance. Congratulations on some excellent work.