bro this is better than tv-style documentaries.. I like how you don't underestimate the intelligence of the viewer and actually explain how things are done at a deep level. Great stuff
the YouTube algorithm finally found out what I like!
I searched for this video while taking apart an old Microsoft gx
Same story here, this channel looks like geek heaven. I'm all in
Yeah this is right up my alley
yup, too bored of those review videos from mkbhd and the like, this content is good
Watching other content feels like wasting my life.
Learned more than during an entire semester of computer science lectures.
God-tier education man.
This made my day! Thanks so much.
probably cause he's a better explainer than an average CS professor. or at least has more passion to educate than them.
@@NewMind bro u dont know how good u are at explaining things. in my college my teacher doesn't know dogshit and gets paid like insane, and u should be rich af
US education institutions are just profit centers these days.
Right there with you!
Excellent job, man... Nice to see educational videos that go a bit beyond 'grade school level' learning. Keep them coming because I'm looking forward to the next... and the next... and the next...and...
"videos that go a bit beyond 'grade school level' learning."
Well said, TH-cam has a lot of educational content targeted for a 13 year old to understand or a sophisticated group debate or a straight up University lecture.
- The 1st is fair but cuts out so many people who know a few things about the world.
- Group debate is great for getting opinions and seeing into what other world leading experts think but isn't overly educational
and can often go off topic for small periods.
-University lectures are great if you're waist deep in a certain topic and have some advanced questions
Videos like these provide the bridge to understanding those university lectures a bit better and work well for an informed adult who hasn't got a phd in said topic.
no paid vimeo :(
Superb content, excellently presented and animated. Binge-watched everything, subscribed. Looking forward to future vids.
Same. This is scratching an itch I didn't know I had
There are truly so many great videos on this channel, it's incredible.
This was very instructional. Great work. I've been a professional developer for almost 25 years and yet these videos actually taught me a thing or two. I've been coding in high-level languages from the get-go, so you don't really need to know these things, but it is helpful to get a basic understanding.
I really hope you are a teacher somewhere teaching an IT/Electronics/Computer Science class. And if you are not, you should go and start teaching asap. Your presentation is absolutely flawless and your descriptions are clear, detailed and yet easily understood, with a strong emphasis on the history of development and real applications. We need more teachers like you for sure. Keep up the good work.
I am not but I appreciate the kind words :)
Dude, please keep doing these deep, highly detailed braingasms. lml
i came to post this, but same!
I can't believe it took months for this production to arrive in my suggestions!
This is insanely informative. I'm impressed. Also, your narration in this video is noticeably better than in part 1. Nice job!
Gosh, those 80s ads really overestimated how much people would use pie and bar graphs in this age of computing, didn’t they? 😂
Hahaha, yep it was the 80s/90s symbol of "business being done here"
Spreadsheets were the killer app back then.
The IBM PC was a boring business machine. Computer gamers used the C64, Amiga, Atari ST, etc. The PC didn't become a good gaming platform until the early 90's.
Went on the subway to Canary Wharf recently - all business types were paper or laptop in hand, studying bar graphs and pie charts. I think they're still very much alive.
Much like all the "stop drop and roll" training overestimated how often people are on fire
Awesome content, presented brilliantly.
I'm amazed to think how brilliant the people who invented this technology were, back then, with very few sources of information.
Excellent work by you guys ❤
CISC was mainstream way before the x86 architecture was invented. The 4004, the x86 forerunner, was based on DEC designs, which were then all CISC. The term CISC was only coined when RISC architectures came along, as memory started to get cheaper; it came about purely to contrast with the new RISC, which was generally faster and more flexible than CISC. The underlying architecture of a lot of CISC processors is actually RISC, with the microcode implementing the CISC instruction set.
I find it amazing how people who now seek to define the path of computer history so consistently ignore DEC. The PC came directly from the PDP-11's and VAX's hardware and software. I was there and saw both arenas.
Not only that, but it was such a smooth gradient. The ALU was defined in concept before it was implemented in hardware, before there were integrated-circuit full adders; then there were IC ALUs, then microcomputers built of many ICs, before there was a single-chip microprocessor. The microprocessor didn't just appear out of nothing from a genius at Intel; it was bloody obvious that integrated circuits could eventually integrate registers, decoders, ALUs etc. into a single-chip CPU.
Every example I can think of looks not like a step function where something just appeared out of nowhere, but like a very smooth and gradual development with some false starts (too early, not viable yet) and very often slow diffusion from mainframe or flight-simulator hardware to minicomputers or professional industry and eventually to home users.
@@markcummins6571 True. I was quite comfortable in the PC-DOS environment, having come from VMS on a VAX-11/780. - It's not so much that they are ignoring DEC, though, as that they have just never heard of them.
@@Chris-ZL Tens of architectures and companies are constantly ignored, not just DEC.
A retronym, then.
the striped IBM logo really sends you back to that time. The first PC we had was an IBM with Windows 95; it was amazingly cool back then...
Now I know where STACKOVERFLOW comes from.
Do you understand it?
@@SightCentralVideos I did.
See the Stack Overflow logo
So true 🤣
It doesn't really happen these days. There is a mapping layer (paging) between what the program sees as a RAM address and what actually is a RAM address. So what happens is that in the page table there is a theoretical "page" of memory called the "guard page". Should the stack try to expand into it, a page fault is generated, which the OS handles by mapping more memory into the stack. Or by blue-screening and dying. Depends.
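To make the guard-page idea above concrete, here is a minimal C sketch, assuming a POSIX system (getrlimit and Linux-style demand-grown stacks); the 4 KB frame size and the recursion pattern are purely illustrative:

```c
/* Hedged sketch, assuming a POSIX system: unbounded recursion walks the
 * stack toward the guard region. While the stack is still under the
 * RLIMIT_STACK soft limit, each fault lets the kernel map in more stack;
 * past the limit the process receives SIGSEGV instead of more memory. */
#include <stdio.h>
#include <sys/resource.h>

static long recurse(long depth) {
    volatile char frame[4096];            /* roughly one page per call frame */
    frame[0] = (char)depth;               /* touch the page so it really gets mapped */
    if (depth < 0)                        /* never true in practice; keeps compilers quiet */
        return 0;
    return recurse(depth + 1) + frame[0]; /* not a tail call, so frames keep piling up */
}

int main(void) {
    struct rlimit rl;
    if (getrlimit(RLIMIT_STACK, &rl) == 0)
        printf("stack soft limit: %lld bytes\n", (long long)rl.rlim_cur);
    return (int)recurse(0);               /* eventually killed by SIGSEGV at the limit */
}
```

Run normally it prints the soft stack limit, then recurses until the kernel stops growing the stack and the process dies, which is the "map more memory or die" behaviour the comment describes.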
This-this is what I call content. High quality, informative, simple and understandable. Thank you for this.
I could not believe it took over a year and a half for youtube to finally suggest this content to me. Just ended up subscribing to your channel dude. You did a great job of making this content.
This is one of the best and most informative videos I have ever seen on YouTube. Lovingly produced and of high quality. Thank you!
Very nice. Especially the reference to C=64 at 20 minutes topped it off nicely.
This channel deserves more subscribers. It's a shame that YouTube doesn't promote more quality content like this.
Man the production quality is top notch. Great work!!!
Best thing i have ever seen on youtube. Please do part 3, im a student in computer science and im not really getting to see all these aspects of the computer. I code C++, Java and SQL but i dont know the steps from this video to there. Please get part 3 out now! :)
Great series! I hope it goes into the statistical nature of modern CPUs and Protected Mode etc... Also hoping for a series on packet-based networks and the evolution of Ethernet and the OSI model.
So I have 16 million dollars' worth of RAM in my machine... at 1980 prices :)
Try 1970 prices. I must have ruined 20 or 40 whole wafers of 1101 and 1301 RAM chips. Dad was employee 49 there. I met Gordon (Moore's Law) Moore. I was 12.
steve
Build a time machine!
And my phone is worth over 4 mil
@@jakefisher1638 If you traveled back 25 years in time, your phone would lead the TOP500 list of supercomputers.
and you use it to watch tide pod challenge xD
Excellent content, well-presented. Watching part 1 where you talked about assembly language brought back memories (no pun intended!). I cut my coding teeth on a 6502 then a Z80 and then a 68000. 9-year-old me would have loved to have seen this! Keep up the great work - looking forward to future content.
I appreciate having at least a minor understanding of Assembly Language. It helps you to know just how computers think.
You just boiled down my 3 CS courses into 1 video. Can't thank you enough for this quality content
I have a hardware test tomorrow and I am watching your video now. It is very clear and simple to understand after getting lost in my textbook.
Coming into this video from the first part, I already knew more about how a CPU functions than I did before (which was already a fair bit). Bet this part (and 3 and 4) will truly open my mind to how amazing these little bits of circuitry are.
Your way of describing and detailing how stuff works is truly amazing and I don't think I've seen another YouTube channel describe things in such a way, keep it up
You gave me a flashback to my microprocessors course in college... Huge respect for the extremely accurate content and easy-to-follow narration
This video series was done so well I had to subscribe 😁
Same
This is exactly what I needed. An easy to understand explanation to terminology and concepts.
I'm a biology student and this is some brain food right here man!!! I really appreciate your quality work on these videos!! CS is looking more and more interesting
Thanks for adding subtitles!!! Extremely helpful!
Excellent video, excellent job! Subscribed.
Thank you for such a great series of outstanding presentations on the evolution of digital computing. It took me through my entire career, from an avionics specialist in the military to a university student to an Electronics Engineer with a minor in Computer Science. It's unbelievable that I went through so much evolution in a lifetime. Thanks again.
YT suggested feed view here. Subscribed. Good stuff.
Don't forget the 68k. Check out the Computer History Museum YT channel's 3 hour interview of the design team behind the 68k. It's a killer upload with amazing context and history from the era.
-Jake
I'll definitely have to check that out, thanks man!
68k is covered in part 3
really like the description too, which presents an overview of what i'm watching. time is $ and energy! More youtube vids (esp tech or skill-depth vids) should be like this!
Excellent series! I was worried there would be no mention of Commodore, but there it was in the end. Great! :)
A-freakin-mazing.... thanks for the massive amount of research, animation and editing involved in making this digestible at my Good Enough Diploma level.
Dude this is incredible content. You are literally making the world smarter with every piece of content.
Accurate, well edited, interesting narrative: outstanding! I'm now subscribed!
bro i swear to gawd if u don't get 1 mil subs by the end of this year am gonna be sad
That's one of the most detailed series I've ever watched on the topic, thank you so much for the effort put into such awesome work
This channel is underrated. Way underrated.
This is by far the most interesting thing I have ever seen on YouTube. I'm trying really hard to understand all this and I'm failing.. I will understand it someday. I absolutely love this! Thank you so much for this content!
I subscribed almost instantly after watching Part 1. Outstanding content and delivery.
It's clear and understandable, stays interesting...I can't stop watching.
Underrated channel.... I love this channel; after watching this series I subscribed right away
Better diction and pacing in this video compared to the first one. Good job man, this is good stuff
Super impressive and super informative :) Thanks for sharing it. And the production quality is just top-notch.
beautiful :P
One of the very best video series about CPUs I've ever seen, thank you, very interesting!!
Yepp.... am binge-watching this series.... Exams can wait ✋😉👏👏 Thanks for making the best and most amazing videos 💐🎊
hey this is a nice video and also your speech has greatly improved since the first part of the series. i subbed!
Although i would call myself interested in hardware and electronics, i find myself overwhelmed by the amount of information packed into this series. It's awe-inspiring how much knowledge one must have to fully understand everything between a simple transistor and the stuff that is happening in modern chips. I feel like it would take me a lifetime of learning just to catch up with today. And by that time this will be outdated again. Makes me dizzy :)
This is the most stimulating channel I’ve found in a very long time.
Really enjoying your work thank you
I reached my limit halfway through the video, so I'm setting an alarm with the video link for 1 year in the future; hopefully I will understand it by then
Nicely done. Kind of jumped right over the 6502.
Most discussions about the history of computers do. I've got a ton of computer catalogs from 1990, and they all behave like the PC clones and x86 are the only technology that ever existed. So frustrating.
@@Waccoon No kidding. What's especially lost is that most other processor architectures (such as the Motorola 680x0 series) were actually much cleaner and nicer designs than the x86 series and beat the shit out of Intel processors clock for clock.
Congratulations for the video!
Your series is very nice. Thanks
Viewed this video 3 months ago and I understood nothing. It was so frustrating that it drove me to read an introductory textbook in computer science, then I got interested in Java (I'm halfway through a Java textbook by Deitel), and now I get your video. I guess I have to thank you twice: for making this video and for encouraging me to learn more about this interesting subject. I also like all of your videos and I appreciate the amount of effort you put into them. You're doing magnificent work here, keep up the incredible work and thank you again
This video answered my dozens of questions, thank you.
Loving the videos, and glad I found your channel. Liked and subbed. Keep the content coming!
VERY EDUCATIONAL AND CONSTRUCTIVE TUTORIALS, AND EASY TO UNDERSTAND AT AN ELEMENTARY LEVEL...
Thank you for this series. Love the footage, love the explanations. Learned so much!
I feel like in a couple of years you could make a "Fall of x86" video.
Awesome content! You deserve all the praise you're getting.
This series has already been so amazing for bridging all sorts of gaps in my knowledge. This is amazing, my man!
0:56 just like you played Factorio for over 100 hours.
Good choice to explain the CPU.
amazing videos. so much complexity explained in such a great way. superb.........
Excellent Video. Eagerly waiting for 4th video 💖💖💖
Hats off to this content
Remarkably detailed, understandable, in-depth....so well-presented! Bravo.
Loved it, thank you for doing these
The best documentary on CPU I've seen.
This channel is going to be huge.
Even better presented than the first part. Awesome
Hi there, I think the production of your documentary was superb!
I really enjoyed both this and part 1, VERY NICE
Would love a series on GPUs please. I have just discovered this excellent channel. Keep up the good work!
these videos are so well made! kudos and thank you
Wow. Fantastic explanation of the rise and rise of the cpu and personal computer.
What an amazing job you've done. Informative in every sentence, with great visuals. Fantastic education to learn from
what a meaningful video, yet so nicely broken down for average guys, and so entertaining. people like you make yt worthwhile for reasons other than media and entertainment
13:00 about the stack overflow: these are commonly known to result in exploits that can lead to gaining root privileges on operating systems like Android and iOS. i saw a video once that talked about this subject, but this video series is intriguing! I learned way more than i could imagine! You gained a sub!
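A note on terminology: the exploits that comment alludes to are usually stack *buffer* overflows (writing past a fixed-size array and clobbering the saved return address) rather than the stack-growth kind of overflow from the video. Below is a deliberately unsafe toy in C, not a working exploit; the function and buffer names are made up, and modern compilers and OSes counter exactly this pattern with stack canaries, ASLR and non-executable stacks:

```c
/* Toy illustration only: strcpy() performs no bounds check, so input longer
 * than the buffer spills over adjacent stack memory, including the saved
 * return address. Real exploits craft that overwriting data deliberately. */
#include <stdio.h>
#include <string.h>

static void vulnerable(const char *input) {
    char buf[16];                 /* fixed-size buffer in the stack frame */
    strcpy(buf, input);           /* unchecked copy: the classic bug */
    printf("copied: %s\n", buf);
}

int main(int argc, char **argv) {
    if (argc > 1)
        vulnerable(argv[1]);      /* anything longer than 15 chars + NUL corrupts the frame */
    return 0;
}
```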
Just a little error: At 21:08 the OS made by Microsoft was not known as PC-DOS but as MS-DOS. PC-DOS was developed later by IBM as an alternative to MS-DOS.
@@psynrg I thought that was the case only for the first DOS; after that, IBM developed its own, slightly different DOS.
Say hello to your new subscriber !!!
WHY THE HECK IS THIS CHANNEL SO UNDERRATED 😭😭 after watching the video I thought you'd have 1M subscribers 😭😭 I wish that happens soon....
This could be a presentation at every computer science university.
Nice to see 80's and 90's commercials without the usual YouTube fog hanging over them. Usually those clips are so blurry or artefacted from repeated uploads/downloads, which causes decode "fog".
X64 is a CISC instruction set, but ever since the Pentium Pro the CPU itself has been essentially RISC. Instructions are decoded into multiple RISC-like micro-ops (same instruction size, simple instructions) and then handled exactly as if they were RISC.
The CISC front end adds latency, and that's bad, but it also effectively functions as memory compression. You would have needed more bandwidth to read the instructions for a RISC programme. (A toy sketch of this micro-op cracking follows this thread.)
I think Intel's design is the smartest one, it gets the best of both worlds - RISC efficiency with CISC compactness
@@SerBallister It also adds instruction latency and increases power consumption, which is why mobile and phone CPUs don't do it. X86/X64 only makes sense on the desktop, where doubling power consumption for less than a doubling in performance is acceptable.
@@soylentgreenb It makes sense for the server, data center, and HPC applications where memory requirements balloon out quickly.
@@marcopolo8584 I'm not sure about the economics there. Cooling and electricity are big costs. You might accept wasting memory bandwidth instead of wasting electricity if each individual task is not very performance critical but you have many of them.
@@soylentgreenb Memory capacity and bandwidth come at a huge premium. TOP500 is dominated by x86-64 for reasons beyond just industry inertia.
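To picture the decode-into-micro-ops idea at the top of this thread, here is a hedged toy model in C; the structures, register numbers, temporary register and three-µop split are purely illustrative and bear no resemblance to a real x86 front end:

```c
/* Toy model, not a real decoder: one "CISC-style" ADD [addr_reg], src_reg
 * is cracked into three simple, fixed-format micro-ops (load, add, store),
 * roughly the shape of work a post-P6 front end hands to its RISC-like core. */
#include <stdio.h>

typedef enum { UOP_LOAD, UOP_ADD, UOP_STORE } uop_kind;

typedef struct {
    uop_kind kind;
    int dst;    /* destination register (or address register for STORE) */
    int src;    /* source register (or address register for LOAD) */
} uop;

#define TEMP_REG 7  /* hypothetical internal temporary register */

/* Crack "ADD [addr_reg], src_reg" into micro-ops; returns how many were emitted. */
static int crack_add_mem(int addr_reg, int src_reg, uop out[3]) {
    out[0] = (uop){ UOP_LOAD,  TEMP_REG, addr_reg }; /* temp <- mem[addr_reg]  */
    out[1] = (uop){ UOP_ADD,   TEMP_REG, src_reg  }; /* temp <- temp + src_reg */
    out[2] = (uop){ UOP_STORE, addr_reg, TEMP_REG }; /* mem[addr_reg] <- temp  */
    return 3;
}

int main(void) {
    static const char *names[] = { "LOAD", "ADD", "STORE" };
    uop uops[3];
    int n = crack_add_mem(1, 2, uops);
    for (int i = 0; i < n; i++)
        printf("uop %d: %-5s dst=r%d src=r%d\n", i, names[uops[i].kind], uops[i].dst, uops[i].src);
    return 0;
}
```

The point it illustrates is the bandwidth argument above: one compact CISC instruction fetched from memory expands into several fixed-format internal operations, so the CISC encoding acts a bit like compression of the instruction stream.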
It's amazing that these things actually work and don't short out.
Excellent videos; keep it up!
Great video, a really good one, extremely well explained, loved it
Awesome! Thank you for all your hard work!
Amazing how we made calculators, then CPUs, and then added a calculator to help out the CPU.
I'll chuck ya the ads. Nice content man.
This is really well done, thank you!
Long live the Z80! Steadfast and faithful. Sane from design to assembly language.
Taking IT classes, this is very helpful
RISC-V rise up. We have nothing to lose but our sponsors from many different companies
Good video!
( production point - the same background music sorta went on a bit long )
My first machine was an 8086 from our shop that had the on-board network card fried in a lightning strike. It was easier for them to replace the machine than buy a card for it. From there I built my own machines and still do to this day. Much better to put the parts you want inside, rather than get a cookie-cutter machine from someone else. Excellent production!
You are a natural educator, see my comment on pt 1. Please don't ever stop educating us mere mortals!
Thank you.
Good education and overview.
Very understandable :)
great video, keep up the good work