It amazes me how complex this is. And how it takes more than one person to understand how a computer works at different levels from Applications -> Operating System -> DOS -> Assembly -> The mechanical/electronic parts
extend it even further, to semiconductors, doping, and exchange of electrons, quantum tunneling and superposition at the level of semiconductor energy bands, and then off to the maze of quantum mechanics :)
To me, you resemble Richard Feynman. He received a Nobel prize, and was famous for his ability to explain very complex topics clearly to everyday people! Keep up the good work, and please accept my heartfelt thanks.
I have read 'How Do It Know' and 'Digital Electronics'. What you have explained is the best summary of those books, plus the assembly and machine language. As an IT professional who wants to go deeper into digital electronics and low-level languages, this is the best video for understanding more. Thank you from 🇰🇪 Kenya.
The fact that these things work at all is almost miraculous. I can usually understand how most machinery works, but despite (maybe because of) viewing so many videos on computing over the years, computers still seem like magic to me. That humans can design something so fast and precise is amazing; I feel like a stone-age person wondering how a car works.
@@pablopereyra7126 Yes. How can humans be so smart as to invent all these things that seem like a dream? There is no way. Who told them this goes here, that goes there, this is going to do this, and this is going to send a signal there, and this tiny little chip is going to do this? Like whatttt!!! For instance, the “MOTHERBOARD”.
Watched the ads. As a continuing VLSI major, I really appreciate this. It refreshed the understanding of the microprocessor that I painstakingly built about 6 years ago, without TH-cam. Better than all other microprocessor videos out there.
Students who are learning Arduino and microprocessor programming wrote their whole assignment from your three-part series. I loved it very much, and I am including your CPU videos in my teaching material.
That's not true. IBM mainframes were going strong by the end of 1969, and there were many of them, plus a lot more from other companies. The ARM7 is really not all that powerful. So modify your statement to say the 1950s; then you might be correct.
@@Tapajara The CDC 6600, one of the most powerful supercomputers, which ruled the 60's and 70's, could do a MAXIMUM of 300,000 flops. The iPhone 5S (a really old phone by today's standards) can do 78.6 gigaflops, or 78,600,000,000 flops, which means you would need about 262,000 of those computers, each costing $2,370,000. That is not possible. Yeah. Math rules. lol
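Running the numbers from this comment as a quick sanity check (a sketch; the 300,000 flops and 78.6 GFLOPS figures are the ones quoted above, not independently verified):

```python
# Figures quoted in the comment above
cdc_6600_flops = 300_000      # claimed peak of the CDC 6600
iphone_5s_flops = 78.6e9      # ~78.6 GFLOPS claimed for the iPhone 5S

# How many CDC 6600s would match one iPhone 5S?
ratio = iphone_5s_flops / cdc_6600_flops
print(round(ratio))  # 262000
```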
Well... Two videos in from this channel and I'm sold. I love the technical detail and engineering principles you present in your videos. Thank you for the content and keep up the good work!
8:56 "as [RISC] evolves to CISC"? As it happens, CISC evolved to RISC. That's one of the amazing things about computing systems: they often become more powerful as they become simpler!
These videos literally teach you nearly everything that you would find in a higher education textbook. I wish that you would never stop uploading videos here.
took me back to my 80's teens when 6502 home computers were about and when I slowly picked up this knowledge from years of hobby programming and reading the magazines / books of the time
80 billion instructions per second? Is that true? Wow. Edit: after watching the rest of the video, I am grateful for your well-worded description of computers that a fossil like me could easily follow.
Wow... by far the most comprehensive explanation I’ve found on this topic. Thorough but not over explained. You have absolutely nailed this. First time coming across your channel. I am now a subscriber. 👍✌️
I'd like to frown at the TH-cam algorithm for recommending this video after my finals. I mean, bruh, this was an entire semester's worth of knowledge you packed in here, and in 13 minutes I understood way more than I did from my lecturer. Massive props to you, good sir.
It's even deeper than this. You need to deeply understand electrons, semiconductors, transistors, capacitors, inductors, resistors, electromagnetism, and yes, quantum mechanics too, to understand it fully.
I am so happy I found this channel today! The footage in every video I've seen is amazing and the information is so satisfying. I hope this channel blows up in the future, it has such a smartereveryday feel
This was a very informative video on how a CPU operates. It's interesting to see how CPUs have gotten faster over time. To go from 5,000 instructions per second with the Intel 4004 to over 80 billion instructions a second with the new Intel i9 CPU is fascinating. Keep the content coming!
I retired from the IT industry last year after approx 50 years. What a time to live. From 8-bit mainframes to today, I've seen it all, some of it as firsts, such as networking multiple mainframes, file transfers, and the first real games, such as Adventure. The list is huge. One mainframe and OS I worked on was the ICL VME 2900, of which there is an example at the National Computing Museum. Many thanks for this, the best I have seen describing the computer's engine.
Thanks for showing us, in a very interesting way, how science and engineering shaped the world. These documentaries are far more useful and entertaining than useless movies. I stopped watching movies, as science is far more interesting. Thanks a lot for opening my eyes!
Thanks for posting this video! I was feeling pretty bummed about uni and this renewed my love for computers yet again. It's just fascinating to see how things work even though we take them for granted each time we use a computer, or a smartphone, or even a calculator.
I can't believe my computer architecture class could be summarized in less than 15 minutes, but you actually did it lol. I wish all professors/teachers were as "to the point" as you are. Thank you so much for this video man, you saved me. New sub!
In 1969, my father got a job with a tiny memory manufacturing company started by Robert Noyce and Gordon Moore. I was 11. I can't remember the street number, but it was on Middlefield Road in Mt. View, CA. steve
I am recommending this series to help fast-track my friends to my understanding of PCs; I am too scatterbrained to explain it myself. Balanced and informative presentation. My hat is off to you, sir.
Very well done presentation... Really happy that you mentioned "microcoding". For me, having studied electronics in the '70s, the fundamental mystery was 'cracked' when I finally 'twigged' how "instruction decode" was little more than enabling/disabling signal pathways (designed and implemented by geniuses), working so much like the loom that inspired Lovelace back in the days when a human could see and touch this stuff... Looking very much forward to the next two episodes. Again, sincere respect for your clean, complete explanation.
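The "instruction decode as enabling/disabling signal pathways" idea this comment describes can be caricatured as a lookup from opcode to asserted control lines. A real decoder is combinational logic, and the opcodes and signal names below are invented purely for illustration:

```python
# Hypothetical control lines for a toy accumulator machine. Decoding an
# instruction just means asserting the right subset of these signals.
DECODE = {
    "LOAD":  {"mem_read", "acc_write"},            # memory -> accumulator
    "STORE": {"acc_read", "mem_write"},            # accumulator -> memory
    "ADD":   {"mem_read", "alu_add", "acc_write"}, # memory + acc -> acc
    "JUMP":  {"pc_write"},                         # load new program counter
}

def control_signals(opcode):
    """Return the set of control lines asserted for an opcode."""
    return DECODE[opcode]

print(sorted(control_signals("ADD")))  # ['acc_write', 'alu_add', 'mem_read']
```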
@@andrewc1036 Yes :) I share links to the channel with friends and colleagues. I also love how he doesn't beg for likes, shares and subscribers. We already know how to subscribe, and will, if the content is good.
Great video dude, very informative. But a tip: slow down a bit, take a moment to breathe. I sometimes felt like you were trying to say everything you could as fast as possible. You have our attention, you can relax a little!
Never in my life have I seen such pure and great content: no advertisement, no begging for likes and subs, just getting to the point. But nevertheless, never have I been more confused.
No matter how many times I've worked with assembly and CPUs in general, it amazes me every time what has been achieved and how reliable the technology is.
its almost like its alien its SO fundamentally complicated..simple yet incredible
IKR
I stopped by Best Buy 10 years ago and then today. Sheesh, the TV section was a mindblower.
This video is basically a summary of my microprocessors class and we also code in assembly there 🤣😂
@@jaymorpheus11 I still remember the TV section when I was 5: it was dominated by big bulky televisions and a couple of incredibly expensive flat screens. Fast forward 15 years, and even those flat screens look so archaic; Smart TVs are now everything. Technology is truly amazing.
Whenever I try researching the internals of a CPU, I usually get the glossed-over, simplified version, with words I don't understand thrown at me. I really appreciate how you explain what things mean, and also put them in the description so I can write them down to look up later. I wish I could subscribe a million times.
Thanks for the amazing comment - that made my day. I try my best to make these videos as in-depth as possible but still relatively easy to digest. It's very gratifying reading appreciative comments like yours. Of all the reasons that motivate me to work on these videos, feedback like this is among the most rewarding. Thanks again!
New Mind 😊
If you really want to address it, you probably need to go through a series of university courses: digital logic, microprocessor principles, and computer architecture. Hopefully you can find them on TH-cam. Good luck!
Well, you could if you created a million accounts. I'm just saying...
You can look at the Crash Course computer videos.
04:45 Instruction Set
05:15 Fetch Decode Execute
06:40 Data Bus & Address Bus
09:05 Opcode & Operand
12:30 Clock
Awesome -- thank you! A table of contents makes it more usable.
You are legend
Based
This guy gets more credit for TOC than the creator gets for the upload..
Thanks that's actually helpful for videos like this
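For anyone skimming the timestamps above, the fetch-decode-execute cycle can be sketched as a toy interpreter. This is a hypothetical four-instruction machine invented for illustration, not any real ISA:

```python
# Toy CPU: each instruction is a (opcode, operand) pair held in "memory"
# (a Python list). The opcodes here are made up for illustration.
def run(program):
    acc = 0  # accumulator register
    pc = 0   # program counter
    while pc < len(program):
        opcode, operand = program[pc]  # FETCH the instruction at pc
        pc += 1                        # advance the program counter
        if opcode == "LOAD":           # DECODE + EXECUTE
            acc = operand
        elif opcode == "ADD":
            acc += operand
        elif opcode == "SUB":
            acc -= operand
        elif opcode == "HALT":
            break
    return acc

# LOAD 5, ADD 3, SUB 1 leaves 7 in the accumulator
print(run([("LOAD", 5), ("ADD", 3), ("SUB", 1), ("HALT", 0)]))  # 7
```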
It's cool seeing how overwhelmingly positive comments can be when someone is producing truly good content. Keep it up man, this was awesome.
Well said sir
It's also the content. Smart/curious people are more likely to watch this and also more likely not to be trolls/bigots/douchebags.
I think it's more so about there's nothing to disagree with here. Nothing is contentious about this subject.
You summarized my computer science class... thumbs up dude.
This is more computer engineering than computer science.
I was going to say the same thing!
Same for me! lol
Ryan Breaker These topics are covered in CS as well, but yeah, these videos are a summary of 3 of my CompE classes.
th-cam.com/video/tawb_aeYQ2g/w-d-xo.html
He summarized the entire Ben Eater CPU creation series (43 videos) in just the first video. I can't wait for part 2... (which I'm going to go watch right now! lol)
As an IT professional, I'm a hardware guy, and it still amazes me that we have come so far in such a short time. Just 70 or 80 years ago we barely had mastery of basic circuits; today, multicore processors can top 5 GHz in the consumer market... Simply awesome. Can't wait to see the next 30 to 40 years, if I get that long...
Great video BTW!
Not much more will happen.
The physical limits are nearly reached. Intel has already spent 5 years trying to shrink the process size by 4 nm (from 14 to 10). I think something around 3-5 nm will be the smallest size possible. I actually don't know. But 3 nm would mean that the length of a transistor is built from fewer than 30 atoms.
And the max frequency is connected to the feature size, so even that has its limits. You can try to maximise the transistor count or add more cores, but this also won't help, because of the sequentiality of the code. So you would need a revolution in the build material, on the software side, or in the architecture. All things that will rather not happen.
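A rough sanity check of the "fewer than 30 atoms" claim in the comment above, assuming a silicon atomic spacing of roughly 0.235 nm (the Si-Si bond length; the exact figure depends on crystal orientation):

```python
# Assumed Si-Si bond length; real spacing varies with lattice direction.
si_spacing_nm = 0.235
gate_length_nm = 3.0   # the 3 nm figure from the comment above

atoms = gate_length_nm / si_spacing_nm
print(round(atoms))  # 13 -- indeed fewer than 30 atoms across
```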
@@ahmadz.9944 The most revolutionary step would be the use of quantum computing with qubits, rather than advancing conventional digital electronic computing systems. Of course, we will still need electricity to run those computers, but the processing would not be done by active electric components like the MOSFETs we are still using.
@@ahmadz.9944 I'm actually pretty hopeful that we will develop newer materials, different techniques, and what not to eventually overcome those limitations.
It's possible we'll see the joining of regular CPUs and quantum computers down the road. Maybe it won't be a typical speed boost; maybe it will be in the way we process data.
I'm not sure if it will continue, all good things...right? But I'm pretty sure there will be some amazing technological growth in the next decade or two...
But then again only time will tell...
@@kjellvb1979 One way it could be done is hybrids. Basically, quantum computers can't be used as regular computers because they work very differently; they are more like pure, sheer brute force to process a lot of data in small amounts of time, but without any finesse. So the solution would be to let quantum computers do all the heavy lifting while more regular computers gather the data and make sense of it.
@@ahmadz.9944 This is true, but as they said, we are developing clever tricks (like a 3D lattice of silicon, to essentially build upward rather than outward), newer materials (perhaps graphene will be incorporated or will replace silicon, maybe some other material), and, of course, eventually hybrid quantum and classical computer processes, to achieve something drastically faster than modern-day computers.
Beyond those aspects, I believe we will stop seeing growth in GHz for a while (we kind of have already) and see more dedication to core growth. But I think you will see the greatest jumps in speed not from improvements in the CPU, but from how all the peripherals communicate with the CPU. We've already seen the growth of solid state drives, first with SATA-based models and now with NVMe drives on PCIe buses. I believe we will see the merging of RAM and solid state drives, and iirc there are already companies that offer some hybrid RAM/SSD storage solutions. Point being, I think you'll see advancements in CPU-to-peripheral communication, and large jumps in compute speed due to those advancements. Imagine CPUs with enough cacheable memory that the processor barely has to make calls to peripherals over slower buses; by bundling RAM/SSD and moving it closer to the CPU along one extremely fast PCIe 4 bus, it's almost like the CPU has unlimited cache to work with.
Granted, that's an oversimplified, basic summary of what many companies are aiming to do. Maybe not exactly what I explained, but variations on that theme of bringing volatile memory and your long-term storage together, closer to the processor and along a faster bus.
The idea's been around for a while; the tech is finally getting there. You've seen this on the networking side already, and it's only going to grow over the next couple of years. Hopefully we see it coming to the consumer side sooner rather than later.
Here's an article that explains it better than I. Or at least some of the concepts.
www.edn.com/the-merger-of-networking-storage-ram-and-cache/
You should do a series like this on microcontrollers. Everybody always talks about the history of CPUs and computers, but I'd love to learn more about how that fits in with MCUs.
In essence they work the same way, except micro-controllers are used for more specific purposes. A CPU is designed to be placed onto a specially designed circuit board, the motherboard, which is itself populated with micro-controllers that communicate with each other (e.g. the north and south bridges, and the SATA and USB controllers).
At first, microcontrollers were made of a discrete CPU, RAM, and secondary storage, all on one board. Later, on-chip microcontrollers became common, and the older 'board microcontrollers' were phased out.
Finally a vid that has demystified the CPU for me. Fantastic job mate
You might find it enlightening to read up on how transistors are used together to form logic gates, as it will become less mystifying still.
Very enjoyable series. Much of my 40-year career (retired now) coincided with the advent of 8-bit, then 16-, 32- and finally 64-bit processors. I like that the subject matter is presented in an easy to digest manner without losing too much of the details of the engineering problems and solutions along the way as well as a fairly comprehensible use of the techno-jargon. Can't wait to watch the rest. Good work!
In all my years on TH-cam I've never seen an effectively brand new channel that convinced me to subscribe faster. This is remarkable and the creator and anyone else associated with this project should be proud of themselves.
That’s one of the best compliments I’ve received so far.. thank you.
Explained my entire computer architecture class in 15 minutes. Good job!
We need more high-quality video creators like this. I work on projects designing things with ARM controllers and mechanical parts all the time. Even I forget stuff from time to time, and I'd come here to pick it up again and remind myself how much effort mankind has made to get to this point.
I’m sixty-three years old now, and I started in the electronics industry around my seventh birthday. My best possession is a 4004 chip removed from a board that was going to be thrown away.
Nigel James wanna sell it?
Ayush Kumar Hmmm....I’ll fight you for it !
Sounds like a good deal. I paid about 1k for mine. Lol
Top Lobster Sorry Top, but I’ve got more than that.
@@nigeljames6017 I wish I had some of the 1101 and 1301 chips left. At one time, I had a few whole wafers of them. Yes, they were tested and bad, but still. Old-style 3-inch wafers. steve
I read the book “How Computers Work” back in the nineties, and if I hadn’t, I think I would be overwhelmed by this excellent video. This video does an excellent job of explaining CPU fundamentals, which are inherently complicated.
OMG! How did anyone figure this out. I tried keeping up but I’m dumb. You can only count on me collecting firewood when society collapses.
1:30 — the video showed a glimpse of the money spent to see it closer with bigger and heavier machines; the exploration of electric circuits, in a timeline of discovering how and what to do with them, made it smaller.
@@Internet_Web_Collections Basically, we take it for granted but are terrified to see something we cannot understand.
Fetching wood can be as relevant...
You are not alone. Besides, there needs to be many more wood collectors. We're always needed. Just like the geeks.
People were happier when electronics didn't exist. Don't forget!
This video amazes me by showing me how little I know
that in effect is the universal truth for everybody...😉
Wow,
This is outstanding.
My teacher explained all these things to me in my coding class, but you add more flavour to it.
I’m an IT professional, and trust me, I was never taught this much detail in high school. I felt like grabbing a notebook and jotting down key points. Thanks 😊 man, very well done.
My guy, the TH-cam algorithm gods have smiled upon you and you're popping up all over the place! You have beautiful videos and excellent content here and I can definitely see your channel exploding even now (July 2019). I do, however, have some feedback that could mitigate some negative comments in the future. These do represent a lot more work on top of the apparently massive amount of effort you've already put into these but I think they could benefit you. First, your voice overs are good but maybe consider improving the recording quality some as your channel expands to the less nerdy part of youtube. I feel that some of your lines are a little rushed and warrant re-recording. Second, as you do your research, keep a reference list on hand. You are guaranteed to get something wrong at some point or, at least, say something that many people don't believe or agree with. Having a citation list on all your videos will improve your credibility. Thanks for all your hard work and keep moving forward! I can see this channel passing 100K subscribers by August 2019 easily.
He's CLOSE to 100k! You were sorta right! :D
Yeah, I have English as a second language, and without subtitles it's quite hard to fully understand everything.
You, sir, are an excellent teacher. I never got how CPUs worked before today, and God knows I tried.
You might find Ben Eater's series on making a simple breadboard computer useful. See th-cam.com/video/HyznrdDSSGM/w-d-xo.html for details.
I love this channel! You'll reach 100k subs very soon, keep making videos that are super interesting
This video deserves way more than just 9000 views
So hard, because other people just wanna play it, feel it, use it, and don't wanna know how they run...
It’s a pretty niche topic. Presented beautifully - but still niche.
I think youtube algorithm saw your comment :D
@@pitanu yeah seems like it lol
it deserves English sub for non native speakers
Now I only need to watch this 4 or 5 more times for this to truly sink in! Great job though
If you really want to learn how a computer works (including software), you should take a course on Coursera called From Nand to Tetris. It is excellent.
Nice nvidia pfp
As an EE that took computer architecture classes I must say, this video is extremely well made and manages to explain some very technical information in a very straightforward and simple manner, such that I believe pretty much anyone would be able to understand it.
You should consider teaching.
Great video, I knew most from school, but you simplified it so nicely that I believe most people follow without taking notes.
To anyone curious to actually understand this part by playing with it, I'd recommend trying to program a microcontroller, perhaps just to sense temperature or light, and outputting it by turning on an LED when certain conditions are met. Luckily we have pre-made platforms for this today (Arduino, for instance).
To really understand the basics, you'll need to avoid using too many of the pre-made libraries that exist for those platforms... for instance, try to output a message to a character display by only manipulating registers in binary... It can be fun, but only if you're actually interested in how stuff works, and don't mind reading a few datasheets.
The advantage of using a microcontroller is that you have RAM and flash memory built in, you don't need to deal with an address-/data-bus until you're ready to connect external memory or peripherals, such as a character display, and their word sizes are not crazy large... it's easier to remember what 8 bits do than what 32 bits do.
As a student currently enrolled in Machine and Assembly Language, this video is definitely helping me understand what is actually going on underneath the hood of all my other coding languages… I appreciate these videos a lot right now! 🙌🏻
If only this vid existed when i had my Computer Architecture class,great vid :D
It amazes me how complex this is. And how it takes more than one person to understand how a computer works at different levels from Applications -> Operating System -> DOS -> Assembly -> The mechanical/electronic parts
extend it even further, to semiconductors, doping, and exchange of electrons, quantum tunneling and superposition at the level of semiconductor energy bands, and then off to the maze of quantum mechanics :)
This video was my entire assembly course in 14 minutes
To me, you resemble Richard Feynman. He received a Nobel prize mainly for his ability to explain very complex topics clearly to everyday people! Keep up the good work, and please accept my heartfelt thanks.
What is your Dogs Name ?
My head is spinning like a top. I am watching this in 720p from a Raspberry pi B+
I have read 'how do it know' and 'digital electronics'. What you have explained is the best summary for these books plus also the assembly and machine language. As an IT professional who wants to go deeper in digital electronics and low level languages this is the best video for more understanding. Thank you from 🇰🇪 Kenya.
The fact that these things work at all is almost miraculous. I usually can understand how most machinery works, but despite (maybe because of) viewing so many videos on computing over the years, they seem like magic to me. How humans can design something so fast and precise is amazing, equivalent of a stone-age person wondering how a car works.
They don’t . Aliens do . No human has the inteligence to do this . Aliens abduct humans and give them this intelligence
@@diegolerma1516 Bro I was thinking the same I already have a box of tinfoil hats ready
@@pablopereyra7126 yes . How can a human be so smart to invent all these things that seem like a dream. There is no way. Who told them this goes here that goes there , this is going to go this and this is going to send a signal there , and the is tiny little chip is going to do this like whatttt!!!. For instance the “MOTHERBOARD”
@@diegolerma1516 brooo you high?
@@nanafalke nah bro it’s just amazing
Maaan, you saved my semester... I was stuck in a lecture, where everybody spoke the same language I wasn't understanding... Very well explained!
Great intro to the micro processor, now all the complicated stuff like pipelining makes more sense.
Watched the ads. Really appreciate this as a continuing VLSI Major. It refreshed my painstaking initial understanding of the microprocessor about 6 years ago without TH-cam. Better than all other microprocessor videos out there.
You don't actually have to watch the ads to support the channel; all that matters is that the ads are served.
Good work dude! You did a great job distilling a very complicated topic into something understandable by most people. :)
Students who are learning Arduino and microprocessor programming wrote their whole assignment from your three-part series. I loved it very much, and I am including your CPU videos in my teaching material.
Fun fact: The phone on which you are watching right now has more power than all of the computers from the 1960s combined!
Did you just assume my viewing preference?
That's not true. IBM mainframes were going strong by the end of 1969 and there were many of them and a lot more from other companies. The ARM7 is really not all that powerful. So modify your statement to say the 1950's. Then you might be correct.
@@Tapajara The CDC 6600, one of the most powerful supercomputers, which ruled the 60's and 70's, could do a maximum of about 300,000 FLOPS. The iPhone 5S (a really old phone by today's standards) can do 78.6 gigaflops, or 78,600,000,000 FLOPS, which means you would need about 262,000 of those computers, each costing $2,370,000. That is not possible.
Yeah. Math rules. lol
@@greatdane114 lol
Well... Two videos in from this channel and I'm sold. I love the technical detail and engineering principles you present in your videos.
Thank you for the content and keep up the good work!
8:56 "as [RISC] evolves to CISC"? As it happens, CISC evolved to RISC. That's one of the amazing things about computing systems: they often become more powerful as they become simpler!
These videos literally teach you nearly everything that you would find in a higher education textbook. I wish that you would never stop uploading videos here.
I'm throwing this all over social media. People need to learn.
Dude, first time I see a 15 min video covering so much about the bare metal CPU. Very efficient, and easy to understand. Amazing.
took me back to my 80's teens when 6502 home computers were about and when I slowly picked up this knowledge from years of hobby programming and reading the magazines / books of the time
So forty years ago was your golden age.
What is your cats name ?
Video with amazing quality. Congratulations!
80 billion instructions per second? Is that true? Wow.
Edit: after watching the rest of the video, I am grateful for your well-worded description of computers that a fossil like me could easily follow.
I've been looking for a good video explaining CPUs in depth for a while. Nothing has satisfied me until I stumbled upon this. Amazing work!!
The more you learn, the more magical it sounds lol
Wow... by far the most comprehensive explanation I’ve found on this topic. Thorough but not over explained. You have absolutely nailed this. First time coming across your channel. I am now a subscriber. 👍✌️
bruh your subscriber count is going to go to the moon within the next year. good work
If TH-cam had recommended me this video 2 years ago, I'd have aced my Microprocessors finals!
Dude, you just earned a subscriber for life!
Damn this is so good, should have way more views- keep it up
I'd like to frown on the TH-cam algorithm for recommending this video after my finals. I mean bruh, this was an entire semester's worth of knowledge you packed in here, and in 13 minutes I understood way more than I did from my lecturer.
Massive props to you good sir
well... guess I learned my whole class for this semester here... +sub
It's deeper than this... You need to deeply understand electrons, semiconductors, transistors, capacitors, inductors, resistors, electromagnetism, and yes, quantum mechanics too, to understand it fully.
I am so happy I found this channel today! The footage in every video I've seen is amazing and the information is so satisfying. I hope this channel blows up in the future, it has such a smartereveryday feel
This was a very informative video on how a CPU operates. It's interesting to see how CPUs have gotten faster over time. To go from 5000 instructions per second with the Intel 4004 to over 80 billion instructions a second with the new Intel i9 CPU is fascinating. Keep the content coming!
At 2:15 closed captions mention "in this 2 part series..." but audio and fact turned into a 3 part series. Thanks, great videos!!
Finally you helped me solve my huge mystery of how it works! I was starting to think it was a conspiracy! ;)
I retired from the IT industry last year after approx 50 years. What a time to live. From 8 bit mainframes to today, I've seen it all, some being firsts, such as networking multiple mainframes, file transfers, the first real games, such as adventure. The list is huge. One mainframe and OS I worked on was ICL VME2900 of which there is an example at the National Computing Museum. Many thanks for this, the best I have seen describing the computers engine.
This is AMAZING, always needed to get into this stuff without having to spend days of studying xd
Just built my first gaming pc . I got recommended this video and it's fascinating how cpu works
2:14 "in this 3-part series"
Just now released part 4. :D
Great video. Easily explains how a CPU works
Thanks for letting us know how science and engineering shaped the world in a very interesting way. These documentaries are far more useful and entertaining than useless movies. I stopped watching movies, as science is far more interesting. Thanks a lot for opening my eyes!
Why is this the best educational infographic video I've ever seen? Because it is, and I've seen plenty.
I wonder how 5 people can dislike this video... Great content, my dude, keep up the good work!
Thanks for the supportive words!
Thanks for posting this video! I was feeling pretty bummed about uni and this renewed my love for computers yet again. It's just fascinating to see how things work even though we take them for granted each time we use a computer, or a smartphone, or even a calculator.
Facts
"I need your clothes, your boots, and your motorcycle." - T-800 model 101.
Dude !! You almost summarized my entire OS module 🙌🔥🔥
Great video, I learned a lot! CPUs have always been confusing to me but this helped me understand!
i can't believe my computer architecture class can be summarized in less than 15 minutes but you actually did lol
i wish all professors/teachers were as "to the point" as you were
thank you so much for this video man you saved me
new sub !
In 1969, my father got a job with a tiny memory manufacturing company started by Robert Noyce and Gordon Moore. I was 11. I can't remember the street number, but it was on Middlefield Road in Mt. View, CA.
steve
Intel?
I am an electrical engineer. I have never heard this topic explained so thoroughly and simply and quickly before. Fantastic video!
MIND=BLOWN. I can't believe genius people like you are not as big as Logan Paul when you should be. I am sad ;-;
I am recommending this series to help fast-track my friends to my understanding of PCs. I am too scatterbrained to do so myself. Balanced and informative presentation. My hat is off to you, sir.
I’m 1 minute in. Omg you need so many more subscribers. I had to search for this 😭
I’ve been in semiconductors for 25yrs. This video was well done.
1:06 22/7 is an approximation of Pi. I'm sure this was intentional
Dope! Very excited for this series. Nice to have at least a layman's grasp of this stuff.
From 0/1 to all these address things... sounds like science fiction.
Very well done presentation... Really happy that you mentioned "microcoding". For me, having studied electronics in the '70s, the fundamental mystery was 'cracked' when I finally 'twigged' how "instruction decode" was little more than enabling/disabling signal pathways (designed and implemented by geniuses), working so much like the loom that inspired Lovelace back in the days when a human could see and touch this stuff...
Looking very much forward to the next two episodes. Again, sincere respect for your clean, complete explanation.
Why is this just recommended to me now in 2019?
It wasn’t made until the end of 2018.
@@jscorpio1987 What I meant was, why only now, 7 months into 2019, you dumbo!
rai x good to know. Thanks!
Thank you. It is an awesome video. I revisit your series and get even better clarity in understanding some of these topics.
You need better marketing to get the word out. Great job.
No he doesn't. That would be a foolish distraction.
Keep doing what you're doing, great content IS your marketing.
@@stevenkelby2169 hope you're doing your part then.
@@andrewc1036 Yes :) I share links to the channel with friends and colleagues.
I also love how he doesn't beg for likes, shares and subscribers. We already know how to subscribe, and will, if the content is good.
I haven’t done any marketing, nor do I intend to. Just hard work and some TH-cam algorithm luck. If you build it, they will come. (I hope)
@@NewMind Exactly man, lots of respect for you here, don't change your attitude, it's awesome 👍
Great video dude, very informative. But a tip : slow down a bit, take a moment to breathe. I sometimes felt like you were trying to say everything you could as fast as possible. You have our attention, you can relax a little!
When you ask the supercomputer to explain itself.
Never in my life have I seen such pure and great content: no advertisement, no begging for likes and subs, just getting to and from the point. But nevertheless, never have I been more confused.
Finally I get how a CPU works, it uses magic!
A CPU is a rock that thinks
Yeah this video definitely deserves more views. Good stuff man thanks!
This was very simply explained... and I don’t understand any of it
You just summarized 3 weeks' worth of assembly classes without dropping any important information. Good stuff, bro.
I so wish there was no background music. So annoying and distracting
Yess bro
This series... I can't really find a word better than baffling. Thank you so much for this!
I wish this video was slower even though I studied Comp. Sc. in college.
There is no way to improve this video. Its just perfect. Thanks!
great video and extremely well thought out but the background music is completely unnecessary
djneils100 I like it 🤠
Oh shut up will you, stop whinging. Go make your own video without music
moztek for real
@@cimbomlovr1 twattery
So easy to follow. Thanks for not adding irritating music.
You should be more expressive when narrating your videos. The monotone voice really kills the vibe. Great work, as far as everything else goes.
I like it, reminds me of morty
I've read and watched a lot of material about how processors work and have a very basic, non-technical understanding. It still seems like magic.