I still don't know how any human being ever invented a transistor. Who the hell got the idea that arranging a few atoms of silicon together and turning them on and off would lead to full-fledged computers?
The very design of computers amazes me, mostly because I don't know how they were ever invented. With such complexity, it seems hard to fathom someone stumbling upon the idea.
Imagine a world with no computers, then having the idea to build one, somehow figuring out how to make it understand and process human-language code (inventing the compiler program along the way) AND inventing a screen capable of receiving that processed information and projecting an image that makes sense to us.
Utterly insane.
At any rate, damn am I glad someone figured it out; life is literally a new world with computers.
It does make you wonder though: go back 500 years and all the people back then were operating under the same laws of physics and had the same resources. They could have had computers, had they just known how to combine the right materials to build computers and harness electricity. So, can you even imagine what we're missing now? What will be obvious 500 years from now that we haven't the faintest clue about today?
UnknownXV It's complicated if you think about it, but it's more that the concept of the modern CPU has continuously improved on previous concepts for decades now. It's like thinking calculus is immensely difficult if you have never seen math in your life, but after years of education it's more tolerable.
Put simply, we started with transistors, basic switches that turn on and off based on the control gate. With the power to switch current at such speed, we were able to build logic gates out of them. Out of gates we were able to build simple circuits for computation (like full adders) and basic memory (like SR latches and flip-flops). From there we started bolstering these simple circuits, adding more features (more math functions, more of the units themselves). And after decades of just adding more, plus some complicated circuits to control all of this hardware, we get to the modern CPU.
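To make the "basic memory" step concrete, here's a minimal sketch in Python (just a thought experiment, not how real hardware is modeled) of the cross-coupled NOR gates that form an SR latch, the simplest circuit that remembers one bit:

```python
# A NOR-based SR latch: two cross-coupled NOR gates that keep remembering
# one bit after the set/reset inputs go back to 0.

def nor(a, b):
    return not (a or b)

class SRLatch:
    def __init__(self):
        self.q = False       # the stored bit
        self.q_bar = True    # its complement

    def step(self, s, r):
        # Let the cross-coupled gates settle (a few passes is enough here).
        for _ in range(4):
            new_q = nor(r, self.q_bar)
            new_q_bar = nor(s, self.q)
            if (new_q, new_q_bar) == (self.q, self.q_bar):
                break
            self.q, self.q_bar = new_q, new_q_bar
        return self.q

latch = SRLatch()
print(latch.step(s=True, r=False))   # set   -> True
print(latch.step(s=False, r=False))  # hold  -> still True
print(latch.step(s=False, r=True))   # reset -> False
print(latch.step(s=False, r=False))  # hold  -> still False
```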
UnknownXV You're thinking about it the wrong way.
There was no one person who woke up one day and worked out how to create a computer from scratch.
Actually, one can argue that scientists have been working for over 400 years on the required components for such a complicated machine. Many discoveries, laws, theories and experiments had to be done throughout this time to achieve that knowledge.
Many people from different disciplines like Mathematics, Physics, Electrical Engineering, Chemistry and many more have been working together to produce such inventions, piecing the puzzle together very slowly. All those integrated pieces were invented or developed mostly independently (and sometimes multiple times by different people), always building on former knowledge. Most of those inventions did not even have "computers" in mind, but they were later used to aid in the process.
The first transistor was not on a tiny piece of silicon; it was a big device. It took time and effort to get it there.
The process described in the video is really dumbed down (unfortunately it's impossible to cover all the pieces in a 5-minute video), but the process can be divided, and then divided again (and again..), into small pieces which humans understand. When you piece it all together it looks incomprehensibly complex, but the underlying pieces aren't.
Think of it like a puzzle of 1,000,000 pieces: it's quite insane to approach it all at once, but if you were given the puzzle in chunks of 1,000 pieces, it definitely becomes more manageable (the example is not 100% accurate, but it might help you understand).
Also, modern computers (and their functions) were basically only possible to create (whether it's software or hardware) using computers in the first place.
Ofek Azulay The puzzle analogy is pretty good. Makes more sense.
To be fair, it wasn't one person doing all the work. Someone designed C++ because they wanted to make C pickier and less prone to mistakes, and to ease the way programs are written. Someone decided to shrink the computer and moved from tubes ("lamps") to silicon transistors.
It didn't start with silicon, it started with tubes... essentially light bulbs. Wired together, they can create all kinds of logic pathways. Granted, there is more to a tube than a light bulb. Anyway, from there it was a long series of small steps that gradually got us to where we are today. A journey of a thousand miles begins with...
"Hey, that processor is going a little too fast. We're going to have to give you a ticket."
Roxasmaker well shit, imagine if that happens in real life
***** well... You get a help desk ticket, right?
apparently, I haven't overclocked yet haha
Sounds like something Comcast would do if they had a say in cpu speed...
Roxasmaker If you go for world record clock speeds you are even looking at prison time.
"Architexture" :D
veazix Im not alone!heard it.
veazix and it wasn't just one time.. he did it again on the car analogy :)
veazix well he does call "graphics(Grafiks)" card as "Grafis card" countless times and that didn't bother anyone....
veazix wanted to post this myself.
Axis Lexington i think its cos hes half french/german or something. Not sure tho
Instructions unclear, over clocked my car. Killed people in front of my car. Voided my warranty.
Benny Kolesnikov I laughed wayyyy to hard at this xD
Instructions also unclear, my dick stuck in Benny Kolesnikov 's car.
Benny Kolesnikov Ah, the freedom warranty.
Benny Kolesnikov Your car clearly needed more fluups
Try adding more voltage.
2015: “An average cpu has millions of transistors”
2019: . . .
Then AMD Epyc released.......
BILLIONSSSSSSSSSSSSSSS
Laughs in 2020
worlwr2 yup me
Dunno why he said millions they already had billions back then
Put your pins where i can see them! now... slowly exit the motherboard... did you know you were driving with a busted RGB?? you also didn't slow down for the t-junction
lol thats really good
LMAO good one
We're designing a basic CPU (not every instruction is available, just basics like add, subtract, move, store, load, compare etc) for our Computer Science class right now and it's awesome. I'll add a little snippet of blabber for anyone a little bit more interested in the specifics of how you go from 1s and 0s into instructions. It's kind of hard to condense into simple terms, and I might just sound like a madman talking about nonsense, but hopefully it might explain something!
A compiled program is just a list of instructions. A single instruction would be something like "add r0, r1, r2", which means we want to add the value in register 1 to the value in register 2 and store the result in register 0. Say we are using a 32-bit architecture; this means that each instruction is represented by 32 binary digits. We split these 32 bits up into different parts to represent information about what instruction we want to run. The CPU takes an instruction, splits it up, and does the various operations based on what was fed in.
So for example in the ARM architecture, the leftmost 4 bits represent the condition of the instruction. If it is 1110, that means we should always run the instruction. If it is 0001, that means we should only run the instruction if our condition flags are set to "not equal". We would have to run a compare instruction beforehand to compare 2 numbers which would set the condition flag registers (like "cmp r0, r1" which compares the value in register 0 with the value in register 1, and sets our condition flags to show the results of the comparison telling us stuff like if it is greater than, less than, equal to).
A typical (data-processing) instruction would have 4 bits of condition, 2 bits of "op" (representing what kind of instruction we are using, e.g. data processing vs memory), 6 bits of "funct" (this tells us what our actual instruction is, for example add, which is represented by 0100), 4 bits of Rn (the register that contains the value we are using for the instruction), 4 bits of Rd (the register where we store the result, e.g. 0011 in binary means 3, so we store the result in register 3), and 12 bits of Src2 (the second value we are using, which can either be another register or an immediate value). So with "add r0, r1, r2", Rd is 0000, Rn is 0001, and Src2 contains 0010 (along with some other information like bit shifting and such).
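As a rough illustration of that splitting, here is a small Python sketch that pulls those fields out of a 32-bit word. The layout follows the comment above (4-bit cond, 2-bit op, 6-bit funct, 4-bit Rn, 4-bit Rd, 12-bit Src2); the example word is hand-assembled for illustration and isn't the complete real ARM encoding:

```python
# Split a 32-bit, ARM-style data-processing instruction into the fields
# described above. Field meanings follow the comment; real ARM has more cases.

def decode(instr):
    return {
        "cond":  (instr >> 28) & 0xF,    # e.g. 0b1110 = always execute
        "op":    (instr >> 26) & 0x3,    # instruction class (data-processing, memory, ...)
        "funct": (instr >> 20) & 0x3F,   # which operation, e.g. add
        "rn":    (instr >> 16) & 0xF,    # first source register
        "rd":    (instr >> 12) & 0xF,    # destination register
        "src2":  instr & 0xFFF,          # second operand (register or immediate)
    }

# Hypothetical encoding of "add r0, r1, r2" under the layout above:
# cond=1110, op=00, funct=000100, rn=0001, rd=0000, src2=000000000010
word = 0b1110_00_000100_0001_0000_000000000010
print(decode(word))  # {'cond': 14, 'op': 0, 'funct': 4, 'rn': 1, 'rd': 0, 'src2': 2}
```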
We use the textbook called "Digital Design and Computer Architecture". It didn't tell us everything we needed, but it helped us design the basic circuits, which we added onto with our own logic. There's this awesome free software called "Logisim" you can use to design circuits. I would recommend learning how to do basic C programming, learning how to create that same program in assembly, and then learning how that assembly translates into machine code. The hardest part about this stuff is that it is really hard to find answers online, but if you're motivated and good with textbooks then you should be good with the book!
I kinda learned it the other way around, from "ands" and "ors" to flip-flops to master-slave to counters... etc. But we never made a CPU, we just programmed one.
Yeah, we covered flip-flops and the circuitry after we made a "compiler" in C that read assembled machine code and carried out the instructions.
This comment is a year old.
Man I wish I was smarter back then. Had to leave university, I literally just burned money with no results.
I wanted to be a programmer so badly lol, or anything related to IT engineering.
I ended up as a network technician (and I love it). Industrial routers are also a challenge, but not as much as a CPU.
I'm just literally too stupid to be a programmer or hardware programmer.
Wish you guys the best! Do my dreams :D
I still don't understand how people got a transistor to obey a command like "store".
I've been researching CPU Architectures for about 5 hours and this one video explains it better than anything else I've seen in that time. Well done mate.
Did he say "millions of transistors?"
Try billions. The Xeon from 2012 has around 5 BILLION transistors.
Jeremiah Pierce But 5 billion is really just 5000 million. So, hah.
Although you are correct, that's not the general way it is said. Nobody says I have sixteen thousand megabytes of ram.
Jeremiah Pierce I know :P
He said "the average processor..."
Mohamed Zakaria So you don't have a Xeon? What an amateur...
As a Computer Engineering major, I'm proud to say that this is actually a reasonably good explanation.
It's an extremely poor explanation. Half is directly wrong, even objectively. It really incites a complete facepalm, a major one...
@@TheMrTape what is wrong though? I'm a noob at this, but I understood what I came to look for (i.e. that the microprocessor carries out instructions using transistors in logic gates), and it also cleared up things from other videos, like how these instructions happen with the voltage that is given to the microprocessor on each cycle of the microprocessor's clock
@@thisisrtsthree9992 #00:00 It's "to program, I'm programming, I'm a programmer", not "code, coding, coder". A clear-text program is code about as much as any spoken language is (i.e. not), even though it's referred to as source code; only what it compiles to, assembly instructions, is by definition code.
#00:15 and #0:25 CPUs don't take "code" directly; it's not yet compiled (mentioning that later doesn't make it right).
#00:28 Both can be represented by letters and numbers if translating the binary they both consist of to their respective character abstractions; this representation is only relevant for humans, as the CPU can only see the binary; my text is seen as binary by any CPU, abstracted for your understanding. It's a choice whether to represent something as binary or otherwise to humans; it's irrelevant to the CPU.
#0:43 There are no English/Chinese etc. versions; it's not whatever language you want, it's always English. You may name your variables or write comments in any language of course, but the statements and functions are all English.
#0:45 There are no sets of instructions when you're talking about source code; it consists of statements, functions and operators. Instructions exist only in machine code.
#0:48 You can't choose whatever programming language. The languages shown in the background are HTML (not a programming language but a markup language) and JavaScript, both of which are interpreted rather than compiled to machine code, and thus irrelevant to what they're talking about.
#0:50 A program only needs compilation if it's a non-interpreted language (e.g. C, C++, Java rather than e.g. JavaScript, Python, PHP). Again, the programs shown in the background don't need compiling to work; in that case no further processing is needed for the CPU to understand them, because it doesn't understand them directly; a translator program interprets the program in real time.
#1:03 A compiler doesn't check for errors, it "checks" for wrong syntax (grammar) by not being able to compile. Also "object code" is a deprecated term for machine code.
#1:13 The reason a CPU can interpret machine code/instructions is that this is what it's designed to interpret as instructions; the machine code is 0's and 1's as much as the clear text it's compiled from, since every character is an abstraction of 8 or more bits. It's the exact same thing the CPU sees, always, and the only thing it could ever see: 0's and 1's. The differentiator is that the bits of an instruction translate to working CPU functionality, while the bits of a character translate to the valid character. They're implying that nothing is binary before it's compiled, as if the CPU could understand characters directly rather than through abstraction via the 2 "letters" it understands, 0 and 1; the letter A means nothing to the CPU, only A's binary representation, 1000001 (see the short sketch just below this comment). Binary is completely irrelevant to any of this, it's always fucking binary.
#1:17 Implying that binary is something modern that wasn't always the only thing used for computing.
I won't bother with the rest. It's all so far off that you're better off guessing how it works. Most of what's presented in these videos is layman assumptions anyway. The (mis)information put forward is worse than a basic school project involving actual research and understanding.
#3:11 Only true and relevant point put forward in this video.
Overall everything gainable from watching is an extremely dumbed down and twisted understanding of how CPUs work.
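For what it's worth, the point above about characters just being bit patterns is easy to check yourself; a tiny Python sketch (assuming ASCII/UTF-8 text):

```python
# "Source code" is already bytes: every character is just a number,
# and that number is just bits. Only the human-facing display differs.
line = "add r0, r1, r2"
for ch in line[:3]:
    print(ch, ord(ch), format(ord(ch), "08b"))
# a 97 01100001
# d 100 01100100
# d 100 01100100

print(format(ord("A"), "b"))  # 1000001 -- the bit pattern mentioned above
```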
@@TheMrTape ya ma dude, thanks for taking the time to write that long reply, I checked some and I saw you are right!! so if anyone peeps this answer they can see too
@@TheMrTape You should be an engineering teacher! wow I learned lots and actually wanna learn more now!
Finally someone who explains this well.
Good video. :)
Shady I only like the videos where Linus / Taran / Luke explains
Shady I never liked that Linus guy's voice but he'll be back.
Shady that's the beauty of this channel, they may not explain every detail, but they explain enough to get a basic understanding of the subject
Shady I've done this for years, and I'm only finally starting to get it all :).
2:15 As a CS student, I might be taking a bit too much pleasure when hearing that my hobby is immensely complicated.
The processor die at 1:45 is a 6502 :)
oddball0045 engineer?
joon9498 www.visual6502.org/JSSim/index.html
***** Nuu, Z80 >:)
Commodore 64
joon9498 Self proclaimed engineer? then yes :) I'm in 10th grade of high school.
I have tried to explain this topic to friends but I have never come close to the quality of this explanation. Well done Luke
I'm in the first year of systems engineering and I learned this today, so awesome to know this already lol
This guy is very clear. Never thought it was possible to explain something like that in just 5 minutes. The script is very good and the editing is great.
Am I the only one that noticed "micro architexture"
you are right!!!
also porgammer.
accents! how do they work?!
I did
These videos are so much more clear than any of my lecturers ever were.
code is compiled into assembly, and from there to binary, iirc
ingolfur ari Couldn't you write a compiler in binary, that compiles directly to binary?
Franco Selem imagine how hard and time consuming that would be
Miguel C Lol I know, I'm just being a bitch lol
Franco Selem lol ik you weren't serious.
Franco Selem You could in theory, but good luck trying to find someone willing to code something even remotely complex in binary.
This was one of the most interesting videos on this channel in a long time. Keep up the good work!
GJ Slick, that's a really tricky few topics to try and break down into layman's terms, let alone doing them together in a 4 minute video. Out-fucking-standing dude.
I'm curious how long it took to actually script this episode #becausereasons
Peanutsreveng3 I agree, most professors spend several hours just introducing the topics.
davidabeats because that's what they are paid to do. But they explain it so you get every little bit of information that you won't even use most of the time, just so they can test you on it. It's a dick move. But they need to do something for the period.
Someone that sees the inefficiency inherent in the monetary system, sweet @BrownieX001 :)
Wow. This was super simple and easy to understand. I code, but I'm a high school dropout and I never learned any of it formally. This is the first time I've heard this explanation in a consumable manner.
On the last Techquickie I asked about a vid talking about freelancing. On this one I get a freelance sponsor roll....I see we're making progress here :D
Loved the heck out of this video ( and many of the other "tech quickies" )! The concepts in the video were explained super well and straight to the point!
As a C++ programmer I can say you did well. Nice video
Ye, C++ & Python ftw.
Java and vb.net over here.
***** Noob, web development is child's play.
***** You might want to take some English classes, brah...
***** what on earth! Shame on both of you! We're here to have fun, and learn, and you two are flaming each other!
Cool coincidence: the laptop at 0:44 was what I used for every one of my programming projects through college. The layout is still my favorite laptop layout (if only the up arrow replaced the right side of the right shift key, rather than being squashed).
Not all languages run through a compiler though; some go through an interpreter, which is a program that rather than translating your code into machine-understandable bits and bytes, runs the program directly instead. Python, Ruby, and JavaScript are such examples.
About the instruction sets: The labelling for 64-bit architectures varies from website to website. Some write it as x86-64, some write it as x64, and some again write it as AMD64. These all refer to the same thing, the name just varies depending on who you ask.
Nikolaj Lepka Interpreters basically compile the code line by line and execute it immediately. While you are correct that there is a difference between the terms Compiler and Interpreter, all this video needed people to understand was that source code is not directly understandable by computers, so it needed to be converted first :)
Edrulesok what you're referring to is a just-in-time compiler, which compiles the code line by line. An interpreter doesn't, it just tries its best to parse and execute the code directly
Nikolaj Lepka You should go and read up about this :)
Actually, they are compiled. Whilst I can't speak for JavaScript or Ruby, I can say with 100% confidence that Python does go through a compiler. If you don't use the interactive interpreter and just use the command line to execute a Python program, a .pyc file will be generated (Python compiled); it just generally gets compiled at a different time by the interpreter.
James Brunton Then explain why PyPy is different then
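Side note: the claim that CPython compiles source before running it is easy to verify from a Python prompt; the dis module shows the bytecode the interpreter actually executes (exact opcodes differ between Python versions, and PyPy's JIT is a different story again):

```python
# CPython compiles source to bytecode, then its virtual machine interprets
# that bytecode -- it never becomes x86 machine code in stock CPython.
import dis

def add(a, b):
    return a + b

dis.dis(add)
# Typical output shows opcodes like LOAD_FAST and RETURN_VALUE.
```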
You nailed it, brother. This video sums up everything I've looked up across the internet for hours.
Funny that he mentioned source code in Chinese. There actually is a programming language in Chinese; its syntax is very similar to Visual Basic.
Yeah, stolen from Visual Basic.
As an electrical engineer, I can say this is accurate. Great job explaining, Slick (y)
do a vid on defragmenting/disk cleanup (why it's recommended etc..)
You only de-fragment an HDD since an HDD is operated by a needle-like read head (like a big record player)... to access data it has to move across the disk platter, and over time, repeated writing/deleting on the disk can cause gaps to appear across it. This is inefficient since the head ends up moving over blank space, so de-fragmenting basically condenses the bits to reduce how far the head has to move =)
This also happens semi-automatically nowadays, which is why it's rare to find significant fragmentation on HDDs.
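A toy way to see why fragmentation matters (just a sketch of head movement, nothing like a real filesystem): add up how far the read head travels visiting scattered blocks versus the same blocks laid out contiguously:

```python
# Total head travel when reading a file's blocks in order, for a fragmented
# layout versus a defragmented (contiguous-ish) one. Positions are made up.

def head_travel(block_positions):
    travel, pos = 0, 0
    for p in block_positions:
        travel += abs(p - pos)   # distance the head moves to reach this block
        pos = p
    return travel

fragmented = [5, 120, 7, 300, 9, 250]   # same file scattered across the platter
defragmented = sorted(fragmented)       # after defrag: one tight run

print(head_travel(fragmented))    # 1058
print(head_travel(defragmented))  # 300
```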
I didn't really learn anything new,
but for some reason, looking at that CPU graphic while thinking about boolean logic, with my computer running in front of me in its entirety...
it was just a really awe-inspiring moment
OMG, how do you guys know the car YUGO.... I didn't see this coming
lol, the only time I've ever heard about that discontinued Russian car is in a book called "Nick and Norah's Infinite Playlist". So yeah, talk about obscure.
Jake Morel it is not Russian, it's from Yugoslavia (Serbia, Croatia, Bosnia, etc.)
USA Destroyed this big country back in the 90s. just sayin
+euer Vater oh my gosh whateverrrrrrrr
Well, we're famous, brother, it's just these Americans...
+Jake Morel so are you American or Canadian?? Whatever
You know what,
You just explained more in these 3 minutes than our prof did in an entire semester.
If you think that Yugo is bad, you never drove a Fica 650. :D
Wow, didn't think it would be possible to explain that well.
Guess you really need to understand the topic perfectly if you want to explain it. And you did an awesome job!
I remember when luke used to be called slick. Old times!
Finally someone who explains this well
thank you for making this video
Does anyone remember when Fry's great great great great great nephew from Futurama almost got arrested for overclocking Bender?
While I already knew about most of this, Luke definitely explained it quite well, in a way that many more people could actually understand. Great job on this!
So that's why AMD chips run so hot! All of their names have sexual connotations!
+LogicOwlGaming That actually makes a lot of sense...
nice
Good job on explaining this complex topic in a way 'lay' people will be able to understand/engage with. I've almost finished a Computer Engineering degree and understand all this quite well, but I often have trouble explaining these things to others.
I'm so proud of myself,
I understood literally everything he talked about.
Still, it is pure magic to me how "turning the power off and on again" contains information about what color one specific pixel needs to be. And how the compiler was developed in the first place. Magic
A sentence like "in very complex ways" explains nothing. That's as much useful as explaining it like: "and then the CPU goes Salagadoola mechicka boola bibbidi bobbidi boo🎵".
Trying to find something basic enough but still interesting enough for my computer tech class. THANKYOU!!!
Could you talk about logic gates in detail next?
JukaDominator LOL logic gates complicated as shit man dont think they will. GRAVITY FALLS FTW
fishywtf Who knows man? The channel is all about turning those complex things into simple, or at least manageable-to-understand, information.
Gravity Falls is da best yo.
JukaDominator Logic gates aren't too complicated. There's not that many of them and the names are pretty descriptive, like the AND gate takes two inputs, if they are both 1 the output is 1, the OR gate outputs 1 if any of the inputs is 1. There's more of them but they are all intuitive. Put 5 of them together and you can make a "full adder" which simply adds up binary numbers. Multiplication involves repeated adding (or maybe new tricks in modern CPUs).
JukaDominator in fact logic gates are really simple to understand. Just the combination of them is the complicated part. All you need to know is a bit of binary logic
fishywtf Logic gates aren't complicated. Using them in large combinations is somewhat complicated.
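For anyone curious, that full-adder construction is small enough to sketch in a few lines of Python (booleans/ints standing in for wires; real hardware obviously isn't Python):

```python
# Basic gates, a one-bit full adder built from them, and four adders
# chained together to add two 4-bit numbers -- the same idea an ALU uses.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    s = XOR(XOR(a, b), carry_in)                          # sum bit
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))   # carry bit
    return s, carry_out

def add4(x, y):
    carry, result = 0, 0
    for i in range(4):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add4(0b0101, 0b0011))  # 8, i.e. 5 + 3
```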
So many people take technology for granted, but when it comes down to it, it's a fucking miracle that this stuff works as well as it does
Yugo best car ever
I actually enjoy Luke in this video. He does a really good job explaining this.
Arcitexture
Most of us already knew all this. It would have been cool to get a little more in detail on microarchitectures.
you failed to mention interpreted languages or virtual machines. Not every language gets compiled, or gets compiled to machine code.
Andrew Brown eventually they all do...when they do is another topic for (possibly) another video.
Isotopic Design
You could go to the trouble of saying 'compiled at runtime' but then we may as well call it interpreted. Yes, eventually all languages get converted to machine code, but I don't think 'compiling' is defined that simply with no temporal connotation. This is why there is a distinction between 'compilers' and 'interpreters' in the first place.
Andrew Brown Who is still using 'interpreters' anyway? Most of the languages that were interpreted now have JIT compilers; no one is doing the 'line by line' approach anymore.
Maciej Sopyło scripting is still interpreted (perl, python, bash, Tcl (ugh) )
Andrew Brown Interpreters compile source code line by line and then run that code, so I guess they are also technically compilers.
A serious thanks for posting this. It answers a question that has been sitting on the edge of my mind for years!
I've learned something new, nice, and it's also very interesting 😮
Then why do we need more cores in a CPU? Can it just be 1 core, or is having 4 cores the best right now?
juan valadez watch the episode on CPU Cores
RetroMyke ok
juan valadez
Think of CPU cores like arms(with hands, fingers, etc). If you have one arm, you can do stuff OK, but only one thing at a time. Now if you have two arms, you can do things MUCH more efficiently, because you can distribute the work over 2 arms, or do two different jobs at once.
Now, if you had 3 arms, you could do even MOORE work than with two arms, and more efficiently, you could distribute the workload over more arms, and could do more jobs at one time.
This same analogy can be used to explain bottlenecking a CPU (crudely; it's not a perfect explanation, but I think it gets the point across).
Think of your legs as a GPU(or any part, really), and your arms as your CPU.
No matter how many arms you have, and how fast and awesome they are, if your legs can only move at one step a minute, your arms are kind of overkill, because your legs simply can't keep up.
Same for vice versa, if your legs can run at a million miles an hour, but your arm can only move one object a minute, then you can't work very much, because that arm is slowing you down.
Hoped this helped!^^
TheNONnoobXD it did
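A rough way to see the "more arms" idea in practice (a Python sketch assuming a CPU-bound task and a machine with multiple cores): hand the same pile of work to a pool of worker processes, one per core:

```python
# The same jobs done one after another ("one arm") versus spread across a
# pool of worker processes ("several arms"). Results are identical; the
# parallel version can finish sooner on a multi-core CPU for CPU-bound work.
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    return sum(range(n))   # deliberately CPU-bound

jobs = [5_000_000] * 8

if __name__ == "__main__":
    serial = [busy_work(n) for n in jobs]      # one core, one job at a time

    with ProcessPoolExecutor() as pool:        # one worker per core (by default)
        parallel = list(pool.map(busy_work, jobs))

    print(serial == parallel)   # True -- same answers, just computed in parallel
```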
This is the exact video i needed. Literally started with the question that popped into my mind earlier.
this is outrageous!! why replace Linus with a man? truly truly...outrageous!
lol
Fairly good effort for something that takes at least a semester to understand.
Hey! Yugo isn't that bad! xD lel who am I kidding, it's common here though, even today.
Goran Novakovic It's still a flaming pile of garbage, but it can't be that bad considering some of them are 20 years old and running. :D
Goran Novakovic How do you know that spring is coming?
Yugos begin to start.
Daniel Borsos Yugos start in winter too; it's a petrol engine, not diesel, so cold weather doesn't hurt it that much
Goran Novakovic haha my dad still owns one haha, we had to use it when our main car broke. PS are you from the Balkans? :D
ItsAlan Yes, from Serbia :)
That was a good review for the gist of two whole EE courses I took back in college :D
Where did he find a YUGO?
I was searching for this for many weeks.
Finally found it.
The only thing I did not understand is how these transistors, after switching on or off according to the code, can understand it and give the output. It's very complex. I wish I could work at Intel to properly know what's going on inside the processor.
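One crude mental model that might help (no real electronics here, just a sketch): treat each transistor as a switch that passes a signal only when its input is on. Two switches in series behave like AND, two in parallel behave like OR, and everything above that is stacks of the same trick:

```python
# Transistor-as-switch toy model: a switch passes the incoming signal
# only when its gate input is 1.

def switch(gate_input, signal_in):
    return signal_in if gate_input else 0

def and_from_switches(a, b):
    # Series: supply -> switch(a) -> switch(b); conducts only if both are on.
    return switch(b, switch(a, 1))

def or_from_switches(a, b):
    # Parallel: two separate paths from the supply; conducts if either is on.
    return 1 if (switch(a, 1) or switch(b, 1)) else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", and_from_switches(a, b), "OR:", or_from_switches(a, b))
```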
Just to add more info: if you are working with C, it is compiled as the video said.
However, if you are writing Assembly (about as close to "machine language" as you get), it also needs to be "translated", but this time the translation step is called "assembling" and is done by an assembler.
Depending on the programming language, it might go through a different process in order to be interpreted by the hardware as instructions.
L8er!
"Microarcitexture" right....
This video is great! I'm going to send it to my cousin, he does lots of web dev but wanted to get a better understanding of what's happening closer to the metal. I had a hard time explaining it.
I'm from Serbia, and I am offended by the Yugo being called a crap car
This was so well explained and executed. Luke should do these all the time!
I imagine there's collaboration between a bunch of the guys while writing various scripts.
This was probably done while Linus was 'skiving off', and I know he's wanting to roll back his workload, so it's likely we'll have more Slick/B-Roll, Taran etc.. presented/written videos. Which, like yourself BrownieX001, is cool beans to me :)
Peanutsreveng3 I just enjoyed his presentation. He has become very comfortable with the camera and his deeper/calmer voice helps explain better to me. I know scripting is probably a group research thing.
It's amusing to look back at earlier videos and see squirming Slick, but he can be proud of his achievements despite constant, daily criticism, flaming etc :)
Then again, his Mum seems sound as well as his brother, so folks clearly deserve much credit too :)
Peanutsreveng3 I agree. I enjoyed Luke's Mom and Brother during the WAN Show. They were a couple of my favourite co-hosts. They were awesome, and the dynamic with Luke worked out for the show and had really funny parts.
Who's watching this in the Kaby Lake era?
This was so awesome! I have already researched a lot about this and even designed my own basic CPU, but this video really puts it all into place. This would have been so helpful when I was first learning this. I should just point people to this video instead of trying to explain it myself! It was also great how you explained binary.
By the way Luke, I was wondering how much programming you do. I do a lot because it is free and I don't have any money for hardware but I didn't realize the tech tips team did it.
Anyway, best Techquickie I have seen! Keep it up.
That's not Linus :O
Also, languages like Java don't directly compile to machine language but rather their own "bytecode" which is interpreted by a Virtual Machine. Saves you the trouble of writing a compiler for every platform out there. This is why you can run a piece of Java code on so many devices and why it's so popular with embedded devices.
AMD Architectures all sound like sex positions
Alright computer. How 'bout this time we try the Pile Driver ( ͡° ͜ʖ ͡° )
Just took 'Digital Systems and Logic Design' and 'Microprocessor, Micro-controller and Assembly Language Programming' this semester. Let's see how it works in depth!
What Maserati, my ass, it's the Yugo, man! hahaha :D
You don't need to compile your source code in order to convert it to 1s and 0s, since source code is already nothing else but binary information interpreted and displayed as text. The language specific instructions in your code need to be converted into (eventually) instructions which are part of your processor's instruction set, that's why you need a compiler.
Wow, you really did a great job making this video very straightforward and professional, yet with a tad bit of humor! I loved it!
I love Techquickie explaining all computer & technology material. Can you next time post a video about cloud computing, robotics, data science, and artificial intelligence??
Always wanted to know that! I was going to learn it eventually, since engineering processors is what I want to do for a living, but it's nice to have it explained in a short format. (I know what I'm going to study because of the LinusTechTips channel, following since way back in 2010.)
thank you for making these videos so i can watch them at 1 am
5 years of college wasted... you made it understandable in 5 minutes
Really good broad description of a very difficult CS concept. Great video.
Since he brought up cars, the transmission is really a big analog computer that does division based on the gear and engine RPM
fantastic video Luke, top notch and we need more like it!
I may just link this video to anyone who asks me "YO DAWG WTF UR DOIN IN IT CLASS?" Nice explanation! Thanks!
Excellent brief explanation. Thanks!
When this video came out the newest intel CPU was haswell. 2015 feels like yesterday.
Subtitles on the Techquickie channel would be very nice! Sometimes I want to show these videos to someone but the person doesn't speak English well (or at all)!
You kinda missed Assembler. It's as close to human-readable machine code as you can get (you can program in it, for example - you're going to have a nightmare on your hands attempting to program in binary), and compilers compile your code into this (which is then translated into binary), although compiler-generated Assembler can be pretty abstract (you can redo it and cut out large pieces of generated code, but many don't because it's really long-winded).
You're 2 steps away from bare metal (1s and 0's being the second step and +5V, -5V being complete bare metal).
For programmer types - Assembler is translated literally and has no hard-defined types (so chars can potentially hold ints etc). You can do any basic operation in Assembler such as while loops and number manipulation.
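To illustrate that "translated literally" step, here's a toy assembler in Python for a single made-up instruction format; the mnemonics and bit layout are invented for illustration and don't match any real ISA:

```python
# A toy assembler: turn a human-readable mnemonic into a 16-bit machine word
# with the layout [opcode:4][rd:4][rn:4][rm:4]. Purely illustrative.

OPCODES = {"add": 0b0001, "sub": 0b0010, "mov": 0b0011}

def assemble(line):
    mnemonic, operands = line.split(maxsplit=1)
    rd, rn, rm = (int(r.strip().lstrip("r")) for r in operands.split(","))
    return (OPCODES[mnemonic] << 12) | (rd << 8) | (rn << 4) | rm

word = assemble("add r0, r1, r2")
print(format(word, "016b"))  # 0001000000010010
```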
I just took a compiler writing class. Holy crap is it rage-inducing, interesting, and fun (if that makes any sense). The final project was writing a basic Pascal compiler that output x86 assembly code. It's not as hard as you would think...
Nice to see that you compared it with YUGO, a slow, budget car from Serbia :D
Amazing video. I have been looking for the answer to this particular question for years! Thanks!
You forgot one part of the process:
Source code->Compiler->Assembly->Assembler->Machine Code->Instruction Memory->Instruction Decoder->Control Signals
David Teles running the assembler is often forgotten nowadays, since most compilers include that step for you automatically. And for the most part, even after the assembler you still need a linker before the file is executable by the cpu.
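Those stages are easy to watch with an ordinary gcc toolchain; here's a small sketch driving them from Python (assumes gcc is installed on a Unix-like system; -S, -c and the final link step are the standard ones):

```python
# Compile -> assemble -> link, one stage at a time, then run the result.
import pathlib
import subprocess

pathlib.Path("hello.c").write_text(
    '#include <stdio.h>\nint main(void) { printf("hi\\n"); return 0; }\n'
)

subprocess.run(["gcc", "-S", "hello.c", "-o", "hello.s"], check=True)  # compiler: C -> assembly
subprocess.run(["gcc", "-c", "hello.s", "-o", "hello.o"], check=True)  # assembler: assembly -> object (machine code)
subprocess.run(["gcc", "hello.o", "-o", "hello"], check=True)          # linker: object file(s) -> executable
subprocess.run(["./hello"], check=True)                                # the CPU finally executes the machine code
```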
Good explanation, but still very simple. I'm gonna learn about all of this in more depth later at college; computer engineering is a crazy thing, but very satisfying when you really learn something new.
I like the rotating host for these!
Another great video you guys never disappoint.
Great video, very informative and easy to learn. Also cool that Luke hosted.
You mentioned it in the video: if there isn't already one, do an overclocking As Fast As Possible
Going into Electrical Computer Engineering next year. This video scared the shit out of me.
Andrew Kahn He pretty much skimmed over what you're gonna learn in the first two years or so. As long as you keep up with the classes, it's not bad at all.
Ok. I'm just kidding. I'm not really that frightened. I'm a quick study so I shouldn't have an issue. Thanks for the reply!
Andrew Kahn Not to scare you, but Electrical Engineering is tough..
Simple and quality video with a lot of content in it...