I'm a seasoned dev with 15 years' experience, but I'll never get tired of hearing someone talk about the fundamentals and basics of computers
As far as programming goes, I think the main thing that has changed over half a century is adding more and more layers of abstraction - microcode (hardware), micro-operations, machine language, low-level compiled languages, virtual machines, high-level languages, and libraries on top of libraries on top of libraries. Understanding the layer underneath where you usually work can help you take better advantage of it in your code.
I think that's what's holding me back somewhat, because I am experienced with assembly languages, but when I go to write a C++ or C# program I'm missing too many intermediate steps, and it is quite anxiety-inducing to suddenly trust all those layers I have no control over to do things as and when I intended. The loss of total control, compared to speaking something the computer understands almost directly.
@@vuurniacsquarewave5091 perhaps you could develop a little trust if you played around a bit with Compiler Explorer? It's one of the reasons I created it - to make it possible to interactively edit the C or C++ code and see how the compiler generates the assembly.
@@vuurniacsquarewave5091 unfortunately, if you're ever going to make something that's modern you have to let go and trust… I have web applications running on cloud servers and it causes anxiety, but there is no other way in the modern world.
Microcode was removed to create RISC. Ah, that is why MIPS has no flags. A single-cycle interrupt only allows storing the instruction pointer in r31. ARM now has a 128-bit single-cycle push to solve this.
I've been in the computer industry for 40 years, and all this stuff still amazes me. The same with working on a remote computer on a terminal: when you hit a character on your keyboard, it's sent to the remote computer, which in turn sends the character back to the terminal, which displays it. Ff'ing amazing.
Thanks Matt & Sean (for this and the compiler explorer). I know for many this topic can be "dry stuff", but this fundamental understanding of computing really helps someone go a very long way, and it pulls back the curtain and reveals that actually, computers aren't magic: they do very little, very fast. I think the old 8-bit model of a CPU is really how we should teach children to think about computers today, and those old computers still certainly have a place as educational tools, where the pixels are so big you can see them and the memory addresses are so few you can count them.
Great to see Mr. Godbolt on here. Great communicator
Oh wow, I didn't even catch the name. His website shows up in dozens of CppCon presentations.
Thought it was Matt. I recognised his voice from the Two's Complement podcast
I was lucky enough to work with Matt at the very beginning of my career. He's even nicer in person.
This is one of the best ComputerPhile episodes that I have seen. I learned more from programming a "Tandy Microcomputer Trainer" in the early 1980s than from most of the computer engineering studies that I have undertaken since. Great analogy. I love it!
This is what all CS students need to start with, not HTML, CSS etc.
Huge respect, great video, enjoyed every second of it! 😎👍
CS50 Course at EDX is pretty close to this.
Totally agree
CS students start with HTML and CSS? 🤔 They actually might very well have back in the days when MySpace was a thing, though recreationally, not academically :)
@@2eanimation I don't think most CS students start with html, css and js. I think that is more true for bootcamps that focus on shitting out web devs that know a little about servers and dbs but mostly just frontend things.
This is the best description of machine code I've ever heard. I've only ever wrote one machine code program in 6502, I was so proud and shocked that it worked! It was on an Oric 1 though, not a BBC micro. The Oric had a sort of BBC Mode 7 that the Electron lacked. So in 6502 Machine code I made a program display 'teletext-style' pages as a Point-of-sale display in a shop window on Mansfield Road in 1983. It was FAST because it used BLIT to move the teletext page into display memory... Rock 'n' roll ! 😁✌
My first computer was an Amstrad CPC 6128, another Z80-based computer with a whopping 128KB of RAM (which the CPU couldn't access all at once, so it used a paging system and the second 64KB could only be used to store data for the longer term, because it was slow to retrieve). That's the computer I taught myself programming on (first in BASIC, then in Assembly), and my mental model of machine code is pretty much the same as Matt's. I remember how in the CPC 6128 the last 16KB of the first 64KB of RAM mapped directly onto the screen, so that just by putting numbers in those addresses, pixels would light up in various colours. I think that helped make programming "click" for me, in that I had direct, visual feedback of what I was doing just by storing numbers at specific addresses. From there it was easy to "get" how the computer worked. I even remember the available registers of the Z80 CPU. There was A, the accumulator from the video, the relatively free BC and DE 16-bit registers, F, which contained bit flags representing various states of the computer and AFAIK was read-only, or at least needed to be treated with care, and finally there was HL, the register that contained the memory index Matt mentioned. Funny that I still remember these details after so many years (nearly 40 years ago!).
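For anyone who never had a machine like that, here's a minimal sketch in C of the idea: the whole 64KB address space as an array, with an assumed screen base of 0xC000 (the CPC default, if memory serves - treat it as an assumption), so "plotting" is just storing a byte at an address.

#include <stdio.h>

static unsigned char ram[65536];   /* the whole 64KB address space      */
#define SCREEN_BASE 0xC000u        /* assumed screen start (last 16KB)  */

int main(void) {
    /* writing into the screen region is what makes pixels appear */
    ram[SCREEN_BASE + 100] = 0xFF;
    printf("byte at 0x%04X = %d\n", SCREEN_BASE + 100, ram[SCREEN_BASE + 100]);
    return 0;
}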
There's just something about the extreme basics of computing that fascinates me immensely, please do more in the future!
Very well explained 👏, my first "mental model" from high school programming had the class trying to "command" a volunteer student (who acted as the robot) to stand up, walk in a circle, and sit back down using a limited instruction set. It was explained in a very similar way.
Wrote assembler on multiple CPUs myself, so I didn't expect to learn anything, yet this explanation was still interesting to me. Teaching done well is interesting; it's a shame that students are so often bored.
Load that index bro! Wooo! Replace that sum with the sum of the index and the sum of the sums! Fibonacci be Bussin' fr no memory cap
Thank you so much!
Ben Eater has a great series on that. Complete, from simple logic elements all the way to how to create and code the microcode in an EEPROM.
Another +1 to Ben Eater's content! I've made his kit 6502 too, recommended!
Ben's videos are amazing. Watching him explain SR latches blew my mind. The only downside to his channel is that every time he posts I have to go re-watch his entire library to remember how everything works lol
I have the 8 bit computer built
This is so well executed. We request a part 2!!!
I bought a ZX81 kit from the UK circa 1981, while living in Nigeria. I was 15. After seeing it boot for the first time I was curious about how it worked. Less than a year later I'd taken the Z80 out of the ZX81 and built my own machine on a veroboard. I've never been the same since. 🙃
wow
Hah, NERD!
@@EnjoyCocaColaLight haha, absolutely. Still am.
@@vincei4252 Are we not all? ;-)
@@srenkoch6127 Nooo, the world is full of people who can't count and don't care! 🙁
I remember in the mid-80s stumbling across this "machine code" whilst poking values into RAM on my Commodore 64. I can still remember the decimal values I poked in to increment the screen colour and jump back and do it again, at incredible speeds compared to BASIC. It was such an eye-opener on how fast the machine was actually running compared to how fast BASIC could do it. Nice video.
This was a lovely example of clear explanation - it may seem pedestrian but fundamentally computers are pedestrian and understanding how fast they do pedestrian things and how much pedestrian work you are asking of them are important insights.
The next layer up is understanding quite how much bottom level code is generated from high level instructions and which instructions have higher cost. Then you can start writing efficient code.
I love this channel so much, can't believe I sat through and enjoyed an entire video teaching machine code AND understood everything.
This is remarkable. In 20 minutes, a basic understanding of what goes on in a computer CPU and an indication how assemblers work (human-understandable representation of what is in an executable instruction). Parts 2, 3 etc are needed to overcome some of the necessary over-simplifications in this introduction. (For example: there are at least three distinct LOAD instructions - load literal; load from storage location; load from storage using the INDEX.) I would say that anyone with the basic ability to learn assembler programming, or direct machine code if really necessary, would be set in the right direction by this talk.
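To make those three LOAD flavours concrete, here's a tiny C sketch of an imaginary accumulator machine (all the names are invented for illustration, not any real instruction set):

#include <stdio.h>

static int mem[256];
static int acc;        /* the accumulator    */
static int index_reg;  /* the index register */

static void load_literal(int n)  { acc = n; }              /* LOAD #n     */
static void load_absolute(int a) { acc = mem[a]; }          /* LOAD @a     */
static void load_indexed(void)   { acc = mem[index_reg]; }  /* LOAD @INDEX */

int main(void) {
    mem[5] = 42;
    load_literal(7);   printf("%d\n", acc);  /* 7: the number itself       */
    load_absolute(5);  printf("%d\n", acc);  /* 42: contents of address 5  */
    index_reg = 5;
    load_indexed();    printf("%d\n", acc);  /* 42: contents of mem[INDEX] */
    return 0;
}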
possibly the best explanation of basic machine language i have ever seen.
"earth-shattering moment" @14:24 ! no other word can explain the feeling much better. This video is nostalgic. Back in 1988 when I was learning BASICA (or GWBASIC), reading about the PEEK and POKE statements gave that feeling.
Thank you for giving us a way to grasp machine code! As a teacher myself, the way you planned that out and introduced the concepts was perfect!
Still amazes me that typing codes in makes a computer do things.
Because they didn't teach you E&M and circuitry to go with your binary, because they don't need you to actually know things and have a complete picture; they simply need you to have a piece so they can monopolize the pieces and be the only one with the whole... don't you wish the purpose of education was actually to educate and not to filter/control?
It still amazes me that only a couple of lifetimes ago we were still behind horse and cart, yet we can watch this video on pocket-sized computers in every corner of the globe. We've come a long way, but there's so much further to go.
@@oriwittmer your grasp on the rate of change in your world is tenuous at best... be prepared for the "cyberattacks" during the election
@@exchable electricity... and magnetism... it's a class... (we did not teach IT to do anything, it merely does what it was supposed to do in the first place... the weirder thing would be that your neurons and the ones used to make GPT are the exact same thing... meaning YOU are already an organic computer to begin with)
The thing I find the most fascinating is the RAM and how computers are able to “remember” things. Very cool if you look at the circuit for 1 bit of memory. Prior to that the inner working of computers were very abstract to me.
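That 1-bit circuit is usually a pair of cross-coupled gates. A quick C sketch of a NOR-based SR latch, just to show how feedback "remembers" a bit (simplified: real hardware settles continuously, here we just iterate until stable):

#include <stdio.h>

/* One update step of a NOR-based SR latch: each output feeds the other gate. */
static void step(int s, int r, int *q, int *qn) {
    int new_q  = !(r | *qn);
    int new_qn = !(s | *q);
    *q = new_q; *qn = new_qn;
}

int main(void) {
    int q = 0, qn = 1;
    for (int i = 0; i < 4; i++) step(1, 0, &q, &qn);  /* pulse Set   */
    printf("after set:   Q=%d\n", q);                 /* Q = 1       */
    for (int i = 0; i < 4; i++) step(0, 0, &q, &qn);  /* inputs idle */
    printf("still holds: Q=%d\n", q);                 /* Q stays 1   */
    for (int i = 0; i < 4; i++) step(0, 1, &q, &qn);  /* pulse Reset */
    printf("after reset: Q=%d\n", q);                 /* Q = 0       */
    return 0;
}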
I think you explained a lot very nicely and concisely! The assembly program is cool enough, and bringing it around to machine code in ram is just fantastic!
Thank you! 😊
Agreed, although it is a big worry that he couldn't add up 8 + 5.
@@blucat4 there's a reason I get computers to do everything for me :D
I highly recommend Ben Eater on youtube. Ben’s video series starts with basic electronics and progresses to building an 8-bit cpu from scratch with breadboards.
The game "Human Resource Machine" picks this up very well. I have no idea about machine code, but I love that game, and most of what you said sounded quite familiar to me.
I was thinking the same thing. The robot he's describing is literally the worker from Human Resource Machine. Marginally different capabilities, but still.
I'll see if I can get my kids to play that! They won't listen to me :)
If you want to go deeper, try out "TIS-100" by Zachtronics; it really is the closest to a game about machine code I've ever seen (with some fun twists and a story). Some of their other games are also very interesting IF you find machine code/assembly interesting!
@@MattGodbolt It's pretty fun actually, though the puzzles can get pretty tough sometimes :v
Great video. Been coding since 1985 and professionally since the late 90s and never understood assembly language or machine code until now. It’s been a black art I’ve kept away from
If only there were some kind of technological device that could be used to convey information to others easily without having to use up so much paper. I tell you, the inventor of such a device will be set for life!
Yooo! Lovely to see Matt on Computerphile! I use CE daily for work in HPC optimization!
I remember the exact moment of my "watershed" understanding of programming. I was in my freshman dorm room in 1982, talking to a classmate on my wall phone (with a cord that was about 2 feet long, so I had to stand to use it). It was the second week of my first programming class, and we were discussing the Fortran assignment, which I was completely baffled by. My classmate said I should include x=x+1 in my answer. My mathematical mind said that doesn't make any sense. He said "It's not an equation. It is taking the current value of X, adding 1, and putting it back in X". From that moment on, programming has come second nature to me.
As a teenager back in the 80s, the Assembler program was beyond my budget, so I wrote some stuff in machine code. The Z80 was pretty easy to understand.
Matt and I have such similar backgrounds, it's scary. I too learned machine code from that Usborne book. I think we share the same first computer too. Also, my name is Matt. I still program in Z80.
Loved this video! First thing I ever programmed was a TI-57. Instruction codes were the coordinates of the key (row and column) and you had 50 steps and 8 memories.
Looking forward to a followup with tests, heap, indirect addressing, ...
Wow, I still have this collection of Usborne books to this day... I learned exactly the same way. Still a software developer.
I got into computers because of video games, and today, after 20+ years, I am a software engineer. I will never stop being fascinated by computers.
this comment, this video, this website, the internet - all 1s and 0s. It's crazy that we put electricity through a bunch of sand to make it perform math, and that is basically what makes the world go around today!
Very nice! Thanks. I endorse the idea of a part 2. All the books I ever saw/read/used on machine code always started with registers, accumulator etc. and *never* mentioned the 'other circuitry' which produces the end user experience which Matt G mentions. I remember spending an hour mcoding once to produce a tune that lasted 3 seconds. I knew exactly what the accumulator was doing, but had no idea how all that interfaced with the speaker. I think a part 2 might track back from the interface down to the mcode. Those old books were like looking at the role of the brain in psychology where ch 1 was 'the neuron' and ch 2 was the synapse - but you never got to human behaviour...
My brain: Matt Godbolt?
*click*
Matt Godbolt: So I wanted to talk about…
My brain: Matt Godbolt!
Great explanation! I love machine code and admire early computer scientists for memorising all the little instructions.
They did not. Thus, the instruction set was reduced to what a normal coder could memorize. When compilers caught up in 1994, the PSX got its ISA extended again with vectors. The N64 got floats.
You should totally rename the video to "How computers work". It would probably get more attention which this video totally deserves!
Yes! I thought so! He's a great live performer. I've seen a couple of his talks and he just knows how to present programming to people. Apparently he also has a pretty sweet system of cameras for holding an "interactive" presentation from home.
The fascinating part about this is how simple it is once you figure it out, but it's still a lot of labour, and you can easily forget a step as you're excited to get the end result and the program done, focusing more on your idea than on all the manual steps needed to get there. Which I guess explains why higher-level languages have become increasingly popular, and why things like C++ got ranges and algorithms as well: to remove the problem of misindexing and looping - making a silly, careless mistake that leads to a bug because you're thinking about your idea more than walking step by step.
Would be great to have Godbolt guest again, he seems to have quite a bit more to say.
for many years i've felt i had the 'gist' of machine code (i normally code higher level) but this made a few things click together in my head more than anything else has.. thanks :)
Matt godbolt is a modern day pioneer.
My University electronic calculator course was incredibly well summarized with this simple example, amazing!👏
On the topic of last year's sound check: I think it would be interesting to watch a video of someone on Computerphile do a "let's play" of TIS-100.
Loving seeing Matt on here! Big fan!
The fact that there is an abacus on Matt's desk is proof that we're living in a simulation.
The most exciting thing about this video was hearing that Usborne books are now available as free pdfs. Those books were a huge part of my childhood! But I've searched around a bit, and haven't found much, certainly no books I recognise from the 80s
I remember learning Z80 8-bit assembly back in the 80s on the ZX Spectrum and the CPC 464. It took me a while to get my head around it, but it was soooo much faster than the native BASIC implementations.
I had a program on the speccy (machine code monitor iirc) which showed you the values of each of the registers as you stepped through the code. Really helpful to understand what was going on.
A very good explanation of machine code and assembly.
Also nice to see someone who doesn't just go and grab some off-the-shelf assembly language, nor some specific architecture for the machine code itself. Even if such can be useful in its own right, most often going with "off the shelf" explanations means one just regurgitates an old, outdated/incorrect explanation of how something works.
To rephrase my statement.
There are very few bounds to how machine code "can" look and work. There are more or less elegant ways to get to a Turing-complete machine, but in the end there is no singular "correct" way. Same for assembly code. (Even if most people just bring out x86 assembly or the like...)
Best vid on the channel. Well done!
Love the pigeon holes idea. Great way of explaining memory.
This was a walk down memory lane. I started writing assembler for Intel 8080 CPUs. Soon had over £1m of Intel Development systems with microprocessor In Circuit Emulation. Manual disassembly of machine code was the way to debug. Happy days.
Great video! Thank you Matt & Sean.
I remember in the 80s, searching the memory of a TRS80 for where the picture of the letter 'A' was stored. It was a cool way to learn programming.
My TRS80 came with 4K of RAM, so learning Z80 machine code was really helpful in making the most of the available RAM.
Remember vividly realizing at the age of maybe 10 or 11 that programs were just data in memory. Next it dawned on us that this means we can make programs that rewrite themselves. Felt like sci-fi for me and maybe the 3 other kids at the school who had any interest in how computers actually work.
And on the job you find out that you can't write to the microcontroller EEPROM at runtime. Your weird 14-bit instruction word Atmel won't run from 8-bit RAM. Windows and Linux protected mode prevent code manipulation.
Assembler isn't as complicated as people think. The instruction stream is effectively a switch that tells which unit to use (copy, add, multiply, shift, etc.) at a time. The mnemonics are just for human convenience, saying which byte is which command.
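Something like this, with invented opcode numbers (not any real CPU's encoding):

#include <stdio.h>

enum { OP_COPY = 0x01, OP_ADD = 0x02, OP_SHL = 0x03 };  /* made-up opcodes */

/* The "switch" in question: route one instruction to the right unit. */
static int execute(int opcode, int acc, int operand) {
    switch (opcode) {
        case OP_COPY: return operand;         /* copy operand into acc */
        case OP_ADD:  return acc + operand;   /* add                   */
        case OP_SHL:  return acc << operand;  /* shift left            */
        default:      return acc;             /* unknown: do nothing   */
    }
}

int main(void) {
    int acc = 0;
    acc = execute(OP_COPY, acc, 5);
    acc = execute(OP_ADD,  acc, 3);
    acc = execute(OP_SHL,  acc, 1);
    printf("%d\n", acc);                      /* (5 + 3) << 1 = 16 */
    return 0;
}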
It's not that it's complicated, it's just a huge mental load to think about even the simplest of programs in that format. The difficulty comes from the sheer quantity of information to remember rather than actual complexity.
@@Eagle3302PL that's where subroutines, labels and macros come in. Some assemblers, like ca65, even have structs and unions.
The difficulty comes from instruction timing and pipelining. Stuff you wouldn't even think about at the beginning.
And then you try to deal with overflow on the 6502. Or have to remember to SEC before SBC. Or the weird MUL and DIV in most CPUs. Or you need two-byte integers. Or you have to manually allocate variables in the 64+1 accumulator in JRISC (when you disable interrupts).
I still remember some of the 6502 opcodes in hex, learned on my Acorn Atom with 2K of memory. I owe that little machine so much!
The moment I realized where my life path would lead me to. Commodore 64/128 machines had diskette drives that contained a microprocessor and a very small amount of RAM (256 bytes if memory serves). Game houses decided that they could fit a machine code routine in the tiny RAM that could decrypt the encrypted game program. There were cartridges available for these machines that let you set break points. I managed to figure out how to decrypt the game code and write it back to a diskette after it was not encrypted any longer. At that point, the game could be loaded and run from that diskette. Got an A in my x86 assembly language class at Cal Poly Pomona. Ended up a network engineer but it all started cracking the diskette drive encrypt/decrypt scheme.
I read that same Usborne book when I was young - very useful indeed. Thanks for the video!
Back in the day you learned this by reading a somewhat messy "manual", a book explaining how the computer worked, that came along with each computer. And maybe a book from a bookstore or a library. There was no teacher, and no internet. Maybe a friend to talk to. Later you might be able to find a magazine explaining some of it, and gathering all those pieces you were able to make a computer game, a wargame dialer, a telnet program, read a voltage from a user port, do 3 dimensional vector graphics (in color!), make "music" and so on. I had great hopes for the future of humanity back in the day :)
I have worked on the whole range of abstraction from microcode on a Burroughs minicomputer in 1975 after a great Computer Science degree where I used IBM Assembly language and PL360 which was a far better tool. I implemented many languages on abstract machines, culminating in a Ph.D. These abstract machine concepts were in use far before Java and C#. The idea is that the machine that executes the high level programming language can be itself high level so that code generation is very simple.
Absolutely fantastic video/explanation!!!
It still blows my mind that people actually got this to work. And it annoys me that most people aren't more impressed with it 😅
What blows my mind is that the human race created this and things like GPS satellites etc, yet there's still huge numbers of people that believe the earth is flat
oh my god.. I had the same book and same memory of pigeonholes! thanks for posting this Matt - and Usborne of course... great video - you are the Fred Harris of today. :)
Thanks, this was finally an explanation that I could get my head around.
Who knew you were such a great Artist Matt? Banksy your days are numbered! 🙂
Machine code is runnable code whose encoding is non-text, directly executable by the hardware without an operating system. That's the correct definition of machine code that no one else talks about.
Internally it has logic gates that do full-adder operations: subtraction is addition in reverse (adding the two's complement), multiplication is repeated addition, and division is repeated subtraction with a counter.
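Roughly this, sketched in C on small unsigned values (real ALUs multiply with shift-and-add rather than naive repeated addition, but the idea is the same):

#include <stdio.h>

/* Subtraction as addition of the two's complement ("adding in reverse"). */
static unsigned char sub(unsigned char a, unsigned char b) {
    return (unsigned char)(a + (unsigned char)(~b + 1));
}
/* Multiplication as repeated addition. */
static unsigned mul(unsigned a, unsigned b) {
    unsigned r = 0;
    while (b--) r += a;
    return r;
}
/* Division as repeated subtraction with a counter. */
static unsigned divide(unsigned a, unsigned b) {
    unsigned q = 0;
    while (a >= b) { a -= b; q++; }
    return q;
}

int main(void) {
    printf("%u %u %u\n", (unsigned)sub(9, 4), mul(6, 7), divide(42, 5)); /* 5 42 8 */
    return 0;
}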
Programming has changed quite a lot actually. I remember programming BASIC on an 8-bit computer: there were no classes, not even functions, only GOTO and GOSUB. I guess some logical understanding can linger and remain useful, but computers nowadays are very different architecturally, and there's a whole different set of performance factors.
Technically, you can implement classes and functions even in early versions of BASIC - they just come with a lot of overhead. For that matter, GOSUB isn't far off a pass-by-reference function call already.
During lockdown I grabbed a C64 emulator and C64 Studio - it was the most fun I'd had programming in years.
Great video, but this should be the first in a "Matt Godbolt on Computerphile" playlist.
Last week the game Human Resource Machine was free on Epic. I used it to teach my nephew computer programming. It's basically a game version of this video.
The sequel, 7 Billion Humans does parallel execution.
This made me feel such nostalgia! 😋
Great video, very well explained. I'd love to see a follow up.
I can also recommend games like TIS-100 to understand these concepts. Or Human Resource Machine if playing TIS-100 feels too much like work to you :D All these games have some artificial restrictions for gameplay reasons that real computers don't have but all in all, they are a very good way to acquire what Google calls "computational thinking".
17:22 "More complicated than the 10 instructions I've got…" Given his prior acknowledgement of count-from-0 errors and earlier mention of 11 instructions, I'm not sure if he's trolling us or if he really made the mistake. 😂
Great content as always. Thank you.
I also learned a lot from these Usborne books. I used to read them at night when I was around 8, well before we even had a computer.
Great stuff and with such enthusiasm!
I learnt assembler language on an IBM system 370 mainframe under MVS. This was mostly top down, batch programming, so first time through, second and subsequent time through and last time through. Later on it was reentrant, task or interrupt driven, realtime programming.
Throughout this entire time, it was very easy to visualise what was happening when a Shift Left Logical, SLL or Move, MVC instruction was executed. Numbers in registers, memory overlays, save areas etc.
When it came to wanting to learn C or C++, this way of programming was so burned in to my way of thinking, I found it very difficult to get into concepts like OOP, inheritance etc.
Machine code will never change, because the hardware structures of all computers, no matter the operating system, are based on the binary system. All the electronic signals in the hardware represent the binary digits 1 and 0, which are the only signals computers understand. A programmer's machine code can be readable to themselves, but it is extremely difficult for other programmers to read, since all they can see is a continuous sequence of 1s and 0s stuck together, unlike assembly language and hybrid or high-level programming languages, which use characters similar to human languages and can therefore be read by other programmers.
But it is a sequence of words. At least bytes. There are kinda linebreaks
I learned most of this stuff playing the Zachtronics game Shenzhen I/O. Kinda neat how well they seem to have built a learning tool out of a game.
Try the game "Turing Complete". It will open your mind.
Wouldn't it be more efficient to hold the number? Load the index to 1 instead, then decrement, add, increment, increment, write, and loop back to decrement.
Edit: Okay, I think I got it in 9.
01. Load #1
02. Store @0
03. Set Index #1
04. Store @Index
05. Dec Index
06. Add @Index
07. Inc Index
08. Inc Index
09. Jump to 04
I have spent many hours in TIS-100...
Yeah, it's definitely inefficient to overwrite the accumulator then access memory to get back the number you just overwrote.
If you're willing to start the sequence with f(0)=0, f(1)=1, and you can assume that the memory is initialised to 0, you could save one more line, at the cost of making your code less portable. Otherwise, I'm pretty sure 9 lines is the minimum - 6 for the loop (three moves, an add, a store and a jump) and 3 for initialising the accumulator, memory 0, and the index.
I wondered if someone had thought of it too. For esthetic reasons I would rearrange it to:
01. Set Index #0
02. Load #1
03. Store @Index
04. Inc Index
05. Store @Index
06. Dec Index
07. Add @Index
08. Inc Index
09. Jump to 04
In case the index register defaults to 0, the first step could be skipped.
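Quick sanity check of these listings, as a little C loop standing in for the imaginary machine (accumulator, index register, memory array; the instruction names are the made-up ones from this thread, and the loop bound is only there so the C version terminates):

#include <stdio.h>

int main(void) {
    int mem[16] = {0};
    int acc, index;

    acc = 1;                    /* 01. Load #1      */
    mem[0] = acc;               /* 02. Store @0     */
    index = 1;                  /* 03. Set Index #1 */
    while (index < 15) {        /* "Jump to 04" forever on the imaginary machine */
        mem[index] = acc;       /* 04. Store @Index */
        index--;                /* 05. Dec Index    */
        acc += mem[index];      /* 06. Add @Index   */
        index++;                /* 07. Inc Index    */
        index++;                /* 08. Inc Index    */
    }
    for (int i = 0; i < 15; i++)
        printf("%d ", mem[i]);  /* 1 1 2 3 5 8 13 21 34 55 89 144 233 377 610 */
    printf("\n");
    return 0;
}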
14:19 I remember writing C code using far pointers on a 386 processor and getting raindrops made up of English letters.
Great video, machine code makes more sense now!
Excellent video... Thank goodness for high level languages 😊
Great explanation!
there's a really neat indie game that kinda takes this concept of a little guy running around as the accumulator and gameifies it, Human Resource Machine, very worth checking out.
Very cool, always wanted to know this, always expected machine code to look like Aurebesh.
If we write a simple program in a high-level language (let's say a Fibonacci number generator in C++), is there a way to see the machine code version of that?
Yes, but GCC buries your code under 100 lines of boilerplate. For the Atari Jaguar the code needs to fit into 4kB. I think that John Carmack cut out the boilerplate while he sliced his Doom into 4kB slices.
Did Matt pay you to ask this question? 😂 There's an online tool to do exactly this called Compiler Explorer, known more commonly as 'godbolt' after its author Matt Godbolt, the presenter here. A fantastically useful tool for any software engineer. Can't post a link here, but search for godbolt and you'll find it.
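And if you'd rather do it locally than in the browser, plain gcc (or clang) plus binutils show both layers. With a tiny Fibonacci in a file, say fib.c (file name and code are just an example):

unsigned fib(unsigned n) {
    /* iterative Fibonacci: small enough that the generated code is easy to follow */
    unsigned a = 0, b = 1;
    while (n--) {
        unsigned t = a + b;
        a = b;
        b = t;
    }
    return a;
}

then "gcc -O2 -S fib.c -o fib.s" writes the generated assembly as text, and "gcc -O2 -c fib.c" followed by "objdump -d fib.o" disassembles the object file, so you can see the actual machine code bytes next to each mnemonic.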
A good follow-up to this video would be an explanation of protected memory and memory management units, i.e. explaining why you wouldn't put absolute addresses in JMPs and why modern computers will refuse to let you overwrite the memory where your program is stored (which was a common practice back in the day of 8-bit and even 16-bit home computers).
A quarter century or so ago, a friend of mine eventually tracked down a bug in his program - in his own words, it was "trying to paint the entire memory blue" (he'd implemented a flood fill without boundary conditions). Happily protected memory was already a thing by then, so he was just overwriting all the program's allocated data storage in memory, not the program's code, nor any other program's memory...
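If anyone wants to see that protection doing its job, here's a minimal sketch that deliberately crashes (the cast from a function pointer to a data pointer is non-portable, but on a typical desktop OS the write faults because code pages are mapped read-only):

#include <stdio.h>

static int answer(void) { return 42; }

int main(void) {
    printf("%d\n", answer());                      /* works: prints 42        */
    unsigned char *code = (unsigned char *)(void *)answer;
    *code = 0xC3;         /* try to overwrite the function's first byte...    */
    puts("not reached on most systems");           /* ...expect a segfault    */
    return 0;
}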
Would have to cover relocations in portable executables at the same time.
@@davidmcgill1000 I just use the segments given to me by DOS. I did not exceed 64kB of code. Data-heavy games. 8086 code is compact.
@@rmsgrey I pointed the ES to VGA. Hard to overwrite my code this way. Why, in 32-bit mode, can I still not access all 256kB of the VGA card in a flat manner? Or the 500kB of my ET3000?
I picked up an x86 assembly book a few years ago and read and did most examples, didn’t quite want to do floating point. Anyway back in the late 80s or early 90s I coded a program in 6502 assembly and reimplemented that in x86 in FreeDOS. It was much easier in 6502 as far as I recall
I think that's the book my primary school had still in 1999! I've wanted to find it for ages! Or one similar...
First of all, thank you, great video. In Germany I had a similar book in the 80s where I learned how computers "think" as well.
Second of all, please, what awesome clock/screen is that in the background? 14:04 for example...
I still have that book! found it a few years back while having a clean out :D
For the life of me, I can't think of anything in computer science that Matt Godbolt couldn't explain well!
@14:21 exactly this is what got me into computers at a young age. :)
I just started playing Turing Complete on Steam. Great little puzzle game that teaches you computer logic.
That was really interesting, thanks
I was intrigued by the box by the TV that was rotating its display - what was it?