As a kid, when I finally got a Final Cartridge III for my C64, I used the monitor to copy the Kernal ROM to memory and changed the READY prompt to my first name (also 5 bytes) and run it from memory.
Stuff like WozMon makes me feel so blessed as someone who writes "modern" code. Under 256 bytes to write all that magic and here I am using more than that to write "hello world" to the screen in any current higher level language...
I was never good at using breadboards. My hands are big, so handling small components without disturbing nearby wires and components was nearly impossible. And it was always maddening how easy it was to inadvertently loosen a connection. So I dumped the breadboard and learned how to design a PCB. Over the course of several weeks (months?) I had a PCB ordered and delivered. I finally got the third board built correctly and WOZMON finally worked. Seeing that backslash character for the first time... I guess everybody in here knows what that feels like. There are some minor issues with the PCB I have; the serial connector is too close to the edge of the board, and the silkscreen has incorrect information; the ROM and RAM labels are switched. (It took a while to figure that out.) I will fix that in the next version, and make a few other minor enhancements. Eventually, I will learn how to 3D print a 'case' for it. Plus, I want to get BASIC running on it. Ben: thank you for an outstanding video series. I have learned more watching these videos than I did in my digital hardware course in college!
As an EE still in school and wanting to enter the field of computer hardware design, your channel has truly become a treasure trove of knowledge. You built this 6502 computer from the ground up while incrementally explaining the history, theory, and design behind it. Truly spectacular work
I started writing programs in machine code for my Atari 800XL when I was about twelve. I can remember sitting down with one of my math teachers running little machine language programs on classroom Apple IIs. Whole applications in tens of bytes...
Thanks for keeping the 6502 alive! I'm currently working on a 6502-based fantasy console that hugs close to the 1980s specs, trying to walk the line between period accuracy and being convenient enough for modern devs to want to tinker with it. I think if we lose the knowledge of these old systems we'll lose the ability to create efficient and optimized programs. CPUs are insanely powerful compared to the 1980s/1990s chips, yet computers don't seem that much faster to the user, due to how absolutely inefficient modern applications tend to be. Keep pumping out awesome videos, it lives on my second monitor to keep me sane while coding!
Expectations will increase faster than any increase in clock cycles or transistor count. The fact that I can pull in a couple of libraries and parse a several megabyte executable in 20ms in code running on a browser served to me across the world should be enough evidence that performance and efficiency are absolutely still goals and available to anyone who cares even slightly, but that's not worth anything if it just means that you expect it to be doing even more in that time.
Modern programs are less efficient than hand written assembly programs were back in the day, but there's also no way anyone wants to hand write a program the size of Excel. Modern programming languages and frameworks give you a way to balance programmer effort and bug resistance against efficiency and speed. For instance, C# is usually a little slower than C++ doing equivalent operations because it includes more safety checks and automates memory handling. On the other hand, it's vastly easier to work with and drastically reduces the chance of memory bugs. It's always a tradeoff.
@@clonkex To be fair, modern C++ is also pretty good with memory handling. new and delete (or malloc/free) are code smells at this point. Even when using C libraries like SDL, you can use a unique_ptr and specify a custom deleter and never have to worry about freeing it. It'll call the destroy method as soon as it goes out of scope.
@@SimonBuchanNz Yeah, but then you have 80 GB shooter games that look pretty but have barely any more content than games 1/10th the size. And I'm not saying people shouldn't use C# or JavaScript. I'm saying that if you don't at least fundamentally understand what's happening under the hood, you're more prone to writing inefficient code. This is why so many C# devs do things like + strings together in tight loops and then wonder why their console app job takes 16 GB of memory to parse a 2 GB file.
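The string-concatenation trap described above can be sketched in Python (hypothetical helper names; the same reasoning applies to C#'s string vs. StringBuilder, since both languages use immutable strings):

```python
def concat_naive(parts):
    # Each + on an immutable string can copy the whole accumulator,
    # so total work grows roughly quadratically with output size.
    out = ""
    for p in parts:
        out = out + p
    return out

def concat_buffered(parts):
    # Collect the pieces and join once -- the moral equivalent of a
    # StringBuilder: one final allocation, linear total work.
    return "".join(parts)

parts = [f"line {i}\n" for i in range(1000)]
assert concat_naive(parts) == concat_buffered(parts)
```

Both produce identical output; the difference is only in how much intermediate memory gets churned along the way.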
@@LunaticEdit 1/10th the size? Try 1/1000th the size! Classic shooters like Duke Nukem 3D and the original Quake came in at under 100 meg. Doom is around the 15-meg mark, and Wolfenstein 3D and its prequel Spear of Destiny _combined_ total less than 10 meg. And just to prove it's still possible, Ion Fury is a modern throwback to Duke3D, using the same engine, and that takes up 91MB on my hard drive according to Steam.
Hi Ben, thanks for this video. I've made myself a small carrier board for a WD65C816 that attaches to my old Digilent Spartan3A fpga board. Once I had my UART working (in Verilog) I set about getting Wozmon assembled and inserted. Made a couple of oversights in my understanding how the code runs, but thanks to your video I have those corrected now. 👍 Cheers ! Bob.
This is very impressive. A precursor to the BASIC interpreter and system interface that wouldn't be available for another 3 years. Truly ahead of its time.
The cool thing, as you probably noticed, is that the format you get when you let Wozmon print a hexdump of memory is compatible with the format of the command to put something into memory. That was very intentional. In those days it was very common to use a Teletype (or another printing terminal with a paper tape punch and reader) not only as your main user interface but also as your storage medium. Other computers such as the KIM-1 did the same: whatever they would print would also be acceptable as input. And of course Microsoft BASIC also counted on you having a paper tape punch and reader to store your programs, at least in the beginning; the SAVE and LOAD commands were added later.
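That round-trip property can be modeled with a small Python sketch (hypothetical helper names; the "ADDR: bytes" row layout is only an approximation of real Wozmon output, not a byte-for-byte reproduction):

```python
def dump(mem, start, end):
    # Print memory the way a Wozmon-style monitor does:
    # one "ADDR: B0 B1 ... B7" row per 8 bytes.
    lines = []
    for row in range(start, end + 1, 8):
        data = " ".join(f"{mem[a]:02X}" for a in range(row, min(row + 8, end + 1)))
        lines.append(f"{row:04X}: {data}")
    return "\n".join(lines)

def deposit(mem, text):
    # The same "ADDR: bytes" lines are valid deposit commands,
    # so a printed dump can be fed straight back in as input.
    for line in text.splitlines():
        addr_part, _, byte_part = line.partition(":")
        addr = int(addr_part, 16)
        for i, b in enumerate(byte_part.split()):
            mem[addr + i] = int(b, 16)

mem = {a: (a * 37) & 0xFF for a in range(0x0300, 0x0310)}
text = dump(mem, 0x0300, 0x030F)

copy = {a: 0 for a in range(0x0300, 0x0310)}
deposit(copy, text)
assert copy == mem  # dump output round-trips losslessly as input
```

This is exactly why paper tape worked as storage: replaying the printed output reconstructs the memory contents.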
Woz is indeed a genius! Thanks for bringing this to light Ben. Over the last few years, I've realized how underrated Woz is and that he needs to be celebrated more than Jobs.
I remember some years ago I was wondering how a computer worked, and your videos gave me a very good idea. It built up to the point where we have an operating system. This is amazing; the only good answer to "how do computers work".
Awesome! Reprogramming it without flashing the eeprom (or even resetting it) is a huge milestone. For me, this is the point where your breadboard computer has become a real computer (for my arbitrary definition of real 😂)
Well, I tried it and it actually worked the first time!! I have learned more than I ever thought I would from this. Love the video and appreciate all the hard work that was put into this!! I am still fascinated with how the hardware and software work together to do stuff so quickly, even though 1 MHz is very slow these days. I have stepped through some programs with this "simple" CPU and realize that it must take a complete understanding of the computer to be able to write an assembler for it. Gives me a headache even thinking about that.
I watched a documentary on Woz, and it's remarkable not just that he did it -- but that he was always doing impossible stuff like that. -- He would build machines that had half the number of chips that should have been necessary, and write a third of the code that should have been required, and somehow it all worked. -- What a badass.
This brings back memories of when I built my Apple II by purchasing an exact copy of the motherboard and putting all standard components on it. I even managed to write some assembly code to replace keyboard control for Pac-Man by using the joystick as input. I enjoy watching your videos. Keep up the good work.
This is exactly what I needed. One thing I noticed is that I wasn't able to paste into MobaXterm. I'd paste in a line of bytes and it would be echoed back as just a few of the characters I had sent. It turns out that my terminal was sending the characters faster than the CPU could read them in and send back. I ended up having to set the baud rate to 9600 (#$1E loaded into ACIA_CTRL) and then set the transmit loop counter to #$C8 instead of #$FF. I don't know if C8 is the best value; it was trial and error, looking for a value that made the loop long enough to make sure a byte was transmitted and short enough that it was done fast enough to read the next byte being sent. I couldn't find such a value for 19200 bps, as it always missed a byte every 30-40 characters. Then I had an issue where I couldn't send more than one line, which it turns out was the CPU being busy writing the input buffer to memory and missing the new bytes being sent. I solved this by sending 20 space characters at the beginning of each line, which gave the CPU plenty of time to process the last line before getting the new line. With these changes, I'm able to paste in code and run it on Windows with MobaXterm.
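The trial-and-error counter value above lines up with a back-of-the-envelope calculation. A minimal Python sketch (assuming a 1 MHz 6502, 10 bits on the wire per character, and about 5 cycles per DEX/BNE-style delay iteration; all of these are assumptions, not values from the video):

```python
def loop_counter(cpu_hz, baud, bits_per_char=10, cycles_per_iter=5):
    # Cycles available while one character shifts out on the serial line,
    # divided by the cost of one iteration of the software delay loop.
    cycles_per_char = cpu_hz * bits_per_char / baud
    return round(cycles_per_char / cycles_per_iter)

# 1 MHz CPU at 9600 baud: about 208 iterations, close to the $C8 (200)
# the commenter arrived at by experiment.
n_9600 = loop_counter(1_000_000, 9600)

# At 19200 baud only ~104 iterations fit, leaving far less slack --
# consistent with bytes being dropped at that rate.
n_19200 = loop_counter(1_000_000, 19200)
```

The point is not the exact number but that the delay budget halves with each doubling of the baud rate, which is why 19200 bps had no workable value.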
This is a problem that often occurs with terminal programs on Windows. I used to configure switches using PuTTY (SSH), and you had to be careful not to paste too much data at a time, since the terminal would not always wait for the receiving end and data would be lost. This is never an issue using ssh from a Linux terminal. I'm not sure what is/was causing this behavior, whether it is a bug in the terminals themselves or some common library in Windows, but it sure was annoying.
Your deep understanding of what you are doing, and your confidence, are beautiful. Thank you for the amazing videos; keep them coming. You rock, sir. I am happy I am subscribed.
This is really a pivotal moment in the series. It's all coming together in a way that I feel like it hasn't in a while. I absolutely ate up the 8-bit series but found the 6502 less compelling up until this point. I've skipped a few videos but glad I watched this one! That might be what prompts me to get started on a breadboard project
I remember the times when I used to input code from magazines into our Sinclair ZX Spectrum; it was always so cool seeing what a few lines of code could do. Awesome video, took me back about 30+ years! Incredible what he did with ~250 bytes at that time. :)
Thank you so much for making these videos! I've already been programming for 3 years, but only now am I getting into the lower-level systems underneath it all, and all because of you!
Fascinating insight into the inner workings of the Apple 1, and a taste of what those hobbyist programmers were doing back in the mid-'70s. Thanks so much, Ben, for creating this video.
This is awesome. I've been wanting to work on a 6502 breadboard machine that can run NES ROMs but I have not had the time. This is a very cool project.
I did just that during COVID. I had a spare NES with a lot of damage to the motherboard; the case was already donated to a different NES to fix. I desoldered the CPU and PPU and put them on a breadboard.
A program that can examine and modify its own memory somehow seems... improper? Taboo, even? This has reached the point where it recognizably resembles "a computer", while being so low-level that it follows *completely* alien paradigms to me. I'm glad I've gotten to see this built from the absolute basics; it's given me a lot of new appreciation for all the things that came before my time!
Ben Eater deserves an award, lowkey. Everything from the 6502 series, to the SAP-1 series, to everything else, is perfect for learning computers. This is the type of stuff I literally would've struggled to learn in a classroom otherwise.
Thanks Ben, that was wonderful! I started out with an Apple IIe and didn't learn much about the Apple I. So seeing the details of the Apple 1 fills in a lot of gaps for me.
Cool! Recently I did something similar which was to convert an Apple II into an Apple 1. Cutting and pasting code into the WozMon is a lot easier than loading via cassette.
10:20 Beware: R after BLOCK XAM 0.A jumps to A, not to zero. It interprets the 00 at address 000A as BRK, reads the vector 0000 from address FFFE (ROM), and jumps to 0000. In general, when you want to run a program after BLOCK XAM ABCD.EFGH, you have to enter the address ABCD and then R.
I would posit that part of the reason Wozmon does so much at such a small size is that the raw-data level of what a computer does can be highly simplified, especially when you only have 2 or 3 hardware data standards to worry about. Highly intelligent, clever, and motivated design is obviously also part of it.
I am really enjoying this video series. You have a great teaching style. I have seen low-level programmable logic controllers that use a very similar hex-style programming interface. To see it built up from transistors and NAND gates is very interesting.
The first computers I ever got to use were the Apple ][ and Apple //e at school. Learning how to optimize my code was ingrained, because of the RAM limits and speed of those computers. Optimization was critical!
Let me first say thanks for sharing your video. And boy, have you brought back a lot of memories for me. I'm 73 years old and I can remember almost all of the stuff that you're doing. Love it, love it. I'm familiar with the 6502. WOW. I also worked on a 128 memory card. Just love that old Apple. Thanks for sharing.
I mean, the logical thinking involved in following along with the whole video (and the knowledge it takes) is frankly beyond what I have, but just for the interesting facts I can take away, and seeing that you're up to these crazy things, it's nice to check your stuff out on here from time to time and slam that like button. Cheers
Nice, really cool. You can easily debounce a button by applying a ceramic disc capacitor across the leads. The capacitor absorbs the small voltage fluctuations, and you can tune the sensitivity by choosing a higher or lower capacitance value. You should always reassemble and re-upload after any change, so it is reflected in the source code; also, before changing it, back up the source code. This seemingly simple step prevents devastating typos that can completely ruin a project. Also, keep a copy of every backup of the source code and number them 001, 002, etc., so you can revert to a stable build and prevent hating your life as much. Lol.
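Picking the capacitance comes down to the RC time constant. A minimal sketch (the 10k resistance and 1 uF value below are purely illustrative assumptions, and the simple first-order RC model ignores contact resistance and wiring details):

```python
def debounce_time_constant(r_ohms, c_farads):
    # tau = R * C. Mechanical contact bounce typically settles within
    # a few milliseconds, so aim for a time constant in that range:
    # larger C smooths more bounce but slows the perceived response.
    return r_ohms * c_farads

# e.g. a 10k pull-up with a 1 uF ceramic cap: tau = 10 ms
tau = debounce_time_constant(10_000, 1e-6)
```

Doubling the capacitance doubles tau, which is the "sensitivity" knob the comment describes.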
I took an embedded systems course and saw that 1008: AF 03 F3 B3 format so many times debugging my assembly code. I knew it was machine code and just thought it was there for legacy reasons. Seeing you input it into Wozmon directly, have it be loaded into memory like that, and then run it with a simple command felt like magic.
@@vicroc4 I get your point, but wozmon doesn't have a sliding window manager, preemptive multitasking, network connectivity, a filesystem, virtualization support. Yes our modern kernels are bigger....but it's not like we didn't get anything for it.
@@Felix-ve9hs pretty sad considering windows 95 minimum requirements are 386SX with 4MB (not GB) of RAM. And even that’s bloated compared to what came before.
@@captainharpoon Sure. But I'm talking about Windows, not a sensible OS that makes a firm distinction. There's so much needless crap integrated into the Windows kernel that the OS itself is largely implemented by it.
With some additional hardware this computer can be made compatible with the later Commodore PET, thus opening the door to even more software, including games like Attack of the PETSCII Robots.
I’m finally beginning to understand memory manipulation and hardware control. Now I look at the Apple 1 as a primitive yet strangely powerful Arduino/Pi.
It's amazing to think that this was part of the start of modern computers. Now we have a small computer that can transmit video and audio so we can see you. And all that is smaller than that computer.
"Now maybe you think it's a bit inconvenient to enter programs in hex machine code like this, normally we want to write a program in assembly" Riiiight, of course Ben - I struggle not doing stupid stuff in Python, I should definitely write assembly :D
It's almost insane how tiny a machine code program can be. I once programmed a machine code routine into the old CoCo (TRS-80 Color Computer) which was designed to do something which the CoCo could not do natively, which was to output text on its high-resolution graphics screen. Even with fancy controls built into the code, such as cursor positioning and the ability to redefine character patterns, the code still wound up being tiny, when compared to the bitmap for the 256 characters which it could print (I copied the IBM extended ASCII character set, so the bitmap occupied 2048 bytes). If the USR() call was given an integer, the lower 8 bits would be used to print whichever character had that ASCII code, but the real power came when the call was given a string. Characters from ASCII 32 (space) on up would be printed, but characters from ASCII 0 through 31 would be interpreted as control characters with the work-horse being ASCII 27 (Escape) which was the prefix for all of the more complicated commands such as "redefine character" and "repeat character x times." It's been a long time, so I don't remember exactly how long the code was, but I do remember laughing to myself when I realised how bulky the character bitmap was in comparison to the code which used it. If I had to guess, I'd say it was something like 170 bytes in length, or less.
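The control-character scheme described above (printables drawn directly, low codes as controls, ESC prefixing multi-byte commands) can be sketched in Python; the command numbering and handler names here are hypothetical, not the CoCo routine's actual codes:

```python
def render(stream, put_char, handlers):
    # Printable codes (32 and up) are drawn; codes 0-31 act as controls;
    # ESC (27) consumes the next byte as a sub-command, like the
    # "redefine character" and "repeat character" commands described.
    it = iter(stream)
    for code in it:
        if code >= 32:
            put_char(code)
        elif code == 27:
            cmd = next(it)
            handlers[cmd](it, put_char)
        elif code in handlers:
            handlers[code](it, put_char)

out = []

def repeat(it, put):
    # Hypothetical ESC sub-command 1: next two bytes are char and count.
    ch, n = next(it), next(it)
    for _ in range(n):
        put(ch)

# "Hi" followed by ESC, repeat-command, '!' (33) three times.
render([72, 105, 27, 1, 33, 3], out.append, {1: repeat})
assert out == [72, 105, 33, 33, 33]
```

The dispatcher itself stays tiny; as the comment notes, the bulk of such a routine ends up being the character bitmap data, not the code.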
I love pouring a cup of coffee and watching entire videos like these to remind myself that there is another universe where people understand things that my brain just shuts down immediately when examining. I will never understand how machine code or assembly code works .. but I still watch in awe and admiration of your intelligence and skills!
It’s like trying to build a full scale and completely operational car out of lego, one brick at a time. I did it professionally for a few years and it’s slow tedious work but when it finally runs it’s the greatest feeling in the world.
8 years ago, you explained how semiconductors work from a physics perspective. From there, you've incrementally built up one step at a time, and now you've reached the point where you're literally remaking the very first Apple computer from 1976. And you haven't skipped a single step or building block going all the way back to "this is how semiconductors work". My friend, you are amazing and deserve an award for computer science education. Here's to the next 8 years, at which point you'll likely have reached the point of explaining how Alienware computers can be optimized and improved for resale 😜
hell yes !
I'm very excited for the step into operating systems, that's the largest gap for be with regards to having a good understanding for computers
(I'm studying computer science; we had a course on low level stuff like Ben has shown in his breadboard 8-bit computer and this series, several on higher level programming and one on operating systems which was bad and I did not catch a lot)
I got sucked into his full-scale, 40-something-video series that started with, as you said, how a semiconductor works, then a transistor, a flip-flop, a logic gate, a memory cell, a clock, an adder, and on and on... now we've got graphics cards with VGA output and OG computers.
@@iuppiterzeus9663 The operating system set of videos on the 8-bit bread board series was my favorite part. I finally understood how a computer is able to flip information back and forth using the bus and basic control logic, and why each command takes different numbers of clock cycles, and how it's possible to run some command simultaneously, etc., etc.
Hope he does a Ternary computer at some point.
Ben's almost up to the stage where the device drivers are the human. Ahh, the good old days :') I'm misting up here, reminiscing :'}
I love it when they just put the entire code for the device right in the manual
Ben's ability to incrementally build to a satisfying and impressive result over a series of methodical steps, each of which are satisfying on their own, is a marvel.
This 6502 project is starting to feel like when the 8-bit breadboard computer project wrapped up with a discussion of Turing machines and the philosophy of computers. Reflecting back upon how each of these projects started makes these final results feel important and enlightening.
Thank you, Ben, for crafting educational and entertaining content that satisfies on so many levels.
One of the reasons I picked up your kit for the 8-bit breadboard computer is precisely for things like this. In college, we mostly dealt with the Motorola 68HC11 microcontroller/processor and I recall having a blast making a "Realtime clock" that displayed the time and date on a 16x2 LCD with a similar 256 byte limit. I remember staying up late before that project was due, cramming "set buttons" for the clock in addition to the required serial initialization.
This is exactly the kind of "is this thing working?" code I've been looking for to jump in to the 8-bit computer again.
Thank you Ben!
We had Woz as a guest speaker for our electrical engineering and computer science department's undergrad projects day awards dinner. Must have been early 2000's. (I was a professor back in the day.) A genuinely nice guy and proud geek. He talked in detail about the joy of cramming functionality into min cost hardware. A great night for the old nerds like me. I'm not sure the juniors and seniors he was talking to got the full context, but we had fun anyway.
In my opinion, Woz's "annus mirabilis" from March 1975 to April 1977, when at just 26 years old he designed, implemented, and launched the hardware, kernel, sound and color output, as well as a BASIC interpreter, for the Apple I and II, is one of history's greatest technological achievements. It was a tour de force of talent which is astounding to this day.
Don't forget the Disk ][, IMHO his greatest achievement! Compare it with the C64 disk drive, which is basically a second C64.
Looking at the code of Wozmon is incredible. I find the same amazing style when looking at his Integer Basic code. If you want to see genius, look no further.
I remember back in 1981 at school, a friend and I had a list of the coolest people ever (we were geeks without knowing what a geek was). Einstein was on the list, as well as George Lucas of course, and others, but top of the list was Steve Wozniak. It was only in later years that I finally understood the importance of Jobs, but for me Woz will always be #1.
@@mojoblues66 He put the Disk ][ together in record time too. The way he generated colour for the Apple ][ was mind bending too.
Actually, I remember listening to him in an interview, saying that he imagined so many times before how he would design and build that system, long before he had the means to purchase the components.
So, when the right time came for him, he just churned it all out.
@@mojoblues66 The C64 drive was total crap because they wanted it to be backwards compatible with the VIC-20.
Look at the disk interface for the Commodore PET; it has a much better design even though it is older than the C64 and VIC-20.
So nice to see Wozniak getting some of the recognition he deserves. It's truly alarming how even Apple themselves seems to have forgotten how essential he was to the success of the company.
Hot take: Woz is underrated, Jobs is overrated, and it's actually good the latter died of ligma.
My thoughts exactly, especially considering how radically different apple is today as a company.
@@crusaderanimation6967both of them were better than Tim Cook. Apple’s innovation died with them both.
Think of how important he was to the _industry._
@@Aeroshogun Steve is still alive!
Watching this in 2024, still amazed with the quality of work and passion you are putting in your videos! Thank you Ben!
This takes me back. I had an Apple ][+ and spent a great deal of time writing code in machine language. Floating-point BASIC was just too slow for some purposes, after all. Your demonstration of controlling the LED reminded me of making the built-in speaker go "tick" by reading (or writing) the I/O address to which it was mapped. I learned a lot by tinkering on that computer, but there were some gaps in my knowledge. These videos, and the computer kits, have filled in many of those gaps.
Thank you Ben for the stroll down memory lane, and for all the knowledge you've shared with us.
My year 10 teacher taught us assembly on the Apple ][. We had no assembler except pencil and paper. We made a musical keyboard, using NOPs to get the frequency. And then I built a small library of pixel operations for speed that I called from BASIC to build a platform game.
35 years later I visited him to thank him.
The Apple ][+ was my first "power" computer. I loved it.
@@goofyrulez7914 Same. I owe my career to that computer.
@@idjles Great teachers like that are undervalued, sadly. It's a shame. Good on you for letting your teacher know how much they mattered to you!
Remember call -151?
I’m starting to think you aren’t just haphazardly making breadboard computers, but actually have a bit of a plan.
It's really cool seeing someone tweaking Woz code to work on a different board. This really lets you see how microprocessors work, especially in the homebrew world. The Apple I came out when I was first learning to program so it's especially fun for me to see this kind of thing. Wow.
Now THAT is a Hackintosh
Man... I really take for granted how simple and intuitive modern electronics are. Imagine if computers were still like this - machine code only.
They still are, on the inside
@@thewhitefalcon8539 I mean yeah technically you are correct, but I mean the user experience - the only way to interact with the computer is to program it via machine code instead of having a kernel and UI to do the machine coding for you.
Just for the hell of it I recently learned to write machine code for the 6502 (apple II) over a year or so. I made a version of Nokia Snake. Getting it to work was literally the most excitement I've ever gotten from a computer.
If we still had to deal with machine code, we would have flying cars by now. Everyone nowadays carries a CPU in their pocket that is 1000x more powerful than an Apple I, and we only use it to access social networks and play games.
Simple? Modern machines are orders of magnitude more complicated.
I cannot overstate my amazement at how well you've put this whole thing together. It's hard to say something that hasn't been said already in these comments; it all lines up perfectly, but there was never a hint that you'd take it this far. You could have mapped your ROM to the lower half of the address space, but using the upper half lets you use Wozmon with less modification. You set up a serial interface, which lets you copy-paste into the 6502 computer. You've demonstrated that we are standing on the shoulders of giants as we use our computers and smartphones today. You keep me constantly reminded why I chose computers as my field of study. Thank you for this amazing series, Ben!
I just dropped that eeprom programmer, that I bought from your site, down 20 stairs and it is still working. Five star review from me
that's definitely a nokia eeprom programmer 😂
As a kid, when I finally got a Final Cartridge III for my C64, I used the monitor to copy the Kernal ROM to memory, changed the READY prompt to my first name (also 5 bytes), and ran it from memory.
Stuff like WozMon makes me feel so blessed as someone who writes "modern" code. Under 256 bytes to write all that magic and here I am using more than that to write "hello world" to the screen in any current higher level language...
I'm eternally grateful to these two channels: Ben Eater and Phill's Lab.
same
The moment of realization that you can copy-paste monitor commands over serial input was truly inspiring. It opens up immense possibilities.
I was never good at using breadboards. My hands are big, so handling small components without disturbing nearby wires and components was nearly impossible. And it was always maddening how easy it was to inadvertently loosen a connection. So I dumped the breadboard and learned how to design a PCB. Over the course of several weeks (months?) I had a PCB ordered and delivered. I finally got the third board built correctly and Wozmon finally worked. Seeing that backslash character for the first time... I guess everybody in here knows what that feels like. There are some minor issues with the PCB I have; the serial connector is too close to the edge of the board. And the silkscreen has incorrect information; the ROM and RAM labels are switched. (It took a while to figure that out.) I will fix that in the next version. And make a few other minor enhancements. Eventually, I will learn how to 3D print a 'case' for it. Plus - I want to get BASIC running on it. Ben: thank you for an outstanding video series. I have learned more watching these videos than I did in my digital hardware course in college!
As an EE still in school and wanting to enter the field of computer hardware design, your channel has truly become a treasure trove of knowledge. You built this 6502 computer from the ground up while incrementally explaining the history, theory, and design behind it. Truly spectacular work
I started writing programs in machine code for my Atari 800XL when I was about twelve. I can remember sitting down with one of my math teachers running little machine language programs on classroom Apple IIs. Whole applications in tens of bytes...
Thanks Ben, what a great start to a Saturday!
Thanks for keeping the 6502 alive! I'm currently working on a 6502-based fantasy console that hugs close to 1980s specs, trying to toe the line between period accuracy and being convenient enough for modern devs to want to tinker with it. I think if we lose the knowledge of these old systems, we'll lose the ability to create efficient and optimized programs. CPUs are insanely powerful compared to the 1980s/1990s chips, yet computers aren't that much faster to the user, due to how absolutely inefficient modern applications tend to be. Keep pumping out awesome videos; it lives on my second monitor to keep me sane while coding!
Expectations will increase faster than any increase in clock cycles or transistor count.
The fact that I can pull in a couple of libraries and parse a several-megabyte executable in 20 ms, in code running in a browser served to me from across the world, should be enough evidence that performance and efficiency are absolutely still goals, and available to anyone who cares even slightly. But that's not worth anything if it just means you expect it to be doing even more in that time.
Modern programs are less efficient than hand written assembly programs were back in the day, but there's also no way anyone wants to hand write a program the size of Excel. Modern programming languages and frameworks give you a way to balance programmer effort and bug resistance against efficiency and speed. For instance, C# is usually a little slower than C++ doing equivalent operations because it includes more safety checks and automates memory handling. On the other hand, it's vastly easier to work with and drastically reduces the chance of memory bugs. It's always a tradeoff.
@@clonkex To be fair, modern C++ is also pretty good with memory handling. new and delete (or malloc/free) are code smells at this point. Even when using C libraries like SDL, you can use a unique_ptr and specify a destructor method and never have to worry about freeing it. It'll call the destroy method as soon as it goes out of scope.
@@SimonBuchanNz Yeah, but then you have 80 GB shooter games that look pretty but have barely any more content than games 1/10th the size. And I'm not saying people shouldn't use C# or JavaScript. I'm saying that if you don't at least fundamentally understand what's happening under the hood, you're more prone to writing inefficient code. This is why so many C# devs do things like + strings together in tight loops and then wonder why their console app takes 16 GB of memory to parse a 2 GB file.
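The string-building pitfall called out in that comment can be sketched in Python (the same idea applies in C#): each `+` in the loop re-copies everything accumulated so far, making the job quadratic, while a single join stays linear. A toy illustration, not a benchmark:

```python
# Toy illustration of the pitfall above: repeated + re-copies the
# whole accumulated string on each iteration (quadratic), while a
# single join makes one pass (linear). Illustrative sketch only.

def concat_slow(parts):
    out = ""
    for p in parts:
        out = out + p          # re-copies the whole string each time
    return out

def concat_fast(parts):
    return "".join(parts)      # one pass over all parts

parts = [str(i) for i in range(1000)]
```

Both produce the same string; only the amount of copying differs, and the gap grows rapidly with input size.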
@@LunaticEdit 1/10th the size? Try 1/1000th the size! Classic shooters like Duke Nukem 3D and the original Quake came in at under 100 meg. Doom is around the 15-meg mark, and Wolfenstein 3D and its prequel Spear of Destiny _combined_ total less than 10 meg.
And just to prove it's still possible, Ion Fury is a modern throwback to Duke3D, using the same engine, and that takes up 91MB on my hard drive according to Steam.
Hi Ben, thanks for this video. I've made myself a small carrier board for a WD65C816 that attaches to my old Digilent Spartan3A fpga board. Once I had my UART working (in Verilog) I set about getting Wozmon assembled and inserted. Made a couple of oversights in my understanding how the code runs, but thanks to your video I have those corrected now. 👍 Cheers ! Bob.
This is very impressive. A precursor to the BASIC interpreter and system interface that wouldn't be available for another 3 years. Truly ahead of its time.
The cool thing as you probably noticed is that the format that you get when you let Woz Mon print a hexdump of memory, is compatible with the format of the command to put something into memory. That was very intentional. In those days it was very common to use a Teletype (or another printing terminal with a paper tape punch and reader) not only as your main user interface but also as your storage medium. Other computers such as the KIM-1 did the same: Whatever they would print, would also be acceptable as input. And of course Microsoft Basic also counted on you having a paper tape punch and reader to store your programs, at least in the beginning; the SAVE and LOAD commands were added later.
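That round-trip property can be sketched in a few lines of Python: a dump line in `ADDR: BYTE BYTE ...` form is itself a valid deposit command. These helper names are made up for illustration; they are not Woz's actual 256-byte code.

```python
# Sketch of the Wozmon round-trip property described above: a memory
# dump line is itself a valid deposit command, which is what made a
# Teletype printout usable as a storage medium. Hypothetical helpers.

def dump_line(mem, addr, count=8):
    """Format memory the way the monitor prints it, e.g. '0300: A9 00 AA'."""
    data = " ".join(f"{mem[addr + i]:02X}" for i in range(count))
    return f"{addr:04X}: {data}"

def deposit(mem, line):
    """Interpret an 'ADDR: BYTE BYTE ...' line as a deposit command."""
    addr_text, _, byte_text = line.partition(":")
    addr = int(addr_text, 16)
    for i, b in enumerate(byte_text.split()):
        mem[addr + i] = int(b, 16)

mem = dict(enumerate([0xA9, 0x00, 0xAA, 0x20, 0xEF, 0xFF, 0x00, 0x00], start=0x300))
line = dump_line(mem, 0x300)   # '0300: A9 00 AA 20 EF FF 00 00'
copy = {}
deposit(copy, line)            # feeding the dump back in reproduces memory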
Woz is indeed a genius! Thanks for bringing this to light Ben. Over the last few years, I've realized how underrated Woz is and that he needs to be celebrated more than Jobs.
I remember some years ago I was wondering how a computer worked, and your videos gave me a very good idea. It has built up to the point where we have an operating system. This is amazing, the only good answer to "how computers work".
Takes me back a life time of programming, I loved my 6502 game development time. Thank you for the trip through time.
you are a truly gifted teacher sir.
Awesome! Reprogramming it without flashing the eeprom (or even resetting it) is a huge milestone. For me, this is the point where your breadboard computer has become a real computer (for my arbitrary definition of real 😂)
Yeah this was the only video of this 6502 project where I forgot that it was running on a breadboard and not an actual computer.
Well I tried it and it actually worked the first time!! I have learned more than I ever thought I would from this. Love the video and appreciate all the hard work that was put into this!! I am still fascinated with how the hardware and software work together to do stuff so quickly, even though 1 MHz is very slow these days. I have stepped through some programs with this "simple" CPU and realize that it must take a complete understanding of the computer to be able to write an assembler for it. Gives me a headache even thinking about that.
I watched a documentary on Woz, and it's remarkable not just that he did it -- but that he was always doing impossible stuff like that. -- He would build machines that had half the number of chips that should have been necessary, and write a third of the code that should have been required, and somehow it all worked. -- What a badass.
I like the typing and error-correction scheme, so useful for a 1970s command line.
This brings back memories of when I built my Apple II by purchasing an exact copy of the motherboard and putting all standard components on it. I even managed to write some assembly code to replace keyboard control for PacMan with joystick input. I enjoy watching your videos. Keep up the good work.
Dude. Now I have to dig out my you-inspired 6502 build to see if I can make it run Wozmon... you're awesome man, hope you never stop.
This is exactly what I needed
One thing I noticed is that I wasn't able to paste into mobaxterm. I'd paste in a line of bytes and it would be echoed back as just a few of the characters I had sent. It turns out that my terminal was sending the characters faster than the CPU could read them in and send back. I ended up having to set the baud rate to 9600 (#$1E loaded into ACIA_CTRL) and then set the transmit loop counter to #$C8 instead of #$FF. I don't know if C8 is the best value, it was trial and error of looking for a value that made the loop long enough to make sure a byte was transmitted and short enough that it was done fast enough to read the next byte being sent. I couldn't find such a value for 19200bps, as it always missed a byte every 30-40 characters.
Then I had an issue where I couldn't send more than one line, which it turns out was the CPU being busy writing the input buffer to memory and missing the new bytes being sent. I solved this by sending 20 space characters at the beginning of each line which gave the CPU plenty of time to process the last line before getting the new line.
With these changes, I'm able to paste in code and run it on windows with mobaxterm.
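The workaround described in those comments can be sketched in Python: pad each pasted line with throwaway spaces and pause between lines so the 6502 has time to process the previous line before more bytes arrive. The send callback and the exact padding/delay values are illustrative assumptions, not a real serial driver.

```python
# Sketch of the pacing workaround above: pad each line with spaces
# (idle characters the CPU can afford to drop or absorb) and delay
# between lines so the receiver can drain its input buffer.
import time

def paced_paste(send, text, pad=" " * 20, delay=0.05):
    """Send text one line at a time, padded and paced for a slow receiver."""
    for line in text.splitlines():
        send(pad + line + "\r")   # leading spaces give the CPU idle time
        time.sleep(delay)         # let the previous line be processed

sent = []
paced_paste(sent.append, "300: A9 00\n302: AA", delay=0)
```

With a real serial port you would pass the port's write function as the send callback; terminal emulators that offer a "delay after each line pasted" setting are doing essentially the same thing.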
Thought the same thing, I've just been waiting for this.
This is a problem that often occurs with terminal programs in Windows. I used to configure switches using PuTTY (ssh), and you had to be careful not to paste too much data at a time, since the terminal would not always wait for the receiving end and data would be lost. This is never an issue using ssh from a Linux terminal. I'm not sure what is/was causing this behavior - if it is a bug in the terminals themselves or some common library in Windows - but it sure was annoying.
Terminal emulators often have a setting to add a few milliseconds after each line is pasted.
Thanks!
Amazing video! Thank you, Mr. Eater
You are a very good engineer, keep it up, great work 👍👍
Your deep understanding of what you are doing and your confidence are beautiful. Thank you for the amazing videos, keep them coming. You rock, sir. I am happy I am subscribed.
A really great channel for computer science; seems like the whole channel is a gift from god.
This is really a pivotal moment in the series. It's all coming together in a way that I feel like it hasn't in a while. I absolutely ate up the 8-bit series but found the 6502 less compelling up until this point. I've skipped a few videos but glad I watched this one! That might be what prompts me to get started on a breadboard project
I remember the times when I used to input code from magazines into our Sinclair Spectrum ZX, it was always so cool seeing what a few lines of code could do. Awesome video, took me back about 30+ years! Incredible what he did with ~250 bytes at that time. :)
Ben Eater is always cooking up something good, and once again it is some good content!!
Ben eater ate this one
BEN IS A GENIUS
HE'S SO CLEVER LIKE AN ALIEN
Thank you so much for making these videos! I've been programming for 3 years already, but only now am I getting into the lower-level systems underneath, and all because of you!
This is great! It demystifies a lot of things in a small amount of time. Thank you.
To follow along with your videos I made an emulator of the 6502 in rust and I wait everytime for your videos to continue building on it
Fascinating insight into the inner workings of the Apple 1, and a taste of what those hobbyist programmers were doing back in the mid-'70s. Thanks so much, Ben, for creating this video.
All 6502 code I remember was jam-packed and optimized like this, not only Wozniak's. If all you have is a few kilobytes, you start getting creative
This is awesome. I've been wanting to work on a 6502 breadboard machine that can run NES ROMs but I have not had the time. This is a very cool project.
Good luck implementing the PPU ;) Truly an advanced chip for video game machines of that time period.
I did just that during covid. I had a spare NES with a lot of damage to the motherboard, the case was already donated to a different NES to fix. I desoldered the cpu and ppu and put them on a breadboard.
a program that can examine and modify its own memory somehow seems... improper? Taboo, even?
this has reached the point where it recognizably resembles "a computer", while being so low-level that it follows *completely* alien paradigms to me. I'm glad i've gotten to see this built from the absolute basics, it's given me a lot of new appreciation for all the things that came before my time!
its like performing brain surgery on yourself on a tuesday afternoon
Ben Eater deserves an award lowkey.
Everything from the 6502 series, to the SAP-1 series, to everything else, is perfect for learning computers.
This is the type of stuff I literally would've struggled to learn in a classroom otherwise.
A good reminder of how brilliant Woz is.
Thanks Ben, that was wonderful!
I started out with an Apple IIe and didn't learn much about the Apple I.
So seeing the details of the Apple 1 fills in a lot of gaps for me.
Cool! Recently I did something similar which was to convert an Apple II into an Apple 1. Cutting and pasting code into the WozMon is a lot easier than loading via cassette.
10:20 Beware. R after BLOCK XAM 0.A jumps to A, not to zero. It interprets 00 at address 000A as BRK, reads vector 0000 from address FFFE (ROM) and jumps to 0000. In general, when you want to run program after BLOCK XAM ABCD.EFGH, you have to enter address ABCD and then R.
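That gotcha can be captured in a toy model: the monitor keeps a "last examined" pointer, and after a block examine like 0.A that pointer sits at the end of the range, so a bare R runs from A, not 0. This is a simplified Python sketch of the behavior, not the real Wozmon logic.

```python
# Toy model of the gotcha above: after a block examine LO.HI, the
# examine pointer ends at HI, so a bare R runs from HI. To run from
# the start, enter the start address first, then R. Not real Wozmon.

class MiniMon:
    def __init__(self):
        self.xam = 0                        # last examined address

    def command(self, text):
        if text.endswith("R"):
            start = text[:-1].strip()
            if start:                       # '300R' runs at 300
                self.xam = int(start, 16)
            return ("RUN", self.xam)        # bare 'R' runs at last address
        if "." in text:                     # block examine 'LO.HI'
            lo, hi = text.split(".")
            self.xam = int(hi, 16)          # pointer ends at HI
            return ("XAM", int(lo, 16), self.xam)
        self.xam = int(text, 16)            # plain examine
        return ("XAM", self.xam, self.xam)

mon = MiniMon()
mon.command("0.A")                         # examine 0 through A
assert mon.command("R") == ("RUN", 0xA)    # runs at A, not 0!
mon.command("0")                           # re-enter the start address...
assert mon.command("R") == ("RUN", 0x0)    # ...then R runs at 0
```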
I would posit that part of the reason Wozmon does so much in so little space is that the raw data level of what a computer does can be highly simplified, especially when you only have 2 or 3 hardware data standards to worry about. Highly intelligent, clever, and motivated design is obviously also part of it.
Once again, an amazing video as usual. Thanks a lot, man! :)
I am really enjoying this video series. You have a great teaching style. I have seen low level programable logic controls that use a very similar hex style programming interface. To see it be built up from transistors and NAND gates is very interesting.
Woz is an engineer's engineer. He's a great guy!
The first computers I ever got to use were the Apple ][ and Apple //e at school.
Learning how to optimize my code was ingrained, because of the RAM limits and speed of those computers. Optimization was critical!
Great video, thank you.
Really awesome video Ben, thank you for making and sharing 😍
Let me first say thanks for sharing your video. And boy, have you brought back a lot of memories for me. I'm 73 years old and I can remember most of the stuff that you're doing. Love it, love it, I'm familiar with the 6502. WOW. I also worked on a 128 memory card. Just love that old Apple. Thanks for sharing.
Love it. Makes you appreciate where it all began.
I mean, the logical thinking involved in following along the whole video (and the knowledge it takes) is regularly higher than what I have, but just for the interesting facts that i can take out and seeing that you're up to these crazy things makes it nice to check your things out on here from time to time and slam that like button. Cheers
Nice, really cool. You can easily debounce a button by applying a ceramic disc capacitor across the leads. The capacitor absorbs the small voltage fluctuations. You can determine the sensitivity by choosing a higher or lower capacitance value.
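A rough sanity check on that RC debounce suggestion: with a pull-up resistor R and a capacitor C across the button, the node settles with time constant tau = R*C, and bounces much shorter than a few time constants get absorbed. The 10k and 100 nF values below are illustrative assumptions, not taken from the video.

```python
# Rough arithmetic for the RC debounce idea above. The 10k pull-up
# and 100 nF capacitor are illustrative assumptions: tau = R*C, and
# the node reaches 99% of its final voltage in about 4.6 tau.
import math

def settle_time(r_ohms, c_farads, fraction=0.99):
    """Time for the RC node to reach `fraction` of its final voltage."""
    return -r_ohms * c_farads * math.log(1 - fraction)

tau = 10_000 * 100e-9              # 1 ms time constant
t99 = settle_time(10_000, 100e-9)  # ~4.6 ms, past typical switch bounce
```

Picking a larger C slows the response (less sensitive, more filtering); a smaller C does the opposite, which is the sensitivity trade-off the comment describes.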
You should always reassemble and reupload any change, so it is reflected in the source code, also before changing it, backup the source code. This seemingly simple step prevents devastating typos that can completely ruin a project.
Also, keep a copy of every backup of the source code and number them 001, 002, etc. So you can revert to a stable build and prevent hating your life as much. Lol.
I took an embedded systems course and saw that 1008: AF 03 F3 B3 format so many times debugging my assembly code. I knew it was machine code and just thought it was there for legacy reasons. Seeing you input it into Wozmon directly and have it be loaded into memory like that and then run with a simple command felt like magic.
I’ve always wondered how the Apple I worked, having started programming on the Apple ][ - thank you very much for sharing this with us!
It's amazing that 256 bytes is all it took to write a kernel.
Now you're lucky if Windows confines itself to 4 gigabytes. How far we've fallen.
@@vicroc4 I get your point, but wozmon doesn't have a sliding window manager, preemptive multitasking, network connectivity, a filesystem, virtualization support. Yes our modern kernels are bigger....but it's not like we didn't get anything for it.
@@vicroc4 kernel and os are not the same thing.
@@Felix-ve9hs pretty sad considering windows 95 minimum requirements are 386SX with 4MB (not GB) of RAM. And even that’s bloated compared to what came before.
@@captainharpoon Sure. But I'm talking about Windows, not a sensible OS that makes a firm distinction. There's so much needless crap integrated into the Windows kernel that the OS itself is largely implemented by it.
With some additional hardware this computer can be made compatible with the later Commodore PET, thus opening the door to even more software, including games like Attack of the PETSCII Robots.
bro went "TTL to Petscii"
That is a funny statement
@@the32bitguy Why is it funny?
superb video man, i love this channel, thank you
Another superb computer science video from Ben Eater.
That was nice. Good video. Clear and interesting.
I’m finally beginning to understand memory manipulation and hardware control. Now I look at the Apple 1 as a primitive yet strangely powerful Arduino/Pi.
What would be super interesting is to see a breakdown of the wozmon code and how it fits so much into so few bytes
Nice work
It's amazing to think that this was part of the start of modern computers. Now we have a small computer that can transmit video and audio so we can see you. And all that is smaller than that computer.
Woz is a genius, true, but I don't think you are too far behind him, Ben.
Always so easy to follow. Great pedagogy.
Fascinating! Thank you
I wish writing light and optimized programs was still popular
This is so fascinating! I love seeing these basic computer functions :)
"You find entering machine code by hand this way annoying"
Try using the front panel switches on something like the Altair 8800!
Great. Now that you have Wozmon, you can use it to write an assembler inside the computer!
Wow, thank you! (감사합니다)
You're my mentor. Thank you, Ben, for your work.
"Now maybe you think it's a bit inconvenient to enter programs in hex machine code like this, normally we want to write a program in assembly"
Riiiight, of course Ben - I struggle not doing stupid stuff in Python, I should definitely write assembly :D
It's almost insane how tiny a machine code program can be. I once programmed a machine code routine into the old CoCo (TRS-80 Color Computer) which was designed to do something which the CoCo could not do natively, which was to output text on its high-resolution graphics screen. Even with fancy controls built into the code, such as cursor positioning and the ability to redefine character patterns, the code still wound up being tiny, when compared to the bitmap for the 256 characters which it could print (I copied the IBM extended ASCII character set, so the bitmap occupied 2048 bytes).
If the USR() call was given an integer, the lower 8 bits would be used to print whichever character had that ASCII code, but the real power came when the call was given a string. Characters from ASCII 32 (space) on up would be printed, but characters from ASCII 0 through 31 would be interpreted as control characters with the work-horse being ASCII 27 (Escape) which was the prefix for all of the more complicated commands such as "redefine character" and "repeat character x times."
It's been a long time, so I don't remember exactly how long the code was, but I do remember laughing to myself when I realised how bulky the character bitmap was in comparison to the code which used it. If I had to guess, I'd say it was something like 170 bytes in length, or less.
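The size comparison in that story checks out with quick arithmetic: an 8x8, 1-bit-per-pixel font for all 256 extended-ASCII characters takes 2048 bytes, dwarfing a routine of roughly 170 bytes. The one-byte-per-row layout is the usual convention and an assumption here.

```python
# Quick arithmetic behind the comment above: the character bitmap
# dwarfs the routine that draws it. Assumes the common layout of
# one byte per 8-pixel row, eight rows per character.
chars = 256
bytes_per_char = 8                      # 8 rows x 8 pixels, 1 byte per row
bitmap_bytes = chars * bytes_per_char   # 2048 bytes, matching the comment
ratio = bitmap_bytes / 170              # bitmap is ~12x the guessed code size
```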
Ben, you are amazing!
As always, your vids are mind blowing. :D
damn Woz, what a legend!
Nice series. As always thank You for sharing
The improved Apple Monitor found on the Apple ][ not only has an assembler and disassembler, but it has a full 16-bit virtual machine called Sweet16.
Nice, I'm going to have to check that out. Neat!
This is amazing. Nuff said.
Awesome as usual. You're so different, Ben. Thank you so much for the rich, elite-class content you provide.
I love pouring a cup of coffee and watching entire videos like these to remind myself that there is another universe where people understand things that my brain just shuts down immediately when examining. I will never understand how machine code or assembly code works .. but I still watch in awe and admiration of your intelligence and skills!
it's only inputs and outputs with loops, conditions, counters
It’s like trying to build a full scale and completely operational car out of lego, one brick at a time. I did it professionally for a few years and it’s slow tedious work but when it finally runs it’s the greatest feeling in the world.