Yep, the computer pioneers of our era are beginning to go now. Dennis Ritchie, co-inventor of C, passed in 2019. There was somebody else that died recently, but I can't recall who it was.
@@huguia Typo. I knew he died a few years ago, so I looked it up, saw that it was in 2012, came back here, and promptly typed 2019. I'm such a dumbass sometimes.
Fantastic stuff. I graduated uni as an electrical engineer, but most of this low-level stuff was glossed over even in the digital hardware courses. These videos don't really teach me much new material per se, but what they do teach fills in the holes and gives me a lot of those "Ah hah!" and "Eureka!" moments where all the things just click into place, and really deepens the understanding of how these systems work and are built up physically. With this series specifically, I've had flashbacks to the very first embedded programming course in second semester, where we hooked an LCD up to an Arduino programmed with Atmel Studio to say "Hello, world!". Never understood exactly how the ports on the ATmega were addressed and accessed by the core of the microprocessor, until now. Good stuff. Merry Christmas, Ben :)
You’ve really summed up how I feel watching this. I graduated (MEng, EEE) 7 years ago; assembly was done in first year before quickly moving on to C. I remember assembly, I remember being taught how microprocessors work (I scored well on this stuff and software in general), but these videos just make it seem so much easier.
@@alexlovett1991 How is your career, if you don't mind me asking? I am studying EE and am wondering how easy getting a job was. I have experience programming in multiple languages, and I'm hoping that could help.
@@TheAcidicMolotov I don't know about where you are, but in the UK, as an EEE you shouldn't have a problem at all getting a job. Get your profile on LinkedIn and you should get recruiters messaging you; that's how I got my first job. After 6 months working I basically had recruiters contacting me daily (which is common for other software engineers I know).

I'd be wary about using the phrase 'I know language x'. It's best to be honest and say 'I've got n experience in x language', where n could be little, a reasonable amount, a lot, etc. If you claim to 'know' a programming language, interviewers might have a high expectation for the depth of knowledge. I've been doing Java for 8 years and I think I've got a very good knowledge of it, but I'm also aware there's a lot I still don't know about it, and if asked by someone about those topics I'd have to answer 'I don't know'.

It's also really worth getting some projects on GitHub or another public repo that you can add to your CV. Just having done multiple projects in your own time really helps show enthusiasm.
If you are interested in self learning, there's a library for the AVR ATMEGA called V-USB that bit bangs the USB protocol using IO pins. It also has example code for emulating a bunch of different HID devices. I made an arcade joystick using it a few years back -- back before you could just buy a pre-made board on Amazon for $12... 😑
Even after 10 years of being a programmer (I mean a professional one), you didn't know this one. So what went through your mind? In the meantime, I found 2019 to be a rather fascinating one, most satisfying. Would that make it doubly weird for you? :-)
@@tux1468 Usually I'd be mad about them ripping him off, but to be honest some of them are just as entertaining, but not from an educational point of view.
I have learned so much more from Ben than I did in college. Maybe it is because I am older and wiser, but I think it is because Ben is an amazing teacher. If you read this, Ben: please don't ever stop teaching on YouTube.
I love this sort of computing. It reminds me of my childhood in the 80s. Assembly code is fascinating to me, even if I'll never have a reason to use it. I would have loved to be involved in NES or SNES development, trying to use all of the registers and bits as efficiently and creatively as I could. The wild west of games.
I got the ELEGOO UNO R3 Project Most Complete Starter Kit, but I still haven't gotten very far in figuring anything out. That's why I'm watching a bunch of videos, but I'm still confused as heck.
I got that same kit for Christmas too! I've had a little experience with electronics, but nothing crazy: made some cool little projects like a joystick-controlled LED grid and a sound-controlled motor.
Go on their website and download the PDF they provide for your specific kit. From there, read the first four lessons, and if you need help finding the right parts, look at the little guide they put on the top of the parts box.
Hey Ben, just wanted to say thanks to you. I started my journey in electrical engineering in 2016 and got confronted with AVR assembly back in 2017. Scratching my head about toggling an LED in a course I attended, I started searching for a source of information on the internet and found your channel. This literally changed my life! Of course I scored an A in that exam, as well as in all other exams containing electronics. At the end of my bachelor's degree, I graduated as best student of the year. From there on, I started a master's in electrical engineering during covid with my handy-dandy YouTube at my side. What can I say... I also graduated with superb grades. What I want to say is: the first inspiration and steps during learning matter the most. Thank you Ben 😉
I'm learning so much. Holy cow. As a budding programmer, I have nothing but praise and appreciation for the free education this channel is providing. Thank you.
4:00 Thanks for giving an excellent short description of what is integrated into the package design. Typically you'd just hear "don't worry about it" and continue with the implementation. Great instruction!
Even on a modern CPU, for what this is doing, it's inefficient; on a system with even less memory, it's worse. There's also the issue that every time you want to go and change the output... OUCH! You have to copy-paste several lines of the same code, tweak them just a bit, and hope you don't screw up. It's good as a tech demo or proof of concept, hence why he said the next video will be about optimizing it. Though I do have a theory about all of what will happen... lol. I'm mainly watching this for fun, and also because I'm curious how you take a processor and build an actually useful device with it.

As for Chrome, or ANY modern browser [rather, application], tbh, yes, quite inefficient. It's now the rule and not the exception that web browsers suffer memory leaks. It starts as a trickle, and before you know it, you HAVE to restart the browser... >.< I use Pale Moon, myself. Never liked Chrome since my first experience; Firefox drove me away by taking away all the features and customization that I used it for in the first place.

The reasons honestly go far beyond the scope of developing a single application, though. It's decades of questionable practices, including so-called "rapid app development" and other "innovations" in the tech industry. I started my "serious" programming with these newer languages/tools, and tbh, there are multiple things I don't like. I watch videos like this, and about ROM hacking, and so on, and I just facepalm at what's being done today. I have so many ideas right now for projects/systems I'm slowly cobbling together that are pretty heavily inspired by older technologies/tricks [if not re-implementing them to some degree], or using low-level concepts. Or well, that's the plan.
Using a million other libraries or adding extra baggage is why I don't trust jQuery, Node/npm, and so on. JavaScript is unfortunately so utterly worthless on its own that you need libraries by default. I have yet to see anyone preaching the joys of HTML5 use the APIs right out of the box; they use engines that abstract the crap away. I've been building my own from scratch. This is a problem plugins didn't have, but "in the name of security" it was decided they'd be killed off.

I'm hoping to switch myself to full native or web + native development. I just also don't desire to touch such things as C++, either... *shudder* And I suspect my next obstacle is going to be these corporations, "for our security," going full gatekeeper via the online stores. Note the 'recommended' setting in Win10 is to ONLY install from the store. There are a lot of programs this can kill off, because there are so many reasons a program can't or WON'T appear in e-stores. And then it might even force others into needing to go through additional licensing programs [I suspect this is the future for hex editors and programs that have a reputation for being used in hacking, especially reversing proprietary formats and facilitating piracy/emulation].

Libraries aren't in themselves bad; the way they're used and maintained is, mainly the mainstream ones. Admittedly, I used to do this. jQuery's main use: $("some_element"). If this is what you need to do, document.getElementById("some_element") covers it [note this is discouraged]. Same with directly using XHR, with which you could cut AJAX out in 90% of all cases. Bootstrap, jQuery, and so on should only be used for sites like Twitter, YouTube, and so on [if even there, but Google did jQuery, Twitter did Bootstrap... small sites/apps don't need and shouldn't use them].

It's mainly executives who don't know anything about the company and the industry leading this push. Look at commercials where the only thing they can brag about is: "patented," "proprietary."
This just simply means they keep their methods secret and have gone through the [very expensive] process of securing the legal right to sue people for trying to copy their product/service/process. It speaks nothing for the quality of the product/service. This is also what I find funny about Norton/LifeLock: "with all these data breaches happening..." Why do these happen? Governments [the NSA caused a ransomware outbreak by not disclosing the exploits to computer companies, because THEY wanted to look for possible 'terrorists' via unethical, immoral, and ILLEGAL spying mechanisms]; databases get breached partially because executives don't want to pay people to properly fix them; the INTERNET OF THINGS [things that shouldn't be connected to the internet]; cloud computing; streaming [the latter two are ultimately DRM mechanisms, because it turned out the fatal weakness of DRM was, and always will be, client-side code/data; it's not security/convenience that's the reason, it's copyreich].

I'm personally waiting for the next big crash; it's inevitable this will happen in the next few decades. We're actually making things less and less compatible with each other, which will lead to a lot of data rot, FAST [like Flash being killed; at least the open source community is working on it]. We're connecting EVERYTHING to the internet, VASTLY increasing the attack surface for hackers to target. We're increasing bandwidth usage, and the infrastructure wasn't designed for so much traffic, nor was it properly upgraded in many cases. We're adding countless lines of tightly coupled dependency/overhead/boilerplate code and adding many extra layers to code management that don't belong there [oh, and writing macros to generate code/data that the COMPILERS should be generating automatically]. Everything is obsolete when it's released; nothing released is finished. W3C members and browser vendors are STILL having a pissing contest; this is why APNG failed and GIF is STILL in use.
[This is also why CSS3 suffered that prefixed-properties BS.] Proprietary code is still supported in open source bodies [EME, aka DRM for the web: those glorious "champions of internet freedom," Mozilla, actually switched their stance to endorse/support it]. Something's gotta give... [Oh, and programmers who actually care or know how to do things are quitting/dying, assuming the 'supervisor' or an 'O' aren't telling them to basically not do their job, because it costs money.]

I'm hoping my own code base will avoid some of this, since I'm trying to axe third-party dependencies as much as possible. Even then, there's a lot I can't avoid, like the internet globally suffering an outage, or the possibility/inevitability that M$/Crapple are gonna tighten the noose around developers' necks, since as it stands, I [and many others] can write and publish code/software which they don't have the ability to control at present. [End users, likewise, can still download/install/execute/[de]compile that code.]

Welp... another text wall... Never thought programming and tech, generally, could be so... political... >.< I just want to make + play games, dammit!
Yes and no. The HTML5 Canvas/WebGL and WebAudio APIs are absolutely miserable if you don't have a library. Even if it's an in-house solution or lightweight, it's still a library/engine/framework, and it's still basically necessitated because you're dealing with simple state machines that don't have any concept of what an actual program [especially a game/animation] needs. A 2D game needs a concept of tiles/sprites, which requires some additional systems and data structures. A 3D one needs models, skeletons, possibly inverse kinematics, [shaders,] etc. If you implement that wrong, performance is bad; you could even potentially cause damage/stress to the machine. [Something people don't seem to like mentioning about GL is that you really shouldn't touch it if you don't know what you're doing, just as it's not technically necessary even for 3D, up to a certain point.] All games need asset management and audio playback + audio looping. HTML5 doesn't do this out of the box. Neither does C/C++ and so on, but we already have libraries [and... MASSIVE engines/frameworks] to deal with this for native applications and... mobile/console.

The web had Shockwave, Java, Flash, and Unity as the main ones. They each had their use, and I'd argue they still have benefits if people can do ANYTHING besides freak out about code exploits and "proprietary black boxes." More so Flash. Shockwave's plugin is horrible; I hated dealing with it. But some of my fav games need it. Unity is still a mess post-WebGL [especially as an impenetrable proprietary black box with code doing god knows what]. Java applets were neutered, and I don't like the "everything is a class" blueprint-oriented nightmare Java helped create. Flash's format, at least, is honestly great if you consider bandwidth and flexibility/extensibility. Parts are a nightmare to parse because that's how serious Adobe/Macromedia were about making the files SMALL.
Coupled with the AVM, you had pretty much the only general-purpose game engine that's arguably useful, ONCE you figure out how to use it. You don't have to monkey-patch Flash or add loads of boilerplate to make even a simple/'mediocre' game. You can, and a lot of modern Flash games do run entire frameworks behind the scenes like Box2D or Flixel, but you never NEEDED to. A lot of the limitations with Flash are also imposed; now that the format and AVM are documented so thoroughly, it's not out of the question to bypass the limitations or improve the parts that don't work. Just as an asset format, I wish SWF, or something like it, had stayed. That's one of my projects, but it's time-consuming and tedious, and I'm spread too thin.

I probably sound like an Adobe shill, but I've long since lost my respect for them as a company. Currently I don't plan to collaborate with them or any other large organization, since I don't trust them. I also don't expect Adobe to do anything more than allow it to happen, like they already have with SWFmill, FFDEC [which SourceTec/Sothink hate! (:< ], Haxe/FlashDevelop, Flashpoint, etc. Anyway, I just feel the SWF format itself has far from outlived its usefulness. Regardless, it should be a user CHOICE to have Flash as much as it is to not have it. But the "open web" community has demonstrated they're not entirely open [Encrypted Media Extensions!!!] and aren't open-MINDED ["Flash needs to die, Flash users are stupid, Flash developers need to adapt or die with Adobe, despite themselves doing NOTHING WRONG"]. There are other things, but I can no longer find the sources [suspicious!], plus I don't really want to drag THAT discussion on much longer. I'm worrying about things I shouldn't have to, and don't want to, which is ANOTHER reason good and potentially good developers are LEAVING the field, especially veterans and disillusioned newcomers. I am currently trying to transition to Haxe.
I'm cautiously optimistic Jonathan Blow actually achieves all or most of his promised goals, because:
- I like JS for small little macros [converting hex, Caesar ciphers, countdowns, and so on], not full-fledged programs
- C is still a bit arcane
- assembly isn't feasible unless I target a retro platform where asm programming is decently tolerable
- ActionScript is basically dead
- C++ is NOPE
- Python isn't really supposed to run large applications, and the whitespace sensitivity will hang you quickly
- Lua needs an interpreter if you plan to USE it
- Rust just didn't quite impress me
- Java is too strict and the devtools keep ADDING LAYERS [AND NEED JAVA INSTALLED, which, while I have it, is outdated and I can't seem to update]
- Haxe still hasn't really impressed me, but it's better than manually writing JS, and I CAN compile to native applications or the Neko VM at some point.

As for npm and all: yep... companies, and the desire to get hired by said companies. I actually DON'T want to be. Freelance or something? Fine... A chain of command consisting of people who should not be considered 'qualified' to make decisions? NOPE... [which tbh is just ONE reason]. I've seen enough blogs and portfolios. Same reason people remove their older works: "silly me, I was so dumb back then!" The code style and design choices? Sure, I'm ashamed of SOME of my stuff in those regards. But you know what? I'm proud I can say I did it at all, that I completed it. This has some not-nice code and uses... jQuery: crystalien-redux.com/unrelated/ETX/apps/web/marsTrans/MarsTrans.html. But I can say I FINISHED it, and that it does EVERYTHING I wanted it to [and it still works to this day!]. I might update it in the future, I might not, and two big changes will be externalizing the messages array to a JSON database [at least converting it], and trying to pull jQuery + jScrollPane OUT. I settled for that at the time because CSS3 was giving me problems [THANKS, BROWSER VENDORS AND W3C!]. Point is, it means more to me that this is done.
If a potential employer cannot understand that, if they cannot understand the particular problems I had to solve to make this work, then they don't deserve me, and they aren't qualified to hire employees. Programming, engineering, and science are about developing practical solutions to actual problems, finding the sometimes elusive answers to important questions. Companies and standards organizations have lost sight of this, obviously. Infomercials show it nicely: the insane majority of products are things you just don't need, where you could solve the supposed problem with your existing products ["special tray to cook your bacon in the microwave!"] and common sense. A handful are practical, but MOST aren't. Same goes for frameworks and some of these "features" [like 'the cloud'] they added to software.

I lost it with the M$ Solitaire when they started bragging about the "innovations"... There are CLOUD-BASED games that UP FRONT let you export your data to A FILE and vice versa. And tell me why I need to watch ads on a card game that I could easily play in meatspace for basically free [cards are cheap, much cheaper in the long run than subscribing to the premium/VIP version] if I wanted? Their only true innovation was the extra modes. Even then... pft... I could buy like 4-8 decks of cards and play HUNDREDS AND THOUSANDS OF GAMES. Didn't need shiny new graphics, didn't want ads, and I fail to see the save data for that game weighing in at even a KB, easily stored in a plain .txt file or copied into a text/email. >.<

Well, more time basically wasted ranting about this... sigh... However, the site I'm trying to use is suspiciously not loading at normal speed, so I can't play my crappy casual games anyway... >.>
Engine/framework/library are a bit ambiguous at times. Well, accessing the HTML DOM is a simple/small action, and jQuery's most famous use is to do exactly that. Again, for a very specific type of project, it makes sense to bring those frameworks in. But tbh, decoders don't need anything sufficiently advanced either. Reading a PNG could be done with DataStream.js [makes DataViews easier to work with] and zlib.js, and then the PNG-specific bits can be written on top of both of those. File parsing also usually is the reading of an array, or several... XD. I've reversed or partially reversed a few formats; seen arrays and other tabular data a LOT. Chunk-container types are also surprisingly common.

Libraries using libraries using libraries is one main reason I don't use npm, though. I do understand how SOME of it happens: there are a few libraries I want to use in my own, or have the option to, and I fear some wrapper objects/instances might need to be used... BLEK! I'm not saying this is the most common reason; 90% is definitely "look, I made an npm library, you should hire me!"
On my computer:
- In C++: 8 kB
- In Rust: 430 kB
- In naked assembly: 1 kB
Even the .class in Java (which doesn't include the runtime, so it's cheating) is 417 B. OK, this is comparing apples and oranges, but it's still interesting.
Thank you Benjamin! These videos encourage and motivate me to improve my knowledge of electronics and pursue my business desire/passion to become a hardware manufacturer!
I find it to be a very special gift that you would upload on Christmas Eve. The Tech DIY community is very lucky to have you. I got your main kit under the Christmas tree, along with a signal generator. As a college student, I find this community to be very important in my life. I hope you had a Merry Christmas, Ben, and God bless you and your family.
Certainly faster pushing 8 lines of pixels from font memory to the display, and doing all the cursor bookkeeping, in the time the CPU takes to output one byte and pulse the E line.
Thanks for this great series. I'm a long-time 6502 coder, but I've been wanting to get into circuit building for years, and this video presents it all so simply and straightforwardly.
I got this running on a PCB. I got annoyed with wires coming loose on a breadboard and finally abandoned breadboards. So I figured out how to use KiCad to create the schematic and PCB layout. Learned to solder. Put it together. Had a few hardware issues. But figured all that out. Seeing “hello world” for the first time never felt so good. Thank you Ben!
Lots of impressive stuff here, but one of the most impressive is how you never make any typos! (forget a paren, sure, but I don't think I've seen you type backspace in this whole series!). Thanks for these videos, they are great.
Getting VASM to work on my Windows computer was a steep learning curve for a computer novice like myself. I was able to get it to work, although I'm not sure I'm using it correctly. What's amazing is how many clock cycles it takes just to get an "H" to print on the screen. How we have gone from this to virtual reality is truly amazing! Great video!!
I know where you're going, but it might be fun to connect the R/!W, RS and E line directly to the 6502 in the future (with a simple address decoder of course). That would make it possible to just say Hello World by writing the command bytes to one address and the message characters to another address, without the hassle of toggling the control bits in the program.
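As a sketch of what that could look like (the addresses here are hypothetical and depend entirely on your decoder; this assumes RS wired to A0, with E and the chip select generated in hardware from the clock and address lines):

```asm
LCD_CMD  = $7000   ; hypothetical: decoder selects the LCD, A0=0 so RS=0
LCD_DATA = $7001   ; hypothetical: A0=1 so RS=1

  lda #%00111000   ; function set: 8-bit mode, 2 lines, 5x8 font
  sta LCD_CMD      ; write to the command address
  lda #"H"
  sta LCD_DATA     ; write to the data address; the bus cycle itself pulses E
```

The whole RS/E bookkeeping disappears from the software because the address decoding and clock do it for you.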
If I understood the display datasheet correctly, which in part is just gibberish to me so there's that... the display can run as fast as 350 kHz on the slowest functions, so even 1 MHz might be pushing it. With components rated at 14 MHz, I think the VIA ends up the easiest option, especially if you need to read from the LCD.
I'm a little late to the party, but in case anyone is interested... I did this and so far it works! And yes, it was fun to do! The E line needs a NOR gate with one input to an address select line (active low; I used a 74LS138 as the address decoder) and the other input to PHI1out. I connected the RS line to A0. Right now, I can't test higher than 165 kHz; I need to un-spaghettify my clock module first. The spec sheet says "... high speed MPU bus interface - 2 MHz (when VCC = 5V)". And there should be a way to generate wait states for higher CPU clocks.
You could slap it directly on the bus in this arrangement, but it would interfere with other stuff you *may* want to attach to that bus later on. The CIA/VIA should be your interface to any peripherals.
Wow, very retro. I played with the 6502 & 65C802 CPUs back in the '80s. I built a logic analyzer based on the 65C802 and used two 2x40 LCD panels for the UI. All written in assembler. The project was done on S100 prototyping boards using the wire-wrap method. It was lots of fun. Later on I was using the Intel 8031 family of controllers.
Played with a 6502-based Heathkit during my college days over 30 years ago. Would have been way more interesting if we had someone like Mr. Eater as our lecturer.
Following along with the kits and videos, and spent a while trying to debug why my LCD wasn't working. While poking at wires, none of them stood out as an obvious culprit for 'loose wire not connected' etc. Turns out I think I just had my clock set so slow it was taking like a minute to fully initialize the LCD and put text on the screen. DOH! Still, such a great feeling to see it running. Thanks for the great videos and kits!
You could stick a ribbon/strip of plastic under the EEPROM chip to give it "handles" to pull it up evenly. Or snap the chip into a sacrificial socket and pry that out each time; it might prolong the life of the chip. Or even use a programming jumper cable that jacks into the breadboard? Just a thought. I held my breath every time you popped it out!
There's also "chip pullers", specific tools for that; tweezers meant to pull chips evenly on each side. Your plan is a great workaround if you don't have the tools though!
You can see all the details about how a computer works. I've worked more than 30 years in computer hardware design and software techniques, so I can say: these videos are a really good job.
Ben, you must release this series faster, because I'm starting my assembly course next semester as part of my electrical engineering program ;). Great video as always!
I love the fact you're selling kits on your site. Not likely to get the 6502 one, but the variable timer seems very useful. And I need a programmer; last time I tried to get one, all I got was a USB cable. XD
Great video. For anyone who wants a 'second pass' at understanding how these things work, the 8-Bit Guy did a pretty good video on this, from a different perspective.
I guess time and repetition work, because I am actually starting to understand most (or some) of the content now. Thanks for these walkthroughs! I'll continue to watch into the new year and beyond, buddy!
You know what's worse than 177 lines for a hello world? 1k lines for a hello world. I wrote an OS and by the time I'd got the keyboard and video drivers working + bootloader + a bunch of other stuff it came out to be ~1100 lines of C and assembly.
Wow! What discipline do you study? I am a student pursuing my Bachelor of Science in Electrical Engineering, but I'm also interested in computer science and native-level coding.
Best thing about his videos is that he's literally reading the manual! So many people ask stupid questions without even trying to read the documentation. Here you can see: smart people aren't smart from birth, they read, understand and learn!
Next video is probably one I would have thought you'd do a while back: Add a RAM chip so you can have a stack, and put most of that code into subroutines.
I was thinking the same: why not use subroutines? But then I realized it needs a stack and therefore RAM :) Well, he still has 32k of unused address space. I like where this is leading.
Oh, that's why he's not using those... Though you don't really need a stack or RAM to do that... just store your program counter in a free register (I guess there is a B register or something), then do all your subroutine-y stuff and write the program counter back in software?!
urugulu True. Though I am not sure if JSR/RTS could work like that; I should read up on it. My assumption is that it needs a block of 256 bytes at address $0100 it can read/write for the stack.
@@urugulu1656 Nope. 6502 has three 8-bit registers: A, X, and Y. A subroutine like one to display a string would need to make use of A and at least one of X and Y. And, to get to the program counter, you would need to jump to a subroutine and pull it off the stack anyway.
Too bad they never made the 6532 chip in a static CMOS version. If there had been one, it could have been used in this project instead of the 65C22, and it would provide the same two 8-bit I/O ports, plus 128 bytes of RAM, which would be enough for a stack (plus a bonus timer). This way the number of components could be minimized by avoiding a separate RAM chip. It's still doable to use an NMOS 6532, but you won't be able to stop the clock or run it slowly, because that's not a static chip.
This is so cool... so much documentation you see on using these displays consists of "just buy our SPI / ATMega328 interface and you won't need to know how the LCD works".... great to see someone talking about actually how to interface to the displays. I love the "wobbly" LED output... it's like a little "caterpillar animation"
Hmm, to make the code easier to read and more efficient (space-wise at least), you can just make a new label that takes data input (either from memory or from a register) and then prints it on the screen:

```
.printc:
  ; print code
  ; set a to "H"
  jmp .printc  ; etc
```

And then you can make a loop that iterates over some memory and passes it to .printc, and you can define a string in memory to be read by the loop. I know in NASM you would use db, but probably not in vasm.
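For what it's worth, a stack-free version of that loop might look roughly like this in the 6502 style used in the series. This is a sketch: it assumes the PORTB/PORTA/RS/E definitions from the video, and that the assembler supports a zero-terminated string directive like .asciiz; note each character has to be reloaded via the X index because toggling E trashes A:

```asm
  ldx #0
print_loop:
  lda message,x    ; load the next character
  beq done         ; zero terminator ends the string
  sta PORTB        ; put the character on the LCD data lines
  lda #RS          ; RS high, R/W and E low
  sta PORTA
  lda #(RS | E)    ; pulse E to latch the character
  sta PORTA
  lda #RS          ; E back low
  sta PORTA
  inx
  jmp print_loop
done:

message: .asciiz "Hello, world!"
```

No JSR/RTS needed, so it works even before any RAM is on the bus.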
Worse, there isn't anything you can put at the end of .printc without a stack, which needs RAM, which could also be what he's referring to (but that needs a stack pointer, which is another register anyway).
Simon Buchan He has registers (they're part of the CPU), but he has no stack currently, so if the SP changes nothing really happens, and he can't call procedures.
Look at the speed at which the characters appear. I haven't watched the preceding videos right now, but I would be very surprised if he clocked this CPU at more than 10 Hz. :-) That's why he doesn't need delays right now, but of course he will need them once he increases the CPU frequency. But: even Rome was not built in one day. This is one problem in education: there is only so much information you can put in one lesson before it goes over the head of the student.
I would like to have seen you start the LCD off as a direct replacement for the LEDs, other than the extra pins you need to make it run (like "enable"), before changing your program, to see what that would have looked like. It would also be interesting to see a raw LCD replace the LEDs, meaning you would have to code each pixel manually... like you did with the LEDs.
Thanks to these videos I understand the basic principles. It felt like I'd spent years just asking myself how these things actually work. Now I know. I hope to get a kit and ultimately make it run space invaders on a black and white led matrix.
I would love to see you use the video card you made, and make an audio card to use with the 6502 computer. Also, love your videos; they have been an enormous inspiration in my journey of learning digital electronics. Thank you for sharing your work with us!
It's 04:30 am and I still can't fall asleep because the videos are so addicting! For instance, I had no idea how strongly the assembly program depends on the actual hardware setup. Emulating a certain device is relatively simple, but all these interactions between devices via their specific interfaces were something I had a huge gap in. I mean, this assembly code to print a single letter on an LCD literally involves writing an LCD driver (I mean controlling the LCD's behavior), and this is so cool! Ben, are you aware of anyone who actually made an emulator of your computer? I already want to write my own! THANK YOU FOR THIS AMAZING SERIES!
Just started the 1st video in the playlist for fun and now I am genuinely curious to try this myself. PS: I am a computer science student. I love your videos, and they're the best gift.
Why do you do a bitwise OR with the Register Select bit and the Enable bit? Won't the Register Select bit already be 1 during that operation? Edit:. Never mind I understand, didn't realize the Enable bit would turn the Register Select bit off. Love your videos, Cheers
LDA is replacing the value of A, so you need it to be RS, then both RS and E, then only RS again. Likewise, STA is setting all 8 bits of the output to the value of A.
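To make that concrete, here is a rough sketch of the sequence being described (port addresses and bit masks as Ben defines them in the video; treat the exact values as an illustration of the pattern, not a verified listing):

```asm
PORTB = $6000        ; data lines to the LCD
PORTA = $6001        ; control lines: E, RW, RS
E  = %10000000
RW = %01000000
RS = %00100000

  lda #"H"           ; character to send
  sta PORTB          ; put it on the LCD data pins
  lda #RS            ; RS high (data register), RW and E low
  sta PORTA
  lda #(RS | E)      ; keep RS high AND raise E -- hence the OR
  sta PORTA          ; the E pulse latches the data into the LCD
  lda #RS            ; drop E again, RS still high
  sta PORTA
```

Because LDA overwrites all eight bits of A, each write has to re-state RS; a plain `lda #E` would clear RS in the same write that raises E, which is exactly the effect the OR avoids.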
NEVER IN MY 20 YEARS OF LIFE DID I THINK THAT THIS TYPE OF CONTENT WOULD BE THE ONE I WOULD LOVE. I BINGE-WATCHED THE SERIES ONCE AND I'M WATCHING IT AGAIN TO TAKE NOTES AND LEARN THE DETAILS, AS WELL AS DOING MY OWN RESEARCH. IT'S AWESOME
Thank you Mr. Eater! Great video and I'm learning so much. I'd like to get to the point of writing my own programs for this machine, and you've set up a great foundation to do that. Thank you, I can't wait for the next video.
If your clock rate is only a MHz or two, you don't need a 65C22 to interface with one. Instead, hook it up directly to the 6502 data bus and use address decoding to memory-map it.
Very thankful for these and your other instructional videos. Such an inspiration! I'm halfway through the 8-bit stuff, and have finished this project too, with some modifications and additions. Merry Christmas! 😊🎄
Finally, a series that explains all the logical steps of using microcontrollers. Great work there, Ben. Hopefully you have some expansion modules in mind, like the 6551 for serial input/output; that could be cool.
There aren't really "functions" on the 6502! There is JSR (Jump to Subroutine), which stores a copy of the program counter on the stack and jumps to the address supplied; and there is RTS (ReTurn from Subroutine), which reads an address off the top of the stack and jumps to there. You can JSR to somewhere in memory, do a bunch of instructions and then RTS back to where you came from. Just to make it interesting, the PC is actually pointing to the last byte of the JSR instruction, which is the high byte of the address, when it gets copied onto the stack. But when that address gets copied back into the PC, the PC gets increased anyway at the end of every instruction, even RTS; so it's then correctly pointing to the (first byte of the) next instruction after the JSR. So what you might do is have a subroutine that pushes a copy of A onto the stack, and writes it to PORTB; then sets RS, R/W and toggles E on and off, trashing the original value of A; and lastly pulls A back off the top of the stack and does an RTS. Then we can just LDA with the character code and use a single JSR instruction to send it to the LCD module. And that's easily turned into a loop, using the X or Y index register. But first we need some RAM to implement a stack!
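As a sketch of that idea (assuming the same PORTA/PORTB addresses and control-bit masks from the video, and that RAM is wired in so the stack page at $0100 actually works — the label `print_char` is made up for illustration):

```asm
print_char:
  pha                ; save the character on the stack
  sta PORTB          ; put it on the LCD data lines
  lda #RS            ; RS high, RW and E low (trashes A)
  sta PORTA
  lda #(RS | E)      ; pulse E high...
  sta PORTA
  lda #RS            ; ...and low again
  sta PORTA
  pla                ; restore the character
  rts                ; return to the byte after the JSR

  ; caller:
  lda #"H"
  jsr print_char     ; one instruction per character sent
```

The PHA/PLA pair is what makes the subroutine reusable in a loop: A comes back unchanged, so an indexed `lda message,x` / `jsr print_char` loop follows naturally once RAM is in place.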
bluerizlagirl I wouldn't even use subroutines; just create two macros, one to send an instruction and one to send data, use the instruction macro three times, and loop over a string to send the actual message. No RAM needed, no subroutines needed.
@oH well,lord! A macro is just a block of code you can insert over and over again. Instead of copying and pasting, you define a macro consisting of the code you want to copy and insert that wherever you want to paste it. The computer then copies it and pastes it for you, possibly even making substitutions.
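A macro version might look something like the following (written from memory in vasm-oldstyle-ish syntax — check the vasm manual for the exact macro directive form; the name `lcd_data` and the use of `\1` for the first argument are illustrative):

```asm
  macro lcd_data
  lda #\1            ; \1 is the macro's first argument: the character
  sta PORTB
  lda #RS            ; RS high selects the data register
  sta PORTA
  lda #(RS | E)      ; pulse E to latch the byte
  sta PORTA
  lda #RS
  sta PORTA
  endm

  lcd_data "H"       ; each use pastes the eight instructions inline
  lcd_data "i"
```

Unlike a JSR, each expansion is copied into the ROM image, so this trades code size for not needing a stack.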
I thought the same thing at first... But you can't JSR (function call) without a stack to put the return address on. That means you need a RAM chip at addresses 0x0100-0x01FF.
Supertyp Yes, I am well aware of that. But supposing he was able to use JSR as H x had mentioned, he would still have to implement the part involving those bits, and still have to go into detail. A function would still have to set those bits up, and any way he wrote the code would enable such explanations.
Chuck Peddle, main designer of the 6502, passed away on 15th December 2019
Rest in Peace
Yep, the computer pioneers of our era are beginning to go now. Dennis Ritchie, co-inventor of C, passed in 2019. There was somebody else that died recently, but I can't recall who it was.
@@markjones5973 Dennis Ritchie died in 2012
Dennis Ritchie died twice?
@@huguia Typo. I knew he died a few years ago, so I looked it, saw that it was in 2012, came back here and promptly typed 2019. I'm such a dumbass sometimes.
@@markjones5973 good for you
I had my Digital Electronics exam today, and thanks to your 8-bit computer playlist I was able to score an A grade. Keep up the good work, mate ❤️🍀
That is awesome!
Well done. :)
Once you know how to read those specs documents, everything becomes much clearer.
That’s really awesome
You both keep up the good work! :)
nothing is on fire is the most common circuit designer phrase
sadly thats not true :(
@@PeterfoxUwU Yep, usually it is something regarding smoke. As in, "not letting the magic smoke out", or more commonly a "smoke test".
More like "it's on fire!!!!!!"
Note the exclamation marks.
@@stevewalston7089 smoke test jajaja i didnt know that
I look away every time I plug in something I build.
Fantastic stuff. I graduated uni as an electrical engineer, but most of this low-level stuff was glossed over even in the digital hardware courses. These videos don't really teach me much new material per se, but what they do teach fills in the holes and gives me a lot of those "Ah hah!" and "Eureka!" moments where all the things just click into place, which really deepens my understanding of how these systems work and are physically built up.
With this series specifically, I've had flashbacks to the very first embedded programming course in second semester, where we hooked an LCD up to an Arduino programmed with Atmel Studio to say "Hello, world!". I never understood exactly how the ports on the ATmega were addressed and accessed by the core of the microprocessor, until now. Good stuff.
Merry Christmas Ben :)
You’ve really summed up how I feel watching this. I graduated (MEng, EEE) 7 years ago; assembly was done in first year before quickly moving on to C. I remember assembly, I remember being taught how microprocessors work (I scored well on this stuff and software in general), but these videos just make it seem so much easier.
@@alexlovett1991 How is your career going, if you don't mind me asking? I am studying EE and am wondering how easy getting a job was. I have experience programming in multiple languages and am hoping that could help me.
@@TheAcidicMolotov I don't know about where you are, but in the UK, as an EEE you shouldn't have a problem at all getting a job. Get your profile on LinkedIn and you should get recruiters messaging you. That's how I got my first job. After 6 months of working I basically had recruiters contacting me daily (which is common for other software engineers I know).
I'd be wary about using the phrase "I know language X". It's best to be honest and say "I've got n experience in X", where n could be a little, a reasonable amount, a lot, etc. If you claim to "know" a programming language, interviewers might have a high expectation for the depth of your knowledge. I've been doing Java for 8 years, and I think I've got a very good knowledge of it, but I'm also aware there's a lot I still don't know about it, and if asked about those topics I'd have to answer "I don't know".
It's also really worth getting some projects on GitHub or another public repo that you can add to your CV. Just having done multiple projects in your own time really helps show enthusiasm.
@@alexlovett1991 thanks for the answers!
Would you do a series about the USB protocol? Especially HID.
MGfails noice. Perfect 64 likes.
That would actually be very interesting, good idea
@@skmgeek 69 now :)
YES!
If you are interested in self-learning, there's a library for the AVR ATmega called V-USB that bit-bangs the USB protocol using I/O pins. It also has example code for emulating a bunch of different HID devices. I made an arcade joystick with it a few years back -- back before you could just buy a pre-made board on Amazon for $12... 😑
So after 10 years of being a professional programmer, I'm now waiting for a video on how to optimise hello world code.
2019 is weeeird.
Even after 10 years of being a programmer (I mean a professional one), you didn't know this one? So what went through your mind?
In the meantime I found 2019 to be a rather fascinating one, most satisfying. Would that make it double weird for you? :-)
"2019 is weeeird"... Oh boy, you thought that 2019 was a weird year...
Erik Nestaas 2020 is weird.coronavirus...
@@eriknestaas2270 lol
We were so naive back then.......
This is what happens when the "primitive tech" channels advance into the modern era
Or rather, the one "primitive tech" channel since all the other ones are just poorly ripping it off.
@@tux1468 Usually I'd be mad about them ripping him off, but to be honest some of them are just as entertaining, but not from an educational point of view.
@@tux1468 Add apionet
@@Amethaz Bro go away
@@tux1468 No❤️
I have learned so much more from Ben than I did in college. Maybe it is because I am older and wiser, but I think it is because Ben is an amazing teacher. If you read this Ben, please don't ever stop teaching on YouTube.
I love this sort of computing. It reminds me of my childhood in the 80s. Assembly code is fascinating to me, even if I'll never have a reason to use it. I would have loved to be involved in NES or SNES development, trying to use all of the registers and bits as efficiently and creatively as I could have. The wild west of games.
His instruction / documentation reading skills are legendary
This. I'm fully dependent on stack overflow and ChatGPT and this guy is doing wonders with immutable and finite paper.
the best christmas present
I got the ELEGOO UNO R3 Project Most Complete Starter Kit, but I still haven't gotten very far in figuring anything out. That's why I'm watching a bunch of videos, but I'm still confused as heck.
I got that same kit for Christmas too! I’ve had a little experience with electronics but nothing crazy, made some cool little projects like a joystick controlled LED grid or a sound controlled motor
I tried to do even the simple stuff but couldn’t figure anything out
Go on their website and download the PDF that they give for your specific kit. From there, read the first four lessons and if you need help finding the right parts, look at the little guide they put on the top of the parts box
Hey Ben, just wanted to say thanks. I started my journey in electrical engineering in 2016 and was confronted with AVR assembly back in 2017. Scratching my head about toggling an LED in a course I attended, I started searching the internet for a source of information and found your channel. This literally changed my life!
Of course I scored an A in that exam, as well as in all other exams involving electronics. At the end of my bachelor's degree, I graduated as best student of the year. From there, I started a master's in electrical engineering during COVID with my handy dandy YouTube at my side. What can I say... I also graduated with superb grades.
What I want to say is: the first inspiration and steps while learning matter the most. Thank you Ben 😉
I'm learning so much. Holy cow.
As a budding programmer, I have nothing but praise and appreciation for the free education this channel is providing. Thank you.
Same
4:00 Thanks for giving an excellent short description of what is integrated in the package design. Typically you'd just hear "don't worry about it" and continue with the implementation. Great instruction!
“333 bytes is inefficient” I’m watching this in chrome 🤔
even on a modern CPU, for what this is doing, it's inefficient
on a system with even lower memory, it's worse.
there's also the issue that every time you want to go and change the output... OUCH!
you have to copy-paste several lines of the same code and tweak them just a bit, and hope you don't screw up
its good as a tech demo or proof-of-concept
hence why he said the next video will be about optimizing it
though, i do have a theory all of what will happen...lol
i'm mainly watching this for fun, and also because i'm curious how you take processor and build an actually useful device with it.
as for Chrome, or ANY modern browser [rather, application], tbh, yes, quite inefficient. it's now the rule and not the exception that web browsers suffer memory leaks. it starts as a trickle, and before you know it you HAVE to restart the browser... >.< I use Pale Moon, myself. Never liked Chrome since my first experience, and Firefox drove me away by taking away all the features and customization that I used it for in the first place. The reasons honestly go far beyond the scope of developing a single application, though. It's decades of questionable practices, including so-called "rapid app development" and other "innovations" in the tech industry. I started my "serious" programming with these newer languages/tools, and tbh, there are multiple things I don't like. I watch videos like this, and about ROM hacking, and so on, and I just facepalm at what's being done today. I have so many ideas right now for projects/systems I'm slowly cobbling together that are pretty heavily inspired by older technologies/tricks [if not re-implementing them to some degree], or using low-level concepts. Or well, that's the plan.
using a million other libraries or adding extra baggage is why i don't trust jquery, node/npm, so on...
JavaScript is unfortunately so utterly worthless on its own that you need libraries by default. I have yet to see anyone preaching the joys of HTML5 use the APIs right out of the box; they use engines that abstract the crap away. I've been building my own from scratch. This is a problem plugins didn't have, but "in the name of security" it was decided they be killed off.
hoping to switch myself to full native or web + native development. I just also don't desire to touch such things as C++, either... *shudder* And I suspect my next obstacle is going to be these corporations for "our security" going full gatekeeper via the online stores. note the 'recommend' setting in win10 is to ONLY install from the store. There's a lot of programs that this can kill-off because there are so many reasons it can't or WON'T appear in e-stores. And then it might even force others into needing to go through additional licensing programs [i suspect this is the future for hex editors and programs that have a reputation for being used in hacking, especially reversing proprietary formats and facilitating piracy/emulation]
libraries aren't in themselves bad, the way they're used and maintained is. mainly the mainstream ones. admittedly, i used to do this, jquery's main use: $("some_element"). if this is what you need to do, document.getElementById("some_element"). note this is discouraged. same with directly using XHR, which you could cut AJAX out in 90% of all cases. bootstrap, jquery, so on, should only be used for sites like twitter, youtube, so on... [if even there, but google did jquery, twitter did bootstrap... small sites/apps don't need and shouldn't use them]
it's mainly executives who don't know anything about the company and the industry leading this push. look at commercials where the only thing they can brag about is : "patented" , "proprietary". this just simply means they keep their methods secret and have gone through the [very expensive] process of facilitating the legal right to sue people for trying to copy their product/service/process. speaks nothing for the quality of the product/service. this is also what i find funny about norton/lifelock. "with all these data breaches happening..." , why these happen : governments [NSA caused ransomware outbreak by not disclosing the exploits to computer companies, cuz THEY wanted them to look for possible 'terrorists' via unethical, immoral, and ILLEGAL spying mechanisms] , databases get breached partially because executives don't want to pay money to people to properly fix them, INTERNET OF THINGS [that shouldn't be connected to the internet], cloud computing, streaming [latter two are ultimately DRM mechanisms cuz it turned-out the fatal weakness of DRM was, and always will be, client-side code/data. it's not security/convenience as the reason, it's copyreich]
i'm personally waiting for the next big crash, it's inevitable this will happen in the next few decades. we're actually making things less and less compatible with each other, which will lead to a lot of data rot, FAST. [like flash being killed, at least the open source community is working on it.] we're connecting EVERYTHING to the internet, increasing VASTLY the attack surface for hackers to target. we're increasing bandwidth usage and the infrastructure wasn't designed for so much traffic, nor was it properly upgraded in many cases. we're adding countless lines of tightly coupled dependency/overhead/boilerplate code and adding many extra layers to code management that don't belong there. [oh, and writing macros to generate code/data that the COMPILERS should be doing automatically] everything is obsolete when it's released, nothing released is finished. W3C members and browser vendors are STILL having a pissing contest, this is why APNG failed and GIF is STILL in use. [This is also why CSS3 suffered that prefixed properties BS.] Proprietary code is still supported in open source bodies [EME, aka DRM for the web, glorious "champions of internet freedom" , Mozilla, actually switched their stance to endorse/support it] Something's gotta give...
[oh, and programmers who actually care or know how to do things are quitting/dying, assuming the 'supervisor' or an 'o' aren't telling them to basically, not do their job, cuz it costs money, anyways]
I'm hoping my own code base will avoid some of this since I'm trying to axe the third-party dependencies as much as possible... Even then, there's a lot I can't avoid, like the internet globally suffering an outage, or the possibility/inevitability M$/Crapple are gonna tighten the noose around developer's necks since as it stands, I [and many others] can write and publish code/software which they don't have the ability to control at present. [End users, likewise, can still download/install/execute/[de]compile that code.]
welp... another text wall... I never thought programming and tech generally could be so... political... >.< I just want to make + play games, dammit!
yes and no. the HTML5 Canvas/webGL and WebAudio APIs are absolutely miserable if you don't have a library. even if it's an in-house solution or lightweight, it's still a library/engine/framework, it's still basically necessitated because you're dealing with simple state machines that don't have any concept of what an actual program [especially game/animation] needs...
a 2D game needs a concept of tiles/sprites, which requires some additional systems and data structures.
a 3D one needs models, skeletons, possible inverse kinematics, [shaders,] etc... If you implement that wrong, performance is bad, you could even potentially cause damage/stress to the machine/ [something people don't seem to like mentioning about GL is that you really shouldn't touch it if you don't know what you're doing. just how it's not technically necessary even for 3D UP TO A CERTAIN POINT]
all games need asset management and audio playback + audio looping.
HTML5 doesn't do this out of the box. Neither does C/C++, so on, but we already have libraries [and...MASSIVE engines/frameworks] to deal with this for native applications and...mobile/console. Web had shockwave, java, Flash, Unity as the main ones. They each had their use, and I'd argue still have benefits if people can do ANYTHING besides freak-out about code exploits and "proprietary black boxes". More so, Flash. Shockwave's plugin is horrible, I hated dealing with it. But, some of my fav games need it. Unity is still a mess post-webGL [especially as an impenetrable proprietary black box with code doing god knows what] Java applets were neutered and I don't like the "everything is a class" blueprint-oriented nightmare java helped create. Flash's format, at least, is honestly great if you consider bandwidth and flexibility/extensibility. Parts are a nightmare to parse because that's how serious adobe/macromedia were about making the files SMALL. coupled with the AVM you had pretty much the only general-purpose game engine that's arguably useful, ONCE you figure-out how to use it. You don't have to monkey-patch flash or add loads of boilerplate to make even a simple/'mediocre' game. You can, and a lot of modern flash games do run entire frameworks behind the scenes like box2D or Flixel, but you never NEEDED to. A lot of the limitations with flash are also imposed, now that the format and AVM are documented so thoroughly, it's not out of the question to bypass the limitations or improve the parts that don't work. Just as an asset format, I wish SWF or something like it, stayed. That's one of my projects, but it's time-consuming and tedious, and I''m spread too thin. I probably sound like an Adobe shill, but I've long since lost my respect for them as a company. Currently I don't plan to collaborate with them or any other large organization since I don't trust them. 
I also don't expect adobe to do anything more than allow it to happen like they already have with SWFmill, FFDEC [which sourcetek/sothink hate! (:< ] , haxe/flashdevelop, flashpoint, etc... Anyways, I just feel the SWF format, itself, has far from outlived its usefulness. Regardless, it should be a user CHOICE to have flash as much as it is to not have. but the "open web" community has demonstrated they're not entirely open [Encrypted Media Extensions!!!] , and aren't open-MINDED ["flash needs to die, flash users are stupid, flash developers need to adapt or die with adobe despite themselves doing NOTHING WRONG"]. There are other things, but I no longer can find the sources [suspicious!], plus don't really want to drag THAT discussion on much longer. I'm worrying about things I shouldn't have to, and don't want to. Which is ANOTHER reason good and potentially good developers are LEAVING the field, especially veterans and disillusioned newcomers.
I am currently trying to transition to Haxe. I'm cautiously optimistic Jonathan Blow actually achieves all or most of his promised goals because:
I like js for small little macros [converting hex, Caesar cipher, countdowns, so on] , not full-fledged programs
C is still a bit arcane
assembly isn't feasible unless i target a retro platform where asm programming is decently tolerable
actionscript is basically dead
C++ is NOPE
python isn't really supposed to run large applications and the whitespace sensitivity will hang you quickly
lua needs an interpreter if you plan to USE it
Rust just didn't quite impress me
Java is too strict and the devtools keep ADDING LAYERS [AND NEED JAVA INSTALLED, which while i have, is outdated and i can't seem to update]
Haxe still hasn't impressed me, really. But, better than manually writing JS and I CAN compile to native applications or the Neko VM at some point.
As for NPM and all : yep... companies and the desire to get hired by said companies. i actually DON'T want to be. freelance or something? fine... chain of command consisting of people who should not be considered 'qualified' to make decisions? NOPE... [which tbh is just ONE reason] i've seen enough blogs and portfolios. same reason people remove their older works "silly me, i was so dumb back then!". the code style and design choices? sure, i'm ashamed of SOME of my stuff in those regards. but yanno what? i'm proud i can say i did it at all, that i completed it.
this has some not nice code and uses...jquery:
crystalien-redux.com/unrelated/ETX/apps/web/marsTrans/MarsTrans.html
but, i can say i FINISHED it, and that it does EVERYTHING i wanted it to.
[and it still works to this day!]
i might update it in the future, i might not. and two big changes will be externalizing the messages array to a JSON database [at least converting it], and trying to pull jquery + jscrollpane OUT. i settled for that at the time because CSS3 was giving me problems [THANKS BROWSER VENDORS AND W3C!]
point is, it means more to me this is done. if a potential employer cannot understand that, if they cannot understand the particular problems that i had to solve to make this work, then they don't deserve me, and they aren't qualified to hire employees.
programming, engineering, science; are about developing practical solutions to actual problems, finding the sometimes illusive answers to important questions. companies and standards organizations lost sight of this, obviously. infomercials show it nicely with the insane majority of products you just don't need and can solve the supposed problem with your existing products "special tray to cook your bacon in the microwave!" and common sense. a handful are practical, but MOST aren't. same goes with frameworks and some of these "features" [like 'the cloud'] they added to software. i lost it with the M$ solitaire when they started bragging about the "innovations"... there are CLOUD-BASED games that UP-FRONT let you export your data to A FILE and vice-versa. and tell me why i need to watch ads on a card game that i could easily play in meat space for basically free [cards are cheap, much cheaper in the long run than subscribing to the premium/VIP version] if i wanted? their only true innovation was the extra modes. even then...pft... i could buy like 4-8 decks of cards and play HUNDREDS AND THOUSANDS OF GAMES. didn't need shiny new graphics, didn't want ads, and i fail to see the save data for that game weighing-in even at a KB, easily stored in a plain .txt file or copied to a text/email >.<
well, more time basically wasted ranting about this... sigh...
however, the site i'm trying to use is suspiciously not loading at normal speed, so i can't play my crappy casual games, anyways... >.>
engine/framework/library are a bit ambiguous at times.
well, accessing the HTML DOM is a simple/small action. jquery's most famous use is to do exactly that. again, for a very specific type of project, it makes sense to bring those frameworks in. but tbh, decoders don't need anything sufficiently advanced, either. reading a PNG could be done with DataStream.js [makes dataviews easier to work with] , zlib.js, and then the PNG-specific bits can be written on both of those. file parsing also usually is the reading of an array, or several...XD. i've reversed or partially reversed a few. seen arrays and other tabular data a LOT. chunk-container types are also surprisingly common.
the libraries using libraries using libraries is one main reason i don't use NPM, though. i do understand how SOME of it happens, though. there are a few libraries i want to use in my own, or have the option to, i fear some wrapper objects/instances might need to be used...BLEK! i'm not saying this is the most common reason. 90% definitely is "look, i made an NPM library, you should hire me!"
On my computer:
In C++: 8 kB
In Rust: 430 kB
In naked assembly: 1 kB
Even the .class in Java (which doesn't include the runtime, so it's cheating) is 417 B.
OK, this is comparing apples and oranges, but it's still interesting.
As someone coming from Arduino
It's amazing to see this in detail
How the LCD actually works
Instead of importing some library
Thank you Benjamin! These videos encourage and motivate me to improve my knowledge of electronics and pursue my business desire/passion to become a hardware manufacturer!
I find it a very special gift that you would upload on Christmas Eve. The tech DIY community is very lucky to have you. I got your main kit under the Christmas tree along with a signal generator. As a college student, I find this community very important in my life. I hope you had a merry Christmas, Ben. God bless you and your family.
That awkward moment when the MPU in your LCD is almost more powerful than your computer
Lol !
This is so true
Certainly faster: it's pushing 8 lines of pixels from font memory to the display and doing all the cursor bookkeeping in the time the CPU takes to output one byte and pulse the E line.
Thanks for this great series. I'm a long-time 6502 coder, but I've been wanting to get into circuit building for years, and this video presents it all so simply and straightforwardly.
Literally the best Christmas present :)
I got this running on a PCB. I got annoyed with wires coming loose on a breadboard and finally abandoned breadboards. So I figured out how to use KiCad to create the schematic and PCB layout. Learned to solder. Put it together. Had a few hardware issues. But figured all that out. Seeing “hello world” for the first time never felt so good. Thank you Ben!
I added LEDs for all the data and address lines too, so there are a lot of flashing lights as well.
Merry Christmas Ben!
2nded, Merry Christmas Ben and everyone!
Ben Skywalker
Lots of impressive stuff here, but one of the most impressive is how you never make any typos! (forget a paren, sure, but I don't think I've seen you type backspace in this whole series!). Thanks for these videos, they are great.
"I'm not proud of this code"
basically me, every time I write code.
NoCake Elsewhere: I am proud of this code I copied from google and glued together
How to program, for beginners:
Step 0: Search for your problem on StackOverflow
Step 1: Copy & Paste
Step 2: Compile
Step 3: Profit!
But is the Code proud of you?
@@aleisterlavey9716 Is the code proud of itself?
RTFM!
This is the perfect and most satisfying example. Binge-watching since coming home today, and excited to see how many episodes are coming.
Hello world
Thank you for another year of learning! This series has taught me more than I ever learned in formal classes. Have a safe and warm holiday!
Getting VASM to work on my Windows computer was a steep learning curve for a computer novice like myself. I was able to get it to work, although I'm not sure I'm using it correctly. What's amazing is how many clock cycles it takes just to get an "H" to print on the screen. How we have gone from this to virtual reality is truly amazing! Great video!!
I agree with this entire comment.
I know where you're going, but it might be fun to connect the R/!W, RS and E line directly to the 6502 in the future (with a simple address decoder of course). That would make it possible to just say Hello World by writing the command bytes to one address and the message characters to another address, without the hassle of toggling the control bits in the program.
If I understood the display datasheet correctly (part of it is just gibberish to me, so there's that), the display can run as fast as 350 kHz on the slowest functions, so even 1 MHz might be pushing it. With components rated at 14 MHz, I think the VIA ends up being the easiest option, especially if you need to read from the LCD.
I'm a little late to the party, but in case anyone is interested... I did this, and so far it works! And yes, it was fun to do! The E line needs a NOR gate with one input on an address-select line (active low; I used a 74LS138 as the address decoder) and the other input on PHI1out. I connected the RS line to A0. Right now I can't test higher than 165 kHz; I need to un-spaghettify my clock module first. The spec sheet says "... high speed MPU bus interface - 2 MHz (when VCC = 5V)". And there should be a way to generate wait states for higher CPU clocks.
Here's an update: I'm able to directly access the display on the 6502 bus at a 1 MHz clock. No 65C22 VIA needed.
You could slap it directly on the bus in this arrangement, but it would interfere with other stuff you *may* want to attach to that bus later on. The CIA/VIA should be your interface to any peripherals.
That's how it should be done. For higher clocks you need to check the busy flag on DB7 before accessing the next register.
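A hedged sketch of what that busy-flag poll looks like once the LCD is memory-mapped onto the bus (the addresses $7FF0/$7FF1 and the label names are made up; they stand for wherever your address decoder places the instruction and data registers, with RS wired to A0 as described above):

```asm
LCD_INST = $7FF0     ; RS=0: instruction register (hypothetical address)
LCD_DATA = $7FF1     ; RS=1: data register (hypothetical address)

lcd_wait:
  lda LCD_INST       ; a read with RS=0 returns the busy flag on DB7
  bmi lcd_wait       ; bit 7 lands in the N flag, so BMI loops while busy
  rts

  ; send a character only once the controller is ready:
  jsr lcd_wait
  lda #"H"
  sta LCD_DATA       ; the 6502's R/W line drives the LCD's R/W directly
```

The nice property of the memory-mapped arrangement is that the E pulse and the R/W direction come for free from the bus cycle, so software only ever reads or writes the two addresses.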
I watch this playlist every now and then, and it gets clearer with each iteration. Thanks for this!
Wow, very retro. I played with the 6502 & 65C802 CPUs back in the '80s. I built a logic analyzer based on the 65C802 and used two 2x40 LCD panels for the UI, all written in assembler. The project was done on S100 prototyping boards using the wire-wrap method. It was lots of fun. Later on I was using the Intel 8031 family of controllers.
Never thought assembler was such a fascinating language. Thank you Ben, you are a great teacher and an inspiration to us.
I recently got interested in electronics and embedded systems, and I just wanna binge watch all the videos in your channel!
Great interest! I'm doing computer engineering on embedded right now at uni. Jobs are guaranteed and the work is real fun!
@@awesomefacepalm I am actually an electronics engineer student, only figured out my interest a while back xD
@@EagleZH23 great!
sir your wire channeling techniques are so amazing... i'm jealous
even a vid like this that's almost half an hour long feels like it's too short :S
Lol I slow video down so I can understand what he says
Yeah, even when I slow it down to 1.25x speed
@@MaxPicAxe that's faster lol
@@vaendryl No, he slows down the time
Holy shit! It's been half an hour?!
It's very interesting watching these videos after just taking a course on this kind of stuff in my computer engineering program
Played with a 6502-based Heathkit during my college days over 30 years ago. Would have been way more interesting if we had someone like Mr. Eater as our lecturer.
your youtube account is older than me...
The ironic part is that academia failed Ben Eater and he dislikes it for it.
"Nothing is on fire." Wow, surprised by that. Usually at least ONE thing smokes out in a programmer video.
Electroboom disapproves
software runs on smoke... let the smoke out, and it doesn't run anymore
Duh really 🤣
I would love to see an instructional video on how you cut the jumper wires so precisely. Great videos! Just ordered your 6502 kit.
RIP Chuck Peddle 20.12.19 ~ developer of the 6502
Following along with the kits and videos, and spent a while trying to debug why my LCD wasn't working. While poking at wires, none of them stood out as an obvious culprit for 'loose wire not connected' etc. Turns out I think I just had my clock set so slow it was taking like a minute to fully initialize the LCD and put text on the screen. DOH! Still, such a great feeling to see it running. Thanks for the great videos and kits!
You could stick a ribbon/strip of plastic under the eeprom chip to give it "handles" to pull it up evenly.
or
Snap the chip into a sacrificial socket and pry that out each time. Might prolong the life of the chip.
or
Even use a programming jumper cable that jacks into the breadboard?
just a thought. I held my breath every time you popped it out!
Seeing this really makes me appreciate the In-System Programmer and the bootloader on my Arduino!
Joshua Bryant put a ZIF socket on the breadboard.
@@the_jcbone i'll do this so i can stop having to reball my 74181
johnny wolinski I meant Ben. :-)
There's also "chip pullers", specific tools for that; tweezers meant to pull chips evenly on each side. Your plan is a great workaround if you don't have the tools though!
You can see all the details about how a computer works. I've worked more than 30 years in computer hardware design and software techniques, so I can say: these videos are a really good job.
Ben you must release this series faster because I’m starting my assembly course next semester as a part of my Electrical engineering program ;). Great video as always!
Yeah, I finally wrote the Hello World to my LCD. Thank you again for your instruction.
I love the fact you're selling kits on your site.
Not likely to get the 6502 one.
But the variable timer seems very useful.
And I need a programmer - last time I tried to get one all I got was a usb cable. XD
Man that sucks
you tried hiring a programmer and all that showed up was some random usb cable
Great video. For any who want a ‘second pass’ at understanding on how these things work, 8-bit guy did a pretty good video on this, from a different perspective.
Can you provide a link? Thanks.
It's a series, but here is the first (searched for 8 bit guy lcd) th-cam.com/video/hZRL8luuPb8/w-d-xo.html
When your boss measures productivity by number of lines of code
I guess time and repetition works because I am actually starting to understand most (or some) of the content now. Thanks for these walk throughs! I'll continue to watch into the new year and beyond buddy!
You know what's worse than 177 lines for a hello world? 1k lines for a hello world. I wrote an OS and by the time I'd got the keyboard and video drivers working + bootloader + a bunch of other stuff it came out to be ~1100 lines of C and assembly.
@defunctusername haha yeah that's true.
You had way too much time...
@@EDLEXUS yep. I was on a few long haul flights...
Wow! What discipline do you study? I am student for my bachelor of science in Electrical Engineering but I’m also interested in computer science and native level coding
1024 hope the WiFi was good for all those trips to stack exchange
What this guy is doing is a treasure!!
I am taking my hat off in admiration of what you are doing!
Thank you
Wow, thank you for this Christmas present :)
Love the cable management
Great that you're keeping old school programming alive, using the KISS principle: Keep It Simple, Stupid😀 Love your work👊👍 You Rock👊👍
This really helps me understand stuff like how microcontrollers work
26:00 - when you're paid by the line
Don't tell me programmers can get paid per line. Omg
I'm starting to think I could do this job. I just need a partner to debug it all.
@@dustybrown4599 if you can't debug it yourself, you can't code.
Best thing about his videos is that he's literally reading the manual! So many people ask stupid questions without even trying to read the documentation. Here you can see: smart people aren't smart from birth, they read, understand and learn!
Next video is probably one I would have thought you'd do a while back: Add a RAM chip so you can have a stack, and put most of that code into subroutines.
I was thinking the same, why not use subroutines? But then I realized it needs a stack and therefore RAM :) Well, he still has 32k of unused address space. I like where this is leading.
Oh, that's why he's not using those... though you don't really need a stack or RAM to do that...
Just store your program counter in a free register (I guess there is a B register or something), then do all your subroutine-y stuff and write the program counter back in software?!
urugulu true. Though I am not sure if JSR/RTS could work like that. I should read up on it. My assumption is that it needs a block of 256 bytes at address $0100 it can read/write for the stack.
@@urugulu1656 Nope. 6502 has three 8-bit registers: A, X, and Y. A subroutine like one to display a string would need to make use of A and at least one of X and Y. And, to get to the program counter, you would need to jump to a subroutine and pull it off the stack anyway.
Too bad they never made the 6532 chip in a static CMOS version. If there had been one, it could have been used in this project instead of the 65C22, and it would provide the same two 8-bit I/O ports, plus 128 bytes of RAM, which would be enough for a stack (plus a bonus timer). This way the number of components could be minimized by avoiding a separate RAM chip. It's still doable to use an NMOS 6532, but you won't be able to stop the clock or run it slowly, because that's not a static chip.
This channel has kept my grade afloat in digital electronics
Well that was an unexpected, early and really wonderful christmas gift. Thanks Ben!
This is so cool... so much documentation you see on using these displays consists of "just buy our SPI / ATMega328 interface and you won't need to know how the LCD works".... great to see someone talking about actually how to interface to the displays.
I love the "wobbly" LED output... it's like a little "caterpillar animation"
Hmm to make the code easier to read and more efficient (spacewise at least), you can just make a new label that takes data input (either from memory or from a register), and then prints it on the screen.
.printc:
; print code
; set a to "H"
jmp .printc
; etc
And then you can make a loop that iterates over some memory and passes it to printc
And you can define a string in memory to be read by the loop
I know in nasm you would use db but probably not in vasm
to iterate you need a register to put the index/address in, which is presumably the "extra hardware" Ben is talking about.
worse, there isn't anything you can put at the end of .printc, without a stack, which needs RAM, which could also be what he's referring to (but that needs a stack pointer, which is another register anyways)
Simon Buchan He has registers, they're part of the CPU, but he has no stack currently, so if the SP changes nothing really happens, so he can't call procedures.
Doesn't he already have the ram chip and ROM chip mapped to different parts of memory? Or do I just not pay enough attention lol
@@mrcobalt124 ROM only, even if it's programmable, and a two byte IO buffer.
Very good video. I am surprised it works without programming delays in between the LCD control commands.
Look at the speed in which the characters appear.
I haven't watched the preceding videos right now, but I would be very surprised if he clocked this CPU at more than 10 Hz. :-)
That's why he doesn't need delays right now, but of course he will need them once he increases the CPU frequency.
But: even ROM wasn't built in a day. This is one problem in education: there is only so much information you can put in one lesson before it goes over the head of the student.
The end of the series: Recreating Windows 1 from scratch - 6502 part 32
Windows (the DOS versions) ran on an 8088, so it's almost the same, but the chip needs to be upgraded
I would like to have seen you just start the LCD off as a direct replacement of the LEDs, other than the extra pins you need to make it run (like "enable"), before changing your program up, to see what that would have looked like. It would also be a bit interesting to see you have just a raw LCD that replaced the LEDs, meaning that you would have to code for each pixel manually... like you did with the LEDs.
thanks Ben
Thanks to these videos I understand the basic principles. It felt like I'd spent years just asking myself how these things actually work. Now I know. I hope to get a kit and ultimately make it run space invaders on a black and white led matrix.
I would love to see you use the video card you made, and make an audio card to use with the 6502 computer. Also love your videos, they have been an enormous inspiration in my journey of learning digital electronics. Thank you for sharing your work with us!
It's 04:30 am and I still can't fall asleep because the videos are so addicting!
For instance, I had no idea how strongly the assembly program depends on the
actual hardware setup. Emulating a certain device is relatively simple, but all these
interactions between devices via their specific interfaces were a huge gap for me.
I mean, this assembly code to print a single letter on an LCD literally involves writing an LCD driver
(I mean, controlling the LCD's behavior), and this is so cool!
Ben, are you aware if anyone actually made an emulator of your computer?
I already want to write my own!
THANK YOU FOR THIS AMAZING SERIES!
@BenEater Merry Christmas thank you for the video! :)
this is by far the best christmas present i've gotten so far
I would love to see some proof-of-concept coprocessor stuff in your explanation/building style!
This video is today's best gift so far ❤️
Instead of manually resetting the 65C02, why not use a 555 timer to hold the reset line for about 1 or 2 seconds?
The stuff you do is amazing. I tried to watch and understand these videos before I got into college, but now is when I get the idea.
What a nice Christmas gift!
YES VERY TRUE
Just started the 1st video in the playlist for fun and now I am literally curious to do this myself. PS. I am a Computer Science student. I love your videos and it's the best gift.
Great video as usual. Also, I think I'll try putting your 8-bit breadboard computer on an FPGA now that I have one.
With this video series, I've learnt a lot more programming than in my actual programming courses (2 semesters) at the university...
Why do you do a bitwise OR with the Register Select bit and the Enable bit? Won't the Register Select bit already be 1 during that operation?
Edit: Never mind, I understand now; I didn't realize the Enable bit would turn the Register Select bit off.
Love your videos,
Cheers
LDA is replacing the value of A, so you need it to be RS, then both RS and E, then only RS again. Likewise, STA is setting all 8 bits of the output to the value of A.
Should be able to load the data into X or Y and do binary math on A to improve the code.
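To make the LDA/STA point concrete, here's the three-store sequence as a small Python simulation (Python only to show the bit logic; the RS and E bit positions match the port A wiring in the video, and the log lists are stand-ins for the actual STA writes):

```python
# Why the OR matters: LDA replaces all 8 bits of A, so to pulse E while
# keeping RS high, the middle store must write RS | E, not just E.

RS = 0b00100000  # register select (port A bit 5, as wired in the video)
E  = 0b10000000  # enable (port A bit 7)

port_a_log = []  # what each STA to the control port would put on the pins
port_b_log = []  # what each STA to the data port would put on the pins

def print_char(char_code):
    port_b_log.append(char_code)  # put the character on the data lines
    port_a_log.append(RS)         # RS high, E low
    port_a_log.append(RS | E)     # RS *stays* high because of the OR; E pulses high
    port_a_log.append(RS)         # E back low; a plain store of E alone would drop RS

print_char(ord('H'))
print(port_a_log)  # [32, 160, 32]
```

Writing just `E` in the middle step would give `[32, 128, 32]`, silently deselecting the data register mid-pulse, which is exactly the mistake the question describes.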
NEVER IN MY 20 YEARS OF LIFE DID I THINK THIS WOULD BE THE TYPE OF CONTENT I WOULD LIKE, AND I LOVE THE SERIES. I BINGE-WATCHED IT ONCE AND I'M WATCHING IT AGAIN TO TAKE NOTES AND LEARN THE DETAILS, AS WELL AS DOING MY OWN RESEARCH. IT'S AWESOME
RIP, Chuck Peddle.
Happy Christmas mate from a big fan in the UK!! Thanks for all your amazing work!!!
Nothing is on fire that is also good 😂
Relaxing videos, highly prized content. Knowledge is not free, appreciated.
The chip in the LCD display is more powerful than the computer you made, isn't it?
Lucas Viñas really?
Thank you Mr. Eater! Great video and I'm learning so much, I'd like to get to the point of writing my own programs for this machine and you've setup a great foundation to do that. Thank you, I can't wait to do the next video.
If your clock rate is only a MHz or two, you don't need a 65C22 to interface one. Instead hook it up directly to the 6502 data bus, and use address decoding to memory map it.
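As a toy model of that memory-mapping idea (Python purely for illustration; the Bus class and the addresses are made up for the example, and in Ben's actual memory map the $6000 region belongs to the VIA): with E generated by the address decoder and RS wired to A0, a store to one address is a command and a store to the neighbouring address is a character.

```python
# Toy address-space model of a memory-mapped LCD. With RS on A0, even
# addresses hit the instruction register and odd addresses the data register.
# Both addresses below are invented for illustration.

LCD_CMD  = 0x7F00  # A0 = 0 -> RS low  -> instruction register
LCD_DATA = 0x7F01  # A0 = 1 -> RS high -> data register

class Bus:
    def __init__(self):
        self.screen = []
    def write(self, addr, value):
        if addr == LCD_CMD:
            pass  # clear display, entry mode, etc. handled by the controller
        elif addr == LCD_DATA:
            self.screen.append(chr(value))  # character goes straight to the display

bus = Bus()
for c in "Hello, world!":
    bus.write(LCD_DATA, ord(c))
print("".join(bus.screen))  # Hello, world!
```

This is the "just say Hello World by writing bytes to two addresses" convenience the earlier thread describes, at the cost of the bus-timing caveats also discussed there.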
Very nice and clear tutorial on decrypting and understanding the datasheet, as always, Ben. Thank you.
I've never clicked on a notification faster
Very thankful for these and your other instructional videos. Such an inspiration! I'm halfway through the 8-bit stuff, and have finished this project too, with some modifications and additions. Merry Christmas! 😊🎄
Lads, he's getting ram and making a stack (and maybe ram in the zeropage for pointers and stuff!)
Showing the decrement text is such a tease!
Why not use the 6522 handshake pins for toggling the E pin automatically?
Finally a series that explains all the logical steps of using microcontrollers. Great work there, Ben. Hopefully you have some expansion modules in mind, like the 6551 for serial input/output. Could be cool.
Are you going to optimize the code using a loop or using a function?
if you watch to the end he says he's doing that in the next video.
There aren't really "functions" on the 6502! There is JSR (Jump to Subroutine), which stores a copy of the program counter on the stack and jumps to the address supplied; and there is RTS (ReTurn from Subroutine), which reads an address off the top of the stack and jumps to there. You can JSR to somewhere in memory, do a bunch of instructions and then RTS back to where you came from.
Just to make it interesting, the PC is actually pointing to the last byte of the JSR instruction, which is the high byte of the address, when it gets copied onto the stack. But when that address gets copied back into the PC, the PC gets increased anyway at the end of every instruction, even RTS; so it's then correctly pointing to the (first byte of the) next instruction after the JSR.
So what you might do is have a subroutine that pushes a copy of A onto the stack, and writes it to PORTB; then sets RS, R/W and toggles E on and off, trashing the original value of A; and lastly pulls A back off the top of the stack and does an RTS. Then we can just LDA with the character code and use a single JSR instruction to send it to the LCD module.
And that's easily turned into a loop, using the X or Y index register.
But first we need some RAM to implement a stack!
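The program-counter bookkeeping described above can be checked with a little simulation (Python standing in for the CPU internals; the addresses are arbitrary):

```python
# JSR is a 3-byte instruction. The address pushed is that of its *last*
# byte (the JSR opcode address + 2), and RTS adds 1 after pulling, so
# execution resumes at the byte immediately after the JSR.

stack = []

def jsr(pc, target):
    # pc = address of the JSR opcode itself
    stack.append(pc + 2)   # push address of JSR's last byte
    return target          # jump to the subroutine

def rts():
    return stack.pop() + 1 # pull the address and increment past it

pc = jsr(0x8000, 0x9000)   # JSR $9000, with the JSR located at $8000
print(hex(pc))             # 0x9000 -- now inside the subroutine
pc = rts()
print(hex(pc))             # 0x8003 -- the instruction after the 3-byte JSR
```

(On the real chip the pushed address is split into two bytes in page one, but the arithmetic is the same.)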
Ben, you got me glued to my phone for half an hour and I thoroughly enjoyed it! Merry Christmas!
Why didn't you write a JSR/RTS routine for sending an instruction, and another one for sending a character?
Oh, never mind. There's no RAM yet, so you have no stack .....
@@bluerizlagirl Shhh! Spoilers for the next video :)
@@BenEater Best spoiler possible.
Cannot wait for the next video :-)
bluerizlagirl I wouldn’t even use subroutines, just create two macros for send instruction and data, use the instruction macro 3 times and loop over a string to send the actual message. No RAM needed, no subroutines needed.
@oH well,lord! A macro is just a block of code you can insert over and over again. Instead of copying and pasting, you define a macro consisting of the code you want to copy and insert that wherever you want to paste it. The computer then copies it and pastes it for you, possibly even making substitutions.
You could use 4-bit mode and use the other 4 bits on the port for the 3 control lines and the A or K (anode/cathode) of the backlight, so it is software controlled.
As a programmer, watching you copy-paste code that should very obviously be a function was excruciatingly painful.
But you do realize that he wants to explain how the display works, not how you write asm? He wants to show us which bits go where.
I thought the same thing at first... but you can't JSR (function call) without a stack to put the return address on. Which means you need a RAM chip at addresses 0x0100-0x01FF.
H x
That’s something I was unaware of, thank you!
Supertyp
Yes, I am well aware of that. But supposing he was able to use JSR as H x mentioned, he would still have to implement the part involving those bits and still go into detail. A function would still have to set those bits up, and any way he wrote the code would enable such explanations.
@@mekafinchi He says he's doing that in the next video.