I wrote similar code on a Commodore 64 emulator running MS BASIC and it took about 1 second to run. When I moved the POKE address to a variable and stored the INT(ASC(... in a variable, it ran almost twice as fast. You had to optimize on these old machines (and on new ones, too).
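(For anyone curious, the kind of rewrite I mean looks roughly like this - the address 24567 is just a made-up example, not the one from the video:
10 REM SLOW: THE CONSTANT 24567 IS RE-PARSED ON EVERY PASS
20 FOR I=0 TO 255:POKE 24567,I:NEXT I
30 REM FASTER: PARSE THE ADDRESS INTO A VARIABLE ONCE, OUTSIDE THE LOOP
40 A=24567
50 FOR I=0 TO 255:POKE A,I:NEXT I
The win comes from the interpreter only having to convert the text "24567" into a number once, on line 40, instead of on every trip through the loop.)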
I love your videos and have learned a lot about computing and electronics from them. I'm trying to expand my knowledge, and it's content like this that does it best for me. Would you be open to doing a video on using a Raspberry Pi to control an ESP32 (that in turn controls another device, even an LED, or a neo pixel, or whatever is most interesting for you)? I ask this because I really want a better understanding of creating Linux device trees and drivers for controlling any kind of device using something like i2c to get started. Maybe there are other videos you've done that already covered this and I somehow missed them?
No, the interpreter doesn't "read P, O, K, E". The main feature of MS BASIC is that all commands/keywords are tokenized, so POKE is represented by a single byte (called a token in MS BASIC), 0x94 to be precise.
It only reads the characters P, O, K, and E once, after the user has entered a program line. Having recognized these characters in that order, it enters a "token" at the appropriate place, which is a single byte corresponding to the POKE command. It's a kind of bytecode. @alexander53
I really wouldn't call this hacking, but augmenting or expanding. Manufacturers who licensed it or built upon it back in the day basically did the same for their custom functions.
One thing I noticed, not sure if it's relevant, but in your LCD instructions, you're calling txa to transfer X to A, but then shortly after calling pha to push A to the stack, then calling pla at the end to pop A back. It looks like whatever was in A before you run one of the LCD commands is getting lost, but I could be wrong.
I'm not super familiar with msbasic, but Applesoft BASIC is made by Microsoft as well and it supports redirecting COUT(). This allows you to redirect all standard BASIC I/O to a custom routine... no need to create a new BASIC command.
I'm simple. Ben pokes a new video onto YouTube, I have to peek... :)
😅
Basic behaviour
@@m.naexec genius
and poke the like button :D
@@grendel_eoten add all his videos to the stack
Current LCDPRINT could be called LCDCHR, and the new print function that takes the string into account could be the new LCDPRINT.
I wonder if LCDPRINT could check if there is a single byte or a string and print either one correctly instead of using multiple commands
@@hyper_lynx That might be possible, but I don't think there would be a way of distinguishing between 65 the ASCII character and 65 the number, since this old BASIC would parse both as numbers
What needs to be done is to use the current PRINT command's line evaluation routine to create the text to be printed, then use a new machine code routine to print that to the LCD instead of to the serial port. In principle, it's quite simple. Even in practice, it shouldn't be all that complicated.
More is less, and ocd is always watching.
LCDPRTCHR
LCDPRTSTR
3 letters per attribute, 3 attributes per command
In Commodore BASIC, you could redirect PRINT's output with the CMD command. Or use PRINT# instead. That means having to handle files. Not complicated, really?
Applesoft/Microsoft BASIC tokenizes the BASIC reserved words when the command is first typed in. So it's not actually looking at each individual character of the commands, although most of the rest of the line needs to be parsed as it runs a program.
I hit the comments to make the same point. :)
Just amazing. I was loaned an Open University circuit board, with associated literature, for a few months to try and understand how early computers work. I worked steadily through the modules but found them weak and lacking in actual detailed explanations (e.g. why registers were not set back to zero). I became frustrated, and by the end of the three-month loan I had not gained the skills needed to fully understand how the code did what it was doing. In contrast to the Open University course, Ben's short explanations of how the code is constructed and executed, with the resulting actions, taught me more in 40 minutes than the three-month university module course. I found the comments section in the tutorial useful too, in that they answered many of the queries (misunderstandings on my behalf) I had. Big thanks to Ben and all those who answered other people's queries, questions and constructive criticism.
Thank you all.
Now this tickles the same itch as when I POKEd commands out to the LPT port in QBasic to change fonts on my dot matrix printer, back in my youth. Excellent video as always!
0:40 The calculator is a vintage HP-16C programmer's calculator from the '80s. Not made anymore, but a modern reproduction is made by SwissMicros, model DM16L
RPN 4 LIFE
@@klchu 2,2,+
I wish I hadn't thrown out my old HP-15C and HP-26C.
My original HP-48C is in a display cabinet while I use an emulator for everyday calc uses.
I've seen the 15, back in the day, but never cared for it.
Thank you ! I was going to ask again...
I have the swissmicro replica, it's amazing!
Always a great day when Ben uploads.
He poked, and I peeked.
Adding functionality to BASIC like that is so freaking cool. I wish I'd known all that when I was a kid and actually using BASIC and assembly to mess around.
To be fair, many of the popular 8-bit computers had their BASIC in ROM, and BASIC's original annotated source was not public. So this kinda thing would have been very difficult to do back in the day.
Look up Simons' BASIC, which was an extension to C64 BASIC back in the day. You can see that it was programmed by a Brit, because one of the commands is COLOUR :) He was 16 years old when he programmed it, according to Wikipedia. So quite impressive.
With the BASIC source, you make it seem pretty straightforward to add a whole new command. I can see that with a custom memory-mapped I/O device, we could have all sorts of fun. Great video Ben.
I recently rewatched all your videos about the breadboard computer and the 6502, and immediately after, I wrote emulators for both in C. It was my first time, but with your videos it was no problem at all. Now I can enjoy your BASIC on my computer. It was so much fun. I love to watch your videos!
First, the BASIC editor converts the commands (POKE) into a command byte before storing the line of code. The list function converts the command bytes back to text. This saves significantly on program storage space and execution time. Second, since the LCDCMD and the LCDPRINT code are almost identical, you could re-use the identical part with a jmp or branch. Third, a more standard way to do this would be to OPEN a device file and then to PRINT to that file. You could even CLOSE the file when you are done. Printers, tape drives, and floppy drives used this method on the Commodores.
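For anyone who never used a Commodore, the device-file approach looked roughly like this (from memory, so treat it as a sketch - device 4 is conventionally the printer):
10 OPEN 4,4 : REM LOGICAL FILE 4 ON DEVICE 4 (THE PRINTER)
20 PRINT#4,"HELLO FROM BASIC"
30 CLOSE 4
The CMD command mentioned elsewhere in the comments used the same file numbers: CMD 4 redirected ordinary PRINT output to the open file until you issued a PRINT#4 and closed it.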
While all true points, don't forget the purpose of these videos is to teach, not to write the smallest, fastest OS. A wall of impenetrable code doesn't serve the "teaching" aspect too well, regardless of it saving 20 bytes.
This is utterly fascinating. Really glad you've made this series and keep sharing your progress. It would have never even occurred to me to just modify BASIC. Love it!
Thanks Ben, I love the way you approach these solutions and explain them so carefully. I have to admit I was getting a bit bogged down following along and trying to write my own assembly. So by implementing Basic so beautifully opens the door to something I can handle much better! Absolutely fascinating how you modified the basic assembly; it really demystifies how these things work.
I remember writing a basic program as a junior high school project about 35 years ago that could modify a basic token by copying a ROM image to an Apple II 16k language card.
Amazing timing on this upload. I am teaching myself embedded programming and was just today doing some bare metal programming on an LCD display
Coming from Java/JavaScript and seeing BASIC as a high-level language makes it clear how abstract modern programming languages are.
If BASIC is high level, I think JavaScript is orbital 😂
@@renakunisaki Although the definitions get kinda weird since BASIC is a purely interpreted language, whereas Java and JS are both JIT interpreted languages, so parts of your program get compiled to machine code in memory (it's more complex than that but you know) during its runtime so it's faster.
@@seshpenguin as far as I know Java is compiled two times: the first time by the javac compiler from source code into Java bytecode, and the second time by the java runtime from bytecode into machine code (the JIT part).
@@AntonioBarba_TheKaneB Yep that's correct
@@seshpenguin dang, I still got my java basics after all those years trying really hard to forget! :D
So, I didn’t like the fact that the lcd was showing just blocks. I wound up running the LCD initialization assembly right in front of Wozmon. I also had it print a string showing “Wozmon” after the initialization. I then had the reset vector point to the beginning of that assembly code. It then dropped right into Wozmon. I did the same thing at address $8000 so when you wanted to run basic, it printed “MSbasic” first and then dropped right into basic. This was a great video. I’m glad we’re using the lcd again! I also tried to run Oregon Trail in basic just to have an out of memory error at line 6000 something. I would love to see you show a ram banking scheme.
"CHAR$" = "char dollar"
Back in the day (1970s), we used to pronounce that like "Char string", i.e. "$" was pronounced "string". Perhaps YMMV...
Yup. I still use the word "string" in those situations. Confuses the younger folks at work.
@@DavidLatham-productiondave I used to confuse people when talking about stuff in MS Excel by telling them they should use "E String 4" instead of "E Dollar 4"
For some reason I always read it as "ez". Like, the name Fernandez could be Fernand$
That's the way I always interpreted the dollar sign in that context, too. I always just said "string" because it makes it clear that the variable or function is dealing with a string.
But why? You assigned it the numeric value of exactly one character, while a string would mean zero or more characters
Ben ! Reminder to put these last 2 videos to your playlist
I've started to build my 6502. I already have the ROM, the microprocessor and the VIA. But here in Brasil, unfortunately, it's very hard to get the 62256. I only have the 6264 and I'm trying to figure out how to use this instead of the 62256. I think the MAX232 is not that hard to find, and then I can finally do stuff with my computer through the serial port like this. I've learned a lot with your videos. I am really grateful for all the teachings, dude
The import taxes and availability are such a pain :( my Brasilian friends usually have to ask an uncle to pick-up bits and pieces when they go to the US, for a business trip or something. But I imagine that's not available to everybody as an option
@@kaitlyn__L I totally agree. I do mechatronics engineering and sometimes I have to order things from outside Brasil. The taxes are absurd and sometimes we wait months for our items to be lost by Correios. We have to be brave to study here. But I am hopeful that some day it's gonna be worth it
You can use four 6264 chips and some NOT gates to control their chip selects via the top two address lines, to make them behave like a single 62256
@@harisalic2568 I've tried using the NAND to control it in many ways already. In some way the 6264 has to use pin 15 of the 6502: since we invert it when accessing the ROM when the 15th bit is zero, we must use it when it's 1 to access the RAM. But for some reason all my tries have failed. I'm gonna keep trying until I can get the 62256.
@@harisalic2568 With the money I'd spend buying four 6264s and the logic gates, plus the work involved, it's better to buy the 62256 and use Ben's approach 🥲
Must get back to my 6502 project ☺️
Ben eater uploading is always inspiring!
Everyday is a good day when Ben Eater puts up a new video. Peek/Poke always reminds me of the mid 80s when I was working overseas in the UK developing commercial applications for the psion organiser II and writing rudimentary games in my spare time. Fond memories of crashing the thing trying to work out how to implement things.
Great video. Thanks.
This reminds me of the days when I added commands to the Basic of my C128. You could copy the ROM to RAM and then extend the token table. That was fun.
5:30 I think you're confusing two concepts: Just because it's interpreted does not mean it has to parse the ASCII source code from scratch each time the line is run. A reasonable implementation would _tokenize_ the line of code upon entry. When running the line, it does *not* have to read through "POKE" but rather it sees a small number indicating the command, and it doesn't have to parse "24567" (a remarkably laborious process most people don't appreciate -- each digit requires a multiplication) but can load a 16-bit value directly as two bytes.
Looking it up, I see that Microsoft BASIC ca 1977 was _partially tokenized_ . The keyword like POKE was indeed replaced with a small number and was not parsed each time; but the value 24567 did have to be parsed each time.
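To make the "each digit requires a multiplication" point concrete, the constant parser effectively does something like this every time it meets 24567 in the source (a BASIC rendering of the idea, not the actual ROM routine - which, as far as I know, accumulates in floating point and is slower still):
10 N$="24567":V=0
20 FOR I=1 TO LEN(N$)
30 V=V*10+ASC(MID$(N$,I,1))-48 : REM ONE MULTIPLY-AND-ADD PER DIGIT
40 NEXT I
50 PRINT V
Tokenizing the keyword saves the text matching, but this per-digit work still happens for every numeric literal on every run through the line.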
Yep, those early Microsoft BASIC interpreters only tokenized the keywords; still better than nothing.
Yes, you could shorten the word POKE by typing P and then SHIFT+O, which would print something like p୮ but then be interpreted as POKE in the listing. I never fully figured out what difference that SHIFT+letter made, but I assume it's something to do with setting the high bit of the letter O, which means it doesn't have to read any further into the command. I don't know how that then stops it being interpreted as the command POS instead.
@@AntiAtheismIsUnstoppable It does not use ASCII.
I think the shortcuts are simply programmed in, or a side effect of the look-up process during tokenization. The actual token is a single byte, not a normal letter first.
Just curious, did I miss an explanation about why we're using 4-bit-mode instead of 8 bit like all the previous videos? 🤔
In 4-bit mode the LCD only consumes half the I/O of the VIA (just Port B in this case). When Ben did the PS/2 keyboard he switched to 4-bit mode on the LCD to free up the pins on Port A to read the keyboard. There wasn't a video on YouTube about 4-bit mode; maybe there was one on Patreon. The PS/2 example source code on the website has all the 4-bit mode routines.
CS student in university here - this series inspired me to focus on systems programming. Thank you for these amazing videos. I know there are a lot of older programmers reminiscing in the comments, but there are young ones too, getting exposure to low-level coding/hardware!
I find it heartwarming anyone does a FOR I ... NEXT I. That's how I was taught for/next loops in Applesoft BASIC as a kid in 1981, so it brings me back. There must have been a book example that everyone referenced for the letter "I" to be used so often. Sure, that letter was probably chosen because it's a non-descript integer, but someone made that choice rather than, say "A", and it stuck.
I always thought the letter "i" was used in loops because it is short for "index". Not sure if that's the actual reason though or just one that someone came up with later that happens to fit.
@@thislooksfun1 It's because really old versions of Fortran determined the type of a variable by its name, so "I" was an "integer" variable, which made it suitable for loops, and it became convention.
@@codahighland Huh, neat! Today I learned!
Using I for loops probably harks back to Fortran where I, J, K, L, M and N defaulted to integers and all the other letters default to real (float).
@@codahighland Yes, variables beginning with I to N (the first two letters of "integer") were integers by default; others defaulted to REAL (float). By Fortran IV though it had become best practice to explicitly declare them - that was still necessary even before then for COMPLEX, CHARACTER or any array
Will you put FORTH on the 6502 computer later in the series? It feels like a much more fitting language for the hardware.
All you have to do is find out how the PRINT command detects numbers and strings and have LCDPRINT do the same.
(I bet you've already done that. lol)
I remember on the TRS-80 (Model 4), and on the basic that came with early PCs, you had the LPRINT, and LLIST commands (that did the same as PRINT, and LIST) except they sent the data to the printer; so you could print out your programs, or make cool banners on the printer. It's neat to see your basic kind of works the other way around by default (if you imagine the idea of swapping the LPT printer port with a RS232 serial interface).
Still, it would be really neat to see a whole slew of (equivalent) L-prefixed commands for your "Hitachi interface" compatible LCD module...
Such as LCLS to clear it, LPRINT to write to it, LPOKE/LPEEK to set custom character graphics for the upper 128 characters, or read status, etc. Not sure if there's value in LLIST, because even the larger modules are only 40x4 characters... But you might want to account for those too, as IIRC their memory is non-linear (i.e. they behave like 20x8 screens).
Anyway, just chimed in to say this is super cool, and I love what you're doing with it, (even if my imagination runs a bit wild). 🤘
Ben is cooking again
Ben is done eating, now he's Ben Cooker
And now we're the one's eating.
Well hey there, folks!
Today we're making Basic-flavored Code Sausage... that's the Basic juice!
Thank you for explaining what no one could explain when I was 13 trying to figure this out and asking everyone I knew
All clueless and the books at the time were so much worse
1000X better than my worthless college class where we had to memorize dma pinouts
Ahhhhh.. the power of editing video. The code works first time! I wish it was so easy for me...😂 Great video. So interesting to see how the BASIC interpreter is built from low level components.
I guess it's the power of making the video after having everything up and running ;)
That realtime comparison at the end is a really nice visual demonstration of how slow interpreted languages are, lol. I'm half joking ofc, modern hardware and modern languages are a lot more powerful than this, but still
It's not really a fair comparison.
The three integer divisions per character he is doing there are probably a significant portion of the execution time.
@@boretrk Also, he should be using the . trick in the for loop, maybe?
So for I = . to Len(s$)-1
But I admit he then has to use a Mid(s$,I+1,1) too; it should still be a little faster though
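Something like this, assuming the loop in the video was FOR I=1 TO LEN(S$) with MID$(S$,I,1) - that part is a guess on my end:
10 S$="HELLO WORLD"
20 FOR I=. TO LEN(S$)-1 : REM "." PARSES FASTER THAN A WRITTEN-OUT 0
30 LCDPRINT ASC(MID$(S$,I+1,1))
40 NEXT I
Any saving is small, since the FOR start and limit only get evaluated once when the loop is entered, but every little helps on a machine this slow.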
I made LCDPRINT by using the formula evaluator of msbasic. So LCDPRINT is like PRINT (not well tested). The following code should fit with Ben's code on GitHub, which is a little bit different from his code in the video. The code is more or less a compressed copy of PRINT.
LCDPRINT:
jsr FRMEVL ; Evaluate formula
bit VALTYP ; Is it a string?
bmi lcd_print_string ; Yes
jsr FOUT ; Format floating point output
jsr STRLIT ; Build string descriptor
lcd_print_string:
jsr FREFAC ; Returns temp pointer to string
tax ; Put count to counter
ldy #0
inx ; Move one ahead
lcd_print_next:
dex
beq lcd_print_end ; All done
lda (INDEX),y ; Load char of string
jsr lcd_print_char ; Output to lcd
iny
bne lcd_print_next ; Go on with next char
lcd_print_end:
rts
Hi, I gave your code a go, and it doesn't seem to work. Mind you, neither does Ben's code on Github
@@alancx523 Thanks for trying. I didn't use Ben's code from GitHub, but I didn't want to post my complete lcd.s. I'm using the code from the video. You can try it too. The label lcd_print_char is the routine to output a char to the LCD. In Ben's older code it's labeled print_char.
@@Ralf4511 I'll have another go :)
@@Ralf4511
Together like this? If so, it doesn't work. I type LCDPRINT "STUFF", it returns and I get nothing.
Hoping I've got something wrong!!
LCDPRINT:
jsr FRMEVL ; Evaluate formula
bit VALTYP ; Is it a string?
bmi lcd_print_string ; Yes
jsr FOUT ; Format floating point output
jsr STRLIT ; Build string descriptor
lcd_print_string:
jsr FREFAC ; Returns temp pointer to string
tax ; Put count to counter
ldy #0
inx ; Move one ahead
lcd_print_next:
dex
beq lcd_print_end ; All done
lda (INDEX),y ; Load char of string
jsr lcd_print_char ; Output to lcd
iny
bne lcd_print_next ; Go on with next char
lcd_print_end:
rts
lcd_print_char:
jsr lcd_wait
pha
lsr
lsr
lsr
lsr ; Send high 4 bits
ora #RS ; Set RS
sta PORTB
ora #E ; Set E bit to send instruction
sta PORTB
eor #E ; Clear E bit
sta PORTB
pla
and #%00001111 ; Send low 4 bits
ora #RS ; Set RS
sta PORTB
ora #E ; Set E bit to send instruction
sta PORTB
eor #E ; Clear E bit
sta PORTB
rts
Something hit me about the opening when he said "so I've got this 8-bit breadboard computer that runs BASIC" like he hasn't spent the last 6-plus years making it. So good. Can't wait to see where we go from here!
Great video, but I somehow think that you should have called the routine LCDOUT instead of LCDPRINT. An easy way to make (the new) LCDPRINT do everything PRINT does is: set a global variable that lets PRINT know if it should use OUTCHAR or LCDOUT (a new function to output a single char to the LCD). Then, LCDPRINT sets the "variable" to 1, does a JSR PRINT, clears the variable and RTS's. This would of course have a small performance hit on PRINT in general, but could still be a somewhat clean solution, since the new branch in PRINT would be set in an ifdef block. That way, you could just use LCDPRINT with any kind of expression. You could even use a global pointer to whatever output method PRINT should use. That way, you would be future-proof for the hopefully coming VGA support 🙂
That's a great idea! I also felt it was a bit strange to call it "print", yet have it be unable to be given a string. But I wasn't sure what I'd call it instead - I like your "out" suggestion.
Just use the current PRINT command's line-evaluation routine to create the string to be printed, then use custom code to print that to the LCD. The only complicated part would be if you want to interpret control codes as commands rather than as text characters.
Like I said in another comment, Applesoft BASIC allows you to patch COUT() to be a custom routine (and all basic output goes through COUT).. Applesoft is made by Microsoft so I bet they'd have the same ability in msbasic.
@@misterkite I have never heard of a COUT() command in any version of BASIC which I've used, so I'd guess it only appears in Applesoft BASIC. It sounds like it's similar to the USR() function which exists in other BASICs, but specialised to character output.
@@melkiorwiseman5234 COUT() isn't a public facing command, it's the final "output" routine that outputs what's in the A register to the screen. There are various basic-specific ways to redirect cout, the most common is to use special control characters.. like PRINT CHR$(4);"CAT" the control character 4 causes the output to redirect to DOS in Applesoft.
This video reminds me of my days with a vic 20 at home, and learning from CBM in school. I remember one poke value would make the CBM screen wrap around as if it were a tube. I imagine that was for changing the screen dynamics since screens were not flat back then, and this particular poke was just the maximum value, instead of a much smaller number to make the desktop look flat and rectangular. Those days were fun because the importance of computers was only just starting to be understood.
This video blows my mind with the different peripherals we could hook up and control with BASIC.
Love these videos. Well done, and fun to watch. How about sound for this little cpu?
I messed around with peek & poke with c64 decades ago, now I understand what I did. Thanks!
Sure would like to see a video card from back in the day hooked up. Not a DIY video card like you did, but a CGA/EGA card that hooks up to a monitor. I know the power requirements change and probably a few more chips are needed. I for one would buy the interface kit.
Glad to see more of this.
Would be cool if I could take any random chips and make a computer out of them. Guess I should get a kit before even dreaming of doing anything that lambastic.
Try to make a computer from a 555, 74S140, and a TL4242. Those were chosen randomly from a large list of integrated circuits. Why not dream of something sensible rather than just random? Would be cool if I could make a car from a windshield, an exhaust hanger, and a blow-off valve.
I don't think that word means what you think it means.
@@codefeenix meh; I much prefer building a car from an old blown subwoofer, a passenger-side door latch, and an ignition motor.
You could have LCDPRINT check for the quote character; if it's present, then call LCDPRINTSTRING, which would do the string processing... but it seems fast enough that on such a small display it wouldn't really be needed.
Back in those days, I read about people using PEEK and POKE to put assembly code into memory and then have BASIC call that code as their own BASIC function, or even run native code.
5:20 A bit of minor pedantry here: I believe that MS BASIC tokenises keywords when you type them in, so each POKE is only a 1 or 2 byte (I don't remember which) token, rather than the whole command in ASCII. OTOH, it *does* have to parse numeric constants each time, which is slow AF. Back in the day, we put all that stuff in variables to help speed things up.
babe, wake up. new Ben Eater video just dropped
I miss turning on a computer and you're right in BASIC.
Yeeessss, finally you uploaded, I love your videos!
Ben is the OG bit-flippin' bad boy. I commend you on your projects; as an assembly programmer, it's a lost art IMO
you just put a link in my brain between my interest for the computer and my job as an electrician, thanks
love the special guest appearance by the HP16 🙂
Great video. Have you ever used GFA-BASIC?
You make me remember my first days in computing in the middle of the '90s
Just a quick guess but have you tried looking up the PRINT command in the source code to see how it handles the string parameter.
now add a command to draw 3d graphics on to a monitor
Ben Ate with this one.
Just like that, Ben creates new BASIC commands.
Man this channel gets me GOING. I love it so much.
That was so cool. I have added commands to commodore 64 BASIC. I recommend anyone try that as well
Thank you for another great video ❤ is there any reason why you do not have github sponsor enabled?
My '80s memories coming back to me. Beginner's All-purpose Symbolic Instruction Code. I used to say that to my friends back then. Cringe.
This is exactly what I wanted. I've implemented this in JavaScript, using PEEK and POKE to interact with on-screen LEDs and switches
You make it look so easy... Fantastic!
thank you Ben for these videos, can I ask why you do not have github sponsors option enabled on your github account?
5:56 The "kbd" files are likely for the Mattel Electronics Keyboard Component's BASIC for the Intellivision gaming console.
Methinks the PRINT command will be explored next for clues to print a string to the LCD.
Is it an oversight that you didn't preserve the state of the X-register by pushing it to the stack (and popping it back out at the end of your subroutine) or did you decide it wasn't necessary?
You only need to preserve a register if you know you're changing something that is set and shouldn't be changed. When you enter a subroutine, the only regs that are important are the ones transferring data into the subroutine. Those are the ones you don't want to change. But others don't matter (generally).
Now if you're making an interrupt handler, which can be invoked absolutely anywhere in your code, then you have no idea what shouldn't be changed, and you have to be careful to preserve EVERYTHING.
It's a trade-off between portability and performance. Preserving the register makes it easier to re-use code in other projects and for other people to build on it.
I originally learned Assembly from an old textbook on Win3.1, where the recommended convention was 'preserve by default'. But the book assumed a 486 CPU, I can see why you might choose 'clobber by default' on a microcontroller system.
This has me thinking about those old pocket computers that had an entire interface in a two-line LCD. I wonder what it would take to make this fully self-contained.
I really need to pull the trigger and get a 6502 kit. This looks way too fun!
Would love to see a driver routine written so BASIC runs natively on the LCD display... it wouldn't be very good or usable, but many portable computers did that, pretty much spawning graphing calculators.
I was wondering if what he's doing here was basically writing a driver. How is a driver different than this?
Haven't played with BASIC at this level before but wouldn't the necessary string parsing already exist in the PRINT command?
In the programming world, "S$" would be pronounced either as "ess string" or just "ess", for simplicity.
"S DOLLARSIGN" (or dollars S) just kinda rolls off the tongue in a way that ess does not.
@@codefeenix yeah back in the day I'd usually say 'ess dollar', sometimes 'ess string'
I would call this "ess string"
There's now plenty of "programming world" contexts where $ doesn't mean a string. Far more than where it does, in fact.
Shells and PHP use leading $ to signify a variable name, many JavaScript libraries have a convention for foo$ to be a stream or signal for a foo, etc.
@@SimonBuchanNz Yes and we're using BASIC here where $ indicates that a variable is a string. What $ is used for in 634 other languages and 12 other currencies really is irrelevant here. Since the days of yore (and I was there), S$ in BASIC is pronounced "ess string".
Nice stuff Ben. I'm considering changing C64 BASIC to use the math capabilities of the Kawari to speed up BASIC in general, although I haven't yet figured out how best to convert between FP and multiple integer operations and back again.
Microsoft didn't bother. All math operations were done in FP. If you worked with integer variables, they were converted to FP for processing and back again for storage. This is why using integer variables is actually slightly slower than FP.
As an aside, MS BASIC predates the IEEE FP standard, so you may have problems working with any hardware FP solution.
@@stevetodd7383 Yep, I'm aware. The point being that the Kawari can do 16x16=32-bit multiplication, and the C64 has a 32-bit mantissa, so it should be possible to speed up at least the mantissa multiplication (which I think is currently done with shifts and additions).
wooooooo! nice to see you're still making vids ❤
Back in the day, we used to modify MS BASIC. Usually to replace the error messages with an obscene variant. It would have been SO cool to have the source code and be able to add new keywords and such like. ... our hacking skills were still a bit lacking then, so reverse engineering the binary was still mostly beyond us.... Open Source is a wonderful thing.
FOR I=1 TO LEN(S$)
If I was teaching a computer programming course, I would deduct 1000 points if someone handed in an assignment with this. You don't want to re-evaluate LEN() for each loop iteration. Instead, set a variable (effectively a constant) to the length and then use that for your loop limit. That is, if the version of BASIC evaluates the FOR loop limit dynamically; I've seen some versions that fix this value even if it's a variable. Very old versions, from 1970 or so.
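For what it's worth, here's the hoisted form being described. Whether it actually saves time depends on the dialect, since classic MS BASIC evaluates the FOR limit once when the FOR statement runs (S$ here is just a made-up example string):
10 S$ = "HELLO, WORLD"
20 L = LEN(S$) : REM EVALUATE THE LENGTH ONCE, OUTSIDE THE LOOP
30 FOR I = 1 TO L
40 PRINT MID$(S$, I, 1)
50 NEXT I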
would be cool if you got the BASIC output (the one that you get in the terminal) to be shown on the LCD screen :) how hard could it be?
cool video, thanks!
This stuff gives me a wave of nostalgia that literally feels like an ache in the solar plexus.
Yeah, I'm weird that way.
Could you maybe make a small video about incremental encoders? I'd be very interested in your practical way of explaining!
The comment about the interpreter having to read each character and then determine if that forms a valid keyword reminds me of how ZX Spectrum BASIC has each of its keywords assigned an ASCII code, meaning every command is only one byte.
It's called tokenisation, and Microsoft BASIC does it also. It was a standard way to reduce the memory requirements of a program, but Sinclair cheated in that they didn't bother starting with the command as text and converting it to a token before saving the typed line to memory; they just had you type the token directly on the keyboard (resulting in a horribly complex keyboard map).
@stevetodd7383 Mmm, although I think when you typed a program in 128K mode, it did the tokenisation when you entered the line. So it took a little longer to "register" each line but then only needed to interpret the tokens at run time.
@@K-o-R yes, they fixed things for the 128 (and gave it a more conventional keyboard at the same time)
1st rule of programming - if you are repeating a block of code make it a subroutine! Duplicating code is wasteful.
Command 1:
code
code
JMP Common
Command 2:
code
code
JMP Common
Command 3:
code
code
(no need to JMP here, we just continue into Common)
Common:
code
code
etc.
You would usually be correct, but with these older, slower systems it really becomes a trade-off between size and speed. For instance:
A simple loop that fills an 8KB area (the Worlds Worst Video Card) takes around 90,000 cycles:
FillScreen: ; Set up the screen pointer
STZ Screen ; Ben Eater's Worlds Worst Video Card
LDA #$20 ; uses the upper 8KB of system RAM.
STA ScreenH ; This starts at location $2000
LDY #0 ; Zero out Y for the loop; we'll clear the off-screen area as well.
LDA #0 ; Always use 0 for color with this routine. Could load a memory location instead.
LDX #0 ; Zero out X
.MemLoop ; Small, slow, fully rolled clear loop. Also clears the off-screen area.
STA (Screen),Y
INY
BNE .MemLoop
INC ScreenH
LDX ScreenH
CPX #$40 ; Top of screen memory is $3F-FF,
BNE .MemLoop ; do until $40-00
; Reset the pointer back to the start of screen memory ($2000)
STZ Screen
LDA #$20
STA ScreenH
However, take a look at this unrolled routine:
FillScreenLoop:
INX
; This is unrolled so that there is a STA for each row of the screen.
; Display mapping has 28 bytes at the end of each row unused.
; Display location = $2000
FillScreen: ; Enter with X at 0 (or the wanted start column) and A holding the data/color value
STA Display,x ; Line 0
STA Display+$80,x ; Line 1
STA Display+$100,x ; ...
STA Display+$180,x
STA Display+$200,x
STA Display+$280,x
..... etc .....
STA Display+$1F00,x
STA Display+$1F80,x ; Line 63 (0-63, i.e. 64 lines)
CPX #99 ; Each row is 100 bytes wide (0-99, i.e. 100 columns)
BEQ FillScreenLoopEnd ; 1 compare and 1 branch/loop pair per 64 rows
JMP FillScreenLoop ; about 5 CPU cycles per pixel drawn
FillScreenLoopEnd: ; roughly 3x faster than a normal rolled loop
The routine above takes only 32,910 cycles, about a third of the CPU cycles of the 'normal' routine. If your goal is to update a screen as quickly as possible, you have no choice but to trade space for speed. Even with some tweaks, 'proper' functions like those that are normal on modern systems are just too slow for some things on a 6502 running at a couple of MHz or less, though they will make everything easier to port to another system and possibly help use less memory.
But then on the flip side of both of these extremes you have something like the Woz Monitor, which is neither 'proper' nor slow, and is also known as one of the densest little chunks of code ever produced!
I wrote similar code on a Commodore 64 emulator running MS BASIC and it took about 1 second to run. When I moved the POKE address to a variable and stored the INT(ASC(... expression in a variable, it ran almost twice as fast. You had to make optimizations on these old machines (and on new ones, too).
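Roughly the kind of change being described, sketched with the C64 border colour register as an example address (the elided INT(ASC(... expression isn't reproduced). The win mostly comes from the interpreter re-parsing numeric literals from text on every pass, while a variable lookup is much cheaper:
100 REM SLOWER: 53280 IS RE-PARSED FROM TEXT ON EVERY ITERATION
110 FOR I = 0 TO 255
120 POKE 53280, I AND 15
130 NEXT I
140 REM FASTER: PUT THE ADDRESS AND MASK IN VARIABLES FIRST
150 A = 53280 : M = 15
160 FOR I = 0 TO 255
170 POKE A, I AND M
180 NEXT I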
I love your videos and have learned a lot about computing and electronics from them. I'm trying to expand my knowledge, and it's content like this that does it best for me.
Would you be open to doing a video on using a Raspberry Pi to control an ESP32 (that in turn controls another device, even an LED, or a neo pixel, or whatever is most interesting for you)?
I ask this because I really want a better understanding of creating Linux device trees and drivers for controlling any kind of device using something like i2c to get started.
Maybe there are other videos you've done that already covered this and I somehow missed them?
Can you add general hex/binary support? Very handy BASIC feature overall! E.G. POKE $D015, %10000100
Though that'd give your HP RPN calc less love.
No, the interpreter doesn't "read P, O, K, E". The main feature of MS BASIC is that all commands/keywords are tokenized, so POKE is represented by a single byte (called a token in MS BASIC), 0x94 to be precise.
How else would it recognize the token if not by reading it one character at a time?
It only reads the characters P, O, K, and E once, after the user has entered a program line. Having recognized these characters in that order, it enters a "token" at the appropriate place, which is a single byte corresponding to the POKE command. It's a kind of bytecode. @alexander53
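To make that concrete, here's roughly how a stored MS BASIC line is laid out in memory, in assembler-style pseudocode (TOKEN_POKE stands in for whatever byte the particular build's token table assigns to POKE):
; Hypothetical in-memory storage of the line:  10 POKE 828,0
line_10:
.word line_20 ; link: address of the next stored program line
.word 10 ; line number as a 16-bit integer
.byte TOKEN_POKE ; a single token byte replaces the keyword "POKE"
.byte " 828,0" ; the arguments stay as plain text
.byte 0 ; end-of-line marker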
@@malloryworlton6359 Yep, on the Sinclair computers you couldn't even type the commands; you basically had the tokens on the keys.
Love your 6502 content!
gonna be a great weekend another ben eater video !
I really wouldn't call this hacking, but augmenting or expanding. Manufacturers who licensed it or built upon it back in the day basically did the same for their custom functions.
Creating your own BASIC commands would never have occurred to me, but if you have the source code ...
0:40 Nice to see a 16C to demonstrate hex
One thing I noticed, not sure if it's relevant, but in your LCD instructions you're calling txa to transfer X to A, but then shortly after calling pha to push A to the stack, then calling pla at the end to pop A back.
It looks like whatever was in A before you run one of the LCD commands is getting lost, but I could be wrong.
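If preserving A were wanted there, the push would just need to happen before the txa, something like this (lcd_instruction is a placeholder name, not necessarily what the actual routine is called):
pha ; save the caller's A before txa overwrites it
phx ; save X as well, if the called routine clobbers it
txa ; now it's safe to copy X into A
jsr lcd_instruction ; placeholder: routine that sends the value in A to the LCD
plx ; restore X
pla ; restore A
rts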
I would have really loved to have a series of videos about a bare bones GPU with those 74 series chips :/
I'm not super familiar with msbasic, but Applesoft BASIC is made by Microsoft as well, and it supports redirecting COUT. This allows you to redirect all of BASIC's standard output to a custom routine... no need to create a new BASIC command.
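If memory serves, on the Apple II that output hook is the CSW vector at $36/$37 (decimal 54/55), so from Applesoft you can point character output at your own machine-code routine with a couple of POKEs. A rough sketch, assuming the routine has already been loaded at $0300 (768) and DOS 3.3 is present for the CALL 1002 reconnect:
10 AD = 768 : REM ADDRESS OF A CUSTOM OUTPUT ROUTINE (EXAMPLE ONLY)
20 POKE 54, AD - INT(AD / 256) * 256 : REM CSW LOW BYTE ($36)
30 POKE 55, INT(AD / 256) : REM CSW HIGH BYTE ($37)
40 CALL 1002 : REM LET DOS 3.3 RE-HOOK ITS I/O AROUND THE NEW VECTOR
50 PRINT "THIS NOW GOES THROUGH THE CUSTOM ROUTINE"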
Wow Ben is alive !!
or I have been dead this whole time i guess.
I understood about 3% of this video. A new record!
Excellent video as usual Ben 👍
I was thinking an hour ago, "It's about time Ben Eater posts something..." 😄 Granted!
I would love a video about file allocation table from binary point of view made by Ben
the legend is back!