I honestly wish I had the slightest clue how the hell someone goes about learning this kind of stuff from the very start, and where to go with each next step that is actually graspable to the average punter, because this is just totally and utterly alien to me. It boggles my mind anyone can ever grasp this stuff. But, really frustratingly, I kinda get the impression from everything I've ever even slightly touched on this topic [or anything related] that I'd need to basically understand stuff at this level to ever really figure out how the living Christ to program for the SNES myself (that's what I'm actually interested in). Because the community most certainly isn't making SNES development tools that regular, more casual game makers like me might want to use or indeed can use. Real shame it's still all so complex and beyond the grasp of most people.
And, seriously, what the hell subject are people taking at high school or wherever that somehow provides them with the very basic knowledge required to even remotely understand where to start and where to go next, to be able to one day actually do this kind of thing with any semblance of competence? Every single time I try to start learning from whatever resource online, I reach a moment where I'm like, this cannot be the start, I've missed something, there's some step I need to understand that I should have known about before this point, because they're talking about stuff that I just don't have a clue about, yet they talk as if I'm already supposed to know it somehow, even though they themselves never taught me about it prior to mentioning it. It's just so frustrating to think I'm missing some fundamental piece of a puzzle here, some essential knowledge or even just some way of thinking about things that others seem to be able to do, and I cannot for the life of me figure out what it is.
You could study physics. Only one person in my class proceeded with maths, because it seems more fundamental and clean. Don't go straight into applied physics for your bachelor's, though.
I have long hoped for an emulator that doesn't emulate just the logic behaviour of the 6502, but its electrical inputs and outputs. Although demanding on performance, it would be perfectly doable. Imagine an interface that just exposes the input and output pins. I think it would only need to be written to expose the pins, but internally it could have a matrix of the actual transistor grid.
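(For illustration only: a minimal Python sketch of what such a pin-level interface could look like. Every name here is invented for the example, and the internal simulation (netlist, transistor matrix, whatever) is deliberately left as a stub.)

```python
class PinLevel6502:
    """Hypothetical pin-level wrapper: callers only ever see bus/pin state."""

    def __init__(self):
        self.addr = 0x0000   # A0..A15 output pins
        self.data = 0x00     # D0..D7 data bus
        self.rw = 1          # R/W pin: 1 = read, 0 = write
        self.rdy = 1         # RDY input pin
        self.irq = 1         # /IRQ input pin (active low)
        self.nmi = 1         # /NMI input pin (active low)
        # internal state (registers, or a full transistor netlist) would live here

    def half_cycle(self, phi2_high: bool) -> None:
        """Advance the simulated silicon by one clock phase and refresh the pins."""
        raise NotImplementedError("core simulation (e.g. a netlist evaluator) goes here")

# Usage idea: an external memory model watches addr/rw after each half_cycle()
# and drives data, exactly as RAM/ROM chips would on a real bus.
```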
FPGAs and even the Raspberry Pi have general-purpose IO pins. I was just shocked to find videos on YouTube where the timing of a microcontroller was as bad as on a desktop PC, with long pauses and jitter. Why do vendors set up a microcontroller like this? If I didn't care about real-time, I would stick to a PC.
In 1977 I made the choice to go the Z80 route via the TRS-80 Model I, as the instruction set and registers were more scalable and you didn't have to do gymnastics to get things done. 40+ years later I can still remember Z80 opcodes from memory. _lawn(void int x0FF)
All the ideas for RISC and pipelining came out of the design of the 6502... back then we didn't say "it's a bit like pipelining" we said "it HAS pipelining"!
The transistor count has gone up by several orders of magnitude (and the transistors have become similarly smaller), so I imagine it's just not feasible to read it directly in the same way now without some abstraction.
The Super Nintendo has 15 addressing modes for LDA and ADC, SBC, CMP; it also has a 24-bit address bus and can use a 16 MB ROM cartridge without using a mapper like on the NES.
The CPU uses its 16-bit ALU to avoid the additional cycle when you pass a page boundary while addressing. But all good NES coders laid out arrays and loops to stay within a page. So, no speed-up. Do 2D games need 16-bit registers?
Decimal mode is so important in business and financial applications that IBM based its entire mainframe character set (EBCDIC) on being able to efficiently store the values "0".."9". Decimal representation is important because it's more accurate, when dealing with large numerical values, than holding numbers in floating-point form. The downside, of course, is that BCD arithmetic is slower to perform, but that's relatively unimportant when accuracy is your greater concern: to overcome that, the mainframe instruction set was equipped with opcodes and macros intended specifically to deal with arithmetic on decimal values. (The 6502 could only do adds and subtracts on decimal values.) Incidentally, the 8080 and its successors, including the Pentium, all had decimal (BCD) support, as did the Motorola 68xxx. BCD continues to be a supported data type in databases (e.g., MS SQL Server), as well as on Microsoft's .NET framework.
I strongly suspect many modern implementations use plain 64-bit integers instead of 18-digit BCD. Decimal floating point was a thing in many calculators and the TI-99 BASIC interpreter. The TI-99 used base-100 encoding, with each 2-digit pair encoded as an ordinary byte instead of a BCD byte, to avoid dedicated hardware features.
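(A quick illustration of the rounding point above, as a Python sketch rather than anything from the talk: binary floats cannot represent most decimal fractions exactly, while a decimal type, much like BCD, can.)

```python
from decimal import Decimal

# Add ten cents a thousand times: the binary float drifts, the decimal value doesn't.
total_float = sum(0.10 for _ in range(1000))
total_dec = sum(Decimal("0.10") for _ in range(1000))

print(total_float == 100.0)               # False: slightly off from 100.0
print(total_dec == Decimal("100.00"))     # True: exactly 100.00
```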
This is fascinating. For practical reasons I'd like to know what the 65c816 does... (trying to work out the snes, basically) But I guess that will take a while. Obviously, it derives from the 65c02, which is implied in the name, and while that undoubtedly differs from the original (that's even documented in the programming guide), we also know of course that there are no illegal opcodes in 16 bit native mode, and that in emulation mode all illegal opcodes equate to 'no op'. But how does it vary and build on the 65c02 exactly? Does it effectively contain an entire 65c02 and then additional components? Or is it actually quite different internally and just pretends to be like a 6502 through the way the external logic is set up? I guess having a 6502 emulator that is 100% accurate would still allow you to verify emulation mode of a 65c816... Bearing in mind the known altered behaviour of course... mmh. One day, I suppose. XD
My understanding of the 65816, which may be wrong, is that emulation mode performs the same operations for each opcode that are performed in native mode, just without the option of switching to 16-bit. The bits that make 16-bit possible are inaccessible and fixed to be clear and "replaced" with the B flag and a 1 (the B flag is just an artifact used when pushing P to the stack during interrupt generation to signify if BRK caused the interrupt).
We had them make a 6502 with everything that was 8 made 16, only. However, they sped up the transistors about 10-fold, doubled the number of addressing modes (though on the same principle), and banked the 4 memories. It should have had 24 bits and 2 operands, though.
Super presentation. I really like the contents and the material. I'd like to know how the presentation material was built; it's so video-like. What software was used? And especially, what is this video (?) font?
Does the $00 IR injection on IRQ really cause BRK not to work on interrupt? I'd assume the PC doesn't get incremented on that fetch, so when the handler returns it would refetch the BRK instruction and execute it. It would seem a more plausible explanation is that the RTI doesn't reenable interrupts until after the next instruction following the return has been executed, so it always executes at least one instruction with interrupts disabled on return. With interrupts disabled, BRK becomes a no-op. Just guessing here, but P comes off the stack last, after the PC which points to the BRK in the instruction flow. Then the S (?) bus gets latched into P while the BRK instruction is latched into IR due to the T-state overlap. Again, just guessing.
Holy cow. I feel like those guys in the 70's were pure geniuses and are producing technology that is science fiction to my brain today.
Yeah they were... Apollo missions were pure genius. Transistors. And lately, software.
it's not that they were geniuses, it's that today's people are dumbasses.
seriously if you just compare the movies at the end of the 70's with the movies from the beginning of the 80's, there is a rapid decline in intelligence. and don't get me started with the music. yes there were notable exceptions, but I'm talking about the majority here.
also, when we're comparing, it is important to notice that certain gems were not seen as such at the time. for example, the original Blade Runner (1982) was a complete flop at the box office, even though today it's considered one of the best movies ever made.
my theory is that this effect is due to baby boomers becoming a dominant adult generation. as they were born in the 50's and onward, they started entering their 30's during the 80's, and by that time, for some reason, they were as dumb as it gets (at least when compared with their predecessors). I include my own parents there, even though they were exceptional for the times -- my father was a big fan of the microchip era, and even assembled a computer in the early 80's, but in the mid-90's it was already above his head, he simply couldn't grasp the reality of it (especially the software), and stuck to the hardware reselling, because as personal computers flooded the market, plenty of even dumber people wanted one for themselves.
@@milanstevic8424 this might be the cringiest comment I've seen in like 20 years of internet
@@jaleger2295 yeah? what is so cringy about it? you must be a millennial dumbass if you're really feeling it, but then you couldn't have really seen the internet 20 years ago, so I'm not sure what your case is. because man, if you're caught in the middle, like the 80's generations, you're a victim as well of being sandwiched by the retards, and are probably struggling with your emotional health. I just don't see why you would find my comment cringy then.
There's some amazing science fiction today, particularly in the physics field!
I watch this about three times a year. It’s one of the best talks on the 6502 ever done. It inspired me to study assembly.
It's probably the best way to understand how a microprocessor works. I had the Rodnay Zaks 6502 book with me all through the 80's!!
Same here. I was 14 when I got my Commodore 64, and when I read the assembly language book that everyone bought for it, it was such an eye opener. You read through all the instructions and at the end, it's like, "That's it?! That's all there is to it?!" The beauty of learning such a simple, middle of the road instruction set like 6502 is that, unless Assembly is going to be your main career jam, virtually every paper you ever want to read on Assembly becomes comprehensible, even if you don't know much about the architecture that is the platform of choice for the paper: everything you read is pretty much translatable into 6502 terms (RISC and microcode experts will slaughter me in the replies, of course).
@@GodzillaGoesGaga Sybex books - I had 2 or 3 of them, the editing was terrible and the contents were more or less thrown together; despite that they were worth having and I learned a lot from them.
Thanks Rodnay [sic] !
You'll love Ben Eater's 6502 series then.
@@GodzillaGoesGaga ok cool. Yet how did this come from the simpler computer used on Apollo?
"65312 views"... only 224 views left before the counter rolls back to zero!
It's 8 bits, so the counter had already rolled back to zero 255 times and was currently at 32 by the time you made your comment.
@@roger58679 no, views is about addresses.
@Dr. M. H. Yes, we saw the Carry flag was set! :)
Few years later - it made my day. Thanx :)
Ahahahaaha
Just finishing up the Ben Eater series on the 8 bit breadboard computer... and wow! I started this 6502 video years ago and was like "woops! what have I stumbled into here...". Now, after Ben's instructional videos, I could actually keep up with this presentation and it was making a lot more sense. Thanks Ben, and Michael (and 6502 team).
I ended up building my own custom SAP-1 after watching Ben's series. I finally understand. The 6502 is a logical next step :)
The 6502 was the very first processor I programmed back in the early 80's. This was a fascinating presentation, well done!
+The Cobra Build Channel I did Z80 first and, because it's got more registers and "clever" instructions, I thought it was better for years... But the 6502 has won my heart over with its simplicity in recent years.
VIC-20 FOREVER!
@@edgeeffect All these years on and it's still 6502 VS the mighty Z80 😁 who would have thought it.
Same here. Still using a BBC Master128 here, updated with MicroSD for storage.
I remember a short personal meeting with Chuck Peddle, the 6502's developer, at a Hannover fair at the MOS technology booth. Chuck was an absolutely modest man. We discussed a few application questions, and yes, he knew every transistor by its first name...
They were all called Trev.
I guess their last names were all Peddle, because they were his children.
When you first watch this you are like "ok, this guy is from another planet". After watching Ben Eater's series on how to build a 6502 computer from scratch, this becomes a lot clearer.
It's like digital archaeology, isn't it.
Yes!!
But it is still very much used today, the 6502...
err excuse me
thats a great way to put it. but the 6502 and Zilog z80 are even used in some applications today!!
Exactly that.
In the eighties, while in elementary school, I had the VIC-20, C64, 128, and about three 1541 drives. I programmed in assembly and BASIC, cracking protection schemes and doing all sorts of reverse engineering. I was a better programmer using the C64 as a kid than I am now.
I can really say, after all of these years, I never really needed job training. The C64 taught me much of what I've needed to know during my entire career.
Very interesting and awesome work. I loved coding for the 6502 in assembly language in the 80s ❤
This is the highest level of forensics I have been witness to and it is most intriguing! My thanks to all who put forth effort for this presentation, and especially the humble contributor in Hungary for piecing together that transistor sheet!👍👍
I've written huge amounts of 6502 assembler, but never seen exactly how it works before. I'm staggered at how much complex behaviour was achieved with so few transistors. I guess the real power here is in the way the internal ROM sequences the operations cycle by cycle. I can now see how you can easily implement something like this in an FPGA.
I'm amazed that so many people thought it was worth the effort of reverse engineering an obsolete processor. I'm glad they did though, that was fascinating.
They do it for the games/software... and for the challenge itself of course.
Absolutely wonderful presentation! This sure brought back many wonderful memories for me. I cut my Assembly programming teeth working with the 4004 CPU chip. It was one of the many simple experimenting boards available in 1980, and I actually used some discrete IC components along with an oscilloscope to program a simple game using the scope as a display. That was so much fun. It is amazing what you can do with 4K of 8-bit RAM and a 1 megahertz clock speed for the CPU. I still have to laugh remembering playing with that simple board. The programming was done on an alphanumeric keypad, using 6 digits of LED display for data and address info. Simply amazing. I went on to program for many CPU models of that time. It started my career as a database designer and programmer for businesses.
Sounds like an awful choice for 1980... there were, for example, the Motorola D2 kit and the KIM-1, SYM-1, AIM-65 (in order of increasing fanciness and expense), as well as more thoroughgoing machines like the Little Big Board (CP/M), and of course the Apple ][ was about 4 years old by that stage.
If you were a masochist you could have at least gone for a SC/MP or CDP1802 ;-)
The first 4-bit processor! I've never used it - I only knew Z80A and 6502 - but I always wanted to try one.
I was a technician at CalComp in the early '70's. They made plotters. One of the test plots we would run was called "IC Mask on Strippable Film".
Oddly enough, this will be more relevant in the future.
Great work!
As I began assembly and machine-language code in the late 70's, I could totally appreciate what is done here and the importance of preserving what is, for me, not only a bunch of transistors but also art...
Simulating a CPU at the physical level is an insanely precise simulation, with all the undocumented features and hardware bugs. Awesome! 😍
it is fascinating how people can wrap their head around this, super smart!
I enjoy the kind of firepower we have with today's processors, but I do miss the simplicity of the early processors and computers. Programming the 6502 in assembly on my Apple II was a ball.
The limitations forced many first-time programmers not only to dive into the more obscure corners of the instruction set, but also to do their best to get creative in using those instructions to do what they wanted, regardless of how impossible it may have seemed.
@@kuzadupa185 Very much so. There was a series (maybe more than one) in Creative Computing around 1984 or so on 6502 ASM tricks for making games faster (sprite movement, e.g.).
Brilliant stuff.
I think John Carmack said he learned to be clever from 6502 constraints.
@@RogerBarraud thanks for the reply/info, going to check it out, but I was hoping you knew of any good books or even videos/movies that speak about this topic???
this was so fundamentally useful for my 6502 emulator you have no idea
Having learnt assembly language on the VIC-20's 6502 and then Atari ST's 68000, it set me up to program microcontrollers when they arrived on the scene. So easy to learn assembly language for any micro, once the principles are embedded (NPI) in your brain.
I still love writing 6502 Assembly on the Atari 8-bit computers.
Be interesting when we get through to the 68000... and all its descendants. Including maybe finally figuring out what the key differences between the original and the HC model, plus all the embedded versions are, that make them act in subtly different ways that are not only a headache for emulation, but also if you try to use a "real" later processor in an updated / repaired version of a classic machine (and in at least one case, vice versa). They do occasionally burn out, after all.
...and that, in essence, is why this work is so valuable. Microprocessors aren't immortal. Like any electronic component, they gradually wear down, as seen with overworked GPUs used for bitcoin mining. It just takes the older ones a bit longer, much like a low-wattage light bulb, as their traces are larger and clock speeds / TDPs are lower. But they will, some day, all die.
So if we want to keep the memory alive, and be able to run the old software faithfully, we need to encode their behaviour before it's too late.
On which note: there may not be much interest or even value in bothering with this for the x86 line at the moment (though the older chips do have some quirks not replicated in their modern evolutions), but presumably another branch of this project must involve the 8080 and 8085? There are a number of really old classic machines (including a lot of CP/M workhorses, and the original Space Invaders...) that used them, before the 6502 and Z80 (itself a modified clone of the 8080 the same as the 6502 was a modified clone of the 6800). Maybe even the 4004, 4040 and 8008, but I think Intel might actually have released the schematics for those already as part of their 40th anniversary celebrations... Oh, and of course, the 6800 and its variants. It doesn't get as much love as the others, but that processor line was still used in an awful lot of computers and other hardware.
For the surviving companies, publication of older internal documents and interaction with surviving original engineers would be another avenue. Maybe one day we'll see a full explanation of the design differences between 80186 and 8086 handling of common instructions. Or full docs on the original AMD CPUs from the 1970s.
I'm new to the 6502. The more I learn about it, the more I love it.
Really enjoyed watching this. I did a lot of assembler work on the 6502 and 6510 many, many years ago.
Good work!
Wow. I designed and hand-coded an English-language interpreter in 6502 code in 1987. Brings back lots of memories.
Where can i see it?
It's just scarily genius to make such a CPU with only 3510 transistors.
And it's just as genius to get it back onto "paper". I kneel down to the gods.
I think that every IT class should start from the 6502 and its assembly.
25:37 damn boi that's some hardcore dedication right there...
I learned assembly back in 1983 on my Atari 400 with the 6502. Cool stuff. I love the Bender and T-800 references! I had not seen those before.
I think the Hungarian guy Balázs (25:00) has got to be given an award for what he did drawing those transistors!
The Zilog Z80 4MHz processor was the first processor I programmed, back in the early 80's, aged 23, using assembly language. No assembler used - just machine code written from my head in memorised hex numbers to system EPROM. It seems a mind-blowing task now, but if you are determined and interested you will acquire the knowledge to let you create a program that executes with the minimum number of clock cycles. This is a great presentation going down into the 6502, and full credit to Mike - most impressive. I preferred the Z80 because of the linear address bus. Thanks for sharing this video and reminding me of the mnemonics.
What was the 6502 bus like?
@@johndododoe1411 It was just as "linear". But the 6502 was more byte-oriented in its addressing modes and sometimes took an extra cycle when crossing 256-byte boundaries. The Z80 was faster though, despite what 6502 fans will tell you ;-) Especially so on code that had to deal with larger numerical quantities than single-byte numbers, such as floating point.
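(To illustrate the page-crossing point, a minimal Python sketch using the commonly documented timing for an indexed read such as LDA absolute,X: 4 cycles, plus one more when adding X carries into the high address byte. This is just the published rule of thumb for read instructions, not something taken from the talk; indexed stores behave differently.)

```python
def lda_abs_x_cycles(base: int, x: int) -> int:
    """Cycle count for a read like LDA base,X under the documented NMOS 6502 timing."""
    effective = (base + x) & 0xFFFF
    page_crossed = (base & 0xFF00) != (effective & 0xFF00)
    return 4 + (1 if page_crossed else 0)

print(lda_abs_x_cycles(0x12F0, 0x05))  # 4: $12F5 stays in page $12xx
print(lda_abs_x_cycles(0x12F0, 0x20))  # 5: $1310 crosses into page $13xx
```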
@@herrbonk3635 I wasn't asking about speed or efficiency. More about the kind of complexity that plagued the 8080 bus compared to the Z80 simplicity.
@@johndododoe1411 Ok. Yes, the 8080 had status signals multiplexed onto the data bus. But that wasn't a plague really, pretty simple to interface (although more involved than on the Z80). The later 8086/88 had multiplexed address and data on the same bus though, which was a little worse, and it also slowed these chips down somewhat.
The 6502 didn't have any of that, but was instead slowed down by its primitive and pretty unreliable (among chip versions from different manufacturers) memory access method, which tended to lock it into low clock speeds compared to a Z80 or 8088. The latter two would do stuff internally while waiting for memory response for two full cycles, while the 6502 just issued a short pulse (approx 30% to 60% of a clock pulse) and assumed the memory was fast enough.
@@herrbonk3635 Different maximum clock speeds, different numbers of clock cycles per instruction. Very different architectures.
Very hard to make comparisons of the form you are making.
Performance comparisons need to be done by writing programs in each processor's assembly language to perform the same function.
The 6502 also powered a renaissance of pinball games in the 1980s, very nicely engineered, from Bally and others IIRC. Eight Ball, Bally, 1981 as an example.
Not being a coder, most of this was way above my head. But the nerd in me stuck it out to the end and enjoyed every minute of the presentation.
+The Pumpking King you don't have to be a coder to understand.
you have to be an engineer....
same here. I think those people came from outer space, man!!!
learn assembly. he does not explain it in full detail.
if you learn assembly and computer architecture you would understand this completely.
@Dmitriy Getman Meh, you're about halfway there with C. Helps a lot to fool with asm a bit.
These slide transitions are awesome!
"utasításdekódolónak" = "for the instruction decoder"
Would be good to know who was the Balázs mentioned in the talk. He is my hero! Is it you?
@mr.1n5an_e It's not so much that "they have a word for that", but the way the language/grammar works means that a lot of what you might think of as separate words become just parts of a word.
Wait a minute. Wasn't this inside my Nintendo Entertainment System?
Yup!!!
I recognized 6502 Assembly code that was read in hexadecimal on a Doctor Who episode for one of the computers that controlled time.
It was also the text on screen when looking through the eyes of the Terminator in the first Terminator movie with Arnold Schwarzenegger.
The screen would turn red and contain white text that was flashed up rapidly.
I come from a Z80 background, but this presentation was excellent. Great work guys! 😁
Done them BOTH, the 6502 and Z80 ... I miss the good ole days of dealing with limited space and actually having to worry about every single cycle to get the most bang out of the program ....
He should do the Z80
@@johnmarks714 Yes, then he would probably also learn how it works. (Which he clearly didn't understand at this point.)
Brilliant. Now, please, do the 68k. It is only 20 times more complex. :)
The most beautiful processor of that time. By comparison the 8086 had a hopelessly non-orthogonal instruction set, limited addressing and was only 16 bits.
Agreed
Magic. Thank you to all involved for the engineering love and effort.
So many happy memories of poring over the Assembly manual for the Commodore 64, using BASIC to hand-assemble enough 6502 in memory to count to 10... I felt like a 14 year old god.
@The programming crypto guy Sure! Whatever comes to hand. I recently worked with Richard Dawkins translating some old Pascal programs of his into first Java and then JavaScript.
@The programming crypto guy Very good! Learning the 6502 was weird, because once I cracked the basic instruction set and addressing modes, and the stack concept, it was astonishing: THAT'S ALL THERE IS? But, yeah. A Turing machine made out of 1500 transistors or whatever it is. I swear it felt as if one had pulled the sword from the stone. (Or met the monolith; I was a teenage 2001 fan.)
Fascinating stuff! I always wanted to know why there were illegal opcodes for LAX and SAX, but not LAY, SXY, etc.
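(Side note for readers who haven't met these opcodes: the behaviour usually documented for them is sketched below in Python. This is only the commonly published description of what they do, not an answer to the "why", which comes down to which decode lines happen to fire together on the die; the function names and shapes here are just for illustration.)

```python
def lax(value: int) -> tuple[int, int]:
    """LAX: behaves like LDA and LDX at once; A and X both receive the operand."""
    value &= 0xFF
    return value, value               # (new A, new X)

def sax(a: int, x: int) -> int:
    """SAX: the byte written to memory is A AND X; no flags are affected."""
    return a & x

print(lax(0x3C))              # (60, 60): A and X end up identical
print(hex(sax(0xF0, 0x3C)))   # 0x30
```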
cpu architecture archaeology?
Would love to hear the story of the 6502 from one (or more) of the designers... a great history lesson. Can't imagine the work and effort that was put into this CPU design... WOW factor!
watch the Apollo computer rebuild ...and check out the memory cores ... THAT was only a few years before the 6502 ... and BASIC was mid 60's .... then we got the 6502 and z80
Bill Gates had them all killed so that he could ensure a society where nobody knew but him.
Very impressive work behind this!
I had a C64 and learned assembly language. I never bothered learning the hardware implementation (note: I don't mean things like VIC 2 sprites, charset and sound, to use the C64 you *needed* to know them, I'm talking about the silicon itself - gates, delidded chips etc. The literal black boxes on the motherboard)
40 years on I am interested...
This is clearly the best talk ever provided in the history of mankind.
Nope. Too fast.
You sir, are one cool geek! I loved this video!
Okay YouTube, I'll watch the video! If I watch it will you stop recommending it to me?!?
Ahh.. takes me back to the Commodore/Sinclair days.. I guess I was lucky to get 'hand-me-downs' of all the old computers; when my uncle/dad/etc upgraded or got a new machine, I got the old one! Got to play with a 6502, Z80 and 68K! :D
This is a great example for both the debilitating approaches toward building A.I. and the solution therein. -merfthewise
The 6502 was used on the Commodore Vic 20. It introduced me to machine code.
if you had the assembler add on it was a 68k ... ;)
the original designers of ARM were all 6502 fan boys.
They were hardware designers and 6502 coders who needed something better for their next project. They were partly influenced by the 6502 implementation though, after visiting Bill Mensch and the Western Design Center.
You assumed gender.
@@Margarinetaylorgrease and?
One of them is called Sophie
stuart jarvis yes she‘s a great hardware/software engineer, but back then was still Roger Wilson...
I would love to see something like this for the 8088, as a step towards a Visual IBM PC.
Simulating things like floppy drives and CRT monitors in the same kind of detail as the chips might be too much of a stretch, though.
Simply a cool project and presentation of it.
OH OH OH STOP IT. my head hurts. I am going back to carburetors.
Wow Great presentation
What I don't understand is that the original schematics and designs can't be found anymore.
Has somebody really destroyed all of these? They should be hung in a museum!
Mum Blic you can download them, I did ages ago.
Hanniffy Dinn That would be strange! Why would those guys do all that work if the schematic is available on the internet?
Are you really talking about the internals of the 6502? Have you seen the video?
BOOZE & METAL
no the schematics for 6502 have been available for decades.
There is plenty of VLSI EDA free software you can download ( magic , icarus, klayout etc..)
and you can easily find the OASIS/GDS vector files for the 6502 and view them.
Just search! The 6502 is so iconic it's the first thing I downloaded and viewed ages ago in a VLSI program.
This 6502 project is like the huge 6502 maker project, it's just people doing things the hard way because they can for the lolz.
Hanniffy Dinn I think you are confusing some things. Those VLSI designs are functionally (nearly) compatible designs, not really the same as the true internals of the 6502. I'm not a CPU architect, but I'm quite sure you can build the functionality of a 6502 in many different ways. Have you seen the video? Because they proved that the internal design was clearly different from what they had expected. What I was talking about was the original design, not somebody's interpretation!
Mum Blic
What the fuck are you rambling about?
I have the ORIGINAL 6502 design files on my hard drive.
The 6502 is extremely well known and used.
It's not a mystery fuckface it NEVER was.
Interesting talk!
Oh the MOS 6502 CPU!! How it captures everyone's imagination!!
11:53 This is the indirect X addressing mode
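(For anyone following along who hasn't used that mode: a minimal Python sketch of how the (zp,X) effective address is commonly described. The memory contents below are made up for the example; the zero-page wrap-around on both pointer bytes is the documented behaviour.)

```python
def indexed_indirect_address(zero_page, operand: int, x: int) -> int:
    """Effective address for the 6502 (zp,X) addressing mode: add X to the
    zero-page operand (wrapping within page zero), then read a 16-bit
    little-endian pointer from that location."""
    ptr = (operand + x) & 0xFF
    lo = zero_page[ptr]
    hi = zero_page[(ptr + 1) & 0xFF]   # the high byte also wraps within page zero
    return lo | (hi << 8)

zero_page = [0] * 256
zero_page[0x24], zero_page[0x25] = 0x00, 0xC0
print(hex(indexed_indirect_address(zero_page, 0x20, 0x04)))  # 0xc000
```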
Re: the ALU being pretty standard: the 6502 ALU design actually resulted in a patent related to how its decimal mode works. Other CPUs at the time had a "decimal adjust" instruction that would adjust the result of an addition/subtraction into packed BCD, based on the carry and half-carry. The 6502 instead had a flag which would cause the ALU to do this automatically, hence there's no half-carry flag exposed on the 6502.
That it was patented is also likely the reason this mode was missing from the 6502 clone in the NES 2A03. By omitting it, they'd avoid infringing on the patent.
The 6502 ALU certainly has some quirks, but it's very standard compared to the Z80. Same with the sequencing, register latches and other details. See Ken Shirriff's blog posts on these chips!
BCD was meant to improve number calculation, avoiding the problems of rounding, it's designed for money and financial calculations.
People forget about memory constraints (not cheap and not small), and a floating-point package is not exactly tiny, whereas Binary Coded Decimal made it much easier as long as you paid attention to the half-carry bit (for the young-uns: sometimes you needed to test it, other times make sure it was zero before starting a new calc) and to where the decimal point needed to be. Ahhh, those were the days...
The Atari 400/800 BASIC did its math in BCD mode (as BASIC implementations go, it's a very nice BASIC). It was pretty slow but very accurate. I think they used a polynomial expansion for trig functions, so a sine would take a noticeable fraction of a second to compute.
@@messupfreq550
Except the 6502 used a patented method to actually do BCD adds and subtracts without the programmer needing to worry about the half carry bit, nor decimal adjusting the value afterwards.
Annoyingly, the D flag wasn't cleared on an interrupt, which meant the interrupt handler had to clear it, using an extra 2 cycles each interrupt, if it intended to do some ADCs or SBCs.
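(To make the preceding thread concrete, here is a simplified Python sketch of what the D flag does to ADC for valid packed-BCD operands. It ignores the NMOS chip's undocumented flag results and its behaviour on invalid BCD inputs, so it is only a rough model of the idea, not of the silicon; but it shows why the programmer needs no half-carry test and no separate decimal-adjust step.)

```python
def adc(a: int, operand: int, carry_in: int, decimal: bool) -> tuple[int, int]:
    """Simplified 6502 ADC: returns (result byte, carry out).
    D clear: plain binary add. D set: each BCD nibble is corrected automatically."""
    if not decimal:
        total = a + operand + carry_in
        return total & 0xFF, 1 if total > 0xFF else 0

    lo = (a & 0x0F) + (operand & 0x0F) + carry_in
    hi = (a >> 4) + (operand >> 4)
    if lo > 9:          # low-digit overflow: add 6 and carry into the high digit
        lo += 6
        hi += 1
    carry_out = 0
    if hi > 9:          # high-digit overflow: add 6 and set the carry flag
        hi += 6
        carry_out = 1
    return ((hi & 0x0F) << 4) | (lo & 0x0F), carry_out

result, carry = adc(0x19, 0x05, 0, decimal=True)
print(hex(result), carry)   # 0x24 0 : BCD 19 + 05 = 24, no manual adjust needed
result, carry = adc(0x19, 0x05, 0, decimal=False)
print(hex(result), carry)   # 0x1e 0 : the same bytes added in plain binary
```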
My first computer was a UK101, a 6502-based kit; it cost about £325 with an 8K upgrade. This was around 1978-79, I think.
The UK101 was marketed by Caroline Radio D J Spangles Muldoon.
Wow! Really in-depth but super interesting!
It's easy to miss what happens with the 'WAIT6502,1' command at 16:38 because it's natural to be looking at the bottom line of the screen, but the top line of the screen changes from 'Commodore Basic' to 'Microsoft'.
Thanks Keith -- that's a good comment to add here. Usually we do something like WAIT6502,10 which prints it 10 times and is much more obvious.
This is too funny - almost like Bill Gates pieing someone in the face years before being on the receiving end. I never owned a Commodore PET myself, having started with a C64, but able to see this effect when configuring Vice for the PET 3008 under model settings. 😁
The sketch with Bender and Flexo is great xD
I wonder if Chuck Peddle got to see any of this work before he passed away in 2019.
This is a great presentation. I wrote an article on illegal opcodes in 1982, funny how similar the names of the illegal instructions are, although I called KILL "HLT" after the Z80 instruction 'halt and catch fire'. From memory most seemed to do nothing, which with an operand was great for obfuscating code for copy protection, but there was one weird instruction that was like ORA $absolute,Y with #$EE or something. I'd love to find out why that worked that way! Thank you for being so rigorous in *knowing* what happened and why.
Z80 has no 'halt and catch fire' instruction. You are probably thinking of 6800 or 6809.
Crazy to think that Janet Jackson's "Rhythm Nation" has been known to knock out hard drives. 😁
Fantastic presentation!
Fun fact :D as a Hungarian, I can actually read that stuff which Balázs wrote :D
Oh and "Utasításdekódoló" means instruction decoder
the physics based emulation.... niiiice
BCD's benefit is that manipulation never produces bad rounding results, unlike float. Good for accounting.
What rounding? These are all integer operations, not floating point.
@@JohnJTraston maybe some people don't store cents, but try to store € or $? And they don't know about OOP, which would hide the 100 from them. So they are lazy, like JS, and use floats.
String Currency
The first Atari was prototyped by Jay Minor with the Microcomputer Associates JOLT series
Miner. the car was a Minor :-)
My software engineering career began in 1982 with a C64 and a blackjack program written in BASIC. What I wouldn't give to see that blackjack program run on the emulator now!
Great talk, thanks!
Right at 25:40 everyone went crazy
That was a mass nerdgasm if I ever heard one. And rightfully so!
Because it's historical and nobody has been 100% sure how it works (hardware engineers and programmers would be interested in knowing). Plus, many people still play NES games on emulators or write emulators as a hobby, and knowing how it works can make for better emulators.
I guess even the producers themselves didn't know all of the chip's behaviour, otherwise they would have, for example, tried to mask the KIL instructions and not just left them out of the documentation ;)
Actually, if you were old (and young) enough back in the 60's and 70's, this stuff is all old hat; the people who are doing the archeology these days weren't born then ...
@@acmenipponair You bet your buttocks they did ... the reason for the KILs and all the undefined ops was testing purposes and optimisations ... they kept them separate to avoid various problems like EMF crosstalk on a trace ... or the activity of one transistor causing another, unconnected one to see enough leakage voltage to actually enable ... and add in the good ole boy resonance ... yup, that sucker was a PAIN in the butt to keep to a minimum in places where you needed resistors and capacitors BUT had long traces and transistors in the same areas ...
these things would and still do give false positives ...
There is a reason why Commodore didn't rename MOS Technology: that was the selling point of the CPU. Yes, the CPU manufacturer was owned by Commodore, and therefore Commodore got the chips at the lowest cost for the PETs and Cxxx computers. But on the other hand, as it was MOS and not Commodore written on the chips, Commodore was able to sell the 6502 to Apple, Nintendo and so on without them having to fear losing face. When somebody opened an Apple II, they didn't see a big "Commodore 6502" processor on the board -- which would have made them think "why don't I just buy a Commodore PET instead, when Apple only uses the same chips as Commodore?" -- they saw a MOS 6502 chip on the board.
They only rebranded the chips when this marketing problem wasn't there anymore: most of the old 6502 computers had died out, the new ones didn't have boards you could see (who would ever open an NES console?), and also, by the 90s it was well known from the PC world that computer companies work together, like Intel and IBM and later Apple and IBM.
that and to make the chips cheap enough for sale they needed to make BILLIONS of them... when you have 30 years of stock on hand any given day ... you tend to have lots of ready to go chips that cost a few pennies ... no point wasting money rebranding them when they will be INSIDE a machine already
How beautiful... the 6502 and the SINCLAIR Z80, 13-line programs with 1 KB of memory.
The 6507 is in the Atari 2600. The Atari 8-bit family of _computers_ (400, 800, XL and XE series) used the 65C02.
I was wondering if the 65C02 would get a mention. IIRC, it has some additional instructions over the 6502.
That's pedantic. The 6507 is the same physical die with interrupts and high address lines not brought out.
No Atari computer ever used the 65C02. You are thinking of the 6502C, essentially a standard 6502 but with two additional signal lines.
I honestly wish I had the slightest clue how the hell someone goes about learning this kind of stuff from the very start and where to go with each next step that is actually graspable to the average punter, because this is just totally and utterly alien to me. It boggles my mind anyone can ever grasp this stuff.
But, really frustratingly, I kinda get the impression from everything I've ever even just slightly touched upon on this topic [or kinda related] that I'd need to basically understand stuff at this level to ever really figure out how the living Christ to program for SNES myself (that's what I'm actually interested in). Because the community most certainly isn't making SNES development tools that just regular more casual game makers like me might want to use or indeed can use.
Real shame it's still all so complex and beyond the grasp of most people.
And, seriously, what the hell subject are people taking at high school or wherever that somehow provides them with the very basic knowledge required to even remotely understand where to start, and where to go next, to be able to one day actually do this kind of thing with any semblance of competence?
Every single time I try to start learning using whatever resource online, I reach a moment where I'm like, this cannot be the start, I've missed something, there's some step I need to understand that I should have known about before this point, because they're talking about stuff that I just don't have a clue about, yet they talk as if I'm already supposed to know it somehow, even though they themselves never taught me about it prior to mentioning it right now.
It's just so frustrating to think I'm missing some fundamental piece of a puzzle here, some essential knowledge or even just some way of thinking about things that others seem to be able to do, and I cannot for the life of me figure out what it is.
You could study physics. Only one person in my class went on with maths, because it seems more fundamental and clean. Don't go straight into applied physics for your bachelor's.
I have long hoped for an emulator that doesn't emulate just the logical behavior of the 6502, but the electrical inputs and outputs. Although demanding on performance, it would be perfectly doable. Imagine an interface that just exposes the input and output pins. I think it would only have to be written to expose the pins, but internally it could model the actual transistor grid as a matrix.
FPGAs and even the RasPi have general-purpose IO pins. I was just shocked to find videos on YouTube where the timing of a microcontroller was as bad as on a desktop PC, with long pauses and jitter. Why do vendors set up a microcontroller like this? If I hated real-time that much, I would stick to a PC.
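Something like the pin-level interface described above could look like this in C -- all the names here are made up for illustration, and the half-step function is stubbed where a transistor-netlist core (along the lines of perfect6502/visual6502) would actually go.

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical pin-level interface: only the package pins are exposed. */
struct pins6502 {
    uint16_t address;                        /* A0..A15 (CPU output)     */
    uint8_t  data;                           /* D0..D7  (bidirectional)  */
    bool     rw;                             /* true = CPU is reading    */
    bool     phi0, res_n, irq_n, nmi_n, rdy; /* driven by the "board"    */
};

/* In a real pin-accurate core this would advance the simulated
   transistor netlist by one half clock.  Stubbed here.               */
static void cpu6502_half_step(struct pins6502 *p) { (void)p; }

int main(void)
{
    static uint8_t ram[65536];
    struct pins6502 p = { .res_n = false, .irq_n = true, .nmi_n = true, .rdy = true };

    for (long cycle = 0; cycle < 16; cycle++) {
        p.res_n = cycle > 8;                /* release reset after a while */
        p.phi0  = !p.phi0;                  /* toggle the clock pin        */
        cpu6502_half_step(&p);
        if (p.rw) p.data = ram[p.address];  /* board answers a read...     */
        else      ram[p.address] = p.data;  /* ...or latches a write       */
        printf("cycle %ld addr %04x data %02x %s\n",
               cycle, p.address, p.data, p.rw ? "R" : "W");
    }
    return 0;
}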
I once had the Ohio Scientific 6502 Superboard home kit with BASIC/assembler, and used a TV converter and a tape cassette for non-volatile memory.
Great speaker
31:38 What do you do if you want something slower? Write it in JavaScript! I find that hilarious as I prefer to use C or C++!
In 1977 I made a choice to go the Z80 route via the TRS80 Mod I as the instruction set and registers were more scalable and you didn't have to do gymnastics to get things done. 40+ years later I can still remember Z80 opcodes by memory.
_lawn(void int x0FF)
All the ideas for RISC and pipelining came out of the design of the 6502... back then we didn't say "it's a bit like pipelining" we said "it HAS pipelining"!
Processors with overlapping of the fetch, operand, execution and store phases existed decades before the 6502, i.e. before the microprocessor.
Coming late to this, but that was fantastic!
The transistor count has gone up by several orders of magnitude (and the transistors have become similarly smaller), so I imagine it's just not feasible to read it directly in the same way now without some abstraction.
The Super Nintendo has 15 addressing modes for LDA, ADC, SBC and CMP. It also has a 24-bit address bus and can use a 16 MB ROM cartridge without using a mapper like in the NES.
The CPU uses its 16-bit ALU to avoid the additional cycle when you cross a page boundary while addressing. But all good NES coders laid out arrays and loops to stay within a page, so no speed-up there. Do 2D games need 16-bit registers?
24:57 maybe the origins of a cost-reduced Atari 2600
Decimal mode is so important in business and financial applications that IBM based its entire mainframe character set (EBCDIC) on being able to efficiently store the values "0".."9". Decimal representation matters because, when dealing with large numerical values, it's more accurate than holding numbers in floating-point form. The downside, of course, is that BCD arithmetic is slower to perform, but that's relatively unimportant when accuracy is your greater concern: to overcome that, the mainframe instruction set was equipped with opcodes and macros intended specifically to deal with arithmetic on decimal values. (The 6502 could only do adds and subtracts on decimal values.) Incidentally, the 8080 and its successors, including the Pentium, all had decimal (BCD) support, as did the Motorola 68xxx. BCD continues to be a supported data type in databases (e.g., MS SQL Server), as well as on Microsoft's .NET framework.
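To put a number on the "more accurate for large values" point, a quick C demo of where binary floating point gives up (nothing 6502-specific, just the double limitation the comment is alluding to):

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* A double has a 53-bit mantissa, so above 2^53 it cannot even
       represent every whole number -- never mind every cent.       */
    printf("%.1f\n", 9007199254740993.0);   /* 2^53 + 1: prints ...992.0 */

    /* The next representable double after 2^53 skips straight to +2. */
    printf("%.1f\n", nextafter(9007199254740992.0, INFINITY));  /* ...994.0 */
    return 0;
}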
Games also used BCD for large numbers like the score. Pac-Man definitely does.
I suppose, if the score goes over 32767 … :-)
I strongly suspect many modern implementations use plain 64-bit integers instead of 18-digit BCD. Decimal floating point was a thing in many calculators and in the TI-99 BASIC interpreter. The TI-99 used base-100 encoding, with each 2-digit pair encoded as an ordinary binary byte instead of a BCD byte, to avoid needing dedicated hardware features.
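If it helps to see the two encodings side by side, here's a small C illustration (not TI's actual float format, just the packing idea): BCD keeps one decimal digit per nibble, while base-100 keeps one 0..99 pair per ordinary binary byte.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    unsigned value = 1234;

    /* Packed BCD: one decimal digit per nibble -> bytes 0x12 0x34 */
    uint8_t bcd[2]  = { (uint8_t)(((value / 1000) << 4) | (value / 100 % 10)),
                        (uint8_t)(((value / 10 % 10) << 4) | (value % 10)) };

    /* Base-100: one 0..99 "digit" per plain binary byte -> bytes 12 34 */
    uint8_t b100[2] = { (uint8_t)(value / 100), (uint8_t)(value % 100) };

    printf("BCD:      %02X %02X\n", bcd[0], bcd[1]);   /* 12 34 */
    printf("base-100: %d %d\n",     b100[0], b100[1]); /* 12 34 */
    return 0;
}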
John, you're an even bigger nerd than me.🙂
This is fascinating.
For practical reasons I'd like to know what the 65c816 does... (trying to work out the SNES, basically).
But I guess that will take a while.
Obviously, it derives from the 65c02, which is implied in the name, and while that undoubtedly differs from the original (that's even documented in the programming guide), we also know of course that there are no illegal opcodes in 16 bit native mode, and that in emulation mode all illegal opcodes equate to 'no op'.
But how does it vary and build on the 65c02 exactly?
Does it effectively contain an entire 65c02 and then additional components? Or is it actually quite different internally and just pretends to be like a 6502 through the way the external logic is set up?
I guess having a 6502 emulator that is 100% accurate would still allow you to verify emulation mode of a 65c816...
Bearing in mind the known altered behaviour of course...
mmh. One day, I suppose. XD
My understanding of the 65816, which may be wrong, is that emulation mode performs the same operations for each opcode that are performed in native mode, just without the option of switching to 16-bit. The bits that make 16-bit possible are inaccessible and fixed to be clear and "replaced" with the B flag and a 1 (the B flag is just an artifact used when pushing P to the stack during interrupt generation to signify if BRK caused the interrupt).
We had them make a 6502 with everything 8-bit made 16-bit, only. However, they sped up the transistors about 10-fold, doubled the number of addressing modes (though on the same principle), and banked the 4 memories. Should have had 24 bits and 2 operands though.
...now i understand why RISC is so popular these days
slap_my_hand This is why ARM processors are so popular. It's amazing to me that all of this started with the transistor and the founding of Intel.
@@Adamchevy pre intel ... think Apollo .. and the Itsy Bitsy Machine company ...
Captions at 0:30 "I hand you over to my hairstyle". I LOLed a bit.
Lol turned on the captions just to see it
Now imagine trying to reverse engineer a CPU from the present day.
😱🤯
Super presentation. I really like the contents and the material. I'd like to know how the presentation material was built. It's so video-like. What software was used? And especially, what is this video(?) font?
Me too
I would love to see a fully simulated computer
Does the $00 IR injection on IRQ really cause BRK not to work on interrupt? I'd assume the PC doesn't get incremented on that fetch, so when the handler returns it would refetch the BRK instruction and execute it. It would seem a more plausible explanation is that the RTI doesn't reenable interrupts until after the next instruction following the return has been executed, so it always executes at least one instruction with interrupts disabled on return. With interrupts disabled, BRK becomes a no-op. Just guessing here, but P comes off the stack last, after the PC which points to the BRK in the instruction flow. Then the S (?) bus gets latched into P while the BRK instruction is latched into IR due to the T-state overlap. Again, just guessing.
The CMOS versions of the 6502 had that bug fixed, afaik.