It sure has been a while since my last video, and there are so many new faces here!
If you want to interact more with the community, I have a discord server where we talk about cool and random programming things, and you can see sneak peeks of future videos! discord.gg/9p82dTEdkN
I will also start streaming programming soon on my second channel here: th-cam.com/channels/QSOig4wEV_pAFPRg0qtSmQ.html
Hmmm, 7 minutes before the video was uploaded
@@offsetmonkey5382 sus
Discord server is very good everyone should instantly join
Very Nice
Why don't you team up with @Sam Zeloof to make a homemade computer? He made a 1000-transistor silicon-based chip.
I designed a 16-bit CPU (microprocessor) in 1977 that was used in a mainframe as the I/O processor. It had a unique instruction set (RISC). Lots of fun, and a lot of work.
16 bit in 1977? That's insane
@@helloworld145 Not for a mainframe. But the price tag sure was insane.
Cool
@@helloworld145 There were 64-bit computers in the early 1960s.
@@herrbonk3635 64-bit computers must have been crazy expensive back then lol.
The fact he did all this in under 4 months is impressive
I can do it in one
@@dimi5862 go on, show me, everything from scratch, from nothing. As of this comment the date is Saturday 23rd July 2022.
You can work during live streams to make sure you didn't cheat
i can do it in one
(millenium).
@@maindepth8830 lol but true
@@dimi5862 prove it
You have a level of knowledge about low-level computing that I wish I had in, well... anything. Incredible work, continue to hone your talents and I'm looking forward to that build when the day arrives!
Just read textbooks
These things are taught in engineering.
@@elakstein no they dont!
@@PflanzenChirurg ok. I will correct myself. I learnt these things in college. Obviously not as deep as he showed in this video.
@@elakstein true... but we somehow don't understand or learn it there, yet a person like this can teach it in just hours...
As a programmer, there's nothing that makes you feel more like a fraud than seeing someone create their own CPU, build an emulator for it, create a programming language and compiler, then write a game in it 😅
Well done and keep up the good work!
Thank you!
Also, I think the biggest problem with this project is time. I believe anyone can do it, easily, given there are so many resources online. But finding time to do the research and make the design can be difficult. I didn't know anything about computer hardware, but I did research (and watched a lot of Ben Eater), and then a month later I made this CPU. It is relatively basic, but it is a start.
Wow, are you just super smart or something @@AstroSamDev
@AstroSamDev Even still, it takes incredible dedication to complete. Even if there's enjoyment to it, what you've done is beyond impressive!
Do you use programming languages that were not made for browsers? (aka, not JS)
If you answer yes to that, you're not a fraud.
@@monad_tcp you wouldn't be able to write this comment without JS
I have been involved in computer engineering since the '90s, and seeing how easily accessible the knowledge is now is mind-blowing. I had to spend years on what he has done in a few months. This is truly the best time to be alive.
I can't imagine what it must be like for the people who were there from around the beginning of all this craze to where we are now. At some point I imagine you knew everything there was to know, but as time moved on, new products, programs and such became available, and after a while it just becomes too much to learn, and you just watch in awe at how dense and complex it's become.
@@handlehaggler the lady who helped invent VoIP is still alive… she was one of the hallmark individuals in the creation of the World Wide Web. It's so wild to me.
That's why we need to support the people who bet their lives on cracking paywalled research papers with the newest knowledge and publishing them for free. You can call it wrong, but people always have the right to seek knowledge.
>>creates own programming language
>> learns and creates a CPU with its own assembly and microcode
this man is becoming god.
Broo this is not 4chan
@@le9038 just assumed it was
@@le9038 ew... reddit
@@le9038 I think you'll find it's quite literally the opposite lmaoo
God like terry himself...
waiting for alternate temple os :)
I've been programming for 8 years. I have recently fallen out of love with it. You have reminded me why I loved it in the first place - the ingenuity of simple yet expansible design, the thrill of problem solving and the marriage of hardware and software. Thank you.
Hello, can u help me? I'm new to coding, can u link me to any good sources for learning Python?
@@memeology4138 learn C first. Python abstracts too much. Your first language should be C if you want to understand things. I would recommend "Let Us C" for learning C. We used it to learn C in college; then I learnt Java to understand object-oriented languages, then Python.
@@memeology4138 I'm new too but I'm doing C and JS
@@elakstein thx for ur help
@@abcdefghjijklfgu why C and JS? Learn one language first, master it. Plus, JS is usually nothing without HTML/CSS, unless you're learning React and stuff.
So I just barely passed my computer architecture course. We mostly did a 32-bit MIPS processor, and we also had to design one ourselves, but on 16 bits, and we didn't have to add multiplication or division. We only implemented about 15 instructions, and I can tell you what this man did right here is no easy work. You deserve way more subscribers for the amount of work you put in, man. Great work!
Simply amazing. The fact that you did all this for fun (mostly) speaks to your intelligence and understanding of how technology works. Yes, my brain didn't understand most of it, but seeing you construct each layer, part, and program was fascinating! I hope to see the real build of the CPU and other parts!
I understood about 0000000000000000% of it.
Wondering WHY he did it, When there’s stuff already available and cheap these days, I guess it boils down to “Because I can”
Watching this after my first year as a comp eng major makes me feel so incredibly proud of how far I've come in the past year
Amazing. Deserves far more views! Would be great to see you make this CPU for real like the jdh-8
I was planning on doing just that; it might just take some time though. I also want to do a design that doesn't use breadboards, and instead have a few custom PCBs printed.
@@avinadadmendez4019 Cool, I have never heard of that program, I'll look into it more. Thanks!
@@AstroSamDev I would buy it if it's not too expensive! I really like this project, and I think it would be really fun to develop my own programs that run on real hardware!
@@AstroSamDev @AstroSam You can also put your design on an FPGA; I don't think I have ever seen that done on youtube before for a homebrew CPU. For those who don't know: FPGAs are circuits onto which you can load any digital circuit. For example, you can put your circuit on an FPGA and map the input/output pins of the top module, and the FPGA now behaves as the circuit you created.
@@cemalettincembelentepe8943 FPGAs have been used for soft CPUs for a very long time now (well over a decade). It would be more interesting to see a hard CPU manufacturing approach; that would be unique for these homebrew CPUs, unlike using an FPGA. Granted, he'd be stuck getting them as samples on a really old process node (>=28nm), but that's more than good enough for this purpose.
It's been a while since I've taken a comp org class; this was the perfect bridge of software and hardware that reminded me why I loved it so much!
This guy is building his whole DIY computer and I struggle to write a three-line-long Python code to download pictures from a website ...
Yeah. I've wanted to build my own computer for decades, but my brain just goes, "what are these facts NOPE!" lol but it's getting better slowly.
incredible!! this is the most complex thing I've ever picked up on so quickly, You literally made this sound so straightforward I love it. Thank you so much!!!
Man, the sequel to templeOS is looking wild, I’m just waiting for the twist the writers have planned.
It's just TempleOS minus the schizophrenia
@@BigOrse so it's just boring then
@@test-zg4hv What so I should praise the man for being completely insane?
@@BigOrse I'll wait on that diagnosis..
Not coming here to defend myself, I don't care if what I said was stupid. To anyone saying I'm disrespectful though I suggest you maybe actually look into Terry Davis as a person, because he was a lot more disrespectful than I am.
Anyway RIP Terry. I may not agree with his ideology but can't deny the man was a pretty legendary programmer.
When I was 16 I had just started learning my first programming language. Meanwhile this guy is casually creating his own languages and also designing a computer from scratch while he's at it.
You're going places, man
You think you feel underskilled? I'm 22 and just learning Visual Studio Code and Django
@@angryvaultguy dang
I did something similar to this a good few years ago: I designed an 8-bit CPU in LogicSim, wrote an emulator in C#, and designed an assembly language and assembler. Unfortunately, I made a lot of mistakes in designing the architecture, things like fixed instruction lengths and no way to perform operations on values larger than 8 bits (technically it was possible, but with 256 bytes of memory and 8 of them reserved for registers it wasn't very useful anyway). Having recently written a Z80 simulator, and learning a lot about CPU architecture in the process, helped me identify all of the issues in my initial design. With that said, it wasn't bad for an A-level project that confused the hell out of everyone who tried to mark it.
256 addresses and some external logic should be enough to implement a microcode architecture.
A-level??? Dude that's unreal. I built an sql table viewer in VB...
Fixed instruction length isn't that bad, it's what RISC uses.
@@shinyhappyrem8728 considering the memory limitations, every instruction being 3 bytes long is an issue (command operand1 operand2). Some instructions didn't need operands, such as halt, but would still take 3 bytes in RAM.
@Ramtin Javanmardi If you understand how the architecture works, writing an emulator isn't really too hard. You could even write a simulator if you wanted, but then you need instruction timings and all sorts of hardware details simulated too, which can be a hassle.
Basically, you really need 2 core things for an emulator: an ISA, which you should have if you designed the microarch, and some way to interpret the ISA; really, this will probably be high-level logic in your chosen programming language.
I simply used a switch-case on binary opcodes that called methods with the operands (remember, because this is an emulator, memory read and write timings don't matter), but a better approach (and the one I used on my Z80 simulator) would be to use instruction classes and loop through an array of these classes (well, cast to interfaces), testing whether the opcode can execute a given function in the class.
How you handle IO is up to you. I didn't, but memory-mapped IO is pretty easy to implement and can be done by simply accessing the RAM object of your emulator.
My Z80 simulator is open source and was written for a university project, so it might help; it can be found here: github.com/Lewinator56/z80Sim (you will need .NET Core 3.1)
Writing an assembler is kind of different. In reality, at the easiest level you are simply going to have a defined assembly language and just convert it into opcodes, which is pretty easy to do. If you want to include functions and variables (by name) in the ASM then you need to process symbols too. I simply stuck with a conversion from ASM to binary.
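A minimal C++ sketch of the switch-on-opcode dispatch described above; the opcode values, field widths, and register file here are illustrative, not Lewis's or the video's actual ISA:

```cpp
#include <cstdint>
#include <vector>

// Fetch-decode-execute loop dispatching on the opcode with a switch.
// As noted above, memory read/write timings don't matter in an emulator.
struct Emulator {
    std::vector<uint16_t> ram = std::vector<uint16_t>(65536);
    uint16_t regs[3] = {0, 0, 0};   // A, B, C
    uint16_t pc = 0;
    bool halted = false;

    void step() {
        uint16_t instr   = ram[pc++];
        uint8_t  opcode  = instr >> 11;     // top 5 bits select the instruction
        uint16_t operand = instr & 0x07FF;  // remaining 11 bits
        switch (opcode) {
            case 0x00: halted = true;             break; // halt
            case 0x01: regs[0] = ram[operand];    break; // load A from memory
            case 0x02: regs[0] += regs[1];        break; // A = A + B
            case 0x03: pc = operand;              break; // unconditional jump
            default:   /* unimplemented opcode */ break;
        }
    }

    void run() { while (!halted) step(); }
};
```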
This is amazing. I've always wanted to try things like this but I'm just a busy CS student for now. I'll come back to this if I do get time to start.
I love that you're using the Game Boy Sims music for this video. Those games are incredibly underrated, but they made my childhood.
Amazing, and actually inspiring. I'm looking forward to making my own operating system in the future, and these videos bring me so much joy. Cheers!
is it linux based?
@@WTBuilder _"Interjection!!"_
- Stallman Ace Attorney
You could learn Verilog or VHDL to create the CPU on an FPGA, then use off-the-shelf ICs for RAM and ROM. FPGAs, if you don't know, are programmable hardware that rewires itself based on code. Intel, AMD, and other companies use multi-million-dollar chips for prototyping before sending designs off to print. Consumer models range from $50 to $300, and have been used for implementations of retro systems and modern 32- and 64-bit RISC-V based CPUs.
alternatively you can also be lazy like me and simply build your circuits in a logic simulator like "Digital" (it's like Logisim but faster and with extra features) and then just export them as Verilog/VHDL.
it works surprisingly well actually.
yes this
And at this point I would have thought that chips from Intel and AMD are pure evolution… no revolutionary steps. Steps are not agile. Maybe they include experimental modules in every chip, test them, and if it works, you get the next iteration (more expensive). An FPGA and a chip from Intel have nothing in common: Intel chips are optimized down to the photolithography layout and full analog simulation (SPICE) of the transistors. An FPGA does not give you this. The days of the Pentium FDIV bug, where a professor from a university used an ad hoc script to get his algorithm from university software into some proprietary CAD at Intel, are long gone.
@@ArneChristianRosenfeldt The advantage of using an FPGA instead of individual chips for this project is that others could make use of it too by simply reprogramming their FPGA. There is already a real CPU design called Magic-1 from Bill Buzbee. The problem is, others have to recreate it if they want a copy of it. Search for Magic-1 on YT; there are some videos about this impressive project.
@@ArneChristianRosenfeldt wouldn't an FPGA be a bit more like a drawing board at that level?
Congrats on everything man, I'm glad to see the progress you are making!
dude, legit, it's my first year in computer science at university, and the fact that I understood everything you said in this video makes me happy. You inspired me to try messing around with CPU design
You deserve way more subs dude, very underrated and hardworking/smart working and dedicated too
this is insane. Making a CPU, then your own compiler specifically to run programs on it, and then an emulator for it.
Cool stuff
Woah, I remember last year one of the subjects I had during the semester was computer architecture. We saw how the ALU works, the RAM, the control unit, instruction and microinstruction sets, and even peripheral handling. Our tests were about designing some use cases for each component, which was a demanding process. Seeing you put that knowledge to work reminds me how amazed I was while taking those classes. Nice video! And excellently explained too
I did that last year too! I don't know what software you used, but here we did it with diglog and it was a lot of fun ^^
@@mathismartin3092 Our exams were mainly done on paper, but we used Logisim for practicing! Of course we stuck to no more than 4 or 5 bits for the instruction set, mostly doing just read, write and math instructions, but it was really interesting to figure out how to do certain things :D
For smoother movement of the paddles, you could have created 2 more characters (either added to the charset, or overwriting some unused ones) with half-height paddles, so your paddle could move in half-tile increments.
if it supports color for text, 1 character is enough: you simply swap the background and foreground colors, which turns ▀ into ▄ (see the sketch below this thread).
or a way to write arbitrary data into unused characters and then display that with the faster character rendering
This would make the drawing routine more complex and thus slower. Better to design a blitter and let the blitter hardware do the rest.
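A sketch of the color-swap trick from this thread, assuming a character display with a per-cell foreground/background swap flag; the glyph codes and cell layout are made up for illustration:

```cpp
#include <cstdint>

// One half-block glyph plus a fg/bg swap gives four cell states:
// empty, top half (▀), bottom half (▀ with colors swapped = ▄), and full.
struct Cell { uint8_t glyph; bool swapColors; };

const uint8_t GLYPH_SPACE = 0x20;  // blank (illustrative codes)
const uint8_t GLYPH_FULL  = 0x7F;  // full block
const uint8_t GLYPH_HALF  = 0x7E;  // upper-half block

// Draw a paddle halfHeight half-tiles tall, starting at half-tile halfY,
// into one screen column of `rows` cells.
void drawPaddle(Cell column[], int rows, int halfY, int halfHeight) {
    for (int row = 0; row < rows; ++row) {
        int top = row * 2, bottom = top + 1;  // the two half-tiles in this cell
        bool topOn = top    >= halfY && top    < halfY + halfHeight;
        bool botOn = bottom >= halfY && bottom < halfY + halfHeight;
        if (topOn && botOn) column[row] = {GLYPH_FULL,  false};
        else if (topOn)     column[row] = {GLYPH_HALF,  false};  // ▀
        else if (botOn)     column[row] = {GLYPH_HALF,  true};   // ▄ via swap
        else                column[row] = {GLYPH_SPACE, false};
    }
}
```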
watched this video once before college and didn't understand much. watched it again after my first semester as an ECE major and understood a lot more. thanks!
I really feel like it's understated how incredible this video really is. Amazing work, man!
Your videos are extremely well edited and your explanations are in depth and easy to understand. Can’t wait to see you hit 100k soon!
This is probably the most impressive thing I've ever seen on TH-cam, very very good job! Thank you so much for sharing this 😊
Damn bro
I can't believe how smart u actually are, I never thought someone like u would exist, really really awesome
Damn
bro, I've just come from my computer organization and architecture final exam, and I wish they had taught us by making a similar CPU model in the course. Well done and keep going, you're amazing!
HELL YEAH.
I unknowingly taught myself how to build computers using redstone in minecraft.
At this point I've now built dozens of redstone machines, including an entire game system in minecraft. It has a 7x7 pixel screen, a 7-bit CPU, more than 80 4-bit registers, and a ROM size of currently only 5.456KB or 496 bytes. However, the ROM can be expanded indefinitely.
This is both beautiful and confusing; it's truly art. I wish I could truly understand everything here, but this flies way over my head.
Man relives computer history. Punch cards, assembly, etc..
The quality of your videos is like that of a youtuber with a million views per video. I guess I found your channel at the right time, as I am getting more into computers, but not even close to where you're at.
As a CS major and software engineer, I designed my own 8-bit microcontroller. I never got past the logic diagram because I didn't need to: I just used Logisim and loaded my own JAR files for custom functionality; most of it was embedded diagrams made in the sim. The whole goal was to create a compiler for machine code and load the bytes in with a file in Logisim. It was cool writing my own assembly language and executing programs I wrote. I can't remember the program I used to simplify my logic diagrams though.
Love the video and the content! The only issue I had was getting caught off guard by the change in music, or the selection; it startled me a few times because it would change out of nowhere while I wasn't fully focused on it, and I thought something was making noise in a different tab or process lol.
What's next? Own OS on this CPU programmed with z#? Lol
Great vid btw worth the wait
Probably, that text editor is just begging to be transformed into an interactive command line.
@@AstroSamDev AstroOS coming :000
@@AstroSamDev rewrite the Linux kernel in Z# and make z#/Linux
/s of course
Lol aside, that's actually not impossible: see Collapse OS for how to bootstrap from almost nothing to working OS.
Let’s get this strait:
- Man makes own cpu
- Man then makes own language
- Man then makes own compiler
- Man then makes his own emulator
WTF
Don't forget he also wrote a port of Pong to run on said computer...
Bro this is like jdh shit right here
*straight
And he's ONLY 16. When I was 16, I was barely writing hello world in C++.
@@PiyushBhakatBits per age. This man will only grow stronger in time and consume more bits
Using a 16-bit address bus would get you 64 kibibytes. Although, since you’re storing a 16-bit word at each address rather than a byte, you have 64 kibiwords - or 128 kibibytes - of memory to work with.
Le edit: 1 kibibyte = 1024 bytes. 1 kilobyte = 1000 bytes. One is a binary prefix, the other is a metric prefix.
how many habibibytes is that
@@axmoylotl 2 hamoodibytes
I'm never going to use that childish sounding garbage. 1024 bytes is a kilobyte and SI has no business even being included in the discussion, I don't even care that they're using the same letters for prefixing, it's still going to be powers of two. Hard drive manufacturers have caused and are continuing to cause harm to the industry by being such cheats and now we've got douche-nozzles who want to redefine all of our terms instead of taking the manufacturers to task. It's almost as bad as changing BC/AD to BCE/CE because they don't like the implications of how the names originated. Stop redefining terms.
Nobody cares about kibi, kilobyte is 1024 in my heart.
@@tissuepaper9962 same here
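For anyone checking the arithmetic in the top comment above, a quick sketch (binary prefixes as that commenter uses them):

```cpp
#include <cstdint>

// 16-bit address bus -> 2^16 addressable locations,
// each holding one 16-bit (2-byte) word.
constexpr uint32_t addresses  = 1u << 16;              // 65,536
constexpr uint32_t wordBytes  = 2;
constexpr uint32_t totalBytes = addresses * wordBytes; // 131,072

static_assert(addresses  ==  64 * 1024, "64 kibiwords");
static_assert(totalBytes == 128 * 1024, "128 kibibytes");
```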
I just discovered your channel and instantly loved it, and since you're so skilled with programming, here's an idea: why don't you start working on a project where you design your own version of NFS Most Wanted 2005, with flagship cars designed by you and a bit of a story change? Do you think it's interesting enough for a YT video? I'm pretty sure after these 20 years nostalgia for NFS MW hits me hard. There are a bunch of mods for this game, but doing something as original as you do on your YT channel would be of immense value to us developers and game lovers. Thank you for all your hard work, amazing channel!!
A nice fun little(?) project. I would make some changes to the instruction set. Most instructions had the register after the mnemonic but you have AIN, BIN, CIN. They should be INA, INB, and INC. However, you have 5 bits for the op code and 11 spare bits. You can use the spare bits to indicate source and/or destination register as that can be coded in only two bits each. For jump instructions the spare bits can indicate which flag(s) to check. That would free up 7 instructions for other possible op-codes and make IN, LDI, SWP, and the math operations able to work with any (pair of) registers.
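A sketch of the encoding this comment proposes; the exact field positions are one possible layout, not the video's actual format:

```cpp
#include <cstdint>

// [15..11] 5-bit opcode | [10..9] dest register | [8..7] source register | [6..0] spare
constexpr uint16_t encode(uint8_t opcode, uint8_t dst, uint8_t src) {
    return uint16_t(((opcode & 0x1F) << 11) | ((dst & 0x03) << 9) | ((src & 0x03) << 7));
}
constexpr uint8_t opcodeOf(uint16_t i) { return i >> 11; }
constexpr uint8_t dstOf(uint16_t i)    { return (i >> 9) & 0x03; }
constexpr uint8_t srcOf(uint16_t i)    { return (i >> 7) & 0x03; }

// A single IN opcode would then cover AIN/BIN/CIN: IN with dst=0 is the old AIN,
// dst=1 the old BIN, and so on; likewise for LDI, SWP, and the math operations.
```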
Now grab an FPGA devkit and turn it into real hardware! (In case you haven't done that before, designing circuitry with Verilog / VHDL is actually quite an interesting puzzle).
Agree. My university had us build a single cycle 8-bit MIPS processor on a Xilinx FPGA. A very simple processor, yet probably the best project we've done so far.
i am too stupid to understand it
Finally something I can relate to
At least you’re not in denial. Less competition for us folks!
@@0x_nietoh "folks"
Womp womp
Noooo. You just haven't gone deep enough to understand. Not understanding YET is totally different from being stupid.
:)
Very cool project. This was the stuff I loved doing at uni. Sadly FPGAs are still crazy expensive; the CPUs we designed there were all in HDLs, which let you run them on hardware. Makes me wanna dig out that code again.
Yo, this helped so much, and I always appreciate the content. Ever since I found the channel and got that energy from you in the previous video, you've been nothing but real, and I can vouch for the amazing content and how down to earth you are with everything! All the most love, respect, and appreciation
I salute you! I always wanted to do this but never had enough free time. You have motivated me to try again. Keep on inspiring people!
THIS WAS EXPLAINED SO WELL WHAT THE HELL? SERIOUSLY KUDOS TO YOU MAN
very cool, can't wait to see you building it IRL ;)
2025: I designed the world’s most powerful computer
Thanks for the motivation. I wasn't sure if I could do it, but I might try it eventually.
Did you try?
I watched this video twice before today and never really understood anything.
I just played nandgame the other day, and now I actually understand most of the stuff. I realized how much more impressive this was.
15:20 it's been 2 years. Can you make a video of the physical version now? 😀
You may want to expand the architecture to add interrupts, although it does complicate things somewhat, given that you'll probably want a stack to handle them, and you currently don't have anything stack-ish like jump-to-subroutine or returns.
Was going to mention this. It would make timing quite a bit simpler: no more arbitrary wait-N-cycles loops, just execute game logic once on every vertical sync interrupt. :)
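A sketch of what the emulator side of such a vsync interrupt could look like, assuming a fixed handler vector and a stack pointer register, neither of which exists in the current design:

```cpp
#include <cstdint>
#include <vector>

constexpr uint16_t VSYNC_VECTOR = 0xFFF0;  // invented handler address

struct Cpu {
    std::vector<uint16_t> ram = std::vector<uint16_t>(65536);
    uint16_t pc = 0, sp = 0xFEFF;
    bool vsyncPending = false, interruptsEnabled = true;

    // Called between instructions: on vertical sync, push the return address
    // and jump to the handler, which runs the game logic once per frame.
    void serviceInterrupts() {
        if (vsyncPending && interruptsEnabled) {
            ram[sp--] = pc;             // a return-from-interrupt would pop this
            pc = VSYNC_VECTOR;
            interruptsEnabled = false;  // masked until the handler re-enables
            vsyncPending = false;
        }
    }
};
```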
but can it run doom?
Probably not
I've always wanted to know how computers work at lower levels. Thanks a lot, this was very insightful, and was very entertaining to watch. You got a new subscriber now.
I love how intuitive your operand table is.
Congrats on that achievement. You actually designed your own computer. Wow!
This is seriously impressive, and I'm blown away by the amount of time and effort that went into this...
...but can it run DOOM?
"I built my own earth"
I built the entire observable universe from scratch with real breathing life forms in it.
👍
I had a similar project during my computer science degree. This was a good refresher on the different individual components of a CPU. Makes me want to fire up Logisim again! Great video.
What the hell, man, this is nuts…
Great work, I cannot imagine how tough it was to get to this point, damn…
Does your emulator skip the logic layer and just execute instructions from the list?
You did an amazing job with your custom CPU.
By the way I noticed the original Sims OST playing in the background. I love that OST.
But did God give you a dream telling you to make it as his 4th temple?
One small thing about how you explained your emulator. You conveniently forgot to mention that part of why logisim is so much slower is that it actually simulates things far more in depth than your emulator. A very minor thing, but the way you glossed over it gives the impression that logisim is slow because... It's slow. Rather than because it's doing way more work than your emulator.
and Logisim's speed does depend on whether one is using the default 'chips' inside it: it has math operators, register modules, and whatnot, but if you eschew those and go down to the individual gates in sub-circuits…
He does say that logisim is simulating every logic gate, and that's why it's slow. Seemed sufficient to me.
Yeah I was a bit confused at that. Why would they differ in speed so much when they’re both emulators?
@@Costelad Imagine instead of executing an add instruction directly, you simulate the propagation of electricity through each part of the circuit implementing an add operation.
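To make the contrast concrete: a direct emulator does roughly the first function below, while a gate-level simulator does something like the second for every operation, and on top of that tracks signal propagation and re-evaluates any gate whose inputs changed:

```cpp
#include <cstdint>

// What an emulator does: one native add instruction.
uint16_t fastAdd(uint16_t a, uint16_t b) { return a + b; }

// Roughly what a gate-level simulator computes: every gate of a ripple-carry
// adder, shown here for 4 bits (a real 16-bit ALU is far more work).
uint8_t gateLevelAdd(uint8_t a, uint8_t b) {
    uint8_t sum = 0, carry = 0;
    for (int i = 0; i < 4; ++i) {
        uint8_t ai = (a >> i) & 1, bi = (b >> i) & 1;
        uint8_t s  = ai ^ bi ^ carry;                  // two XOR gates
        carry      = (ai & bi) | (carry & (ai ^ bi));  // two AND, one OR
        sum       |= s << i;
    }
    return sum;
}
```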
2 years ago I watched this video trying to understand computers (I didn't). Now I'm watching it again, knowing and understanding everything you did here.
The first video of yours I'm watching, and I love it! I've always been interested in computers and specifically how they work, and I've also wanted to get into circuit building, so this is RIGHT up my alley. Your editing is fucking amazing too! Keep it up dude!
Hey, so I have an idea! I think that Z# would fit perfectly as an embedded language for the computer, its own named language, kind of like Batch is for Windows. I'm very impressed by this! Hope to see more videos from you in the future
Awesome project. I love stuff like this; I even started my own project. Though I've already been working on and off on it for a few years, using a 6502 instead of just logic chips, and on breadboards. Fun stuff, and great work getting so deep with this stuff!
Now make it in real hardware! 😏 …I mean, you used Logisim, so IC counterparts should be relatively easy to come by. VGA might pose some problems, but the VGA signal is quite simple in the end…
0:17 I'll complain. Green-field project owner neglects to update a critical value after making changes.
Been watching your vids for a good few weeks now, learning new shit each day. My workflow has improved so much since watching.
My friend, you are truly a legend. Congrats!
Next step, TempleOS!
Bet by 2030 he'll be competition for Intel, AMD and NVIDIA, after he figures out how to chemically make transistors
You want Sam Zeloof for that :P th-cam.com/video/s1MCi7FliVY/w-d-xo.html
Why did you disappear from TH-cam?
Thanks brother, your content helps me a lot! Keep up the great work! This is exactly what yt is made for!!
My goodness, your content is awesome!
I think that a video explaining how you achieved such detailed knowledge of low-level computing would also be very interesting :)
Can it run doom?
16 bit computers CAN run Doom
Me still unable to understand recursion in JavaScript
Now you need to build this with actual hardware and add long-term memory to it, like an SD card slot or something that could be used to store data even when the system is turned off.
Holy shit, my man, you're a fucking genius.
Everything from using C++ to make the compiler, to the emulator to increase processing speed. This is fucking gorgeous!! I would love to pick your brain on ideas and concepts.
I just found your channel today and you're crazy, dude! I'm just getting into computers, but making a language AND creating a whole new CPU is amazing, man! If you were to ask a professor how to do this, he would look at you like you were insane! Keep up the great work man!!
You are literally the best, I've been looking for a tutorial for three days and yours works
Impressive work. In my computer architecture course we are creating a similar CPU, fun project
Impressed!!! Great content! Well presented, I loved it!
Great tutorial, links and program worked fine for me. Thanks for sharing.
I love love love when people mention how they had xxx subs, then I see the video a year later and they have 10x more. This was so cool, time to binge!
I love the Halo 3: ODST soundtrack in the background. Good work, man.
I took microprocessors last semester; you did well explaining everything as simply as possible.
I was looking into N64 turbo-nerd assembly stuff and I realized something (maybe I'm unaware of some intermediary steps though, I'm just a terminally online dude with way too much time): the CPU being basically a glorified calculator, the bits fed into it might "just" open/close logic gates to tell the CPU what to do with the following bits of the instruction.
For instance: if the first 5 bits are linked to, say, the ADD instruction, they'll open the "gates" for the following bits to be fed through the ADD transistor array section.
I believe that's why in every assembly language I've ever read about, the instruction comes first, then the appropriate values, usually at least one register address and a fixed value or another register to read the value from. IDK if I'm clear; very interesting project overall! Thank you for the video!
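That intuition maps fairly neatly onto code. In the miniature decoder below, every functional unit computes its result unconditionally and the opcode acts as the mux that "gates" which one passes through; the field widths and opcode numbers are illustrative:

```cpp
#include <cstdint>

uint16_t execute(uint16_t instr, uint16_t a, uint16_t b) {
    uint16_t opcode = instr >> 11;  // first 5 bits: which "gate" is open
    uint16_t sum  = a + b;          // ADD unit
    uint16_t diff = a - b;          // SUB unit
    uint16_t conj = a & b;          // AND unit
    switch (opcode) {               // the mux selecting the result
        case 0:  return sum;
        case 1:  return diff;
        case 2:  return conj;
        default: return a;          // unrecognized opcode: pass A through
    }
}
```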
the Halo ODST soundtrack at 4:10, such a good song
The fact that technology has become so advanced that practically anyone can make a computer from scratch on their own is fascinating. It makes me remember the first computers ever made, which took 4 rooms of space and a group of people to operate, while now a single person can do the same thing in a much smaller space.
It's not the advancement, it's the affordability of it.
thank you for sharing, waiting to see a board soon! :)
Thanks for this! Very inspirational and please keep up the good work!
nice! I'm also planning to build my own computer from scratch…
It's hard for me to even barely understand the video, and it amazes me that you managed to learn all this so quickly.
Fun project! I remember having to do this (but 8 bit) in digital electronics class in undergrad for EE.