Real Programmers Write Machine Code
- Published Dec 21, 2024
- Recorded live on twitch, GET IN
Article
www.catb.org/ja...
By: Ed Nather
My Stream
/ theprimeagen
Best Way To Support Me
Become a backend engineer. It's my favorite site
boot.dev/?prom...
This is also the best way to support me: support yourself by becoming a better backend engineer.
MY MAIN YT CHANNEL: Has well edited engineering videos
/ theprimeagen
Discord
/ discord
Have something for me to read or react to?: / theprimeagenreact
Kinesis Advantage 360: bit.ly/Prime-K...
Get production ready SQLite with Turso: turso.tech/dee...
Real programmers don't need machines to run their code, they simply execute it in their minds. Anything short of that is a skill issue
fr
Works on my mind
Yeah boyyy, neurons are firing those 0's and 1's
Real programmers don't need their minds to run their code
They just use the universe's processing power instead
@@muhamadrifqi1911 then we’ll just ship your mind ❤️
Well, in 1981 I was paid to program for the first time, in raw, unadorned, inscrutable hexadecimal numbers. The old Marconi Research labs had just received some Motorola 6809 microprocessors and wanted me and another junior engineer to evaluate them. There was no support, no circuit board, no assembler, just the chips. First we had to design and build an SBC to get the chips running. Then we set about building a debug monitor program, in hexadecimal, programmed into EPROMs. When done, our debug monitor could load and run code from C60 cassette tapes or paper tapes. We had typical run, halt, breakpoint and memory inspection commands. All displayed on a VT100 "glass teletype".
A little before my time, since I came into this world in 1980, yet there is a lot of admiration, respect, and a bit of envy for the generations before who engineered and pioneered their way through things. The ability not just to choose the appropriate tool for the job, but to make their own tools to accomplish what needs to be done from the basic things around them, is testament enough to their pure brilliance!
@@skilz8098 I can't claim to have pioneered anything. Although some of the guys I was lucky enough to work with probably did. The second half of the 1970s was an amazing time for us young nerds. For the first time one could get hold of a computer in a chip. Not much in the way of actual "personal computers" and certainly not home computers yet, so everyone was buzzing with excitement and building all kinds of things with the chips they could get hold of.
That was the year my university computer lab (connected to the mainframe; various departments had minicomputers) replaced most of the punch card machines with CRT-based terminals. The editor allowed for editing a single statement line at a time.
Way, way before my time, but something that sounds essentially impossible in a recent timeframe. I have to assume you had access to data sheets and specs to run the proprietary chips, but at my previous job we had some proprietary chips where the only info we had was a really crappy C compiler that the contract specified we weren't allowed to decompile, modify, nor reverse engineer to make a better one. There's a bucket of admiration for y'all older folks just casually building all these things to be able to work on them, but you also had the liberty to do so. So it's slightly less impressive when you have datasheets, despite how foreign they tend to sound to most younger programmers these days.
@@TurtleKwitty You are right. We had data sheets aplenty. Intel, for example, initially had a hard time convincing engineers of the usefulness of things like the 8080 processor. So they were more than keen to make all necessary information available. Big data books on the hardware and instruction set. Example circuit diagrams. The works. I recall Motorola was much the same with its 6800, 6809, 68000 and other chips.
I would not say "casually building" all these things. When you have designed a board like that and put the hours into soldering or wire-wrapping it up, then fault finding it, then you can come back and talk about how "casual" the experience was. Don't forget, as well as not having any HLL compilers, or assemblers, or machines to run them on, we also had no electronic circuit design software and no PCB layout software; it all had to be done by hand. And little in the way of fault finding other than old analog oscilloscopes if you were lucky. I actually did fault finding on one board by attaching a pair of headphones to different signals and listening to them.
It's depressing how secretive much of the tech we use nowadays has become.
We're all standing on the shoulders of giants.
Amen! Even Solomon the Wise stated: everything goes according to its circuits, and there is nothing new under the sun.
The ancients were much smarter, wiser and more sophisticated than most of the modern status quo, who write and enforce what's in the textbooks, give them credit for. They would have us believe the ancients were dumb cave dwellers who barely understood fire. No! They are lying to everyone. Mankind has been this intelligent since the beginning. There are things that the ancients knew that we could never begin to imagine or understand, yet we today have understood other things that they could never have dreamed of. Does this make us any superior or inferior to what they were? No.
Sure, there will be a point in time (human history) that a great civilization will fall or collapse and from that a lot will be lost (information, knowledge, wisdoms, technologies, etc.) and there will be a quiet or slow period throughout time before another great civilization rises up out of it. Each peak of various epochs or eras throughout the history of humanity brings something different to the table. Not new, because nothing is new, it is all of old. Just something different.
We have three distinct capabilities: one is to observe, to understand, to know; another is to modify, change, and shape things, to create; and the third is simply the ability to make a choice, one way or another.
Human consciousness extends beyond time, space and matter. It is our existence and actions that give them meaning. It takes a mind to imagine a universe, a cosmos, life. We have a purpose even if we don't fully understand it. Each and every one of us brings something different. Individually, none of us can fathom omniscience or omnipotence, at least not within our current state of existence, for we are only temporary or temporal in the flesh, otherwise stated as mortal. Yet combined as a chorus amongst the waters, the seas of souls, we as humanity are a part of that infinite consciousness. We are just one facet of its many faces. One of the greatest teachings from one of the greatest teachers of all time simply said, "I am".
We can see some of these transcendental truths when we start to bridge the gap between many topics or ideas such as Fractal Geometry, Quantum Mechanics, Information Theory, even Simulation Theory. Why? How? It all comes from our own Imaginations! It all comes from the Mind! Yet the Mind isn't alone because our emotions that which Drive Us are rooted from within our Hearts!
We are all giants depending on one's perspective while we are also simultaneously mere grasshoppers or ants from another. Our existence, our reality, our time, is smack right dead in the middle of it all simply due to the fact that we are here.
@LGRvideosnew This only works if you're minimum 90 years old
Wrong
I'm sitting in the chair of gamer
Every STEM field is standing on the shoulders of giants. This is the right take. Truth is, a small percentage of folks truly excel and bless the rest of us with their intelligence.
unless you are Mel, in which case you are the giant, standing on the ground, bare feet
Happy to have contributed an article. Nice to watch your reaction to it. 🎉
In the 80s I was in awe at the magic some young hackers could execute on some Spectrum, BBC, IBM or, later, the Amiga. Their magic was surreal. I think I lost my balls there and then. There's no way I was going to be as good at hacking the hardware to the maximum and roll out such an experience that would fit on a measly 3.5" disk. What was that? 800kb?
720 KB
When I first ran across this on Usenet it was incredibly inspiring. I still remember trying to explain it to people.
I remember finding the demoscene online. It was bonkers how much they could cram in just 64kb, 4kb, or even 32 bits (!!!). Meanwhile a ton of games were already leaning towards 1GB+ of size.
Thank you!!!!!! I love these!!!!!!!
Amiga could do 880Kb if I remember correctly.
Real programmers flip individual bits with their fingers.
I did that! On a PDP-11 to bootstrap it.
@@CallousCoder Wait till you know Real computers were women💀 all dials were touched 🥲
Punch cards. Make a mistake? Get yourself a new card!
@@briankarcher8338 I've never worked with punch cards. We did have an old PDP-11 running that did data acquisition, and that was still loaded with paper tape. And I had to do that once after a power outage with my colleague on holiday. I loved loading the bootstrap code on the front panel, in octal (took me 3 attempts to get it right), and then seeing that paper tape whizz through, the data streaming in, and log messages printing on tractor paper. I wish we could go back to that time, when you could understand both the hardware and software of the whole machine.
People used to do that, believe it or not. Computers in the old days would require inputting bits by flipping switches.
I worked for Prime, his mustache is an npm package
Holy shit, I wasn't first
@@luithedude3300 My comments are written in Rust. Rust is fast :D
@@Blingoose my comment is in REACT 😭😭😭
The moustache is a mandatory dependency.
Have you tried forking it ?
Rewrite in Rust ❎️
Rewrite in Assembly ✅️
Rewrite Assembly... in Assembly.
@@potato9832
You mean remake an assembler in assembly?
Just write in C? Doesn't it compile to assembly? (don't judge me, I'm not a programmer yet)
Even if you rewrite Assembly in machine code, it is still an interpreted language interpreted by the microcode on your CPU before execution
Write an Assembly compiler in vacuum light bulbs.
Real programmers don't write code. They actually transfer electrons from one state to another and then transfer the voltages, do actual switching inside a transistor to do the logical operations.
Ah; a transcendent mind.
That's what She said! 🙌
And boom goes the dynamite. It doesn't get any lower level than that. Seriously though, I had to wipe an EEPROM for a 1990s car last week by flipping the EEPROM bits in a UV oven. I don't want to code like that ever again. I'm looking for a modern package to replace that dinosaur chip.
@@complexity5545 That's interesting, using UV to flip bits to wipe data. Oh, that's why in movies they use microwaves to destroy hard disks
thank you comrade she
Drums are akin to HDDs. Instead of stacks of flat, spinning platters and read heads, the format is a cylindrical drum with a read head. Think vinyl records vs wax cylinders. Mel was basically calculating the time it took to complete his instructions and optimizing the physical location of his most-used instructions so there would be minimal wasted time seeking them on his drum. The insanity, by today's standards: optimizing your program for your unique storage medium. The kind of full-stack understanding of the entire computing process from the ground up something like that requires is just mind-boggling. That Mel understood the CPU, memory, registers, and indexes available well enough that he could plan out and optimize on that level is simply bananas. Imagine if Mel had translated that expertise into a compiler and enabled all the programmers there to keep up with him.
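If you want to play with the placement idea, here's a minimal C sketch of it. Everything here is invented for illustration (the sector count, the per-instruction costs, the zero seek overhead); it is not the RPC-4000's real geometry, just the shape of the optimization Mel was doing by hand:

```c
#include <stdio.h>

/* Hypothetical drum geometry -- illustrative, not the RPC-4000's. */
#define SECTORS_PER_TRACK 64

/* The drum keeps spinning while an instruction executes. The ideal
 * home for the NEXT instruction is the sector that arrives under the
 * read head exactly as execution finishes -- zero rotational wait. */
static int best_next_sector(int current_sector, int exec_sector_times)
{
    return (current_sector + exec_sector_times) % SECTORS_PER_TRACK;
}

int main(void)
{
    int pc = 0;
    int cost[] = {3, 1, 7, 2};   /* made-up per-instruction costs,
                                    measured in sector-times */
    for (int i = 0; i < 4; i++) {
        int next = best_next_sector(pc, cost[i]);
        printf("instr at sector %2d -> place next at sector %2d\n",
               pc, next);
        pc = next;
    }
    return 0;
}
```

Lay the code out anywhere else and every instruction pays up to a full revolution of latency; that is the whole game.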
21:00 "[...] state of tenseness just short of rupture." Probably should be tension, not tenseness, but the point is that they code and sharpen things and bend them in such obtuse ways, then pin them in place such that if you simply look at the code, it breaks.
That's the thing, though. You can't translate that to a compiler. Maybe... Maybe today, you could train up some kind of AI process to do it, or to iterate upon optimal guesses to select the best optimization path... But you are looking at something which requires a backwards thinking - to understand what the end code of a compiler is going to look like to then go back and optimize the compilation to produce even more optimal arrangements of what was compiled.
Computers today are a bit simpler, and the amount of information they need to work with is far larger - this is why we use compilers, today, and is honestly the biggest hurdle I have when programming. I learned electronics from the ground up and expected programming to effectively be machine language. I thought I was missing something - but that is because programming, itself, is not derived from electronics - it is derived from mathematics and organizational theories.
A perfect example - I took to a puzzle game, ExaPunks, like a fish to water and was writing code in moments. A guy who did all kinds of higher-level language programming was horribly confused and thought that expecting someone to be able to write a for/while loop in assembly was too high a mark.
The for/while loop, and optimizations of it (such as using the register checked for true/false comparisons to store and decrement a loop count), are obvious once you start tinkering with the limited tool set.
That, to me, is programming.
By contrast, while I can handle higher level languages, I find it much more difficult to follow what the computer is doing. None of it is even remotely intuitive and is so abstract that it is mostly gibberish. I obviously understand an array and transformations and the like - the data constructs and math operations I get - but higher level languages like python start pulling out various concepts from way outside the visible text of the program.
I can develop an optimal set of instructions to create an array in memory and iterate over it for various operations using a computer architecture. However, doing something creative like varying word size to overlap, effectively, two data sets in the same range of memory and use that to do voodoo... Which is basically what this guy in the article is doing... You can't make a compiler do that. Not reliably, at least.
Compilers are great for most things, particularly today where there's just a lot being done and even if you aren't crazy optimized, there's not much difference.
However, I do think the issue of being able to understand the executing hardware to code and optimize for it separates the proverbial men from the boys in the world of software development.
It's not just performance - things like security and assurance of execution are a thing. Initializing memory and/or freeing it after being allocated are basic things which apparently need a whole new language to get people to use. That is being a bit reductive, but people who don't understand how code is being compiled and/or how their machine is executing the code end up being surprised when there are unexpected behaviors from it.
I don't see a person's programming knowledge as complete without that understanding.
Today, such tricks (as in, modifying the IP through hackery) would fuck with the branch predictor, pipeline, and caching so hard, you'd have to be a 200 IQ genius to make it worth it.
Melvin Kaye passed away in 2018.
He even optimised his death to occur before covid
Kinda sad...
They froze Ted Williams' brain, the brain of a baseball player, but not Mel's brain?
@@ReedoTV this comment is way funnier than it should be
RIP, you absolute legend.
Growing up in the 80s, being around 10yo, we used to type those machine-code programs from magazines into our C64 and ZX Spectrum. Those were the times :)
Yep, same here, been doing that since I could read lol. I HATED the PITA tape drive more than just typing a few hundred lines of code when I wanted to play something simple.
Same here. Good times.
@@rustymustard7798 lol, not sure I hated the tape player :) What I really hated was when the machine code did not work... and the next magazine had a "there was a typing error at line 34, 0a should be 0b" :) And then when you got the Zilog Z80 book, you started to understand what all this machine code did too. So it was a nice way to learn.
I couldn’t agree more! I am not that old at 51, and I even bootstrapped a PDP-11 by entering the instructions in octal with the switches on the front. And at least once a month I hack in 6502/Z80/68000 still - and you see that on my channel. For me it’s nostalgia and real programming.
And yes, even in 1990 in college we wrote Z80 in hexadecimal on the MPF-1.
Real Programmers write code they still talk about in youtube videos 40 years later 👍❤
Real programmers flipped switches, dialed knobs, pulled levers, and pressed buttons on a machine. That's where the term "programmer" came from.
Damn, I thought they were just pro grammar
(I'll see myself out)
Actually programmers were called programmers because they were professional at grammar
Ahh, …Like a CNC machinist.
The real first programmers were all women, as they were trusted to handle the flipping of cards. It was men who designed some of the process, but women played a large part in that too.
real programmers don't use keyboard, they use punched-cards.
Real programmers use a magnetized needle and a steady hand.
Punchcards are designed to hold a single line of FORTRAN.
real programmers use cosmic rays
@@imperishableneet Going back to the days of the Loom?
Real programmers don't use physical materials. They use their consciousness.
The folks who make the pretty sand do math were just built different.
These trips down memory lane are fun
Which part of the motherboard is the memory lane?
Real programmers make their own CPU from rocks
Real programmers make their own programming language. Their own text editor. Their own OS. So basically Terry A. Davis
Must learn Chemistry then...
@@krux02 I was waiting to find this reference!
I.e. an abacus.
Beads = polished rocks
@@krux02 What OS? A real programmer doesn't need an OS to do anything; their code is truly independent
It's a fun story, but a couple important points. Mel was obviously a smart guy but wasn't really doing anything weird or wizardly. He read the LGP-30 manual (and so can you; it's linked in the article!), memorized its 16 very simple instructions, and programmed it with normal, run-of-the-mill self-modifying code, which is how you do loops and have variables on that machine.
A lot of the other details about the machine the author gets totally wrong, like there being a goto built into every instruction. No, it doesn't work like that, it executes instructions in order unless you explicitly jump, just like a modern PC.
On the other hand, optimizing execution time by picking address locations is certainly worthwhile, and surely a hassle, but the manual literally explains the drum timing and how to do this. However, you absolutely cannot do the kind of precise timing the article implies, because the drum rotation was not synchronous to program execution.
Anyway, the article is fun, and both the author and Mel are great guys, but nothing described there really required anything other than reading the manual, and the article is super apocryphal in many respects.
TLDR: if you didn't RTFM it's your own skill issue. 👍🏻
Actually the drum rotation was perfectly synchronous to the program execution. Synchronous to a bit-time! Also, the registers were on the drum too. They were replicated many times for low latency. There was a Goto built into every RPC 4000 instruction for speed.
As someone who spent a sizeable part of their career writing machine code (well, assembly language, actually) on Motorola 68k processors (because the C compilers were a total joke) I must concur. I fully endorse this episode 😊
(Old grey-beard coder here!)
machine code is just a pain
it's basically just like assembly but worse: when you make one small mistake you rewrite the entire program! Also hard to read, with no mnemonic code.
you would go insane trying to code it yourself without the help of other people.
hexadecimal is just binary in disguise
absolutely mad respect to all those programmers. thank you for making assemblers, thank you for making C, thank you for making python, and thank you for making javascript even though I don't quite like web development. thank you for making open-source libraries
The drum memory was the precursor to hard disks but used a revolving drum coated with magnetic material instead of platters. It is somewhat analogous to how we first used cylinders (the phonograph) to record music before we moved to flat disks (the gramophone). With more modern hard disks you also have a time delay between each rotation, which is the major reason SSDs are so much faster: you don't have to wait for something to move under the read head, you can instead access any location instantaneously, like with RAM. On these really old drum memories the delay would have been much more noticeable, because the drum rotated at most at maybe a few hundred rpm while modern mechanical drives spin at between 7200-10000 rpm.
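The rotational delay is easy to put numbers on: the average wait is half a revolution. A tiny C sketch of the arithmetic (the rpm figures are just illustrative):

```c
#include <stdio.h>

/* Average rotational latency is half a revolution:
 *   latency_ms = 0.5 * 60000 / rpm */
static double avg_latency_ms(double rpm)
{
    return 0.5 * 60000.0 / rpm;
}

int main(void)
{
    printf("slow drum   (300 rpm): %6.2f ms\n", avg_latency_ms(300.0));
    printf("fast drum  (3600 rpm): %6.2f ms\n", avg_latency_ms(3600.0));
    printf("modern HDD (7200 rpm): %6.2f ms\n", avg_latency_ms(7200.0));
    return 0;
}
```

At 300 rpm that's 100 ms per miss; at 7200 rpm it's about 4.2 ms, which is why instruction placement mattered so much more on drums.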
Bwah! All child's play. Chuck Norris just stares at the machine until it surrenders to his needs....
If you are interested in how these machines worked, try Usagi Electric’s channel where he’s restoring a 1950’s Bendix G-15 drum memory computer. It’s not the same as Mel’s hardware (even older?) but you can get the flavor of it.
Old programmers were built different
YouTube recommended me the channel today, and it's great.
I've literally written machine code. I started with Batch files in the mid 1990s, then moved on to Assembly Language for DOS. And I remember at least one program where I used machine code in one spot instead of the Assembly Language instruction, because I wanted more control over the way the instruction was written. In another situation I wrote a macro for an assembly program that produced machine code in a specific, controlled way.
It is good to see this tale is still in circulation.
Your ending rings so true lol. I've been programming for like 5 years on and off, never professionally, and I just can't stop jumping to different things since they're so fascinating. I prob went through over a dozen programming languages and different programming fields, from building my own circuits and programming on a microcontroller to web dev with React and Django. There just isn't enough time to explore all these amazing technologies and really get your hands dirty with them.
C, and then compile it to anything. That's why most embedded guys use C. But you're correct, it takes like 8 years to build up your own library of scripts to do it. The toughest is when you start messing around with Yocto or Verilog stuff. Every time I deal with those, I have to recall and relearn certain parts because the toolchain has been updated and changed. Having scripts with notes makes the process shorter.
Real programmers are the ones that invent the languages that the rest of us use. Those guys deserve respect. They didn't just say 'here is a problem, let me write a solution' they said here is a problem, let me build the tools to make writing the solution easier for everyone.
Programming languages are just tools, like any other.
We all make tools that solve problems. We automate. That's programming.
Nope, all are programmers, just with different fields of knowledge and skill levels
Real programmers don't use unreal engine.
"Is this a reference". You bet. After the "quiche" booklet any number of parodies came out. I've also seen one that argued taht real programmers write Lisp.
Real programmers knit their programs, just like Charles Babbage meant.
My first collegiate comp sci class was them explaining that computer programming is like a loom and thread. Before computer science, I had never heard of a loom!!!
Wasn't the Apollo guidance computer's ROM knit?
@@Kenionatus Literally :D
@ThePrimegean Thank you for these. I am so HAPPY
Olde Fortran was a brand of malt liquor.
@ 17:00 BRO HARDCODED A NULL POINTER DEREF AND IT _WORKED!?!_ I'VE BEEN SCAMMED! *SCAMMED,* I TELL YOU! I deref a null pointer all the damn time, and all MY fancy OS does is give me a segfault!
To be fair, there was no branch prediction, no pipelining and no caching in the CPU, so timing when an instruction would run was a bit easier.
Love the really old reads prime does.
Back in 1994, I was writing drive train and transmission software for Ford trucks in straight assembly. (Model years 1995 - 1996 DI turbo vehicles)
This isn't some 70s thing.
My best friend's dad was the kind of guy who could write code in hexadecimal. It was insane to watch. He later became an Intel consultant for machine code before he passed away.
I remember reading this over a decade ago. What a blast from the past.
Real programmers make their own electronics 😎
I do this :) Creating a device using a PIC that runs on very low mA powered by a lithium ion battery. Every 10 minutes it pings a GPS position. I do a lot of wild hiking and camping so something like this is useful for me. I'm now working on using scrap pieces of metal to create galvanic batteries that can power low powered electronics anywhere.
Real programmers use butterflies.
- XKCD
@@JayJay-ki4mi Nice!
they make their own analogue computers and run their software instantly using 0 cycles to get the result.
We construct our own circuits
This is a certified Usenet classic
Can you imagine what we could scrounge up if we had the time and temperament to research Usenet fully?
A drum is basically using an HDD for your registers. Adding a goto to every instruction is useful because you can't just read the next instruction: by the time your first one has finished, the drum has spun on to some "unpredictable" other part of the drum.
As a mathematician, I absolutely could not stop myself from weaponised overflows when I was learning C.
Then the compiler kicked the shit out of me
Programming with a mathematician's mind is always weird. Some hard problems seem rather easy, and for the other hard problems I'm amazed there's already a library.
@@andrewferguson6901 All programmers are mathematicians. Just a different kind of mathematician.
Real programmers build their own transistors from scratch
@10:50 you were close. The drum is constantly spinning, so by the time the instruction finishes executing it has moved further, and the sector under the read head is not the next sector; it might be 2 or 3 positions further on. In essence: the computer made you do interleaving manually.
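A minimal C sketch of that manual interleaving, with an invented sector count and skip factor (the skip must be coprime with the sector count so every sector gets used):

```c
#include <stdio.h>

/* Place logically consecutive instructions every `skip` physical
 * sectors, so the next one arrives under the head just as the
 * previous one finishes executing. */
static void interleave(int sectors, int skip, int *physical)
{
    int pos = 0;
    for (int logical = 0; logical < sectors; logical++) {
        physical[logical] = pos;
        pos = (pos + skip) % sectors;
    }
}

int main(void)
{
    int physical[8];
    interleave(8, 3, physical);   /* 8 sectors, 3:1-style interleave */
    for (int i = 0; i < 8; i++)
        printf("logical %d -> physical sector %d\n", i, physical[i]);
    return 0;
}
```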
"Keep in mind, I have not compiled this application merely proved it correct." Donald Knuth
I hope we all realize that every time a programmer pulls something like this we get one more review, PM, coverage, clean, whatever rule to stop us. This is why every manager has trust issues.
I learnt the Z80 entering bytes directly into memory from a hex keypad.
I did the same for the Signetics 2650, the 6502 cpu, and the 68000.
Real programmers use all GOTO statements, ride on velociraptors, never use git, and work alone.
25:00 Not only that, changing tracks was very fast. So you could interlace the code being executed by tight loops by setting the next instruction location to the next adjacent track if your current track is out of room.
However, it sounded like Mel wasn't using addressing at all; he would just jump right into the code at whatever location the drum head was over when the program got to the next jump instruction. Like controlled arbitrary code execution. Crazy.
Way way way back in 1983 or so I took a Machine Language course in college. We learned to dissect the bits/bytes and hexadecimal representations, a bit of an insight into engineering concepts and the logical steps taken along the paths of a board. I'll admit to thinning out a bit on that level. More interesting to me was doing the sorts ourselves, like a bubble sort, and understanding how pointers worked in a search or filter. We coded in Assembler and I think some FORTRAN (can't remember). This course came in handy later for understanding how items in arrays are handled internally and for debugging by looking at 'dumps' of memory in hex. It also paid off in understanding what relational database tools like DB2 were based on. Underneath the layers were pointers again. I think it helped me in the early days to understand splitting addressed blocks up efficiently to optimize getting to the memory areas (CA/CI splits or something like that) of the queried data row. We had to be proactive about that back in the day. There were like 5 of us in that class but we enjoyed the hell out of it.
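For anyone who never got to do that exercise: a small C version of the bubble sort, walking the array with pointers instead of indexes. My own toy, not the course's code:

```c
#include <stdio.h>

/* Bubble sort over [begin, end), using only pointer arithmetic. */
static void bubble_sort(int *begin, int *end)
{
    for (int *stop = end - 1; stop > begin; stop--)
        for (int *p = begin; p < stop; p++)
            if (p[0] > p[1]) {      /* swap adjacent elements */
                int tmp = p[0];
                p[0] = p[1];
                p[1] = tmp;
            }
}

int main(void)
{
    int a[] = {5, 2, 8, 1, 9};
    bubble_sort(a, a + 5);
    for (int i = 0; i < 5; i++)
        printf("%d ", a[i]);        /* prints: 1 2 5 8 9 */
    printf("\n");
    return 0;
}
```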
13:00 Programming prowess is a Riemann sphere. And on it, Mel and Tom find themselves side by side at infinity as their diverging genius goes to infinity.
My father used to be a programmer at Bell Labs. He worked on the software for the ESS switches which ran much of the POTS network in the early days. The early-generation ESS switches were all written in assembler. At some point, they tried to switch to C. The problem with this was the requirement that when a person picked up their phone, they were to receive dial tone in about 200ms or less. This was a heavy lift, considering that from the moment a customer picked up a phone, you had to detect the "off-hook" condition, then determine if the line was valid, what services that customer had, enable those services, whether the billing state was up to date, etc. My dad used to talk about all the tricks needed to shave off CPU cycles wherever they could. When the latest ESS switch was being coded, in C, the best response times they were getting were something like 4000ms. So yes, I used to hear it from my father all the time about not being a "real programmer" lol.
5:37 While it's tedious to write out all those gotos, it can be ridiculously helpful in the long run to avoid having to re-allocate memory, and it's just more flexible in general.
Usagi electric channel has a bunch of videos on drum memory, swapped the drive out of an ancient "mini" computer. Great channel for vintage hardware.
Next video by tsoding: I'm using raylib by writing in hex
Holy shit, I'm first, but I'm not gonna watch it now 'cause I was going to take a nap. I still love you Primeagen, you are amazing ❤
I remember reading this for the first time in the late 90s. It's a classic.
I've heard about that Intcode computer in AoC. I ought to work backwards to it, but it sounds interesting.
For over a decade, maybe two, I've been adamant that what I'm doing is either art or craftsmanship. There can be inherent beauty in the code just as there can be inherent beauty in a clockwork wristwatch. There's a reason many "fancy" watches have see-through into the machinery, because it's beautiful.
Back on my TRS-80 I wrote in assembly language because the processor was only 1.77Mhz. BASIC was really slow for games.
Mel is/was such a gigachad Brogrammer
This article is amazing. To know that programmers have been the same for decades is simply incredible.
Usagi electric on youtube is refurbishing a half-ton vacuum tube computer (Bendix G-15) with a drum, paper tape, and typewriter/quasi-plotter. Basically, the drum is both the memory and the storage.
can't believe you've never read this gem :D
obligatory reminder that Roller Coaster Tycoon was written almost entirely in x86 assembly. I don't know how he did it, but he did it. Then he mic dropped and peaced out of the industry. Absolute gigachad
Physics in your file read
"if it was hard to write, then it will be hard to understand." That is 100% true because its usually difficult when you know the hardware is screwed up or the user input is impossible to statically code. You're forced to pigeon hole it with a programming language that doesn't have the correct features.. That was a very wise statement. You end up doing craziness just to battle the flaws (like writing code to get around Intel rusty microcodes in their latest chips).
"If a program can't write it's own code, what good is it?" is actually a reference to self modifying code. An example of this is the walking speed in space invaders on some systems*. The instruction to move the invaders is an immediate add instruction looped over the invaders location. When the invaders need to move faster, the code changes this immediate value stored inside the instruction. This means one less variable to track but it also means that you don't have to go to memory to pick up the variable. But what about the old reverso of direction. That is simply achieved by changing the instruction to subtract. Which often in two-complement might mean negging the immediate value. Which is really just doing overflow addition to do subtraction. I think some computers lacked any sub instruction at all. So if you wanted to do a sub you had to manually NOT the value and add one to the result then add that to you the number you want to subtract from. Others didn't have full hardware subtraction so they just did it for you.
* I've never actually seen if this is true, but it's what my HS programming instructor told us as a basic example.
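You can mimic the trick in plain C by keeping the immediate in a patchable instruction record. This is a toy model of the idea, not the actual Space Invaders code:

```c
#include <stdio.h>

/* One "instruction": add an immediate to the invaders' X position.
 * Speeding up patches the immediate in place; reversing direction
 * negates it -- no separate speed or direction variables to load. */
struct instr { int immediate; };

static struct instr move = { .immediate = 1 };  /* start: 1 px right */

static void step(int *x)   { *x += move.immediate; }
static void speed_up(void) { move.immediate *= 2; }
static void reverse(void)  { move.immediate = -move.immediate; }

int main(void)
{
    int x = 100;
    step(&x);        /* x = 101 */
    speed_up();
    step(&x);        /* x = 103 */
    reverse();       /* "subtract" is just a negated add */
    step(&x);        /* x = 101 */
    printf("x = %d\n", x);
    return 0;
}
```

On the real hardware the patched word is the instruction itself, which is exactly what makes it self-modifying code.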
This is next level. He had every opcode memorized, that's very impressive.
Discovering The Story of Mel is kind of a rite of passage. When you're through with it, I recommend reading (or watching here) "The Art of Doing Science and Engineering" by Richard W. Hamming. It has several chapters/presentations on the history of computers and programming.
Yeah, the real programmer just executes their code via telekinesis, makes reality just bend, like Wanda Maximoff
Hey, you should look at a computer language called RPG. Code statement lines contained three sets of numbers, the combination of which represented a command to move data, compute, or whatever else was needed. I worked for a company called American Home Products, the Jiffy Pop popcorn sellers. There was a guy there who wrote the entire payroll in RPG. I worked a little with the language and asked him one day, 'How the hell do you do an entire payroll in this! Amazing!'
First computer I programmed was in Algol 60 on paper tape. The machine was 39 bits, and made from discrete germanium transistors.
You also needed to enter a handful of machine code instructions just to get the compiler loaded from front panel switches. The instruction set was designed around these initial instructions, relying on the intrinsic self-modifying code of the initial instructions, and the instruction set itself, to automatically jump into the compiler's carefully crafted paper tape bootloader.
A year or so later, microprocessors hit the scene and nobody except large corporations could afford the cross-assemblers or the machines to run the cross assembler on. So we did everything in machine language, pages and pages of hex with no mnemonics, and all branches were hand-calculated. It was completely unmaintainable, of course.
It took about a decade before reasonably priced assemblers and cross assemblers became available, but by then things had dramatically moved on, with microcomputers becoming retail items rather than the domain of homebrew nerd chads.
one of my favorite hacks: instead of checking a boolean and then jnz, the instruction is directly edited in the loop. (though editing bits in floats for approximations is quite interesting too - see the sketch below.)
I still think not allowing programs to self-modify is annoying; if I had enough motivation, I'd definitely make an optimized programming language that makes this its main feature.
though on the other hand programs that have 0 side effects are also quite nice, because they are so static.
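On the float-bit-editing point: a minimal C sketch that halves a normal positive float by decrementing its IEEE-754 exponent field directly. A toy only (it breaks on zero, subnormals, infinities and NaNs), with memcpy used for well-defined type punning:

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Halve a normal, positive, finite float by subtracting one from its
 * exponent bits (bit 23 upward). Not a replacement for x * 0.5f. */
static float half_by_bits(float x)
{
    uint32_t bits;
    memcpy(&bits, &x, sizeof bits);   /* defined type punning */
    bits -= UINT32_C(1) << 23;        /* exponent field -= 1  */
    memcpy(&x, &bits, sizeof x);
    return x;
}

int main(void)
{
    printf("%f\n", half_by_bits(6.0f));   /* prints 3.000000 */
    return 0;
}
```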
Way too dank. These og's were unhinged man
Does anyone know what they meant by "drum"? Drum memory was phased out by the 60s-70s, so I'm assuming it's a holdover term, but I don't know what it's referring to in the 80s
Drum memory was a magnetic data storage device invented by Gustav Tauschek in 1932 in Austria
Pfft, programmers these days, REAL programmers wrote code on cuneiform clay tablets with a willow stylus.
“ud reaaa”
Your products suck, Ea-Nasir
oh okay. if you want to find more on drums, look into Usagi Electric and his Bendix restoration project
Holy shit, if you get a hold of the Real Blackjack Program I'm definitely watching that video; surely it contains Black Magic, Heavy Wizardry and Deep Magic. I found the Jargon File 10 years ago, and really enjoy reading it, and all its obsolete jargon.
Yes. If you are a very skilled programmer, you're going to write a faster, smaller function in assembly than one compiled from a high-level language. When people need code that's super fast and super small, the compiler gets fired and assembly is used. Assembly is also what people use when they need really low-level code.
indeed, hand-rolling assembler when you need some perf. i did assembler way back in the 80s, and back then you counted cycles because the machines were gutless. now you tune for keeping things in cache, and which cache level; for that you really need to know the architecture you are using. the matrix optimisation stream you had some months back really highlights that.
Not having pointers sucks, because that means you can't address a specific part of memory, at all.
They had to have some way of saying "memory that is stored here". Maybe just relative addressing? Relative addressing is how a lot of static pointers get used in machine code, actually - functions, static memory, etc.
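A quick C sketch of what relative addressing means in practice. The numbers are borrowed from 6502-style branches (a signed 8-bit offset taken relative to the next instruction), purely for illustration:

```c
#include <stdio.h>
#include <stdint.h>

/* A relative branch stores "how far from here", not "where".
 * The code can then be loaded at any address and still work. */
static uint16_t branch_target(uint16_t pc, int8_t offset)
{
    /* 6502-style: the offset is relative to the address following
     * the 2-byte branch instruction. */
    return (uint16_t)(pc + 2 + offset);
}

int main(void)
{
    printf("0x%04X\n", branch_target(0x1000, -4));  /* 0x0FFE */
    return 0;
}
```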
I actually wrote a tiny program in machine code to blink an LED with an Atmel chip. I.e., I really read the datasheet, found the needed instructions, then translated them into bits on a piece of paper, and flashed it onto the chip via its SPI-based in-field programming feature, using an Arduino and an ad-hoc program on its digital pins. (I.e., not its actual SPI pins, no. I wrote my own ad-hoc program that would write clock and data bits to digital pins.) All 20 or so of those instructions. It is fun. And if you want to learn electronics and how processors work, I do recommend an exercise like that. The dumbed-down BS that a typical Arduino tutorial suggests is boring and probably useless for learning _actual electronics._ But it is important to understand that the compiler really does translate what you define with your language into the same real stuff, with very good optimizations. I.e., if you describe your problem in a language well, in a way that is clear, the compiler will map it onto the hardware, and it will use every hardware capability that is there. The compiler is a tool to help you handle all the hardware features, not to hide them from you. If you will, it is like an AI "copilot" thing :D
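For flavor, a minimal sketch of the bit-banging shape described above. The pin helpers here are hypothetical stand-ins that just print, so the sketch runs anywhere; on real hardware they would poke GPIO registers (or call digitalWrite on an Arduino):

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical pin helpers -- print instead of driving real pins. */
static void set_data(int level)  { printf("D=%d ", level); }
static void set_clock(int level) { if (level) printf("^ "); }

/* Shift one byte out MSB-first, data valid on the rising clock
 * edge -- the same shape as the ad-hoc programming rig above. */
static void shift_out_byte(uint8_t b)
{
    for (int i = 7; i >= 0; i--) {
        set_data((b >> i) & 1);   /* present the bit        */
        set_clock(1);             /* target samples it here */
        set_clock(0);
    }
    printf("\n");
}

int main(void)
{
    shift_out_byte(0xA5);   /* D=1 ^ D=0 ^ D=1 ^ ... */
    return 0;
}
```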
Fascinating story! Mind boggling beyond comprehension.
I was so sure there'd be some overflow somewhere! I bet Mel wouldn't have liked pure, effect-free functional languages!
Drum storage is literal drums; the outside of the drum is what's coated. Think an oil drum with the outer surface magnetic, for reading and writing.
In 1984 my high school teacher taught us how to do this before we learned assembly. If you don't know about two's complement you will never get it.
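For anyone who hasn't met it: a tiny C example of two's complement negation (invert the bits, add one) and why an adder-only machine can still subtract:

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Two's complement negation: invert all bits, then add one. */
    uint8_t x   = 5;                    /* 0000 0101       */
    uint8_t neg = (uint8_t)(~x + 1);    /* 1111 1011 = 251 */

    /* Subtraction with nothing but an adder:
     *   a - b == a + (~b + 1), modulo 2^8 here.
     *   9 + 251 = 260, which wraps to 4. */
    uint8_t a = 9;
    printf("9 - 5 = %u\n", (unsigned)(uint8_t)(a + neg));
    return 0;
}
```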
I see how he optimized the drum (I had a high school programming teacher who had personally used a drum memory machine in olden times, btw). One could write a simulator/interpreter for a drum machine and run a programming competition based on who makes it faster ;-)
I really got the idea to write a simplified simulator 🙂
Hmm, am thinking Mel wasn’t a Clean Code advocate
Probably TMI, but a good background on this kinda stuff is to study 2 books IMO. The first bit of TAOCP by Knuth where he explains the weird hypothetical machine and language he'll use through the series. And Bell and Newell, Computer Structures: Readings and Examples, esp the long Introduction. It's a sort of "prototype textbook" where they were trying to both create a Computer Science curriculum ex nihilo but also syncretize and synthesize all the weird machines of the 60s and early 70s into a coherent picture. The book then goes on to contain an anthology of original papers and accounts of dozens of crazy machines, interlaced with the authors' own mini-chapters explaining the machine in question in terms of registers and instruction set. When you see all these other machines then Knuth's design choices suddenly make sense.
Ah, back in the good ol' days of REAL COMPUTERS. When your computer bits and bytes had to *warm up* cause they were made from thermionic vacuum tubes instead of microscopic cold transistors.
Back in the day, engineers weren't called as such; they were called tech priests
Lol another thing came to mind that had a colleague wonder WTF I had done.
There was this joystick loop in a C64 game, and instead of doing an "if-then per input" I would just update the operand of the compare to check for each input. A lot faster than branching.
It does give others a nice puzzle though, to see you reading a value from memory and changing it, in effect changing an opcode or operand. But we needed those tricks for speed and memory back then.
Real programmers build their code into the machine's hardware on the macroscopic scale of mechanical relays 😤
Is the overflow used as a ring buffer?
Well, not having index registers meant self modifying code was really useful. You could increment addresses so the next time you wanted a word you’d get it from the same bit of code. I bet there were many such tricks.
It kinda sounds like Primeagen is assuming the machine code is being written directly as the first draft. I would expect any sensible machine code programmer to use pseudo-assembly first to plan things out. Otherwise how do you keep track of the bigger picture?