Hi, thanks for watching. I incorrectly said the landing was on 29 July; it was on the 20th. I just read it wrong from my script! Also, this video is unfortunately attracting a few moon landing deniers. I'll try to remove the crazier comments, but please ignore them (unless you really want to get into an online argument!)
Although it's understandable enough to keep saying 1969... the truth is that the Inertial Guidance System (a high-tech gyroscope), the computer, and its software were the subject of the first contract issued for the Apollo mission, so their concepts and design are rooted in the key technologies of nearly a decade prior. The project also partly succeeded thanks to foresight about what might be available when the critical implementation commitments came due, e.g. the use of the first integrated circuits. The software was developed over a long time, and only just made the deadlines set for each critical step along the way.
Hopefully a decompiled version will be available, like C. Historically we wrote in assembly, until we realized we could write higher-level code like C to generate assembly. Then we added another layer, like Python and Java, to go a step further: Java compiles to Java bytecode, which the Java virtual machine then turns into machine code. There's no real reason to write in assembly directly, as most modern languages compile down to it, so in a sense we all still write in assembly, just with custom functions, you could say.
Thank you for this video. Truly amazing software accomplishment. Amazing that NASA had this one-of-a-kind non-programmer computer-to-human interface. Also, you are beautiful and well spoken. Thank you again. Best Regards.
fwiw: Comanche is the European name for a very famous tribe of Native Americans inhabiting the Great Plains region of what is now the United States. Pronounced kuh-man-chee.
Yes, it was called magnetic core memory. It was really an amazing technology of its day because it was non-volatile, so it served double duty as RAM as well as flash.
@@TheWallReports Magnetic core memory was a step up from using cathode ray tubes to write, store and read data. Thank heavens I never had to use cathode ray tubes for storage, and I have a very hard time imagining what it was like to use mercury delay lines for memory.
I think you guys are getting two different technologies mixed up here. Magnetic-core memory was the rewritable technology used for RAM, and while it was indeed hand-assembled, it was done so in a manner that was the same for every cell. Core-rope memory, on the other hand, is the non-volatile/ROM technology that was hand-woven by the so-called "little old ladies" at MIT. The former used the cores to store one bit in the ferrite material itself. The latter used the cores to read which wires passed through and which went around (with up to 64 wires per core in the case of Apollo). Two slightly similar, yet completely different, technologies.
I just smirk when people say older people don't understand tech. As a 14-year-old, I learned about microcomputers on an Intel 8085 development kit. You would write a program on paper using the 8085 assembly instructions, then look up the machine code for those instructions in the manual and enter them using a keypad. Eight years later at university I was using my personal 386 PC to run Mathcad etc. It is amazing how rapidly things are developing. Apparently one cellphone would cost more than the total world GDP of the '60s, were it possible to construct it using valve tech from that era. Great clip about this famous computer. A testament to pushing the limits of the tech you have available and to the character of the people faced with solving the problems.
My lecturer taught me the concept of memory using a transistor-based flip-flop to store a single bit. Fast forward to the '90s, when I learned to program on a Zilog Z80, using a keypad to produce something on a seven-segment display. Despite how fast technology is evolving, it's comforting to know that Arduino systems and breadboards are still widely available nowadays and youngsters are eager to learn from them.
@@thesoundsmith The KIM-1 was super fun to use. I recently learned there were a bunch of expansions for it. I'd only used the base board with its hex keypad and LED display. Maybe a homemade cassette interface (from BYTE, iirc, but it might have been for the SYM).
Yip! Same for me in 6502, back in 1979: Pure assembly language driving hand-wire-wrapped custom hardware for a microtonal music synthesizer. I entered it into Science Fair, although I probably would have done the project either way.
Once in a while I stumble across something like this that really makes up for all the other mediocre junk taking up space on YouTube, and makes me say to myself, "I'm really glad I watched that!" Dee, good job, this is pure gold👌 The content was a very fascinating little slice of this awesome historic endeavor! It evokes more appreciation for all those super intelligent people it took to make the program happen.
There is a 1-to-1 correspondence between the assembly language and machine code. The programming logic is the same for both. Assembly provides human-friendly symbols that the assembler translates 1-to-1 into machine code instructions. Assemblers also provide human-friendly labels for memory addresses like jump/goto targets and data locations. Advanced assemblers also provide "macros" that substitute sequences of assembly instructions with a single command, similar to how macros in office software like word processors and spreadsheets work. Once macro code is substituted and memory address symbols are resolved, again, it's 1-to-1 translation to machine code. Early microcomputers like the Altair and minicomputers like the PDP-11 had front panels with displays a little similar to the AGC DSKY. You could enter the instructions in binary and read results in binary from display lights. The DSKY was more user-friendly (no binary) in this regard as it provided command symbols on the keypad and decimal values for the display and keypad.
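To make that 1-to-1 mapping concrete, here is a tiny illustrative C sketch (not from the video or the AGC): a table-driven translation of a few real 6502 mnemonics into their opcodes. Label resolution and macro expansion are left out, and the mnemonic spellings like "LDA#" are just shorthand for this example, not real assembler syntax.

```c
#include <stdio.h>
#include <string.h>

/* One table entry per instruction form: the 1-to-1 mnemonic -> opcode map. */
struct op { const char *mnemonic; unsigned char opcode; };

static const struct op table[] = {
    { "LDA#", 0xA9 },   /* 6502: load accumulator, immediate operand */
    { "STA",  0x8D },   /* 6502: store accumulator, absolute address */
    { "NOP",  0xEA },   /* 6502: no operation */
};

/* Look up a mnemonic and return its opcode; -1 means "unknown mnemonic". */
static int assemble(const char *mnemonic, unsigned char *out)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++) {
        if (strcmp(table[i].mnemonic, mnemonic) == 0) {
            *out = table[i].opcode;
            return 0;
        }
    }
    return -1;
}

int main(void)
{
    const char *program[] = { "LDA#", "STA", "NOP" };
    unsigned char opcode;

    for (size_t i = 0; i < 3; i++)
        if (assemble(program[i], &opcode) == 0)
            printf("%-5s -> 0x%02X\n", program[i], opcode);
    return 0;
}
```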
One other thing to keep in mind about assembly language is that it's not a single language. Each processor architecture's assembly language is unique; e.g., the AGC's assembly looks completely different from 6502 assembly, which looks completely different from i386 assembly, which looks completely different from ARM assembly, which... you get the idea. This was because assembly instructions map 1:1 to machine instructions, which of course are completely different for different architectures.
@@markrosenthal9108 It's not necessarily 1:1. Assemblers support macros that can turn one assembly statement into 2 or more machine code instructions. MIPS CPUs don't have an instruction for loading a 32-bit constant, but there is a "pseudo instruction" li, which is turned into lui + ori. The main difference is that you can use all the instructions available, while high-level languages only use a subset and won't let you directly use things like CPU flags. An issue that I have faced is C not letting you detect carry or math overflows without wasting time on unnecessary calculations.
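For anyone curious what that carry/overflow workaround looks like in practice, here is a small illustrative C sketch (my own example, not from the video): a portable pre-check for unsigned overflow, which costs the extra compare the comment complains about, next to the GCC/Clang-specific __builtin_add_overflow, which typically compiles down to a single test of the CPU's carry/overflow flag.

```c
#include <stdint.h>
#include <stdio.h>

/* Portable check: an extra compare before the add, where assembly
   could simply add and then branch on the carry flag. */
static int add_u32_checked(uint32_t a, uint32_t b, uint32_t *out)
{
    if (a > UINT32_MAX - b)
        return 1;              /* would overflow */
    *out = a + b;
    return 0;
}

int main(void)
{
    uint32_t sum;

    if (add_u32_checked(0xFFFFFFF0u, 0x20u, &sum))
        puts("portable check: overflow detected");

#if defined(__GNUC__) || defined(__clang__)
    /* Compiler extension (not standard C) that exposes the flag test. */
    if (__builtin_add_overflow(0xFFFFFFF0u, 0x20u, &sum))
        puts("builtin check: overflow detected");
#endif
    return 0;
}
```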
@@markrosenthal9108 It's not quite 1:1, because even without a macro assembler there are tricks you can do with machine code that are difficult or meaningless in assembler, like designing code that executes differently if you jump into the middle of a multi-byte instruction on an architecture with variable-length instructions (like x86 or Z80 or 6502 or 68K).
I started playing with computers around '69/70 and started programming for space systems around '80. When I began, cards were for development and paper tape was for finished code (it was more compact but really really annoying to edit). Fast forward ten years to around 1980 and terminals faster than a TTY were starting to become common, which made all the programmers happier -- even at 300 baud. That said, I was still doing a fair amount of programming with cards well into the 80s. Many programs were developed using FORTRAN to work out the algorithms and logic (C wasn't yet mainstream, nor vetted for space missions where I worked) and chase out a lot of the bugs, but in the end we were still translating that by hand into "optimized" and commented assembly (i.e. just compiling and doing an assembly dump as a shortcut wasn't an option). It was a treat when you got to use an AL you already knew from a previous computer; still, you ended up learning a lot of them.
Programming certainly has changed since then! When I started working I did some assembler, but that work died out and I haven't touched it since. The lowest-level language I use is C, and now using frameworks is important.
Do you still program in Fortran? Will Fortran ever be deprecated? I'll bet it will still be used in the 2100s and beyond. The more time passes, the less likely it is to ever be replaced.
@@TerryMundy Honestly, I never loved programming in FORTRAN, but what it does, it does well -- which explains its recent resurgence. Do I still program in it? Nope. I haven't had to for a while. Nowadays I tend to use C-family langs, Python, or a microcontroller assembly.
@@terpcj Then, if you were honest, you would know they were mainframes. Two Amigas were being used at NASA by 1986. The space program is a lie, told to me by astro-nots from Columbia who are still alive, and NASA was the one who paid for my A1200. I still have contacts there. So I'm tired of the lies.
I have worked on Assembly language programs and it really finds out your weaknesses and forces you to learn. I am not a professional developer BTW, but I do have a passion for embedded systems. I know why they used assembly so much back in the day. It was their only real option. Thank goodness for really good compilers and large memories nowadays. WE ARE SPOILED.
👍🏾 True. The thing with assembly language (and the same can be said for machine language) is that the coder has to know and understand the actual CPU and hardware, because you're working with and manipulating registers and memory directly, more or less, depending on whether virtual memory mode is active.
I still remember my first days meeting assembly. It was in the 3rd grade of elementary school... No, I was not some Sheldon Cooper-level genius in some "gifted kids' school". It was the early 90s, and my school decided to buy modern computers (so 286s instead of Commodore 64s) and start computer science courses for the kids. It was really forward-thinking at the time, sure, but there was one tiny problem. None of the teachers knew ANYTHING about computers, so they assigned the science teacher to learn about computers from books and teach the kids what she learned. Basically she was ahead of the classes she taught by one or two lessons. And what exactly do you think they decided would be the best way to introduce kids to computer science? Yes, exactly what you thought: assembly language. During the first few lessons we learned what peripherals are, then right after that we were introduced to registers, stacks and heaps, and data manipulation verbs. Within about two months, all the kids had been taken out of that course by their parents.
@@Jenny_Digital Same here. I wrote a few simple arcade games in 6502 back in the 80's on my Atari 800. 6502 was so simple that it made it very difficult to program. At college in 1987 I took an assembly language class which involved programming a mainframe computer. I don't remember the name of the CPU but the assembly was much more advanced making it fairly easy.
@@jp5000able I had an Acorn Electron with only a black-and-white telly and a tape recorder, but the built-in assembler meant I got my start in 1983 when I was only six. I had no printer, so I typed my listings on an Imperial wide-carriage typewriter my dad had rescued from the tip whilst working for the council. Back then my computer came with a very comprehensive user guide, getting-started book and demo tape. Now you get a sheet or two of paper telling you how to plug the thing in and not to open it.
Curious Marc is a treasure trove channel. Going to auctions, nabbing old gear, doing hardware debugging, rebuilding display tech... Then casually piloting a landing. They are amazing.
@Fred-yq3fs Yes, and their series of videos about the Apollo systems is fascinating. At one point they even had access to one of the rope memory "memories" (I guess that's the term?) and reverse engineered it to determine what code was woven into it.
Absolutely true! I learned so much from their hands-on restoration of an AGC and a download of its core. They then re-ran the Apollo 11 landing with the AGC interpreting inputs and computing outputs.
Margaret Hamilton was the lead software engineer for the AGC. She coined the term "software engineering" and pioneered fault tolerance in software systems. Her ideas about fault tolerance played a crucial role in the graceful handling of the AGC overload right before landing.
Gracefully handling? The computer crashed repeatedly, so they hand-landed the damn thing. I get it, she was an OK programmer, but let's not blow what she did out of proportion, which is happening a LOT recently... probably because she is American and Black. Nobody remembers the other female programmers who had been around for decades, and in one case two fucking centuries earlier.
I went to the Smithsonian a couple of decades ago now and was absolutely stunned by the technology and computers used to go to the moon. It looked absolutely archaic in design. I literally saw soldering points for wires on the green boards. I was told at the time they had triple redundancies because of the anticipated shaking that might loosen the wires. My respect for those astronauts only grew tenfold when I saw that capsule and those components. Now, I have heard that we have to relearn how to go to the moon, because most of those brilliant engineers are now dead and things were kept on paper. Wow. You would have thought that we would have kept and/or recorded these kinds of things, even if only for posterity's sake. Great video, thanks for taking the time to explain it.
We have tons of records from every aspect of Apollo. Drawings, technical reports, etc. What’s gone is the ephemera: not design details, but background of “why did we design it this way”. Relearning how to go to the moon is more a matter of experience: the engineers working on Artemis have tons of experience with Earth orbit, less so with the environment of the moon and its unique challenges.
@@Hobbes746 I think it comes down to two things. Firstly, I heard that the Apollo engines required individual hand-tuning, which the engineers knew how to do at the time, but they're all dead and no one knows exactly what they were doing. Second, there's a cascading chain of technical debt. A lot of minor components are surely out of production, or the contractor companies are gone. So a minor component needs replacing with something else. But that then affects multiple other components, which also need to be changed to be compatible, which in turn affects other things, and so on, until it'd be simpler to start with a clean slate. To be honest, it's now more likely that Starship will be ready for a lunar mission long before any attempt to rebuild the Saturn V "as is" could ever be ready. Plus Starship will be reusable.
@@calebfuller4713 Yes, those all play a role too. Hand tuning the engines can be learned again, but that takes a few years of hands-on experience, building an engine and then testing it. You also have to consider the change in manufacturing methods: you’d have to redo all the drawings in CAD so you can use modern machines (CNC etc.), you have to take into account that some of the metal alloys used back then have been replaced with new ones with different properties. The list gets long.
@@Hobbes746 Absolutely true! It's part of what I meant by a cascading chain of obsolescence. And then you need to test how the new methods work versus the 1960s methods. Will it blow up, or work better than ever? It's not even just the rockets. It's stuff like the control computers. Are you going to dig up a vintage 1960s computer? Or does it all need to be adapted to a modern computer? Which is a whole new project in itself... one that is again maybe harder than just rewriting new code in a modern language from scratch.
In an age when there are bad actors who seek to revise history or erase it (not for greater accuracy, but for barbaric political reasons), hard copies are good to have, to preserve the sanity of the planet. When the risks of technocratic neo-feudalism or nuclear war are becoming increasingly evident, what we call "records" could become future artifacts that could help people sort out what went wrong.
Fascinating. The code itself is one of the best records of the mission. I also want to point out the amazing inertial motors/sensors connected to the computer. This combination of brilliant engineering enabled the craft to remain perfectly oriented through all the missions. Our phones have a similar device installed, and the Oculus Rift has a pair. A DJI drone hovers perfectly still in all wind conditions (below 25 mph) because of its precise, near-weightless inertial sensors. For a computer in 1969 to observe that data and respond to it accurately is mind-blowing.
As long as you make comments, especially in assembler. Joe Armstrong (the inventor of Erlang) made a joke about the commenting behaviour of one of his colleagues: it was one big file of assembly code, and in the middle of the file there was just one comment: ";And now for the tricky bit!" 😆
As long as what needs to be known can be known, putting in a little humor or a note to future programmers is like putting a note in a bottle and casting it into the sea. Who knows who will eventually read that note, and how this little bit of humanity will affect their appreciation of the code. I think the comments also help document a point in history with slang or literary or movie references. If done tastefully and with a sense of balance, these comments give us insight not only in the code, but of the programmer.
Amazing! I remember an interview/debriefing with all three astronauts after their return, and Buzz admitted to not releasing the radar task, which he should have done to follow procedure. The result was the two errors. Buzz said that at the time he wasn't going to let go of his ride home, or something to that effect. Who can blame him! What they did is beyond brave! I think you can find this interview on the 3-DVD set NASA released (years ago now). I can also remember watching this in real time back in '69 - it was sooo tense. The fuel almost gone, the computer errors, a last-second decision to manually override the landing to get a better spot. Whew, it was spectacular. Thanks for doing the video - good job! - Bill
I started learning programming when I was 19, in '83. 8086/8088 assembly code was the common language. I still use 8086 assembly, as BASIC is often too limited. I got to watch a Saturn V launch in '71. That is when I got the programming bug. Thanks for this information.
My parents helped write the code for the AGC; it was their first job after getting their master's degrees in math (no computer science degrees back then). They worked in Houston and were working there on January 27th, 1967, when word came in from Cape Canaveral that Gus Grissom, Ed White and Roger Chaffee had died in the Apollo 1 capsule fire. It was devastating news, as they had known them, if only from afar. It is one of the clearest memories my mother still has of that time. Later they moved on to work on missile systems at White Sands Missile Range, like the Patriot and TOW missiles. I still remember coloring on discarded punch cards and printouts from failed runs :) When I spoke to my dad about it years ago he talked about having to program computers with switches when he first got to WSMR - crazy stuff. Things have come a long way since then; it's good to remember that we stand on the shoulders of giants.
Actor Jack Black's mum worked on the Abort Guidance System (AGS) in the Apollo Lunar Module. In what has to be one of the best quotes ever: "In a memorial tribute, her son Neil notes that she was troubleshooting problems with schematics on the day she went into labor, called her boss to let him know she had fixed the problem and then delivered Jack." The calculations and measurements were also done in metric and displayed in imperial units, as that was simpler.
@@Alkatross You do! All imperial measures in the USA are based on metric standards. The US inch has always been 25.4 mm. However, in the UK it's varied over time - at one point 1 inch was even 27 mm. Also, US military: 1 klick = 1 km.
@@mrrolandlawrence I realize it's all the same stuff, but it's just the general knowledge of the measurements that I am missing as a native freedom unitarian.
@@Alkatross Yeah, that's one thing I'll never understand... no one uses imperial tons in the USA. It's always lbs, even when it's in the millions. One curse that is worldwide, though: TVs, measured in inches everywhere, and car tyres for some reason.
So cool you gave the Wizard of Oz not one but two mentions in your video. One is the "Off to See the Wizard" bit of the code, and the other is that Margaret Hamilton is not only the name of the MIT software engineer but also of another Margaret Hamilton, who played the Wicked Witch of the West.
Liars love to add little nuances to their scheme so that the dum dums get distracted and forget why they were looking at the 'code' in the first place. In order to have code, you would have to have a computer. But just like everything with the authorities, they keep evolving their schemes to maintain the realism for the newborns in the human family to tell stories to.
Excellent video. It's heartening to see young software developers pay homage to the giants whose shoulders all developers stand on. I was 7 years old when Apollo 11 landed on the moon. Later in high school, I read about the AGC and I was inspired to learn how to program. First course was FORTRAN with punch cards. I worked as a software engineer for 38 years and never forgot the lessons learned from those pioneers. PS I love your South African accent. Keep up the good work!
I have always injected some humour into the code and continue to do so - mostly for my own amusement - sometimes as a monument/reminder of some insight that would otherwise get lost in time. I enjoy seeing it in other people's code too, it does have some utility beyond putting a smile on a tired developer's face though... What I have found is that third parties looking at the code latch onto those bits and pieces, they give an otherwise featureless jumble of code some landmarks - which folks can use to navigate the code (and share in the insights of the original developer).
I can think of two reasons we see less humor now. One is that all the old timers already made all the obvious jokes. Another is that corporate coders are discouraged from frivolity. Fortunately for me, my boss is old school and it's not a big deal for us to joke in our comments. But we do worry that the company will some day be acquired, so we have to be subtle about it. We're cursed with political correctness.
@@SpareSimian As it happens I work for a soul crushing corp, I don't get much push back because I don't feel a need to be "edgy" with my humor. Where there has been some push back it's come from insecure folks who want to assert themselves - for my part I am happy to let them vent and make a fool of themselves in public - they rarely repeat the mistake.
Note: Assembly code is platform-specific. x86 assembly is very different from ARM64 or 6502 assembly. They have completely different syntax, opcodes, registers, etc. Each CPU type's assembly code is essentially its own language, and there is generally no direct parallel between different platforms.
Oldish assembly coder here. Whilst assembly can be different on each platform, one of the biggest differences is big endian versus little endian, where multi-byte values are basically stored backward (kinda). And yes, opcodes, registers, memory etc. are all different, but those can even differ from one chip to the next within the same architecture.
I agree they are different, but... I know Microchip assembly very well, and if I have a quick look at a reference manual, I can easily read and understand any other flavor of assembly.
@@TheHeff76 Endianness is rather insignificant; big endian is easier for a programmer to read but slower for the computer (little endian lets you start doing maths as soon as it's read the least significant byte). Load/store vs. register/memory architecture is a more important distinction. The way the 6502 updates flags after every instruction, while the 8080/Z80 only does it on a few, is a larger difference. Even the indexing modes on the 6502 are vastly superior. Trying to switch between the 6502 and Z80 (which are both little endian) can be annoying; the 6502 is the best CPU ever.
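As a quick illustration of the endianness point in this thread (my own sketch, nothing to do with the AGC): the same 32-bit value, viewed as raw bytes, starts with its least significant byte on a little-endian machine and with its most significant byte on a big-endian one.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    uint32_t value = 0x0A0B0C0D;           /* most significant byte is 0x0A */
    unsigned char bytes[4];

    memcpy(bytes, &value, sizeof value);   /* reinterpret the 32 bits as raw bytes */

    /* Little-endian machines store 0x0D first; big-endian machines store 0x0A first. */
    printf("first byte in memory: 0x%02X -> %s-endian\n",
           bytes[0], bytes[0] == 0x0D ? "little" : "big");
    return 0;
}
```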
@@phill6859 I've never done anything with the 6502, but I heard it had very few registers. Also, every instruction updating the flags sounds more like a nightmare to me; what if you want to test more than one flag? Also it means that you'll more often need to store the flag's value somewhere else for later reference.
@@__christopher__ Depending on which instruction is running, only some of the flag bits are changed, not all of them. Normally only logic instructions like AND, OR, NOT, rotate, or math like add, subtract, compare affected any of the flags; then you use conditional jumps to branch on the status. Even though it might refresh the flags on every instruction, it doesn't usually change any of them.
One of the most interesting YouTube episodes I've ever viewed. I started using assembly language in 1970, and have a very high degree of respect for early coders. Thanks for your research and presentation! I enjoyed it immensely!!
As did I. A little bit of programming in hex-based machine code and then on to assembly. We thought it was amazing when we got 8K of RAM. How things have changed.
My first programming was less than a decade after the moon landing, and we had a landline from college (UK, so age 16 to 18) to a local university minicomputer. The link terminal had a huge telex keyboard and a paper roll instead of a cathode-ray tube screen, and then you had to wait most of the lesson (or frequently until the next one) to get the inevitable error message about a typo in line ten, or maybe the results of the program if you had been extra careful. A DSKY like the Apollo had was massively ahead of its time. NASA has large public archives concerning the tasks and planning for the tens or hundreds of thousands of people involved in the project, and the solutions to new problems that had never previously needed solving.
Catching up was indeed a problem, with no budgets for that sort of thing, as computer use was perceived to be just something for accountants . . . The more enthusiastic students had Commodore or TRS-80 machines at home, at about the same time as the telex-and-paper terminals in college!
Margaret Hamilton's daughter often went to her mother's work and was allowed to play in a mock-up capsule that contained an AGC. At one point an alarm went off, and Margaret asked what her daughter had done; it turned out that she had started program 00, which basically meant that the Saturn was ready for launch. Margaret warned her superiors, but they brushed it off, saying that astronauts would never be stupid enough to do that. Until James Lovell made exactly that mistake during the Apollo 8 mission, when it was already close to the moon. Mission Control was not amused. With much pain and effort they managed to restore the AGC remotely!!!
00 is the AGC idle ("go to pooh") command. 01-03 are the commands for starting up a Saturn. Also, Lovell got 40 and 41 mixed up during Apollo 13, which they were able to unscramble quickly. You're thinking of another incident (maybe on Apollo 8?) that most of us haven't been exposed to?
@@fredflintstone9609 Yes, as I have already said, it was on the Apollo 8 mission. Lovell himself confirmed it. If I can find the link, I will add it here later.
Assembly language also is not just one language. It's a family of languages that are specific to the processor the engineer is writing for. For example the assembly written for an Intel x86 chip would differ from say an AMD x64 chipset. Back in the day each computer had its own version of assembly, so the code written for this flight computer can really only run on this flight computer.
Yeah, I was devastated when I found this out, because I had learned assembly for the ATmega328P and felt like I could code anything retro, only to find out assembly instructions are microcontroller-dependent.
7:18 Fun fact: the stack of binders next to Margaret Hamilton, the "rope mother", is not "the software"; it is several versions of the software. I assume one binder was one version of the software, that is, the listing of the 69,000 bytes of the final version for the AGC. If I remember correctly, stacking the binders was the photographer's idea, to make it more impressive.
Great video, thanks! Don Eyles, one of the programmers responsible for those jokes, wrote an interesting book called "Sunburst and Luminary: An Apollo Memoir" about his time working on Apollo.
It's a really good read! Don Eyles was the main author of the lunar landing software for the Lunar Module and helped pioneer some coding techniques still in use, like memory fetch interleaving (used especially by game developers to get the most out of consoles), as well as interrupt coding, used in operating systems. The latter turned out to be key to preventing the information flooding mentioned in the video (indicated by the 1202 alert) from stopping the mission.
Fascinating, thank you! Yes, I'd held the incorrect belief that the AGC was roughly equivalent to a calculator. I'm a technical person, but not a "computer" person, and I found this video very helpful.
We were using punch cards up through 1991 on our ship. We had to learn Hollerith code and even had 5-channel paper tape and magnetic tape. We had guys coming down from various departments, sometimes with several BOXES full of the cards. Every once in a while, they would trip on the ladder (stairwell) coming down to our shop to give us the job! Those cards were stacked sequentially and had NO writing on them. They had to go back to their shop and re-run the entire job again to get the output. :)
So they missed the trick of running a pen down the side of a stack of cards at a slight angle so that even if the cards get shuffled it's quick and easy to get them back into order.
@@Heater-v1.0.0 Our cards were punched and also had the text printed at the top of the card. We were taught to tab over to the comment part of the punch card starting at I believe column 72 and type a sequential number. No one ever did it, since it was so slow to move over there. Dropping your punched card deck was everyone's nightmare.
@@dermick Yeah, we didn't bother numbering our cards when punching in Algol code in 1976. Seems a lot of people didn't cotton on to the trick of running a marker pen down the side of the deck. Anyway, luckily by 1977 we had terminals to a timesharing OS, no more card punching. Then very soon came microprocessors and personal computers. Happy days...
Thanks for such an amazing video, Dee. Been an Apollo fan from a very young age. Being a Python developer and Linux script writer myself, I found it so interesting. I liken the "main" file linking it all together to my preferred scripting methodology: making use of many (modular) functions, called as required.
I learned 6502 assembly language progamming on the Apple ][ computer, a decade after the 1969 Apollo 11 moon launch. The page layout and formatting of the Apollo Guidance Computer assembly code is strikingly familiar, and similar to the assembly language syntax and commenting standards I used and encountered in programming Apple ][, Commodore 64, and IBM PC computers in the 1980's and beyond. It's inspirational to see how low-level programming culture and techniques evolved from the earliest examples of embedded systems programming on machines that predated modern microprocessors.
Fascinating comments about a fascinating subject. My first electronics project was a 3-tube ('valve' for the English) turntable pre-amp built in an aluminum chassis I punched and bent myself. My last project was a dual, surface-mount µP, touch-sensitive subsystem in a handi-cam built on a ten-layer, 3-square-cm pcb. I designed hardware and wrote embedded assembly code. I lived and worked through all the intervening hardware and software growth in systems and technology between those two points. Thank you, Dee, for reminding us about our roots! It has been a great ride!
I remember machine code and hand compiling 6502 programs into hex! I also remember watching the landing live, aged 12. BTW it was July 20th 1969 not the 29th. As a computer operator back in 1983ish punch cards were still in regular use where I worked. Having recently retired after 40+ years in IT, based on talking to recent graduates I get the impression that basically Assembler is no longer taught in college. This is a mistake; short of some kind of electrical engineering degree there is, in my opinion, no better way to understand what the hardware's doing.
Fairly good description of my start. Watched moon landing, programmed on punch tape between 1972 and 1979, Fortran and punched cards 1980, Computer Science degree included hand compiled 6502, bit slice CPUs, and graphics processors along with Pascal, ADA, APL, Cobol. Finally went to C and Smalltalk.
@@ProfPtarmigan Pretty close. I started as an operator in '79 and learnt COBOL & ICL PLAN assembler on the midnight shift from reference manuals. Switched to programming in '83, spent 6 years with a software house (COBOL, RPG2 & Wang assembler), moved into the airline industry in '89, spent 9 years doing TPF & MVS assembler (both in the UK & US), and switched to Java in '98. Moved into DevOps in 2018 and retired in 2023. Somewhere in there I found time to snag a 1st class hons in, effectively, computer science from the Open University.
Never disregard the old-timer engineers. It was their work which laid the foundation for today's tech. Their work was the stepping stone for today. We could not have gotten to where we are without them. They blazed the way!
In the mid-1990s, my company designed and built a new space launch platform at Cape Canaveral. As part of the design process, we consulted with as many of the Apollo engineers as possible. These guys were all long since retired but still well aware of their achievement. What amazed me was the very high level of disdain between the old-timers (attitude: Been there; done that.) and the young engineers (attitude: What's it like being an actual dinosaur?) Bringing those two disparate groups together and getting them to respect each other was amazing.
Thank you for making this video. For the record, the AGC didn't use bytes but rather 15-bit words (plus a parity bit), with about 2K words of erasable memory and 36K words of fixed rope memory. Very restrictive, and requiring extreme coding efficiency that modern developers don't generally have to worry about.
I agree; however, I work for a large semiconductor company and am amazed at how little our graduates actually know about assembler and how it interacts with hardware. They have no knowledge of flip-flops, latches, bus drivers, FIFOs etc. I developed on an Atari 600XL in assembler and then on the Motorola 68000. I now work on multi-core ARM and DSP, however I have never forgotten the basics of how a processor core works. I am waiting on the Monster 6502 project to be completed. This has to be the ultimate demo of how a 6502 works. I hard-wired my first 6502 board in 1982 and used a Dataman S3 EPROM emulator to emulate the EEPROM. In those days all we had was the humble LED to assist in debugging. Developers on MCUs have it so easy today with C compilers, debuggers, flash memory etc. Having gone through the early days, one learnt how to code correctly in the first place. There are some great programmers out there, but even more useless ones, especially idiots that seem to want to show off their apparent skills on LinkedIn. Great video and loved the detail👍
Fun fact: When the space shuttles were built, technicians in Palmdale, California used IBM punch cards as shims to establish gaps between the tiles during bonding operations. As late as 2008 we still found pieces of punch cards between tiles that flew many missions. The punch cards were very consistent in terms of thickness and techs could establish gaps with them to meet strict gap requirements very easily.
I'm 67. At school they stopped the whole school for the lunar landing, wheeled out a television, and we watched the landing LIVE. Later on in my career I spent 40+ years in IT, as a specialist in aerospace/defence: several languages, no machine code, megabytes of code. Yes, I put in jokes. They can be useful: if an error says "Tony your shorts are on fire", you go straight to that place in the code to correct it; "Error 404" could be anywhere. NOW THE IMPORTANT PART. On Apollo, I was taught, they programmed in triplicate. For a key calculation, three programmers would code the algorithm. The onboard computer would run all three in parallel. If all three did not agree on the result, it would use the majority result from the two that agreed. If all three were different, it was "Shorts on Fire". From my experience they could not run the whole of the system in triplicate (we can now); this would be sub-sections of code.
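A minimal sketch of the 2-of-3 voting idea described in the comment above, written in C purely for illustration: the three "independently written" routines here are trivial stand-ins, and this is not how the actual Apollo software was structured.

```c
#include <stdio.h>

/* Three independently written implementations of the "same" calculation. */
static long calc_a(long x) { return x * x + 1; }
static long calc_b(long x) { return x * x + 1; }
static long calc_c(long x) { return x * x + 2; }   /* deliberately faulty */

/* Run all three and vote: return 0 with the majority result if at least
   two agree, or -1 ("shorts on fire") if all three disagree. */
static int vote(long x, long *result)
{
    long a = calc_a(x), b = calc_b(x), c = calc_c(x);

    if (a == b || a == c) { *result = a; return 0; }
    if (b == c)           { *result = b; return 0; }
    return -1;
}

int main(void)
{
    long r;

    if (vote(7, &r) == 0)
        printf("majority result: %ld\n", r);
    else
        puts("no majority - raise an alarm");
    return 0;
}
```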
What an outstanding presentation and explanation of Apollo 11's hardware and software. When I first took computer science in the US Army (long ago) the instructor began with: "Students, we have been in the digital age since 1844, when Samuel Morse invented his code for the telegraph. The dots may be '1's and the dashes may be '0's, or vice versa, but each combination results in a letter, letters can make a word, and each word is part of a language." The rest was easy, though the computers that guided the missiles of my duties were analog artillery computers from WWII. What an excellent video to jar loose that memory. Thank you...
"Comanche" is pronounced as /ko-man'-chee/. "ko" rhymes with "go" but can be softened to "kuh". The second syllable is emphasized; "man" is exactly like "man". And "chee" rhymes with "me". The Comanche tribe of Native Americans here in the USA are headquartered in the state of Oklahoma.
@@msromike123, the typical American pronunciation is /uh-loo'-muh-num/. Emphasis on the second syllable. The last two syllables can be /min-um/ instead. You can blame our first prolific dictionary editor for all that. Old Noah Webster.
There is a book about the Apollo guidance computer called "The Apollo Guidance Computer: Architecture and Operation", which is pretty good reading. It had preemptive and cooperative multiprogramming, and an interpreter to implement more complex instructions.
Correct me if I'm wrong, but my understanding of the AGC is that it was NOT just a calculator, as some are led to believe. To believe that would be absurd. What was meant by that statement is that it had computational power in line with that of a calculator, but was far more capable because of its preemptive and cooperative multi-tasking abilities.
@@TheWallReports I guess the problem is also that different people think of different things when thinking of a calculator. When I think of a calculator I think of a non-programmable device which has keys for a fixed set of mathematical operations and a display capable of showing a single number; however there were other electronics also going under the name of calculator that could be programmed and had graphical displays, and which IMHO would qualify as pocket computers.
@@TheWallReports It looks like a calculator because it uses a VERB/NOUN interface, but it could do a lot of things. The verb was a two-digit code (00-99) that defined the action. The lower codes were for early stages of the mission like prelaunch, while higher codes were used for later parts of the mission like navigation, descent, and landing back on Earth. This interface was something the developers came up with while expecting something better in the future, but nobody could think of anything better. The CPU had 16-bit registers. With the interpreter they implemented opcodes on double-width registers, doing the trig functions and vector/matrix operations needed for flight control. The CPU had access via I/O to many sensors and engine controls. The AGC was quite capable, but the ground facilities had much more powerful computers to augment it, because there were weight constraints on how big an onboard AGC could be.
I remember playing with discarded punch cards as a child back in the early 1980s, I guess. My parents were students at the time & one of their student jobs was loading punch cards into the university's computers (and taking them out again later). They brought home a bunch, mostly to be used as note pad replacements. They also caused my child brain to imagine all kinds of funky things for all those strange hole patterns. Thank you very much for this fascinating & entertaining dive into our past; definitely one of humanity's most uplifting & most impressive achievements ever.
Great summary of the AGC. As others have posted, Curious Marc and team have brought back an AGC and rebuilt the code for multiple missions. In their current videos they are debugging the ground radio commands used to remotely program the AGC. Given the state of the art back then, it was a fantastic accomplishment. Less well known is the LVDC (Launch Vehicle Digital Computer), designed by IBM, that guided the Saturn V and astronauts to orbit.
Has LVDC been published like the Apollo 11 source? I'd love to see that code. Maybe Curious Marc could restore that hardware for an encore to the AGC project
@@shackamaxon512 Much of the extant LVDC hardware is now a tangled mass of wires because, to keep it light, the LVDC body was built of beryllium. These have degraded to uselessness over time. A shame too; the published specs make it clear the LVDC was also a ground-breaking machine.
Very interesting, Dee😀. I am Norwegian and living in Norway. At the age of 17 I was very interested in and fascinated by the space programs, especially Apollo. On July 20 I sat "glued" to the TV and followed the real-time program as Armstrong and Aldrin actually landed on the moon. This was probably the most impactful moment in my life ever. Later in life, in the early eighties, I did a lot of programming in IBM assembler code, writing add-ons to the SWIFT systems used by the banking business. Then in December 1995 I was very happy to visit the space center at Cape Canaveral. Your YT video was very interesting and gave a good view into the control systems.
Apollo's AGC was the first computer to use ICs (integrated circuits / chips). There was a lot of discussion about this, but the team decided that they were the only way the project could be realized.
Whilst they used some super basic logic gate ICs in the AGC, the real leap goes to the CADC in the Grumman F-14 Tomcat, which was designed around 1968-70 and used the first set of ICs that, in combination, would be considered a microprocessor; the computer was the only way to make the swing-wing design flyable.
I wouldn't say the AGC literally "got us to the moon", but it did take a HELL of a lot of the workload off the astronauts and Mission Control. They had to have backup procedures in the checklists in case the AGC failed. What really got the Apollo spacecraft to the moon was the Trans-Lunar Injection burn, and that was controlled by separate hardware called the Instrument Unit located in the top of the S-IVB booster stage. The IU could be flashed by either MCC or the AGC, and I would love to know who wrote the code for it, and what it looked like. Probably the most common tasks performed by the CMC in the Command Module were setting up the DAP (digital autopilot), and timing the ignition and cut-off for large delta-V burns by way of programs 30 and 40. Verb 48 would access the DAP set-up, where you enter the spacecraft's weights and configuration (CSM and LM, staged or unstaged). You could then enter whatever attitude you liked into the roll, pitch and yaw registers, and the CSM would maneuver into your requested attitude and hold it. For the big delta-V burns like Lunar Orbit Insertion and Trans-Earth Injection, P30 set up the time of ignition and burn magnitude, whereas P40 actually fired the SPS engine and shut it off with an accuracy of up to 1/100 of a second.
During Apollo 8, Lovell accidentally entered something that made the AGC "think" they were back on the launch pad. th-cam.com/video/Wa5x0T-pee0/w-d-xo.htmlsi=yKVtUI_-NUDnxnql?t=43:04
Sure, it's just amazing they had a computer on Apollo that was programmable, yet not 5 feet long and 250 lbs like the DEC PDP-8. Or not the size of a room like the Honeywell 200. Or several tons and 4.5 feet wide, 6 feet tall, and 3 feet deep like the IBM 360. The next miracle they'll tell us is that we went to the moon with early iPhone technology. When a story is made up, it can change and evolve and adapt to all the people available so they BELIEVE, instead of RESEARCH. And then those same people will clap back with words that destroy arguments and conspiracies in a single breath, a word like "SCIENCE!" It's like an exorcism, just making all the evil truth demons jump out! Just scream it out loud and watch all the people scatter. Unfortunately, it doesn't make empirical evidence, facts, and observations scatter.
@@ThePlanetTVtheplanettv The Apollo AGC was the most advanced computer in the world. It was the first computer in the world to use integrated circuits. This came about because NASA saw the potential in the early research into ICs that was going on in the early 1960s and decided to support this research. In 1965, NASA purchased 60% of the world supply of integrated circuits. This investment paid off: the AGC was not 5 feet long, it was the size of two shoeboxes. The reality of the Apollo missions is supported by a mountain of evidence. The deniers never present any evidence at all, just "arguments" they made up. They think their ignorance constitutes evidence.
I was in the US Air Force, and we used punch cards well into the 80s. It was in the 90s, if I recall correctly, that we switched over to a system of remote terminals. Even then, it was the old system at heart, but you could enter data directly rather than via punch cards. This allowed for real-time updates to maintenance data rather than overnight batch processing.
Yep... I still remember the thrill of the first IBM 3270 terminal in the shop with CICS to enter our code on. The senior programmers hogged it up most of the time...
When I was in the Marine Corps, I was almost a mainframe operator. Still even lists that MOS on my dd214 as a secondary MOS. Luckily, due to some priority changes, I went to school to be a Small Computer Systems Specialist (an IT guy basically), but then ended up actually doing UNIX sysadmin work. I was around a lot of mainframe folks, though, both Marines and civilians.
That's Epic; thanks for clearing up the situation between the AGC and the Lunar Module Pilot: Buzz "Computer, tell me the altitude", Computer "Hold that thought Buzz, I'm a little Busy at the moment! (1201 & 1202 alarms)"
Not too interesting explaining the loading of a value into a register. More fun just to see what the developers were thinking. I learned assembler in college; it was excruciating.
@JoeK-q6q Oh wow! You learnt assembly!!! I know assembly (ARM, x86_64, 6502) too, it's not that hard, dude. If the comments are what's interesting then name the video "the comments of the code that sent Apollo 11 to the moon", not "the code that sent Apollo to the moon". I expect a video with this title to talk about the code, not the comments.
@@kalei91 If I remember, it takes roughly 10 or more assembly instructions to represent one line of high level code. It would take half an hour just to explain a simple arithmetic routine 🧐
Hi... thanks for the nice video about the Apollo software (and hardware). I have written a lot of assembly code as a computer engineer, so I can appreciate what it took to do this work. This required a great deal of planning from someone at the top to structure, given that assembly code by nature does very little per instruction... requiring LOTS of source code. It looks like they broke the code down into many small parts, which makes sense.
I don’t recall how they handled decimals, but it is worth noting that if you understand the range of numbers you’ll be dealing with and range of accuracy you need, there is absolutely no need for floating point. Floating point is way more flexible since you have both significant digits and the ability to place the point within those digits, but if you don’t need to move the decimal, integers are fine, and they would have a pretty solid idea of the range of numbers possible for any reading or calculation, the accuracy of the inputs, and usefulness of the outputs. Genuinely amazing code.
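To illustrate the scaled-integer idea this comment describes, here is a tiny fixed-point sketch in C. The 8-bit fractional scale and the helper names are arbitrary choices for this example; the AGC used its own per-quantity scaling conventions, not this particular format.

```c
#include <stdint.h>
#include <stdio.h>

#define FRAC_BITS 8              /* 1.0 is represented as 256 */
#define ONE       (1 << FRAC_BITS)

typedef int32_t fixed_t;

static fixed_t fx_from_double(double d) { return (fixed_t)(d * ONE); }
static double  fx_to_double(fixed_t f)  { return (double)f / ONE; }

/* Multiply two fixed-point values: widen to 64 bits, then drop the extra scale. */
static fixed_t fx_mul(fixed_t a, fixed_t b)
{
    return (fixed_t)(((int64_t)a * b) >> FRAC_BITS);
}

int main(void)
{
    fixed_t altitude = fx_from_double(12.5);
    fixed_t scale    = fx_from_double(0.75);

    /* Prints 9.375000: plain integer arithmetic throughout, no floating point
       needed until we convert back for display. */
    printf("12.5 * 0.75 = %f\n", fx_to_double(fx_mul(altitude, scale)));
    return 0;
}
```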
And we just KNOW, to the BONES, mother nature hates floating point math. Gawd, if only we had copied "Dr John von Neumann's" brain to help with the missions!
As a space nerd who spends a horrific amount of time observing the very small details and progress SpaceX is making with Falcon and now Starship, and who also has a physics and engineering background, I can say you did a very good job of explaining the mission correctly. So many get parts of this wrong. And you made the inner workings of this very interesting. This is a great video. Have you ever seen the interview Smarter Every Day did with one of the engineers who made the Saturn and integrated all the systems that the code you reviewed ran on and controlled? It's awesome.
7:45 As someone who has written assembly language programs for 64K machines, I can say that the stack of printouts that Margaret is standing near in this photograph must be for the *entirety* of software for the Apollo program, not just the lunar and/or command module AGCs. A printout of source code for an assembly program (even one heavily-commented) that fits in 73K (3,840 RAM + 69,000 ROM) might require the number of pages of one of the smaller binders.
In those main files in the GitHub repo you can see that Luminary099 had 1743 pages and Comanche055 has 1751 pages, which is like a box of copy paper. That stack looks more like 8k pages or so. I think you're on to something.
She has said in interviews that when the reporter and photographer showed up, they came up with the idea of making the "stack of code." So, they went around the room and literally just grabbed whatever binders were there. So, that's probably not all source code. I suspect a lot of the thicker binders are just data dumps and other diagnostic listings. After all, the computer only had about 36K of assembled code. Even with copious comments, the source code for that is not going to be a huge pile of paper.
Yes. Each book is the source code for one AGC. But “Luminary 99” means the 99th version of Luminary, so there were far more versions than is visible in that stack. For that photo, they just grabbed a bunch of printouts that were lying around.
There's no way that stack is just the code for the Command Module and the Lunar Lander. I just downloaded the repo from GitHub, and the whole repo is 177 files @ 3.34 MB. Given printing on 8 1/2 x 11 paper that's roughly 342 pages... less than a 500-sheet ream of paper. I'm assuming those printouts were the typical green-and-white wide tractor-feed paper... so it would take less than 342 pages. The project I'm a dev on is almost 6,000 files at 2.18 GB... which is about 46K pages, or 92 reams of paper, and @ 2"/ream that's 182 inches, or 15.3 feet. The photo is pure made-up BS.
"This doesn't really happen in modern programming. File names, comments are always pretty straightforward and serious." You've obviously never seen my code! Thanks for a great video!
Ah, the good old days when programming was creative and fun, when code wasn't shipped until it was just about perfect. Also, the verb/noun paradigm is so well-suited to human command logic. Great video. Enlightening. Entertaining. Well done.
I mean, the noun-verb paradigm was the true genius of the system. It was such an elegant solution to the size and weight limitations necessary in the AGC human interface component. Also, an amazing design in light of the need for the astronauts to be able to accurately interpret the output from, and make input to the AGC in times of extreme stress. Mind boggling!
You can be sure that most of the work on the Apollo software was not fun but very tedious. Those jokes tell a story of exasperation. On the flip side, coding can be very much fun today, and modern software can be far more reliable than what even those most brilliant programmers of the 60s were able to do with their limited tooling. If your experience tells you otherwise, then perhaps you should look around a bit more for better projects, other programming languages, etc.
Ah, the good old days when requirements were so simple and constrained. Even the smallest app today does so many more things than just "Fly rocket to moon". And then a separate program "Land it".
I loved the video and appreciate the background information provided. I should point out that Margaret Hamilton was certainly not standing next to a single listing of the source code. Someone computed the entire GitHub repo to contain 1,751 pages, which would not create a stack that size. The most likely explanation is that it is an accumulation of test data run against the source, or possibly several printouts stacked for effect.
Checking online, it cuts deeper - that "burn baby burn" line was first used during the race riots in 1965, by an American radio DJ named Magnificent Montague... Super video; one of my absolute favourite software topics is the AGC. Many thanks for taking your time here, yet getting it all done in under 20 minutes. Well impressed! I am interested in what kind of instruction set they had available to them, in terms of a Von Neumann machine, to be writing assembly for a portable computer in the late 60s. And also that the tech had to be shielded from extreme physical forces - ploughing through Earth's atmosphere. And also, as others have said here in the comments, the actual point-to-point wire-weaving of the memory modules for the AGC, by real seamstresses with actual needles and a ball of wire, is the other most phenomenal of facts about this software-hardware machine! Thanks for the upload, subbed 👍
I remember it - I had just turned one month old when the landing happened. I was sitting on my mum's lap, watching it on a black-and-white telly. I later wrote code in a kind of assembly then moved on. Cool stuff Dee
Great video! This relates to my favorite computer hack on Apollo 14 to bypass the abort button. It’s too much to type in a comment but might be something for you to cover in a future video
Smarter Every Day has an extensive interview with one of the engineers who designed the AGC. He walks the viewer through the history and the hardware. Very very fascinating and fun.
Based on my experience using assembly language in the 1970s, you needed some sense of humor or else you would lose your mind. Most of the humor in my code and those of my colleagues was more like graveyard humor and rather dark. The comments and subroutine names you quoted are much more amusing than the ones we had.
This channel is AWESOME, I can't believe it took YouTube so long to recommend this to me. Looking forward to more content from you, Dee! Now if you'll excuse me, I'm gonna binge-watch this channel.
People might be amazed to know that there is STILL a LOT of IBM BAL assembly code running - especially in large, critical applications where companies deem it just too expensive to rewrite, test, and certify new code. The banking industry is notorious for this - lots of ancient code on HOGAN systems, still running hierarchical databases like IMS...
Assembly was my favorite coding language out of college. The really cool thing about it for me, is that it teaches you what the higher level language compilers need to do to get to the machine readable state. What is great about your research is the fact they needed to restart the AGC to get rid of the overload. Can you imagine today, having to restart your computer, in the middle of a critical moon landing in the next 60 seconds? But with their little powerhouse it was like a light switch. Done, and Done. Burn baby burn! I was 9 years old watching it live. I knew from that point forward, I was going to be a software engineer. Mercury, Gemini, Apollo. What an awesome time to be a kid! Thank God for SpaceX. They have relit the fire that burned for us in the 60's.
I can see why there are moon conspiracies. These engineers performed MAGIC. Forget the primitive hardware; the sheer elegance of practical code to control a system of that complexity seems like sci-fi mumbo jumbo.
@@airaction6423 Science hasn’t been prioritized since then. Even back then, it took the Cold War to make governments and their citizens see science as a high priority.
@@airaction6423 The reasoning is on the wall. The US went there for geopolitical reasons: win the cold war. It went there after Apollo 11 too, multiple times. Then the space race was won... and the cold war itself was won. The US government virtually defunded NASA. It had served its purpose. That's why. And it's sad.
Great video! Loved it, used to enjoy programming my first computer (C64) in machine code. Very impressed with the source code that made sure we REALLY could land on the moon and get back safely!
First I learned a pseudo assembly code the teacher wrote... then Fortran with both large and then small punch cards, then Basic on the IBM terminal, then APL on the terminal. We also learned a smattering of Cobol and Pascal. Later on taught myself a bunch of flavors of Basic including Mallard Basic on the Amstrad, AmigaBasic and a compiled Basic for the 68000... as well as C and the 68000 family of assembly. Also on the Amiga I learned E, Pascal, and a bunch of others I can't remember anymore. Then came the MS-DOS era with 6800 asm, MS Basic, Java & Javascript, HTML, Perl, ASP, PHP... now for the past 5 years I've been a dev using C++ with the Qt framework. Now learning C# using VS and the .NET framework.
I have an affection for that era, as my second computer experience (after the Dartmouth IBM mainframe, at age 12-13) was with a DEC pdp-8e, a moderate upgrade on the tech developed for this AGC. It was far easier with old Teletype terminals, and a high speed optical paper tape reader, than dealing with punch cards, or the limits NASA's AGC had of 2k of RAM and 36k of ROM. MIT's work on the AGC no doubt had crossover to DEC engineering. When we think of standard programming languages in the cp/m uPC era and since, that was NOT the AGC. The pdp-8e could run BASIC, FOCAL, COBOL, FORTRAN, or a two pass ASM assembler and compiler. NASA's AGC ran AGC4, its own unique hardware specific code. MIT's engineers amazingly outdid one major NASA contract spec for the AGC. They came out with a 70 lb computer, well under the 100 pound design goal limit. That in itself was amazing at that time. It's been a LONG time since I wrote ASM code. One reason Steve Gibson's hard drive utilities are so efficient is that's one of his long term specialties, while other code bloats just on GUI interfaces that may hog, on the video RAM alone, more than a million times the memory on which we ran entire space missions in 1969.
That was so interesting ! Thanks for sharing. I would say that the jokes serve as release from the task, but also as markers. It's way easier to remember a funny note than more boring instructions. So when they would look for a specific piece of code in this huge file, they could remember it was next to this or that funny quote or joke. Very clever, and it shows that serious work will always be more efficient when you mix some fun in.
Assembly is still very much relevant today. In fact, every single time you compile software, you are compiling it into ASM (Assembly) and then the assembler takes over from there.
This is true as if you need efficiency and speed from a CPU and it's peripherals, you switch to assembly, write a sub and call it from whatever higher level language you're writing in.
@@oglothenerd Depends on platform. After the advent of Turbo Pascal for the PC platform, new compilers followed that model and generated straight to machine code. Only older compilers generated asm and required an explicit asm and link step. Before long the common ones of those had an option to generate straight to machine code. Off the PCs, mainframes of the 1980s era were moving to eliminate the assembly intermediates and went from compile direct to linking. Compilers generating asm are historical artifacts on the common platforms at this point.
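For anyone curious, most C toolchains still let you look at the assembly they generate before the assembler or linker takes over; a minimal sketch, assuming gcc or clang is installed:

```c
/* add.c - compile with "gcc -S -O2 add.c" (clang works the same way);
 * -S stops after the compile step and writes the generated assembly
 * to add.s instead of assembling and linking it.                     */
int add_scaled(int a, int b)
{
    return a + 4 * b;   /* on x86-64 this often compiles to a single lea */
}
```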
I'd just like to thank you for a very clear and interesting explanation. I remember watching the moon landing as an 11 year old kid, then in my last year of middle school learning Fortran programming using the write the code -> convert to octal -> punch cards with an unbent paperclip -> bind your program with a rubber band -> wait for the printout to return from the university the next week method. In the early 80s I was programming simple games in Z80 assembly language for the simple home computers of the day. Persistent memory was cassette tapes!
Ouch ... 0:12 the photo of Saturn V launch has nothing to do with Apollo mission ... there is no manned spacecraft on top ... it's the launch of Skylab station
What is amazing is how, given the level of computer technology at the time of the Apollo missions, they were able to build all the necessary functionality into a computer that could fit into the Apollo Command Module and the Apollo Lunar Module. Both modules imposed strict limits on the size and weight of the Apollo Guidance Computer; there was not the luxury of a large, heavy machine. So the AGC was likely among the smallest and lightest computers of its time.
Yes. The AGC was the first computer *in the world* to use integrated circuits. Those were chips containing only two logic gates each, but they were an advance over the use of individual transistors as was common back then.
I started on 6502, 8088, and then later VAX Macro. I still love assembly language programming; there's something unique about knowing you are literally dealing with the instructions themselves. Now I use Python, which is nice, but it doesn't quite feel the same.
The book “Sunburst and Luminary: An Apollo Memoir” by Don Eyles is also an amazing look at the creation of the AGC and specifically the software from the start.
As an ex assembly programmer (on PDP 11 about a decade after this) I can add that it was good practice to comment every line of code. They were after all cryptic instructions involving registers and memory locations. Actually the PDP 11 had a great assembler. All registers could be incremented and used as memory pointers. Once you added Macros you were close to C. Actually C was essentially sugared PDP assembly code.
As a fellow PDP-11 programmer I approve this 😁. Just a small addition to the nice comments: I once saw this comment (IMHO also citing Shakespeare): 'Our faults dear Brutus are not in our stars, but in ourselves' - error handling function of the RT-11 operating system. I used RT-11 sometimes, but mainly worked with RSX-11M. All were 'real programmers', not just quiche eaters 🤣.
Likewise... this brings back equal memories of nightmares (when it didn't work) and huge euphoria (when it finally did work). We were interfacing A/D's to PDP-11 for data acquisition. Set up "Ping Pong" buffers so the A/D could be writing to Ping while the PDP-11 could read Pong; write to disk... then flip the pointers. Hopefully the "bathtub" never overflowed while filling and draining at the same time. It was those other pesky interrupts from disk writes that could ruin things... now, what was the sustained disk write speed for 8k vs 16k vs 32k block-writes???? As I get close to retirement age (actually past it, but still love being an engineer), I'm thinking about getting into Arduino's for Halloween projects.
@@EngRMP The Arduino is good. I'd recommend getting some of the clones, particularly the nano version. I would also recommend using 2.54mm pin header sockets and connectors so the parts can be removed and replaced easily. I generally gravitate towards the ESP32, ESP8266, and the Raspberry Pi Pico; however, those devices use 3.3 volt logic vs the Arduino's 5 volt logic. It's easier connecting sensors and output devices to the Arduino since most are going to be 5 volt logic and you won't require additional hardware. Get yourself a package of about a hundred 2N2222 transistors or some 2N7000 FETs. It's possible to drive higher power mosfets directly, but if I remember correctly 70 mA is about the most you want to source from the Atmel 328 and stay within safety parameters. There's no real advantage to using real Arduino boards over the clones. I hope you'll enjoy building projects with Arduino as much as I have. The only limitations are going to be your imagination and your wallet. Oh... I usually buy in quantities of 10 so if something goes wrong I have plenty of spares available without any waiting. Besides, this stuff is cheap. Take care.
I never actually wrote any useful PDP-11 assembly code, but my 2nd-semester programming course in college (Spring 1972) involved writing a PDP-11 simulator in the first half of the course and then a PDP-11 assembler in the second half, so I became fairly familiar with the PDP-11 instruction set. They made some nice design choices. The auto-increment and auto-decrement addressing modes were useful in lots of situations, and having the program counter be one of the 8 registers meant that pc-relative addressing for branches and subroutine calls was available to the programmer (or compiler writer) without it having to be a special distinct addressing mode.
@@ApolloKid1961 The context of your question is not understood. Please tell me "apollokid' who just so happens to be on this channel, what the inference is? Thank you.
@@ThePlanetTVtheplanettv As you can probably guess, I have enjoyed learning about space travel and Apollo in particular since I was a kid. I was a software developer for 30 years and voilà that combination brought me here. How am I supposed to know what the inference is when you watch the episodes about the AGC on the CuriousMarc channel?
FYI, "Burn Baby Burn" was a term associated with the 1965 Watts Riots. It was often quoted by college students of the time. The MIT programmers would have been familiar with it's source and meaning.
Don Eyles wrote the descent and landing part of the program. He had a great sense of humour and I'm certain that he was responsible for most if not all of the crazy comments in the source code.
Thank you for the video, was awesome to see and hear about it in detail :) Just a quick note, "hashtags" prefix a comment, they don't follow a comment.
Note: If you want to get into the nitty gritty of assembly/machine code programmers' heads, try Ed Nather's 'The Story of Mel' about a guy who used the speed of drum memory for timing events on 1950's Royal McBee computers. This was published on Usenet in response to 'Real Programmers Don't Use Pascal' (they use Fortran), which was in turn inspired by the book 'Real Men Don't Eat Quiche' (1982), which satirized the devolution of American masculinity.
Good video. BUT big ERROR at 0:52. NOT the 29th... the 20th of July was the landing on the moon. Such a well known fact. So easy to check. Google it. I was watching it at that time. Please correct or check your script. You are precise normally. Now can we trust your facts on the rest if you missed this? Still, I will watch your videos because you do excellent presentations.
I once wrote a machine code routine to handle graphics on a 6809 8 bit computer and the result was instant execution, instead of a painfully slow MS 1.0 Basic interpreter solution. So source code is very important. :) Was a PID/process simulator.
12:59 You cut that quote short. It looks like it was split over a page boundary, if I understand the "Page 393" marker correctly, and the last bit is on the line after that. So, it should end as, "...such abominable words as no Christian ear can endure to hear."
Yeah that kind of bothered me more than it should have. Also not sure why the programmer gave the page number, because that's going to vary in every edition anyway. Instead you go by act, scene and line numbers.
@@prosimian21 It's a digital transcription of the source code that was originally in printed form, so I'm pretty sure the "Page 393" is just marking where page 393 of the code starts, and is nothing to do with the Shakespeare quote.
This was so interesting and I'm not a developer. You can take almost any aspect of these first steps in space exploration and have enough material for like 50 feature films. This video does a great job illustrating how ground-breaking this tech was and it's nice to see articulated kudos to all of the brilliant minds that were the OGs of the quest for space. But calling assembly "challenging" is like calling the Burj "tall" 😆Great vid!
I went to college in the late 80s. I was the first CompSci class to NOT use punch cards for computer programming classes. My internship was at a company that still used a punch card machine; for FIVE cards per day! So, this technology lingered on and on!
Same. I learned on PDP-11s running RSTS/E in college in the 80s. We'd use VT52 terminals or even the teletype machines in the lab. But in my work experience I had to deal with punched cards and paper tape.
@@TesterAnimal1 Yes, well, this is typical for every generation. In universities we mostly learn the cutting edge science and tooling (but also sometimes got optical EEPROMs for programming an SPS 🙂). When you come to industry you realize that most of them are at least 10 years behind.
I took a programming course (FORTRAN) at a college in the summer before my senior year of High School. We used punch cards. By the time I started at the same college as a freshman, in 1978, punch cards were obsolete.
The 1201 and 1202 Alarms were causing AGC reboots during the Apollo 11 landing. This extended the range of the deorbit and landing engine burn. The low level assembly language allowed the AGC to keep up with required tasks, even with frequent reboots. Reboots were near instantaneous. I always wanted to find out how long it took Mission Control to isolate the problem to the enabled Rendezvous Radar. Note: The Apollo 14 landing used a lot of fuel also. They had only a minute of fuel remaining at touchdown. Apollo 14 had an issue with a landing abort switch with an intermittent short that had to be isolated with an in-orbit re-write of the Luminary code. Loved watching your video! 👏👏👏
MIT handled the 1201 and 1202 problem isolation and provided the fix just before 11 lifted off the lunar surface; see also George Silver. On 14, Don Eyles wrote some code that lived in R/W memory; rewriting Luminary was not possible as it was in fixed (think ROM) memory. See Sunburst and Luminary by Don Eyles.
But but but Stanley Kubrick and the studio! And what about the Van Allen Belt? Huh, what about that pretty lady? And everyone now knows that only computers can do complex math! Got'cha there...
Stanley Kubrick obviously filmed the moon landings, the space agency had no intention of going to the moon. What people don’t realize is that he’s such a stickler for detail that they ended up going the moon to film in-situ.
Yeah...I noticed the date glitch: I clearly remembered watching the landing 2 days before my 10th birthday! Great video...really brought it to life!
Even more mind blowing is that the AGC's memory was HAND WOVEN.
Fun fact: the “core dump” term comes from this era, when ram was implemented using magnetic cores.
@@Myndale with core-rope you can talk about 'hard coding' 😂
My lecturer taught me the concept of memory using a transistor based flip-flop to store a single bit. Fast forward to the 90s, I learned to program on a Zilog Z80 using a keypad to produce something on a seven-segment display. Despite how fast technology is evolving, it's comforting to know that Arduino systems and breadboards are still widely available nowadays and youngsters are eager to learn from those.
I'm 82 now. My first computer was a KIM-1.
@@thesoundsmith The KIM-1 was super fun to use. I recently learned there were a bunch of expansions for it. I'd only used the base board with its hex keypad and LED display. Maybe a homemade cassette interface (from BYTE, iirc, but it might have been for the SYM).
Yip! Same for me in 6502, back in 1979: Pure assembly language driving hand-wire-wrapped custom hardware for a microtonal music synthesizer. I entered it into Science Fair, although I probably would have done the project either way.
i'm 15 and i wish i got that 😔
Once in a while I stumble across something like this that really makes up for all the other mediocre junk taking up space on YouTube, and makes me say to myself, "I'm really glad I watched that!"
Dee, good job, this is pure gold👌
The content was a very fascinating little slice about this awesome historic endeavor! It invokes more appreciation for all those super intelligent people it took to make the program happen.
Kudos for making the distinction between Assembly code and Machine code. Some today even do not know that.
There is a 1-to-1 correspondence between the assembly language and machine code. The programming logic is the same for both. Assembly provides human-friendly symbols that the assembler translates 1-to-1 into machine code instructions. Assemblers also provide human-friendly labels for memory addresses like jump/goto targets and data locations. Advanced assemblers also provide "macros" that substitute sequences of assembly instructions with a single command, similar to how macros in office software like word processors and spreadsheets work. Once macro code is substituted and memory address symbols are resolved, again, it's 1-to-1 translation to machine code.
Early microcomputers like the Altair and minicomputers like the PDP-11 had front panels with displays a little similar to the AGC DSKY. You could enter the instructions in binary and read results in binary from display lights. The DSKY was more user-friendly (no binary) in this regard as it provided command symbols on the keypad and decimal values for the display and keypad.
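To make the 1-to-1 mnemonic-to-opcode mapping described above concrete, here is a toy sketch in C of the translation step an assembler performs. The three mnemonics and their encodings are invented for the example; they are not the AGC's (or any real machine's) instruction set.

```c
#include <stdio.h>
#include <string.h>

/* Toy "assembler" for an imaginary machine: each mnemonic maps 1-to-1
 * to a fixed opcode, and an instruction word is opcode + address.     */
struct op { const char *mnemonic; unsigned opcode; };

static const struct op table[] = {
    { "LOAD",  0x1 },   /* load accumulator from address   */
    { "ADD",   0x2 },   /* add memory word to accumulator  */
    { "STORE", 0x3 },   /* store accumulator to address    */
};

/* Encode "MNEMONIC address" as a 16-bit word: 4-bit opcode, 12-bit address. */
static int assemble(const char *mnemonic, unsigned address, unsigned *word)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++) {
        if (strcmp(table[i].mnemonic, mnemonic) == 0) {
            *word = (table[i].opcode << 12) | (address & 0xFFF);
            return 1;
        }
    }
    return 0;   /* unknown mnemonic */
}

int main(void)
{
    unsigned word;
    if (assemble("ADD", 0x0A5, &word))
        printf("ADD 0A5 -> %04X\n", word);   /* prints 20A5 */
    return 0;
}
```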
@@markrosenthal9108 Nobody argued that an Assembler is like a Compiler.
One other thing to keep in mind about assembly language is that it's not a single language. Each processor architecture's assembly language is unique; e.g., the AGC's assembly looks completely different from 6502 assembly, which looks completely different from i386 assembly, which looks completely different from ARM assembly, which... you get the idea. This was because assembly instructions map 1:1 to machine instructions, which of course are completely different for different architectures.
@@markrosenthal9108 it's not necessarily 1:1. Assemblers support macros that can turn one assembly statement into 2 or more machine code instructions. MIPS CPUs don't have an instruction for loading a 32 bit constant, but there is a "pseudo instruction" li, which is turned into lui + ori.
The main difference is you can use all the instructions available, while high level languages only use a subset and they won't let you directly use things like CPU flags. An issue that I have faced is C not letting you detect carry or math overflows without wasting time on unnecessary calculations
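On the carry/overflow point: gcc and clang (this is a compiler extension, not standard C) do expose the hardware's overflow detection through builtins, which avoids the extra comparison mentioned above; a small sketch:

```c
#include <stdio.h>
#include <limits.h>

/* __builtin_add_overflow returns true if the addition overflowed,
 * letting the compiler use the CPU's overflow flag directly.       */
int main(void)
{
    int a = INT_MAX, b = 1, sum;
    if (__builtin_add_overflow(a, b, &sum))
        printf("overflow detected\n");
    else
        printf("sum = %d\n", sum);
    return 0;
}
```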
@@markrosenthal9108 It's not quite 1:1, because even without a macro assembler, there are tricks you can do with machine code, that's difficult or meaningless with assembler, like designing code that executes differently if you jump into the middle of a multi-byte instruction in an instruction set that supports a variable-length instruction set (like x86 or Z80 or 6502 or 68K).
I started playing with computers around '69/70 and started programming for space systems around '80. When I began, cards were for development and paper tape was for finished code (it was more compact but really really annoying to edit). Fast forward ten years to around 1980 and terminals faster than a TTY were starting to become common, which made all the programmers happier -- even at 300 baud. That said, I was still doing a fair amount of programming with cards well into the 80s. Many programs were developed using FORTRAN to work out the algorithms and logic (C wasn't yet mainstream, nor vetted for space missions where I worked) and chase out a lot of the bugs, but in the end we were still translating that by hand into "optimized" and commented assembly (i.e. just compiling and doing an assembly dump as a shortcut wasn't an option). It was a treat when you got to use an AL you already knew from a previous computer; still, you ended up learning a lot of them.
Programming certainly has changed since then! When I started working I did some assembler, but that work died out and haven't touched it since. The lowest language I use is C, and now using frameworks is important.
Dorothy Vaughan was one of the first African-American women at NASA and taught herself Fortran when NASA invested in IBM computers.
Do you still program in Fortran? Will Fortran ever be deprecated? I'll bet it will still be used in the 2100's and beyond. The more time passes, it's less likely to ever be replaced.
@@TerryMundy Honestly, I never loved programming in FORTRAN, but what it does, it does well -- which explains its recent resurgence. Do I still program in it? Nope. I haven't had to for a while. Nowadays I tend to use C-family langs, Python, or a microcontroller assembly.
@@terpcj Then, you would know if honest... they were mainframes. 2 Amigas were, and are being used, by 1986 at NASA. The space program is a lie. Told to me by astronauts. Columbia... who are still alive, and NASA was the one who paid for my A1200. I still have contacts there. So I'm tired of the lies
I have worked on Assembly language programs and it really finds out your weaknesses and forces you to learn. I am not a professional developer BTW, but I do have a passion for embedded systems.
I know why they used assembly so much back in the day. It was their only real option.
Thank goodness for really good compilers and large memories nowadays. WE ARE SPOILED.
👍🏾True. The thing with Assembly Language, and the same can be said for machine language, is that the coder has to know & understand the actual CPU & hardware bc you're working with & manipulating registers & memory directly, more or less, depending on whether virtual memory mode is active.
@@TheWallReports in the days when I was writing in 6502, there was no memory protection, no multiply or divide, and no hardware breakpoints.
I still remember my first days meeting with assembly. It was in 3rd grade of elementary school... No, I was not some Sheldon Cooper level genius in some "gifted kids' school". It was the early 90s, and my school decided to buy modern computers (so 286s instead of Commodore 64s) and start computer science courses for the kids. It was really forward thinking at the time, sure, but there was one tiny problem. None of the teachers knew ANYTHING about computers, so they assigned the science teacher to learn about computers from books and teach kids what she learned. Basically she was ahead of the classes she taught by one or two lessons.
And what exactly do you think they thought would be the best way to introduce kids to computer science? Yes, exactly what you thought: assembly language. During the first few lessons we learned what peripherals are, then right after that we were introduced to registers, stacks and heaps and data manipulation verbs. In like two months, all the kids were taken out of that course by the parents.
@@Jenny_Digital Same here. I wrote a few simple arcade games in 6502 back in the 80's on my Atari 800. 6502 was so simple that it made it very difficult to program. At college in 1987 I took an assembly language class which involved programming a mainframe computer. I don't remember the name of the CPU but the assembly was much more advanced making it fairly easy.
@@jp5000able I had an Acorn Electron with only a black and white telly and a tape recorder, but the built-in assembler meant I got my start in 1983 when I was only six. I had no printer so I typed my listings using an imperial widecarriage typewriter my dad had rescued from the tip whilst working for the council.
Back then my computer came with a very comprehensive user guide, getting started book and demo tape. Now you get a sheet or two of paper telling you how to plug the thing in and not to open it.
Curious Marc is a treasure trove channel. Going to auctions, nabbing old gear, doing hardware debugging, rebuilding display tech... Then casually piloting a landing. They are amazing.
Are they? 😜
It's really fun to see all the old gear they get working. One of my favorite channels.
@Fred-yq3fs Yes, and their series of videos about the Apollo systems is fascinating. At one point they even had access to one of the rope memory "memories" (I guess that's the term?) and reverse engineered it to determine what code was woven into it.
@@emjizoneI’d consider them the authority as they worked with people who built the AGC’s
Absolutely true! I learned so much from their hands on restoration of an AGC and a download of its core. They then redid A11 landing with the AGC interpreting inputs and computing outputs.
Margaret Hamilton was the lead software engineer of the AGC. She coined the term Software Engineering and pioneered fault tolerance in software systems. Her ideas about fault tolerance played a crucial role in the graceful handling of the AGC overload right before landing.
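As a loose illustration only (not the actual AGC Executive code; the job names, slot count, and priority threshold are invented), the restart-and-shed-low-priority-work idea might look something like this:

```c
#include <stdio.h>

/* Illustrative sketch: the scheduler holds a small fixed set of job
 * slots; if a new job arrives and no slot is free, restart and
 * re-admit only critical work.                                       */
#define MAX_SLOTS 3

struct job { const char *name; int priority; };   /* higher = more critical */

static struct job slots[MAX_SLOTS];
static int used = 0;

static void schedule(struct job j)
{
    if (used == MAX_SLOTS) {                      /* overload, like a 1202 alarm */
        printf("overload while scheduling %s - restarting\n", j.name);
        int kept = 0;
        for (int i = 0; i < used; i++)            /* re-admit only critical jobs */
            if (slots[i].priority >= 5)
                slots[kept++] = slots[i];
        used = kept;
    }
    if (used < MAX_SLOTS)
        slots[used++] = j;
    else
        printf("still overloaded, %s dropped\n", j.name);
}

int main(void)
{
    schedule((struct job){ "guidance",  9 });
    schedule((struct job){ "display",   3 });
    schedule((struct job){ "telemetry", 1 });
    schedule((struct job){ "throttle",  8 });     /* triggers the restart path */
    for (int i = 0; i < used; i++)
        printf("running %s\n", slots[i].name);
    return 0;
}
```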
Gracefully handling? The computer crashed out repeatedly so they hand landed the damn thing. I get it she was an ok programmer but lets not blow what she did out of proportion which is happening a LOT recently..probably because she is American and Black..nobody remembers the other female programmers who had been around for decades and in 1 case 2 fucking centuries earlier.
I went to the Smithsonian a couple of decades ago now and was absolutely stunned by the technology and computers used to go to the moon. It looked absolutely archaic in design. I literally saw soldering points on the greenboards for wires. I was told at the time they had triple redundancies because of the anticipated shaking that might loosen the wires. My respect for those astronauts only grew tenfold when I saw that capsule and those components. Now, I have heard that we have to relearn how to go to the moon, because most of those brilliant engineers are now dead and things were kept on paper. Wow. You would have thought that we would have kept and/or recorded these kinds of things, even if only for posterity's sake. Great video, thanks for taking the time to explain it.
We have tons of records from every aspect of Apollo. Drawings, technical reports, etc. What’s gone is the ephemera: not design details, but background of “why did we design it this way”.
Relearning how to go to the moon is more a matter of experience: the engineers working on Artemis have tons of experience with Earth orbit, less so with the environment of the moon and its unique challenges.
@@Hobbes746 I think it comes down to two things.
Firstly, I heard that the Apollo engines required individual hand tuning, which the engineers knew how to do at the time, but they're all dead and no one knows exactly what they were doing.
Second, there's a cascading chain of technical debt. A lot of minor components and things are surely out of production or the contractor companies gone. So that minor component needs replacing with something else. But that then affects multiple other components, which also need to be changed to be compatible, which in turn affects other things and so on, until it'd be simpler to start with a clean slate.
To be honest, it's now more likely that Starship will be ready for a lunar mission long before any attempt to rebuild Saturn V "as is" could ever be ready. Plus Starship will be reusable.
@@calebfuller4713 Yes, those all play a role too. Hand tuning the engines can be learned again, but that takes a few years of hands-on experience, building an engine and then testing it.
You also have to consider the change in manufacturing methods: you’d have to redo all the drawings in CAD so you can use modern machines (CNC etc.), you have to take into account that some of the metal alloys used back then have been replaced with new ones with different properties. The list gets long.
@@Hobbes746 Absolutely true! It's part of what I meant by a cascading chain of obsolescence. And then you need to test how the new methods work versus the 1960s methods. Will it blow up or work better than ever?
It's not even just the rockets. It's stuff like the control computers. Are you going to dig up a vintage 1960s computer? Or does it all need to be adapted to a modern computer? Which is a whole new project in itself... one that is again maybe harder than just rewriting new code in a modern language from scratch.
In an age when there are bad actors who seek to revise history or erase it (not for greater accuracy, but for barbaric political reasons), hard copies are good to have, to preserve the sanity of the planet. When the risks of technocratic neo-feudalism or nuclear war are becoming increasingly evident, what we call "records" could become future artifacts that could help people sort out what went wrong.
Fascinating. The code itself is one of the best records of the mission. I also want to point out the amazing inertial motors/sensors connected to the computer. This combination of brilliant engineering enabled the craft to remain perfectly oriented through all missions. Our phones have a similar device installed, and the Oculus Rift has a pair. A DJI drone hovers perfectly still in all wind conditions (below 25mph) because of its precise, near weightless inertial motors. For a computer, in 1969, to observe that data and respond to it accurately is mind blowing.
Best records of the mission? Kinda reminds me of when the star trek franchise paid someone to make Klingon to add more realism. 😅
Curious Marc is bringing these machines back to life
That's how I found him, from one of the videos on AGC restoration, and since then I watch every video he publishes.
@@b43xoit Corrected, thanks!
I'm all for expressing a sense of humour in code comments.
I have seen a comment in code that was loading and starting some firmware: "demons be gone!"
I am not.
@@msromike123 so... don't...?
As long as you make comments. Especially in assembler. Joe Armstrong (the inventor of Erlang) made a joke about the commenting behaviour of one of his colleagues:
It was one big file of assembly code and in the middle of the file there was just one sentence: " ;And now for the tricky bit!" 😆
As long as what needs to be known can be known, putting in a little humor or a note to future programmers is like putting a note in a bottle and casting it into the sea. Who knows who will eventually read that note, and how this little bit of humanity will affect their appreciation of the code. I think the comments also help document a point in history with slang or literary or movie references. If done tastefully and with a sense of balance, these comments give us insight not only in the code, but of the programmer.
Amazing!
I remember an interview/debriefing with all three astronauts after their return and Buzz admitted to not releasing the radar task, which he should have done to follow procedure. The result was the two errors. Buzz said that at the time he wasn't going to let go of the ride home, or something to that effect. Who can blame him! What they did is beyond brave!
I think you can find this interview on the 3 DVD set NASA released (years ago now).
I can also remember watching this in real time back in '69 - it was sooo tense. The fuel almost gone, the computer errors, a last second decision to manually override the landing to get a better spot. Whew it was spectacular.
Thanks for doing the video - Good job!
- Bill
Movies and stories often are...spectacular.
I started learning programming when I was 19 in 83. 8086/8088 assembly code was the common language. I still use 8086 assembly as BASIC is often too limited.
I got to watch a Saturn 5 launch in 71. That is when I got the programming bug.
Thanks for this information.
My parents helped write the code for the AGC; it was their first job after getting their master's degrees in math (no computer science degrees back then). They worked in Houston and were working there on January 27th, 1967 when word came in from Cape Canaveral that Gus Grissom, Ed White and Roger Chaffee had died in the Apollo I capsule fire. It was devastating news as they had known them, if only from afar. It is one of the clearest memories my mother still has of that time. Later they moved on to work on missile systems at White Sands Missile Range like the Patriot and TOW missiles. I still remember coloring on discarded punch cards and printouts from failed runs :) When I spoke to my dad about it years ago he talked about having to program computers with switches when he first got to WSMR, crazy stuff. Things have come a long way since then; it's good to remember that we stand on the shoulders of giants.
Actor Jack Black's mum worked on the Abort-Guidance System (AGS) in the Apollo Lunar Module. In what has to be one of the best quotes ever... "In a memorial tribute, her son Neil notes that she was troubleshooting problems with schematics on the day she went into labor, called her boss to let him know she had fixed the problem and then delivered Jack".
The calculations and measurements were also done in metric and displayed in imperial units as was simpler.
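As an illustration only of the compute-in-metric, display-in-imperial idea described above (the descent-rate value is made up; 0.3048 m per ft is the standard conversion factor):

```c
#include <stdio.h>

/* Internal values in metric, converted only for the crew display. */
int main(void)
{
    double descent_rate_mps = 1.2;                      /* metres per second */
    double descent_rate_fps = descent_rate_mps / 0.3048;
    printf("%.2f m/s displayed as %.1f ft/s\n", descent_rate_mps, descent_rate_fps);
    return 0;
}
```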
I wish we just used metric in the us
@@Alkatross you do! All imperial measures in the USA are based on metric standards. The US inch has always been 25.4mm. However in the UK it's varied over time - even at one point 1 inch being 27mm.
also US military - 1 click = 1km.
@@mrrolandlawrence i realize it's all the same stuff, but it's just the general knowledge of the measurement that I am missing as a native freedom unitarian.
so she fixed one problem but created a new one
@@Alkatross yeah that's one thing I'll never understand..... no one uses imperial tons in the USA. It's always LBS, even when it's in the millions.
One curse that is worldwide though - TVs - measured in inches everywhere, and car tyres for some reason.
So cool you gave the Wizard of Oz not one but two mentions in your video. One is the "Off to See the Wizard" bit of the code and also that Margaret Hamilton is not only the MIT software engineer but another Margaret Hamilton was the Wicked Witch of the West.
Liars love to add little nuances to their scheme so that the dum dums get distracted and forget why they were looking at the 'code' in the first place.
In order to have code, you would have to have a computer. But just like everything with the authorities, they keep evolving their schemes to maintain the realism for the newborns in the human family to tell stories to.
Excellent video. It's heartening to see young software developers pay homage to the giants whose shoulders all developers stand on. I was 7 years old when Apollo 11 landed on the moon. Later in high school, I read about the AGC and I was inspired to learn how to program. First course was FORTRAN with punch cards. I worked as a software engineer for 38 years and never forgot the lessons learned from those pioneers. PS I love your South African accent. Keep up the good work!
Landing was on the 20th. Thanks for highlighting the software/hardware of the effort!
I have always injected some humour into the code and continue to do so - mostly for my own amusement - sometimes as a monument/reminder of some insight that would otherwise get lost in time. I enjoy seeing it in other people's code too, it does have some utility beyond putting a smile on a tired developer's face though...
What I have found is that third parties looking at the code latch onto those bits and pieces, they give an otherwise featureless jumble of code some landmarks - which folks can use to navigate the code (and share in the insights of the original developer).
I can think of two reasons we see less humor now. One is that all the old timers already made all the obvious jokes. Another is that corporate coders are discouraged from frivolity. Fortunately for me, my boss is old school and it's not a big deal for us to joke in our comments. But we do worry that the company will some day be acquired, so we have to be subtle about it. We're cursed with political correctness.
@@SpareSimian As it happens I work for a soul crushing corp, I don't get much push back because I don't feel a need to be "edgy" with my humor. Where there has been some push back it's come from insecure folks who want to assert themselves - for my part I am happy to let them vent and make a fool of themselves in public - they rarely repeat the mistake.
Note: Assembly code is platform specific. X86 assembly is very different from ARM64 or 6502 assembler. They have completely different syntax, opcodes, registers, etc. Each CPU type's assembly code is essentially its own language and there is generally no direct parallel between different platforms.
Oldish assembly coder here. Whilst assembly can be different on each platform, the biggest difference is big endian vs little endian, where basically the code is written backward. (Kinda). And yes, opcodes, registers, memory etc. are all different, but that can even be different from one chip to the next within the same architecture.
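A quick way to see the byte-order difference being discussed is to inspect how a 32-bit value is laid out in memory on whatever machine runs this sketch:

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Print the in-memory byte order of a 32-bit value. */
int main(void)
{
    uint32_t value = 0x11223344;
    unsigned char bytes[4];
    memcpy(bytes, &value, sizeof bytes);
    /* little-endian prints 44 33 22 11, big-endian prints 11 22 33 44 */
    for (int i = 0; i < 4; i++)
        printf("%02X ", bytes[i]);
    printf("\n");
    return 0;
}
```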
I agree they are different but.... I know Microchip assembly very well, and if I have a quick look at a reference manual, I can easily read and understand any other flavor of Assembly.
@@TheHeff76 endian is rather insignificant; big endian is easier for a programmer to read but slower for the computer (little endian lets you start doing maths as soon as it's read the least significant byte)
Load/store Vs register/memory architecture is a more important distinction.
The way the 6502 updates flags after every instruction while 8080/z80 only does it on a few, is a larger difference. Even the indexing modes on 6502 are vastly superior. Trying to switch between 6502 and z80 (which are both little endian) can be annoying, the 6502 is the best CPU ever.
@@phill6859 I've never done anything with the 6502, but I heard it had very few registers. Also, every instruction updating the flags sounds more like a nightmare to me; what if you want to test more than one flag? Also it means that you'll more often need to store the flag's value somewhere else for later reference.
@@__christopher__ depending on which instruction is running, only some of the flag bits are changed, not all of them. Normally only logic like and, or, not, rotate OR math like add, sub, compare affected any of the flags; then you use conditional jumps to branch on status. Even though it might refresh the flags on every instruction, it doesn't usually change any of them.
One of the most interesting YouTube episodes I've ever viewed. I started using assembly language in 1970, and have a very high degree of respect for early coders. Thanks for your research and presentation! I enjoyed it immensely!!
As did I. A little bit of programming in hex based machine code and then onto Assembly. We thought it was amazing when we got 8k of RAM. How things have changed.
My first programming was less than a decade after the moon landing and we had a landline from college (UK, so age 16 to 18) to a local university mini-computer. The link terminal had a huge telex-keyboard and a paper-roll instead of a cathode-ray tube screen, and then you had to wait for most of the lesson (or frequently until the next one) to get the inevitable error message about a typo in line ten, or maybe the results of the program if you had been extra careful. A DSKY like the Apollo had was massively ahead of its time. NASA has large public archives concerning the tasks and planning for the tens or hundreds of thousands of people involved in the project, and the solutions to the new problems that had never previously needed solving.
Microcomputers showed up in 1972, but it takes a while for education to catch up.
Catching up was indeed a problem, with no budgets for that sort of thing as computer use was perceived to be just something for accountants . . . The more enthusiastic students had Commodore or TRS80 machines at home, at about the same time as the telex-and-paper terminals in college!
Yep, me too on the landline setup.
We had a modem connected to the phone and then to the line printer keyboard for Fortran assignments for college!
Margaret Hamilton's daughter often went to her mother's work and was allowed to play in a mock-up capsule that contained an AGC. At one point an alarm went off and Margaret asked what her daughter had done, and it turned out that she had started program 00, which basically meant that the Saturn was ready for launch. Margaret warned her superiors but they brushed it off saying that astronauts would never be stupid enough to do that. Until James Lovell made exactly that mistake during the Apollo 8 mission when it was already close to the moon. Mission Control was not amused. With much pain and effort they managed to restore the AGC remotely!!!
00 is the AGC Idle ("Go to pooh") command. 01-03 are the command for starting up a Saturn. Also, Lovell got 40 and 41 mixed up during Apollo 13, which they were able to unscramble quickly. You're thinking of another incident (maybe on Apollo 8?) that most of us haven't been exposed to?
@@fredflintstone9609 Yes, as I have already said, it was on the Apollo 8 mission. Lovell himself confirmed it. If I can find the link, I will add it here later.
th-cam.com/video/Wa5x0T-pee0/w-d-xo.htmlsi=yKVtUI_-NUDnxnql?t=2559
Assembly language also is not just one language. It's a family of languages that are specific to the processor the engineer is writing for. For example the assembly written for an Intel x86 chip would differ from say an AMD x64 chipset. Back in the day each computer had its own version of assembly, so the code written for this flight computer can really only run on this flight computer.
Yeah, I was devastated when I found this out, because I had learned assembly for the Atmega328p
And felt like I could code anything retro, only to find out assembly instructions are microcontroller dependent
It was a BIG MISTAKE to publish this code. Now everybody can copy it and fly to the moon.
Yeah I went there last weekend in my Ford Focus. Wasn't much there! Just a load of rocks!
@@SunnyIntervalsORG My Focus was very unreliable, I was too scared to take the trip.
Well the codes and scheme is so good, they could fly to Uranus.
7:18 Fun fact: the stack of binders next to Margaret Hamilton, the "rope mother", is not "the software", it is several versions of the software. I assume one binder was one version of the software, that is, the listing of the 69000 bytes of the final version for the AGC. If I remember correctly, stacking the binders was the photographer's idea, to make it more impressive.
Sounds likely.
Great video, thanks! Don Eyles, one of the programmers responsible for those jokes, wrote an interesting book called "Sunburst and Luminary: An Apollo Memoir" about his time working on Apollo.
It's a really good read! Don Eyles was the main author for the Lunar landing module software and helped pioneer some coding techniques still in use like memory fetch interleaving - esp by game developers to get the most out of the consoles, as well as interrupt coding - used in OS's. The latter turned out to be key in avoiding the information flooding mentioned in the video (indicated by the 1202 alert) from stopping the mission.
Fascinating, thank you! Yes, I'd held the incorrect belief that the AGC was roughly equivalent to a calculator. I'm a technical person, but not a "computer" person, and I found this video very helpful.
Thank you for explaining this. I was a teenager in 1969 and remember well the issue of the lunar module computer being overloaded but never knew why.
In the movie "Apollo 13" they said the "the astronaut is the most visible person in the structure of thousand of enginneers".
We were using punch cards up through 1991 on our ship. We had to learn Hollerith code and even had 5-channel paper tape and magnetic tape. We had guys coming down from various departments, sometimes with several BOXES full of the cards. Every once in a while, they would trip on the ladder (stairwell) coming down to our shop to give us the job! Those cards were stacked sequentially and had NO writing on them. They had to go back to their shop and re-run the entire job again to get the output. :)
So they missed the trick of running a pen down the side of a stack of cards at a slight angle so that even if the cards get shuffled it's quick and easy to get them back into order.
@@Heater-v1.0.0 Our cards were punched and also had the text printed at the top of the card. We were taught to tab over to the comment part of the punch card starting at I believe column 72 and type a sequential number. No one ever did it, since it was so slow to move over there. Dropping your punched card deck was everyone's nightmare.
@@dermick Yeah, we didn't bother numbering our cards when punching in Algol code in 1976. Seems a lot of people didn't cotton on to the trick of running a marker pen down the side of the deck. Anyway, luckily by 1977 we had terminals to a timesharing OS , no more card punching. Then very soon came micro-processors and personal computers. Happy days...
Thanks for such an amazing video Dee. Been an Apollo fan from a very young age. Being a python developer and Linux script writer myself, I found it so interesting. I liken the "main" file linking it all together, to my preferred scripting methodology - making use of many functions (modular) - being called as required.
I learned 6502 assembly language progamming on the Apple ][ computer, a decade after the 1969 Apollo 11 moon launch. The page layout and formatting of the Apollo Guidance Computer assembly code is strikingly familiar, and similar to the assembly language syntax and commenting standards I used and encountered in programming Apple ][, Commodore 64, and IBM PC computers in the 1980's and beyond. It's inspirational to see how low-level programming culture and techniques evolved from the earliest examples of embedded systems programming on machines that predated modern microprocessors.
Fascinating comments about a fascinating subject. My first electronics project was a 3-tube ('valve' for the English) turntable pre-amp built in an aluminum chassis I punched and bent myself. My last project was a dual, surface-mount µP, touch-sensitive subsystem in a handi-cam built on a ten-layer, 3-square-cm pcb. I designed hardware and wrote embedded assembly code. I lived and worked through all the intervening hardware and software growth in systems and technology between those two points. Thank you, Dee, for reminding us about our roots! It has been a great ride!
I remember machine code and hand compiling 6502 programs into hex! I also remember watching the landing live, aged 12. BTW it was July 20th 1969 not the 29th. As a computer operator back in 1983ish punch cards were still in regular use where I worked. Having recently retired after 40+ years in IT, based on talking to recent graduates I get the impression that basically Assembler is no longer taught in college. This is a mistake; short of some kind of electrical engineering degree there is, in my opinion, no better way to understand what the hardware's doing.
Fairly good description of my start. Watched moon landing, programmed on punch tape between 1972 and 1979, Fortran and punched cards 1980, Computer Science degree included hand compiled 6502, bit slice CPUs, and graphics processors along with Pascal, ADA, APL, Cobol. Finally went to C and Smalltalk.
@@ProfPtarmigan Pretty close, I started as an operator in 79, learnt Cobol & ICL Plan Assembler on the midnight shift from reference manuals. Switched to programming in 83, spent 6 years with a software house (Cobol, RPG2 & Wang Assembler), moved into the airline industry in 89, spent 9 years doing TPF & MVS assembler (both in the UK & US), switched to Java in 98. Moved into DEVOPS in 2018 & retired in 2023. Somewhere in there I found time to snag a 1st class hons in, effectively, computer science from the Open University.
Never disregard the old timer engineers. It was their work which laid the foundation for today's tech. Their work was the stepping stones for today. We could not have gotten to where we are without their work. They blazed the way.
In the mid-1990s, my company designed and built a new space launch platform at Cape Canaveral. As part of the design process, we consulted with as many of the Apollo engineers as possible. These guys were all long since retired but still well aware of their achievement. What amazed me was the very high level of disdain between the old-timers (attitude: Been there; done that.) and the young engineers (attitude: What's it like being an actual dinosaur?) Bringing those two disparate groups together and getting them to respect each other was amazing.
Thank you for making this video. For the record, AGC memory didn't use bytes but rather 4096 words of 12 bits each. Very restrictive, and requiring extreme coding efficiency that modern developers don't generally have to worry about.
I agree; however, I work for a large semiconductor company and am amazed at how little our graduates actually know about assembler and how it interacts with hardware. They have no knowledge of flip flops, latches, bus drivers, FIFOs etc. I developed on an Atari 600 XL in assembler and then Motorola 68000. I now work on multi-core ARM and DSP, however I have never forgotten the basics of how a processor core works. I am waiting on the Monster 6502 project to be completed. This has to be the ultimate demo of how a 6502 works.
I hard wired my first 6502 board in 1982 and used a Dataman S3 EPROM Emulator to emulate the EEPROM. In those days all we had was the humble LED to assist in debugging. Developers on MCUs have it so easy today with C compilers, debuggers, flash memory etc. Having gone through the early days one learnt how to code correctly in the first place. There are some great programmers out there, however even more useless ones, especially idiots that seem to want to show off their apparent skills on LinkedIn.
Great video and loved the detail👍
The RAM was 2048 words of 15 bits (plus one parity bit).
Fun fact: When the space shuttles were built, technicians in Palmdale, California used IBM punch cards as shims to establish gaps between the tiles during bonding operations. As late as 2008 we still found pieces of punch cards between tiles that flew many missions. The punch cards were very consistent in terms of thickness and techs could establish gaps with them to meet strict gap requirements very easily.
I'm 67; at school they stopped the whole school for the Lunar landing, wheeled out a television, and we watched the landing LIVE. Later on in my career I spent 40+ years in IT. A specialist in Aerospace/Defence, several languages, no machine code, megabytes of code. Yes I put in jokes. They can be useful. If an error says "Tony your shorts are on fire", on correction you go straight to the place in the code. "Error 404" could be anywhere. NOW THE IMPORTANT PART. For Apollo they taught me that they programmed in triplicate. For a key calculation three programmers would code the algorithm. The calculation on the onboard computer would run all three in parallel. If all three did not agree on the result, it would use the result the majority two agreed on. If all three were different it was "Shorts on Fire". From my experience they could not run the whole of the system in triplicate (we can now); this would be sub-sections of code.
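A minimal sketch of the 2-out-of-3 voting described above; the three "results" would come from three independently coded versions of the same calculation, and the values here are just examples:

```c
#include <stdio.h>

/* Return 1 and set *out if at least two of the three results agree. */
static int vote(int a, int b, int c, int *out)
{
    if (a == b || a == c) { *out = a; return 1; }   /* majority found   */
    if (b == c)           { *out = b; return 1; }
    return 0;                                       /* "shorts on fire" */
}

int main(void)
{
    int result;
    if (vote(42, 42, 41, &result))
        printf("majority result: %d\n", result);
    else
        printf("all three disagree - raise the alarm\n");
    return 0;
}
```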
What an outstanding presentation and explanation of Apollo 11's hardware and software. When I first took computer science in the US Army (long ago) the instructor began with: "Students, we have been in the digital age since 1844 when Samuel Morse invented his code for the telegraph. The dots may be '1's and the dashes may be '0's, or vice-versa. But each combination results in a letter, letters can form a word, and each word that of a language." The rest was easy, though the computers that guided the missiles of my duties were analog artillery computers from WW-II. What an excellent video to jar loose that memory. Thank you...
"Comanche" is pronounced as /ko-man'-chee/. "ko" rhymes with "go" but can be softened to "kuh". The second syllable is emphasized; "man" is exactly like "man". And "chee" rhymes with "me". The Comanche tribe of Native Americans here in the USA are headquartered in the state of Oklahoma.
LOL. How do you say aluminum?
@@msromike123, the typical American pronunciation is /uh-loo'-muh-num/. Emphasis on the second syllable. The last two syllables can be /min-um/ instead.
You can blame our first prolific dictionary editor for all that. Old Noah Webster.
@@msromike123Just like it is spelled. At least you spelled it correctly. 😁
@@msromike123 Why, the same way I say titanuminium, platinuminium, uranuminium, and plutonuminium, or course. 🤪
@@msromike123It's "aluminium" ... Just coz you lot over the water are too lazy to add the other "I" 😂
There is a book about the Apollo guidance computer called "The Apollo Guidance Computer: Architecture and Operation" which is pretty good reading. It had preemptive and cooperative multiprogramming, and an interpreter to implement more complex instructions.
Correct me if I'm wrong but my understanding of the AGC is that it was NOT just a calculator as some are led to believe. To believe that would be absurd. But what was meant by that statement is that it had computation power in line with that of a calculator but was far more capable bc of its preemptive & cooperative multi-tasking abilities.
@@TheWallReports I guess the problem is also that different people think of different things when thinking of a calculator. When I think of a calculator I think of a non-programmable device which has keys for a fixed set of mathematical operations and a display capable of showing a single number; however there were other electronics also going under the name of calculator that could be programmed and had graphical displays, and which IMHO would qualify as pocket computers.
@@TheWallReports It looks like a calculator because it uses a VERB/NOUN interface, but it could do a lot of things. The verb was a two-digit code (00-99) that defined the action. The lower codes were for early stages of the mission like prelaunch, while the higher codes were used for later parts of the mission like navigation, descent and landing back on Earth. This interface was something the developers came up with while expecting something better in the future, but nobody could think of anything better.
The CPU had 16-bit registers. With the interpreter they implemented opcodes on double-width registers, doing the trig functions and vector/matrix operations needed for flight control. The CPU had access via I/O to many sensors and engine controls.
The AGC was quite capable, but the ground facilities had much more powerful computers to augment it, because there were weight constraints on how big an onboard AGC could be.
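For readers who haven't seen the DSKY, here is a tiny C sketch of the verb/noun idea described a few comments up: one two-digit code picks the action, another picks the data item it acts on. The specific verb and noun numbers and labels below are invented for illustration, not the real AGC assignments.

#include <stdio.h>

/* Toy verb/noun dispatch. Verb = what to do, noun = what to do it to.
   The codes and labels here are made up for the example. */
typedef struct { int id; const char *label; double value; } Noun;

static Noun nouns[] = {
    { 6, "altitude (ft)",    1523.0 },
    { 9, "velocity (ft/s)",    21.4 },
};

static void do_verb(int verb, int noun)
{
    for (unsigned i = 0; i < sizeof nouns / sizeof nouns[0]; i++) {
        if (nouns[i].id != noun)
            continue;
        switch (verb) {
        case 16: printf("MONITOR %s = %.1f\n", nouns[i].label, nouns[i].value); return;
        case 25: printf("LOAD a new value for %s\n", nouns[i].label);           return;
        default: printf("OPR ERR: unknown verb %02d\n", verb);                  return;
        }
    }
    printf("OPR ERR: unknown noun %02d\n", noun);
}

int main(void)
{
    do_verb(16, 6);   /* "verb 16, noun 06" -> monitor the altitude item   */
    do_verb(25, 9);   /* "verb 25, noun 09" -> load a new velocity value   */
    return 0;
}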
She's not into this; the video is for clicks and talk.
I remember playing with discarded punch cards as a child back in the early 1980s, I guess. My parents were students at the time & one of their student jobs was loading punch cards into the university's computers (and taking them out again later). They brought home a bunch, mostly to be used as note pad replacements. They also caused my child brain to imagine all kinds of funky things for all those strange hole patterns.
Thank you very much for this fascinating & entertaining dive into our past; definitely one of humanity's most uplifting & most impressive achievements ever.
Great posting. Felt the need to create a new playlist calling it 'Close to heart'. So packed with good info.
Great summary of the AGC. As others have posted, CuriousMarc and team have brought back an AGC and rebuilt the code for multiple missions. In their current videos they are debugging the ground radio commands used to remotely program the AGC. Given the state of the art back then, it was a fantastic accomplishment.
Less well known is the LVDC (Launch Vehicle Digital Computer), designed by IBM, that guided the Saturn V and its astronauts to orbit.
Has LVDC been published like the Apollo 11 source? I'd love to see that code. Maybe Curious Marc could restore that hardware for an encore to the AGC project
@@shackamaxon512 Much of the extant LVDC hardware is now a tangled mass of wires because, to keep it light, the LVDC body was built of beryllium. These have degraded to uselessness over time. A shame too; the published specs make it clear the LVDC was also a ground-breaking machine.
Very interesting, Dee 😀. I am Norwegian and living in Norway. At the age of 17 I was very interested in and fascinated by the space programs, especially Apollo. On July 20 I sat "glued" to the TV and followed the real-time broadcast as Armstrong and Aldrin actually landed on the moon. This was probably the most impactful moment in my life ever. Later in life, in the early eighties, I did a lot of programming in IBM assembler code, writing add-ons to the SWIFT systems used by the banking business. Then in December 1995 I was very happy to visit the space center at Cape Canaveral. Your YT video was very interesting and gives a good view into the control systems.
Apollo's AGC was the first computer to use ICs (integrated circuits / chips). There was a lot of discussion about this, but the team decided that they were the only way the project could be realized.
Whilst they used some super basic logic-gate ICs in the AGC, the real leap goes to the CADC in the Grumman F-14 Tomcat, which was designed in ~1968-70 and used the first set of ICs that would, in combination, be considered a microprocessor; the computer was the only way to make the swing-wing design flyable.
I wouldn't say the AGC literally "got us to the moon", but it did take a HELL of a lot of workload off the astronauts and Mission Control. They had to have backup procedures in the checklists in case the AGC failed. What really got the Apollo spacecraft to the moon was the Trans-Lunar Injection burn, and that was controlled by separate hardware called the Instrument Unit located in the top of the S-IVB stage. The IU could be commanded by either MCC or the AGC, and I would love to know who wrote the code for it, and what it looked like.
Probably the most common tasks performed by the CMC in the Command Module were setting up the DAP (digital autopilot), and timing the ignition and cut-off for large delta-V burns by way of programs 30 and 40. Verb 48 would access the DAP set-up, where you enter the spacecraft's weights and configuration (CSM and LM, staged or unstaged). You could then enter whatever attitude you liked into the roll, pitch and yaw registers and the CSM would maneuver into your requested attitude and hold it. For the big delta-V burns like Lunar Orbit Insertion and Trans-Earth Injection, P30 set up the time of ignition and burn magnitude, whereas P40 actually fired the SPS engine and shut it off with an accuracy of up to 1/100 of a second.
During Apollo 8, Lovell accidentally entered something incorrectly that made the AGC "think" they were back on the launch pad. th-cam.com/video/Wa5x0T-pee0/w-d-xo.htmlsi=yKVtUI_-NUDnxnql?t=43:04
Sure, it's just amazing they had a computer on Apollo that was programmable, yet not 5 feet long and 250 lbs like the DEC PDP-8. Or not the size of a room like the Honeywell 200. Or several tons and 4.5 feet wide, 6 feet tall, and 3 feet deep like the IBM 360. The next miracle they will tell us is that we went to the moon with early iPhone technology. When a story is made up, it can change and evolve and adapt to all the people available so they BELIEVE, instead of RESEARCH.
And then those same people will clap back with words that destroy arguments and conspiracies in a single breath, words like "SCIENCE!" It's like an exorcism, making all the evil truth demons jump out! Just scream it out loud and watch all the people scatter. Unfortunately, it doesn't make empirical evidence, facts, and observations scatter.
@@ThePlanetTVtheplanettv The Apollo AGC was the most advanced computer in the world. It was the first computer in the world to use integrated circuits. This came about because NASA saw the potential in the early research into ICs that was going on in the early 1960s and decided to support this research. In 1965, NASA purchased 60% of the world supply of integrated circuits.
This investment paid off: the AGC was not 5 feet long, it was the size of two shoeboxes.
the reality of the Apollo missions is supported by a mountain of evidence. The deniers never present any evidence at all, just "arguments" they made up. They think their ignorance constitutes evidence.
I was in the US Air Force, and we used punch cards well into the 80s. It was in the 90s, if I recall correctly, that we switched over to a system of remote terminals. Even then, it was the old system at heart, but you could enter data directly rather than via punch cards. This allowed for real-time updates to maintenance data rather than overnight batch processing.
As a Cold War veteran myself, I was in the Communications Squadron and I remember it very well.👍🏾
Yep... I still remember the thrill of the first IBM 3270 terminal in the shop with CICS to enter our code on. The senior programmers hogged it up most of the time...
When I was in the Marine Corps, I was almost a mainframe operator. Still even lists that MOS on my dd214 as a secondary MOS. Luckily, due to some priority changes, I went to school to be a Small Computer Systems Specialist (an IT guy basically), but then ended up actually doing UNIX sysadmin work. I was around a lot of mainframe folks, though, both Marines and civilians.
That's Epic; thanks for clearing up the situation between the AGC and the Lunar Module Pilot:
Buzz "Computer, tell me the altitude", Computer "Hold that thought Buzz, I'm a little Busy at the moment! (1201 & 1202 alarms)"
No, a different computer... The AGS was the backup for an aborted landing attempt, there to bring them back up to lunar orbit if the PGNCS (pronounced "pings") failed.
So the video doesn't talk about the code at all, just the comments. Incredible.
Not too interesting to explain the loading of a value into a register. More fun just to see what the developers were thinking. I learned assembler in college; it was excruciating.
@JoeK-q6q Oh wow! You learnt assembly!!! I know assembly (ARM, x86_64, 6502) too, it's not that hard, dude.
If the comments are what's interesting then name the video "the comments of the code that sent Apollo 11 to the moon", not "the code that sent Apollo to the moon".
I expect a video with this title to talk about the code, not the comments.
@@kalei91 If I remember, it takes roughly 10 or more assembly instructions to represent one line of high level code. It would take half an hour just to explain a simple arithmetic routine 🧐
@@kalei91 Do a modulus operation in 6502 assembly on 32-bit integers without AI assistance.
@@briankarcher8338 Are you telling me about 6502? I already told you I know 6502 ASM, in fact I learnt Asm with the 6502, what even is your point?
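For anyone curious what that challenge actually involves, here is a minimal sketch in C (kept in C rather than 6502 assembly so it stays short) of the shift-and-subtract long division a CPU with no divide instruction, like the 6502, has to perform one bit at a time to get a 32-bit remainder:

#include <stdint.h>
#include <stdio.h>

/* Remainder of a 32-bit unsigned division using only shifts, compares and
   subtracts -- the structure you would spell out byte-by-byte on a 6502.
   Assumes divisor != 0. */
static uint32_t mod32(uint32_t dividend, uint32_t divisor)
{
    uint64_t remainder = 0;
    for (int i = 31; i >= 0; i--) {
        remainder = (remainder << 1) | ((dividend >> i) & 1);  /* bring down next bit  */
        if (remainder >= divisor)
            remainder -= divisor;                              /* conditional subtract */
    }
    return (uint32_t)remainder;
}

int main(void)
{
    printf("%u\n", mod32(1000000007u, 97u));   /* prints 1000000007 % 97 */
    return 0;
}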
Hi... thanks for the nice video about the Apollo software (and hardware). I have written a lot of assembly code as a computer engineer, so I can appreciate what it took to do this work. This required a great deal of planning from someone at the top to structure it, given that assembly code by nature does very little per line... requiring LOTS of source code. It looks like they broke the code down into many small parts, which makes sense.
And to think we got to the moon with a computer program written in assembler that supported only 30 to 40 operations, and no floating point math.
It had a funny sort of fixed point system for dealing with non-integers.
I don’t recall how they handled decimals, but it is worth noting that if you understand the range of numbers you’ll be dealing with and range of accuracy you need, there is absolutely no need for floating point.
Floating point is way more flexible since you have both significant digits and the ability to place the point within those digits, but if you don’t need to move the decimal, integers are fine, and they would have a pretty solid idea of the range of numbers possible for any reading or calculation, the accuracy of the inputs, and usefulness of the outputs.
Genuinely amazing code.
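A minimal C sketch of the scaled fixed-point idea those comments describe -- not the AGC's actual number format or scaling, just the general technique of picking a scale factor up front so plain integer arithmetic can carry fractional values:

#include <stdint.h>
#include <stdio.h>

/* Store a distance in a 16-bit integer scaled so that 1 count = 0.25 m.
   The scale is chosen from the known range and required resolution,
   as the comments above describe. The values here are invented. */
typedef int16_t fix_t;
#define COUNTS_PER_METRE 4

static fix_t  to_fix(double metres) { return (fix_t)(metres * COUNTS_PER_METRE); }
static double to_metres(fix_t v)    { return (double)v / COUNTS_PER_METRE; }

int main(void)
{
    fix_t altitude = to_fix(1234.50);        /* 4938 counts               */
    fix_t descent  = to_fix(2.75);           /*   11 counts               */
    fix_t next     = altitude - descent;     /* plain integer subtraction */
    printf("next altitude = %.2f m\n", to_metres(next));   /* 1231.75 m   */
    return 0;
}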
And we just KNOW, to the BONES, mother nature hates floating point math. Gawd, if only we had copied "Dr John von Neumann's" brain to help with the missions!
As a space nerd who spends a horrific amount of time observing the very small details and progress SpaceX is making with Falcon and now Starship, and who also has a physics and engineering background, you did a very good job of explaining the mission correctly. So many get parts of this wrong. And you made the inner workings of this very interesting. This is a great video. Have you ever seen the interview Smarter Every Day did with one of the engineers who built the Saturn and integrated all the systems the code you reviewed ran on and controlled? It's awesome.
7:45 As someone who has written assembly language programs for 64K machines, I can say that the stack of printouts that Margaret is standing near in this photograph must be for the *entirety* of software for the Apollo program, not just the lunar and/or command module AGCs. A printout of source code for an assembly program (even one heavily-commented) that fits in 73K (3,840 RAM + 69,000 ROM) might require the number of pages of one of the smaller binders.
In those main files in the GitHub repo you can see that Luminary099 had 1743 pages and Comanche055 has 1751 pages, which is like a box of copy paper. That stack looks more like 8k pages or so. I think you're on to something.
It is all of the code for Apollo.
She has said in interviews that when the reporter and photographer showed up, they came up with the idea of making the "stack of code." So, they went around the room and literally just grabbed whatever binders were there. So, that's probably not all source code. I suspect a lot of the thicker binders are just data dumps and other diagnostic listings. After all, the computer only had about 36K words of assembled code. Even with copious comments, the source code for that is not going to be a huge pile of paper.
Yes. Each book is the source code for one AGC. But “Luminary 99” means the 99th version of Luminary, so there were far more versions than is visible in that stack. For that photo, they just grabbed a bunch of printouts that were lying around.
There's no way that stack is just the code for the Command Module and the Lunar Lander. I just downloaded the repo from GitHub, and the whole repo is 177 files @ 3.34 MB. Given printing on 8 1/2 x 11 paper that's roughly 342 pages... less than a 500-sheet ream of paper. I'm assuming those printouts were the typical green-and-white wide tractor feed... so it would take even fewer pages. The project I'm a dev on is almost 6,000 files at 2.18 GB... which is about 46K pages or 92 reams of paper, and @ 2"/ream that's 182 inches or 15.3 feet.
The photo is pure made up BS.
"This doesn't really happen in modern programming. File names, comments are always pretty straightforward and serious." You've obviously never seen my code!
Thanks for a great video!
Ah, the good old days when programming was creative and fun, when code wasn't shipped until it was just about perfect. Also, the verb/noun paradigm is so well-suited to human command logic. Great video. Enlightening. Entertaining. Well done.
It famously wasn't perfect.
@@phill6859 it never will be
I mean, the noun-verb paradigm was the true genius of the system. It was such an elegant solution to the size and weight limitations necessary in the AGC human interface component. Also, an amazing design in light of the need for the astronauts to be able to accurately interpret the output from, and make input to the AGC in times of extreme stress. Mind boggling!
You'd better be sure that most of the work on the Apollo software was not fun but very tedious. Those jokes tell a story of exasperation.
On the flip side, coding can be very much fun today, and modern software can be far more reliable than what even those most brilliant programmers of the 60s were able to do with their limited tooling. If your experience tells you otherwise then perhaps you should look a bit more around for better projects, other programming languages, etc..
Ah, the good old days when requirements were so simple and constrained. Even the smallest app today does so many more things than just "Fly rocket to moon". And then a separate program "Land it".
I loved the video and appreciate the background information provided. I should point out that Margaret Hamilton was certainly not standing next to a single listing of the source code. Someone computed the entire GitHub repo to contain 1,751 pages, which would not create a stack that size. The most likely explanation is that it is an accumulation of test data run against the source, or possibly several printouts stacked for effect.
Checking online, it cuts deeper - that "burn baby burn" line was first used during the race riots in 1965 by a Radio DJ named Magnificent Montague in America...
Super video, one of my absolute favourite software topics is the AGC. Many thanks for taking your time here, yet getting it all done in under 20 minutes. Well impressed!
I am interested in what kind of instruction set they had available to them, in terms of a von Neumann machine, to be writing assembly for a portable computer in the late 60s. And also that the tech had to be shielded from extreme physical forces - ploughing through Earth's atmosphere. And also, as others have said here in the comments, the actual point-to-point wire-weaving of the memory modules for the AGC, by real seamstresses with actual needles and a ball of wire, is the other most phenomenal fact about this software-hardware machine!
Thanks for the upload, subbed 👍
I remember it - I had just turned one month old when the landing happened. I was sitting on my mum's lap, watching it on a black-and-white telly. I later wrote code in a kind of assembly then moved on. Cool stuff Dee
Great video! This relates to my favorite computer hack on Apollo 14 to bypass the abort button. It’s too much to type in a comment but might be something for you to cover in a future video
Smarter Every Day has an extensive interview with one of the engineers who designed the AGC. He walks the viewer through the history and the hardware. Very very fascinating and fun.
I think you mean the Launch Vehicle Computer that controlled the Saturn V.
Based on my experience using assembly language in the 1970s, you needed some sense of humor or else you would lose your mind. Most of the humor in my code and those of my colleagues was more like graveyard humor and rather dark. The comments and subroutine names you quoted are much more amusing than the ones we had.
This channel is AWESOME. I can't believe it took YT so long to recommend this to me. Looking forward to more content from you, Dee! Now if you'll excuse me, I'm gonna binge-watch this channel.
People might be amazed to know that there is STILL a LOT of IBM BAL assembly code running - especially in large, critical applications where companies deem it just too expensive to rewrite, test, and certify new code. The banking industry is notorious for this - lots of ancient code on HOGAN systems, still running hierarchical databases like IMS...
Assembly was my favorite coding language out of college. The really cool thing about it for me, is that it teaches you what the higher level language compilers need to do to get to the machine readable state. What is great about your research is the fact they needed to restart the AGC to get rid of the overload. Can you imagine today, having to restart your computer, in the middle of a critical moon landing in the next 60 seconds? But with their little powerhouse it was like a light switch. Done, and Done. Burn baby burn! I was 9 years old watching it live. I knew from that point forward, I was going to be a software engineer. Mercury, Gemini, Apollo. What an awesome time to be a kid! Thank God for SpaceX. They have relit the fire that burned for us in the 60's.
I can see why there are moon conspiracies. These engineers performed MAGIC. Forget the primitive hardware; the sheer elegance of practical code to control a system of that complexity seems like sci-fi mumbo jumbo.
On the contrary. There are conspiracies because we have not gone there since then with infinitely more powerful, safe and robust equipment
@@airaction6423 look up the Artemis program
@@airaction6423 Science hasn’t been prioritized since then. Even back then, it took the Cold War to make governments and their citizens see science as a high priority.
@@airaction6423 The reasoning is on the wall.
The US went there for geopolitical reasons: to win the Cold War.
It went there after Apollo 11 too, multiple times.
Then the space race was won... and the Cold War itself was won.
The US government virtually defunded NASA. It had served its purpose.
That's why. And it's sad.
@@miguelpadeiro762 I understand your arguments but it still makes little or no sense
Great video! Loved it, used to enjoy programming my first computer (C64) in machine code. Very impressed with the source code that made sure we REALLY could land on the moon and get back safely!
The first language I learnt was BASIC. The second was 6502 Assembler. I also programmed on punched cards at school.
First I learned a pseudo-assembly code the teacher wrote... then Fortran with both large and then small punch cards, then BASIC on the IBM terminal, then APL on the terminal. We also learned a smattering of COBOL and Pascal. Later on I taught myself a bunch of flavors of BASIC, including Mallard Basic on the Amstrad, AmigaBASIC and a compiled BASIC for the 68000... as well as C and the 68000 family of assembly. Also on the Amiga I learned E, Pascal, and a bunch of others I can't remember anymore. Then came the MS-DOS era with 6800 asm, MS Basic, Java & JavaScript, HTML, Perl, ASP, PHP... now for the past 5 years I've been a dev using C++ with the Qt framework. Now learning C# using VS and the .NET framework.
@@douglascaskey7302 Very nice. I learned C on the Amiga too.
I have an affection for that era, as my second computer experience (after the Dartmouth IBM mainframe, at age 12-13) was with a DEC pdp-8e, a moderate upgrade on the tech developed for this AGC. It was far easier with old Teletype terminals, and a high speed optical paper tape reader, than dealing with punch cards, or the limits NASA's AGC had of 2k RAM, and 36k of ROM as its storage limit. MIT's work on AGC no doubt had crossover to DEC engineering.
When we think of standard programming languages in the CP/M micro-PC era and since, that was NOT the AGC. The PDP-8e could run BASIC, FOCAL, COBOL, FORTRAN, or a two-pass ASM assembler and compiler. NASA's AGC ran AGC4, its own unique hardware-specific code.
MIT's engineers amazingly outdid one major NASA contract spec for AGC. They came out with a 70 lb computer, well under the 100 pound design goal limit. That in itself was amazing at that time.
It's been a LONG time since I wrote ASM code. One reason Steve Gibson's hard drive utilities are so efficient is that assembly is one of his long-term specialties, while other code bloats just on GUI interfaces that may hog, in video RAM alone, more than a million times the memory we ran entire space missions on in 1969.
Awesome presentation! Learned a few new things. Thank you so much
That was so interesting ! Thanks for sharing. I would say that the jokes serve as release from the task, but also as markers. It's way easier to remember a funny note than more boring instructions. So when they would look for a specific piece of code in this huge file, they could remember it was next to this or that funny quote or joke. Very clever, and it shows that serious work will always be more efficient when you mix some fun in.
Assembly is still very much relevant today. In fact, every single time you compile software, you are compiling it into ASM (Assembly) and then the assembler takes over from there.
This is true as if you need efficiency and speed from a CPU and it's peripherals, you switch to assembly, write a sub and call it from whatever higher level language you're writing in.
Unless your compiler goes direct to machine code. Which most do these days.
@fredflintstone9609 Most compilers still go through ASM, since the assembler stage can sometimes apply further optimization.
@@oglothenerd Depends on platform. After the advent of Turbo Pascal for the PC platform, new compilers followed that model and generated straight to machine code. Only older compilers generated asm and required an explicit asm and link step. Before long the common ones of those had an option to generate straight to machine code. Off the PCs, mainframes of the 1980s era were moving to eliminate the assembly intermediates and went from compile direct to linking. Compilers generating asm are historical artifacts on the common platforms at this point.
@fredflintstone9609 Hmmmm... that makes sense!
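An easy way to see this for yourself, whichever side of the "straight to machine code" question your compiler falls on: most C compilers (gcc and clang among them) will stop after code generation and show you the assembly with the -S flag. A trivial example:

/* see_the_asm.c -- build with:  cc -S see_the_asm.c
   and read the generated .s file to see the assembly the compiler
   produced before the assembler/linker turn it into machine code. */
int scale_and_offset(int x)
{
    return 3 * x + 7;   /* a line of C usually expands to several instructions */
}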
I'd just like to thank you for a very clear and interesting explanation. I remember watching the moon landing as a 11 year old kid, then in my last year of middle school learning Fortran programming using the write the code -> convert to octal -> punch cards with an unbent paperclip -> bind your program with a rubber band -> wait for the printout to return from the university the next week method. In the early 80s I was programming simple games in Z80 assembly language for the simple home computers of the day. Persistent memory was cassette tapes!
Ouch ... 0:12 the photo of Saturn V launch has nothing to do with Apollo mission ... there is no manned spacecraft on top ... it's the launch of Skylab station
Get a grip ffs
To be fair, she only mentioned the Saturn V when she showed a picture of just a Saturn V with what is essentially a modified fuel tank on top.
Did you miss the arrow graphic pointing to the Saturn 5?
What is amazing is how, given the level of computer technology at the time of the Apollo missions, they were able to build all the necessary functionality into a computer that could fit into the Apollo Command Module and the Apollo Lunar Module. Both modules imposed strict limits on the size and weight of the Apollo Guidance Computer that could be fitted; there was no luxury of large size or heavy weight. So the Apollo Guidance Computer was likely among the smallest and lightest computers of its time.
Yes. The AGC was the first computer *in the world* to use integrated circuits. Those were chips containing only two logic gates each, but they were an advance over the use of individual transistors as was common back then.
I started programming in MACRO-11 on Digital PDP-11s. I MISS IT!
I started on 6502, 8088, and then later Vax Macro
I still love assembly language programming, there's something unique knowing you are literally dealing with the instructions themselves
Now I use python which is nice but it doesn't quite feel the same
The book “Sunburst and Luminary: An Apollo Memoir” by Don Eyles is also an amazing look at the creation of the AGC and specifically the software from the start.
As an ex assembly programmer (on PDP 11 about a decade after this) I can add that it was good practice to comment every line of code. They were after all cryptic instructions involving registers and memory locations.
Actually the PDP 11 had a great assembler. All registers could be incremented and used as memory pointers. Once you added Macros you were close to C. Actually C was essentially sugared PDP assembly code.
As a fellow PDP-11 programmer I approve this 😁. Just a small addition to the nice comments: I once saw this comment (IMHO also citing Shakespeare): 'Our faults dear Brutus are not in our stars, but in ourselves' - error handling function of the RT-11 operating system. I used RT-11 sometimes, but mainly worked with RSX-11M.
All were 'real programmers', not just quiche eaters 🤣.
The register symmetry of the PDP 11 was really nice. The Motorola chip in the early Macs borrowed a lot from that architecture.
Likewise... this brings back equal memories of nightmares (when it didn't work) and huge euphoria (when it finally did work). We were interfacing A/D's to PDP-11 for data acquisition. Set up "Ping Pong" buffers so the A/D could be writing to Ping while the PDP-11 could read Pong; write to disk... then flip the pointers. Hopefully the "bathtub" never overflowed while filling and draining at the same time. It was those other pesky interrupts from disk writes that could ruin things... now, what was the sustained disk write speed for 8k vs 16k vs 32k block-writes???? As I get close to retirement age (actually past it, but still love being an engineer), I'm thinking about getting into Arduino's for Halloween projects.
@@EngRMP The Arduino is good. I'd recommend getting some of the clones, particularly the Nano version. I would also recommend using 2.54mm pin header sockets and connectors so the parts can be removed and replaced easily. I generally gravitate towards the ESP32, ESP8266, and the Raspberry Pi Pico; however, those devices use 3.3 volt logic vs the Arduino's 5 volt logic. It's easier connecting sensors and output devices to the Arduino since most are going to be 5 volt logic and you won't require additional hardware. Get yourself a package of about a hundred 2N2222 transistors or some 2N7000 FETs. It's possible to drive higher-power MOSFETs directly, but if I remember correctly 70 mA is about the most you want to source from the ATmega328 and stay within safety parameters. There's no real advantage to using real Arduino boards over the clones. I hope you'll enjoy building projects with Arduino as much as I have. The only limitations are going to be your imagination and your wallet. Oh... I usually buy in quantities of 10 so if something goes wrong I have plenty of spares available without any waiting. Besides, this stuff is cheap. Take care.
I never actually wrote any useful PDP-11 assembly code, but my 2nd-semester programming course in college (Spring 1972) involved writing a PDP-11 simulator in the first half of the course and then a PDP-11 assembler in the second half, so I became fairly familiar with the PDP-11 instruction set. They made some nice design choices. The auto-increment and auto-decrement addressing modes were useful in lots of situations, and having the program counter be one of the 8 registers meant that pc-relative addressing for branches and subroutine calls was available to the programmer (or compiler writer) without it having to be a special distinct addressing mode.
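A toy, single-threaded C sketch of the ping-pong buffering scheme described a few comments above (the acquire/drain functions are stand-ins; in the real setup the A/D fills one buffer while the disk write drains the other concurrently):

#include <stdio.h>

#define BUF_LEN 8

/* Two buffers: one is being filled while the other is being drained,
   then the roles are swapped. */
static int buffers[2][BUF_LEN];

static void acquire(int *buf)             /* stand-in for the A/D converter */
{
    static int sample = 0;
    for (int i = 0; i < BUF_LEN; i++)
        buf[i] = sample++;
}

static void drain(const int *buf)         /* stand-in for the disk write */
{
    for (int i = 0; i < BUF_LEN; i++)
        printf("%d ", buf[i]);
    printf("\n");
}

int main(void)
{
    int ping = 0;                          /* index of the buffer being filled */
    acquire(buffers[ping]);                /* prime the first buffer           */
    for (int block = 0; block < 4; block++) {
        int pong = ping;                   /* buffer that is now full          */
        ping = 1 - ping;                   /* flip the pointers                */
        acquire(buffers[ping]);            /* fill one buffer...               */
        drain(buffers[pong]);              /* ...while the other is written out */
    }
    return 0;
}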
A serialised restoration of an actual Apollo AGC can be viewed here on YouTube. Definitely worth a look.
Currently, CuriousMarc and his team are working on the Apollo radio communications equipment (S-band).
@@ApolloKid1961 Sure, and my friend Farty McFartyFart is working on the warp drive.
@@ThePlanetTVtheplanettv It's just here on YT. How hard is that?
@@ApolloKid1961 The context of your question is not understood. Please tell me, "apollokid", who just so happens to be on this channel, what the inference is? Thank you.
@@ThePlanetTVtheplanettv As you can probably guess, I have enjoyed learning about space travel and Apollo in particular since I was a kid. I was a software developer for 30 years and voilà that combination brought me here. How am I supposed to know what the inference is when you watch the episodes about the AGC on the CuriousMarc channel?
FYI, "Burn Baby Burn" was a phrase associated with the 1965 Watts Riots. It was often quoted by college students of the time. The MIT programmers would have been familiar with its source and meaning.
Don Eyles wrote the descent and landing part of the program. He had a great sense of humour and I'm certain that he was responsible for most if not all of the crazy comments in the source code.
Thank you for the video, was awesome to see and hear about it in detail :) Just a quick note, "hashtags" prefix a comment, they don't follow a comment.
They do not make such quality any longer -- same as with many, many products.
Quality is good when reality is considered.
Nice deep dive into the code and the functions it was meant to control. The humour of the coders was a nice touch. Thank you for the video.
The note "off to see the wizard" is followed by the instruction to call "burn baby burn", so you see.
Note: If you want to get into the nitty-gritty of assembly/machine code programmers' heads, try Ed Nather's 'The Story of Mel', about a guy who used the speed of drum memory for timing events on 1950s Royal McBee computers. This was published on Usenet in response to 'Real Programmers Don't Use Pascal' (they use Fortran), which was in turn inspired by the book 'Real Men Don't Eat Quiche' (1982), which satirized the devolution of American masculinity.
Good video. BUT big ERROR at 0:52. NOT the 29th... the 20th of July was the landing on the moon.
Such a well known fact.
So easy to check. Google it.
I was watching it at that time.
Please correct or check your script. You are precise normally.
Now, can we trust your facts on the rest if you missed this? Still, I will watch your videos because your presentation is excellent.
I once wrote a machine code routine to handle graphics on a 6809 8-bit computer and the result was instant execution, instead of a painfully slow MS 1.0 BASIC interpreter solution. So source code is very important. :)
Was a PID/process simulator.
12:59 You cut that quote short. It looks like it was split over a page boundary, if I understand the "Page 393" marker correctly, and the last bit is on the line after that. So, it should end as, "...such abominable words as no Christian ear can endure to hear."
Yeah that kind of bothered me more than it should have. Also not sure why the programmer gave the page number, because that's going to vary in every edition anyway. Instead you go by act, scene and line numbers.
@@prosimian21 It's a digital transcription of the source code that was originally in printed form, so I'm pretty sure the "Page 393" is just marking where page 393 of the code starts, and is nothing to do with the Shakespeare quote.
Great video! I love learning about early computing.
They had so little compared to today, but what they had, they made sing.
The source code looks like assembly. 0:21 Nice.
Well it is
This was so interesting and I'm not a developer. You can take almost any aspect of these first steps in space exploration and have enough material for like 50 feature films. This video does a great job illustrating how ground-breaking this tech was and it's nice to see articulated kudos to all of the brilliant minds that were the OGs of the quest for space. But calling assembly "challenging" is like calling the Burj "tall" 😆 Great vid!
I went to college in the late 80s. I was the first CompSci class to NOT use punch cards for computer programming classes. My internship was at a company that still used a punch card machine; for FIVE cards per day! So, this technology lingered on and on!
Same. I learned on PDP-11s running RSTS/E in college in the 80s. We'd use VT52 terminals or even the teletype machines in the lab.
But in my work experience I had to deal with punched cards and paper tape.
@@TesterAnimal1 Yes, well, this is typical for every generation. In universities we mostly learn the cutting-edge science and tooling (but sometimes also got optically erasable EPROMs for programming an SPS/PLC 🙂). When you get to industry, you realize that most of it is at least 10 years behind.
I took a programming course (FORTRAN) at a college in the summer before my senior year of High School. We used punch cards.
By the time I started at the same college as a freshman, in 1978, punch cards were obsolete.
The 1201 and 1202 alarms were causing AGC reboots during the Apollo 11 landing. This extended the range of the de-orbit and landing engine burn. The low-level assembly language allowed the AGC to keep up with the required tasks even with frequent reboots, which were near instantaneous. I always wanted to find out how long it took Mission Control to isolate the problem to the enabled rendezvous radar. Note: the Apollo 14 landing used a lot of fuel also; they had only a minute of fuel remaining at touchdown. Apollo 14 had an issue with a landing abort switch with an intermittent short that had to be isolated with an in-orbit rewrite of the Luminary code. Loved watching your video! 👏👏👏
MIT handled the 1201 and 1202 problem isolation and provided the fix just before 11 lifted off the lunar surface; see also George Silver. On 14, Don Eyles wrote some code that lived in R/W memory; rewriting Luminary was not possible as it was in fixed (think ROM) memory. See Sunburst and Luminary by Don Eyles.
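A toy C illustration (my own invention, not the actual AGC Executive) of the recovery idea those comments describe: when the job list overflows, flush it and reschedule only the jobs that matter, so the critical guidance work keeps running.

#include <stdio.h>

/* Hypothetical job table. On overflow, keep only restart-critical jobs --
   a loose analogue of how the 1201/1202 restarts shed the spurious
   rendezvous-radar load while guidance carried on. */
#define MAX_JOBS 8

typedef struct { const char *name; int critical; } Job;

static Job queue[MAX_JOBS];
static int n_jobs = 0;

static void schedule(const char *name, int critical)
{
    if (n_jobs == MAX_JOBS) {                    /* overload: restart            */
        printf("EXECUTIVE OVERFLOW -- restarting\n");
        int kept = 0;
        for (int i = 0; i < n_jobs; i++)
            if (queue[i].critical)
                queue[kept++] = queue[i];        /* keep only critical jobs      */
        n_jobs = kept;
        if (n_jobs == MAX_JOBS)
            return;                              /* still full: drop the new job */
    }
    queue[n_jobs].name = name;
    queue[n_jobs].critical = critical;
    n_jobs++;
}

int main(void)
{
    schedule("guidance servicer", 1);
    for (int i = 0; i < 10; i++)
        schedule("spurious radar counter job", 0);
    for (int i = 0; i < n_jobs; i++)
        printf("still scheduled: %s\n", queue[i].name);
    return 0;
}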
But but but Stanley Kubrick and the studio! And what about the Van Allen Belt? Huh, what about that pretty lady? And everyone now knows that only computers can do complex math! Got'cha there...
Stanley Kubrick obviously filmed the moon landings, the space agency had no intention of going to the moon. What people don’t realize is that he’s such a stickler for detail that they ended up going the moon to film in-situ.
It's funny, they told us without telling us. And worse, they made a mockumentary with Rumsfeld and Haig helping with the joke. Can you imagine?
@@ThePlanetTVtheplanettv What is the name of that?
Programming such computers was a huge challenge, the software had to be very short and efficient, squeezing every line of code to the maximum.
You can do a lot with 3 terabytes, but you can also do a lot with 3 bytes.
What is a lot you can do with 3 bytes?
Track 16 million packages for starters.
@@automateTec That's 3 bytes times 16 million: 48 million bytes.
48 MB.
@@callykitten5095 not in barcode form :)
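The arithmetic behind the "16 million" remark, as a tiny C example: 3 bytes is 24 bits, and 2^24 = 16,777,216 distinct values, so a 3-byte field is enough for a unique ID per package (the packing code below is just an illustration):

#include <stdint.h>
#include <stdio.h>

/* Pack an ID into 3 bytes and back. 2^24 = 16,777,216 possible IDs. */
int main(void)
{
    uint32_t id = 12345678;                      /* must be < 16,777,216 */
    uint8_t bytes[3] = { (uint8_t)(id >> 16),
                         (uint8_t)(id >> 8),
                         (uint8_t)id };
    uint32_t back = ((uint32_t)bytes[0] << 16) |
                    ((uint32_t)bytes[1] << 8)  |
                     (uint32_t)bytes[2];
    printf("id %u -> 3 bytes -> %u (max id %u)\n", id, back, (1u << 24) - 1);
    return 0;
}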
Nice, this is really gold, lady! 😊 I was that 5-year-old kid who was pulled out of his bed by his parents to watch the landing... still remember it all.