I'm proud that my sister worked on the Apollo missions as a Quality Control person. Her job was to check and measure every solder joint because solder is heavy and every gram of weight had to be accounted for. It sounds tedious to me, but seeing those Saturn V's launch from our vantage point in New Smyrna Beach was just awe inspiring. Amazing people built that machine.
Solder contains lead, making it heavy, and dangerous to work with being a biohazard. I'm sure they used the minimum quantity while maintaining reliability.
Some truth there. Buzz has admitted that he did not disable the Rendezvous program during decent. He was concerned that if a Descent Abort occurred, the Rendezvous computations may not be able to be started. Because of this, the computer was having to store Descent Radar data as well as calculations needed to re-acquire the Command module, Columbia. Trying to store both sets of data resulted in the 1202 system overload warning, that they decided to ignore and have Armstrong fly the lander manually. A lot of this, and more, came out during NASA TVs 50th Anniversary coverage of the Apollo XI landing.
Buzz didn't think the protocol was written thoroughly enough and that keeping the radar recording on would matter. Instead he thought it would be safer to keep it on, in case they'd benefit from the gathered info later. Could have been a fatal mistake. The engineers and programmers developing the computers were top intelligent.
We all, everyone, made use of "every single byte available" back in the early days of computing. Made for some excellent, lean programming. Programming today (and for some years now) is very lax and wasteful by comparison.
Much faster and cheaper, though. Aerospace development is still lean, mean and very nearly bug free, but also slow and super expensive. Shuttle software was a thing of beauty, too, even though they had way more resources and better tools (Apollo programming was done mostly on paper).
Such a common misconception. Yes it was probably the lowest price as most things are but it's the best price that can still meet the specs that gets the job. It's not like it was farmed out to China.
This is brilliant. It does need to be put in context. As anyone who worked with dos based computers recalls, we used to be able to do a lot more when resources were devoted to the program instead of the interface. My no hard drive 512k ram one 720 floppy Toshiba T1000 did some great productivity software stuff without the burden of running a GUI
Yeah---if only they had'nt accidently on purpose destroyed billions of our Dollars worth of Telemetric know how --eh? They could rebuild everything, only with modern improvements.
Actually there were unneeded background tasks. In the event of overload, the AGC would dump those processes and reload the essential systems that kept the astronauts alive
@@omniyambot9876 Probably not. The Apollo computer had software that was literally hard wired using magnetic cores for memory. The AGC could be rebooted within a fraction of a second and pick up where it left off. The AGC survived 4gs of acceleration while being violently vibrated. They also survived Van Allen belt radiation for a short time. Eleven Apollo missions were launched, and every one carried two AGCs: one for the CM and one for the lander. Only 1 Apollo launch carried just one. Of the 21 computers launched, not one failed. So the best a smart phone could do is match the AGC, not be more reliable.
Apollo Guidance Computer (AGC) purpose: Put a human on the Moon iPhone X purpose: Enslave the lower orders so that a super-race of machine/human hybrids can take over the world Both are awesome technical achievements!
An iPhone already a piece of cheaply made aluminium would not survive space. The large Sievert increase is enough to break it. And the electrons would escape from the battery at an alarming rate rendering it useless.
The reason computers got faster: So you could close the porn window before your wife makes it down the hallway ( and have a game running in the background the whole time!) Lol
you smart arse assholes!! you get a job at IBM or NASA..the had guts, alot relied upon these people, instead of todays zombies with I-phones in their faces talking about crap..people foget these astronauts were also top-gun and nasa test pilots (crack pilots, people that could think without a I phone) so thing about that if your i phone lets ya, fucking trolls...
@กล้วยหอมจอมซน This is an old comment, but I just gotta say it looks like @LEGO PRODUCTIONS is having child issues if anything. This chunk of coal can't figure out how to use an iphone and is taking his rage out on younger generations. I mean, he's not wrong. My generation is retarded, but so is @LEGO PRODUCTIONS.
Thank you, thank you. I get so tired of hearing people that don't understand technology and how it works, try and deny the fact that it was possible. Moon landing deniers say we didn't go to the moon because we didn't have the technology... that's equivalent to saying that circumnavigating the globe in a ship in 1600's was impossible because sailors didn't have GPS. Just because you don't know how it was done, doesn't make it impossible...
That's why modern computers aren't all that it was doing calculations that helped a rocket launch into history and safely land on the moon that's so much more exciting then being able to watch tv on you're phone
Yes and its construction was such that it would not fry when hit by some radiation on the trip to the Moon and back. There is elegance in its simplicity.
@@qtig9490 if you're just going to the moon and back you don't really need to worry about the radiation factor it takes less then 3 hours to go through the Van Allan belts and a thin layer of Aluminum is sufficient for high energy particle radiation and if you're talking solar flares that's why the sun is monitored 24/7 which gives a good 3 days to get out of its way
It's interesting to read comments from those who doubt the computer power available was sufficient to get men to the moon, and yet they overlook the fact that both the USA and USSR were soft landing unmanned craft onto the surface of the moon during the 60s with LESS computational power than the Apollo missions :-)
Even 55 years later no other nation has ever claimed to have landed men on the moon not even Russia . It’s interesting that no one else in the world can do it this long after 1969
@@gregleach5833 - Greg, the USA and USSR first put men into space in 1961, and yet the only other nation that has built rockets capable of sending people into space is China who first achieved it in 2003.
@@gregleach5833 - 63 years later and still only 3 nations have managed to build rockets capable of getting people into space, so those are the only three nations today that can potentially send men to the moon... if they really want to.
@@gregleach5833 - Think about the size of rocket required to get men to the moon and ask yourself if Russia or China have built such a rocket yet (the USSR tried with the N1-L3 'moon rocket' in the 60s and it blew up during every test launch). The USA have achieved a rocket of that size with the Saturn V, the recent SLS and soon with Starship.
Wacthing CuriousDroid video: "Wow, I am actually proud on feat of engineers, scientists and pathfinders who really advanced Humanity" Seeing the comments: *Faith in humanity lost*
You are one of my favorite sites for a dose of old school reality. When people around the world were less pretentious and understood and used their ability of critical thought. It seems to be a lost art today.
I’m impressed. This is a clear explanation for a general audience of some complex information - especially, of just what was going on with the LEM’s computer, and the repeated error codes, during Eagle’s descent. The explanation is so good, I almost understood what happened in that episode!
You failed to mention the "Human Calculators" on staff to also calculate orbital dynamics equations in parallel with the computers on the early Apollo missions. They served as a sanity check and backup for the untrusted computers. It's interesting to note they used slide rules and did these calculations by hand. Great video. Thanks!
Many of us old timers remember using slide rules to calculate problems in college and on the job. The slide rule was a great invention and helped advance achievements in science and engineering. Our phones now have the capabilities to calculate most everything and if not, then a modern calculator today can achieve the task.
On Gemini 12 the onboard computer failed, so Buzz Aldrin used a slide rule to make the calculations required to rendezvous with the Agenda target vehicle.
Lol . I found an old newspaper ad from the mid or early 1970s in an old magazine i picked up at the thrift shop . The computer, monitor, keyboard and mouse cost four thousand dollars . And it mentioned " a hard drive that holds a whopping 30 megabytes ! "
I was just watching a video from CuriousMarc who somehow managed to get an original Apollo flight computer, and he somehow had it hooked up to a lunar lander flight simulator and was showing how the computer controlled the actual landing. It seemed so easy and helped explain what was going on during the landing videos!
@deckard163 and in a thousand years people will look back to these years and say: well, our stuff has million times more power, so they couldn't play realtime 3D games back in 2018? That's stupid. And is equvivalent to your argument. Just because we have million times more calculateing power, it doesn't mean that something is impossible with the lesser computers.
@deckard163 and I am the one with "you have reading comprehension problems"... I didn't said they had 3D games back than, try again. I pointed out how stupid your argument is. And going to the moon isn't about fucking computers. I imagine you have never programmed something byte by byte. You would know that calculating motion for landing isn't that hard, thos computers could manage it. Going to the moon is about money. Do you know how much it costs to build one of those rockets? And why would you spend that much money for that? The cold war is over. There is minimal scientific benefit from a moonlanding. There is no point, unless you do a reality show about modern moonlandings or such bullshit. That's what todays avarage people need. Basing your argument on "today you get million times more computational power for 60$" is as stupid as if somebody in the future would say that we didn't have 3D games now, because in the future you can buy a computer more powerful for 2$. And as I said, the bottleneck isn't about computers. Try to summarize the calculations needed to controll a thruster based on position information provided by a radar. At max a couple 100 instructions every second is my first guess. I hope if you read through a cople times these comments you will get enough info, because this is my last comment here. I don't waste my time with conspiracy nutts, it hurts intellectually.
Reminds me of the old Burroughs B500 mainframe I used to look after (btw it was ex US Military!). Due to having magnetic core memory, you could press the HALT button, make a note of the current memory address register shown in duodecimal by neons on the control panel and shut it down. When it was powered up again, you could key in the saved memory address, hit CONTINUE and it started up again as if nothing had happened. It saved me some very late nights! 19.2K of memory, 9.8 MB of head per track disk, three half inch tape drives, one 80 column card reader, one paper tape reader and one 800 lines per minute line printer. Can't remember the clock speed but I think it was about 1 MHz. Those were the days! Surprisingly, as it was used for business applications like payroll and accounting, the actual processor speed wasn't really relevant as the whole system was limited by how fast the printer could print. The downside was that it needed regular preventative maintenance, it lived in a big air conditioned room and needed three phase mains power (lots of). Not a chip in sight - all discrete diode and transistor logic.
That's even before my time as a computer engineer. I started working in 1980, on Honeywell level 66 mainframes. Huge cabinet with about 80 wire wrapped boards with ic's in the "dead bug" position. Had a great time debugging these boards with a board tester, a logic probe and huge schematics books.
We've gone backwards. Modern software is bloated, inefficient, require a ridiculous amount of memory and are unreliable. As a developer, I find myself constantly appalled by the low quality of software written today.
Nobody in the commercial market would pay to use software developed to aerospace standards though. Keeping it within a reasonable price range is a hugely important parameter for things like cell phone apps, so they take shortcuts, use ready-made libraries, hacky solutions, take full advantage of easy-to-push patches etc. On the other hand, just the QA for Shuttle software involved a specialized peer review group of the same size as the principal coding team (boom, twice the payroll), an outside verification team (thrice the payroll), then a module test group, configuration inspection group for the complete software load and then another verification group at NASA. Imagine trying to develop a smartphone app with these many independent QA levels and then trying to make the money back by selling it at 5$ a copy...
Well maybe it was state of the art...at that time! 1969. We are now 2022 and yet they are trying to work out how to protect orion from the van allen radiation belts! We supposedly went there with apollo missions? Have they forgot how they did it? Oh I forgot they lost all the information! Err! Too many anomalies.
@@michaellyne8773 So all you're saying here is that your knowledge of the Van Allen Radiation Belts is limited and you don't understand how the 1960s technology used by NASA quickly became obsolete....
Great explanation! One of the biggest misconceptions about the technology that took man to the moon... Those computers, while seemingly crude compared to today's, were tailored specifically for firing the engines, controlling guided descent, controlling, guided ascent, and rendezvous. They were quite remarkable and an amazing achievement by the brilliant minds working on it.
...Even allowing for fuel slosh and an ever changing center of gravity. If I remember right, when the Instrumentation Lab at MIT was given the task for the guidance system, they had already made one for the Polaris missile and they just grabbed one of those off a shelf and started expanding on it.
I've heard that Stanley Kubrick directed the fake moon landings. But he was such a stickler for authenticity that he demanded it be filmed on location on the moon.
Actually there’s something more pertinent to the iPhone comparison. The iPhone’s CPU uses a nanometer process. The ionized radiation would shred any smartphones components without shielding. The Apollo’s computers used rope based ROM which wouldn’t have been affected due to its size and pre-programs tasks. Well, that’s the basic explanation. This is the reason the new Orion capsule is being tested even though we’ve sent humans to the moon - smaller CPUs and memory, etc..
question for you ! how many hours did Neil Armstrong and buzz aldrin spend on the moon! The answer given was...Armstrong and Aldrin spent 21 hours, 36 minutes on the moon's surface.5 Jan 2022 So my question is and I do not mean to sound crude..but what did they do about there toilet arrangements? We all have to do our number one's and number two's! Did they take off there suits to change there catheter bag? Come on nasa I would like an honest answer..Remember all them bulky suits and back pack ? In a small environment!
@@michaellyne8773 Good honest question. They only spent a couple of hours on the moon. They wore diapers while on the surface. But back inside the LEM they used specialized bags. They were fairly sophisticated, but not perfect by any means. Actually they were fairly crude, but did the job. My dad worked at NASA so I got to see a urine bag, and an entire Apollo-era space suit complete with vacuum packed food - he actually brought the whole thing home for a few days and had a show and tell for one of my classes. That was awesome. Anyway, it’s good remember that they spent some of that 21 hours sleeping, eating and working. So, they weren’t just sitting there waiting to go to the john. And I believe they took some medication to help their digestive systems slow down. Going #1 was a easier than going #2 which could get messy. Anyway, they used bags, tightly folded them up and stored them in return containers. NASA scientist wanted them to bring all that stuff back for evaluation. These were early days and they needed to know what affect long term space travel had on the body. Apparently the LEM didn’t smell great, but these guys were determined. There’s a report that explains everything in detail if you’re interested. It’s called “Biomedical Results of Apollo - Chapter 2 - Waste management system”. Google it. You don’t have to read the whole thing if you don’t want to. You’ll get the idea pretty quickly. There are some pictures of the receptacles along with some diagrams. Hope that helps clear some things up for you. Btw, yeah the LEM was small, but not as small as most people think. It was smartly laid out and there was room to maneuver.
@@TheSteveSteele but in there big bulky suits ? They did that in a closed environment no bigger than two telephone kiosks! let alone the controls and other equipment inside no air remember! Somehow I beg to differ! Sounded a good answer. Did they take of there suits and then put them back on again 🤔 I watched the preparation before they took off to go to the moon! Come on! A small environment 😆
@@michaellyne8773 The suits they launched with were for take off. A totally different experience than what it was like standing in the LEM in 1/6ths earth gravity. They weren’t wearing all of the protection they were during takeoff. What do you mean by “no air”? The LEM had a life support system. They had air. Since they weren’t wearing their full suits in the LEM during rest periods, it wasn’t that big of deal. They urinated into a bladder they were wearing that could be dealt with later, so they had to remove nothing for that. For the other, they had to use a bag, which required some suit removal but not much. You telling me you couldn’t go to the bathroom in a telephone booth if you had too? Not a big deal.
People who are fucking intelligent enough to realise that they have something to learn. Of those who do not take notes, in 0.01% of cases it's because they already fucking know this stuff. In 1.99% of cases, it's because of some fucking disability that prevents notetaking, and in 98% of cases it's pure stupidity often signified by arrogance, over-confidence and inability to concentrate.
The popular Arduino Uno boards have 2K of RAM, 32K of ROM. They're also 8bit, and as a blank slate, can be used for real-time computing. Play with one of those for a while and you realise just how much you can do with what appears to be very little. Those of us old enough to have grown up with 1980's home computers (BBC Micro in my case), aren't quite as shocked.
Yeah, people, even professional programmers under-estimate the possibilities of the hardware they use. I had started with assembly programming in my graduation(2007-2010). Still to this day, I love how much can be achieved in assembly. Even though, I only program in C/ C++ for my job.
@@mazzalnx It's not disgusting but good and useful too. Without this layered structure we would be stuck with changing bits even for most complex work.
@@anujmchitale I don't mind the structuring but the laziness disgusts me. I rarely use that word but, genuinely, I cannot fathom why each of my simple text messaging apps use 200MB of memory or more to run. Security is necessary, modularity is necessary, I understand, but we had all of that laid out around 2002 and it worked beautifully. Messaging apps used no more than 30MB back then and did THE EXACT SAME THING. Pictures, voice, contact lists, ads, fancy UI, you name it. Since then, all I see is nonsensical fluff being piled on top...
Fascinating. But you failed to add that all the major course calculations for the orbits were already pre-planned into the computers to cut down on any additional on-the-fly calculations. These were numbers crunching intensive so NASA did these ahead of time.
yes.. with robotic missions.. evidence shows astronauts were faking to be halfway to the moon while in low earth orbit. th-cam.com/video/mCHG6uJH5L8/w-d-xo.html
orbital positions provided to NASA for their 1976 robotic Mars Viking mission, which they applied to the photos that recorded at the Cydonia Region after locating a five sided pyramid at 40.87 north, presented by a NASA whistleblower that also contends that the Apollo missions were fraudulent. Bart Jordan and MARS Viking 1 th-cam.com/video/4Dc1A37LIGM/w-d-xo.html
A great video as usual. My brother is a computer programmer for Google. He once commented to me that old computers designed for specific tasks like the Apollo systems can just keep on working as they were designed to, for ever and a day (so long as the hardware hangs in there), because they don't get bloated with new computing requirements each year.
The culture at large has forgotten the difference between specific-use computers - totally bespoke technology - and the general computers we have in consumer space, with OSs and GUIs holding our hands as we navigate all matter of programs, apps and uses. Apollo / NASA systems were designed for one mission. No email, web surfing, digital media playout, no games. Just figure out all the data needed to get to the moon and back. Hats off to the engineers in the 1960s who did this. A 'brain-power' Olympic Games for sure.
I love your videos and want to make sure to start with a sincere "thank you" so you don't think I'm a nit-picky troll. I just wanted to point out that you mentioned that the LEM's Abort Guidance System wasn't used on any missions and that's correct for all of the missions that landed but the AGS was used by Apollo 10 as a check-out of that computer to clear the way for Apollo 11. Apollo 10 was considered a dress rehearsal for Apollo 11 so they were doing stuff like that. For the record it worked fine after a slight emergency that occurred when the astronauts accidentally performed the descent stage separation with the primary computer on but the guidance data in the AGS. Once they got the AGS engaged, it led them back to the command module perfectly. Thanks again for your very informative content.
Thank you so much for producing this. I work in this industry and your spaceflight videos such as this are consistently excellent and even I learn something from them! You give a thorough, correct, and frankly understandable explanation in terms viewers can understand and the production with archival video is top notch. This is the best explanation I have ever seen about the Apollo computers and I appreciate that you so correctly explained the technical differences compared to something we know. Well Done Sir!
People tend to think a control system computer needs to be PC or modern cell phone level powerful to control lots of devices at once, but that is not the case. A programmable logic controller is a good example of this. You can control 10's of thousands of processes with a single PLC unit that has the computer power of 1980's RISC CPU tech. Heck, even today's Nasa Rovers use 1990's RISC CPU tech.
I think you overestimate the processor requirements of a PLC, the first ones I used had 8008s (the ancestor of the 8080 and hence 8086) and 6502s (Apple 2 CPU). Perfectly happy to run moderately complex machinery, just forget pretty operator displays.
@@carlamayer8198 I watched a video years ago as they were mocking the sheeple for being so dumb they had to refake the actual fake landing on the moon with Photoshop of dubbed in drink cups on the set ...this whole thing of NASA never going to moon , was started by NASA ..and I highly suspect the latest flat Earth bs is also a NASA doings along with all this "evidents that rockets can not work in space ..and add all this to the stories that say satillites don't exist and now we are being told the sun is only a light bulb fastened to a dome along with the moon and stars ..and the sheeple believe ... I wonder where the Nazis will take us next ? Passed the gas chambers ? I don't see any in the FEMA camps only the gas powered crematoriums ..so they are going to save some money this time and just throw us into to fire to be burned alive !!!!! Have a beautiful day
It's nice to hear about some things we got right, rather than everything that went wrong and how much it cost. The designers and engineers of the Apollo Program are just as much heroes as the astronauts!
The thing many forget is that OS and other software have got more and more massive over the decades. Back then with limited memory and CPU power meant really efficient programming. The resources now are so massive it is less important.
this is a particularly fascinating area of the Apollo program for me. The sheer ingeniousness of how they designed and built the computers to enable craft to travel to and land on the moon. In a way, they had to be smarter than today's system designers and developers, because of the extremely limited computing power available.
@callingaseyeseeit2325 lmao. I get it, this topic is too hard for you to understand. But just because you don't understand it, doesn't mean it's a conspiracy theory. Go change your tinfoil hat
Excellent topic and video! Slight tweak: The errors that caused the Lunar-Module AGC to drop low-priority tasks, giving 1202 and 1201 Alarms, was not due to any problem in the landing radar, but just that both it and the Rendezvous radar were turned on, causing “data overload,” so to speak. The rendezvous radar was not needed, but the checklist allowed Aldrin to have it on, which he did as a precaution, just in case they had to abort and immediately rendezvous with the Command and Service Module. As I recall, the checklist was amended in subsequent missions.
Slight tweak to your slight tweak.. The power supplies for the landing radar and the AGC were out of phase and that caused the CDU inputs from the radar to interrupt the CPU roughly 6000 times per second, and this is what caused the overloads and the program alarms. This problem was known to both MIT and Grumman at least a year before Apollo 11, but nothing was done to fix it.
As I've said under videos about cosmology - If the universe is a simulation, given enough memory, you could run the whole thing with an Intel 4004 processor. We would all be oblivious to the passage of time for the processor as the processor is not a part of the simulation. That that functionally tiny computer could get men to the Moon is no surprise at all. I used to have a bit of fun programming my Tandy PC-7 with its 1.5k of memory. Even played Conway's LIFE on it, but I had to use pencil and graph paper to translate coordinates into visual results.
Excellent vid. I only wish you could have mentioned Margaret Hamilton, the young lady who wrote most of the code for those computers. I teach middle school Science and I always give a shout-out like that when I can, to encourage and hopefully inspire the girls to coninue studying Science.
My Dad worked on the IBM Apollo project team, in Huntsville Alabama (guidance systems). I didn’t find out until I was 17 years old! (1988), when I overheard him talking to my friend. I was shocked that my (dull IBM computer software developer)Dad was involved with this!
The Apollo Abort Guidance System *was* actually used, by the way. It was tested on Apollo 9, accidentally activated on Apollo 10, used during lunar ascent gimbal lock on Apollo 11, and used for most of the return trip of Apollo 13 (including several engine burns).
Steve Fischler and I built the ground control computers for the Gemini and Apollo spaceflights in a facility in Palm Bay, below Melbourne, which was below the Cape. The facility was Radiation, Inc. and owned by a dentist, Homer Denius. The first IC chips were built right down the hall in flasks sitting on what looked to be duct tape covered pillows. Our computers were wired by some 50 women in a room adjacent to ours and we used potted circuits built in little plastic boxes to populate the wired boards. Each potted circuit had 10 pins on the bottom and contained flippflops, astable oscillators, etc. NCR built the memories for our computers, 64K machines. But when you are doing assembler or a threaded interpretive language like Forth, that is tons of room. In fact the most complex word processor program I have ever seen ran under CPM (control program for microcomputers) which only ran on 8 bit machines. But when the shot JFK, I expected the space program to fold so I took a job as a life guard in Lantana, Florida. It was fun while it lasted.
felix mendez Bro you doudt everything regardless of what anyone says , dam relax . You have all this " evidence " to prove your statement's take it public then, actually do more then just debate people online lol . Go attack the believers of Bigfoot
felix mendez See it's a fact any jacklegg can make a website and put what he wants on it , truth or lie people will believe it . That is why nothing on the internet is 100% truth , where are the documents and witness statement's to prove your claims ? Real evidence not website's or TH-cam videos
What very few people know: The kind of software programming that was done on the AGC...and the limited AGC memory capabilities....were used again on the FIRST Commercial Airline aircraft Flight Management System (the Boeing 747 in 1981). That computer controlled the climb out, in route navigation and descent of that aircraft. Similar systems were eventually put on ALL commercial and military aircraft. These aircraft use Flight Management Systems up to the present time. This was one VERY important offshoot of the Apollo program that was put to a very useful purpose for EVERYONE flying today.
Well, we don't tend to care as much, so even 'good' code can tend to be sloppier. The Amiga OS runs on 1 megabyte of RAM (technically 512k, but it was never very good at that), and a 7 mhz processor, and can do multitasking and a bunch of other things that wouldn't look out of place in a modern system. Some of that performance comes from being very fast and loose with security and reliability. (performance is king, everything else is secondary). some of it is misleading (The system isn't as capable as superficial comparisons would suggest) But some of it genuinely results from much less constrained development. I don't care if my graphics routine uses an extra 5 megabytes of memory on my system with 16 gigabytes of total memory. But on my 1 megabyte system I DO care if I use 70 kilobytes instead of 60... It might be that various overheads related to hardware design and operating systems mean I need that extra 5 megabytes to do stuff I wouldn't have had to on the older system. Or maybe I'm doing more error and security checking to prevent crashes and exploits. Or maybe... I just don't care. I mean, what do I care if I'm wasting what isn't even 0.1% of my system's resources? Unless I do that hundreds of times over, it won't really matter. Plus, there's laziness. Do I want to spend 1000 hours writing hand-crafted assembly code? Or trust my compiler to create 'good enough' code that may be only 70% the speed, and uses 130% more storage space, but saves me 900 hours of work due to the reduced complexity on my end? Some optimisations just aren't worth the time they take to write. Plus, the most heavily optimised code is also the least comprehensible, most brittle, and least portable, in general. Which often isn't worth it. And of course, if small mistakes and inefficiencies like that aren't critical to whether something works or not, it lowers the barrier to entry. Sloppier, less skilled programmers can still write usable code, because their errors aren't as disastrous.
There is "some optimization isn't worth it" and there is "this program that is basically just a dedicated web browser takes nearly 1GB of storage and another one of RAM" (looking at you, facebook).
The hardware *was* the operating system then. And you're comparing general purpose computers (like we have today) with a computer that was designed specifically to just do one specific job. A $15 programmable calculator today would run circles round the flight processor then. It's not a job that _needs_ a lot of power.
Yep. Soviets had a very risky build and a shitty team coordination (hence failing to get N-1 flying without going boom), while USA managed to win on teamwork and pure spending (F-1 and J-2 were inefficient, but they were sufficient to do the job).
So some people can believe that there are 1.600.000.000 transistors on a (let's remember it) human-made 1" square of silicium, but can't admit that we went on the Moon? Not the best argument, but I'm always mezmerised by miniaturisation those days. And by stupidity.
That's kind of where I'm at. Apollo - sure thing. i7 - really? I guess we got help from the aliens we met on the moon. The Internet, too. A signal travels from my keyboard, through a computer bus, through an ethernet cable, through a switch, through a router, through a telco system spanning copper, fiber and cable, to arrive across the country in just 100 ms. We accept that for granted and yet it is damn near a miracle it all actually works. And here you are reading it, going, "hmmm..."
My very first ‘computer’ was a Casio FX-702p programmable calculator which I bought in the souk in Abu Dhabi in 1981. With just 1.68k of memory, one had to be very inventive to get it to do stuff of any size, things like designing small sub-routines that could be used in different situations. Looking back, the sheer limited size of such machines forced the best from programmers and was very good training for the future. I’m sure this must have been the experience of those programming for NASA nearly 20 years before me.
“Things like a computer that could fit inside a single room and hold millions of pieces of information.” I don’t know if Jim Lovell really said that, I know Tom Hanks did when he played Lovell in Apollo 13, but I always smile when I hear that line in the film
It was a damn good ship design. Apollo 13 only failed because someone dropped the o2 tank during production. Even with that explosion, it still got those guys home
powerful ground based computers taking the heavy processing load and then sending it to the spacecraft. this seems to be the first example of what we now call "cloud computing" ??
For several years MIT was the sole consumer of every logic gate chip produced in the USA. At the time the factory failure rate of semiconductors was very high.... sometimes 10 chips had to be purchased to get one fully functional unit. Lots of hidden pitfalls in the AGC project!
The AGC was an embedded computer. Embedded computing never needs tons of resources, embedded system engineers and even Arduino noobs understand that. You can do a whole lot of stuff with a 32k flash and 2k ram AVR microcontroller. It is totally different than what people usually understand by 'computers'. Even the most powerful embedded computers never need 10 MB of flash memeory.
Something left out that I would like to know is whether the AGC, and the other computers programs were all executing binary code, or had they advanced to an octal or hexadecimal instruction set that was running at the time the processors were running? I say binary and also all of the commands sent up from the ground were reprogramming of the flight computers to take advantage of :no longer needed for the moment storage space, which the program would branch into until it got to end run and an new set of instructions were sent up to the astronauts with the offsets into cpus memory where the program would run until it was no longer needed and then another set of instructions sent up to overlay the memory of the previous code that had served its purpose and that area of storage which had been previously used would be overlaid with new code. BTW for you IBM sys progs, if you have ever seen a JES2 message it is predicated by $HASPxxx, which stood for Houston Area Spooling Program, so in essence JES2 was a by product of Apollo. Did the AGC have registers?
Great Video, all computers running an OS now prioritise important tasks and have for decades. Interrupts are an example that prioritise your mouse or keyboard instruction, all this comes from Apollo. First use of integrated transistors too.
That's the biggest misunderstanding: The Apollo flight computer was built from the ground up to be good at doing one task, where as a smart phone can do hundreds reasonably well. Two totally different machines.
of course it is.you clearly have a brain.thank god someone else can see through this absolute garbage.space travel is impossible even today.thats why we get cgi cartoons from nasa.nothing is real.not a single thing
Comparing the AGC with a smartphone is not useful. The AGC is a real-time beast of a computer, capable of an inner working totally out of bounds for current complex microcoded microprocessor. Some parts of the AGC processor use analog tricks to allow a safe and error-proof operation. In many of its features, it is still superior to many general purposes computers of today. It can only be fairly compared with other spaceship computers or critical fail-safe embedded microcontrollers, and still, the AGC holds it well with its cleaver error handling system capable of running or safely recovering even in the worst condition. The AGC should be a classic case study for today's computer engineers.
@ᚱᛰUᛠӖᚱ ᚦᗩӖϻᛰᚤ can't debunk it cause it's a fact it happened six fucking times what ever you think is evidence it's just proof who ever said it is a moron so what do you call proof
Man, these tings were really basic. I followed the Apollo program since it was first announced in '61 and in '69, I was an Engineering Science major at SUNY. We did a Fortran exercize to lay out a path to The Moon and back. It was a mind bender for sure,
@@rbwannasee That is wrong. Neutrinos have no electric charge, therefore they don't spiral in magnetic fields. Neutrinos would hardly do anything to a transistor, since they are so sterile. Most of the neutrinos pass right through the entire Earth without collideing. It is protons end electrons which pose a threat to computers.
@@rbwannasee but the radiation part is correct. You need cube kilometers of matter to have a reasonable chance of detecting a couple neutrinos in a week.
because people have a hard time realizing that new computers are designed and made by old computers ... they're all extremely capable when you think about it - they did an excellent job working with what they had, which they thought was amazing at the time
*WHO WALKED ON THE MOON* • Neil Armstrong (1930-2012)-Apollo 11 • Edwin "Buzz" Aldrin (1930-)-Apollo 11 • Charles "Pete" Conrad (1930-1999)-Apollo 12 • Alan Bean (1932-2018)-Apollo 12 • Alan B. Shepard Jr. (1923-1998)-Apollo 14 • Edgar D. Mitchell (1930-2016)-Apollo 14 • David R. Scott (1932-)-Apollo 15 • James B. Irwin (1930-1991)-Apollo 15 • John W. Young (1930-2018)-Apollo 10 (orbital), Apollo 16 (landing) • Charles M. Duke (1935-)-Apollo 16 • Eugene Cernan (1934-2017)-Apollo 10 (orbital), Apollo 17 (landing) • Harrison H. Schmitt (1935-)-Apollo 17 *WHO ORBITED THE MOON WHILE THE OTHERS WALKED ON ITS SURFACE* • Frank Borman (1928-)-Apollo 8 • William A. Anders (1933-)-Apollo 8 • James A. Lovell Jr. (1928-)-Apollo 8, Apollo 13 • Thomas Stafford (1930-)-Apollo 10 • Michael Collins (1930-2021)-Apollo 11 • Richard F. Gordon Jr. (1929-2017)-Apollo 12 • Fred W. Haise Jr. (1933-)-Apollo 13 • John L. Swigert Jr. (1931-1982)-Apollo 13 • Stuart A. Roosa (1933-1994)-Apollo 14 • Alfred M. Worden (1932-2020)-Apollo 15 • Thomas K. Mattingly II (1936-)-Apollo 16 • Ronald E. Evans (1933-1990)-Apollo 17
Peter M My only issue with your initial comment is that the way you phrased it, you are implying that there is no operating system that can be trusted.
Fantastic summary story of one of the great mysteries of the space program, especially to someone who made a living programming general purpose computers!
I have to stop once in a while and appreciate the weight of the fact that this small piece of technology in my hand gives me access to mind-numbing volumes of information and knowledge at the flick of a finger. - something that would’ve been considered near magic just a few centuries ago much less in ancient times. That I can take images better than high end cameras a decade ago, view a map of the world in exquisite detail, order food or items from the other side of the world to my home, ask an artificial intelligence about almost anything, view the news in another country, translate languages, have an encrypted conversation with someone on the other side of the planet.... Contemplate that phone in your hand and marvel at the absolute power you possess, that ancient kings would’ve killed you for.
What your phone is actually doing is accessing more capable, application specific servers to provide said services. Your phone is not creating the maps, it's just downloading and displaying them.
@@DasAntiNaziBroetchen Yeah, but its still doing all if it from the user side, we have made infrastructure to support this vast network that gives us this amout of information
I really want to know - why do you conspiracy adherents think that going to the moon was too hard for us to do? Why not just go to the moon? Why bother with such a ponderous cover-up instead of just going? (I'm asking respectfully because I want to know where you're coming from, not to start a flame war.) It's difficult to debate an issue when there is no clear position on the other side.
I think he's a guy who is fascinated by science, and I am fascinated by his videos. Again, why do you conspiracy adherents think that going to the moon was too hard for us to do? Why not just go to the moon? Why bother with such a ponderous cover-up instead of just going?
Registers, nouns and verbs. It would be amazing to see what modern computers could really do if we weren't drowned in an avalanche of "user experience" bloat.
I have an inkling of it, I've used command-line linux using high end hardware. Even with the bloat, command line instructions just fly and so much is completed in an instant, something that would blow the minds of 60's computer technicians.
Excellent document. Many people think that the current computing power of their cell phones would get them to the moon. However, a high degree of integration density would not survive in a radiation environment - cell phones work well on Earth under a protective magnetic shield, not in space. The hardware was adapted to the environment and the software was written very cleverly with minimal memory requirements. Today it would not be possible, because libraries are used that are completely compiled into the program and algorithms are created automatically. All this results in a large capacity requirement and a high error rate.
I'm proud that my sister worked on the Apollo missions as a Quality Control person. Her job was to check and measure every solder joint because solder is heavy and every gram of weight had to be accounted for. It sounds tedious to me, but seeing those Saturn V's launch from our vantage point in New Smyrna Beach was just awe inspiring. Amazing people built that machine.
Well they welded the integrated circuits for this not soldered
@@crackerjack3287 Yep, lol
@@crackerjack3287 sounds like something a relative would get wrong and misremember.
Solder contains lead, making it heavy, and dangerous to work with being a biohazard. I'm sure they used the minimum quantity while maintaining reliability.
@@crackerjack3287 No welding! Solder only!
God, this was so good. Much love, Curious Droid, you're doing great work.
Hey stupid turtle I watch your videos
oh, exurb1a came here :D
hi
Oh hey depression turtle
d-dad?
"Sorry Houston, Buzz accidentally clicked on an ad. It will be 29 seconds before we can return."
Yeah right that was bean he was surfing porn
Some truth there. Buzz has admitted that he did not disable the Rendezvous program during decent. He was concerned that if a Descent Abort occurred, the Rendezvous computations may not be able to be started. Because of this, the computer was having to store Descent Radar data as well as calculations needed to re-acquire the Command module, Columbia. Trying to store both sets of data resulted in the 1202 system overload warning, that they decided to ignore and have Armstrong fly the lander manually.
A lot of this, and more, came out during NASA TVs 50th Anniversary coverage of the Apollo XI landing.
What's you're point he took a chance and it landed perfectly
:D
Buzz didn't think the protocol was written thoroughly enough and that keeping the radar recording on would matter. Instead he thought it would be safer to keep it on, in case they'd benefit from the gathered info later.
Could have been a fatal mistake.
The engineers and programmers developing the computers were top intelligent.
For once an informative presenter. Didn't start with "Hey guys what's up?" Thank you.
@pcpaulius Someone's got penis envy.
Agreed, this guy is excellent, solid and well presented info from beginning to end.
I love you guy's names 😃
(while throwing up gang signs)
"SMASH THAT LIKE BUTTON AND DON'T FORGET TO SUBSCRIBE! !"
And didn' t start with "Hi TH-cam".
We all, everyone, made use of "every single byte available" back in the early days of computing. Made for some excellent, lean programming. Programming today (and for some years now) is very lax and wasteful by comparison.
Much faster and cheaper, though. Aerospace development is still lean, mean and very nearly bug free, but also slow and super expensive. Shuttle software was a thing of beauty, too, even though they had way more resources and better tools (Apollo programming was done mostly on paper).
I've seen commercial code that wouldn't pass any exam ever.
*Cough* YandereDev *Cough*
Excepting critical programs of course: military, medical et al.
Programmers were smarter back then.
All joking aside, Clever people developed this stuff.
With slide rules and hand drawn circuit diagrams
Clever , you mean built by the lowest bidder
@@frankscott6360 If you are above IBM or MIT, then yes...
I welded two wrenches together once to get a tricky bolt off of a John Deere starter.
Such a common misconception. Yes it was probably the lowest price as most things are but it's the best price that can still meet the specs that gets the job. It's not like it was farmed out to China.
This is brilliant. It does need to be put in context. As anyone who worked with dos based computers recalls, we used to be able to do a lot more when resources were devoted to the program instead of the interface. My no hard drive 512k ram one 720 floppy Toshiba T1000 did some great productivity software stuff without the burden of running a GUI
Yeah---if only they had'nt accidently on purpose destroyed billions of our Dollars worth of Telemetric know how --eh? They could rebuild everything, only with modern improvements.
Just crunching numbers. No video, no audio, no unneeded background sub routines or apps, no bloated OS..
Actually there were unneeded background tasks.
In the event of overload, the AGC would dump those processes and reload the essential systems that kept the astronauts alive
@@MostlyPennyCat Agree, but I was referring to background stuff like update clients and such.
whatever we say, a smartphone would be more reliable if modified
@@omniyambot9876 Probably not. The Apollo computer had software that was literally hard wired using magnetic cores for memory. The AGC could be rebooted within a fraction of a second and pick up where it left off. The AGC survived 4gs of acceleration while being violently vibrated. They also survived Van Allen belt radiation for a short time. Eleven Apollo missions were launched, and every one carried two AGCs: one for the CM and one for the lander. Only 1 Apollo launch carried just one. Of the 21 computers launched, not one failed. So the best a smart phone could do is match the AGC, not be more reliable.
@@ohger1
Yeeeah, I know what you meant.
Apollo Guidance Computer (AGC) purpose: Put a man on the Moon
iPhone 6 purpose: Candy Crush daily fix
Apollo Guidance Computer (AGC) purpose: Put a human on the Moon
iPhone X purpose: Enslave the lower orders so that a super-race of machine/human hybrids can take over the world
Both are awesome technical achievements!
🍺🍺
An iPhone already a piece of cheaply made aluminium would not survive space. The large Sievert increase is enough to break it. And the electrons would escape from the battery at an alarming rate rendering it useless.
PC laptop in your bedroom purpose: watch porn.
The reason computers got faster: So you could close the porn window before your wife makes it down the hallway ( and have a game running in the background the whole time!) Lol
“Houston, er, why has the AGC got 3 heat settings ?”
“Apollo, we are missing a toaster down here...”
@jdtrickster4 im preeeetty sure it was a joke
you smart arse assholes!! you get a job at IBM or NASA..the had guts, alot relied upon these people, instead of todays zombies with I-phones in their faces talking about crap..people foget these astronauts were also top-gun and nasa test pilots (crack pilots, people that could think without a I phone) so thing about that if your i phone lets ya, fucking trolls...
@กล้วยหอมจอมซน This is an old comment, but I just gotta say it looks like @LEGO PRODUCTIONS is having child issues if anything. This chunk of coal can't figure out how to use an iphone and is taking his rage out on younger generations.
I mean, he's not wrong. My generation is retarded, but so is @LEGO PRODUCTIONS.
"Houston we have a problem"
-"Have you tried turning it off and on again"
with those early systems was often the best solution
-"Run it in compatibility mode with steam engines".
this comment should have more likes
it is always on, with a reset switch. It is a state engine, not a top-down application
"Alright we hear you loud and clear, now what we want you to do is give the console a good whack"
Thank you, thank you. I get so tired of hearing people that don't understand technology and how it works, try and deny the fact that it was possible. Moon landing deniers say we didn't go to the moon because we didn't have the technology... that's equivalent to saying that circumnavigating the globe in a ship in 1600's was impossible because sailors didn't have GPS. Just because you don't know how it was done, doesn't make it impossible...
"Apollo, you are go for landing."
"Houston, thre's a damn advertisement about walmart specials on my I-Phone. Dammit. I shop at Target."
That's why modern computers aren't all that it was doing calculations that helped a rocket launch into history and safely land on the moon that's so much more exciting then being able to watch tv on you're phone
I gotta wait a few sounds to press skip
Very informative video, as is usual from this presenter. Straight to the point, concise and non-chirpy. Great!
You are just about as clear a narrator as Ive ever heard on TV or otherwise. Thanks for you clarity
I would trust the AGC , it was task specific, no randomness , it was physically made to not make errors and so much more robust.
Yes and its construction was such that it would not fry when hit by some radiation on the trip to the Moon and back. There is elegance in its simplicity.
@@qtig9490 if you're just going to the moon and back you don't really need to worry about the radiation factor it takes less then 3 hours to go through the Van Allan belts and a thin layer of Aluminum is sufficient for high energy particle radiation and if you're talking solar flares that's why the sun is monitored 24/7 which gives a good 3 days to get out of its way
@@qtig9490 I'm glad someone gets that
Armando Silvier Multi ton? You gave away your total technological ignorance right there, boy!
In fact, every statement you made was incorrect.
less memory -> less code -> less bugs
Less ability
-> still to heavy
+ No bloatware = No memory eater
No computers = no code = no bugs = no bloatwares = no memory eaters. That’s true as well.
The size of the transistors made them more immune to soft errors from cosmic radiation
This guy is hard working and brilliant. What a clear, concise mind
It's interesting to read comments from those who doubt the computer power available was sufficient to get men to the moon, and yet they overlook the fact that both the USA and USSR were soft landing unmanned craft onto the surface of the moon during the 60s with LESS computational power than the Apollo missions :-)
Allow people to be genuinely unaware about their actions and thoughts otherwise we all were astronauts and engineers. ❤️🙏🏻
Even 55 years later no other nation has ever claimed to have landed men on the moon not even Russia . It’s interesting that no one else in the world can do it this long after 1969
@@gregleach5833 - Greg, the USA and USSR first put men into space in 1961, and yet the only other nation that has built rockets capable of sending people into space is China who first achieved it in 2003.
@@gregleach5833 - 63 years later and still only 3 nations have managed to build rockets capable of getting people into space, so those are the only three nations today that can potentially send men to the moon... if they really want to.
@@gregleach5833 - Think about the size of rocket required to get men to the moon and ask yourself if Russia or China have built such a rocket yet (the USSR tried with the N1-L3 'moon rocket' in the 60s and it blew up during every test launch).
The USA have achieved a rocket of that size with the Saturn V, the recent SLS and soon with Starship.
More computing power was used to create Paul’s shirt than was used in the Apollo missions.
But. I m wait for present people go to moon..
😪😪😴😴😴😂
Nasa says we won’t be able to go back… the radiation is a huge problem.
Android's AGC: "15 milliseconds until landing engine fire - but first 17 updates need your approval ..
Pipe2DevNull You can fire your landing engines when this ad finishes.
Pipe2DevNull
Returning from space flights is DLC. 10$. or die.
+Pipe2DevNull You can skip after 3 seconds.
this one made me laugh. :D i like this better than people arguing.
VeNuS2910 Thats fucking bs
Wacthing CuriousDroid video: "Wow, I am actually proud on feat of engineers, scientists and pathfinders who really advanced Humanity"
Seeing the comments: *Faith in humanity lost*
True, He does a unreal job of respecting what humanity could do with tech of theyre time.
Welcome to the youtube comment section.
I was almost too scared to scroll down to see what you meant. My god, what the fuck people.
What if Paul Shillito actually trolls his own vids - dat would be freaky! But I would still love him - as long as he doesn't grow a beard.
They aren't human so they don't count. Restore your faith already. ;-)
You are one of my favorite sites for a dose of old school reality. When people around the world were less pretentious and understood and used their ability of critical thought. It seems to be a lost art today.
I’m impressed. This is a clear explanation for a general audience of some complex information - especially, of just what was going on with the LEM’s computer, and the repeated error codes, during Eagle’s descent. The explanation is so good, I almost understood what happened in that episode!
You failed to mention the "Human Calculators" on staff to also calculate orbital dynamics equations in parallel with the computers on the early Apollo missions. They served as a sanity check and backup for the untrusted computers. It's interesting to note they used slide rules and did these calculations by hand.
Great video. Thanks!
Many of us old timers remember using slide rules to calculate problems in college and on the job. The slide rule was a great invention and helped advance achievements in science and engineering. Our phones now have the capabilities to calculate most everything and if not, then a modern calculator today can achieve the task.
Margaret Hamilton!
On Gemini 12 the onboard computer failed, so Buzz Aldrin used a slide rule to make the calculations required to rendezvous with the Agenda target vehicle.
I would not trust my life to an Iphone, I would trust my life to an indestructible Commodore 64.
Long live the Commodore 64 !!!
Mine still works. Also my CBM1541 Disk Drive and my MPS801 printer.
@sote ful Me, too. (glad I grew up in the 80s) I had the same computers you had, except I had a VIC-20 before the C64 and I had a Tandy 1000EX.
you are all fools: A Vic-20 is a better bet!
Commodore 64s sound amazing! I love SID MUSIC!
Lol . I found an old newspaper ad from the mid or early 1970s in an old magazine i picked up at the thrift shop . The computer, monitor, keyboard and mouse cost four thousand dollars . And it mentioned " a hard drive that holds a whopping 30 megabytes ! "
1969 - Computers were landing humans on the moon
2017 - Computers allow me to shitpost and look at porn.
Born to late to explore earth, born to early to explore space, but born just right to browse dank memes ;)
luv your name
This deserves a lot more thumbs up.
earth is flat,nobody went to the moon,and never will...grow up
johncautobody Your brain is flat.
I was just watching a video from CuriousMarc who somehow managed to get an original Apollo flight computer, and he somehow had it hooked up to a lunar lander flight simulator and was showing how the computer controlled the actual landing. It seemed so easy and helped explain what was going on during the landing videos!
Today's computers would put a pop-up in the way of the readout. You'd have to click "Skip Ad" to land.
@deckard163 and you say the video is stupid.. wtf man 😂😂😂
@deckard163 and in a thousand years people will look back to these years and say: well, our stuff has million times more power, so they couldn't play realtime 3D games back in 2018? That's stupid. And is equvivalent to your argument. Just because we have million times more calculateing power, it doesn't mean that something is impossible with the lesser computers.
@deckard163 and I am the one with "you have reading comprehension problems"... I didn't said they had 3D games back than, try again. I pointed out how stupid your argument is. And going to the moon isn't about fucking computers. I imagine you have never programmed something byte by byte. You would know that calculating motion for landing isn't that hard, thos computers could manage it. Going to the moon is about money. Do you know how much it costs to build one of those rockets? And why would you spend that much money for that? The cold war is over. There is minimal scientific benefit from a moonlanding. There is no point, unless you do a reality show about modern moonlandings or such bullshit. That's what todays avarage people need.
Basing your argument on "today you get million times more computational power for 60$" is as stupid as if somebody in the future would say that we didn't have 3D games now, because in the future you can buy a computer more powerful for 2$. And as I said, the bottleneck isn't about computers. Try to summarize the calculations needed to controll a thruster based on position information provided by a radar. At max a couple 100 instructions every second is my first guess.
I hope if you read through a cople times these comments you will get enough info, because this is my last comment here. I don't waste my time with conspiracy nutts, it hurts intellectually.
Someone would put BRAWNDO® THE THIRST MUTILATOR in the lander fuel tank instead of fuel.
@@zoltankurti that was one of the most intelligent and interesting comments I've read on here tonight
Reminds me of the old Burroughs B500 mainframe I used to look after (btw it was ex US Military!). Due to having magnetic core memory, you could press the HALT button, make a note of the current memory address register shown in duodecimal by neons on the control panel and shut it down. When it was powered up again, you could key in the saved memory address, hit CONTINUE and it started up again as if nothing had happened. It saved me some very late nights!
19.2K of memory, 9.8 MB of head per track disk, three half inch tape drives, one 80 column card reader, one paper tape reader and one 800 lines per minute line printer. Can't remember the clock speed but I think it was about 1 MHz. Those were the days!
Surprisingly, as it was used for business applications like payroll and accounting, the actual processor speed wasn't really relevant as the whole system was limited by how fast the printer could print.
The downside was that it needed regular preventative maintenance, it lived in a big air conditioned room and needed three phase mains power (lots of).
Not a chip in sight - all discrete diode and transistor logic.
That's so awesome those things didn't need all the fancy shit they just did what they were supposed to
That's even before my time as a computer engineer. I started working in 1980, on Honeywell level 66 mainframes. Huge cabinet with about 80 wire wrapped boards with ic's in the "dead bug" position. Had a great time debugging these boards with a board tester, a logic probe and huge schematics books.
We've gone backwards. Modern software is bloated, inefficient, require a ridiculous amount of memory and are unreliable. As a developer, I find myself constantly appalled by the low quality of software written today.
Joshua Barretto: Finally someone speaks out!
Nobody in the commercial market would pay to use software developed to aerospace standards though. Keeping it within a reasonable price range is a hugely important parameter for things like cell phone apps, so they take shortcuts, use ready-made libraries, hacky solutions, take full advantage of easy-to-push patches etc.
On the other hand, just the QA for Shuttle software involved a specialized peer review group of the same size as the principal coding team (boom, twice the payroll), an outside verification team (thrice the payroll), then a module test group, configuration inspection group for the complete software load and then another verification group at NASA.
Imagine trying to develop a smartphone app with these many independent QA levels and then trying to make the money back by selling it at 5$ a copy...
Paul Zuk: Good point but we're not asking for aerospace standards, just fit for purpose.
Most coders just suck and rely on patchwork.
I often wonder what Windows 3.11 might be when run under today's hardware...
I find it absolutely incredible the minds that developed this technology. And I object to people calling it "primitive." It's state of the art.
Well maybe it was state of the art...at that time! 1969. We are now 2022 and yet they are trying to work out how to protect orion from the van allen radiation belts! We supposedly went there with apollo missions? Have they forgot how they did it? Oh I forgot they lost all the information! Err! Too many anomalies.
@@michaellyne8773 So all you're saying here is that your knowledge of the Van Allen Radiation Belts is limited and you don't understand how the 1960s technology used by NASA quickly became obsolete....
@@thegreatdivide825 ime saying that they faked it..let's see what Artemis does in 2025
@@michaellyne8773 Thank you for confirming your limited knowledge
@@thegreatdivide825 thank you for your limited knowledge also!
Great explanation! One of the biggest misconceptions about the technology that took man to the moon... Those computers, while seemingly crude compared to today's, were tailored specifically for firing the engines, controlling guided descent, controlling, guided ascent, and rendezvous. They were quite remarkable and an amazing achievement by the brilliant minds working on it.
...Even allowing for fuel slosh and an ever changing center of gravity. If I remember right, when the Instrumentation Lab at MIT was given the task for the guidance system, they had already made one for the Polaris missile and they just grabbed one of those off a shelf and started expanding on it.
Steven Salmon No ”fuel sloshing” in the Polaris. It had No liquid fuel.
In a nutshell: "Do not many things, but do them very well"
I've heard that Stanley Kubrick directed the fake moon landings. But he was such a stickler for authenticity that he demanded it be filmed on location on the moon.
TrueGrandImperial. Now THAT was funny!
Joke stealer
Good one!
Who filmed Armstrong jumping off the ladder? not a camera mounted to the LEM
robotic arms were not developed until 1971 oooops
Actually there’s something more pertinent to the iPhone comparison. The iPhone’s CPU uses a nanometer process. The ionized radiation would shred any smartphones components without shielding. The Apollo’s computers used rope based ROM which wouldn’t have been affected due to its size and pre-programs tasks. Well, that’s the basic explanation. This is the reason the new Orion capsule is being tested even though we’ve sent humans to the moon - smaller CPUs and memory, etc..
Thank you for explaining it in a clear and concise way.
question for you ! how many hours did Neil Armstrong and buzz aldrin spend on the moon! The answer given was...Armstrong and Aldrin spent 21 hours, 36 minutes on the moon's surface.5 Jan 2022
So my question is and I do not mean to sound crude..but what did they do about there toilet arrangements? We all have to do our number one's and number two's! Did they take off there suits to change there catheter bag? Come on nasa I would like an honest answer..Remember all them bulky suits and back pack ? In a small environment!
@@michaellyne8773 Good honest question. They only spent a couple of hours on the moon. They wore diapers while on the surface. But back inside the LEM they used specialized bags. They were fairly sophisticated, but not perfect by any means. Actually they were fairly crude, but did the job. My dad worked at NASA so I got to see a urine bag, and an entire Apollo-era space suit complete with vacuum packed food - he actually brought the whole thing home for a few days and had a show and tell for one of my classes. That was awesome.
Anyway, it’s good remember that they spent some of that 21 hours sleeping, eating and working. So, they weren’t just sitting there waiting to go to the john. And I believe they took some medication to help their digestive systems slow down. Going #1 was a easier than going #2 which could get messy. Anyway, they used bags, tightly folded them up and stored them in return containers. NASA scientist wanted them to bring all that stuff back for evaluation. These were early days and they needed to know what affect long term space travel had on the body. Apparently the LEM didn’t smell great, but these guys were determined. There’s a report that explains everything in detail if you’re interested. It’s called “Biomedical Results of Apollo - Chapter 2 - Waste management system”. Google it. You don’t have to read the whole thing if you don’t want to. You’ll get the idea pretty quickly. There are some pictures of the receptacles along with some diagrams. Hope that helps clear some things up for you. Btw, yeah the LEM was small, but not as small as most people think. It was smartly laid out and there was room to maneuver.
@@TheSteveSteele but in there big bulky suits ? They did that in a closed environment no bigger than two telephone kiosks! let alone the controls and other equipment inside no air remember! Somehow I beg to differ! Sounded a good answer. Did they take of there suits and then put them back on again 🤔 I watched the preparation before they took off to go to the moon! Come on! A small environment 😆
@@michaellyne8773 The suits they launched with were for take off. A totally different experience than what it was like standing in the LEM in 1/6ths earth gravity. They weren’t wearing all of the protection they were during takeoff. What do you mean by “no air”? The LEM had a life support system. They had air. Since they weren’t wearing their full suits in the LEM during rest periods, it wasn’t that big of deal. They urinated into a bladder they were wearing that could be dealt with later, so they had to remove nothing for that. For the other, they had to use a bag, which required some suit removal but not much. You telling me you couldn’t go to the bathroom in a telephone booth if you had too? Not a big deal.
this guy talks like my history teacher. almost began taking notes.
Eddy Mugira - yes - (much) better than an ordinary history lesson at school !
Wish my history teacher was dat good!
who the fuck takes notes?
People who are fucking intelligent enough to realise that they have something to learn. Of those who do not take notes, in 0.01% of cases it's because they already fucking know this stuff. In 1.99% of cases, it's because of some fucking disability that prevents notetaking, and in 98% of cases it's pure stupidity often signified by arrogance, over-confidence and inability to concentrate.
The herd mentality is strong with this one. Silly sheep.
The popular Arduino Uno boards have 2K of RAM, 32K of ROM.
They're also 8bit, and as a blank slate, can be used for real-time computing.
Play with one of those for a while and you realise just how much you can do with what appears to be very little.
Those of us old enough to have grown up with 1980's home computers (BBC Micro in my case), aren't quite as shocked.
Yeah, people, even professional programmers under-estimate the possibilities of the hardware they use.
I had started with assembly programming in my graduation(2007-2010). Still to this day, I love how much can be achieved in assembly. Even though, I only program in C/ C++ for my job.
Everything these days is layer upon layer upon layer of bloat. It's disgusting.
@@mazzalnx It's not disgusting but good and useful too. Without this layered structure we would be stuck with changing bits even for most complex work.
@@anujmchitale I don't mind the structuring but the laziness disgusts me. I rarely use that word but, genuinely, I cannot fathom why each of my simple text messaging apps use 200MB of memory or more to run. Security is necessary, modularity is necessary, I understand, but we had all of that laid out around 2002 and it worked beautifully. Messaging apps used no more than 30MB back then and did THE EXACT SAME THING. Pictures, voice, contact lists, ads, fancy UI, you name it. Since then, all I see is nonsensical fluff being piled on top...
@@mazzalnx Oh if you are specifically talking about Java then yeah, the bloat disgusts me as well.
Fascinating. But you failed to add that all the major course calculations for the orbits were already pre-planned into the computers to cut down on any additional on-the-fly calculations. These were numbers crunching intensive so NASA did these ahead of time.
yes.. with robotic missions.. evidence shows astronauts were faking to be halfway to the moon while in low earth orbit. th-cam.com/video/mCHG6uJH5L8/w-d-xo.html
lol good link. The comment section where people tear the video apart is particularly entertaining.
orbital positions provided to NASA for their 1976 robotic Mars Viking mission, which they applied to the photos that recorded at the Cydonia Region after locating a five sided pyramid at 40.87 north, presented by a NASA whistleblower that also contends that the Apollo missions were fraudulent. Bart Jordan and MARS Viking 1 th-cam.com/video/4Dc1A37LIGM/w-d-xo.html
Hi @Kernels , if you read some of the journals you can see that the loaded programs get changed on the fly by the astronauts.
A great video as usual. My brother is a computer programmer for Google. He once commented to me that old computers designed for specific tasks like the Apollo systems can just keep on working as they were designed to, for ever and a day (so long as the hardware hangs in there), because they don't get bloated with new computing requirements each year.
so to get to the moon you just need a quad core cpu in you space ships computer that's doable🤣
The culture at large has forgotten the difference between specific-use computers - totally bespoke technology - and the general computers we have in consumer space, with OSs and GUIs holding our hands as we navigate all matter of programs, apps and uses.
Apollo / NASA systems were designed for one mission. No email, web surfing, digital media playout, no games. Just figure out all the data needed to get to the moon and back. Hats off to the engineers in the 1960s who did this. A 'brain-power' Olympic Games for sure.
I love your videos and want to make sure to start with a sincere "thank you" so you don't think I'm a nit-picky troll. I just wanted to point out that you mentioned that the LEM's Abort Guidance System wasn't used on any missions and that's correct for all of the missions that landed but the AGS was used by Apollo 10 as a check-out of that computer to clear the way for Apollo 11. Apollo 10 was considered a dress rehearsal for Apollo 11 so they were doing stuff like that. For the record it worked fine after a slight emergency that occurred when the astronauts accidentally performed the descent stage separation with the primary computer on but the guidance data in the AGS. Once they got the AGS engaged, it led them back to the command module perfectly. Thanks again for your very informative content.
Thank you so much for producing this. I work in this industry and your spaceflight videos such as this are consistently excellent and even I learn something from them! You give a thorough, correct, and frankly understandable explanation in terms viewers can understand and the production with archival video is top notch. This is the best explanation I have ever seen about the Apollo computers and I appreciate that you so correctly explained the technical differences compared to something we know. Well Done Sir!
People tend to think a control system computer needs to be PC or modern cell phone level powerful to control lots of devices at once, but that is not the case. A programmable logic controller is a good example of this. You can control 10's of thousands of processes with a single PLC unit that has the computer power of 1980's RISC CPU tech. Heck, even today's Nasa Rovers use 1990's RISC CPU tech.
EETechs of course, they dont need to fake his landings in movie studios with hight tech, because retards will believe they anyway
Indeed IBM AS400's are still in use today...massive number crunching capability and that's all!!!!
Yep, look at the Arduino - so many possibilities, so many GPIO pins to control many many devices.
I think you overestimate the processor requirements of a PLC, the first ones I used had 8008s (the ancestor of the 8080 and hence 8086) and 6502s (Apple 2 CPU). Perfectly happy to run moderately complex machinery, just forget pretty operator displays.
@@carlamayer8198 I watched a video years ago as they were mocking the sheeple for being so dumb they had to refake the actual fake landing on the moon with Photoshop of dubbed in drink cups on the set ...this whole thing of NASA never going to moon , was started by NASA ..and I highly suspect the latest flat Earth bs is also a NASA doings along with all this "evidents that rockets can not work in space ..and add all this to the stories that say satillites don't exist and now we are being told the sun is only a light bulb fastened to a dome along with the moon and stars ..and the sheeple believe ... I wonder where the Nazis will take us next ? Passed the gas chambers ? I don't see any in the FEMA camps only the gas powered crematoriums ..so they are going to save some money this time and just throw us into to fire to be burned alive !!!!! Have a beautiful day
You do some of the best videos anyone will ever find on TH-cam. Great, great job!
It's nice to hear about some things we got right, rather than everything that went wrong and how much it cost. The designers and engineers of the Apollo Program are just as much heroes as the astronauts!
And yet people insist that it never happened, that it went wrong. What's wrong with humanity? Our greatest achievement, and some people just deny it.
Yes it was brilliant engineering all the way around.
Careful. Some moron will be along in a second to tell you that the pyramids are 'fake'...
LOL! No video of the astronauts putting on or taking off their moon suits. Fake news.
@@suekennedy8917 They weren't anticipating someone such as yourself 50 years later with a voyeuristic disorder
The way this man explains things; he should be my university lecturer.
The thing many forget is that OS and other software have got more and more massive over the decades. Back then with limited memory and CPU power meant really efficient programming. The resources now are so massive it is less important.
this is a particularly fascinating area of the Apollo program for me. The sheer ingeniousness of how they designed and built the computers to enable craft to travel to and land on the moon. In a way, they had to be smarter than today's system designers and developers, because of the extremely limited computing power available.
Man, I love your videos! No clickbait, no fillers... all about the topic and very interesting!
Your IQ must be somewhere around 50 !!
@@rearview2709 Your number of girlfriends must be somewhere around 0
I love the simple-minded folk in the comments. "I can't comprehend something, therefore it must be fake"
Imagine living life like that.
@callingaseyeseeit2325 lmao. I get it, this topic is too hard for you to understand. But just because you don't understand it, doesn't mean it's a conspiracy theory. Go change your tinfoil hat
@callingaseyeseeit2325 not one single point worth arguing from you. Perhaps bring some actual intelligence to the conversation, you derelict
@callingaseyeseeit2325 😂😂😂
@callingaseyeseeit2325 your intellect is too much for me. I can't argue with such a superior being.
They will change to a red hat.
Excellent topic and video!
Slight tweak: The errors that caused the Lunar-Module AGC to drop low-priority tasks, giving 1202 and 1201 Alarms, was not due to any problem in the landing radar, but just that both it and the Rendezvous radar were turned on, causing “data overload,” so to speak.
The rendezvous radar was not needed, but the checklist allowed Aldrin to have it on, which he did as a precaution, just in case they had to abort and immediately rendezvous with the Command and Service Module. As I recall, the checklist was amended in subsequent missions.
Slight tweak to your slight tweak.. The power supplies for the landing radar and the AGC were out of phase and that caused the CDU inputs from the radar to interrupt the CPU roughly 6000 times per second, and this is what caused the overloads and the program alarms. This problem was known to both MIT and Grumman at least a year before Apollo 11, but nothing was done to fix it.
Modern phones : Do everything and anything, a million things at once!
AGC : You have one job ....
As I've said under videos about cosmology -
If the universe is a simulation, given enough memory, you could run the whole thing with an Intel 4004 processor. We would all be oblivious to the passage of time for the processor as the processor is not a part of the simulation.
That that functionally tiny computer could get men to the Moon is no surprise at all. I used to have a bit of fun programming my Tandy PC-7 with its 1.5k of memory. Even played Conway's LIFE on it, but I had to use pencil and graph paper to translate coordinates into visual results.
Man, what a great video. Not many things i didn't know but it made me understand it much better.
Brilliant work.
Excellent vid. I only wish you could have mentioned Margaret Hamilton, the young lady who wrote most of the code for those computers. I teach middle school Science and I always give a shout-out like that when I can, to encourage and hopefully inspire the girls to coninue studying Science.
Amazing how far technology has come,imagine 50 years from now....
My Dad worked on the IBM Apollo project team, in Huntsville Alabama (guidance systems). I didn’t find out until I was 17 years old! (1988), when I overheard him talking to my friend. I was shocked that my (dull IBM computer software developer)Dad was involved with this!
The Apollo Abort Guidance System *was* actually used, by the way. It was tested on Apollo 9, accidentally activated on Apollo 10, used during lunar ascent gimbal lock on Apollo 11, and used for most of the return trip of Apollo 13 (including several engine burns).
Steve Fischler and I built the ground control computers for the Gemini and Apollo spaceflights in a facility in Palm Bay, below Melbourne, which was below the Cape. The facility was Radiation, Inc. and owned by a dentist, Homer Denius. The first IC chips were built right down the hall in flasks sitting on what looked to be duct tape covered pillows. Our computers were wired by some 50 women in a room adjacent to ours and we used potted circuits built in little plastic boxes to populate the wired boards. Each potted circuit had 10 pins on the bottom and contained flippflops, astable oscillators, etc. NCR built the memories for our computers, 64K machines.
But when you are doing assembler or a threaded interpretive language like Forth, that is tons of room. In fact the most complex word processor program I have ever seen ran under CPM (control program for microcomputers) which only ran on 8 bit machines.
But when the shot JFK, I expected the space program to fold so I took a job as a life guard in Lantana, Florida. It was fun while it lasted.
See the geeks and tin foil haters don't care , they just repeat others doudt
felix mendez Bro you doudt everything regardless of what anyone says , dam relax . You have all this " evidence " to prove your statement's take it public then, actually do more then just debate people online lol . Go attack the believers of Bigfoot
felix mendez See it's a fact any jacklegg can make a website and put what he wants on it , truth or lie people will believe it . That is why nothing on the internet is 100% truth , where are the documents and witness statement's to prove your claims ? Real evidence not website's or TH-cam videos
felix mendez Do you claim it's cgi if you see a car accident I front of you ?
felix mendez Your assuming alot lol , you make all these baseless claims for enjoyment get a life lol
What very few people know: The kind of software programming that was done on the AGC...and the limited AGC memory capabilities....were used again on the FIRST Commercial Airline aircraft Flight Management System (the Boeing 747 in 1981). That computer controlled the climb out, in route navigation and descent of that aircraft. Similar systems were eventually put on ALL commercial and military aircraft. These aircraft use Flight Management Systems up to the present time. This was one VERY important offshoot of the Apollo program that was put to a very useful purpose for EVERYONE flying today.
The more computing power we have got the more we tend to complicate things when developing software.
That is not true. It is as simple as never before to develope software.
Samuel Rosenberg it depends on the effort.
Well, we don't tend to care as much, so even 'good' code can tend to be sloppier.
The Amiga OS runs on 1 megabyte of RAM (technically 512k, but it was never very good at that), and a 7 mhz processor, and can do multitasking and a bunch of other things that wouldn't look out of place in a modern system.
Some of that performance comes from being very fast and loose with security and reliability. (performance is king, everything else is secondary).
some of it is misleading (The system isn't as capable as superficial comparisons would suggest)
But some of it genuinely results from much less constrained development.
I don't care if my graphics routine uses an extra 5 megabytes of memory on my system with 16 gigabytes of total memory.
But on my 1 megabyte system I DO care if I use 70 kilobytes instead of 60...
It might be that various overheads related to hardware design and operating systems mean I need that extra 5 megabytes to do stuff I wouldn't have had to on the older system.
Or maybe I'm doing more error and security checking to prevent crashes and exploits.
Or maybe... I just don't care. I mean, what do I care if I'm wasting what isn't even 0.1% of my system's resources?
Unless I do that hundreds of times over, it won't really matter.
Plus, there's laziness.
Do I want to spend 1000 hours writing hand-crafted assembly code? Or trust my compiler to create 'good enough' code that may be only 70% the speed, and uses 130% more storage space, but saves me 900 hours of work due to the reduced complexity on my end?
Some optimisations just aren't worth the time they take to write.
Plus, the most heavily optimised code is also the least comprehensible, most brittle, and least portable, in general.
Which often isn't worth it.
And of course, if small mistakes and inefficiencies like that aren't critical to whether something works or not, it lowers the barrier to entry.
Sloppier, less skilled programmers can still write usable code, because their errors aren't as disastrous.
There is "some optimization isn't worth it" and there is "this program that is basically just a dedicated web browser takes nearly 1GB of storage and another one of RAM" (looking at you, facebook).
Same with storage space.
looking at the MS website why does MS word suggest 3gb of storage, I mean Warcraft 3 is under a gig and is a whole game.
The hardware *was* the operating system then. And you're comparing general purpose computers (like we have today) with a computer that was designed specifically to just do one specific job. A $15 programmable calculator today would run circles round the flight processor then. It's not a job that _needs_ a lot of power.
That was a clear, concise and informative presentation. Very nice work.
th-cam.com/video/Z3FJ2dhr-aA/w-d-xo.html, now look at what really happend hahaha hate to bust your bubble Folma
Man, I have really enjoyed these Apollo vids, thank you.
Amazing stuff, software design led by legendary and wonderful Margaret Hamilton ❤️
So the Moon landings were essentially like going into a bossfight at level one and somehow winning
Yep. Soviets had a very risky build and a shitty team coordination (hence failing to get N-1 flying without going boom), while USA managed to win on teamwork and pure spending (F-1 and J-2 were inefficient, but they were sufficient to do the job).
@@caav56 Russians went full kerbal mode, add more boosters
@@cursedcliff7562 That'd be alternate tech tree branch, the UR-700.
@@caav56 dude, who got to space first?
@@victormponcec Those, who got all of their N-1 superheavy rockets blow up one after the other.
So some people can believe that there are 1.600.000.000 transistors on a (let's remember it) human-made 1" square of silicium, but can't admit that we went on the Moon?
Not the best argument, but I'm always mezmerised by miniaturisation those days. And by stupidity.
Marcells44 most people can't comprehend how much 1.600.000.000 really is
That's literally what I am thinking all the time, thank you for commenting this :)
That's kind of where I'm at. Apollo - sure thing. i7 - really? I guess we got help from the aliens we met on the moon. The Internet, too. A signal travels from my keyboard, through a computer bus, through an ethernet cable, through a switch, through a router, through a telco system spanning copper, fiber and cable, to arrive across the country in just 100 ms. We accept that for granted and yet it is damn near a miracle it all actually works. And here you are reading it, going, "hmmm..."
joedmac78 If it helps, those people should be aware that it takes about 50 years to be 1.6 billion seconds old.
Milt Farrow ..... Call your Mom and get help.
My very first ‘computer’ was a Casio FX-702p programmable calculator which I bought in the souk in Abu Dhabi in 1981. With just 1.68k of memory, one had to be very inventive to get it to do stuff of any size, things like designing small sub-routines that could be used in different situations. Looking back, the sheer limited size of such machines forced the best from programmers and was very good training for the future. I’m sure this must have been the experience of those programming for NASA nearly 20 years before me.
Increadible how they did it!! Respect for all those people working for the Apollo program. A good book to read is "One Giant Leap" by Charles Fishman
Why--is he a Olympic High Jumper ?
“Things like a computer that could fit inside a single room and hold millions of pieces of information.” I don’t know if Jim Lovell really said that, I know Tom Hanks did when he played Lovell in Apollo 13, but I always smile when I hear that line in the film
The leaps and bounds we made due to setting humanity a difficult goal. This is still the pinnacle of human achievement
It was a damn good ship design. Apollo 13 only failed because someone dropped the o2 tank during production. Even with that explosion, it still got those guys home
I love these videos. The Apollo era was incredible. I was born too late.
powerful ground based computers taking the heavy processing load and then sending it to the spacecraft.
this seems to be the first example of what we now call "cloud computing" ??
Now that's an idea! A computer that lives in the satellite!
Nope. Time sharing existed far before that.
“The more they overthink the plumbing, the easier it is to stop up the drain”
Montgomery "Scotty" Scott
For several years MIT was the sole consumer of every logic gate chip produced in the USA. At the time the factory failure rate of semiconductors was very high.... sometimes 10 chips had to be purchased to get one fully functional unit. Lots of hidden pitfalls in the AGC project!
The AGC was an embedded computer. Embedded computing never needs tons of resources, embedded system engineers and even Arduino noobs understand that. You can do a whole lot of stuff with a 32k flash and 2k ram AVR microcontroller. It is totally different than what people usually understand by 'computers'. Even the most powerful embedded computers never need 10 MB of flash memeory.
As usual , an excellent informative video. Well done CD
Something left out that I would like to know is whether the AGC, and the other computers programs were all executing binary code, or had they advanced to an octal or hexadecimal instruction set that was running at the time the processors were running? I say binary and also all of the commands sent up from the ground were reprogramming of the flight computers to take advantage of :no longer needed for the moment storage space, which the program would branch into until it got to end run and an new set of instructions were sent up to the astronauts with the offsets into cpus memory where the program would run until it was no longer needed and then another set of instructions sent up to overlay the memory of the previous code that had served its purpose and that area of storage which had been previously used would be overlaid with new code. BTW for you IBM sys progs, if you have ever seen a JES2 message it is predicated by $HASPxxx, which stood for Houston Area Spooling Program, so in essence JES2 was a by product of Apollo. Did the AGC have registers?
As usual more garbage salesmanship, keep eating the lies up I bet you believe a plane crashed into the Pentagon to
@@christinapankey1415 too/also. If you're going to make silly comments then at least get your English correct.
@@christinapankey1415 you're nuts
You appreciate how advanced the AGC was at the time after seeing that MIT was notified by a western union telegram.
How much would you like to own that telegram? Slightly more interesting than an NFT.
Great Video, all computers running an OS now prioritise important tasks and have for decades. Interrupts are an example that prioritise your mouse or keyboard instruction, all this comes from Apollo. First use of integrated transistors too.
That's the biggest misunderstanding: The Apollo flight computer was built from the ground up to be good at doing one task, where as a smart phone can do hundreds reasonably well. Two totally different machines.
That's it in a nutshell.
of course it is.you clearly have a brain.thank god someone else can see through this absolute garbage.space travel is impossible even today.thats why we get cgi cartoons from nasa.nothing is real.not a single thing
johncautobody Your brain is flat.
Dulqornain You may have eyes but you don't seem to know how to use them.
Dulqornain That’s because your ears don’t work either! You are all fucked up!
Comparing the AGC with a smartphone is not useful. The AGC is a real-time beast of a computer, capable of an inner working totally out of bounds for current complex microcoded microprocessor. Some parts of the AGC processor use analog tricks to allow a safe and error-proof operation. In many of its features, it is still superior to many general purposes computers of today.
It can only be fairly compared with other spaceship computers or critical fail-safe embedded microcontrollers, and still, the AGC holds it well with its cleaver error handling system capable of running or safely recovering even in the worst condition.
The AGC should be a classic case study for today's computer engineers.
@ᚱᛰUᛠӖᚱ ᚦᗩӖϻᛰᚤ can't debunk it cause it's a fact it happened six fucking times what ever you think is evidence it's just proof who ever said it is a moron so what do you call proof
Man, these tings were really basic. I followed the Apollo program since it was first announced in '61 and in '69, I was an Engineering Science major at SUNY. We did a Fortran exercize to lay out a path to The Moon and back. It was a mind bender for sure,
@jubjub247 You're the biggest god damned idiot on youtube..
@@chriskleckner1659 I'm sure, that there are bigger idiots out there. Never underestimate human idiocy.
@@caav56 Man...you got THAT right.
THE APOLLO PROGRAM WAS THE GREATEST TECHNICAL PROJECT IN HISTORY!
Thanks. I always pay attention to these questions you answered. Greatest set of answers ever. Thanks.
An iphone (full of tiny transistors the size of a few atoms across) exposed to cosmic radiation wouldn't function very well.
wait do you mean neutrons or neutrinos?
@@rbwannasee That is wrong. Neutrinos have no electric charge, therefore they don't spiral in magnetic fields. Neutrinos would hardly do anything to a transistor, since they are so sterile. Most of the neutrinos pass right through the entire Earth without collideing. It is protons end electrons which pose a threat to computers.
@@rbwannasee neutrinos are the hardest to detect particles out of regular radiation.
@@rbwannasee but the radiation part is correct. You need cube kilometers of matter to have a reasonable chance of detecting a couple neutrinos in a week.
LOL ever heard of shielding??? Even back then they shielded their computers...
because people have a hard time realizing that new computers are designed and made by old computers ... they're all extremely capable when you think about it - they did an excellent job working with what they had, which they thought was amazing at the time
*WHO WALKED ON THE MOON* • Neil Armstrong (1930-2012)-Apollo 11 • Edwin "Buzz" Aldrin (1930-)-Apollo 11 • Charles "Pete" Conrad (1930-1999)-Apollo 12 • Alan Bean (1932-2018)-Apollo 12 • Alan B. Shepard Jr. (1923-1998)-Apollo 14 • Edgar D. Mitchell (1930-2016)-Apollo 14 • David R. Scott (1932-)-Apollo 15 • James B. Irwin (1930-1991)-Apollo 15 • John W. Young (1930-2018)-Apollo 10 (orbital), Apollo 16 (landing) • Charles M. Duke (1935-)-Apollo 16 • Eugene Cernan (1934-2017)-Apollo 10 (orbital), Apollo 17 (landing) • Harrison H. Schmitt (1935-)-Apollo 17 *WHO ORBITED THE MOON WHILE THE OTHERS WALKED ON ITS SURFACE* • Frank Borman (1928-)-Apollo 8 • William A. Anders (1933-)-Apollo 8 • James A. Lovell Jr. (1928-)-Apollo 8, Apollo 13 • Thomas Stafford (1930-)-Apollo 10 • Michael Collins (1930-2021)-Apollo 11 • Richard F. Gordon Jr. (1929-2017)-Apollo 12 • Fred W. Haise Jr. (1933-)-Apollo 13 • John L. Swigert Jr. (1931-1982)-Apollo 13 • Stuart A. Roosa (1933-1994)-Apollo 14 • Alfred M. Worden (1932-2020)-Apollo 15 • Thomas K. Mattingly II (1936-)-Apollo 16 • Ronald E. Evans (1933-1990)-Apollo 17
I would trust in an ARM processor to bring me to the moon and back, but I would not trust in the OS.
What OS would that be? There is not just one flavor of operating system per processor type, you know. :-)
Oh, you are the clever guy in the village, are you?
Peter M No, just an old systems programmer who knows a thing or two about cpus and os'es. :-)
If you are so, I should not explain my initial comment.
Peter M My only issue with your initial comment is that the way you phrased it, you are implying that there is no operating system that can be trusted.
Fantastic summary story of one of the great mysteries of the space program, especially to someone who made a living programming general purpose computers!
I have to stop once in a while and appreciate the weight of the fact that this small piece of technology in my hand gives me access to mind-numbing volumes of information and knowledge at the flick of a finger. - something that would’ve been considered near magic just a few centuries ago much less in ancient times. That I can take images better than high end cameras a decade ago, view a map of the world in exquisite detail, order food or items from the other side of the world to my home, ask an artificial intelligence about almost anything, view the news in another country, translate languages, have an encrypted conversation with someone on the other side of the planet....
Contemplate that phone in your hand and marvel at the absolute power you possess, that ancient kings would’ve killed you for.
What your phone is actually doing is accessing more capable, application specific servers to provide said services. Your phone is not creating the maps, it's just downloading and displaying them.
You must be fun at parties
@@DasAntiNaziBroetchen Yeah, but its still doing all if it from the user side, we have made infrastructure to support this vast network that gives us this amout of information
@@cursedcliff7562 k
@@ArkamasRoss I don't go to them because people like you are there.
your presentations are so profesional without being boring! Thank you for such great videos
They never had to worry about the blue screen of death.
milkybar06 Ah the dreaded blue screen of death :)
But there was the really large black screen of death that they had to worry about (space).
TH-cam algorithm would have sent Apollo 1 to the sun through TH-cam recommendations
No problem. Just send them at night. - AOC
"How did the Apollo flight computers get men to the moon and back ?"
SPOILER ALERT:
They were connected to a rocket which flew them there.
No way...that's great...WE LANDED ON THE MOON!
Spoiler alert #2
We didn't go to the moon
Suuurrre they did.
I really want to know - why do you conspiracy adherents think that going to the moon was too hard for us to do? Why not just go to the moon? Why bother with such a ponderous cover-up instead of just going? (I'm asking respectfully because I want to know where you're coming from, not to start a flame war.) It's difficult to debate an issue when there is no clear position on the other side.
I think he's a guy who is fascinated by science, and I am fascinated by his videos. Again, why do you conspiracy adherents think that going to the moon was too hard for us to do? Why not just go to the moon? Why bother with such a ponderous cover-up instead of just going?
No two-minute waste of time getting to the point. Thank you for knowing how to present a video, most don't know how to do it.
Brilliant video thank you! Amazing what was achieved.
Du hast das gut gemacht. So klar, so einfach. Super!
Registers, nouns and verbs. It would be amazing to see what modern computers could really do if we weren't drowned in an avalanche of "user experience" bloat.
I have an inkling of it, I've used command-line linux using high end hardware. Even with the bloat, command line instructions just fly and so much is completed in an instant, something that would blow the minds of 60's computer technicians.
Excellent document. Many people think that the current computing power of their cell phones would get them to the moon. However, a high degree of integration density would not survive in a radiation environment - cell phones work well on Earth under a protective magnetic shield, not in space. The hardware was adapted to the environment and the software was written very cleverly with minimal memory requirements. Today it would not be possible, because libraries are used that are completely compiled into the program and algorithms are created automatically. All this results in a large capacity requirement and a high error rate.
Well said. Moon-hoax believers are completely ignorant of the concept of _flight-rated._