In former East Germany (GDR) we used Turbo Pascal (on machines using cloned Z80s) in the late 80's in school to learn programming. That felt like the real thing compared to BASIC. We even learned ASM.
12:54: I think most assembly programmers on the mainframe had that skill. At the start of my career in the 2000s, I was trained to work on that platform, and one of the things they taught us was to manually assemble programs and disassemble machine code (we weren't trained to become application programmers, but sometimes a vendor provided 'exits', hooks in Unix/Linux parlance, that could be programmed in assembler). It was a surprisingly straightforward process: the first byte of each instruction was the opcode, and all instructions had a specific format (register-register, register-storage, etc.) and length. Probably before the 80s they didn't have fancy tools like Abend-AID anyway, so reading a dump like an assembly listing must have been something you picked up quickly when you were serious about programming - though it doesn't make it any less impressive!
The most hardcore statement I have ever heard: if you cannot do it in Fortran, do it in assembly. If you cannot do it in assembly... IS IT WORTH DOING? What. Fear and admiration is all I feel.
Quiche eater: during the Anglo-French wars the English insisted on having meat in the front lines, which made logistics extremely difficult and soldiers often hungry. The French would instead use resources that were locally available, easy to transport, and stayed fresh longer: eggs, cheeses, spinach. French soldiers were well-fed and primed to fight, taking on some major decisive victories. In a 1983 book the expression was coined, based on anti-French stereotypes.
@@radosmirkovic8371 according to WP, he was born in northern Switzerland. So he is from the German-speaking part, which perhaps explains the good engineering in Pascal; and he may have been more of a Wurst eater than a quiche eater.
We had a practical paper on microprocessors, where we used a hex keyboard to program the 8085 in assembly. It is even harder than it seems. We had to dry run a program multiple times before typing it in, and double check every line while typing.
Same here, except it was 6800. I take it the dry run was paper and pen. As was converting the Assembly to HEX before getting to using the HEX keyboard.
That Red-Black Tree and PARC comment was fun. I thought "typewriter ribbons" and Wikipedia said I was close. Red was a "good looking" color in their printers. Anyone that did CMYK printing related jobs for a living can tell you that red is actually a very "friendly" color, with a somewhat generous range where it's not magenta anymore but hasn't gone orange yet. And making it darker is just tiny amounts of black mixed in, no need for any cyan in the mix. As for the article: fun read. It reminded me of The Jargon File, esp. the glossary, which I read and chuckled at a lot, way back in the 90's. In a way, it's a bit like the Fallout intro: "War. War never changes. (...) The details are trivial and pointless, the reasons, as always, purely human"
In the movie Ernest Goes to Camp, Ernest referenced the “Real Men Don’t Eat Quiche” book: “… men who had never tasted quiche” … “You couldn’t get quiche in ‘Nam”
11:10 That is FORTRAN 66! That is the language we used to learn algorithms applied to physics. In the 2010s...
The first 5 characters of a line (I think it is 5, not 7) were the label field, so they are not interpreted as code to be executed. Those were the labels you used for your GOTOs. E.g.:
  100 CONTINUE
      (do your stuff, and at some point break out of the loop if satisfied: GOTO 200)
      GOTO 100
  200 CONTINUE
GOTO 100 means: go to the line with the label 100, and the line with the label 100 just continues the program. This is a do-while-true loop, basically. But then to break out, you just go to the line with the label 200. Perfectly fine; Real Men do not need the convenience of something like a bounded loop.
Also, only the first 72 characters of a line were read; the rest were ignored. So many errors arose from people not using an editor with a vertical line at column 72, then not understanding why the variable real_ti shows up in the code: they cannot find it, because they only ever wrote real_time.
On top of that, in our version of FORTRAN only capital letters could be used, so it really looked like we were angrily programming. Additionally, variables starting with the letters I to N were automatically integers, and the rest were automatically reals. Now the first thing I do when starting any FORTRAN code is to write "implicit none". BUT this is the reason why ALL my loops in ANY language use the dummy variable i as the index. (Also, I never use j as an index, despite it being so common, because j in FORTRAN was used as the imaginary unit, like in electrical engineering.) So many memories!
The funny part? We used it because that is what my professor at the time knew how to use, and he passed around a USB stick with a cracked version of the FORTRAN compiler he knew. At the time it was a bit ridiculous to use such an old and clearly outdated language, especially after having taken an optional lecture in which we learnt C (not C++, but C; my professor was adamant about the distinction). Although most of my peers had never programmed before, so they were just happy to do something and they couldn't compare FORTRAN with anything. But almost 20 years later, I actually use FORTRAN at my work as a physics researcher, as all of our simulation programs are written in FORTRAN. So the very old, apparently useless language that I learned at university was in fact the perfect language for me to learn.
22:15 Unfortunately, the story is much simpler and more malicious. It boils down to money, like it always does. A 10k ft view (hehe) of the issue:
- Airbus announced a new line of planes which were much more efficient than the previous generation, which included all the available Boeing planes.
- Boeing rushed a refurbished old design as the base for the 737 MAX, changing the specifications which were considered when creating that design, but without modifying the design.
- The design was, well, not designed for the characteristics of the 737 MAX, so the plane had a problem with stalling if the angle of attack was too steep.
- Instead of a physical solution, which could mean months of delay or even needing a new design, they opted for the MCAS software, which used A SINGLE SENSOR (which, by the way, fails constantly), to determine whether the angle of attack was too steep, and then automatically and without warning applied a correction to the pilot's input, decreasing the angle of attack.
So far, this would have been just an annoying feature of a poorly designed plane. But here is where Boeing seriously screwed it up, maliciously, and I am amazed that tens of people are not in jail for this.
- In order to attract customers, Boeing promised that any pilots accredited to fly a 737 could also fly a 737 MAX without any further instruction beyond a 1 hour presentation explaining some differences. MCAS was NEVER explained to ANYONE outside of Boeing. There was apparently a mention in the manual, but not to explain the system, whether it could be overwritten or anything like that, just the word MCAS in the middle of a sentence.
- They did that because training pilots costs a lot of money and downtime.
- Flight 610 pilots do not understand why their plane keeps nose diving despite the pilots trying their best to push the nose up. They keep following the manual and none of the solutions work. There are no signals flashing, no damage to any parts that would cause this, it is not a problem with the hydraulics. They are completely dumbfounded.
- Flight 610 crashes, killing almost 200 people.
- When MCAS was discovered, Boeing's excuse as to why it is not explained or even mentioned anywhere in the training is that they considered it not vital for the plane operation. You know, the system that CANNOT be overwritten by any means that will nosedive your plane into the ground because one of the sensors that constantly fails has failed. Not vital for the plane operation.
All because Boeing did not want to lag behind Airbus, so they cut all the corners they could, and then they purposely and maliciously hid vital information so as not to discourage customers by mentioning that further training would be required. I consider it an act of m*rder and I think that the people who knew about all of this and who approved all of this should be in jail for at least reckless manslaughter.
The way I understand this is even worse. The planes did not have a problem with stalling (more than other planes, anyway), but the handling was different enough from the previous models that the MAX would have to be certified as a separate type, meaning that pilots would have to be trained as if for a completely different airplane (simulator training etc). By adding MCAS, Boeing managed to make the handling similar enough to the previous models, so that the pilots would only need the 1 hour training. The airplanes had two AoA sensors, but the software only used one at a time (every time the airplane was started it would switch the sensors it used). There was an indication in the cockpit that both AoA sensors are showing different values, but it was an optional feature and not all airlines paid for it.
Wrong, wrong, wrong. "Boeing rushed a refurbished old design as the base for the 737 MAX, changing the specifications which were considered when creating that design, but without modifying the design." What you referred to as Airbus' "new line of planes" was nothing more than "refurbished old designs". The A320 neo is simply a re-engined version of the original A320, which was developed 40 years ago. Updating an existing aircraft type is a common practice and is in many ways safer than developing a totally new type. "The design was, well, not design for the characteristics of the 737 MAX, so the plane had a problem with staling if the angle of attack was too steep." 😂😂 Please look up what a stall is. It is, by definition, caused by a high angle of attack. "Instead of a physical solution, which could mean months of delay or even needing a new design, they opted for the MCAS software, which used A SINGLE SENSOR (which, by the way, fails constantly), to determine whether the angle of attack was too steep, and then automatically and without warning applied a correction to the pilot's input, decreasing the angle of attack." Untrue. They spent months developing aerodynamic (what you call physical) solutions. These helped, but weren't enough. The purpose of MCAS was not to decrease the angle of attack. The aircraft had a slight tendency to pitch up under certain circumstances. MCAS was designed to compensate for that tendency to maintain the flight path commanded by the pilots. Also, alpha vanes (the sensors you refer to) do not fail often. In the case of Lion Air, the failure was caused by improper maintenance. The technician should have caught the problem through testing, but he deliberately neglected to perform the test. It turns out that he had been doing this for months. "MCAS was NEVER explained to ANYONE outside of Boeing." False. MCAS was developed in order to meet FAA requirements, and of course the FAA was fully aware of it when they granted certification. "Flight 610 pilots do not understand why their plane keeps nose diving despite the pilots trying their best to push the nose up. They keep following the manual and none of the solutions work. There are no signals flashing, no damage to any parts that would cause this, it is not a problem with the hydraulics. They are completely dumbfounded." Incorrect. They did not follow "the manual". They didn't follow any of the procedures they were trained for. If they had followed the unreliable airspeed procedure as required, MCAS could never have activated. Once it did activate, they should have performed the runaway stabilizer trim procedure, but they didn't. Both of these procedures were identical to the ones for the earlier 737 models, and the pilots should have practiced both in a simulator. Despite their failure to perform these procedures, the Captain had no trouble flying the aircraft for 10 minutes. It was only when he handed over control to the inexperienced First Officer that the situation got out of control. Rather than overriding MCAS as the Captain had, the FO allowed it to make large adjustments in the nose-down direction and he only made tiny nose-up adjustments. These tiny inputs weren't enough to get back to a correct trim setting, but they allowed MCAS to activate repeatedly. The Captain failed to monitor what the FO was doing and take back control. No-one at Boeing or the FAA had anticipated that pilots would behave this way since it goes against basic flying techniques. The aircraft must always be kept in trim. 
"When MCAS was discovered, Boeing's excuse as to why it is not explained or even mentioned anywhere in the training is that they considered it not vital for the plane operation. You know, the system that CANNOT be overwritten by any means that will nosedive your plane into the ground because one of the sensors that constantly fails has failed. Not vital for the plane operation." I believe that's a misrepresentation of what they said, but please show me the quote if I'm wrong. Pilots could easily override MCAS by using the trim switches or wheels. This is a basic and routine part of flying. And as mentioned above, MCAS would never have activated at all if the standard procedures had been followed.
Patching executables on the fly to remove some conditions inside loops was nice, but then someone invented the instruction prefetch queue and pipelining, and someone decided that on real operating systems executable memory pages are read-only. I played with it a bit as a kid; it was somewhat possible to guess the processor based on side effects, such as a patch only taking effect when it was done further than 10 instructions ahead or so.
This is from The Jargon File, if anyone is curious. It's an excellent repository of stories, slang terms, and ways of life from the tech world of old. It provides a glimpse into the culture. Particularly interesting is "The Lexicon."
Not long ago I read "The Soul of a New Machine" by Tracy Kidder, about Data General, published in 1981. Just like this article, it felt surprisingly similar to today, just with different jargon.
I cannot disagree more with the premise of this article. REAL programmers were just extremely proficient at the languages they were using, and not so much with newer languages. So changing paradigms required lots of learning and code porting. To say that the language itself is the reason for success is just wild. As is assuming that languages only ever change for the worse. Feel free to debug and code in hex; other people finished debugging using their IDE 5 hours ago and are now at home.
the first AI winter happened because perceptrons weren't being linked together to generalize functions (aka neural nets) and there wasn't enough compute iirc
Interesting, the height of quiche eater environments, Smalltalk, was developed by guys who previously worked for DARPA developing ARPANET, which probably qualifies as something Real Programmers do.
In the mid-2000s, VNC was a cyclic discovery. Every couple of years some new people would suddenly discover VNC as they got better at programming, and shout out their discovery to mailing lists. Lol. Discovering VNC was such an accurate gauge/milestone for knowing people's skill level lol
18:12 So let me get this straight - You would rather patch the binary?!!! And not keep any source around at all?!!! And not because the compilation is slow or anything, but rather because your text editor is so bad it is easy to break your program by editing it?!!! WTF is going on?!!! I thought I knew programming, but I realize I knew nothing at all. I am NOT a Real Programmer. Quiche eater over here... Holy fu--, the amount of times during this article I just had to stop and either laugh out loud or scream "WHY WOULD YOU EVER THINK OF DOING THAT" to the high heavens is uncountable.
I'm a professional software engineer, mainly using C++ and C#... But I do love and use Pascal! And I'm glad it was revived with Free Pascal and Lazarus. Pascal is, in my opinion, still the best language to learn how to write software! It gives you everything used in "industry languages", but in a safe, verbose environment. And it compiles very fast, which is nice if you have to try things out.
I just read Dijkstra's green language analysis. It's interesting that, without the context of Ada's packages and privacy features, someone so great could be so wrong. He also doesn't get how types protect the user and reduce the need for branching checks. Ada was so far ahead of OOP. I actually think it hits a sweet spot without using any inheritance, which was only added in 95. Procedural with privacy, highly modular packages, type design and strong typing.
Real men eat quiche. The whole "real men" thing in programming was a sort of parody that is probably mostly forgotten now, which was a staple of a "holy war" type discussion. It itself was a twist on a book called "Real Men Don't Eat Quiche".
8:20 Whenever my program gets a little more complicated, I use a lot of goto anyway. In Rust I can have named blocks now, and then either break from them (a goto to after the block) or continue them (if they are declared as loop, so a goto to the beginning of the block). And loops shouldn't loop by default. Most of my loops have a break at the end anyway. They usually look somewhat like this:
  for (;;) {
      if (...) continue;
      ...
      if (...) continue;
      ...
      break;
  }
This would be written more easily with goto:
  loop:
      if (...) goto loop;
      ...
      if (...) goto loop;
      ...
And the break at the end isn't even necessary.
Ha ha. I took Pascal in hs back in the early 90’s after learning BASIC on my Commodore64 when I was 8. Went to school for IT. Then went on to web development later and then product management. My step dad was the project manager at Lockheed that led the software development dept to get the 4 different computer systems on the space shuttle to work back in the 70s. He’s the one that bought me my first computer when I was 8 and showed me BASIC.
At school in the very early 80s we wrote Fortran 4 programs using coding sheets for Computer Science class. Was interesting. Way more interesting than the COBOL we also did. Also on coding sheets.
I was learning Pascal about then, much better than Fortran, and ABEND is pronounced Ab-End: abnormal end, or raise exception. We used Sheffield Pascal on a Prime minicomputer at uni. We also used Object Pascal when object orientated programming came about. I wrote a collector and report generator in Pascal; the collector logged output from an SX2000 port concentrator and reported on usage. Mostly I used Prime's PLP and SPL, which were PL/1-like, and you could declare quite sophisticated structures.
Well, I saw Adam Dunkels write a demo effect, a fire effect burning on the screen, on his c64 in HEX. It worked fine. He did it because he couldn't find the disk he saved it on so he just rewrote it on the fly to show us what it looked like.
I remember the Quiche Eater flame. FORTRAN had a stranglehold on anything involving fast calculation and memory efficiency. UI and GUI didn't exist back then. The CRAY supercomputer was benchmarked against FORTRAN subroutines used at Los Alamos and Lawrence Livermore Labs (LINPACK). But today's computers have so many more processors, this no longer makes sense. I learned some Assembly and FORTRAN in school in the 80's, but also took a Pascal class. I mostly tinkered in Basic and TurboBasic.
And the tooling is surprisingly modern, it is a joy to use and still as fast as ever. It's underrated, if you ask me, because people would rather circle jerk with libraries that are essentially Fortran bindings (aka everything on Python)
SPZAP was a real thing. It was often used to patch programs... actually, if you work on an IBM mainframe it still is sometimes a real thing, especially on z/OS.
Xerox and Bell Labs had lots of research come out because of tax policy. Corporate taxes on profits were really high. Companies could either reduce prices to avoid large tax bills or reinvest with things like research. Xerox and Bell Labs did the latter.
Interestingly, in modern versions of Pascal (aka Delphi) you can actually turn on constant assignment, i.e. if you declare a constant you can actually change it... This is complete madness of course, and is done for backward compatibility. So you could in theory have a constant limit for some catastrophic event and keep changing it in code.
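From memory it looks something like this; I'm recalling the directive name, so treat it as an assumption and double-check against the docs:
    program ConstMadness;
    {$J+}   // writeable typed constants on; same idea as {$WRITEABLECONST ON}
    const
      Limit: Integer = 100;   // a *typed* constant, so under $J+ it is really a static variable
    begin
      Limit := 200;           // compiles and runs; the "constant" just changed
      Writeln(Limit);         // prints 200
    end.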
I got paid to write pascal back in the early 90s. We were developing software for doctor's surgeries. This was before source control was a thing. But you know what, we managed just fine.
It's crazy how programmer sense of humor hasn't changed that much
Not really, they are all basically cut from the same cloth. They're problem solvers, and they have a subversive understanding of government and society because they are literally writing the code that runs the world, or could end it. Having a sense of humor about it is how you cope with that hanging over your head. Prime probably doesn't think about the millions of dollars and the salaries and wages tied to the success or failure of his work. But you can bet that his work ethic and drive to learn and be better is why Netflix snatched him up despite his "Quiche Eating-ness" lol.
People are always looking for ways to feel superior to each other.
It's because we don't go outside 😂😢😂😂
@@benanderson6797 😂 and that's true man
it actually just means that their problems with societal standing haven't changed, since humor is a coping mechanism.
11:55 "Unix is a glorified videogame"
I'll write that quote down and repeat it to myself every time I fail to do something on my Linux machine.
You fail at your Linux machine? I don't even run Linux, I run the Xen Hypervisor as the real machine, because I am a real programmer; I don't need virtual memory when I have a 3-step-address paging CPU.
Linux is not UNIX though.
@@daniel29263 Close enough, Unix is the video game and Linux is the pirated copy of that.
I wonder if people used these expensive machines to play MUDs via telnet ... yes wiki mentions earliest dungeon game in 1975 and first MUD in 1978.
@@daniel29263 he's not a real programmer
Long time C and Pascal developer here (back in the day, these days I use fancy curly-brace languages).
I started in industry in '91, and this article represents what was kind of a jokey but real attitude that I myself came across, even then.
To this day, I tell my work colleagues that there's nothing new under the sun, and yes a lot of what I see and hear these days is same as it's always been, only the names are changed.
Great video, gave me some lol moments. I'm a self-diagnosed quiche eater, and I have been for over 30 years.
I was a young hobbyist in the heyday of 68k home machines and remember a lot of Pascal diehards having similar attitudes towards C and then C++ as this article has about Pascal. By the time I started in industry, a little over a decade after you, it was C/C++ people coming out with similar stuff about Java. But even today C, C++, Java, and even Pascal still have their users who do magic with them while other people are laughing at coders of newer languages from their .Net shaped thrones...
That's what I'm saying to everyone who tells me that real programmers don't exclusively use an iPad to make their apps with no-code frameworks and ChatGPT
My dad is a real programmer from this era, he turned 70 this year. I showed him this, and he agreed with everything in the article and had a good chuckle.
And yes, he regularly had to fix programs directly in binary with no source code (or worse, source code that was very out of date and worthless - it was worse because you wasted a lot of time fixing it and waiting for it to compile, only to have to do the binary modification anyway later).
It was common that if you were unsure about the state of the source code (meaning you have never compiled the program from source yourself and deployed that to production), to just do the direct binary modification even if source code is supposedly available - because you have less downtime that way, and unverified source code can't be trusted.
Technically, he’d have been using assembly rather than dealing with machine code (or binary) directly. While assembly is just one step removed from machine code, looking at mnemonics and arguments is a hell of a lot easier than staring at a bunch of hex or binary and having to do the translation in your head. Of course, if the program you’re dealing with doesn’t include debugging symbols, it can be a lot of “fun” working out more complex logic from just the assembly, as even the concept of a “variable” doesn’t exist in assembly, just registers or memory locations.
But consider why he had to do that: because programmers were lazy and didn't follow good source control habits. Now we save ourselves from that mess
@@keoghanwhimsically2268 If you know the instruction set well, it's simple enough (no x86 instruction encoding crap), and you're familiar with the software, then it's not impossible at all, or even that hard, to work in machine code. I'm only a hobbyist with 6502 stuff for instance, but I know most of the opcodes by heart already.
@@Syksyisin I’m not sure what you’re referring to. If you’re talking about 6502 OP codes like LDA (load-accumulator), JSR (jump-to-subroutine), etc., that’s assembly. The machine code would just be the binary or hex. Now, you may indeed be one of those who does all the binary translation in your head without resorting to the assembly instruction set or argument syntax, but I’ve never heard of an actual programmer finding that a productive thing to do. (I originally learned to program in 6502 assembly 40-odd years ago. Apple IIs came with a basic assembler built-in.)
Now, there are also macro assemblers that allow you to define reusable macros similar to C, as well as extensions to basic assembly that give you primitives to define variables, loops, etc., if that’s what you’re referring to. These are not basic assembly, however, and most of this information would be lost in the assembly process anyway.
@@keoghanwhimsically2268 it was a common skill for system programmers in the mainframe days. Nowadays it's of course different and not really needed, but it's not magic or even particularly difficult.
My dad was a firmware programmer in the '80s and '90s. He would sometimes take home 6 inch thick printouts of hex and read through it. Every once in a while he would circle stuff in it.
And that is why I never wanted to be a programmer as a kid. It wasn't until I took a cs101 class in college that I realized it could be different.
I remember that article; it came out 4 years into my CS studies at Edinburgh University, who had just changed to using Pascal in 1980 for its undergraduate teaching, not a unanimously welcomed decision! As I obviously identify as a "real programmer" (substitute C, or similar systems level language, for Fortran), I've long questioned my fondness for quiche ever since. To this day, I still have anxiety when buying a quiche at the supermarket.
I think just in general it's never bad to have a good portion of anxiety when buying ready-to-eat quiche from the supermarket! 😂
@@UniquePanda0 OK, valid point, decisions may be swayed by many sources of anxiety, and in recent decades, the issues beyond real programming, which you may be alluding to, weigh much more heavily on me, and are the true reason I don't buy many processed food items. Unfortunately, they are not as funny as the real programming anxieties.
Yeah, I would be concerned about food poisoning due to how long that quiche was sitting around. They would either have to load up the shelf stabilizers and thus make it taste like salty cardboard (yuck) or make it fresh daily and it would only be good for maybe an hour at room temp. A supermarket is most definitely the wrong place to buy a quiche.
54 YO South African here. I recall that enjoying quiche was considered unmanly by supposed alphas in the late 80s (when I was in my late teens) and 90s. Accusing a man of liking quiche was a fairly common way of calling them a soy boy back then. "Real men" preferred that every meal be large quantities of meat, preferably as bloody as possible, with vegetables in the margins.
Exactly like today, with the carnivore diet being endorsed by today's "alpha" men
That's so stupid lmao
this is literally the predecessor for soy boy 💀
soy boys are also social justice warriors
There was a saying that "Real men don't eat quiche".
Summary: Real Programmers make sure nobody else can manage the code they wrote so you can never fire them.
Wish programmers had remembered that before they invented AI LLMs lol.
Also - 12 year old kids apparently were blowing real men back then
Old timer here, just old enough to remember entering programs on front panels (never personally, but definitely witnessed).
The hardware in question had 10 toggles on the front panel. Bits 0-7 were the bits of the byte being input, bit 8 was a parity bit, and the 9th toggle committed the value to memory and incremented the write address.
So, rather than programming in hex, this device was programmed directly in binary.
Did a course in 1985 where we learned about microprocessor architecture. We wrote programs by entering them in binary this same way. The output for the program was 8 LEDs that you could control.
It comes from a satirical book called "Real Men Don't Eat Quiche" 🤣🤣🤣
I doubt that. The "Real Programmers..." satire post may have been inspired by that book, of course.
And this is also satire. Not meant to be taken too seriously. Unfortunately, I'd estimate that 90% of viewers here have no idea about most of the stuff mentioned. When I read it for the first time (1989-90?) I at least had read about most of it, and got maybe 80% of the jokes... Hell, I feel old now... maybe I should power up my DEC PRO350 and do some coding in MACRO-11 and Pascal...
The post is a parody of the book. It even has its own Wikipedia entry if you search for “Real Programmers Don't Use Pascal” 😂
Oh god I'm so old that I even remember that. Prime isn't _that much_ younger than me; the fact he has no idea and has to ask makes me feel old, so so old
I think he means "Quiche Eater"
@@snorman1911 So what? People who eat Quiche are Quiche Eaters, Quiche Eater!
Does nobody remember the book "Real Men Don't Eat Quiche"? I feel old. Time to learn a new language.
I remember
Well, it *is* a 1982 book that needed to be imported (as far as I can find) for anyone not in the USA (and later Australia).
Q lang would be funny
Born around 1982, I've only had quiche for breakfast-lunch-dinner: Pascal+Assembly+FoxPro and then C for 10 years ... :)
How I laugh today at the bootcamp tart-eaters doing JS on the server and calling it programming!
Oh how the times have changed, lol
The article is not talking about DO...WHILE when it says "DO loop". A DO loop in Fortran is a FOR loop in other languages (and not a FOR...IN loop either; it's FOR x = start TO stop in BASIC or for(init; test; step) in C/C++).
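In other words, it's just a counted loop. A quick sketch of the equivalent, in Pascal only because that's the article's language, with the Fortran form in the comment:
    program DoLoopEquivalent;
    var
      i, total: Integer;
    begin
      { Fortran:   DO 10 I = 1, 10   ...body...   10 CONTINUE }
      total := 0;
      for i := 1 to 10 do        { counted: runs exactly 10 times }
        total := total + i;
      Writeln(total);            { 55 }
    end.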
22:10 It's very ironic that Voyager was kept alive by the most "quiche eater" way possible by debugging and livepatching LISP and Forth code on the probe itself.
There you go! Someone mentioned Forth at last!
Imagine if this gigachad saw the state of web dev today.
Niklaus Wirth passed away in January this year.
Is there a channel that explores "sh*t post" articles from this distant past? Something from the 70s, "The Future is LISP" type of thing. Does it exist?
C is just LISP but the parens is replaced with braces and the identifier is put outside instead of inside.
(Tsoding showed chat this in one of his streams, and it's blown my mind ever since)
@@oserodal2702 Isn't LISP's macro system way more sophisticated and flexible than C's, though?
@@luke_fabis Yeah, a stupid joke made in five minutes is not gonna be funny once you analyse into it, durr.
@@oserodal2702 You made a joke?
given that javascript started out as a lisp (but had to change syntax bc of java hype), and in many ways still resembles lisp, they weren't entirely wrong.
Turbo Pascal and later Delphi was my favourite language I grew up with in the 90s 😊
You can still try Free Pascal :)
Delphi; I bloody _loved_ Delphi. I finally parted company with it around the 'Seattle' days, but really the last version I was using as an all-day, every-day tool was probably 2007 or XE. Fabulous tool, you can be insanely productive in Delphi.
@@RobUttley there is an open source version called 'Lazarus'. works fine. :)
@@RobUttley I assume that Total Commander and FastStone Image Viewer are written in Delphi. Not the best language I guess, but close to C(++), and it was way more productive than using the WinAPI (unless you start mixing WinAPI with VCL)
I program professionally today in Delphi. We just got multiline string literals!! Oh, and when inline variables were introduced, that was a game changer 😂
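Roughly like this, if I have the syntax right; inline vars are 10.3+ and the triple-quote multiline literals are much newer, so take the details as approximate:
    program NewDelphiToys;
    begin
      var Count := 3;                 // inline variable with an inferred type
      var Banner := '''
        Multiline string literals,
        finally!
        ''';
      Writeln(Banner);
      Writeln('Count = ', Count);
    end.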
It's always interesting to see someone refer to "the good ol days" in an article written during a period one might nowadays consider to be "the good ol days".
Real programmers use Turbo Pascal. I wrote a 3D animation tool that worked with Povray in Turbo Pascal. Ah the good old days when we were inventing all the abbreviations to save bandwidth on our BBSes. Hmm that reminds me I may or may not have written a trojan horse to obtain all the usernames and passwords from Renegade BBS software.
Turbo Pascal 5.5 at home, and at work it was Turbo Pascal 7. Legendary days!
Turbo Pascal was the GOAT for sure.
Oh my, POV-Ray, Turbo Pascal and BBS's in the same sentence, flashback to happy times. GW-Basic and EDLIN not so happy, but whatever... But the article is wrong anyway. We all know true wizards code by bit banging on two wires. No other way to get SeriousWorkTM done.
This essay predates Turbo Pascal by a few months. Turbo reintroduced things into Pascal, like pointers, that made the language fun again.
@@bitwize that’s surely true. With Turbo Pascal you could do everything that you could do with C, only the compiler was much faster and the string handling was much better. Never understood why a programming language like C, with such shitty string handling, could be so popular.
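The contrast is easy to show: a Pascal string knows its own length and concatenation just works, no buffers or strcat to babysit. A tiny sketch:
    program StringDemo;
    var
      Name, Greeting: string;
    begin
      Name := 'world';
      Greeting := 'Hello, ' + Name + '!';    { length tracked by the runtime }
      Writeln(Greeting, '  (length ', Length(Greeting), ')');
    end.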
10 years later I will also say... back then I used to handcraft CSS, not use Tailwind like the other almond milk drinkers
Almond milk drinkers? Excuse me? What's that supposed to mean?
@@chazzer5968 nothing specific just some random food I could think of ... maybe flexing I am not lactose intolerant
@@chazzer5968 it's a riff on the "soyboy" insult that's weirdly popular in programming communities. it originates from a weird myth in "alpha male" type spaces that soybeans contain estrogen and make men feminine and gay
@@somenameidk5278 Oh dear, some things never die, do they.
Ah, TECO. (Pronounced with a long E, BTW.) The comparison to line noise was pretty accurate. But not only was TECO an editor, but it was also a programming language. One of my best hacks was using TECO to write a UUCP program so my roommate could transfer files between his S100-based system (running CP/M) and the school's VAX over a 300 baud modem.
UUCP, S100, CP/M, VAX, and 300 baud modems. Yeah, I'm that old.
13:28 yawning is also an indicator of your brain needing more oxygen. I remember seeing a video talking about how modern housing is bad at circulating air and refreshing the oxygen supply. so if anything, maybe pop open a window if you find yourself yawning too much.
I know someone who uses Next.js and he yawns all the time, so you may be onto something here.
I remember reading this article when it was published. I wrote text manipulation software in FORTRAN (first generation page layout software with proportional fonts; back then most computer generated text was mono-spaced). I was already an "expert" at using TECO (which stands for Text Editor and COrrector, and yes, TECO commands looked like modem line noise). And I tried to introduce RATFOR to the programming team I worked with, unsuccessfully. Good Times.
And the SAT answer: RATFOR is to FORTRAN as cfront (c++) is to C.
jesus christ this was AMAZING. history does in fact repeat itself
The takeaway is that history never repeats itself, but sometimes rhymes
This article is older than me by three months. I learned Pascal when I was 15 or so. And at 16 I had it in high school as my main programming language in CS. I don't use it anymore obviously. But I owe it so much in my growth as a dev.
I love articles like this, to learn not only the history of my field but also how old-school programmers were thinking and arguing about this stuff.
Thank you for bringing it to me!
I am old enough to understand all the jokes. I started programming in the mid-1980s, so I remember CP/M on "microcomputers" and JCL and OS/360 on mainframes. I even taught myself enough Fortran back then to understand the Fortran jokes in this article. And I remember quiche: "Real men don't eat quiche" was an actual saying back then.
When we got to assembly in uni, in the first 10 or so lines of the textbook, the teacher had written something like "There's no real need to learn or understand assembly anymore...", which is a real Quiche eater statement. Worst part is that I was such a baby dev then, that I completely agreed. 3 years can really change a man for the better.
This is such a great piece of humour. And I lived through a bit of this time, looking at core dumps and CICS logs and correcting hack sysops OH into 0H to make their patches work.
"Quiche Eater" comes from a humor book from the late-70s/early-80s titled "Real Men Don't Eat Quiche" which was exactly as funny as it sounds. That said, Pascal was a terrible language to get actual work done in, but it was never designed to be a production language: it was designed to as a teaching language for structured programming concepts (the big paradigm shift before OO which was to kill spaghetti code and GOTOs).
Apple kept Pascal as its official developer language until Mac OS X changed everything.
@@elmersbalm5219 and in order to do that they had to have many extensions to the language, up to and including object orientation. Standard Pascal was barely useable outside of the educational setting, lacking support for many features that were commonplace in other languages of the time, like separate compilation or bit manipulation (to name two that stick out in my memory). The reason that it stuck around as long is it did with Apple was that the operating system APIs had been designed in 1983, and used Pascal conventions that could not be altered without breaking existing code. Apple then also had two decades of tooling built up around that base that would also have been fiendishly expensive to replace (MPW, Inside Macintosh, and tons of tech notes), so it was just easier and cheaper to keep the existing toolset, write a few helper functions for the people using C and C++, and muddle through, until Steve returned with a wrecking ball labeled "NeXTstep/OPENstep."
"Terrible language to get real work done."
Because you have to type := or because you need more modules to do the work for you?
Maybe Python is up your alley.
All joking aside - some of this comes from the fear of being replaced (or, for non-programmers, the hope of replacing programmers), and that was always there, as things keep evolving.
My friend's dad believed that in just a few years, tools like Excel would eliminate the need for programming. It was the 1990s, and I was a teenager learning C++.
FORTRAN 66 fun fact: since a certain machine packed six 6-bit characters into a 36-bit word, there was no lower-case or double-quote character. String literals (Hollerith constants) were specified by the digits of the length, the letter H, then the content of the string: 7HFORTRAN
Pascal might not be cool right now, but it's not dead.
My current programming project is reverse-engineering and recreating a preprocessor in a 90s SDK that generated C code from class definition files. I wanted it to be a drop-in replacement for the original DOS tool, plus run on modern Linux, Windows, etc. Free Pascal lets me do that with barely any changes to the code - develop and test on Linux, cross-compile and test in DOSBox.
And I've enjoyed my time with Pascal! I feel like I can be productive with the language, prototyping at a high level, but dipping into lower-level coding when I want to. If I ever get around to rewriting the rest of the tools in the SDK, I'll probably stick with Pascal.
And yes, the preprocessor works. I've just got one last bit to do (a subset of a C compiler to convert structs to TASM code). I should probably write some proper unit tests, too, but Real Programmers don't write automated tests, amirite?
One of the main community-made modding tools for Bethesda games, xEdit, is written in Pascal. It's basically essential if you wanna install many mods and keep your game running correctly.
One of its more powerful functions is that it can run Pascal scripts to modify data, so lots of mod authors and power users end up learning some Pascal, or at least use some of the scripts that are already included.
My first language was Pascal, so I guess I was doomed from the beginning. I don’t even like quiche.
It's crazy how the Xerox brand has changed, learning about networks and computers back in the day you'd think they'd be one of the biggest companies in the world.
Thank you so much. I learned Pascal when I first learned to code 30 years ago. I was 16... a "quiche eater" from the gate
IF - zero or one time
WHILE..DO - zero or more times
DO..WHILE (REPEAT..UNTIL) - one or more times
FOR - n times
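A minimal Pascal sketch of those four forms (just an illustration, assuming any modern-ish Pascal such as Free Pascal; LoopForms, n, and i are placeholder names):
program LoopForms;
var
  i, n: Integer;
begin
  n := 3;
  if n > 0 then                     { IF: runs zero or one time }
    writeln('if body');
  i := 0;
  while i < n do begin              { WHILE..DO: runs zero or more times }
    writeln('while body ', i);
    i := i + 1
  end;
  i := 0;
  repeat                            { REPEAT..UNTIL: runs one or more times }
    writeln('repeat body ', i);
    i := i + 1
  until i >= n;
  for i := 1 to n do                { FOR: runs exactly n times }
    writeln('for body ', i)
end.
The REPEAT..UNTIL body always runs at least once because its condition is only checked at the bottom, which is exactly the "one or more times" case above.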
The IBM 360 had a big red emergency stop button on the front panel.
The operators were playing indoor cricket and someone hit the ball into the emergency stop. Red faces all around.
At my first job programming IBM System 38s and AS/400s in 1988-90 in RPG III (yes, kill me now), our entire server room had a big red button -- one that was actually protected by a piece of hard transparent plastic. Some sales guy comes in there before lunch one day and falls in just such a way that he tries to catch himself on the button, ripping off the guard and shutting down a System 38 and a couple of AS/400s running production. We all got to go home because the database integrity check took 10 hours after a stunt like that. Good times
Not having documentation = job security.
In 1982 I wrote a C compiler in Pascal 6000 for the NOS 1.4 OS on a CDC 6600. I also wrote an 8086 backend that cross-compiled. At some point I also used Pascal on a SAGE IV, which was a Pascal-based system.
Ahh, Pascal. The first language I studied in high school before going for web development and later on C in college
Real programmers program in FORTRAN. (C++ and Assembly language are somewhat acceptable. Use awk to process CSV files, not Python.)
why
@JimAllen-Persona well, guess what, Fortran has had OO for quite some time now. Cobol too, while we're at it
In former East Germany (GDR) we used Turbo Pascal (on machines using cloned Z80s) in the late 80's in school to learn programming. That felt like the real thing compared to BASIC. We even learned ASM.
12:54: I think most assembly programmers on the mainframe had that skill. At the start of my career in the 2000s, I was trained to work on that platform and one of the things they taught us was to manually assemble programs and disassemble machine code (we weren't trained to become application programmers, but sometimes a vendor provided 'exits', hooks in Unix/Linux parlance, that could be programmed in assembler). It was a surprisingly straightforward process: the first byte of each instruction was the opcode, and all instructions had a specific format (register-register, register-storage, etc.) and length. Probably before the 80s they didn't have fancy tools like Abend-AID anyway, so reading a dump like an assembly listing must have been something you picked up quickly when you were serious about programming - though it doesn't make it any less impressive!
My old man learned fortran, I do not come from a line of quiche eaters, thank god.
The most hardcore statement I have ever heard: if you cannot do it in FORTRAN, do it in assembly. If you cannot do it in assembly, IS IT WORTH DOING? What. Fear and admiration is all I feel.
If you can't do it in assembly, can it even be done?
Quiche eater: during the Anglo-French wars the English insisted on having meat in the front lines, which made logistics extremely difficult and left soldiers often hungry. The French would instead use resources that were locally available, easy to transport, and stayed fresh longer: eggs, cheeses, spinach. French soldiers were well-fed and primed to fight, winning some major decisive victories.
In a 1983 book the expression was coined, based on anti-French stereotypes.
I knew quiche ("keesh") is French. Btw Niklaus Wirth was Swiss, and I think of French origins.
@@radosmirkovic8371 according to WP, he was born in northern Switzerland. So he is from the German-speaking part, which perhaps explains the good engineering in Pascal; and he may have been more of a Wurst eater than a quiche eater.
@@lhpl I didn't know that. Although I am not a Pascal developer, I have high respect for the late Prof. Wirth.
Excuse me sir, I would like some more Soylent with my quiche please
We had a practical paper on microprocessors, where we used a hex keyboard to program an 8085 in assembly. It is even harder than it seems. We had to dry-run a program multiple times before typing it in, and double-check every line during typing.
Same here, except it was a 6800. I take it the dry run was paper and pen. As was converting the assembly to hex before getting to the hex keyboard.
FYI - TECO
en.wikipedia.org/wiki/TECO_(text_editor)
Something from DEC, unlike all the other IBM-inspired name drops in the article.
That red-black tree and PARC comment was fun. I thought "typewriter ribbons" and Wikipedia said I was close. Red was a "good looking" color in their printers. Anyone that did CMYK printing-related jobs for a living can tell you that red is actually a very "friendly" color with a somewhat generous range where it's not magenta anymore but hasn't gone orange yet. And making it darker just takes tiny amounts of black mixed in, no need for any cyan in the mix.
As for the article: fun read. It reminded me of The Jargon File, especially the glossary, which I read and chuckled at a lot way back in the 90s. In a way, it's a bit like the Fallout intro:
"War. War never changes. (...) The details are trivial and pointless, the reasons, as always, purely human"
Pascal is pretty good, there's even a cross-platform 2D+3D game engine written in it, Castle Engine
maybe the real quiche eaters are the friends we made along the way
In the movie Ernest Goes to Camp, Ernest referenced the “Real Men Don’t Eat Quiche” book.
“… men who had never tasted quiche” … “You couldn’t get quiche in ‘Nam”
11:10 That is FORTRAN 66! That is the language that we used to learn algorithms applied to physics. In the 2010s...
The first 5 columns (I think it is 5, not 7) were the label field of the line, so they are not interpreted as code to be executed.
Those were the labels that you use for your GOTO. E.g.:
100 CONTINUE
C    (do your stuff, and at some point break the loop if satisfied)
     IF (SATISFIED) GOTO 200
     GOTO 100
200 CONTINUE
This means: Go to the line with the label 100. And the line with the label 100 just continues the program. This is a do while True loop, basically. But then to break, you just go to the line with the label 200.
Perfectly fine, Real Men do not need the convenience of something like a bound loop.
Also, only the first 72 characters of a line were read; the rest were ignored. So many errors arose from people not using an editor with a vertical line at column 72, then not understanding why the code was suddenly using a variable real_ti that they could not find anywhere; they had only ever written real_time. Oh...
On top of that, in our version of FORTRAN, only capital letters could be used, so it really looked like we were angrily programming.
Additionally, variables starting with the letters I to N were automatically typed as integers; the rest were automatically reals.
Now the first thing I do when starting any FORTRAN code is to write "IMPLICIT NONE".
BUT this is the reason why ALL my loops in ANY language use the dummy variable i as their index.
(Also, I never use j as an index, despite it being so common, because j in FORTRAN was used as the imaginary unit, like in electrical engineering).
So many memories!
The funny part? We used it because that is what my professor at the time knew how to use, and he passed around a USB stick with a cracked version of the FORTRAN compiler that he knew. At the time it was a bit ridiculous to use such an old and clearly outdated language, especially after having taken an optional lecture in which we learnt C (not C++, but C; my professor was adamant about the distinction). Although most of my peers had never programmed before, so they were just happy to do something and couldn't compare FORTRAN with anything.
But almost 20 years later, I actually use FORTRAN at my work as a physics researcher, as all of our simulation programs are written in FORTRAN. So the very old, apparently useless language that I learned at university was in fact the perfect language for me to learn.
Vsauce "juvenoia" explores this going all the way back to Plato, highly recommended
Damn Pascal was what we learned in high school. I am a quiche eater.
Ahh.. thank you, this article is where I got it from and I was always misquoting it "you can write fortran in any language"
22:15
Unfortunately, the story is much simpler and more malicious. It boils down to money, like it always does. A 10k ft view (hehe) of the issue:
- Airbus announced a new line of planes which were much more efficient than the previous generation, which included all the available Boeing planes.
- Boeing rushed a refurbished old design as the base for the 737 MAX, changing the specifications which were considered when creating that design, but without modifying the design.
- The design was, well, not designed for the characteristics of the 737 MAX, so the plane had a problem with stalling if the angle of attack was too steep.
- Instead of a physical solution, which could mean months of delay or even needing a new design, they opted for the MCAS software, which used A SINGLE SENSOR (which, by the way, fails constantly), to determine whether the angle of attack was too steep, and then automatically and without warning applied a correction to the pilot's input, decreasing the angle of attack.
So far, this would have been just an annoying feature of a poorly designed plane. But here is where Boeing seriously screwed up, maliciously, and I am amazed that tens of people are not in jail for this.
- In order to attract customers, Boeing promised that any pilots accredited to fly a 737 could also fly a 737 MAX without any further instructions beyond a 1 hour presentation explaining some differences. MCAS was NEVER explained to ANYONE outside of Boeing. There was apparently a mention in the manual but not to explain the system, whether it could be overwritten or anything like that, just the word MCAS in the middle of a sentence.
- They did that because training pilots cost a lot of money and downtime.
- Flight 610 pilots do not understand why their plane keeps nose diving despite the pilots trying their best to push the nose up. They keep following the manual and none of the solutions work. There are no signals flashing, no damage to any parts that would cause this, it is not a problem with the hydraulics. They are completely dumbfounded.
- Flight 610 crashes, killing almost 200 people.
- When MCAS was discovered, Boeing's excuse as to why it is not explained or even mentioned anywhere in the training is that they considered it not vital for the plane operation. You know, the system that CANNOT be overwritten by any means that will nosedive your plane into the ground because one of the sensors that constantly fails has failed. Not vital for the plane operation.
All because Boeing did not want to lag behind Airbus, so they cut all the corners they could, and then they purposely and maliciously hid vital information to not discourage customers by mentioning that further training would be required.
I consider it an act of m*rder and I think that the people who knew about all of this and who approved all of this should be in jail for at least reckless manslaughter.
The way I understand this is even worse.
The planes did not have a problem with stalling (more than other planes, anyway), but the handling was different enough from the previous models that the MAX would have to be certified as a separate type, meaning that pilots would have to be trained as if for a completely different airplane (simulator training etc).
By adding MCAS, Boeing managed to make the handling similar enough to the previous models, so that the pilots would only need the 1 hour training.
The airplanes had two AoA sensors, but the software only used one at a time (every time the airplane was started it would switch the sensors it used).
There was an indication in the cockpit that both AoA sensors are showing different values, but it was an optional feature and not all airlines paid for it.
Wrong, wrong, wrong.
"Boeing rushed a refurbished old design as the base for the 737 MAX, changing the specifications which were considered when creating that design, but without modifying the design."
What you referred to as Airbus' "new line of planes" was nothing more than "refurbished old designs". The A320 neo is simply a re-engined version of the original A320, which was developed 40 years ago. Updating an existing aircraft type is a common practice and is in many ways safer than developing a totally new type.
"The design was, well, not design for the characteristics of the 737 MAX, so the plane had a problem with staling if the angle of attack was too steep."
😂😂 Please look up what a stall is. It is, by definition, caused by a high angle of attack.
"Instead of a physical solution, which could mean months of delay or even needing a new design, they opted for the MCAS software, which used A SINGLE SENSOR (which, by the way, fails constantly), to determine whether the angle of attack was too steep, and then automatically and without warning applied a correction to the pilot's input, decreasing the angle of attack."
Untrue. They spent months developing aerodynamic (what you call physical) solutions. These helped, but weren't enough. The purpose of MCAS was not to decrease the angle of attack. The aircraft had a slight tendency to pitch up under certain circumstances. MCAS was designed to compensate for that tendency to maintain the flight path commanded by the pilots. Also, alpha vanes (the sensors you refer to) do not fail often. In the case of Lion Air, the failure was caused by improper maintenance. The technician should have caught the problem through testing, but he deliberately neglected to perform the test. It turns out that he had been doing this for months.
"MCAS was NEVER explained to ANYONE outside of Boeing."
False. MCAS was developed in order to meet FAA requirements, and of course the FAA was fully aware of it when they granted certification.
"Flight 610 pilots do not understand why their plane keeps nose diving despite the pilots trying their best to push the nose up. They keep following the manual and none of the solutions work. There are no signals flashing, no damage to any parts that would cause this, it is not a problem with the hydraulics. They are completely dumbfounded."
Incorrect. They did not follow "the manual". They didn't follow any of the procedures they were trained for. If they had followed the unreliable airspeed procedure as required, MCAS could never have activated. Once it did activate, they should have performed the runaway stabilizer trim procedure, but they didn't. Both of these procedures were identical to the ones for the earlier 737 models, and the pilots should have practiced both in a simulator. Despite their failure to perform these procedures, the Captain had no trouble flying the aircraft for 10 minutes. It was only when he handed over control to the inexperienced First Officer that the situation got out of control. Rather than overriding MCAS as the Captain had, the FO allowed it to make large adjustments in the nose-down direction and he only made tiny nose-up adjustments. These tiny inputs weren't enough to get back to a correct trim setting, but they allowed MCAS to activate repeatedly. The Captain failed to monitor what the FO was doing and take back control. No-one at Boeing or the FAA had anticipated that pilots would behave this way since it goes against basic flying techniques. The aircraft must always be kept in trim.
"When MCAS was discovered, Boeing's excuse as to why it is not explained or even mentioned anywhere in the training is that they considered it not vital for the plane operation. You know, the system that CANNOT be overwritten by any means that will nosedive your plane into the ground because one of the sensors that constantly fails has failed. Not vital for the plane operation."
I believe that's a misrepresentation of what they said, but please show me the quote if I'm wrong. Pilots could easily override MCAS by using the trim switches or wheels. This is a basic and routine part of flying. And as mentioned above, MCAS would never have activated at all if the standard procedures had been followed.
Patching executables on the fly to remove some conditions inside loops was nice, but then someone invented the instruction prefetch queue and pipelining, and someone decided that on real operating systems, executable memory pages are read-only. I played with it a bit as a kid; it was somewhat possible to guess the processor based on side effects, such as a patch only taking effect when it was applied more than about 10 instructions ahead.
This is from The Jargon File, if anyone is curious. It's an excellent repository of stories, slang terms, and ways of life from the tech world of old. It provides a glimpse into the culture.
Particularly interesting is "The Lexicon."
Read all that stuff when I was coming up in the late 90s, formatted the family computer with Linux and Slashdot was in its heyday. Good times.
Not long ago I read "The Soul of a New Machine" by Tracy Kidder, about Data General, published in 1981. Just like this article, it felt surprisingly similar to today, just with different jargon.
I have been appropriately humbled.
Just came from the Story of Mel video and I didn't even realize that the Story of Mel was linked in this article. Absolute masterpiece of blogging
This article is hilarious I keep coming back to it
I cannot disagree more with the premise of this article. REAL programmers were just extremely proficient with the languages they were using, and not so much with newer languages, so changing paradigms requires lots of learning and code porting.
To say that the language itself is the reason for success is just wild.
As is assuming that languages only change for the worse.
Feel free to debug and code in hex, other people finished debugging using their IDE 5 hours ago, and are now at home.
the first AI winter happened because perceptrons weren't being linked together to generalize functions (aka neural nets) and there wasn't enough compute iirc
Interesting, the height of quiche eater environments, Smalltalk, was developed by guys who previously worked for DARPA developing ARPANET, which probably qualifies as something Real Programmers do.
Mid-2000s, VNC was a cyclic discovery. Every couple of years some new people would suddenly discover VNC as they got better at programming, and shout out their discovery to email lists. Lol.
Discovering VNC was such an accurate gauge/milestone of people's skills lol
18:12 So let me get this straight - You would rather patch the binary?!!! And not keep any source around at all?!!! And not because the compilation is slow or anything, but rather because your text editor is so bad it is easy to break your program by editing it?!!! WTF is going on?!!!
I thought I knew programming, but I realize I knew nothing at all. I am NOT a Real Programmer. Quiche eater over here... Holy fu--, the amount of times during this article I just had to stop and either laugh out loud or scream "WHY WOULD YOU EVER THINK OF DOING THAT" to the high heavens is uncountable.
Man, I would be intimidated by this guy's neck beard lol
I'm a professional software engineer, mainly using C++ and C#... But I do love and use Pascal! And I'm glad it was revived with Free Pascal and Lazarus. Pascal is, in my opinion, still the best language to learn how to write software!
It gives you everything used in "industry languages", but in a safe, verbose environment. And it compiles very fast, which is nice if you have to try things out.
I just read Dijkstra's green language analysis. It's interesting that, missing the context of Ada's packaging and privacy features, someone so great could be so wrong. He also doesn't get how types protect the user and reduce the need for branching checks. Ada was so far ahead of OOP. I actually think it hits a sweet spot without using any of the inheritance that was added in '95: procedural with privacy, highly modular packages, type design and strong typing.
Real men eat quiche. The whole "real men" thing in programming was a sort of parody that is probably mostly forgotten now, which was a staple of a "holy war" type discussion. It itself was a twist on a book called "Real Men Don't Eat Quiche".
`sudo apt install lazarus`
Lol, my childhood!
The Real Programmer joke.
Even funnier, this attitude developed over only a decade or so, as until the late 60's or early 70's programming was considered "women's work."
8:20 Whenever my program gets a little more complicated, I use a lot of goto anyway.
In Rust I can have labeled blocks now, and then either break out of them (a goto to after the block) or continue them (if they are declared as loop, so a goto to the beginning of the block).
And loops shouldn't loop by default.
Most of my loops have a break at the end anyway.
They usually look somewhat like this:
for (;;) {
if (...) continue;
...
if (...) continue;
...
break;
}
This would be written more easily with goto:
loop:
if (...) goto loop;
...
if (...) goto loop;
...
And the break at the end isn't even necessary.
Ha ha. I took Pascal in hs back in the early 90’s after learning BASIC on my Commodore64 when I was 8. Went to school for IT. Then went on to web development later and then product management.
My step dad was the project manager at Lockheed that led the software development dept to get the 4 different computer systems on the space shuttle to work back in the 70s. He’s the one that bought me my first computer when I was 8 and showed me BASIC.
18:12 fix compiler so it doesn’t suck? No thanks, patching binary is what real programmers do.
At school in the very early 80s we wrote Fortran 4 programs using coding sheets for Computer Science class. Was interesting. Way more interesting than the COBOL we also did. Also on coding sheets.
Real programmers use Microsoft paint for the frontend and scratch for the backend.
I was learning Pascal about then, much better than Fortran, and ABEND is pronounced Ab-End, abnormal end, or raise exception. We used Sheffield Pascal on Prime minicomputers at uni. We also used Object Pascal when object-orientated programming came about. I wrote a collector and report generator in Pascal; the collector logged output from an SX2000 port concentrator and reported on usage. Mostly I used Prime's PLP and SPL, which were PL/1-like, and you could declare quite sophisticated structures.
Well, I saw Adam Dunkels write a demo effect, a fire effect burning on the screen, on his c64 in HEX. It worked fine. He did it because he couldn't find the disk he saved it on so he just rewrote it on the fly to show us what it looked like.
Imagine this guy seeing you deploy your Next.js project on Vercel from VSCode :D
I remember the Quiche Eater flame. FORTRAN had a stranglehold on any fast-calculation and memory-efficient topics. UI and GUI didn't exist back then. The CRAY supercomputer was benchmarked against FORTRAN subroutines used at Los Alamos and Lawrence Livermore Labs (LINPACK). But today's computers have so many more processors that this no longer makes sense. I learned some Assembly and FORTRAN in school in the 80's, but also took a Pascal class. I mostly tinkered in Basic and TurboBasic.
I am slowly indoctrinating myself to like Fortran. The committee is actually getting their shit together and doing good things these days.
And the tooling is surprisingly modern; it is a joy to use and still as fast as ever. It's underrated, if you ask me, because people would rather circle jerk with libraries that are essentially Fortran bindings (aka everything in Python)
30:31 Damnit, I'm being called out! I was doing this very thing today at work.
SPZAP was a real thing. It was often used to patch programs... actually, if you work on an IBM mainframe it still is sometimes a real thing, especially on z/OS.
delphi legend
Xerox and Bell Labs had lots of research come out because of tax policy. Corporate taxes on profits were really high. Companies could either reduce prices to avoid large tax bills or reinvest with things like research. Xerox and Bell Labs did the latter.
I swear my FORTRAN professor in 1991 wrote this article! I heard him say many of these same things.
As someone who learned Pascal around the time this article was written, I understood the Quiche Eater reference.
Interestingly, in modern versions of Pascal (aka Delphi) you can actually turn on constant assignment, i.e. if you declare a typed constant you can actually change it... This is complete madness of course, and is done for backward compatibility. So you could in theory have a constant limit for some catastrophic event and keep changing it in code.
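A minimal sketch of that madness, assuming Free Pascal or Delphi (the directive is {$J+}, which Delphi also spells {$WRITEABLECONST ON}; CriticalLimit is just a made-up name):
program WriteableConst;
{$J+}   { enable assignment to typed constants }
const
  CriticalLimit: Integer = 100;   { a typed "constant" with an initial value }
begin
  writeln(CriticalLimit);         { prints 100 }
  CriticalLimit := 9000;          { allowed under $J+ : the constant is really just initialised data }
  writeln(CriticalLimit);         { prints 9000 }
end.
Flip the directive to {$J-} and the same assignment becomes a compile-time error.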
God I remember reading this article as a kid who was learning Pascal.
Real programmers only write in VHDL
program lol;
begin
writeln('Never gonna give you up!');
end.
I got paid to write pascal back in the early 90s. We were developing software for doctor's surgeries. This was before source control was a thing. But you know what, we managed just fine.
I haven't used Pascal since about 2006, although I had moved on to Delphi by that time. But, Pascal/Delphi sure made me a ton of money up until then.