" if you're not ready to argue uselessly for hours over things that don't even matter then you're not ready to be a programmer " no truer words have ever been spoken
I disagree with this statement... I was born to argue over useless things that didn't matter... it took 6 years of life to realize that meant i was a programmer.
That's why the lady (and it was invariably a lady) that converted your written code into punchcards (yes, that was a job), she would put a thick line in marker pen diagonally across the top edge of the cards. This made the tripping-and-spilling your cards annoying but not suicide-inducing. You "just" had to restore the line and your cards would be in order.
Yes, I immediately thought of this diagonal line. commons.wikimedia.org/wiki/File:Punched_card_program_deck.agr.jpg#/media/File:Punched_card_program_deck.agr.jpg
That was my mom! She was a coder at two BIG government contractors in the 70s. We used to re-sort those “corrupt” stacks at home (almost every night)! Mom independently invented the marker trick after a few months at the first. The engineers wanted her quit it-instead she eventually replaced those engineers as the primary programmers! (Funny how the business interests of the company won out.) Definitely watch “Hidden Figures,” to understand the stupid culture back then.
They had punch card sorters that physically implemented a radix sort, one column at a time. You started with the least significant digit and worked up. The machine would spit out a separate stack for each digit. You’d just pile up the stacks, feed them back in and run for the next digit.
imagine the interview: Sort these cards with a radix sorting algorithm and make sure not to trip and fall with the cards. We'll then move to some asm questions.
Modern cobol runs on virtual state machines implemented on top of Java or GCC's cobol standard library or similar. The primary reason cobol is still used is it is auditor-friendly. Auditors cannot generally _write_ cobol, but they can read it with minimal assistance. It is nearly a perfect subset of English, so if you can read english you can understand cobol.
@@ChilenonetoTH-cam That's not entirely true. Sure, the old cobol code itself is quite "battle tested" and unlikely to have latent bugs that will spontaneously break, but the VMs that now run it can have issues, and it _is_ connected to the outside world at least a bit. At those boundaries, if someone exposes the wrong function to external tools, there _could_ be a problem. Still, it would likely require an incredibly targeted attack, not just grabbing the latest 0-day PoC off github.
When I saw this video I immediately realized that this guy has barely done his research and felt the irresistible urge to make a video showing all that half-baked knowledge
Yeah, so much of it was grating as the person that made it does not even have a surface level understanding of the material covered and gets a lot wrong, emphasizes incidental features, and so on.
Maybe do it better then?! There are so many beginners out there. So help them, and don't mock the one who tries. I think he did a well enough job. Just like most programmers do "well enough" in their jobs to not get kicked out, but still get hated when someone has to review the code.
You don't hand-number the punch cards; you draw a diagonal line down the side of your stack with a Sharpie. You'll always be able to put them back in order then. I've never written programs that way, but I know someone who did in an academic setting. He said they would trip each other on purpose, so you had to be prepared.
I've told this before, but the punch card sorting reminds me: I worked as part of a physics experiment and had a real-world opportunity for quicksort. Briefly: we built a particle detector with ~2000 cables that needed to be connected in a specific order before it was installed. The team responsible for connecting cables finished their job, so my job was to connect the cables in the correct order to test equipment, to confirm the equipment was functional. Problem: the cable guys didn't keep the cables sorted. I walked into a room full of ~2000 randomly tangled cables and had one afternoon to test all of them. I first tried randomly finding cables in order; no good, it would take a couple of days minimum. But then my computer programming experience came to mind: in-place quicksort the cables. I finished the task on time and got the reward of not being kicked out of the lab.
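An in-place quicksort like the one described can be sketched in a few lines; here is a minimal Python version (representing each cable by the slot number it belongs in is an illustrative assumption, not how the lab actually labeled them):

```python
def quicksort(cables, lo=0, hi=None):
    """In-place quicksort. Each 'cable' is represented by the slot
    number it should end up in (an illustrative assumption)."""
    if hi is None:
        hi = len(cables) - 1
    if lo >= hi:
        return
    pivot = cables[hi]                  # Lomuto partition on the last element
    i = lo
    for j in range(lo, hi):
        if cables[j] < pivot:
            cables[i], cables[j] = cables[j], cables[i]
            i += 1
    cables[i], cables[hi] = cables[hi], cables[i]
    quicksort(cables, lo, i - 1)        # sort the two halves recursively
    quicksort(cables, i + 1, hi)
```

Sorting in place is what makes this practical with physical cables: you only ever swap two cables at a time, so you never need a second room's worth of floor space for an auxiliary "array".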
5:52 ... that's the Czech National Bank ... it's still exactly the same as in this picture, and I work there as a developer. Just about an hour ago, I walked along this wall in the photo when I finished work and was going home. :) PS: I use VSCode :P PPS: but also Fedora .. and Vim for commit messages if not -m .. redemption ( ͡° ͜ʖ ͡°)
What language does Czech banking use? Russian banking mostly hires Java devs. I think it's because specialized COBOL-oriented computers were no longer the mainstream solution in 1990s, but I can't be sure about that - maybe IBM just failed to get our banks hooked on that stuff. So I wonder if it's the same in every post-Soviet country.
There was another trick to keeping punch cards ordered that worked great: drawing diagonal lines across the spine of the stack so you could instantly see if a card was in the wrong place. I imagine people learned this trick really quickly.
For a history of some departments at Helsinki University of Technology, we interviewed a lot of old heads. The strategy of drawing shapes on the stack varied greatly: almost everybody drew the lines on one side of the stack, and many left it at that. Some finessed it further by drawing more creative (vertically asymmetric) shapes on the other side of the stack.
Other countries prioritize updating their technologies by making laws that deprecate existing projects. Estonia wanted to develop its tech sector, so it put a maximum age on all government-supporting technologies. It also wanted to bring in more tech talent, so it established an electronic residency program.
@sirhenrystalwart8303 That's the dumbest comment. You're comparing a country with a population of 1.3 million, one that was part of the Soviet Union until 30 years ago, with a country whose population is 300 times as big. When you look at per capita numbers, you'll see that the relative size isn't that far off! And when you take into account that all the biggest tech companies (FAANG) have loads of cheaper tech departments in Europe while funnelling the income to the US, it suddenly doesn't look that impressive to slightly outdo a small Baltic country.
I have been part of the health tech startup scene for the better part of the last decade, and in that area the US is a total shit show in terms of results, while Estonia has one of the best technical platforms in the world. A US hospital can hardly share a document with another hospital in a structured data format, whereas Estonia has done that for ages. (I am not even close to Estonia, btw.)
I just started a few months ago, all as a hobby, and whenever I listen to this guy I feel like I am missing 100 years of programming knowledge! :3 It just has so many levels to it! Like every video I watch... every sentence is something I've never heard of before!
1. LISP is (after FORTRAN) the second-oldest language still in use.
2. It's funny to watch Java, C#, C++, Python, and JavaScript copy concepts that were realised in LISP some 50 to 60 years ago...
3. Lots of parentheses: in LISP, programs are simply lists that can, if needed, be processed with the complete set of means the language LISP offers. So the grammar of LISP is (only a little bit overly simplified) completely described as: a list starts with a '(' and ends with a ')'. The first symbol after '(' is treated as either a function call, a macro call, or a special operator; the following elements of the list are treated as parameters to the aforementioned function, macro, or special operator.
4. I grew up using C, C++ and later Java, PERL and Python. I came across LISP ca. 10 years ago and have used it since then in educational and (semi-)professional environments. Every time I return to C, and especially C++, I wonder who was able to come up with such cumbersome grammars.
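The "programs are simply lists" point can be illustrated with a toy s-expression evaluator. This Python sketch uses a made-up two-operator dialect, not real LISP, just to show how little grammar is needed:

```python
def lisp_eval(expr):
    """Toy evaluator: a program is just a list whose head names an
    operation and whose tail holds the arguments (themselves lists
    or atoms). Only a tiny illustrative operator set is supported."""
    if not isinstance(expr, list):
        return expr                      # an atom evaluates to itself
    op, *args = expr
    vals = [lisp_eval(a) for a in args]  # evaluate arguments recursively
    if op == '+':
        return sum(vals)
    if op == '*':
        out = 1
        for v in vals:
            out *= v
        return out
    raise ValueError("unknown operator: " + str(op))

# The nested list ['+', 1, ['*', 2, 3]] plays the role of (+ 1 (* 2 3)).
result = lisp_eval(['+', 1, ['*', 2, 3]])   # result == 7
```

Because the program and the data share one representation (lists), macros fall out almost for free: a macro is just code that builds one of these lists before it is evaluated.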
I don't know if anyone ever had them, but you can encode the ordering of punch cards completely mechanically, and sort them nearly instantly by hand. You just punch holes along the edge, one for each of the bits in a binary number large enough to address every card, then you clip off the edges of the holes of each card's number, connecting them to the edge of the card. If it's card 5, you clip off the edge of holes 1 and 4. Now to sort them, just restack them all, properly aligned, then stick a pin through the least significant bit's holes and lift out the ones that haven't been clipped. The clipped ones will fall free and stay in the stack. Bring the lifted ones to the front of the stack, then stick a pin through the second least significant bit's holes and do the same. Repeat until you've done all the bits, and the cards are sorted. It's the real-world version of radix sort.
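The pin-and-clip procedure described above is a binary least-significant-bit radix sort. A hedged Python simulation (card numbers stand in for physical cards, and the "lifted cards go to the front" convention follows the description):

```python
def needle_sort(cards, bits):
    """Simulate edge-notched card sorting: for each bit from least to
    most significant, the pin lifts out the un-clipped cards (bit == 0)
    in order, while the clipped ones (bit == 1) fall free and keep
    their order. The lifted cards go to the front of the restacked deck."""
    for k in range(bits):
        lifted = [c for c in cards if not (c >> k) & 1]   # holes intact
        dropped = [c for c in cards if (c >> k) & 1]      # clipped, fell free
        cards = lifted + dropped                          # restack the deck
    return cards
```

Each pass is a stable partition, which is exactly what makes least-significant-digit radix sort correct: earlier passes' ordering survives later ones.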
Talking about WebAssembly is more like talking about Java bytecode. It was made as a common intermediary, but just like I write my Java bytecode in Java, Kotlin, Scala, or Groovy, I write my WebAssembly in Rust, C++, Go, or Kotlin.
1:33 Plankalkül was the first "high level language" in 1945, but the first implemented HLL was UNIVAC Short Code in 1950
0:20 Most forget that assembly doesn't mean you don't have macros. With macros you almost get functions, variables, and possibly loops. Suddenly assembly seems less barebones and manual, IMHO.
In the 1990s I wrote thousands of lines of COBOL for a bank. First on the IBM mainframe with DB2, on overnight batch programs, but then with COBOL and SCOBOL (Screen COBOL) on the Tandem NonStop computer. The Tandem was used for day trading because it had two transaction logs: extra resilience against any faults with multi-million currency deals and the formatting of SWIFT payment messages.
R as a user is wonderful for exactly what it's intended for. The syntax for what you're doing 99% of the time is smooth, the ecosystem is incredible, the certification is only beaten by SAS; it's great. R as a developer is a reason to take a long walk off a short plank, and if you're trying to take it out of its comfort zone, it's hell.
I think R is really great. It's super easy to teach to somebody with no programming experience to automate data-handling tasks. I suggested it to project managers that would spend 1 day a week wrangling together reports collated from several Excel sheets, and gave them some quality materials for self-learning. After a few weeks (not full-time) they had scripts written that turned their 1 day of spreadsheet wrangling into a single script run. A lot of their time had been spent double-checking the validity of the information, as it would influence important decision making and the manual approach would commonly introduce errors, so automating the process was a boon. Obviously a proper data pipeline is preferred, but so many organisations still run important aspects of their business, and make multi-million dollar decisions, on Excel sheets 😅.
Missing: Unix shell/bash and other shells, PostScript (it actually is a programming language! it's stack-based), Jai (do we count languages that aren't released yet?), Vala, Idris, GDScript, all kinds of graphical scripting languages of various game engines, etc. And if he mentioned HTML, then he should also mention SGML, XML, YAML, TOML, JSON, TeX/LaTeX, Qt QML, ... (HTML is not a programming language, it's a markup language. I.e., show me an HTML "program" that calculates the sum of two numbers.)
Circa time index 19:20, re: Babbage. The problem is, the British government granted him £5,000 (IIRC) for the Difference Engine, which he did not complete. To put that into perspective, that was the cost of several front-line warships at the time. Charles realized he could do better and switched to the Analytical Engine in midstream. This did not make him popular.

On top of that, his protégée, Ada Augusta, who was not taken seriously due to being a woman, was pretty much the only person to understand the full potential of the Analytical Engine. Not even Babbage understood its full potential. Ada wrote what is today considered the first program for automatic computing machinery: a program for the Analytical Engine that would calculate Bernoulli numbers. The machine was never built.

Ada tried to get funding by betting on horse races. This did not go well for her. It is a rather sad and tragic story. She died at a rather young age and was buried next to her father, Lord Byron. Yes, the poet.
This is her diagram, which resembles an _Excel_ spreadsheet: en.wikipedia.org/wiki/Ada_Lovelace#/media/File:Diagram_for_the_computation_of_Bernoulli_numbers.jpg
This video covers a lot of languages but doesn't mention that before C there was B. It also didn't mention FORTH, Modula-2, PL/1, PL/C, WATBOL, SPITBOL, or SNOOPY, to name just a few. I think I also missed seeing any reference to LISP, which also brings to mind Scheme. CSS has nothing to do with SQL. HTML is a markup language (it's in the acronym); I haven't seen anything in the standard about how to define variables or do math. The syntax of many languages that came after C was influenced by C. JS was originally for use on web pages but has been getting used outside of web pages. BTW, the math scene makes perfect sense if you pay attention to the audio of the video on which you are commenting. It said that before BASIC, programming had mainly been the domain of scientists and mathematicians. That's why they were showing a scene with math on a chalkboard.
I just want to say that the R community is so friendly and welcoming. R has also had best in class data analytics tools for well over 5 years, and a lot of the features people like in e.g. pandas, Polars, or Ibis, have their origins in either base R or the tidyverse. The one notable exception is probably machine learning, but in the past few years that has massively improved in R (from what I hear). The focus amongst the most influential people in the data space is now on interoperability between R, Python, and even Julia, by implementing multiple backends, and having bindings for popular C/C++/Rust frameworks, like Polars. What most people don't seem to realise as well is that R is a general purpose programming language with a huge, varied, and very high quality package ecosystem for all sorts of cool stuff.
15:45 I don't know how it feels to program with limited RAM, but I am very happy to shove a lot of stuff into RAM to avoid round trips to the network and filesystem. Most of the time it makes sense to keep a lot of stuff in RAM for more efficient processing, so I don't think having less RAM would make better software; having a lot of RAM allows us to cache a lot of stuff and avoid painfully slow IO operations.
So sad he didn't mention Snap!, the successor to Scratch, or rather the Scratch that has proper lists, lambdas, and is heavily inspired by Lisp. It was also created by a guy at UC Berkeley and is hosted by Berkeley, although it's mainly developed by its creator, Jens Mönig, a former lawyer. When I say it was heavily inspired by Lisp, I mean it. In Snap! 10.0, we're getting the ability to convert Snap! code to Lisp-like code and vice versa. I don't know if it's fully runnable Lisp, but it does have the Lisp syntax. Enough about Lisp, can I just talk about how Snap! has proper functions, aka custom blocks? Yeah, Scratch has custom blocks, but those were added after BYOB, the Snap! predecessor (which also happens to be a modification of Scratch 1.4, although Snap! these days shares no source code with Scratch), added them. When I say proper, I mean you can create stack blocks, reporters, and predicates (which Scratch loves to call booleans), as well as adding more input types (which act like type hints). I could go on and on about how great Snap! is, but I'd be here all day. And by the way, yes, the exclamation mark is part of the name.
So the origin of object-oriented programming, SIMULA-67, wasn't worth mentioning?? While the USA was "playing" with COBOL and FORTRAN, the Norwegians invented OOP in 1967!!!! For God's sake: OOP is almost 60 years old! Just look at Simula:

Begin
   Class Glyph;
      Virtual: Procedure print Is Procedure print;;
   Begin
   End;

   Glyph Class Char (c);
      Character c;
   Begin
      Procedure print;
         OutChar(c);
   End;

   Glyph Class Line (elements);
      Ref (Glyph) Array elements;
   Begin
      Procedure print;
      Begin
         Integer i;
         For i:= 1 Step 1 Until UpperBound (elements, 1) Do
            elements (i).print;
         OutImage;
      End;
   End;

   Ref (Glyph) rg;
   Ref (Glyph) Array rgs (1 : 4);

   ! Main program;
   rgs (1):- New Char ('A');
   rgs (2):- New Char ('b');
   rgs (3):- New Char ('b');
   rgs (4):- New Char ('a');
   rg:- New Line (rgs);
   rg.print;
End;
I'm not accepting what he had to say about Kotlin. It's a language that allows you to write applications that run on the JVM. Think of it more as a replacement for Java, rather than a language limited to Android apps. And arguably, the most popular IDE for Kotlin is IntelliJ, not Android Studio.
I really hate that people think object-oriented programming is about classes and methods, instead of what it historically was meant for, which is isolated processes communicating via message passing. Poor Alan Kay, he's been fighting against the butchering of his ideas for most of his life, and it's a fight that is evidently lost.
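Kay's original sense of the term can be sketched without classes at all: isolated processes that share nothing and only exchange messages. A minimal Python sketch of the idea (the counter "object" here is an illustrative assumption, not anything from Kay's work):

```python
import queue
import threading

def counter_process(inbox):
    """An isolated 'object': it shares no state with the outside world
    and reacts only to messages arriving in its inbox."""
    count = 0
    while True:
        msg, reply_to = inbox.get()
        if msg == 'incr':
            count += 1
        elif msg == 'get':
            reply_to.put(count)          # answer by sending a message back
        elif msg == 'stop':
            break

inbox = queue.Queue()
threading.Thread(target=counter_process, args=(inbox,), daemon=True).start()

inbox.put(('incr', None))                # no method calls, only messages
inbox.put(('incr', None))
reply = queue.Queue()
inbox.put(('get', reply))
count = reply.get(timeout=2)             # count == 2
inbox.put(('stop', None))
```

No caller can reach into the counter's state; the only interface is the message protocol, which is much closer to Erlang processes or actors than to the class hierarchies most OOP courses teach.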
I tried Zig as a C replacement and wasn’t a big fan. Just felt clunky. I think I’ll have to give it another go in a few years. Odin, however, felt like a seamless upgrade from C. It’s like C, but with some nice extra features, but still all the same low level control.
I had the same experience. I really don't get the hype behind it. Especially since I found a bug in their testing code where it reported green even when the code wasn't passing, and the response was basically "yeah, meh, it does that...". Not to mention the enforced whitespace (yet you still have to use semicolons) and the different naming schemes in the standard library. It just feels so sloppy, and it ignores too much of the knowledge we've gained about how to design languages.
Trick to keeping your punch card stack in order (or any stack of cards): draw a diagonal line across the thin edges of the stack when the cards are perfectly stacked and aligned. Then, if you have to recreate the order, just focus on recreating the diagonal line.
The original video has a bunch of comments clowning on him for a bunch of mistakes he made. Clearly the video was made by some child in high school taking intro to programming
Thanks for the reaction to my video. I realize that I made several mistakes in the video, I’ve been taking it in and I truly appreciate all of the constructive criticism everyone has given me. Keep doing what you’re doing and I love your stuff. ❤❤❤❤❤
Right. In PostgreSQL, as I recall, there was DML, the data manipulation language (SELECT, UPDATE, etc.), and DDL, the data definition language (CREATE TABLE, etc.).
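The DML/DDL split can be demonstrated with any SQL engine. Here is a self-contained sketch using Python's built-in sqlite3 (SQLite rather than PostgreSQL, purely so it runs without a server; the accounts table is an illustrative assumption):

```python
import sqlite3

conn = sqlite3.connect(':memory:')

# DDL: statements that define structure (CREATE, ALTER, DROP).
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")

# DML: statements that manipulate rows (INSERT, UPDATE, DELETE, SELECT).
conn.execute("INSERT INTO accounts (balance) VALUES (100.0)")
conn.execute("UPDATE accounts SET balance = balance + 50 WHERE id = 1")
balance = conn.execute(
    "SELECT balance FROM accounts WHERE id = 1"
).fetchone()[0]                          # balance == 150.0
```

The split matters in practice because databases often handle the two differently, e.g. permissions and transactional behavior for DDL are not the same as for DML in every engine.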
First time clicking off of one of these early, sorry but I'm not listening to a guy that thinks assembly is a programming language that can run in the browser for 40 minutes
When a stack of punched cards was prepared, the author would usually make a diagonal stripe with a marker down the edges, so that they'd be easier to sort in the event of a spill.
My favorite punch card story was in Richard Feynman's memoirs, about doing the calculations for the nuclear bomb. They had several problems; mainly, every time they had to make changes to the program they would start over, until one day someone (I don't remember if it was him) realized they could simply seed the calculation with the results from last time, since those wouldn't change before the point where the new function takes effect. This one thing literally saved them days or weeks of waiting.
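The trick amounts to checkpointing a staged calculation and recomputing only from the first changed stage onward. A loose Python sketch of the idea (the stage/checkpoint structure is an assumption for illustration, not Feynman's actual setup):

```python
def run_pipeline(stages, inputs, checkpoints, changed_from):
    """Run a staged calculation, reusing checkpointed results for every
    stage before the first one that changed, and recomputing (and
    re-checkpointing) from there on."""
    state = inputs
    for i, stage in enumerate(stages):
        if i < changed_from and i in checkpoints:
            state = checkpoints[i]       # reuse the old result unchanged
        else:
            state = stage(state)         # recompute from the change onward
            checkpoints[i] = state       # save for the next run
    return state
```

A first full run fills the checkpoints; after editing only stage k, a rerun with changed_from=k skips all the work before the edit, which is exactly the days-to-hours savings the story describes.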
I worked with assembly language from 1996 till 2003. Walker Financial Systems, from California, had an ERP, also called Tamaris, whose data abstraction layers were written in assembly language.
Oh boy, MUMPS. A while ago I wrote a process in MUMPS that collected real-time appointment and visit data for patients, and it worked surprisingly well. I never thought I'd have the opportunity to write in such an old language, but it happened... and I am not much better of a programmer because of it.
MUMPS doesn't get enough love. The keywords can mostly get shortened to a single letter, but there's also only like 20 keywords for the whole language.
That, combined with how easy it is to create and access persistent data (it's essentially the same syntax as accessing in-memory data), makes it a pretty neat language. I could definitely see an alternative world where it saw broader adoption outside of just healthcare and some banking/financial software.
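MUMPS itself is hard to show here, but the design idea, persistent data accessed with the same syntax as in-memory data, has a loose analogue in Python's standard shelve module (the key naming below is just an illustrative nod to MUMPS globals, not real MUMPS syntax):

```python
import os
import shelve
import tempfile

# shelve gives a dict-like object whose writes land on disk,
# so persistence looks like ordinary assignment.
path = os.path.join(tempfile.mkdtemp(), 'globals')

with shelve.open(path) as db:
    db['patient,42,visit'] = '2024-05-01'   # persisted on assignment

with shelve.open(path) as db:               # reopen: the data survived
    visit = db['patient,42,visit']
```

In MUMPS the equivalent is far more seamless (globals are first-class and hierarchical), but the sketch shows why "persistence is just assignment" is such a comfortable model for record-keeping systems.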
People say Elm is dead because the core language hasn't had updates since 2019, but that's exactly what makes Elm great. The language is stable and powerful, and you can focus on doing stuff instead of being constantly worried about the new lib that changed everything. There are "frameworks" like Lamdera and elm-pages that extend the core functionality to satisfy current needs like SSR, but you don't need to relearn everything every year. Also, Elm's type system is sooo much simpler and more useful than TypeScript's. You just define the type, say what types go in and out of a function, and nothing else. No weird workarounds, no number | string | boolean | Array | StrangeTypeICantFindTheDefinitionAnywhere
No joke, Pascal brings back such good memories... a very good working debugger, quick to write, and very readable... if you look sideways and remove the semicolons, it's not very far from Python.
Pro tip! When carrying your program to the machine first take a marker and draw a diagonal line from one corner of the deck to the other across the side of the deck. That way if you drop the punch cards everywhere it's much easier to put them back in order. Much love from a cobol programmer
My first programming job was a mix of (mostly) COBOL and C on a mainframe, running London Stock Exchange settlement and Real-Time Gross Settlement between the UK wholesale banks. Thank god I haven't been near COBOL in decades, but that job did teach me at least a couple of things. - Statelessness: everything was stateless. It had to be, as it was massively parallel. - Memory usage and management: at code review I had to justify every variable declared and every single byte of memory used. When you have a few megabytes and several hundred different server processes, most multi-instance, you really cannot f*ck around.
The GEC 4000 was a fantastic machine, way ahead of its time, esp. the OS4000 OS with its %sink null-file concept, which even today beats Unix/Linux's /dev/null. The OS was on par with VAX VMS with its English commands. Its JCL was so flexible, better than bash today. I operated and coded on the GEC 4000 and many other museum pieces back in the '70s.
@baconsandwichbaconsandwich727 I caught the tail end of that era, starting programming in the 80s. My early encounters included things like the AS/400 and IBM's big iron. Mainly, though, I programmed on PCs interfacing to those systems. I had the displeasure of having to port a program from RPG to C, which was a slog.
For large stacks of punched cards we would draw a diagonal line across the top edge of the cards which would help in reassembling a dropped deck. Rubber bands for small decks and metal boxes for larger ones were also important.
2:17 "Never be tired of being wrong" This is arguably the main characteristic of a true professional and/or expert. The difference between an amateur and a professional is that the amateur knows enough, but not enough to know they're wrong. A professional/expert knows that they dont know enough to not be wrong. A professional programmer knows they're wrong, but it works, so who cares. Always Be Learning!
LabVIEW mentioned?! From my knowledge it's mostly used by EEs who do research with large, heavy equipment. I even had to use it just a few years ago. I hated it so much, but it's cool to see it in the list.
No, you usually did not need to rewrite your entire program if you tripped with punch cards. Usually you write/print the number on them, so you can just sort them by that number again and you're golden.
Of course the video falls short, but it's pretty darn impressive (and fairly accurate) for a 15-minute high-altitude/high-speed tour. I teach a high school engineering course and it's a fun video to show them (with caveats). I needed to explain not only so many of the failed languages (not included!), but failed and specialty hardware platforms (NeXT, IRIX, blah, blah), many scripting "languages" (which should be included), and a vast history of OSs.

Corrupt punch card stacks (of fewer than 300 cards) were actually NOT hard to sort, but that was a fun story. I loved you freaking out about indenting! My professors were so inconsistent about it that one guy downgraded students who followed another guy's strict specs!

I'm teaching them Python, which is not MY choice. I've taught BASIC, APL, MATLAB, Pascal, C, Java, C++, but I don't think any of them turned out to be both the "perfect" intro teaching language AND a great all-around language. I guess C++ or Java was my favorite. (I messed about with LISP and Prolog, and I dodged some bullets there.) I've also worked with students solving math, databasing, and statistical problems in other languages.

As for Babbage, yes, he was a genius, but Ada basically figured out how to turn what ended up being his mess into something that MIGHT have worked. Babbage is the earliest study in "great idea, poor implementation." The Traveling Salesman Problem is more familiar to moderately informed people than talk of discrete math and graph theory (like Euler cycles/circuits/tours/trails/etc.).

I HATE the Java/JavaScript confusion. Those people clearly suffered from TOO MUCH COFFEE (which, contrary to Paul Erdős' belief that mathematicians turn coffee into theorems, did NOT result in good computer-language naming strategies)!

Okay, now to pick your brain: if you wanted to choose THE MOST VALUABLE LANGUAGES to both teach AND use regularly, what are the five you would choose (ordered from best to worst)?
I love watching people try to explain object-oriented programming with that same style of example. 'And the fire pokemon class inherits from the pokemon class, which has an Attack method and an Evolve method, and the char-type pokemon class inherits from the fire pokemon class, and then Charmander, char-whatever, and Charizard inherit from the char-type pokemon class, and then we instantiate a new instance of the Charmander class, which overrides the Attack method with 'Attack' and '4 damage' and overrides the Evolve method to return Charmeleon (I think that's it), and we name it CharChar, but it's of type CharTypePokemon, not Charmander, because it can evolve, and then we apply the Evolve method and it returns a Charmeleon and we assign the value to our variable CharChar and and and and' They always get so excited talking about Pokemon (or whatever their example is) that they go on and on while completely missing the point of object orientation. I mean, really, it's comical how people who don't really understand what OOP is about get so excited about their overengineered example attempting to explain it.
I recall my professors lamenting about the inconvenience of having only one computer available in the entire province. This resulted in several weeks of waiting in line, only to discover an error in your cards and having to start the process all over again, ultimately ending up at the back of the queue.
I am new to the channel and so far I am having a pretty good time. Also, due to my birthplace and since Lua was mentioned I am obliged to state : THIS IS BRAZIL!
This is one of my favorite videos now, church. I really miss ActionScript. It was my first real introduction to actual coding and OOP principles. And yes, technically HTML is a declarative programming language.
"if you're not ready to argue uselessly for hours about things that dont even matter, youre not ready to be a programmer," ive never heard my personality so concisely described, or why i chose programming explained to myself in a way i accepted the self flagulation of it the way i did. Never felt so seen, attacked, and as though i found my people simultaneously. Subbed.
" if you're not ready to argue uselessly for hours over things that don't even matter then you're not ready to be a programmer " no truer words have ever been spoken
Well shit i guess im good to go
@@bigerrncodes yeah same
I hate arguing over useless crap. I was wondering why I wasn't a very good programmer despite decades of practice, I think I just found it.
I disagree with this statement... I was born to argue over useless things that didn't matter... it took 6 years of life to realize that meant I was a programmer.
It does matter
😅
APL is named "A Programming Language" because that was the title of the book that they later turned into an actual language. It was pure theory first.
I didn’t know that, thank you.
cool
Why do other people call it Array Programming Language instead?
@@jongeduard I haven't heard that before, but it is one of the array programming languages, so I guess that works as well. :)
I had read that the inventor, Kenneth Iverson, originally meant it to be a math notation for arrays.
every single language in 15 min. nah, thanks. 43 min reaction from prime. Here we go.
Every reaction to every language in 43mins
I don't know who is worse, him or Asmongold
@ThatSupportTho Asmongold is just a shit youtuber; Primeagen's reactions are at least a bit insightful.
6:03 rather have the stability of a financial system depend on COBOL, than NPM community packages tbh.
REAL
(real)
If your mission-critical system ain't broke, don't even fucking look at it, let alone try to fix it
It's community packages :v nothing made by the community is stable
R E A L
That's why the lady (and it was invariably a lady) that converted your written code into punchcards (yes, that was a job), she would put a thick line in marker pen diagonally across the top edge of the cards. This made the tripping-and-spilling your cards annoying but not suicide-inducing. You "just" had to restore the line and your cards would be in order.
Yes, I immediately thought of this diagonal line. commons.wikimedia.org/wiki/File:Punched_card_program_deck.agr.jpg#/media/File:Punched_card_program_deck.agr.jpg
Brilliant simplicity
That was my mom! She was a coder at two BIG government contractors in the 70s. We used to re-sort those “corrupt” stacks at home (almost every night)!
Mom independently invented the marker trick after a few months at the first. The engineers wanted her to quit it; instead she eventually replaced those engineers as the primary programmers! (Funny how the business interests of the company won out.)
Definitely watch “Hidden Figures,” to understand the stupid culture back then.
Later there were electromechanical sorters for punch cards.
I used the marker trick myself.
We have so much RAM now that someone made Redis. We just don't have that dog in us anymore.
I had no idea that redis consumes lots of ram. Thanks master
Redis is literally a (by default) non-persistent key-value store; it's in RAM and nowhere else @@kuroxell
@kuroxell Yesn't. Running Redis uses a negligible amount of RAM. But Redis is used as an in-memory database
Redis has a base 3mb memory footprint
@@dejangegic Think of it less as a DB and more of a data structure manager
In just 15 minutes? Let's go!
*45 min reaction video*
I guess prime went oop on this one
They had punch card sorters that physically implemented a radix sort, one column at a time. You started with the least significant digit and worked up. The machine would spit out a separate stack for each digit. You’d just pile up the stacks, feed them back in and run for the next digit.
Sounds hot
that's really cool
imagine the interview: Sort these cards with a radix sorting algorithm and make sure not to trip and fall with the cards. We'll then move to some asm questions.
@@o1-preview it went more like: Oh you can write a hello world? You are hired!
@@nevemlaci2486 ah good times, back when you could get a job with about 30 lines to say hello world in asm
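The card-sorter procedure described in this thread is a least-significant-digit radix sort. A toy Python simulation (not modeled on any particular machine) of feeding the deck through one column at a time:

```python
def card_sort(cards, columns):
    """LSD radix sort, the way a card sorter did it: one column at a
    time, least significant first. The machine drops each card into
    the pocket for its digit; you pile the pockets up in order and
    feed the deck back in for the next column."""
    for col in range(columns - 1, -1, -1):        # rightmost column first
        pockets = [[] for _ in range(10)]          # one output stack per digit
        for card in cards:
            pockets[int(card[col])].append(card)   # card falls into its pocket
        cards = [c for pocket in pockets for c in pocket]  # re-pile, re-feed
    return cards
```

Because each pass is stable, after the final (leftmost) column the whole deck is in order.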
2023, Prime does OCaml
2024, Prime does Elm and Charm
2025.. Prime learns Haskell?????
I like how he mentioned Lisp and just figured that covered every language with the Lisp-like syntax
JQuery as an example of OOP is WILD
Modern COBOL runs on virtual state machines implemented on top of Java or GCC's COBOL standard library or similar. The primary reason COBOL is still used is that it is auditor-friendly. Auditors cannot generally _write_ COBOL, but they can read it with minimal assistance. It is nearly a perfect subset of English, so if you can read English you can understand COBOL.
And if it works one time, it works every time... it's bulletproof, and so ancient that there's no possibility of external hackers connecting and cracking it.
@@ChilenonetoTH-cam That's not entirely true. Sure, the old cobol code itself is quite "battle tested" and unlikely to have latent bugs that will spontaneously break, but the VMs that now run it can have issues, and it _is_ connected to the outside world at least a bit. At those boundaries, if someone exposes the wrong function to external tools, there _could_ be a problem. Still, it would likely require an incredibly targeted attack, not just grabbing the latest 0-day PoC off github.
@@yellingintothewind entirely right.
When I saw this video I immediately realized that this guy has barely done his research and felt the irresistible urge to make a video showing all that half-baked knowledge
I don't see such a video on your channel so I take it you resisted the irresistible
Yeah, so much of it was grating as the person that made it does not even have a surface level understanding of the material covered and gets a lot wrong, emphasizes incidental features, and so on.
Yeah, me too, but again, I have so much stuff to do. I wonder how much of the video was written by a GPT.
asm != webassembly
Maybe do it better then?! There are so many beginners out there. So help them and don't mock the one who tries. And I think he did a well enough job. Just like most programmers do in their jobs: "well enough" to not get kicked out, but still be hated if someone needs to review the code.
80 seconds in: oh, it's that kinda tech video
You don't hand-number the punch cards, you draw a diagonal line down the side of your stack with a Sharpie. You'll always be able to put them back in order then. I've never written programs that way, but I know someone who did in an academic setting. He said they would trip each other on purpose, so you had to be prepared.
Meh, that'll teach me to type comments before reading all the other ones already entered :D
Having someone else say the same thing just means you had a good comment. @@gwaptiva
Sharpies hadn't been invented in the 1950s
who said something about the 60s; we had that in the 80s
@@____uncompetative Back then they were called "black markers."
I've told this before but the punch card sorting reminds me:
I worked as part of a physics experiment and had a real world opportunity for quicksort. Briefly: We built a particle detector with ~2000 cables that needed to be connected in a specific order before it was installed. The team responsible for connecting cables finished their job, so my job was to connect cables in the correct order to test equipment so as to confirm the equipment was functional.
Problem: the cable guys didn't keep the cables sorted. I walked into a room full of ~2000 randomly tangled cables and had one afternoon to test all of them. I first tried randomly finding cables in order, no good, it would take a couple of days minimum.
But then my computer programming experience came to mind: In place quicksort the cables. I finished the task on time and got the reward of not being kicked out of the lab.
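The cable-sorting approach above maps onto a textbook in-place quicksort. A minimal Python sketch (Lomuto partition; the comment doesn't say which partition scheme was used, so this is just one common variant):

```python
def quicksort(a, lo=0, hi=None):
    """In-place quicksort with Lomuto partitioning: treat the last
    element as the pivot, swap everything smaller in front of it,
    then recurse on the two sides. No extra array needed, which is
    the property that mattered for a room full of physical cables."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    pivot, i = a[hi], lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]      # pivot lands in its final slot
    quicksort(a, lo, i - 1)
    quicksort(a, i + 1, hi)
    return a
```

With ~2000 items, quicksort's expected n log n comparisons is what turns "a couple of days minimum" into an afternoon.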
5:52 ... that's the Czech National Bank ... it's still exactly the same as in this picture, and I work there as a developer. Just about an hour ago, I walked along this wall in the photo when I finished work and was going home. :)
PS: I use VSCode :P
PPS: but also Fedora .. and Vim for commit messages if not -m .. redemption ( ͡° ͜ʖ ͡°)
Hello, fellow COBOL programmer
What language does Czech banking use?
Russian banking mostly hires Java devs.
I think it's because specialized COBOL-oriented computers were no longer the mainstream solution in 1990s, but I can't be sure about that - maybe IBM just failed to get our banks hooked on that stuff.
So I wonder if it's the same in every post-Soviet country.
There was another trick to keeping punch cards ordered that worked great: drawing diagonal lines across the spine of the stack so you could instantly see if a card was in the wrong place. I imagine people learned this trick really quickly.
For a history of some departments at Helsinki University of Technology, we interviewed a lot of old hands. The strategy of drawing shapes on the stack varied greatly; almost everybody drew the lines on one side of the stack, and many left it at that. Some finessed it further by drawing more creative (up-down asymmetric) shapes on the other side of the stack.
"Zig is the truest successor to C/C++ there has ever been." Well said, Prime.
The custom bit-width integers are suuper useful.
Mumps: exactly the kind of terse super efficient code I want my X-Ray death machine to be programmed in.
Other countries prioritize updating their technologies by making laws that deprecate existing projects. Estonia wanted to develop their tech sector so put a maximum age on all government supporting technologies. They also wanted to bring in more tech talent, so established electronic residency programs.
And whose tech industry is stronger, America's or Estonia's? Say what you will about America's system, but it produces some amazing results.
@@sirhenrystalwart8303 lmao as if that was the only factor at play
@@sirhenrystalwart8303 That's the dumbest comment. You're comparing a country with a population of 1.3 million, that was part of the Soviet Union until 30 years ago, with a country whose population is 300 times as big. When you look at per capita numbers, you'll see that the relative size isn't that far off! And when you take into account that all the biggest tech companies (FAANG) have loads of cheaper tech departments in Europe while funnelling the income to the US, it suddenly doesn't look that impressive to slightly outdo a small Baltic country.
I have been part of the health tech startup scene for the better part of the last decade, and in that area the US is a total shit show in terms of results, while Estonia has one of the best technical platforms in the world. A US hospital can hardly share a document with another hospital in a structured data format, whereas Estonia has done that for ages.
(I am not even close to Estonia, btw)
@@carlerikkopseng7172 What percentage of your net worth is invested in Estonian tech companies?
I just started a few months ago, all as a hobby, and whenever I listen to this guy, I feel like I am missing 100 years of programming knowledge! :3
It just has so many levels to it!
Like every video I watch... Every sentence is something I never heard of before!
1. LISP is (after FORTRAN) the second oldest language still in use.
2. It's funny to watch Java, C#, C++, Python, JavaScript copy concepts that were realised in LISP some 50 to 60 years ago...
3. Lots of parentheses:
In LISP, programs are simply lists that can, if needed, be processed with the complete set of means the language LISP offers.
So the grammar of LISP is (only a little bit overly simplified) completely described as:
a list starts with a '(' and ends with a ')'. The first symbol after '(' is treated as either a function call, a macro call, or a special operator; the following elements of the list are treated as parameters to the aforementioned function, macro, or special operator.
4. I grew up using C, C++ and later Java, PERL and Python. I came across LISP ca. 10 years ago and have used it since then in educational and (semi-)professional environments.
Every time I return to C, especially C++, I wonder who was able to come up with such cumbersome grammars.
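The grammar sketched in point 3 really is that small. A minimal s-expression reader in Python (a toy that ignores strings, quoting, and macros) turns source text into nested lists, which is the "programs are lists" property the comment describes:

```python
def read_sexpr(src):
    """Minimal s-expression reader: the whole surface grammar is
    '(', ')', and atoms. A program parses into nested Python lists,
    where the first element of each list is the operator and the
    rest are its arguments."""
    tokens = src.replace("(", " ( ").replace(")", " ) ").split()

    def parse(pos):
        if tokens[pos] == "(":
            lst, pos = [], pos + 1
            while tokens[pos] != ")":
                node, pos = parse(pos)
                lst.append(node)
            return lst, pos + 1        # skip the closing ')'
        return tokens[pos], pos + 1    # an atom

    tree, _ = parse(0)
    return tree
```

For example, `read_sexpr("(+ 1 (* 2 3))")` yields `['+', '1', ['*', '2', '3']]`, a data structure the language itself can manipulate.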
I don't know if any ever had them, but you can encode the ordering of punch cards completely mechanically, and sort them nearly instantly by hand. You just punch holes along the edge, one for each of the bits in a binary number large enough to address every card, then you clip off the edges of the holes of each card's number, connecting them to the edge of the card. If it's card 5, you clip off the edge of holes 1 and 4. Now to sort them, just restack them all, properly aligned, then stick a pin through the least significant bit's holes, and lift out the ones that haven't been clipped. The clipped ones will fall free and stay in the stack. Bring the lifted ones to the front of the stack, then stick a pin through the second least significant bit, and do the same. Repeat until you've done all bits, and the cards are sorted. It's the real-world version of the radix sort.
That is absolutely brilliant!
Or you just draw a diagonal line across the top, which is what they'd actually do
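The edge-notched card trick described above is a binary least-significant-digit radix sort. A toy Python simulation of the needle passes (representing each card by its number):

```python
def needle_sort(cards, bits):
    """Edge-notched card sort: for each bit, least significant first,
    the needle lifts out the cards whose hole is intact (bit == 0),
    while the clipped ones (bit == 1) fall free. The lifted pile goes
    to the front. Each pass is a stable partition, so after all bits
    the deck is in ascending order."""
    for b in range(bits):
        lifted  = [c for c in cards if not (c >> b) & 1]  # hole intact
        dropped = [c for c in cards if (c >> b) & 1]      # edge clipped
        cards = lifted + dropped                          # lifted to the front
    return cards
```

With n cards and b bit positions this takes only b physical passes, which is why a dropped deck could be restored "nearly instantly."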
WebAssembly is more like Java bytecode. It was made as a common intermediary, but just like I write my Java bytecode in Java, Kotlin, Scala, or Groovy, I write my WebAssembly in Rust, C++, Go, or Kotlin.
1:33 Plankalkül was the first "high level language" in 1945, but the first implemented HLL was UNIVAC Short Code in 1950
0:20 Most forget that assembly doesn't mean you don't have macros. With macros you almost get functions, variables, and possibly loops. Suddenly assembly seems less bare-bones and manual, IMHO.
In the 1990's I wrote thousands of lines of COBOL for a bank. Firstly on the IBM mainframe with DB2 on overnight batch programs. But then with COBOL and SCOBOL (Screen Cobol) on the Tandem non-stop computer. The Tandem was used for day trading because it had two transaction logs. Extra resilience for any faults with multi million currency deals and the formatting of SWIFT payment messages.
any suggestions to get into coding for a bank? most job offers I've seen ask for COBOL experience and that just doesn't exist nowadays
COBOL's DATA DIVISION REDEFINES clause was ace
Saying Java came from C is weird IMHO. The complete object model was copied from Smalltalk (I think)
I did that in the 2010s lol cobol and tandem (hp non stop)
R as a user is wonderful for exactly what it’s intended for. The syntax for what you’re doing 99% of the time is smooth, the ecosystem is incredible, the certification is only beaten by SAS, it’s great.
R as a developer is a reason to take a long walk off a short plank, and if you're trying to take it out of its comfort zone, it's hell
I think R is really great. It's super easy to teach to somebody with no programming experience who wants to automate data-handling tasks. I suggested it to project managers who would spend 1 day a week wrangling together reports collated from several Excel sheets and gave them some quality materials for self-learning. After a few weeks (not full-time) they had scripts written that turned their 1 day of spreadsheet wrangling into a single script run. A lot of their time was spent double-checking the validity of information, as it would influence important decision making, and the manual approach would commonly introduce errors, so automating the process was a boon. Obviously a proper data pipeline is preferred, but so many organisations still run important aspects of their business and make multi-million dollar decisions on Excel sheets 😅.
R is easy compared to low level programming...
Missing: Unix shell/bash and other shells, PostScript (it actually is a programming language! it's stack based), Jai (do we count languages that aren't released yet?), Vala, Idris, Godot Script, all kinds of graphical scripting languages of various game engines etc.
And if he mentioned HTML, then he should also mention SGML, XML, Yaml, Toml, JSON, TeX/LaTeX, Qt QML, ...
(HTML is not a programming language, its a markup language. I.e. show me a HTML "program" that calculates the sum of two numbers.)
By the way, at 19:17: MATLAB = Matrix Laboratory, not Matrix Library
Circa time index 19:20. RE: Babbage. The problem is, the British government granted him £5,000 (IIRC) for the Difference Engine which he did not complete. To put that into perspective, that was the cost of several front line warships at the time. Charles realized he could do better and switched to the Analytic Engine in mid stream. This did not make him popular. On top of that, his protégée, Ada Augusta, who was not taken seriously due to being a woman was pretty much the only person to understand the full potential of the Analytic Engine. Not even Babbage understood its full potential. Ada wrote what is today considered the first program for automatic computing machinery. It was a program for the Analytic Engine that would calculate Bernoulli numbers. The machine was never built. Ada tried to get funding by betting on horse races. This did not go well for her. It is a rather sad and tragic story. She was eventually buried, after dying at a rather young age, next to her father, Lord Byron. Yes, the poet.
This is her diagram, which resembles an _Excel_ spreadsheet:
en.wikipedia.org/wiki/Ada_Lovelace#/media/File:Diagram_for_the_computation_of_Bernoulli_numbers.jpg
@@____uncompetative Thanks for the link. I have a copy of her translation. I don't know yet if I can create a program to run her program.
@@____uncompetative More like Excel resembles the spreadsheets it replaced.
This video covers a lot of languages but doesn't mention that before C was B. It also didn't mention FORTH, Modula 2, PL/1, PL/C, WATBOL, SPITBOL, or SNOOPY, to name just a few. I think I also missed seeing any reference to LISP, which also brings to mind Scheme. CSS has nothing to do with SQL. HTML is a markup language. (It is in the acronym.) I haven't seen anything in the standard about how to define variables or do math. The syntax of many languages that came after C was influenced by C. JS was originally for use on web pages but has been getting used outside of web pages. BTW, the math scene makes perfect sense if you pay attention to the audio of the video on which you are commenting. It said that before BASIC, programming had mainly been the domain of scientists and mathematicians. That's why they were showing a scene with math on a chalkboard.
I just want to say that the R community is so friendly and welcoming. R has also had best in class data analytics tools for well over 5 years, and a lot of the features people like in e.g. pandas, Polars, or Ibis, have their origins in either base R or the tidyverse. The one notable exception is probably machine learning, but in the past few years that has massively improved in R (from what I hear). The focus amongst the most influential people in the data space is now on interoperability between R, Python, and even Julia, by implementing multiple backends, and having bindings for popular C/C++/Rust frameworks, like Polars. What most people don't seem to realise as well is that R is a general purpose programming language with a huge, varied, and very high quality package ecosystem for all sorts of cool stuff.
I just fucking love Julia
15:45 I don't know how it feels to program with limited RAM, but I am very happy to shove a lot of stuff into RAM to avoid round trips to the network and filesystem. Most of the time it makes sense to just shove a lot of stuff into RAM for more efficient processing, so I don't think having less RAM would make better software; having a lot of RAM allows us to cache a lot of stuff and avoid painfully slow IO operations.
So sad he didn't mention Snap!, the successor to Scratch, or rather the Scratch that has proper lists, lambdas, and is heavily inspired by Lisp. It was also created by a guy at UC Berkeley and is hosted by Berkeley, although it's mainly developed by the creator, Jens Monig, a former lawyer. When I say it was heavily inspired by Lisp, I mean it. In Snap! 10.0, we're getting the ability to convert Snap! code to Lisp-like code and vice versa. I don't know if it's fully runnable Lisp, but it does have the Lisp syntax. Enough about Lisp, can I just talk about how Snap! has proper functions, aka custom blocks? Yeah, Scratch has custom blocks, but those were added after BYOB, the Snap! predecessor (which also happens to be a modification of Scratch 1.4, although Snap! these days shares no source code with Scratch), added them. When I say proper, I mean you can create stack blocks, reporters, and predicates (which Scratch loves to call booleans), as well as adding more input types (which act like type hints). I could go on and on about how great Snap! is, but I'd be here all day.
And by the way, yes, the exclamation mark is part of the name.
So the origin of Object Oriented Programming, SIMULA-67, wasn't worth mentioning?? While the USA was "playing" with COBOL and FORTRAN, the Norwegians invented OOP in 1967!!!! For God's sake: OOP is almost 60 years old! Just look at Simula:
Begin
    Class Glyph;
        Virtual: Procedure print Is Procedure print;;
    Begin
    End;

    Glyph Class Char (c);
        Character c;
    Begin
        Procedure print;
            OutChar(c);
    End;

    Glyph Class Line (elements);
        Ref (Glyph) Array elements;
    Begin
        Procedure print;
        Begin
            Integer i;
            For i:= 1 Step 1 Until UpperBound (elements, 1) Do
                elements (i).print;
            OutImage;
        End;
    End;

    Ref (Glyph) rg;
    Ref (Glyph) Array rgs (1 : 4);

    ! Main program;
    rgs (1):- New Char ('A');
    rgs (2):- New Char ('b');
    rgs (3):- New Char ('b');
    rgs (4):- New Char ('a');
    rg:- New Line (rgs);
    rg.print;
End;
And Simula is quite pretty as well
8:50
he literally does
@@realryleu Not enough!
"who brings a pencil to a blackboard???"
I do, to have something to play with 😂
I'm not accepting what he had to say about Kotlin. It's a language that allows you to write applications that run on the JVM. Think of it more as a replacement to Java, rather than a language limited to Android apps. And arguably, the most popular IDE for Kotlin is IntelliJ, not Android studio.
14:32 "C is the Greatest Language of All time. Obviously the foundation of computer Science" !!! Couldn't be more true
They never mentioned "FORTH."
I really hate that people think object oriented programming is about classes and methods, instead of what it historically was meant for, which is isolated processes communicating via message passing. Poor Alan Kay, he's been fighting against the butchering for his ideas for most of his life and it's a fight that is evidently lost.
Nobody brings a pencil to a chalkboard fight.
To think back in the 2000s everyone was advised to disable JavaScript in their browsers. And we were told all websites would soon be built in Flash.
And a lot had at least a flash intro ...
COBOL: Completely Obnoxious Because Of Length
I tried Zig as a C replacement and wasn’t a big fan. Just felt clunky. I think I’ll have to give it another go in a few years.
Odin, however, felt like a seamless upgrade from C. It’s like C, but with some nice extra features, but still all the same low level control.
I had the same experience. I really don't get the hype behind it.
Especially since I found a bug in their testing code where it reported green even when the code wasn't passing, and the response was basically "yeah, meh, it does that...".
Not to mention the enforced whitespace (yet you still have to use semicolons) and the inconsistent naming schemes in the standard library.
It just feels so sloppy. And it ignores too much of the knowledge we've gained on how to design languages.
Trick to keeping your punch card stack in order (or any stack of cards): Draw a diagonal line across the thin edges of the stack when they are perfectly stacked and aligned. Then, if you have to recreate the order, just focus on recreating the diagonal line.
Original video lost me when he equated assembly to WASM which shows he has absolutely no idea what he is talking about
Why is he making a video that he has no idea what he's talking about?
@@Jean-rg9zg why are you commenting nonsense?
The original video has a bunch of comments clowning on him for a bunch of mistakes he made. Clearly the video was made by some child in high school taking intro to programming
Why did i learn Javascript?
@@thunderstein5041 because you lack the intelligence to learn anything else
Thanks for the reaction to my video. I realize that I made several mistakes in the video, I’ve been taking it in and I truly appreciate all of the constructive criticism everyone has given me. Keep doing what you’re doing and I love your stuff. ❤❤❤❤❤
en.wikipedia.org/wiki/Plankalkül#Data_types
en.wikipedia.org/wiki/Ada_Lovelace#/media/File:Diagram_for_the_computation_of_Bernoulli_numbers.jpg
Are you gonna do one on esoteric programming languages?
>mistakes
that's called lying without doing any research, mr chatgpt
As a Clojure programmer, seeing it completely out of the list... doesn't surprise me at all
The video is AI generated by the way. If you listen to the ruby section again you can see the tts struggle.
Crazy
Your channel is like a hidden gem on TH-cam. So glad I found it!
SQL is not a programming language. It is a data manipulation language.
Right. In PostgreSQL, as I recall, there was DML, the data manipulation language (SELECT, UPDATE, etc.), and DDL, the data definition language (CREATE TABLE, etc.).
Being wrong is the first step to being right. Or insanity.
First time clicking off of one of these early, sorry but I'm not listening to a guy that thinks assembly is a programming language that can run in the browser for 40 minutes
When a stack of punched cards was prepared, the author would usually make a diagonal stripe with a marker down the edges, so that they'd be easier to sort in the event of a spill.
LISP for life
This guy has an exceptional ability to turn 15 minute videos into hour long ones.
Forth not mentioned
My favorite punch card story was in Richard Feynman's memoirs, about doing the calculations for the nuclear bomb. They had several problems; mainly, every time they had to make changes to the program they would start over, until one day someone (I don't remember if it was him) realized they could simply restart the calculations with the results from the last run, from the point where the new function changed things.
This one thing literally saved them days or weeks of waiting.
This is why scientists hate programmers. ;-)
Assembly == Web Assembly = I quit
Fr bruh if that was true I would quit too
=> I quit**
I worked with assembly language from 1996 till 2003. Walker Financial System from California had an ERP called Tamaris that had its data abstraction layers written in assembly language.
Bro yo video is 43 minutes long 💀
Because his pauses make the video 3x longer.. but I'm watching to hear him spazz out so that's fine
The person making the video:
RPG? Sure
Scala? Sure
Scratch? Sure
Clojure? Nah, it won't make it into the list
Oh boy, Mumps. I wrote a process in Mumps that collected real-time appointment and visit data for patients a while ago, and it worked surprisingly well. I never thought I'd have the opportunity to write in such an old language, but it happened... and I am not much better of a programmer because of it.
MUMPS doesn't get enough love. The keywords can mostly get shortened to a single letter, but there's also only like 20 keywords for the whole language.
That, combined with how easy it is to create and access persistent data (since it's essentially the same syntax as accessing in-memory data), makes it a pretty neat language. I could definitely see an alternative world where it saw broader adoption outside of just healthcare and some banking/financial software.
26:48 Brazil mentioned again. And he also said that Elixir was created in Brazil, IIRC the only language for which he mentioned that.
If you want the purest example of an object oriented language, look at a Smalltalk example. It's one of the best languages I ever used
People say Elm is dead because the core language hasn't gotten updates since 2019, but that's exactly what makes Elm great. The language is stable and powerful, and you can focus on doing stuff instead of being constantly worried about the new lib that changed everything.
There are "frameworks" like Lamdera and elm-pages that extend the core functionality to satisfy current needs like SSR, but you don't need to relearn everything every year.
Also, Elm's type system is sooo much simpler and more useful than TypeScript's. You just define the type, say what types go in and out of a function, and nothing else. No weird workarounds, no number | string | boolean | Array | StrangeTypeICantFindTheDefinitionAnywhere
The language of the Lisa and original Macintosh was Pascal. We all didn't switch to C/C++ for Mac development until later in the 1980s.
No joke, Pascal brings back such good memories... a very good working debugger, quick to write and very readable... if you look sideways and remove the semicolons, it's not very far from Python
My prof told me that they clipped some part of the top of the punch cards and had a machine to sort them later on.
Pro tip! When carrying your program to the machine first take a marker and draw a diagonal line from one corner of the deck to the other across the side of the deck. That way if you drop the punch cards everywhere it's much easier to put them back in order. Much love from a cobol programmer
My first programming job was a mix of (mostly) COBOL and C on mainframe, running London Stock Exchange settlement and Real Time Gross Settlement between the UK wholesale banks.
Thank god I haven't been near COBOL in decades but that job did teach me at least a couple of things.
- Stateless, everything was stateless it had to be as massively parallel
- Memory usage and management. At code review I had to justify every variable declared and every single byte of memory used - when you have a few megabytes and several hundred different server processes, most multi-instance, you really cannot f*ck around
Babbage was a "high level assembly language" for the GEC 4000 (1970ish), which predated Ada, so the name was probably considered already used.
The GEC 4000 was a fantastic machine, way ahead of its time, especially the OS4000 OS with its %sink nullfile concept, which even today beats Unix/Linux's $null. The OS was on par with VAX VMS with its English commands. Its JCL was so flexible, better than bash today. I operated and coded on the GEC 4000 and many other museum pieces back in the 70s.
@@baconsandwichbaconsandwich727 I caught the tail end of that era, starting programming in the 80s. My early encounters included things like the AS/400 and IBM's big metal. Mainly though, I programmed on PCs interfacing to those systems. I had the displeasure of having to port a program from RPG to C, which was a slog.
Bell Labs was the FAANG of its time and put out more research papers than most universities
For large stacks of punched cards we would draw a diagonal line across the top edge of the cards which would help in reassembling a dropped deck. Rubber bands for small decks and metal boxes for larger ones were also important.
2:17 "Never be tired of being wrong." This is arguably the main characteristic of a true professional and/or expert. The difference between an amateur and a professional is that the amateur knows enough, but not enough to know when they're wrong. A professional/expert knows that they don't know enough to never be wrong.
A professional programmer knows they're wrong, but it works, so who cares.
Always Be Learning!
LabVIEW mentioned?! From my knowledge it's mostly used by EEs who do research with large, heavy equipment. I even had to use it just a few years ago. I hated it so much, but it's cool to see it in the list.
It’s used in… well.. labs
I'm never tired of being wrong , I'm a programmer love it
COBOL systems that still run are the perfect example of “if it ain’t broke don’t fix it”
You mentioning Turok RWs was all I needed to get thru the day.
I remember 1978 when the world was just FORTRAN, BASIC, and a lone nerd in the corner reading about Pascal. Good times
"raw dogging zeros and ones into the computer..." - this is what I am here for....
I swear I'm starting to get more passionate about programming with this kind of vids and community
Swift is what Prime wants Rust to be
The single thread performance of Prime in joking is always satisfying to see.
No, you usually did not need to rewrite your entire program if you tripped with punch cards. Usually you write/print the number on them so you can just sort them by that number again and you're golden
The punch card quick-insurance hack: make pencil lines on the sides of the stack. You can recreate the pattern in the worst case.
Of course the video falls short, but it’s pretty darn impressive (and fairly accurate) for a 15 minute high-altitude/high-speed tour.
I teach a high school engineering course and it’s a fun video to show them (with caveats). I needed to explain not only so many of the failed languages (not included!), but failed and specialty hardware platforms (NeXT, IRIX, blah, blah), many scripting “languages” (which should be included) and a vast history of OSs.
Corrupt punchcard stacks (of less than 300 cards) were actually NOT hard to sort, but that was a fun story.
I loved you freaking out about indenting! My professors were so inconsistent about it that one guy downgraded students who followed another guy’s strict specs!
I’m teaching them Python which is not MY choice. I’ve taught BASIC, APL, MATLAB, Pascal, C, Java, C++, but I don’t think any of them turned out to be both the “perfect” intro teaching language AND a great all-around language. I guess C++ or Java was my favorite. (I messed about with LISP and Prolog and I dodged some bullets there.) I’ve also worked with students solving math, databasing and statistical issues in other languages.
As for Babbage, yes, he was a genius, but Ada basically figured out how to turn what ended up being his mess into something that MIGHT have worked. Babbage is the earliest study in “great idea, poor implementation.”
The Traveling Salesman problem is more familiar to moderately informed people than talking about discrete math and graph theory (like Euler cycles/circuits/tour/trail/etc.).
I HATE the Java/JavaScript confusion. Those people clearly suffered from TOO MUCH COFFEE (which, contrary to Paul Erdős' belief that mathematicians turn coffee into theorems, did NOT result in good computer language naming strategies)!
Okay, now to pick your brain: if you wanted to choose THE MOST VALUABLE LANGUAGE to both teach AND use regularly, what are the five you would choose (ordered from best to worst)?
Finally a programming language review not entirely focused on stupid web development.
I love watching people try to explain object oriented programming with that same style of example. 'And the fire Pokemon class inherits from the Pokemon class, which has an Attack method and an Evolve method, and the char-type Pokemon class inherits from the fire Pokemon class, and then Charmander, char-whatever, and Charizard inherit from the char-type Pokemon class, and then we instantiate a new instance of the Charmander class, which overrides the Attack method with 'Attack' and '4 damage' and overrides the Evolve method to return Charmeleon (I think that's it), and we name it CharChar, but it's of type CharTypePokemon, not Charmander, because it can evolve, and then we apply the Evolve method and it returns a Charmeleon and we assign the value to our variable CharChar and and and and'
They always get so excited talking about Pokemon (or whatever their example is) that they go on and on and on while completely missing the point of object orientation. I mean, really, it's comical how people who don't really understand what OOP is about get so excited about their overengineered example attempting to explain it.
I recall my professors lamenting about the inconvenience of having only one computer available in the entire province. This resulted in several weeks of waiting in line, only to discover an error in your cards and having to start the process all over again, ultimately ending up at the back of the queue.
I am new to the channel and so far I am having a pretty good time.
Also, due to my birthplace and since Lua was mentioned I am obliged to state : THIS IS BRAZIL!
John Carmack wouldn't have become so well known if we hadn't had such restrictive computing limitations in the early 90's
My aunt used to be a programmer for Sprint in the 90s and specialized in QA for COBOL systems. Crazy that they are still using it in finance!
This is one of my favorite videos now, church.
I really miss ActionScript. It was my first real introduction to actual coding and OOP principles.
And yes, technically HTML is a declarative programming language.
22:35 Brazil mentioned!! Yes!!!
Update: 40:00 Brazil mentioned twice YEESS!!!!
laughed almost every minute in this video, you were mentioned by theo also (which brought me here), subscribing now 🤟
"If you're not ready to argue uselessly for hours about things that don't even matter, you're not ready to be a programmer." I've never heard my personality so concisely described, or why I chose programming, self-flagellation and all, explained in a way I accepted. Never felt so seen, attacked, and as though I found my people, all simultaneously.
Subbed.