@@dtimer319 Correct, Rust is extremely good when it comes to speed and safety, but as far as I know, Rust is slower than C++ - only on a marginal level, but sometimes that margin is relevant. I would like to see the cases where it is faster, since I would personally question the speed of the code itself rather than what it has been compiled to.
@@dtimer319 It's a lie. For the exact same program, C++, C and Rust will compile to the same machine code. Rust and C++ can be faster than C in some cases, for example (std::sort vs qsort; see the sketch after this thread) the compiler can directly inline the sorting comparator. But a C programmer optimizing the hell out of his program can use macros and/or simply rewrite qsort for their specific type to get to the same speed as C++/Rust. If you compare the assembly code emitted by C, C++ and Rust, it's for all intents and purposes the exact same. If you check a test showing a performance comparison between those 3, check the code used; it won't always do the exact same thing. It's quite easy to understand why: Rust uses the LLVM framework, the same one Clang uses to compile C and C++ code, so the same optimizations can be performed for all 3 languages. With modern processors, when it comes to performance, the best way to optimize is to keep in mind cache locality and branchless programming.
@@entropy4959 there are plenty of benchmarks. At least for math problems, the best Rust solutions are often the fastest or second fastest. However the little known D language beats it in terms of speed any time it isn't forgotten to be included in those tests because D is the fastest boy in the race, when it does happen to race.
So the website “99 bottles of beer” (a site devoted to collecting programs that print the song's lyrics) has 1500+ languages. Given 1 second per language, it would take 25 minutes just to say the names…
Fortran makes use of columns too. On the old Fortran coding pads where you wrote your code, there was a special column for the C character, which marked a comment.
LOGO was the first language I ever learned about. It was in middle school. We had these old 486 computers with early Windows and PrintShop and Logo. I had a lot of fun making different looking designs. In a year or two I would buy a used TI-83 for $20 and teach myself TI-BASIC, which is incredibly easy because it's an already very limited version of BASIC. Those were some fun times!
FORTH? (mentioned before). Bash (Unix/Linux shell). Snobol (someone should look at the history of programming languages!). Of course there is Intercal (Please!).
dude I have been immersed in tech my whole life, and never have I been told that the first compilers were actual physical machines with moving parts. That's actually kinda wild
Still alive and well and in active use - be that via evolution (M -> Caché -> IRIS, gaining a bunch of functionality such as OO along the way), or the purer variant (GT.M)
Hi @giociampa, I worked for one client just a few years ago and got IRIS shortlisted, but in the end, they went for what "they knew" even though IRIS blew everything out of the water on cost/performance, etc. 🙄
4:17 I'd just like to interject for a moment. What you're referring to as Linux, is in fact, GNU/Linux, or as I've recently taken to calling it, GNU plus Linux. Linux is not an operating system unto itself, but rather another free component of a fully functioning GNU system made useful by the GNU corelibs, shell utilities and vital system components comprising a full OS as defined by POSIX. Many computer users run a modified version of the GNU system every day, without realizing it. Through a peculiar turn of events, the version of GNU which is widely used today is often called Linux, and many of its users are not aware that it is basically the GNU system, developed by the GNU Project. There really is a Linux, and these people are using it, but it is just a part of the system they use. Linux is the kernel: the program in the system that allocates the machine's resources to the other programs that you run. The kernel is an essential part of an operating system, but useless by itself; it can only function in the context of a complete operating system. Linux is normally used in combination with the GNU operating system: the whole system is basically GNU with Linux added, or GNU/Linux. All the so-called Linux distributions are really distributions of GNU/Linux!
No, Richard, it's 'Linux', not 'GNU/Linux'. The most important contributions that the FSF made to Linux were the creation of the GPL and the GCC compiler. Those are fine and inspired products. GCC is a monumental achievement and has earned you, RMS, and the Free Software Foundation countless kudos and much appreciation. Following are some reasons for you to mull over, including some already answered in your FAQ. One guy, Linus Torvalds, used GCC to make his operating system (yes, Linux is an OS -- more on this later). He named it 'Linux' with a little help from his friends. Why doesn't he call it GNU/Linux? Because he wrote it, with more help from his friends, not you. You named your stuff, I named my stuff -- including the software I wrote using GCC -- and Linus named his stuff. The proper name is Linux because Linus Torvalds says so. Linus has spoken. Accept his authority. To do otherwise is to become a nag. You don't want to be known as a nag, do you? (An operating system) != (a distribution). Linux is an operating system. By my definition, an operating system is that software which provides and limits access to hardware resources on a computer. That definition applies wherever you see Linux in use. However, Linux is usually distributed with a collection of utilities and applications to make it easily configurable as a desktop system, a server, a development box, or a graphics workstation, or whatever the user needs. In such a configuration, we have a Linux (based) distribution. Therein lies your strongest argument for the unwieldy title 'GNU/Linux' (when said bundled software is largely from the FSF). Go bug the distribution makers on that one. Take your beef to Red Hat, Mandrake, and Slackware. At least there you have an argument. Linux alone is an operating system that can be used in various applications without any GNU software whatsoever. Embedded applications come to mind as an obvious example. Next, even if we limit the GNU/Linux title to the GNU-based Linux distributions, we run into another obvious problem. XFree86 may well be more important to a particular Linux installation than the sum of all the GNU contributions. More properly, shouldn't the distribution be called XFree86/Linux? Or, at a minimum, XFree86/GNU/Linux? Of course, it would be rather arbitrary to draw the line there when many other fine contributions go unlisted. Yes, I know you've heard this one before. Get used to it. You'll keep hearing it until you can cleanly counter it. You seem to like the lines-of-code metric. There are many lines of GNU code in a typical Linux distribution. You seem to suggest that (more LOC) == (more important). However, I submit to you that raw LOC numbers do not directly correlate with importance. I would suggest that clock cycles spent on code is a better metric. For example, if my system spends 90% of its time executing XFree86 code, XFree86 is probably the single most important collection of code on my system. Even if I loaded ten times as many lines of useless bloatware on my system and I never executed that bloatware, it certainly isn't more important code than XFree86. Obviously, this metric isn't perfect either, but LOC really, really sucks. Please refrain from using it ever again in supporting any argument. Last, I'd like to point out that we Linux and GNU users shouldn't be fighting among ourselves over naming other people's software. But what the heck, I'm in a bad mood now.
I think I'm feeling sufficiently obnoxious to make the point that GCC is so very famous and, yes, so very useful only because Linux was developed. In a show of proper respect and gratitude, shouldn't you and everyone refer to GCC as 'the Linux compiler'? Or at least, 'Linux GCC'? Seriously, where would your masterpiece be without Linux? Languishing with the HURD? If there is a moral buried in this rant, maybe it is this: Be grateful for your abilities and your incredible success and your considerable fame. Continue to use that success and fame for good, not evil. Also, be especially grateful for Linux' huge contribution to that success. You, RMS, the Free Software Foundation, and GNU software have reached their current high profiles largely on the back of Linux. You have changed the world. Now, go forth and don't be a nag. Thanks for listening.
The question is how in the world did the GNU Project arrive at the worst thing to pronounce with the fewest possible letters. Seriously--it's only three letters and when people see it for the first time, there's about three possibilities that come to mind for how to say it. It's simply a case of stereotypical computer geeks doing something brilliant and not knowing the first thing about marketing, and now we get to hear the oft-repeated lamentations of disrespect when that's not the case at all. Linux is just easy to say and kinda catchy--that's it.
Typescript, Go, Elixir, Lua, C/Zig; in my opinion the stack that covers most of the potential software you would wanna make. Potential things to add/swap: C++/Rust, Python/Julia, Java/Scala. But in general, I think this gets it done for most use cases. edit: I didn't include specialization languages like Kotlin, Swift etc.; this is more of a "general take" on programming languages since I assume people questioning which language to learn tend to be more junior and new to the field.
"Every Programming Language Ever…" is quite an exaggeration. For instance, these are missing (in no particular order): Object Pascal (Delphi can be considered a dialect of it); Modula2, Modula3, Oberon; E; AmigaE (not related to E); REXX; Eiffel; J (inspired by APL); Racket, Scheme, Clojure, …; Forth; TCL; Korn shell, Z shell, Bash …; AWK; D; BCPL; groovy; Oz; REBOL (and red); Haxe; occam; … and a lot, lot more. Moreover, non-Turing complete languages (as SQL, HTML, CSS) shouldn't be listed in a "programming languages" list. As pointed out by others, there are mistakes. I want to focus on a languages-related few of them: § **Raku** is a different language than Perl (they decided to change the name from Perl6 to Raku to stress the fact that it is a different language); § Go is not like Python at all - and Go is a compiled lang, Python is not § maybe just ALGOL, not ALGOL 60; otherwise you should list other ALGOLs as well (58, 68, W) § C was influenced by Algol 68, and B (derivative of BCPL)
@@doigt6590 I suspect that the "influenced by Algol ..." label could be given to many languages. Then one can argue that Algol 68 wasn't the first and only to use this or that feature/concept/syntax; then one should know the languages C's creators were exposed to… and so on. It's hard to track correctly; anyway, I think it isn't unlikely that it somehow influenced all the languages created in the time when it was, er, influential :)
@@MauroPanigada Instead of suspecting something, you could actually look it up. The authors of C wrote on the history of its making and they are quite clear on the relation between C and Algol.
@@doigt6590 Looked it up on wikipedia hoping it would have an easy reference of such a history lesson… they have a citation which seems to contradict your "Saying C was influenced by Algol 68 is quite a stretch too." The quote is this: "The scheme of type composition adopted by C owes considerable debt to Algol 68, although it did not, perhaps, emerge in a form that Algol's adherents would approve of."
@@edwardfalk9997 it's used most commonly for developing iOS apps, but it's a programming language in its own right; it can be used for other things like web services, etc.
This is the case with macOS and iOS in general; it's not even unique to Swift. This is due to Apple's licensing. Also, Xcode is not just for Swift and is much older than Swift, and there are other IDEs for Swift; Swift and Xcode are not tied to each other. This dude had so many things wrong about these...
@@edwardfalk9997 I think they mean you can't compile to a macOS or iOS executable from Linux or Windows. Swift has run on Linux for about 7 years and Windows for about 3. It was released under the Apache license starting with the first version to support Linux.
@@jaybutler I think they mean you can't BUY a compiler to do it. You can WRITE a compiler for anything to anything in ANY environment which supports STANDARD IN, STANDARD OUT and STANDARD ERROR on any architecture. If it was saleable someone would have done it. PS I used to compile programs in Algol on IBM 360 series mainframes to execute on DEC PDP10 machines.
A lot of missing things here. A few examples off the top of my head: Tcl, Modula-2, Modula-3, BCPL, B, BLISS, FOCAL, FORTH, Scheme, Utopia, Eiffel, bash scripts, csh scripts, D, DIBOL, MDL
And what about FOCAL, and the two independently made languages called C--? What about Modula-2/3 and Oberon? And most important of all, where's PL/I?! And Forth?!
this isn't every programming language. There were lots of Russian-based programming languages in the Soviet Union, a French-based programming language in France, some newer programming languages like V, and some older ones are also missing, like S, A+, B, D, E, Erlang, OCaml, plus more modern non-English programming languages like Produire (プロデル, Japanese) and the 1C:Enterprise programming language (Russian). There are also tons of others missing in here. It would've helped if the video had some disclaimer for how it was classifying these, because it seems to be leaving a lot out.
Need to add the languages Forth and ARexx to the list. My list of languages over the years is: BASIC, Forth, assembly (multiple CPUs: 6502, 680x0, VAX), Pascal, COBOL, C, C++, ARexx, HTML, SQL (I've looked at other computer languages but haven't programmed in them)
No language will ever come close to fully replacing C (at least for now), but Rust has the biggest chances. This is because of the support for C on essentially everything that runs code, and its established role in tech.
Probably best sticking to text-based languages; if you start down the Scratch route you might as well include Unreal Blueprints and Houdini OP graphs etc.
LISP: The only computer programming language I've ever been shown that I just cannot get my head around. I have no particular training or practice with Python or Rust, but they're just programming languages. Lisp seems to be something else.
@@KanashimiMusic The two are not directly correlated. Low level gives you more control over memory management, that part was right. You could say the same about a pair of scissors and a chainsaw: if you have no idea how a chainsaw works, it's useless.
@@NetherFX They are definitely correlated, very strongly I would even say. I don't know what exactly you mean by "not directly", but most traits that are inherent to higher-level languages naturally come with performance impacts.
1. As you yourself said, low level gives you more control over memory management, and if you don't have that control, the language (or rather, the runtime) has to do the work for you, very often through a garbage collector, which has some performance impact. That is unless the compiler manages the memory for you, like in Rust, but calling Rust a "higher-level" language is debatable at best.
2. High-level languages usually include a lot of non-zero-cost abstractions and overhead which can have significant performance impacts.
3. All high-level languages that I can think of are interpreted. Even languages that are usually referred to as "compiled" are really not - they're compiled to bytecode for a virtual machine like the JVM, which then interprets or "just-in-time compiles" that code to actual machine code at runtime. And quite obviously, interpreting also has a significant performance impact. Granted, the JVM JIT specifically is known for being very good at optimizing in real time, but even it can't possibly beat ahead-of-time compiled, ACTUAL machine code.
I'm aware that low-level vs high-level is not binary; rather, it is a spectrum that's not very well-defined or universally agreed upon. But it's impossible to deny that many of the traits that make a language "high-level" inherently lead to programs written in that language being slower.
ok so: assembly is not a language, it's more like an umbrella term for a class of languages that closely mimic machine code. there are assemblers like gnu as or llvm-as that support multiple assembly languages for different architectures - x86, arm, wasm etc. there's also fasm - an x86_64 assembler that does a lot more than just spitting out machine code, and it can be metaprogrammed in its own flavor of intel syntax assembly. high level vs low level doesn't bear much meaning anymore; in fact, languages like c are considered low level by today's standards even though they were conceived as high level. object orientation is much more about inheritance and encapsulation of state than about having methods on structs of data. take go - it has all the features you described yet it is very much a procedural language. you could take the pokemon example and implement it in C (see the sketch after this comment chain). What OOP would allow you to do is make a class for all pokemons, a subclass for fire pokemons and a sub-sub class for Charizard, and have parent classes implement greatest common denominators for their respective kinds: like pokemon have hp and can perform tackle, fire pokemons are weak to water pokemons and can perform ember, and Charizard is a concrete fire pokemon with its name and set of sprites. tbc
Pointers aren't code, they're memory locations represented by numbers. C wasn't even the first to introduce them; they're just a very fundamental piece of system-level programming. What C did right was it made using them easier, and also the most important achievement of C is probably a very uniform and easy-to-reason-about type system. E.g. you might need a type that can hold 32-bit signed integers - go for "int"; want it unsigned? "unsigned int". There's an unsigned int somewhere in memory and you want a handle (a pointer if you will)? "unsigned int*". Pointers won't save you from memory leaks though, quite the opposite. That's part of the reason why modern high level languages manage the memory for you. I feel like you overstated the contribution of smalltalk. Besides objective-c and a bunch of esoteric hobby langs, modern day oop mainly stems from C++, which got its model from simula. Statically typed functional programming's main selling point isn't the static nature of its types. C is also statically typed, so are algol and fortran. What's cool about stfp is mainly the fp part combined with very expressive type systems. Look up rust enums for more information.
Ok this is very much a nitpick but: C++'s main thing is probably templates, not classes. Like generics in other languages they can do the things you'd expect - map and so on. Thing is, c++ compilers are insanely good at optimizing templated code, and templates can also be used to evaluate things at compile time. They come in handy when you want a simple-to-use interface that compiles to the fastest possible machine code. Ok, haskell wasn't the first one to the immutability thing; also its main feature is probably a category-theory-compliant standard library - monads and such. About it being based on lambda calculus - yes, so are ML and most pure functional programming languages. The cool thing about haskell is that it is in fact based on typed lambda calculus, allowing it to e.g. do insane type inference. Like haskell can literally infer a whole implementation of an interface for you. Javascript is ecmascript: javascript is the marketing/informal name while ecmascript is the actual name of the language. Also it runs outside of browsers, look up nodejs. There are even web servers that run on javascript.
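To make the Pokemon point above concrete, here is a minimal C sketch (type names and stats are invented for illustration). It fakes the "greatest common denominator" idea with struct embedding, and the shared behavior takes a pointer as its handle, per the previous comment:

#include <stdio.h>

/* the "base class" is just a struct */
typedef struct { const char *name; int hp; } Pokemon;

/* a "subclass" embeds the base as its first member */
typedef struct { Pokemon base; int ember_power; } FirePokemon;

/* common behavior shared by all pokemons, via a pointer handle */
static void tackle(Pokemon *p) { printf("%s used Tackle!\n", p->name); }

int main(void) {
    FirePokemon charizard = { {"Charizard", 78}, 84 };
    tackle(&charizard.base); /* works on any Pokemon, fire or not */
    return 0;
}

What C won't give you for free is the dispatch and subtype checking a class hierarchy automates; that gap is the commenter's point.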
Nice list! Thanks for the video. On what criteria were these languages chosen? For example, why weren't V and Crystal included? They're my recent favorites :) both really deserve to be mentioned
Knowing our fortune, if Crystal was mentioned, it'd be a two-second section saying it's "Ruby, but with better speed" :< How he massacred my boy Nim.... I know it's a niche language and it can be summarized as "Python, but C", but there's so much you could include to expand it even with a few seconds more... like the memory management system you can choose, or macros allowing you to basically rewrite the AST to your needs!
Lots of errors, however, I'm interested in the omissions. While there are many, some major ones are: Command/Cmd/4dos/Tcc, Rexx, Bourne/Korn/Bash, Csh/Tcsh, Forth, Awk, Sed, Groovy, Tcl, dBase/Clipper, Yacc+Lex, Make, m4, Postscript and many others!
Fortran is one of the most performance-focused languages we have today. It's used in places where you need the highest performance, along with C. For context, it's generally accepted that you can't really beat C/C++ on raw speed, but Fortran has managed to at times.
It's weird hearing "Fortran" and "slower" together. Fortran compilers have been around so long and are so refined that they generate assembly so optimized it's considered one of the only languages that runs faster than C. It's a *rocket ship* of a language (and yes, it'll do your GPU stuff via its vector extensions and run it crazy fast). It's also so old it's possibly your actual grandfather.
Many of those were half-assed, either explaining next to nothing about them or being factually misinformed. If you want to explain "Every Programming Language Ever" in 15 minutes, you can trim your shilling of Python and be more informative about all the others, while grouping them based on the paradigms they primarily focus on.
A memory leak isn't the issue of using too much memory. A program is still fine as long as it knows what memory it's using. A memory leak is the problem where the program forgets to actually delete memory it no longer uses; the computer (operating system) thinks that the program is still using the memory, but the program thinks it's not.
The problem arises when the program forgets to free the memory, but then thinks it needs MORE memory and asks for it. If this happens in a loop the system will eventually run out of memory.
I run into this with some of my older classroom machines and have to teach middle school students about this issue. They have no clue!
Resulting in the program using too much memory
@@FranLegon Doesn't necessarily result in that. The program can leak just a few bytes on startup because it forgot to delete some initialization variables, while the rest of the program's memory is still managed properly. It's still dirty, in the sense that you forgot to take the dirt off your shoes before entering the house.
A memory leak can be thought of as forgetting to throw out the trash. Usually users don't notice it. The problem only gets noticed when the trash raises a stink. Like how humans treat the environment.
@@GaryFerrao ok that's a good point. I was thinking it always build up to "too much" but you're right, it might not. Thanks for the response
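A minimal C sketch of the pattern this thread is describing (illustrative only; the 1024-byte buffer is an arbitrary example):

#include <stdlib.h>

void handle_request(void) {
    char *buf = malloc(1024); /* ask the OS for memory */
    if (buf == NULL) return;
    /* ... use buf ... */
    /* missing free(buf): the pointer is lost on return, so the program has
       "forgotten" the allocation, but the OS still counts it as in use */
}

int main(void) {
    for (;;)
        handle_request(); /* leaks a little per call; memory use only grows */
}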
Sadly this video has many mistakes, but it was still fun to watch.
Yeah, the part about Swift is wrong, as it's available on Windows and Linux (the Arc browser is even building its Windows app on Swift). Yes, iOS and Mac development probably requires a Mac, but you can 100% write command line apps or backends on Windows and Linux. Also, Kotlin is useful for more than Android development; it's fully supported in the popular Spring framework.
Right off the bat with assembly too 😂
Agree.
This is more for entertaining than for accuracy.
I don't think anyone would watch a video of this kind for its accuracy. It's clearly aimed for the common layman (i.e. someone who's only getting started in programming)
@@ashwinnarasimhan2729 technically yes, practically no. Same as with C#/.Net, which can be used with Mono outside of Windows. But why would you do that?
After the line "You can still find Assembly in modern world: Web Assembly", I realized that this person knows nothing about any aspect of programming beyond Wikipedia info.
Oh hey man, thanks for mentioning Holy-C. The dude behind it, I actually talked to online a lot. He was actually a really nice guy when the schizophrenia wasn't screwing him up. He was a weird, crazy genius, and ultimately he died tragically alone because of his mental illness. TempleOS and Holy-C are mostly remembered for the mental illness behind them, but Terry was a real human, and he deserves to be remembered as a mad pioneer who could have been so much more if it wasn't for the fact that this world is terribly unequipped to support people with significant mental illness. RIP Terry.
I also got happy when I saw Holy-C
Was surprised, but pleased to see it mentioned too.
110% agree on not only how he should be remembered, but on how he could have been so much more.
Dude was a genius.
RIP Terry.
I'm always surprised whenever somebody does not remember Terry for his genius self, but for the "controversial" attitude he had. Shows what society values more and how we don't deserve people like Terry. Hope he's in a better world now.
Is there still content of him around somewhere? I saw some clips of him and would really like to see him program or smth.
I more remember him for saying the N word and ranting about government agencies.
Go is formally known as Go, not Golang. Golang is something we use for easier SEO
Some do, and then there are "the others", who call the language Golang, in written and spoken language.
@@tomasprochazka6198 "Formally" is the key part you're missing. I know you meant no harm but it's annoying when someone tries to pull the "but actually 🤓" and isn't even right.
@@dejangegic golang is rarely used informally as well. Between devs, we know what go is, we don't need to call it golang just because search engines can't
I wish R would do that. Trying to find anything code-related in a language whose name is a single letter is horrible; you end up just looking up packages most of the time.
A memory leak isn't using too much memory; that's the consequence of a memory leak.
♥
I love how the whole comment section is just programmers correcting this video
fr XD
he doesn't know #garbage collection exists in the first place🤣🤣🤣💯👎
Nothing like a triggered autistic programmer.
This video is just completely riddled with inaccuracies, please viewers of this video, take everything said with a large grain of salt and do your own research and investigation of programming and computing topics.
A veritable mountain of salt.
Web Assembly is not assembly. I've written it in C++, but I think other languages are supported.
Assembly is not a well-defined term. Most assemblies are higher-level than you would expect. For example, procedures/subroutines don't even exist in machine code, so saying WebAssembly is assembly is debatable, but I'd say it is not a false statement. Furthermore, WASM targets a VM, so it is the lowest-level language for that specific machine.
eeeeh, sort of. "Assembly" is pretty non-existent as a unified concept. "assembly" is basically just putting names to sequences of raw machine code; it's not really a language in and of itself. For instance, even on a RISC architecture nothing is stopping you from implementing x86-style inline memory referencing and then just binding that formatting to a sequence of bytes which does that, even though technically it'd be several distinct machine instructions.
Calling "Assembly" a language is sort of like calling macros a language; neither is really a language in itself, they just sort of stand in for the language itself. (Assembly just puts words to sequences of raw bytes of machine code, whereas macros just put words to sequences of source code.) In that sense, there isn't really a difference between "assembly" and *_any_* compiled code, and WebAssembly is assembly as much as anything else can be.
Granted, WebAssembly is a *_unified_* compilation target, i.e. it's not running *_completely_* bare metal and it's not *_completely_* raw bytecode, but this is more an issue with the classification IMO. You wouldn't call Java an interpreted language, but it similarly works in a 'virtual machine'. There is a bit of a grey area between compiled languages and interpreted languages; that grey area is just typically so small as to not really matter. WebAssembly is basically trying to be as close to completely native as possible while staying a unified compilation target, so I'd say it being 'assembly' (given, as mentioned previously, 'assembly' basically just means bytecode with extra steps) is a fair classification. It's a bit inaccurate, but every classification is.
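To make the "names for raw bytes" point concrete: the x86 instruction mov eax, 1 is just a human-readable name for the five bytes B8 01 00 00 00, and a traditional assembler does little more than that substitution, plus resolving labels and macros.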
WebAssembly is a binary format, so it's not human-readable, while the purpose of assembly is to be a human-readable representation of machine code.
WAT (WebAssembly Text Format) could be called an assembly.
You don't write WebAssembly with C++. You can compile C++ into Assembly and WebAssembly
Indeed. WASM has support in many programming languages. Rust probably has some of the best support, though. For C/C++, a tool called Emscripten can compile them to WASM.
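A quick sketch of the Emscripten route (this is the textbook hello-world invocation; treat the exact options as an assumption, since they vary between versions):

/* hello.c -- compile with: emcc hello.c -o hello.html */
#include <stdio.h>

int main(void) {
    printf("hello from WASM\n"); /* runs in the browser via the generated JS glue */
    return 0;
}

emcc emits hello.wasm plus the JavaScript and HTML glue needed to load and run it in a browser.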
10:40 Note that HTML and CSS are not scripting languages but markup languages. You can't have logic with these, right? Despite CSS having the calc() function, which can be abused to have some logic (see the note at the end of this thread), it's still not scripting or programming, just markup. Also, I miss Hack (PHP extended by Facebook) and Vala; both are great and could be on the list!
Why do they need to have "logic", after all, isn't programming just instructions to a computer?
@@HappyPurpleFella Computers run on logic, so in order to write instructions to a computer the language needs to be able to perform logic. That's why CS students learn Discrete math at minimum, it's a mathematics field devoted to formalized binary logic, i.e. exactly what computers do. I'm reading a book called Code by Charles Petzold that does a great overview as to how and why computers developed as they did, if you're interested in learning more :)
Edit: This just came to me, maybe it's a good example. This need to perform logic could be why you could code a game in Excel, but not in Word. Excel can perform logic and math, but Word can't
HTML and CSS are quite literally Turing complete; if that doesn't reach the bar for scripting language then I'm not sure what does.
Okay.. And?
@@gintoki_sakata__ and it should not be on the list of programming languages 🤷‍♂️
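For anyone curious about the calc() trick mentioned at the top of this thread, it usually looks something like width: calc(var(--is-active) * 100px); with --is-active set to 0 or 1. That's arithmetic standing in for a boolean, not real control flow (the property and variable names here are just an illustration).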
There is no mention of FORTH; the one language seemingly everyone forgets.
no mention of Algol 68, Modula 2, JOVIAL, etc .....🤥
Forth was a brilliant language. I spent years reading about it before I ever got a chance to use it because I was utterly obsessed with its concept. And man, when I finally got to use it, it was like a duck to water for me. It's basically inverted Lisp: one simple concept, stacks (as opposed to Lisp's one simple concept, lists), taken to its logical end. The end result is that while Lisp naturally produces functional programming, Forth naturally produces structured programming, leading to a very, very interesting conclusion: the difference between functional and structured programming is the order you remove items from a linked list. Beautiful mathematical comp-sci in the form of a language.
Everyone should learn 3 languages, purely for deepening their understanding, not for job skills: Lisp, Forth and Smalltalk. Those three languages distill the purest essence of their respective paradigms of coding: functional, structural, and OOP respectively. And all three of them you can learn in a few days each, tops.
@@shayneoneill1506 I did not know Smalltalk was the OOP proto-language. I'd always been under the impression that C++ was, which I've always found intimidating. I will look into Smalltalk, thank you.
I've legit never heard of Forth. But maybe before my time?
@@Stettafire Yes. It was created by a man named Chuck Moore, in the mid 1970s.
: STAR 42 EMIT ;          \ emit ASCII code 42, an asterisk
: STARS 0 DO STAR LOOP ;  \ take n from the stack and print n asterisks
: DOUBLE DUP + ;          \ double the number on top of the stack
: QUAD DOUBLE DOUBLE ;    \ quadruple the number on top of the stack
16 4 QUAD STARS
If you enter the above into GNU Forth, it prints a row of asterisks: 4 QUAD leaves 16 on the stack, and STARS loops that many times.
+ BCPL, Forth, Modula, Modula-2, Metafont, PostScript
I didn't hear C# or Bash mentioned.
S, A+, B, D, E, Erlang, OCaml, Produire, the 1C:Enterprise programming language, V, some old Soviet programming languages in Russian, and even more are missing than just those.
around 8:50, when "he" started to read the website names, I realized that the script is read by an AI voice
now that I think about it, the whole video was probably created by AI
It was created about two months ago, has tons of factual mistakes and weird choices and organization (WebAssembly? PowerShell, and not bash? Zig, but no Rust?), and plenty of bizarre, unnatural speech. Plus, totally unfocused content; none of the channel's other videos are about programming. I didn't see it all, but I think you're totally right: this is mostly or entirely AI.
I think you two are right. The whole structure of this video is pretty sus
@@omnisel Rust is in there. Although I thought the exact same thing: "Zig and not Rust?" Hovered the timeline and there it is. About as meaningful an overview as everything else on the list.
@@mashydp1780 Oh huh, yeah, you're right. I must have missed it, or been thinking about something else he said, or something, lol. It was 14 seconds long, in my defense.
I thought about this as well. It is so incredibly simple to have AI generate a list of 50+ (or however many there were) programming languages and write a short description about them regarding either their history or noteworthy components and then use an AI voice to read it. It explains why there's quite a few errors with some of the statements.
I wish JS was only used for web browsers, but people made the stupid decision to use it in desktop apps, on web servers, and way more
You mean Electron apps? Which are websites packaged in a Chromium browser...
@@theramendutchman I think he meant Node.js
what’s so stupid about deciding to use JS on web servers?
@@danielegvi JS sucks
@@emireri2387 specifically how
The way you explained object oriented language is making me cry, I am *NOT* joking.
Yes, classes are not required for OOP, for example. Class-based OOP is only one of the many ways to implement OOP.
Saying C cannot handle OOP is plain wrong (see the sketch after this thread).
@@notkamui9749 I mean, by definition, it's right
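Since this thread asserts it without showing it, a minimal sketch of class-free OOP in C via a function pointer (all names invented for illustration):

#include <stdio.h>

/* an "object" couples data with behavior; polymorphism comes from the function pointer */
typedef struct Animal {
    const char *name;
    void (*speak)(const struct Animal *self);
} Animal;

static void dog_speak(const Animal *self)  { printf("%s: woof\n", self->name); }
static void duck_speak(const Animal *self) { printf("%s: quack\n", self->name); }

int main(void) {
    Animal animals[] = { {"Rex", dog_speak}, {"Donald", duck_speak} };
    for (int i = 0; i < 2; i++)
        animals[i].speak(&animals[i]); /* dynamic dispatch, no classes in sight */
    return 0;
}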
Next you should do "Every JavaScript framework ever explained in 60 minutes"!
Underrated Comment.
Impossible without running the video on at least 2x
But there will be two more in the time it takes you to watch the video
More like 60 years 😂😂😂
Html and CSS are not scripting languages. In fact, they are not programming languages at all as they have no functional abilities. They are called markup languages and are completely static unless manipulated by a programming language like JavaScript.
I genuinely did not realize this video was read (and probably made) by AI until a comment pointed out how weird it sounded when it read out the website names. On closer examination, this video is definitely read by AI and had many signs pointing to it, but if you weren't purposefully looking for it, it just goes to show how good AI is getting now.
AI can have a lisp now??
maybe the script was written by AI, but the voiceover is a real dude (although misinformed)
@@lestusmestus if it really is AI, then the creator of this video likely trained the model himself on his own voice
@@FilterBud I mean could be. there is pretty good voice cloning sites out there
WebAssembly is NOT an assembly language.
I can't believe that you didn't mention that Erlang was designed to work even if you change the code *while* it's running etc.
And LISP deserves a mention of the Lisps, the languages descending from it, due to how easy it was to make new personalized languages in LISP to begin with.
HTML is a markup language, not a programming language. Likewise, CSS is also not a programming language, but a style sheet language. Furthermore, JavaScript is related to Java in name only, a marketing ploy by the folks behind Netscape.
Wow, I taught Pascal and C in the Borland IDE that you used in your screenshot. MAN, THAT BRINGS BACK MEMORIES (esp. malloc()!)
14:26
"Developers test their code thoroughly"
I put my fist through my desk I was laughing so hard.
As a dev, that's why we have QA! I support QA in all their endeavours! They allow me to avoid doing the crap I hate.... (I'm only slightly joking. I still do TDD and write unit tests. But if you think TDD can replace proper QA you're fucking deluded)
While PowerShell was originally Windows command line on steroids, it's now pretty much a scripting interpreter for C#/.NET (i.e. doesn't require compiling)
You mentioned powershell, but forgot bash.
LLVM is not a virtual machine...
Unfortunately, I have a correction. Swift is no longer only available for Apple devices. It can now be written on and for Windows and Linux. Great video otherwise tho!
You can even write a server in Swift. A better question is, why would you?
@@dejangegicapple paid you to
@@dejangegic Swift compiles to a binary, like Go. Unlike Go, there's no garbage collection: just super-fast automatic reference counting. Swift has lots of language niceties like a strong type system, excellent type inference, protocol types, type extensions, and maybe the best enum implementation ever made. Swift's Foundation library contains so much functionality that most other languages need help from a patchwork of packages. There's an excellent server ecosystem covering nearly all other use cases.
There are plenty of downsides. Using Swift on Linux has different behavior than using Swift on Apple systems for things like URLSession. So writing server Swift will mean adopting third-party libraries like AsyncHTTPClient. Most Swift server frameworks rely on the SwiftNIO library because Swift's async/await system can hop threads (though I know this is being worked on, if not already fixed in Swift 5.9). And if you aren't using Docker then you have to install the Swift runtime on your host, which can be annoying.
@@dejangegic it's actually a pretty amazing language
Bro called fortran slow xD
and called lua fast
@@Om4r37 Lua is fast though compared to Fortran or JS or the hell that is Python
@@Om4r37 no shit
@@Om4r37 Lua is one of the fastest interpreted languages
@@Om4r37 JavaScript is slow, but it's fast when you run it on the BunJS env
Fun fact, PHP had a good meaning for it's name... but then they made it:
PHP: Hypertext Preprocessor
IT'S RECURSIVE
Originally it meant personal home page
@@mollthecoder When Rasmus Lerdorf used it for his personal home page!
When he gave up the language to the community that had been built around it, they decided to rename it to PHP: Hypertext Preprocessor, as it was no longer the language for Lerdorf's personal home page but still a hypertext preprocessor, built for applications on HTTP!
Now it means PHP Hates Programmers
Calling it "Every Programming Language Ever" is a bit of a stretch. Others I have worked with include Mercury Autocode, CLIST, ISPF, Paradigm Plus, PL/I, REXX, and GML.
Android Studio may be the most used IDE for Android development, but IntelliJ IDEA is the world's most used IDE for Kotlin.
High-level languages are not necessarily slower than low-level ones. Some compiled languages, C for example, have no runtime and are as fast as assembly (because the compiler literally compiles to assembly; see the sketch after this thread).
Well assembly will always be faster, but the real question is can we code perfect assembly 😉
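To see "the compiler literally compiles to assembly" for yourself, a tiny sketch (the output shown is typical of x86-64 gcc -O2; exact instructions vary by compiler and target, so treat them as an assumption):

/* add.c -- run: gcc -O2 -S add.c and read the generated add.s */
int add(int a, int b) {
    return a + b; /* typically becomes: leal (%rdi,%rsi), %eax followed by ret */
}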
5:56 - there is no C++ in Linux. I needed to point it out, since "no C++ in the Linux kernel" is almost a religious thing for us Linux users at this point.
god now I need to make a version of the Fuse T-posing over the crying hostage meme with Rust and C++
There is some C++ in the source tree, but only C++98 or older is allowed currently.
@@BlessedDog And yet, they're starting to use Rust :/
@@gamerk316 The Rust cult is strong lol. Linus said something about needing to attract new maintainers for the Linux kernel, as a lot of the maintainers are getting older.
Adding Rust is imo a way to "advertise" the Linux kernel. Also, Rust doesn't have most of the things Linus despises about C++ (exceptions, operator overloading, dynamic dispatch, etc...).
@@briannormant3622 Otherwise known as "Linus hates things that make developers' lives easier because of his ego".
Linus (and the GPL) are the main reasons why I don't develop for Linux.
Nice thing about Scratch is that every variable is every type of variable simultaneously. You can set a variable to 5, divide it by 2, and then set it to "hello", all in the same script, and all without needing annoying conversion functions.
13:53 Swift is open source, so you can also use it on Linux and windows (I’ve tried it), you just won’t have access to apple specific things like SwiftUI.
try doing ur own research instead of asking chatgpt to write the script and then using an ai model to read it out; this is so wrong on so many things that it doesn't count as educational
Interesting and informative video.
A few minor things.
You missed FBD, STL, and LAD - all three are very common if you do industrial automation (but virtually unheard of outside it). Usually you end up doing all three from the same tools and learning them at the same time - they complement each other but conform differently to different ways of approaching a program.
You missed SPARK, the high-reliability version of Ada (given just how insanely safe Ada is to begin with this is kinda mindboggling).
Largest user of Ada - it depends on how you view it. If you go by organisation size then yes, it is the DoD, but across industries it is used fairly heavily in banking, aviation, railways, and medical (basically in situations where failure is not an option). Nvidia also uses Ada a fair bit.
The different Ada versions are notably different (for instance the jump between Ada83 and Ada95 is basically the Ada-counterpart to the jump between C and C++)
Portability of Java is due more to practices than technical reasons: you can compile Java code to platform-specific binaries, and you can compile most (compiled) languages to Java bytecode (the compiled Java that runs in a JVM). (For that matter, virtual machines specific to a particular bytecode aren't exactly rare.)
ATLAS was a programming language used by the DoD and airline industry for diagnostic equipment used for testing and repair of radar and computer components in aircraft. It could also be used for CNC and other machine operations.
Lots of small wrong/misleading info but i learned a lot about the languages
FORTRAN was the first high level language to be commercialized, not the first high level language outright.
7:55 functional programming is good not just for math but also for business applications. Check out the talks "Domain Modeling Made Functional" and "Making Impossible States Impossible"
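The "impossible states" idea carries over even outside functional languages. A minimal sketch of it in C++ with std::variant (the types here are made up for illustration, not taken from the talks):

    #include <string>
    #include <variant>

    // A contact must have an email, a postal address, or both -
    // "no contact info at all" is simply not representable.
    struct EmailOnly { std::string email; };
    struct PostOnly  { std::string address; };
    struct Both      { std::string email; std::string address; };

    using ContactInfo = std::variant<EmailOnly, PostOnly, Both>;

The compiler now rejects the impossible state outright instead of leaving it to a runtime validation check.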
Matlab is "MATrix LABoratory", not "Matrix Library".
There was actually a “turtle”. It was a simple turtle-like robot that could hold a pen and draw the results of the Logo program.
The title is nonsense. There are thousands of programming languages around...
well you certainly lived up to your name
For example brainf%%k
Two things that I noticed:
1. You can code in Swift on Windows, although the setup is slightly complicated.
2. Rust is not easy - easier than C and C++? Maybe (I haven't done enough Rust to say that for sure). Rust is by the way NOT used for applications where performance is EXTREMELY important - games and OSs do not fall in this category; performance there is important, but not extremely so - because the safety features that it provides slow it down. Programs where performance is key are found in trading and written in C or C++.
Rust has in many cases been shown to outperform C and C++. Rust is extremely good when performance and safety are needed.
@@dtimer319 correct, rust is extremely good when it comes to speed and safety, but as far as I know, rust is slower than c++ - only marginally so, but sometimes this is relevant. I would like to see the cases where it is faster, since I would personally question the speed of the code itself rather than what it has been compiled to.
@@dtimer319
It's a lie.
For the exact same program, C++, C and Rust will compile to the same machine code. Rust and C++ can be faster than C in some cases, for example std::sort vs qsort, where the compiler can directly inline the sorting comparison (see the sketch below). But a C programmer optimizing the hell out of their program can use macros and/or simply rewrite qsort for their specific type to get to the same speed as C++/Rust.
If you compare the assembly code emitted by C, C++ and Rust, it's for all intents and purposes the exact same. If you see a test showing a perf comparison between those 3, check the code used - it won't always do the exact same thing. It's easy to understand why: Rust uses the LLVM framework, the same one Clang uses to compile C and C++ code, so the same optimizations can be performed for all 3 programs.
With modern processors, when it comes to performance, the best way to optimize is to keep cache locality and branchless programming in mind.
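For the curious, here is roughly what that std::sort vs qsort difference looks like (a minimal illustrative sketch, not a benchmark):

    #include <algorithm>
    #include <cstddef>
    #include <cstdlib>

    // C style: qsort calls the comparator through a function pointer,
    // which the compiler usually cannot inline.
    int cmp_int(const void* a, const void* b) {
        int x = *static_cast<const int*>(a);
        int y = *static_cast<const int*>(b);
        return (x > y) - (x < y);   // avoids the overflow of plain x - y
    }

    void sort_c(int* xs, std::size_t n) {
        std::qsort(xs, n, sizeof(int), cmp_int);
    }

    // C++ style: the comparator is baked into the template instantiation,
    // so std::sort can inline it completely.
    void sort_cpp(int* xs, std::size_t n) {
        std::sort(xs, xs + n, [](int a, int b) { return a < b; });
    }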
@@entropy4959 there are plenty of benchmarks. At least for math problems, the best Rust solutions are often the fastest or second fastest. However, the little-known D language beats it in terms of speed whenever it isn't left out of those tests - D is the fastest boy in the race, when it does happen to race.
So the website “99 bottles of beer” (a site devoted to having programs to print the song lyrics) has 1500+ languages. Given 1 second per language it would take 25 minutes to say just the names…
Fortran makes use of columns too. On the old Fortran coding pads where you wrote your code, there was a special column for the C character, which marked a comment.
It's been a while but aren't there columns still reserved for labels and to indicate line continuations as well?
@@mikechappell4156 I can't remember. I can only remember the C column. I was a 9-year old child when I last wrote a Fortran program on a pad.
Where is FORTH?
SQL is not a programming language, it is a Structured QUERY Language.
LOGO was the first language I ever learned about. It was in middle school. We had these old 486 computers with early Windows and PrintShop and Logo. I had a lot of fun making different looking designs. In a year or two I would buy a used TI-83 for $20 and teach myself TI-BASIC, which is incredibly easy because it's an already very limited version of BASIC. Those were some fun times!
FORTH? (mentioned before). Bash (Unix/Linux shell). SNOBOL (someone should look at the history of programming languages!). Of course there is INTERCAL (Please!).
dude I have been immersed in tech my whole life, and never have I been told that the first compilers were actual physical machines with moving parts. That's actually kinda wild
It was also a job, like there were people that would compile your code into punchcards
Wild, yes, and also not true.
It was really fantastic to see a mention of MUMPS - it's still the greatest!
Still alive and well and in active use - be that via evolution (M -> Caché -> IRIS, gaining a bunch of functionality such as OO along the way), or the purer variant (GT.M)
Hi @giociampa, I worked for one client just a few years ago and got IRIS shortlisted, but in the end, they went for what "they knew" even though IRIS blew everything out of the water on cost/performance, etc. 🙄
4:17
I'd just like to interject for a moment. What you're referring to as Linux, is in fact, GNU/Linux, or as I've recently taken to calling it, GNU plus Linux. Linux is not an operating system unto itself, but rather another free component of a fully functioning GNU system made useful by the GNU corelibs, shell utilities and vital system components comprising a full OS as defined by POSIX.
Many computer users run a modified version of the GNU system every day, without realizing it. Through a peculiar turn of events, the version of GNU which is widely used today is often called Linux, and many of its users are not aware that it is basically the GNU system, developed by the GNU Project.
There really is a Linux, and these people are using it, but it is just a part of the system they use. Linux is the kernel: the program in the system that allocates the machine's resources to the other programs that you run. The kernel is an essential part of an operating system, but useless by itself; it can only function in the context of a complete operating system. Linux is normally used in combination with the GNU operating system: the whole system is basically GNU with Linux added, or GNU/Linux. All the so-called Linux distributions are really distributions of GNU/Linux!
No, Richard, it's 'Linux', not 'GNU/Linux'. The most important contributions that the FSF made to Linux were the creation of the GPL and the GCC compiler. Those are fine and inspired products. GCC is a monumental achievement and has earned you, RMS, and the Free Software Foundation countless kudos and much appreciation.
Following are some reasons for you to mull over, including some already answered in your FAQ.
One guy, Linus Torvalds, used GCC to make his operating system (yes, Linux is an OS -- more on this later). He named it 'Linux' with a little help from his friends. Why doesn't he call it GNU/Linux? Because he wrote it, with more help from his friends, not you. You named your stuff, I named my stuff -- including the software I wrote using GCC -- and Linus named his stuff. The proper name is Linux because Linus Torvalds says so. Linus has spoken. Accept his authority. To do otherwise is to become a nag. You don't want to be known as a nag, do you?
(An operating system) != (a distribution). Linux is an operating system. By my definition, an operating system is that software which provides and limits access to hardware resources on a computer. That definition applies wherever you see Linux in use. However, Linux is usually distributed with a collection of utilities and applications to make it easily configurable as a desktop system, a server, a development box, or a graphics workstation, or whatever the user needs. In such a configuration, we have a Linux (based) distribution. Therein lies your strongest argument for the unwieldy title 'GNU/Linux' (when said bundled software is largely from the FSF). Go bug the distribution makers on that one. Take your beef to Red Hat, Mandrake, and Slackware. At least there you have an argument. Linux alone is an operating system that can be used in various applications without any GNU software whatsoever. Embedded applications come to mind as an obvious example.
Next, even if we limit the GNU/Linux title to the GNU-based Linux distributions, we run into another obvious problem. XFree86 may well be more important to a particular Linux installation than the sum of all the GNU contributions. More properly, shouldn't the distribution be called XFree86/Linux? Or, at a minimum, XFree86/GNU/Linux? Of course, it would be rather arbitrary to draw the line there when many other fine contributions go unlisted. Yes, I know you've heard this one before. Get used to it. You'll keep hearing it until you can cleanly counter it.
You seem to like the lines-of-code metric. There are many lines of GNU code in a typical Linux distribution. You seem to suggest that (more LOC) == (more important). However, I submit to you that raw LOC numbers do not directly correlate with importance. I would suggest that clock cycles spent on code is a better metric. For example, if my system spends 90% of its time executing XFree86 code, XFree86 is probably the single most important collection of code on my system. Even if I loaded ten times as many lines of useless bloatware on my system and I never executed that bloatware, it certainly isn't more important code than XFree86. Obviously, this metric isn't perfect either, but LOC really, really sucks. Please refrain from using it ever again in supporting any argument.
Last, I'd like to point out that we Linux and GNU users shouldn't be fighting among ourselves over naming other people's software. But what the heck, I'm in a bad mood now. I think I'm feeling sufficiently obnoxious to make the point that GCC is so very famous and, yes, so very useful only because Linux was developed. In a show of proper respect and gratitude, shouldn't you and everyone refer to GCC as 'the Linux compiler'? Or at least, 'Linux GCC'? Seriously, where would your masterpiece be without Linux? Languishing with the HURD?
If there is a moral buried in this rant, maybe it is this:
Be grateful for your abilities and your incredible success and your considerable fame. Continue to use that success and fame for good, not evil. Also, be especially grateful for Linux' huge contribution to that success. You, RMS, the Free Software Foundation, and GNU software have reached their current high profiles largely on the back of Linux. You have changed the world. Now, go forth and don't be a nag.
Thanks for listening.
The question is how in the world the GNU Project arrived at the worst thing to pronounce with the fewest possible letters. Seriously, it's only three letters, and when people see it for the first time, there are about three possibilities that come to mind for how to say it. It's simply a case of stereotypical computer geeks doing something brilliant and not knowing the first thing about marketing, and now we get to hear the oft-repeated lamentations of disrespect when that's not the case at all. Linux is just easy to say and kinda catchy - that's it.
@@jonfeuerborn5859 I hope you're not serious. You're aware that this is a copypasta?
I just wanna say: i use arch btw
Please consider doing a vid on what languages are most popular, aimed at helping a new student decide what to study. Thank You
You totally missed FORTH, developed to control telescopes and the basis of computer-controlled manufacturing
I'm so happy you skipped Groovy. For anyone wondering, Groovy walked so Kotlin could run
what bro?😂
@@K.Parth_Singh What haven't you understood?
Well then, I thank Groovy, but he has stopped walking; I gotta catch Kotlin now if you don't mind.
@@dejangegic groovy walked?
@@K.Parth_Singh It's a figure of speech. Search for "You have to walk before you can run"
3:02 "A = 1. What is A? A is a REAL SCALAR! A = 'now a character array'. What is A? A is a 21 element CHARACTER ARRAY!"
TypeScript, Go, Elixir, Lua, C/Zig:
In my opinion, the stack that covers most of the potential software you would wanna make. Potential things to add/swap: C++/Rust, Python/Julia, Java/Scala. But in general, I think this gets it done for most use cases.
edit:
I didn't include specialization languages like Kotlin, Swift etc. - this is more of a "general take" on programming languages, since I assume people asking which language to learn tend to be more junior and new to the field.
Music: Mario Kart Wii OST - Coconut Mall
R is also required for social sciences like political science.
Forgetting PL/1 is no problem for me, but forgetting Forth? Seriously?
"Every Programming Language Ever…" is quite an exaggeration. For instance, these are missing (in no particular order): Object Pascal (Delphi can be considered a dialect of it); Modula2, Modula3, Oberon; E; AmigaE (not related to E); REXX; Eiffel; J (inspired by APL); Racket, Scheme, Clojure, …; Forth; TCL; Korn shell, Z shell, Bash …; AWK; D; BCPL; groovy; Oz; REBOL (and red); Haxe; occam; … and a lot, lot more.
Moreover, non-Turing-complete languages (such as SQL, HTML, CSS) shouldn't be listed in a "programming languages" list.
As pointed out by others, there are mistakes. I want to focus on a few language-related ones:
§ **Raku** is a different language than Perl (they decided to change the name from Perl6 to Raku to stress the fact that it is a different language);
§ Go is not like Python at all - and Go is a compiled lang, Python is not
§ maybe just ALGOL, not ALGOL 60; otherwise you should list other ALGOLs as well (58, 68, W)
§ C was influenced by Algol 68, and B (derivative of BCPL)
Saying C was influenced by Algol 68 is quite a stretch too. It's a distant ancestor, through CPL, BCPL, B and finally NB.
@@doigt6590 I suspect that the "influenced by Algol ..." label could be given to many languages. Then one can argue that Algol 68 wasn't the first and only to use this or that feature/concept/syntax; then one should know the languages C's creators were exposed to… and so on. Hard to track correctly; anyway, I think it isn't unlikely that it somehow influenced all the languages created in the era when it was, er, influential :)
@@MauroPanigada Instead of suspecting something, you could actually look it up. The authors of C wrote on the history of its making and they are quite clear on the relation between C and Algol.
@@doigt6590 Looked it up on Wikipedia hoping it would have an easy reference for such a history lesson… they have a citation which seems to contradict your "Saying C was influenced by Algol 68 is quite a stretch too."
The quote is this: "The scheme of type composition adopted by C owes considerable debt to Algol 68, although it did not, perhaps, emerge in a form that Algol's adherents would approve of."
@@MauroPanigada Lol, Wikipedia, while I am quoting the language's authors. I think I'll trust the language's authors over what Wikipedia says haha.
you can code in swift on windows and linux, but you can't compile it for mac or iphone.
?? I thought Swift was written to replace Objective C, especially in iPhone
@@edwardfalk9997 it's used most commonly for developing iOS apps, but it's a programming language on its own; it can be used for other things like web services, etc.
This is the case with macOS and iOS in general, it's not even unique for Swift.
This is due to Apple's licensing.
Also, Xcode is not just for Swift - it is much older than Swift, and there are other IDEs for Swift; Swift and Xcode are not tied to each other.
This dude had so many things wrong about these...
@@edwardfalk9997 I think they mean you can't compile to a macOS or iOS executable from Linux or Windows. Swift has run on Linux for about 7 years and Windows for about 3. It was released under the Apache license starting with the first version to support Linux.
@@jaybutler I think they mean you can't BUY a compiler to do it. You can WRITE a compiler for anything to anything in ANY environment which supports STANDARD IN, STANDARD OUT and STANDARD ERROR on any architecture. If it was saleable someone would have done it.
PS I used to compile programs in Algol on IBM 360 series mainframes to execute on DEC PDP-10 machines.
A lot of missing things here. A few examples off the top of my head:
tcl
Modula2
Modula3
BCPL
B
BLISS
FOCAL
FORTH
Scheme
Utopia
Eiffel
bash scripts
csh scripts
D
DIBOL
MDL
My grandpa used to code in RPG in the 60s. Really cool to see a shout out
I, and many others, will code some RPG today.
And what about FOCAL, and the two independently made languages called C--?.. What about Modula-2/3 and Oberon? And most important of all, where's PL/I?! And Forth?!
What About BrainF*ck programming language
this isn't every programming language. There were lots of Russian-based programming languages in the Soviet Union, a French-based programming language in France, some newer programming languages like V, some older ones also missing like S, A+, B, D, E, Erlang, OCaml, and some more modern non-English programming languages like Produire (プロデル, Japanese) and the 1C:Enterprise programming language (Russian). There are also tons of others missing here. It would've helped if the video had some disclaimer about how it was classifying these, because it seems to leave a lot out.
13:46 "replace objective C" sounded like "replay subjective C" lol
Need to add the languages Forth and ARexx to the list. My list of languages over the years: BASIC, Forth, assembly (multiple CPUs - 6502, 680x0, VAX), Pascal, COBOL, C, C++, ARexx, HTML, SQL (I've looked at other computer languages but haven't programmed in them)
You gotta teach me assembly man
Great video!!! However, the background music interferes with understanding the speech clearly.
Holy C is a wild ride of a story haha
Swift is fully open source and capable of being run on many different operating systems today, not just Apple.
All 1500+ languages? I doubt it.
How many total languages exist? Thank you.
No language will ever come close to fully replacing C (at least for now), but Rust has the biggest chance.
This is because of the support for C on essentially everything that runs code, and its established role in tech.
I miss Standard ML and Scheme 🙂. Functional programming is interesting from a mathematical point of view.
Probably best sticking to text-based languages; if you start down the Scratch route you might as well include Unreal Blueprints and Houdini OP graphs etc.
10:50 uhm... <center> is deprecated...
center is the best element in the world everyone should use it it's pure magic and nobody will change my mind just because it's deprecated.
@@doigt6590 div { display: flex; justify-content: center; }
Nice ride! But wot, no Forth? And BCPL? Some influential blasts from the distant past - and my youth! 😄 Also, OCCAM?
LISP : The only computer programming language I've ever been shown that I just cannot get my head around. I have no particular training or practice with Python or Rust, but they're just programming languages. Lisp seems to be something else.
lisp isn't a programming language, lisp is a way to live, for real
@@DeciPaliz~ See? I told you I didn't understand it.
What about dBase/FoxPro/Clipper?
Great video, but I hope you understand that arguments like "low level runs much faster than high level" are debatable, if not outright wrong.
It's true though, especially for interpreted languages, although this guy clearly doesn't know what he is talking about.
Low level literally runs faster than high level though.
@@KanashimiMusic The two are not directly correlated.
Low level gives you more control over memory management, that part was right.
You could say the same about a pair of scissors and a chainsaw. If you have no idea how a chainsaw works, it's useless.
@@NetherFX They are definitely correlated, very strongly I would even say. I don't know what exactly you mean by "not directly", but most traits that are inherent to higher-level languages naturally come with performance impacts.
1. As you yourself said, low level gives you more control over memory management, and if you don't have that control, the language (or rather, the runtime) has to do the work for you, very often through a garbage collector, which has some performance impact. That is unless the compiler manages the memory for you, like in Rust, but calling Rust a "higher-level" language is debatable at best.
2. High-level languages usually include a lot of non-zero-cost abstractions and overhead which can have significant performance impacts.
3. All high-level languages that I can think of are interpreted. Even languages that are usually referred to as "compiled" are really not - they're compiled to bytecode for a virtual machine like the JVM, which then interprets or "just-in-time compiles" that code to actual machine code at runtime. And quite obviously, interpreting also has a significant performance impact. Granted, the JVM JIT specifically is known for being very good at optimizing in real time, but even it can't possibly beat ahead-of-time compiled, ACTUAL machine code.
I'm aware that low-level vs high-level is not binary, rather, it is a spectrum that's not very well-defined or universally agreed upon. But it's impossible to deny that many of the traits that make a language "high-level" inherently lead to programs written in that language being slower.
1:38 Picture of Colossus from the 1940s has nothing to do with COBOL, invented in 1959.
Swift is actually open-source and can be used pretty much anywhere.
ok so:
assembly is not a language; it's more like an umbrella term for a class of languages that closely mimic machine code. there are assemblers like gnu as or llvm-as that support multiple assembly languages for different architectures - x86, arm, wasm etc. there's also fasm - an x86_64 assembler that does a lot more than just spitting out machine code, and it can be metaprogrammed in its own flavor of intel-syntax assembly.
high level vs low level doesn't bear much meaning anymore. in fact, languages like c are considered low level by today's standards even though they were conceived as high level.
object orientation is much more about inheritance and encapsulation of state than about having methods on structs of data. take go - it has all the features you described, yet it is very much a procedural language. you could take the pokemon example and implement it in C. what OOP would allow you to do is make a class for all pokemon, a subclass for fire pokemon, and a sub-subclass for charizard, and have parent classes implement the greatest common denominators for their respective kinds (sketched below): pokemon have hp and can perform tackle, fire pokemon are weak to water pokemon and can perform ember, and charizard is a concrete fire pokemon with its name and set of sprites.
tbc
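A minimal C++ sketch of the hierarchy described above (names and numbers are purely illustrative):

    #include <iostream>
    #include <string>
    #include <utility>

    class Pokemon {
    public:
        Pokemon(std::string name, int hp) : name_(std::move(name)), hp_(hp) {}
        virtual ~Pokemon() = default;
        void tackle() { std::cout << name_ << " uses Tackle!\n"; }  // common to all pokemon
    protected:
        std::string name_;
        int hp_;
    };

    class FirePokemon : public Pokemon {
    public:
        using Pokemon::Pokemon;
        void ember() { std::cout << name_ << " uses Ember!\n"; }    // common to all fire types
        // the weakness to water types would live at this level too
    };

    class Charizard : public FirePokemon {
    public:
        Charizard() : FirePokemon("Charizard", 78) {}               // one concrete fire pokemon
    };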
Pointers aren't code, they're memory locations represented by numbers. C wasn't even the first to introduce them; they're just a very fundamental piece of system-level programming. What C did right was make using them easier, and probably the most important achievement of C is a very uniform and easy-to-reason-about type system. E.g. you might need a type that can hold 32-bit signed integers - go for "int"; want it unsigned? "unsigned int". There's an unsigned int somewhere in memory and you want a handle (a pointer if you will)? "unsigned int*".
Pointers won't save you from memory leaks though, quite the opposite. That's part of the reason why modern high level languages manage the memory for you (roughly the difference sketched below).
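In C++ terms, the difference looks roughly like this (a tiny sketch, values arbitrary):

    #include <memory>

    void leaky() {
        int* p = new int(42);                // heap allocation through a raw pointer
        // returning without `delete p` leaks the allocation
    }

    void managed() {
        auto p = std::make_unique<int>(42);  // ownership is explicit
        // freed automatically when p goes out of scope -
        // the "managed for you" behaviour higher-level languages give by default
    }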
I feel like you overstated the contribution of Smalltalk. Besides Objective-C and a bunch of esoteric hobby langs, modern-day OOP mainly stems from C++, which got its model from Simula.
Statically typed functional programming's main selling point isn't the static nature of its types. C is also statically typed; so are ALGOL and Fortran. What's cool about statically typed FP is mainly the FP part combined with very expressive type systems. Look up Rust enums for more information.
Ok, this is very much a nitpick, but: C++'s main thing is probably templates, not classes. Like generics in other languages, they can do the things you'd expect - map and so on. The thing is, C++ compilers are insanely good at optimizing templated code, and templates can also be used to evaluate things at compile time. They come in handy when you want a simple-to-use interface that compiles to the fastest possible machine code.
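A minimal sketch of the compile-time side of that (the classic factorial demo, nothing video-specific):

    // Classic template metaprogramming: the compiler computes the value
    // while instantiating the templates.
    template <unsigned N>
    struct Factorial { static constexpr unsigned value = N * Factorial<N - 1>::value; };

    template <>
    struct Factorial<0> { static constexpr unsigned value = 1; };

    static_assert(Factorial<10>::value == 3628800,
                  "evaluated entirely at compile time");

    // Since C++11, a constexpr function does the same job more readably:
    constexpr unsigned factorial(unsigned n) { return n <= 1 ? 1 : n * factorial(n - 1); }
    static_assert(factorial(10) == 3628800, "also compile time");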
Ok, Haskell wasn't the first one to the immutability thing; also, its main feature is probably a category-theory-compliant standard library - monads and such. About it being based on lambda calculus - yes, so is ML and most pure functional programming languages. The cool thing about Haskell is that it is in fact based on typed lambda calculus, allowing it to e.g. do insane type inference. Haskell can literally infer a whole implementation of an interface for you.
JavaScript is ECMAScript. JavaScript is the marketing/informal name while ECMAScript is the actual name of the language. Also, it runs outside of browsers - look up Node.js. There are even web servers that run on JavaScript.
even then, inheritance is also very common in metaprogramming and in child paradigms of the greater modular programming paradigm.
Nice list! Thanks for the video. On what criteria were these languages chosen? For example, why were V and Crystal not included? They're my recent favorites :) both really deserve to be mentioned
I think it's a top 60 or something like that - just the video title is misleading. By the way, Crystal is great! As an ex-Ruby dev I really like Crystal
Knowing our fortune, if Crystal was mentioned, it'd be a two-second-long section saying it's "Ruby, but with better speed" :<
How he massacred my boy Nim... I know it's a niche language and it can be summarized as "Python, but C", but there's so much you could include to expand it, even with just a few seconds more... like the memory management system you can choose, or the macros that let you basically rewrite the AST to your needs!
I wish i could program.
I have some amazing ideas that the world will never see.
Did you mention FORTH ?
6:00
Matlab stands for Matrix Laboratory, thank you very much.
Lots of errors, however, I'm interested in the omissions. While there are many, some major ones are: Command/Cmd/4dos/Tcc, Rexx, Bourne/Korn/Bash, Csh/Tcsh, Forth, Awk, Sed, Groovy, Tcl, dBase/Clipper, Yacc+Lex, Make, m4, Postscript and many others!
Thanks for the video. I don't know anything about programming. This clarified a lot.
Excellent. And BASIC was based on Algol.
Simula was a module-oriented language, Smalltalk was the first object-oriented language, and OOP was invented by Alan Kay.
Fortran is one of the lowest-level languages we have today. It's used where you need the highest performance, along with C.
For context, it's generally accepted that you can't really beat C/C++ on raw speed, but Fortran has managed to at times.
Forgot to mention that JavaScript can be used on the backend using NodeJS, Deno, or Bun.
It's weird hearing "Fortran" and "slower" together. Fortran compilers have been around so long and are so refined that they generate assembly so optimized it's considered one of the only languages that runs faster than C. It's a *rocket ship* of a language (and yes, it'll do your GPU stuff via its vector extensions and run it crazy fast). It's also so old it's possibly your actual grandfather.
Many of those were half-assed, either failing to explain anything about them or being factually misinformed.
If you want to explain "Every Programming Language Ever" in 15 minutes, you can trim the shilling of Python and be more informative about all the others, while grouping them based on the paradigms they primarily focus on
(To be fair, it's a 30 second "shilling")
You missed the D language; it's not very popular, but there are companies that use it
You left out TCL with TK that I use every day. The guy I know that knows APL drives a bus.