My personal problem with untyped languages is the following: If you get undocumented code (or the documentation is not up to date), you have no clue what the hell the code does. If functions with 3 parameters get called, and every parameter is some sort of list containing lists of lists of random stuff (looking at you, JS...), etc, you need hours and hours just to figure out what the parameters should look like.
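A minimal sketch of that point in Java (the record and method names here are hypothetical, just to show how a typed signature documents its own parameters):

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical types: the "list containing lists of random stuff" becomes an
// explicit shape that both the reader and the compiler can see.
record OrderLine(String sku, int quantity) {}
record Order(String customerId, List<OrderLine> lines) {}

class Orders {
    // The signature alone says what goes in and what comes out; no need to
    // spelunk through call sites to reconstruct the parameter shapes.
    static Map<String, List<Order>> groupByCustomer(List<Order> orders) {
        return orders.stream().collect(Collectors.groupingBy(Order::customerId));
    }
}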
It's no surprise that untyped languages are often called "scripting languages". They were meant for programs at most a screen in size, like bash scripts or test scripts in Groovy etc, then some bored developer thought they could be used for large-scale projects 😀, listing all the advantages but never the disadvantages.
There's also the fact that static typing between application boundaries gives you statically-enforced contracts. As teams get bigger, this kind of contract enforcement becomes more valuable.
@@meostyles and it will have to be serialized to a type-less byte format for transmission & storage as well (plus handling stuff like endianness, encoding & locales).
@@switchblade6226 and all of those are already solved problems, thank you for polluting the comment section with a meaningless comment. Did you learn those words right before you made the comment?
To be fair though, some of the problems with Java were actually dogmatic programming conventions rather than a requirement of the language or static typing.
I never understood this tradition of creating getters and setters for EVERY class member EVERY TIME. Sometimes it doesn't make sense to create a getter or setter for some members, especially setters (it defeats the point of encapsulation).
@@shimadabr if you think of it in terms of OOP, where messages are passed around, and objects can’t do anything but mutate their own state, send messages and receive messages, getters and setters are a must. Because they give the illusion that you’re doing proper OOP!
@@Fs3i I don't think he meant that getters/setters are always unnecessary; he's talking about code where the developer mindlessly creates a setter for a field where it makes no sense to set that field from an external context.
Static typing is a subcategory of static analysis. Being able to reason about your program without running it is priceless. You only need to write tests for things that you can't reason about statically; you only have bugs in stuff you can't check statically. When you have a powerful type system, many problems can be reduced to a type check. This is just the beginning: you have linters (that you can even write yourself) and DSLs, where you create mini languages that have the properties you are looking for. The type system in C++ (and to some extent Java) was only there to help the compiler produce more efficient binaries; it didn't care about correctness. It was functional languages like Haskell that pioneered using types for correctness.
@@jhoughjr1 I respectfully suggest that's because your reasoning process is not as rigorous as it could be and is not suited to reasoning about very large and complex systems. For any large system with complex entity types in the problem domain, correctly determining those types and their relationships is a large part of finding the best solution for not just functionality but extensibility, maintainability, security _and_ stability.

Something I've heard multiple devs say about Typescript, for example, is that once you fully understand its type system, you find yourself actually putting most of the business logic into type definitions, rather than in the algorithms which use variables of those types. From watching videos about Haskell, which I haven't tried yet, I get the impression that's the case with Haskell too, because of the way its types are defined (which I gather is based on all the things the type _does_). Type systems like this allow you to put system-wide constraints in one place in the code (the type definition), rather than having them redundantly replicated all over the code, which tends to be the case when dynamically typed languages are used.

I've worked with both statically and dynamically typed languages for 35 years (roughly 16 languages, I'd estimate), including procedural, functional, object-oriented and even stack languages, and like most people who've fully grokked the benefits of static typing, it's obvious to me there is no contest in terms of which is better. The _only_ benefits of dynamic typing are (1) being easier to learn if you're new to programming and (2) producing less verbose code (for small problems). But they encourage far less rigorous ways of analysing problems and coming up with solutions that serve multiple desirable goals, rather than just the functionality that's immediately required, especially in products under continuous, incremental development. Statically typed languages are generally vastly better in every other respect (extensibility, scalability, maintainability, security, stability of code, et al) for any large complex system.

The use of dynamically typed languages can only really be justified when (1) the nature of the problem means the code will always be in small, self-contained units rather than integrated systems of any scale, e.g. code that wires up controls on a web form, plugins for 2d and 3d art apps and so on, and (2) for _prototyping_ systems rapidly, rather than for producing the final system the prototype is a proof-of-concept for.

Make no mistake about it, dynamically-typed and idiosyncratic languages like Javascript and Python are not popular because they're good programming languages, but because they're easier to learn, and in both cases there are other inducements to use them that have nothing to do with their virtues as actual programming languages. The "easier to learn" part comes with a catch: because they demand less rigor of new developers and allow developers to produce useful results quickly with less knowledge and less careful thought, they also often result in sloppy solutions with long-term technical debt, like making it a lot harder to extend the system later, or producing a massive ball of spaghetti as functionality is added over time without any rigorous thought, such that finding a memory leak somewhere deep in the spaghetti 5 years later is a nightmare.
Where "other inducements not related to the virtues of the language itself" are concerned: In the case of JS its because it was the only language that could run in every browser (something webassembly is slowly changing) so you had to use it for client side web code and when Node brought JS to server code, it allowed web devs who only knew JS to write server code without learning another language. In the case of Python, which is a terrible language, its pretty much entirely because of Google being the driving force behind a lot of AI and data science in the 2000s, and adopting Python as the language for those efforts. And from what I can gather that decision seems to have been adopted because a lot of people working in that area were data scientists first and reluctant programmers second, so making it as easy as possible for them to convert their mathematical expertise into code, rather than making them good programmers, was the priority. As a result the language _ecosystem_, rather than the language itself, has a ton of stuff that is useful for data science and machine learning, like numpy. Equally useful libraries that do the same things can and have been developed for other languages, but the Python ecosystem has the most mature versions that the most programmers are already familiar with. It's a great ecosystem for machine learning and data science, built around a perfectly shit programming language. From a design point of view, since you're talking about reasoning about code, I worked on a lot of waterfall projects early in my career, where you spend 7 months just workshopping and designing a large system then only 2-3 months actually writing the code. And by the time you get to developing it, you're working from a rigorous technical specification which tells you exactly want you're meant to produce down to every last database table, class/module definition and all of their fields and methods, as well as a detailed description of every process flow and how all those structures interact at every step, before you even begin actual coding. There are no outstanding design decisions when you begin coding, you're just implementing the technical spec, which is often hundreds of pages long, with diagrams. There are no edge cases that take you by surprise after 3 months of going live because that's all been thought through and mitigated in the design process. If you designed properly and implemented the spec properly it just works. And its intimately documented if someone wants to add a feature later and wants to know exactly what they must change. Assuming some kind of impact analysis / change specification is done for those changes, the documentation is incrementally updated. too. And my personal experience is that determining the correct data structures/types (in both persistent data stores and runtime program state) for a problem of any size and complexity (like say developing an inventory management system for a company with many warehouses) is usually a huge part of that rigorous design process. A huge part of the technical debt I've had to deal with when working on legacy systems is due to poorly designed data structures that were dreamed up on the fly by some dev at 3am who just wants a solution to their present problem, not the correct solution for flexibility and long term maintainability, most often written in a dynamically typed language with lots of missing schema checks making it very easy to break things. 
Static typing at least compels the programmer to think about data structures/types more carefully when they're making those 3am decisions. I know test-driven development is very popular among dynamic-typing programmers, and many seem to think it is sufficient to catch problems like missing schema checks that will cause nightmares later. But TDD always assumes that the tests cover all cases, which is hard to achieve in reality. And unit tests can't cover things that require integration testing, and replicating test environments consisting of dozens of interacting systems for integration testing is often a nightmare all on its own. Static typing actually removes the need for the majority of the tests that TDD requires in a dynamically typed language.
@@farrenh Can't we agree that we should use the right tools for the right problem, and that sometimes dynamic typing is the more suitable tool? I would never go as far as claiming dynamic typing is always better (if I interact with other hardware, a database, network etc, then not only the datatypes, but endianness and exact lengths in bits as well as alignment etc are super important). On the other hand I'd argue that stating that "...languages like Javascript and Python are not popular because they're good programming languages, but because they're easier to learn and in both cases there are other inducements to use them that have nothing to do with their virtues as actual programming languages..." misses the point of these languages (in my opinion). In some cases it is extremely obvious what type/kinds of types/characteristics an argument to a function will have (good names, comments etc are good in all languages), and specifying the types explicitly adds nothing of value. Sometimes (again, I'm not trying to argue that this is true in general!) shorter is better, if it is allowed to mix objects freely in a list (without adding an interface/new type, inheriting it from multiple classes etc) then a simple runtime type-check to see if "is this a valid thing to exist in this list?" before processing it (plus a unit test or two) can have the same safety effect as hundreds of lines of static type word salad. If time/money was an infinite resource, all programs had to be at least a million lines long and lives depended on every single one of them, I might have agreed with you. But metaphorically speaking, if time/money had been an infinite resource, all my clothes might have been tailor made out of hand woven cashmere mixed with fine platinum threads and have built in air conditioning.
Now with Java record types, and people dropping the stupid getter and setter nonsense, the Answer class implemented today is one line, which includes equals and hashCode.
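For example, a sketch of what that one-liner might look like (assuming the same Answer shape from the talk's example):

// A record declaration generates the constructor, accessors, equals, hashCode and toString.
record Answer(boolean wasEmpty, String displayName) {}

So new Answer(false, "Ada (Admin)") compares equal to another instance built from the same values, with no hand-written boilerplate.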
Here's something else that changed... Web pages in the 90s were pretty simple by comparison to modern pages. And dynamically typed languages start to run into problems. Imagine your site is 1 million lines of code. It was written over a 10-year period by 20 developers. 15 of those people are long gone. And now you're looking at code you either haven't seen in eight years or may have never seen. And static typing helps. A LOT. Function foo takes two arguments, and they are clearly defined. You don't have to guess. You know what it returns. You just KNOW. Furthermore, your compiler and your IDE know, too. You're not going to make stupid mistakes like use snake case instead of camel case. You're not going to use size when the function expects (or returns) count. You don't have to be an expert in function foo to at least use the right field names. When your site is only 500 lines, that's just not a big deal, and that's what 1998 web sites were like. But that's no longer true. The dynamically typed languages threw out the baby with the bath water. They did so because they were solving a different problem. Statically typed forever, as far as I'm concerned, but I'm an old fart.
Note that IDEs for dynamic languages have had autocompletion, spellchecking, and type inference since around 2005. Yes, they didn't get it right some of the time, but they were right 95% of the time. I think documentation is important though, especially doctests.
Even as a solo dev working on small projects, static types stop me confusing myself. Instead of having to remember "this function returns an anonymous obj in this specific format", I can read the function definition and go "ooooh, it returns one of those" and then go read the type definition it returns.
I saw a 180 MB first page with just a few things in Angular -> TypeScript -> JS. If I downloaded it as HTML, it was 2 MB. If one wrote that in plain JS it would be small and fast. TypeScript just helps bigger teams work together. Writing good dynamic code is wisdom. Google home page? 6 MB! Respect. But it's not for everyone.
This hits a bunch of nails on the head that I had trouble identifying exactly. Really appreciate it. I have really enjoyed working with the newer statically typed languages, while the old ones (Java, C++) had many drawbacks, most of which you mentioned. The industry just went in a bad early direction. We can have a developer experience that is as good as Ruby with some statically typed languages now.
All of the directions that Java and C++ took were bad af. Tell me a single feature of those languages you can think of that we haven't discarded or minimized because it was harmful? Every single feature those languages introduced has been discarded or replaced by superior features.
@rtfeldman Have you seen Julia? It is a dynamically typed language (although with a design that leans on type inference) that doesn't _require_ runtime overhead. Julia _can_ do runtime type checking if necessary, but it is not required; in fact, most of the time we want to avoid that at all costs and only use it when needed.
This is a phenomenal breakdown and historical analysis - you helped me put together some puzzle pieces that I've been trying to make sense of for years. Thank you for all of the work that you put into this talk.
I'm glad you ended up talking about it at around 33:24 because Delphi was indeed very fast across the board: the IDE was responsive even back then on the era's machines, compile time was very quick and compiled programs ran fast.
Yes! I learned programming in school and university with Turbo Pascal and Delphi, and the first time I had to work with languages like Java or C/C++ I thought, both about the IDEs/tooling and about the speed: how is anyone supposed to work with that? It's like the Middle Ages.
@@mudi2000a I preferred the syntax of "curly braces" languages like Java/C/C++ over Pascal/Object Pascal, but back then Delphi's IDE and components were indeed ahead of their time.
Yes, it was never the case that you really need such fast CPUs to get a "properly fast IDE"... I came up with an IDE idea that gives you this even on an 8502 if you really want, because code would not be stored as text to parse; you would edit the already-parsed syntax tree, for example. There really are ways to do it cheaply, there just was no time (or interest) to invent them. Also, pointing to Delphi (or let's say Kylix, which I used more in high school): these kinds of things already worked fast enough on the machines of the day, which again shows this is not really an issue of computer speed but of implementation. Also, if you look at completion in a good editor like vim and compare it to VSCode, the latter still feels sluggish to me despite good computers (and despite people not feeling it); there is still a lot of waste. Btw, the only reason I can imagine the "pendulum swinging back a bit" is that nowadays, even with dynamic or gradual typing, people introduce build steps as slow as C++'s. I literally have a customer where a full build takes MINUTES. It is like "what the hell? You still have type errors because it's gradual typing, but now you also have a build step, which seems like the worst of both worlds"... But that will only last as long as it takes people to realize you don't actually need a slow build step for static types either, haha. So even if the pendulum ever swings back, it must be temporary.
@@u9vata I think that on moderately recent hardware VS / VS Code are decent enough but indeed, responsiveness isn't perfect. For sure, they have more features than Delphi/Kylix (didn't use Kylix much, mostly only used it to get a Delphi console application running under Linux before switching to Free Pascal), but I can't help thinking that there is not the same level of craftsmanship, or at the very least the same focus on optimization, as in the old Borland days. That said, not all products from that time were as robust and well engineered as Delphi. I had barely tried it, but even C++ Builder from the same company felt subpar (I vaguely remember much longer compile times and weird, hard-to-understand compile and runtime errors - probably because of the constraints of the C++ language).
I prototyped a vertical slice of a system in Python, and after much thought, made the move to Nim. Static type checking has ensured consistency and caught many an error in a system that is expanding in scope and complexity, but I've gotten this with almost NO ceremony. Refactors have become downright easy. I have some concise type definitions, and some helper functions which reach inside a variant type to grab dataframes and header types consistent across all variants; but the overwhelming majority of my code is business logic. It feels like Python, but with a compile step that catches issues. Nim also hates nulls and prefers stack allocation (going so far as to make dynamic types like objects/structs and sequences/hashmaps just "unique" pointers in the stack behind the scenes). Bonus points for static compilation with musl, which has made deployment trivially easy on just about any variant of Linux server.
I've been coding since 1975. I've known for about a half-century that static typing reduces debugging time because errors become apparent earlier in the debugging cycle. It's why static typing was invented. Almost every time I write something in Python, JS, Ruby, or PHP the excitement we felt that day rushes over me once again. Evidently the papyrus scrolls have been lost. I'm trying to remember who took the meeting notes but it was so long ago ...
They were solving different problems. I've been programming about that long, too, and I have a clear preference for static typing. When your code base is 500 lines to assist your HTML, it's one thing. When your code base is 50,000 or 1 million lines, it's an entirely different thing. The scripting speed of dynamic languages pales in comparison to the time you spend figuring out "what fields does this method return?"
I was first introduced to programming with BASIC in 1973, typed into a teletype connected to a mainframe far away that we never saw. Why was that? Because people at the time thought assembler or ALGOL was probably a bit over-complex to use for an introduction to programming for kids. Within a few months we did move on to assembler and other things. The way I see it, when personal computers and then the web burst onto the scene, all of a sudden there were millions of people out there who could be introduced to programming for the first time. Naturally, following in the footsteps of BASIC, languages that were simple for beginners to get into and get the job done were developed and became popular. Something went wrong though: those millions of new programmers never did get pushed to "grow up" programming-wise. They went on to build companies and mega corporations etc. on the same beginner languages they were weaned on. Really, they had little choice. Finally, the programmers of the world have grown up.
The reverse is totally true, I worked in that space for the better part of a decade. You can do static analysis in a dynamically typed language and use that information to optimise, eliminate runtime type checking, enumerate error cases, and provide autocompletion. Types that get checked make this easier, in particular they can give you tighter bounds on analysis time and on the remaining uncertainty.
I remember around 2009 there was a talk by Steve Yegge about dynamic languages striking back: type inference on dynamic languages could give you type warnings instead of type errors and let you use autocomplete with 97% accuracy, gradual typing could give you some of the benefits of dynamic languages, and the new JITs (Psyco, V8, LuaJIT) were improving the performance of dynamic languages. Haskell actually soured my liking of static typing, since I didn't know about -fdefer-type-errors and I kept getting difficult-to-understand type errors because I was relying too much on type inference, and I would often have to change types all through the call stack to lift things into IO, State, Either, Maybe (look up "Clojure Maybe Not"). Easier-to-learn statically typed languages like Kotlin & Swift & Elm are, I think, what swung the pendulum.
@@aoeu256 That's interesting to hear! Haskell can definitely be a struggle to fight with sometimes, but for me, it actually increased my appreciation/reliance on static typing by a lot, to the point that Haskell is basically my go-to language for a lot of small ideas I have on the side and that I'm even doing some stuff in Idris, a dependently typed language.
I really like "data oriented" typing, or the sort of gradual typing languages like Clojure/Script can take advantage of. I define a spec and from that I get automatic client side validation, server side validation, data transformation, and database schema generation.
I also like those, but I agree that as of now statically typed languages like those are superior to gradual typing; we have seen so many projects that were bootstrapped in Clojure/Script that have switched to something else, like Storm or asciinema, because the performance of Clojure wasn't there. Now, his point of "I don't see a pendulum switch" doesn't make sense to me, because I do see a switch in the future; it just requires work from someone who knows PLT & type theory & appreciates the dynamic development workflow that only Lisp or Smalltalk can give. Things like dependent types, code synthesis, theorem proving, and having a dynamic runtime for development plus a fully compiled deployable binary with zero overhead could be the future. But this is not something we are going to get from the Clojure/Script community; maybe from the Racket community. I left Clojure/Script because I found the community to be very stubborn and irrational once Haskell & co had already proven with code that static typing is productive and very useful.
With gradual typing, wouldn't you have to remember what every function does, and check every time you use a function, to make sure whatever you pass in doesn't cause compile-time errors?
@@laughingvampire7555 what are you using now? I want to be able to use Clojure more, but typing aside, it feels like it has a pretty small reach, not having a clear pure clojure path to mobile development or embedded systems or web assembly like other languages do so I'm a bit torn.
Seems like it was more the heavy OO you had problems with rather than the static typing. Don't get me wrong, I've never been a fan of heavy OO, and I use static/dynamic languages depending on the task at hand, but most of the issues you bring up here seem to lie more with the obnoxious object model than with the typing model.
Thanks, this is by far the best talk about dynamic and static typing! Looking at the history and raison d'être of dynamic and static typing is a very good approach.
23:49 it looks like the Roc `decode` function type annotation is incorrect; the function seems to return a `Result` type (`Ok`/`Err`) but is not annotated as such. Richard also skips past the ML language, which had been doing static typing with nearly zero ceremony, using the same techniques Roc does now, since the early 1970s 🙂 Re: considering build speed in language design - that was pioneered by Wirth in Pascal with its single-pass compiler. Turbo Pascal was famous for its build speed. Delphi is a Pascal variant.
Static typing provides implicit documentation for skilled software engineers. If a team needs to build something complex, there’s just no substitute for static typing. The benefits go on forever, and about the only down side is that the languages tend to be a little more rigid. But that rigidity is exactly what you want when a system gets large and complex. For a system that can be built in less than a month, then dynamic typing is fine.
One thing missing from the talk is any discussion about program size. For small programs, where a single programmer can understand the whole program, the extra error checking given by static typing isn't particularly helpful. If, however, you are working on a system consisting of millions of lines of code, developed by hundreds or thousands of programmers (e.g. a web browser), then you want as much error checking as possible as early as possible. Another recent interesting change has been the move away from object orientation. I remember when, in the eighties, object orientation was the great new thing, yet now many new languages seem to have found good ways of providing most of the advantages of object orientation without the overheads.
Note that originally what OOP meant in Smalltalk wasn't the design-pattern class fest of C++/Java. Smalltalk had closures (blocks), methods were objects, even loops were just methods taking blocks, and there were a lot of dynamic features like method missing/metaclasses to make code much shorter, plus message-oriented programming; whereas coding in the normal Java way actually made your application longer...
There are actually two meanings of "type", which historically coincided and are in desperate need of emancipation from each other half a century later. One is a predicate bounding a parameter domain. The other is a concrete representation of abstract values.
I can't for the life of me see the difference between these two meanings. To my mind an abstract value is a definition of a value, be it a definition of what an int or bool is, or some aggregate like an array, structure, or class. Actual, concrete values are instances that match the abstract description. But at the same time that abstract definition also implies the bounds of the type, the range of ints, the size of arrays, etc. What is the difference I am missing?
@@Heater-v1.0.0 I think by "predicate bounding a parameter domain" he means things like int and by "concrete representation of abstract values" he means things like structs.
Gradual/optionally typed languages currently in use: Ravi (Lua superset), Luau ("Roblox Lua"). In development: Mojo (Python superset) - this one has potential to be mainstream before or around 2030.
Also, pytorch. The fact that I can write code and it will navigate complex runtimes to execute on heterogeneous compute environments is practically magic.
Great talk, and I agree with basically all of it. I almost thought gradual typing would be the future, but you convinced me otherwise with fantastic logic.
I think mutability is the pitfall and potential death of all dynamically typed languages. It is so easy to write spaghetti Python code - just have a ton of functions returning nothing, all transmuting a single list variable, not even guaranteed to be a list, and it quickly becomes almost impossible to figure out what is actually happening without running the code and putting in some logs (tools like the compiler, warning you about failures even before you run it/ship it, really start to feel important). I'm not saying statically typed languages ship code with fewer bugs than dynamically typed ones - that's simply not true.

But I know devs - if one of the devs just loves to rename variables for no f-ing reason, and renames all but a few obscure use-cases of the returned { displayName, wasEmpty } object, then your program will still run, until one unfortunate user runs into "property displayName of object is undefined". And (even though the renaming is kind of pointless in most cases, except when the original author was really clueless in naming), it is not the fault of the programmer for wanting to rename a variable to make code more readable. A language is a bit shitty if it locks you in at your first stupid ideas for interfacing - "if you want something done right, write it twice" is a good rule.

Maybe in the end it comes down to design of code, which again has nothing to do with the language. I've seen shitty code in C++, as well as in Rust. However, shitty code is easier in some languages than in others - if a language gives you too much freedom (like C++, I love it), then you really have to have an idea of what you are going to do and a rough idea of how to get there. There is more to it than just throwing some ints and a string in a struct and calling it done.
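For contrast, a small Java sketch of that rename scenario (names are hypothetical): when a dev renames a field and misses a call site, the program stops compiling instead of failing on some unfortunate user at runtime.

// The record used to be: record Greeting(boolean wasEmpty, String displayName) {}
record Greeting(boolean wasEmpty, String shownName) {}

class RenameDemo {
    static String describe(Greeting g) {
        // return g.displayName();  // a call site missed during the rename: no longer compiles
        return g.shownName();       // the compiler forces every remaining use to be updated
    }
}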
It's very difficult to add typing to a dynamically typed language. Ruby has typing with Sorbet and RBS, but it is an uphill battle to make typing work with existing packages, not to mention that a developer spends more time trying to make the dynamic/static typing play nicely with each other.
On the contrary this has been extremely smooth in Python. I think possibly Ruby has more challenges with it, maybe due to having more syntax "already taken", so there is less "syntax space" for adding types. It can be somewhat difficult to add type annotations for example for a (fully generic) function decorator, you need to look up ParamSpec etc, but fortunately the vast majority of types even in dynamic code are actually simple (or generic over a single or few type parameters). The harder part for these languages is to make use of the type information to actually become faster. At least for Python, I think it will become more and more "reserved" for smaller scripting-type purposes simply due to people being more aware of its relatively poor performance for larger systems.
@@jolben But Python's type hints are ugly and verbose. A simple type declaration requires typing a long list of spells such as TypeVar, Literal, and Protocol, which is only a little better than Java. For instance, a structure like T -> (False, None) | (True, [T]) becomes Callable[[T], Union[tuple[Literal[False], None], tuple[Literal[True], list[T]]]]. And when the type hint warns about an issue, the message given is similar to the C++ template bullshit error messages.
@@KevinSheppard Current TS is a really seamless experience, as long as you allow implicit any, it's really more a gradual typing approach in that case. Not only can you convert files one by one, even inside a file you can convert it bit by bit over time.
Saving you an hour by this summary: There was static typing until the late 90's, then dynamic due to slow PCs, IDEs and compilers. Static typing comes back these days and I bet it is here to stay. There is the Roc language I work on. It is statically typed with full type inference, which makes it usable and low ceremony like dynamic languages, but type safe.
Excellent analysis that fits my experience over the years. I think we’ll see more innovations around type systems; the distinction between dynamic and static won’t be the key difference anymore. Static typing, where possible, makes sense-especially if you can do it without hammering productivity. Programmers want powerful, declarative types that reduce cognitive overhead. Type inference is a great example of this desire: you get nice errors and compile time checks but you don’t have to worry about maintaining a bunch of type annotations. So what’s better than that? I wish I knew. I’d secure my fame by inventing the next big language. But I have seen one thing I think offers a clue. The raku language has a feature called subtypes that lets you specify enhanced runtime type checks that supplement compile time types. For example, you can specify even integers by creating a subtype that checks for divisibility by 2. Then any function that takes an even integer will automatically check its arguments. When you combine powerful subtypes with multiple dispatch, you can remove a lot of imperative code and replace with simpler declarative code. Raku may never be a major language, but it has some amazing design ideas that I hope will be influential. So, I hope to see hybrid run and compile time type systems become more powerful and more common. Check out the multi example on RosettaCode to see an example of subtypes being declared in function signatures: rosettacode.org/wiki/FizzBuzz#Raku
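A rough Java analogue of that subtype idea (just a sketch of the concept, not Raku, and the EvenInt wrapper is hypothetical): the declared type is checked at compile time, while the "even" constraint is enforced at runtime whenever a value is constructed.

// Runtime-checked constraint attached to a compile-time type.
record EvenInt(int value) {
    EvenInt {
        if (value % 2 != 0) {
            throw new IllegalArgumentException(value + " is not even");
        }
    }
}

class Halver {
    // Any method taking an EvenInt inherits the constraint: callers must have
    // gone through the validating constructor to obtain a value at all.
    static int half(EvenInt n) {
        return n.value() / 2;
    }
}

The difference is that Raku lets you declare such a constraint directly in the signature and combine it with multiple dispatch, whereas this Java version has to push it into a wrapper type.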
@@Octal_Covers I don't think it's Java exactly. Maybe J2EE. Maybe certain segments of the Java community that were coming up with rules for Fortune 500 companies to try to wrangle certain types of problems. The main argument for getters and setters, after all, is reengineering -- which is inevitable in any real project that actually lasts for more than 5 years. There are good reasons you should "program to an interface". Doing so is more work up front, but it sure makes reengineering (and polymorphism) a whole lot easier down the road. You could implement his example in Java with only slightly more code than the JS version. (It's been a while since I've done Java, and I don't remember what JSON support is like.) You don't need getters and setters and all that other boilerplate, after all.
@@jplflyer Yeah, I also felt that the Java felt a bit... off. His Answer class could perfectly well have just been field, field, constructor, all public. Because that'd be 1:1 the functionality the JS object gave him. Setters, Getters/encapsulation, he could have done those in old JS too. Equals and HashCode? JS didn't give him either of those (===/== is only object identity in JS after all) so if it's not needed in a Java class, don't implement it. Also, if he really wanted to display all the "possible" ceremony in the Java example, where's his toString? :D Interestingly enough, his explanation for POJO also seemed different from what I'm used to - the thing on his left is what we'd call a POJO back then.
I really wish more statically typed languages were able to compile with errors. I've had huge Rust programs where I wanted to test a simple change on a small code path, but because of traits, etc. I end up editing dozens of files and getting super bogged down. Even if it just compiled into "panic if you call any code that had errors in it", that would be good enough for me.
I end up going in and inserting panics (or todo!()) myself to tell the compiler “don’t even bother with this chunk of code”. Wish I could automate that
Gradual typing in a dynamic language is stupid! It helps the IDE a little bit (without giving exact correctness), but you don't get any of the advantages of having typing at all:
1. you still have slow runtime
2. you still have to write all the tests that could have been avoided with real typing
3. the hacked-on syntax is ugly (take a look at Sorbet!)
4. you don't get the same kind of developer productivity as with a real, strongly statically typed language
5. God forbid you have to work on a huge project in a dynamic language; it makes you want to jump out of the window every day.
I recommend changing your example of ceremony. The difference between a JS snippet for a personal project and a work PR in Java has little to do with static typing.
The java boilerplate seems to be more about the patterns chosen in this case. There is no reason that the answer class can't just look something like the json class.
The ceremony in enterprise Java is far, far higher than the ceremony in enterprise Javascript or Typescript. A lot of 90s era programmers conflated the ceremony of the major static languages of the day with static typing, and disliked static typing because of that. Modern mainstream static languages typically have much, much less ceremony than Java, which is something that he points out. If you watch the video, a decent part of his point is that when the options were Java and python, many people preferred python. When the options are Javascript and Typescript, many people opt for Typescript.
I don't think Hack for PHP is very relevant now - although it may have been when it was created. Afaik almost no one except Facebook uses Hack now. But PHP has been getting much more statically typed. The built-in runtime type checking has been getting much more expressive over the last several versions, and the third-party type checking tools (two popular free static analysis tools, Psalm and PHPStan, plus the PHPStorm IntelliJ-based IDE) are, I think, getting much more widely used. They check the types in advance so users can almost completely avoid having any runtime TypeErrors thrown, and they also take the type system far beyond what PHP supports natively via docblocks.
@@goblinsgym I'd say it started with Turbo Pascal 1.0; the compiling speed and the OG IDE workflow (write/compile/run/debug in the same program) gave serious heartburn to the juggernaut of developer tools at the time - Microsoft.
I think snappiness was also related to code size. If he was programming J2EE for a fortune-500 company, then there would be serious code bloat. It wasn't that the IDE was slow so much as the IDE had 200,000 lines of J2EE code it was indexing.
@@jplflyer Delphi is fast for large projects thanks to a very efficient unit system. Only changed code needs to be recompiled, and you never ever have to futz around with Make files. I remember reading the C++ book in the late 1980s, and trying to visualize what it would take to write a compiler for this mess. If it is hard for a computer to read, imagine what it will do to humans...
I still use TextMate for everything that doesn't require an IDE kind of thing and I'm absolutely loving it! I can have 20-30 Java, Ruby, JavaScript projects open and it's still fast.
GREAT TALK. If you are starting in this amazing world, save it. He is giving you historical facts. If you are a dynamic language lover, this guy is just telling you the truth that you already know; watch out, don't deceive yourself.
One correction: PHP itself is now gradually typed. Hack is basically dead outside of Facebook and Slack, but PHP now has what is considered the strongest type system of any interpreted language. (TypeScript is compiled.) A lot of work has gone into that, and it's paid off.
For me, it boils down to a very simple concept: in both cases types matter. In one case you're saying, "trust me, I got this," and in the other case the compiler is saying, "prove it." For small projects, throwaway stuff, quick scripting, I reach for dynamic. I don't need to spend all that effort proving it. What's the worst that happens? My script fails, I fix a few things, and then rerun it? But when I get to a few hundred thousand lines that live for years and have many different developers working on them, I cannot, for a moment, think it would be a rational decision to go with the "just trust me" option.
I'm working on a 1.2 million line Lua game project... The project was started ~10 years ago... not sure how many programmers have been on it... but it must be more than 50 by now. The problems I see in the code are old sins (crappily written early systems) and code complexity. The hard, nasty bugs are not on the Lua side, but on the engine C++ side.
Great talk. I agree with most of the points, but I'm a bit more optimistic about gradual typing. You say they require a "complicated" type systems, and from a language implementer perspective that is certainly true, but as a regular programmer I call them "powerful" type systems instead! There are still some problems that are easier to solve with access to dynamic/gradual typing. I wouldn't be so quick to dismiss the gradually typed languages that are massively popular today (Python/mypy, JS/TS) just because of their roots in dynamically typed languages. C# is an interesting example of a statically typed language that later added dynamic types too (in version 4.0). Julia and Mojo are two languages that I would love to see entering the top20 some time in the next 10 years. Each of them attempts to combine the benefits of both dynamic and static typing in their own unique way.
@@aoeu256 The "issue", so to speak, with Haskell's Dynamic is that once you have it, you can't really do much with it apart from explicitly trying to convert it to another type, which gets tedious and defeats the point a bit, or coding your entire codebase with Dynamics - which is still a bit painful. C#'s dynamic at least just lets you do what you want and then yells at you at runtime - also the ceremony needed for casting dynamic->string, for example, is much, much simpler.
Even Java is working hard at ceremony reduction. It's got a long way to go to catch up, but the example at the top in today's Java would be a one-liner for the entire Answer. JSON handling hasn't changed much, and is still a pain in the ass like that. The catch blocks can be folded together, and the if/else could be expressed DRY with a ternary calculating the displayName before creating an Answer. I'll let a recode of the Java example speak for itself:

record Answer(boolean wasEmpty, String displayName) {}

@JsonIgnoreProperties(ignoreUnknown = true)
public record Json(@JsonProperty("name") String name, @JsonProperty("admin") boolean admin) {}

public static Answer decode(String rawJson) {
    ObjectMapper objectMapper = new ObjectMapper();
    try {
        Json json = objectMapper.readValue(rawJson, Json.class);
        String displayName = json.admin() ? json.name() + " (Admin)" : json.name();
        return new Answer(json.name().isEmpty(), displayName);
    } catch (JsonProcessingException e) { // JsonMappingException is a subtype, so one catch covers both
        // do something prettier than returning null here please
        return null;
    }
}
The rise and fall of dynamically typed languages is an example of the typical engineering cycle: we have a problem - we find a workaround - we fix the root cause. The same happened to NoSQL and will almost certainly happen to microservices and event-driven architectures.
Could you elaborate? How are we going to solve scaling monolithic software? Or are you referring to that microservices in the true sense are overkill and a bit bigger units are better?
@@kc3vv I am not even sure monolithic software is the answer. It may be something completely different that we don't even know about. Anyway, microservices seem to be a workaround for today's limited technologies rather than a solution that adds value for the customers. Similar to dynamic typing.
This is an interesting take on developments in the programming language space. Microservices are indeed interesting based on the problem that they aim to solve which is probably more related to scalability. There are added benefits in that an app can be split up into different "independent" modules that are language agnostic. There are clear benefits to microservice architectures when done right, but I think that a lot of companies (especially startups) use microservices for the wrong reasons. For example if you build your server in nodejs, which is single threaded, and you want to scale it up, a microservice architecture will help since you can spawn multiple instances, but maybe nodejs was not the best choice in the first place. Also, if you are splitting your app into multiple microservices but stuff breaks if one of the microservices goes down, then this architecture is also probably not the right choice.
@@Voidstroyer I am not questioning the benefits of microservices from developers point of view. I am just saying that they don't add value for the users, as the users don't care about languages used. Scalability does indeed matter for the users but I don't see why the same horizontal scaling cannot be achieved by replicating a monolith.
@@aybgim3850 No I completely agree with you on that users wouldn't really care about what the architecture of an application looks like, as long as it does what it needs to do, and with acceptable performance. I am personally against microservices architecture because I think that some apps (for example in my current company) use microservices prematurely (or the application is split up into too many unnecessary isolated services). I would definitely prefer going monolithic first and only splitting into microservices whenever necessary.
Great talk, thanks a lot… I feel the same and meanwhile love static typing. One aspect that might be interesting to address here as well is reflection, which is the workaround by which statically typed languages deal with situations where dynamic typing is required. Reflection is usually quite painful; might it become easier in the future? Also interesting is how modern programming languages move from strict object orientation to alternative polymorphism, like interfaces/traits, and from exceptions to optionals. This all seems to be linked to the move to static typing as well.
Java POJOs are what happens when you just follow the lead and don't bother to get the underlying principles. If fields are going to be final (as they were in the example given), the fields may as well just be public. The issue with static types is that the capitalist system is interested in churning out optimal abstract code, not in teaching basics; this is what pushes the enforcement of draconian, nonsensical paradigms. Code reviewers are either lazy, over-defensive because of their salaries, or they themselves are unaware... Surprise! Java happened to be the preferred language for the banking system... Does it make sense now?
I code mainly in Lua nowadays... I very rarely miss static typing at all. Any runtime bugs coming up due to a type mismatch are usually the easiest to fix... the problem never lies there. Imho the hard problems are usually related to code complexity or memory out-of-bounds overwrites etc. Lua does not suffer from memory issues. And code complexity you will have in any language. I just love how quickly I can iterate on Lua code, compared to a static language. It's just 5x faster to work with, especially with instant reload. No compile times.
And yet 2 dynamically typed languages (Python and JavaScript) are leading language popularity lists by a large margin (depending on the actual survey, of course). Also, statically typed languages try to look like dynamically typed ones (via auto, type inference, etc.) because typing ceremonies don't fit rapid prototyping. I usually use type annotations in Python, and it helps a bit; but I don't have the feeling that it is super helpful for development or code quality. If the compiler can use the type hints for optimization (like in Julia or Cython), then yes, it is a value add. But otherwise it is mostly just a habit, a ceremony on my side, and I don't feel static typing is that helpful for my use case (developing data projects). So in the end I think the preferences are also domain specific. Nice talk, thanks!
So it seems to me that in some ways you could replace the use of "dynamically typed" in this talk with "interpreted". And sometimes the language mentioned as dynamic actually IS statically typed but is interpreted. Case in point: BASIC, which really only had 3 data types: integer, floating point, and string.
Not quite all uses (Typescript is "statically typed", but still interpreted). But I also felt like most of his problems were not about being statically or dynamically typed, but about being compiled vs interpreted.
The only way I see the pendulum swinging back to dynamic types is if dynamically typed languages make another major revolution, leaping ahead of the static ones again.
I wonder with AI what the next paradigm shift will be in languages. I don't mean just AI replacing devs, but maybe what paradigms might emerge as more important to this field, like possibly logic programming, and if AI benefits more from static or dynamic typing.
Started off with MatLab and Python, especially bigger GUI projects in MatLab drove me into loving static analysis. My variables in those projects will look like typenames only to keep myself sane and the code somewhat self explanatory... but without actual static analysis.. and actual types. In short: want to build something complex? Then use static typing bc your architectural design schema will most likely look like it anyways.
Gradual typing is not limited to just ActionScript and Dart; JavaScript is indeed a prime example of a gradually typed language that is widely used and immensely popular. JavaScript's versatility and ubiquity in modern web frontend and backend development make it an ideal candidate for leveraging gradual typing. With the use of JSDoc annotations, developers can add static type information to their JavaScript code, allowing for improved code quality, tooling support, and catching potential errors. This mixed approach, where parts of the code are statically typed while others remain dynamic, is becoming increasingly prevalent and represents the future of programming languages.
JSDoc goes with Typescript or mypy in his taxonomy (static type retrofit to a dynamic language). ActionScript and Dart were singled out as the only mainstream languages that *started* gradually typed.
Reasons not mentioned for dynamically typed languages, which made me go that direction in my career after starting with typed ones, were:
- lists (dynamic vectors) as first-class citizens
- dictionaries ("JS objects", agnostic hash maps) as first-class citizens
- one-time, type-agnostic implementations, without having to fight the compiler with generics, copy-paste code, or interface madness
And yes, I know, many of these points were addressed by more modern languages, like C# or Rust, to an extent. Roc seems interesting, but I dislike its syntax. Good talk.
You have listed all the bad things :) Dictionaries especially are the worst. You have a method which accepts a dict, but that dict must have certain fields and you have no idea what those fields are. An exact, well-named class/object is much better.
The return of static and strong typing is such... ... a relief for me. I had done a big wad of C and C++, but I really liked Object Pascal - all statically typed, but Object Pascal was much stronger than C and C++. When Linux came along, we moved from only being able to afford one compiler to a world where there were languages galore. And the one that really piqued my interest was Ada, because it took Pascal's strong typing and shovelled a load more on top. But you've got to go where the market beckons, and my customers and bosses wanted web development, and that meant Perl, Python and Ruby, and I found myself disappointingly in the world of dynamic types. So... "welcome back, static types", I say, and the stronger the better (looking at you, TypeScript X x X). A few years back I was using an ORM in PHP that used "Plain Ol' PHP Objects" - which were, ironically, jam packed with getters and setters. (Because I look for any opportunity I can to deride PHP, I preferred not to pronounce this as PoPo but Pooh-pooh).
Hello, thanks for this presentation. There are several things missing in this talk, or at least that's what I think.

First of all, the video seems to imply that it was some sort of "decision" for 1990s languages to be static. It's not the case. They HAD to be static because they compiled directly to machine code, and on 100 MHz processors it wasn't possible to run software with late binding on object creation, function calls, etc. If you called a function, you had to call it exactly the way it was defined (I'm talking about the calling convention), and the arguments had to be passed exactly how the function would read them. It's not like in dynamic languages, where missing arguments are simply undefined/null at runtime and a missing function is reported in the console at runtime. A call to a doSomething() function is compiled into a direct call to a specific code address. Unless you leave debugger symbols in, compiled software doesn't even know what the name of this function is. It HAD to be that way, because statically compiled languages (or ones compiled to intermediate code like Java) were a lot faster than dynamic ones. To visualize the difference between dynamic and static languages: JavaScript is often called a "fast" language (due to good runtimes like V8 and Node.js async performance), while Java has a reputation for being the "slow", sluggish one. Actually, Java is something like 5-6 times faster than JavaScript today, and the gap was much worse back in the day.

Also, you made the point that the urge to deploy quickly favored dynamic languages. I kind of agree with that, but I believe there is more to it. Backend dynamic languages like PHP were simply "good enough" for web development because bottlenecks came either from databases or from network quality itself. In the 2000s you had to wait several seconds for a website, so it didn't matter whether the site responded 200 ms quicker or not. But also remember that back in the day the web was MUCH simpler; it was mostly simple read/write stuff, while today you have heavy, complex web systems with very complex backend and frontend code bases. The bottlenecks simply weren't in the user code. It's like how today Python, a rather slow language, is considered best for AI purposes because it's not the bottleneck: all Python is doing is calling libraries in native code that do the complex stuff, so Python is a perfectly fine solution there.

Also, minor nitpicks about Delphi compilation time. First, the things they cleaned are not a "cache" but generated .o files, which contain native code blocks for particular modules. Those .o files were then collected by a tool called a linker to become a full executable. Second, indeed Delphi and Pascal-based compilers were really fast, but this came at a heavy cost. They were single-pass compilers, which required developers to architect their code properly, because they didn't deal well with circular references. C/C++ compilers could handle circular references OK, but they compiled every file against every file, so compilation time grew exponentially with code complexity.
It's not necessarily true early 90s code had to run in machine code. Early versions of Microsoft's programs for Windows were written in C...however they were not compiled to machine code. They instead were compiled to pseudocode, that was then interpreted at runtime. Executing the programs ran the interpreter, and loaded the interpreted code. This is the secret for how Microsoft managed to get programs like Word, Excel, and Access to fit into the 1MB memory constraints on an 80286, while their competitors failed. Microsoft sacrificed some program speed for smaller memory constraints. The instructions of the pseudocode could be more powerful than the machine code instructions in a smaller amount of memory. The C compiler they used for this was not made publicly available. By the late 90s the memory constraints no longer mattered and Microsoft changed to compiling to native code.
@@grr986 Could you please elaborate on that? As far as I know, the 286 already had protected mode, so it could easily go beyond 1 MB. I'd love to see more sources on this, because I'm curious.
@@martinchya2546 My memory is a little fuzzy because it's been more than 30 years. The 80286 had a 1 MB memory constraint at least under Windows, and only 640K was usable by Windows. There was a protected mode on the 80286, but if I recall right, its limitations kept Windows from using it (see Protected Mode on Wikipedia). You could cheat a little and get an extra 64KB out of it with the segmented memory. There was a hack where you could access the first 64KB of the second MB (see High Memory Area in Wikipedia). It wasn't until Windows 3.0 running on a 386 or later that you really were able to get past the 1MB limit. I just looked at my Microsoft Access 1.0 box sitting on my shelf which shipped in late 1992...it required Windows 3.0, 386, 2MB memory, and I believe it still ran pseudocode at the point it shipped. It was certainly pseudocode for most of its development.
I was educated in the school of statically typed languages, and I detested dynamically typed languages because they make parallelism and effective compilation very, very hard, and parallelism and effective compilation really matter when you actually run your program. But my attitude towards dynamically typed languages has improved; it's only that we (as usual) keep trying to use them for the wrong task. Dynamic languages should be used for scripting and fast prototyping, and static languages for application or systems development. Put shell back: it is a scripting language. The reason for R's "success" in staying dynamic is that it is used for machine learning yet still compiles efficiently; it is designed for exactly that niche.
41:00 “Dynamic typing requires some runtime overhead” - That’s a true statement, but rather more nuanced than Richard makes it appear. In a JIT-compiled language like Julia, the runtime overhead can be O(1) in the number of executions.
"the runtime overhead can be O(1) in the number of executions" - That's a true statement, but quite more nuanced than you make it appear. O(1) doesn't tell you anything about the duration of this. it could be nanoseconds, or it could be decades.
Another reason gradual typing is not the best of both worlds: the certainty that all the code you need to work with has correct types (including the stuff written by the folks who left the company 3 years ago) is a benefit only available in languages where typing is mandatory.
Early BASIC was statically typed, at least up to and including Commodore's home-computer BASIC: variables without a type qualifier were floats (yes, I, J and so on were floats, and FOR loops didn't work with integers in Commodore BASIC! And yes, floats could be array indices), variables with % (S%, X%) were integers, and variables with $ were strings. So BASIC was a strongly, statically typed language. AFAIK the original Dartmouth BASIC had only float variables, and didn't even have strings as a data type.
Nice talk. For me, the big downside of static types is the visual clutter of the necessary annotations. I don't think the talk called out type inference explicitly, but I'd add that qualifier: the statically typed languages most likely to catch on are the ones with good inference.
Your IDE could perhaps hide static type annotations and infer them automatically. Now that I think about it, you could show different properties of your program depending on differently powerful type systems: dependent types for behavior, effect types, and ordinary types.
Rust has some of the best, most impressive, at times shocking type inference. 90% of your code won't need type annotations at all. You can declare a variable with a very ambiguous type (for example, let foo = None), and if anything afterwards restricts its type, it is inferred: it could be returning it from a function, passing it as an argument, or even adding it to a list of known (often itself inferred!) type.
Two things that went unmentioned. First, in practical experience static typing manages not to be bulletproof as far as security goes, but it is still more secure than dynamic typing. Second, about static typing: I don't know precisely how an interpreter would do type inference, but if I were called upon to speculate, my educated guess would be that it would, without even telling the developer, effectively loop over candidate data types in the background until one tested true; that is part of why the grand wizards of coding tell you low-level languages are more painful to write but have the advantage of somewhat higher performance. Personally, as a developer in training, I really strongly prefer static typing for the performance improvements it offers. Many people oversimplify that performance advantage as being just the difference between interpreters and compilers, but one thing I've noticed is that it runs deeper: compiled code tends to be faster because all the ceremony of resolving data types is done for the compiler rather than making the interpreter do that work for you, and any time you can spare the computer that work, you speed things up. A more traditional view would suspect it's impossible to be both more secure AND higher performance, but in practical experience you can squeeze both out of things simply by specifying data types. I broke many rules by starting with learning C instead of Python, and while I dread the lack of a garbage collector later on, it's too useful. Obviously assembly is even faster than C, but I want to be productive. So care about performance, yes, but don't go so far down that rabbit hole that work slows to a crawl.
PHP has a JIT, but that seems to make very little difference to its run speed; the JIT isn't especially useful for a typical PHP workload. More importantly, PHP has an ahead-of-time compiler, and the compiled bytecode is cached on the webserver that runs it.
IDEs, for real. I remember trying to buy a laptop for an early version of Eclipse. I spent a day in Best Buy just installing it and seeing if it crashed when I opened it. The most amusing one was a Toshiba Satellite that crashed into a starscape (black background with randomly colored points interspersed throughout). This was 2008. On the flip side, though, around 41:00 he says dynamically typed languages can't be as fast. Maybe he hasn't tried Julia yet...
The only big apprehension I have is over my love of duck typing ("duck-casting"). I don't want to import the type just to declare the type; I want to give a function some object that has a .write() method on it. You should be able to enforce exactly that statically, but it *is* slightly dynamic in spirit.
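For what it's worth, a minimal TypeScript sketch (hypothetical names) of that "just needs a .write()" idea: structural typing lets you require the capability without importing the concrete type that provides it.

```ts
interface Writable {
  write(chunk: string): void;
}

function logTo(sink: Writable, message: string): void {
  sink.write(message + "\n");
}

// Any object with a matching write() satisfies the interface; no explicit
// `implements` clause or import of a concrete class is needed.
const memory: string[] = [];
logTo({ write: (chunk) => { memory.push(chunk); } }, "hello");
console.log(memory); // [ "hello\n" ]
```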
I think future IDEs driven by AI linters and type checkers are going to swing popularity back to dynamic languages, because in the background they're really going to be transpiling to static code and will be able to warn you instantly about errors we make, or auto-correct for us. Even compile-time errors are going to be picked up while coding, long before you hit save. In fact we may end up with multiple abstraction levels of code that we can drill in and out of: the top level being something like plain English sentences, the next level being like JavaScript or Python, the level after that being like Rust, Go or Zig, then down to a manual-memory-management language like C, and at the lowest level assembly or the like, until all we see is hex or binary. We may simply get to choose the level of type specificity we write at.
4:10 Don't forget C; along with Perl, it was one of the main languages for writing CGI. 7:00 It's interesting to see the point of view of someone from, say, the US, where the web exploded in 1995, when in other countries (like down here in Argentina) the web exploded only much later, so software development focused on desktop for far longer (which meant Visual Basic and Delphi were the preferred languages over JS). 31:05 Not sure about Java, but C# got csharprepl, a must for any C# programmer wanting to quickly draft something. 31:40 I'm pretty sure incremental building was already present in the mid 2000s, at least in Visual Studio. I'm sure, though, that there were paid plugins to compile faster (there was one called Incremental Build, Instant Build, or something like that, which sent compilation tasks to different computers on the network so each would compile a section of the code). The fact that the first or second thing anyone programming in JS or Python does when installing Visual Studio Code is install a linter (which is basically adding the preprocessing step of compilers before running the interpreter) to catch errors early speaks volumes about whether static typing is the way to do things or not (even though I'm pretty sure there are a few kamikazes who in 2023 are probably still using a plain text editor without any linting at all).
Besides the fact that almost every single big mega-company found out they needed the speed of non-dynamic languages. Business drives change; always has, always will. From Facebook to Twitter, they pretty much rebuilt their systems from the ground up at least once.
Let me save you 52 minutes. The answer to the title question is... BECAUSE IT'S USEFUL.
Java is such an impactful language that developers decades later are uncovering that maybe some ideas weren't all bad, despite Java having them. Maybe one day we'll realize that OOP isn't that terrible, despite Java trying very hard to prove otherwise.
The problem with Java is that while many of the ideas aren't bad, the way Java pushed them to extremes is impractical. Java's OOP is great for people who love UML schemas; it's noise and boilerplate for people who just want to write useful code.
OOP will always be a bad idea because it tried to merge several different concepts into one for no good reason. Encapsulation is a good idea but not at the object boundary. Polymorphism is a bad idea when done in the Java style where it is mixed with inheritance, the Haskell type classes or Rust traits work much better for code reuse. Inheritance involves subtyping which involves dealing with co- and contravariance, something even most OOP programming language designers don't seem to take into account. OOP also complicates composition which is another way to achieve code reuse. The main issue with OOP is really that nobody ever developed a proper theory for it before implementing it and what theory there is is often ignored in practice (e.g. Liskov substitution principle).
As for what will drive the push for static typing going forward, my bet is usage-metered (CPU/memory) billing like Cloud Run and Railway. Running a Rust/Go service at a few megabytes each vs. some gigantic Node/Java/etc. service has a quite different impact on cost.
Assembly has typing similar to C. Float vs double vs integer. Signed vs unsigned. And 8-bit vs 16-bit vs 32-bit vs 64-bit. Less typing than many languages but more typing than "no typing".
Working primarily in Ruby on Rails, but learning Java in my personal life, I really appreciate types now. Java does seem overly verbose for sure, but it's also very easy to read. Ruby on Rails is nice because it's so fast to develop in. If I were a lot smarter than I am, I think the freedom of dynamic languages would be a huge benefit. I'm not that smart, though, so I see code written 3 years ago and have trouble initially figuring out what it should be returning. With a statically typed language that's not an issue. Plus my tests are checking types anyway a lot of the time. Java is a pain to set up: I like using vim, IDEs are really annoying, etc. That being said, in the long run, even with something as verbose as Java, fixing bugs or adding features is a lot easier when the language assumes you are an idiot (Java) versus Ruby or any dynamic language.
I got lectures on why we profit from strongly typed programming languages, by finding errors early, back in 1984 already. This is really, really old stuff, and it should be every programmer's general knowledge; I wonder why it apparently isn't. Besides, with any static programming language you can work as fast and as interactively as with, say, Python, only that: 1. you need *_much_* more experience, and 2. if you use, say, C, you have to *_write much more code_* to reach the goal. Java is not representative; Java's coding practices are not sound.
For TS, this sounds like it could just as easily have become an IDE feature instead of a modification of how you write code, which requires a compilation process. Thankfully, in PHP land, static typing is optional and dynamic typing doesn't require transpilation. However, I don't find much use for it in PHP because I already have an excellent IDE (JetBrains), and type-based problems are rarer in PHP for one reason or another.
Most of the comparisons made have nothing to do with static or dynamic typing, but with OOP vs. functional programming, features of the particular language, or other things.
The problem all these "better" languages have is that they don't solve problems most developers are having. Almost nobody is CPU limited anymore, and if they are, scaling up CPUs is far cheaper than switching to using the newer tools with fewer available libraries and frameworks and job candidates.
The problem with this approach is that it's true right up until it isn't and you're back at square one. I'm excited to have a new cohort of languages that give me a bit of both worlds up front.
Bugs and vulnerabilities are a problem every developer will always have, unless their code is designed to be written but never run. It kind of feels like programmers not doing, or not caring about, debugging.
If you're doing REPL-based exploratory programming, or IO-bound applications, you are not CPU-limited. However, there are certain types of apps (real-time apps and game engines with hundreds of thousands of objects) that can't be done well in the slower languages. I have to say that Lisp, with its macros, can embed or emit unsafe code or MLIR easily.
It also reminds me that back in 2005 Python was so much higher-level than Java and C++, with its list comprehensions, closures, **kwargs, decorators, generators, and convenient JSON-like syntax, but most other languages have since caught up. Python sort of froze for a while, and for higher-level (if slower) features like relational/constraint programming, STM, reactive propagation, equational solvers and symbolic programming, ADTs with full pattern matching, predicate classes/dispatch, and macros, you still have to be in a Lisp to benefit from them.
Great talk! After years of working with the JVM, moving to JavaScript or PHP on the server side for large projects just felt... icky. Also, take a look at Groovy for a nice balance between dynamic and strongly typed.
I much prefer editing programs and compiling them while the program runs, so I never have to restart it. What turns me off most static languages is the great difficulty with which they handle runtime data variation. Working with heterogeneous collections in a dynamic way, when the return result is unspecifiable, is usually a headache of templates in statically bound languages. I much prefer to work in Steel Bank Common Lisp and work on my program while it's running, only restarting it when I need to; I can compile each and every function as I define and adjust them while the program stays running. There are benefits to static typing, but you realize at that point you're programming in a subset of a more expressive programming language. Languages that are poorly designed, like JavaScript, turn people off dynamic typing because they mistake dynamic typing for poor coercion rules from one data type to another, especially around strings and numbers, which generally just shouldn't be happening in the first place. I feel like poor type-coercion rules are really the worst offender in most dynamic languages today. I generally want to express things in terms of algorithms rather than leading a compiler around by the nose and locking myself into whatever gets templated at compilation time.
I don't see complex type systems as a downside. The complex type system Typescript needs to model the full range of possible values in Javascript is orders of magnitude more expressive than the C# type system. A trivial example: where C# would require multiple overloads for a single parameter of varying types, Typescript's union types allow a single function signature.
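As an illustration (a hedged sketch with made-up names, not from the talk), one TypeScript signature with a union type covers what would otherwise need separate overloads:

```ts
function describeId(id: string | number): string {
  // The union narrows inside each branch, so both cases stay fully type-checked.
  return typeof id === "number" ? `numeric id ${id}` : `string id "${id}"`;
}

console.log(describeId(42));        // numeric id 42
console.log(describeId("abc-123")); // string id "abc-123"
```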
I like what Python does, and, I don't know, having a language basically do what my IDE tries to do (statically analyze and check), but then use that to compile? Sounds neat.
My personal problem with untyped languages is the following: If you get undocumented code (or the documentation is not up to date), you have no clue what the hell the code does. If functions with 3 parameters get called, and every parameter is some sort of list containing lists of lists of random stuff (looking at you, JS...), etc, you need hours and hours just to assert what the parameters should look like.
Yep, you pretty much need to run your code to understand its full behavior, or, if you're lucky, run your tests (if you have them).
Not strange untyped languages are often called "scripts". They were meant to be at most a screen size large programs, like bash scripts or test scripts in groovy etc, then some bored developer thought they can be used for large scale projects 😀, listing all the advantages but never the disadvantages.
There's also the fact that static typing between application boundaries gives you statically-enforced contracts. As teams get bigger, this kind of contract enforcement becomes more valuable.
Only if you have some shared API repo or are using gRPC protocol buffers, though.
@@meostyles and it will have to be serialized to a type-less byte format for transmission & storage as well (plus handling stuff like endianness, encoding & locales).
@@switchblade6226 and all of those are already solved problems, thank you for polluting the comment section with a meaningless comment. Did you learned those words right before you made the comment?
To be fair though, some of the problems with Java were actually dogmatic programming conventions rather than a requirement of the language or static typing.
I never understood this tradition of creating getters and setters for EVERY class member EVERYTIME. Sometimes it doesn't make sense to create a getter or setter for some members, specially setters (it defeats the point of encapsulation).
@@shimadabr if you think of it in terms of OOP, where messages are passed around, and objects can’t do anything but mutate their own state, send messages and receive messages, getters and setters are a must. Because they give the illusion that you’re doing proper OOP!
That’s a great observation
Static typing is dogma, I say!
@@Fs3iI don’t think he meant that getters/setters are always unnecessary; he’s talking about code where the developer mindlessly creates a setter for a field where it makes not sense to set that field from an external context.
Static typing is a subcategory of static analysis. Being able to reason about you program without running it is priceless. You only need to write tests for things that you can't reason about statically
You only have bugs for stuff you can't check statically.
When you have a powerful type system, many problems can be reduced to a type check.
This is just the beginning, you have linters (that you can even write yourself) and DSLs, where you create mini languages that have the properties you are looking for.
The type system in C++ (and to some extent Java) was only there to help the compiler produce more efficient binaries it didn't care about correctness. It was functional languages like Haskell that pioneered using types for correctness.
yeah but type systems get in the way of reasoning about my code more often than not.
@jhoughjr1 The answer to this objection is simply time and practice. It may get in the way, but the upsides are incredible.
@@jhoughjr1 I respectfully suggest that's because your reasoning process is not as rigorous as it could be and is not suitable to reasoning about very large and complex systems. For any large system with complex entity types in the problem domain, correctly determining those types and their relationships is a large part of finding the best solution for not just functionality but extensibility, maintainability, security _and_ stability.
Something I've heard multiple devs say about Typescript for example is that once you fully understand its type system, you find yourself actually putting most of the business logic into type definitions, rather than in the algorithms which use variables of those types. From watching videos about Haskell, which I haven't tried yet, I get the impression that's the case with Haskell too, because of the way its types are defined (which I gather is based on all the things the type _does_). Type systems like this allow you to put system-wide constraints in one place in the code (the type definition), rather than redundantly replicated all over the code which tends to be the case when dynamically typed languages are used.
I've worked with both statically and dynamically typed languages for 35 years (roughly 16 languages, I'd estimate), including procedural, functional, object-oriented and even stack languages, and like most people who've fully grokked the benefits of static typing, its obvious to me there is no contest in terms of which is better. The _only_ benefits of dynamic typing are (1) being easier to learn if you're new to programming and (2) producing less verbose code (for small problems).
But they encourage far less rigorous ways of analysing problems and coming up with solutions that serve multiple desirable goals, rather than just the functionality that's immediately required, especially in products under continuous, incremental development. Statically typed languages are generally vastly better in every other respect (extensibility, scalability, maintainability, security, stability of code, et al) for any large complex system. The use of dynamically typed language can only really be justified when (1) the nature of the problem means the code will always be in small, self-contained units rather than integrated systems of any scale, e.g. code that wires up controls on a web form, plugins for 2d and 3d art apps and so on, and (2) for _prototyping_ systems rapidly, rather than for producing the final system the prototype is a proof-of-concept for.
Make no mistake about it, dynamically-typed and idiosyncratic languages like Javascript and Python are not popular because they're good programming languages, but because they're easier to learn and in both cases there are other inducements to use them that have nothing to do with their virtues as actual programming languages. The "easier to learn" part comes with a catch: Because they demand less rigor of new developers and allow developers to produce useful results quickly with less knowledge and less careful thought, they also often result in sloppy solutions with long term technical debt, like making it a lot harder to extend the system later, or producing a massive ball of spaghetti as functionality is added over time without any rigorous thought, such that finding a memory leak somewhere deep in the spaghetti 5 years later is a nightmare.
Where "other inducements not related to the virtues of the language itself" are concerned: In the case of JS its because it was the only language that could run in every browser (something webassembly is slowly changing) so you had to use it for client side web code and when Node brought JS to server code, it allowed web devs who only knew JS to write server code without learning another language. In the case of Python, which is a terrible language, its pretty much entirely because of Google being the driving force behind a lot of AI and data science in the 2000s, and adopting Python as the language for those efforts. And from what I can gather that decision seems to have been adopted because a lot of people working in that area were data scientists first and reluctant programmers second, so making it as easy as possible for them to convert their mathematical expertise into code, rather than making them good programmers, was the priority. As a result the language _ecosystem_, rather than the language itself, has a ton of stuff that is useful for data science and machine learning, like numpy. Equally useful libraries that do the same things can and have been developed for other languages, but the Python ecosystem has the most mature versions that the most programmers are already familiar with. It's a great ecosystem for machine learning and data science, built around a perfectly shit programming language.
From a design point of view, since you're talking about reasoning about code: I worked on a lot of waterfall projects early in my career, where you spend 7 months just workshopping and designing a large system and then only 2-3 months actually writing the code. By the time you get to developing it, you're working from a rigorous technical specification which tells you exactly what you're meant to produce, down to every last database table, class/module definition and all of their fields and methods, as well as a detailed description of every process flow and how all those structures interact at every step, before you even begin actual coding. There are no outstanding design decisions when you begin coding; you're just implementing the technical spec, which is often hundreds of pages long, with diagrams. There are no edge cases that take you by surprise three months after going live, because all of that has been thought through and mitigated in the design process. If you designed properly and implemented the spec properly, it just works. And it's intimately documented if someone wants to add a feature later and wants to know exactly what they must change. Assuming some kind of impact analysis / change specification is done for those changes, the documentation is incrementally updated, too.
And my personal experience is that determining the correct data structures/types (in both persistent data stores and runtime program state) for a problem of any size and complexity (like say developing an inventory management system for a company with many warehouses) is usually a huge part of that rigorous design process.
A huge part of the technical debt I've had to deal with when working on legacy systems is due to poorly designed data structures that were dreamed up on the fly by some dev at 3am who just wants a solution to their present problem, not the correct solution for flexibility and long term maintainability, most often written in a dynamically typed language with lots of missing schema checks making it very easy to break things. Static typing at least compels the programmer to think about data structures/types more carefully when they're making those 3am decisions.
I know test-driven development is very popular among dynamically-typed programmers and many seem to think that is sufficient to catch problems like missing schema checking that will cause nightmares later. But TDD always assumes that the tests cover all cases, which is hard to achieve in reality. And unit tests can't cover things that require integration testing, and replicating test environments consisting of dozens of interacting systems for integration testing is often a nightmare all on its own. Static typing actually removes the need for the majority of tests required by dynamically typed language in TDD.
@@farrenh Can't we agree that we should use the right tools for the right problem, and that sometimes dynamic typing is the more suitable tool?
I would never go as far as claiming dynamic typing is always better (if I interact with other hardware, a database, network etc, then not only the datatypes, but endianness and exact lengths in bits as well as alignment etc are super important).
On the other hand I'd argue that stating that "...languages like Javascript and Python are not popular because they're good programming languages, but because they're easier to learn and in both cases there are other inducements to use them that have nothing to do with their virtues as actual programming languages..." misses the point of these languages (in my opinion). In some cases it is extremely obvious what type/kinds of types/characteristics an argument to a function will have (good names, comments etc are good in all languages), and specifying the types explicitly adds nothing of value. Sometimes (again, I'm not trying to argue that this is true in general!) shorter is better, if it is allowed to mix objects freely in a list (without adding an interface/new type, inheriting it from multiple classes etc) then a simple runtime type-check to see if "is this a valid thing to exist in this list?" before processing it (plus a unit test or two) can have the same safety effect as hundreds of lines of static type word salad.
If time/money was an infinite resource, all programs had to be at least a million lines long and lives depended on every single one of them, I might have agreed with you.
But metaphorically speaking, if time/money had been an infinite resource, all my clothes might have been tailor made out of hand woven cashmere mixed with fine platinum threads and have built in air conditioning.
@@jhoughjr1 well a blade can get in your way of chopping wood if you grab the axe by the blade
Now, with Java record types and people dropping the stupid getter and setter nonsense, the Answer class implemented today is one line, which includes equals and hashCode.
Yeah, modern Java is nice. It's unfortunately got a lot of negative perception to battle though.
@@KyleSmithNH Nice may still be an overstatement, but it is a LOT better than it used to be and than people perceive it.
It's just the @Data annotation with Lombok, which is basically industry standard by now...
Java's runtime is awful!
@@bobweiram6321 Can you be more specific? There’s plenty to hate, so I’d like to know which part is grinding your gears.
Here's something else that changed...
Web pages in the 90s were pretty simple by comparison to modern pages. And dynamically typed languages start to run into problems. Imagine your site is 1 million lines of code. It was written over a 10-year period by 20 developers. 15 of those people are long gone. And now you're looking at code you either haven't seen in eight years or may have never seen.
And static typing helps. A LOT. Function foo takes two arguments, and they are clearly defined. You don't have to guess. You know what it returns. You just KNOW. Furthermore, your compiler and your IDE know, too. You're not going to make stupid mistakes like use snake case instead of camel case. You're not going to use size when the function expects (or returns) count. You don't have to be an expert in function foo to at least use the right field names.
When your site is only 500 lines, that's just not a big deal, and that's what 1998 web sites were like. But that's no longer true.
The dynamically typed languages threw out the baby with the bath water. They did so because they were solving a different problem.
Statically typed forever, as far as I'm concerned, but I'm an old fart.
Note that IDEs for dynamic languages had autocompletion, spellchecking, and type inference since ~2005, yes, they didn't get it right some of the time but it was right 95% of the time. I think documentation is important though, especially doctests.
Even as a solo dev working on small projects, static types stop me confusing myself. Instead of having to remember "this function returns an anonymous object in this specific format", I can read the function definition, go "ooooh, it returns one of those", and then go read the type definition it returns (see the sketch below this thread).
@@huxleyleigh4856 so return instances of classes instead
Returning instances of classes doesn't help in a dynamic language because the return type might still be opaque to the IDE.
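A minimal TypeScript sketch of that point (hypothetical names): a named return type means you read "it returns one of those" instead of reverse-engineering an anonymous object shape, and the IDE knows the shape too.

```ts
interface SearchResult {
  items: string[];
  total: number;
}

function search(query: string): SearchResult {
  // Dummy body; the interesting part is the signature.
  const items = ["alpha", "beta"].filter((s) => s.includes(query));
  return { items, total: items.length };
}

const result = search("a");
console.log(result.items, result.total);
// Misspelling a field (e.g. result.Total) is flagged before the code ever runs.
```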
I saw a 180 MB first page, with just a few things on it, built with Angular -> TypeScript -> JS. Downloaded as plain HTML it was 2 MB. If one wrote that in plain JS it would be small and fast; TypeScript just helps bigger teams work together. Writing good dynamic code is wisdom. Google's home page? 6 MB! Respect. But not for everyone.
This hits a bunch of nails on the head that I had trouble identifying exactly. Really appreciate it. I have really enjoyed working with the newer statically typed languages, while the old ones (Java, C++) had many drawbacks, most of which you mentioned. The industry just went in a bad early direction. We can have a developer experience that is as good as Ruby with some statically typed languages now.
All of the direction Java and C++ took was bad, frankly. Tell me a single feature of those languages that we haven't since discarded or minimized because it was harmful; every single feature those languages introduced has been discarded or replaced by superior features.
@rtfeldman Have you seen Julia? It is a dynamically typed language (although with a design that leans on type inference) that doesn't _require_ runtime overhead. It also doesn't require runtime type checking: Julia _can_ do runtime type checking if necessary, but it is not required; in fact most of the time we want to avoid it at all costs and only use it when needed.
This is a phenomenal breakdown and historical analysis - you helped me put together some puzzle pieces that I've been trying to make sense of for years. Thank you for all of the work that you put into this talk.
I'm glad you ended up talking about it at around 33:24 because Delphi was indeed very fast across the board: the IDE was responsive even back then on the era's machines, compile time was very quick and compiled programs ran fast.
Yes! I learned programming in school and university with Turbo Pascal and Delphi, and the first time I had to work with languages like Java or C/C++ I thought, both about the IDEs/tooling and about the speed: how is anyone supposed to work with this? It's like being in the Middle Ages.
@@mudi2000a I preferred the syntax of "curly braces" languages like Java/C/C++ to Pascal/Object Pascal, but back then Delphi's IDE and components were indeed ahead of their time.
Yes, it was never the case that you really need such fast CPUs for a "properly fast IDE"... I once came up with an IDE idea that could give you this even on an 8502 if you really wanted, because code wouldn't be stored as text to parse; you would edit an already-parsed syntax tree, for example. There really are ways to do it cheaply; there just was no time (or interest) to invent them. Also, pointing to Delphi (or let's say Kylix, which I used more in high school): these kinds of tools already worked fast enough on the machines of the day, which again shows this is not really an issue of computer speed but of implementation.
Also, if you look at completion in a good editor like Vim and compare it to VS Code, the latter still feels sluggish to me despite good computers (and despite many people not noticing it); there is still a lot of waste.
Btw, the only reason I can imagine the "pendulum swinging back a bit" is that nowadays, even with dynamic or gradual typing, people introduce build steps as slow as C++'s. I literally have a customer where a full build takes MINUTES. It's like: what the hell? You still have type errors, because it's gradual typing, but now you also have a build step; that seems like the worst of both worlds...
But then that will only last until people realize you don't actually need a slow build step for static types either, haha. So even if the pendulum ever swings back, it must be temporary.
Delphi AKA OO Pascal
@@u9vata I think that on moderately recent hardware VS / VS Code are decent enough, but indeed, responsiveness isn't perfect. For sure, they have more features than Delphi/Kylix (I didn't use Kylix much; I mostly only used it to get a Delphi console application running under Linux before switching to Free Pascal), but I can't help thinking there is not the same level of craftsmanship, or at the very least the same focus on optimization, as in the old Borland days. That said, not all products from that time were as robust and well engineered as Delphi. I had barely tried it, but even C++ Builder from the same company felt subpar (I vaguely remember compile times being much longer, and weird, hard-to-understand compile and runtime errors, probably because of the constraints of the C++ language).
I prototyped a vertical slice of a system in Python, and after much thought, made the move to Nim. Static type checking has ensured consistency and caught many an error in a system that is expanding in scope and complexity, but I've gotten this with almost NO ceremony. Refactors have become downright easy. I have some concise type definitions, and some helper functions which reach inside a variant type to grab dataframes and header types consistent across all variants; but the overwhelming majority of my code is business logic. It feels like Python, but with a compile step that catches issues. Nim also hates nulls and prefers stack allocation (going so far as to make dynamic types like objects/structs and sequences/hashmaps just "unique" pointers in the stack behind the scenes). Bonus points for static compilation with musl, which has made deployment trivially easy on just about any variant of Linux server.
I've been coding since 1975. I've known for about a half-century that static typing reduces debugging time because errors become apparent earlier in the debugging cycle. It's why static typing was invented. Almost every time I write something in Python, JS, Ruby, or PHP the excitement we felt that day rushes over me once again. Evidently the papyrus scrolls have been lost. I'm trying to remember who took the meeting notes but it was so long ago ...
They were solving different problems. I've been programming about that long, too, and I have a clear preference for static typing.
When your code base is 500 lines to assist your HTML, it's one thing. When your code base is 50,000 or 1 million lines, it's an entirely different thing. The scripting speed of dynamic languages pales in comparison to the time you spend figuring out "what fields does this method return?"
I was first introduced to programming with BASIC in 1973, typed into a teletype connected to a mainframe far away that we never saw. Why was that? Because people at the time thought assembler or ALGOL was probably a bit too complex to use to introduce kids to programming. Within a few months we did move on to assembler and other things.
The way I see it, when personal computers and then the web burst onto the scene, there were suddenly millions of people out there who could now be introduced to programming for the first time. Naturally, following in the footsteps of BASIC, languages that were simple for beginners to pick up and get the job done were developed and became popular.
Something went wrong, though: those millions of new programmers never got pushed to "grow up" programming-wise. They went on to build companies and mega-corporations on the same beginners' languages they were weaned on. Really, they had little choice.
Finally, the programmers of the world have grown up.
The reverse is totally true, I worked in that space for the better part of a decade. You can do static analysis in a dynamically typed language and use that information to optimise, eliminate runtime type checking, enumerate error cases, and provide autocompletion. Types that get checked make this easier, in particular they can give you tighter bounds on analysis time and on the remaining uncertainty.
I remember around 2009 there was a talk by Steve Yegge about dynamic languages striking back: type inference on dynamic languages could give you type warnings instead of type errors and let you use autocomplete with 97% accuracy, gradual typing could give you some of the benefits of dynamic languages, and the new JITs (Psyco, V8, LuaJIT) were improving the performance of dynamic languages. Haskell actually soured my liking of static typing, since I didn't know about -fdefer-type-errors and I kept getting hard-to-understand type errors because I was relying too much on type inference, and I would often have to change types all through the call stack to lift things into IO, State, Either, Maybe (look up the Clojure "Maybe Not" talk). Easier-to-learn statically typed languages like Kotlin, Swift and Elm are, I think, what swung the pendulum.
I've tried static analysis in PHP: 20k false positives. Fortunately PHP 8 has improved a lot.
@@aoeu256 That's interesting to hear! Haskell can definitely be a struggle to fight with sometimes, but for me, it actually increased my appreciation/reliance on static typing by a lot, to the point that Haskell is basically my go-to language for a lot of small ideas I have on the side and that I'm even doing some stuff in Idris, a dependently typed language.
I really like "data oriented" typing, or the sort of gradual typing languages like Clojure/Script can take advantage of. I define a spec and from that I get automatic client side validation, server side validation, data transformation, and database schema generation.
And generative testing!
I also like those, but I agree that, as of now, statically typed languages are superior to gradual typing like that; we have seen so many projects that were bootstrapped in Clojure/Script and then switched to something else, like Storm or asciinema, because the performance of Clojure wasn't there.
Now, his point of "I don't see a pendulum switch" doesn't make sense to me, because I do see a switch in the future, it just requires work from someone that knows about PLT & type theory & appreciates dynamic development workflow that only Lisp or Smalltalk can give. Things like dependent types, code synthesis, theorem proofing and having a dynamic runtime for development and fully compiled deployable binary with zero overhead, could be the future.
But this is not something we are going to get from the Clojure/Script community; maybe from the Racket community. I left Clojure/Script because I found the community very stubborn and irrational once Haskell & co. had already proven, with code, that static typing is productive and very useful.
With gradual typing, wouldn't you have to remember what every function does, and how, every time you use one, to make sure whatever you pass in doesn't cause compile-time errors?
@@laughingvampire7555 what are you using now? I want to be able to use Clojure more, but typing aside, it feels like it has a pretty small reach, not having a clear pure clojure path to mobile development or embedded systems or web assembly like other languages do so I'm a bit torn.
Seems like it was more of the heavy OO you had problems with rather than the static typing.
Don't get me wrong, I've never been a fan of heavy OO, and I use static/dynamic languages depending on the task at hand, but most of the issues you bring up here seem to lie more with the obnoxious object model than with the type system.
Thanks, this is by far the best talk about dynamic and static typing! Looking at the history and raison d'être of dynamic and static typing is a very good approach.
23:49 it looks like the Roc `decode` function type annotation is incorrect, the function seems to return a `Result` type (`Ok`/`Err`) but is not annotated as such. Richard also skips past the ML language which was doing static typing with nearly zero ceremony using the same techniques that Roc does now, since the early 1970s 🙂
Re: considering build speed in language design: this was pioneered by Wirth in Pascal with its single-pass compiler. Turbo Pascal was famous for its build speed. Delphi is a Pascal variant.
this talk needs to be given in strangeloop and the biggest clojure conference, take bodyguards with you
Static typing provides implicit documentation for skilled software engineers. If a team needs to build something complex, there’s just no substitute for static typing. The benefits go on forever, and about the only down side is that the languages tend to be a little more rigid. But that rigidity is exactly what you want when a system gets large and complex. For a system that can be built in less than a month, then dynamic typing is fine.
One thing missing from the talk is any discussion about program size. For small programs, where a single programmer can understand the whole program, the extra error checking given by static typing isn't particularly helpful. If, however, you are working on a system consisting of millions of lines of code, developed by hundreds or thousands of programmers (e.g. a web browser), then you want as much error checking as possible as early as possible.
Another recent interesting change has been the move away from object orientation. I remember when, in the eighties, object orientation was the great new thing, yet now many new languages seem to have found good ways of providing most of the advantages of object orientation without the overheads.
This is exactly it. The code size of a modern web page is no longer 500 lines of HTML and 20 lines of JS.
Note that what OOP originally meant in Smalltalk wasn't the design-pattern class fest of C++/Java. Smalltalk had closures (blocks), methods were objects, loops were just messages that took blocks, and there were lots of dynamic features like method-missing and metaclasses to make code much shorter, plus message-oriented programming, while coding in the normal Java way actually made your application longer...
There are actually two meanings of "type", which historically coincided and are in desperate need of emancipation from each other half a century later. One is a predicate bounding a parameter's domain. The other is a concrete representation of abstract values.
I can't for the life of me see the difference between these two meanings. To my mind, an abstract value is a definition of a value, be it a definition of what an int or bool is, or some aggregate like an array, structure or class; actual, concrete values are instances that match the abstract description. But at the same time, that abstract definition also implies the bounds of the type: the range of ints, the size of arrays, etc. What is the difference I am missing?
@@Heater-v1.0.0 I think by "predicate bounding a parameter domain" he means things like int and by "concrete representation of abstract values" he means things like structs.
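One way to read the distinction (my own hedged sketch in TypeScript, not the commenter's wording): the representation of a value might be "a 64-bit float", while the predicate bounding its domain is "between 0 and 100"; mainstream type systems express the first directly and the second only indirectly, e.g. via a branded type plus a runtime check.

```ts
// Representation: still just `number` at runtime.
// Predicate: only values that pass the 0..100 check may carry the brand.
type Percent = number & { readonly __brand: "Percent" };

function asPercent(n: number): Percent {
  if (n < 0 || n > 100) throw new RangeError(`${n} is not a valid percentage`);
  return n as Percent;
}

function applyDiscount(price: number, discount: Percent): number {
  return price * (1 - discount / 100);
}

console.log(applyDiscount(80, asPercent(25))); // 60
// applyDiscount(80, 25) would not compile: a plain number is not a Percent.
```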
Gradual/optionally typed languages currently in use: Ravi (a Lua superset), Luau ("Roblox Lua"). In development: Mojo (a Python superset); this one has the potential to be mainstream before or around 2030.
Also, PyTorch. The fact that I can write code and it will navigate complex runtimes to execute on heterogeneous compute environments is practically magic.
Great talk, and I agree with basically all of it. I almost thought gradual typing would be the future, but you convinced me otherwise with fantastic logic.
I think mutability is the pitfall and potential death of all dynamically typed languages. It is so easy to write spaghetti Python code: just have a ton of functions returning nothing, all mutating a single list variable that isn't even guaranteed to be a list, and it quickly becomes almost impossible to figure out what is actually happening without running the code and putting in some logs (tools like the compiler, warning you about failures before you even run or ship the code, really start to feel important). I'm not saying statically typed languages ship code with fewer bugs than dynamically typed ones; that's simply not true. But I know devs: if one of them just loves to rename variables for no f-ing reason, and renames all but a few obscure uses of the returned { displayName, wasEmpty } object, then your program will still run, until one unfortunate user runs into:
property displayName of object is undefined
And (even though the renaming is kind of pointless in most cases, except when the original author was really clueless with naming) it is not the programmer's fault for wanting to rename a variable to make code more readable. A language is a bit shitty if it locks you into your first stupid ideas for interfacing; "if you want something done right, write it twice" is a good rule.
Maybe in the end it comes down to the design of the code, which again has nothing to do with the language. I've seen shitty code in C++ as well as in Rust. However, shitty code is easier to write in some languages than in others; if a language gives you too much freedom (like C++, which I love), then you really have to have an idea of what you are going to do and a rough idea of how to get there. There is more to it than just throwing some ints and a string in a struct and calling it done.
I appreciate dynamically typed languages for their mutability properties the most.
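A hedged TypeScript sketch of that rename scenario (hypothetical names): once the object's shape has a declared type, a half-finished rename fails to compile instead of surfacing in production as the undefined-property error quoted above.

```ts
interface Greeting {
  displayName: string;
  wasEmpty: boolean;
}

function makeGreeting(raw: string): Greeting {
  const trimmed = raw.trim();
  return { displayName: trimmed || "anonymous", wasEmpty: trimmed.length === 0 };
}

// If someone renames displayName to name in the interface but misses this call
// site, the compiler reports the stale field here; in untyped JS the code would
// keep running until a user hit the missing property at runtime.
const greeting = makeGreeting("  Ada ");
console.log(greeting.displayName, greeting.wasEmpty);
```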
It's very difficult to add typing to a dynamically typed language. Ruby has typing with Sorbet and RBS, but it is an uphill battle to make typing work with existing packages, not to mention that a developer spends more time trying to make the dynamic and static typing play nicely with each other.
On the contrary, this has been extremely smooth in Python. I think Ruby possibly has more challenges with it, maybe due to having more syntax "already taken", so there is less "syntax space" for adding types. It can be somewhat difficult to add type annotations for, say, a fully generic function decorator (you need to look up ParamSpec etc.), but fortunately the vast majority of types, even in dynamic code, are actually simple (or generic over a single type parameter or a few).
The harder part for these languages is to make use of the type information to actually become faster. At least for Python, I think it will become more and more "reserved" for smaller scripting-type purposes simply due to people being more aware of its relatively poor performance for larger systems.
That's been my experience with early TypeScript, which is why I'm hesitant to use it to this day.
@@jolben Python 3 uses type hints. Just like in Ruby, type checking can be turned off when the going gets tough for large code bases.
@@jolben But Python's type hints are ugly and verbose. A simple type declaration requires typing a long list of spells such as TypeVar, Literal, and Protocol, which is only a little better than Java.
Take a function of type T -> (False, None) | (True, [T]): as a hint this becomes Callable[[T], Union[tuple[Literal[False], None], tuple[Literal[True], list[T]]]]. And when the type checker warns about an issue, the message it gives is similar to C++ template bullshit error messages.
@@KevinSheppard Current TS is a really seamless experience, as long as you allow implicit any, it's really more a gradual typing approach in that case. Not only can you convert files one by one, even inside a file you can convert it bit by bit over time.
Saving you an hour with this summary: there was static typing until the late 90s, then dynamic typing took over due to slow PCs, IDEs and compilers. Static typing is coming back these days, and I bet it is here to stay. There is the Roc language I work on: it is statically typed with full type inference, which makes it usable and low-ceremony like dynamic languages, but type safe.
Additionally, JS having both null and undefined, with the two not being equal to each other, just offers so many more opportunities for errors.
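A small sketch of that footgun: the two "empty" values are distinct under strict equality, so it's easy to handle one and forget the other.

```ts
console.log(null === undefined); // false (strict equality)
console.log(null == undefined);  // true  (loose equality, its own trap)

function label(value: string | null | undefined): string {
  if (value === null) return "explicitly cleared";
  // Forgetting this branch is easy, and `value === null` does not cover it:
  if (value === undefined) return "never set";
  return value;
}

console.log(label(undefined)); // "never set"
```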
Excellent analysis that fits my experience over the years.
I think we’ll see more innovations around type systems; the distinction between dynamic and static won’t be the key difference anymore. Static typing, where possible, makes sense, especially if you can do it without hammering productivity.
Programmers want powerful, declarative types that reduce cognitive overhead. Type inference is a great example of this desire: you get nice errors and compile time checks but you don’t have to worry about maintaining a bunch of type annotations. So what’s better than that? I wish I knew. I’d secure my fame by inventing the next big language. But I have seen one thing I think offers a clue.
The raku language has a feature called subtypes that lets you specify enhanced runtime type checks that supplement compile time types. For example, you can specify even integers by creating a subtype that checks for divisibility by 2. Then any function that takes an even integer will automatically check its arguments. When you combine powerful subtypes with multiple dispatch, you can remove a lot of imperative code and replace with simpler declarative code. Raku may never be a major language, but it has some amazing design ideas that I hope will be influential.
So, I hope to see hybrid run and compile time type systems become more powerful and more common.
Check out the multi example on RosettaCode to see an example of subtypes being declared in function signatures: rosettacode.org/wiki/FizzBuzz#Raku
I think a lot of the issues people had with "static typing" were really more structural problems with the languages themselves.
I think the Java example specifically is more an argument against (past?) Java's brand of OOP.
@@Kenionatus That's true, it feels like Java encouraged overengineering
@@Octal_Covers I don't think it's Java exactly. Maybe J2EE. Maybe certain segments of the Java community that were coming up with rules for Fortune 500 companies to try to wrangle certain types of problems. The main argument for getters and setters, after all, is reengineering -- which is inevitable in any real project that actually lasts for more than 5 years.
There are good reasons you should "program to an interface". Doing so is more work up front, but it sure makes reengineering (and polymorphism) a whole lot easier down the road.
You could implement his example in Java with only slightly more code than the JS version. (It's been a while since I've done Java, and I don't remember what JSON support is like.) You don't need getters and setters and all that other boilerplate, after all.
@@jplflyer Yeah, I also felt that the Java felt a bit... off. His Answer class could perfectly well have just been field, field, constructor, all public. Because that'd be 1:1 the functionality the JS object gave him. Setters, Getters/encapsulation, he could have done those in old JS too. Equals and HashCode? JS didn't give him either of those (===/== is only object identity in JS after all) so if it's not needed in a Java class, don't implement it.
Also, if he really wanted to display all the "possible" ceremony in the Java example, where's his toString? :D
Interestingly enough, his explanation for POJO also seemed different from what I'm used to - the thing on his left is what we'd call a POJO back then.
@@Adowrath Overall, his talk had some interesting points. On the other hand, I was watching it and saying, "We told you. We told you."
I really wish more statically typed languages were able to compile with errors. I've had huge Rust programs where I wanted to test a simple change on a small code path, but because of traits, etc., I ended up editing dozens of files and getting super bogged down. Even if it just compiled into "panic if you call any code that had errors in it", that would be good enough for me.
I end up going in and inserting panics (or todo!()) myself to tell the compiler “don’t even bother with this chunk of code”. Wish I could automate that
This is the unsung benefit of the TS/JS paradigm: no matter how much TS is complaining, you can still run the program and see what happens.
todo!() and unimplemented!() should do the job
Absolutely! There’s no reason not to do that. It worked really well for Haskell.
Gradual typing in a dynamic language is stupid! It helps the IDE a little bit (without getting exact correctness), but you don't get any of the advantages of having typing at all: 1. you still have a slow runtime; 2. you still have to write all the tests that real typing could have made unnecessary; 3. the hacked-on syntax is ugly (take a look at Sorbet!); 4. you don't get the same kind of developer productivity as with a real, strongly, statically typed language; 5. God forbid you have to work on a huge project in a dynamic language; it makes you want to jump out of the window every day.
I recommend changing your example of ceremony. The difference between a JS snippet for a personal project and a work PR in Java has little to do with static typing.
It has much more to do with the native support of JSON in JS which makes this comparison kind of unfair.
Yeah, I pressed stop after the ceremony bit.
The Java boilerplate seems to be more about the patterns chosen in this case. There is no reason the Answer class can't just look something like the JSON version.
The ceremony in enterprise Java is far, far higher than the ceremony in enterprise Javascript or Typescript.
A lot of 90s era programmers conflated the ceremony of the major static languages of the day with static typing, and disliked static typing because of that. Modern mainstream static languages typically have much, much less ceremony than Java, which is something that he points out.
If you watch the video, a decent part of his point is that when the options were Java and python, many people preferred python. When the options are Javascript and Typescript, many people opt for Typescript.
I don't think Hack for PHP is very relevant now, although it may have been when it was created. AFAIK almost no one except Facebook uses Hack anymore. But PHP has been getting much more statically typed. The built-in runtime type checking has become much more expressive over the last several versions, and the third-party type checking tools (the two popular free static analysis tools Psalm and PHPStan, plus the IntelliJ-based PhpStorm IDE) are, I think, getting much more widely used. They check types ahead of time so users can almost completely avoid runtime TypeErrors being thrown, and via docblocks they take the type system far beyond what PHP supports natively.
As someone that has never ever liked dynamic languages, this is music to my ears.
Amen.
The Delphi IDE was always pretty snappy in my experience. Build times in Delphi 7 were very short too (incremental, single pass compiler).
This already started in 1987 with Turbo Pascal 4.0, 36 friggin' years ago. Why has it taken so long to figure this out ?
@@goblinsgym I'd say it started with Turbo Pascal 1.0; the compiling speed and the OG IDE workflow (write/compile/run/debug in the same program) gave serious heartburn to the juggernaut of developer tools at the time - Microsoft.
I think snappiness was also related to code size. If he was programming J2EE for a fortune-500 company, then there would be serious code bloat. It wasn't that the IDE was slow so much as the IDE had 200,000 lines of J2EE code it was indexing.
@@jplflyer Delphi is fast for large projects thanks to a very efficient unit system. Only changed code needs to be recompiled, and you never ever have to futz around with Make files.
I remember reading the C++ book in the late 1980s, and trying to visualize what it would take to write a compiler for this mess. If it is hard for a computer to read, imagine what it will do to humans...
I still use textmate for everything that doesn't require IDE-kind of thing and absolutely loving it! I can have 20-30 java, ruby, javascript projects open and still fast
GREAT TALK.
If you are starting in this amazing world, save it. He is giving you historical facts.
If you are a dynamic language lover, this guy is just telling you the truth you already know. Watch out, don't deceive yourself.
One correction: PHP itself is now gradually typed. Hack is basically dead outside of Facebook and Slack, but PHP now has what is considered the strongest type system of any interpreted language. (Typescript is compiled.) A lot of work has gone into that, and it's paid off.
Typescript is transpiled into JavaScript. Not actually compiled
@@-Jason-L A distinction that doesn't matter in this context.
For me, it boils down to a very simple concept: in both cases types matter. In one case you're saying, "trust me, I got this." and in the other case the compiler is saying, "prove it."
For small projects, throwaway stuff, quick scripting, I reach for dynamic. I don't need to spend all that effort proving it. What's the worst that happens? My script fails, I fix a few things, and I run it again. But when I get to a few hundred thousand lines that live for years and have many different developers working on them, I cannot, for a moment, think it would be a rational decision to go with the "just trust me" option.
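To make the "prove it" point concrete, here is a tiny hedged Java sketch (my own, not from the talk); the method name and types are made up for illustration:

import java.util.List;

class ProveIt {
    // the parameter type is the "proof obligation": callers must supply a List<Integer>
    static int totalPages(List<Integer> pageCounts) {
        return pageCounts.stream().mapToInt(Integer::intValue).sum();
    }

    public static void main(String[] args) {
        System.out.println(totalPages(List.of(12, 40, 7)));   // fine
        // totalPages(List.of("12", "40"));                   // rejected at compile time: List<String> is not List<Integer>
    }
}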
I'm working on a 1.2-million-line Lua game project... The project was started ~10 years ago... not sure how many programmers have been on it... but it must be more than 50 by now. The problems I see in the code are old sins (crappily written early systems) and code complexity. The hard, nasty bugs are not on the Lua side, but on the engine C++ side.
Great talk. I agree with most of the points, but I'm a bit more optimistic about gradual typing. You say they require a "complicated" type systems, and from a language implementer perspective that is certainly true, but as a regular programmer I call them "powerful" type systems instead! There are still some problems that are easier to solve with access to dynamic/gradual typing. I wouldn't be so quick to dismiss the gradually typed languages that are massively popular today (Python/mypy, JS/TS) just because of their roots in dynamically typed languages. C# is an interesting example of a statically typed language that later added dynamic types too (in version 4.0).
Julia and Mojo are two languages that I would love to see entering the top20 some time in the next 10 years. Each of them attempts to combine the benefits of both dynamic and static typing in their own unique way.
C# may have added syntax to declare dynamic types, but I’ve never seen anyone actually use them.
@@TehKarmalizer Haskell also has dynamic types, but no one uses them...
@@aoeu256 The "issue", so to speak, with Haskell's Dynamic, is that once you have it, you can't really do much with it apart from explicity try and convert it to another type, which gets tedious and defeats the point a bit, or code your entire codebase with Dynamics - which is still a bit painful. C#'s dynamic at least just lets you do what you want and then yells at you at runtime - also the ceremony needed for casting dynamic->string, for example, is much, much simpler.
I feel like the examples of static typing in Java isn't really about static typing. I feel like the bloat comes from OOP patterns.
Even Java is working hard at ceremony reduction. It's got a long way to go to catch up, but the example at the top, in today's Java, would be a one-liner for the entire Answer. JSON handling hasn't changed much, and is still a pain in the ass like that. The catch blocks can be folded together, and the if/else could be expressed DRY with a ternary calculating the displayName before creating an Answer. I'll let a recode of the Java example speak for itself:
// imports needed (at the top of the enclosing file):
// import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
// import com.fasterxml.jackson.annotation.JsonProperty;
// import com.fasterxml.jackson.core.JsonProcessingException;
// import com.fasterxml.jackson.databind.ObjectMapper;

record Answer(boolean wasEmpty, String displayName) {}

@JsonIgnoreProperties(ignoreUnknown = true)
public static record Json(@JsonProperty("name") String name, @JsonProperty("admin") boolean admin) {}

public static Answer decode(String rawJson) {
    ObjectMapper objectMapper = new ObjectMapper();
    try {
        Json json = objectMapper.readValue(rawJson, Json.class);
        String displayName = json.admin() ? json.name() + " (Admin)" : json.name();
        return new Answer(json.name().isEmpty(), displayName);
    } catch (JsonProcessingException e) {
        // do something prettier than returning null here, please
        return null;
    }
}
Unfortunately, if you're using Java, you're probably using it at a company that hasn't upgraded since as far back as Java 8...
The rise and fall of dynamically typed languages is an example of the typical engineering cycle: we have a problem - we find a workaround - we fix the root cause. The same happened to NoSQL and will almost certainly happen to microservices and event-driven architectures.
Could you elaborate? How are we going to solve scaling monolithic software? Or are you referring to that microservices in the true sense are overkill and a bit bigger units are better?
@@kc3vv I am not even sure monolithic software is the answer. It may be something completely different that we don't even know about. Anyway, microservices seem to be a workaround for today's limited technologies rather than a solution that adds value for the customers. Similar to dynamic typing.
This is an interesting take on developments in the programming language space. Microservices are indeed interesting based on the problem that they aim to solve which is probably more related to scalability. There are added benefits in that an app can be split up into different "independent" modules that are language agnostic. There are clear benefits to microservice architectures when done right, but I think that a lot of companies (especially startups) use microservices for the wrong reasons. For example if you build your server in nodejs, which is single threaded, and you want to scale it up, a microservice architecture will help since you can spawn multiple instances, but maybe nodejs was not the best choice in the first place. Also, if you are splitting your app into multiple microservices but stuff breaks if one of the microservices goes down, then this architecture is also probably not the right choice.
@@Voidstroyer I am not questioning the benefits of microservices from developers point of view. I am just saying that they don't add value for the users, as the users don't care about languages used. Scalability does indeed matter for the users but I don't see why the same horizontal scaling cannot be achieved by replicating a monolith.
@@aybgim3850 No I completely agree with you on that users wouldn't really care about what the architecture of an application looks like, as long as it does what it needs to do, and with acceptable performance. I am personally against microservices architecture because I think that some apps (for example in my current company) use microservices prematurely (or the application is split up into too many unnecessary isolated services). I would definitely prefer going monolithic first and only splitting into microservices whenever necessary.
Great talk, thanks a lot… I feel the same and meanwhile love static typing. One aspect that might be interesting to address here as well is reflection, which is the workaround statically typed languages use for situations where dynamic typing is required. Reflection is usually quite painful; might that become easier in the future? Also interesting is how modern programming languages move from strict object orientation to alternative polymorphism, like interfaces/traits, and from exceptions to optionals. This all seems to be linked with the move to static typing as well.
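On the reflection point, here is a minimal hedged Java sketch of why it feels painful compared to a direct static access (class and field names are made up for illustration):

import java.lang.reflect.Field;

class User {
    public String name = "Ada";
}

class ReflectionDemo {
    public static void main(String[] args) throws Exception {
        User user = new User();
        Field field = User.class.getField("name");  // looked up by string, only checked at runtime
        Object value = field.get(user);             // comes back as Object, not String
        System.out.println(value);                  // versus the statically checked user.name
    }
}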
Can you think of any traditional OO feature that hasn't been discarded, replaced or minimized? Think of any feature of Java/C++ circa 2000.
Java POJOs are what happens when you just follow your leads and don't bother to learn the underlying principles.
If fields are going to be final (as they were in the example given), the fields may as well just be public.
The issue with static types is that the capitalist system is interested in churning out optimal abstract code, not in teaching the basics; this is what pushes the enforcement of draconian, nonsensical paradigms.
Code reviewers are either lazy, over-defensive because of their salaries, or they themselves are unaware...
Surprise! Java happened to be the preferred language for the banking system... Does it make sense now?
I code mainly in Lua nowadays...
I very rarely miss static typing at all. Any runtime bugs coming up due to a type mismatch are usually the easiest to fix... the problem never lies there.
Imho the hard problems are usually related to code complexity or out-of-bounds memory overwrites etc. Lua does not suffer from memory issues, and code complexity you will have in any language.
I just love how quickly I can iterate on Lua code compared to a static language. It's just 5x faster to work with, especially with instant reload. No compile times.
Very nice talk. Thank you!
And yet 2 dynamically typed languages (Python and JavaScript) are leading language popularity lists by a large margin (depending on the actual survey ofc.).
Also, statically typed languages try to look like dynamically typed ones (via auto, type inference etc.) because typing ceremony doesn't fit rapid prototyping.
I usually use type annotations in Python, and it helps a bit; but I don’t have the feeling that it would be super helpful for development or code quality.
If the compiler can use the type hints for optimization (like in Julia or Cython), then yes, it is a value add. Otherwise it is mostly just a habit, a ceremony on my side; I don't find static typing all that helpful for my use case (developing data projects).
So at the end I think the preferences are also domain specific.
Nice talk, thanks!
I would love to see more empirical data supporting both functional languages and the various kinds of typing/type inference as design choices
So it seems to me that in some ways you could replace the use of "dynamically typed" in this talk with "interpreted". And sometimes a language mentioned as dynamic actually IS statically typed but interpreted. Case in point: BASIC, which really only had 3 data types: integer, floating point and string.
Not quite all uses (Typescript is "statically typed", but still interpreted). But I also felt like most of his problems were not about being statically or dynamically typed, but about being compiled vs interpreted.
I think one thing you missed in what changed is that type inference in modern languages got a lot better
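For example (using Java 10+ as the reference point, a sketch of my own), local-variable type inference with var removes a lot of the old annotation noise while staying fully statically checked:

import java.util.HashMap;
import java.util.List;

class InferenceDemo {
    public static void main(String[] args) {
        var names = List.of("Ada", "Grace");         // inferred as List<String>
        var counts = new HashMap<String, Integer>(); // inferred as HashMap<String, Integer>
        for (var name : names) {
            counts.put(name, name.length());         // still type-checked: name is a String
        }
        System.out.println(counts);
    }
}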
The only way I see the pendulum swinging back to dynamic types is if dynamically typed languages make another major revolution, leaping ahead of the static ones again.
I wonder with AI what the next paradigm shift will be in languages. I don't mean just AI replacing devs, but maybe what paradigms might emerge as more important to this field, like possibly logic programming, and if AI benefits more from static or dynamic typing.
I started off with MATLAB and Python; bigger GUI projects in MATLAB especially drove me into loving static analysis. My variable names in those projects end up looking like type names, just to keep myself sane and the code somewhat self-explanatory... but without actual static analysis... and actual types. In short: want to build something complex? Then use static typing, because your architectural design schema will most likely look like it anyway.
Gradual typing is not limited to just ActionScript and Dart; JavaScript is indeed a prime example of a gradually typed language that is widely used and immensely popular. JavaScript's versatility and ubiquity in modern web frontend and backend development make it an ideal candidate for leveraging gradual typing. With the use of JSDoc annotations, developers can add static type information to their JavaScript code, allowing for improved code quality, tooling support, and catching potential errors. This mixed approach, where parts of the code are statically typed while others remain dynamic, is becoming increasingly prevalent and represents the future of programming languages.
JSDoc goes with Typescript or mypy in his taxonomy (static type retrofit to a dynamic language). ActionScript and Dart were singled out as the only mainstream languages that *started* gradually typed.
JS survives through mob mentality not any actual advantage over other languages
There is a lot of ceremony with typescript as well, it's just put onto library authors vs library consumers.
That sounds like a good thing to me. Fewer people reinventing the wheel.
@@ThaJay yes!
Reasons not mentioned for dynamically typed languages, which made me go in that direction in my career after starting with typed ones, were:
- lists (dynamic vectors) as first class citizens
- dictionaries ("js objects", agnostic hash maps) as first class citizens
- one-time, type-agnostic implementations, without having to fight the compiler with generics, copy-paste code, or interface madness
and yes, I know, many of these points were addressed by more modern languages, like C# or Rust, to an extent (see the sketch after this comment).
Roc seems interesting, but I dislike its syntax. Good talk.
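A rough sketch of my own (in Java, not from the comment above) of how a modern statically typed language covers those three points: list and map one-liners, plus a single generic, type-agnostic implementation:

import java.util.List;
import java.util.Map;

class FirstClassCollections {
    // one type-agnostic implementation, no copy-paste: works for any element type T
    static <T> T firstOrDefault(List<T> items, T fallback) {
        return items.isEmpty() ? fallback : items.get(0);
    }

    public static void main(String[] args) {
        List<Integer> primes = List.of(2, 3, 5, 7);                  // list as a one-liner
        Map<String, Integer> ages = Map.of("Ada", 36, "Grace", 85);  // map as a one-liner
        System.out.println(firstOrDefault(primes, 0));
        System.out.println(ages.get("Ada"));
    }
}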
You have listed all the bad things :)
Dictionaries especially are the worst. You have a method that accepts a dict, but that dict must have certain fields and you have no idea what those fields are. An exact, well-named class/object is much better.
I'm just excited for Jai :)
The return of static and strong typing is such... a relief for me. I had done a big wad of C and C++, but I really liked Object Pascal: all statically typed, but Object Pascal's typing was much stronger than C's and C++'s. When Linux came along, we moved from only being able to afford one compiler to a world where there were languages galore. And the one that really piqued my interest was Ada, because it took Pascal's strong typing and shovelled a load more on top. But you've got to go where the market beckons, and my customers and bosses wanted web development, and that meant Perl, Python and Ruby, and I found myself disappointingly in the world of dynamic types. So... "welcome back, static types", I say, and the stronger the better (looking at you TypeScript X x X)
A few years back I was using an ORM in PHP That used "Plain Ol' PHP Objects" - which were, ironically, jam packed with getters and setters. (Because I look for any opportunity I can to deride PHP, I preferred not to pronounce this as PoPo but Pooh-pooh).
Hello, thanks for this presentation. There are several things missing in this talk, or at least that's what I think.
First of all, the video seems to imply that it was some sort of "decision" for 1990s languages to be static. It's not the case. They HAD to be static because they compiled directly to machine code, and on 100MHz processors it wasn't possible to run software with late binding on object creation, function calls, etc. If you called a function, you had to call it exactly the way it was defined (I'm talking about the calling convention), and the arguments had to be passed exactly how the function would read them. It's not like in dynamic languages, where missing arguments are simply undefined/null at runtime and a missing function is reported in the console at runtime. A call to a doSomething() function is compiled into a direct call to a specific code address. Unless you leave debug symbols in, the compiled software doesn't even know what the name of this function is. It HAD to be that way, because statically compiled languages (or ones compiled to intermediate code, like Java) were a lot faster than dynamic ones.
To visualize the difference between dynamic and static languages: JavaScript is often called a "fast" language (due to good runtimes like V8 and Node.js's async performance), while Java has the reputation of being the "slow", sluggish one. Actually, Java is something like 5-6 times faster than JavaScript today, and the gap was much worse back in the day.
Also, you made the point that dynamic languages were driven by the urge to deploy quickly. I kinda agree with that, but I believe there is more to it. Backend dynamic languages like PHP were simply "good enough" for web development because the bottlenecks come either from databases or from network quality itself. In the 2000s, you had to wait several seconds for a website, so it didn't matter whether the site responded 200ms quicker or not. But also remember, back in the day the web was MUCH simpler; it was mostly simple read/write stuff, whereas today you have heavy, complex web systems with very complex backend and frontend codebases.
The bottlenecks were simply not in the user code. It's like today: Python, being a quite slow language, is considered best for AI purposes because it's not the bottleneck; all Python is doing is calling libraries in native code that do the complex stuff, so Python is a perfectly fine solution here.
Also, minor nitpicks about Delphi compilation time. First of all, the things they cleaned are not a "cache" but generated .o files, which contain the native code blocks for particular modules. Those .o files were then collected by a tool called a linker to become a full executable. Second thing: indeed, Delphi and Pascal-based compilers were really fast, but this came at a heavy cost. They were single-pass compilers, which required developers to architect their code properly, because they didn't deal well with circular references. C/C++ compilers could handle circular references OK, but they compiled every file against every file, so compilation time grew exponentially with code complexity.
It's not necessarily true early 90s code had to run in machine code. Early versions of Microsoft's programs for Windows were written in C...however they were not compiled to machine code. They instead were compiled to pseudocode, that was then interpreted at runtime. Executing the programs ran the interpreter, and loaded the interpreted code. This is the secret for how Microsoft managed to get programs like Word, Excel, and Access to fit into the 1MB memory constraints on an 80286, while their competitors failed. Microsoft sacrificed some program speed for smaller memory constraints. The instructions of the pseudocode could be more powerful than the machine code instructions in a smaller amount of memory. The C compiler they used for this was not made publicly available. By the late 90s the memory constraints no longer mattered and Microsoft changed to compiling to native code.
@@grr986 Could you please elaborate more on that? As far as I know, the 286 already had protected mode, so it could easily go above 1MB. I'd love to see more sources about that, because I'm curious.
@@martinchya2546 My memory is a little fuzzy because it's been more than 30 years. The 80286 had a 1 MB memory constraint at least under Windows, and only 640K was usable by Windows. There was a protected mode on the 80286, but if I recall right, its limitations kept Windows from using it (see Protected Mode on Wikipedia). You could cheat a little and get an extra 64KB out of it with the segmented memory. There was a hack where you could access the first 64KB of the second MB (see High Memory Area in Wikipedia). It wasn't until Windows 3.0 running on a 386 or later that you really were able to get past the 1MB limit. I just looked at my Microsoft Access 1.0 box sitting on my shelf which shipped in late 1992...it required Windows 3.0, 386, 2MB memory, and I believe it still ran pseudocode at the point it shipped. It was certainly pseudocode for most of its development.
I was educated in the school of statically typed languages, and I detested dynamically typed languages because they make parallelism and effective compilation very, very hard. And parallelism and effective compilation really matter when you actually run your program. But my attitude towards dynamically typed languages has improved; it is only that we (as usual) are trying to use them for the wrong task. Dynamic languages should be used for scripting and fast prototyping, and static languages for application or systems development. Put shell back! It is a scripting language. The reason for R's "success" in staying dynamic is that it is used for machine learning, yet actually compiles efficiently. It is designed for exactly that niche.
22:25 this is how you write good dynamic code. And I think one of the reasons that type checking isn’t actually really necessary.
41:00 “Dynamic typing requires some runtime overhead” - That’s a true statement, but quite more nuanced than Richard makes it appear. In a JIT compiled language like Julia the runtime overhead can be O(1) in the number of executions.
"the runtime overhead can be O(1) in the number of executions" - That's a true statement, but quite more nuanced than you make it appear. O(1) doesn't tell you anything about the duration of this. it could be nanoseconds, or it could be decades.
It never occurred to me that static typing could be non-blocking. Sign me up.
Another reason gradual typing is not the best of both worlds: having certainty that all the code you need to work with will have correct types (including the stuff written by guys who left the company 3 years ago) is a benefit only available to mandatory typed languages.
Early BASIC was statically typed - at least up to (and including) Commodore's home computer BASIC, where variables without a type qualifier were float (yes, I, J and so on were float, and for loops didn't work with integers in Commodore BASIC! And yes, floats could be array indices), variables with % (S%, X%) were integers, and variables with $ were strings. So BASIC was a strongly, statically typed language. AFAIK, the original Dartmouth BASIC had only float variables - and not even string as a data type.
Nice talk. For me, the big downside of static types is the visual clutter of the necessary annotations. I don't think the talk called out type inference as a quality of the statically typed languages likely to catch on, but I'd add that qualifier.
In C#, a lot of the clutter has been made unnecessary, e.g. auto-properties, records over classes....
Your IDE could maybe hide static types and auto-infer them. Now that I think about it, you could show different properties of your program depending on differently powered type systems, like dependent types for behavior, effect types, and normal types.
It’s always a relief when the IDE infers a type :)
Rust has some of the best, most impressive, at times shocking type inference. 90% of your code won't need type annotations whatsoever. You can declare a variable with a very ambiguous type (for example let foo = None), and if at any point afterwards anything restricts its type, then it is inferred. It could be returning it from a function, passing it as an argument, or even adding it to a list of known (often inferred!) type.
Oh yes, I agree. I love Lua for its super readability, for example. It contains almost no annotations, that is, none in the code itself.
Two things that went unmentioned: first, in practical experience static typing manages not to be bulletproof as far as security goes, but it is more secure than dynamic typing; and second, about static typing, I don't know precisely how an interpreter would do type inference, but if I were called upon to speculate, my educated guess would be that, without even telling the developer, it would in the background create a while-true loop and iterate over all data types until one tested true, which is part of why the grand wizards of coding tell you low-level languages are more painful to write but have the advantage of somewhat higher performance than higher-level ones. I, um, personally, as a developer in training, really strongly prefer static typing for the performance improvements it offers. Many oversimplify that performance advantage as being simply the difference between interpreters and compilers, but one thing I've noticed is that it runs even deeper: compiled code tends to be faster due to all the ceremony of resolving data types for the compiler rather than making the interpreter do that work for you, and any time you as a developer can spare the computer from having to do this work, it will speed things up. A more traditional view would suspect that it's impossible to be both more secure AND higher performance, but in practical experience as a developer you can squeeze both out of things by simply specifying data types. I broke many rules by starting with learning C instead of Python, and while I dread the lack of a garbage collector later on, it's too useful. Obviously assembly is even faster than C, but I want to be productive. So care about performance, yes, but don't go so far down that rabbit hole that work slows down to a crawl.
PHP has a JIT, but that seems to make very little difference to its run speed. The JIT isn't especially useful for a typical PHP workload. More importantly, PHP has an ahead-of-time compiler, and the compiled bytecode is cached on the webserver that runs it.
PHP was not made with speed in mind, like, at all.
@@NXTangl The first version probably wasn't, but a lot of thought has been put into the speed of more recent versions.
IDE's for real. I remember trying to buy a laptop for an early version of Eclipse. I spent a day in BestBuy just installing it and seeing if it crashed when I opened it. The most amusing one was a Toshiba Satellite that crashed into a starscape (black background with randomly colored points interspersed throughout). This was 2008.
On the flipside, though, about 41:00 he says dynamically typed languages can't be as fast. Maybe he hasn't tried Julia yet...
Worst bugs I've encountered during my career have had their root cause in dynamic typing
Delphi and Java came out at the same time. Delphi has always been fast because its roots are Borland's Turbo Pascal.
Small correction: Objective-C has both. Static and dynamic typing.
The only big apprehension I have is over my love of duck-casting. I don't want to import the type to declare the type. I want to give a function some object that has a .write() function on it. You should be able to specifically enforce that, but it *is* slightly dynamic there.
Traits in Rust
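For comparison, a hedged Java sketch of the nominal-typing counterpart to "any object with a .write() function": you do have to name an interface up front, but then any implementor can be passed in (the names are made up for illustration):

interface Writable {
    void write(String data);
}

class ConsoleSink implements Writable {
    public void write(String data) { System.out.println(data); }
}

class Logger {
    static void log(Writable sink, String message) {
        sink.write("[log] " + message);   // only cares that the argument can write()
    }

    public static void main(String[] args) {
        log(new ConsoleSink(), "hello");
    }
}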
I think future IDEs driven by AI linters and type checkers are going to swing the popularity back to dynamic languages, because in the background they're really going to be transpiling to static typing and will be able to warn you instantly about errors we make, or auto-correct for us.
Even compile-time errors are going to be picked up while coding, long before you hit save.
In fact, we may have multiple abstraction levels of code that we can drill in and out of: the top level being something like plain English sentences, the next level being something like JavaScript or Python, the level after that being something like Rust, Go or Zig, then down to a manual-memory-management language like C, and then the lowest level of assembly or something, until all we see is hex or binary. We may just have the choice of writing at any level of type specificity.
4:10 Don't forget C: along with Perl, it was one of the main languages for writing CGI.
7:00 It's interesting to see the point of view from someone from, say, US where web exploded in 1995 when in other countries (like down here in Argentina) web exploded only much later so software development focused on desktop for far longer (which meant Visual Basic and Delphi were the preferred languages over JS).
31:05 Not sure about Java but C# got csharprepl, a must for any C# programmer wanting to quickly draft something.
31:40 I'm pretty sure incremental building was already present in mid 2000s at least in Visual Studio. I'm sure though that there were paid plugins to compile faster (there was one that was called Incremental Build, Instant Build, or something like that that sent compilation tasks to different computers in the network to compile each a section of the code).
The fact that the first or second thing anyone programming in JS or Python does when installing Visual Studio Code is installing a linter (which is basically adding the preprocessing step of compilers before running their interpreter) to catch errors early speaks volumes about whether static typing is the way of doing things or not (even though I am pretty sure there are a few kamikaze that in 2023 are probably still using a plain text editor without any linting at all).
fast, fast, fast... we develop like tornados... and leave a path of destruction behind...
Great talk. But what about Erlang and Elixir? Surely, not super popular but dynamic :)
We can type if we want to,
we can leave your 0x behind,
'cause your friends don't type and if they don't type,
well, they're no friends of mine.
Besides the fact that almost every single big mega company found out that they needed the speed of non-dynamic languages. Business drives change; always has, always will. From Facebook to Twitter, they pretty much all rebuilt their systems from the ground up at least once.
Let me save you 52 minutes. The answer to title question is....
BECAUSE IT'S USEFUL.
Java is such an impactful language that developers decades later are uncovering that maybe some of its ideas weren't all bad, despite Java having them.
Maybe one day we'll realize that OOP isn't that terrible, despite Java trying very hard to prove otherwise.
The problem with Java is that while many of the ideas aren't bad, the way Java pushed them to the extremes is impractical. Java's OOP is great for people who love UML schemas; it's noise and boilerplate for people who just want to write useful code.
OOP will always be a bad idea because it tried to merge several different concepts into one for no good reason.
Encapsulation is a good idea but not at the object boundary.
Polymorphism is a bad idea when done in the Java style where it is mixed with inheritance, the Haskell type classes or Rust traits work much better for code reuse.
Inheritance involves subtyping which involves dealing with co- and contravariance, something even most OOP programming language designers don't seem to take into account.
OOP also complicates composition which is another way to achieve code reuse.
The main issue with OOP is really that nobody ever developed a proper theory for it before implementing it and what theory there is is often ignored in practice (e.g. Liskov substitution principle).
@@Taladar2003 Also, coming up with good class taxonomies is hard. Especially before you have your code written
As for what will drive the push for static typing going forward, my bet is usage-metered (CPU/Mem) billing like Cloud Run and Railway. Running a Rust/Go service at a few megabytes each vs some gigantic Node/Java/etc. service has quite different impact on cost.
Code insight has been available for JavaScript in IDEs for a long time
Assembly has typing similar to C. Float vs double vs integer. Signed vs unsigned. And 8-bit vs 16-bit vs 32-bit vs 64-bit. Less typing than many languages but more typing than "no typing".
Working primarily in ruby on rails, but learning Java in my personal life I really appreciate types now. Java does seem overly verbose for sure, but it's also very easy to read.
Ruby on rails is nice because it's so fast to develop. If I was a lot smarter than I am I think the freedom of dynamic languages would be a huge benefit. I'm not that smart though so I see code written 3 years ago and I have trouble initially figuring out what it should be returning. With a statically typed language that's not an issue. Plus my tests are checking type anyways a lot of the time.
Java is a pain to set up, I like using vim, IDEs are really annoying, etc. That being said, in the long run, even using something overly verbose like Java, fixing bugs or adding features is a lot easier when the language assumes you are an idiot (Java) versus Ruby or any dynamic language.
great talk!
I got lectures about why we profit from strongly typed programming languages, by finding errors early, as far back as 1984. This is really, really old stuff, and it should be every programmer's general knowledge. I wonder why it appears that it isn't. Besides, with any static programming language you can work as fast and as interactively as with, say, Python, only that: 1. you need *_much_* more experience, 2. if you use, say, C, you have to *_code much more_* to get to the goal. Java is not representative; Java's coding practices are not sound.
Although PHP was born in 1994, it wasn't that popular or widely used until '98/'99.
really cool talk.
For TS, This sounds like it could as easily just become an IDE feature instead of a modification of how you write code, which requires a compilation process.
Thankfully in PHP land, static typing is optional and dynamic typing doesn't require transpilation. However, I don't find much use for it in PHP because I already have an excellent IDE (JetBrains), and type-based problems are rarer in PHP for one reason or another.
That's kinda what it is. It's a superset of Javascript with added type checking.
most of the comparisons done have nothing to do with static or dynamic typing, but with OOP vs functional programming, features of the language itself or other things
Another Feldman banger
The problem all these "better" languages have is that they don't solve problems most developers are having. Almost nobody is CPU limited anymore, and if they are, scaling up CPUs is far cheaper than switching to using the newer tools with fewer available libraries and frameworks and job candidates.
The problem with this approach is that it's true right up until it isn't and you're back at square one. I'm excited to have a new cohort of languages that give me a bit of both worlds up front.
bugs and vulnerabilities are a problem every developer will always have, unless their code is designed to be written, but never run. Programmers not doing or caring about debugging kind of feels like it.
If you're doing REPL-based exploratory programming or IO-bound applications, you are not CPU limited. However, there are certain types of apps (real-time apps & game engines with hundreds of thousands of objects) that can't be done well in the slower languages. I have to say that LISP, with its macros, can embed or emit unsafe code or MLIR easily.
It also reminds me that back in 2005 Python was so much higher level than Java and C++, with its list comprehensions, closures, **kwargs, decorators, generators, and convenient JSON-like syntax, but most other languages caught up. Python sort of became frozen for a while, and for slower, higher-level features like relational/constraint programming, STM, reactive propagation, equational solvers & symbolic programming, ADTs & full pattern matching, predicate classes/dispatch, and macros, you have to be in a LISP to benefit from them.
Great talk! After years of working with the JVM, moving to JavaScript or PHP on the server side for large projects just felt... icky. Also, take a look at Groovy for a nice balance between dynamic and strongly typed.
I feel like Kotlin and Scala have essentially edged out the Groovy market for terse, well-typed languages on the JVM.
@@KyleSmithNH You're right, they're definitely more popular now, but I still prefer Groovy's syntax and handy extension methods.
The roc lang snippet reminds me a lot of F#
I much prefer editing programs and compiling them while the program runs, so I never have to restart it. What turns me off to most static languages is the great difficulty with which they handle runtime data variations. Working with heterogeneous collections in a dynamic way, when the return result is unspecifiable, is usually a headache of templates in statically bound languages. I much prefer to work in Steel Bank Common Lisp and work on my program while it's running, only restarting it when I need to. I can compile each and every function as I define and adjust them while the program stays running. There are benefits to static typing, but you realize at that point you're programming in a subset of a more expressive programming language. Languages that are poorly designed, like JavaScript, turn people off to dynamic typing because they mistake dynamic typing for poor coercion rules from one data type to another, especially around strings and numbers, which generally just shouldn't be happening in the first place. I feel like poor type coercion rules are really the worst offender in most dynamic languages today. I generally want to express things in terms of algorithms rather than leading a compiler around by the nose and locking myself into what only gets templated at compilation time.
I don't see complex type systems as a downside. The complex type system required by Typescript to model the full range of possible values in Javascript is orders of magnitude more expressive than the C# type system. A trivial example: where C# would require multiple overloads for a single parameter of varying types, Typescript's union types allow having a single function signature.
I like what Python does.
And, I don't know, having a language basically do what my IDE tries to do (statically analyze and check), but then use that to compile? Sounds neat.
Python is a garbage programming language
@@zachb1706 lol ok?
this is relevant how?