The truth is, whether you use OOP or procedural paradigms to write a program, it's going to get complex and harder to maintain as time goes on. People might think OOP is easier because they take an old procedural program and rewrite it in OOP, but what they've really done is just created a new program. As the "new" one ages it will run into the same issues as the old one.
One of my peeves is 'I rewrote it using X and it's 100x better!'.
I would hope so. If I rewrite an app in VB Classic, it'll likely be better than the first version, written back when I didn't know what I was building, what abstractions I'd need, or how it would change, no matter the language.
0:35 I’d add the caveat that the function must be pure, in that it only uses data provided by the function parameters. Once a function uses external data, it becomes a closure, so any modification needs to incorporate both the function and the data being closed over. Splitting that function is more complex because you also need to split the external data.
Classes are just a special case of closures: they're a container for many functions closing over the same external data. The complexity is that each function in the class likely uses a different subset of that shared data. Understanding these subsets and how they overlap can be confusing and makes the code harder to modify. Complexity here often indicates an incorrect separation of concerns and architecture.
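A minimal C++ sketch of that point (the names are mine, not from the video): a lambda closing over a running total, and the "same" thing written as a class with several methods sharing that data.

```cpp
#include <functional>
#include <iostream>

// A closure: the lambda "closes over" the variable `total`, so changing the
// function now also means reasoning about the data it captured.
std::function<void(int)> make_accumulator() {
    int total = 0;                        // the external data being closed over
    return [total](int x) mutable {       // captured by value, mutated inside
        total += x;
        std::cout << "total = " << total << '\n';
    };
}

// The class version: a container for several functions that all close over
// the same data (`total`), each one touching its own subset of it.
class Accumulator {
    int total = 0;
public:
    void add(int x)    { total += x; }     // writes `total`
    void reset()       { total = 0;  }     // writes `total`
    int  value() const { return total; }   // only reads `total`
};

int main() {
    auto acc = make_accumulator();
    acc(3); acc(4);                        // prints total = 3, then total = 7

    Accumulator a;
    a.add(3); a.add(4);
    std::cout << a.value() << '\n';        // 7
}
```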
Exactly this! There is nothing more offensive than a method that takes no args and returns no value. Your object has 100 methods? Guess what, all 100 might be different now... or not. See you at runtime!
Don’t use OO here or here…
It seems like we are missing the part where OO with the right abstraction is a win. Finding the right abstraction is harder than naming things, so OO is probably a disservice to most people.
I'm with Brian Will here; I agree with his answer to when we should use OO: never. There is no circumstance I can think of that wouldn't be better solved with a functional-core, procedural-shell style of system. The obsession with encapsulation and strict hierarchies of the kingdom of nouns, where data and behavior are cobbled together into horrible abstractions with message passing as if that's supposed to be a silver bullet... I wish we had all ditched this a long time ago. Keep functions and data separate, keep data immutable, keep functions small and so simple and well named that you can use short variable names and still cause no confusion. Then compose bits together that are all bulletproof and deterministic. Build software like Lego. Make illegal states unrepresentable. Embrace radical simplification, and static functions. Think about pipeline-oriented programming.
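Roughly what I mean, as a small C++ sketch (the Order type and numbers are made up): all the logic lives in a pure function, and I/O only happens at the edge.

```cpp
#include <iostream>
#include <numeric>
#include <string>
#include <vector>

// --- Functional core: pure, deterministic, trivial to test ---
struct Order { std::string item; double price; };

// Pure function: the result depends only on its arguments.
double total_with_tax(const std::vector<Order>& orders, double tax_rate) {
    double subtotal = std::accumulate(orders.begin(), orders.end(), 0.0,
        [](double acc, const Order& o) { return acc + o.price; });
    return subtotal * (1.0 + tax_rate);
}

// --- Procedural shell: all I/O and mutation stays out here at the edge ---
int main() {
    std::vector<Order> orders = { {"book", 12.0}, {"pen", 3.0} };  // pretend this came from a file or DB
    std::cout << "total: " << total_with_tax(orders, 0.08) << '\n';
}
```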
This is all very well for stuff you write just once. For stuff that needs to be maintained and changed by multiple people over a significant time frame, OO is what you need for damage control.
If you think OO is about nouns, you have more learning to do. OO done right is all about verbs, what the thing does.
It is functional programming that is all about nouns; even functions are just things.
Object-orientation belongs in a programmer's toolkit, but it is a tricky tool to use. Certain features like implementation inheritance can lead to hyper-spaghetti, and as a Bob Martin video in this series points out, most O.O. languages, by implementing fine-grained encapsulation, can do a worse job than clunky old C headers and code files. If you use the right abstractions for the problem, though, an O.O. program can be *easier* to read and understand than its procedural equivalent.
Other paradigms have their own quirks. Strict procedural code can get just as tangled. Function-oriented programs flip O.O. on its head by making data structures public, but the indirection among functions can sometimes make the reader's head spin. Logic programming, which I admittedly haven't tried outside of programming classes, seems to have only limited uses.
Programmers should learn multiple paradigms and multiple languages so they can identify which problems require which solutions ... before their boss makes them write it in C++, JavaScript, or Java.
"Never use OO" is as bad as "Always use OO". Each kind of problem has its kind of appropriate approach for the solution.
@@torbjorngannholm3551 OO was meant to be about message passing, but the mainstream languages are all about nouns.
FP is not about nouns.
I use classes whenever I separate some piece of functionality into its own class; it makes it easy to integrate.
Then I always create a shared pointer and don't have to care about memory management.
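Something like this, as a rough C++ sketch (the Logger class is just a made-up example of functionality split into its own class):

```cpp
#include <iostream>
#include <memory>
#include <string>

// Hypothetical class: some functionality pulled out into its own type.
class Logger {
    std::string prefix_;
public:
    explicit Logger(std::string prefix) : prefix_(std::move(prefix)) {}
    void log(const std::string& msg) const { std::cout << prefix_ << msg << '\n'; }
};

int main() {
    // Shared ownership: the Logger is destroyed automatically when the last
    // shared_ptr that points at it goes out of scope.
    auto logger = std::make_shared<Logger>("[app] ");
    auto another_owner = logger;              // reference count is now 2
    logger->log("no manual delete needed");
}                                             // both owners gone here; memory freed
```

If there's only ever one owner, std::unique_ptr does the same job with less overhead, but the shared_ptr habit above is what the comment describes.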
People think OO is just inheritance. Inheritance is garbage and you don't need it. OO is really any time you have object.method syntax. Everyone uses that. Honestly, method(object, parameters...) syntax is still the same thing, so don't think that your C code is safe. Your stacks, queues, arrays, heaps, etc. are still OO.
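To make that concrete, a rough C++ sketch (my own toy Stack, not from the comment): the free-function style and the member-function style are the same design, just spelled differently.

```cpp
#include <cstdio>

// C-style "object": a struct plus free functions that take it as the first argument.
struct Stack {
    int data[16];
    int top = 0;
};

void stack_push(Stack* s, int x) { s->data[s->top++] = x; }   // method(object, args)
int  stack_pop(Stack* s)         { return s->data[--s->top]; }

// The same design with member-function syntax.
struct StackOO {
    int data[16];
    int top = 0;
    void push(int x) { data[top++] = x; }                      // object.method(args)
    int  pop()       { return data[--top]; }
};

int main() {
    Stack s;
    stack_push(&s, 1);
    std::printf("%d\n", stack_pop(&s));   // 1

    StackOO t;
    t.push(1);
    std::printf("%d\n", t.pop());         // 1
}
```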
Regardless, design patterns and SOLID principles (or SOID?) can still be used without inheritance, and they still produce value. Use OO, just don't do inheritance. Follow good OO principles; don't just make things objects when they don't need to be. Don't introduce abstractions which do not produce value. Remove abstractions which are not contributing to your project. Keep your code simple, but don't create rigidity just because you are intimidated by OO.
Your code should follow the Open/Closed Principle: it should be open to extension and closed to modification. This means you should be able to add functionality to your code while minimizing how much you have to change code that has already been written. Every time you change something that already works, and has stood the test of time, you risk breaking it. Stop doing that. Use the right abstractions, so that you won't have to keep changing working code.
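A rough C++ sketch of the Open/Closed idea (the Exporter names are made up). Note it relies only on interface inheritance, which is usually considered a different thing from the implementation inheritance being warned against above.

```cpp
#include <iostream>
#include <memory>
#include <string>
#include <vector>

// The reporting code below never changes when a new exporter is added;
// we extend the system by writing another implementation of the interface.
class Exporter {
public:
    virtual ~Exporter() = default;
    virtual void write(const std::string& report) const = 0;
};

class ConsoleExporter : public Exporter {
public:
    void write(const std::string& report) const override {
        std::cout << report << '\n';
    }
};

// Closed to modification: adding, say, a FileExporter later would not touch this.
void run_report(const std::vector<std::unique_ptr<Exporter>>& exporters,
                const std::string& report) {
    for (const auto& e : exporters) e->write(report);
}

int main() {
    std::vector<std::unique_ptr<Exporter>> exporters;
    exporters.push_back(std::make_unique<ConsoleExporter>());
    run_report(exporters, "quarterly numbers");
}
```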
OOP IS inheritance and subclass polymorphism. Most other paradigms have abstraction, polymorphism, etc., and adhere to SOLID principles. They just don't use those two techniques.
@@adambickford8720 No, OOP can be done with no inheritance, or very little; just composition.
@@gtdcoder That's not really OOP though. You can do that in any paradigm; inheritance and subclasses (*not* subtypes) are what really differentiate OOP.
@@adambickford8720 That's just wrong. Even some C programmers do OOP, and C does not even have inheritance.
Always. The answer is: always.
Uh-Oh
Respect to her. She is describing a real-world problem.
First