FREE REFACTORING TUTORIAL: Learn Approval Testing, Refactoring and Decluttering FOR FREE and apply them to some very nasty code. You can work along with Dave to learn how to make the bad code testable ➡ courses.cd.training/courses/refactoring-tutorial
Just a quick rectification. Galileo was not executed. He was placed under house arrest. Giordano Bruno was executed.
Yes, but…
We don’t talk about Bru-no-no-no-no
Yes, sorry for the mistake
And his arrest was not for challenging people’s worldview: it was for violating a direct order from the church to teach heliocentrism as a theory, not as a fact
@@pepineros4681 You definitely have a Disney+ subscription 😂
If I were Pope I would make Giordano Bruno a saint.
I would love seeing an episode on approval testing. I have been working on a system that was in a pretty bad state, and we used a combination of approval testing and tests for some of the modules we could easily isolate as a first pass of tests. We then started refactoring the code, but all the new code had tests written to specifically cover it. As we cleaned up the system, the test coverage got better and the defect rates dropped.
We also follow a boy-scout-type principle of leaving the code in a better state any time you touch it. So if you are working in a module and you see some bad code, clean up a function or two, add some tests, etc., as part of your normal work. We went from merging about once per year into the master branch, with numerous bugs, to being able to do it several times per day with no failures in many months. Most branches now are very short-lived before they merge back in. The situation is still not ideal from a CI/CD standpoint, but it sure is a LOT better.
I used many things I have learned from your channel in guiding this process.
Can I second the request for a video on approval testing? Excellent work, thanks.
Never heard the term but that is exactly what we've been doing on our legacy ball of mud. I always felt a bit guilty doing it as it's not unit testing. I'm reassured that it's a recognised approach.
Yes, a video about this would be helpful.
Would be very happy to watch a video on approval tests. We have started to adopt that technique, and I am very interested to see how on or off track we are 🙂
I deleted my original reply, because I realized I was confusing approval with acceptance tests.
I have always used the term characterization tests, from Michael Feathers' book. I posted my first reply before I got to Dave's portion of this video, where he said that approval tests are the same as characterization tests.
Most of my tests are characterization tests, since I work in a legacy code base. These tests are a bit nasty because legacy code is obdurate to being tested. My characterization tests are more implementation-dependent than behavior-dependent, because I don't know what the behavior should be.
I view BDD/TDD based unit tests as specification tests. They declare behavior first and then the code implements it.
Characterization tests help reveal and define existing behavior. They provide a safe scaffolding for refactoring, from which we may be able to write true BDD/TDD tests.
I think of BDD/TDD-based tests as being mostly permanent tests, whereas I view characterization tests as temporary until the legacy code can be properly refactored. However, their lifecycle can still be months or years until they can be completely dismantled.
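To show what I mean, here is a minimal sketch of the shape my characterization tests take. The module, function and values are invented for illustration, and the "expected" numbers are simply whatever the legacy code produced when the test was first written:

```python
# Characterization test sketch against a hypothetical legacy function.
# The expected values are not "correct" by design - they pin down whatever
# the code does today, so refactoring cannot silently change it.
import unittest

from legacy_pricing import calculate_discount  # hypothetical legacy module


class CharacterizeDiscountCalculation(unittest.TestCase):
    def test_gold_customer_large_order(self):
        # 0.12 is simply what the current code returns for these inputs.
        self.assertEqual(calculate_discount(tier="gold", order_total=1500), 0.12)

    def test_unknown_tier_falls_back_to_zero(self):
        # Captures today's behaviour, even if it later turns out to be a bug.
        self.assertEqual(calculate_discount(tier="platinum", order_total=1500), 0.0)


if __name__ == "__main__":
    unittest.main()
```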
Jim, yes I agree with your characterisation of both - Exactly!
Acceptance Tests are the real long-term value: they defend against functional regression, as well as guiding the development of new features.
Characterisation/Approval tests are a more tactical tool, and are more coupled to implementation detail, which means that they don't work as long-term defence when you need to change the outputs from the system. But they are a great tool to support genuine refactoring, when you are in the state that you describe, where you don't really know all that the system does.
I missed this nuance for a while, but now I see them as a fantastic tool, particularly at the start of the process of making your code more tractable.
"stuck in a rut and not realizing what's possible with different thinking" never have truer words been spoken. There are *some other* TH-camrs that should take notice.
Very much yes to an episode about approval tests! ❤
This is one of your most significantly important videos to date. Well done and thank you.
I remember watching your video on Approval testing, and definitely would like to see more on this topic. Please do make a video on that. Thanks for sharing your valuable pieces of wisdom.
I understand your point about retrofitting unit testing; however, my experience is that you should not take it out of the equation, for two reasons. One is that it complements approval testing well and can be applied to isolated areas. Approval tests eventually get refined into unit tests. The process is important, as legacy code tends to have layers upon layers of highly specialized and sophisticated functionality. Unit testing helps to reverse engineer some fundamental pieces in particular, like the underlying math and geometry libraries. The second reason is that legacy code more often than not comes with legacy teams. It is very hard to introduce TDD due to resistance ("our software cannot be tested automatically", "it's a waste of time and an overhead"). Having the team introduce some level of unit testing helps their understanding and appreciation of the concept, making it much easier to introduce TDD later on. There are always one or two who resist, and it takes serious amounts of time to overcome this resistance.
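To make the first reason concrete, this is roughly the kind of reverse-engineering unit test I mean; the geometry module, function and values are purely illustrative:

```python
# Probing unit tests that document the discovered behaviour of a
# hypothetical legacy geometry helper before we dare to change it.
import unittest

from legacy_geometry import normalize_angle  # hypothetical legacy module


class NormalizeAngleDiscoveredBehaviour(unittest.TestCase):
    def test_wraps_large_angles_into_zero_to_360(self):
        self.assertEqual(normalize_angle(725.0), 5.0)

    def test_negative_angles_wrap_upwards(self):
        # Discovered by experiment: -90 comes back as 270, not -90.
        self.assertEqual(normalize_angle(-90.0), 270.0)

    def test_exactly_360_collapses_to_zero(self):
        self.assertEqual(normalize_angle(360.0), 0.0)


if __name__ == "__main__":
    unittest.main()
```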
I would like to see a video on approval tests! I'd also love to see a video on your take of consumer-driven vs provider-driven contract tests. Particularly with how it pertains to CD.
I often use the Broken Windows theory to describe the mistreatment of legacy systems and systems under development. One neglect seems to beget another. Love the shirt by the way!
Yes please on "Approval Testing" deeper dive. I've felt that approval testing is a _great_ way to get green (not comfortable with writing unit/functional tests on their own) developers moving on test plans that feel more familiar.
Great topic! Looking forward to a more extensive approval testing video.
Yes please, I would be very happy to watch a video on approval tests.
Thanks for another great video! I would like to hear more about approval testing. I guess this must involve how the behaviour of the existing and the changed system is recorded and compared. Thanks in advance!!
Wow! A lot of folks caught the Galileo error. Nice to know people out there are keen on historical fact =)
Love this video! Would definitely also love to see an Approval Testing video!
As always, valuable information. I'm interested in the approval testing, would be great to cover this topic in a separate episode.
This channel is highly relevant and useful.
Of course legacy is of value! You just have to give it some love!
Approval testing looks interesting, but hard to achieve reliably. A video on that would be great. 😉
Didn't know these donations left comments, whoops! Nothing to comment on just yet for this video, but I just wanted to say thanks for all the content; between your book and videos I've been slowly trickling in better practices across our teams. Although it's hard finding the time between management, admin, engineering, development and deadlines to fit in optimizing for CD and the culture shift that needs to happen, I can see the change taking effect in small ways, and I'd like to think the small things add up over time :)
Thank you, and good luck on your journey.
Thank you for another great summary. I really like the way you simplify a complex topic.
It would be great if you could talk next time about how to transform a company to use CI/CD.
Looking forward to an episode on approval testing! :)
Would love to see a video on approval testing. Thanks!
Do an introduction on approval testing please! Love you!
thanks, best summary of this topic I've ever seen :)
Yes, please on Approval Testing!
10/10 shirt
10X rockstar ninja shirt. ;)
Yes, please do the approval test video too :)
Ty Dave! Please show us more on approval testing
+1 for the episode on approval testing
Yes please do a topic on approval testing :)
Thanks for the video, great info!
Letting you know in the comments that I would like to see a video on approval tests :D
On the fourth point, an important thing is to make deployment automated, even if you have to write the automation yourself! I have seen many systems have problems because lots of people implemented things in their own way. I don't say that it's always a problem, but I think it is very important to try to do it using some off-the-shelf (trustworthy) solution!
The problem with having lots of self-written automation tools is that they can quickly become similar to legacy code. But I would love to hear David's opinion on this! :D
I don't see that as much of a problem. I guess it depends on scale: if you are talking about lots of web-scale environments, where you want economies of scale/reuse for your infra definitions, then sure, better tools help with this. But that isn't usually the case when we are talking about automating the deployment of a Legacy System.
It is more likely to be a one-off, so custom is less of an issue I think.
Having said that, sure, I'd start with an off the shelf tool, as I said in the video. In most cases I'd see if I could sensibly containerise things, then look to Chef, Puppet, et al, and only if none of those options make sense would I do my own thing.
Video on approval testing would be great!
Love the T-shirt!
Would like to see a video with your views on approval testing
I'd very much like to see approval tests. Thank you!
Thanks!
I would like an episode on approval testing.
Does anyone know the name of the FAA program mentioned?
We've just inherited a legacy system (a huge impenetrable financial model) which has thousands of 'regression' tests. The problem is that these regressions take an overnight run to execute (so not a great feedback cycle) and also nobody really knows what they're supposed to be testing (they are not categorised and have very unhelpful names), so if a set of them break, it's very difficult to say why. Do you have any advice on how to deal with this? My gut feeling is to essentially just follow the advice you give in this video of slowly building up characterisation tests for the parts of the code we're refactoring (as well as more typical TDD tests for new parts of the code) and slowly remove the reliance on these regressions over time. The cultural/organizational problem we have is that these regressions give the QAs and wider business a comfort blanket that they really do not want to let go of, which would make it really difficult to move towards a continuous delivery model given the full pipeline takes ~8 hours.
You can treat the legacy tests as legacy code as well, in terms of directing your refactoring effort. If the tests aren't helpful, replace them with ones that are.
A bit of criticism: the first four minutes basically said "CD is hard to switch to from a legacy process"; it was way too repetitive.
I just love that t-shirt!
What does it mean "to record interactions with the system"? Syscalls? It sounds like a desperate idea. However, desparate times call for desperate measures. If it was a viable solution without any hidden assumptions, I would pay to learn how to do approval/characterisation tests.
It's sort of like virtualization testing. You can record the input/output of normal operations and then use that to verify that, for the same input, you get the same output after making a change. By doing so you are also building something close to a virtual implementation of your service that can be used when your real service is not available to test with. By giving it known input you get known output... allowing the consuming process to continue being testable too.
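As a very rough sketch of that record/replay loop (the function, names and file layout here are all made up), it can be as simple as a golden-master check:

```python
# Golden-master style record/replay: the first run records the output as the
# approved baseline; later runs fail if the same input produces different output.
import json
from pathlib import Path

from legacy_billing import generate_invoice  # hypothetical legacy function

GOLDEN = Path("approved/generate_invoice_basic_order.json")


def test_generate_invoice_matches_recording():
    output = generate_invoice(customer_id=42, items=[["widget", 3], ["gadget", 1]])
    received = json.dumps(output, indent=2, sort_keys=True)

    if not GOLDEN.exists():
        # Record: capture today's behaviour and treat it as approved.
        GOLDEN.parent.mkdir(parents=True, exist_ok=True)
        GOLDEN.write_text(received)
        return

    # Replay: the same input must still produce the approved output.
    assert received == GOLDEN.read_text()
```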
I've always heard what you call Approval Tests referred to as Regression Tests.
Not really the same thing, though they are used as regression tests. I'd say that "approval tests are one type of regression test; there are others".
What I am curious to find out is whether the system you created at LMAX is stateless and whether it is horizontally scalable. Generally, trading contracts, for example EUR/USD currency exchanges, are not something you can horizontally scale, and the latency of storing and restoring the state of a market in a stateless system takes time.
It’s not stateless, it is VERY stateful, but it is horizontally scalable through sharding.
@@ContinuousDelivery so that I don't keep bothering you, can you point me to a document as to how sharding worked?
@@ContinuousDelivery and thank you :)
Galileo was not executed by the Inquisition. He died an old man. Almost... this was a clever man.
Quite correct, sorry for my mistake.
Galileo wasn't executed, just put under house arrest. He was condemned by the church for heresy for his support of Copernican science.
Oops, quite right, sorry!
How do approval tests differ from pre-existing tests?
'Approval testing' is a technical term; it has a specific meaning, and approval tests are there to verify that the code is unchanged in behaviour. Approval tests are great to support refactoring; I have a video coming out soon on approval testing. Pre-existing tests may or may not be focused on that, so it isn't really a definitive term in the same way.
I love the t-shirt
Hmm. As far as I know, Galileo did discuss his ideas with the Inquisition and with the pope, and he got along with them fine.
Years later, after he had called the pope an idiot and lost many friends by simply being an asshole, he published a book claiming there was proof that the sun is at the centre, even though he did not have any. (The first proof came much later, when the instruments were precise enough.) Then the church treated him the way it did, just out of pure spite.
The moral of this story is: do not behave like Galileo. Being right is not enough to make a change. It takes a lot of work and effort to persuade the ones around you.
With automating acceptance tests, do you mean a high-level test? Otherwise I don't see how you would automate user tests. I love the shirt, by the way.
Yes, Acceptance Tests are high-level, BDD-style, functional tests that validate that the system does what your users want it to do, without explicitly saying how the system works.
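As a tiny, illustrative sketch of that shape, with an invented test DSL standing in for whatever driver you build over your own system:

```python
# BDD-style acceptance test sketch: it states what the user wants to happen,
# while the (hypothetical) DSL hides how the system is actually accessed.
from store_test_dsl import OnlineStoreDsl  # hypothetical protocol driver / DSL


def test_customer_can_order_an_in_stock_book():
    store = OnlineStoreDsl()

    # Given a book is in stock and a shopper is registered
    store.stock_book(title="Continuous Delivery", copies=3)
    shopper = store.register_shopper("jane")

    # When the shopper orders the book
    shopper.order(title="Continuous Delivery")

    # Then the order is accepted and the stock goes down by one
    shopper.confirm_order_accepted(title="Continuous Delivery")
    store.confirm_remaining_stock(title="Continuous Delivery", copies=2)
```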
Looking at 7k-line C and C++ files from work 😢.
With no unit tests.
Michael Feathers' book is definitely a good place to start if you're looking to make it more unit testable. It has a number of practical strategies to isolate the untestable parts, instrument the testable parts, and then refactor and amend what's left.
The most important thing to remember: you don't need to fix or refactor everything all at once. Look first at what's most likely to change in the normal course of business. If it's working as it should and doesn't need to change for a business reason, then move on to what isn't working and/or does need to change.
@@sasukesarutobi3862 thanks for the advice I will take a look at it :)
Pfft! I worked with *functions* that had 7k lines (and more!) with no unit test.
Well, of course there were no unit tests; you can't unit test stuff like that.
1. Galileo was not executed. 2. Galileo did not have problems because he believed in the heliocentric model (after all, there is a reason it is not attributed to him), but because he was a smart jerk with almost no social skills. Galileo decided to publicly, and in writing, humiliate a dude because he held a different opinion about the heliocentric model, calling him an idiot. This time the dude had a lot of power and was used to having the reverence of kings - yes, the dude was the pope. The reason Galileo was not killed is that he had a friend, so "the tower" was his punishment - the tower for life.
I wonder if I can convince my PM that we need to talk to QA about Approval Testing.....hmm...
Request(DaveShouldShowHowToWriteAnApprovalTest);
“Can we fix it?” No, it's legacy; we don't even want to look at it on a bad day. But yay, it's one of our core tools. … Taxi!
Was that Rider or IntelliJ?
IntelliJ
No, it's Forked
Galileo was not executed... (Died of natural causes nine years after his trial.)
Very, very unprofessional shirt. Disappointed.
Dave! It was Copernicus who came up with the idea of a sun-centered solar system. Galileo also believed Copernicus' theory, but came much later. See en.wikipedia.org/wiki/Nicolaus_Copernicus