Great content on banking, especially for risk managers in banks.
Thank you Almighty!
Would you consider the exponential distribution for loss severity an appropriate choice for modelling operational risk via Monte Carlo? (Given that the lognormal distribution would lead to extreme values, which are not realistic, due to few data points and high variability in the data.)
Hello and thanks for your question. To be honest, I hesitate to answer since I am not able to come up with the final verdict in this matter. My intention was to explain AMA and show a practical, not too complicated example. Indeed, the exponential distribution is also used to model loss severity. It boils down to the question whether the tail risk events are "light" or "heavy". The exponential is more for the light events, and the lognormal for the heavier ones. I would say that the nature of operational risk in banking steers us to heavier events. However, I encourage you to follow the results of your own research. Best, André Koch
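The light-versus-heavy tail distinction above can be made concrete with a quick simulation. A minimal sketch: both severity distributions are calibrated to the same mean loss, and we compare their 99.9th percentiles (the mean loss size and the lognormal sigma are illustrative assumptions, not values from the video):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
mean_loss = 10_000.0  # illustrative average loss size (assumption)

# Exponential severity with mean 10,000 (light tail)
exp_losses = rng.exponential(scale=mean_loss, size=n)

# Lognormal severity calibrated to the same mean: E[X] = exp(mu + sigma^2 / 2)
sigma = 1.5  # illustrative shape parameter (assumption)
mu = np.log(mean_loss) - sigma**2 / 2
logn_losses = rng.lognormal(mean=mu, sigma=sigma, size=n)

# Same average loss, but the lognormal 99.9th percentile is several times larger
print(f"Exponential 99.9%: {np.quantile(exp_losses, 0.999):,.0f}")
print(f"Lognormal   99.9%: {np.quantile(logn_losses, 0.999):,.0f}")
```

With equal means, the lognormal tail quantile comes out far above the exponential one, which is why the choice matters so much for a 99.9% capital figure.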
Dear André, hi.
Please share your explanation of the Basel III operational risk economic capital calculation.
Hello Imir, thanks for your request. Unfortunately, I cannot help you in the short run since I have a lot of other work at the moment. Sorry. Best, André
Hi, Thank you for this video it was very helpful. How can I download the example worksheet?
www.stachanov.com/en/downloads Here you go, Best, André Koch
Hi, where do the frequencies in cells B2 to F2 come from? Are they the annual average frequency of each risk event?
Yes, that is correct. These are just examples I made up. For a more detailed explanation of this approach see also the video on the Poisson distribution: th-cam.com/video/DWzASheDqcU/w-d-xo.html
Kind regards, André Koch
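For readers following along: annual average frequencies like those in B2:F2 feed directly into a Poisson frequency model, one draw per risk category per simulated year. A minimal sketch with made-up frequencies (placeholders, not the actual spreadsheet values):

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up annual average frequencies per risk category
# (placeholders for B2:F2; the real values are in the downloadable sheet)
avg_frequencies = np.array([0.5, 2.0, 4.0, 1.0, 0.2])

# Simulate 10,000 years: each year is one Poisson event count per category
years = rng.poisson(avg_frequencies, size=(10_000, 5))

# The sample mean per category recovers the annual averages
print(years.mean(axis=0))
```

This is exactly the sense in which the row-2 numbers act as averages: they are the Poisson means, while any single simulated year can have more or fewer events.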
The actual observed losses are not average values per year, right? Just single observed data points?
No, although I did not address this question, they are in my view averages. As you might know, the AMA leaves much of the design of this approach to the FI, and the supervisor needs to verify and approve it. So, there is no law chiseled in stone for this. But again, I meant these numbers to be averages. I hope this helps. Best, André Koch
Multiplying the Random Number of Losses by the Random Loss size assumes the losses are perfectly correlated. If you have 2 random losses you need to generate two losses and sum them, not generate one loss and times it by 2.
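The commenter's point can be checked numerically. A minimal sketch, using illustrative parameters (not taken from the spreadsheet) and an exponential severity for simplicity: multiplying one severity draw by the event count gives the right mean but a fatter, wrongly correlated annual loss, whereas summing an independent draw per event gives the proper compound distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the video)
annual_frequency = 2.0    # expected loss events per year (Poisson mean)
severity_mean = 50_000.0  # mean loss size per event (exponential, for simplicity)

n_trials = 100_000
counts = rng.poisson(annual_frequency, size=n_trials)

# Shortcut criticised above: one severity draw times the event count,
# which implicitly treats all losses in a year as perfectly correlated
naive = counts * rng.exponential(severity_mean, size=n_trials)

# Compound approach: an independent severity draw for every event, then sum
compound = np.array([
    rng.exponential(severity_mean, size=k).sum() for k in counts
])

# The means agree, but the shortcut overstates the year-to-year variability
print(f"shortcut mean {naive.mean():,.0f}, std {naive.std():,.0f}")
print(f"compound mean {compound.mean():,.0f}, std {compound.std():,.0f}")
```

Both versions average to frequency times mean severity, but the shortcut's standard deviation is visibly larger, so it would inflate a tail-based capital figure.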
Good video. The analytics were presented a bit too fast, though.
Thanks for your comment. We have provided the Excel sheet at www.stachanov.com/en/downloads
Maybe it helps if you have that sheet at hand. Also, there is a video that explains the Poisson approach in more detail. But you are right, this is not an easy theme. Best, André Koch