Pristine and Bionic Turtle have entered into a partnership to promote practical applications of risk analytics and modeling concepts. A practical, hands-on understanding of building Excel-based models for operational and credit risk is necessary for any job in risk management. To that end, we will be illustrating step-by-step model-building techniques for risk management. Registrations for Operational Risk are open. Please sign up now.
Last time, we discussed that to fit a distribution to scenario data, the data must be elicited from experts in a manner amenable to distribution fitting using one of the standard methods: moment matching, quantile/percentile matching, maximum likelihood, or OLS.
Business experts may not understand probability and statistics, so questions need to be framed in such a manner that scenarios are easy to understand and probability distributions can be fitted to the frequency and severity data elicited from the experts.
If you remember, we had started fitting distributions to scenario data using the interval approach, where experts estimate the frequency of losses within specific loss intervals.
This time we discuss the percentile approach, where data is collected from experts for specific percentiles/quantiles of loss severity. In the following illustration, we will fit a continuous distribution to scenario data collected for loss severities.
Assume that the output of a scenario workshop is a median loss severity of USD 30,000 and a 90th percentile loss severity of USD 160,000.
Step-1: Decide on a distribution to be fitted to data.
For this illustration, let us fit a lognormal distribution to scenario data.
Step-2: Decide on seed values of the distribution parameters used to calculate the theoretical quantiles.
Step-3: Calculate the theoretical quantiles.
Step-4: Compare the theoretical quantiles with the empirical/scenario quantiles.
Step-5: Use an optimization algorithm (such as Excel Solver) to minimise the sum of squared differences between theoretical and empirical quantiles by changing the parameter values.
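The five steps above can be sketched outside Excel as well. Below is a minimal, standard-library Python sketch of the quantile-matching objective; note that with two quantiles and two parameters the minimum (SSD = 0) is available in closed form, which is exactly the point Solver converges to (the function names are illustrative, not from the original spreadsheet):

```python
import math
from statistics import NormalDist

z = NormalDist().inv_cdf  # standard normal quantile function

# Expert-elicited severity quantiles (from the workshop output above)
quantiles = {0.50: 30_000, 0.90: 160_000}

def theoretical_q(p, log_mean, log_stdev):
    """Lognormal quantile exp(mu + sigma * z_p) -- Step-3."""
    return math.exp(log_mean + log_stdev * z(p))

def ssd(log_mean, log_stdev):
    """Sum of squared deviations between theoretical and scenario
    quantiles -- the objective Solver minimises in Steps 4-5."""
    return sum((theoretical_q(p, log_mean, log_stdev) - q) ** 2
               for p, q in quantiles.items())

# With two quantiles and two parameters, the SSD = 0 solution can be
# read off directly; an optimizer started from any seed values
# converges to the same point:
log_mean = math.log(quantiles[0.50])                      # median => mu
log_stdev = (math.log(quantiles[0.90]) - log_mean) / z(0.90)

print(round(log_mean, 2), round(log_stdev, 2), round(ssd(log_mean, log_stdev), 6))
```

Running this reproduces the parameter values quoted in the next paragraph, which is a useful cross-check on the Solver output.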
In our illustration, changing parameter-1 (log_mean) to 10.31 and parameter-2 (log_stdev) to 1.31 reduces the squared deviation between the theoretical quantiles and the expert opinion to zero.
Therefore, lognormal (10.31, 1.31) may be used for severity modeling in OpVaR estimation.
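As a rough sanity check before plugging the fitted distribution into an OpVaR engine, one can simulate severities from lognormal(10.31, 1.31) and confirm the elicited quantiles are recovered. A short Python sketch (the sample size and seed are arbitrary choices):

```python
import random
from statistics import median, quantiles as stat_quantiles

random.seed(42)  # arbitrary seed, for reproducibility only

# Simulate loss severities from the fitted lognormal(10.31, 1.31).
# random.lognormvariate takes the mean and stdev of the underlying normal.
losses = [random.lognormvariate(10.31, 1.31) for _ in range(100_000)]

# The simulated median and 90th percentile should sit near the
# expert-elicited USD 30,000 and USD 160,000.
q90 = stat_quantiles(losses, n=10)[-1]  # last decile cut = 90th percentile
print(round(median(losses)), round(q90))
```

In a full OpVaR simulation, these severity draws would be combined with a frequency distribution (e.g. Poisson) to build the aggregate annual loss distribution.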
One common issue is how many quantiles should be elicited from the experts. To fit a two-parameter continuous distribution, at least two quantiles are required.
For practical reasons, it may be difficult to ask experts about more than three or four quantiles, lest they become confused. BCBS, in its July 2009 paper Results from the 2008 Loss Data Collection Exercise for Operational Risk, observes that the median number of severity percentiles for banks using the percentile approach was four, with a narrow inter-quartile range indicating that at least three quarters of these banks used four or fewer percentiles.
Both the MLE approach and the quantile approach to fitting continuous distributions to scenario analysis data are also discussed in the BCBS paper Results from the 2008 Loss Data Collection Exercise for Operational Risk, Annexure C.
I have created a template for you, where the subheadings are given and you have to link the model to get the cash numbers! You can download it from here. Go through the case and fill in the yellow boxes. I also recommend that you try to create this structure on your own (so that you get the hang of what information is to be recorded).
You can also download the filled template and check whether the information you recorded matches mine!