While on the subject of “validation” – it can have a range of meanings when applied to credit risk models.
At the most general level it means review by an external authority. This could cover a wider scope than merely reviewing the models themselves: all aspects of how the modelling methodology was chosen, executed, implemented, and integrated with the business might be considered. Naturally an external technical review of the models may be a valuable subtask.
Validation using data is a more concrete approach. The widest scope is achieved by having a sample of the bank’s exposures scored by a relevant external agency with similar models, for comparison with the bank’s own results. Whilst this covers the most bases, it is hard to do well in practice because of the difficulty of reproducing the same data environment – for example, categorical predictors may need to be ‘mapped’ onto the agency’s coding.
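As a rough illustration of why that mapping step is awkward, here is a minimal sketch – the field name, the categories, and the internal-to-agency coding are all invented for the example:

```python
# Hypothetical sketch: re-coding an internal categorical predictor so the
# bank's exposures can be scored with an external agency's model.
# The field, categories, and mapping are illustrative, not real codes.
import pandas as pd

bank_data = pd.DataFrame({
    "employment_status": ["salaried", "self_employed", "contract", "retired"],
})

# The agency's model recognises a coarser set of categories, so several
# internal values have to be collapsed onto one external value.
internal_to_agency = {
    "salaried": "EMPLOYED",
    "contract": "EMPLOYED",
    "self_employed": "SELF_EMPLOYED",
    "retired": "NOT_IN_WORK",
}

bank_data["employment_status_agency"] = bank_data["employment_status"].map(internal_to_agency)
print(bank_data)
```

Every such collapse loses information, which is one reason the external benchmark rarely reproduces the bank’s own scores exactly.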
Validation using the bank’s own data is the easiest and perhaps most familiar context. Various more specific technical terms apply. Some examples:
- during the model-building phase it is good practice to hold out a ‘validation’ sample as protection against over-fitting; this is a simple form of cross-validation. The validation sample is selected at random from the modelling mart so that, in expectation, it is neutral with respect to all data effects.
- a proposed new model can be run on ‘out of time’ data – cohorts from before (‘backtesting’) or after the sample window represented in the modelling mart. This is likely to be instructive and reassuring, but it does not carry the guarantee that a pure in-time hold-out does (both checks are illustrated in the sketch after this list).
- the routine monitoring of model performance once the models have been implemented may also be considered ongoing ‘validation’, and is the first line of defence.
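To make the first two items concrete, here is a hedged sketch using synthetic data and invented column names – a plain logistic regression stands in for whatever scorecard the bank actually uses:

```python
# Sketch of hold-out validation plus an out-of-time check on a PD model.
# The modelling mart, features, and default flag are all synthetic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
cohorts = pd.date_range("2005-01-01", "2007-12-01", freq="MS")
mart = pd.DataFrame({
    "cohort": cohorts[rng.integers(0, len(cohorts), n)],
    "ltv": rng.uniform(0.2, 1.2, n),
    "bureau_score": rng.normal(600, 80, n),
})
logit = -4 + 3 * mart["ltv"] - 0.01 * (mart["bureau_score"] - 600)
mart["default_12m"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Cohorts after the modelling window are held back entirely ('out of time').
in_time = mart[mart["cohort"] < "2007-01-01"]
out_of_time = mart[mart["cohort"] >= "2007-01-01"]

# A random in-time hold-out protects against over-fitting.
features = ["ltv", "bureau_score"]
build, holdout = train_test_split(in_time, test_size=0.3, random_state=42)

model = LogisticRegression(max_iter=1000).fit(build[features], build["default_12m"])

for label, sample in [("build", build), ("hold-out", holdout), ("out-of-time", out_of_time)]:
    score = model.predict_proba(sample[features])[:, 1]
    print(f"{label:12s} AUC = {roc_auc_score(sample['default_12m'], score):.3f}")
```

On real portfolios the interesting case is when the build and hold-out figures agree but the out-of-time figure falls away – evidence of a shift in the population rather than over-fitting.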
The simplest setting is validation of an individual component, especially PD. Last week’s post touched on the more difficult context of validating that the chain of models PD-EAD-LGD works together correctly.
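One simple sketch of such a chain check – not necessarily the approach in that earlier post, and with made-up figures and column names – is to compare predicted expected loss (PD × LGD × EAD) against observed loss at some level of pooling:

```python
# Hedged sketch: compare predicted expected loss with observed loss by segment.
# All figures and names are invented for illustration.
import pandas as pd

accounts = pd.DataFrame({
    "segment":       ["cards", "cards", "mortgage", "mortgage"],
    "pd":            [0.05,    0.08,    0.01,       0.02],
    "lgd":           [0.80,    0.80,    0.15,       0.20],
    "ead":           [3_000,   5_000,   200_000,    150_000],
    "observed_loss": [150,     0,       0,          4_500],
})

accounts["expected_loss"] = accounts["pd"] * accounts["lgd"] * accounts["ead"]

# A large, persistent gap at segment level points at one (or more) of the
# three components, even if each looked acceptable when validated alone.
check = accounts.groupby("segment")[["expected_loss", "observed_loss"]].sum()
check["observed_over_expected"] = check["observed_loss"] / check["expected_loss"]
print(check)
```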
Aren’t there some aspects of Basel – like long-term cycle issues – that defy validation? Or rather, that rely on judgement instead of analysis?
2 comments
6 August, 2008 at 01:20
Sumit
I am implementing a model validation framework for a US-based retail bank. Is anybody familiar with the implementation of the Mean Squared Error method for validation of PD for Retail Pools? Help much appreciated.
29 August, 2008 at 05:39
richard sutter
MSE is not appropriate for validating a model. It is used to select the initial model, however.