
Article Information

  • Title: Preparing for Basel II modeling requirements: Part 3: putting it all together
  • Author: Jeffrey S. Morrison
  • Journal: The RMA Journal
  • Print ISSN: 1531-0558
  • Year: 2003
  • Issue: July-August 2003
  • Publisher: Risk Management Association

Preparing for Basel II modeling requirements: Part 3: putting it all together

Jeffrey S. Morrison

This article, the third in a four-part series, describes the development of an analytics platform by SunTrust that integrates Basel II requirements for data-based risk assessment into a Windows-like interface allowing sophisticated statistical models to be developed, validated, and documented quickly and efficiently.

Although the previous two articles in this series provided some basic guidelines to analytics, the truth is that modeling is as much of an art as a science. Painting the most lifelike picture of risk can involve a variety of statistical tools. Each tool has its strong points and carries with it a unique programming language. Therefore, one of the biggest systems challenges is combining these tools to maximize efficiency, minimize errors, and basically make things as simple as possible.

SunTrust decided to build a standardized platform specifically geared to analytics. Regardless of the statistical tools necessary for performing specific tasks, the user would go to just one place--a customized graphical user interface (GUI). Some general benefits of the system are that it:

* Provides an easily understood user interface for model analytics.

* Launches a variety of programs requiring different software tools.

* Ramps up training for new associates.

* Minimizes programming redundancy by integrating new ideas into a single consolidated system.

* Provides a huge increase in modeling efficiencies--quality and speed to implementation.

* Minimizes the negative impact of employee turnover.

* Reduces syntax and programming mistakes.

* Accumulates intellectual knowledge over time.

Application Development

The development of SunTrust's modeling platform was done in two stages. The first stage consisted of writing customized code in each statistical programming language to capitalize on its strengths and create tailored routines for handling specialized tasks. These statistical packages included SAS, S-PLUS, SHAZAM, and a GIS (Geographic Information System) tool called ARCVIEW, which gives the modeling platform the capability of mapping spatial aspects of the data.

The second stage involved the creation of a GUI to consolidate the various statistical tools under a single umbrella of buttons and windows. Since the number of users was expected to be relatively small and localized, a Web solution was not necessary. Therefore, Visual Basic 6.0 was selected as the application development language because it is easy to use and most users are familiar with it. SunTrust wanted a design that would offer maximum flexibility in the art of model building yet would be structured to provide consistency of methodology. Features were included that would allow the system to:

* Import raw data from in-house databases.

* Implement user-defined custom coding.

* Access a variety of sampling schemes.

* Select multiple approaches for handling missing information.

* Select a variety of mathematical modeling approaches.

* Produce univariate analysis.

* Generate modeling diagnostics.

* Select primary modeling variables.

* Offer automated routines for fine-tuning models.

* Provide variable selection methods.

* Test implementation code on development and validation data.

* Accumulate champion/challenger modeling results.

* Generate implementation code.

* Create audit trails and support documentation for regulatory review.

The behind-the-scenes mechanics are straightforward. After receiving instructions from the user through the GUI, the modeling platform dynamically generates the code necessary to perform the statistical routine. This code is automatically submitted in batch mode to the appropriate statistical software package for execution in the background. Once the execution is complete, a window pops up on the desktop that allows the user to examine the output. Results from the output are collected by the GUI and are used for model development or other analytics. Figure 1 shows a collage of results from several statistical software packages, all generated and executed through the standardized GUI.
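The generate-and-submit loop described above can be sketched in Python. This is a minimal illustration under stated assumptions: the `SAS_TEMPLATE` text, the `build_program` and `submit_in_batch` helpers, and the `sas -sysin` command line are stand-ins for this sketch, not SunTrust's actual implementation.

```python
import subprocess
import tempfile

# Hypothetical template; a real system would carry one template per
# statistical routine and per package (SAS, S-PLUS, SHAZAM, ARCVIEW).
SAS_TEMPLATE = """PROC LOGISTIC DATA={dataset} DESCENDING;
    MODEL {depvar} = {predictors};
RUN;
"""

def build_program(dataset, depvar, predictors):
    """Fill the template with the selections the user made in the GUI."""
    return SAS_TEMPLATE.format(dataset=dataset, depvar=depvar,
                               predictors=" ".join(predictors))

def submit_in_batch(program):
    """Write the generated code to a file and hand it to the statistical
    package for background (batch-mode) execution."""
    with tempfile.NamedTemporaryFile("w", suffix=".sas", delete=False) as f:
        f.write(program)
    return subprocess.run(["sas", "-sysin", f.name], capture_output=True)
```

The GUI would then parse the batch output and present it in the pop-up window the article describes.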

The Main Menu

The main menu of the SUNTRUST RISK MODELING SYSTEM is shown in Figure 2. Eleven primary options are listed across the top of the main menu from left to right:

1. Intro--provides a brief introduction to the system for new users.

2. File Open--imports data files for analysis and model building.

3. Pre-Processing--allows the user to write programming code to create new explanatory variables used in modeling.

4. Prelim Analysis--produces correlations, frequencies, graphics, and a host of analytic techniques to understand the basic structure of the data.

5. Depend Var--allows the user to point to the name of the dependent variable to be used in the regression.

6. Options--sets defaults for regression and other reports.

7. G.I.S.--launches the ARCVIEW mapping tool interactively.

8. Utilities--invokes a variety of statistical and data management tools.

9. About--provides version number of program.

10. Documentation--provides links to model documentation.

11. Exit--ends the program.

As mentioned in the first article in this series, models for PD (probability of default) and LGD (loss given default) require different statistical approaches.

Therefore, the RISK MODELING SYSTEM offers a variety of choices for model development. For PD models, the user can select logistic regression. For LGD models, linear and tobit regression procedures are available. The modeler can also use a technique called survival analysis to gain better insight into the possible timing of defaults.
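The difference between the two model families can be illustrated with a small sketch (the function forms are assumptions for illustration, not the system's code): a logistic link bounds a PD estimate strictly between 0 and 1, while tobit regression treats observed LGD as a latent value censored at the boundaries of the loss range.

```python
import math

def pd_from_score(xbeta):
    """Logistic link used for PD models: maps any linear predictor
    to a probability strictly between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-xbeta))

def observed_lgd(latent_lgd):
    """Tobit-style censoring for LGD models: the latent regression value
    is only observed within the [0, 1] loss range."""
    return min(max(latent_lgd, 0.0), 1.0)
```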

Because flexibility is such an important part of model building, the system allows the user to point to and select variable names to include or exclude from the model. This flexibility is further extended by the ability to apply a variety of automatic routines to the variables that can enhance the model's predictiveness. This is accomplished by clicking on one or more of the seven labels listed under "Variable Options" on the main menu. The selection process may be better seen in Figure 3. From the main menu, the user clicked on the first label, called Main Variables, causing a screen to appear where three variables (VAR12, VAR13, and VAR14) were chosen to be included in the regression. Data entered by this means gets no special treatment. By clicking on a different label, such as Transformation Variables, the user tells the system to automatically pick the best mathematical transformation (square root, natural log, square, reciprocal, etc.) of three variables (VAR10, VAR15, and VAR17) and include it in the model.

This type of design allows the user the flexibility to apply a variety of custom procedures so that competing models can be quickly tested and measured for accuracy.
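The automatic transformation choice described above can be sketched as a simple search: apply each candidate transformation and keep the one whose result is most linearly related to the dependent variable. The absolute-correlation criterion and the `best_transformation` helper are assumptions of this sketch; the article does not describe the system's actual selection rule.

```python
import math

# Candidate transformations from the article; log, sqrt, and reciprocal
# assume strictly positive input values.
TRANSFORMS = {
    "raw": lambda v: v,
    "sqrt": lambda v: math.sqrt(v),
    "log": lambda v: math.log(v),
    "square": lambda v: v * v,
    "reciprocal": lambda v: 1.0 / v,
}

def correlation(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def best_transformation(values, target):
    """Return the name of the transformation whose output is most
    linearly related to the target (by absolute correlation)."""
    scored = {
        name: abs(correlation([f(v) for v in values], target))
        for name, f in TRANSFORMS.items()
    }
    return max(scored, key=scored.get)
```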

As mentioned in the second article in the series, model validation is extremely important in the new Basel Capital Accord. Therefore, the RISK MODELING SYSTEM automatically generates two types of validations. First, a validation is done using the data from which the model was developed. Second, a validation is done on holdout data--data set aside to independently evaluate the accuracy of the model. In both cases, power curves, K-S values, and other statistics are developed to measure validation accuracy. As shown in the Figure 4 fictional illustration, the system collects all this information and produces a report so the user can see a summary of results from the model development process. Four PD models were developed and validated. The results from each model were accumulated and sorted based upon the K-S value--a measure indicating validation accuracy. The higher this value, the greater the accuracy. In this example, Model #4 was found to be the winner with a rather large K-S value of 83.9.
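The K-S value referred to above is the maximum separation between the cumulative score distributions of the defaulting and non-defaulting accounts. A minimal sketch follows; the 0-100 scaling matches the values quoted in Figure 4, but the exact computation used by the system is an assumption.

```python
def ks_statistic(scores_default, scores_nondefault):
    """Two-sample K-S: the largest gap between the cumulative score
    distributions of defaulters and non-defaulters, on a 0-100 scale."""
    thresholds = sorted(set(scores_default) | set(scores_nondefault))
    max_gap = 0.0
    for t in thresholds:
        cdf_d = sum(s <= t for s in scores_default) / len(scores_default)
        cdf_n = sum(s <= t for s in scores_nondefault) / len(scores_nondefault)
        max_gap = max(max_gap, abs(cdf_d - cdf_n))
    return 100.0 * max_gap
```

A model that perfectly separates the two groups scores 100; one that cannot distinguish them at all scores 0.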

Once the model has been developed, the next step is to place it in a programming environment where each loan in the portfolio can be processed to receive an estimated PD or LGD value. This can be a tedious procedure if the model development process was done outside of a standardized system using more manual or ad hoc procedures. One of the most important features of the RISK MODELING SYSTEM is its ability to automatically generate the code necessary to reproduce the predicted values. With only minor syntax changes, this code can be translated into an algorithm for other languages through the change control process in the decision support part of the organization. Figure 5 shows an example of the code used for implementation.

The variable HSCORE in Figure 5 represents the probability of default as produced from a logistic regression model.
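As an illustration only, the Figure 5 scoring logic can be restated in Python. The imputation constants and coefficients are transcribed from the figure; the dictionary-based `loan` record and the `hscore` helper are assumptions of this sketch, not the bank's implementation language.

```python
import math

# Missing-value replacements transcribed from Figure 5.
IMPUTE = {"VAR10": 0.42, "VAR15": 0.4956700587, "VAR17": 0.4971728813,
          "VAR2": 20, "VAR3": 42.492, "VAR4": 2.66, "VAR6": 0.424,
          "VAR7": 0.308, "VAR8": 7.572}

# Logistic regression coefficients transcribed from Figure 5.
COEF = {"VAR10": 1.3035231981, "VAR15": -1.106205098, "VAR17": -1.122026898,
        "VAR2": -0.2018637, "VAR3": 0.1035603126, "VAR4": -1.294010317,
        "VAR6": 0.748777983, "VAR7": 0.9509830255, "VAR8": -0.37099917944}

INTERCEPT = 5.312557694

def hscore(loan):
    """Replicate the Figure 5 logic: impute missing values, form the
    linear predictor, and apply the logistic link to get a PD."""
    xbeta = INTERCEPT
    for var, beta in COEF.items():
        value = loan.get(var)
        if value is None:          # the SAS missing value (.) in Figure 5
            value = IMPUTE[var]
        xbeta += value * beta
    return 1.0 / (1.0 + math.exp(-xbeta))
```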

Summary

At this point in the game, the modeling requirements from Basel are not cast in stone. Banks are expected, however, to use sound statistical approaches in estimating PD and LGD and to prove their worth within a framework composed of checks and balances. By using a consolidated system interface for model development efforts, SunTrust is better equipped to demonstrate its procedural road map to the regulatory agencies while reaping the benefits of a standardized system. As its use grows over time, additional techniques will be integrated into the user interface to better meet any new Basel standards needed to support the "advanced" IRB approach to capital requirements.

[FIGURE 1 OMITTED]

[FIGURE 2 OMITTED]

[FIGURE 3 OMITTED]

Figure 4

                                                                Top 5%
                                           VIF       K-S      %DEFAULTS
MODEL  NAME       COEFFICIENT    T-STAT  LIN_REG  VALIDATION  VALIDATION

  4    Intercept    5.31255      4.8842        0     83.9      11.4198
  4    VAR10        1.30352      4.1294  1.19054     83.9      11.4198
  4    VAR15       -1.10621     -2.1099  1.00393     83.9      11.4198
  4    VAR17       -1.12203     -2.0810  1.01062     83.9      11.4198
  4    VAR2        -0.20186     -7.8449  1.00032     83.9      11.4198
  4    VAR3         0.10356      8.6214  1.65660     83.9      11.4198
  4    VAR4        -1.29401     -7.3070  1.60361     83.9      11.4198
  4    VAR6         0.74878      2.4164  1.43622     83.9      11.4198
  4    VAR7         0.95098      2.6668  1.28973     83.9      11.4198
  4    VAR8        -0.37092     -5.0988  1.28308     83.9      11.4198
  2    intercept    5.76464     10.5106        0     70.3      11.1111
  2    VAR2        -0.10186     -7.0424  1.00000     70.3      11.1111
  2    VAR4        -1.87851    -11.8334  1.23045     70.3      11.1111
  2    VAR6         1.40856      6.1930  1.23045     70.3      11.1111
  3    Intercept    2.59999      6.4989        0     70.3      11.1111
  3    VAR4        -1.88197    -11.8614  1.23045     70.3      11.1111
  3    VAR6         1.42381      6.2205  1.23045     70.3      11.1111
  3    VAR2_1       2.05619      7.0524  1.33333     70.3      11.1111
  3    VAR2_2       1.31823     4.72220  1.33333     70.3      11.1111
  1    Intercept    0.41871      3.6257        0     27.4       6.7901
  1    VAR10JNV    -0.00000     -7.9176  1.00000     27.4       6.7901

Figure 5

* --- CODING FOR MISSING OBSERVATIONS ---;

IF VAR10 = . THEN VAR10 = 0.42;
IF VAR15 = . THEN VAR15 = 0.4956700587;
IF VAR17 = . THEN VAR17 = 0.4971728813;
IF VAR2 = . THEN VAR2 = 20;
IF VAR3 = . THEN VAR3 = 42.492;
IF VAR4 = . THEN VAR4 = 2.66;
IF VAR6 = . THEN VAR6 = 0.424;
IF VAR7 = . THEN VAR7 = 0.308;
IF VAR8 = . THEN VAR8 = 7.572;
*____________________________;

HSCORE =
  5.312557694 +
  VAR10 * 1.3035231981 +
  VAR15 * -1.106205098 +
  VAR17 * -1.122026898 +
  VAR2 * -0.2018637 +
  VAR3 * 0.1035603126 +
  VAR4 * -1.294010317 +
  VAR6 * 0.748777983 +
  VAR7 * 0.9509830255 +
  VAR8 * -0.37099917944;
HSCORE = 1/(1+EXP(-(HSCORE)));
*_____________________________;

The final article in the series will discuss stress testing. The idea of stress-testing should be of interest to all banks, because developing such approaches can provide the institution with the tools necessary to better manage its capital. Earlier articles in this series have focused on the development and validation of PD and LGD models. In contrast, the final article will introduce a different modeling approach that uses aggregated data to predict the impact of economic and portfolio changes on bank default losses.

[c] 2003 by RMA. Jeff Morrison is vice president, Credit Metrics--PRISM Team, at Sun Trust Banks Inc., Atlanta, Georgia.

Contact Morrison at Jeff.Morrison@suntrust.com

COPYRIGHT 2003 The Risk Management Association
COPYRIGHT 2005 Gale Group
