ISG Software Research Analyst Perspectives

Financial Institutions Need Dynamic Risk Management

Written by Robert Kugel | Jul 19, 2012 6:53:22 PM

Planning portfolio risk follows the same basic tenets as other sorts of business planning. It must be done in the context of a time dimension. In business, short-term plans are developed with a lot of givens or constraints. For example, capacities are fixed, because it’s impossible to wave a magic wand and bring a new factory on line, stuff more machine tools into already jammed facilities or source more raw materials in a capacity-limited supply chain. Short-term plans also incorporate assumptions about external forces (such as the economy, competitive moves or regulation) that are fixed or change very little in this period. By contrast, long-range or strategic planning is relatively unconstrained. The countries or markets an organization can enter or the products it can offer, for example, are not limited by current conditions. Indeed, that’s an essential point of long-range planning: assessing the impact of significant changes to today’s givens or determining how to manage the impact of expected future trends.

The degree of granularity in planning and optimization also is a function of the time frame. There’s value in adding lots of detail in short-term plans because executives and managers are looking for specific answers to tactical questions. In the short term, planning the details is worthwhile because there’s a much smaller likelihood that the internal or external assumptions built into a model will be materially different from the current state. On the other hand, long-range planning at a detailed level is an exercise in futility. Details matter less because the degree of uncertainty of any single assumption increases with time. There is much greater value in exploring the impact of a range of broad assumptions over a period of years rather than sticking to specific details, because the probability of game-changing events increases.

Planning for that space between the short and the long terms poses its own challenges. A major reason for intermediate-term planning is to determine the best way to dynamically manage the transition from today’s point A to tomorrow’s desired point B. Plans therefore must be made in enough detail to tie the current state to a set of future objectives, but without extraneous granularity. Intermediate-range planning doesn’t have to be done as often as short-term planning, but it must be more frequent than long-range planning. Whereas short-term plans may be generated anywhere from hourly to quarterly (the exact frequency depends on the nature of the business activities being planned and available resources) and long-range plans are created annually, intermediate-range plans ought to be done quarterly or at least semi-annually.

These temporal considerations apply to planning and managing risk in financial services companies, especially banks. Short-term risk management requires a granular approach to achieve a detailed understanding of the impact of market changes and the ability to quickly assess (and reassess) specific counter-party risk scenarios. A few years ago, in the wake of the financial crisis, I wrote a blog post on an analytical approach to risk analysis for financial institutions advocated by Willi Brammertz, a professor at the University of Zurich. In his book Unified Financial Analysis, Brammertz argues for a less abstract, bottom-up approach to portfolio risk management. Rather than constructing models based on a set of assumptions about the assets on a balance sheet, financial services companies should define each asset in terms of its specific parameters (the terms and conditions of the contract between borrower and lender, in the case of a bank) and gauge the performance of each of these under varying sets of assumptions to assess their impact on the entity’s soundness. Abstract models can – and in the financial meltdown did – build in assumptions that are too constrained to alert bank risk managers to the true risks they face. Since it’s possible to define almost all of the assets on a bank’s balance sheet in common, consistent sets of terms and conditions (maturity, rate, currency, secured or unsecured and counter-party risk, among the more common attributes), it should be possible to assess with greater certainty the impact of a wider range of assumptions about future changes on the value, riskiness and liquidity of these assets.
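To make the contract-level idea concrete, here is a minimal sketch (in Python) of what a bottom-up approach could look like: each asset is described by its own terms and conditions and then revalued under alternative rate assumptions. It is illustrative only, not Brammertz’s framework; the contracts, ratings, scenarios and the single-cash-flow valuation are simplifications invented for this example.

```python
# A minimal illustrative sketch of bottom-up, contract-level risk analysis:
# each asset is defined by its contractual terms and revalued under
# alternative rate scenarios. All figures here are hypothetical.

from dataclasses import dataclass


@dataclass
class LoanContract:
    """One balance-sheet asset, described by its terms and conditions."""
    principal: float          # outstanding balance
    fixed_rate: float         # contractual annual rate
    years_to_maturity: float
    currency: str
    secured: bool
    counterparty_rating: str  # e.g. "A", "BBB" -- illustrative labels


def present_value(contract: LoanContract, discount_rate: float) -> float:
    """Deliberately simplified valuation: one bullet repayment at maturity."""
    cash_flow = contract.principal * (1 + contract.fixed_rate) ** contract.years_to_maturity
    return cash_flow / (1 + discount_rate) ** contract.years_to_maturity


# Hypothetical portfolio and rate scenarios.
portfolio = [
    LoanContract(1_000_000, 0.045, 5, "USD", True, "A"),
    LoanContract(500_000, 0.062, 3, "USD", False, "BBB"),
]
scenarios = {"rates_flat": 0.03, "rates_up_200bp": 0.05, "rates_up_400bp": 0.07}

for name, rate in scenarios.items():
    total = sum(present_value(c, rate) for c in portfolio)
    print(f"{name}: portfolio value = {total:,.0f}")
```

The point of the sketch is the shape of the data, not the arithmetic: because every contract carries the same consistent set of attributes, a new assumption can be applied uniformly across the whole balance sheet rather than to an abstract aggregate.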

While detailed models are valuable for short-term assessments, they may not be practical or as useful for intermediate-term planning for banks. A highly granular modeling methodology is not likely to work for broader enterprise risk management in larger organizations because these organizations’ information technology infrastructures stand in the way. These institutions employ an amalgam of transaction and risk management systems that have been assembled over time. Even if two systems are from the same vendor, they may have incompatibilities because originally they were purchased by two different companies and provisioned in different ways. Establishing, maintaining and executing the process of extracting, moving and validating data can be so time-consuming and costly as to render the project unworkable. Then there is the sheer scale of the data that would have to be processed. At some point in the future, the cost of memory and the capabilities of in-memory processing systems may make it feasible for a universal bank to start from a full bottom-up data set, but today it is not. Moreover, given the uncertainties that are present in this sort of intermediate planning, a summary level of detail about the assets provides sufficient fidelity to map out potential courses of action and their impact over this period.

I believe the quality of risk-based planning will be a major differentiator of strategic success for banks over the next decade, for two reasons. First, in a period of heightened regulatory conservatism (and the higher capital ratios that go with it), risk-adjusted return on capital matters more today than at any other time during the past 30 years. Second, the period of nearly worldwide interest-rate repression that has characterized financial markets for the past five years will end within the next five. Rates will be higher and more volatile than today. As a consequence, banks will need to manage their portfolios to achieve an optimal risk-adjusted return on capital consistent with their strategic positions. To do this successfully means being able to do intermediate-term planning more frequently and more intelligently. Determining how best to get from point A (today’s asset portfolio) to multiple points B (the optimal allocation of bank capital under different scenarios) will yield a consistently higher return on assets than a more reactive approach to managing a bank’s portfolio.
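As a rough illustration of weighing “multiple points B,” the sketch below scores two hypothetical capital allocations by a deliberately simplified risk-adjusted return on capital under a pair of scenarios. The asset classes, returns, loss rates and capital charges are invented for the example; a bank’s actual planning models would be far richer.

```python
# A minimal sketch of comparing candidate capital allocations ("points B")
# by a simplified risk-adjusted return on capital across scenarios.
# All asset classes and figures below are hypothetical.

candidate_allocations = {
    "conservative": {"mortgages": 0.6, "corporate_loans": 0.3, "trading_book": 0.1},
    "aggressive":   {"mortgages": 0.3, "corporate_loans": 0.4, "trading_book": 0.3},
}

# Per-scenario expected return and expected loss rates for each asset class.
scenarios = {
    "rates_rise": {
        "mortgages":       {"ret": 0.040, "loss": 0.010},
        "corporate_loans": {"ret": 0.060, "loss": 0.025},
        "trading_book":    {"ret": 0.080, "loss": 0.050},
    },
    "recession": {
        "mortgages":       {"ret": 0.030, "loss": 0.030},
        "corporate_loans": {"ret": 0.045, "loss": 0.060},
        "trading_book":    {"ret": 0.020, "loss": 0.080},
    },
}

# Illustrative capital required per unit of exposure in each asset class.
capital_charge = {"mortgages": 0.04, "corporate_loans": 0.08, "trading_book": 0.12}


def raroc(weights: dict, scenario: dict) -> float:
    """Simplified risk-adjusted return on capital: (return - expected loss) / capital."""
    net_return = sum(w * (scenario[a]["ret"] - scenario[a]["loss"]) for a, w in weights.items())
    capital = sum(w * capital_charge[a] for a, w in weights.items())
    return net_return / capital


for name, weights in candidate_allocations.items():
    worst_case = min(raroc(weights, s) for s in scenarios.values())
    print(f"{name}: worst-case RAROC = {worst_case:.1%}")
```

Even at this toy scale, the exercise shows why frequency matters: as scenario assumptions are refreshed, the ranking of candidate allocations can change, and that shift is the signal to adjust course.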

Managing the risk dimension of portfolio decisions more nimbly can allow banks to prepare better for changes in financial markets and make appropriate changes to their portfolios sooner. Today, conservatism is in order. In 18 months, however, a more aggressive lending posture may be advantageous. Thus planners should ask an array of potentially significant questions. What actions do executives and managers in specific areas of the bank have to take over the next six months to be best positioned for that outcome? What products or services must be available? How should these be priced? How should capital be allocated to achieve the optimal risk-adjusted return on capital within regulatory constraints? If a certain path is chosen, what conditions will produce the best outcome? What conditions would make that choice embarrassingly wrong?

More intelligent risk management for bank portfolios requires analytic software with which to create and manage accurate risk models, data management tools to feed information from multiple, disparate systems to these models, and a management discipline that emphasizes a proactive approach to enterprise planning. Banks that support a more intelligent approach to portfolio management with the right software and good data management practices are likely to be more successful.

Regards,

Robert Kugel – SVP Research