For release on delivery
8:25 a.m. CDT (9:25 a.m. EDT)
May 2, 1996
Remarks by
Alan Greenspan
Board of Governors of the Federal Reserve System
before the
32nd Annual Conference on Bank Structure and Competition
Federal Reserve Bank of Chicago
Chicago, Illinois
May 2, 1996
Good morning. It is a pleasure to once more address this impressive
conference. Through the years the Chicago Federal Reserve Bank has consistently
identified the most important issues that policymakers are confronting. It has done so
again. The rapidity of technological change, the globalization of finance, and the
institutional expansion of financial competition are changing the banking environment
so dramatically that regulators, of necessity, must continually re-evaluate their strategy
and procedures.
It is useful, I think, to begin a discussion of "what should regulators do?"
by reminding ourselves just why there is bank regulation, by which I mean regulation
designed to assure a minimum level of prudential soundness. At bottom, of course, is
the historical experience of the effects on the real economy of financial market
disruptions and bank failures, especially when the disruptions and failures spread
beyond the initial impetus.
Perhaps equally important is the unintended market response to a safety
net designed to address systemic concerns and protect "small" depositors. Deposit
insurance and the discount window, by changing the terms under which banks deal with
their creditors, distort the signals and incentives that banks receive from the market,
creating a substantial potential for excessive risk-taking by banks. In response, bank
regulators have been forced to try to minimize this moral hazard that, in the absence of
the safety net, the market itself would police. The problems that arise from the
short-circuiting of the pressures of market discipline have led us increasingly to
understand that the ideal strategy for supervision and regulation is to simulate the
market responses that would occur if there were no safety net, but without giving up the
basic requirement that financial market disruptions be minimized.
Such a realization highlights the basic tensions between stability and risk
taking, between regulation and market forces, between government guarantees and
business choices. Our goal is, in effect, to make the safety net moot in order to give
wider range to market forces. But our constraint is to avoid systemic risk, by which —
as I have noted at this conference before — I do not mean a zero bank failure rate.
Far from it. For we should not forget that the basic economic function of these
regulated entities is to take risk. If we minimize risk taking in order to reduce failure
rates to zero, we will, by definition, have eliminated the purpose of the banking system.
Since dramatic institutional and regulatory change is rare, the challenge is
to develop evolutionary modifications that balance risk-taking and stability, market
incentives and regulations. Happily there are, I think, reasons to be optimistic that this
can be done. The source of my optimism, as I will explain, is the potential to harness
one of the main forces that is also challenging our current regulatory structure:
technology.
Supervisors and regulators have already begun to do so by emphasizing
risk management procedures at individual banks and, most recently, by proposing the
use of internal risk models for purposes of allocating capital for trading risk.
Nonetheless, the most fundamental recent change — to which the others are simply
modifications — is the adoption of risk-based capital. The risk-based capital accord of
1988 was a genuine step forward at the time. However, compromises were inevitably
made to achieve consensus. The regulators knew the shortcomings, but were — in
my view, correctly — willing to compromise in order to establish both a meaningful
capital floor and to recognize, in a rough way, differences in risks for broad
categories of assets. The capital rules were especially helpful to halt — and reverse —
the secular decline in bank capital ratios. Moreover, the Fed was — and continues to
be — supportive of prompt corrective action that builds upon the accord by requiring
specific regulatory responses at different risk-based capital ratios.
In recent years, however, the weaknesses in the risk-based capital
structure have become ever more evident: its sole focus on credit risk, its
one-size-fits-all risk weight for nonmortgage loans, its inability to adjust weights for hedges, portfolio
diversification, and management controls, and the difficulties of folding in interest rate
risk, to name a few. Most, if not all, of these problems were known before adoption.
Most, if not all, are addressed judgmentally by the supervisors in their case-by-case
onsite examination and review. Nonetheless, despite these efforts and greater
attention to internal bank risk management and control, our current procedures are
linked too closely, in my view, to the risk-based capital ratio. At the same time, the
marketplace has become more complicated in ways that risk-based capital rules
cannot handle, even with the increasing complexity of the rules.
These market complexities, however, would not have occurred without the
same technology that has recently made it possible to quantify risks that only a short
time ago we could just conceptualize. Like internal models for measuring risk on
trading positions, quantifiable measures for other risks make it possible for banks and
others to choose more carefully their risk positions and better link those positions to the
appropriate capital levels.
Consider securitization, which, of course, has been evolving for several
years, but is, in an important sense, a paradigm of the evolving risk management
techniques in financial markets. In a typical securitization, the sponsoring entity —
which is often a bank — establishes a special purpose vehicle, or conduit, that acquires
a loan pool from an originator. Frequently, to exploit regulatory loopholes, the conduit
itself originates the loans. The conduit then issues varying tranches of securities to
fund the loan pool. It achieves double- or triple-A ratings on most of these securities
by providing for their credit enhancement, typically by having the sponsor purchase the
most junior securities, which are structured to absorb virtually all of the credit risk
inherent in the underlying loan pool. The sponsoring bank retains all of the residual
spread between the return on the loan pool and the costs incurred by the conduit,
including the interest and noninterest costs on the loan-backed securities.
For present purposes, what is most critical about securitizations is that the
participants — the originator and the credit enhancer (who are often one and the
same), as well as the purchasers of the senior securities — all need to have a fairly
clear and quantifiable idea of their risk exposure. Typically, however, bank loans are
opaque to outsiders, making such knowledge hard, if not impossible, to come by.
Nevertheless, this problem often can be overcome by using loan portfolios with
standard loan contracts having well-known characteristics — like credit card and auto
loans — or by overcollateralizing, or by providing appropriate levels of credit
enhancement. Most often a combination of these is used.
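[Editor's note: the tranche structure described above can be sketched in a few lines of code. All figures below are invented for illustration and do not come from the speech; the point is simply that losses hit the junior, credit-enhancing tranche before the senior securities.]

```python
# Hypothetical sketch of a securitization loss "waterfall": pool losses are
# absorbed first by the junior (credit-enhancement) tranche retained by the
# sponsor, and only the excess reaches senior security holders.

def allocate_losses(pool_loss, junior_size, senior_size):
    """Absorb pool losses junior-first; return (junior_loss, senior_loss)."""
    junior_loss = min(pool_loss, junior_size)
    senior_loss = min(max(pool_loss - junior_size, 0.0), senior_size)
    return junior_loss, senior_loss

# An illustrative $100m pool funded by a $92m senior tranche and an $8m
# junior tranche retained by the sponsoring bank:
modest = allocate_losses(3.0, 8.0, 92.0)   # losses stay with the sponsor
severe = allocate_losses(12.0, 8.0, 92.0)  # only the excess reaches seniors
```

This junior-first ordering is what lets the senior securities earn double- or triple-A ratings even when the underlying loans individually would not.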
While perhaps not designed originally with the purpose in mind, one
method for making such loans more amenable to securitization is credit scoring. Credit
scoring applies formal statistical procedures to the credit decision process. By
subjecting loans to uniform underwriting standards, credit scoring facilitates analysis of
the credit risk inherent in a securitized loan pool. Such procedures are being applied to
an ever wider array of bank loans, including, most notably, small business loans. In the
future, one can thus expect to see an increasing volume of securitized small business
loans that are not guaranteed by the SBA. It is also worth emphasizing that the credit
scoring exercise — and the market evaluation accompanying securitizations of loan
pools — requires that banks capture, monitor, study, and present historical loss data on
a large volume of their loans. The development of such data bases was a necessary
prerequisite to mortgage, credit card, and auto loan securitization. Such data bases will
be equally critical not only to the evolving technology of internal risk management at
banks and for securitization of other loan pools, but, as I will be noting shortly, also for
evaluations of risk by the supervisor/regulator.
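[Editor's note: "formal statistical procedures" in credit scoring typically means a fitted scorecard such as a logistic model. The sketch below is purely illustrative; the variables and weights are made up, and a real scorecard would be estimated from the historical loss data bases the speech describes.]

```python
import math

# Illustrative credit-scoring sketch: a logistic score mapping a few
# borrower attributes to a default probability. The weights are invented
# for illustration, not fitted to any data.

def default_probability(utilization, delinquencies, years_on_file):
    """Uniform underwriting rule: same formula applied to every applicant."""
    score = -2.0 + 3.0 * utilization + 0.8 * delinquencies - 0.1 * years_on_file
    return 1.0 / (1.0 + math.exp(-score))  # logistic link

low_risk = default_probability(0.1, 0, 10)   # light usage, clean history
high_risk = default_probability(0.9, 3, 1)   # heavy usage, past delinquencies
```

Because every loan in a pool is underwritten by the same explicit formula, outside investors can assess the pool's credit risk without knowing the individual borrowers, which is what makes the loans securitizable.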
A brief digression may be useful at this point. A common
misunderstanding of credit scoring and securitization is that both will cause banks to
refrain from making the nonstandard loans that their special knowledge has made
possible, including the credits that, in the past, I have referred to as character loans. It
is also sometimes argued that securitization of loans will undermine the economic
franchise of banking, driving down spreads on the last profitable credit function in the
banks' economic franchise. But such a result is not at all preordained, especially for
those banks willing and able to take advantage of new technologies. Of course, the
new technologies of loan standardization and credit scoring can be viewed as chipping
away at the monopoly rents associated with specialized knowledge of the local loan
customer. In effect, barriers to entry are lowered when the new technologies allow
nonlocal competitors to offer standardized products through nationwide marketing
campaigns using toll-free "800" telephone numbers. But the by-products of
standardization and credit scoring include lower underwriting expenses and the more
accurate estimation of loss probability distributions. These by-products act to offset the
effects of a reduction in barriers to entry, both by raising profits on existing operations
and by opening up opportunities with customers previously not served. Better and
quantifiable estimates of risk are tantamount to risk reduction.
Moreover, securitization, by expanding lower cost funding sources,
increases the ability of banks to make more loans and to diversify their risks. Risk
reduction, lower funding costs, and lower noninterest costs in making the credit
decision all act to increase banks' risk-adjusted rate of return. In addition, banks can,
will, and do continue to make the nonstandard loan based on judgment and
asymmetrical information, retaining the resultant whole loan in their portfolios. Indeed,
the new technology should enable banks efficiently to evaluate borrowers they could
not assess in a cost-efficient way before. Thus, new banking opportunities will be
opened. The new procedures I have just described also induce banks to price all loans
— but especially nonstandard loans — more accurately to reflect their true costs and
risks. This, too, works to increase risk-adjusted rates of return and results in a better
allocation of resources. True, some banks, unwilling or unable to adapt to the changing
technology, will lose market share and perhaps suffer lower rates of return. But the
banks that embrace the cost-cutting and risk-reducing effects of the technology will, in
my judgment, tend to find it a rewarding experience.
Beyond credit scoring and securitization, larger banks are moving into
new areas of risk evaluation for internal management purposes, including the
quantification of credit risk. Most large banking organizations have — or are
developing — procedures for allocating capital against various types of loans, based on
estimates of credit risk for various categories. For example, in middle market lending at
these institutions, a first step is to classify loans into various rating categories —
usually 1 to 10, with 1-rated loans being equivalent to triple-A securities and 10-rated
loans about to be written off as loss. Periodically, each loan is re-evaluated and
re-categorized if necessary. Such categorizations have been done for some time, but
the more sophisticated banks are going an important step beyond this point. They are
using historical data to estimate the mean and variance of defaults and actual losses on
each grade of loan. The result can be interpreted as attempting to infer the loss
probability distribution for each category or subportfolio of loans, and for the entire loan
portfolio.
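[Editor's note: the "important step" described above — estimating per-grade loss statistics from historical data — can be illustrated in a few lines. The grades and annual loss rates below are invented for illustration.]

```python
import statistics

# Sketch: estimating the mean and variability of annual loss rates for
# each internal loan grade from a (hypothetical) historical data base.
historical_loss_rates = {            # annual loss rates by internal grade
    3: [0.001, 0.002, 0.001, 0.003], # a high-quality grade
    7: [0.020, 0.035, 0.015, 0.050], # a riskier grade
}

per_grade = {
    grade: (statistics.mean(rates), statistics.stdev(rates))
    for grade, rates in historical_loss_rates.items()
}
# per_grade maps each grade to (mean loss rate, standard deviation),
# the raw ingredients for inferring a loss probability distribution.
```

The same summary statistics, aggregated across grades with correlations, are what allow a bank to infer a loss distribution for the whole portfolio.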
Consider how such information can be used. Estimates of expected
losses and the probability distribution of unexpected losses are critical for pricing credits
correctly and for deciding whether competitive market rates imply withdrawing, cutting
back, or expanding various types of credit. A prerequisite, however, is a judgment by
management as to the proper amount of capital to allocate to each of the subportfolios
or risk categories so that risk-adjusted rates of return can be calculated. The most
common approach is to allocate sufficient capital so that the probability of actual credit
loss exceeding the allocated capital is no greater than, say, one-half percent. This
probability "target," in turn, could be arbitrarily chosen or selected to be consistent with,
say, maintaining a double-A bond rating on the bank's own debt.
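[Editor's note: the capital rule described above is a quantile of the loss distribution. The sketch below uses a Gaussian stand-in for a bank's internally estimated loss distribution; the mean and standard deviation are illustrative only.]

```python
import math
import random

random.seed(0)
# Simulated portfolio credit losses (fraction of assets); a Gaussian
# stand-in for whatever loss distribution a bank estimates internally.
losses = [max(0.0, random.gauss(0.01, 0.02)) for _ in range(100_000)]

def capital_for_target(losses, target=0.005):
    """Smallest capital level exceeded by at most `target` of the losses,
    i.e. the (1 - target) quantile of the simulated loss distribution."""
    ordered = sorted(losses)
    k = math.ceil(len(ordered) * (1.0 - target)) - 1
    return ordered[k]

cap = capital_for_target(losses)                      # one-half percent target
exceed = sum(1 for x in losses if x > cap) / len(losses)
```

Under these illustrative parameters the allocated capital comes out well above the expected loss, which is exactly the distinction the speech draws between expected and unexpected losses.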
These capital allocations, as I noted, are for internal management, not
regulatory, purposes. But I am impressed with what they teach us, the regulators, and
what they imply for regulatory capital. The internal capital allocations used by banks
range from less than 2 percent for highly rated loans to 20 percent or more for the most
risky credits. In addition, credit enhancements, such as most junior positions in
securitized loan pools, can have theoretical capital allocations that widen still further the
range of appropriate internal capital allocations. Compare this wide range of internal
capital allocations with the 8 percent, one-size-fits-all Basle standard. In fact, the
average risk-based capital ratio for U.S. banks approaches 12 percent for large banks,
far above the 8 percent minimum. Nonetheless, consider the contradiction implicit
when a bank with a 12 percent risk-weighted capital ratio may be viewed by the public
as having a strong capital position when the bank's own capital allocation models
suggest that it should have 15 percent capital, or more. The supervisor, I believe, is not
being misled in most such cases, and is making the appropriate judgmental
adjustments. Moreover, the markets clearly make such adjustments. I note that
banks with very high risk-based capital ratios still do not achieve triple-A ratings on
their debt, and some do not even have single-A ratings.
One can conceive of a bank following a portfolio policy that would
engender an average internal capital allocation well below our 8 percent
minimum. In such a case, our rules would significantly disadvantage the bank and
induce it to find loopholes and to engage in regulatory arbitrage to avoid the standard.
But consider a bank carrying capital considerably above what the regulatory guidelines
suggest, but below what the bank's own internal capital allocation procedures imply.
Such a bank has no reason to adjust its position for regulatory purposes if the
supervisor does not see through the veil of nominally high regulatory capital. As I
noted, I believe that supervisors generally have been able to do so. But should they fail
to see through that veil, in an environment of increasingly complicated financial
transactions, there could be a serious inconsistency between our desired regulatory
soundness standard and actual bank risk levels.
Federal Reserve staff members are beginning a review of the major
banks' internal credit risk-capital allocation models in order to understand better the
strengths and weaknesses of these models. We already know, however, that there has
been an irreversible application of risk measurement technology without which banks
would not be able to design, price, and manage many of the newer financial products,
like credit derivatives. These same or similar technologies can be, and are beginning to
be, used to price and manage traditional banking products.
Some of these developments are at an early stage, and all are evolving
rapidly. Today's technology allows us to measure risk in ways that were unthinkable a
decade ago. The next decade will likely produce further dramatic change. But already
today, the markets — including credit rating agencies — are using these quantitative
tools.
Indeed, for the first time, we can seriously begin to contemplate a
regulatory quantification of what we mean by "soundness." Recall that while the
objective of bank regulation and supervision is to assure a minimum level of prudential
soundness, the precise meaning of soundness has always been tenuous and
ill-defined. This is why judgment has been, and will continue to be, a critical
component of prudential supervision. However, the technology and techniques banks
have developed, and are developing, allow us greatly to improve that judgment by
constructing measures of soundness in probability terms. If we can obtain reasonable
estimates of portfolio loss distributions, soundness can be defined, for example, as the
probability of losses exceeding capital. In other words, soundness can be defined in
terms of a quantifiable insolvency probability. Moreover, one can conceive of definitions
of soundness that go beyond simply the probability of insolvency to encompass also the
level and variability of losses to the FDIC in the event of insolvency. Going still further,
regulatory targets for quantifiable soundness could be made to reflect market-based
goals. For example, soundness could be defined in terms of some implied, minimum
credit rating for the bank's deposits, as if they were uninsured. All of these approaches,
however, require the regulators to establish targets regarding acceptable failure rates or
the FDIC's exposure to potential losses. Note that a bank could meet any particular
quantitative soundness standard by increasing its capital or by reducing the risk of its
portfolio.
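[Editor's note: the definition of soundness as an insolvency probability, and the two levers for meeting a standard — more capital or less portfolio risk — can be made concrete with a normal approximation to the loss distribution. All parameters below are illustrative.]

```python
import math

# Sketch: soundness as a quantifiable insolvency probability,
# P(losses > capital), under a normal approximation to portfolio losses.

def insolvency_probability(capital, mean_loss, sd_loss):
    """P(losses > capital) when losses ~ Normal(mean_loss, sd_loss)."""
    z = (capital - mean_loss) / sd_loss
    return 0.5 * math.erfc(z / math.sqrt(2.0))  # upper normal tail, 1 - Phi(z)

base = insolvency_probability(0.08, 0.01, 0.02)          # 8% capital
more_capital = insolvency_probability(0.12, 0.01, 0.02)  # lever 1: raise capital
less_risk = insolvency_probability(0.08, 0.01, 0.01)     # lever 2: de-risk portfolio
```

Either lever lowers the insolvency probability, which is precisely why a quantitative soundness standard leaves the bank free to choose its mix of capital and portfolio risk.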
I do not mean to suggest that we have reached the point at which we can
now establish quantitatively precise soundness standards. We have not. These
procedures are in their infancy and are hampered by the lack of historical micro data
bases, which have to be laboriously constructed at, or by, individual banks. Moreover,
ascertaining the relevant probabilities, the basis of an evaluation of soundness,
presupposes an estimation of the shape of these distributions, arguably the most
difficult aspect of this process. The technical methodology is also changing with
experience and with conceptual progress in the academic and professional
communities.
As I noted, we have already decided to use internal bank model
approaches for measuring market risks at banks and allocating regulatory capital to
those risks. In addition, the Federal Reserve Board has been studying an alternative
capital allocation process for market risk, the so-called "pre-commitment" approach.
This methodology would provide market and other financial incentives for banks to
choose capital allocations for trading risk that are consistent with their own risk
management capabilities, as well as with regulatory objectives. With the Board's
encouragement, the New York Clearing House Association is organizing a pilot study of
the pre-commitment approach. The next natural step is to begin to review ways to
harness, for supervisory purposes, the banks' own models for the measurement of
credit risk.
The private sector, for a considerable time, has been accustomed to
product planning cycles — created by rapid technological change — in which the
planning of the replacement product is begun, if not well along, by the time a new
product is being introduced. The United States banking system is not only the largest
in the world, but it is also the most complex and the most innovative. We cannot
escape the reality that banking supervisors and regulators will have to innovate to
continue to carry out their responsibilities. Bank, and more generally financial
institution, supervision is, of necessity, a continually evolving process reflecting the
continually changing structure and policies of the supervised institutions. We will
eventually correct, for example, all, or most, of the anomalies which we perceive in
risk-based capital, only in a few years to be required to "correct" those corrections.
This is not a fault, but a description of an appropriate regulatory process. Indeed,
given our own long lead times, we, like banks and nonfinancial firms with their
products, must begin designing the next generation of supervisory procedures even
while introducing the latest modification.
What about principles that can be developed to address the original
question of this conference: What should regulators do? In light of my remarks this
morning, let me suggest a basic principle: Whenever possible, regulators should use
approaches to regulation and supervision that include or simulate market techniques
and signals. Importantly, our soundness standards should be no more and no less
stringent than those the marketplace would impose. In an unregulated market, of
course, a financial firm can take on any amount of insolvency risk it wishes, and the
market will rate its liabilities, and price them, accordingly. Unfortunately, we do not have
a system in which deposit insurance premiums are permitted to vary as widely as
simulating that response would require. But, at least theoretically, we can adjust the
individual bank's regulatory capital requirement to offset the reduction in market
discipline attributed to the safety net. Perfection would occur if bankers had a
genuinely difficult choice in deciding whether they really wanted to remain an insured
bank or become an unregulated financial institution.
In the final analysis, such an approach is the only way to control the moral
hazard of the safety net, to square the circle in balancing stability requirements with
risk-taking. An important — and increasingly feasible — prerequisite in achieving that
balance is for the regulators to quantify what their goals are, especially what is meant
by "soundness." Measuring actual risks relative to these goals would be facilitated if
regulators harness, for supervisory purposes, the market-oriented tools already used
internally by banks for management purposes.
When seeking to implement this principle and utilize new technologies, we
must take care to remember that we are unlikely ever to be able to measure risk in
absolutely precise ways. Quantification procedures are still extrapolations of the past,
and behavior is always changing. Models will doubtless still be haunted by specification
and estimation errors. The world will remain a highly complex place, and I have no
doubt that financial participants and markets will continue to invent instruments and
procedures that models will not be able to capture until sufficient experience is gained.
Thus, I am neither proposing nor anticipating that bank supervisors will be relying on
a black box based on statistical and econometric rules. I am suggesting, however, that
new paradigms are in the process of evolving which will provide us with tools that
permit greater quantification of both risk standards and risk management. Such
quantification will not solve all of our problems, nor will it ever substitute for human
judgment, which is the only technology we have available to parse the most difficult
regulatory problems. Nonetheless, quantification will facilitate great improvements in
both risk management and what regulators will be able to do. The financial world is
dynamic, and I have little doubt that there will be a continuous need to modify what we
develop. In the end, judgment must be augmented with technology, and technology
must be tempered with judgment.
* * * * *