In March 2006, the Federal Deposit Insurance Corporation (FDIC) published an update on emerging banking issues, “Scenarios for the Next U.S. Recession.” At the time, FDIC-insured institutions had recorded a fifth consecutive year of record earnings, and it had been nearly two years since the FDIC had last provided assistance to a failed or failing bank—the longest such streak in FDIC history. While times were positive, the report foreshadowed negative trends on the horizon:
“How long can these good times last? Experience teaches us that economic expansions do not last forever and that some types of economic disruptions can be associated with financial distress for banking organizations. While forecasting recessions is, at best, a hazardous business, it makes sense from a risk management perspective to explore various weak-economy scenarios to better prepare for adversity down the road.”
Despite this warning, the update concluded that the banking industry was well-positioned for the next recession. Little did anyone know the severe effect the coming years would have on the asset quality and overall financial condition of banks.
Roughly 6.2 percent of FDIC-insured institutions reported a net loss for 2005. That share had quadrupled by 2007 before peaking at 30.8 percent in 2009. While countless financial reforms and economic events could be cited to explain this trend, asset quality remains the focal point of profitability. Over the past four years, bank earnings have been hampered by mounting loan-loss provisions. This article illustrates recent trends in the allowance for loan and lease losses (ALLL), offers a high-level discussion of the reasons for those trends and provides general guidance on maintaining ALLL at a level adequate to absorb estimated losses while remaining within the confines of generally accepted accounting principles (GAAP).
ALLL Trends – 2005 Through 2010
While the following charts are not surprising to those following the banking industry, they do provide historical context. The average ALLL-to-loans ratio was between 1 percent and 1.5 percent in 2005 and remained fairly consistent until a steep increase from 2007 through 2009, when the average ratio for all institutions grew to more than 3 percent. The trendline for all institutions is somewhat tempered by the inclusion of savings and loan (S&L) institutions. While S&Ls tend to maintain a much lower level of ALLL (primarily due to differences in asset mix), the S&L trendline has mimicked the commercial bank ALLL trend.
While the above graph presents average ALLL levels, these should not be viewed as guidelines for setting ALLL. In addition, figures in recent years are skewed by pending bank failures and a growing number of institutions under formal regulatory agreements.
As would be expected, the trend in loan charge-offs during that same period is very similar, albeit with a minor lag between bad debt recognition (above) and realization (below).
ALLL’s strain on profitability and capital ratios has caused a seismic shift in the banking environment. During this five-year time frame, the number of FDIC reporting institutions fell by 1,175, or roughly 13 percent. Most of this reduction can be attributed to bank failures, merger activity and FDIC-assisted transactions. While each situation is unique, asset quality, and its impact on ALLL, is the prevailing reason behind this trend.
The coverage ratio, typically calculated as ALLL divided by noncurrent loans, is an oft-examined metric that measures an institution’s ability to absorb potential losses from delinquent loans. This ratio has trended inversely to changes in ALLL, dropping below 100 percent in 2008 as problem-loan levels continued to grow rapidly at most institutions.
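The coverage ratio calculation can be sketched in a few lines. This is a simple illustration, assuming the common definition of ALLL divided by noncurrent loan balances; the dollar amounts are hypothetical.

```python
# Coverage ratio sketch: ALLL divided by noncurrent loans,
# expressed as a percentage. Figures are hypothetical.

def coverage_ratio(alll: float, noncurrent_loans: float) -> float:
    """Return ALLL as a percentage of noncurrent loan balances."""
    return alll / noncurrent_loans * 100

# A bank holding $15 million in ALLL against $20 million of
# noncurrent loans covers only 75% of its problem balances.
print(f"{coverage_ratio(15_000_000, 20_000_000):.1f}%")  # 75.0%
```

A ratio below 100 percent, as the industry experienced in 2008, means reserves would not fully cover noncurrent balances if those loans proved entirely uncollectible.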
How Did We Get Here?
While every institution is presented with different circumstances and there are a number of moving targets in estimating potential losses within a loan portfolio, here are some general reasons for the sharp upward trend in ALLL.
Components of the ALLL Model
ASC 310-10 (formerly SFAS 114) and ASC 450 (formerly SFAS 5) provide the relevant guidance for establishing appropriate allowance levels. GAAP requires an institution to segregate its portfolio between impaired loans and nonimpaired loans. A loan is impaired when, based on current information and events, it is probable an institution will be unable to collect all amounts due according to the loan’s contractual terms. Impaired loans must be individually measured for impairment under GAAP, with a specific reserve established when the measured value falls below the loan’s recorded investment. Specific reserves can be determined in a variety of ways, depending on the type of credit and the most likely source of repayment. Commonly used methods include, but are not limited to, liquidation value of collateral and discounted cash flows.
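The discounted cash flow approach can be illustrated with a minimal sketch. The cash flow schedule, loan balance and rate below are hypothetical; ASC 310-10-35 calls for discounting expected cash flows at the loan’s original effective interest rate.

```python
# Sketch of an ASC 310-10 specific reserve using discounted cash
# flows. All figures are hypothetical, for illustration only.

def dcf_specific_reserve(recorded_investment: float,
                         expected_cash_flows: list[float],
                         effective_rate: float) -> float:
    """Reserve = recorded investment less the present value of
    expected cash flows, discounted at the loan's original
    effective rate."""
    pv = sum(cf / (1 + effective_rate) ** t
             for t, cf in enumerate(expected_cash_flows, start=1))
    return max(recorded_investment - pv, 0.0)

# $1,000,000 loan at 6%; borrower is expected to pay $300,000 in
# each of the next three years with no further recovery.
reserve = dcf_specific_reserve(1_000_000, [300_000] * 3, 0.06)
print(round(reserve))  # 198096
```

The shortfall between the loan’s carrying amount and the present value of what the borrower can realistically pay becomes the specific reserve.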
The graph below shows a steep rise in the average level of noncurrent loans over the past five years. While impaired loans are not reported to the FDIC in quarterly call report submissions, there is likely a strong correlation between delinquencies and impaired loans. This simply illustrates the substantial growth in problem (and impaired) loans.
Obviously, the specific reserve set for any impaired credit will depend on unique facts and circumstances. However, the heavy increase in impaired loans occurred in 2007 and 2008, when many institutions had yet to experience a significant level of charge-off history. Therefore, in many instances, reserves applied to impaired credits were significantly higher than those applied to nonimpaired portfolios.
For the nonimpaired population, GAAP allows pooling loans with similar characteristics and risk factors. A widely accepted method for determining reserve factors uses historical charge-offs as a base rate, adjusted for the effects of qualitative or environmental factors, such as national/local economic trends, levels of adversely classified assets and trends in collateral values. Interagency guidance suggests using a historical loss lookback period of no less than 12 months, but acknowledges management may use a shorter or longer lookback period if it provides a better estimate of potential losses. As most institutions use a form of historical loss as a key component in their reserve factors, the charge-off trends presented earlier would serve to substantially increase these factors. As charge-off levels were relatively low in the pre-recession years, historical loss factors remained low. Now, most lookback periods will encompass periods of unusually high loss experience relative to each institution.
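The pooled approach described above can be sketched as follows. The pool balance, base loss rate and adjustment amounts are hypothetical; the structure simply mirrors the common practice of adding qualitative adjustments to a historical charge-off rate.

```python
# Sketch of a pooled (ASC 450) reserve: historical charge-off rate
# as the base factor, plus qualitative/environmental adjustments.
# All rates and balances are hypothetical.

def pooled_reserve(pool_balance: float,
                   historical_loss_rate: float,
                   qualitative_adjustments: list[float]) -> float:
    """Reserve factor = base historical loss rate plus the sum of
    qualitative adjustments (each expressed in decimal form)."""
    factor = historical_loss_rate + sum(qualitative_adjustments)
    return pool_balance * factor

# $50M commercial real estate pool: 1.2% average charge-off rate
# over the lookback period, plus 30 bps for economic trends and
# 20 bps for rising classified asset levels.
print(round(pooled_reserve(50_000_000, 0.012, [0.003, 0.002])))  # 850000
```

Because the historical rate is the base, a lookback window that now includes recession-era charge-offs mechanically raises the factor for every loan in the pool, which is the dynamic driving the trends shown earlier.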
The environmental and qualitative components of loss factors are at the discretion of institution management. Many institutions consult trends, such as classified assets, gross domestic product (GDP), unemployment and housing permits, when determining the proper adjustment. Most of us are painfully aware of the trends in these metrics in recent years, and these trends are reflected in higher qualitative adjustments at most institutions.
Reduction in Real Estate Demand & Value
In pursuit of the construction boom, many institutions moved aggressively into commercial real estate, construction and land development lending. As a result, overbuilding occurred in many markets and real estate values were driven down. As of December 31, 2007, just before the economic storm hit its peak, these lending types comprised approximately 12.75 percent of total commercial bank assets. This figure was only 9.3 percent for S&Ls, largely explaining the wide gap in ALLL levels between institution types. Many real estate projects are highly leveraged, as they are funded primarily by debt rather than borrower equity. High leverage makes this type of lending highly sensitive to changes in interest rates, overall credit conditions, local economic factors and regional and national economic trends. Real estate construction projects often have long development periods, and the timeline and estimated demand for a project can change considerably between inception and completion.
While commercial or residential real estate credits can be subject to a unique set of factors, there has been a widespread drop in real estate values in the past five years. The following charts illustrate 20-year trends in average sales prices of new homes and the reduction in demand for housing, respectively.
Source: US Census Bureau
Source: National Association of Home Builders
The above graphs consider only residential home sales, but these serve as a key indicator in assessing overall real estate trends. The strength of many construction and land development credits lies with housing demand and the ability to sell homes or lots at a price providing adequate debt service coverage. Lower housing demand and sales prices indicate a lack of economic growth. Given a stagnant economy, there also are fewer buyers for commercial real estate properties.
With the lack of demand for real estate, the primary source of cash flow for many borrowers has dried up. Many have had trouble servicing debt or are likely to encounter problems in the future as liquidity becomes an issue. Guarantors on such transactions, who may have previously been viewed as an additional source of repayment, are more heavily scrutinized by bank management, regulators and auditors. For guarantors that have the bulk of their net worth tied to real estate holdings, willingness and wherewithal should be demonstrated before an institution can comfortably rely on them as a potential cash flow source.
For real estate loans deemed to be impaired, an institution must establish a specific reserve based on an estimated liquidation value (generally based on third-party appraisal) or a discounted cash flow model, depending on the likely method of repayment. The growing number of impaired real estate credits, coupled with the downward spiral in real estate values, has caused large fluctuations in ALLL throughout the industry. Many institutions are obtaining more frequent appraisals for their collateral-dependent loans due to continued volatility in the real estate market. Often, updated appraisals result in lower values, increasing the need for ALLL. In addition, given the lack of comparable sales information, the nature of certain sales examined (e.g., foreclosures and bankruptcies) and general economic uncertainty going forward, real estate valuations have become increasingly subjective.
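For collateral-dependent credits, the liquidation value approach reduces the appraised value by estimated selling costs before comparing it to the loan balance. The following sketch uses hypothetical figures and an assumed 10 percent selling-cost haircut, a common practice but not a prescribed rate.

```python
# Collateral-dependent specific reserve sketch: appraised value,
# net of estimated selling costs, compared to the recorded
# investment. Figures and the 10% cost assumption are hypothetical.

def collateral_reserve(recorded_investment: float,
                       appraised_value: float,
                       selling_cost_pct: float = 0.10) -> float:
    """Reserve = recorded investment less net realizable value."""
    net_realizable = appraised_value * (1 - selling_cost_pct)
    return max(recorded_investment - net_realizable, 0.0)

# $800,000 loan against collateral recently appraised at $700,000:
# a new, lower appraisal flows directly into a larger reserve.
print(round(collateral_reserve(800_000, 700_000)))  # 170000
```

This mechanical link between appraisal values and reserves is why falling real estate prices and more frequent reappraisals have produced such large ALLL fluctuations.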
Identification of Troubled-Debt Restructurings
While troubled-debt restructurings (TDRs) have become an increasingly popular topic among bankers, examiners and auditors over the past five years, they are not particularly new. In general, any concession a lender makes to the terms of a loan to a borrower having financial difficulty represents a TDR under ASC 310-40. Before the current economic cycle, banks rarely had material amounts of TDRs. However, in an effort to ease the financial difficulties of certain borrowers and increase the likelihood of repayment, loan modifications—specifically extensions of maturity date, rate reductions and lighter monthly payments—have become more prevalent in recent times. While not all loan modifications should be considered TDRs, this should be evaluated on a case-by-case basis.
By definition, a loan meeting the definition of a TDR is impaired under ASC 310-10-35. As discussed above, specific reserves must be determined for all impaired loans based on estimated collateral liquidation value or a discounted cash flow model. Given continued decline in demand for and value of real estate, many banks have needed to set aside substantial specific reserves for TDR credits. The chart below shows the ratio of TDRs to average assets over the past five year-ends. As illustrated, the sharp increase between 2007 and 2009 coincided with much greater loan modification frequency, as well as more TDR awareness and scrutiny from bank regulators and auditors.
While call report instructions allow institutions to remove a loan’s TDR status in the year subsequent to the modification—assuming the loan is performing under the new terms and maintains a market rate of interest—high modification frequency and continued scrutiny haven’t yet allowed this to have much of an impact in reversing the upward trend. In addition, FASB ASU 2011-02, A Creditor’s Determination of Whether a Restructuring is a Troubled Debt Restructuring, sheds some light on TDR identification. This guidance elaborates on the factors to examine in determining whether a loan modification is really a concession and whether a borrower is experiencing “financial difficulty.” Public companies are in the process of adopting this guidance in their 2011 quarterly filings, while the effect for most private entities will be felt in 2012. While this does not change the underlying accounting treatment, it could increase TDR quantity for many institutions.
Regulatory Environment
The current regulatory environment has greatly influenced ALLL levels over the past five years. In 2006, the Interagency Policy Statement on the Allowance for Loan and Lease Losses urged institutions to maintain effective loan review systems. The statement called for an independent loan review function performed with greater frequency and enhanced scope. Regular, diligent and independent reviews of credits, coupled with economic decline, resulted in a substantial increase in problem assets, charge-offs and, ultimately, ALLL.
With nearly 400 bank failures since 2007 and an estimated loss of more than $80 billion to the FDIC’s Deposit Insurance Fund (DIF), regulators and auditors have encountered nearly every credit problem imaginable. In the best interest of account holders, the DIF and the general public, a great deal of prudence is being exercised in every examination. As ALLL is an estimate and largely the product of management judgment, many institutions and their regulatory agencies have adopted a conservative approach to ALLL while operating within GAAP parameters.
Where Do We Go From Here?
These are uncharted waters for most financial institutions. The economic and credit quality trends of the past five years have led to unprecedented losses in most loan portfolios, while significantly complicating ALLL estimates and causing greater deviation in ALLL methodologies industrywide. The need for sound ALLL methodology is more important than ever in preparing accurate financial statements and evaluating an institution’s true financial condition.
Determining the proper ALLL is inevitably imprecise, and an appropriate ALLL falls within a range of estimated losses. Regardless of your unique methodology, ALLL estimates should be based on a comprehensive, well-documented and consistently applied loan portfolio analysis.
For most institutions, the primary driver for determining ALLL on nonimpaired loans is historical loss experience. While a sound ALLL model will use this data as a foundation, it should be supplemented with current economic trends, forecasts and management’s qualitative considerations. Management should carefully question the relevance of prior loss history in determining how much loss remains in the portfolio. Over periods with relatively stable loss histories, historical loss factors are highly relevant and might provide the best estimate of future losses. However, based on the last five years of credit woes and continued economic uncertainty, management’s qualitative adjustments may currently be more crucial. The gap between an ALLL model based on incurred losses and one truly estimating expected losses can be wide in times of economic volatility. Management should evaluate its unique portfolio and determine how best to balance the past with future projections.
As any bank CFO or CCO can attest, attempting to quantify qualitative ALLL adjustments can be extremely difficult. For each adjustment, management should strive to achieve “directional consistency” between the adjustment amount and the underlying factors being considered. For example, if an institution has a qualitative factor for national and local economic trends that considers metrics such as GDP and the unemployment rate, the related adjustment should correlate to movement in these metrics. In addition, a high-level check of past qualitative adjustments will help determine whether the current adjustment amount is reasonable within the context of the ALLL model. Procedures such as stress testing for interest rate changes and risk rating migrations can help quantify these adjustments and solidify estimated loss calculations.
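A directional consistency check of the kind described above can be automated in a simple form. This is a sketch only; the metric values and adjustment amounts are hypothetical, and real models track many factors per pool.

```python
# Minimal "directional consistency" check: the period-over-period
# change in a qualitative adjustment should move in the same
# direction as the underlying risk metric. Values are hypothetical.

def directionally_consistent(metric_change: float,
                             adjustment_change: float,
                             rising_metric_raises_risk: bool = True) -> bool:
    """True when the adjustment moved in the risk-appropriate
    direction (or both the metric and adjustment were flat)."""
    if not rising_metric_raises_risk:
        metric_change = -metric_change
    # Both rose, both fell, or both were unchanged.
    return (metric_change > 0) == (adjustment_change > 0) and \
           (metric_change < 0) == (adjustment_change < 0)

# Unemployment rose from 8.9% to 9.6%, yet the related qualitative
# adjustment was cut from 40 bps to 25 bps -- flag for review.
print(directionally_consistent(9.6 - 8.9, 0.0025 - 0.0040))  # False
```

A flag from a check like this does not mean the adjustment is wrong, only that the divergence between the factor and its supporting metric warrants documented justification.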
Outside the mechanics of the ALLL calculation, board involvement can be crucial in maintaining a sound methodology and adequate ALLL level. Given the size of this estimate and amount of subjectivity involved, boards should reevaluate the ALLL process and consider if changes or adjustments are appropriate. Board members should remain educated on their institution’s current ALLL methodology and relevant accounting guidance.
Unfortunately, financial institutions do not have the benefit of hindsight as examiners and auditors often do. However, ALLL is a crucial estimate, and management must make every effort to accurately estimate future credit losses based on the information available. Despite the recent doom-and-gloom scenarios, there are positive signs in recent ALLL trends. The FDIC’s Quarterly Banking Profile for fourth quarter 2010 reported improved asset quality for a third consecutive quarter, and fourth-quarter loan-loss provisions were the smallest for the industry since third quarter 2007. In addition, 54 percent of insured institutions reduced their fourth-quarter provisions compared to a year earlier. Hopefully, these are signs that better days are ahead.
For more on ALLL, contact your BKD advisor.