
Statistical CAMELS Offsite Rating Review Program for FDIC-Supervised Banks


September 26, 2002
Audit Report No. 02-033

FDIC
Federal Deposit Insurance Corporation
Office of Audits
Office of Inspector General
Washington, D.C. 20434

DATE: September 26, 2002

TO: Michael J. Zamorski, Director, Division of Supervision and Consumer Protection

FROM: Russell A. Rau [Electronically produced version; original signed by Russell Rau], Assistant Inspector General for Audits

SUBJECT: Statistical CAMELS Offsite Rating Review Program for FDIC-Supervised Banks (Audit Report No. 02-033)

This report presents the results of the Office of Inspector General’s (OIG) audit of the effectiveness of the Statistical CAMELS Offsite Rating (SCOR) review program. (Note: Financial institution regulators use the Uniform Financial Institutions Rating System to evaluate a bank's performance. Six areas of performance are evaluated and given a numerical rating of "1" through "5," with "1" representing the least degree of concern and "5" the greatest degree of concern. The six performance areas identified by the CAMELS acronym are: Capital adequacy, Asset quality, Management practices, Earnings performance, Liquidity position, and Sensitivity to market risk.) SCOR is one of the primary offsite monitoring tools used by the Federal Deposit Insurance Corporation (FDIC) to monitor the condition of insured financial institutions between safety and soundness examinations.

The audit objectives were to determine the effectiveness of SCOR as an early warning system and assess actions taken by the Division of Supervision and Consumer Protection (DSC) in response to early warning flags identified by SCOR. (Note: Effective July 1, 2002, the Division of Supervision and the Division of Compliance and Consumer Affairs were merged to form the new Division of Supervision and Consumer Protection (DSC). The DSC promotes the safety and soundness of FDIC-supervised institutions, protects consumers’ rights, and promotes community investment initiatives by FDIC-supervised insured depository institutions.) While SCOR is used to monitor the condition of all FDIC-insured financial institutions, we focused our review on those institutions for which the FDIC is the primary federal regulator. As of March 31, 2002, the FDIC was the primary federal regulator for approximately 5,400 of the 9,500 FDIC-insured financial institutions in the United States. Additional details on our objectives, scope, and methodology are presented in Appendix I.

BACKGROUND

Section 10(d) of the Federal Deposit Insurance Act (12 USC 1820(d)) requires a full-scope onsite examination of every insured financial institution at least once during each 12-month period. The examination interval may be extended to 18 months if the financial institution has total assets of $250 million or less and is well managed and well capitalized.

Also, the FDIC may rely upon the examinations performed by the state banking authorities for every other examination, meaning that an institution can go up to 3 years without an FDIC examination. To help bridge the gap between examinations, regulators use various offsite monitoring tools to stay abreast of the financial condition of institutions.

Offsite monitoring focuses on evaluating the financial condition and potential risks of insured depository institutions through data collection, analysis, and review. DSC case managers are the key players in the FDIC’s offsite monitoring program. If certain risk indicators are identified, such as an increase in past due loans, the case manager can determine whether supervisory attention is warranted before the next regularly scheduled examination.

During 1998, the FDIC implemented a new offsite rating tool, SCOR, to more effectively and efficiently monitor risk to the banking and thrift systems. The SCOR system replaced the Capital Asset Earnings Liquidity (CAEL) offsite monitoring system. SCOR uses quarterly Reports of Condition and Income (Call Reports) to identify institutions that could potentially receive a downgrade in their CAMELS ratings at their next safety and soundness examination. (Note: Call Reports are sworn statements of a bank’s financial condition that are submitted to supervisory agencies quarterly in accordance with federal regulatory requirements. Call Reports consist of a balance sheet and income statement and provide detailed analyses of balances and related activity.) To do this, SCOR uses statistical techniques to estimate the relationship between Call Report data and the results of the latest examination and, from that relationship, estimates the probability of an institution being downgraded at the next examination.
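This report does not specify SCOR's statistical form, but models of this kind are commonly fit as logistic regressions of downgrade outcomes on Call Report ratios. The following is a minimal, hypothetical sketch of the general technique; the ratio names and all coefficients are illustrative assumptions, not SCOR's actual specification.

    # Hypothetical sketch of a SCOR-style downgrade-probability model.
    # The logistic form and all coefficients are assumptions for
    # illustration; in practice the parameters would be estimated from
    # historical Call Report data paired with later examination results.
    import math

    INTERCEPT = -3.0
    COEFFICIENTS = {                    # illustrative, NOT SCOR's parameters
        "equity_capital_pct": -0.20,    # more capital -> lower downgrade risk
        "past_due_30_pct":     0.45,
        "nonaccrual_pct":      0.60,
        "net_income_pct":     -0.50,    # stronger earnings -> lower risk
    }

    def downgrade_probability(ratios):
        """Estimate P(downgrade at next exam) from Call Report ratios."""
        z = INTERCEPT + sum(COEFFICIENTS[k] * ratios[k] for k in COEFFICIENTS)
        return 1.0 / (1.0 + math.exp(-z))   # logistic link

    # Example: thin capital, weak earnings, rising past-due loans.
    print(downgrade_probability({"equity_capital_pct": 6.0,
                                 "past_due_30_pct": 3.5,
                                 "nonaccrual_pct": 2.0,
                                 "net_income_pct": -0.3}))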

SCOR calculates and displays an institution’s probability of downgrade for both composite and component ratings. According to the DSC Case Managers Procedures Manual, when a "1" or "2" rated institution shows a 30-percent or higher probability of being downgraded to a "3," "4," or "5," SCOR flags the institution for inclusion on an exception report, called an Offsite Review List (ORL). The ORL also includes "3" rated institutions that show a 30-percent or higher probability of being downgraded to a "4" or "5."
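A minimal sketch of this flagging rule, assuming a model that supplies the downgrade probabilities just described (the institutions and probabilities below are hypothetical):

    # Sketch of the ORL flagging rule: "1"- and "2"-rated institutions are
    # flagged at a 30-percent or higher probability of falling to "3," "4,"
    # or "5"; "3"-rated institutions at a 30-percent or higher probability
    # of falling to "4" or "5." Banks and probabilities are hypothetical.
    FLAG_THRESHOLD = 0.30

    def flag_for_orl(composite_rating, p_downgrade):
        """Return True if the institution belongs on the Offsite Review List."""
        return composite_rating in (1, 2, 3) and p_downgrade >= FLAG_THRESHOLD

    banks = [("Bank A", 2, 0.42), ("Bank B", 1, 0.08), ("Bank C", 3, 0.31)]
    orl = [name for name, rating, p in banks if flag_for_orl(rating, p)]
    print(orl)   # ['Bank A', 'Bank C']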

Quarterly, DSC case managers review the institutions identified in the SCOR ORL reports and provide a written analysis of the review to assistant regional directors for approval. The Case Managers Procedures Manual states that the case manager’s SCOR analysis will identify the reasons for the deterioration in any components identified by SCOR and recommend an appropriate follow-up response. The DSC field offices are provided copies of the SCOR analysis. FDIC examiners review SCOR and any analysis as part of their pre-examination planning. The Case Managers Procedures Manual includes state and other federal regulators on the SCOR analyses distribution list.

In addition to SCOR, the FDIC uses the Growth Monitoring System (GMS) and Real Estate Stress Test (REST) as its primary tools for offsite monitoring. GMS analyzes financial ratios and changes in dollar balances based on Call Report information to identify banks that have experienced rapid growth. REST is a model that measures a bank’s exposures to real estate lending by using the current Call Report data to forecast an institution’s condition over a 3- to 5-year horizon and scoring institutions on the CAMELS scale of 1 to 5.
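As a rough sketch of the growth-screening idea behind GMS (the actual GMS ratios and thresholds are not described in this report; the 10-percent cutoff below is purely hypothetical):

    # Hypothetical growth screen in the spirit of GMS: compare consecutive
    # Call Report balances and flag rapid growth. Threshold is illustrative.
    GROWTH_FLAG_THRESHOLD = 0.10   # 10% quarter-over-quarter, assumed

    def flag_rapid_growth(prior_assets, current_assets):
        """Flag institutions whose assets grew faster than the threshold."""
        growth = (current_assets - prior_assets) / prior_assets
        return growth >= GROWTH_FLAG_THRESHOLD

    print(flag_rapid_growth(200_000, 230_000))   # True: 15% in one quarter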

The FDIC does not consider offsite monitoring a substitute for bank examinations. The FDIC recognizes that the accuracy of its offsite monitoring systems is dependent on the accuracy of Call Report data. Those institutions that do not accurately report their financial condition may not appear as a potential problem on the SCOR system. Thus, if information contained in a bank’s Call Report does not reflect the true condition of the bank, the effectiveness of the early warning system is diminished.

RESULTS OF AUDIT

The effectiveness of the SCOR review program in detecting potential deterioration in the financial condition of insured depository institutions, as presently implemented, is limited for the following reasons:

  • A time lag of up to 4½ months exists between the date of the Call Report and the subsequent offsite review;
  • The SCOR system depends on the accuracy and integrity of Call Report information to serve as an early warning between examinations;
  • The SCOR system cannot assess management quality and internal control or capture risks from non-financial factors such as market conditions, fraud, or insider abuse; and
  • DSC case managers rarely initiate follow-up action to address probable downgrades identified by SCOR outside of deferring to a past, present, or future examination.

As a result, SCOR has not identified emerging supervisory concerns or provided early warnings of potential deterioration at the majority of financial institutions in our sample. Further, case managers are placing limited reliance on SCOR as an early warning system.

LIMITATIONS OF SCOR AS AN EARLY WARNING SYSTEM

Time Lags in the SCOR Review Process

The analysis of SCOR reports can take up to 4½ months after the "as of" date of Call Report information. According to the Case Managers Procedures Manual, the initial SCOR ORL is available approximately 45 days after the Call Report date. The SCOR deadline for the completed DSC case manager analysis and input of codes is 3 months after the initial SCOR ORL, as shown in the following table.

Table 1: SCOR Time Frames

Call Report Date    Initial SCOR ORL    SCOR Deadline
March 31            May 15              August 15
June 30             August 15           November 15
September 30        November 15         February 15
December 31         February 15         May 15

Source: DSC Case Managers Procedures Manual

For example, with a March 31 Call Report, the SCOR ORL is available May 15, and the SCOR deadline is August 15, 4½ months after the Call Report date. If case managers use the entire time frame to complete this analysis, there is generally not enough time between examinations for the case managers to recommend onsite activity such as a visitation or acceleration of the next examination. Therefore, case managers cannot verify whether the early warning of a potential downgrade is valid and take appropriate action.
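The lag is straightforward date arithmetic; a minimal sketch of the March 31 example above, using the time frames from Table 1:

    # Sketch of the SCOR review timeline for a March 31 Call Report: the
    # initial ORL is available about 45 days later, and the case manager
    # analysis is due 3 months after that, roughly 4.5 months in total.
    from datetime import date, timedelta

    call_report = date(2002, 3, 31)
    orl_available = call_report + timedelta(days=45)        # May 15
    # Naive month addition; valid here because May + 3 stays in the year.
    analysis_deadline = orl_available.replace(month=orl_available.month + 3)
    print(orl_available, analysis_deadline,
          (analysis_deadline - call_report).days, "days")   # 137 days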

FDIC-supervised institutions are on a 12- or 18-month examination cycle with alternating examinations usually performed by state banking departments. The SCOR review program generally operates on a 4½-month cycle between the Call Report date and the case manager’s analysis due date. Once SCOR flags an institution and the SCOR analysis is prepared, typically an examination either (1) has occurred within the last 3 to 6 months, (2) is currently taking place, or (3) is planned in the next 3 to 6 months. For those SCOR-flagged institutions we reviewed, we found that the case managers’ analysis either deferred to the most recent examination and/or reasoned that any deterioration, if not already known, would be detected in the next examination. As a result, it appears that limited reliance is placed on SCOR because time lags in processing and analyzing data often render the information meaningless for early warning purposes.

The Chairman of the FDIC has begun an initiative to modernize the Call Report process and improve the flow of bank data. Part of that initiative will include reducing the turn-around time for processing Call Report data. Currently it takes 45 days after quarter-end before Call Report data are processed and available for offsite monitoring purposes. Under the Chairman’s initiative, that time will be reduced to a few days after quarter-end.

In conjunction with shortening the time needed to process Call Reports, DSC may want to reevaluate its time frames for analyzing SCOR ORLs. In our opinion, the 3 months currently allowed could be reduced. When SCOR replaced CAEL in 1998, the FDIC did not update the 3-month time frame used by case managers to analyze the institutions flagged by offsite monitoring. Instead, the FDIC relied on the CAEL time frames outlined in Transmittal Number 94-021, dated February 7, 1994. On average, we found case managers completed their analysis 70 days after the initial ORL. However, based on our interviews, case managers did not need the entire time frame: the analysis itself took only 1 to 2 days to perform. Unless the time frames can be shortened, the usefulness of SCOR is diminished because of questions concerning the currency of information and proximity to the next examination.

Dependence on Accurate Call Report Data

The effectiveness of SCOR is dependent on the accuracy and integrity of Call Report data. SCOR uses various ratios based on Call Report data to calculate an institution’s probability of downgrade for ratings. Institutions submit Call Report data to the FDIC and, absent errors detected in processing or examiners identifying a problem within records underlying that data, the information is accepted and used for offsite monitoring and reporting purposes. As a result, if institutions do not accurately report their condition in Call Reports, SCOR indicators may not be an accurate reflection of an institution’s actual condition.

SCOR is a financial-data-driven model that uses quantitative techniques to translate various indicators of bank strength and performance into estimates of risk. Inherent in SCOR is the assumption that there is a regular relationship between a bank’s financial condition and the CAMELS rating from an onsite examination. Altogether, SCOR measures 13 ratios to determine ratings. These ratios compute each of the following items as a percentage of assets (a minimal computational sketch follows the list):

  • Total Equity Capital
  • Past Due Loans 30 Days
  • Non-accrual Loans
  • Net Charge-offs
  • Net Income
  • Volatile Liabilities
  • Loans and Long-term Securities
  • Loan Loss Reserve
  • Past Due Loans 90 Days
  • Other Real Estate Owned
  • Provision for Loan Losses
  • Cash Dividends Declared
  • Liquid Assets
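Each input is simply the Call Report line item expressed as a percentage of total assets. A minimal sketch, with illustrative field names rather than actual Call Report schedule codes:

    # Sketch: the 13 SCOR inputs as percentages of total assets.
    # Field names are illustrative, not Call Report schedule codes.
    SCOR_ITEMS = [
        "total_equity_capital", "past_due_loans_30", "nonaccrual_loans",
        "net_charge_offs", "net_income", "volatile_liabilities",
        "loans_and_long_term_securities", "loan_loss_reserve",
        "past_due_loans_90", "other_real_estate_owned",
        "provision_for_loan_losses", "cash_dividends_declared",
        "liquid_assets",
    ]

    def scor_ratios(call_report):
        """Compute each SCOR input item as a percentage of total assets."""
        assets = call_report["total_assets"]
        return {item: 100.0 * call_report.get(item, 0.0) / assets
                for item in SCOR_ITEMS}

    # Hypothetical bank, dollar amounts in thousands.
    print(scor_ratios({"total_assets": 250_000,
                       "total_equity_capital": 20_000,
                       "past_due_loans_30": 3_000,
                       "net_income": 1_200}))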

In our sample of 94 institutions that had been downgraded by an examination, SCOR did not "flag" 67 (71 percent) of the institutions as potential problems before that examination. Moreover, we noted that at least 16 of these institutions had been downgraded two or three composite CAMELS ratings, yet had not been flagged in advance by SCOR. The following table shows the extent of the decline in CAMELS ratings for the 16 institutions:

Table 2: Composite Rating Changes on Sampled Institutions

Composite Rating Change    Number of Institutions
1 to 3                     5
1 to 4                     3
2 to 4                     8
Total                      16

Source: OIG analysis of composite ratings of sampled institutions.

To gain an understanding as to why these 16 institutions may not have been flagged by SCOR, we reviewed examination files and found that inaccurate Call Report data coupled with institution management problems were evident in each case. For example, in several cases institution management understated adversely classified assets, and the loan loss reserves were in turn underfunded. In each case, the bank failed to identify its loan problems, thus overstating its earnings and capital. Once examiners identified these problems during onsite examinations and the banks made the appropriate adjustments, their financial condition warranted a CAMELS downgrade. When an institution fails to recognize asset problems in its Call Reports, key SCOR indicators such as provision for loan losses, loan loss reserve, earnings, and capital will not reflect accurate balances. As a result, inaccurate Call Reports impede early warning of troubled banks.

Limitations on Ability to Assess Management Quality and Related Risks

The evaluation of bank management remains largely outside the realm of offsite monitoring systems. SCOR performs quantitative analyses based on information contained in Call Reports. Although SCOR provides a management rating, it is based on the bank’s financial performance as opposed to qualitative factors. The FDIC’s preferred approach to assessing qualitative factors such as management quality and internal control, risk management systems, and underwriting standards is through onsite examinations. SCOR also does not capture risks from non-financial factors such as market conditions, fraud, or insider abuse. These factors, in turn, can also impact the ratings assigned by the examination to other CAMELS components. Accordingly, all offsite monitoring systems are limited in their ability to serve as early warning systems.

In many of the examination reports we reviewed, examiners noted inexperience and lack of ability at the senior management and board of directors level as a significant circumstance leading to examination downgrades. The Management component was downgraded for all 16 institutions receiving examination downgrades of two or more composite CAMELS ratings that were not previously flagged by SCOR. In 13 of the 16 cases, the Management component was downgraded by two or more ratings. Criticisms of management taken from the downgrade examinations included comments such as:

  • Board oversight and executive officer performance in all areas of the lending function is weak.
  • Management supervision is unsatisfactory, and senior management’s ability to correct deficiencies in a timely manner is questionable.
  • President and senior management engaged in new and high-risk activities without sufficient Board supervision, due diligence, and adequate policies.
  • Management continued its failure to recognize or adequately incorporate the complexity of asset securitization into the bank’s risk management system.
  • Management’s current methodology for identifying and monitoring problem loans is inadequate.
  • Board supervision of the bank’s subprime lending is inadequate.
  • Management concentrated a large portion of the loan portfolio in limited credit relationships resulting in adversely classified loans and apparent violations of lending limit regulations.

These significant management deficiencies eventually led to problems at these institutions. In the next section of this report, we discuss the importance of integrating known management deficiencies into SCOR offsite analyses.

DSC Responses to Early Warnings

DSC case managers rarely initiated follow-up action to address probable downgrades identified by SCOR outside of deferring to a past, present, or future examination. The Case Managers Procedures Manual requires that the case managers review institutions and recommend an appropriate response. However, the case managers appeared reluctant to recommend a response, instead relying almost exclusively on past or future examinations to confirm or dispute the SCOR ratings. Therefore, DSC may have missed opportunities for early intervention.

Quarterly, DSC case managers review the institutions identified in the SCOR ORL reports and provide a written analysis of the review. The Case Managers Procedures Manual states that the case manager’s SCOR analysis will identify the reasons for the deterioration in any components identified by SCOR and recommend an appropriate follow-up response. The case managers assign an action code and a follow-up code for each institution on the ORL. The SCOR Offsite Review Program Manual refers to Transmittal Number 94-021 for detailed instructions on the SCOR action codes. According to Transmittal Number 94-021, dated February 7, 1994, the case managers designate a "B" when the institution is in better condition than the system indicates. A "C" supervisory concern code is used when the case manager’s review shows that the SCOR composite rating is appropriate, or should be worse.

The case managers also designate a follow-up code. An "N" indicates that no follow-up action is required and an "F" indicates that a follow-up action has taken place or will take place. Follow-up actions may include onsite visitations; discussions with the institution's management; communication with other federal banking agencies and state authorities; or continued offsite analysis of the institution.
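A minimal sketch of this coding scheme as summarized above from Transmittal Number 94-021 (the surrounding data structure is illustrative only):

    # Sketch of the SCOR action and follow-up codes described above.
    # The codes follow the report's summary of Transmittal Number 94-021;
    # the data structure itself is an assumption for illustration.
    from dataclasses import dataclass
    from enum import Enum

    class ActionCode(Enum):
        BETTER = "B"    # institution is in better condition than SCOR indicates
        CONCERN = "C"   # SCOR composite rating is appropriate, or should be worse

    class FollowUpCode(Enum):
        NONE = "N"        # no follow-up action required
        FOLLOW_UP = "F"   # follow-up has taken place or will take place

    @dataclass
    class OrlReviewEntry:
        """One case manager determination for an ORL-flagged institution."""
        institution: str
        action: ActionCode
        follow_up: FollowUpCode
        notes: str = ""

    print(OrlReviewEntry("Hypothetical Bank", ActionCode.CONCERN,
                         FollowUpCode.FOLLOW_UP,
                         "Visitation recommended before next examination."))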

The follow-up codes entered into SCOR by case managers generally depend on whether an examination is recent (no follow-up needed) or in the near future (follow-up is the next onsite examination). Of 94 FDIC-supervised institutions downgraded to a "3," "4," or "5," SCOR flagged 27 institutions (29 percent) as having potential problems. Of these 27 cases, only 5 were identified as a concern that required follow-up action. The follow-up action for three of the five cases with a concern was that an examination was scheduled to take place in the next 2 to 6 months and the case manager would wait until the completion of the examination to determine whether the bank had a problem. The follow-up action on the other two cases was to (1) monitor the next ORL to see if the institution was flagged again and (2) call the bank in the following quarter.

According to the DSC Manual of Examination Policies, "the quality of management is probably the single most important element in the successful operation of a bank." DSC case managers were aware that 22 (including the 5 cases flagged as follow-up) of the 27 flagged institutions had management problems based on the previous examination, such as weaknesses in loan administration policies, asset management, and asset quality. For example, one case manager’s analysis stated "management efficiency and appropriate directorate oversight was deemed to be lacking…Credit administration/underwriting was mediocre." In another instance, the case manager’s analysis stated "the State authority recently conducted an exam. The exam reflected a deteriorated financial condition due to extremely weak loan administration." However, none of the 22 institutions that had known management problems had a visitation recommended. The only institution that had a visitation performed as part of the SCOR analysis had no known problems.

Rather than acting on SCOR flags, case managers used the onsite examination and the examination cycle almost exclusively as the basis for following up on the institutions’ potential deterioration. Some case managers we interviewed expressed the following views:

  • SCOR does not depict an accurate picture for the majority of flagged institutions. The case managers generally know the banks' problems before they are flagged by SCOR.
  • SCOR is useful but has limitations. It is just one source.
  • The Management component cannot be assessed until the examination occurs.
  • SCOR did not give an early warning for the bank's deterioration. The case manager was aware of the deterioration before SCOR generated a report.

As a result, DSC case managers are generally not using SCOR as an early warning system of potential deterioration in FDIC-supervised institutions. Instead, the case managers appear to be relying on the onsite examination to detect deterioration in the safety and soundness of institutions.

Conclusion

Innovation, deregulation, and advances in technology have contributed to making the banking business more complex and potentially riskier. The quick demise of several banks in the past several years underscores the importance of timely data and early intervention when potential problems are evident. Although SCOR is intended to assist the FDIC in achieving such intervention, it appears the processing of SCOR information as currently performed is largely unproductive. This may be due in large part to the time allowed to process and review information after the Call Report date – up to 4½ months. During this time frame, an examination often has either been completed or will be started soon which typically results in no further follow-up by case managers. Also, there appears to be reluctance by case managers to proactively follow up on the SCOR flag by either communicating with the bank or requesting that an examiner go onsite to assess any potential problems. This raises questions about the effectiveness of SCOR as an early warning system as currently implemented.

We believe integrating information associated with bank management from the previous examination into the SCOR offsite review could enhance the offsite monitoring process. History has shown that in almost every bank failure, the problems could ultimately be traced back to poor management. Poor management practices may go undetected, particularly in strong economic times, and it can take a long time before the problems are evident in reported financial information. Where bank examiners have identified management weaknesses and the bank is subsequently flagged by SCOR, the institution poses unusual risk, and case managers should be encouraged to recommend onsite activity or other interaction with the bank before the next scheduled examination.

Recommendations

We recommend that the Director, DSC:

  1. Assess the usefulness of SCOR as an early warning system as it is currently being implemented. Case managers should participate in such an assessment and provide input on whether use of the system should continue given the current limitations on its effectiveness.

If DSC determines that SCOR should continue as part of the offsite monitoring program, we recommend that the Director, DSC:

  2. Revise SCOR procedures to require that the DSC case manager analyses be performed within shorter time frames than allowed by the current procedures.

  3. Instruct case managers to more often recommend onsite activity or other interactions with the institution as a follow-up action for those institutions flagged by SCOR that also have previously identified management weaknesses.

CORPORATION COMMENTS AND OIG EVALUATION

On September 20, 2002, the DSC Director provided a written response to the draft report. The response is presented in Appendix II to this report. We also held follow-up discussions with DSC staff to clarify aspects of the response. DSC concurred with the report’s three recommendations.

DSC substantially agreed with recommendation 1. While recognizing SCOR's limitations, DSC has concluded that SCOR along with other offsite models provides the most effective and efficient method for identifying emerging risk. DSC believes that SCOR is one of the best offsite models in use by the regulatory community, stating that other federal regulatory agencies and most state banking authorities also use SCOR. Additionally, case managers and examiners make use of offsite data, including SCOR, to pinpoint areas of review in the pre-examination planning process and the risk-focused examination process. DSC stated that it would continue to evaluate the usefulness of SCOR and other risk monitoring tools. DSC also agreed with the OIG that case manager participation in assessing the offsite program is necessary and stated that case managers are involved in various aspects of assessing and improving the FDIC's offsite program, including SCOR. Also, the Director noted that the regulatory agencies have a process in place through the Federal Financial Institutions Examination Council (FFIEC) Surveillance Task Force to discuss each agency’s monitoring methodologies and programs. We consider the Director's comments responsive to our recommendation. This recommendation is resolved, dispositioned, and closed.

DSC concurred with recommendation 2 and stated that near the end of 2002 it will reduce the time frames for offsite reviews from 75 days to 45 days after Call Report data is final. In addition, when the Chairman’s initiative for shortening the time needed to process Call Reports is instituted, the delay in receiving final Call Report data will be reduced from the current 60 days. This recommendation is resolved but will remain undispositioned and open until we have determined that agreed-to corrective action has been implemented and is effective.

For recommendation 3, DSC agreed that institutions flagged by SCOR and other offsite monitoring tools, with previously identified management weaknesses, may pose increased risk. DSC guidance instructs case managers to analyze, monitor, and report upon significant changes in risk profiles. In response to this recommendation, DSC stated that future regional office reviews will include a focus on case manager adherence to established offsite review policy. In follow-up discussions with DSC staff, they indicated that they would expand their regional office review program for 2003 to address the concerns noted in the audit report. These actions will help to ensure that case managers are recommending onsite activity or other interactions with the institution when warranted. This recommendation is resolved but will remain undispositioned and open until we have determined that agreed-to corrective action has been implemented and is effective.

In addition to responding to the three recommendations, the Director had further comments on certain aspects of the report. Those comments generally reiterated the report’s position that SCOR is dependent on the accuracy of Call Report data and cannot assess non-financial factors such as fraud and insider abuse.

Regarding the time lag of up to 4½ months between the Call Report date and subsequent offsite review, DSC stated that in most cases, case managers start the review process as soon as the Offsite Review List is available – approximately 60 days (2 months) after the Call Report date. Since case managers do not document when the review starts but rather when they finish it, we cannot address that observation. However, in our sample, completed offsite review analyses averaged over 110 days (3½ months) and in some instances ranged up to 4½ months after the Call Report date.

Lastly, DSC believes "it is not ‘rare’ that appropriate follow-up actions are taken using all the analysis tools and information available." The OIG agrees that deferring to a past, present, or future examination is oftentimes an acceptable follow-up action. Our concern is that case managers may be over-relying on that particular action and, as a result, may be missing opportunities for early intervention.


APPENDIX I

OBJECTIVES, SCOPE, AND METHODOLOGY

The audit objectives were to determine the effectiveness of SCOR as an early warning system and assess actions taken by DSC in response to early warning flags identified by SCOR. To accomplish our objectives, we reviewed the sections in the DSC Case Managers Procedures Manual related to the SCOR Offsite Review Program, the SCOR Manual, DSC Manual of Examination Policies, DSC regional directors memoranda, and DSC quarterly management reports. To understand the differences in offsite monitoring procedures around the country, we visited the eight regional offices located in Atlanta, Georgia; Boston, Massachusetts; Chicago, Illinois; Dallas, Texas; Kansas City, Missouri; Memphis, Tennessee; New York, New York; and San Francisco, California. At the regional offices, we interviewed a total of 23 DSC case managers and two assistant regional directors based on institutions selected and availability of personnel. We conducted the interviews to obtain an understanding of the region’s offsite monitoring program.

We focused on the 204 FDIC-supervised institutions that had a composite rating of "3," "4," or "5" as of October 2000 and that had a rating downgrade from examinations performed between October 1998 and October 2000. We did not include new institutions in our review. SCOR exception reports are not generated for new institutions because they are already subject to an increased level of supervision for their first 3 years of operation.

We reviewed all "5" rated institutions and judgmentally sampled "3" and "4" rated institutions. We judgmentally selected "3" and "4" rated institutions by asset size and composite rating changes, such as a composite rating change from a "1" to a "3." In addition, we selected institutions from all FDIC regions in order to get a nationwide perspective. Table 3 shows the regions and number of FDIC-supervised institutions reviewed.

Table 3: Number of FDIC-Supervised Institutions Reviewed by OIG

FDIC Region Composite "3" Rating Composite "4" Rating Composite "5" Rating Total Institutions
Atlanta 9 3   12
Boston 6     6
Chicago 10 3 1 14
Dallas 10 5 2 17
Kansas City 9 4   13
Memphis 10 1   11
New York 8 2   10
San Francisco 8 2 1 11
Total 70 20 4 94

In total, we reviewed 94 FDIC-regulated institutions, or 46 percent of the 204 "3," "4," and "5" rated institutions downgraded between October 1998 and October 2000. At each regional office, we reviewed the documentation in institution files that supported offsite monitoring between the dates of the prior examination and the examination that downgraded the institution. Documents reviewed included SCOR and other offsite analyses, reports of examination, one visitation report, telephone conversations with bank management, bank reports, ratings change and confidential problem bank memoranda, and other case manager documents.

We reviewed the prior and downgraded examinations’ composite and component ratings of those institutions not flagged by SCOR. We selected and analyzed those institutions that were downgraded two or more composite ratings between examinations to determine the underlying deficiencies and why the SCOR system did not flag the institution. We counted the institutions that SCOR flagged between the dates of the prior examination and the downgrade examination and calculated the percentage of actual downgrades that the system flagged for early warning to the DSC case managers.

We reviewed for consistency the action and follow-up codes given by DSC case managers for institutions flagged by SCOR. We evaluated the timing of the "better" than SCOR and "concern" action codes as compared to the downgrade of the institution. We also scheduled all action codes for all ORLs on which the institution was flagged and evaluated the codes given when the DSC case manager knew about the institution downgrade.

The limited nature of the audit objectives did not require that we (1) test for fraud or illegal acts, (2) test for compliance with laws and regulations, or (3) determine the reliability of computer-processed data obtained from the FDIC’s computerized systems. We performed a limited review of internal controls over SCOR, excluding the controls over the SCOR model and the application controls of that system. We evaluated whether all institutions on the ORLs had analyses prepared within the designated time frames. We also performed limited testing of the information extracted from SCOR. We did not find any problems in our limited testing of SCOR internal controls. Additionally, we reviewed the 2000 Annual Performance Plan goal for the offsite review program, for which DSC reported that it met the performance goal of: "100% of supervisory concerns noted during offsite reviews of insured depository institutions are resolved without further action or are referred for examination or other supervisory action." We conducted the audit in accordance with generally accepted government auditing standards from January 2001 through July 2002.


APPENDIX II

CORPORATION COMMENTS

September 20, 2002

TO: Stephen M. Beard, Deputy Assistant Inspector General, Office of the Inspector General

FROM: Michael J. Zamorski [Electronically produced version; original signed by Michael J. Zamorski], Director, Division of Supervision and Consumer Protection

SUBJECT: Draft Report on the Statistical CAMELS Offsite Rating Review Program for FDIC-Supervised Banks (SCOR) (Assignment No. 2001-100)

The Division of Supervision and Consumer Protection (DSC) appreciates the opportunity to respond to this draft report. DSC believes that a number of observations under the heading Results of Audit deserve clarification. The report concludes with one primary recommendation and two supporting recommendations. DSC substantially agrees with the recommendations. Our continuing self-improvement processes adapt our existing tools, such as SCOR, to changing and emerging risk in the financial services industry. DSC believes our continued review and assessment of SCOR as part of our comprehensive off-site monitoring program fulfills the primary recommendation to assess the usefulness of SCOR as an early warning system.

We substantially agree with the supporting recommendations to shorten the timeframe within which SCOR analyses are conducted and to consider onsite activity at institutions with previously identified management weaknesses. We currently include Case Managers in the design process, and we have already taken steps to reduce the time frame associated with offsite reviews. We are confident that our current case management guidance is adequate, given that SCOR is to be used in conjunction with all other off-site risk monitoring tools and is not designed to stand alone as a predictive measure. Existing guidance adequately directs increased supervisory monitoring, including onsite presence, based upon both comprehensive use of monitoring tools and thorough analysis of an institution’s historical and emerging risk profile.

OIG’s Recommendations

The following three recommendations have been proposed by the OIG in the draft Report:

(1) Assess the usefulness of SCOR as an early warning system as it is currently being implemented. Case managers should participate in such an assessment and provide input on whether use of the system should continue given the current limitations on its effectiveness. If DSC determines that SCOR should continue as part of the offsite monitoring program, we recommend that the Director, DSC:

DSC’s Response

DSC substantially agrees with this recommendation. DSC continually evaluates the usefulness of SCOR and all other risk monitoring tools. DSC believes that SCOR is one of the best offsite models in use by the regulatory community. The Office of the Comptroller of the Currency (OCC), the Federal Reserve, and most state banking authorities use SCOR. We do recognize its limitations; however, SCOR along with the other models provides the most effective and efficient method for identifying emerging risk. A review of the "1" and "2" rated institutions examined and downgraded in 2000 indicates that 182, or 63 percent, were identified prior to the downgrade by one of the FDIC’s offsite models: SCOR, the Growth Monitoring System (GMS), and the Real Estate Stress Test (REST). These models influence the on-site examination process primarily via the pre-examination planning function. All three offsite models use quarterly financial reporting (Call Report and Thrift Financial Report (TFR)) data to highlight different aspects of institution performance levels and trends. Case Managers and examiners make use of offsite data to pinpoint areas of review in the pre-examination planning process and the risk-focused examination process. Results from offsite data do not provide an in-depth analysis of all areas of institution performance. Instead, they identify areas of relative strength as well as weakness at an individual institution. Offsite data is not a substitute for examiner judgment, but offers examiners an opportunity to more quickly assess an institution’s risk profile and devote more resources to areas of weakness or higher risk.

In response to the recommendation, DSC is continually evaluating its offsite models and emerging technology in a constant effort to improve the offsite-monitoring program. We agree that Case Manager participation in assessing the offsite program is necessary. Case Managers are involved in various aspects of assessing and improving the FDIC’s offsite program, including SCOR. The regulatory agencies also have an active process in place through the Federal Financial Institutions Examination Council (FFIEC) Surveillance Task Force to discuss each agency’s monitoring methodologies and programs. Finally, we continue to review ways to incorporate management weaknesses into the offsite process. We do agree that poor management usually is an early warning of more severe problems.

(2) Revise SCOR procedures to require that the DSC case manager analyses be performed within shorter timeframes than allowed by the current procedures.

DSC’s Response

DSC concurs with this recommendation. DSC will be releasing a new offsite module under ViSION (Virtual Supervisory Information on the Net) near year-end 2002. With that release we will be reducing the timeframes for offsite reviews from 75 days after finalized financial data to 45 days.

(3) Instruct Case Managers to more often recommend onsite activity or other interactions with the institution as a follow-up action for those institutions flagged by SCOR that also have previously-identified management weaknesses.

DSC’s Response

DSC substantially agrees with this recommendation. Existing policy and procedure already cover this recommendation. DSC believes that current offsite monitoring tools, guidance, and the processes already in place to continue to develop and enhance those tools and guidance are effective. Case Manager recommendations are already based upon a review of supervisory information, of which offsite data represents only a small portion. The current offsite-monitoring program adequately monitors risk between onsite examinations. DSC agrees that institutions flagged by SCOR and other off-site monitoring tools, and with previously identified management weaknesses, may pose increased risk. DSC guidance instructs Case Managers to analyze, monitor, and report upon significant changes in risk profiles. In response to this recommendation, Case Manager adherence to established offsite review policy and procedure will be a focus of future Regional Office Reviews.

Audit Observations and DSC Comments
The following items were identified in the report, and we believe these statements need further clarification.

1. A time lag of up to 4 1/2 months exists between the date of the Call Report and the subsequent offsite review;

DSC’s Response

The OIG’s statement is not totally accurate. The Case Managers’ review process does not begin at the conclusion of the 4 1/2 months, but when the Call Report is finalized (approximately 60 days). In most cases, the Case Managers begin the review process as soon as the Offsite Review List becomes available (approximately day 60). The Case Managers use this time to develop an appropriate supervisory strategy, which may include scheduling an examination/visitation, contacting the field office, bank, or primary regulator, reviewing correspondence, or waiting for the results of a pending or scheduled examination. Final memoranda may not be dated until the end of the review cycle to capture any late developments. If the Case Manager believes that the financial condition of an institution has deteriorated, a request can be made to accelerate the examination or schedule a visitation, which may be converted into an examination if conditions warrant.

DSC is planning to reduce the review time when the enhanced system becomes available near year-end 2002. In addition, with the Chairman’s initiative to reduce the time to make the Call Reports available, the delay associated with receiving finalized financial data will be reduced from the current 60 days.

2. The SCOR system depends on the accuracy and integrity of Call Report information to serve as an early warning between examinations;

DSC’s Response

SCOR is a financial condition model that was developed through a process of testing numerous financial measures for statistical significance over long periods. The result is a statistically reliable model to identify financial condition for all insured institutions. SCOR is reliant on onsite examinations to review the integrity of the reported financial information. However, the offsite models are only as good as the financial information reported by the bank.

DSC does not consider offsite monitoring a substitute for onsite examinations. SCOR and other offsite models cannot identify insider abuses or fraudulent financial data. SCOR was not designed for these purposes.

3. The SCOR system cannot assess management quality and internal control or capture risks from non-financial factors such as market conditions, fraud, or insider abuse;

DSC’s Response

We agree that SCOR cannot identify insider abuses nor detect fraud. However, SCOR was not designed for these purposes. By design, the model does not rely on limited, non-standard measures. Case Managers are responsible for bringing qualitative information about bank management into their review.

SCOR was designed as a financial condition model. SCOR indirectly forecasts a rating for the management component using the same technique as it does for the other ratings. The model examines the Call Reports of institutions where examiners have actually assigned poor management ratings. SCOR has found that examiners give poor management ratings to institutions with low earnings, low reserves for loan losses, and high levels of past due and problem loans. Because SCOR measures the condition (financial ratios) compared to examination activity, SCOR can produce reasonably accurate forecasts of management ratings. DSC has considered using the management rating to flag additional institutions; however, this cannot be accomplished without increasing error rates. It is not effective or efficient to flag institutions using limited management measures without further analysis.

Regarding non-financial factors such as market conditions, the FDIC’s Division of Insurance and Research has tested external economic measures and cannot find a correlation between bank failures and any particular economic measures. There were no statistically significant external variables that could be isolated and utilized in the model.

4. DSC Case Managers rarely initiate follow-up action to address probable downgrades identified by SCOR outside of deferring to a past, present, or future examination.

DSC’s Response

To initiate follow-up actions, Case Managers use all off-site monitoring tools, examination findings, historical data, public data, and direct contact with institutions. Despite the audit’s emphasis on SCOR and prior examination findings, DSC believes it is not "rare" that appropriate follow-up actions are taken using all the analysis tools and information available.
