United States Department of Veterans Affairs

HOUSE COMMITTEE ON VETERANS’ AFFAIRS
SUBCOMMITTEE ON DISABILITY ASSISTANCE AND MEMORIAL AFFAIRS
MARCH 24, 2010
STATEMENT OF BRADLEY G. MAYES
DIRECTOR, COMPENSATION AND PENSION SERVICE,
VETERANS BENEFITS ADMINISTRATION, U.S. DEPARTMENT OF VETERANS AFFAIRS

Mr. Chairman and members of the Subcommittee, thank you for providing me with this opportunity to discuss the Veterans Benefits Administration’s (VBA) Quality Assurance Program and the positive effect it has on the processing of Veterans’ disability claims.  Joining me today are Edna MacDonald, Acting Deputy Director of Compensation & Pension Service for Policy and Procedures, and Terence Meehan, Director of VBA Employee Development and Training. 

The Subcommittee has indicated a special interest in our Systemic Technical Accuracy Review (STAR) Program.  However, I want to emphasize that STAR is only one of four tiers of our multi-faceted national Quality Assurance Program.  The STAR component focuses on claims processing accuracy, while the other three components address regional office oversight, rating consistency, and special focus reviews.  Together, these four components work to ensure high-quality and consistent decisions for Veterans.  Before discussing the STAR Program, I will briefly discuss the three other facets of our Quality Assurance Program, which endeavors to ensure that compensation and pension benefits are provided in a timely, accurate, and consistent manner.

Quality Assurance Program

Oversight 

The oversight component involves compliance oversight visits to regional offices by members of Central Office site survey teams.  Each regional office is visited on a three-year rotating basis.  All operations of the visited regional office are reviewed and evaluated, with recommendations provided for improving claims processing efficiency, accuracy, and timeliness.  Additionally, the site visit team assesses whether the regional office is in compliance with VBA policy and procedures and consistent with national standards.  A Web site has recently been created to share the “best practices” identified during site visits.

Consistency 

The consistency component is based on mining of source data from the VBA corporate database that houses information from all regional office rating decisions.  Given the possibility for variation in disability decisions, it is incumbent on VA to ensure program integrity by having a credible system for identifying indications of inconsistency among its regional offices and then remedying any inconsistencies found to be unreasonable.  The two key pieces of information obtained are the grant rate and the evaluation distribution. 

Data analysis of recently completed rating decisions identifies the most frequently rated diagnostic codes, and plots both the grant/denial rate and evaluation assigned across all regional offices.  This information focuses on rating decisions for specific disabilities and provides a method to evaluate consistency and accuracy on a regional office level.   A focused review of files from the regional offices that are above or below the national average is conducted with the goal of identifying causes for the statistical anomaly.  Once root causes are identified, the regional offices are notified of any recommendations, which may include training to correct problems or improper procedures identified during the review. 

Special Focus Reviews

Special focus reviews address topics of special interest to VBA or other stakeholders where accuracy and consistency are an issue and are conducted as needed in support of VA’s mission and needs.  They address a specified purpose or type of claim and may involve a nationwide review or a review of work assigned to a specific regional office.  These ad hoc reviews can be one-time or recurring in nature.  For example, consolidation of the processing of radiation claims began on October 16, 2006.  In 2008 the STAR staff conducted a special focus review of radiation claims completed between October 2006 and October 2007.  The findings from this review provided a means for assessing the consolidation effort.  The review found that the overall accuracy and processing timeliness of radiation claims improved following consolidation.

STAR 

STAR is the quality assurance component that focuses on claims processing accuracy.  STAR reviews are focused on outcomes for Veterans rather than specific processes by which the outcomes are reached.  STAR reviews evaluate the quality of the rating decision product that VBA provides for Veterans.  From the Veteran’s perspective, there is an expectation that we understand the claim, evaluate it accurately and fairly, and provide proper compensation under the law.  The purpose of STAR reviews is to ensure that rating decision outcomes meet these expectations.  

The STAR system includes review of three types of work: claims that usually require a rating decision; authorization work that does not generally require a rating decision; and fiduciary work.  The review of rating-related decisions and authorization actions is a benefit entitlement review that uses a structured checklist to ensure all issues were addressed, claims assistance was provided, and the decision was correct (including a correct effective date).  Accuracy results are calculated from the benefit entitlement review.  STAR findings provide statistically valid accuracy results at both the regional office and national level.  In addition, quality reviews of the accuracy of VBA examination requests and VHA examination reports are conducted in collaboration with the Compensation and Pension Examination Program (CPEP) office.

VBA continues to focus on expanding and improving the STAR Program.  In 2008 the STAR staff was consolidated in Nashville, which provided space for expansion and allowed aggressive recruitment of additional STAR staff.  Since then, STAR has completed an extensive expansion effort, more than doubling the staff and increasing the sample size to obtain a statistically valid sample at the regional office level.  In 2010, quality review of fiduciary cases was transferred to Nashville.  During fiscal year (FY) 2009, 24,747 cases were reviewed for rating and authorization accuracy, and 3,671 cases were reviewed for fiduciary accuracy.  The targeted number of cases for STAR review in FY 2011 is 37,932.  The STAR sample was expanded in 2009 to include a review of brokered work completed by VBA’s 12 Resource Centers and one Tiger Team (a Cleveland-based team focused on processing a unique subset of claims), and sampling was increased for the Pension Management Centers to allow measurement of pension entitlement decisions.  Ongoing reviews of Disability Evaluation System cases and Appeals Management Center cases became part of the monthly compensation quality sample in FY 2009.

STAR error trends are identified and used as training topics.  Training continues to be a priority and is conducted using a variety of methods, including a monthly national Quality Call, where Compensation & Pension Service’s training, policy, and procedures staffs collaborate with the STAR staff to address national error trends identified in STAR assessments.

To ensure the accuracy of STAR findings, a second-level peer review of all comments was implemented in FY 2009.  Regional offices are provided explanations of all error calls, and they are required to take corrective action.  On a quarterly basis, regional offices are required to certify to VBA headquarters the corrective action taken for all errors identified by STAR.  The reported actions are validated during the oversight visits conducted by the site survey teams.

Section 224 of the Veterans’ Benefits Improvement Act of 2008 (PL 110-389) required VA to contract with a third-party entity to conduct an assessment of the quality assurance program, evaluate a sample of employees’ work, measure the performance of VA regional offices and accuracy of rating, assess employee and manager performance, and produce data to help identify trends.  VBA contracted with the Institute for Defense Analyses (IDA) to conduct this assessment.  IDA furnished preliminary findings concerning its evaluation of the national accuracy of work, the regional office accuracy of work, the accuracy of disability ratings, and the consistency of disability ratings.  Its preliminary findings include an assessment that the current STAR accuracy review program is adequately designed to estimate, within a 5 percent margin of error, the percentage of claims completed that contain errors both nationally and by regional office.  IDA is continuing to determine options for identifying the accuracy and consistency of disability rating decisions.

VBA anticipates that further improvements in the STAR Program, as well as other components of the Quality Assurance Program, will result from recommendations from the IDA study.  We are looking forward to the recommendations from IDA and working collaboratively with them to further improve our Quality Assurance Program.

Quality Assurance and Training

Training Improvements

VBA is committed to using the error trends and accuracy findings to improve overall quality.  VBA uses nationwide error patterns identified by STAR reviews, as well as information from other components of the Quality Assurance Program, to adjust and develop the employee training curricula. 

All employees, regardless of training level, must receive 80 hours of instruction annually.  Instructional methods may include Training Performance Support System (TPSS) modules, lectures, or practical application exercises.  For intermediate and journey-level employees, the 80 hours must include 40 Core Technical Training Requirement (CTTR) hours.  These involve standardized training curricula of essential topics and information.  Employees must complete an additional 20 hours of training from a list of standardized topics provided by VBA.  The final 20 hours may be used by regional offices to train on local issues and areas of concern.  This approach ensures that new and experienced employees are grounded in standardized claims processing fundamentals.

Data from STAR reviews, consistency reviews, special focus reviews, and regional office site visits are used to develop training for our new hires, as well as our intermediate and journey-level employees.  Claims processing personnel are promptly informed of error and inconsistency trends and provided with constructive feedback, including instructions on how to avoid such errors in the future.  The error trends identified in STAR reviews provide us the information we need to assess the effectiveness of our training programs and make necessary adjustments.  This promotes our goal of providing accurate, fair, and consistent claims processing.

Office of Employee Development and Training 

To ensure that our training is up to date and incorporates the lessons learned from the Quality Assurance Program, VBA has a robust training evaluation program conducted by the Technical Training and Evaluation section of the Office of Employee Development and Training.  With the assistance of professionally qualified outside evaluators, this office has undertaken two formal evaluations of the centralized Challenge program for newly hired claims personnel.  The first formal evaluation, conducted during 2007-2008, included visits to 16 regional offices and surveys with 1,405 respondent trainees.  This led to the following three recommendations, which were adopted in reformulating the Challenge curricula:

Re-sequencing the content to eliminate redundancy and make better use of resources,
Providing standardized hands-on claim processing during the centralized training portion, and
Creating more formal requirements for delivery and compliance.

The second evaluation of Challenge training began at the end of 2009.  Preliminary findings show that the changes made to Challenge significantly improved its effectiveness, as demonstrated by new trainees now being able to correctly process simple cases immediately after completing the centralized portion of the training.  TPSS training was separately evaluated during 2005-2006 and 2006-2007, and then subsumed into the Challenge evaluations because of its high usage during the initial training phase.

In total, the Office of Employee Development and Training visited 34 regional offices during the programmatic evaluations of Challenge and collected more than 3,200 surveys.  This is in addition to the hundreds of individual participant Challenge surveys done in the course of the three phases of Challenge classes. 

Performance support training tools allied to TPSS modules continue to show high and increasing usage, reflecting their utility to the field.  For example, the Medical Electronic Performance Support System provides computerized visual images of the various human body systems.  It was developed with STAR review input to assist with identifying the appropriate rating codes associated with different body systems and to facilitate medical examination requests.  This tool had 633,585 unique user sessions in FY 2009, a 32 percent increase from FY 2008.  Another training tool, the Veterans Service Representative Assistant, was designed to assist with claims development and had 34,696 unique sessions in FY 2009, a 62 percent increase from FY 2008.  The increased use of these performance support tools may be attributable to a growing population of new claims processors who have learned about them during Challenge training, where their use is integrated into practical applications.

VBA continues to improve its assessment of training program quality and effectiveness by collecting more timely feedback from participants.  Improvements in the last two years, for instance, include adding surveys during the home-station segments of Challenge training and promulgating an official requirement to complete surveys for both Challenge and CTTR training.

Conclusion

The VBA Quality Assurance program has undergone significant change over the past several years and has become more comprehensive by expanding the type and breadth of cases reviewed.  This is a journey that we are on, and we look forward to learning from the IDA review and working with our stakeholders to continuously improve our Quality Assurance Program. 

Thank you for the opportunity to provide you an update on our accomplishments.