The National Academies of Sciences, Engineering, and Medicine
Office of Congressional and Government Affairs
At A Glance
Title: The VA Medical Examination and Disability Rating Process
Date: 02/14/2008
Session: 110th Congress (Second Session)
Witness: Michael McGeary

Senior Program Officer and Study Director, Committee on Medical Evaluation of Veterans for Disability Benefits, Board on Military and Veterans Health, Institute of Medicine, The National Academies

Chamber: House
Committee: Veterans’ Affairs Committee, Subcommittee on Disability Assistance & Memorial Affairs


Statement of

Michael McGeary
Senior Program Officer
Study Director
Committee on Medical Evaluation of Veterans for Disability Benefits
Board on Military and Veterans Health
Institute of Medicine
The National Academies

before the

Subcommittee on Disability Assistance & Memorial Affairs
Committee on Veterans’ Affairs
U.S. House of Representatives

February 14, 2008

Good morning, Mr. Chairman and members of the Committee. My name is Michael McGeary. I am a Senior Program Officer of the Institute of Medicine (IOM) and served as the staff director of the IOM’s Committee on Medical Evaluation of Veterans for Disability Benefits. Established in 1970 under the charter of the National Academy of Sciences, the IOM provides independent, objective advice to the nation on improving health.

The Committee on Medical Evaluation of Veterans for Disability Benefits (the Committee) was established at the request of the Veterans’ Disability Benefits Commission and funded by the Department of Veterans Affairs (VA).

In its June 2007 report, A 21st Century System for Evaluating Veterans for Disability Benefits, the Committee assessed the medical criteria and processes used by VA to determine the degree of disability of service-connected veterans. The Committee did not, however, assess nonmedical aspects of the VA disability claims process and therefore the report does not address all factors that might affect the timeliness of decisions on claims. The Committee did not, for example, evaluate the adequacy of staffing levels or the performance of management information systems.

Chapter 4 of the report focuses on the medical criteria VA uses to assess degree of disability, which are embodied in the VA’s Schedule for Rating Disabilities. Dr. Lonnie Bristow, who chaired the Committee, is scheduled to testify before you on the Rating Schedule on February 26. Chapter 5 of the report, which I am here to review today, focuses on the medical examination and disability rating parts of the claims process. Chapter 5 includes background information on the organization of the claims process and some statistics on workload trends and the timeliness and accuracy of decisions, which I will summarize briefly.

Disability Claims Workload – Veterans Benefits Administration (VBA)

By 2006, the annual number of claims from veterans for disability compensation had increased by 56 percent (from 420,000 to 650,000). VA was able to decide 630,000 claims in 2006, almost as many as were filed, but the backlog of pending claims increased. At the end of 2006, 378,000 claims were pending, 83,000 of them for more than six months.

Disability Claims Workload – Board of Veterans Appeals (BVA)

By 2006, the annual number of formal appeals filed on VA Form 9 had increased by 42 percent (from 33,000 to 46,000). Although there were fewer Veterans Law Judges (VLJs) in 2006 than in earlier years, the annual number of completed decisions grew, but not enough to keep the backlog of cases pending at BVA from doubling from 20,000 to 40,000. This did not include about 130,000 appeals being reconsidered at the regional office level, either before going to BVA or on remand from BVA.

Timeliness of Disability Decisions – VBA

The average elapsed time from the date the claim requiring a disability decision is received to the date it is decided at the regional office level was 177 days in 2006, up from 166 days in 2004 but down from 223 days in 2002.

Timeliness of Appeals Decisions – VBA and BVA

The average number of days to resolve appeals by VBA and BVA was 657 days in 2006, more than the 529 days it took in 2004 but less than the 731 days it took in 2002.


Accuracy of Decisions – VBA and BVA

VBA and BVA each review a sample of decisions for quality assurance purposes. In 2006, 88 percent of rating-related cases met VBA’s accuracy standard, compared with 80 percent in 2002. BVA’s rate of deficiency-free decisions was 93 percent in 2006, compared with 88 percent in 2002.


Consistency of Decisions

VA does not assess consistency of decision making on a regular basis. There are indications of substantial variability in decision making from state to state, for example, in the average number of disabilities per veteran; average combined degree (or severity) of disability; average rating level for each of the 14 body systems; percentage of veterans service connected for PTSD, for ratings of 100 percent, and for individual unemployability; and in the percentage of appeals in which the appellant is successful.

IOM Committee Recommendations for Improving the Medical Examination Process

The medical aspects of the claims process that the Committee looked at were, first, the medical examination process and, second, the disability rating process.

Applicants for disability compensation are asked to provide their medical records and, under the duty-to-assist law, VBA helps them obtain those records, especially their service medical records. In nearly every case, VBA has applicants undergo a compensation and pension, or C&P, examination performed by a Veterans Health Administration (VHA) or contractor clinician. The reports of these C&P examinations become part of the medical evidence that VBA’s raters use to evaluate the degree of disability of the veteran and to assign a rating between 0 percent and 100 percent in 10 percent increments. The rating level in turn determines the amount of compensation the applicant will receive.

The Committee found that VBA and VHA have improved the quality and timeliness of medical examinations greatly in the last 10 years but made three recommendations for further improvements. First, VA has developed standardized examination worksheets for more than 70 common conditions, to increase the completeness and consistency of examination reports. VA does not, however, have a regular process for updating the worksheets. Most were developed a decade ago, and the Committee found some outdated tests and procedures. The Committee recommended, therefore, that VA implement a process for periodically updating the disability examination worksheets, which should be part of, or closely linked to, the process for updating the Rating Schedule recommended by the Committee, with input from an expert advisory committee, also recommended in the report.

Second, VA has developed interactive online versions of the examination worksheets, which produce quicker and higher-quality reports than dictated reports. VA has not made use of the online templates mandatory, however, and the Committee recommended that it do so.

Third, the Committee found that VA’s quality review of the examination process was more procedural than substantive, measuring whether a requested item is included in the report, not whether the item is accurate. The Committee recommended that VA establish a regular assessment of the substantive quality and consistency (that is, the inter-rater reliability) of examinations and, if the assessment finds problems, address them, for example, by revising the templates or adjusting the training program.

IOM Committee Recommendations for Improving the Rating Process

After the information needed to adjudicate a claim is collected, including the C&P examination report, the veteran’s file is given to a nonmedical rater, who compares the information in the file with the criteria in the Rating Schedule to determine the rating level. The Committee offered three recommendations for improving the rating process.

First, the Committee found that the accuracy rate of rating decisions has increased steadily since VA introduced a quality review program in 1998, from an accuracy rate of 64 percent to 88 percent in 2006. The sample size is small, however, only enough to determine the overall accuracy rate of regional offices, not the accuracy of decisions at the body system or diagnostic code level. GAO and VA’s Office of Inspector General have noted indicators of variability in decision outcomes and urged VA to identify disabilities subject to a great deal of decision variability, understand the reasons for the variability, and act to reduce the variability where possible. The Committee recommended that VBA periodically assess inter-rater reliability at the diagnostic code level and study the accuracy and validity of ratings. For example, VBA could have a sample of claims rated by two or more raters and analyze the degree of consistency in the ratings given. It could sample ratings given for a particular diagnostic code across field offices to analyze inter-rater and inter-office differences.

Second, the Committee found that raters should have better access to medical expertise. The raters are not medical professionals. If they have a question about the meaning of a test result or if the evidence is inconclusive or incomplete, they have to refer the case back to VHA, which adds time, or make a decision based on incomplete information, which affects accuracy. The Committee recommended that VBA have medical consultants available to raters in the regional offices. With modern communications technology, VBA medical consultants could be located in a national center or in regional centers.

At one time, there were physicians on the rating boards, but the U.S. Court of Appeals for Veterans Claims barred the participation of physicians in rating decisions. The Committee believes that the court’s decision was based on a misunderstanding of the role of physicians in adjudication, which is different from the role of treating physicians. All other major disability programs, such as Social Security’s, DoD’s disability evaluation process, the Federal Employees’ Compensation Act program, and the civil service disability retirement program, either have physicians or other appropriate clinicians involved in the adjudication decision or have medical experts readily available to review and discuss claims with lay disability raters.

The third recommendation regarding the rating process is to develop and mandate uniform training and certification programs across all regional offices with standardized objectives and outcomes. At the time of the report, VA was well along in developing a training and certification program for C&P medical examiners, which was due to be deployed in the current fiscal year, 2008. VBA had implemented a certification program for its veterans service representatives but, although plans were being made, no such certification program existed for raters. The Committee recommended that VBA develop a training program for raters, using advanced techniques, and evaluate the program rigorously.


The June 2007 report of the Committee on Medical Evaluation of Veterans for Disability Benefits recommended further improvements in VA’s medical examination and rating processes. These recommendations were aimed at improving the quality of the medical evaluation and rating processes in terms of accuracy and consistency rather than at increasing the timeliness of decisions. However, several of the recommendations promise to improve timeliness. The recommendation to mandate the use of online medical examination templates should speed the completion of examination reports, and the recommendation to provide raters with access to medical consultants should reduce the need to refer case files to VHA for medical opinions.

This concludes my remarks. Thank you for the opportunity to testify. I would be happy to address any questions the Subcommittee might have.


An archived transcript of the hearing can be found on the House Veterans' Affairs Committee’s Web site.