
IMPROVING PERFORMANCE
MEASURES AND STANDARDS
FOR WORKFORCE EDUCATION

MDS-821



Brian M. Stecher
Lawrence M. Hanser


RAND

Mikala L. Rahn
Karen Levesque
Steven G. Klein
David Emanual

MPR Associates

National Center for Research in Vocational Education
Graduate School of Education
University of California at Berkeley
2030 Addison Street, Suite 500
Berkeley, CA 94720-1674


Supported by
The Office of Vocational and Adult Education
U.S. Department of Education

May 1995


FUNDING INFORMATION

Project Title: National Center for Research in Vocational Education
Grant Number: V051A30003-96A/V051A30004-96A
Act under which Funds Administered: Carl D. Perkins Vocational Education Act
P.L. 98-524
Source of Grant: Office of Vocational and Adult Education
U.S. Department of Education
Washington, DC 20202
Grantee: The Regents of the University of California
c/o National Center for Research in Vocational Education
2150 Shattuck Avenue, Suite 1250
Berkeley, CA 94704
Director: David Stern
Percent of Total Grant Financed by Federal Money: 100%
Dollar Amount of Federal Funds for Grant: $6,000,000
Disclaimer: This publication was prepared pursuant to a grant with the Office of Vocational and Adult Education, U.S. Department of Education. Grantees undertaking such projects under government sponsorship are encouraged to express freely their judgement in professional and technical matters. Points of view or opinions do not, therefore, necessarily represent official U.S. Department of Education position or policy.
Discrimination: Title VI of the Civil Rights Act of 1964 states: "No person in the United States shall, on the ground of race, color, or national origin, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving federal financial assistance." Title IX of the Education Amendments of 1972 states: "No person in the United States shall, on the basis of sex, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any education program or activity receiving federal financial assistance." Therefore, the National Center for Research in Vocational Education project, like every program or activity receiving financial assistance from the U.S. Department of Education, must be operated in compliance with these laws.



ACKNOWLEDGMENTS

  The research that informed this analysis could not have been conducted without the assistance of secondary and postsecondary vocational educators in seven states, who gave generously of their time and insights. Our colleagues E. Gareth Hoachlander and Bryan Hallmark also were important contributors to this study. Judy Wood assisted with text editing and document preparation.


PREFACE

  This report extends the results of an earlier NCRVE study on the use of performance measures and standards, Improving Perkins II Performance Measures and Standards: Lessons Learned from Early Implementers in Four States (1994), to suggest principles for designing outcome-based program improvement systems in light of current efforts to reform the U.S. workforce education system. At the time this project was undertaken, states were actively involved in implementing the Carl D. Perkins Vocational and Applied Technology Education Act of 1990 (Perkins II), and educators and policymakers were anticipating the reauthorization of the legislation in 1995. However, the policy environment has changed; other options, such as consolidated block grants for education and training, are receiving increasing attention. Although in its details the present document reflects an emphasis on Perkins II, in its general principles the report should be of interest to federal policymakers engaged in developing new education and training policy initiatives. A companion RAND Issue Paper, Accountability and Workforce Training (Stecher & Hanser, 1995), discusses the implications of this study and related research on accountability in a non-Perkins environment.


SUMMARY

  The Carl D. Perkins Vocational and Applied Technology Education Act of 1990 (Perkins II) has guided federal vocational education policy for the past five years. One of the most significant components of Perkins II was its emphasis on using systematic outcome data as a program monitoring and improvement tool. A recent NCRVE study of the effects of Perkins II, Improving Perkins II Performance Measures and Standards: Lessons Learned from Early Implementers in Four States (Stecher et al., 1994), found that the performance measures and standards provisions designed to promote program improvement were not achieving their full potential; it identified shortcomings and recommended actions that could be taken to improve the act.

  This report examines the implications of that research for enhancing accountability in future federal workforce preparation legislation. It also illustrates specifically how the language of Perkins II could be changed to carry out the recommendations of the earlier study.

  The following four features were identified as lacking in Perkins II yet important for an outcome-based system that promotes effective program improvement:

  1. Coordinate separate components into a more integrated system for planning, implementing, monitoring, and improving vocational education and training.
  2. Increase the emphasis on the use of the system of performance measures and standards as a program improvement tool.
  3. Clarify the requirements for measures and standards and improve their technical quality.
  4. Increase the amount of technical assistance provided by state and federal agencies to support change at the local and state levels.

  Specific examples are given of changes in the language of Perkins II to incorporate these principles.


INTRODUCTION

  Current federal vocational education legislation expires in 1995, presenting the 104th Congress with an opportunity to reshape federal policy regarding secondary and postsecondary vocational education. At the time of this writing, it remains uncertain whether the Carl D. Perkins Vocational and Applied Technology Education Act (Perkins II) will be reauthorized in its present form or whether federal vocational education initiatives will be "merged into a broader workforce-development bill" (Sommerfeld, 1994, p. 18). Given the growing emphasis on measurable outcomes and standards in government programs, requirements for a system of outcome measures and standards are likely to be included in new legislation, regardless of its final form.

  A recent study by the National Center for Research in Vocational Education (NCRVE), Improving Perkins II Performance Measures and Standards: Lessons Learned from Early Implementers in Four States (Stecher et al., 1994), suggests ways to enhance accountability in future federal vocational education legislation. This paper reviews the findings and recommendations of that study and illustrates how they could be translated into legislation. Specifically, we offer suggestions for more effective outcome-based program improvement and accountability procedures. The paper focuses only on those provisions that relate to evaluation and program improvement, often discussed under the heading "performance measures and standards." These suggestions are presented as revisions to the existing Perkins legislation, but they are equally relevant in the context of broader workforce development legislation. Therefore, the report should be of general interest to policymakers responsible for workforce training at the state and federal levels.

  The specific changes in language we suggest in this report are less important than the principles that guided them. Specifically, the results of the study suggest that any federal vocational education effort should incorporate the following:

  1. Coordinate separate components into a more integrated system for planning, implementing, monitoring, and improving vocational education and training.
  2. Increase the emphasis on the use of the system of performance measures and standards as a program improvement tool.
  3. Clarify the requirements for measures and standards and improve their technical quality.
  4. Increase the amount of technical assistance provided by state and federal agencies to support change at the local and state levels.

  Furthermore, the language presented in this document may provide a useful starting point for other conceptualizations of the federal role in vocational education.


Background

  Perkins II has guided federal vocational education policy for the past five years. It has been a remarkably influential piece of legislation, in part because most states have opted to apply its provisions more broadly than required. Although the federal government supplies less than ten percent of the total resources devoted to secondary and postsecondary vocational education, most states have applied Perkins II requirements to vocational education efforts funded with state and local resources as well as to those funded with federal resources.

  One of the most significant changes embodied in Perkins II was an emphasis on using systematic outcome data as a monitoring and improvement tool for programs. Under Perkins II, states were required to develop statewide "systems of core standards and measures of performance" that would be used to determine the success of vocational programs and to serve as a basis for local program improvement. If necessary, these systems could also be used to justify state intervention. States were given considerable flexibility in developing their statewide systems; the law specified only two outcomes that were required to be measured. At a minimum, each state was required to collect measures of learning and competency gains in basic and more advanced academic skills, as well as at least one measure of occupational competency attainment, job or work skill attainment, or student retention or placement. Any additional measured outcomes were at the discretion of the Committees of Practitioners and state Departments of Education.

  Those who endorsed this approach to program monitoring in vocational education hoped it would lead to better evaluation, richer communication, more focused program improvement at the local level, and wiser use of state technical assistance capabilities. It was hoped that states would implement efficient systems that provided local administrators and instructors with meaningful performance data to assess the strengths and weaknesses of their vocational programs and to design new strategies to improve the academic, technical, and labor-market outcomes of their students.

  These hopes have not been fully realized. Although states have made substantial strides in implementing performance measures and standards systems as envisioned in the federal legislation, considerable room for improvement remains. The following summarizes the results of a recent NCRVE study on the implementation and impact of the Perkins II measures and standards. These findings form the basis for suggested changes in the legislation, which are elaborated in the section entitled "Rationale for Legislative Changes."


NCRVE Study of the Implementation and Impact of Measures and Standards

  In the spring of 1993, NCRVE initiated a two-year study of the effects of Perkins II performance measures and standards for vocational education. At the time of the study, states had had three years to implement these provisions. The study examined seven states' progress in implementing statewide systems of performance measures and standards, the effects of these systems on local vocational programs and state agencies, and the factors that influenced local and state actions. [1]

  Four states that were "early adopters" of measures and standards were initially selected for study. In each state, we interviewed staff in the state agency (or agencies) that administered secondary and postsecondary vocational education, as well as administrators and instructors in both a secondary and a postsecondary vocational institution in two geographically separated regions. Respondents were asked about a number of related themes, including the implementation of performance measures and standards, their integration with other educational reform efforts, and the impact of measures and standards on their vocational programs. Repeat visits to each vocational institution were conducted the following year. In the second year, three additional states were added to the sample to provide greater contrast in implementation approaches and to broaden the range of contexts we observed.


Results

  This section summarizes the findings of the NCRVE study (Stecher et al., 1994). Substantial progress had been made in implementing measures and standards in the states we visited, although much work remained to be done to make the systems function as envisioned in the law. At the time of our visits, little attention had been paid to building local- or state-level capacity for translating the measures and standards data into actions for local program improvement. These "leading edge" states were still largely engaged in developing and implementing their systems.

  Furthermore, large variation was found in the states' approaches to the development and implementation of measures and standards. This variation was evident in almost every aspect of program implementation, including how the process was managed, who participated, and the level of resources devoted to it. These differences appeared to be jointly a function of the states' individuality and the flexibility inherent in Perkins II.

  We identified several factors that affected implementation and contributed to the variation in state responses to performance measures and standards. [2] Some of these explanatory factors were elements of the local and state context, and are less responsive to federal policy intervention. Other factors are within the sphere of federal policy influence.

  The level of flexibility afforded states was the first of five factors that could clearly be affected by federal actions. On the positive side, flexibility permitted states to create systems that were responsive to local conditions. On the negative side, the latitude afforded states increased the influence of state context, which heightened differences between states, and, in some cases, lengthened the implementation process.

  The second explanatory factor was the separate and uncoordinated nature of the elements of Perkins II. The Perkins II priorities (measures and standards, integration, Tech Prep, and service to special populations) were not seen as a coordinated system at either the local or the state level. Similarly, performance measures and standards were not being used comprehensively to evaluate the other Perkins initiatives.

  Third, there were neither models nor incentives for ensuring that performance measures and standards were used to improve programs. Perkins II contained an explicit framework for structuring systems, and a federal agency checked for compliance at the adoption stage. However, the law and regulations did little to emphasize information use. [3]

  The fourth factor concerned resources and expertise at the state level. Perkins II created new responsibilities for state staff, but reduced the set-aside for state administration and provided little technical assistance. This presented a dilemma for states that lacked either the expertise or the resources to address these new demands.

  Finally, the law mandated measurement of learning outcomes, even though there were few valid tests available for this purpose. The scarcity of appropriate tools for measuring selected learning outcomes led states to adopt alternatives that were less than optimal. States are still struggling with how to measure important student outcomes such as academic skill gains at the postsecondary level.


Recommendations

  The study recommended several actions federal policymakers could take to enhance the future success of performance measures and standards in vocational education and to promote the program improvement goals of Perkins II.


Translating Results Into Legislation

  Incorporating these changes into future federal vocational education or workforce preparation legislation will increase the efficacy of statewide systems of measures and standards. While the results of this study do not speak to all aspects of the legislation, they do suggest to us specific changes in the measures and standards provisions. The section in this document entitled "Rationale for Legislative Changes" presents our rationale for translating the findings of the study into changes in legislation. It elaborates on four main themes drawn from the results and presents specific recommendations for revising the law. The next section, "Proposed 1995 Perkins Act Sections," contains a marked-up version of Perkins II showing where the changes might be made in the event of reauthorization. The paper concludes with a Technical Appendix that contrasts the new and old language, along with commentary on the proposed changes.

  We remind the reader that we have not attempted to rewrite the whole act or to draft omnibus legislation. We aim only to demonstrate how the results of our research have practical value in informing future legislation. Many of our suggested revisions to Perkins II could as easily be embedded in a consolidated workforce preparation bill.


RATIONALE FOR LEGISLATIVE CHANGES

  The changes we propose flow directly from the research findings summarized in the previous section. In the case of performance measures and standards, the links between federal legislation and state actions are relatively clear and direct, and it is possible to trace implementation effects back to legislative causes. This linkage facilitates the task of revising the law to promote desired outcomes. We believe that the intent of the federal legislation was to promote effective program improvement, and that this can best be accomplished by including four major changes in future legislation:

  1. Coordinate separate components into a more integrated system for planning, implementing, monitoring, and improving vocational education and training.
  2. Increase the emphasis on the use of the system of performance measures and standards as a program improvement tool.
  3. Clarify and improve language describing the required measures and standards themselves.
  4. Increase the amount of technical assistance provided by state and federal agencies to support change at the local and state levels.

  To illustrate how these changes could be embedded in federal legislation, we revised selected portions of Sections 115, 116, and 117 of Perkins II, the sections describing performance measures and standards and the requirements for local and state assessment and evaluation. The proposed revisions have been italicized. Because our data did not bear on all aspects of the law, we did not undertake a complete redrafting; we have limited our efforts to those sections where our earlier findings justify legislative reworking. Readers should not assume that we endorse all the non-italicized sections of the act; in most cases, these components are left unchanged because they were not informed by our research.

  The rest of this section describes in narrative form the major changes we recommend. This approach better communicates the goals we were trying to achieve and the broad changes we made to achieve them. The complete text of the proposed revisions, with detailed commentary comparing the new law to the old, is contained in the Technical Appendix.


Developing a Coordinated Program Improvement System

  Our revisions attempt to coordinate the separate elements found in Perkins II into a more integrated system for planning, implementing, monitoring, and improving vocational programs. The logical model underlying this system is illustrated in Figure 1. Revised language clarifies the interrelationships among the elements of Perkins II, including state needs assessments, measures and standards, annual local evaluations, and program improvement plans. Our revisions also promote greater coordination of measures and standards with other federal workforce and education initiatives in the following ways:

  In our research, we recognized the clear need for coordination among federal workforce preparation programs. As a result, our suggested revisions in this area are particularly relevant in the context of a consolidated bill. Coordination among federal workforce education programs is promoted in the following ways:


Use of Information for Program Improvement

  Our proposed changes represent a significant shift in focus, away from the initial development of the systems of measures and standards called for in Perkins II and toward the use of these systems for future program improvement. Our research revealed that while states had made significant progress in developing their systems of measures and standards, for the most part they had not yet tackled the next step of using their systems for program improvement. Without explicit provisions for the use of performance measures and standards, the data they generate may languish in government files instead of being used to improve programs. Although the theme of measures and standards as the basis for a system of program improvement runs throughout our proposed revisions, it is most evident in Section 118, which sets specific requirements for program evaluation and improvement.

  Increased relevance and usefulness of the annual evaluations for program improvement are promoted in the following ways: