
Introduction

      The centerpiece of standards-based reform and results-oriented accountability systems is a set of standards defining what students should know and be able to do in an articulated kindergarten-through-postsecondary system. Throughout the 1990s, the effort to establish standards evolved from an interest of professional subject-matter associations such as the National Council of Teachers of Mathematics into a matter of high priority in nearly every state (Massell, Kirst & Hoppe, 1997). By the end of 1998, forty states had developed standards at the secondary level in all core subjects (Education Week, January 11, 1999).

       Having adopted standards for what students should know and be able to do, states are now struggling with how to define and measure performance against those standards. Indeed, reaching consensus on definitions of academic excellence has been difficult within each state, much less nationally. Recognizing that the establishment of standards alone will not improve the education system, states are turning their attention to the development of accountability systems that reward results and provide consequences for low performance. Movement in this direction, in turn, has led to more discussion about the necessary components of accountability, such as standards-driven assessment and curriculum, statewide data systems, public reporting forums, incentives and consequences, and professional development aimed at training practitioners to use educational data for program improvement. This document highlights promising state efforts in these areas.

      While the steady march towards greater accountability is encouraging, it is important to recognize that states are far from completing their accountability systems. Although many states have "pieces and parts" of an overall system, no state has developed the definitive "accountability blueprint." Moreover, states differ in terms of the approach utilized and priority accorded to accountability. While Texas and North Carolina are touted as being the closest to having all the necessary components of a complete accountability system, the authors of Quality Counts appropriately conclude that "states have completed only the first few miles of a marathon when it comes to holding schools accountable for results." (Education Week, 1999, p. 5)


Systemic Accountability Versus Programmatic Accountability

      Statewide reform efforts focus on improving the delivery of public education from a systemic perspective. By concentrating resources on setting standards and more accurately measuring student performance, states hope to set the stage for performance-based improvement that will produce lasting change in the public education system for all students in elementary, middle and high schools, and postsecondary institutions.

      While states are actively involved in building standards-based accountability systems, accountability efforts at the federal level are also underway. These are generally less comprehensive in approach and are more likely to be related to the tracking of specific programs. In particular, most state and federal legislation includes specific accountability and reporting requirements. Because they are more targeted or categorical in nature, programmatic accountability systems often call for data from specific sub-populations such as limited English proficient students or students receiving Title I funds.

       At times, programmatic reporting requirements may conflict with the goals or spirit of comprehensive statewide efforts. For example, one indicator within a state's comprehensive accountability system may address the academic performance of all students at the 3rd, 6th, 8th, and 10th grades. However, the federal vocational accountability provisions of the Carl D. Perkins Vocational-Technical Act of 1998 require a measure of the academic performance of vocational students, preferably in the 12th grade, when a student has completed a vocational program. Similarly, federal programs may request information with the intention of creating a set of indicators that can be reported nationally, rather than as a reflection of the unique circumstances of each state. The result is a patchwork of "systems" at the local and state levels, each moving toward accountability with different data and reporting requirements.


Accountability Standards

      With districts and states experimenting with a wide range of accountability strategies to improve performance, the National Association of State Boards of Education (NASBE) organized the Study Group on Education Accountability. The Study Group examined the field to determine what is and is not working, the dilemmas that policymakers face, and how the various components of accountability fit together to become a coherent system. The effort resulted in a framework of ten action-oriented standards to guide the discussion, design, and evaluation of local and state education accountability systems (NASBE, 1998). These standards and indicators can be found in Table 1. A more complete description of the standards can be found in the Resources section of this document.

Table 1: Public Accountability for Student Success: Standards for Education Accountability Systems

Standard 1: Accountability Goals and Vision
Legal authorities clearly specify accountability goals and strategies that focus on student academic performance.

Standard 2: Governing Accountability
At each level of the education system, designated authorities are charged with the efficient governance of the accountability system.

Standard 3: Responsible Agents
Specific responsibilities for student learning and performance are assigned to designated agents.

Standard 4: Collecting Performance Information
Accountability is based on accurate measures of agent performance as informed by assessments that are administered equitably to all students.

Standard 5: Analyzing and Reporting Performance Information
Those responsible for governing accountability regularly report student and school performance information in useful terms and on a timely basis to school staff, students and their families, local and state policymakers, and the news media.

Standard 6: Incentives and Consequences
Incentives are established that effectively motivate agents to improve student learning. Consequences, which could include rewards, interventions or sanctions, are predictably applied in response to performance results.

Standard 7: Building Agent Capacity
Agents are provided sufficient support and assistance to ensure they have the capacity necessary to help students achieve high-performance standards.

Standard 8: Policy Alignment
Policymakers work to ensure that education policies, mandated programs, financial resources, and the accountability system are well-aligned so that consistent messages are communicated about educational goals and priorities.

Standard 9: Public Understanding and Support
The accountability system has widespread support.

Standard 10: Partnerships
Various established partnerships work together to support teachers, schools and districts in their efforts to improve student achievement.

      Source: NASBE (1998)

      The action orientation of the NASBE standards goes a long way toward identifying the elements that can be useful in the development of both comprehensive state systems and programmatic systems of accountability. As states become more experienced with the components of accountability and the federal government becomes more familiar with how these systems can be used for collection of nationally relevant data, state and programmatic accountability systems will begin to merge. However, because certain programs have particular data requirements, they will never match completely. For example, the accountability system needs for Head Start or state childcare programs will never match Perkins III or state vocational requirements. Nonetheless, it is hoped that the current patchwork of data collection and reporting will move closer toward an articulated and better-coordinated system of accountability.


Accountability in Vocational Education

      In an effort to ensure that states focus on the quality of vocational education programs, Perkins II, the 1990 Carl D. Perkins Vocational-Technical Act, required states to set up accountability systems for local vocational programs. As they grappled with new federal mandates, state-level efforts centered on meeting the law's requirements. At a minimum, states had to develop two accountability measures, one of which had to be an indicator of learning and competency gains, including student achievement of basic or more advanced academic skills. The other measure had to be one of the following: [1] competency attainment; [2] job or work skill attainment or enhancement; [3] retention in school or completion of secondary school or its equivalent; or [4] placement into additional training or education, military service, or employment.

      In spite of some initial trepidation with the accountability requirements in Perkins II, each state developed its own system of performance measures and standards and devised a plan for implementation. At the secondary and postsecondary levels, all but two states went well beyond the requirements of Perkins II and included more than two performance measures and standards. In fact, most states included three to ten measures (Hoachlander & Rahn, 1992).

      Perkins II was intended not to build a data collection system capable of producing nationally comparable data, but to provide educators at the local and state levels with data they could use to improve their own vocational education programs and courses; therefore, states built accountability systems suited to their own history and culture. In more traditionally decentralized states, local education agencies were allowed to use their own assessments to measure academic gains and to develop their own instruments for other measures. Local control resulted in more buy-in at the local level but no comparability and little standardization statewide. In more centralized states, statewide assessments were used to move local education agencies toward standardized systems based on a common set of data. While this led to more comparability across local education agencies, these efforts were often perceived at the local level as less useful for program improvement (Stecher et al., 1997).

      In states that integrated federal legislative requirements with their own initiatives for improving vocational education, the emerging accountability systems tended to evolve over time. In fact, in a few states, vocational accountability is beginning to be integrated into the overall statewide education accountability system; however, in states with more of a "compliance" mindset, the accountability systems evolved very little over time. Administrators in these states have tended to wait for new federal legislation to mandate change before proceeding further in terms of accountability.

      Despite progress, however, states are far from having complete accountability systems for vocational education in place. Integrating vocational education in an overall reform agenda continues to be a challenge. Most state administrators realize that the demand for accountability from the public, business, politicians, and others is here to stay. Therefore, they are working hard to figure out what data is reasonable to collect and how it can be used to improve reporting about students and programs. The concerns raised are no longer "Why do we have to measure academic skills?" but "How do we best measure academic and occupational skills and assist local educators and administrators in their use of data for program improvement?"

      With the recent passage of Perkins III, learning from the past is more important than ever. Perkins III requires the implementation of a similar accountability system; however, it adds the potential for higher stakes with the introduction of incentive performance funds. Through Perkins III and the Workforce Investment Act, states exceeding the levels of performance for core indicators established for workforce agencies, adult education, and vocational education are eligible for incentive grants.

      Perkins II was in many ways the first step toward accountability--requiring the reporting of performance data. Perkins III intends to complete the accountability cycle with rewards and consequences based on performance. Perkins III accountability requirements are described in Table 2.

Table 2: Perkins III Accountability Requirements

      At a minimum, Perkins III requires that the core indicators of performance include measures of each of the following:
      (1) Student attainment of challenging state-established academic, vocational and technical, and skill proficiencies.
      (2) Student attainment of a secondary school diploma or its recognized equivalent, proficiency credentials in conjunction with a secondary school diploma, or a postsecondary degree or credential.
      (3) Placement in, retention in, and completion of, postsecondary education or advanced training, placement in military service, or placement or retention in employment.
      (4) Student participation in and completion of vocational and technical education programs that lead to nontraditional training and employment.


Vocational Education: Pulled in Two Directions

      In most states, vocational education is trapped between two often-conflicting ends of a spectrum: workforce development and school reform. On the one hand, vocational educators are attempting to join (and in a few cases lead) mainstream school reform through contextual learning, applied methodology, and integrated curriculum. Seen this way, vocational education is a vehicle for thematic instruction that improves the knowledge and skills of all students and better prepares them for both work and postsecondary education.

      On the other hand, vocational educators also work within the framework of job training and welfare-to-work initiatives designed to help students obtain specific skills for entry-level work and future jobs. This training is more specific in focus and often shorter in term. The primary goal under this purpose is to get the learner a job, one that ideally leads to a better job and later to an even better one. Typically, this type of vocational education serves the non-college bound: high school students who do not wish to go immediately to college but instead enter the workforce. It also serves displaced homemakers, dislocated workers, welfare recipients, and others whose first priority is to earn a living wage or a better wage.

       Table 3 suggests the differences that underlie vocational education's tension in purpose, including the related legislation, vehicles and delivery systems, populations served, and skills taught. The tension between school reform and workforce development should be viewed as a continuum, with some states on one end or the other and others somewhere in the middle.

       The tension between these two purposes is real and quite manifest in some parts of the country. To cite just one example, the purpose and rhetoric surrounding vocational education differ substantially between California and Arkansas. In California, secondary vocational education is one area within the state's education department; there is no longer a unit called "vocational education." Instead, vocational education is included as an area of emphasis within the High School Division. This division receives no JTPA or other workforce development funds. As such, the major purpose of vocational education at the secondary level is school reform, drawing on Career Academies and Tech Prep education as primary vehicles for integrating vocational education into mainstream school reform.

       By contrast, vocational education has recently been consolidated into the newly developed Workforce Development Division in Arkansas. This new division houses both secondary and postsecondary vocational education, including all Perkins funds and 8% JTPA funds. Arkansas soon hopes to file a consolidated State Plan for Perkins III and the Workforce Investment Act.

Table 3: Vocational Education Pulled in Two Directions

School Reform

Examples of Legislation:
  • Goals 2000
  • Improving America's Schools Act (IASA)
  • Eisenhower
  • Perkins III
Vehicles:
  • Elementary, middle, high schools, & community colleges
  • Academies, tech prep
  • Career clusters or magnet schools
  • Interdisciplinary teams
Populations Served:
  • Elementary through postsecondary students
Skills Taught:
  • Rigorous academic standards
  • Work-readiness skills (such as the SCANS competencies)
  • Cluster or career major skills

Workforce Development

Examples of Legislation:
  • Temporary Assistance for Needy Families (TANF)
  • Workforce Investment Act (formerly JTPA)
  • Perkins III
Vehicles:
  • Vocational programs within high schools & community colleges
  • JTPA programs
  • One-stop employment centers
  • Vocational centers/adult training
  • Customized training or private sector training
Populations Served:
  • Students in their final two years of high school
  • Community college students
  • JTPA adult and youth participants
  • Welfare recipients
  • Dislocated workers, displaced homemakers, and so on
Skills Taught:
  • Basic academic skills
  • Work-readiness skills (such as the SCANS competencies)
  • Job-specific skills

       While it varies by state and by level (secondary and postsecondary), in most states the tension is concentrated at the secondary level. This tension often complicates data collection and accountability systems, especially when measuring academic and occupational attainment. For example, a secondary-level vocational education system primarily focused on school reform might measure academic achievement, dropout rates, and indicators such as student eligibility for postsecondary education. A postsecondary system centered on workforce development, by contrast, might focus primarily on increases in wage rates, placement and retention, and improved economic indicators. Of course, an accountability system could measure the entire continuum. The questions are "What are the most important indicators in the overall system?" and "What should the system be held accountable for?"

       Table 4 provides examples of potential indicators along the school reform and workforce development continuum. The measures (academic tests, placement) embedded in the indicators are aligned with the intended purpose. The table should be viewed as a continuum, with some states on one end or the other and others somewhere in the middle.

       An element that adds more complexity to the measurement challenge discussed above is the governance or delivery system of the state. In cases where the purpose of vocational education has been clearly established and the indicators defined, another challenge remains: at what level will the indicators be implemented and the data collected? For example, Perkins II required that states measure academic gains. In centralized states, the indicator was measured through statewide, standardized tests. This ensured a comparable measure across the state. In decentralized or local control states, local recipients were allowed to select their own measure of academic gains, with results forwarded to the state level. No statewide comparable data was ever developed.

       Moreover, a state may combine centralized and decentralized elements within its accountability system. For example, a state may have a centralized method for collecting academic performance data but only local methods for measuring job placement. Alternatively, a state may use both a centralized and a decentralized strategy on a single indicator. For example, Oklahoma has a centralized occupational testing program that includes a multiple-choice test for each occupational area; the performance assessment component is locally developed and scored prior to administering the multiple-choice test (Stecher et al., 1997). Table 5 describes how the same indicator would be operationalized in a centralized versus a decentralized system.
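       To make the centralized-versus-decentralized contrast concrete, the sketch below shows one way the same indicator, the percentage of students scoring at or above a given level on an academic test, might be computed under each approach. This is a minimal illustration, not a description of any state's actual system; the record layouts, field names, and cut score are hypothetical assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TestRecord:
    student_id: str
    district: str
    score: float  # score on a (hypothetical) statewide standardized test

def centralized_indicator(records: List[TestRecord], cut_score: float) -> float:
    """Percent of students at or above the cut score, computed directly from a
    single statewide test file; comparable across districts by construction."""
    if not records:
        return 0.0
    passing = sum(1 for r in records if r.score >= cut_score)
    return 100.0 * passing / len(records)

@dataclass
class DistrictReport:
    district: str
    students_tested: int   # denominator reported by the district
    students_passing: int  # numerator, based on a locally chosen test and cut score

def decentralized_indicator(reports: List[DistrictReport]) -> float:
    """Percent passing aggregated from locally reported summaries. A statewide
    figure can be produced, but because each district may use a different test
    and cut score, the result is not truly comparable across districts."""
    tested = sum(r.students_tested for r in reports)
    passing = sum(r.students_passing for r in reports)
    return 100.0 * passing / tested if tested else 0.0
```

       The same indicator thus yields a comparable statewide figure only under the centralized approach; the decentralized figure can still be reported, but it aggregates locally defined measures.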

Table 4: Purpose Driven Example Indicators

School Reform
  • % of students scoring at X level on X academic test
  • % of students acquiring certificate of initial mastery
  • % of students enrolling in college prep coursework
  • % of students successfully completing university requirements
  • Decrease in high school dropouts
  • % of students completing high school entering postsecondary institutions
  • % of students attaining SCANS competencies

Workforce Development
  • % of students attaining SCANS competencies
  • % of students passing certification/licensure examinations
  • % of students acquiring certificate of advanced mastery
  • % of students mastering 80% of occupational competencies
  • % of students placed in a job related to training
  • % of students retained in job after six months
  • % of students earning a living wage

      Although multiple combinations of centralized and decentralized elements are possible, the trend across states seems to be one of centralization. Without comparable measures, it is very difficult to offer statewide incentives and consequences in the accountability system. Without centralized measures, it is difficult to manage the information at the state level, and impossible nationally. However, with this drive toward centralization, defining the purpose of vocational education becomes more important than ever. What are the appropriate indicators and accountability strategies to be included in education systems, and, in particular, vocational education? How will success be defined? How will practitioners be supported in order to attain success as defined?


Table 5: Example of System Delivery Influences on Accountability Measures

School Reform

Decentralized (local) measures:
  • Academic tests selected or developed at local level
  • Local transcript analysis with results reported to state
  • Dropout defined differently at each local district; reported to state
  • Locally reported from postsecondary institutions to state
Example indicators:
  • % of students scoring at X level on X academic test
  • % of students enrolling and completing college prep coursework
  • Decrease in high school dropouts
  • % of students completing high school entering postsecondary institutions
Centralized (state) measures:
  • Statewide standardized academic test
  • Electronic transcript data at state level
  • State definition; with state centralized database
  • State matched database of secondary and postsecondary records

Workforce Development

Decentralized (local) measures:
  • Locally implemented checklists
  • Local student or employer survey
Example indicators:
  • % of students mastering 80% of occupational competencies
  • % of students placed in a job related to training
  • % of students retained in job after six months
  • % of students earning a living wage
Centralized (state) measures:
  • Statewide vocational testing system
  • Unemployment Insurance data matched to program completers with social security #
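
       The centralized measures at the workforce-development end of Table 5 include matching Unemployment Insurance (UI) wage records to program completers. The sketch below illustrates, under hypothetical record layouts and quarter definitions, how such a match might yield placement and retention rates; an actual implementation would depend on a state's UI data-sharing agreements, privacy safeguards, and formal definitions of placement and retention.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Completer:
    ssn: str        # match key; in practice handled under strict privacy rules
    program: str

@dataclass
class WageRecord:
    ssn: str
    quarter: str    # e.g., "1999Q3"
    wages: float    # quarterly earnings reported to the UI system

def placement_and_retention(completers: List[Completer],
                            wage_records: List[WageRecord],
                            exit_quarter: str,
                            follow_up_quarter: str) -> Tuple[float, float]:
    """Share of completers with UI-reported earnings in the quarter after program
    exit (placement) and again in a later follow-up quarter (retention).
    Quarter choices and the earnings threshold are illustrative assumptions."""
    earnings: Dict[str, Dict[str, float]] = {}
    for w in wage_records:
        earnings.setdefault(w.ssn, {})[w.quarter] = w.wages

    def employed(ssn: str, quarter: str) -> bool:
        return earnings.get(ssn, {}).get(quarter, 0.0) > 0.0

    n = len(completers)
    if n == 0:
        return 0.0, 0.0
    placed = sum(1 for c in completers if employed(c.ssn, exit_quarter))
    retained = sum(1 for c in completers
                   if employed(c.ssn, exit_quarter) and employed(c.ssn, follow_up_quarter))
    return 100.0 * placed / n, 100.0 * retained / n
```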


Sharing Practices To Improve Accountability

       We recognize that no one has all of the answers to the issues raised in this introduction. Most of the measurement issues related to the conflicting purposes of vocational education (school reform versus workforce development) and governance/system delivery (centralized versus decentralized) are quite complex. All states are working on parts of the system or systems. It is hoped that the parts eventually sum to a whole.

       Earlier in this document we provided NASBE's ten action-oriented standards, which are intended to guide the development of overall state accountability efforts. The case studies provided in this document describe promising practices or particular strategies within an accountability system that operationalize some of the NASBE standards. The necessary components described in this report include the following areas:


  • Setting Standards
  • Assessment Systems
  • Curriculum Strategies
  • System Supports
  • Quality Assurance
  • Policy Linkages


      Within these areas, states have worked hard to develop particular components to their system. It is hoped that this document provides a vehicle to share some of the lessons learned related to the varying strategies employed state to state. The sharing of practices across the states may be helpful to those willing to take the time to analyze and reflect on others' efforts. In some cases, a strategy from another state could be adopted. In most cases, the lessons learned would need to be adapted to particular state conditions. At other times, the strategy may not be a viable option to either adopt or adapt. In these cases, learning from the challenges faced by others may help the reader see his or her own situation in a different light.

      This document is an evolving publication aimed primarily at state administrators and state policymakers. We hope to collect additional strategies from states over time, and mail out periodic updates (see Resources section). The "answers" to complex accountability questions are years down the road. As states continue to experiment and take test flights, we will need to observe, question, and incorporate new ideas into our own practice. Hopefully, this document will offer one more tool to help state administrators fine-tune their flight plan under always changing conditions.

      Buckle up, keep your seatbelts fastened, and enjoy the flight . . .


References

Hoachlander, E. G., & Rahn, M. L. (1992). Performance measures and standards for vocational education: 1991 survey results (MDS-388). Berkeley: National Center for Research in Vocational Education, University of California, Berkeley.

Massell, D., Kirst, M., & Hoppe, M. (1997, March). Persistence and change: Standards-based systemic reform in nine states (CPRE Policy Briefs, RB-21). Philadelphia: Graduate School of Education, University of Pennsylvania.

National Association of State Boards of Education (NASBE). (1998, October). Public accountability for student success, standards for education accountability (The Report of the NASBE Study Group on Education Accountability). Alexandria, VA: Author.

Quality Counts '99: Rewarding results, punishing failure. (1999, January 11). Education Week (entire issue), 18(17).

Stecher, B. M., Hanser, L. M., Hallmark, B., Rahn, M. L., Levesque, K., Hoachlander, E. G., Emanuel, D., & Klein, S. G. (1995). Improving Perkins II performance measures and standards: Lessons learned from early implementers in four states (MDS-732). Berkeley: National Center for Research in Vocational Education, University of California, Berkeley.

Stecher, B. M., Rahn, M. L., Ruby, A., Alt, M., & Robyn, A. (1997). Using alternative assessments in vocational education (MDS-947-NCRVE/UCB). Santa Monica, CA: RAND and National Center for Research in Vocational Education.



