
INTRODUCTION


Vocational education in the United States faces an uncertain future. Congressional action to consolidate the federal government's investment in education and training programs was expected during the current session, but did not happen. Many expect future restructuring to cascade down to the local and state levels in the form of lower funding flowing through new pipelines with more spigots than most vocational educators have been accustomed to in recent years.

One manifestation of the anticipated growth of contested markets for secondary and postsecondary students will be a perceptible ratcheting up of the accountability threshold that will have to be reached or surpassed by successful competitors. It will not matter whether this accountability is required by federal authorities, by a state council that adopts a uniform set of performance measures for multiple state agencies, or by a curious and attentive public. The message to vocational education management will be the same in each instance--provide credible evidence of high value-added performance. Such information must be made available at reasonable cost in a timely manner, and it must be easily understood by non-experts. This is the challenge that motivated the research undertaken to prepare this guide.

This guide is designed for local and state authorities who seek a better understanding of the performance of their vocational education programs. The basic theme is management diagnostics. No methods are described to carry out the rote tabulation of figures for a performance standards system mandated by some higher authority. State and national performance standards come and go. Such standards are therefore less likely to have a long-term effect on the way vocational educators manage their business than voluntary behind-the-scenes diagnostics that have always been carried out by competent administrators who care about their students, business colleagues, and community.

The treatment here builds on a solid foundation of pioneering contributions by others, which are identified in the next section. This guide is designed to motivate readers to reach a higher level of understanding on multiple fronts.

This volume complements Chapter 1: Placement Rates in the 1995 report of The Joint Commission on Accountability Reporting (JCAR). The JCAR was created in 1994 by the American Association of State Colleges and Universities, the American Association of Community Colleges, and the National Association of State Universities and Land-Grant Colleges. This was a preemptive response to evidence that member institutions were increasingly unable to satisfy constituent requests for performance-based information. The JCAR report highlights two pioneering state programs--(1) Florida's Education and Training Placement Information Program (FETPIP) and (2) the Research & Analysis Program of Washington's State Board for Community and Technical Colleges. The managers of these programs, Jay Pfeiffer and Loretta Seppanen, respectively, are charter members of a select group of pioneers who have consistently approached performance measurement challenges with a positive attitude. The result in each case is an exemplary program that has now been selected as a prototype from which others can learn. Components of the Florida and Washington programs have been replicated and adapted by other states for years. Six years ago, we were fortunate to be added to the list of colleagues with whom these program managers cooperate. Much of what appears in the database sections of this guide can be traced directly to their unselfish sharing of time and information.

A basic feature of the Florida and Washington state performance measurement programs is reliance on quarterly employment and earnings data acquired from Florida's Department of Labor and Employment Security and Washington's State Employment Security Department, respectively. These records are treated as a necessary, but not sufficient, source of reliable employment and earnings information. Similar, but not identical, records are maintained by the unemployment insurance unit of each state employment security agency (New York and Michigan excepted) to support the management of the state's unemployment compensation program.

Throughout this and the remaining sections, the basic source document of interest is referred to as a quarterly wage record. This precedent is followed because the term has acquired colloquial familiarity among vocational educators. This is a particularly unfortunate term for novices because each record contains a quarterly earnings amount, not an hourly wage rate figure. This seed document differs in particulars across states, but there are many common features and data elements. The basic elements that are of sustained interest here are (1) an employee identifier; (2) an employer identifier; and (3) a dollar figure representing what this employer paid the reference employee during the designated year/quarter.
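For concreteness, the three basic data elements just listed can be sketched as a small structure. This is an illustrative sketch only: the field names and layout below are hypothetical, since each state defines its own record format, and the identifiers shown are invented.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class WageRecord:
    """One illustrative quarterly wage record (hypothetical layout)."""
    employee_id: str   # (1) employee identifier
    employer_id: str   # (2) employer identifier
    year: int          # reference year
    quarter: int       # reference quarter, 1-4
    earnings: float    # (3) dollars this employer paid this employee in the
                       #     quarter -- an earnings amount, NOT an hourly wage

# One person's total quarterly earnings may span several employers,
# so records must be summed per employee:
records = [
    WageRecord("123-45-6789", "UI-0001", 1995, 3, 4200.00),
    WageRecord("123-45-6789", "UI-0042", 1995, 3, 1150.00),
]
total = sum(r.earnings for r in records if r.employee_id == "123-45-6789")
```

The sum across employer records within a quarter is one reason the colloquial term "wage record" misleads novices: no hourly rate can be recovered from the earnings amounts alone.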

Some of the early criticisms of this data source, which concentrated on what was not included, have been overcome by pioneers such as Pfeiffer and Seppanen, who have been aggressive advocates for and/or users of complementary information drawn from other sources such as the Office of Personnel Management for federal government civilian employment, the Department of Defense Manpower Data Center for military personnel information, the U.S. Postal Service, data sharing agreements with contiguous states, and periodic survey-based estimates to fill in gaps.

A seven-state alliance was established in February 1995, sponsored by the Employment and Training Administration of the U.S. Department of Labor through its America's Labor Market Information System (ALMIS) initiative. The purpose of this consortium is to investigate a series of issues that must be addressed to increase the value of wage records for third-party users. The senior author of this guide has lead responsibility for the technical facets of the scope-of-work being addressed by the consortium members (Alaska, Florida, Maryland, North Carolina, Oregon, Texas, and Washington). A summary of the first year's research findings will be released in 1996. Among the topics being investigated are the feasibility of adding (1) an occupational data element to the wage record, as Alaska did some years ago in response to a state legislative mandate; (2) a time-unit data element, such as hours or weeks worked during the reference quarter, as Florida, Oregon, Washington, and a small number of other states now collect; and (3) a geographic, or work-site, data element that would resolve a long-standing interpretive problem: allowable reporting practices in most states limit the accuracy of assigning workers to the actual location of their employment.

Of equal importance to readers is a report to Congress by the Bureau of Labor Statistics, which is expected to be released at any time. This report is expected to recommend the creation of a national distributed database capability, similar to the electronic network that is currently used to manage interstate claims processed on behalf of each state's unemployment insurance program (see Northeast-Midwest Institute, 1988, for a historical perspective). One of the fundamental payoffs expected from such a capability would be routine interstate sharing of information about the employment status and earnings levels of former students and trainees who cannot be found in the home state's wage records file. There is no guarantee that this data sharing capability will become a reality or, if it does, when. But the simple fact that the Bureau of Labor Statistics is expected to recommend such a step to Congress is a clear signal of the value that is placed on wage records as a reliable and inexpensive source of pertinent information--necessary to sensible performance measurement, but not sufficient by itself for such use.

Topics Covered

The remainder of this first section provides an introduction to the current status of one facet of the overall performance measurement issue--documentation of employment and earnings outcomes that can be associated with a particular education event. This overview includes appropriate references to related contributions that may be of interest to some readers. The second section introduces an optics metaphor to describe the interplay of supply and demand forces as each affects the recorded employment and earnings histories of former students. The third section follows with a primer on the proper measurement of each former student's employment status, mobility between employer affiliations, and earnings path. The fourth section then describes refinements of these fundamental components of a performance measurement system.

Measuring Employment and Earnings Outcomes

Overview

Wage records have been used for research and evaluation purposes for more than three decades (Stevens, 1994a). Pioneering uses included evaluation of adult retraining programs (Borus, 1964); a benefit-cost analysis of Neighborhood Youth Corps programs (Borus, Brennan, & Rosen, 1970); estimating a State Employment Security Agency's own penetration of all hires in a local labor market (Hanna, 1976; Siebert, 1976); and follow-up of Comprehensive Employment and Training Act (CETA) participants (MDC, 1980).

The senior author and colleagues first used wage records to document the earnings of former vocational education students in a study of Missouri graduates who had been supported by CETA funds (Atteberry, Bender, Stevens, & Tacker, 1982). The use of wage records for performance standards measurement began in the mid-1980s with Job Training Partnership Act (JTPA) applications (Trott, Sheets, & Baj, 1985).

The current era of wage record use for vocational education research purposes began in 1986, when the author proposed and then carried out such applications for the National Assessment of Vocational Education (NAVE) (Stevens, 1986, 1989a) and the Office of Technology Assessment (Stevens, 1989b). A seminal contribution that appeared at about the same time was Pfeiffer (1990), an interim report on a work-in-progress that led directly to adoption of key components of Pfeiffer's pioneering FETPIP program in North Carolina and Texas, and to other second-generation efforts featuring the basic principles of the FETPIP.

The modern origins of national research applications and state program uses of wage records began as roughly aligned, but independent, agendas. These parallel tracks have converged in the 1990s. Today, multiple nodes of university-based research teams collaborating with state agency colleagues are observed across the U.S., particularly in California, Colorado, Florida, Maryland, Missouri, North Carolina, Texas and Washington.

A "that was then, this is now" caution is offered for three reasons:

  1. Some early studies in the new era of concentrated research effort, such as Strong and Jarosik (1989) and Ghazalah (1991), used or referred to the potential use of state department of revenue, Internal Revenue Service (IRS), and Social Security Administration (SSA) data as an alternative to the use of wage records. Despite some comparative strengths relative to wage records such as national coverage, IRS and SSA records do not satisfy minimum timeliness and format requirements that must be met to carry out routine management diagnostic and performance measurement responsibilities. These sources may still be considered for periodic long-term evaluation studies, particularly if the basic features of the Clinton Administration's Simplified Tax and Wage Reporting System (STAWRS) proposal are adopted (see Internal Revenue Service, 1995). These records are not a practical substitute for the wage record applications that are addressed here.

  2. Now-dated surveys of state capabilities, such as Rahn, Hoachlander, and Levesque (1992), Jarosik and Phelps (1992), and Amico (1993), correctly described statutory and administrative regulation barriers that in some cases limited access to the basic wage record itself. Laws and rules have changed. With a few notable exceptions that are identified later, wage records are more accessible for third-party use now than at any previous time.

  3. An important complement to these legal and management advances has been a pervasive improvement in data processing capacity, accompanied by a steady decline in the cost of maintaining and using large databases.

These three statements, together, send a clear message: Skeptics who, for legitimate reasons, doubted the appropriateness of wage records for performance measurement purposes at an earlier time should revisit the issue and assign new weights to the pros and cons of relying on the data sources that are available today.

Background

Three documents released in 1989 set the stage for a concentration of research attention on the use of wage records for vocational education accountability purposes: Using State Unemployment Insurance Wage-Records To Trace the Subsequent Labor Market Experiences of Vocational Education Program Leavers and Using State Unemployment Insurance Wage-Records To Construct Measures of Secondary Vocational Education Performance by Stevens, and A Longitudinal Study of Earnings of VTAE Graduates by Strong and Jarosik. The first of these, commissioned by NAVE, was designed to "separate fact from fiction and to establish a common ground for debating the merits of using [wage records] for vocational education performance assessment purposes." A theme of that paper was to distinguish between the demanding coverage requirements of a performance standards system and the less stringent requirements of a management diagnostics system. The intent then, as now, was to counter the orchestrated voices of those who oppose measuring anything unless everything can be measured.

Wage record coverage stops at a state's border. This is sufficient reason for some opponents of wage record use to conclude "if you do not know the employment status of former students who have left the state then you should not document the employment status of those who have remained in the state as productive employees." There is some merit to this stance when performance standards are the topic of discussion because the percentage and importance of unobserved cases will vary depending upon a school's proximity to work opportunities outside the state, the specialized nature of a school's program offerings, and the comparative work histories of former students who stay or leave. It is difficult, but not impossible, to assess the adequacy of what is observed as a reliable proxy for the hypothetical combination of these observations and the unobserved missing cases.

Unlike the performance standards situation, which is frequently characterized by subordinate opposition to attempts by higher management levels to impose measurement procedures that cannot be controlled or manipulated, good managers typically embrace new sources of information that they have full discretionary authority to use or not use.

These are serious matters precisely because the largely unknown incidence of unobserved cases has uncontrolled impacts on uses that are made of the wage records for vocational education accountability purposes. One response to this situation, exemplified by Pfeiffer's progress in Florida and Loretta Seppanen's advances in Washington state, is to seek ways to estimate, or otherwise account for, these omitted groups. This has been done through ad hoc surveys, periodic matches against adjacent state wage record databases, and synthetic estimation of the employment status of unobserved cases. A quite different response, which is fading from view as awareness of the wage record data's strengths grows, is to remain aloof from endorsement and use of wage records, while often continuing to report placement outcomes identified through surveys of uneven quality.
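The in-state matching step, and the observed/unobserved split it produces, can be sketched in a few lines. The identifiers and dollar figures below are invented for illustration, and the simple set arithmetic stands in for what is, in practice, a large-scale batch match:

```python
# Hypothetical sketch: match former students' identifiers against an
# in-state wage record file and separate observed from unobserved cases.
students = {"111-11-1111", "222-22-2222", "333-33-3333"}

# employee identifier -> quarterly earnings found in the in-state file
wage_file = {
    "111-11-1111": 5350.00,
    "333-33-3333": 2800.00,
}

observed = {sid: wage_file[sid] for sid in students if sid in wage_file}
unobserved = students - wage_file.keys()
# 'unobserved' lumps together out-of-state workers, federal civilian and
# military employees, the self-employed, and the non-employed -- which is
# why the supplementary sources described above are needed.
```

The key management question is how the `unobserved` set should be treated: ignored, estimated synthetically, or resolved through the ad hoc surveys and adjacent-state matches described above.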

Three months after Stevens (1989a) was released by NAVE, a complementary paper, Using State Unemployment Insurance Wage-Records To Construct Measures of Secondary Vocational Education Performance (Stevens, 1989b), was published by the Office of Technology Assessment. This paper contained more detail about wage record coverage and content than had been available to general readers previously. It also provided a concrete example of how wage records can be used, based on one school district's data that had been provided on a pilot basis.

An important study by Strong and Jarosik (1989) was unique because it used state income tax records, Census Bureau data, and one of the first quarters of wage records collected by the state of Wisconsin, which was a late adopter of wage reporting requirements. This was a retrospective study of the 1985 and 1988 employment status and earnings levels of 1982-1983 graduates from a Wisconsin Vocational, Technical and Adult Education program. The time lapse involved in this type of study, which is similar to that found in Ghazalah (1991), who reported 1986 earnings of 1979 graduates, is not conducive to the management diagnostics use that is of central concern here.

Throughout this period of increasing interest in the potential value of wage records for vocational education performance measurement purposes, a frequent question was, "Aren't these confidential records?" This topic had been addressed by Brown and Choy (1988), by the papers contained in Northeast-Midwest Institute (1988), and in Stevens (1986, 1989a). However, because of the importance of the issue, a compendium of state employment security agency confidentiality laws and regulations was prepared in support of the National Commission for Employment Policy's research on this topic (Stevens, 1990). Since then, the confidentiality topic has continued to receive attention indicative of its importance (see Journal of Official Statistics, 1993; National Forum on Education Statistics, 1994; and Stevens, 1994b).

The Adult and Youth Standards Unit, Division of Performance Management and Evaluation, Office of Strategic Planning and Policy Development, Employment and Training Administration, U.S. Department of Labor convened a technical workgroup in mid-1991 "to consider the desirability and feasibility of basing JTPA postprogram performance standards on unemployment insurance (UI) wage record data" (Bross, 1991, p. i). The technical workgroup's recommendations follow:

The fact that four years ago this team of experts endorsed the use of wage records for performance standards applications, which are more demanding of accuracy than mere performance measurement uses, should whet the appetite of vocational educators to learn more about wage records. As a postscript to the recommendations of this technical workgroup, the Division of Performance Management and Evaluation has not yet allowed states to use wage records as a substitute for the still mandated use of telephone survey follow-up methods. Instead, in 1993-1994, the Division commissioned a series of state case studies to compare employment and earnings outcomes derived from wage records and telephone survey data, respectively (see Stevens, 1994d, for Maryland's case study results).

Four years ago, in Jarosik and Phelps (1992), the National Center for Research in Vocational Education documented thirteen state profiles of wage record use for follow-up purposes. The authors recommended that all State Directors of Vocational Education collaborate with their Committees of Practitioners to assess the UI wage record database as one source of information for their state performance measures and standards systems, and that continued investigation be undertaken to improve the value of the wage record data for such purposes. Improving that value is the primary objective of the present guide.

The flurry of research findings that appeared in 1989-1991 was reflected in the U.S. Department of Education's Office of Policy and Planning report (Stevens, Richmond, Haenn, & Michie, 1992) that set forth a five-step wage records implementation plan for states. Prior to this time, analysts had concentrated on post-schooling outcomes only. A different perspective is found in Stern and Stevens (1992), which investigated the influence on subsequent earnings of enrollment in a cooperative education program while in high school. A positive association between participation in cooperative education and subsequent employment success is documented, but this cannot be translated into confirmation of a cause-and-effect sequence because too many covariates expected to be relevant were unobserved.

Explicit attention was given to discretionary management diagnostics in Smith and Stevens (1994), which uses Colorado Community College & Occupational Education System records for illustrative purposes. Colorado data is also found in Stevens (1994c), which is the foundation upon which the present guide has been assembled. This Working Paper, which was issued by the National Center on the Educational Quality of the Workforce at the University of Pennsylvania, uses merged wage record and student data obtained from Colorado, Florida, Missouri, and Washington. Many state-specific examples document the importance of both pre- and post-coverage to capture at least some of the work experience that contributes to a joint outcome that has too often been attributed to the most recent education event alone (also see Stevens, 1994f).

A year ago, John Wirt, who was the manager of the National Assessment of Vocational Education when it issued its final report in 1989, authored a Wage Record Information Systems report (U.S. Congress, Office of Technology Assessment, 1994). The introductory paragraph of this report states that "this background paper responds to section 408 of the 1990 amendments to the Perkins Act, which asks OTA to review activities to be undertaken by the National Occupational Information Coordinating Committee (NOICC) to encourage the use of wage records from state unemployment insurance systems for purposes of conducting policy studies or monitoring the outcomes of vocational education" (p. 1). The NOICC had supported the National Governors' Association survey of state capacity to use wage records (Amico, 1993), and had sponsored the preparation of a how-to-do-it guide for setting up a wage record information system, which was released last year by MPR Associates (Levesque & Alt, 1994). Wirt's report provides a valuable synthesis of issues, findings, and references that are available for more in-depth coverage of a particular topic. Those who are just beginning to familiarize themselves with the wage record topic will appreciate the brevity and clarity of Wirt's text.

The niche that remained after all of these studies had been reviewed was motivation: vocational educators now had access to multiple how-to-do-it guides, particularly Levesque and Alt (1994), but few compelling reasons to act on this awareness. The present guide takes this foundation of understanding for granted and advances to the challenge of motivating managers to want to use these records. A building-block approach follows in the remaining pages of this guide. The next section begins by answering the question, "Why do we need to know about pre-, concurrent, and post-enrollment employment status and earnings profiles?"

