One manifestation of the anticipated growth of contested markets for secondary and postsecondary students will be a perceptible ratcheting up of the accountability threshold that successful competitors will have to reach or surpass. It will not matter whether this accountability is required by federal authorities, by a state council that adopts a uniform set of performance measures for multiple state agencies, or by a curious and attentive public. The message to vocational education management will be the same in each instance--provide credible evidence of high value-added performance. Such information must be made available at reasonable cost in a timely manner, and it must be easily understood by non-experts. This is the challenge that motivated the research undertaken to prepare this guide.
This guide is designed for local and state authorities who seek a better understanding of the performance of their vocational education programs. The basic theme is management diagnostics. No methods are described to carry out the rote tabulation of figures for a performance standards system mandated by some higher authority. State and national performance standards come and go. Such standards are therefore less likely to have a long-term effect on the way vocational educators manage their business than voluntary behind-the-scenes diagnostics that have always been carried out by competent administrators who care about their students, business colleagues, and community.
The treatment here builds on a solid foundation of pioneering contributions by others, which are identified in the next section. This guide is designed to motivate readers to aspire to a higher plateau of understanding on multiple fronts.
A basic feature of the Florida and Washington state performance measurement programs is reliance on quarterly employment and earnings data acquired from Florida's Department of Labor and Employment Security and the Washington State Employment Security Department, respectively. These records are treated as a necessary, but not sufficient, source of reliable employment and earnings information. Similar, but not identical, records are maintained by the unemployment insurance unit of each state employment security agency (except New York) to support the management of the state's unemployment compensation program (except Michigan).
Throughout this and the remaining sections, the basic source document of interest is referred to as a quarterly wage record. This precedent is followed because the term has acquired colloquial familiarity among vocational educators, but it is a particularly unfortunate term for novices because each record contains a quarterly earnings amount, not an hourly wage rate figure. This seed document differs in particulars across states, but there are many common features and data elements. The basic elements that are of sustained interest here are (1) an employee identifier; (2) an employer identifier; and (3) a dollar figure representing what this employer paid the reference employee during the designated year/quarter.
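For readers who find a concrete rendering helpful, the following sketch expresses the three basic elements just described as a simple record structure. Python is used purely for illustration, and the field names are illustrative inventions rather than any state's actual file layout:

    from dataclasses import dataclass

    @dataclass
    class WageRecord:
        """One quarterly wage record; field names are illustrative only."""
        employee_id: str   # (1) employee identifier, typically a Social Security number
        employer_id: str   # (2) employer identifier assigned by the state agency
        year: int          # reference year of the report
        quarter: int       # reference quarter (1-4)
        earnings: float    # (3) total dollars this employer paid the employee during
                           #     the quarter--an earnings amount, not an hourly wage rate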
Some of the early criticisms of this data source, which concentrated on what was not included, have been overcome by pioneers such as Pfeiffer and Seppanen, who have been aggressive advocates for and/or users of complementary information drawn from other sources such as the Office of Personnel Management for federal government civilian employment, the Department of Defense Manpower Data Center for military personnel information, the U.S. Postal Service, data sharing agreements with contiguous states, and periodic survey-based estimates to fill in gaps.
A seven-state alliance was established in February 1995, sponsored by the Employment and Training Administration of the U.S. Department of Labor through its America's Labor Market Information System (ALMIS) initiative. The purpose of this consortium is to investigate a series of issues that must be addressed to increase the value of wage records for third-party users. The senior author of this guide has lead responsibility for the technical facets of the scope-of-work being addressed by these consortium members (Alaska, Florida, Maryland, North Carolina, Oregon, Texas, and Washington). A summary of the first year's research findings will be released in 1996. Among the topics being investigated are the feasibility of adding an occupational data element to the wage record, as Alaska did some years ago in response to a state legislative mandate; a time-unit data element, such as hours or weeks worked during the reference quarter, as Florida, Oregon, Washington, and a small number of other states now report; and a geographic, or work-site, data element that would eliminate a long-standing interpretive problem: allowable reporting practices in most states limit the accuracy with which workers can be assigned to the actual location of their employment.
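If this work leads states to add the elements just listed, the basic record sketched earlier would simply gain optional fields. The following extension of that sketch is again illustrative only, with hypothetical field names; no state's actual layout is implied:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ExtendedWageRecord(WageRecord):  # builds on the WageRecord sketch shown earlier
        """Candidate additions to the basic record (names illustrative only)."""
        occupation_code: Optional[str] = None  # occupational element, as Alaska reports
        hours_worked: Optional[float] = None   # time-unit element (hours or weeks), as
                                               # Florida, Oregon, and Washington report
        work_site: Optional[str] = None        # geographic/work-site element tying the
                                               # record to the actual place of work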
Of equal importance to readers is a report to Congress by the Bureau of Labor Statistics, which is expected to be released at any time. This report is expected to recommend the creation of a national distributed database capability, similar to the electronic network that is currently used to manage interstate claims processed on behalf of each state's unemployment insurance program (see Northeast-Midwest Institute, 1988, for a historical perspective). One of the fundamental payoffs expected from such a national distributed database capability would be routine interstate sharing of information about the employment status and earnings level of former students and trainees who work in other states and therefore cannot be found in the home state wage records file. There is no guarantee that this data sharing capability will become a reality, or, if it does, any certainty about when. But the simple fact that the Bureau of Labor Statistics is expected to recommend such a step to Congress is a clear signal of the value that is placed on wage records as a reliable and inexpensive source of pertinent information--necessary to sensible performance measurement, but not sufficient by itself for such use.
The senior author and colleagues first used wage records to document the earnings of former vocational education students in a study of Missouri graduates who had been supported by CETA funds (Atteberry, Bender, Stevens, & Tacker, 1982). The use of wage records for performance standards measurement began in the mid-1980s with Job Training Partnership Act (JTPA) applications (Trott, Sheets, & Baj, 1985).
The current era of wage record use for vocational education research purposes began in 1986, when the author proposed and then carried out such applications for the National Assessment of Vocational Education (NAVE) (Stevens, 1986, 1989a) and the Office of Technology Assessment (Stevens, 1989b). A seminal contribution that appeared at about the same time was Pfeiffer (1990), an interim report on a work-in-progress that led directly to the adoption of key components of Pfeiffer's pioneering FETPIP program in North Carolina and Texas, and to other second-generation efforts that feature the basic principles of the FETPIP.
In their modern form, national research applications and state program uses of wage records began as roughly aligned, but independent, agendas. These parallel tracks have converged in the 1990s. Today, multiple nodes of university-based research teams collaborating with state agency colleagues are observed across the U.S., particularly in California, Colorado, Florida, Maryland, Missouri, North Carolina, Texas, and Washington.
A "that was then" "this is now" caution is offered for three reasons:
Wage record coverage stops at a state's border. This is sufficient reason for some opponents of wage record use to conclude, "if you do not know the employment status of former students who have left the state, then you should not document the employment status of those who have remained in the state as productive employees." There is some merit to this stance when performance standards are the topic of discussion, because the percentage and importance of unobserved cases will vary depending upon a school's proximity to work opportunities outside the state, the specialized nature of a school's program offerings, and the comparative work histories of former students who stay or leave. It is difficult, but not impossible, to assess the adequacy of what is observed as a reliable proxy for the hypothetical combination of these observations and the unobserved missing cases.
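A small amount of arithmetic makes the point concrete. The sketch below, in which the helper and variable names are hypothetical, computes the share of a completer cohort that can be found in the home state's wage records; the unmatched remainder mixes true non-employment with employment that is simply invisible to the home state file, so the match rate itself must be interpreted with the cautions just noted:

    def in_state_match_rate(completer_ids, wage_record_ids):
        """Share of program completers found in the home state's wage records.

        The unmatched remainder combines true non-employment with employment the
        file cannot see: out-of-state jobs, federal civilian and military work,
        and self-employment, among others.
        """
        completers = set(completer_ids)
        matched = completers & set(wage_record_ids)
        return len(matched) / len(completers)

    # A border-area program may show a low match rate even when most completers
    # are working, because many of them work across the state line.
    rate = in_state_match_rate(["111", "222", "333", "444"], {"111", "333"})
    print(f"{rate:.0%} of completers matched to in-state wage records")  # prints 50%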
Unlike the performance standards situation, which is frequently characterized by subordinate opposition to attempts by higher management levels to impose measurement procedures that cannot be controlled or manipulated, good managers typically embrace new sources of information that they have full discretionary authority to use or not use.
These are serious matters precisely because the largely unknown incidence of unobserved cases has uncontrolled impacts on uses that are made of the wage records for vocational education accountability purposes. One response to this situation, exemplified by Pfeiffer's progress in Florida and Loretta Seppanen's advances in Washington state, is to seek ways to estimate, or otherwise account for, these omitted groups. This has been done through ad hoc surveys, periodic matches against adjacent state wage record databases, and synthetic estimation of the employment status of unobserved cases. A quite different response, which is fading from view as awareness of the wage record data's strengths grows, is to remain aloof from endorsement and use of wage records, while often continuing to report placement outcomes identified through surveys of uneven quality.
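The first of these responses can be pictured as a simple decision sequence. The sketch below is schematic only; the source names are hypothetical, and the actual Florida and Washington procedures are considerably more elaborate:

    def employment_status(student_id, home_state_ids, adjacent_state_files, survey_responses):
        """Classify a former student by consulting successively broader sources,
        rather than treating absence from the home state file as non-employment."""
        if student_id in home_state_ids:
            return "employed in home state"
        for state, ids in adjacent_state_files.items():
            if student_id in ids:
                return f"employed in {state}"
        if student_id in survey_responses:
            return survey_responses[student_id]  # e.g., federal, military, self-employed
        return "status unknown"  # candidate for synthetic estimation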
Three months after Stevens (1989a) was released by NAVE, a complementary paper, Using State Unemployment Insurance Wage-Records To Construct Measures of Secondary Vocational Education Performance (Stevens, 1989b), was published by the Office of Technology Assessment. This paper contained more detail about wage record coverage and content than had been available to general readers previously. It also provided a concrete example of how wage records can be used, based on one school district's data that had been provided on a pilot basis.
An important study by Strong and Jarosik (1989) was unique because it used state income tax records, Census Bureau data, and one of the first quarters of wage records collected by the state of Wisconsin, which was a late adopter of wage reporting requirements. This was a retrospective study of the 1985 and 1988 employment status and earnings levels of 1982-1983 graduates from a Wisconsin Vocational, Technical and Adult Education program. The time lapse involved in this type of study, which is similar to that found in Ghazalah (1991), who reported 1986 earnings of 1979 graduates, is not conducive to the management diagnostics use of data that is the basic concern here.
Throughout this period of increasing interest in the potential value of wage records for vocational education performance measurement purposes, a frequent question was, "Aren't these confidential records?" This topic had been addressed by Brown and Choy (1988), by the papers contained in Northeast-Midwest Institute (1988), and in Stevens (1986, 1989a). However, because of the importance of the issue, a compendium of state employment security agency confidentiality laws and regulations was prepared in support of the National Commission for Employment Policy's research on this topic (Stevens, 1990). Since then, the confidentiality topic has continued to receive attention indicative of its importance (see Journal of Official Statistics, 1993; National Forum on Education Statistics, 1994; and Stevens, 1994b).
The Adult and Youth Standards Unit, Division of Performance Management and Evaluation, Office of Strategic Planning and Policy Development, Employment and Training Administration, U.S. Department of Labor convened a technical workgroup in mid-1991 "to consider the desirability and feasibility of basing JTPA postprogram performance standards on unemployment insurance (UI) wage record data" (Bross, 1991, p. i). The technical workgroup's recommendations follow:
Four years ago, in Jarosik and Phelps (1992), the National Center for Research in Vocational Education documented thirteen state profiles of wage record use for follow-up purposes. The authors recommended that all State Directors of Vocational Education collaborate with their Committees of Practitioners to assess the UI wage record database as one source of information for their state performance measures and standards systems, and that investigation continue into ways of improving the value of the wage record data for such purposes. The latter is the primary objective of the present guide.
The flurry of research findings that had appeared in 1989-1991 was reflected in the U.S. Department of Education's Office of Policy and Planning report (Stevens, Richmond, Haenn, & Michie, 1992) that set forth a five-step wage records implementation plan for states. Prior to this time, analysts had concentrated on post-schooling outcomes only. A different perspective is found in Stern and Stevens (1992), which investigated the influence on subsequent earnings of enrollment in a cooperative education program while in high school. A positive association between participation in cooperative education and subsequent employment success is documented, but this cannot be translated into confirmation of a cause-and-effect sequence because too many covariates expected to be relevant were unobserved.
Explicit attention was given to discretionary management diagnostics in Smith and Stevens (1994), which uses Colorado Community College & Occupational Education System records for illustrative purposes. Colorado data is also found in Stevens (1994c), which is the foundation upon which the present guide has been assembled. This Working Paper, which was issued by the National Center on the Educational Quality of the Workforce at the University of Pennsylvania, uses merged wage record and student data obtained from Colorado, Florida, Missouri, and Washington. Many state-specific examples document the importance of both pre- and post-enrollment coverage to capture at least some of the work experience that contributes to a joint outcome that has too often been attributed to the most recent education event alone (also see Stevens, 1994f).
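The pre- and post-enrollment perspective can be stated compactly. The sketch below, in which the function and field names are hypothetical and the wage record fields follow the earlier illustrative structure, aligns each student's quarterly earnings relative to the enrollment quarter, so that earnings already being received before enrollment are not credited to the most recent education event:

    def relative_quarter(year, quarter, enroll_year, enroll_quarter):
        """Signed number of quarters between a wage record and the enrollment quarter."""
        return (year - enroll_year) * 4 + (quarter - enroll_quarter)

    def earnings_profile(wage_records, enroll_year, enroll_quarter):
        """Map each relative quarter (negative = pre-enrollment) to total earnings
        for one student, summing across multiple employers within a quarter."""
        profile = {}
        for record in wage_records:
            rq = relative_quarter(record.year, record.quarter, enroll_year, enroll_quarter)
            profile[rq] = profile.get(rq, 0.0) + record.earnings
        return profile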
A year ago, John Wirt, who was the manager of the National Assessment of Vocational Education when it issued its final report in 1989, authored a Wage Record Information Systems report (U.S. Congress, Office of Technology Assessment, 1994). The introductory paragraph of this report states that "this background paper responds to section 408 of the 1990 amendments to the Perkins Act, which asks OTA to review activities to be undertaken by the National Occupational Information Coordinating Committee (NOICC) to encourage the use of wage records from state unemployment insurance systems for purposes of conducting policy studies or monitoring the outcomes of vocational education" (p. 1). The NOICC had supported the National Governors' Association survey of state capacity to use wage records (Amico, 1993), and had sponsored the preparation of a how-to-do-it guide for setting up a wage record information system, which was released last year by MPR Associates (Levesque & Alt, 1994). Wirt's report provides a valuable synthesis of issues, findings, and references that are available for more in-depth coverage of a particular topic. Those who are just beginning to familiarize themselves with the wage record topic will appreciate the brevity and clarity of Wirt's text.
The niche that remained when all of these studies had been reviewed was motivational: vocational educators now had access to multiple how-to-do-it guides, particularly Levesque and Alt's (1994), but few compelling reasons to act on this awareness. The present guide takes this foundation of understanding for granted and advances to the challenge of motivating managers to want to use these records. A building-block approach follows in the remaining pages of this guide. The next section begins by answering the question, "Why do we need to know about pre-, concurrent, and post-enrollment employment status and earnings profiles?"