
NEW PERSPECTIVES
ON DOCUMENTING EMPLOYMENT
AND EARNINGS OUTCOMES
IN VOCATIONAL EDUCATION

MDS-743






David W. Stevens
Jinping Shi



The Jacob France Center
Merrick School of Business
University of Baltimore
Baltimore, Maryland

National Center for Research in Vocational Education
Graduate School of Education
University of California at Berkeley
2030 Addison Street, Suite 500
Berkeley, CA 94720-1674


Supported by
The Office of Vocational and Adult Education
U.S. Department of Education

August, 1996


FUNDING INFORMATION

Project Title: National Center for Research in Vocational Education
Grant Number: V051A30003-96A/V051A30004-96A
Act under which Funds Administered: Carl D. Perkins Vocational Education Act
P.L. 98-524
Source of Grant: Office of Vocational and Adult Education
U.S. Department of Education
Washington, DC 20202
Grantee: The Regents of the University of California
c/o National Center for Research in Vocational Education
2150 Shattuck Avenue, Suite 1250
Berkeley, CA 94704
Director: David Stern
Percent of Total Grant Financed by Federal Money: 100%
Dollar Amount of Federal Funds for Grant: $6,000,000
Disclaimer: This publication was prepared pursuant to a grant with the Office of Vocational and Adult Education, U.S. Department of Education. Grantees undertaking such projects under government sponsorship are encouraged to express freely their judgement in professional and technical matters. Points of view or opinions do not, therefore, necessarily represent official U.S. Department of Education position or policy.
Discrimination: Title VI of the Civil Rights Act of 1964 states: "No person in the United States shall, on the ground of race, color, or national origin, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving federal financial assistance." Title IX of the Education Amendments of 1972 states: "No person in the United States shall, on the basis of sex, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any education program or activity receiving federal financial assistance." Therefore, the National Center for Research in Vocational Education project, like every program or activity receiving financial assistance from the U.S. Department of Education, must be operated in compliance with these laws.




ACKNOWLEDGMENTS

This document reflects the expertise and professional dedication of many colleagues. Jay Pfeiffer, Director of Florida's Education and Training Placement Information Program, and Loretta Seppanen, Manager for Research & Analysis, Washington State Board for Community & Technical Colleges, have been particularly important sounding boards and contributors to the development of the concepts and case study examples that appear here. Gregory Smith, former Coordinator of Institutional Research and Evaluation, Colorado Community College & Occupational Education System, and John Wittstruck, Associate Commissioner, Policy Analysis & Data Services, Missouri Coordinating Board for Higher Education, provided valuable advice and cooperation throughout the research phase. In each of these four states one or more employment security agency colleagues are equally deserving of thanks for cooperation that was neither obligatory nor expected. Kay Raithel and Tom Righthouse in Missouri, and Jeff Jaksich in Washington state, have been advocates for the proper use of administrative records for many years.

The data collection and initial research were conducted through the auspices of the National Center on the Educational Quality of the Workforce at the University of Pennsylvania, on behalf of the National Assessment of Vocational Education. Robert Zemsky, Co-Director of the Center, Jo Anne Saporito, and Margaret R. Hoover managed this phase of the research. Nevzer Stacey, the U.S. Department of Education's project officer for the Center, and David Boesel, then Director of the National Assessment of Vocational Education, provided oversight on behalf of the sponsoring agency.

The National Center for Research in Vocational Education's investment in the preparation of this volume was endorsed by the NCRVE's late director, Charles Benson, and by E. Gareth Hoachlander, President of MPR Associates, Inc. Karen Levesque of MPR Associates and an anonymous reviewer of a draft of the manuscript provided valuable insights and suggestions for improvement that are reflected here.

Liping Chen, Database Manager in The Jacob France Center, Merrick School of Business, University of Baltimore, prepared the data sets that were used to create the tables and figures that are contained in this volume. The graphics were designed and prepared by Lyn Zhao and Rudee Laohakittikul. Michele Kraus assisted in the preparation of the final document. Kristy Wilson Axness, Communications and Research Manager in the Center, provided exemplary oversight of the complex interagency agreements and financial arrangements that arose in the data collection and research phases.


EXECUTIVE SUMMARY

Vocational education in the United States faces an uncertain future. Congressional action to consolidate the federal government's investment in education and training programs is expected. One manifestation of an anticipated growth of contested markets for students who want to acquire, extend, or renew occupational competencies will be a perceptible ratcheting up of the accountability threshold. The message to vocational education management will be to provide credible evidence of high value-added performance at reasonable cost in an easily understood and timely manner.

This volume looks beyond the performance standards topic to satisfy the needs of local and state authorities who seek a better understanding of the employment affiliations and earnings paths of former vocational education students. The basic data source relied upon is the quarterly wage record submitted by employers who are required to comply with their state's unemployment compensation law. Each such record contains just three data elements: (1) an employee's social security number; (2) the reporting employer's unique unemployment compensation tax account number; and (3) the earnings paid to this employee by that employer during the reference year/quarter. Among the facets of performance measurement that are covered here are (1) the quality of available administrative records to carry out this documentation; (2) the confidentiality of the records as this may affect their adequacy for the intended purpose; (3) the importance of recording multiple observations of a former student's employment status and earnings; and (4) the concept of joint outcome.

The first section of this guide provides a brief but thorough introduction to three decades of research that has been conducted using the basic administrative record that is the core of what follows. The 1986-1995 decade of vocational education contributions should be of particular interest to most readers.

The second section introduces and elaborates upon an optics metaphor that weaves three concepts into a tapestry of understanding about the interplay of candidate qualifications, employer requirements, and employment opportunity as these ultimately determine whether and how a former student prospers in the workplace. A fundamental realization that emerges from this metaphor is that a single vocational education event cannot be easily isolated as the single force that resulted in a particular status such as placement or training-related placement.

The third section sets forth and uses multiple concepts and measures of employment and earnings. The relevance of pre-enrollment, concurrent, and post-enrollment measures of each is emphasized. This section offers many examples of the weakness of single point-in-time measures of employment status, and documents why attribution of observed snapshots of employment as placements cannot be sustained in many cases.

With the fundamentals out of the way, the fourth section explores refinements that are likely to be of interest to vocational education managers who see the possible payoff to gaining new insights about their program outcomes. These refinements include (1) suggestions about aligning the timing of exit from school when the employment status and earnings levels of former students are examined; (2) the importance of documenting previous and continuing education for isolating the impact of a particular vocational education event; (3) how to deal with multiple employer affiliations in a single reference quarter; (4) methods to refine observed earnings figures to distinguish full-time and part-time employment; and (5) a brief discussion of the value of specifying models of the dynamic forces described here, which can then be estimated with available data to obtain more credible evidence of the impact of vocational education on a former student's productivity in, and rewards received from, the economy.

This volume is designed to complement earlier guides that concentrated on the collection of information through state systems like Florida's Education and Training Placement Information Program, and the recently released report of the Joint Commission on Accountability Reporting, which seeks to achieve widespread uniformity of reporting practices and definitions. Here, it is assumed that the data has been, or can be, acquired. The examples offered move readers to the next plateau of understanding, which is what to do with the data and why it is important to do so.


INTRODUCTION

Vocational education in the United States faces an uncertain future. Congressional action to consolidate the federal government's investment in education and training programs was expected during the current session, but did not happen. Many expect future restructuring to cascade down to the local and state levels in the form of lower funding flowing through new pipelines with more spigots than most vocational educators have been accustomed to in recent years.

One manifestation of the anticipated growth of contested markets for secondary and postsecondary students will be a perceptible ratcheting up of the accountability threshold that will have to be reached or surpassed by successful competitors. It will not matter whether this accountability is required by federal authorities, by a state council that adopts a uniform set of performance measures for multiple state agencies, or by a curious and attentive public. The message to vocational education management will be the same in each instance--provide credible evidence of high value-added performance. Such information must be made available at reasonable cost in a timely manner, and it must be easily understood by non-experts. This is the challenge that motivated the research undertaken to prepare this guide.

This guide is designed for local and state authorities who seek a better understanding of the performance of their vocational education programs. The basic theme is management diagnostics. No methods are described to carry out the rote tabulation of figures for a performance standards system mandated by some higher authority. State and national performance standards come and go. Such standards are therefore less likely to have a long-term effect on the way vocational educators manage their business than voluntary behind-the-scenes diagnostics that have always been carried out by competent administrators who care about their students, business colleagues, and community.

The treatment here builds on a solid foundation of pioneering contributions by others. These are identified in the next section. This guide is designed to motivate readers to aspire to reach a higher plateau of understanding on multiple fronts:

This volume complements Chapter 1: Placement Rates in the 1995 report of The Joint Commission on Accountability Reporting (JCAR). The JCAR was created in 1994 by the American Association of State Colleges and Universities, the American Association of Community Colleges, and the National Association of State Universities and Land-Grant Colleges. This was a preemptive response to evidence that member institutions were increasingly unable to satisfy constituent requests for performance-based information. The JCAR report highlights two pioneering state programs--(1) Florida's Education and Training Placement Information Program (FETPIP) and (2) the Research & Analysis Program of Washington's State Board for Community and Technical Colleges. The managers of these programs, Jay Pfeiffer and Loretta Seppanen, respectively, are charter members of a select group of pioneers who have consistently approached performance measurement challenges with a positive attitude. The result in each case is an exemplary program that has now been selected as a prototype from which others can learn. Components of the Florida and Washington programs have been replicated and adapted by other states for years. Six years ago, we were fortunate to be added to the list of colleagues with whom these program managers cooperate. Much of what appears in the database sections of this guide can be traced directly to their unselfish sharing of time and information.

A basic feature of the Florida and Washington state performance measurement programs is reliance on quarterly employment and earnings data acquired from Florida's Department of Labor and Employment Security and Washington's State Employment Security Department, respectively. These records are treated as a necessary, but not sufficient, source of reliable employment and earnings information. Similar, but not identical, records are maintained by the unemployment insurance unit of each state employment security agency (except New York) to support the management of the state's unemployment compensation program (except Michigan).

Throughout this and the remaining sections, the basic source document of interest is referred to as a quarterly wage record. This precedent is followed because the term has acquired colloquial familiarity among vocational educators. This is a particularly unfortunate term for novices because each record contains a quarterly earnings amount, not an hourly wage rate figure. This seed document differs in particulars across states, but there are many common features and data elements. The basic elements that are of sustained interest here are (1) an employee identifier; (2) an employer identifier; and (3) a dollar figure representing what this employer paid the reference employee during the designated year/quarter.
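To make this structure concrete, the sketch below represents a single wage record as a small Python data structure. The field names and types are illustrative assumptions only, not an actual state file layout; as noted above, the seed document differs in particulars across states, and the optional time-unit field is reported by only a few states.

from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class QuarterlyWageRecord:
    """One employer's report of one employee's earnings for one year/quarter.
    Field names are illustrative; actual state file layouts differ."""
    ssn: str                     # employee identifier (social security number)
    ui_account: str              # employer's unemployment compensation tax account number
    year: int                    # reference year, e.g., 1995
    quarter: int                 # reference quarter, 1 through 4
    earnings: float              # dollars paid to this employee by this employer in the quarter
    hours: Optional[int] = None  # time-unit element reported by only a few states

# A single hypothetical record: one employer paid this employee $4,250.00 in 1995:Q3.
example = QuarterlyWageRecord(ssn="000-00-0000", ui_account="UI-1234567",
                              year=1995, quarter=3, earnings=4250.00)

A former student's full history is then simply the set of such records, accumulated quarter by quarter, that share that student's social security number.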

Some of the early criticisms of this data source, which concentrated on what was not included, have been overcome by pioneers such as Pfeiffer and Seppanen, who have been aggressive advocates for and/or users of complementary information drawn from other sources such as the Office of Personnel Management for federal government civilian employment, the Department of Defense Manpower Data Center for military personnel information, the U.S. Postal Service, data sharing agreements with contiguous states, and periodic survey-based estimates to fill in gaps.

A seven-state alliance was established in February 1995, sponsored by the Employment and Training Administration of the U.S. Department of Labor through its America's Labor Market Information System (ALMIS) initiative. The purpose of this consortium is to investigate a series of issues that must be addressed to increase the value of wage records for third-party users. The senior author of this guide has lead responsibility for the technical facets of the scope-of-work being addressed by these consortium members (Alaska, Florida, Maryland, North Carolina, Oregon, Texas, and Washington). A summary of the first year's research findings will be released in 1996. Among the topics being investigated is the feasibility of adding three new data elements: an occupational element, as Alaska did some years ago in response to a state legislative mandate; a time-unit element such as hours or weeks worked during the reference quarter, as Florida, Oregon, Washington, and a small number of other states now report; and a geographic, or work-site, element that would eliminate a long-standing interpretive problem: allowable reporting practices in most states limit the accuracy of assigning workers to the actual location of their employment.

Of equal importance to readers is a report to Congress by the Bureau of Labor Statistics, which is expected to be released at any time. This report is expected to recommend the creation of a national distributed database capability, similar to the electronic network that is currently used to manage interstate claims processed on behalf of each state's unemployment insurance program (see Northeast-Midwest Institute, 1988, for a historical perspective). One of the fundamental payoffs expected from such a national distributed database capability would be routine interstate sharing of information about the employment status and earnings level of former students and trainees who are employed in other states and therefore cannot be found in the home state wage records file. There is no guarantee that this data sharing capability will become a reality, or, if it does, when. But the simple fact that the Bureau of Labor Statistics is expected to recommend such a step to Congress is a clear signal of the value that is placed on wage records as a reliable and inexpensive source of pertinent information--necessary to sensible performance measurement, but not sufficient by itself for such use.

Topics Covered

The remainder of this first section provides an introduction to the current status of one facet of the overall performance measurement issue--documentation of employment and earnings outcomes that can be associated with a particular education event. This overview includes appropriate references to related contributions that may be of interest to some readers. The second section introduces an optics metaphor to describe the interplay of supply and demand forces as each affects the recorded employment and earnings histories of former students. The third section follows with a primer on the proper measurement of each former student's employment status, mobility between employer affiliations, and earnings path. The fourth section then describes refinements of these fundamental components of a performance measurement system.

Measuring Employment and Earnings Outcomes

Overview

Wage records have been used for research and evaluation purposes for more than three decades (Stevens, 1994a). Pioneering uses included evaluation of adult retraining programs (Borus, 1964); a benefit-cost analysis of Neighborhood Youth Corps programs (Borus, Brennan, & Rosen, 1970); estimating a State Employment Security Agency's own penetration of all hires in a local labor market (Hanna, 1976; Siebert, 1976); and follow-up of Comprehensive Employment and Training Act (CETA) participants (MDC, 1980).

The senior author and colleagues first used wage records to document the earnings of former vocational education students in a study of Missouri graduates who had been supported by CETA funds (Atteberry, Bender, Stevens, & Tacker, 1982). The use of wage records for performance standards measurement began in the mid-1980s with Job Training Partnership Act (JTPA) applications (Trott, Sheets, & Baj, 1985).

The current era of wage record use for vocational education research purposes began in 1986, when the author proposed and then carried out such applications for the National Assessment of Vocational Education (NAVE) (Stevens, 1986, 1989a) and the Office of Technology Assessment (Stevens, 1989b). A seminal contribution that appeared at about the same time was Pfeiffer (1990). This was an interim report on a work-in-progress, which led directly to adoption of key components of Pfeiffer's pioneering FETPIP program in North Carolina and Texas, and to other second-generation efforts that feature the basic principles of the FETPIP.

The modern origins of national research applications and state program uses of wage records began as roughly aligned, but independent, agendas. These parallel tracks have converged in the 1990s. Today, multiple nodes of university-based research teams collaborating with state agency colleagues are observed across the U.S., particularly in California, Colorado, Florida, Maryland, Missouri, North Carolina, Texas and Washington.

A "that was then" "this is now" caution is offered for three reasons:

  1. Some early studies in the new era of concentrated research effort, such as Strong and Jarosik (1989) and Ghazalah (1991), used or referred to the potential use of state department of revenue, Internal Revenue Service (IRS), and Social Security Administration (SSA) data as an alternative to the use of wage records. Despite some comparative strengths relative to wage records such as national coverage, IRS and SSA records do not satisfy minimum timeliness and format requirements that must be met to carry out routine management diagnostic and performance measurement responsibilities. These sources may still be considered for periodic long-term evaluation studies, particularly if the basic features of the Clinton Administration's Simplified Tax and Wage Reporting System (STAWRS) proposal are adopted (see Internal Revenue Service, 1995). These records are not a practical substitute for the wage record applications that are addressed here.

  2. Now-dated surveys of state capabilities, such as Rahn, Hoachlander, and Levesque (1992), Jarosik and Phelps (1992), and Amico (1993), correctly described statutory and administrative regulation barriers that in some cases limited access to the basic wage record itself. Laws and rules have changed. With a few notable exceptions that are identified later, wage records are more accessible for third-party use now than at any previous time.

  3. An important complement to these legal and management advances has been a pervasive improvement in data processing capacity, accompanied by a steady decline in the cost of maintaining and using large databases.

These three statements, together, send a clear message: Skeptics who, for legitimate reasons, doubted the appropriateness of wage records for performance measurement purposes at an earlier time should revisit the issue to assign new weights to the pros and cons of relying on data sources that are available today.

Background

Three documents released in 1989 set the stage for a concentration of research attention on the use of wage records for vocational education accountability purposes: Using State Unemployment Insurance Wage-Records To Trace the Subsequent Labor Market Experiences of Vocational Education Program Leavers and Using State Unemployment Insurance Wage-Records To Construct Measures of Secondary Vocational Education Performance by Stevens, and A Longitudinal Study of Earnings of VTAE Graduates by Strong and Jarosik. The first of these, commissioned by NAVE, was designed to "separate fact from fiction and to establish a common ground for debating the merits of using [wage records] for vocational education performance assessment purposes." A theme of that paper was to distinguish between the demanding coverage requirements of a performance standards system and the less stringent requirements of a management diagnostics system. The intent then, as now, was to counter the orchestrated voices of those who oppose measuring anything unless everything can be measured.

Wage record coverage stops at a state's border. This is sufficient reason for some opponents of wage record use to conclude "if you do not know the employment status of former students who have left the state then you should not document the employment status of those who have remained in the state as productive employees." There is some merit to this stance when performance standards are the topic of discussion because the percentage and importance of unobserved cases will vary depending upon a school's proximity to work opportunities outside the state, the specialized nature of a school's program offerings, and the comparative work histories of former students who stay or leave. It is difficult, but not impossible, to assess the adequacy of what is observed as a reliable proxy for the hypothetical combination of these observations and the unobserved missing cases.

The management diagnostics situation differs from the performance standards situation, which is frequently characterized by subordinate opposition to attempts by higher management levels to impose measurement procedures that cannot be controlled or manipulated. Good managers typically embrace new sources of information that they have full discretionary authority to use or not use.

These are serious matters precisely because the largely unknown incidence of unobserved cases has uncontrolled impacts on uses that are made of the wage records for vocational education accountability purposes. One response to this situation, exemplified by Pfeiffer's progress in Florida and Loretta Seppanen's advances in Washington state, is to seek ways to estimate, or otherwise account for, these omitted groups. This has been done through ad hoc surveys, periodic matches against adjacent state wage record databases, and synthetic estimation of the employment status of unobserved cases. A quite different response, which is fading from view as awareness of the wage record data's strengths grows, is to remain aloof from endorsement and use of wage records, while often continuing to report placement outcomes identified through surveys of uneven quality.
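As a minimal illustration of the first step in any such accounting, the sketch below computes the share of a cohort of former students who appear in the home-state wage record file for a reference quarter. All names are hypothetical, and the unmatched remainder is exactly the group that ad hoc surveys, adjacent-state matches, or synthetic estimates would then have to address.

def in_state_coverage(cohort_ssns, wage_records, year, quarter):
    """Return (matched share, unmatched identifiers) for one reference quarter.

    cohort_ssns  : social security numbers of the former students of interest
    wage_records : QuarterlyWageRecord objects (see the earlier sketch)
    The unmatched group is not necessarily jobless; its members may be working
    in another state, self-employed, or otherwise outside wage record coverage.
    """
    employed = {r.ssn for r in wage_records
                if r.year == year and r.quarter == quarter}
    cohort = set(cohort_ssns)
    if not cohort:
        return 0.0, set()
    matched = cohort & employed
    return len(matched) / len(cohort), cohort - matched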

Three months after Stevens (1989a) was released by NAVE, a complementary paper, Using State Unemployment Insurance Wage-Records To Construct Measures of Secondary Vocational Education Performance (Stevens, 1989b), was published by the Office of Technology Assessment. This paper contained more detail about wage record coverage and content than had previously been available to general readers. It also provided a concrete example of how wage records can be used, based on one school district's data that had been provided on a pilot basis.

An important study by Strong and Jarosik (1989) was unique because it used state income tax records, Census Bureau data, and one of the first quarters of wage records collected by the state of Wisconsin, which was a late adopter of wage reporting requirements. This was a retrospective study of the 1985 and 1988 employment status and earnings levels of 1982-1983 graduates from a Wisconsin Vocational, Technical and Adult Education program. The time lapse involved in this type of study, which is similar to that found in Ghazalah (1991), who reported 1986 earnings of 1979 graduates, is not conducive to the management diagnostics use of data that is of basic concern here.

Throughout this period of increasing interest in the potential value of wage records for vocational education performance measurement purposes, a frequent question was, "Aren't these confidential records?" This topic had been addressed by Brown and Choy (1988), by the papers contained in Northeast-Midwest Institute (1988), and in Stevens (1986, 1989a). However, because of the importance of the issue, a compendium of state employment security agency confidentiality laws and regulations was prepared in support of the National Commission for Employment Policy's research on this topic (Stevens, 1990). Since then, the confidentiality topic has continued to receive attention indicative of its importance (see Journal of Official Statistics, 1993; National Forum on Education Statistics, 1994; and Stevens, 1994b).

The Adult and Youth Standards Unit, Division of Performance Management and Evaluation, Office of Strategic Planning and Policy Development, Employment and Training Administration, U.S. Department of Labor convened a technical workgroup in mid-1991 "to consider the desirability and feasibility of basing JTPA postprogram performance standards on unemployment insurance (UI) wage record data" (Bross, 1991, p. i). The technical workgroup's recommendations follow:

The fact that four years ago this team of experts endorsed the use of wage records for performance standards applications, which are more demanding of accuracy than mere performance measurement uses, should whet the appetite of vocational educators to learn more about wage records. As a postscript to the recommendations of this technical workgroup, the Division of Performance Management and Evaluation has not yet allowed states to use wage records as a substitute for the still mandated use of telephone survey follow-up methods. Instead, in 1993-1994, the Division commissioned a series of state case studies to compare employment and earnings outcomes derived from wage records and telephone survey data, respectively (see Stevens, 1994d, for Maryland's case study results).

Four years ago, in Jarosik and Phelps (1992), the National Center for Research in Vocational Education documented thirteen state profiles of wage record use for follow-up purposes. The authors recommended that all State Directors of Vocational Education collaborate with their Committees of Practitioners to assess the UI wage record database as one source of information for their state performance measures and standards systems; and that continued investigation was needed to improve the value of the wage record data for such purposes. This is the primary objective of the present guide.

The flurry of research findings that had appeared in 1989-1991 was reflected in the U.S. Department of Education's Office of Policy and Planning report (Stevens, Richmond, Haenn, & Michie, 1992) that set forth a five-step wage records implementation plan for states. Prior to this time, analysts had concentrated on post-schooling outcomes only. A different perspective is found in Stern and Stevens (1992). Here, the influence on subsequent earnings of enrollment in a cooperative education program while in high school was investigated. A positive association between participation in cooperative education and subsequent employment success is documented, but this cannot be translated into confirmation of a cause-and-effect sequence because too many covariates expected to be relevant were unobserved.

Explicit attention was given to discretionary management diagnostics in Smith and Stevens (1994), which uses Colorado Community College & Occupational Education System records for illustrative purposes. Colorado data is also found in Stevens (1994c), which is the foundation upon which the present guide has been assembled. This Working Paper, which was issued by the National Center on the Educational Quality of the Workforce at the University of Pennsylvania, uses merged wage record and student data obtained from Colorado, Florida, Missouri, and Washington. Many state-specific examples document the importance of both pre- and post-coverage to capture at least some of the work experience that contributes to a joint outcome that has too often been attributed to the most recent education event alone (also see Stevens, 1994f).

A year ago, John Wirt, who was the manager of the National Assessment of Vocational Education when it issued its final report in 1989, authored a Wage Record Information Systems report (U.S. Congress, Office of Technology Assessment, 1994). The introductory paragraph of this report states that "this background paper responds to section 408 of the 1990 amendments to the Perkins Act, which asks OTA to review activities to be undertaken by the National Occupational Information Coordinating Committee (NOICC) to encourage the use of wage records from state unemployment insurance systems for purposes of conducting policy studies or monitoring the outcomes of vocational education" (p. 1). The NOICC had supported the National Governors' Association survey of state capacity to use wage records (Amico, 1993), and had sponsored the preparation of a how-to-do-it guide for setting up a wage record information system, which was released last year by MPR Associates (Levesque & Alt, 1994). Wirt's report provides a valuable synthesis of issues, findings, and references that are available for more in-depth coverage of a particular topic. Those who are just beginning to familiarize themselves with the wage record topic will appreciate the brevity and clarity of Wirt's text.

The niche that remained after all of these studies had been reviewed was motivation: vocational educators now had access to multiple how-to-do-it guides, particularly Levesque and Alt (1994), but few compelling reasons to act on this awareness. The present guide takes this foundation of understanding for granted and advances to the challenge of motivating managers to want to use these records. A building-block approach follows in the remaining pages of this guide. The next section begins by answering the question, "Why do we need to know about pre-, concurrent, and post-enrollment employment status and earnings profiles?"


A FOUNDATION FOR DOCUMENTING EMPLOYMENT AND EARNINGS OUTCOMES FROM INVESTMENTS IN VOCATIONAL EDUCATION

The concluding paragraph of NAVE's Interim Report to Congress (U.S. Department of Education, 1994) serves as an important warning to anyone who is interested in documenting a vocational education/employment and earnings connection:
What has become apparent from reviewing vast literature is that further work is needed to sort out the complicated and often contradictory findings regarding the economic returns to vocational education. Many of the studies reviewed reveal that it is the interaction of a variety of factors that predicts economic success. Until researchers can isolate those effects, or better understand the interaction between key factors, conclusions about the effects of vocational education will remain tentative. (p. 405)
This conclusion should not be permitted to become an excuse for inaction because we do not know enough yet (see Grubb, 1995; Kane & Rouse, 1995a, 1995b). Responsible authorities must act now, even if the validity of a relationship between a particular education event and earnings has not been established. Here, validity refers to evidence that a measure--such as earnings--actually represents a vocational education outcome. Some confuse this concept with a measure's reliability, which represents the strength and consistency of evidence that a measure is accurately recording what the investigator wants to record. For instance, it might be concluded that a State Employment Security Agency's database is a reliable source to document a former student's earnings level, while believing that this is not a valid measure of a vocational education outcome. This view might be expressed if one thinks that these measures also reflect the influence of other factors, such as ability, motivation, previous employment experience, other education exposure, and work-site circumstances, which make it difficult, and perhaps impossible as a practical matter, to isolate vocational education's independent influence (see National Institute of Education, 1981, p. VII-22; and Wirt, Muraskin, Goodwin, & Meyer, 1989, p. 122, for a historical perspective).

Three Fundamental Concepts

Progress toward the collection of convincing evidence of a vocational education/ employment and earnings relationship will require a consistent recognition that employment status and earnings are joint, or combined, outcomes that reflect the interaction of candidate qualification factors, employer requirement factors, and employment opportunity factors.

Figure 1 provides a visual representation of a metaphor that describes the interplay among these three basic concepts. The importance of this interdependence must be understood to appreciate the relevance of the outcome measurement lessons that appear in the next two sections.

The metaphor used here is optics. Clear vision is a function of complex forces that remain a mystery to most of us. Similarly, a former student's subsequent employment status is determined by their own qualifications and actions, the qualifications and actions of others, employer requirements, and employment opportunity. Aspects of this interplay remain as mysterious to many people as the phenomenon of sight.

Pursuing the optics metaphor, think of a former student's decision to seek employment as a pulse of light, and an employer's decision to hire a new employee as a second pulse of light. Figure 1 shows each of these beams of light passing through two screens before arriving at a target. The human eye passes light through pupil, lens, vitreous fluid, and retina before it is translated through the optic nerve to the brain. Each component has a distinct function that contributes to the quality of perception.

A person's decision to look for a job is screened through their own unique bundle of qualifications, and those of all others who might be thought of as competing candidates, before arriving at the employment opportunity destination. An employer's decision to hire a new employee is screened through their own specification of requirements, and the requirements of other employers who are competing for available candidates, before arriving at the hiring opportunity point. The roles of each of these screens and the employment opportunity set for investigating vocational education outcomes must be understood before proceeding to the examples that appear in the last two sections of this guide.

Qualification Factors

These factors include any attribute that is used by an employer to determine a person's qualification for initial hire, retention, on-the-job training, subsidized continuing education, and promotion. These attributes include common and unusual factors, legal and illegal factors, and persistent and periodic factors.

Historically, a common candidate qualification criterion has been evidence of having reached an educational plateau such as high school graduation or receipt of a postsecondary certificate or degree. However, as the pool of those who have reached each of these plateaus has grown, the relative importance of this qualification criterion has fallen. The value of an attribute as a discriminator among candidates falls as the fraction of those who offer the attribute rises. At the extreme, when all of the candidates offer an attribute, it cannot serve as a basis for selection. And, of course, if no candidate offers the attribute, it cannot serve as a discriminator. This is not the same as saying that an attribute cannot be a requirement when all candidates exhibit it. Achievement of the goal of universal literacy by the Year 2000 would not mean that literacy would then become irrelevant.

A less common screening criterion, but one that is being promoted as a prime candidate for widespread use, is certified competency. The National Skill Standards Board established pursuant to the Goals 2000: Educate America Act is attempting to hasten the obsolescence of employer reliance on evidence of program completion alone as a candidate qualification criterion. Vocal advocacy will be heard for the substitution of evidence of actual skill competencies that are consistent with established industry standards or requirements. Another pertinent qualification criterion is an employer's preference for relying upon referrals from current employees, based on a belief that incumbent employees know the qualifications that are appropriate and that they have a selfish interest in screening out poorly qualified candidates whose failure on the job would reflect on their own judgment.

Whether consideration of a particular candidate qualification criterion is common or unusual may affect how wage record data will be interpreted as evidence of a vocational education outcome. At higher levels of geographic aggregation, the relative importance of an unusual screening criterion will recede. For example, a particular classroom teacher may have earned a local reputation for attracting highly motivated and competent students who are aggressively recruited by local employers. This reputational advantage will be reflected in the employment and earnings records of this teacher's former students. However, at higher levels of aggregation, such as the district-wide performance of peers who were exposed to a similar curriculum, this advantage will be offset by average and below average achievements.

It is important to consider how employer use of illegal candidate qualification criteria might affect the interpretation of wage record data as evidence of a vocational education outcome. A school administrator's awareness of pervasive discrimination practices in a particular employment sector might be expected to result in either, or both, of the following undesirable policies.

  1. Discouragement of enrollment by members of the group that is discriminated against because their diminished chances for employment success will make the vocational education program look bad.

  2. Passivity with regard to student enrollment decisions, followed by no conscious attempt to help members of the affected population overcome the known barrier to employment success.

Use of a performance measure that documents employment status or earnings level could be expected to increase the denial or discouragement of enrollment by those who are believed to have poor prospects for successful entry into employment, while simultaneously providing a new incentive to increase the resources that are devoted to the special needs of those who do enroll. Investment in the documentation of employment and earnings profiles over time will provide accurate information about how members of particular populations have fared in the workplace. Administrative actions (e.g., discouragement of enrollment) may be based on false beliefs about employer attitudes and behavior. At the same time, care is urged to be alert to the danger of hasty action when poor performance is revealed. Future generations of students deserve to know whether their predecessors have succeeded, and whether educational accomplishment contributed to the documented success or failure.

These examples indicate how employment and earnings data might be used to help, or to hurt, a vocational education program. Fear of the unknown, and concern about a loss of control over performance measurement, have led some vocational educators to oppose the acquisition and use of this data.

The distinction between persistent and periodic candidate screening criteria raises a similar uncertainty, or fear that data will be misused. Qualifications that are sufficient for successful candidacy at one time, or in one place, may not suffice at a different time, or in another location. Economic conditions vary across locations at any point in time, and over time within a particular location. When these differences, or changes, are large, they strain a vocational educator's ability to calibrate their offerings to current standards.

Requirement Factors

The previous paragraph serves as a rhetorical bridge from candidate qualification criteria to employer requirements. Employer hiring or promotion requirements are not independent of, or insulated from, candidate qualifications. Instead, there is a two-way interplay between candidate qualifications and employer requirements. Inevitably, employer requirements are both a cause of and a response to candidate qualifications. Twenty-two Skills Standards Projects are underway, funded by the U.S. Department of Education and the U.S. Department of Labor. These investigators are discovering how difficult it is to identify an agreed upon skill standard in an industry. First, the industry must be defined, which involves contentious issues associated with competing certification bodies. Then a decision must be made whether the standard will be based on a threshold level of entry-level performance, typical incumbent performance, or leading-edge performance. Each of these definitions is subject to change as technologies evolve and workforce availability and qualifications ebb and flow. Careless introduction and use of new skill standards can result in diminished opportunity for those who fail to attain the threshold level of demonstrated performance competency that is recommended. If this diminished opportunity is considered to be an unfortunate, but necessary, artifact of the Nation's progress, then an observer may reluctantly conclude that this outcome was inevitable. However, if the diminished opportunity is subsequently found to have resulted from a redefinition of standards that has no validity in the context of future production requirements, then a different conclusion would be justified.

Wage record data offer an unprecedented opportunity to monitor the relationship between candidate qualifications and employer requirements, but this relationship only has meaning in the context of the third component of Figure 1--the employment opportunity set. An alignment of qualification and requirement is sterile in the absence of opportunity. This can be illustrated by returning to the optics metaphor. The pupil, lens, and vitreous fluid may be normal in both right and left eyes, so light is properly focused on the retina; however, if either retina (i.e., the metaphorical target in Figure 1) is damaged, the brain's perception will be distorted or destroyed. Similarly, candidate qualification bundles may be accurately recorded, and employer requirements may be defined with equal clarity, but in the absence of employment opportunity each is meaningless.

The Employment Opportunity Set

A perennial complaint by vocational educators is that they should not be held accountable for the economy's success or failure in generating enough jobs to absorb their former students. Critics respond that the Nation's investment in vocational education only makes sense if the skills acquired through these auspices are actually used in the workplace. These are simplistic characterizations of complicated terms of debate.

No one argues that decisions about the level and type of vocational education investment should ignore current and projected labor market conditions. Discontent arises from what are considered to be unreasonably short time horizons for adapting curricula and enrollment flows to new economic circumstances. No one advocates an exact alignment of skill acquisition and use. Instead, displeasure is expressed about recurring patterns of nonuse of skills that are expensive to develop.

The candidate qualification, employer requirement, and employment opportunity components of Figure 1 can now be interpreted using the optics metaphor. A person's decision to look for or keep a job, or to seek a promotion, is based on an assessment of their current bundle of qualifications and the perceived qualification bundles of competing candidates, both interpreted in the context of speculation about the employment opportunity set. A candidate's decision to act is characterized as a pulse of light with four descriptors:

  1. Timing--The timing of a student's decision to act is important because this affects the two screens (own bundle of qualifications and competitor qualification bundles) that the pulse of light must pass through before reaching the employment opportunity target.

  2. Intensity--The intensity of light that reaches the employment opportunity target is determined by the quality of the student's own bundle of qualifications considered in the context of the qualification bundles offered by competing candidates. On an illumination continuum, the most qualified person can be characterized as the brightest light reaching the employment opportunity target, while the least qualified candidate's availability may go unrecognized.

  3. Aim--Figure 1 is intended to provide a clear visual image of how a student's accomplishments can be described in terms of the aim of the pulse of light. The directness of aim with respect to reaching the employment opportunity target is determined by the appropriateness of qualification relative to other competing candidates. Think of ricochet as a descriptor of ill-conceived preparation. Ricochet within the cone representing a student's own bundle of qualifications represents poor preparation in an absolute sense. Ricochet within the cone representing the pool of potential competitors indicates weak preparation in a relative context.

  4. Sustainability--This descriptor represents the importance of keeping one's credentials in front of those who make personnel decisions. It is not sufficient just to reach the employment opportunity target; it is important to be there when an appropriate employer action arrives (see below). Sustainability is determined by a student's motivation and persistence.

The upper half of Figure 1, then, is intended to create a visual impression of three components: (1) a student's own bundle of qualifications, (2) the qualification bundles offered by competing candidates, and (3) an employment opportunity target. Poorly timed action, weak qualifications, and limited persistence, alone or together, serve as warning signs that employment and earnings outcomes may be unsatisfactory.

This part of Figure 1 also conveys a strong message that the expected impact of one spell of exposure to vocational education must be interpreted in the overall context of a student's own previous achievements, the current qualification bundles offered by competing candidates, and the current reservoir of employment opportunities. Each reader is urged to consider how a particular vocational education event might be expected to affect the timing, intensity, aim, and sustainability of a student's competitiveness. Four scenarios illustrate the range of conclusions about vocational education's impact that might be reached: (1) a high school student who has completed an integrated multiyear program of vocational courses without accompanying work experience; (2) a successful completer of a three- or four-year Tech Prep curriculum with related employment experience; (3) an employed adult who has completed three seemingly unrelated community college courses to qualify for promotion within the company; and (4) an employed, or unemployed, adult who has returned to a community college to complete one or more modules of vocational courses to prepare for a career change. Use Figure 1 to decide how vocational education might affect each student's pulse of light (i.e., candidacy).

The lower half of Figure 1 represents the demand elements of what has been labeled "a dynamic context of employment opportunity." The optics metaphor should be familiar enough by now that this part of the story can be told more quickly.

An employer's decision to attempt to hire a new employee, or to retain or promote an incumbent, begins with a specification of the requirements that candidates will be expected to meet or surpass. This specification takes into account both the threshold requirements of the position itself and what is known about the requirements of other employers who are expected to compete for the same population of candidates. These two screens are identical in function to those in the upper half of Figure 1; they affect whether, when, with what intensity, and for what duration the employer's action appears on the employment opportunity target. Here, ricochets represent requirements that are out of sync with the actual demands of the job, or with the requirement bundles that are used by competing enterprises. An employer's transmission of a pulse of light that is ill-timed, or that does not persist, is less likely to find qualified candidates on the employment opportunity target.

Think of the employment opportunity set as a stage floor, with a student's candidacy for employment and an employer's announcement of a job opening as spotlights that have the capability to sweep this stage. A match occurs only if these two spotlights overlap on this stage above a minimum level of illumination. Timing, intensity, aim, and persistence each contribute to the likelihood that a match will occur. The stage (i.e., the employment opportunity set) may be large or small; there may be many job opportunities, or only a few. The sweep of either of the spotlights may be limited, which means that the likelihood of overlap is reduced or even eliminated. This will happen if the qualifications of available candidates are inconsistent with the requirements of available jobs. Persistence on either side becomes important if time is required for either party to modify their bundle of qualifications/requirements.

Visualize a student's emission of a pulse of light being reflected back by the pool of available candidate qualification bundles. This describes a case in which a student's decision to seek work with a current bundle of qualifications is rejected by the availability of better qualified candidates, or by candidates with similar qualifications who are willing to accept a less costly compensation package. The student must then decide whether to try again at another time or in a different local labor market, to upgrade their own bundle of qualifications, or to modify their compensation requirements. Similarly, when an employer's announcement of a job opening is reflected it means that their bundle of requirements is out of sync with those broadcast by competing enterprises, so they have to decide whether to modify their expectations, seek candidates elsewhere, withdraw the opening, or sweeten the compensation offered.

The fundamental theme of this optics metaphor is that each vocational education event has one impact on one qualification factor for one candidate. The remaining two sections in this guide describe how a state employment security agency's administrative records can be used to document elements of this impact.


EMPLOYMENT AND EARNINGS CONCEPTS,
MEASURES, AND THEIR USE

This section is split into employment and earnings subsections. The first subsection concentrates on why a single snapshot recording of a former student's employment status should be avoided, and how wage records can be used to learn more about a former student's subsequent employment and earnings history. The topic of training-related employment is deferred to the final section, where refinements of a basic performance measurement system are presented. The second subsection covers multiple earnings concepts and measures. The topic of full-time versus part-time earnings is investigated as part of a broader discussion of full-quarter/partial-quarter and year-round/seasonal earnings.

Employment Concepts

Placement has been the traditional outcome measure of choice for the observers of vocational education activities who have been willing to acknowledge the relevance of any measure of employment status. Historically, this term has not been blessed with a single universally accepted definition. The most common definition refers to a former student's employment status during a specified interval soon after leaving school.

Evidence is presented later in this section that challenges continued reliance on placement as a single, or even primary, measure of employment outcome. There are at least three good reasons why caution should be exercised in the use of a placement measure.

  1. Some students maintain an employer affiliation after leaving school that began while they were still enrolled in school, or even before they were enrolled. This means that the basic concept of placement is irrelevant in such circumstances. This pattern is particularly frequent among former community college students. The recorded employment status is not an outcome at all, unless the observed continuity of employment is thought to have been contingent upon the employee's participation in the reference vocational education activity. Also, in such cases, the phrase transition from school to work mistakenly identifies simultaneous activities as sequential events.

  2. Employment status at any particular time is a joint outcome of many forces, only one of which is a vocational education activity in which a former student may have participated. There are many reasons why encouragement of documented employment at a particular time may have undesirable consequences. For instance, awareness that employment status during a particular interval following graduation will be used as a performance measure provides an explicit incentive for authorities to act to assure that as many of the former students as possible will be counted as employed at this time.

    Former students are likely to be encouraged to accept inappropriate jobs just so they can be recorded as employed at the designated time. Such distortions can then have long-term consequences if this job limits a former student's subsequent opportunities to advance, if it promotes an attitude of disappointment with the reward realized from investing time and emotion in the pursuit of additional education, or if it establishes a pattern of lateral job-hopping without advancement.

  3. Reliance on a single snapshot of employment status, such as the January-March quarter following the year of graduation, can introduce seasonal, and even cyclical, distortions that reflect differences in local labor market conditions that are likely to be unrelated to vocational education's potential long-run contribution to the well-being of former students, the local community, and the Nation.
A sensible approach to estimating vocational education's contribution to the prosperity of former students, their community, and the Nation at large would document different time intervals for high school, community college, and baccalaureate/post-graduate students:
  1. High School Students--At the outset, employment during the last months of school enrollment should be documented. This provides a benchmark against which subsequent employment can be gauged. Continuity of employment through the bridge period of leaving school can then be identified, which makes it possible to investigate the interplay of concurrent employment and enrollment in classes. Patterns which reveal how one type of affiliation while in school is often associated with a different, but predictable, affiliation after leaving school can also be identified.

    Repeated measurement of employment status after a former student leaves school also makes it possible to match employment records with postsecondary education records. Caution must be exercised in doing so: unobserved enrollment in out-of-state postsecondary institutions can affect an analyst's interpretation of observed education/employment pairings and earnings relationships. Evidence is presented in the next section that demonstrates how combinations of employment and continuing education can be documented. This is especially important to avoid mistaken attribution of observed employment status and earnings levels as outcomes of high school vocational education alone. A minimal sketch of such a record match appears after this list.

  2. Community College Students--Everything that applies to high school students is pertinent here, too; however, knowledge of pre-enrollment employment status is of interest as well. Many community college students have established records of employment that must be considered as a joint input when conclusions are reached about the relationship between education and subsequent employment.

    Documentation of previous postsecondary education is also important. Careless reliance on the assumption that a community college degree, certificate, or other type of recognition of achievement is a student's highest level of accomplishment is increasingly untenable, as many recipients of baccalaureate and graduate degrees have returned to community colleges to update or extend their skills.

  3. Baccalaureate/Post-Graduate Students--Again, overlays of pre-enrollment, concurrent, and post-schooling employment with the full series of educational affiliations would be best suited for a serious attempt to isolate the effect of any one component of this investment series on a former student's subsequent employment history.
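As noted in the first item above, matching employment records with postsecondary enrollment records is what makes concurrent education/employment pairings visible. The sketch below is a minimal illustration of such a cross-classification for one reference quarter; the function name, the data layout (sets of social security numbers), and the four category labels are illustrative assumptions rather than features of any state's actual system.

```python
from collections import Counter

# Cross-classify former students, for one reference quarter, into four
# education/employment pairings. The inputs are assumed to be sets of social
# security numbers: those with reported earnings and those with postsecondary
# enrollment in that quarter. Category labels are illustrative only.

def education_employment_pairings(reference_ssns, employed_ssns, enrolled_ssns):
    counts = Counter()
    for ssn in reference_ssns:
        employed = ssn in employed_ssns
        enrolled = ssn in enrolled_ssns
        if employed and enrolled:
            counts["employed and enrolled"] += 1
        elif employed:
            counts["employed only"] += 1
        elif enrolled:
            counts["enrolled only"] += 1
        else:
            counts["neither recorded in state"] += 1
    return counts

# Hypothetical population of three former students: each falls in a different cell.
print(education_employment_pairings(["A", "B", "C"], {"A", "B"}, {"B"}))
```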

Why Recording a Single Snapshot of Employment Status Should Be Avoided

The traditional and still common approach to documenting alleged employment outcomes of vocational education is to record employment status during a specified interval following a former student's exit from school. A typical report from this type of follow-up documentation describes how many of the former students were found in (1) training-related jobs, (2) other jobs, (3) continued education, or (4) military service.

Figure 2 illustrates how this traditional approach can be refined to more accurately reveal the relationship between a former student's participation in a vocational education activity and an observed employment status. Beginning in the upper left corner of Figure 2, it is important to identify all members of the reference population. Typically, this will be a single year's graduates (e.g., members of the graduating class of 1993-1994). Potential interpretive problems emerge immediately.

Two rules are recommended for universal adoption in carrying out a follow-up protocol based on Figure 2:
  1. Clearly identify the reference population, so no ambiguity arises about who is included. It is assumed here that the desired reference population of former students can be identified by their social security numbers. Confidence in the accuracy of this assumption is increasing as time passes because changes in reporting requirements for tax and educational accountability purposes have combined in recent years to reduce the percentage of students who cannot be identified in this way. However, important omissions persist. Recent immigrants to the U.S. are unlikely to obtain a social security number right away. Many of these immigrants enroll in community college classes to improve their language and employability skills. This presents a classic measurement error problem, since the very first step in Figure 2 cannot be taken for such enrollees.

    Electronic reporting has also improved the accuracy of the identifiers that are reported, which is important because an error at this point makes it impossible to create an accurate longitudinal record. The appearance of student social security numbers on school records cannot be assumed to mean that record matching can be carried out. Ohio's schools maintain student records that include social security number identifiers, but school officials are prohibited from transmitting this information to the state for accountability purposes. No attempt is made here to offer a generic statement about what is permissible in a particular jurisdiction. Each reader who is concerned about this topic is urged to investigate applicable laws and administrative policies for their own situation. Pfeiffer (1994) and Levesque and Alt (1994) offer pertinent background information.

  2. Be sure that all members of this population are accounted for in each subsequent tabulation of recorded employment status. The most frequent criticism of traditional vocational education follow-up practices is that nonresponse biases diminish, and perhaps destroy, the reliability of alleged outcomes. It will become clear in subsequent sections of this guide that the use of state employment security agency records does not overcome this problem. However, an extraordinary innovation in interstate data exchange capability would follow favorable Congressional action on the anticipated recommendation by the Bureau of Labor Statistics that a national distributed database capability be established.
The remaining elements of Figure 2 should be read from left to right. Each vertical stack of ovals accounts for all members of the reference population. The dates shown reflect an actual database that will be described in conjunction with Figure 3. The vertical stack of three ovals in the middle of Figure 2 distributes the reference population of former students into three categories:
  1. The top oval includes all of the former students who appear in the particular state employment security agency's records of reported employment during the July/August/September quarter of 1991; thus the designation 1991:3.

    For most high school students, the third quarter (July-September) can be assumed to be the first, or transition, quarter following graduation. Those who leave school without a diploma or who must attend summer school to complete their graduation requirements are exceptions to this general rule. Care must also be exercised to account for the timing of enrollment in postsecondary coursework, which usually begins in August or September following graduation; that is, during the third quarter. Anecdotal observation of international and domestic travel sojourns by recent high school graduates suggests that the third quarter of the year of graduation is unlikely to be representative of post-graduation employment status. The primary value of this snapshot of employment status is to identify those who continue a previously established affiliation with a particular employer, as well as those who immediately accept a job with a new employer. The distribution of school leaving dates for community college students is less concentrated, so more care must be exercised in defining a particular quarter as the transition quarter.

  2. The middle oval in the vertical stack of three in Figure 2 includes those former students for whom no record of 1991:3 employment was found, but who did appear in a later quarter's records. The homogeneity of this population depends upon the length of the observation period; that is, how many quarterly snapshots of a former student's status have been carried out to determine whether they were employed in the same state.

    Currently, most states that are known to be using state employment security agency administrative records to document post-schooling employment status record a single quarter's status. The reference quarter differs, with the October-December quarter of the year of school-leaving and the January-March quarter of the next year as the most popular choices. Florida's Education and Training Placement Information Program (FETPIP) has created longitudinal files that include multiple annual observations of the October-December quarter for particular reference populations of former students. Similarly, Washington's State Board for Community & Technical Colleges acquires multiple annual snapshots of the January-March quarter for its reference populations of former community college students.

  3. The bottom oval in the vertical stack of three, which is labeled no post-school employment recorded through 1992:4, includes those members of the reference population of former students who were not found in any of the six quarterly snapshots of reported employment covering the period 1991:3 through 1992:4; that is, July 1991 through December 1992.

    The location and status of the former students represented in this oval are unknown. They may be working in another state. They may be federal government civilian employees or members of the military. They may work for the U.S. Postal Service, for a railroad, or for a religious or philanthropic organization. They may be self-employed or working as commissions-only independent contractors. They may be enrolled in a public or private postsecondary school in this state or elsewhere. They may be incarcerated or hospitalized. They may be traveling internationally or domestically.

    The point of the previous paragraph is to acknowledge that attention can be focused on the members of the reference population who appear in the top or middle ovals (those who are found in the state employment security agency's database) or on those who end up in the bottom, residual oval. Those who tout the value of the employment security agency's data will emphasize what can be documented, while detractors will highlight what cannot be documented. A minimal sketch of this three-way classification appears below.
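Assuming that the reference population is identified by social security number and that the agency's wage records have been grouped by quarter, the three-way split just described reduces to a short classification rule. The sketch below is illustrative only; the function name, data layout, and quarter labels are assumptions, not a depiction of any state's actual system.

```python
# Sketch of the three-way classification in Figure 2, assuming wage records
# have been grouped into a dict mapping each quarter label (e.g., "1991:3")
# to the set of social security numbers with reported earnings in that quarter.
# Function and variable names are illustrative only.

def classify_three_ovals(reference_ssns, ssns_by_quarter,
                         transition_quarter="1991:3", last_quarter="1992:4"):
    # "YYYY:Q" labels sort correctly as strings, so a simple range test works.
    window = [q for q in sorted(ssns_by_quarter)
              if transition_quarter <= q <= last_quarter]

    top, middle, bottom = set(), set(), set()
    for ssn in reference_ssns:
        if ssn in ssns_by_quarter.get(transition_quarter, set()):
            top.add(ssn)        # employed in the transition quarter
        elif any(ssn in ssns_by_quarter.get(q, set()) for q in window):
            middle.add(ssn)     # not found in the transition quarter, but found later
        else:
            bottom.add(ssn)     # no post-school employment recorded in the window

    # Every member of the reference population is accounted for exactly once.
    assert top | middle | bottom == set(reference_ssns)
    return top, middle, bottom
```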
The vertical stack of seven ovals on the right side of Figure 2 distributes the members of the reference population of former students into more valuable categories from a school management standpoint. The lone oval that is offset between the vertical stacks of three and seven ovals includes all of the former students who were reported as employed in both the April-June quarter and the July-September quarter of the reference year, but by at least one different employer in each of these quarters. The phrase at least one different employer is important. A former student may have worked for multiple employers during a given three-month period. The authors have found as many as seven reporting employers during a single quarter for one former student. Decision rules have to be devised to handle these multiple-employer cases. The final section of this guide describes recommended approaches to deal with this situation. These former students are identified here to create a way to investigate the relationship between employment accepted during the last months of school enrollment, assumed here to be April-June; the continuity of that affiliation after leaving school; and the effect of different affiliation patterns on earnings and continuing education. The relevance of each of the seven categories is described next:
  1. The top of the seven oval stack is of particular importance. It represents a direct challenge to the accuracy of the placement concept. Each of the former students who is included in this category of post-schooling employment status has been reported as working for the same employer during each of the four quarters of the year that they left school.

    For virtually all former high school students, appearance in this employment category means that they continued to work for the same employer for between four and six months while they were still in school, and for between four and six months after graduation. The four-month intervals would apply if a high school senior was paid for employment during any part of March, then remained employed from then through at least the beginning of October. This would result in recorded employment during each of the four quarters of the year, which actually represents concurrent employment and high school enrollment from the beginning date of employment in March through the graduation date, followed by employment only from then through the date in October when this employment affiliation ended. Verification of the employment only status would require matching of this former student's social security number with available postsecondary enrollment records.

    The maximum pairing of two six-month employment spells would occur only if the student was employed at the beginning of January and then continued that affiliation through December. Here, use of the word maximum refers to the longest possible length of continuous employment during the one year reference period. This is an arbitrary choice. The reference period can be lengthened by beginning earlier or monitoring longer after a student's graduation. The recording of earlier employment status might be of particular importance when the reference population is former community college students, many of whom have years of previous and concurrent employment experience. More diagnostics are required to accurately identify continuity of employment affiliation for former students in postsecondary programs because multiple exit dates are possible.

  2. The second and third ovals (from the top in Figure 2) are separated for a reason that is not apparent from this figure alone. Former students represented in each of these two ovals had been reported as employed by at least one different employer during the April-June and July-September quarters of the year they left school. What distinguishes the two groups is that the former students in the upper of the two ovals had also been reported as employed in the January-March quarter, while those in the lower of the two ovals had not been reported as employed during this three-month period. The "no 1991:1 earnings reported" label indicates that these former students were not found in the first quarter records of employment reported to the state's employment security agency. The same labeling convention applies to the other two pairings of ovals in the stack of seven in Figure 2. The first quarter is thought to be of particular importance because it allows the investigator to distinguish between cases in which employment may have been reported during some part of June, following graduation but still during the second quarter, and those cases in which genuine continuity of an already established employer affiliation occurs.

  3. The former students who are classified in the fourth and fifth ovals (from the top in Figure 2) are all characterized by the uniform descriptor that they did not appear in the state employment security agency's database of reported employment during the July-September quarter of the year they left school. In other words, there was a distinct break between their leaving school and the appearance of their first reported post-schooling employment affiliation. Again, the former students who exhibit this common characteristic have then been split into those who had been reported as employed at some time during the January-March quarter, and those who had not been reported as employed during this three-month period.

  4. The bottom pairing of ovals in the stack of seven in Figure 2 covers those who had been reported as employed in the first quarter of the year they left school, but had not been reported as employed at any time from July 1 of that year through December 31 of the following year, which covers the first eighteen months after they left school. This assumes the former students all left school in June, which is a reasonable assumption in most high-school circumstances, but not for many postsecondary situations. It is possible for the former students found in either of these paired ovals to have been reported as employed during the April-June quarter of the year they left school. This possibility is downplayed here only because attention is later focused on comparisons of those who sustain preexisting employer affiliations and those who start anew. One isolated spell of reported employment is of little consequence in such inquiries. For other purposes, awareness of this employment may be important.

    From a vocational education standpoint the former students who are classified in the bottom oval in the stack of seven in Figure 2 are of particular interest. Assume that the reference population is high school graduates in a particular year, all of whom had completed some type of vocational program. A finding that none of these students had been reported as employed in the same state during the next eighteen months can at least be characterized as a signal to investigate further. Many explanations that are consistent with vocational education's mission will apply to some members of this group of former high school students. Some will have left the state to accept jobs that utilize the skills they have acquired. Others will have enrolled in postsecondary programs to build upon the foundation of skills they have already learned. Still others will be working in one of the noncovered types of employment (such as federal government civilian or military employment, self-employment, work for a railroad, and affiliation with a philanthropic or religious organization).

    This unknown status is particularly important when comparisons among populations of former students are anticipated. Different expectations of post-schooling behavior are likely to emerge, such as the probability of postsecondary enrollment for a class of high school graduates. The pertinent management decision that must be made is whether, and how far, to pursue the classification of these unknowns. Some components of this classification can be accomplished in a routine manner at relatively low cost. Exchanges of information between secondary and postsecondary levels within a state are occurring more frequently. These voluntary administrative actions reduce the number of unknown cases and increase the public's understanding of, and confidence in, reported outcomes.
This subsection on employment concepts can be summarized by comparing the depth of understanding of employment status that emerges from Figure 2 with the traditional use of a single snapshot of post-schooling employment status that is then labeled placement. The practical value of the seven categories of employment status will be demonstrated in the next three subsections.

Up to this point, a conceptual foundation has been laid for documenting employment outcomes from investments in vocational education, and practical categories of previous, concurrent, and post-schooling employment status have been identified. The next three subsections illustrate how this conceptual framework (Figure 1) and these employment categories (Figure 2) can be used to develop straightforward reports for management use. The single goal in these subsections is to provide those who have an interest in local, state, or national vocational education activities with information that might reasonably be expected to affect management decisions, which, in turn, will increase vocational education's value to students and employers alike.

Employment Measures

The employment counts that appear in this section have been extracted from a four-state consolidated database assembled by the authors since 1991 (see Stevens, 1994c). The numbers presented are actual counts. Different states, education type and level, and years are represented in the figures and tables that appear here to highlight particular aspects of the analysis that has been conducted to date. In every case, the criterion for use here has been relevance for management action.

Figure 3 fills in the conceptual shell from Figure 2 with the actual distribution of employment status for two populations of former students. Members of both populations graduated from a public high school in the same state during the 1990-1991 school year. The left side of the figure covers vocational program completers, and the right side covers nonvocational graduates. These reference groups were chosen because there is keen interest in the comparative employment outcomes for members of these two populations. In addition, this particular comparison reveals some relationships that illustrate conceptual points made earlier.

This state certifies vocational programs, but the content of a particular vocational program classification is not necessarily uniform across approved programs in different public high schools. The absence of content homogeneity increases the importance of data elements that identify differences among programs that are classified together at a higher level of aggregation. Often, the desired data elements are not available in statewide databases. There is an urgent need to identify what these data elements are, to establish priorities for introducing them, and to work with local and state management information specialists to accomplish this through informed voluntary action.

Remember that each of the three- and seven-oval stacks in Figure 3 sums to the respective total number of graduates; all members of each population are accounted for. The employment counts in the top oval of each stack provide a direct comparison of the percentage of former students who exhibit a continuous employer affiliation through all four quarters of the year they graduated from high school. These former students cannot be said to have been placed with this employer, at least not in a post-schooling sense.

Awareness of the higher rate of continuous employer affiliation for the vocational program completers, when compared to their nonvocational classmates, may be interpreted as good or bad news by a program's supporters and critics. Those who see this as good news might contend that the continuity of affiliation indicates that the employer values the employee's productivity that has been achieved through a combination of work-site and school-based learning. Those who interpret this higher rate as bad news may counter that the continuity could signal an inability to move from temporary after-school employment to a more meaningful first step onto a career ladder. Additional information must be examined to distinguish between these views. Some diagnostics of this type appear later in this guide. This impasse, based on one snapshot of a former student's employment status, strengthens a point made earlier. Multiple observations of a former student's status, and multiple descriptors of each of these, are required to provide reports that have high management value; that is, that might actually affect a decision.

Approximately one out of every four of the former students in each of these high school graduate populations was reported as working in both the second and third quarters of this graduation year, but for a different employer. This might be characterized as what many observers think of as the typical transition situation for a high school graduate in the U.S. today.

It is interesting to speculate why this pattern is thought to be typical when only one out of every four cases satisfies this employment mobility criterion. The observed range of percentages for two reference groups in two states (four cells) is from a low of 25% to a high of 27%, and this range is stable across different years of high school graduation.

The surprise in Figure 3 is the high percentage of vocational program completers who did not appear in the state's employment and earnings records during the twenty-four month pre-/post-observation period (January 1991-December 1992). This figure (29%) is more than twice as high as the percentage for a population of 1990-1991 high school graduates who completed a vocational program in another state (12%).

Figures 4 and 5 reveal very different comparative rates of unknown status for community college students who completed vocational and nonvocational programs, respectively. This difference is discussed when those figures are presented, but it is important to note here that the educational level is a critical factor in accounting for the rate of unknown status cases that appear. Since these are the types of figures that are likely to be extracted from a carefully documented report, and then repeated out of context, extreme care must be exercised to assure reader awareness of the investigator's own explanation for comparative results. One reason why state employment security agencies have not been deluged with requests for access to their administrative records is that many third parties fear the unknown. Cautious vocational education administrators have exhibited a pervasive leeriness of establishing an outcomes measurement system that they cannot control.

The unit of analysis in Figure 3 is a statewide class of high school graduates who had completed either a vocational or nonvocational curriculum. Any other unit of analysis can be adopted using the same basic shell first introduced in Figure 2. Vocational/nonvocational comparisons can be prepared for each school district within a state, and these can then be compared across districts. Selected pairs of vocational programs can be compared at the local, district, or statewide levels. Individual schools can be compared within a single district.

When a particular unit of analysis is chosen, an investigator is obliged to think through how the substitution of a different unit of analysis might affect the interpretation of observed employment status distributions. Alertness to the possible occurrence of small cell sizes and assurance of strict compliance with confidentiality requirements are of paramount importance (see Stevens, 1994b, 1994e).

Two recommendations emerge from this examination of Figure 3:

  1. When a finding is counterintuitive, double-check the calculation, and then seek some basis for comparison with what is thought to be an appropriate comparison group (e.g., the same reference population in another state). This will become increasingly feasible as more states adopt longitudinal reporting practices.

  2. Always conduct diagnostics with respect to noncovered employment possibilities. This is particularly important when a substate jurisdiction is pertinent, such as when a metropolitan school district lies on a state's border with another state. The importance of federal government civilian and military employment opportunities in the jurisdiction should be considered, as should unusual patterns of self-employment. In many cases, these diagnostics can be carried out as mind-experiments or conceptual exercises without actually incurring the costs to collect pertinent employment data. Expert advice can be solicited from local education authorities and from the state employment security agency.

Earnings Concepts

A typical vocational education follow-up report describes the average hourly wage rate earned by recent graduates who have responded to a mail or telephone survey. This information usually is calculated from respondent answers to two questions: (1) "How much did you earn last week/month before taxes and other deductions were taken out?" and (2) "How many hours did you work that week/month?" Concerns about the cost and representativeness of such surveys are two reasons why interest has been expressed in the substitution of state employment security agency administrative records for traditional survey data.

The Content of a Wage Record Revisited

A state employment security agency's database of wage records contains quarterly reports of employee earnings submitted by employers who are required to comply with the state's unemployment compensation law. In most cases, a wage record includes only three data elements: (1) an employee's social security number, (2) the total amount of reportable earnings paid to the employee during the reference quarter, and (3) the employer's own unique identifier.
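The compactness of this record is easy to model. The sketch below is a minimal illustration of the three-element structure, not any state's actual file layout; the field names and sample values are hypothetical, and the quarter label is assumed to be supplied by the file in which the record arrives.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class WageRecord:
    """One employer's quarterly earnings report for one employee.

    Field names and sample values are illustrative; actual layouts vary by state.
    """
    ssn: str                     # employee's social security number (the match key)
    employer_id: str             # state unemployment compensation account identifier
    earnings: float              # total reportable earnings paid during the quarter
    quarter: str = ""            # reference quarter, e.g., "1991:3"; usually inferred
                                 # from the file in which the record is delivered
    hours: Optional[int] = None  # a time unit is reported in only a few states

# Hypothetical example: one former student, one employer, one quarter.
example = WageRecord(ssn="000-00-0000", employer_id="UI-123456",
                     earnings=1843.50, quarter="1991:3")
```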

A small number of states, including Florida, Oregon, and Washington, require employers to report the number of hours or weeks each employee worked during the reference quarter. The accuracy of this time-unit information depends on (1) the ability of a reporting employer, or their agent, to provide the desired information; (2) their motivation to attempt to provide accurate information; and (3) the receiving agency's quality control procedures. These vary from state to state. Washington's unemployment compensation law includes hours of work as a factor in its tax computation, so there is a reciprocal interest by the reporting and receiving parties to pursue quality information. Recently, Oregon switched from a weeks-of-work reporting requirement to an hours requirement. This was done in part to satisfy third-party users who have long sought an ability to derive an hourly wage rate equivalent from a quarterly earnings figure and in part out of recognition that employers do not routinely maintain a weeks-of-work data element in their personnel files. Florida's employment security agency is currently reviewing many aspects of the state's unemployment compensation law, including advocacy for dropping the required reporting of a weeks-of-work figure.

Extreme caution must be exercised when a time-unit measure is used to derive a synthetic hourly wage equivalent from a reported quarterly earnings amount. Employer payments of end-of-year bonuses and other types of compensation that are distributed unevenly throughout a year may bear little or no relationship to the number of hours that were worked during the reference period. When longitudinal tracking of adult earnings may be relevant, a higher level of caution is urged. Terminated employees sometimes receive substantial early-retirement lump-sum payments, the cash equivalent of unused benefits, and other one-time payments that occur in one or more quarters after the former employee has left the business.
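Where a time unit is reported, the synthetic hourly figure is simply reported quarterly earnings divided by reported hours, and the caution just described can be partially automated by flagging implausible results. In the sketch below the plausibility bounds are arbitrary illustrative assumptions, not recommended standards.

```python
# Derive a synthetic hourly wage from a quarterly wage record and flag values
# that suggest bonuses, lump-sum payouts, or reporting errors. The plausibility
# bounds are arbitrary illustrations, not standards.

def synthetic_hourly_wage(quarterly_earnings, reported_hours,
                          low_bound=1.00, high_bound=200.00):
    if not reported_hours:      # zero or missing hours: no rate can be derived
        return None, "no usable hours reported"
    rate = quarterly_earnings / reported_hours
    if rate < low_bound or rate > high_bound:
        return rate, "implausible rate; possible bonus, lump sum, or reporting error"
    return rate, "within assumed plausibility bounds"

# Hypothetical case: $12,000 reported against only 40 hours is flagged for review.
print(synthetic_hourly_wage(12000.00, 40))
```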

The definition of reportable earnings is codified in each state's unemployment compensation law. There are definitional differences in these state-specific laws. Investigators are encouraged to include the particular definition that applies when wage records are used for vocational education follow-up purposes.

Reference to an employer identifier masks a number of technical issues associated with the identity of a particular reporting entity. State employment security agency personnel refer to reporting units, not employers. A reporting unit can be a single business, one establishment in a multi-establishment enterprise, or a group of business entities that have received permission to be treated as a single entity. Each business that is covered by a state's unemployment compensation law has both a Federal Employer Identification Number (FEIN) and a state-specific unemployment compensation account number. In most states, a multi-establishment business entity can choose to submit its quarterly wage reports under a single umbrella identifier, or separately for each establishment. This option should not be confused with a state's participation in the Business Establishment List (BEL) program of the Bureau of Labor Statistics, which requires multi-establishment businesses to report how many, but not which, employees work at each establishment location.

It is also important to understand that the Bureau of Labor Statistics-State Employment Security Agency cooperative program commonly known as the ES-202 program, based on a long obsolete paper form number, asks employers to report the average number of employees who were paid for employment during the pay period that includes the twelfth day of the month. This contrasts with the transmittal of quarterly wage reports for all employees who were paid during the reference quarter.

Many of the quarterly reports submitted to a state employment security agency are now prepared by service bureaus that process compensation data for multiple employers. Most states allow an employer, or their agent, to use any address of record on these quarterly reports, such as a headquarters address, an accounting firm or legal counsel's office location, or a service bureau's address. This means that extreme care must be exercised in describing where former students are employed in a state.

Multi-establishment businesses often have more than one Standard Industrial Classification (SIC) code assigned to their business activities. When a new business requests an unemployment compensation account number for the first time, a questionnaire is given to them asking for a description of the business' major activity. The state employment security agency's research unit assigns a four-digit SIC code based on the information that is provided. The accuracy of this code is then reviewed on a three-year cycle as part of another Bureau of Labor Statistics-State Employment Security Agency cooperative program. Mergers and acquisitions can affect the accuracy of a business' SIC code until the next review cycle. In any case, it may be difficult, or even impossible, to associate a former student's employment with a unique industrial affiliation.

The relatively new and growing phenomenon of employee leasing has challenged state employment security agencies in their ongoing attempt to sort out distinctions between the reporting entity itself and where people actually work. For different reasons, a state employment security agency is interested in knowing about a particular employee's tie to a leasing agent and to the work site. It is not very informative to know that a leasing company employs 7,500 people without also knowing how these employees are distributed across manufacturing, wholesale and retail trade, and service sector assignments.

Having urged caution, it is important to keep these warnings in proper perspective. Most reporting units are single establishment businesses whose location and industrial code are both known, and whose use of a payroll vendor or leasing agent can easily be identified.

Multiple Wage Records Within and Across Quarters

A former student can begin or leave a job at any time. They can hold more than one job at the same time. They can move from one job to another job without any break in between, or an interval of voluntary or involuntary time without work can occur. The phrase "time without work" is used to guard against improper use of the term "unemployment," which is normally reserved for use when the Bureau of Labor Statistics' classification criteria are met. Each investigator must devise rules for how these different circumstances will be handled. A decision must be made when more than one wage record is found for a former student during a reference quarter:
  1. One wage record might be designated as the primary record for this reference quarter.

  2. The reported earnings amounts that appear in each of the records can be added together.
Caution must be exercised when pursuing either of these approaches. The primary record may reflect a part-time low-wage job that a former student held for most of the quarter, which has been selected instead of a full-time high-wage job that was started in the last weeks of the reference quarter. Or, when the earnings amounts on multiple wage records are summed, one former student may have held two part-time jobs throughout the reference quarter; while a second former student held one full-time job for six weeks, did not work for a month, and then began a new full-time job. The combined earnings levels in these two cases may be identical.
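Both approaches can be expressed in a few lines, which also makes the ambiguity easy to see. In the sketch below the record layout and the rule used to pick a primary record (the highest reported earnings in the quarter) are assumptions made for illustration; as noted above, either rule can misrepresent the underlying employment situation.

```python
# Two illustrative ways to reduce several same-quarter wage records for one
# former student to a single earnings figure. Each record is assumed to be an
# (employer_id, earnings) pair for the same social security number and quarter.

def primary_record(records):
    """Designate one record as primary; here, the one with the highest earnings."""
    return max(records, key=lambda rec: rec[1]) if records else None

def summed_earnings(records):
    """Add the reported earnings amounts across all same-quarter records."""
    return sum(earnings for _, earnings in records)

# Hypothetical quarter in which three employers reported earnings for one student.
records = [("UI-111111", 950.00), ("UI-222222", 2100.00), ("UI-333333", 310.00)]
print(primary_record(records))   # ('UI-222222', 2100.0)
print(summed_earnings(records))  # 3360.0
```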

When one quarter is designated as the reference period for documenting a former student's earnings, and no other data is requested from the state employment security agency, then little can be done to reduce the types of ambiguity that have been described here. However, if a continuous longitudinal record can be created for two or more sequential quarters, more can be said about a former student's employment affiliations and earnings.

Various types of diagnostics can be carried out:

  1. Perform a quarter-to-quarter comparison of a former student's employer affiliations to reveal the likely sequence of events. If one employer identifier appears in two sequential quarters, while another employer identifier appears in only the first of these two quarters, then it is reasonable to assume that the former student moved from one employer to another.

    It remains possible in this case that the former student had been moonlighting during the first quarter, holding two jobs at the same time, but then quit the second job before the end of the first quarter. The conclusion that the former student had moved from one job to another would be inaccurate. The authors have conducted extensive diagnostics using three-quarter sequences, which permit the investigator to determine with substantial confidence whether an employee was employed throughout the reference quarter. The procedure followed is to first compare employer identifiers during the first and second of the three quarters. If a match of employer identifiers is found, then it is concluded that the employee was working for this employer at the beginning of the second quarter. The comparison of employer identifiers is then repeated for the second and third quarters. Now, if a match occurs, it is concluded that the employee was working for the employer at the end of the second quarter. So, through these two steps, it has been determined with a high probability of accuracy that the former student worked for this employer throughout the second quarter. It is still possible that a recurring pattern of intermittent employment during these three months has been overlooked. A minimal sketch of this three-quarter check appears after this list.
  2. Conduct quarter-to-quarter comparisons of earnings levels to determine the stability of a former student's earnings. This step is particularly important if first or fourth quarter earnings are being used, since these are the most likely times for the payment of annual bonuses. This step will be less important when the reference population is former students in high school vocational education courses because relatively few of them might be expected to be eligible to receive compensation in the form of a bonus.

  3. Establish a federal minimum wage full-quarter/full-time equivalent earnings floor to determine whether a former student's earnings reported by any one employer during the reference quarters fall above or below this threshold. Readers who are old enough to remember when most adults who worked were employed full-time year-round are warned to expect a high percentage of cases that fall below this threshold, particularly when the reference population is former high school students. Again, matches with postsecondary enrollment records can provide some indication of the probability that the observed earnings level is associated with part-time employment.

  4. Add a criterion that a former student appear in each of four sequential quarters of the wage records database. This then becomes the equivalent of year-round employment. This procedure is recommended when an investigator intends to prepare a public release of information about vocational education outcomes. The first, and perhaps only, information many nonspecialists want is an answer to the question, "How much are graduates earning if they work full-time year-round?" This recommendation is not intended to downplay the importance of informing the public about the incidence and geographic/demographic correlates of cases when this criterion is not met.
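The three-quarter check described in the first item can be expressed as a short rule applied to a former student's longitudinal record. In the sketch below the data layout, in which each quarter label maps to the set of employer identifiers that reported earnings for the student, is an assumption made for illustration.

```python
# Sketch of the three-quarter continuity diagnostic: conclude, subject to the
# caveat about intermittent spells, that a former student worked for an employer
# throughout the middle quarter if that employer reported earnings for the
# student in the preceding, middle, and following quarters.
# Assumed layout: quarter label -> set of employer identifiers for one student.

def employed_throughout_middle_quarter(employers_by_quarter, q_prev, q_mid, q_next):
    prev_ids = employers_by_quarter.get(q_prev, set())
    mid_ids = employers_by_quarter.get(q_mid, set())
    next_ids = employers_by_quarter.get(q_next, set())
    # The same employer must appear in all three quarters.
    return bool(prev_ids & mid_ids & next_ids)

# Hypothetical record: one employer reports earnings in 1991:3 through 1992:1.
record = {"1991:3": {"UI-444444"},
          "1991:4": {"UI-444444", "UI-555555"},
          "1992:1": {"UI-444444"}}
print(employed_throughout_middle_quarter(record, "1991:3", "1991:4", "1992:1"))  # True
```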
Each investigator must answer a fundamental question based on the unique context of their own intended use of wage records: "Will I be able to provide reliable new information about the earnings of former students that can be easily understood, and that might reasonably be expected to affect future decisions vis a vis vocational education?" The basic focus here is management diagnostics, but the timely release of accurate information about the earnings of former students might also be expected to influence career choice and enrollment decisions, parental and counselor advice given to students, and state and federal legislative initiatives.

Earnings Measures

Figure 4 retains the layout of employment status categories that was introduced in Figure 2 and then repeated in Figure 3, and adds earnings information. The data underlying Figure 4 was acquired from a different state, and it represents the earnings of former community college (not high school) students who completed a vocational program in 1990-1991.

Currently, this state's management information system does not include a data element that identifies the year/month or term of completion. Investigators who intend to replicate or refine this approach are urged to attempt to acquire both the year/month of completion of a vocational program and a data element that indicates whether a completer also graduated. This information is needed to investigate the independent effects on earnings of credits, program completion, and receipt of a credential (see Kane & Rouse, 1995a). The concepts incorporated in Figure 1 are pertinent here. Previous and concurrent work experience and other educational achievements must be considered if any attribution of outcomes is intended.

The average earnings amounts that appear in Figure 4 reflect an uncensored mix of full- and part-time employment during all or part of the reference quarter. If a member of the reference population had any reported earnings, no matter how small the amount, they were included in the calculation of average quarterly earnings figures. The final section includes examples of censored subgroups, which apply different criteria for inclusion, such as an earnings floor equal to the full-time/full-quarter federal minimum wage level and presence of at least one wage record in each of the four quarters of the reference year.
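One possible construction of such a censored subgroup is sketched below. The quarterly floor is computed as an assumed minimum hourly wage times a full-time, full-quarter figure of 13 weeks at 40 hours; the wage value, the requirement that the floor be met in every quarter, and the data layout are illustrative assumptions rather than recommended standards.

```python
# Sketch of a censored subgroup: keep only former students who have reported
# earnings at or above a full-time/full-quarter minimum-wage floor in every
# quarter of the reference year, then average by quarter. The wage value, the
# 13-week x 40-hour quarter, and the data layout are illustrative assumptions.

def censored_average_quarterly_earnings(earnings_by_student, quarters,
                                        minimum_wage=4.25, hours_per_quarter=520):
    floor = minimum_wage * hours_per_quarter        # e.g., 4.25 * 520 = 2210.00
    kept = {ssn: by_quarter
            for ssn, by_quarter in earnings_by_student.items()
            if all(by_quarter.get(q, 0.0) >= floor for q in quarters)}
    if not kept:
        return {}, 0
    averages = {q: sum(e[q] for e in kept.values()) / len(kept) for q in quarters}
    return averages, len(kept)

# Hypothetical data: student "A" meets the criteria; student "B" does not.
students = {"A": {"1991:1": 2400.0, "1991:2": 2600.0, "1991:3": 2500.0, "1991:4": 2800.0},
            "B": {"1991:3": 800.0}}
print(censored_average_quarterly_earnings(students, ["1991:1", "1991:2", "1991:3", "1991:4"]))
```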

The highest average quarterly earnings level in Figure 4 is for former students who were already employed by this employer before they left school. This is a typical situation in which the observed earnings level cannot be described as a community college outcome alone.

At a minimum, based only on the information that is provided in Figure 4, the observed earnings level reflects the combined effects of the community college exposure and the concurrent work experience. Almost half of the reference population of former community college students falls into this category, so the importance of avoiding reliance on a single snapshot of post-schooling earnings should be apparent. The rapid growth of adult enrollments, which have resulted in a wide range of transcript patterns, including mixes of credit and noncredit course taking, has increased the difficulty of isolating the net impact of the current spell of community college exposure on observed earnings. Similarly, less-than-full-time enrollment is now the norm, and varied stop-out profiles occur. Together, these features of today's community college environment severely limit the relevance of single snapshot evidence of post-schooling earnings.

The weakness of single snapshot post-schooling earnings as a metric of community college outcome also is seen in the two earnings levels that appear in the lower right corner of Figure 4. Members of each of these reference groups completed a community college vocational program in one state during 1990-1991, did not have any reported earnings in that state during the July-September quarter of 1991, but did have reported earnings in the October-December quarter of 1991. What distinguishes the two groups is that members of one group had reported earnings during the January-March quarter of 1991, while members of the other group did not have such reported earnings. The average post-schooling earnings shown in Figure 4 for the latter group is only 52% of the former group's average. This difference would not be known if only a single snapshot of post-schooling earnings had been taken.

Figure 5 is the earnings equivalent of Figure 3. The left side of Figure 5 repeats the content of Figure 4, while the right side introduces new comparative information for nonvocational program completers. The statewide unit of analysis and 1990-1991 community college reference population continue unchanged. Each of the five paired vocational/nonvocational comparisons, based on post-schooling employment status, reveals a higher average earnings level for the vocational group.

Readers who are unfamiliar or uncomfortable with statistical terminology are urged to be particularly cautious in the use of state employment security agency earnings records. Mean earnings amounts are reported here. Some investigators use median earnings because they want to avoid the effect of outlier values on the average that is to be reported. Standard error figures are included here to provide a basis for deciding whether an observed difference between two means can be said to be statistically significant. Confusion frequently arises about the difference between statistical significance and substantive importance. An observed difference between a pair of reported average earnings figures may fall in any one of the four cells of a significance-importance matrix.

The earnings amounts that appear in Figures 4 and 5 have been adjusted using a Gross Domestic Product Implicit Deflator Series factor. This allows pre- and post-schooling earnings levels to be compared in real rather than nominal terms--that is, after removing changes in earnings over time that are considered to be attributable to a general increase in prices rather than to increased value of the former student.
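The adjustment itself is simple arithmetic: each nominal amount is divided by the deflator for its period and re-expressed in a chosen base period's prices. The deflator values in the sketch below are placeholders used only to show the calculation; they are not actual published figures.

```python
# Convert nominal quarterly earnings to real (constant-dollar) terms with an
# implicit deflator series. The deflator values below are placeholders used
# only to show the arithmetic; they are not actual published figures.

def to_real_earnings(nominal_by_quarter, deflator_by_quarter, base_quarter):
    base = deflator_by_quarter[base_quarter]
    return {quarter: nominal * (base / deflator_by_quarter[quarter])
            for quarter, nominal in nominal_by_quarter.items()}

nominal = {"1991:1": 2500.00, "1992:4": 2700.00}
deflator = {"1991:1": 100.0, "1992:4": 104.0}       # placeholder index values
# The 1992:4 amount is restated in 1991:1 dollars before any comparison is made.
print(to_real_earnings(nominal, deflator, base_quarter="1991:1"))
```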

The previous two paragraphs, which use terminology that may be unfamiliar to some readers, highlight a point that is of critical importance to the vocational education community. A proper balance must be found between technical accuracy of accountability reports and their transparency or readability.

Mobility and Its Consequences

Up to this point, repeated mention of employer affiliation has been introduced to justify the call for greater use of longitudinal databases to isolate, describe, and act upon vocational education outcomes. The basic rationale for this plea is that vocational education can be viewed as a complement to, or as a substitute for, on-the-job training. The relevance of this distinction can be understood more easily if entry-level jobs and opportunities that are more demanding of skill competency are treated separately.

Entry-Level Employment Opportunities

When considering who to hire to fill an entry-level job opening, each employer must decide how to split the position's training requirements between school-based sources and work-site training. This decision can be thought of in the context of Figure 1, along with the optics metaphor that accompanies it. An employer who expects more pre-hire evidence of skill attainment will select from a different and smaller pool of qualified candidates than an employer who is comfortable hiring new employees who only provide evidence of an ability and willingness to learn new skills on the job. The perennial debate among vocational educators and other interested parties about the extent to which high schools should concentrate on job-specific competencies versus foundation skills and the promotion of critical thinking, communication skills, and teamwork is cast in these terms.

Using Figure 1 as a framework for collecting and analyzing data, an investigator should be able to assemble a database of candidate qualification factors, employer requirement factors, and job opportunity descriptors that can identify which bundles of candidate qualifications are actually matched with what bundles of employer requirements. A match occurs when a new hire is detected.
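Detecting a match in this sense can be reduced to a comparison of adjacent quarters: an employer identifier that appears in a former student's record in one quarter but not in the preceding quarter is treated as a new hire. The data layout and function name in the sketch below are assumptions for illustration, and the caveats about intermittent employment noted earlier apply here as well.

```python
# Sketch of new-hire detection from quarterly wage records, assuming each
# quarter label maps to the set of employer identifiers reporting earnings for
# one former student. A "new hire" here is an employer that appears in a
# quarter but not in the immediately preceding quarter.

def detect_new_hires(employers_by_quarter):
    quarters = sorted(employers_by_quarter)         # "YYYY:Q" labels sort correctly
    hires = []
    for prev_q, cur_q in zip(quarters, quarters[1:]):
        newly_matched = employers_by_quarter[cur_q] - employers_by_quarter[prev_q]
        hires.extend((cur_q, employer) for employer in sorted(newly_matched))
    return hires

# Hypothetical record: a second employer first appears in 1991:4.
record = {"1991:3": {"UI-111111"}, "1991:4": {"UI-111111", "UI-999999"}}
print(detect_new_hires(record))                     # [('1991:4', 'UI-999999')]
```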

Mid-Level and Advanced Opportunities

Typically, an employer's decision to recruit for a mid-level or advanced position will include some consideration of incumbent employee candidates who might be promoted into the open position. Ambitious employees who seek recognition in such reviews are likely to participate in continuing education to improve their standing vis a vis other possible candidates. This activity describes the behavior of many of today's adult enrollees in the Nation's community colleges. This niche learning will increasingly be available through work-site and home-based electronic linkages with community colleges, while remaining a complement to employer training.

Implications for Vocational Education Outcomes Measurement

There has been a distinct shift in vocational education's role in recent years. One-time exposure to school-based entry-level skill development has diminished in importance, while repeated participation in specialized learning has grown. The either-or context of skill training in past decades (either the school provided the skills or the employers provided the skills) has been supplanted by the and context of the 1990s (workplace training and complementary recurring school-based continuing education). Vocational education's reporting practices, management information systems, and management decisions must be aligned with this reality.

It is no longer practical to ignore the importance of concurrent employment and school enrollment. At the high school level, states are struggling to identify and build upon evidence of concurrent education and work activities that lead to rewarding opportunities after graduation, often including a continuing affiliation with the same employer combined with selective course taking at the postsecondary level. At the community college level, the struggle shifts to ways to document the value-added that arises from diverse enrollment patterns that are not revealed in management information systems and standard reports that were designed in a linear world of transfer students and a few well-defined occupational programs. Advances on either front, high school or community college, will require a conscious downplaying of single snapshot performance measures. More relevant, but not necessarily more expensive, management information systems must be devised. Each data element in such a system should be justified in terms of its potential contribution to affecting a decision. This is the criterion for judging mobility data in the next subsection.

Measurement and Interpretation of Mobility

Is continuity of employer affiliation good or bad? It depends. Continuity of employer affiliation may reveal that a former student cannot compete with the qualification bundles offered by other candidates for more attractive jobs elsewhere, or it might be a positive indication of mutual employee and employer satisfaction.

Progress toward sorting out these situations, which have very different management implications, can be made by combining and analyzing data elements currently available in state employment security agency databases. Figure 6 represents a first step toward understanding whether, and how, awareness of employee mobility patterns can be used to refine vocational education performance measurement practices.

Figure 6 is based on the documented mobility patterns of 2,260 former high school students who completed a vocational program and graduated in 1986. Here, stayers are defined as those who were reported as working for the same employer in each of the four reference periods (1986:3 or 1986:4 after graduating from high school; 1987-1988, 1989-1990, and 1991-1992). These stayers must have been reported as working for this employer during 1992:4 to be classified in this way. Movers are those who did not satisfy the criteria for being designated as a stayer, but were reported as working for some other employer in the state during each of the reference periods.

These students represent twenty-eight school districts in one state, with 90% of the records having been drawn from five of these districts. Furthermore, since the intent is to illustrate mobility patterns, each of these former students must have appeared in the state's wage record database in 1986 and 1992. Any member of this population of former students who graduated in 1986 may have left the state, as long as they returned to appear in the 1992 wage record file. Those who are reported as stayers in each of the four two-year snapshot intervals must have satisfied the more stringent criterion that they were reported as working for the same employer in each of the four snapshots. It is possible, but unlikely, that a former student who meets this requirement left the state and was employed elsewhere one or more times. For example, regular seasonal employment in another climate zone could have occurred. This is mentioned because the ultimate goal of this type of analysis is to isolate the relative contribution of school-based education and work-site training on employment status and earnings. If someone who is identified here as a stayer was, in fact, attending school or working out of state, then these events would be missed and the interpretation of the available data would be affected.
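With these definitions, the stayer/mover classification reduces to a comparison of employer identifier sets across the snapshot periods. The sketch below is a simplified illustration; the data layout and period labels are assumptions, and it omits the additional criteria described above (presence in the 1986 and 1992 files and, for stayers, reported employment with the employer during 1992:4).

```python
# Simplified sketch of the stayer/mover classification, assuming each snapshot
# period maps to the set of employer identifiers that reported earnings for one
# former student. Period labels and layout are hypothetical.

SNAPSHOTS = ["1986", "1987-1988", "1989-1990", "1991-1992"]

def classify_mobility(employers_by_period):
    sets = [employers_by_period.get(period, set()) for period in SNAPSHOTS]
    if not all(sets):
        return "not employed in every snapshot period"  # outside both groups
    if set.intersection(*sets):
        return "stayer"     # at least one employer appears in every snapshot period
    return "mover"          # employed in every period, but no single employer persists

# Hypothetical record: employed throughout, but with three different employers.
record = {"1986": {"UI-101"}, "1987-1988": {"UI-101"},
          "1989-1990": {"UI-202"}, "1991-1992": {"UI-303"}}
print(classify_mobility(record))                    # 'mover'
```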

Only 8% of the former high school students who are known to have been employed in the same state in 1986 and 1992 were found to have stayed with the same employer throughout this six and one-half year post-schooling period. The comparable stayer rate for completers of a community college vocational program in two of this state's community college districts, all of whom meet the same classification criteria, is 29%. Just over one-third of the former high school students moved in each of the three employment affiliation comparisons. Others fall into mixed time sequences of being classified as a stayer or a mover.

Figure 6 illustrates a first step toward developing information that will have practical management value. The frequency of observation (i.e., the snapshot interval) can be increased to cover just one year, or even each quarter, following a student's departure from the reference educational activity. The duration of observation can be extended by periodically updating the database. Differences in mobility patterns among districts, across vocational programs, between vocational and nonvocational programs, and among levels of education can be documented. This assertion is accurate only when minimum cell-size thresholds are satisfied. Even when this criterion is met, caution must be exercised to be sure that the reported mobility pattern itself does not reveal the identity of a particular former student.

Figure 7 represents a second step toward developing insights that are of practical management value. Here, the same basic setup that was used in Figure 6 is repeated, except that now averages of reported earnings are displayed for stayers and movers. This presentation is based on former students who completed a community college vocational program in 1985-1986. Unlike Figures 4 and 5, each of which presented uncensored average quarterly earnings, Figure 7 displays uncensored average annual earnings levels. For each of the 779 former students covered in Figure 7, all reported earnings by any employer in the state during all four quarters of 1986, 1988, 1990, and 1992 are included. The benchmark annual earnings figure for the first reference interval of 1985-1986 would be expected to include part-time employment for a portion of the year for some of the former students. A quarter-to-quarter comparison of earnings for each of the quarters 1985:4 through 1986:3 would be expected to reveal this pattern.

The introduction of a threshold level of annual earnings, equal to the federal minimum wage multiplied by full-time, year-round hours of work, would cut off the tail of the earnings distribution falling below $8,500, which would raise the reported averages across the board. The continuous stayers are more likely to have been employed full-time year-round throughout the observation period than their classmates who were classified as movers in each of the three comparisons of employer affiliation.
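
As a worked example of that threshold, the federal minimum wage in effect in 1991-1992 was $4.25 per hour; multiplying it by an assumed 2,000-hour work year (40 hours per week for 50 weeks) reproduces the $8,500 cutoff cited above. The hours convention is an assumption made here for illustration only.

  FEDERAL_MINIMUM_WAGE = 4.25          # dollars per hour, 1991-1992
  FULL_TIME_HOURS = 40 * 50            # assumed full-time, year-round work year

  threshold = FEDERAL_MINIMUM_WAGE * FULL_TIME_HOURS
  print(threshold)                     # 8500.0, consistent with the cutoff cited above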

Figure 7 shows that, at the end of 1986, those former students who are now known to have continued their employer affiliation for at least the next two years already enjoyed a 35% annual earnings advantage over their classmates who are now known to have changed employers at least once during the same two-year period. For those who remained in these respective classifications for the next four years, this wedge of higher earnings favoring the stayers increased to 105%.

The patterns revealed in Figures 6 and 7 challenge those who are dedicated to the accurate measurement of vocational education outcomes. Clearly, diagnostics using additional data elements should be carried out to extract refined findings that can guide management actions in response to this evidence of disparity. Historically, in the United States, voluntary mobility has been associated with improvement of circumstances. Here, where the movers consistently earn less than the stayers, there is disquieting evidence that at least some of the observed mobility is likely to have been involuntary. If so, it is important to know whether the affected former students have failed to prosper for personal reasons or because their education failed to position them for competitive candidacy. Using the terms introduced in Figure 1, which elements in their overall bundle of qualifications contributed most to their downfall? Descriptors of educational attainment, descriptors of personal behavior, or other factors? These are the kinds of diagnostics that promise to offer real value-added in support of those who must make decisions about their own career or the careers of others. These diagnostics are feasible with minor refinements of current information systems. Readers are referred to Klerman and Karoly (1994, pp. 31-48) for complementary evidence drawn from the National Longitudinal Survey of Youth (NLSY). This article is recommended because it requires readers to be alert to the nuances of constructing appropriate measures of transition from available data elements. Similar reader awareness is required in the final section of this guide, where ways to align the timing of school-leaving and a particular employer affiliation are described.

Up to this point, the fundamentals of documenting employment and earnings outcomes in vocational education have been developed. The second section provided a new conceptual basis for designing the data collection and analysis steps. Figure 1 introduced three sets of building-blocks: (1) candidate qualification factors, (2) employer requirement factors, and (3) the employment opportunities set. Each of these can be thought of as a vector, or list, of descriptors. An attempt can then be made to link each of these descriptors to one or more available data elements. This, in turn, can be followed by an exercise to
identify possible new data sources for descriptors that have no reliable data source currently. This wish list can then be priced and prioritized.

This third section has described a series of employment and earnings measures that reflect the interplay of the candidate qualification factors, employer requirement factors, and employment opportunity set. Reader alertness to the use of the word reflect here is essential. What many observers refer to as vocational education outcomes are actually a result of complex interdependencies and forces. The straightforward comparisons that have been presented in Figures 2 through 7 mask the lists of descriptors that should be considered before any attribution of cause-and-effect relationship is assigned. The next section returns to the components of Figure 1 and describes practical ways in which some of the interesting descriptors can be drawn from available data sources.


REFINEMENTS OF AVAILABLE EMPLOYMENT AND
EARNINGS MEASURES

This section covers six topics that are expected to be high on most lists of candidates for improvement of available outcome concepts and measures:
  1. The definition and measurement of training-related employment

  2. The accurate alignment of timing of school-leaving, employment status, and employer affiliation

  3. The timely and reliable documentation of continuing education

  4. A basis for reaching consensus on decision-rules to be used when multiple employer affiliations are reported for a single observation period

  5. Criteria for reaching agreement on rules to censor available earnings data to reflect full-time/part-time distinctions

  6. Specification and estimation of models that can be expected to support defensible conclusions about the employment and earnings outcomes of vocational education
Here, coverage of a topic means that the issue is addressed and that progress has been made; not that the authors believe that any of the six concerns has been put to rest.

Defining and Measuring Training-Related Employment

The concept of training-relatedness has a certain cachet in vocational education. There is a pervasive assumption that it constitutes a practical performance measure, which serves as a metric that can be applied across curriculum, educational level, governance, and geographic boundaries, and interpreted in a routine manner by non-experts. This section challenges the accuracy of this assumption, but it also offers suggestions about how to proceed if one decides to persist.

The Challenge

Previously, the test of a concept's validity was described in terms of evidence that it actually represents what the investigator intended to measure. Here, a second, more stringent criterion is added to this test: Different users of the concept should exhibit a high level of agreement on the classification of real cases when the concept is applied. A two-stage test of validity is proposed:
  1. Do available decision-rules for carrying out the assignment of training-relatedness designations result in a high level of agreement when actual cases are classified by different coders?

  2. If so, does this assignment of cases reveal what multiple users want to know?
Both criteria must be satisfied for the concept, or classification taxonomy, to be endorsed. These two conceptual criteria must be supplemented with two procedural criteria before adoption for routine use:
  1. Is a practical data collection method available?

  2. Is the cost of this data collection activity less than the benefit that is expected to accrue in the form of better management decisions?
Recent and continuing reengineering of organizational designs and missions in both the business and education sectors threatens the confidence anyone can have in answering any of these questions affirmatively.
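
The first conceptual criterion, agreement among coders, can at least be quantified. The guide does not prescribe a statistic; Cohen's kappa is used in the sketch below purely as one conventional possibility, and the category labels are illustrative.

  from collections import Counter

  def cohens_kappa(coder_a, coder_b):
      """Agreement between two coders on the same cases, corrected for chance agreement."""
      n = len(coder_a)
      observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
      freq_a, freq_b = Counter(coder_a), Counter(coder_b)
      expected = sum(freq_a[c] * freq_b[c] for c in set(coder_a) | set(coder_b)) / n ** 2
      return (observed - expected) / (1 - expected)

  a = ["directly", "somewhat", "not", "directly", "directly", "not"]
  b = ["directly", "not",      "not", "directly", "somewhat", "not"]
  print(round(cohens_kappa(a, b), 2))    # well below the near-perfect agreement sought here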

From an employer requirements perspective, employee job descriptions encompass far more elements than in previous decades. Some of these tasks are performed on a routine daily basis, while others are held in reserve for periodic or emergency use. Much of the precision of job description has been lost in many sectors of the Nation's economy.

It is useful to recall that codification of job descriptions in the U.S. escalated during the 1930s, associated with the emergence of formal labor-management negotiations in major industries, and with the dominance of manufacturing production activities as the engine of the economy. The decline of organized labor's influence and representation, coupled with the growth of service sector employment, leaves employers with a substantially higher level of discretionary leeway to design personnel assignments in more flexible ways; thus, the blurring of job descriptions. This means that it is more difficult for anyone, novice and expert alike, to routinely identify particular employees by a short-list of competency requirements that distinguish them from other employees.

Similarly, from a candidate qualifications standpoint, the ongoing integration of so-called vocational and academic curriculums is erasing long-standing, clearly defined boundaries. This source of difficulty, from a classification perspective, is compounded by the rapid growth of niche enrollment patterns. Students are being allowed, and often encouraged, to assemble customized programs of study. This trend makes sense in the context of present and anticipated workforce opportunities, but it threatens the integrity of current management information systems vis a vis the routine documentation of a manageable number of vocational modules.

When these forces are considered together, it is apparent that current classification taxonomies are in trouble. This vulnerability has been recognized in many quarters of the federal establishment (see U.S. Department of Labor, 1993; Standard Occupational Classification Revision Policy Committee, 1995). The pilot phase of a revision of the Nation's Dictionary of Occupational Titles is underway. A database design, to be known as O*NET, is being developed that is expected to include more than one-hundred descriptors for each occupational entry. These descriptors, and the occupational entries themselves, will be updated as new information becomes available, rather than having to wait for comprehensive updates of the entire taxonomy at widely spaced intervals. Also underway are revisions of the SIC (Standard Industrial Classification), Occupational Employment Statistics (OES), Standard Occupational Classification (SOC), and Classification of Instructional Programs (CIP) taxonomies. These partially overlapping, but not coincident, modifications will advance the quality and increase the value of data that is collected; but these changes will also render a substantial number of documents and software products obsolete. The vocational education community has an opportunity to anticipate these revisions and to adapt reporting systems accordingly, but this window will close within two to four years.

These challenges should motivate a review of the entire exercise of training-relatedness measurement. The desire to measure may be burning at a stable historical level, or even at a higher level of intensity as the Nation rides the current wave of interest in educational accountability; but the ability to respond to this desire is wavering.

The Current State-of-the-Art

Florida's Education and Training Placement Information Program (FETPIP) has made a substantial investment in the collection of occupational information through mail survey questionnaires. Their treatment of training-related employment is used here, in a series of four tables, to illustrate how a tested classification method can be used with state employment security agency employment and earnings records to prepare tabulations that are sought by administrative, legislative, and other groups.

Based on the linkage of a student record with a wage record, the FETPIP identifies the Florida employers who reported these former students as employees. A stratified sample of these businesses is drawn, and questionnaires are mailed asking the employers to report the occupation of each designated employee. The fourth quarter of the year of school-leaving is currently used as the reference quarter for this purpose. The FETPIP has created a set of Occupational Employment Statistics Program codes for each of the state's major industrial sectors. The appropriate version of these codes is sent with each mailing to an employer, who is then offered the option of using one of these codes for each designated employee, or entering a job title. The latter are then coded by FETPIP staff members, using their own software design to assign the same code to all cases of the same job title once a single decision has been made. The FETPIP considers all aspects of this coding exercise to be of a pilot, or development phase, nature. Table 1 displays the actual coding of training-relatedness of reference quarter employment for 1990-1991 community college vocational program completers. Four assignment methods appear in Table 1.

  1. Based on Job Title--The responding employers either used one of the OES codes provided with the questionnaire, or wrote in a job title or brief job description, in which case the FETPIP staff assigned a training-relatedness code.

  2. Based on Industry Only--Used as a residual option only when no usable job title/description information was provided by the responding employer. The FETPIP has established criteria for this assignment of a training-relatedness code.

    Washington's State Board for Community & Technical Colleges has also experimented with an industry-based coding of training-relatedness, based on the appearance of Occupational Employment Survey staffing-pattern cells in particular industry/occupation matrices. The acknowledged problem with this approach is that it rests on an assertion: because incumbent employees were reported in a given industry/occupation cell when the reference staffing-pattern survey was conducted, a former student who is known from state employment security agency records to have been employed in that industry is assumed to be working in one of the training-related occupations. The Board's attempted use of this approach exemplifies the pressure that exists to devise some way to provide a metric of this type.

  3. Unable To Determine--A third-stage assignment, which is used when neither a job title nor industry only code can be identified.

  4. Not Employed in the Reference Quarter--Includes those former students who were not found in the linkage of student records and state employment security agency wage records.
Three training-relatedness designations are used: (1) directly related, (2) somewhat related, and (3) not related.
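
The "decide once per job title" step of the FETPIP coding procedure described above amounts to a cached lookup. The sketch below is only an illustration of that idea; the ask_coder stand-in for a human coding decision and the sample OES-style code are hypothetical, not FETPIP's actual software or assignments.

  def make_title_coder(ask_coder):
      """Return a coding function that requires a human decision only once per distinct title."""
      cache = {}
      def code_title(raw_title):
          title = " ".join(raw_title.lower().split())    # normalize case and spacing
          if title not in cache:
              cache[title] = ask_coder(title)            # one coding decision per distinct title
          return cache[title]
      return code_title

  decisions = {"registered nurse": "32502"}              # illustrative OES-style assignment
  code_title = make_title_coder(ask_coder=lambda title: decisions.get(title, "unresolved"))
  print(code_title("Registered  Nurse"), code_title("registered nurse"))   # same code, one decision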

Table 1

Training-Related Employment

1990-1991 Community College Vocational Program Completers

Determination Method      Training-Related Outcome         N     Percent     Percent of    Percent of
                                                                 of Total     Employed      Subtotal

Based on Job Title        Directly Related             5,479        38%          50%           65%
                          Somewhat Related               977         7%           9%           11%
                          Not Related                  2,060        14%          18%           24%
                          Subtotal                     8,516        59%          77%          100%

Based on Industry Only    Related                        971         7%           9%           81%
                          Not Related                    224         9%           2%           19%
                          Subtotal                     1,195        16%          11%          100%

Unable To Determine                                    1,337        27%          12%

Not Employed in the
Reference Quarter                                      3,361        35%

Total                                                 14,409       100%


Notes: The reference quarter is 1991:4.
"Percent of Total" is the percentage of the total population.
"Percent of Employed" is the percentage of the former students who were reported as employed in 1991:4.
"Percent of Subtotal" is the percentage within each category defined by the relatedness determination methods.
The "Based on Industry Only" rows include only those whose job titles were not available.
The "relatedness" approach used to prepare this table was provided by Florida's Employment and Training Placement Information Program.
The approach is considered to be in a pilot use phase and is subject to future refinement.

The omission of the somewhat related category from the industry only assignment method reflects the FETPIP staff's unwillingness to adopt such an aggressive classification approach.

Three columns of results are presented in Table 1:

  1. Percent of Total--Distributes all of the 1990-1991 reference group community college vocational program completers among the five determination method categories.

  2. Percent of Employed--Distributes only those reference group members who had been reported to the state employment security agency as employed during the reference quarter.

  3. Percent of Subtotal--Distributes, within subgroup, only those who have been assigned a training-relatedness code on the basis of either the job title or industry only method.

Beginning at the bottom of the "Percent of Total" column in Table 1, 35% of the reference group of former community college students were not found in the state employment security agency's 1991:4 wage records. This stand-alone figure based on one snapshot of employment status does not reveal which of these students may have been reported as employed during previous or subsequent quarters, or how many were registered in other public higher education institutions in Florida. The FETPIP has introduced numerous innovations to reduce the number of unknowns in its reports. Table 1 is not designed to reflect these refinements. Another 27% of the original reference group had not been classified, but this represents only 12% of those who had been reported as employed (refer to the last figure in the middle column of Table 1). This leaves two groups of the former students who were successfully classified using one of the two determination methods--16% on the basis of the industry only method, and 59% using the job title method.

For many reporting and management decision purposes, the middle and right-hand columns are more useful than the "Percent of Total" column. However, it is always important to be able to account for all members of an original reference population, so detractors are not allowed to falsely assert that there must be a devious reason why some have been omitted from a tabulation. The upbeat news in the middle column is that 77% of the former students who had been reported as employed during 1991:4 were successfully assigned to a training-relatedness category using the job title approach, and another 11% were assigned using the fall-back industry only method.

The only number that many interested parties will look for and remember is the top figure in the middle column--the percent of those who were reported as employed in 1991:4 and were determined to be working in a directly related job. Many of those who seek this single number are likely to depart without recognizing the importance of the number at the top of the right-hand column: Two-thirds of those who could be assigned a training-relatedness status using the job title method were assigned to the directly related category.

Table 1 typifies one reason why vocational educators are distributed along a continuum of enthusiasm about the use of state employment security agency employment data, and associated attempts to assign training-relatedness designations to these jobs. There is something in this tabulation for everyone. Detractors can point to the not employed, unable to determine, and not related cells. Advocates can focus on the directly related and related cells. This is why either of two alternative approaches is encouraged--do nothing or do more. Table 2 illustrates one way in which FETPIP has done more. This tabulation replicates the Percent of Total column from Table 1 for each of six postsecondary vocational programs. This choice of presentation format reflects the anticipated interests of likely readers of this volume. It does not represent the recommended selection for a press conference.

Table 2

Training-Related Employment

1990-1991 Community College Vocational Program Completers by Program

Determination      Training-Related        Office        Engineering      Allied        Health & Med.    Child Care/     Protective
Method             Outcome                 Occs.         Technology       Health        Sciences         Food Service    Services
                                           N       %     N       %        N        %    N        %       N       %       N        %

Based on           Directly Related        102     8%    121     18%      1,638   40%   1,541   65%      38      19%     1,512   55%
Job Title          Somewhat Related        159    13%    72      11%      191      5%   53       2%      23      11%     271     10%
                   Not Related             343    27%    134     20%      503     12%   19       1%      28      15%     390     14%

Based on           Related                 19      2%    27       4%      444     11%   325     13%      21      11%     6        1%
Industry Only      Not Related             2       0%    4        1%      92       2%   1        0%      3        1%     62       2%

Unable To Determine                        229    18%    102     15%      244      7%   142      6%      15       7%     165      6%

Not Reported as Employed
in the Reference Quarter                   403    32%    219     31%      940     23%   302     13%      73      36%     339     12%

Total                                      1,257  100%   679    100%      4,052  100%   2,383  100%      201    100%     2,745  100%

Notes: The "relatedness" approach used to prepare this table was provided by Florida's Employment and Training Placement Information Program.
The approach is considered to be in a pilot use phase and is subject to future refinement.

The comparative data presented in Table 2 exemplifies the theme of the entire volume: Available data can be organized in ways that support and promote local and state management diagnostics--formats that may be unrelated to required federal reporting or to responding to constituent inquiries. Here, the word diagnostics is meant to convey a motive of behind-the-scenes troubleshooting or curiosity, rather than triggering a public argument about the accuracy and relevance of particular numbers. Note, for example, that the range of combined not employed/unable to determine cases extends from a high of 50% to a low of 18%. Awareness of this pattern alone is cause for management action to learn more about why, and whether, anything should be done to try to reduce this disparity. Note that the terminology used here is whether anything should be done, not what should be done. The disparity may be traced to origins that do not warrant corrective action.

Table 3 probes to a deeper level of understanding of the training-relatedness issue. Four post-schooling measures of employment status and earnings are reported by each of the relatedness categories:

  1. The percentage of each subgroup of former students who were reported, at some time during 1992, as employed by an employer other than the reference affiliation used for them in 1991:4

  2. The average length of first job held, in quarters

  3. The average level of reported earnings during the reference quarter, 1991:4

  4. The average level of reported earnings one year after the reference quarter, or 1992:4
Table 3

Training-Related Employment

1990-1991 Community College Vocational Program Completers Earnings and Mobility

                              Different Employer       First Job Length      Earnings in 1991:4 ($)       Earnings in 1992:4 ($)
                                   in 1992                 (Qtrs.)
Relatedness              N    Rate    Sig. Test        Mean      Stderr      Mean      Stderr        N        Mean      Stderr

Directly Related     5,479    0.26                     5.06      0.02        $5,826    $39           4,967    $6,625    $45
Somewhat Related       977    0.26    Chi-sq=138.3     5.07      0.05         5,241     87             900     6,016    103
Not Related          2,060    0.39    P=0.000          4.29      0.04         3,523     61           1,702     4,479     77

Industry-Related       971    0.30    Chi-sq=21.1      4.84      0.05         5,376    101             852     6,651    118
Industry-Not-Related   224    0.46    P=0.000          3.99      0.12         3,192    197             187     4,741    259

Unable To Determine  1,337    0.35                     4.47      0.05         4,328     86           1,062     5,728    112

Notes: A "different employer" is defined as any employer identification code other than that of the reference quarter, which determined the training-relatedness code.
The "relatedness" approach used to prepare this table was provided by Florida's Employment and Training Placement Information Program.
The approach is considered to be in a pilot use phase and is subject to future refinement.

For the directly related subgroup, Table 3 reveals a turnover rate as low as, an average length of first job held as long as, and an average earnings level during the 1991:4 reference quarter higher than, those of each of the other five designations. A statistically significant difference in turnover rate and average length of first job, favoring the directly related subgroup, is found in a comparison with the not related subgroups alone.

The turnover rate and average length of first job, as these appear in Table 3, are not independent events. Also, care must be exercised in defining length of first job. Here, it means the number of quarters after a former student leaves school during which the reported employer affiliation is the same. Even this seemingly straightforward definition can become complicated, when it is realized that the appearance of more than one wage record in a quarter may indicate that an employee has moved between employers during the quarter. Also, for some reporting purposes, the pre-school-leaving quarters of employer affiliation may be considered relevant.
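
A minimal sketch of the definition just given appears below: the length of first job is the count of consecutive post-schooling quarters in which the same employer identifier is reported. The quarter encoding is illustrative, and the treatment of quarters with multiple employer records is deliberately simplified.

  def first_job_length(quarters_after_leaving, employers_by_quarter):
      """quarters_after_leaving: post-schooling quarters in chronological order.
      employers_by_quarter: dict mapping a quarter to the set of employer ids reported."""
      first_employer, length = None, 0
      for q in quarters_after_leaving:
          reported = employers_by_quarter.get(q, set())
          if first_employer is None:
              if not reported:
                  continue                 # not yet employed; keep looking for the first job
              first_employer = next(iter(reported))
          if first_employer in reported:
              length += 1
          else:
              break                        # the first-job affiliation has ended
      return length

  qtrs = [(1991, 1), (1991, 2), (1991, 3), (1991, 4), (1992, 1)]
  emp = {(1991, 2): {"E01"}, (1991, 3): {"E01", "E07"}, (1991, 4): {"E07"}}
  print(first_job_length(qtrs, emp))       # 2 -- employer E01 reported in 1991:2 and 1991:3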

One year after the reference quarter of 1991:4, the initial earnings advantage enjoyed by the members of the directly related subgroup (determined using the job title method) has been closed by the industry-related subgroup. This is the type of finding that should trigger further inquiry: What explanation comes to mind for this pattern? Does an industry affiliation convey information that has potential value as a predictor of a former student's long-term earnings prospects? If so, this readily available data element, which is contained in state employment security agency records (but not in the wage record database in most cases), can be used for selected analytical and reporting purposes.

Table 4 represents a preliminary attempt to investigate the potential value of the readily available Standard Industrial Classification (SIC) code as a complement to, and perhaps even as a substitute for, the costly and often challenged documentation of training- relatedness. Each column of numbers in Table 4 presents regression coefficients for variables that might reasonably be thought to be correlates of differences in reported earnings during the reference quarter of 1991:4. The basic purpose for this specification is to explore how the coefficients for training-relatedness and SIC code behave together and separately. The results indicate that the industrial classification variable absorbs enough
of the training-relatedness variable's explanatory power that some signs change and the statistical significance of some correlations is lost.
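
A hedged sketch of the type of specification that lies behind Table 4 follows, using ordinary least squares from numpy on toy data. The variable names mirror a few of the Table 4 regressors, but the data values are invented for illustration only; this is not the authors' estimation code.

  import numpy as np

  # toy design matrix: intercept, male, directly_related, earnings_in_1991_2 (illustrative only)
  X = np.array([
      [1, 1, 1, 4200.0],
      [1, 0, 1, 3900.0],
      [1, 1, 0, 3500.0],
      [1, 0, 0, 2800.0],
      [1, 1, 1, 5000.0],
      [1, 0, 0, 3100.0],
      [1, 1, 0, 2600.0],
      [1, 0, 1, 4400.0],
  ])
  y = np.array([6300.0, 5700.0, 5100.0, 3900.0, 7000.0, 4200.0, 4300.0, 6100.0])  # 1991:4 earnings

  coef, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
  for name, b in zip(["intercept", "male", "directly_related", "earnings_in_1991_2"], coef):
      print(f"{name:>20}: {b:10.2f}")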

Table 4

Training-Related Employment as a Predictor of Earnings

1990-1991 Community College Vocational Program Completers

Dependent Variable                 Earnings in 1991:4        Earnings in 1991:4        Earnings in 1991:4
Regression Type                    Linear Regression         Linear Regression         Linear Regression
Number of Observations                  11,048                    11,048                    11,048
R-squared                               0.5505                    0.5452                    0.5432
Population                         Employed in 1991:4        Employed in 1991:4        Employed in 1991:4

                                   Estimate    P-Value       Estimate    P-Value       Estimate    P-Value

Intercept                         -1,349.046     0.001       -624.243      0.112     -1,593.289      0.000

Demographic Variables
  Male                               348.224     0.000        358.952      0.000        336.845      0.000
  African American                  -379.172     0.000       -359.124      0.000       -396.685      0.000
  Hispanic                            78.941     0.374        124.937      0.162         65.466      0.464

Vocational Program
  Agriculture                        126.173     0.802       -184.914      0.714        245.031      0.628
  Office Occupations                 -81.097     0.322        -67.847      0.410       -182.215      0.025
  Engineering Technologies           -98.290     0.335       -104.061      0.310       -144.063      0.160
  Allied Health                       75.909     0.143         -4.262      0.934        121.823      0.016
  Medical Science                  2,308.255     0.000      2,226.776      0.000      2,465.415      0.000
  Child Care/Food Service           -675.754     0.000       -817.925      0.000       -656.258      0.001

Local Economic Conditions
  Local-Avg.-Earn ($)                  1.120     0.000          1.017      0.000          1.115      0.000

Pre-Graduation Job Info.
  Pre-Job                           -801.805     0.000       -806.899      0.000       -789.015      0.000
  Earnings in 1991:2                   0.602     0.000          0.614      0.000          0.611      0.000

1991:4 Job Relatedness
  Directly Related                   226.717     0.003        662.065      0.000
  Somewhat Related                   100.867     0.280        478.926      0.000
  Not Related                       -496.435     0.000       -274.206      0.000
  Industry Related                  -112.215     0.256        347.044      0.000
  Industry Not Related              -686.582     0.000       -538.108      0.000

Industry Information
  Avg. Earnings ($)                    0.137     0.000                                    0.175      0.000
  Standard Error of Avg. Earn.        -0.701     0.098                                   -1.220      0.004

Notes: Italic indicates a 0-1 dummy variable.
Local-Avg-Earn is the average 1992:1 earnings of all of the state's 1990-1991 high school and community college students, by groups of counties chosen by the authors.
Pre-Job refers to the job held by individuals in 1991:2.
Industry-Specific Avg. Earnings are based on two-digit SIC code. The 1991:4 earnings of all workers (in a different state for which the necessary data was available) are used to calculate the average earnings and standard error of the average earnings.

This brief excursion through the topic of training-related employment is intended to increase reader awareness of the complexity of the issue, but also to indicate some promising paths for future inquiry. On a management support criterion, the priority given to this issue ranks right behind the placement measure. Since the two concepts are traditionally paired in management use, together they warrant the highest available designation for serious attention.

Alignment of Enrollment and Employment Affiliations

Up to now, this topic has received less attention than the training-relatedness and placement issues, largely because the importance of the matter has not been widely recognized or discussed. The third major section of this guide described why the topic should now be elevated to a higher level of visibility and discussion. Concurrent enrollment and employment must be considered in any meaningful discussion of vocational education outcomes. Previous employment has to be considered for adults. While school-year units of analysis sufficed in the past, a better alignment of the timing of school-leaving, employment status, and employer affiliation is now needed.

A typical vocational education follow-up design identifies a reference population of former students, usually a school-year of leavers, and carries out a single snapshot of their employment status during some interval soon after this. Adoption of this approach in high school settings is of minor concern because most seniors who graduate do so during May or June; but there are many reasons why an investigator might want to know the year/month of last school attendance for those who leave without receiving a diploma.

Figure 8 shows why such complacency is not justified in the case of community college inquiries. Each of the four panels in Figure 8 represents one subgroup disaggregated from a total of 6,491 vocational program completers in school year 1989-1990. Those in the top panel, 10.4% of the total, left school in the summer quarter (July-September 1989). Those in the next panel, 21% of the total, left in the fall quarter (October-December 1989). The third panel from the top represents 14.3% of the full year's class who left in the winter quarter (January-March 1990). And the bottom panel covers the 54.3% who left in the traditional spring quarter (April-June 1990).

The asterisk in each of the four panels represents the July-September 1990 quarter, which can be thought of as a snapshot of the percent of the reference subgroup of former students who were reported as employed in this quarter. Note that for the 1989:3 graduates, this is the fourth full quarter following school-leaving; while for the other three subgroups of the class of 1989-1990, this is the third, second, and first full quarter after leaving school respectively.

Those represented in the top panel had a full year to hold, and perhaps to leave, one or more jobs. Those included in the bottom panel had a maximum of three months, if they left school in June 1990. Nothing more can be said, based on Figure 8 alone, about the sequences of employment affiliations that have occurred.

Figure 9 reveals why the limitations associated with Figure 8 should be of concern from a management support perspective. Here, a seven-quarter reference period is identified for each member of the 1989-1990 community college vocational program completer population. Keep in mind that the particular seven-quarters differ among the four subgroups of summer, fall, winter, and spring school leavers. For instance, for the summer 1989 completers, the seven-quarter reference period begins in January 1989 and ends in September 1990; while for the spring 1990 completers, the reference period begins in October 1989 and ends in June 1991. This means that ten quarters of data were required to cover the thirty reference months. This compares with the typical one-quarter coverage of most follow-up designs.
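
The window arithmetic described above (two quarters before through four quarters after the completion quarter) can be expressed compactly. The sketch below uses (year, quarter) tuples and is only an illustration of the alignment, not of any state's extraction routine.

  def shift_quarter(year, quarter, offset):
      """Move a (year, quarter) pair forward or backward by a number of quarters."""
      index = year * 4 + (quarter - 1) + offset
      return index // 4, index % 4 + 1

  def reference_window(leaving_year, leaving_quarter):
      """Seven-quarter window: two quarters before through four quarters after completion."""
      return [shift_quarter(leaving_year, leaving_quarter, k) for k in range(-2, 5)]

  print(reference_window(1989, 3))   # summer 1989 completers: 1989:1 through 1990:3
  print(reference_window(1990, 2))   # spring 1990 completers: 1989:4 through 1991:2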

The darker bars in Figure 9 represent the summation of seven sequential snapshots of reported employment status from the state employment security agency's wage records database. The pattern is what might reasonably have been expected, with a stable employment rate during the two quarters prior to completion, which increases somewhat during the school-leaving quarter, and then increases again to a higher stable plateau over the next year. This contrasts with the lightly shaded bars, which reveal a continuous decline in the percentage of former students who are still affiliated with the same employer who
had reported them as an employee during their last full quarter prior to the quarter of vocational program completion.

One important implication of Figures 8 and 9 together is that when a single quarterly snapshot of employment status is recorded for an entire school year's reference population of former students, the probability of capturing the first job held will differ for subgroups within this population. Based on the data that underlie these two figures, the range of sustained employer affiliation would be expected to range between a low of 23% (four quarters following completion) and a high of 38% (one quarter after completion).

The diagnostics reflected in Figures 8 and 9 require more data than is needed for the typical one-quarter snapshot of post-schooling employment status. In this case, ten quarters of data were used. The following issues arise in attempting to conduct this type of investigation:

  1. Most state employment security agencies maintain only the most recent available five quarters of wage records on-line because these are used in the routine conduct of the state's unemployment compensation program. Currently, this statement does not apply to Michigan and New York, which are referred to as wage request states because they request verification of earnings when a former employee files a claim to receive unemployment compensation benefits. Michigan does require employers to submit a quarterly report of employee earnings, but these are not used in the same way that other states use them. Massachusetts is the most recent state to enact wage reporting legislation, so the potential availability of historical coverage will be extremely limited there.

  2. Employers are required by law to submit their report on a particular reference quarter within thirty days of the end of that quarter. The state employment security agency then needs at least sixty days to process the data, so the records generally do not become available for administrative use until near the end of the following quarter. Late reporting does occur, which should be considered in deciding when a request for wage records will be submitted to a state employment security agency.

  3. State employment security agencies have uneven capacities to respond to external requests for information. The authors' ability to perform these diagnostics through the auspices of a small federal grant can be traced to the pioneering, and still unique, agreement between the University of Baltimore and Maryland's Department of Labor, Licensing, and Regulation. This interagency agreement between two branches of state government provides for the archiving and research use of the universe of Maryland's wage records, which currently cover the period from April 1985 through March 1995. The data is maintained in encrypted form in a secure facility managed by The Jacob France Center in the University's Merrick School of Business. The Center's database manager and her research colleagues have each signed an oath acknowledging their awareness of and intention to honor the confidentiality stipulations that apply to these records. Each research use of the data must be approved by an authorized person in the originating Department of Labor, Licensing, and Regulation. The Center's executive director is responsible for assuring that any public release of information conforms to the rule that no individual or reporting entity can be identified, either directly or inferentially, on the basis of available data elements such as sex, race, school attended, year graduated, and program completed. This interagency agreement frees the researchers from the queuing issues discussed earlier. The quid pro quo is that the Department of Labor, Licensing, and Regulation has expanded the timeliness and scope of its own ability to answer questions using this database. Based on this unique agreement, the four-state database that underlies the present discussion was assembled by entering into fourteen agreements with multiple state agencies in the four states for the specific research purpose stated in each agreement. While there is no known precedent for this approach, federal government contractors have engaged in project-specific data acquisition of a similar type. Recently, the Unemployment Insurance Service in the U.S. Department of Labor has taken advantage of the archived data at the University of Baltimore to conduct inexpensive inquiries in a timely manner; investigations that could not have been accomplished as quickly at low cost, and may not have been feasible at all through other sources. Legal issues aside, each agency has multiple internal data processing and reporting obligations. Both predictable seasonal fluctuations, and unpredictable pressures associated with recessions that increase the number of unemployment compensation claimants, will affect the agency's ability to respond. Even established interagency agreements that specify a particular schedule of response times may be ignored when unexpected legislative actions impose new demands on the agency's data processing unit.

  4. The states vary widely in the pricing of interagency requests for record linkages. In many cases, informal understandings of the past have given way to formal agreements. There does not appear to be any commonly followed standard for pricing. Some agencies charge on a per-record-matched basis. Others charge on a per-quarter batch run basis, or on the number of records provided by the external party. Some charge on a marginal direct cost basis, while others include standard overhead rates. Many have different rates for particular categories of request.
Together, these considerations amount to a clarion call for strategic planning of requests for data. Decisions must be made ahead of time about when data is really needed, what historical coverage is required, and what data elements are sought. For example, in most states, the wage record file contains only the three data elements described earlier: (1) employee social security number, (2) employer identifier, and (3) earnings paid to this employee by that employer during the reference year/quarter. Interest in a SIC code, or geographic identifier, requires access to different files maintained by a state employment security agency. Having said this, the importance of sorting out and aligning the timing of school-leaving and employment status will in many cases justify the effort and expense that is involved.
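
For orientation, the sketch below shows the basic linkage that all of these requests ultimately serve: matching a student file to wage records that carry only the three data elements just listed. The field names and values are illustrative; in practice, identifiers are encrypted and every step is governed by the interagency agreements and confidentiality safeguards described above.

  from collections import defaultdict

  students = [{"id": "S-001", "program": "Allied Health", "left": (1991, 2)}]
  wage_records = [
      {"id": "S-001", "employer": "E-17", "year": 1991, "quarter": 4, "earnings": 5826.0},
      {"id": "S-001", "employer": "E-17", "year": 1992, "quarter": 1, "earnings": 6010.0},
  ]

  wages_by_person = defaultdict(list)
  for rec in wage_records:
      wages_by_person[rec["id"]].append(rec)

  for s in students:
      matched = wages_by_person.get(s["id"], [])
      employed_1991_4 = any(r["year"] == 1991 and r["quarter"] == 4 for r in matched)
      print(s["id"], s["program"], "employed in 1991:4:", employed_1991_4)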

Documentation of Continuing Education

Repeated mention has been made of the need to document a former student's continuing education. The basic point has been that accurate attribution of vocational education outcomes cannot occur without taking into account any other educational pursuits that should be considered as joint-inputs.

The importance of this conclusion is illustrated in Table 5, which displays continuing education information for three populations of vocational program completers. Both high school and area school postsecondary levels are represented here. Three annual classes of completers are presented to offer a sense of the stability of the transition flows that are represented. The sex, race, and age attributes reveal the heterogeneity of the vocational education population.

Table 5

Continuing Education as a Factor in Interpreting Employment and Earnings Outcomes:

Three Classes of High School and Area School Vocational Program Completers

                            Class of        Class of        Class of
                            1989-1990       1990-1991       1991-1992

High School
  Sex
    Female                     41%             47%             46%
    Male                       33%             38%             34%
  Race
    Asian                      63%             62%             54%
    African-American           34%             42%             39%
    Hispanic                   45%             48%             42%
    Native American            35%             33%             22%
    White                      39%             43%             42%

Area School (Postsecondary)
  Age
    <= 25                      17%             22%             20%
    26-35                      13%             17%             16%
    >= 35                      11%             14%             14%

Table 5 is based on the reference state's own matching of school district records (vocational completers only) and subsequent reporting of a former student's enrollment in one of the state's higher education institutions. Particular attention is drawn to the sensitivity of the enrollment rate of former African-American students to the 1990-1991 recession; a pattern that is not observed for any of the other groups. This revelation exemplifies how the data can be used to identify possible opportunities for administrative action. In this case, both instructional staff members and counselors can be alerted to the apparent vulnerability of African-American students when economic conditions weaken. Higher education authorities will be interested in the cyclical volatility of their expected enrollments, and the demographic twists that might be expected. They should also be alerted to the possibility that a transition difficulty has been shifted from one educational level to another. The relevance of this concern depends in part on the economic circumstances that arise at the time when these enrollees leave the higher education cocoon.

State governance of public education varies too much to offer detailed recommendations for the steps that should be followed to acquire higher education data. Some states, including Florida through the FETPIP, are introducing coverage of private postsecondary institutions.

Multiple Employer Affiliations

Two issues are covered in this subsection. The first is how to deal with the presence of multiple wage records for a former student in a particular reference quarter. The second is the relevance of a former student's multiple employer affiliations for decisionmakers.

Multiple Wage Records in One Quarter

Again, this occurs when two or more employers report that a particular former student worked for them during the reference quarter. It was noted earlier that most state employment security agencies enter data without considering the actual sequence of employer affiliations. In fact, the state employment security agency usually has no way of knowing this sequence. It was also pointed out previously that the employer affiliations may have occurred simultaneously, not sequentially.

The best practical rule that can be recommended for identifying sequential affiliations is to conduct quarter-to-quarter matches that reveal whether one of the affiliations disappears in the next quarter. If so, then there can be a reasonable presumption that this was a previous employer and that the one that appears in both quarters followed.

The primary job issue is more difficult to clarify because there is no reliable time-unit of employment available (e.g., weeks worked). Different combinations of low earnings and many weeks, or high earnings coupled with just a few weeks of employment, will result in the same total quarterly earnings amount. One approach is to define primary on the basis of multiple quarters of data. If a former student maintains one affiliation continuously, and another only occasionally, then the first job can reasonably be designated as the primary affiliation.
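
A minimal sketch of the two decision rules just described follows: (1) an affiliation that disappears in the next quarter is presumed to be the earlier, departed employer; and (2) the affiliation reported in the largest number of quarters is designated primary. Both are heuristics, as the text emphasizes, and the identifiers are illustrative.

  def presumed_sequence(this_quarter, next_quarter):
      """Return (presumed previous, presumed continuing) employer sets for one person,
      given the sets of employers reported in two consecutive quarters."""
      departed = this_quarter - next_quarter        # reported now, gone next quarter
      continuing = this_quarter & next_quarter      # reported in both quarters
      return departed, continuing

  def primary_affiliation(employers_by_quarter):
      """Designate as primary the employer reported in the largest number of quarters."""
      counts = {}
      for reported in employers_by_quarter.values():
          for employer in reported:
              counts[employer] = counts.get(employer, 0) + 1
      return max(counts, key=counts.get) if counts else None

  print(presumed_sequence({"E-03", "E-11"}, {"E-11"}))           # ({'E-03'}, {'E-11'})
  print(primary_affiliation({(1991, 4): {"E-11", "E-03"},
                             (1992, 1): {"E-11"},
                             (1992, 2): {"E-11"}}))              # E-11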

Ultimately, the decision rule that is applied should be based on the specific intended purpose that the investigator has in mind. What may be thought appropriate in one case might be rejected in another situation.

The Relevance of Multiple Employer Affiliations

Strong statements have been made in previous sections that the fundamental notion of a transition from school to work should be reexamined in the context of today's patterns of concurrent enrollment and employment (see Osterman & Ianozzi, 1993). Figure 10 provides a specific example of this phenomenon as a foundation for proceeding.

The two community college vocational programs that appear in Figure 10 were chosen to highlight differences. Neither is representative, or typical, of the entire range of vocational offerings. Indeed, historically, one of the problems in discourse about vocational education has been a failure to distinguish among extraordinarily varied curriculums. Here, a three and one-half year post-graduation reference period is covered.

The first figures that virtually leap off the page are the continuing employer affiliation rates--73% for the completers of the health/medical curriculum, and 54% for the completers of the marketing and retail curriculum. These are the percentages of the respective reference groups who cannot be said to have been placed. There has been no
transition from school-to-work for these former students, at least not in the traditional sense of that term.

A second pairing of numbers represents those who were still employed in what is referred to here as the first job at the end of the three and one-half year reference period--53% of the health/medical program completers and 38% of the marketing and retail program completers. The differences in the combined rates of left- and right-truncation (i.e., continuous employment that started before leaving school and, for some, continued through the end of the observation period) account for the more than six months difference in average length of first job held between the two program completion populations.

Diagnostics of the kind displayed in Figure 10 help an investigator, and any other interested party, to understand the interplay between the school curriculum, enrollees in varied components of this curriculum, and the employment opportunity set that was included in Figure 1 earlier.

Censoring Reported Earnings

A frequently-expressed discomfort in using a state employment security agency's quarterly wage records is an investigator's inability to calculate a wage rate figure that can be compared with other data sources. This subsection presents the results of two types of censoring of available quarterly data, which satisfy some of the needs of vocational education decisionmakers.

Tables 6, 7, and 8 provide a pot pourri of data elements for reader consideration. Two new concepts are introduced here. The first, full earnings, requires a former student to have reported earnings in each of the four quarters of the reference year, and to have earned more than $8,667, which is the appropriately inflated 1989 earnings level that was self-reported in the 1990 Census by those who said they had worked forty hours or more per week for forty-eight or more weeks during the year, and who fell at the 5% point in the lower tail of the distribution of earnings for this group. The 5% in the lower tail of the distribution was chosen to eliminate outliers that might be questioned as data entry errors or special circumstance cases. This concept is intended to include only those who have reported earnings in each of the four quarters and who earned at least as much as the members of this comparison group of low-earners among all respondents to the 1990 Census who can be classified as having been employed full-time year-round in 1989. This concept of full earnings is used in each of the three tables in the series. The second concept, full-time earnings, only appears in Table 8. This concept is defined as those who were reported to have worked at least 1,920 hours during the reference year of 1991. This is the equivalent of forty hours a week for forty-eight weeks a year. The data to carry out this calculation were obtained from Washington's State Employment Security Department, which is one of the few state agencies that requires employers to report hours of work associated with the earnings of each employee.
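
The two censoring rules just defined can be stated compactly. In the sketch below, the dollar threshold is passed in as a parameter because it differs by reference year and comparison group (the text cites $8,667; the notes to Tables 7 and 8 use $9,751); the function names are illustrative.

  FULL_TIME_HOURS = 40 * 48                    # 1,920 hours per year, as in the Washington data

  def full_earnings(quarterly_earnings, threshold):
      """Return annual earnings if all four quarters are present and the total clears the
      census-derived threshold; otherwise None (the case is censored out of the average)."""
      if len(quarterly_earnings) == 4 and all(q > 0 for q in quarterly_earnings):
          total = sum(quarterly_earnings)
          if total >= threshold:
              return total
      return None

  def full_time_earnings(annual_earnings, reported_hours):
      """Return annual earnings only for those reported to have worked 1,920 or more hours."""
      return annual_earnings if reported_hours >= FULL_TIME_HOURS else None

  print(full_earnings([2400.0, 2600.0, 2500.0, 2700.0], threshold=8667))   # 10200.0
  print(full_earnings([2400.0, 2600.0, 2500.0], threshold=8667))           # None: only three quarters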

Table 6

Actual Reported Annual Earnings and Censored Earnings of

1989-1990 Community College Associate Degree Recipients

                                     Earnings in 1991          Earnings in 1992          Full Earnings in 1991      Full Earnings in 1992
Program     Sex    Size         N      Mean    Stderr       N      Mean    Stderr       N      Mean    Stderr      N      Mean    Stderr

Vocational  F     1,122       911   $16,384      $328     865   $18,463      $369     573   $20,933      $355    624   $22,620      $357
            M       912       679    18,776       542     647    20,675       518     430    25,308       603    469    25,904       514
            All   2,034     1,590    17,406       300   1,512    19,410       307   1,003    22,808       335  1,093    24,029       304

Academic    F       704       451    10,834       497     450    12,466       493     185    19,937       753    241    19,238       593
            M       613       331    16,681       891     331    18,143       879     162    28,291     1,225    193    27,735     1,028
            All   1,317       782    13,309       485     781    14,872       479     347    23,837       733    434    23,017       598

Adjusted Difference       4,088 (559)               4,529 (559)               -862 (777)                1,064 (640)
of Voc.-Acad.             Significant               Significant               Not Significant           Not Significant

Notes: "Full Earnings in 1991" is defined as "Earnings in 1991" if earnings were reported for each of the four quarters, and if this earnings amount is equal to or greater than $8,667; which is the inflated 5% quantile of 1989 full-time workers' earnings in the corresponding 1990 census group. The cut-off point for 1992 earnings is $8,887. Full-time is defined as 40 hours or more per week, 48 weeks or more per year.
The significance tests are for the difference between mean earnings levels for the vocational and academic groups, adjusted for the different distributions of "sex" in these groups. The tests are based on 5% significance level.
1992 earnings are deflated to 1991 by factor 1.025.

Table 6 presents actual earnings with no restriction on the number of quarters of reported employment, and full earnings using the censoring criteria that have been described in the previous paragraph, for the two years following the school year during which the members of the reference population received an associate's degree. This information is presented for male and female degree recipients separately, within vocational and academic groupings. This type of presentation can be replicated for any combination of curriculum and demographics, as long as the confidentiality stipulations are honored.

Table 7

Actual Reported Annual Earnings and Censored Earnings

1989-1990 Community College Associate Degree and Certificate Recipients

                                           Earnings in 1991               Full Earnings in 1991
Degree Level          Sex     Size       N       Mean     Stderr        N       Mean     Stderr

Degree                F      2,797    2,300    $19,083      $232     1,671    $23,781      $218
                      M      1,581    1,212     18,677       346       815     24,322       349
                      All    4,378    3,512     18,943       193     2,486     23,959       186

Certificate           F        838      659     14,222       333       438     18,567       327
(>=1 year)            M        485      396     18,191       545       276     23,046       536
                      All    1,323    1,055     15,712       298       714     20,298       299

Certificate           F        262      206     12,714       763        96     21,393     1,002
(<1 year)             M         95       76     22,893     1,602        60     27,409     1,544
                      All      357      282     15,457       753       156     23,707       885

Adjusted Difference              3,351 (347)                            3,923 (337)
of Degree-Cert. (>=1 yr)         Significant                            Significant

Adjusted Difference              2,716 (770)                              593 (863)
of Degree-Cert. (<1 yr)          Significant                            Not Significant

Notes: "Full Earnings in 1991" requires that earnings were reported for each of the four quarters, and that the combined earnings amount is equal to or greater than $9,751, which is the inflated 5% quantile of 1989 full-time workers' earnings in the corresponding 1990 census group. "Full-time" is defined as 40 hours or more per week, 48 weeks or more per year.
The significance tests are for the difference between mean earnings levels for the degree and certificate groups, adjusted for the different distributions of "sex" in these groups. The tests are based on 5% significance level.

Table 7 provides a more detailed look at the actual and censored earnings levels within the vocational curriculum. Here, three levels of formal recognition of educational accomplishment are identified--associate's degree recipients, completers of a vocational certificate program that lasted at least the equivalent of one year's credit hours, and those who received a certificate for a shorter course of study. Again, male and female students who reached these plateaus are identified separately. The diversity of average annual earnings that appears in Table 7 provides yet another bit of the accumulating evidence that aggregates and snapshots mask major differences beneath the surface. These differences are often of great importance in the decision-making process. Diagnostics of this type can improve the quality of these decisions. It is particularly important to recognize the comparative earnings levels associated with the three types of degree or certificate. There is a clear hint here that previous work experience, and perhaps other education credentials, should be considered in any attempt to treat these earnings figures as vocational education outcomes.

Table 8

The Effect of Using Different Censored Earnings Definitions on

1989-1990 Community College Associate's Degree and Certificate Recipients

                                        Full Earnings in 1991          Full-Time Earnings in 1991
Degree Level          Sex     Size       N       Mean     Stderr        N       Mean     Stderr

Degree                F      2,797    1,671    $23,781      $218       599    $25,700      $371
                      M      1,581      815     24,322       349       402     26,300       502
                      All    4,378    2,486     23,959       186     1,001     25,941       300

Certificate           F        838      438     18,567       327       126     21,189       605
(>=1 year)            M        485      276     23,046       536       112     25,480       778
                      All    1,323      714     20,298       299       238     23,208       505

Certificate           F        262       96     21,393     1,002        38     24,096     1,551
(<1 year)             M         95       60     27,409     1,544        33     28,381     2,170
                      All      357      156     23,707       885        71     26,087     1,321

Notes: "Full Earnings in 1991" requires that earnings were reported for each of the four quarters, and that the combined earnings amount is equal to or greater than $9,751, which is the inflated 5% quantile of 1989 full-time workers' earnings in the corresponding 1990 census group. Full-time is defined as 40 hours or more per week, 48 weeks or more per year.
"Full-Time Earnings in 1991" is the average earnings of those who worked at least 1,920 hours (40 hours per week, 48 weeks per year) in 1991.

Table 9

Earnings and Continuity of Employer Affiliation in a Multivariate Context:

1990-1991 Community College Vocational Program Completers

Dependent Variable             Earnings in 1991:4     Earnings in 1992:4     First Job Length      Changed Employer
                                                                                                   in 1992
Regression Type                Linear Regression      Linear Regression      Linear Regression     Logistic Regression
Number of Observations              11,048                 11,048                 14,674                14,674
R-squared or C                      0.5452                 0.4419                 0.2549                0.637
Population                     Employed in 1991:4     Employed in 1991:4     Have Post-School      Have Post-School
                                                                              Job                   Job

                               Estimate    P-Value    Estimate    P-Value    Estimate    P-Value   Estimate    P-Value
                                  ($)                    ($)                  (qtrs.)

Intercept                      -624.243      0.112  -1,089.821      0.024       3.015      0.000     -1.599      0.000

Demographic Variables
  Male                          358.952      0.000     650.800      0.000      -0.001      0.965      0.053      0.246
  African American             -359.124      0.000    -402.615      0.000      -0.059      0.167      0.123      0.039
  Hispanic                      124.937      0.162     113.325      0.300       0.018      0.774      0.041      0.648

Vocational Program
  Agriculture                  -184.914      0.714      77.211      0.897      -0.515      0.159      0.769      0.111
  Office Occupations            -67.847      0.410     -92.355      0.361       0.023      0.687      0.024      0.763
  Engineering Technologies     -104.061      0.310     161.113      0.198      -0.024      0.738     -0.044      0.675
  Allied Health                  -4.262      0.934     333.042      0.000      -0.303      0.000      0.362      0.000
  Medical Science             2,226.776      0.000   2,714.662      0.000      -0.005      0.912      0.244      0.000
  Child Care/Food Service      -817.925      0.000  -1,016.304      0.000      -0.342      0.007      0.094      0.590

Local Economic Conditions
  Local-Avg.-Earn ($)             1.017      0.000       1.555      0.000       0.000      0.188      0.000      0.003

Pre-Graduation Job Info.
  Pre-Job                      -806.899      0.000  -1,098.645      0.000       0.725      0.000      0.045      0.045
  Earnings in 1991:2              0.614      0.000       0.559      0.000       0.000      0.000      0.000      0.000

1991:4 Job Relatedness
  Directly Related              662.065      0.000     505.058      0.000       1.243      0.000     -0.393      0.000
  Somewhat Related              478.926      0.000     404.306      0.000       1.214      0.000     -0.263      0.002
  Not Related                  -274.206      0.000    -155.603      0.067       0.690      0.000      0.167      0.008
  Industry Related              347.044      0.000     546.003      0.000       1.135      0.000     -0.276      0.001
  Industry Not Related         -538.108      0.000     156.325      0.422       0.519      0.000      0.358      0.013

Changed Employers in 1992                             -660.157      0.000

Notes: Italic indicates a 0-1 dummy variable.
Local-Avg-Earn is the average 1992:1 earnings of all 1990-1991 Florida high school and community college students, by groups of counties selected by the authors.
Pre-Job refers to the job held by individuals in 1991:2.
Industry-Specific Avg. Earnings are based on two-digit SIC code. The 1991:4 earnings of all workers (in a different state for which the necessary data was available) are used to calculate the average earnings and standard error of the average earnings.

Table 8 uses the same format as Table 7, and repeats the full earnings figures from that table. The new feature here is the introduction of the full-time earnings figures. The intent is to demonstrate the sensitivity of reported earnings figures to the censoring rule that is used, and to highlight the gap between either of these censored figures and the actual reported levels that emerge when no restriction is imposed.

Specification and Estimation of Models

This brief section has been placed at the end of the introduction of new evidence intentionally. Up to this point, the text, figures, and tables have been designed to provide a complete package of refinements that should be considered in every state and in some local school districts. The sole criterion motivating this approach has been practical actionability--an expectation that every figure and table can be replicated and refined contingent upon a threshold level of cooperation among state education and employment security agency personnel. The material presented is expected to become a foundation for a broader and deeper discussion of both technical and policy issues.
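
To make the estimation step concrete, the sketch below shows one way models of the kind summarized in Table 9 could be specified with widely available statistical software. It is a simplified illustration under stated assumptions, not a reproduction of the estimates in Table 9: the file name, the column names, and the reduced variable list are hypothetical, and any standard statistical package could be substituted.

  # Illustrative sketch only (Python with pandas and statsmodels); the merged
  # completer file and its column names are hypothetical stand-ins for the
  # variables reported in Table 9.
  import pandas as pd
  import statsmodels.formula.api as smf

  df = pd.read_csv("completers.csv")

  rhs = ("male + african_american + hispanic + C(program) + "
         "local_avg_earn + pre_job + earn_1991q2 + C(relatedness)")

  # Linear regression of 1991:4 earnings, restricted to completers employed in 1991:4
  ols_fit = smf.ols("earn_1991q4 ~ " + rhs,
                    data=df[df["employed_1991q4"] == 1]).fit()
  print(ols_fit.summary())

  # Logistic regression of the 0-1 indicator for changing employers in 1992,
  # restricted to completers with a post-school job
  logit_fit = smf.logit("changed_employer_1992 ~ " + rhs,
                        data=df[df["post_school_job"] == 1]).fit()
  print(logit_fit.summary())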

The clear theme of Figure 1 is that the interdependence of many forces must be considered in any serious attempt to estimate the outcomes of vocational education. Three accessible examples of serious expert investigation of these forces are Grubb (1995) and Kane and Rouse (1995a, 1995b).

The authors of the present guide are now collaborating with Rouse in the creation of a new database that will include longitudinal coverage of sequential cohorts of high school students in the Baltimore City Public Schools, some of whom continued on to one or more of Maryland's public community colleges and/or to one of the eleven teaching campuses of the University of Maryland System. These files complement the decade of Maryland employment and earnings data that are already available to the authors.

Each database has unique strengths and weaknesses. Grubb (1995) and Kane and Rouse (1995a, 1995b) have used the National Longitudinal Survey of the High School Class of 1972 (NLS-72) and the National Longitudinal Survey of Youth (NLSY) to conduct sophisticated investigations of the payoff to investments in community college education. Each of these data sets contains some variables that do not appear in a simple merger of student transcript information with state employment security administrative records. These data sets are well-suited for the type of research conducted by Grubb and by Kane and Rouse, but not for management diagnostics, which occur at the local and state levels and require timely information that can be used to motivate exemplary performance by program managers and teachers.
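
The "simple merger" referred to above typically amounts to matching student records to quarterly wage records on a shared identifier. The sketch below is a hypothetical illustration; the file names, the hashed identifier, and the column names are assumptions rather than features of any state's actual system.

  # Hypothetical sketch of merging completer transcripts with UI wage records.
  import pandas as pd

  transcripts = pd.read_csv("transcripts.csv")   # one row per program completer
  ui_wages = pd.read_csv("ui_wages.csv")         # one row per person-employer-quarter

  merged = transcripts.merge(ui_wages, on="hashed_id", how="left")

  # A completer with no wage record in a quarter is not necessarily non-employed:
  # out-of-state, self-employed, federal, and other non-covered work is missed.
  matched_by_quarter = (merged.dropna(subset=["year_quarter"])
                              .groupby("year_quarter")["hashed_id"].nunique())
  print(matched_by_quarter)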

Congressional budget debates make it clear that sustained federal investment in high-quality longitudinal data sets may be in jeopardy. State vocational education leaders have rarely received actionable support from research conducted with these data sets, but that is not their intended purpose. This is why complementary reliance on a state's own administrative records is encouraged. When and if a national distributed database capability becomes a reality, some convergence between the two approaches might occur. Meanwhile, the record of success in pioneering states, together with a growing awareness of how to advance to a higher plateau of understanding, should be sufficient motivation for any state leader to join the growing team of wage record users.


LOOKING AHEAD

This guide has introduced a series of diagnostic steps that can be taken by anyone who is fortunate enough to collaborate with state employment security agency and state education entity colleagues. Many changes are in the air. Pending legislative and administrative actions promise to broaden the sweep of opportunities to participate in this advance of understanding. Improvements in data processing technology and plummeting costs combine to put such diagnostics within reach of many more organizations. Not all barriers to progress have fallen, however. Continued caution is necessary to ensure that confidentiality stipulations are respected at all times. Today's members of the vocational education community, tomorrow's students, and all of us as beneficiaries of their motivation and talents stand to gain from a better understanding of employment and earnings outcomes.

A series of important events will occur in the next year:

  1. The seven-state consortium sponsored by the Employment and Training Administration's America's Labor Market Information System (ALMIS) initiative will issue a number of reports that address many of the technical issues discussed in this guide.

  2. The Texas State Occupational Information Coordinating Committee will release documentation about a series of consumer reports based in part on wage record data. This research is also sponsored by the U.S. Department of Labor.

  3. The Bureau of Labor Statistics will move ahead in cooperation with the Employment and Training Administration to design and put in place a national distributed database capability.

  4. There will be progress, albeit uneven, in state performance measurement systems, including more systems built around a core of common data elements shared across human resource programs within a state.

  5. There will be new pressures for vocational education accountability; these may emerge as state initiatives based on block-grant discretionary authority or as federal mandates in the pending consolidation legislation.

This guide responds to local and state administrative priorities for new and improved information sources no matter how these current unknowns play out.


BIBLIOGRAPHY

Amico, L. (1993). State capacity to use UI wage records: The vocational education experience. Washington, DC: National Governors' Association.

Atteberry, J., Bender, C., Stevens, D., & Tacker, A. (1982). Vocational education, CETA program participation and subsequent earnings of 1975-76 graduates in the state of Missouri: The federal role in vocational education (Special Report 39, pp. 183-214). Washington, DC: National Commission for Employment Policy.

Borus, M. (1964). The economic effectiveness of retraining the unemployed. Ph.D. dissertation, Economics Department, Yale University, New Haven, CT.

Borus, M., Brennan, J., & Rosen, S. (1970). A benefit-cost analysis of the Neighborhood Youth Corps: The out-of-school program in Indiana. Journal of Human Resources, 5(2), 139-159.

Bross, N. (1991). Findings of the technical workgroup on using unemployment insurance wage record data for JTPA performance standards. Washington, DC: Research and Evaluation Associates, Inc.

Brown, C., & Choy, S. (1988). Information disclosure in postsecondary vocational education: Possibilities and practices. Berkeley, CA: MPR Associates, Inc.

Ghazalah, I. (1991). 1979 vocational education graduates in 1986. Athens: Ohio University.

Grubb, W. N. (1995, Winter). Response to comment. Journal of Human Resources, 30(1), 222-228.

Hanna, J. (1976, June). Progress report: Employment Service Potential Project. Carson City: Nevada Employment Security Department.

Internal Revenue Service. (1995, March). STAWRS update. Washington, DC: Simplified Tax and Wage Reporting System Project Office.

Jarosik, D., & Phelps, L. A. (1992). Empowering accountability for vocational-technical education: The analysis and use of wage records (MDS-244). Berkeley: National Center for Research in Vocational Education, University of California at Berkeley.

Joint Commission on Accountability Reporting. (1995, May). Final report: Draft. Washington, DC: American Association of State Colleges and Universities.

Journal of Official Statistics. (1993). 9(2), 269-591. Stockholm, Sweden. Special issue containing the background papers prepared for the National Academy of Sciences Panel on Confidentiality and Data Access.

Kane, T. J., & Rouse, C. E. (1995a, June). Labor-market returns to two- and four-year college. American Economic Review, 85(3), 600-614.

Kane, T. J., & Rouse, C. E. (1995b, Winter). Comment on W. Norton Grubb, "The varied economic returns to postsecondary education: New evidence from the class of 1972." Journal of Human Resources, 30(1), 205-221.

Klerman, J. A., & Karoly, L. A. (1994, August). Young men and the transition to stable employment. Monthly Labor Review, 117(8), 31-48.

Levesque, K. A., & Alt, M. N. (1994). A comprehensive guide to using unemployment insurance data for program follow-up. Washington, DC: National Occupational Information Coordinating Committee.

MDC, Inc. (1980, November). Unemployment insurance data: A study of their utility for follow-up of CETA participants in Balance-of-State North Carolina. Chapel Hill, NC: Author.

National Forum on Education Statistics. (1994, July). Education data confidentiality: Two studies--Issues in education data confidentiality and access, and compilation of statutes, laws, and regulations related to the confidentiality of education data. Washington, DC: National Center for Education Statistics, U.S. Department of Education.

National Institute of Education. (1981). The vocational education study: The final report (Vocational Education Study Publication No. 8). Washington, DC: U.S. Department of Education.

Northeast-Midwest Institute. (1988). The feasibility of a National Wage Record Database. Washington, DC: Author.

Osterman, P., & Ianozzi, M. (1993). Youth apprenticeships and school-to-work transitions: Current knowledge and legislative strategy (Working Paper 14). Philadelphia: National Center on the Educational Quality of the Workforce, University of Pennsylvania.

Pfeiffer, J. J. (1990, August). Annual report. Tallahassee: Florida Education and Training Placement Information Program.

Pfeiffer, J. J. (1994). Student follow-up using automated record linkage techniques: Lessons from Florida's Education and Training Placement Information Program (FETPIP). Tallahassee: FETPIP.

Pfeiffer, J. J., & Stevens, D. W. (1992). State and national perspectives on whether and how to attempt to use state UI wage records. Washington, DC: Research and Evaluation Associates, Inc.

Rahn, M. L., Hoachlander, E. G., & Levesque, K. A. (1992). State systems for accountability in vocational education. Berkeley, CA: MPR Associates, Inc.

Siebert, G. A. (1976, June). First progress report on the Employment Service Potential Project. Sacramento: Employment Data and Research Division, California Employment Development Department.

Smith, G. P., & Stevens, D. W. (1994). Beyond accountability: Using administrative databases to conduct discretionary management diagnostics. Denver, CO: Community College of Denver.

Standard Occupational Classification Revision Policy Committee. (1995, April). Proceedings of the Standard Occupational Classification Revision Research Findings Seminar. Washington, DC: Office of Policy Research, Employment and Training Administration.

Stern, D., & Stevens, D. W. (1992). Analysis of unemployment insurance data on the relationship between high school cooperative education and subsequent employment. Washington, DC: National Assessment of Vocational Education, Office of Research, Office of Educational Research and Improvement, U.S. Department of Education.

Stevens, D. W. (1986). Assessing the impact of the Carl D. Perkins Vocational Education Act: Economic development issues. In Design papers for the National Assessment of Vocational Education, III (pp. 29-47). Washington, DC: U.S. Department of Education.

Stevens, D. W. (1989a). Using state unemployment insurance wage-records to trace the subsequent labor market experiences of vocational education program leavers. Washington, DC: National Assessment of Vocational Education, U.S. Department of Education.

Stevens, D. W. (1989b). Using state unemployment insurance wage-records to construct measures of secondary vocational education performance. Washington, DC: Office of Technology Assessment, U.S. Congress.

Stevens, D. W. (1990, December). State Employment Security Agency information disclosure statutes and practices: A management challenge in the 1990s. DeKalb: Center for Governmental Studies, Northern Illinois University.

Stevens, D. W. (1994a). Research uses of wage record data: Implications for a National Wage Record Database. Washington, DC: Division of Occupational and Administrative Statistics, Bureau of Labor Statistics, U.S. Department of Labor.

Stevens, D. W. (1994b). Confidentiality and the design of a National Wage Record Database. Washington, DC: Division of Occupational and Administrative Statistics, Bureau of Labor Statistics, U.S. Department of Labor.

Stevens, D. W. (1994c). The school-to-work transition of high school and community college vocational program completers: 1990-1992 (Working Paper 27). Philadelphia: National Center on the Educational Quality of the Workforce, University of Pennsylvania.

Stevens, D. W. (1994d, September). The use of UI wage records for JTPA performance management in Maryland. Baltimore: Office of Employment Training, Maryland Department of Labor, Licensing, and Regulation.

Stevens, D. W. (1994e). Restricted access considerations in the design of the National Wage Record Database. Washington, DC: Division of Occupational and Administrative Statistics, Bureau of Labor Statistics, U.S. Department of Labor.

Stevens, D. W. (1994f). Performance measurement revisited. Journal of Vocational Education Research, 19(3), 65-82.

Stevens, D. W., Richmond, P. A., Haenn, J. F., & Michie, J. S. (1992). Measuring employment outcomes using unemployment insurance wage records. Washington, DC: Office of Policy and Planning, U.S. Department of Education.

Strong, M. E., & Jarosik, D. (1989). A longitudinal study of earnings of VTAE graduates. Madison: Vocational Studies Center, School of Education, University of Wisconsin-Madison.

Trott, C. E., Sheets, R., & Baj, J. (1985). An evaluation of ETA's PY85 Title II-A performance standards models and feasibility assessment regarding a regional/state-based modeling initiative. DeKalb: Center for Governmental Studies, Northern Illinois University.

U.S. Congress, Office of Technology Assessment. (1994, May). Wage Record Information Systems (OTA-BP-EHR-127). Washington, DC: Author.

U.S. Department of Education. (1994). Interim report to Congress. Washington, DC: National Assessment of Vocational Education, Office of Research, Office of Educational Research and Improvement.

U.S. Department of Labor. (1993, September). Proceedings of the International Occupational Classification Conference (Report 833). Washington, DC: Bureau of Labor Statistics.

Wirt, J. G., Muraskin, L. D., Goodwin, D. A., & Meyer, R. H. (1989). Final report. Volume 1, Summary of findings and recommendations. Washington, DC: National Assessment of Vocational Education, U.S. Department of Education.

