
W. Norton Grubb
University of California at Berkeley
National Center for Research in Vocational Education
Graduate School of Education
University of California at Berkeley
2030 Addison Street, Suite 500
Berkeley, CA 94720-1674
Supported by
The Office of Vocational and Adult Education
U.S. Department of Education
May, 1995
This document is one in a series of Technical Assistance Reports. It was
originally prepared for the Training Policy and Programme Development Branch of the
International Labour Office (ILO), Geneva, which sponsored the project.
This document has not been reviewed by NCRVE; therefore, this paper represents
the views of its author and not necessarily those of NCRVE or the U.S. Department
of Education. NCRVE makes Technical Assistance Reports available upon request
for informational purposes.
FUNDING INFORMATION
| Project Title: | National Center for Research in Vocational Education |
|---|---|
| Grant Number: | V051A30003-95A/V051A30004-95A |
| Act under which Funds Administered: | Carl D. Perkins Vocational Education Act P.L. 98-524 |
| Source of Grant: | Office of Vocational and Adult Education U.S. Department of Education Washington, DC 20202 |
| Grantee: | The Regents of the University of California c/o National Center for Research in Vocational Education 2150 Shattuck Avenue, Suite 1250 Berkeley, CA 94704 |
| Director: | David Stern |
| Percent of Total Grant Financed by Federal Money: | 100% |
| Dollar Amount of Federal Funds for Grant: | $6,000,000 |
| Disclaimer: | This publication was prepared pursuant to a grant with the Office of Vocational and Adult Education, U.S. Department of Education. Grantees undertaking such projects under government sponsorship are encouraged to express freely their judgement in professional and technical matters. Points of view or opinions do not, therefore, necessarily represent official U.S. Department of Education position or policy. |
| Discrimination: | Title VI of the Civil Rights Act of 1964 states: "No person in the United States shall, on the ground of race, color, or national origin, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving federal financial assistance." Title IX of the Education Amendments of 1972 states: "No person in the United States shall, on the basis of sex, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any education program or activity receiving federal financial assistance." Therefore, the National Center for Research in Vocational Education project, like every program or activity receiving financial assistance from the U.S. Department of Education, must be operated in compliance with these laws. |
In large part, the expansion of job training programs has reflected a concern with particular economic problems, especially those of unemployment, underemployment, and poverty. Job training programs were first discussed as a response to the unemployment created by the 1960-61 recession. Since then, periods of recession and unemployment, and specific kinds of unemployment (like the increase in dislocated workers unemployed as a result of substantial economic changes), have generated interest in job training programs as potential solutions. And the expansion of welfare programs supporting low-income families during the 1960s generated the realization that poverty was not likely to wither away of its own accord; job training has been one proposed solution. Of course, the problems of poverty, unemployment, and underemployment are closely related, and so -- despite a proliferation of programs for specific purposes -- it is not surprising to see an overall pattern to the job training programs in the United States.
In the process, an important distinction has emerged between education and job training. The difference is not always clear, since some short-term, job-specific education programs look quite similar to job training. However, there are at least six differences between the two. First, job training programs are generally much shorter. Many of them last 10 to 15 weeks, with part-day attendance, so that the number of contact hours may be as low as 40; the average program length in the recent JTPA evaluation was 3.5 months (Orr et al., 1994, Exhibit 3.18). In contrast, the shortest common postsecondary education programs -- for occupational certificates -- generally last two semesters (or about 30 weeks) of full-time enrollment, involving 360 to 1,000 contact hours; and two-year Associate programs dominate the offerings of community colleges.[1]
Second, education programs -- particularly in community colleges and other two-year colleges, and area vocational schools -- are generally open to all members of the population; but job training programs are open only to those who are eligible -- for example, the long-term unemployed or dislocated workers in JTPA, or welfare recipients in welfare-to-work programs. (The issue of eligibility reflects the origin of job training as a solution to particular economic problems: only those who have suffered from these problems are eligible, not the population as a whole.) By construction, then, job training programs enroll individuals who have had particular problems in employment; while some problems may be due to overall employment conditions, others may be due to deficient skills, behavioral problems, and other personal traits.
Third, most education programs take place in educational institutions that are well-institutionalized and standardized -- in high schools, community colleges, and four-year colleges. In contrast, job training services are offered in a bewildering variety of educational institutions, community-based organizations (CBOs),[2] firms, unions, and proprietary schools, making it difficult to determine how services are organized and provided.
Fourth, the kinds of services provided in education programs are relatively standard: for the most part they offer classroom instruction, with both academic and vocational courses, often including labs, workshops, and other "hands on" activities. Job training programs offer classroom instruction too, both in basic (or remedial) academic subjects like reading, writing, and math and in vocational skills; but they also offer on-the-job training[3], where individuals are placed in work sites, presumably to learn on the job; work experience, where individuals work for short periods of time; job search assistance, in which clients are given some training in how to look for work, write resumes, file job applications, interview for jobs, and the like; and job clubs, in which clients are required to spend a certain amount of time looking and applying for jobs. Some programs provide counseling as well, to give clients both information about labor market opportunities and "life skills" like the ability to plan. Job training programs also support placement efforts somewhat more often than educational institutions do, reflecting another division between education and job training: those in educational institutions are likely to declare that they are responsible for "education, not employment", while those in job training are more likely to accept that they have a responsibility for placing individuals as well as training them appropriately. Unfortunately, the variety of such services is so great, and the forms they take are so varied, that it is difficult to know precisely what takes place. As a result the evaluations of specific program components (reviewed in Section III.5) are not particularly comprehensive and are generally inconclusive.
Fifth, the goals of education programs are typically quite broad, and generally encompass political, moral, and intellectual purposes as well as occupational ends; but job training programs are focused exclusively on preparing individuals to become employed. In the case of welfare-to-work programs, the single goal is to get welfare recipients employed as quickly as possible so they can move off the welfare rolls. Because the goal of job training programs is so unambiguous, and because there are no intrinsic benefits to job training -- no one would declare that being in a job training program is fun, or a social activity, or a normal part of growing up, as Americans might say about schools and colleges -- there has been a long history of evaluating them to ascertain their effectiveness. These evaluations have also become increasingly sophisticated over time -- certainly much more sophisticated than those of education programs. The results have also influenced public policy in a way that has not been true in the education system, because the political pressures in education -- the support of parents for programs that benefit their own children, including such diverse offerings as those aimed at low-income students, or limited-English-speaking (LEP) students, or gifted students, for example -- are generally lacking in job training programs, where the only justification for public support is their reduction of unemployment, poverty, and the receipt of welfare.
Finally, job training programs differ from education programs in constituting a separate kind of system, a "second chance" system in some ways parallel to but disconnected from the "first chance" educational system. Over the course of 150 years, the education system in the United States has developed a well-articulated series of offerings from kindergarten (now often extended to pre-school programs) to the university level. But for those who have left this system without adequate skills, the job training system can be interpreted as a second chance to get back into the mainstream of the labor force. In general, the establishment of this second chance system is one manifestation of a generous American impulse: to provide opportunities to individuals through various forms of learning, and to be inclusive of all who might benefit from such activities. However, this second-chance system is much younger than the education system; it spends less, is more disorganized, has lower status, and is poorly institutionalized so that it cannot resist purely political pressures. As a result it has been subject to revision by nearly every President so that it lacks the stability of the education system. Given these differences, it is not surprising to find that the job training system is not especially effective -- partly, as I will argue in sections IV and V, because of the separation of education and job training in separate arenas, a division that has been detrimental to both of them.
In this monograph I review the effectiveness of job training programs in the United States, concentrating on the most recent and most sophisticated evaluations.[4] Section I describes in greater detail the variety of job training programs examined, explaining why it is appropriate to consider a wide variety of programs. Section II outlines the preferred methodology of recent evaluations using random assignment methods, and clarifies both the strengths of this approach and its inevitable weaknesses. Section III then presents a series of results, first for job training programs (III.1), then for welfare-to-work programs (III.2), and finally for special experimental programs (III.3). These results are followed by the outcomes for different population groups (III.4), for different types of services (III.5), for the effects of programs over time (III.6), and for different programs (III.7). Finally, in Section III.8, I present some recent benefit-cost analyses. A series of tables drawn from the evaluations accompanies this section, so that the reader can see concretely the results of the major evaluations.
The major question these evaluations address, of course, is whether job training programs have been successes or failures. A conventional reading of the evaluations is that many (though not all) job training programs lead to small but statistically significant increases in employment and earnings, and (for welfare recipients) small decreases in welfare payments; where cost-benefit analyses have been done, the social benefits usually (but not always) outweigh the costs. One might conclude that these programs have been successful and should be continued. However, the gains in employment and earnings are, from a practical standpoint, quite small: they are insufficient to move individuals out of poverty, or off of welfare; their effects very often decay over time, so that their benefits are short-lived; and as they are currently constructed they certainly do not give individuals a chance at a middle-class occupation or income. In my interpretation, therefore, the successes of job training programs have been quite modest, even trivial -- and that dismal conclusion requires some understanding of why that might be true. In Section IV, therefore, I present a series of possible explanations for the weak results of job training programs. The reasons for failure are necessarily more speculative than are the outcome results in Section III, which are based on harder data and (in many cases) random assignment methods; but some understanding of why job training programs have had such modest results is necessary to develop recommendations that could remedy such programs through public policy, or create more effective programs from the start.
Although the benefits of current job training programs have been small, the problems they address -- unemployment, underemployment, and welfare dependency -- are too serious to ignore. Therefore, rather than abandoning job training, the appropriate response is to determine how to reform these programs. The conclusion therefore presents a vision of how job training programs could be structured in ways that avoid the reasons for failure outlined in Section IV. This vision has in fact been embodied in current federal legislation -- the School-to-Work Opportunities Act, passed in May 1994 -- that applies to high schools and community colleges. But its implications for job training programs have not yet been developed, and so the purpose of the conclusion is to clarify how reforms now proposed for the education system might benefit job training programs as well. In the end such reforms could eliminate the unproductive division between education and job training that has developed over the past thirty years.
States were given additional authority by JTPA, the successor to CETA, enacted in 1983. State governments now designate local service delivery areas (SDAs), and they can establish priorities for SDA use of a portion of the federal grant. However, JTPA still remains a federal rather than a state program: Nearly all funding comes from the federal government, federal regulations apply to all programs nationwide, and many states have made no effort to assert a role in policy-making beyond that required by federal regulation.
The development of job training programs did more than simply add new funding sources for work-related training; it also dramatically changed the types of institutions that provide training. Job training programs since the 1960s have been characterized by their use of CBOs, unions, private firms, and other institutions--private alternatives to conventional high schools, community colleges, and technical colleges--to provide training and related services. This aspect of job training has given the education and training system greater variety and fluidity and has helped erode the boundary between public and private programs.
Job training also marked a turn toward private sector participation in public programs, not only with the funding of private organizations such as CBOs, but also with the establishment of Private Industry Councils (PICs), at least 51 percent of whose members must represent the private sector. The PICs are responsible for policy guidance and program oversight, and they must approve SDA training plans. They also have the option to administer JTPA programs directly, although fewer than 20 percent do. It is important to note that "private sector representation" refers to employers, and not representatives of labor like unions. Overall, the participation of unions in the job training system is relatively weak, partly because unions now represent only 11 percent of employees in the United States.
JTPA also differs from earlier programs in the nature of the mandates it imposes. While its predecessors focused on the types of services that local agencies could deliver, JTPA emphasizes outcomes by requiring that SDAs meet specific performance standards. The federal government has identified twelve standards[6] from which states select eight that SDAs must meet; states may add standards of their own and may use either a federal adjustment model or one of their own design to take into account the demographic and labor market characteristics of individual SDAs. In theory, the imposition of performance standards is a way of making JTPA more effective. In practice, however, performance standards can be manipulated by local programs, and they prove to be uncorrelated with the effects measured by more sophisticated evaluation techniques (Doolittle et al., 1993, p. 10); they have made local programs concerned with the details of performance measures, but not with effectiveness in a broader sense.
A second strand of development has concentrated on welfare recipients. Historically, welfare in this country -- principally, the program known as Aid to Families with Dependent Children (AFDC)[7] -- was provided only to mothers with children so they could stay at home and care for their young. However, as working has become increasingly common for all women in the United States, including mothers of young children, there has been increasing pressure to get welfare mothers (and fathers[8]) into employment and off the welfare rolls. The first efforts were established in 1962 in the Community Work and Training Program. Like MDTA, it provided funds from the Department of Labor, which could be used by welfare programs at the local level, bypassing the vocational education system. The Economic Opportunity Act of 1964 included yet another program designed to encourage work, the Work Experience and Training Program. In 1967, as part of the far-reaching Amendments to the Social Security Act, the Work Incentive (WIN) program was established as a voluntary work program. Although WIN was nominally made mandatory for welfare recipients in 1971, it was not funded at a level that made widespread participation enforceable, and therefore it remained a limited and essentially voluntary program.
Another strand of Lyndon Johnson's War on Poverty was the "services strategy" developed as an antidote to poverty. This strategy provided a variety of support services (such as child care and transportation) to enable welfare recipients to work their way off welfare. As embodied in the 1967 Amendments to the Social Security Act, it included funding for short-term training. The support services, which were then consolidated in the Title XX Amendment of 1973, provided funds for social services to states and gave states greater authority to decide which services should be provided. Title XX emerged largely intact (though with considerably reduced funding) in the Social Services Block Grant, enacted in 1981. However, in practice, work-related services (including training) were rarely provided under Title XX, which focused instead on rehabilitating families on welfare and preventing abuse, rather than facilitating employment (Dickinson, 1986).
In 1981, the Reagan administration, building on a history of "welfare-to-work" programs that forced welfare recipients to work in exchange for grants, allowed states to develop their own programs for getting welfare recipients back to work. The state programs that developed were, not surprisingly, enormously varied. Most relied heavily on job search (i.e., short-term assistance in applying for work, but no other training or support services) and work experience or OJT, both accomplished through short-term job placements. A few developed Community Work Experience Programs (CWEPs), in which welfare recipients provide community service in amounts related to the size of their grants--equivalent to the traditional conception of "workfare." Although 84 percent of the programs offered vocational skills training and 72 percent provided post-high school education, in practice only 2.3 percent of the welfare recipients participating in these programs received any skill training, and only 1.6 percent enrolled in postsecondary education (most of these were in Massachusetts, Michigan, and California). In fact, only 3.2 percent received remedial education; even the most basic forms of education and training were quite rare.[9] In practice, then, experimentation with various kinds of services and "welfare-to-work"[10] strategies led to an emphasis on job search, rather than education, training, or other services.
The most recent development in this area is the Family Support Act of 1988, which requires all states to establish JOBS programs to increase the employment of welfare recipients. The legislation provides federal matching funds--ranging from 50 percent to 72 percent of total costs--for a variety of work-related services, including job search, work experience, counseling, child care, and other support services, and all forms of remedial education, vocational education, and training. This new legislation combines the services strategy of the 1960s with the work-related emphasis of WIN (including the use of education and training).
With JOBS, Congress crafted a program that combines a mandate (participation is mandatory for all AFDC recipients who are single heads-of-household and who have no children under three years of age[11]), inducements in the form of services (e.g., one year of health care after participants obtain a job, and transitional child care) to reduce the cost of moving from welfare to employment, and capacity-building through longer-term investments in education and training (typically up to two years). As it has done with vocational education and JTPA, the federal government has attempted to target JOBS service recipients. To avoid a situation in which states primarily serve those most likely to get off welfare even without additional assistance, the federal government requires that 55 percent of states' JOBS funds be spent on those most at risk of long-term welfare dependency (e.g., young mothers who are also high school dropouts).
To some extent, the federal government has specified the services that can be provided by designating which ones are reimbursable. It has also been fairly specific in defining service levels. For example, the proposed JOBS regulations specified that only people who were spending at least 20 hours a week in authorized activities could be counted by states as JOBS participants. The states protested vigorously, arguing that some effective education and training programs require fewer than 20 hours a week (e.g., a full community college courseload typically includes 12 to 15 hours of classroom work per week). The U.S. Department of Health and Human Services (HHS) subsequently changed the regulations to make the 20-hour requirement the average for groups of participants (Kosterlitz, 1989).
Despite clear federal specifications for some aspects of service delivery, the federal government has given the states considerable flexibility in deciding which types of services will be provided, who will provide those services, and the scope of the programs. Even though the JOBS program represents a major new source of federal funding for education, job training, and related services, it requires a substantial matching of funds (between 37 and 50 percent, depending on the services) from the states. It appears that although most states will eventually increase their total spending for welfare recipients, the level will vary greatly--thus perpetuating the differences in welfare benefits and services among states. Many state legislatures will appropriate insufficient funds to match the maximum federal funding.
The question of what program outcomes the federal government expects is still open to argument. HHS will not establish performance standards until later in 1995. In the meantime, debate will continue over what many have called the "hamburger flipper" question: Is it better to move welfare clients quickly into jobs, even if those jobs do not pay enough to lift them out of poverty, or is it better to spend more time and money to increase the clients' chances for longer-term, higher-paying employment that may keep them from falling back onto welfare in the future? (Kosterlitz, 1989). The way the answer to this question is articulated in the federal performance standards will largely shape the amount and type of work-related education and training available to welfare recipients.
Still other job training programs have been developed in response to other more specific problems. For example, a number of programs provide assistance specifically for veterans, since disabled veterans in particular often have a difficult time entering the labor force; dislocated worker programs have expanded because of the special need for experienced workers laid off through no fault of their own (e.g., because of the decline of the timber industry in some states, the decline of defense and aerospace in other states, the possibility of unemployment because of NAFTA, particularly for border states) to be retrained for other kinds of employment; and vocational rehabilitation programs have focused on the special employment problems of disabled individuals. There has been a strong tendency for Congress to respond to specific new problems with a specific new program, rather than to incorporate new purposes into old programs; this in turn has generated a proliferation of job training programs with roughly the same goals -- the enhancement of employment -- but for different groups with different kinds of barriers to employment.
A particular concern has been the development of programs for youth. The problem of high school dropouts is an old one, dating almost to the turn of the century, but it has become increasingly serious as the employment prospects of dropouts have worsened relative to those of high school graduates. In addition, programs for youth have the special aura of prevention: if they can steer individuals away from unemployment, and away from crime, drugs, and (for girls) early pregnancy, then they can prevent socially costly problems in the future. In response, a large number of programs focused on the employment of youth -- both those who have dropped out of high school, and those still in school but considered likely to drop out -- have been developed, particularly within CETA and then JTPA, and their effectiveness is a subject of special concern. Some of these programs are properly considered job training programs -- that is, they offer education and training to equip young people with new skills -- while others are really work experience programs, particularly the summer youth programs that employ some young people during the summer. A positive view is that work experience itself will teach young people the personal attributes necessary for stable employment; a less charitable view is that such efforts are palliatives to keep troublesome youth off the streets. In this monograph I consider the programs targeted at youth in JTPA (Section III.1) and in experimental programs (Section III.3), and in evaluating the effects on particular populations (Section III.4). I then interpret the especially dismal results for youth in Section IV.9.
The result of these many strands of development is that a bewildering array of job training programs exists. Indeed, when the General Accounting Office (GAO) examined federally-funded employment and training programs, it counted 163 programs spending $20.4 billion in 1995 (U.S. G.A.O., 1994).[12] These counts are somewhat misleading, since they disaggregate different titles of certain programs (for example, JTPA is counted as 20 different programs because of its different sections), they include education programs (like literacy efforts and student grants and loans) that are only distantly related to job preparation, and they include other programs (like Community Development Block Grants) that fund a variety of services other than job training. Still, the count is instructive because it illustrates nicely the proliferation of purposes and the widespread perception of how incoherent and fragmented the education and training "system" has become. For purposes of understanding the rough magnitudes of different programs, Table 1 summarizes the GAO figures about education and job training programs. Clearly the major job training programs are JTPA and JOBS, and they are larger than any other federal programs with the exception of student grants and loans for postsecondary education.[13]
Finally, there have been a large number of experimental programs, often started by private foundations in order to test particular approaches to enhancing employment. Some of these efforts have been particularly intensive, and others have concentrated on providing services of particularly high quality. They therefore might provide information about what excellent job training programs -- well-designed, independent of political manipulation, and freed of having to operate under normal pressures -- might accomplish. Several of these experimental programs have been carefully evaluated, and I present their results in Section III.3 -- though with the caveat that such programs may be quite different from those operated with public funding.[14]
There are several reasons for examining the evaluations of a variety of job training programs, rather than confining a review like this to JTPA, the main job training program. One is that there is considerable overlap among these programs because of the ways they have been established. JOBS programs for welfare recipients often send their clients to local JTPA programs -- sometimes on their own initiative, and sometimes at state directive -- and so the two programs use the same services and providers. Both JTPA and JOBS may send their clients to vocational education in community colleges, area vocational schools, and technical institutes -- particularly when clients are able to find their own education and training arrangements (known as "individual referral") or when there is a fiscal incentive to do so.[15] Second, the kinds of services various job training programs provide overlap considerably, so that information from one program is useful in judging the effectiveness of others. Finally, the discussions of job training programs in the United States generally commingle information about different types of programs, and so it is necessary to understand the evaluations of all these programs.
Beginning in the 1980s, outcome evaluations began to use true experimental methods for evaluation: individuals are recruited for programs, and a random sample (the "experimentals") are allowed to enroll in the program while the others (the "controls") are administered questionnaires to collect roughly the same information as the experimentals about the services they receive and their employment history. The ethical dilemmas involved in experimental methods have been avoided by using volunteers and by recruiting more individuals than can be accommodated in the program to be evaluated, so that one could argue that some individuals would not be served even if the programs were not being evaluated using an experimental design. In addition, the effectiveness of these programs is genuinely unknown so that -- unlike denying an individual access to a vaccine known to work against a particular disease -- no one is being kept out of a program that would surely increase their life chances.
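To make the mechanics of such an evaluation concrete, the following sketch is a minimal, purely illustrative calculation (the earnings figures and sample sizes are invented, not drawn from any actual evaluation): under random assignment, the program's impact is estimated as the simple difference in mean post-program earnings between experimentals and controls, with a standard error to gauge statistical significance.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical post-program annual earnings (dollars) for a randomly assigned
# treatment group ("experimentals") and control group -- invented numbers.
experimentals = rng.normal(loc=13400, scale=6000, size=3000)
controls = rng.normal(loc=12200, scale=6000, size=3000)

# Because assignment was random, the difference in means is an unbiased
# estimate of the program's average impact on earnings.
impact = experimentals.mean() - controls.mean()

# Two-sided test of the null hypothesis that the program had no effect.
t_stat, p_value = stats.ttest_ind(experimentals, controls, equal_var=False)
standard_error = impact / t_stat

print(f"Estimated impact: ${impact:,.0f} per year "
      f"(s.e. ${standard_error:,.0f}, p = {p_value:.3f})")
```

In practice the published evaluations adjust such comparisons for baseline characteristics and distinguish impacts per assignee from impacts per enrollee, but the core estimate is this difference in means between randomly assigned groups.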
The great advantage of experimental methods is that they can eliminate the possibility that various factors unconnected to program effectiveness are responsible for any findings. In job training programs, there are three such factors that are particularly dangerous: selection effects, maturation effects, and regression to the mean. Selection effects operate because job training programs by construction select those individuals who have certain barriers to employment -- low education levels, little work history, perhaps motivational problems, or histories of drug and alcohol abuse -- and therefore might be expected to benefit least from any training program; the variety of these characteristics is so great, and so unmeasurable, that it is difficult to create an equivalent control group without experimental methods. However, these negative selection effects are complicated by other selection effects created by the administration of programs. In order to look good, job training programs have an incentive to choose the most able and job-ready of the individuals who are eligible -- a process known as "creaming". This creates a positive selection effect in addition to the negative selection effect involved in eligibility for the program. Moreover, this kind of effect may operate differently over the business cycle: when unemployment falls, the most job-ready individuals are able to find jobs so that programs have to work harder to recruit people to enroll -- and may have to enroll the least job-ready individuals with multiple employment problems. Paradoxically, then, in boom times when low unemployment makes placements somewhat easier, the individuals enrolled are the least job-ready; when unemployment is high and placements more difficult, the most job-ready individuals are likely to be enrolled because of "creaming". It would be virtually impossible, then, to construct a control group that is comparable to those enrolled in a job training program except under experimental conditions, since there are too many administrative, economic, and personal factors that affect the composition of a job training program.
Maturation effects occur when individuals improve their conditions by aging or maturing. This is a particularly likely result for youth, who suffer much higher rates of unemployment and lower earnings when they are young and then gradually mature into the more stable employment and earnings patterns of adults -- most of them without the help of any particular program. Maturation effects are also likely for measures of academic achievement, for knowledge about the labor market, for risk-taking behavior, and for certain measures of disruptive behavior including drug use and criminal activity. Without considering this phenomenon, youth programs may look effective over time as those who have enrolled in them mature, even though the program may have had no effect on this process.
Regression to the mean is another problem. By construction, job training programs enroll individuals who have had problems in employment. But some of these individuals may have had an unlucky spell -- an unexpected layoff, for example, for an individual with adequate job skills in what is otherwise a healthy local economy -- and can be expected to find employment on their own within a few months (that is, they regress back to their normal conditions of employment after a while). For such individuals a job training program might speed up the return to employment, but may not make any difference to whether such an individual finds employment again -- in contrast to an individual who lacks fundamental job skills, who is unlikely to find employment without training. Regression to the mean is a particularly serious problem in welfare programs: a large fraction of the welfare population is on welfare for a brief period -- following a layoff, the departure of a wage-earning family member like a husband, a medical emergency -- but then finds employment and leaves the welfare rolls after a short period of time. If large numbers of these "temporary" welfare recipients are enrolled in job training programs, then the programs will appear to be successful -- though all that may be happening is that normal turnover, rather than the effectiveness of the job training program, is causing some individuals to find employment and leave welfare. In the quasi-experimental evaluations of CETA programs in the late 1970s, the apparently greater increase in earnings for experimental groups compared to comparison groups turned out to be due to regression to the mean for males, though for females job training programs increased earnings by slightly more than would be expected from such a pattern (U.S. C.B.O., 1982, and Figure 1).
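The force of this problem can be seen in a small, purely illustrative simulation (all numbers invented): even a "program" that does nothing will appear to raise earnings in a simple before-after comparison if it enrolls people whose earnings happened to be unusually low in the year before enrollment.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Each person has a stable "permanent" earnings level plus year-to-year luck.
permanent = rng.normal(15000, 4000, n)
pre_program = permanent + rng.normal(0, 5000, n)
post_program = permanent + rng.normal(0, 5000, n)  # no program effect at all

# Suppose the program enrolls those who look worst in the pre-program year,
# e.g., anyone whose earnings fell below $8,000 (an arbitrary threshold).
enrolled = pre_program < 8000

print(f"Enrollees' mean earnings before: ${pre_program[enrolled].mean():,.0f}")
print(f"Enrollees' mean earnings after:  ${post_program[enrolled].mean():,.0f}")
# The "after" mean is substantially higher purely because transitory bad luck
# fades -- regression to the mean, not an effect of the program.
```

A randomly assigned control group removes this bias, because controls experience exactly the same rebound as experimentals.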
However, with experimental methods every kind of selection effect is eliminated, as are maturation effects and regression to the mean[17] -- so that any differences in employment after a job training program can be attributed to the program rather than to other causes. Despite these advantages of experimental methods, however, there remain a number of disadvantages, or evaluation problems which the use of experimental methods has not always been able to resolve:
Finally, the problems in evaluating effectiveness for different services and sub-groups extend to the evaluation of particular programs as well. That is, a national job training program like JTPA is in reality an agglomeration of over 500 programs, each administered locally -- and any average effect masks the distribution around this average caused by the existence of highly effective programs simultaneously with truly dreadful local efforts. The most effective programs may be the most valuable guides to improving practice, of course, so -- if they can be identified -- their characteristics may provide the best information about how to improve programs. While the early evaluations did not address the effectiveness of individual programs, some of the most recent evaluations have managed to detect individual programs that are more effective than the average (reviewed in Section III.7 below).
The danger of displacement also reflects a difference between human capital models of earnings and employment -- in which education and training instill new competencies that increase the productivity and then the wages and earnings of individuals -- and screening and signaling models, in which education or job training signals the greater competencies of certain individuals over others without changing those competencies. If screening prevails, then individuals completing job training programs will have higher levels of employment and earnings -- but their employment will come at the expense of other individuals who fail to get these jobs, and employment and productivity in the aggregate will not increase. Job training programs generally assume a human capital model, and there is virtually no reference in the evaluation literature to the possibility that signaling might explain any positive outcomes.
The period of time is critical because of the question of whether any potential benefits increase over time or decay. In the pattern typical of age-earnings profiles for different education levels, for example, a level of schooling may not generate any real increase in earnings for several years, during which an individual is searching for an appropriate job; then the benefits tend to increase over time, peaking somewhere during the period between age 45 and 55 before declining as retirements begin. Similarly, in job training programs one might expect a decrease in earnings during the program itself, as individuals are forced to leave any employment they might have; then, perhaps following a period of job search when earnings are still low, one would hope that earnings compared to those of the control group would be higher, and perhaps would continue increasing as the greater skills from the job training program enable individuals to advance in their jobs compared to the control group. However, a different possibility is that short-term job training programs push individuals into low-quality employment without improving their skills, so that there are short-term employment benefits that disappear after a short period -- leaving experimentals no better off than controls in the long run, and potentially even worse off because of the period of low earnings during the program itself. (This may be especially dangerous with job search assistance, which is designed to help individuals find jobs but without improving their skills.) The difference between these two possible patterns can be detected only with information about earnings several years after a program ends -- and unfortunately many evaluations have not lasted long enough to collect such information. The available results on effects over time are reviewed in Section III.6 below.
* The predominance of experimental approaches has overshadowed other methods of understanding job training programs, particularly the use of qualitative and ethnographic evaluations that might provide better insights into why programs succeed or fail. The earlier evaluation literature of CETA and the welfare experiments of the 1970s includes some qualitative studies, in which researchers would observe programs carefully, interview participants at length, and otherwise try to determine what life in a program was like for its participants. The purpose was not only to get a better sense of what programs are like -- the "texture of daily life", or the "lived experience" of programs, as ethnographers might say -- but also to develop better information about how programs are implemented, what precisely goes on in them, and why they might be ineffective. Among recent qualitative ethnographies and case studies, for example, Hull (1994) has described the amount of teaching about on-the-job relationships (in addition to technical skills) that occurs in a banking program; Kalman and Losey (forthcoming) have analyzed how a workplace literacy program fails to live up to its self-conception as an innovative, worker-centered program; Gowen (1994) has described the turmoil in a workplace literacy program; Grubb and Kalman (1994) have described how the dominant teaching methods in work-related remedial programs undermine their effectiveness; and investigations based on interviews have suggested that certain behavioral problems make job-keeping (rather than job-finding) a problem among the chronically unemployed (Quint, Musick, and Ladner, 1994). This last study is particularly interesting because it examined the lives of 50 women enrolled in New Chance, which was also evaluated with random assignment methods (see Section III.3 and Table 13). The researchers found that those enrolled in the program were enthusiastic about it, but their progress into employment was slow and uneven, partly because of the problems caused by living in highly disorganized families and communities.
One rationale for qualitative studies, then, is that they can provide explanations for the outcomes determined by quantitative analyses. Many of the reasons I offer in Section IV for the small benefits of job training programs are based not on formal results from random-assignment experiments but on less formal case studies and observations of job training programs. Formal quantitative evaluations are necessary, then, because only these methods can demonstrate the effects of job training programs on employment and earnings; but qualitative studies are necessary too, to understand why some programs work and others don't and to clarify how existing programs might be improved. Unfortunately, these two traditions of research are not well-integrated: the qualitative examinations typically collect no information about effects on earnings and employment, and the quantitative evaluations rarely incorporate qualitative studies.
The final drawback of random-assignment evaluation, of course, is that it is expensive. It can therefore be applied to large-scale evaluations of national programs of considerable policy importance -- but it cannot be applied routinely, and cannot be applied to small programs, to many experimental efforts, or to local programs deciding what mix of services or which specific providers they should use. This means that job training programs have typically been subjected to two quite different kinds of "evaluation": random-assignment evaluations of great sophistication and cost, performed largely for federal policy-makers deciding how to establish federal guidelines and legislation; and locally-collected information about effectiveness, like the performance measures required by JTPA and information about caseloads collected in local welfare programs. This kind of local information, which is much cruder and susceptible to local manipulation, is used to monitor local programs, to impose sanctions on local programs that are out of compliance with performance requirements, and in some cases to make local decisions about effectiveness. In the only effort to calibrate these local evaluations with random-assignment evaluations -- to see, for example, whether local programs with strong results on performance measures also have strong results from random-assignment evaluations -- there proved to be no correlation between the two (Doolittle et al., 1993, p. 10). This suggests that performance measures are virtually useless for making rational decisions about effectiveness -- even though they provide political protection because they make JTPA seem like a performance-driven program.
There is little question that the quality of evaluations has increased substantially over the past twenty years. Job training programs -- and particularly those associated with the welfare system -- have been the subjects of what is probably the most sophisticated policy-oriented analysis in the United States. However, given the complexity of social programs and the variety of job training programs, in a country as large and diverse as the United States, it should not be surprising that these evaluations have failed to answer all the important questions about job training programs. Indeed, given the variety of programs and the variation among localities in how they are administered, it is amazing that the existing evaluations come to so much agreement about the effects of different programs -- the subject of the next section.
In Section III.1 I present results for CETA and JTPA programs, the major job training programs. In Section III.2, I present results for welfare-to-work programs, while Section III.3 reviews several experimental programs. Section III.4 then describes what is known for different groups within the population, while Section III.5 does the same for different types of services. Section III.6 describes the results over time, to ascertain whether any benefits of job training programs persist or decay. Finally, Section III.7 presents some findings about variations among specific providers, since recent evaluations have found considerable variation among providers in their effectiveness.
The results in these sections present information about the effects of job training programs, regardless of cost. In addition, some (but by no means all) of these evaluations have carried out cost-benefit analyses, comparing the effects on earnings with costs of programs to see if they are "worth doing" not in the sense of increasing the employment and earnings of participants, but in the sense of generating benefits that outweigh costs. These cost-benefit analyses are surveyed in Section III.8.
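As background for those analyses, the arithmetic of a benefit-cost comparison is straightforward; the sketch below uses purely hypothetical numbers (a first-year gain, a decay rate, a discount rate, and per-participant costs that are assumptions, not figures from any study) to show how a stream of modest earnings gains is discounted and compared with program costs.

```python
def net_benefit_per_participant(annual_gain, decay, years, discount_rate, cost):
    """Present value of a decaying stream of earnings gains, minus program cost.

    All arguments are per participant and purely illustrative:
      annual_gain   -- first-year earnings gain in dollars
      decay         -- fraction of the gain lost each subsequent year
      years         -- horizon over which gains are counted
      discount_rate -- social discount rate
      cost          -- program cost per participant in dollars
    """
    pv_gains = sum(
        annual_gain * (1 - decay) ** t / (1 + discount_rate) ** (t + 1)
        for t in range(years)
    )
    return pv_gains - cost

# A modest, decaying gain can just outweigh the cost of an inexpensive program...
print(net_benefit_per_participant(700, 0.15, 10, 0.05, 3000))
# ...while falling far short of covering a much more intensive, expensive one.
print(net_benefit_per_participant(700, 0.15, 10, 0.05, 15000))
```

Actual benefit-cost analyses of these programs also count reductions in welfare payments, taxes paid, and other social benefits and costs, but the logic of discounting gains and comparing them with costs is the same.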
A wide variety of job training programs preceded CETA, because of the variety of funding sources and the great latitude local programs had. Partly for this reason, it is difficult to compare evaluations of different programs, because the services they offered and the intensity (or duration) of programs varied so much. In addition, the methodology of evaluation varied as well. The earliest evaluations tended to use conventional regression methods with quasi-experimental control and experimental groups: that is, a regression describing earnings (or the log of earnings) as a function of individual characteristics plus a variable describing program participation (or several variables, if data were available on the intensity of the program) would be estimated. Table 2 presents a summary of the early evaluations of job training programs, distinguishing the results for classroom training, on-the-job training, the Job Corps (a residential, year-long program for youth that is much more intensive than any other job training program), and adult basic education (ABE), a form of remedial reading and writing instruction. The results generally suggest positive effects, generally higher for females than for males, with a rough average of effects on the order of $250 to $300 per year. Given inflation between the early 1970s and the present, such benefits would be worth about $900 to $1,000 per year in 1994 dollars. However, because of unmeasured selection effects and problems with regression to the mean, mentioned above, such regression methods cannot possibly control for the true differences between those enrolled in programs and others not enrolled, so these estimates must be considered over-estimates of the true effects.
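To illustrate the regression approach just described, here is a minimal sketch (the variable names and data are synthetic and hypothetical, not taken from any of the studies summarized in Table 2): log earnings are regressed on individual characteristics plus a dummy variable for program participation, and the coefficient on participation is read as the approximate proportional effect of training.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000

# Synthetic individual characteristics and a participation indicator.
df = pd.DataFrame({
    "years_schooling": rng.integers(8, 16, n),
    "age": rng.integers(18, 55, n),
    "prior_experience": rng.integers(0, 20, n),
    "female": rng.integers(0, 2, n),
    "participated": rng.integers(0, 2, n),
})

# Invented earnings process with a 5 percent boost from participation.
df["earnings"] = np.exp(
    8.0 + 0.08 * df["years_schooling"] + 0.02 * df["prior_experience"]
    + 0.05 * df["participated"] + rng.normal(0, 0.5, n)
)

# The coefficient on `participated` approximates the proportional earnings
# gain attributed to the program -- but without random assignment it also
# absorbs any unmeasured selection effects, which is precisely the weakness
# of these early quasi-experimental estimates.
model = smf.ols(
    "np.log(earnings) ~ years_schooling + age + prior_experience"
    " + female + participated",
    data=df,
).fit()
print(model.params["participated"], model.bse["participated"])
```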
When manpower training programs were consolidated in the CETA program in 1973, the kinds of services offered and their administration became somewhat more standardized. In addition, the evaluation of these programs expanded substantially, through two different avenues: the Youth Knowledge Development Project, which generated a large number of qualitative studies of CETA programs; and the generation of the Continuous Longitudinal Manpower Survey (CLMS), which followed a random sample of CETA enrollees from 1975 on. Individuals in the CLMS survey were then matched to comparable individuals from another data set, the Current Population Survey, using different matching methods; then, as in earlier evaluations, regression methods were used to disentangle the effects of personal characteristics (e.g., gender, race, years of formal education, age, prior labor market experience, and the like) from the effects of program participation.
Table 3 presents the results of several studies using the CLMS data. The benefits of participation were generally higher for women; indeed, several studies found statistically insignificant effects for men. In general, the conclusion has been that CETA programs increased earnings for women, from about $500 to $1,000, though there is too much uncertainty to know whether there were increases for men. For youth, the effects were generally zero or even negative. In addition, a smattering of evidence suggests that classroom training and on-the-job training were more effective than work experience and public service employment, where individuals were employed in public service jobs at minimal wages; for example, Taggart (1981, p. 282) concluded that classroom training increased earnings in 1976 by $350 or 10%, on-the-job trainees gained $850 or 18 percent (declining to $600 the second year), and those in public service employment gained $250 the first year and $350 the second; individuals in work experience programs actually lost earnings. However, the various studies are too contradictory to be very sure about this result.
The different studies, varying in their analyses of the same basic data set, were most remarkable for the range of findings: for adult women, the different studies showed variation from no earnings gain at all to $1,300, while for men estimates ranged from negative $700 to $691. One of the most serious problems, aside from the ubiquitous selection effects, is nicely illustrated in Figure 1, from one of these CETA evaluations (Bloom and McLaughlin, 1982). The year before enrollment, CETA clients show a pronounced dip in earnings, compared to the control group; for men, job training simply restores them to the level of earnings of the control group, though for women there is a slight increase above the earnings of the control group. If the earnings dip is merely a transitory component -- for example, caused by bad luck, or a temporary spell of unemployment -- then it is clear that job training programs do nothing for men, though they have modest effects for women (about $800 - $1,300 in this particular study, higher than in most others). If, however, the earnings dip was connected to a more permanent reduction in earnings capacity, then the benefits of CETA would be considered higher. Disagreement about which of these is true is partly responsible for the range of estimates in Table 3. One conventional conclusion from the range of estimates, therefore, was that quasi-experimental methods were not powerful enough to detect the possible benefits of job training programs, and that experimental methods were advisable (Barnow, 1986). But in part this conclusion reflected the fact that any benefits must have been modest, since very large benefits would have been detected even with quasi-experimental methods and substantial variation among different studies.
A special program is the Job Corps, a particularly intensive program for youth. The Job Corps is predominantly a residential program in which youth live in a center away from home, and receive a variety of academic instruction, job training, and various other social services for an entire year. It has always been the most expensive job training program, costing about $15,000 in 1994 dollars, and has represented the most serious kind of intervention for those youth judged in the greatest need. As the results in Table 4 indicate, based on a quasi-experimental evaluation using a matched comparison group, it has had positive effects on employment rates and on overall earnings, though not on wage rates, and it has also served to reduce crime among those enrolled. (The finding that wage rates did not increase suggests that the Job Corps did not increase the productivity of its members; but increased employment led to higher earnings, indicating that the Job Corps, like many other job training programs, affected persistence in the labor force instead.) Furthermore, benefit-cost analyses have shown that, despite the high costs of the Job Corps, the value of these benefits outweighs the costs (Table 20). These results, which represented some of the earliest benefit-cost analyses of job training, provided some hope that even intensive job training programs, for individuals with the greatest barriers to employment, would be worth doing.
In 1983 CETA -- which had come under fire as an ineffective and politically-manipulated program -- was replaced by JTPA. The two major changes intended to increase effectiveness were the development of performance measures which local programs would be required to meet, and the requirement that local administering boards (called Private Industry Councils, or PICs) have a majority of their members from private business, an effort to make job training responsive to the needs of employers. In addition, public service employment -- which had been criticized as "make work" -- was eliminated from the services provided.
With the transition from CETA to JTPA, the evaluations based on the CLMS ceased. In their place, a random-assignment evaluation of JTPA was planned. The results of this study, which are widely regarded as definitive because of the design and complexity of the evaluation,[21] describe employment effects for individuals from 16 specific programs across the country, both 18 and 30 months after leaving the program.
Table 5 presents the most basic results from this evaluation. As in earlier findings, the impact is higher for adult women ($1,176, or a 9.6 percent increase in earnings) than for men (who experienced a $978 or 5.3 percent increase in earnings).[22] However, while these results are statistically significant, and the benefits of JTPA proved to outweigh their costs for both adult men and women (see Table 22 below), in another sense the benefits are small: for women, who might have to support a family, the program increased earnings only from $12,241 to $13,417 over these 30 months, an average annual gain of $470. Even for those who enrolled in the program, the increase was only $735 per year -- not enough to move individuals out of poverty, for example, or enable them to leave welfare. To be sure, the long-run effects might be more positive, as I explore in Section III.6 below; but clearly JTPA did not provide a substantial boost to the earnings of either women or men.
Furthermore, the effects of these programs on youth are zero or even negative. The negative findings come about because the youth enrolled in the program largely withdrew from employment during the period of training, and the lower earnings during the period of enrollment were not offset by increases in earnings after completing the program. (Since labor market experience is one of the most important criteria for hiring in the sub-baccalaureate labor market -- more important even than many educational credentials below the baccalaureate level [Grubb, Dickinson, Giordano, and Kaplan, 1992] -- the long-run effects for youth enrolled in ineffective training programs may be even worse.) The negative findings are particularly discouraging for the worst-off youth -- those who had been arrested prior to enrolling in the program. To be sure, these results for youth reflect only a subset of youth in JTPA, and are therefore not necessarily comprehensive.[23] Nonetheless, these are very discouraging findings which confirm in many ways the results for CETA in Table 3, and they have caused many analysts and policy-makers to call for eliminating JTPA programs for youth.[24]
Other effects of JTPA can be seen from Tables 6 and 7. It clearly increased the proportion of individuals with a GED or high school diploma, though by only trivial amounts for young males; it appeared to reduce the receipt of welfare benefits for women and female youths (though these effects are insignificant), while it increased welfare benefits among adult males; and it reduced the arrest rate for young males who had not been arrested prior to enrolling in the program. But these effects are quite modest and uncertain; and in the case of increasing the rate at which individuals earn a GED, there is substantial uncertainty about whether this credential improves subsequent employment.[25] Thus examining benefits other than those related to employment does not improve the conclusions much: there are benefits, but they are modest for some groups, missing for others, and in still other cases -- e.g., the increase in welfare benefits to males, as well as the overall negative effects for youth -- they operate in the wrong direction.
Overall, the results of the JTPA evaluations are sobering. They reveal modest gains for adult men and women -- on the order of $500 to $750 per year for women, and perhaps $400 to $650 for men[26] -- but earnings increases that are essentially zero or even negative for youth. The results for adults are statistically significant and, as I will point out in Section III.8 and Table 22 below, JTPA programs are also "worth it" in the sense that their benefits outweigh their costs. But the benefits are not significant in any practical sense -- they are too small to change the life conditions of those who have enrolled in job training, to enable many of them to leave the welfare rolls, or to escape poverty. This kind of finding, replicated in many other studies, leads to the puzzle addressed in Section IV: why, after about 25 years of developing job training programs, are the benefits of job training so small?
As I mentioned in the introduction, the idea of providing services -- including education and training -- to enable welfare recipients to move into employment and off welfare extends back at least to 1962. In addition to the voluntary work programs established during the 1960s, the federal government under Richard Nixon allowed states to experiment with their own welfare-to-work programs. Although the evaluations of these programs were primitive, to say the least, they generated an enormous amount of rhetoric on behalf of welfare-to-work programs. For example, while he was governor of California, Ronald Reagan established a community work experience program (CWEP) in which welfare recipients were required to work in community service jobs in amounts related to their grants (i.e., they were required to "work off" their welfare grants). The California CWEP program was a complete failure: it was able to enroll only a tiny fraction (0.2%) of welfare recipients, and it failed to meet every single one of its employment objectives (Employment Development Department, 1976). Nonetheless, Reagan cited the program as a success virtually every time he discussed welfare, and he used the presumed "success" of this program to press for an expansion of welfare-to-work programs. While the early welfare-to-work experiments tended to emphasize work, rather than education and job training, the emphasis shifted somewhat in the 1980s. During the Reagan administration, states were allowed to implement a series of experiments in their welfare programs, incorporating a mix of work requirements and services (including education and training) in order to reduce their welfare populations. These experiments varied widely, to be sure, though most of them ended up emphasizing job search rather than either job training or mandatory work: the political and practical difficulties involved in forcing welfare recipients into employment -- when they have children as well as deficient skills and, in many cases, other personal characteristics that make employment difficult -- were too great for mandatory work to ever be very substantial.
As was true of state programs in general, the state welfare-to-work programs of the 1980s varied substantially in the services they provided. For example, of the five state programs described in Table 8, the Arkansas and San Diego programs provided job search workshops, followed by work experience in public and private agencies; Virginia's program began with a period of job search followed by either work experience, education, or training; Baltimore included a variety of different services including education, training, job search, on-the-job training, and work experience; and West Virginia required community work experience, potentially of unlimited duration. As was true of other state programs, then, these five tended to emphasize job search assistance and work experience programs over education or job training; a few states did allow education and job training, especially as initiated by the welfare recipient. Thus the services in these welfare-to-work programs overlap those provided by JTPA; but the evaluations of these experiments should not be considered evaluations of training itself.
Several of the state welfare-to-work programs were evaluated with random-assignment methods; the results were revealing in their own right, and they also set the stage for more widespread programs enacted in 1988. Table 8 summarizes the results from five state experiments. Four of the five increased the amount of employment; two increased earnings by statistically significant amounts (by $560 per year in San Diego and $156 in Arkansas), with two more increasing earnings by amounts that were not quite statistically significant ($176 in Baltimore and $108 in Virginia). As a result, in three of the five states the total amount of welfare payments fell, by amounts per year ranging from $84 in Virginia to $192 in California. But the effects of these programs in moving welfare recipients off welfare were negligible: in none of these five states was the likelihood of being on welfare reduced.
These evaluations also allowed a number of other useful conclusions, in addition to dispelling the notion that "nothing works". The least effective program was that of West Virginia -- but this was generally attributed to the weak state economy in a largely rural state with few employment prospects. This result suggested that welfare-to-work programs (and job training programs more generally) could not be expected to have much effect in weak economies, perhaps including rural economies generally. Second, these experimental programs appeared to provide the greatest help to the least well-off or the least job-ready individuals -- for women compared to men, and for individuals without prior employment compared to those with a previous work history (Gueron, 1987). When added to similar results from CETA, these results suggested that job training programs ought to concentrate their efforts on individuals with the most barriers to employment -- the very opposite of "creaming".
In these welfare-to-work experiments, therefore, there were benefits of some kind in virtually all states, and -- as Section III.8 and Table 21 clarify -- the benefits exceeded the costs of operating these programs. In addition, the benefits varied among states, and the San Diego experience suggested that earnings increases could be substantial -- at least the order of magnitude of benefits recorded by JTPA. However, the effects were still modest by almost any standard: earnings increases were quite small, ranging between $100 and $200 per year in most states; the reductions in welfare payments by states were similarly small; and these programs had no effect whatsoever in reducing the number of families on the welfare rolls, the goal that is at the heart of welfare-to-work programs. The results could be read as either supporting or undermining the continuation of welfare-to-work programs, but they indicated that such efforts made very little difference to the lives of welfare recipients and provided only trivial savings for taxpayers.
One other evaluation of a state welfare-to-work program was carried out using a non-experimental design (Nightingale et al., 1991). The Massachusetts Employment and Training (ET) Choices program was evaluated by comparing ET participants to a comparison group of other welfare recipients who did not participate but who were matched on the basis of race or ethnicity, age, sex, age of youngest child, region of the state, and whether the individual was in a one- or two-parent family; regression methods were then used to control for variations among individuals (and between the experimental and comparison groups). The results, summarized in Table 9, are of the same order of magnitude as the experimental results in Table 8 but are somewhat more positive: the increase in the probability of employment was greater than for the five states in Table 8, the average annual earnings increase ($780) was larger, and the reduction in annual welfare payments ($307) was larger than the largest reduction among the five states (where California had a reduction of $192 per year). However, it is difficult to know whether these differences are due to random variation among states -- the possibility that Massachusetts was simply at the upper end of a range of outcomes -- or to the greater variety of services offered in ET Choices (which allowed welfare recipients to enroll in job training, postsecondary education, and remedial education in addition to job search assistance and supported work experience), or to the non-experimental design of the evaluation, which probably resulted in upwardly biased estimates. However, the results confirm once again that welfare-to-work programs can affect both earnings and welfare grants -- though the effects are modest, and unable either to improve the lives of welfare recipients substantially or to end the need for welfare itself.
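The ET Choices design combines matching with regression adjustment. The sketch below is a generic illustration of that kind of adjustment, not the evaluators' actual model; the file name, column names, and covariates are hypothetical stand-ins for the matching variables listed above.

    # Generic sketch of a regression-adjusted comparison on matched data.
    # The file name, column names, and covariates are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    # one row per individual in the matched sample (participants and non-participants)
    df = pd.read_csv("matched_sample.csv")

    # 'participant' = 1 for ET participants, 0 for matched non-participants;
    # the covariates adjust for remaining differences between the two groups.
    model = smf.ols(
        "annual_earnings ~ participant + age + age_youngest_child"
        " + C(region) + C(race_ethnicity) + two_parent_family",
        data=df,
    ).fit()

    # The coefficient on 'participant' is the adjusted estimate of the program's
    # effect on annual earnings; its p-value indicates statistical significance.
    print(model.params["participant"], model.pvalues["participant"])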
In 1988, in part based on a favorable reading of the experimental program evaluations, the early welfare-to-work programs were expanded in the Job Opportunities and Basic Skills (JOBS) program. JOBS required states to fund welfare-to-work programs (partly with federal matching funds) with a mix of work requirements, job search assistance, work experience, training, education, counseling, child care, and other supportive services. Again, states set up a variety of programs, varying in their job requirements and their mix of services including job training, education, and less intensive services like job search assistance. However, many states implemented their JOBS programs through JTPA as a matter of state policy, while in other states local JOBS programs accomplished the same thing by convening a variety of local providers and emphasizing the use of the existing JTPA system[27] -- in effect commingling JOBS with the dominant job training system.
So far, two evaluations of JOBS programs have been completed: one of the California program called GAIN (Greater Avenues for Independence), and another of the Florida program, Project Independence.[28] The GAIN evaluation, using random-assignment methods, investigated the effects of programs in 6 counties (out of 58), varying by their urban/rural characteristics, the state of the local labor market, and the nature of local programs. Of the many services GAIN provided, the increase caused by enrolling in the program was most substantial for job search activities (since 28.5 percent of the experimental group received these services, compared to 3.9 percent of control group members) and remedial education (defined as adult basic education or GED training, received by 29 percent of experimentals but 5.4 percent of controls) -- but vocational training and post-secondary education did not increase markedly as a result of GAIN. This confirms the bias within welfare-to-work programs: even though welfare recipients are allowed to participate in self-initiated education and job training programs lasting up to two years, in practice the emphasis on short-term job search assistance and on remedial education[29] has reduced the use of lengthier education and training programs.
The overall results are summarized in Table 10, differentiated into effects for single parents (almost all women) and heads of 2-parent families (almost all men). Over the three-year period of the evaluation (from early 1988 to mid-1990) the program increased the probability of employment for both groups; increased earnings for both groups, by $471 per year and $370 respectively; and reduced the amount of welfare payments, by $305 per year and $389 respectively. GAIN also reduced the likelihood of being on welfare by 3 percentage points among the single parents, since 55.5 percent of the controls but only 52.5 percent of the experimental group were on welfare at the end of the third year;[30] however, there was no effect in moving heads of households off welfare. To be sure, these are average effects across all six counties; the substantial variation among counties (reviewed in Section III.7) indicated that some counties were more effective than others. Overall, however, the results are quite consistent with the pre-JOBS results: welfare-to-work programs relying on the most modest services -- in this case, job search assistance and remediation -- can increase employment and earnings and result in some welfare savings, but the effects are small and the reduction in the numbers of families on welfare is trivial.
The results from Project Independence are different in several respects. This welfare-to-work program was really a job search program, since it provided relatively little education or training. Like many other programs, it did increase the amount of employment (as Table 11 shows), and also increased earnings -- though the average increase of $114 per year was very small. It also succeeded in reducing the rate at which individuals received welfare, and in fact the savings in welfare payments were greater than the increases in earnings -- a finding that proves to affect the cost-benefit results (in Table 24 below) substantially. However, two other findings were of greater interest. The evaluation found substantially higher benefits to mothers with children over the age of 6 -- who have many fewer problems related to child care, scheduling, sick days, and the like -- than to mothers with children age 3 to 5, as the results in Table 11 clarify. This finding reinforces the notion that certain individuals -- those with substantial barriers to employment -- may not benefit from job training, an idea that I will examine further in Section III.4.
In addition, the evaluation revealed a common vulnerability of job training and welfare programs. During the evaluation, a worsening recession in Florida caused welfare caseloads to increase while funding stayed the same; the result was that the resources available to clients who enrolled late in the evaluation -- the "late group" -- were substantially smaller. Consistent with this, the increases in earnings and the declines in welfare payments were significant only for the "early group", not for the "late group" -- clarifying that the fiscal conditions of welfare-to-work programs may influence the results substantially. The implication is that individuals who point to certain exemplary welfare-to-work programs as evidence of what job training might accomplish -- for example, the Riverside program described in Section III.7 and Table 19 below -- neglect the more likely possibility that fiscal constraints will lead to low-quality programs with no effects.
One hint about how this program affected behavior came from an examination of attitudes and values. Those who enrolled in the program were more likely to agree that "even a low-paying job is better than being on welfare", were less likely to think that mothers should stay home with their children rather than working, and had lower reservation wages. Even though many of these differences were statistically insignificant, they all support the notion that welfare-to-work programs can change attitudes, replacing an acceptance of welfare with a greater commitment to work -- consistent with one of the intentions behind welfare-to-work programs. But Project Independence also reduced the overall income of those enrolled by a small amount, and it decreased the fraction of those saying that they were satisfied or very satisfied with their overall standard of living, from 45.7 percent among the control group to 41.9 percent among those in the program. Project Independence therefore represents a relatively conservative approach to welfare-to-work programs, in which the costs of welfare to taxpayers decline (see also Table 24) at the expense of welfare recipients themselves.
There has been a common pattern in the United States of experimenting with social programs, trying out promising practices on a small scale before expanding them to more universal programs. Indeed, the history of job training efforts can be interpreted as part of this larger history. The development in the 1960s of manpower training programs located outside the schools was in part an effort to develop novel efforts that bypassed the presumed deficiencies of the educational system; these "experiments", with the roughly positive results summarized in Table 2, were then institutionalized in CETA and then JTPA. Similarly, the current round of welfare-to-work programs in the JOBS programs emerged from welfare-to-work experiments operated by states during the 1980s, which were in turn based on fledgling efforts of the 1960s and 1970s.
There have been several other, more self-consciously experimental programs developed with private and foundation funding that have contributed to our knowledge of "what works". There have, not surprisingly, been a very large number of experimental programs, each with its partisans -- partly because private foundations in the United States often operate by developing an experimental approach (or discovering an experimental approach underway in some corner of the country) and then replicating it elsewhere, sometimes with some kind of evaluation. Some of these efforts have proved not to be especially effective because they cannot be replicated -- for example, some are dependent on the high energy and charisma of a founding leader, and do not work once they are operated by other people. However, others have been designed to be replicated, and have been intended as tests of particular approaches to employment and training. In contrast to the mainstream job training and welfare-to-work programs -- which have provided a variety of services, often without much thought or guidance about what might be appropriate -- these experimental programs have developed clear models of what employment-related services ought to be provided to specific population groups, and have then worked to make sure that these models are carefully implemented. One might expect, therefore, that these experimental programs would be of higher quality, and would generate more favorable outcomes, than do the mainstream programs reviewed in the previous two sections.
In this section I present the evidence for four programs: two for young mothers with children, the Minority Female Single Parent Demonstration (MFSP) and New Chance, a program for young poor mothers; and two for youth, JOBSTART for high school dropouts and the Summer Training and Employment Program (STEP) for youth at risk of dropping out of high school.
Because the four MFSP sites were somewhat different in their services, the outcomes were reported for each of the four programs. Table 12 presents the most important results from the four. The obvious story is that three of the four programs had no influence whatsoever on employment and earnings, or on the receipt of welfare benefits, over the period of the evaluation. However, one program -- the Center for Employment Training -- had substantial effects, increasing the amount of employment by 13 percent, increasing wage rates by 11 percent, and increasing average earnings by 25 percent or $101 per month ($1,212 on an annual basis) -- a huge effect compared to those of other JTPA programs in Table 5, for example, or the GAIN results in Table 10. Like GAIN and earlier welfare-to-work programs, MFSP did not decrease welfare payments substantially, nor did it decrease the likelihood of being on welfare.
The MFSP Demonstration results can be read either positively or negatively. On the negative side, three of four programs that were funded to create exemplary job training programs, with relatively lavish funding and special attention to the design of the programs, failed to have any effects at all. On the positive side, the substantial effects of CET -- confirmed in the JOBSTART evaluation reviewed below -- suggested that well-designed programs can work. Indeed, the success of the CET program in this evaluation has been the subject of a kind of publicity campaign to trumpet the success of a particular approach. In the interpretation of the evaluators and the founder, the success of CET was due to its efforts to integrate remedial education and vocational skill training, and they then began to promote a model of job training that depends on such an integration (Burghardt and Gordon, 1990; see also Literacy and the Marketplace, 1989). This may in fact be a worthwhile approach, as I argue in the Conclusion of this monograph; but the success of CET is due to many factors in addition to the provision of both remediation and job skill training, as I point out in Section III.7.
Overall, however, the results of the MFSP Demonstration are remarkably consistent with other evaluations, especially those of GAIN: job training programs on the average have modest positive effects on employment and earnings,[31] very little effect if any on welfare payments, and no effect on the likelihood of being on welfare -- though individual programs (like CET) may have much more substantial effects.
Table 13 presents the findings of the random-assignment evaluation of New Chance 18 months after entering the program. (Subsequent results will examine outcomes after 42 months; if it takes time for positive results to emerge -- as seems to be true for certain programs, as reviewed in Section III.6 below -- they will not be apparent in the findings available so far.) New Chance was successful in increasing attendance in GED programs and college, and in increasing the proportion of mothers who earned a GED and some credits toward a postsecondary credential. Surprisingly, however, the program did not increase scores on a test of basic skills (the TABE, or Test of Adult Basic Education), and it left 72 percent of those enrolling (compared to 70.2 percent of the controls) reading at a ninth grade level or below -- suggesting that it is possible to earn a GED without improving academic competencies.[32] For a wide range of other outcomes, however, New Chance made no difference in the short run. Indeed, if anything it appeared to have negative consequences: it appeared to increase pregnancies, though these were balanced by increased abortions so that the number of births stayed about the same[33]; it reduced employment, though not significantly so; it reduced earnings significantly; and it increased welfare slightly. To be sure, it is possible that these results reflect withdrawal from employment during the period of the program itself, and that the early declines in employment and earnings will be reversed after young women get into more stable employment in their third and fourth years after enrolling (as for the JOBSTART program summarized below). However, the early findings, roughly consistent with the MFSP Demonstration results, are not at all encouraging since the only real benefit has been an increase in a credential -- the GED -- which has little effect on either employment or subsequent education. The results are particularly discouraging given the high cost of the program, about $9,000 per person.
The JOBSTART demonstration took place in 13 sites around the country, using JTPA funds; it was modeled roughly on the successful Job Corps program (reviewed in Table 4 above), though it was less intensive and non-residential. Each site provided remedial education, vocational skill training, job placement assistance, and various support services like child care, transportation, counseling, and instruction in work readiness and job skills; sites were required to offer at least 200 hours of basic education and 500 hours of job training, making them more intensive than conventional JTPA programs. Three different models of service provision were followed: concurrent programs provided remedial education and occupational training at the same time; sequential/in-house programs provided remedial education and then vocational skill training; and sequential/brokered programs provided remedial education and then referred participants to other programs for vocational skill training.
The JOBSTART Demonstration was evaluated using random-assignment methods over a four-year period. The most important results are summarized in Table 14. One major effect is that the program increased the rate at which drop-outs received a GED -- a result that is not surprising because most of the programs emphasized the GED. However, given the limited effects of the GED in increasing employment or access to postsecondary education, this impact may not be of much value. Indeed, over the four-year period the effects on employment and earnings were insignificant, both for the total sample and for selected sub-groups. As in other evaluations, the proportion of women receiving welfare did not decrease overall -- indeed, it increased for women with children -- and the amounts of welfare did not decrease. Nor did rates of pregnancy or giving birth fall significantly -- a special concern because of the negative effect childbearing has on poverty and welfare dependency; indeed, for mothers entering JOBSTART, rates of pregnancy and giving birth increased during the program. One positive result is that rates of arrest appeared to fall, as did drug use (significantly so for hard drugs excluding marijuana).
The results in Table 14 are dismal, and suggest that well-designed job training programs of moderate cost do not work for youth at all. However, these average effects do mask some potentially positive findings for the longer run. A common pattern was for employment and earnings to fall in the first year of the program, while individuals were enrolled in education and job training, with employment increasing in the second year and earnings increasing in the third and fourth years -- suggesting (from the fourth panel of Table 14) a pattern in which those enrolled in the programs increase their earnings about $400 per year over the long run. This pattern emerged for most sub-groups including men, mothers, and other women, and was particularly marked for men arrested before enrolling in JOBSTART -- for whom earnings increases were $1,129 in year 3 and $1,872 (and statistically significant) in year 4 -- and for youth who left school for academic reasons, for whom earnings increases were $726 in year 3 (and statistically significant) and $592 in year 4. In addition, the finding that drug use decreased as a result of the programs suggests that more positive effects might show up in the long run as some individuals avoid drug-related arrests and drug-motivated unemployment and create stable employment records instead.
The other positive finding is that one of the 13 sites -- the Center for Employment Training in San Jose, the successful site in the MFSP Demonstration -- had statistically significant increases in earnings: over the four-year period, the experimental group earned $32,959 compared to $26,244, an increase of 25.6 percent and $1,679 per year (and $3,044 per year over years 3 and 4). Unfortunately, disaggregating by sites makes the findings more dismal, if anything. Seven of the 13 sites had negative outcomes -- about what one would expect by chance alone -- and two of them had very large negative effects in the range of $6,200 over the four years. Eliminating the one clear success of CET as a special case, the other 12 sites averaged negative effects of $1,393 over four years and $211 over years 3 and 4, making it difficult to conclude that the long-run effects could be positive. The CET program may be a success story, for the special reasons I analyze in Section III.7, but otherwise it is difficult to find much hope for youth programs in the JOBSTART results.
The STEP program was evaluated with a random-assignment approach in which students enrolled in both the school and the work component were compared to others enrolled in the work component only; thus the design tests the additional effects of the summer schooling component. The effects on reading and math scores and on knowledge of contraception after the first summer were positive; gains during the second summer were also positive though somewhat smaller. However, 3 1/2 years after enrolling in the program, STEP youth experienced the same dropout rates, rates of postsecondary enrollment, employment rates, and rates of teenage pregnancy as did the control group (Walker and Vilella-Velez, 1992; Grossman and Sipe, 1992). One widely-cited conclusion from the STEP experiment is that, while it is possible to improve academic performance through a short-term program, long-term results and effects on more fundamental behavior like employment and pregnancy cannot be changed with a short-term intervention that leaves the rest of schooling and the general environment of poor youth unchanged.
Overall, the results of these experimental programs are disheartening. Although two of them identified a particular program -- CET in San Jose -- as having particularly strong effects, the programs on the average had no effects or even negative effects. To be sure, the populations included in these experiments were among the most difficult to employ, since two focused on young mothers with children and two others on low-income youth. But the results clarify that the modest effects achieved in the general JTPA and JOBS programs cannot be improved upon merely by paying somewhat closer attention to the design of programs and to their implementation.
The results in the previous three sections concentrate on the overall effects of job training programs. However, an important question is whether the effects vary for different population groups -- for example, whether they are greater for men than for women, for adults than for youth, for whites compared to black or Hispanic individuals, or for those who are the most employment-ready compared to those who have multiple barriers to employment. If it were possible to find certain groups for whom job training programs work better than for others, it would be possible to target resources to those groups and increase the overall effectiveness of the limited resources available for job training, or to provide different kinds of services to different groups.
In the early evaluations of manpower projects (e.g., Table 2), there were no clear conclusions about which groups benefit the most from job training. However, the CETA evaluations found greater effects for women compared to men, and for adults compared to youth (e.g., in Table 3, elsewhere in Barnow, 1986, and Bloom and McLaughlin, 1982). In addition, the benefits appeared to be higher for those with little labor market experience, compared to those with substantial experience before enrolling in a program (Bloom and McLaughlin, 1982) -- suggesting that job training programs might benefit those with the least skills and experience, while those with more skills and experience would simply cycle in and out of employment on their own and not be helped by job training programs.[34]
The results of the welfare-to-work experiments in the 1980s tended to confirm the findings that the least job-ready individuals would benefit the most. Employment increases were generally greater for women on welfare, for example, and for individuals without prior employment histories (Gueron, 1987, p. 28). A more sophisticated reading of the evidence suggested a kind of tripartite result, presented in Table 15: individuals within these welfare-to-work programs who were the most job-ready -- who were first-time welfare recipients -- generally did not benefit (those in Tier 1); and those individuals with the most serious barriers to employment -- those on welfare more than two years and those with no prior earnings -- had low and often insignificant benefits (in Tier 3). A group in the middle, with some prior time on welfare and with low (but non-zero) earnings (in Tier 2) appeared to gain the most (Friedlander, 1988; Gueron and Pauly, 1991, Ch. 4). Such a conclusion is consistent with earlier findings that the most job-ready individuals do not benefit much from job training, but also with the negative results of the experimental programs reviewed in Section III.3 that focus on the most disadvantaged. This kind of finding also supports a kind of "triage" policy, in which the most job-ready would be denied access to job training programs (in contrast to the practice of "creaming"); the most disadvantaged, with the greatest barriers to employment, would also be denied access to conventional programs or -- because this practice might be difficult to implement because of political and moral considerations -- would enroll in more intensive and expensive programs than are typically offered. Conventional programs would then concentrate on the group in the middle, for which the most substantial gains at lowest cost might be expected.
The more recent JOBS evaluations have also generated results for specific groups within the population. For the California GAIN program, one set of results tended to confirm the kind of tripartite finding from the earlier experiments: recipients of welfare who were moderately disadvantaged earned more, and received lower welfare benefits, than did either more disadvantaged recipients or new applicants (Riccio, Friedlander, and Freedman, 1994, Table 4.6). Unfortunately, for other groups this conceptually straightforward conclusion became less clear, because the effects on sub-groups varied so much among the six counties. For example, individuals were classified according to whether they needed basic education or not, since those in need of basic education may need remediation before they can benefit from job training, job search assistance, or any other services -- and therefore may be more expensive to return to employment and less likely to find employment after any job training program. Overall, both groups benefited from GAIN, in both increased earnings and reduced welfare benefits. However, one of the six counties (Riverside) increased the earnings of both groups; two (Alameda and San Diego) increased the earnings of those not needing basic education, but not the other group; and two (Butte and Tulare) increased earnings of those needing basic education but not those with better academic skills.
In addition, long-term welfare recipients benefited in three or four of the six counties, contrary to the results in Table 15; but new applicants also benefited in two of the four counties that collected such information. The programs in Riverside and San Diego benefited both those with and those without prior employment; on the other hand, Alameda and Los Angeles benefited only those without prior employment, while Butte and Tulare benefited only those with prior employment. Thus these results tend to cast doubt on the "triage" solution suggested by earlier evaluations: at least under some conditions, in some programs, the most disadvantaged individuals can benefit from welfare-to-work programs, while in other cases the principal beneficiaries are the least disadvantaged.
The evaluation of the Florida program, Project Independence, also muddied the waters somewhat. As Table 11 shows, mothers with children age 3 - 5 benefited much less than did mothers whose children were 6 and over -- consistent with the practice of "creaming" that selects only the individuals with the fewest barriers to employment; and those considered job-ready (in terms of their education and prior labor market experience) benefited somewhat more than those with less education and experience. However, individuals who had been on welfare for two years or more -- who were presumably less job ready -- benefited much more than did those on welfare less than two years (or those who were first-time applicants). Thus the effects for specific groups provided little clear guidance for selecting individuals, save for the demonstration that mothers with young children benefit less than others.
The results of evaluating JTPA using more sophisticated random-assignment methods have confirmed the greater impact for women compared to men (Table 5) -- though the differences were not as great as for the earlier CETA evaluation (e.g., Table 3) -- and the finding of zero or even negative effects for youth confirmed the earlier conclusion that job training might be effective for adults but not for youth. (The findings of greater effects for women are consistent with the earlier findings of greater effects for the least job-ready -- since women are likely to have less labor force experience, and to have children that complicate their employment -- but the larger effects for adults rather than youth contradict this pattern.) For other sub-groups, the results were generally statistically insignificant, and therefore the most cautious conclusion is that there were no differences among different groups in the benefits of job training (Orr et al., 1994, Ch. 5, especially Exhibits 5.8, 5.9, 5.19, 5.20). However, the more serious problem is that the differences which can be detected did not fall into any real pattern. Some of the differences confirm the conclusion that the least job-ready individuals, or those with the greatest barriers to employment, benefited the most:
After a great deal of research, therefore, the effectiveness of job training programs for different sub-groups remains murky. Overall, women benefit more than men, and adults more than youth, for whom the results have often been zero or negative. But at least in some programs, the least job-ready individuals benefit, while in others the opposite is true. This in turn suggests that any particular strategy for selecting applicants -- for example, the efforts to "cream" or select the most job-ready, or the contrary efforts to target job training on the most disadvantaged individuals, or the "triage" solution suggested by Table 15 -- cannot be generalized.
The early evaluations (e.g., in Table 3) were often interpreted as supporting the greater effectiveness of on-the-job training over classroom training (e.g., Taggart, 1981, p. 282). Some reviews of the CETA experiences similarly concluded that on-the-job training was more effective than classroom training, which in turn was more effective than work experience by itself, without a training component or associated classroom instruction (Taggart, 1981, pp. 282 - 288). This kind of finding was consistent with the evidence that CETA increased earnings not by increasing wage rates -- or, in economists' terms, by increasing productivity -- but by increasing rates of employment, something that might be improved more by on-the-job training and its attention to work-related behavior than by classroom instruction and its emphasis on cognitive and vocational skills. Similarly, Barnow (1986), reviewing a greater number of studies, concluded that public service employment and on-the-job training had greater effects than classroom training, and that work experience had no effects (or even negative effects). In addition, the early evaluations of CETA concluded that longer classroom training programs were on the whole more effective than shorter programs, especially for women (Taggart, 1981, p. 103 ff.); in part, the strong results from the year-long, residential Job Corps program (Tables 4 and 20) strengthened the case that longer, more intensive programs would be more effective.
The JTPA evaluation, using more stringent random-assignment methods, also examined the effectiveness of different kinds of services provided. Individuals were assigned to three different kinds of services: classroom training in occupational skills; on-the-job training/job search assistance, where individuals typically were enrolled in job search assistance and then sometimes found on-the-job training or unsubsidized jobs; and a category of "other" services that might include basic or remedial education, job search assistance, work experience, and miscellaneous other services. For adults, individuals with classroom training received an average of 551 hours of services costing $3,174, substantially higher than for either OJT/JSA (averaging 222 hours and $1,427 per person) or "other" services (204 hours at $1,116 per person). For youth, "other services" (averaging 328 hours at $2,016 per person) were almost as intensive as OJT/JSA (317 hours at $2,755), while again classroom instruction was the most intensive approach (averaging 596 hours at $3,305). To put these figures in perspective, a one-year certificate program in a community college might take 30 weeks of time with about 25 hours per week, or 750 hours -- so the most intensive of these programs involved only two-thirds of the hours of the least intensive vocational program, and the most common services were only one-third the intensity.
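A quick way to see this intensity comparison is to put the reported hours next to the roughly 750-hour benchmark for a one-year community college certificate. The sketch below simply recomputes those ratios from the figures in the text; the fractions quoted in the text ("two-thirds", "one-third") are rounded.

    # Hours of JTPA services relative to a one-year certificate program
    # (roughly 30 weeks x 25 hours/week = 750 hours); figures are from the text.
    certificate_hours = 30 * 25   # about 750 hours

    jtpa_hours = {
        "adult classroom training": 551,
        "adult OJT/JSA": 222,
        "adult other services": 204,
        "youth classroom training": 596,
        "youth OJT/JSA": 317,
        "youth other services": 328,
    }

    for service, hours in jtpa_hours.items():
        share = hours / certificate_hours
        print(f"{service:26s} {hours:4d} hours ({share:.0%} of a one-year certificate)")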
However, the effects of these different kinds of services are not related to the intensity of services.[36] As the results in Table 16 indicate, the benefits are greatest for "other services" for women, followed by OJT/JSA; for men, OJT/JSA is the most effective (though the difference is statistically insignificant). These results tend to confirm earlier findings about the superiority of on-the-job training to classroom instruction. For youth, however, the results are inconclusive because of the lack of statistical significance, though classroom training appears more beneficial (or, in reality, less harmful) than other services.
In the early evaluations of welfare-to-work programs, it was not possible in any formal sense to evaluate the effectiveness of different services because individuals were assigned to various services by non-random methods. The states varied in the mix of services they provided, leading to the conclusion that different kinds of programs could all achieve the kinds of modest results found in Table 8 (Gueron, 1987, p. 30). However, because many control group members in these experiments had access to education and job training programs through educational institutions and JTPA, the greatest difference in services between experimental and control groups was typically the short-term job search assistance provided in welfare-to-work programs.
The design of the GAIN program in California reinforced this kind of conclusion. For many participants (those with a high school diploma or a GED, or who passed a literacy test), GAIN required job search assistance before any other services; therefore the dominant service provided by GAIN was short-term JSA, and the generally positive effects in Table 10 can be attributed to this kind of service rather than any other. However, the differences among the 6 different counties evaluated in this study did suggest some other conclusions. In four of the six counties, the presence of large earnings increases among those in need of basic academic education suggests that remediation is an important component of job training; on the other hand, the absence of earnings gains for those needing remediation in two counties that emphasized remediation (Alameda and San Diego) indicates that basic skills instruction cannot guarantee success. Other results in Alameda also suggested that vocational training and postsecondary education could have an effect -- though two counties (Riverside and San Diego) produced large earnings increases without increasing vocational training or postsecondary education. The findings therefore suggest -- as Taggart concluded from the CETA results earlier -- that a variety of different service strategies can be used in effective programs, without strong evidence that one particular service is preferred over another.
On the other hand, the results of the Project Independence program (Table 11) indicate the importance of at least some minimal level of services to outcomes. The group that enrolled late in Project Independence, after a recession forced increases in caseloads without a corresponding increase in resources, did not benefit at all -- presumably because they received very little from the program. (The cost per person for individuals in the "late" cohort was slightly less than $900, less than almost any other job training program.) Even at its best Project Independence was not a particularly intensive program, since it provided little more than job search assistance to most clients; but even so the findings suggest that results cannot be expected when resources dip below some critical level.
The experimental programs described in Section III.3 all involved much more intensive and carefully-devised programs, incorporating a range of services. The disappointing outcomes therefore give little support to the idea, associated also with the Job Corps, that a range of services and more intensive programs are necessary to overcome the multiple handicaps to employment of the least job-ready groups. On the other hand, the evidence for the Center for Employment Training in San Jose, reviewed in Section III.7, seems to support the conception of comprehensive services. This idea remains an attractive one, and I return to it in the Conclusion -- but neither the national JTPA evaluation nor the GAIN study nor the experimental programs of the past few years provide conclusive support for this idea.
One possible interpretation is that either of two very different strategies may work in job training programs. The first involves relatively inexpensive approaches. On-the-job training has proved to be effective in CETA and JTPA; similarly, the job search assistance supported in many welfare-to-work programs appears to be modestly effective, even though it is relatively inexpensive and of short duration. Both services are designed to get individuals into employment quickly and to socialize them to the norms and values of employment, though neither does anything to enhance the basic cognitive or vocational skills of individuals who lack such competencies. The fact that job training programs have proved to be effective more by enhancing employment rates than by increasing wage rates (and presumably productivity) suggests that the effectiveness of such services is related to getting individuals into employment quickly. Whether such limited services have much effect over the long run, or whether they merely substitute one group of under-prepared workers for others, are questions that the existing evaluations cannot answer. The second effective strategy has been associated with more intensive services, including the kinds of remedial education supported by GAIN and various experimental programs, often combined with vocational skills training and supportive services. This intensive approach is designed to increase the competencies of individuals who are enrolled in job training programs, rather than to push them into low-wage jobs, and to increase their productivity and wage rates over the long run. These two strategies serve somewhat different purposes, to be sure. More to the point, they can be linked -- as I argue in the Conclusion.
The majority of job training programs are short term, lasting perhaps 15 to 20 weeks; and they are generally self-contained, so that individuals enrolling in them do not typically enroll in other programs to continue the process of education and job training.[37] Therefore job training programs are usually "one shot" efforts to get individuals into employment, rather than the beginning of a longer period of education and job training (as, for example, postsecondary education can be). This in turn raises the question of what the long-term benefits of short-term programs are. If, consistent with the human capital model, job training programs provide their clients with real skills (cognitive, behavioral, or vocational) that increase their productivity and are valued in the labor market, then one would expect to see a permanent increase in wage rates and earnings. If in addition the initial enhancement of skills allows individuals to enter "careers" with subsequent on-the-job training that further increases productivity, one would expect to see wage rates and earnings continuing to increase over time -- as happens in the age-earnings profiles associated with different levels of formal schooling. If, however, job training programs merely push individuals into the labor force without increasing their skills substantially, and fail to gain them access to "careers" or other positions with long-run possibilities for advancement, then the effects of such programs may be short-term, and even with positive short-term benefits the long-run benefits may be essentially zero. Indeed, various education programs have been found to be subject to this kind of "fade-out" or decay of benefits -- including the STEP program described in Section III.3 above -- from which many have concluded that sustained interventions are necessary to improve the life chances of low-income students.
Thus it is important to examine the benefits of job training programs over time, to see whether any initial benefits are sustained (or even increase) or whether they decay. Unfortunately, this has not been easy because most evaluations last a comparatively short period of time, and the long-term effects are therefore unknown. However, a few studies shed some light on this problem.[38] In Table 17, for example, the pattern of earnings for the JTPA evaluation, summarized in Table 5 above, is displayed for different periods over the 30 months of the evaluation. During the first six months, earnings increases are essentially zero, because this is the period when individuals leave the labor force to enroll in the program. In the next 12 months (months 7 to 18), the earnings advantage increases to an average of $68 per month among women and $45 among men; in the next twelve months this advantage is sustained at about $71 per month among women and men (and becomes statistically significant among men). There is no evidence in these figures of further expansion of benefits, but neither do they indicate any "fade-out". (For youth, not surprisingly given the negative results in Table 5, there is no pattern of benefits at any time during the 30 months.) Similarly, the results for JOBSTART in Table 14 suggest that earnings of those enrolling are initially lower (as are hours worked), but then increase to a plateau of about $400 per year more than those in the control group -- though employment rates, after increasing in year 2, seem to fade away. However, the Project Independence results in Table 11 indicate some "fade-out" or decay in benefits, since increases in earnings and employment, and decreases in welfare payments, are greater in year 1 than in year 2.
The evaluations based on the longest period of time involved four of the welfare-to-work experiments begun during the 1980s, and included programs in Virginia, Arkansas, Baltimore, and San Diego (Friedlander and Burtless, 1995). Figure 2 presents the effects of these programs on annual earnings over the five-year period, as well as on welfare payments; the actual figures for employment rates and earnings are given in Table 18. The dominant pattern, in the top panel of Figure 2, is for earnings increases to be trivial in the first year, to increase substantially in year 2, to remain substantial in years 3 and 4, but then to fade in years 4 and 5. Similarly, welfare savings are typically trivial in year 1, increase, but then fade in years 4 and 5. (It's important to note that the period of greatest effect includes years 2 and 3, which is precisely the period covered by months 19 to 30 in the JTPA results in Table 17.) In the more detailed information in Table 18, there appears to be "fade-out" in the employment rate for all programs except perhaps in Arkansas, and "fade-out" in earnings except perhaps in Baltimore.[39] Over a period of time, therefore, these results indicate that the kinds of welfare-to-work programs instituted during the 1980s, and largely continued in the JOBS program, do increase earnings and reduce welfare payments, but only in the moderate run; over the long run they leave individuals with employment rates and earnings no higher than welfare recipients who have not enrolled in such programs, and they do not permanently move individuals off welfare.[40]
There are two possible exceptions to the pattern of decline over time. The only welfare-to-work program with stable benefits was the Baltimore program, where earnings increased to year 3 and then stayed about the same through year 5 (see also Figure 2). One possible explanation is that the Baltimore program stressed human capital development through education and training, more than job search intended to accelerate job finding -- and the development of enhanced competencies increases earnings capacity over the longer run (Friedlander and Burtless, 1995, p. 144). In addition, the CET program in San Jose -- a program that is clearly a special case -- showed relatively stable increases in earnings over a five-year period, though increases in wage rates and in employment rates tended to fall (Zambrowski and Gordon, 1993).
In general, however, these results do not suggest that benefits expand over time as a result of job training programs -- as happens, typically, when individuals complete educational credentials like Associate degrees and baccalaureate degrees. In the short run (the first year or so), benefits of job training programs are typically non-existent because individuals withdraw from the labor force -- and short-run evaluations (like that of New Chance in Table 13) are therefore suspect. Over the moderate run, in years 2 and 3 (and perhaps year 4 as well), the benefits increase; but after that they decay.
To be sure, these findings can be interpreted either negatively, because of the lack of long-run effects of job training programs, or positively (Friedlander and Burtless, 1995). In the four programs described in Table 18, two of them saved money for governments since the reduction in welfare payments over five years ($735 in Arkansas and $1,930 in San Diego) outweighed the costs per person (of $118 and $920 respectively). In all four of the programs the increases in earnings over five years were larger than costs, so that from a social standpoint the benefits to all individuals -- recipients plus taxpayers -- outweighed the costs. In this sense these welfare-to-work programs were "worth doing", even though they did not reduce the welfare rolls and did not prepare individuals to leave poverty in any permanent sense.
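The budgetary logic of this comparison can be made explicit: from the government's standpoint a program pays for itself when the welfare savings it generates exceed its cost per person. The sketch below uses the five-year figures reported above for Arkansas and San Diego; the structure and names are only illustrative.

    # Government-budget comparison using the five-year, per-person figures above.
    programs = {
        "Arkansas":  {"cost_per_person": 118, "welfare_savings": 735},
        "San Diego": {"cost_per_person": 920, "welfare_savings": 1930},
    }

    for name, p in programs.items():
        net_to_government = p["welfare_savings"] - p["cost_per_person"]
        print(f"{name}: net savings to government of ${net_to_government:,} per person")

    # From the social standpoint, participants' earnings gains are added to the
    # benefit side; the text reports that in all four programs those gains alone
    # exceeded program costs.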
One of the clear implications of many evaluations is the unsurprising finding that there is substantial variation among specific programs. The important question, however, is whether these differences can be explained, either by the characteristics and quality of the programs themselves, by the characteristics of individuals accepted into these programs (which is particularly important if some programs engage in "creaming"),[41] or by the local economic and employment conditions of particular programs. Otherwise, one would normally expect some random variation in the outcomes of programs -- that is, purely by chance some would have better outcomes than others -- but such variation would not be useful to program administrators or policy-makers trying to improve the quality of job training programs.
An example of variation among specific programs that may be purely random comes from the JTPA evaluation (the evaluation summarized in Table 5 above). At the end of 30 months there were substantial differences among the 16 sites examined: the increases in earnings varied from $2,628 to -$2,033 among women, and from $5,310 to -$2,637 among men. Among youth, for whom the average effects were negative, the variation was even more marked since one program achieved statistically significant earnings increases of $3,372 for females and a whopping $9,473 among males who had not been arrested (Orr et al., 1994, Exhibits 4.5 and 4.16). However, partly because of small sample sizes in the 16 sites, these differences across sites were not statistically significant; and an analysis that tried to attribute the differences across sites to program conditions, economic conditions, and personal characteristics yielded no statistically significant results.[42]
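Why such large apparent differences can nonetheless be statistically insignificant is easiest to see with a back-of-the-envelope calculation. The sketch below uses entirely made-up numbers (the sample sizes, earnings dispersion, and site impacts are hypothetical, not taken from the evaluation) to show how widely dispersed individual earnings and modest site samples inflate the standard error of a cross-site comparison.

    # Hypothetical illustration: a seemingly large cross-site difference in
    # impacts can fail to reach significance when samples are small and
    # individual earnings are highly dispersed. All numbers are made up.
    import math

    n_per_group = 100       # treatment (and control) group size at one site
    earnings_sd = 9_000     # standard deviation of individual earnings
    impact_site_a = 2_000   # estimated earnings impact at site A
    impact_site_b = -1_000  # estimated earnings impact at site B

    # standard error of one site's impact (treatment mean minus control mean)
    se_impact = earnings_sd * math.sqrt(2 / n_per_group)
    # standard error of the difference between two independent site impacts
    se_difference = math.sqrt(2) * se_impact

    t_statistic = (impact_site_a - impact_site_b) / se_difference
    print(f"difference in estimated impacts: ${impact_site_a - impact_site_b:,}")
    print(f"standard error of that difference: ${se_difference:,.0f}")
    print(f"t-statistic: {t_statistic:.2f}")  # below 1.96: not significant at 5%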
The evaluations of welfare-to-work programs during the 1980s found benefits smaller in West Virginia than in other states (see Table 8), a difference generally attributed to poorer economic conditions in that state. In the GAIN evaluation, one finding was that the Riverside program was consistently more effective than the programs in any of the other five counties (Table 19); indeed, the increase in earnings of nearly 50 percent for individuals in Riverside county is one of the largest effects ever found for any job training program, and the correspondingly large net benefits in Riverside suggested that well-designed programs could save taxpayers money by reducing welfare costs sufficiently (see Table 23 below). Conversely, the GAIN results indicate that -- even within a state that imposes a certain uniformity on its welfare-to-work programs -- there can be localities (like Los Angeles and Tulare, the latter a rural county) where programs may have no effects on earnings at all. Indeed, the variations among sites were even greater than those revealed in Table 19: within counties, there were substantial differences between the results in outlying, largely suburban offices, where the benefits were substantial, and those in inner-city offices, where the benefits were insignificant or even negative -- a result suggesting that the demographic composition of those enrolling might be responsible for the differences. However, a regression-based analysis controlling for such demographic characteristics did not eliminate the substantial differences among counties and among offices within counties (Riccio, Friedlander, and Freedman, 1994, Table 8.2); and in any case some counties were successful with those not needing basic skills while others succeeded with those who were less job ready, suggesting that different counties could be successful with different groups of clients.
Nor did differences in local economic conditions -- measured, for example, by unemployment rates and the growth rate of employment -- explain the differences among counties; in particular, the effects of the Riverside program were remarkably consistent even though economic conditions in that county varied substantially during the period of the study, and varied among offices as well. Nor was there any obvious explanation of differences among counties based on the kinds of services offered. In the end, the evaluators concluded that the success of Riverside was due to the combination of practices there: a strong message to participants about the importance of getting into jobs early; a strong commitment to job search and job placement efforts; a mix of job search, education, and training; and a commitment to enforcing mandatory participation of all eligible welfare recipients (Riccio, Friedlander, and Freedman, 1994, Ch. 8). Another observer has concluded that the high expectations of the program staff were responsible (Bardach, 1993), and still others have attributed the success to the energy and charisma of the director.
The evaluation of Project Independence also found substantial differences among counties, ranging from earnings increases over two years of $1,333 to earnings losses of $570 (Kemple, Friedlander, and Fellerath, 1995, Table 6.13). However, these differences were not statistically significant -- that is, it is possible that they were generated simply by chance -- and they were not related in any obvious way to variation in labor market conditions, services available, or patterns in program participation. Such results add to the conclusion that, while there are differences among programs that appear substantial, it is not yet possible to explain these differences by labor market conditions or by program characteristics that could be affected by policy.
The evaluations of the experimental programs reviewed in Section III.3 above also found substantial differences among specific programs. One result in particular was striking: two independent evaluations, one of JOBSTART and one of the Minority Female Single Parent Demonstration (in Table 12), found that the Center for Employment Training in San Jose, California, was much more effective than the other programs. The MFSP evaluators concluded that the success of CET was due to its practice of linking (or "integrating") both job-specific skills and remedial education, along with its attention to job placement and the availability of child care at the site (Burghardt and Gordon, 1990).
However, the truth is probably more complex: as with Riverside, the success of CET probably lies in a combination of factors.[43] First, the CET program has been in San Jose for a long time, and has long-standing connections with employers that facilitate finding placements for its students.[44] Second, the program concentrates on Hispanics, and most of the instructors are both Hispanic and bilingual; the program therefore provides bilingual education in addition to job skills instruction, as well as mentoring and acculturation to American practices for those individuals who have just immigrated. Third, the site at San Jose performs real work -- for example, it operates a child care center, a copying business, a cafeteria for the CET members themselves, an auto repair shop, and the like, each associated with one of the job training programs -- so that students are getting work-based training and experience as well as classroom instruction in both job skills and remedial subjects. Fourth, the presence of social services at the site -- child care, but also assistance with immigration issues and job placement -- is clearly important. While CET does provide both remediation and job skills training, and is particularly conscious of the need for English language instruction, the two are not integrated in any important sense -- they take place at different times of the day, and there is rarely any mention in one component of the program of the lessons from the other -- so the emphasis on CET as an "integrated" program seems misplaced.[45] But CET clearly has many other positive elements and provides a broad range of job-related services, and it is easy to understand why it is so much more effective than other job training programs.
The conclusion from both the Riverside program and CET is that a combination of practices distinguishes particularly successful programs. I shall return to this finding in the conclusion, since certain current recommendations for job training programs build on the findings presented in this section.
The results presented so far describe the outcomes of job training programs: whether they improve employment rates and earnings (or affect other outcomes like arrest rates, welfare receipt, and fertility behavior). A different way of asking whether job training programs are "worth it" is to compare the outcomes to the costs, through benefit-cost analysis. Since the late 1970s, many job training programs have been subjected to cost-benefit analyses, using generally-accepted methods to establish the net present value of the outcomes that could be attributed to the program -- that is, the difference in outcomes between those enrolled in the program and a control group. In most cases the additional earnings of those enrolled and the reduction in welfare benefits represent the major benefits, though in some cases the value of crime prevented and of drug and alcohol abuse avoided is counted among the benefits (e.g., in the cost-benefit analysis of the Job Corps, in Table 20). To be sure, there are numerous intangible benefits of job training, especially those associated with greater economic independence and reduced use of welfare programs; and there are probably uncounted costs as well, particularly the opportunity costs of mothers with young children, who might otherwise be caring for their children.
Typically these analyses calculate costs and benefits separately for different groups of potential beneficiaries, usually including those enrolled and the "rest of society" (or taxpayers), in order to capture the distributional effects of job training. That is, it is possible that such programs benefit those enrolled while taxpayers lose, or conversely that the benefits to taxpayers through decreased welfare and crime costs outweigh the costs of the programs, even though those enrolled do not earn enough more to offset the loss of welfare benefits.[46] The hope of job training programs, of course, is that both taxpayers and those enrolled will enjoy benefits in excess of costs.
One of the earliest and most influential cost-benefit analyses was that of the residential Job Corps program, summarized in Table 20. The results of this analysis are fairly typical of subsequent analyses: overall, the net benefits to those enrolled in the Job Corps were positive, because the increased earnings per person ($3,397) were high enough to offset the loss of welfare income ($1,357), the opportunity costs associated with being unable to work during the period of the program ($728), and other costs associated with being in the program. But the Job Corps was a net loss to taxpayers, since the very high cost of the program ($5,351 in 1977 dollars, and about $15,000 now) was so large that the benefits from taxes on increased earnings ($1,255), reductions in welfare benefits and administrative costs ($1,515), and reduced criminal activity ($2,281) failed to offset it. From a social point of view, adding the benefits to Corps members and taxpayers, the benefits outweighed the costs; the general lesson from this analysis was that even an expensive job training program could demonstrate its worth in cost-benefit terms.
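To make the accounting explicit, the following sketch works through the three perspectives using only the per-person figures cited above. It is an illustration rather than a reproduction of Table 20: it omits the unspecified "other costs" to Corps members and the transfer adjustments of the full analysis, so the totals are approximate.

```python
# Illustrative accounting for the Job Corps benefit-cost figures cited above
# (per enrollee, 1977 dollars). This sketch uses only the line items named in
# the text, omits the unspecified "other costs" to Corps members, and ignores
# the transfer adjustments made in the full Table 20 analysis, so the totals
# are approximate rather than those reported in Table 20.

# Perspective of those enrolled (Corps members)
increased_earnings  = 3397   # gain in earnings per person
lost_welfare_income = 1357   # welfare income given up
opportunity_cost    = 728    # earnings forgone while in the program
net_to_participants = increased_earnings - lost_welfare_income - opportunity_cost

# Perspective of taxpayers (the "rest of society")
program_cost      = 5351     # cost per enrollee
taxes_on_earnings = 1255     # additional taxes paid on higher earnings
welfare_savings   = 1515     # reduced welfare benefits and administrative costs
crime_reduction   = 2281     # value of reduced criminal activity
net_to_taxpayers  = taxes_on_earnings + welfare_savings + crime_reduction - program_cost

# Social perspective: participants plus taxpayers
net_to_society = net_to_participants + net_to_taxpayers

print(f"Net to participants: ${net_to_participants:,}")  # positive
print(f"Net to taxpayers:    ${net_to_taxpayers:,}")     # negative
print(f"Net to society:      ${net_to_society:,}")       # positive
```

Under these simplifying assumptions the participant and social totals come out positive while the taxpayer total is slightly negative, which matches the qualitative pattern described above even though the precise figures in Table 20 differ because of the omitted items.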
Similarly, the welfare-to-work programs developed during the 1980s were evaluated using benefit-cost analyses. Table 21 presents the results of a typical and particularly influential analysis, that of the San Diego program, for both a work experience program (EWEP) and job search assistance. The results, as for the Job Corps, indicate net benefits for those enrolled in the program, since earnings increased by a small amount (e.g., $461 for work experience and job search together), but net losses for taxpayers since the costs of the program were higher than the reductions in welfare payments and the increases in taxes from higher earnings. Since from a social perspective the benefits outweighed the costs, the results of this experiment were widely cited, including in the deliberations over the JOBS legislation, as confirming that welfare-to-work programs were worth undertaking. However, the fondest hopes of those hostile to welfare -- that such job training programs would save money for taxpayers -- were clearly not met by the San Diego program.
In the more general analysis of the JTPA program, the cost-benefit results are of course consistent with the findings of the outcome studies. For adult women and adult men, for whom there were significant increases in earnings, the benefits outweigh the costs both for those enrolled in the programs and for society, as presented in Table 22 -- though again taxpayers lose. But for youths, for whom the effects on earnings were negative because of reductions in earnings during the period of the program, the program generated net losses for participants as well as taxpayers, with losses per person especially high ($2,904) in the case of young males. The only conclusion possible is that job training programs are a particularly poor investment for youth, and indeed, since these results became public there have been numerous proposals to reduce or eliminate job training efforts for youth.
The benefit-cost analyses of the GAIN program in California, summarized in Table 23, confirm that programs of varying effectiveness can generate quite different cost-benefit conclusions. This analysis distinguished among three groups of potential beneficiaries: welfare recipients enrolled in GAIN; governments supporting the costs of GAIN; and taxpayers, who also pay for GAIN through their taxes but who experience certain benefits and costs -- for example, the output associated with unpaid work experience -- that do not directly affect government budgets. The results indicate that both government budgets and taxpayers lose for all groups of recipients; only in the case of recipients not needing basic remedial education -- education that requires relatively expensive classroom instruction rather than low-cost job search assistance -- did the benefits outweigh the costs both for recipients themselves and for taxpayers. But the detail for specific county programs reveals that the most effective programs -- those in Riverside and in San Diego -- can generate savings for taxpayers as well as substantial net benefits for recipients; indeed, in Riverside the benefits to governments and to taxpayers are larger than those to recipients. (Conversely, the worst program -- that in Los Angeles County -- actually made recipients worse off, as JTPA did for youth.) The hope for welfare-to-work programs therefore lies in their being able to emulate the characteristics of the most effective programs, since on average the GAIN program generated net losses to society.
The results for Project Independence differ from those for virtually any other job training program (see Table 24). Because this program did not increase earnings by much, but did reduce welfare benefits, the net effects for those on welfare were actually negative. However, because the program did not cost much (an average of $1,150 per person) and did reduce welfare costs by a considerable amount (about $1,155, according to the figures in Table 24), the program resulted in a small savings to taxpayers[47] -- quite the opposite of the results for GAIN and most other job training programs, where recipients gain but taxpayers lose. Overall, though, Project Independence generated losses to society since the very small gains to taxpayers did not outweigh the larger losses to welfare clients.
Of course, any positive cost-benefit results depend on job training programs having significantly positive effects on employment. Where benefits are essentially zero, benefit-cost analysis necessarily indicates net losses to society. Table 25 presents the benefit-cost results of JOBSTART, which had largely insignificant results (in Table 14). Consistent with these findings, the net benefits to participants were very small, and the losses to taxpayers and to society as a whole were substantial.
On the whole, however, the results of cost-benefit analyses of job training programs are relatively consistent -- at least if we ignore the evidence from the least effective programs (e.g., JTPA for youth), the most effective programs (like Riverside), and the different case of Project Independence. In general, the benefits of job training programs outweigh the costs since the modest increases in earnings are larger than the modest expenditures per person in these programs. However, these benefits accrue largely to those enrolling in job training programs, and governments (or taxpayers) typically do not benefit since their costs outweigh any benefits they receive in the form of higher taxes on higher earnings, reduced welfare costs, and reduced costs associated with crime and other social problems. Overall, then, job training programs are "worth doing" -- but it is unclear whether taxpayers would continue to support these programs if they realized that they are likely to be net losers.
Overall, the results from nearly thirty years of evaluating job training programs are remarkably consistent -- surprisingly so, given the variation in the kinds of programs that have been supported and the differences in the methods of evaluation. A large number of job training programs lead to increased earnings, and the benefits generally outweigh the costs -- though the increases in earnings are moderate by almost any standards, insufficient to lift those enrolled in such programs out of poverty. Welfare-to-work programs also increase employment and reduce the amount of welfare, but they rarely allow individuals to leave welfare. Furthermore, any benefits probably fade out after four or five years: job training programs do not seem to put many individuals on career trajectories with continued earnings increases, as formal schooling does.[48] For some groups -- youth in particular -- job training programs generally seem to be ineffective (unless perhaps the program is very intensive, as the Job Corps is), and programs appear to be more effective for women than for men -- but otherwise it is difficult to conclude that any particular group benefits more than any other. There are, to be sure, some outliers -- particularly effective programs like the Center for Employment Training and the Riverside GAIN program. But there are spectacular failures as well -- including some experimental programs with carefully-considered program designs, most job training programs for youth, the worst of the GAIN programs, and Project Independence as a whole -- that actually leave those enrolled worse off and thereby violate the first maxim of intervention: "do no harm".
One can assess the modest outcomes of job training programs in either positive terms, as indicating that these programs are worth doing on average, or in negative terms. In my interpretation, however, the results are very discouraging: thirty years of experimentation with job training programs have created a substantial number of programs whose benefits -- for individuals in dire need of employment and economic independence -- are quite trivial, and are completely inadequate to the task of moving them out of poverty, off of welfare, or into stable employment over the long run. The puzzle is why such well-intentioned efforts have been so ineffective. In this section I present ten possible explanations, based on the results in Section III, direct observations of job training programs, and comparisons with education programs. Such explanations must remain speculative, of course, since there is not yet enough evidence about truly effective job training programs to "prove" what works. Nonetheless, such explanations help provide some guidance for recommendations -- the subject of the last section.
The first and most obvious explanation is simply that most job training programs are "small": they last a very short period of time, rarely more than twenty weeks; they often provide a single kind of service -- on-the-job training, or classroom training, or job search assistance -- rather than a variety of complementary services. Job training administrators often take pride in this aspect of their programs: they will say, for example, that they offer "Chevrolet" programs compared to the "Cadillac" programs of educational institutions, by which they mean that they can get to the same destination at much lower cost; they often scorn education programs for being too "academic" and unconcerned with immediate employment.
However, the individuals enrolled in these programs often have multiple problems and several barriers to employment: they often lack job-specific skills, general academic skills, and the kinds of values (including motivation, punctuality, persistence, and the ability to work with others) necessary to find and keep employment, and some of them have more serious problems like drug and alcohol abuse, physical handicaps, other health problems, and depression and other mental health problems that may be biological rather than experiential. Even when this is not the case, the gap between the needs of those enrolled and the scope of programs is sometimes breathtaking; for example, one job training program I observed was trying to train Spanish-speaking women to be English-proficient secretaries in a 15-week, part-day program. There is, then, a disjuncture between the profound needs of those who have not found stable employment and the small size of job training programs -- and we should not be surprised to find the effects on employment to be trivial.
It is useful to compare the intensity of job training programs to that of education programs, measuring intensity by expenditures. The average JTPA program for adult men and women cost about $2,200 in 1987-89, for a period of enrollment of 20 weeks (Bloom et al., 1994, Exhibit 2); the cost per person in the GAIN program was about $2,300 in 1993 dollars (Riccio, Friedlander, and Freedman, 1994, p. 75). (Of course, many of the experimental programs are much more expensive, and the current cost per person in the Job Corps of about $15,000 is the highest of all.) In contrast, one year of full-time enrollment in a community college -- approximately 30 weeks -- averaged $5,700 (Digest of Educational Statistics, 1994, Table 325), considerably more than the limited JTPA and GAIN programs. On the other hand, a one-year certificate program increases earnings by about 15 percent over those of high school graduates, and does so over a long period in the labor force (Grubb, 1995, Table 3), rather than over only a four- or five-year period. Of course, this comparison is not especially fair because the characteristics of JTPA and GAIN clients are quite different from those of students enrolling in community colleges. These comparisons indicate once again that the typical job training program provides fewer services at substantially lower cost per person, to individuals with less education and (often) more personal problems, compared to typical postsecondary occupational programs -- and it should not be surprising that the effects of job training are so small.
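As a rough illustration of the difference in intensity, the per-week spending implied by the figures above can be computed as follows. This is purely an illustrative calculation, since the dollar amounts come from different years and the populations served differ substantially.

```python
# Rough per-week spending comparison using the figures cited above. This is an
# illustrative calculation only: the dollar amounts come from different years,
# and the populations served by JTPA/GAIN and by community colleges differ.

jtpa_cost, jtpa_weeks = 2200, 20          # average JTPA program, 1987-89
college_cost, college_weeks = 5700, 30    # one year of full-time community college

print(f"JTPA:              ${jtpa_cost / jtpa_weeks:.0f} per week")        # about $110
print(f"Community college: ${college_cost / college_weeks:.0f} per week")  # about $190
```

Even on this crude per-week basis, the community college spends roughly three-quarters more per enrolled week, in addition to running half again as long.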
A second possibility is that the basic strategy of many job training programs, and of virtually all welfare-to-work programs, is simply the wrong one. Most programs -- including the successful Riverside program -- have stressed moving individuals into employment quickly, using job search assistance, work experience placements, and on-the-job "training"; they provide relatively little actual training, despite their name. The underlying assumption is that the basic problem of the unemployed is one of job-finding, and that once individuals get jobs they will remain employed. Welfare-to-work programs are particularly insistent on the value of getting any kind of job, and the current political rhetoric in the United States about "ending welfare as we know it" concentrates on pushing welfare recipients into work. This tactic assumes that there are plenty of jobs available to those who want to work and that the appropriate motivation -- from either the "stick" of reduced welfare benefits or the "carrot" of increased incentives to work -- will be sufficient. There has been much less attention paid to the problem of enhancing the basic competencies -- cognitive, vocational, and personal -- of job trainees, except in a limited number of intensive and experimental programs.[49]
The reliance on this strategy is confirmed by the widespread finding that job training programs increase earnings by increasing the amount of employment, rather than by increasing the wage rates (and presumably the productivity) of individuals. But this strategy ignores the reality that the low-skilled labor market for which job training programs prepare individuals is so unstable that -- without the increase in basic skills that would enable individuals to escape the secondary labor market -- they will continue to suffer intermittent employment, low earnings, and the kind of discouragement that leads them over the long run back to marginal employment or welfare. (This is consistent with the finding in Table 18 of benefits declining in years 4 and 5.) Furthermore, the kinds of jobs that individuals leaving such programs can typically get are so dreadful -- with repetitive, boring work, few prospects for advancement, and often harsh and demeaning supervision -- that it is no wonder that individuals leave after short periods of time. Ethnographic and journalistic accounts have sometimes stressed the difficulty of job-keeping rather than job-finding (Quint, Musick, and Ladner, 1994), and a current experiment is designed to assess the value of continuous services to help individuals keep the jobs they find. But whether this is possible without changing the nature of low-wage work is an open question.
The implication of this argument is that, in the interests of greater long-run effects, there should be more attention to the enhancement of skills -- both basic education and job skills -- and less attention simply to getting individuals into employment.[50] Indeed, the long-run evidence on the effects of welfare-to-work programs, summarized in Section III.6 and Table 18, tends to confirm this: the only program without long-run decay in earnings was the Baltimore Options program, which was distinguished from the others by more intensive education and training (Friedlander and Burtless, 1995, p. 144). The most powerful evidence, however, is the contrast between the typical benefits of job training -- which decay after 4 or 5 years -- and the age-earnings profiles associated with different levels of education, where the benefits of education expand over time.
When they do not emphasize pushing individuals into employment, job training programs sometimes provide (as their name implies) some training in job-specific skills. Sometimes this takes place in classroom settings, and sometimes in work settings through on-the-job training.[51] However, a study of on-the-job training revealed that in a large fraction of these programs (55 percent), there was little or no explicit training going on: employers viewed the program as a source of subsidized labor and used individuals in routine, unskilled work, without any attention to providing either job-specific or more general skills (Kogan et al., 1989). This approach to on-the-job "training", which occurs in a variety of job training and apprenticeship programs,[52] is particularly likely where employers are small and marginal and are pressed for resources. In many local programs, JTPA agencies seem to act as a screen that provides such employers with a steady source of relatively stable, low-cost labor, and can therefore come up with jobs for JTPA trainees -- but these placements involve very little training and offer few long-run prospects.
The quality of classroom-based job skills instruction in job training programs has not, to my knowledge, ever been closely examined. However, here too there are likely to be serious problems. Keeping up with technological changes is difficult enough in the more sophisticated, longer-term programs offered in community colleges and technical institutes, but in short-term job training programs with little funding for capital outlays it must be nearly impossible. Similarly, finding instructors from industry is difficult even for postsecondary educational institutions, and the problem must be even more serious in local job training programs, with their intermittent offerings and therefore unstable employment of instructors. Many job training programs are operated by community-based organizations, which typically pay low wages. And because such organizations are often principally involved in other activities -- promoting the rights of black Americans, for example, or of recent immigrants, or advocating on behalf of the disabled -- their experience in job training and education and their connections to employers may not be strong. While the quality of job-related instruction merits further investigation, the conditions in many job training programs are not conducive to high-quality training.
In educational institutions, there is currently a great deal of debate about the most effective pedagogies, and a concerted effort by reformers to replace conventional, didactic methods of teaching -- what I have called "skills and drills" -- with approaches associated with a very different tradition of "meaning making": approaches that enable students to be more active in learning, that are student-centered rather than teacher-directed, and that use a wider variety of activities and motivations in the classroom.[53] In adult education too, there has developed an orthodoxy of good practice that advises programs to tailor instruction to the interests and goals of adults and to use a variety of instructional methods, including more active techniques. Unfortunately, none of this discussion has affected the world of job training programs, where even the existence of a debate about pedagogy is unknown. Virtually all job training programs use conventional pedagogical techniques based on "skills and drills", with instructors breaking reading, writing, and mathematical skills into tiny sub-skills and drilling endlessly on these inherently meaningless fragments (Grubb et al., 1992; Grubb and Kalman, 1994).[54] The instruction is particularly bad in programs that have adopted computer-based instruction: while administrators are often quite proud of their computer programs, the existing programs are the worst examples of "skills and drills" converted to the computer screen, with even shorter reading passages, less writing, and more trivialized arithmetic examples than standard textbook instruction (Weisberg, 1988). Here too there is often a pride among job training administrators in distinguishing themselves from what educators do: they will say, for example, that they are "trainers" rather than "educators", and those in charge of computer-based programs will describe themselves as "managers" of the program rather than "teachers". But this pride masks a deep ignorance of pedagogical issues, and results in instruction that is quite horrifying to see.[55]
While there is evidence that conventional didactic approaches are the least effective methods for teaching many individuals,[56] this approach is likely to be particularly ineffective for the individuals in job training programs. Almost by definition, most of them have not done well in many years of conventional schooling using conventional didactic instruction; why they should suddenly be able to learn from this approach in very short programs with bad teaching is completely unclear. The ineffectiveness of conventional approaches to teaching may be inferred from a study of remedial education in the GAIN program: the only county with an increase in test scores was San Diego, which developed an innovative program to avoid the problems with the "school-like" adult education system. As one administrator described their efforts: "These people had an unproductive experience in school and were not able to benefit. We wanted to avoid the perception that they were going back. We wanted to make it different and make it work for them" (Martinson and Friedlander, 1994, p. 41).
The inability or outright refusal of many job training programs to understand pedagogical issues is exacerbated by the problems of hiring instructors. There has been little research about those who teach in job training programs; but the conditions of providing short-term and intermittent programs, often in community-based organizations with low pay, are not conducive to hiring good teachers. Typically, instructors in job training programs are given little preparation in teaching -- a further indication of how little attention is given to teaching. In contrast, within the schooling system in this country the issues of preparing teachers well, and of paying them enough to attract a stable, experienced, and dedicated teaching force, are widely discussed. The fact that job training programs have typically not even raised these questions is yet another sign of the unimportance of teaching -- and another contributor to the low quality of instruction.
Many job training programs, particularly those operated by JTPA and JOBS (which often works through JTPA programs), are highly local. A local decision-making authority, the Private Industry Council or PIC, makes important decisions about the nature of services to be provided and the groups of individuals to be served, and then establishes the methods for subcontracting with other groups (typically community-based organizations, educational institutions, and proprietary schools) to provide the various services. The local nature of these decisions is certainly necessary in light of the fact that low- and middle-skilled labor markets are themselves quite local, and therefore programs must adjust themselves to local conditions. However, it also makes job training programs vulnerable to local political influence. This influence usually operates to direct funds to particular providers of services, regardless of whether they are effective, and makes it difficult for local programs to shift resources from ineffective providers to more effective organizations.
Political interference appears to take place in several different ways.[57] In some cases, providers of training services are represented on the PIC, and seem to be able to direct contracts to their own organizations. In other cases, influential local politicians can effectively threaten to create trouble for a job training program if it does not support a favorite local provider. In still other cases a local job training administration, anticipating political problems, will arrange to allocate resources through non-competitive processes[58] which are essentially rigged so that particular community-based organizations receive funding. In many cases, this kind of interference takes place on behalf of groups with particular racial or ethnic identities -- for example, a local group representing the black community or the Hispanic community, or particular national-origin groups like Haitians or Puerto Ricans. In other cases the community-based organizations represent women, or older women trying to re-enter the workforce, or the handicapped, or some other particular group. They operate simultaneously as advocates for "their" groups, as sources of guidance and counseling for members of these groups, and as providers of education and training services through public funding. In all these cases, constituents served by such CBOs can exert considerable political pressure. Unfortunately, some of the worst job training programs seem to take place in cities with well-organized minority community-based organizations with political influence; in these cases the difficulty of detecting political interference is compounded by racial tensions, which make it difficult for white administrators at the local or state level to challenge the allocation of funds to ineffective black or Hispanic groups. Conversely, many of the best programs seem to take place in rural and suburban areas that are relatively free of such political interference.[59]
The effects of local political interference on the effectiveness of job training programs are difficult to assess. In many cases, it is clear that ineffective organizations are given resources because of political interference, and effectiveness suffers. However, community-based organizations with political influence also include many extremely effective organizations, highly dedicated to the groups they serve. My own hunch is that -- as is generally true of the private sector compared to publicly-provided programs -- the community-based organizations providing job training services include some of the best as well as some of the worst providers. The problem with local political influence is that it becomes difficult to eliminate some of the worst providers -- and those who enroll in them are the ones who suffer.
One of the conventional assumptions in vocational education and job training is that the labor market value of job-specific education and training is likely to be quite low if individuals are unable to find jobs for which they have been prepared. While there has been relatively little analysis of the consequences of job-related versus unrelated education and training, the little evidence that exists indicates that job-related vocational education does have higher economic benefits than unrelated education (Grubb, 1995b; Rumberger and Daymont, 1984).
In the world of job training, some services convey general competencies -- for example, remediation should enhance basic academic skills that are useful in virtually every job, and on-the-job training and work experience that enhance the personal characteristics required at work should have similar effects. But a good deal of job skill training and on-the-job training is job-specific, and may not benefit individuals much if they fail to find related employment. In one particular data set -- the Survey of Income and Program Participation (SIPP) -- individuals are asked whether they have received different forms of job training, and whether this training is related to their current employment. Among those who were in JTPA programs, 49 percent of men and 46 percent of women reported that they used training on their current job; among those reporting having enrolled in CETA, 42 percent of men and 46 percent of women reported that their job training was related (Grubb, 1995b). And -- consistent with the hypothesis that related training has a much higher economic return -- the earnings of men with related JTPA training were on average 55 percent higher than those with unrelated training, with the comparable figure for women 42 percent; for those with CETA training, men earned 21 percent more if their training was related to their current job, while women earned 6 percent more.[60] Thus the average economic benefits of job training programs -- the figures that show up in formal evaluations -- are as low as they are partly because they combine the much lower benefits (presumably near zero) of those who failed to find related employment with the more substantial benefits of those with employment related to their job training. The conclusion in the SIPP data that only a minority of individuals are in jobs related to previous job training suggests that, even with some placement efforts, job training programs have not placed individuals well. Alternatively, if individuals are placed in related jobs with little future, then they would normally shift over time into different occupational areas with greater prospects for mobility. Under either explanation, however, job training programs appear to do a mediocre job of placing individuals in appropriate jobs.
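The dilution effect can be illustrated with a simple weighted average; the shares and dollar amounts below are invented purely for illustration and are not taken from the SIPP analysis.

```python
# Purely hypothetical illustration of how averaging over related and unrelated
# placements dilutes the measured benefit of a program. The share and dollar
# figures below are invented for illustration; they are not from the SIPP analysis.

share_related     = 0.50   # suppose about half of trainees work in jobs related to their training
benefit_related   = 2000   # hypothetical annual earnings gain for those in related jobs
benefit_unrelated = 0      # assume little or no gain for those in unrelated jobs

average_benefit = (share_related * benefit_related
                   + (1 - share_related) * benefit_unrelated)
print(f"Average measured benefit per trainee: ${average_benefit:,.0f}")  # $1,000
```

If roughly half of trainees end up in unrelated jobs with little or no benefit, the measured average effect is only about half of what the program produces for those who do find related work.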
One of the basic characteristics of job training programs in the United States is that, since they enroll individuals with substantial barriers to employment and provide relatively limited training, they aim to place individuals in jobs with relatively low levels of skill and pay. The limited ambitions of these programs -- confirmed by the finding that increases in earnings are modest, and probably decay after 4 or 5 years -- may be "realistic" given their resources, but they still do not help individuals move out of poverty or off welfare. And the emphasis on quick placement into employment, while it generates modest benefits in the short run, reinforces the notion that job training should be a short, one-shot event, with individuals then leaving the job training system.
Typically, job training programs are not linked to other programs, either to other job training programs or to education programs.[61] On occasion, job training programs refer individuals to other programs -- for example, they may refer those in need of remediation to adult education programs -- but in these cases there is little effort to identify which programs might be most effective, or to follow individuals to make sure they enroll in and complete them (Grubb et al., 1991). Even in welfare-to-work programs, where a caseworker is assigned to each individual to make sure that he or she can navigate the array of services offered, individuals often become "lost" when they are referred to services but never enroll, or never complete the program, or fail to enroll in subsequent programs. As one GAIN administrator in California commented about individuals referred for remedial education, the lack of information about progress means that many clients "fall into the black hole of adult basic education", staying in adult education for long periods of time without much progress and without caseworkers knowing where they are (Grubb and Kalman, 1994). When individuals complete job training programs they are usually referred to employment, not to subsequent education; and of course the need for most trainees to earn a living generally precludes immediate enrollment in other education or training. The consequence is that the possibilities for expanding the resources available to those in job training programs -- for giving them access to additional training that might lead to jobs of increasing skill levels and pay -- are virtually non-existent,[62] and it is not surprising that the benefits of job training seem to decay over time.
But an alternative way to view education and training is the notion -- sometimes encapsulated in the overworked phrase "lifelong learning" -- that a low-level job training program would be only a start back into the labor market. If job training programs were linked to one another and to education programs, then an individual could enroll in a low-level program, complete it and enter low-skilled and low-paid employment, and then -- when time and resources permitted -- continue in a more advanced job training program, or in a credential program in a community college, in order to gain access to higher-skilled and better-paying jobs. This kind of "ladder" of training and education opportunities would be able to take an individual at any level of skill and -- in a series of short-term programs rather than a single, one-shot program -- provide access to a range of jobs with better long-run prospects, a possibility to which I turn in the Conclusion.[63]
A possible explanation for the mediocre employment effects of job training programs is simply that there are not enough jobs for unskilled and semi-skilled workers, and the labor market is unable to absorb those who complete job training programs.[64] For example, the finding that welfare-to-work programs in West Virginia were the least effective (see Table 8) was generally attributed to weak economic conditions in that state. However, in other cases it has proved impossible to blame labor markets. For example, the outcomes across the six GAIN counties of California could not be explained by variation in labor market conditions; and the analysis of JTPA after 18 months found no significant effect of the local unemployment rate on earnings, and only a minimally significant effect of urban location on youth (but not adult) outcomes (Bloom et al., 1993, Exhibit 7.12). In addition, as I pointed out in Section II, labor market conditions have contradictory effects: although high unemployment or low employment growth may make job placement more difficult, it may also cause more job-ready individuals to enroll in job training programs, making placement easier than in boom times when those enrolled in job training programs have the greatest barriers to employment. In general, labor market explanations have not been popular among those examining job training programs.
However, it is still possible that labor market conditions explain the mediocre effects of job training programs. In the first place, there have been relatively few systematic examinations of labor market effects aside from that undertaken for the JTPA evaluation, and the examination of labor market effects on JTPA may have been marred by insufficient variation in labor market conditions.[65] Second, it is possible that the weak condition of labor markets for modestly-skilled work explains the pervasively mediocre results of job training programs, even if it fails to explain the cross-section variation in outcomes. Finally, it is worth noting that job training programs represent a supply-side solution to the problem of underemployment and poverty, resting on the assumption that if the skills of the labor force can be improved, or if individuals out of the labor force can be induced to enter it, then employment and earnings will improve without intervention on the demand side. The alternative, of course, is to coordinate supply-side policy with a demand-side policy that increases the demand for modestly-skilled workers. Indeed, something like this took place in the last years of the CETA program, when public service employment (PSE) created additional jobs for CETA trainees in governmental and non-profit community-based organizations. Such efforts were denounced as "un-American" since they might substitute public employment for private employment, and were quickly abolished by the Reagan administration. But the notion that demand-side policy in labor markets should be coordinated with the more common supply-side policy remains an attractive idea, and one that might improve the mediocre effects of job training programs.
Over the past two decades, the results of evaluations of youth programs have been especially dismal. Many of the early CETA programs for youth had negative results (see Table 3), with the exception of the expensive Job Corps program (Table 4), and the minimal or negative effects were replicated in the more sophisticated JTPA evaluation (Tables 5-7). Several of the experimental programs for youth, like JOBSTART and STEP, have proved ineffective despite careful planning and higher costs. These results are particularly discouraging because of the hope that programs aimed at youth could be preventive, steering young people onto paths that would be beneficial in later years.
There are several explanations for the particularly poor results in youth programs. One has to do with labor market conditions: many employers will not hire young people, so under the best of conditions they have a tendency to mill around until reaching their early or mid-twenties. In addition, for many of the moderately-skilled jobs in the sub-baccalaureate labor market, employers will not consider individuals without a high school diploma, effectively condemning drop-outs to completely unskilled positions. Other explanations depend on the special characteristics of youth culture in the United States: this culture, with its rejection of school and discipline and the premium placed on "coolness", may work against job training programs in ways that do not affect the performance of adults with greater maturity and sense of responsibility. Still other explanations point out that adolescents are still entangled in their families, which may be disorganized and destructive rather than supportive (Quint, Musick, and Ladner, 1994).
Unfortunately, job training programs cannot do much about these factors. However, there may be some systematic failings in the job training programs applied to young people (Granger, 1994; Doolittle, undated). Programs devised for adults may not be developmentally appropriate for adolescents; and the conventional "skills and drills" pedagogy of most job training may be especially abhorrent to them, because of their recent and negative experiences with school. Because of the complex conditions of their lives, young people may need a greater variety of support services -- especially guidance and counseling about such issues as drugs, alcohol, sex, and sexually-transmitted diseases -- that are not usually part of job training programs designed for adults. To be sure, the minimal results of programs designed specifically with a range of services -- like New Chance and JOBSTART -- are not especially encouraging, but most of the efforts to devise better youth programs still concentrate on specifying a broader array of services to cope with the more complex conditions of adolescents' lives. This is essentially the vision I offer in the Conclusion, in which work-based placements would be combined with school-based activities (as well as a variety of supportive services) to provide a variety of learning opportunities for those individuals who have not done well in the conventional educational institutions of the "first chance" system.
A final possibility is that the very idea of providing "second chance" programs is flawed. Given the universality of "first-chance" programs -- the elementary-secondary education system in the United States, with its many remedial and compensatory programs, along with a higher education system that virtually guarantees a place for everyone (in open-access community colleges if not in four-year colleges) -- one could argue that the individuals who fail to use the educational system to increase their skills and gain access to employment are by definition those with such serious intellectual, personal, and motivational barriers to employment that no "second chance" system of reasonable cost could possibly help them enter stable employment. Under this argument, the resources that currently go into job training programs should be diverted into improving the "first chance" programs; alternatively, one could argue (as some have[66]) that too much has already been spent within first-chance educational programs on preventing school failure, and that the education and training system should devote less of its resources to the bottom tenth, or to those who have sometimes been demeaned as the "leftovers" (Taggart, 1981).
But this view -- the abandonment of second-chance opportunities, through job training programs and such institutions as adult education and community colleges -- is a distinctly un-American idea. Even now, when the country has swung toward the political right, with many proposals for dismantling important aspects of our welfare and regulatory systems, there are almost no calls to dismantle the education and training system. One reason is that even the most ineffective efforts at training for employment are more palatable than the alternative -- allowing individuals to live at public expense without working. And the effort to build second-chance programs is an expression of the enduring American commitment to equity in some form -- even if that form is not particularly effective. For these reasons this country is unlikely to abandon its current efforts at job training. Daunting as it may be, the appropriate task is to improve job training programs rather than to abolish them.
How might job training programs be improved? Drawing on the explanations in the previous section, one tactic might be to improve each of their components. That is, the quality of job training needs to be carefully considered, and should be improved in many programs; the instruction in basic skills is often very poor, and job training programs need to learn more from the education system about appropriate instructional methods; and efforts in assessment, case management, and placement may need to be strengthened (e.g., Dickinson, Kogan, and Means, 1994; Dickinson et al., 1993).
But this kind of piecemeal approach, valuable though it might be in specific instances, seems to miss the point. The real problem with existing job training programs is not that an individual component here or there is not of adequate quality, but that the offerings of the "system" as a whole consist of a welter of different services, none of them obviously more effective than any others and all of them poorly coordinated, with individual programs of limited intensity not linked to other opportunities even though they are intended for a population with substantial needs. In contrast, the most effective programs -- the Center for Employment Training and the Riverside GAIN program, for example -- seem to work because they encompass a combination of mutually-supporting practices. This suggests that the most powerful approaches to reforming job training would first create models of more comprehensive and interactive employment-related services, and then worry about the quality of individual components.
Furthermore, it would be important to connect job training programs to other training and education opportunities, rather than leaving them as independent, limited, one-shot efforts. The effects of job training programs, small as they are, seem to decay over time, while the benefits of education typically increase with further labor market experience. The disconnection between "education" and "job training", rooted in the very creation of job training programs during the 1960s, has been counter-productive for both. Many of the reasons I gave in the previous section for the ineffectiveness of job training stem from this divorce -- including the small scale of job training efforts, the ineffective pedagogy, the political influences in job training, and the provision of services in small, unstable organizations. And, conversely, educational institutions could learn much from job training programs about the importance of employment and of services like job placement.
Fortunately, there is already a model in the United States that could be used as the basis for reforming job training, though it has not yet been implemented. The School-to-Work Opportunities Act, passed in May 1994, is intended to apply to secondary and postsecondary education programs, but it presents a vision that could be used to guide job training programs as well. The STWOA can be interpreted as specifying five elements for successful programs:
Such programs would also try to establish links among programs in order to create education and training "ladders" -- that is, a series of sequential education and training-related activities that individuals can use to progress from relatively low levels of skill (and relatively unskilled and poorly-paid work) to higher levels of skill and (presumably) more demanding, better-paid, and more stable occupations.[68] Individuals could enter the system at any level -- for example, welfare recipients with the lowest levels of education and labor market experience could enter programs similar to those now available through JOBS, but each program would be linked to other programs further up in the hierarchy of skills. The programs in the current job training system would be articulated with those in the education system through the community college -- that is, certain job training programs would lead into certificate and Associate programs in the community college, from which individuals could over time move into baccalaureate programs in four-year colleges. In this way individuals with minimal skills could move up a "ladder" of ever-expanding opportunities.
There are several reasons for emphasizing vertical integration and the creation of education and training ladders, in place of isolated job training programs:
Ashenfelter, O. (1978). Estimating the effect of training programs on earnings. The Review of Economics and Statistics, LX(1), 47-57.
Bardach, E. (1993, December). Improving the productivity of JOBS programs. New York: Manpower Demonstration Research Corporation.
Barnow, B. (1986, February). The impact of CETA programs on earnings: A review of the literature. Journal of Human Resources, 22, 157-193.
Berryman, S. (1995). Apprenticeship as a Paradigm of Learning. In W.N. Grubb (Ed.), Education through Occupations in American High Schools. Vol. I: Approaches to integrating academic and vocational education. New York: Teachers College Press.
Bloom, H. et al. (1993, January). The National JTPA Study: Title II-A impacts on earnings and employment at 18 months. Bethesda, MD: Abt Associates.
Bloom, H. S., Orr, L. L., Cave, G., Bell, S. H., Doolittle, F., & Lin, W. (1994, March). The National JTPA Study: Overview: Impacts, benefits, and costs of Title II-A. Bethesda, MD: Abt Associates.
Bloom, H., and McLaughlin, M. (1982). CETA training programs: Do they work for adults? Washington, DC: Congressional Budget Office and National Commission for Employment Policy.
Borus, M. E. & Prescott, E. C. (1974). The effectiveness of MDTA institutional training over time and in periods of high unemployment. American Statistical Association 1973 Proceedings of the Business and Economic Statistics Section, pp. 278-284.
Borus, M. E. (1964). A benefit-cost analysis of the economic effectiveness of retraining the unemployed. Yale Economic Essays, 4(2), 371-429.
Brazziel, W. F. (1966). Effects of general education in manpower programs. Journal of Human Resources, 1(1), 39-44.
Burghardt, J., and Gordon, A. (1990). More Jobs and Higher Pay: How an Integrated Program Compares with Traditional Programs. New York: Rockefeller Foundation.
Cain, G. G., & Stromsdorfer, E. W. (1968). An economic evaluation of government retraining programs in West Virginia. In Gerald G. Somers (Ed.), Retraining the unemployed (pp. 299-335). Madison, WI: University of Wisconsin Press.
Cameron, S., and Heckman, J. (1993, January). The non-equivalence of high school equivalents. Journal of Labor Economics, 11(1).
Cave, G., Bos, H., Doolittle, F., and Toussaint, C. (1993, October). JOBSTART: Final Report on a Program for School Dropouts. New York: Manpower Demonstration Research Corporation.
Cooley, T. F., McGuire, T. W., & Prescott, E. C. (1975). The impact of manpower training on earnings: An econometric analysis (Final Report MEL 76-01). Washington, DC: U.S. Department of Labor, Employment and Training Administration, Office of Program Evaluation.
Crawford, C. (1995, February). Multiple employment training programs: Major overhaul needed to create a more efficient, customer-driven system (Report No. GAO/T-HEHS-95-70). Washington, D.C.: U.S. General Accounting Office.
Dickinson, K., Kogan, D., and Means, B. (1994, June). JTPA Best Practices in Assessment, Case Management, and Providing Appropriate Services. Menlo Park: SRI International and Social Policy Research Associates for the U.S. Department of Labor.
Dickinson, K., et al. (1993, August). A Guide to Well-Developed Services for Dislocated Workers. Menlo Park: SRI International and Social Policy Research Associates for the U.S. Department of Labor.
Dickinson, N. (1986, July-August). Which welfare work strategies work? Social Work.
Doolittle, F. (undated). Second-chance programs for youth. Unpublished draft, Manpower Demonstration Research Corporation.
Doolittle, F. et al. (1993, August). A summary of the design and implementation of the national JTPA study. San Francisco: Manpower Demonstration Research Corporation.
Employment Development Department, State of California. (1976, April). Third Year and Final Report on the Community Work Experience Program. Sacramento: EDD.
Fischer, R. (1995, January). Job Training as a Means to "Ending Welfare As We Know It": A Meta-Analysis of U.S. Welfare Employment Program Effects. Unpublished Paper. Nashville: Vanderbilt University.
Friedlander, D. (1988). Subgroup Impacts and Performance Indicators for Selected Welfare Employment Programs. New York: MDRC.
Friedlander, D., & Burtless, G. (1995). Five Years After: The long-term effects of welfare-to-work programs. New York: Russell Sage Foundation.
Geraci, V. J. (1984, August). Short-term indicators of job training program effects on long-term participant earnings (Project Working Paper 2). Austin, Texas: University of Texas, Center for Economic Research.
Goldman, B., Friedlander, D., Gueron, J., & Long, D. (1985). California: Findings from the San Diego demonstration. New York: MDRC.
Gooding, E. C. (1962). The Massachusetts retraining program, statistical supplement. Boston: Federal Reserve Bank of Boston.
Gordon, A. and Burghardt, J. (1990, March). The minority female single parent demonstration: Short-term economic impacts. Princeton, NJ.: Mathematica Policy Research, Inc.
Granger, R. C. (1994, October). The policy implications of recent findings from the New Chance Demonstration, Ohio's Learning, Earning, and Parenting (LEAP) program in Cleveland, and the Teenage Parent Demonstration (TPD). Paper presented at the annual meeting of the Association for Public Policy and Management, Chicago, IL. San Francisco: Manpower Demonstration Research Corporation.
Grossman, J. B., and Sipe, C. L. (1992, Winter). Summer training and education program (STEP): Report on long-term impacts. Philadelphia: Public/Private Ventures.
Grubb, W.N. (1992, December). The Road Not (Yet) Taken: A Vision for an Education and Training System in California. Unpublished Paper, School of Education, University of California, Berkeley.
Grubb, W. N. (1995a). Education Through Occupations in American High Schools (2 Vols.). New York: Teachers College Press.
Grubb, W. N. (1995b, February). The Returns to Education and Training in the Sub-baccalaureate Labor Market: Evidence from the Survey of Income and Program Participation, 1984-1990. Berkeley: National Center for Research in Vocational Education.
Grubb, W. N. and McDonnell, L.M. (1991). Local Systems of Vocational Education and Job Training: Diversity, Interdependence, and Effectiveness. Santa Monica and Berkeley: RAND and the National Center for Research in Vocational Education.
Grubb, W. N., and Wilson, R. (1992, June). The effects of demographic and labor market trends on wage and salary inequality, 1967-1988. Monthly Labor Review, 115(6), 23-39.
Grubb, W.N., and Kalman, J. (1994, November). Relearning to earn: The role of remediation in vocational education and job training. American Journal of Education 103(1):54-93.
Grubb, W.N., Brown, C., Kaufman, P., and Lederer, J. (1989, April). Innovation versus turf: Coordination between vocational education and job training partnership act programs. Berkeley: National Center for Research in Vocational Education.
Grubb, W.N., Brown, C., Kaufman, P., and Lederer, J. (1990, August). Order amidst complexity: The status of coordination among vocational education, job training partnership act, and welfare-to-work programs. Berkeley: National Center for Research in Vocational Education.
Grubb, W.N., Dickinson, T., Giordano, L., & Kaplan, G. (1992, December). Betwixt and between: Education, skills, and employment in the sub-baccalaureate labor market. Berkeley: National Center for Research in Vocational Education, University of California at Berkeley.
Grubb, W.N., Kalman, J., Castellanos, M., Brown, C., and Bradby, D. (1991, September). Readin', writin', and `rithmetic one more time: The role of remediation in vocational education and job training programs. Berkeley: National Center for Research in Vocational Education.
Gueron, J. M. (1987). Reforming Welfare with Work. Ford Foundation Project on Social Welfare and American Future, Occasional Paper No. 2. New York: Ford Foundation.
Gueron, J. M., & Pauly, E. (1991). From Welfare to Work. New York: Russell Sage Foundation.
Hardin, E., & Borus, M. E. (1971). The economic benefits and costs of retraining. Lexington, MA: D. C. Heath & Co.
Hillocks, G. (1986). Research on Written Composition: New Directions for Teaching. Urbana, IL: ERIC Clearinghouse on Reading and Communications Skills and National Conference on Research in English.
Kalman, J., and Losey, K. (forthcoming). Pedagogical innovation in a workplace literacy program: Theory and practice. In Glynda Hull, ed., What Workers Need to Know: Critical Looks at Literacy, Language, and Skills in the Classroom and on the Shop Floor. Albany: SUNY Press.
Kempel, J., Friedlander, D., and Fellerath, V. (1995, April). Florida's Project Independence: Benefits, Costs, and Two-Year Impacts of Florida's JOBS Program. New York: Manpower Demonstration Research Corporation.
Ketron, Inc. (1979). The long-term impact of WIN II: A longitudinal evaluation of the employment experiences of participants in the work incentive program (Draft Report). Wayne, PA: U. S. Department of Labor, Employment and Training Administration.
Kiefer, N. M. (1976). The economic benefits from manpower training programs (Final Report). Princeton, NJ: U. S. Department of Labor, ASPER.
Knapp, M., Shields, P., and Turnbull, B. (1992). Academic challenge for the children of poverty. Volume 1: Findings and conclusions. Washington, DC: U.S. Department of Education.
Kogan, D., et al. (1989, November 15). Improving the Quality of Training Under JTPA. Berkeley Planning Associates and SRI International for the U.S. Department of Labor.
Kosterlitz, J. (1989). Devil in the details. National Journal, 21(48), 2942-2946.
Levy, F., and Murnane, R. (1992). U.S. earnings levels and earnings inequality: A review of recent trends and proposed explanations. Journal of Economic Literature, 30(3), 1333-1381.
Long, D. A., Mallar, C. D., & Thornton, C. V. D. (1981). Evaluating the benefits and costs of the Job Corps. Journal of Policy Analysis and Management, 1(1), 55-76.
Lurie, I., & Hagen, J. L. (1995). Implementing the JOBS Program: An assessment in ten states. Unpublished paper.
Main, E. D. (1968). A nationwide evaluation of MDTA institutional job training. Journal of Human Resources, III(2), 159-170.
Mallar, C. (1978). Evaluation of the economic impact of the Job Corps program: First follow-up report (Report MEL 79-04). Prepared for the U.S. Department of Labor, Employment and Training Administration, Office of Program Evaluation. Princeton, NJ: Mathematica Policy Research, Inc.
Martinson, K., and Friedlander, D. (1994, January). GAIN: Basic Education in a Welfare-to-Work Program. New York: Manpower Demonstration Research Corporation.
McDonnell, L. M., & Grubb, W. N. (1991, April). Education and training for work: The policy instruments and the institutions (R-4026-NCRVE/UCB). Santa Monica, CA: RAND.
Murnane, R., Willett, J., and Boudett, K. P. (1994, December). Do Secondary School Dropouts Benefit from Obtaining a GED? From Postsecondary Education and Training? The Answers are Related! Unpublished paper, School of Education, Harvard University.
National Commission for Employment Policy (1994, June). JTPA programs and adult women on welfare: Using training to raise AFDC recipients above poverty (Research Report No. 93-01). Washington, D.C.: NCEP.
National Commission for Employment Policy (1995, January). Understanding federal training and employment programs. Washington D.C.: NCEP.
Nightingale, D. S., Wissoker, D. A., Burbridge, L. C., Bawden, D. L., & Jeffries, N. (1991). Evaluation of the Massachusetts employment and training (ET) program (Urban Institute Report 91-1). Washington, DC: The Urban Institute Press.
Orr, L. L., Bloom, H. S., Bell, S. H., Lin, W., Cave, G., & Doolittle, F. (1994, March). The National JTPA Study: Impacts, benefits, and costs of Title II-A. Bethesda, MD: Abt Associates.
Page, D. A. (1964). Retraining under the manpower development act: A cost-benefit analysis. In J. D. Montgomery & A. Smithies (Eds.), Public Policy: Vol. 13 (pp. 257-276). Cambridge, MA: Harvard University.
Passmore, D. (1987, September). Employment of young GED recipients (Research Brief No. 14). Washington, DC: GED Testing Service of the American Council on Education.
Prescott, E. C. & Cooley, T. F. (1972). Evaluating the impact of MDTA programs on earnings under varying labor market conditions (Final Report MEL 73-08). Philadelphia: U. S. Department of Labor, Employment and Training Administration, Office of Policy, Evaluation and Research.
Quinn, L., & Haberman, M. (1986, Fall). Are GED certificate holders ready for postsecondary education? Metropolitan Education, 2, 72-82.
Quint, J. C., Musick, J. S., and Ladner, J. A. (1994, January). Lives of promise, lives of pain. New York and San Francisco: Manpower Demonstration Research Corporation.
Quint, J. C., Polit, D. F., Bos, H., & Cave, G. (1994). New Chance: Interim findings on a comprehensive program for disadvantaged young mothers and their children. San Francisco: MDRC.
Rangarajan, A., Burghardt, J., and Gordon, A. (1992, October). Evaluation of the minority female single parent demonstration. Volume II: Technical supplement to the analysis of economic impacts. Princeton, NJ: Mathematica Policy Research, Inc.
Riccio, J., Friedlander, D., & Freedman, S. (1994). GAIN: Benefits, costs, and three-year impacts of a welfare-to-work program. San Francisco: MDRC.
Rockefeller Foundation and Wider Opportunities for Women. (1989). Literacy and the Marketplace: Improving the literacy of low-income single mothers. New York: The Rockefeller Foundation.
Romberg, T., and Carpenter, T. (1986). Research on teaching and learning mathematics: Two disciplines of scientific inquiry. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed.). New York: Macmillan.
Roomkin, M. (1973). The benefits and costs of basic education for adults: A case study. In Benefit-cost analysis of federal programs (pp. 211-223), a compendium of papers submitted to the Subcommittee on Priorities and Economy in Government of the Joint Economic Committee, 92nd Congress, 2nd session. Washington, DC: U. S. Government Printing Office.
Sewell, D. O. (1971). Training the poor. Kingston, Ontario: Industrial Relations Centre, Queen's University.
Slavin, R. "Cooperative learning and the cooperative school." Educational Leadership 45(3) (November 1987): 7-13.
Taggart, R. (1981). A Fisherman's Guide: An assessment of training and remediation strategies. Kalamazoo, MI: W. E. Upjohn Institute.
United States General Accounting Office. (1994). Multiple employment training programs: Most federal agencies do not know if their programs are working effectively (GAO/HEHS-94-88). Washington, DC: Author.
Villeneuve, J., and Grubb, W.N. (forthcoming). Home-Grown School-to-Work Programs: Lessons from Cincinnati's Co-op Education. Berkeley: National Center for Research in Vocational Education.
Walker, G., & Vilella-Velez, F. (1992). Anatomy of a Demonstration: The Summer Training and Education Program (STEP) from pilot through replication and postprogram impacts. Philadelphia: Public/Private Ventures.
Weisberg, A. (1988). Computers, Basic Skills, and Job Training Programs: Advice for Policymakers and Practitioners. New York: Manpower Demonstration Research Corporation.
Zambrowski, A. and Gordon, A. (1993, December). Evaluation of the minority female single parent demonstration: Fifth year impact at CET. Princeton, N.J.: Mathematica Policy Research, Inc.
[2] CBOs are private organizations, incorporated under state laws, that provide many different kinds of social and educational services. Some of them are identified with groups of individuals -- e.g., they represent the black community, or Hispanic migrant workers, or the disabled, or older women returning to the labor force; some of them represent particular services, like child care or homemaker services for the elderly. In general, there is minimal government regulation of CBOs, and their quality varies greatly.
[3] One of the issues here is whether there is any training involved in on-the-job training, or whether it is the same as short-term work experience; see the discussion in Section IV.3 below. For evidence that much on-the-job "training" does not provide training, see Kogan et al. (1989).
[4] This monograph is not a formal meta-analysis, which would require statistical analysis of outcome results, suitably standardized, from a large number of studies. A formal meta-analysis has been carried out by Fischer (1995) and will be cited as appropriate; however, like most meta-analyses, it summarizes the evidence in ways that make it difficult to understand what might have caused the outcomes. Instead, I take the approach of presenting results from a number of specific evaluations in a series of tables illustrating the kinds of findings that have been typical.
[5] On the history of manpower policy, see National Academy of Sciences (1975).
[6]For program years 1988 and 1989, the adult standards included percent of participants placed in jobs (68 percent); average hourly wage at job placement ($4.95); average cost of placement ($4,500); percent of welfare recipients placed (56 percent); percent of participants employed at 13-week follow-up (50 percent); number of weeks worked at follow-up (8); and weekly earnings at follow-up ($177). The youth standards included a positive termination rate, cost per positive termination, entered employment rate, and employability enhancement rate.
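To make the arithmetic of these standards concrete, the sketch below encodes the adult standards listed above and checks a hypothetical SDA's reported outcomes against them; the SDA figures are invented for illustration, and the direction of each comparison (minimum versus maximum) is my interpretation of how the standards would be applied.

```python
# Illustrative only: the FY 1988-89 adult performance standards cited in the
# footnote, checked against a hypothetical SDA's reported outcomes.
ADULT_STANDARDS = {
    "placement_rate":           ("min", 0.68),    # share of participants placed in jobs
    "wage_at_placement":        ("min", 4.95),    # average hourly wage ($)
    "cost_per_placement":       ("max", 4500.0),  # average cost of placement ($)
    "welfare_placement_rate":   ("min", 0.56),    # share of welfare recipients placed
    "employed_at_13_weeks":     ("min", 0.50),    # employed at 13-week follow-up
    "weeks_worked_followup":    ("min", 8),       # weeks worked at follow-up
    "weekly_earnings_followup": ("min", 177.0),   # weekly earnings at follow-up ($)
}

hypothetical_sda = {  # invented outcomes, not actual data
    "placement_rate": 0.71, "wage_at_placement": 5.10,
    "cost_per_placement": 4200.0, "welfare_placement_rate": 0.52,
    "employed_at_13_weeks": 0.55, "weeks_worked_followup": 9,
    "weekly_earnings_followup": 182.0,
}

for measure, (direction, standard) in ADULT_STANDARDS.items():
    value = hypothetical_sda[measure]
    met = value >= standard if direction == "min" else value <= standard
    print(f"{measure}: {value} ({'meets' if met else 'misses'} standard of {standard})")
```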
[7] There are many other forms of aid to low-income families that also constitute welfare, including food stamps, housing subsidies, child care and other social services; and many other programs for individuals with low earnings should also be considered welfare, including Social Security for the elderly. However, the political concern with "welfare" and efforts to reduce welfare costs almost always concentrate on AFDC first and foremost.
[8] AFDC was extended to two-parent families in 1967, because of the understanding that confining welfare only to mothers with young children would cause some fathers to abandon their families.
[9] These figures refer to participation in WIN demonstration programs (U.S. GAO, 1987).
[10] Applying the term "workfare" to the experimental welfare programs of the 1980s is somewhat misleading because few of them included CWEP, and the mandatory elements that were historically part of workfare -- e.g., the threat that welfare recipients would lose their grants if they failed to comply with work requirements -- have not been frequently used. Enrollments in these new programs have been kept relatively low, partly for reasons of cost, so there has been greater emphasis on voluntary than on coerced participation.
[11] In a departure from past welfare policy, JOBS also allows families in which both parents are present in the home and the primary wage-earner is unemployed to qualify for benefits and services, and it requires that one parent participate in training. However, like WIN, funding limits mean that JOBS will not function as an enforceable mandate. The federal government is requiring that states enroll 7 percent of their total welfare caseload in 1990 and that the proportion increase to 20 percent by 1995.
[12] See also the massive review of federal programs compiled by the National Commission for Employment Policy (1995).
[13] Although about one quarter of student aid goes to proprietary schools, most of the remainder goes to students in four-year colleges and universities, not to students in occupationally-specific programs in public community colleges and technical institutes. Student aid therefore provides only limited support for job training.
[14] See also the distinction made by Gueron and Pauly (1991) between "broad coverage" programs, which include all the elements of a complete job training program and are designed to reach a broad range of individuals (and where participation is often mandated for welfare recipients), and "selective-voluntary" programs, for which program administrators can select who can enroll or individuals can volunteer to participate or not. The experimental programs examined in Section III.3 are examples of "selective-voluntary" programs.
[15] In many places, states provide funding per student to individuals in community colleges and technical institutes, so that if JTPA or JOBS sends an individual to a community college, state aid to the college -- and not JTPA or JOBS funds -- pays most of the costs of the program, making this kind of referral cheaper than paying the full costs of the program in a CBO. This kind of "cost-shifting" leads to individuals being funded out of several public sources simultaneously: an individual in a community college could receive subsidy through state aid to the college, federal grants and loans through student aid, and subsidy through JTPA or JOBS. See Grubb and McDonnell (1991).
[16] CETA generated a data set -- the Continuous Longitudinal Manpower Survey (CLMS) -- that followed several waves of CETA clients and also contained information about control groups, and the CLMS was used for many of the evaluations; for a survey, see Barnow (1986).
[17] There can be, of course, differences between experimental and control groups due to sampling error. Therefore most evaluations use regression methods to control for the effects of various personal characteristics that may vary between experimental and control groups -- with the assurance that, because of random assignment, the variables describing program participation are uncorrelated with background variables and with unmeasured characteristics like motivation.
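As a sketch of the estimation strategy this footnote describes, the fragment below uses simulated data (not any actual evaluation file) to regress earnings on a treatment indicator plus background covariates; under random assignment the treatment dummy is uncorrelated with the covariates in expectation, so the adjustment mainly absorbs chance differences between the groups.

```python
# Minimal sketch of regression adjustment in a random-assignment evaluation,
# using simulated data rather than any actual evaluation file.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
educ = rng.normal(11, 2, n)          # years of schooling
prior = rng.normal(4000, 1500, n)    # prior-year earnings
treat = rng.integers(0, 2, n)        # random assignment: independent of covariates
earnings = 3000 + 500 * treat + 300 * educ + 0.4 * prior + rng.normal(0, 2000, n)

X = sm.add_constant(np.column_stack([treat, educ, prior]))
fit = sm.OLS(earnings, X).fit()
print(fit.params[1])  # estimated program impact; close to the true value of 500
```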
[18] A contrary argument could be made: that assignees who don't enroll find a job in the interim, and are therefore more job-ready and motivated. However, if this were true, then the benefit per assignee would be higher than the benefit per enrollee, contrary to the evidence.
[19] This is likely to happen if the demand for labor is relatively price-inelastic, in which case any shift in the supply function for labor will increase employment only slightly and reduce wages considerably.
[20] As mentioned above, this is not a formal meta-analysis; for a meta-analysis, see Fischer (1995). See also the recent review of education and job training programs in What's Working (and What's Not) (Office of the Chief Economist, 1995).
[21] There is still a problem of external validity despite the sophistication of the experimental designs: the 16 programs were volunteers, and though they did not differ in any obvious way from JTPA programs as a whole, it is still possible that they are more effective than randomly-selected programs would be.
[22] These results indicate the problems even with an experiment: because many of the individuals assigned to the program failed to show up, the number of enrollees actually undergoing job training was smaller than the number assigned. The benefits for those enrolling were evidently higher than the benefits for all assignees, as Table 5 shows.
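The arithmetic behind the assignee/enrollee distinction is simple; the sketch below, with invented numbers rather than the actual JTPA estimates, shows the standard no-show adjustment, which divides the impact per assignee by the enrollment rate under the assumption that non-enrollees are unaffected by assignment.

```python
# Illustrative no-show adjustment with invented numbers (not the JTPA estimates).
impact_per_assignee = 600.0  # estimated earnings gain per person assigned ($)
enrollment_rate = 0.65       # share of assignees who actually enrolled

# If assignment has no effect on those who never enroll, the entire measured
# impact is concentrated among enrollees:
impact_per_enrollee = impact_per_assignee / enrollment_rate
print(round(impact_per_enrollee))  # about 923 -- larger than the per-assignee figure
```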
[23] These JTPA evaluations examined only the Title II-A programs, not the Title II-C programs that require local SDAs to spend 40 percent of their funding on special youth programs.
[24] In addition, these results do not cover the summer youth programs, which provide young people short-term employment during the summer. There is little evaluation of these programs, though generally subsidized work experience has not been effective in improving employment of low-income youth (Office of the Chief Economist, 1995, p. 11).
[25] The GED is earned by passing a multiple-choice test of General Educational Development that is intended to be the equivalent of a high school diploma. However, the evidence about its effects is mixed: a careful analysis by Cameron and Heckman (1993) found that it had no effect on employment once individual differences were controlled, though a recent re-analysis of these data by Murnane, Willett, and Boudett (1994) found more positive but still small effects. Other evidence (e.g., Quinn and Haberman, 1986) suggests that the GED does not increase the rate at which individuals enroll in postsecondary education.
[26] These are substantially smaller than the gains recorded for CETA, as might be expected given the upward bias of the quasi-experimental results summarized in Table 3.
[27] See Lurie and Hagen (1994), summarizing a large number of monographs on the implementation of JOBS; Grubb et al. (1990); and Grubb and McDonnell (1991).
[28] A random-assignment evaluation of JOBS programs in 6 states is also underway, but the results on outcomes will not be completed for at least another year.
[29] The emphasis on remedial education comes from the fact that individuals in GAIN go through a precise sequence of stages. At an early stage their basic reading and math are evaluated, and those who are deficient are directed to remedial education. In general, one of the discouraging (but unsurprising) findings of welfare-to-work programs is how many welfare recipients lack basic academic skills.
[30] This figure nicely illustrates the amount of movement on and off welfare that occurs "naturally", without any special programs: almost half of the single parents in the control group left welfare by the end of the third year.
[31] Across all four programs in Table 3, the average effect on earnings was 6 percent, or about $17 per month ($208 per year).
[32] Others have suggested that it is possible to pass the GED with an eighth- or ninth-grade reading level -- perhaps an indication of why earning a GED has so little influence on either employment or enrollment in postsecondary education.
[33] In the American context, where there is vociferous opposition to abortion, it would be politically damaging for any program to increase both pregnancies and abortions.
[34] This conclusion is also supported in Fischer's (1995) meta-analysis of job training programs. However, he provides no numbers to back up this claim, and the most recent evaluations -- the GAIN study by Riccio, Friedlander, and Freedman (1994) and the JTPA study at 30 months by Orr et al. (1994), both of which weaken his results -- were not included in his analysis.
[35] This is the only one of these comparisons that was statistically significant.
[36] Because individuals were assigned to services based on which were considered appropriate, the characteristics of clients varied among services and these results are not experimental.
[37] This may happen informally; in particular, some individuals claim that locating job training programs in community colleges provides individuals easier access to the educational programs of these institutions; see, e.g., Grubb et al. (1991). However, my point is that job training programs are not structured to lead to subsequent training and education opportunities. For a proposal to link education and training programs in "ladders", see also the Conclusion.
[38] In addition to the experimental results reported here, Geraci (1984) used the CLMS to measure the relation between longer-term earnings and short-term indicators. In general the correlations are quite low, indicating the inaccuracy of short-term measures of outcomes.
[39] There are technical problems with each of the two figures that suggest the lack of fade-out: the year 6 employment rates for Arkansas are actually extrapolated from two quarters of data, and the year 5 earnings in Baltimore are extrapolated from three quarters of data. These two crucial numbers could therefore be considered less certain than the figures based on four quarters of data -- even though the method of calculating standard errors, which relies on variation across individuals rather than over time, does not reflect this.
[40] These results are corroborated by Fischer's (1995) meta-analysis, which concluded that across all studies the mean effect sizes for the proportion employed increase gradually until quarter 9, at the beginning of year 3, but then decay rapidly over the next three quarters. Similarly, the proportion on AFDC declines slightly until quarter 6, remains substantial in quarters 7 to 10, and then declines to zero; see Tables A.1 and A.2.
[41] Of course, in random-assignment evaluations the effects of creaming will be eliminated because the experimental and the control groups have roughly the same characteristics by construction.
[42] Bloom et al. (1993, Exhibit 7.12). This analysis was based on the 18-month results, which may be too early, and was not replicated for the 30-month results.
[43] These comments are based not on the published descriptions of the CET program, but on several visits by Judy Kalman and Norton Grubb to CET programs in San Jose and Oakland in the course of research for Grubb et al. (1992).
[44] However, it is quite possible that CET has become a trusted supplier of labor to the low-wage employers, who then hire through CET rather than other sources; in this case the employment effects of CET might be offset by displacement, or reductions in employment from other sources.
[45] The pattern of teaching remedial education and job skills in different and independent segments is typical of job training programs that claim to "integrate" the two; they really mean that they provide both, not that the two are integrated. Like most job training programs, the teaching methods used at CET are quite conventional, teacher-directed methods following the practices of "skills and drills".
[46] In these cost-benefit analyses, the treatment of transfer payments is carried out correctly; that is, a reduction in welfare benefits is simultaneously a cost to recipients and a benefit to taxpayers, and the overall effect of a transfer on society is zero. The only real savings from reducing welfare programs therefore comes from reducing administrative expenses, as well as from the intangible benefits of having fewer individuals rely on welfare.
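To illustrate the accounting convention this footnote describes, the sketch below uses invented numbers (and ignores program operating costs) to show how a welfare reduction nets out when the participant and taxpayer perspectives are combined, leaving only the earnings gain and administrative savings as net social benefits.

```python
# Illustrative accounting of a transfer (welfare) reduction by perspective,
# with invented numbers; program operating costs are ignored for simplicity.
earnings_gain     = 500.0  # participant's added earnings per year ($)
welfare_reduction = 300.0  # reduction in AFDC payments per year ($)
admin_savings     = 30.0   # administrative expense saved per case ($)

participants = earnings_gain - welfare_reduction  # the transfer is a cost to recipients
taxpayers    = welfare_reduction + admin_savings  # and an equal benefit to taxpayers
society      = participants + taxpayers           # the transfer itself cancels out

print(participants, taxpayers, society)  # 200.0 330.0 530.0
```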
[47] One could argue that the net gains to the government budget and to taxpayers in Table 24 are so small that they could easily be negative, given random variation in programs.
[48] These conclusions are consistent with Fischer's (1995) meta-analysis. He concludes that effect sizes are significantly different from zero but very small: individuals who receive job training average increased earnings of $200 to $540 per year and a decrease in welfare payments of $200 to $400 -- but these effects generally decay over time.
[49] The current evaluation of the JOBS program in 7 states will provide a test of the "get-a-job" approach versus skills training, since individuals will be randomly assigned to the two different service approaches.
[50] There is even a small amount of evidence for this proposition. In Fischer's (1995) meta-analysis, the effects of job search increase in quarter 2 and then decay essentially to zero in quarter 4, while the effects of basic education programs are initially negative but then increase through quarters 2, 3, and 4. In addition, the effects of "staged" job search -- in which individuals are screened through job search and then proceed to other activities including education -- also increase, presumably both from screening and human capital effects. See Table B.1.
[51] There is a long-running debate in the United States about whether classroom-based or work-based training is more effective. The recent infatuation with the work-based apprenticeship systems of Germany and other European countries has led to greater interest in work-based learning even though the pedagogical problems are virtually the same in both settings, as Berryman (1995) has pointed out.
[52] For a description of the generally excellent co-operative education programs in Cincinnati, distinguishing those who view co-op programs as an educational experience from those who view them as a source of well-trained short-term labor, see Villeneuve and Grubb (forthcoming).
[53] The issues in "skills and drills" versus "meaning making" as approaches to teaching are complex, since each involves many different assumptions about the nature of learning, the roles of student and teachers, the appropriate competencies to be taught, and the like; for summaries to these differences see Grubb et al. (1992) and Grubb and Kalman (1994).
[54] There are almost surely a few exceptions since the job training world is so large and varied; for a very brief description of one of them, in San Diego, see Martinson and Friedlander (1994).
[55] In the course of observing job training and remediation programs, we saw several -- in particular a STEP program -- that almost caused us to violate good research protocol by complaining to senior administrators about the cruel and unusual practices we observed. Even in the highly-regarded CET program the teaching was completely conventional.
[56] Some direct evidence based on learning outcomes for the superiority of alternatives to conventional teaching is available for elementary students (Knapp, Shields, and Turnbull, 1992); a meta-analysis of writing instruction has shown that the presentational (or didactic) mode and the conventional teaching of grammar are the least effective approaches (Hillocks, 1986); and some specific practices, like cooperative learning, have been confirmed as superior (Slavin, 1987). However, there is relatively little evidence based on learning outcomes for adults taught in different ways, partly because there is relatively little empirical research of any kind about adult education, developmental education in community colleges, and basic skills within job training programs, and partly because the efforts to evaluate outcomes of different instructional methods have used inconsistent conceptions of instruction and therefore inconsistent observations of classrooms; see Romberg and Carpenter (1986) for math and Hillocks (1986) for writing. Another problem is that different approaches to teaching generally emphasize different goals: advocates of teaching in the meaning-making tradition usually stress "authentic" tasks and higher-order competencies, which are notoriously difficult to assess reliably, while those following skills and drills are more likely to be content with standardized tests.
[57] The question of local political interference is widely acknowledged by observers of job training programs, but almost no one has acknowledged the problem in writing -- probably because this problem has racial dimensions that are highly controversial in the United States; for an attempt to describe the problem in one community (Fresno) see Grubb and McDonnell (1991). Other efforts to cope with this problem arise in cases where particular JTPA programs have been investigated for fraud, mismanagement, or ineffectiveness. There has been to my knowledge no real analysis of the local politics of job training programs, however.
[58] It is common in the job training world for local programs to use a competitive RFP (request for proposal) process where the SDA invites organizations to bid on proposals for providing services; the competitive RFP process is intended to increase the number of organizations bidding on the basis of low cost and high quality. However, there appear to be a number of ways to manipulate the RFP process so that certain organizations will be chosen; for example, RFPs can be written so they apply only to specific organizations, or the knowledge that an RFP is rigged for a particular organization will prevent others from applying. One individual commenting on an early draft of this monograph reported that a charismatic PIC director had his life threatened for changing contracting procedures.
[59] For example, in a study of efforts to coordinate job training, welfare-to-work programs, and vocational education, none of the exemplary coordination efforts were in cities because of the dominance of purely political allocation of resources there; see Grubb et al. (1991).
[60] These are results for calendar year 1987. Of course, the quality of the data entering these calculations is much poorer than the data in random-assignment experiments: the SIPP data are entirely self-reported and retrospective, raising questions about whether reports of having been enrolled in job training, and assessments of its job-relatedness, are accurate. The other independent variables available in the SIPP data are inadequate to control for the various characteristics of those in job training programs; therefore all the coefficients on JTPA and CETA job training programs are negative, reflecting negative selection.
[61] On the problems in the current "system" of education and training, see especially Hansen (1994); Grubb and McDonnell (1991); McDonnell and Grubb (1991); Grubb et al. (1992); and Grubb et al. (1991).
[62] There are only a few exceptions to the general pattern of job training occurring in isolation from other education and training programs. In some welfare-to-work programs, caseworkers emphasize "self-initiated placement", where an individual can put together a series of individualized education and training programs. And in some cases job training programs contract with community colleges to provide short-term job training; in such cases administrators claim one of the benefits to be that trainees can more easily enroll in regular community college programs -- though the frequency of movement from short-term job training to longer-term educational programs in such situations is unknown.
[63] Part of the history of education in the United States is the development of a coherent system, with clearly-articulated links among different levels of education -- though this system was not complete until the 1920s or so. One could argue that the job training "system" is still in its infancy, since it is barely 30 years old -- and that it will take a longer period of time for a system to emerge.
[64] Equivalently, it may be that individuals completing job training programs find employment at the expense of others who do not -- though, as mentioned in Section II, this kind of displacement cannot be detected with conventional evaluation methods. In economic terms, a shift outward in the supply function for a particular kind of labor (e.g., for modestly-skilled employment) along a stationary and inelastic demand function will result in very little additional employment and in a fall in the wage rate -- and so placement rates will be low, displacement high, and the increase in earnings modest. In effect, job training programs assume that the demand for labor is relatively elastic.
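As a rough numerical illustration of this footnote's reasoning, the sketch below works through a log-linear supply-and-demand calculation; the elasticities and the size of the supply shift are assumed values chosen only for illustration, not estimates from the text.

```python
# Log-linear supply/demand sketch with assumed, illustrative elasticities,
# showing why an outward supply shift against inelastic demand raises
# employment only slightly while lowering the wage noticeably.
eps_d = 0.2   # assumed elasticity of labor demand (absolute value)
eps_s = 1.0   # assumed elasticity of labor supply
s     = 0.10  # training shifts the supply of this kind of labor out by 10%

# Equating log-linear supply and demand gives approximately:
d_ln_wage       = -s / (eps_d + eps_s)         # change in log wage
d_ln_employment = eps_d * s / (eps_d + eps_s)  # change in log employment

print(f"wage change: {d_ln_wage:.1%}")              # about -8.3%
print(f"employment change: {d_ln_employment:.1%}")  # about +1.7%
```

With a more elastic demand function the same supply shift would translate mostly into added employment rather than lower wages, which is the assumption the footnote attributes to job training programs.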
[65] Because there were only 16 sites, there was not much variation in labor market conditions. In addition, this analysis was carried out only for the 18-month outcomes, not the 30-month results.
[66] See, for example, Traub (1995), who complains about the inappropriateness of providing so much remediation in colleges.
[67] For a primer recommending practices for summer youth programs that is quite similar to this recommendation, see Center for Human Resources (1993).
[68] The ideas in this section draw upon Grubb (1992). Within the social services field, the same idea is known as providing a "continuum of services" -- for example, providing a range of mental health programs from highly restrictive facilities for the most dangerous and worst-off patients, to minimum-security facilities and halfway houses, to various forms of counseling and therapy for those able to live on their own. In theory, individuals can enter the continuum at any point appropriate to their needs and progress up and out of the system.
| Department and Program | Fiscal Year 1995 Appropriation (millions of dollars) |
|---|---|
| Department of Labor | |
| JTPA | $4,912.5 |
| Veterans programs | 175.1 |
| Employment service | 845.9 |
| Other | 910.5 |
| Department of Education | |
| Vocational Education | 1,236.2 |
| Adult Education | 270.6 |
| Literacy programs | 38.5 |
| Student grants and loans | 4,716.0 |
| Rehabilitation services | 2,086.1 |
| Other | 638.0 |
| Department of Health and Human Services | |
| JOBS | 1,300.0 |
| Community Services Block Grants | 426.3 |
| Refugee assistance | 105.0 |
| Other | 192.3 |
| Department of Agriculture | |
| Food stamp employment and training | 165.0 |
| Department of Housing and Urban Development | |
| Youthbuild | 50.0 |
| Empowerment Zone and Enterprise Community Program | 640.0 |
| Other | 47.3 |
| Other departments | 1,763.3 |
| Total | $20,413.9 |

Source: U.S. G.A.O. (1995), Appendix II.
| Program and Study | Subgroup | Estimated Earnings Effect ($) |
|---|---|---|
| Classroom Training | | |
| Ashenfelter (1978) | Black males | 318 to 417 |
| | White males | 139 to 322 |
| | Black females | 441 to 552 |
| | White females | 354 to 572 |
| Borus (1964) | Males | 305 |
| Borus and Prescott (1974) | Males | 516 |
| | Females | 38 |
| Cooley, McGuire, and Prescott (1975) | Males | 71 to 234 |
| | Females | 168 to 291 |
| Hardin and Borus (1971) | | 251 |
| Ketron (1979) | Minority females | 184 |
| | White females | 701 |
| Kiefer (1976) | Black males | -742 to -355 |
| | White males | -644 to -375 |
| | Black females | 591 |
| | White females | 639 |
| Main (1968) | | 409 |
| Page (1964); Gooding (1962) | | 446 |
| Prescott and Cooley (1972) | Males | 652 |
| Sewell (1971) | | 432 |
| Cain and Stromsdorfer (1968) | White males | 828 |
| | White females | 336 |
| On-the-Job Training | | |
| Cooley, McGuire, and Prescott (1975) | Males | -38 to 59 |
| | Females | 30 to 226 |
| Ketron (1979) | Minority males | 1984 |
| | White males | 2181 |
| | Minority females | 884 |
| | White females | 926 |
| Kiefer (1976) | Black males | -160 |
| | White males | -61 |
| | Black females | 386 |
| | White females | 926 |
| Prescott and Cooley (1972) | Males | 796 |
| Sewell (1971) | Males | 375 |
| | Females | 754 |
| Job Corps | | |
| Kiefer (1976) | Black males | -179 |
| | White males | -74 |
| | Black females | -188 |
| | White females | -780 |
| Mallar (1978) | Males | 187 |
| | Females without children | 565 |
| | Females with children | -206 |
| Adult Basic Education | | |
| Brazzie (1966) | Males | 2368 |
| Roomkin (1973) | Males | 318 |
| | Females | 12 |

Source: Taggart (1981), Table 3.1.
Effects of CETA on annual earnings increases for subgroups

| Subgroup | Westat (1981) FY 76 | Westat (1984) FY 76 | Westat (1984) FY 77 | Bassi (1983) | Bassi et al. (1984) Nonwelfare disadvantaged adults | Bassi et al. (1984) Welfare | Bassi et al. (1984) Youth | Bloom & McLaughlin (1982) | DJW (1984) Adults | DJW (1984) Youth |
|---|---|---|---|---|---|---|---|---|---|---|
| Overall | 300** | 129** | 596** | -- | -- | -- | -- | -- | -- | -- |
| White Women | 500** | 408** | 534** | 740** to 778** | 705** to 762** | 840** to 949** | -68 to -23 | -- | -- | -- |
| White Men | 200 | -4 | 500** | -- | 17 to 136 | 578 to 691** | -576** to -515** | -- | -- | -- |
| Minority Women | 600** | 336** | 762** | 426** to 671** | 779** to 810** | 659** to 703** | -201 to -77 | -- | -- | -- |
| Minority Men | 200 | -104 | 658** | 117 to 211 | 116 to 369 | -273 to 69 | -758** to -681** | -- | -- | -- |
| Women | -- | -- | -- | -- | -- | -- | -- | 800** to 1300** | 13 | 185 |
| Men | -- | -- | -- | -- | -- | -- | -- | 200 | -690** | -591** |

** = statistically significant at 5 percent.
Source: Barnow (1986), Table 3.