
Grubb, W. N., Kalman, J., Castellano, M., Brown, C., & Bradby, D. (1991). Readin', writin' and 'rithmetic one more time: The role of remediation in vocational education and job training (MDS-309). Berkeley: National Center for Research in Vocational Education, University of California.

Postsecondary Vocational Education: Community Colleges and Technical Institutes

Comprehensive community colleges and their specialized peers, technical institutes, have become some of the largest providers of remedial education.[8] The institutions have found their incoming students increasingly underprepared, particularly since the vast expansion of enrollments in the 1960s and 1970s, so they have added remedial programs to their more traditional vocational and academic offerings. Virtually every community college now offers some form of remediation;[9] estimates of the fraction of entering students in need of some form of basic instruction vary from twenty-five percent to fifty percent (Cahalan & Farris, 1986, Table 6; Plisko & Stern, 1985; Roueche, Baker, & Roueche, 1987) to seventy-eight percent in the Tennessee system (Riggs, Davis, & Wilson, 1990). Although there has been some resistance to remedial education, partly on the grounds that such programs compromise claims to being "colleges," most community colleges seem to have accepted the legitimacy of these offerings (Mickler & Chapel, 1989); many have expanded their offerings in response to greater numbers of very poorly prepared students from JTPA and welfare programs, as well as increasing numbers of foreign-born students in need of English as a Second Language (ESL).

The expansion of remedial education appears to have taken place as a result of local responses to need rather than as a result of state policies, since relatively few states have adopted specific policies for remediation.[10] However, virtually all states fund remedial education through state aid to community colleges and technical institutes--though a few establish limits on the number of remedial courses per student that receive state support--and many use their Perkins funds for remedial programs for vocational students. Receiving state aid on the basis of enrollment or attendance distinguishes community colleges from most other providers of remediation and creates a fiscal incentive for other programs--notably JTPA and welfare--to send their clients to community colleges.

All of the community colleges in our sample provided some form of remedial education, or "developmental education" as some individuals termed it. Estimates of the fraction of students enrolled in such programs varied from twelve percent to eighty-three percent, with two modes at about thirty-five percent and seventy percent. Several administrators cautioned, however, that such estimates are difficult to make because the boundary between what is remedial and what is truly college-level is a matter of judgment, and because conceptions of who counts as a "remedial student" vary from anyone taking at least one remedial course to only those enrolled in an entire remedial program. Community colleges provide remediation in several different ways: Some offer courses within English and math departments; some have established separate learning labs or centers where students can go for individualized instruction; and some have established remedial departments, which may offer a variety of courses as well as learning labs and, in some institutions, even non-remedial English and writing courses.[11]

Not surprisingly, offerings vary widely among community colleges. At one end of the spectrum, some colleges seem to offer only a learning lab equipped with either programmed or computer-based instruction, which students can use on their own initiative with relatively little guidance. The most ambitious community colleges, however, offer a great deal more and provide good examples of the eclectic approach to instruction described in Section Four. They provide courses at several levels of difficulty, from coursework below the fourth grade level, through coursework between the fourth and eighth grade levels, to coursework leading up to college-level competencies, and they do so in reading, writing, and math rather than in only one or two of these subjects. They include labs in all three subjects, where students can work at their own pace under the guidance of instructors. In reading and writing courses, they distinguish between offerings for native speakers of English and those for non-native speakers, since the two groups have different learning needs. And they provide one-on-one tutoring as well. The best of the community college programs are quite varied in their offerings, then, especially compared to the other providers of remedial education.

Colleges also vary in whether they require developmental education of students who score below some standard or whether remediation is "strongly advised" but not required. There has been a shift toward requiring remediation (Boylan, 1985), since colleges have been under pressure to increase persistence, and eleven states now mandate placement in developmental education (Boylan, 1985). Even with such a requirement, however, students can usually enroll concurrently in other vocational and academic courses. Most of the institutions that we surveyed advised but did not require underprepared students to take developmental courses, and almost all allowed concurrent enrollment in other courses. (There are exceptions: Students in Tennessee scoring below college proficiency on the state's basic skills assessment must complete a remedial program before enrolling in courses that require the skills they lack.) As a result, low scores on standardized tests are only rarely a barrier to enrollment in vocational education in community colleges--contrary to the practice in many JTPA programs, for example, in which low scores prevent individuals from entering certain training programs.

Almost all of the community colleges we surveyed enroll either welfare or JTPA clients, most of them in the regular remedial programs rather than in special courses. In some states, including California and Florida, welfare-to-work programs have not been allocated funds for basic skills instruction, so welfare programs must send their clients either to adult education or to community colleges. When welfare clients enroll in community colleges, the tracking requirements under the JOBS program entail extensive paperwork; therefore, community colleges know exactly how many welfare recipients they have in JOBS-sponsored programs. The situation with JTPA is different. Unless a community college has a subcontract with a service delivery area (SDA) to provide remediation--something which happened in only two community colleges in our twenty-three regions, largely because JTPA avoids using its own resources for remediation--or has received an 8-percent grant for JTPA clients, the college is unlikely to know, and has no need to know, whether a student is also a JTPA client. Individuals referred by JTPA to community colleges for remediation may enroll, but neither the college nor JTPA knows that the referral has been completed. As a result, many colleges report that they do not know how many JTPA clients they have, even in regions where the SDA reports that it refers individuals to the community college.

In most community colleges, remediation is relatively independent of both transfer education and vocational education. Remedial programs usually have lower status; they are more likely to be taught by part-time instructors than by regular full-time faculty; and they are likely to be seen as precursors to vocational and academic coursework, rather than as complements. In practice, this means that no community colleges in our sample have tried to coordinate remediation with vocational or academic programs. There has been, based on our survey, little attempt to develop "functional context training" in which the content of remedial courses is somehow drawn from or linked with the content of vocational programs. While concurrent enrollment in both remedial and "regular" courses is widespread, and is widely reported to have advantages in keeping students motivated and enrolled, it does not mean that the content of remedial and vocational courses has been coordinated or integrated in any way. To be sure, there has been some discussion among instructors of the need to teach basic skills within the context of "regular" courses--usually courses in literature, the humanities, and the social sciences (Luvaas-Briggs, 1983; Bojar, 1982; McGlinn, 1988; Baker, 1982; and for four-year colleges, Ganschow, 1983). In addition, our site visits identified a few efforts to use vocational material in remedial courses. By and large, however, developmental education efforts in community colleges remain independent of the transfer and vocational programs for which they presumably prepare their students.

Because community college funding is enrollment-driven, community colleges can generally provide good information on how many students are enrolled in their remedial programs. Other evidence, however, is spotty. Data on the proportion of students starting remediation who complete different stages, or who then go on to complete certificates or associate degree programs, are also very limited, though administrators estimated that between ten percent and fifty-nine percent of students complete remedial courses. Administrators often report that they have evaluation evidence, usually in the form of pre- and posttests; however, while such information may be used to evaluate the progress of individual students, it is much rarer to see it used to evaluate the effects of courses or programs. Of the institutions we contacted, several sent us enrollment figures, but only one sent an evaluation of any kind--an analysis of retention rates of students in developmental education.

In the literature on developmental education, there are relatively few evaluations; indeed, complaints about the lack of evaluation evidence are staples of prior examinations (J. E. Roueche, 1968; Cross, 1976; Roueche & Snow, 1977; J. E. Roueche, 1983; Cohen & Brawer, 1989). A meta-analysis of college programs for high-risk and disadvantaged students through the early 1980s (Kulik, Kulik, & Shwalb, 1983) located only nine evaluations of remedial or developmental programs, of which six were for community colleges and none of which was published more recently than 1971. While the analysis found that these programs have positive effects on average, community college programs and remedial programs showed smaller, and usually statistically insignificant, effects on both grade point average and persistence. More recently, one can find summaries that claim positive outcomes--such as the claim that "well-designed programs that are challenging and motivating but not overwhelming produce positive results far beyond the expectations of the instructors" (Mickler & Chapel, 1989, p. 3)--as well as relentlessly gloomy interpretations. A few states have carried out substantial evaluations of their programs, notably California, where a consortium has identified colleges with adequate evaluation information and compiled evidence showing test score gains of students in remedial courses (Learning Assessment and Retention Consortium [LARC], 1988a, 1988b, 1989a, 1989b), and New Jersey, whose results focus on attrition rather than test scores (Wepner, 1987; Morante, Faskow, & Menditto, 1984). The New Jersey results indicate that community college students who passed remedial courses had an attrition rate from one semester to the next of thirteen percent, compared to forty-two percent for those judged in need of remediation who did not complete such courses, twenty-seven percent for those in need of remediation who never enrolled in them, and twenty-one percent for those judged not in need of remediation--suggesting that, among students who need remediation, completing it sharply reduces attrition. However, while the results from New Jersey and California are generally positive, they may not be representative of all developmental programs,[12] and the underlying methodologies are weak (for reasons that will be explored later in this section).

The most thorough evaluations have taken place at Miami-Dade Community College, with its relatively sophisticated institutional research office.[13] Some results (e.g., Losak & Morris, 1983) suggest that completion of developmental courses has made little difference to student success. However, the extensive results in Losak and Morris (1985), reproduced in Tables 1 and 2, are more positive. These tables provide richer information than most other evaluations because they describe outcomes such as persistence and scores on the CLAST (College Level Academic Skills Test), a "rising junior" exam that students must pass to transfer from two-year to four-year colleges in Florida; these outcomes are more meaningful than changes in standardized test scores. In addition, the tables allow comparisons among different groups of students.

Table 1
Three-Year Persistence Rates (Graduated or Re-Enrolled)
for Tested First-Time-in-College Students Who Entered Fall Term 1982
Miami-Dade Community College

Below Placement           Successfully Completed Remedial Courses in the Following:
Score in:                 No Area          One Area         Two Areas        Three Areas
-----------------------------------------------------------------------------------------
No Area (N=2021)
  N                         2021
  Graduated                  533 (26%)
  Still Enrolled             430 (21%)
  Total                      963 (47%)

One Area (N=1524)
  N                          873              651
  Graduated                   95 (11%)        136 (21%)
  Still Enrolled             149 (17%)        164 (25%)
  Total                      244 (28%)        300 (46%)

Two Areas (N=1360)
  N                          530              509              321
  Graduated                   25  (5%)         56 (11%)         49 (15%)
  Still Enrolled              47  (9%)        130 (26%)        104 (33%)
  Total                       72 (14%)        186 (37%)        153 (48%)

Three Areas (N=1457)
  N                          641              357              303              156
  Graduated                    7  (1%)         12  (4%)         24  (8%)         14  (9%)
  Still Enrolled              56  (9%)         69 (19%)         89 (29%)         58 (37%)
  Total                       63 (10%)         81 (23%)        113 (37%)         72 (46%)

Source: Losak and Morris (1985), Table 1.




Table 2
Passing Rates for 1984-1985 CLAST Examinees
Related to Placement Test Results and College Preparatory Success
Miami-Dade Community College

Below Placement           Successfully Completed Remedial Courses in the Following:
Score in:                 No Area          One Area         Two Areas        Three Areas
-----------------------------------------------------------------------------------------
No Area
  N                         1091
  Passed All                1031 (95%)
  Passed 3 or 4             1090 (99%)

One Area
  N                          336              276
  Passed All                 271 (81%)        232 (84%)
  Passed 3 or 4              324 (96%)        266 (96%)

Two Areas
  N                          163              113               79
  Passed All                  86 (53%)         67 (59%)         51 (64%)
  Passed 3 or 4              133 (82%)        100 (88%)         72 (91%)

Three Areas
  N                          108               62               44               27
  Passed All                  32 (30%)         23 (37%)         16 (36%)         14 (52%)
  Passed 3 or 4               61 (56%)         38 (61%)         37 (84%)         22 (81%)

Source: Losak and Morris (1985), Table 3.

The data in these tables also allow calculation of the rates at which students remedy their deficiencies; for example, forty-two percent (651/1524) of students below a college-level score in one area completed remediation in that area, but only twenty-four percent of those deficient in two areas and eleven percent of those deficient in three areas completed remediation in all subjects. The results indicate that, for students found to need remediation, completing more developmental courses improved retention and CLAST scores, but that completing such courses did not eliminate the differences between students entering with deficiencies and those not needing any remediation.[14] That is, developmental education can narrow the differences among students, but it cannot eliminate them--at least not as it is currently practiced at Miami-Dade. Furthermore, completing remedial courses obviously requires substantial time and effort, especially for individuals who need to take such courses in two or three subjects, and so large fractions of students entering with scores below college level never complete the appropriate remedial sequence.
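For readers who wish to check these figures, the short Python sketch below (ours, not part of the Losak and Morris analysis; the variable names are our own) recomputes the completion rates cited above and the persistence rates along the diagonal of Table 1 from the counts in that table.

# A minimal sketch, using the counts reported in Table 1. Keys are the number of
# areas in which students scored below the placement cutoff; "completed_all" and
# the outcome counts come from the diagonal cells (remediation completed in all areas).
table1 = {
    1: {"tested": 1524, "completed_all": 651, "graduated": 136, "still_enrolled": 164},
    2: {"tested": 1360, "completed_all": 321, "graduated": 49,  "still_enrolled": 104},
    3: {"tested": 1457, "completed_all": 156, "graduated": 14,  "still_enrolled": 58},
}

for areas, row in table1.items():
    completion = row["completed_all"] / row["tested"]
    persistence = (row["graduated"] + row["still_enrolled"]) / row["completed_all"]
    print(f"Deficient in {areas} area(s): {completion:.1%} completed remediation in all areas; "
          f"{persistence:.1%} of those completers graduated or re-enrolled")

Running the sketch reproduces the completion rates in the text (roughly forty-two, twenty-four, and eleven percent) and the nearly identical persistence rates along the diagonal noted in footnote 14.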

There is, then, relatively little evidence about the effects of remediation in community colleges despite its growth over the last two to three decades. Although the evidence that exists is positive, particularly the findings from Miami-Dade, it probably describes the best institutions rather than the average practice, and is still subject to methodological flaws.

Adult Basic Education

A large system of adult education in this country provides remediation among its many offerings, which range from ABE, GED, and ESL courses to citizenship training, hobby courses, and self-improvement courses. The institutional sponsorship of adult education is bewildering: In most states, school districts have responsibility, though typically districts can choose whether or not to provide adult education. In some states (e.g., California), both school districts and area vocational schools provide adult education; in others (e.g., Illinois), adult education is the responsibility of community colleges. In a few cases there has been a division of labor; in Florida, for example, school districts provide adult education in fourteen counties, and community colleges provide it in the remaining fourteen. Adult education is generally funded by state aid per person enrolled, and so--like community college programs--it is an inviting target for JTPA and welfare programs seeking remediation at someone else's expense.

ABE programs have the distinct advantage of being ubiquitous: There were ABE programs in every community in which we interviewed. Programs such as JTPA and many state welfare-to-work efforts lack funding specifically for basic skills; moreover, they do not see themselves as educators and do not want the responsibility of developing educational curricula. ABE programs are therefore the most obvious places to send clients in need of remediation, partly because of funding but also because JTPA and welfare programs are under substantial pressure to use existing resources and avoid duplication of services. As a result, in the majority of communities we surveyed, both programs refer clients to ABE when those clients fall below specific scores on standardized tests. For example, JTPA programs often establish minimum test scores for entry into certain job skill programs; clients with lower test scores are referred to ABE programs, presumably so that they can increase their scores and then gain admission to training.

Within adult education, a common practice is to offer GED classes as well as courses at a lower level of difficulty (often labeled ABE or pre-GED) designed to prepare students for the GED classes. ABE classes cover work roughly between the fourth and eighth grade levels, while GED classes cover material roughly equivalent to grades six or seven through ten.[15] Most ABE and GED courses cover reading comprehension and arithmetic computation but incorporate little writing; compared to community college developmental education, their range is quite restricted. Most ABE programs operate on an open-entry/open-exit basis, using texts or programmed workbooks that students can follow at their own pace or (rarely, because of the lack of funds) computer-based programs. Overwhelmingly, program directors described their curricula as individualized and self-paced. "Individualized" means that programs ascertain an individual's level of performance through a standard test--often the Test of Adult Basic Education (TABE) or the Adult Basic Learning Examination (ABLE)--and then start each student at the appropriate level in reading and math. The role of instructors appears to vary greatly. They tend to have little training in adult or remedial education, and they are almost all part-time (e.g., see Balmuth, 1985, and Darkenwald, 1986); since the instructional materials are designed to allow students to progress on their own, teachers need do little other than respond to occasional questions. However, a few ABE directors in our sample mentioned that they develop alternative curriculum materials to vary the format and media of instruction and to incorporate some writing and group discussion into their programs. We suspect, then, that instructors vary enormously, from relatively passive managers of prepackaged curriculum materials to more active teachers who devise their own approaches.

Uniformly, the ABE programs we interviewed lack information about completion rates. There is nonetheless a general consensus that completion is very low; figures of fifty percent were commonly cited by the programs in our sample, and the ABE literature supports such figures (e.g., the review by Balmuth, 1985). Because of the lack of records, however, any figures on completion are simply guesses. What emerges consistently is an image of lackadaisical attendance in ABE: Directors describe many participants as attending sporadically, sometimes over long periods of time, and making slow and uncertain progress.[16]

One goal common to most adult education programs--evident in the structure of pre-GED and GED classes--is to have students pass the GED exam and so earn a high school equivalency. In turn, many JTPA and welfare programs have taken GED completion as their goal, and so the GED appears to drive a great deal of existing remediation. Unfortunately, the evidence that completing a GED enhances employment or access to postsecondary education is weak. A number of adult educators we interviewed commented that a GED "is only the first step," or that it is not enough to get a worthwhile job. The literature examining the effects of the GED--scattered, often of low quality, and in great need of synthesis--suggests that the GED may provide a small advantage to those who complete it, but that this advantage might be attributed to motivation, prior preparation, or other personal characteristics that distinguish GED completers from other high school dropouts (Passmore, 1987; Olsen, 1989; Quinn & Haberman, 1986). Given the enormous influence of the GED on the goals and methods of adult education, it is disconcerting to find so little support for its effectiveness.

We were unable to collect any evaluation evidence from the programs we interviewed. As in many community colleges, some ABE programs claim to perform evaluations using pre- and posttests, but they use these tests for individual assessment rather than for program evaluation. Just as none collect systematic information about rates of progress and noncompletion, none collect information about the subsequent experiences of their participants. The fraction of participants who go on to complete a GED or other high school diploma equivalent,[17] the fraction who gain access to vocational training, the fraction of those referred by JTPA or welfare who subsequently enter training and find employment--these and other obvious measures of success are completely lacking. Nor could we find much evaluation evidence in the literature to supplement the information we received from our questionnaires.[18] While a few studies find positive results, most of them are seriously flawed.[19] Even those studies with positive outcomes acknowledge that gains are small. For example, Diekhoff (1988) claims that "there is little doubt that the average literacy program participant achieves a statistically significant improvement in reading skill" (p. 625), citing a 1974 study for the Office of Education that documented a half-grade reading gain over a four-month period. But given the limited amount of time most adults spend in ABE, with only twenty percent enrolling for longer than one year, most ABE students will improve by one grade level or less, and their gains--from a fifth to a sixth grade reading level, for example--are trivial in practical terms. As he concludes,

Adult literacy programs have failed to produce life-changing improvements in reading ability that are often suggested by published evaluations of these programs. It is true that a handful of adults do make substantial meaningful improvements, but the average participant gains only one or two reading grade levels and is still functionally illiterate by almost any standard when he or she leaves training. But published literacy program evaluations often ignore this fact. Instead of providing needed constructive criticism, these evaluations often read like funding proposals or public relations releases. (p. 629)

The general tenor of this literature is discouraging, acknowledging low levels of motivation, high dropout rates, and the lack of any but the most infrequent and anecdotal success stories. It generally confirms the picture from our surveys: a large, unwieldy set of programs, with varied institutional sponsorship and content, lacking any systematic information about enrollments, completion, progress, or success.

The Job Training Partnership Act

The Job Training Partnership Act (JTPA) allows local programs great discretion in the services provided to eligible individuals, and it allows basic or remedial education either by itself or in combination with occupational skills training (NCEP, 1987). However, most local SDAs have chosen to concentrate on classroom-based skills training provided by community-based organizations (CBOs) and educational institutions, on-the-job training provided by firms, and job search assistance. While it is impossible to ascertain at the national level how much of JTPA's resources support remediation, basic education does not figure prominently in most discussions of JTPA,[20] and prior studies have found relatively few SDAs providing any remediation.[21] In our prior observations of JTPA programs (Grubb et al., 1989; Grubb et al., 1990), it became clear that JTPA performance standards have discouraged basic skills instruction for two reasons. Remediation increases costs and therefore has made it more difficult for programs to meet the cost-per-placement standard (a standard which has recently been abolished). In addition, several administrators claim that JTPA clients are more likely to drop out during remediation because they find it boring, irrelevant to their job goals, and too reminiscent of the schooling in which they have previously failed--and dropouts for any reason make it difficult to meet placement standards. At the same time, many administrators acknowledge the need for more remediation, and some are trying to find new resources to support more instruction in basic skills.

In our sample of SDAs, virtually all offer some remediation. Most SDAs did not know precisely how many clients received basic education, however, because this decision is often left to subcontractors and is not reported to the SDA. The several programs that did hazard guesses estimated that around fifteen percent of their clients received some form of remediation.[22] Most commonly, an SDA will subcontract with various agencies, some of which provide basic skills instruction along with vocational skills training--in short-term secretarial and clerical programs, for example. When this happens, it is difficult to determine the balance between remediation and job skills training, or what approaches are used in the remediation component, because these decisions are left to subcontractors. In only a few cases did SDAs report that they had established a policy to guide subcontractors in their provision of basic skills. When a policy exists, it is usually limited to increasing client test scores by only a few grade levels. It is also common to provide remediation only to those who can prepare for the GED with a minimal brush-up (a month or two); clients with low test scores may be supported for four to six weeks--clearly not enough to reach any minimum competency level--or, much more likely, they may be referred to an ABE or volunteer literacy program. Some JTPA programs match remediation to the client's employment goal; for example, an individual interested in office occupations may be encouraged to complete a GED, while those in janitorial programs will be encouraged to reach a seventh grade reading level. On the whole, however, explicit policies about remediation are relatively rare, and SDA administrators were generally unfamiliar with the remedial programs offered by subcontractors.[23]

In a few instances, however, SDAs have established clear expectations about basic skills. Both the San Diego Private Industry Council (PIC) and the San Francisco PIC have declared that all providers of training should incorporate basic skills instruction as appropriate, either by providing such instruction directly or by referring individuals to other agencies. Typically this is accomplished by dividing the day--for example, with skills training in the morning and remediation in the afternoon--and with no necessary relationship between the two components (though the San Diego SDA supports several organizations that do integrate remediation with vocational skills training in more meaningful ways). The policies of these two PICs are clearly exceptions, at least within our sample, though their decisions are consistent with the drift of federal policy toward emphasizing remediation.

Less commonly, SDAs will subcontract with an agency (including various educational institutions) to provide remediation only. For example, the community colleges in San Diego and Danville, Illinois, have contracts to provide remediation for JTPA clients. The Berrien-Cass-Van Buren SDA in Michigan has just begun contracts with several CBOs to offer basic education and employability skills based on the competency-based Comprehensive Adult Student Assessment System (CASAS); it expects the average duration in these programs to be about four weeks. Contracts specifically for remedial education are more common in youth programs within JTPA, for which mastery of academic competencies is an acceptable outcome. In most adult programs, however, the emphasis remains on job skills training and work experience.

The most common approach of JTPA programs is to refer individuals to other remedial programs. Based on an initial assessment, an SDA may suggest that an individual enroll in a remedial program concurrently with job skills training. The initial assessment may also be used as a barrier to some types of training and as a possible source of "creaming"[24]: Certain training programs have minimum scores necessary for enrollment, and individuals with low scores are then referred to ABE or GED programs in the hopes that they can increase their scores and later gain admission to job training. North Carolina has extended this practice statewide: A seventh grade reading level is necessary to enroll in JTPA, and all individuals below this level are referred to ABE programs.

In referring JTPA clients to other programs, SDAs appear to prefer sending individuals to ABE programs rather than to community colleges. The scheduling of ABE programs--which often take place in the evening and are typically open-entry/open-exit--may be more appropriate for individuals who are in job skills training during the day. In addition, community college developmental education in some areas does not offer remediation at a low enough level for many JTPA clients, and the tuition charged by community colleges may be a further barrier. However, in states where community colleges have established special remedial centers--as in North Carolina's Human Resource Development Centers or Wisconsin's special learning centers--JTPA and welfare-to-work programs appear to refer more clients to community colleges.

The most obvious problem with referral is that few SDAs have developed mechanisms to follow the individuals they refer to other programs. As a result, SDA officials never know whether someone referred elsewhere enrolled in that program, completed it, or made it back into job skills training.[25] The mechanism of referral may seem like an appropriate form of cooperation among education and job training programs, but it is just as likely to exclude individuals from training and cause them to be "lost" among programs.

Finally, a substantial though unknown fraction of JTPA 8-percent funds is used for remediation. These funds, which are designed "to facilitate coordination of education and training services" (Section 123, Job Training Partnership Act), are often allocated through departments of education, following state priorities. In many cases these priorities include remediation. Georgia, for example, recommends that 8-percent funds support remediation, GED programs, and support services for JTPA clients in technical institutes; Massachusetts has used its funds for a program called Workplace Education, providing ABE, GED, and ESL instruction through employers; Michigan uses its 8-percent funds for the Summer Training and Education Program (STEP), providing basic skills to in-school youth, and for literacy and basic education provided by local agencies; Illinois allows remediation as an option for 8-percent funds, and several SDAs there use all of these funds for basic education; Tennessee has allocated half of its funds to the State Department of Education for statewide literacy programs; Washington has recommended that 8-percent programs emphasize basic educational skills and workplace literacy; and California has established, as one of its two priorities, programs that combine basic skills and vocational skills. In addition, several states (including California) have allocated some of their 8-percent funds specifically for welfare recipients, and these resources are also likely to find their way into remediation. The 8-percent funds are generally viewed within JTPA as relatively unconstrained resources--meaning, in particular, that they are not subject to performance standards--and have therefore been widely used for novel or experimental programs or for those serving hard-to-serve groups. As a result, many remedial programs have at least a little 8-percent money supporting them.

The remediation funded by JTPA follows a consistent pattern. First, because JTPA funds relatively short programs--rarely longer than twenty weeks and often less than half that--there is constant pressure to achieve gains in short periods of time; programs will therefore report gains (usually in grade-equivalent scores) per one hundred hours of instruction. Second, there is a distinct preference within JTPA for self-contained remedial programs--that is, programs whose curriculum materials (including teaching aids) are already developed and can be implemented without a great deal of time for teacher preparation, curriculum development, or the participation of skilled educators--including computer-based programs such as the PLATO system and IBM's Principles of the Alphabet Literacy System (PALS), sometimes referred to as "turn-key" systems. JTPA administrators often distinguish themselves from educators, claiming to be job-oriented and performance-driven rather than academic and enrollment-driven; this distinction leaves some of them uncomfortable with developing educational programs, and a typical comment about the decision to refer clients to ABE programs is that "we'll leave that to the educators." Finally, with the exception of some programs incorporating employability skills and several innovative programs described in our "Alternatives to Skills and Drills" section, the vast majority of remediation provided within JTPA has not been modified to incorporate occupationally oriented material or to integrate knowledge required in job skills training. Almost all of it follows the model we label "skills and drills." Unfortunately, the limits of skills and drills are especially obvious within JTPA, which serves many high school dropouts and others who have not done well in conventional schooling; several administrators volunteered that remedial programs are boring and demeaning to their clients, and that some JTPA clients who can read relatively well nonetheless score poorly on standardized tests and drop out.

As in every other area of remediation, there are no evaluation results about the effects of basic skills instruction within JTPA on outcomes such as completion of job skills training, placement, or subsequent earnings. Even though SDAs must compile information on performance standards, these data are used for compliance rather than for evaluation; as a result, no JTPA program in our sample could provide evidence about the effectiveness of remediation. More general evaluation evidence about the effects of JTPA will begin to appear only when the National JTPA Study is completed, in 1992 (Gueron, Orr, & Bloom, 1988).

Two other recent evaluations of JTPA-related programs are tantalizing, though far from conclusive. One study examined the JOBSTART demonstration programs, which offer comprehensive services to disadvantaged high school dropouts (Auspos, Cave, Doolittle, & Hoerz, 1989). The evaluation differentiated among programs offering remediation and job skills training concurrently, those offering remediation before job skills training (sequentially), and those providing remediation and referring their clients elsewhere for occupational skills training. The preliminary results indicate that JOBSTART participants received more education and training, and were more likely to receive a GED,[26] than members of control groups, but results about the effects of the different patterns of education and training have yet to appear. A second study, an evaluation of the Minority Female Single Parent Demonstration, examined four programs designed to help low-income single mothers move from welfare to employment (Burghardt & Gordon, 1990). Three of the programs had no significant effects compared to control groups; the one with a significant influence on employment rates and earnings--the Center for Employment Training (CET), based in San Jose and described in greater detail in our "Alternatives to Skills and Drills" section--is a CBO that integrates basic skills training with job skills training. The authors of the evaluation concluded that programs which integrate remediation and skills training are more effective than those that provide the same services in a non-integrated fashion. Appealing as this conclusion is, the contention that integration, rather than any other difference among the programs, explains the effectiveness of CET cannot be supported by this kind of research.[27] In any event, the linkage between remediation and job skills training in the experimental programs evaluated by these two reports is quite different from the general practice in our sample of SDAs, in which relatively few programs provide any basic skills training and most simply refer their clients to ABE programs.

Welfare-to-Work Programs

The Family Support Act of 1988 established the Job Opportunities and Basic Skills (JOBS) program, which requires states to establish welfare-to-work programs and to compel some welfare recipients to participate. A wide range of services can be provided, including vocational training, basic or remedial education, postsecondary education, job search assistance, work experience, on-the-job training, and support services such as child care. In theory, the JOBS program could be used to provide a rich array of services to welfare recipients--a rebirth of the "services strategy" of the 1960s. However, many of the experimental welfare-to-work programs established during the 1980s provided paltry amounts of education and training,[28] and our previous investigations confirmed that many states have not appropriated enough money to provide much education or job training (Grubb et al., 1990). The major services in most welfare-to-work programs are short-term job search assistance and counseling.

Our survey of remediation practices confirmed the lack of resources in most welfare-to-work programs. Almost universally, local administrators began planning for JOBS by convening all providers of education and training in the area, and then used existing providers for specific services--especially JTPA for job skills training and adult education for remediation (Grubb et al., 1990). For remedial education, the dominant practice is to provide an initial assessment--usually with a conventional test of academic skills like the TABE or, particularly in California, with CASAS, a test which includes employability skills as well as conventional reading and math competencies--and then to refer individuals with low scores to existing ABE and GED programs and individuals who are not native speakers of English to ESL programs. Quite often this is a matter of state policy: Florida does not provide funding for basic skills through the JOBS program, but relies instead on state funding of ABE through adult schools and community colleges; Georgia has decided to use JOBS funds only for support services and to rely on JTPA and ABE for education and training; Illinois similarly uses Project Chance funds to pay for support services, with community colleges providing education and training from special funds supplied by the Community College Board and the State Board of Education; and California has required that adult schools and community colleges provide services to welfare recipients, though local programs are generally free to use their funds as they want.[29] In addition, as mentioned above, many states use large amounts of their JTPA 8-percent funds to support remedial programs for welfare recipients, so again welfare-to-work programs need not use their own resources.

In some instances, welfare-to-work programs have contracted with community colleges to provide remediation for groups of welfare recipients, who enroll in the regular developmental education programs of the college but may receive special tutoring and counseling as well.[30] This mechanism provides welfare recipients with a wider array of remedial courses than most adult schools offer. In addition, welfare recipients can claim to be going to college rather than to remedial education; the atmosphere is less like that of the high schools many of them dreaded; and being on a community college campus allows them to see the other offerings available. Finally, we have come across some remarkably innovative approaches in the JOBS program. For example, some programs use a mechanism of individual referral, allowing welfare recipients to attend virtually any education or training program in the area (including community colleges, four-year colleges, and proprietary schools) and using caseworkers to guide individuals through the maze of possibilities. Fresno City College in California enrolls about five hundred and fifty Greater Avenues for Independence (GAIN) recipients in the developmental programs of the college, providing them with additional tutoring and guidance; welfare workers have also located an office on the campus so that problems with eligibility, necessary information, and lost checks can be resolved without recipients missing classes. Such approaches are admittedly rare, however; the typical welfare-to-work program provides assessment, referral to an ABE program for those with low scores, and very short-term job search assistance, with education and job skills training relatively uncommon.

One important characteristic of the welfare system is that JOBS participants are assigned caseworkers who are responsible for monitoring progress. In addition, extensive reporting requirements allow programs to track clients. The problem of losing track of individuals referred elsewhere, so prevalent in JTPA, should therefore be less serious for welfare recipients. However, this is not necessarily the case: Many welfare programs in our sample are so new that their management information systems are not yet operating, and data on how many individuals have received various services are not available. In addition, there is a surprising tendency for individuals to become lost in the complex system. In California, for example, where the GAIN program has been running longer than almost any other, fourteen percent of single-parent families required to participate received basic education; ten percent received self-initiated education or training; ten percent received job search assistance; one percent received other education and training; and one percent received work experience--but twenty-nine percent did not attend an initial orientation, and thirty-seven percent did not participate in any service at all, largely for lack of follow-up or because they were "deferred." Of the thirty-four percent who participated in an initial service (basic education, job search, or self-initiated education and training), ninety-one percent did not make it to the next stage of assessment (Riccio, Golden, Hamilton, Martinson, & Orenstein, 1989, Figure 2). Since large numbers of even mandatory participants are lost in the system or have dropped out, the ideal behind the caseworker model--that individuals have a supportive guide through the services they might receive--is undermined in practice. As one GAIN administrator in California commented, the lack of information about progress means that many clients "fall into the black hole of ABE," staying in ABE for long periods without much progress and without caseworkers knowing whether they have completed it or not.
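To make the scale of this attrition concrete, the short arithmetic sketch below (ours, not from Riccio et al., 1989; the labels simply restate the percentages above, and totals are approximate because of rounding) works out roughly how many of every one hundred mandatory GAIN registrants reach the assessment stage.

# A rough sketch using the California GAIN percentages cited above,
# expressed per 100 single parents required to participate.
services = {
    "basic education": 14,
    "self-initiated education or training": 10,
    "job search assistance": 10,
    "other education and training": 1,
    "work experience": 1,
}

received_any_service = sum(services.values())   # about 36 of every 100 registrants
initial_service = 34                            # the three "initial" services, as reported
share_not_reaching_assessment = 0.91            # of those receiving an initial service

reached_assessment = initial_service * (1 - share_not_reaching_assessment)
print(f"Received any service:      about {received_any_service} of 100")
print(f"Reached assessment stage:  about {reached_assessment:.0f} of 100")

On these figures, only about three of every one hundred mandatory registrants reach the assessment stage, which is the arithmetic behind the "black hole" complaint quoted above.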

The dominant practice, then, is to refer individuals to adult education or, less often, to community colleges, and these programs are typically not integrated with job skills training; as a result, remedial education for welfare recipients is rarely coordinated with skills training. In fact, several states require welfare recipients to follow a rigid order of services. California, for example, requires an initial appraisal, then basic education or ESL for those below a certain score, and finally three weeks of job search assistance; those failing to find jobs then go through vocational assessment and develop an employment plan that may include further education or vocational skills training. Similarly, Florida requires a sequence in which individuals who fail to find employment after a job search take the TABE, enroll in remedial programs, and only then go into job skills training. In such cases, remediation must precede skills training, often by relatively long periods, and the chance to coordinate remediation and skills training is lost. Recognizing the disadvantages of its sequential approach, California is now experimenting in four counties with "concurrency," in which individuals enroll in remediation and skills training at the same time; but the dominant approach--for the very small fraction of participants who receive any skills training at all--is clearly still sequential.

Finally, and not surprisingly, there is no evidence about the effectiveness of remediation within welfare programs. Although there were careful evaluations of welfare-to-work pilot programs during the 1980s (see Gueron, 1987), none was able to distinguish the contributions of different services to changes in earnings and welfare dependence; indeed, it is difficult even to determine how much basic education individuals received in these pilot programs.[31] Although the evaluation of the Minority Female Single Parent Demonstration found the most effective program to be one which integrates remediation with job skills training (Burghardt & Gordon, 1990), this evaluation, too, could not disentangle the contribution of basic skills instruction to the outcomes. Most welfare-to-work programs have discovered a much greater need for remediation than anticipated (e.g., see Riccio et al., 1989), and there is a consensus that remediation is one of the most important services that welfare-to-work programs can provide; strictly speaking, however, this consensus rests on assumptions rather than evidence.


[8] For general background on remediation in community colleges, see Cohen and Brawer (1989), chap. 8, and Ahrendt (1987). Although there may be differences between community colleges and technical institutes in their provision of remediation, we have been unable to learn much about such differences either from our telephone surveys or from the literature.

[9] For surveys of basic skills courses in both two- and four-year colleges, see Lederman, Ribaudo, and Ryzewic (1985), who found that eighty-two percent of colleges offer remedial reading courses, ninety-one percent offer basic writing courses, and eighty-six percent offer basic math courses. Since these figures include all colleges, the figures for community colleges are certainly higher. Wright (1985) found that eighty-eight percent of two-year colleges offered some form of developmental education, and ninety-four percent offered support services such as learning assistance centers. By 1985, Boylan (1985) claimed that ninety-seven percent of the two-year institutions he surveyed offered developmental education. A survey by the Department of Education found that eighty-eight percent of two-year colleges and seventy-eight percent of four-year colleges offered remediation in 1983-1984 (Cahalan & Farris, 1986). A forthcoming survey by the Department of Education found that ninety-one percent of community colleges offered remedial courses in 1989; see College-Level Remediation in the Fall of 1989, described in Education Week, May 22, 1991, p. 11.

[10] The RAND Corporation is currently conducting a survey of policies in fifty states for the National Center, and one question addressed to postsecondary policymakers is whether there is a state policy on remedial education. The vast majority of states have established no special policies, though California and Washington require community colleges to provide a full range of remedial courses, and several states (Connecticut, Florida, Georgia, Louisiana, New Mexico, New York, Oregon, and Texas) require that remediation be provided to all students who fail a standardized test. (Boylan, 1985, also reports that eleven states now require colleges to provide developmental education where a need for such programs has been identified.) Several states report considerable interest in developing more coherent policies or state task forces to develop such policies.

[11] In a national survey of college and university courses with a sixty-two percent response rate, twenty-five percent of institutions offered courses through English and math departments; thirty-seven percent had established a remedial center of some kind; and forty-three percent had established a developmental or academic skills department (Gruenberg, 1983). Both Cohen and Brawer (1989) and S. D. Roueche (1983) state that many of the most successful developmental programs are in academic departments, but the evidence for their claim is unclear.

[12] A common finding is that studies with positive results are published; those with negative or inconsistent conclusions are less likely to be published. The California results are based only on those colleges that have adequate evaluation results available, and those are likely to be the most self-consciously outcome-oriented programs.

[13] Many of the papers on remediation from Miami-Dade's Office of Institutional Research have been collected in two volumes, "Collection of Papers Related to the Academically Underprepared Student," by John Losak.

[14] Along the diagonal in each table are the figures for those who have entered with no deficiencies, and those who have entered with deficiencies in one, two, or three areas but have completed remedial courses in these areas. From Table 1, the total persistence rates for these four groups are the same (forty-seven percent, forty-six percent, forty-eight percent, and forty-six percent), but the graduation rates vary monotonically with the amount of remediation necessary (twenty-six percent, twenty-one percent, fifteen percent, and nine percent), a pattern which appears again in Table 2 for CLAST test results.

[15] The practice of translating tests and programs into grade equivalents is widespread, so we will follow this method of describing programs. This practice reflects the origin of adult education in the elementary-secondary school system, with school standards and criteria still used for adults. However, many have objected to the use of grade equivalents, particularly for adult students who may be quite sophisticated in some areas while their test scores are still relatively low; see Sticht (1987), Taggart (1986), Mikulecky (1983), Long (1983), Balmuth (1985), Tomlinson (1989), and Harman (1985).

[16] See also the surveys by Balmuth (1988) and Darkenwald (1986) on chronic absenteeism, irregular attendance, and dropout. The survey of adult education directors by Holmes, McQuaid, and Walker (1987) found that the second greatest barrier to comprehensive literacy instruction--second only to lack of money--is the low motivation among ABE students.

[17] One source of information about GED completion is Jungeblut and Kirsch (1986), who reported that 39.6 percent of those who studied for a GED received one. However, these results are retrospective self-reports and must be interpreted with caution.

[18] See, for example, Balmuth (1985, 1988), Darkenwald (1986), Kazemek (1988), and Sticht (1988). In the exhaustive literature review by Solorzano, Stecher, and Perez (1989), there are no outcome evaluations despite their attempt to collect them. An evaluation of federally funded programs is now being undertaken by Development Associates, Arlington, Virginia, sponsored by the U.S. Department of Education, but it will collect only limited information on pre- and posttests from a sample of programs.

[19] For a review with some positive findings, see Mahaffy (1983); however, most of the studies he cites have obvious validity problems because they depend on opinion surveys of ABE administrators. Darkenwald (1986) cites a study by Kent examining pre- and posttests over a five month period, with an average gain of 0.5 grade levels in reading and 0.3 grade levels in math (p. 7); another result, from an MDTA program, found increases of 0.4 grade levels after fifty-four hours of instruction. Paltry as they are, these gains are likely to be due to selection effects, regression to the mean, practice effects, and other artifacts.

[20] See, for example, the overview of JTPA in National Commission for Employment Policy (NCEP) (1987), which includes almost nothing about basic skills. One reason that it is impossible to learn anything about the magnitude of basic skills instruction within JTPA is that, for reporting purposes, basic education and classroom-based occupational skills training are lumped together as classroom training.

[21] A study of the quality of training in JTPA (Kogan, Dickinson, Means, & Strong, 1989) examined the services in fifteen representative SDAs. While they found that thirteen out of twenty-two classroom-based programs included some basic skills, only two of the thirteen devoted at least twenty percent of class time to basic education. Only three programs included any basic education in the same classes in which occupational skills were taught.

[22] The frequency with which the fifteen percent figure came up is suspicious. Since many administrators have absolutely no information with which they could construct even an estimate, we interpret a figure like fifteen percent to mean that a small but non-trivial number of individuals receive remediation.

[23] Indeed, SDA administrators have no need to know what a subcontractor does. As long as an agency enrolls sufficient numbers of people and fulfills the terms of its subcontract (if it has a performance-based contract), then what the agency does to train and place clients is immaterial to the SDA.

[24] JTPA has consistently been charged by critics with creaming, or accepting only the most able and most experienced individuals eligible; just as consistently, program administrators have responded that since all those eligible are in desperate need of services, the charge of creaming is absurd. For some evidence that creaming has taken place, see GAO (1989).

[25] However, the Kalamazoo-St. Joseph County SDA in Michigan does track its clients. All individuals draw up an employability development plan before they are referred to ABE, and a JTPA counselor checks on their progress in ABE; individuals with low test scores can also co-enroll in ABE and on-the-job training rather than being kept out of training. This tracking mechanism appears to be an exception, however.

[26] The education component in JOBSTART stressed GED preparation, so the increase in GED completion is not surprising. From the description in Auspos et al. (1989), most of the JOBSTART education components seemed to follow a skills and drills approach, with the possible exception of the Dallas site and CET in San Jose.

[27] Other possibilities are that the effects of CET can be explained by the greater amount of job training provided; by the nature of the instructors, who are virtually all Hispanic and bilingual and who serve predominantly Hispanic clients; by the close connections with local industries; by the fact that many CET classes perform real work--operating the cafeteria, running a child care center, and operating a print shop, for example--rather than merely providing training for work; or by any number of other characteristics that would require more extensive fieldwork to detect.

[28] For example, the GAO (1987) found that while eighty-four percent of these programs claimed to offer vocational skills training and seventy-two percent offered postsecondary education, only 3.2 percent of the participants received any remedial education, 2.3 percent received job skills training, and 1.6 percent were enrolled in postsecondary education (p. 69). For corroboration of the low levels of education provided, see Figueroa and Silvanik (1989).

[29] Community colleges in California are under a "cap," or limitation on the enrollment of students who qualify for state aid; however, this cap does not apply to Greater Avenues for Independence (GAIN) participants, thereby providing a funding mechanism for welfare recipients to attend community colleges.

[30] Such a contract is also necessary because the tracking requirements of JOBS impose additional reporting requirements and expenses for the colleges.

[31] There is a tendency in the Manpower Demonstration Research Corporation (MDRC) evaluations of welfare-to-work programs to lump all types of education and training together, making it impossible to tell just what individuals received. In the San Diego experiment, the program clearly increased participation in both college-level courses (in the AFDC-U sample only) and basic education, though a substantial amount of education and training among the controls means that the differences, even when statistically significant, are surprisingly small (Hamilton & Friedlander, 1989, Table 3.1). In the Virginia case, however, the welfare-to-work program failed to increase education or training significantly (Cave, Freedman, Price, & Riccio, 1986). The real increases in most of the demonstration projects came in job search activities; a reasonable interpretation is that the modest positive outcomes are due to increases in job search, not to education or training.

