As we pointed out in the introduction, states mean different things when they talk about reforming their workforce development systems. In some states, the consolidation of programs in a single agency, or the creation of a super-agency with some authority over a variety of programs, constitutes reform, despite the fact that consolidation by itself may not lead to different decisions about services delivered. Some states are concentrating on building the institutional infrastructure for a system, particularly through one-stop centers. In other states, reform means incorporating more market-like mechanisms, particularly competition among potential providers. Other states are trying to develop data and accountability mechanisms to drive improvement in a top-down way. And in still other cases--Oregon and Oklahoma are the obvious examples--reform means cultural changes in the ways that local and state administrators view their work. Rather than focusing on individual programmatic requirements, administrators are encouraged to work with other programs to achieve the larger goals of workforce and economic development.
Because of this variety, it is worth clarifying what it might mean to create a coherent system of workforce development programs, drawing on our experiences with ten states and twenty localities. We first examine what the linkages among programs might be, developing a rough hierarchy ranging from minimal interaction among programs to fully integrated service delivery. We then examine conceptions about the quality of individual programs, for the simple reason that a coherent system of low-quality programs--in which, for example, the linkages among programs are clear to all participants, information about the options is readily available, and joint planning has eliminated all duplication, but in which each program provides low-quality services--is not necessarily an improvement. Thus, system-building may be necessary but not sufficient for effective workforce development.
In examining what states are currently doing, we can discern a hierarchy of efforts to link programs. The least substantial of these, particularly the information-sharing mechanisms, have been required by federal legislation at least since 1984.[28] The more substantial of these coordination efforts, including integrated service delivery, occur only under rare circumstances, particularly when a community college dominates all services in an area. However, states are starting to think of a greater variety of coordination mechanisms, and some of these are slowly working their way up this hierarchy. The coordination activities that follow can be ranked roughly from less substantial to more substantial (summarized in Box VI.1).
Box VI.1 A Hierarchy of Coordination

Information Sharing
1. Sign-off provisions
2. Cross-membership on boards and councils
3. Other information-sharing mechanisms, from informal to more formalized
4. Information-sharing about specific clients

Referrals Among Agencies
5. General referrals to other programs
6. Subcontracts with other agencies
7. Creating feeder systems and articulation agreements

Joint Service Delivery
8. A division of labor
9. Co-delivery or integrated service delivery
10. Consolidated service delivery
Information Sharing
1. Sign-off provisions require one program to certify that it has seen the plans of another program. These constitute perhaps the simplest and most bureaucratic form of information-sharing. Such plans are often compliance-oriented documents, assuring federal agencies that the letter of the law has been met while providing little information about which specific services are being delivered or what substantive decisions have been made.
2. Cross-membership on boards and councils means that board members for one program also sit on the boards of others. This is a mechanism of information sharing, of course, with the added benefit that individuals have some power to make decisions about other programs. For example, to create linkages among programs, Oregon requires that Regional Workforce Quality Councils (RWQCs) include representatives of welfare, JTPA, the Employment Department, Economic Development Department, community colleges, K-12 districts, and the Community Action Agency. In some cases, the overlapping membership has led to the consolidation of certain boards. In the mid-Willamette region, leaders began to see that the responsibilities of the PIC and the RWQC were substantially similar; both were therefore dissolved and replaced by the Enterprise for Employment and Education, which now carries out the responsibilities of both.
3. Other information-sharing mechanisms range from informal to more formalized. For example, in Greensboro, North Carolina, the directors of the Employment Security Commission and the Department of Social Services regularly get together for coffee. While this practice is informal and non-institutionalized, they credit it with keeping them abreast of pending legislation and policy changes and allowing them to discuss the options available to each agency. Many local agencies credit informal mechanisms, and specifically good personal relationships among program staff, with facilitating coordination. A more formalized procedure in North Carolina is one in which the state Department of Social Services designates an individual to learn about community college programs; this individual not only becomes expert about complementary programs but also becomes the "point person" for initiatives, complaints, and other contact from community colleges. One community college began formally designating individuals responsible for liaisons with other programs: one works with the local STW initiative, another is connected to local job training efforts, and a third is responsible for knowing every program in the college and serving as a one-stop information center for other programs.
4. Information-sharing about specific clients occurs rarely, though in Greensboro, North Carolina, JTPA and the Department of Social Services have joint caseworkers, one from each agency, for every client. In theory, this practice allows them to identify the appropriate array of services from the two programs. This approach moves beyond simple information-sharing toward joint service provision (see #8 below).
Referrals Among Agencies

5. General referrals to other programs occur when a program refers all individuals of a particular kind to another agency. For example, some JTPA programs refer individuals who fail a basic skills test to adult education; in the past, some community colleges have sent their lowest-performing students to volunteer literacy programs or adult education. Usually, such general referrals do not consider the characteristics of the receiving programs and may, therefore, refer individuals to low-quality programs.
6. Subcontracts with other agencies arise when one agency subcontracts with another to provide some or all of its services. In the past, welfare-to-work programs have often subcontracted with adult education providers and community colleges for assessment, remedial education, and vocational skills training (though these arrangements are dwindling under "work first"). Similarly, JTPA programs have often subcontracted with community colleges for vocational education. Whether such subcontracts help create a more coherent system facilitating transfer among programs, or merely serve as sources of lower-cost services,[29] is not always clear.
7. Creating feeder systems and articulation agreements within the education and training "system" is intended to provide smooth sequences or "ladders" of programs. For example, in some areas of North Carolina, every client who completes a pre-employment workshop with TANF is then referred to JTPA for job assessment and possible referral to an appropriate training program. In Iowa, the location of JTPA agencies within community colleges facilitates such feeder systems, though there is little evidence about how common this practice is.[30] The articulation agreements that community colleges have with four-year colleges, and that high schools have with community colleges in Tech Prep, are more familiar versions of such feeder systems.
Joint Service Delivery

8. A division of labor arises when programs agree, through a joint planning process, to divide up services according to the expertise of different programs. For example, in Greensboro, North Carolina, providers agreed that JTPA would do assessment; Social Services would provide supplemental services, like child care and transportation; the community colleges would provide both remedial and vocational skills training; and the local employment service office would provide placement services. This division of labor draws on the varied strengths of different programs. It also helps ensure that individuals can find the full array of services they need--rather than, for example, JTPA clients having no access (or access only to restricted options) to vocational skills training, or community college students having neither supportive services nor placement efforts.
9. Co-delivery or integrated service delivery occurs when intake workers from any program see any individual who walks in and can provide each one with the full array of services. In a one-stop center in Tempe, Arizona, individuals entering the center (called "guests," not clients) can choose among self-service, using the computer lab for information; group services, in which classes in job searching are offered; and case management, for which an individual must meet eligibility criteria. However, most individuals talk with a Personal Services Representative to determine eligibility and get answers to questions about benefits. In contrast to many other one-stop centers, this mechanism does not simply provide information to prospective clients and refer them to the appropriate agency; instead, it eliminates that potential hurdle by providing personal consultation and enrolling individuals in appropriate programs and services on the spot. Similarly, Iowa is far along in the development of an Integrated Customer Service System (ICS) and is piloting this system in its Cedar Rapids Workforce Development Center. Of course, integrated service delivery is the most complex form of coordination, since it requires that all intake workers be cross-trained about all other programs, and it requires a management information system that can track individuals through multiple programs.
10. Consolidated service delivery takes integrated service delivery one step further by merging one program into another. Most examples of consolidated service delivery we have seen occur when community colleges take over the provision of virtually all services, and particular forms of education are available to a variety of individuals with different needs.[31] For example, remedial classes are open to welfare recipients and employees as well as a college's entering students, and a variety of short-term and longer-term vocational programs serve the needs of those seeking upgrade training and retraining as well as pre-employment preparation. We note that in some cases where consolidation seems to have taken place within community colleges, separate programs are provided by different divisions of the college with little connection among them; co-location, rather than real consolidation, best describes these practices. In addition, consolidation at the state level, where separate programs are administered within one agency rather than scattered among many independent agencies, is not the same as consolidated service delivery, which must take place at the local level. The former could lead to the latter, though it does not generally do so.
Several mechanisms can move programs to these different stages of coordination. Joint planning can lead to any of them, though it is not necessarily sufficient to do so. The co-location of programs in one-stop centers can also lead to many of them, though so far officials report that co-location has been used primarily to provide more extensive information to clients rather than to promote coordination among programs. Consolidation of programs within single agencies might be intended to lead to different forms of coordination, but--as the Texas Workforce Commission (TWC) illustrates with its internal divisions and incomplete collection of programs--in practice it has not. Joint planning, co-location, sign-off requirements, cross-membership, and state agency consolidation are, therefore, means to other ends, not necessarily ends in themselves.
If integrated service delivery is the most complex kind of coordination, and information sharing the least substantial, then most states are still at the first stage. Many one-stops have confined themselves to information sharing (though a number of them have plans to move beyond this stage). The voluntary efforts in Oklahoma, the activities of the TWC, and the focus on establishing computer-based information systems in Maryland have not yet moved to encourage referrals on a broad scale among programs at the local level, or to joint service delivery. To be sure, several of the states we examined are trying to create a culture and political climate in which future efforts can be more substantial, including Oklahoma with its initial voluntary effort and Oregon with its emphasis on peer leadership and example rather than coercion. Similarly, Iowa seems to have made only slow progress toward coordination, though it has been steady throughout the terms of Governor Terry Branstad, who has been dedicated to a comprehensive workforce development system. At this stage of states' development, therefore, it is not necessarily discouraging to find how far up this hierarchy of coordination states have to go, since the process has barely begun for most states. It is discouraging only when states seem to backslide, as they do when state leadership takes a different tack (as happened in Maryland, Massachusetts, Michigan, and Wisconsin) or when they appear to be reforming without any clear vision of the next steps.
One vision of system-building--one that we can discern in bits and pieces across the country--is movement up this hierarchy of coordination. But it should be no surprise that, in this large and contrary land, there are many state (and local) officials who deny that such a vision is the right way to go. A number of individuals expressed discomfort with the whole notion of coordination and system-building, sometimes because its benefits have not been proven. In still other cases--and Oklahoma with its voluntary efforts and Oregon with its noncoercive policies are good examples--state agencies may recognize the benefits but assert that coordination is not always worth the political struggle it may require. Consolidation and the inherent problems of overcoming turf also have the tendency (evident particularly in Michigan and Texas) to drive away talented administrators from agencies that are subject to consolidation. This is particularly costly when such individuals are replaced by uninformed and inexperienced staff. In Texas, for example, a great deal of institutional memory has been lost through this process. And certainly in the forty states that we did not examine, this sentiment is prevalent, since by design these are states that have done much less to begin reforming their workforce development "systems."
It is difficult to respond directly to these criticisms of coordination. The efforts at coordination and system-building are too new and incomplete to have been evaluated in any sense, and it is hard to know how to evaluate a complete system change.[32] Of course, system-building requires some political battles, with losers as well as winners. Some reform efforts have taken a great deal of time to pass through political opposition, only to do very little at the local level. Because of this, the results often do not seem to be worth the battle. Even so, many states are starting the process of reform, and have at least implicitly accepted the a priori argument in favor of system-building.
If system-building concentrates on creating linkages among programs--linkages of planning, of information, of referrals, or of service delivery--then a different kind of activity emphasizes the quality of individual programs. In states where particular institutions are highly regarded, there is a much greater tendency for other programs to use them. For example, community colleges in Oregon and North Carolina have strong reputations in those states, and, as a result, they have been central to both economic development and welfare reform. Similarly, adult education in Oregon appears to provide innovative programs and has been integrated into other workforce development efforts, whereas elsewhere the poor quality and low profile of adult education have caused it to be nearly invisible to other programs. If individual programs are of low quality, then there is little point in linking them. For example, a welfare administrator in San Francisco stated that welfare recipients "fall into the black hole of ABE" because the instruction was so poor, progress so slow, and dropout rates so high that welfare clients rarely returned with sufficient preparation to move on to vocational skills training.[33] Linking programs is, then, one purpose of system-building, but improving the quality of individual programs is also necessary.[34]
Of the ten states we examined, most have decided to improve quality through a system of performance measures, drawing to some extent on the experience of JTPA. Florida is the best example, with its array of outcome measures (see Box I.3). Oregon's Benchmarks represent another example, and Table 1 displays the data systems other states are developing. All these cases require programs to collect new data, on outcomes rather than inputs. Common measures include completion rates, short-term placement rates, and earnings levels. While the shift to performance-based funding has taken place only in Florida (and is just starting even there), there are many ways to use such information to cajole, coerce, or humiliate local programs into improving their performance.
The JTPA system, which has included performance measures and standards since the early 1980s, provides many warnings about the use of performance measures to enhance quality, including the following: (1) conventional measures of performance--for example, placement rates of individuals completing a program--are not necessarily good measures of effectiveness, which is better measured by the improvement in employment for an individual compared to what he or she would have done in the absence of a program (a hypothetical numerical sketch following this discussion illustrates the distinction). The random-assignment evaluation of JTPA--one of the most sophisticated evaluations of any training program--measured the effects of participating in the program against a precisely equivalent group that had not gone through training; these measures of effectiveness proved to be completely uncorrelated with conventional performance measures (Barnow, 1997; Doolittle et al., 1993, p. 10). (2) Performance measures often create incentives to "cream"--to select the most able individuals--rather than to improve the quality of programs; such efforts are difficult to detect except in sophisticated evaluations.[35] (3) Conventional performance measures are unable to measure displacement, or the extent to which finding jobs for program completers simply displaces other nontrainees who would otherwise have gotten these jobs. Displacement is particularly likely in markets for less skilled labor with substantial unemployment (Johnson, 1979; Solow, 1990), and in British studies the extent of displacement has been measured at up to 80% (Begg, Blake, & Deakin, 1991; Deakin & Pratten, 1987; also cited in Grubb & Ryan, 1999, forthcoming). (4) Many programs have found ways to manipulate the data necessary for performance measures, though these are not widely discussed.[36]
Table 1

| | Florida | Iowa |
| State agency | Enterprise Florida (advisory); Jobs and Education Partnership (HRIC) | Iowa Council on Human Investment (planning); Iowa Workforce Development (HRIC) |
| Local or regional agencies | Regional Workforce Development Boards | Regional Advisory Boards |
| Programs included-- state level | School-to-work (STW); Welfare-to-work (WtW); One-stop centers; High-Skills/High-Wages (HS/HW) | Promise Jobs; Elder Services; JTPA; Voc rehab; Dept. of Labor (DOL) Shared Information System |
| Programs excluded-- state level | Proprietary schools; Adult education; Technical centers; Private colleges | Vocational education; Department of Human Services |
| Local responsibilities | STW; WtW (WAGES) (in 16 of 25 areas); HS/HW (occupational forecasting) | STW; WtW; Regional Workforce Centers: JTPA, Voc rehab, ES, Veterans' Services |
| Firm-based training | Quick Response Training (high-dividend occupations only) | New Jobs Training Program (through CCs); Iowa Jobs Training Program; Business Network Program; Training and Retraining for Target Industries; Innovative Skills Development Program |
| Performance Measures and Data Systems | PBIF measures; FETPIP (see Box I.3) | Integrated information system, combining data from all programs |
| Distinctive features | Market-oriented mechanisms | |
| | Maryland | Massachusetts |
| State agency | Governor's Workforce Investment Board | Dept. of Labor and Workforce Development; Massachusetts Jobs Council (advisory) |
| Local or regional agencies | 12 SDAs; local Dept. of Social Services (DSS) agencies; local Job Service agencies | Regional Employment Boards (REBs) |
| Programs included-- state level | JTPA; Perkins funds for voc ed; adult ed; ES; JOBS, Nat'l Community Service; Food Stamp Employment and Training (FSE&T) (as of 1995) | One-stop centers; UI; ES; JTPA; WtW |
| Programs excluded-- state level | One-stop centers; STW; WtW; economic development; community colleges; UI; ES | STW; vocational education; community colleges; adult education |
| Local responsibilities | 12 SDAs provide organization for JTPA, STW, and other job training; DSS agencies implement WtW; community colleges autonomous; local planning encouraged but not mandated | Regional career centers; WtW; policy setting and coordination |
| Firm-based training | Dept. of Business/Economic Development; WtW pilot projects; some local community college and SDA programs | Corporation for Business, Work, and Learning (formerly Bay State Skills Corp.); Enhanced Enterprise Communities |
| Performance Measures and Data Systems | Dept. of Human Resources (DHR) using pay-for-performance contracts with private vendors; State Dept. of Ed. (SDE) emphasizing performance standards in K-16 education | |
| Distinctive features | New administration reversed course; split of economic development from workforce development, and of welfare from job training; Advanced Technology Centers in community colleges | Competitive funding of regional career centers; ideal of market-based workforce development system |
| | Michigan | North Carolina |
| State agency | Michigan Jobs Commission (administrative); Governor's Workforce Commission (advisory) | Governor's Commission on Workforce Preparedness (GCWP) (HRIC); Interagency Coordinating Council (advisory) |
| Local or regional agencies | Local Workforce Development Boards (WDBs) in 26 Michigan Work Areas | Local WDBs |
| Programs included-- state level | JTPA; ES; TANF/JOBS, Work First!; FSE&T; STW; Economic Development Job Training; Renaissance Zones | GCWP--adult ed; JTPA; JOBS; ES; FSE&T; community colleges; secondary voc ed; New and Expanded Industries Training |
| Programs excluded-- state level | Adult education; community colleges | Vocational rehabilitation |
| Local responsibilities | Direct Administration (Tier One): JTPA; TANF, Work First!; FSE&T; STW; No Wrong Door Centers; Corrections & Displaced Homemaker Training Programs. Collaborative Planning (Tier Two): adult education; vocational education; ES | STW; one-stops; JTPA |
| Firm-based training | Economic Development Job Training (EDJT) | New and Expanded Industries Training; Focused Industrial Training; Occupational Extension |
| Performance Measures and Data Systems | Employment at program completion and at 52 weeks postprogram; developing customer satisfaction measures (as of August 1997) | Common follow-up system, reporting to Employment Security Commission (in progress) |
| Distinctive features | Strong employer & economic development orientation; competitive subcontracting by local WDBs; economic development through tax and regulatory relief; EDJT | Priority of CCs as deliverers; consistency of efforts over time; use of one-stop centers for collaboration; staff development (Box I.2); funding to CCs for firm-based training |
| | Oklahoma | Oregon |
| State agency | Workforce Quality Compact | Workforce Quality Council (WQC) (now the Workforce Policy Cabinet) |
| Local or regional agencies | 21 local Compacts (with more expected) | 15 Regional WQCs (now Regional Workforce Committees) |
| Programs included-- state level | Education (K-12, higher education, Depts. of Vocational and Technical Education); Employment Security Commission; Human Services, Commerce, Rehabilitation Services, Human Resources; Cabinet Secretaries: Commerce, Health and Human Services, Human Resources, Education; private sector heads of 21 local Compacts | Five business members; Five labor or community-based organizations; State legislator; Local elected official; Local education representative; Education (K-12, Community College Services, State System of Higher Education); Economic Development Department; Department of Human Resources; Bureau of Labor and Industries |
| Programs excluded-- state level | DOL; proprietary trade schools; Department of Corrections | Proprietary trade schools |
| Local responsibilities | Voluntary | Advisory primarily; STW; one-stop center planning |
| Firm-based training | Area Vocational-Technical Schools; community and two-year colleges | Targeted Training Program; Small Business Development Center (in CCs) |
| Performance Measures and Data Systems | Exist for separate systems | Oregon Benchmarks (K-12: Certificates of Initial and Advanced Mastery); short-term performance measures; Shared Information System (SIS) |
| Distinctive features | Voluntary participation; inclusion of all education sectors | Long-term, stable commitment; state-level cross-functional teams; Oregon Benchmarks and SIS; Key Industries |
| | Texas | Wisconsin |
| State agency | Texas Council on Workforce and Economic Competitiveness (advisory); Texas Workforce Commission (TWC) (administrative) | Council on Workforce Excellence (state HRIC); Department of Workforce Development (administrative) |
| Local or regional agencies | 28 WDBs, of which 16 were fully operational as of 7/1/98 | Local Collaborative Planning Teams (advisory); Job Centers |
| Programs included-- state level | JTPA; ES; UI; FSE&T; TANF/JOBS/WtW; STW (shared); Trade Adjustment Assistance; Skills Development Fund; Senior Texans; Child Care Management System | Job Centers; Wisconsin Works (WtW); TANF; JTPA; ES; STW; vocational rehabilitation; UI; Workman's Comp |
| Programs excluded-- state level | Vocational education; adult education; vocational rehabilitation; Smart Jobs Fund | Technical colleges; adult education |
| Local responsibilities | Under local WDBs: one-stop career centers; JTPA; FSE&T; JOBS; CCMS; STW; [ES & UI still under TWC] | Job Centers: Wisconsin Works; JTPA; ES; vocational rehabilitation; Perkins Funds; Adult Education Funds |
| Firm-based training | Skills Development Fund; Smart Jobs Fund Programs; Self-Sufficiency Fund; Texans Work Program | Wisconsin Regional Training Partnerships |
| Performance Measures and Data Systems | Statewide goals and eight (8) core performance measures approved by Governor, with UI wage-based employment and earnings as two "key" measures (not fully implemented) | Outcomes vary with area population and unemployment and include placement and duration of employment in unsubsidized jobs and increased earnings |
| Distinctive features | TANF is largely "work first" | Threats from "work first"; early and active Job Centers |
(5) Most important of all, performance measures as usually constructed provide information only on short-term benefits, but the long-run effects may be quite different--and, in most job training programs, the benefits tend to vanish four to five years after the program.[37] Thus, short-term performance measures are poor indicators of long-run benefits. As a North Carolina official remarked about the state's Common Follow-up System, the underlying question is,
Have the clients gotten the basic underpinnings of an education that will allow them to transfer their knowledge, or every time they change jobs are they going to have to be re-educated because they didn't get the basic foundation to begin with? That's an issue that's not necessarily going to be answered by our follow-up system. Nobody seems to be paying attention to that question since we have had a political shift to the right in our legislature--that has not been a particularly hot question on their minds.
Short-run performance measures may also skew the kinds of services programs offer in favor of getting individuals quickly into jobs that may be of low quality and provide few opportunities for advancement.
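The first and third of these warnings can be made concrete with a simple numerical sketch. The figures below are entirely hypothetical--they are not drawn from the JTPA evaluation or from any program we examined--and serve only to show why a placement rate can diverge from a program's true impact, and why displacement shrinks the net effect of placements.

```python
# Hypothetical illustration only: invented figures, not data from this study.
# "Placement rate" is the conventional performance measure; "impact" compares
# participants' outcomes to those of an equivalent control group.

def placement_rate(placed, completers):
    """Conventional performance measure: share of completers placed in jobs."""
    return placed / completers

def impact(treatment_rate, control_rate):
    """Effectiveness in the experimental sense: treatment minus control outcome."""
    return treatment_rate - control_rate

# Program A "creams," enrolling job-ready clients: 80 of 100 completers are
# placed, but a comparable control group would reach 75% employment anyway.
a_placement = placement_rate(80, 100)   # 0.80 -- looks strong
a_impact = impact(a_placement, 0.75)    # 0.05 -- adds little

# Program B serves harder-to-employ clients: only 45 of 100 completers are
# placed, but a comparable control group reaches just 25% employment.
b_placement = placement_rate(45, 100)   # 0.45 -- looks weak
b_impact = impact(b_placement, 0.25)    # 0.20 -- adds far more

# Displacement (warning 3): if 80% of placements merely displace other workers
# (the upper bound reported in the British studies cited above), the net number
# of jobs created is far smaller than the gross count of placements.
net_jobs_b = 45 * (1 - 0.80)            # 9 net jobs out of 45 placements

print(f"A: placement {a_placement:.2f}, impact {a_impact:+.2f}")
print(f"B: placement {b_placement:.2f}, impact {b_impact:+.2f}")
print(f"B: net jobs after 80% displacement: {net_jobs_b:.0f}")
```

On these invented numbers, the program that looks better on the conventional measure is the one that adds less value; the JTPA evaluation cited above found that, in practice, the two kinds of measures were essentially unrelated.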
Therefore, performance measures may be one way to force local programs to shift their attention to outcomes, but they are seriously flawed. A different approach to program quality concentrates not on inputs or outcomes but, rather, on the process by which programs prepare individuals for the labor force. This approach would, for example, develop a conception of what high-quality programs do, and then provide information about "good practice" to other programs through a process of technical assistance. Such information is complementary to that obtained from performance measures: if a program's performance is found wanting, that information alone will not improve its outcomes unless the program has access to the resources (both informational and financial) necessary for improvement. However, this kind of information about good practice is comparatively rare. We have already noted that technical assistance is relatively infrequent; with the possible exception of North Carolina (see Box I.2), few of the states we examined provide much of it.
Perhaps more damaging, few of the local or state officials we interviewed could articulate criteria describing the quality of education and training programs, aside from the definitions embedded in performance indicators. We routinely asked administrators and policymakers to identify exemplary programs, allowing them to define what "exemplary" might mean--though we provided some examples if they requested them (see question 9 of both the local- and the state-level protocols in Appendix A). We wanted to identify exemplary local programs to visit; in addition, we wanted to see how local and state officials define quality--and to test their conceptions against the five-part conception of quality that we have developed in various studies over the past few years (see Box VI.2).
Unfortunately, few administrators could identify any specific programs as exemplary. Some mentioned examples of programs currently being put in place--for example, individuals in Iowa mentioned Workforce Development Centers that were moving toward the state's goals, and officials in Texas and Michigan also emphasized the one-stop centers that were furthest along in implementation. In the cases of Oregon and Oklahoma, with their focus on coordinated regional planning and implementation, state officials named specific regions within the state that they perceived to be well on their way to coordination. Some mentioned high-profile programs--for example, programs operated with large employers like Disney World or Universal Studios in the Orlando, Florida, area--without indicating what made them exemplary and without knowing anything about them except the program name. Several individuals, particularly in community colleges, proudly described computer-based learning programs--even though computer-aided instruction is often of poor quality, particularly for remediation.[38]
In Maryland, most state officials asserted that it was a local role to address quality--even though the state was attempting to improve quality through some technical assistance, developing performance standards, and tracking performance measures. The few programs mentioned as "exemplary" were those that had been around long enough to generate outcome data or that had a good local reputation, but officials were unable to describe the key components or implementation strategies that made these programs distinctive. Officials in Wisconsin identified one particular job center, one that appeared to make more referrals to education than others; however, the domination of "work first" in the state made it difficult to cite any exemplary education and training efforts. In North Carolina, administrators have become enamored of the "model" of the Center for Employment Training (CET) in San Jose, California--a step in the right direction, since CET shares many of the attributes of quality that we have identified in Box VI.2 and its outcomes have been shown to be better than those of other job training programs--but they have been unable to replicate all the components of CET.
To be sure, there are some specific programs and institutions that have a clearer sense of what quality requires. The North Carolina community colleges are particularly insistent on close relationships with employers, which they achieve partly through economic development activities (see Box VI.2, point 1). The Index Program in Tulsa and the Industry Consortia elsewhere in Oklahoma also emphasize close partnerships with employers. The community colleges in Maricopa County cited the importance of workforce development activities in creating close linkages with employers. In addition, they have articulated a philosophy that they call "incrementalism," recognizing that many individuals obtain their education in small, incremental steps. As one administrator described it,
Our college deals mostly with adult students, so they have short-term goals: "I want a job." That's the first goal. "I want a better job." That's another goal. So then they come back each time they need to meet a goal--they come back for the next segment.
This is identical to the idea of creating "ladders" of opportunities, from short-term job training to longer-term vocational programs with certificate and Associate degrees (Box VI.2, point 4), that could help structure other education and training programs. But few programs have articulated such ideas, and fewer still have put them into practice.
Overall, then, most local and state officials in these ten states could only rarely articulate any definition of what good and bad programs might look like. They made few references to any of the five criteria for high-quality programs presented in Box VI.2, and they did not have their own conceptions of quality either. There are many reasons for this. Many state administrators do not visit local programs very often, and local administrators rarely get out of their own areas. Many officials interpret what programs do entirely in terms of the numbers they must collect--enrollment rates; sometimes completions; and, in a few states, placement rates and other performance measures--and they interpret their jobs as monitoring performance on these narrowly defined indicators; they simply have no reason to develop a broader conception of quality. Some state administrators feel that it is not their responsibility to impose conceptions of quality on local programs. Conversely, some local administrators feel that they should mimic what the state has declared desirable rather than coming up with their own ideas. Clearly, many local and state officials had never been asked about exemplary programs. In the absence of state technical assistance, or efforts to identify and replicate "best practice," or snoopy researchers trying to find local programs to visit, this simply is not a useful question. Who would want to know the answers?
Box VI.2 Dimensions of Quality in Workforce Preparation Programs[39]

While there are many ways to conceptualize quality, five characteristics of programs are important in general--though for particular programs several of these may not matter:
But the lack of any clear conceptions of quality is unsettling. If local and state officials cannot articulate any dimensions of quality, then they cannot recognize when their policies are driving programs in the direction of higher quality or, conversely, when policies might erode quality--for example, when performance measures might actually diminish quality. They cannot identify other state policies that might improve quality, and they certainly cannot support technical assistance--an approach that presumes a state has access to information about high-quality programs. Worst of all, a world in which there are few discussions about quality--but where public discussion focuses on expanding or declining revenues, enrollments, performance measures as oblique measures of quality, and political alliances over and over and over again--is one in which fledgling efforts to improve quality can find little support. The currency of the realm is still enrollment and, in turn, the revenues generated by enrollment.
Ironically, then, the conception of a workforce development system is skewed in many states. These efforts have emphasized linkages among programs--particularly, at this stage, information about other programs--but have done much less about the quality of programs themselves, except to begin the process of defining performance standards. But the problem with this strategy is that many programs in the current education and training system are not very effective. Short-term job training programs and welfare-to-work programs show small effects, vanishing after several years, and even these are being eclipsed by "work first" initiatives. Most adult education programs keep their students for only short periods of time, teach them in dreadful ways, and provide them with little advancement (Grubb & Kalman, 1994; Young, Fitzgerald, & Morgan, 1994). Many community colleges and area schools have high dropout rates; mediocre connections with local employers; and weak services, including placement efforts. The information provided by one-stops may be useful to some students, but not to those who fail to find their way there or are unable to make good use of computer-based information. Each component of the current system bears examination and improvement--but without any clear conception of what high-quality programs look like, there is little guidance to help states face this challenge.
[28] The Perkins Act of 1984 provided federal funding for vocational education and required notification to JTPA programs.
[29] In earlier work, NCRVE determined that many such contracts were fiscally motivated, particularly because community colleges could often collect state ADA payments for JTPA clients and thus charge JTPA less than CBOs could. See Grubb et al. (1990a); Grubb and McDonnell (1996); Gula and King (1990).
[30] It proves virtually impossible to get information about the rate of movement from one program to another--for example, from a JTPA program within a community college into an Associate degree program.
[31] A partial exception occurs in Coos Bay, Oregon, where Newmark College houses various services, though a consortium of programs administers them.
[32] On the difficulties of evaluating the effectiveness of systems, see Grubb and McDonnell (1996).
[33] This example comes from earlier NCRVE research (see Grubb & McDonnell, 1996).
[34] Again, earlier NCRVE research confirmed that one of the common barriers to coordination at the local level was distrust of the services provided in other programs.
[35] The performance measures in JTPA have affected local agencies in several different ways. In Dickinson et al. (1988), about one third of SDAs appeared to be quite client-focused, and did not let performance measures cause any distortions in their operations; but about one third were standards-driven and let performance measures drive them toward low-cost programs and creaming. The remaining third of SDAs were basically clueless and were, therefore, unaffected by performance measures.
[36] In other research, JTPA programs admitted that they would enroll clients formally only after they had been in the program a week or two, eliminating the early dropouts. In addition, efforts to follow up clients to calculate placement rates often do not try very hard to locate mobile clients, who may be the least likely to be employed. Officials sometimes alluded to being able to calculate any placement rates they needed to, seeming to imply that there were still other methods of manipulating data. Despite the potential consequences of these practices for performance measures, we have never seen this subject carefully addressed.
[37] See Friedlander and Burtless (1995), especially Table 4.2, reprinted in Grubb (1996a), pp. 76-77. See also the meta-analysis by Fisher and Cordray (1996), which found that effect sizes for earnings increase gradually until the 9th quarter, at the beginning of the third year, but then decay rapidly; effect sizes for the proportion employed increase gradually until quarter 10 and then decay rapidly over the next three quarters. In a nonexperimental study in the U.S., Geraci (1984) compared short-term indicators and longer-term earnings; the correlations were generally quite low, indicating the inaccuracy of short-term measures of success.
[38] On the generally poor use of computers in community colleges, see Grubb and Associates (1999), Chapter 7; see also the Panel on Educational Technology (1997).
[39] These conceptions of quality have emerged from this study, from examining a series of supposedly exemplary programs, from the research in King et al. (1995), and from various other research over the past decade. We regard these five dimensions of quality as rough working hypotheses about high-quality programs. See also Grubb and Ryan (1999, forthcoming), Section II.4, which presents four stages of human capital development: (1) implementation, (2) the learning process, (3) changes in behavior on the job, and (4) employment and non-employment outcomes. These four stages are closely linked to the conceptions of quality presented in this section.
[40] The issue of pedagogy is badly neglected in all of vocational education and training. This section relies on Achtenhagen and Grubb (1999, forthcoming), which reviews the existing literature on vocational pedagogy. Even in the much-emulated German dual system, issues of pedagogy have not been systematically addressed, even though teacher training there is quite comprehensive. In the English-speaking countries with their less institutionalized systems, teacher training is often badly neglected, and the same is true in most transitional and developing countries. In the absence of explicit preparation in pedagogical issues, most instructors revert to didactic and skills-oriented approaches.
[41] As a particularly vivid example, Quint, Musick, and Ladner (1994) found indications of depressive conditions in one fourth to one half of the individuals in an experimental program for mothers on welfare. Depressive conditions may manifest themselves as drug or alcohol abuse or as what appears to be laziness, so an accurate diagnosis is necessary to understand the appropriate response.