
EDUCATOR, STUDENT, AND EMPLOYER PRIORITIES FOR TECH PREP STUDENT OUTCOMES

MDS-790






Debra D. Bragg


University of Illinois at Urbana-Champaign

National Center for Research in Vocational Education
Graduate School of Education
University of California at Berkeley
2030 Addison Street, Suite 500
Berkeley, CA 94720-1674


Supported by
The Office of Vocational and Adult Education
U.S. Department of Education

January, 1997


FUNDING INFORMATION

Project Title: National Center for Research in Vocational Education
Grant Number: V051A30003-96A/V051A30004-96A
Act under which Funds Administered: Carl D. Perkins Vocational Education Act
P.L. 98-524
Source of Grant: Office of Vocational and Adult Education
U.S. Department of Education
Washington, DC 20202
Grantee: The Regents of the University of California
c/o National Center for Research in Vocational Education
2150 Shattuck Avenue, Suite 1250
Berkeley, CA 94704
Director: David Stern
Percent of Total Grant Financed by Federal Money: 100%
Dollar Amount of Federal Funds for Grant: $6,000,000
Disclaimer: This publication was prepared pursuant to a grant with the Office of Vocational and Adult Education, U.S. Department of Education. Grantees undertaking such projects under government sponsorship are encouraged to express freely their judgment in professional and technical matters. Points of view or opinions do not, therefore, necessarily represent official U.S. Department of Education position or policy.
Discrimination: Title VI of the Civil Rights Act of 1964 states: "No person in the United States shall, on the ground of race, color, or national origin, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving federal financial assistance." Title IX of the Education Amendments of 1972 states: "No person in the United States shall, on the basis of sex, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any education program or activity receiving federal financial assistance." Therefore, the National Center for Research in Vocational Education project, like every program or activity receiving financial assistance from the U.S. Department of Education, must be operated in compliance with these laws.




ACKNOWLEDGMENTS

Like so many projects, this "little" paper turned into a much bigger project than anyone could have anticipated. Many talented people contributed to the effort, and I offer my deepest gratitude to them all. First and foremost, it would have been impossible to conduct this research without the involvement of twenty of NCRVE's thirty Urban Schools Network sites, especially the twenty program directors who coordinated the data collection at each site. In addition, nearly all of the nation's state-level Tech Prep coordinators provided information about Tech Prep evaluations and outcomes assessments occurring in their local consortia and states. Among my own staff, Debra Daniels, Jim Layton, Paula Puckett, Christina O'Connell, William Reger, and Sue Thomas made valuable contributions at various points along the way. Still others affiliated with NCRVE deserve recognition for their support and encouragement, especially Ruth Katz and Erica Nielsen Andrew. Both have nurtured NCRVE's Urban Schools Network with dedication and enthusiasm. Finally, many thanks go to my dear friend and collaborator, Carolyn Dornsife, whose insights have contributed in immeasurable ways to my understanding of the complexities of educational reform.

Debra D. Bragg


EXECUTIVE SUMMARY

A study was conducted to identify, classify, and prioritize student outcomes to help build a foundation for evaluating Tech Prep programs. The study involved three stakeholder groups: educators, students, and employers. All of these stakeholders were actively engaged in planning and implementing Tech Prep programs affiliated with the Urban Schools Network sponsored by the National Center for Research in Vocational Education (NCRVE). Two research questions guided the study:
  1. Collectively, how do the three stakeholder groups of educators, students, and employers conceptualize and prioritize Tech Prep student outcomes?
  2. What are the similarities and differences in how each of the three stakeholder groups of educators, students, and employers conceptualize and prioritize Tech Prep student outcomes?
The need to better understand the perspectives of various stakeholder groups toward Tech Prep has been identified by many researchers (e.g., see Bragg & Layton, 1995; Connell & Mason, 1995; Dornsife, 1992; Hammons, 1992; Roegge, Leach, & Brown, 1995). Recognizing this need, a study employing concept mapping was undertaken. Concept mapping is a structured conceptualization and statistical modeling procedure developed by Trochim (1989a) to provide a means of articulating and structuring stakeholders' ideas in a visual form called a concept map. A total of 61 stakeholders, representing 20 of the 30 NCRVE Urban Schools Network sites, participated in this concept mapping study. The participants provided rating and sort data on 98 student outcomes statements gleaned from a wide range of literature addressing Tech Prep, school-to-work, vocational-technical, and general education reform and restructuring. A panel of experts reviewed the list of statements to establish content validity. The concept mapping procedure was pilot tested and refined prior to the actual administration with participants. Preliminary and final concept maps were computed using Trochim's Concept System program for the entire group of stakeholder participants (n=61) as well as for each of the subgroups: educators (n=24), students (n=18), and employers (n=19). For the final analysis, a nine-cluster concept map was calculated, providing both quantitative and qualitative results regarding Tech Prep student outcomes.
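To make the concept mapping computations more concrete, the sketch below approximates, in Python, the core steps Trochim (1989a) describes: aggregating participants' sort data into a statement-by-statement co-occurrence (similarity) matrix, scaling that matrix into a two-dimensional point map, grouping the points through hierarchical clustering, and averaging the priority ratings within each cluster. This is an illustrative approximation only, not the Concept System program used in the study; the toy data, the choice of three clusters, and all variable names are hypothetical.

# Illustrative sketch of the computations behind concept mapping.
# Not the Concept System software; toy data and names are hypothetical.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.manifold import MDS

# Toy inputs: 6 statements, 3 participants.
# Each participant's sort assigns every statement to one of that
# participant's own piles (pile labels are arbitrary integers).
sorts = [
    [0, 0, 1, 1, 2, 2],   # participant 1's pile assignments
    [0, 0, 0, 1, 1, 1],   # participant 2
    [0, 1, 1, 2, 2, 2],   # participant 3
]
# Each participant rates every statement on a 1-5 priority scale.
ratings = np.array([
    [5, 4, 3, 4, 2, 3],
    [4, 5, 3, 3, 2, 2],
    [5, 4, 4, 3, 3, 2],
], dtype=float)

n = len(sorts[0])

# 1. Aggregate the sorts into a co-occurrence (similarity) matrix:
#    cell (i, j) counts how many participants placed statements
#    i and j in the same pile.
similarity = np.zeros((n, n))
for sort in sorts:
    for i in range(n):
        for j in range(n):
            if sort[i] == sort[j]:
                similarity[i, j] += 1

# 2. Convert similarity to dissimilarity and scale into two dimensions,
#    analogous to the point map underlying a concept map.
dissimilarity = similarity.max() - similarity
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)

# 3. Hierarchically cluster the scaled points; the number of clusters
#    (here 3) is an analyst's choice, as the nine-cluster solution was.
clusters = fcluster(linkage(coords, method="ward"), t=3, criterion="maxclust")

# 4. Mean priority rating per cluster, as reported for each labeled cluster.
mean_rating = ratings.mean(axis=0)
for c in sorted(set(clusters)):
    members = np.where(clusters == c)[0]
    print(f"cluster {c}: statements {members.tolist()}, "
          f"mean rating {mean_rating[members].mean():.2f}")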

Results showed the three stakeholder groups of educators, students, and employers gave high priority to a wide array of student outcomes. It seems nearly everything one might think of as associated with a modern high school education is seen as important for Tech Prep. In fact, all three stakeholder groups rated nearly all of the 98 student outcomes statements at a "moderate" or "high" priority level even though they were instructed to spread the ratings of the outcomes statements across the 5-point priority rating scale. In addition, results showed there were many more similarities than differences in how the three groups conceptualized and prioritized Tech Prep student outcomes. A nine-cluster concept map was deemed the most logical way to represent the results for all participants. This concept map contained the following clusters (mean cluster rating on 5-point priority scale in parentheses):

  1. Personal attributes, attitudes, and employability skills (4.13)
  2. School-to-work transition (3.96)
  3. Technology and quality management (3.92)
  4. Information use and decision making (3.89)
  5. Work and interpersonal relationships (3.75)
  6. Educational attainment (3.68)
  7. Communications (3.61)
  8. Math and science (3.46)
  9. Democratic and participatory strategies (3.29)
When concept maps were created for each of the subgroups, many of the same clusters were apparent. In fact, all three subgroups sorted virtually the same sets of outcomes statements into the three clusters labeled "personal attributes, attitudes, and employability skills," "school-to-work transition," and "work and interpersonal relationships." In addition, all three stakeholder groups created both vocationally oriented clusters (e.g., work and interpersonal relationships) and academically oriented clusters (e.g., math and science). These clusters were physically separate from one another on all of the concept maps, giving the impression that outcomes associated with vocational and academic education are distinct and independent of one another. However, in all of the concept maps, stakeholders created one or more clusters that clearly drew outcomes from across the traditional vocational and academic curriculum. These clusters contained outcomes statements having to do with technology, information use, decision making, work, and management. Outcomes from such disciplines as the humanities, social studies, science, and vocational-technical education were contained in these clusters. Typical of this kind of cluster is one created by students labeled "work, technology, and information use" or one developed by employers labeled "technology and quality management." Within each of these clusters is a nucleus of outcomes linking vocational and academic subject matter, contributing ideas for the integration of vocational and academic education for Tech Prep.

Beyond these areas, some important differences in how the three stakeholder groups perceived the Tech Prep student outcomes were apparent. Particularly in sorting and rating outcomes related to education, and specifically academic subjects, there was a great deal of disparity in how the stakeholder groups perceived student outcomes. For example, educators and students gave higher priority ratings than employers to sets of educational attainment outcomes such as graduating from high school, making progress at grade level, and graduating from a two-year postsecondary college. Employers gave slightly higher priority to clusters of vocationally oriented outcomes, although all three stakeholder groups tended to give vocationally oriented clusters high priority ratings while academically oriented outcomes received lower, though still moderate, ratings.

Clusters linked to the academic areas of social studies and humanities received the lowest ratings. In fact, the cluster labeled "democratic and participatory strategies" created by employers was rated lowest of all clusters, with an average rating of 2.99. Recalling that the federal Tech Prep Education Act specifies that Tech Prep comprise mathematics, science, English/communications, and vocational-technical education, this rating may not be surprising. Like the federal law, most local and state policies associated with Tech Prep have emphasized math, science, English/communications, and vocational-technical education over the humanities or other liberal studies. Consequently, the stakeholder participants' responses may reflect a bias in the public policy, influencing how respondents rated various vocational and academic outcomes statements. Of course, this study examines only perceptions, not actual implementation. Therefore, it is not possible to determine whether respondents have experienced a shift in curriculum focus.

Related to this concern, many local consortia and state agencies profess that a primary purpose of Tech Prep is to "eliminate the general track," following a vision of Tech Prep articulated by Parnell (1985). Although difficult to determine from these data, it is possible that reforming the general track by emphasizing math, science, and technology may lead to less emphasis on the traditional social and democratic functions of public education; there is only so much time in a school day. Yet, even though the data suggest such a prioritization is occurring among the Tech Prep student outcomes, it is doubtful that respondents explicitly weigh these ideas as tradeoffs of courses and content (subject matter) within the curriculum. Certainly, more research is needed to understand the actual changes occurring within the curriculum and their subsequent effects on students.

In summary, this study attempted to better understand Tech Prep student outcomes from the perspectives of educators, students, and employers actively engaged in implementing Tech Prep. Knowing how these groups conceptualize student outcomes has important implications for understanding the fundamental objectives of Tech Prep, for planning and implementing Tech Prep and related school-to-work programs, and for assessing outcomes. Also, by uncovering various conceptualizations of Tech Prep, it is possible to identify conflicting perspectives held by disparate stakeholder groups, possibly revealing gaps in the logic that underpins the Tech Prep approach. Using this study as a model, further research could be conducted with still more stakeholder groups (e.g., policymakers, administrators, counselors, parents) and in other localities, such as rural and suburban areas. As Tech Prep and school-to-work program implementation continues, more attention must be devoted to student outcomes. Only by better understanding various stakeholder perspectives can future evaluations and outcomes assessments be expected to produce results useful to the nation's goal of reforming education.


INTRODUCTION

Reform has been a priority for the educational community and policymakers at all levels of government over the past decade. Following the publication of A Nation at Risk (National Commission on Excellence in Education, 1983), the country witnessed successive attempts to modify and improve public education. Although a great deal of attention has been paid to making the changes, much less care has been taken in determining the effects of the various reforms on students. Part of the issue relates to time, since systemic reform can take years to implement and institutionalize, delaying outcomes assessment for students who have experienced a completely restructured program. Another concern is that various stakeholder groups are not able to reach consensus on the outcomes they believe students should achieve. Educators push for higher academic standards while employers believe students should be better prepared to go to work (U.S. Department of Labor, 1991), creating the potential for conflict between the two groups. Even if stakeholder groups reach agreement on desired outcomes, a problem arises when appropriate assessment measures and methodologies do not exist. This is apparent when the identified student outcomes require measures that differ from traditional standardized academic examinations. Newer, alternative forms of outcomes assessment such as performance or project-based assessment are beginning to be employed, but more development is needed before these assessments can be used on a large scale.

Each of these issues is applicable to educational reforms associated with the nation's Tech Prep[1] and related school-to-work opportunities programs.[2] These concerns are apparent in all areas, but especially in urban and rural areas, where educational reform has been particularly difficult. Issues related to implementing effective school-to-work related reforms in urban areas are well-documented in a special issue of Education and Urban Society, edited by Seidman and Ramsey (1995). In that issue, Bragg and Layton (1995) point out that both pieces of federal legislation, the Tech Prep Education Act and the School-to-Work Opportunities (STWO) Act, contain directives to ensure funding is distributed to urban and rural schools and two-year colleges. However, like most other reforms, little is known about the quality of these programs or their impact on students in urban or rural areas.

In most localities, Tech Prep and school-to-work programs have been in operation for less than five years (Bragg, Layton, & Hammons, 1994). As a consequence, too little time has passed for students to have completed an entirely reformed secondary-to-postsecondary Tech Prep program. In many areas of the country, high school graduates participating in Tech Prep first began to matriculate to the postsecondary level during the 1994-1995 or 1995-1996 school years, creating an unsettling lack of information regarding student outcomes. The National Assessment of Vocational Education (NAVE) drew a similar conclusion regarding the need for systematic evaluation of Tech Prep. Boesel, Rahn, and Deich (1994), authors of the Tech Prep section of the NAVE report, recommended increased emphasis on evaluation of Tech Prep programs "using longitudinal studies of student participation, retention, and educational and employment outcomes" (p. 131). They urged government to collect better information regarding how students participate in Tech Prep at both the secondary and postsecondary levels, and how they progress through the system. The study also emphasized the need for more Tech Prep programs to be developed in sites with high concentrations of special population students and for evaluation to be employed to monitor the effectiveness of Tech Prep programs for these students. Often, urban schools and colleges enroll disproportionately large numbers of special population students, making it extremely important to understand how Tech Prep involves and affects these students.

To provide a foundation for future efforts to evaluate programs and assess student outcomes relative to Tech Prep, additional research was needed. This study was designed and conducted to identify, classify, and prioritize student outcomes for Tech Prep. To address concerns that would inevitably be raised concerning stakeholders' differing perspectives toward student outcomes, three stakeholder groups were engaged in the study. They were educators, students, and employers.[3] The research questions that guided the study follow:

  1. Collectively, how do the three stakeholder groups of educators, students, and employers conceptualize and prioritize Tech Prep student outcomes?
    • What level of priority does the collective group assign to each student outcome?
    • How does the collective group organize and classify (concept map) the student outcomes?
    • What level of priority is attributed to the clusters of student outcomes that emerge in the concept map?
  2. What are the similarities and differences in how each of the three stakeholder groups of educators, employers, and students conceptualize and prioritize Tech Prep student outcomes?
    • What are the similarities and differences in the priorities attributed to the student outcomes by each of the three stakeholder groups?
    • What are the similarities and differences in how each of the three stakeholder groups organize and classify (concept map) the student outcomes?
    • What are the similarities and differences in the priorities attributed to the clusters of student outcomes of each of the three stakeholder groups?
Understanding how the three stakeholder groups of educators, students, and employers collectively and independently conceptualize student outcomes can have numerous benefits for those who are developing and implementing Tech Prep programs and policies. With better information about the student outcomes associated with Tech Prep, it will be possible to design assessments that are more highly focused and meaningful to the various critical stakeholder groups. By knowing which outcomes are important to particular stakeholders, it may also be possible to develop Tech Prep programs that are more likely to produce desired outcomes.

With better information about student outcomes, still other benefits are feasible. Researchers, policy leaders, and practitioners may be more likely to determine the circumstances under which the fundamental elements of Tech Prep (e.g., articulation, applied academics, stakeholder collaboration, and education-business partnerships) are most tenable. In addition, information produced by outcomes assessments can contribute to further refinement of the fundamental Tech Prep concept. Making Tech Prep more accessible and effective for "all" students rather than limiting it to the "neglected majority" is a particularly important issue for urban localities such as those engaged in this study. Finally, having better quality information about student outcomes can help to build more accountability into any evolving Tech Prep system, resulting in better program implementation and evaluation at all levels.

Assessing Student Outcomes for Tech Prep

Existing legislation and much of the literature on Tech Prep present ways the educational process ought to be configured and implemented but neglect the results that should be evident for students who participate in and complete the programs. For example, The Unfinished Agenda, prepared by the National Commission on Secondary Vocational Education (1984) and one of the first public documents to refer to the Tech Prep concept, recommended that Tech Prep better coordinate secondary and postsecondary education, be grounded in applied academics and technical studies, and ease the student transition into two-year postsecondary education. Like other early writings on Tech Prep, The Unfinished Agenda attempted to define what the educational program should look like and be about, but omitted specifying what students should expect to gain from having participated in such a program.

Similarly, Dale Parnell's vision of Tech Prep focused on what he thought should be the key elements of the Tech Prep process, but he identified few student outcomes. In his book The Neglected Majority (1985), Parnell advocated high-quality vocational education, applied academics, and strong relationships between business and education. He argued forcefully to refocus schooling to better meet the needs of the "neglected majority" of high school students who were unlikely to obtain the baccalaureate degree. The 2+2 Tech Prep Associate Degree (TPAD) model conceived by Parnell and further developed by Hull and Parnell (1991) was envisioned to be an equivalent track in rigor and stature to college prep.

In the TPAD model, a common core of "math, science, communications, and technology--all in an applied setting" (Parnell, 1985, p. 144) was to be taught during the last two years of high school and the first two years of postsecondary education at a community, junior, or technical college. Ultimately, this articulated and applied secondary-to-postsecondary educational track was intended to culminate with a two-year associate degree, "the preferred degree for employers seeking to fill a broad range of mid-level occupations," according to Parnell (p. 145). In that statement, Parnell identified a student outcome that is widely associated with Tech Prep: completion with a two-year college associate degree. Other student outcomes were not as clearly specified in the writings of Parnell or others influential in formulating the Tech Prep education approach.

The Tech Prep Education Act

The federal government provided some clarity regarding student outcomes that should be assessed when it passed the Tech Prep Education Act, Title IIIE of the Carl D. Perkins Vocational and Applied Technology Education Act of 1990. According to the law, a Tech Prep education program means
a combined secondary and postsecondary education program which--
(A) leads to an associate degree or 2-year certificate;
(B) provides technical preparation in at least 1 field of engineering technology, applied science, mechanical, industrial, or practical art or trade, or agriculture, health, or business;
(C) builds student competence in mathematics, science, and communication (including applied academics) through a sequential course of study; and
(D) leads to placement in employment. (U.S. Congress, P.L. 101-392, 1990)
Using this definition as an indicator of the outcomes that could potentially be associated with Tech Prep, it is apparent that students who finish the program should obtain an associate degree or two-year certificate, as was specified by Parnell (1985). Adding to the outcome of an associate degree, this definition alludes to outcomes linked to student competence in targeted vocational and academic subjects as well as job placement. The federal law also encourages state agencies to give special consideration to local grant applications that provide apprenticeships or transfer to four-year baccalaureate-degree programs, suggesting that student outcomes could be expanded beyond placement in entry-level jobs--a primary outcome of traditional vocational education programs--to include outcomes pertaining to further training and education, including four-year postsecondary education.

States are also advised to encourage local Tech Prep programs to address dropout prevention and re-entry, the needs of minority youths, youths of limited English proficiency, youths with handicaps, and disadvantaged youths (U.S. Congress, 1990). An "essential element" of the legislation requires that special populations be ensured equal access to the full range of Tech Prep programs, including support services. As such, the federal legislation ensures that Tech Prep not be limited to a select group of students such as the neglected majority or the traditional college-bound, but be inclusive of all students.

It is important to note that while the federal legislation provides some direction in terms of the kinds of student outcomes that should be assessed relative to Tech Prep, the law does not specify that outcomes assessment or any other form of program evaluation be carried out at the local or state levels.[4] According to Layton and Bragg (1992), however, when Tech Prep programs first began to be implemented, several states built evaluation into the system of performance measures and standards required by the Carl D. Perkins Vocational and Applied Technology Education Act of 1990 (commonly referred to as "Perkins II").[5] Even so, at that time only 40% of the states identified student outcomes for local Tech Prep programs, and most often that outcome was academic skill attainment, an outcome mandated by Perkins II. Yet, even then, state officials were questioning how to measure academic attainment, and few other student outcomes were being proposed. (For additional discussion of how states have conducted evaluation within the current political context of Perkins II, see Hoachlander & Rahn (1992); McCaslin & Headley (1993); and Stecher, Hanser, & Hallmark (1995). Even though these studies are not directed toward Tech Prep specifically, they do examine how local and state entities have implemented related student outcomes measures according to the Perkins II mandate.)

Formal Evaluations of Tech Prep

Program evaluation has been one of the most neglected components of Tech Prep since the concept became visible in the mid-1980s. In 1988, McKinney, Fields, Kurth, and Kelly reported a lack of attention paid to program evaluation for articulated vocational-technical education programs such as Tech Prep. A 1992 study by Dornsife confirmed that evaluation remained a weak component of local Tech Prep programs. Her study indicated evaluation occurred only in the most advanced Tech Prep programs, and even there the primary goal was to track course enrollments. Rarely were Tech Prep program administrators monitoring program completion, job placement, or other outcomes that reflect student performance.

In 1993, two years after federal funding became available to plan and implement local Tech Prep programs, local coordinators were asked to rate the stage of implementation of evaluation in regard to Tech Prep programs funded with federal Title IIIE funds (Bragg et al., 1994). Among respondents, who represented nearly 50% of all local Tech Prep consortia in the United States, 40% reported they had not even begun to implement formal evaluations of their Tech Prep programs. Another 30% indicated their consortia were in the planning stage of evaluation, showing that only a minority of Tech Prep consortia were actively implementing formal evaluations, and most of these efforts were very preliminary.[6] Overall, evaluation-related activities were rated among the lowest of 30 potential components of a Tech Prep program, indicating evaluation continued to be neglected within the first year or two after Tech Prep programs acquired Title IIIE funds.

Generally, indicators of student performance relative to Tech Prep have been compliance-oriented, serving to demonstrate accountability to governmental units rather than to improve local programs (Dornsife, 1992). Documentation of student enrollments and program completion, primarily at the secondary level but also at the postsecondary level, has been used most extensively. A doctoral dissertation completed by Hammons in 1992 reported similar findings. However, Hammons did identify outcomes for Tech Prep related to student careers and to attitudes and perceptions associated with education and employment, broadening the pool of outcomes that could be associated with student participation in and completion of Tech Prep. This study made an important contribution to the literature in that it did not confine itself to a narrow set of outcomes but, rather, considered a wide array of potential performance indicators for Tech Prep.

Research conducted by Bragg et al. (1994) concurred with Hammons' earlier conclusions that a broad set of outcomes could be associated with Tech Prep. When local coordinators were asked to rate the priority that should be given to 17 student outcomes, 15 were given a "high" or "very high" priority rating, suggesting evaluations of Tech Prep should be broadly conceptualized and not limited to a few compliance-oriented measures. The following 15 student outcomes were given a "high" or "very high" priority by local coordinators:

  1. Improved knowledge and skills in math
  2. Improved problem-solving, thinking, and reasoning skills
  3. Increased employability skills and work readiness
  4. Increased matriculation from secondary to postsecondary levels
  5. Increased awareness of and interest in technical careers
  6. Improved knowledge and skills in English/communications
  7. Increased knowledge and skills in vocational areas
  8. Improved knowledge and skills in science
  9. Increased motivation for learning
  10. Increased secondary school completion
  11. Increased interpersonal skills (team, leadership)
  12. Increased postsecondary school completion
  13. Increased employability in high-wage jobs
  14. Increased satisfaction of students and graduates with jobs
  15. Increased self-esteem

The National Evaluation of Tech Prep Education

In October 1992, the U.S. Department of Education contracted with Mathematica Policy Research (MPR), Inc. to conduct an evaluation of Tech Prep implementation across the United States. This longitudinal evaluation has a five-year scope and relies on three distinct data collection methods: (1) a mail questionnaire involving state Tech Prep coordinators in the fall of 1993 and 1996; (2) a four-year annual mail census survey involving local Tech Prep consortia starting in the fall of 1993; and (3) case studies with ten local Tech Prep consortia, also conducted over a four-year period beginning in 1993. This comprehensive national evaluation contains some information about how local and state Tech Prep programs are being evaluated, including how selected student outcomes are being operationalized.

According to Hershey and Silverberg (1994), during FY 1993 all states monitored local Tech Prep implementation by having local consortia make progress reports, usually once or twice per year. These reports typically asked local consortia to document how grant funds were used or how particular processes were functioning (e.g., staff development, consortium membership, or planning activities). According to Hershey and Silverberg, 30 states required that local consortia report program evaluation activities or results. The majority of states required local consortia to inform them about the number of students in Tech Prep. Just over one-half of the state coordinators indicated their states required data on student outcomes:

State agencies most frequently required outcome data on secondary school program completion (23 states), postsecondary program enrollment (23 states), postsecondary program completion (20 states), and students' academic skills (17 states). Reports on job placements and students' technical skills/competencies were required in 15 and 14 states, respectively. (p. 29)
These findings contrast with other results reported by Hershey and Silverberg (1994) where local coordinators described evaluation activities as in only a "planning" stage, raising questions about how many local consortia could actually provide the kind of information reportedly mandated by state agencies. Similar to the findings reported by Bragg et al. (1994), many local consortia were planning to collect outcomes data and create computerized databases; however, very few had actually accomplished that goal. Most of the computerized databases were being planned to track transcript data (i.e., courses taken or completed and grades). Fewer were designed to monitor and report student performance relative to specific vocational-technical or academic competencies or work-related experiences (i.e., work-based learning experiences, job placements, or wages).

The national evaluation clearly documents that in FY 1993 very few consortia were engaging students in formal evaluation activities or actively collecting data on student outcomes. When attempting to understand how students move through the Tech Prep system, secondary to postsecondary and beyond, the number of local consortia able to provide student outcomes data on participation and completion was so limited as to make most of the estimates meaningless. For example, when asked to provide the number of Tech Prep participants at the secondary level, only 250 of 702 local consortia provided estimates. Far fewer provided estimates regarding high school graduation, employment after high school, postsecondary entry, postsecondary completion, or employment after postsecondary completion. The national evaluation does not attempt to collect other student outcomes data such as "skills levels, competencies, or grades because they are measured, computed, and interpreted differently across localities" (Bragg et al., 1994, p. 117). Therefore, the census designed to document Tech Prep implementation nationwide will never report student outcomes beyond student participation and completion. This is a concern since understanding how students benefit from Tech Prep would be useful beyond knowing simply whether they participate in and complete prescribed phases of the Tech Prep system.

Here, the national evaluation's primary goal of documenting accountability is apparent, but that may not be as helpful to local and state practitioners as other kinds of evaluation. For example, understanding technical and academic competency attainment among students could help to determine the effectiveness of particular aspects of the school-based curricula. Furthermore, identifying employability skill levels among students could help to determine the quality of the work-based curricula. Fortunately, the related case studies associated with the national evaluation delve into these student outcomes. A report documenting the first in a series of site visits conducted by Hershey, Silverberg, and Owens (1994) focused almost entirely on four processes: (1) articulation; (2) curriculum and instructional enhancement; (3) student recruitment, guidance, and career development; and (4) consortium organization and coordination. Data collected from school records for two cohorts of students in each of the ten case-study sites should help to expand the universe of student outcomes being investigated along with the national evaluation and help to address questions about how students benefit from Tech Prep.

State Evaluations of Tech Prep

Beyond the data collected by Hershey and Silverberg (1994) and earlier by Layton and Bragg (1992) from state Tech Prep coordinators, little is known about the Tech Prep evaluation activities sponsored or conducted independently by local consortia and states. To gain a better understanding of how states are evaluating Tech Prep, this investigator led a study in 1994 and 1995 to identify and document existing Tech Prep evaluation activities sponsored or conducted by state agencies. Using a letter mailing to solicit evaluation documents, follow-up telephone interviews, and a document review, the general character of existing state-level evaluations was assessed and documented. In conducting this review process, 33 states provided information regarding their evaluation activities, nearly all reporting findings for the 1993-1994 academic year or earlier, although a few states did provide evaluation reports conducted since that time.

Of all the states responding to our request for information about evaluations, five indicated they were primarily participating in and relying on the U.S. Department of Education-sponsored national evaluation of Tech Prep, research on Tech Prep conducted by the National Center for Research in Vocational Education (NCRVE), or other studies to inform them about how Tech Prep is progressing. One of the state coordinators indicated the federal appropriation for Tech Prep was too limited to allow funds to be diverted away from local and state program implementation. Still, most of these states were monitoring Tech Prep implementation as they did other similar programs, and some were engaging local consortia in formal self-assessments as well.

Twenty-eight states were engaging local consortia in data collection, either by having state staff design and carry out the evaluation, usually in conjunction with local personnel, or by contracting the evaluation to a third party. When a third party was chosen, it was often vocational-technical education personnel employed by a state's land-grant university. Several state agencies have established strong relationships with vocational-technical education units in land-grant universities for the purposes of conducting formal program evaluation. Many of these kinds of units are tapped to evaluate Tech Prep programs, including the vocational-technical units in land-grant universities in Illinois, Minnesota, Missouri, Virginia, and Wisconsin. In addition, third-party evaluations were conducted by other universities, regional education laboratories, or private consulting firms in California, Colorado, Florida, Ohio, Texas, Washington state, and West Virginia. A review of the internal and external evaluation documents produced by state agencies and third-party groups suggested that evaluations conducted by third-party agencies were more comprehensive and rigorous than the internal evaluations conducted by state agencies. More of the third-party evaluations used a longitudinal design and standardized data collection procedures, and more provided a comprehensible definition of the population and sample of Tech Prep students and other stakeholders engaged in the study.

Thirteen of twenty-eight states reported they were conducting Tech Prep evaluations, but did not provide formal reports showing data, findings, conclusions, or recommendations. Rather, most provided copies of guidelines, surveys, and site-visit instrumentation used to collect data. Most indicated that although data was being collected and sometimes already available for use by local and state personnel, a formal report was not distributed.

Sixteen states provided copies of formal evaluation reports, most reporting results for FY 1994, although the reports for Illinois, Nebraska, New Hampshire, Ohio, Washington state, and West Virginia included FY 1995 data. None of the reports attempted to compare Tech Prep in disparate settings such as rural, urban, and suburban. (See Table 1 for a summary of the goals and methods used to carry out the sixteen formal evaluations of Tech Prep programs reported here. Appendix A contains additional information about each of these state-level evaluations.)

About one-half of the studies were longitudinal in design, typically lasting for three years to comply with the three-year time period for grant awards specified in the federal Tech Prep Education Act. Most of the studies utilized multiple methods--typically a document review, site visits, and surveys involving various stakeholder groups. Sometimes the evaluations also used observational assessments and secondary analysis of data supplied by MPR Associates, Inc., the organization conducting the national evaluation for the U.S. Department of Education. Eight of the thirteen evaluations had plans to examine student outcomes, although often these outcomes were not specified in the evaluation reports. Several of the reports stated that student outcomes could not be examined because of the early stage of implementation of Tech Prep. However, a few evaluations did report findings relative to Tech Prep student outcomes. Several of the state-level evaluations that made such claims are discussed in this section.

Table 1
Summary of Selected State-Level Tech Prep Evaluations

State / Evaluation Goals and Methods / Source(s)
California
No. of Tech Prep Consortia: 83

Goals:

Determine overall program effectiveness by assessing local and state program implementation and Tech Prep practices. The study focuses on four key areas of concern:

1. a description of Tech Prep education efforts

2. an assessment of program implementation

3. an evaluation of program effectiveness

4. the identification of effective program implementation strategies (p. 14)

Methods:

Five-year longitudinal study designed to evaluate Tech Prep program implementation by a third-party agency. Methods include document review; site visits and observational assessments; analysis of data submitted to Mathematica Policy Research (MPR) Associates, Inc.; and survey questionnaire administration for business/industry (pp. 14-17).

Major Findings Are Presented in the Following Areas:

  • Participation figures for students and professional staff at the secondary and postsecondary levels for 1993-1994.
  • Qualitative description of consortia more fully implementing Tech Prep than those less advanced.
  • Benefits of California's resource consortia
  • State-level program design and implementation and special programs
  • Perceptions of business and industry (based on questionnaire responses)
  • Promising practices and potential pitfalls listed according to organizational strategies, curriculum development, staff development, and special populations
  • No student outcomes results presented (pp. 18-31)

Rubin (1994)
Colorado
No. of Tech Prep Consortia: 33

Goals:

The overall purpose of the evaluation is to assist the individual projects and state program in meeting goals by providing a comprehensive and objective assessment of processes and outcomes. Specific outcomes of the evaluation are

  • to assess the extent to which each consortium accomplished stated goals and objectives.
  • to provide information that can be used to assess the feasibility and effectiveness of Tech Prep approaches.
  • to make recommendations for improving Tech Prep.
  • to suggest ways evaluation findings can be applied in other vocational education settings. (pp. 2-3)

Methods:

Three-year longitudinal design to parallel three-year funding cycle for Tech Prep consortia by a third party. The data collection methods include analysis of data submitted to MPR Associates, Inc., document reviews, surveys of program coordinators, and site visits (pp. 4-7).

Major Findings Are Presented in the Following Areas:

  • Consortia membership
  • Incidence of various articulation methods and agreements
  • Incidence of use of various curricular strategies by vocational programs, career clusters, integration strategies, instructional strategies, and so on
  • Incidence of use of various support processes such as marketing, student selection and recruitment, and assessment
  • Involvement of communities (e.g., business and parents)
  • Methods used to recruit special population students, services provided, and gender equity activities
  • Defining characteristics of Tech Prep students, number of student participants, and demographic and academic characteristics of students
  • Program staffing arrangements, incidence, and perceived benefits of staff development activities
  • Consortia funding and allocations
  • Identification of plans for tracking 31 outcomes for Tech Prep students (Outcomes most often reported by local consortia are course completion, program completion/graduation, and skills/competencies gained by secondary-level Tech Prep students. Some consortia reported the same outcomes at the postsecondary level.)
  • Employment outcomes planned via follow-up surveys (i.e., job placement and average entry-level earnings for completers)
  • Incidence of use of internal evaluation methods (pp. 9-44)

Keller (1995)
Delaware
No. of Tech Prep Consortia: 13

Goals:

No specific evaluation goals are presented in the report.

Methods:

Compilation and analysis of extant data sources are used to create a summary document on Tech Prep in Delaware. Secondary student data is collected from the Student Registration Form and the VAX Computer System at the Department of Public Instruction. Postsecondary student data is based on Delaware Technical and Community College's internal data system through social security numbers. Non-student data comes from workshop sign-in sheets and reports; graduate follow-up surveys; surveys distributed to students, parents, education personnel, government officials, and business and industry representatives. (Foreword)

Major Findings Are Presented in the Following Areas:

  • Tech Prep enrollment demographics (1993-1994)
  • Tech Prep enrollments by technology (1993-1994)
  • Tech Prep versus non-Tech Prep high school dropout rates
  • Achievement score comparisons for Grade 10
  • Tech Prep admission score comparisons for the community college
  • Postsecondary entrance by advanced placement for seniors
  • Total enrollments by advanced placement
  • Demographic rating survey data by survey letter, program rating form, and Fall 1994 survey rating data (pp. 1-15)

Campbell (1995)
Illinois
No. of Tech Prep Consortia: 40 funded in 1991; 7 demonstration sites funded in 1993-1995

Goals:

The stated goals of this third-party evaluation were to describe micro-level Tech Prep programming in selected sites and determine the effects of Tech Prep participation on students (p. 2).

Methods:

During FY94 two demonstration sites were studied, and two additional sites were selected using criteria and data from the FY93 National Tech Prep Survey by MPR Associates, Inc. Student samples were determined at each site using the categories of Tech Prep, non-Tech Prep, and pre-Tech Prep. Student data was collected via transcript review, testing, and group interview. The ACT Work Keys instruments Reading for Information and Applied Mathematics were administered to Tech Prep students to determine progress and level of proficiency. Some students were tested with the Work Readiness instrument developed by project staff. In addition, interview data was collected from Tech Prep students, vocational and academic faculty, administrators, and counselors (pp. 2-4).

Major Findings Are Presented for Four Sites in the Following Areas:

  • Student outcomes including coursetaking patterns; class rank percentile; mean scores on the ACT in English, mathematics, reading, science reasoning, and composite; work readiness; ACT Work Keys reading and applied mathematics.
  • Student perceptions of Tech Prep instruction
  • Staff perceptions of Tech Prep (pp. 4-15)

Roegge & Evans (1995)
Minnesota
No. of Tech Prep Consortia: 29

Goals:

None reported in preliminary report.

Methods:

A follow-up evaluation system designed by a third-party evaluation unit uses data collected from the Tech Prep Identifier Form, Data Submittal Form, Career Planning Survey, High School Follow-Up Questionnaire, and Employer Follow-Up Form. Most findings appear to be quantitative, although some qualitative findings appear in the preliminary report without an identified source.

Major Findings Are Presented in the Following Areas:

  • Tech Prep high school follow-up participation
  • Student-related consortia data

Brown, Pucel, Johnson, & Kuchinke (1994)
Missouri
No. of Tech Prep Consortia: 12

Goals:

Four objectives were given for the evaluation:

  • To describe how Tech Prep has been conceptualized in Missouri.
  • To describe the processes undertaken as a part of each Tech Prep initiative.
  • To identify outcomes associated with Tech Prep implementation.
  • To determine relationships among Tech Prep outcomes and mission and implementation models, characteristics of consortia, and implementation processes (pp. 7-9).

Methods:

An evaluation conducted by a third party utilized document (RFPs) review, a Tech Prep coordinator survey, and structured interviews with all 12 Tech Prep coordinators (pp. 10-11).

Major Findings Are Presented in the Following Areas:

  • Consortia membership, staffing, and funding
  • Incidence of use of various articulation methods, agreements, and related evaluation
  • Methods used to market, recruit students, involve counselors, and assist students with career planning
  • Incidence of various types of staff development
  • Incidence of curriculum reform involving collaboration, integration, and the development of career clusters
  • Incidence of program evaluation and the identification of barriers (pp. 11-23)

Ruhland, Custer, & Stewart (1994)
Nebraska
No. of Tech Prep Consortia: 6

Goals:

Document local consortia progress on Nebraska's Tech Prep career goals implementation as of June 1995.

Methods:

The report represents a compilation of the implementation status surveys and self-assessments completed by each local consortium. Based on the state's "CAREERS" model, results are presented in the following areas:

  • Commitment of leaders
  • Articulation agreements
  • Relevance of instruction
  • Educate staff
  • Enrich career guidance
  • Resourceful marketing
  • Systematic review and revision

Major Findings Are Presented in the Following Areas:

  • Number of students served at the secondary and postsecondary levels
  • Percentage of prospective students served by Tech Prep
  • Percentage of school districts served by Tech Prep
  • Status report on the state's progress in addressing seven goal statements related to Tech Prep "Careers"
  • Summary findings are reported by site

Jurgens (1995)
New Hampshire
No. of Tech Prep Consortia: 5

Goals:

This third-party evaluation was designed to be formative, ongoing, and focused on process. The purpose was to identify the strengths and weaknesses in a consortium's Tech Prep initiative. Based on the findings, local consortia were expected to be actively involved in further developing strengths and remediating weaknesses, creating a "continuous state of improvement" (p. 7).

Methods:

The methods involved site-based self-study focused on the following components: administration and organization; articulation agreements; business, industry, and community involvement; curriculum development; impact on students; promotion and marketing; and staff development. Following the self-study activity, site visits with personal interviews and observations were conducted by third-party consultants and state staff ranging in number from eight to ten members (pp. 7-10).

Major Findings Are Presented on a Site-by-Site and Statewide Basis in the Following Areas:

  • A vision for Tech Prep
  • Coordination and communication of Tech Prep activities
  • Coordination of Tech Prep with other reform efforts
  • Postsecondary involvement
  • Tech Prep funding
  • Evaluation and continuous improvement
  • Articulation agreements
  • Business, industry, and community involvement
  • Curriculum development/programs of study
  • Tech Prep student definition
  • Career guidance and counseling
  • Promotion and marketing of Tech Prep
  • Staff development activities
  • Performance indicators suggested in the areas of student retention/completion of secondary school, job placement/education continuation, academic competency gains, work or job skill attainment, and vocational-technical competency attainment (pp. 104-116)

Hammons & Pittman (1995)
North Carolina
No. of Tech Prep Consortia: 63

Goals:

1993-1994 Tech Prep project evaluations were conducted for the purposes of collecting and reporting data on the progress of projects funded under the federal Perkins II legislation (p. 1).

Methods:

In a study conducted by state agency personnel, local Tech Prep consortia representatives presented a structured executive summary of their projects' progress in meeting specified objectives for 1993-1994. Each consortium had 30 minutes to address several categories such as articulation and curriculum integration. A panel of reviewers from the North Carolina Department of Public Instruction and the North Carolina Department of Community Colleges rated progress in each category on a four-point scale indicating goals were Met, Partially Met, Not Met, or Not Applicable (p. 2).

Major Findings and Recommendations Are Presented on a Site-by-Site and Statewide Basis in the Following Areas:

  • Articulation efforts
  • Collaboration
  • Curriculum integration
  • Curriculum improvement
  • Guidance services
  • Staff development
  • Marketing efforts
  • Special populations
  • Achievement results (Data elements most frequently collected include change in Tech Prep enrollment, gains in student grades, gains in postsecondary enrollments, and dropout rates. Indicators not used frequently include standardized assessment instruments and changes in the percentage of students needing remediation at the colleges.) (pp. 4-5)

North Carolina Department of Public Instruction & North Carolina Department of Community Colleges (1994)
Ohio
No. of Tech Prep Consortia: 24

Goals:

Year One of this five-year longitudinal evaluation was viewed as a critical period for collecting baseline information and data about Tech Prep implementation to date at both the state and consortia levels (pp. 1-4).

Methods:

A multifaceted data collection plan was implemented by MGT of America, a third-party evaluator. The evaluation involved (1) survey data collected from Ohio consortia in fall 1994; (2) site visits and personal interviews with key stakeholder groups; (3) surveys of students, parents, and business/industry representatives; (4) a survey about Tech Prep implementation in five other states (Florida, Michigan, New York, Oklahoma, and Pennsylvania) for the purposes of measuring progress; and (5) a multiyear telephone survey of students in the Tech Prep, College Prep, Vocational Education, and General Education tracks (pp. 1-7-8).

Major Findings Are Presented in the Following Areas:

  • State policy and practice for Tech Prep
  • The role of consortia for Tech Prep
  • Professional development of instructors and administrators for Tech Prep
  • Participants' knowledge and perception of the value of the Tech Prep program
  • The impact of Tech Prep programs on students and former students (little data was available to address this area in a meaningful way at this point in the evaluation)

MGT of America, Inc. (1995)
Rhode Island
No. of Tech Prep Consortia: 1

Goals:

The study examined eight years of program management of the Rhode Island Tech Prep Associate Degree (TPAD) program and posited assertions that students who participate in TPAD (1) are more successful in secondary education than non-TPAD students as evidenced by their performance in core subjects; and (2) participate in postsecondary education more frequently (p. 8).

Methods:

The program evaluation employed a comparison group design for outcome measures related to the above assertions. The sample comprised 1,350 11th and 12th grade TPAD students from 24 high schools in Rhode Island and 235 non-TPAD students selected by counselors and TPAD liaisons because of their similarity to students who had chosen the TPAD option. Existing instruments were used to assess performance during late spring and early summer of 1994 and information was taken from students' permanent records to create the dataset for this evaluation (pp. 8-10).

Major Findings Are Presented in the Following Areas:

  • Performance of TPAD and Non-TPAD groups on the Metropolitan Achievement Test-Verbal (MATV)
  • Grade point average in math, science, and communication for TPAD participants and comparison groups
  • Postsecondary participation rates (observed frequencies) of TPAD and comparison groups
  • Tech Prep student interviews (both high school and community college levels)
  • Performance at the postsecondary level (pp. 10-12)

Rhode Island Tech Prep Associate Degree Program (no author or date given).
Tennessee
No. of Tech Prep Consortia: 14

Goals:

The evaluation conducted by the Tennessee Board of Regents and the Tennessee Department of Education documents progress made by local consortia during the second year of Tech Prep program implementation.

Methods:

None described.

Major Findings Are Provided in the Following Areas:

  • Number of students (secondary and postsecondary) served by Tech Prep as a linkage program (vocational and applied academics enrollments provided)
  • Perceived impact of services provided by the state in both urban and rural areas
  • Descriptions of Tech Prep program planning between secondary and postsecondary institutions by occupational areas (number of articulation agreements, courses meeting university requirements, career advisement, and youth apprenticeships according to the "Tennessee Model")
  • The perceived benefits of Tech Prep programs and services in meeting the needs of special populations (i.e., placement, monitoring, assessment, evaluation)
  • The perceived impact of Tech Prep professional activities and services on guidance counselors, teachers, and others (number of activities conducted per site and number of people involved)
  • Description of the preparatory services provided for participants in Tech Prep programs
  • Perceived factors contributing to exemplary programs.

Tennessee Board of Regents & Tennessee Department of Education (1994)
Texas
No. of Tech Prep Consortia: 25

Goals:

This third-party evaluation focused on the description of Tech Prep programs at the local and state levels and the identification of best practices and effective approaches of local projects for improving occupational education (p. i).

Methods:

Multiple methods were used to collect data for the evaluation, including document reviews, two-day site visits to all 25 consortia, interviews with state and federal personnel, mail questionnaires sent to 750 consortia members (44% returned), student data, and data from MPR Associates, Inc. (pp. i-ii).

Major Findings Are Presented in the Following Areas:

  • Student data (i.e., high school student participation in Tech Prep and comparison of Tech Prep students to all students on selected demographic characteristics, postsecondary student participation in Tech Prep and comparison of Tech Prep students to all students)
  • Description of consortia membership and organization, articulation agreements, curriculum development and integration, professional development, business and industry involvement, budgets, and marketing
  • Qualitative data on best practices and effective approaches related to student participation, articulation agreements, curriculum development and implementation, professional development, and business involvement
  • Discussion of program administration issues (pp. 6-39)

Decision Information Resources, Inc. (n.d.)
Washington
No. of Tech Prep Consortia: 22

Goals:

This third-party evaluation was designed to describe Tech Prep planning and implementation processes carried out by local consortia in Washington state.

Methods:

Multiple methods were combined to describe Tech Prep planning and implementation processes using case studies and secondary analysis of data provided by local consortia to MPR Associates, Inc. The case studies were conducted in two consecutive years with four consortia, and these studies were intended to provide detailed information about planning and implementation processes along with practices implementers perceived as effective. The case studies were constructed to portray (1) an overview of the consortium, (2) recent accomplishments in key areas such as articulation, (3) strengths and concerns in the consortium's operation, and (4) issues and new directions for the local Tech Prep initiative (Owens, 1995, p. 1; Owens et al., 1995, p. 1).

Major Findings Are Presented for Each Site and Common Themes, Strengths, and Concerns Are Presented in the Following Areas:

  • Enrollments increased dramatically between 1992-1993 and 1993-1994 from 170 secondary Tech Prep students to 2,203 by the end of 1994. Most students were enrolled in business, office, or marketing programs (Owens, 1995, pp. 1-2).
  • Applied academics courses are implemented widely (Owens, 1995, p. 2).
  • In 1994, 126 Tech Prep graduates had pursued training beyond high school with nearly all enrolled in community colleges (Owens, 1995, p. 2).
  • Two-thirds of consortia reported businesses provided some sort of support to local Tech Prep efforts (Owens, 1995, p. 2).
  • The most commonly identified limitations to Tech Prep were a lack of staff, time, and money, and a lack of truly integrated curriculum (Owens, 1995, p. 2).
  • Major accomplishments across the four sites were reported in the areas of articulation, career pathways and curriculum, and promotions (Owens et al., 1995, p. 39).
  • Strengths center around the contributions of consortium directors, marketing, community college support, community support, and the development of new courses. Concerns include a lack of awareness of Tech Prep, negative attitudes toward Tech Prep and vocational education, difficulties with labor unions, staff turnover, time, and uncertainty about Tech Prep's future (Owens et al., 1995, p. 39).
  • Issues are discussed related to emerging leadership and integration with other reform efforts (Owens et al., 1995, pp. 40-41).

Owens (1995)
West Virginia
No. of Tech Prep Consortia: 15 (since 1991)

Goals:

This third-party evaluation was designed to document the implementation progress and best practices of the Tech Prep Associate Degree initiative in West Virginia (p. iii).

Methods:

Evaluators reviewed annual project reports of each of the pilot TPAD projects in the state and conducted a focus group session with coordinators of the TPAD pilot projects (p. 1).

Major Findings Are Presented in the Following Areas:

  • Conclusions drawn from site reports concerning curriculum, staff development, implementation, committees, marketing, and evaluation
  • Focus group interview responses concerning best practices; administrator involvement; business, industry, and labor involvement; barriers to TPAD implementation; state-level technical assistance; college readiness; judging progress; full implementation; and integration of TPAD

Harman & Stowers (1995)
Wisconsin
No. of Tech Prep Consortia: 22

Goals:

This third-party evaluation was designed to address pressures for accountability and program improvement information at a time when education is poised for the adoption of complex educational reform initiatives such as those proposed by the STWO Act (p. 5).

Methods:

The evaluation design is based on the concept of benchmarking, which is intended to provide the framework for school self-assessment and data collection that feed into school planning processes and continuous improvement. Wisconsin's benchmarking model relies on the identification and use of "benchmarks for Tech Prep and STW" in terms of implementation, participation, and outcomes. Self-assessment and data collection tools are used to provide focus for Tech Prep implementation; identify strengths, gaps, and problems; identify improvement areas; and decide whether changes need to be made.

Major Findings Highlight a Pilot Test of the Benchmarking Model in Six Schools Which Identified the Following Areas:

  • Local practitioners and state-level policymakers felt the benchmarking approach was a useful way to look at diverse programs such as Tech Prep or STW.
  • The self-assessment and data collection instruments were useful for understanding Tech Prep implementation, particularly the Tech Prep Implementation Checklist.
  • The benchmarking process was credited with helping practitioners develop a greater understanding of the full range of practices associated with Tech Prep and with contributing to staff development activities.
  • The process pointed to the need for common definitions within schools and across consortia in the state.
  • Most schools were not equipped "institutionally or attitudinally" for data collection related to the benchmarks. They had difficulty assessing their own programs and students (pp. 18-21).

Connell & Mason (1995)

In Illinois, an evaluation involving four secondary sites included students classified as Tech Prep, non-Tech Prep, and pre-Tech Prep (Roegge & Evans, 1995). Student data was collected from transcripts, standardized tests, and group interviews. The findings indicated that Tech Prep students took as many or more science, math, social science, and foreign language courses as pre-Tech Prep students. The results were statistically significant for advanced science courses only. Although the group of Tech Prep students had a lower class rank percentile overall than the pre-Tech Prep students, the Tech Prep students obtained significantly higher composite scores on the ACT than pre-Tech Prep students. This result was statistically significant at the p=.005 level. A small sample of students was given a "Work Readiness" instrument developed by the researchers, and the results revealed that Tech Prep students had a more "anticipatory attitude toward work" than non-Tech Prep students. The data also revealed that more Tech Prep students thought their "classes would help prepare them for a career," and they were more "sure of what they wanted to do as an adult" (p. 8).

In Rhode Island, a sample of 1,350 students was drawn from 11th and 12th grade from 24 high schools (Rhode Island Tech Prep Associate Degree Program, no author or date given). The students were grouped into Tech Prep Associate Degree (TPAD) (n=1,115) and non-TPAD (n=235) categories. "The comparison group [of non-TPAD students] was composed of similar students from TPAD schools whose guidance counselors identified them as appropriate candidates for the TPAD Program, but who had declined participation, and similar students from two non-TPAD schools whose faculty were planning to implement the Program during the 1994-95 year" (p. 9). The two groups compared closely on several demographic characteristics such as gender and ethnicity. Data used for the secondary analysis was gleaned from existing transcript and test results. Findings show the TPAD group had significantly higher grade point averages (GPAs) in math, science, and communications than the non-TPAD group, but prior to participation in Tech Prep the TPAD students had significantly lower GPAs in these subjects than the non-TPAD group. The postsecondary participation rate for the TPAD students was 60% compared to 39% for the non-TPAD students, although missing data for both groups raises questions about these estimates. Nevertheless, the rate of participation in postsecondary education suggests a sizable proportion of students are continuing their education beyond high school, an important element of Tech Prep.

Like Rhode Island, Delaware's evaluation of Tech Prep relies heavily on existing student data from secondary and postsecondary sources (Campbell, 1995). Some of the results of the evaluation are presented for Tech Prep versus non-Tech Prep students. For example, the dropout rates for Tech Prep students are lower than for non-Tech Prep students over the 1990-1991 to 1993-1994 time period, 0.39% and 5.0% respectively. Achievement score comparisons for a random sample of Tech Prep and non-Tech Prep students show the Tech Prep group's average scores in advanced skills reading and math were higher than for non-Tech Prep students. Finally, results indicate a steady increase in the number of high school students earning advanced college credits, showing an increasing number of students are accessing college credits while still in high school.

Finally, some student outcomes results are presented in the evaluation report authored by Owens, Lindner, and Wang (1995). In this study, personnel employed by the Northwest Regional Educational Laboratory (NWREL) conducted case studies on four sites in Washington state. The case study involving the Seattle consortium documented several data collection activities focusing on student outcomes. The report showed that in December 1993 there was a higher rate of participation of Tech Prep than non-Tech Prep students (based on self-identification) in career development activities and applied academic courses. In addition, the study reports telephone interview findings obtained by Dr. Mary Beth Celio showing 79% of Tech Prep graduates were enrolled in postsecondary education compared to 66% for other high school graduates. Celio's study also reported the following findings:

Since Tech Prep is particularly focused on the connection with community colleges, it is important to note that 47% of the Tech Prep students went on to the community college, while only 33% of the non-Tech Prep students did so. The fact that nearly identical percentages of each group went on to a four year school (32% for Tech Prep and 33% for non-Tech Prep) demonstrates that Tech Prep does not limit options for attending four year programs. Equally impressive is that the Seattle high school graduates going on to the community colleges include a higher percentage of students who formerly did not progress beyond secondary education (those with a high school GPA of 2.8 or less, Black and Asian populations, and high school graduates age 19 or older). (Owens et al., 1995, p. 17)
Case study findings for one other site in Washington state included a list of desired outcomes; however, data was not reported, probably because it was not yet available. Student outcomes that were identified by the Tech Prep consortium in Yakima Valley included increases in attendance rates, standardized test scores, and postsecondary participation, especially for minority students. The consortium also intended to examine whether suspension and dropout rates were declining as local officials hoped they would be in association with student participation in Tech Prep.


METHODS

The primary purpose of this study was to gain an understanding of how Tech Prep student outcomes were conceptualized (prioritized, grouped, and classified) by stakeholders actively involved in the implementation process. Because of the need to gain the perspectives of practitioners, concept mapping was chosen as the data collection method. Concept mapping is a structured conceptualization and statistical modeling procedure developed by Trochim (1989a). It provides a means of articulating and structuring participant stakeholders' ideas in a visual form called--not surprisingly--concept maps.

Structured concept mapping is based on a three-phase model for conceptualizing program theory developed by Trochim and Linton (1986). This model suggests there are three general phases in conceptualizing program theory:

  1. Generating a conceptual domain out of thoughts, ideas, intuitions, theories, and problem statements.
  2. Structuring a conceptual domain by defining or estimating relationships between and among concepts.
  3. Representing the structured set of concepts in a conceptual domain verbally, pictorially, or mathematically.

Researchers have applied the concept mapping process in studies of planning, implementation, and evaluation; in basic and applied research; and in a variety of settings (Trochim, 1989a). Over the past several years, numerous papers and symposia utilizing concept mapping methodology have been presented at the annual meeting of the American Evaluation Association. (See Trochim [1989b] for a special issue of Evaluation and Program Planning on concept mapping.) Already the concept mapping process has been used to identify and conceptualize outcomes for vocational-technical education (Grayson, 1992) and Tech Prep (Roegge, Leach, & Brown, 1992), making it a logical methodology to apply to Tech Prep student outcomes.

Population and Sample

A first step of the concept mapping process is the selection of the participants for the study. Concept mapping studies generally use a purposive sampling method since the process depends on participants' knowledge and understanding of the particular domain or area being researched. Since this study is focused on conceptualizing outcomes for Tech Prep students, we chose to contact Tech Prep consortia affiliated with NCRVE's Urban Schools Network because of their extensive and shared involvement in the planning and implementation of Tech Prep in many of the nation's largest urban centers. The Tech Prep coordinators of the 30 sites involved in the Urban Schools Network in late 1994 and early 1995 were contacted by letter about the study. In addition to these sites, representatives of six sites providing mentors for the Urban Schools Network were invited to participate. Although a few of these would not be considered urban, the mentors who contributed to the study in these sites were persons who had been engaged in assisting Tech Prep planning and implementation in NCRVE's Urban Schools Network sites since its inception in 1992, providing them with a rich perspective on urban Tech Prep programs.

In a letter directed to the coordinators, the purpose of the study was explained, along with the basic requirements and procedures for concept mapping. A nomination form was included with the letter of invitation so that each coordinator could nominate an educator, student, and employer from his or her site who were actively involved and strongly committed to Tech Prep. (See Table 2 for a summary of stakeholder participant affiliations by site.)

Table 2
Summary of Stakeholder Participants by Site

Urban Schools Network Site        Educator   Student   Employer
Nashville, TN                         X          X         X
Raleigh, NC                           X          X         X
Oklahoma City, OK                     X                    X
Denver, CO                            X          X
Oklahoma City, OK                     X          X
St. Paul, MN                          X
Milwaukee, WI                         X          X         X
Omaha, NE                             X          X         X
Washington, DC                        X                    X
Detroit, MI                                      X
Seattle, WA                           X          X         X
Washington, DC                        X          X         X
Houston, TX                           X          X         X
Las Cruces, NM                        X          X         X
Brooklyn, NY                          X          X
Charlotte, NC                         X                    X
Philadelphia, PA                      X                    X
New Orleans, LA                       X          X
Akron, OH                             X          X         X
Indianapolis, IN                      X                    X

Mentor Sites
Portland, OR                          X          X         X
Redwood City, CA                      X          X
Austin, TX                            X                    X
Hamlet, NC                            X          X         X
Palatine, IL                          X                    X
Leonardstown, MD                                 X         X
Total                                24         18        19

After the letters had gone out and follow-up telephone calls had been made, a list of 86 stakeholder participants was assembled from the nominations received. Each person nominated for the study was contacted by mail, and all of the materials that needed to be completed were forwarded. Of all persons nominated, 61 returned the data collection instruments, providing a 72% response rate. These nominated persons represented 20 of the 30 NCRVE Urban Schools Network sites and all six mentor sites invited to participate in the study.

Slightly more educators participated in the study than students or employers, although the sample size for all three groups was sufficient. Concept mapping is conducted with a small group of experts who bring disparate but informed perspectives. Trochim (1989a) reports that the group size for concept mapping typically ranges "from ten to twenty people" (p. 17), although concept mapping can be conducted with groups of seventy-five or more. The rationale for keeping the group small is that the primary goal of concept mapping is to obtain a deep understanding of how a particular group conceptualizes whatever complex phenomenon is being investigated. To do this, it is crucial to ensure the cooperation of all participants in all phases of the concept mapping process, a process that can take each participant two hours or more to complete the rating and sorting exercises. On the other hand, it is important to maintain a large enough sample to ensure disparate perspectives are represented when the concept maps are developed and interpreted. Given these parameters, the sample size for this study is appropriate for determining the results for the subgroups as well as the aggregate group of all respondents.

Table 3 provides a profile of the three stakeholder participant groups. Briefly, representation of males and females in the study is nearly equal. Representation across other demographic characteristics is less even (although it may reflect the makeup of the local consortia sites). The majority of participants are White/Caucasian and affiliated primarily with secondary vocational education. However, a closer look at the data reveals a distinct profile for each group.

Table 3
Profile of Stakeholder Participants

Educators (n=24)
  Gender: Male 38%, Female 62%
  Race/Ethnicity: African Amer. 28%, Hispanic 0%, Native Amer. 0%, White/Cauc. 64%, Asian Amer. 4%, UK 4%
  Primary Affiliation: Secondary 58%, Postsec. 29%, UK 13%
  Major Responsibilities: Vocational 54%, Academic 8%, Admin. 33%, UK 5%

Students (n=18)
  Gender: Male 39%, Female 61%
  Race/Ethnicity: African Amer. 39%, Hispanic 11%, Native Amer. 0%, White/Cauc. 39%, Asian Amer. 0%, UK 11%
  Primary Affiliation: Secondary 39%, Postsec. 45%, UK 16%
  Major Responsibilities: --

Employers (n=19)
  Gender: Male 69%, Female 26%, UK 5%
  Race/Ethnicity: African Amer. 5%, Hispanic 5%, Native Amer. 0%, White/Cauc. 85%, Asian Amer. 0%, UK 5%
  Primary Affiliation: --
  Major Responsibilities: Educ. Coor./Director 42%, Owner/Bus. Mgr. 32%, Public/CBO Director 16%, Human Res. Director 10%

All Participants
  Gender: Male 47%, Female 52%, UK 1%
  Race/Ethnicity: African Amer. 25%, Hispanic 5%, Native Amer. 0%, White/Cauc. 63%, Asian Amer. 1%, UK 6%
  Primary Affiliation: Secondary 51%, Postsec. 35%, UK 14% (data for educators and students only)
  Major Responsibilities: Vocational 56%, Academic 8%, Admin. 32%, UK 4% (data for educators only)

Note: UK indicates unknown.

First, the demographic information collected from educators shows that the majority are female, White/Caucasian, and affiliated with secondary vocational education; slightly less than one-third are African American, affiliated with postsecondary education, and engaged in administration. Students, the second stakeholder group, are similar to educators in that most are female; however, the race/ethnicity and educational affiliation of students differ from educators. About one-half of the students are minority, either African American (39%) or Hispanic (11%), and nearly one-half are attending postsecondary education. The third group, employers, is dominated by males and Whites/Caucasians. About one-half of the employer group is comprised of persons working as corporate educational coordinators or human resource directors. Another one-third of the group is composed of independent business owners or business managers, and a smaller proportion is made up of persons working for public or community-based organizations (CBOs).

Instrumentation and Procedures

The materials mailed to participants in the concept mapping activity were developed and pilot tested by this researcher. Often, the generation of ideas for concept mapping is done by the participants themselves. However, sometimes it is not possible to gather the participants for a brainstorming session, so a list of ideas is generated from other sources. For this study, statements were obtained from the literature and related materials (e.g., legislation, policy documents, and instruments). Specifically, the Tech Prep and vocational education literature, legislation, and related documents were reviewed for outcomes statements (e.g., see Bragg, 1992; Hoachlander & Rahn, 1992; Hoachlander, Levesque, & Rahn, 1992; Key, 1994; McCaslin & Headley, 1993; O'Neil, 1976; Oregon Department of Education, n.d.; Oregon Department of Public Instruction, 1993; Pearce, Pease, Copa, & Beck, 1991; Peasley & McCaslin, 1995; Roegge et al., 1992; Secretary's Commission on Achieving Necessary Skills, 1991; Stecher et al., 1995).

An attempt was also made to draw ideas from the education reform literature such as the Project 2061 report published by the American Association for the Advancement of Science in 1989; the Curriculum and Evaluation Standards for School Mathematics also published in 1989; and the Standards Project for English/Literature Arts sponsored by the Center for the Study of Reading at the University of Illinois, the International Reading Association, and the National Council of Teachers of English. Other sources reporting academic outcomes were reviewed as well, including Kulm and Malcom (1991); Steffy (1993); and White (1994). In addition, planning documents and reports submitted by the NCRVE Urban Schools Network sites were reviewed for specific outcomes statements (e.g., NCRVE, 1993).

Pilot Testing the Instruments and Data Collection Procedures

The literature review resulted in the generation of a large number of potential student outcomes, which were narrowed to 118 outcomes statements for pilot testing. A draft instrument containing the 118 statements was distributed to six experts who had extensive knowledge of Tech Prep implementation and/or outcomes assessment. The draft instrument, titled the "Tech Prep Student Outcomes Rating Form," directed the experts to rate the priority they would give each statement on a scale of 1 for "very low priority" to 5 for "very high priority." The statements were also given separately on index cards, and the experts were asked to sort them into piles that contained similar ideas. Each expert then labeled each pile of cards and recorded the label on the "Card Sort Summary Sheet." They then packaged all of the completed instruments together and returned them in a pre-addressed, postage-paid envelope to the University of Illinois site for analysis and interpretation.
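
To make the sorting task concrete, the brief sketch below (written in Python purely for illustration; it is not the software used in this study) shows how a single participant's piles of index cards can be recoded as a binary co-occurrence matrix, the form in which sort data are typically analyzed in concept mapping. The pile contents shown are hypothetical.

    import numpy as np

    N_STATEMENTS = 98  # final number of outcome statements on the rating form

    def sort_to_matrix(piles, n=N_STATEMENTS):
        """Convert one participant's piles (lists of 1-based statement numbers)
        into an n x n binary co-occurrence matrix: cell (i, j) is 1 if the
        participant placed statements i and j in the same pile."""
        m = np.zeros((n, n), dtype=int)
        for pile in piles:
            for i in pile:
                for j in pile:
                    m[i - 1, j - 1] = 1
        return m

    # Hypothetical sort: one small pile of communication items, one of math items.
    example_piles = [[1, 30, 47, 78], [6, 53, 93]]
    matrix = sort_to_matrix(example_piles)
    print(matrix[0, 29])   # 1 -> statements 1 and 30 were sorted together
    print(matrix[0, 5])    # 0 -> statements 1 and 6 were sorted apart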

After conducting the concept mapping procedure, the experts also provided suggestions, verbally and/or in writing, for improving the clarity and content of the statements. The experts were also asked to help reduce the number of statements because the concept mapping computer program could accommodate no more than 98 statements in total. However, although the experts offered valuable comments regarding the content of statements, most did not eliminate statements. Consequently, the number of statements was reduced from 118 to 98 by randomly eliminating 20 statements. A list of the final 98 items contained in the "Tech Prep Student Outcomes Rating Form" is provided in Table 4. (See Appendix B for a list of the 98 outcomes statements categorized according to their predominant location in the literature.)
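
A minimal sketch of this kind of random reduction follows; the placeholder labels stand in for the actual 118-item draft pool, which is not reproduced here, and the seed is arbitrary.

    import random

    draft_pool = [f"statement {k}" for k in range(1, 119)]   # hypothetical stand-ins
    random.seed(0)                                           # arbitrary seed for repeatability
    kept = random.sample(draft_pool, 98)                     # retain 98 of the 118 at random
    dropped = [s for s in draft_pool if s not in kept]
    print(len(kept), len(dropped))                           # 98 20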

Table 4
Final List of Tech Prep Student Outcomes

Student Outcome Statements

1. create meaning from messages communicated through listening
2. understand nonverbal communication
3. make academic progress on grade level
4. communicate ideas and emotions through the fine arts (e.g., art, music, dance)
5. demonstrate consistent, respectful, and caring behavior
6. apply basic algebra and geometry to solve technical and work-related problems
7. organize information through the development and use of classification rules and systems
8. recognize the need for lifelong learning to enhance skills and learn new skills
9. adapt to emerging technology and adjust to changing work environments
10. exercise leadership in a variety of situations
11. apply knowledge, skills, and learning strategies to career and life choices
12. get along with a variety of people
13. use appropriate and relevant scientific methods to solve specific problems in real-life situations
14. participate as a member of a team
15. show good working relationships with superiors and coworkers in an occupational role
16. evaluate others' performance and provide feedback
17. serve clients/customers
18. teach others new skills
19. resolve conflict based on divergent interests and perspectives
20. know how to give and take instructions
21. appreciate the diversity of values and cultural differences among people
22. complete secondary school
23. expand own knowledge by making connections with new and unfamiliar knowledge, skills, and experiences
24. communicate ideas by quantifying with whole, rational, real, and/or complex numbers
25. plan and work together in meetings
26. demonstrate oral and verbal proficiency in technical communication (reports, policies, procedures)
27. appreciate own and others' artistic products and performances
28. apply group problem-solving strategies
29. use computers and other electronic technology to gather, organize, manipulate, and present information
30. communicate ideas and information through writing
31. demonstrate self-control and self-discipline
32. earn college credit in high school
33. know employer expectations for job performance
34. know how social, organizational, and technological systems work
35. demonstrate the ability to be adaptable and flexible
36. make a successful transition from education to employment
37. make ethical decisions
38. know own abilities, strengths, and weaknesses
39. maintain good physical, mental, and emotional health
40. apply logical reasoning to develop solutions to complex problems
41. select, use, and maintain appropriate tools, information, materials, and equipment
42. build own self-esteem
43. demonstrate motivation to learn
44. use initiative, imagination, and creativity
45. demonstrate a positive attitude toward school
46. attend school regularly
47. communicate ideas and information through speaking
48. demonstrate an ability to calculate through ratios, proportions, and percentages
49. complete postsecondary school
50. construct meaning through reading for information, literary experience, and to perform a task
51. use critical thinking skills in a variety of situations
52. use models and scales to explain or predict the organization, function, and behavior of objects, materials, and living things
53. use division, multiplication, addition, and subtraction with real numbers, decimals, fractions, integers, roots, and powers
54. achieve and maintain employability in a high-wage job
55. articulate personal values and beliefs as they relate to a particular occupation
56. use scientific methods to acquire information, plan investigations, use scientific tools, and communicate results
57. be critically aware of social issues involved in a field of interest
58. know the history of a particular occupation
59. observe, analyze, and interpret human behaviors to acquire a better understanding of self, families, and other human relationships
60. make a smooth transition from secondary to postsecondary education
61. recognize and apply the democratic principles of justice, equality, responsibility, choice, and freedom
62. use the metric system and convert between metrics and traditional systems
63. recognize the geographic interaction between people and their surroundings and make responsible decisions for the environment
64. design, maintain, and improve systems
65. monitor and correct own performance
66. use goal-relevant activities, rank them, and allocate time for them
67. recognize varying forms of government and address issues of importance to citizens in a democracy
68. be dependable and punctual
69. enter postsecondary programs without remediation
70. prepare and use budgets, make forecasts, keep records, and make adjustments to meet objectives
71. use decision-making processes to make informed choices among options
72. use research tools to locate sources of information and ideas relevant to a specific need or problem
73. show appropriate personal appearance and attitude
74. acquire, store, allocate, and use materials and space efficiently
75. have awareness of and interest in technical careers
76. be honest and demonstrate integrity
77. achieve certification of mastery in an occupation
78. apply the English language correctly (spelling, grammar, structure)
79. apply appropriate safety and environmental measures
80. develop and follow through on individual career plans and goals
81. be loyal to an employer
82. gain experience in all aspects of an industry
83. prepare and follow schedules, and manage time efficiently
84. work under tension or pressure
85. work without close supervision
86. demonstrate awareness of workforce and societal trends
87. understand the relationships between theory and practice in a technical area
88. participate in work-based learning experiences
89. succeed in the transition from secondary or postsecondary education to a 4-year college
90. recognize and apply quality standards
91. understand the norms and values of the work culture
92. understand how technology affects quality of life
93. apply advanced algebra, analytic geometry, and/or calculus to solve technical and work-related problems
94. read and create charts, tables, and graphs
95. prepare for direct participation in the democratic process
96. understand the principles of competition, cooperation, and leadership in a work environment
97. understand and communicate in a second language
98. recognize differences and commonalities in the human experience through productions, performances, or interpretations

Data Collection, Analysis, and Interpretation

Using the nominations made by the NCRVE Urban Schools Network coordinators, letters and data collection instruments were mailed to all the persons nominated. These included the Tech Prep Student Outcomes Rating Form, the 98 index cards containing each student outcome statement, the Card Sort Summary Sheet, and a Background Form. All stakeholder participants were asked to complete the instruments within seven to ten days. Although many of the stakeholders did respond quickly, extensive follow-up was conducted via mail and telephone to obtain responses from the 61 respondents.

To assist with interpretation of the data, preliminary results of the study were presented to a small group of NCRVE Urban Schools Network participants at the March, 1995 meeting of that group in Washington, DC. At that meeting, various concept maps were presented to the participants who were asked to assist in labeling the maps and suggesting alternative interpretations of their meanings. This interpretation session was helpful in determining how the stakeholder participants were likely to interpret the final results. It was also helpful in determining how to represent the concept maps visually and verbally in this concept paper.

As completed concept mapping materials were returned, data from the rating and sort forms were entered into the concept mapping computer program called "The Concept System" (Trochim, 1989a). This program was used to aggregate the sort and rate data provided by the stakeholder participants. It "uses a combination of multidimensional scaling (MDS) and cluster analysis techniques to represent conceptual domains underlying the data" (Caracelli & Riggin, 1994, p. 142). Information from the Background Form was compiled into a spreadsheet program. One map was generated to represent the collective view of all stakeholder participants. Additional maps were generated to represent the perspectives of the three stakeholder groups of educators, students, and employers. Further information about the computation and interpretation of these maps is presented in the next section.
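
Although the maps reported here were produced with The Concept System, the general procedure described above can be approximated with standard tools. The sketch below uses scikit-learn as a stand-in and synthetic card sorts in place of the 61 participants' actual data: individual co-occurrence matrices are pooled into a group similarity matrix, converted to dissimilarities, and projected into two dimensions with nonmetric MDS.

    import numpy as np
    from sklearn.manifold import MDS

    rng = np.random.default_rng(0)
    n_statements, n_participants = 98, 61

    # Synthetic stand-in for the participants' binary co-occurrence matrices.
    sims = np.zeros((n_statements, n_statements))
    for _ in range(n_participants):
        labels = rng.integers(0, 10, size=n_statements)       # each sorter makes ~10 piles
        sims += (labels[:, None] == labels[None, :]).astype(float)

    dissim = n_participants - sims                            # more co-sorting -> smaller distance
    np.fill_diagonal(dissim, 0)

    mds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
    points = mds.fit_transform(dissim)                        # one (x, y) point per statement
    print(points.shape)                                       # (98, 2)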


FINDINGS AND DISCUSSION

This section of the paper provides a summary of the major concept mapping findings computed for the entire group of participants and for each of the subgroups of educators, students, and employers.

The Perspectives of All Participants Toward
Tech Prep Student Outcomes

The "Concept System" program used both the ratings and card sorts done by the stakeholder participants to generate data points on a two-dimensional map. The points on this map represent the results of nonmetric MDS based on the similarity information supplied by the respondents. Points on the map represent the student outcomes statements that were rated and sorted by the respondents (see Figure 1). Outcomes statements located closer together on the map were sorted together more frequently than statements located farther apart. For example, in the southeast corner of the map, statements 42 and 39 appear close together and both refer to personal attributes. Specifically, item 39 says "maintain good physical, mental, and emotional health" and item 42 states "build own self-esteem." Such similarities are found in statements located in close proximity throughout the map. Taking another example, statements 30 and 78 are located close together in the northwest part of the map. Both of these statements have something to do with communications since item 30 is "communicate ideas and information through writing" and item 78 is "apply the English language correctly (spelling, grammar, structure)."

Based on this point map, the statements were grouped into clusters that show the domains associated with Tech Prep student outcomes. Ward's hierarchical cluster analysis (Everitt, 1980; Ward, 1963) was applied by the "Concept System" program to the X-Y coordinate data obtained from the MDS (Trochim, 1989b). The cluster analysis partitions the points on the map into cluster solutions, and each solution shows the average priority ratings of each statement and cluster. The Concept System allows a researcher to impose solutions having from four to twenty clusters. Within each cluster, statements with average ratings approaching 5.0 are a high priority; those having average ratings nearer 1.0 are a low priority. The map also shows the relationships of the statements (shown as points on a cluster map) and clusters to each other, called bridging values. Statements with bridging values approaching zero are highly associated with surrounding statements, meaning these statements were sorted by many of the respondents with other statements in close proximity. In contrast, statements having bridging values approaching 1.0 show little association with surrounding statements. Average bridging values are also computed for clusters. Clusters with lower bridging values indicate more concise concepts (or constructs), while clusters with higher bridging values are less clear and interpretable. These clusters are called "bridging clusters" because they act as a link between other clusters in the map.

Based on a qualitative interpretation of the maps, a decision was made to select a nine-cluster solution as the best way to portray the domains associated with Tech Prep student outcomes.[7] Figure 2 presents the nine-cluster solution with the label developed for each cluster, along with the average cluster rating and average bridging value. (For example, for "communications" the average cluster rating is 3.61 and the average bridging value is .59.) Table 5 lists the statements within each cluster, their ratings, the average cluster ratings, and the average bridging values.
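
As a rough illustration of the clustering step, the sketch below applies Ward's method to two-dimensional coordinates and extracts a nine-cluster solution together with each cluster's average priority rating. SciPy stands in for The Concept System, and the coordinates and ratings are synthetic placeholders rather than study data; the bridging statistic itself is not reproduced here.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(1)
    points = rng.normal(size=(98, 2))          # stand-in for the MDS point map
    mean_ratings = rng.uniform(2.5, 4.8, 98)   # stand-in for each statement's mean 1-5 rating

    tree = linkage(points, method="ward")                  # hierarchical cluster tree
    clusters = fcluster(tree, t=9, criterion="maxclust")   # impose a nine-cluster solution

    for c in range(1, 10):
        members = np.where(clusters == c)[0]
        avg = mean_ratings[members].mean()
        print(f"cluster {c}: {len(members)} statements, average rating {avg:.2f}")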

Table 5
Mean Ratings and Mean Bridging Values for
Tech Prep Student Outcomes by Cluster for All

Cluster 1: Communications
Statement  (Rating, Bridging)
1 create meaning from messages communicated through listening  (4.28, 0.66)
78 apply the English language correctly (spelling, grammar, structure)  (4.28, 0.51)
47 communicate ideas and information through speaking  (4.23, 0.49)
26 demonstrate oral and verbal proficiency in technical communication (reports, policies, procedures)  (4.10, 0.48)
30 communicate ideas and information through writing  (3.93, 0.46)
2 understand nonverbal communication  (3.47, 0.70)
24 communicate ideas by quantifying with whole, rational, real, and/or complex numbers  (3.13, 0.37)
4 communicate ideas and emotions through the fine arts (e.g., art, music, dance)  (2.56, 1.00)
97 understand and communicate in a second language  (2.52, 0.59)
Cluster average  (3.61, 0.59)

Cluster 2: Information Use & Decision-Making
Statement  (Rating, Bridging)
29 use computers and other electronic technology to gather, organize, manipulate, and present information  (4.37, 0.24)
51 use critical thinking skills in a variety of situations  (4.10, 0.37)
50 construct meaning through reading for information, literary experience, and to perform a task  (4.00, 0.44)
40 apply logical reasoning to develop solutions to complex problems  (3.98, 0.30)
41 select, use, and maintain appropriate tools, information, materials, and equipment  (3.93, 0.30)
71 use decision-making processes to make informed choices among options  (3.93, 0.32)
72 use research tools to locate sources of information and ideas relevant to a specific need or problem  (3.84, 0.28)
7 organize information through the development and use of classification rules and systems  (3.72, 0.33)
87 understand the relationships between theory and practice in a technical area  (3.60, 0.33)
74 acquire, store, allocate, and use materials and space efficiently  (3.51, 0.31)
64 design, maintain, and improve systems  (3.44, 0.22)
70 prepare and use budgets, make forecasts, keep records, and make adjustments to meet objectives  (3.43, 0.32)
Cluster average  (3.89, 0.31)

Cluster 3: Technology & Quality Management
Statement  (Rating, Bridging)
9 adapt to emerging technology and adjust to changing work environments  (4.56, 0.37)
83 prepare and follow schedules, and manage time efficiently  (4.12, 0.27)
34 know how social, organizational, and technological systems work  (4.11, 0.57)
79 apply appropriate safety and environmental measures  (4.11, 0.34)
90 recognize and apply quality standards  (3.97, 0.25)
66 use goal-relevant activities, rank them, and allocate time for them  (3.59, 0.37)
92 understand how technology affects quality of life  (3.53, 0.59)
Cluster average  (3.92, 0.39)

Cluster 4: Math & Science
Statement  (Rating, Bridging)
6 apply basic algebra and geometry to solve technical and work-related problems  (4.07, 0.09)
53 use division, multiplication, addition, and subtraction with real numbers, decimals, fractions, integers, roots, and powers  (3.87, 0.09)
94 read and create charts, tables, and graphs  (3.67, 0.19)
13 use appropriate and relevant scientific methods to solve specific problems in real-life situations  (3.52, 0.21)
48 demonstrate an ability to calculate through ratios, proportions, and percentages  (3.52, 0.12)
56 use scientific methods to acquire information, plan investigations, use scientific tools, and communicate results  (3.43, 0.10)
93 apply advanced algebra, analytic geometry, and/or calculus to solve technical and work-related problems  (3.15, 0.06)
52 use models and scales to explain or predict the organization, function, and behavior of objects, materials, and living things  (3.08, 0.17)
62 use the metric system and convert between metrics and traditional systems  (2.82, 0.07)
Cluster average  (3.46, 0.12)

Cluster 5: Educational Attainment
Statement  (Rating, Bridging)
22 complete secondary school  (4.59, 0.21)
3 make academic progress on grade level  (4.07, 0.29)
49 complete postsecondary school  (3.72, 0.22)
60 make a smooth transition from secondary to postsecondary education  (3.72, 0.22)
69 enter postsecondary programs without remediation  (3.63, 0.22)
89 succeed in the transition from secondary or postsecondary education to a 4-year college  (3.48, 0.23)
32 earn college credit in high school  (2.57, 0.22)
Cluster average  (3.68, 0.23)

Cluster 6: School-to-Work Transition
Statement  (Rating, Bridging)
36 make a successful transition from education to employment  (4.51, 0.53)
8 recognize the need for lifelong learning to enhance skills and learn new skills  (4.49, 0.46)
46 attend school regularly  (4.38, 0.41)
11 apply knowledge, skills, and learning strategies to career and life choices  (4.28, 0.45)
88 participate in work-based learning experiences  (4.23, 0.42)
43 demonstrate motivation to learn  (4.16, 0.47)
23 expand own knowledge by making connections with new and unfamiliar knowledge, skills, and experiences  (4.15, 0.52)
80 develop and follow through on individual career plans and goals  (4.10, 0.48)
45 demonstrate a positive attitude toward school  (3.93, 0.49)
77 achieve certification of mastery in an occupation  (3.77, 0.45)
54 achieve and maintain employability in a high-wage job  (3.69, 0.41)
75 have awareness of and interest in technical careers  (3.52, 0.45)
82 gain experience in all aspects of an industry  (3.45, 0.43)
58 know the history of a particular occupation  (2.80, 0.46)
Cluster average  (3.96, 0.46)

Cluster 7: Personal Attributes, Attitudes, & Employability Skills
Statement  (Rating, Bridging)
76 be honest and demonstrate integrity  (4.72, 0.00)
68 be dependable and punctual  (4.69, 0.00)
31 demonstrate self-control and self-discipline  (4.36, 0.02)
65 monitor and correct own performance  (4.33, 0.21)
33 know employer expectations for job performance  (4.28, 0.26)
5 demonstrate consistent, respectful, and caring behavior  (4.28, 0.11)
35 demonstrate the ability to be adaptable and flexible  (4.25, 0.00)
37 make ethical decisions  (4.23, 0.14)
73 show appropriate personal appearance and attitude  (4.21, 0.01)
38 know own abilities, strengths, and weaknesses  (4.15, 0.14)
85 work without close supervision  (4.12, 0.07)
39 maintain good physical, mental, and emotional health  (4.08, 0.12)
44 use initiative, imagination, and creativity  (3.98, 0.28)
81 be loyal to an employer  (3.98, 0.01)
42 build own self-esteem  (3.97, 0.16)
10 exercise leadership in a variety of situations  (3.74, 0.10)
84 work under tension or pressure  (3.60, 0.07)
55 articulate personal values and beliefs as they relate to a particular occupation  (3.34, 0.35)
Cluster average  (4.13, 0.11)

Cluster 8: Work & Interpersonal Relationships
Statement  (Rating, Bridging)
15 show good working relationships with superiors and coworkers in an occupational role  (4.43, 0.11)
14 participate as a member of a team  (4.38, 0.10)
12 get along with a variety of people  (4.30, 0.13)
20 know how to give and take instructions  (4.28, 0.27)
17 serve clients/customers  (4.05, 0.16)
25 plan and work together in meetings  (4.05, 0.17)
96 understand the principles of competition, cooperation, and leadership in a work environment  (3.98, 0.28)
21 appreciate the diversity of values and cultural differences among people  (3.75, 0.34)
91 understand the norms and values of the work culture  (3.63, 0.38)
18 teach others new skills  (3.31, 0.22)
19 resolve conflict based on divergent interests and perspectives  (3.25, 0.32)
16 evaluate others' performance and provide feedback  (3.16, 0.12)
59 observe, analyze, and interpret human behaviors to acquire a better understanding of self, families, and other human relationships  (3.12, 0.44)
27 appreciate own and others' artistic products and performances  (2.80, 0.52)
Cluster average  (3.75, 0.26)

Cluster 9: Democratic & Participatory Strategies
Statement  (Rating, Bridging)
28 apply group problem-solving strategies  (3.80, 0.43)
61 recognize and apply the democratic principles of justice, equality, responsibility, choice, and freedom  (3.54, 0.57)
57 be critically aware of social issues involved in a field of interest  (3.37, 0.63)
86 demonstrate awareness of workforce and societal trends  (3.35, 0.45)
95 prepare for direct participation in the democratic process  (3.22, 0.68)
98 recognize differences and commonalities in the human experience through productions, performances, or interpretations  (3.12, 0.82)
63 recognize the geographic interaction between people and their surroundings and make responsible decisions for the environment  (2.98, 0.38)
67 recognize varying forms of government and address issues of importance to citizens in a democracy  (2.92, 0.71)
Cluster average  (3.29, 0.58)

The nine-cluster concept map for all participants presents a great deal of information about how the entire group conceptualized Tech Prep student outcomes. First, all of the items and clusters received relatively high ratings on a 1 to 5 scale where 1 meant "very low" and 5 meant "very high" priority. All of the clusters representing the 98 Tech Prep student outcomes statements were given an average rating well above 3.00, indicating all of the clusters had at least a moderate level of priority for the respondents. Looking at the nine clusters, one was rated above 4.0 and two were near that level, indicating these clusters were perceived to be of highest priority to the respondents. The most highly rated cluster was "Personal Attributes, Attitudes, and Employability Skills" with a cluster average of 4.13. This cluster includes statements about honesty, integrity, dependability, and punctuality. The composition of this cluster bears a striking resemblance to the personal qualities described as foundational competencies by SCANS (U.S. Department of Labor, 1991). Another high priority cluster, having a cluster average of 3.96, was labeled "School-to-Work Transition" because of the parallel between the statements in the cluster and the key concepts portrayed in the federal STWO legislation. Finally, the third high priority cluster, with an average of 3.92, was labeled "Technology and Quality Management" because of the mix of statements it contains having to do with technology and quality.

The cluster of outcomes receiving the lowest average rating was "Democratic and Participatory Strategies," showing an average rating of 3.29. This cluster contained statements having to do with democracy, social awareness, diversity, and group processes. The next lowest rated cluster was "Math and Science" with a cluster average rating of 3.46. Within this cluster, outcomes statements having a more basic or applied focus received higher ratings than statements of a more abstract and advanced nature. For example, to apply basic algebra and geometry was given an average rating of 4.07 compared to the statement specifying advanced algebra, analytic geometry, and/or calculus which received a lower average rating of 3.15.

Once the importance of each cluster is understood, it is important to examine the relative location of the clusters, one to another, on the map. First, the two clusters located in the northwest and northern portion of the map represent the academic domains of "Communications" and "Math and Science." On the opposite side of the map, in the eastern and southeast portions, are "School-to-Work Transition" and "Personal Attributes, Attitudes, and Employability Skills." The fact that vocationally oriented and academically oriented outcomes appear on opposite sides of the map is an important finding, especially where the bridging values are low, as in the case of "Math and Science" and "Personal Attributes, Attitudes, and Employability Skills"; low values indicate that respondents rarely sorted such statements together and therefore did not see them as closely associated. Furthermore, the participants gave the "School-to-Work Transition" and "Personal Attributes, Attitudes, and Employability Skills" clusters higher average ratings than the "Communications" and "Math and Science" clusters, although, as was noted previously, all of the clusters received a moderate to high priority rating.

The three clusters in the middle of the map labeled "Information Use and Decision-Making," "Technology and Quality Management," and "Work and Interpersonal Relationships" are important because they are located between the clusters of "Communications" and "Math and Science" and the clusters of "School-to-Work Transition" and "Personal Attributes, Attitudes, and Employability Skills." These clusters received average ratings between 3.75 and 3.92, showing they are a relatively high priority to the respondents. Within each of these clusters is a mix of items drawn from the literature used to create the instrumentation for this study. For example, the "Information Use and Decision-Making" cluster contains an item drawn from the science literature (i.e., "use computers and other electronic technology to gather, organize, manipulate, and present information"); an item from the communications literature (i.e., "construct meaning through reading for information, literary experience, and to perform a task"); and an item from the vocational/occupational literature (i.e., "select, use, and maintain appropriate tools, information, materials, and equipment"). Similarly, within the cluster labeled "Technology and Quality Management" are items found in the science, vocational, and management literature. Possibly, the mix of outcomes statements within these clusters provides a nucleus of concepts useful to the integration of vocational and academic education called for by Tech Prep.

With regard to this notion of vocational and academic integration, we are reminded that three clusters located around the perimeter of the map had fairly high bridging values, meaning the statements within them were not consistently sorted with other statements in the same clusters. These three clusters were "Communications" with an average bridging value of .59, "Democratic and Participatory Strategies" with an average bridging value of .58, and "School-to-Work Transition" with a slightly lower average bridging value of .46. The significance of this finding is that these clusters act as bridges to other clusters, suggesting that "Communications," "Democratic and Participatory Strategies," and "School-to-Work Transition" may be constructs that connect the other domains and thus point to a different way of thinking about vocational and academic integration.

Figure 3 presents the average cluster ratings for each subgroup for the nine-cluster concept map created by all participants.[8] Several conclusions can be drawn from this map that confirm prior observations but also provide new insights about Tech Prep student outcomes. First, for several clusters there is virtually no difference in how the subgroups rated the clusters. This conclusion applies to such clusters as "Educational Attainment," "Work and Interpersonal Relationships," and "Technology and Quality Management," showing there is substantial agreement among the subgroups regarding the level of priority that should be placed on these outcomes. However, in a few cases, students gave higher average ratings to clusters than the other subgroups did (e.g., see "School-to-Work Transition" and "Personal Attributes, Attitudes, and Employability Skills"). Two clusters are exceptions to this pattern of agreement: "Math and Science" and "Democratic and Participatory Strategies." For "Math and Science," students gave a lower rating than either educators or employers. For "Democratic and Participatory Strategies," the average rating supplied by students was lower than the educators' but higher than the employers', although all subgroups gave this cluster a much lower average rating than any of the remaining clusters.
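
The subgroup comparison summarized in Figure 3 amounts to averaging statement ratings within each cluster separately for educators, students, and employers. The sketch below shows that computation with pandas, using synthetic 1 to 5 ratings and an arbitrary cluster assignment in place of the study data.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)
    records = []
    for group, n in [("educator", 24), ("student", 18), ("employer", 19)]:
        for respondent in range(n):
            for statement in range(1, 99):
                records.append({
                    "group": group,
                    "statement": statement,
                    "cluster": (statement - 1) % 9 + 1,   # placeholder cluster assignment
                    "rating": rng.integers(1, 6),         # placeholder 1-5 priority rating
                })

    df = pd.DataFrame(records)
    cluster_by_group = df.groupby(["cluster", "group"])["rating"].mean().unstack()
    print(cluster_by_group.round(2))                      # one average per cluster per subgroup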

The Perspective of Each Stakeholder Group Toward
Tech Prep Student Outcomes

To further explore the potential for conceptual differences among the three subgroups, a nine-cluster concept map was computed independently for each stakeholder group. Since this study was based on the input of educators who are involved in planning and implementing Tech Prep, students who are enrolled in Tech Prep programs, and employers who are a vital part of Tech Prep efforts, a map was created based on the ratings and sort data provided by each subgroup. Each map was compared qualitatively and quantitatively to the map created by all participants (see Figures 2 and 3) and to the maps created for each subgroup. The qualitative comparison examined the location and importance of the clusters across the groups. The quantitative comparison examined the average ratings and bridging values of the clusters as well as differences in the outcomes statements within each cluster across the groups. Gaining a better understanding of the similarities and differences in the results helped to determine the congruency (or lack of it) among the groups with regard to Tech Prep student outcomes.

Educator Perspectives

The nine-cluster concept map for educators has some important similarities with the concept map created by all participants (see Figure 2). Specifically, five of the nine clusters in the educators' map are nearly identical to the clusters emerging from the concept map of Tech Prep student outcomes for all participants. These five clusters are "Personal Attributes, Attitudes, and Employability Skills," "Information Use and Decision-Making," "Work and Interpersonal Relationships," "School-to-Work Transition," and "Communications." Four clusters were sorted and labeled differently, however. They are "Education and Career Attainment," "Analytic and Scientific," "Work Environments," and "Democratic Process and Career Awareness." Furthermore, there are similarities and differences in where the clusters are located in the maps. Clusters such as "Personal Attributes, Attitudes, and Employability Skills," and "Information Use and Decision-Making" are located in about the same place on both maps. However, clusters such as "School-to-Work Transition" and "Communications" are placed in different locations. Figure 4 shows the nine-cluster concept map solution for educators. Table 6 presents the outcomes statements within each cluster, their ratings, the average cluster ratings, and the average bridging values for educators.

Table 6
Student Outcome, Mean Rating, and Mean Bridging Value
by Cluster for Educators

Cluster 1: Communications
Statement  (Rating, Bridging)
1 create meaning from messages communicated through listening  (4.33, 0.00)
26 demonstrate oral and verbal proficiency in technical communication (reports, policies, procedures)  (4.29, 0.10)
30 communicate ideas and information through writing  (4.25, 0.11)
50 construct meaning through reading for information, literary experience, and to perform a task  (4.21, 0.23)
78 apply the English language correctly (spelling, grammar, structure)  (4.21, 0.41)
47 communicate ideas and information through speaking  (4.13, 0.06)
2 understand nonverbal communication  (3.61, 0.15)
4 communicate ideas and emotions through the fine arts (e.g., art, music, dance)  (2.46, 0.22)
97 understand and communicate in a second language  (2.17, 0.26)
Cluster average  (3.74, 0.17)

Cluster 2: Analytic & Scientific

Rating
Bridging
29 use computers and other electronic technology to gather, organize, manipulate, and present information

4.74

0.44
6 apply basic algebra and geometry to solve technical and work-related problems
4.21
0.49
7 organize information through the development and use of classification rules and systems

3.91

0.36
53 use division, multiplication, addition, and subtraction with real numbers, decimals, fractions, integers, roots, and powers

3.88

0.42
48 demonstrate an ability to calculate through ratios, proportions, and percentages
3.75
0.43
94 read and create charts, tables, and graphs
3.70
0.44
56 use scientific methods to acquire information, plan investigations, use scientific tools, and communicate results

3.63

0.23
52 use models and scales to explain or predict the organization, function, and behavior of objects, materials, and living things

3.29

0.39
24 communicate ideas by quantifying with whole, rational, real, and/or complex numbers

3.25

0.33
93 apply advanced algebra, analytic geometry, and/or calculus to solve technical and work-related problems

3.17

0.36
62 use the metric system and convert between metrics and traditional systems
2.88
0.33
Cluster average
3.67
0.38
Cluster 3: Information Use & Decision Making
Rating
Bridging
51 use critical thinking skills in a variety of situations
4.25
0.60
71 use decision-making processes to make informed choices among options
4.21
0.50
41 select, use, and maintain appropriate tools, information, materials, and equipment

4.17

0.66
83 prepare and follow schedules, and manage time efficiently
4.14
0.48
40 apply logical reasoning to develop solutions to complex problems
4.08
0.50
79 apply appropriate safety and environmental measures
4.00
0.46
72 use research tools to locate sources of information and ideas relevant to a specific need or problem

3.92

0.46
13 use appropriate and relevant scientific methods to solve specific problems in real-life situations

3.79

0.35
66 use goal-relevant activities, rank them, and allocate time for them
3.58
0.57
74 acquire, store, allocate, and use materials and space efficiently
3.54
0.61
70 prepare and use budgets, make forecasts, keep records, and make adjustments to meet objectives

3.33

0.50
Cluster average
3.91
0.52

Cluster 4: Education & Career Attainment

Rating
Bridging
22 complete secondary school
4.75
0.12
11 apply knowledge, skills, and learning strategies to career and life choices
4.38
0.56
60 make a smooth transition from secondary to postsecondary education
4.17
0.12
3 make academic progress on grade level
4.04
0.25
69 enter postsecondary programs without remediation
4.04
0.12
88 participate in work-based learning experiences
4.00
0.39
49 complete postsecondary school
3.63
0.06
89 succeed in the transition from secondary or postsecondary education to a 4-year college

3.17

0.13
Cluster Average
4.02
0.22

Cluster 5: School-to-Work Transition

Rating
Bridging
36 make a successful transition from education to employment
4.58
0.35
8 recognize the need for lifelong learning to enhance skills and learn new skills
4.33
0.36
80 develop and follow through on individual career plans and goals
4.17
0.36
45 demonstrate a positive attitude toward school
3.83
0.43
54 achieve and maintain employability in a high-wage job
3.75
0.44
77 achieve certification of mastery in an occupation
3.71
0.25
82 gain experience in all aspects of an industry
3.35
0.46
32 earn college credit in high school
2.17
0.22
Cluster average
3.78
0.35

Cluster 6: Work Environments

Rating
Bridging
9 adapt to emerging technology and adjust to changing work environments
4.71
0.63
23 expand own knowledge by making connections with new and unfamiliar knowledge, skills, and experiences

4.33

0.93
96 understand the principles of competition, cooperation, and leadership in a work environment

3.87

0.66
91 understand the norms and values of the work culture
3.65
0.46
34 know how social, organizational, and technological systems work
3.63
0.77
87 understand the relationships between theory and practice in a technical area
3.57
0.73
92 understand how technology affects quality of life
3.39
0.74
64 design, maintain, and improve systems
3.38
0.80
59 observe, analyze, and interpret human behaviors to acquire a better understanding of self, families, and other human relationships

3.00

0.53
63 recognize the geographic interaction between people and their surroundings and make responsible decisions for the environment

2.96

0.45
98 recognize differences and commonalities in the human experience through productions, performances, or interpretations

2.87

0.59
27 appreciate own and others' artistic products and performances
2.57
0.57
Cluster average
3.49
0.66
Cluster 7: Democratic Process & Career Awareness
Rating
Bridging
95 prepare for direct participation in the democratic process
3.61
0.63
61 recognize and apply the democratic principles of justice, equality, responsibility, choice, and freedom

3.50

0.84
57 be critically aware of social issues involved in a field of interest
3.35
0.82
75 have awareness of and interest in technical careers
3.33
0.81
86 demonstrate awareness of workforce and societal trends
3.22
0.65
67 recognize varying forms of government and address issues of importance to citizens in a democracy

3.13

1.00
58 know the history of a particular occupation
2.42
0.84
Cluster average
3.22
0.80

Cluster 8: Work & Interpersonal Relationships

Rating
Bridging
65 monitor and correct own performance
4.38
0.28
14 participate as a member of a team
4.33
0.33
15 show good working relationships with superiors and coworkers in an occupational role

4.33

0.35
20 know how to give and take instructions
4.29
0.39
12 get along with a variety of people
4.17
0.26
28 apply group problem-solving strategies
4.08
0.55
17 serve clients/customers
4.04
0.37
25 plan and work together in meetings
4.00
0.35
5 demonstrate consistent, respectful, and caring behavior
4.00
0.25
44 use initiative, imagination, and creativity
3.96
0.37
90 recognize and apply quality standards
3.74
0.48
10 exercise leadership in a variety of situations
3.71
0.26
19 resolve conflict based on divergent interests and perspectives
3.54
0.42
84 work under tension or pressure
3.48
0.26
18 teach others new skills
3.38
0.42
16 evaluate others' performance and provide feedback
3.13
0.30
Cluster average
3.91
0.35
Cluster 9: Personal Attributes, Attitudes, & Employability Skills
Rating
Bridging
68 be dependable and punctual
4.58
0.17
76 be honest and demonstrate integrity
4.50
0.21
33 know employer expectations for job performance
4.26
0.27
35 demonstrate the ability to be adaptable and flexible
4.25
0.20
37 make ethical decisions
4.21
0.24
31 demonstrate self-control and self-discipline
4.13
0.21
85 work without close supervision
4.09
0.25
38 know own abilities, strengths, and weaknesses
4.08
0.32
73 show appropriate personal appearance and attitude
4.04
0.20
43 demonstrate motivation to learn
4.00
0.41
42 build own self-esteem
3.87
0.26
39 maintain good physical, mental, and emotional health
3.83
0.26
21 appreciate the diversity of values and cultural differences among people
3.74
0.39
81 be loyal to an employer
3.74
0.20
55 articulate personal values and beliefs as they relate to a particular occupation
3.29
0.37
Cluster average
4.04
0.26

Five clusters show substantial similarities between the nine-cluster concept map created by the group of all participants (Figure 2) and the map created by the educators' subgroup (Figure 4). The mean ratings attributed to these five clusters are similar. In addition, most of the items appearing in the five clusters received similar ratings.

Four clusters in the educators' concept map--"Education and Career Attainment," "Analytic and Scientific," "Work Environments," and "Democratic Process and Career Awareness"--do not appear under exactly the same labels in the map of all participants. The composition of outcomes statements in these four clusters has a qualitatively different focus (albeit only slightly different in some cases) from related clusters in the map of all participants. For example, educators sorted outcomes statements into a cluster labeled "Education and Career Attainment" containing eight outcomes statements such as "complete secondary school"; "make a smooth transition from secondary to postsecondary education"; "apply knowledge, skills, and learning strategies to career and life choices"; and "participate in work-based learning experiences." Although the "Educational Attainment" cluster for all participants (containing seven statements total) had many of the same statements, no items linked to careers were present there. Overall, educators gave the "Education and Career Attainment" cluster a higher mean rating than the group of all participants gave the "Educational Attainment" cluster, 4.02 and 3.68, respectively. Educators also gave higher mean ratings than the group of all participants to outcomes statements within the cluster such as "complete secondary school," "make a smooth transition from secondary to postsecondary education," and "enter postsecondary programs without remediation." In contrast, educators rated the outcome "succeed in the transition from secondary or postsecondary education to a 4-year college" much lower than the group of all participants did, 3.17 compared to 3.48. Both groups gave the outcomes statement "make academic progress on grade level" a rating over 4.0, meaning it is a high priority both to all participants and to the subgroup of educators.

The cluster labeled "Analytic and Scientific" in the educators' map is similar to the cluster labeled "Math and Science" in the map of all participants. However, the educators' map adds other outcomes found in the math and science literature: "use computers and other electronic technology to organize, manipulate, and present information"; "organize information through the development of classification rules and systems"; and "communicate ideas by quantifying with whole, rational, real and/or complex numbers," giving this cluster a broader scope and more analytical character than the "Math and Science" cluster shown in the map for all participants. Both groups gave their respective clusters a lower mean rating than almost all other clusters, but still indicated the outcomes to be of a moderate to high priority. The educators' group gave the cluster a mean rating of 3.67 and the group of all participants gave it a slightly lower rating of 3.46.

The clusters labeled "Work Environments" and "Democratic Process and Career Awareness" were a unique blend of outcomes statements. These two clusters received the lowest mean ratings of the educators' subgroup, indicating both clusters were a moderate priority to educators. The cluster labeled "Work Environments" contains 12 outcomes statements including to "adapt to emerging technology and adjust to changing work environments" and "expand own knowledge by making connections with new and unfamiliar knowledge, skills, and experiences." Outcomes statements taken from the humanities, science, and fine arts literature also appear in this cluster. The other cluster labeled "Democratic Process and Career Awareness" contains seven outcomes statements, including several that appear in the cluster labeled "Democratic and Participatory Strategies" in the map of all participants. Outcomes statements appearing in both of these clusters include "prepare for direct participation in the democratic process" and "recognize and apply the democratic principles of justice, equality, responsibility, choice, and freedom." Two outcomes statements added to the "Democratic Process and Career Awareness" cluster giving it more of a career orientation are for students to "have awareness of and interest in technical careers" and to "know the history of a particular occupation."

Finally, similar to the concept map for all participants, three clusters on the educators' map received relatively high bridging values, although they were not the same three clusters. For educators, the three clusters with fairly high bridging values are "Information Use and Decision-Making" with a bridging value of .52, "Work Environments" with a bridging value of .66, and "Democratic Process and Career Awareness" with a bridging value of .80. Recall that high bridging values mean the educators did not tend to sort these items consistently into the same categories but, rather, grouped them in more scattered ways. These clusters are not as distinct in the minds of educators as clusters such as "Personal Attributes, Attitudes, and Employability Skills" and "Education and Career Attainment."
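The bridging values reported throughout these tables come from the concept mapping analysis itself; as noted above, high values simply indicate items that participants did not sort together consistently. As a rough illustration of that underlying idea only, and not the bridging formula used in the study, the consistency with which a subgroup co-sorted the statements in a cluster can be sketched from raw sort data as follows. The data structures here are hypothetical.

import itertools
from collections import defaultdict

# Hypothetical input (not the study's actual sort files):
#   sorts[pid] -> list of piles; each pile is a set of statement ids that
#                 participant pid placed in the same group
def co_sort_proportions(sorts):
    """Share of participants who put each pair of statements in the same pile."""
    pair_counts = defaultdict(int)
    for piles in sorts.values():
        for pile in piles:
            for a, b in itertools.combinations(sorted(pile), 2):
                pair_counts[(a, b)] += 1
    n = len(sorts)
    return {pair: count / n for pair, count in pair_counts.items()}

def cluster_consistency(co_sort, cluster_items):
    """Average within-cluster co-sort share; a low value plays the role that a
    high bridging value plays in the report (items grouped less consistently)."""
    pairs = list(itertools.combinations(sorted(cluster_items), 2))
    return sum(co_sort.get(pair, 0.0) for pair in pairs) / len(pairs)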

Employer Perspectives

The nine-cluster solution created by the subgroup of employers is the concept map that most closely resembles the map for all participants. Eight of the nine clusters have the same labels because the items contained within the clusters are similar. Another commonality between the two maps is that the clusters appear in basically the same locations on both the employers' map and the map of all participants. In addition, most of the clusters received a comparable mean rating. Figure 5 presents the nine-cluster concept map for employers. Table 7 identifies the outcomes statements within each cluster, their ratings, the average cluster ratings, and the average bridging values for the employers' map.

Table 7
Mean Ratings and Mean Bridging Values for
Tech Prep Student Outcomes by Cluster for Employers

Cluster 1: Communications | Rating | Bridging
1 create meaning from messages communicated through listening | 4.32 | 0.71
47 communicate ideas and information through speaking | 4.26 | 0.57
26 demonstrate oral and verbal proficiency in technical communication (reports, policies, procedures) | 4.11 | 0.47
78 apply the English language correctly (spelling, grammar, structure) | 4.00 | 0.43
30 communicate ideas and information through writing | 3.89 | 0.43
2 understand nonverbal communication | 3.32 | 0.71
97 understand and communicate in a second language | 2.26 | 0.63
4 communicate ideas and emotions through the fine arts (e.g., art, music, dance) | 2.00 | 1.00
Cluster average | 3.52 | 0.62

Cluster 2: Math & Science | Rating | Bridging
6 apply basic algebra and geometry to solve technical and work-related problems | 4.26 | 0.04
29 use computers and other electronic technology to gather, organize, manipulate, and present information | 4.11 | 0.18
53 use division, multiplication, addition, and subtraction with real numbers, decimals, fractions, integers, roots, and powers | 4.05 | 0.07
40 apply logical reasoning to develop solutions to complex problems | 3.89 | 0.26
41 select, use, and maintain appropriate tools, information, materials, and equipment | 3.89 | 0.26
48 demonstrate an ability to calculate through ratios, proportions, and percentages | 3.74 | 0.04
72 use research tools to locate sources of information and ideas relevant to a specific need or problem | 3.68 | 0.33
94 read and create charts, tables, and graphs | 3.68 | 0.17
7 organize information through the development and use of classification rules and systems | 3.63 | 0.30
13 use appropriate and relevant scientific methods to solve specific problems in real-life situations | 3.53 | 0.12
56 use scientific methods to acquire information, plan investigations, use scientific tools, and communicate results | 3.26 | 0.16
24 communicate ideas by quantifying with whole, rational, real, and/or complex numbers | 3.16 | 0.34
93 apply advanced algebra, analytic geometry, and/or calculus to solve technical and work-related problems | 3.05 | 0.10
62 use the metric system and convert between metrics and traditional systems | 3.00 | 0.14
52 use models and scales to explain or predict the organization, function, and behavior of objects, materials, and living things | 2.68 | 0.19
Cluster average | 3.58 | 0.18

Cluster 3: Technology & Quality Management | Rating | Bridging
9 adapt to emerging technology and adjust to changing work environments | 4.47 | 0.48
79 apply appropriate safety and environmental measures | 4.37 | 0.57
28 apply group problem-solving strategies | 4.00 | 0.57
83 prepare and follow schedules, and manage time efficiently | 4.00 | 0.49
66 use goal-relevant activities, rank them, and allocate time for them | 3.58 | 0.46
92 understand how technology affects quality of life | 3.32 | 0.64
34 know how social, organizational, and technological systems work | 3.16 | 0.78
Cluster average | 3.78 | 0.57

Cluster 4: Information Use & Decision Making | Rating | Bridging
90 recognize and apply quality standards | 4.16 | 0.37
50 construct meaning through reading for information, literary experience, and to perform a task | 3.89 | 0.37
51 use critical thinking skills in a variety of situations | 3.89 | 0.35
71 use decision-making processes to make informed choices among options | 3.84 | 0.34
87 understand the relationships between theory and practice in a technical area | 3.68 | 0.47
74 acquire, store, allocate, and use materials and space efficiently | 3.32 | 0.35
64 design, maintain, and improve systems | 3.21 | 0.50
70 prepare and use budgets, make forecasts, keep records, and make adjustments to meet objectives | 3.05 | 0.37
Cluster average | 3.63 | 0.39

Cluster 5: Education & Career Attainment | Rating | Bridging
22 complete secondary school | 4.63 | 0.36
88 participate in work-based learning experiences | 4.37 | 0.57
11 apply knowledge, skills, and learning strategies to career and life choices | 4.00 | 0.59
77 achieve certification of mastery in an occupation | 3.95 | 0.40
69 enter postsecondary programs without remediation | 3.84 | 0.47
3 make academic progress on grade level | 3.79 | 0.38
23 expand own knowledge by making connections with new and unfamiliar knowledge, skills, and experiences | 3.74 | 0.59
49 complete postsecondary school | 3.74 | 0.36
75 have awareness of and interest in technical careers | 3.53 | 0.55
60 make a smooth transition from secondary to postsecondary education | 3.42 | 0.33
89 succeed in the transition from secondary or postsecondary education to a 4-year college | 3.32 | 0.34
54 achieve and maintain employability in a high-wage job | 3.26 | 0.63
58 know the history of a particular occupation | 2.63 | 0.60
32 earn college credit in high school | 2.37 | 0.33
Cluster average | 3.61 | 0.46

Cluster 6: School-to-Work Transition | Rating | Bridging
8 recognize the need for lifelong learning to enhance skills and learn new skills | 4.68 | 0.53
46 attend school regularly | 4.42 | 0.49
36 make a successful transition from education to employment | 4.37 | 0.52
45 demonstrate a positive attitude toward school | 3.63 | 0.49
80 develop and follow through on individual career plans and goals | 3.63 | 0.48
82 gain experience in all aspects of an industry | 3.16 | 0.58
Cluster average | 3.98 | 0.52

Cluster 7: Personal Attributes, Attitudes, & Employability Skills | Rating | Bridging
76 be honest and demonstrate integrity | 4.89 | 0.00
68 be dependable and punctual | 4.79 | 0.03
73 show appropriate personal appearance and attitude | 4.32 | 0.01
31 demonstrate self-control and self-discipline | 4.26 | 0.01
65 monitor and correct own performance | 4.21 | 0.32
33 know employer expectations for job performance | 4.16 | 0.51
37 make ethical decisions | 4.12 | 0.08
35 demonstrate the ability to be adaptable and flexible | 4.11 | 0.08
85 work without close supervision | 4.11 | 0.15
43 demonstrate motivation to learn | 4.05 | 0.34
84 work under tension or pressure | 4.00 | 0.15
38 know own abilities, strengths, and weaknesses | 3.95 | 0.10
39 maintain good physical, mental, and emotional health | 3.89 | 0.12
81 be loyal to an employer | 3.84 | 0.09
44 use initiative, imagination, and creativity | 3.79 | 0.49
42 build own self-esteem | 3.74 | 0.09
Cluster average | 4.16 | 0.17

Cluster 8: Work & Interpersonal Relationships | Rating | Bridging
14 participate as a member of a team | 4.53 | 0.34
15 show good working relationships with superiors and coworkers in an occupational role | 4.42 | 0.25
12 get along with a variety of people | 4.37 | 0.28
5 demonstrate consistent, respectful, and caring behavior | 4.32 | 0.20
17 serve clients/customers | 4.26 | 0.46
25 plan and work together in meetings | 4.11 | 0.53
96 understand the principles of competition, cooperation, and leadership in a work environment | 3.95 | 0.38
21 appreciate the diversity of values and cultural differences among people | 3.63 | 0.40
10 exercise leadership in a variety of situations | 3.58 | 0.31
91 understand the norms and values of the work culture | 3.53 | 0.55
16 evaluate others' performance and provide feedback | 3.11 | 0.33
55 articulate personal values and beliefs as they relate to a particular occupation | 3.05 | 0.95
27 appreciate own and others' artistic products and performances | 2.42 | 0.44
Cluster average | 3.79 | 0.42

Cluster 9: Democratic/Participatory Strategies | Rating | Bridging
20 know how to give and take instructions | 4.16 | 0.61
61 recognize and apply the democratic principles of justice, equality, responsibility, choice, and freedom | 3.16 | 0.49
18 teach others new skills | 3.05 | 0.63
59 observe, analyze, and interpret human behaviors to acquire a better understanding of self, families, and other human relationships | 3.05 | 0.53
19 resolve conflict based on divergent interests and perspectives | 3.00 | 0.54
86 demonstrate awareness of workforce and societal trends | 3.00 | 0.59
57 be critically aware of social issues involved in a field of interest | 2.95 | 0.80
95 prepare for direct participation in the democratic process | 2.89 | 0.62
98 recognize differences and commonalities in the human experience through productions, performances, or interpretations | 2.68 | 0.62
63 recognize the geographic interaction between people and their surroundings and make responsible decisions for the environment | 2.58 | 0.45
67 recognize varying forms of government and address issues of importance to citizens in a democracy | 2.37 | 0.56
Cluster average | 2.99 | 0.59

In comparing the concept map created for employers to the map for all participants, only one cluster was labeled differently: "Education and Career Attainment"--the same label used in the educators' map. This label was used because, like educators, employers created a cluster that combined outcomes linked to both education and career attainment. Outcomes statements appearing in the "Education and Career Attainment" cluster in the employers' map include "complete secondary school"; "participate in work-based learning experiences"; "apply knowledge, skills, and learning strategies to career and life choices"; and "achieve certification of mastery in an occupation." For employers, this cluster contained several more outcomes statements than it did for either the group of all participants or the educators' subgroup. In addition, the cluster received a mean rating of 3.61, showing it to be of lower relative priority to employers than other clusters such as "Personal Attributes, Attitudes, and Employability Skills," "Work and Interpersonal Relationships," and "School-to-Work Transition." (However, this mean rating is similar to the ratings given by the group of all participants and by the educators' subgroup.)

A final result shows that several of the clusters in the employers' map have fairly high bridging values. In fact, only two clusters have extremely low bridging values--"Personal Attributes, Attitudes, and Employability Skills" and "Math and Science"--meaning the outcomes statements in these clusters were sorted together consistently by persons in the employers' subgroup. Four clusters have quite high bridging values: "School-to-Work Transition," "Technology and Quality Management," "Communications," and "Democratic and Participatory Strategies," suggesting that the employers' subgroup did not sort the statements within these clusters together consistently. Finally, the cluster receiving the lowest priority rating across all the concept maps was the one labeled "Democratic and Participatory Strategies." Employers gave this cluster a mean priority rating of 2.99, indicating the student outcomes related to it are only a moderate priority.

Student Perspectives

In contrast to the employers' map, the nine-cluster concept map created by the subgroup of students is quite different from the map for all participants. In fact, five of the nine clusters appearing on the students' map are distinct from the clusters appearing on any of the other maps because the students sorted the Tech Prep outcomes statements very differently from the other subgroups. However, four clusters are similar on the students' map and the map for all participants: "Educational Attainment," "School-to-Work Transition," "Personal Attributes, Attitudes, and Employability Skills," and "Work and Interpersonal Relationships." These clusters appear in roughly the same positions on the two maps, and three of them received similar mean priority ratings. Only the cluster labeled "Educational Attainment" is markedly different, with a mean rating of 3.92 for students compared to 3.68 for all participants. Students' mean rating for "Educational Attainment" approaches the mean rating of 4.02 that educators gave the cluster labeled "Education and Career Attainment," suggesting persons within the educational system may attribute greater value to educational outcomes than employers who operate outside the system. Figure 6 presents the nine-cluster concept map for students. Table 8 identifies the outcomes statements within each cluster, their ratings, the average cluster ratings, and the average bridging values for the students' map.

Table 8
Mean Ratings and Mean Bridging Values for
Tech Prep Student Outcomes by Cluster for Students

Cluster 1: Communications & Democratic Process | Rating | Bridging
1 create meaning from messages communicated through listening | 4.17 | 0.91
61 recognize and apply the democratic principles of justice, equality, responsibility, choice, and freedom | 4.00 | 1.00
98 recognize differences and commonalities in the human experience through productions, performances, or interpretations | 3.89 | 0.79
2 understand nonverbal communication | 3.44 | 0.71
4 communicate ideas and emotions through the fine arts (e.g., art, music, dance) | 3.28 | 0.93
95 prepare for direct participation in the democratic process | 3.06 | 0.96
Cluster average | 3.58 | 0.90

Cluster 2: Work & Interpersonal Relationships | Rating | Bridging
12 get along with a variety of people | 4.39 | 0.29
14 participate as a member of a team | 4.28 | 0.40
44 use initiative, imagination, and creativity | 4.22 | 0.43
25 plan and work together in meetings | 4.06 | 0.38
37 make ethical decisions | 4.06 | 0.40
21 appreciate the diversity of values and cultural differences among people | 3.89 | 0.52
57 be critically aware of social issues involved in a field of interest | 3.83 | 0.60
55 articulate personal values and beliefs as they relate to a particular occupation | 3.72 | 0.36
28 apply group problem-solving strategies | 3.67 | 0.82
27 appreciate own and others' artistic products and performances | 3.50 | 0.66
63 recognize the geographic interaction between people and their surroundings and make responsible decisions for the environment | 3.44 | 0.66
59 observe, analyze, and interpret human behaviors to acquire a better understanding of self, families, and other human relationships | 3.33 | 0.53
19 resolve conflict based on divergent interests and perspectives | 3.11 | 0.59
Cluster average | 3.80 | 0.51

Cluster 3: Personal Attributes, Attitudes, & Employability Skills | Rating | Bridging
31 demonstrate self-control and self-discipline | 4.78 | 0.26
68 be dependable and punctual | 4.72 | 0.10
5 demonstrate consistent, respectful, and caring behavior | 4.61 | 0.23
39 maintain good physical, mental, and emotional health | 4.61 | 0.07
15 show good working relationships with superiors and coworkers in an occupational role | 4.59 | 0.24
38 know own abilities, strengths, and weaknesses | 4.44 | 0.24
81 be loyal to an employer | 4.44 | 0.18
35 demonstrate the ability to be adaptable and flexible | 4.39 | 0.22
42 build own self-esteem | 4.33 | 0.12
73 show appropriate personal appearance and attitude | 4.33 | 0.11
17 serve clients/customers | 3.83 | 0.22
Cluster average | 4.49 | 0.17

Cluster 4: Career/Work Management & Initiative | Rating | Bridging
80 develop and follow through on individual career plans and goals | 4.50 | 0.55
20 know how to give and take instructions | 4.39 | 0.32
65 monitor and correct own performance | 4.39 | 0.52
83 prepare and follow schedules, and manage time efficiently | 4.22 | 0.36
85 work without close supervision | 4.17 | 0.15
96 understand the principles of competition, cooperation, and leadership in a work environment | 4.17 | 0.27
90 recognize and apply quality standards | 4.06 | 0.18
10 exercise leadership in a variety of situations | 3.94 | 0.29
86 demonstrate awareness of workforce and societal trends | 3.89 | 0.32
75 have awareness of and interest in technical careers | 3.78 | 0.36
18 teach others new skills | 3.50 | 0.41
84 work under tension or pressure | 3.33 | 0.29
16 evaluate others' performance and provide feedback | 3.28 | 0.29
Cluster average | 3.97 | 0.35

Cluster 5: Educational Attainment | Rating | Bridging
46 attend school regularly | 4.67 | 0.09
3 make academic progress on grade level | 4.39 | 0.04
45 demonstrate a positive attitude toward school | 4.39 | 0.32
22 complete secondary school | 4.33 | 0.01
89 succeed in the transition from secondary or postsecondary education to a 4-year college | 4.06 | 0.00
49 complete postsecondary school | 3.83 | 0.08
60 make a smooth transition from secondary to postsecondary education | 3.44 | 0.16
32 earn college credit in high school | 3.33 | 0.01
69 enter postsecondary programs without remediation | 2.82 | 0.00
Cluster average | 3.92 | 0.08

Cluster 6: School-to-Work Transition | Rating | Bridging
8 recognize the need for lifelong learning to enhance skills and learn new skills | 4.50 | 0.62
43 demonstrate motivation to learn | 4.50 | 0.62
11 apply knowledge, skills, and learning strategies to career and life choices | 4.44 | 0.39
33 know employer expectations for job performance | 4.44 | 0.55
88 participate in work-based learning experiences | 4.39 | 0.39
23 expand own knowledge by making connections with new and unfamiliar knowledge, skills, and experiences | 4.33 | 0.80
54 achieve and maintain employability in a high-wage job | 4.06 | 0.50
82 gain experience in all aspects of an industry | 3.89 | 0.42
66 use goal-relevant activities, rank them, and allocate time for them | 3.61 | 0.50
7 organize information through the development and use of classification rules and systems | 3.56 | 0.37
58 know the history of a particular occupation | 3.50 | 0.70
Cluster average | 4.10 | 0.53

Cluster 7: Math, Science, & Communications | Rating | Bridging
78 apply the English language correctly (spelling, grammar, structure) | 4.67 | 0.45
51 use critical thinking skills in a variety of situations | 4.11 | 0.32
50 construct meaning through reading for information, literary experience, and to perform a task | 3.83 | 0.44
6 apply basic algebra and geometry to solve technical and work-related problems | 3.67 | 0.08
41 select, use, and maintain appropriate tools, information, materials, and equipment | 3.67 | 0.24
53 use division, multiplication, addition, and subtraction with real numbers, decimals, fractions, integers, roots, and powers | 3.67 | 0.09
94 read and create charts, tables, and graphs | 3.61 | 0.37
56 use scientific methods to acquire information, plan investigations, use scientific tools, and communicate results | 3.33 | 0.13
52 use models and scales to explain or predict the organization, function, and behavior of objects, materials, and living things | 3.22 | 0.19
93 apply advanced algebra, analytic geometry, and/or calculus to solve technical and work-related problems | 3.22 | 0.09
13 use appropriate and relevant scientific methods to solve specific problems in real-life situations | 3.17 | 0.27
48 demonstrate an ability to calculate through ratios, proportions, and percentages | 3.00 | 0.24
62 use the metric system and convert between metrics and traditional systems | 2.56 | 0.23
Cluster average | 3.52 | 0.24

Cluster 8: Technical Communications | Rating | Bridging
47 communicate ideas and information through speaking | 4.33 | 0.68
72 use research tools to locate sources of information and ideas relevant to a specific need or problem | 3.89 | 0.60
26 demonstrate oral and verbal proficiency in technical communication (reports, policies, procedures) | 3.83 | 0.82
30 communicate ideas and information through writing | 3.56 | 0.50
87 understand the relationships between theory and practice in a technical area | 3.56 | 0.62
97 understand and communicate in a second language | 3.22 | 0.65
24 communicate ideas by quantifying with whole, rational, real, and/or complex numbers | 2.94 | 0.47
Cluster average | 3.62 | 0.62

Cluster 9: Work, Technology, & Information Use | Rating | Bridging
36 make a successful transition from education to employment | 4.56 | 0.65
9 adapt to emerging technology and adjust to changing work environments | 4.44 | 0.74
29 use computers and other electronic technology to gather, organize, manipulate, and present information | 4.17 | 0.29
79 apply appropriate safety and environmental measures | 4.00 | 0.42
40 apply logical reasoning to develop solutions to complex problems | 3.94 | 0.37
70 prepare and use budgets, make forecasts, keep records, and make adjustments to meet objectives | 3.94 | 0.61
92 understand how technology affects quality of life | 3.94 | 0.32
64 design, maintain, and improve systems | 3.78 | 0.27
71 use decision-making processes to make informed choices among options | 3.67 | 0.34
74 acquire, store, allocate, and use materials and space efficiently | 3.67 | 0.28
77 achieve certification of mastery in an occupation | 3.67 | 0.49
Cluster average | 3.98 | 0.44

The five clusters unique to students are "Work, Technology, and Information Use," "Career/Work Management and Initiative," "Technical Communications," "Communications and Democratic Process," and "Math, Science, and Communications." The first two of these clusters were given average priority ratings substantially higher than the ratings attributed to the other three. "Work, Technology, and Information Use" received a mean priority rating of 3.98, and "Career/Work Management and Initiative" received a mean priority rating of 3.97, both very near the high priority level of 4.0. The "Work, Technology, and Information Use" cluster contained 11 outcomes statements that, taken together, suggest students in Tech Prep programs are well aware of the need to be competent at using technology and information to operate effectively in contemporary workplaces. For students, this cluster falls between two career-oriented clusters--"School-to-Work Transition" and "Technical Communications"--and this finding might provide valuable insight into ways to link and integrate these and other clusters of outcomes.

The cluster labeled "Career/Work Management and Initiative" had a similar focus on work, but it encompassed 13 items related to managing one's and others' work and careers. Similar to the previous discussion about students acute sense of awareness of careers and work, the overall character of this cluster suggests students in Tech Prep programs recognize the importance of developing management skills and personal initiative related to immediate and future work. They give high ratings to outcomes demonstrating the ability to show leadership and strong supervisory and management competencies.

The three other clusters unique to students are "Technical Communications," "Communications and Democratic Process," and "Math, Science, and Communications." All three of these clusters received mean priority ratings between 3.52 and 3.62, indicating they were not viewed as being as important as other clusters but were still of moderate priority to students. Immediately apparent from the titles of these three clusters is that students sorted outcomes statements related to communications into all three of them. Not surprisingly, then, all three clusters have relatively high bridging values, ranging from .62 to .90, providing additional confirmation that the items in these clusters were sorted in different ways by different students.

In terms of the organization of the clusters, the cluster labeled "Technical Communications" appears in the northwest portion of the map, along with the clusters labeled "Math, Science, and Communications" and "Communications and Democratic Process." Seven outcomes statements appear in the cluster labeled "Technical Communications," including "communicate ideas and information through speaking," "demonstrate oral and verbal proficiency in technical communication," and "understand the relationships between theory and practice in a technical area." The cluster labeled "Communications and Democratic Process" contains six outcomes statements, including "create meaning from messages communicated through listening"; "recognize and apply the democratic principles of justice, equality, responsibility, choice, and freedom"; and "recognize differences and commonalities in the human experience through productions, performances, or interpretations." Within this cluster are outcomes taken from the English/communications, humanities, and fine arts literature, suggesting a potential area where various outcomes are interrelated from the perspective of students.

Finally, the cluster labeled "Math, Science, and Communications" contained 13 outcomes statements related to math, science, and English/communications. The cluster appears to be a mixture of outcomes that might traditionally be expected of high school students. Included among these outcomes are the following statements: "apply the English language correctly (spelling, grammar, structure)"; "use critical thinking skills in a variety of situations"; "apply basic algebra and geometry to solve technical and work-related problems"; and "use scientific methods to acquire information, plan investigations, use scientific tools, and communicate results." Several outcomes statements specifically related to advanced mathematics and science were present in this cluster and given relatively low priority ratings by students, ranging from 3.33 for "use scientific methods to acquire information, plan investigations, use scientific tools, and communicate results" to 2.56 for "use the metric system and convert between metrics and traditional systems." These results suggest students in Tech Prep programs may not appreciate these academic outcomes, particularly those related to math and science. It is clear that students undervalued these outcomes relative to the educators and employers who participated in this study.


IMPLICATIONS FOR POLICY AND PRACTICE

The intent of this study was to gain a better sense of the perspectives of three key stakeholder groups toward student outcomes associated with Tech Prep. Knowing how educators, students, and employers conceptualize outcomes could provide several benefits to practitioners and policymakers. First, understanding the similarities and differences in the perspectives of the three stakeholder groups could inform practitioners about how to proceed with various aspects of program implementation. Second, knowing the priorities that stakeholders place on various student outcomes could help to focus attention and resources on aspects of Tech Prep thought most likely to produce desired results. Third, knowing more about outcomes could result in the development of more meaningful outcomes assessment procedures and instruments, especially where there is a high level of consensus on particular foci of Tech Prep. Finally, understanding the stakeholder perspectives toward Tech Prep could contribute to building more accountability into evolving Tech Prep systems, thereby increasing their potential for continued public support.

Formal program evaluation and outcomes assessment for Tech Prep have been limited, but when evaluations have been conducted they have tended to focus on compliance-oriented measures required by governmental units. Outcomes measures linked to enrollments, program completion, and job placement have been typical of the kinds of measures demanded by state and federal agencies. The national evaluation sponsored by the U.S. Department of Education concentrates much of its attention on having local Tech Prep coordinators estimate the number of students who reach specified points in the educational and employment system, such as high school completion, matriculation into two-year postsecondary education, two-year postsecondary completion, and job entry or matriculation into four-year postsecondary education. Such estimates may be useful for understanding the potential scope and scale of the nation's emerging Tech Prep system, but they are less helpful for understanding the ways programs should operate and benefit students on more personal and consequential levels.

When local coordinators have been asked to specify outcomes they believe to be appropriate for students in vocational-technical programs, they typically identify educational, economic, and psychosocial outcomes (McCaslin, 1990). In 1992, Hammons surveyed local Tech Prep coordinators (educators) and found they supported a wide range of performance indicators for Tech Prep programs, including outcomes in all three of the categories described by McCaslin. Later, in 1994, many of Hammons' findings were supported when a national sample of local coordinators indicated that a wide range of academic, vocational, and employment-related outcomes were a high priority for students (Bragg et al., 1994). Now, results of this concept mapping study show that the three stakeholder groups of educators, students, and employers also give high priority to a wide array of student outcomes. All three stakeholder groups rated nearly all of the 98 student outcomes statements at a moderate or high priority level. (For a summary comparison of how the three stakeholder groups rated the outcomes statements by cluster, see Appendix B.) This finding indicates it would be a mistake to limit assessments of student outcomes to only a few outcomes measures. Rather, multiple measures addressing a wide range of outcomes are necessary to determine how students benefit from educational and employment-related experiences. Unfortunately, assessment measures and methodologies are not available to conduct wide-scale assessments of many of the student outcomes identified by the stakeholders in this study, underscoring the need to create valid, reliable, and meaningful outcomes assessments for Tech Prep programs.

Are certain Tech Prep student outcomes grouped together in a logical, consistent pattern? Do the stakeholder groups perceive the groupings (clusters) of Tech Prep student outcomes in similar or different ways? Are particular clusters of student outcomes more important than others to the subgroups and to the group as a whole? By looking at the concept maps organized into nine-cluster solutions by the three stakeholder groups and the entire group of participants, it is possible to answer these important questions. Indeed, results show there are similarities in how the three groups conceptualize and prioritize Tech Prep student outcomes. All three stakeholder groups sorted many of the same student outcomes into three clusters labeled "Personal Attributes, Attitudes, and Employability Skills"; "School-to-Work Transition"; and "Work and Interpersonal Relationships." All of these clusters were given a rating near or at the high priority level of 4.0 (out of 5.0) by the subgroups, showing a high degree of consensus in how the stakeholders conceptualized these sets of student outcomes for Tech Prep (see Table 9). These results also suggest that student outcomes linking school-based education to effective attitudes and behaviors in the workplace are a high priority to all three stakeholder groups. Participants organized these outcomes into three distinct groupings: one dealing with personal attitudes and behaviors at work, a second focusing on interpersonal attitudes and behaviors at work, and a third concentrating on attitudes and behaviors that transfer between the school and work environments.

Three additional clusters were created by two of the subgroups: "Information Use and Decision-Making," "Education and Career Attainment," and "Communications." Educators and employers grouped the first two of these clusters in similar ways; however, their priorities diverged for "Education and Career Attainment," where the average cluster rating given by educators exceeded that of employers by a wide margin, 4.02 compared to 3.61. Although students did not link concepts associated with careers to educational attainment in the same way as educators and employers, the priority they placed on the "Educational Attainment" cluster was also quite high (3.92). These results show that outcomes associated with advancing within the educational system are a high priority to educators and students within the system, but not as much to employers outside of it. Consistently, employers give student outcomes associated with school-to-work transition and employment a higher priority than outcomes more closely associated with the educational system itself. This finding raises the question of what level of priority to place on educational attainment outcomes that address whether students are making progress on grade level, graduating from high school, matriculating to the two-year postsecondary level, and so forth. The question is particularly pertinent with regard to a program such as Tech Prep where a school-to-work focus is important to all stakeholders. Given that position, how far should Tech Prep programs go to accommodate the perspective of employers, who know the workplace best? How much weight should be given to their preference for vocationally oriented outcomes over educational outcomes?

Yet, the issue is not simply one of educators and students giving greater priority than employers to educational outcomes. Indeed, the situation is much more complex. In fact, all the subgroups rated many outcomes statements associated with traditional academic subjects such as mathematics, science, English, humanities, social studies, and the fine arts lower than outcomes aligned with school-to-work transition and employment. Although all three groups organized outcomes statements into distinct clusters aligned with these academic concepts, most rated these clusters lower than the ones having a work or career orientation. Furthermore, within the clusters of "Math and Science" and "Analytic and Scientific," the stakeholders rated outcomes at the basic level more highly than those at the advanced level, showing a preference for students' mastery of more fundamental academic concepts over more advanced ones. Also, the academic clusters were segregated from the vocational clusters on all the concept maps, with the academic concepts placed on the west side of each map and the vocational concepts on the east. This result gives the impression that sets of outcomes associated with vocational and academic education may be both distinct and independent from one another. However, some clusters do not fit this conclusion. In all of the concept maps, stakeholders created one or more clusters containing outcomes having to do with technology, information use, decision-making, work, and management. The outcomes within these clusters were drawn from across disciplines such as the humanities, social studies, science, and vocational-technical education. Typifying this kind of cluster are the one created by students labeled "Work, Technology, and Information Use" and the one developed by employers labeled "Technology and Quality Management." Within each of these clusters is a nucleus of outcomes taken from a wide range of vocational and academic subject matter, potentially providing ideas for integrating Tech Prep instruction.

An additional observation should be made about the nature of the three subgroups' conceptualizations of vocational and academic outcomes. Consistently, vocationally oriented outcomes received high or nearly high priority ratings, while academically oriented outcomes received lower (albeit moderate rather than low) ratings. Within the clusters of academically oriented outcomes, statements linked to the academic areas of social studies and humanities received the lowest ratings. All three stakeholder groups created clusters with outcomes statements linked to "Democratic Process and Career Awareness" or "Democratic and Participatory Strategies," and all three gave these clusters low mean ratings relative to the other clusters. In fact, the cluster labeled "Democratic and Participatory Strategies" created by employers was rated the lowest of all clusters, with an average rating of 2.99. Is this pattern a random occurrence, or is there something about Tech Prep that suggests democratic outcomes should receive a lower priority than other outcomes? Public policy specifies that the Tech Prep curriculum should be composed of mathematics, science, English/communications, and vocational-technical education. Rarely is the area of social studies or humanities mentioned as central. In addition, many local consortia and state agencies profess that a primary purpose of Tech Prep is to "eliminate the general track." These constituents are attempting to improve education for the neglected majority of students who have been engaged in the general track, but in so doing they may shift priorities away from some of the more traditional social and democratic functions of public education. Is this shift actually occurring? We could not identify data to suggest that such a curricular shift is occurring; however, it is an important issue to monitor. What are the consequences of shifting priorities away from traditional academic subjects to education more highly focused on school-to-work transition, technologies, and vocations? Without more attention paid to formal evaluation, this question will remain unanswered.

In summary, this study attempted to better understand Tech Prep student outcomes from the perspectives of educators, students, and employers actively engaged in implementing Tech Prep. Knowing how these groups conceptualize student outcomes has important implications for understanding the fundamental objectives of Tech Prep, for planning and implementing programs, and for assessing outcomes in the future. Also, by uncovering various conceptualizations of Tech Prep, it may be possible to identify conflicting perspectives held by disparate stakeholder groups. Information from additional stakeholder groups in other localities, such as rural and suburban areas, would help to illuminate the ways other constituents think about Tech Prep. In addition, obtaining information from policymakers, parents, counselors, and still other groups could help in the development of outcomes assessments. As Tech Prep implementation continues, more attention must be devoted to student outcomes, and this study takes an important next step in that direction.


REFERENCES

Boesel, D., Rahn, M. L., & Deich, S. (1994, July). Final report to Congress, Volume III, Program improvement: Education reform. Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

Bragg, D. D. (Ed.). (1992, December). Alternative approaches to outcomes assessment for postsecondary vocational education (MDS-239). Berkeley: National Center for Research in Vocational Education, University of California at Berkeley.

Bragg, D. D., Layton, J., & Hammons, F. (1994). Tech Prep implementation in the United States: Promising trends and lingering challenges (MDS-714). Berkeley: National Center for Research in Vocational Education, University of California at Berkeley.

Bragg, D., & Layton, J. (1995, May). The role of the urban community college in educational reform. Education and Urban Society, 27(3), 294-312.

Brown, J. M., Pucel, D. J., Johnson, R., & Kuchinke, P. (1994). The Tech Prep consortia evaluation system 1993/94 cohort summary preliminary report. St. Paul: University of Minnesota, Department of Vocational and Technical Education.

Campbell, J. R. (1995, March). Delaware consortium on technical preparation programs: Evaluation model. Dover, DE: State Consortium on Technical Preparation Programs.

Caracelli, V. J., & Riggin, L. J. C. (1994, June). Mixed-method evaluation: Developing quality criteria through concept mapping. Evaluation Practice, 15(2), 139-152.

Connell, T., & Mason, S. (1995, April). School to work transition: Issues and strategies for evaluation and program improvement. Paper presentation at the annual meeting of the American Educational Research Association in San Francisco, CA.

Decision Information Resources, Inc. (n.d.). Evaluation of Tech Prep system development and implementation in Texas public schools and institutions of higher education. Houston, TX: Author.

Dornsife, C. (1992). Beyond articulation: The development of Tech Prep programs (MDS-311). Berkeley: National Center for Research in Vocational Education, University of California at Berkeley.

Everitt, B. (1980). Cluster analysis (2nd ed.). New York: Halsted Press.

Grayson, T. E. (1992). Concept mapping. In D. D. Bragg (Ed.), Alternative approaches to outcomes assessment for postsecondary vocational education (MDS-239) (pp. 65-93). Berkeley: National Center for Research in Vocational Education, University of California at Berkeley.

Hammons, F. T. (1992, April). The first step in Tech-Prep program evaluation: The identification of program performance indicators. Unpublished doctoral dissertation, Virginia Polytechnic Institute and State University.

Hammons, F., & Pittman, L. (1995, March). New Hampshire statewide Tech Prep evaluation. Miami Beach: Hammons, Pittman & Associates, Inc.

Harman, H., & Stowers, P. (1995, February). Status of the Tech Prep associate degree initiative in West Virginia. Charleston, WV: Appalachia Educational Laboratory.

Hershey, A., & Silverberg, M. (1994, September). The emergence of Tech-Prep at the state and local levels. Washington, DC: U.S. Department of Education, Office of the Under Secretary Planning and Evaluation Service.

Hershey, A., Silverberg, M., & Owens, T. (1994, October). The diverse forms of Tech Prep: Implementation approaches in ten local consortia. Washington, DC: U.S. Department of Education, Office of the Under Secretary Planning and Evaluation Service.

Hoachlander, E. G., & Rahn, M. L. (1992, March). Performance measures and standards for vocational education: 1991 survey results (MDS-388). Berkeley: National Center for Research in Vocational Education, University of California at Berkeley.

Hoachlander, E. G., Levesque, K., & Rahn, M. L. (1992, July). Accountability for vocational education: A practitioner's guide (MDS-407). Berkeley: National Center for Research in Vocational Education, University of California at Berkeley.

Hull, D., & Parnell, D. (Eds.). (1991). Tech Prep associate degree: A win/win experience. Waco, TX: Center for Occupational Research and Development.

Jurgens, C. (1995). Nebraska Tech Prep: Executive summary fiscal year 1995. Lincoln: Nebraska Department of Education.

Keller, R. (1995, January). Evaluation of Colorado Tech Prep educational initiative: FY 1994 report. Aurora, CO: Mid-Continent Regional Educational Laboratory, Inc.

Key, C. (1994). Synthesis of literature related to Tech-Prep outcomes. In G. Baker III (Ed.), A handbook on the community college in America: Its history, mission and management (pp. 137-150). Westport, CT: Greenwood Press.

Kulm, G., & Malcom, S. M. (Eds.). (1991). Science assessment in the service of reform. Washington, DC: American Association for the Advancement of Science.

Layton, J., & Bragg, D. (1992). Initiation of Tech Prep by the fifty states. In D. D. Bragg, (Ed.), Implementing Tech Prep: A guide to planning a quality initiative (MDS-241) (pp. 4-1 to 4-18). Berkeley: National Center for Research in Vocational Education, University of California at Berkeley.

MGT of America, Inc. (1995, November). An evaluation of Tech Prep in Ohio: Year one final report. Tallahassee, FL: Author.

McCaslin, N. L. (1990). A framework for evaluating local vocational education programs. Columbus: ERIC Clearinghouse on Adult, Career, and Vocational Education, Ohio State University.

McCaslin, N., & Headley, S. (1993). A national study of approved state systems of performance measures and standards for vocational education. Columbus: Ohio State University, Comprehensive Vocational Education Program.

McKinney, F., Fields, E., Kurth, P., & Kelly, F. (1988). Factors influencing the success of secondary/postsecondary vocational-technical education articulation programs. Columbus: National Center for Research in Vocational Education, Ohio State University.

National Center for Research in Vocational Education. (1993, November). Establishing integrated Tech Prep programs in urban schools: Plans developed at the NCRVE 1993 national institute (MDS-770). Berkeley: National Center for Research in Vocational Education, University of California at Berkeley.

National Commission on Excellence in Education. (1983). A nation at risk. Washington, DC: U.S. Department of Education.

National Commission on Secondary Vocational Education. (1984). The unfinished agenda. Columbus: National Center for Research in Vocational Education, Ohio State University.

North Carolina Department of Public Instruction & North Carolina Department of Community Colleges. (1994, November). 1993-94 final report: Planning and implementation grants. Raleigh: Author.

O'Neil, S. L. (1976). Worker perceptions of skills necessary for survival in the world of work. Unpublished doctoral dissertation, University of Illinois at Urbana-Champaign.

Oregon Department of Education. (n.d.). The certificate of advanced mastery. Work in progress planning document. Salem: Author.

Oregon Department of Public Instruction. (1993, January). Excerpts from Oregon certificate of initial mastery--Task force report. Salem: Author.

Owens, T. R. (1995, April). Washington year two: Tech Prep planning and implementation survey summary. Portland, OR: Educational and Work Program, Northwest Regional Educational Laboratory.

Owens, T., Lindner, F., & Wang, C. (1994, December). Tech Prep in Washington state: Case studies report. Portland, OR: Northwest Regional Educational Laboratory.

Owens, T., Lindner, F., & Wang, C. (1995, June). New learnings from Tech Prep in Washington state: A second-year case studies report. Portland, OR: Northwest Regional Educational Laboratory.

Parnell, D. (1985). The neglected majority. Washington, DC: Community College Press.

Pearce, K., Pease, V., Copa, G., & Beck, R. (1991, July). Educational outcomes: Past, present, and future. Working paper of the Design Group for the project "New Designs for the Comprehensive High School." St. Paul: National Center for Research in Vocational Education, University of Minnesota.

Peasley, D., & McCaslin, N. (1995, April). An exploratory examination of the framework for evaluating vocational education programs. Paper presented at the American Educational Research Association meeting, San Francisco, CA.

Rhode Island Tech Prep Associate Degree Program. (No author or date given.)

Roegge, C., & Evans, J. (1995, July). Illinois Tech Prep evaluation study: Final report draft FY1995. Urbana: Department of Vocational and Technical Education, University of Illinois at Urbana-Champaign.

Roegge, C. A., Leach, J. A., & Brown, D. C. (1992). Stakeholders' perceptions of major themes and priority areas of Tech Prep in Illinois. Journal of Vocational Education Research, 18(3), 77-96.

Rubin, M. (1994). Evaluation of California's Tech Prep education program: Year two evaluation report (Draft). San Diego: Evaluation and Training Institute and San Diego Community College District.

Ruhland, S., Custer, R., & Stewart, B. (1994). Status of Tech Prep in Missouri, 1993-94. Jefferson City, MO: Division of Vocational and Adult Education, Department of Elementary and Secondary Education.

Secretary's Commission on Achieving Necessary Skills (SCANS). (1991, June). What work requires of schools. Washington, DC: U.S. Department of Labor.

Seidman, P., & Ramsey, K. (Eds.). (1995, May). Vocational education and school reform: New connections from school to work. Education and Urban Society, 27(3), 235-243.

Stecher, B., Hanser, L., & Hallmark, B. (1995, January). Improving Perkins II performance measures and standards: Lessons learned from early implementers in four states (MDS-732). Berkeley: National Center for Research in Vocational Education, University of California at Berkeley.

Steffy, B. E. (1993). The Kentucky education reform: Lessons for America. Lancaster, PA: Technomic Publishing Co.

Tennessee Board of Regents & Tennessee Department of Education. (1994, December). Tech Prep annual performance report, July 1, 1993 to June 30, 1994. Nashville: Author.

Trochim, W. M. K. (1989a). The concept system. Ithaca, NY: Concept System.

Trochim, W. M. K. (Ed.). (1989b). An introduction to concept mapping for evaluation and planning. Evaluation and Program Planning, 12(1), 1-16.

Trochim, W. M. K., & Linton, R. (1986). Conceptualization for planning and evaluation. Evaluation and Program Planning, 9(4), 289-308.

U.S. Congress. (1990). Carl D. Perkins Vocational and Applied Technology Education Act Amendments of 1990. Public Law 101-392. Washington, DC: U.S. Government Printing Office.

U.S. Department of Labor. (1991). What work requires of schools: A SCANS report for America 2000. Washington, DC: Author.

Ward, J. H. (1963). Hierarchical grouping to optimize an objective function. Journal of the American Statistical Association, 58, 236-244.

White, S. (1994). Overview of NAEP assessment frameworks. Washington, DC: National Center for Educational Statistics.


NOTES

[1] Although 2+2 Tech Prep programs have existed for some time in a few localities and states, Tech Prep was not widespread until after passage of the federal Carl D. Perkins Vocational and Applied Technology Education Act Amendments of 1990.

[2] On May 4, 1994, the U.S. Congress passed the School-to-Work Opportunities (STWO) Act, whose primary goal is to encourage states to plan and implement coordinated school-to-work systems that use a variety of models, including Tech Prep, to help youth obtain employment after completing secondary or postsecondary education.

[3] The study sample was composed of educators, students, and employers actively engaged in Tech Prep implementation as part of the National Center for Research in Vocational Education's Urban Schools Network.

[4] A national-level evaluation is mandated by the Carl D. Perkins Vocational and Applied Technology Education Act Amendments of 1990 (Perkins II); this evaluation is described later in this section.

[5] A primary objective of Perkins II is the development of improved accountability systems that require each state to measure student learning gains in basic and more advanced academic skills as well as student performance in competency attainment. States must also implement measures in one or more of the following areas: job or work skill attainment or enhancement, retention or completion, or job placement.

[6] A targeted follow-up of the approximately fifty Tech Prep consortia that the NCRVE survey identified as the most advanced in Tech Prep evaluation in the nation produced disappointing results: these sites had generated very few formal evaluation plans, instruments, or reports.

[7] Because a primary goal of the study was to compare the maps generated by the three stakeholder groups, it was important to maintain the same number of clusters across all of the maps. The nine-cluster solution was selected because it meaningfully represented both the perspectives of all participants combined and the perspectives of each subgroup in the concept maps.
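
As an illustration of what fixing the cluster number means in practice, the sketch below is a minimal, hypothetical example only (not the study's analysis code). It assumes, as the Ward (1963) citation suggests, that an agglomerative Ward clustering is applied to the two-dimensional point coordinates produced for each concept map and that the resulting tree is simply cut at nine clusters for every stakeholder group; the coordinates here are random stand-ins for the study's multidimensional scaling output.

    # Hypothetical sketch: forcing the same nine-cluster solution on each map.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(seed=1)
    points = rng.random((98, 2))            # 98 outcome statements placed on a 2-D map (stand-in data)

    tree = linkage(points, method="ward")   # Ward's minimum-variance agglomerative clustering
    clusters = fcluster(tree, t=9, criterion="maxclust")  # cut the tree at nine clusters

    for k in range(1, 10):
        members = np.flatnonzero(clusters == k) + 1       # statement numbers assigned to cluster k
        print(f"cluster {k}: {members.tolist()}")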

[8] The average cluster ratings for subgroups presented in Figure 3 differ from those presented later in this report because the calculation of an average cluster rating depends on which outcome statements fall into each cluster on a particular cluster map. As each new map is calculated and the composition of the clusters shifts, the average cluster ratings change accordingly. This is why it is important to interpret the results in several different ways to obtain a more complete understanding of them.
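
A minimal sketch of the dependency described above, using the educator ratings of a few Communications-cluster statements from Appendix B purely for illustration: an average cluster rating is simply the mean of the mean ratings of the statements currently assigned to that cluster, so any change in cluster membership changes the average.

    from statistics import mean

    # Mean educator ratings for selected statements (values copied from Appendix B).
    ratings = {26: 4.29, 47: 4.13, 30: 4.25, 97: 2.17}

    def average_cluster_rating(cluster_members, ratings):
        """Mean of the ratings of the statements assigned to one cluster."""
        return mean(ratings[s] for s in cluster_members)

    print(round(average_cluster_rating([26, 47, 30], ratings), 2))      # 4.22 under one cluster composition
    print(round(average_cluster_rating([26, 47, 30, 97], ratings), 2))  # 3.71 once membership changes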


APPENDIX A
LIST OF STUDENT OUTCOMES BY CLASSIFICATION IN THE LITERATURE
(The number in parentheses indicates the number of outcome statements classified in each category.)

Communications (7)

Fine Arts (4)

Group Skills (9)

Integrated Knowledge Skills (2)

Math Skills (7)

Personal Skills (8)

Educational Attainment (10)

Science (6)

Social Studies (9)

Thinking and Problem Solving (11)

Vocational/Occupational (25)


APPENDIX B
A COMPARISON OF MEAN RATING AND BRIDGING VALUES
FOR OUTCOMES STATEMENTS BY CLUSTER AND STAKEHOLDER GROUP

For each outcome statement, the mean Rating and Bridging values are listed in the order Educators (Educ.) / Students (Stud.) / Employers (Empl.); "---" indicates that no value is reported for that group within the cluster.

Communications/Technical Communications

26 demonstrate oral and verbal proficiency in technical communication (reports, policies, procedures)
    Rating: 4.29 / 3.83 / 4.11    Bridging: 0.10 / 0.82 / 0.47
47 communicate ideas and information through speaking
    Rating: 4.13 / 4.33 / 4.26    Bridging: 0.06 / 0.68 / 0.57
30 communicate ideas and information through writing
    Rating: 4.25 / 3.56 / 3.89    Bridging: 0.11 / 0.50 / 0.43
97 understand and communicate in a second language
    Rating: 2.17 / 3.22 / 2.26    Bridging: 0.26 / 0.65 / 0.63
1 create meaning from messages communicated through listening
    Rating: 4.33 / --- / 4.32    Bridging: 0.00 / --- / 0.71
78 apply the English language correctly (spelling, grammar, structure)
    Rating: 4.21 / --- / 4.00    Bridging: 0.41 / --- / 0.43
2 understand nonverbal communication
    Rating: 3.61 / --- / 3.32    Bridging: 0.15 / --- / 0.71
4 communicate ideas and emotions through the fine arts (e.g., art, music, dance)
    Rating: 2.46 / --- / 2.00    Bridging: 0.22 / --- / 1.00
50 construct meaning through reading for information, literary experience, and to perform a task
    Rating: 4.21 / --- / ---    Bridging: 0.23 / --- / ---
72 use research tools to locate sources of information and ideas relevant to a specific need or problem
    Rating: --- / 3.89 / ---    Bridging: --- / 0.60 / ---
87 understand the relationships between theory and practice in a technical area
    Rating: --- / 3.56 / ---    Bridging: --- / 0.62 / ---
24 communicate ideas by quantifying with whole, rational, real, and/or complex numbers
    Rating: --- / 2.94 / ---    Bridging: --- / 0.47 / ---

Democratic & Participatory Strategies/Democratic Process & Career Awareness/Communications & Democratic Process

61 recognize and apply the democratic principles of justice, equality, responsibility, choice, and freedom
    Rating: 3.50 / 4.00 / 3.16    Bridging: 0.84 / 1.00 / 0.49
95 prepare for direct participation in the democratic process
    Rating: 3.61 / 3.06 / 2.89    Bridging: 0.63 / 0.96 / 0.62
67 recognize varying forms of government and address issues of importance to citizens in a democracy
    Rating: 3.13 / 3.22 / 2.37    Bridging: 1.00 / 0.99 / 0.56
98 recognize differences and commonalities in the human experience through productions, performances, or interpretations
    Rating: --- / 3.89 / 2.68    Bridging: --- / 0.79 / 0.62
57 be critically aware of social issues involved in a field of interest
    Rating: 3.35 / --- / 2.95    Bridging: 0.82 / --- / 0.80
86 demonstrate awareness of workforce and societal trends
    Rating: 3.22 / --- / 3.00    Bridging: 0.65 / --- / 0.59
75 have awareness of and interest in technical careers
    Rating: 3.33 / --- / ---    Bridging: 0.81 / --- / ---
58 know the history of a particular occupation
    Rating: 2.42 / --- / ---    Bridging: 0.84 / --- / ---
1 create meaning from messages communicated through listening
    Rating: --- / 4.17 / ---    Bridging: --- / 0.91 / ---
2 understand nonverbal communication
    Rating: --- / 3.44 / ---    Bridging: --- / 0.71 / ---
4 communicate ideas and emotions through the fine arts (e.g., art, music, dance)
    Rating: --- / 3.28 / ---    Bridging: --- / 0.93 / ---
20 know how to give and take instructions
    Rating: --- / --- / 4.16    Bridging: --- / --- / 0.61
18 teach others new skills
    Rating: --- / --- / 3.05    Bridging: --- / --- / 0.63
59 observe, analyze, and interpret human behaviors to acquire a better understanding of self, families, and other human relationships
    Rating: --- / --- / 3.05    Bridging: --- / --- / 0.53
19 resolve conflict based on divergent interests and perspectives
    Rating: --- / --- / 3.00    Bridging: --- / --- / 0.54
63 recognize the geographic interaction between people and their surroundings and make responsible decisions for the environment
    Rating: --- / --- / 2.58    Bridging: --- / --- / 0.45

Educational Attainment/Educational & Career Attainment

22 complete secondary school
    Rating: 4.75 / 4.33 / 4.63    Bridging: 0.12 / 0.01 / 0.36
3 make academic progress on grade level
    Rating: 4.04 / 4.39 / 3.79    Bridging: 0.25 / 0.04 / 0.38
60 make a smooth transition from secondary to postsecondary education
    Rating: 4.17 / 3.44 / 3.42    Bridging: 0.12 / 0.16 / 0.33
89 succeed in the transition from secondary or postsecondary education to a 4-year college
    Rating: 3.17 / 4.06 / 3.32    Bridging: 0.13 / 0.00 / 0.34
69 enter postsecondary programs without remediation
    Rating: 4.04 / 2.82 / 3.84    Bridging: 0.12 / 0.00 / 0.47
49 complete postsecondary school
    Rating: 3.63 / 3.83 / 3.74    Bridging: 0.06 / 0.08 / 0.36
11 apply knowledge, skills, and learning strategies to career and life choices
    Rating: 4.38 / --- / 4.00    Bridging: 0.56 / --- / 0.59
88 participate in work-based learning experiences
    Rating: 4.00 / --- / 4.37    Bridging: 0.39 / --- / 0.57
32 earn college credit in high school
    Rating: --- / 3.33 / 2.37    Bridging: --- / 0.01 / 0.33
46 attend school regularly
    Rating: --- / 4.67 / ---    Bridging: --- / 0.09 / ---
45 demonstrate a positive attitude toward school
    Rating: --- / 4.39 / ---    Bridging: --- / 0.32 / ---
77 achieve certification of mastery in an occupation
    Rating: --- / --- / 3.95    Bridging: --- / --- / 0.40
75 have awareness of and interest in technical careers
    Rating: --- / --- / 3.53    Bridging: --- / --- / 0.55
54 achieve and maintain employability in a high-wage job
    Rating: --- / --- / 3.26    Bridging: --- / --- / 0.63
58 know the history of a particular occupation
    Rating: --- / --- / 2.63    Bridging: --- / --- / 0.60
23 expand own knowledge by making connections with new and unfamiliar knowledge, skills, and experiences
    Rating: --- / --- / 3.74    Bridging: --- / --- / 0.59

Information Use & Decision Making/Work Technology & Information Use

71 use decision-making processes to make informed choices among options
    Rating: 4.21 / 3.67 / 3.84    Bridging: 0.50 / 0.34 / 0.34
70 prepare and use budgets, make forecasts, keep records, and make adjustments to meet objectives
    Rating: 3.33 / 3.94 / 3.05    Bridging: 0.50 / 0.61 / 0.37
74 acquire, store, allocate, and use materials and space efficiently
    Rating: 3.54 / 3.67 / 3.32    Bridging: 0.61 / 0.28 / 0.35
51 use critical thinking skills in a variety of situations
    Rating: 4.25 / --- / 3.89    Bridging: 0.60 / --- / 0.35
40 apply logical reasoning to develop solutions to complex problems
    Rating: 4.08 / 3.94 / ---    Bridging: 0.50 / 0.37 / ---
79 apply appropriate safety and environmental measures
    Rating: 4.00 / 4.00 / ---    Bridging: 0.46 / 0.42 / ---
64 design, maintain, and improve systems
    Rating: --- / 3.78 / 3.21    Bridging: --- / 0.27 / 0.50
41 select, use, and maintain appropriate tools, information, materials, and equipment
    Rating: 4.17 / --- / ---    Bridging: 0.66 / --- / ---
83 prepare and follow schedules, and manage time efficiently
    Rating: 4.14 / --- / ---    Bridging: 0.48 / --- / ---
72 use research tools to locate sources of information and ideas relevant to a specific need or problem
    Rating: 3.92 / --- / ---    Bridging: 0.46 / --- / ---
13 use appropriate and relevant scientific methods to solve specific problems in real-life situations
    Rating: 3.79 / --- / ---    Bridging: 0.35 / --- / ---
66 use goal-relevant activities, rank them, and allocate time for them
    Rating: 3.58 / --- / ---    Bridging: 0.57 / --- / ---
36 make a successful transition from education to employment
    Rating: --- / 4.56 / ---    Bridging: --- / 0.65 / ---
9 adapt to emerging technology and adjust to changing work environments
    Rating: --- / 4.44 / ---    Bridging: --- / 0.74 / ---
29 use computers and other electronic technology to gather, organize, manipulate, and present information
    Rating: --- / 4.17 / ---    Bridging: --- / 0.29 / ---
92 understand how technology affects quality of life
    Rating: --- / 3.94 / ---    Bridging: --- / 0.32 / ---
77 achieve certification of mastery in an occupation
    Rating: --- / 3.67 / ---    Bridging: --- / 0.49 / ---
50 construct meaning through reading for information, literary experience, and to perform a task
    Rating: --- / --- / 3.89    Bridging: --- / --- / 0.37
87 understand the relationships between theory and practice in a technical area
    Rating: --- / --- / 3.68    Bridging: --- / --- / 0.47
90 recognize and apply quality standards
    Rating: --- / --- / 4.16    Bridging: --- / --- / 0.37

Math & Science/Analytic & Scientific/Math, Science, & Communications

6 apply basic algebra and geometry to solve technical and work-related problems
    Rating: 4.21 / 3.67 / 4.26    Bridging: 0.49 / 0.08 / 0.04
53 use division, multiplication, addition, and subtraction with real numbers, decimals, fractions, integers, roots, and powers
    Rating: 3.88 / 3.67 / 4.05    Bridging: 0.42 / 0.09 / 0.07
48 demonstrate an ability to calculate through ratios, proportions, and percentages
    Rating: 3.75 / 3.00 / 3.74    Bridging: 0.43 / 0.24 / 0.04
94 read and create charts, tables, and graphs
    Rating: 3.70 / 3.61 / 3.68    Bridging: 0.44 / 0.37 / 0.17
56 use scientific methods to acquire information, plan investigations, use scientific tools, and communicate results
    Rating: 3.63 / 3.33 / 3.26    Bridging: 0.23 / 0.13 / 0.16
52 use models and scales to explain or predict the organization, function, and behavior of objects, materials, and living things
    Rating: 3.29 / 3.22 / 2.68    Bridging: 0.39 / 0.19 / 0.19
93 apply advanced algebra, analytic geometry, and/or calculus to solve technical and work-related problems
    Rating: 3.17 / 3.22 / 3.05    Bridging: 0.36 / 0.09 / 0.10
62 use the metric system and convert between metrics and traditional systems
    Rating: 2.88 / 2.56 / 3.00    Bridging: 0.33 / 0.23 / 0.14
29 use computers and other electronic technology to gather, organize, manipulate, and present information
    Rating: 4.74 / --- / 4.11    Bridging: 0.44 / --- / 0.18
7 organize information through the development and use of classification rules and systems
    Rating: 3.91 / --- / 3.63    Bridging: 0.36 / --- / 0.30
41 select, use, and maintain appropriate tools, information, materials, and equipment
    Rating: --- / 3.67 / 3.89    Bridging: --- / 0.24 / 0.26
13 use appropriate and relevant scientific methods to solve specific problems in real-life situations
    Rating: --- / 3.17 / 3.53    Bridging: --- / 0.27 / 0.12
24 communicate ideas by quantifying with whole, rational, real, and/or complex numbers
    Rating: 3.25 / --- / 3.16    Bridging: 0.33 / --- / 0.34
78 apply the English language correctly (spelling, grammar, structure)
    Rating: --- / 4.67 / ---    Bridging: --- / 0.45 / ---
51 use critical thinking skills in a variety of situations
    Rating: --- / 4.11 / ---    Bridging: --- / 0.32 / ---
50 construct meaning through reading for information, literary experience, and to perform a task
    Rating: --- / 3.83 / ---    Bridging: --- / 0.44 / ---
40 apply logical reasoning to develop solutions to complex problems
    Rating: --- / --- / 3.89    Bridging: --- / --- / 0.26
72 use research tools to locate sources of information and ideas relevant to a specific need or problem
    Rating: --- / --- / 3.68    Bridging: --- / --- / 0.33

Personal Attributes, Attitudes, & Employability Skills

76 be honest and demonstrate integrity
    Rating: 4.50 / 4.83 / 4.89    Bridging: 0.21 / 0.06 / 0.00
68 be dependable and punctual
    Rating: 4.58 / 4.72 / 4.79    Bridging: 0.17 / 0.10 / 0.03
31 demonstrate self-control and self-discipline
    Rating: 4.13 / 4.78 / 4.26    Bridging: 0.21 / 0.26 / 0.01
39 maintain good physical, mental, and emotional health
    Rating: 3.83 / 4.61 / 3.89    Bridging: 0.26 / 0.07 / 0.12
38 know own abilities, strengths, and weaknesses
    Rating: 4.08 / 4.44 / 3.95    Bridging: 0.32 / 0.24 / 0.10
81 be loyal to an employer
    Rating: 3.74 / 4.44 / 3.84    Bridging: 0.20 / 0.18 / 0.09
35 demonstrate the ability to be adaptable and flexible
    Rating: 4.25 / 4.39 / 4.11    Bridging: 0.20 / 0.22 / 0.08
42 build own self-esteem
    Rating: 3.87 / 4.33 / 3.74    Bridging: 0.26 / 0.12 / 0.09
73 show appropriate personal appearance and attitude
    Rating: 4.04 / 4.33 / 4.32    Bridging: 0.20 / 0.11 / 0.01
37 make ethical decisions
    Rating: 4.21 / --- / 4.42    Bridging: 0.24 / --- / 0.08
33 know employer expectations for job performance
    Rating: 4.26 / --- / 4.16    Bridging: 0.27 / --- / 0.51
85 work without close supervision
    Rating: 4.09 / --- / 4.11    Bridging: 0.25 / --- / 0.15
43 demonstrate motivation to learn
    Rating: 4.00 / --- / 4.05    Bridging: 0.41 / --- / 0.34
21 appreciate the diversity of values and cultural differences among people
    Rating: 3.74 / --- / ---    Bridging: 0.39 / --- / ---
55 articulate personal values and beliefs as they relate to a particular occupation
    Rating: 3.29 / --- / ---    Bridging: 0.37 / --- / ---
5 demonstrate consistent, respectful, and caring behavior
    Rating: --- / 4.61 / ---    Bridging: --- / 0.23 / ---
15 show good working relationships with superiors and coworkers in an occupational role
    Rating: --- / 4.59 / ---    Bridging: --- / 0.24 / ---
17 serve clients/customers
    Rating: --- / 3.83 / ---    Bridging: --- / 0.22 / ---
65 monitor and correct own performance
    Rating: --- / --- / 4.21    Bridging: --- / --- / 0.32
84 work under tension or pressure
    Rating: --- / --- / 4.00    Bridging: --- / --- / 0.15
44 use initiative, imagination, and creativity
    Rating: --- / --- / 3.79    Bridging: --- / --- / 0.49


School-to-Work Transition

8 recognize the need for lifelong learning to enhance skills and learn new skills
    Rating: 4.33 / 4.50 / 4.68    Bridging: 0.36 / 0.62 / 0.53
82 gain experience in all aspects of an industry
    Rating: 3.35 / 3.89 / 3.16    Bridging: 0.46 / 0.42 / 0.58
36 make a successful transition from education to employment
    Rating: 4.58 / --- / 4.37    Bridging: 0.35 / --- / 0.52
80 develop and follow through on individual career plans and goals
    Rating: 4.17 / --- / 3.63    Bridging: 0.36 / --- / 0.48
46 attend school regularly
    Rating: 4.13 / --- / 4.42    Bridging: 0.30 / --- / 0.49
54 achieve and maintain employability in a high-wage job
    Rating: 3.75 / 4.06 / ---    Bridging: 0.44 / 0.50 / ---
45 demonstrate a positive attitude toward school
    Rating: 3.83 / --- / 3.63    Bridging: 0.43 / --- / 0.49
77 achieve certification of mastery in an occupation
    Rating: 3.71 / --- / ---    Bridging: 0.25 / --- / ---
32 earn college credit in high school
    Rating: 2.17 / --- / ---    Bridging: 0.22 / --- / ---
43 demonstrate motivation to learn
    Rating: --- / 4.50 / ---    Bridging: --- / 0.77 / ---
11 apply knowledge, skills, and learning strategies to career and life choices
    Rating: --- / 4.44 / ---    Bridging: --- / 0.39 / ---
33 know employer expectations for job performance
    Rating: --- / 4.44 / ---    Bridging: --- / 0.55 / ---
88 participate in work-based learning experiences
    Rating: --- / 4.39 / ---    Bridging: --- / 0.39 / ---
23 expand own knowledge by making connections with new and unfamiliar knowledge, skills, and experiences
    Rating: --- / 4.33 / ---    Bridging: --- / 0.80 / ---
34 know how social, organizational, and technological systems work
    Rating: --- / 3.94 / ---    Bridging: --- / 0.38 / ---
66 use goal-relevant activities, rank them, and allocate time for them
    Rating: --- / 3.61 / ---    Bridging: --- / 0.50 / ---
7 organize information through the development and use of classification rules and systems
    Rating: --- / 3.56 / ---    Bridging: --- / 0.37 / ---
58 know the history of a particular occupation
    Rating: --- / 3.50 / ---    Bridging: --- / 0.70 / ---

Work & Interpersonal Relationships

14 participate as a member of a team
    Rating: 4.33 / 4.28 / 4.53    Bridging: 0.33 / 0.40 / 0.34
12 get along with a variety of people
    Rating: 4.17 / 4.39 / 4.37    Bridging: 0.26 / 0.29 / 0.28
25 plan and work together in meetings
    Rating: 4.00 / 4.06 / 4.11    Bridging: 0.35 / 0.38 / 0.53
15 show good working relationships with superiors and coworkers in an occupational role
    Rating: 4.33 / --- / 4.42    Bridging: 0.35 / --- / 0.25
5 demonstrate consistent, respectful, and caring behavior
    Rating: 4.00 / --- / 4.32    Bridging: 0.25 / --- / 0.20
17 serve clients/customers
    Rating: 4.04 / --- / 4.26    Bridging: 0.37 / --- / 0.46
44 use initiative, imagination, and creativity
    Rating: 3.96 / 4.22 / ---    Bridging: 0.37 / 0.43 / ---
28 apply group problem-solving strategies
    Rating: 4.08 / 3.67 / ---    Bridging: 0.55 / 0.82 / ---
21 appreciate the diversity of values and cultural differences among people
    Rating: --- / 3.89 / 3.63    Bridging: --- / 0.52 / 0.40
55 articulate personal values and beliefs as they relate to a particular occupation
    Rating: --- / 3.72 / 3.05    Bridging: --- / 0.36 / 0.95
91 understand the norms and values of the work culture
    Rating: --- / 3.72 / 3.53    Bridging: --- / 0.53 / 0.55
10 exercise leadership in a variety of situations
    Rating: 3.71 / --- / 3.58    Bridging: 0.29 / --- / 0.31
19 resolve conflict based on divergent interests and perspectives
    Rating: 3.54 / 3.11 / ---    Bridging: 0.42 / 0.59 / ---
27 appreciate own and others' artistic products and performances
    Rating: --- / 3.50 / 2.42    Bridging: --- / 0.66 / 0.44
16 evaluate others' performance and provide feedback
    Rating: 3.13 / --- / 3.11    Bridging: 0.30 / --- / 0.33
65 monitor and correct own performance
    Rating: 4.38 / --- / ---    Bridging: 0.28 / --- / ---
20 know how to give and take instructions
    Rating: 4.29 / --- / ---    Bridging: 0.39 / --- / ---
90 recognize and apply quality standards
    Rating: 3.74 / --- / ---    Bridging: 0.48 / --- / ---
84 work under tension or pressure
    Rating: 3.48 / --- / ---    Bridging: 0.26 / --- / ---
18 teach others new skills
    Rating: 3.38 / --- / ---    Bridging: 0.42 / --- / ---
37 make ethical decisions
    Rating: --- / 4.06 / ---    Bridging: --- / 0.40 / ---
57 be critically aware of social issues involved in a field of interest
    Rating: --- / 3.83 / ---    Bridging: --- / 0.60 / ---
63 recognize the geographic interaction between people and their surroundings and make responsible decisions for the environment
    Rating: --- / 3.44 / ---    Bridging: --- / 0.66 / ---
59 observe, analyze, and interpret human behaviors to acquire a better understanding of self, families, and other human relationships
    Rating: --- / 3.33 / ---    Bridging: --- / 0.53 / ---
96 understand the principles of competition, cooperation, and leadership in a work environment
    Rating: --- / --- / 3.95    Bridging: --- / --- / 0.38

Work Environments/Career and Work Management & Initiative/Technology & Quality Management

9 adapt to emerging technology and adjust to changing work environments
    Rating: 4.71 / --- / 4.47    Bridging: 0.63 / --- / 0.48
83 prepare and follow schedules, and manage time efficiently
    Rating: --- / 4.22 / 4.00    Bridging: --- / 0.36 / 0.49
96 understand the principles of competition, cooperation, and leadership in a work environment
    Rating: 3.87 / 4.17 / ---    Bridging: 0.66 / 0.27 / ---
34 know how social, organizational, and technological systems work
    Rating: 3.63 / --- / 3.16    Bridging: 0.77 / --- / 0.78
92 understand how technology affects quality of life
    Rating: 3.39 / --- / 3.32    Bridging: 0.74 / --- / 0.64
23 expand own knowledge by making connections with new and unfamiliar knowledge, skills, and experiences
    Rating: 4.33 / --- / ---    Bridging: 0.93 / --- / ---
91 understand the norms and values of the work culture
    Rating: 3.65 / --- / ---    Bridging: 0.46 / --- / ---
87 understand the relationships between theory and practice in a technical area
    Rating: 3.57 / --- / ---    Bridging: 0.73 / --- / ---
64 design, maintain, and improve systems
    Rating: 3.38 / --- / ---    Bridging: 0.80 / --- / ---
59 observe, analyze, and interpret human behaviors to acquire a better understanding of self, families, and other human relationships
    Rating: 3.00 / --- / ---    Bridging: 0.53 / --- / ---
63 recognize the geographic interaction between people and their surroundings and make responsible decisions for the environment
    Rating: 2.96 / --- / ---    Bridging: 0.45 / --- / ---
98 recognize differences and commonalities in the human experience through productions, performances, or interpretations
    Rating: 2.87 / --- / ---    Bridging: 0.59 / --- / ---
27 appreciate own and others' artistic products and performances
    Rating: 2.57 / --- / ---    Bridging: 0.57 / --- / ---
80 develop and follow through on individual career plans and goals
    Rating: --- / 4.50 / ---    Bridging: --- / 0.55 / ---
20 know how to give and take instructions
    Rating: --- / 4.39 / ---    Bridging: --- / 0.32 / ---
65 monitor and correct own performance
    Rating: --- / 4.39 / ---    Bridging: --- / 0.52 / ---
85 work without close supervision
    Rating: --- / 4.17 / ---    Bridging: --- / 0.15 / ---
90 recognize and apply quality standards
    Rating: --- / 4.06 / ---    Bridging: --- / 0.18 / ---
10 exercise leadership in a variety of situations
    Rating: --- / 3.94 / ---    Bridging: --- / 0.29 / ---
86 demonstrate awareness of workforce and societal trends
    Rating: --- / 3.89 / ---    Bridging: --- / 0.32 / ---
75 have awareness of and interest in technical careers
    Rating: --- / 3.78 / ---    Bridging: --- / 0.36 / ---
18 teach others new skills
    Rating: --- / 3.50 / ---    Bridging: --- / 0.41 / ---
84 work under tension or pressure
    Rating: --- / 3.33 / ---    Bridging: --- / 0.29 / ---
16 evaluate others' performance and provide feedback
    Rating: --- / 3.28 / ---    Bridging: --- / 0.29 / ---
79 apply appropriate safety and environmental measures
    Rating: --- / --- / 4.37    Bridging: --- / --- / 0.57
28 apply group problem-solving strategies
    Rating: --- / --- / 4.00    Bridging: --- / --- / 0.57
66 use goal-relevant activities, rank them, and allocate time for them
    Rating: --- / --- / 3.58    Bridging: --- / --- / 0.46

