
CHAPTER 9

Carolyn Dornsife

EVALUATION FOR PROGRAM IMPROVEMENT





      At the 1992 Summer Institutes, the National Center for Research in Vocational Education (NCRVE) helped school teams jump-start the writing of their implementation plans by requiring specific curriculum integration and Tech Prep program components. The identification of these components was based on research and practical knowledge of how to implement successful Tech Prep and integrated curriculum initiatives (Bragg, 1992; Layton and Bragg, 1993; Dornsife, 1992; Grubb, Davis, Plihal, and Lum, 1991; Hull and Parnell, 1991; Little, 1993). Evaluation was one of the required components, included with the intention of supporting data-driven decision making as a powerful mechanism for sustaining fledgling reform efforts.

      When requiring the implementation of an evaluation component, NCRVE did not take on the role of a third-party evaluator for the Network sites. Instead, the rationale was that by preparing an implementation plan, the team members would gain a greater understanding of how, from the outset, they could incorporate evaluation into a complex, multifaceted reform effort. At the same time, to maintain the momentum of implementation, NCRVE provided technical assistance to help sites implement all program components, including evaluation.


EVALUATION-RELATED
TECHNICAL ASSISTANCE ACTIVITIES

      At the NCRVE Urban Schools Network summer institutes and regional meetings, specific break-out sessions were devoted to the design of an evaluation component. For instance, at the 1992 summer institutes, NCRVE presented sessions on how to translate goals into measurable outcomes, how to identify and use data sources, and how to conduct both formative and summative evaluation activities (process and outcome data, respectively). Over time, evaluation specialists and field consultants from NCRVE provided on-site assistance to individual high school and college sites as they designed and implemented their evaluation component.

      The 1993 and 1994 regional meetings also provided opportunities for sites to collaborate and share information on how they were conducting an evaluation. In 1995 all sites were asked to sign a Program Improvement Agreement[3], and at the 1995 summer institute all team members were required to participate in a hands-on workshop providing a step-by-step process for creating an evaluation plan. During a 1996 Network meeting, all participants attended an open discussion session on evaluation results that had been collected from various Network sites. The purpose was to share information about "what had worked"--the evaluation processes and outcomes that some teams had completed--with the intention that other sites could apply this information.

      Finally, along with regular site visits and monthly telephone calls to team liaisons, NCRVE staff members periodically distributed written survey instruments as a means of assessing technical assistance needs and "customizing" methods for helping sites conduct evaluation. Ideas and successful techniques were shared throughout the Network via the Urban Update newsletter.


THE CONTINUUM OF EVALUATION

      Network sites reflect a continuum of implementation progress in the area of evaluation. Assuming that one end of the continuum is simply discussing the design of an evaluation plan, and the other end is complete implementation and the collection of both formative and summative data, Network sites cover the entire length. The reasons for this wide range of efforts are numerous, but stem, in part, from NCRVE's approach to building the Network: we did not want to prescribe; we wanted to provide assistance and have teams of practitioners share information and "lessons learned." In short, over time, we wanted to help sites create a system of program improvement that would ensure long-term sustainability of their education reform initiatives.

      Given the methods NCRVE embraced, the purposes of this chapter are to share available student outcome data and to discuss the ongoing dilemmas of collecting data. How do sites overcome barriers to implementing an evaluation component, and how can NCRVE provide assistance?


THE STATUS OF IMPLEMENTATION:
REFLECTION ON PROCESS

      In reflecting on the process of implementing an evaluation component during summer 1997 focus groups, an overwhelming majority of Network members agreed that having to include evaluation in the original plan was good. "It was a start, it made us at least think about what kinds of outcomes we were interested in," stated one of the representatives from the Brooklyn, New York site, composed of George Westinghouse Vocational and Technical High School and New York City Technical College. Furthermore, there was widespread agreement that the required evaluation workshops at Network meetings were helpful. "They were good, if only for the fact that they made classroom teachers think about how student performance could be evaluated."

      In some cases, sites are still using the original written implementation plan as a reference, and many of the outcomes they initially identified are ones they'd like to measure (e.g., attendance). When the focus group discussion turned to actual implementation activities, however, participants divided into three groups: those at the secondary institutions, those at the postsecondary institutions, and those with specific leadership responsibilities, such as a Tech Prep or school-to-work coordinator, a project director, or team leader.

      At the risk of sounding too simplistic, the feedback regarding implementation of an evaluation component tends to fall under three similar themes:

COLLECTING DATA:
FORMATIVE AND SUMMATIVE

      As the feedback from Network members participating in the 1997 focus groups indicates, an evaluation component is not a straightforward endeavor; it requires the cooperation of many educators. Moreover, these efforts do not operate in a vacuum; numerous changes (personnel, policy, financial) occur in the school context each day. How did sites confront these challenges, and what data have they collected for program improvement purposes?


      The purpose of this section is to share some of the student outcome data collected by three "early innovator" sites--those that began implementing an evaluation component after the 1992 Summer Institute. The data from these sites are shared in some detail because they provide several examples of how common barriers were confronted, how goals were translated into measurable outcomes, and how such common data sources as the student transcript can provide a wealth of information. Although the material presented in this section is devoted to student outcomes, it is important to note that Network sites were encouraged to collect both formative and summative data (process and outcome). For instance, as indicated by the content of other chapters in this book, many valuable written products were produced by the sites. These products can easily be used as examples of formative data and as means of evaluating progress and targeting areas for program improvement. What has eluded many sites is the collection of student outcome data.


EARLY INNOVATORS:
EXAMPLES OF EVALUATION
DESIGN AND STUDENT OUTCOMES

      The early innovators were those sites that hired a third-party evaluator to design a plan and collect data. These sites were Brooklyn, New York; Omaha, Nebraska; and Oklahoma City, Oklahoma. By using an outside evaluator, these sites surmounted two major obstacles: (1) defining a Tech Prep student, and (2) identifying a specific person who was responsible for determining outcomes and actually collecting the data. Over time, these early innovators have had to reconsider the definition of their target student population (for instance, following the passage of the 1994 School-to-Work Opportunities Act), but they also have baseline data for making comparisons with later student cohorts.

      The third-party evaluators typically submitted end-of-year reports to the school district, and the Network sites shared this information with NCRVE. Using these reports, the material presented below describes the evaluation designs and selected student outcomes. At all three early innovator sites, the evaluators gathered survey data on numerous stakeholder groups, including principals, superintendents, teachers, students, counselors, parents, and business representatives. The survey results were typically used for program improvement purposes, such as identifying barriers, strategies to overcome them, and areas for professional staff development. In addition, the evaluators conducted content analyses on such documents as articulation agreements, committee action planning forms, meeting minutes, staff development feedback surveys, and course descriptions for new or modified integrated courses. Clearly, a voluminous amount of process-related information was collected and used to make program improvement decisions, but, because collecting student outcome data is so challenging, we turn our attention to these efforts.


BROOKLYN, NEW YORK

NEW YORK CITY TECHNICAL COLLEGE
GEORGE WESTINGHOUSE VOCATIONAL AND TECHNICAL HIGH SCHOOL

      The New York City Technical College (NYCTC) of the City University of New York, and George Westinghouse Vocational and Technical High School (GWVTHS) are career-oriented institutions. Just before the 1992 NCRVE Summer Tech Prep Institute, NYCTC, which has three career divisions that all lead to associate and/or baccalaureate degrees, and GWVTHS entered into a Tech Prep partnership agreement linking curriculum in two departments--engineering technology and business and communications technology. The team that attended the 1992 summer institute was composed of two administrators, three instructors, and one counselor from the college and one administrator, three teachers, and one counselor from the high school.

      The core components of this Tech Prep initiative included integrated curriculum, articulated courses, work-based learning, and career guidance and counseling. A Tech Prep student was defined as any junior or senior enrolled in Tech Prep math and Tech Prep English courses.

      The Brooklyn site has employed an independent program evaluator since 1992. He has pursued a longitudinal repeated-measures design, emphasizing both cognitive and affective student outcomes. The number of measurable goals for Tech Prep has increased over time, as students have matriculated to the college. The sample of Tech Prep students in the first year of the evaluation (1991-92) was limited to 100 high school students; by the fourth year (1994-95) the sample had increased to 384 students, 119 of them enrolled at NYCTC.

      In general, the overall goals of Tech Prep have remained constant (the dependent variables): to improve students' understanding, awareness, interest, and attitude toward technical careers and to improve their grade point averages and attendance. The evaluator selected the 90-item WorkWise questionnaire to assess most of the affective variables. In addition, students' academic and career sense of self was assessed by a 40-item Self Description Questionnaire. This instrument measures four self-concept factors, including math, verbal, academic, and problem-solving abilities. The cognitive dependent variables have included the cumulative high school grade point average and attendance (data were collected from student transcripts).

      A variety of background variables were also examined, including gender, English as a second language, socioeconomic status, and students' high school major. As participants have matriculated to the college, a new set of college-related performance and competency indices has been identified. These measures include City University of New York (CUNY) Math and Reading exams, college grade point average, number of semesters at college, transfer to a baccalaureate program, and job placement. The evaluator plans to create a comparison group of college students who did not participate in Tech Prep during high school.


EVIDENCE OF PROGRAM EFFECTIVENESS

      The following information is based upon Year Four (1994-95) evaluation data. In terms of independent variables, as the Tech Prep program matured over four years, the percentage of female participants grew from seventeen to thirty-six percent. The majority of students (seventy-one percent) reported English as a first language. Students' socioeconomic status was indirectly measured using parents' education level, and seventeen percent of the students reported neither parent had a high school diploma, twenty-two percent had one or both parents graduate from high school, and seventeen percent reported one or both parents graduated from college (twenty percent did not report parental education levels).


ATTENDANCE

      Complete high school attendance data were available for eighty-five Tech Prep seniors, and a chi-square analysis found significant differences between grades. For instance, the average number of days absent in ninth grade was eight, while in twelfth grade the average was three. If attendance can be considered an indirect measure of student motivation, the program is having a positive impact on participating students.


STUDENTS' SELF-ESTEEM

      A total of 260 students provided response data on the Self-Description Questionnaire. Two types of self-esteem (math and problem solving) were considered amenable to change, given the heavy math and science curricular focus of Tech Prep. Math, verbal, and academic self-esteem averages remained unchanged, while problem-solving self-esteem averages rose with each passing year, from 13.70 in 1992-93 to 15.05 in 1994-95 (out of a 20-point total). In fact, by year three a significant difference was found between pretest and posttest scores (t value=2.47, p<.01). It can now be stated with some certainty that participating in the Tech Prep program enhances students' problem-solving self-esteem.
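The pretest-posttest comparison above rests on a paired t test. As a minimal sketch of the arithmetic behind such a test, the snippet below computes a paired t statistic by hand; the scores are hypothetical stand-ins on the same 20-point scale, not the Brooklyn site's raw data.

```python
import math

# Hypothetical pretest/posttest problem-solving self-esteem scores
# on a 20-point scale -- illustrative values only, not the site's data.
pretest = [13.0, 14.2, 12.5, 13.8, 14.0, 12.9, 13.5, 14.4]
posttest = [14.8, 15.3, 13.9, 15.1, 15.4, 14.2, 15.0, 15.6]

# Paired t statistic: t = mean(d) / (sd(d) / sqrt(n)),
# where d is the per-student posttest-minus-pretest gain.
diffs = [post - pre for pre, post in zip(pretest, posttest)]
n = len(diffs)
mean_d = sum(diffs) / n
var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
t = mean_d / math.sqrt(var_d / n)

print(f"mean gain = {mean_d:.2f}, t = {t:.2f} (df = {n - 1})")
```

With real data, the resulting t value would be compared against the critical value for n-1 degrees of freedom, as the evaluator did in reporting t=2.47, p<.01.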


CUNY BASIC SKILLS TESTS

      In Year Four, the program evaluator began to compile data on students who had completed Tech Prep at the high school and continued to New York City Technical College. Of the eighty-eight Tech Prep students who graduated from Westinghouse in that year, thirty-nine entered NYCTC in the following fall. Of these, thirty-nine percent had passed the three CUNY basic skills tests prior to matriculation. Passing rates for the individual tests were as follows:

Each of these rates was significantly higher than that of the college's general applicant pool. This finding supports the conclusion that Tech Prep is preparing students for the transition to college.


NEXT STEPS--USE OF RESULTS

      For the Brooklyn team, the evaluation results have confirmed the belief that their reform efforts are working. The component activities they have pursued have had an impact on students' educational experiences--they are staying in school, improving their self-esteem on problem-solving abilities, and passing the CUNY math and reading admission exams for New York City Technical College on their first attempt. The site's current goals center on expanding their efforts into more classrooms and including different learning activities. For instance, the implementation of a joint (college and high school) school-based enterprise for computer repair is in its second year, with plans to incorporate middle school teachers into the effort. In addition, the staff continue to refine the use of portfolio assessments in various classes, and a "transition math class" (calculus equivalent)--a math class for high school students planning to attend NYCTC in the fall--was piloted during the 1996 school year. Information on job placements for NYCTC graduates who completed the 2+2 Tech Prep sequence should be available by June 1998.

      Finally, the team leaders have brought high school and college English instructors together to design an English curriculum that will ensure that all students will not only pass the writing placement exams to NYCTC but will also have the skills to pass their first-year college English courses. This action is the result of presenting the teachers with data on student performance (from both their placement exams and their first-year college English course grades).


OMAHA, NEBRASKA

METROPOLITAN COMMUNITY COLLEGE
BRYAN HIGH SCHOOL

      The city of Omaha has some noteworthy history behind its effort to plan and implement school-to-work initiatives. In 1991, then-President Bush selected Omaha as a model city for America 2000, which in turn provided significant federal funds to help implement these reform initiatives and nurture partnerships between education and business. Following these events, in 1992 the Omaha team attended NCRVE's Tech Prep Summer Institute and used that opportunity to further develop a plan for their Careers 2000 project (a part of America 2000). This project used a Tech Prep framework and was intended to create new methods for uniting secondary and postsecondary education with the workplace. It was developed in response to a locally identified need to assist the "forgotten half" youth.

      Metropolitan Community College (MCC) worked with Bryan High School to implement Careers 2000 and remains an active partner in its subsequent school-to-work initiatives. The team members include two college administrators, three instructors, and one counselor, as well as one high school administrator, two teachers, and two guidance counselors. The primary content areas selected for articulation and further development in the Careers 2000 project were business education, consumer and home economics, and technology education.

      The State of Nebraska defines Tech Prep as a course of study designed to help high school students form a firm academic and technological foundation on which to build their futures. There are four primary Tech Prep components at the Omaha site: articulation, applied academics and integration of academic and vocational education, career guidance, and partnerships. As defined by the Nebraska State Department of Education, a Tech Prep student is someone enrolled in one or more applied or technical courses that are delineated in a Tech Prep articulation agreement (Jurgens, 1995).


EVALUATION DESIGN AND STUDENT OUTCOMES

      The evaluation of Careers 2000 began in 1992 with the identification of four goals and numerous subgoals. The goals were articulated in the following areas:

  1. Establish a learning climate that focuses on people, processes, and products that result in constant incremental improvement in instruction for students (e.g., involve forty percent of staff in implementing at least three promising instructional strategies that increase student learning).

  2. Restructure the school schedule to increase the amount of time staff have available for interdisciplinary instructional planning and improvement activities (e.g., design a plan for implementing a block schedule).

  3. Design and implement career clusters--an instructional delivery system that infuses career readiness and preparation concepts across the curriculum (e.g., identify clusters, implement articulation agreements, ensure that all tenth grade students select a career cluster).

  4. Promote workplace competencies. Examine and implement curricular and instructional changes that are relevant to future workforce and career needs (e.g., at least twenty interdisciplinary team projects involve core subject areas, technology, and career orientation).

      The evaluators, from the University of Nebraska-Omaha, selected several student outcome measures for a five-year longitudinal study. These measures included: (1) employment status, (2) hourly wages, (3) continuing education, (4) career/job goals, (5) perceived amount of education needed, and (6) the California Achievement Test scores. Data were collected during the fall of 1992, 1993, and 1994 via a survey questionnaire mailed to a sample of 1992 Bryan High School graduates (n=245). Overall, ninety-nine percent of the graduates were contacted at least once during the course of the longitudinal study. By the third year, the average age of the sample was twenty-one years old, and included forty-nine percent females, fifty-one percent males, seventy-nine percent Caucasian, and twenty-one percent non-Caucasian. As presented below, the evaluators reported several interesting findings in their comparison of data from 1992 to 1994.


EMPLOYMENT STATUS

      In 1994, eighty-eight percent of the graduates were employed--fifty-three percent full-time, twenty-four percent part-time, and eleven percent in the armed forces. The number of graduates employed full-time increased by twenty-two percent, while the number of graduates employed part-time decreased by seventeen percent from year one of the study. Of the employed graduates, ten percent were working two or more jobs; and forty-nine percent had worked at their present job for less than one year.


TYPES OF WORK AND WAGES

      The types of work performed by the employed graduates related to the following occupational areas--twenty-four percent administrative support, twenty-three percent service, eleven percent sales, nine percent construction, six percent administrative/managerial, six percent installation/repair, five percent production, two percent professional, and eleven percent other. Most students earned an hourly wage of $4.25-6.50 (sixty-three percent in year one and forty-two percent in year three), or $6.51-8.50 (twenty-one percent in year one and thirty-six percent in year three).


EDUCATIONAL STATUS

      Of the forty-seven percent of 1992 graduates who were continuing postsecondary education, thirty-five percent were full-time and twelve percent part-time. This figure represents a sixteen percent decrease in the number of graduates continuing postsecondary education from year one of the study. The types of school attended by the graduates continuing postsecondary education were seventy percent four-year institutions, twenty-four percent two-year institutions, and six percent private career schools.


CAREER GOALS

      Of the 1992 graduates, fifty-nine percent identified a career goal that generally requires at least a bachelor's degree. Yet only forty-seven percent of these graduates were continuing their postsecondary education. At the same time, the career goals of graduates remained relatively stable from year one to year three. For instance, during year three the career goals identified by the graduates related to the following occupations--forty-five percent professional, twenty-four percent administrative/managerial, eight percent services, six percent technology, five percent construction, four percent installation/repair, three percent sales, two percent administrative support, two percent production, and four percent other.


CALIFORNIA ACHIEVEMENT TEST

      Graduates were grouped according to their Total Battery score from their last high school California Achievement Test. Using this method, 31 graduates scored at the twenty-fifth percentile and below, 130 graduates scored at the twenty-sixth to seventy-fifth percentile, and 53 graduates scored above the seventy-fifth percentile. Of the graduates who scored in the lowest percentile group, ninety-seven percent were employed (seventy-seven percent full-time and nineteen percent part-time). In comparison, those who scored in the middle and highest groups were more likely to be employed part-time or serving in the military. Of the graduates in the lowest percentile group, forty-two percent reported career goals in a professional or administrative/managerial field that generally requires a bachelor's degree, yet only twenty-six percent were continuing their education.


NEXT STEPS--USE OF RESULTS

      A good indicator of how the evaluation results were used by the Omaha team members is in the content of Bryan High School's 1994-95 school restructuring plan. The major initiative of the restructuring plan was the implementation of a block schedule. The implication is that the teaching staff supports a major reorganization of time for instructional delivery. Several "school learning goals" were also delineated in the plan, including achievement goals for technology, mathematics and problem solving, oral language and reading, and writing skills. The restructuring plan includes an extensive description of indicators and examples of documentation. For example, in the focus area of technology achievement, goal one is that "all students will increase their knowledge of technological applications." Indicators include such items as "all students will be required to complete, per term, three writings done on a word processor, and demonstrate mastery of a technological application within a career cluster." One strategy for meeting this goal is described as "all teachers will infuse into their courses basic academic and symbolic skills," and one means of documenting the effectiveness of this strategy is to require teachers to generate assignments that will involve problem-solving skills that will be graded and assessed via unit lessons presented in ACT's WorkKeys.

      At the time of this writing, a formal evaluation has not been completed regarding the "success" of the block schedule or the degree to which the goals identified in the school restructuring plan have been met. From information that NCRVE researchers have collected during recent site visits, many teachers welcome the opportunity to receive more training on curriculum development for the block schedule. Indeed, professional development activities continue to be an important component for nurturing the continued improvement of school-to-work initiatives.


OKLAHOMA CITY, OKLAHOMA

OKLAHOMA CITY COMMUNITY COLLEGE
FRANCIS TUTTLE VOCATIONAL TECHNICAL CENTER

      In 1991 the Francis Tuttle Center, along with Oklahoma City Community College; the University of Oklahoma; and the Edmond, Western Heights, Putnam City, and Deer Creek Public Schools, received federal funding to implement a Tech Prep demonstration project. The partners united to form the Consortium to Restructure Education through Academic and Technological Excellence (CREATE). The 1992 plan, developed by the team at the NCRVE Tech Prep Summer Institute, built on this demonstration project. The consortium implemented a 4+2+2 career pathway model, emphasizing four years of high school preparation, plus two years of community college and two years of training at a baccalaureate-granting institution. Students can exit at any point and apply their skills to selected career opportunities. The major career clusters targeted for Tech Prep were health occupations, business, and engineering/trade technology.

      Given CREATE's numerous partners, the 1992 summer institute team was composed of representatives from each institution. Following the institute, NCRVE has typically worked with a core number of people, including one administrator, one counselor, and two teachers at Francis Tuttle, one administrator at Putnam High School, and two administrators at the Oklahoma City Community College.

      As originally conceived in 1992, the primary components of their Tech Prep initiative were curriculum articulation, integrated curriculum and applied academics, career counseling, and partnerships between education and business. A Tech Prep student was defined as someone enrolled in an applied academics course.


EVALUATION DESIGN AND STUDENT OUTCOMES

      As a CREATE partner, the Oklahoma City Community College, Office of Program Development, provided an evaluation specialist to the consortium. The evaluator designed a three-year project that included an examination of both implementation processes and student learner outcomes.

      The summative evaluation consisted of an examination of four student motivation measures--grade point average, the Iowa Test of Basic Skills (National Percentile Rank [NPR] scores), absenteeism, and withdrawal rate. In addition, comparison groups of students were identified for the evaluation. Students were grouped into three educational categories--general track, Tech Prep, and other. The purpose of these student groups was to allow a baseline comparison. As the evaluator argued, "it may not be appropriate to compare the Tech Prep student with the 'other' students on such outcomes as the math subcomponent of the Iowa Tests (because these students may have more math and are likely to score higher). It is more appropriate to compare the Tech Prep student to the 'general track' student, however, a comparison between all three will provide some clarity on the outcomes of Tech Prep" (Hellman, 1994, p. 2).

      The general track (n=24) was defined as those students enrolled in the minimal courses to graduate. The Tech Prep students (n=72) were defined as those enrolled in applied academics, and subsequently enrolled in an articulated vocational technical program[4]. The "other" group (n=23) was defined as students who have taken courses that exceed the minimal requirements and are typically considered college preparatory sequences (e.g., algebra II, chemistry).


DEMOGRAPHIC CHARACTERISTICS

      As presented below, the general track and Tech Prep groups had more male students, while the "other" category was almost equally represented by each gender. Not surprisingly, given the school districts' location in the primarily white-collar northwest section of Oklahoma City, most students in each group were Caucasian.

TABLE 9-1:

OKC DEMOGRAPHIC CHARACTERISTICS
STUDENT GROUPS (PERCENT)

CHARACTERISTIC      GENERAL TRACK   TECH PREP   OTHER

GENDER
  Male                   70             62        45
  Female                 30             38        55

ETHNICITY
  Caucasian              82             68       100
  Non-Caucasian          18             32        --

ABSENCE AND DROPOUT RATES

      Comparing the tenth grade absence rates for the three groups, "other" students had an average absence rate of 7.8 days; Tech Prep students showed a slightly higher average absence rate of 10.2 days; and the general track students had an even higher average absence rate of 16.0 days. Comparing the percentage of students dropping out of school, the evaluator reported a similar trend: only 8.7 percent of "other" students withdrew from school, compared to 9.7 percent of Tech Prep students and 33.3 percent of general track students. Hellman (1994, p. 3) interpreted these results as "a positive indication to possible benefits of the Tech Prep program implemented by the CREATE consortium."
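Because the withdrawal figures are reported only as percentages, the underlying headcounts can be backed out from the group sizes given earlier (general track n=24, Tech Prep n=72, "other" n=23). The snippet below does that arithmetic; the resulting counts are inferred for illustration, not reported in the source.

```python
# Infer approximate withdrawal counts from reported group sizes and
# percentages. Counts are reconstructed, not taken from the evaluation.
groups = {
    "general track": (24, 33.3),  # n, percent withdrawn
    "Tech Prep":     (72, 9.7),
    "other":         (23, 8.7),
}

for name, (n, pct) in groups.items():
    count = round(n * pct / 100)  # nearest whole student
    print(f"{name}: about {count} of {n} withdrew ({count / n:.1%})")
```

The reconstructed counts (roughly 8 of 24, 7 of 72, and 2 of 23) reproduce the reported percentages, which suggests the small general track group drives the striking 33.3 percent figure.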


GRADE POINT AVERAGE TRENDS

      The grade point average (GPA) trends from the eighth grade through the twelfth grade were compared for the student groups. The "other" students had a consistently higher GPA than the Tech Prep and general track students, and the Tech Prep students had consistently higher GPAs than those in the general track. Of particular interest are the GPA scores at the twelfth grade, where the Tech Prep students equal the "other" students and the general track students end on a downward trend. Although the Tech Prep scores dip toward the general track level before rising again, in general the GPA results support the educational reform goals of Tech Prep (Hellman, 1994, p. 4).

TABLE 9-2:

GRADE POINT AVERAGE - STUDENT GROUPS

GRADE     GENERAL TRACK   TECH PREP   OTHER

8th            1.6           2.3       2.4
9th            1.7           2.1       2.6
10th           1.8           2.0       2.8
11th           2.0           2.1       2.6
12th           1.8           2.5       2.5

AVERAGE        1.8           2.2       2.6

IOWA TESTS OF BASIC SKILLS (ITBS),
NATIONAL PERCENTILE RANK (NPR) SCORES

      The students' ITBS National Percentile Rank scores were compared across the seventh, ninth, and eleventh grades in the language, reading, math, and science components. The NPR scores range from zero to one hundred. In each grade, the "other" students scored the highest on all four components, which may be influenced by the higher-level course work they completed. In the language and science subscores, Tech Prep students scored lower than the general track students at the seventh grade; by grade eleven, however, the Tech Prep students' scores were higher. In the math subscore, the Tech Prep students scored higher than the general track students in grades seven and nine, but both groups experienced a decline in scores by grade eleven. For reading, Tech Prep students scored higher than general track students in seventh grade, but fell below them in grades nine and eleven.

      The ITBS results shed some light on potential areas for improvement; for example, more attention was warranted to reading comprehension skills for Tech Prep students. At the same time, Hellman (1994) most appropriately notes that the ITBS may not be the best measure of a student's motivation to learn while advancing toward a technical career.

Without dispute, the ITBS is a valid measure of student achievement, but the scores may not be of value to the students taking the test, and so we should consider giving achievement tests that are germane to the students' chosen technical field. The primary comparison of interest would be, but not limited to, placement in a career of choice. Furthermore, supervisors should be contacted at various times throughout the new employee's first year of employment to provide competence indicators which may lead to that employee's successes (p. 36).

NEXT STEPS--USE OF RESULTS

      Perhaps one of the most ambitious uses of the secondary student outcome data was the subsequent pilot study of postsecondary students. During the fall 1994 semester, the evaluator compared a group of Francis Tuttle students who had "banked articulation credits" in three technical programs at Oklahoma City Community College (n=16), to a random sample of students who had not accumulated any articulated credits in these programs (n=11). The selected technical programs were business, computer science, and electronics, and the control group consisted of recent high school graduates enrolled in these programs. In keeping with the federal education policy changes in 1994, the students were now referred to as school-to-work students (replacing Tech Prep). The majority of students in both groups were male (73 percent control group, 75 percent STW group).

      A major area of concern addressed in the evaluation was enrollment in remedial math. In particular, the evaluator argued that "given the initial focus of applied academics (applied math) for the Tech Prep/School-to-Work reform movement, it would be hoped that the STW students would be more similar to the control group in their enrollment in remedial math." Using ACT scores and grade point average as measures of student achievement, the evaluator reported that "considering the ACT math subcomponent scores, the STW student compares relatively equally with the control" (Hellman, 1995, p. 1)[5]. Enrollment in remedial math courses was, however, not as evenly distributed. As shown in Table 9-3, twenty-one percent of the STW students enrolled in remedial math during the fall 1994 semester, as compared to nine percent of the control group.


TABLE 9-3:
COMPARISON MEASURES - STUDENT GROUPS
COMPARISON MEASURE CONTROL GROUP STW GROUP

GPA
Summer 1994 2.00 2.88
Fall 1994 2.74 2.71
Overall 2.48 2.72

ACT
Math 18.50 19.08

REMEDIAL MATH
Yes 9.1% 21.4%

GENDER
Male 72.7% 75.0%
Female 27.3% 25.0%

      Interestingly, despite the differences in ACT scores and enrollment in remedial math, the STW group reported a higher overall GPA. In considering all the data, the evaluator concluded that "the STW student is doing at least as well as the control group, and further examination should provide clarity on the issue of STW student efficacy at the college level" (Hellman, 1995, p. 2).


OVERCOMING BARRIERS

      These exemplary teams continue to make great strides in collecting student outcome data; at the same time, however, they are confronted with several ongoing dilemmas. For instance, how do you sustain a data collection effort when a third party evaluator leaves? How do you compare students who completed a program in 1993 to those who completed it in 1998 (i.e., when Tech Prep was a fledgling effort versus a more mature initiative)? These dilemmas are not yet an issue for most of the Network teams because they must still grapple with such fundamental tasks as (1) identifying students, (2) selecting outcome measures, (3) determining data collection methods, (4) determining how to use the data, and (5) deciding how to incorporate teachers into the evaluation process. In short, most sites are still working through such questions as: which students are we interested in following, what outcomes do we want to measure, who is going to collect the data, how is it going to be used, how can teachers incorporate the data into the classroom experience, and what is the relevance of the data for teaching and learning?

      Some sites have worked through many of these questions and have even begun data collection activities, but then a second set of barriers seems to arise. New superintendents or instructional deans are hired, district or state education policies change, or project coordinators relocate to another school. Following these personnel changes, a "new" data collection process is implemented, and "old" data or outcome measures are deemed irrelevant or wrong. For instance, some sites are facing the complete reconstitution of their school, and student records are inaccurate or unavailable. Some schools have embarked on a whole school reform initiative; hence, there is no target student because everyone is subject to the same "treatment," and comparing student performance becomes a more difficult problem, not easily solved by overworked staff who are not experts in evaluation methodologies. This latter problem is further compounded when the community college trying to follow individuals who have transitioned from a local high school uses a different definition of the target student. Some Network members report that secondary students are a transient population; even though they can be identified in high school, they cannot be followed after graduation. As a result, valuable information for program improvement, such as evidence of success in postsecondary courses or in career-related job placements, is not available.

      Perhaps one of the most distressing barriers is the rejection of data after it has been collected, because the results are not the outcomes the community at large wants to hear or read about. For instance, entrance into and completion of a four-year university degree is still the ultimate goal for many secondary school educators, parents, guidance counselors, and students. However, when available data (from an Urban Network team) indicate that 4.5 years after graduation only two percent of a high school's 250 graduates have completed a baccalaureate degree, the data are typically ignored rather than used as an impetus to talk about school improvement. Many teams are also confronted with resistant attitudes toward accepting a "careers" approach to secondary education. There is a gap in understanding between the economic realities of a community (the skills employers need) and the available education and training opportunities (local schools and colleges can provide students with the knowledge and skills needed to fill those career positions).

      Although the barriers to using outcome data for program improvement purposes seem formidable, there continues to be progress and dialogue on how to overcome obstacles. For instance, a representative from the Milwaukee site commented that "we need to use performance assessments that test the skills we purport to want to know about. If solving a geometric proof is the skill you want, then test for that, but if you want a student to apply some geometric concept, then test for that applied knowledge." In addition, the Somerville, Massachusetts site found a way to empower teachers so they would use outcome data--they select the outcomes to be measured. "Teachers don't like to collect data for someone else's outcomes. It has to be meaningful to them," commented a district-level administrator.

      Indeed, a major problem is the lack of teacher engagement in the evaluation process and the related application of results to an ongoing program improvement process. Interestingly, a growing research literature addresses school restructuring efforts, teacher-directed reforms, and outside partners (such as NCRVE) who are trying to provide assistance. For instance, Bascia (1996) provides an excellent discussion of how the implementation of school reforms is typically hampered by the larger organizational and political contexts in which they are situated. She states quite frankly that:

The success and speed with which school programs undergo comprehensive change depends in significant part on school staffs' relative legitimacy and power within their partnership networks...The interconnectedness and multilevel, multisector nature of restructuring projects can work to impede school staffs' efforts to significantly reconceptualize their education programs, particularly at [urban] schools where staffs are contending with the most critical educational problems (p. 179).

      In agreement with Bascia (1996), Kanpol and Yeo (1995), Noguera (1996), Weinstein, Madison, and Kuklinski (1995), and Talbert and McLaughlin (1994) report that high school departments, schools, and districts play a role in supporting or undermining teacher professionalism. In short, research indicates that teachers will engage in such activities as evaluation studies if they are part of an active learning community of teachers. Furthermore, what is needed in the urban environment is an approach that integrates efforts to promote urban school reform with broader strategies aimed at revitalizing urban areas. More teachers will engage in efforts to collect student performance outcome data if they work in an environment characterized as a supportive learning community--for both themselves and their students.


LESSONS LEARNED

      A detailed discussion of outcome data was presented from our Brooklyn, Omaha, and Oklahoma City teams. In closing, however, it is important to acknowledge the many sources of data that are available to each team. Consider, for instance, the guidance counseling component. If the goal of a team is to design and implement a comprehensive career information system for all students, then this goal can be assessed by analyzing how the changes in guidance activities have affected students. As presented in other chapters of this publication, team members can use several bodies of evidence for evaluation purposes, including integrated course materials, plans for work-based learning experiences, individual career plans, and articulation agreements. In some cases, however, the most powerful outcome measures are those of student progress and achievement.

      The proportion of Urban Schools Network teams that can provide data on student outcomes is representative of the nation as a whole. As mentioned in Chapter Eight, according to results from the federally funded national study of Tech Prep, "in 1993 only two percent of consortia were actually testing a computerized student database" (Silverberg and Hershey, 1994, p. 30). Similarly, Bragg, Puckett, Reger, Thomas, Ortman, and Dornsife (1997) report, based upon a nationwide random sample of Tech Prep consortia (n=855) surveyed during two time periods, that only eight percent of consortia reported "initial implementation" of an evaluation component in 1992-93, and only twenty-eight percent did so in 1994-95. Initial implementation was defined as what "occurs when plans and products of the development stage begin to be carried out" (Bragg et al., 1997, p. 49).

      What can NCRVE or other outside organizations that help schools change do to increase the number of sites engaged in evaluation activities? As stated during July 1997 focus groups, sites continue to need technical assistance in translating goals into outcome measures, identifying student comparison groups, and learning techniques for collecting student follow-up data after graduation. Most importantly, Network site staff need access to data that is already available at their schools or districts. In some cases, this access may be gained by identifying an appropriate administrator or by working with a principal who receives information and reports on a regular basis. This last area targeted for technical assistance highlights the political nature of conducting evaluation and a barrier mentioned in this chapter--that available data may be rejected because it does not support a particular belief system or contradicts a particular goal. There is no single solution to the problems of access, interpretation, and use of data, but it is encouraging that Urban Schools Network member institutions have provided strategies to surmount these obstacles, which are common across so many school settings and a feature of implementing any major education reform initiative.


REFERENCES

Bascia, N. (1996). Caught in the crossfire: Restructuring, collaboration, and the "problem" school. Urban Education, 31(2), 177-198.

Bragg, D., Puckett, P., Reger, W., Thomas, H., Ortman, J., & Dornsife, C. (1997). More "promising trends and lingering challenges" for Tech Prep and school-to-work. Berkeley, CA: NCRVE.

Bragg, D. (1992). Implementing Tech Prep: A guide to planning a quality initiative. Berkeley, CA: NCRVE.

Dornsife, C. (1992). Beyond articulation: The development of Tech Prep programs. Berkeley, CA: NCRVE.

Grubb, W. N., Davis, G., Plihal, J., & Lum, J. (1991). The cunning hand, the cultured mind: Models for integrated vocational and academic education. Berkeley, CA: NCRVE.

Hull, D., & Parnell, D. (1991). Tech Prep associate degree: A win/win experience. Waco, TX: Center for Occupational Research and Development.

Hellman, C. (1994). Tech Prep student achievement: An examination of CREATE's Tech Prep program. Oklahoma City, OK: Oklahoma City Community College.

Hellman, C. (1995). A preliminary report on the school to work student enrolled at Oklahoma City Community College. Research Monograph VIII. Oklahoma City, OK: Oklahoma City Community College.

Jurgens, C. (1995). Nebraska Tech Prep. Executive summary, fiscal year, 1995. Lincoln, NE: Nebraska Department of Education.

Kanpol, B., & Yeo, F. (1995). Inner-city realities: Democracy within difference, theory, and practice. The Urban Review, 27(1), 77-91.

Layton, J., & Bragg, D. (1993). Results from a 50-state survey. Berkeley, CA: NCRVE.

Little, J. (1993). Teacher as researcher. Berkeley, CA: NCRVE.

Noguera, P. (1996). Converting the urban in urban school reform. The Urban Review, 28(1), 1-19.

Silverberg, M., & Hershey, A. (1994). The emergence of tech-prep at the state and local levels. Washington, DC: U.S. Department of Education, Office of Planning and Evaluation Service.

Talbert, J., & McLaughlin, M. (1994, February). Teacher professionalism in local school contexts. American Journal of Education, 102, 123-153.

Weinstein, R., Madison, S., & Kuklinski, M. (1995). Raising expectations in schooling: Obstacles and opportunities for change. American Educational Research Journal, 32(1), 121-159.


APPENDIX

AGREEMENT FOR PROGRAM IMPROVEMENT
BETWEEN THE NATIONAL CENTER FOR RESEARCH IN VOCATIONAL EDUCATION (NCRVE) AND SCHOOLS AND COLLEGES AFFILIATED WITH THE NCRVE NETWORK

      The school or college agrees to:

1. Offer at least one career-related learning sequence in which every participating student has the opportunity to accomplish both of the following objectives:
a) Achieve high academic standards and satisfy course requirements for admission to postsecondary education, including four-year college or university; and
b) Gain strong understanding of and experience in "all aspects of an industry" to prepare for rewarding employment and potential career advancement.
2. Try to include in each career-related learning sequence a range of students whose demographic characteristics and performance levels reflect the composition of the whole school or college, and provide the services needed to enable all participating students to achieve the objectives in (1).
3. Recruit students into each career-related learning sequence on the basis of students' own choice.
4. Determine the effectiveness of the following practices in each career-related learning sequence:
a) Curriculum that integrates academic and vocational-technical subjects through the study of a broad industry or career major;
b) Student-centered instructional methods that link classroom studies to work-based learning; and
c) Explicit pathways that lead from high school to postsecondary education.
5. Involve students, parents, faculty, and employers in decisions that affect the program.
6. Participate in periodic self-assessment activities to document the progress made, the type of difficulties encountered, and the course of future actions.
7. Share information with NCRVE at least once a year about the extent of progress in implementing the practices in (2) through (5) and achieving the student outcomes in (1), including results of activity-based assessments of student learning.

      NCRVE is committed to providing assistance to Network schools in their efforts to implement the Agreement for Program Improvement.

      NCRVE agrees to provide:

  1. COMMUNICATION:
          Talk with teams on a monthly basis to assess needs and solicit input on Network events. Maintain and monitor an electronic network with the participating teams. Provide updated information on activities, research, funding sources, successful programs, and opportunities for professional development. Promote activities of the Urban Schools Network locally and nationally and report progress made at the individual sites. Help teams market their programs to the community and the media.

  2. PROFESSIONAL DEVELOPMENT:
          Conduct regional meetings or summer institutes to exchange ideas across team sites, and enable teams to further progress in the implementation of their plans.

  3. TECHNICAL ASSISTANCE:
          Match sites with individuals (field consultants, NCRVE staff, and Faculty Fellows) and resources to help teams in the process of planning, implementation, and problem solving. Provide sites with access to the program development guide: Getting to Work. Work within the district and region to develop ongoing support. Assist teams in the procurement of funds by providing letters of support, information pertaining to sources of funding, and grant preparation help.

  4. FEEDBACK:
          Provide opportunities for assessment of progress, collection, and documentation of improvement. Offer suggestions, problem solving, and troubleshooting. Synthesize evidence of progress across network sites. Showcase the accomplishments of Network sites to community, state, and national audiences.

  5. FUNDING:
          Actively seek additional sources of funding for the Urban Schools Network. Additional support will be utilized to provide on-site technical assistance to the network schools, and to assist team members in attending NCRVE-sponsored meetings and other professional development activities.


[3] The Program Improvement Agreement was a two-page declaration of mutual agreement between NCRVE and the individual Urban Network sites. The agreement presented seven goals NCRVE embraced for educational reform, and by signing the agreement, the team members committed themselves to implementing these goals. One of the items (#7) specifically called for the annual collection of outcome information. Technically, the teams agreed to submit information on their progress in both programmatic implementation and student achievement. See the Appendix at the end of this chapter for a copy of the agreement.

[4] It is important to note that the definition of a Tech Prep student changed over the course of the evaluation as the initiative evolved and goals were refined. For instance, initially, 1,000 secondary students could be identified as "Tech Prep" because the definition was limited to applied academics course enrollment. Eventually, the definition included the qualification of enrollment in an articulated vocational-technical program.

[5] The ACT scores and GPA were selected because OKCCC routinely includes this information in student records. For each criterion measure, a one-way analysis of variance was calculated to determine whether any significant differences existed between the STW and the control group. No significant differences were found, most likely a function of the small sample sizes (Hellman, 1995, p. 2).
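To make the method in this footnote concrete, the F statistic for a one-way analysis of variance can be computed directly from group scores, as sketched below. The GPA values here are hypothetical placeholders, not the actual OKCCC student records, and the function is an illustrative sketch of the general technique rather than the evaluator's actual procedure.

```python
def one_way_anova_f(groups):
    """Return the F statistic for a one-way ANOVA across a list of groups."""
    k = len(groups)                           # number of groups
    n = sum(len(g) for g in groups)           # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: spread of group means around the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: spread of scores around their own group mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    ms_between = ss_between / (k - 1)         # between-group mean square
    ms_within = ss_within / (n - k)           # within-group mean square
    return ms_between / ms_within

# Hypothetical GPAs for an STW group and a control group.
stw = [2.9, 2.5, 3.1, 2.4, 2.8]
control = [2.6, 2.3, 3.0, 2.2, 2.7]
f_statistic = one_way_anova_f([stw, control])
```

With samples as small as those in the pilot study (n=16 and n=11), the within-group mean square tends to dominate, which is consistent with the footnote's observation that no significant differences were detected.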

