The complex and consequential process whereby matches are made between students and the diverse array of academic and vocational courses is little understood. To help sort out this matchmaking process, we looked closely at three comprehensive high schools and the students who attended them. We were particularly interested in assessing the importance of three possible factors: first, educators' judgments about students' abilities, their postsecondary destinations, and their educational needs--especially as these relate to race, gender, and social class; second, students' and parents' preferences; and third, limits and opportunities in different schools that stem from conditions outside of schools (changing demographics, state policies, and resources) and from schools' own traditions and structures. Our overarching goal was to understand the culture surrounding the differentiated curriculum--the dynamics that keep it in place and that are likely to erect obstacles to blurring the boundaries between academic and vocational curricula and between academic and vocational students.
In the first year of our case studies, we spent a great deal of time doing field work in three very different schools--observing, studying school documents, and talking with educators and students. Administrators and teachers told us how they made decisions about what academic and vocational courses to offer and how to place students in various courses.[3] During the second year, we analyzed transcripts from students in the 1988 senior class at the three schools. The transcripts gave us rich information about the consequences of the curriculum decision processes at the three schools and about which courses students actually took. As such, they allowed us to examine how the placement and coursetaking experience differed for students from different racial, ethnic, and socioeconomic groups, for boys and girls, for native-born as well as foreign-born students, and for those who appeared to be college-bound and those who did not. These data also permitted us to examine the role of vocational education in the high school careers of various student groups and to probe differences between the high school curriculum experiences and post-high school plans of students who took a relatively large number of vocational education courses and those who took little or no vocational education.
We selected three four-year senior high schools located in adjacent communities within a major West Coast urban center.[4] The schools differ in two important ways. First, they are part of three separate local school districts, each with its own interpretations of state policies and its own curriculum policies. Second, the schools differ in the student populations they serve. Coolidge serves a racially and socioeconomically diverse group of students who live in an integrated neighborhood. The student body at Washington is almost entirely middle- to upper-middle-class white and Asian. Students at McKinley are nearly all African American and Latino, a substantial proportion of whom are poor. These differences were of particular interest, since we wanted to explore how they might relate to differences in curriculum and placement decisions at the high schools. Taken together, the similarities and differences among the schools permitted us to raise some preliminary hypotheses about how comprehensive high schools of various types juggle academic and vocational programs. They also permitted us to explore how schools respond to the pressures from state and district policymakers, the needs of the surrounding labor market, and administrators' and teachers' own beliefs about what educational experiences different students need in high school.
The schools' geographic proximity held constant several factors that might otherwise confound our understanding of similarities and differences in their decisionmaking processes. Because the schools are in the same labor market area, we could be more certain that, although the programs offered might reflect more or less sensitivity to the types of jobs likely to be available to students, they would not be geared to preparing students for communities with very different needs. The schools' proximity also held constant the type of postsecondary education and training opportunities available to graduates and dropouts. Finally, the schools were subject to the same state resource and curriculum policies--e.g., high school graduation and state college and university requirements; regulations governing the use of Perkins money for vocational programs; and other state-controlled vocational programs. For example, all three schools have similar access to state-supported regional occupational training programs. These programs provide courses both on high school campuses and in off-campus centers--courses that differ in several important respects from the "regular" school vocational offerings. Their programs are subject to state approval, their staff members are more closely connected with work settings (many are part-time employees), and the state provides these programs with extra funding to purchase up-to-date equipment and materials. We expected that the additional resources available through the regional programs would have a similar effect on the quantity and type of vocational courses each of the schools offered as part of its comprehensive program.
We collected and analyzed each school's student handbook, course descriptions, and master schedule to obtain the "public" information about course offerings and enrollment processes. These gave us a comprehensive and "objective" picture of the curriculum opportunities available at the three schools and the official procedures through which students obtain them.
We relied on interviews to reveal the subtler, more "subjective" side of the story about how schools make curriculum and placement decisions. We conducted our interviews during the 1988-1989 school year, beginning with district-level administrators, then site administrators, counselors, and teachers. For each school, we interviewed the district curriculum director, the district vocational education coordinator, the school principal, and the assistant principals or deans responsible for overseeing curriculum or counseling. We also interviewed all of the counselors and approximately 15 teachers at each school.
We designed the interview protocols for each respondent group as we proceeded, incorporating knowledge gained in the preceding tier of interviews. Nevertheless, in each interview we queried respondents about the influence on school decisions of several factors external to the school, including funding levels and policies at the state and local levels and the demographic and socioeconomic characteristics of student populations. We also asked about the effects of internal school factors, including the philosophy of the site administration, the capacity and teaching preferences of the staff, and the logistics of building a schedule. We framed questions that might reveal educators' perceptions of the "appropriate" curricula for various students (e.g., those with particular race, class, gender, and prior achievement characteristics), as well as questions about guidance counseling practices, grades, and test scores. We also asked about students' and parents' influences on the nature of the schools' programs and on the assignment of students to various programs.
At two of the three schools we also interviewed students drawn from both vocational classes and academic classes in various tracks. We asked them about how various factors influenced their decisions to enroll in particular courses--their own current interests, their postsecondary aspirations, the guidance counseling provided by the school, parent involvement, and their perceptions of the purpose and quality of various curriculum offerings, particularly vocational education.
To ensure the validity of interview data, we used standard triangulation procedures. We collected data about each topic of interest from a variety of data sources (school records, interviews, and observations). Additionally, several data collectors conducted interviews and observations at each site.
We also used triangulation strategies as we analyzed the case study data. At least two members of the study team coded data from interviews and site visits and sorted these data into categories or themes central to the study. We also used teams of researchers to code the school record data to generate baseline quantitative descriptions of the curriculum at each of the schools and policies regarding student placement.
To place our findings about the curriculum at the case study schools in a broader context, we also collected data about the curriculum at three larger groups of schools. Each of these three "comparison groups" was matched to one of our study schools; that is, the schools in each group enrolled student bodies comparable to the study school's in parent education, English-speaking facility, mobility, and the concentration of students from families receiving assistance under the Aid to Families with Dependent Children Program. The comparison schools were located in counties that included large urban areas within the same state as our study schools. Additionally, all but one of these comparison schools were four-year comprehensive high schools. In total, we asked 82 schools to send copies of their master schedule for the 1988-1989 school year and their current course description booklet. Sixty-eight schools responded--22 that could be compared with Coolidge, 22 with McKinley, and 24 with Washington. From the materials these schools sent, we gained a better idea of how typical our three case study schools were in their graduation requirements, the types of vocational and academic courses offered, and the number of class sections of various courses that were actually scheduled. We were also able to use state data to compare achievement outcomes and some coursetaking patterns at our schools with those of the others in their group.
To understand students' coursetaking and vocational education experience at the three high schools, we collected background and transcript data for all students who were seniors any time during the 1987-1988 school year.[5] This sample included both graduates and nongraduates. Data were collected from the transcript of each student in the senior class, from other materials in the student's cumulative file, and in some cases from information provided by counselors and school or district administrative records.[6]
We noted each student's gender, race or ethnicity, and date of birth. At Washington we were also able to record students' country of birth. As with most other school-based studies, we were unable to find a reliable measure of students' socioeconomic status (SES).[7] At Coolidge, the guidance counselors agreed to estimate the household income of the 1988 seniors who had been assigned to them. Using this information, SES was rated as "low" (family income less than $12,000), "middle" ($12,000-$50,000), or "high" (more than $50,000). Counselors at the other two schools felt that they did not know their students well enough to make an accurate assessment. However, state data, along with what we learned during our interviews and observations, make clear that, on average, students at McKinley were from lower-income families than those at the other two schools, and that Washington's students were the most affluent.
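As an illustration of this coding rule, the sketch below shows one way the counselor-estimated household incomes could be mapped to the three SES categories; the function name and the handling of missing estimates are our own assumptions for illustration, not part of the study's actual procedures.

```python
# Illustrative sketch only (assumed function name and missing-value handling):
# mapping counselor-estimated household income to the SES categories used at Coolidge.
def ses_category(household_income):
    """Return 'low', 'middle', 'high', or 'missing' for an estimated dollar income."""
    if household_income is None:
        return "missing"            # counselor could not estimate this student's income
    if household_income < 12_000:
        return "low"
    if household_income <= 50_000:
        return "middle"
    return "high"
```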
At Coolidge and Washington we had access to each student's eighth grade reading and math standardized achievement test scores (e.g., the Comprehensive Test of Basic Skills); at all three schools we located students' 10th grade reading and math scores.[8]
We also recorded each student's graduation status, final GPA, class rank, total course credits, and, at two of the schools, whether the student completed the state university's requirements for admission. For those students who took the SAT or ACT college admissions tests, we recorded scores on both the verbal and math subtests. At all three schools, we noted whether a student requested that his or her transcript be sent to two-year or four-year colleges and universities or to technical trade schools as part of the process of applying for entrance to that institution.[9] These end-of-high-school outcomes gave us an opportunity to understand the extent to which the schools altered overall achievement levels or the relative standing of various groups of students during their high school years.
Finally, we collected data from the transcripts about the courses students had taken each semester (including summer school) for all four high school years. All mathematics, English, and vocational courses were recorded for all students. For students identified as vocational concentrators, all other subjects were noted as well.[10] We developed a course coding scheme based on the master schedule for each school that was consistent across schools but also allowed for the variation in each school's course offerings. In addition, the codes preserved considerable detail about the array of vocational course offerings. For each course, we noted the general subject area, the specific course title, the ability level or track of students for which it was intended, the number of credits, and the grade the student received. The ability or track codes distinguished among ESL, low or remedial, regular, college-preparatory, and honors courses. In addition, since Coolidge and Washington offered courses that combined students from different levels, we developed codes to identify various combinations. For example, some courses grouped low-track students with regular non-college-prep students, and others combined regular non-college-prep students with college-prep students. The course location codes identified courses taken at another U.S. or foreign high school, at an adult or continuation school, at a junior college or university, or at the off-campus regional center (RC).
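To make the structure of this coding scheme concrete, the following sketch shows one way the per-course records could be represented; the field names, category labels, and example values are illustrative assumptions rather than the study's actual codebook.

```python
# Illustrative sketch only: a per-course record of the kind implied by the coding
# scheme described above. Field names and category values are assumptions.
from dataclasses import dataclass

TRACK_CODES = {
    "ESL", "low/remedial", "regular", "college-prep", "honors",
    "low + regular", "regular + college-prep",   # combined-level courses
}
LOCATION_CODES = {
    "home school", "other U.S. high school", "foreign high school",
    "adult/continuation school", "junior college/university",
    "regional center (RC)",
}

@dataclass
class CourseRecord:
    student_id: str
    school_year: str        # e.g., "1985-86"
    semester: str           # "fall", "spring", or "summer"
    subject_area: str       # e.g., "mathematics", "English", "vocational"
    course_title: str       # specific title from the school's master schedule
    track: str              # one of TRACK_CODES
    credits: float
    grade: str              # letter grade the student received
    location: str           # one of LOCATION_CODES
```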
These data enabled analyses of the curriculum experiences of the student cohort enrolled at the schools sometime during their senior year. Students who were present from their freshman to senior years are included, as well as those who transferred into the school between their freshman and senior years and remained there. This sample does not include students who were in the graduating class of 1987-1988 but who transferred to another school or dropped out before the start of their senior year.[11] The sample sizes for the senior class at the three schools are shown in Table 2.1.[12]
It is important to note that our data permitted us to analyze subgroups of students within the senior class sample. For example, we can examine the coursetaking patterns for the cohorts of students who were enrolled continuously at their respective schools from 11th to 12th grades, 10th to 12th grades, or 9th to 12th grades.[13] We decided to focus our analysis of student coursetaking behavior on the cohort of students enrolled in the 10th through 12th grades at their respective schools.[14] The continuity of experience for this student cohort makes them most relevant to our analysis of who gets what and why at particular schools. These are the students likely to have been most affected by the decisionmaking processes operating at the school.
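A minimal sketch of how such enrollment-based cohorts could be constructed appears below; the data layout (a per-student set of grades in which the student was enrolled at the school) is an assumption made for illustration, not a description of the schools' actual records.

```python
# Illustrative sketch only (assumed data layout): selecting seniors continuously
# enrolled at the school from a given starting grade through 12th grade.
def continuous_cohort(seniors, first_grade=10):
    """Keep students enrolled at the school in every grade from first_grade through 12."""
    required_grades = set(range(first_grade, 13))
    return [s for s in seniors if required_grades <= set(s["enrolled_grades"])]

# Example with hypothetical records: only the first student belongs to the
# 10th-12th grade cohort on which most of our analyses focus.
seniors = [
    {"id": "A103", "enrolled_grades": {9, 10, 11, 12}},
    {"id": "B220", "enrolled_grades": {11, 12}},   # transferred in for 11th grade
]
cohort_10_12 = continuous_cohort(seniors, first_grade=10)
```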
As noted above, the student bodies at Coolidge, Washington, and McKinley High Schools differed in their racial and ethnic makeup, the number of foreign-born, their achievement levels, and their post-high school plans.
Table 2.1. Senior Class Sample Sizes, by Years of Continuous Enrollment

| Grade | Washington | | Coolidge | | McKinley | |
|---|---|---|---|---|---|---|
| | No. | % of Senior Class | No. | % of Senior Class | No. | % of Senior Class |
| 12th | 458 | 100.0 | 446 | 100.0 | 436 | 100.0 |
| 11th-12th | 432 | 94.3 | 423 | 94.8 | 411 | 94.3 |
| 10th-12th | 398 | 86.9 | 380 | 85.2 | 350 | 80.3 |
| 9th-12th | 368 | 80.3 | 323 | 72.4 | 285 | 65.4 |

NOTE: We defined the 12th grade year as the 1987-88 academic year; 11th grade as 1986-87; 10th grade as 1985-86; and 9th grade as 1984-85.
Table 2.2 displays the demographic characteristics of students at our three schools.[15] As noted above, Coolidge's senior class is the most ethnically diverse in contrast to Washington's largely white and Asian student body, and to McKinley's student population, which is overwhelmingly African American with a significant Latino cohort. More striking, however, is the fact that a large number of students at Washington and McKinley were born outside the United States. Eighty-six percent of the Asian students and 42 percent of the Latino students at the two schools are immigrants.[16] This pattern reflects national trends, particularly for high schools in metropolitan areas.
Table 2.2. Demographic Characteristics of the Senior Class at Each School

| Characteristic | Washington | | Coolidge | | McKinley | |
|---|---|---|---|---|---|---|
| | 12th grade | 10th-12th grade | 12th grade | 10th-12th grade | 12th grade | 10th-12th grade |
| Number of students | 458 | 398 | 446 | 380 | 436 | 350 |
| Sex (%) | | | | | | |
| Male | 44.5 | 45.2 | 47.7 | 46.6 | 47.0 | 48.0 |
| Female | 55.5 | 54.8 | 52.3 | 53.4 | 53.0 | 52.0 |
| Race/ethnicity (%) | | | | | | |
| White | 63.8 | 66.1 | 46.2 | 47.6 | 0.2 | 0.0 |
| Black | 0.4 | 0.3 | 12.8 | 10.8 | 72.9 | 72.3 |
| Asian | 29.7 | 28.1 | 12.6 | 13.2 | 1.6 | 0.9 |
| Latino | 5.7 | 5.0 | 27.1 | 27.6 | 22.5 | 24.0 |
| Other/missing | 0.4 | 0.5 | 1.3 | 0.8 | 2.8 | 2.9 |
| Country of birth (%)a | | | | | | |
| USA | 71.0 | 73.1 | -- | -- | 68.8 | 70.6 |
| Japan, Southeast Asia | 24.2 | 22.9 | -- | -- | 1.8 | 1.4 |
| Mexico, South/Central America | 0.7 | 0.5 | -- | -- | 15.6 | 17.4 |
| Europe, Africa, Middle East | 3.9 | 3.3 | -- | -- | 2.7 | 2.0 |
| Other/missing | 0.2 | 0.2 | -- | -- | 11.1 | 8.6 |
| SES (%)b | | | | | | |
| Low | -- | -- | 13.2 | 13.4 | -- | -- |
| Middle | -- | -- | 60.8 | 61.1 | -- | -- |
| High | -- | -- | 14.4 | 15.5 | -- | -- |
| Missing | -- | -- | 12.6 | 10.0 | -- | -- |

a Data on country of birth were not available for Coolidge High students.
b SES data were available for Coolidge High students only. Data were derived from a retrospective assessment of each student's family income by that student's former guidance counselor.
Our data on the socioeconomic status of Coolidge students suggest that more than half (60 percent) come from middle-class families, while significant percentages belong to poor (13 percent) and wealthy (14 percent) families. It is interesting to note that a large and roughly comparable percentage of Coolidge students from all racial and ethnic groups fall in the middle SES group. Among students outside that middle group, Asians are disproportionately low SES and whites are disproportionately high SES; African American and Latino students not in the middle group are nearly equally divided between high and low SES.[17]
On every measure for which we have data across the three schools, McKinley students rank the lowest: on 10th grade achievement test scores in both math and reading, SAT math and verbal scores, total number of credits taken, cumulative grade point average, and graduation rate (see Table 2.3).
Table 2.3. Achievement and Other Educational Outcomes, 10th-12th Grade Cohort

| Measure | Washington | Coolidge | McKinley |
|---|---|---|---|
| Mean percentile scores | | | |
| Math, grade 8*a | 70.9 (271) | 68.4 (274) | -- |
| Math, grade 10** | 72.2 (363) | 62.0 (322) | 44.8 (322) |
| Reading, grade 8*a | 66.9 (269) | 61.3 (276) | -- |
| Reading, grade 10** | 60.8 (370) | 54.9 (324) | 40.2 (325) |
| SAT mean score, math** | 545.3 (208) | 470.6 (186) | 352.8 (117) |
| SAT mean score, verbal** | 429.8 (208) | 422.5 (186) | 328.1 (117) |
| Percentage who met state university requirementsa | 47.3 | 34.1 | -- |
| Mean total creditsb | 244.1 | 233.1 | 229.4 |
| Mean GPA | 2.8 | 2.5 | 2.3 |
| Percentage of 12th graders who graduated | 92.5 | 92.9 | 86.6 |
| Sample size | 398 | 380 | 350 |

NOTE: When sample sizes are smaller than the full sample because of missing data, the sample sizes are shown in parentheses.
* Differences among schools are significant at the .05 level.
** Differences among schools are significant at the .01 level.
a We were unable to obtain 8th grade achievement test scores or information on the number of students who met the state university course requirements for McKinley students.
b The total credits required for graduation differed across schools: Coolidge and Washington each required 220 credits, and McKinley required 230. Although our sample at all three schools includes 1988 seniors who did not graduate as well as those who did, the mean total credits for McKinley students falls below that school's graduation requirement largely because McKinley's graduation rate is lower than that of Coolidge or Washington.
Washington students score higher than students at Coolidge and McKinley on six of our ten measures. More Washington than Coolidge students met the state university course requirements, Washington students earned a slightly higher mean GPA, and they took more total credits during their four years. Moreover, Washington students' scores on the 8th and 10th grade math achievement tests and on the math portion of the SAT are significantly higher than those of Coolidge students. Only on the reading achievement tests (8th and 10th grade) and the verbal portion of the SAT did Washington students in our sample score lower than Coolidge students; even on these measures, they scored higher than students at McKinley.[18]
At Coolidge and Washington, Asian students score highest in math and somewhat lower than whites in reading and verbal competencies.[19] A significantly higher percentage of Asian students at both schools completed the state university entrance requirements than did students from any other ethnic group. Latino students at Coolidge and McKinley scored at the bottom on nearly every measure of achievement.[20] Foreign-born students at Washington, most of whom were Asian, performed significantly better than other (mostly white) students in math, and many more of them completed the university entrance requirements. Their reading scores, however, were lower than those of native-born students. At McKinley, this pattern did not hold; the scores of foreign-born students, who are mostly Latino, tended to be comparable to or lower than those of the native-born, largely African American, cohort.
Table 2.4 suggests that the achievement differences among the "best" students at each of the schools--those in the top 10 percent of the class, as defined by both GPA and class rank--follow patterns similar to those found among the schools as a whole. McKinley's "best" seniors scored lower than the comparable group of students at Coolidge or Washington on reading and math achievement tests and on both the verbal and math portions of the SAT. Moreover, the mean scores of McKinley's most academically talented students were also lower than the mean scores for all students at Coolidge and Washington on the verbal and math portions of the SAT, and lower than the mean score of all Washington students on the 10th grade math achievement test. At the same time, the extremely high math scores and middling English scores that we observed among all Washington students relative to their Coolidge counterparts remain when we compare the top 10 percent cohort at each school.[21]
Despite these striking differences among the schools and the subgroups within them, we observe an interesting similarity: none of the schools appeared to have increased its relative achievement ranking (in terms of national norms) over time. At Coolidge and Washington, in fact, we find some slippage in national percentile rankings between students' 8th and 10th grade test scores. At Washington this slippage appears in reading, and at Coolidge in both reading and mathematics (see Table 2.3). The relative stability of percentile rankings for the top 10 percent at each school suggests that the slippage occurred primarily among middle- and low-achieving students--an indication that the schools were less successful with these groups than with their highest-achieving students.
Table 2.4. Achievement of the Top 10 Percent of the Senior Class at Each School

| Measure | Washington | Coolidge | McKinley |
|---|---|---|---|
| Math, grade 8a | 96.5 (23) | 90.8 (34) | -- |
| Math, grade 10 | 97.1 (43) | 89.1 (39) | 70.1 (31) |
| Reading, grade 8a | 77.1 (23) | 82.7 (35) | -- |
| Reading, grade 10 | 73.8 (43) | 81.0 (39) | 62.8 (31) |
| SAT mean score, math | 653.6 (44) | 581.3 (40) | 412.4 (25) |
| SAT mean score, verbal | 502.7 (44) | 512.5 (40) | 370.4 (25) |
| Sample size | 45 | 43 | 34 |

NOTE: The top 10 percent is defined by both GPA and class rank. When sample sizes are smaller than the full sample because of missing data, the sample sizes are shown in parentheses.
a We were unable to obtain the 8th grade scores for McKinley students.
Additionally, within all three schools, a comparison of the numbers and SAT scores of test takers from various racial and ethnic groups with their 10th grade achievement scores suggests that none of the schools was particularly effective in increasing the academic performance of its Latino and African American students.
The plans of 1988 seniors at each school are consistent with the differences in student achievement we observed (see Table 2.5). Again, McKinley students appeared least likely to apply to two- or four-year colleges, Coolidge students were more likely to apply to two-year than to four-year colleges, and Washington students were the most likely to apply to four-year colleges.[22]
Few students from the 1988 senior class at any of our schools (between 1 and 3 percent) appeared interested in formal postsecondary technical education.
Table 2.5. Postsecondary Plans of 1988 Seniors: Percentage Requesting That Transcripts Be Sent to Each Type of Institution

| Type of institution | Washington | Coolidge | McKinley |
|---|---|---|---|
| Two-year college | 14.7 | 40.8 | 6.1 |
| Four-year college | 72.6 | 37.1 | 29.0 |
| Two- or four-year college | 75.8 | 69.0 | 33.9 |
| Technical school | 2.8 | 1.3 | 3.1 |
Taken together, the data from our interviews, observations, examination of school documents, and transcript analyses permit us to examine commonalities across the three schools and differences among them in the culture that supports a differentiated academic and vocational curriculum--i.e., the dynamics underlying schools' decisions about curriculum and student placement and the patterns of student coursetaking that follow.
Even so, there are important limitations to what our case study research can provide. Our field work focused on decisionmaking processes in general, rather than on decisions about specific students that could be linked to their coursetaking. The transcripts give few clues about how or why a student enrolls in courses and particular levels of courses. Moreover, transcripts report only final semester course placements; they do not indicate whether a student might have initially enrolled in other courses and subsequently requested a transfer or was transferred into the courses or sections recorded. As a result, it is difficult to verify whether the schools or the students make curriculum decisions, to make judgments about the degree of coursetaking "coherence" at our three schools, or to understand how well schools are able to carry out their plans to have students take the courses they "need." Finally, although we believe that our schools are similar to many others, it is impossible to generalize our findings to a larger population of high schools.
Nevertheless, as we detail in the sections that follow, the combination of intensive field work and transcript analyses allowed us to examine, close up, the dynamics of curriculum differentiation in contemporary high schools and its effect on students' academic and vocational placements.
[3]Individual case reports on the three schools appear in a companion Note, Selvin et al. (1990).
[4]We have kept confidential the identity and location of each school and the identity of all individuals with whom we spoke. The names we have assigned to the three schools are pseudonyms.
[5]Other studies of students' coursetaking patterns have based their samples on the cohort of students enrolled in the freshman class (Garet and Delaney, 1988). Given the limitations of the administrative and recordkeeping procedures at the three schools, it was not possible within our timeframe and budget to collect transcript data for the group of students who entered 9th grade in the fall of 1984.
[6]Some data were not available for all three schools.
[7]A list of students eligible for free or reduced-price lunch was available for students at Washington and Coolidge. But not all students who qualify for a free or reduced lunch take advantage of this benefit, either because they are unaware of it or are self-conscious about doing so. As a result, we felt that using the free or reduced-price lunch list as a marker for students from a "low" socioeconomic background would significantly underestimate that population. Moreover, we had no measure of students at the other end of the SES spectrum.
[8]Because the schools used different achievement tests, we used students' percentile rankings to obtain a comparable measure across schools.
[9]For all three schools, the data on this variable may be less reliable than the other measures coded from student transcripts, for the following reasons. Students may have requested that their transcripts be sent to institutions to which they did not ultimately apply for admission. In addition, we cannot be certain that all transcript requests were noted on transcripts. Although school administrators assured us that they were very conscientious about recording this information on each student's transcript, personnel at our schools may have been inconsistent in recording such requests. Such inconsistencies may have been random--that is, some clerks at each school may have been more conscientious than others--or systematic--clerks at one school may have been predictably less conscientious than clerks at the other schools. At Washington, information about transcripts sent was frequently missing from the student's file. By the time we began data collection, information from the transcripts of the 1988 senior class at Washington had been computer-entered, and the hard copy transcripts were no longer available at the school site (unlike at Coolidge and McKinley). However, the portion of each student's transcript where notations were made about transcripts sent to postsecondary institutions was often missing from the electronic file. Our coding scheme, at Washington as well as at the other two schools, distinguished student transcripts from which this information was missing from complete transcripts on which no transcript requests had been noted.
[10]We defined students as vocational concentrators if they took six or more semesters of vocational courses at the case study school. Although we recorded all courses taken by concentrator students, for this report we analyzed only their participation in math, English, and vocational courses.
[11]This sample undoubtedly biases our findings, particularly since students who drop out of school are likely to differ in systematic ways from those who remain. Therefore, our findings apply only to students who stayed in school until grade 12.
[12]Data were collected for a total of 1,355 students. Because of extensive missing data, 15 students were dropped from the final senior class sample (3 from Coolidge, 1 from Washington, and 11 from McKinley).
[13]At all three schools, about 95 percent of the senior class was also enrolled in the previous year. At Coolidge and Washington, 85 to 87 percent of the class was enrolled for the previous two years; only 80 percent of McKinley's senior class was present two years earlier. The three schools diverge even more in the stability of their student bodies when examining the fraction of four-year students represented in the senior class. Eighty percent of the senior class entered as freshmen at Washington, compared to only 65 percent for McKinley. This divergence may be due in part to differential rates of enrollment of students who had attended private three-year junior high schools or schools outside the district. It is compounded by the significant differences in the schools' attrition rates. The principal at Coolidge reported a 25 percent attrition rate between the 9th and 12th grades. Washington reported the lowest rate of attrition--12 percent over the four years--whereas at McKinley, 55 percent of 9th graders leave the school before graduation (Selvin et al., 1990).
[14]We did not analyze the 9th through 12th grade cohort because of the differential reduction in sample size across the three schools, as shown in Table 2.1.
[15]In this table we have presented data for two groups of students at each school: all students who were enrolled during the 12th grade and students who attended 10th through 12th grade in that school. These data indicate that the characteristics of both groups are quite similar. As a result, in subsequent tables we will present data for the 10th through 12th grade sample only.
[16]Eighty-two percent of Asian students at Washington and all Asian students at McKinley are foreign-born. (In the McKinley cohort, however, Asians constitute less than 1 percent of the total 10th-12th grade 1988 senior sample.) Twenty percent of Latino students at Washington and 70 percent of those at McKinley are foreign-born. We were unable to obtain data on country of birth for Coolidge seniors, although data from our field study suggest that a substantial proportion of the school's Latino and Asian students are immigrants.
[17]Table A.1 displays these data.
[18]This may result from the influence of the large foreign-born cohort at McKinley, and particularly at Washington.
[19]Tables A.2, A.3, and A.4 compare the achievement scores of students from different ethnic and racial backgrounds and of native-born and foreign-born students.
[20]However, the three Latino students at Washington who took the SAT test scored higher than did whites on the math portion and higher than Asians and whites on the verbal portion.
[21]Of the top 10 percent of students at Washington, 95.5 percent took the 10th grade achievement test, compared with 90.7 percent of the top 10 percent at Coolidge and 91.2 percent at McKinley. For the SAT, the corresponding figures are 97.8 percent at Washington, 93.0 percent at Coolidge, and 73.5 percent at McKinley.
[22]Washington's high rate of application to four-year colleges, relative to Coolidge and McKinley, is consistent with the fact that more Washington students completed the state university's entrance requirements.