
RESEARCH METHODS

To address the primary research objective of this study, a survey research design was employed. Data were collected with a mail questionnaire completed by respondents from U.S. two-year colleges. This section presents a discussion of the population for the study, the data collection instrument and procedures, and the approaches taken to analyze the data.

The Population and Survey Response Rate

The study attempted a census of all two-year colleges (junior, technical, and community) in the United States as of September 1, 1993. The census design was used to ascertain the scope of work-based learning occurring nationwide as well as to give all U.S. two-year colleges the opportunity to nominate their "best" work-based learning programs. The sampling frame for the study was obtained from three sets of mailing labels totaling 1,036 names of two-year college presidents from the American Association of Community Colleges (AACC). On September 3, 1993, mail questionnaires were sent to each of the 1,036 two-year college presidents in the United States. Following multiple follow-up procedures (explained further in the section on "Questionnaire Administration"), a total of 505 surveys were returned as of December 31, 1993, for a response rate of 48.7%. Of these, 51 were not usable: they were returned blank or only partially completed, usually with the comment that the college did not have a work-based learning program. Consequently, the final version of the data set contained 454 cases.[1]
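As a quick check, the reported figures are internally consistent; the sketch below (not part of the original study) simply recomputes the response rate and usable-case count from the totals given above:

```python
# Counts reported in the text.
mailed = 1036      # questionnaires sent to two-year college presidents
returned = 505     # surveys returned by December 31, 1993
unusable = 51      # blank or partially completed returns

response_rate = returned / mailed      # proportion of mailed surveys returned
usable_cases = returned - unusable     # cases retained in the final data set

print(f"Response rate: {response_rate:.1%}")   # 48.7%
print(f"Usable cases: {usable_cases}")         # 454
```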

The following perspective, shared by Dr. Ellen Dran (1994) of the Northern Illinois University Center for Governmental Studies, the organization subcontracted to carry out administration of the questionnaire, is helpful in understanding the response rate for this study:

The 49% response rate for this study should be considered successful. Schools are heavily surveyed and to get 505 colleges to respond to such a long questionnaire is difficult. Also, based on the [telephone] calls we made to nonrespondents and calls by some colleges to us, we suspect that some of the nonrespondents did not have WBL programs and therefore did not think it necessary to return the questionnaire. . . . Probably the most important cause of nonresponse was the fact that the questionnaires were sent to each institution's president, asking that they be forwarded to the appropriate office. Based on our chaser phone calls, it appears that many of the questionnaires were "lost" in the presidents' offices. . . . Finally, comments over the telephone and on the questionnaires themselves indicated that the length of the survey and confusion about terms (especially duplicated and unduplicated head counts) were intimidating and probably contributed to nonresponse. Also, some schools apparently counted themselves out because they did not think their programs met the criterion of using "new and creative strategies" as indicated on pages 3 and 7 of the questionnaire. (Dran, 1994, pp. 1-2)

Since the survey attempted a census, and since few questionnaires were returned partially completed, it was not possible to compare results for colleges with and without work-based learning. Consequently, the extent to which results can be generalized to the entire population of U.S. two-year colleges is unknown. Unfortunately, neither our project staff, the panel of experts, nor the practitioners involved in the pilot test anticipated that a sizable proportion of two-year colleges might have few or no work-based learning programs, a circumstance that contributed to a substantial pattern of nonresponse. Had this pattern been anticipated, the researchers might have elected to draw a stratified random sample of all U.S. two-year colleges to enhance results pertaining to the scope of work-based learning activities. As it was, the study yielded an extremely rich database portraying self-nominated work-based learning programs from two-year colleges throughout the United States.

Questionnaire Development

A mail questionnaire was developed for this study based largely on information collected via previous library, survey, and field-based research conducted by the authors. The questionnaire asked a respondent designated by each college to provide information in the following areas: (1) the scope of work-based learning occurring across the college's curriculum, (2) the characteristics of the college's "best" work-based learning program in a health-related area, (3) the characteristics of the college's "best" work-based learning program in a nonhealth area, (4) the level of support for work-based learning from various stakeholder groups, (5) the general characteristics of the institution, and (6) policy recommendations to help foster additional work-based learning in the two-year college environment (see Figure 1).

Figure 1
Summary of Work-Based Learning in the Two-Year College
Questionnaire Sections and Items.

Part One:

Scope of Work-Based Learning

* Institutional head count enrollment

* Enrollment and estimated number of students in work-based learning by major curriculum area

* Occupational and academic programs that required work-based learning

Part Two:

Health Work-Based Learning Program

* Name of "best" health work-based learning program

* Qualities of the program

* Year first implemented

* Number of students in FY93

* Approximate number of hours in workplace

* Approximate number of full- and part-time faculty

* Percentages of health-care providers participating in the program that were small, medium-sized, or large

* Whether formally part of Tech Prep

* Type of work-based model used

* Program components used

* Location of primary responsibility for program components

Part Three:

Other Work-Based Learning Program

* Name of "best" nonhealth work-based learning program

* Qualities of the program

* Year first implemented

* Number of students in FY93

* Approximate number of hours in workplace

* Approximate number of full- and part-time faculty

* Percentages of employers participating in the program that were small, medium-sized, or large

* Whether formally part of Tech Prep

* Type of work-based model used

* Program components used

* Location of primary responsibility for program components

Part Four:

Support for Work-Based Learning

* Barriers to the growth of work-based learning

* Level of support for work-based learning programs

Part Five:

Institutional Characteristics

* FTE enrollment for FY93

* Whether enrollment is increasing, remaining stable, or decreasing

* Number of full-time faculty in FY93

* Approximate number of part-time faculty in the fall term of FY92

* Percentage of students enrolled in transfer, occupational, or adult curriculum

* Whether financial resources are increasing, stable, or decreasing

* Whether the college community environment is rural or small town, suburban, or urban

Part Six:

Work-Based Learning Policy Recommendations

* Recommended ways that local, state, or federal governments could encourage the growth of work-based learning programs

In the two sections of the survey that asked respondents to describe their "best" programs, the following criteria were designated: (1) a formal structure linking work-based and college-based learning; (2) a proven track record based on existing evaluation data; (3) a fully operational program with evidence of commitment by the college and local employers; and (4) the existence of new and creative strategies in any of the areas of curriculum and instruction, program administration, and/or partnerships between education, business, labor, or other organizations. (See Appendix for a copy of the mail survey instrument.)

Validity

To ensure the content validity of the instrument, a panel of experts reviewed a draft of the instrument. Based on feedback from this panel, the questionnaire was revised and disseminated to approximately twenty members of the National Council for Occupational Education (NCOE) advisory board for a pilot test. Several relatively minor modifications were made to the mail questionnaire based on feedback received from these individuals, including rewording questions or response categories. One major change based on the group's feedback was to ask for nominations of programs the respondent institutions considered "best" separately for the health and nonhealth curriculum areas. This modification was made because of concerns that two-year colleges' nominations would fall predominantly in a health field, specifically in nursing or nursing-related occupations. By creating both a health and a nonhealth section, we could ensure that results would be obtained on programs in nonhealth curriculum areas, an important consideration given this study's intent to span two-year college curricula (i.e., transfer, occupational-technical, and so forth).

Reliability

Cronbach's alpha reliability coefficients were calculated for the two subscales used in the survey. For the first subscale, respondents were asked to indicate the extent to which twenty barriers could slow the growth of work-based learning in their own college. A six-point scale was used to indicate the impact of each barrier on the growth of work-based learning, ranging from none (1) to very major (6). The Cronbach's alpha for this subscale was .94, indicating that the subscale of barriers to work-based learning was highly reliable.

The second subscale focused on the level of support for work-based learning currently being received from fourteen groups (i.e., stakeholder groups), although that particular language was not used in the questionnaire so as not to confuse respondents with potentially unfamiliar terms. Respondents were asked to indicate whether the level of support was poor (1), fair (2), good (3), excellent (4), or not applicable (9). The Cronbach's alpha for this subscale was .92. Again, the subscale provided highly reliable indicators of the level of support of various groups toward work-based learning.
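For readers unfamiliar with the statistic, Cronbach's alpha is computed from a respondents-by-items matrix of scale scores. The sketch below uses fabricated ratings on a six-point scale, not the actual survey data; only the formula itself is standard:

```python
from statistics import variance

def cronbach_alpha(scores):
    """Cronbach's alpha for a list of per-respondent item-score lists."""
    k = len(scores[0])                                    # number of items in the subscale
    item_vars = [variance(col) for col in zip(*scores)]   # sample variance of each item
    total_var = variance([sum(row) for row in scores])    # variance of the summed scale
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Illustrative (fabricated) ratings on a six-point barrier scale,
# NOT the study's data: five respondents by four items.
demo = [
    [1, 2, 1, 2],
    [3, 3, 4, 3],
    [5, 6, 5, 6],
    [2, 2, 3, 2],
    [4, 5, 4, 5],
]
print(round(cronbach_alpha(demo), 2))  # 0.97 -- items that move together yield a high alpha
```

Values near 1.0, such as the .94 and .92 reported above, indicate that the items within each subscale vary together closely across respondents.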

Questionnaire Administration

Administration of the mail questionnaire occurred in several phases based on a modified version of the total survey design method of Dillman (1979). First, the questionnaire, a cover letter, and a pre-addressed, stamped envelope were mailed on September 3, 1993, to the total sample of 1,036 two-year colleges. At that time, each college president was given the following instructions: "Your college has been selected to be part of our study. We ask your assistance in getting the questionnaire to the person in your institution who is most knowledgeable about work-based learning programs in operation during the 1993 fiscal year. Often that person is the occupational dean, but not always." The presidents were given contact names and phone numbers in case they had questions about whom to select to complete the questionnaire. Respondents were asked to complete the instrument and return it by September 24, 1993.

On September 13, a reminder postcard was mailed to all nonresponding colleges. On September 20, chaser telephone calls to a subsample of nonrespondents began, asking them to complete and return the survey. By the conclusion of the data collection period, 666 schools had been contacted with these chaser calls. On October 6 and 7, a second copy of the questionnaire, a cover letter, and a pre-addressed, stamped envelope were mailed to nonrespondents. A total of 732 questionnaires were mailed during this phase of the data collection process. Additional questionnaires were mailed when requested. All questionnaires received through December 31, 1993, were included in the analysis of data for this project. Again, 454 usable questionnaires resulted from this process and provided the basis for findings presented in this report.

Data Coding and Analysis

Data obtained from this study were coded and entered into a spreadsheet package and analyzed with the Statistical Package for the Social Sciences (SPSS) for the Macintosh. Coding of closed-ended items was relatively straightforward, usually following the responses on the questionnaire itself. However, Parts Two and Three of the survey, in which respondents were asked to identify a work-based learning program that met specified criteria, required more extensive coding. For these sections, the inventory of the Dictionary of Occupational Titles (DOT) was used to categorize nominated work-based learning programs in health and nonhealth areas. In some cases, similar DOT codes were combined to create larger categories; however, where possible, the original DOT codes were used to classify programs. Based on the DOT coding scheme, we were able to identify 21 separate types of health programs and 29 separate types of nonhealth or "other" programs.
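The categorization step described above can be thought of as a lookup from a nominated program name to a broad occupational category. The sketch below is purely hypothetical: the keywords and category labels are invented for illustration and do not reproduce the actual DOT inventory or the study's coding scheme.

```python
# Hypothetical keyword-to-category table; NOT the study's actual DOT codes.
CATEGORY_KEYWORDS = {
    "nursing": "health: nursing",
    "dental": "health: dental services",
    "radiolog": "health: radiologic technology",   # matches radiology/radiologic
    "automotive": "other: mechanics and repairers",
    "accounting": "other: financial occupations",
}

def categorize(program_name):
    """Assign a nominated program to a broad category, or flag it for manual review."""
    name = program_name.lower()
    for keyword, category in CATEGORY_KEYWORDS.items():
        if keyword in name:
            return category
    return "uncoded: review manually"  # stand-in for a manual DOT lookup

print(categorize("Associate Degree Nursing"))   # health: nursing
print(categorize("Automotive Technology"))      # other: mechanics and repairers
print(categorize("Culinary Arts"))              # uncoded: review manually
```

In the study itself this classification was done by hand against the DOT; the point of the sketch is only the structure of the step, with uncategorizable nominations routed to manual review.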

Other open-ended questions, such as those found in Parts Two and Three and the Part Six question asking respondents to provide policy recommendations, were content analyzed. The procedure used was an inductive content analysis (Guba & Lincoln, 1985; Patton, 1980). In this process, members of the project staff independently read and reread the open-ended responses to identify major themes thought to portray the data in a meaningful and comprehensive way. In cases where themes were coded and classified differently by project staff members, discrepancies were reviewed and consensus was reached on the themes, classification scheme, and labels used to represent the data.

Finally, it is important to point out that, as would be expected with a relatively large dataset such as this one, response rates varied across the sections and items of the survey. To retain as many questionnaires as possible for the statistical analysis, we included nearly all of the questionnaires returned by respondents, including some that contained varying amounts of missing data. Consequently, throughout the findings and discussion section of this report, when the number of respondents varied substantially from the total sample of 454 cases, that number is reported for tables and/or cells. The Appendix provides aggregated responses to the entire survey on an item-by-item basis.
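The per-table reporting convention described above amounts to counting, for each item, the cases with a nonmissing response. The sketch below uses hypothetical records (with None marking a skipped item), not the actual dataset, to show the idea:

```python
# Hypothetical respondent records; None marks an item the respondent skipped.
cases = [
    {"fte_enrollment": 2100, "support_rating": 3},
    {"fte_enrollment": None, "support_rating": 4},
    {"fte_enrollment": 5400, "support_rating": None},
    {"fte_enrollment": 900,  "support_rating": 2},
]

def valid_n(records, item):
    """Number of cases with a nonmissing response to the given item."""
    return sum(1 for record in records if record[item] is not None)

# This is the "n" that would be reported alongside each table or cell.
for item in ("fte_enrollment", "support_rating"):
    print(item, "n =", valid_n(cases, item))  # each prints n = 3 here
```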


[1] A detailed description of the data collection procedures was provided by Dr. Ellen Dran of the Center for Governmental Studies at Northern Illinois University. For further information about these procedures, contact the authors of this study for a copy of the Survey on Work-Based Learning in the Two-Year College Technical Report (1994) prepared by Dr. Dran.

