The Oklahoma competency-based testing program encompasses a range of criterion-referenced multiple-choice assessments and performance-based assessments that test the competency attainment of students in both comprehensive high school vocational programs and vocational technical centers. The Oklahoma Department of Vocational-Technical Education (Oklahoma Vo-Tech), a separate agency from the Oklahoma Department of Education, developed and oversees these tests. The testing system as a whole is used to achieve two objectives: certifying student skills and knowledge, and supporting program improvement.
In addition, in some occupational areas, testing has a third purpose: certifying that students have attained competencies for employment purposes.
Written, criterion-referenced multiple-choice tests have been developed for 250 occupational titles that are categorized into approximately fifty-five program areas. A new written test is administered every year for each of the job titles. Advisory groups have been established for each of the program areas to create duty/task lists and to rank tasks by importance (based on how frequently they come up on the job). Questions are written using these lists and are stored in a secure test bank. State staff randomly select test items to develop the annual written tests, which require a minimum score of 70 percent for passing. Oklahoma Vo-Tech is no longer creating assessments in areas involving licensure procedures, such as aircraft maintenance and cosmetology. However, the Oklahoma Department of Health has contracted with Oklahoma Vo-Tech to develop and administer their licensure exams (this aspect is discussed further in a later section).
Students are required to pass performance assessments before they are allowed to take the written test. Though instructors must use the statewide written test, they are free to select tasks for the performance assessment from state-developed ones or other sources, or to use their own performance assessments. Students are asked to show their competence in all the duty areas for a given occupation. Instructors are not required to report passing rates for performance assessments or to provide evidence that students passed the tests to the state, only to keep documentation at the school site for state review and audit purposes.
Performance assessments may be administered throughout the year or at a single point in time. Written assessments are administered at various times during the school year on dates set by the school. After taking the written test, students may retake it as many times as the instructor and school allow.
Results on written assessments are reported to the individual students and instructors. Testing liaisons/assessment coordinators at the school site receive a report that describes the performance of the program as a whole in each duty area. Scores are not used to compare different programs; they are aggregated only to compare programs to a standard. In fact, superintendents, principals, the state director, and the assistant director cannot access individual or program aggregate scores, because the department fears they could misinterpret the data. Only the state-level program manager and his or her staff can access the data.
Curriculum guides that closely follow the task lists have been developed for most occupational titles. Most schools use the curriculum guides, but doing so is optional. Curriculum guides include a posttest, which in practice is now used as both pretest and posttest in order to measure competency gains. These are reported by the testing liaison for the gain measure included in the 1990 Perkins Act-mandated performance measures and standards. Scores from the written assessments are used for the attainment measure.
The assessment system has been fully operational for the last ten years. Advisory committees continue to meet annually to revise task lists and tests. Before new tests are administered every year, each advisory committee reviews the items on its test. The committee for each occupation includes representatives from labor, higher education, secondary faculty, and industry.
Three test specialists (state-level Vo-Tech staff) are in charge of the fifty-five program areas. Each specialist coordinates advisory committee work, test development, administration, and scoring for his or her assigned areas. Given this heavy workload, it is very difficult for staff to keep the task lists up to current standards; task lists are thoroughly reviewed every three years.
State staff rely on testing liaisons at each school site to administer the written tests. Testing liaisons must be trained in the areas of objectivity, test security, and administration. Testing liaisons in all five regions received extensive training in 1993 and now receive updates every August at the annual vocational conference. In addition, staff work with the educators who train student teachers, so student teachers are familiar with the curriculum guides and tests before they become teachers. The liaisons were also given inservice training on performance assessment, but the state has no intention of centralizing that procedure.
Machine-readable forms used for the written competency tests are mailed to the testing division throughout the year. Individual and group results are reported at the end of May for each program, along with the mean percentage correct statewide. These scores are used by individual teachers for measuring classroom performance, and by the testing liaison to report competency attainment for program areas for the statewide measures and standards. Competency gain is reported from scores on the pre- and posttest included in the curriculum guides. Unlike the written tests, the hands-on component is not secured, nor is it administered consistently.
For the past eight years, the state agency has provided the Vocational-Technical Education Consortium of States (V-TECS) with duty task lists and tests, but it does not give V-TECS access to the test bank. The state wants the test bank to remain secure. Oklahoma Vo-Tech's assessment system also coordinates the assessments administered statewide by the health certification project and AGC programs (discussed in later sections).
Based on some pilot testing, the state staff and committees believe that scores on the multiple-choice test are closely linked to job performance. Because they believe there is no real need for a statewide evaluation of the performance assessment system (and because of the high cost of conducting one), they have opted not to pursue such an evaluation. The staff make reference to military research showing that cognitive knowledge (tested in multiple-choice format) is the best indicator of performance (knowledge transfer). No formal study has been conducted to investigate correlations between the performance components and written tests. The state office does not collect data on the failure rates on performance tests conducted at local sites.
State staff conduct item analyses on each multiple-choice test question to look for questions that are too difficult or that show gender or racial/ethnic bias. If they find test items that students are consistently not getting right, they review the items carefully and possibly throw them out. Staff also look at the number of tests administered to each student and the number (and percentage) of students retesting.
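The item-screening logic described above can be sketched in code. The data layout, thresholds, and the simple subgroup comparison below are illustrative assumptions, not Oklahoma Vo-Tech's actual statistical procedure:

```python
# Minimal sketch of classical item screening: flag questions that are
# too difficult overall, or whose subgroups answer at very different
# rates (a crude check for gender or racial/ethnic bias).
# Thresholds p_min and dif_max are illustrative, not official values.

def item_analysis(responses, groups, p_min=0.30, dif_max=0.15):
    """responses: list of per-student lists of 0/1 item scores.
    groups: one subgroup label per student (e.g., gender).
    Returns a list of (item_index, reason, statistic) tuples."""
    n_items = len(responses[0])
    flagged = []
    for i in range(n_items):
        scores = [r[i] for r in responses]
        p = sum(scores) / len(scores)  # classical item difficulty
        if p < p_min:
            flagged.append((i, "too difficult", round(p, 2)))
            continue
        # compare proportion correct across subgroups
        by_group = {}
        for s, g in zip(scores, groups):
            by_group.setdefault(g, []).append(s)
        rates = [sum(v) / len(v) for v in by_group.values()]
        if max(rates) - min(rates) > dif_max:
            flagged.append((i, "subgroup gap", round(max(rates) - min(rates), 2)))
    return flagged
```

A flagged item would then go to staff for careful review and possible removal, as the text describes.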
State staff feel confident in their tests' content validity because of the employer input to the assessment system. The committees meet regularly to update task lists and test questions in order to ensure that the system continues to be useful in opening up employment opportunities for students.
The assessment system has always been implemented on a partly voluntary basis. Local sites are directed to use the system by Oklahoma's state staff, which exercises more centralized control than its counterparts in most states (for fiscal and historical reasons). However, there are no real consequences for not using the state-prescribed system, and the twenty-nine districts (fifty-nine campuses) can choose which measures and standards to comply with. Under the system of performance measures and standards required by Perkins, school use of the competency-based assessment system has dramatically increased. Local programs receiving federal funds are required to report their performance using these written assessment results.
However, because of this connection with Perkins, instructors more often than not view the tests as contributing to the state agency's accountability system, rather than as a system developed for certifying students or for program improvement. At this point, many instructors use the tests mainly to comply with the state system. Some instructors have difficulty seeing the connection between what they teach in class and what is tested on the competency assessments. One reason for this lack of connection is that the curriculum guide (which teaches to the test) is used inconsistently across schools and occupational areas. Many instructors, schools, and districts use their own texts and other curriculum components instead.
The state recently implemented a "career passport" (i.e., a career-focused portfolio) system. It contains rich information on individual student skills that can be shown to potential employers. There are six required components: evidence of a high school diploma (or GED), vocational program completion, competency in one or more vocational areas (demonstrated by passing approved tests), completion of minimum academic course requirements, adequate attendance, and a resume. State staff hope that students and instructors will see the value of earning a passport and therefore begin to see the competency-based tests as part of a certification process, rather than solely as a way to meet accountability requirements. Participation in the passport system is also voluntary.
Oklahoma has a strong tradition of vocational education, including centralized state control and substantial state funding, which contributes to the successful operation of the assessment system. States lacking a strong funding base and a centralized system of vocational education may be hard-pressed to follow Oklahoma's example. Oklahoma has both vocational-technical centers (fifty-nine campuses) and comprehensive high schools in twenty-nine districts. Each is governed and funded by a separate structure. Through the state board of vocational-technical education, the state budget funds six of the fifteen staff in the testing division. Testing liaisons at each center are essential to the system. Most of the funding for the liaisons and for other costs of operating local test centers comes from local government sources; however, about 20 percent of the testing liaison's job is dedicated to the state's assessment system.
Oklahoma's experience also points to the difficulty of implementing a state-driven, centralized assessment system that is perceived as relevant at the local level. Many instructors find that the state tests and their course curriculum differ in scope. For example, an instructor of advanced electronics focuses on microcomputer skills and knowledge, but his or her students must take the "general technician" test, which is much broader in scope. In contrast, a business/technology instructor in a systems management program teaches general computer skills, not job-specific skills such as "receptionist/word processor." In this case, the test content may be somewhat narrower than the curriculum.
The Health Certification Project is administered jointly by Oklahoma Vo-Tech and the Oklahoma Department of Health. Certification is currently administered in six areas: long-term care nurse aide, home health care aide, medication aide, adult day care nurse aide, developmentally disabled nurse aide, and residential care nurse aide. Students complete a training program that is approved by the Oklahoma Department of Health and then take a two-part test: a clinical skills performance assessment and a written examination.
About 10,000 students complete the assessments each year in the six areas currently in operation. An RN or LVN must approve the clinical performance part of the test, which covers three selected objectives that change from year to year (these are selected from a comprehensive list of objectives). The testing liaison trains the RN or LVN to be a test judge, using a guide developed by the state. Only forty-three test sites may administer the written portion of the test, but any location (including a hospital) can be approved to assess clinical skills. Any person may work up to 120 days without certification in the three areas. With certification, typical per-hour pay is $5.25 for long-term care nurse aides and $8 for home health care aides.
Students must pass the clinical skills test before taking the written test. Since July 1995, students have been required to complete seventy-five hours of classroom training and twelve hours of clinical training before taking either test. Students study the subjects identified in the Health Certification Project duty/task list developed by Oklahoma Vo-Tech. The test-development method is the same one the Oklahoma competency-based assessment system uses (described above).
There are forty-three test sites in Oklahoma. The tests must meet federal and state licensure requirements for the relevant occupation. It costs each student $30 to take the clinical skills test (the home health aide test is a little more expensive because it requires thirteen competencies). The fees collected go to the area vocational-technical schools. It also costs $30 to take the written test (the area vo-tech school keeps $5; Oklahoma Vo-Tech receives the remaining $25). Oklahoma Vo-Tech is breaking even in administering the health tests.
The written exam is administered monthly; the clinical exam is by appointment (about six times per month). The performance assessments do not evaluate all the skills required to be competent for entry into the occupation. For example, in long-term care, only three skills are tested (selected randomly from fifty-two skills). In home health care, thirteen skills out of forty-eight are tested. It takes forty-five to sixty minutes to administer a clinical exam to one student, and each student is tested individually. Performance evaluators are trained using guides developed by Oklahoma Vo-Tech. Evaluators are paid about $19 an hour to observe and score the tests. The procedure for maintaining the quality of the multiple-choice portion of the exam is the same one used for the overall state competency-based system.
Because of Oklahoma's reputation for developing competency-based testing, the Associated General Contractors of America (AGC) hired Oklahoma Vo-Tech to develop assessments and administer a program that leads to nationally recognized credentials in three areas of the construction industry: carpentry (commercial and residential), bricklaying, and stone masonry. These are advanced certificates, with required prequalification of either two years of work experience, or one year of work experience plus the completion of a vocational education program. Prequalification must be documented on the registration form before the test will be administered. Contracting occupations were included in the certification program, but occupations outside of AGC's "contracting" jurisdiction, such as plumbing and electrician work, were excluded.
AGC was incorporated in 1921 as a full-service construction association representing the needs of both open-shop and collective-bargaining contractors. It represents 8,000 general contracting firms and 24,500 associate and affiliate members; it has 101 chapters nationwide. Its mission is stated as follows: "AGC is dedicated to providing programs that promote high standards in the construction industry. AGC has designed this certification program to give prestige and recognition to individuals working in the industry." AGC accredits training programs in various contracting trades. Its members work mainly on commercial construction, where workers are most in demand. The incentive to sit for one of the certificates varies from chapter to chapter. AGC spends a lot of time teaching contractors that they need to invest in training for the incoming workforce.
The first AGC-sponsored tests were administered in 1989. Tests are multiple-choice, with high-level skills incorporated into test questions. Academic skills such as basic math and reading are included. Oral testing is offered by special arrangement. AGC has never had a request for a test in a language other than English.
Using the process developed for Oklahoma's competency-based system, AGC's workforce development committee oversees the certification program, including development of the task lists and multiple-choice tests. This committee and subcommittees in each certificate area are made up of contractors, training instructors, foremen, and supervisors. Tests are administered through the 101 local AGC chapters across the country. Workers can be trained anywhere and then take the test. Curriculum materials have been developed and sold (on a voluntary basis) to various kinds of training programs (secondary and postsecondary vocational education programs, apprenticeship programs, and companies). In 1996, 700 tests were scanned and 65 to 70 percent met the minimum passing score of 70 percent.
Task lists are reviewed and revised annually by committees of AGC contractors (the committees are coordinated by Oklahoma Vo-Tech). The committees select a pool of questions generated from the task list. These questions are entered into a test bank, which has grown over time. A test is then generated from the test bank every year. Tests range from fifty to 100 questions, depending on the tasks. Data on how important each task is to the particular job are taken into consideration in test development. Committees review tasks, test questions, and curricula annually.
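A test-assembly step like the one described can be sketched as follows. The bank structure, importance weights, and per-task allocation rule are hypothetical illustrations; the source does not specify Oklahoma Vo-Tech's actual selection algorithm:

```python
import random

# Sketch: assemble an annual test from a secure item bank, allocating
# questions to each task roughly in proportion to the task's importance
# (e.g., how often it comes up on the job), then sampling items
# without replacement from each task's pool.

def build_test(bank, importance, n_items, seed=None):
    """bank: dict mapping task name -> list of question IDs.
    importance: dict mapping task name -> numeric weight.
    Returns a shuffled list of roughly n_items question IDs."""
    rng = random.Random(seed)
    total = sum(importance.values())
    test = []
    for task, questions in bank.items():
        share = round(n_items * importance[task] / total)
        share = min(share, len(questions))  # cannot exceed pool size
        test.extend(rng.sample(questions, share))
    rng.shuffle(test)
    return test
```

Because each year's test is drawn fresh from the bank, the bank itself can stay secure while the administered form changes annually.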
In addition to developing curriculum materials and administering the test bank through an AGC subcontract, Oklahoma Vo-Tech administers the tests through local chapters, scores completed tests, and conducts item analysis for AGC certifications. Oklahoma Vo-Tech has worked with AGC for twenty-five years; the last five have been highly focused on these certifications. AGC funds both a program coordinator and a secretary at Oklahoma Vo-Tech to run the testing program. AGC also employs a full-time curriculum developer at the curriculum center in Oklahoma. Oklahoma instructors can buy materials at cost, whereas AGC receives profits from sales in other states. Oklahoma Vo-Tech is responsible for marketing the curriculum materials and tests.
The annual testing process begins in September, when chapters are asked to identify cities for test sites that year. About a third of the chapters request participation. In January, promotional materials are sent to the chapters to advertise certification. The registration/test fee is $15 per person per test (an individual may take more than one test). The test is then administered in April. The program coordinator works with test coordinators at each chapter to set up the test site and hire test examiners. The required conditions for each test site (such as lighting) and test examiners (such as a resume describing their work in a construction occupation) are specified.
Scores are reported for regions and test sites if there are enough test-takers. Individual results are confidential and are sent only to the test-taker. This sometimes is a problem for employers, who may have paid for the test and expect access to the results. However, results are not shared without the permission of the test-taker. The program coordinator compiles a report for each chapter, with tips on how to improve scores next year.
Seventy percent is the minimum passing score on all tests. Informal research was done on the relationship between test score, skill level, and job performance, and the analysts judged that 70 percent was a "good" score for predicting successful job performance. However, no rigorous scientific research underlies this cutoff score.
AGC aims to break even financially by having registration and test fees balance the costs of the testing program. One way to do this is to have test sites recruit more test-takers (with a limit of twenty per site). Recruiting more test-takers may be difficult in some areas, however, because of limited incentives to take the test. Another way is to raise fees. When the program originally began, the committee thought the $15 fee would allow the program to break even or even make a profit. But even though money has been lost on the assessment program, no fee hikes have been considered, because the goal is to keep the tests affordable. Adding to the problem is the fact that originally there was national union resistance, so the program was more difficult to implement than anticipated. Currently, AGC has about forty test sites in twenty-five states.
The AGC system was designed to be economically affordable and legally defensible. All that is needed for a site to administer the exams is a test-form scanner and AGC's customized software. One of the main reasons AGC does not use more performance assessments is their high cost. In addition, case law suggests that test-takers may be tested on only those tasks required to perform on the job. Worried about costs and legal challenges, AGC has steered clear of developing or implementing performance assessments. Although there is only limited research to support them, the advisory committee and state staff believe that multiple-choice test scores are highly correlated with job performance and will stand up to potential legal challenges.
The AGC certification system is held to the same quality standards as the Oklahoma competency system. Content validity is high because tests are closely linked to the competencies developed by industry. Administrators would like to do additional research on concurrent validity by checking the correlation between scores and performance. They do use item analyses to review individual questions, handing off to a review committee any items that perform unusually. The committee may then decide to delete an item or items and adjust scores before they are reported to individuals.
Certificates can be used not only for hiring, but also to document advanced training for raises and promotions. AGC includes both unionized and nonunionized contractors. Union contractors utilize the apprenticeship system, whereas nonunion contractors operate an open shop, where employment is open to workers regardless of union membership. If a contractor hires only union members, certification can be used as an added qualification to help contractors decide which employees to hire; in some cases, employers are required to pay certified workers more. If it is an open shop, it is up to the contractor to decide how to use the certification. On average, 75 percent of test-takers' fees are paid by employers.
AGC provides those who successfully complete certification with an identifying hard-hat decal, wall certificate, and pocket card. AGC hopes that these items will bestow prestige on completers in the eyes of the workers and contractors, and thus will encourage more interest in initial and advanced training. AGC also hopes that the certification program will improve the performance of workers and build pride in skilled craftsmanship.
In 1996, the following two new certification initiatives were added.
The Occupational Licensing Project (OLP) is the second project to be established jointly with the Oklahoma Department of Health. The project is developing occupational duty/task lists and state licensing exams for the following areas: mechanical (sixteen licenses), electrical (four licenses), plumbing (four licenses), sanitarian (one license), fire alarm (two licenses), and burglar alarm (two licenses). The project is also developing a system that will allow the Oklahoma Department of Health to have on-site scanning and analysis capabilities. Until this segment of the project is complete, Oklahoma Vo-Tech is providing individual and group analysis as well as item analysis for these exams.
In July 1996, the testing division entered into a joint agreement with the Oklahoma Department of Education to oversee teacher certification for the following four vocational certification areas: agricultural education, family and consumer sciences, marketing education, and technology education. This project involves development of duty/task lists and certification exams for each area, as well as coordination of the administration and analysis of these certification exams.
Over a period of ten years, the Oklahoma competency-based assessments have developed into a system that has a good reputation and continues to branch into new areas--specifically, the statewide health licensure, AGC certification, occupational licensure, and teacher certification. The administrative and quality control procedures for multiple-choice examinations are already in place, and this system has conducted some experimentation with performance-based assessment. Although Oklahoma Vo-Tech believes that performance-based assessment is not a viable option statewide (for financial and logistical reasons), it is having some success with its statewide health licensing system, which includes performance tasks. However, reliability and consistency are still issues in all of its assessments.
All of the assessment systems are applicable to vocational education, although with slightly different objectives and incentives for participation. The Oklahoma competency-based assessment system started as a way to certify student skills and knowledge, as well as to move curriculum toward teaching skills and encourage instructional methods to focus more on student demonstrations of competency. However, many programs did not participate in the system. With the implementation of performance measures and standards, local sites are required to use the assessment system to report performance to the state and to make progress in program improvement. Many local instructors now see the system as "another state requirement" for accountability, rather than as a certification system for helping students.
Oklahoma education officials see a need not only to certify students' occupationally specific skills through their current system, but also to certify broader, more general skills through a "career passport," which students will complete and present to potential employers. State agency staff hope that once the passport system is implemented, educators will see passports and the vocational assessments as complementary parts of a certifying credential that students can use to gain employment. It is hoped that the passport system will be viewed by instructors as integral to their curriculum.
Of the assessment systems, the one for the six health certificates has had the most success in combining multiple-choice and performance-based assessments. It has also been the most successful in breaking even on administration costs and recruiting test-takers. The health certifications are required for licensure, which is necessary for employment, and demand in these fields has been steady, especially in home care occupations.
The AGC assessment system was developed to provide advanced certification (not entry-level employment) in three areas of the construction industry. At this point, the program is not recruiting enough test-takers to break even. The incentives for participation are based on demand and the strength of unions (and unions' preferred uses for the certification), which vary by locale. Therefore, participation rates differ considerably. The occupational areas being certified are not licensed professions, so participation is left to the discretion of employers, who decide based on market conditions and observed employee performance. Employers that hire both union and nonunion employees must value the certificate and incorporate it into hiring practices and salary scales if AGC is to increase participation.
One process for developing an assessment has been adapted for the other assessment systems. The five systems have slightly different objectives and participants, but all are a part of the vocational education enterprise. For the most part, what can be learned from these cases is how to administer, nationally or statewide, criterion-referenced, competency-based, multiple-choice tests, with more limited lessons on locally designed and administered performance-based assessment. In many ways, vocational education in Oklahoma is atypical. The state strongly supports vocational education with numerous state staff and substantial funding. It operates with an entrepreneurial spirit not common in state bureaucracies. Consistent leadership and staff, university support, and a tradition of strong vocational education programs statewide have worked in Oklahoma's favor. Other states or metropolitan areas that lack one or more of these elements may encounter problems in trying to adopt this system or develop a similar one.