These are some milestones of a CETE project to redesign the testing system for high school students in career-technical education (CTE). Beginning in 2008-09, CETE project teams worked with instructors across the state to develop about 35 pathway tests (more than 15,000 items) across the Ohio career fields (see the table below for each Ohio career field and the closest national Career Cluster). The sponsor is the Ohio Department of Education, Office of CTE (ODE-CTE), and a close collaborator is the Ohio Board of Regents (OBR). Pathway tests are part of an integrated career field initiative begun in 2005 with the first Career Field Technical Content Standards (CFTCS). A shift to course-based structures accompanied the latest revisions of the content standards during 2012-13.
| Ohio Career Field | OH Pathways | Closest National Career Cluster |
| --- | --- | --- |
| Agricultural & Environmental Systems | 7 | Agriculture, Food & Natural Resources |
| Arts & Communication | 3 | Arts, A/V Technology & Communications |
| Business & Administrative Services | 4 | Business Management & Administration |
| Construction Technologies | 3 (Revised 2013) | Architecture & Construction |
| Education & Training | 2 | Education and Training |
| Engineering | 2 (Revised 2013) | Science, Technology, Engineering & Math |
| Government & Public Service | 1 | Government & Public Administration |
| Health Sciences | 4 (Revised 2013) | Health Sciences |
| Hospitality & Tourism | 2 | Hospitality and Tourism |
| Human Services | 2 | Human Services |
| Information Technology | 4 (Revised 2013) | Information Technology |
| Law & Public Safety | 2 (Revised 2013) | Law, Public Safety, Corrections & Security |
| Manufacturing | 2 (Revised 2013) | Manufacturing |
| Transportation | 2 (Revised 2013) | Transportation, Distribution & Logistics |
This first section outlines the pathway testing project as part of the integrated career field initiative. Pathway assessments are content-valid, criterion-referenced tests (multiple-choice items) of student technical skill attainment used for 1) federal reporting, 2) program improvement, and eventually 3) student growth measurement. Among the innovations in this system are 1) a focus on the intermediate pathway level rather than the lowest-level specialization, 2) forms in which at least 30% of the items pose a higher level of cognitive challenge and 30% are based on authentic work scenarios, 3) end-of-course tests for pathways in Agricultural and Environmental Systems, Family and Consumer Sciences, and four newly revised career fields, 4) an advanced performance level determined by a higher cutoff score, and 5) articulation (where possible) with postsecondary (PS) learning outcomes for statewide or bilateral articulation. Two subsequent paragraphs summarize the work accomplished under design-development and delivery, along with issues that have emerged and the solutions that CETE staff members, working with ODE-CTE and OBR, have implemented.
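The two-cutoff design described above can be sketched as a simple classification rule. This is a minimal illustration, not the operational scoring logic; the default cutoffs below are the averages reported later in this section (advanced 81%, proficient 60%), and actual cutoffs are set per pathway from SME judgments.

```python
# Sketch of two-cutoff performance-level classification.
# Default cutoffs are the reported program-wide averages (illustrative only);
# each pathway test has its own SME-recommended cutoff scores.

def performance_level(percent_correct, proficient_cut=60, advanced_cut=81):
    """Map a percent-correct score to a performance level."""
    if percent_correct >= advanced_cut:
        return "Advanced"
    if percent_correct >= proficient_cut:
        return "Proficient"
    return "Below Proficient"

print(performance_level(85))  # Advanced
print(performance_level(72))  # Proficient
print(performance_level(50))  # Below Proficient
```

The advanced level is simply a second, higher threshold applied to the same score scale, which is why adding it required no change to item scoring.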
Design and development of tests in this project involves module layout, item bank creation, and item review. Creation is the process of writing, processing, and storing items in a database bank; review is the process used to collect evidence of content validation and recommendations for cutoff scores. In test development since 2009, CETE refined the test purpose to select eligible competencies, and ODE consultants then defined modules in terms of CFTCS units. Face-to-face meetings of 3-4 days were chosen for item writing because they preserve test security and yield better items through interaction among item writers (CETE is actively looking to add more distance methods). Staff conducted 3-day workshops (2009) at which high school (HS) and postsecondary (PS) instructors teamed with CETE facilitators to write items. Only HS instructors were used for the 2010-12 workshops, and PS item writers are rejoining for the 2012-13 wrap-up year. Items were written at two cognitive levels: C1 (the lowest two levels of the revised Bloom's taxonomy, 2001) or C2 (the four higher levels). Another addition, derived from common best practices in testing, was to write items around short workplace scenarios (scenario items may be C1 or C2). Details of the item banks for the pathways appear in the table below.
| CTE Pathway Title | Yr | C1 | C2 | C2% | Scen. | Scen. % | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Auto. Tech & Med./Heavy Trans. Equip. Tech | 09 | 175 | 194 | 53 | 90 | 24 | 369 |
| Collision Repair Technician | 09 | 207 | 137 | 40 | 47 | 14 | 344 |
| Power Equipment Technology | 09 | 277 | 239 | 46 | 75 | 15 | 516 |
| Information Support and Services | 09 | 347 | 143 | 29 | 144 | 29 | 490 |
| Programming and Software Development | 09 | 182 | 116 | 39 | 85 | 29 | 298 |
| IT Core (Units 1-2 of ITWorks.Ohio) | 09 | 56 | 17 | 23 | 21 | 29 | 73 |
| Visual Design and Imaging | 10 | 212 | 119 | 36 | 80 | 24 | 331 |
| Agricultural Industrial Power Technology | 10 | 155 | 135 | 47 | 64 | 22 | 290 |
| Animal Science and Management | 10 | 226 | 146 | 39 | 104 | 28 | 372 |
| Integrated Marketing Communications | 10 | 358 | 186 | 34 | 137 | 25 | 544 |
| Ag First Course and Electives | 10 | 230 | 126 | 35 | 118 | 33 | 356 |
| Culinary and Foodservice Operations* | 11 | 278 | 222 | 44 | 168 | 34 | 500 |
| Natural Resource Management | 11 | 255 | 186 | 42 | 150 | 34 | 441 |
| Administrative and Professional Support§ | 11 | 239 | 155 | 39 | 137 | 35 | 394 |
| Lodging and Travel Services | 11 | 277 | 149 | 35 | 137 | 32 | 426 |
| Meat Science & Technology | 11 | 36 | 19 | 34 | 19 | 35 | 55 |
| Early Childhood Education | 12 | 184 | 139 | 43 | 109 | 34 | 323 |
| Medical Office Management & Support | 12 | 209 | 167 | 44 | 124 | 33 | 376 |
| Legal Office Management & Support | 12 | 235 | 171 | 42 | 144 | 36 | 406 |
| Supply Chain Management | 12 | 214 | 164 | 43 | 127 | 34 | 370 |
| Fire (1 Course) | 13 | 57 | 44 | 44 | 45 | 45 | 101 |
| Criminal Justice (6 Courses) | 13 | 343 | 192 | 36 | 158 | 30 | 535 |
| Manufacturing-Engineering Design (7) | 13 | 375 | 224 | 37 | 198 | 33 | 599 |
| Manufacturing-Engineering Operations (8)** | 13 | 426 | 267 | 38 | 237 | 35 | 693 |
| Structural Construction (5)** | 13 | 266 | 167 | 39 | 137 | 32 | 433 |
| Allied Health and Nursing (13)* | 13 | | | | | | |
| Exercise Science / Therapeutic (7)* | 13 | | | | | | |
| Construction Design (10) | 13 | 546 | 312 | 36 | 310 | 36 | 858 |
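The derived columns in the item-bank table follow directly from the counts: Total = C1 + C2, C2% = C2 / Total, and Scen. % = scenario items / Total. A minimal sketch of that arithmetic, using the Power Equipment Technology row as sample input:

```python
# Sketch: deriving the percentage columns of the item-bank table.
# Total is the sum of C1 and C2 items; percentages are rounded to
# whole numbers as in the table. Sample counts are from the Power
# Equipment Technology row (C1=277, C2=239, scenario=75).

def bank_summary(c1, c2, scenario):
    """Compute Total, C2%, and Scen.% for one pathway's item bank."""
    total = c1 + c2
    return {
        "total": total,
        "c2_pct": round(100 * c2 / total),
        "scen_pct": round(100 * scenario / total),
    }

print(bank_summary(277, 239, 75))
# → {'total': 516, 'c2_pct': 46, 'scen_pct': 15}
```

Note that scenario items are counted within the C1/C2 totals rather than added to them, which is why Scen. % is also taken over the same Total.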
Items for live test forms are selected using a CETE database application called VIPER (Virtual Item PickER), which displays the information needed for item selection. In addition to the item and any associated graphics or scenarios, psychometric staff see a) the content linkage to the CFTCS (and to PS learning outcomes, if available from OBR), b) item analysis data by year, and c) item review and cutoff score judgments. Summary statistics for forms are displayed at the pathway and module levels. Finally, two displays indicate item coverage, one for the units of the CFTCS and the other for aligned PS learning outcomes (if available). The table below provides average values for 19 pathway tests across the item bank (bank size and live items), cutoff scores (advanced and proficient), item ratings (essentiality and quality), item analysis (difficulty and discrimination), and item types (C2 and scenario). On average, nearly 500 items were considered in order to select just over 225 for use; the advanced and proficient cutoff scores averaged 81% and 60%; SME judgments of item essentiality and quality averaged 3.6 and 3.5 (on 4-point scales); item difficulty averaged .59 and item discrimination .32; and the proportions of C2 higher-level items and scenario items were .41 and .32.
| Bank # | Live # | Adv | Prof | Essentiality | Quality | Diff | Discrim | C2 | Scenario |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ~500 | ~225 | 81% | 60% | 3.6 | 3.5 | .59 | .32 | .41 | .32 |
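The item analysis figures above are classical statistics: difficulty is the p-value (proportion of examinees answering the item correctly) and discrimination is the item-total point-biserial correlation. A minimal sketch of both computations follows; the six examinee records are invented purely for illustration, and the operational analyses are run on full administration data.

```python
# Sketch of classical item statistics as reported in VIPER-style displays.
# Difficulty = p-value (proportion correct on a 0/1-scored item);
# discrimination = point-biserial correlation between the item and
# examinees' total scores. The sample data are hypothetical.
from statistics import mean, pstdev

def p_value(item_scores):
    """Proportion correct for one 0/1-scored item."""
    return mean(item_scores)

def point_biserial(item_scores, total_scores):
    """Correlation between one 0/1 item and examinees' total scores."""
    mx, my = mean(item_scores), mean(total_scores)
    sx, sy = pstdev(item_scores), pstdev(total_scores)
    cov = mean((x - mx) * (y - my)
               for x, y in zip(item_scores, total_scores))
    return cov / (sx * sy)

item = [1, 0, 1, 1, 0, 1]          # six examinees' responses to one item
totals = [38, 20, 35, 30, 22, 40]  # the same examinees' total scores

print(round(p_value(item), 2))
print(round(point_biserial(item, totals), 2))
```

An average difficulty near .59 with discrimination near .32, as reported above, indicates items that most students answer correctly while still separating stronger from weaker examinees.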
Delivery of test forms, following item selection, is accomplished through the WebXam portal (www.webxam.org). The process begins with QA checks before test forms are posted to the website for use. The portal's main functions are presenting and scoring student tests, storing data at the class, district, and state levels, and reporting results at individual and aggregate levels.
Best Practices in Large-Scale, Educational Statewide Testing: References
- MacQuarrie, D., Applegate, B., & Lacefield, W. (2008a). Criterion referenced assessment: Delineating curricular related performance skills necessary for the development of a table of test specifications. Journal of Career and Technical Education, 24, 69-89.
- MacQuarrie, D., Applegate, B., & Lacefield, W. (2008b). Criterion referenced assessment: Establishing content validity of complex skills related to specific tasks. Journal of Career and Technical Education, 24, 6-29.
- Council of Chief State School Officers & Association of Test Publishers. (2010). Operational best practices for statewide large-scale assessment programs. Washington, DC: Authors.
- Downing, S. M., & Haladyna, T. M. (Eds.). (2006). Handbook of test development. Mahwah, NJ: Erlbaum.
- Martineau, J., Paek, P., Keene, J., & Hirsch, T. (2007). Integrated, comprehensive alignment as a foundation for measuring student progress. Educational Measurement: Issues and Practice, 26(1), 28-35.
- Martone, A., & Sireci, S. G. (2009). Evaluating alignment between curriculum, assessment, and instruction. Review of Educational Research, 79, 1332-1361.
- Tindal, G., & Haladyna, T. M. (Eds.). (2002). Large-scale assessment programs for all students: Validity, technical adequacy, and implementation. Mahwah, NJ: Erlbaum.
Contributor: James T. Austin