
Ohio Pathway Testing Project

This page summarizes milestones of a CETE project to redesign the testing system for high school students in career-technical education (CTE). Beginning in 2008-09, CETE project teams worked with instructors across the state to develop about 35 pathway tests (> 15,000 items) spanning the Ohio career fields (see the table below for each Ohio Career Field and its closest national Career Cluster). The sponsor is the Ohio Department of Education, Office of CTE (ODE-CTE), and a close collaborator is the Ohio Board of Regents (OBR). Pathway tests are part of an integrated career field initiative begun in 2005 with the first Career Field Technical Content Standards (CFTCS). A shift to course-based structures accompanied the latest revisions of the content standards during 2012-13.

Ohio Career Field                    | OH Pathways      | Closest National Career Cluster
Agricultural & Environmental Systems | 7                | Agriculture, Food & Natural Resources
Arts & Communication                 | 3                | Arts, A/V Technology & Communications
Business & Administrative Services   | 4                | Business Management & Administration
Construction Technologies            | 3 (Revised 2013) | Architecture & Construction
Education & Training                 | 2                | Education and Training
Engineering                          | 2 (Revised 2013) | Science, Technology, Engineering & Math
Finance                              | 2                | Finance
Government & Public Service          | 1                | Government & Public Administration
Health Sciences                      | 4 (Revised 2013) | Health Sciences
Hospitality & Tourism                | 2                | Hospitality and Tourism
Human Services                       | 2                | Human Services
Information Technology               | 4 (Revised 2013) | Information Technology
Law & Public Safety                  | 2 (Revised 2013) | Law, Public Safety, Corrections & Security
Manufacturing                        | 2 (Revised 2013) | Manufacturing
Marketing                            | 3                | Marketing
Transportation                       | 2 (Revised 2013) | Transportation, Distribution & Logistics

This first section outlines the pathway testing project as part of the integrated career field initiative. Pathway assessments are content-valid, criterion-referenced tests (multiple-choice items) of student technical skill attainment for 1) federal reporting, 2) program improvement, and eventually 3) student growth measurement. Among the innovations represented in this system are 1) a focus on the intermediate pathway level rather than the lowest level (specialization), 2) forms in which at least 30% of the items pose higher levels of cognitive challenge and at least 30% are based on authentic work scenarios, 3) end-of-course tests for pathways in Agricultural and Environmental Systems, Family and Consumer Sciences, and four newly revised career fields, 4) an advanced performance level determined by a higher cutoff score, and 5) alignment (where possible) with postsecondary (PS) learning outcomes to support statewide or bilateral articulation. Two subsequent paragraphs summarize the work accomplished under design-development and delivery, along with issues that have emerged and the solutions that CETE staff members, working with ODE-CTE and OBR, have implemented.
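
The form-composition targets in innovation 2 can be checked mechanically during form assembly. The following is a minimal sketch, not CETE's implementation; the field names (cognitive_level, is_scenario) are illustrative assumptions, and only the 30% thresholds come from the text above.

```python
def check_form_blueprint(items, min_c2=0.30, min_scenario=0.30):
    # items: list of dicts with hypothetical keys cognitive_level ("C1"/"C2")
    # and is_scenario (bool); thresholds mirror the 30% targets above.
    total = len(items)
    c2_share = sum(it["cognitive_level"] == "C2" for it in items) / total
    scen_share = sum(it["is_scenario"] for it in items) / total
    return {
        "c2_share": round(c2_share, 2),
        "scenario_share": round(scen_share, 2),
        "meets_targets": c2_share >= min_c2 and scen_share >= min_scenario,
    }

# Toy 4-item form; operational forms draw on banks of hundreds of items.
form = [
    {"cognitive_level": "C1", "is_scenario": False},
    {"cognitive_level": "C2", "is_scenario": True},
    {"cognitive_level": "C2", "is_scenario": False},
    {"cognitive_level": "C1", "is_scenario": True},
]
print(check_form_blueprint(form))
# {'c2_share': 0.5, 'scenario_share': 0.5, 'meets_targets': True}
```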

Design and development of tests in this project involves module layout, item bank creation, and item review. Creation is the process of writing, processing, and storing items in a database bank; review is the process used to collect evidence of content validation and recommendations for cutoff scores. In test development since 2009, CETE refined the test purpose to guide selection of eligible competencies, and ODE consultants then defined modules in terms of CFTCS units. Face-to-face meetings of 3-4 days were chosen for item writing because they preserve test security and yield better items through interaction among item writers (CETE is actively looking to add more distance methods). Staff conducted 3-day workshops (2009) at which high school (HS) and postsecondary (PS) instructors teamed with CETE facilitators to write items. Only HS instructors participated in the 2010-12 workshops, and PS item writers are rejoining for the 2012-13 wrap-up year. Items were written at two cognitive levels: C1 (the two lowest levels of the revised Bloom's taxonomy, 2001) or C2 (the four higher levels). Another addition, derived from common best practices in testing, was to write items around short workplace scenarios (scenario items may be C1 or C2). Details of the item banks for the pathways appear in the table below.

CTE Pathway Title                          | Yr | C1   | C2   | C2 %  | Scen. | Scen. % | Total
Auto. Tech & Med./Heavy Trans. Equip. Tech | 09 | 175  | 194  | 53    | 90    | 24      | 369
Collision Repair Technician                | 09 | 207  | 137  | 40    | 47    | 14      | 344
Power Equipment Technology                 | 09 | 277  | 239  | 46    | 75    | 15      | 516
Business Management                        | 09 | 328  | 123  | 27    | 96    | 21      | 451
Marketing Management                       | 09 | 346  | 196  | 36    | 159   | 29      | 542
Information Support and Services           | 09 | 347  | 143  | 29    | 144   | 29      | 490
Interactive Media                          | 09 | 197  | 92   | 32    | 104   | 36      | 289
Network Systems                            | 09 | 358  | 157  | 30    | 146   | 28      | 515
Programming and Software Development       | 09 | 182  | 116  | 39    | 85    | 29      | 298
IT Core (Units 1-2 of ITWorks.Ohio)        | 09 | 56   | 17   | 23    | 21    | 29      | 73
Visual Design and Imaging                  | 10 | 212  | 119  | 36    | 80    | 24      | 331
Agricultural Industrial Power Technology   | 10 | 155  | 135  | 47    | 64    | 22      | 290
Animal Science and Management              | 10 | 226  | 146  | 39    | 104   | 28      | 372
Integrated Marketing Communications        | 10 | 358  | 186  | 34    | 137   | 25      | 544
Horticulture                               | 10 | 252  | 162  | 39    | 113   | 27      | 414
Media Arts                                 | 10 | 221  | 157  | 42    | 83    | 22      | 378
Ag First Course and Electives              | 10 | 230  | 126  | 35    | 118   | 33      | 356
Culinary and Foodservice Operations*       | 11 | 278  | 222  | 44    | 168   | 34      | 500
Financial Services*                        | 11 | 229  | 129  | 36    | 117   | 32      | 358
Performing Arts                            | 11 | 284  | 139  | 33    | 142   | 34      | 423
Natural Resource Management                | 11 | 255  | 186  | 42    | 150   | 34      | 441
Accounting†                                | 11 | 203  | 159  | 44    | 120   | 33      | 362
Administrative and Professional Support§   | 11 | 239  | 155  | 39    | 137   | 35      | 394
Lodging and Travel Services                | 11 | 277  | 149  | 35    | 137   | 32      | 426
Meat Science & Technology                  | 11 | 36   | 19   | 34    | 19    | 35      | 55
Early Childhood Education                  | 12 | 184  | 139  | 43    | 109   | 34      | 323
Medical Office Management & Support        | 12 | 209  | 167  | 44    | 124   | 33      | 376
Legal Office Management & Support          | 12 | 235  | 171  | 42    | 144   | 36      | 406
Supply Chain Management                    | 12 | 214  | 164  | 43    | 127   | 34      | 370
Fire (1 Course)                            | 13 | 57   | 44   | 44    | 45    | 45      | 101
Criminal Justice (6 Courses)               | 13 | 343  | 192  | 36    | 158   | 30      | 535
Manufacturing-Engineering Design (7)       | 13 | 375  | 224  | 37    | 198   | 33      | 599
Mechanical-Electrical-Plumbing (12)**      | 13 | 533  | 287  | 35    | 231   | 28      | 820
Manufacturing-Engineering Operations (8)** | 13 | 426  | 267  | 38    | 237   | 35      | 693
Structural Construction (5)**              | 13 | 266  | 167  | 39    | 137   | 32      | 433
Biomedical-Laboratory (5)**                | 13 | 142  | 114  | 45    | 95    | 37      | 256
Allied Health and Nursing (13)*            | 13 |      |      |       |       |         |
Exercise Science / Therapeutic (7)*        | 13 |      |      |       |       |         |
Construction Design (10)                   | 13 | 546  | 312  | 36    | 310   | 36      | 858
TOTALS                                     |    | 9458 | 5851 | 38.27 | 4571  | 30.19   | 15301
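
The C2 % and Scen. % columns are simple proportions of each pathway's total item count (scenario items overlap with the C1 and C2 counts, so they are not additive with those columns). A minimal sketch of that arithmetic, using the Animal Science and Management row as a worked example:

```python
# Counts from the Animal Science and Management row (2010) of the table above.
c1, c2, scenario = 226, 146, 104              # C1 items, C2 items, scenario items
total = c1 + c2                               # scenario items are a subset, not a third category
c2_pct = round(100 * c2 / total)              # 146/372 -> 39
scenario_pct = round(100 * scenario / total)  # 104/372 -> 28
print(total, c2_pct, scenario_pct)            # 372 39 28
```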

Items for live test forms are selected using a CETE database application called VIPER (Virtual Item PickER), which displays the information used in item selection. Information provided to psychometric staff, in addition to the item and any associated graphics or scenarios, includes a) content linkage to the CFTCS (and to PS learning outcomes if available from OBR), b) item analysis data by year, and c) item review and cutoff score judgments. Summary statistics for forms are displayed at the pathway and the module level. Finally, two displays indicate item coverage, one for the units of the CFTCS and the other for aligned PS learning outcomes (if available). The summary table below provides average values across 19 pathway tests for the item bank (bank size and live items), cutoff scores (advanced and proficient), item ratings (essentiality and quality), item analysis (difficulty and discrimination), and item types (C2 and scenario). The table can be interpreted as follows: on average, nearly 500 items were considered in order to select just over 225 for use; the advanced and proficient cutoff scores averaged 81% and 59%; subject matter expert (SME) judgments of item essentiality and quality averaged 3.6 and 3.5 (on 4-point scales); item difficulty averaged .59 and item discrimination .32; and the proportions of C2 higher-level items and scenario items were .41 and .32.

Bank # | Live # | Adv Cutoff | Prof Cutoff | Essentiality | Quality | Difficulty | Discrimination | C2  | Scenario
495.75 | 227.25 | 81.39      | 59.27       | 3.63         | 3.52    | .59        | .32            | .41 | .32
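
The difficulty and discrimination averages above come from classical item analysis. Here is a minimal sketch of those two statistics on a 0/1 response matrix; the data layout and function name are illustrative assumptions, not VIPER's internals (requires Python 3.10+ for statistics.correlation).

```python
import statistics

def item_stats(responses):
    # responses: one list of 0/1 item scores per student.
    n_items = len(responses[0])
    totals = [sum(r) for r in responses]
    out = []
    for i in range(n_items):
        scores = [r[i] for r in responses]
        difficulty = statistics.mean(scores)            # proportion correct (p-value)
        rest = [t - s for t, s in zip(totals, scores)]  # total score minus this item
        discrimination = statistics.correlation(scores, rest)  # corrected point-biserial
        out.append((round(difficulty, 2), round(discrimination, 2)))
    return out

# Toy data: 4 students x 3 items.
resp = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(item_stats(resp))  # [(0.75, 0.52), (0.5, 0.71), (0.25, 0.52)]
```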

Delivery of test forms, following item selection, is accomplished through the WebXam portal (www.webxam.org). The process begins with quality assurance checks before test forms are posted to the website for use. The main functions are presenting and scoring student tests, storing data at the class, district, and state levels, and reporting results at individual and aggregate levels.
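
As an illustration of the scoring-and-reporting step (a sketch under assumed names, not WebXam code), each student's percent-correct score can be classified against the pathway's proficient and advanced cutoffs and then aggregated for district-level reports:

```python
from collections import defaultdict

def performance_level(pct, proficient=59.27, advanced=81.39):
    # Default cutoffs are the 19-pathway averages from the summary table above;
    # operational tests use pathway-specific cutoffs.
    if pct >= advanced:
        return "Advanced"
    if pct >= proficient:
        return "Proficient"
    return "Below Proficient"

def district_report(results):
    # results: iterable of (district, percent_correct) pairs.
    counts = defaultdict(lambda: defaultdict(int))
    for district, pct in results:
        counts[district][performance_level(pct)] += 1
    return {d: dict(levels) for d, levels in counts.items()}

print(district_report([("Franklin", 85.0), ("Franklin", 62.5), ("Cuyahoga", 40.0)]))
# {'Franklin': {'Advanced': 1, 'Proficient': 1}, 'Cuyahoga': {'Below Proficient': 1}}
```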

Best Practices in Large-Scale Statewide Educational Testing: References

  • MacQuarrie, D., Applegate, B., & Lacefield, W. (2008a). Criterion referenced assessment: Delineating curricular related performance skills necessary for the development of a table of test specifications. Journal of Career and Technical Education, 24, 69-89.
  • MacQuarrie, D., Applegate, B., & Lacefield, W. (2008b). Criterion referenced assessment: Establishing content validity of complex skills related to specific tasks. Journal of Career and Technical Education, 24, 6-29.
  • Council of Chief State School Officers & Association of Test Publishers. (2010). Operational best practices for statewide large-scale assessment programs. Washington, DC: Authors.
  • Downing, S. M., & Haladyna, T. M. (Eds.). (2006). Handbook of test development. Mahwah, NJ: Erlbaum.
  • Martineau, J., Paek, P., Keene, J., & Hirsch, T. (2007). Integrated, comprehensive alignment as a foundation for measuring student progress. Educational Measurement: Issues and Practice, 26(1), 28-35.
  • Martone, A., & Sireci, S. G. (2009). Evaluating alignment between curriculum, assessment, and instruction. Review of Educational Research, 79, 1332-1361.
  • Tindal, G., & Haladyna, T. M. (Eds.). (2002). Large-scale assessment programs for all students: Validity, technical adequacy, and implementation. Mahwah, NJ: Erlbaum.

Contributor: James T. Austin