Paradise Valley Community College; 18401 North 32nd Street; Phoenix, Arizona 85032

PVCC Assessment Initiative
 
Learn - Measure - Plan


LOCALLY DEVELOPED ONE-SHOT ITEMS

Definition:

A locally developed one-shot item is an assessment instrument designed and administered by the faculty or the assessing institution at a single point in time, or as a single component of a specific course.

Examples:

Locally developed one-shot items include exams, simulations, performance appraisals, oral exams, papers, and projects.

Costs:
  • Significant time investment for development, leadership, and coordination
  • Additional time for scoring and grading
  • Clerical support
  • Storage
  • Time to review results and make improvement decisions
  • Training
Advantages:
  • Content and style can be customized to fit program goals and outcomes
  • Students are familiar with these types of assessments
  • Student performance is assessed in a uniform environment
  • Because the assignments/performances/tests are identical, student results can be compared easily
  • Provides longitudinal data for the institution; performance can be compared from one semester to another (see the sketch after this list)
  • Performance criteria can be established relative to the curriculum
  • The development process can clarify outcomes as well as the process and content of student learning
  • Relatively rapid feedback
  • Faculty control over interpretation and use of results
  • Results should suggest program improvements
  • Depending on the choice of instrument, may capture both the depth and breadth of student development
  • Flexible; supports multiple measures
  • Results can be meaningful on many levels
  • Performances, simulations, and similar tasks can measure application, generalization, and higher-order thinking skills
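Because the same instrument is re-administered each term, the longitudinal comparison noted above can be as simple as tracking the mean rubric score per semester. The Python sketch below is illustrative only; the semester labels and scores are invented, not drawn from any PVCC data:

    from statistics import mean

    # Semester -> rubric scores on the same embedded item (invented example data)
    results = {
        "Fall 2001":   [2, 3, 3, 4, 2, 3],
        "Spring 2002": [3, 3, 4, 4, 3, 3],
    }

    # Report the mean score and sample size for each administration
    for semester, scores in results.items():
        print(f"{semester}: mean = {mean(scores):.2f}, n = {len(scores)}")
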
Disadvantages:
  • Costly development and maintenance (time and effort)
  • Provides only a snapshot; cannot be used for benchmarking or for longitudinal tracking of individual students
  • Demands measurement expertise to ensure reliability and validity
  • May not provide external validity
  • Data security and privacy obligations (FERPA)
  • The sampled behavior or performance may not be typical
Implementation Suggestions:
  • Work with other departments/programs/institutions to reduce (share) costs and provide an element of externality.
  • Use on-campus measurement experts during development for item validation.
  • Contract faculty "consultants" for development and grading.
  • Incorporate outside experts, community leaders, etc., into development and grading.
  • Embed items into course requirements to maximize relevance and minimize disruption; this also promotes student involvement and interest.
  • Use triangulation (a multi-method approach) to validate results.
  • Develop specific, measurable criteria, especially for performances.
  • Pilot test the instrument for training and inter-rater reliability (see the sketch after this list).
  • Use multiple measures to cross-validate.
  • Establish an open, non-threatening evaluation atmosphere.
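As a concrete example of the pilot-testing suggestion above, agreement between two raters can be estimated with Cohen's kappa. The Python sketch below is a minimal illustration; the function and the rubric scores are invented for this example and are not part of any existing PVCC tool:

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa: two-rater agreement, corrected for chance."""
        n = len(rater_a)
        # Proportion of artifacts on which the raters gave the same score
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Agreement expected by chance, from each rater's score frequencies
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
        return (observed - expected) / (1 - expected)

    # Two raters scoring the same eight papers on a 1-4 rubric (invented data)
    rater_1 = [3, 4, 2, 4, 3, 1, 2, 4]
    rater_2 = [3, 4, 2, 3, 3, 1, 2, 4]
    print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")  # ~0.83

Values near 1.0 indicate strong agreement; low values suggest the rubric criteria need sharpening, or that raters need more norming before the full administration.
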
Recommendation:

Locally developed instruments seem to fit well with our chosen learning outcomes, especially given the course mapping we are completing. By identifying key courses, we should be able to embed one-shot items into the curriculum with little intrusion on the classroom. Communication outcomes could be measured via speeches, papers, and performances. Information Literacy seems well suited to simulations. Problem Solving can be measured via simulations or exams. Technology fits well with projects and simulations.

These measures (along with Classroom-Based Assessment instruments) yield the most relevant information and can be implemented with minimal disruption to the classroom. Thus, they should be strongly considered as a piece of our initial assessment plan.

Bibliography/Resources:

Banta, T.W. "Assessment 101." Notes from a presentation at the annual meeting of the Higher Learning Commission, March 2001.

Chandler-Gilbert Community College. Program- and classroom-level rubrics and tools.

Maki, Peggy. Using Multiple Assessment Methods to Explore Student Learning and Development Inside and Outside of the Classroom.

Mesa Community College. The Mesa Community College Program to Assess Student Learning.

Nichols, James O. Assessment Case Studies: Common Issues in Implementation with Various Campus Approaches to Resolution. Agathon Press, 1995.

Palomba, Catherine A., and Trudy W. Banta. Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education. Jossey-Bass, 1999.

South Mountain Community College. Critical Thinking Assessment, Spring 2001.

Van Kollenburg, Susan E., ed. A Collection of Papers on Self-Study and Institutional Improvement: Proceedings of the 106th Annual Meeting of the North Central Association: Serving the Common Good: New Dimensions in Higher Education. Chicago: The Higher Learning Commission, 2001.

Wiggins, Grant. "The Case for Authentic Assessment." ERIC Digest, December 1990.