A Robust and Generalisable Rubric Design Framework for Critical Thinking Assessment
Thesis posted on 07.06.2021, 15:15 by Harry A. Layman
Abstract
This research has two primary goals. The first is to develop a useful framework for designing rubrics that improves the utility of feedback and the reliability of scoring for critical thinking assessments that use constructed response items. The second is to demonstrate and explore the practicality, effectiveness, strengths, and weaknesses of this approach as applied to specific data sets. The use of constructed response (CR) items for educational assessment has been advocated for decades (Gulikers, Bastiaens and Kirschner, 2004; Palm, 2008; Wiggins, 1990). The primary claimed benefits are more authentic measurement and better feedback to students and teachers. More authentic measurement includes the notion that constructing a response is more cognitively challenging, and provides more direct evidence, than selecting among predefined choices. In practice, however, better feedback is generally limited when the use of CR items relies on holistic scoring, generic rubrics, and a regimen of scorer training and calibration to attain consistent and generally valid measurement: the resulting broad, multifactor classification levels cannot convey response-specific feedback (Bejar, 2017).
This research postulates a rubric design framework for creating item-specific, content-centric rubrics for assessment items that have right and wrong, better and worse possible responses. The framework establishes a uniform mechanism for identifying the essential elements of item responses, with explicit weights for varying degrees of correctness and completeness, and standardized approaches to calculating overall scores, sub-scores, and scaled scores. The resulting score reports can provide explicit feedback on the response elements that are present, absent, or incomplete, which justifies and explains score differences between responses. Successful use of this rubric design framework promises CR assessment of critical thinking and argumentative writing items whose score reports give students detailed feedback and yield defensible scoring outcomes, with the potential for improved interrater reliability.
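
The abstract does not spell out the scoring arithmetic, but as a rough illustration of element-based weighted scoring of the kind described, the sketch below shows one way explicit element weights and partial credit could combine into a scaled score and response-specific feedback. All element names, weights, credit values, and the 0-100 reporting scale are hypothetical assumptions, not taken from the thesis.

```python
from dataclasses import dataclass

# Hypothetical sketch of element-based weighted rubric scoring.
# Element names, weights, and the scaling choice are illustrative only.

@dataclass
class RubricElement:
    name: str    # essential element of the item response
    weight: float  # relative importance of this element
    credit: float  # rater-assigned degree of correctness/completeness, 0.0-1.0

def raw_score(elements):
    """Weighted sum of partial credit over all rubric elements."""
    return sum(e.weight * e.credit for e in elements)

def scaled_score(elements, scale_max=100.0):
    """Map the raw score onto a fixed reporting scale."""
    total_weight = sum(e.weight for e in elements)
    return scale_max * raw_score(elements) / total_weight

def feedback(elements):
    """Response-specific feedback: which elements are present, partial, or absent."""
    for e in elements:
        status = ("present" if e.credit == 1.0
                  else "absent" if e.credit == 0.0
                  else "partial")
        print(f"{e.name}: {status} (credit {e.credit:.1f}, weight {e.weight})")

# Example scoring of one response to an argumentative writing item
response = [
    RubricElement("states a clear claim", 2.0, 1.0),
    RubricElement("cites supporting evidence", 3.0, 0.5),
    RubricElement("addresses a counterargument", 2.0, 0.0),
]

feedback(response)
print(f"Scaled score: {scaled_score(response):.1f} / 100")
```

Because each score is the sum of labeled element-level credits, a report built this way can justify any score difference between two responses by pointing to the specific elements that were present, partial, or absent in each.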
https://cogwrite-sgrel-charm-graphics-001.s3-us-west-2.amazonaws.com/2021LAYMANHAPhD.pdf