Variability in students' evaluating processes in peer assessment with calibrated peer review

Journal of Computer Assisted Learning

Abstract

This study investigated students' evaluating processes and their perceptions of peer assessment when they engaged in peer assessment using Calibrated Peer Review, a web-based application that facilitates peer assessment of writing. One hundred and thirty-two students in an introductory environmental science course participated in the study. Two self-reported surveys and a focus group interview were administered during the semester, and peer assessment data and demographic information were collected at the end of the semester. Although the results support agreement between peers and an expert, variation was found at both the group and individual levels, particularly when students evaluated mid-quality or low-quality writing, regardless of their reviewing ability. Students tended to perceive the process of evaluating their peers' writing and their own as helpful to their learning. Further, students' positive perceptions of peer assessment were associated with their understanding of the value of the peer assessment tasks and with their perceptions of achieving the course goals. We concluded that, to reduce variation in students' ratings and promote learning, instructors should provide specific guidelines for deciding on a rating, use actual student essays rather than instructor-developed samples to train students, and require written explanations for rubric questions.