Does the use of summative peer assessment in collaborative group work inhibit good judgement?
- Author
- Joanna Tai, Bhavani Sridharan, and David Boud
- Subjects
Cooperative learning, Medical education, Teamwork, Peer assessment, Formative assessment, Summative assessment, Assessment bias, Consistency and accuracy, Judgement, Group assessment, Group work, Grading (education), Education, Psychology
- Abstract
The accuracy and consistency of peer marking, particularly when students have the power to reward (or penalise) peers under formative and summative assessment regimes, is largely unknown. The objective of this study is to evaluate students' ability and behaviour in marking their peers' teamwork performance in a collaborative group assessment context, both when the mark counts towards their final grade and when it does not. Formative and summative assessment data were obtained from 98 participants through anonymous self and peer assessment of team members' contributions to a group assessment in business courses. The findings indicate that students are largely capable of judging their peers' performance accurately and consistently, especially in the formative evaluation of the process component of group work. However, the findings also reveal significant peer grading bias when peer marks contribute to final grades. Overall, the findings suggest that students are reluctant to assess their peers honestly when they realise that their actions can penalise non-contributing students. This raises questions about the appropriateness of using peer marks for summative assessment purposes. To overcome the problems identified, this paper proposes a number of measures to guide educators in embedding summative peer assessment effectively in a group assessment context.
- Published
- 2019