
Enabling open‐ended questions in team‐based learning using automated marking: Impact on student achievement, learning and engagement.

Authors :
Tan, Sophia Huey Shan
Thibault, Guillaume
Chew, Anna Chia Yin
Rajalingam, Preman
Source :
Journal of Computer Assisted Learning; Oct 2022, Vol. 38, Issue 5, p1347-1359, 13p
Publication Year :
2022

Abstract

Background: Different types of assessments influence learning and learning behaviour. Multiple‐choice questions (MCQs) reward partial knowledge and encourage surface learning, while open‐ended questions (OEQs) promote deeper learning. Currently, MCQs are part of the team‐based learning (TBL) curriculum, and it is challenging to implement OEQs because immediate feedback is necessary.

Objectives: We asked whether MCQs and OEQs affect student achievement, learning and engagement differently in a TBL classroom.

Methods: MCQ and OEQ test scores of N = 66 students were automatically captured in the Learning Activity Management System (LAMS) and compared using a switching replications quasi‐experimental design with pre‐ and post‐tests to answer the research questions. Student learning approaches and engagement in the team activities were assessed using the study process questionnaire and the structure of observed learning outcomes taxonomy, respectively.

Results and Conclusions: Students scored significantly higher on MCQs than on OEQs for the same set of questions, but the reverse was true for application exercises (AEs), which focus on higher‐level application. Most students significantly deepened their learning approaches before OEQs, while poorly prepared students were less engaged during OEQ discussions. Interestingly, students subjected to OEQs took less time and scored higher in AE discussions, suggesting a better focus on higher‐level thinking.

Implications: This project is significant because it bridges our understanding of the value of OEQs and TBL. Our approach is transferable to other courses and can therefore improve the quality of teaching and learning in tertiary education.

Lay Description:

What is already known about this topic?
- Multiple‐choice questions (MCQs) do not promote deep learning as much as open‐ended questions (OEQs) do.
- Team‐based learning (TBL) requires MCQs because of the need for immediate feedback.
- Automated marking can make it more efficient to provide immediate feedback on OEQs, as required by the TBL approach.

What this paper adds?
- OEQs changed student learning approaches in the TBL context.
- OEQs changed the quality of team discussions during the team‐readiness assessment test (tRAT) and application exercise phases.
- Students were concerned about the validity of automated grading.

Implications for practitioners:
- Consider using OEQs in the TBL context to deepen student learning.
- Consider how to defuse students' scepticism of new technology.
- Consider forming teams with a mix of deep and surface learners.

[ABSTRACT FROM AUTHOR]
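The Methods compare per‐student MCQ and OEQ scores on the same question set for N = 66 students. As a rough, hypothetical illustration of such a paired score comparison (not the authors' actual analysis; the simulated data, effect sizes and choice of tests below are assumptions), a minimal sketch in Python might look like this:

```python
# Hypothetical sketch of a paired comparison of MCQ vs OEQ scores.
# The data are simulated for illustration only; the study exported real
# scores from LAMS for N = 66 students.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_students = 66

# Simulated scores on a 0-100 scale; MCQ scores assumed slightly higher,
# consistent with the direction reported in the Results.
mcq_scores = np.clip(rng.normal(72, 10, n_students), 0, 100)
oeq_scores = np.clip(mcq_scores - rng.normal(6, 5, n_students), 0, 100)

# Paired t-test on the per-student score differences,
# with a Wilcoxon signed-rank test as a non-parametric check.
t_stat, p_t = stats.ttest_rel(mcq_scores, oeq_scores)
w_stat, p_w = stats.wilcoxon(mcq_scores, oeq_scores)

print(f"paired t-test: t = {t_stat:.2f}, p = {p_t:.4f}")
print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {p_w:.4f}")
```

The actual study used a switching replications quasi‐experimental design with pre‐ and post‐tests, which compares groups across phases rather than relying on a single paired test as in this simplified sketch.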

Details

Language :
English
ISSN :
0266-4909
Volume :
38
Issue :
5
Database :
Complementary Index
Journal :
Journal of Computer Assisted Learning
Publication Type :
Academic Journal
Accession number :
158963674
Full Text :
https://doi.org/10.1111/jcal.12680