Where less is more: Limited feedback in formative online multiple‐choice tests improves student self‐regulation.
- Author
Say, Richard; Visentin, Denis; Saunders, Annette; Atherton, Iain; Carr, Andrea; King, Carolyn
- Subjects
ONLINE education, STATISTICS, CLINICAL trials, FOCUS groups, TEST-taking skills, SELF-control, RESEARCH methodology, MOTIVATION (Psychology), SATISFACTION, INTERVIEWING, COGNITION, EDUCATIONAL tests & measurements, LEARNING, T-test (Statistics), INTERPROFESSIONAL relations, STUDENTS, QUESTIONNAIRES, DESCRIPTIVE statistics, NURSING students, INFORMATION-seeking behavior, CROSSOVER trials, STATISTICAL sampling, CONTENT analysis, THEMATIC analysis, DATA analysis, DATA analysis software
- Abstract
Background: Formative online multiple‐choice tests are ubiquitous in higher education and potentially powerful learning tools. However, commonly used feedback approaches in online multiple‐choice tests can discourage meaningful engagement and enable strategies, such as trial‐and‐error, that circumvent intended learning outcomes. Such strategies prepare graduates neither as self‐regulated learners nor for the complexities of contemporary work settings.

Objectives: To investigate whether providing only a score after formative online multiple‐choice test attempts (score‐only feedback) makes students more likely to engage in self‐regulated learning than more directive feedback does. Measurable outcomes included deeper learning, collaboration, information seeking, and satisfaction.

Methods: Data in this mixed methods study were collected from nursing students through surveys, test results, focus groups, and student discussion board contributions. A quasi‐experimental design was used for the quantitative data, and the qualitative data were analysed thematically against domains of self‐regulated learning.

Results and Conclusions: Students receiving score‐only feedback were more cognitively engaged with the content, collaborated constructively, and sought out richer sources of information. However, score‐only feedback was also associated with lower satisfaction. In this study, minimal feedback created states of uncertainty that activated self‐regulatory actions.

Implications for Practice: Providing overly directive feedback for formative online multiple‐choice tests is conducive to surface‐level learning strategies. Minimising feedback and allowing for extended states of uncertainty makes students more likely to regulate their learning through self‐assessment and problem‐solving strategies, all of which graduates need to meet the challenges of real‐world work settings.

Lay Description:

What is already known about this topic:
- Formative online multiple‐choice tests are widely used in higher education.
- Feedback is a critical element in the design of effective online multiple‐choice tests.
- Formative feedback should promote self‐regulated learning.
- Feedback types commonly used in multiple‐choice tests facilitate strategies not conducive to self‐regulation.

What this paper adds:
- Online multiple‐choice test feedback can influence how students self‐regulate their learning.
- Less feedback can result in greater self‐evaluation and improved performance in summative assessment.
- Less feedback can encourage richer help‐seeking strategies (collaboration and information seeking).
- However, less feedback can result in student dissatisfaction.

Implications for practice and/or policy:
- Judicious use of feedback is required in the design of online multiple‐choice tests.
- Educators should consider providing less feedback to promote learner self‐regulation.
- Satisfaction with online multiple‐choice tests does not necessarily equate to greater learning.

[ABSTRACT FROM AUTHOR]
- Published
- 2024