
Leveraging natural language processing to support automated assessment and feedback for student open responses in mathematics.

Authors :
Botelho, Anthony
Baral, Sami
Erickson, John A.
Benachamardi, Priyanka
Heffernan, Neil T.
Source :
Journal of Computer Assisted Learning; Jun 2023, Vol. 39 Issue 3, p823-840, 18p
Publication Year :
2023

Abstract

Background: Teachers often rely on the use of open‐ended questions to assess students' conceptual understanding of assigned content. Particularly in the context of mathematics, teachers use these types of questions to gain insight into the processes and strategies adopted by students in solving mathematical problems beyond what is possible through more close‐ended problem types. While these types of problems are valuable to teachers, the variation in student responses to these questions makes it difficult, and time‐consuming, to evaluate and provide directed feedback. It is a well‐studied concept that feedback, both in terms of a numeric score but more importantly in the form of teacher‐authored comments, can help guide students as to how to improve, leading to increased learning. It is for this reason that teachers need better support not only for assessing students' work but also in providing meaningful and directed feedback to students.

Objectives: In this paper, we seek to develop, evaluate, and examine machine learning models that support automated open response assessment and feedback.

Methods: We build upon the prior research in the automatic assessment of student responses to open‐ended problems and introduce a novel approach that leverages student log data combined with machine learning and natural language processing methods. Utilizing sentence‐level semantic representations of student responses to open‐ended questions, we propose a collaborative filtering‐based approach to both predict student scores as well as recommend appropriate feedback messages for teachers to send to their students.

Results and Conclusion: We find that our method outperforms previously published benchmarks across three different metrics for the task of predicting student performance. Through an error analysis, we identify several areas where future work may be able to improve upon our approach.
Lay Description:

What is already known about this topic: Open‐ended questions are used by teachers in the domain of mathematics to assess their students' understanding, but automated support for these types of questions is limited in online learning platforms. Recent advancements in areas of machine learning and natural language processing have led to promising results in a range of domains and applications.

What this paper adds: Emulating how teachers identify similar student answers can be used to build tools that support teachers in assessing and providing feedback to student open‐ended work.

Implications for practice: Developing better automated supports can increase the amount of direct feedback students receive to guide their learning. [ABSTRACT FROM AUTHOR]
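The abstract does not give implementation details, but the core collaborative-filtering idea it describes (score a new response by its similarity to previously graded responses, and surface the teacher feedback attached to the closest match) can be illustrated with a rough sketch. Everything below is an assumption for illustration, not the authors' method: the function names, the cosine-similarity weighting, and the toy two-dimensional vectors standing in for real sentence embeddings (which a real system would produce with a sentence encoder such as SBERT).

```python
# Hypothetical sketch of similarity-based score prediction and feedback
# recommendation; NOT the paper's implementation. Toy vectors stand in
# for sentence-level embeddings of student responses.
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def predict_score_and_feedback(query_vec, scored_bank, k=3):
    """scored_bank: list of (embedding, score, feedback) for already-graded
    responses. Returns a similarity-weighted score over the k nearest
    neighbours and the feedback message of the single closest response."""
    sims = [(cosine_sim(query_vec, emb), score, fb)
            for emb, score, fb in scored_bank]
    top = sorted(sims, key=lambda t: t[0], reverse=True)[:k]
    # Clamp negative similarities to zero so they cannot flip the average.
    weights = np.array([max(s, 0.0) for s, _, _ in top])
    if weights.sum() == 0.0:
        weights = np.ones(len(top))  # fall back to a plain mean
    scores = np.array([sc for _, sc, _ in top])
    predicted = float(weights @ scores / weights.sum())
    best_feedback = top[0][2]  # feedback from the most similar response
    return predicted, best_feedback
```

A teacher-facing tool built on this pattern would keep a growing bank of graded responses per problem, so each new grading action improves both the predicted scores and the pool of reusable feedback messages.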

Details

Language :
English
ISSN :
02664909
Volume :
39
Issue :
3
Database :
Complementary Index
Journal :
Journal of Computer Assisted Learning
Publication Type :
Academic Journal
Accession number :
163886553
Full Text :
https://doi.org/10.1111/jcal.12793