
Comparing Different Approaches to Generating Mathematics Explanations Using Large Language Models

Authors:
Prihar, Ethan
Lee, Morgan
Hopman, Mia
Tauman Kalai, Adam
Vempala, Sofia
Wang, Allison
Wickline, Gabriel
Murray, Aly
Heffernan, Neil
Source:
Grantee Submission. 2023.
Publication Year:
2023

Abstract

Large language models have recently performed well on a wide variety of tasks. In this work, we explore the ability of large language models, specifically GPT-3, to write explanations for middle-school mathematics problems, with the goal of eventually using this process to rapidly generate explanations for the mathematics problems of new curricula as they emerge, shortening the time needed to integrate new curricula into online learning platforms. Two approaches were taken to generate explanations. The first attempted to summarize the salient advice in tutoring chat logs between students and live tutors. The second attempted to generate explanations using few-shot learning from explanations written by teachers for similar mathematics problems. After the explanations were generated, a survey was used to compare their quality to that of explanations written by teachers. We test our methodology using the GPT-3 language model. Ultimately, the synthetic explanations were unable to outperform teacher-written explanations. In the future, more powerful large language models may be employed, and GPT-3 may still be effective as a tool to augment teachers' process for writing explanations rather than as a tool to replace them. The explanations, survey results, analysis code, and a dataset of tutoring chat logs are all available at https://osf.io/wh5n9/.
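
As an illustration of the few-shot approach described in the abstract, the sketch below builds a prompt from teacher-written (problem, explanation) pairs and asks a GPT-3 completion model for an explanation of a new problem. The model name, prompt wording, and example problems are illustrative assumptions based on the legacy OpenAI completions API, not the paper's actual configuration; see the released materials at https://osf.io/wh5n9/ for the authors' code.

```python
# Minimal sketch of few-shot explanation generation with the legacy (pre-1.0)
# OpenAI Python SDK. Model name, prompt format, and examples are assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# Teacher-written (problem, explanation) pairs for similar problems serve as
# the few-shot examples prepended to the prompt.
examples = [
    ("Solve 3x + 5 = 20.",
     "Subtract 5 from both sides to get 3x = 15, then divide both sides by 3, so x = 5."),
    ("What is 25% of 80?",
     "25% means 25/100, or one quarter. One quarter of 80 is 80 / 4 = 20."),
]

def build_prompt(new_problem: str) -> str:
    """Concatenate the teacher-written examples, then append the new problem."""
    parts = []
    for problem, explanation in examples:
        parts.append(f"Problem: {problem}\nExplanation: {explanation}\n")
    parts.append(f"Problem: {new_problem}\nExplanation:")
    return "\n".join(parts)

response = openai.Completion.create(
    model="text-davinci-003",          # assumed GPT-3 completion model
    prompt=build_prompt("Solve 2(x - 4) = 10."),
    max_tokens=150,
    temperature=0.7,
)
print(response["choices"][0]["text"].strip())
```

The chat-log summarization approach would follow the same pattern, with the prompt instead containing a tutoring transcript and an instruction to summarize the tutor's salient advice.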

Details

Language:
English
Database:
ERIC
Journal:
Grantee Submission
Publication Type:
Conference
Accession Number:
ED636098
Document Type:
Speeches/Meeting Papers; Reports - Research