Evaluating Multiple-Choice Exams in Large Introductory Physics Courses
- Source :
- Physical Review Special Topics - Physics Education Research, Jul-Dec 2006, 2(2):020102
- Publication Year :
- 2006
Abstract
- The reliability and validity of professionally written multiple-choice exams have been extensively studied for exams such as the SAT, the Graduate Record Examination, and the Force Concept Inventory. Much of the success of these multiple-choice exams is attributed to the careful construction of each question, as well as of each response. In this study, the reliability and validity of scores from multiple-choice exams written for and administered in the large introductory physics courses at the University of Illinois at Urbana-Champaign were investigated. The reliability of exam scores over the course of a semester corresponds to approximately a 3% uncertainty in students' total semester exam score. This semester test score uncertainty yields an uncertainty in the students' assigned letter grade of less than 1/3 of a letter grade. To study the validity of exam scores, a subset of students was ranked independently based on their multiple-choice scores, graded explanations, and student interviews. The ranking of these students based on their multiple-choice scores was found to be consistent with the ranking assigned by physics instructors based on the students' written explanations (r > 0.94 at the 95% confidence level) and oral interviews (r = 0.94 +0.06/-0.09). (Contains 30 endnotes, 4 tables, and 7 figures.)
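
The abstract's validity check compares two independent rankings of the same students and reports a correlation with an asymmetric confidence interval. As a minimal sketch of how such a statistic could be computed, the Python snippet below correlates two hypothetical rankings and bootstraps a 95% interval; the data, sample size, and the choice of Spearman rank correlation are assumptions for illustration only, not the authors' actual method or data.

```python
# Sketch: rank-correlation consistency check with a bootstrap interval,
# analogous in form to the reported r = 0.94 (+0.06, -0.09).
# All scores below are synthetic and purely illustrative.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical scores for 40 students under two grading methods:
# multiple-choice exam score and instructor-graded written explanations.
mc_scores = rng.normal(75, 10, size=40)
written_scores = 0.9 * mc_scores + rng.normal(0, 4, size=40)

# Point estimate of the rank correlation between the two rankings.
r, _ = spearmanr(mc_scores, written_scores)

# Bootstrap resampling of students to get an (often asymmetric) 95% interval.
boot = []
for _ in range(5000):
    idx = rng.integers(0, len(mc_scores), size=len(mc_scores))
    boot.append(spearmanr(mc_scores[idx], written_scores[idx])[0])
lo, hi = np.percentile(boot, [2.5, 97.5])

print(f"r = {r:.2f}  (95% bootstrap interval: {lo:.2f} to {hi:.2f})")
```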
Details
- Language :
- English
- ISSN :
- 1554-9178
- Volume :
- 2
- Issue :
- 2
- Database :
- ERIC
- Journal :
- Physical Review Special Topics - Physics Education Research
- Publication Type :
- Academic Journal
- Accession number :
- EJ839546
- Document Type :
- Journal Articles; Reports - Research
- Full Text :
- https://doi.org/10.1103/PhysRevSTPER.2.020102