
Gender Bias in Test Item Formats: Evidence from PISA 2009, 2012, and 2015 Math and Reading Tests.

Authors :
Shear, Benjamin R.
Source :
Journal of Educational Measurement. Dec 2023, Vol. 60 Issue 4, p676-696. 21p.
Publication Year :
2023

Abstract

Large‐scale standardized tests are regularly used to measure student achievement overall and for student subgroups. These uses assume tests provide comparable measures of outcomes across student subgroups, but prior research suggests score comparisons across gender groups may be complicated by the type of test items used. This paper presents evidence that among nationally representative samples of 15‐year‐olds in the United States participating in the 2009, 2012, and 2015 PISA math and reading tests, there are consistent item format by gender differences. On average, male students answer multiple‐choice items correctly relatively more often and female students answer constructed‐response items correctly relatively more often. These patterns were consistent across 34 additional participating PISA jurisdictions, although the size of the format differences varied and was larger on average in reading than in math. The average magnitude of the format differences is not large enough to be flagged in routine differential item functioning analyses intended to detect test bias but is large enough to raise questions about the validity of inferences based on comparisons of scores across gender groups. Researchers and other test users should account for test item format, particularly when comparing scores across gender groups. [ABSTRACT FROM AUTHOR]
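The abstract refers to "routine differential item functioning analyses" as the standard screen for item bias. A common flagging rule in operational testing is the Mantel-Haenszel procedure with the ETS delta classification; the sketch below is a minimal, hypothetical illustration of that rule (the function names and example counts are invented here, and the ETS rule's statistical-significance condition is omitted for brevity), not the analysis used in the paper:

```python
import math

def mh_delta(strata):
    """Mantel-Haenszel common odds ratio, expressed on the ETS delta scale.

    strata: list of (a, b, c, d) tuples, one per ability stratum
    (typically a total-score level), where
      a = reference group correct,   b = reference group incorrect,
      c = focal group correct,       d = focal group incorrect.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    alpha_mh = num / den            # MH common odds ratio across strata
    return -2.35 * math.log(alpha_mh)  # ETS delta metric (MH D-DIF)

def ets_category(delta):
    """Simplified ETS A/B/C DIF classification from |delta| only
    (the full rule also requires statistical significance)."""
    ad = abs(delta)
    if ad < 1.0:
        return "A"   # negligible DIF
    if ad < 1.5:
        return "B"   # moderate DIF
    return "C"       # large DIF

# Hypothetical 2x2 tables for one item at two score levels:
example = [(40, 10, 30, 20), (35, 15, 25, 25)]
delta = mh_delta(example)
print(round(delta, 2), ets_category(delta))
```

Under the paper's finding, format-linked differences of the size reported would mostly land in category "A" under a rule like this, which is why routine DIF screens do not flag them even though they can accumulate across a test form.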

Details

Language :
English
ISSN :
0022-0655
Volume :
60
Issue :
4
Database :
Academic Search Index
Journal :
Journal of Educational Measurement
Publication Type :
Academic Journal
Accession number :
174011259
Full Text :
https://doi.org/10.1111/jedm.12372