DIGITAL LIBRARY
DIFFERENTIAL ITEM FUNCTIONING ACCORDING TO GENDER: AN ANALYSIS OF A STANDARDIZED TEST IN GEOGRAPHY
Scuola universitaria professionale della Svizzera italiana (SUPSI) (SWITZERLAND)
About this paper:
Appears in: ICERI2023 Proceedings
Publication year: 2023
Page: 1260 (abstract only)
ISBN: 978-84-09-55942-8
ISSN: 2340-1095
doi: 10.21125/iceri.2023.0416
Conference name: 16th annual International Conference of Education, Research and Innovation
Dates: 13-15 November, 2023
Location: Seville, Spain
Abstract:
Standardized tests are a popular tool for assessing students’ competence around the world (Rutkowski et al., 2010). In several of the subjects tested, performance differences have been found to relate to individual student characteristics such as gender or socioeconomic background. Regarding gender, differences in performance between males and females persist, to the detriment of females, particularly in several Western countries and in scientific subjects. For example, in the TIMSS 2019 survey, boys outperformed girls in the mathematics assessment in about half of the participating countries (Mullis et al., 2020). For this reason, it is important to analyze the content of the test itself in order to assess the impact of its construction on performance. The present study examines a standardized geography test administered to the whole population of fourth-grade students at the end of compulsory school in the Italian-speaking canton of Switzerland. The purpose of the study is to analyze possible gender differences in students’ overall performance and in specific kinds of questions (open-ended, multiple-choice, or drag-and-drop) and, finally, to analyze the Differential Item Functioning (DIF) (Swaminathan & Rogers, 1990; Camilli & Shepard, 1994) of the items proposed in the test. The aim is to highlight any gender differences in a subject (geography) that international standardized assessments rarely investigate, and to provide feedback on the functioning of single items in order to develop increasingly fair standardized assessments.
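The logistic-regression approach to DIF cited above (Swaminathan & Rogers, 1990) tests, item by item, whether group membership predicts a correct response after conditioning on overall ability. As a minimal illustrative sketch (not the study's actual analysis), the following Python code simulates responses to a single item with a built-in group effect and runs the standard likelihood-ratio test: a score-only baseline model versus a model augmented with a group term. All data and parameter values here are invented for illustration.

```python
# Sketch of uniform-DIF detection via logistic regression
# (Swaminathan & Rogers, 1990). An item is flagged for uniform DIF
# when adding a group (e.g., gender) term significantly improves
#   P(correct) = logistic(b0 + b1*score + b2*group)
# over the score-only baseline, judged by a likelihood-ratio test.
import numpy as np

rng = np.random.default_rng(0)

def fit_logistic(X, y, iters=25):
    """Newton-Raphson fit of a logistic regression.
    Returns the coefficient vector and the maximized log-likelihood."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        grad = X.T @ (y - p)                      # score vector
        hess = X.T @ (X * (p * (1 - p))[:, None])  # observed information
        w += np.linalg.solve(hess, grad)
    p = 1.0 / (1.0 + np.exp(-X @ w))
    ll = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    return w, ll

# Simulated examinees: latent ability and group (0 = male, 1 = female)
n = 2000
ability = rng.normal(size=n)
group = rng.integers(0, 2, size=n).astype(float)
# One item answered with a deliberate group effect (uniform DIF)
p_item = 1.0 / (1.0 + np.exp(-(ability - 0.6 * group)))
y = (rng.random(n) < p_item).astype(float)
score = ability  # stand-in for the matching criterion (total test score)

ones = np.ones(n)
X0 = np.column_stack([ones, score])           # baseline: score only
X1 = np.column_stack([ones, score, group])    # augmented: score + group

_, ll0 = fit_logistic(X0, y)
_, ll1 = fit_logistic(X1, y)
lr_stat = 2 * (ll1 - ll0)  # ~ chi-square, 1 df, under the no-DIF null
print(f"LR statistic = {lr_stat:.2f} (5% critical value, 1 df: 3.84)")
```

Testing non-uniform DIF works the same way with an additional score-by-group interaction term in the augmented model (2 df in the likelihood-ratio test).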

References:
[1] Camilli, G., & Shepard, L. A. (1994). Methods for identifying biased test items. Sage.
[2] Mullis, I. V. S., Martin, M. O., Foy, P., Kelly, D. L., & Fishbein, B. (2020). TIMSS 2019 International Results in Mathematics and Science. Retrieved from Boston College, TIMSS & PIRLS International Study Center website: https://timssandpirls.bc.edu/timss2019/international-results/
[3] Rutkowski, L., Gonzalez, E., Joncas, M., & von Davier, M. (2010). International large-scale assessment data: Issues in secondary analysis and reporting. Educational Researcher, 39(2), 142-151.
[4] Swaminathan, H., & Rogers, H. J. (1990). Detecting differential item functioning using logistic regression procedures. Journal of Educational Measurement, 27, 361–370.
Keywords:
Standardized assessment, school, performance, differential item functioning.