AN INNOVATIVE APPROACH FOR A LANGUAGE ADAPTIVE TEST
University of Montreal (CANADA)
About this paper:
Appears in: ICERI2013 Proceedings
Publication year: 2013
Pages: 7036-7042
ISBN: 978-84-616-3847-5
ISSN: 2340-1095
Conference name: 6th International Conference of Education, Research and Innovation
Dates: 18-20 November, 2013
Location: Seville, Spain
Abstract:
A few years ago, the Ministry of Education of Quebec, Canada, initiated a project aimed at implementing a compulsory second-language test at the college level, in French and in English. The objective was to ensure that all colleges in the province could share a common instrument for reporting their students’ level of proficiency in the second language. The first step for our team was to develop a framework aligned with the Canadian Language Benchmarks – a 12-level descriptive scale comparable to the Common European Framework of Reference for Languages.

We developed a computerized adaptive test for Reading and Listening. These two subtests consist of multiple-choice questions that verify the candidate’s ability to understand audio documents or texts. The documents are selected according to hypotheses about the candidate’s actual level, which are formed and refined as the test is administered. We created documents that correspond to various levels of the scale and resemble what candidates are likely to hear or read in real-life situations. Questions and documents were calibrated using a Rasch model. Since a variable number of questions is attached to a single document, we implemented an innovative adaptive test procedure. We found that this procedure can quickly and accurately locate the candidate’s level on the descriptive scale in Reading and Listening.
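The abstract does not detail the selection algorithm, but a testlet-based adaptive procedure under the Rasch model can be sketched roughly as follows. All names, difficulty values, and the stopping rule below are illustrative assumptions, not the authors’ implementation; the key point is that each document carries several questions, and the ability estimate is updated only after a whole document has been administered.

```python
import math
import random

def rasch_p(theta, b):
    """Rasch model: probability of a correct answer given ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_theta(responses, theta=0.0, iters=25):
    """Newton-Raphson maximum-likelihood ability estimate from (difficulty, score) pairs.

    The estimate is clamped to [-4, 4] so that all-correct or all-wrong
    response patterns do not make the likelihood maximizer diverge.
    """
    for _ in range(iters):
        grad = sum(x - rasch_p(theta, b) for b, x in responses)
        info = sum(rasch_p(theta, b) * (1.0 - rasch_p(theta, b)) for b, _ in responses)
        if info < 1e-9:
            break
        theta = max(-4.0, min(4.0, theta + grad / info))
    return theta

def pick_document(theta, documents, used):
    """Pick the unused document whose calibrated difficulty is closest to the current estimate."""
    unused = [d for d in documents if d["id"] not in used]
    return min(unused, key=lambda d: abs(d["difficulty"] - theta))

def administer(true_theta, documents, n_documents=4):
    """Simulate a testlet-based adaptive session: each selected document
    contributes all of its questions before the ability estimate is refined."""
    responses, used, theta = [], set(), 0.0
    for _ in range(n_documents):
        doc = pick_document(theta, documents, used)
        used.add(doc["id"])
        for b in doc["item_difficulties"]:
            x = 1 if random.random() < rasch_p(true_theta, b) else 0
            responses.append((b, x))
        theta = estimate_theta(responses, theta)
    return theta

if __name__ == "__main__":
    random.seed(42)
    # Hypothetical document bank spanning the difficulty scale.
    bank = [{"id": i, "difficulty": d,
             "item_difficulties": [d - 0.5, d, d + 0.5, d + 0.25]}
            for i, d in enumerate([-2.0, -1.0, 0.0, 1.0, 2.0])]
    print("estimated level:", round(administer(1.0, bank), 2))
```

In this sketch the stopping rule is simply a fixed number of documents; an operational test would more likely stop once the standard error of the estimate falls below a threshold tied to the level boundaries of the descriptive scale.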

For productive skills, we tried to design the most efficient strategy possible. On the Writing subtest, candidates are given a writing task that they might have to perform in an academic or professional context. The text is sent to a marking center. By providing adequate training to the raters, we have been able to obtain very good inter-rater reliability. Speaking is more difficult to assess. We decided to adopt a confirmatory approach using the information collected in the previous subtests. Candidates are asked to produce a short sample of speech on a familiar topic. The recording is also forwarded to the marking center. Preliminary results indicate that the level estimate is fairly reliable in the context of a low-stakes certification test.
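Inter-rater reliability on scored writing samples is commonly summarized with a chance-corrected agreement statistic such as Cohen’s kappa. The abstract does not say which statistic the team used, and the ratings below are invented for illustration, but a minimal computation looks like this:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement expected from each rater's marginal category frequencies.
    expected = sum(count_a[c] * count_b[c]
                   for c in set(rater_a) | set(rater_b)) / (n * n)
    if expected == 1.0:  # both raters used a single identical category
        return 1.0
    return (observed - expected) / (1.0 - expected)

# Hypothetical benchmark levels assigned by two trained raters to ten scripts.
rater_1 = [5, 6, 6, 7, 5, 8, 6, 7, 7, 6]
rater_2 = [5, 6, 7, 7, 5, 8, 6, 7, 6, 6]
print("kappa:", round(cohens_kappa(rater_1, rater_2), 2))
```

Because the ratings here are ordinal benchmark levels, a weighted variant of kappa (penalizing near-misses less than distant disagreements) would arguably be the more natural summary in practice.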

This project opens several research avenues: practicality issues (e.g., automated test maintenance), measurement issues (e.g., test dimensionality problems), and test-use issues (e.g., the social acceptability of the tool).
Keywords:
Assessment, language test, adaptive test.