FACTORS AFFECTING SCIENCE LEARNING ASSESSMENT: THE TRENDS IN INTERNATIONAL MATHEMATICS AND SCIENCE STUDY CASE
University of Cyprus (CYPRUS)
About this paper:
Conference name: 15th annual International Conference of Education, Research and Innovation
Dates: 7-9 November, 2022
Location: Seville, Spain
Abstract:
Assessment is an integral part of the learning process. It provides valuable information about students’ learning, the effectiveness of teaching methods and materials, and teachers’ efficiency. This underscores the importance of ensuring that assessment tests comprise appropriate items that can effectively measure the construct they purport to assess (e.g. students’ understanding or skills).
Assessment tests, however, are often confounded by various (external) factors that extend beyond what the tests intend to measure. For instance, an item may include features that artificially elevate its complexity (e.g. complicated graphs or unnecessary images) and thereby blur, rather than clarify, the information that can be derived about students’ understanding of, or proficiency in, the topic or competence under investigation.
The aim of this study is to explore such confounding factors in the case of science learning assessment. It sets out to address the following question: “to what extent (and how) do certain (external) factors affect students’ attempts to respond to science items designed to assess their understanding of, or proficiency in, certain topics/skills?” In particular, the study focuses on the following factors: the presence of an image in an item, the presentation of data in tabular form, and the length of the item (number of words).
To address this question we relied on the Trends in International Mathematics and Science Study (TIMSS), a large-scale assessment conducted every four years since 1995 with the participation of about 60 countries. We drew on the pool of released TIMSS science items and selected 20 items, which served as the basis for the assessment tests employed in this study. In particular, we developed two variants of each item that differed with respect to the three factors under investigation, so as to evaluate how each factor influences students’ performance.
The assessment tests were administered to 1245 students in grades 4, 5 and 6, and students’ performance on the two versions of each item was compared in order to explore the impact of each factor. This paper focuses solely on the image effect on students’ performance.
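As a purely illustrative aside (the abstract does not specify the statistical procedure used for the comparison), a minimal sketch of the kind of version-to-version contrast described above could take the following form in Python; the counts, the variable names and the choice of a two-proportion z-test are assumptions for illustration only, not methods or results reported by the study.

# Illustrative sketch only: comparing correct-response rates on two versions
# of the same item (e.g. with vs. without a critical image).
# All counts below are hypothetical placeholders, not data from the study.
from statsmodels.stats.proportion import proportions_ztest

correct = [312, 268]    # correct responses on version A and version B (hypothetical)
answered = [420, 415]   # students who attempted version A and version B (hypothetical)

z_stat, p_value = proportions_ztest(count=correct, nobs=answered)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

# A small p-value would suggest that the difference in correct-response rates
# between the two versions is unlikely to be due to chance alone.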
In broad terms, images may serve two different functions in an item: they may hold a critical role, in that they uniquely convey crucial information, or they may serve a complementary role by essentially repeating (pictorially) information already presented verbally in the item stem. For the purpose of this study, images serving the critical role were removed and the information they conveyed was integrated into the stem, whereas images serving a complementary role were simply removed.
The results revealed better student performance on the items that included critical images than on their counterpart items from which those images had been removed. It could thus be argued that providing an image which carries important information not conveyed in the verbal part of the question has a positive impact on performance. Of course, for this positive effect to be realized, the selection of images becomes vitally important. Hence, clarifying the characteristics of “powerful” images needs to receive due attention.
This is an ongoing study and this paper presents preliminary results. However, we anticipate that the study will contribute important information, with ensuing implications that could supplement the existing body of knowledge on assessment.
Keywords:
Education, assessment, science learning assessment, questions, items, students’ understanding, students’ performance, image, image-effect, image functions.