About this paper

Appears in: ICERI2011 Proceedings
Pages: 5552-5562
Publication year: 2011
ISBN: 978-84-615-3324-4
ISSN: 2340-1095

Conference name: 4th International Conference of Education, Research and Innovation
Dates: 14-16 November, 2011
Location: Madrid, Spain

EFFECT OF ELIMINATING DIFFERENTIALLY FUNCTIONING ITEMS ON TEST VALIDITY AND RELIABILITY

J. Pedrajita

University of the Philippines (PHILIPPINES)
This study looked into differentially functioning items in a Chemistry Achievement Test. It also examined the effect of eliminating differentially functioning items on the content validity, concurrent validity, and internal consistency reliability of the test. Test scores of two hundred (200) junior high school students matched on school type were subjected to differential item functioning (DIF) analysis. One hundred students came from a public school, while the other 100 came from a private school. A descriptive-comparative research design was employed, combining differential item functioning analysis with validity and reliability analysis. The DIF methods used were the Chi-Square, Distractor Response Analysis, Logistic Regression, and the Mantel-Haenszel Statistic. A six-point scale ranging from inadequate to adequate was used to assess the content validity of the test. Pearson r was used in the concurrent validity analysis. The KR-20 formula was used to estimate the internal consistency reliability of the test. The findings revealed the presence of differentially functioning items between the public and private school examinees. The DIF methods differed in the number of differentially functioning items identified; however, there was a high degree of correspondence between Logistic Regression and the Mantel-Haenszel Statistic. The content validity of the test differed per DIF method, ranging from slightly adequate to moderately adequate depending on the number of items retained. The concurrent validity of the test also differed per DIF method, but all correlations were positive and indicated a moderate relationship between the examinees' test scores and their GPA in Science III. Likewise, the internal consistency reliability of the test differed per DIF method. The more differentially functioning items were eliminated, the lower the content validity, concurrent validity, and internal consistency reliability of the test became. Eliminating differentially functioning items thus diminished the content validity, concurrent validity, and internal consistency reliability of the test.
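For readers unfamiliar with the procedures named in the abstract, the sketch below illustrates two of them in Python: a Mantel-Haenszel common odds ratio for flagging a differentially functioning item, and the KR-20 internal consistency coefficient. This is a minimal sketch under stated assumptions, not the author's analysis: the item-response matrix, group coding, number of score strata, and random data are hypothetical placeholders, and the paper's actual test data and scoring are not reproduced.

```python
# Minimal sketch (hypothetical data): Mantel-Haenszel DIF screening and
# KR-20 reliability for a dichotomously scored (0/1) test.
import numpy as np

def kr20(responses):
    """KR-20 internal consistency for a 0/1 item-response matrix (rows = examinees)."""
    k = responses.shape[1]                          # number of items
    p = responses.mean(axis=0)                      # proportion correct per item
    q = 1.0 - p
    total_var = responses.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

def mantel_haenszel_odds_ratio(item, group, total_score, n_strata=5):
    """Common odds ratio for one item; group is 1 = reference, 0 = focal,
    with examinees matched on total score via equal-width strata."""
    edges = np.linspace(total_score.min(), total_score.max(), n_strata + 1)[1:-1]
    strata = np.digitize(total_score, edges)
    num, den = 0.0, 0.0
    for s in np.unique(strata):
        m = strata == s
        a = np.sum((group[m] == 1) & (item[m] == 1))   # reference, correct
        b = np.sum((group[m] == 1) & (item[m] == 0))   # reference, incorrect
        c = np.sum((group[m] == 0) & (item[m] == 1))   # focal, correct
        d = np.sum((group[m] == 0) & (item[m] == 0))   # focal, incorrect
        t = a + b + c + d
        if t > 0:
            num += a * d / t
            den += b * c / t
    return num / den if den > 0 else np.nan

# Hypothetical data: 200 examinees (100 per school type), 40 items.
rng = np.random.default_rng(0)
responses = (rng.random((200, 40)) < 0.6).astype(int)
group = np.repeat([1, 0], 100)          # 1 = public, 0 = private (arbitrary coding)
totals = responses.sum(axis=1)

print("KR-20:", round(kr20(responses), 3))
# An odds ratio far from 1 flags the item as potentially functioning differentially.
print("MH odds ratio, item 1:", round(mantel_haenszel_odds_ratio(responses[:, 0], group, totals), 3))
```

In a workflow like the one the abstract describes, items flagged by such a statistic would be removed and KR-20 (and the validity coefficients) recomputed on the shortened test to observe the effect of elimination.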
@InProceedings{PEDRAJITA2011EFF,
author = {Pedrajita, J.},
title = {EFFECT OF ELIMINATING DIFFERENTIALLY FUNCTIONING ITEMS ON TEST VALIDITY AND RELIABILITY},
series = {4th International Conference of Education, Research and Innovation},
booktitle = {ICERI2011 Proceedings},
isbn = {978-84-615-3324-4},
issn = {2340-1095},
publisher = {IATED},
location = {Madrid, Spain},
month = {14-16 November, 2011},
year = {2011},
pages = {5552-5562}}
TY - CONF
AU - J. Pedrajita
TI - EFFECT OF ELIMINATING DIFFERENTIALLY FUNCTIONING ITEMS ON TEST VALIDITY AND RELIABILITY
SN - 978-84-615-3324-4/2340-1095
PY - 2011
Y1 - 14-16 November, 2011
CI - Madrid, Spain
JO - 4th International Conference of Education, Research and Innovation
JA - ICERI2011 Proceedings
SP - 5552
EP - 5562
ER -
J. Pedrajita (2011) EFFECT OF ELIMINATING DIFFERENTIALLY FUNCTIONING ITEMS ON TEST VALIDITY AND RELIABILITY, ICERI2011 Proceedings, pp. 5552-5562.