EVALUATING AND ASSESSING DATA QUALITY IN THE SOUTH AFRICAN EDUCATION SECTOR
1 Khulisa Management Services (SOUTH AFRICA)
2 University of the Witwatersrand (SOUTH AFRICA)
3 South African National Department of Education (SOUTH AFRICA)
About this paper:
Appears in:
ICERI2009 Proceedings
Publication year: 2009
Pages: 1411-1422
ISBN: 978-84-613-2953-3
ISSN: 2340-1095
Conference name: 2nd International Conference of Education, Research and Innovation
Dates: 16-18 November, 2009
Location: Madrid, Spain
Abstract:
Khulisa Management Services has worked with the South African Department of Education (DoE) since 2004 to assess the quality of its Education Management Information System (EMIS). Khulisa has assessed data quality by: (a) annually surveying a representative sample of schools and triangulating the results against the DoE’s EMIS database; and (b) conducting data quality audits (DQAs) at the district, provincial and national levels of the DoE. These complementary approaches were adopted after consultation with the DoE on effective methodologies and provide a comprehensive understanding of the accuracy, efficiency and quality of the EMIS.
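To make the triangulation step concrete, the sketch below is illustrative only; it is not the project’s actual tooling, and the field names, figures and 5% threshold are hypothetical. It compares enrolment reported through an EMIS extract with head counts verified on site for a sample of schools and flags material over-reporting.

```python
# Illustrative sketch: triangulate school-reported enrolment against
# independently verified head counts from a sample survey.
# Column names and values are hypothetical.
import pandas as pd

# EMIS extract: enrolment as reported by each school.
emis = pd.DataFrame({
    "school_id": ["S001", "S002", "S003"],
    "reported_enrolment": [420, 610, 335],
})

# Survey sample: enrolment verified on site by field teams.
survey = pd.DataFrame({
    "school_id": ["S001", "S002", "S003"],
    "verified_enrolment": [401, 552, 333],
})

merged = emis.merge(survey, on="school_id")
merged["over_report_pct"] = (
    (merged["reported_enrolment"] - merged["verified_enrolment"])
    / merged["verified_enrolment"] * 100
)

# Flag schools whose reported figures exceed verified counts by more than 5%.
flagged = merged[merged["over_report_pct"] > 5]
print(flagged[["school_id", "over_report_pct"]])
```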
The DoE is placing increasing emphasis on delivering quality services, partly through enhanced monitoring and data-driven decision making. However, the self-reported data from schools (collected in two annual surveys) and the process of transmitting those data have compromised quality. Moreover, perverse policy incentives lead many principals to over-report enrolment.
Khulisa’s first small assessment in 2004 revealed multiple problems, including unreliable DoE data and “ghost” students. In 2005/6, the DoE commissioned a larger survey of 2% of schools (about 550), which confirmed the earlier findings. In 2007/8, Khulisa was awarded a three-year contract to survey 4% of schools and to extend the surveys to the adult, special needs and vocational education sectors. In 2009, the DoE requested that Khulisa add DQAs at district, provincial and national levels. The DQAs result in compliance notes, which has raised both the technocratic and political stakes.
The DQAs introduced in 2009, which comply with the South African Statistical Quality Assessment Framework, assess the data management system nationally and in all nine provinces. They track data through the system, allowing assessment of where errors creep in during collection, collation, entry, cleaning, analysis and reporting. Once errors are identified, compliance notes are issued requiring system improvements.
The surveys revealed important findings: (a) Most ordinary schools over-report enrolment by material amounts, revealing “ghost learners”, and have inadequate data management systems, reducing reliability and validity. (b) Flaws in the EMIS survey result in poor data from special needs schools. (c) Adult education centres do not follow the requisite legislation or maintain adequate data management systems, resulting in minimal accountability and poor data. (d) Vocational colleges generally have functioning data management systems, but their data reporting often does not align with DoE requirements. (e) The provincial departments and the national department need to address: resource issues (insufficient budget and staff); inadequately defined and integrated roles and responsibilities; poor process and data flow; insufficient adherence to accountability requirements, including verification; and neglect of the smaller sectors (vocational, adult, special needs) in favour of the ordinary schools sector.
System-wide issues include weak data management systems at schools, districts and provinces, and limited access to technology.
The dual approach adopted by Khulisa and the DoE provides an in-depth understanding of how data should be collected, collated, analysed and used. It has improved accountability and contributed significantly to building capacity at school, provincial and national levels to address critical weaknesses.
Keywords:
evaluation, assessment, data quality, accountability, research, education.