A QUANTITATIVE EVALUATION OF LEARNING OUTCOMES OF AN INTEGRATED ICT PROGRAMME
Metropolia University of Applied Sciences (FINLAND)
About this paper:
Appears in: ICERI2016 Proceedings
Publication year: 2016
Pages: 2905-2912
ISBN: 978-84-617-5895-1
ISSN: 2340-1095
doi: 10.21125/iceri.2016.1628
Conference name: 9th annual International Conference of Education, Research and Innovation
Dates: 14-16 November, 2016
Location: Seville, Spain
Abstract:
A major administrative, government steering and funding reform was implemented in the Finnish University of Applied Sciences system in 2014 and 2015. The new government funding criteria are based on individual students reaching 55 of the normative 60 ECTS credits per academic year. The reform is introduced as background for the learning outcome analysis.
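
As an illustration only (not taken from the paper), the sketch below shows one way such a trigger-level criterion could be evaluated in Python; the record structure and names are hypothetical assumptions.

# Illustrative sketch (not from the paper): the trigger-level funding
# criterion counts a student as "performing" when they complete at least
# 55 of the normative 60 ECTS credits in an academic year.
from dataclasses import dataclass

TRIGGER_ECTS = 55       # funding trigger level (assumed constant name)
NORMATIVE_ECTS = 60     # normative annual credit load, for reference

@dataclass
class StudentYear:
    student_id: str
    credits: float      # ECTS credits completed in the academic year

def trigger_share(records: list[StudentYear]) -> float:
    """Share of students reaching the 55-ECTS trigger (0.0 if no records)."""
    if not records:
        return 0.0
    reached = sum(1 for r in records if r.credits >= TRIGGER_ECTS)
    return reached / len(records)

# Example: 2 of 3 students reach the trigger -> indicator value of about 0.67
print(trigger_share([StudentYear("a", 58), StudentYear("b", 60), StudentYear("c", 42)]))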

The change of the funding model also implied many revisions to the education programme structures. Helsinki Metropolia University of Applied Sciences (HMUAS) implemented major revisions to its programmes at the beginning of the academic year 2014. Among others, the previously separate Health Technology programme was integrated into the Information and Communication Technology programme as one of its specialisation options. The rationale and main lines of the curriculum change, including the programme merger, are described to the extent necessary to expose the research questions.

Programme performance indicators based on the new funding model are calculated retrospectively against pre-reform student data over four years to establish stable programme-level baseline indicators. The criteria were then applied for cross-comparison purposes, and a cross-comparison of the programmes based on consistently sampled historical data is presented. In particular, we numerically analyse the results of the first-year entry cohorts of 2010-2013 during their studies in the old curriculum and compare them with the theme-based integrated approach over the first two years, 2014-2016.
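
For illustration, a minimal sketch of how such a retrospective cohort cross-comparison could be organised is given below; the tuple layout and names are assumptions for the example, not the authors' implementation.

# Illustrative sketch (assumed data layout, not from the paper): the same
# trigger-level indicator is computed retrospectively per entry cohort and
# study year, so pre-reform cohorts (2010-2013) and the integrated-curriculum
# cohorts (2014-2015) can be cross-compared on a common baseline.
from collections import defaultdict

def cohort_indicators(records):
    """records: iterable of (entry_cohort, study_year, credits) tuples.
    Returns {(entry_cohort, study_year): share of students >= 55 ECTS}."""
    counts = defaultdict(lambda: [0, 0])      # key -> [reached, total]
    for cohort, year, credits in records:
        counts[(cohort, year)][1] += 1
        if credits >= 55:
            counts[(cohort, year)][0] += 1
    return {key: reached / total for key, (reached, total) in counts.items()}

# Example: first-year indicator for a 2013 (old curriculum) and a 2014
# (integrated curriculum) entry cohort.
data = [(2013, 1, 48), (2013, 1, 57), (2014, 1, 59), (2014, 1, 56)]
print(cohort_indicators(data))   # {(2013, 1): 0.5, (2014, 1): 1.0}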

In a longitudinal follow-up, the student cohorts who selected Health Technology and the other Majors of the integrated ICT programme in 2015 are traced back to their original entry cohort of 2014, and an initial analysis of two years of sustained good performance is discussed. The realised and significantly improved indicator values are reported. The performance of each Major group is analysed during its second year, and the same criteria are applied retrospectively to its first year. The analysis shows the longitudinal performance development over the first two years compared with the old curricula. Because the ICT programme runs on two campuses, some pedagogical differences may appear.
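
A hedged sketch of the longitudinal trace-back is shown below; the dictionaries, field names, and two-year window are hypothetical and only illustrate the idea of evaluating the same Major group against the criterion in both study years.

# Illustrative sketch (hypothetical field names): students who chose a Major
# in 2015 are traced back to their 2014 entry records, and the trigger
# indicator is evaluated for the same group over both study years.

def longitudinal_shares(major_choices, yearly_credits, major):
    """major_choices: {student_id: Major chosen in 2015}
    yearly_credits: {(student_id, study_year): ECTS credits completed}
    Returns {study_year: share of the Major group reaching 55 ECTS}."""
    group = [sid for sid, m in major_choices.items() if m == major]
    shares = {}
    for year in (1, 2):                        # first and second study year
        credits = [yearly_credits.get((sid, year), 0.0) for sid in group]
        shares[year] = sum(c >= 55 for c in credits) / len(group) if group else 0.0
    return shares

choices = {"s1": "Health Technology", "s2": "Health Technology", "s3": "Software"}
credits = {("s1", 1): 57, ("s1", 2): 60, ("s2", 1): 52, ("s2", 2): 58, ("s3", 1): 60, ("s3", 2): 61}
print(longitudinal_shares(choices, credits, "Health Technology"))  # {1: 0.5, 2: 1.0}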

As we are all encouraged to promote increasingly diverse and individual paths of study, a partial cross-check of our indicators is performed using pedagogically actualised teaching groups instead of purely administrative ones, in order to decrease the divergence between administrative performance and pedagogical performance based on actualised student numbers. The discussion reveals a number of pedagogically related issues that could be used to improve the accuracy of the indicators for assessing pedagogical performance, that is, a quantitative view of the learning outcomes.

The critical discussion covers some biased and unbiased error sources and the inherent instability of the introduced trigger-level based funding model. It addresses in particular the indicator instability caused by the many variables that lie beyond the control of any educational institution. We conclude that cross-reference and longitudinal analysis using the criteria are technically feasible, although the indicator comparisons are sensitive to careful sampling and the use of correct group sizes. Finally, some prudent observations on the reliability of our results are discussed, and potential directions for developing the method in the future are presented.
Keywords:
Curriculum, Health Technology, ICT, Funding model, University reform, Performance, Indicators.