Norwegian University of Science and Technology (NORWAY)
About this paper:
Appears in: ICERI2015 Proceedings
Publication year: 2015
Pages: 503-507
ISBN: 978-84-608-2657-6
ISSN: 2340-1095
Conference name: 8th International Conference of Education, Research and Innovation
Dates: 18-20 November, 2015
Location: Seville, Spain
This study starts out by exploring why five higher education institutions failed to meet nationally agreed criteria for the approval of their quality systems. To this end, the review panels’ reports were examined for analysis; the reports were readily available online and represented an excellent data source for research purposes. The review panels found that quality reports were descriptive rather than analytical, that quality procedures were unsystematic, and that conclusions were often missing. Unfortunately, the panels failed to come up with approaches that could remedy the situation; rather, the recipe seems to be more of the same, in particular student evaluation of teaching. With this as a backdrop, this study provides conceptual tools that might change the actors’ approaches and thus empower those undertaking institutional quality reviews.

In this qualitative study, reports from all five institutions were analyzed, and the following questions guided the research:
1) Which issues in particular caused the external review panels’ concern?
2) What advice did the panels provide to the institutions to enhance their educational quality?
3) What might be helpful conceptual approaches to get a better handle on quality enhancement?

First, I wanted to capture key themes in the panels’ conclusions; second, I wanted to check for patterns in the data gathered from the institutions. Are there issues that keep recurring across institutions?

Based on the analysis, a key issue is the need for conceptual tools to make sensible interventions. The data indicate that institutions need a more scholarly approach when undertaking quality enhancement projects. In an ideal world, there would be a link between data collection, analysis, and action. Access to relevant data is a necessary but not sufficient condition for a successful analysis; it also requires theoretical insight as well as significant experience.

This paper demonstrates the importance of sound conceptual frameworks capable of identifying targeted areas in relation to “focus” and “purpose”. Conceptual shortcomings have important consequences for data analysis and follow-up initiatives. It is noteworthy that, even though the review panels bluntly rejected the institutions’ quality programs, they were less able to assist the failing institutions by suggesting innovative approaches. The general recipe seems to be more of the same measures that did not work in the first place. This study suggests several approaches to quality enhancement that carry the potential to change and improve key players’ thinking and practice. It starts out as an empirical study and ends with conceptual contributions that may help remedy the situation.

A key challenge in any quality judgment relates to the use of “standards” and “criteria”. These terms are sometimes used interchangeably but refer to different realities. If quality programs are imposed by external decisions without proper involvement of key internal players, institutions run the risk of losing the support of academic staff and students. In the worst case, quality initiatives launched with the best of intentions may inadvertently do their institution a disservice. One way to avoid this is for stakeholders to become more “grounded” in their approaches, which requires key players to become more professional and “scholarly” than was the case in this study.
Keywords: Conceptual approach, quality enhancement, evaluation, scholarly.