TEACHING MASTER’S STUDENTS IN PUBLIC ADMINISTRATION HOW TO APPRAISE THE METHODOLOGICAL RIGOUR OF SCIENTIFIC STUDIES: A NEW TEACHING APPROACH
Université Laval (CANADA)
About this paper:
Appears in: INTED2014 Proceedings
Publication year: 2014
Pages: 2151-2159
ISBN: 978-84-616-8412-0
ISSN: 2340-1079
Conference name: 8th International Technology, Education and Development Conference
Dates: 10-12 March, 2014
Location: Valencia, Spain
Abstract:
Searching for and analysing research studies require specific knowledge and skills, which are supposedly learned at university through research methodology courses. However, empirical evidence suggests that many governmental policy analysts currently on duty are unaware of the electronic bibliographic databases available to them and lack knowledge of well-established research designs. This raises questions about what types of knowledge and skills in research evidence mobilization are taught to future policy analysts during their university education. Since 2012, a 45-hour course in evidence-informed policy that includes a module on critical appraisal has been mandatory for all students enrolled in the Master's Program in Public Affairs at one of the largest Canadian universities. Despite the challenge of conducting controlled experiments in university settings, we were able to assess the course using a controlled before-and-after study design. The primary outcome was the percentage of pre-post improvement on a test measuring students' knowledge. The design had two arms: a treatment group composed of students enrolled in the Master's Program in Public Affairs who received the mandatory 45-hour training course in evidence-informed policy, and a control group composed of students enrolled in a Master's in Political Science who received a 45-hour course in general research methodology (see details below). Random assignment to treatment and control groups was impossible due to institutional constraints. In the treatment group, the mean percentage of pre-post improvement on the knowledge test was 36.9% (SD 27.5), compared with 11.3% (SD 19.12) in the control group, composed of students exposed to the traditional graduate-level research methodology course in Political Science. However, the mean score on the post-test for the treatment group was only half of the maximum score. We will replicate the same protocol with the Winter 2014 cohort of students.
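
As a rough illustration of the primary outcome measure, the sketch below (in Python) shows one way a pre-post percentage improvement and its group mean and standard deviation could be computed. The scoring formula (gain expressed as a percentage of a hypothetical maximum score) and all values are assumptions for illustration only, not the study's actual data or scoring rules.

    # Illustrative sketch only: how a pre-post percentage improvement and its
    # group mean/SD might be computed. Scores and the improvement formula are
    # assumptions, not the study's actual data or scoring rules.
    from statistics import mean, stdev

    MAX_SCORE = 20  # hypothetical maximum score on the knowledge test

    def pct_improvement(pre: float, post: float, max_score: float = MAX_SCORE) -> float:
        """Pre-post gain expressed as a percentage of the maximum score."""
        return (post - pre) / max_score * 100

    # Placeholder (pre, post) score pairs for a treatment and a control group.
    treatment = [(6, 14), (8, 15), (5, 12)]
    control = [(7, 9), (6, 8), (8, 10)]

    for label, group in (("treatment", treatment), ("control", control)):
        gains = [pct_improvement(pre, post) for pre, post in group]
        print(f"{label}: mean {mean(gains):.1f}%, SD {stdev(gains):.1f}")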

Our past experience in training graduate students in the critical appraisal of scientific studies led us to innovate by developing a new software application aimed at facilitating the systematic critical appraisal of more than 12 types of research studies, including experimental and quasi-experimental studies, quantitative observational studies (cross-sectional and longitudinal), qualitative studies, and narrative and opinion papers. The aims of this presentation are to: i) define the concept of critical appraisal, ii) report the results of our past efforts to teach it to Master's students, and iii) present a new tool that we developed to facilitate both teaching and learning.
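
As a purely hypothetical sketch, the Python fragment below shows one way such a tool could organise appraisal criteria by study type. The study-type categories come from the abstract, but the checklist questions, names, and structure are illustrative assumptions rather than a description of the actual application.

    # Hypothetical mapping of study types to appraisal questions; the categories
    # are taken from the abstract, the questions and structure are illustrative.
    APPRAISAL_CHECKLISTS: dict[str, list[str]] = {
        "experimental": [
            "Was allocation to groups randomised?",
            "Were outcome assessors blinded?",
        ],
        "quasi-experimental": [
            "Is there a comparison group?",
            "Are baseline differences between groups addressed?",
        ],
        "observational (cross-sectional)": [
            "Is the sampling strategy described and appropriate?",
        ],
        "observational (longitudinal)": [
            "Is attrition reported and handled?",
        ],
        "qualitative": [
            "Is the analytic approach (e.g., coding) described?",
        ],
        "narrative/opinion": [
            "Are the author's credentials and potential conflicts of interest stated?",
        ],
    }

    def checklist_for(study_type: str) -> list[str]:
        """Return the appraisal questions for a given study type."""
        return APPRAISAL_CHECKLISTS.get(study_type, [])

    print(checklist_for("qualitative"))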
Keywords:
Evidence-based practice, critical appraisal, teaching, public administration.