MEASURING AND IMPROVING STUDENTS’ CRITICAL THINKING ABILITY
University of Minnesota (UNITED STATES)
About this paper:
Appears in: EDULEARN16 Proceedings
Publication year: 2016
Pages: 6334-6338
ISBN: 978-84-608-8860-4
ISSN: 2340-1117
doi: 10.21125/edulearn.2016.0363
Conference name: 8th International Conference on Education and New Learning Technologies
Dates: 4-6 July, 2016
Location: Barcelona, Spain
Abstract:
This presentation describes an approach to measuring critical thinking (CT) ability in college students and creating interventions to increase it. In a preliminary report (Brothen, 2014) of data from an ongoing study in an introductory psychology course enrolling more than 3,000 students over three academic terms, I described and validated a method to measure CT. Here I present additional data and conclusions from that study and report a new study in which I explore variables affecting CT change and interventions to promote it.
In higher education today, it seems that everywhere one turns, one finds commentators such as Arum and Roksa (2011) encouraging the development of CT or lamenting its lack in college students. Kuncel (2011) pointed out a problem with much of this call for increased CT in a quantitative analysis of the primary literature in the field. He concluded that a “broad definition of critical thinking” as something that applies across tasks “is generally not supported by the literature or theory”; rather, CT should be seen as “a finite set of very specific skills,” such as distinguishing correlation from causation (p. 2).

Willingham (2007) characterized CT as dependent on domain knowledge and practice. This is the approach taken in my department’s introduction to psychology course. To reinforce the importance of CT and increase it in our students, we employ several interventions. First, the textbook emphasizes six scientific thinking principles throughout. We reinforce this treatment in a main component of the course: weekly online chapter quizzes structured according to the principles described by Brothen and Wambach (2001). Second, for each textbook chapter, we assign online, computer-graded scientific thinking essays provided by the textbook publisher (Pearson Inc.). Each essay requires students to synthesize and evaluate what they have read in the text and then apply it to a real situation. Students receive computer-generated feedback on each essay, which they can use to rewrite and improve it. Third, students attend weekly discussion sections that give them directed practice in critical thinking; they work to produce, analyze, interpret, and critique psychological data. Finally, we assess changes in students’ critical thinking over the semester with the Sufficient Evidence Test (Altemeyer, 1996), administered online as a pretest during the first week of class and as a posttest after the last day of class. The instrument is a good measure of the scientific thinking we try to instill throughout the course.

In each of the studies, I obtained and recorded several items of student data, including course performance, Big Five personality scores, and academic characteristics such as ACT scores, year in college, and cumulative grade point average. All procedures reported here complied with our university’s human subjects regulations.

This series of correlational studies supports the hypothesis that critical thinking skills can be improved if students learn a set of scientific/critical thinking techniques and have them reinforced throughout the semester with examples relevant to their overall learning. The study suggests a way for educators to proceed if they want to improve critical thinking in their students: focus on the skills relevant to what they teach, make them part of the course material, and reinforce them throughout the course.
Keywords:
Critical thinking, measurement, interventions.