R3 1.21 November 27, 2023 Installing BS Detectors through Critical Thinking Training
Critical thinking is both one of the most prized outcomes of a college education and one of the hardest for our students to attain.
If you teach in higher education, I’m willing to bet that “critical thinking” is listed as an outcome somewhere in your syllabi, and maybe in all of them. It’s one of the rare areas of agreement across disciplines and philosophies: We really, really want our students to learn to think critically.
But wanting something is not the same as having a pathway to achieving it, and the research literature on critical thinking instruction is littered with barriers, problems, and failed interventions. If you assume, as I did years ago, that simply explaining what critical thinking is or telling students that they will be graded on it is enough to move the needle on this skill – think again. One expert after another (see here and here for examples) has argued that critical thinking is a lot harder to teach than it looks at first glance.
This is partly, but not entirely, due to the difficulty of defining it in the first place. I tend not to put much stock in broad, discipline-independent concepts of critical thinking; it looks different in, say, a nursing or art history course versus psychology or mathematics, and that is okay. But as in all things, if we don’t know what the goal is, it’s especially unlikely that we will get there.
None of this is to say that teaching critical thinking is not worth the attempt. Far from it; as a member of a discipline that prides itself to the point of obsession on being able to find and correct biased thinking, I’m constantly trying to work this into every course I take on. And as the authors of this issue’s feature article point out, being able to spot misinformation, in particular, is a survival skill in an age dominated by social media and online news.
Given the fact that critical thinking is both so important and so tough to teach, I was glad to see a fairly new article on the topic with a novel approach and some promising results. It focuses on the sort of critical thinking that psychologists like myself gravitate toward: noticing and resisting the commonest forms of biased and flawed reasoning. These often take the form of appealing fallacies that our brains seem drawn to, like moths to a flame. In the (mildly censored*) terminology of the article, critical thinking means learning to call bullsh*t on these tempting forms of irrationality. The way the authors taught this skill was a new implementation of a concept with plenty of history in the cognitive research literature: inductive learning, meaning practicing with examples drawn from different categories. This approach is light on explicit instruction and instead emphasizes trial and error, repeated attempts, and feedback.
Citation:
Motz, B. A., Fyfe, E. R., & Guba, T. P. (2022). Learning to call bullsh*t via induction: Categorization training improves critical thinking performance. Journal of Applied Research in Memory and Cognition, 12(3), 310–324. https://doi.org/10.1037/mac0000053
DOI:
https://doi.org/10.1037/mac0000053
Paywall or Open:
Open
Summary:
Study 1 was a pilot test of the inductive learning procedure run in a face-to-face classroom setting, with Introductory Psychology students as the participant group. Study 2 presented the procedure to adults recruited from an online research participation platform. In both studies, participants practiced with challenging, contextualized examples of major fallacies and biases in reasoning, attempting to classify examples (and rule out non-problematic examples) and receiving feedback along the way. The intervention produced significant improvement in scores on an established measure of critical thinking in psychology.
Research Questions:
Can inductive training (learning to categorize examples) on different types of fallacies and biases improve general critical thinking performance?
Sample:
Study 1 (classroom pilot): 360 students enrolled in Introductory Psychology, who completed the activities as part of their course requirements
Study 2: 253 participants, all United States-based adults over the age of 18, recruited via the Mechanical Turk research participation platform
Method/Design:
Study 1: Students received an introduction to critical thinking specifically focused on a set of cognitive fallacies and biases: confirmation bias, correlation vs. causation, experimenter bias, lack of control group, overgeneralization, and inferring systematicity from random chance. They practiced identifying and categorizing examples of these biases as a graded learning activity associated with four units in the first half of the course, spaced approximately two weeks apart. Students completed pre- and post-tests at the beginning and end of the semester, with test items adapted from the Psychology Critical Thinking Exam (PCTE).
Study 2: Participants were randomly assigned to one of three conditions: categorization practice problems involving critical thinking, categorization practice problems not involving critical thinking, or a no-intervention control. Participants received the same pre- and post-test materials as in Study 1.
The critical intervention was the inductive learning condition, in which participants were presented with examples of different biases and fallacies and asked to identify and categorize them. All participants also received standardized instructional materials on critical thinking prior to the intervention. Training took place over multiple sessions spaced an average of 1.4 days apart.
Key Findings:
Study 1: Scores on the PCTE went up from 45.8% at pre-test to 53.8% at post-test.
Study 2: Improvement from pre- to post-test was significantly and substantially higher for the inductive learning condition, relative to the other two conditions (which were not significantly different from one another).
In an interesting side finding, 85% of the Study 2 sample agreed or strongly agreed with the statement “I am skilled at critical thinking,” prior to completing the procedure.
Both studies also revealed that of all the different fallacies, correlation vs. causation was the most difficult to identify, both before and after training.
Choice Quote from the Article:
Individuals urgently need to be able to counteract growing volumes of misinformation (false and misleading information) and disinformation (intentionally misleading information; Lazer et al., 2018; Machete & Turpin, 2020). The prevalence of misleading online news and information presents real threats to democracy, public discourse, and public health (Lewandowsky et al., 2017). For example, online misinformation (that has been thoroughly debunked) about the side effects from vaccines has resulted in large-scale vaccine hesitancy, creating secondary public health crises caused not by illness, but by a contagion of fallacious claims (Horton, 2020; Poland & Spier, 2010). These claims, and our susceptibility to them, follow well-worn psychological paths, such as our human tendency to view correlational findings, or even just rare coincidences, as providing evidence of causal mechanisms. When exposed to these types of claims, one who is skilled at critical thinking should be able to recognize that mere associations do not warrant claims about one factor causing another, and to challenge the validity of claims.
Why it Matters:
The article opens with a concise and substantive literature review that both drives home the real-world importance of the type of critical thinking this article focuses on and distills several major themes from the existing literature on teaching critical thinking in college. Also useful is the fact that the training materials are all posted at https://osf.io/cg7k6 – a treasure trove for those who might want to adapt the procedure for their own courses. Notably, this procedure is multiple-choice and auto-graded, making it relatively easy to incorporate even for large sections.
I think that it’s a genius idea to use categorization as the theoretical framework for this work. Although this concept might be a bit obscure outside the discipline of cognitive psychology, within the discipline it’s something we’ve studied for decades, gaining a remarkably complete understanding of how category learning develops and just how important it is for organizing our knowledge of the world. Teaching via exposure to examples, practice, and feedback has been used effectively in laboratory procedures with everything from art appreciation to statistics, so it’s a good bet that it will be useful for critical thinking as well.
As the authors point out, this approach is a variation on the approach known as “prebunking,” a technique for teaching people to, well, call bullshit on misinformation. The analogy often used for prebunking is inoculation, where small and harmless doses of misinformation are introduced. As individuals spot and critique these small claims under controlled circumstances, they develop the ability to do the same with more challenging claims they run across unpredictably in real life.
More than anything, I think the biggest take-home from the article is this: Critical thinking – or any thinking skill – can be taught, but only with substantial amounts of practice that directly targets that skill. If teaching critical thinking really is a top priority (as it is for me), you must develop activities in which students engage in whatever critical thinking looks like in your discipline, and ensure that they get feedback as they go.
Most Relevant For:
Faculty; curriculum designers; instructional designers; leaders with responsibility for campus curriculum and learning outcome standards; librarians
Limitations, Caveats, and Nagging Questions:
As the authors point out, this work is situated in a single discipline (psychology). They argue that the approach could transplant smoothly into other subjects, and I agree, but it’s important to keep in mind that what they have developed is not an all-purpose “learn to think” intervention.
They also note that Study 1 is not a controlled experiment, and there are many other factors that could explain the improvement across pre- and post-tests (critical thinking instruction in other courses, course material other than the inductive learning exercises). Still, given the documented difficulty of producing any improvements in this area, I agree with the authors that the results from Study 1 are an encouraging proof of concept.
If you liked this article, you might also appreciate:
Halpern, D. F., & Butler, H. A. (2019). Teaching critical thinking as if our future depends on it, because it does. In J. Dunlosky & K. A. Rawson (Eds.), The Cambridge handbook of cognition and education (pp. 51–66). Cambridge University Press. https://doi.org/10.1017/9781108235631.004
Heijltjes, A., van Gog, T., Leppink, J., & Paas, F. (2015). Unraveling the effects of critical thinking instructions, practice, and self-explanation on students’ reasoning performance. Instructional Science, 43(4), 487–506. https://doi.org/10.1007/s11251-015-9347-8
Holmes, N. G., Wieman, C. E., & Bonn, D. A. (2015). Teaching critical thinking. Proceedings of the National Academy of Sciences, USA, 112(36), 11199–11204. https://doi.org/10.1073/pnas.1505329112
Lawson, T. J., Jordan-Fleming, M. K., & Bodle, J. H. (2015). Measuring psychological critical thinking: An update. Teaching of Psychology, 42(3), 248–253. https://doi.org/10.1177/0098628315587624
Niu, L., Behar-Horenstein, L. S., & Garvan, C. W. (2013). Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educational Research Review, 9, 114–128. https://doi.org/10.1016/j.edurev.2012.12.002
Prat-Sala, M., & van Duuren, M. (2022). Critical thinking performance increases in psychology undergraduates measured using a workplace-recognized test. Teaching of Psychology, 49(2), 153–163. https://doi.org/10.1177/0098628320957981
Tiruneh, D. T., Verburgh, A., & Elen, J. (2014). Effectiveness of critical thinking instruction in higher education: A systematic review of intervention studies. Higher Education Studies, 4(1). https://doi.org/10.5539/hes.v4n1p1
Willingham, D. T. (2008). Critical thinking: Why is it so hard to teach? Arts Education Policy Review, 109(4), 21–32. https://doi.org/10.3200/AEPR.109.4.21-32
File under: Thinking; critical thinking; cognitive biases
*Special bonus fact: Replacing letters in swear words with punctuation is called grawlix or profanitype.