R3 1.3 February 15, 2023: Student Choices; Small Teaching
On the mental effort associated with retrieval practice, and what Small Teaching principles look like when put to work in an introductory programming class
1. Why students do (or do not) choose retrieval practice: Their perceptions of mental effort during task performance matter.
Citation:
Hui, L., de Bruin, A. B. H., Donkers, J., & van Merriënboer, J. J. G. (2022). Why students do (or do not) choose retrieval practice: Their perceptions of mental effort during task performance matter. Applied Cognitive Psychology, 36(2), 433–444.
DOI:
https://doi.org/10.1002/acp.3933
Paywall or Open:
Open
Summary:
Students make decisions about how to study; the focus of this article is the influence of perceived mental effort, balanced against perceived learning gains, on those choices. Prior research suggests that students see more effortful study tasks as being less effective (when the reverse is likely true). It’s also unclear whether this pattern changes when students are given feedback after they use different strategies. Research participants were asked to learn names for anatomical structures using retrieval or restudy as study strategies, then to rate the mental effort and effectiveness of each one. They received feedback about their performance after trying the different strategies, then made a choice of how to study a new batch of anatomical structures. Participants who rated retrieval practice as more effortful were less likely to choose it in the future, although feedback on performance increased the likelihood of choosing it.
Research Question(s):
- Is restudy rated higher than retrieval practice in terms of subjective perceived learning?
- Is mental effort rated lower for restudy compared to retrieval practice?
- Does feedback encourage students to choose retrieval practice for future study?
- Is perceived learning a mediator in the relationship between subjective mental effort and choice of strategy? In other words, is perception of learning the reason why students avoid strategies they perceive as effortful?
- Does feedback help break up the mediating influence of perceived learning?
The researchers also set out to replicate the typical finding of better performance with retrieval practice over restudy, in the course of testing these hypotheses.
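The mediation hypothesis above can be illustrated with a toy simulation. This is not the authors' analysis or data — all values below are synthetic, and the variable names are illustrative — but it sketches a bare-bones, Baron–Kenny-style decomposition of how an indirect effect (effort → perceived learning → strategy choice) is estimated:

```python
# Toy mediation sketch: does perceived learning mediate the effect of
# perceived mental effort on choosing retrieval practice?
# Synthetic data only; not the article's actual dataset or model.
import random

random.seed(1)
n = 500
effort = [random.gauss(0, 1) for _ in range(n)]
# Mediator: higher perceived effort -> lower perceived learning (plus noise)
perceived_learning = [-0.8 * e + random.gauss(0, 0.5) for e in effort]
# Outcome: inclination to choose retrieval practice, driven by the mediator
choice = [0.9 * p + random.gauss(0, 0.5) for p in perceived_learning]

def slope(x, y):
    """Bivariate OLS slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

def slopes2(x1, x2, y):
    """OLS slopes of y on two predictors x1 and x2 (via normal equations)."""
    m1, m2, my = sum(x1) / len(x1), sum(x2) / len(x2), sum(y) / len(y)
    c1 = [a - m1 for a in x1]
    c2 = [a - m2 for a in x2]
    cy = [a - my for a in y]
    s11 = sum(a * a for a in c1)
    s22 = sum(a * a for a in c2)
    s12 = sum(a * b for a, b in zip(c1, c2))
    s1y = sum(a * b for a, b in zip(c1, cy))
    s2y = sum(a * b for a, b in zip(c2, cy))
    det = s11 * s22 - s12 * s12
    return (s22 * s1y - s12 * s2y) / det, (s11 * s2y - s12 * s1y) / det

c_total = slope(effort, choice)                # total effect of effort on choice
a = slope(effort, perceived_learning)          # path a: effort -> mediator
c_direct, b = slopes2(effort, perceived_learning, choice)  # direct path + path b
indirect = a * b                               # mediated (indirect) effect
print(f"total={c_total:.2f} direct={c_direct:.2f} indirect={indirect:.2f}")
```

In a linear model fit by ordinary least squares, the total effect decomposes exactly into direct plus indirect effects, which is what makes the a × b product interpretable as the mediated portion.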
Sample:
51 undergraduate students from a Dutch university who had never studied human anatomy before
Method/Design:
Participants were assigned to memorize anatomical names of structures shown in image form. All participants were assigned to study half of them by restudy and half by retrieval practice (described as “self-testing” to participants). After this initial phase, participants rated the mental effort and learning associated with each strategy. They were then given feedback on how well they were learning different structures with each strategy, and advised that self-testing was the more effective option. A second round of structures was then assigned, and participants chose which strategy they preferred to use. The different stages of the study were spread across four days.
Key Findings:
Notably, researchers did not replicate the typical finding of better overall performance with retrieval practice, although they did, as predicted, find that students perceived restudy as being both easier and more effective. Perceived effort had a bigger influence on choice of strategy than perceived learning; the latter did not turn out to have the substantial mediating role originally predicted. Feedback did increase the likelihood of choosing retrieval practice as a strategy.
Choice Quote from the Article:
“Although a data-driven interpretation of effort is typically common among students (Baars et al., 2020), it is under certain circumstances possible for them to reinterpret effort and take a goal-driven perspective (i.e., high effort indicates high learning, Koriat et al., 2014) when making learning strategy decisions (i.e., choosing a cognitively engaging strategy that leads to a high-learning gain) ... These interventions may help learners develop a more positive image about the value of effort, making them more willing to actively invest effort by engaging in effortful learning strategies.”
Why it Matters:
This research project builds on the (rather safe, in my opinion) assumption that retrieval practice is the best way to study. With this in mind, why don’t students do it? I’ve long suspected that it’s the effort involved, along with perhaps the emotional angle of how retrieval practice exposes where our learning is falling short – an uncomfortable, but useful thing to know before a higher-stakes test. It’s encouraging to know that when students try different strategies in a systematic way and see the results, they change their choices accordingly. In this way, the findings mesh nicely with various systems currently being developed for helping students develop metacognition, such as the Knowledge-Belief-Commitment-Planning (KBCP) framework discussed in an earlier R3 entry.
Most Relevant For:
Faculty reading groups; student success programs; researchers in the field of learning sciences; instructional designers; faculty professional development directors
Limitations, Caveats, and Nagging Questions:
While this article might fit nicely with other ideas about how to apply the findings in typical courses, it doesn’t in and of itself say how to do that. There could be some issues with external validity or generalizability given the narrow type of task and material they used (anatomy illustrations), and the somewhat contrived nature of telling the “students” exactly what to do (restudy or retrieve) in the initial phases of the procedure. Lastly, as the authors acknowledge in the Discussion section, it’s hard to tell how much of the strategy choices were guided by the actual results participants achieved in the feedback phase, and how much by the explicit advice they were given to frame that feedback. If I were replicating this study myself, I’d tone down or eliminate the advice and let the feedback stand on its own.
It’s important to note that they didn’t replicate the usual retrieval practice effect. Ironically, although the researchers were able to persuade participants (to an extent) that retrieval practice is a good way to study despite the effort involved, it didn’t pay off in this particular task. The authors chalked this up to giving only a short period of time to look at each image, and to not giving feedback on right or wrong answers during the study phase. To me, this is still a puzzling finding, and it raises the question of whether teachers ought to emphasize the “try it and reflect” approach to developing metacognition even more, to accommodate those (probably rare) cases where retrieval practice isn’t the best strategy.
If you liked this article, you might also appreciate:
Ariel, R., & Karpicke, J. D. (2018). Improving self-regulated learning with a retrieval practice intervention. Journal of Experimental Psychology: Applied, 24(1), 43–56.
Kirk-Johnson, A., Galla, B. M., & Fraundorf, S. H. (2019). Perceiving effort as poor learning: The misinterpreted-effort hypothesis of how experienced effort and perceived learning relate to study strategy choice. Cognitive Psychology, 115, 101237.
Karpicke, J. D., Butler, A. C., & Roediger, H. L. (2009). Metacognitive strategies in student learning: Do students practise retrieval when they study on their own? Memory, 17(4), 471–479. https://doi.org/10.1080/09658210802647009
Tullis, J. G., Finley, J. R., & Benjamin, A. S. (2013). Metacognition of the testing effect: Guiding learners to predict the benefits of retrieval. Memory & Cognition, 41(3), 429–442.
File under: retrieval practice; metacognition; anatomy and physiology; feedback; effort; motivation
2. Apply small teaching tactics in an introductory programming course: Impact on learning performance
Citation:
Jiang, Y. (2022). Apply small teaching tactics in an introductory programming course: Impact on learning performance. Journal of Information Systems Education, 33(2), 149–158.
Article Link:
https://jise.org/Volume33/n2/JISE2022v33n2pp149-158.html
Paywall or Open:
Open
Summary:
This article describes the approaches to and outcomes of bringing in “small teaching” techniques, in lieu of a total course redesign, in an introductory programming course. This framework involves incremental changes and relatively small additions to course design and content, based strongly on principles from cognitive psychology such as retrieval practice and interleaving, along with a focus on deep and applied learning. Learning was assessed across sections before and after the implementation of small teaching features into the course. Performance on most measures was significantly improved with the small teaching features, and the implementation proved to be relatively easy to do in this type of course.
Research Question(s):
- What are the impacts of bringing in small teaching approaches (especially those involving retrieval practice, spacing, and interleaving) on student learning, both in terms of retention and development of thinking skills in introductory programming?
- What are the impacts of these approaches on overall, long-term performance in introductory programming?
Sample: 55 students enrolled in sections of Introductory Business Programming at a mid-sized public university in the southeastern United States
Method/Design: Performance was compared before and after implementation of small teaching practices, with the pre-implementation sections serving as a control group. The instructor and overall course structure were identical, apart from the small teaching features brought in across the experimental and control sections. Quiz, homework, and exam grades served as measures of performance involving both knowledge/memory and problem solving/thinking. Examples of small teaching included having students summarize concepts taught in the previous class meeting (retrieval practice), spreading out coverage of key concepts across sections (spacing), and having students apply different variations on problem solving within a single work session (interleaving).
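As a concrete illustration of the interleaving tactic mentioned above (my own sketch, not code from the article), an instructor might alternate problem types within a single work session rather than blocking all problems of one type together:

```python
# Illustrative only: build an interleaved problem set for one session
# by round-robining across problem types (A1, B1, C1, A2, B2, C2, ...)
# instead of the blocked order (A1, A2, ..., B1, B2, ..., C1, C2, ...).
from itertools import chain, zip_longest

def interleave(*problem_sets):
    """Round-robin problems from each set, skipping exhausted sets."""
    return [p for p in chain.from_iterable(zip_longest(*problem_sets))
            if p is not None]

loops = ["for-loop ex. 1", "for-loop ex. 2"]
conditionals = ["if/else ex. 1", "if/else ex. 2"]
strings = ["string ex. 1", "string ex. 2"]

# Prints the six problems alternating type: loop, conditional, string, repeat
print(interleave(loops, conditionals, strings))
```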
Key Findings: Student performance on both the knowledge aspects (quiz scores) and thinking skills aspects (homework problems) was significantly improved in the experimental sections on most, but not all, of the different assessments.
Choice Quote from the Article:
“In contrast to drastic approaches that demand significant instructor time and effort to prepare course redesign before the beginning of a semester, each of these small teaching approaches can be designed and implemented right away with limited preparation, and none of them require extra financial or technical support. Additionally, they are accessible to instructors of all ranks and disciplines and are flexible for implementation in a specific class session, in the middle of a semester, or throughout a semester. Adopting small teaching techniques in course design and delivery is by no means an inferior choice compared to teaching techniques that require big changes such as flipped classroom or simulation games.”
Why it Matters:
This article focuses on teaching introductory programming, but its approach generalizes across a variety of fields. It taps into a framework called “small teaching,” created by author and faculty developer James Lang and further developed by online teaching expert Flower Darby. Lang and Darby’s ideas have been influential, and so I think it is important to see how the framework looks when it’s implemented in a realistic setting such as an introductory programming course. I thought it was also interesting to consider the challenges associated with this particular discipline and course, which (as the author notes) is increasingly required even for non-computer-science majors (e.g., business) and has the reputation of being a killer course among faculty as well as students. The author points out that in this course, students have to apply what they’re learning in a problem-driven way, while also having to master abstract and formal aspects of programming – concepts that, if missed early on, will compound into failure sooner or later.
This kind of course is therefore an ideal candidate for course redesign, with promising results from some existing projects. Small teaching provides a particularly powerful conceptual framework for guiding redesigns, according to the author, especially given that it focuses less on total overhaul and more on targeted interventions that are easier and cheaper than traditional, major redesign, as well as more flexible. Lastly, I think this article is a good model for other faculty across disciplines interested in systematically studying and disseminating innovations in their classes – i.e., doing and publishing SoTL.
Most Relevant For:
Computer science and information systems faculty and departments; faculty teaching and/or coordinating foundational STEM courses; instructional designers, teaching center staff, and others involved in course redesign; fans of the Small Teaching framework; those interested in pursuing SoTL
Limitations, Caveats, and Nagging Questions:
On the one hand, comparing across “experimental” and “control” sections is a solid approach for demonstrating tangible impacts of a particular intervention. On the other hand, there are some limitations to be aware of. Instructor expectations can, in theory, affect the results. It’s also often the case that several changes are rolled in all at once, which makes sense from a practical standpoint but makes it impossible to tease out which particular intervention had the most impact. It’s also important to note that not every assessment in the course showed significant improvement, for reasons that aren’t clear. The author also acknowledges choosing not to gather student opinions, which could have added some depth to the quantitative measures (quiz and homework scores).
If you liked this article, you might also appreciate:
Lang, J. M. (2016). Small Teaching: Everyday Lessons from the Science of Learning. Jossey-Bass.
https://www.amazon.com/Small-Teaching-Everyday-Lessons-Learning/dp/1118944496
Darby, F., & Lang, J. M. (2019). Small Teaching Online: Applying Learning Science in Online Classes. Jossey-Bass.
https://www.amazon.com/Small-Teaching-Online-Applying-Learning/dp/1119619092
File under:
STEM; small teaching; retrieval practice; spacing; interleaving; course redesign; computer programming