R3 1.4 March 2, 2023 Evidence of UDL Impact; Cognition and Motivation
Does UDL improve outcomes? And, the evolving link between motivation and cognition.
This issue of R3 includes a blog post along with a summary of a new meta-analysis on the impacts of Universal Design for Learning (UDL). The post is titled “Revisiting the cognition-motivation connection” – it pulls in some of the studies I’ve talked about before in this Substack, combined with some ideas and themes I’ve been working with for a while. It will be up on my blog soon after this issue posts, and the full text of it is also below.
1. Achievement of learners receiving UDL instruction: A meta-analysis
Citation:
King-Sears, M. E., Stefanidis, A., Evmenova, A. S., Rao, K., Mergen, R. L., Owen, L. S., & Strimel, M. M. (2023). Achievement of learners receiving UDL instruction: A meta-analysis. Teaching and Teacher Education, 122, 103956.
DOI:
https://doi.org/10.1016/j.tate.2022.103956
Paywall or Open:
Paywall
Summary:
Universal Design for Learning is a powerful and appealing framework, but it needs additional validation of its impact on learning. This study is a meta-analysis – essentially, a study of studies, using statistical procedures that allow findings from across different publications to be analyzed together. Studies of learners ranging from children to post-graduate adults were combined and analyzed, both for evidence of impact on learning and for overall study quality. Results revealed moderately-sized, positive impacts across a wide variety of age groups, subject material, and other factors. There was fairly large variability in the quality of the UDL studies surveyed, especially with respect to how much detail was reported.
Research Questions (excerpted from the article):
“Research Question 1: Does learner achievement differ between non-UDL and UDL-based instruction?
Research Question 2: Are there specific factors that make learner achievement differ between non-UDL and UDL-based instruction?”
Sample: Twenty studies on intentional use of UDL that met inclusion criteria (described below), narrowed from an initial group of 164 pulled from search results for closer review. These included peer-reviewed as well as non-peer-reviewed (“gray literature”) studies.
Method/Design: The meta-analysis used a number of fairly stringent criteria for inclusion of studies in the final analysis. Key among these were: a design that included treatment and control groups; measurement of learner achievement (and not just, e.g., evaluation ratings); intentional implementation of UDL, with reasonably detailed explanation of how the UDL intervention was carried out. Effect sizes and other summary statistics were pulled from individual studies and entered into the meta-analysis. Researchers also rated the quality of studies using a detailed list of criteria involving the level of detail in the reporting (e.g., whether full description of participant demographics was included), nature of the UDL implementation, and internal validity.
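The core pooling step of a meta-analysis like this one can be sketched in a few lines. This is a simplified, illustrative inverse-variance (fixed-effect) calculation; the effect sizes and variances below are hypothetical, not the values King-Sears et al. actually extracted, and real meta-analyses typically use random-effects models and heterogeneity statistics on top of this.

```python
# Illustrative sketch of inverse-variance pooling in a meta-analysis.
# All numbers here are hypothetical, not from King-Sears et al. (2023).

def pooled_effect(effects, variances):
    """Fixed-effect pooled effect size: each study is weighted by the
    inverse of its sampling variance, so more precise studies count more."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5  # standard error of the pooled estimate
    return pooled, pooled_se

# Hypothetical per-study standardized mean differences (e.g., Hedges' g)
# and their sampling variances:
effects = [0.35, 0.60, 0.48, 0.22, 0.55]
variances = [0.04, 0.09, 0.05, 0.03, 0.08]

g, se = pooled_effect(effects, variances)
print(f"Pooled effect: {g:.2f} (SE = {se:.2f})")
```

The weighting is why a single large, precise study can dominate several small ones – and why the inclusion criteria that determine which studies enter the pool matter so much to the final estimate.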
Key Findings: There was a moderately-sized positive effect of UDL on learner achievement. This pattern held up for both smaller and larger class sizes and shorter and longer intervention durations. It wasn’t completely clear whether UDL impacts were similar across subjects, but computer science, photography, and technology showed particularly large effects. UDL impacts were somewhat larger for children vs. adult learners, but not dramatically so.
Choice Quote from the Article:
“Based on the widespread and intuitive appeal that instructors use the UDL framework when designing and delivering instruction (e.g., Hollingshead et al., 2020; Rose et al., 2006), legislation has been based on preliminary evidence of UDL's promise (ESSA, 2015; Higher Education Opportunity Act, 2008). It is only with the advent of experimental studies focusing on learners' achievement that UDL's promise of effectiveness has shifted to potential evidence specific to learning. Significantly, our meta-analysis furthers that potential by careful accrual and analyses of data from studies investigating learners' achievement. With a moderate effect size ... these data provide substantive support regarding UDL's emergence as a research-based practice. ...Learner achievement differs between non-UDL and UDL-based instruction, favoring the latter.”
Why it Matters:
The authors make a great point early in the paper that a fair amount of the research on UDL hinges on weaker measures of learning, such as self-reports on surveys. There’s also a range of research designs, so that important features such as control groups aren’t a given. This meta-analysis takes a more hard-nosed approach to defining and assessing impact, with an emphasis on comparing actual student achievement between UDL and alternatives. The authors also make good points about the implementation of UDL, and how that’s assessed, since this can range from attending UDL-based professional development, to self-rated teacher familiarity with UDL, to analysis of actual lesson plans and everything in between.
Beyond the finding of an overall positive impact of UDL (which is certainly major news in and of itself), there are a few other benefits that follow from this being a well-constructed meta-analysis. Like such studies often do, it hammers home the wide variation in quality and methodology in UDL studies, clearly making the case that we need not just more research but more scrupulously designed research. There is also a chart that’s rich with detail on all the studies they did incorporate, which is a major asset for those looking to build on the research so far or just do a deep dive into the UDL literature.
Lastly, this study does highlight the fact that UDL implementation can mean different things to different people. Based on this work, I think it’s safe to assume that if you implement UDL in a substantive way that’s faithful to the original concept, you will get results similar to what this study found (i.e., a moderate-sized, positive effect). If you don’t, you probably won’t.
Most Relevant For:
Instructional designers; researchers doing work on UDL; disability support services and others responsible for accessibility and inclusion; course redesign initiatives; faculty across disciplines engaged in overhauling or improving courses for inclusion and effectiveness
Limitations, Caveats, and Nagging Questions:
Meta-analysis is a great tool but in my experience, it can be fairly conservative. You can easily find, like these authors did, that out of hundreds of initial studies, only a few are reported in enough detail, and are well enough put together, to make the final cut. Those inclusion criteria have an enormous driving effect on the overall results you find. That said, meta-analysis helps bring clarity to topics for which there is lots of research and interest, but not always a coherent story emerging from all these different studies.
Another caveat is that there are many, many subsidiary analyses in this paper – tests comparing across the gray-literature and peer-reviewed studies, different demographic groups of learners and so on. Most of these will only be of real interest to specialists, but keep in mind that what I’ve summarized here is just that, a brief summary, and if there’s a particular question that piques your interest, definitely check that out in the original as the authors probably addressed it in one way or another. Also, note that for a lot of the comparisons the authors initially set out to do, there just wasn’t enough fine-grained data to support those planned comparisons.
If you liked this article, you might also appreciate:
Tobin, T., & Behling, K. (2018). Reach Everyone, Teach Everyone: Universal Design for Learning in Higher Education. West Virginia University Press.
Hogan, K., & Sathy, V. (2022). Inclusive Teaching: Strategies for Promoting Equity in the College Classroom. West Virginia University Press.
Wilson, L.C. (2014, September 10). Introduction to meta-analysis: A guide for the novice. Association for Psychological Science. https://www.psychologicalscience.org/observer/introduction-to-meta-analysis-a-guide-for-the-novice
File under: UDL, inclusive teaching, meta-analysis, achievement of learning goals/objectives, methodology
2. Reflection
Revisiting the cognition-motivation connection: What the latest research says about engaging students in the work of learning
I sometimes tell a story about my first solo book, Minds Online: Teaching Effectively with Technology, involving a crisis that hit about two-thirds of the way through writing it. I forget what topic I’d originally planned to cover in chapter 8, but once I got to that section of the manuscript, I had a bad feeling. There was something missing, and although I wasn’t sure, I suspected it was this: student motivation.
After a few weeks of fretting and overly-dramatic email exchanges with my editor, I resolved to reshape that part of the book. What I ended up doing was synthesizing some classic research and theory in the area and looking at how those ideas might play out in fully online classes.
Granted, the idea of addressing the factors that move students to actually complete one’s carefully-designed course activities might not seem like a bold move. It was a big step for me at the time though, not least because – like a typical academic – I worried about staying comfortably inside of my micro-specialization in cognitive psychology.
But that wasn’t the only issue. In psychology, it’s only fairly recently that we’ve begun to really explore the relationship between the thinking-side and the feeling-side of the mind and brain, especially with respect to how the thinking-brain and feeling-brain influence and shape one another in a powerful set of feedback dynamics.
(As an aside, motivation does belong under the “feeling” side. Textbooks and courses almost always combine motivation and emotion in a single heading, not because there isn’t enough material to treat them separately, but because they are so closely intertwined. Evolutionarily speaking, the point of having emotions in the first place is to motivate, or move, us. Emotions move us away from some things and toward others, in the name of our own survival and the survival of our genes. They also push us to develop and maintain the social bonds that make all forms of human survival possible.)
Ever since that experience with Minds Online, I’ve advocated in various ways for tapping into the mind’s inborn mechanisms for motivation, especially those that relate to students’ goals for what they want to get out of their own education. Years back, I even toyed with a larger-scale project about the reciprocal relationship between motivation and cognition. I went so far as to develop a book proposal, titled Leading to Water: Motivating College Students to Take Action, Invest Effort, and Own Their Learning. Looking back on it now, the emphasis I’d put on accountability, resilience, and effort comes across as a bit harsh, given that today students are picking up the pieces of their education after COVID, and that in this environment, supporting student mental health takes priority over pushing students to achieve.
But even in the context of the current focus on support and flexibility, there is still a lot we can glean about teaching from the study of motivation. I’m not alone in thinking this, seeing as how there’s currently a mini-renaissance of interest in exactly this topic. The harbingers are all there - keynote titles and webinar topics centered on student engagement, articles in high-profile media outlets. I’d count variations on the engagement theme too. Intrinsic motivation. Interest. Even growth mindset – which I’d argue is still a relevant and research-backed concept – is part of the same territory.
I’m all for this surge in interest, and it got me thinking back to the research basis for it. There are the still-around-for-a-reason classic concepts in academic motivation: Self-efficacy. Intrinsic motivation. Persistence. Self-determination theory. Feedback and its role in helping to induce flow states.
There is newer work that builds on these classics, though. Much of it explicitly ties to cognitive processes like memory, attention, and thinking, as well as to effective study techniques such as retrieval practice.
This is all especially important because of one connection in particular, the one that hooked me into writing about it in the first place. It’s this: Without being motivated to put in focused effort, there’s no way for students to benefit from all the advances that have been made in the science of effective study.
I say this because practices like quizzing yourself, wrestling with difficult applied problems, and spacing out study are all especially effort-intensive, at least in the short run. With them, students won’t need to spend as many total hours hitting the books, but the hours they do spend will be more arduous – and in the case of retrieval practice, might give them initial feedback that isn’t pleasant to hear.
This is not to say that active study is necessarily unpleasant, upsetting, or a chore. However, it’s a big change from the pleasant-but-inefficient alternatives like re-reading that students default to. Even interleaving, in which you tackle different categories of problem in an unpredictable fashion in a single session, is commonly perceived as harder and more frustrating – potentially cutting students off from the demonstrated benefits of studying in this way.
With that, what does the latest research tell us about the relationship between motivation and learning?
Effortful study techniques are often the better ones, but unfortunately, students seem to perceive this relationship in reverse. One study found that research volunteers rated retrieval practice as harder and also, less effective as a study strategy, compared to passive review. (The good news is that with feedback, they re-evaluated and readily tacked over to study through retrieval.) Another study presented student volunteers with hypothetical study schedules they might use in the run-up to a math exam. Here too, students tended to reject schedules that were high in spacing and interleaving, rating them as less pleasant as well.
I want to be clear here: nothing about this work should imply that struggling students are slacking off, looking for easy grades or worst of all, that they are inherently lazy. If all of that classic research on motivation has taught us one thing, it is that motivation is best seen as a response to a situation, not a disposition you either have or you don’t. Anyone is capable of putting forth effort, when the conditions are right to do so. But it does look like the message about desirable difficulty has a long way to go in reaching students, with many of them continuing to mistake ease for effectiveness.
There’s good news that comes out of the latest research as well. One of the most encouraging things I’ve seen, as a big fan of retrieval practice, is research showing that when students answer quiz questions about a subject, they’re more likely to want to learn more about it. The key dynamic here seems to be that building up a firmly established knowledge base triggers a type of snowball effect, stimulating curiosity and setting off that type of virtuous cycle that all good teachers treasure.
Curiosity, as it turns out, is also sparked by choice – that key component of the influential self-determination theory of motivation, in which autonomy plays a central role. Using a fairly ingenious procedure involving a sham lottery, a research team found that when people get to choose a specific prize drawing from several alternatives, they become more invested in finding out the results.
And lastly, there’s exciting new work being done on the best ways to persuade students that active, effortful study really is the way to go. I say “persuade,” not just inform, because as in so many things, study habits aren’t a behavior that people change simply because they’re told they should. The KBCP framework – short for Knowledge, Belief, Commitment, Planning – is a refreshing alternative to traditional study skills instruction, the one where students are handed a soon-to-be-forgotten list of random-seeming tips about what to do and not to do.
KBCP does start with sharing information about better study practices – the “knowledge” component – but then pivots to persuading students that they do work, ideally through interactive demonstrations or in-class experiments. Students then internalize and carry forth the new practices, committing to using them, planning for how they will do this, and reflecting on the results.
I’m keenly interested in seeing these developments continue, and not just because they recombine concepts from psychology in ways that delight and intrigue me as a disciplinary expert. If we really are going to emerge from the crisis of the last three years stronger, better, and more committed to serving our students, we will need to attend both to what we ask students to do and to why they should do it. If we are going to take full advantage of the massive and growing research base on learning, we’ll need to make our approaches appealing. If we are going to be truly transparent with students about the paths to success, we’ll need to persuasively share the best ways to study. We can’t do it without igniting motivation, engagement, and drive.
Further Reading
Abel, M., & Bäuml, K. H. T. (2020). Would you like to learn more? Retrieval practice plus feedback can increase motivation to keep on studying. Cognition, 201, 104316. https://doi.org/10.1016/j.cognition.2020.104316
Cavanagh, S.R. (2016). The Spark of Learning: Energizing the College Classroom with the Science of Emotion. West Virginia University Press.
D’Mello, S., & Graesser, A. (2012). Dynamics of affective states during complex learning. Learning and Instruction, 22(2), 145–157. https://doi.org/10.1016/j.learninstruc.2011.10.001
Hui, L., de Bruin, A. B. H., Donkers, J., & van Merriënboer, J. J. G. (2022). Why students do (or do not) choose retrieval practice: Their perceptions of mental effort during task performance matter. Applied Cognitive Psychology, 36(2), 433–444. https://doi.org/10.1002/acp.3933
Romero Verdugo, P., van Lieshout, L. L. F., de Lange, F. P., & Cools, R. (2022). Choice boosts curiosity. Psychological Science. https://doi.org/10.1177/09567976221082637
Shen, L., & Hsee, C. K. (2017). Numerical nudging: Using an accelerating score to enhance performance. Psychological Science. https://doi.org/10.1177/0956797617700497