R3 2.1 January 8, 2024 Do Students Learn More from Making Their Own Digital Flashcard Decks?
Welcome to the beginning of the second year of the R3 newsletter – with new findings on an old technique.
Welcome to the 2024 iteration of R3! I’ve been thrilled by the growing interest in this newsletter (thank you for that!), and I’m looking to keep that momentum going in the year to come.
I’ll continue scouring the scholarly literature for the most interesting, useful, and thought-provoking research in learning sciences, especially work that also weaves in some combination of educational technology, inclusion, and student success. I’ll continue to tailor the discussion to the community of professionals working to improve college pedagogy: instructional designers, faculty, faculty professional development leaders, and ed tech folks. Every now and again, I’ll review new books in the field, pull together favorite resources, and offer some opinions of my own. It’s my sincere hope that you will find R3 a useful way to stay in touch with the ever-changing, and sometimes overwhelming, research literature informing how we design and teach college courses.
With that, I’ll kick off this year of R3 with a recent study offering an intriguing new take on one of the oldest and most popular study strategies around: flashcards.
Citation:
Pan, S. C., Zung, I., Imundo, M. N., Zhang, X., & Qiu, Y. (2023). User-generated digital flashcards yield better learning than premade flashcards. Journal of Applied Research in Memory and Cognition, 12(4), 574–588.
DOI:
https://doi.org/10.1037/mac0000083
Paywall or Open:
Paywall
Summary:
Now that flashcards have gone digital, students have thousands of premade online study decks to choose from – but do they learn more when they make their own? The prediction that they would follows straightforwardly from the depth of processing principle (Craik & Lockhart, 1972), by which engaging in effortful, meaning-based processing of information increases the likelihood of remembering that information. However, previous research has produced mixed results on whether this prediction holds up in practice. Across a series of six experiments, the authors found consistent support for the idea that creating one’s own card decks – even when doing so consumes a larger share of one’s available study time – produces better retention than using premade ones.
Sample:
Undergraduate students recruited from a university study participation pool; approximately 50–70 participants in each of the six studies.
Method/Design:
The basic methodology for all six studies involved presenting passages of typical college-level educational text (approximately 500 words each). Participants read and then studied these passages using flashcards prepared in a specified way, within a browser-based digital flashcard interface. Forty-eight hours after this initial study phase, participants took a test on the studied material.
The design was within-participants, meaning that all participants tried both ways of preparing the cards. The order of the two methods (self-generated vs. premade) and the particular passage assigned to each were counterbalanced across participants. In the premade condition, participants were told that the deck of cards (focusing on key terms from the passage) had already been prepared, and were instructed to spend their time learning the terms by quizzing themselves and checking their answers. In the self-generated condition, they were given a list of the key terms and told to prepare a flashcard for each, then to study the cards they’d created. Overall study time was held constant, so participants had the same total time either to study the premade flashcards or to prepare and then study the self-generated ones.
Across the six studies, the main variation was in the exact instructions for preparing cards in the self-generated conditions: copying text verbatim from passage to card (Experiment 1); copying and pasting (Experiment 2); generating examples (Experiment 3b); and several variations on paraphrasing (Experiments 3a, 4a, and 4b).
Key Findings:
In Experiment 1, there was no difference in performance between the self-generated and premade card conditions. However, a significant and fairly large advantage for self-generated cards (on the order of 10 percentage points) emerged when participants generated cards by copying and pasting, paraphrasing, or creating examples connected to the different terms. In general, the more active and deep the processing involved in flashcard use, the better participants did. The lack of a self-generation advantage in Experiment 1 may reflect not only the relatively passive process of verbatim copying, but also the time-consuming nature of this method, which left far less time to study the cards after generating them.
Choice Quote from the Article:
The present results suggest that the common practice of using freely available flashcard sets—which many learners do for convenience, despite concerns about quality (what Zung et al., 2022, called an “ease–accuracy trade-off”), and with greater frequency than premade paper flashcards—can impair learning efficacy. Accordingly, one of the chief selling points of many digital flashcard platforms, namely the millions of premade flashcard sets (including sets prepackaged with textbooks and other educational products), may not be as compelling as currently thought. Fortunately, the solution is quite simple: use flashcard-making features.
Why it Matters:
Flashcards are a big deal. As the authors point out, around 50% of college students report using premade flashcard decks in particular. It’s a rare student who hasn’t at least attempted flashcards as a study technique, and so correcting a bias toward a less-effective way of going about it could have some far-reaching effects. Any time a student tries retrieval practice and sees a benefit is a win in my book, and if that benefit is even a little bit more obvious to them, due to better technique – so much the better.
You might think that this was all a foregone conclusion – that generating your own cards would have to be superior, given the effort and at least some level of focused attention and thought that would go into the task. But this presumed generation effect doesn’t always pan out in practical learning situations. For example, in an old series of experiments my colleague Laurie Dickson and I conducted, we found that authorized crib cards (notes that students are allowed to bring into an exam) didn’t do much to improve exam performance, despite the thinking and generation that goes into making such cards (Dickson & Miller, 2005). In fact, as a later study showed, students actually did better when using cards prepared by an expert, compared to their self-made cards (Dickson & Miller, 2006).
So self-prepared materials don’t always outshine pre-prepared ones, but in the case of flashcards – used more during the creation of memories for material than as an aid to retrieval – they definitely do. This is one piece of solid advice we can offer to students about one of their go-to strategies: make your own. Especially now, given that sources like Quizlet have been around forever and that AI can help generate digital study materials with incredible ease, this is an important fact to share. And who knows – maybe this one tidbit will get students interested in hearing more of our evidence-based expert study guidance.
One last take-away has to do with the poor results found for self-generated cards in the verbatim-transcription condition (and only this condition). In my ideal world, this finding would put to rest the myth that hand-copying somehow burns information into the brain, a stubbornly persistent notion that helps fuel an irrational level of belief in the superiority of handwritten over typed notes. (For a deeper discussion of the anti-laptop furor and research contradicting it, check out my latest book, Remembering and Forgetting in the Age of Technology, or this blog post.)
Most Relevant For:
Faculty; educational technology designers; instructional designers; student success program leaders and staff.
Limitations, Caveats, and Nagging Questions:
It’s important not to over-simplify the pattern of results (“premade cards don’t work,” etc.), given that in one of the conditions (the one involving verbatim transcription) there really was no difference. Especially if we’re communicating these findings to students in the form of study advice, I think an important talking point is that any retrieval practice is better than none, and flashcard practice probably does have some substantial benefits over, say, passive rereading, even if students can’t or decide not to make their own card decks.
One conclusion does come through loud and clear, though: the continued support for depth of processing as a major factor in what we retain from flashcard practice (and probably by extension, all kinds of other study techniques as well).
If you liked this article, you might also appreciate:
Craik, F. I., & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11(6), 671–684. https://doi.org/10.1016/S0022-5371(72)80001-X
Dickson, K. L., & Miller, M. D. (2005). Authorized crib cards do not improve exam performance. Teaching of Psychology, 32, 230–232.
Dickson, K. L., & Miller, M. D. (2006). Effect of crib card construction and use on exam performance. Teaching of Psychology, 33, 39–40.
Karpicke, J. D., & Roediger, H. L. (2008). The critical importance of retrieval for learning. Science, 319, 966–968. https://doi.org/10.1126/science.1152408
Yeung, K. L., Carpenter, S. K., & Corral, D. (2021). A comprehensive review of educational technology on objective learning outcomes in academic contexts. Educational Psychology Review, 33, 1583-1630. https://doi.org/10.1007/s10648-020-09592-4
File under: digital study aids; ed tech; retrieval practice; depth of processing; study techniques