These two revision strategies can prepare you for an exam much better than just restudying your notes
A new study has highlighted two revision strategies that serve university students better than simply restudying.
10 March 2020
When studying for exams, it can be tempting to just re-read textbooks or attempt to memorise your notes. But psychologists know that there are actually much more effective ways of learning — they just require a bit of extra effort.
A recent paper in Applied Cognitive Psychology has highlighted two of these superior strategies. The team found that university students whose revision involved testing themselves or making up questions about course material performed better in a later exam than those who simply restudied their notes.
Past research had already shown that generating questions or being tested during the learning process helps people retain information better than passively trying to absorb knowledge. These strategies are considered "desirable difficulties" that make the learning process harder or more effortful, but which are ultimately beneficial. But many previous studies only examined people's ability to learn in the lab, or had tested them very soon after the learning phase. In real life, of course, people learn in places like schools and universities, and tests may occur days or weeks after studying.
So Mirjam Ebersbach and colleagues from the University of Kassel decided to test the effectiveness of these strategies in an actual educational context. The team recruited 82 German university students in a developmental psychology lecture and assigned them to one of three conditions. All three groups were given a print-out of 10 slides from the lecture (which was about the development of knowledge in infants), but each group studied the material in different ways.
The "restudy" group was told to simply memorise the content of each slide. The "testing" group was given a question for each slide which they had to answer (they were allowed to look at the content in the slides if they weren't able to answer initially). Finally, the "generating questions" group had to come up with test questions themselves, based on the content of each slide.
A week later, participants were given a surprise exam based on the information they had studied. Students had to answer five factual questions that directly related to the content they had learned, as well as five "transfer" questions, in which they had to apply the knowledge to new contexts.
Overall, participants who had only restudied the material scored a little under 45% on the test. By contrast, those who had answered or generated questions during the learning phase scored 11 percentage points higher, on average. Both of these strategies were significantly better than restudying, but neither was better than the other.
When the researchers analysed scores on the factual and transfer questions separately, they found some preliminary evidence that the two methods produced better results than restudying for both kinds of question. This suggests that the strategies help people to not only learn facts, but also apply that knowledge to new contexts.
Of course, the study has some important limitations. Most notably, the sample size was very small — groups ranged from 22 to 30 participants — so it will be important to replicate the findings in larger samples. And, as the researchers point out, their participants were all university students and so were presumably fairly good learners, or at least had plenty of experience studying and sitting exams. It remains to be seen whether these strategies are just as helpful in real-life situations for younger people or those who face greater difficulty absorbing new information.
Still, the results have clear applications for those looking to boost their learning. "Our results suggest that the two learning strategies, which are both clearly more effective than simple restudying, can be recommended by (university) teachers to learners and can also be recommended for (self-regulated) learning," the authors conclude.
About the author
Matthew Warren (@MattbWarren) is Editor of BPS Research Digest