Yang et al. (2021) published their meta-analysis just this week. The first post in this series looked at the broad point of the ecological validity of testing research, covering both backward and forward testing effects, and concluded with the main finding:
The current meta-analysis finds a reliable advantage of testing over other strategies in facilitating learning of factual knowledge, concept comprehension, and knowledge application in the classroom. Overall, testing is not only an assessment of learning but also an assessment for learning.
Yang et al. (2021)
Now to the second bite: Questions 2 to 6.
Q2. Against What Comparison Treatments Does Quizzing Enhance Learning?
Simple: although students prefer to restudy, testing outperforms both restudying (g = 0.330) and other elaborative strategies (g = 0.095). Testing is overall a more powerful method to enhance classroom learning.
Reflections: Embed testing routines into your classes, knowing that out-of-class quizzing is less effective. Time assigned to building and securing knowledge is transferable and deepens thinking. It is an investment.
Q3. Does Quiz Format Matter?
Here was a surprise: matching was the format with the highest effect size. When you think about the processes and “thinking effort” involved, perhaps it should not have been such a surprise (g = 0.913 for matching; g = 0.773 for fill-in-the-blank; g = 0.638 for short answer; g = 0.567 for multiple choice).
Reflections: An interesting connection to Marzano’s work on manipulatives. Offer uneven pairs, e.g. include a red herring. The odd-one-out adds an additional challenge to the matching task.
Q4. Can Knowledge Tested in One Format Be Retrieved to Answer Questions Presented in a Different Format?
Consistent test formats (g = 0.531) are associated with a significantly larger effect size than inconsistent formats (g = 0.399); material matching significantly modulated the testing effect. Not forgetting that testing also enhances the learning of untested materials.
Reflection: Which is to be expected, no? An inconsistent format adds a layer of disguise or unfamiliarity. Beware the inherent dangers of fluency. Coherence between learning mode and testing mode (even when testing is the learning) needs to be a variable educators are aware of.
Q5. Does Testing Benefit Untested Knowledge?
Testing significantly benefits untested knowledge, although to a smaller extent (g = 0.321) than that for tested knowledge (g = 0.512). Testing not only enhances learning of facts but also facilitates knowledge comprehension and application.
Reflection: “Semantic coherence between tested and untested material is a key modulator of the enhancing effect on untested materials,” and one would also think prior contextual knowledge matters, e.g. having previously studied Shakespeare. As noted above, educators need to be aware of learning-testing coherence.
Q6. Should Corrective Feedback Be Offered?
Offering corrective feedback following class quizzes (g = 0.537) significantly increases learning gains over not providing feedback (g = 0.374). Corrective feedback induces greater re-exposure and larger learning gains.
Reflection: From a practical and experiential standpoint, recent observations using Classroom have shown that knowing and expecting corrective feedback not only re-exposes the learner to the correct or preferred answer, it importantly promotes psychological safety, with self-assessment promoting learner agency and confidence in the classroom. This is an area that requires deeper investigation. It sits alongside testing, within the realms of metacognition and self-regulation.
To know what you know and what you do not know, that is true knowledge.
Confucius
This growing security and agency is what @MrClassics3 has coined “The Deadpool Effect”: the impact of using Classroom for testing, and the protocol adopted. It is a post I am very much looking forward to exploring and writing with him in the not-too-distant future.
13 more questions to go…
Opening post | Questions 2-6 | Questions 7-11 | Questions 12-15 | Questions 16-19 |