Publication Detail
Heterogeneity and Publication Bias in Research on Test-Potentiated New Learning
Publication Type: Journal article
Authors: Boustani S, Shanks DR
Publisher: University of California Press
Publication date: 31/01/2022
Journal: Collabra: Psychology
Volume: 8
Issue: 1
Article number: 31996
Status: Published
Language: English
Keywords: publication bias, meta-analysis, testing effect, test-potentiated new learning
Publisher URL:
Notes: This is an open access article distributed under the terms of the Creative Commons Attribution License (4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Abstract
Prior retrieval practice potentiates new learning. A recent meta-analysis of this test-potentiated new learning (TPNL) effect by Chan, Meissner, and Davis (2018) concluded that it is a robust and reliable finding (Hedges’ g = 0.44). Although Chan et al. discussed three different experimental designs that have been employed to study TPNL, we argue that their meta-analysis failed to adequately distinguish the findings from these different designs, acknowledge the significance of the substantial between-study heterogeneity across all pooled effects, and assess the degree of publication bias in the sample. We conducted a new meta-analysis that assessed the designs separately and applied appropriate corrections for publication bias. We found that studies using a standard design yield weak evidence of a TPNL effect, studies using pre-testing yield a small but reliable effect, and studies using interleaving designs yield weak evidence of a negative effect. Compared to Chan et al.’s conclusions, these reanalyses cast TPNL in a very different light and point to a pressing need for preregistered experiments to assess its reproducibility in the absence of publication bias.
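For readers unfamiliar with the effect-size metric quoted in the abstract, Hedges' g is Cohen's d multiplied by a small-sample correction factor. A minimal illustrative sketch of the computation (the group means, standard deviations, and sample sizes below are invented, not taken from the paper):

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp  # Cohen's d
    # Approximate correction factor J, which shrinks d slightly for small samples
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)
    return j * d

# Hypothetical recall proportions for a tested vs. untested group
g = hedges_g(0.75, 0.20, 30, 0.65, 0.20, 30)  # ≈ 0.49
```

A meta-analysis such as the one reanalyzed here pools many such per-study g values, weighting each by the inverse of its sampling variance.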