Psychology’s Renaissance

Friday, September 14, 2018 - 12:00
ESADE, Barcelona


In 2010-2012, a few largely coincidental events led experimental psychologists to realize that their approach to collecting, analyzing, and reporting data made it too easy to publish false-positive findings. This sparked a period of methodological reflection that we review here and call “psychology’s renaissance.” We begin by describing how psychology’s concerns with publication bias shifted from worrying about file-drawered studies to worrying about p-hacked analyses. We then review the methodological changes that psychologists have proposed and, in some cases, embraced. In describing how the renaissance has unfolded, we attempt to describe different points of view fairly but not neutrally, so as to identify the most promising paths forward. In so doing, we champion disclosure and pre-registration, express skepticism about most statistical solutions to publication bias, take positions on the analysis and interpretation of replication failures, and contend that “meta-analytical thinking” increases the prevalence of false positives. Our general thesis is that the scientific practices of experimental psychologists have improved dramatically.

Suggested Reading:

Simonsohn, Nelson, & Simmons (2014), "P-Curve: A Key to the File-Drawer," Journal of Experimental Psychology: General, 143(2), 534-547

Simonsohn, Simmons, & Nelson (2015), "Specification Curve: Descriptive and Inferential Statistics for All Plausible Specifications"

Simmons, Nelson, & Simonsohn (2011), "False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant," Psychological Science, 22(11), 1359-1366

And a few blog posts:
DataColada[64] "How to pre-register a study"
DataColada[55] "The file-drawer problem is unfixable, and that’s OK"
DataColada[33] "The Effect Size Does Not Exist"