Brian Nosek Aug 27
We replicated 21 social science experiments published in Science or Nature. We succeeded with 13. Replication effect sizes were about half those of the originals. All materials, data, code, & reports: , preprint , Nature Human Behaviour
Brian Nosek Aug 27
Using prediction markets we found that researchers were very accurate in predicting which studies would replicate and which would not. (blue=successful replications; yellow=failed replications; x-axis=market closing price)
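As a minimal sketch of how that accuracy can be quantified, read each market's closing price as a forecast probability that the study replicates and score the forecasts; the prices and outcomes below are made-up placeholders, not the actual SSRP market data.

```python
# Illustration only: scoring market closing prices as replication forecasts.
# Prices and outcomes are hypothetical placeholders, NOT the SSRP results.
prices = [0.85, 0.72, 0.31, 0.90, 0.18, 0.55, 0.40]         # closing prices (0-1)
replicated = [True, True, False, True, False, True, False]  # did the study replicate?

# Brier score: mean squared error of the probabilistic forecasts (lower is better).
brier = sum((p - int(r)) ** 2 for p, r in zip(prices, replicated)) / len(prices)

# Simple classification view: price > 0.5 counts as a prediction of "will replicate".
accuracy = sum((p > 0.5) == r for p, r in zip(prices, replicated)) / len(prices)

print(f"Brier score: {brier:.3f}")
print(f"Accuracy:    {accuracy:.0%}")
```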
Brian Nosek Aug 27
Design ensured 90% power to detect an effect size half as large as the original study's. Replications averaged 5x the sample size of the originals. We obtained original materials in all but one case, and original authors provided very helpful feedback on the designs.
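To make that design concrete, here is a minimal power-calculation sketch of the kind it implies, assuming a hypothetical original two-group study with d = 0.6 and n = 50 per group (illustrative values, not any of the 21 studies):

```python
# Sketch of the replication sizing logic: 90% power to detect half the
# original effect size. The original d and n here are hypothetical.
from statsmodels.stats.power import TTestIndPower

original_d = 0.6   # hypothetical original effect size (Cohen's d)
original_n = 50    # hypothetical original sample size per group

replication_n = TTestIndPower().solve_power(
    effect_size=original_d / 2,  # target: half the original effect
    alpha=0.05,
    power=0.90,
    alternative="two-sided",
)

print(f"Replication n per group: {replication_n:.0f}")               # ~235
print(f"Ratio to original n:     {replication_n / original_n:.1f}x")  # ~4.7x
```

Because the required n scales roughly with 1/d², halving the detectable effect about quadruples the sample, and targeting 90% power adds more on top, which lines up with replications averaging about 5x the original sample sizes.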
Brian Nosek Aug 27
Eight original authors & Malcolm Macleod wrote commentaries, published in Nature Human Behaviour, offering a variety of insights. Pay particular attention to their ideas about why our results might have differed from the originals. See the commentaries linked to
Brian Nosek Aug 27
For successful replications, effect sizes were about three-quarters of the original effect sizes. For failures, replication effect sizes were near zero. This could suggest the influence of publication bias and false positives, but it does not demand that conclusion.
Brian Nosek Aug 27
Replying to @m_sendhil @eshafir
An outstanding example of replication fostering deeper understanding is the Shah, @m_sendhil, & @eshafir paper. We failed to replicate Study 1 of their 5 studies. Then they conducted replications of all 5 themselves. They also failed on Study 1, but the other studies replicated:
Brian Nosek Aug 27
Replying to @RolfZwaan
The social-behavioral sciences are undergoing a reformation. First, replication is becoming a normal activity for reducing uncertainty and improving understanding of novel claims. See et al.
Brian Nosek Aug 27
Second, social scientists are adopting preregistration to improve rigor and to demarcate confirmatory from exploratory results (the latter opening possibilities, but with greater uncertainty). Preregistration has doubled each year since 2012:
Brian Nosek Aug 27
Growth in preregistration and Registered Reports is nonlinear, particularly in the social-behavioral sciences, and most particularly (so far) in psychology.
Brian Nosek Aug 27
Third, journals are adopting transparency policies to improve reproducibility. Ex: We reviewed 33 journals that publish social-personality psychology. In 2013, 0 had transparency policies. Today, 73% have some policy & 58% are relatively assertive based on the TOP (Transparency and Openness Promotion) guidelines:
Brian Nosek Aug 27
Replying to @improvingpsych
The social-behavioral sciences' self-scrutiny is changing the norms for rigor and reproducibility. Change is occurring through grassroots efforts by individuals and labs, and through policy interventions by disciplinary leaders and editors. E.g.,
Brian Nosek Aug 27
Change is also spreading. Ex: More than 100K researchers are using the Open Science Framework to register their research and make data and materials more accessible. All fields will benefit from this reformation toward greater rigor and transparency.
Brian Nosek Aug 27
Vox: "What scientists learn from failed replications: how to do better science."
Brian Nosek Aug 27
Science: "‘Generous’ approach to replication confirms many high-profile social science findings"
Brian Nosek Aug 27
Nature: "Researchers replicated 62% of social-behaviour findings published in Science and Nature — a result matched almost exactly by a prediction market."
Brian Nosek Aug 27
Washington Post: "the experiments scrutinized in this latest effort were published prior to a decision several years ago by Science, Nature and other journals to adopt new guidelines designed to increase reproducibility, in part by greater sharing of data"
Brian Nosek Aug 27
Wired: "Thousands of researchers now pre-register their methodology and hypothesis before publication, to head off concerns that they’ll massage data after the fact. "
Brian Nosek Aug 27
NPR: "'The social-behavioral sciences are in the midst of a reformation'... Scientists are... announcing in advance the hypothesis they are testing; they are making their data and computer code available so their peers can evaluate and check their results."
Brian Nosek Aug 27
The Atlantic: "Fortunately, there are signs of progress. The number of pre-registered experiments—in which researchers lay out all their plans beforehand to obviate the possibility of p-hacking—has been doubling every year since 2012. "
Brian Nosek Aug 27
Science News: "‘Replication crisis’ spurs reforms in how science studies are done"