Well, here it
is, our call for proposals regarding pre-registered replication studies on cognition.
When I say “our” I hasten to add that we stole and adapted the text of this
call, with their permission, from Brian
Nosek and Daniël Lakens
who are guest editing a special
issue of Social Psychology. Their original text is a great model for how
replication studies should be solicited; calls like this might become a standard feature of
many journals in the years to come.
Our special
issue will be an interesting experiment, the results of which we are awaiting
with some trepidation. How many submissions can we expect? No idea. Are we
opening the floodgates or will it be a slow trickle? Only time will tell. We take courage
in our Dutch heritage (and I don’t mean Dutch courage). Taming the waters is in our blood.
Call for Proposals
Special Issue of Frontiers
in Cognition
“Replications of Important Results in Cognition”
Guest Editors: René Zeelenberg &
Rolf A. Zwaan
A signature strength of science is that the evidence is
reproducible. However, direct
replications rarely appear in psychology journals because standard incentives
emphasize novelty over verification (for background see Nosek, Spies, & Motyl, 2012, Perspectives on Psychological Science).
This special issue, “Replications of Important Results in Cognition,” alters
those incentives. We invite
proposals for high-powered, direct replications of important results in all
areas of cognitive psychology, ranging from perception to social cognition. The
review process will focus on the soundness of the design and analysis, not
whether the outcome is positive or negative.
What are important results?
Importance is subjective but demonstrable. Proposals must justify the replication
value of the finding to be replicated. To merit publication in this issue, the
original result should be important (e.g., highly cited, a topic of intense
scholarly or public interest, a challenge to established theories), but also
should have uncertain truth-value (e.g., few confirmations, imprecise estimates
of effect sizes). The prestige of the original publishing journal is not
sufficient to justify replication value.
What replication formats are
encouraged?
Proposals should be for direct replications that faithfully
reproduce the original procedure, materials, and analysis for verification.
Conceptual replications that attempt to improve theoretical understanding by
changing the operational definition of the constructs will not be considered
for this issue. Articles in the issue can take two forms:
(1) Registered
replication. Authors submit
the introduction, methods, and analysis plan for a replication study or
studies. These proposals will be reviewed for their importance and soundness.
Once provisionally accepted, if authors complete the study as proposed, the
results will be published without regard to the outcome. Registered replication
proposals could also include: (a) collaborations between two or more
laboratories independently attempting to replicate an effect with the same
materials, (b) joint replication by the original laboratory and another
laboratory, or (c) adversarial collaborations in which laboratories with
competing expectations prepare a joint registered proposal and conduct
independent replications. Only adequately powered tests of results with high
replication value will be considered.
(2) Registered replication + existing replication attempts. Researchers may already have performed several experiments
attempting to replicate published findings. These experiments may be included
in the submission, but each submission should include at least one registered
replication. Authors should report the experiments they have already completed
(including the results) and describe the registered experiment that they plan
to run.
How do I propose a
replication project?
Interested authors should contact the guest editors before
preparing a formal proposal (Rolf Zwaan, zwaan@fsw.eur.nl; René Zeelenberg,
zeelenberg@fsw.eur.nl). These pre-proposal discussions will occur in early 2013, with the
special issue scheduled for publication in 2014. Deadlines for the formal
proposal and final manuscript depend on the type of project. Registered replication proposals should
be submitted by April 1, 2013 to leave time for initial review, revision,
provisional acceptance, data collection, manuscript preparation, final review,
and acceptance of the final report.
Registered Replications
Replication teams submit a replication proposal for review
prior to initiating data collection.
Peer review includes an author of the original study and other relevant
experts. Review addresses two
questions: (1) Does the finding have high replication value? and (2) Is the
proposed design a fair replication?
If accepted, the proposal is registered at the Open Science Framework (http://openscienceframework.org/)
or equivalent registration site, and then data collection may commence.[1] Proposals are accepted for publication
conditional on following through with a competent execution of the proposed
design.
Registered Replication
Proposal
Replication proposals include a short introduction that
describes the to-be-replicated finding, summarizes the existing evidence about
the finding, and articulates why the finding has high replication value. If the authors have already performed
one or more replication attempts and wish to include those in their final paper,
these experiments and their results should be described in the proposal. The
methods section is the central part of the replication proposal. Because data collection for the
registered experiment cannot start before the paper is provisionally accepted, there
are no results or discussion sections for registered experiments in the proposal.
The following should be included in the methods section:
1. Sampling plan. A power analysis based on the effect size of the existing evidence
for the finding; planned sample size, manner of recruitment, and anticipated
sample characteristics. Ideally, studies will achieve .95 power, so that the probability of
erroneously rejecting the null hypothesis equals the probability of erroneously retaining it.
When such power is not feasible, provide a justification. (A minimal power-analysis sketch appears after this list.)
2. Materials and procedure. Ideally, authors obtain the original study materials to
maximize comparability of the replication attempt. If not feasible or
desirable, explain why. Procedures
are described as completely as possible so that reviewers can identify
potential design improvements.
Additional material - such as videos simulating experimental conditions
- can be made available to enhance transparency and review.
3. Known differences from original study. No replication is exact. Fair replications reproduce the
features considered critical for obtaining the result. Authors describe known differences and
explain why these are not critical for a fair replication of the original
finding. In general, authors should avoid making “improvements” on research
designs unless a reasonable expert would agree that the changes improve the
sensitivity of the design to detect the effect.
4. Confirmatory analysis plan. The ideal analysis plan includes an
executable script that would process the data and produce the confirmatory
results (a sketch of such a script appears after this list). At minimum, authors
should describe the data cleaning plan (i.e., exclusion criteria for participants
and observations) and the analysis procedure for evaluating the replication
attempt. The plan should also describe the basis for evaluating the success of the
replication.
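To make the sampling plan concrete, here is a minimal power-analysis sketch in Python, assuming a two-group design, a two-sided test at alpha = .05, and an effect size of d = 0.5 taken from hypothetical prior evidence; none of these numbers come from the call itself.

# Minimal power-analysis sketch. The design (two-sample t-test), alpha level,
# and effect size d = 0.5 are illustrative assumptions, not prescriptions.
import math
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,          # estimate from the existing evidence (assumed)
    alpha=0.05,               # Type I error rate
    power=0.95,               # so that the Type II error rate also equals .05
    ratio=1.0,                # equal group sizes
    alternative="two-sided",
)
print("Planned sample size:", math.ceil(n_per_group), "participants per group")

Under these assumptions the plan comes out to roughly 105 participants per group; a smaller assumed effect size drives the required sample up quickly.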
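Similarly, a confirmatory analysis script might look like the following sketch; the file name, column names, and exclusion cutoffs are hypothetical placeholders standing in for whatever a real proposal registers.

# Hypothetical confirmatory analysis script. "replication_data.csv" and the
# column names and cutoffs are placeholders, not part of the call. Assumes one
# row per participant with mean rt, proportion-correct accuracy, and condition.
import pandas as pd
from scipy import stats

data = pd.read_csv("replication_data.csv")

# Data cleaning plan: registered exclusion criteria.
data = data[data["accuracy"] >= 0.80]        # exclude participants below 80% accuracy
data = data[data["rt"].between(200, 2000)]   # exclude implausible mean reaction times (ms)

# Registered confirmatory test comparing the two conditions.
treatment = data.loc[data["condition"] == "treatment", "rt"]
control = data.loc[data["condition"] == "control", "rt"]
t_stat, p_value = stats.ttest_ind(treatment, control)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print(f"n(treatment) = {len(treatment)}, n(control) = {len(control)}")

Shipping such a script with the proposal lets reviewers see exactly how the registered analysis will be run before any data exist.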
Registering the Replication
Project
Accepted proposals, and all shareable materials, are
registered and made available publicly through the Open Science Framework (http://openscienceframework.org/)
or equivalent registration venue.
Other research teams, including the original study authors,
will have the opportunity to submit proposals to conduct a parallel replication
following the registered project protocol. Interested teams should email one of the editors of the
special issue for information on applying. If accepted, these parallel
replications will be very short independent papers that quickly follow the “primary”
report, summarizing design alterations, confirmatory results, and a
brief conclusion.
Final Reports
After data collection and analysis, authors add a results section, a discussion section, and an
abstract. The results section is
separated into two parts: confirmatory analysis and exploratory analysis. The confirmatory analysis section reports
the outcome of the registered analysis plan. Authors may add exploratory analyses to further examine the
finding; exploratory analyses
should not address questions orthogonal to verification of the target of
replication. The discussion
section summarizes the findings, draws conclusions about the results, and
identifies limitations of the research.
Authors are encouraged to report a meta-analytic estimate and confidence
interval for the study combined with the original study and any other
replications.
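As an illustration of such a combined estimate, here is a minimal fixed-effect (inverse-variance) meta-analysis sketch over standardized mean differences; the effect sizes and sample sizes are invented for demonstration only.

# Minimal fixed-effect meta-analysis sketch. The (d, n1, n2) triples below are
# invented; a real report would use the original study and its replications.
import math

studies = [(0.55, 30, 30), (0.20, 105, 105), (0.31, 80, 80)]

weights, weighted_effects = [], []
for d, n1, n2 in studies:
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))  # approx. variance of Cohen's d
    weights.append(1 / var_d)
    weighted_effects.append(d / var_d)

d_meta = sum(weighted_effects) / sum(weights)
se_meta = math.sqrt(1 / sum(weights))
print(f"Meta-analytic d = {d_meta:.2f}, "
      f"95% CI [{d_meta - 1.96 * se_meta:.2f}, {d_meta + 1.96 * se_meta:.2f}]")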
After acceptance of the final report, and any other parallel
replications, the original study authors may be invited to write a brief
commentary. The commentary could
also include reporting of a parallel replication attempt.