Call for commentaries on target paper “Replication value as a function of citation impact and sample size”


Researchers seeking to replicate original research often need to decide which of several relevant candidate studies to select for replication. Different strategies for study selection have been proposed, utilizing a variety of observed indicators as criteria for selection. However, the goal of study selection is usually not made explicit in these proposals, nor is it made clear how that goal relates to the proposed indicators. In a forthcoming article in Meta-Psychology, we propose a concrete quantitative method for estimating the replication value of a study, intended to help researchers identify which original studies are most in need of replication. The article proposes that replication value can be estimated from a combination of original study sample size and citation count. These indicators are then explicitly related to the goal of maximizing the expected utility of claims in the literature, the rationale for which is provided in Isager et al. (2023).

In our view, it is important for an article discussing the value of a scientific practice to receive criticism, feedback, and viewpoints from a diverse range of people who are interested in the topic. We believe there is much room for critical discussion of the study selection strategy we propose. We have already encountered differing viewpoints, for instance about technical aspects of the measurement model that supports the indicator we propose (RVCn), about the underlying theoretical model we use to justify the indicator, and about potential alternative operationalizations of replication value. We would like to facilitate and create a record of this discussion by inviting comments on our proposal. Ideally, the invitees would include stakeholders with a vested interest in the problem we tackle (deciding which studies should be prioritized for replication) but with no vested interest in the particular solution we offer. This way, shortcomings of, improvements upon, and alternatives to our proposed study selection strategy can be brought to the fore as quickly as possible.

Here, we ask members of the scientific community to provide substantive criticism, interpretation, or elaboration on the article and to submit their comments to Meta-Psychology. Our goal is ultimately to publish these comments alongside the final version of the target paper. Comments may be supportive or critical, may focus on the broad picture or on more specific aspects of it, and may even contain alternative proposals for how replication study selection could be carried out. We expect the final version of the target paper to be significantly improved through this process. We also hope that, by engaging the community this way, we will motivate as many colleagues as possible to join us in reflecting on how scientists could select claims for which more evidence is gathered through replication.


  • 2024-07-30: Deadline for submission.
  • Regular commentaries should not exceed 1500 words and must link to the target article in a note on their front page.
  • If you have questions, please contact the special issue editor, Felix Schönbrodt, before submission.
  • Submit your commentary to Meta-Psychology via the regular workflow (including uploading a preprint on PsyArXiv). In the submission form, please select “Special Topic” from the drop-down list.
  • Commentaries will be sent to an editorial review team made up of both board members and external peer reviewers.


Isager, P. M., Van Aert, R. C. M., Bahník, Š., Brandt, M. J., DeSoto, K. A., Giner-Sorolla, R., Krueger, J. I., Perugini, M., Ropovik, I., Van ’T Veer, A. E., Vranka, M., & Lakens, D. (2023). Deciding what to replicate: A decision model for replication study selection under resource and knowledge constraints. Psychological Methods, 28(2), 438–451.