Preregistration specificity and adherence: A review of preregistered gambling studies and cross-disciplinary comparison

DOI:

https://doi.org/10.15626/MP.2021.2909

Keywords:

Preregistration, Open Science, Gambling, Addiction, Meta-science, Researcher degrees of freedom

Abstract

Study preregistration is one of several “open science” practices (e.g., open data, preprints) that researchers use to improve the transparency and rigour of their research. As more researchers adopt preregistration as a regular practice, examining the nature and content of preregistrations can help identify the strengths and weaknesses of current practices. The value of preregistration, in part, relates to the specificity of the study plan and the extent to which investigators adhere to this plan. We identified 53 preregistrations from the gambling studies field meeting our predefined eligibility criteria and scored their level of specificity using a 23-item protocol developed to measure the extent to which a clear and exhaustive preregistration plan restricts various researcher degrees of freedom (RDoF; i.e., the many methodological choices available to researchers when collecting and analysing data, and when reporting their findings). We also scored studies on a 32-item protocol that measured adherence to the preregistered plan in the study manuscript. We found gambling preregistrations had low specificity levels on most RDoF. However, a comparison with a sample of cross-disciplinary preregistrations (N = 52; Bakker et al., 2020) indicated that gambling preregistrations scored higher on 12 (of 29) items. Thirteen (65%) of the 20 associated published articles or preprints deviated from the protocol without declaring as much (the mean number of undeclared deviations per article was 2.25, SD = 2.34). Overall, while we found improvements in specificity and adherence over time (2017-2020), our findings suggest the purported benefits of preregistration—including increasing transparency and reducing RDoF—are not fully achieved by current practices. Using our findings, we provide 10 practical recommendations that can be used to support and refine preregistration practices.
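To make the analyses summarised above concrete, the sketch below illustrates in R the kind of computations the abstract describes: an item-wise comparison of specificity scores between the gambling and cross-disciplinary samples with Benjamini–Hochberg correction (Benjamini & Hochberg, 1995), a summary of undeclared deviations per article, and interrater agreement via the irr package (Gamer et al., 2019). All data, item scales, and variable names are simulated placeholders assumed for illustration; this is not the authors' scoring data or analysis code.

```r
# Minimal sketch with simulated placeholder data (not the study's data).
library(irr)  # interrater agreement coefficients (Gamer et al., 2019)

set.seed(2909)
n_items <- 29  # items assumed comparable across the two protocols

# Hypothetical specificity scores (0-2 per item) for 53 gambling and
# 52 cross-disciplinary preregistrations.
gambling   <- matrix(sample(0:2, 53 * n_items, replace = TRUE), nrow = 53)
cross_disc <- matrix(sample(0:2, 52 * n_items, replace = TRUE), nrow = 52)

# Item-wise comparison of specificity scores, adjusted for multiple testing
# with the Benjamini-Hochberg false discovery rate procedure.
p_raw <- sapply(seq_len(n_items), function(i) {
  wilcox.test(gambling[, i], cross_disc[, i], exact = FALSE)$p.value
})
p_adj <- p.adjust(p_raw, method = "BH")
sum(p_adj < .05)  # number of items differing after correction

# Undeclared deviations per published article: hypothetical counts for
# 20 articles, summarised as a mean and SD (cf. the abstract).
deviations <- c(0, 0, 1, 2, 5, 3, 0, 4, 1, 6, 2, 0, 3, 1, 0, 7, 2, 4, 3, 1)
mean(deviations); sd(deviations)

# Interrater agreement for one protocol item scored by two coders,
# using weighted Cohen's kappa from the irr package.
rater1 <- sample(0:2, 53, replace = TRUE)
rater2 <- rater1; rater2[sample(53, 5)] <- sample(0:2, 5, replace = TRUE)
kappa2(data.frame(rater1, rater2), weight = "squared")
```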

References

Allen, C., & Mehler, D. M. A. (2019). Open science challenges, benefits and tips in early career and beyond. PLOS Biology, 17(5), e3000246.

Bakker, M., Veldkamp, C. L. S., van Assen, M. A. L. M., Crompvoets, E. A. V., Ong, H. H., Nosek, B. A., Soderberg, C. K., Mellor, D., & Wicherts, J. M. (2020). Ensuring the quality and specificity of preregistrations. PLOS Biology, 18(12), e3000937.

Benjamini, Y., & Hochberg, Y. (1995). Controlling the False Discovery Rate: A Practical and Powerful Approach to Multiple Testing. Journal of the Royal Statistical Society. Series B (Methodological), 57(1), 289–300.

Bernaards, C. A., & Sijtsma, K. (2000). Influence of Imputation and EM Methods on Factor Analysis when Item Nonresponse in Questionnaire Data is Nonignorable. Multivariate Behavioral Research, 35(3), 321–364.

Blaszczynski, A., & Gainsbury, S. M. (2019). Editor’s note: Replication crisis in the social sciences. International Gambling Studies, 19(3), 359–361.

Bosnjak, M., Fiebach, C., Mellor, D. T., Mueller, S., O’Connor, D., Oswald, F., & Sokol-Chang, R. (2021). A Template for Preregistration of Quantitative Research in Psychology: Report of the Joint Psychological Societies Preregistration Task Force.

Center for Open Science. (2020). Impact report 2020: Maximizing the impact of science together.

Chen, T., Li, C., Qin, R., Wang, Y., Yu, D., Dodd, J., Wang, D., & Cornelius, V. (2019). Comparison of Clinical Trial Changes in Primary Outcome and Reported Intervention Effect Size Between Trial Registration and Publication. JAMA Network Open, 2(7), e197242–e197242.

Claesen, A., Gomes, S. L. B. T., Tuerlinckx, F., & Vanpaemel, W. (2019). Preregistration: Comparing Dream to Reality (preprint).

Cliff, N. (1993). Dominance statistics: Ordinal analyses to answer ordinal questions. Psychological Bulletin, 114(3), 494–509.

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Routledge.

Dickersin, K., & Rennie, D. (2003). Registering clinical trials. JAMA, 290(4), 516–523.

Frankenhuis, W. E., & Nettle, D. (2018). Open Science Is Liberating and Can Foster Creativity. Perspectives on Psychological Science, 13(4), 439–447.

Gamer, M., Lemon, J., & Singh, I. F. P. (2019). irr: Various coefficients of interrater reliability and agreement [R package version 0.84.1]. https://CRAN.R-project.org/package=irr

Goldacre, B., Drysdale, H., Dale, A., Milosevic, I., Slade, E., Hartley, P., Marston, C., Powell-Smith, A., Heneghan, C., & Mahtani, K. R. (2019). COMPare: A prospective cohort study correcting and monitoring 58 misreported trials in real time. Trials, 20(1), 118.

Haven, T. L., & Van Grootel, L. (2019). Preregistering qualitative research. Accountability in Research, 26(3), 229–244.

Head, M. L., Holman, L., Lanfear, R., Kahn, A. T., & Jennions, M. D. (2015). The Extent and Consequences of P-Hacking in Science. PLOS Biology, 13(3), e1002106.

Heirene, R. M. (2020). A call for replications of addiction research: Which studies should we replicate and what constitutes a ‘successful’ replication? Addiction Research & Theory, 0(0), 1–9.

Heirene, R. M., & Gainsbury, S. M. (2020). Can the open science revolution revolutionise gambling research? The BASIS.

Kaplan, R. M., & Irvin, V. L. (2015). Likelihood of Null Effects of Large NHLBI Clinical Trials Has Increased over Time. PLOS ONE, 10(8), e0132382.

Kupferschmidt, K. (2018). More and more scientists are preregistering their studies. Should you? Science.

Lakens, D. (2019). The Value of Preregistration for Psychological Science: A Conceptual Analysis.

LaPlante, D. A. (2019). Replication is fundamental, but is it common? A call for scientific self-reflection and contemporary research practices in gambling-related research. International Gambling Studies, 19(3), 362–368.

Livingstone, C., & Cassidy, R. (2014). The problem with gambling research. Retrieved September 2, 2019, from http://theconversation.com/the-problem-with-gambling-research-31934

Louderback, E. R., Gainsbury, S. M., Heirene, R. M., Amichia, K., Grossman, A., Bernhard, B. J., & LaPlante, D. A. (2022). Open Science Practices in Gambling Research Publications (2016–2019): A Scoping Review. Journal of Gambling Studies.

Louderback, E. R., Wohl, M. J. A., & LaPlante, D. A. (2020). Integrating open science practices into recommendations for accepting gambling industry research funding. Addiction Research & Theory, 1–9.

Moher, D., Bouter, L., Kleinert, S., Glasziou, P., Sham, M. H., Barbour, V., Coriat, A.-M., Foeger, N., & Dirnagl, U. (2020). The Hong Kong Principles for assessing researchers: Fostering research integrity. PLOS Biology, 18(7), e3000737.

Moher, D., Shamseer, L., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., Shekelle, P., Stewart, L. A., & PRISMA-P Group. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Systematic Reviews, 4(1), 1.

Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., du Sert, N. P., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), 1–9.

Nature Human Behaviour. (2020). Tell it like it is. Nature Human Behaviour, 4(1), 1.

Norris, S. L., Holmer, H. K., Ogden, L. A., Fu, R., Abou-Setta, A. M., Viswanathan, M. S., & McPheeters, M. L. (2012). Selective Outcome Reporting as a Source of Bias in Reviews of Comparative Effectiveness. Rockville, MD: Agency for Healthcare Research and Quality.

Nosek, B. A., Beck, E. D., Campbell, L., Flake, J. K., Hardwicke, T. E., Mellor, D. T., van ’t Veer, A. E., & Vazire, S. (2019). Preregistration Is Hard, And Worthwhile. Trends in Cognitive Sciences.

Nosek, B. A., Ebersole, C. R., DeHaven, A. C., & Mellor, D. T. (2018). The preregistration revolution. Proceedings of the National Academy of Sciences, 115(11), 2600–2606.

Ofosu, G., & Posner, D. N. (2019). Pre-analysis Plans: A Stocktaking (preprint). MetaArXiv.

Ofosu, G., & Posner, D. N. (2020). Do Pre-analysis Plans Hamper Publication? AEA Papers and Proceedings, 110, 70–74.

Pham, M. T., & Oh, T. T. (2021). Preregistration Is Neither Sufficient nor Necessary for Good Science. Journal of Consumer Psychology, 31(1), 163–176.

R Core Team. (2020). R: A language and environment for statistical computing [manual]. R Foundation for Statistical Computing. Vienna, Austria. https://www.R-project.org/

Romano, J., Kromrey, J., Coraggio, J., Skowronek, J., & Devine, L. (2006). Exploring methods for evaluating group differences on the NSSE and other surveys: Are the t-test and Cohen’s d indices the most appropriate choices?

Rubin, M. (2020). Does preregistration improve the credibility of research findings? The Quantitative Methods for Psychology, 16(4), 376–390.

Schäfer, T., & Schwarz, M. A. (2019). The Meaningfulness of Effect Sizes in Psychological Research: Differences Between Sub-Disciplines and the Impact of Potential Biases. Frontiers in Psychology, 10, 813.

Scheel, A. M. (2022). Why most psychological research findings are not even wrong. Infant and Child Development, 31(1), e2295.

Schulz, K. F., Altman, D. G., & Moher, D. (2010). CONSORT 2010 Statement: Updated guidelines for reporting parallel group randomised trials. BMJ (Clinical Research Ed.), 340, c332.

Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant. Psychological Science, 22(11), 1359–1366.

Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2021). Pre-registration is a Game Changer. But, Like Random Assignment, it is Neither Necessary Nor Sufficient for Credible Science. Journal of Consumer Psychology, 31(1), 177–180.

Stewart, L., Moher, D., & Shekelle, P. (2012). Why prospective registration of systematic reviews makes sense. Systematic Reviews, 1(1), 7.

Vassar, M., Roberts, W., Cooper, C. M., Wayant, C., & Bibens, M. (2020). Evaluation of selective outcome reporting and trial registration practices among addiction clinical trials. Addiction, 115(6), 1172–1179.

Wicherts, J. M., Veldkamp, C. L. S., Augusteijn, H. E. M., Bakker, M., van Aert, R. C. M., & van Assen, M. A. L. M. (2016). Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-Hacking. Frontiers in Psychology, 7.

Wohl, M. J. A., Tabri, N., & Zelenski, J. M. (2019). The need for open science practices and well-conducted replications in the field of gambling studies. International Gambling Studies, 1–8.

Published

2024-07-01

Section

Original articles