Assessing rigor and impact of research software for hiring and promotion in psychology: A comment on Gärtner et al. (2022)

Authors

  • Andreas M. Brandmaier Department of Psychology, MSB Medical School Berlin, Berlin, Germany; Center for Lifespan Psychology, Max Planck Institute for Human Development, Berlin, Germany; Max Planck UCL Centre for Computational Psychiatry and Ageing Research, Berlin, Germany https://orcid.org/0000-0001-8765-6982
  • Maximilian Ernst Center for Lifespan Psychology, Max Planck Institute for Human Development, Berlin, Germany; Max Planck School of Cognition, Leipzig, Germany
  • Aaron Peikert Center for Lifespan Psychology, Max Planck Institute for Human Development, Berlin, Germany; Max Planck UCL Centre for Computational Psychiatry and Ageing Research, Berlin, Germany; Department of Imaging Neuroscience, University College London, London, UK

DOI:

https://doi.org/10.15626/MP.2023.3715

Keywords:

research software, open science, metrics, rigor, impact

Abstract

Based on four principles of more responsible research assessment in academic hiring and promotion, Gärtner et al. (2022) suggested an evaluation scheme for published manuscripts, reusable data sets, and research software. This commentary responds to the proposed indicators for evaluating research software contributions. Acknowledging research software as a critical component of modern science, we argue that any such evaluation scheme must emphasize two major dimensions: rigor and impact. More broadly, we believe that research software should be recognized as valuable scientific output in academic hiring and promotion, in the hope that this incentivizes the development of more open and better research software.

Published

2024-03-17 (updated 2024-07-15)

Section

Special Topic