Commentary: 'Responsible Research Assessment II: A specific proposal for hiring and promotion in psychology'

Authors

  • Andreas M. Brandmaier MSB Medical School Berlin https://orcid.org/0000-0001-8765-6982
  • Maximilian Ernst Max Planck Institute for Human Development
  • Aaron Peikert Max Planck Institute for Human Development

DOI:

https://doi.org/10.15626/MP.2023.3715

Keywords:

research software, open science, metrics, rigor, impact

Abstract

Based on four principles of more responsible research assessment in academic hiring and promotion, Gärtner et al. (2022) suggested an evaluation scheme for published manuscripts, reusable data sets, and research software. This commentary responds to their proposed indicators for evaluating research software contributions. Acknowledging research software as a critical component of modern science, we propose that an evaluation scheme must emphasize two major dimensions: rigor and impact. More generally, we believe that research software should be recognized as valuable scientific output in academic hiring and promotion, in the hope that this incentivizes the development of more open and better research software.

Published

2024-03-17

Section

Special Topic