Assessing rigor and impact of research software for hiring and promotion in psychology: A comment on Gärtner et al. (2022)
DOI: https://doi.org/10.15626/MP.2023.3715

Keywords: research software, open science, metrics, rigor, impact

Abstract
Based on four principles of more responsible research assessment in academic hiring and promotion processes, Gärtner et al. (2022) suggested an evaluation scheme for published manuscripts, reusable data sets, and research software. This commentary responds to the proposed indicators for evaluating research software contributions in academic hiring and promotion processes. Acknowledging that research software is a critical component of modern science, we propose that an evaluation scheme must emphasize the two major dimensions of rigor and impact. More generally, we believe that research software should be recognized as valuable scientific output in academic hiring and promotion, in the hope that this will incentivize the development of more open and better research software.
Published versions:
- 2024-07-15 (2)
- 2024-03-17 (1)
License: Copyright (c) 2024 Andreas M. Brandmaier, Maximilian Ernst, Aaron Peikert. This work is licensed under a Creative Commons Attribution 4.0 International License.