Commentary and Analysis of “Ten Simple Rules for Good Research Practice”, a paper by Schwab, Janiaud, Dayan, Amrhein, Panczak, Hemkens, Ramon, Rothen, Senn, Furrer, and Held (June 23, 2022)
Combating a well-known crisis
The authors present ten straightforward rules to tackle the reproducibility crisis in science. These rules rest on the observation that differing conclusions across studies often result from varied analytical methods applied to the same datasets, an issue worsened by increasingly cross-disciplinary research. The authors focus mainly on methodological improvements rather than ethical or legal concerns, advocating for better education across institutions to improve research practices. The process begins with specifying and refining the research question, assessing prior studies systematically, and determining whether the research is confirmatory or exploratory. Next comes a detailed study protocol covering everything from the hypothesis to data processing. The authors emphasize that researchers should justify their sample size carefully to avoid false negatives and overestimated effect sizes, allow for missing data, and consult a statistician early.
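The sample-size point can be made concrete with a back-of-the-envelope power calculation. The sketch below uses the standard normal approximation for a two-group comparison of means; the numbers and the function name are my own illustration, not taken from the paper.

```python
import math
from statistics import NormalDist

def sample_size_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate n per group for a two-sample comparison of means,
    given a standardized effect size (Cohen's d), a two-sided alpha,
    and a target power, using the normal approximation."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for a two-sided test
    z_beta = z.inv_cdf(power)            # quantile for the desired power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Halving the assumed effect size roughly quadruples the required n,
# which is why optimistic effect-size guesses lead to underpowered studies.
print(sample_size_per_group(0.8))  # 25 per group
print(sample_size_per_group(0.4))  # 99 per group
```

This is exactly the kind of calculation the authors suggest doing early, and with a statistician: the required sample size is highly sensitive to the assumed effect size.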
The authors highlight three key points that stand out to me. First, they emphasize prioritizing public needs over those of the research community. Second, they stress the importance of pre-registering study protocols and differentiate between pre-registration needs for confirmatory versus exploratory analyses. Third, they call for better research data lifecycle practices to ensure that research can be replicated and extended effectively.
The authors' recommendations align closely with Croson and Gächter's "10 Commandments of Economic Science," particularly those for experimentalists: do not hypothesize after the results are known; do not criticize theory without proposing an improved alternative, whether informal or formal; select experimental parameters and designs that genuinely test theories, ideally distinguishing them from competing theories; and replicate, and encourage replication, by making your data, instructions, and software publicly available. Additionally, Munafò et al.'s "A Manifesto for Reproducible Science" delves deeper into the data management and lifecycle issues discussed in this paper. Under methods, the manifesto emphasizes protecting against cognitive biases, improving methodological training, providing independent methodological support, and promoting collaboration and team science. For reporting and dissemination, it stresses study pre-registration, better reporting quality, and safeguards against conflicts of interest. The manifesto also identifies key stakeholders for each proposal, noting that the extent of current adoption varies across initiatives.
Data Management
The authors recommend that a comprehensive data management plan be in place and that researchers follow the FAIR principles (Findable, Accessible, Interoperable, Reusable) for data handling and storage. Bias reduction is key, using strategies such as randomization, clear communication, and standardized data collection. During study execution, researchers should avoid questionable research practices such as p-hacking and interpret statistical significance cautiously.
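The hazard of p-hacking can be illustrated with a small calculation (my illustration, not from the paper): if a researcher tests k independent true null hypotheses at a 5% significance level and reports whichever result comes out "significant", the chance of at least one false positive grows quickly with k.

```python
# Probability of at least one false positive when testing k
# independent true null hypotheses at a fixed significance level.
alpha = 0.05
for k in (1, 5, 20):
    p_any_false_positive = 1 - (1 - alpha) ** k
    print(f"{k:2d} tests -> {p_any_false_positive:.0%} chance of a 'discovery'")
```

With 20 such tests the chance of a spurious "discovery" is about 64%, which is why pre-registration and cautious interpretation of significance matter.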
Although the authors address the need for improved data management practices during studies, they miss the opportunity to extend these practices to the entire research publication process. Incorporating data management and lifecycle practices as standard procedures in publication could reduce redundancy and enhance collaboration across disciplines and journals. This broader approach might also address referee reporting issues discussed by Berk, Harvey, and Hirshleifer (2017) in their article "How to Write an Effective Referee Report and Improve the Scientific Review Process" (Journal of Economic Perspectives).
Remaining questions
Is improving research data management and lifecycle practices the key to solving the reproducibility crisis, or could it slow down scientific progress?