This is a guest editorial and opinions may not reflect official COPE guidance or policies.
The gap between promise and reality
Jasmine Jamshidi-Naeini, Andrew W. Brown, Wasiuddin Najam, Colby J. Vorland, Stephanie Dickinson, David B. Allison
Department of Epidemiology and Biostatistics, Indiana University School of Public Health-Bloomington, Bloomington, IN, USA
Department of Biostatistics, University of Arkansas for Medical Sciences, Little Rock, AR, USA
Arkansas Children's Research Institute, Little Rock, AR, USA
*Correspondence: David B. Allison; email@example.com
Indiana University School of Public Health-Bloomington, 1025 E 7th St, PH 111 Bloomington, IN, USA 47405 (812) 855-1250
A prerequisite for the reproducibility of science is the availability of original input data (and statistical code) to independent investigators. Many journals now require articles to include a Data Availability Statement (DAS). However, we have found that promises of data sharing are often not upheld, even when the DAS indicates that data can be accessed upon [reasonable] request.
Part of the problem we have faced arises from subjective interpretations of what qualifies as a "reasonable request" and of who is authorized to grant or deny approval. In one example, we had concerns about the validity of the statistical methods employed in an article whose DAS stated that data would be available upon reasonable request. We therefore requested the de-identified raw data, both to reproduce the published analyses using the same methods and to test the hypotheses using statistical approaches established as appropriate for the study design. The data were not made available to us. The authors stated that they preferred "not to engage in extensive post-hoc analyses without a clear rationale," despite our clearly stated rationale of reproducing the results and re-analyzing the data with appropriate methods. This exchange highlighted a lack of adherence to principles of reproducibility. The scientific community as a whole needs to take steps to help all parties understand and take responsibility for their roles in assuring the integrity and trustworthiness of the research record. This vitally includes the duty to provide the materials necessary for others to reproduce and verify one's published findings.
In a second example, the Editor-in-Chief (EIC) acknowledged our request as "appropriate and reasonable" and said so to the study authors whose data were sought. The EIC, however, indicated that they lacked the authority to enforce making the data available. We believe this is not fully accurate: an EIC can (and arguably should) retract a paper if an author has pledged to make data available but then does not do so upon request. Such a discrepancy between a statement made in a paper and reality may, in some cases, constitute falsification (as defined by the Office of Research Integrity). The authors denied our request for the data, citing reasons such as the dataset being "huge" and containing identifiable information. They also indicated their intention to conduct additional analyses addressing different research questions. We believe the outcome in this scenario could have been more positive had the journal established mechanisms and guidelines for properly enforcing its data sharing policies. Additionally, it is important for authors to recognize that sharing data for the purpose of reproducing published results will not compromise any further analyses they intend to conduct on their data.
We have also encountered cases where the EIC's intervention led to positive change. In one notable example, the authors initially declined our request for data, even in anonymized form, citing ethical approval and consent concerns. Upon the EIC's intervention, the journal asked the authors to provide documentation from their institutional review board regarding the permissibility of sharing de-identified data. Following discussions with their review board, the authors determined that de-identified data could, in fact, be shared with us subject to a data use agreement.
Cross-border regulations may sometimes lead to data being withheld despite the DAS. In another example, the authors had committed to making their data available upon request and pending approval. However, they were unable to fulfill this commitment because of the judgment of the European Court of Justice on the adequacy of the EU-U.S. Privacy Shield Framework for personal data. An erratum was therefore published to correct the original DAS. Such regulations, in our experience, do not make data sharing impossible. We have had experiences in which corresponding authors from other countries provided remote guest access to their institutions' servers, allowing us to conduct analyses and export results. Such experiences highlight the value of cooperation between researchers in promoting data availability in a manner that respects regulatory requirements. Importantly, the responsibility lies with authors and editors to be aware of relevant regulations at the time of publication so that they can be accurately articulated in the DAS.
The inclusion of a DAS in articles should promote actual data sharing, recognizing that data sharing is a fundamental step towards increasing the reproducibility and trustworthiness of science. However, as indicated by Ian Hussey (Hussey, 2023), “the presence of Data Availability Statements that are not adhered to or enforced in any way risks giving rise to what is referred to as ‘Openwashing’: the appearance of transparency without adequate follow-through.”
COPE members represent a wide diversity of opinions, and the purpose of COPE Guest Editorials is to provide a venue for these opinions within COPE conversations and to elevate issues that members are grappling with. The topics of the above guest editorial, namely transparency, reproducibility, and accountability, are important ones. It should be noted that the COPE Retraction Guidelines do not include inconsistencies in data sharing as a reason for retraction.
COPE short survey on data sharing policies
Please consider completing a quick survey to further inform us of opinions on this topic.