Autism research paper earns ‘expression of concern’ over unavailable data | Spectrum
PLOS data: Under the PLOS journals’ guidelines, researchers must make all data necessary to replicate their study’s results publicly available upon publication.
Last month, the journal PLOS ONE added an editorial “expression of concern” to a brain-imaging study of autism published seven years ago, after a dispute over access to the study’s stimuli and raw data.
For the study – cited 138 times, according to Clarivate Analytics’ Web of Science – the researchers used electroencephalography (EEG) to record brain activity in 20 autistic and 20 non-autistic people while the participants listened to audio clips of the same syllable repeated in a happy, angry, or neutral voice.
Non-autistic people showed large changes in brain activity when hearing emotional speech sounds, the study reported, but autistic people did not, which the team interpreted as difficulty recognizing vocal emotions. “All data on which the results are based are unrestrictedly available,” the researchers wrote in 2014.
But since then, the researchers have declined to share their EEG data, or the acoustic stimuli used to collect those data, with other researchers. The refusal not only contradicts the researchers’ original statement, but also violates the journal’s guidelines.
PLOS ONE published the expression of concern on August 16 after “a reader contacted PLOS after having difficulty accessing the underlying raw and stimulus data,” says David Knutson, senior communications manager at PLOS.
The reader’s message prompted the journal’s editors to reach out to the researchers, who “stated that they were unable to share the raw data underlying their study due to data-sharing restrictions imposed by the institutional review board in the study’s ethics approvals and consent forms,” Knutson says. “PLOS recognizes this limitation, but authors should make such restrictions clear in their data availability statement at the time of submission.”
The researchers shared a processed EEG dataset that could be used only to replicate the paper’s statistical analysis. They also told the journal that they had patented the auditory stimuli and therefore could not share them publicly, according to the editorial note.
Senior investigator Yawei Cheng, director of the Institute of Neuroscience at National Yang Ming Chiao Tung University in Taiwan and a member of PLOS ONE’s editorial board when the paper was published, did not respond to Spectrum’s request for comment.
In March 2014, four months before Cheng’s study appeared, the PLOS journals changed their guidelines to require researchers to make all data necessary to replicate their results publicly available, without restriction, upon publication. If there are legal or ethical reasons to withhold a publication’s data, the policy states that authors must indicate how others can gain access to the data.
But only about 20 percent of studies in PLOS ONE report depositing their data in an online repository, the journal’s preferred method of data sharing, according to a 2018 analysis of more than 47,000 articles published between March 2014 and May 2016. Another 7.4 percent describe datasets with restricted access, and 70 percent state that all data are reported directly in the text and supplementary materials. That last form of reporting is problematic, the 2018 analysis’ authors note, because studies often contain only summary data rather than raw values and therefore cannot easily be replicated.
“It’s not uncommon for researchers not to provide their raw data, and there could be many reasons,” says Mark Rothstein, professor of law and medicine and director of the Institute for Bioethics, Health Policy and Law at the University of Louisville School of Medicine in Kentucky, who was not involved in the study.
Many journals and funding agencies, including the U.S. National Institutes of Health, do not require data to be publicly shared, but that agency has announced new data-sharing guidelines set to take effect in 2023.
In the case of Cheng’s study, “the journal’s editors may feel they were misled, and I’m not saying the authors misrepresented anything, but the authors did not specify exactly what the availability of the data would be,” Rothstein says. “They said the raw data would be available and so on, but what was actually shared was somewhat less than that. I think that’s the problem.”
Cite this article: https://doi.org/10.53053/XMSF7452