At a time when science is grappling with a reproducibility crisis, getting researchers to share their data openly is widely regarded as being a key part of the solution.
However, much of the value of accessible research data depends on the academic community trusting it enough to reuse it in further studies, and a new report has thrown that trust into doubt.
According to the findings of Research data infrastructures in the UK, academics often “do not trust other researchers’ data”, and the sector seems unsure as to who has overall responsibility for ensuring basic data quality. The report goes on to say that for both “creators and users of research data”, it is “frequently unclear…what has been or will be done, and by whom” to ensure data quality.
“The provision of detailed documentation on provenance and on analytical procedures [is] critically important; but requirements for quality assurance can be multi-layered, difficult and time-consuming, and responsibilities for ensuring that data does indeed conform to basic quality standards are frequently not clearly defined,” the report stated.
In the section on “quality issues”, the task force found that funders, universities, journals, publishers, repositories and data centres all have roles to play, leaving researchers and prospective users of others’ data unsure of what actions have been or will be taken to ensure basic standards.
It also pointed to the “lack of availability of reviewers” as contributing to the problem, because they have “few incentives to engage in the ill-defined task of data review”.
The report follows a survey conducted by Elsevier and Leiden University earlier this year, in which trust also emerged as an issue. Thirty-four per cent of academics who responded admitted that they did not publish their own figures, with some saying they did not “like the idea that others might abuse or misinterpret their data, let alone take credit for it”.
Pam Thomas, chair of the task force commissioned by the UK government to produce the new report and pro vice-chancellor for research at the University of Warwick, told Times Higher Education that the “perceived level of mistrust is probably greater than the reality”, and said that researchers “do trust each other’s data all the time”. But she added that the issue sat within a very “political landscape”.
“It’s important to discuss what data should be made open and to what extent should it be verifiable and [have] gone through quality checks before it’s put up for other people to use.
“What we want to start with is verifiable data that has, in some sense, been validated as a useful, well-collected dataset that doesn’t have major flaws.”
The report also highlighted the “sub-optimal” mechanisms available to find research data, calling current means “crude”. The “selection, storage, and preservation” of data was also criticised, with the task force noting that outside certain fields, there existed a “lack of common understanding” of what data should be stored, as well as where, when and for how long.