Including more international voices on assessment panels, introducing metrics to measure impact and making sure that every academic fits into the defined categories are some ways the research excellence framework could be improved.
These are the thoughts of Julia Black, pro-director for research at the London School of Economics, who also said the way the REF 2014 results were released to institutions was “painful”.
Professor Black, who was speaking at a conference on the REF organised by the Higher Education Policy Institute and Elsevier, said there was an issue with the national nature of REF panels.
“Only going to the national societies, not going to the international societies and not having an international dimension to the assessors is making it a bit parochial,” she said.
She added that one way of introducing an international benchmark to the REF could be to get other countries to use the same process. “But I think a more direct way would be to get more international peer reviewers in; otherwise it is a little bit as if we are marking our own homework,” she said.
Professor Black also said that some academics at LSE were not submitted to the REF because they did not fit into any category.
“That shouldn’t happen. If their research is of sufficient quality they should be allowed to [fit in] somewhere,” she said.
Improvements could also be made in the way results are released to universities, she said. Institutions received their own results a day before they could access scores for the whole sector, and the Higher Education Statistics Agency issued related data on staff numbers a day after that.
Professor Black likened the system of releasing the data gradually over three days to “Chinese drip torture”.
Metrics for impact could also be considered, as this is the area “where the most subjective assessments are being made”, she added.
Andy Westwood, vice-president of public affairs at the University of Manchester, predicted that future REFs would involve greater expectations of institutions and less money.
“Dual support and quality-related [funding] may not survive intact. I would be even more surprised if it does survive without more strings attached to it,” he said. He added that these strings could be in the form of measuring impact in more sophisticated ways.
David Willetts, former universities and science minister, told the conference that during the summer of 2010, when large cuts to the science and research budget were on the table, the Treasury would agree to a flat-cash settlement only if measuring impact was part of the deal.