Research intelligence - Panning for gold dust in midstream

Measure impact while research is under way, urge Jisc-funded studies. Elizabeth Gibney reports

September 27, 2012

Universities may be working hard to pull together evidence of the impact of their work for the research excellence framework next year, but efforts are already under way to find out how future research can track its impact right from the outset.

At a workshop in London on 11 September, run by the National Co-ordinating Centre for Public Engagement (NCCPE), university knowledge exchange staff gathered to hear about the progress of nine studies. Funded by Jisc, the higher education technology and IT body, these studies each take different tacks in trying to make the currently haphazard process more systematic.

The aim is to make it less laborious for researchers to meet the requirements of the REF and the research councils to demonstrate potential impact, and to encourage researchers to be more open to the potential impact of their work.

Speaking in one of the breakout sessions at the workshop, Sophie Duncan, deputy director of the NCCPE, stressed the importance of robust evidence in proving research's impact. "Some [academics] are struggling to find any evidence, as they just didn't think to capture it at the time. They didn't think it would be like gold dust," she said.

In REF submissions, she added, "some will use anecdotal evidence, whereas others will use a more sophisticated way of evidencing their impact, and they will stand shoulders above those who don't".

For each unit of assessment, the REF requires that institutions include both a template of their efforts to derive impact from research, and a number of case studies detailing the benefits of excellent research (rated as 2* or above) carried out since 1993.

The Jisc projects, facilitated by the NCCPE and running from March to December this year, look at how to embed methods of capturing and analysing impact within the research process itself, and long after it ends.

Working within the culture

One of the nine studies is Embedding Research Impact at Coventry, which is led by Lorna Everall, who heads Coventry University's Corporate Partnership Unit. The initiative has looked at how an existing research management system could be adapted to include impact.

Under the REF, impact is defined as a benefit to the economy, society, culture, public policy or services, health, the environment or quality of life beyond academia. This means it can take a huge range of forms, from spin-off companies to the cultural enrichment of people's lives.

In the Coventry pilot, applied researchers were asked to brainstorm - with the help of suggestions - the kinds of impacts they thought their work might produce, and when. Researchers were later sent automatic reminders asking if those impacts had occurred and requesting details of the evidence. The information was then stored in an open-data repository.
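The article gives no technical detail of Coventry's system, but the cycle it describes - record anticipated impacts, send reminders when they fall due, attach evidence, deposit the results - can be sketched in a few lines. The Python below is a minimal illustration only; every class, field and file name is a hypothetical stand-in rather than anything drawn from the Coventry pilot.

```python
# Illustrative sketch only: not Coventry's actual system. All names, fields
# and formats here are hypothetical stand-ins for the workflow described above.
import json
from dataclasses import dataclass, field, asdict
from datetime import date
from typing import Optional

@dataclass
class AnticipatedImpact:
    project: str                      # research project identifier
    description: str                  # impact the researcher expects, e.g. "cited in policy briefing"
    expected_by: date                 # when the researcher thinks it may occur
    occurred: Optional[bool] = None   # filled in once a reminder is answered
    evidence: list = field(default_factory=list)  # links, documents, testimonials

def impacts_due_reminder(impacts, today):
    """Return anticipated impacts whose expected date has passed and which are
    not yet confirmed or ruled out - candidates for an automatic reminder."""
    return [i for i in impacts if i.occurred is None and i.expected_by <= today]

def export_confirmed(impacts, path):
    """Serialise confirmed impacts and their evidence to JSON, standing in for
    deposit in an open-data repository."""
    confirmed = [asdict(i) for i in impacts if i.occurred]
    with open(path, "w", encoding="utf-8") as fh:
        json.dump(confirmed, fh, default=str, indent=2)

if __name__ == "__main__":
    impacts = [
        AnticipatedImpact("childhood-obesity-study",
                          "cited in local authority strategy",
                          expected_by=date(2013, 6, 1)),
    ]
    print(impacts_due_reminder(impacts, today=date(2013, 7, 1)))
```

The point of such a structure is simply that expected impacts are written down at the outset, so the later reminder has something concrete to check against - the behaviour the Coventry pilot was trying to encourage.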

"We were predominantly working with creating the system but we were also trying to think about the culture," Dr Everall told Times Higher Education after the event.

"The aim was to involve our research management team but also our academics, so that the culture of impact capturing is pulled in throughout the process," she said.

Cognisant that under the REF, impact - which counts for 20 per cent of the total score - is assessed on its reach as well as its significance, other projects are trying to tackle areas in which academics have the potential to influence the widest number of people, namely via the internet.

The Tracking Digital Impact project, led by the University of Exeter, is analysing how academics use the web to achieve impact from their research, with the aim of creating a set of standards for monitoring and assessing it.

Public Engagement with Research Online, led by the University of Warwick, is taking this idea a stage further by looking at how existing internet technologies such as Google's search and analytics tools could be employed systematically to provide evidence for the impact of research communicated online.
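The article does not specify which analytics outputs the Warwick project draws on. As a rough illustration of the kind of systematic evidence such tools can yield, the sketch below tallies referrals to a single research page from a hypothetical analytics export in CSV form; the column names and file are assumptions, not the project's method.

```python
# Illustrative sketch, not the Warwick project's actual approach. Assumes an
# analytics export as CSV with hypothetical columns "page", "referrer", "visits".
import csv
from collections import Counter

def referral_counts(csv_path, research_page):
    """Count visits to a given research page grouped by referring site,
    as a crude indicator of where online attention originates."""
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            if row["page"] == research_page:
                counts[row["referrer"]] += int(row["visits"])
    return counts

if __name__ == "__main__":
    top = referral_counts("analytics_export.csv", "/research/project-report")
    for referrer, visits in top.most_common(5):
        print(f"{referrer}: {visits} visits")
```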

Speaking in a breakout session at the workshop event, Kent McClymont, associate research fellow at Exeter and project manager of the Tracking Digital Impact project, reflected on the difficulties of identifying what counted as impact, and highlighted how far impact could be discipline- or even project-specific.

"If you want an idea of whether your idea of impact is impact, get it reviewed by someone else," he suggested. "Come up with a case study, send it to someone who is not in the project and get their opinion. We've had people come back and say 'that's definitely not impact'."

Trying to unravel the complexities of impact in social science was the aim of another group whose project partner is Sarah Morton, co-director of the Centre for Research on Families and Relationships consortium based at the University of Edinburgh. She developed an analysis technique that the team is using to look at the way in which research at De Montfort University made an impact.

Defining as well as measuring

"The question for our project was what do we mean by impacts and how can they be assessed?" Dr Morton told THE. "In social science it's not like a new technology where it's much easier to establish the link; it's more complicated than that."

Her project, Disseminating Impact from Engagement with User Groups and Organisations (DIEGO), studied impact in two research projects, on childcare in Ukraine and teenage pregnancy in Leicester.

The team set up a framework that asked academics what impact they expected their research to have, looked at the assumptions they had made and tried to find evidence of whether this had happened in the way the researchers had expected.

"It was about investigating who you engage with and why, whether this targets the right people...and thinking about how are you going to react if your research challenges or affirms current practice," Dr Morton said.

In future, she added, spotting the difference between dissemination of research and actually causing an impact will be key to dealing with the conundrum.

She said it was common to invite stakeholders to events to discuss research. "But why are you holding that event, who are you inviting and why, and what do you hope to get from them engaging with you?"

When the projects conclude at the end of the year, the NCCPE plans to share the results and create a set of resources that all higher education institutions can use, although that is likely to be too late for use in the current REF.

Efforts to embed the measurement of impact in the traditional research process are aimed at providing a more level playing field in universities where a new dynamic is emerging in the wake of the REF, in which researchers who can articulate and demonstrate the impact of their work achieve a higher profile.

The consensus at the meeting was that researchers are slowly coming around to the idea of thinking beyond outputs such as reports and events to looking more strategically at impact.

Dr Morton, whose PhD on research impact and how to assess it predated the REF, said she hopes that making academics think about impact at earlier stages will not only improve its capture for assessments such as the REF, but actually increase its incidence.

She added that although the REF is currently causing concern in UK institutions, trying to reward research that has an impact was a good idea.

"People were effectively penalised in the previous system because it didn't recognise impact at all. Obviously getting impact from research takes time and effort. At least it's visible now."

elizabeth.gibney@tsleducation.com.
