Netherlands plans overhaul of academic careers in move away from metrics

Country will also consider creating teaching-focused professorships to stop academics being overloaded by responsibilities

December 6, 2018

The Netherlands will radically shake up how academics are assessed and promoted, including a shift away from relying on citations and journal impact factors.

Dutch universities also want to make it easier for academics to become professors on the basis of their teaching record, in a shift that will be closely watched by policymakers and unions across Europe.

The Association of Universities in the Netherlands (VSNU), the Netherlands Organisation for Scientific Research (NWO), and bodies for university medical centres and health research said that they would organise a raft of activities in 2019 designed to find a “new approach” to “recognising and rewarding academics”.

According to Rianne Letschert, rector of Maastricht University and one of the leaders of the review, one of the aims is to scale back the use of citation metrics and controversial journal impact factors, which measure the average number of citations received by papers in a particular journal.

These metrics are currently “dominant” factors in university promotion decisions, and are used by the NWO when making grant decisions, she said.

Their use would be scaled back; instead, Professor Letschert said she hoped, funders and university heads of department would be expected to read applicants’ work rather than rely on metrics.

Another plank of next year’s review is to create “differentiation of career pathways”, meaning that universities and university medical centres should give academics “a choice for specific focus areas – teaching, research, knowledge transfer and/or leadership”, according to a VSNU statement.

Academics’ workloads have ballooned as they are expected to fulfil all of these roles at a time of surging student numbers, Professor Letschert said. “When does it end? How can we be excellent in all these tasks?” she asked.

Maastricht and Utrecht universities have already brought in professorial positions where promotion is tilted towards teaching talent, she said, but this could now be rolled out across the country.

These positions would not be explicitly labelled “teaching professors” – “it should not become a B-track”, Professor Letschert said – and holders would still be required to conduct research, although the pressure to win grants would be relaxed. One idea is to require these professors to hold a master’s degree in educational science.

According to Rinze Benedictus, a policy adviser at University Medical Centre Utrecht, the Dutch rethink was spurred by a realisation that, although the Netherlands performs very well when it comes to research metrics, the “unintended consequences” of this system have gone too far.

Mr Benedictus previously warned that researchers in university medical centres were steering clear of publications that would benefit medicine because they were unlikely to rack up many citations. There was a “mismatch” between how scholars were assessed and the social relevance of their work, he said.

The Dutch reforms aim to boost the impact of research on society, and the NWO “will look for ways to increase the weight of research quality and anticipated impact in its evaluation of researchers and proposals”.

“Impact” has proved controversial in the UK, where the real-world effect of scholarship now determines a substantial chunk of university funding. How to measure impact is “a hot issue here as well”, Mr Benedictus said.

There is no agreement so far on how to measure impact, he said, although one idea put forward by the Dutch Royal Academy of Sciences was to judge researchers on whether they had followed the right dissemination processes, rather than whether or not impact had actually occurred.

“We can’t predict impact, but we can ask researchers to maximise the chances of it having impact,” Mr Benedictus said.

david.matthews@timeshighereducation.com

Postscript

Print headline: Dutch double down on shift away from role of metrics in academic careers


Reader's comments (3)

If the good teacher were also a good researcher and the good researcher a good teacher, society would win on both counts. But the best must not be pitted against the good. Each should be recognised and rewarded in proportion to how they serve the ultimate purpose ... the generation and the transmission of knowledge. Basil Jide fadipe.
'Good' research should inform teaching. Research that is driven purely by peer review is, in many cases, insular, egotistical and of limited use. As an undergraduate (many years ago) I was fortunate to be taught by world class academics whose reputations were founded on textbooks and lectures. To be taught by someone who had written the standard text on, say, Christian Democratic Parties in Europe was something I rejoiced in and in most cases their lectures were pithy, insightful and the outcome of real scholarship. As undergraduates we were not attuned to the dismal drumbeat of obscure 3 / 4 / 5 * journals which have reduced academic writing to a production line process. Twitter is awash with postings by young academics announcing to the world that they have had a paper accepted or in their cups because it's been rejected. The Netherlands have got it right on this one.
It's great to hear about this move to recognize the various career tracks within academia, but I do think it's misguided to move away from all metrics entirely and I would suggest that the problem isn't with metrics themselves, but the specific ones chosen and how they're used. No one could object to a researcher wanting to supplement their narrative about why they should be awarded a grant or tenure with quantitative information, but they can and do rightly object to the inappropriate use of metrics. If it is true that chasing citations leads researchers to avoid certain topics, the answer is not to give up on approaching a quantitative understanding of impact, as the Dutch Royal Academy of Sciences has suggested, but rather to use metrics that take into account the different citation rates of various research areas. Field-weighted citation impact is one such metric which has been available for some years now: http://www.metrics-toolkit.org/field-weighted-citation-impact/ One can speculate why this metric was overlooked in favor of simple citation counting, but it is surprising that academics, who seek understanding as a primary objective, would give up on understanding impact rather than improving their approach.