Leader: Only the best for the best

Our world rankings are hugely influential but also come under criticism every year, so we have decided to improve them

November 5, 2009

When we published our sixth annual world rankings last month, we received more than 1 million visits to our website in just one day and made headlines around the world, from New York to New Zealand. We also received a fair amount of criticism, as we have done since their launch in 2004. One of the very first critics said: "All the talk about 'the best universities' sounds more like selling soap powder. The rankings gloss over the fact that if different measures were used, or with different weightings, different results would be obtained."

Then as now, there is no denying that accusation, and we are well aware that rankings can only ever be a crude measure of what universities do. But whatever you think of them, they are here and they are here to stay.

Global rankings have always been used by students to choose where to study, by staff to look at career opportunities and by research teams seeking new collaborative partners. They also help us to analyse and highlight trends and developments in a rapidly changing higher education landscape worldwide. But in recent years they have become extraordinarily influential, used by institutions to benchmark themselves against global competitors and even by governments to set their national higher education agendas.

The responsibility weighs heavily on our shoulders. We are very much aware that national policy and multimillion-pound decisions are influenced by these rankings. We are also acutely aware of the criticisms made of the methodology. Therefore, we feel we have a duty to improve how we compile them.

To this end, the Times Higher Education editorial board met recently to discuss the problem. Two main flaws in the current rankings were identified. The first was the survey of academic opinion, which makes up 40 per cent of the overall score: it was deemed too small, having been based this year on fewer than 4,000 responses from around the world, or 9,386 when aggregated with previous years' results. The second was our use of paper citations to measure research quality.

The board felt that, in future, any survey of academic opinion needed to be more substantial, and that when citations are used to measure quality, the very different volumes of citations in different subject areas should be taken into account, so that, for example, world-leading institutions without medical schools are not disproportionately hit.

With these criticisms in mind, we will work with our new data partner, Thomson Reuters, to produce a more rigorous and transparent ranking for 2010 and beyond. We will seek further advice from our editorial board and solicit the views of international rankings experts and our readers to develop a suitable methodology for the sector. And this week, Phil Baty, deputy editor of Times Higher Education and editor of the rankings, met experts in Shanghai at the Third International Conference on World-Class Universities to discuss the way forward.

Higher education is global. Times Higher Education is determined to reflect that. Rankings are here to stay. But we believe universities deserve a rigorous, robust and transparent set of rankings - a serious tool for the sector, not just an annual curiosity.

ann.mroz@tsleducation.com
