World Reputation Rankings 2015 results

‘Super-league’ can’t be shaken from the top

March 12, 2015

 

View the full World Reputation Rankings 2015 top 100


The UK has increased its share of institutions in a global ranking of the world’s most prestigious universities.

The results of the Times Higher Education World Reputation Rankings 2015 show that the UK now boasts 12 of the 100 most renowned higher education institutions in the world (up from 10 last year), while its two strongest performers, the University of Oxford and the University of Cambridge, have improved their positions.

The University of Bristol entered the ranking this year in the 91-100 band, while Durham University and the University of Warwick entered the 81-90 group, although the London School of Hygiene and Tropical Medicine dropped out. Both Cambridge and Oxford climbed two places, and are ranked second and third respectively. Scotland’s sole representative, the University of Edinburgh, climbed 17 places to 29th.

Vince Cable, the business secretary, said the findings illustrated “government efforts to support a world-class system that we can be proud of” but warned that other nations were “hot on our heels”. He said that was why the coalition had taken steps to “secure the reputation” of UK universities through greater fee income and removal of student number caps.


The US remains dominant in the annual rankings, claiming 43 of the top 100 universities and eight of the top 10, although the total number of American institutions is down from 46 last year.

For the fifth consecutive year, the rankings highlight an elite group of six US and UK “super-brands” that hold a significant lead over the rest. Although the order has changed over the years, the institutions in the top six have remained constant: Harvard University, Cambridge, Oxford, the Massachusetts Institute of Technology, Stanford University and the University of California, Berkeley.


“What we are finding year on year is that universities in this group of six tend to stand out well above the seventh- and eighth-placed institutions,” said THE rankings editor Phil Baty. “They seem to be in a super-league all of their own.”

Germany remains the best-represented nation after the US and the UK, with six top 100 universities (the same as last year). Its neighbour France now boasts five institutions in the table (all of them based in Paris), up from two last year.

Asia’s best performer, the University of Tokyo, slipped one place to 12th position. Meanwhile, China’s top institution, Tsinghua University, climbed 10 places to 26th, and Peking University leaped from 41st to 32nd place.

THE partnered with Elsevier to disseminate the Academic Reputation Survey on which the results are based. Questionnaires, which asked participants to nominate up to 10 of the best institutions in their field of expertise, were completed by some 10,000 academics selected to give a statistically representative sample of global scholars.
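
The article does not describe how the nominations are turned into scores, but the basic tallying step can be pictured with a minimal sketch, assuming a simple vote count normalised against the most-nominated institution (the function name, counting scheme and sample data below are illustrative assumptions, not THE's published methodology):

from collections import Counter

def reputation_scores(responses, max_nominations=10):
    """Hypothetical tally of reputation-survey nominations.

    `responses` is a list of lists, each holding the institutions that one
    respondent nominated (at most `max_nominations` of them). Scores are
    scaled so the most-nominated institution gets 100.
    """
    votes = Counter()
    for nominations in responses:
        # Respect the survey's limit of 10 nominations per respondent.
        for institution in nominations[:max_nominations]:
            votes[institution] += 1
    top = max(votes.values(), default=1)
    return {name: 100 * count / top for name, count in votes.items()}

# Toy usage with invented responses: Harvard scores 100, the others are scaled against it.
sample = [
    ["Harvard University", "University of Cambridge"],
    ["Harvard University", "University of Oxford"],
    ["University of Cambridge", "Harvard University"],
]
print(reputation_scores(sample))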

The survey was available in more languages than ever (15, up from 10 last year), and responses from more than 140 countries were received.

Listen to the World Reputation Rankings 2015 results podcast

chris.parr@tesglobal.com


World Reputation Rankings 2015 results: top 10

2015 rank  2014 rank  Institution
1 1 Harvard University (US)
2 4 University of Cambridge (UK)
3 5 University of Oxford (UK)
4 2 Massachusetts Institute of Technology (US)
5 3 Stanford University (US)
6 6 University of California, Berkeley (US)
7 7 Princeton University (US)
8 8 Yale University (US)
9 9 California Institute of Technology (US)
10 12 Columbia University (US)


Claim a free copy of the World Reputation Rankings 2015 digital supplement


Reader's comments (5)

And here we thought the THE could not come up with anything more spurious and preposterous than their league tables. Perhaps THE might, more responsibly, develop a metric that encourages universities to develop the quality of their research and teaching, rather than either pandering to the fickle desires of teenagers or hiring expensive PR firms to flog their wares to the academic elite. Their league tables will never, and should never, be given the same credit as the more rigorous QS and ARWU. "Character is like a tree and reputation like a shadow. The shadow is what we think of it; the tree is the real thing." -- Abraham Lincoln
Forgive me, but the author of the article that highlights the "multiple serious criticisms of the QS" system is you yourself. As an editor of the THE, this is hardly an independent opinion--not an invalid one, mind you, but one to be taken with a grain of salt.

Your analysis also fails to address the most perplexing (and, I would argue, invalidating) element of the THE guide--how some universities can rise or plummet dozens of spots from year to year without any substantive changes in their staff, organization or resources. QS and ARWU at least have the benefit of consistency. Lastly, THE (and the Guardian's Complete Guide) gives substantial weighting to the NSS, which is itself a thoroughly flawed metric, and increasingly under trenchant criticism from students, staff and administrators alike.

That having been said, your comments are helpful and appreciated, and I hope that this opens up a dialogue about what, precisely, these league tables and the obsession with them are contributing to the actual quality of HE, in the UK and elsewhere.

As for Socrates, he was discussing avenues for the preservation of one's honour and the achievement of wisdom, not the practices of institutions. I think a more apt quotation, as universities struggle to resolve their higher scholarly, cultural and social goals in the increasingly marketized environment that league tables actively promote, is this: "I do nothing but go about persuading you all, old and young alike, not to take thought for your persons or your properties, but first and chiefly to care about the greatest improvement of the soul. I tell you that virtue is not given by money, but that from virtue comes money and every other good of man, public as well as private. This is my teaching, and if this is the doctrine which corrupts the youth, I am a mischievous person."

Sincerest apologies, I was misinformed. It is the Guardian's Guide that relies on the NSS. Having closely read the methodology document, I now see that the largest section of the THE teaching segment is based once again on the Thomson Reuters reputation survey: "Teaching: The learning environment (30%)--The dominant indicator here uses the results of the world's largest invitation-only academic reputation survey. Thomson Reuters carried out its latest reputation survey - a worldwide poll of experienced scholars - in spring 2013. It examined the perceived prestige of institutions in both research and teaching. There were 16,639 responses, statistically representative of global higher education's geographical and subject mix. The results of the survey with regard to teaching make up 15 per cent of the overall rankings score."

This is also true of the research segment of your league tables: "Research: Volume, income, reputation (30%). This category is made up of three indicators. The most prominent, given a weighting of 18 per cent, looks at a university's reputation for research excellence among its peers, based on the 16,000-plus responses to our annual academic reputation survey."

This is, indeed, much, much worse. Do you see the essentially circular nature of your rankings? A significant portion of your tables, for both research and teaching, is based on a reputation survey. A university's rank on the tables can help or hurt its reputation considerably. As you point out in your own article, the THE has become one of the go-to metrics among academics worldwide, the same academics relied upon for the reputation survey. Ergo, the rankings themselves are partially responsible for the reputation, and influence the next survey of reputation, which drives the next cycle of rankings, which again influences the next reputation survey, ad infinitum. It's an Escher staircase that climbs only to its own bottom step.

I understand that the goal of THE is to provide the knowledge necessary for students and parents to make informed decisions. But please tell me how this in any way makes a positive contribution to universities' activities of scholarship, teaching and public service themselves. On the contrary, it seems to be doing great harm. It is driving their investment of millions of pounds, in the UK alone, to raise their "prestige," manage their public reputation and, in doing so, hopefully raise their ranking. It has motivated the insane salary inflation of higher management, the expansion of institutional bureaucracy, and the pernicious influence of a "market-based" mentality for institutions that are, quite patently, neither businesses nor for-profit.

Why is this money not better spent on student or staff facilities? On funding for underprivileged students? On the updating of classrooms or labs? On the work environment and benefits for ordinary university employees and their families (higher administrators aside, who seem to be positively raking it in)? Investment in the latter will not raise a university's prestige in the short run, but it will make it a _better university_ in the long run. Should that not be the goal of all involved, THE included?
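
To make the weighting arithmetic quoted above concrete, here is a minimal sketch of how the reputation survey feeds the overall score, assuming only the 15 per cent and 18 per cent weightings cited in the comment; the remaining indicators are lumped together and every institution score is invented for illustration:

# Reputation weightings quoted above from the THE methodology document.
TEACHING_REPUTATION_WEIGHT = 0.15
RESEARCH_REPUTATION_WEIGHT = 0.18
OTHER_INDICATORS_WEIGHT = 1 - TEACHING_REPUTATION_WEIGHT - RESEARCH_REPUTATION_WEIGHT

def overall_score(teaching_rep, research_rep, other):
    """Hypothetical weighted sum; the indicator scores (0-100) are invented."""
    return (TEACHING_REPUTATION_WEIGHT * teaching_rep
            + RESEARCH_REPUTATION_WEIGHT * research_rep
            + OTHER_INDICATORS_WEIGHT * other)

# A 10-point rise in both reputation indicators lifts the overall score by
# (0.15 + 0.18) * 10 = 3.3 points, with no change in the other indicators.
before = overall_score(60, 60, 70)
after = overall_score(70, 70, 70)
print(round(before, 1), round(after, 1), round(after - before, 1))  # 66.7 70.0 3.3
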
I think the often-cited Abraham Lincoln quote never fully captures the complex inter-relationship between reputation and 'reality'. It implies that the tree (reality) is easy to discern, which is not often the case, and that reputation (Lincoln's shadow) is insubstantial and not relevant, which is also not supported by how people value reputation and make life choices based on the concept.

When we researched rankings and reputation in higher education, we found academics made it clear they would only move to another university if it was more highly ranked than the university they were at. Interviewing over 100 PhDs at 21 global universities last year, we found students valued both the quality and the reputation of a university when choosing where to study – and saw them as different things. They defined quality as being about internal academic quality, only clear to those in the same academic field, whilst reputation was the public image, and signified how good a university was at amplifying its quality (or communicating it) to the wider world. They went on to say that rankings had been the most important information source for choosing where to study for their PhD.

Given both students and staff are making life choices based on rankings and reputation, it is not surprising that universities take them seriously.

Louise Simpson

Dear Louise,

In nearly 20 years working in HE, I have _never_ heard of anyone refusing to move posts simply because the institution was "lower-ranked" than their own. An institution's ranking is widely recognized in the profession as a marketing tool, and one that says almost nothing about any given faculty or department. Staff generally won't move to institutions that have markedly fewer resources for research or fewer opportunities to teach excellent students. But beyond that, it's more a question of the character of the department and its staff, the atmosphere of the university, the specifics of the post, the location, the type and amount of teaching, and the opportunities for research and the supervision of Ph.D.s. Scholars, and top scholars especially, routinely leave a "higher-ranked" institution for one that is slightly or even significantly "lower-ranked" because they have ample evidence that both the post and the department will be a better fit and more in line with their personal and professional goals. A 100-person survey, furthermore, is statistically insignificant.

I am not arguing that students aren't making choices based on rankings; that much is clearly true. What I am arguing, and there is widespread support for this view among those who don't have a vested interest in the rankings themselves (THE, the Guardian, US News & World Report), is that the rankings are artificial, inaccurate, based on a flawed methodology and, ultimately, damaging to the basic mission of higher education (the creation and dissemination of knowledge and expertise). They are not unimportant; they are pernicious.
