THE University Impact Rankings 2019 by SDG: climate action methodology

April 2, 2019

This ranking explores universities’ research on climate change, their use of energy and their preparations for dealing with the consequences of climate change.

View the methodology for the University Impact Rankings 2019 to find out how these data are used in the overall ranking. 

Metrics

Research on climate action (27%)

  • Proportion of papers in the top 10 per cent of journals as defined by CiteScore (10%)
  • Field-weighted citation index of papers produced by the university (10%)
  • Number of publications (7%) 

This focuses on research that is relevant to climate action. The field-weighted citation index is a subject-normalised score of the citation performance of publications.

The data come from Elsevier’s Scopus dataset, based on a query of keywords associated with SDG 13 (climate action), and cover all indexed publications between 2013 and 2017. The data are normalised across their range using z-scoring. 
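To make the z-scoring step concrete, the sketch below normalises a list of raw metric values (for example, publication counts per university) so that they have mean 0 and standard deviation 1. This is a minimal illustration of the standard z-score calculation, not THE’s or Elsevier’s actual code; the function name and example figures are hypothetical.

```python
# Minimal sketch of z-score normalisation, assuming one raw value per institution.
from statistics import mean, pstdev

def z_score(values):
    """Normalise raw metric values to z-scores (mean 0, standard deviation 1)."""
    mu = mean(values)
    sigma = pstdev(values)
    if sigma == 0:
        return [0.0 for _ in values]  # all institutions identical on this metric
    return [(v - mu) / sigma for v in values]

# Example: hypothetical SDG 13 publication counts for four institutions
print(z_score([120, 340, 95, 410]))
```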


Low-carbon energy use (26.9%)

This measures the amount of renewable and low-carbon energy used by the institution. It is expressed in gigajoules (GJ) of electrical energy provided by renewable or nuclear sources.

These data and the supporting evidence were provided directly by universities. The data are normalised across their range using z-scoring. 


Environmental education measures (46.1%)

  • Providing local education around the impact of climate change (9.75%)
  • Generating a university climate action plan (9.75%)
  • Working with local or national government to address climate change planning (9.75%)
  • Informing and supporting government on issues associated with climate change (8.45%)
  • Collaborating with NGOs around climate change (8.4%) 

These data and the supporting evidence were provided directly by universities. The evidence was evaluated and scored by Times Higher Education and is not normalised.
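The sketch below shows one way the published weights could be combined into an overall SDG 13 score. The weights are taken from this article; the simple weighted-sum aggregation, the metric names, and the placeholder input values are assumptions made for illustration only.

```python
# Hedged sketch: combining the published metric weights into an SDG 13 score.
# Weights are from the article; the aggregation method is an assumption.
WEIGHTS = {
    "top_10_pct_journals":     0.10,
    "field_weighted_citation": 0.10,
    "publication_count":       0.07,
    "low_carbon_energy":       0.269,
    "local_education":         0.0975,
    "climate_action_plan":     0.0975,
    "government_planning":     0.0975,
    "informing_government":    0.0845,
    "ngo_collaboration":       0.084,
}  # weights sum to 1.0 (27% + 26.9% + 46.1%)

def sdg13_score(metrics):
    """Weighted sum of metric scores (each already z-scored or THE-scored)."""
    return sum(WEIGHTS[name] * value for name, value in metrics.items())

# Example with placeholder values on a common 0-100 scale
example = {name: 50.0 for name in WEIGHTS}
print(round(sdg13_score(example), 1))  # 50.0 when every metric is 50
```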

Evidence

When we ask about policies and initiatives, our metrics require universities to provide the evidence to support their claims. Evidence is evaluated against a set of criteria and decisions are cross-validated where there is uncertainty. Evidence is not required to be exhaustive – we are looking for examples that demonstrate best practice at the institutions concerned.

Timeframe

Unless otherwise stated, the data used refer to the academic year closest to the January to December 2017 calendar year.

Exclusions

Universities must teach undergraduates and be validated by a recognised accreditation body to be included in the ranking.

Data collection

Institutions provide and sign off their institutional data for use in the rankings. On the rare occasions when a particular data point is not provided, we enter a value of zero. 

The methodology was developed in conjunction with our partners Vertigo Ventures and Elsevier, and after consultation and input from individual universities, academics, and sector groups. 
