Good University Guide 2021 methodology: how we compiled the rankings

Sources of data

The Times and The Sunday Times Good University Guide makes use, wherever possible, of the latest data available on each of the measures in our academic and social inclusion rankings, much of it released for the first time during summer 2020.

Data on entry standards, student-staff ratios, services and facilities spend, completion rates, first-class and 2:1 degrees, graduate prospects and the 2014 Research Excellence Framework staffing data were supplied by the Higher Education Statistics Agency (Hesa), which provides a system of data collection, analysis and dissemination in relation to higher education in the whole of the United Kingdom. The original sources of data for these measures are data returns made by the universities themselves to Hesa.

Times Higher Education 2023 World Rankings Released

Times Higher Education (THE) released its 2023 World University Rankings today. Elsevier is a proud partner of Times Higher Education, supplying the bibliometric dataset, based on Scopus data, for the 2023 THE World University Rankings. If you have ever wondered how Scopus data is used in the rankings methodology, keep reading and take time to look at the resources shared below.

Scopus data is organized in a structured way so that you (and Times Higher Education) can use it to uncover insights about global research. Key to that structure is the organization of the data around relationships between authors and institutions, which relies on the accuracy of Scopus author and institution profiles. Elsevier provides THE with a bibliometric dataset based on publications in a five-year window and citations in a six-year window: a five-year Field-Weighted Citation Impact (FWCI) metric for the Citations (research influence) indicator, the number of publications for the research productivity indicator, and the proportion of the university's total relevant publications with at least one international co-author for the International Collaboration indicator.
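To make the three indicators concrete, here is a minimal sketch that derives them from a toy publication list. The tuple layout, the window, and the data are invented for illustration; THE's actual pipeline and the Scopus schema are considerably more involved.

```python
# Illustrative sketch only: deriving the three bibliometric indicators
# described above from a toy publication list. The data are hypothetical.

publications = [
    # (year, fwci, set of co-author countries)
    (2018, 1.4, {"GB"}),
    (2019, 0.8, {"GB", "DE"}),
    (2020, 2.1, {"GB", "US"}),
    (2021, 1.0, {"GB"}),
]

def bibliometric_indicators(pubs, pub_window=(2017, 2021)):
    in_window = [p for p in pubs if pub_window[0] <= p[0] <= pub_window[1]]
    n = len(in_window)  # research productivity: raw publication count
    mean_fwci = sum(p[1] for p in in_window) / n  # average five-year FWCI
    # share of publications with at least one international co-author
    intl_share = sum(1 for p in in_window if len(p[2]) > 1) / n
    return n, mean_fwci, intl_share

n, fwci, intl = bibliometric_indicators(publications)
print(n, fwci, intl)  # 4 publications, mean FWCI 1.325, 50% international
```

Note that each publication's FWCI is already field-normalised per paper, which is why a plain average suffices here.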

The Elsevier Partnership with THE Goes Beyond Scopus  

Use of Scopus data allows Elsevier to provide THE with a robust bibliometric dataset they can use with confidence when applying their rankings methodology. The bibliometric dataset provided to THE is also available to analyse in SciVal through the Rankings analysis feature of the Overview module. This gives SciVal users access to the actual bibliometrics used in the THE World University Rankings so they can:

  • Analyse, understand and generate insights based on the actual FWCI and Citation Scores used in 30% of the THE WUR, rather than proxies developed in-house 
  • Benchmark at the THE subject level for deeper understanding of your university’s position and to inform faculty and department level plans 
  • Analyse, benchmark, and produce peer comparison reports across all bibliometric drivers directly in the Benchmarking module 
  • Analyse the trends in the bibliometric drivers to enrich your understanding and inform plans accordingly  
  • Remove the need for manual curation and the development of proxy indicators 

To learn more about THE Rankings and Elsevier solutions, look at: 

Understanding Scopus & SciVal and the Times Higher Education   

Enhancing sustainable development goals or promoting universities? An analysis of the times higher education impact rankings

International Journal of Sustainability in Higher Education

ISSN : 1467-6370

Article publication date: 16 June 2022

Issue publication date: 19 December 2022

Purpose

This study aims to analyse and evaluate the methodology followed by the Times Higher Education Impact Rankings (THE-IR), as well as the coverage obtained and the data offered by this ranking, to determine if its methodology reflects the degree of sustainability of universities, and whether their results are accurate enough to be used as a data source for research and strategic decision-making.

Design/methodology/approach

A summative content analysis of the THE-IR methodology was conducted, paying special attention to the macro-structure (university score) and micro-structure (sustainable development goal [SDG] score) levels of the research-related metrics. Then, the data published by THE-IR in the 2019, 2020 and 2021 editions were collected via web scraping. After that, all the data were statistically analysed to find performance rates, SDG success rates and geographic distributions. Finally, a pairwise comparison of the THE-IR against the Times Higher Education World University Rankings (THE-WUR) was conducted to calculate overlap measures.

Findings

Severe inconsistencies in the THE-IR methodology have been found, offering a distorted view of sustainability in higher education institutions and allowing universities to follow different strategic approaches to participation in the ranking (interested, strategic, committed and outperformer universities). The observed growing number of universities from developing countries and the absence of world-class universities reflect an opportunity for less-esteemed institutions, which might have a chance to gain reputation based on their efforts towards sustainability, but from a flawed ranking which should be avoided for decision-making.

Practical implications

University managers can take the THE-IR's validity into account when making informed decisions. University ranking researchers and practitioners can access a detailed analysis of the THE-IR to determine its properties as a ranking and use raw data from the THE-IR in other studies or reports. Policy makers can use the main findings of this work to avoid misinterpretations when developing public policies related to evaluating the contribution of universities to the SDGs. These results can also help the ranking publisher to remedy some of the inconsistencies found in this study.

Social implications

Given the global audience of the THE-IR, this work contributes to minimising the distorted vision that the THE-IR projects about sustainability in higher education institutions, and alerts governments, higher education bodies and policy makers to take precautions when making decisions based on this ranking.

Originality/value

To the best of the authors' knowledge, this contribution is the first to provide an analysis of the THE-IR's methodology. The faults in the methodology, the coverage at the country level and the overlap between THE-IR and THE-WUR unveil the existence of specific participation strategies among universities, of interest both to experts in university rankings and to those working on the SDGs.

Keywords

  • University rankings
  • Higher education institutions
  • Sustainable development goals
  • Academic marketing
  • Development goals
  • Impact Ranking
Bautista-Puig, N. , Orduña-Malea, E. and Perez-Esparrells, C. (2022), "Enhancing sustainable development goals or promoting universities? An analysis of the times higher education impact rankings", International Journal of Sustainability in Higher Education , Vol. 23 No. 8, pp. 211-231. https://doi.org/10.1108/IJSHE-07-2021-0309

Emerald Publishing Limited

Copyright © 2022, Nuria Bautista-Puig, Enrique Orduña Malea and Carmen Perez-Esparrells.

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode

1. Introduction

The sustainable development goals (SDGs) are a collection of 17 interlinked goals that emerged as a global strategy to solve critical problems (e.g. climate change, poverty) and emphasised the importance of evidence-based decision-making. SDGs establish a defined horizon for the millennium development goals (MDGs) by enlarging their scope, reach and engagement in their creation and implementation ( Fisher and Fukuda-Parr, 2019 ). Agenda 2030, approved in the Rio + 20 conference in 2012, is a 15-year plan that must be embedded in the decision-making of governments, businesses, universities and society at large. The plan encompasses a roadmap of targets and indicators for aligning both developing and developed countries on the path of sustainable development. SDGs refer to policy demands across a wide variety of topics (e.g. health, gender equality). It is therefore challenging to develop specific indicators or tools that can value the goals’ contributions or impacts ( Rafols et al. , 2021 ; Siegel and Bastos Lima, 2020 ).

Higher education institutions (HEIs) are drivers for the achievement of the full set of goals, through their role in human formation, knowledge production and innovation ( Chankseliani and McCowan, 2021 ). As such, universities stand out as models of sustainability ( Wright, 2004 ) due to their societal responsibility in training future professionals and leaders and in stimulating public awareness of sustainability ( Collins, 2017 ). By assuming the role of “agents of change”, HEIs are therefore pivotal to the search for solutions to current environmental problems ( Hesselbarth and Schaltegger, 2014 ). In this regard, considering the three core university missions (research, teaching and knowledge transfer), these institutions can address sustainable development from different perspectives or dimensions: university governance (e.g. vision, missions, strategic plans), campus operations (e.g. energy use, waste), community outreach (e.g. exchange programmes) and assessment and reporting (e.g. dissemination of sustainability results achieved) ( Cortese, 2003 ; Lozano et al. , 2015 ). In addition, developing general educative actions (e.g. programmes for consciousness-raising regarding climate change, hunger, gender) and formulating proper research questions ( Ligozat et al. , 2020 ) are also advisable actions.

As proof of this commitment, the number of HEIs that are signing declarations and agreements (e.g. the Talloires Declaration, the Halifax Declaration, the SDG Accord and the Abuja Declaration) designed to foster sustainable development (SD) and education for sustainable development has increased over time ( Lozano et al. , 2015 ). Linked to the sustainability actions, the publication of reports is another instrument for institutions to disclose and publicise their commitment to sustainability and, by extension, the SDGs ( Caputo et al. , 2021 ). However, universities have been slow to adopt sustainability practices ( Lozano, 2006 ; Velazquez et al. , 2006 ) due to a resistance to change and a lack of awareness, interest and funding. For HEIs to fully engage, achieve and promote the SDGs, they must overcome the lack of long-term policies, insufficient resources ( De La Poza et al. , 2021 ) and the absence of specific targets for countries in which higher education requires reform and rebuilding ( Heleta and Bagus, 2021 ).

In parallel with this situation, the literature related to sustainability and higher education has been continuously evolving, building different areas of interest. Such areas include the relation between universities and specific sustainable topics, such as urban planning, energy efficiency or climate change ( Figure 1 , cluster in red) or the relationship between sustainability and education ( Figure 1 , cluster in blue), training, students ( Figure 1 , cluster in green) and organisation, management and public relations ( Figure 1 , cluster in yellow) [ 1 ]. Therefore, all the sustainability dimensions are addressed. Regardless of the field, a considerable number of publications that include SDG-related keywords [ 2 ] can be observed in Scopus data (from 109 publications in 2015 to 2,904 publications in 2020), which could reconfigure these clusters in the short term.

Given that HEIs are key actors within the context of global sustainability, understanding their contribution to the SDGs remains crucial. However, while an increasing number of universities are aligning their activities with the SDGs, documenting and evidencing the wide variety of activities relevant to sustainable development being undertaken by universities is still needed ( Chankseliani and McCowan, 2021 ). In the same vein, appropriate tools to measure this contribution have not yet been developed. It is therefore important to design alternative initiatives or rankings that measure universities' contributions to sustainability goals, solve methodological problems and provide data collection at the global level.

Global university rankings have long been criticised for, among other issues:

  • their reliance on specific data sources (i.e. biased towards the Global North);
  • their focus on the research dimension ( Safón, 2019 ); and
  • the arbitrary weighting used to rank institutions ( Gadd, 2021 ).

In addition, those global rankings do not take societal aspects into consideration (e.g. open science initiatives, sustainability or diversity), thus limiting the use of rankings in determining whether HEIs contribute to the SDGs. This leads the scientometric community to advocate for fair and responsible rankings ( Gadd, 2020 ). The translation of the policy language goals into performance indicators has been indicated to be a primary concern ( Ràfols, 2020 ) because quantitative metrics might miss many of the values of the targets ( Unterhalter, 2019 ; Torabian, 2019 ).

Despite the wide variety of university rankings ( Orduna-Malea and Perez-Esparrells, 2021 ), those valuing sustainability in its broader sense (green rankings) are scarce and represent a change in ranking design ( De La Poza et al. , 2021 ). The most popular is the GreenMetric World University Ranking, a global sustainability ranking developed by Universitas Indonesia (UI) in 2010 that weights the following dimensions: setting and infrastructure (15%), energy and climate change (21%), waste (18%), water (10%), transportation (18%) and education (18%). However, its simplicity has been considered a crucial limitation, in terms of its “categories and indicators in comparison with other systems and the demands of the data types required are generally low for participants and less empirical than those used in other systems” ( Lauder et al. , 2015 ). Galleli et al. (2021) compared this green ranking with the Times Higher Education World University Rankings (THE-WUR), finding a significant gap in the specificity of the two rankings. Other ranking and evaluation tools related to the commitment of universities to sustainability (at a national or regional level) have also emerged. Nonetheless, these initiatives are geographically centred on Anglo-Saxon countries (e.g. AASHE's STARS in the USA and the People and Planet Green League in the UK).

First introduced in 2015, the SDG Index was the first global ranking system translating the SDG narrative to measure contributions to the goals at the country level, using a wide range of appropriate indicators for each goal ( Lafortune et al. , 2018 ). However, data collection limitations, unmeasurable metrics and the low accuracy of the composite index jeopardised its utility ( Diaz-Sarachaga et al. , 2018 ). The Times Higher Education Impact Rankings (THE-IR) was launched in 2019 as the first global ranking at the institution level aimed at measuring the contribution of HEIs to each of the 17 SDGs.

Despite the youth of this ranking (the third edition appeared in 2021), THE-IR has already aroused interest in the higher education community, with both detractors and followers. Torabian (2019) considers this ranking as a positive move that demonstrates HEIs (and the broader public) want to address SD. However, the pursuit of certain indicators or rankings (e.g. in terms of research output) might encourage practices that conflict with the goals themselves. Cardozo et al. (2021) analysed the 2020 edition and found 16.67% of Ibero-American universities were indexed in the overall ranking. SDG8, “Decent work and economic growth”, was the goal most addressed (9.81%), followed by SDG3, “Good health and well-being” (9.68%) and SDG4, “Quality education” (7.69%). Iskandaryan (2020) described how one Russian university is implementing SDG4, while De La Poza et al. (2021) assessed the level of reporting and alignment of SDG achievements with the overall THE-WUR ranking score. They found that the highest-ranked universities are more committed to SDG9, “Industry, innovation and infrastructure” and SDG16, “Peace, justice and strong institutions”. Finally, Calderon (2021) offered a critical overview of this ranking and suggested that it needs to be contextualised, given the worldwide scope of SDGs, according to a regional or national basis that allows a like-to-like comparison between countries.

As with any other ranking, THE-IR should be analysed to ascertain whether its methodology reflects the degree of sustainability of universities, and whether its results are accurate enough to be used as a data source for research and strategic decision-making. While THE-IR has been analysed in the literature, these studies do not include a comprehensive analysis of the THE-IR methodology. De la Poza et al. (2021) compared THE-IR and THE-WUR to discover whether performance on one SDG in THE-IR is related to the scores obtained in THE-WUR, considering universities' regions and disciplines. However, they did not cover the raw overlap between THE-IR and THE-WUR (which might reveal not only a potential influence of the latter on the former but also the existence of specific geopolitical strategies to gain visibility in this ranking). Finally, previous literature has not included all available editions. Given the high variability between the first and second editions (due to the incorporation of all 17 SDGs in the second edition, instead of the 11 SDGs considered in the first), the inclusion of the third edition seems necessary to offer a wider and more accurate picture of this ranking and its methodology.

Therefore, this study aims to analyse and evaluate the THE-IR methodology, coverage and scores offered.

2. Times Higher Education Impact Rankings: a methodological approach

THE-IR follows a complex methodology that includes numerous metrics and composite indicators, which should be briefly detailed to comprehend and appropriately contextualise the main results offered in this study. The ranking intends to measure the contribution of universities to each of the 17 goals. It includes all universities teaching at either undergraduate or postgraduate level; there are no minimum research requirements. The methodology consists of a micro level (scores for each SDG) and a macro level (creation of an overall score).

At the micro level, each SDG aims to capture the contribution of universities across four wide areas: research; teaching; stewardship (i.e. universities as custodians of significant resources, not only physical but also human); and outreach (i.e. the work universities undertake with their local, regional, national and international communities). To accomplish this, a set of metrics is assigned to each SDG. These metrics are grouped into three categories (research metrics, continuous metrics and evidence metrics) based on the nature and purpose of each metric ( Table 1 ). While research metrics are derived from an external product (Scopus), the remaining metrics (continuous and evidence metrics) derive from institutional data provided privately and directly by the HEIs to the ranking publisher.

Following this design, each SDG is organised into different groups of metrics. The first group always comprises the research metrics (27% of the total SDG score). The remaining groups (composed of continuous and evidence metrics) depend on each SDG. In the end, a final score (from 0 to 100) is obtained for each SDG.

At the macro-level, only four SDGs are weighted to build the overall score. SDG17 is measured for all universities (22% of the overall score), while the remaining three SDGs selected are those achieving the top three highest scores for each university (26% each), excluding SDG17.
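The macro-level weighting just described can be expressed as a short function. This is a sketch under the weights stated above (22% for SDG17, 26% for each of the three best remaining SDGs); the SDG scores used in the example are invented.

```python
# Overall-score weighting as described above: SDG17 always counts 22%,
# and the university's three best-scoring remaining SDGs count 26% each.
# The SDG scores below are invented for illustration.

def overall_score(sdg_scores):
    """sdg_scores maps SDG number -> score (0-100); SDG 17 must be present."""
    others = sorted(
        (score for sdg, score in sdg_scores.items() if sdg != 17),
        reverse=True,
    )
    top_three = others[:3]  # best three SDGs, excluding SDG17
    return 0.22 * sdg_scores[17] + sum(0.26 * s for s in top_three)

scores = {3: 80.0, 4: 60.0, 9: 90.0, 11: 70.0, 17: 50.0}
print(round(overall_score(scores), 2))  # 0.22*50 + 0.26*(90 + 80 + 70) = 73.4
```

Because only the top three of a university's non-SDG17 scores enter the sum, any scores beyond those (here, SDG4's 60.0) contribute nothing to the overall result.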

A summative content analysis ( Harwood and Garry, 2003 ) and a descriptive statistical analysis have been used to accomplish the objectives of this work.

The summative content analysis seeks to quantify the content of documents and texts in terms of predetermined categories, in a systematic and replicable manner. In this particular case, the Impact Rankings Methodology 2021 v.1.3 ( THE Impact Ranking , 2021) was inspected to determine the use of research-related metrics: specifically, the types of indicators used per SDG, their weight in each SDG and their breadth of use throughout the 17 SDGs. The analysis was limited to the research-related metrics for the following reasons. First, these metrics are bibliometric indicators that are provided by a commercial database and have been extensively analysed in the bibliometric literature, providing clear guidance for evaluating their appropriateness. Second, these metrics are transversal in the THE-IR methodology (research metrics are a fixed category used to measure each SDG). Third, these metrics represent a fixed weight of 27% of the score for each SDG, rendering their influence on the final score noteworthy. Continuous and evidence metrics have not been directly analysed because they are provided privately by the HEIs.

Descriptive statistical analysis is used to describe the characteristics of a sample or data set. In this particular case, data from all available editions of the THE-IR (2019, 2020 and 2021) were considered. All data were collected directly from the official website in May 2021 through web scraping. The same procedure was followed to collect data for the THE-WUR for the same ranking editions. For each university, the university name, the university region (country) and all performance scores were extracted and analysed through descriptive statistics. Pearson correlations were used to assess the stability of the distribution of universities providing specific SDG data over the different ranking editions. Then, a pairwise comparison method was used to compare both rankings. Country information was matched to geographical regions according to a list provided by the United Nations [ 3 ]. In addition, information regarding the designation of “developing” and “developed” areas was included. ArcGIS [ 4 ] software was used to plot the coverage and overlap maps, and Venn Diagram Plotter [ 5 ] software was used to create the Venn diagram.
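The pairwise comparison step reduces to set arithmetic over the lists of universities scraped from each ranking. A minimal sketch, using invented placeholder names rather than the real scraped data:

```python
# Sketch of the pairwise ranking comparison: raw overlap between the sets
# of universities appearing in THE-IR and THE-WUR for one edition.
# The university names are invented placeholders.

the_ir = {"Univ A", "Univ B", "Univ C", "Univ D"}
the_wur = {"Univ B", "Univ C", "Univ E"}

def overlap_measures(ranked_a, ranked_b):
    common = ranked_a & ranked_b
    return {
        "common": len(common),                      # universities in both rankings
        "share_of_a": len(common) / len(ranked_a),  # fraction of A also ranked in B
        "share_of_b": len(common) / len(ranked_b),
        "jaccard": len(common) / len(ranked_a | ranked_b),
    }

m = overlap_measures(the_ir, the_wur)
print(m["common"], m["jaccard"])  # 2 universities in common, Jaccard 2/5 = 0.4
```

In practice the hard part is not this computation but matching university names consistently across the two scraped sources.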

4.1 Analysis of the impact rankings methodology

The THE-IR exhibits a ranking design that raises several concerns at both the macro and micro levels. At the macro-level structure, considering only four of the 17 SDGs is debatable, as it allows a university to achieve the top position with zero values for 13 SDGs. However, the major concern is that only one SDG (SDG17, “Partnership for the goals”) is common to all universities, while the remaining scores correspond to the three SDGs on which each university scores best. This means that universities are compared against different SDGs yet mixed into the same classification. While not explicitly addressed in the Berlin Principles [ 6 ], this procedure breaks with effective practice in building university rankings.

Another consequence of the method followed by this ranking is that longitudinal analyses are not possible, as universities might be scored on different SDGs across editions. For example, Trinity College Dublin (Ireland) was evaluated on SDG5, SDG9 and SDG11 in 2020 but on SDG3, SDG9 and SDG16 in 2021. The University of Bahrain constitutes an even more extreme case, as all the SDGs evaluated were different (SDG2, SDG4 and SDG11 in 2020; SDG5, SDG8 and SDG16 in 2021). This is not an isolated phenomenon: of the 695 institutions ranked in both the 2020 and 2021 THE-IR editions, only 23% were evaluated through the same four SDGs.

This means that the overall score for a university might be generated from data completely different to that used in the previous ranking edition. This could partly explain the abrupt position changes several universities have exhibited in successive editions. Examples, all of which reduce the reliability of the ranking, include France's Aix-Marseille University (from 20th in 2020 to 201–300th in 2021), Finland's Aalto University (from 47th in 2020 to 201–300th in 2021), Saudi Arabia's Princess Nourah bint Abdulrahman University (from 401–600th in 2020 to 27th in 2021) and Indonesia's Institut Teknologi Sepuluh Nopember (from 401–600th in 2020 to 64th in 2021).

Moreover, the weights assigned to each of the four counted SDGs are subjective and not explicitly justified. SDG17 is a more generic and controversial goal, so its fixed inclusion and weighting in the final score are difficult to justify. These arbitrary weights break Principle 9 ( Make the weights assigned to different indicators [if used] prominent and limit changes to them ) of the Berlin Principles on Ranking Higher Education Institutions [ 7 ]. While some voices have criticised these principles as being decoupled from actual ranking practices ( Barron, 2017 ), updating them would likely uncover an even greater number of inconsistencies.

The final scores include data ranges for some universities. While the use of ranges is generally a useful practice to minimise the use of rigid scores, as well as to tolerate margins of error in data, excessively wide ranges produce other side effects. For example, THE-IR’s 2021 edition reveals 103 universities with an overall score of 9.2–36.4, which makes the indicated value useless.

At the micro level, the scores obtained for each SDG can also admit data ranges, finding the same limitation as that mentioned above. For example, 103 universities obtained a score of 0.3–29.0 for SDG12 in the THE-IR’s 2021 edition. Even more extreme behaviour was found concerning the University of Alkafeel (Iraq), which obtained a score of 0.1–38.4 for SDG7. It is unclear how the overall score is calculated when the four SDGs considered have scores with such wide ranges.

The mixture of metrics used to measure each SDG introduces another discussion point. The score for each SDG mixes indicators related to “research on topics related to the sustainable development goals” with metrics related to “the sustainability of the university as a whole institution”. Furthermore, the fact that a university conducts research on topics related to these goals does not necessarily imply that its researchers are contributing to the development of the SDGs or working sustainably.

The number of indicators used per SDG and the distribution of indicators by type (research, continuous and evidence metrics) are likewise not homogeneous, allowing metrics to carry extremely different weights in the final SDG score. For example, SDG4 (“quality education”) is built upon only four indicators, while SDG11 (“sustainable cities and communities”) is built upon 19 ( Table 2 ).

The raw methodology includes 231 different indicators. All 48 research indicators used across the 17 SDGs are displayed in Table 3 , accompanied by their specific weight in each of the SDGs measured (definitions of these metrics are available in the supplementary material at https://doi.org/10.21950/HSFRDH ).

These results raise the following concerns. A co-authorship indicator is introduced in the measurement of three SDGs: cooperation with developing countries (measured twice, in SDG1, “poverty”, and SDG17, “partnership for the goals”) and gender (SDG5, “gender equality”). However, co-authorship is not considered in the remaining SDGs.

Each SDG (from 1 to 16) includes a productivity indicator (referred to as Publications in Table 1 ), which is tailored to each SDG by using topic-based queries in Scopus ( Iskandaryan, 2020 ). While this approach is topic-sensitive and denotes a substantial effort, the use of queries in Scopus can be controversial, as different query approaches can alter the resulting country rankings ( Armitage et al. , 2020 ). SDG17 includes this same indicator but also counts all publications regardless of topic; the same publication can therefore be measured in at least two SDGs by default. Moreover, this indicator exhibits different weights: 13.55% (SDG17, “partnership for the goals”), 13% (SDG8, “decent work”), 11.60% (SDG9, “industry”), 10% (SDG1, “poverty”) and 7% in the remaining goals. This variability is not justified. In addition, the indicator is not normalised according to the size of the university, and several principles (e.g. Principles 2, 4, 6 and 9) of the Leiden Manifesto for research metrics are somewhat compromised ( Hicks et al. , 2015 ).

Concerning the measurement of citation-based impact, a normalised indicator (Field-Weighted Citation Impact) is used. However, its usage is limited to a small number of SDGs (3, 4, 5, 8, 9 and 17). The absence of this indicator from the other SDGs does not seem clearly justified.

A journal-level metric (CiteScore) is widely used in this ranking. However, it is excluded from four SDGs (1, 3, 9 and 17). While the generic nature of SDG17 can explain its absence there, it remains unclear why CiteScore is not used in the remaining three SDGs. Moreover, the weight of this indicator is 10% for all SDGs except SDG8 (“decent work and economic growth”), where it is 14%. The methodology details that this indicator actually measures the proportion of a university's publications that, according to the CiteScore metric, appear in the top 10% of journals. The arbitrariness of this threshold (10%), the absence of discipline filtering (CiteScore is not field-normalised) and the nature of multidisciplinary journals render this count of publications debatable.
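The journal-level count criticised here — the share of a university's publications in journals whose CiteScore sits in the top 10% — can be sketched as a threshold rule. The percentile values below are invented for illustration.

```python
# Sketch of the journal-level indicator discussed above: the proportion of
# publications appearing in journals ranked, by CiteScore, in the top 10%
# of journals. The percentile values below are invented.

def top_decile_share(journal_percentiles, threshold=90):
    """journal_percentiles: CiteScore percentile (0-100, higher is better)
    of the journal each publication appeared in."""
    hits = sum(1 for p in journal_percentiles if p >= threshold)
    return hits / len(journal_percentiles)

percentiles = [95, 40, 91, 60, 88, 99, 72, 90, 15, 85]
print(top_decile_share(percentiles))  # 4 of 10 publications -> 0.4
```

Note that the threshold parameter is exactly the arbitrary cut the text criticises, and nothing in the computation is field-normalised.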

An altmetric measure (paper views) is included in SDG3 (“health”) and SDG4 (“education”), but it does not appear in any other SDG. The use of Scopus as a data source might indicate that PlumX is being used as an altmetric data source, but no explicit information has been found. The inclusion of this indicator, with a final weight of 10% in the overall SDG score, is therefore not justified.

Finally, an extra indicator measuring patent-citing publications (referred to as citing patents) is used in SDG9 (“industry, innovation and infrastructure”), but it sits outside the research indicators group. This decision is also debatable, as the parameter measures citations received by publications from patents, while a similar indicator (clinical citations) used in SDG3 (“health”) is included in the research indicators group. As a citation-based indicator, citing patents also has a transversal nature, which would make it appropriate to measure in all SDGs, like the other bibliometric indicators used.

4.2 Institutional participation

The participation of universities in the THE-IR has increased over the editions. In 2020, 768 universities with data in at least one SDG were ranked (out of 859). For the 2021 edition, up to 1,117 universities with data in at least one SDG were ranked (out of 1,240). The number of SDGs for which universities submitted data varied significantly between institutions.

Considering the 2021 edition, we can distinguish four types of universities. First, the interested universities (9.5% of all universities submitting information): these submitted data for fewer than four SDGs and are consequently not ranked in the overall score. Second, the strategic universities (36.9%): these submitted data for a small number of SDGs, around the minimum required (between 4 and 6). Third, the committed universities: these provided data for a large number of SDGs (between 7 and 15). Finally, the outperformer universities: these submitted data for almost all available SDGs ( Figure 2 ).
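The four participation profiles above can be expressed as a simple classification over the number of SDGs submitted. This is a sketch based only on the thresholds stated in the text (<4, 4–6, 7–15, and above); the labels are the article's own terms, not THE's.

```python
# Classify a university by the number of SDGs it submitted data for,
# following the typology described in the text.

def participation_type(n_sdgs):
    if n_sdgs < 4:
        return "interested"    # below the minimum of four; not ranked overall
    if n_sdgs <= 6:
        return "strategic"     # around the minimum required
    if n_sdgs <= 15:
        return "committed"     # a large number of SDGs
    return "outperformer"      # almost all 17 SDGs

print([participation_type(n) for n in (2, 4, 10, 17)])
# ['interested', 'strategic', 'committed', 'outperformer']
```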

The distribution of universities per number of SDGs submitted shows a slight change from 2020 to 2021 ( R = 0.62; alpha value = 0.01). Comparing the two editions, we can see an increase in the raw number of universities providing only the four required SDGs (200 universities in 2021), the most frequent behaviour, followed by those providing data on 16 SDGs (184 in 2021). There was a marked decrease in institutions providing data for all 17 SDGs, from 19.1% of all submitting universities in 2020 to 7.4% in 2021.

The number of universities providing data on each SDG (referred to as “providing universities” in Table 4 ) remains stable across the 2020 and 2021 editions ( R = 0.95), as does the number of universities for which the corresponding SDG is used in the overall score (referred to as “measured universities” in Table 4 ) ( R = 0.99; alpha value = 0.01). SDG4 (“quality education”), SDG5 (“gender equality”) and SDG10 (“reduced inequalities”) are the SDGs most frequently submitted by universities, whereas SDG2 (“hunger”) and SDG14 (“life below water”) are the least frequently provided. In addition, anomalous behaviour is detected for SDG2, with a significant drop in the number of universities providing data on it from 2020 (33.9% of all universities) to 2021 (10.9%). However, SDG2 is used in the overall score for 83% of the universities providing data on it, an elevated success rate ( Table 4 ).
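The edition-to-edition comparisons above rest on Pearson's R between per-SDG submission counts. As a minimal sketch, the coefficient can be computed directly from two count vectors; the values below are illustrative, not the actual Table 4 figures.

```python
# Pearson's R between the number of universities submitting each SDG in two
# editions. The counts are invented for illustration.
from math import sqrt

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

counts_2020 = [300, 450, 520, 610, 580]   # universities per SDG (illustrative)
counts_2021 = [340, 610, 700, 820, 790]
print(round(pearson_r(counts_2020, counts_2021), 2))
```

A value close to 1 indicates that the relative popularity of the SDGs barely changed between editions, which is the pattern the article reports.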

4.3 Geopolitical analysis

Results confirm an increase in the participation of countries in the THE-IR, from 76 in 2019 to 94 in 2021, exceeding the number of countries in the THE-WUR and the UI GreenMetric ranking for the first time ( Table 5 ). Complete data on the evolution of the number of universities by country in the three editions of THE-IR and THE-WUR is available in the supplementary material.

Japan and Russia are the most represented countries in the THE-IR’s 2021 edition (both with 75 universities). These countries are followed by the UK (50) and Turkey (49) [ 8 ]. The increasing presence of several countries in the different editions of the THE-IR is remarkable. For example, Russia included 45 new universities in 2021 that were not included in the 2019 edition, India had 36 new universities, Iraq 34 and Brazil 23.

The overlap between the THE-IR and the THE-WUR is shown in Figure 3 . The raw number of universities included in both rankings is growing over the years (from 328 to 687). However, this value is misleading because the coverage of both rankings has also increased: 68.36% of the universities indexed in the THE-IR were not indexed in the THE-WUR in the 2020 edition, and this value drops to 61.5% in 2021.
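The percentages just cited are straightforward set computations over the two membership lists. A minimal sketch (with hypothetical university names) follows:

```python
# Share of THE-IR universities that are absent from THE-WUR, computed from
# two membership sets. The institution names are placeholders.

def share_not_in_wur(ir_universities, wur_universities):
    only_ir = ir_universities - wur_universities
    return len(only_ir) / len(ir_universities)

ir = {"Uni A", "Uni B", "Uni C", "Uni D"}
wur = {"Uni A", "Uni E"}
print(share_not_in_wur(ir, wur))  # 0.75 (three of four THE-IR members not in THE-WUR)
```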

Several countries participating in the THE-IR are not indexed in THE-WUR (it is noteworthy that institutions do not choose to participate in the THE-WUR; rather, they are selected based on their performances). These countries predominantly come from Latin America and the Caribbean (e.g. Uruguay, Paraguay, Ecuador, Dominican Republic, Curaçao) or West Asia (Palestine, Bahrain, Azerbaijan and Armenia). The notable presence of Uzbekistan (12 universities), Ecuador (10 universities indexed) or Azerbaijan (6 universities) should be emphasised ( Table 6 ).

Other countries exhibit extremely unbalanced behaviour regarding their presence in the THE-IR, particularly in the 2021 edition. For example, China has 91 universities in the THE-WUR (4th by country rank) but only 13 participating in the THE-IR (26th by country rank). Germany has 48 universities in the THE-WUR (9th by country rank), while only six participate in the THE-IR (44th by country rank).

If we limit the analysis to the elite positions (top 200), we can observe a greater variety of countries in the THE-IR (37 countries) than in the THE-WUR (27 countries). However, despite this greater variety, most top universities are still located in developed countries (the UK with 38, Australia with 24 and Canada with 20 universities). Moreover, only 52 of the top 200 universities ranked in the THE-WUR participated in the THE-IR in 2021 (42 in 2020). These results provide evidence that the top universities indexed in the THE-WUR are not yet participating in the THE-IR, although this behaviour may change in coming years if the THE-IR coverage continues to grow.

The distribution of universities throughout the three editions is presented in Figure 4 . On the one hand, East Asia (211 universities in 2019; 231 in 2020; and 250 in 2021), North America (199 in 2019; 202 in 2020; and 211 in 2021) and Northern Europe (147 in 2019; 154 in 2020; and 155 in 2021) are the best-represented regions in the THE-WUR. On the other hand, East Asia (55 universities in 2019; 92 in 2020), Latin America and the Caribbean (55 in 2019 and 88 in 2020), West Asia (86 in 2020 and 140 in 2021) and Eastern Europe (51 in 2019 and 133 in 2021) are the best-represented regions in the THE-IR. Thus, the presence of developing countries in the THE-IR has increased significantly (187%) in comparison with the THE-WUR, which might suggest a different strategy for these countries when participating in this SDG ranking.

The regions presenting the highest overlap between both rankings are East Asia (from 43 overlapping universities in 2019 to 85 in 2021) and Eastern Europe (from 31 in 2019 to 70 in 2021), whereas the African regions (East, West and Southern Africa) scarcely overlap.

5. Discussion

Despite other initiatives launched to measure sustainable development, the THE-IR is the first worldwide ranking to assess the contribution of HEIs towards the SDGs. This study has attempted to explain the methodology, coverage and geopolitical issues (by area and by country) related to this emergent ranking.

First, the content analysis revealed inconsistencies in the appropriateness and weights of the research metrics employed ( Table 3 ), which might jeopardise the alignment of these metrics with contributions to the SDGs. These results reinforce general concerns already raised in the literature about the use of performance metrics to measure sustainability ( Torabian, 2019 ; Calderon, 2021 ; De la Poza, 2021 ; Rafols et al. , 2021 ). Beyond research metrics, the irregularities uncovered support the work by Gadd (2020 , 2021 ) with new insights, showing that the lack of a solid ranking structure (methodological design) can cause adverse consequences (unrealistic results, incorrect decision-making and opportunistic behaviours). However, the continuous and evidential metrics have not been analysed: because the review of this institutional evidence is not public, the final evaluation of each SDG cannot be reproduced. Consequently, the final scores and ranking positions used in the Results section should be interpreted with this caveat in mind. Future studies should address these metrics to obtain a complete picture of the THE-IR method; for example, systematic surveys among university staff could provide insights into the accuracy and stability of the THE-IR.

On the other hand, the results related to the THE-IR/THE-WUR coverage and overlap should also be discussed, because the rules governing participation in these rankings differ. While participation in the THE-WUR is based on performance, participation in the THE-IR is voluntary. This can explain the growing number of universities from developing countries (most of them new HEIs) participating in the THE-IR as a reputational phenomenon resulting from exclusion from the THE-WUR. The HEIs of developing countries may be becoming aware of sustainability’s relevance for promoting their universities and strengthening their academic brands. Thus, concurring with Hazelkorn and Mihut (2021) , this ranking represents an opportunity for institutions to accrue intangible assets in the form of reputation and competitiveness within the ranking’s sphere of influence by pursuing the SDGs and improving their positions in the ranking. In turn, as Hazelkorn (2015) foresaw, the university systems of these countries have noted how their flagship universities are positioned more highly in this ranking system than in the THE-WUR and how newcomers have emerged onto the scene of global rankings, reshaping the landscape of these institutions.

The low participation observed among elite universities might be related to a cost-benefit analysis at the institutional level. The cost of compiling evidence, and the work it entails, does not offer a significant advantage in improving their academic image, which their presence in the THE-WUR already ensures ( De la Poza et al. , 2021 ). Likewise, given that the engagement of universities with sustainability goals is evaluated, obtaining lower positions in the THE-IR could affect their reputation. Therefore, non-participation is a plausible option for most top research universities.

The voluntary nature of participation and the general lack of incentives and funding to foster the involvement of countries might explain the low participation of other regions (e.g. Eastern Africa). Institutions with fewer resources cannot afford to participate, as it requires extra effort, which constitutes a paradox given that this ranking aims to promote sustainability goals. However, the rising participation of countries from other developing regions (e.g. Latin America, the Caribbean and West Asia), which participate exclusively in the THE-IR, suggests a strategic interest (and investment) in promoting “sustainability” branding and increasing academic reputation.

More precisely, the surprising percentage of universities (16.1% in the 2021 edition) providing only the minimum required data to be ranked (four SDGs; Figure 2 ) reinforces the hypothesis that universities in certain regions are making strategic efforts to participate in this ranking, exploiting the possibilities that the designed methodology allows. This is one of its most important unintended consequences. Providing minimum data – a practice related to “cherry-picking” ( Forestier and Kim, 2020 ) and “SDG washing” ( Buhmann, 2018 ) – should not be an acceptable approach, given the integrative and indivisible nature of the agenda ( United Nations, 2015 ). As Calderon (2021) suggests, increasing the number of mandatory SDGs and basing the overall score on them could provide more stability to forthcoming editions of this ranking.

Finally, this study has used the available online ranking information as a primary source, which relies on the goodwill and evidence of universities (the information on the SDGs is submitted by each university to the THE-IR). The authors are unaware of the quality procedures the ranking publisher has adopted for considering, evaluating and displaying the data gathered. Any remaining errors are our responsibility.

6. Conclusions

The THE-IR is the first global university ranking intended to measure HEIs against all the sustainability goals. However, the coverage, methodology and results of this ranking raise concerns about their misalignment with the objectives of the United Nations agenda. The key conclusions are presented below in two parts:

6.1 Ranking method

At the macro-structure level, the THE-IR breaches acceptable standards for designing university rankings (two universities can be ranked in the same list through different indicators, and one university can be measured with different data in different editions), hindering the comparability of the results and limiting their understandability. At the micro-structure level, the use of research metrics compromises several of the Leiden Manifesto principles for research evaluation, while the use of scores with extremely wide ranges prevents an accurate interpretation of the final ranked positions. For these reasons, we conclude that the results derived from this ranking may not be as accurate as assumed.

6.2 Ranking coverage and overlap

The results reveal an increasing number of countries participating in the THE-IR, with a greater representation of worldwide regions in elite positions (top 200) than that obtained in the THE-WUR. The growing number of universities from developing countries and the absence of world-class universities reflect a lack of interest in this ranking from elite universities and an opportunity for less-esteemed institutions. For this reason, it could be concluded that this emergent ranking is being used primarily by institutions that do not achieve leading positions in the THE-WUR. Such institutions can build strategies to achieve higher positions, benefiting from the serious limitations found in the ranking methodology.

In light of these findings, the following improvements to the ranking methodology are suggested:

Avoid global scores and stick to the specific un-ranged scores of each SDG, where each university is compared through the same battery of indicators.

Reward the number of SDGs on which each university submits data, and the achievement of minimum scores (thresholds) to minimise strategic actions.

Reduce the weight of research metrics, and use these indicators more equitably across all SDGs, avoiding measuring the same indicator in different SDGs.

Offer disaggregated results for each SDG, distinguishing research (which denotes interest in and impact on issues related to an SDG, but does not necessarily imply being sustainable) from teaching, transfer and institutional management.

Value the regional or national orientations of universities as well as other qualitative dimensions (e.g., HEIs’ societal contribution).

The findings have clear implications for different actors in the university rankings arena. Firstly, university managers can take the THE-IR’s validity into account when making informed decisions and when preparing their institutions to participate in sustainability-related rankings. Secondly, university ranking researchers and practitioners can access a detailed analysis of the THE-IR to determine its properties as a ranking and use raw THE-IR data in other research studies or reports. Thirdly, these results can help the ranking publisher to improve or resolve some of the inconsistencies found in this study. Fourthly, public policies related to evaluating the contribution of universities to the SDGs should avoid relying on the overall results provided by the THE-IR as much as possible, due to the inconsistencies identified in this work. If the ranking is used, we strongly recommend presenting each SDG separately, indicating the number of universities analysed for that SDG, and directing readers to the full current methodology. Otherwise, the information provided in public policies might be misinterpreted, leading to inappropriate decisions, which could in turn have economic and reputational consequences for higher education institutions. Given the global audience of the THE-IR, the results obtained contribute to minimising the distorted view of sustainability in higher education institutions that the THE-IR projects to society, and alert governments, higher education bodies and universities to take precautions when making (economic) decisions based on this ranking.

Last, further research should be undertaken on specific case studies that can provide insight into the practical implementation of the ranking. The motivations of different types of organisations (comprehensive, technical, public, private or non-profit universities, or institutions located in large cities) for participating in this ranking should also be explored, particularly considering the distinction between developed and developing countries. It might also be interesting to conduct qualitative research (e.g. a Delphi analysis) with university leaders to determine the perceived advantages and disadvantages of participating in the THE-IR and whether doing so meaningfully contributes to assessing the impact of universities on achieving the SDGs.

Map of keyword co-occurrences of publications on sustainability and higher education

Source: Scopus; powered with VOSviewer ( https://www.vosviewer.com ).

Type of university participation in the Impact Rankings

Overlap of universities between the three editions

Distribution of universities in the THE-WUR (right) and the THE-IR (left) over the years

Overview of metrics analysed in the impact rankings

Research indicators used in the THE-IR (2021 edition)

Number of universities providing data for each SDG in the THE-IR

Unique countries participating in the THE-IR but not indexed in the THE-WUR

A unique value indicates the number of universities in the three editions.

WA: West Asia; WE: West Europe; SSA: Sub-Saharan Africa; SEA: South-east Asia; LAC: Latin America and the Caribbean; SE: East Europe; NE: North Europe; NA: North Africa; CA: Central Asia

The following query was performed in Scopus: TITLE-ABS-KEY (universit* OR “higher education”) AND TITLE-ABS-KEY (sustainab*). This query is intended to gather publications regarding both sustainability and higher education. From the corpus of publications gathered, all those containing at least the following keywords were filtered out: “Sustainable Development”, “Sustainability”, “Education”, “Higher Education”, “University”, “University Sector”, “Universities”, “Higher Education Institutions”, “Education for Sustainable Development”, “Sustainability Education”, “Sustainable Development Goals”, “University Campus”. The field was limited to the social sciences. A total of 19,809 keywords were obtained.
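One reading of the filtering step just described can be sketched as follows: publications are retained when their keyword list contains at least one of the terms enumerated in the text. The records below are mock data, not the actual Scopus export.

```python
# Sketch of the keyword-based selection step. TARGET_KEYWORDS reproduces the
# list given in the text; the records are invented for illustration.

TARGET_KEYWORDS = {
    "Sustainable Development", "Sustainability", "Education", "Higher Education",
    "University", "University Sector", "Universities",
    "Higher Education Institutions", "Education for Sustainable Development",
    "Sustainability Education", "Sustainable Development Goals",
    "University Campus",
}

def select_records(records):
    """Keep records containing at least one target keyword."""
    return [r for r in records if TARGET_KEYWORDS & set(r)]

records = [
    ["Green Campus", "Energy"],                  # no target keyword: dropped
    ["University", "Curriculum"],                # kept
    ["Sustainability Education", "Assessment"],  # kept
]
print(len(select_records(records)))  # 2
```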

“Sustainable Development Goals”, “Sustainable Development Goal”, “SDG”, “SDGs”, “Sustainable Development Goals (SDGs)”.

available at: https://unstats.un.org/unsd/methodology/m49/#geo-regions

available at: www.arcgis.com

available at: https://omics.pnl.gov/software/venn-diagram-plotter

available at: http://200.6.99.248/∼bru487cl/files/Berlin_Principles_Release.pdf

available at: http://ireg-observatory.org/en_old/berlin-principles

Japan is the country that includes the most universities in all three published ranking editions.

Armitage , C.S. , Lorenz , M. and Mikki , S. ( 2020 ), “ Mapping scholarly publications related to the sustainable development goals: do independent bibliometric approaches get the same results? ”, Quantitative Science Studies , Vol. 1 No. 3 , pp. 1092 - 1108 , doi: 10.1162/qss_a_00071 .

Barron , G.R. ( 2017 ), “ The Berlin principles on ranking higher education institutions: limitations, legitimacy, and value conflict ”, Higher Education , Vol. 73 No. 2 , pp. 317 - 333 , doi: 10.1007/s10734-016-0022-z .

Buhmann , K. ( 2018 ), “ Future perspectives: doing good but avoiding SDG-washing. Creating relevant societal value without causing harm ”, in Mulder , H. , Scheltema , M. , van't Foort , S. and Kwant , C. (Eds), OECD Guidelines for Multinational Enterprises: A Glass Half Full , OECD , pp. 127 - 134 .

Calderon , A. ( 2021 ), “ Why SDG-focused impact rankings need to be contextualized ”, University World News , available at: www.universityworldnews.com/post.php?story=2021042815055074 (accessed 24 December 2021 ).

Caputo , F. , Ligorio , L. and Pizzi , S. ( 2021 ), “ The contribution of higher education institutions to the SDGs – an evaluation of sustainability reporting practices ”, Administrative Sciences , Vol. 11 No. 3 , pp. 1 - 13 , doi: 10.3390/ADMSCI11030097 .

Cardozo , M.L. , Japura , G.A. , Vargas-Murillo , A.R. , Sancho , G.F.I. and López , H.R.P. ( 2021 ), “ Characterization of Ibero-American universities by SDG in times higher education impact rankings 2020 ”, Turkish Journal of Computer and Mathematics Education , Vol. 12 No. 5 , pp. 689 - 700 , doi: 10.17762/turcomat.v12i5.1470 .

Chankseliani , M. and McCowan , T. ( 2021 ), “ Higher education and the sustainable development goals ”, Higher Education , Vol. 81 No. 1 , pp. 1 - 8 , doi: 10.1007/s10734-020-00652-w .

Collins , T.J. ( 2017 ), “ Review of the twenty-three year evolution of the first university course in green chemistry: teaching future leaders how to create sustainable societies ”, Journal of Cleaner Production , Vol. 140 , pp. 93 - 110 , doi: 10.1016/j.jclepro.2015.06.136 .

Cortese , A.D. ( 2003 ), “ Planning for higher education the critical role of higher education in creating a sustainable future need for a new human perspective envisioning a sustainable future ”, Planning for Higher Education , Vol. 31 No. 3 , pp. 15 - 22 , available at: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.607.6556&rep=rep1&type=pdf (accessed 24 December 2021 ).

De La Poza , E. , Merello , P. , Barberá , A. and Celani , A. ( 2021 ), “ Universities’ reporting on SDGs: using the impact rankings to model and measure their contribution to sustainability ”, Sustainability , Vol. 13 No. 4 , pp. 1 - 30 , doi: 10.3390/su13042038 .

Diaz-Sarachaga , J.M. , Jato-Espino , D. and Castro-Fresno , D. ( 2018 ), “ Is the sustainable development goals (SDG) index an adequate framework to measure the progress of the 2030 agenda? ”, Sustainable Development , Vol. 26 No. 6 , pp. 663 - 671 , doi: 10.1002/sd.1735 .

Fisher , A. and Fukuda-Parr , S. ( 2019 ), “ Introduction – data, knowledge, politics and localizing the SDGs ”, Journal of Human Development and Capabilities , Vol. 20 No. 4 , pp. 375 - 385 , doi: 10.1080/19452829.2019.1669144 .

Forestier , O. and Kim , R.E. ( 2020 ), “ Cherry-picking the sustainable development goals: goal prioritization by national governments and implications for global governance ”, Sustainable Development , Vol. 28 No. 5 , pp. 1269 - 1278 , doi: 10.1002/sd.2082 .

Gadd , E. ( 2020 ), “ University rankings need a rethink ”, Nature , Vol. 587 No. 7835 , p. 523 , doi: 10.1038/d41586-020-03312-2 .

Gadd , E. ( 2021 ), “ Love DORA, hate rankings ”, Impact of Social Sciences [blog] , available at http://eprints.lse.ac.uk/111336 (accessed 24 December 2021 ).

Galleli , B. , Teles , N.E.B. , dos Santos , J.A.R. , Freitas-Martins , M.S. and Junior , F.H. ( 2021 ), “ Sustainability university rankings: a comparative analysis of UI green metric and the times higher education world university rankings ”, International Journal of Sustainability in Higher Education , Vol. 23 No. 2 , doi: 10.1108/IJSHE-12-2020-0475 .

Harwood , T.G. and Garry , T. ( 2003 ), “ An overview of content analysis ”, The Marketing Review , Vol. 3 No. 4 , pp. 479 - 498 .

Hazelkorn , E. ( 2015 ), Rankings and the Reshaping of Higher Education: The Battle for World-Class Excellence , Palgrave Macmillan , London .

Hazelkorn , E. and Mihut , G. ( 2021 ), “ Introduction: putting rankings in context looking back, looking forward ”, in Hazelkorn , E. and Mihut , G. (Eds), Research Handbook on University Rankings: Theory, Methodology, Influence and Impact , Edward Elgar Publishing , Cheltenham , pp. 1 - 17 , doi: 10.4337/9781788974981.00008 .

Heleta , S. and Bagus , T. ( 2021 ), “ Sustainable development goals and higher education: leaving many behind ”, Higher Education , Vol. 81 No. 1 , pp. 163 - 177 , doi: 10.1007/s10734-020-00573-8 .

Hesselbarth , C. and Schaltegger , S. ( 2014 ), “ Educating change agents for sustainability – learnings from the first sustainability management master of business administration ”, Journal of Cleaner Production , Vol. 62 , pp. 24 - 36 , doi: 10.1016/j.jclepro.2013.03.042 .

Hicks , D. , Wouters , P. , Waltman , L. , De Rijcke , S. and Rafols , I. ( 2015 ), “ Bibliometrics: the leiden manifesto for research metrics ”, Nature , Vol. 520 No. 7548 , pp. 429 - 431 , doi: 10.1038/520429a .

Iskandaryan , R. ( 2020 ), “ Rethinking higher education through sustainable development goals (SDGs): a Russian perspective ”, in Kantola , J. , Nazir , S. , Salminen , V. (Eds), Advances in Intelligent Systems and Computing , Springer , pp. 328 - 334 , doi: 10.1007/978-3-030-50791-6_42 .

Lafortune , G. , Fuller , G. , Moreno , J. , Schmidt-Traub , G. and Kroll , C. ( 2018 ), Global Responsibilities. Implementing the Goals , Bertelsmann , p. 476 , available at: https://s3.amazonaws.com/sustainabledevelopment.report/2018/2018_sdg_index_and_dashboards_report.pdf (accessed 24 December 2021 ).

Lauder , A. , Sari , R.F. , Suwartha , N. and Tjahjono , G. ( 2015 ), “ Critical review of a global campus sustainability ranking: GreenMetric ”, Journal of Cleaner Production , Vol. 108 , pp. 852 - 863 , doi: 10.1016/j.jclepro.2015.02.080 .

Ligozat , A.L. , Névéol , A. , Daly , B. and Frenoux , E. ( 2020 ), “ Ten simple rules to make your research more sustainable ”, PLoS Computational Biology , Vol. 16 No. 9 , p. e1008148 , doi: 10.1371/journal.pcbi.1008148 .

Lozano , R. ( 2006 ), “ Incorporation and institutionalization of SD into universities: breaking through barriers to change ”, Journal of Cleaner Production , Vol. 14 Nos 9/11 , pp. 787 - 796 , doi: 10.1016/j.jclepro.2005.12.010 .

Lozano , R. , Ceulemans , K. , Alonso-Almeida , M. , Huisingh , D. , Lozano , F.J. , Waas , T. , Lambrechts , W. , Lukman , R. and Hugé , J. ( 2015 ), “ A review of commitment and implementation of sustainable development in higher education: results from a worldwide survey ”, Journal of Cleaner Production , Vol. 108 , pp. 1 - 18 , doi: 10.1016/j.jclepro.2014.09.048 .

Marginson , S. and van der Wende , M. ( 2007 ), “ To rank or to be ranked: the impact of global rankings in higher education ”, Journal of Studies in International Education , Vol. 11 Nos 3/4 , pp. 306 - 329 , doi: 10.1177/1028315307303544 .

Orduna-Malea , E. and Perez-Esparrells , C. ( 2021 ), “ Multidimensional taxonomy of university rankings ”, in Hazelkorn , E. and Mihut , G. (Eds), Research Handbook on University Rankings: Theory, Methodology, Influence and Impact , Edward Elgar Publishing , Cheltenham , pp. 106 - 126 , doi: 10.4337/9781788974981.00017 .

Rafols , I. ( 2020 ), “ Consensus and dissensus in ‘mappings’ of science for sustainable development goals (SDGs) ”, Leiden Madtrics blog post, 10 August , available at: https://leidenmadtrics.nl/articles/consensus-and-dissensus-in-mappings-of-science-for-sustainable-development-goals-sdgs

Rafols , I. , Noyons , E. , Confraria , H. and Ciarli , T. ( 2021 ), “ Visualising plural mappings of science for sustainable development goals (SDGs) ”, in Glänzel , W. , Heeffer , S. , Chi , P-S. and Rousseau , R. (Eds), Proceedings of the 18th International Conference on Scientometrics and Informetrics (ISSI2021) , ISSI , pp. 949 - 954 , available at: https://kuleuven.box.com/s/kdhn54ndlmwtil3s4aaxmotl9fv9s329 (accessed 24 December 2021 ).

Safón , V. ( 2019 ), “ Inter-ranking reputational effects: an analysis of the academic ranking of world universities (ARWU) and the times higher education world university rankings (the) reputational relationship ”, Scientometrics , Vol. 121 No. 2 , pp. 897 - 915 , doi: 10.1007/s11192-019-03214-9 .

Siegel , K.M. and Bastos Lima , M.G. ( 2020 ), “ When international sustainability frameworks encounter domestic politics: the sustainable development goals and agri-food governance in South America ”, World Development , Vol. 135 , doi: 10.1016/j.worlddev.2020.105053 .

Torabian , J. ( 2019 ), “ Revisiting global university rankings and their indicators in the age of sustainable development ”, Sustainability: The Journal of Record , Vol. 12 No. 3 , pp. 167 - 172 , doi: 10.1089/sus.2018.0037 .

Unterhalter , E. ( 2019 ), “ The many meanings of quality education: politics of targets and indicators in SDG4 ”, Global Policy , Vol. 10 No. 1 , pp. 39 - 51 , doi: 10.1111/1758-5899.12591 .

United Nations ( 2015 ), “ Resolution adopted by the General Assembly on 25 September 2015 ”, Transforming our world: the 2030 Agenda for Sustainable Development, United Nations, New York .

Velazquez , L. , Munguia , N. , Platt , A. and Taddei , J. ( 2006 ), “ Sustainable university: what can be the matter? ”, Journal of Cleaner Production , Vol. 14 Nos 9/11 , pp. 810 - 819 , doi: 10.1016/j.jclepro.2005.12.008 .

Wright , T. ( 2004 ), “ The evolution of sustainability declarations in higher education ”, in Corcoran P.B. and Wals A.E.J. (Eds), Higher Education and the Challenge of Sustainability , Springer , Dordrecht, Netherlands , pp. 7 - 19 , doi: 10.1007/0-306-48515-X_2 .

Acknowledgements

This project has been supported by the project UNIVERSEO (Ref. GV/2021/141), funded by the regional government of Generalitat Valenciana (Spain).

Corresponding author

About the authors

Nuria Bautista-Puig attained her BSc in Geography (University of Lleida, Spain), MSc in GIS and remote sensing (University of Zaragoza, Spain) and PhD in Library and Information Science (2020, Universidad Carlos III de Madrid; AASHE Campus Sustainability Research Award, Spanish thesis award). Currently, she is an affiliated researcher at the University of Gävle (Sweden) and R&D Technical Support at UC3M, sponsored by the YUFERING EC-funded project. Her areas of expertise include information science, organisational sustainability and bibliometrics/scientometrics. https://orcid.org/0000-0003-2404-0683

Dr Orduña-Malea is Technical Telecommunication Engineer, MA in Library Science, Master in Management of Multichannel Contents, and holds a PhD with a dissertation thesis about webmetrics and university rankings. Currently, he works as a tenured Associate Professor in the Department of Audiovisual Communication, Documentation and History of Art, at the Polytechnic University of Valencia. His lines of research are mainly related to the application of web-based techniques to Science Studies. Enrique also collaborates in funded research projects and international advisory activities about the web visibility of universities, researchers and other online research objects. https://orcid.org/0000-0002-1989-8477

Carmen Perez-Esparrells is B.A. in Economics and Business Administration, M.A. in Economics of Education and Labour Economics and holds a Ph.D. in Economics with a dissertation thesis in university funding. Currently, she is an Associate Professor at the Department of Economics and Public Finance of Autonomous University of Madrid (UAM). She held the positions of Vice-Rector for Innovation at UAM and of Associate Researcher at the Institute for Fiscal Studies, Spanish Ministry of Finance. She has been a member of the Association of Economics of Education (AEDE) since its inception and has held the position of President from 2012 to 2014. https://orcid.org/0000-0003-4466-9825


times higher education ranking methodology

Wolfram Demonstrations Project

Times Higher Education (THE) World University Rankings


This Demonstration lets you explore the effects of the scaled indicators on the overall score in Times Higher Education (THE) world university ranking [1], which is based on the 13 performance indicators designed to assess universities in five areas: (I) teaching, (II) research, (III) research influence, (IV) industry and (V) international outlook.

The methodology used is based on the procedures described in [2]. To find the rank of a university, one needs to know the university’s total scaled score or the scaled scores of each of the indicators defined by the sliders.

The rankings are only estimates, since the calculations are based on data from previous years; you can observe this using the sliders. For example, in the 2022 ranking, Cornell University's overall score is 85.0 and its rank is 22, while the Demonstration's estimated rank for an overall score of 85.0 is 23. Similarly, the University of Copenhagen has an overall score of 62.7 and a rank of 96, while the estimated rank corresponding to 62.7 is 94.
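The kind of estimate the Demonstration produces can be sketched by interpolating against a score-to-rank table built from a previously published year. Below is a minimal Python sketch; the (score, rank) anchor pairs are invented for illustration (two of them echo the estimates quoted above) and are not actual THE data:

```python
import bisect

# Illustrative (overall score, published rank) pairs from a hypothetical
# earlier year, sorted by descending score (rank 1 = highest score).
anchors = [(96.4, 1), (92.1, 5), (88.0, 12), (85.0, 23), (80.2, 40),
           (70.5, 70), (62.7, 94), (55.0, 150)]

def estimate_rank(score: float) -> int:
    """Linearly interpolate an estimated rank for a given overall score."""
    scores = [s for s, _ in reversed(anchors)]   # ascending, for bisect
    ranks = [r for _, r in reversed(anchors)]
    if score >= scores[-1]:
        return ranks[-1]
    if score <= scores[0]:
        return ranks[0]
    i = bisect.bisect_left(scores, score)
    s0, s1 = scores[i - 1], scores[i]
    r0, r1 = ranks[i - 1], ranks[i]
    frac = (score - s0) / (s1 - s0)
    return round(r0 + frac * (r1 - r0))
```

With real anchor data the interpolation can only approximate published ranks, for exactly the reason stated above: the underlying indicator data shift from year to year.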

Contributed by: Merve Samioglu   (January 2023) Open content licensed under CC BY-NC-SA


[1] Times Higher Education World University Rankings. (Nov 14, 2022) www.timeshighereducation.com .

[2] E. Siniksaran and M. H. Satman, "WURS: A Simulation Software for University Rankings," Scientometrics , 122 (1), 2020 pp. 701–717. doi:10.1007/s11192-019-03269-8 .


Permanent Citation

Merve Samioglu, "Times Higher Education (THE) World University Rankings," Wolfram Demonstrations Project, published January 2, 2023. http://demonstrations.wolfram.com/TimesHigherEducationTHEWorldUniversityRankings/


IREG Observatory on Academic Ranking and Excellence


THE Plans New Methodology

The Times Higher Education (THE) world rankings have been very popular with the media and administrators of the traditional elite of global academia. They have, however, received little respect from active researchers.

For some time, THE has been planning a drastic overhaul of its methodology, referred to as World University Rankings (WUR) 3.0. The first version was published in 2004 in collaboration with QS Quacquarelli Symonds of London. The second began in 2010 when THE and QS announced separate world rankings.

It appears that THE are planning a number of changes that will probably cause significant fluctuations and may lead to renewed debate over the reliability and accuracy of global rankings.

Among the proposed changes is the introduction of two new citation metrics, both of which emphasise papers in high-impact journals. This should reduce some of the anomalies of recent years, when a number of small or little-known universities achieved high ranks through publications in mega-projects with large numbers of contributors and citations.
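How such mega-paper anomalies arise, and how fractional crediting damps them, can be shown with a toy calculation. This is an illustrative sketch only, not THE's actual formula; all the numbers are invented:

```python
# Toy example: a small university with 50 ordinary papers plus one
# mega-collaboration paper shared with 999 other institutions.
ordinary_citations = [4] * 50        # 4 citations each
mega_citations = 3000                # one hugely cited collaboration paper
n_institutions_on_mega = 1000        # institutions sharing the mega-paper

# Full counting: the university is credited with all 3000 citations.
full = (sum(ordinary_citations) + mega_citations) / (50 + 1)

# Fractional counting: citation credit is divided among the participating
# institutions, so the single outlier barely moves the average.
frac = (sum(ordinary_citations)
        + mega_citations / n_institutions_on_mega) / (50 + 1)

print(round(full, 1))   # 62.7 citations per paper
print(round(frac, 1))   # 4.0 citations per paper
```

Under full counting the one collaboration paper dominates the citation average; under fractional counting the university's score reflects its ordinary output.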

Internationalisation

The international pillar of the rankings will gain a new indicator, outward-bound exchange students, and a new method of normalisation will benefit countries with large populations.

Knowledge Transfer

In addition to income from industry and commerce, THE will count citations of patents.

The collection and processing of survey data will undergo some significant changes. These include asking survey respondents about their direct experience and prohibiting them from voting for their employers.

These changes will be announced at the forthcoming THE world summit and come into effect for the rankings to be published in 2023.


Times Higher Education World University Rankings


Table of Contents

  • What is Times Higher Education Ranking?
  • Times Higher Education World University Rankings – Overview
  • Times Higher Education World University Rankings 2024
  • What is important to remember regarding the rankings list?

What aspects do you take into account while selecting which institutions to apply to? Do you make decisions based on the university's global ranking?

You can use the world university rankings to learn more about your university and its reputation, even if it shouldn't be the sole determining factor.

In this article, you’ll find the most recent Times Higher Education World University Rankings.

The Times Higher Education World University Rankings (THE) is a list of universities published annually by the Times Higher Education magazine. Since its introduction in 2010, the current methodology has become more rigorous, transparent and robust.

The process is founded on input gathered from specialists in the study-abroad sector and on expert consultations; it consists of a survey together with an analysis of research papers and their citations. The THE rankings are among the world's most esteemed ranking lists.

  • 1,904 institutions are ranked in 104 countries and regions by the 2024 THE Rankings. It presents itself as the most comprehensive and varied university rankings available.
  • The University of Oxford, Stanford University, and Massachusetts Institute of Technology are the top three universities.
  • The USA makes up the bulk of universities.
  • India is now the fourth most-represented nation on the list with 91 institutions.
  • Based on THE’s recently developed WUR 3.0 approach, the table evaluates an institution's performance in five areas: teaching, research environment, research quality, industry, and international outlook. It does so using 18 carefully calibrated performance indicators.
  • Times Higher Education World University Rankings 2024 comprised survey responses from 68,402 researchers worldwide and analysed over 134 million citations from 16.5 million scholarly papers. They received data submissions from over 2,673 institutions, totalling 411,789 data points.
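As a rough illustration of how the five pillars in WUR 3.0 combine into an overall score, the pillar weights published for the 2024 edition (teaching 29.5%, research environment 29%, research quality 30%, industry 4%, international outlook 7.5%) can be applied as a weighted sum. The pillar scores below are invented for illustration:

```python
# WUR 2024 pillar weights as published by THE (they sum to 1.0).
weights = {
    "teaching": 0.295,
    "research_environment": 0.29,
    "research_quality": 0.30,
    "industry": 0.04,
    "international_outlook": 0.075,
}

# Hypothetical pillar scores (0-100) for an example institution.
pillars = {
    "teaching": 85.0,
    "research_environment": 90.0,
    "research_quality": 95.0,
    "industry": 70.0,
    "international_outlook": 80.0,
}

overall = sum(weights[k] * pillars[k] for k in weights)
print(f"overall score: {overall:.1f}")
```

Each pillar score is itself an aggregate of several of the 18 underlying indicators, so this sketch reproduces only the final combination step.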

Before choosing to utilise the Times Higher Education World University Rankings 2024 to direct your study abroad experience, keep the following points in mind:

  • It is possible that the top-ranked schools aren't the best for all courses or evaluation standards. Seek the best universities in your chosen field of study.
  • Choosing a university shouldn't be based solely on rankings.
  • If you have been paying careful attention to the Times Higher Education World University Rankings, you must be aware that the majority of the top 10 positions are consistently held by the same universities.

Not every institution is able to reach the top of the ranking. However, this does not imply that other universities are inferior.

Times Higher Education World University Rankings provide a thorough picture of institutional performance and academic quality around the world. Use these rankings to find colleges that fit your academic objectives and aspirations as you start your educational journey.

Explore diverse study options and prestigious institutions with Edvoy, a trusted partner in connecting students with top-ranked universities worldwide. Take the next step towards your academic success and discover the world of opportunities awaiting you.




QS (Times Higher Education Supplement until 2009)


(June 2017) Significant change: the reference years of the rankings from 2014 to 2017 have been changed. Until now, the reference year of a ranking was its year of publication. This year (2017) QS published its ranking under the reference "2018" and renamed the rankings back to 2014 with the year following their publication. We have adjusted our years accordingly (which means the "Ranking 2015" became the "Ranking 2016").

           ------------------

From 2004 to 2009, Times Higher Education and Quacquarelli Symonds (QS) published an annual joint ranking under the name "Times Higher Education Supplement (THES) Rankings".

In 2010 the two institutions ended their collaboration but each continues to publish a separate ranking: QS kept the 2004–2009 methodology and publishes this ranking as the QS World University Rankings (which also includes the THES rankings from 2004 to 2009), while Times Higher Education produces a new type of ranking, the Times Higher Education World University Rankings.

QS produces a worldwide ranking of universities based on five qualitative and quantitative indicators; the list was compiled for the first time in 2004. In addition to a global ranking, rankings by scientific field are produced.

The QS ranking includes a global Peer Review: academics from five continents nominate leading universities in the fields in which they are considered experts, so the list is based on the reputation of universities among scholars in each discipline. In 2004 this Peer Review accounted for 50% of each university's total points.

In 2005 QS introduced an employer survey as a new criterion to assess the employment prospects of current graduates. The survey was carried out among recruiters at international corporations, who were asked to identify the universities from which they prefer to recruit graduates. Their responses were given a 10% weighting, which somewhat reduced the weighting of the Peer Review indicator; at 40%, however, it remained the most important indicator.

From 2007 to 2011 the weighting of the various indicators remained unchanged, but three changes were made: a different database was adopted (a switch from Thomson Scientific to Scopus); a mathematical fix was introduced to reduce distortions in the scale used; and the experts conducting the peer review could no longer reference their own institution.

Indicators and Weight in the QS Ranking

Source: QS Quacquarelli Symonds Limited
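The weighting scheme described above can be summarised in code. This is a hedged sketch of the classic QS composite (2007–2011 era): the 40% Peer Review and 10% employer-survey weights come from the text above, while the remaining split (20% faculty-student ratio, 20% citations per faculty, 5% international faculty, 5% international students) follows QS's published scheme of that period; the indicator scores themselves are invented:

```python
# Classic QS indicator weights (2007-2011 methodology).
qs_weights = {
    "academic_peer_review": 0.40,
    "employer_survey": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "international_faculty": 0.05,
    "international_students": 0.05,
}

# Invented indicator scores (0-100) for an example university.
scores = {
    "academic_peer_review": 92.0,
    "employer_survey": 88.0,
    "faculty_student_ratio": 75.0,
    "citations_per_faculty": 81.0,
    "international_faculty": 60.0,
    "international_students": 65.0,
}

overall = sum(qs_weights[k] * scores[k] for k in qs_weights)
print(f"QS composite score: {overall:.2f}")
```

Because half of the weight sits in the two reputation surveys, changes in survey responses move the composite far more than changes in the bibliometric or staffing indicators.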


Times Higher Education: Impact Rankings 2024

UPDATED May 11, 2024 : Methodology and modifications aimed at enhancing stability in THE’s 2024 Impact Rankings. (PDF)

Impact Rankings 2024 Results

The Impact Rankings 2024 results are set to be released in early June 2024.

Meanwhile, review the results of last year’s Impact Rankings.

Last year, an Akari Software client, the University of Tasmania , ranked in the Top 5 most impactful institutions.

Notably, for the second consecutive year, the university has ranked #1 in Climate Action, one of the 17 SDGs analysed by the ranking.

THE Impact Ranking 2023 Results - University of Tasmania

Also in the Top 50 , two more of our clients stood out: the University of the Sunshine Coast (#29) and the University of Galway (#34).

THE Impact Ranking 2023 Results - University of the Sunshine Coast

The Times Higher Education Impact Rankings have unveiled Western Sydney University as the top-ranked institution once again, affirming its continued leadership. Australian universities maintain a formidable presence in the top 10, with the University of Tasmania securing fifth place and RMIT University sharing the seventh spot.

This year’s results reinforce the strong correlation between Australasian universities and top rankings, as an institution from the region has now claimed the top position four times since the rankings’ inception in 2019. Canadian universities also exhibit noteworthy performance, occupying an impressive four spots in the top 10. Queen’s University achieves an outstanding third place, while the University of Alberta, the University of Victoria, and Western University jointly secure seventh and ninth places.

On the international front, the University of Manchester from the United Kingdom distinguishes itself as the second-highest ranked institution, positioning it as the leading European university amidst the extensive list of over 1,500 universities evaluated. Aalborg University from Denmark shares the ninth spot, further highlighting the global impact of exceptional educational institutions.

These rankings underscore the strong correlation between universities’ commitment to the United Nations’ Sustainable Development Goals and their resulting impact. The consistent performance of top-ranked institutions, along with their collective dedication, reflects the global efforts and collaborations undertaken to foster sustainable development and create a brighter future.

Official Impact Rankings Results (last year)

See the full list here .

Universities can have a far-reaching influence on the world through their research, education and solutions to global social challenges.

The Times Higher Education Impact Rankings are an essential tool providing an unbiased assessment of how universities contribute to this impact and are the only rankings of their kind from the largest system for ranking higher education globally.

This year’s THE Impact Rankings aims to show not only what progress has been made, but also opportunities for further growth within each institution – allowing them to measure their effect on society, including both positive and negative impacts.

To achieve this goal, data are collected from institutions around the world, examining indicators such as ‘gender diversity’ and ‘student welfare policy’ that demonstrate real-world results, rather than relying solely on metrics such as student grades or publication counts. For academics with an interest in creating lasting positive change through research and teaching, it is important to stay abreast of these initiatives and to consider how best to use them to advance social good across all aspects of university life.

Introducing the Times Higher Education Impact Rankings 2024

The Times Higher Education Impact Rankings 2024 assess universities across 17 of the United Nations Sustainable Development Goals (SDGs), ranging from good health and well-being to peace, justice and strong institutions.

These SDGs are significant because they provide an international framework for setting global priorities and developing national policy agendas that promote sustainable development worldwide.

This method of evaluation will not only create a level playing field among universities but also help to identify their strengths and weaknesses in terms of their respective contributions towards making the world a better place.

What are the United Nations Sustainable Development Goals (SDGs)?

The UN SDGs are 17 goals adopted by all member nations in 2015, which aim to promote global economic development, social inclusion and environmental protection by 2030.

These ambitious objectives are highly interrelated – for example, Goal 4 focuses on quality education, while Goal 14 addresses the conservation and sustainable use of marine resources. Other goals focus on reducing poverty and inequalities, providing clean water and sanitation to all, promoting gender equality, and tackling climate change.

All these objectives are interconnected: progress towards one goal (e.g. quality education) also helps promote progress towards another (e.g. gender equality). The UN SDGs are therefore a comprehensive set of goals that must be addressed together in order to create a better world for everyone.

The Times Higher Education Impact Rankings 2024 will play an important role in helping universities and other higher education institutions assess their progress towards achieving the UN SDGs and in motivating them to integrate these goals into their curriculum.

THE Impact Rankings Methodology 2024 - How Universities Are Assessed

Download the Impact Rankings 2024 Methodology PDF

  • The Times Higher Education Impact Rankings 2024 assess universities across 17 UN Sustainable Development Goals, ranging from health and environmental protection to education.
  • Participation in the overall ranking requires universities to submit data to at least four SDGs including SDG 17 – Partnerships for the Goals.
  • If a university submits data but does not fulfil the overall ranking requirement, it will still be ranked in individual SDGs.
  • The overall score is generated from scores for up to four SDGs and based on metrics and indicators, which are given an exact or approximate percentage value.

Modifications aimed at enhancing stability in THE's 2024 Impact Rankings.

  • Last year, more challenging questions were introduced, focusing on SDGs 3, 4, 11, 16, and 17.
  • Changes include replacing questions with higher standards and adjusting the calculation of total scores using a rolling average of the last two years.
  • Overall Impact Ranking scores for 2024 are created by averaging the scores with those from the previous year, except for new universities.
  • The two-year blending is not applied to individual SDG rankings, which focus on specific areas of excellence and good practice.
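The two lists above can be combined into a single scoring sketch. In THE's published approach, SDG 17 contributes 22% of the overall score and the university's three strongest remaining SDGs contribute 26% each; for 2024 the result is then blended 50/50 with the previous year's overall score (new entrants are not blended). Assuming those weights, with invented SDG scores:

```python
def overall_impact(sdg_scores, previous_overall=None):
    """Overall Impact score: SDG 17 counts 22%, the three best remaining
    SDGs count 26% each; the result is then averaged with last year's
    overall score (two-year blending), except for new universities."""
    sdg17 = sdg_scores[17]
    best_three = sorted(
        (v for k, v in sdg_scores.items() if k != 17), reverse=True)[:3]
    this_year = 0.22 * sdg17 + sum(0.26 * s for s in best_three)
    if previous_overall is None:    # new entrant: no blending
        return this_year
    return (this_year + previous_overall) / 2

# Invented scores (0-100) for a university that submitted four SDGs.
scores = {17: 80.0, 3: 90.0, 4: 70.0, 13: 85.0}
new_entrant = overall_impact(scores)        # 0.22*80 + 0.26*(90+85+70)
blended = overall_impact(scores, 88.0)      # averaged with last year's 88.0
```

As the text notes, the blending applies only to the overall ranking; individual SDG rankings use the current year's score for that SDG alone.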

Relationship Between the UN SDGs and Higher Education Curriculum

The Impact Rankings 2024 will encourage a closer relationship between higher education and the UN SDGs, as well as motivate universities to integrate these goals into their curriculum. This integration will allow students to gain a better understanding of the global challenges we face and how to best respond to them, making their education more relevant to the real-world issues that are shaping our society today.

It will also encourage universities to actively promote sustainable practices on campus, as well as provide students with an opportunity to engage in meaningful research and development projects that have the potential to make a lasting impact on the world.

Benefits of Integrating the United Nations Sustainable Development Goals (SDGs) into University Curriculum

By  integrating the UN SDGs into their curriculum , universities will be able to demonstrate their commitment towards building a better world – an essential criterion in achieving higher rankings on the Times Higher Education Impact Rankings 2024.

Universities will benefit from increased participation in international collaboration and research activities, as well as access to more funding opportunities that support their efforts towards promoting social good across all academic disciplines.

This will help universities demonstrate to potential students and employers their commitment to creating a better world than the one they inherited.

The Benefits of Adopting a Sustainable Approach to University Education

Universities that align their teaching, research and societal engagement strategies with the UN SDGs stand to benefit by having access to a wider range of funding opportunities, as many governments and philanthropic organizations are increasingly investing in initiatives that aim to tackle the SDGs.

Investments in sustainability-focused research and teaching projects can help universities build a positive international reputation and attract students from around the world.

Integrating sustainable practices into university curricula will create new collaborations with businesses, NGOs and other organizations, further enriching the teaching and research agendas of universities.

Top 10 Universities in the Impact Rankings 2022

Read more about the  Impact Rankings 2022 results .

  • Western Sydney University – Australia
  • Arizona State University (Tempe) – USA
  • Western University – Canada
  • (4) King Abdulaziz University – Saudi Arabia
  • (4) Universiti Sains Malaysia – Malaysia
  • University of Auckland – New Zealand
  • Queen’s University – Canada
  • Newcastle University – UK
  • University of Manchester – UK
  • Hokkaido University – Japan

Akari Software - Times Higher Education - Impact Rankings 2023

Times Higher Education

Times Higher Education (THE) is a magazine reporting specifically on news and issues related to higher education, and is known for publishing the annual Times Higher Education World University Rankings.

