
It’s official: University rankings are not to be trusted!

A multitude of factors affect university rankings, not just reputation

Contributed By MICHAEL HENG PBM

DO NOT Trust University Rankings: “Useless” – Official Study Slams THE University Rankings

THE World University Rankings Decomposed and Discredited

Singapore’s NUS and NTU were recently ranked at the top of major international world university rankings. One such ranking is the THE (Times Higher Education) World University Rankings.

A study commissioned by the Norwegian government has concluded that even the top rankings rely so heavily on subjective weightings of factors and on dubious data that they are useless as a basis for information if the goal is to improve higher education.

Are Singapore’s NUS and NTU as good as touted?

The Norwegian Ministry of Education and Research commissioned the Nordic Institute for Studies in Innovation, Research and Education, or NIFU, to analyse Norwegian universities’ placements on international university rankings.

The ministry specifically wanted to know what the rankings meant for the universities in practice, and if there were factors at the national or institutional level that could explain the differences between Nordic countries.

Main Conclusions

The main conclusions regarding the ARWU (Shanghai Academic Ranking of World Universities) and THE (Times Higher Education) are that “placement on those rankings is to a large degree based on a subjective weighting of factors which often have a weak relationship to the quality of education and research.

“The rankings are based on data that to a varying degree are made available and made transparent. The rankings say almost nothing about education.”

“The international rankings are therefore not useful as the basis for information and feedback both on research and education, if the goal is further improvement of … higher education institutions.”

“Decomposing” the THE Rankings

NIFU’s methodology for “decomposing” the ARWU and THE rankings is extensive. For each university there is a sophisticated analysis of which variables explain most of the variance, measured as a percentage deviation from the benchmark group of universities’ position on the same variable.

This is a very illuminating exercise, because the standardised measures – for instance in THE – differentiate much better among the top-rated universities than among those with a lower ranking. This methodological ‘fallacy’ in THE is underlined several times in the report:

“In THE there is a 30.7 point difference between Caltech as number one and Pennsylvania State University at place 50. And for Helsinki University at place 100, there is only a 10.9 point difference. Then there is only a 3.8 point difference between rank 101 and rank 149 and another 4.2 points between rank no. 150 and rank 199. The trend in both ARWU and THE is that the lower down the list you get, the smaller the difference between universities.”

What is special about THE, NIFU argues, is that 33% of the weighting in the ranking is decided by an international survey of academics – but these results are not made available in the THE report, where only the first 50 places are documented.

The last THE reputation survey was done in 2012, NIFU said, and 16,639 academics in 144 countries responded – but THE does not say what the percentage response rate to the survey was.

The Harvard Example (Always Scoring 100)

Why aren’t Harvard and other Ivy League universities topping the charts instead?

The Institute says that Harvard University, the most frequently nominated by respondents, is given a score of 100, and universities further down the list are given a score corresponding to the percentage of Harvard’s “votes” they receive. For instance MIT, second on the list, gets 87.6% as many “votes” as Harvard.

This figure is published only for the first 50 entries in the ranking. Most universities below that may have received less than 1% of the nominations Harvard got, which makes it plausible that the proportions between universities fluctuate greatly from survey to survey.

The Institute argues convincingly that the weight given to the survey benefits only the top placements in the ranking, and is unreliable as a mechanism for differentiating universities further down the list.
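To make NIFU’s point concrete, here is a minimal sketch in Python, using made-up nomination counts (the underlying survey data are not published), of how scores pegged to Harvard’s total behave, and why small vote counts further down the list are unstable:

# Hypothetical nomination counts; the real survey figures are not published.
votes = {
    "Harvard": 5000,      # most-nominated university, defined as score 100
    "MIT": 4380,          # roughly 87.6% of Harvard's votes, as in the THE example
    "University X": 40,   # a university far down the list, under 1% of Harvard's votes
}

def reputation_score(name, votes, anchor="Harvard"):
    """Express a university's votes as a percentage of the anchor's votes."""
    return 100 * votes[name] / votes[anchor]

for name in votes:
    print(f"{name}: {reputation_score(name, votes):.1f}")  # 100.0, 87.6, 0.8

# With only ~40 nominations, a swing of a few votes between surveys moves
# University X's score by a large relative amount, while Harvard's and MIT's
# scores barely change – the survey-to-survey fluctuation NIFU warns about.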

THE Responds:

Phil Baty, editor of the THE rankings, told University World News that the magazine had “not been consulted at all by the NIFU”.

Reputation, he said, formed “just a part of a comprehensive range of metrics used to create the Times Higher Education World University Rankings”. In all, 13 performance indicators were used, “covering the full range of university activities – research, knowledge transfer, international outlook and the teaching environment”.

A multitude of factors affect university rankings, not just subjective reputation

“The majority of the indicators are hard, objective measures, but we feel it is very important to include an element of subjective reputation as it helps to capture the less tangible but important aspects of a university’s performance, which are not well captured by hard data.”

However, “hard, objective measures” are NOT the same as VALID and RELIABLE measures of excellence.

The methodology was devised, Baty stressed, after open consultation and was refined by an expert group of more than 50 leading scholars and administrators around the world. Efforts were made to ensure the survey was fair, with countries receiving the right proportion of questionnaires and with distribution in multiple languages. Only senior academics were invited to respond to the survey, and all of them had published in world-leading journals.

Again, however, THE’s ranking methodology has to be scientifically established with regard to its 13 indicators and their respective internal consistency. It is NOT a matter of the collective educated opinion of a group of unnamed and unspecified scholars and administrators of unknown expertise.

“When Norway’s universities break new ground and push forward the boundaries of understanding in any particular academic field, they should be making sure that scholars across the world are aware of the discoveries, through the most appropriate means of dissemination – journal publications and conferences,” said Baty.

“This is the only way to ensure Norwegian universities get the credit they deserve. Other small countries have had tremendous success in the rankings – the Netherlands and Switzerland, for example.”

Local Views

The NIFU report was presented at a seminar in Oslo recently, capturing much attention. “Can we trust university rankings?” NIFU wrote on its website. “University Rankings Criticised,” declared the Ministry of Education and Research in a press release.

“A Kiss of Death for University Rankings,” said University of Oslo Rector Ole Petter Ottersen in his blog, stating:

“This report should be made available for everyone working within the higher education sector in Norway. Not the least, it should be available on the news desk of Norwegian newspapers.”

The United Nations Educational, Scientific and Cultural Organization (UNESCO) has challenged the validity and reliability of university rankings such as the THE ranking:

“Global university rankings fail to capture either the meaning or diverse qualities of a university or the characteristics of universities in a way that values and respects their educational and social purposes, missions and goals. At present, these rankings are of dubious value, are underpinned by questionable social science, arbitrarily privilege particular indicators, and use shallow proxies as correlates of quality.”

For the sake of authenticity, Singapore universities should stay away from bogus ranking standards of dubious excellence.

By Michael Heng PBM

This article first appeared on the author’s blog here.

