
University World News: U-Multirank – A university ranking evaluation

Original version of this article published by University World News 12 September 2014, available at http://www.universityworldnews.com/article.php?story=20140910135245123

By Richard Holmes

The number of international university rankings continues to increase. They fulfil several roles. Students and their parents, potential employers and sponsors need to have some idea of the quality of the places where they spend time and money.

Government agencies and university administrators want to assess the results of expenditure of public funds. Rankings have also become an arena for competition among universities and states for prestige and resources.

Global university ranking began quite modestly in 2003 with the publication of Shanghai Jiao Tong University's Academic Ranking of World Universities, or ARWU, a simple, research-orientated index with just six indicators.

One of these, alumni winning Nobel or Fields awards, was sometimes presented as a measure of teaching quality, but few were ever seriously convinced of this.

The Shanghai rankings were influenced by the then-current Chinese perception of the roots of Western dominance, namely technology and scientific research. They were not concerned with the humanities or with the education of undergraduates.

The results of the first edition of these rankings caused quite a stir. Europe, especially France, was shocked to discover the size of the lead of the academic Anglosphere, especially North America. They also showed that the United States was well in front of British universities, with four American institutions ahead of Cambridge in fifth place, while Oxford came ninth and Imperial College 17th.

Times Higher Education

In 2004, the Times Higher Education Supplement, as it was then known, teamed up with the graduate student recruitment firm, Quacquarelli Symonds – QS – to produce the World University Rankings.

These marked a shift away from a research-only index, but only to a limited extent – 20% of the total weighting went to faculty-student ratio, which was supposed to measure quality of teaching, and 10% to an employer survey, which was at best a crude proxy for student quality.

In 2009, QS and Times Higher Education, or THE, whose new name reflected the shift to a magazine format, went their separate ways, with QS continuing the old methodology while THE formed a partnership with Thomson Reuters, the data and media giant.

The new THE rankings included five different indicators that were related in some way to teaching and learning, but were of limited relevance to the teaching of undergraduates.

It was noticeable that in the QS rankings, leading British universities nearly always did much better than in ARWU. Imperial College London, for instance, was fifth in the QS rankings in 2013, but only 24th in ARWU.

The THE rankings show rather less bias towards British institutions – Cambridge is seventh compared with fifth in ARWU – but there is some. London institutions generally do better.

Anti-European bias?

There was clearly a feeling throughout continental Europe that the existing rankings were too dependent on the Anglo-American model of a selective global research university and were biased against European institutions.

Through much of Europe, a lot of research is done by institutions such as the Centre National de la Recherche Scientifique in France or the Max Planck Gesellschaft in Germany while many universities are confined to teaching.

Furthermore, British and American universities are highly differentiated with a great sorting taking place at the end of secondary education and students being selected for different places on the basis of exam results or standardised test scores.

In much of continental Europe, many students remain at home and attend the local university, regardless of their academic ability.

Preparation

In June 2009, a few months before THE's split with QS, the European Commission announced the launch of a feasibility study "to develop a multi-dimensional global university ranking".

The five dimensions were teaching and learning, research, knowledge transfer, international orientation and regional engagement. The idea was to produce institutional rankings and field rankings based on these five dimensions.

The Commission pointed out that existing global rankings – at that time ARWU and the THE-QS world rankings – focused on research in the hard sciences and ignored teaching, community involvement and publications in the humanities and social sciences. The new rankings, it was proposed, would be transparent, multi-dimensional and global.

The Commission's report, published in 2011, concluded that the new-style rankings would be methodologically feasible and financially practical. The target was to cover 700 European universities, with 500 being ranked in three fields. After a pilot study with 159 institutions, 700 institutions were signed up, two-thirds of them from Europe.

Criticism

The proposed rankings received a lot of criticism, mainly from British universities but also from prestigious research institutions on the continent, while there was little interest in the United States.

Kurt Deketelaere, secretary-general of the League of European Research Universities – a group of 21 elite institutions – said in an interview with THE:

“LERU has serious concerns about the lack of reliable, solid and valid data for the chosen indicators in U-Multirank, about the comparability between countries, about the burden put upon universities to collect data and about the lack of ‘reality-checks’ in the process thus far.”

Such criticism was to some extent justified. The experience of the QS rankings and America’s Best Colleges had shown that collecting data from institutions was a difficult business.

Even with complete honesty on the part of bureaucrats, the detail that U-Multirank required would be daunting for many universities in Africa, Asia and South America, although the same could be said of the THE rankings and Thomson Reuters’ associated Global Institutional Profiles Project.

But the reaction of some British universities, which even included representations to the upper house of parliament, seemed rather exaggerated.

It is difficult to avoid the conclusion that they were averse to being ranked by criteria where they might not perform as well as they did on the QS and THE rankings and where European and other universities might do better.

The response from THE was also rather questionable. The magazine had positioned itself as analysing research universities with a global impact and lavish funding, a status that only a few hundred in East Asia and the West and a handful of national flagships could ever hope to attain.

Why should it be so concerned about a ranking that was aiming at a different, although perhaps slightly overlapping, market?

In a House of Lords report in 2012 David Willetts, then UK minister for universities and science, is quoted as saying that U-Multirank could be viewed as “an attempt by the EU Commission to fix a set of rankings in which [European universities] do better than [they] appear to do in the conventional rankings”.

A sceptic might wonder if the THE rankings, powered by Thomson Reuters, could be viewed as an attempt to fix a set of rankings in which British universities do better than they would in some proposed alternatives.

The producers of U-Multirank accepted that there were problems but considered that there was little alternative.

They said: “… at the moment the only alternative to referring to self-reported data is to limit the scope of indicators substantially and so produce a result which is unrepresentative of the wide mission of higher education institutions as a whole and which is unlikely to be of any use outside of the elite research intensive universities.”

This was the essence of the problem faced by U-Multirank. Using data submitted by institutions could mean fewer institutions willing to be ranked. But using data from government or third parties would mean a retreat to a much less ambitious enterprise.

The project was finally launched in Dublin in 2013, supported by a grant of €2 million (US$2.6 million) from the European Union. That sum of money aroused indignation among British universities and the rest of the higher education establishment.

Publication

Finally, the first edition of U-Multirank came out in May 2014. So what does it look like?

The presentation needs improvement. The interface is clumsy and difficult to manipulate, although it will probably get easier with practice and perhaps there will be some tweaking.

The idea of indicating quality with five sizes of circles may work for some people, but others will just get a headache. The circles can, however, be converted into letter grades. This format may not be very helpful for promotion purposes. Letter grades or big coloured circles will not have the same impact as ‘listed in the top five for education’ in the QS rankings or ‘best university in the country in the THE rankings’.

The producers of U-Multirank have made it clear that it is not a conventional university ranking but rather an interactive assessment and evaluation tool. It contains a broad and uneven variety of data and does not combine scores to produce an overall ranking, although three readymade rankings have been prepared.

Information is provided on five areas: teaching and learning; research; knowledge transfer; international orientation; and regional engagement.

Participation is patchy. External data is provided for the elite universities, but most places in the UK and North America have opted out.

Nineteen Canadian universities are listed, but for most of them only external data is provided, so there is very little for the teaching and learning indicator. Montreal Polytechnic provides a near-complete range of data, missing only masters graduation rates, and Queen's University provides data about graduating on time for both masters and bachelor courses. That is all the information provided about learning and teaching in Canada.

In the UK, the only universities to provide data are Bournemouth, Chester, East Anglia, Hull, Kent, Liverpool, Newcastle and Nottingham.

Browsing through the cities listed, the user will find 18 universities in Paris, with teaching and learning data for 12 of them. Only four places in London are listed, with no information on teaching and learning.

For the city of New York, six universities are listed, although two of them – SUNY Buffalo and SUNY Stony Brook – are in fact not in the city.

The only data about teaching and learning is the remarkable 100% for bachelor and masters students graduating on time from SUNY Buffalo. The bachelor figure is contradicted by America's Best Colleges – a 71% graduation rate in 2011 – and by reliable personal testimony. It is unlikely that anyone in New York – city or state – will be impressed by this.

In addition to general data, the new ranking tool contains information about four specific subject areas – physics, electrical engineering, mechanical engineering and business studies.

Altogether, some sort of information is provided for 862 institutions, slightly more than half of them in Europe, although many of these did not submit a complete set of data.

Since so many of the leading American and European research universities have opted out, there are some eyebrow raisers in the upper reaches of the readymade rankings provided by U-Multirank and it is likely that they will have limited public credibility.

On the other hand, both THE and QS have included some surprises in the top 20 or top 50 for various criteria, especially for their citations-based indicators.

Some results

For ‘research and research linkages’, first place goes to the University of Copenhagen and the next nine places to ETH Zurich, University of Helsinki, Leiden, Wageningen, Amsterdam, the Free University of Brussels, Denis Diderot Paris 7, King’s College London and the Catholic University of Leuven.

MIT is in eleventh place and Harvard in 18th. Oxford is just behind the Oregon Health and Science University and Cambridge below the London School of Hygiene and Tropical Medicine.

For the ‘economic involvement’ ranking, the top three places go to Erasmus University Rotterdam, ETH Zurich and the University of Liverpool.

For business studies programmes, the top three are the WHU School of Management in Germany and, in the United States, the University of Florida and Dartmouth College.

It should be noted that research is already covered extensively by the three better-known rankings: research impact is measured by field- and year-normalised citations in the THE rankings, by citations per faculty in the QS rankings and by highly cited researchers in ARWU. Both THE and QS include measures that are related, albeit somewhat tenuously, to teaching, and both have measures of international orientation.

It is the measures of regional engagement that make U-Multirank really distinctive.

Where data is provided, these ratings represent an improvement over, or at least an addition to, the information contained in the other rankings, especially with regard to regional engagement.

For example, looking at Nottingham University – probably the most respected British university in U-Multirank – we can see an A for bachelor graduation rate (representing the raw figure of 94.55%) and graduating on time, a C for masters graduation rate and a B for masters graduating on time. For research, Nottingham gets an A for external research income and research publications and a B for citation rate. Other strengths are income from private sources and patents awarded while regional engagement is weak.

There is also a substantial amount of information for the Al-Farabi Kazakh National University in Kazakhstan.

It has an A for undergraduate graduation rate, publications cited in patents, international joint publications and bachelor degree holders working in the region, B for student mobility and income from regional sources, C for regional joint publications, and D for graduating on time (bachelors and masters), citation rate, research publications, external research income, co-publications with industrial partners and income from private sources. Data was not available for masters graduation rate and patents awarded.

Overall, this seems a richer set of data than is provided by the Shanghai, QS or THE rankings. Unfortunately, there are not many universities about which similar data is provided.

Public response

The most prominent response from existing rankers was Phil Baty’s in THE. He suggested that the new ranking was not going to affect THE’s position as the instrument to assess the world’s leading research universities.

The website SIRIS Lab was critical of some errors and the limitations of the data, but thought that generally it was “a positive step forward in the field of university ranking, as it tries to overcome the limitations present in conventional rankings”.

U-Multirank also received a favourable response from the European Students’ Union vice-chair, who said it was “the first global ranking that includes in a serious manner the teaching and learning dimension” and that “can of course be very useful for students when they make their choices, because it enables them to personalise their search much more than ever before both at institutional and field level”.

Looking ahead

A highly stratified rankings ecosystem now seems to be emerging.

At the bottom are the fun rankings – top party schools and so on – success in which is as likely to cause embarrassment as pride. Then there are utilitarian national rankings aimed largely at the student as consumer, indicating the likelihood of graduation, the financial returns from a degree and so on.

Entering the international arena, there are web-based rankings and research-only rankings, including some that seem to have faltered after a few years – the Research Center for China Science Evaluation in Wuhan, URAP in Turkey, HEEACT in Taiwan, Scimago and the Leiden ranking.

Then at the apex we have the big three: Shanghai ARWU, whose status derives from its stability and reliability, and QS and THE. The prestige of the latter two is largely a product of their claim to produce a holistic ranking, while THE and its data provider have famous brand names.

At the moment, it seems that U-Multirank will come below the big three. It is not a truly global ranking since it includes few universities outside Europe.

But critics should be reminded that the European Union, which now has 28 member states, began life as the European Coal and Steel Community in 1951 with just six members.

* Richard Holmes is the writer of the University Ranking Watch blog.
