U-Multirank wastes EU funding

News | Editorial staff
28 March 2012 | The British House of Lords bashed the U-Multirank initiative of the European Commission. It would only “confuse applicants”, and the money would be better spent on other EU priorities. Russian Prime Minister Vladimir Putin, meanwhile, is generally disappointed by Western rankings.

The influential European Union Committee of the British House of Lords (upper house of Parliament) produced the report “The Modernisation of Higher Education in Europe”, in which the U-Multirank project was heavily criticized. EU money should be better spent on other priorities, given that the Times Higher Education and Shanghai rankings already did a good job in providing “transparency and flexibility for students to make an informed choice”. News that Times Higher Education was quick to pick up on.

The House of Lords raised concerns that the “league tables market was already too crowded, with each ranking deploying its own methodologies; that it would confuse applicants and be incapable of responding to rapidly changing circumstances in institutional profiles; that it could become a ‘blunt instrument’ which would ‘not allow different strengths across diverse institutions to be recognised and utilised’ and end up being used as the basis for future funding decisions; on the grounds of quality, accuracy and lack of data; and that EU funds could be better spent on other EU priorities.”

Concerns over U-Multirank becoming official reference

Given the doubts the House of Lords has regarding the added value of U-Multirank, the report also urged the Commission not to declare this new instrument the official reference for European universities: “The Higher Education Funding Council for England (HEFCE) acknowledged the Commission’s efforts to overcome some of the limitations of traditional league tables and to render it more objective but advised ‘caution in providing any form of official sanction to any one form of ranking tool given that universal ranking systems have a history of lacking real comparability and robustness’.”

Instead, the House members pointed towards the merits of existing rankings. The Times Higher Education Ranking was mentioned in this context as a role model, given that it recently revised its methodology. “While Times Higher Education considered that rankings were ‘relatively crude’ and could never be properly objective, they nevertheless considered that if used appropriately they could still provide a useful role in providing information.”

Putin disappointed by Western rankings

Russia’s current Prime Minister and President-elect Vladimir Putin would disagree, especially with that last assessment. “You must know that certain experts think that these Western ratings are, in fact, an instrument for raising the competitiveness [of their graduates] on the labor market,” he remarked.

This came as a reaction to the recent outrage in Russian media that none of the country’s universities are ranked in THE’s top 100. Putin has now instructed the Russian Education Oversight Agency to create an alternative ranking.

Complete report on U-Multirank

50. Current rankings, including the Times Higher Education World University Rankings and Shanghai Jiaotong University’s Academic Ranking of World Universities, mainly focus on research-intensive universities and only include a small proportion of European universities. The Commission therefore believes that a wider range of indicators and information should be made available to increase transparency and allow more informed choices to be made, as well as supporting policy-makers’ higher education reforms. In response, the Commission intends to launch U-Multirank in 2013, which will allow users to profile universities using a number of performance indicators rather than just research output.

51. Most of our witnesses were not convinced by the merits of yet another league table, with the British Council’s description of rankings as both a “blessing and a curse” capturing this dichotomy well. The Russell Group told us that “ranking universities is fraught with difficulties and we have many concerns about the accuracy of any ranking. It is very difficult to capture fully in numerical terms the performance of universities and their contribution to knowledge, to the world economy and to society. Making meaningful comparisons of universities both within, and across, national borders is a tough and complex challenge, not least because of issues relating to the robustness and comparability of data”. The EUA were critical of existing ranking systems as favouring very large research-intensive institutions and, while they praised the proposal’s attempt to move away from research outputs to look at other indicators, they considered that this would be hard to achieve in practice, particularly due to the lack of data in some universities and Member States and the difficulties in collecting data more generally, including the additional burdens that this may place on universities.

52. Many of our other witnesses raised a series of concerns: about the proposal’s lack of clarity as to whether it would be a ranking or transparency tool; that the league tables market was already too crowded, with each ranking deploying its own methodologies; that it would confuse applicants and be incapable of responding to rapidly changing circumstances in institutional profiles; that it could become a “blunt instrument” which would “not allow different strengths across diverse institutions to be recognised and utilised” and end up being used as the basis for future funding decisions; on the grounds of quality, accuracy and lack of data; and that EU funds could be better spent on other EU priorities.

53. Notwithstanding these concerns, if the Commission’s stated intention of increasing transparency and providing more flexibility for students to make an informed choice based on different criteria proved to be possible, then many of our witnesses were prepared to support its introduction as potentially adding value. The UK Bologna Experts were of the same view but considered that U-Multirank’s success was “highly dependent on the extent of institutional engagement, coverage, and accuracy of data used to compile the rankings” and that it was “vital that the instrument recognises the diverse character of European HEIs in so far as direct comparisons can be iniquitous and misleading”.

54. The Higher Education Funding Council for England (HEFCE) acknowledged the Commission’s efforts to overcome some of the limitations of traditional league tables and to render it more objective but advised “caution in providing any form of official sanction to any one form of ranking tool given that universal ranking systems have a history of lacking real comparability and robustness”. The NUS also welcomed the Commission’s efforts but still had doubts about how it would work in practice, believing instead that improving the public information made available to students could be achieved by alternative means “without the need to introduce (yet another) potentially subjective and confusing rankings system”. The Government considered that “it might be useful” if it genuinely provided a transparent source of information for students wanting to study abroad but were not convinced that it would add value if it simply resulted in an additional European ranking system alongside the existing international ranking systems. However, the Minister struck a less positive tone when he told us that it could be viewed as “an attempt by the EU Commission to fix a set of rankings in which [European universities] do better than [they] appear to do in the conventional rankings”.

55. We were interested to note that THES have recently revised their global rankings in 2010 in order to apply a different methodology and include a wider range of performance indicators (up from 6 to 13). They told us that their approach seeks to achieve more objectivity by capturing the full range of a global university’s activities (research, teaching, knowledge transfer and internationalisation) and allows users to rank institutions (including 178 in Europe) against five separate criteria: teaching (the learning environment rather than quality); international outlook (staff, students and research); industry income (innovation); research (volume, income and reputation); and citations (research influence). In order to inform the revision of their rankings, their data supplier, Thomson Reuters, conducted a global survey which found that many users distrusted the methodology of the existing world rankings. While THES considered that rankings were “relatively crude” and could never be properly objective, they nevertheless considered that if used appropriately they could still provide a useful role in providing information.

56. We also believe that the provision of clear information and guidance to students is important in order to assist them in making an informed choice of university. However, we also appreciate how difficult it can be to evaluate a wider range of university performance indicators in an objective manner, noting the limitations inherent in many of the existing ranking systems.

57. Therefore, it is important that the Commission is clear about the purpose of U-Multirank, what information will be provided and what methodology will be used. If the perceived deficiencies in most other ranking systems are overcome in relation to this proposal then we could be convinced of the benefits of its introduction. However, until these deficiencies can be overcome, we consider that the Commission should prioritise other activities. In the meantime, rankings such as the Times Higher Education World University Rankings may have a valuable contribution to make.

