Details of the proposed new methodology for the Times Higher Education World University Rankings have been unveiled. THE confirmed this week that it plans to use 13 separate performance indicators to compile the league tables for 2010 and beyond – an increase from just six measures used under the methodology employed between 2004 and 2009. The wide range of individual indicators will be grouped to create four broad overall indicators to produce the final ranking score.
The core aspects of a university’s activities that will be assessed are research, economic activity and innovation, international diversity, and a broad “institutional indicator” including data on teaching reputation, institutional income and student and staff numbers. “The general approach is to decrease reliance on individual indicators, and to have a basket of indicators grouped across broad categories related to the function and mission of higher education institutions,” said Thomson Reuters, the rankings data provider. “The advantage of multiple indicators is that overall accuracy is improved.”
THE announced last November that it had ended its arrangement with the company QS, which supplied ranking data between 2004 and 2009. It said it would develop a new methodology, in consultation with Thomson Reuters, and with advisers and readers, to make the rankings “more rigorous, balanced, sophisticated and transparent”. The first detailed draft of the methodology was this week sent out for consultation with THE’s editorial board of international higher education experts.
They include: Steve Smith, president of Universities UK; Ian Diamond, the former chief executive of the Economic and Social Research Council and now vice-chancellor of the University of Aberdeen; Simon Marginson, professor of higher education at the University of Melbourne; and Philip Altbach, director of the Center for International Higher Education at Boston College. A wider “platform group” of about 40 university heads is also being consulted. The feedback will inform the final methodology, to be announced before the publication of the 2010 world rankings in the autumn.
Indicators in detail
While the old THE-QS methodology used six indicators – with a 40 per cent weighting for the subjective results of a reputation survey and a 20 per cent weighting for a staff-student ratio measure – the new methodology will employ up to 13 indicators, which may later rise to 16.
For “research”, likely to be the most heavily weighted of the four broad indicators, five indicators are suggested, drawing on Thomson Reuters’ research paper databases. This category would include citation impact, looking at the number of citations for each paper produced at an institution to indicate the influence of its research output. It would also include a lower-weighted measure of the volume of research from each institution, counting the number of papers produced per member of research staff. The category would also look at an institution’s research income, scaled against research staff numbers, and the results of a global survey asking academics to rate universities in their field, based on their reputation for research excellence.
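The per-paper and per-staff scaling described above can be sketched in a few lines. All figures and function names here are hypothetical illustrations; the actual calculations and any subject normalisation used by Thomson Reuters are not specified in the draft.

```python
# Illustrative sketch of the scaled research measures described above.
# Figures and names are hypothetical, not the actual THE/Thomson Reuters formulas.

def citation_impact(total_citations: int, papers: int) -> float:
    """Average citations per paper, a proxy for research influence."""
    return total_citations / papers if papers else 0.0

def papers_per_staff(papers: int, research_staff: int) -> float:
    """Research volume scaled against research staff numbers."""
    return papers / research_staff if research_staff else 0.0

# A hypothetical institution: 12,000 citations across 3,000 papers,
# produced by 500 research staff.
print(citation_impact(12_000, 3_000))   # 4.0 citations per paper
print(papers_per_staff(3_000, 500))     # 6.0 papers per staff member
```

Scaling by staff numbers is what stops large institutions from dominating simply by producing more papers in absolute terms.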
“Institutional indicators” would include the results of the reputation survey on teaching excellence and would look at an institution’s overall income scaled against staff numbers, as well as data on undergraduate numbers and the proportion of PhDs awarded against undergraduate degrees awarded.
For 2010, the “economic/innovation” indicator would use data on research income from industry, scaled against research staff numbers. In future years, it is likely it would include data on the volume of papers co-authored with industrial partners and a subjective examination of employers’ perceptions of graduates.
Institutional diversity would be examined by looking at the ratio of international to domestic students, and the ratio of international to domestic staff. It may also include a measure of research papers co-authored with international partners.
Ann Mroz, editor of Times Higher Education, said: “Because global rankings have become so extraordinarily influential, I felt I had a responsibility to respond to criticisms of our rankings and to improve them so they can serve as the serious evaluation tool that universities and governments want them to be.
“This draft methodology shows that we are delivering on our promise to produce a more rigorous, sophisticated set of rankings. We have opened the methodology up to wide consultation with world experts, and we will respond to their advice in developing a new system that we believe will make sense to the sector, and will be much more valuable to them as a result.”
THE PROPOSED NEW RANKINGS METHODOLOGY
10% Economic activity/Innovation
Research income from industry (scaled against staff numbers)
10% International diversity
Ratio of international to domestic students
Ratio of international to domestic staff
25% Institutional indicators
Undergraduate entrants (scaled against academic staff numbers)
PhDs/undergraduate degrees awarded
PhDs awarded (scaled)
Reputation survey (teaching)
Institutional income (scaled)
55% Research indicators
Academic papers (scaled)
Citation impact (normalised by subject)
Research income (scaled)
Research income from public sources/industry
Reputation survey (research)
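The percentage weightings above suggest how the four broad indicators would combine into a single ranking score. The sketch below assumes a simple weighted average of category scores on a 0–100 scale; the draft does not specify the exact aggregation formula, so this is an illustration only.

```python
# Illustrative combination of the four draft category weightings (10/10/25/55).
# The aggregation rule (a plain weighted average of 0-100 category scores)
# is an assumption for illustration, not THE's published formula.

WEIGHTS = {
    "economic_innovation": 0.10,
    "international_diversity": 0.10,
    "institutional": 0.25,
    "research": 0.55,
}

def composite_score(category_scores: dict) -> float:
    """Combine per-category scores (0-100) into one overall score."""
    return sum(WEIGHTS[c] * category_scores[c] for c in WEIGHTS)

# Hypothetical institution:
scores = {
    "economic_innovation": 70.0,
    "international_diversity": 80.0,
    "institutional": 60.0,
    "research": 90.0,
}
print(round(composite_score(scores), 1))  # 0.1*70 + 0.1*80 + 0.25*60 + 0.55*90 = 79.5
```

With research weighted at 55 per cent, a strong research score dominates the composite: in this example it contributes 49.5 of the 79.5 points.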