Last year, Lund University was ranked the top Swedish university in the QS Ranking, one of the current global rankings for higher education institutions.

Ranking lists important for Swedish universities

2016-09-08

Interest in global university rankings has grown rapidly. With the results these days being used by funders, students and prospective colleagues, it is important for institutions to rank highly in the listings – whatever one’s view of their quality. Curie asked a number of Swedish institutions to explain their position on the university ranking phenomenon.

Torbjörn von Schantz

“I got a lot of criticism for that comment – I was told that I hadn’t shown pride in Lund University,” says Torbjörn von Schantz, Vice-Chancellor at Lund University. “I am proud, of course, but that isn’t related to how well we do in the rankings.”

He is describing the situation last year when Lund University secured top place amongst Swedish universities in the QS Ranking, one of the current rankings for higher education institutions (read more about university rankings in Curie). In an interview in the Sydsvenskan newspaper, von Schantz played down the result and said the rankings were “the most unintellectual of debates” – a description he stands by today.

“The companies behind the ranking systems can alter their criteria from year to year, which affects how universities are placed. The whole thing is also getting more and more commercial – there are expensive conferences where the rankings are discussed and consultants offer their services to help universities to improve their placing.”

Must not determine what universities do

At the same time, he says that the results of the rankings are important. The lists are read with interest by prospective students, future colleagues and some donors, and universities monitor their results and analyse them carefully. But von Schantz says they must not be allowed to determine what universities do.

“I question the value of the rankings, but I do recognise that they are very important and that we must therefore take a position on them,” he says. “We need to put them in perspective. Our own aim is to improve quality on the basis of the parameters that we believe drive up quality, not to climb in any rankings.”

Otherwise, von Schantz says, the consequences can be absurd. By way of illustration, he mentions one measure used in the rankings: the number of professors per student.

“You might then argue that we should reduce the requirements for promotion to professor so that we get more professors. That type of thinking is counterproductive.”

Some people also fear that an exaggerated focus on listing results could lead to institutions deprioritising activity that does not help them rise in the rankings. That might apply to community collaboration or teaching and learning – factors that the rankings often disregard.

Consider the impact of the rankings

Arne Johansson

But Arne Johansson, Deputy Vice-Chancellor for research at the Royal Institute of Technology (KTH), does not think that aspiring to climb up the rankings has a negative effect on an institution’s work. KTH’s development plan states that the Institute must consider the impact of the rankings in its strategic deliberations, and that “KTH’s position as a leading technological university must be strengthened and demonstrated in the most relevant rankings.”

“What we are doing is addressing the overriding quality issues that a research university has to work on in order to develop – publishing where we will have most effect, offering a good education and recruiting well from across the world,” says Johansson.

To increase its number of citations and so rise in the rankings, KTH has started to monitor bibliometric data on an annual basis, and has also introduced an indicator into its resource allocation system that places a premium on publication in high-impact journals.
Johansson observes that it is important for KTH to be as high in the rankings as possible.

“It affects our ability to recruit the best researchers and attract the best international students. It is also important for collaboration with good universities in other countries.”

Important for marketing

Hans Adolfsson

Hans Adolfsson, Vice-Chancellor of Umeå University, also says that it is important for the university to be visible in the rankings for marketing purposes. This applies to some extent to the recruitment of researchers and teachers but is particularly relevant to the issue of attracting international students.

“Even though the major rankings scarcely reflect the educational side, many students at both undergraduate and graduate level do read the lists,” says Hans Adolfsson.

“Many people look at the placing and think the figure is an absolute one, whereas in reality there is often a great deal of uncertainty behind it – for example, how well the ranking process has been done, what indicators have been used and whether they reflect how good the university actually is.”

One common criticism of university rankings is that they measure only a small part of the institutions’ activity – mainly international publication and international reputation – yet purport to be a comprehensive measure of quality. Adolfsson believes that the rankings have limited benefit as a measurement of the overall quality of a university, especially for an institution with a wide-ranging remit such as Umeå.

“We have everything from medicine to arts programmes, so it’s extremely difficult to describe the overall quality of the university in just one figure. But there are individual figures in the calculations that can provide information on quality, such as the extent of research citations.”

Umeå University does not have any explicit strategies to achieve a higher position in the rankings. Adolfsson thinks that would lend them undeserved legitimacy.

“That would be presenting them in a better light than is warranted, and I don’t think we should be doing that,” he says. “It’s better to put systematic quality improvement measures in place that are not there with the aim of rising up the rankings.”

Using their own measuring tools

Karolinska Institutet (KI) does not have any stated long-term goals to climb up the rankings either, according to Operations Controller Björn Forslöw.

“We work actively with our own measuring tools and indicators; these are more important for our strategic positioning. But some indicators may of course coincide.”

KI supplies data for a number of selected international rankings and follows their progress, says Forslöw.

“In some cases it can be interesting to see whether they reveal long-term tendencies either for us or other institutions. But it’s important that any change is real change, not just changes in methodology. Neither should we make too much of the results. Often what happens is that small differences in the underlying results produce changes that look significant in the ranking.”

KI features its position in a number of different rankings on its website and refers to them in information material. Forslöw says that some stakeholders expect universities to present their position in the rankings and explain any changes.

“For example, some research funders from countries that don’t know very much about KI. Lots of international students also follow the rankings – more in, for example, Asian countries than in Western Europe.”

However, he does not feel that the rankings have any significance when establishing collaborations with other institutions or recruiting researchers.

Read more in Curie: Criticism for popular rankings

Text: Sara Nilsson
Photo: Nina Ransmyr