Last autumn, Karolinska Institute dropped from 49th to 53rd place in the Times Higher Education (THE) rankings. But that does not really matter very much, according to Nils Hansson, a professor of the history, theory and ethics of medicine at Heinrich Heine University in Düsseldorf.
He is the author of the book Genier utan Nobelpris (‘Geniuses Without a Nobel Prize’), which was published in 2025. As the title suggests, it is about people who narrowly miss out on receiving these coveted prizes, and about how those individuals are at least as good at their profession as those who actually win. Building on this, Hansson has also written several opinion pieces criticising the excessive prominence given to international university rankings; the way this imitates sports, counting points, goals and – when the season is over – final positions in a league table to determine who is best. That method simply does not work for determining whether one university is better or worse than another, he argues.
“Rankings are also a fairly new phenomenon. It is not as though a hundred years ago people did not know which were the leading universities in certain fields and which were not. I do not think that ranking tables have contributed all that much in that regard,” he says.
He would like to see an analysis of what the rankings actually measure. For example, he thinks that bibliometrics, which several of the rankings use as an indicator, is a relatively crude measure of the impact of research.
“Universities are very keen to announce whether they have moved up or down a few places in the lists. For a Swedish audience in particular, I wonder whether this does more harm than good. It is not as if Swedish universities are in the top 10 of any of the major rankings; they are in the top 100, or even further down.”
Three international university rankings are generally regarded as well-established and reliable: Times Higher Education (THE), the QS World University Rankings and the Academic Ranking of World Universities (ARWU).
On Lund University’s website, you can read that it tops the latest QS sub-category for sustainability, and is also ranked number one among universities in the Nordic region in QS’s overall ranking. In a post published on social media by Lund University, Vice-chancellor Erik Renström describes this as “both a recognition and a responsibility”. In the comments under the post, Lund University is congratulated on having beaten both the University of Copenhagen and Uppsala University.
Kristina Eneroth, Deputy Vice-chancellor of Lund University, describes the communication work about rankings as part of the efforts to raise the university’s international profile. She believes it is important to distinguish between the Swedish market, which has relatively few higher education institutions and where the major players are well known to most people, and the international market, which sees a different kind of competition.
“But nobody believes that there is any ranking organisation that is capable of capturing the full complexity of a university’s activities and reducing it to a single figure that says everything.”
For Lund University, she explains, quality development is the key factor when it comes to working with the different ranking systems. When working with quality matters, you need to know where you stand in relation to the wider world.
“Rankings are part of that work. They do not provide a complete picture, and you cannot find answers to all your questions simply by looking at a figure in a ranking result. But they do provide an indicator of how a university is faring in international competition.”
However, she describes a “magic threshold” to bear in mind. Some countries have scholarship programmes where international students can only receive scholarships to top 100 universities. “If you are not on the right side of that line, it is harder to recruit highly talented international students. That is the harsh reality.”
Magnus MacHale-Gunnarsson, a researcher at the University of Gothenburg, has been working with higher education institution rankings since 2010. He is the person who submits the university’s data to the different ranking organisations and analyses the outcomes. He does not believe it is possible to measure education in the same way as, for example, sports results.
“No, and that is largely because university quality is so multidimensional. There are so many different aspects to it.”
His sceptical attitude towards rankings is primarily due to the methodology.
“Their method for measuring university quality is absolutely terrible. It would not even get you a pass as a Bachelor’s essay. The rankings do not have very many different possible indicators to use.”
Research intensity is one example of flawed measurement methodology, he argues. A higher education institution with a high proportion of research relative to the number of students, a large amount of research funding compared to teaching funding, or a high number of PhD students relative to undergraduate students is not necessarily better than others.
“It is not certain that a higher education institution is inferior simply because it has a higher proportion of teaching compared to research. QS and Times (the THE ranking, editor’s note) talk a lot about using the rankings to help students choose a higher education institution. So surely you cannot assume that a high proportion of teaching is a bad thing?” he says.
On the other hand, rankings do serve a purpose for international students who wish to compare higher education institutions at a general level. After all, he emphasises, the indicators used in the rankings are not entirely plucked out of thin air.
In South Korea, a number of universities launched a boycott of the 2023 QS rankings – which has since ended – citing methodological shortcomings. In recent years, there have also been examples of European universities withdrawing from rankings. At the start of 2026, the Sorbonne University in France stopped submitting data to the THE rankings. The higher education institutions in Zurich in Switzerland and Utrecht in the Netherlands did the same in 2024. Utrecht University also withdrew from the QS rankings but still appears in them, as QS does not rely solely on data reported by the higher education institutions themselves.
Universitetsläraren sent out questions to 35 Swedish universities and colleges asking whether or not they submit data to international rankings. Of the 20 that replied, 9 stated that they do not.
MacHale-Gunnarsson believes that higher education institutions are caught in a “ranking trap”, where they cannot stop submitting data to the different lists.
“What happens when you are not on the list is that fewer students see you and you get fewer applicants.”
The fact that some rankings list higher education institutions even though they do not submit data makes the trap even harder to escape, he continues.
“Then they dig up low-quality data that they cannot verify or put in context, and that is almost worse.”
Looking just at the rankings of Swedish higher education institutions, the Karolinska Institute (KI) tops two out of three lists. However, according to Annika Östman Wernerson, Vice-chancellor of KI, the more general rankings are not the most important ones for KI.
“The more subject-specific rankings are of greatest interest to us,” she says. “They are also most relevant to those who use them in other countries, namely students.”
A good position in the rankings is good for the KI brand, she explains. At the same time, the lists are blunt measuring tools. The attention surrounding a change in ranking can therefore be difficult to interpret from a quality perspective.
“Small changes up and small changes down are sometimes blown out of proportion. So it can be both good and bad. But they do not necessarily mean that our quality has become either better or worse.”
However, KI does not work particularly actively to climb the rankings, she tells us. They submit data to four ranking organisations, monitor what happens and try to understand why their ranking position goes up or down. They continuously publish where they are placed and have staff who work specifically with rankings.
“It is not their only job of course, but we do have staff who keep an eye on the rankings.”
Annika Östman Wernerson believes that measuring research and education is problematic. One reason is that ranking systems regularly change their measurement models, criteria and weightings. She also emphasises that no ranking provides a complete picture of a higher education institution’s quality. Another problem, she believes, is that it is not entirely transparent how certain measurement methods are designed or how they change.
“Some ranking organisations send out surveys asking people to give their opinions on universities, and these may have a response rate of just one per cent. Such a method is hardly scientific, and if such data forms the basis for rankings, it is obvious that they can be called into question.”
When it comes to education, there are also no reasonable measurable indicators at a global level, notes KI’s vice-chancellor.
“We have not been able to see that these rankings are the reason why researchers or support staff apply to KI. So it is also interesting that, in education, there are actually no good ways to measure performance,” she says.
Lars Geschwind, the Executive Director of SULF, is also a professor at the Royal Institute of Technology (KTH) and has conducted research into the organisation of higher education. He previously worked at Uppsala University and Södertörn University. He believes that how higher education institutions view international rankings has evolved over time.
“It is incredible that we have such highly ranked universities,” he says. “But at those higher education institutions, rankings also play a major role. At KTH, where I have been involved in various areas, I have seen this very clearly. It shapes many of the strategies, and for some vice-chancellors this is more evident than for others. Management teams, and not least university boards, can be extremely interested in rankings.”
International rankings also play an increasingly important role in student recruitment, particularly in certain fields. How a higher education institution ranks also matters when it comes to recruiting staff, he believes.
“Whereas the old universities in particular used to be able to rely on their status, their traditions and their incredibly strong brand,” he continues, “sometimes dating back to the Middle Ages, as in Uppsala’s case, there are now also newcomers. There are opportunities to advance, gain ground and be ranked higher. I believe that driving force is irresistible to leaders at different levels.”
Since it is difficult to measure quality, excellence and a university’s performance, he thinks rankings provide a simpler alternative.
“Rankings also say something about how competitive universities are in a market. They are competing for students, teachers and researchers. For some of those people, it is important how highly ranked a university is.”
Modern higher education institutions have a different type of governance, and they approach marketing differently. There are now clearer targets to measure against, as well as clearer strategies and ambitions. Outwardly, it may seem as though a higher education institution is not working particularly actively on its ranking positions, whilst at the same time celebrating when things go well.
“With the multitude of rankings available now, there is always one where they are doing better. It is becoming a bit like opinion polls. There is always one you can choose to highlight.”
According to Anders Söderholm, Vice-chancellor of KTH, the question of how to approach the THE and QS rankings is currently on their agenda.
“Yes, to try to understand in a little more depth how they use data, what data they use, and also to ensure that data about us exists in a form that is accessible to them. We are trying to understand the system as much as we can,” he says.
The idea is to take a more strategic approach to university rankings and develop a long-term plan for working with them, he explains.
“We have a good system and a good way of working with it, but perhaps we could do it in a simpler and more effective way. It is part of the whole package we call strategic operational analysis, where we monitor everything from applicants per place and broadened recruitment to bibliometrics and external funding.”
Rankings are also important in KTH’s international contacts, says Söderholm.
“In the local context, I do not think there is any need for this type of ranking system, because we know each other well. It is more relevant for more distant contacts.”
Linköping University is just outside the top 200 in THE and some way outside the top 300 in the latest QS and Shanghai rankings. Jan-Ingvar Jönsson, the Vice-chancellor, would ideally like to see Swedish higher education institutions stop referring to rankings. But if that is to happen, it will require a collective effort, he says.
“If you are going to try to challenge a system that everyone relates to more or less strategically, then surely some country should take the lead on this. I think that could have an impact. But Sweden is relatively small, so it is a big step.”
Jönsson sees a clear contradiction between the international collaboration CoARA, the Coalition for Advancing Research Assessment, and the ranking lists.
“Within CoARA, there is a sort of agreement that we should try to move away from traditional bibliometrics and instead start looking at broader parameters that are not currently used,” he says. “At the same time, these ranking companies use precisely that parameter. There is a clash there.”
An agreement on methods for evaluating research
The Coalition for Advancing Research Assessment (CoARA) is an alliance of different organisations that aims to reform the methods used to evaluate research. Over 700 organisations within CoARA have signed the Agreement on Reforming Research Assessment.
According to the agreement, the signatories are to strive to ensure that research is assessed and evaluated primarily on the basis of qualitative measures, with a focus on peer review. The agreement also includes a clause stating that the organisations will avoid using lists that rank research organisations.
Source: CoARA
The quantitative measures within bibliometrics need to be relaxed in favour of more qualitative measures, he believes, citing student satisfaction as an example. But he has no ready answer as to what might replace the rankings.
Linköping University does not have a clear strategy for climbing the rankings, though Jönsson would certainly like to get over that top 200 threshold.
“I think we are currently the fifth or sixth best in the country in the Times Higher Education rankings. Obviously, achieving that goal would be very nice, but it is not part of a conscious strategy,” he says.
Hans Wiklund is the University Director at Umeå University, which sits just outside the top 400 in the THE, QS and Shanghai rankings. He views the rankings as a component within quality assurance work and explains that they offer educational and communicative benefits.
“In a very simple way, you can make some sort of overall assessment of a higher education institution’s performance,” he says. “And it is also something that is easy for a higher education institution to communicate. On the other hand, it is very important to bear in mind that reality is infinitely more complex.”
However, placing too much emphasis on university rankings when working on the quality of education and research can lead a university to focus on the wrong things, he explains.
“You can end up being driven more by the ambition to climb the rankings than by improving quality. That leads to a kind of corrupt governance, where you steer towards an illusion of quality rather than genuine quality. A great deal of responsibility falls on the leadership at each higher education institution.”
In addition to the doubts surrounding the methods used in international rankings, Hans Wiklund also wishes to emphasise that ranking organisations operate in a commercial market.
“If rankings are to be used at all, it might have been preferable if they had been owned by, and agreed upon within, the sector. That the higher education institutions themselves influenced the design and development of the rankings,” he says.
Timeline: International university rankings
1910
A list of the one thousand most prominent scientists is compiled in the United States and linked to the universities from which they graduated. When this is related to the institutions’ total number of researchers and university teachers, the ranking system is born.
1983
U.S. News & World Report’s list of the best higher education institutions begins to be published. In the years that follow, ranking lists become increasingly established as a business model in the United States. Universities start developing strategies for how to move up in the rankings.
1992
The United Kingdom’s first university ranking is introduced: the Good University Guide from the Sunday Times.
1999
The Guardian’s university ranking is launched in the UK.
2003
The first international ranking, the Academic Ranking of World Universities (ARWU), is launched. It is also known as the Shanghai Ranking and is introduced by Shanghai Jiao Tong University. Karolinska Institutet lands in 39th place.
2004
Times Higher Education (THE) and QS World University Rankings begin as a joint collaboration.
2008
The former Swedish National Agency for Higher Education investigates whether students can benefit from university rankings. The conclusion is no.
2009
Times Higher Education and QS end their collaboration, and THE changes its methodology.
2023
Utrecht University in the Netherlands stops submitting data to, among others, THE and QS after no Dutch university placed in the top 50 of the ranking. The University of Zurich also stops sharing data with THE and QS.
2023
In the United States, Columbia University leaves the domestic U.S. News & World Report College Rankings. The year before, the institution had fallen 16 positions on the list. Yale and Harvard have also left the same ranking.
2025
Sorbonne University in France announces that it is leaving THE. The university will stop submitting data to the ranking from 2026 onward.
2026
Karolinska Institute, Lund University, and KTH place in the top 100 of the THE ranking. In QS’s European ranking, Lund University ranks twelfth.