The world’s most cited researchers

2019-09-17

Every year, the world’s most cited researchers are listed. Being included on this list can be seen as proof that your research is being used, or that you have done a good job. But the practice of listing highly-cited researchers, and the consequences this can have, has also drawn criticism.

Every year, the analysis company Clarivate Analytics publishes a list of the researchers whose publications have been referenced the most by others. The list, “Highly Cited Researchers”, is based on publications over the last 11 years in the database Web of Science; it was previously designed differently and run by different organisations.

The most recent list was published in 2018 and covered citations and publications during the years 2006–2016. The list covers the 1 per cent of researchers who are most frequently cited in any of 21 different scientific fields, or in a newly-established multi-disciplinary category.

Asking the right questions

Of the just over 6 000 researchers listed, around 70 are based in Sweden. Joakim Larsson, professor of environmental pharmacology at the University of Gothenburg, is listed in the multi-disciplinary category. He hopes that he is cited frequently because his studies are good, something he believes largely comes down to asking the right questions.

“I think that many researchers, including myself, could be more critical as early as the idea stage, and ask the question ‘Is this really relevant and important?’. There is an infinite number of questions. For this reason, it can be a good idea to put more effort into questions where the answer has the potential to be broadly implemented, or to be crucial for advances within a field,” says Joakim Larsson.

He is happy to be included on the list, but at the same time thinks that it is not all that important.

“Being cited a lot can be important when applying for research funding, but these days it isn’t particularly important whether you are included or not on this particular list; the system doesn’t work that way. It is fun to be included on a list that to some extent says that you are doing a good job, but citations are after all just one of many ways of measuring scientific success.”

Not the whole picture

Björn Hammarfelt, docent in library and information sciences at the University of Borås, also says that citations do not give the whole picture.

“Citations aren’t the sole indicator of quality.” But there is a connection between what other researchers perceive as quality and the number of citations that a researcher or article has received.

Scientific fields communicate in different places and at differing speeds. This makes the citation measure a blunt instrument in many fields, he emphasises.

“Citation as a measurement is more useful in some fields than in others. If you are to use citation as a quality measure, then you must use a very narrow sample of researchers all dealing with the same things in order for them to be comparable.”

In some fields, researchers hardly ever publish in sources that are included in the Web of Science, from which the data is taken. Most of the categories on the list are also within the natural sciences and medicine (see the fact box).

“In many disciplines, you could never be included on this list, even if you are a top-notch researcher,” says Björn Hammarfelt.

The list is also affected by the fact that the number of authors per article is not taken into account (except in the categories physics and space science). This means that articles with many authors can easily accumulate many citations.

This is because co-authorship is one of the factors known to characterise highly-cited articles. When several authors disseminate an article through different networks, it tends to be cited more often. Review articles and articles from international collaborations are also generally cited more often.

Did the right things from the start

The list of the most highly-cited researchers also includes Gerhard Andersson, professor of clinical psychology at Linköping University. He thinks that early on, he unconsciously did things that can increase the number of citations, such as publishing with international colleagues, including staff members as co-authors, and citing his own earlier work. He has also published several highly-cited literature reviews together with a Dutch researcher.

He regards being included on the list as proof that his research is having impact, and a sign that he has chosen the right people to collaborate with.

“You only end up here if the studies you have made are used, and that is important. But it is also about me having chosen a research field that does a lot of publishing, and one that my colleagues and I joined so early that we are difficult to avoid when someone does a new study,” says Gerhard Andersson.

Would rather increase influence

Another of the researchers included on the top list is Ilona Riipinen, professor of atmospheric sciences at Stockholm University. She thinks that researchers should focus on increasing their influence rather than their number of citations, as influence is what citation counts are really trying to measure.

She does this by thinking about who will read her articles, and writing in such a way that all relevant researchers can understand and benefit from them. What matters is trying to reach those who need to know about the research, she thinks.

“It is important to attend the right conferences and market your articles, in particular just before and after they are published,” says Ilona Riipinen.

Being included on the list is gratifying, as it is a sign that many people have read her work, she thinks.

“It is proof that my research has influenced others. No method of measurement is perfect, but it is good to try to measure the influence of researchers, and this list at least tries to do so in an objective and transparent way.”

More men on the list

As a rule, there are more men than women on the lists of the most-cited researchers. The company behind “Highly Cited Researchers” does not have any figures for gender distribution on the list, but for the Swedish universities that have issued press releases about their researchers on the HCR list, the male dominance is clear.

Ulf Sandström, a researcher at the School of Industrial Technology and Management at the Royal Institute of Technology, has investigated the gender balance of the most cited researchers in a study*. Of the Swedish researchers found in this group, 18 per cent of the authors of publications between 2008 and 2011 were women. For the most recent four-year period, 2012–2015, this proportion had increased to 22 per cent.

But the differences between scientific fields are large. Within the humanities, for example, the percentage of top-cited women reflects the percentage of women within the research field. In medicine, however, the percentage of women among the top-cited researchers is considerably lower than the percentage of women in the field.

“We don’t know exactly why this is so, but it is clear that women encounter obstacles in some areas, and this might mean that important competence is lost,” says Ulf Sandström.

Women also produce fewer articles than men, which in turn reduces the chances of being highly cited.

How the list is used

The analysis company Clarivate Analytics uses concepts such as “elite researcher” and “exceptional influence” to describe the researchers on the list. But they point out that there is no generally accepted agreement on what constitutes an exceptional research performance or elite status.

Björn Hammarfelt points out that what the company says is one thing; the question is how the list is used and interpreted in the research community.

“Being included on the list may result in a pay increase, or being contacted by universities trying to recruit or become associated with highly-cited researchers in order to increase their ranking. Perhaps not in Sweden, but elsewhere.”

For example, the list is one of the indicators in the “Shanghai ranking”, which ranks universities. The citation list can lead to competition between departments, and being included on the list may become some form of branding for researchers. This could cause problems, Björn Hammarfelt thinks.

“The question is whether this way of describing successful researchers is something that should be encouraged. And what type of researchers will we end up with if they have to strive to be ‘highly-cited researchers’? Focusing on producing a lot in a narrow range of periodicals may mean that other tasks are being ignored.”

Footnote: The method used to rank researchers in the study is not the same as that used for the HCR list.

Text: Sara Nilsson