RESEARCH THAT PROMISES MORE THAN IT CAN DELIVER:

“We can’t just pretend that all studies are good”

2020-11-30

Research studies are not always conducted as they should be.
They may promise too much, show relationships that do not exist, or prove impossible to replicate. Two researchers who have reviewed studies in different disciplines know this – and have found plenty of shortcomings.

Jonas Josefsson, a conservation biologist, has done something unusual: questioned his own studies. His interest in scrutinising research began while he was still a doctoral student, when his seminar group met every Friday to discuss articles.

“It always ended up with us sitting there complaining about the methods section, rather than discussing the actual results.”

Now, some years later, together with his colleague Tomas Pärt and the entire Landscape Ecology Unit at the Swedish University of Agricultural Sciences (SLU), he has conducted a study in which they reviewed 216 international articles on environmental support for farmers. The review pointed to major shortcomings – even in the three or four of their own studies that were included.

Poorly designed studies

One problem is that many of the studies say nothing about how effective environmental support is at safeguarding biological diversity, because they are not designed to measure this.

Jonas Josefsson has found that one of the biggest problems is that it is left to landowners and decision-makers to determine where the measures are implemented. This means there is no guarantee that the support ends up where it does the most good.

“If we had been involved as researchers right from the start, we would have used a design where you study a supported area before and after the measure, and compare it with a control area where no measure has been taken during the same period. In the best of all worlds, the areas that receive support or serve as controls would also be randomly selected, but for practical reasons this is often difficult.”
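The design Josefsson describes is what ecologists call a before-after control-impact (BACI) comparison. As a rough sketch of the idea – the data, column names and effect size below are invented for illustration and are not taken from the SLU study – the interaction between period and treatment is what isolates the effect of the measure:

```python
# Hypothetical BACI (before-after control-impact) sketch. The data, column
# names and effect size are invented; they are not from the SLU review.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200  # survey plots per group and period

df = pd.DataFrame({
    # treated = 1: plots that received environmental support
    "treated": np.repeat([0, 1], 2 * n),
    # period = 1: surveyed after the measure was implemented
    "period": np.tile(np.repeat([0, 1], n), 2),
})
# Simulated species richness: a baseline, a time trend shared by all plots,
# and a true effect of the measure (+2 species) in treated plots afterwards.
df["richness"] = (
    10
    + 1.5 * df["period"]
    + 2.0 * df["treated"] * df["period"]
    + rng.normal(0, 2, len(df))
)

# The period:treated interaction is the BACI estimate: the change in treated
# plots over and above the change seen in control plots in the same period.
model = smf.ols("richness ~ period * treated", data=df).fit()
print(model.params["period:treated"])  # close to the true effect of 2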

The researchers at SLU have also reviewed how the studies handle correlation and causality – that is, whether they merely show a link between an action and a result, or whether they can also demonstrate cause and effect, meaning that the result is actually caused by the action.

“We wanted to look at whether correlative studies use causal language, even when the available data do not actually support it.”

Claiming causal relationships that do not exist

It emerged that two thirds of the studies claim a causal relationship even though their data only show a correlation.

“I have been guilty of this myself – sharpening things up and using causal language. When publishing, you have to be clear and distinct, and then it is easy for these things to slip through.”
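Why the distinction matters can be seen in a small simulation – a hypothetical sketch, not from the review – where a confounder (land quality) decides both which plots receive support and how species-rich they are. The support itself does nothing, yet a naive comparison shows a large “effect”:

```python
# Hypothetical sketch of correlation without causation. Land quality (a
# confounder) decides both which plots get support and how species-rich
# they are; the support itself has zero effect in this simulation.
import numpy as np

rng = np.random.default_rng(1)
quality = rng.normal(0, 1, 10_000)                      # unobserved land quality
support = quality + rng.normal(0, 1, 10_000) > 0        # support follows quality
richness = 10 + 2 * quality + rng.normal(0, 1, 10_000)  # no support term at all

# A naive comparison still shows supported plots as richer (about 2.3 species),
# purely because support went to the better land.
print(richness[support].mean() - richness[~support].mean())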

The research team also looked at whether the studies discuss possible systematic errors or alternative interpretations. It turned out that hardly any did.

Jonas Josefsson is not surprised by the results.

“We suspected that this was the case. It is our responsibility to begin a discussion about this, because if we just pretend that all studies are good, then there will be no change.”

Trying to influence journals

One possible way forward is to try to influence journals. The research team is now writing to the 40 largest environmental protection journals to find out whether they are interested in setting higher standards.

“We want researchers to discuss potential errors and shortcomings in their articles. Currently there is no formal requirement for this, no framework such as those introduced by certain social science journals.”

“At worst, poorly executed studies can mean that environmental support is paid out to the wrong places – and, above all, that it becomes impossible to assess the effects of the measures taken.”

“It is difficult to know what you get for your money.”

Used to reviewing the research of others

Anna Dreber Almenberg, Professor of Economics at the Stockholm School of Economics, is also used to reviewing the research of others. She and her colleagues often discuss studies that they are sceptical about.

They took part in the major international study Open Science Collaboration 2015, in which 270 researchers repeated 100 psychology studies. It turned out that half of the original findings could not be replicated, and only a third of the replication results were statistically significant.

Anna Dreber Almenberg and her colleagues, led by Eva Ranehill at the University of Gothenburg, have also conducted a widely noted replication study of ‘power posing’. The original study claimed that holding the body in power poses for a few minutes leads to hormonal and behavioural changes.

“We repeated this with five times as many participants – and could not replicate the main findings.”
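The much larger sample is typical of replications: original studies often overestimate effects, and reliably detecting a smaller true effect requires many more participants. A hypothetical back-of-the-envelope calculation (the effect sizes are illustrative, not taken from the power-posing studies):

```python
# Hypothetical power calculation showing why replications recruit larger
# samples. If the true effect is half the originally reported one (a common
# pattern), the required sample roughly quadruples.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower()
for d in (0.6, 0.3):  # reported effect size vs a plausibly smaller true one
    n = power.solve_power(effect_size=d, power=0.9, alpha=0.05)
    print(f"effect size d={d}: about {n:.0f} participants per group")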

All disciplines

Anna Dreber Almenberg believes that the quality of psychology research in particular has increased markedly as a result of these replication studies. Replication can be more difficult in disciplines such as economics or political science, for example when the object of study is a one-off event that cannot be repeated. But all disciplines should be reviewed, she thinks.

“We must expect that there will be some incorrect results, even from researchers who carry out good research. That is part of the process.”

She notes that false positive results tend to have a fairly long afterlife in the literature. Replication studies are difficult to get published, and even when they appear in a good journal, it is not certain that those who read the first study will discover the second.

“It is not that researchers are lazy or sloppy, but it is difficult to keep up to date with everything that is cited. Incorrect results do disappear eventually, and we have to speed up that process. Otherwise we risk reducing people’s trust in science, and that is problematic. But there are many reasons to believe that trust is in fact increasing.”

Reducing the risk of making errors

Her advice to other researchers for reducing the risk of errors is to draw up a pre-registered analysis plan, in which you specify in advance every analysis to be conducted, and then follow it strictly. An alternative is a multiverse analysis, where you run every defensible analysis of the material and report all of the results.
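As a hypothetical illustration of the multiverse idea (the data and the analysis choices below are invented), every combination of reasonable choices – here, outlier cutoffs and a raw versus rank-based correlation – is run, and every estimate is reported, so no single flattering result can be cherry-picked:

```python
# Hypothetical multiverse sketch: instead of choosing one analysis path,
# run every defensible combination of choices and report them all.
from itertools import product
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0, 1, 500)
y = 0.3 * x + rng.normal(0, 1, 500)
y[::50] += 8  # a few extreme values, as real data often have

cutoffs = [None, 3.0, 2.5]  # drop points more than k standard deviations out?
transforms = {"raw": lambda v: v, "rank": lambda v: v.argsort().argsort()}

for cutoff, (name, f) in product(cutoffs, transforms.items()):
    keep = (np.ones_like(y, dtype=bool) if cutoff is None
            else np.abs((y - y.mean()) / y.std()) < cutoff)
    # Pearson correlation on raw values, Spearman via the rank transform.
    r = np.corrcoef(f(x[keep]), f(y[keep]))[0, 1]
    print(f"cutoff={cutoff}, transform={name}: correlation = {r:.3f}")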

“Transparency is crucial. We often fall in love with our hypotheses, and that makes it extra important to try to protect ourselves against believing that we have found something exciting – when we have not.”

Text: Helena Östlund
