

‘Scientific research must be replicable’


Why is scientific research often so hard to replicate? The Netherlands Reproducibility Network, a new project launched last week with a bit of support from the Dutch Research Council (NWO), intends to address this issue.

Ever since the Dutch psychology professor Diederik Stapel was exposed for fabricating all kinds of plausible-sounding data – a story that made international headlines – the reproducibility of scientific research has been under increased scrutiny. Why had no one ever tried to reproduce his research results?

Since Stapel’s fraud came to light, a large number of studies have been conducted in a variety of fields, attempting to replicate previous research. These consistently show that 40 to 50 percent of the results cannot be confirmed, according to epidemiologist Michiel de Boer of UMC Groningen.

Within academia, he says, people are calling it a ‘replication crisis’. So far, around 20 countries have established networks to draw attention to the reproducibility of scientific research. Together with a number of colleagues from other universities, De Boer has now established such a network for the Netherlands: the Netherlands Reproducibility Network (NLRN).

Modest

The network is starting modestly, with a small grant of 250,000 euros for the next three years from research funder NWO. De Boer: “We have a coordinator, who works part-time, and we can also organise conferences and develop training materials.”

Fraudulent researchers are not the biggest problem

The goal is not necessarily to actually repeat past research, says De Boer, but to ensure the possibility of replication. Researchers should work as transparently as possible so that others can repeat their process step by step.

“Some scientists don’t see the problem”, De Boer explains. “They say: didn’t I explain what I did in my methodology section? But scientific journals only give you, say, 400 words to explain your method, which is never enough to go into detail. And if you want to replicate a study, you also need the original data, or the software and code that were used, and so on.”

Fluke

The creation of the network is part of a larger drive towards open science, De Boer affirms. “But open science is also about things like freely accessible articles, and that’s not something we’re concerned with. We’re specifically looking at the possibility of repeating scientific research.”

Because they’re so rare, academic fraudsters like Stapel are actually not the biggest problem, De Boer believes. Researchers who decide to cut corners and only publish significant outcomes are much more common. This can sometimes make for conclusions that sound spectacular, even though they’re actually based on a fluke.

Meanwhile, science is also plagued by ‘publication bias’, says De Boer. Journals only publish outcomes that are deemed interesting – and that can’t always be replicated.

Rembrandt

Is reproducibility relevant to all disciplines, even those where scientific experiments and data hardly play a role? Yes, says De Boer. “In a field like history, the discussion is still in its infancy, but we are seeing new initiatives. For example, there’s someone who’s trying to replicate the attribution process for two Rembrandt paintings, to see what kind of challenges that might entail.”

The network itself is also in its infancy, De Boer concedes. Together with his colleagues, he’s in talks with many potential partners, but so far only one institution – his own employer, the University of Groningen – has officially joined the network. “We do still have some way to go, but we’ve already connected with all kinds of local initiatives, and we look forward to introducing them to each other.”

HOP, Bas Belleman

HOP Hoger Onderwijs Persbureau

Do you have a question or comment about this article?

redactie@hogeronderwijspersbureau.nl
