Impact factor abandoned by Dutch university in hiring and promotion decisions


A Dutch university says it is formally abandoning the impact factor — a standard measure of scientific success — in all hiring and promotion decisions. By early 2022, every department at Utrecht University in the Netherlands will judge its scholars by other standards, including their commitment to teamwork and their efforts to promote open science, says Paul Boselie, a governance researcher and the project leader for the university’s new Recognition and Rewards scheme. “Impact factors don’t really reflect the quality of an individual researcher or academic,” he says. “We have a strong belief that something has to change, and abandoning the impact factor is one of those changes.”

A scientist’s impact factor is a score that takes into account the number of publications and the citation rate of the journals where those papers are published. In this system, articles in highly cited journals such as Science, Nature or Cell count for more than articles in journals whose content is cited less frequently. Boselie says that impact factors — as well as a related measure called the h-index — contribute to a ‘product-ification’ of science that values sheer output over good research. “It has become a very sick model that goes beyond what is really relevant for science and putting science forward,” he says.
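For readers unfamiliar with these two metrics, the sketch below shows how a conventional two-year journal impact factor and a researcher’s h-index are usually calculated. The function names and example numbers are purely illustrative and have no connection to Utrecht’s assessment scheme or to any real journal or researcher.

```python
from typing import List


def journal_impact_factor(citations_to_prev_two_years: int,
                          citable_items_prev_two_years: int) -> float:
    """Two-year impact factor: citations received this year to items
    published in the previous two years, divided by the number of
    citable items published in those two years."""
    return citations_to_prev_two_years / citable_items_prev_two_years


def h_index(citations_per_paper: List[int]) -> int:
    """Largest h such that the author has h papers each cited at least h times."""
    counts = sorted(citations_per_paper, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h


# Illustrative numbers only, not real data:
print(journal_impact_factor(1200, 300))   # -> 4.0
print(h_index([25, 8, 5, 3, 3, 1]))       # -> 3
```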

The new scheme is part of Utrecht’s Open Science programme, a multi-track effort to make research more transparent and cooperative. Open-science fellows embedded within each department will assess progress towards open-access publishing, public engagement and data sharing.

The decision to revamp hiring and promotion was partly inspired by the Declaration on Research Assessment (DORA), a document created in 2012 at the annual meeting of the American Society for Cell Biology. The declaration aims to “improve the ways in which researchers and the outputs of scholarly research are evaluated” and specifically calls for doing away with impact factors as a way to judge the merit of academics. So far, it has been signed by nearly 20,000 individuals and institutions. Utrecht University signed the document in 2019. At the time, Anton Pijpers, the president of the university’s executive board, said that signing DORA wasn’t a “symbolic step” but “a ‘pledge’ for which UU can be held accountable”.

Among academic researchers, dissatisfaction with the use and misuse of the impact factor in evaluations and in tenure, promotion and hiring decisions has grown in recent years. A 2018 report called the impact factor “an inadequate measure for assessing the impact of scientists” and concluded that failure to modify the current assessment system is likely to lead to “continued bandwagon behaviour that has not always resulted in positive societal behaviour”1. Despite this, a 2019 study found that 40% of research-intensive universities in the United States and Canada specifically mention impact factors or closely related terms in documents related to tenure, review and promotion2. Only a few of those references strike a note of caution, and most suggest that a high impact score would be necessary for career advancement.

Every university in the Netherlands, Utrecht included, has signed on to ‘Room for Everyone’s Talent’, a 2019 position paper led by the VSNU, the association of universities in the Netherlands. That paper calls for a system of recognition and rewards that “enables the diversification and vitalization of career paths”.

On a practical level, evaluating researchers on qualities beyond easy-to-measure metrics can be messy and complicated. “It’s going to be quite challenging to apply,” Boselie says. He explains that each department will have to develop its own systems and strategies to identify researchers and academics who are making the most meaningful contributions to their fields. The process might involve interviews with other researchers in a given field, he says. “There are alternative ways to evaluate individuals on their quality.”

Still, doing away with standard metrics could be a risky move for the university and its faculty and staff members. As long as other universities continue to rely on impact factors and other productivity metrics for hiring and promotion, researchers who come up through the Utrecht system might be at a competitive disadvantage if they eventually try to find a job at a different institution, Boselie acknowledges. “There are feelings of insecurity among young academics,” he says. “We feel that it’s a risk that we are willing to take because we believe [the evaluation system] will change in the end.”

Utrecht will not be standing alone in its efforts to change the way researchers are evaluated, says Lynn Kamerlin, a computational biochemist at Uppsala University in Sweden. “As open science becomes more and more important in policy and decision-making surrounding research funding and strategies, I think it will almost be a necessity for institutions to follow suit,” says Kamerlin, who is a member of a European Union group that published a 2019 report on ways to evaluate researchers’ contributions to open science. “It’s always frightening to go first, so having institutions that break ground ahead of you is helpful.”
