Pioneering journal eLife faces major test after loss of impact factor
The open-access journal eLife is facing upheaval after news that the journal will lose its impact factor — a controversial metric based on citations that is often used as shorthand for quality.
Clarivate, the London-based analytics firm that calculates the impact factor, announced the decision last month, after the scholarly database it owns, Web of Science, said it would no longer index eLife’s papers. That move is a result of eLife’s 2023 introduction of a radical publishing model in which it no longer ‘accepts’ or ‘rejects’ manuscripts, but posts all submissions sent out for review, alongside their referee reports.
The developments raise questions about whether authors are willing to dump conventional measures of quality and prestige for what many say is a long-needed change in research publishing. Clarivate’s decisions have led to a dip in eLife submissions — which cost authors US$2,500 per paper sent for review — from some regions.
“I’m very concerned about the financial viability of the journal as a result of this,” says Randy Schekman, a molecular and cell biologist at the University of California, Berkeley, who was eLife’s first editor-in-chief and stepped down in 2019. Schekman notes that when the journal launched, it tried to avoid being given an impact factor. He thinks that the impending loss of the metric now leaves it in peril. “I wish that we had been allowed to evolve without the influence of that phony number.”
eLife says that although it has never supported the impact factor, it understands the loss of that metric will be a problem for researchers in institutions that still heavily rely on it for assessment. Timothy Behrens, co-editor-in-chief of eLife, says that it’s too soon to say what will happen to the journal, but the editorial team is collecting community feedback and watching submissions carefully. “It looks so far like submissions are not collapsing,” Behrens says. “But it’s very early days.”
Review shake-up
The ructions are the latest for a journal that has been in the vanguard of experimental publishing. eLife launched in 2012 as an open-access alternative to high-profile journals in biology. Originally, it had a collaborative peer-review process in which referees and an editor together decided whether to publish a paper. Authors received a single decision letter, rather than reports from each reviewer.
Last year’s change to the ‘publish, review and curate’ model divided researchers — and eLife’s editors, some of whom resigned. Some wanted a rethink of the model, and voiced several concerns, including that paper quality would drop and that greater pressure would be placed on desk rejection, the process in which an editor decides whether a paper is worth reviewing at all. Others thought the new approach should be tested alongside the old one. “I had no objection to doing the experiment,” says Schekman. “My problem was the decision to dump the original, reasonably successful model in favour of an untested model.”
Behrens says that authors have embraced the model, and that many laud its transparency. “We did not see a big drop in submissions when we shifted from something that had been established for 100 years,” says Behrens. “People have put their faith in this idea.”
“The removal of accept–reject was a really great step forward, because as scientists, we know that evaluation of the paper by the community never stops at that decision,” says Sarvenaz Sarabipour, a computational biologist at UConn Health in Farmington, Connecticut. Sarabipour — who published a paper under eLife’s previous model and has served as a reviewer under both — plans to keep submitting her team’s work to the journal. “I think a lot of people, especially early-career researchers, still hope to send their best work to eLife.”
Behrens says that since the model was introduced, the number of submissions hasn’t changed dramatically across fields. But anecdotally, people have noticed differences. From July to October 2024, eLife received around 640 submissions per month, about 150 of which were selected for review and publication (see ‘Submission trends’). In fields such as computational neuroscience, which are nearer to physics and computer science, where academics are used to preprints, researchers have “taken the new model in stride”, Behrens says. But in areas such as medicine or cell biology, where researchers are more accustomed to conventional publishing models, the approach made a bigger difference.
Suzanne Pfeffer, a biochemist at Stanford University in California, has been turned off by the changes. “In my world, nobody wants to publish there anymore,” says Pfeffer, who was previously a member of eLife’s editorial board but resigned over the new model. “Under the previous form of eLife, we sent our best work there, because we really felt our work was fairly and thoughtfully evaluated with the consultative review process.”
Impact factor
The switch to the publish, review and curate model led Web of Science to stop including eLife papers in its database from October this year. Web of Science indexes journals from cover to cover only if all research articles are validated by peer review, or if publishers supply a separate feed containing only the subset of content that is. eLife’s current model decouples peer review from the validation that typically comes with an accept–reject decision.
In November, Clarivate announced that, from 2025, it would no longer issue eLife with an impact factor. Academics have long criticized the metric, which is billed as an indicator of a journal’s quality but is often used to judge individual papers. For many researchers, decisions on hiring, tenure and funding are influenced by the impact factors of the journals in which they publish.
“What we’re seeing is conflict between two contrasting views: the idea that an article should be judged solely on that article or that you can judge the quality of an article by the company it keeps,” says Richard Sever, co-founder of the preprint repositories bioRxiv and medRxiv and assistant director of Cold Spring Harbor Laboratory Press in New York.
Many decried Clarivate’s decision, saying that it was holding back innovation in scholarly communication. eLife saw submissions from China — where the impact factor is widely used — drop steeply, from more than 100 in the first halves of September and October to fewer than 50 in the first half of November (see ‘Impact-factor effect’). In other regions, including the United States and Europe, submissions remained relatively steady.