Quality of scientific papers questioned as academics 'overwhelmed' by the millions published

The Guardian, 13 July 2025
It was, at first glance, just another scientific paper, one of the millions published every year, and likely to receive little to no attention outside the arcane field of biological signalling in stem cells destined to become sperm.
But soon after the paper was published online, in the journal Frontiers in Cell and Developmental Biology, it found a global audience. Not all of the readers came for the science.
The reason for its broader appeal? An eye-catching image, which depicted a rat sitting upright with an unfeasibly large penis and too many testicles. Its body parts were labelled with nonsense words such as 'testtomcels' and 'dck'.
Rather than fading into academic obscurity, the paper soon became the subject of mainstream media mockery. 'Scientific journal publishes AI-generated rat with gigantic penis', reported Vice News. 'It might be considered an AI cock-up on a massive scale,' intoned the Daily Telegraph.
The images had indeed been generated by artificial intelligence (AI), but that was permitted under the journal's rules. The problem was the authors had not verified the accuracy of the AI-generated material. Neither the journal's staff nor its expert reviewers caught the glaring errors. Three days after publication, the paper was retracted.
What separates the anecdote from other stories of AI mishap is the glimpse it provides into wider problems at the heart of an important industry. Scientific publishing records, and plays gatekeeper to, information that shapes the world, and on which life and death decisions are made.
The first scientific journal was published by the Royal Society in 1665. The maiden issue of Philosophical Transactions told readers about a spot on Jupiter, a peculiar lead ore from Germany, and a 'monstrous' calf encountered by a butcher in Lymington.
Since then, journals have been the chronicle of serious scientific thought. Newton, Einstein and Darwin all posited historic theories there; Marie Curie coined the term 'radioactivity' in a journal.
But journals are more than historical records. Groundbreaking research in critical fields from genetics and AI to climate science and space exploration is routinely published in the growing number of journals, charting humanity's progress. Such studies steer drug development, shape medical practice, underpin government policies and inform geopolitical strategies, even estimates of fatalities in bloody military campaigns, such as Israel's assault on Gaza.
The consequential nature of journals, and potential threats to the quality and reliability of the work they publish, have prompted leading scientists to sound the alarm. Many argue that scientific publishing is broken, unsustainable and churning out too many papers that border on the worthless.
The warning from Nobel laureates and other academics comes as the Royal Society prepares to release a major review of scientific publishing at the end of the summer. It will focus on the 'disruptions' the industry faces in the next 15 years.
Sir Mark Walport, the former government chief scientist and chair of the Royal Society's publishing board, said nearly every aspect of scientific publishing was being transformed by technology, while deeply ingrained incentives for both researchers and publishers often favoured quantity over quality.
'Volume is a bad driver,' Walport said. 'The incentive should be quality, not quantity. It's about re-engineering the system in a way that encourages good research from beginning to end.'
Today, following the dramatic expansion of science and publishing practices pioneered by the press baron Robert Maxwell, tens of thousands of scientific journals put out millions of papers annually. Analysis for the Guardian by Gordon Rogers, the lead data scientist at Clarivate, an analytics company, shows that the number of research studies indexed on the firm's Web of Science database rose by 48%, from 1.71m to 2.53m, between 2015 and 2024. Tot up all the other kinds of scientific articles and the total reaches 3.26m.
In a landmark paper last year, Dr Mark Hanson at the University of Exeter described how scientists are 'increasingly overwhelmed' by the volume of articles being published. Keeping up with the truly original work is only one issue. The demands of peer review – where academics volunteer time to vet each other's work – are now so intense that journal editors can struggle to find willing experts.
According to one recent study, in 2020 alone, academics globally spent more than 100 million hours peer reviewing papers for journals. For experts in the US, the time spent reviewing that year amounted to more than $1.5bn of free labour.
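Figures like these are essentially back-of-the-envelope products of hours and an assumed value of researchers' time. A minimal sketch of that arithmetic, with illustrative numbers rather than the study's own inputs:

```python
# Back-of-the-envelope sketch of how a figure like "$1.5bn of free labour" is
# built: hours spent reviewing multiplied by an assumed value of a researcher's
# time. Both numbers below are illustrative, not the study's own inputs.
us_review_hours = 30_000_000   # assumed US share of the ~100m global hours (illustrative)
hourly_value_usd = 50          # assumed value of one researcher-hour (illustrative)

value_usd = us_review_hours * hourly_value_usd
print(f"Estimated value of US reviewing time: ${value_usd / 1e9:.1f}bn")  # -> $1.5bn
```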
'Everybody agrees that the system is kind of broken and unsustainable,' says Venki Ramakrishnan, a former president of the Royal Society and a Nobel laureate at the Medical Research Council's Laboratory of Molecular Biology. 'But nobody really knows what to do about it.'
In the 'publish or perish' world of academia, where and how often a researcher publishes, and how many citations their papers receive, are career-defining. The rationale is reasonable: the best scientists often publish in the best journals. But the system can lead researchers to chase metrics. They might run easier studies, hype up eye-catching results, or publish their findings over more papers than necessary. 'They're incentivised by their institute or government funding agencies to put out papers with their names on them, even if they have nothing new or useful to say,' says Hanson.
Scientific publishing has a unique business model. Scientists, who are typically funded by taxpayers or charities, perform the research, write it up, and review each other's work to maintain quality standards. Journals manage the peer review and publish the articles. Many journals charge for access through subscriptions, but publishers are steadily embracing open access models, where authors can pay up to £10,000 to have a single paper made freely available online.
According to a recent analysis, between 2015 and 2018, researchers globally paid more than $1bn in open access fees to the big five academic publishers, Elsevier, Sage, Springer Nature, Taylor & Francis, and Wiley.
Open access helps disseminate research more broadly. Because it is not behind a paywall, the work can be read by anyone, anywhere. But the model incentivises commercial publishers to run more papers. Some launch new journals to attract more studies. Others solicit papers for vast numbers of Special Issues.
For one Swiss publisher, MDPI, special issues of journals are a major income stream. A single MDPI journal, the International Journal of Molecular Sciences, is inviting submissions to more than 3,000 special issues. The publication fee, or article processing charge (APC), for one article is £2,600. Since last year, the Swiss National Science Foundation has refused to pay publication fees for special issues amid concerns over quality. MDPI did not respond to an interview request.
Unhelpful incentives around academic publishing are blamed for record levels of retractions, the rise in predatory journals, which publish anything for a fee, and the emergence of AI-written studies and paper mills, which sell fake papers to unscrupulous researchers to submit to journals. All contaminate the scientific literature and risk damaging trust in science. Earlier this month, Taylor & Francis paused submissions to its journal Bioengineered while editors investigated 1,000 papers that bear signs of being manipulated or coming from paper mills.
While fraud and fakery are important problems, Hanson is more concerned about the glut of research papers that do little to progress scientific knowledge. 'The far greater danger by volume and by total numbers is the stuff that's genuine but uninteresting and uninformative,' he says.
'It's now possible to publish a peer-reviewed article in a journal that has practically nothing new to contribute. These papers are a major drain on the system in terms of the money used to publish and pay for them, the time that's spent writing them and the time that's spent reviewing them.'
Prof Andre Geim, a Nobel laureate at the University of Manchester, said: 'I do believe that researchers publish too many useless papers and, more importantly, we aren't flexible enough to abandon declining subjects where little new can be learned. Unfortunately, after reaching a critical mass, research communities become self-perpetuating due to the emotional and financial interests of those involved.'
Hanson believes the problem is not open access and APCs per se, but for-profit publishers that seek to publish as many papers as possible. He believes the strain on academic publishing could be substantially alleviated if funding agencies stipulated that the work they support must be published in non-profit journals.
Hannah Hope, the open research lead at the Wellcome Trust, said that, in general, research that is good enough to fund should be published, and that greater investment in science, particularly beyond North America and Europe, has contributed to the rise in scientific papers. But she agrees peer review might be used more selectively. 'I'm sure peer review does lead to improvement in research. Is it always worth the time that goes into it? I think it's something that we should be questioning as a field, and whether peer review happens in the current format on everything,' she says.
Ritu Dhand, the chief scientific officer at publisher Springer Nature, rejected the narrative of 'greedy journal publishers' making money by publishing poor-quality papers and pointed to the fact that the research landscape has gone through a 'radical transformation', quadrupling in size over the past 25 years. Long dominated by western countries, research is now far more global, and led by China rather than the US.
'Is the solution not to allow the rest of the world to publish?' she says. 'We live in a digital world. Surely, it doesn't matter how many papers are being published.' She sees solutions in better filtering, search tools and alerts so researchers can find the work that really matters to them, and a global expansion of peer reviewers to absorb the demand.
While technology poses fresh challenges for academic publishers, Ramakrishnan agrees that it may be the answer to some of the problems. 'Eventually these papers will all be written by an AI agent and then another AI agent will actually read them, analyse them and produce a summary for humans. I actually think that's what's going to happen.'

Related Articles

How Quantum Computers Are Solving the World's Biggest Problems

Geeky Gadgets, 7 hours ago

What if the most complex problems plaguing industries today, from curing diseases and optimizing global supply chains to securing digital communication, could be solved in a fraction of the time it takes now? Quantum computing, once the stuff of science fiction, is no longer a distant dream. With breakthroughs like Google's 105-qubit 'Willow' processor and Microsoft's topological qubits, the race toward fault-tolerant quantum systems is heating up. These advances are not merely incremental; they promise to redefine the limits of computation and disrupt industries across the globe. The question is no longer if quantum computing will change the world, but how soon, and how profoundly, it will happen. ExplainingComputers explores the most pivotal developments in quantum computing as of 2025, from hardware innovations to the emergence of post-quantum cryptography. You'll discover how companies like IBM and PsiQuantum are tackling challenges such as quantum error correction and scalability, and why these breakthroughs matter for everything from drug discovery to financial modeling. But this isn't just about technology; it's about the societal shifts and opportunities that quantum computing will unlock. As we stand on the brink of a quantum revolution, the implications are as exciting as they are daunting. What will this new era of computation mean for you, your industry, and the world at large?

Understanding Quantum Computing

Quantum computing operates on the principles of quantum mechanics, using qubits as its fundamental units of information. Unlike classical bits, which exist in a binary state of 0 or 1, qubits can exist in multiple states simultaneously through the phenomena of superposition and entanglement. This capability allows quantum computers to process vast amounts of data in parallel, offering computational power far beyond that of classical systems. However, qubits are inherently fragile and susceptible to environmental interference, leading to errors during computation. To address this challenge, researchers employ quantum error correction codes, which combine multiple physical qubits to create a single logical qubit. Logical qubits are a critical step toward building fault-tolerant quantum systems, allowing reliable and scalable quantum computation, and they are paving the way for practical applications.

Breakthroughs in 2024-2025

The past two years have been pivotal for quantum computing, with leading technology companies achieving significant milestones:

Google: introduced its 'Willow' quantum processor, featuring 105 superconducting transmon qubits, and achieved a major breakthrough in quantum error correction by demonstrating performance below the surface code threshold, a critical step toward scalable quantum systems. Google also showcased computational superiority through random circuit sampling (RCS).

Microsoft: launched its 'Majorana 1' processor, using topological qubits for enhanced stability and scalability. The company also partnered with Atom Computing to explore neutral atom-based quantum hardware and joined DARPA's US2QC program to advance utility-scale quantum computing.

PsiQuantum: unveiled its 'Omega' photonic quantum chipset, designed for scalability and efficiency, and developed an innovative cooling system for photonic qubits, resembling data center server racks, to address thermal challenges.

IBM: released a comprehensive roadmap for its fault-tolerant quantum computer, 'Quantum Starling', which aims to feature 200 logical qubits by 2029, and introduced advanced error correction techniques, such as bivariate bicycle codes and new decoders, to enhance reliability and scalability.

Securing the Future with Post-Quantum Cryptography

The rise of quantum computing presents a significant challenge to traditional cryptographic systems. Quantum computers have the potential to break widely used encryption algorithms, posing a threat to data security across industries. In response, the National Institute of Standards and Technology (NIST) released a 2024 report outlining the transition to post-quantum cryptographic standards by 2035. Post-quantum cryptography focuses on developing encryption methods that are resistant to quantum attacks, a proactive approach that is essential for protecting critical infrastructure, financial systems and personal data as quantum computing becomes more prevalent. Organisations are encouraged to begin adopting these standards to future-proof their security systems.

Applications Transforming Industries

Quantum computing is set to transform a wide range of industries, offering solutions to complex problems that were previously unsolvable. Some of the most promising applications include:

Molecular modeling: simulating molecular interactions with unprecedented precision, accelerating advances in drug discovery and materials science.

Logistics optimization: quantum algorithms can optimize supply chains and transportation networks, reducing costs and improving efficiency.

Financial modeling: quantum systems enable the analysis of complex financial data, providing more accurate risk assessments and portfolio optimization.

AI integration: quantum computing can enhance machine learning algorithms, leading to faster and more accurate artificial intelligence.

Materials science: quantum simulations can uncover new materials with unique properties, driving innovation in the energy and manufacturing sectors.

Additionally, the emergence of Quantum Computing as a Service (QCaaS) is providing widespread access to the technology. By offering quantum capabilities through cloud-based platforms, QCaaS allows businesses to use quantum computing without costly hardware investments, accelerating adoption across industries.

The Road Ahead for Quantum Computing

The quantum computing market is growing rapidly, with annual revenues projected to reach $5 billion by 2030. While fault-tolerant quantum systems are still under development, they are expected to become commercially viable by the early 2030s, unlocking breakthroughs in areas such as healthcare, finance and energy. As the field progresses, collaboration between academia, industry and government will play a crucial role in overcoming technical challenges and driving innovation. The next decade will be instrumental in shaping the future of quantum computing as researchers and engineers work toward building scalable, reliable and accessible quantum systems. The developments of 2024-2025 mark a significant step forward, setting the stage for a quantum revolution that will redefine the boundaries of computation.
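To make the error-correction idea above concrete, here is a minimal, purely illustrative Python sketch. It is not any vendor's actual code: it models the simplest redundancy scheme, a distance-d repetition code under independent errors, rather than the surface codes the article mentions, but it shows why operating 'below threshold' matters.

```python
# Minimal, purely illustrative sketch of the "threshold" idea behind quantum
# error correction. It models the simplest redundancy scheme: a distance-d
# repetition code that fails when more than half of its d physical units flip,
# each independently with probability p. Real machines use surface codes, but
# the qualitative lesson is the same: below the threshold, using more physical
# qubits per logical qubit suppresses the logical error rate.
from math import comb

def logical_error_rate(p: float, d: int) -> float:
    """Probability that a majority of d independent units fail at rate p."""
    return sum(comb(d, k) * p**k * (1 - p)**(d - k)
               for k in range(d // 2 + 1, d + 1))

for p in (0.001, 0.05, 0.6):  # 0.6 is above this toy code's 50% threshold
    rates = ", ".join(f"d={d}: {logical_error_rate(p, d):.2e}" for d in (3, 5, 7))
    print(f"physical error rate {p}: {rates}")

# Below threshold the logical rate falls sharply as d grows; above it, adding
# redundancy makes things worse, which is why demonstrating "below-threshold"
# operation is treated as a milestone.
```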

Common medicines may not work for some people based on their DNA, experts find

The Sun, 2 days ago

A PILOT scheme has revealed a widespread genetic sensitivity to common medicines which could increase side effects or stop them working as they should. The trial saw 2,200 adults undergo whole genome sequencing to analyse how their individual DNA responds to the likes of antibiotics and over-the-counter painkillers.

A staggering 99 per cent showed a genetic variant that affects their sensitivity to certain medicines. This could mean some drugs, including over-the-counter, everyday painkillers, antibiotics and other prescription medications, won't work for some people based on their individual DNA.

The blood test, part of Bupa's My Genomic Health scheme, also looked at their genetic risk of developing 36 preventable diseases including cancers, heart conditions and type 2 diabetes. It found 91 per cent of participants were at risk of developing a disease with genetic and lifestyle risk factors, such as fatty liver disease, breast cancer and certain heart diseases. Meanwhile, 73 per cent had multiple genetic variants that put them at raised risk of developing a condition that could be prevented or detected early, leading to better health outcomes, including the likes of high cholesterol, skin cancer and type 2 diabetes. And 49 per cent were found to be carriers of a genetic variant that could lead to raised risk of certain conditions in future generations.

Following the successful pilot, the Medication Check can now be purchased through Bupa, and will also be available to more than three million of its customers as part of its workplace health scheme. A saliva test will establish what medications are most likely to be effective, those with increased risk of adverse side effects, or ones that won't work for them at all.

Dr Rebecca Rohrer, clinical innovation and genomics director for Bupa, said: 'We've long known that most medications only work for 30-50 per cent of the population.

'However, this pilot has highlighted just how significantly individual genomes impact the effectiveness of medications in treating conditions.

'With more than half of us regularly taking a prescription medication and an increasing number affected by a chronic condition, it's crucial that people are prescribed the right medicine from the start, tailored to their unique genetic makeup.

'In the longer term, genomics is key to early detection and even preventing some illnesses altogether.'

After completing the at-home medication check, patients will be offered a GP consultation with the healthcare provider to review any medication identified in their genetic tests.

It comes as Bupa is about to introduce two new products to its My Genomic Health suite later this year that will help to prevent or detect illness earlier. The DNA Health Check will give people early warning of an increased genetic risk of four different conditions - breast cancer, prostate cancer, type 2 diabetes and cardiovascular disease. The Advanced DNA Health Check will combine insights from medication, disease risk, carrier status and traits, and will look at the genetic risk of developing conditions such as heart disease, metabolic disease and 10 types of cancer.

Carlos Jaureguizar, CEO for Bupa Global, India & UK, said: 'Whole genomic sequencing is fundamentally changing our approach to healthcare, pivoting from treatment to prevention.

'It has the power to become a health passport that people can reference throughout their lives.
'We firmly believe genomics is the path to health innovation and prevention, reducing the nation's health burden and giving people personalised knowledge of their own genomic profile to live well for longer.'

The renewable energy revolution is a feat of technology

The Guardian, 2 days ago

I know progressives are supposed to be technophobes, but there is one technology we probably love more than anyone else (except the engineers who created it): renewable energy. It is nothing less than astonishing and unbelievable that we have achieved so much progress in so little time.

At the turn of the century, sun and wind in the form of solar panels and wind turbines were expensive, primitive, utterly inadequate solutions to power our machines at scale, which is why early climate activism focused a lot on minimizing consumption on the assumption we had no real alternative to burning fossil fuels, but maybe we could burn less. This era did all too well in convincing people that if we did what the climate needs of us, we would be entering an era of austerity and renunciation, and it helped power the fossil fuel industry's weaponization of climate footprints to make people think personal virtue in whittling down our consumption was the key thing.

There's nothing wrong with being modest in your consumption, but the key thing to saving the planet is whittling down the fossil fuel industry and use of fossil fuels to almost nothing by making the energy transition to renewables and an electrified world. And that's a transformation that has to be collective and not just individual. Other stuff is great – changing our diets, especially to reduce beef consumption and food waste, protecting natural systems that sequester carbon, better urban design and better public transit, getting rid of fast fashion, excessive use of plastic, and other wasteful climate-harming forms of consumption – all matter.

But the majority of climate change comes from burning fossil fuels, and we know exactly how to transition away from that and the transition is underway – not nearly fast enough, not nearly supported enough by most governments around the world, actively undermined by the Trump administration and many fossil fuel corporations and states. But still, it is underway. And, arguably, unstoppable. Because it's just a better way to do everything.

One thing that's been striking in recent years, and maybe visible in recent years because there is now an alternative, is the admission that fossil fuel is a wasteful and poisonous way to produce energy. That's the case whether it's to move a vehicle or cook a dinner. Oil, coal, and gas are distributed unevenly around the world and just moving the fuel to the sites where it will be used is hugely energy inefficient. About 40% of global shipping is just moving fossil fuel around, and more fuel is moved on trains and trucks. But also, fossil fuel is extracted, shipped, and refined for one purpose: to be burned, and in the future coming fast, burning is going to look like a primitive way to operate machines.

As the Rocky Mountain Institute explains it: 'Today, most energy is wasted along the way. Out of the 606 EJ [exajoules] of primary energy that entered the global energy system in 2019, some 33% (196 EJ) was lost on the supply side due to energy production and transportation losses before it ever reached a consumer. Another 30% (183 EJ) was lost on the demand side turning final energy into useful energy. That means that of the 606 EJ we put into our energy system per annum, only 227 EJ ended up providing useful energy, like heating a home or moving a truck. That is only 37% efficient overall.'

That's the old system, and it's dirty, toxic to human health and the environment – and our politics – as well as the main driver of climate chaos. And wasteful.
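For readers who want to check the arithmetic, here is a small sketch that reproduces the efficiency figures in the Rocky Mountain Institute quote, using only the numbers it gives:

```python
# Reproduces the Rocky Mountain Institute arithmetic quoted above
# (2019 figures, in exajoules). Only the numbers given in the quote are used.
primary = 606         # EJ of primary energy entering the global system
supply_losses = 196   # EJ lost producing and transporting energy (the quote rounds this to "some 33%")
demand_losses = 183   # EJ lost turning final energy into useful energy

useful = primary - supply_losses - demand_losses   # 227 EJ
print(f"useful energy: {useful} EJ")
print(f"supply-side losses: {supply_losses / primary:.0%}")
print(f"demand-side losses: {demand_losses / primary:.0%}")
print(f"overall efficiency: {useful / primary:.0%}")   # ~37%
```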
The new system, on the other hand, is far cleaner, and the fact that sun and wind are so widely available means that the corrosive politics of producer nations and their manipulations of dependent consumer nations could become a thing of the past.

I know someone is about to pipe up with an objection about battery materials and there are two answers to that. One is that the race is on, with promising results, to produce batteries with more commonly available and widely distributed materials. The other is that batteries are not like fossil fuel, which you incessantly burn up and have to replace; they are largely recyclable, and once the necessary material is gathered, it can be reused and extraction can wind down. But also the scale of materials needed for renewables is dwarfed by the materials needed to keep the fires burning in the fossil fuel economy (and the people who complain about extraction sometimes seem to forget about the monumental scale of fossil fuel extraction and all the forms of damage it generates, from Alberta to Nigeria to the Amazon).

And renewables are now adequate to meet almost all our needs, as experts like Australia's Saul Griffith and California's Mark Z Jacobson have mapped out. Simply because it's cheaper, better and ultimately more reliable, the transition is inevitable – but if we do it fast, we stabilize the climate and limit the destruction, and if we don't, we don't.

Almost no one has summed up how huge the shifts are since the year 2000, but the Rocky Mountain Institute has done that for the last decade, during which, they tell us: 'clean-tech costs have fallen by up to 80%, while investment is up nearly tenfold and solar generation has risen twelvefold. Electricity has become the largest source of useful energy, and the deep force of efficiency has reduced energy demand by a fifth.' Estimates for the future price of solar have almost always been overestimates; estimates for the implementation of solar have been underestimates.

Another hangover from early in the millennium is the idea that renewables are expensive. They were. They're not anymore. There are costs involved in building new systems, of course, but solar power is now the cheapest way to produce electricity in most of the world, and there's no sign that the plummet in costs is stopping. As Hannah Ritchie at Our World in Data said in 2021 of renewable energy: 'In 2009, it was more than three times as expensive as coal. Now the script has flipped, and a new solar plant is almost three times cheaper than a new coal one. The price of electricity from solar declined by 89% between 2009 and 2019.'

But even cheap is a misnomer: wind and sun are free and inexhaustible; you just need devices to collect the energy and transform it into electricity (and transmission lines to distribute it). Free energy! We need to get people to recognize that is what's on offer, along with energy independence – the real version, whereby if we do it right, we could build cooperatives, local (and hyperlocal or just autonomous individual) energy systems, thereby undermining predatory for-profit utility companies as well as the fossil fuel industry. Renewable energy could be energy justice and energy democracy, as well as clean energy.

An energy revolution is underway in this century, though it's unfolded in ways slow enough and technical enough for most people not to notice (and I assume it's nowhere near finished).
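The two comparisons in the Ritchie quote are roughly consistent with each other if, purely for illustration, the benchmark coal cost is assumed to have stayed flat over the decade:

```python
# Quick consistency check of the solar-cost figures quoted above, assuming
# (for illustration only) that the benchmark coal cost stayed flat over the decade.
coal_cost = 1.0                        # normalised cost of new coal power
solar_2009 = 3.0 * coal_cost           # "more than three times as expensive as coal"
solar_2019 = solar_2009 * (1 - 0.89)   # the quoted 89% decline, 2009-2019

print(f"solar in 2019: {solar_2019:.2f} x coal")                            # ~0.33
print(f"coal is roughly {coal_cost / solar_2019:.1f}x the cost of solar")   # ~3x, matching the quote
```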
It is astonishing – a powerful solution to the climate crisis and the depredations of the fossil fuel industry and for-profit utilities. Making it more visible would make more people more enthused about it as a solution, a promise, a possibility we can, should, must pursue swiftly and wholeheartedly.

Rebecca Solnit is a Guardian US columnist. She is the author of No Straight Road Takes You There and Orwell's Roses.
