Latest news with #scientificmisconduct
Yahoo — Science — 6 days ago
Study sheds light on how reams of fake scientific papers are getting into literature
Fraudulent scientific research is now being produced and published on a large scale, with some unethical researchers colluding with unethical editors to attain the prestige that comes with publication, according to a new study in the Proceedings of the National Academy of Sciences. Large groups of editors and authors appear to have cooperated in what the study called "the tide of fraudulent science."

The researchers who conducted the study gathered about 47,000 retracted articles, collected reports of the same image being used in multiple publications, and compiled 33,000 papers of suspicious origin. Taking advantage of the fact that editors' names are public at some science publishers, they examined whether certain editors handled disproportionate numbers of problematic scientific papers, ones that were later retracted or flagged negatively by other scientists.

At the journal PLOS One, they were able to link 30.2% of the retracted articles to 45 editors. Of these editors, 25 had their own papers retracted. The 45 editors represented just 0.25% of the journal's total number of editors. PLOS One did not respond to a request for comment.

Researchers also found clusters of articles accepted in less than a month, often involving the same editors and authors. "They found cases where people submitted papers and those papers got accepted extremely fast, and when you looked at the editors, they were just sending them to each other," said Luís Amaral, a systems biologist at Northwestern University and senior author of the study.

"There are people who believe that there is widespread fraud," said Reese Richardson, a postdoctoral researcher in the Amaral Lab at Northwestern and lead author of the study. "What this paper does is give a method and a starting point and the data to show that this is actually happening, and that the current mechanisms are not equipped to stop it."
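The editor-level analysis described above can be illustrated with a toy sketch. The data, editor names, minimum-volume cutoff, and threshold below are all made up for illustration; the study's actual pipeline is far more involved:

```python
from collections import Counter

# Hypothetical records of (handling_editor, was_later_retracted) per paper.
papers = [
    ("ed_a", True), ("ed_a", True), ("ed_a", True), ("ed_a", False),
    ("ed_b", False), ("ed_b", False), ("ed_b", False), ("ed_b", False), ("ed_b", False),
    ("ed_c", True), ("ed_c", False), ("ed_c", False), ("ed_c", False), ("ed_c", False),
]

handled = Counter(editor for editor, _ in papers)
retracted = Counter(editor for editor, flag in papers if flag)

overall_rate = sum(retracted.values()) / len(papers)  # journal-wide retraction rate

# Flag editors who handled several papers and whose retraction rate is more
# than double the journal-wide rate (illustrative cutoffs, not the study's).
flagged = {
    editor: retracted[editor] / handled[editor]
    for editor in handled
    if handled[editor] >= 3 and retracted[editor] / handled[editor] > 2 * overall_rate
}
print(flagged)  # {'ed_a': 0.75}
```

In this toy data the journal-wide retraction rate is about 29%, so only the editor whose handled papers are retracted 75% of the time stands out.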
The study's findings confirm the suspicions of many researchers, including Elisabeth Bik, a microbiologist and independent scientific integrity consultant who has spent years identifying fraudulent research. In one case, she found 125 papers that reused parts of the same image. "It was the same photo, but different crops of the same image," she said. "They didn't generate the photos themselves. They got the photos from a third party — a broker, a paper mill." Researchers use the term "paper mill" to describe organizations that sell mass-produced, low-quality, and fabricated research articles.

Many of these fraudulent papers, Bik added, seem to come from doctors or researchers in countries where promotions are tied to publication metrics. They see it as an investment, she explained: a couple of thousand dollars buys a paper and a fast track up the promotional ladder. This institutional pressure is especially common in India and China, where promotions, medical licensing, or graduation are linked by policy to publication counts, several experts said.

In a survey of medical residents in China three years ago, 47% admitted to buying and selling papers, letting other people write papers for them, or writing papers for others. When the study authors analyzed an archive of articles from a business offering services to "research professionals who are desperate" for publication, they found that 26% of the authors were from India.

Although the "publish or perish" culture is also common in the U.S., it manifests more in expectations around prestige, funding, and tenure than in fixed quotas. India and China are the world's most populous nations, and both are scientific powerhouses; the paper notes that science fraud can happen anywhere.

The accumulation of fake literature has turned some scientific fields — RNA biology, for example — into what Richardson called an academic "minefield," making it difficult for researchers to identify which studies are reliable.
Some fraudulent studies have even made it into meta-analyses that shape the way doctors treat patients, and the researchers found evidence that this field of research has been targeted by bad actors. Experts say growing awareness of fraud could feed broader skepticism of science, especially if institutional action doesn't keep up. "The more polluted the record becomes, the harder it is to clean up, and the harder it is to rebuild trust inside and outside the scientific community," said Stephanie Kinnan, a longtime member of the Committee on Publication Ethics (COPE).

The scientific community has tools to fight back: funders can fine or exclude researchers and universities, journals can retract articles, and aggregators can sideline problematic journals. But the study's authors found that the amount of "research" from suspected paper mills has been doubling roughly every 1.5 years, and these countermeasures are not keeping up.

For Amaral, and many other scientists, the implications are deeply personal. "I dreamed of being a scientist since I was 12," he said. "Seeing the thing that I've dreamt of being a part of, that I cherish, being potentially destroyed is really enraging."

All research is built on previous research, Amaral explained, and that foundation collapses without trust. "This is the great fear — that the entire scientific enterprise that gave us vaccines, that gave us medicine for cancer, that gave us X-ray machines, computer scanning devices — would just disappear," he said.

This story originally appeared in the Los Angeles Times.
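The doubling figure quoted above implies exponential growth. A quick sketch of what "doubling roughly every 1.5 years" means in practice (the baseline volume below is illustrative, not a number from the study):

```python
# If suspected paper-mill output doubles every 1.5 years, the volume after
# t years is v0 * 2 ** (t / 1.5).
def projected_volume(v0: float, years: float, doubling_time: float = 1.5) -> float:
    return v0 * 2 ** (years / doubling_time)

# Four doublings in six years: a 16-fold increase.
print(projected_volume(1000, 6))  # 16000.0
```

At that pace, even a modest stream of fabricated papers becomes a flood within a decade unless retraction and screening capacity grows at a comparable rate.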


Malay Mail — Science — 08-05-2025
From ‘publish or perish' to ‘be visible or vanish': What's next? — Mohammad Tariqur Rahman
MAY 8 — Amid the dictum 'publish or perish', a new vibe has emerged in academia: 'be visible or vanish'. The new dictum was introduced in the 2023 book 'Engage, Influence and Ensure Your Research Has Impact' by Inger Mewburn and Simon Clews.

The survival of academics in their profession depends largely on the number of papers they publish. A growing stack of papers adds credit to their reputation, but the number of papers alone does not confer prestige: papers need to be published in journals with high impact factors.

Arguably, the race to increase the number of papers has resulted in various forms of scientific misconduct, including, but not limited to, unethical authorship practices (e.g., guest and honorary authorship), the emergence of paper mills, and the publication of unauthenticated or manipulated results. This trend has been condemned, yet no practical measures have been taken either to control or to reduce it. Rather, the increasing number of retracted papers every year attests to the ongoing 'pandemic' of scientific misconduct. Will the new dictum 'be visible or vanish' then add to the pandemic?

Visibility in academia is generally measured by the number of citations an academic's papers receive. Indeed, citation counts tend to increase with the number of publications, although some researchers accumulate more citations than others with fewer papers, and researching a popular topic increases the chance of higher citations. Self-citation, i.e., when authors cite their own papers, can be monitored by most bibliometric databases, such as Scopus or Web of Science. The practice becomes unacceptable when authors cite their own papers that are neither relevant nor important to the new work.
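Self-citation of the kind monitored by bibliometric databases can be estimated directly from citation records: a citation is a self-citation when the citing and cited papers share at least one author. A minimal sketch with made-up names and data:

```python
# Toy citation records: (author set of citing paper, author set of cited paper).
# All names are hypothetical, purely for illustration.
citations = [
    ({"alice", "bob"}, {"alice"}),   # shared author -> self-citation
    ({"alice", "bob"}, {"carol"}),
    ({"dave"}, {"dave", "erin"}),    # shared author -> self-citation
    ({"dave"}, {"frank"}),
]

# A non-empty set intersection means the citing paper cites its own author(s).
self_cites = sum(1 for citing, cited in citations if citing & cited)
rate = self_cites / len(citations)
print(f"self-citation rate: {rate:.0%}")  # self-citation rate: 50%
```

Real analyses must also disambiguate author names and distinguish legitimate self-citation (building on one's own prior work) from citation padding, which is where the judgment lies.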
Using Scopus records, a PLOS One paper published in December 2023 identified Colombia, Egypt, Indonesia, Iran, Italy, Malaysia, Pakistan, Romania, Russia, Saudi Arabia, Thailand, and Ukraine as being among the countries with the most anomalous self-citation (i.e., by academics from those countries) in the world.

Citing existing literature is an academic norm that reflects the relevance of new research findings, i.e., it portrays their rationality, validity, and importance. Furthermore, the number of citations indicates the impact (and popularity) of a published paper. Yet while the number of citations gives a paper a visa for visibility among the global audience, it does not necessarily represent the paper's importance. For example, one of the most cited papers in the history of academia (more than 305,000 citations as of 2014) describes how to quantify proteins in a solution. Even one of the most groundbreaking publications in the life sciences, the DNA sequencing method (more than 65,000 citations as of 2014) that won a Nobel prize and led to the complete sequencing of the human genome, did not come close to the citation count of the protein quantification paper.

Needless to say, a large number of research publications remain behind the curtain, never cited at all. Former Harvard president Derek Bok, in his book 'Higher Education in America' (published in 2015), noted that a majority of articles published in the arts and humanities (98 per cent) and the social sciences (75 per cent) are never cited by another researcher. The current trend is not expected to be very different.

That raises an imperative question: does a low (or zero) citation count make research less (or not at all) useful?
Consider a researcher who is interested in (or finds it important to study) a very rare disease affecting less than 0.1 per cent of the global population. Compared to cancer research, research on such a rare disease will attract very few citations. Likewise, a publication addressing a national issue is less likely to be highly cited than one addressing a global issue. These two examples suffice to show that citation counts can fail to reflect the importance of research publications; indeed, it would be wrong to use citations as the measure of such publications' impact.

Turning back the clock, one finds that the dictum 'publish or perish' was introduced in 1942 in Logan Wilson's book 'The Academic Man: A Study in the Sociology of a Profession', according to Eugene Garfield, founder of the Institute for Scientific Information (ISI). The journal Impact Factor (IF) was then introduced by Garfield in 1975 as part of the Journal Citation Reports. Eventually, academics were motivated (read: forced) not only to publish more and more papers but also to place them in higher-ranking journals, as measured by higher IF. In time, having a large number of papers in 'high'-ranking journals became a requirement in academia for appointment, promotion, and even grant approval.

Now, in less than 100 years, academia is experiencing a new survival dictum: be visible or vanish. Amid the logical criticism, academic policy makers will continue to impose the new dictum for appointment, promotion, and even grant approval. I wonder if the 'inventors' of new knowledge, i.e., academics at universities, know what comes next?
Prof Mohammad is the Deputy Executive Director (Development, Research & Innovation) at the International Institute of Public Policy and Management (INPUMA), Universiti Malaya, and can be reached at [email protected] • This is the personal opinion of the writer and does not necessarily represent the views of Malay Mail.