AI Influence Grows: 13% of 2024 Biomedical Abstracts Show Signs of AI Use, Study Finds
A recent study estimates that around 13 percent of biomedical research abstracts published in 2024 may have been generated or substantially shaped by large language models (LLMs), signalling a growing role for artificial intelligence in academic writing.
The study, conducted by researchers at the University of Tübingen in Germany, analyzed over 15 million biomedical papers published between 2010 and 2024. The findings show that abstracts published in 2024 contained a marked rise in stylistic elements and vocabulary patterns commonly associated with AI-generated text.
Large language models, such as those developed by OpenAI and other AI companies, are trained on vast datasets of human language and can generate human-like responses to prompts. Their increasing accessibility has sparked speculation and debate about the growing use of such tools in scientific and academic writing.
The Tübingen researchers did not claim that all these abstracts were definitively written by AI but highlighted a “drastic shift in vocabulary,” especially the prevalence of “style” words that are statistically more likely to appear in AI-written content. This linguistic fingerprint, they argue, points toward an increasing reliance on LLMs, whether for drafting, editing, or refining scientific abstracts.
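The frequency-based approach described above can be illustrated with a minimal sketch. The marker words below are hypothetical stand-ins; the researchers derived their word list empirically from corpus-wide frequency shifts rather than from a fixed dictionary, so this is an assumption for demonstration only.

```python
from collections import Counter

# Hypothetical marker words often cited as overrepresented in LLM prose.
# The actual study derived its list empirically, not from a fixed set.
MARKER_WORDS = {"delve", "pivotal", "showcase", "underscore", "intricate"}

def marker_rate(abstracts):
    """Fraction of abstracts containing at least one marker word."""
    hits = sum(
        1 for text in abstracts
        if MARKER_WORDS & set(text.lower().split())
    )
    return hits / len(abstracts)

def excess_rate(recent_abstracts, baseline_abstracts):
    """Excess marker usage in a recent corpus relative to a pre-LLM
    baseline -- a rough proxy for the 'linguistic fingerprint' idea."""
    return marker_rate(recent_abstracts) - marker_rate(baseline_abstracts)

# Toy usage: compare a pre-LLM baseline against a recent sample.
baseline = ["we measured protein levels", "results were significant"]
recent = ["we delve into pivotal mechanisms", "results were significant"]
print(excess_rate(recent, baseline))  # prints 0.5
```

Comparing against a pre-LLM baseline, rather than flagging individual texts, is what lets this kind of analysis estimate population-level influence without claiming any single abstract was AI-written.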
“While LLMs can be powerful tools for improving clarity and coherence, their use raises questions about authorship, originality, and transparency in academic publishing,” the researchers noted in their paper.
The trend has prompted calls for clearer disclosure norms and ethical guidelines around AI-assisted writing in scientific research. Many journals are now revising their policies to ensure transparency when AI tools are used during manuscript preparation.
As AI continues to permeate academic fields, experts warn that while it may democratize writing and improve productivity, it also brings challenges related to accountability, trust, and the integrity of scientific communication.