Hi Dan, I see that China is also trying to tackle predatory journals and fake articles!

China updates naughty list of journals

China has updated its Early Warning Journal List — a list of journals that are deemed to be untrustworthy, predatory or not serving the Chinese research community’s interests. The latest edition includes 24 journals and, for the first time, takes note of misconduct called citation manipulation, in which authors try to inflate their citation counts. Scholarly literature researcher Yang Liying heads up the team that produces the influential list and spoke to Nature about how it’s done.

Here is an interesting alert from Nature Briefing — on the same theme as your recent blog post!

Co-authors point the way to paper mills

A new approach looks at authors, rather than the content of papers, to help identify journal articles that originate from ‘paper mills’ — factories for fake research. It looks for unusual patterns of co-authorship and peculiar networks of researchers, which could be a sign that authorship was paid for, rather than earned. The approach could be crucial as artificial intelligence (AI) systems make it all too easy to churn out convincing fake manuscripts. “This is the kind of signal that is much more difficult to work around, or outcompete, by clever use of generative AI,” says Hylke Koers of the International Association of Scientific, Technical, and Medical Publishers.

Thanks for sharing! In our Advanced Writing in Biology course, which I coordinate, we always devote the early part of the course to discussing scientific misconduct and publishing ethics.

Interestingly, the concept of predatory journals didn't resonate well with students, even though we tried to emphasize that, when they moved on to writing their literature reviews, they needed to rely on indexed databases (such as PubMed) to find credible papers from credible sources. Too often, they just Google whatever and end up with unreliable material.

This year, our focus was on generative artificial intelligence (AI), which only makes the problem worse. If ChatGPT can't find what you need, it simply hallucinates and makes it up. These are troubling times for scientists who wish to remain honest!


This entry was posted in Center for Environmental Genetics.