Originally published September 21, 2013
Scientists accused of engaging in 'citation stacking' to boost journal profiles, much like 'black hat' SEO
by Ethan A. Huff, staff writer
(NaturalNews) Four Brazilian scientific journals and ten others from around the world have been barred from the influential Thomson Reuters Impact Factor rating system for one year after it was discovered that a group of editors had published a series of papers designed to artificially boost their journals' rankings. According to Nature, at least one editor has lost his job as a result of this "citation stacking" tactic, in which journal editors team up to publish articles containing hundreds of references to papers published in one another's journals.
The purpose of citation stacking is to boost a journal's overall ranking on the impact factor scale, which purports to assess the quality of scientific research by the number of citations it receives. Many say the scale is flawed and riddled with errors, but it remains the accepted standard by which many organizations and academic institutions measure the quality of published research. And it is exactly what landed former Clinics editor Mauricio Rocha e Silva in the mess he currently faces.
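For readers unfamiliar with the metric, the standard two-year impact factor is a simple ratio: citations received in a given year to items the journal published in the previous two years, divided by the number of citable items it published in that window. A minimal sketch (the journal and numbers here are illustrative, not taken from the story):

```python
def impact_factor(citations_to_prior_two_years: int,
                  citable_items_prior_two_years: int) -> float:
    """Simplified two-year journal impact factor.

    citations_to_prior_two_years: citations received this year to papers
        the journal published in the previous two years.
    citable_items_prior_two_years: papers the journal published in those
        two years.
    """
    if citable_items_prior_two_years == 0:
        raise ValueError("journal published no citable items in the window")
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical example: 400 citations in 2013 to papers from 2011-2012,
# against 200 citable items published in 2011-2012.
print(impact_factor(400, 200))  # 2.0
```

Because the numerator counts raw citations regardless of where they come from, a coordinated burst of citations from partner journals moves the ratio just as effectively as genuine scholarly attention, which is what makes the metric gameable.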
Frustrated by his country's obsession with the impact factor system, which overlooks a good deal of credible research by solid scientists who might otherwise publish in lesser-known Brazilian journals, Rocha e Silva came up with his own plan. Since Brazil's government-sponsored journal system does nothing to promote emerging Brazilian journals, and the agency that evaluates graduate programs is fixated on the biased impact factor paradigm, Rocha e Silva felt compelled to think outside the box for the benefit of native scientists.
According to reports, he teamed up with the editors of at least three other Brazilian journals to publish a series of papers containing citations to other studies published in the pool of journals. Each journal published studies containing citations to all the other journals in the pool except its own in order to avoid self-citation. The purpose of this, of course, was to increase the overall number of citations for each journal, and consequently boost its rankings in the Thomson Reuters system.
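The arrangement described above has a simple structure: every journal in the pool cites every other pool journal, but never itself. A hypothetical sketch of that citation plan (the journal names other than Clinics are placeholders, not the actual participants):

```python
def stacking_plan(journals):
    """Map each pool journal to the list of pool journals it agrees to cite.

    Each journal cites all the others but omits itself, avoiding the
    obvious self-citation that rating systems already screen for.
    """
    return {j: [other for other in journals if other != j] for j in journals}

# Placeholder pool: Clinics plus three unnamed partner journals.
pool = ["Clinics", "Journal B", "Journal C", "Journal D"]
plan = stacking_plan(pool)
print(plan["Clinics"])  # ['Journal B', 'Journal C', 'Journal D']
```

Note that each journal still gains citations from the other three, so every member's citation count rises even though no journal's self-citation rate changes.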
But the plan backfired when a new Thomson Reuters algorithm for detecting citation stacking identified the anomaly. Citation stacking closely resembles the common practice of so-called "black hat" search engine optimization (SEO), in which webmasters pad their websites with excess keywords and game search-engine rules to inflate rankings. Thomson Reuters had anticipated that the same tactic could surface in journals, which is why it developed a dedicated detection algorithm.
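Thomson Reuters has not published the details of its algorithm, but a plausible heuristic in the same spirit is to flag any journal whose incoming citations are unusually concentrated in a single source journal. A hypothetical sketch, with made-up journal names and an arbitrary threshold:

```python
from collections import Counter

def flag_stacking(incoming_citations, threshold=0.3):
    """Flag suspiciously concentrated citation sources.

    incoming_citations: one source-journal name per incoming citation.
    Returns the sources that alone account for more than `threshold`
    of all incoming citations.
    """
    counts = Counter(incoming_citations)
    total = sum(counts.values())
    return [src for src, n in counts.items() if n / total > threshold]

# Hypothetical data: 50 of 60 citations come from a single journal "J1".
cites = ["J1"] * 50 + ["J2"] * 5 + ["J3"] * 5
print(flag_stacking(cites))  # ['J1']
```

A real detector would also weigh publication-year clustering and reciprocity between journal pairs, but concentration alone already separates organic citation patterns from a small coordinated pool.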
Journal impact factor system biased and unfair, incapable of properly assessing quality of scientific research, say many

In truth, though, citation stacking is not necessarily wrong, particularly because the impact factor system itself unfairly prevents seemingly "inferior" journals from ever improving and gaining recognition. Moreover, if the citations themselves are valid, no actual fraud is occurring. In this particular case, where honest Brazilian scientists were simply trying to help their underdog journals, which the system was inequitably suppressing, the only real infringement is that citation stacking bucks the interests of the larger status quo.
"Scientists and editors were not blamed for publishing bad science but for helping to undermine a stupid and misused science-rating system," writes one commenter on the Nature story. "The primitiveness of this metrics-based, pseudo-scientific attempt to quantify quality is an insult to every creative scientist."
"If the corruptness of the rating mechanism established by a system reflects its own deformity, as we recently learned from the financial markets and their rating agencies, every scientist should be alarmed. All legal efforts to undermine the assessment of scientific work by metrics should therefore be welcomed," this commenter adds.
You can read the full Nature report on this citation stacking incident here:
The following post on Reciprocal Space also provides further insight into the flawed impact factor system:
Sources for this article include: