Meaningless index

Not that long ago I learned about a paper published by the lab of Björn Brembs on what a big fraud the impact factor in scientific publishing is when it comes to actually measuring how good science is. Not that it was news to many of us, but the paper comes with a wealth of new data to support this view, which he comments on in his blog.

In short, what this team did was analyze variables such as scientific quality, applicability and the likelihood of retraction (you know, when an article has to be withdrawn from a journal because of experimental flaws or scientific misconduct such as faking data). Now, off to see HOW useful the impact factor is…

For newbies, the impact factor is, in theory, a measure of the impact of the discoveries/advances published in a certain journal. The way to calculate it is to count the times the articles that journal published over a certain time period, usually the previous two years, have been cited in other articles, and divide by the number of articles it published in that period. Cool journals like Science or Nature have really high impact factors, and everyone would kill to publish there because that's the way to build a CV.
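For the curious, the standard two-year version of the calculation looks roughly like this (the year is just an example):

$$ \mathrm{IF}_{2024} = \frac{\text{citations received in 2024 by articles published in 2022 and 2023}}{\text{number of citable articles published in 2022 and 2023}} $$

So, with made-up numbers purely for illustration: a journal that published 250 citable articles across 2022 and 2023, and whose articles picked up 7,500 citations during 2024, would sport an impact factor of 7500 / 250 = 30.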

Now back to this article. What is the impact factor really useful for? The data show that it correlates only weakly with the most common indicators of utility/quality. Also, in statistical terms it is only slightly better than chance at estimating the novelty or importance of the research, and worse still at evaluating the quality of a scientific paper. This basically means that if you're looking for an article of good scientific quality, you're better off out of the flashy pages of the glam magazines and into a specialized journal, especially if we consider the higher retraction rate of those glossy journals.

Wait a bit longer! There's more… there's actually one thing the impact factor is good for: predicting, precisely, where it is more probable that an article will get retracted. Among other reasons, because trying to publish in one of those high-ranking journals is a high-pressure job, which sometimes pushes people into dishonest practices, but also because there's more attention around to catch the cheater. That was the case of Hwang Woo-suk, the Korean scientist who claimed to have derived the first stem cell lines from cloned human embryos and turned out (among other things, like asking his PhD students to donate their own eggs…) to have faked the results he published in Science. Or, more recently and on the same topic, the new controversy over possible errors and image duplications in Mitalipov's Cell paper. The impact factor is also useful to confirm what we all think we know: flashy science is better, isn't it? What I mean by this is that it serves as self-confirmation of our bias towards the quality of the journals we send the best of our data to, because they wouldn't have such an impact factor if they weren't publishing good science, right? Unless it is to the benefit of those selling/publishing them…

Brembs comments, at the end of the blog post where he presents his article, that he published it in a neuroscience journal because the big ones didn't consider his data novel enough or of wide enough interest to justify publishing. The big fish don't want to run a story on the fake foundations of the "publish or perish" roulette… what a surprise!

Well, no. Not really. For those who have been working in this business for a while, it's a well-known fact that what gets into these journals is sometimes far from the best science. So beware of indices: neither Standard & Poor's nor the impact factor are real indicators. Quality in science should be measured in a different way. By reproducibility, for instance. By tech transfer. By leaps in knowledge. And so on, a long etcetera.

Addendum: there is an initiative to put an end to this kind of scientific evaluation system, DORA (the San Francisco Declaration on Research Assessment), and to improve the ways we rate science. I encourage you to take a look, whether you're a scientist or not; after all, it's also your money that makes this research possible. And maybe sign it?
