We have written extensively on this site about the drawbacks of the Journal Impact Factor and about the DORA initiative (see: The Journal Impact Factor and the Lazy Scientist, Nobel Prize Winners Address Brutal Cuts in Scientific Funding, A Journal Impact Factor Scandal).
Here is one graduate student’s reaction to the pressures of publishing in one of the “big three.”
Several weeks ago in the blog post “Is the Institution of “Grant-Funded” Research Killing Scientific Integrity?” I wrote about the Scientific American article suggesting that scientific retraction can change public opinion. In that post, I suggested that perhaps the pressures to publish or perish are leading some of our colleagues down the unethical road of publishing false results.
This past week, a researcher at the University of Edinburgh published findings suggesting that scientific research is suffering due to pressure on scientists to produce only “publishable” (i.e., positive) results. Certainly, negative findings are critical to advancing scientific knowledge, but who wants to become engaged in an unsexy project? Worse yet, if the data aren’t publishable, then you are definitely at risk of losing your grant funding.
Check out the press release from the Public Library of Science and let me know your thoughts. What can we do (or do we need to do anything) to overcome this bias?
A press release from PLoS ONE
April 22, 2010
The quality of scientific research may be suffering because academics are being increasingly pressured to produce ‘publishable’ results, a new study suggests. A large analysis of papers in all disciplines shows that researchers report more “positive” results for their experiments in US states where academics publish more frequently. The results are reported in the online, open-access journal PLoS ONE on April 21st, by Daniele Fanelli, of the University of Edinburgh.
The condition of today’s scientists is commonly described by the expression “publish or perish”. Their careers are increasingly evaluated based on the sheer number of papers listed in their CVs, and by the number of citations received – a measure of scientific quality that is hotly debated. To secure jobs and funding, therefore, researchers must publish continuously. The problem is that papers are likely to be accepted by journals and to be cited depending on the results they report.
“Scientists face an increasing conflict of interest, torn between the need to be accurate and objective and the need to keep their careers alive,” says Fanelli. “While many studies have shown the deleterious effects of financial conflicts of interest in biomedical research, no one has looked at this much broader conflict, which might affect all fields.”
Dr Fanelli analysed over 1300 papers, spanning all disciplines from physics to sociology, that declared to have tested a hypothesis and whose principal author was based in a U.S. state. Using data from the National Science Foundation, he then verified whether the papers’ conclusions were linked to the states’ productivity, measured by the number of papers published on average by each academic.
Findings show that papers whose authors were based in more “productive” states were more likely to support the tested hypothesis, independent of discipline and funding availability. This suggests that scientists working in more competitive and productive environments are more likely to make their results look “positive”. It remains to be established whether they do this by simply writing the papers differently or by tweaking and selecting their data.
“The outcome of an experiment depends on many factors, but the productivity of the US state of the researcher should not, in theory, be one of them,” explains Fanelli. “We cannot exclude that researchers in the more productive states are smarter and better equipped, and thus more successful, but this is unlikely to fully explain the marked trend observed in this study.”
Positive results were less than half the total in Nevada, North Dakota and Mississippi. At the other extreme, states including Michigan, Ohio, District of Columbia and Nebraska had between 95% and 100% positive results, a rate that seems unrealistic even for the most outstanding institutions.
These conclusions could apply to all scientifically advanced countries. “Academic competition for funding and positions is increasing everywhere,” says Fanelli. “Policies that rely too much on cold measures of productivity might be lowering the quality of science itself.”
Public Library of Science