Posts Tagged ‘scientific publications’

Co-Founders of PLOS Discuss Open Access

 :: Posted by American Biotechnologist on 01-14-2014

A Journal Impact Factor Scandal

 :: Posted by American Biotechnologist on 08-29-2013

In a wildly popular post published several months ago, we took a controversial stance on the ugly side of the journal impact factor. In that article we argued that the journal impact factor (JIF) is a useless tool for measuring the productivity of scientists and that it is used unfairly to grant merit increases based solely on the JIF of the journals in which a scientist publishes. The article generated dozens of comments, with many readers avidly agreeing with our opinion and enthusiastically sharing stories of colleagues who were cheated out of deserved promotions because of a dearth of publications in high-JIF journals.

In a rather ironic twist of fate, the journal Nature (probably one of the highest-JIF journals around) broke a story on how several Brazilian scientific journals have been suspended from Thomson Reuters' JIF rankings for inappropriately manipulating citations to falsely inflate their JIF ratings. The accused Brazilian journal editors encouraged scientists to cite articles from other Brazilian journals in their publications in order to help increase the cited journals' JIF rankings. In an even more egregious move, the editors formed a Brazilian cartel and agreed to stack their publications with citations to one another's journals, falsely inflating the Brazilian journals' JIF.

The editors defended their actions by claiming that many Brazilian scientists are hesitant to publish in local journals because of a governmental policy of preferentially funding scientists who publish in high-JIF journals. This has created a catch-22 for Brazilian journals looking to improve their JIF scores: local scientists are unwilling to publish in local journals (which, in the long term, would help increase those journals' JIF), making it even more difficult for Brazilian journals to improve their rankings.

So, as you can see, it is not only academic institutions that are using the JIF the wrong way; governments are as well. The moral of the story: get rid of the JIF and find a better way to evaluate scientific contributions!

The Ugly Side of the Journal Impact Factor

 :: Posted by American Biotechnologist on 05-16-2013

Are you obsessed with publishing in high ranking journals such as Cell or Science? Do you gloss over your works that have been published in low ranked journals when talking with colleagues or attending a job interview? If the answer is yes, you are not alone.

Since its invention approximately 60 years ago, the Journal Impact Factor (JIF) has been used to assess the quality of academic literature and the influence of scientific papers on the scientific community. The JIF was proposed by Eugene Garfield in the early 1950s and originally published by his Institute for Scientific Information (ISI) as a subscription-buying tool for academic and medical librarians. The JIF assigns a score to a scientific journal based on the average number of citations received in a year per paper published in that journal during the two preceding years. It has since become the authority on which journals are considered top-tier publications and, therefore, the premier venues for scientists wishing to best publicize their work and gain recognition.
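To make that two-year window concrete, here is a minimal sketch of the calculation in Python; the citation and article counts below are hypothetical numbers invented purely for illustration.

```python
def journal_impact_factor(citations, citable_items):
    """JIF for year Y: citations received during Y to papers the journal
    published in years Y-1 and Y-2, divided by the number of citable
    items the journal published in those same two years."""
    return citations / citable_items

# Hypothetical example: a journal's 2011-2012 papers drew 1,200 citations
# during 2013, and the journal published 400 citable items over 2011-2012,
# giving a 2013 JIF of 3.0.
print(journal_impact_factor(1200, 400))  # 3.0
```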

Unfortunately, the JIF has also become a tool for ascertaining a scientist's worth and can often be a determining factor in the level of funding he or she receives. This is an unfortunate turn of events, since the JIF has many deficiencies, such as glossing over differences between fields and lumping primary research articles in with much more easily cited review articles. As a result, researchers who publish quality work in lower-ranked journals are often at a disadvantage compared to those publishing secondary research in higher-ranked journals.

In order to “protest” and counter this phenomenon, a group of editors and publishers from both high-impact and low-impact journals has issued the Declaration on Research Assessment (DORA), which aims to reduce the influence of the JIF on the assessment of scientific merit.

DORA has released 18 recommendations geared toward accomplishing these goals. Some of the recommendations that stand out the most include:

  • The JIF should not be used to measure the quality of individual articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions
  • Funding agencies should place more weight on the scientific content of a paper than on its JIF
  • Hiring committees should consider the scientific content of a paper more important than the JIF of the journal in which it appeared
  • A call for organizations to be open and transparent by providing data and methods used to calculate all metrics
  • Researchers should challenge research assessment practices that rely inappropriately on JIF

To download the full list of recommendations, visit http://am.ascb.org/dora/files/SFDeclarationFINAL.pdf.

Becoming a PubMed Expert

 :: Posted by American Biotechnologist on 06-14-2012

PubMed is probably the most important database used by scientists across the globe. Most researchers will routinely run a PubMed search for relevant literature on a monthly, if not weekly, basis. But how many of us really know how to use PubMed to its fullest extent? How much more efficient would our literature searches be if we just knew how to make the most of PubMed? Luckily, NCBI has put together a YouTube channel with detailed explanations of how to use many of its advanced products. Here is a video demonstrating how to use the PubMed Advanced Search Builder. Be sure to check out the NCBI YouTube page for more great videos.
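For readers who prefer scripting their searches, here is a minimal sketch using NCBI's E-utilities service, the programmatic counterpart to the PubMed web interface; the query term is purely illustrative, and the field tags are the same syntax the Advanced Search Builder assembles for you.

```python
import requests

# NCBI E-utilities: the esearch endpoint returns the PubMed IDs (PMIDs)
# matching a query written in ordinary PubMed search syntax.
ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

params = {
    "db": "pubmed",
    # Illustrative query using PubMed field tags ([Title], [PDAT], etc.).
    "term": "crispr[Title] AND 2012[PDAT]",
    "retmax": 20,
    "retmode": "json",
}

response = requests.get(ESEARCH, params=params, timeout=30)
response.raise_for_status()
result = response.json()["esearchresult"]

print("Total matches:", result["count"])
print("First PMIDs:", result["idlist"])
```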

Making the most of negative results

 :: Posted by American Biotechnologist on 03-02-2011

How often have we lamented that obtaining negative results has stymied productivity and gotten in the way of career progression? After all, no reputable journal will accept negative results, and without publications we may as well throw our science careers out the window.

The problem is that negative results ARE results nonetheless. While they may not be as sexy as positive data, negative data are a consequence of hard work and should be considered just as important as positive data.

One reason for this bias can likely be traced to the early stages of the scientific method. All scientific experiments start out with background research that leads to a hypothesis. The overwhelming majority of scientists hypothesize that treatment A will result in consequence B happening to subject C. Very few scientists will hypothesize that treatment A will not result in any changes to subject C. Who wants to run THAT kind of experiment?

What we often fail to remember is that a hypothesis is really just an educated guess that may honestly turn out to be either correct or incorrect. Unfortunately, only hypotheses that are proven correct avoid the cutting room floor of the high-impact journals, which is akin to rewarding good guessing over hard work.

In an effort to recognize the important contributions of hard-working scientists whose experiments have concluded with negative results, an online database called Figshare has been created as a global repository for all the unpublished negative data coming out of hard-working science labs.

The idea is to have scientists publish ALL their negative data in the database. The database is open access and therefore any information stored there can be used freely by other scientists as long as it is properly attributed.

At the very least, such a system helps prevent other scientists from wasting untold amounts of money repeating the same experiment only to eventually come up with similar negative results. What’s more, should your data eventually be used by another researcher, you will receive a citation and perhaps even an opportunity to collaborate.
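For the programmatically inclined, here is a hedged sketch of how such a repository can be searched from a script. It assumes figshare's public v2 REST API; the endpoint, payload format, and search term are assumptions based on figshare's current public documentation rather than anything described in this post.

```python
import requests

# Assumption: figshare's public v2 API exposes POST /articles/search,
# which runs a keyword search over publicly available items.
SEARCH_URL = "https://api.figshare.com/v2/articles/search"

payload = {"search_for": "negative results"}  # illustrative query

response = requests.post(SEARCH_URL, json=payload, timeout=30)
response.raise_for_status()

for item in response.json()[:5]:
    # Public figshare items carry a DOI, which is what makes proper
    # attribution of reused data possible.
    print(item["title"], "-", item["doi"])
```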

As is written on the figshare website:

Unless we as scientists publish all of our data, we will never achieve access to the sum of all scientific knowledge.

Although Figshare cannot replace the thrill of publishing in a top-tier scientific journal, it should help take away the sting of negative results and foster an appreciation for all scientific data, both positive and negative.