A new report urges caution against over-reliance on citation statistics such as impact factors. The report, entitled Citation Statistics, on the use of citations in assessing research quality (PDF document), was commissioned by the International Mathematical Union (IMU). The work is based on practices reported by mathematicians and other scientists from around the world.
The report highlights several common misuses of citation statistics in evaluating research. It cautions that citations provide only a limited and incomplete view of research quality. Research is too important, the authors argue, to measure its value with only a single, coarse tool.
The report argues that the meaning of citations is not well understood. A citation's meaning can be very far from 'impact', making the supposed objectivity of citation counts illusory. In addition, while having a single number to judge quality is indeed simple, it can lead to a shallow understanding of something as complicated as research. Numbers are not inherently superior to sound judgement, the report observes.