Friday, 27 June 2008

Caution in using Citation Statistics

Citation Statistics (Adler et al. 2008) is a timely reminder, in view of the new-style Research Excellence Framework [REF], successor to the Research Assessment Exercise [RAE], of the limitations of using citations as a basis for assessing research quality.

The report - which has an excellent Executive Summary - points out that statistics can be misapplied, misunderstood and misused. In addition, statistics are not inherently objective - a point perhaps more widely accepted in the Social Sciences - and can in fact be as subjective as the process of peer review. Citation analysis can only be an indicator of research impact; on its own it tells us nothing about the research itself. In the words of the report, it gives only a shallow picture.

The authors emphasise the crude nature of journal Impact Factors as a measure of research quality. They could have added that in some discipline areas most research is published in journals without Impact Factors. In addition, there is confusion in the minds of some between Impact Factors - having an article published in a journal with an IF - and the number of citations received by an individual article. Warnings are also given about the nature of the h-index and other such indices.
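
As a rough illustration of why single-number indices give such a shallow picture, the h-index of a publication list is simply the largest h such that h of the author's papers have at least h citations each. The short Python sketch below (the function name and the example citation counts are purely illustrative, not taken from the report) shows how two very different citation records can collapse to the same value.

def h_index(citations):
    # 'citations' is a list of citation counts, one per paper,
    # e.g. [12, 7, 7, 3, 1, 0]. Illustrative example only.
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        # The h-index is the largest rank at which the paper in that
        # position still has at least 'rank' citations.
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Two very different citation records can share the same h-index:
print(h_index([9, 9, 9, 0, 0]))      # 3
print(h_index([100, 50, 3, 2, 2]))   # 3

The second record includes one heavily cited paper yet scores no higher than the first, which is exactly the kind of information loss the report warns about.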


Links

Research Information, 2008. Report cautions against the over-reliance on citation statistics. Research Information, 20 June 2008. Available from: http://www.researchinformation.info/news/news_story.php?news_id=305 [Accessed: 27 June 2008].

Adler, R., Ewing, J. and Taylor, P., 2008. Citation Statistics: A report from the International Mathematical Union (IMU) in cooperation with the International Council of Industrial and Applied Mathematics (ICIAM) and the Institute of Mathematical Statistics (IMS). Available from: http://www.mathunion.org/fileadmin/IMU/Report/CitationStatistics.pdf [Accessed: 27 June 2008].

1 comment:

Anonymous said...

Note that there is a critique of this report here