Refute it thus

Citation indices are poor and distorting proxies for research quality - they should get the boot, argues Thomas Docherty

May 19, 2011

Walking in the churchyard in Harwich in 1763, Boswell and Johnson discussed the "ingenious sophistry" of Berkeley's ostensibly mad "proof" of the non-existence of matter. "I observed", writes Boswell, "that though we are satisfied his doctrine is not true, it is impossible to refute it." Famously, Johnson kicked a large stone with mighty force, saying, "I refute it thus."

Kicking that stone - call it "impact" - was Johnson's assertion of material realities against a philosophy that reduces knowledge to capricious opinion. Swift had a similar target in his 1726 satire on intellectual fantasists from the Academy of Lagado in Gulliver's Travels. There "projectors" devise mad ideas that work in theory but fail entirely to touch upon material reality, practice or fact.

The damaging idea of "impact", though discredited, is nonetheless on our agenda; paradoxically, pursuing this impact will lose us the reality of Johnson's stone. One measure of research impact is the citation index, an ingenious sophistry increasingly touted as a useful means of "proving" research impact and, by extension, research value. He who is cited a thousand times has a high impact factor, guaranteeing further funding and favour; he who is cited but once may find his career in jeopardy.

Citation indices have a detrimental impact on our research cultures, turning the realities of substantive knowledge into the merest illusion. Eugene Garfield, an early architect of the "science" of citation measurement, is clear about its dangers, arguing that proper evaluation of research depends simply on reading articles and making meaningful intellectual judgements. We are in danger of allowing "impact factors" to become a proxy for value; and in doing so we become like the academicians rightly attacked by Swift for losing touch with real-world problems.


The impact factor of journals is sometimes measured to four decimal places. We stand like Boswell, intimidated by the sophistry that generates the difference between 4.3211 and 4.3879. Such numerical precision has the daunting power of scientific objectivity, but it is illusory. Important questions - how and why you cite - go ignored. You may cite Berkeley many times, but only in order roundly to dismiss him. You may be a novice researcher: citing establishes scholarly credentials. Alternatively, you may be working in an extremely specialised field, whose tiny constituency guarantees minimal citation. You may be hugely influential without being cited at all: a teacher. To be cited indicates nothing more than that one has been cited; and to count citations and then use the figure as an indicator of value is to reduce research quality to a popularity contest.

Worse, this ideology drives us towards a conformist research orthodoxy in which "valued" research becomes increasingly narrowly concentrated on work whose high citation count "validates" it; and then "good" research becomes that which, in citing the already cited, conforms to its now established, accepted ideas. What then of research that calls established orthodoxy into question or ignores it totally? Citation indices validate merely the comforting illusions of orthodoxy over the sometimes painful difficulties of engaging with knowledge. The process is also inimical to the organic evolution of disciplines, where new research generates new, not-yet-established journals whose lack of established status means that they simply do not exist for the indexers.


Citation indices are based on the "data" given primarily by counting. Data alone are nothing, but they can rise to the status of "information" when they are used to construct a meaningful sentence or research claim ("I refute Berkeley thus"); and "information" becomes knowledge if, and only if, it is discussed and debated. The indices encourage us to substitute data for knowledge. This is yet more damaging when the data are false - when the count is wrong and fails to capture all the citations.

This is why the academic community argues against the damaging falsifications of the impact agenda. Research leaders should abandon the ideology that validates accepted opinion instead of engaged knowledge. We should all abandon entirely citation indices as a proxy for evaluation. And you can quote me on that.
