Commentary on:
Collini, S. (2009) Impact on humanities: Researchers must take a stand now or be judged and rewarded as salesmen. Times Literary Supplement, November 13, 2009.
One can agree whole-heartedly with Professor Collini that much of the spirit and the letter of the RAE and the REF (the UK's Research Assessment Exercise and its successor, the Research Excellence Framework) and their acronymous successors is wrong-headed and wasteful -- while still holding that measures ("metrics") of scholarly/scientific impact are not without some potential redeeming value, even in the Humanities. After all, even expert peer judgment, if expressed rather than merely silently mentalized, is measurable. (Bradley's observation on the ineluctability of metaphysics applies just as aptly to metrics: "Show me someone who wishes to refute metaphysics and I'll show you a metaphysician with a rival system.")
The key is to gather as rich, diverse and comprehensive a spectrum of candidate metrics as possible, and then to test and validate them jointly, discipline by discipline, against the criteria that each discipline already knows and trusts (such as expert peer judgment). The aim is to derive initial weights for those metrics that prove well enough correlated with the discipline's trusted existing criteria to be usable for prediction on their own.
Prediction of what? Prediction of future "success", by whatever criteria of success and value a discipline (or university, or funder) might adopt. There is room for putting much greater weight on the kinds of writings that fellow-specialists within the discipline find useful, as Professor Collini has rightly singled out, rather than on, say, success in promoting those writings to the general public. The general public may well derive more benefit indirectly, from the impact of specialised work on specialists, than from its direct impact on themselves. And of course industrial applications are an impact metric only for some disciplines, not others.
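To make the proposal concrete: in its simplest form, the joint validation could amount to regressing the candidate metrics onto the trusted peer rankings, with the regression coefficients serving as the initial metric weights. The sketch below is purely illustrative -- the data are synthetic, and the number of metrics and the least-squares method are assumptions made for the example, not a prescription; the real exercise would be run separately for each discipline, on its actual candidate metrics and its actual peer rankings.

```python
# Illustrative sketch (hypothetical data): derive per-discipline weights for
# candidate impact metrics by regressing them against trusted peer rankings,
# then check how well the weighted combination reproduces those rankings.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs for one discipline:
#   metrics[i, j] = value of candidate metric j for research unit i
#   peer[i]       = that unit's score from expert peer judgment
n_units, n_metrics = 50, 4
metrics = rng.normal(size=(n_units, n_metrics))
peer = metrics @ np.array([0.6, 0.3, 0.1, 0.0]) + rng.normal(scale=0.5, size=n_units)

# Ordinary least squares: find the metric weights that best reproduce the
# peer rankings (an intercept column absorbs the overall level).
X = np.column_stack([np.ones(n_units), metrics])
weights, *_ = np.linalg.lstsq(X, peer, rcond=None)

# Validation: correlation between the weighted metric combination and the
# trusted peer criterion. Only metrics (or combinations) that correlate well
# enough would then be used for prediction on their own.
predicted = X @ weights
r = np.corrcoef(predicted, peer)[0, 1]
print("derived weights (intercept first):", np.round(weights, 2))
print("correlation with peer judgment:   ", round(r, 2))
```

This is essentially the multiple-regression validation of metrics against peer rankings discussed in Harnad (2008), listed below; in practice the weights would be revisited and recalibrated as each discipline's criteria evolve.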
Ceterum censeo: A book-citation impact metric is long overdue, and would be especially useful for the Humanities.
Harnad, S. (2001) Research access, impact and assessment. Times Higher Education Supplement 1487: p. 16.

Harnad, S., Carr, L., Brody, T. & Oppenheim, C. (2003) Mandated online RAE CVs Linked to University Eprint Archives: Improving the UK Research Assessment Exercise whilst making it cheaper and easier. Ariadne 35.

Brody, T., Carr, L., Harnad, S. & Swan, A. (2007) Time to Convert to Metrics. Research Fortnight, pp. 17-18.

Harnad, S. (2008) Open Access Book-Impact and "Demotic" Metrics. Open Access Archivangelism, October 10, 2008.

Harnad, S. (2008) Validating Research Performance Metrics Against Peer Rankings. Ethics in Science and Environmental Politics 8(11). doi:10.3354/esep00088. Special issue on "The Use and Misuse of Bibliometric Indices in Evaluating Scholarly Performance".

Harnad, S. (2009) Open Access Scientometrics and the UK Research Assessment Exercise. Scientometrics 79(1).