The following is a comment on an article that appeared in the Thursday, April 13th issue of The Independent concerning the UK Research Assessment Exercise (RAE) and metrics (followed by a response to another piece in The Independent about web metrics).

Re: Hodges, L. (2006) The RAE is dead - long live metrics. The Independent, April 13, 2006
Absolutely no one can justify (on the basis of anything but superstition) holding onto an expensive, time-wasting research assessment system such as the RAE, which produces rankings that are almost perfectly correlated with, hence almost exactly predictable from, inexpensive objective metrics such as prior funding, citations and research-student counts.
Hence the only two points worth discussing are (1) which metrics to use and (2) how to adapt the choice of metrics and their relative weights for each discipline.
The web has opened up a vast and rich universe of potential metrics that can be tested for their validity and predictive power: citations, downloads, co-citations, immediacy, growth rate, longevity, interdisciplinarity, user tags/commentaries and much, much more. These are all measures of research uptake, usage, impact, progress and influence. They have to be tested and weighted according to the unique profile of each discipline (or even subdiscipline). The prior-funding metric alone is already highly predictive, but it also generates a Matthew Effect: a self-fulfilling, self-perpetuating prophecy. So multiple, weighted metrics are needed for balanced evaluation and prediction.
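By way of illustration only, here is a minimal sketch of how the validity and relative weight of a set of candidate metrics could be tested against a criterion for a given discipline. The metric columns, the department data and the choice of ordinary least-squares regression are all assumptions of mine for the example, not a description of any actual RAE procedure:

    # Illustrative sketch only: fit per-discipline weights for candidate
    # metrics against a criterion score (here, a hypothetical prior RAE
    # rating) using ordinary least-squares regression.
    import numpy as np

    def fit_metric_weights(metrics, criterion):
        """metrics: (departments x metrics) array; criterion: (departments,) array.
        Returns least-squares weights (intercept first) and the R^2 of the fit."""
        X = np.column_stack([np.ones(len(criterion)), metrics])
        weights, *_ = np.linalg.lstsq(X, criterion, rcond=None)
        predicted = X @ weights
        ss_res = float(np.sum((criterion - predicted) ** 2))
        ss_tot = float(np.sum((criterion - criterion.mean()) ** 2))
        return weights, 1.0 - ss_res / ss_tot

    # Hypothetical data for one discipline: columns are prior funding (M GBP),
    # citations and research-student counts; each row is a department.
    metrics = np.array([
        [5.1, 320, 42],
        [3.2, 210, 30],
        [1.4,  90, 12],
        [4.4, 280, 38],
        [0.9,  60,  9],
        [2.6, 150, 21],
        [3.8, 240, 33],
        [1.9, 120, 15],
    ])
    rae_rating = np.array([6.1, 4.8, 2.9, 5.6, 2.1, 3.7, 5.2, 3.1])

    weights, r2 = fit_metric_weights(metrics, rae_rating)
    print("weights (intercept first):", np.round(weights, 3))
    print("variance explained (R^2):", round(r2, 3))

In practice one would use far more departments, cross-validate the fit, and repeat the exercise separately for each discipline or subdiscipline, since the best-weighted combination of metrics will differ from field to field.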
I would not for a moment believe, however, that any (research) discipline lacks predictive metrics of research performance altogether. Even less credible is the superstitious notion that the only (or the best) way to evaluate research is for RAE panels to re-do, needlessly and locally, the peer review that has already been done, once, by the journals in which the research has already been published.
The urgent feeling that some form of human re-review is somehow crucial for fairness and accuracy has nothing to do with the RAE or metrics in particular; it is just a generic human superstition (and irrationality) about population statistics versus one's own unique, singular case...
Re: Diary, The Independent, 13 April 2006 (another article in the same issue)
"A new international university ranking has been launched and the UK has 25 universities in the world's top 300. The results are based on the popularity of the content of their websites on other university campuses. The G Factor is the measure of how many links exist to each university's website from the sites of 299 other research-based universities, as measured by 90,000 google searches. No British university makes it into the Top 10; Cambridge sits glumly just outside at no 11. Oxford languishes at n.20. In a shock Southampton University is at no.25 and third in Britain. Can anyone explain this? Answers on a postcard. The rest of the UK Top 10, is UCL, Kings, Imperial, Sheffield, Edinburgh, Bristol and Birmingham."
There are four reasons for the University of Southampton's extremely high overall webometric rating:
(1) U. Southampton's university-wide research performance
(2) U. Southampton's Electronics and Computer Science (ECS) Department's involvement in many high-profile web projects and activities (among them the semantic web work of the web's inventor, ECS Prof. Tim Berners-Lee, the Advanced Knowledge Technologies (AKT) work of Prof. Nigel Shadbolt, and the pioneering web science contributions of Prof. Wendy Hall)
(3) The fact that since 2001 U. Southampton's ECS has had a mandate requiring that all of its research output be made Open Access on the web by depositing it in the ECS EPrints Repository, and that Southampton has a university-wide self-archiving policy (soon to become a mandate) too
(4) The fact that maximising access to research (by self-archiving it free for all on the web) maximises research usage and impact (and hence web impact)
This all makes for an extremely strong Southampton web presence, as reflected in such metrics as the "G Factor", which places Southampton 3rd in the UK and 25th among the world's top 300 universities, and Webometrics, which places Southampton 6th in the UK, 9th in Europe, and 80th among the 3000 universities it indexes.
Of course, these are extremely crude metrics, but Southampton itself is developing more powerful and diverse metrics for all universities in preparation for the newly announced metrics-only Research Assessment Exercise.
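For readers unfamiliar with what a link-count metric such as the "G Factor" actually measures, the following small sketch (with invented university sites and link data of my own) shows the basic principle described in the Diary piece quoted above: each university is scored and ranked by how many of the other universities' sites link to it.

    # Hypothetical sketch of a G-Factor-style ranking: score each university
    # by how many of the *other* universities' sites link to it. The sites
    # and link data below are invented for illustration.
    links = {  # site -> set of sites it links to
        "soton.ac.uk": {"cam.ac.uk", "ox.ac.uk", "mit.edu"},
        "cam.ac.uk": {"ox.ac.uk", "soton.ac.uk", "harvard.edu"},
        "ox.ac.uk": {"cam.ac.uk", "harvard.edu"},
        "mit.edu": {"harvard.edu", "soton.ac.uk", "cam.ac.uk"},
        "harvard.edu": {"mit.edu", "cam.ac.uk", "ox.ac.uk", "soton.ac.uk"},
    }

    def link_count_ranking(links):
        """Rank sites by the number of other sites that link to them."""
        inlinks = {site: 0 for site in links}
        for source, targets in links.items():
            for target in targets:
                if target in inlinks and target != source:
                    inlinks[target] += 1
        return sorted(inlinks.items(), key=lambda item: item[1], reverse=True)

    for rank, (site, count) in enumerate(link_count_ranking(links), start=1):
        print(f"{rank}. {site}: linked to by {count} of the other sites")

The real G Factor does this for the websites of 300 research-based universities (via Google link searches), and Webometrics combines several such visibility indicators; the sketch shows only the underlying inlink-count idea.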
Some references:
Harnad, S. (2001) Why I think that research access, impact and assessment are linked. Times Higher Education Supplement 1487: p. 16.

Hitchcock, S., Brody, T., Gutteridge, C., Carr, L., Hall, W., Harnad, S., Bergmark, D. and Lagoze, C. (2002) Open Citation Linking: The Way Forward. D-Lib Magazine 8(10).

Harnad, S. (2003) Why I believe that all UK research output should be online. Times Higher Education Supplement, June 6, 2003.

Harnad, S., Carr, L., Brody, T. and Oppenheim, C. (2003) Mandated online RAE CVs Linked to University Eprint Archives: Improving the UK Research Assessment Exercise whilst making it cheaper and easier. Ariadne 35.

Berners-Lee, T., De Roure, D., Harnad, S. and Shadbolt, N. (2005) Journal publishing and author self-archiving: Peaceful Co-Existence and Fruitful Collaboration.
Brody, T., Harnad, S. and Carr, L. (2006) Earlier Web Usage Statistics as Predictors of Later Citation Impact. Journal of the American Society for Information Science and Technology (JASIST).
Shadbolt, N., Brody, T., Carr, L. and Harnad, S. (2006) The Open Research Web: A Preview of the Optimal and the Inevitable. In: Jacobs, N. (Ed.) Open Access: Key Strategic, Technical and Economic Aspects. Chandos.

Citebase impact ranking engine and usage/citation correlator/predictor
Beans and Bean Counters
Bibliography of Findings on the Open Access Impact Advantage
Stevan Harnad
American Scientist Open Access Forum