Future UK Research Assessment Exercise (RAE) to be Metrics-Based
As predicted, and long urged, the UK's wasteful, time-consuming Research Assessment Exercise (RAE) is to be replaced by metrics:
"Research exercise to be scrapped"
Donald MacLeod, Guardian Wednesday March 22, 2006
Science and innovation investment framework 2004-2014: next steps, supporting UK Budget 2006:
"The Government is strongly committed to the dual support system, and to rewarding research excellence, but recognises some of the burdens imposed by the existing Research Assessment Exercise (RAE). The Government's firm presumption is that after the 2008 RAE the system for assessing research quality and allocating 'quality-related' (QR) funding will be mainly metrics-based... The Government will launch a consultation on its preferred option for a metrics-based system for assessing research quality and allocating QR funding, publishing results in time for the 2006 Pre-Budget Report."
"Over recent years a number of studies have considered options for a radically different allocation system for QR in order to avoid or reduce the need for a peer review process. The focus in most cases has been on identifying one or more metrics that could be used to assess research quality and allocate funding, for example research income, citations, publications, research student numbers etc. The Government has considered the evidence to date and favours identifying a simpler system that may not precisely replicate the level of detailed analysis of the RAE but would enable an appropriate distribution of QR funding at the institutional level."
"[M]etrics collected as part of the next assessment will be used to undertake an exercise shadowing the 2008 RAE itself, to provide a benchmark on the information value of the metrics as compared to the outcomes of the full peer review process. The aim of any changes following this exercise will be to reduce the administrative burden of peer review, wherever possible, consistent with the overriding aim of assessing excellence"
The RAE outcome is most closely correlated (r = 0.98) with the metric of prior RCUK research funding (Figure 4.1; this is no doubt in part a "Matthew Effect"), but research citation impact is another metric highly correlated with the RAE outcome, even though it is not explicitly counted. Now it can be explicitly counted (along with other powerful new performance metrics) and all the rest of the ritualistic time-wasting can be abandoned, without further ceremony.
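For concreteness, here is a minimal sketch (in Python, with invented institution-level figures, not the actual Figure 4.1 data) of the kind of calculation behind such a correlation: Pearson's r between prior research-council income and the RAE/QR outcome.

```python
# Hypothetical illustration: correlating prior research-council income with
# the RAE/QR outcome at the institutional level. All figures are invented.
import numpy as np
from scipy import stats

# Prior RCUK research income per institution (millions of pounds, invented)
rcuk_income = np.array([120.0, 85.0, 60.0, 42.0, 30.0, 18.0, 9.0, 4.0])
# Corresponding RAE quality-related (QR) funding outcome (invented)
qr_outcome = np.array([115.0, 90.0, 55.0, 45.0, 27.0, 20.0, 8.0, 5.0])

r, p = stats.pearsonr(rcuk_income, qr_outcome)
print(f"Pearson correlation between prior funding and RAE outcome: r = {r:.2f}")
```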
This represents a great boost for institutional self-archiving in Open Access Institutional Repositories, not only because that is the obvious, optimal means of submission to the new metric RAE, but because it is also a powerful means of maximising research impact, i.e., maximising those metrics. (I hope Research Councils UK (RCUK) is listening!)
Harnad, S. (2001) Why I think that research access, impact and assessment are linked. Times Higher Education Supplement 1487: p. 16.
Harnad, S. (2003) Why I believe that all UK research output should be online. Times Higher Education Supplement. Friday, June 6 2003.
Harnad, S., Carr, L., Brody, T. & Oppenheim, C. (2003) Mandated online RAE CVs Linked to University Eprint Archives: Improving the UK Research Assessment Exercise whilst making it cheaper and easier. Ariadne 35.
Beans and Bean Counters (2005)
On Thu, 23 Mar 2006, Adrian Smith (Leeds) wrote:
"See also Mark Maslin [UCL] in Nature:
Research skewed by stress on highest-impact journals
And this new metric RAE policy will help "unskew" it, by instead placing the weight on the individual author/article citation counts (and download counts, CiteRanks, authority counts, citation/download latency, citation/longevity, co-citation signature, and many, many new OA metrics waiting to be devised and validated, including full-text semantic-analysis and semantic-web-tag analyses too) rather than only, or primarily, on the blunter instrument (the journal impact factor).
This is not just about one number any more! The journal tag will still have some weight, but just one weight among many, in an OA scientometric multiple regression equation, customised for each discipline.
This is an occasion for rejoicing at progress, pluralism and openness, not digging up obsolescent concerns about over-reliance on the journal impact factor.
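To make that "multiple regression equation" concrete, here is a minimal sketch, assuming invented article-level metric values and quality scores for a single discipline (nothing here comes from any real RAE dataset): several candidate metrics are weighted jointly, by ordinary least squares, to predict a quality score, and the weights can be re-fitted separately for each discipline.

```python
# Minimal sketch of a per-discipline "scientometric regression equation":
# several candidate metrics (citations, downloads, co-citation score, etc.)
# are weighted jointly to predict a quality score. All numbers are invented.
import numpy as np

def fit_metric_weights(metrics: np.ndarray, quality: np.ndarray) -> np.ndarray:
    """Least-squares weights for predicting quality from a metric matrix.

    metrics: (n_articles, n_metrics) array of metric values
    quality: (n_articles,) array of target quality scores
    """
    # Add an intercept column, then solve the ordinary least-squares problem.
    X = np.column_stack([np.ones(len(metrics)), metrics])
    weights, *_ = np.linalg.lstsq(X, quality, rcond=None)
    return weights

# Hypothetical article-level metrics for one discipline:
# columns = [citation count, download count, co-citation score]
physics_metrics = np.array([
    [50, 400, 0.8],
    [10, 120, 0.3],
    [75, 900, 0.9],
    [ 5,  60, 0.1],
    [30, 350, 0.5],
])
physics_quality = np.array([4.0, 2.0, 4.5, 1.0, 3.0])  # invented target scores

weights = fit_metric_weights(physics_metrics, physics_quality)
print("Fitted weights (intercept, citations, downloads, co-citation):", weights)
```

In practice such per-discipline weights would of course have to be validated, for example by benchmarking them against the 2008 peer-review outcome, as the budget document itself proposes.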
On Thu, 23 Mar 2006, Ian Sommerville wrote on the CPHC list:
"This is the wording from the budget document 'The Government wants this to continue, but thinks the close correlation between Research Council income and QR income may provide an opportunity for allocating QR using a radically simpler system. '
"The point is made that, at an institutional level, there is a 0.98 correlation between research income and QR. No mention of citation impact. An alternative metric may be proposed for the humanities."
The document actually says: "one or more metrics... could be used to assess research quality and allocate funding, for example research income, citations, publications, research student numbers etc."
You are quite right, though, that the default metric many have in mind is research income, but be patient! Now that the door has been opened to objective metrics (instead of amateurish in-house peer-re-review), this will spawn more and more candidates for enriching the metric equation. If the RAE top-slice is to continue as an independent funding source in the present "dual" funding system (RCUK/RAE), it will need predictive metrics that are independent of prior funding. (If the RAE instead just wants to echo research funding redundantly, it need merely scale up RCUK research grants to absorb what would have been the RAE top-slice, and drop the RAE and dual funding altogether!)
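One rough way to check whether a candidate metric really carries information independent of prior funding, sketched below with invented numbers (this is an illustration, not a prescription), is to compare how well prior funding alone predicts the RAE outcome with how well funding plus the candidate metric predicts it; the increment in explained variance is the metric's independent contribution.

```python
# Sketch: does a candidate metric (e.g. citation impact) add predictive power
# over and above prior research income? All data below are invented.
import numpy as np

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """R^2 of an ordinary least-squares fit of y on X (with intercept)."""
    X = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

prior_funding = np.array([120.0, 85.0, 60.0, 42.0, 30.0, 18.0, 9.0, 4.0])
citation_impact = np.array([9000, 7500, 3000, 4200, 1500, 2500, 400, 300])
rae_outcome = np.array([115.0, 90.0, 55.0, 45.0, 27.0, 20.0, 8.0, 5.0])

r2_funding_only = r_squared(prior_funding[:, None], rae_outcome)
r2_both = r_squared(np.column_stack([prior_funding, citation_impact]), rae_outcome)
print(f"R^2, funding only: {r2_funding_only:.3f}")
print(f"R^2, funding + citations: {r2_both:.3f} "
      "(the increment is the metric's contribution independent of prior funding)")
```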
The important thing is to scrap the useless, time-wasting RAE preparation/evaluation ritual we were all faithfully performing when the outcome was already so predictable from other, cheaper, quantitative sources. Objective metrics are the natural, sensible way to conduct such an exercise, continuously; and once we are doing metrics, many powerful new predictive measures will emerge, over and above grant income and citations. The RAE ranking will not come from one variable but from a multiple regression equation with many weighted predictor metrics, in an Open Access world in which the research full-texts in their authors' Institutional Repositories are citation-linked, download-monitored, and otherwise scientometrically assessed and analysed continuously.
Hitchcock, S., Brody, T., Gutteridge, C., Carr, L., Hall, W., Harnad, S., Bergmark, D. and Lagoze, C. (2002) Open Citation Linking: The Way Forward. D-Lib Magazine 8(10).
Brody, T., Harnad, S. and Carr, L. (2005) Earlier Web Usage Statistics as Predictors of Later Citation Impact. Journal of the American Society for Information Science and Technology (JASIST).
Citebase impact ranking engine (2001)
Usage/citation correlator/predictor
Stevan Harnad