Universities UK (UUK) recommends making all the research outputs submitted to the UK's new
Research Excellence Framework (REF) Open Access (OA).
The UUK's recommendation is of course very welcome and timely.
All research funded by the RCUK research councils is already covered, since the UK councils all
mandate OA. It is this policy that the US too is now contemplating adopting, in the form of the proposed
Federal Research Public Access Act (FRPAA) and the discussion in President Obama's ongoing OSTP
Public Access Policy Forum.
But if HEFCE were to follow UUK's recommendation, it would help ensure Open Access to UK research funded by the EU (for which OA is so far only partially mandated) and by other funders, as well as to unfunded research -- for which OA is mandated by a still small but growing number of
universities in the UK and worldwide. (The same UUK proposal could of course be taken up by the UK's universities themselves: once they mandate OA for all their research output, all UK research, funded and unfunded, becomes OA!)
There is an arbitrary constraint on REF submissions, however, which would greatly limit the scope of an OA requirement (as well as the scope of REF itself):
Only four research outputs per researcher may be submitted, for a span covering at least four years, rather than all research output in that span.
This limitation arises because the REF retains the costly and time-consuming process of
having the REF peer panels re-review all of the already peer-reviewed research outputs submitted. This is precisely the process that it had
earlier been proposed to replace with
metrics, if they proved sufficiently correlated with -- and hence predictive of -- the peer panel rankings. Now it will only be partially supplemented by a few metrics.
This is a pity, and an opportunity lost, both for OA and for testing and validating a rich and diverse new battery of metrics and initializing their respective weights, discipline by discipline. Instead, UUK has endorsed a simplistic (and likewise untested and arbitrary) a priori weighting ("60/20/20 for outputs, impact and environment").
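To make the contrast concrete, here is a minimal sketch, in Python with pandas and scikit-learn, of how such validation could work: regress a battery of candidate metrics against the peer panel rankings, discipline by discipline, and take the resulting coefficients and their predictive power as initial, empirically derived weights. The metric names, file name and column layout are illustrative assumptions, not actual REF data or procedure.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical battery of candidate metrics (names are illustrative assumptions)
METRICS = ["citations", "downloads", "co_citations", "hub_authority"]

# Hypothetical input: one row per submitted output, with its discipline,
# its peer-panel rank, and a value for each candidate metric.
df = pd.read_csv("ref_panel_scores.csv")

weights_by_discipline = {}
for discipline, group in df.groupby("discipline"):
    X = group[METRICS]               # the metric battery
    y = group["panel_rank"]          # the peer panels' ranking: the criterion to predict
    model = LinearRegression().fit(X, y)
    r_squared = model.score(X, y)    # how well the metrics jointly predict the panel ranks
    weights_by_discipline[discipline] = dict(zip(METRICS, model.coef_))
    print(f"{discipline}: R^2 = {r_squared:.2f}")
```

On this kind of approach, metrics whose fitted weights prove negligible, or disciplines where the joint fit is poor, would indicate where panel re-review still needs to be retained, rather than fixing the weights a priori across the board.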
Harnad, S. (2009) Open Access Scientometrics and the UK Research Assessment Exercise. Scientometrics 79(1). Also in: Torres-Salinas, D. and Moed, H. F. (Eds.) Proceedings of the 11th Annual Meeting of the International Society for Scientometrics and Informetrics 11(1), pp. 27-33, Madrid, Spain (2007).