Research Assessment in Languages, Linguistics and Area Studies
Author: Prof Jim Coleman
Abstract
The Research Assessment Exercise (RAE) is a form of peer review on the basis of which the UK Government allocates nearly £1 billion a year of funding. This article traces the origins, history, mechanisms, shortcomings, successes and possible future of the RAE.
Background
In the 1980s, the Thatcher Government demanded accountability, transparency and quality assurance and enhancement from British universities. Previously, 'old' universities (those founded or designated up to the end of the 1960s) received a block grant, undifferentiated for teaching and research, according to student numbers. Initially, assessment of research performance through what became the Research Assessment Exercise (RAE) carried no financial implications, and the first exercise in 1986 was widely derided, but by the fifth in 2001, some £900 million of annual funding depended on the results. All universities had developed sophisticated RAE strategies, while for most individual academics, RAE had become the dominant concern of their professional lives. Polytechnics joined the RAE in 1992, the year they acquired university status: previously unfunded for research, they faced uneven competition.
The 2001 RAE
The RAE process has been refined through consultation and feedback. In 2001, senior researchers nominated by subject associations were appointed to discipline-based panels, among them American Studies, Middle Eastern and African Studies, Asian Studies, European Studies, French, German (with Dutch and Scandinavian Languages), Italian, Russian (with Slavonic and East European Languages), Iberian and Latin American Languages, Communication, Cultural and Media Studies, and Linguistics. Each university selected the individuals to submit under each Unit of Assessment (UoA). Data in each submission covered staff numbers, research income, numbers of research students and research degree completions, and two textual elements, one describing research context and strategy, the other providing evidence of peer esteem.
However, the crucial part of each RAE submission was the listing of up to four 'best' publications for each researcher. Using criteria and working methods which were themselves publicly developed through consultation, each panel assessed the quality of each publication and researcher on a three-point scale (international, national, sub-national). The proportion of research in each category dictated the grade awarded to the UoA (from 1 at lowest to 5* at highest), provided such a grade was consistent with the remainder of the submission. Only later was the (inadequate) level of funding for each subject and grade decided by the Funding Councils.
The 2008 RAE
The RAE has demonstrably achieved its purpose in many ways. Individuals, groups and departments now plan research systematically. Resources are identified and monitored; peer review and citation are the key criteria by which quality is measured. Every university now monitors staff performance, and accountability at all levels means academics are no longer funded for research they fail to produce. Productivity and quality have improved so that the UK now has one of the most cost-effective and widely cited research profiles in the world.
But the exercise has been criticised for its cost, for encouraging game-playing in submissions, for rewarding traditional categories of research, and for diverting energy and resources from the core function of teaching. Deliberate attempts to encourage interdisciplinary and pedagogical research have failed. The 2003 Roberts review therefore examined all aspects of the RAE and proposed a radically new model. However, consultation achieved a consensus that the changes would add complexity and expense without cutting game-playing, and in 2004 the funding councils dropped virtually all the Roberts recommendations, retaining only the principle of increased selectivity and a new grading system. In RAE 2008, the quality of outputs (i.e. publications) in each submission will be expressed not as a single grade but as a quality profile showing the percentage of outputs at each level. Use of five categories from four-star (top international) through one-star (lower national) to unclassified should reward both consistently excellent departments and pockets of excellence in less research-intensive institutions. But until it is clear how a particular profile will translate into funding, and thus to what extent a mediocre tail will pull down the star performers, guesswork and game-playing will persist. 'Main panels' will oversee disciplinary sub-panels, to which it is hoped that Applied Linguistics will be added: otherwise language teaching research will be scattered not only across sub-panels but across the four Main Panels covering Linguistics, Education, Languages and Area Studies.
Related links
RAE 2001
www.hero.ac.uk/rae
RAE 2008
www.rae.ac.uk
Roberts review of Research Assessment
www.ra-review.ac.uk
Referencing this article
Below are the possible formats for citing Good Practice Guide articles. If you are writing for a journal, please check the author instructions for full details before submitting your article.
- MLA style:
Canning, John. "Disability and Residence Abroad". Southampton, 2004. Subject Centre for Languages, Linguistics and Area Studies Guide to Good Practice. 7 October 2008. http://www.llas.ac.uk/resources/gpg/2241.
- Author (Date) style:
Canning, J. (2004). "Disability and residence abroad." Subject Centre for Languages, Linguistics and Area Studies Good Practice Guide. Retrieved 7 October 2008, from http://www.llas.ac.uk/resources/gpg/2241.