
Research Assessment Exercise 1992: Library and Information Management and Communication and Media Studies Panel

By JUDITH ELKIN and DEREK LAW

Abstract: Reports on the 1992 Research Assessment Exercise and specifically on the work of the Library and Information Management and Communication and Media Studies Panel in considering research submissions in two units of assessment. Panel members agreed with all of the final ratings and felt that rigour and fairness had been applied to the exercise within the parameters set by the Funding Council. There was, however, a feeling that two cited publications were insufficient to give an indication of an individual's contribution to research and that in future there should be greater emphasis on non-traditional forms of publication. Concludes that three cases need to be considered by the Funding Councils in the areas under consideration: where library staff undertake some work in a related academic department; where library staff undertake some work in a library and information science department; and where the library itself has a solid base of research activity but there is no library and information science department.

The 1992 Research Assessment Exercise has had a mixed reception in the academic community:

- on the whole welcomed by the departments which did better than expected;

- regarded as unfair by those which did worse than expected;

- regarded as timely, but still skewed unfairly towards the traditional sector, by the ex-polytechnics, which for the first time were eligible for central government funding for research;

- regarded as untimely by many of the old universities, which saw their new competitors as unworthy upstarts.

Comments from the Universities Funding Council (UFC) after the event emphasized that the essential heart of the Exercise was qualitative and suggested that the principles on which the Exercise operated were 'selective, transparent, dynamic, stable and accessible' (Bekhradnia, 1992), while the methodology was generally perceived to be both rigorous and fair. A Times Higher Education Supplement leader described the Exercise as being 'conducted in a way which commands considerable respect' (Times Higher Education Supplement, 1992a) and went on to say: 'If the rankings show the extent to which world class research is concentrated in rather few institutions, they also show the wide spread of good work'.

Until recently, departments in the 'old' universities were government funded by formulae in proportion to student numbers to carry out teaching and research; departments in the 'new' universities (former polytechnics) were government funded in proportion to student numbers to carry out teaching only; there was no formula funding for research. Staff in the latter who undertook research did so in time free from teaching and, if funded at all, were funded largely through industrial sponsorship or consultancy.

Prior to the last funding round, all higher education institutions in the UK were invited to participate in the 1992 Research Assessment Exercise, which was to be the third such Exercise. The purpose of the Exercise was to rate the quality of research for predetermined units of assessment, which equated broadly with traditional academic departments, after which the UFC would apportion research funding on a formula based both on that rating and on the numbers of research staff declared to be ’active’ by the institution. In addition the Higher Education Funding Council for England (HEFCE) has used numbers of research assistants and research students in its funding formula. The funding would take effect from 1993/1994, when the new Higher Education Funding Councils would be in place.
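The mechanism described above, a quality rating combined with a volume measure, can be sketched as follows. The sketch is illustrative only: the rating weights, the fractional credits for research assistants and students, and the unit of resource are all hypothetical, since (as noted later) the actual formula was not announced until after the Exercise was completed.

```python
# Illustrative sketch of formula-based research funding allocation.
# All weights and rates below are hypothetical, not the Funding Council's.

RATING_WEIGHT = {1: 0.0, 2: 1.0, 3: 1.5, 4: 2.25, 5: 3.375}  # hypothetical

def research_funding(rating: int, active_staff_fte: float,
                     research_assistants: float = 0.0,
                     research_students: float = 0.0,
                     unit_of_resource: float = 20_000.0) -> float:
    """Quality-weighted volume: rating weight x volume x unit rate (GBP)."""
    # HEFCE also counted research assistants and research students in its
    # volume measure; the fractional credits used here are assumptions.
    volume = (active_staff_fte
              + 0.5 * research_assistants
              + 0.25 * research_students)
    return RATING_WEIGHT[rating] * volume * unit_of_resource

# Example: a rating-4 submission with 10 FTE active staff and 4 students.
print(f"£{research_funding(4, 10, research_students=4):,.0f}")  # £495,000
```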

An extract from UFC Circular 5/92 states:

Research for the purpose of the Council’s review is to be understood as original investigation undertaken in order to gain knowledge and understanding. It includes scholarship; the invention and generation of ideas, images, performance and artifacts including design where these lead to new or substantially improved insights; and the use of existing knowledge in experimental development to produce new or substantially improved materials, devices, products and processes, including design and construction. (UFC, 1992a)

The criteria for awarding the ratings were summarized as:

If research is taken to be the generation of new knowledge, then quality in research can be described as the degree of impact (i.e. the extent to which general understanding is increased) that this knowledge has on the area or sub-area at local, national or international level. (UFC, 1992b)

UFC Circular 5/92 setting out the procedures and guidelines for the Exercise was issued in March 1992, with a final date for submissions from institutions of 30 June 1992. Some 2700 submissions, representing the work of over 43 000 full-time equivalent (FTE) academics, were received from 172 higher education institutions (HEIs). Some 950 of these, covering the work of 8000 FTE academics, were from the non-UFC sector (former polytechnics, higher education colleges, Scottish Central Institutions and the Open University). Submissions were assessed by 62 panels and sub-panels with 450 members and 50 assessors. Twelve per cent of the membership of panels was drawn from the non-UFC sector.

The 1992 Exercise broadly followed the approach of the previous 1989 Exercise, with information provided by the HEIs as the basis for peer review assessment of research quality by specialist panels. However, institutions were invited to list only two publications and up to two other forms of public output for each member of staff whose research was to be taken into account. These, together with a range of other data provided by the HEIs, formed the basis for the judgement of research quality. Perhaps more importantly, on this occasion it was not departments but submissions that were to be rated. This meant that institutions were allowed to cite only research active staff and that, in theory, the lone researcher of international stature in a generally poor department could be considered the equal of a large department in a major university. In practice this notion of the lone researcher caused many panels some unease. It also caused institutions which were not submitting all staff to gamble the value of a possibly higher rating against the funding to be gained from the volume measure of including more staff.

The Exercise was retrospective, based on the period 1988-1992, and aimed to take a ’snapshot’ picture of the quality of research being undertaken in institutions at a particular time. The Exercise was therefore intended to be based on the actual submissions, not on speculation about what might have been or future potential.

The submissions gave detailed information on:

- research active staff;

- their two cited publications plus other research output;

- a numerical summary of all publications by such staff (1988-1992 inclusive), categorized by type of output, e.g. authored book, edited book, short work, refereed conference paper, other conference paper, editorships, articles for academic, professional or popular journals, computer software, painting/sculpture, performance, etc.;

- students and studentships, including completion rates;

- external research income.

Departments were allowed three pages of narrative, to cover the research plan in the current environment, future developments and a statement of general observations, to include esteem indicators. In addition to the submissions, the UFC contracted a firm of management consultants who provided a breakdown of data showing comparisons of the number of research students per year, research students per active researcher, research income per active researcher, etc. On the whole, for this panel, this quantitative data proved little more than indicative in what was intended to be a qualitative exercise, but the narrative elements from the individual submissions proved disproportionately helpful in indicating a coherent framework for research and future direction.

In the guidance to panels, the UFC emphasized that the Exercise was to cover the whole range of academic research (applied, strategic and basic), with the emphasis on quality of research rather than quantity. Each panel was urged to adopt a particular approach to reflect as far as possible the specific characteristics of each subject area and the views of panel members, whilst working within a common framework. The ratings were based on a common five point scale with common definitions which related to attainable levels of national and international excellence in each unit of assessment (Figure 1).

The accompanying notes state that:

'Attainable levels of excellence' refers to an absolute standard of quality in each unit of assessment, and should be independent of the conditions for research within individual departments.

and

The international criterion adopted should equate to a level of excellence that it is reasonable to expect for the unit of assessment, even though there may be no current examples of such a level whether in the UK or elsewhere. (UFC 1992c)

Although apparently a little bland, the definitions of the five levels in the end proved helpful in the decision making on borderline cases.

The allocation of money for research as a result of this Exercise gave £600 million to the old universities and £40 million to the new. However, the panels were specifically forbidden to discuss money, as will be seen later, and this outcome reflects the later financial calculations of the Funding Council rather than the suggestions of the panels. It should also be pointed out that the Funding Councils did not announce the funding formula until after the Exercise was completed. While some HEIs complained that this lack of information hindered their selection of staff to include in returns, it did ensure that panels were unaware of the financial implications of their decisions.

LIBRARY AND INFORMATION MANAGEMENT AND COMMUNICATION AND MEDIA STUDIES PANEL

There were 72 units of assessment and 62 assessment panels and sub-panels. Two units of assessment, Library and Information Management (LIM) (Unit 64) and Communication and Media Studies (Unit 68), were combined in a joint assessment panel, representing specialists from the two areas. There were 20+ submissions in the former and 30+ in the latter. The panel was chaired by the UFC subject adviser for Library and Information Science (LIS), Derek Law, King's College London. Members of the panel were selected by the UFC in consultation with the subject adviser, following a wide trawl of institutions and professional bodies which produced a long list of recommended names. The Secretariat sought to include one or more panel members from the non-academic sphere (where appropriate), presumably in the hope of introducing doses of reality and public accountability, and to give an impression of the way in which research responds to public need. (See Annex A for the list of panel members; one of the originally selected panel members transferred to another panel in the light of the submissions received) (UFC 1992d). The panel was an interesting mix: it consisted entirely of academics on the communication and media side; on the LIS side it consisted of only two academics plus two practising chief librarians (including the chairman), both with some research background, and a commercial consultant, reflecting perhaps a confusion between academic and service delivery departments in the LIS field. There was no overlap of panel members from the previous Exercise. The panel was serviced by a member of the HEFCE Secretariat who also serviced several other panels, thus ensuring some further consistency between approaches.

All panel members received the full set of submissions for that panel. Panels were given considerable freedom in terms of procedures and interpretation of the guidelines, on the basis of the distinctive needs of the research area under review.

The panel met on three separate occasions between September and November 1992, following two briefing meetings in July which took place prior to receipt of submissions. At the briefing meetings, it was agreed that the panel should assess all submissions as a group, although individuals would lead in their own area of expertise. It was also agreed that the criteria to be used in reaching the final ratings would be quality of publications (not quantity), in appropriate rank order, together with number of research students and completion rates, with research income as a significant indicator (particularly in LIM). A rank order of publications was not made at that stage, although panel members agreed to come to the first full meeting prepared to discuss such a ranking.

The method of working was also agreed: all submissions were to be divided between panel members, so that each submission would have a ’champion’ and seconder. Thus submissions would be analysed in detail by two panel members, with written notes circulated to all panel members in advance of meetings, when submissions would be scrutinized by everyone. Potential conflicts of interest were declared, so that the panel member concerned was excluded from the discussion of that submission (and indeed left the room for the main discussion of such submissions).

The rank order of publications agreed at the first full meeting was that authored works, refereed articles and refereed conference papers rated more highly than individual reviews, letters or lightweight, non-refereed articles in popular journals. This ranking varied slightly in the communication and media part of the discussions, where chapters in books were the most highly regarded and refereed conference papers and articles were comparatively unusual.

The practice of citing only two papers per active researcher in the submission was not helpful, particularly as the quality of some of the publications cited, e.g. book reviews and internal reports, was questionable as examples of quality research. In one department's submission, 75 per cent of total publications shown were book reviews, leaving the panel wondering whether some of the individuals concerned had time for any teaching, let alone research. There was considerable inconsistency in categorizing publications: for example, is the Library Association Record a popular, professional or academic journal? It was cited in all three categories. It also has to be admitted that the general quality of reference citation was poor. This seems unforgivable in an academic exercise, but particularly so from academics within the information field! One submission had a level of mis-spelling which caused some consideration to be given to downgrading, on the basis that a grant-awarding body might well have treated it in this way. Further problems surrounded the listing of Category C staff. At least two institutions included large numbers of library staff who potentially might have improved the rating but in practice were excluded from the volume measure which determined income. Category C staff were defined in the circular as:

any staff who make an independent contribution to the research of the department, but do not have a contract with the institution... It includes, for example, NHS consultants, some Oxford and Cambridge College fellows, staff in Research Council units that are an integral part of the department and emeritus professors.

This definition seems to offer little scope for the inclusion of library staff, but institutions were advised that this was the appropriate place to put them. To compound the confusion, other institutions named individuals known to members of the panel as working in a library as Category A (’research active’) staff. This confusion over the role of research in academic services remains unresolved and is considered further below.

The discussion of the five-point rating scale took up considerable time. What were 'attainable' levels of national and international excellence within library and information management and communication and media? This was an ongoing debate through all of the meetings, although it was agreed early on that there would be departments which warranted a rating of 5, where they were strong on all indicators and had clearly achieved an international reputation. It was also likely that there would be departments at the other end of the scale which merited only a rating of 1. The starting point for the panel was working through a decision tree, spending longer on the first two or three submissions considered in each of the two areas, to set some kind of level. The first step was to distinguish between categories 1 and 2. This could be answered through the simple question of whether the research deserved public funding or not. A model submission was selected as an exemplar of the top rating of 5, followed by exemplars of each rating point. Inevitably, this was fluid, with full and frank discussion of each submission. The split between the two disciplines was helpful here in focusing debate on the justification for particular recommendations in the light of both units of assessment. There was a high degree of consensus on most submissions, allowing discussion to be focused on the few submissions where there was a difference of opinion. At this stage the priority was seen as establishing a rank order rather than spending a great deal of time assigning the exact rating. The final meeting was held after all the ratings were provisionally assigned, to allow time for reflection and to ensure that parity across and between the two disciplines had been applied.

Although institutions specified the unit of assessment, two submissions were transferred to other panels, with the consent of the institutions, on the advice of the panel, which effectively deemed them ’out of scope’. In three further cases, elements of submissions were referred to other panels for advice. Thus, for example, some elements of the submission of a library school (which in fairness should remain anonymous) were referred to the panel on Celtic Studies.

The department's overall 'research climate' and its management, monitoring and evaluation of research, as expressed in the narrative, were the final elements of the criteria deemed applicable by this panel. In the end this carried more weight than had been anticipated. Thus a department where the majority of research active staff were publishing material seen as of international significance, where there was a good record of successful research student completion rates and reasonable research income, in a climate of positively and strategically managed research, scored high in the final ratings. It was perhaps this single issue of how coherently the research activity was managed which provided the best guide to the quality of the department.

Departments where research was fragmented, even if individually sound; where the publications cited did not indicate research of even national importance or were seen as insignificant; where the student completion rates were poor or there were no research students; where income was poor or non-existent; and where the narrative seemed confused, incoherent, or lacking any kind of forward thrust or strategic direction, generally scored low ratings.

The Exercise was by definition aimed at assessing ’snapshots’. This led to a certain degree of frustration on the part of the panel since it was clear that some institutions were moving out of difficult periods, while others might have been seen to have ’peaked’; yet no allowance could be made for this. In two cases there was a wish to suggest that some ’Dev R’ funding might be made available to recognize this. The ’Dev R’ or Development Research funding seemed ideal for these cases, but since funding issues were specifically excluded from the panel’s brief this was quickly, if with regret, seen as ultra vires.

In the final analysis of the whole process, the panel felt that the Exercise had been carried out with great thoroughness, fairness and professionalism and that justice had been done.

The panel finally awarded:

- two 5s (City University; University of Sheffield)

- three 4s (Loughborough University of Technology; University of Strathclyde; University College of Wales, Aberystwyth)

- four 3s (University of Bath; University of Brighton; De Montfort University; University of Leicester)

- six 2s (University of Central Lancashire; University College London; Manchester Metropolitan University; University of Northumbria at Newcastle; Queen's University of Belfast; Robert Gordon University)

- four 1s (Liverpool John Moores University; Royal Free Hospital School of Medicine; Thames Valley University; Queen Margaret College)

for Library and Information Management (an average weighting of 3.08) (UFC 1992e).

- four 5s (Goldsmiths' College; University of Warwick; University of Westminster; University of Stirling)

- two 4s (University of East Anglia; University of Sussex)

- seven 3s (University of East London; University of Kent at Canterbury; University of Leeds; University of Leicester; Nottingham Trent University; Sheffield Hallam University; University of Ulster)

- thirteen 2s (University of the West of England, Bristol; City of London Polytechnic; Coventry University; Liverpool John Moores University; University of North London; Staffordshire University; Thames Valley University; West Surrey College of Art and Design; University of Wolverhampton; Duncan of Jordanstone College of Art; Glasgow Polytechnic; Queen Margaret College; University of Glamorgan)

- five 1s (City University; Salford College of Technology; Southampton Institute of Higher Education; Trinity and All Saints College; University of Wales College of Cardiff)

for Communication and Media Studies (an average weighting of 2.66) (UFC 1992f).
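For reference, a simple unweighted mean of the ratings listed above works out at 2.63 for LIM (50/19) and 2.58 for Communication and Media Studies (80/31). The published 'average weightings' of 3.08 and 2.66 are higher, so they presumably weight each submission by some volume measure such as research-active staff; the text does not state the weighting used. A minimal sketch of the unweighted calculation, with the staff-weighted version shown under that assumption:

```python
# Ratings awarded, as listed above: {rating: number of submissions}.
LIM = {5: 2, 4: 3, 3: 4, 2: 6, 1: 4}   # 19 submissions
CMS = {5: 4, 4: 2, 3: 7, 2: 13, 1: 5}  # 31 submissions

def unweighted_mean(awards):
    """Simple mean of the ratings, one vote per submission."""
    return sum(r * n for r, n in awards.items()) / sum(awards.values())

print(f"LIM: {unweighted_mean(LIM):.2f}")  # 2.63
print(f"CMS: {unweighted_mean(CMS):.2f}")  # 2.58

def staff_weighted_mean(ratings_and_ftes):
    """Mean rating weighted by each submission's research-active FTEs
    (the FTE figures would come from the individual submissions)."""
    total_fte = sum(fte for _, fte in ratings_and_ftes)
    return sum(r * fte for r, fte in ratings_and_ftes) / total_fte
```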

This was the only unit of assessment in which a former polytechnic (the University of Westminster) scored a 5, for its Centre for Communication and Information Studies. This even merited a front-page headline in the Times Higher Education Supplement, 'Media star rises in Westminster', which went on to say:

'The University of Westminster will be broadcasting on all channels this week the news that it is the only former polytechnic to achieve a coveted star rating...' (Times Higher Education Supplement, 1992b). Westminster was also interesting in that the submission firmly straddled the two groups, with some of the work centred in the LIM area.

ANALYSIS OF LIM SUBMISSIONS

Of the teaching departments in the sector, only 13 of the 16 eligible submitted; one (UCE, Birmingham) abstained on the basis that, as the majority of its research was 'near-market', the Exercise was inappropriate for it: a view not commonly shared by other institutions with a similar research profile. In addition, there were submissions from the University of Bath, De Montfort University, the University of Central Lancashire, Queen Margaret College and the Royal Free Hospital School of Medicine.

Little has been published on the quantity of research in the LIM sector, and the Research Assessment Exercise produces the first figures on its size. By definition all research outside the higher education (HE) sector is excluded, and even within that sector research by practitioners is largely excluded. Nevertheless, the following figures, although incomplete, are interesting as a first approximation of the size of the sector. Analysis of the Library and Information Management submissions shows the following 'snapshot' picture of research activity in the sector:

- 199 (141.8 FTE) 'active' researchers in the field, with 22.5 postdoctoral students and 14.8 postgraduate research support staff.

- The average number of publications per active researcher was 4.9 p.a., with averages ranging from 6.4 to 0.5 for individual departments.

- The average number of full-time and part-time research students per department, 1988-1991, varied from 29.8 to 0.8 p.a.

- Research income, 1988-1992, showed a total of 197.6 grants and contracts with a value of £4 628 314, the major sources of which are shown in Table 1 (UFC, 1992g).
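These totals support some rough per-capita arithmetic, sketched below; treating the 1988-1992 research income as spread over four academic years is an assumption about how the period is counted.

```python
# Rough per-capita figures derived from the totals reported above.
TOTAL_INCOME = 4_628_314   # GBP, research income 1988-1992
GRANTS = 197.6             # grants and contracts (fractional: apportioned)
ACTIVE_FTE = 141.8         # 'active' researchers, FTE

print(f"Average grant/contract: £{TOTAL_INCOME / GRANTS:,.0f}")       # £23,423
print(f"Income per active FTE:  £{TOTAL_INCOME / ACTIVE_FTE:,.0f}")   # £32,640
# Assuming the period covers four academic years:
print(f"Income per FTE p.a.:    £{TOTAL_INCOME / ACTIVE_FTE / 4:,.0f}")  # £8,160
```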

DEBRIEFING

Although the Exercise was seen as reasonably fair, what was by now the HEFCE called a debriefing meeting of panel chairmen after the event. This reinforced the impression that a variety of practices had been adopted, and the meeting commended this varied approach for future exercises. The criteria on which ratings were made by other panels varied widely. For example, the Applied Mathematics panel established that its major criterion was research and publications in significant journals, with research income less important; the Chemistry panel took the primary view that none of the new universities would rate more than 1 and took as its main criteria the existence of research staff in all chemical areas in any department, together with staff of national standing, followed by publications in top journals; the Biochemistry panel was concerned only with quality of research in good journals, plus peer reviewed income.

COMMENTS FOR THE FUTURE

The panel felt that rigour had been applied and fairness ensured within the parameters set by the Funding Council. Panel members agreed unanimously with all of the final ratings. There was a general feeling that two cited publications were perhaps not sufficient to give a real feel for the quality of an individual's contribution to research, and that in future there should be a greater emphasis on the value of non-traditional forms of publication. This panel saw the management of the research process and the creation of a real 'research climate' as a far more important indicator than had been anticipated. Perhaps this should be highlighted in future exercises. The membership of panels was seen as adequate, although in the case of the LIM panel it was overbalanced by practitioners.

On the other hand, there was a feeling that the Funding Council must take a firm stance on the role of research in the broad area of academic service departments. If such research is valued it should be judged as such; if not, it should be ignored. Three cases need to be considered by the Funding Councils in our area. The first is that of a member of library staff who undertakes some work in a related academic department (an archivist in the history department, for example); the second is where library staff undertake research within a LIS department; the third is where the library itself has a solid base of research activity but there is no LIS department. Much clearer guidelines are required for an area where there are several thousand professional staff, who far outweigh LIS departments in sheer numbers. The compromise of the intermediate Category C, which can affect ratings but not attract funding, seems a messy one.

The differentiation in the five-point scale was seen as too narrow, with considerable variation within each of the grades between departments scoring at the top of the band and those at the bottom. There was a distinct preference for moving to a seven- or ten-point scale.

A recent circular (HEFCE, 1994) has announced that the next Research Assessment Exercise will take place in 1996. The census date will be 31 March 1996, with submissions due by 30 April 1996. The 1996 Exercise will follow broadly the same approach as in 1992, but institutions will be invited to list up to four works for each member of staff whose research is to be taken into account.

Submissions will not be required to include a summary count of publications, signalling clearly the concern with quality rather than quantity. Assessment panels will be established around nine months in advance of the census date so that the criteria individual panels intend to adopt in forming their judgement can be published.

March 1994

REFERENCES

Bekhradnia, Bahram (1992) Quoted in papers to invited seminar on UFC Research Assessment Exercise, University of Westminster, 26 May 1993

HEFCE (1994) Circular RAE96 1/94: 1996 Research Assessment Exercise

The Times Higher Education Supplement (1992a) 18 December, leader

The Times Higher Education Supplement (1992b) 18 December, p.1

UFC (1992a) Circular 5/92: Research Assessment Exercise 1992. Annex A: Research

UFC (1992b) Circular 5/92. Annex D: Guidance to panel members

UFC (1992c) Circular 5/92. Annex C: The rating scale - interpretation of scale points

UFC (1992d) Circular 15/92: Research Assessment Exercise 1992: membership of panels, p.26

UFC (1992e) Circular 26/92: Research Assessment Exercise 1992: the outcome, p.72

UFC (1992f) Circular 26/92: Research Assessment Exercise 1992: the outcome, p.76

UFC (1992g) Unit of Assessment Report for Library and Information Management (64), August 1992, 17.1

ANNEX A: LIST OF PANEL MEMBERS

Mr D. Law (Chairman), Librarian, King's College London

Professor P. Brophy, Librarian, University of Central Lancashire

Mr J. Corner, Department of Politics and Communication Studies, University of Liverpool

Professor J. Elkin, Head of School of Information Studies, University of Central England in Birmingham

Mr E. M. Keen, Department of Information and Library Studies, University College of Wales, Aberystwyth

Professor G. Kress, Institute of Education, London University

Professor J. MacDonald†, Head of Department of Theatre, Film and Television Studies, University of Glasgow

Professor P. R. Schlesinger, Head of Department of Film and Media Studies, University of Stirling

Mrs B. White, Consultant, Brenda White Associates

Secretary - Miss J. Fitzgerald

† Transferred to Drama Panel

AUTHORS

Professor Judith Elkin is Head of the School of Information Studies and Director of the Centre for Information Research and Training at the University of Central England in Birmingham. Her research interests lie in the fields of public and school libraries, children's and multicultural literature and equal opportunities.

Derek Law is Director of Information Services at King's College London and has worked in a number of higher education libraries over the last 25 years. He was the UFC's subject adviser in LIS and is a member of the Joint Information Systems Committee, chairing its Information Services Sub-Committee.