Remembering History: The Work of the ISSC

Derek Law,

University of Strathclyde

Introduction

As we enter the Web 2.0 world it is very easy to forget how recent most of the electronic environment we take for granted is; to lose track of the maze of false starts and blind alleys; to forget that what we take for granted is not deeply embedded. The web has existed for barely fifteen years. A decade ago Google, Amazon and Skype did not exist and libraries still had a monopoly on mediated – and often charged – online searching. As late as 1995, JISC had only just joined W3O, and its first report on the web, presented that March at a conference attended by (only) 54 HE institutions, concluded that “The WWW is clearly an important tool for many applications and is expected to be so for some years to come” (AGOCG, 1995). This underwhelming vote of confidence simply reflects the fact that today’s broad and certain highway was not a self-evidently obvious path to follow.

The growth of database usage had been slow. In 1971 a mere 18,000 MEDLARS searches were conducted in the whole United States, based on a batch-processing system (Landesman, 2005). DIALOG was launched in 1972. A 1984 survey of 376 US higher education institutions showed that fewer than half offered online searching, and of those that did, fewer than 5.8% performed more than 1,000 searches a year (Perry, 1992). This was the time when CD-ROM was seen as the answer and we loaded as many as two dozen datasets in jukeboxes. There was agonised debate as to whether users should have mandatory training courses before being allowed to have their carefully pre-booked sessions at library terminals.

The Funding Councils and Libraries

All of that changed in the three years from 1991 to 1993. In 1991 the Computer Board was recast as the Information Systems Committee of the Universities Funding Council (UFC). Its original role had been the funding of mainframe computers in the then much smaller number of universities. Board members revelled in the formal title of godfather (this author being briefly but officially the godfather for Wales!). The Computer Board was prescient in sensing that while the need for centrally funded computers was disappearing, there was continuing benefit in a centrally managed programme promoting the electronic world which was beginning to appear. It therefore began work on the first national dataset procurement. The Information Systems Committee (ISC) only had time to award the contract for the ISI dataset before it too disappeared, to become the Joint Information Systems Committee, while the UFC was balkanised into the (national) Funding Councils. ISC’s other significant achievement in its one-year life was its decision to commission a national review, the specification for which became the IT component of the Follett Review. JISC operated then, as now, through a series of sub-committees, and the Information Services Sub-Committee (ISSC) was set up and charged with managing and developing the support infrastructure for JANET and with developing and implementing a national datasets policy. In 1993 came the Follett Report, the NCSA Mosaic web browser and the first SuperJANET contract to deliver 34Mbit/s connectivity to 55 universities. The world had changed irrevocably.

A National Datasets Policy

The Computer Board had its offices in Orange Street in central London. Close by stands the Hand and Racquet public house, the traditional post-meeting venue. It was here that the main discussions took place to plan the acquisition of the ISI dataset. A combination of ambition, an end-of-year Computer Board surplus and a publisher end-of-year sales shortfall led to the striking of the first ever national dataset deal. The initial deals focused on abstracts and indexes. Electronic journals were still experimental and even in 1993 a report for the Follett Committee was very ambivalent as to their potential (Waddell, 1993). The ISI deal was accomplished quickly and was immediately and formally described as “a giant leap in the dark”. In a world of mediated online searching it was not clear whether the general mass of users would have either the skill or the ambition to undertake their own searches. The first deal with ISI was very successful, being adopted by some 76 institutions, but the second and third, Embase and Inside Information, attracted only 22 institutions (Scanlon, 1993). Indeed the move from mediated to unrestricted access was a sufficiently large change that a Computer Board sub-committee was set up to review needs and policy and, in effect, retrospectively to validate the rushed decision. The advantage of this approach was that from the start a set of principles was established, mainly by ISSC, to guide policy on datasets. With a budget of eight million pounds a year to acquire datasets and a nervous and uncertain publishing industry, ISSC spent a lot of effort in determining its guiding principles. Some of this thinking was spurred by Harry East’s comment that the committee’s activity was “pragmatism in pursuit of policy” (East, 1994). Ten years on it seems worth rescuing these now forgotten principles from the mists of time, not least because they have helped shape where we are.

By 1994 the ISSC had developed the concept of the Distributed National Electronic Resource (DNER). The model was still seen as a centrally funded and driven one, but was to be based at five national data centres. One of these already existed as the Data Archive at Essex, whose funding was taken over by JISC; one was to be a new distributed model in the Arts and Humanities Data Centre, awarded to King’s College London by competition. Also by competition, Bath, Edinburgh and Manchester Universities were designated as national data centres.

By this time the first ISI contract was coming close to renewal and so the DONUT Strategy was created. The model had two elements. A core of universal datasets such as ISI would be purchased for all; this would be surrounded by a whole series of subject-specific datasets. It was clear that the substantially positive response to the acquisition of the first dataset and its heavier than expected usage, as well as public knowledge of the eight million pound budget, would lead publishers in general and ISI in particular to adopt a much tougher negotiating position. The DONUT Strategy therefore offered ISI the option of being the jam in the doughnut or the hole in the doughnut; they could be part of the solution or part of the problem (Scanlon, 1993). The ISSC stance was deliberately hard. National deals were still a novelty and yet there was much more data than the budget could buy. If a publisher was difficult it could simply be ignored, since there was always another deal to hand.

Underpinning negotiations with publishers were a set of principles which may now seem self-evident, but certainly were not obvious in 1992. These were:

Free at the point of use

This was certainly not a given. Charged mediated searching was still the norm. That had always been a source of irritation to the computer literate, and a 1993 report stated that “90% of academics had their own microcomputer” (Waddell, 1993). The determination to spread electronic methods of working which permeated all JISC programmes would be held back by charging but encouraged by free access. The view was also heavily coloured by the fact that the membership of ISSC was dominated by librarians, for whom free access to information was an article of faith.

Subscription not transaction based

This model had been developed by CHEST (The Combined Higher Education Software Team) based at the University of Bath for the purchase of software licences. This again fitted comfortably with the librarians’ view of how information should be made available and their experience with printed journal subscriptions. At that point publishers were pressing the transaction based model. Fortunately the size of the budget gave JISC sufficient clout to set its own terms, while the experience of CHEST in negotiation ensured success.

Universality

The intention was to use the budget to cover all disciplines. This again was a hard-fought debate. The pressure was to spend the budget entirely on big science research publications, as these were the most expensive and therefore most at risk of being cancelled in most institutions. However there was a clear wish to spread computer skills and practice throughout the HE community. It was felt that this would be greatly helped if every member of staff and every student had access to at least one resource which was essential to them.

Lowest common denominator

In the same way it was clear that there should be something for staff and students at all levels. In the post-1992 world with double the number of universities there was no political will to set up an elite system of resources rather than a mass system.

Commonality of interfaces

This turned out to be a clear area of failure, although it may have helped to develop thinking on interoperability. Every publisher had developed or was developing different search and operating systems for their products and it was clear that this was set to become a nightmare for users. ISSC was confident that its financial muscle would allow it to start imposing sufficient commonality to make life easier for users. This proved a fond hope.

Common mass instruction programmes

It seemed clear that such a large change required a major training component to deliver the full benefit of the principles. Initially this focussed on a partnership approach with the JANET User Group for Libraries (JUGL). The issue remained prominent as JISC’s work expanded and the eLib programme funded the Netskills programme based at Newcastle University. The need remained important and training soon became a large enough activity to require its own Sub-Committee, CALT (the Committee for Awareness, Learning and Training). At the same time much emphasis was placed on producing high quality documentation to support the datasets purchased.

ISSC and FIGIT

When the Follett Report was published in late 1993 it endorsed further development of activity in what was now a JISC core programme of database and dataset provision and the development of network navigation tools and services (Brindley, 1994). When the FIGIT Group was set up in 1994 to implement the Follett recommendations, harmony of development was ensured through cross-membership of the two committees. The ISSC budget remained at £8 million per annum compared with FIGIT’s £3 million. As a broad division of labour, FIGIT funded development projects and exploratory work, while ISSC funded operational services and infrastructure. This was never a hard and fast rule, of course. FIGIT had five thematic lines to pursue under Chris Rusbridge’s energetic and imaginative leadership; matters of interest which fell outside those lines tended to be picked up by ISSC. In terms of infrastructure ISSC supported a number of activities. By 1995 the key ones were:

AGOCG. The Advisory Group on Computer Graphics, which provided a single national focus for computer graphics, visualization and multimedia. Based at Loughborough it carried out software and hardware evaluations, ran workshops and seminars and assisted sites in the introduction of key technologies. It offered a then useful “technology watch” service.

Cache Service. This was still a novelty in the mid-1990s, when the provision of bandwidth (or more accurately the cost of such provision) was a major issue for JISC. Cache sites simply captured international traffic and stored it for a brief period, on the assumption that the best guide to what will be used is what has been used. Early results showed that a modest investment in servers produced the equivalent of a large increase in bandwidth, and a national service was duly set up.

The Database Resources Research Group. Evaluation was an early and important requirement for all services. It was felt that even a modest investment in electronic services would be better made in knowledge of how they were used. A small unit was therefore funded at City University to study who used network services and why.

CHEST. CHEST was based jointly at Bath and De Montfort Universities. It was responsible for negotiating software and data deals on a national basis, whether by purchase or by licensing. Software purchasing was a longstanding Computer Board activity taken over and extended by ISSC. By mobilising the total purchasing power of the Higher Education community it secured large discounts.

Resource Discovery. A review study of CNIDR (Clearinghouse for Networked Information and Resource Discovery) and of InterNIC in the United States was completed in 1994, to consider how these American ideas might be used in a UK context to make information on network developments and standards generally available and to provide advice and leadership on local system design. Although not completely followed through, this did lead to the development of resource discovery services such as ADAM, EEVL and SOSIG.

MAILBASE. It is now difficult to remember a time when e-mail was not the pre-eminent communication form. But at that time it remained a minority activity. What did exist was often channelled through listservers. Mailbase was based at the University of Newcastle and organised the Listserv activity in the United Kingdom. Its brief was wider however and it also set out to organise the communities which would operate listservers.

UKOLN. It is again difficult to remember a time when UKOLN did not exist, but in its present form it was created in 1992, when the much lamented British Library Research & Development Department and ISC agreed jointly to fund a new unit based on two older units at the University of Bath (UKOLN, 1995). Under a succession of energetic leaders UKOLN has become one of the outstanding bequests of the ISC.

Resource Discovery

Perhaps the area in which ISSC worked most closely with FIGIT was that of resource discovery. This operated at several levels. The first was overtly political. The first major – and still ongoing – service was the COPAC database (Cousins, 1997). While a good thing to do in its own right, the protracted negotiations with CURL to gain access to the database were very clearly signalled as being intended to undermine the cash-strapped British Library’s much-rumoured intention to charge for use of its new OPAC. It was also intended to provide some underpinning infrastructure for the document delivery strand of the eLib programme, again potentially undercutting the British Library if it continued to raise document delivery prices.

ISSC also took over and developed three existing services:

HENSA. This was a shareware archive. It had two parts, with Unix numerical and statistical software offered from the University of Kent and PC software from Lancaster University. At Kent, Internet searches could also be performed using the archive server.

NISS. This set of services was based at the University of Bath and concentrated on current information, ranging from yellow pages to newspapers. It aimed to promote an electronic information culture by providing access to useful collections of information. It also acted as a gateway to other services and resources and provided information through the NISS Bulletin Board.

BUBL. The BUBL Information Service, based at the University of Strathclyde, offered an Internet current awareness service, together with organised, user-friendly access to Internet resources and services with a combined gopher/WWW subject tree being a particular feature.

But ISSC also worked with FIGIT on a more developmental approach and took responsibility for developing subject based services, in response to user demand. These services had a common theme and a common set of standards. It was believed that attempting to catalogue everything on the Internet was not reasonable – probably a bad decision in the light of Google’s success! Instead it was intended to make available a limited set of resources of importance to a discipline; catalogue and abstract them; ensure availability; and provide documentation and support. In an inversion of Gresham’s Law that bad money drives out good, it was believed that good information would drive out bad. High quality information, properly catalogued, reliably available, properly documented and supported would be preferred to information of unknown provenance and quality, infrequently available and without support.

There was no commonly accepted standard for resource discovery and so the ROADS project was set up, to support IAFA templates and encourage their use. It was felt that it was less important whether this was the right decision than to demonstrate that UK Higher Education was seen as a major player with a right to a place in the fora where standards decisions were being made. The subjects covered were:

ADAM. Based at the West Surrey Institute this looked at the quite unusual set of resources required by groups as varied as fashion design students and jewellery craftsmen. Visual images are a major element here.

EEVL, based at Heriot-Watt University, supported the engineering community.

OMNI, based at the National Institute for Medical Research, covered medicine.

RUDI covered urban design and was based at the University of Hertfordshire and Oxford Brookes University.

SOSIG, based at the University of Bristol, covered the social sciences.

It should be evident from this list of projects that ISSC firmly adopted the distributed model championed by eLib, ensuring that as many institutions as possible were engaged in developing e-activity, as a means of securing the widest possible support base and the largest number of proselytisers.

Conclusion

The work of the ISSC is largely and perhaps rightly forgotten, and usually but incorrectly subsumed under the general banner of the eLib programme. This is of little moment other than to those directly concerned. FIGIT and the eLib programme defined a decade of momentous change and quite clearly transformed a centralised, institutionally based information culture into a personal, desktop-based one. And yet barely a single eLib project has survived or is remembered, except in fond memories of the good old days. Paradoxically, as described above, much of the work and infrastructure created by ISSC survives, yet the committee itself is forgotten. ISSC and FIGIT worked the same patch harmoniously until both were reconstructed in 1997. ISSC does however have one piece of work which has faded away but deserves not to be lost: it articulated the principles of acquisition and the concept of a distributed national resource. The latter has been usefully recast at least twice as the world has moved on, but the former has passed into a comfortable desuetude which perhaps denotes the acceptance of the principles as a norm. But an occasional reminder that the HE community acts on principle and not just pragmatically is no bad thing.

References

Advisory Group on Computer Graphics (AGOCG). World-Wide Web – A Strategic Tool for UK Higher Education (SIMA Report Series, No. 12). Loughborough: AGOCG, 1995.

Brindley, Lynne. Joint Funding Councils’ Libraries Review Group (the ‘Follett’) Report – the contribution of the Information Technology Sub-Committee. Program, 28 (1994), 275-278.

Cousins, Shirley Ann. COPAC: the new national OPAC service based on the CURL database. Program, 31 (1997), 1-21.

East, Harry. National datasets acquisition: pragmatism in pursuit of policy. Journal of Information Networking, 2(1) (1994), 1-12.

Landesman, Margaret. Getting it Right – The Evolution of Reference Collections. In Frost, William J. (ed.), The Reference Collection: From the Shelf to the Web. Haworth Press, 2005.

Perry, Edwin M. The historical development of computer-assisted literature searching and its effects on librarians and their clients. Library Software Review, 11 (1992), 18-24.

Scanlon, S. The UK’s Networked Dataset Revolution Continues (Library & Information Briefings, No. 50). London: LITC, 1993.

UKOLN. Annual Report, 1995. University of Bath, 1996.

Waddell, Pam. The Potential for Electronic Journals in UK Academia. SEPSU, 1993. (SEPSU report for the HEFCE Libraries Review.)