CD-ROM

CD-ROM: A YOUNG TECHNOLOGY WITH A GREAT FUTURE BEHIND IT?

As CD-ROM begins to spread, the commonest reaction I hear from library administrators is that it is a technology which creates more problems than it solves. It is certainly the case that, as its use continues to spread, the problems become more apparent, and my intention is to provoke a little discussion by taking on the role of devil's advocate and looking at the sort of issues which could marginalise CD-ROM. Much of the evidence I shall cite comes from the PACS-L Bulletin Board in the United States.

Firstly, cost. The experience in UK higher education is that the ratio of PCs to products is very close to 1:1. This in turn brings associated costs: replacement drives when the compact disc is stuck in the A: drive of the PC; paper costs, or the option of installing a high-quality, high-cost printer and card payment system; the nuisance of virus attacks on the PC and the cost of keeping it virus-free; incompatible retrieval software and differing CONFIG.SYS files, which push up staff training costs; and, perhaps worst of all, very labour-intensive user instruction because of the poor quality of software and/or documentation.
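
To make the CONFIG.SYS point concrete, here is a minimal illustrative sketch of the sort of DOS settings a single retrieval package can demand; the driver filename and the particular values are hypothetical, and a second product's installation routine may well insist on different ones, which is exactly where the extra staff training and support cost creeps in.

In CONFIG.SYS:

   FILES=40
   BUFFERS=30
   DEVICE=C:\CDROM\VENDOR.SYS /D:MSCD001

and in AUTOEXEC.BAT:

   MSCDEX.EXE /D:MSCD001 /L:E

A workstation set up in this way for one product may need its FILES and BUFFERS values, or its device driver line, altered before a different product will run, so each PC tends to require individual attention.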

The second issue is response times. These are inevitably and noticeably poorer than those from networked mainframes. While the CD-ROM market remains tied to audio-drive technology and in its present state of limited development, manufacturers will concentrate on market penetration rather than technical progress.

Thirdly comes networking, perhaps the area which concerns libraries most. There are now plenty of local area network experiments, but typically these have supported only three or four users. The most ambitious project I know of is at the University of Utah in Salt Lake City, with twenty-one CD-ROM drives on one server feeding seven workstations. There is talk of network bridges, but so far only talk. Ohio State has linked four PCs to a network based on a PC server, but note that this allows only four users. Norwich University in Vermont has a small network bridge, but this links only two PCs at 1200 baud. Needless to say, these developments are user-driven, with little obvious incentive for manufacturers to reduce sales by encouraging networking.

Fourthly comes data quality. There are suggestions from the United States that quality control is becoming an issue: faulty discs are turning up with bad blocks of data, which seem to relate to pre-production mastering rather than to disc production.

Fifthly, the lack of currency of the data is a well-understood and accepted problem - but a problem nonetheless. The more often discs are produced, the more expensive the data becomes.

Next comes price. Even stable datasets tend to be expensive - at least from the commercial sector. You can get the Thesaurus Linguae Graecae for US$60, but commercial products often cost thousands of dollars because they are aimed at a narrow institutional market. To add insult to injury, the requirement of some suppliers that old discs be returned when new ones appear leaves a sense of unease.

Then there are the management difficulties, which abound: disc security; viruses; charges for printout or floppy discs; booking systems; systems support; PC sizing; noise in the library; and so on. All of these need to be addressed by library managers who are already overburdened with other difficulties.

Then there is the relation of file size to disc capacity. The capacity may seem impressive, but it too poses problems: many products do not fit on a single disc, and as the price/performance of other storage media improves, those media look increasingly attractive.

Finally, and to cap it all, search software is neither wonderful nor compatible from one product to the next.

That constitutes a fairly extensive list of complaints. In one sense they may not matter, because all are capable of solution and many are being actively addressed. However, market penetration is slow: at the end of 1989 there were only 250 players in UK universities. Unless there is rapid change, I begin to wonder whether competing technologies may not take some of the shine off CD-ROM. And it is important to recognise that CD-ROM is a competing technology, not a new one with no rivals: online searching and even the printed word compete with it.

The Computer Board has begun to take an interest in data and is set to pump some hundreds of thousands of pounds each year into end-user searching. By the end of the year all ISI databases should be freely available over JANET, with up to 300 concurrent users possible on a 50 gigabyte file. There are rumours this week of similar deals in the United States, with AGRICOLA and ERIC becoming available over BITNET. There will be all sorts of possibilities: weekly data refreshment, downloading of subsets of the file, stored searches and so on. CD-ROM will have to work very hard to compete with that.

The breakthrough that CD-ROM needs is to become an end-user product rather than an institutional one. But there is little evidence that big suppliers like IBM or Apple are ready to add CD-ROM drives to their standard products, and even if they did, the most likely outcome is that users would want writable discs on which to create their own files.

I have no doubt that CD-ROM has, at the very least, a future in replacing microform for stable, unchanging datasets such as the old British Library Catalogue - although even there I gather some problems exist in recording changed pressmarks. I also suspect that there is a big future for mixed media, where new services mixing data, graphics and sound will appear. However, I increasingly think that unless the networking issues in particular are resolved, CD-ROM will face substantial competition from the great leaps being made - and, very importantly, centrally funded - in wide area networking.

At the moment, then, I see problems arising faster than solutions.