
Information Management: Future Perfect or Past Imperfect?

By Derek G. Law

__________________________________________________________

By common consent we are entering the third great revolution of humankind, the information revolution. The UK Prime Minister, Tony Blair, is quoted in the preface to this volume as comparing the importance of the information revolution to that of the agricultural and industrial revolutions. That comparison is often made but it conceals an important truth. The key feature of the agricultural revolution was not the use of the plough to allow planting, but the development of wheeled carts, which allowed the distribution of food and the development of trade. The key feature of the industrial revolution was not the invention of the steam engine, but the development of the network of canals and railways, which allowed the distribution of raw materials and finished products and so allowed economies to develop. In the same way, the key feature of the information society is not the computer, but the development of networks which allow the global distribution of information. In every case development has sprung from the distribution mechanism.

Universities have always been international in their nature and ambitions and, more recently, further education colleges have developed international links. They also possess some of the most advanced networks in the world. Information is the foundation of education. As this book has shown, it permeates every aspect of university and college life and its creation, dissemination, understanding and preservation are what we all do. Yet the very fact that it is pervasive allows us to take it for granted and to ignore the threats to our traditional lifeblood. For the first time multinational corporations have begun to buy up information, while several companies (Motorola, Ford and British Aerospace, for example) have begun to explore setting up “universities”. At the same time, we continue to allow staff to hand over all rights in the information we create. European legislation which protects rights-holders rather than users is growing, and many practices we have taken for granted about the use of information are being eroded.

There is an interesting analogy with the privatisation of the UK water companies. Like information, water was, and is, an essential commodity that was taken for granted and, if not free, was so cheap that it might have been. Then came privatisation. Prices rose dramatically: users were cut off for non-payment. Supplies dried up through a mixture of incompetence and underinvestment; private profit replaced public good in all sorts of environmental areas; rivers were sucked dry; the fact that the water companies were now in the private sector meant that the government found itself almost powerless to intervene. The comparison with the increasingly aggressive private information sector is all too obvious.

Some institutional leaders also adopt a dangerously complacent attitude, assuming that the Internet will somehow provide ready access to information. They run the risk of forgetting that education is at least as much a producer of information as a consumer and that creating a model of teaching or research which relies on students behaving like mouse potatoes concedes the battle to our emerging competitors. This is all the more ironic since the Internet as it is at present is almost wholly inimical to academic pursuits.

Internet Meltdown?

A whole host of problems surround the electronic equivalent of bibliographic control. Some are a function of the medium, while others are variations of old issues. A thread which runs through all of the problems is the failure of the academy to recognise that the problems exist. The Internet is seen as a great and liberating development, but it is not a neutral development and requires very substantial international effort if it is to be made usable for sustained scholarly communication rather than short-term gratification. The problems begin at the most basic levels.

The very act of naming and identifying electronic objects consistently is fraught with difficulty. A book or a filed letter is a static object which does not change over time. In an electronic environment there is a need to reference objects as they move and change over time and place. The temporary nature of URLs is notorious and it has been claimed that they have an average life of 75 days. Even where the URL remains constant, issues of version control and quality assurance remain unresolved. The seriousness of this problem cannot be overemphasised, for the continuity of citation is central to scholarship and without it scholarship cannot flourish. Some attempts are being made to deal with this problem, the current favourite being digital object identifiers. These originate from the commercial publishing world and it is not yet clear whether they have validity and applicability beyond the commercial sector. A significant if unquantified proportion of the material held in any organisation and in any medium is either non-commercial or out of copyright and any new system must be able to embrace everything from incunables to examination papers.
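
The principle behind identifier schemes such as digital object identifiers can be sketched very simply: citations use a permanent name, and a resolver maps that name to whatever the current location happens to be, so the citation survives when the object moves. The sketch below is purely illustrative Python, with invented identifiers and URLs; it is not the API of any real resolution service.

```python
# A minimal sketch of persistent-identifier resolution: published citations
# use the permanent name; only the resolver's table changes when an object
# moves. (Illustrative only -- the identifier and URLs are invented.)

class Resolver:
    def __init__(self):
        self._table = {}  # permanent identifier -> current location

    def register(self, identifier, url):
        """Record (or update) the current location of an object."""
        self._table[identifier] = url

    def resolve(self, identifier):
        """Return the current location for a permanent identifier."""
        return self._table[identifier]

resolver = Resolver()
resolver.register("doi:10.9999/example.article", "http://old-host.example/a.html")

# The journal changes hosts; every citation in print remains valid because
# only the resolver's entry is updated.
resolver.register("doi:10.9999/example.article", "http://new-host.example/article")
print(resolver.resolve("doi:10.9999/example.article"))
```

The unresolved questions in the text, of course, are organisational rather than technical: who maintains the table, for how long, and for whose material.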

The issue of naming objects is also difficult and as yet unresolved. At present anyone can name an object with no obligation to maintain names over time. This is compounded by the fact that many of the reference points we take for granted in the print world disappear. A book published by Oxford University Press implies a set of values, standards and scholarly rigour that is understood. But an address incorporating the phrase “ox.ac.uk” could be anything from a university press to a student PC in a rented room. The persistence of object names is a long way from having a settled structure – and there is little evidence that the official bodies in scholarship understand the threat this poses.

Metadata and the description of objects is in rather better case. The Dublin Core standard first produced by Stu Weibel at OCLC (Online Computer Library Center) has very rapidly developed international acceptance, with participation in standards work from Europe, the USA and the Pacific Rim. But even here much work remains to be done. Cataloguing has historically described static and largely immutable objects. The Internet offers new genres of multimedia and even services which will require appropriate description. This work remains to be developed.
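
A flavour of what Dublin Core description involves is given by the sketch below, which builds a record from a subset of the fifteen DC elements and serialises it as HTML meta tags, one common early convention for embedding Dublin Core in Web pages. The element names (DC.Title and so on) follow that convention; the values are invented for illustration.

```python
# A sketch of a Dublin Core record for a networked resource, using a subset
# of the fifteen DC elements, rendered as HTML <meta> tags for embedding in
# a page head. Values are invented placeholders.

DC_RECORD = {
    "DC.Title":    "Information Management: Future Perfect or Past Imperfect?",
    "DC.Creator":  "Law, Derek G.",
    "DC.Type":     "Text",
    "DC.Format":   "text/html",
    "DC.Language": "en",
    "DC.Rights":   "Restricted to licensed institutions",  # terms of use
}

def to_meta_tags(record):
    """Render a DC record as a block of HTML meta elements."""
    return "\n".join(
        '<meta name="%s" content="%s">' % (name, value)
        for name, value in record.items()
    )

print(to_meta_tags(DC_RECORD))
```

Note that the DC.Rights element hints at the problem raised in the next paragraph: unlike a book, an electronic object must carry its terms and conditions of use with it.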

Unlike the book, electronic materials must also have their terms and conditions of use described. Many will have multiple copyright permissions, many will be licensed rather than published, many will have restrictions on categories of users – and these will vary according to the terms of sale rather than being inherent in the product. Although the initial success of the Dublin Core gives confidence that these problems can be resolved, a great deal of international effort will be required to create a usable system.

Searching and indexing have proved much more difficult technically than the designers of Web robots would have us believe. Web indexing systems are failing as their architecture buckles under the weight of data. It is increasingly common to undertake a search on Lycos, Excite or Infoseek and recover hundreds of thousands of hits in apparently random order. Much work is going on here, but designers despair at the inability or unwillingness of the public to master Boolean searching, and most systems still have a long way to go to beat a half-way competent reference librarian. Web searching has undoubtedly transformed the ability of searchers to acquire a whole range of current reference information, but it is dramatically poor at discovering scholarship and research.
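
The Boolean searching that the public resists is, mechanically, nothing more than set operations over an inverted index. The toy example below, with invented documents, shows the machinery: AND is intersection, OR is union – obvious to a reference librarian, opaque to most casual searchers.

```python
# A toy inverted index illustrating Boolean searching: AND is set
# intersection, OR is set union. Documents and terms are invented.

docs = {
    1: "metadata standards for electronic libraries",
    2: "electronic commerce on the internet",
    3: "preservation of electronic scholarly information",
}

# Build the inverted index: term -> set of document ids containing it.
index = {}
for doc_id, text in docs.items():
    for term in text.split():
        index.setdefault(term, set()).add(doc_id)

def search_and(*terms):
    """Boolean AND: documents containing every term."""
    sets = [index.get(t, set()) for t in terms]
    return set.intersection(*sets) if sets else set()

def search_or(*terms):
    """Boolean OR: documents containing any of the terms."""
    return set.union(set(), *(index.get(t, set()) for t in terms))

print(sorted(search_and("electronic", "preservation")))  # -> [3]
print(sorted(search_or("metadata", "commerce")))         # -> [1, 2]
```

A bare keyword search is effectively a one-term OR, which is why it returns the "hundreds of thousands of hits in apparently random order" described above.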

Unlike the print world, the electronic one will require validation of the rights of the user. User authentication is regarded as an essential element of electronic commerce, but it too lacks basic elements for the furtherance of scholarly activity. At present there are no good ways for proving membership of the ‘data club’ when away from the parent institution. Scholars visiting another institution, students on vacation or researchers on field trips are difficult to validate. There is then a very knotty problem surrounding usage data. On the one hand commercial publishers wish to collect usage information as a marketing tool. They are, however, unwilling to release this information to libraries so that they can judge whether usage justifies subscription. On the other hand, many users do not wish anyone to know what they are reading or researching. Traditionally, organisations have preserved the anonymity of user data except where criminal acts are suspected. Is this a right, or simply a custom?

Then there is a series of issues and old battlegrounds to revisit. Rights management systems are growing quickly and are promoted largely by commercial concerns. They provide many areas of philosophical contention. As mentioned above, the question of whether the user can remain anonymous conflicts with commercial need. The issue of preservation remains technically, legally and operationally unresolved. Historically this has been the domain of the national libraries for published information and of the originating organisations for their own archives, but it is not clear that they will or can perform the same role in an electronic environment. We cannot reasonably expect preservation to be undertaken by publishers. Furthermore, the whole issue of fair use is being revisited by publishers, some of whom declare that it does not or cannot exist electronically. Major battles need to be fought on these issues, again with little evidence that the academy understands or cares about them.

The preservation and archiving of electronic information has barely surfaced as an issue, yet it is a very complex one. The Data Archive at the University of Essex has existed since 1967 and has perhaps as clear a picture as anywhere of the so far intractable problems of storing, refreshing and kitemarking information. The problems are staggeringly complex technically and staggeringly expensive to resolve. Although some progress is being made on the legal deposit of commercial material, little appears to be done on the non-commercial and primary materials of scholarship. There are no standards, control or approval mechanisms for institutions or data repositories. This position may be compared with that of traditional archives in the United Kingdom, where repositories are expected to meet the BS5454 standard, where the Historical Manuscripts Commission takes an active interest in their state and where archivists have specialist professional training. A new class of electronic material, what Clifford Lynch of CNI (Coalition for Networked Information) has called ‘endangered content’, is emerging, where the formal and informal records of disciplines are effectively at risk through neglect 1. Archives collect papers, but institutions do not sample or preserve the e-mail or word-processed files of their scholars. Lab books are routinely preserved by scientists, but it is doubtful if any institution has a policy for the preservation of digitally captured images or data from research equipment.

Network topology is barely discussed as an issue due to a naïve assumption that there will be an infinitely expanding amount of bandwidth which will somehow be made available to scholarship. And yet there is no evidence to support this view. US universities have abandoned the failing Internet provided by telecommunications companies to create Internet II as a private network attuned to their needs. In Europe the relatively modest ambition of the European Union to link existing research networks through the TEN-34 Project has been “shaped by a series of non-technical influences such as non-availability of required public services” 2, while “standard PNO (Public Network Operator) services in Europe could not fulfil the requirements of the R&D community in Europe” 3. Equally the assumption that we accept a simple commercial approach to network planning is questionable. At present in the UK, bandwidth is acquired in the light of use rather than as a result of scholarly or educational policy decisions. Thus bandwidth expands at a great rate to the east coast of North America to meet traffic growth. There is almost no debate on whether policy should drive such acquisition and route bandwidth to, say, southern Africa, then India, Singapore, Australia and then the west coast of the USA, opening up markets and scholarship to what is sometimes called UK Higher Education Limited. There is a creeping form of cybercolonialism in the assumption that only the USA has digital material of value to the world. It is interesting to note the recent decision of the Australian vice-chancellors to use network charges to discriminate against overseas Web sites and in favour of Australian ones 4. No discussion appears to take place of how the products and output of small learned societies are to be mirrored around the world and what standards and quality control will apply to mirror sites.
Again the scholarly community is silent while the commercial giants of the STM world dictate the shape of scholarly communication – despite the fact that large scientific publishers are aberrant rather than the norm.

Nor is the network yet totally robust. A Dilbert cartoon pointedly and uncomfortably accurately suggested that all of the time saved through automation in the information age had been lost by people sitting at Web browsers waiting for pages to load. Networks do not yet, for example, give the reliable quality of service required for multicasting, while video clips have all the power, quality and assurance of early silent films. It should be self-evident that for research institutions working at the leading edge of scholarship, and indeed telecommunications, the standard service provided by Internet service providers will always be inadequate.

A more positive element which is emerging in the electronic era is the broadening of what constitutes content. Services such as the Arts and Humanities Data Service, 5 based at King’s College London, or the excellent SCRAN (Scottish Cultural Resources Access Network) project 6, funded by the museums of Scotland, are much involved in the digitisation of museum and archive collections. This is happening fast and brings relevant experience in activities such as new licensing models and standards. It also highlights the role of curators in the digital environment as relating to presentation as well as preservation. But again there appears to be little concerted effort by the official organs of scholarship to build formal cross-domain linkages.

It is increasingly pertinent to question whether the Internet is the most appropriate general vehicle for information provision. This is a recognition that unrestricted international access is not a good use of resources and that a combination of local resources, mirror sites and caches can be more effective.

The Internet is wonderful when it works, but for large numbers of resources it is almost unusable. Very little thought has been given to alternative models in the rush to connect to what is good. The irony in this is that we have an excellent model for an alternative strategy in the classic library. It is a paradox of networks that electronic resources may make it worth reconsidering returning to a holdings strategy rather than an access strategy for information. As the cost of filestore drops and becomes competitive with bandwidth costs, it may be proper to acquire and hold information locally.
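
The holdings-versus-access trade-off is, at bottom, simple arithmetic: a one-off (or annual) storage cost against a bandwidth cost paid on every fetch. The back-of-envelope sketch below makes the point; all figures are invented placeholders, since real storage and transfer costs vary enormously and shift over time.

```python
# A back-of-envelope sketch of the holdings-versus-access trade-off:
# the one-off cost of storing a resource locally versus the recurring
# cost of fetching it over the network. All figures are invented.

def annual_cost_local(size_mb, storage_cost_per_mb_year):
    """Holdings strategy: pay for local filestore, fetch once."""
    return size_mb * storage_cost_per_mb_year

def annual_cost_remote(size_mb, fetches_per_year, transfer_cost_per_mb):
    """Access strategy: pay bandwidth for every fetch."""
    return size_mb * fetches_per_year * transfer_cost_per_mb

size = 500.0    # MB, a hypothetical resource
storage = 0.02  # cost units per MB per year (assumed)
transfer = 0.01 # cost units per MB transferred (assumed)

for fetches in (1, 5, 50):
    local = annual_cost_local(size, storage)
    remote = annual_cost_remote(size, fetches, transfer)
    better = "hold locally" if local < remote else "access remotely"
    print("%3d fetches/year: local %.1f vs remote %.1f -> %s" % (fetches, local, remote, better))
```

The crossover is what matters: a resource consulted rarely is cheapest left at its remote site, but as usage grows, or as the cost of filestore falls relative to bandwidth, local holding wins, which is precisely the classic library's logic.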

What then are the main messages from this book?

· The time is ripe for colleges and universities to play an increasingly significant role in the learning age: in lifelong learning and in the local, regional and national political, cultural and economic agenda.

· Effective and efficient management of information and the move towards knowledge based approaches will be crucial for further and higher education to manage within a context of change and growth and to contribute fully to networked learning opportunities.

· Mature information strategies, which consider the whole way in which information is created, purchased, accessed, managed, exploited and disposed of within individual institutions, together with institutional culture, the way people work and questions of information ownership, are fundamental to future effective growth and development in further and higher education.

· Countries throughout the world are trying to transform themselves into information societies; societies that can grow and prosper through the more effective use of information and its associated technologies. But we must recognise that the technology of global communication brings with it a considerable threat to national and local cultures. There is a danger that the distinctive European culture, in all its richness and diversity, will become smothered by a bland global culture, manufactured in Hollywood and designed to meet the desires of the majority.

· There is a real danger that the information society will reinforce and deepen social exclusion rather than help to overcome it. We already have a European society in which the minority are excluded from the mainstream of social benefit through poor education, lack of employment, low incomes, disability and poor housing. We need to ensure that the creation of an information society does not exclude them further.

· Easy access to information as a result of improved availability of IT in colleges and universities has increased the expectation of the internal customer, so that administrators have had to make more and more internal and external management information available to all. This has enabled responsive, devolved management and administration, the delegation of information ownership and the increasing use of more flexible working patterns. The role of the administrator is changing towards one of partnership with the ‘customers’: adding value through the provision of the highest-quality information and data, together with analysis and commentary, and providing the tools and infrastructure to access appropriate information in a timely way.

· Information overload needs to be countered by the development of criteria that differentiate good and poor quality information.

· ICT offers great possibilities for making information seeking more comprehensive and richer, presenting results in motivating and informative ways, making research more democratic and pointing the way to more effective information seeking; however, information stress, over-hasty abandonment of traditional libraries and an erosion of the quality of information may counteract the benefits.

· A number of different technologies have a place in the learning cycle and each can add value to the students’ learning experience but face-to-face communication and traditional teaching methods should not be forgotten; a balanced, managed approach is necessary to ensure quality of learning.

· Intellectual property issues need to be adequately understood and addressed in both print and other media forms and the knowledge base preserved.

· The UK government’s lifelong learning agenda will require more ‘joined-up’ thinking, particularly with regard to library and information resources and the need for collaborative solutions to cross-sectoral use of libraries.

But above all perhaps, the message of all the authors is that although information is a staple of educational life, it is one whose taste and flavour can be improved with constant care and attention.

Notes

1. Discussed in an unpublished paper given at the European Union Telematics Conference in Barcelona, February 1998.

2. M. Behringer (1997) The implementation of TEN-34. Paper presented at JENC8, the eighth annual Joint European Networking Conference, May, and later published in DANTE IN PRINT, 28, at: http://www.dante.net/pubs/dip/28/28html

3. Ibid.

4. News report in The Times Higher, 1322, 6 March 1998.

5. http://www.ahds.ac.uk

6. http://scran.ac.uk/