Back to Basics: A-literacy, the Boolean gene, Convergence and the long tail

Derek G. Law

Centre For Digital Library Research, University of Strathclyde

Abstract:

Purpose: Based on a paper given at the Fiesole Retreat held in Lund in July 2006, this article explores the future relevance of libraries in a world dominated by the web, and asks how far “off-web” resources will retain any relevance for users.

Design/methodology/approach: The article combines narrative and analysis.

Findings: Libraries are slow to respond to external competitors and cultural changes, but their own practices paradoxically leave them well equipped to make such responses.

Originality/value: Challenges libraries to build on existing experience and skills in the Web 2.0 world

Key Words: Folksonomies, Social Networking, Collection development

Paper Type: Viewpoint

Aliteracy is a useful concept for illustrating much of what is going on over the web at present. It should not be confused with illiteracy, or even with dumbing down. Of course dumbing down exists. As a regular buyer of books at my local bookshop, I felt moved to complain a few weeks ago when they relaunched Dan Brown’s Da Vinci Code to coincide with the launch of the disastrous film. They also released a new action man: yes, Leonardo Da Vinci. Not an artist and a man of letters, but an action man after whom the much better known Leonardo DiCaprio was named, according to the retailer’s blurb. Now that’s dumbing down. Or take the latest shake-up at the BBC, where changes are being made to make the BBC “the most creative organisation in the world”. The press release states that the intention is to make the BBC “ready for 360 degree multi-platform content creation”. As a first step, Television becomes Vision and Radio becomes Audio.

This perhaps comes close to my real worry, which is aliteracy. The fact is that clever, well-informed people no longer find libraries essential, and that there are now two worlds – on-web and off-web. It is at least theoretically possible to gain a PhD without reading anything, at least in science and technology. You formulate a hypothesis, write some software, run various machines and use more software to analyse the results. The literature review requires cut-and-paste skills, not reading skills. That is the world libraries increasingly face.

The question of what libraries should do in the face of computers is not new. I first heard it posed as a problem by Jimmy Thompson twenty years ago in his seminal book The End of Libraries (Thompson, 1982): “The Librarians and Libraries that do not accept the change will inevitably be victims of the evolution. For the dinosaurs it will indeed be the end.” He quoted Fred Lancaster: “We are already very close to the day in which a great science Library could exist in a space less than 10 feet square” (Lancaster, 1978). That day has arrived.

Despite the growing evidence of the need for a fundamental rethink of the role and place of libraries, commentators have reached for the comfort blanket of the library as a place and for the precedents of history. The library as place has been variously pictured as the last remaining substantial social space in universities; as the last remaining public place of trust in society; and, in the case of public libraries, as somewhere young children can be left in the care of strangers while parents shop. The precedents of history trace a 4,000-year path from the oral tradition through tablets of stone to papyri, the printed word and even sound and film. We comfort ourselves that through these 4,000 years of history we have often been buffeted by great waves of change, but never yet capsized. Librarians will have their own favourite statistics: there are still five times as many library cards as Amazon users; there are more libraries than McDonald’s outlets in the United States of America; one person in six in the world is a registered library user; there are over one million libraries and over 700,000 librarians.

Yet increasingly these look like little more than comfort blankets. The Google Library project plans to digitise some thirty-six million volumes over the next few years (Andersen, 2004), and there are powerful legal and economic arguments which seem to assure its success (Varian, 2006). Who will then need even a large university library? The aliterate hive mind ignores the off-web in favour of the big gravitational hubs of the Internet, and these hubs are increasingly the places on top of which other people build systems and services. The big upsurge in social networking services is based on blogs, wikis, instant messaging and other tools which are creating new spaces where services are being built, spaces which are quite foreign to libraries. YouTube has acquired twenty million users a month in eighteen months, and they watch one hundred million video clips a day; MySpace has one hundred million users and the number is growing by 240,000 a day; and Google receives one billion requests a day (Naughton, 2006). Even our own professional thinking and development uses these tools. The latest big thing is social networking. Most of the professional interest has concentrated on folksonomies (Tonkin, 2006), although tellingly little of that makes its way into print and most of the discussion centres around blogs. Unlike social-networking sites such as LinkedIn and Friendster, which concentrate on developing relationships, sites such as del.icio.us, 43Things and Flickr focus their attention on organising data. Users organise their own or others’ data in the public sphere, and the social, or community, aspects arise from there as users share and seek out like-minded individuals. And if even classification is under threat, what is left?
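
It is worth pausing on how little machinery such tag-based organisation actually needs. The sketch below is illustrative only, with invented users, items and tags rather than real del.icio.us or Flickr data: each user attaches free-text tags to items, and both the organisation of the data (which items share a tag) and the community (which users share an interest) fall out of the same flat records, with no classification scheme imposed in advance.

```python
from collections import defaultdict

# Invented example records: (user, item, tag) triples of the kind a
# del.icio.us- or Flickr-style service accumulates from its users.
taggings = [
    ("ann", "http://example.org/map-of-glasgow", "glasgow"),
    ("ann", "http://example.org/map-of-glasgow", "maps"),
    ("bob", "http://example.org/jordanhill-photo", "glasgow"),
    ("bob", "http://example.org/jordanhill-photo", "railways"),
    ("cat", "http://example.org/lund-cathedral", "sweden"),
]

# "Organisation": items grouped by the tags users chose for them.
items_by_tag = defaultdict(set)
# "Community": users grouped by the tags they use, from which
# like-minded individuals can be suggested to one another.
users_by_tag = defaultdict(set)

for user, item, tag in taggings:
    items_by_tag[tag].add(item)
    users_by_tag[tag].add(user)

print(sorted(items_by_tag["glasgow"]))   # everything tagged 'glasgow'
print(sorted(users_by_tag["glasgow"]))   # users who share that interest
```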

Now, at least in the academic sphere, not all social networking poses a threat. MySpace is the biggest website in the US, having overtaken Yahoo. It has a hundred developers but produces no content of its own, and this shows in the quality of what its users provide. Take a recent, randomly searched question on the Great Northern War and the [typical] response of Metalrat from Houston:

“Unfortunately, it turned out that Charles the 12 LOVED war and was really good at it. He pretty much kicked everybodies ass at the beginning. Scared the crap out of Peter the Great on a few occasions to the point he fled a few battlefields and ran all the way back to St Pete. Unfortunately Charles loved war too much and didn't know how to stop. Invading Russia in the late fall probably wasn't a good idea and he ended losing everything at the battle of Poltova. Fun guy though if you like war maniacs” (Metalrat, 2006). Accurate enough, but hardly the stuff of which term papers could be made.

Much more interesting as a community effort is Wikipedia. Jordanhill Station in Glasgow has the unusual distinction of being the one millionth entry on Wikipedia. The entry was begun on 1st March 2006 with a single sentence; within 24 hours it had been edited 400 times and expanded to an entry that prints out at five pages. There is no such entry in Encyclopaedia Britannica, which at 120,000 entries is barely 10% of the size. Wikipedia is currently the 17th most popular site on the Internet, taking 14,000 hits a second, and it is much more up to date than Britannica: the first entry on the 2006 Israel-Lebanon conflict appeared on the wiki within six hours of the capture of the two Israeli soldiers by Hezbollah. The argument rages as to accuracy and whether a thousand amateur administrators can provide adequate quality control – or, as Jorge Cauz, president of the Encyclopaedia Britannica, recently put it, “Wikipedia is to the Encyclopaedia Britannica as American Idol is to the Juilliard School” (McGinty, 2006). This seems to me to miss the point entirely. Those of a certain age will remember the format war over video recorders, where the consensus was that Betamax was technically far superior to VHS. But VHS was the popular choice and the financial success – except that both were overtaken by DVD.

The underlying issue for libraries is not an overload of information but a shortage of attention for the abundance of information. This is as true of research as of teaching, where we increasingly want to gather, create and share. We are only just beginning to understand how data flows through the research process, from research bids and bid management to human resource management and research outcomes. Instead of the historic position in which users adapted their workflow to the library, visiting us at fixed times, we now have to adapt to their workflow.

Historically, library collections were built for future users rather than present ones, certainly in the humanities. It was, and still is, the case that research libraries collect more non-commercial items than commercial ones. A significant present failure is the failure to engage with collection policy for born-digital material. There is no real debate on what should be collected and by whom, and as a result valuable material is already being lost. Our successors will rightly blame us for this. Libraries should collect the born-digital material which will give us brand differentiation. We can see some elements of this, although not yet with born-digital material, in such deep archives as the immensely rich Valley of the Shadow, which pulls together resources on the American Civil War from a range of media. As was always the case in the text-based age, it will be our special collections and archives of electronic materials which give libraries both purpose and brand differentiation. Dempsey (2006) has further argued that any salvation must lie in aggregating these resources to turn libraries into a major gravitational hub.

In some ways the library community is already very advanced as a broker organisation. It has a developed ecosystem of resource sharing, supported by shared and standardised cataloguing, messaging and delivery services and by reciprocal access. It is accustomed to depending on others for services through a commonly created infrastructure. However, more recent library development has drifted away from this sharing, so that in the UK there is no truly national union catalogue and only a fragmented resource-sharing infrastructure. We have lost the lessons of the past and need to rediscover the importance of aggregation. But the first building block will be an understanding of how we create, build and collect electronic collections locally.
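
One modest, concrete example of that commonly created infrastructure is the family of open retrieval standards that many catalogues already expose, such as SRU (Search/Retrieve via URL), which allows a union catalogue to be queried with nothing more than an HTTP request carrying a CQL query. The sketch below is indicative only: the endpoint address is hypothetical, and a real service would publish its own base URL, supported version and record schemas.

```python
import re
import urllib.parse
import urllib.request

# Hypothetical SRU endpoint for a union catalogue; a real service
# publishes its own base URL, version and record schemas.
SRU_BASE = "http://sru.example.org/catalogue"

def sru_hit_count(query, max_records=5):
    """Send a standard SRU searchRetrieve request and return the number
    of matching records reported by the server."""
    params = {
        "operation": "searchRetrieve",
        "version": "1.1",
        "query": query,                  # a CQL query string
        "maximumRecords": str(max_records),
    }
    url = SRU_BASE + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as response:
        xml = response.read().decode("utf-8")
    # The response is XML; pull out the <numberOfRecords> element,
    # whatever namespace prefix the server has chosen.
    match = re.search(r"<(?:\w+:)?numberOfRecords>(\d+)<", xml)
    return int(match.group(1)) if match else 0

# Example call (would only work against a real endpoint):
# print(sru_hit_count('dc.title = "long tail"'))
```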

Another way of looking at this comes through the so-called long tail, postulated by Chris Anderson (2006) as a way of describing the niche marketing developing around big hubs such as Google, Yahoo and Amazon. But if it is new as a concept, it is not new as a practice. “Libraries were into long tails before long tails were cool. Any library stocking more than a few thousand titles (i.e., the vast majority of libraries) knows all about the long tail. In fact, most large libraries have collections that extend far beyond the utmost limits of the longest tail. In other words, many items in their collections have not been used since added. Perhaps some libraries, in an effort to boost circulation statistics, have focused too much on the "heady" end of their collections. Rather than cater to the clamorers for [Dean] Koontz, perhaps libraries should cultivate more long-tail usage. If the long-tail phenomenon is here to stay, perhaps the 80/20 rule (that 20 percent of the collection accounts for 80 percent of the use) will become increasingly suspect” (Peters, 2006).
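
To make the arithmetic behind that closing suggestion concrete, the short sketch below simulates circulation across a notional collection and asks what share of use the most-borrowed fifth of titles accounts for. It is an illustration only: the collection size and the Zipf-style usage distribution, including its exponent, are assumptions rather than real circulation data. A steep distribution reproduces something close to the 80/20 rule; a flatter, longer-tailed one undermines it.

```python
import numpy as np

def head_share(n_titles, exponent, head_fraction=0.2):
    """Share of total use accounted for by the most-used `head_fraction`
    of titles, assuming use per title follows a Zipf-like distribution."""
    ranks = np.arange(1, n_titles + 1, dtype=float)
    use = ranks ** (-exponent)          # hypothetical use per title, by rank
    use /= use.sum()                    # normalise to shares of total use
    head = int(n_titles * head_fraction)
    return use[:head].sum()

# A steep distribution (use concentrated on bestsellers) versus a flatter
# one (more long-tail usage), for a notional 100,000-title collection.
for exponent in (1.0, 0.6):
    share = head_share(100_000, exponent)
    print(f"exponent {exponent}: top 20% of titles -> {share:.0%} of use")
```

On these assumed numbers, the steep case gives the top fifth of titles roughly ninety per cent of all use, while the flatter, longer-tailed case gives it only around half.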

I said earlier that we should collect the born-digital material which will give us brand differentiation. In principle that’s a rather easy and autonomous thing to do. The interesting questions are whether we can then follow Dempsey’s advice to aggregate our resources, whether we can find sensible kitemarking to inform users – and perhaps whether we can build joint services with publishers so that the combination of content, quality assurance, lack of bias in selection and ease of access assures us of a place on-web.

References

Andersen, Deborah Lines. Benchmarks: The Google Library. Journal of the Association for History and Computing, 7(3), 2004.

Anderson, Chris. The Long Tail: Why the Future of Business Is Selling Less of More. New York: Hyperion, 2006.

Lancaster, F.W. Toward Paperless Information Systems. New York: Academic Press, 1978.

McGinty, Stephen. Online amateurs beginning to put ‘Britannica’ under pressure. The Scotsman, 1st August 2006, p.27.

Metalrat. Fighting on Five Fronts. http://forum.myspace.com/ 2006.

Naughton, John. Websites that changed the world. The Observer, supplement ‘15 Years of the Internet’, 2006, p.4.

Peters, Tom. The Long Tail Wags the Dog. ALA TechSource blog, 7th July 2006. http://www.techsource.ala.org/blog/2006/07/the-long-tail-wags-the-dog.html

Thompson, James. The End of Libraries. London: Bingley, 1982.

Tonkin, Emma. Folksonomies: The fall and rise of plain-text tagging. Ariadne, 47, 2006. http://www.ariadne.ac.uk/issue47/tonkin

Varian, Hal R. The Google Library Project. Technical report, UC Berkeley, 2006.

(Professor Derek Law is Head of the Information Resources Directorate at the University of Strathclyde and is a founder of the Centre for Digital Library Research there. He has published some 200 articles, conference papers and book chapters focused mainly on developing electronic resources, digital libraries and national information strategies.)