Newly Developed Websites (from January 1, 2015):
Introduction
The International Open Public Digital Library (IOPDL) will consist of newly published collections and the existing qualitative and quantitative collections of Well-Designed Digital Libraries (WDDLs) in many subject areas. To determine the WDDLs, evaluation tools and methods were investigated. The main finding of this investigation is that there is no appropriate method for evaluating several aspects of performance together; existing methods evaluate only one or two specific aspects. Thus, one combined method is suggested to meet the general requirement of the IOPDL: to evaluate multiple aspects of existing digital libraries together with the suggested evaluation criteria, the Content, Usability, and Performance Evaluation (CUPE) criteria. More details about the evaluations are in the paper, Evaluating Digital Libraries (Jin, 2014). The evaluation is limited geographically, mostly to digital libraries in the USA, and temporally to 2010.
The suggested CUPE criteria
The framework of the suggested CUPE criteria is:
The Content Evaluation Criteria
Accuracy - whether collections have accurate information in the subject area that the users can trust;
Coverage - adequacy of the scope of the collection, considering both breadth and depth;
Authority - how authoritative the site appears to be, based on the reputation of the organization or sponsors; and
Satisfaction - experts’ overall response to the digital library's collection.
The Usability Evaluation Criteria
The Performance Evaluation Criteria
The Content Evaluation with Content Quality Evaluation Criteria
First of all, we examine whether a digital library has a unique, specialized collection in one of the subject areas. Those subject areas are drawn from the Library of Congress Classification, listed below. To simplify the list, several similar subject areas are combined and a few subject areas are removed from the Library of Congress Classification.
The fifteen modified subject areas based on the Library of Congress Classification
With the Content Quality Evaluation criteria, Professor McDonough and I investigated which existing digital libraries are representative and authoritative in a subject area. In evaluating the content of existing digital libraries' collections, we put emphasis on whether each digital library satisfies the accuracy, coverage, authority, and satisfaction criteria. Finally, we recommend three to seven digital libraries in each of the fifteen subject areas as candidates for well-designed digital libraries; in total, sixty-two digital libraries are recommended. The recommended digital libraries in each subject area are listed in the Candidate DLs of WDDLs.
The Usability Evaluation with the Usability Evaluation Criteria
Accessibility
Accessibility evaluation in the Usability Evaluation Criteria investigates the limitations and the degree of errors in accessing a digital library. Several methodologies have been presented to evaluate accessibility. Among them, the W3C Web Accessibility Initiative provides many tools that evaluate accessibility automatically. I chose seven web accessibility evaluation tools based on standards and their unique characteristics. Each accessibility evaluation tool complies with one of the Illinois Information Technology Accessibility Act (IITAA), the Electronic and Information Technology Accessibility Standards (Section 508), the W3C Web Content Accessibility Guidelines (WCAG), etc.
The chosen seven tools are:
Cynthia Says
Etre Accessibility Check
Fujitsu Web Accessibility Inspector
Functional Accessibility Evaluator (FAE)
W3C Markup Validation Service
WAEX (Web Accessibility Evaluator in a single XSLT file)
WAVE (Web Accessibility Evaluation Tool)
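As a rough illustration of the kind of check these tools automate, the sketch below scans a page for images that lack alternative text, one of the WCAG requirements. It is only a minimal Python example, not a reimplementation of any of the tools listed above; the URL and function name are placeholders.

```python
from html.parser import HTMLParser
from urllib.request import urlopen


class AltTextChecker(HTMLParser):
    """Collects <img> tags that are missing a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "(no src)"))


def check_alt_text(url):
    # Fetch the page and report images without alternative text.
    html = urlopen(url).read().decode("utf-8", errors="replace")
    checker = AltTextChecker()
    checker.feed(html)
    return checker.missing_alt


if __name__ == "__main__":
    problems = check_alt_text("http://example.org/")  # placeholder URL
    print(f"{len(problems)} image(s) missing alt text")
```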
Interface Usability Evaluation
After the Accessibility evaluation, the sixty-two candidate digital libraries are evaluated again for their interface usability with three Usability Evaluation criteria: the Convenience, Interfaces' Consistency, and Visible Design and Aesthetic Appeal Evaluation criteria. We designed detailed check lists for each criterion so that, through the check lists, the usability of each digital library can be sufficiently evaluated against each criterion. Each criterion thus includes three or five evaluation check-list items. We also use a heuristic method to evaluate those digital libraries with the check lists of the three criteria. I spend some time reviewing each digital library, going through its interface several times, and then evaluate the digital library closely, inspecting whether it satisfies each check-list item. Each item is scored as one point when the digital library satisfies it, and each criterion is scored on a 5-point scale (a simple scoring sketch follows the check lists below).
Check Lists
Convenience (Ease of Use)
Interface’s Consistency
Visible Design and Aesthetic Appeal
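The following is a minimal scoring sketch in Python, assuming that each check-list judgment is recorded as a simple pass/fail value; the item names are placeholders rather than the actual check-list wording. Each satisfied item adds one point, so a criterion with five items is scored on the 5-point scale described above.

```python
# Hypothetical pass/fail judgments for one digital library; item names are
# placeholders, not the actual check-list wording used in the evaluation.
checklists = {
    "Convenience (Ease of Use)": {
        "item 1": True, "item 2": True, "item 3": False,
        "item 4": True, "item 5": True,
    },
    "Interface's Consistency": {
        "item 1": True, "item 2": True, "item 3": True,
    },
    "Visible Design and Aesthetic Appeal": {
        "item 1": True, "item 2": False, "item 3": True,
        "item 4": True, "item 5": False,
    },
}


def score_criteria(checklists):
    """Each satisfied check-list item earns one point for its criterion."""
    return {name: sum(items.values()) for name, items in checklists.items()}


if __name__ == "__main__":
    for criterion, score in score_criteria(checklists).items():
        print(f"{criterion}: {score} / {len(checklists[criterion])}")
```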
The Performance Evaluation with Response/Retrieval Time and Relevance Criteria
Response/Retrieval Time
Response/retrieval time is defined as how much time it takes to carry out tasks such as navigating or browsing links and searching for or obtaining resources. It is calculated as the average time that a digital library takes to process all requests. In detail, the suggested Response/Retrieval Time Evaluation Criteria measure the response time as the time it takes to access all of the links included on the home page (which I call 'the link response time') and the time it takes to return results for queries submitted to the search engine (which I call 'the search response time'). With the link and search response time criteria, both measurements are averaged over the issued requests to produce the response time scores for each candidate digital library.
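As an illustration only, the following sketch shows one way the link and search response times could be timed with Python's standard library; the URLs are placeholders, and the actual measurement procedure used in the evaluation may differ.

```python
import time
from urllib.request import urlopen


def timed_fetch(url):
    # Time a single HTTP request from start to fully reading the response.
    start = time.perf_counter()
    urlopen(url).read()
    return time.perf_counter() - start


def average_response_time(urls):
    # Average the per-request times over all requests, as the criteria describe.
    times = [timed_fetch(u) for u in urls]
    return sum(times) / len(times)


if __name__ == "__main__":
    # Placeholder URLs: links found on a digital library's home page.
    home_page_links = ["http://example.org/a", "http://example.org/b"]
    # Placeholder URL: a query submitted to the library's search engine.
    search_urls = ["http://example.org/search?q=digital+library"]

    print("link response time:", average_response_time(home_page_links))
    print("search response time:", average_response_time(search_urls))
```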
Relevance
The relevance criteria measure how relevant the retrieved websites are to the query. Relevance is measured by calculating how many words in a retrieved website match the query. This method is used by the Google Rankings Ultimate SEO Tool as keyword density (SEO Tools – keyword density); that is, the relevance measure of the CUPE criteria is based on keyword density. Keyword density may not be appropriate for websites that contain many images and sounds instead of words. However, keyword density is generally an efficient method for assessing the relevance of the obtained results and how closely the retrieved websites are related to the query.
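The following is a minimal sketch of a keyword-density calculation, assuming a simple tokenization of the page text; it illustrates the idea behind the cited SEO-style measure, not the exact formula of the Google Rankings Ultimate SEO Tool.

```python
import re


def keyword_density(page_text, query):
    """Share of words on the page that match any query term (case-insensitive)."""
    words = re.findall(r"[a-z0-9]+", page_text.lower())
    if not words:
        return 0.0
    query_terms = set(re.findall(r"[a-z0-9]+", query.lower()))
    matches = sum(1 for w in words if w in query_terms)
    return matches / len(words)


if __name__ == "__main__":
    text = "The digital library holds digital collections on many subjects."
    print(f"{keyword_density(text, 'digital library'):.2%}")  # prints 33.33%
```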
The overall method designed to measure relevance is described in detail in the paper, Evaluating Digital Libraries (Jin, 2014).
References
Cynthia Says. (n.d.). Retrieved from http://www.cynthiasays.com/Pages/About.aspx
Etre accessibility check. (n.d.). Retrieved June 2010, from http://www.etre.com/tools/accessibilitycheck/
Fujitsu. (n.d.). Fujitsu web accessibility inspector 5.11. Retrieved June 2010, from http://www.fujitsu.com/global/accessibility/assistance/wi/
Illinois information technology accessibility act standards 1.0. (n.d.). Retrieved 2010, from http://www.dhs.state.il.us/IITAA/IITAAStandards.html
Jin, S. (2014). The International Open Public Digital Library (IOPDL): A Proposal for the Future. Retrieved from http://courseweb.lis.illinois.edu/~sunjin/Papers/InternationalOpenPublicDigitalLibrary-Proposal.pdf
Jin, S. (2014). The International Open Public Digital Library (IOPDL): A Proposal for the Future. Evaluating Existing Digital Libraries with the Suggested Criteria: Content, Usability, and Performance Evaluation Criteria. Retrieved from http://courseweb.lis.illinois.edu/~sunjin/Papers/InternationalOpenPublicDigitalLibrary-EvaluatingDLs.pdf
LC. (n.d.). Library of Congress Classification Outline. Retrieved from The Library of Congress: http://www.loc.gov/catdir/cpso/lcco/
Nielsen, J. (1993). Usability engineering. San Francisco: Morgan Kaufmann.
SEO Tools – keyword density. (n.d.). Retrieved 2010, from http://www.seochat.com/seo-tools/keyword-density/
UIUC. (n.d.). Functional Accessibility Evaluator 1.1. Retrieved from http://fae.cita.uiuc.edu/about/
W3C. (n.d.). Markup Validation Service. Retrieved June 2010, from http://validator.w3.org/
WAEX. (n.d.). Web Accessibility Evaluator in a single XSLT file. Retrieved June 2010, from http://www.it.uc3m.es/vlc/waex.html
WAVE. (n.d.). WAVE (Web Accessibility Evaluation) Tool. Retrieved April 10, 2010, from http://wave.webaim.org/
January 09, 2014