Evaluate programs and services using measurable criteria
The challenge in measuring programs and services is that the answers often reside in users' minds. Surveys ask users to share their thoughts but cannot guarantee honest reporting or accurate memory. Still, combined with user and collection statistics, survey results are serviceable evaluation instruments. Information literacy instructors measure similarly elusive information: increases in knowledge and skills, changes in behavior, and corresponding improvement in students' coursework.
User statistics, however, are easily measured by integrated library automation software with evaluation-specific functions that can report the frequency of user visits, the number of materials checked out per visit, quarter, or year, the number of users per household, and so on. These systems report quantities, and staff can draw inferences from the numbers. For example, if the door count is up but circulation is down, users may be coming in for computer access or educational programming. Assessment instruments such as LibQUAL+ can measure users' satisfaction with programs, services, and materials, and that information can be compared with past performance and with similar libraries.
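To make the door-count versus circulation inference concrete, here is a minimal sketch in Python of the kind of comparison staff might run on statistics exported from a library system; the figures and field names are hypothetical, not drawn from any particular library.

# Minimal sketch (hypothetical figures): compare year-over-year door count
# and circulation from exported library-system statistics.
stats = {
    "2015": {"door_count": 41200, "circulation": 55800},
    "2016": {"door_count": 44950, "circulation": 52100},
}

def pct_change(old, new):
    """Return the percentage change from old to new."""
    return (new - old) / old * 100

for metric in ("door_count", "circulation"):
    change = pct_change(stats["2015"][metric], stats["2016"][metric])
    print(f"{metric}: {change:+.1f}%")

# A rising door count paired with falling circulation suggests visitors may be
# coming for computer access or programming rather than checkouts.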
Of library evaluation criteria, Leonard Lawless of Carleton University writes, "Just ask." He notes that while colleagues embark on "fact-finding missions" (Lawless, 2015), he has found anecdotal evidence to be useful. I agree. Library evaluation should consist of cyclical, objective measurement combined with daily user interaction. Find ways to incorporate survey questions such as "What brought you in today?" into interactions with users. Also, consider hiring an external consultant to design community surveys and focus group questions. Maintain advisory groups that represent a range of demographic and genre groups.
Using measurable criteria to evaluate libraries is important to me for several reasons. Mostly, I want to do my best, and maturity has taught me that outside information and clearly defined benchmarks are the direct path to success. Because I am as susceptible to wishful thinking as anyone else, I would not trust my perceptions to judge library wellness. Moreover, librarians manage colossal budgets. Stakeholders need information to feel confident in the library’s collection, facility, management and services. We are responsible for reflecting the community’s interests. Measurable criteria reveal those interests. I see objective data combined with staff perceptions and user feedback as essential to my life in general, from the bathroom scale and workout numbers to assignment points and professor feedback: I cannot know what I cannot measure.
The San Jose State University MLIS curriculum planning team incorporated evaluation using measurable criteria into most classes. For example, in LIBR 204, the first unit focused on assessing management theories against measures of effectiveness and workplace culture. Then, we applied those lessons to an analysis of library planning documents. A supporting work product included a strengths, weaknesses, opportunities, and threats (SWOT) analysis of a university library. Preparation for the analysis began with collecting information on the collection, services, facility, management, finances, and reputation in the community. My team then designed ways to evaluate each attribute. The most beneficial aspect of the project was articulating clearly defined steps to measure and remedy weaknesses.
INFO 284: Seminar in Archives and Records Management provided experience evaluating management policies and software against specific criteria. For example, one research assignment compared several leading electronic records management systems (ERMS). Preparation began with collecting sales data to determine which software was most popular; I then read company literature and user reviews to learn about each product's features so they could be compared. The coursework also covered evaluation of information security, with special emphasis on information's legal and historical value, an aspect expressed in an assignment about the legal and ethical implications of poor information management.
LIBR 210: Reference and Information Services provided hands-on experience with evaluation criteria. One assignment used Reference and User Services Association (RUSA) guidelines to analyze a reference interview. The experience taught me to hold myself and others accountable to objective standards. INFO 250: Design and Implementation of Instructional Strategies for Information Professionals focused on assessing instructional material, teaching methods, and student outcomes, which is discussed and exhibited at length in Competencies K and I, respectively.
INFO 254: Information Literacy and Learning exposed me to the American Library Association Institutional Repository (ALAIR) and the American Association of School Librarians (AASL) evaluation standards. I learned that in 2016 the Association of College and Research Libraries (ACRL) rescinded the Information Literacy Competency Standards for Higher Education adopted in 2000, replacing them with the Framework for Information Literacy for Higher Education. This framework, a set of concepts for evaluating information literacy instruction, was central to the classwork. The Frames guided assignments such as an evaluation of an information literacy class and its assessment instruments. I then created an information literacy assessment tool using Google Forms. The survey was designed to be administered before and after a thirty-minute information literacy class for college freshmen; it had to take two minutes or less, be accessible from student computers via a link, and measure progress. I gained experience using the survey-building tool and designing assessment questions.
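As an illustration of how such a pre/post instrument might be scored, the sketch below (in Python) averages responses exported from Google Forms as CSV files; the file names and the numeric "Score" column are assumptions for the example, not details of the actual assignment.

import csv

def average_score(path, score_column="Score"):
    """Average a numeric score column from a Forms CSV export."""
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    return sum(float(row[score_column]) for row in rows) / len(rows)

pre = average_score("pre_class_responses.csv")    # administered before the class
post = average_score("post_class_responses.csv")  # administered after the class
print(f"Average pre-test score:  {pre:.1f}")
print(f"Average post-test score: {post:.1f}")
print(f"Estimated learning gain: {post - pre:+.1f} points")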
The second sample work product is an evaluation of the transparency of three government agency websites. I chose it to support Competency N because it shows my ability to collect topic-specific sources to measure an ideal. I crafted the research question around an amorphous topic for the fun of taking on an exciting new challenge. The assignment forced me to think deeply about how to quantify honesty.
The inherent discombobulation of trying something new and beyond my ability soon sorted itself out, as resources for comparing government transparency were abundant. Preparation included a close reading of the Government Finance Officers Association (GFOA) website; the GFOA provides critical analysis of the three agencies in question. Because the U.S. Public Interest Research Group rates state transparency, I learned techniques for quantifying honesty from its publications. The Government Accountability Office and the Recovery Accountability and Transparency Board police spending and record who receives government money.
With these support resources identified, the next step was to translate the variables into measurable criteria. The variables were information about employees, finances, self-reporting, external reporting, and the extent to which reports matched. The measurable criteria for each attribute are presented in four categories arranged from low to high levels of compliance.
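To illustrate what that translation can look like, here is a minimal sketch of the rubric as a data structure in Python; the attribute names follow the variables listed above, while the scores shown are placeholders rather than the ratings reported in the paper.

# Minimal sketch: transparency rubric as a data structure.
# Each attribute is rated from 1 (low compliance) to 4 (high compliance);
# the example scores are placeholders, not the actual findings.
attributes = [
    "employees",
    "finances",
    "self-reporting",
    "external reporting",
    "congruity between reports",
]

example_scores = {attribute: 3 for attribute in attributes}

max_total = 4 * len(attributes)          # five attributes x 4 points = 20
total = sum(example_scores.values())
print(f"Transparency score: {total} / {max_total}")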
To test the agencies, I developed a list of scandals based on news reports and researched them for relevance and context. The paper compared negative news reports with the information presented on the agencies' websites. My hypothesis was that news reports would not match the agencies' self-reporting. I was wrong. The research showed that all three government agencies reported the same or more information about their missteps. They were supremely accountable.
The third piece of evidence is a website critique for LIBR 230: Issues in Academic Libraries. Rather than comparisons to ACRL standards, it uses survey results reported in peer-reviewed articles to measure library website performance; one of the surveys referenced was Raward's checklist. The research paper was intended as an exercise in applying accessibility and design standards to academic library websites, but I also used it as an opportunity to learn what not to do. I developed the criteria for evaluation as I reviewed the literature and vetted research. The criteria that emerged from the preparation were physical access, online access, website design, usability, services, collection, and facility.
The research revealed an academic library that had neither a digital collection nor an online public access catalog (OPAC) for its physical collection. Another had such poor website design that reaching the general search bar took three clicks on links that were not clearly marked. Another library was not open to the public. The assignment demonstrated the benefit of applying analysis to measurable criteria, an exercise the MLIS program required often.
In summary, the courses trained me to communicate more professionally by supporting my communications with measurable evidence. The coursework, lectures, and texts taught me that no matter how obscure the topic, someone is an expert, and professional associations exist to measure nearly everything. Case studies of library assessments are copious and instructive. I learned to measure ideas by operationalizing the questions. Translating intangible variables into measurable factors for my employer will be intellectually stimulating, meaningful work.
Similarly, the discipline to slow down and digest resources and outcomes became routine during the coursework. I have developed the humility to defer to survey results and colleague feedback. I look forward to designing assessment instruments around intriguing library problems, and I will volunteer to lead evaluation teams; the design and discovery process fascinates me. In my future position, I will continue to use resources to broaden my frame of reference and discover support materials that nurture objective evaluation.
Resources
Lawless, L. (2015). How to evaluate library services. Education Forum. Retrieved from https://www.researchgate.net/post/How_do_we_as_librarians_assess_and_evaluate_library_services
Evidence 1: Assessment Tool
https://docs.google.com/forms/d/1A0tB5atxcGUQifIcaN9jmemOy4xnC2fz-sFR_pOXAts/edit
Evidence 2: Transparency Report and Rubric
Disclosure and Transparency Rubric: How Well do the Environmental Protection Agency, Department of Energy and Department of Agriculture Communicate Through Their Websites?
Snow Marlonsson
San Jose State University School of Information
Abstract
This research measures website transparency of the Environmental Protection Agency, Department of Energy and the Department of Agriculture. It compares their news posts with major news outlets’ articles on matching topics. The research finds that all three agency websites disclose information even when it reveals conflicts of interest. All three government agencies achieved very high transparency ratings.
Keywords: government website transparency, conflict of interest disclosure
Introduction
The heart of conflict of interest laws is to protect the public from a special class of dishonesty: systemic injustice. We want our government to rest on a fair platform. This means that agencies should not stack the deck against the thing they are employed to protect; the Food and Drug Administration should not be run by a lobbyist for the largest genetically modified seed company in the world, for example. But this paper is not about agency staff, nor is it about corruption. This research seeks to measure the transparency of the parent agencies against the reporting of major news outlets and against each other, in the context of the Government Website Transparency Rubric designed for this assignment. The question this inquiry answers is modest: do the agencies in question disclose employment and spending decisions that could represent conflicts of interest, or do they seem to obfuscate?
Method
This research reviews government website evaluations and rubric design to discover the attributes researchers use to evaluate e-government websites. That information is synthesized into the Government Website Transparency Rubric, which rates the following five attributes on a 1-4 point scale: human resources disclosures, financial transparency, accessibility (design and function), news disclosure, and content congruity with major news outlets. The Government Finance Officers Association's search function was used extensively as a third-party "control" to assess the content and timeliness of government blog posts and press releases. Other measures of transparency are more amorphous. Whether government information is similar to that reported by the media is subjective; essentially, I am comparing content and applying an all-or-nothing judgement: same or not the same. Similarly, the sections on honest reporting are not the result of well-operationalized questions but of comparisons with other sources. The government websites' addresses:
http://www.usda.gov/wps/portal/usda/usdahome
Review of Literature
Financial transparency can be determined by the degree of disclosure available to non-specialists. Governments may publish their budgets, but without curation, laypeople may not have enough context to understand the data. E-government assessments must observe whether websites provide analysis, context, and comparisons; whether data is presented in graphs, interactive elements, or simulations; and whether the public can ask questions or observe multi-perspective debates. Baxandall and Wohlschlegel posit that transparency requires comprehensiveness, a unified electronic location (a website), and intuitive searching (2010). An emerging attribute of e-government fiscal transparency is the checkbook search function that allows citizens to see what their government bought, from whom, and when (Justice and McNutt, 2014). Another way to evaluate fiscal transparency is to consult experts in the field: the Government Finance Officers Association (GFOA) awards recognition to exemplary financial reporting. The awards are limited to state and local governments, but the GFOA provides a wealth of critical analysis of the financial practices of the three federal agencies featured below. Further, the U.S. Public Interest Research Group (PIRG) rates state transparency on a 100-point scale and provides guidance for evaluating government websites. The standardized, highest-value attributes on the PIRG scorecard are "checkbook-level" searching, recipient locating, identifying subsidies, disclosing disbursements, and listing the purposes for payments (Justice and McNutt, 2014). The Government Accountability Office, heavily influenced by the Recovery Accountability and Transparency Board and the Federal Funding Accountability and Transparency Act of 2006, recommends similar interactive measures to make information accessible to the public, along with good design and relevant graphic representations of data.
Moving beyond financial transparency, how can we measure the truth of statements made on government websites? Initially, I sought to confirm data by finding it in multiple places, but spending time becoming familiar with the three websites altered my perception of them, because the "other sources" were politicians who answer to constituents, reporters who need to bolster readership, and activists who are not always balanced. The more alternative sources I added to the frame of reference, the more credible the website information looked.
What variables contribute to government website efficacy? Morgeson and Mithas (2009) define e-government website efficacy as the ability to deliver services (mostly in the form of information) to the public. They identify nine attributes: customization, organization, reliability, navigation, overall satisfaction, confirmation of expectations, comparison to an ideal, customer retention, and propensity to recommend. Marciniak (2000) rates comprehensiveness and maneuverability as the highest-value attributes for e-government websites.¹ Specifically, he values the ability to download and print primary documents and research. These qualities were incorporated into the design and accessibility section of the Government Website Transparency Rubric.
Findings
The Environmental Protection Agency earns a nearly perfect transparency assessment under the rubric's parameters. Several senior deputies' biographies revealed education and work histories that seemed incongruent with their posts, but they did not represent the conflict of interest I initially hypothesized. For example, Stanley Meiburg, Acting Deputy Administrator of the EPA, was educated in politics rather than science. This distinction becomes important later, as the EPA defends itself after the Gold King Mine spill and after a low-level EPA sex scandal, where Mr. Meiburg uses his political expertise to navigate vigorous criticism. In his defense, Mr. Meiburg did write a dissertation exploring the relationship among science, political leadership, and public administration. Jim Jones, the Assistant Administrator for the Office of Chemical Safety and Pollution Prevention, is an economist, and Janet McCabe, of the Office of Air and Radiation, is a lawyer. One would expect to see environmental engineers and chemists leading science agencies.
The EPA provides access to its budget, the goals the budget is designed to accomplish, and the process the EPA intends to use to achieve those goals. The public can see reports of the previous year's spending, goals, and results. As noted in the literature review, curation makes data more accessible to the public. The EPA has interpreted the information through illustrations and reports that appear to follow federal plain language guidelines, which are based on research finding that the majority of the public reads at a seventh-grade level, and at a fifth-grade level for health and science topics (http://qpc.co.la.ca.us/cms1_033658.pdf).
The EPA provides bulleted highlights that educate the public about the budget-forming process and continually invites the public to ask questions, call, or otherwise participate in its governance.
To determine the honesty of these pre-chewed, partially digested interpretations, this research compares the highlights to raw data. No discrepancies were identified.
A review of content by media outlets parallels this assessment of general honesty. The information, both featured and buried, echoes major media reports of EPA news and spending. The EPA featured news, press releases, and investigative reports about the Gold King Mine spill on its home page. While media reports claimed incongruence between EPA claims and facts, I found that the information was the same and the disputes were tied to degree and perception. For example, Mark Mathews of the Denver Post reported that two congressmen called for an investigation of the Bureau of Reclamation investigation (yes, an investigation of the investigation) because it did not blame the EPA with enough ferocity. The EPA admitted that the spill could have been prevented. The congressmen went as far as to blame the president of the United States for the lack of answers about the investigation given by Bureau representative Secretary Sally Jewell. I noticed that Ms. Jewell answered the questions; the congressmen simply did not like the answers. The formal EPA statement asserts that the EPA was at the Gold King Mine assessing ongoing, unchecked releases of contaminated water, was treating the toxic water, and was creating a plan to fix the problem. This perspective was not represented in the sample set of media coverage. What is noteworthy here is that the EPA addresses the criticisms by issuing its report, a report that accepts blame, and provides more information and context on its homepage: a more complete picture of events, which is ignored by politicians and the media. So, review of this sample story indicates that the EPA seems to be more transparent than the media outlets that report on the agency.
The final rubric topic addresses accessibility. The EPA's website design is clear, intuitive, and well formatted. The large font size and monochromatic color scheme render the pages easy to read. All of the functions work, and clicking a feature opens it in a new tab. This research found no broken links; one link would not open, but that may have been a temporary problem or related to my connection. The EPA earned a score of 18 out of 20. One point was withheld for the limited employee biographies, which conveyed past work histories, education, and special projects but did not disclose personal views or political contributions. The other point reflects the agency's refusal to use explicitly apologetic vocabulary in its press release about the Gold King Mine spill.
The Department of Energy website's page of employee information was temporarily closed for maintenance. In its place, the agency offered a comical and educational video about the National Labs performed by employees, including Secretary Moniz. Because I could not check the employees' backgrounds through the DoE website, I did not make a determination of employee transparency.
The DoE exhibits financial transparency by offering its budget in many formats. The public can choose to watch a video of the 2016 or 2015 budget briefing by Secretary Ernest Moniz, skim the information on a highlights page, or see the DoE's requests to Congress. The public can view raw data or the DoE's press releases, and can order a monograph of the statistical plan for the last several years without leaving the Budget page. Budgets specific to fossil fuels, environmental management, energy efficiency, electricity delivery, nuclear energy, nuclear security, energy reliability, "science," and more are available. The public is welcome to view performance reviews that detail how well the agency met its goals; the performance reports are available for each subsection of the DoE and by special project, for example, the GAO High Risk Improvement Plan.
The DoE provides a page called Quality Guidelines. It conveys the standards developed as a result of Section 515 of the Treasury and General Government Appropriations Act (Pub. L. 106-554). The guidelines require peer review of government science documents. Curiously, the Denver Post reporting on the EPA featured the commentary of an environmental engineer who peer reviewed the investigative report the congressmen were using to defame the office of the president, along with Secretary Sally Jewell. So, as was the case with INFO 221, the process of completing the assignments loops around from finding the lessons to applying them. I digress. The agencies are required to provide mechanisms for communicating these reports to the public, e.g., their websites. From this page, the public can access original research and requests for correction. The homepage also contains links to Open Government, which explains how the DoE is enacting President Obama's mandate for a more transparent, participatory, collaborative government and articulates why providing high-value data sets to the public is important to good governance. This set of pages easily fulfills the rubric's requirement for honest reporting, as it provides links to primary documents, peer review, corrections, and challenges, as well as curated information for the untrained public.
The DoE website incorporates well-known design principles for accessibility. The homepage contains news organized in uniform boxes across the bottom two-thirds of the page. The news is current and relevant, sometimes reflecting popular news and entertainment, and conveys how homeowners can save money on heating and cooling bills. The interactive elements are entertaining in and of themselves; the site provides surveys, calculators, and comparison tools. The DoE earns a 15 out of 15 on the Government Website Transparency Rubric. It is effervescent in every attribute; even the broken page was charming and interesting.
The focus of the Department of Agriculture seems to be on providing services and actionable information about something pertinent to all humans everywhere: food. It seems less interested in coaxing the public's interest with tips or news. The website design is a bit convoluted because the webmaster used limited tabs and in-text links, which forces huge dropdown menus that tend to hide information. The layout is reminiscent of late-1990s sites, which would be fine except that it gives the impression of being outdated. Still, the site scores well on the rubric because the menus do lead to original USDA reports. The links are functional, opening in new tabs or presenting new menus. Navigating the site takes more time than the other websites, but the information required for good transparency is there. The USDA rubric score is 14 out of 20; each rubric element fell short of perfect, and the employee biographies were effectively hidden.
Comparison
The search function made each website very user friendly: a visitor who did not have time to scan the homepage could simply use the search bar. All three had a high degree of accuracy and flexibility, which implies that the designers may have used the same or similar code for the search function. The design attribute was a strong indicator of transparency because if the public cannot find information, say because of poor design, it may as well not exist on the website. The government guidelines for plain language suggest using language aimed at the audience, in this case the public, and all three sites were successful or very successful at using appropriate vocabulary. The guidelines also address the sequencing of information; this attribute was evidenced by the way moving through the menus made sense, with topics flowing well from one to the next. All three sites placed social media icons prominently and encouraged users to communicate.
None of the agencies issued any apologies. The EPA could have issued an apology regarding the failures associated with the Gold King Mine; Mathews reported that the EPA accepted blame, but no apology was issued (2015). Similarly, EPA Acting Deputy Administrator Stanley Meiburg could have apologized for the failure to prevent a subordinate from sexually harassing sixteen interns and coworkers when he testified during a congressional hearing, but did not. This may indicate a flaw in the rubric, as legal counsel might prevent management-level apologies (US House of Representatives, 2015); the term apology should be changed to acknowledgement. All three agencies provide performance, process, and achievement reports for their budgets.
Curiously, the Department of Energy was the only agency that self-reported its Section 515 guidelines. According to the Information Quality page, the EPA and the Department of Agriculture are required to accommodate the guidelines, and I am confident they do, but their guidelines are not prominently featured in their homepage menus. The DoE link was highly visible, even in the mobile version, which conveys that communicating transparency is a priority.
The Department of Energy was more interactive than the other two sites. It included interactive tools for estimating energy use, heating costs and potential savings. The DoE website was more entertaining with its reference to Star Wars and current events. It provided contractor information in addition to employee biographies. It was just a little better than the other websites on all rubric measures. In comparison, the Department of Agriculture seemed to be missing the opportunity to showcase itself. It connected to patrons by featuring a slideshow of farmers, and grabbed my attention with its links for new farmers—something the DoE and EPA could never do: New to using electricity? New to breathing air? But “new farmers” juxtaposed next to old farmers implies promise and freshness; new beginnings.
Conclusion
Brookhart and Chen (2015) explain that the advantage of using a rubric is that it helps users visualize what a better work product could look like. The Government Website Transparency Rubric contains the most idealized, transparent end product I could assemble from the literature, imagination, and curiosity; yet all three government websites featured here exceeded those benchmarks handily. My hypothesis, that inappropriate people had been granted leadership roles in government without adequate public challenge and that the integrity of government website communication might therefore be suspect, proved outdated. All of which is to acknowledge the importance of reassessment and of recording the facts as they stand at a given time, because change, for good or ill, registers slowly.
Follow-up work could test the rubric against other countries' government websites. Adjusting the rubric to accommodate more complex attributes, such as video blogs and personal statements from personnel, would advance the initial idea. Further research could confirm the featured websites' transparency by testing their functions: test the lines of communication by e-mailing officials and calling the posted phone numbers. I would have liked to devote more time to comparing information across sources, such as hearings and newspapers, because the interplay between the sources is, as Muniez (2000) says of non-governmental sources, valuable and interesting. I would also have liked to fact-check all of the information presented here and on the three reviewed sites. In such an information-rich, media-saturated environment, where even the EPA deputy has a video introduction, the revised research question might be: how much government information can the public absorb?
Bibliography
Brookhart, Susan M., and Fei Chen. "The quality and effectiveness of descriptive rubrics." Educational Review 67, no. 3 (August 2015): 343-368. Academic Search Complete, EBSCOhost (accessed December 6, 2015).
Government Accountability Office. "Government Transparency: Efforts to Improve Information on Federal Spending." March 2015. (accessed December 1, 2015).
Justice, Jonathan B., and John G. McNutt. "Social Capital, E-Government, and Fiscal Transparency in the States." Public Integrity 16, no. 1 (2014): 5-24. Academic Search Complete, EBSCOhost (accessed December 3, 2015).
Marciniak, Todd M. "Rating e-Gov Web Sites." Government Finance Review 16, no. 5 (October 2000): 54. Academic Search Complete, EBSCOhost (accessed December 2, 2015)
Mathews, Mark K. "Congress calls for new Gold King Mine investigation." Denver Post, December 2015. http://www.denverpost.com/news/ci_29224790/congress-calls-new-gold-king-mine-investigation (accessed December 10, 2015).
Morgeson, Forrest V., and Sunil Mithas. "Does E-Government Measure Up to E-Business? Comparing End User Perceptions of U.S. Federal Government and E-Business Web Sites." Public Administration Review 69, no. 4 (July 2009): 740-752. OmniFile Full Text Mega (H.W. Wilson), EBSCOhost (accessed December 4, 2015).
US House of Representatives (April 2015). "Testimony of Stanley A. Meiburg, Acting Deputy Administrator, US Environmental Protection Agency, Before the Committee on Oversight and Reform, US House of Representatives." https://oversight.house.gov/wp-content/uploads/2015/04/Mr.-Meiburg-Testimnony.pdf (accessed December 5, 2015).
Notes
1. Marciniak's observations from 2000 are relevant because his perspective is foundational. His vocabulary for describing e-government websites echoed current research, sometimes verbatim. The close correlation between the old and new perspectives helped me determine which attributes were constant and which were trendy.
Evidence 3: Website Assessment
Snow Marlonsson
LIBR 230 Section 10
Investigation #2
This investigation seeks an example of an inadequate academic library website, not to find fault, but to learn more about the conditions that characterize problematic libraries. What does a failing academic library website look like? A 2012 Princeton Review survey polled 122,000 college students about their libraries and ranked the results. Meredith Schwartz (2012) reports the bottom-ranking academic libraries in her article for Library Journal:
United States Merchant Marine Academy, Tuskegee University, University of Dallas, Prescott College, Howard University, College of the Atlantic, Duquesne University, Bard College at Simon's Rock, Bard College, Seattle University, Emerson College, Montana Tech of the Univ. of Montana, Drexel University, Green Mountain College, University of Hawaii at Manoa, Juniata College, Birmingham-Southern College, and Wells College.
Using the web usability checklist for academic library websites (Raward, 2001) to evaluate some of the lowest-rated schools’ library websites revealed that several libraries from this selection have questionable website features. The Green Mountain College main page informs students of the library’s history instead of offering a search function. Juniata College used excessive library jargon. The United States Merchant Marine Academy did not have online access to their catalog: The collection was limited to the physical library. The College of the Atlantic does not seem to have a library. It provides a picture of a Tibetan door accompanied by a literary quote on a page titled: Off Campus Study. To learn about the other side of excellence, ironically, this investigation turns to Drexel University’s Hagerty Library.
Drexel University's American Library Association-accredited graduate library science program is consistently ranked one of the best in the nation (Schwartz, 2012). Why then would a survey of college students determine that its library is inadequate? While this investigation finds the library website acceptable, scoring 85% on Raward's checklist (2001), the website provides some clues to the low ranking by the students in the Princeton Review survey. Access to the library website from the campus website is not clearly marked, and once a student has found the library website, he must find a link to another page that contains the search function. This process takes time, and the route to information may not be obvious to some undergraduates. This investigation finds that once the search function is found, success increases: the information retrieval software, quantity of results, and quality of materials are all very good.
Visibility
This project assumes that an undergraduate student will begin a search for the campus library on the campus website. Unfortunately, Drexel University has hidden the link to its library. Website visitors can choose from the tabs The Drexel Difference, Academics, Research, Campus Life, and About Drexel. A student might expect to find the library on the research page; however, that page describes a sample of the research students perform from an advertising perspective. Links on the page lead to pages that answer what, why, and how students research. The how page, titled Resources, does not lead to the library. Finding a dead end, students might go back to the campus page and choose the Academics tab. The academics page has a link to a library page, but it is not the library page: it contains no search function or links to library services. Instead, it advertises the library system from a marketing perspective, which is not useful to a student who wants to access library information or services. Within the descriptions of the libraries on campus are links to the various libraries. If the undergraduate student happens to choose the Hagerty Library link, which is not marked as the main library, he will arrive at a library page with a search function. This is unacceptable website design. If the topic of libraries must be hidden under the Academics tab, it must lead directly to the main library; if users need to choose a special or departmental library, let them do it from the main library page. Preferably, the campus website would contain a tab that leads to the main library page with the search function placed prominently in the center of the page. Ideally, students will bookmark the page and never waste time searching for the search function again.
In their 2013 report, Modeling a Library Website Redesign Process: Developing a User-Centered Website Through Usability Testing, Becker and Yannotta report that only 1% of undergraduate students begin their research from their library's website, while 84% of undergraduates use a search engine. The authors' usability testing finds that undergraduates are confused by the vocabulary commonly used on academic library websites, and their literature review finds that undergraduates are confused by the number of choices presented on library sites (Becker & Yannotta, 2013, p. 7). They advocate a simple, streamlined interface for undergraduate users (Becker & Yannotta, 2013).
Design
The main library page is cluttered with options and more marketing. A student who wants to access information must divide his attention among many messages. The webmaster has attempted to separate topics by grouping them in separate colored boxes, tabs across the top and bottom, and different fonts. Visually, the messages are not well weighted; they all call out to the user with the same force. Chow, Bridges, & Commander (2014) report that 63% of academic websites have a search function on the main page. I propose that Drexel's Hagerty Library follow this trend and move its search function to the first point of contact with students: move the search box from the upper right quadrant to the center of the page, place the slide show under the search box, and either remove the tabs or the colored boxes rather than using both devices to indicate categories. Use more visual cues to convey meaning, for example a clock icon in addition to the word hours and an avatar or person icon in addition to the phrase chat with a librarian. These visual cues will help students skim to the information they need without requiring them to read each un-weighted entry. Becker and Yannotta's research finds that simplicity is the most important website design element (2013, p. 8). For example, during their usability testing, Becker and Yannotta (2013) discovered that 80% of students started their search for articles with the Titles of Databases tab even though they had all attended a basic freshman-level library literacy course that attempted to teach them to choose the Databases tab. Usability testing revealed that replacing databases with find increased success to 66%. This example illustrates how simple the academic library website must be to reach students: wording, distractions, and design must emphasize the overarching purpose of the website (Becker & Yannotta, p. 13). Librarians' efforts to market their products and services to students are important, but students want to access information quickly. I would like Drexel's webmaster to reconcile these two perspectives through better design.
Usability
The Hagerty Library search function is very easy to use. Once the student finds the search box, his experience will improve, as its use is intuitive and effective. The streamlined search bar contains two dropdown menus, a text box, and a go (enter) button. The dropdown menus specify location and type of material. For example, a student can place a material-type limit such as books, which changes the options on the search bar automatically; the user can then choose another category such as author or keyword. Once the user chooses the go button, he can review or refine the results. The results appear in a center-oriented column, flanked by a column that displays the results' abstracts, publication information, and options, and another that displays an array of limiters including date range, language, type of publication, and others.
Information Retrieval
This investigation used the phrase "effects of corporal punishment on students" to evaluate the software's ability to retrieve information. The software was able to ignore the word "of." The top results show that search terms were assumed to be related by "and"; lower results show an and/or relationship between search terms. The software found the search terms in all fields of the record but prioritized entries containing the terms in the title; in fact, the first result is an article whose title is the sample set of search terms. The search function was not able to disregard or interpret misspelled words, and a smart search function that could search similar terms or broaden the search was not readily available. The pages load immediately and the links are valid.
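As a rough illustration of the ranking behavior observed here, the sketch below boosts records that match every query term and those that match terms in the title; it is a simplified stand-in for illustration only, not the vendor's actual retrieval algorithm.

# Illustrative sketch only (not the actual discovery-layer algorithm):
# records matching all query terms rank first, title matches are boosted,
# and small words like "of" are ignored.
STOPWORDS = {"of", "the", "on", "a", "an"}

def relevance(record, query):
    terms = [t.lower() for t in query.split() if t.lower() not in STOPWORDS]
    title = record["title"].lower()
    abstract = record["abstract"].lower()
    matched = [t for t in terms if t in title or t in abstract]
    matches_all = len(matched) == len(terms)        # "and" results outrank "or" results
    title_hits = sum(t in title for t in terms)     # title matches are prioritized
    return (matches_all, title_hits, len(matched))

records = [
    {"title": "Effects of corporal punishment on students", "abstract": "..."},
    {"title": "Classroom discipline", "abstract": "corporal punishment and student outcomes"},
    {"title": "Student wellness", "abstract": "effects of exercise on learning"},
]

query = "Effects of corporal punishment on students"
for record in sorted(records, key=lambda r: relevance(r, query), reverse=True):
    print(record["title"])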
Conclusion
Drexel University's Hagerty Library website was chosen based on its low Princeton Review survey ranking, with the intention of learning about academic library dysfunctions to avoid during academic library employment. Instead, this investigation finds an acceptable library website. Only one feature must be corrected immediately: visibility. Students may have ranked the library poorly because they could not find the access point; it is hidden by superfluous fluff, and website visitors must navigate marketing ploys to get to relevant information. Once students find the main library search box, their query leaves Drexel-controlled territory and enters Serials Solutions, a division of ProQuest (ProQuest, 2015), where student success is nearly inevitable because the product is simple, intuitive, and accesses a vast collection. The takeaway from this review of the Hagerty Library website is: do not put barriers in front of users, deliver them directly to the search function, and ensure that the search function is the best product your organization can afford.
References
Becker, D. A., & Yannotta, L. (2013). Modeling a library website redesign process: Developing a user-centered website through usability testing. Information Technology & Libraries, 32(1), 6-22.
Chow, A. S., Bridges, M., & Commander, P. (2014). The Website Design and Usability of US Academic and Public Libraries. Reference & User Services Quarterly, 53(3), 253-265.
ProQuest. (2015). Serials Solutions and Maruzen announce content agreement. Retrieved from http://www.proquest.com/blog/2013/serials-solutions-and-maruzen-announce-content-agreement.html
Raward, R. (2001). Academic library website design principles: Development of a checklist. Australian Academic & Research Libraries, 32(2).
Schwartz, M. (2012, August). Princeton review student survey ranks college libraries. Library Journal. Retrieved from http://lj.libraryjournal.com/2012/08/academic-libraries/princeton-review-student-survey-ranks-college-libraries/