Competency N
Evaluation
Evaluate programs and services using measurable criteria.
Introduction
Evaluation is critical to the information profession. Competency N holds that only through evaluation using measurable criteria can an information professional discover and communicate the value of the programs and services they and their institution provide. This matters for several reasons.
First, it gives information professionals data for objectively assessing how well programs and services are fulfilling their purposes. As Matthews (2018) put it, evaluation “is particularly useful for planning and delivering library services, and it serves as the foundation for improving programs” (p. 255).
Second, it allows the information professional to compare different programs and services to one another more objectively, and thus make better decisions about which to start or purchase, or reject or discontinue.
These first two reasons can come into play especially during budget cuts when information professionals are forced to eliminate offerings. However, Buck (2016) advised that even during times of growing budgets, libraries should anticipate and plan ahead for contraction, as there are many factors, some foreseeable and some not, that could lead to sudden and substantial reductions in operating funds (p. 201). Frequent evaluation can help information professionals be prepared to make tough decisions regarding program and service reductions.
A third reason for evaluating is that it lets the information professional demonstrate to others—e.g., library board members, company executives, the public—the importance of those programs and services and the overall value the library or information center provides. For example, Shumaker (2012) wrote, “specialized librarians and information professionals in [the corporate and government] sectors have repeatedly been exhorted to measure their value in return on investment, time saved, costs saved, and similar measures” (p. 99). According to Rogers and Densch (2017), “As an information professional, it’s not enough to do great work. You must also constantly measure and market your unique expertise and the value you bring to your company” (p. 455). The second action item—marketing—cannot take place effectively without the first—measurement.
Establishing Criteria
To evaluate programs and services against measurable criteria, the information professional must be able to establish criteria that are directly applicable to their institution and its mission. This does not mean, however, that they must invent every criterion from scratch. Rather, they should look to established criteria that have proven useful for demonstrating value at other institutions.
Industry associations are indispensable sources for these criteria, with the most prominent being the American Library Association (ALA) and its sub-organizations including the Reference and User Services Association (RUSA) and the Association of College and Research Libraries (ACRL). These two divisions provide a variety of often highly targeted documents describing practices and criteria for information professionals to use in evaluating their activities and services. For example, RUSA’s “Measuring and Assessing Reference Services and Resources: A Guide” contains “measurement tools to assist managers in evaluating reference services and resources” (ALA, 2008, Introduction section). The ACRL similarly developed a standards document specifically “to provide archivists and special collections librarians with a set of precisely defined, practical measures . . . to support the assessment of public services and their operational impacts at the local institutional level” (ALA, 2018, Audience and Purpose section, para. 1).
Other organizations can provide even more specific evaluation criteria for different types of libraries and information centers, especially special libraries. Crumpton and Porter-Fyke (2016) pointed readers to the American Association of Law Libraries (AALL) and the Medical Library Association (MLA) for professionals working in those fields. The authors noted the AALL’s guidance for how law librarians can evaluate and communicate the value they add by using criteria such as cost-effectiveness of information sources and the elimination of repetition in research (p. 162). The authors also discussed Valerie J. Ryder’s take on the qualitative and quantitative benefits that corporate librarians provide, with quantitative measurements including risk mitigation through copyright assistance, canceling duplicate subscriptions, and “the costs saved by using the library’s internal resources compared to hiring consultants” (p. 162).
For healthcare librarians, the MLA (n.d.) published a robust guide for measuring and proving value by providing access to select studies and summarizing their key findings in categories such as improving clinical decisions and patient care quality and reducing healthcare costs. The goal of the guide is to give association members tools “to advocate to employers and the public the value, impact, and benefits of health sciences libraries and librarians” (para. 1), but healthcare librarians can also use these criteria to evaluate their own performance and progress toward goals.
A final way of establishing criteria is by defining them against the mission statement of the library or its institution. Wallace (2004) believed that “If libraries are to be valued institutions, they must find ways to distinguish themselves from their competition. . . . A mission statement that sends a clear message about the library’s unique role and contribution is an essential first step” (p. 5).
Measurement Tools
With criteria in hand, the information professional has myriad tools they can use to measure programs and services. Foremost among these are surveys, which can capture sentiments such as user opinions of programs and services, but they can also focus on quantitative data such as return on investment (ROI). Vilches (2017), writing about an ROI study, said that “this kind of information-collection exercise can inform services and purchasing decisions” (p. 466). Surveys of library employees and surveys of patrons can both help uncover service and program value through results that indicate things such as time saved in using a service or increases in visitors or donations resulting from a program.
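To make the arithmetic behind such an ROI survey concrete, the sketch below works through a purely hypothetical calculation: respondents self-report annual hours saved by a library service, that time is valued at an assumed staff rate, and the result is compared against an assumed operating cost. All names and figures here are illustrative, not drawn from Vilches (2017) or any cited study.

```python
def estimated_roi(hours_saved, hourly_rate, annual_library_cost):
    """Estimate ROI from survey-reported time savings.

    hours_saved: self-reported annual hours saved, one entry per respondent
    hourly_rate: assumed fully loaded staff cost per hour (dollars)
    annual_library_cost: assumed yearly cost of the service (dollars)
    """
    # Dollar value of the time respondents say the service saved them
    benefit = sum(hours_saved) * hourly_rate
    # Net gain (or loss) expressed as a fraction of the cost
    return (benefit - annual_library_cost) / annual_library_cost

# Hypothetical example: 120 respondents each reporting about 40 hours
# saved per year, valued at $50/hour, against a $200,000 service cost.
roi = estimated_roi([40] * 120, 50, 200_000)
print(f"Estimated ROI: {roi:.0%}")  # prints: Estimated ROI: 20%
```

As the essay notes, the inputs here are perceived benefits: the self-reported hours are not independently verifiable, so the output is an estimate useful for comparison and decision-making rather than an ironclad figure.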
One important lesson I learned during my coursework, however, is to understand the limitations of both the methods and the results of evaluations. For example, the types of survey results mentioned in the preceding paragraph are not hard figures, and Vilches (2017) noted that one drawback of the ROI survey is that it “primarily measured the library’s perceived benefits to the user. There was no ironclad way” (p. 465) to prove the benefits (e.g., it might be difficult to verify the accuracy of the amount of time a library employee says they saved by using a service, or the exact dollar amount increase in donations resulting from a program). Still, these measurements do deliver meaningful insights, and their results remain valuable for making decisions about services and programs.
Evidence
Evidence 1: INFO 210 Reference and Information Services – Phone Reference Evaluation
For this exercise I had to call a public library with a particular research question and then evaluate the service they provided using RUSA guidelines as delineated in the association's document “Guidelines for Behavioral Performance of Reference and Information Service Providers.” Specifically, I recorded my observations regarding how well the Fort Vancouver Regional Library’s phone reference services matched RUSA guidelines for visibility and approachability, interest, listening and inquiring, searching, and follow-up. These criteria are not specific to phone reference; they also apply to other virtual reference services, such as web chat and email, and to in-person services as well.
This assignment shows that I am able to assess a library service against established criteria and make recommendations for improvement. The goal of the RUSA guidelines “was to identify and recommend observable behavioral attributes that could be correlated with positive patron perceptions of reference librarian performance” (ALA, 2013, Introduction section, para. 1). Thus, while there were no numerical metrics involved, the criteria are nevertheless designed to be measurable, and to aid information professionals in evaluating and training library staff to improve services to patrons.
Evidence 2: INFO 202 Information Retrieval System Design – Website Evaluation and Redesign
One of the main services libraries now provide to patrons is access to information and programs via their websites. For this group project, my teammates and I conducted an audit of a real-world library website to evaluate it against criteria such as organization and functionality. We wrote the document as a group, but each of us spearheaded a section, with mine being “Site map of existing site & discussion.”
Judging factors such as ease of use can be a subjective endeavor. However, Wakeham (2004) wrote, “The library website is an alternative entry to the services offered. It should be professionally designed and constructed” (p. 239), suggesting there are objective criteria by which to evaluate a site. This project shows that I am able to assess a service such as a library website according to more concrete standards, and to recommend actions for establishing further measurable criteria to obtain insights for further improving the service.
The objective criteria we used included identifying broken links and noting duplicate sites and navigational pathways that would likely lead to user confusion. The library site map flowcharts, which I laid out in the web-based app Draw.io, depict the current and our proposed website organizations to visualize these flaws and contrast them with a more streamlined and easy-to-follow layout. We also recommended surveys and usage data for additional quantifiable insights.
Evidence 3: INFO 231 Issues in Special Libraries and Information Centers – Internal Marketing at Special Libraries
In this research paper I discussed marketing to internal audiences at special libraries. Librarians of every type must show their value to their organization. But while librarians in public and academic libraries might inherit standard measures of achievement, such as the number of patrons served or reference materials accessed, those in special libraries may have to get creative in how they assess and evaluate the programs and services they offer, both to improve those services and to demonstrate that they are contributing to the success of their company, hospital, or institution.
Megaridis (2018) wrote, “As many people are unaware of special libraries and information centers and their roles, marketing of resources and services is even more critical in these environments than in other library sectors” (pp. 109–110). No matter the type of library or information organization, it’s not enough to provide value—you have to be able to communicate that value, and doing that effectively requires quantitative figures. Particularly in the “Evaluation” section of this paper, I show that I can identify both pertinent metrics and methods of obtaining them, such as user satisfaction surveys and ROI studies.
Conclusion
While evaluation in library and other information center settings might seem nebulous and hard to quantify, my coursework taught me that I have valuable tools at my disposal for assessing the programs and services my institution provides. These tools include resources for establishing measurable criteria as well as the actual means of collecting data. Moreover, I learned that as an information professional, conducting such evaluative projects and activities is crucial for making decisions about those programs and services and for establishing their value, thus helping prove and communicate the library’s overall value. With the knowledge I have gained from my coursework, I could conduct meaningful, evidence-backed evaluations of programs and services in any information organization I might work for in the future.
References
American Library Association. (2008, January 4). Measuring and assessing reference services and resources: A guide. https://www.ala.org/rusa/sections/rss/rsssection/rsscomm/evaluationofref/measrefguide
American Library Association. (2013, May 28). Guidelines for behavioral performance of reference and information service providers. http://www.ala.org/rusa/resources/guidelines/guidelinesbehavioral
American Library Association. (2018, January 7). Standardized statistical measures and metrics for public services in archival repositories and special collections libraries. https://www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/statmeasures2018.pdf
Buck, W. (2016). Providing help in hard times: A blueprint for successful strategic planning. Journal of Library Administration, 56(2), 199–208. https://doi.org/10.1080/01930826.2015.1124703
Crumpton, B. E., & Porter-Fyke, E. (2016). The special library: Applicability and usefulness of the MLIS in non-traditional library settings. The Bottom Line, 29(3), 151–165. https://doi.org/10.1108/BL-04-2016-0017
Matthews, J. R. (2018). Evaluation: An introduction to a crucial skill. In K. Haycock & M.-J. Romaniuk (Eds.), The portable MLIS: Insights from the experts (2nd ed., pp. 255–264). Libraries Unlimited.
Medical Library Association. (n.d.). Evidence you can use to communicate library value. https://www.mlanet.org/p/cm/ld/fid=1247
Megaridis, C. (2018). Working in different library environments: Special libraries and information centers. In S. Hirsh (Ed.), Information services today: An introduction (2nd ed., pp. 106–116). Rowman & Littlefield.
Rogers, A. E., & Densch, K. L. (2017). Marketing your expertise. In J. M. Matarazzo & T. Pearlstein (Eds.), The Emerald handbook of modern information management (pp. 455–474). https://ebookcentral-proquest-com.libaccess.sjlibrary.org/lib/sjsu/detail.action?docID=4981621
Vilches, K. (2017). ROIs and surveys in special libraries: One corporate experience. Journal of Library Administration, 57(4), 461–467. https://doi.org/10.1080/01930826.2017.1300457
Wakeham, M. (2004). Marketing and health libraries. Health Information and Libraries Journal, 21(4), 237–244. https://doi.org/10.1111/j.1471-1842.2004.00540.x
Wallace, L. K. (2004). Libraries, mission and marketing. American Library Association.