Starting with needs analysis, according to Mandel (2017), wayfinding is one of the critical problems that libraries need to address, yet empirical research in this field is limited. Li and Klippel (2012) note that patrons often struggle with wayfinding, especially in large, multi-level libraries; they found that requests for directions are among the most frequently posed questions at help desks. This is supported by Mandel (2013), who conducted interviews with library patrons and reported that many users struggle with navigation in the library. Observations showed patrons making U-turns or appearing lost or confused. However, the same study noted that these negative experiences do not seem to lead to recommended changes, because patrons may not know what recommendations to make. As a result, the problem remains unresolved in many libraries to this day.
As for the learner analysis, patrons of public libraries are usually residents of the town in which the library is located. In the case of Hamden Public Library, this includes residents of Hamden, Connecticut, USA. According to the latest data from the U.S. Census Bureau (2021), the median age of Hamden residents is 37, with a population that is predominantly male. About half of the residents hold a bachelor’s degree or higher. The town’s racial and ethnic composition is about half White, with the other half representing a diverse mix that includes African American, Asian, American Indian and Alaska Native, and Native Hawaiian and Other Pacific Islander residents, among others. Nearly all households speak English only at home. More than half of the population is employed, and the median household income is $80,779. Additionally, nearly all households have access to a computer and a broadband internet subscription at home. However, digital literacy remains a concern, which is why various library-based digital literacy programs have been established to support the community (Canham-Clyne, 2022). Furthermore, approximately 10% of the population has some form of disability, which must be considered when designing accessible instructional materials.
As for the context analysis, most contextual factors in libraries are supportive of learning. Patrons generally have a positive attitude toward learning (Israel, 2013), and there are a variety of services and resources beyond books, such as photocopying and printing services, computer stations, internet access, the library catalog, and study spaces, among others (Mushtaq & Arshad, 2022).
Lastly, for the task analysis, Mandel (2013) revealed that public library patrons take different routes within the library depending on the purpose of their visit. There is no single stop or destination that applies to all users, as each patron’s path is shaped by their individual goals, whether it be borrowing books, using computers, attending programs, or accessing specific services.
Murphy (2014), in her study of library patrons, revealed that patrons use a variety of informal, self-directed, and information-sharing approaches to learn about library resources and services. She highlighted the fact that the library naturally functions as an informal learning environment. Patrons are typically adults who prefer independence; as a result, formal instruction and direct assistance often do not resonate with them. The study concluded with a recommendation for libraries to continue developing resources that support patrons’ informal and self-directed learning preferences. Building on Murphy’s (2014) findings, it can be inferred that self-directed learning (SDL) may be the instructional approach best suited to patrons. According to Garrison (1997), SDL consists of three (3) dimensions. First is self-management, which refers to the learner’s ability to set their own goals. Second is self-monitoring, which refers to one’s ability to track and assess one’s own progress. Third is motivation, which refers to one’s internal drive to engage in and complete a task.
To address wayfinding challenges, scholars have proposed various materials and aids to support patrons. Cumberbatch et al. (2023) redesigned their library’s existing map and added more visual elements. Rakshikar and Powdwal (2020) recommended the development of a printed library map that patrons can carry with them and that is also prominently displayed near the catalog computers. Williams (2018) introduced a library concierge service staffed by personnel who provide personalized wayfinding assistance. Melcher (2023) focused on refreshing the library’s signage system by adding intuitive and strategically placed signs throughout the space.
Other researchers have focused on more digital and technology-based solutions. For instance, Chia (2014) developed a mobile 3D library map application that offers a 360° panoramic view of the library, path overviews, and step-by-step directions to specific locations. Similarly, Lee et al. (2015) created an interactive library map accessible through users' personal devices.
A well-known model for formatively evaluating developed instructional products is the Alpha-Beta model. In this model, formative evaluation consists of two (2) key stages. The Alpha stage involves expert review, where specialists examine the product to identify design flaws, usability issues, and alignment with intended objectives. The Beta stage follows, in which a small group of target users interacts with the product to provide feedback on its effectiveness, clarity, and user experience.
Lee et al. (2015), in their study involving the development of an interactive library map, focused on the Beta stage of formative evaluation, wherein a group of patrons was invited to pilot test the map and provide feedback on its usability and effectiveness. Cumberbatch et al. (2023) conducted a survey with a group of users to assess satisfaction levels and identify gaps for improvement in their redesigned university library map.
When implementing a solution developed for library wayfinding, a common method of evaluation is the timed navigation task, in which users are asked to locate specific areas or resources within the library while their time and accuracy are measured. Chia (2014) used this method to assess her 3D Library Map model, and Cumberbatch et al. (2023) employed a similar approach when they redesigned the library map for their university.
Kirkpatrick’s Model of Evaluation is a well-known and widely used framework for evaluating the effectiveness of instructional programs. According to Smidt et al. (2009), it could be used to assess the likelihood that a training program would satisfy the needs of participating individuals as well as the organization conducting the training. It is composed of four (4) levels of evaluation: reaction, learning, behavior, and results.
No documented and published studies have yet demonstrated the use of Kirkpatrick’s Model of Evaluation to assess instructional programs or materials specifically implemented for library patrons. Nonetheless, the model has been used to evaluate brochures, as seen in the study by Ruiz (2019), which proposed applying Kirkpatrick’s model to assess a brochure guide on internet resources for hearing parents of deaf children.
Starting with needs analysis, since libraries typically do not employ experts in map design, they often commission someone externally or rely on a volunteer to create instructional maps for patrons. However, as revealed in the study of Cumberbatch et al. (2023), due to ongoing changes in layout and construction, these materials quickly become outdated. This leads to a compounding problem, where outdated maps not only fail to support patrons, but also contribute to wayfinding confusion.
As for learner analysis, data from the United States Census Bureau (as cited in Data USA, 2017) shows that the average age of library staff is approximately 47. The majority of library staff are female, and nearly all hold a graduate degree, typically in library and information science. The racial and ethnic composition of library staff is less diverse than that of the general population, with most identifying as White, and smaller proportions identifying as African American, Asian, American Indian and Alaska Native, Native Hawaiian, and other racial backgrounds. Their median pay is around $64,320 (US Bureau of Labor Statistics, 2024). Nearly all have access to a computer and a broadband internet connection, and most receive digital literacy training either through formal education or professional development initiatives. There are library staff with disabilities (American Library Association, 2019), although the actual number is not well documented in national datasets.
On context analysis, contextual factors within libraries are typically supportive of learning. Librarians, as pillars of an educational institution, are well-documented to have a positive attitude toward learning (Al-Qallaf, 2006). They are intrinsically motivated to engage in professional development activities, driven particularly by “personal satisfaction, development of new knowledge, challenging tasks, preparation for future work, and networking with other librarians” (Chang et al., 2014).
Lastly, on task analysis focused on map editing, there is scarce literature on librarians’ direct experience with map maintenance. It is common for libraries to rely on volunteers or one-time consultants to develop maps or implement wayfinding solutions (Chia, 2014; Cumberbatch et al., 2023; Melcher, 2023; Rakshikar & Powdwal, 2020; Williams, 2018), likely because such tasks fall outside the typical responsibilities of library staff. Nevertheless, Yi (2016) revealed that librarians use a variety of techniques to promote services and resources that may be adapted for map-related tasks. These promotional efforts often span across digital media (e.g., library websites, social media), print materials (e.g., booklets, brochures, flyers, direct mail), and in-person events (e.g., exhibits, displays, library tours). The skills used in creating these materials may be transferable to map maintenance responsibilities.
For online workforce learning, scenario-based learning is one of the most widely recommended instructional strategies. According to Clark and Mayer (2012), it allows adult learners to apply what they have learned in simulations of real-world scenarios without the risk of real-life consequences. This approach enables learners to make mistakes, reflect, and adjust their understanding in a safe and forgiving environment. This is supported by the study of Blakiston (2010), who concluded that scenario-based learning is an effective strategy for shifting learners from passive recipients to active participants in the learning process. In fact, according to Mehall (2021), students perceive scenario-based e-learning and in-class scenario-based learning as equally effective.
In developing online courses for the workplace, industry-recognized tools such as Articulate 360 are commonly used due to their flexibility, ease of use, and support for interactive, multimedia-rich content (Orsborn et al., 2017; Osadcha et al., 2021). Learning theories that apply to e-learning and online course development include Gagné’s Nine Events of Instruction, as illustrated in the study by Kamal et al. (2024), who used Gagné’s model as the framework for structuring their e-learning application. Mayer (2013), on the other hand, applied his own theory of multimedia learning and concluded that learners benefit more from computer-based instruction when it incorporates high-quality multimedia designed according to his principles of multimedia learning.
For the formative evaluation of an online course, the well-known Alpha-Beta model may be applied. This model includes a stage for expert review (Alpha) and a stage for usability testing with representative users (Beta). For instance, Kamal et al. (2024), in developing their e-learning application, first conducted an expert evaluation with a small group of specialists. This was followed by a usability testing phase involving a sample from the target user group to assess functionality, clarity, and user experience. Lee-Jayaram et al. (2019) likewise conducted Alpha-Beta testing as part of the formative evaluation of the faculty development course they designed. In both studies, these evaluations provided valuable insights, particularly regarding potential improvements to the e-learning courses developed.
Kirkpatrick’s model has been used to evaluate training programs for staff. Ziaei and Bolandi (2014) used the model to evaluate librarians’ training courses at public libraries, and the results showed that the training courses were deemed successful from the participants’ perspective. These results are supported by a study by Cobblah and van der Walt (2016), which found a positive correlation between library staff work performance and staff training and development.
Moreover, Galloway (2005) concluded that Kirkpatrick’s model remains relevant in evaluating distance delivery and e-learning programs, such as online courses. This is supported by the study of Shallal et al. (2023), which evaluated a virtual faculty development program in bioethics using the model and concluded that the online course successfully enhanced the clinical faculty’s understanding and teaching of bioethics. Tourrette et al. (2024) evaluated an e-learning course, developed using Rise 360, to train general practitioners in planetary health and found that it significantly increased participants’ self-assessed knowledge scores and positively altered their planetary health behaviors.