
ESPL Assessment Background Survey Response

Part 1:

 

 

 

EFFECTIVE, SUSTAINABLE, AND PRACTICAL LIBRARY ASSESSMENT

 

 

Assessment Visit Background Information

 

 

Below are some areas we would like information about before our assessment visit. Your response will give us a better sense of the assessment work that has been done at your institution, specific issues that you'd like to see addressed, and your expectations for this visit.

 

 

  1. Short summary of what's been done in assessment in the last few years, including how you've used assessment information. Attach any relevant documents (or provide URLs).

The Web Committee set up a Usability Group a few years ago to assess the Library's web site. The Usability Group has conducted usability testing of the site with individual students, and we also ran five focus groups with students during the last academic year. The results of both the usability testing and the focus groups are available on the Library's intranet. Since the intranet is not available to you, we've attached the latest annual report of the Usability Group (usability-annual-report-07-08.doc).

 

 

Boston University's Office of Information Technology publishes web site statistics monthly, and Pardee Management Library has tracked these statistics over time for the Libraries (see attached: Web Hits 2007 -2008.xls). The new Assessment Group reviewed them and concluded that they do not give us the specific information we need, so the Group is in the process of implementing Google Analytics on the Library's web site.

 

 

The Libraries implemented EZproxy late in the summer of 2007, and since then we've been investigating the use of EZproxy statistics for assessment. Some summary statistics are below.

 

 

To highlight the unique-user figures: 17,658 for the fall semester and 22,235 for the spring semester. These are counts of the portion of the BU community that accessed library subscription resources at least once during those periods. The highest estimate of the total university population, i.e., full- and part-time students, faculty, and staff, now appears to be 42,052, and we currently report 24,905 as our student FTE. So spring usage ranges from roughly 53% of the community to about 89%, depending on how one decides to count the "community".
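
A quick check of that arithmetic, as a minimal sketch using only the counts quoted above (the variable names are ours, for illustration only):

    # Spring 2008 EZproxy coverage, using the figures quoted above
    unique_users_spring = 22235
    total_population = 42052   # highest estimate: all students, faculty, and staff
    student_fte = 24905        # current student FTE figure

    print("Share of total population: %.0f%%" % (100.0 * unique_users_spring / total_population))  # ~53%
    print("Share of student FTE:      %.0f%%" % (100.0 * unique_users_spring / student_fte))       # ~89%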

 

EZproxy Usage Summary for Fall 2007:

                           September    October   November   December   Semester
Total Logins                  29,484     45,423     46,556     31,089    152,552
Average Per Day                  983      1,465      1,514      1,003      1,250
Unique Users Per Day
  Total                       20,330     32,367     33,054     21,584
  Average                        678      1,044      1,079        696
  Low                            326        465        346         93
  High                         1,251      1,402      1,707      1,623
Unique Users/Month             7,855     10,545     10,769      8,170

Unique Users for the Fall 2007 Semester: 17,658

On-Campus vs. Off-Campus Usage
Off-Campus                     47.2%      46.2%      47.4%      52.3%
On-All-Campuses                52.8%      53.8%      52.6%      47.7%
  (On-ChRiv-Campus)            51.0%      52.0%      51.0%      46.1%
  (On-MED-Campus)               1.8%       1.8%       1.6%       1.6%
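
As we continue investigating EZproxy statistics, counts like those in the table above could eventually be produced directly from the EZproxy logs rather than compiled by hand. The sketch below is only illustrative and rests on assumptions: it supposes an Apache/NCSA-style log line with the client IP in the first field and the authenticated username in the third, it treats each log line as one login event (true only for a login/audit-style log, not the full request log), and the log file name and on-campus IP prefix are placeholders we would need to replace with our actual configuration.

    # Illustrative sketch: monthly logins, unique users, and on/off-campus split
    # from an EZproxy log. Assumes one line per login event in the form:
    #   128.197.x.x - username [01/Sep/2007:10:15:32 -0400] "GET ..." 200 1234
    # The log path, field positions, and campus IP prefix are placeholders.
    from collections import defaultdict

    ON_CAMPUS_PREFIXES = ("128.197.",)   # placeholder campus network block

    logins = defaultdict(int)            # month -> total login events
    on_campus = defaultdict(int)         # month -> logins from campus addresses
    uniques = defaultdict(set)           # month -> set of distinct usernames
    semester_uniques = set()

    with open("ezproxy.log") as log:     # placeholder file name
        for line in log:
            parts = line.split()
            if len(parts) < 4:
                continue
            ip, user = parts[0], parts[2]
            month = parts[3].lstrip("[").split("/")[1]   # e.g. "Sep"
            logins[month] += 1
            uniques[month].add(user)
            semester_uniques.add(user)
            if ip.startswith(ON_CAMPUS_PREFIXES):
                on_campus[month] += 1

    for month in ("Sep", "Oct", "Nov", "Dec"):
        if logins[month] == 0:
            continue
        pct_on = 100.0 * on_campus[month] / logins[month]
        print("%s: %d logins, %d unique users, %.1f%% on campus"
              % (month, logins[month], len(uniques[month]), pct_on))
    print("Unique users for the semester: %d" % len(semester_uniques))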

 

 

Early in 2008 we established an Assessment Group, which grew out of the Library's Web Committee; we realized we needed a group to develop an Assessment Plan and implement it. The charge of the Assessment Group (AssessmentGroupCargev1.doc) and its first report to the Director (Assessment Group Annual Report 07-08.doc) are attached.

 

 

  2. Important assessment motivators
    a. External (e.g. accreditation, governing board, university)

Boston University is preparing for re-accreditation in March 2009. This upcoming re-accreditation was one of the main reasons we formed an Assessment Group and decided to participate in ARL's ESP Library Assessment Program.

    b. Internal (e.g. process improvement, better service/support, we should be doing this)

Both the University's and the Library's strategic plans are user focused, and BU's culture is becoming increasingly so. Improving service for faculty and students is becoming the most important factor influencing decisions in the Libraries.

 

 

  3. Organizational structure for assessment
    a. Individual with responsibility

No one staff member has this responsibility.

    b. Committee or other group

The Assessment Group, formed early this year and chaired by Dan Piekarski, is a sub-group of the Web Committee.

    c. Reporting line

The Assessment Group reports to the Library Director, the Web Committee, and the staff as a whole.

 

 

  4. What's worked well (short description)

The EZproxy implementation was carried out as a joint initiative of the University's Office of Information Technology and the Libraries as a whole. The collaboration went very well and was completed late in 2007. The implementation group will consider which statistics to collect, and how to collect them, when time permits.

 

 

The usability testing and the focus groups were conducted to improve the Library's web site and have provided much valuable data. Many web site changes are now based on student recommendations. For instance, this summer the Web Committee is investigating federated searching tools, which was the strongest recommendation to come out of the focus groups. Many of the more easily implemented recommendations have already been acted on or are under development.


  5. What's been a problem/sticking or breaking point (short description)

Staff resources are a frequent limiting factor. Members of committees and task forces such as the Web Committee, the Usability Group, and the Assessment Group typically wear many hats and cannot devote the time needed to move these initiatives along expeditiously.

 

 

  6. Areas that you want us to address (e.g. organizational structure, assessment culture, specific issues, methodology, analysis and reporting, management information, using data for improvement)

Three areas we would like you to address are methodology, analysis and reporting, and using data for improvement. The last, using data for improvement, is the most important.

 

 

  7. Expectations/outcomes for our participation in this effort

We welcome your visit and recommendations. We hope your participation and insights will help us implement a practical, sustainable program of assessment.

 

 

Part 2:

 

 

EFFECTIVE, SUSTAINABLE, AND PRACTICAL LIBRARY ASSESSMENT

 

 

Statistics Inventory

Note: Boston University Libraries encompass Mugar Memorial Library and branches (Science & Engineering, Pardee Management, African Studies, Music, and Education) which report to the Library Director, as well as three professional libraries (Law, Theology, and Medicine) which report to the Deans of their schools. The Libraries have very good collegial relationships and work together on common areas of interest such as EZproxy implementation, Digital Repository development, and ARL statistics reporting. For these surveys, we’ll be responding only for Mugar Memorial Library and branches.

 

 

Question 1: What statistics (other than ARL/ACRL Annual Statistics) are collected by library departments or programs? Please be as specific as possible.

Branch libraries typically collect statistics such as circulation counts, web site hits, instruction sessions, and reserve use. We've attached annual report statistics for two branch libraries: Pardee Management and Education. Pardee Management Library is our largest branch with the most traffic; the attached files include Pardee instruction statistics (instruction stats 0708.doc), Pardee access statistics (AccessStats2008DP2.xls), and a series of Pardee statistics from 1997 (PardeeStats1997+.xls). PERL (Education) is one of our smaller branches with much less traffic (PERL Appendix 2007-2008.doc).

 

 

Individual departments, such as Serials, ILL, and Collection Management, typically collect statistics directly related to their activities. For instance, in Collection Management the Collection Development Manager collects a variety of statistics from individual selectors and collates the numbers. We've attached the most recent annual report statistics for Collection Management (collections-mugar-branches-07-08.doc) as an example.

 

 

           

Question 2: How are the statistics kept?

Each committee, department, and branch keeps track of its own statistics. Statistics collected by Library committees, such as the Usability Group, are made available to all staff on the Library's intranet. Departmental statistics (ILL, Serials, Circulation, and Collection Management) are shared with staff as needed, at the discretion of the department heads. Branch library statistics are usually not shared. The Library Director keeps track of all departmental and branch statistics, as well as the annual compilations, but there is no single repository of statistical data.

 

 

Question 3: a) Why do you collect them?
            b) How do you use them?

Statistics are collected for annual reports, for accreditation, and for ARL. These statistics are used to inform, rather than drive, decision making in the Libraries. We would like to make much better use of our statistics. This is why we have formed an Assessment Group, why one of us attended the Assessment Conference in Seattle, and why we are participating in Effective, Sustainable, and Practical Library Assessment.

 

 

 

 

Jim Self

Steve Hiller

Visiting Program Officers, ARL

self@virginia.edu

hiller@u.washington.edu
