Assessment Committee Meeting Minutes - 2/16/2017
NEXT MEETING: Feb 22, 2:00-3:30 (location TBD)
Present: Linda (chair), Brendan (minutes), David, Steve, Dan, Konstantin (virtual)
- Review comments about Introduction & Summary
- Review feedback from specialists
- Organization of files in OpenBU
1) Organization and Files in OpenBU
o David proposes to merge the subreports into a unified report. This will require some rewriting of cross-references between subreports and removal of redundant summary material.
2) Reviewed notes about Introduction/Summary
o Recommendations are necessarily “increase-improve-acquire” oriented (as opposed to “cut-reduce”) due to how the survey was structured.
o Final recommendation was agreed upon.
3) Specialist Feedback
o Wonderfully high response rate.
o Kanopy wording should not be “has been negotiated” – technically incorrect, still in negotiation.
o Discussion of the statement about the library inventory project. We’ll decide on wording once we know for sure what it will entail and when it is slated to be completed.
o Only one comment mentioned the BC search interface, so our statement presenting it as a general concern is incorrect.
o We should ask Vika about her comments on misrepresented data in case we are missing anything.
4) Discussed statistics behind graphs
o Brendan fact checked the data. David and Brendan discussed different methods of acquiring statistics, in particular the Use/Importance joint charts.
TODO: Linda to contact Doreen about timeline and scope of the inventory project
David to reach out to Vika to address her question
Assessment Committee Meeting Minutes - 2/7/2017
Present: Linda (chair), Brendan (minutes), Steve, David, Tom, Dan
Agenda: REVIEW LIBRARY COMMENTS
- Linda agrees that data spreadsheet is okay to share with the Data Repository people.
- Linda has the list of reviewers for the subreports.
Discussion about Comments on Library Reports
- Used the positive-to-negative ratio when there was a significant number of NA responses.
- Recommendations for the Library Collections were discussed. A problem lies in emphasizing the importance of the budget without pushing it in the provost’s face. Wording was handled delicately for all parts.
- The East Asian language increase recommendation was discussed: is the problem a lack of staff? How do we phrase it in a way that addresses the issue?
- We reviewed new recommendations – Shelf Organization, Bib Managers
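The positive-to-negative ratio mentioned above can be sketched in a few lines. This is a hypothetical illustration: the mapping (4-5 counted as positive, 1-2 as negative, N/A answers dropped from both sides) is my assumption, not taken from the minutes.

```python
from collections import Counter

def pos_neg_ratio(responses):
    # Hypothetical sketch: 4-5 = positive, 1-2 = negative (assumed mapping);
    # "NA" and neutral 3s simply fall out of both counts.
    counts = Counter(responses)
    positive = counts[4] + counts[5]
    negative = counts[1] + counts[2]
    return positive / negative

# Sample with many N/A answers, where a ratio reads more cleanly
# than percentages of the full respondent pool.
sample = [5, 5, 4, 4, 4, 2, 1, 3, "NA", "NA", "NA"]
print(pos_neg_ratio(sample))  # -> 2.5
```

The ratio sidesteps the question of what denominator to use when N/A responses are numerous, which is presumably why it was preferred here.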
Linda will compose messages to the listed reviewers. Will offer 3 days to review.
Next Meeting: 2/16 (12:30-2:00) in the Lg. Mugar Conference Room, to discuss the Executive Overview
Assessment Committee Meeting Minutes - 2/1/2017
Present: Linda (Chair), Tom, Dan, David, Brendan, Steve (minutes)
Next meeting: 2:00-3:30 on Tuesday, 2/7, in the Mugar administrative conference room
1.) Review of spreadsheet on survey data broken down by departments. RESOLVED
2.) Review of Library Support of Research report. REVIEWED AND EDITED PDF
3.) Information needed for depositing documents in OpenBU: Eleni will need official author(s) (Assessment Committee, individual members?), titles, metadata, collection title - LINDA
4.) In addition to Library Heads, which specialists should review the final draft for essential changes?
LINDA HAS PROPOSED: Jack Ammerman, Mary Blanchard, Tom Casserly, Joe Harzbecker, Bob Hudson, Arlyne Jackson, Helen Jacoby, Anna Lawless, Tim Lewontin, Amy Limpitlaw, Ken Liss, Holly Mockovak, Beth Restrick, Jennifer Robble, Mike Ward, Ron Wheeler, Vika Zafrin
Please review this list of specialists and AULs. Please get back to Linda by the end of the day on Friday, 2/3, with any comments, deletions, or additions.
5.) We need a title for the summary/general report (Executive Summary? Overview? General Summary?)
Assessment Committee Meeting Minutes - 11/16/2016
Present: Linda (chair), Dan (minutes), Brendan, David, Ellen
Next meeting: 12:30PM on Wednesday, 2/1, in the Estin room.
A) Discussion of Teaching and Learning report
B) Discussion of Library as Place report
C) Discussion of Comment Trends
D) Data Sharing Group
Out of time
Assessment Committee Meeting Minutes - 12/2/2016
Attendees: DF, DB, KS, LP, TH
Next meeting: Linda to send out a Doodle
Assessment Committee Meeting Minutes - 11/16/2016
Present: Linda (chair), Dan (minutes), Brendan, David, Steve, Tom
Next meeting: 1:00PM on Friday, 12/2, in the Mugar admin conference room.
Discussion of 2016 Library Survey Report
a. Qualitative analysis (Brendan, Konstantin, Ellen)
Discussion of comment document
The sub-team working on this collected:
- interesting comments that are typical or come up often, or that enrich a statistic
Linda noted we want to present the data as it is, neither negative nor positive. Brendan asked, how do we want to be "representative" of the comments? Steve answered: we want to gather comments that illuminate issues, whether they are representative of the whole sentiment or not. David noted that pop-out comments should be reasonably well-written, and at least have something to do with the nearby charts or data.
The committee went through comments to discuss interesting ones... collecting them on "sheet 2" within the Google Doc above.
b. Quantitative analysis and report (David)
David continued to work up charts and walked through a few interesting ones: for instance, grad students responding to how much the library helps them teach. The committee discussed which stats were interesting and how best to format the charts. Tom suggested we show the absolute number of folks on these charts; it could help give some perspective.
David continued, bringing up three charts side by side, showing use, importance, and satisfaction of course reserves. For this he limited the population to teaching faculty, and the committee discussed whether that was appropriate. Linda noted we should change the label to Teaching Faculty to make it more transparent, if that is the population. This chart showed that a relatively low number of teaching faculty use reserves, and Steve noted it is therefore the number NOT using course reserves that may be more interesting: is course reserves a dying technology, similar to the fax machine?
Out of time.
Assessment Committee Meeting Minutes - 11/2/2016
Present – Linda (chair), Brendan (minutes), Tom, Konstantin, Steve, David, Sarah
Assessment conference: BU looked great at the conference according to Linda. Congratulations everyone. Posters will be online.
Agenda: User Survey Reports (Quant, Qual, Contextual Comments)
- David: Quantitative Analysis
o Options presented for dealing with missing data (unanswered questions, NA responses, etc.) in Likert questions.
1) Show only 1-5 responses.
2) Show all possibilities (including NA, unanswered)
3) Show all but the non-answers
4) Show 1-5 and the % of respondents answering that part of the question
5) Show 1-5 and the non-responses for those answering *some* part of the question.
o David and Linda offer to go with the simplest graphs with detailed, unambiguous captions.
o In-depth discussion about the significance of including/excluding “N/A”. Include? They made an active decision in selecting N/A. They are a significant percent of the population. Exclude? Does it contribute to the story of the data, or is it distracting? Is it interpretable? Ultimately every graph will need to be scrutinized and decided on individually.
o Brendan requests we reconsider always including NA options (listed in Survey Keeper)
o Faculty and Graduate questions about funding were different. One had checkboxes. One had radio Y/N/?.
o Survey Keeper: Check boxes don’t allow explicit y/n options, meaning analysis is more confusing and requires more context.
o Konstantin proposes having our PDFs link to an online, publicly accessible, interactive data dashboard
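A few of the options above can be sketched as one function with toggles for which responses enter the denominator. The data here is hypothetical; the encoding (1-5 for scored answers, "NA" for a selected N/A, None for an unanswered question) is an assumption for illustration only.

```python
from collections import Counter

def likert_breakdown(responses, include_na=False, include_blank=False):
    """Percentage breakdown of one Likert question (hypothetical data).

    Items are 1-5, "NA" (respondent chose N/A), or None (unanswered).
    The flags choose which of the options from the minutes the
    denominator reflects.
    """
    pool = [r for r in responses
            if r in (1, 2, 3, 4, 5)
            or (include_na and r == "NA")
            or (include_blank and r is None)]
    total = len(pool)
    return {k: round(100 * v / total, 1) for k, v in Counter(pool).items()}

# 6 scored answers, 2 N/A selections, 2 blanks
sample = [5, 4, 4, 3, 5, 2, "NA", "NA", None, None]

print(likert_breakdown(sample))                    # option 1: 1-5 only
print(likert_breakdown(sample, True, True))        # option 2: everything
print(likert_breakdown(sample, include_na=True))   # option 3: all but blanks
```

The same counts yield noticeably different percentages depending on the denominator, which is why every graph needs an unambiguous caption stating which population it reflects.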
- Brendan: Qualitative Analysis
o Google sheet has categories of comments broken down roughly by topics in report, representative comments, and general findings from comments. Research findings is the only one currently shared.
o Brendan's spreadsheet at the following link: https://docs.google.com/a/bu.edu/spreadsheets/d/1onXmmLsrBK9nCY6nlNtQooMUK5ASNF1_82ypRaj7S6k/edit?usp=sharing
o Brendan wants specific ideas for comment categories to pull.
o Trends in the resources topics were discussed.
o Steve requests comments about distance students and specific subjects for lacking resources.
Next meeting---Wednesday 11/16, 12:30-1:45 in Mugar Conference Room.
Assessment Committee Meeting Minutes - 10/19/2016
Attendees - Dan, Brendan, Steve (minutes), Linda, David, Tom, Ellen, Sarah, and Mike Ward (invited guest)
User Survey Report- continued discussion
a. Quantitative data & report writing – discussion led by David
For survey keeper: Storage location of research data – answers aren’t mutually exclusive (e.g., someone relying exclusively on an IS&T solution could reasonably check both ‘commercial cloud storage’ and ‘IS&T storage solution,’ Google Drive being an IS&T solution)
Regarding sharing draft documents with others in the library, it was decided that we should hold off until the text of each had been fleshed out more.
b. Qualitative data analysis - tabled
See Brendan’s spreadsheet with subjects and topics that David covered in our last meeting, with related comments pulled out of the qualitative data:
Next meeting: 1:00PM on Wednesday, 10/26, in the Mugar admin conference room.
Assessment Committee Meeting Minutes - 10/5/2016
Absent - Ellen
Assessment Committee Meeting Minutes - 9/22/16
Present: Linda (chair), Dan (minutes), Brendan, Ellen, David, Steve, Tom, Konstantin, Sarah
Next Meeting: Wednesday 10/5, 1-2:30, Lg Mugar Conference Room
- Linda, Ellen: consider and present to the group three or four possible ways to lay out the content of the top level report
2) Discussion of top level report
a) looked at a series of new charts by David using graduation caps to represent respondents (each cap = 100). These were Frequency of Visit charts, both physical and online: one for undergrads, one for grads, one for faculty, and one combining all three. Steve suggested having the axis labels wrap around the content so that they are easier to read. David suggested he may shade alternating rows to help readability, and will otherwise beautify the content. Sarah suggested perhaps producing interactive content online after the main report. Dan suggested the one that combines all of the patron groups might be too busy for our top level report.
b) David presented a diverging bar chart for satisfaction with BU Libraries Search. Considering it is a major aspect of our work, it may be good to feature this. The first faculty survey in 2010 was given just before the first implementation of BULS. This led to discussion about what we really want to feature in the report.
c) Discussion of what belongs in the first report from the library survey 2016.
Steve suggested we include items for which we have made demonstrable (and sometimes even measurable) changes due to survey feedback. Outlets would be an example, but we thought it may be tough to connect many improvements directly to our survey results.
From the paper flip chart in the room: the high-level report must include overall satisfaction with the libraries (1), contributions (1-3), and use of the libraries (the graduation cap charts).
There was much discussion. Finally we decided Linda and Ellen will meet as a subgroup to consider and present to the group three or four possible ways to lay out the content of the top level report / executive summary.
3) Discussion of the qualitative data / comments
Brendan related that the subgroup organized tagging by having members go through and create nodes: Konstantin for faculty comments, Ellen for graduate student comments, and Brendan for undergraduate comments. This was phase one of the project. Phase two would be to merge these 3 sets of nodes into one large set. Phase three would be to un-code items that are not relevant. Konstantin walked through his process of tagging faculty comments. For reference, there were about 1000 faculty respondents, and a total of 695 comments. One thing he did differently from the others was that he coded comments for "attitude": positive, negative, and recommendations. Sarah noted that if questions about some aspect of comments come up after we produce a final node set, concepts can be pulled from NVivo afterward. An example was mentions of peer institutions.
Out of time
Assessment Committee Meeting 9/16/16
Present: Linda (chair), Steve, Brendan (minutes), Dan, Sarah, Konstantin, David, Tom
Next Meeting: Thursday 9/22, 2-3:30, Lg Mugar Conference Room
- David: share paper about divergent bar chart format
- David: make a heat plot of library use data.
Out of time
Assessment Committee Meeting 7/26/16
Present: Linda (chair), Steve, Brendan, Dan, Sarah, Konstantin (minutes), Ellen, David, Tom
Of roughly 150 library staff members, more than 100 responded (~102 or 103).
Findings (available on Google Drive):
- Over fifteen different solutions for storing data
- 73% of respondents share with other library staff members
- 55% would have “most” or “all” of their products on their computer
- Email is the most common way to share
- Staff generally look for policies on the Library website
Recommendations:
- Begin a conversation with your department
- Less is more
- Only use BU-supported solutions
- Develop a routine backup plan
- Ensure work is saved in shared locations
- Adopt simple, usable naming conventions
- Move away from email as a storage and sharing tool
We can share these recommendations with staff as long as there is no personally identifiable information.
Tom’s recommendations: 1) adopt the Office 365 suite as the default solution for the BU Libraries; 2) task a training team to facilitate Office 365 training using both IS&T and local trainers; 3) task a design team to design an intranet for the BU Libraries using SharePoint, and explore ways to ensure the intranet has the appropriate staff resources to remain usable and functional.
We should make clear that these are not final recommendations but a report on the survey. Tom and Ellen will work on the short introduction. Timeline: Linda will email Bob Hudson and AULs on Friday 7/28.
Brendan modified old data sets to match conventions we used in 2016.
David is continuing work on the quantitative analysis. The trends: faculty satisfaction with the library is growing, and the undergraduate and graduate comparison of 2016 and 2010 shows that overall satisfaction with the Library did not change. David is also experimenting with scatter plots and correlation analysis using R.
Brendan, Konstantin and Ellen split up work between three different groups and began coding.
Usage stat for serials has been updated (around 1000 titles), including cost per use data.
Next meeting: August 12, 1pm
Assessment Committee Meeting 6/24/16
Present: Linda, David, Ellen, Brendan, Steve, Dan, Tom (minutes)
Next Meeting July 26, 2016 @ 12:30 in large Mugar Conference Room
To do: Please comment on Data Management Plan by July 18 (all)
Updates from around the table
Assessment Committee Meeting 5/26/16
Present: Linda, David, Ellen, Brendan, Steve, Tom, Konstantin, Sarah, Dan (minutes)
Next Meeting: 6/24/16 from 12:30-2, Mugar Admin Conf Rm
Jason Yee, Head of Library Computing Services at MED, was our guest
Konstantin has finished his doctorate! Congrats.
1) Update on Staff Survey
2) Update on 2016 surveys
Would it be worth doing a longitudinal comparison?
him and cc Brendan as well
3) Data Management Plan
Present: Linda, David, Ellen, Brendan, Dan, Sarah, Steve (minutes)
Notes from the look at the preliminary data analysis of the 2016 faculty user surveys (Brendan)
Survey respondents have been randomized and winners (5 UG, 4 G, 3 F) picked. Brendan will send the names to Linda, who will email w/ banner (running a draft by Ellen beforehand) asking them when they can come in to pick up their prizes. Pictures will be taken when possible and sent to Dan for posting and a news item.
A look at how the data is being cleaned up (Brendan): Brendan has been meticulously recording everything he has been doing to clean up the data. This document is available on the Google Drive 2016 data folder as read-only.
National library surveys including ARL, ACRL, IPEDS
Triennial User Surveys
Website Usage Patterns
Published reports
Data storage & sharing
Next meeting: Noon to 1:30 on Thursday, 5/26, in the large Mugar conference room.
Present: Linda, David, Russ, Steve, Brendan (notes), Tom, Ellen, Dan, Konstantin
SK: Expiration dates matter! Can't backdate on Qualtrics.
Publicity Summary (Dan):
SK: We should have a general image targeted for mass social media.
SK: Medical campus is not good for postering. There are digital displays though.
Undergraduate results (Brendan):
-Brendan: To pull out following data-
Graduate results (Brendan):
-Brendan: To pull out following data-
Tom - Send out the DMP information again
Brendan - to meet with Konstantin about comment analysis.
Next Meeting: 4/20 at 12:45 at LAW. Come early for the NISO webinar on Library Assessment.
BD, TH, DB, LP, KS, EF, RS, SS (telecom), DF
Next meeting: 4/7 Thursday 12:00pm-1:30pm
The survey successfully launched on 3/15/16. There were just 17 faculty emails that bounced.
Thus far: undergraduates 4% (680 responses), graduate students 5% (1,100; the top responders are GRS and the Medical school), faculty 5% (250). Average duration is 20 minutes.
There is a concern that we won’t be able to reach the same 30% we did last time. A positive development is that there are more comments than last time
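As a back-of-envelope check (my arithmetic, not from the minutes), a response count together with a response rate implies the size of the invited population:

```python
# A response count and a response rate imply the invited population size:
# population = responses / rate. Figures below are from the status update.
def implied_population(responses, rate):
    return round(responses / rate)

print(implied_population(680, 0.04))   # undergraduates -> 17000
print(implied_population(1100, 0.05))  # graduate students -> 22000
print(implied_population(250, 0.05))   # faculty -> 5000
```

Those implied population sizes put the 30% target from the previous survey in perspective: matching it would take several thousand more responses per group.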
The launch was staggered to allow for potential intervention, but everything went very smoothly.
Steve suggested that we should develop a second reminder which politely reminds patrons that they haven’t yet taken the survey.
The Task Force - we discussed how to approach communication strategies with the staff in regard to the work of the task force. Linda will contact Bob Hudson.
Every school is covered – digital and physical posters. Reminders will go 3/22 and 3/29
Posters are on the buses
Dan is tracking all the social media: five postings on Facebook, four on the blog, and nine tweets.
News announcements on some of the library web sites.
Idea – can we use mascots (BU Red) in the next survey to boost our response rates?
One of the folders is labeled “Data Management”
Tom suggested two approaches – a number of meetings devoted to this topic, or working together on Google Drive and then having just a few meetings.
We are waiting for the response from Bob Hudson regarding the survey that we would like to put out to the library staff about data management
Data sets and the issue of de-identifying the data - can we release our de-identified data? We can certainly release the survey instrument and reports.
TO DO:
· Review the Google Doc – do everything up to the policy (the first three questions)
· Review Shirley’s comments
BD, TH, DB, LP, KS, EF, RS, SS, DF
Minutes - Assessment Committee – 12/11/2015
Next meeting: Thursday, January 7 at 12pm - Mugar Admin Conference Room
Present: Dan (minutes), Tom H, Tom C, Sarah, Konstantin, Ellen, Linda (chair), Brendan, David, Steve
TechQual results at BU, and a coming National Survey of Student Engagement - Tom C
Applying a Data Management Plan to MINES report material - Tom H, Ellen, Brendan
Staff survey on storing and sharing data - Tom H, Ellen, Brendan
Item for next agenda: Linda
Minutes - Assessment Ctte – 10/30/2015
Present: Tom, Konstantin, Ellen, Linda (chair), Brendan, David, Steve (minutes)
Logistics of 11/19 all-staff meeting:
Minutes - Assessment Committee - 10/22/15
Attendees: DF, RS, LP (chair), EF (minutes), BD, DB, TH, SS
· Harvard assessment librarians will be visiting Monday, November 2 at 11 AM
· Tentatively scheduling all-staff meeting to present MINES report November 19th from 2-3, but not advertising yet
Action items: Linda, Brendan, and Tom will read the appendix of the MINES report and bring feedback. David will rewrite the conclusions section based on today’s feedback, in a cohesive and coherent style, and send around a new draft for feedback
Next Meeting: 10/30 in Law Library Conference Room (third floor) 2-3:30PM.
MINES Conclusions Discussion--general
· Primarily discussion of compiled comments available here
· Will remove “use of library resources for course work is closely tied to the academic calendar, while use for research is more constant” because it doesn’t really get us anything
· KS had suggested quotes from Franklin & Plum article. Decided not to use actual quote, but incorporate ideas in these and Tom Casserly’s quote on library use, and note that it is in line with what other schools have found (undergraduates are using library extensively, but they are not accessing online resources from within the library)
· Also need to change “the libraries will” to “the libraries shall” because it’s a recommendation, not a mandate or promise
· All of TH’s comments are included in LP’s comments, as are EF’s, so the majority of discussion focused on LP’s suggestions.
· We will incorporate Dan’s feedback on use of the term “online sessions” re grad students to avoid implying overall use
· MINES data does not tell us how heavily resources are being used. This is an indicator of library use, similar to gate counts, but there is limited ability to extrapolate overall use from these user sessions; we need to reword the title “number of online sessions initiated in year” in the figure
· We can tell who is using things, but not comfortable saying “heavy” or “significant” use because who can tell if it really is?
· Change emphasis on use to who, rather than what—all members of community are using resources in what we would consider significant ways. Overall, all users are using print and electronic collections—no one group has been left behind by shift from print to e.
· Is there a way to include ARL data so that we can demonstrate how we rank in reference to our peers? (i.e., to determine how heavy our usage is) This would be difficult, because the data would need to be normalized (FTE and volume count)
· “From the number of online sessions, we roughly infer usage” -> then include that all groups are making use of the resources (as opposed to roughly estimating amounts of usage by group), which also gets us away from qualitative statements about amount of use.
· Collections are part of our services, we should also discuss services.
· May also need to examine services to graduate students
· “online journal collections” should be changed to “online collections.” Also need to include “online services” in this.
· Reword to something like “library’s preferred mode of acquisitions is serving all patron groups equally well/satisfactory” b/c presently, implications don’t actually deal with WHO.
· Also add that because grad students are heaviest user group, should consider/examine their services and needs to make sure we are adequately meeting them.
· TH: The WHO section should have two conclusions and implications: one discussing everyone, and a second conclusion and implication on graduate students.
· Suggested text: Currently, collection building is largely focused on faculty, which serves undergrad demand (through course reserves). Unlike UT and UW, we don’t have separate undergrad and grad libraries, so question of what are we doing to satisfy graduate needs isn’t currently getting answered. Worth investigating, although ultimately may not be a big deal.
· RS: suggests focusing energies on getting greater integration b/w library resources and primo and course management systems
· Need to rewrite conclusion to be clearer that it includes both types of students (grad and undergrad).
· Also need to include that faculty secondary use is preparing for teaching.
· Graduate student teaching use number (1%) makes sense, when you consider what and who is teaching—law, med, business don’t teach, usually teaching canned courses so not necessarily research for teaching.
· Break into three conclusions and implications: WHY coursework, WHY teaching, and WHY research (w/in research, tease out funded vs unfunded research).
· undergrad coursework implication: information literacy instruction. DF to pull from survey to determine whether or not they’re having trouble, how we can better support their efforts
· BD did analysis of how WHY varies by location—tasks groups are likely to be doing in library vs other locations. Implications for services—need to make them available at points of access. Services based on research and scholarship should take this into account, especially for undergrads, since they’re not doing research when they come to the library, and by the time they are doing research, they’re no longer coming to the library. Determined to be interesting, but not worth putting into report. Save for internal use.
· MINES data and survey data conflict re value of space -> illustrates disconnect between library space use and library collection use.
· If not using library space for access to collections, what are they using it for, and how well are we doing it? -> External survey data brought in to help answer this.
· MINES shows one side of this (use of collections) -> not saying library space is not important, just saying it’s not being used for what we thought it was being used for
· Add to table on library resource use: library gate count to help illustrate that just because resources are not being used in the library, does not mean the library is not getting used. Once gate count is added, we can bring in survey data on importance of library.
· Suggest adding note on non-homogeneity of populations between libraries, because some use of e-resources comes from within library.
· Suggest adding to conclusions that just because students use online resources, they are still looking for human services—reference help, etc.
· Implication: need a physical version of MINES, issuing a short survey to every 100th person coming into the library.
· Need to include comments and conclusions on use of physical space to include professional students/schools
Minutes - Assessment Committee - October 15, 2015
Attendees: Linda (chair), Brendan (minutes), David, Russ, Konstantin (virtual), Dan, Tom, Steve, and Sarah
Notifications and Changes to Agenda -
MINES Publicity Timeline
Discussion and Review of Final Draft – Conclusion
NEXT MEETING: 10/22/2015 at 2:00, location TBD
Minutes - Assessment Committee - October 8, 2015
Attendees: Linda, Russell, Konstantin, Dan, David, Brendan, Tom, Steve, Ellen, and Sarah
Our next meeting is: Thursday, October 15 from 2:00-3:30 in the Pardee Conference Room
Assessment Committee Minutes - September 4, 2015
Attendees: Linda, Russ, Dan, David, Brendan, Tom, Steve, Ellen, Konstantin
Reporting for the ARL annual statistics has begun. The deadline is Oct. 1 for ARL and about a week prior for BU. Linda will be coordinating these efforts and will be contacting people appropriately.
MINES Survey Report:
We had a great conversation about David's initial draft of our MINES Survey Report. The group agreed the initial report was an excellent start. We spent the remainder of the meeting discussing the document's structure, language, and overall purpose.
Some general thoughts:
- The group would like to conduct more analysis in the research funding section of the document. This analysis might include a breakdown of research funding and library usage by audience and school.
- We discussed that we need to be mindful that there is not a 1:1 relationship between online sessions and use. Additionally, we should clarify how the MINES data supports and supplements the data we already gather about usage of library resources.
- We looked at restructuring the sections as: who (use), what (top online), when (use over time), where (location), why (purpose)
- We determined that any charts with information about the print collection should be symmetrical with the MINES data (same audience breakdown, same scale, and same legend)
Potential "high-level" recommendations:
- Our conversion to "e first" is not leaving any group behind, and we should continue building services around our electronic holdings
- Library services should assume that patrons are going to access our holdings online and outside of the library's walls
- Coursework drives undergraduate and graduate use of the library and we should prioritize outreach efforts appropriately
- We should continue prioritizing staff efforts and distribution around our electronic holdings
Our next meeting is: Thursday, September 17 from 12:00-1:30 in the large Mugar Conference Room
Assessment Committee Minutes 7/30/15
Attendees - Linda (chair), Brendan (minutes), Dan, David, Ellen, Steve, Tom
Next meeting: Thursday, August 6 from 1:00-2:00 pm (Report Review)
NEXT-next meeting: Friday, September 4 at 2pm
Linda to add Tom to assess-l,
to forward him the contact for research assessment at the Provost's office,
& to research whether an evaluation of (inter)national library surveys has been done.
Brendan to get circulation statistics for Lib Use Report.
Dan to get webhit numbers for David
David to contact Mike Ward about Primo searches.
Ellen to forward link to Assessment Cookbook notification.
Everyone to continue working on their initiatives.
Assessing social media impact (Dan)
Plan includes a scholarly lit review, student feedback, environmental scans of peer institutions, and approaches to management.
Include a collection of example posts.
Finish gathering by November this year.
Interviewing to finish March next year. (contact Mary McGowan, Ellen to share law contacts)
Complete report by summer of next year.
Output: explanation of best practices and local policy recommendations.
Dan mentions BrowZine, a tool for journal browsing. Expensive but may be useful for faculty.
MINES/Library Collection Use Report (David)
Next meeting will be devoted to group review of the draft.
Dan, Brendan, and Steve to assist with collecting statistics for the report.
Looking to finish the report by second or so week of September.
Data Security (Tom, Ellen, Brendan)
We need an index of who's collecting what, plus contact data.
Output: how the libraries organize our data.
Linda suggests putting our data in our institutional repository.
Data Visualization (Tom, Brendan, Konstantin*)
Efforts made to visualize SEL headcount data. Preliminary graphs will be made more interactive.
Usage Stats (Steve)
We need to set standards for how to put data into Alma.
Collab between Steve, Ellen, maybe Jennifer and Tim.
Will start working over the fall.
Lots of stuff (Ellen)
ACRL sent out request for "Assessment cookbook". Due date is 8/6/15. Ellen to forward link to all of us.
Idea: putting together a professional reading collection.
Steve - why not make a LibGuide for them?
Cornell U Library research and assessment unit - Check it out.
The Provost decided not to renew InCites. They like Academic Analytics (strictly faculty-focused).
Library may push to get access to Academic Analytics. We may be able to work with the company to get lib-interesting data.
New Idea - Look at other national/international surveys (only a few) and institutions sharing the same assessment genealogy. Establish reasons for why we have assessed the way we have.
Linda talked with Jillian "director of learning assessment for BU". Expect more communication in the future.
Timeline: realistically, start after mid-October. Output: a report like the MINES report, plus recommendations.
David - This should be published if it hasn't been done before.
Data Storytelling Workshop (Tom & Brendan)-
Designed for journalism fields interested in data. It's a new field, meaning we're not behind the trend.
Showed off techniques for getting leads on stories, searching for datasets, cleaning data with tools.
Most important lesson: keep focus on people. Add stories to the data to engage.
Libraries are well positioned to inform people about data storytelling ourselves: workshops, guides (timed with seminar papers, maybe).
Journalism students may be a useful asset for us.
Assessment Committee Minutes 5/29/15
Attendees - Linda (chair), Dan (minutes), Brendan, David, Tom H, Steve
Next meeting: Thursday, June 18 at 2pm
1. Data management (Brendan & Tom)
Brendan and Tom presented a slideshow about the Data Management initiative. They examined data classifications, possible data roles, and overall goals at the Libraries. They are planning to construct data policies that encourage folks to work with the Libraries on data issues. Data permissions, training from IS&T for OneDrive, and centralizing library data were also discussed.
2. InCites (Linda & Steve)
Linda and Steve led the discussion of this tool.
Initiative: Exploration of InCites as an assessment tool
3. Assessing Social Media Impact (Dan)
This project will be a preliminary look into social media use best practices for libraries, and will result in a document that includes:
"Being active in the social media sphere is important because it provides innovative ways for us to connect with users we may never see face to face on a personal and meaningful level." Fichter, Darlene, and Jeff Wisniewski. "How To Measure the Results of Your Activity on Social Media Sites." Marketing Library Services 23.2 (2009).
Assessment Committee Minutes 5/5/15
Attendees - Ellen, Dan, Konstantin, Brendan, David, Tom, Steve, Linda (chair)
Next meeting: Friday, May 29 at 1pm
1. Minutes, announcements, changes to the agenda
2. FY16 Initiatives
Assessment Committee Minutes 4/17/15
Attendees—BD, TH, SS, JR, JA, RS, EF, DF, KS (via skype), MW, LP, TL
Brendan re Scott Macomber & EZ-Proxy Logs
Should we really do this? Is it worth it? What questions are we trying to answer?
BD, DB, and LP met with Eddie from Sourcing re Tableau
Key focus for next year: data and getting it out to users
Assessment Meeting Minutes
Ellen, Dan, Konstantin, Brendan, David, Sarah (minutes), Steve, Linda (chair)
Next meeting 4/17 (Friday) 2-3:30pm. Linda will send out room info.
Eddie from Sourcing is using Tableau, coming to meet with some of group 4/7/15 2pm small conf rm
Reviewed what was covered in last mtg
ARL SPEC kits – Konstantin
About 4 per year on hot topics are accepted, generally written by ARL member libraries. All ARL libraries are asked to fill in a survey; the data is then compiled and the report written by the authoring institution. Access to 2003–current requires a subscription. Earlier years are full text in HathiTrust.
2015 topics include one on scholarly output assessment activities
Steve notes that in the past the examples/best practices included in SPEC reports were one of the more useful parts. So just the TOCs can be useful to know who might have examples (that may be available for free from their sites).
May be worthwhile to see what assessment ones have been done.
David – Analytics
Look at patron searches in BULS, can this inform reference and instruction?
Mike Ward says Google Analytics for the page are currently better than the built-in Primo/Ex Libris analytics. DF found no examples or models for assessing the queries (how to do it, or whether it's useful). DAGS may be using some small parts of this for improving searches/results/displays.
DF doesn't think it's a high priority now; it might be interesting to look again in a year, when Ex Libris analytics may be better.
Brendan – MINES
“We downloaded all the data from MINES, and there’s a lot of it” – over 7K responses. Completion rate was around 91% the last time it was calculated. BD cleaned up some bad data (probably statistically insignificant anyway) and added some database hits based on the connection URL for EBSCO. Discussed having BD create a couple of other graphs. DF, BD, and LP will work on how to create reports from this data. DF on audiences: a brief report for outside/above the library, more detailed for internal use or sharing with other libraries. Potentially present to the grad research council. Adding in data from an outside survey? (# of potential users per school, etc.) Need context for the reader in reports. What would we do differently if we ran MINES again? Longitudinal snapshot?
Should BD work to draw out more detailed, specific databases from aggregators? Consensus was no, because of potential data quality issues.
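The URL-based attribution described above might look roughly like this (a sketch only: the `db` query parameter and sample URLs are illustrative, not the actual MINES connection strings):

```python
from urllib.parse import urlparse, parse_qs

def database_from_url(url):
    """Guess which database a connection URL points at.

    Illustrative heuristic only: EBSCO-style links often carry a
    'db' query parameter; otherwise fall back to the hostname.
    """
    parsed = urlparse(url)
    qs = parse_qs(parsed.query)
    if "db" in qs:
        return qs["db"][0]
    return parsed.hostname

print(database_from_url("http://search.ebscohost.com/login.aspx?db=aph"))
# -> aph
print(database_from_url("http://www.jstor.org/stable/123"))
# -> www.jstor.org
```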
Dan – Tableau Trial
Uploaded some spreadsheets (ebrary stats Steve had sent out) and then explored some graphical display options. Lots of online support. Dan did look at a few example videos, but mostly was just exploring. Showed us 3 visualizations he’d worked on. Definite interest in learning more about it.
Assessment Meeting Minutes 03/09/2015
Present: Linda (head), Brendan (minutes), Dan, David, Ellen, Sarah, Konstantin, Russ, Steve
Next Meeting: Monday 03/23/15, Mugar Conference Rm.
Individual Research Topics
Assessment Meeting Minutes 02/23/2015
Present: Linda (head), Dan (minutes), Brendan, David, Ellen, Sarah, Konstantin, Russ
Next Meeting: Monday 03/09/15, Admin Conf Rm.
1. All - monitor for 2015 Survey developments as appropriate.
2. All - be prepared at the next meeting to discuss future directions as assigned below.
Update on Survey 2015
We are starting to run low on time to accomplish the 2015 survey this Spring. If we are not able to accomplish it, our new target would become Spring 2016. As this is a distinct possibility, all of our materials are now well archived (thanks Brendan) in Dropbox and other locations. There is also a "readme" file that explains what documents and materials can be found in the archive, and describes their purpose.
Discussion of future work
At the university level, there is a renewed focus on assessment systems for the evaluation of learning outcomes, and the library will be participating in this in the near future. In light of this Committee’s charge and goals and the Libraries’ strategic plans, what other assessment activities would improve library services and resources?
* From the ARL-ASSESS listserv:
Data Management and Visualization Webcasts
Data visualization brings library stories to life. Using Tableau software, libraries may better harness, analyze, and report their data to internal and external stakeholders. Join us for a three-part series on data management with Tableau, presented by experienced assessment librarians.
Sarah Murphy, Ohio State University
Tuesday, March 3, 2015, 2:00–2:30 p.m. EST
Register online free of charge
The Ohio State University (OSU) has used Tableau since 2012 to support a number of assessment projects. Examples include a series of interactive dashboards that query, analyze, and deliver transactional data to subject librarians to support their collection development and engagement activities. To deliver library data of interest to the broader academic community, the Libraries has also started to embed interactive, downloadable Tableau dashboards via select OSU Libraries websites.
Sarah Anne Murphy is coordinator of assessment for The Ohio State University Libraries. She earned an MLS from Kent State University in 2000 and an MBA from The Ohio State University in 2008. She has published two books and several papers related to library assessment.
Jeremy Buhler, University of British Columbia
Tuesday, March 10, 2015, 2:00–2:30 p.m. EDT
Register online free of charge
The University of British Columbia (UBC) Library began using Tableau in 2013 to analyze longitudinal LibQUAL+ results. The resulting online display highlights changes in survey results over time and facilitates the comparison of responses across several user groups. Building on the success of this visualization, UBC Library is now using Tableau to encourage library staff to engage with, think about, and find new applications for library metrics.
Jeremy Buhler, MLIS, is assessment librarian at the University of British Columbia Library, where he has worked since 2011.
Rachel Lewellen, University of Massachusetts Amherst
Tableau helps the University of Massachusetts (UMass) Amherst Libraries staff to both analyze and understand data. Selectors may access purchasing and use data without having to maintain or manipulate spreadsheets. Using Tableau for the analysis of a consortial e-book project minimizes duplication of effort and provides a common perspective for understanding the data. Interactive visualizations expedite access to live data from MINES (Measuring the Impact of Networked Electronic Services), and make it easier for staff to use circulation, service desk, and gate-count data for operational purposes.
Rachel Lewellen is assessment librarian at the UMass Amherst Libraries. She has been using Tableau since 2012.
Tuesday, April 21, 2015
2:00–3:00 p.m. EDT
Discussion and Q&A with Sarah Murphy, Jeremy Buhler, and Rachel Lewellen
Brendan, Dan, David (chair pro tem), Ellen, Russ, Steve (minutes)
Next meeting: 2pm, 2/9, Mugar conference room
-Steve will serve as Sarah's backup to resolve the capitalization of Library, Libraries, library, libraries through the three surveys
Bob Hudson updated the committee on the approval status: the Provost has been given copies of the surveys to review.
Survey pilots: Went well, with almost all 'piloteers' taking and completing the survey.
The committee then reviewed comments from the pilot (see Brendan's 'Review of Encountered Pilot Errors and their Fixes' handout) and corrected errors as necessary.
Dan, Russ, Brendan, Steve, Ellen, Konstantin (minutes), David, Linda (chair)
Next meeting: 2pm, 1/26, Mugar conference room
- Konstantin and Ellen - identify 2 staff who are willing to participate in the pilot
- collect and review responses from pilot testers
-Sarah still needs to resolve the capitalization of Library, Libraries, library, libraries through the three surveys
We’ve reviewed display option on the iPad. It looks OK.
Discussion of remaining issues:
FQ15, GQ2a: Open Access – we’ve agreed to the changes for points one and two regarding articles published
Resolved the wording for library search – FQ5; copy from FQ5 to GQ7 and UQ6
FQ6, GQ6, UQ6 - our agreement: “Research assistance …” “General assistance”
FQ9 and GQ13 – “archiving papers and other publications in digital archives” – include SSRN
“Dear Faculty …” text is finalized
We’ve finalized the email list for pilot participants.
Include some staff to participate in the pilot – Linda will email the link
Dan, Russ, Brendan, Steve, Ellen, Linda (chair), David (minutes)
Next meeting: 2pm, 1/20, Mugar conference room
Linda will be on vacation Jan. 26th - Feb. 8th
Should know Provost decision Jan 22nd.
Looked at Linda’s proposed emails for survey recipients. Ellen had minor edit that was accepted and made by Brendan.
For the pilot, Brendan recommends that we just use a non-individualized link. We can test panels separately amongst ourselves.
Brendan looked at display logic affecting data; there should be no real problem.
There was some editing of Linda’s emails for pilot participants, recorded by Brendan.
Which surveys should staff see in the pilot? All of them.
Linda brought up a possible issue with viewing the survey on an iPad. David couldn’t reproduce it on his iPad; Ellen will check it on hers.
Went over various comments and suggestions for surveys. Brendan captured changes made.
Russ will propose wording for Subject librarians/staff assistance/research assistance.
Will use student workers for undergrad pilot.
Next meeting is all about the pilot, and we plan to send it out after meeting.
Linda (chair), Brendan (minutes), Steve, Dan, Sarah, Ellen, David, Konstantin, Russ
Next mtg: 01/12/15 2:00-3:30, Lg. conf. rm.
AGENDA: 1. Review previous meeting 2. Plan pilot
BD finished changing the surveys. Questions have been updated as discussed.
Linda made chart of each survey's section and question flow. It all looks good and makes sense.
Linda likes convenience sampling. “Try 3-5 each"
We need people from both campuses.
Steve mentions that data may be difficult to parse due to display logic.
Brendan will check if this applies to us.
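Steve's display-logic concern is the classic one: in the exported data, a blank cell can mean either "respondent skipped it" or "respondent never saw it". A minimal sketch of keeping those apart (column names are hypothetical, not from our actual survey):

```python
# Hypothetical export rows: display logic means respondents who answered
# "No" to the screener never saw the follow-up, so its cell is empty.
rows = [
    {"worked_with_data": "Yes", "data_storage_rating": "4"},
    {"worked_with_data": "No",  "data_storage_rating": ""},   # never shown
    {"worked_with_data": "Yes", "data_storage_rating": ""},   # shown, skipped
]

def classify(row):
    """Distinguish 'not shown' (excluded by display logic) from 'skipped'."""
    if row["worked_with_data"] != "Yes":
        return "not shown"
    return "answered" if row["data_storage_rating"] else "skipped"

print([classify(r) for r in rows])
# -> ['answered', 'not shown', 'skipped']
```

Counting only the "answered" and "skipped" rows keeps response-rate figures honest for questions hidden behind display logic.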
Panels will be created for the reviewers
For undergrads, don't use student workers, but their friends are okay.
We should mention in the intro that we’re collecting demographics.
Show Marlene, Mary, and Amy the survey before pilot. “We thought you’d like to see this before it goes out.”
Everyone’s Spring recess ends after March 15th. Target release date: April 2nd.
Sarah showed off the adverts. Very nice. Still some tinkering here and there. Lots of places we’re advertising, still waiting on even more locations.
Russ: "Can we modify our websites for advertising the survey?" Linda: “IRB says about 1 advert in each library is fine. More than one will shift our user group toward the patrons."
Scheduling: Mondays 2-3:30 in the big conference room for the next 5 weeks, EXCEPT January 20th, 2-3:30 instead of January 19th.
Steve, Dan, Brendan, Sarah (minutes), Ellen, David (Chair), Konstantin
next mtg: 1/5/15 2pm conf rm.
SDS presented new wording for OA questions on grad and fac (FQ36, GQ2a)
On the faculty survey, the open access factor on the question "How important are the following factors in your decision on where to publish journal articles?" will read: 'Journal allows open access (immediately or after an embargo)'.
DF suggested a small wording change: move “have you” after the “regarding articles you have published” and start each question with the verb (published, paid, deposited).
Info lit Questions
DB reports from conversation w/ Ken Liss
already implemented wording on guides/tutorials
separating teaching from clinical/research purpose (FQ 32,33,43,45) - KL is interested, but we were not open to making this large a change at this time
consider in survey keepers- separating teaching from research from fac, or teaching from study for grads.
GQ17 (could be added to fac FQ 45 as well) Streaming Media
SDS - yes wants this question
BD has a concern that the “streaming media” wording could be interpreted different ways (synchronous vs asynchronous, live streaming vs Netflix-style)
switch to “online video and audio resources”?
“streaming audio and video”
decided on wording:
“Access to streaming audio and video resources” for both GQ17 & FQ45
Review of Questions/issues from BD’s email 12/15
discussed, BD will make changes
Assessment Meeting Minutes 12/15/14
Attendees: Linda (head), Dan, Brendan, David, Ellen, Steve, Sarah, Konstantin (minutes), Russ, Vika (called-in)
Next Meeting: Monday, December 22, 2pm Admin. Conf. Rm.
NEXT MEETING: Agenda
1. Minutes (thanks Ellen for the terrific minutes), announcements, changes to the agenda?
2. Discussion with Vika about our OA questions (Sarah)
3. Brendan’s overview presentation and outstanding questions (Brendan)
4. Discussion about Ken’s feedback on the survey’ Information Literacy questions (Dan)
5. Publicity files, Brendan’s suggestion about publicity
6. Do we want to add an augmented services question to the undergrad survey?
Brendan walked us through all the major and minor changes. (see Brendan’s power point presentation).
The group discussed the question “During the past academic year at BU have you …”, 4th option: “Worked on a research project including data”. We’ve decided to keep the wording as is.
Double check that the IS&T term is clear to medical users – include BU/BUMC IS&T? (Konstantin)
We’ve discussed the advantage of mentioning “Blackboard” in a question which asks about “course system”
Sarah brought up Ken’s suggestion that library guides be reincorporated in the undergrad and graduate surveys.
We’ve discussed open issues concerning service augmentation. Russ suggested changing the word “augmented” to “enhanced”. We all agreed.
We’ve decided to discuss the potential of removing “media streaming” during the next meeting.
Discussion with Vika about OA questions:
Sarah summarized Vika’s concerns and Vika clarified that we shouldn’t focus just on “open access journals” vs. open access in general.
Vika suggested formulating a separate question that asks about Gold and Green open access.
Discussion followed. Steve argued that we’re trying to find out if our patrons are making their work open access rather than publishing in the “open access” journal.
We’ve decided to modify the question to capture the “immediately or after embargo” wording.
Question “Have you ever submitted an article to: ….” - We’ve decided that we have to consider the wording in the context of Vika’s suggestion.
Sarah emailed Vika’s suggested wording.
Dan discussed Ken's suggestions, but we still need to revisit them.
Augmented Services for the Undergraduates
Russ suggested that we substitute the word "augmented" with "enhanced". The group unanimously agreed.
Assessment Meeting Minutes 12/8/14
Attendees: RS, ER (minutes), LP, BD, DB, SS, DF (for last half)
Next Meeting: Monday, December 15, Admin. Conf. Rm.
We focused on a group review of the completed graduate and faculty surveys
-Grad & Faculty Surveys will NOT have a “demographics” block header—decision is that it is not necessary to prep users to answer personal questions.
Sarah is setting up a phone in with Vika for next Monday’s meeting to discuss the Open Access wording on the survey and what data she would like us to collect.
Things to cover at next meeting: Open Access questions and language with Vika, input from Ken on Information Literacy, conventions/consistency of language across all three surveys with Brendan, review the three surveys in their entirety, data management questions with David, and discuss the pilot
On satisfaction and importance questions, IMPORTANCE rating will be asked first, and then SATISFACTION rating
-Brendan is looking at alternative phrasing for “information literacy” and also looking at the three surveys together to see where language should be consistent, but isn’t—he will present these at the 12/15 meeting
-David is working on language for Research Data questions
-“Research data” option has been added to question about “during the past academic year, have you…” and set up as display logic, so only survey takers who have worked with research data will see questions about storage and sharing
-Question on bibliographic managers may be moved to different block, but this was not yet decided
-Question on assigned readings and how students access—to be reviewed with Steve and Konstantin at 12/15 meeting
-Dan is following up with Ken on the language for the information literacy question (FQ30)—this language must be to us by 12/15
FQ32—discussed switching assessment and importance (to importance and assessment) but decided not to, since it would be leading; we are aware we may now be comparing apples and oranges
-“Information literacy” is library speak, but may have to remain
-We agreed to remove “other” from the importance & satisfaction re library resources questions
-We agreed to add an option on current awareness to our question on augmenting library services—this question will also be copied to the faculty survey
-We changed the question on library use and space (group and individual study space, access to computers) to focus on HOW graduate students use library resources → this change needs to be copied over to the undergraduate survey
Assessment Meeting Minutes 11/17/2014
Present: Linda (head), Dan (minutes), Brendan, David, Ellen, Steve, Konstantin, Russ
Next Meeting: Monday, 11/24, Admin Conf Rm.
1. Faculty Survey Subgroup should note the discussions in the high level review below and make necessary edits.
2. Undergrad Survey Subgroup should be prepared to walk the group through a preview of that survey at the next meeting.
High Level Review of Faculty Survey
Assessment Committee Meeting, Monday, November 10, 2014
1. Minutes, announcements, changes to the agenda
a. No changes were made to the minutes of the last full committee meeting submitted by Ellen
b. The following announcements:
i. Brendan noted a possible completion-rate problem in the MINES survey, but he observed—and the rest of us agreed—that it did not represent a significant problem. Even if we had to accept the lower rate, it was still quite high and not worth intervention.
ii. Brendan also noted that in Qualtrics we can make notes and comments within the draft questions themselves, allowing us to keep all of the committee’s work product there and make it more convenient to read everything in one place, not only now but in the future.
iii. Linda announced that Ron Yeany and his team at IS&T have offered to look into whether they can help extract useful usage data from EZproxy.
c. No changes were suggested to the agenda proposed by Linda.
2. Continue discussion about usage, importance, and satisfaction and what changes we should make on our surveys.
a. Continued and reviewed the ongoing discussion we have had about whether to use usage or satisfaction in our survey. David and Steve quickly reprised the issues for us.
b. Konstantin registered his concerns about using importance and offered the white paper from the Customer Satisfaction Research Institute, along with four other relevant articles from the literature, as food for thought when considering the use of importance in our survey.
c. Steve noted that if we moved away from importance we would lose the ability for longitudinal study, since we used importance in the first round of surveys.
d. We returned to the question of whether we could use all three factors in the survey, as suggested in the last meeting. Russ asked if we could use the pilot period to test whether the three factors could all be used without making the question too long and cumbersome, confusing the survey taker.
e. Linda noted that we may have more room in the surveys for a few longer questions given that we are hoping to collect demographic data from the Registrar and Human Resources, rather than by asking the respondents.
f. Sarah observed that if we used all three factors that would give additional insight into patron feedback.
g. Consensus: all three elements will be used in the pilot on questions related to services.
3. Demonstration/review of the Graduate Student Library Survey
a. Graduate Student subcommittee: have had a high level discussion and are now having a more targeted discussion. The subcommittee has met twice and needs additional time to discuss issues, but that can follow this discussion by the full committee.
b. A question was raised whether we can use the panels in Qualtrics for the distance education and online learners.
c. Question 8: Brendan wondered if the wording is loaded; Linda did not believe it was; Steve suggested using “Please rate the importance” as a way to resolve the possible loaded nature of the question. The question was also raised if the elements surveyed are really separate or do they represent a skillset (or many different skillsets). Sarah suggested getting Ken Liss’s insights on this. She also recommended inviting him to our next meeting to get his thoughts here as well as on other info literacy questions.
d. Question 9: on first line it was recommended to use just BU Library Search and then on the second line use “Databases” and search tools. Russ and Konstantin also recommended using the examples of HeinOnline and PubMed, as was similarly done on the faculty survey. We also considered whether we needed new examples of Social Media beyond Twitter and blogs, or need any at all. The consensus was to keep the two existing examples.
e. Question 17: Sarah wondered whether we should change the wording to be more like UW’s analogous question #8, to be more user-centric. We agreed that the subcommittee would investigate this further in its next meeting and make recommendations to the full committee.
4. Progress report on the Faculty Library Survey
a. We have had a high level committee discussion and one subcommittee meeting discussion. Issues in the subcommittee, principally about the use of usage or importance, were then discussed by the full committee at the last meeting as well as in the early portion of this meeting.
5. High level review of the Faculty Library Survey
a. Additional high level discussions need to occur.
6. Next steps?
a. Undergrad survey is very close to being finished. Needs an additional half hour of the full committee’s time.
b. Grad survey needs discussion in two more meetings.
c. Faculty needs two more meetings, although we may be able to shorten that time inasmuch as we are utilizing language developed in the new grad survey in corresponding sections in the faculty survey.
d. Should we include post-docs in our survey, and how?
Assessment Meeting Minutes 11/3/14
Present: BD, DF, LP, ER (minutes), SDS, DB, RS, KS
Graduate Subcommittee: Friday, November 7, 1-2:30, Mugar Small Admin Conf. Room
Undergrad Subcommittee: Monday, November 10, 12:30-2, Administrative Offices, Mugar, small conference room
Full Assessment Committee meeting, Monday, November 10, 2-3:30, Administrative Offices, Mugar, large conference room
Brendan gave an update on the post-survey information we will use to capture entries for the gift cards, and he will send it around.
There have been no changes to the undergraduate survey since our reserves discussion last week, and the undergraduate subcommittee will update us at the next meeting.
Whole committee: need to look at the larger information literacy question issues with Ken before we can decide on which word to use
SRS, ER, KS: Add question about subject librarians to grad student survey
All subgroups: make survey questions closer/more parallel across the board (see slide 11 where DF has compared the same questions across the three surveys, highlighting the different language we use)
Undergrad survey/undergrad subcommittee: we have lumped together questions on physical things and services, and these should probably be two separate categories (see slide 15)
Discussion re David’s presentation on usage vs satisfaction vs importance:
Presentation (with DF’s recommendations on what word to use with which question) is on the wiki
Overall: we are considering adding satisfaction questions to the faculty survey, but will ask undergrads only usage questions.
Whether to continue distinguishing print vs ebooks? Key points:
· Really two separate questions: (a) how important are these sources (print vs ebook) to you and (b) how satisfied are you with the selection we have at BU?
· Do we care about responses if it will not change our format selection? (i.e., we are an e-preferred library)
· Are we focusing on satisfaction with the format of our materials or our collection (e.g., “you don’t have what I want” vs “you have what I want, but not in my preferred format”)?
o Decided we are trying to get more at satisfaction with the collection, and so will combine the formats, which is what we did when asking about our journals collection
o May also add the word “collection” to the question to really get people to focus in on it
· This question will also have a comments box
· Determined there are no questions about e vs print materials we want to add, as we already know what we need to on that topic
Importance vs Usage re: Library Services
· Satisfaction questions are good for giving us longitudinal data, but are not particularly useful for capturing information from just one survey
· How to capture usage?
o If faculty member is on sabbatical and we ask about the past year—will they say they have not used anything, when ordinarily, they heavily rely on our resources?
§ Alternatively, will faculty pay attention to “current year” or look at their normal course of business?
§ But “current year” should definitely be kept for ugrad survey, as it helps us to look at what resources each class uses
§ Grant cycle is 3 years—if faculty are not writing grants in this current academic year, will usage questions underreport on resource usage/importance?
o Use of importance AND satisfaction questions together could give us a better idea of where we need to improve (e.g., less heavily-used resource may still be important, especially with varying workloads over the course of a faculty member’s career. Should we not limit to the current year for faculty, as it may provide an inaccurate snapshot?)
o What can this data really tell us? How would we tell if something is low use because they are dissatisfied or if it is low use because it is not an important resource?
· Faculty use in past year vs importance to faculty—
o How to capture resources that are important but may not have been used in the past year and/or are not necessarily used on an annual basis?
o Issue: probably don’t want to ask faculty all three questions (is it important, do you use it/how often do you use it, are you satisfied with it?)
· Issues with deriving usage stats from satisfaction data—e.g., on the undergrad survey, they reported satisfaction higher than their reported usage, but can also be important—the discrepancy itself provides useful information
· “use is use”—usage questions wouldn’t tell us the type or quality of use (i.e., getting help with a paper vs asking us where the bathroom is)
· Should we ask all three questions, but cut down the number of services we’re asking about to five or so?
· Services for which usage is declining may still be important (i.e., ILL)
· Currently, the usage question is a yes/no, not a likert scale. What is the value of asking it this way?
o If we give up the usage question, we give up being able to report usage
o Likert scale is also misleading, because what might be frequent for one user wouldn’t be frequent for another
· Tip people off at the beginning with a header like “for the following services, we will ask you to think about whether you use these services, how important they are to you, and how satisfied you are with them” – makes people more likely to answer questions, rather than getting bogged down by never ending questions and quitting the survey?
· Can usage be pulled from other areas so we don’t have to ask the question (i.e., get ILL stats from ILL librarians)?
o This question helps capture the perception of services they think they use
o Capturing usage in the survey helps us correlate use and importance in a way that cannot be captured with other data sources on use
· Degree of use—do we care? → kind of an open question
· Importance might also measure unique, non-repetitive events, as opposed to regular use of an item → potential for skewed/misleading results
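One way to combine the importance and satisfaction ratings discussed above is a simple gap score per service. The service names and numbers below are made up for illustration, not from our survey data:

```python
# Hypothetical mean ratings on a 1-5 scale for a few services.
ratings = {
    "ILL":            {"importance": 4.6, "satisfaction": 4.4},
    "Study space":    {"importance": 4.2, "satisfaction": 3.1},
    "Reference desk": {"importance": 3.0, "satisfaction": 4.0},
}

# Gap = importance - satisfaction: large positive gaps flag services
# that matter to patrons but underperform, which is exactly the
# "important but less used/less satisfying" case raised above.
gaps = sorted(
    ((svc, round(r["importance"] - r["satisfaction"], 2))
     for svc, r in ratings.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
print(gaps)
# -> [('Study space', 1.1), ('ILL', 0.2), ('Reference desk', -1.0)]
```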
Overall: committee members agree on main points with the exception of the use vs importance questions
Suggestions for resolving this issue:
· Make our ideal survey, then edit
o Need to be careful we’re not making an entirely new survey and losing comparability to prior surveys
· Create a matrix of Use, Satisfaction, and Importance with two categories: Facilities and Services
· Is there a more elegant way to ask about what library you use (that would help eliminate the need for separate facilities questions, so we only need to ask about services)?
Can we bring in some topical questions (i.e., data management) that we can change each time?
Assessment Meeting Minutes 10/26/14
Present: SDS, DF, KS, ER, LP, BD, SRS (minutes), RS, DB
Brendan went to Google Analytics workshop on GA for libraries in Washington DC, and reported back on some of what he learned:
DAG is looking to do some usability work specific to the BULS searchbox
Subgroup for Faculty Survey review will meet this Wed 10/29
Intro needs provost/IRB review but otherwise done
Minor wording and ordering changes on Page 1
Individual libraries at BU section
Whirlwind update from Grad Survey review group
Assessment Meeting Minutes 10/6/2014
Present: Linda (head), Brendan (minutes), David, Ellen, Steve, Konstantin, Sarah
1. Find things to improve about the graduate/faculty surveys.
2. Review undergraduate survey post-edits [[ ignore bracketed sections ]]
Review of Changes to Undergraduate Survey
Graduate Survey Collective Critique
Publicity Update – Sarah
General Notes and Miscellany
Assessment Meeting Minutes 9/29/14
Present: Brendan, Dan, Steve, Lisa, Ellen, Russel, Linda (chair), David, Sarah, Konstantin (minutes), Ken Liss
Next meeting: Monday 10/6, 2pm-3:30pm, Mugar Administrative Conference Room
1. Minutes, announcements, changes to the agenda?
2. Publicity update (Sarah)
3. Full committee high-level review of the Graduate Student Library Survey
4. Discussion about Undergraduate Student Library Survey (Undergrad sub-group)
5. Next steps?
Assessment Meeting Minutes 09/15/2014
Present: Brendan, Dan, Steve, Lisa, Ellen, Russell, Linda (chair), David (minutes), Sarah
Next Meeting: Monday, 9/29, 2pm-3:30pm, Mugar Administrative Conference Room
Following Meetings: For the rest of the year, we will meet every other Monday at 2pm, except for one Friday: 9/29, 10/17 (Friday), 10/27, 11/10, 11/24, 12/8, 12/22
Sarah: Remind reference librarians about the ongoing MINES survey
Linda: Ask for input from Ken Liss on questions about instruction and services (Q8 & Q9 in undergrad survey).
Steve: Propose question to capture undergraduate use of "reserve" materials
David: Talk to JD about question on bibliographic managers, and which ones to list in question.
Brendan, Dan, and Lisa: Work on undergrad survey for next meeting, including comparing to MIT and UW surveys, looking at qualitative data from the last survey, and proposing any needed wording changes.
Sarah, Ellen & ???: Similar work (for future meeting) for grad survey.
Dan announced improvements to the website's Google Analytics, which now tracks searches from across the site, including branches.
Russell introduced our newest member, Ellen Richardson, Assistant Librarian for Administration at the law library.
Brendan showed the current state of the survey. We have had almost 3,500 results so far, with an extremely high completion rate. Brendan has downloaded the data (so far) into Excel to do some preliminary analysis, including parsing the URL to let us see what resource is being visited. Extremely impressive.
Brendan will share data with anyone on committee who is interested. We can look at it and use it, but remember it is preliminary and hasn't yet undergone complete analysis.
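The URL parsing Brendan described might look something like the following minimal Python sketch, which pulls the destination resource out of an EZproxy-style login URL. The `url` query parameter and the example hostnames are assumptions for illustration, not details taken from the actual survey data.

```python
from urllib.parse import urlparse, parse_qs

def target_resource(ezproxy_url):
    """Extract the destination host from an EZproxy login URL.

    Assumes the common EZproxy pattern where the target is passed in a
    `url` query parameter, e.g.
    https://ezproxy.example.edu/login?url=https://www.jstor.org/stable/123
    (hypothetical hosts; the parameter name is an assumption).
    """
    query = parse_qs(urlparse(ezproxy_url).query)
    target = query.get("url", [None])[0]
    # The netloc (host) is usually enough to identify the resource visited.
    return urlparse(target).netloc if target else None

print(target_resource(
    "https://ezproxy.example.edu/login?url=https://www.jstor.org/stable/123"
))  # prints: www.jstor.org
```

In a spreadsheet this same extraction can be approximated with text-to-columns on `?url=` plus a host split, which may be closer to what was actually done in Excel.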
Linda reported that until we hear from Provost (nothing yet), we still don’t know for certain whether we will do everyone in next survey, or only faculty. We will finish work on updating undergrad survey; then do the faculty survey update, in case that is the only survey we administer.
Decided that for each survey we will do a pass as a group to make sure we have the right questions, without wordsmithing. Then break into groups to look at wording and compare to other surveys.
Sarah suggested we may want to speak to Ken Liss to find out if the questions on instruction and services (Q8 & Q9) are really the metrics he wants to track. Linda will do it.
For the questions about facilities (Q13), we need to rethink our categories. "Furniture" can be confusing. Seating vs. Chairs. Outlets. Lighting.
On Q18 regarding course reserves, Steve brought up wanting to know about where students are getting resources we traditionally think of as e-reserves (e.g., the library's e-reserves system, Blackboard, faculty web site, etc.). There was a lively discussion on whether undergrads (as opposed to faculty) are the right ones to ask, and whether a question can be written that captures what we want to know from undergrads. Steve will work on possible wording for question.
Sarah remembered there being useful feedback on survey in qualitative data. When we break into groups, groups should look at qualitative data for their survey for comments about the survey.
Q21: The question was changed from Yes/No to a scale; need to change wording of question to reflect change.
Decided that if undergrads answer "Yes" to Q22 about using bibliographic manager, there will be a further question asking which one(s) they use. David will talk to JD about what should be on list.
Because the wording for the comments boxes refers to the page titles, we need to add header to pages, or change wording of comment question.
Some time was spent on the perennial use vs. importance vs. satisfaction problem, with no solution.
Brendan, Dan, and Lisa volunteered to work on detailed wording for undergrad survey by next meeting. This will include comparing to MIT and UW surveys, and looking at qualitative data from last survey
Sarah & Ellen volunteered to look at wording for Grad survey (for future meeting).
Brief discussion of a replacement for the irreplaceable Lisa, who is leaving us. We are a large group, so we don't have to replace her. Linda suggests we consider needed skills in making decision.
Assessment Meeting Minutes for 07/31/14
Present: David (temp chair), Dan (minutes), Steve, Brendan, Lisa
Next meeting: TBD - Doodle; Administrative Conference Room Mugar
Lisa led discussion of the last sheet in this Google Doc, called "Comparing Under Q BU, UW, MIT".
- Our survey is not much different from UW's: their survey is nice and very short.
- Answer rows on MIT survey Q2 are nicely specific.
- We also looked at MIT Q11's layout (aware | importance). We considered how this might apply for us, considering we often ask about use/importance/satisfaction. We looked at our questions 9, 25, and 26 in comparison.
- We have something very similar to MIT's Q12.
- MIT Q16 + Q18 include library instruction outside of a class; probably not that useful for BU (could mean HGARC events etc?).
- MIT did not ask overall importance of libraries
- MIT Q24, the $100-to-spend question. Poorly worded and makes the respondent do addition; not good for survey fatigue.
- Lisa also noted that there may be some Qs from MIT not represented at all on the spreadsheet. She did not include them because there were no comparable Qs on our survey.
Discussion of editable surveys on Qualtrics
David began this with the question of when, exactly, might it be favorable to combine the data coming from undergrads, grads, and fac. There seem to be very few questions where we would want to do this, and without weighting the numbers for response rate, the numbers would be skewed. We decided to begin composing Qs on three separate surveys.
As we will begin editing the Qualtrics surveys, Dan will share the archival surveys on Qualtrics, but set permissions so that no one can edit them.
We began looking at the faculty survey questions on Qualtrics. We considered the frequency of physical and virtual visits, which were combined in 2010 but later separated, and we added a "during the current academic year" phrase as well. We looked at the undergrad survey in comparison, and realized we should edit that first as it was the latest, and would need the least editing.
Do we need to ask about majors again? We used those to send comments to specific librarians, etc.
Brendan noted the prevalence of the N/A option on many questions, and we noted the consultants advised having this "way out" for respondents.
Our question 13 uses piped text from the previous question; it needs to be set up to work in Qualtrics. Steve is satisfied with the options on Q19, to look at changes over time.
MIT asks about USE of more specific sources or collections than we do. We considered setting up our Q differently, but we do have many other statistical sources on collection usage, so we decided NOT to add any about specific USE other than the options we already have listed in the survey.
Brendan suggested that we could create a numbering system that would indicate which questions are longitudinally useful.
We completed an initial run-through of the questions on the undergrad survey, with some edits left for Brendan to finish up later, as he was "driving" the Pardee room computer and editing on the fly.
Scheduling next meeting
We looked at scheduling, but could not get around vacation schedules. David will send a Doodle out to help schedule it.
Assessment Meeting Minutes for 7/14/2014
Present: Linda (chair), Brendan, Konstantin (minutes), Lisa, Russell, Steve, David
Next meeting: Thursday, 7/31/14 at 12pm (at Pardee)
- Ask Bob Hudson to talk to Provost (Linda)
- Look at the questions (David)
- Discuss Brendan's action plan (all)
- Take a look at Konstantin's analysis (all)
- Update numbers on the spreadsheet (Dan)
- Create the Google doc (David)
- Wording, style, etc. - comparison of MIT and U of W - tabs: faculty (Konstantin), grad (Helen/Russ), undergrad (Lisa)
Russ suggested adding Helen Richardson for membership on the Assessment Committee. (all agreed)
Dan talked about Google Analytics - workshops - comparison functionality, bounce rate, exit rate.
Dan reported on survey transfer. To access surveys on Qualtrics: go to bu.edu/qualtrics
Discussion followed in regards to our survey plan
Brendan's report: meeting with Pam Andrews (Lauren Hess replaced Pam in 7/14). Creation of panels. We discussed the advantages and disadvantages of working with the Provost office.
Konstantin raised the question of whether BMC affiliates will be included in the panel.
Assessment Meeting Minutes for 6/2/2014
Present: Linda (chair), Brendan (helped with minutes), Konstantin, Lisa, Dan, Russell, Steve, Sarah, David (minutes)
Next meeting: Thursday, 6/19/2014 at 2pm
Brendan: MIT Survey:
Dan: Transferring Surveys
Russ: Comparing UW grad & undergrad surveys
Lisa: Ithaka Survey
Assessment Meeting Minutes for 05/19/14
Present: Linda (chair), Brendan (minutes), Russell, Dan, Sarah, Konstantin, Steve, David
Next meeting: Monday, June 2nd, 1:00-2:30; Administrative Conference Room Mugar
- Russ: Compare UW grad & undergraduate surveys.
- David: Compare undergrad & grad BU reports.
- Dan: Transfer all other BU surveys
- Konstantin: Compare the other BU surveys to each other.
- Brendan: Compare the other BU surveys to each other & troubleshoot Qualtrics issues.
- Sarah: Determine any problems with ceasing to collect discipline information & survey keeper
- Brendan received a promotion to Assessment Data Specialist (or something like that). Now we can offload all our projects onto him!
Plans for New Survey:
Linda brings up two important questions:
Dan's update - Uploading Old Survey to Qualtrics:
David's update - Reviewing Survey Reports:
Russ - Analyzing UW Survey:
Out of time. Next meeting decided for June 2nd.
Assessment Meeting Minutes for 04/03/14
Present: Linda (chair), Dan (minutes), Russell, Brendan, Lisa, David, Konstantin
Next meeting: Thursday, May 1st 1-2:30; Administrative Conference Room Mugar
Linda will serve on a panel at an OCLC meeting at Brandeis on April 22nd called "Getting the Right Fit: Tailoring Assessment Strategies for your Library."
MINES is proceeding very smoothly now. We have had a 0% drop-out rate. We need to think about how to back up the data periodically. David believes we can download the data at any time, and we agreed that perhaps a schedule should be followed to download and back up the data periodically, although IS&T seems to trust the cloud.
Faculty Survey planning
We began planning for a faculty survey, not in exact order of the agenda, but the info is here below.
i. Game plan - general, but not unanimous, consensus to perform surveys individually instead of grouped. We also decided to renew Survey Monkey (Linda will do this) once more, as the deadline is fast approaching and it may come in handy.
ii. Developing the survey - will follow these steps
1. comparing to other surveys - Small groups will do initial comparison work and then bring suggestions to an AC meeting for all to discuss. Brendan will create a google doc in case groups wish to post something before meetings.
a. Dan will transfer the prior BU faculty survey from Survey Monkey to Qualtrics and look for any differences (particularly in skip logic). Dan will create 2 copies on Qualtrics - one to edit and one for posterity.
b. Linda and Konstantin will then look over comparisons from our own grad and undergrad surveys
c. Next, Lisa and Brendan will look at comparisons to the MIT survey
d. Lastly, Russ and possibly others will compare to UW and Ithaka instruments
2. final pass thoughts - examine possibilities of skip logic, questions or other items to eliminate to keep the survey short, and look through our survey keepers list
iii. Timeline - We're thinking about Spring 2015. Russ expressed that we should be mindful of the timing, as Law faculty will be relocated during Spring 2015. We should also keep survey fatigue in mind.
iv. IRB - Linda will accomplish
v. Publicity/Recruitment - poster from Anita Greene group and emails
vi. Launch - nothing noted
vii. Analysis, quantitative - nothing noted
viii. Analysis, qualitative - we have no one currently for this. There is the possibility of going back to tag previous surveys.
ix. Report writing - will likely rely on in-house talent.
x. Staff involvement in changes - we need to examine this more.
Assessment Meeting Minutes for 01/29/14
Present: Linda (chair), Brendan (minutes), Russell, Lisa, David, Steve, Konstantin, Sarah, Dan
Next meeting: Friday, Feb. 14th (Valentine's Day) 2-3:30; Administrative Conference room Mugar
- Russell will be joining us henceforth as a new member of the Assessment Committee. Welcome!
Tuesday, 1/21 Phone Call w/ UMass Amherst’s Rachel Lewellen
Present were David, Steve, Sarah, and Linda. Overall, a very positive, informative call. UMA started out with a year-long implementation and collected data from all users who started EZproxy sessions during two 2-hour periods per month. In UMA's second year-long implementation of MINES, they switched to n = 140, and then finally reduced to n = 120. It's important not to change n too far into an implementation, so that data won't have to be excluded, although Terry Plum implied that collection data can be adjusted if a change to n is needed during an implementation. Over 70% of surveyed users completed the survey at UMass (!!!), and no complaints were registered, not even about frequency. UMA did not have a comment box, but did have an email link; they received only one email. UMA included a question about why the survey taker is using the resource. The shared graphs are promising; of note, the URL collection stats of those surveyed generally reflect the URL collection stats of the entire user base (i.e., the sample appears statistically representative). Very promising overall!
Let’s apply this to our MINES!
- We can use the URL data that will be gathered, but it will require more involved analysis than other data sources.
- Konstantin: “We should use ‘BMC Resident’ for primary status question”. Everyone agrees with the tactful word choice.
- Google Chrome is being problematic for everyone with scrolling issues (only in the most recent update). It may interact with the survey if someone tries to get to the bottom of the primary status list. "Maybe we can change the number of visible items on the dropdown list," suggests David. Brilliance! Later, "Alas, we cannot do that," informs David. Foiled by technology again. We shall investigate during the test period whether Chrome will be problematic.
- "High school student shouldn't be at the top of the list. It's not ordered by age (Brendan grumbles), so we should order it by the groups most likely to use it, i.e., put high school students below staff on the dropdown list." General consensus; Brendan dissents vehemently. Too bad for him.
- David’s suggestion to get the sponsored question to drop down works. An excellent solution to the problem we were having.
- “email@example.com” to be changed to “firstname.lastname@example.org” so assessment committee members can personally field survey questions.
- On criteria for assigning survey-testing: “Perfect for those who have miffed you!”
- Thoughts on n:
* Larger student base and greater usage means n can be larger (or smaller? No, larger… we think).
* We can change n within the first month if our number doesn’t suffice.
* Linda: “Let’s go around and see what everyone thinks is reasonable.” Crickets
* After some deliberation, Brendan: “100 or whatever is reasonable”.
* Number-crunching Steve: "100 gets us just enough data at a 70% expected completion rate."
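The back-of-the-envelope math behind the n discussion can be sketched as follows. Only the n = 100 sampling interval and the ~70% completion rate come from the discussion; the monthly session volume is a made-up assumption for illustration.

```python
def expected_responses(sessions, n, completion_rate):
    """Rough estimate of completed surveys per period.

    MINES-style sampling: every nth EZproxy session gets the survey
    prompt, and some fraction of prompted users complete it.
    The session count passed in is an illustrative assumption.
    """
    prompted = sessions // n  # one prompt per n sessions
    return int(prompted * completion_rate)

# e.g., a hypothetical 50,000 sessions/month sampled at n = 100,
# with the ~70% completion rate UMass reported
print(expected_responses(50_000, 100, 0.70))  # prints: 350
```

This is why a larger user base permits a larger n: the prompt count scales with sessions/n, so heavier usage keeps responses sufficient even when prompts are rarer.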
Directions will be on the site. Mugar will have two demo sessions, on Friday and Monday, one in the morning and one in the evening. Everyone else should schedule demos for the other branches. Remember, we want to test on as many platforms as possible, even mobile ones. We also need to test for blank survey results. Proposal: go live on the 11th @ 12:00? … eh, maybe not. We'll see.
- Steve, on n: “Yes means no, and no means yes”
- Russ, on difference between Law Library and other libraries: “We actually provide service” (excellent smack talk, if I may say so)
- Linda: “How does Valentines day work for everyone?” All concur. There will probably be Valentine’s goodies, and Brendan will probably bring something (he didn’t mention this though).
Assessment Meeting Minutes for 01/16/14
Present: Linda Plunket (chair), Dan Benedetti (minutes), Russell Sweet, Brendan DeRoo, Lisa Philpotts, David Fristrom, Tim Lewontin
Next meeting: Wednesday, Jan. 29th 2-3:30; Administrative Conference room Mugar
MINES Initiative updates
Nearly ready to implement, but need to set up staff testing on a test website. We will be looking for any minor issues, probably with 'refnews' staff, with Linda working out which staff should test what databases. Sarah will be writing up the instructions for staff testing, and all should review them. Russ also notes we should start the surveying early enough before finals, or alternatively wait until early June.
A) Some institutions are putting open-access databases behind EZproxy in order to track them through MINES. David reminded us that many patrons go straight through Google to resources, so we won't be able to capture them. Tim pointed out that we don't send PubMed through EZproxy because of the way their website used to work, but we can now. Because of possible complications with redirects, we should carefully test the performance of very important databases like PubMed.
B) We worked on the wording of the MINES questions. Terry Plum had two suggestions on wording: 1) that we don't need an 'other' on Q2. CORRECTION BY LINDA: TERRY'S COMMENT WAS NOT THAT WE DIDN'T NEED IT, BUT ASKED IF RESPONDENTS WOULD KNOW THAT OTHER REFERRED TO OTHER-AT-BU. 2) on the 'purpose' question, to include 'research unfunded (faculty)', because it is primarily faculty that do research, and she wanted to distinguish this from scholarship. We decided to change this to "unfunded research and scholarship" and not include '(faculty)' or '(students)' in the answer options. We also decided to add an answer option for 'library staff'. Brendan caught an incorrect plural of 'affiliates'. Finally, a question for Konstantin came up: if asked to give their primary status, would faculty on the Med campus be confused between a 'faculty' option and an 'affiliate' option?
C) Two places are trying to get answers on what resources patrons are using through MINES: UMass Amherst and the University of Toronto. There will be a call with Rachel Lewellen, Assessment Librarian from UMass, on Tuesday (01/21, 10am) that all who can should attend. UMass is also increasing the number of times the survey will be filled out by having it pop up more often. We decided that we should also track resources, and David will look closely at Qualtrics to see if we can get this easily, as it seems it should be possible.