Assessment Meeting Minutes for 6/2/2014
Present: Linda (chair), Brendan (helped with minutes), Konstantin, Lisa, Dan, Russell, Steve, Sarah, David (minutes)
Next meeting: Thursday, 6/19/2014 at 2pm
Brendan: MIT Survey
Dan: Transferring Surveys
Russ: Comparing UW grad & undergrad surveys
Lisa: Ithaka Survey
Assessment Meeting Minutes for 05/19/14
Present: Linda (chair), Brendan (minutes), Russell, Dan, Sarah, Konstantin, Steve, David
Next meeting: Monday, June 2nd, 1:00-2:30; Administrative Conference Room Mugar
- Russ: Compare UW grad & undergraduate surveys.
- David: Compare undergrad & grad BU reports.
- Dan: Transfer all other BU surveys
- Konstantin: Compare the other BU surveys to each other.
- Brendan: Compare the other BU surveys to each other & troubleshoot Qualtrics issues.
- Sarah: Determine any problems with ceasing to collect discipline information & survey keeper
- Brendan received a promotion to Assessment Data Specialist (or something like that). Now we can offload all our projects onto him!
Plans for New Survey:
Linda brings up two important questions:
Dan's update - Uploading Old Survey to Qualtrics:
David's update - Reviewing Survey Reports:
Russ - Analyzing UW Survey:
Out of time. Next meeting decided for June 2nd.
c) Sarah noted that some sentences in the draft began with percentages or numbers. The committee decided that percentages were OK, but that other numbers should be written out as words. Sarah will check through the report for places where percentages or numbers begin sentences.
d) Lisa asked about the capitalization and usage of BU Libraries and all its wonderful variations. Sarah will check footnotes and captions on figures to ensure consistent phrasing and capitalization of BU Libraries.
OCLC Assessment event update
Assessment Meeting Minutes for 04/03/14
Present: Linda (chair), Dan (minutes), Russell, Brendan, Lisa, David, Konstantin
Next meeting: Thursday, May 1st 1-2:30; Administrative Conference Room Mugar
Linda will serve on a panel at an OCLC meeting at Brandeis on April 22nd called "Getting the Right Fit: Tailoring Assessment Strategies for your Library."
MINES is proceeding very smoothly now; we have had a 0% drop-out rate. We need to think about how to back up the data periodically. David believes we can download the data at any time, and we agreed that a schedule should be followed to download and back up the data periodically, although IS&T seems to trust the cloud.
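The periodic-backup idea above can be sketched in a few lines. The actual download mechanism was not specified in the discussion, so this sketch only shows the scheduling-friendly part: giving each downloaded snapshot a timestamped name so that no backup ever overwrites an earlier one. The `mines` prefix and CSV format are assumptions.

```python
from datetime import datetime
from pathlib import Path

def backup_filename(prefix: str, when: datetime) -> str:
    # Timestamped, lexically sortable name so no snapshot overwrites an earlier one.
    return f"{prefix}_{when:%Y%m%d_%H%M%S}.csv"

def write_backup(csv_text: str, directory: str, when: datetime, prefix: str = "mines") -> Path:
    # Write one downloaded snapshot to disk and return its path.
    target = Path(directory)
    target.mkdir(parents=True, exist_ok=True)
    path = target / backup_filename(prefix, when)
    path.write_text(csv_text)
    return path
```

Run from a scheduled task (e.g. weekly), this keeps a local history of snapshots even if the cloud copy is trusted day to day.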
Faculty Survey planning
We began planning for a faculty survey; the items below are not in the exact order of the agenda.
i. Game plan - general, but not unanimous, consensus to perform surveys individually instead of grouped. We also decided to renew Survey Monkey once more (Linda will do this), as the deadline is fast approaching and it may come in handy
ii. Developing the survey - will follow these steps
1. comparing to other surveys - Small groups will do initial comparison work and then bring suggestions to an AC meeting for all to discuss. Brendan will create a Google Doc in case groups wish to post something before meetings.
a. Dan will transfer the prior BU faculty survey from Survey Monkey to Qualtrics and look for any differences (particularly in skip logic). Dan will create 2 copies on Qualtrics - one to edit and one for posterity.
b. Linda and Konstantin will then look over comparisons from our own grad and undergrad surveys
c. Next, Lisa and Brendan will look at comparisons to the MIT survey
d. Lastly, Russ and possibly others will compare to UW and Ithaka instruments
2. final-pass thoughts - examine possibilities for skip logic, identify questions or other items to eliminate to keep the survey short, and look through our survey keepers list
iii. Timeline - We're thinking about Spring 2015. Russ expressed that we should be mindful of the timing, as Law faculty will be relocated during Spring 2015. Survey fatigue is also a consideration.
iv. IRB - Linda will accomplish
v. Publicity/Recruitment - poster from Anita Greene group and emails
vi. Launch - nothing noted
vii. Analysis, quantitative - nothing noted
viii. Analysis, qualitative - we have no one currently for this; there is the possibility of going back to tag previous surveys.
ix. Report writing - will likely rely on in-house talent.
x. Staff involvement in changes - we need to examine this more.
Assessment Meeting Minutes for 01/29/14
Present: Linda (chair), Brendan (minutes), Russell, Lisa, David, Steve, Konstantin, Sarah, Dan
Next meeting: Friday, Feb. 14th (Valentine's Day) 2-3:30; Administrative Conference room Mugar
- Russell will be joining us henceforth as a new member of the Assessment Committee. Welcome!
Tuesday, 1/21 Phone Call w/ UMass Amherst’s Rachel Lewellen
Present were David, Steve, Sarah, and Linda. Overall, a very positive, informative call. UMass Amherst (UMA) started out with a year-long implementation and collected data from all users who started EZproxy sessions during two 2-hour periods per month. In UMA's second year-long implementation of MINES, they switched to n = 140, and then finally reduced to n = 120. It's important not to change n too far along, so that data won't have to be excluded, although Terry Plum implied that collection data can be adjusted if a change to n is needed during an implementation. Over 70% of surveyed users completed the survey at UMass (!!!), with no complaints registered, not even about frequency. UMA did not have a comment box, but did have an email link; they received only one email. UMA included a question about why the survey taker is using the resource. The shared graphs are promising; of note, the URL collection stats of those surveyed generally reflect the URL collection stats of the entire user base (i.e., the sample appears representative). Very promising overall!
Let’s apply this to our own implementation:
- We can use the URL data that will be gathered, but it will require more involved analysis than other sources of data.
- Konstantin: “We should use ‘BMC Resident’ for primary status question”. Everyone agrees with the tactful word choice.
- Google Chrome is being problematic for everyone, with scrolling issues (only in the most recent update). It may interfere with the survey if someone tries to get to the bottom of the primary status list. “Maybe we can change the number of visible items on the dropdown list,” suggests David. Brilliance! Later: “Alas, we cannot do that,” informs David. Foiled by technology again. We shall investigate during the test period whether Chrome will be problematic.
- “High school student shouldn’t be at the top of the list. It’s not ordered by age (Brendan grumbles), so we should order it by the groups most likely to use it, i.e. put high school students below staff on the dropdown list.” General consensus; Brendan dissents vehemently. Too bad for him.
- David’s suggestion to get the sponsored question to drop down works. An excellent solution to the problem we were having.
- “firstname.lastname@example.org” to be changed to “email@example.com” so assessment committee members can personally field survey questions.
- On criteria for assigning survey-testing: “Perfect for those who have miffed you!”
- Thoughts on n:
* Larger student base and greater usage means n can be larger (or smaller? No, larger… we think).
* We can change n within the first month if our number doesn’t suffice.
* Linda: “Let’s go around and see what everyone thinks is reasonable.” Crickets
* After some deliberation, Brendan: “100 or whatever is reasonable”.
* Number-crunching Steve: “100 gets us just enough data with 70% expectation rate.”
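Steve's back-of-the-envelope figure can be checked with a couple of lines. This is only a sketch of the arithmetic: the daily session count below is a hypothetical number (the minutes don't record actual EZproxy traffic), and the 70% completion rate is the figure UMass reported.

```python
def expected_completes(daily_sessions: int, n: int, completion_rate: float) -> float:
    # Every nth EZproxy session is shown the survey; a fraction of those complete it.
    return (daily_sessions / n) * completion_rate

# With n = 100 and a ~70% completion rate, a hypothetical 10,000
# sessions/day would yield roughly 70 completed surveys per day.
per_day = expected_completes(10_000, 100, 0.70)
```

The same formula shows why a larger user base permits a larger n: doubling `daily_sessions` while doubling `n` leaves the expected completes unchanged.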
Directions will be on the site. Mugar to have two demo sessions on Friday and Monday, one in the morning, and one in the evening. Everyone else to schedule demos for the other branches. Remember, we want to test on as many platforms as possible, even mobile ones. Also we need to test for blank survey results. Proposal: Go live on 11th @ 12:00? … eh, maybe not. We’ll see.
- Steve, on n: “Yes means no, and no means yes”
- Russ, on difference between Law Library and other libraries: “We actually provide service” (excellent smack talk, if I may say so)
- Linda: “How does Valentines day work for everyone?” All concur. There will probably be Valentine’s goodies, and Brendan will probably bring something (he didn’t mention this though).
Assessment Meeting Minutes for 01/16/14
Present: Linda Plunket (chair), Dan Benedetti (minutes), Russell Sweet, Brendan DeRoo, Lisa Philpotts, David Fristrom, Tim Lewontin
Next meeting: Wednesday, Jan. 29th 2-3:30; Administrative Conference room Mugar
MINES Initiative updates
Nearly ready to implement, but need to set up staff testing on a test website. We will be looking for any minor issues, probably with 'refnews' staff, with Linda working out which staff should test what databases. Sarah will be writing up the instructions for staff testing, and all should review them. Russ also notes we should start the surveying early enough before finals, or alternatively wait until early June.
A) Some institutions are putting open access databases behind EZproxy in order to track them through MINES. David reminded us that many patrons go straight through Google to resources, so we won't be able to get them. Tim pointed out that we don't send PubMed through EZproxy because of the way their website used to work, but we can now. Because of possible complications with redirects, we should carefully test the performance of very important databases like PubMed.
B) We worked on the wording of the MINES questions. Terry Plum had two suggestions on wording: 1 - that we don't need an 'other' on q2. CORRECTION BY LINDA: TERRY'S COMMENT WAS NOT THAT WE DIDN'T NEED IT, BUT ASKED IF RESPONDENTS WOULD KNOW THAT OTHER REFERRED TO OTHER-AT-BU. 2 - on the 'purpose' question, to include 'research unfunded (faculty)', because it is primarily faculty that do research, and she wanted to distinguish this from scholarship. We decided to change this to "unfunded research and scholarship" and not include '(faculty)' or '(students)' in answer options. We also decided to add an answer option for 'library staff'. Brendan caught an incorrect plural of "affiliates". Finally, a question for Konstantin came up: if asked to give their primary status, would faculty on the Med campus be confused between a 'faculty' option and an 'affiliate' option?
C) Two places are trying to get answers on what resources patrons are using through MINES: UMass Amherst and the University of Toronto. There will be a call with Rachel Lewellen, Assessment Librarian from UMass, on Tuesday (1/21, 10am) that all who can should attend. UMass is also increasing the number of times the survey will be filled out, by having the survey pop up more often. We decided that we should also track resources, and David will look at Qualtrics closely to see if we can get this easily, as it seems it should be possible.
Assessment Meeting Minutes for 10/7/13
Present: Linda (chair), Sarah (minutes), Steve, Brendan, Dan, Lisa, David, Konstantin
Next meeting: Thursday October 31st at 2pm.
Dan- open a ticket re: google analytics Primo problem
Linda will work on process of staff feedback as part of analytics cycle
David demo quantitative data presentation
Konstantin will demo qualitative data presentation
Dan will demo GA
University Council mentioned David's article that appeared in BU Today.
Jack asked us to do some Primo usability work - requires follow-up.
Dan- tested if primo search terms were being captured in Primo analytics account - no. He has an email in to Brendan at IT, will open as a ticket
All lib mtg for Undergrad Survey scheduled for Nov 14th
Steve- ARL stats mostly done. UStat made the e-collections numbers easier than in past years, and probably also more complete.
A number of differences from past years' numbers mostly reflecting changes in how we were gathering the stats
Circ numbers are very tricky in Alma, and vastly different from III approach, so extra challenging this year
Dan- Website analytics will start counting outgoing links and searches from top page. GA not working in Primo currently.
Lisa- adobe suite finally installed
Librarywide Meeting 3-4pm Nov 14th
Terrace Lounge in GSU
Primo usability - what that would look like; is it all end-user, or also
Assessment cycle- solutions that need to be passed up the chain
Assessment Meeting Minutes for 9/16/13
Present: Linda (chair), Sarah, Steve, Brendan, Dan, Lisa, David, Konstantin (minutes)
Next meeting: October 3rd, 1pm.
-review final report by Wednesday 4pm.
- plan library wide meeting (around October 17)
- David posted it - the final version is ready!
- Everyone should read it by the end of today, 9/17 (4pm), or Thursday morning at the latest.
- Bob Hudson will send it to the University Provost and the Associate Provost, Undergraduate Affairs. The report will then be shared with the deans of the undergraduate schools, as well as other key administrators on campus, such as the Dean of Students and Vice President for Enrollment and Student Affairs. The report will also be sent to library directors.
- Sarah will work on the newsletter for the Mugar websites
The deadline for returning the survey comments was August 15. We will set a new due date after our library-wide meeting. Only the changes that were made will be posted on the website. (Linda will send an email to Dan.)
Linda reported: we don't need IRB approval, so we don't have to include the question about exclusion of students under 18 years old.
Konstantin presented the draft MINES survey to the Assessment Committee for review. Lots of good discussion and a few suggestions for changes ensued. Significant decisions made by the group included:
· The survey will be optional rather than mandatory for users.
· No comments box.
· There will not be any links included on the survey.
· The survey will be as short as possible both in terms of number of questions and choices within questions.
Questions for Scott Macomber, IS&T were raised.
· The Assessment Committee would like this survey to be optional rather than mandatory. We would like the survey to be presented to every Nth user, and then for the user to have an easy way to opt out from the survey page. How is an optional choice implemented?
· What happens when a user hits the "back" button either from the survey page or from the resource if they've already been passed through to the resource?
· If a user hits close (or submit or finished) and they haven't filled out the survey, what gets reported? Is it blanks? We are concerned since it is possible to design the survey such that the first answer option for each question is visible in its text box.
· Approximately how many times is EZproxy called on an average day? This will help the Assessment Committee decide whether N (as in Nth user) should equal 250 or 500.
· When will the test survey be ready?
· Is the survey a webpage or a pop up?
· Will we be able to change the survey after we give it to you for testing?
· Can we change the Nth number between now and January? What about changing the Nth number after we go live in January?
· How will you re-direct the user to the correct URL once the user has completed, or opted out, of the survey?
· Will the survey be accessible on mobile devices? (The Qualtrics survey looks fine on a Google smartphone.)
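The choice between N = 250 and N = 500 in the questions above turns on the daily EZproxy call count and on how many completed surveys per day the committee wants. This sketch inverts that arithmetic; the call count, completion rate, and target below are all hypothetical figures for illustration.

```python
import math

def n_for_target(daily_calls: int, target_per_day: float, completion_rate: float) -> int:
    # Largest sampling interval N that still yields the target completes per day.
    return math.floor(daily_calls * completion_rate / target_per_day)

# Hypothetical: 20,000 EZproxy calls/day, 50% completion, 40 completes/day wanted.
n_choice = n_for_target(20_000, 40, 0.50)  # -> 250
```

Halving the target (20 completes/day under the same assumptions) doubles the affordable interval to N = 500, which is exactly the trade-off the committee is weighing.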
October 7th – due date
Steve reported on the use of journal statistics. Steve, Tim, and Brian are developing a plan for cancellations for this fiscal year. The deadline for making cancellation decisions is September 1.
Lisa submitted a request for a cloud based Adobe Suite.
Usability issues - Lisa brought up the question of usability testing for library search – we will discuss it next time.
Assessment Meeting Minutes for 9/6/13
Present: Linda (chair), Sarah, Steve, Brendan, Dan (minutes), Lisa, David
Next meeting: September 16th 11:00AM
b) Sarah suggested we make sure quotes are accurate as to what students actually wrote in the report. Brendan will check over the quotes in the survey report to make sure they are consistent with what was written in the survey, and that none are duplicated.
Dan announced that Yelena reviewed the draft, and had only four or five comments. Dan will send these to David in an email for consideration. Some were the same as points raised in today's discussion, and he will note those for David in the email.
a) Review began with David and Brendan speaking about denominators on Figure 8. The issue was whether to ONLY tally those students who selected Mugar, for example, as a most frequently visited library (Brendan's query), versus tallying those students who selected Mugar as a most frequently visited library AND those students who selected Mugar as a next most frequently visited library (as it is in David's draft). It was decided that as long as the footnote is very clear as to what is being tallied (and it is in this case), we are satisfied it can stand as is.
1> Review of David's draft began and many edits were made on the fly. Reported below are items that rose to action items, or that were of some significance.
e) In going through the report, we noted some figures used inconsistent capitalization of labels. Brendan will ensure that figures in the report use consistent phrases and capitalization.
f) It was suggested that we consider making the URLs in the report live links. The committee decided against this, in part because of style considerations, and in part because we would like to keep folks focused on the report rather than leading them elsewhere.
g) Brendan pointed out that where a percentage appears in the text of the report, the quoted number may not exactly match the corresponding numbers in a chart. This slight difference is due to the way the numbers have been rounded. The committee decided this is OK, as it is a common statistical practice.
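The rounding discrepancy Brendan flagged is easy to reproduce in miniature: percentages rounded individually need not agree with one another or sum to a clean total. The three-way split below is an illustrative example, not data from the survey.

```python
# Three equal groups: each is exactly one third, i.e. 33.33...%.
parts = [1, 1, 1]
rounded = [round(100 * p / sum(parts)) for p in parts]  # each rounds to 33
total_shown = sum(rounded)  # 99, not 100 -- the text/chart mismatch in miniature
```

This is why text quoting a rounded figure can differ slightly from a chart that rounds at a different step, and why the committee's "common statistical practice" ruling is reasonable.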
2> There was a discussion about what tasks might be best suited for our student worker Maggie. Her work on the Serials Review with Steve is nearly complete. Various projects were considered, and then it was decided that David's data dashboard project may be a good fit. Steve will finish up serials work with Maggie, and then introduce her to David for work on the data dashboard he has created.
3> Next steps for the undergraduate report were considered. David will revise the report to create a new version with the suggested edits. Thank you, David! Linda will share the video Shirley created with the committee through email. Linda will perform a last review and coordinate sending out the finished report to Bob, the Provost, etc. Dan will then help post it to the website.
Assessment Meeting Minutes for 8/14/13
Present: Linda (chair), Sarah(minutes), Konstantin, Steve, Brendan, Dan, Lisa
Next Mtg: Aug 22 Thursday 1-2:30
We noted that after we finish writing the report, we should review the comments for actionable themes for library admin to act on (eg broken outlets) that might otherwise be missed.
We went over 9 of the remaining 11 comments areas for best quotes for the survey:
KS- positive services, positive subj libs
Linda- group study, light, noise
a couple executive summary comments
suggests flipping default graph layout from low-high to high-low
BU Libs consistency
89% vs. 87% - a statistically significant percentage difference for "most important"
SS and KS - we tagged them as we detected patterns (comment theme frequency). Reword the headers in the table on p. 3?
heated debate about qualitative data
Assessment Meeting Minutes for 8/12/13
Present: Linda (chair), Sarah, Konstantin, Steve, Brendan, Dan, Lisa
Next meeting: August 14th 1:30PM
Reminder: the target deadline for submitting the Undergraduate Library Survey report to the Provost is the first week of September
Our method of working with the comments: through the committee's "crowdsourcing" we've identified the most representative comments (i.e., our focus is on the comments' tone and message)
To see comments in NVivo: \Dropbox\Storage\Shirley\Comments\2013 Undergraduate Student Library Survey Comments ALL\NVivo Files
Important: the "sitting and furniture" category was separated into two categories, but in NVivo the nodes are nested, i.e., "sitting" is a child of the "furniture" node. 57 items are broadly about furniture and 66 are specifically about sitting; there's no overlap. We should relabel them or consolidate them into "Sitting & Furniture."
Important: the comments are presented in the order of importance based on today's discussion.
Dan: #10 comments about citation managers
The prevailing sentiment in comments was that EasyBib rocks (RefWorks not so much). EasyBib was by far the most mentioned tool. There were just a few positive comments on RefWorks.
The committee liked this quote best (particularly the 2nd sentence).
"EasyBib is okay, but no good if you are doing APA. I really like RefWorks, we just started using it this semester in my WR150 course."
Here are two others I chose that are more representative of the overall sentiments of undergrads.
"Easybib is my go to site for citing research papers."
"RefWorks is an extremely disappointing tool. EasyBib is much more effective and user-friendly."
Finally, here's a good one if we wish to present "the information need" in the report:
"I would like to learn more about EasyBib or other bibliographic managers as I think it could be very helpful and let you work more on your research paper than the bibliography."
Steve #3 Library Resources
" The research capability that BU Libraries offer is amazing. The amount of databases that I have access to is very helpful when I need to find information, whether it be critical essays for my writing class or scientific papers for my chemistry or biology classes."
" I really appreciate the library's selection on a wide variety of topics. I know that I'll always be able to find a book specifically pertaining to any research paper or extracurricular interest I might have."
" I think the library is a wonderful and vital part of our university. This year I have taken primarily science courses, which is the main reason why I don't use the resources offered by the libraries (Interlibrary borrowing, databases)."
"Online resource articles are great and have more books/articles online for students to access would be a great help."
"The amount of material at these libraries is fabulous. I have always been able to find a large amount of information and books on whatever topic I was looking for."
"I use the databases/eJournal access all the time but never go to the library to do it."
Some short & pithy comments:
"Wonderful and extensive."
"Electronic access is paramount!"
Lisa # 6 Mugar Library General Positive Remarks
Most of these statements were very short and general- good, great, that type of thing. The prevailing theme was that Mugar is a good place to study. Here are two quotes along those lines:
"Mugar is great! I love knowing I can go there and study or find a good print source for a paper. Everything is easy to use, and if you're stuck, there's always someone available to help."
"Mugar seems to have the best set up and is very easy for me to study in. This library in particular has become a very comforting place for me to study on those late nights."
Lisa # 15 BU Library Search
The themes in this section included difficulty finding known items and frustration over irrelevant search results.
"It’s very difficult to figure out if the library actually has a certain book. Also, many of the links to online journal articles or books that come up as results in the Mugar search engine do not work."
"The overall library search engine can be very frustrating to deal with. Oftentimes, the words that I search are not related to what the search engine produces."
Dan suggested that we can choose a comment to represent the usefulness of EasyBib for the undergraduates and/or choose a comment that compares the use of EasyBib and Refworks. The group wanted to know if it's possible to identify comments by the status - freshman, sophomore, etc.
Steve shared his observation that the representative comments were positive about textbooks. Linda suggested taking a look at OpenEducation at UMass Amherst; this is a good resource for the survey follow-up. Textbooks are important for the undergrads, and we can do something about it with these comments. (see http://guides.library.umass.edu/content.php?pid=87648&sid=1714807 ) The OpenEducation initiative offers incentives to faculty at UMass Amherst to create an openly available "course pack" in lieu of textbooks.
We discussed the stylistic "quality" of comments. We agreed that comments cannot be 'cleaned up' in ways that distort their meaning; we can correct grammatical mistakes only, so that we can use these comments in the final report.
There was some concern about mixing qualitative and quantitative data, since our goal is to identify representative comments, not to claim that the comments reflect the overall trends in the collected quantitative answers.
Assessment Meeting Minutes for 7/29/13
Present: David, Sarah, Dan, Konstantin, Brendan, Lisa, Steve (Minutes)
Top 15 comments areas:
Here is a table of the final Comments totals/hot spots.
TOP 15 Most Commented Upon Issues
Assessment Meeting Minutes for 7/15/13
Present: Linda (Chair), David, Sarah, Dan, Konstantin, Steve, Lisa (Minutes)
Discussion (Updates on AC initiatives)
Dan: Google Analytics
Linda: ARL Stats
Sarah & Konstantin: MINES
David: Undergrad Survey Report
Steve: Serials review.
Assessment Meeting 6/24/2013
Present: Linda, Lisa, Steve, David, Shirley, Dan, Kelly Sarah
Next meeting: 10:30am on July 15th (Dan on Google Analytics, Linda on ARL stats)
Linda made several announcements:
Shirley's Presentation on NVivo for Analyzing Survey Comments
[This is my “stream of consciousness” transcript of high-points of her presentation.]
Just talking about one part of a large program. Created a project, imported data from the data set (Excel file from SurveyMonkey). Will put the latest version in Dropbox: "Undergraduate Survey 2013 last.nvp".
Can choose fields to be tagged, in this case all comment fields (shown in white).
Create "nodes" (equivalent to tags, with hierarchy). Took a quick-and-dirty approach; many other approaches are possible.
To tag, highlight all or part of comment, right click, "Code Selection" at existing or new node.
Hierarchy can be edited (drag and drop, or cut and paste). Aggregate node to include children.
View->Coding Stripes lets you see what codes are applied to given record.
When you create new node, you need to look back to see what else should go there. Started with tags from previous survey.
Faster than Excel process, but not as happy with end product.
For resources, did breakdown by subject manually while coding. May be possible to do programmatically, but would be more work up-front.
Need not consistently tag positive vs negative.
Can combine data sources (though this project had a single source).
For Mugar, a lot of positive general comments. Negative comments tend to be specific.
We should not edit Shirley's file, archive it somewhere, possibly in IR with permanent embargo.
For research librarians, should send all of them all comments.
Unused "Respondent" nodes are for matrix stuff which wasn't done.
Queries: Word Frequency, with stop list
Reports: Canned, or create your own. Into a Word document.
Report Coding by Node summary, use to create top 10 list
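The Word Frequency query with a stop list mentioned above is easy to mimic outside NVivo. This sketch is a generic illustration of the idea, not NVivo's actual algorithm; the stop list and sample text are made up.

```python
import re
from collections import Counter

STOPLIST = {"the", "a", "an", "and", "is", "are", "in", "it", "to", "of"}

def word_frequency(text: str, stoplist=STOPLIST) -> Counter:
    # Lowercase, pull out word tokens, drop stop-list entries, then tally.
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(w for w in words if w not in stoplist)

freqs = word_frequency("The library is great, and the library staff are great.")
```

`freqs.most_common(10)` would give a top-10 list like the one built from the Coding by Node summary.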
Shirley has a book on NVivo; Linda will keep it. Paula is ordering another one.
Group will talk at next meeting how to carry work forward.
Goodbye to Shirley
Everyone said nice things about Shirley, which can be summed up as “Irreplaceable.”
Assessment Meeting 6/10/13
Next mtg Monday 6/24/13 9:30-11
Present: Benedetti (minutes), Riddle, Starikov, Plunket (Chair), Fristrom, Leslie-Smith, Philpotts
1. The newly revised Assessment Plan is up in Google Drive, in the Goals Folder
2. We will spend the next meeting discussing the Comments from the Undergrad Survey
Status Update on Goal Statements and the Assessment Plan
Assessment Committee (AC) members should look at and possibly comment on the Assessment Plan (2013-2016). The Assessment Plan is in the Goals folder on Google Drive, and will be submitted to the four library directors for review when completed. The current time frame for the plan is 3 years, but could stretch to 2017 - it will be an evolving document.
If there are items within the Assessment Plan that don't mesh with your Goal Statement, one or the other should be updated. To update the Plan with any changes, please send an email to Linda requesting the change. AC members should revise Goal Statements with any suggestions from the last meeting.
As we move forward with our plan and goals, we should think about how to best structure our meetings to best support work in our areas.
Later in the AC meeting, we did some live editing of the MINES goal document. AC members should investigate the use of "we" in their goal documents and eliminate those references.
David's Overview of Survey Monkey Results from the Undergraduate Survey
The Undergraduate Survey Report is largely expected to follow the general outline from the Grad Survey Report. Some areas of possible further investigation highlighted at meeting were:
Dan noticed at least one facetious survey respondent via an inappropriate comment. There could be others who simply have clicked through the survey to get into the iPad mini drawing, but how do we know? A possible answer would be to include a question like: "If you're still reading this click 2". But the general feeling among the AC was that we likely have a statistically insignificant number of these problem respondents in our results - Dan will investigate.
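One simple starting point for Dan's investigation would be a straight-lining check: flagging respondents who picked the same option for every question, as a pure click-through often would. This is a hypothetical sketch of that idea, not the committee's actual method; the question keys and answer codes are invented.

```python
def looks_like_clickthrough(answers: dict) -> bool:
    # Crude straight-lining test: every answered question got the same option.
    # Real vetting would also read free-text comments, as Dan did.
    choices = [v for v in answers.values() if v is not None]
    return len(choices) > 1 and len(set(choices)) == 1

respondents = [
    {"q1": 1, "q2": 1, "q3": 1},  # suspicious: same option every time
    {"q1": 1, "q2": 3, "q3": 2},  # varied answers
]
flagged = [r for r in respondents if looks_like_clickthrough(r)]
```

Counting the flagged fraction would also give a quick sense of whether the problem respondents really are statistically insignificant.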
David will likely have a draft by August and we would like to produce the Report for submission to the Provost in September.
Shirley's Preliminary Discussion of Work on Comments from the Undergrad Survey
Shirley is using NVivo to work on the comments this time around, which is a slightly different approach. She is not using queries but setting up nodes. Areas of the libraries that seem to have garnered the most comments:
Present: Struble , Riddle (minutes), Starikov, Plunket (Chair), Fristrom, S. Smith, Philpotts, Benedetti
Next meeting: 9:30 on June 11
assessment mtg 5/10/13
Next mtg Friday 5/24 9:30-11
Present: Struble (minutes), Riddle, Starikov, Plunket (Chair), Fristrom, S. Smith, Philpotts
1. Linda discussed the ARL stats annual meeting
2. Hired a student - Maggie. She's available through the summer until she graduates in Dec 2013. Steve is working with her currently on use data for journals. UStat is an Ex Libris use-data program for COUNTER-compliant data (a SUSHI client that was part of SFX) that we are now looking at taking advantage of. We can also add in cost data from Alma. It will be good for cost-per-use and evaluation, especially of journals.
Review Goal Statements:
rewrite and target specific areas
map goals and timeline more closely to revised assessment plan (2-3 year)
template w/ uniform goal statements
Other goals or things the Assessment committee should be doing, or does do but has not documented
4/18/13 Assessment meeting
Minutes for 2013-04-18
Present: David, Sarah, Dan, Steve, Linda (chair), Shirley, Lisa, Konstantin (minutes)
Next meeting: 10:30-12, Thursday April 18, Mugar conference room
Agenda items for next meeting: Discuss Graduate Survey results, report on selection of winners, discuss comments for F14 goals
Action Items: upload shared documents (re: FY14 leadership group plan, i.e. Grad survey/Impact Response, ARL, Google Analytics, MINES) to Google Docs (5/25) and comment by 5/8
David and Dan - communicate three names (iPad winners) to Shirley for a photo-op; normalize the survey data (dupes, 18+, undergrads)
Shirley - follow up: Mary McGowan and students (library video)
1. Research Assistant position
Linda, Steve and David will interview Mia (Meggy), a graduate student, for the Research Assistant position (the Assessment Committee needs help with usage stats analysis, comments, perhaps Google Analytics.) We also hope to set up an infrastructure so that we’ll be able to have a permanent position in the future.
2. Update to Survey Results and Library Staff Responses
Mary McGowan will be working with her students on creating a library video in which library staff will share their ideas and report on actions taken in response to the graduate student survey. Shirley will follow up with Mary.
We reached 21% response rate!
Planning our distribution of prizes:
-We need to verify the following: 1) undergrad status 2) 18+ 3) remove duplicates
Our strategy: 2 volunteers (DF and DB) - David and Dan will vet dups, undergrads, 18+
Shirley will work on the photo-op (David and Dan will provide 3 names to Shirley)
The list of all winners will be deleted after the prizes have been given out.
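The vetting steps above (remove duplicates, keep only undergrads aged 18+) amount to a simple filter over the entry list. A hypothetical sketch; the field names and sample entries are invented, not the actual spreadsheet layout:

```python
# Hypothetical sketch of the prize-entry vetting described above:
# count each email once, and keep only entries from undergrads aged 18+.
# Field names and sample data are invented for illustration.

def vet_entries(entries):
    seen = set()
    valid = []
    for e in entries:
        email = e["email"].strip().lower()
        if email in seen:          # drop duplicate entries
            continue
        seen.add(email)
        if e["status"] == "undergrad" and e["age"] >= 18:
            valid.append(e)
    return valid

entries = [
    {"email": "a@bu.edu", "status": "undergrad", "age": 19},
    {"email": "A@bu.edu", "status": "undergrad", "age": 19},  # duplicate
    {"email": "b@bu.edu", "status": "grad", "age": 25},       # not undergrad
    {"email": "c@bu.edu", "status": "undergrad", "age": 17},  # under 18
]
print(len(vet_entries(entries)))  # → 1
```

Normalizing the email (strip/lowercase) before the duplicate check is what catches entries like "A@bu.edu" vs. "a@bu.edu".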
4. Graduate Survey
Quantitative analysis – David encouraged everyone to check the analytics in Survey Monkey and report to him anything worth noting. BE CAREFUL NOT TO DELETE DATA!!!
Qualitative analysis – Shirley reported that NVivo is of limited help with analysis of comments because of export/import issues. Shirley is still determining how we can use NVivo for our purposes. One main problem is the limit of 100 comments – Shirley will find out if we're allowed to increase this limit beyond 100 comments.
David and Shirley pointed out that it would be interesting to use a mixed-methods approach and to explore how we can coordinate qualitative and quantitative types of data.
We are planning to report our results from the Undergraduate Survey in the Fall Semester.
6. Assessment Committee Goals
Rough draft of the survey (DF, SLS, LP) - updates for the group
Impact Response (Lisa) - Lisa will post updates for the group
ARL Stats - Linda will post updates for the group
Google Analytics – Dan will post updates for the group
Mines - Sarah and Konstantin will post updates for the group
We will post documents for sharing and commenting on Google Drive.
To add comments: highlight what you’d like to comment, right click, add comment
Minutes for 2013-04-01
Present: David, Linda (chair), Sarah, Kelly (minutes), Shirley, Steve, Dan
Next meeting: 10:30-12, Thursday April 18, Mugar conference room
Agenda items for next meeting: Discuss survey results, scope of report--just undergraduate survey v. all 3; continuing discussion of FY14 goals
Sarah will follow up with Lisa re: flyers in CGS
Shirley will send email with link to brainstorming assessment page to firstname.lastname@example.org
All: Read brainstorming page from top to bottom; send comments to/let Shirley know when you have read it (http://www.bu.edu/library/about/library-assessment/brainstorming/)
Sarah, Lisa: work on mock-up of UNLV and UW graphic using our data
Linda will send Sarah and Lisa a copy of the UPenn brochure
Steve and Linda will examine documents re: collections and university programs and propose similar document (see Announcements)
Linda will draw up standardized list of ARL stats and publish to library staff: this is what we collect; what are you doing with other stats? (to streamline assessment operations throughout the libraries)
Konstantin: add language to Assessment Plan re: the role of the assessment committee in the libraries: how do we support, which things are our responsibility (e.g., conducting surveys and focus groups)
Dan/Kelly: 2nd draft of Google Analytics goal, including info on working with web committee, updated timeline, proposed questions
All: In drafts of goals outlines for FY14, tie goals to Assessment Plan
Sarah: Add to MINES goals what questions we will ask during evaluation period
Undergrad survey: Faculty will get email re: undergraduate survey on 4/2, students will get reminder email on 4/4
Linda: Tim Barbari and Beth Loizeaux (Assoc. Provost for Graduate Affairs and Assoc. Provost for Undergraduate Affairs) want to evaluate collections in preparation for bringing on new faculty and starting new programs (chiefly grad. programs). Have requested a brief document on how this would work.
This will be good for communication and raising the profile of the libraries. Has been influential in other schools on going ahead with programs (Georgetown, U. Maryland).
Only issues actually being addressed should be on page (take out Stone)
Changes agreed upon: Shirley change intro on action plan responses page to present tense; add to next to last paragraph “within their departments”; make sure names are formatted consistently; school libraries now “Professional Libraries”
13% have responded so far (Survey Monkey documentation notes that you have to give answers to count as responding for statistics--does not count if someone just views the survey)
Uptick in respondents after Dean Elmore retweeted link to survey
Shirley has worked very hard on HTML for faculty email
David--Mary F has covered Com, JD Eng, Brendan Astronomy
Kelly--Flyers hung in CAS, FitRec, will take to Stone this afternoon
Konstantin--Put some in CAS
Steve--Put approx. 10 flyers around CFA, a few in SMG, including undergraduate lounge on 2nd floor
Sarah--3 flyers in SMG; SHA--left large poster with staff in dean’s office for them to hang, left 12 to be placed on tables; MET--bulletin board near undergraduate services office
Shirley--has placed 17 posters around dining services, excluding some Starbucks and Breadwinners
Dan--poster in the lobby (will be up for 1 week) and 5-6 flyers in SED; digital displays in the Center for Career Services, 100 Bay State Rd.
Linda--Bob Hudson on agenda for admins meeting; Bob Brown mentioned including in BU Today, and Linda has contacted IRB to see if this is okay--has not received response, will update group
2 emails on 4/2 and 4/4--faculty and students
Close survey at end of business day on 4/16
David will harvest data
Comments: NVivo training in fall; tutorials available now at IMC in the SED
Timetable for report: have ready to send to deans and undergrads in fall
Goals for FY14
Ongoing discussion: want to pause surveys for a year and do MINES before restarting the 3-year cycle and evaluate other tools; would give time to think about focus groups, door surveys, smaller targeted surveys on (e.g.) spaces, etc.
MINES: Will this tool work? Embraced by ARL
Factsheets: Yes--good PR
Revise Assessment Plan: 5 years old
Overall survey: assess process, look at tool itself
Audience for Goals page on Assessment page: library staff, deans, Bob; not for assessment groups at other libraries (discussions of tools not important); present tense
Can’t track outgoing links
Primo search? Leave out for now; just look at site
Questions: How can we change/improve site? Can we use GA to improve/inform collection development?
How to interact with web committee? We analyze stats and then push to web committee? What can we not control or study fruitfully? (Primo search)
Look for base set of data (old GA data)
Contact folks at IS&T after questions developed: Ron Yeany (mentioning Brendan Gannon); also bring Tim Lewontin into discussion; add these discussions to timeline
This year: 1. evaluate, 2. decide
Administering tool itself--two options:
Option 1: Do MINES through ARL and pay them for report; we would deal with IS&T; ARL delivers reports and peer comparisons
Option 2: We use tool ourselves
Questions for evaluation period:
Do we work with U. Toronto? At what stages do we work with them?
What is ARL’s timeframe?
We have to decide how often questions appear (every nth time) and how long we run it.
What info can we get on whom? (IS&T)
Minutes for 3/21/2013
Present: David, Linda (chair), Sarah (minutes), Kelly, Konstantin, Shirley, Steve.
Next meeting: 1:00-2:30 Monday, April 1st, Conference room
Agenda items for next meeting:
Linda will cover new work to be undertaken, to get the libs involved when new programs are proposed
Discuss work to articulate goals for public website
Sarah- send google groups on MOOCs and aside about hesitancy for the group registration
Shirley- move old assessment plan to website from wiki, update paragraph on top page, add Konstantin's profile
Steve- follow up with contact at Ithaka
Put posters up:
All - write up copy for website on our upcoming goals, each for their own area of responsibility
MOOCs- sign up if interested
One page handouts- Lisa will do this (will be her goal) UW and UNLV examples
Steve- Roger Schoenfeld from Ithaka would like to communicate with us about our surveys; Steve will follow up
Survey keeper- always do undergrads in spring semester to minimize underage students
Survey Keeper- there are certificate and non-degree programs in several colleges/schools. This survey does not cover any of them
David- Mary F will cover COM, and JD hit ENG yesterday
Kelly- dropped off poster and flyers at CAS (they have someone who posts them) and dropped flyers at FitRec. Will follow up tomorrow
Konstantin- CAS stuck a couple up
Put posters up in small branch libs (Astro-David, Stone-Kelly, VAL-Sarah)
Steve CFA and SMG- not yet, will hit today and tomorrow
Sarah- SHA and MET not yet
Shirley- Starbucks refused, most others were fine (put takedown date and contact info on back)
Social media - mmcgowan and Tom Kurland working on the display outside library. Error today, will be fixed tomorrow
Denise putting up today in Mugar
Dan got displays out to digital
Steve will drop off ones for SED for either Dan or Yelena to post
Linda will follow up on branches (SEL set, Steve delivering with PERL)
Reword intro on top page
Staff page- Shirley will add profile for Konstantin, add Lisa to list
Added some visual interest (charts) on grad and fac survey pages
Continues apace- tweets going out (10 total), Dean Elmore will retweet, news piece being posted, blog posting with cc to Facebook - all same wording
Surveykeeper- be more open in IRB wording for social media next time?
have leaders for each goal- post to website along with objectives and timelines, a few well thought out paragraphs of description. Committee review before posting. Linda will work with anyone who wants help on what to write up.
Google Analytics- Kelly
Konstantin- revise assessment plan (from 2009, w/ Linda)
Overall survey – David (quantitative and report), Linda (admin incl. IRB), Shirley (qualitative/comments)
Eresources evaluation- Steve to lead (Linda and David and Tim also working on this)
Other assessment collection work that we have been tasked with but not yet undertaken- Tim Barbari wants to get the libs involved when new programs are proposed (future goal)
LP will cover this in next mtg (add to agenda)
Agenda- set regular mtg times
Minutes for 3/11/2013
Present: David, Linda (chair), Dan, Sarah, Kelly, Konstantin, Shirley (minutes), Steve.
Next meeting: 11:00am on Thursday, March 21st [is going to include lunch but Linda has to leave at 1:30pm]
Agenda for next meeting: TBD.
• Linda will send out a 'cheat sheet' of things to remember when asking permission to hang posters
• Linda will let us know when the 8 1/2 X 11 posters arrive at her office. Everyone has to fetch their set of posters from Linda.
• Linda will send out the email about the upcoming survey to all library staff
• Linda will obtain a list of faculty email addresses (faculty broadcast email to go out April 2)
• Linda to get Bob’s approval for the brainstorming page
• Linda to talk to Mary McGowan about a responses video.
• Dan to check the google doc prize entry spreadsheet to check that Linda's input appears in it
• Everyone - let Linda know guestimates for number of posters needed ASAP (8.5x11)
• Everyone - prepare to put up posters in designated areas AFTER the broadcast email has been received by the undergrads (due March 19).
• Shirley to create an Outlook Calendar for Assessment that can be shared
• Shirley to open a ticket for allowing bu.edu access only to the results of the brainstorming sessions
Agenda
1. Minutes, announcements, changes to the agenda
2. Update on Undergrad Survey
Digital displays
Posters: assignments for schools, colleges, administrative units
Recruitment emails: four email webpages; test; Registrar email list for undergrads; Provost list for faculty; deans; staff
o Prizes
3. Feedback on Graduate student comments from departments & branches
4. Goals for FY14
9. Resident Assistants will post the posters on every floor of the ug residences.
23. Goals for FY14
Survey : Leaders – LP, DF, SLS
Collections:Leaders SDS, DF
ARL Stats: Leader – LP
Assessment Plan 09 revision: Leader – KS
Google Analytics: Leader – KR
MINES: leader –SRS
24. Dan will do Qualtrics training
· Change layout to subject (name) problem/solution.
· Email respondents to let them know it will be open to the public, so there’s a chance to rewrite. DF asks if we want to edit for public understanding (eg, use of words Alma and Primo)
· Set a date when it will go public [March 31]
· Responses as a PR tool –edit with an eye to public review
· Add word “comments” to heading, and put it in perspective (actions based on comments as well as aggregate data)
· Next step might be to push out through our existing PR avenues (Linda will run past Bob first – after the date to go public)
· For security responses tweak wording to reflect that door checks etc are Mugar-specific
· Pardee’s response as an example of response to user lack of knowledge about what is in our collections/databases
· Look at what branches and Law, Med, Theo can do to share the follow-up on survey comments
· Add dates to terms like “over intersession” and “by midterms” (eg, “by spring 2013 midterms”)
· Ask all staff to review ALL TOPICS for things like policy changes to BLC cards, or need for another micro scanner
· Add the fact we only asked for two topics to be identified and solutions posed
· SRS will ask Tom to review (micro scanner as example)
· LP will talk to Mary McGowan about a responses video
· Shirley will password protect with BUlibraries
· Shirley will open an IS&T ticket to get a response from Ron Yeany about levels of password protection
· Dan mentioned (somewhat unrelated) that libfiles as a place for image parking makes libfiles more open to the public. Does it need a content review?
· Assessment committee can edit in google doc, Shirley will ask others to email her directly
Minutes for 2/25/2013
Present: David (minutes), Linda (chair), Dan, Sarah, Lisa, Kelly, Konstantin, Steve. Special guest: Dan O'Mahony
Next meeting: 1:30pm on March 11 [was originally going to include lunch, but that has been postponed]
Agenda for next meeting: Wrap up on survey, Shirley's presentation, goals for committee.
We introduced ourselves to Dan O., and he introduced himself: He is Director of Library Planning & Assessment, and Scholarly Resources Librarian in the Social Sciences at Brown. Currently finding out about what assessment means to various libraries. How it is divided up (Brown doesn’t have a group right now). How to handle qualitative stuff?
Linda and Sarah described what Shirley did with comments from survey. General discussion. Dan O. said that they used lots of people in tagging; got more front-line and collection development librarians involved. Sarah asked how did they deal with privacy concerns? Dan O said it basically didn’t come up (they had stripped any identifying information).
Talk about top-down vs. bottom-up assessment. Dan O talked about involvement in space planning.
Linda asked about turnstiles added at Brown libraries (replaced guard desk checking IDs). Talked about how they handle data from turnstiles.
Dan O. left.
Linda: Update on undergrad survey. Provost weighed in at IRB, we are expedited, should hear by or before the 11th.
Publicity: Linda met with Mary McGowan, she is taking on much of the responsibility. Mainly electronic, though there will be 100s of posters. Assessment committee members will be responsible for hanging (and taking down). Dan is investigating sizes of digital displays. Shirley is working on emails. Linda will write letter to library staff, we will see before it goes out. Gifts being worked on -- meeting decided we will go with Wifi only iPad Mini with 32Gig. David will periodically harvest survey while it is ongoing; everyone else (except Dan) should not go to SurveyMonkey while survey is on. Dan will talk to Shirley to get URL of thank you page (non-templated). Dan: Fitrec electronic displays come from cable channel they don’t control, so we will go with print posters there.
Present: Linda (via skype), Shirley, Kelly, Lisa, David, Sarah, Konstantin (minutes), Dan
Next meeting Monday 2/25/13, 2:00pm--3:30pm, Mugar Conf Rm (Dan O'Mahony from Brown will be joining our group)
Dan reported on SED survey review - recommendations weren't useful, N/A - some questions were raised. This was followed by a discussion in which we've concluded that our survey is in good shape.
Linda talked about the timeline for submission to IRB. On Saturday Linda arranged with David Lazar, the Communication Specialist, Provost Office, a plan to encourage students to take the survey outside of classes. Linda also asked to create a list of the undergraduates 18+ so that we could avoid sending invitations to those who are <18.
Graduate certification programs (?)
We need a break down by school of students that need to be excluded.
Linda talked about the communications about the physical and the digital posters. Bethany and Mary's comments about colors, text, etc.
David took us over the action items from last time (changes made by Dan, Kelly's communications with MET, skip logic, graphics).
Shirley made suggestions about skip logic - what if the student skips the primary library, is the second most frequently used library going to be displayed? Dan and David will take a look at it.
Linda reported that she made changes to the Introduction to comply with the IRB - # of prizes and chance of winning. Some changes to the wording (e.g., Individual Libraries).
Discussion followed about search and discovery tools - we ask about importance and satisfaction but not about usage.
Shirley - do we want to ask specifically about our website. Discussion followed and we all agreed that for our purposes it would be too general.
WE AGREED THAT OUR SURVEY IS FINAL EXCEPT FOR PROOFREADING!
Linda will wait until Tuesday before saving the survey as a PDF file.
How soon should we let other librarians know about the survey? We agreed that this should be done after we get IRB approval.
Agenda for next meeting:
Present: Linda (via skype), Shirley, Kelly, Lisa, David, Sarah, Konstantin, Dan (minutes)
Next meeting Monday 2/4/13, 1:00pm--2:30pm, Mugar Conf Rm
1. No announcements
2. Linda relayed that no conflict of interest form needed.
3. We have administrative support from the Provost’s office for the broadcast email and the creation of the email distribution list, where we can hopefully exclude non-degree students. Bob Hudson has given the go-ahead for funding our incentives. We will not print poster or other materials until we have the final URL, QR code, and possible approval from the IRB.
4. SED grad student
5. Pilot changes to our survey
We reviewed the Pilot results and comments and came up with these changes:
Change "fifteen minutes" to "10-15 minutes".
Kelly will talk to a MET administrator to see if our two MET questions make sense.
We discussed CAS majors because of some confusion but in the end decided on not changing them at all.
On q8 in this section about "librarian-led instruction", we will add a "don't know" option.
On q9, the "Borrowing / getting items not owned by the BU libraries" option will change to match the Interlibrary Borrowing option in q23 [i.e., "Interlibrary borrowing (Interlibrary Loan, BU WorldCat Local)"].
Individual Libraries section
To help students read our questions, we will capitalize MOST FREQUENTLY and NEXT MOST FREQUENTLY on q11 and q13.
We discussed moving this section down because of some confusion on whether to answer for specific libraries (eg only for Mugar) but in the end decided on not changing the order at all.
For anyone filling out the same library in both q11 and q13, we need to disregard their answers to q14.
If a student skips q11 or q13 question, their answers on the next q should not be tallied in the data analysis.
David will double check on what we can do about being unable to control skip logic for students who skip a question. Dan believes we are unable to control skip logic for those students that skip ahead - they simply go to the next page in our survey.
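The analysis-time rules above (disregard q14 when the same library appears in both q11 and q13, and don't tally follow-up answers when q11 or q13 was skipped) can be handled at data-cleaning time even if the survey tool's skip logic can't enforce them. A hypothetical sketch; the question keys and sample responses are invented for illustration:

```python
# Hypothetical sketch of the tallying rules discussed above: follow-up
# answers count only if the preceding library question was answered, and
# q14 is disregarded when q11 and q13 name the same library.
# Question keys and data are invented, not the real export format.

def usable_followups(resp):
    """Return only the follow-up answers that should be tallied."""
    q11, q13 = resp.get("q11"), resp.get("q13")
    out = {}
    if q11:                      # q12 follows q11; skip if q11 was skipped
        out["q12"] = resp.get("q12")
    if q13 and q13 != q11:       # disregard q14 if same library in both
        out["q14"] = resp.get("q14")
    return out

r = {"q11": "Mugar", "q13": "Mugar", "q12": 4, "q14": 5}
print(usable_followups(r))  # → {'q12': 4}
```

Doing the filtering in a cleaning pass like this keeps the raw export untouched, which matters given the "be careful not to delete data" rule.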
Library Resources section
In q16 we will get rid of the "other" option that is seemingly serving no purpose.
We had a repeat of the concern about the ordering of sections during this discussion, but again chose not to change the order at all.
Research Methods section
We agreed to delete "to you" in q18 in this section. We also agreed to pass this by Steve - which Dan will do.
Overall Contributions Section
We will delete the "planning for your future" option on q22, and again will pass this by Steve.
We will change the q23 scale label on "very important" from 1 to 5, as noted by several students.
On q23 and q24 we will change "librarian assistance with reference and research" to "research assistance from librarians" which matches the option on q9.
Everyone should proofread the new version as soon as Dan is able to make the changes.
Everyone should review the graphics to be submitted soon by Mary McGowan's group of students. Would you find this attractive / want to see it on the bus, etc.?
7. IRB Approval
Present: Linda (via skype), Shirley, Kelly, Lisa, Steve, David, Sarah (minutes), Dan
Next meeting Monday 1/28/13, 2:30pm--4:00pm, Mugar Conf Rm
Review wording/ question order:
Top paragraph review for IRB:
Mary M met with Linda re publicity and incentives
17 year olds
Changes made in real time
Review order of responses on individual questions:
Dan will randomize order of items in Q9, Q12 (Q14 same random order as Q12), Q22, Q23 (Q24 same order as Q23 except overall at bottom); re-alphabetize Q11 and Q13
Cathy M coming up with list of pilotees
Linda will send us an email to review by 2pm Wed 1/23, finalized for Cathy M by end of day Wed
In email to pilotees, ask them to look for:
Assessment Committee meeting minutes 01/18/2013
Present: Dan, David, Kelly, Konstantin, Lisa, Sarah, Shirley, Steve (minutes)
Next meeting: Tuesday, January 22, 1:00-3:00 pm in the Mugar Conference Room
Dan: Complete edits discussed at today's meeting in Survey Monkey
All: Complete CITI Human Subjects training (see Linda’s email)
For next meeting: we will begin planning the pilot.
Review questions in Survey Monkey
Specific issues identified:
Assessment Committee meeting minutes 01/15/2013
Present: Linda (Chair), Steve, Konstantin (minutes), Shirley, Sarah, Dan, Lisa
Next meeting: Friday, January 18th 2:00-3:30 pm in the Mugar Conference Room
All: Edit & add assigned survey questions to Google Drive by Thursday 12pm (or sooner!)
All: Complete CITI Human Subjects training (see Linda’s email) by January 17 by 5pm
Dan: Scan the questions for continuity and consistency (i.e., tense, wording; for example, "library" vs. "libraries")
For next meeting: we should discuss the hiring of the graduate student assistant.
Linda (and the group) welcomed Lisa Philpotts (Mugar) who will be participating in the next 5-10 meetings.
2. Question Development (2nd round)
Participants discussed implemented changes. Discussion followed what further changes should be addressed next. Linda reassigned questions as follows:
Demographics (Shirley--> David )
Library Resources (Steve --> Sarah )
Research Methods and Skills (David --> Konstantin)
Individual Libraries (Konstantin ---> Shirley)
Services (Kelly -- > Steve)
Contributions (Sarah --> Kelly)
We've agreed that it's important to consider the following during our next meeting: "worst to best" vs. "best to worst" order, "least to most" vs. most to least", etc.
Consistency in wording is very important! see Q 21 for example.
Include n/a (also test inclusion of n/a in a pilot)
Assessment Committee meeting minutes 01/10/2013
Present: Linda (Chair), David, Kelly (minutes), Steve, Konstantin, Shirley, Sarah, Dan
Next meeting: Monday, January 14th at 10:30am in the Mugar Conference Room
Friday, January 18th 2:00-3:30 pm in the Mugar Conference Room
All: Complete CITI Human Subjects training (see Linda’s email) by January 16th.
All: Continue to wordsmith the questions in your section (DF - Research Methods; KS - Individual Libraries; LP - Contributions; SLS - Demographics; KR - Library Services; SS - Library Resources)
Linda : Dan O'Mahoney from Brown University will be visiting us in February to get input in order to create a more robust program at Brown. Everyone agrees that both February 11th and February 25th (both Mondays) from 2:00 to 3:30 PM will work for us. Linda will submit both times to Dan as possibilities for his visit.
Dan: Suggestion to use survey keepers as a checklist after questions are finalized, since some concern the IRB process.
2. Question Development
Change "Freshmen" to "First Year" (BU terminology)
Check on wording/titling of College of Health and Rehabilitation Sciences/Sargent College
Possibly list primary discipline for CAS and MET
Move frequency questions to Services section
Library Resources (Steve)
Bold "library resources" and/or otherwise make clear that this question does not apply to textbooks or books/materials students have purchased
Discussion on whether print and electronic books should be one option or two--would give use data but this is not a preference question; also not all books available as both print and electronic
Include "during the current academic year" wording
Research Methods and Skills (David)
Use specific database names instead of the word "databases"--e.g., JSTOR, EBSCO, PsycINFO, etc.
Add Y/N question: Have you used a bibliographic manager program to assist with citations (e.g., RefWorks, EasyBib, Zotero)?
Individual Libraries (Konstantin)
Change wording to "My primary BU library that I visit most frequently is..."
Wording issues with "Order and cleanliness of the physical collection"--want to address signage, ease of finding items on shelves
Now includes frequency questions
Include "during the current academic year" on frequency questions
Include scanners as a service
Add "Decision to attend BU" as fourth option on question "What contributions do the BU libraries make to your:"
Possibly use UW wording for importance and satisfaction questions
Make sure overall importance question and satisfaction question (Q19, Q20) have same phrasing and "importance" and "satisfaction" in bold
Assessment Committee meeting minutes 01/07/2013
Present: Linda (Chair), Steve, Konstantin, Shirley
Next meetings: Thursday, January 10th 12:30-2:30pm in Mugar Conference Room
Monday, January 14th at 10:30am in the Mugar Conference Room
All: Complete CITI Human Subjects training (see Linda’s email) by January 16th.
All: Read Kelly's google doc in the Assessment folder on the google drive.
All: Wordsmith the questions in your section (DF - Research Methods; KS - Individual Libraries; LP - Contributions; SLS - Demographics; KR - Library Services; SS - Library Resources)
Linda : Dan O'Mahoney from Brown University will be visiting us after mid-February sometime. He won the poster session at the Baltimore Assessment Conference. He created a web form to train staff to categorize and tag comments. Dan wants input in order to create a more robust program at Brown.
survey keeper - wait till Dan is back
2. Review process and timeline for the undergraduate survey
Linda would like the survey completed by mid February in order to submit it to the IRB. We should meet frequently over the next few weeks to finalize the undergraduate survey.
3. Question development for the undergraduate survey
a) David and Linda created a document combining all of the questions used in the UW and BU faculty, grad and undergrad surveys. They selected 21 questions to be included in the current BU Undergraduate Survey. Linda handed out both documents.
b) We reviewed the deleted questions, then examined the 21 questions which were suggested for inclusion. Everyone agreed on the inclusion of the 21 questions.
A)Modifications to the 21 Questions:
We need to add "Undecided" in here somehow.
Q4, 5, 6
We must specify "this academic year" in the question stem.
Sarah suggested we get feedback from Tom C. on WR150 before we decide to include Q6 but Linda pointed out that WR150 was not mandatory for all, only for CAS students.
We need to look at changing these categories. For example textbooks may be important, but datasets may not be as important to undergrads. David wanted to distinguish between the use of a textbook or use of a library book. Linda suggested that maybe we should specify that they are "library resources" but then this would preclude mentioning textbooks. This question needs to be reworded.
David wanted to capture the use of Reserves - this should be added.
David would like these options to be shrunk down somehow.
Include 'BU Libraries Search' not just 'catalogs'
Leave Google Scholar out as an option - it may be confusing for those who see it as a BU library tool.
Change the wording of colleagues to indicate faculty and fellow students (peers and professors).
Perhaps focus on citation and plagiarism here (BU faculty was concerned with this). Faculty saw ethics and plagiarism as 2 diff topics in the fac survey.
Maybe include a 'use' question here?
Q12, 13, 14, 15
We will ask them about their primary and secondary libraries here and then ask them specifics about that particular library.
Sarah would like us to include hours here too.
Relook at wording order
Could we include how libraries help info literacy skills/find resources/cite sources?(Or we could include this as a 'use' topic).
Collections and Ref services may be combined, in order to facilitate comparisons to other survey data, once we are in the data analysis phase.
These are 2 new questions - Linda referred to UW's "Overall importance to work by group" chart. We should use this kind of chart in our results.
May be useful to make the wording between Q19 and Q20 different. Instead use active voice - "you" in the question stem, e.g., How important are .... to you?
Sarah would like us to incorporate some kind of time management question (ability to manage time)
Sarah pointed out that writing questions for undergrads is very different from what we have done before. We should simplify the wording, but we should be concerned about the validity of comparisons across surveys/survey groups and over time. Linda replied that UW had made these kinds of comparisons.
Sarah - Distance ed is graduate only
Sarah mentioned UROP may be a resource.
Sarah is concerned that we are not collecting data about the libraries they don't use and why they don't use them.
Sarah would like to run focus groups to glean data needs etc. Linda suggested individual libraries may carry these out.
3. Survey Keepers
We will discuss these when Dan gets back.
4. Next Meetings
Thursday, January 10th 12:30-2:30pm in Mugar Conference Room
Monday, January 14th at 10:30am in the Mugar Conference Room
Assessment Committee meeting minutes 12/10/2012
Present: Steve, Konstantin, Shirley, Dan, Sarah, Linda, David (minutes), Kelly
Next meeting: January 7th 1:30pm in Mugar Conference
All: Complete CITI Human Subjects training (see Linda’s email) by January 16th.
All: Look at links about project management in Konstantin’s email.
All: Look at ARL letter attached to Linda’s email.
Dan: Move or tag survey keepers that don’t apply to current survey.
Kelly, Sarah, Konstantin: Work on wording for the first 5 undergrad questions before next meeting.
Linda: IS&T has bought 500 seats on NVivo (http://www.qsrinternational.com/products_nvivo.aspx) -- Qualitative data analysis software. We hope to use. Not sure what campuses will have access; hope it is “One BU.”
Linda: We have the OK to interview and hire a graduate (or undergraduate) student for Assessment, using Anita Greene funds. 5 to 10 hours.
Linda: All have to take CITI training; Linda will send the details, including which modules. Get it done by mid-January.
Looked at "Question & Wording Comparison for BU's Undergraduate Student Library Survey" document (in Assessment Google Docs collection) for planning for undergrad survey. Sarah took notes directly in document.
Linda will be focusing on IRB. Kelly, Sarah, and Konstantin will concentrate on wording of undergrad survey. They will go in depth on the 5 questions we covered in meeting.
Assessment Committee Meeting minutes 11/30/12
Present: Linda (chair), Sarah, Konstantin, Kelly, David, Shirley, Dan (minutes)
Next Meeting: 12/10/12
Announcements: Steve Smith will join us at our next meeting.
We also decided to hold meetings every other Monday at 2:30pm.
· Look at Shirley's Google Doc for collecting library improvements.
· Take a look at the Rationalizing ARL Annual Statistics Collection – Draft, for consideration by the Assessment Committee 11/8/12
· Konstantin will investigate Microsoft Project vs Zoho and will report.
· Take a look at Kelly’s and Sarah’s document
· Take a look at Survey Keepers
A) Projects -
Back Burner - Alma Analytics and ACRL Metrics
Middle Burner - Google Analytics and MINES (The year after the undergrad survey might be good for MINES)
Front Burner - undergraduate survey creation (led by Linda) and Grad Survey feedback (led by Shirley)
New project: eResources Collection Assessment -
Steve Smith will be coming on board with this.
Probably will involve looking at high ticket items, packages.
What is our preparedness for an uncertain budget?
Project will involve
- looking at Counter compliant stats
- SFX to see overlapping subscriptions
- careful consideration for departments with only a few very important resources
- communication with faculty and students
- dividing resources into 3 categories (can't live without, really want to keep, let's talk).
B) Project Management (led by Konstantin)
Konstantin showed us three project management tools. All would take some dedicated staff time to keep the system up to date.
1. Basecamp - can control who gets each email notification; to-dos can only be assigned to one person at a time; where is the Gantt chart?
2. Zoho - easy to spot task progress and milestones; convenient dashboard; like Facebook but for projects; prioritized tasks; task dependency; recognizes Google APIs; $50 per month
3. Clarizen - visual roadmap; milestones and tasks (with managers); document discussion; time tracking; $25 per seat per month; visually reminiscent of SharePoint.
Discussion followed about Microsoft Project, Excel, and Outlook as to what extent we might use these as a PM tool. There was a concern that Project might not be cloud-based. Konstantin will look into Project to learn more, and confer with IS&T. The group generally favored Zoho.
C. Feedback on brainstorming library improvements (led by Shirley)
Feedback already received from Roger, Lou and Vika. With the Alma implementation, some folks are not able to address this immediately. Shirley created a Google form to collect responses to the email scheduled for January.
Present: Linda (chair), Sarah, Konstantin (minutes), Kelly, David, Shirley, Dan
Action items for next meeting (Next meeting: New date/time/location is Friday, 11/30, 2:30-4pm in the conference room, Administrative Offices, Mugar.)
· Dan will revise a draft email about brainstorming ideas for library improvements that could be released to Department and Branch Heads.
· Linda will send Shirley a list of email contacts so that we can look forward to sending an email (that Dan has been working on) to librarians/department heads (around January)
· Take a look at the Rationalizing ARL Annual Statistics Collection – Draft, for consideration by the Assessment Committee 11/8/12
· Konstantin will compare Zoho, Basecamp and Microsoft Project for next meeting
· Take a look at Kelly’s and Sarah’s document
· Take a look at Survey Keepers
· Take a look at Shirley’s document and try to think what else can be added
Linda’s feedback:
· The poster sessions – “Using What You Collect: Library Data and Student Success” from the University of Minnesota
· Linda’s and David’s visit to UMass
· Project management software / timelines (Sarah and Konstantin)
· Document for facilitating brainstorming session with staff (Dan)
· ARL/BU stats documentation (Linda)
· UW’s undergraduate survey – wording, etc. (Kelly and Sarah)
· Alma Analytics (David)
· Document for comments processing (Shirley)
Linda’s feedback about the conference
· MINES (Measured Impact of Networked Electronic Resources)
o Terry Plumb, B. Franklin – web form for electronic resources
o York University – tied it to the “stream,” i.e. the EZproxy stream (for example, EBSCO, Sage ebooks, CSI, etc.)
o Linda suggested that since this kind of reporting is valuable for administration, we could do it once a year or more
§ We all agreed that it’s a great idea to go through ARL, although it is possible to do it on our own.
The poster sessions – “Using What You Collect: Library Data and Student Success” from the University of Minnesota
· GPA and retention rates, as tied to library data
o Web site login, circulation, instruction sections, reference sessions, etc.
Linda’s and David’s visit to UMass
· Rachel’s feedback: Comments sent out to department head
o Asked department head to give feedback after meeting with the staff
“Brainstorming” sessions with staff
· Dan prepared a document for brainstorming
· We all agreed that it doesn’t have to be a “survey monkey”
· We also agreed that the timing has to be right
· We can also send a “save the date” message to department heads in January.
Sarah and Konstantin
· Basecamp, Zoho, Microsoft Project, use of Excel sheets
· Konstantin will compare these products for next meeting
· Dan created a document about brainstorming ideas
o For next meeting Dan will revise draft email about brainstorming ideas for library improvements that could be released to Department and Branch Heads. We are not going to post comments but we can offer to contact Shirley
o We should send out to ALC in January
§ Linda will send Shirley a list so that these people can look forward to (January) a document that Dan has created (action item)
§ See Rationalizing ARL Annual Statistics Collection – Draft for consideration by the Assessment Committee 11/8/12
Kelly and Sarah
Question and Wording Comparison for BU’s undergraduate student library survey (available on google sites)
· Kelly and Sarah compared facilities, frequency, resources, sites, etc.
o Revisions that are needed
o We need to be clear what questions we could group together and report on. This will affect our wording (action item)
§ Consider defining core questions
o Is there an overall contribution question? Yes.
o Sarah suggested concatenating some questions (frequency and X)
o Linda – better use of skip logic
o Next time: take a look at Kelly’s and Sarah’s document (action item)
· ALMA analytics
o Reading documentation wasn’t helpful
· Placed Grad Student Survey documentation in a Dropbox (Documentation- Comments)
o Preparing the data spreadsheet
o Setting up the categories
o Sorting the comments on department contacts in the spreadsheet
o Follow up
o Development -> Analysis
o What we do/how we do it/why we do it/resources
· Project Management
o ACRL Metrics
Take a look at Survey Keepers
Take a look at Shirley’s document and try to think what else can be added
Minutes: Thursday 10/25 9am-10:30
Present: Konstantin, Kelly, Shirley, Dan, David, Sarah (minutes), Linda
Friday Nov 9, 2pm Agenda item: create timeline for tasks/goals
Sarah and Konstantin will look for (free) software for project mgmt/timelines
Dan will create a draft document for facilitating brainstorming sessions with staff
Linda will look at ARL/BU stats documentation and draft note for folks she's working with
Kelly and Sarah will get UW's undergrad survey and start thinking about changes vs. keepers from their questions
David will bond with Alma analytics
Shirley will draft document for comments process
timeline of our tasks/goals: project management approach to scheduling our work for the upcoming year(s) (Gantt chart?), rewrite charge
Timeframes - 2nd cycle right away?, Yearlong or multiyear schedule, review approach, pull in Gordon, ask UW how they did longitudinal work
Document steps of process for data analysis (DF), comments (SL-S)
Response approaches to finished surveys
DB asked what if anything ever happened to bulleted followup suggestions from faculty survey?
*followup on grad and faculty
Develop undergrad survey
Fix up our existing Google wiki- move to BU google sites and clean up nav, etc
Network with others on campus
-format and platform for e-materials?
will fall to Steve
ARL annual stats
LP has a charge to, in the next year, work with Law, Med, and Theo as well as Ruth Milesky from Institutional Research on rationalizing how ARL annual numbers are gathered and reported across the 4 BU libraries
- we have purchased access to this tool
- includes access to ACRL's Counting opinions database (get it cheap b/c we submit ARL stats), which many institutions find useful for peer comparisons (despite the quality of the ARL data behind it)
- Investigate ways to use?
DB and DF asked how much of the followup should fall to the assessment group. To what degree does this activity meet our group's charge? We were all in agreement about the need to track, document, and broadcast how change is happening based on data and survey results. SS pointed out that our group may be uniquely poised to facilitate communication of ideas from front end staff to policy makers who can institute that change. Suggestions about getting buy-in from department and branch heads.
Minutes: Tuesday 10/2/2012 10am-11:30am
Present: Linda Plunket (chair), David Fristrom, Sarah Struble, Shirley Leslie-Smith, Kellie Riddle, Konstantin Starikov (minutes).
Next meeting: Thursday 10/25 9am Mugar
1. Alma Analytics
· Reporting function
o Ebooks (!)
o The creation of lists functionality is very robust in Alma
o Need for analytics documentation
o Use of widgets
§ We should have a widget
o We need to take a leadership role and/or identify staff in each library to explore analytics capability
o Is it possible to import Primo analytics data into Alma?
o Our goal – use ARL questions as a sandbox for creating stored logical sets
o Before Millennium goes away, Jack will run the stats as a backup.
2. Google Analytics
o Feedback: Brendan Gannon’s presentation (9/21) was very helpful. Brendan emphasized the need to formulate questions about our usage first, before delving into the data.
o Should we set up the model for coordinating web analytics and Alma analytics teams?
o Delegate the responsibilities to experts but have the group (committee) develop broad questions.
o We need a technical lead
3. Preparing for 10/18
o Encourage your colleagues to come.
o On the agenda:
o David will give overview of the report
§ David suggested that we shouldn’t go over specific questions but rather remind everyone what the sections are
o Shirley will give overview of the comments
§ Shirley will give examples
o Linda will lead the discussion but everyone will be encouraged to contribute.
· Bob Hudson will open up the meeting and place it in context
· New staff will be introduced by supervisors
· We should also do a general discussion about the conclusions.
· Bring up the Assessment page so that people will identify members
§ Agenda for 10/18:
o Bob Hudson will introduce new staff (10 minutes)
o Linda will go over the report and agenda for the meeting and that people should feel comfortable contacting assessment members (5 minutes)
o David will use Prezi to talk about the survey (facts, charts, graphs, reports) (up to 30 minutes)
o Shirley will talk about the comments (15 minutes)
§ Will talk about the process of identifying and tagging data, trends, etc.
o Questions from the audience
§ If there are no questions, we can open the discussion by talking about lost books, group study space, and training patrons on finding e-journal articles, and explore ideas for how we can improve our services.
Discussion: what investigators will do to address questions from non-committee library staff
· Linda: we are not going to announce what we will or will not do but at the same time as members of the Assessment Committee we will find the time to provide data that will help librarians with departmental decisions.
· Sarah - librarians should come with specific inquiries in order for us to help them.
Sarah - Sarah will let Linda/Bob know if there will be guests who are not librarians.
Logistics - bring a computer.
· Linda will check if wi-fi access is available
· Room access (set up, etc.) – we need to access the room 15-20 minutes before the event.
4. Future goals: preparing for the undergraduate survey
· March of 2013 (?)
· When should we start – last year, about right after Christmas
· Take a look at the “things learned” on our website and also at the recommendations from Steve’s report (The University of Washington)
· Pilot - important to do for the next survey
Minutes: Wednesday 9/5/2012
Present: David, Linda (chair), Kelly, Sarah, Shirley(minutes), Young-Joo.
2. Farewell to Young-Joo.
1. Numbers check. David made a few changes based on Dan’s findings. Thanks to Dan for checking the numbers. Dan will check the numbers in the Appendix.
2. Bob Hudson’s name is no longer on the survey. He felt that those who were responsible for the document should receive the credit for it. The Library Assessment Committee is now credited.
3. Sarah suggested sharing the survey report with Nancy Coleman (Head of Dist Ed.). Linda will confer with Bob about sharing the report with the Deans and Nancy.
4. Survey Report Discussion
4.1. Pg 2 Place to study. Dan wanted more info, so David will add a sentence indicating that study space for grad students was the greatest need.
4.2. Pg 5 David will try to adjust figure 2.
4.3. Pg 6 Fig 4 - Linda requested that librarians take care to explain that the size of the circle relates to the number of respondents, not total students. Young-Joo and Kelly in particular should relay this information to Mary and Amy. The response rates per school may be used to illustrate this. Sarah compared the School of Social Work (~60%) with the Dental School (~14%).
4.4. Pg 10 The comment reflects that there is still a need for print resources in some areas.
4.5. Pg 14 Dan’s suggestion was considered. The footnote was deemed to be important and will be kept.
4.6. Pg 15 The title of fig 13 includes the word ’Use’ in reference to class presentations and online guides. We couldn’t think of a better word to replace it.
5. Next Steps:
5.1. Linda will give the report to Bob who will send it to the provost, and will ask for a meeting thereafter. That day Linda will give the report to Mary, and then get it to the Deans.
5.2. Bob and Linda will look at sending a copy of the report to ALL of the graduate students.
6. Comments Discussion
6.1. Change Pappas to Law Libraries
6.2. Add filters and freeze panes to each tab
6.3. Complete comments as soon as possible
6.4. Ask Konstantin to take over Young-Joo’s med comments (no student assistance allowed)
6.5. Anonymity is paramount
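A minimal sketch of how item 6.2 ("add filters and freeze panes to each tab") could be scripted rather than done by hand, assuming the comments spreadsheet is an .xlsx workbook; the filename is illustrative, not the committee's actual file:

```python
# Hypothetical sketch: freeze the header row and add an auto-filter on every
# tab of a comments workbook. Uses the openpyxl library; the path is assumed.
from openpyxl import load_workbook

def add_filters_and_freeze(path):
    wb = load_workbook(path)
    for ws in wb.worksheets:
        ws.freeze_panes = "A2"               # keep the header row visible
        ws.auto_filter.ref = ws.dimensions   # filter across the used range
    wb.save(path)
```

Running this once on the shared workbook would apply both settings uniformly, instead of repeating the clicks on each library's tab.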
7. Open Staff Meeting
7.1. We may have to have two meetings – one general meeting and then a meeting to discuss the comments spreadsheet and how to use it. Try to complete the comments ASAP.
7.2. Venue may be in the Law tower or the PAL Study Lounge. Where and when to be determined.
8. Sarah noted that we still need to keep working with the data, even though the exec summary is complete.
9. Appendix Discussion
9.1. The Appendix will now be called ‘Supplementary Data’ and this must be changed throughout the report.
9.2. Dan will check the numbers.
9.3. Linda will revise all of the key categories throughout the report (e.g. change ‘Important’ to ‘4’) and will change the charts to reflect this.
9.4. The report and the supplementary data will appear on the Assessment Committee web page once complete. (Law, Med and Theo can post it on their sites as they prefer).
10. ALMA Training
10.1. Linda mentioned the Tuesday (1:30-3:30) and Thursday (10:30-12:30) training sessions and recommended that assess-l attend.
10.2. The Assessment Committee should all attend next Tuesday's training session on Reporting and Analytics. It starts at 8am and ends at 4pm.
11. Google Analytics
11.1. Meeting with Brendan Gannon at 3pm on Sept. 19
11.2. We must think about how we can use the data that we are collecting.
11.3. We should try to complete all GA tutorials, read the FAQs, etc. before the meeting.
Meeting Date: July 26, 2012
Present: Linda, David, Jim (minutes), Dan, Sarah, Shirley, Young-Joo
Meeting Date: June 28, 2012
Present: Shirley, Dan, David (minutes), Linda
Next Meeting: Friday, July 13th, 2:00pm, Mugar Admin
Linda described what is going on with Google Analytics for new website. There were some questions, so we called Brendan Gannon from IS&T to get some answers:
We looked at Appendix Draft.
A question was raised about using “Satisfied” etc. as labels when the actual label was 4. Unresolved.
Should we do an executive summary? Maybe wait until we have the report.
Next steps? Try to come up with a narrative for the report.
Library-centric report vs. grad-student-centric
Meeting date: Thursday, May 24, 2012
Present: Dan Benedetti, David Fristrom, Young-Joo Lee, Shirley Leslie-Smith, Linda Plunket (minutes), Sarah Struble
Next meeting: Tuesday, June 12, 3:30-5PM
Reviewed Survey Monkey analysis by page and question (ignoring the comments for now)
Since everyone is very busy with the website transition, Alma implementation, end-of-year spending, and L2324 evaluations, few have had a chance to look at the data. We reviewed the data together to get a sense of the data and what we need to do.
1. Is your primary affiliation with a Distance Education or Online Learning program at BU?
PAGE: DEMOGRAPHICS PAGE 2
1. With which campus are you primarily affiliated?
2. What is your degree program?
3. As a graduate student at BU, have you:
4. What is your primary discipline?
5. With which college/school are you primarily affiliated?
PAGE: LIBRARY RESOURCES
1. How frequently do you visit a BU library in person?
2. How frequently do you access BU Libraries' resources, services or websites online?
3. Please rate the IMPORTANCE of the following resources to your work:
4. Please rate your SATISFACTION with the following BU library resources:
PAGE: RESEARCH METHODS
1. How important have the following skills been to your success in your current academic program?
2. Please rate the importance of the following for FINDING the resources you need for your work:
PAGE: LIBRARY SERVICES
1. Select the BU libraries you visit IN PERSON on a regular basis:
2. Are BU Libraries open when you need them during periods listed below?
3. Please rate the IMPORTANCE of the following BU library services to your work:
4. Please rate your SATISFACTION with the following BU library services:
5. Please rate how useful augmenting the following library services would be for your work:
PAGE: OVERALL CONTRIBUTION AND SATISFACTION
1. What contributions do the BU libraries make to your:
2. How satisfied are you with the BU libraries':
Minutes: Monday 5/10/12
Present: Linda, Dan (minutes), Young-Joo, Sarah, Shirley, David, Jenna, Mary F.
Next Meeting: Thursday, May 24, 2012 @ 9:30. Mugar Conference Room
Data Management Pilot Project with Assessment
Jenna and Mary from the Data Management Group (DMG) joined us for a discussion of the pilot project for our data. Dataverse is an open source tool for organizing research data, although there is a cost associated with it. It is different from repository software because the focus is on data, whereas DSpace is focused (or used) primarily on text documents.
Data we could provide for public consumption, or sharing with other stakeholders on campus:
Excluded were the comments and the raw data itself, unless the storage space has sufficient security safeguards.
ARL Annual Stats Survey (notes from LP)
A revised version of the ARL annual statistics survey will be finalized after this meeting. The Committee is refining the instructions. A few of the supplemental statistics have been incorporated into the survey. The rest have been deleted.
The loss of the serials count and the addition of a title count regardless of format were the most controversial changes. The aggregators confound the issue. ARL libraries cannot agree how to count titles. Do we count based on what we steward, catalog, maintain, pay for, or make discoverable? If a collection has 8000 items, do we count this as one or as 8000? Should HathiTrust titles be counted? One member library jumped from a serials count of 9M to 14M in one year by adding in what they held in common with their statewide system. The instructions will attempt to clarify how we count.
The value of the statistical data is in its depth and breadth, in the ability we have to compare institutions. The problem with the data is how we count.
The survey was abbreviated so that:
1. The Committee is considering developing an ongoing facilities inventory that would be online and could be updated as changes are made. This might include renovations, number of physical items and off-site storage. This might include space as service and narrative descriptions.
2. The Committee is considering developing a survey about Special Collections. This is in an early discussion stage.
Grad Survey Analysis
David has been working on translating the survey into JMP for manipulation not possible in Survey Monkey. We discussed manipulation of those who responded "other" to the discipline question. Shirley will learn more about JMP and possibly help David translate schools/departments into a list that would filter "other" responses into the disciplines for analysis. Next steps are for everyone to look for what charts we want to highlight in the main report. By the next meeting, everyone should look at the survey results with no filters (no schools or departments) applied for interesting results. For the meeting after, we will be looking for interesting results with filters applied (each person on the committee will take a different school for filters, like last time). Our goal is to produce one full report.
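The school/department-to-discipline lookup described above could be sketched as a simple mapping; the entries and discipline names below are illustrative assumptions, not the committee's actual list:

```python
# Hypothetical sketch: fold free-text "other" discipline responses into the
# standard discipline categories via a school/department lookup table.
# The mapping entries are invented examples for illustration only.
DEPT_TO_DISCIPLINE = {
    "school of management": "Business",
    "college of engineering": "Engineering",
    "school of theology": "Humanities",
}

def classify(response):
    """Return the mapped discipline, or 'Other' if no rule matches."""
    key = response.strip().lower()
    return DEPT_TO_DISCIPLINE.get(key, "Other")
```

Unmatched responses stay in "Other," so the table can be extended incrementally as Shirley and David review the free-text answers.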
Minutes: Monday 4/30/12
Present: Linda, Jim (minutes), Young-Joo, Sarah, Dan, Shirley, David
Next Meeting: Thursday, May 10, 2012 @ 2:30. Mugar Conference Room
BLC Presentation with Megan
Minutes: Thursday 3/29/2012
Present: Sarah, Shirley, Dan, David (minutes), Linda (chair), Jim
Next meeting: Thursday, April 12th, 2pm. Mugar conference room.
Robert Hudson stopped by and told us about his contact with the Deans; he is pushing them to push the survey.
Report on response to survey; currently at 21% overall.
We looked at response rates for the various schools and colleges.
Sarah met with someone from distance ed to find out who to contact about getting the word out. Will send email to Nancy Coleman and Eric Friedman.
MET is going to be hard to reach.
Talk about varying responses to postcards at varying venues.
GSU will continue for another week, SMG will ramp up again for another week.
Talk about keeping copies of response tracking so we can look at results over time. David pointed out that SurveyMonkey gives us date (and time) stamps of when people complete the survey, so the response rate over time could be charted.
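The charting David describes could start from a small tally of the exported timestamps; this is a sketch assuming the timestamps come out of SurveyMonkey in ISO format, which is an assumption about the export:

```python
# Hypothetical sketch: turn survey completion timestamps (ISO format assumed)
# into a cumulative daily count, ready to plot as a response-rate curve.
from collections import Counter
from datetime import datetime

def cumulative_by_day(timestamps):
    """Given completion timestamps, return (day, running total) pairs."""
    per_day = Counter(datetime.fromisoformat(ts).date() for ts in timestamps)
    total, series = 0, []
    for day in sorted(per_day):
        total += per_day[day]
        series.append((day, total))
    return series
```

Dividing each running total by the number of students emailed would give the response rate at each day, which could then be charted in Excel or any plotting tool.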
There is a JMP presentation on April 4th, but it was decided no one needs to attend.
Linda mentioned that Washington does all three (faculty, grad, undergrad) surveys together. Do we want to continue doing one a year, or switch to their model?
Full week until the next reminder goes out to grad students.
Sarah talked about presentation she saw at Library Technology Conference on visual display of data.
Minutes: Thursday 3/22/2012 1:30-2:30
Present:Linda (chair), Dan, David, Jim, Sarah (minutes), Shirley
Next meeting: Thursday, March 29, 2pm. Mugar large conf rm
No one but David Fristrom and a certain anxious but unnamed assessment committee chair should be going in to the survey in Survey Monkey while it is open.
DF is doing daily loads in XLS of both survey results and drawing entrants list, saved to dropbox folder "2012 graduate survey backups"
LP has spoken to Jenna about using our Faculty Survey data as a test case for the Data Management group. We agreed to work with them to create a data management plan. This will include only our fac survey material, not all of the assessment committee's work (at this time). As DF is on both committees, he is going to communicate with Jenna about moving forward with this.
Minutes: Monday, 3/12/2012, 9:30-11AM
Next meeting: Thursday, March 22, 1:30-3:00 pm
Everyone: fetch your posters/postcards from Linda's office. The easels are in the coat room.
[Take another poster per school as per Linda's direction:
Dan Benedetti [1 large dining hall poster for the BU Pub (use your own easel); 2 large posters for PERL; 2 large posters for PARDEE; 3 small posters for SED departments; Shirley has the 24 CAS posters; and a bunch of postcards]
David Fristrom [1 large dining hall poster for the SMG Starbucks (plus 1 easel); 2 large posters for SEL; 2 large posters for ASTRO; 3 small posters for ENG departments; 3 small posters for CFA; and a bunch of postcards]
Shirley [1 large dining hall poster for the School of Law Cafe (plus 1 easel); 1 large poster for Pappas; 2 large posters for African Studies; 1 large poster for ASC; 24 small posters for CAS departments; 6 small posters for LAW departments; and a bunch of postcards]
Sarah Struble [1 large dining hall poster for the F/S Dining Hall (plus 1 easel); 2 large posters for MUSIC; 4 small posters for MGMT departments; 3 small posters for SARGENT; and a bunch of postcards]
Linda Plunkett [1 large dining hall poster for the GSU Food Court (plus 1 easel); 2 large posters for MUGAR; 2 large posters for STONE; 3 small posters for COMM departments for Diane D'Almeida; and a bunch of postcards]
Jim Skypeck [2 large posters for THEO; 10 small posters for MET departments; 3 small posters for THEO departments; (zero posters for SSW)]
Young-Joo Lee [1 large dining hall poster for the Med Campus; 2 large posters for the MED library; 4 large posters for the schools; 10 small posters for DENTAL departments; 8 small posters for MED departments; 8 small posters for SPH departments; 4 extra small posters; and a bunch of postcards (1/4 from the box, I don't know how many -Young-Joo)]
Linda to send a test email, containing the proposed link to the graphic html page, to assess-l, in order to facilitate a decision on whether to include the graphic in emails 2,3 and 4 or stick with the text option.
Linda to contact Diane d’Almeida regarding CAM posters
Linda to ensure posters for Med are ready by Wed/Thurs and will give them to Joe to take back.
Linda to contact ?Rubin Rossi? about reserving the GSU link (GSU Demo Area: We have booked the Demo Area of the GSU (first floor, entrance to food court) all day, M-F, 4/2 through 4/17, but not 4/4. The dates between 3/20 and 3/30 were already booked. - Linda 3/12) (Thanks to Jim for the suggestion to try for the link. I think it's close enough that we can consider it the GSU. I called GSU Reservations and spoke with Jim. He said there are no tables left for 3/20-3/30, but he gave us permission to stand with an easel between the bottom of the staircase to the 2nd floor and the door to the outside courtyard. He said no one else should be in this spot. He will let John Battaglino, the manager of GSU, and Mindy, from the Student Activities Office (SAO), know that library staff have permission to stand and hand out postcards. - Linda 3/13)
Linda will let us know as soon as Bob’s email has been sent
Linda to order an extra poster for the African Studies Center.
Linda to send an email giving staff basic info (start and end dates) and indicators on do's and don'ts surrounding the survey. Linda will contact Marlene. Young-Joo and Jim will pass it on to their staff.
Linda and Dan to add their SurveyKeepers to the site
Dan and Linda to work on connecting the survey to the done page (prize entry page)
Dan to create banner on top pages for all library sites (Library websites message/banner-- OK'd by Bob Hudson and Tim Lewontin. Dan Benedetti will create a banner message and post it no sooner than 3/20 on the top page of www.bu.edu/library. He will ask each Mugar branch head if they would like a banner on their top page. If so, he will do this for them and post no sooner than 3/20. Dan will also let Marlene, Jim Skypeck, and Young-Joo know that we are doing this and offer to give them the HTML code if they would like to follow suit.- Linda 3/12)
Dan to check if the collector/existing survey results can be cleared (and should ensure this is done on the night of March 19). (Dan was able to clear the one person who had taken the survey. So please do take the survey using this URL--surveymonkey.com/s/bugradstudents. Please take the survey all the way through, including registering for the drawing. Please email me by Wednesday evening after you've done this and let me know if it went smoothly or if you ran into any glitches whatsoever. Then Dan or I will clear the survey and the drawing spreadsheet before 3/20. - Linda 3/12)
Dan to check with Yelena to see if she can change her hours to help in the BU Pub.
Sarah to call ODE – (look over Tom’s email beforehand)
David to contact Brock for help handing out postcards at SMG Starbucks
Sarah to remind Lisa to place posters
Young-Joo to ask Constantine and Mary to place posters while she is away.
Survey recruitment strategies:
· Broadcast emails:
1. Linda contacted the provost's office and the IRB and received permission to include a link to a graphic html page in the 4 broadcast emails that are to be sent out. Susan Wishinsky created the web page at www.bu.edu/library/about/gradsurvey/gradsurvey.html. The link out to the web page is embedded in the email, not the graphic itself. No attachments are permitted in broadcast emails.
2. The smaller graphic is darker so Mary McGowan and students are to work on this
3. David noted that many browsers do not display graphics by default and suggested that the first email should be sent in a text only (universal) format.
4. Sarah suggested we send a test email to ourselves first. Linda agreed and will send a test email to assess-l.
5. A final decision on the graphic/text emails will be made at the next AC (assessment committee) meeting.
PJ is lining up an analyst for our email list; it has not yet been created. (I have asked that the analyst who is developing the email list of graduate students give us counts that will be helpful for the analysis of the data. I've asked for the number of graduate students in each school or college who will be on the email list. From this we can figure out the breakdown of CRC vs. Med Campus by adding up the number of grad students in each of the three schools on the Medical Campus. I also asked, if possible, for the analyst to send us the number of Distance Education students who would receive the emails. I don't think they can do this, but I thought I'd ask. Is there any other information we want from the analyst creating the email list? – Linda 3/12)
The survey does not yet connect to the “done” page which collects the prize entry information. Dan and Linda will work on this.
Linda would like to test to ensure that the collector is working and that the spreadsheet is functioning before March 20. Dan will look into this.
The existing survey results should be emptied/deleted, preferably on March 19. Dan will check to see how/if this is possible. (Dan was able to clear the one person who had taken the survey. So please do take the survey using this URL--surveymonkey.com/s/bugradstudents. Please take the survey all the way through, including registering for the drawing. Please email me by Wednesday evening after you've done this and let me know if it went smoothly or if you ran into any glitches whatsoever. Then Dan or I will clear the survey and the drawing spreadsheet before 3/20. - Linda 3/12)
1. There are:
a. one 13x19 per school
b. two 13x19 per library (try to place in lobby)
c. one 8 1/2x11 per department
2. Locations and numbers of 8 1/2x11 posters:
a. Med (8 posters) – Young-Joo (will ask Constantine and Mary to place the posters at Med)
b. SPH (8 posters) – Young-Joo
c. Dental (10 posters) – Young-Joo
d. ENG (3 posters) – David Fristrom
e. Sargent (4 posters) – Sarah (to remind Lisa; also find out no. of posters; place in grad lounges)
f. COMM (3 posters) – Diane (Linda is to contact Diane D’Almeida and ask her to place posters)
g. CAS (24 posters) – Dan and Shirley (24 depts, SED – Dan; CAS: place 13x19 on grad bulletin board)
h. MET (10 posters) – Jim Skypeck
i. CFA (3 posters) – David Fristrom
j. LAW (6 posters) – Shirley L-S
k. MGMT (4 posters) – Sarah Struble (Terry Crystal and Mary to help?)
l. School of Social Work (0 posters) and Theology (3 posters) – Jim
m. SED (3 posters) – Dan
Use your good sense and ask re: best placement for posters
3. The orders for all of the graphics are in
a. 3000 postcards
b. 35 (13x19)
c. 6 (24x30) – one for the Mugar lobby
d. (8 ½ x 11) – these will have a white border; Linda will have them printed by the end of the week and will send them to Young-Joo via Joe (Wed/Thursday).
4. Hanging 13x19 posters in the libraries (2 posters each):
· SEL and Astro – David Fristrom
· ASL – Shirley L-S
· Pardee – Dan Benedetti
· PERL – Dan Benedetti
· Music – Sarah Struble
· LAW – Shirley L-S
· Med – Young-Joo Lee
· Stone – Linda Plunkett
· Theo - Jim Skypeck
· Mugar - Linda Plunkett
Distribution of responsibility for 6 dining halls/ postcard distribution:
1. 3000 postcards have been ordered. If we begin to run low it will take 4-7 days to reorder. When our supply drops to below 1000 we need to communicate this back to Linda.
2. Responsibility for each of the 6 dining halls includes:
§ Taking charge of postcards
§ Taking charge of people handing out the postcards
3. Allocation of dining halls:(1 large poster plus easel each)
§ LAW School Café – Shirley and Jim (some hours) [Marlene Alderman has graciously accepted responsibility for the Law School Café - Shirley 3/15/12]
§ SMG Starbucks – David (with Brock; Danny Pierkarski? or Pardee night staff?)
§ Faculty Dining hall – Sarah
§ GSU Food Court – Linda
§ BU Pub – Dan (Yelena?)
David will contact Brock; Linda to ask Tom for help; Shirley to contact Marlene Alderman; Dan to ask Yelena if she can change hours; everyone can contact Tom to get staff to help out.
Linda will contact Rubin Rossi(?) about reserving the GSU demo area/link.
5. Easels: there are 3 new easels (Med already has one; Dan already has one for the pub). One easel for each dining room (Linda, Sarah, David and Shirley each get one). Kathy has them in the coat room (they must be returned).
6. Sarah to call ODE and look over Tom’s initial email.
Monitoring participation: David to help Linda to monitor participation once the survey is open.
Survey recruitment strategies general:
Bob will contact the deans and will ask them to include the digital postcard on schools' smartboards / in lobbies / entrances. (When Bob contacts the Deans next week about the survey he will ask them to contact their graduate students and encourage them to take the survey. Bob will also suggest that they post a message on one or more centrally located smart boards or electronic monitors in their schools. He will include a couple of our recruitment files for their use on the monitors. – Linda 3/12)
· Linda will let us know as soon as Bob’s email has been sent (anticipated send date: 3/19 or 3/20)
Wrap-up of library-wide meeting:
Largest number ever to attend. A success!
All institutes and centers should be included (as well as school/colleges etc.)
Use public terminals to advertise the undergrad survey
Extra-large posters in the link
Discussion Items for next meeting:
Should we make the 2, 3, 4 emails plain text or include the graphic?
Minutes: Thursday, 2/23/2012, 1:30-3PM
Next meeting: Monday, March 12, 9:30-11 am
· Survey comments communication: Develop clear guideline on communicating survey comments to library staff
· Survey comments & Privacy issue: Shirley will email Linda the list of librarians who received the comments; Linda will send an email to all librarians to remind them about confidentiality (done - Friday, February 24, 2012 3:56 PM)
· Survey Recruitment: Jim will get the list of dining rooms (and their schedules – found: http://www.bu.edu/dining) and information on grad students’ dining habits (where and when) from Dining Services (done: list (Friday, February 24, 2012 10:11 AM); communication with the contact person at Dining Services – permission to hang posters granted (Friday, February 24, 2012 12:01 PM))
· Recruitment on Med Campus: Young-Joo will find out whether we could hand out posters at Peet’s café in the Medical building and use the electronic bulletin board (big TV screen on the Med building), and will arrange who will post/distribute the posters and postcards, and how, during the week of March 19 (she’s out of town March 20-22).
(IRB application says…)
The University Provost will send out two reminder emails spaced approximately one week apart. A third recruitment email will be sent from the four Library Directors of the main libraries, if deemed necessary by the response rate.
Attached please also find two files we will use for recruitment (one for a poster and one for postcards) to encourage graduate students to take the survey.
These files will be sent electronically to:
§ All the schools and colleges that have graduate students asking them to distribute the flyer (or the information in the flyer) to graduate students while the survey is open
§ The Office of Distance Education asking them to distribute the flyer (or the information in the flyer) to their graduate students while the survey is open
§ The various BU library websites asking them to post the flyer (or the information in the file) while the survey is open
§ These files will be printed as:
§ Posters which will be hung in each BU Library
§ Posters which will be hung in each department in each BU School or College that has graduate students
§ A poster which will be hung on the outside of Warren Towers
§ Postcards which will be handed out in the George Sherman Union and other dining halls on both campuses.
§ Strategies to increase participation:
o Poster & postcard: on 3/20 – 4/20?
§ Locations of posters:
· Med (YJ will be out all week the week of 3/20; needs to arrange with another librarian at Med Lib)
· Dining Rooms (may need permission – Jim will ask)
o Jim (will send poster to his contact person?)
o Linda (should track what we’re doing, will get permission for shuttle bus next time)
Present: Dan, David, Jim, Linda (chair), Sarah, Shirley, and Young-Joo
Minutes jointly taken by Jim, Sarah, and posted by Linda
Next meeting: TBD via a Doodle Sarah will send out
Next meeting: 1/27/12@1:30pm
Jim will continue to look at university departments to eliminate those without graduate students.
Agenda for next meeting:
Present: Linda (chair), Shirley(minutes), Jim, Dan B., David, Sarah
Next Meetings: 1/23/12 @ 11:30am, 1/30/12 @ 10:00am (Location: conference room, Administrative Offices, Mugar)
Present: Linda (chair), David (minutes), Jim, Dan B., Shirley, Sarah
Next Meetings: 1/20/12 @ 1:30pm, 1/23/12 @ 10am (if needed), 1/30/12 @ 10am (Location: conference room, Administrative Offices, Mugar)
Present: Linda, Sarah, David, Dan B (minutes), Dan P, Shirley, Jim
Next Meeting: Jim Self visit 10/03/11 | 3-4:30 | Mugar Conference Room
Tentative agenda for next meeting:
Present: Linda, Sarah, David, Dan B, Kate, Jim (minutes)
Next Meeting: 9/20/11|3-4:30|Mugar Conference Room
Tentative agenda for next meeting:
Present: Linda, Dan B., Danny P., Sarah, David, Kate and Alex (minutes)
Next Meeting: 8/29 | 3-4:30 | Mugar Conference Room
Present: Dan B, Kate, David, Dan P, Linda, Jim (minutes), Alex, and Sarah
Tuesday 8/9 2:30-4:00, PERL E-lab
Present: Dan B., Dan P (minutes), Jim, Linda (chair), Kate, Alex, David
Thursday, 7/28 2-3:30, Mugar Admin Office
Finish your personalized reports
Error-correct your partner's report (see below for pairings)
MED - Mary's name on the report
SMG - Ask Arlyne
Education - Dan B. will place his name on the front of the report
SAR - Assessment committee
CAS - Assessment committee
LAW - Will ask
CFA - Offer to Holly
We are currently working on 8 customized reports. We have set up proofreaders for the reports. The groupings are as follows:
Alex & Linda
Sarah will proofread SAR for Jim. Kate will work on the formatting.
Dan B & Alex
David and Jim will work together on the numbers. Linda will format the report with help from Sydney.
Kate and Dan P. will work together.
Our consultants are waiting on BU Works to be functional before they can start helping us.
Present: Dan B., Dan P, Jim, Linda (chair), Kate, Alex, David (minutes), Sarah
Monday, 5/2 3-4:30, PERL, webinar with LibPAS
All: Think of questions for LibPAS
David: Help Jim with his report as needed
Aim for June for completing all reports.
Bob will send CAS/GRS report, heads of branches will send their reports, Mary will send medical reports, etc.
Dan P. mentioned need for error checking on reports.
Linda asked for opinions on LibPAS, but no one had any at this time. All need to think of questions for webinar.
David showed completed first draft of report for ENG; had some feedback. Discussed who report should be “to”; decided dean of ENG.
Linda suggested having partners for completing reports; having Alex help her was extremely useful. David volunteered to help Jim.
Some talk of deadline. Linda reports Bob thinks it would be great to have them to Deans by end of semester. Sarah cannot make it her highest priority. Linda pointed out Deans don't disappear in summer; June might be reasonable. All should probably go out at around the same time (within reason). Bob will send CAS/GRS report, heads of branches will send their reports, Mary will send medical reports, etc.
Linda reports the consultants are willing to work with us, and not too expensive. One condition, which we felt was reasonable: if we are dissatisfied with their work, talk to them rather than gossiping with others.
Status of reports:
Down the road, we might do a comparison of embedded libraries to non-embedded.
Kate would like help with error checking.
Dan P. is working on tool for error checking.
Present: Dan P, Dan B, Linda (chair), David, Jim, Kate (minutes)
Dan B will make a list of possible tags for comment analysis.
Linda will set up a webinar with Counting Opinions, ask about possibility of 2 different locations
Continue with customized reports (all)
Survey Monkey Gold
Up and ready to use. Costs $300/year. Dan B briefly discussed his exploration of SMGold, saying it did interesting things with comments, like makes tag clouds and lists frequencies of certain words. For any comment we can add multiple (5-10?) tags.
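The word-frequency counting described above can be approximated in a few lines. A minimal Python sketch (the sample comments and stopword list are invented for illustration; this is not SurveyMonkey's actual method):

```python
from collections import Counter
import re

# Hypothetical free-text survey comments (illustrative only).
comments = [
    "The library needs longer hours during finals.",
    "More quiet study space in Mugar, please.",
    "Interlibrary loan is slow; hours are too short.",
]

# Small, invented stopword list; a real tool would use a larger one.
STOPWORDS = {"the", "a", "an", "is", "are", "in", "too", "more", "during", "please"}

def word_frequencies(texts):
    """Count word occurrences across all comments, skipping stopwords."""
    words = []
    for text in texts:
        words.extend(w for w in re.findall(r"[a-z']+", text.lower())
                     if w not in STOPWORDS)
    return Counter(words)

freq = word_frequencies(comments)
print(freq.most_common(3))  # "hours" appears in two comments
```

`Counter.most_common` gives exactly the "frequencies of certain words" list mentioned above; a tag cloud is just this table rendered with font sizes proportional to the counts.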
Linda likes the negative/positive categories that Dan O'Mahoney did.
Discussion of using library names as tags. Even though filters are helpful in organizing the school/dept origin of the commenter, David points out that tags denoting library names are still worthwhile for when specific libraries are mentioned in comments.
Linda asks how we should start a list of tags; it is important to have a complete list so we don’t have to re-tag. Dan B volunteered to make a list; then we can all review and tackle it at the next meeting.
Counting Opinions LibPAS
Linda talked about ACRL vendors. The program Roger Brisson recommended (?XML) was too expensive ($25,000/year). But she found Counting Opinions which is a lot less expensive ($7000/yr with discount for a 3 year contract). They have also done impressive things with Rutgers and Cornell. Their product can pull data from different systems (like III, gate counts, surveys). Includes LibSAT and LibPAS. LibSAT more qualitative. More interested in LibPAS. There is no example site to look at, but Linda will set up a (free) webinar so we can see the examples provided at the conference.
A big issue associated with LibPAS is the amount of time involved in setting up the data collection and organization. The reps estimated it would take one person 75% of their time for 3 months to have it set up. Linda thinks the time-investment might be worthwhile, because once it’s set up the data collection can be automatic, or at least easier to submit, and what the programs can do with the data is impressive.
The company reps claimed they are familiar with III. There could be an issue with ezproxy stats - the company hasn’t worked with it before but wants to.
Linda will talk to people at Temple and possible University of Arizona as they have the product.
David thinks this looks like a good product but will take a lot of staff commitment to make worthwhile. Assessment is only one part of it. He is also concerned that the company is still in the pilot stage (based on their advertising material). Linda says from what she saw of the Rutgers and Cornell programs, they seem to be able to do a lot of what they promise.
Also said it contains a survey tool that some users (according to reps) have described as “survey monkey on steroids.”
All agreed we should at least look at the product via a webinar. Linda will set it up to coincide with one of the next two meetings. If we can get the webinar simultaneously on two campuses, this would be better, as Kate wants to invite people from the med lib who deal more with stats.
Bob OK’d taking on Gordon Fretwell as a consultant on retainer. Gordon is a retired UMass assessment coordinator and a consultant for ARL stats. Gordon asked that we also take on Rachel Lewellen, a part-time assessment coordinator at UMass Amherst.
Progress on Customized Survey Reports
Jim has written a lot of the CAS report and thus has a lot of questions from this experience.
1. Is it important to mention when a department is under reported?
We think no, because we carefully describe the data in the charts (as in, “respondents who answered this question”).
2. The plan for reports - will they first be given out by Bob?
The library director will have discretion here. For example, Arlene can give to School of MGMT. The Provost wants to see all the customized reports at some point, but she doesn’t need to see them first. For CAS, Bob will give this one out, and then the dept liaisons can work with the reports.
Ultimately these reports will be public, like the original report is.
3. How do we create these reports with respect to the original? Should they be addenda, or should they stand on their own?
Discussion about the merits of either approach. Decide we want the report to stand on its own, but we shouldn’t repeat too much of the original report. David pointed out that readers looking carefully at the first document will be forced to go through the same information twice.
We want the report to stand alone, but not be the same length as the original. As Linda put it, the customized report should be like a short story; the original report, a book.
Discussed the need for an example, or a rough template, but the reports really should reflect the specific needs of each college.
Kate will summarize parts of the original document.
We are encouraged to steal from each other.
Jim will send out his report as an example and get feedback.
In the customized report tweak the acknowledgements section, so the person who wrote it should be first, and the “submitted to” area will change based on who is getting the document (library director, for example).
Linda thinks it would be interesting to look at the differences of embedded libraries on the Charles River Campus (for example, compare the branch libraries to the Med library)
Things we need to remember next survey:
Ask about discipline (humanities, social sciences, sciences)
Provide NA option for each question
Present: Dan P, Alex, Kate, Linda (chair), David (minutes), Jim, Sarah, Dan
Next Meeting: Tuesday, March 29th, 3pm. Location: Mugar admin
Sarah: Find resources on minimum number of survey respondents needed for adequate privacy.
Linda: Ask Bob about hiring Gordon Fretwell for consulting.
David: Will do ENG report
Jim: Will do CAS report
Dan P: Look into open source analysis tools
Alex and Dan B: Look into using NVivo.
Dan P: Create an Excel spreadsheet with all comments.
Linda: Ask Bob to buy SurveyMonkey Gold.
Dan P: Create secure area of Assessment Wiki to store final versions of documents.
Kate: Go on vacation.
Linda reported that Numbers is ordered for the iPads; Tom expects it RSN. People gave back their iPads (labeled with their names) so that Numbers could be installed on them.
Dan P uploaded survey data to DropBox in mySQL, not normalized. Deciding on interface and how to make it available.
David brings up problem of simultaneous users of SurveyMonkey applying competing filters -- there is no way to have duplicate copies of survey with data. Kate mentioned you could quickly take screenshot with Zotero.
Kate noticed differences between SurveyMonkey and JMP when filtering. David suggested that it may be because of the questionable way SM does percents.
Dan P showed snapshot of data in mySQL. Showed report with SQL. Can create interface so anyone can extract data.
Kate again brought up discrepancies. Will double check her figures, and will check with David if there is still a problem. [Kate has checked, and everything is now fine.]
Problem with small number of responses for theology. We decided there aren't enough people to do report. Alex asked what about when subset (department) is too small?
Sarah will look to find resources of minimum number needed for adequate privacy.
Linda raised possibility of getting consulting from Gordon Fretwell. Sarah suggests deciding what questions we would want to ask consultant. Now may be good time to ask.
Jim asked “What is vision for report?”
Linda’s vision was to start with overall report, and rewrite it to reflect data and interests of school.
David’s vision was to assume everyone would read overall report; additional reports would be supplements with additional information relevant to school.
No decision was made on which vision to go with, or even if we need to stick to one vision.
Jim asked if we should break down ENG by college?
Alex asked if there is a problem comparing subset (e.g. ENG) to overall (which includes ENG)? Decided there was no problem as long as comparison was adequately described.
Jim and David swap ENG and CAS reports.
According to Linda, Bob wants these reports before deans leave for summer.
Discussion of how to analyze comments.
The options suggested were:
Betsy's Excel spreadsheet. Linda isn't sure it is the right tool.
ATLAS.ti has a day-or-two learning curve, but is good. Expensive: $1K a seat.
NVivo, School of Ed has 7 copies, may only be student license. $650 a seat.
David asked what the goal of analysis was. Kate suggested it would provide more data for eventual publication; Dan P., that it would make searching comments easier; Sarah, that there is information hidden in comments we need to analyze to find. Linda says it allows quantitative analysis of comments, and from Betsy's article, benefits include: provide meat for liaisons, find low-cost action items. Sarah said comments can help improve the next survey and drive actions.
Dan P said WordPress supports tags, so it is possible to create an analysis tool with PHP and SQL.
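The tags-plus-SQL idea Dan P describes could look roughly like this. A minimal sketch using Python's built-in sqlite3 in place of PHP/MySQL; the table layout, comments, and tags are invented for illustration, not the committee's actual schema:

```python
import sqlite3

# In-memory database; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE comments (id INTEGER PRIMARY KEY, school TEXT, text TEXT);
    CREATE TABLE tags (comment_id INTEGER REFERENCES comments(id), tag TEXT);
""")

conn.execute("INSERT INTO comments VALUES (1, 'ENG', 'Need more ebooks')")
conn.execute("INSERT INTO comments VALUES (2, 'CAS', 'Mugar hours too short')")
# A comment can carry multiple tags, as discussed for SurveyMonkey Gold.
conn.executemany("INSERT INTO tags VALUES (?, ?)",
                 [(1, "collections"), (1, "positive"),
                  (2, "hours"), (2, "negative")])

# Frequency of each tag -- the kind of report an interface could expose.
rows = conn.execute(
    "SELECT tag, COUNT(*) FROM tags GROUP BY tag ORDER BY tag"
).fetchall()
print(rows)
```

The same two-table shape (items plus a many-to-many tag table) is what WordPress-style tagging uses, which is why a PHP/SQL front end over it is straightforward.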
Dan P will look into open source analysis tools, Alex and Dan B will look into NVivo.
Dan P can create an Excel spreadsheet with all comments.
Linda will ask Bob to buy SurveyMonkey Gold.
Dan P brought up need for place for final documents (like survey report). He will create secure part of Assessment Wiki.
Kate's job is to go on vacation.
Present: Bob, Linda, Dan B, Dan P, Alex, Sarah, David, Kate
Next Meeting: Thursday, March 24, 1pm. Location: TBD
Dan B to make a link under "about us" to the faculty library survey report
David will look into setting up a Dropbox account for the committee
All will start thinking about creating our individual customized reports (see list below)
Dan P will look at both Google charting software, and the possibility of using MySQL as a text analysis tool.
A. Feedback from Bob
The Provost's comments on our report were very positive. Our report went out to faculty by email under Bob's name a few days later. The Provost is leaning toward having us survey graduate students next.
B. Creating customized reports is now the focus. The low number of respondents in some areas rules out several groups, such as Theology, but we came up with the following to guide our initial work.
We will start with one customized report for MED.
Medical Campus - Kate
Law - Linda
CFA - Alex
SED - Dan B
CAS - David
Sargent - Sarah
Management - Dan P
and by group decree, Engineering and everything else - Jim.
C. How To Create Reports
We started out thinking that we will use iPads for the "Numbers" program that David used for charts, but decided several other tools should be considered as well. If we use "Numbers", we need to contact Jack to make sure that the app is installed on our University iPads. David will do some training on Numbers at our next meeting. Dan P will look at Google charting software.
D. Comment analysis / Mining comments
Survey Monkey has a text-analysis feature in a subscription level above SED's. There's a tab for coding text comments. In the future, we want our own survey tool anyway, not SED's subscription. Various methods of mining comments were discussed. Betsy from Northeastern created an Excel tool. Dan O'Mahoney from Brown created a web form that had a dual purpose - to analyze the comments and to get staff buy-in on the whole assessment project. Linda P. may also follow up with Gordon Fretwell.
Dan P will look at the possibility of using MySQL as a text analysis tool.
E. Other Assessment activities
Sarah pointed out the need to create a bigger scale game plan.
Present: Bob Hudson, Linda (chair), David, Kate, Danny P(Minutes), Dan B, Jim, Sarah
Next Meeting: Monday, March 14, 3:15pm. Location: Mugar Conf. Room
Add assessment committee names to the executive summary (acknowledgments)
David will go into the pdf and change some of the smaller bar charts to make values visible
Think about what other assessment we can do
Begin planning for customized reports
Bob will meet with the provost on 3/8/11. Once we have approval from the provost, Linda will send out the report to the heads of Law/Theo/Med. Step 3 will involve Linda sending the report to the heads of the branches and from there to the Deans of the individual schools.
We discussed adding color to the charts. Bob leaned towards not adding color and this was the agreement of the group.
Sarah brought up the fact that certain schools (SHA) don't have enough responses to do breakdown reports. In this case we will most likely need to give the overall summary.
At some point in the near future (after he meets with the provost), Bob has asked the group to prepare a PowerPoint presentation of the survey results. He mentioned some key things to keep in mind for the PowerPoints, namely not to make it too dense, and to show some self-awareness and humor.
Bob encouraged the group to add their names to the executive summary.
Linda has asked the group to start thinking about what other assessments we can do and to start to consider how we will do breakdowns for the schools.
Present: Linda P (chair), David, Kate, Danny P, Dan B, Alex, Jim, Sarah
Next meeting: Thursday 3/3/11 1:30-3pm LOCATION TBD (Mugar conf rm is booked. Dan or Danny, is there room at Pardee or PERL?)
1) Review draft of full report including executive summary
-address the importance/satisfaction issue
2) Next steps with survey
3) Balanced Scorecard
KB makes changes as note from review of draft
SS and LP wordsmith executive summary
DF adds "Figure no.X" text to front of text below each chart
DB mentioned and SS seconded interest in seeing the report in color. DF said he can do it - in March. The default color scheme from the charts app he's using isn't bad. We agreed we will not hold up presenting the report based on availability of color charts, but it would be a nice added bonus. Also agreed that we will present the report in paper and PDF formats to avoid formatting issues from Microsoft Word version conversions.
As a group, we walked through the suggested changes and comments from Kate, Linda and Jim. KB will make the changes in the current draft and resend them. She needs to get them to DF by Friday so he can make changes to the charts (adding figure numbers) before he leaves for beautiful sunny…
Sarah and Linda will wordsmith the executive summary, paying particular
attention to adopting changes we made in the full report.
After Bob Hudson meets with the provost to present the report...
-timeframe for customized reports? (we did not set one, but JS mentioned it)
-analyze and tag comments
-how many customized reports will we need? At least 3, for Law, Med & Theo; probably also one for Pardee. DB thinks he has what he needs from Survey Monkey for PERL. Didn't decide/know about SEL, Music, ASL. Either way, Law, Med and Theo would be next in line for custom reports.
We discussed the process for creating these customized reports. SS expressed concern that the Assessment group has the best handle on the limitations of the data based on how we asked the questions, and was concerned about possible unintentional misrepresentation of the data by those less familiar with the entire survey process. LP suggested that we create drafts, present directors/heads of other libraries/departments with some possible charts and comments from which to choose, and then work together on creating a finished product to their satisfaction, but that we craft the initial drafts. This plan seemed to be generally agreeable to everyone.
Linda reported on the 2/14/11 ARL Balanced Scorecard Webcast. SS also mentioned a Balanced Scorecard postconference workshop that she attended as part of the 2010 Assessment Conference. Both LP and SS are in favor of BU Libraries using this tool. JS and others want to understand it better. LP suggested looking at McMaster's documentation of the process.
ARL "Research Library Issues" on Balanced Scorecard
PPT Slides from the workshop SS attended:
Present: Jim, Linda, Sarah, Danny, Dan B., Alex (Minutes), David and Ella!
Next Meeting: 2/16/11, 1:15pm-2:45pm, Mugar Conference Room
1. EVERYONE will: Re-Read Report esp. the last three sections
2. David will: Update graphs; Insert graphs and quotes into report and executive summary using MS Word; Resolve percentages between report/executive summary and graphs data (Survey Monkey numbers vs. JMP numbers);
3. David and Danny will check SMG numbers
4. Sarah will: Read through the report for flow and grammar; Check APA style guidelines for partial quotes and ellipses; Send edits to Linda
5. Alex will: Double-check selected comments for accuracy; Forward the comments selection to the group (primarily David); investigate text formatting in Publisher in case Word proves unwieldy
1. We talked about Linda's proposed changes to the charts. We decided to keep the medical campus chart and go ahead with the other changes.
2. We discussed needing to clarify what our percentages are based on. (Did we clarify? My notes are a little fuzzy on this.)
3. We discussed the usage of "faculty" in the report/executive summary. We decided as long as we are consistent the current usage is fine.
4. We decided not to include the 5 colleges charts in the provost's report, but that they will be very useful in the customized reports.
5. We discussed the comments selections and where they could fit into the report/ executive summary.
Present: Linda, Sarah, Kate, Alex, David, Danny, Dan B, Jim (minutes)
Next meeting: February 2, 1pm, Mugar Conference Room
Present: Linda, Jim, David, Alex, Kate, Dan B (minutes)
Next meeting: Jan 26th, 1pm, Mugar conf. rm
Announcements: Linda said we may have a bit more time than expected to finish our report to the provost, as she is not yet accepting meetings.
The Executive Summary needs to tell more of a "story"
We will list *summary percentages* for each question as an appendix to the full report, at least for Bob
Kate will investigate presenting some data either in tables, charts, or bullet points.
Jim will look into the comments to gather relevant material.
Alex will check the total tally of comments we received.
David will produce charts Linda has laid out in the draft of the full report.
Dan B will look for other interesting charts to possibly include.
Kate will check validity of two statements; one about Med's digital policy on books and the other about Open Access- Med vs. CRC.
All will continue to scrutinize the two drafts we now have, our charts as they get created (see a-d below), and also any other charts or comments that may be worth reporting.
Executive Summary and Full Report of the 2010 Faculty Survey -
Linda brought drafts of these to the meeting. The Executive Summary should present what we want the Provost to know, but some on the committee thought it was a bit too long and perhaps needs to tell more of a "story". Kate will investigate presenting some data either in tables, charts, or bullet points - although Linda stated Executive Summaries usually do not contain those.
In order to help tell a story Dan suggested possibly singling out one or two telling comments in the Executive Summary. Jim will look into the comments to gather relevant material for the Summary as well as the full report, and also look into if revealing a bit of demographic info about any particular commenter, without breaking privacy, would help tell the story. Alex will check the total tally of comments we received.
David posited that we should feel free to draw conclusions, especially in the Summary, *about* the faculty, rather than just communicate facts as revealed in the survey. In other words, we can draw out those conclusions, and then mention the survey results in the full report that back up those statements. David also suggests listing *summary percentages* for each question (not the full data itself) as an appendix to the full report, and we agreed to do that, at least for our submission to Bob, who may then choose to withdraw it.
Linda has laid out the needs for most graphics in the full report, and David will produce those charts. Dan B will look for other interesting charts to possibly include, and a number of other charts will need close inspection after they are created, such as:
a) attitudes toward ebooks in SEL vs Med (David's suggestion)
b) something about finding aids, such as: is the importance of the catalog in the humanities as opposed to the same in the sciences (or something like this) interesting? (Dan's suggestion)
c) can we somehow highlight schools vs something? Either standing by themselves or some combination thereof? (Linda's suggestion)
d) ILL on the Med campus vs the Charles River Campus (Kate's suggestion)
Further, two charts that Linda mentioned in the draft of the full report will also need to be checked to make sure their statements are valid, and Kate will check these:
1) Regarding ebooks, that the Medical Library "has already adopted a digital strategy for books"; this is in the Executive Summary as well as the Full Report; and
2) regarding Open Access, that "More respondents on the Medical Campus as compared to those on the Charles River Campus have submitted to a journal allowing Open Access...."; this is only in the Full Report.
Next Steps -
Linda mentioned a possible consultant for statistical expertise named Gordon Fretwell, but she remains open to the idea of finding someone on campus to fulfill that role. After finishing these reports, we will work to create ones for individual schools.
Present: Linda, Sarah, Jim, David, Alex, Dan B, Dan P, Kate (minutes)
Next meeting: Jan 11th, 2pm, Mugar? Note that Jan 4th meeting cancelled
Text highlighted in green notes survey results flagged for possible inclusion in the report to the provost
David sent out survey data in a JMP file (FacultySurvey.jmp), shared on individual google doc accounts.
Report for the provost needs to be ready by late January
Group will furnish reports for different schools and will handle requests for data when asked.
The next student survey will go out in Spring 2012
Linda will put up rough draft of report/story Jan 5th or 6th
David will send out spreadsheet of number of respondents vs. # of sent email invitations
Sarah will let us know about the free graphics program
David and Dan P. will send out invitations to DropBox (free program for storing, syncing and sharing files)
Everyone will do analyses in JMP and consider interesting points to pull out for report.
Email the assess-l list with interesting survey results, Linda will compile list
David reviewed JMP capabilities. Note that JMP is good for analysis but for our graphics it may be best to use another program like Excel or the free one Susan uses (JMP isn’t very pretty, and the graphics aren’t displayed on the same scale across compared distributions). Because our data is ordinal/nominal, our safest form of analysis is distributions.
To get to the distribution tool in JMP, go to Analyze -> Distributions.
If you want to look at, for example, e-book, e-journal, etc. use by department, choose the e-book/e-journal questions and put them in the Y columns, then select "By" and choose the second factor (for example, campus or department)
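For anyone working outside JMP, the same "distribution of one question, split by a second factor" analysis can be sketched in pandas. This is a hypothetical illustration: the column names (`campus`, `ebook_use`) and the toy data are made up, not the actual survey fields.

```python
import pandas as pd

# Toy ordinal survey data: 1 (low) to 5 (high) frequency of e-book use.
# Column names are illustrative, not the real survey question labels.
responses = pd.DataFrame({
    "campus": ["CRC", "CRC", "MED", "MED", "MED", "CRC"],
    "ebook_use": [2, 3, 5, 4, 5, 1],
})

# Distribution of an ordinal response, split by a second factor:
# the analogue of putting the question in Y and campus in "By".
# normalize=True gives proportions within each campus group.
dist = responses.groupby("campus")["ebook_use"].value_counts(normalize=True)
print(dist)
```

Because the data is ordinal, proportions per response level (rather than means) match the distribution-based approach the committee settled on.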
David notes that there is an interesting difference in print book use between MED and science (science likes print books more). This may affect our Sci/SocSci/Humanities breakdown.
In our analyses on JMP, we should look for other interesting things to flag
You can save your analyses to a location outside of JMP to share. Also, the labels in JMP are shortened to fit into one row. If you need to see the full question, look in SurveyMonkey.
David gave a brief outline of notable results we should include in our report to the provost.
Linda said we should look at the stats for MGMT - might have higher rating for foot traffic and could speak to facility differences.
The outline of the report can be divided into 3 parts:
Discussion about the structure of the report. David thinks it should be organized by themes. Dan B suggested we pull out schools mentioned in the Strategic Plan (MGMT, Med, Law, CFA). David can do breakdowns. After discussion about importance of other schools, we decided that we should make sure to include examples that focus on the strategic plan schools, but not to put too much focus on just these four.
David pointed out that in the survey, those that received external funding rated the library as more important to them.
Other items to explore for the report
Linda wondered whether we should have had survey takers select their type of discipline (sci, soc sci, humanities) instead of (David) doing all the work to code the departments after the fact. Some disagreement as to whether this is a good idea or not. Keep in mind for next survey.
Discussion about timing of next survey - agreed spring is best time to survey students. Will wait until 2012 to do it. Cycle may go faster with subsequent surveys.
Sarah recommended getting some outside statistical expertise to back up our findings. David points out that JMP won’t let us take means, but we do need to investigate the numbers behind the analyses it provides. Linda suggested putting out an ad for a work study grad student in statistics, or asking a faculty contact to recommend a student. Social science would be the best area.
As far as other schools/depts, decided we would analyze for them - tell them the variables we looked at, and we can do the data analysis for them. Perhaps do a report for each school.
Present: Linda, David, Kate, Alex (Minutes)
Next Meeting: Monday 12/20 2pm @ Mugar Conference Room
Updates: Bob Hudson would like Mary Blanchard to hold off on reporting the Med. Campus data to the Med. Provost till after Bob's meeting with the new CRC provost.
-David will send out cleaned up/complete JMP file with the data.
-EVERYONE to play with data in JMP or Survey Monkey.
-EVERYONE to brainstorm the type of analysis we'd like for the Provost's report. What kind of a story do we want to tell about the library? Look at data in the context of the University's Strategic Plan.
-Kate and Sarah will reconcile the MED/Dental/Public Health subject breakdowns (Science, Social Science, Humanities)
-Keep a running list of changes and additions to Faculty Survey for future iterations
-Take note of misleading/ambiguous comments in the survey.
-Next user survey: Who? When? Estimated time to develop? Should we be spending our time on this now?
-Other projects besides user survey?
-Will the different schools/branches analyze their own data from the faculty survey, or will the Assessment Committee do the analysis?
-Linda mentioned that we should drop the word "representative" from our comments selections because representative comments are ascertained through a form of analysis that we're not doing. Instead we will use "selected" or "illustrative" comments.
-David mentioned using the comments as a way to illustrate the quantitative data for the provost's report.
-JMP overview: JMP good for analysis, not visualization. We can use excel for nice graphs/charts. There are slight formatting issues when data is downloaded into JMP. David is working on cleaning up the data. David will send clean workable file to committee.
-Excel overview: We can use pivot tables for graphing/charting. Lots of options for visualization. Colors can be customized and saved. Graphs can be saved to any format: JPEG, TIFF etc.
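The pivot-table counting step described above can also be sketched in pandas, whose `crosstab` is the equivalent of an Excel pivot table that counts occurrences. The column names and toy data here are hypothetical, not the real survey fields.

```python
import pandas as pd

# Toy survey responses; "campus" and "rating" are illustrative names.
responses = pd.DataFrame({
    "campus": ["CRC", "MED", "MED", "CRC", "CRC"],
    "rating": [4, 5, 5, 3, 4],
})

# crosstab counts responses per rating per campus, like an Excel
# pivot table with Count as the aggregation; ready for charting.
pivot = pd.crosstab(responses["rating"], responses["campus"])
print(pivot)

# pivot.plot(kind="bar") would then produce a chart for the report.
```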
Present: Linda (chair), Alex, Sarah, Dan B, Dan P, Jim, Kate, David
Tue 12/4, 1pm; Mon 12/20, 2pm; Tue 1/4 2pm; Tue 1/11, 2pm
The Mugar Conference room is reserved for all these dates
Present: Linda (chair), Alex, Sara, Dan B., James, Kate, David (minutes), Dan P.
Next Meeting: TBD
Present: Linda (phone), Danny P., Alex, Dan B., David, Jim, Kate (minutes)
Next meeting: Monday, August 30th, Mugar Admin Conference Room
1. Create list of departments for each school and update the departmental list page (everyone - see assignments below)
2. Linda will send out links to the COI form and training we need to complete.
3. Sign conflict of interest form and complete online human research training, send COI form and certificate from training to Linda (everyone)
4. David will show us the options and limitations of the analysis tools in SurveyMonkey
5. Linda will contact Pamela Bagley at Dartmouth regarding SurveyMonkey analysis
6. Alex will make a copy of the faculty survey and experiment with Skip Logic.
7. Jim and Kate will email Linda with suggested changes for follow-up letter to faculty pre-testers
8. Send list of faculty pre-test contacts/email addresses to Alex (everyone); Alex send list to Linda
9. Danny P will talk about website statistics for the next meeting (Kate for medical campus).
A. Discussion of Departments
List of departments will have to come from contact with each school. There is no list for the complete university; the registrar’s office reports that there is no exhaustive list that includes all the departments for each school.
There is a question of whether we really need a breakdown of all the departments, but Linda points out that a list would be helpful for Tom Casserly's replacement, as he or she will develop a liaison program.
As for SurveyMonkey, Linda looked into Skip Logic and reports that we will have to have the department options on a separate page (since the list of options will vary based on which school a user selects). Alex will make a copy of the survey and experiment with SkipLogic.
We will work collectively to make a list of departments, and have split up the schools in the following manner:
Dan: CAS (includes GRS), EOP
Kate: Everything on MED
Send your list to Kate and she’ll compile a master list for Alex.
B) Timing of Survey
Mary Blanchard likes the Nov 1st date for the med campus, but any later that week and the survey would conflict with an AAMC conference most med faculty members will be attending.
Linda says that Bob Hudson will talk to the provost about the timing in the year, and will discuss any other surveys going out (to possibly improve the timing of ours). Will also show provost the links to the preliminary results of the UW study to give them an idea of ours.
Linda talked with Mary Banks (head of IRB) and they agreed we should formally apply for an exemption. We will need this if we want to publish a paper using the data we collect. We will try for a B2 exemption. Linda is going to be the PI. Everyone approved of Linda being the PI (for real). Everyone else on the committee will be co-investigators. That means everyone has to fill out a conflict of interest (COI) form and complete an online training regarding human research subjects. Linda will send out links to these forms and we will send the signed COI form and training certificate back to her.
We have Bob's approval to go ahead. Linda pointed out that the IRB has to see all of the recruitment material we use (this includes any emails, and possibly brochures, flyers, and the website). We can't do any ad hoc recruiting. We can't change the survey or the text of emails. We have to go over EVERYTHING. If we have to change something we'll need to reapply. Linda thinks the fastest we can get an exemption is 2-3 weeks.
Jim and Kate made small suggestions, will email Linda with changes.
Question as to whether we have a master list of faculty pre-testers - we don't, because everyone sent their emails separately to the contacts. Everyone should send their list of contacts/email addresses to Alex and she will send it to Linda.
E) SurveyMonkey analysis
Linda has decided to contact Pamela Bagley at Dartmouth about analysis in SurveyMonkey because she has used the program extensively. Linda thinks SM has good enough analysis tools for our first survey, but probably won't be good enough for the types of analysis we want to do in the future. David thinks that SM's tools aren't even good enough for a lot of basic analysis (like combining departments into larger sections like humanities - it is difficult to sort users and responses by variables we don't explicitly ask about). We will have to export the data to another tool. Linda will ask Pamela about other tools to use.
We discussed the analysis tools further and decided we need more guidance on how to use them. In the next meeting David will explain the tools and show what we can and can’t do with SM, using the UW preliminary results as a guide to the sort of analysis we’d like to do.
Jim brought up a discrepancy in the pdf version of the UW survey results (9a + 9b), but the discrepancy does not exist in the html version, which accounts for 80% of responses.
Linda also mentioned that in the next meeting we should talk about website statistics and how some pages use Google Analytics data and others use NIS numbers generated by IS&T. Danny will talk about this in the next meeting; Kate will talk about how the medical library website stats are collected.
Present: Linda by phone (Chair), Dan B. (minutes), Sarah, Kate, David, Alex, Jim
Next meeting: Wednesday August 4th, Mugar Admin Conference Room
1) EOP is "English Orientation Program", which we do not need, and UNI is also a category to delete.
2) Steve has put up preliminary results from their 2010 Faculty Survey (http://www.lib.washington.edu/assessment/surveys/survey2010/default.html)
3) Danny has put up the latest (7/1/09-6/30/10) Google Analytics on the Welcome page.
1) Kate will contact the Registrar (possibly both campuses) to request a list of academic departments. Makeshift list available here
2) Kate will likewise check with Mary about the timing of the Fall survey.
3) Linda will check with Bob about coordinating the timing of the Fall survey with the Provost.
4) Linda will draft a follow-up email to pretest participants.
5) Linda will contact a colleague she believes is at Haverford College about whether to use Survey Monkey alone as our analysis tool.
6) Sarah will look at the comments about the introduction to the pretest.
7) Sarah will work on a draft message to send out to the Fall survey participants.
8) Everyone should try a bit of analysis with the pretest results for our next meeting.
9) Everyone should take a look at UW's 2010 preliminary results at http://www.lib.washington.edu/assessment/surveys/survey2010/default.html
1) We will administer the actual survey in November if possible. We planned November 1st as the launch, November 8th for the first reminder email, November 15th for the second reminder email, and November 30th as the deadline.
A) We went through the entire survey question by question, approving most for the actual Faculty Survey, and changing others on the fly.
Question 3 - Based on the pretest, we will have trouble rationalizing what we receive as text, so we should consider changing this to a drop-down or other menu. Kate will contact the Registrar (possibly both campuses) to request a list.
Question 5 - Changed wording to "during the past academic year at BU, have you...."
Question 7 - Expanded "Journals (electronic or print)" to two items, "Print Journals" and "E-Journals"
Question 8 - We decided to leave this the way it is after a lengthy discussion of alternatives to "library pages".
Questions 13-16 - Approved each of these questions, and also decided to treat non-answers as "not applicable" answers.
Questions 18-19 - Changed final item to "Subject librarian(s) in your academic area".
We also added the new question as number 22. The wording from UW did not meet our needs, so we changed the wording. After much discussion, "What contribution does the BU Libraries make to your..." changed to "What contributions do the BU Libraries make to your..."; and the first item was changed to now read "Ability to keep current in your field". The final page of the survey (page 6) was changed to reflect the addition of that question, so that page is now titled "Overall Contribution and Satisfaction".
B) Other considerations:
Sarah will look at the comments about the introduction to the pretest.
We decided to administer the actual survey in November if possible. We planned November 1st as the launch, November 8th for the first reminder email, November 15th for the second reminder email, and November 30th as the deadline. Linda will check with Bob about coordinating this timing with the Provost, to ensure we are not competing with other surveys that may be taking place. Kate will likewise check with Mary.
Linda will contact a colleague she believes is at Haverford College about whether to use Survey Monkey alone as our analysis tool.
Linda will draft a follow-up email to pretest participants, thanking them yet making it clear they should also fill out the actual survey in the Fall. Alex reports details regarding this email can be found in our minutes of 6-9.
Sarah will also work on a draft message to send out to the Fall survey participants.
Everyone should try a bit of analysis with the pretest results for our next meeting.
Everyone should take a look at UW's 2010 preliminary results at http://www.lib.washington.edu/assessment/surveys/survey2010/default.html
Present: Alex, Kate, Danny, Sarah, Jim (minutes)
Next Meeting: 7/14 2:30-4:00 OR 7/15 2:00-4:00 Alex will send out a Doodle.
The group discussed the wide variety of responses to the department question and wondered if there was a need for a more standardized list. We were not sure a drop-down menu would work but we were concerned about ambiguity of the pre-test responses.
We decided to keep "other" in question #8 but drop listservs. We also decided to leave questions 10-11 alone as (N/A) seems an adequate response to a set of publishing questions.
For question #20, we decided to reduce the list to 4 services.
Present: Linda (Chair), Dan B., Sarah, Kate, David, Danny, Alex (minutes)
Next meeting: Tuesday June 22nd, Pardee
Announcements: UW changed a few of their questions based on our survey (we think). Woohoo!
1) Add "Answer only if relevant to you" (exact phrasing?) to Questions #10, #11 and #13 - #16 - Alex
2) Change "Subject/Branch librarian" in Question #18 to "Subject librarian(s) for your academic area" - Alex
3) Question #8: Delete "Online Groups and list serves" and "Other" from list of options - Alex
4) Question #22: Add "Libraries physical spaces/facilities" as an option - Alex
5) Re-write intro paragraph to survey and draft email to faculty from provost (or whoever sends out Fall survey) - Sarah
Include the following in intro: reasoning behind survey; survey will be publicly available; emphasize confidentiality.
6) Draft follow-up email thanking pre-test participants - Linda
Include the following: sincere thanks; importance of their comments; changes in survey based on comments; reminder to please re-take in the fall
7) Contact Steve Hiller about UW 2010 Question #7: how do they interpret the results, what do they use the data for? - Linda
With great valiance and tenacity we discussed all of the Pretest comments in one sitting. For the most part the pretest comments were constructive and resulted in positive changes such as action items #s 1 and 2.
Sarah suggested we start tracking problematic questions for revision in later iterations of the survey
The Fall survey should go out to an all-faculty listserv from the provosts of both the Charles River Campus and the Medical Campus
Linda proposed deleting Question #8 from the BU survey and adding question #7 from the UW 2010 survey. There is some hesitation about adding any more untested questions after the pretest. A final decision was not reached. Linda will contact Steve Hiller for clarification and talk to Bob Hudson about the necessity of this type of question. After some discussion we decided to drop several options from BU's Question #8 (action item #3) and to add an option to BU's Question #22 (action item #4).
Agenda for next meeting:
-Steve's response on UW 2010 Q#7
-Bob's response to UW 2010 Q#7
-Proposed new services question
-Survey Responses (not just pretest)
-nuts and bolts of getting the survey out (contacting the provost etc.)
-other projects for Assessment Committee to handle
Present: Linda (chair), Dan B., Jim, Kate, Sarah, David, Danny (minutes), Alex!
Next Meeting: June 9 - 2pm - Pardee Library
The pretest will be sent out on 5/17
A reminder will be sent out on 5/24
The end date for the pretest will be 6/4
The title of the survey will be: "2010 Faculty Library Survey"
We will ask the pretest survey takers how long the test took them.
The pretest will be sent out individually to the faculty members by the person who selected them.
There is a "#1" before the section title, it should be removed.
Sarah S. will write an intro paragraph and Alex will put it up on the wiki.
Alex will send out a URL (containing the survey) by the end of the week for the assessment group to test.
Sarah will email everyone two texts, the original message to be sent out to the pretest group, and a reminder message to be sent out on 5/24
Alex will add a box for faculty members to voluntarily place their email address when they take the pretest.
Alex's redone Survey (2.0.9376532):
#6 -> Switched Scale, No N/A needed
#7 -> add N/A
#8 -> Same scale as #7, Add N/A
#10 -> add N/A
#12-15 -> Remove the question "familiarity with research materials", combine two of the questions to produce "providing correct citations and using information in an ethical manner."
#17&18 -> will have the option "don't use" instead of N/A
#21 -> Typo: oline should be online. Add N/A
Add a comment box after Q11 on the 2.0 survey. - "Comments on publishing or open access?"
Remove the bolding and all caps from questions 12-15, 17-18.
Present: Linda (chair), Dan B., Jim, Kate, Sarah, David (minutes)
Next meeting scheduled for 5/11 at 2pm-4pm at Pardee(?)
News about poor Alex; glad to hear she is recovering
Sarah is taking LYRASIS class on Basic Surveys for Librarians, Dan will also take class
We will push to get survey ready for pilot ASAP, so we don't lose volunteers to summer
Everyone will find additional faculty volunteers for pilot, and email all their volunteer names and email addresses to David
David will collect volunteer names and addresses and put them someplace, but not post them to public wiki.
Linda will contact Holly, Beth, Marlene, to see if they can suggest faculty volunteers for pilot.
Dan will work with Arlyne to line up SMG faculty for pilot (Linda asked Dan Wednesday after meeting)
Kate will make all the comment boxes in survey a consistent size.
Kate will change scale on question 8 to importance rather than frequency.
Someone (Kate?) needs to remove "Merged Survey to Reflect UW 2010" title from survey.
Went over agenda.
So far we have about 20 possible faculty for pilot (not all have been contacted). Sarah brought up problem that we know these people, they probably have pro-library biases. Linda replied (quoting Steve): this is convenience sampling.
Discussion of who (besides Alex) can edit Survey Monkey. Current survey is 2.0; anyone can play with previous one; Kate will play with it.
Began to go through staff feedback, which Kate did such a great job of organizing in a spreadsheet. Made changes in the survey as we went along, so they are not listed here.
Didn't get through all staff recommendations (we started with the easy stuff); will continue at next meeting.
Present: Linda, David, Jim, Alex, Sarah S., Dan b., Kate (minutes)
Next meeting scheduled for 5/5/10 at 2:30pm at Mugar (most likely)
Linda announced that we will not have to go through the IRB process; though this might be a problem for freshmen who are not yet 18
Sarah mentioned the ACRL News has a whole issue dealing with IRB --http://crln.acrl.org/content/71/4/190.full
Staff comments will go out the 21st. Staff will have a week to comment
The Assessment committee will convene (5/5) to make changes due to the comments
The Faculty pilot will go out after graduation
Alex will edit the survey on digilib for staff viewing
Linda will write up an email going out to staff about getting feedback on the entire survey
Linda will send out survey to Mary, Marlene and Jim, asking to send it out to staff
Sarah will talk to Rhoda (Mugar) and Terri (Law), and Kate to Joe & Konstantin, regarding ILL and this question
Kate will collect comments and summarize them for our 5/5 meeting
Each of us will have list of ~3 faculty names/contact info for pilot survey
Sarah will tweak UW letter to faculty re: pilot survey for use at BU
Sarah will look at pretest/pilot wording from UW if available, for use at BU
Proposed service questions:
Discussion of the proposed new services questions got them down to 9 (see the list here). Most of the eliminated questions were dropped either because we were already offering the service (in-class instruction) and it is covered in other survey questions, or because the service is already being planned (federated searching/MetaLib).
The list will further be edited after comments from the staff. The number of new services on the final survey will be 5.
Intercampus transportation service -
Problems about copyright if we offer article service between campuses. Do those copies made count towards copyright limit?
Charles River is already doing this for free via ILL; there is a charge when it goes through the Medical Library
Changed wording of question to Dan's suggestion of "Expedited delivery..." for both article and book question to deal with differences between medical and charles river campuses
Kate will talk to Joe, and Sarah to Rhoda and Terri regarding ILL and this question
As for the question about delivering ILL via email, some discussion over whether to ask this considering we will already be doing this. Decided to leave it in to see response.
Discussion over whether we are promising more than we can offer with tutorials. Some have software for it, but it's just a matter of time available. Left in to see interest.
Blackboard/CMS question -
Changed to reflect a wider range of services we can offer for BB (instead of just helping to embed links)
Dan thinks it would be better to cover this with other means of assessment (a focus group, for example) and doesn't think it will be one of the top 5 most important questions. The two initial questions were merged into one; we will leave it in to gauge any interest, and also make sure the survey includes the OA question.
Linda will write up an email going out to staff about getting feedback on the entire survey
Holes in survey
Dealing with the survey in surveymonkey. Will put in the new services questions.
Moved the OA question (Have you ever submitted an article to an OA journal?) to after the question about where to publish
One comment about the survey going from high to low (5->1) - commenter thought it should go from low to high. But gets awkward with "Don't use"/N/A button
Will leave range the way it is.
Have to make it clear that 5 is high b/c of digilib formatting (won't be a problem on SurveyMonkey)
Alex will try to change the right side menu "comments on paragraph 1" to "comments on question 1". Change weekly/monthly to week/month b/c of space issue
"If Other Please Specify" to be changed
LP will send out survey to Mary, Marlene, and Jim, and ask them to send it out to staff
Need to do some troubleshooting with comments, but wait until Alex puts in edits
Should we send out a link to survey monkey to the staff giving feedback? No, because can't fill it out.
Will be doing a faculty pilot, which should suffice for getting opinions about layout (if it's confusing, for instance); we want staff to focus on wording.
Alex edits digilib; Linda writes the email to staff (give staff a week to comment: try to get it out on the 21st, with comments due by the 27th). Kate organizes staff comments; the assessment committee makes adjustments based on the comments; then out to library heads.
Give time estimate as to how long it takes to complete the survey. Have some staff not involved in committee take it (we are all too familiar with it).
Meet 2:30pm Wed, May 5th
Linda asks whether we should introduce the survey at the overall staff meeting - we won't, but will discuss results with them.
Currently recruiting for faculty pilot. Need about 20-25: 2-3 from science, 2-3 from education, 2-3 from med (maybe more?), etc.
Have a letter (UW's letter that we can tweak)- Sarah will look for this letter and tweak it
Pilot might go out at a difficult time b/c of grading.
Questions over whether it should go out from a central email or if individual committee members should send. More likely to get responses if individual members send, BUT harder to remember to send out reminders.
Do survey after graduation
For faculty - how to leave comments? Comment boxes in survey monkey
From UW-Steve's advice about the pilot:
We should be timing the pretest, and it is important to have open-ended questions on the pretest (we will get staff to do this). Put something about the importance of our knowing the timing in the faculty letter; also, in the survey, ask meta-questions about how long it took them to complete the survey, whether anything was confusing, and suggestions for better wording.
Present: Linda, David, Dan P., Jim, Alex, Sarah S., Kate, Dan b. (minutes)
Next meeting scheduled for 4/13/2010 at 2:30pm at Pardee(?)
BLC annual meeting Assessment COI, 4/1/10, 1:30-2:30, will have a balanced scorecard presentation.
Limit the number of "potential new services" to 5 on the survey itself, but ask staff for comments on all.
Include a note in the email that goes out to teams asking for comments that asks them to pay particular attention to question 15 (this is the potential new services question).
Theo, Med, and Law will also be asked to send Digilib faculty survey to their staff for comments.
Kate will work on proposed services question
Dan P will work on formatting digilib comments, and also enable anonymous comments.
LP will speak to other AULs about assessment/planning documents
Order of steps: staff comment, IRB application, then pilot? Or pilot, then IRB application? LP will ask Prof Frasier
SS looks for holes in overall survey
A. Look at strategic plan for assessment wording
David led us through his document at http://docs.google.com/Doc?docid=0ARIN2pT8txcGZGNuanJncnRfMjE3dHdxZ3oyZ2s&hl=en that discusses the implications of the strategic plan for our faculty survey. At the end there are five conclusions, quoted here:
1. Although we ask about satisfaction with several services, we don't necessarily ask about every service (e.g. circulation or storage retrieval).
2. We don't ask about mobile or other means of access (beyond a question about print vs. electronic books).
3. We ask nothing about whether they consider the library a "top destination."
4. We don't ask about "awareness and responsiveness" of library staff.
5. We don't ask about the faculty's awareness of instruction services, or their likelihood of using them.
In addition, we talked about the need to be strategic in what we assess, and how we might enable other staff to take on assessment roles for themselves (eg - Helen/Circulation). We then talked about the possibility of crafting advice to the teams on setting priorities, setting reasonable goals, understanding just what needs to be done and why to do it, getting baseline data, and identifying where in the faculty survey relevant data may be.
B. Discuss ideas for "potential new services" question
We had quite a few ideas for potential new services to ask about on the faculty survey:
1. Alumni database access: IST does not have a way to authenticate alumni, and Linda suggested this may be more expensive than we can afford. This will not appear on the faculty survey at this time.
2. Mobile devices (eg iPhone)
3. Intercampus "paging" (ie delivery) of books. Sarah suggested we might ask Jack about turning on this button in III, but (apparently) there is currently not a way around the problem of someone in Mugar requesting a book be delivered from Mugar.
4. Intercampus "paging" (ie delivery) of photocopies.
5. help for fulfilling article deposit mandates (eg NIH)
6. help with library links in courseware systems (eg Blackboard)
7. course-specific (research) guides
8. article desktop delivery (electronic) to faculty for ILL documents (already happening at THEO)
9. more eBooks in your field of study
We then talked about how we should probably limit the number of "potential new services" to 5 on the survey itself, but that we might ask staff for comments on all of these. Also, we decided to include a note in the email that goes out to teams asking for comments that asks them to pay particular attention to question 15 (this is the potential new services question).
C. New wording for collection satisfaction question
This discussion, led by Alex, resulted in editing question 21 on the fly, to phrase it in the manner we all accepted: "How satisfied are you with the BU Libraries?"
D. Discuss getting survey up on digilib site (same style as strategic planning) to elicit staff comments
The faculty survey is being copied question by question into a Digilib blog at http://digilib.bu.edu/blogs/buassessment/ - this is where staff will be able to make comments on each question. Dan P will work on formatting digilib comments, and also enable anonymous comments. Theo, Med, and Law will also be asked to send Digilib faculty survey to their staff for comments once the version is formatted correctly. LP will bring this up with the AULs. The process going forward will be to get the faculty survey to staff for comment, then the IRB application will be submitted, and then we will run the pilot: Or should it be, rather, this way: pilot, then IRB application? LP will ask Prof Frasier about this.
UW recruits faculty through their liaison system. LP suggested that if each of us convinced three faculty members to join the pilot, then we would have a good pilot group. But it is not quite time to do this yet.
Next meeting's agenda
Look at any holes in what we are asking/survey in its entirety - discussion led by Sarah S
Present: Linda, David, Dan P., Jim, Alex, Sarah S. (minutes), Kate
Next meeting scheduled for Wed 3/31 at 2pm at Pardee
LP spoke with SED faculty who is on the CRC IRB. He will review our survey before we formally submit it. CRC approval should clear us for both campuses.
1) Edited Section 2, Website questions, and removed it as a separate section. List of edits below in "Discussion" section.
2) Looked at and accepted changes to Information Literacy questions
1) Alex will make sure all the (high)s and (low)s are consistent in survey
2) David will look at strategic plan for assessment wording (begun at http://spreadsheets.google.com/ccc?key=0AhIN2pT8txcGdDB0U0o1Ri0yUW0zR3dXZGp1UTZFOEE&hl=en)
3) Sarah S will look for holes in what we are asking
4) All members will brainstorm ideas for "potential new services" question
5) Alex will work on new wording for "overall satisfaction" question (UW#16) -collection component does not differentiate e from print
Section 2 Website
Question 1- "library resources" wording made it a question about resources, not about access, and we decided the info we really want to know is how often they access the library. Changed it to: How frequently do you use the following methods to access the Boston University Libraries? The original scale (a frequent-to-infrequent 1-5 Likert with no actual amounts listed and an N/A option) did not make sense. We noted that Steve had mentioned that if we use time periods (day, week, month) we will not be able to find a mean/median, but we were not convinced that would actually be useful/necessary. The scale is now: more often, weekly, monthly, once per semester, less often, with no N/A (we will interpret "less often" as including nevers and N/As). Ranges were chosen with the idea in mind that we may want to compare across user groups once we do grad/undergrad surveys, too.
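Steve's caution about time-period categories can be illustrated with a quick sketch (the responses below are hypothetical, just to show the point): labels like "weekly" or "monthly" can be ranked, so counts and a median category are defensible, but an arithmetic mean of the codes is not, because the gaps between categories are not equal sizes.

```python
from collections import Counter

# The ordinal access-frequency scale adopted for question 1.
SCALE = ["more often", "weekly", "monthly", "once per semester", "less often"]

# Hypothetical responses for illustration only.
responses = ["weekly", "weekly", "monthly", "more often",
             "once per semester", "weekly", "less often"]

# Counts per category are always safe to report.
counts = Counter(responses)

# A median (middle response after ranking) is defensible for ordinal data...
ranked = sorted(responses, key=SCALE.index)
median_category = ranked[len(ranked) // 2]

# ...but an arithmetic mean of category codes is not, because the distance
# from "weekly" to "monthly" is not the same as from "monthly" to
# "once per semester".
print(counts["weekly"], median_category)
```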
Question 2- unanimously agreed to delete this question (which library do you usually begin with, w/ drop-down list). We know what we really want to know behind this- who is using the branch pages, are they starting there, etc.- but we could not come up with a clear way to ask it. The wording implied that they start at a library page, when we acknowledged that they probably start with Google or a bookmarked link. Will have to use another method to get this data.
Questions 3 and 4- We discussed the reason we would want to ask #3, since we already know the answer. Considered moving 3 & 4 up to the importance/satisfaction services questions and including them there, but decided #3 doesn't make sense to ask and that #4 is primarily historical- we don't have this problem nearly as much post-EZproxy. We were also concerned about how to phrase it to stress that it was about access, not collections/content. Decided to drop them both.
Question 5 - deleted. Decided it was too open ended, and would turn into a laundry list of e-resources faculty want. Perhaps this is a focus group question?
At this point there was only one question left in the "web site" section, and it asks about visits to both the virtual and physical library. So we moved it up under the demographic questions; currently it is Q#6.
In the course of discussing these questions, we segued into the topic of collections, and looked at where and how we asked collections questions. We noted that UW question #16, which we have adopted, asks a collections satisfaction question but does not differentiate between e-resources and print holdings. We decided we wanted to make that distinction. Alex will work on the wording.
Information Literacy Questions
We looked over these questions, and also looked at the Strategic Plan http://digilib.bu.edu/blogs/buLib/ , to discuss how these questions met and were informed by goals 4 and 5. Linda pointed out the way that UW reports their information literacy survey question results always as a block (never just one question, always as a set) and suggested that we also do that. Sarah mentioned SAILS and suggested that we could use that as a baseline to compare the numbers we get from the survey to (eg, do the numbers from SAILS match with what skills faculty believe students do/don't have). We accepted these questions as they stand, and made no edits except to the spacing of the high and low in the response scale, which Alex will look for and fix throughout the survey.
Next meeting's agenda
Look at strategic plan for assessment wording- discussion led by David
Look at any holes in what we are asking/survey in its entirety - discussion led by Sarah S
Discuss ideas for "potential new services" question (all)
New wording for collection satisfaction question- discussion led by Alex
Discuss getting survey up on digilib site (same style as strategic planning) to elicit staff comments (all)
Present: Linda, David, Dan B., Dan P., Jim, Alex
Next meeting scheduled for Monday 3/22 at 2pm. Location TBD
Announcement: BLC Assessment to discuss Balanced Score Card at BLC's Annual Celebration
1) Committee will continue to consider sending paper version of survey. UW found a higher response rate for paper surveys but work load may be prohibitive.
2) An overall satisfaction question like UW's question 16 from their 2010 survey will be added. Wording and categories TBD.
3) David will make proposed edits to the Service Questions section in 2.0 version of survey. List of edits below in "Discussion" section.
4) Dan B. will make proposed changes to Open Access questions in version 2.0 of survey. List of edits below in "Discussion" section.
5) Jim made proposed edits to Demographics questions to version 2.0 of survey in real time. List of edits below in "Discussion" section.
6) Alex will add "Which school are you primarily affiliated with" back to the 2.0 version of survey. It will be Q# 2
1) Danny P. will take over the website questions section of survey. Analysis based on criteria described in 1/21/10 minutes.
2) Linda to discuss posting the survey to digilib with Jack Ammerman. Survey will appear in the same format as Jack's version of the Strategic Plan for staff comments.
3) David will look over the strategic plan to align its assessment goals with the work of the assessment committee.
4) Committee must look into how to interpret non-respondents for each question.
5) Alex will investigate color coding in survey monkey.
6) Alex to send email to Tracy Thrasher-Hybl about usability testing on "Find @ BU"
David's Service Questions:
David presented his analysis of the service questions section (now Q#16 and Q#17 in the 2.0 version of the survey). Click here for David's analysis.
We discussed the use of service questions as a public relations tool and a guide for resource allocation. The issue of non-respondents came up. It makes sense to lump non-respondents in with "don't use" to create an average for the "importance" section of the service questions. In other sections non-response/don't use can be interpreted as "don't know" or "hate." A final decision on how to treat non-respondents has to be made for each section.
We discussed color coding questions in Survey Monkey as a way of demarcating related questions. Color coding may not be an option.
Should we ask specific questions about the types of services the library can/should provide? UW does this on their 2010 survey in Question 14. The committee should look to the Strategic Plan for proposed library services currently in the works as a starting point for formulating this type of question.
The list of resources for Q#16 and Q#17 is too long. David will shorten the list to six resources. "Requesting books or journal items not owned by the library" will be changed to "Borrowing books or getting journals not owned by the library," with "(e.g. ILL)" added in parentheses. David will delete "Assisting your students in finding material for their assignments" from the list; it is covered in other areas.
Dan B.'s Open Access Questions:
Dan B's analysis of the Open Access questions can be found here.
Institutional Repository group has asked to keep "timeliness" in the response section of Q#10 in the 2.0 version of the survey (How important are the following factors in your decision on where to publish journal articles?). This question to remain as is.
Open access Q2 and Q3 will be merged. A decision must be made on the placement of this question in the survey.
Jim's Demographics Questions:
Q5 was moved to Q6, as it disrupted the flow of the demographics questions.
Q1 "What is your primary department" may be tricky for people with multiple associations. The committee decided to accept multiple associations.
"What is your primary school" to be added in the Q2 spot.
An "other" section with no comment space to be added to "With which campus are you primarily affiliated?"
Full-time vs. part-time status is important to know, since the survey will go to both types of faculty.
Q4 "Which BU library is your primary library" was deleted. This question is too problematic, e.g. multiple library associations, distance education, the library top page as a location.
* Unassigned actions and pending decisions appear in bold italics
Next meeting: TBD: Linda has sent out a Doodle request for scheduling purposes and will notify the group of the preferred dates.
Present: Linda, Alex, Jim, Dan B, David, Megan
The meeting began with recognition of Megan's departure and her valuable work with the committee. She will be missed. Megan suggested adding Kate Bronstad from Medical as her replacement and Linda added her to the assessment distribution list and the committee.
The rest of the meeting consisted of a review of Megan's work on the Library Resources section of the survey and group discussion of changes to the wording of our survey. Megan graciously agreed to post the revision of her work on the wiki.
The group questioned the need for an SFX question in this section or even in the survey itself. Further discussion of this subject is planned once the entire committee meets again.
In question #1, Megan and Jim raised the possibility of a comments section to answer the "why" of the questions. If faculty rank personal collections as high but the library as low, don't we want to know why that is? The group decided that these questions might be better asked in other settings.
The group categorized the 3 questions in this way:
Next Meeting: Wednesday, Feb. 10th at 2pm, Pardee (if available) or PERL (backup)
Present: Linda, Alex, Sarah, Dan B., Dan P. Tim, Jim, Megan, David
David will learn something about statistics.
Sarah will look for holes in survey (important information we aren't collecting)
For each section of the survey, the following people will go over the questions, ask why we are asking each of them, think about what results we want (including possible charts/graphs), and go over the points raised about the section by Steve:
Tim: Website questions (though they may ultimately end up in other sections).
Dan B: Open access
Danny and Linda: Fight over information literacy
Faculty survey will be pushed to November.
Linda reported on the assessment forum at ALA. Big-name researcher Carol Tenopir from the U of Tennessee gave a talk -- she has a big grant to look at the value of academic libraries to universities, based on work at the U of Illinois looking at the flow of grant money, but taking it much further. Linda presented our work, which was well received. The PowerPoint is up on the assessment wiki (Alex helped with design). She spent all day at the LibQual session, as reported in e-mail.
Talked about the upcoming Assessment Conference in Baltimore (Oct. 24-27). Linda is going and doing a poster session on our survey work; Sarah is thinking of going. Linda says the conference is very, very good -- you live and breathe assessment for three days. Anyone who wants to go, talk to Linda.
Long Discussion of What Next for Survey
We discussed what we learned from Steve Hiller's visit. Sarah emphasized the importance of knowing what we are going to do with the data before creating questions. Linda described having lunch with Steve after the meeting, and his opinion that we are not really ready to do the survey this Spring. We all agreed to move the survey to November.
Some discussion of whether we should include part-time faculty (Steve's survey doesn't, there are reasons pro and con). No decision was reached.
Sarah asked if we are getting ahead of ourselves in working on the detailed wording of the survey, should we step back and concentrate more on what we want out of it? Megan suggested we investigate the recent Sustainability survey to see if we should use the same tool. Sarah raised the concern that a BU homegrown survey tool may not be maintained. No decision was reached.
Discussion of statistics tools, and a recent workshop on SPSS a couple of people attended. JMP (http://www.jmp.com/) has been suggested as an alternative to SPSS (BU will be providing it). Sarah emphasized the importance of either having one of us come up to speed on statistics, or getting outside help; Linda argued for growing our own expertise ("how hard can it be?"). David stressed the importance of some actual knowledge of statistics, e.g. not just knowing the difference between a mean and a median, but knowing which one to use in a particular situation. David volunteered to learn something about statistics.
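David's mean-vs-median point is easy to demonstrate with a toy example (the numbers below are made up, not survey data): usage data is often skewed by a few heavy users, and in that case the mean and median tell very different stories.

```python
import statistics

# Hypothetical data: weekly visits to the library website per respondent.
# One heavy user skews the distribution.
visits = [2, 3, 3, 4, 5, 5, 6, 120]

mean = statistics.mean(visits)      # pulled far up by the single outlier
median = statistics.median(visits)  # robust middle value

# For skewed usage data like this, the median is usually the more honest
# summary of a "typical" respondent.
print(mean, median)
```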
Linda suggests everyone take a section and mock up the kind of report/chart we would want to create using data, while Sarah wanted to figure out something closer to the raw data we want.
Sarah brought up section of Steve's article "Assessing User Needs, Satisfaction, and Library Performance at the University of Washington Libraries" (Library Trends 49.4 Spring 2001 p605) that talked about their survey goals:
Should we have a similar list of goals? We decided we should, and Sarah and Jim would independently come up with them (but later on we appeared to change their charges).
David mentioned that we had already done some looking at what kinds of questions we wanted answered before creating our current questions. Sarah wondered if we should find out from faculty what they want to be asked; Dan. B. suggested that we will be asking what is missing during pre-testing, and Linda reminded us we could include open-ended questions in pre-testing.
Dan P. didn't think we should tear apart the current survey and start over. Sarah thought we didn't need to throw it out completely, but must make sure we are collecting data we can use and manipulate.
Linda suggested that we break down the survey by sections, and for each one someone should come up with what information we want, and what it might look like (charts, graphs, etc.). Jim suggested we don't get hung up looking at Steve's survey while we do this.
We arrived at the following breakdown of assignments:
Sarah will look for holes (questions we aren't asking)
Tim will do website questions (though they may ultimately end up in other sections).
Megan will do library resources section
Dan B will do open access section
Jim will do demographics section
David will do services section
Danny and Linda will fight for information literacy
For all, we will look at why we are asking each question, the possible graphical representation of the results, and the issues raised about the section in the minutes of meeting with Steve
Megan started taking us through her minutes of meeting with Steve.
Some discussion of IRB, will do Charles River. We want to be "exempt"
More discussion of faculty feedback. Megan suggested if we have a few questions we can't word correctly, we might have focus group with faculty.
Yet more discussion of Primary Library question. We want to use it with other questions (so we can break down answers per library). Should we have multiple check-offs? What about people who never use physical library?
Next Meeting: Thursday Jan. 21 at 3pm, School of Education Library
Present: Linda (chair), Megan (minutes), Dan B., Dan P., Jim, David, Sarah and Steve Hiller
Action Items: Update survey questions based on Steve's feedback (See question #9 below).
1) Can you speak about the UW's intent in regard to using responses for Public Relations versus using them to improve internal work flows?
PR has a big role in how we communicate the value that the library is providing the community. UW's survey allowed them to look at program improvement but also to get a sense of importance for their various user communities. For example, UW put up a report last spring during budget cuts that contained a lot of the data from the survey. This is currently available on their website. Data that relates to resources and services is very useful for communications. However, Steve stressed that the survey itself is not a marketing tool. Sarah asked how much he finds it ends up being this way anyway. Steve answered that faculty may become more aware of a service, but we can't push something overtly in the survey.
Sarah asked which of the questions on the survey are asking things we want to know about the library, and which are asking for information about the survey itself (for example, demographics). Steve answered that it is less the latter because of the IRB. Also, respondents may not be as forthcoming. For example, gender has been dropped.
2) In what general or specific ways has the UW faculty survey changed since the first time it was given?
UW tried to move the survey to include more impact questions, and demographics that allow UW to do specific types of demographic analysis- ex: have you taught this year, did you get grant funding. They also added the budget reduction question and the importance questions.
David F- how much rewording have you done and how has this affected your longitudinal analysis? UW survey question #4- in 2007 it asked about both using a campus computer and using an off-campus computer- this has been eliminated due to changes in wireless service, use of mobile devices, etc.
Sarah- do you write the questions for longitudinal use? Yes, if there are things that they want to keep track of.
Sarah- concern about whether faculty understand the difference between subscription resources and the open web. Steve answers that it depends on the local environment; it helps that UW does not have federated searching. Faculty have seemed able to differentiate between them relatively well, but it is messy.
3) How would you recommend we do the pilot for the survey:
a) have faculty take the actual survey, and we interpret the results
or, b) have faculty give direct feedback about the questions we ask, and those we're maybe leaving out?
Sarah is concerned that we have not used a model where we talk to the group we are surveying about what we are asking, partly because the survey has been modeled so heavily on the UW survey. Is this an issue? Steve- Best if we can do this, but we may be able to use the pretest to gather some of this information by asking additional questions about what was confusing for participants, or whether there is anything they expected to be there that wasn't. Steve has agreed to send the pretest questions that UW asked. They meet with the faculty senate council in the library and talk to them about what they see as important- generally this has not been extremely helpful. They also met with student groups and tend to get the best feedback from that group.
Dan asked if faculty who take the pretest also take the regular test. Steve said that they do and that we need to be explicit about asking faculty to take it twice.
UW recruits faculty through the liaisons. 20-25 faculty for the pre-test. Steve also agreed to send us the message that UW liaisons used for recruitment.
We asked Steve whether we need both an email pretest and a focus group? Is this too much? He suggested that if we do both, just target specific questions. We should get as much feedback as possible on the survey because it can give the perspective of the respondent in a qualitative way and allow us to look for patterns in the responses.
Steve stresses that for the analysis piece, we should use something other than Survey Monkey, such as SPSS or SAS. We will need more flexibility than Survey Monkey can provide. As a result, we need to set up the survey in a way that it can feed into SPSS and think about how we want to use the data. For example, one might look at frequency of use and pull out the group who says they use something weekly. Then you can evaluate the rest of that group's responses in this context and track trends over time.
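The kind of subgroup analysis Steve describes can be sketched on a raw export, whatever tool we end up with. This is a minimal illustration with made-up rows and hypothetical column names ("access_frequency", "satisfaction"), not our actual survey fields: pull out the "weekly" users, then summarize the rest of their responses separately.

```python
# Hypothetical survey export: one dict per respondent.
rows = [
    {"access_frequency": "weekly",  "satisfaction": 4},
    {"access_frequency": "monthly", "satisfaction": 5},
    {"access_frequency": "weekly",  "satisfaction": 3},
    {"access_frequency": "weekly",  "satisfaction": 5},
]

# Pull out the subgroup that reports weekly use...
weekly = [r for r in rows if r["access_frequency"] == "weekly"]

# ...and evaluate their other responses in that context.
avg_weekly_satisfaction = sum(r["satisfaction"] for r in weekly) / len(weekly)
print(len(weekly), avg_weekly_satisfaction)
```

Running the same cut on each year's export would support the trend tracking Steve mentions.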
We should time the pre-test, and it is important to have open-ended questions on the pretest.
4) What shortcomings are there with web-only surveys? Are there any significant advantages with print?
Logistically, web-only is the easiest for distribution, finances, and data capture. UW went web-based in 2004 but decided to include a print survey in one of the reminder notices. UW distributed the survey with the same URL, but with the reminder notices they mapped where responses were coming from by department, so specific groups could be targeted by liaisons. When the response rate was particularly low, they would target specific groups, because it is important to get a representative sample.
Steve recommends that we wait a week to 10 days to send reminder. We should expect to get half as many responses for each reminder.
Sarah brings up the issue of survey fatigue. Because UW partners with the Office of Educational Assessment, they are able to identify samples of students and controls who would otherwise be over-surveyed. It helps to mention upfront that we only do this once every three years; faculty will be less irritated.
Linda asks if spring is the best time to do this. With students, definitely wait; with faculty it is better, but if that is not possible, wait until late in the fall semester, say November.
5) How did UW encourage faculty response? How does UW separate the faculty identities from the responses, and how easy is that to move past the Institutional Review Board?
UW emails come from the Library Dean. Liaisons are really important in the reminder emails. Also, offered incentives for the bookstore but with the faculty this didn't really seem to make a difference. UW will probably drop the incentive for faculty in the near future.
The survey was used for organizational improvement only and was anonymous. The incentive does not affect this because it goes to a different agency. Participants finish the survey, their anonymous data is submitted, and then they have the opportunity to sign up for a drawing. However, it is different with print. Still, in 2007, 28% of responses came back in print, showing that there are still people who prefer to do a print survey. It is extremely important to make the print survey mirror the online version as much as possible so we can compare data.
Sarah asked Steve his opinion on LibQual. He said that he felt that the results are not as useful, especially outside of the comments field.
6) We see that you collected faculty rank. Is that merely to ensure you do not get skewed results toward any singular group? Are there other questions on the survey where the concern is simply to make sure results haven't been skewed toward any singular group?
We asked Steve what he considered important in terms of demographics, because we ask primary academic department but don't identify rank. Steve does not think that rank is very important. Originally, UW used rank in place of an age question to evaluate the adoption of new technologies. Currently, UW does not survey part-time faculty who are not tenure track, because their response rate is much lower and pulls down the overall rate. This group also tended to be a little more negative and less thorough because they did not have as much experience with library services. Thus, we have to decide how we want to define faculty. UW's definition has included tenure track, research faculty, full-time lecturers/instructors, and post-docs. Sarah suggests it would be important to get data about these ratios from the schools. Steve said that post-doc response seems to be equivalent to assistant professors. UW does not survey affiliate faculty, i.e. people who do not get paid by them, but researchers who get paid by grants are included.
UW also did a survey of non-faculty professional staff.
7) Will your faculty have to log in to fill out the survey? Does making them do so add a legitimacy to the survey that makes it worthwhile to require?
They don't log in. They only submit identifiable information about themselves if they want to sign up for the incentives once the participants have completed the survey.
8) Asking about Frequency / Importance / Satisfaction, or some combination of those three. We have had many discussions about the need to ask each of these for various services and resources. We have only one single question about frequency of use (our website), so are we missing something here?
Steve said that UW asks a lot fewer frequency questions than they did before. Qualitative data has shown that frequency is not a predictor of value. The only frequency question left is #4. Also, if you specify time periods (weekly, monthly, etc.), you can't do a statistical mean. With frequency questions, we have to think in advance about how we are going to use them and what sort of statistical analysis we will be doing. UW ditched the library website for the frequency question- there was a lot of confusion about what the library website is.
Jim asked about the new survey questions #10 and #11. Specifically, these questions don't talk about student performance in the graduate category, only undergraduate. We plan to do both in our survey. Steve says it is muddy with graduate students if it is not tied to a specific track or specific class. Graduate students are spread around. The undergraduate question is used almost as a gap analysis. Also, UW took out finding info on the web and in the library because they can be the same thing. Currently, they are discussing adding a question to the student survey that asks "What makes you anxious about using the library", but they are working on the wording.
The difference between Frequency, importance, satisfaction questions also comes into play when you think about how you treat missing responses. On satisfaction, it doesn't matter. In terms of importance, UW found that there are times when people just don't answer for that question- 23% don't answer this question. We have to decide how we would rate this- does this mean it is low importance? Need to be consistent with how we deal with this. For example, with a frequency question is a "no" response an "I don't use this"?
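The decision about non-respondents is consequential, and a tiny sketch shows why (ratings below are invented for illustration): lumping non-responses in with "don't use" at the bottom of the scale produces a noticeably different importance average than dropping them.

```python
# Hypothetical importance ratings on a 1-5 scale; None marks a non-response.
ratings = [5, 4, None, 5, None, 3]

# Option A: drop non-respondents entirely.
answered = [r for r in ratings if r is not None]
mean_drop = sum(answered) / len(answered)

# Option B: treat non-response as "don't use" = lowest importance (1),
# as suggested for the importance section of the service questions.
coded = [r if r is not None else 1 for r in ratings]
mean_code = sum(coded) / len(coded)

# The two conventions give different averages, so whichever rule we pick
# must be stated and applied consistently per section.
print(mean_drop, mean_code)
```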
Sarah- how do you use the "I don't knows"? Steve- they can be a good indicator of visibility. Satisfaction- 41.4% no response for the last question in 12, about liaisons.
It would be too confusing to include all of these types of questions; UW would never ask all three.
9) Would you please preview our survey with us and give some feedback as we go?
Comments about our survey:
#1- may need an "other" option. People get upset if they don't have a way to identify this.
#2- Drop academic department
#3- UW asks them just to do the departments, and then have a mapping feature that auto maps the department to school, so we are not asking them twice.
UW does not require answers to any question. There tend to be some people who don't answer the demographics.
#4- Get rid of this question- can just do a separate question
#5- Which is your primary one? How will we interpret this information- faculty would need to choose more than one. Are we looking at which library they visit, or which library's staff, resources they use? Worth explaining this in the survey. Perhaps list all of the libraries and select which ones they use on a regular basis. UW only does this for students but doesn't ask this for faculty anymore because they don't come to the library anymore.
#6- Do we want to split the funding question? federal vs. external funding- may not be that meaningful for us. Change to "Federal or other external funding".
Web Site Questions:
#1- only frequency questions- what are we measuring against here? Especially if they can get to a range of library resources without going through the library. Think about how we will use this information. Jim suggests that instead we use UW question #4.
Steve likes questions #3 and #4-- UW added a satisfaction question on remote access to resources and services.
#5- open ended questions might be better to ask in pre-test and then construct a list question. Work on revising this one.
#2 have you submitted to an open access journal? Sticks out on a standardized survey, we need to include examples.
#3 Have you ever deposited an item to: -- ok, but we may want to look at whether we ask the DCommon question
Find @ BU:
With one-offs, Steve recommends that we fit this into another category- for example, put this in with satisfaction with library resources.
Maybe we could lump the SFX question into this section? Saves space and gives the participant context.
#1 was dropped at UW for undergraduates and they are considering dropping it for faculty because they are not sure how separate those categories are now.
#2 Wikipedia is out on the new one.
#3 UW stopped separating print and electronic- not about the format, about the source. The old vs. new journals question comes from decision whether or not to purchase back files.
#1 Need to provide a NA option. Also think about whether to include the "in the library" "on the web" distinction. UW changed to "scholarly information".
Need to add NA on satisfaction questions.
We may want to drop library instruction option because it is asked about in other areas of the survey.
Requesting books or journal articles- UW got rid of this for real estate.
Overall satisfaction questions? Do we want to add this? An overall importance question to different areas is useful in the context of looking at resources and services.
Steve stressed the importance of thinking about how we can make the survey shorter; he thinks we are trying to do too much. Some of the satisfaction questions could be answered in an overall satisfaction question.
Next meetings: Friday, Jan. 15th at 10am Mugar conf. rm w/ Steve Hiller; Thursday, Jan. 21st at 3pm Mugar conf rm (probably)
Next meeting: Wed. January 13th at 2pm (location: Mugar Admin Conference Room unless otherwise notified).
Present: Jim Skypeck (minutes), Dan Benedetti, Dan Piekarski, Linda Plunket (chair), Megan Bresnahan, David Fristrom, Tim Lewontin, and Alex Solodkaya.
Next Meetings: Wed Jan @ 2pm, Wed Jan 13 @ 2pm (location Pardee unless we hear otherwise)
IRB and survey distribution questions
Are we asking survey-takers to log in? If we do, that will make IRB process much harder. Discussion seemed to conclude that anonymous, no login was fine. Will send the link in an email to faculty only. Should note in email that it is a FACULTY survey, to please not forward it, and note approximate time it will take to complete (since recent techqual survey was so long).
Will send email reminder(s) to entire group. LP will ask Bob Hudson if he can nudge deans to get them to encourage faculty participation.
Reviewed question development
each of us will make updates to our section before Christmas, so Alex can start moving them to Survey Monkey
Resources (MS and SS)
Next Meeting: Dec. 21st at Pardee
Present: Danny Piekarski, Sarah Struble, Tim Lewontin, Dan Benedetti, Jim Skypeck, Linda Plunket
The group worked on editing recent updates made to the faculty survey, as highlighted in red on the survey itself by Sarah Struble.
Under Services, it was decided to drop frequency questions, as it would make evaluation of survey results easier.
Under Importance of Services:
On the U Washington survey, we had a question for Steve Hiller as to whether or not #6 (frequency) and #13 (satisfaction) seem to ask the same thing viz. use of online/web services and sites
Under "Other Questions":
Should questions relating to print vs. digital all have a "no preference" option?
Under "Demographic Questions":
a pull-down should be added asking for primary affiliation with college/school and department
Under "Information Literacy"
We wondered how listed categories correlated with ACRL standards.
Under Rating of "Graduate/Professional Student Performance":
We also wondered whether or not we should ask for a "primary library affiliation?"
Sarah and Meghan will prepare the survey section on Library Resources, and Tim will prepare the section on Library Websites for the next meeting.
Next Meeting: Dec. 3rd at Mugar
Present: Linda (chair), Dan B., Dan P., Tim, David, Megan (minutes), Jim, and Sarah
The group decided to primarily concentrate our discussion on the faculty survey for this meeting. The group also agreed to meet every other week in order to meet spring goals.
David suggested that Google Wave may be a useful tool for virtual meetings; David will send invites and links to tutorials for Google Wave to everyone in the group.
We decided to meet every other Thursday. Next two meeting dates will be Dec. 3rd and 17th at 2 pm.
We began by looking at each of the sections/categories that we defined in the previous meeting. Each group member posted their write-up on the wiki (http://sites.google.com/site/buassessment/faculty-survey---goals-process-categories-of-questions)
Linda asked the group to identify any holes in her list. Megan suggested that we might add something about the faculty members’ status as either an adjunct, assistant, associate, or full professor, or more importantly, whether the faculty member is clinical or research faculty. Sarah agreed that asking the faculty member to define whether they do research, classroom teaching, or clinical teaching would be useful at the CRC campus. However, the group decided that it would be unnecessary to ask the participant to define their faculty status. As mentioned last week, we may have a difficult time distributing this survey to adjunct faculty because it is not clear whether there is a comprehensive listserv for CRC adjuncts. Linda said that she thinks there are two lists: one for all faculty and another for CRC including adjuncts. Linda will check into whether and how we can contact adjuncts.
We discussed whether or not (and if so how) we should ask faculty about their primary BU library affiliation. We discussed how we would ask this question. Tim suggested that we ask what library they use most often. Linda stressed that we would need to have an “NA” option because this question presumes that they use a library. Then we can state something like “the following questions refer to your primary library.” David suggested that we provide a list of libraries and ask the faculty member “which of these locations have you used in the last year?” Sarah and Tim both stressed that we will need to make the distinction between virtual and physical visits to the library. Sarah also mentioned that she thinks that library space is not very important to faculty. The group decided that this question should be moved to the facilities section.
Linda and Jim agree that asking whether a person is a full or part time is valuable. We also decided that we do not need to know how many years a faculty member has been employed at BU.
There is unanimous agreement among group members that our survey should not be as long as TechQual!
We began by reviewing the list of items that we would define as “Library Resources”. Everyone agreed that we can get rid of reserve and reference in the books section. We can just ask if they use print or electronic books. Jim suggested that instead of old or new journals, maybe define by storage or stacks, but the group decided that we should keep it simple and just ask about print or electronic journals. Sarah mentioned that we need to give specific examples of what we would consider a database or index. The same goes for “Archives”: we will need to define this and give specific examples so it is clear what we are asking faculty. We decided to get rid of newspapers because, other than access through Lexis/Nexis, we don’t really subscribe to individual newspapers at either campus. Audio and visual media can be lumped into a single source (this would mostly be for the music collections).
Megan asked if we should move library research/subject guides and tutorials from Library Resources to Services. The group agreed that this would be appropriate as long as it is mentioned somewhere. Megan also mentioned that we will not be listing these questions by category on the survey.
“Software” is included as library resource. The group discussed whether faculty use software and decided that we can probably get rid of this resource. Megan noted that we will want to add software back in for our survey of students. We also decided to get rid of dissertations, reports and scores from the library resources list.
Next we discussed the information we want to learn about the list of library resources.
Sarah mentioned that it might be difficult to figure out how faculty find library resources. How do they track stuff down? How are they accessing these resources? Maybe we can use the UW questions for this section; these are hard to phrase without putting words in their mouths.
Most people agreed that it is important to ask open ended questions and/or include a comments field. Even if we cannot analyze all of this qualitative data, we still have it available to us so we can pull golden responses from faculty to support our quantitative findings.
David- are there services that they want from us that we are not providing?
Sarah- what services do we have that they do not know we have?
David said he broke his section into three categories:
· How often do they use services
Sarah suggested that if we ask about course management software, we need to give examples and specifically mention Blackboard or CourseInfo. Sarah said that we need to be careful about asking “how important is this service?” She thinks they will say it is important even though they don’t use it. David added that it is difficult to ask why they are not using something.
Linda and David agree that we will need to provide several opportunities for faculty to make comments in this section.
Tim said that we need to ask if they are using the website and if not how are they accessing our resources. Megan and Linda mentioned that the focus of these questions should not only relate to the Mugar website but should also account for the branch libraries such as Medical, Law and Theology. We will need to list different library websites, including catalog, so that the faculty comments can be associated with a particular library site.
Dave mentioned that he really likes the way that the UW survey does not ask about the website specifically, but instead asks about information seeking behavior.
Linda suggested that our plan should be to take our discussion from the meeting, write our questions, and really look at the UW survey. If we decide not to use any of their questions we must have a strong reason.
Dan P. suggested cutting the hours questions but keeping the question about how frequently faculty use the library. The group also decided to cut questions about how knowledgeable the staff are because we won’t know if the faculty are referring to a student worker or a professional staff member. Comments related to staff services will likely be captured through the questions on library services. Dan P. (may have been another group member?) suggested that we ask something like “How do you rate the quality of the facilities? Please elaborate.” Group agreed that it would be useful to ask faculty to elaborate on many of their answers.
Sarah suggested that at some point in the future we might want to run a door survey to capture additional information in this category about how people who use the physical space in the library are using that space. In this survey, we might ask “what did you do today, how long did you stay, did you interact with library staff, did you find what you were looking for” etc.
We agreed to eliminate the frequency and hours questions.
Dan suggested that we might try to capture faculty’s preferences for print or digital (which was somewhat covered under the library resources category). We noted that preference would be more valuable to know than what faculty members actually use; they would use the format that the library provides.
The group agreed that Open Access should be addressed in the survey. At the least, this may help raise awareness of the issue (though the survey should not be instructive) and provide a metric for comparison when the next faculty survey is conducted in 3 years, when OA awareness at the University will likely be much greater. Megan mentioned that she thinks we should make a distinction between faculty who are using open access resources and those who are seeking out OA resources for publication.
Dan also noted that we will need to find a way to address SFX services and the group agreed that we should refer to the service as “Find at BU”. Tim also reiterated that we should ask about faculty members’ experiences with remote access.
For the Next Meeting on Nov. 3rd: Action Items
Since David noted that he liked the questions about information literacy on the UW survey, he agreed to come up with some questions related to information literacy.
Each group member is now responsible for writing up questions related to their categories and lining them up with the UW survey questions when possible.
Sarah will make a Google doc for listing our questions and share with everyone.
Tim will post his questions to the wiki.
Megan will post minutes.
Dan P. will add several group members as Gmail users to the wiki.
Quick Note (11/16/09):
I happened to notice that the Google Sites storage limit is 100MB per site. I know this is something that had concerned us in the past. We are currently using 6% (6MB for you liberal arts majors).
Next meeting: TBD
Present: Linda (chair), Dan B. (Minutes), Dan P., Tim, David, Megan Bresnahan, Jim Skypeck
- Linda will add Megan, Mary, and Jim to our assessment listserv, and also query Marlene to see if she would like to be added. Jim will also forward select emails to Vika.
- Committee members will decide what the data might look like and create the questions that would result for each of the following:
Tim Library Websites
David Library Service
Sarah & Megan Library Resources
Dan P. Library Facilities
Dan B. Other
We agreed that these reports should be posted to the wiki.
- We decided that for demographic purposes (the first bullet under Categories), we would like the first bulleted item underneath that begins "Primary School, College, Institute..." to be reflected on the survey as multiple-select boxes, so that faculty can select as many as are appropriate.
- We also decided that we do not need to collect the fifth bullet item, that begins "Status - full, associate, assistant..." but that we do wish to learn how many years someone has been here.
- Ask, under the demographics portion, if the faculty member strictly does only research, teaching, or a combination of both.
Tim L. has looked more closely at the ezproxy logs MySQL database that Scott from IT compiled at one point. There is a long version, with around 250 million entries, which is probably too large for us to handle in any way, and a short version, with about 250 thousand entries, which Tim investigated. The short version gives us info on how many times a certain domain was accessed, user affiliations, dates, etc. The database could be seeded with info from the EZproxy server on a set schedule, but pulling info out (vendor names, for example) will probably have to be tailored each time because of URL changes to databases and the like. Tim estimated that perhaps about 1,200 separate domains exist in the log.
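As a hedged illustration of the kind of summary the short version supports, here is a sketch using an in-memory SQLite stand-in for the real MySQL database. The table and column names (access_log, domain, affiliation, hits) and the sample rows are assumptions made up for the example, not the actual schema IT built.

```python
import sqlite3

# Sketch only: the real data lives in a MySQL database maintained by IT;
# the schema here is assumed, not the actual one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE access_log (domain TEXT, affiliation TEXT, hits INTEGER)")
conn.executemany(
    "INSERT INTO access_log VALUES (?, ?, ?)",
    [
        ("jstor.org", "faculty", 120),       # made-up sample rows
        ("jstor.org", "grad", 80),
        ("sciencedirect.com", "faculty", 95),
    ],
)

# Total accesses per domain -- the kind of question the short log answers.
rows = conn.execute(
    "SELECT domain, SUM(hits) FROM access_log GROUP BY domain ORDER BY SUM(hits) DESC"
).fetchall()
for domain, total in rows:
    print(domain, total)
```

Mapping the ~1,200 domains back to vendor names would still be the manual, per-run step Tim describes.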
Linda P provided an outline of what the next steps for the faculty survey should be. It is posted on the wiki at: http://sites.google.com/site/buassessment/faculty-survey---goals-process-categories-of-questions.
Megan worried that respondents are backing into our resources through Google and the survey might not get to that. Sarah seconded that concern, confirming that it has been her experience that users definitely back into some, and don't connect with it as a library-provided resource. Dan B. suggested that if we wanted to see how much this is happening, we need to be careful in the way the question is worded on the survey, since knowing users start at Google is not quite the same as knowing that users actually do back into library resources that way.
There is a LibQual session happening at MidWinter, coming up on January the 18th. Linda, Dan P., and David expressed interest in attending. Linda will add Megan, Mary, and Jim to our assessment listserv, and also query Marlene to see if she would like to be added. Jim will also forward select emails to Vika.
In general for the survey, we are looking to use Survey Monkey only (no paper) and are hoping that it will be a short survey. Permission will have to be obtained through the IRB and the Provost. An email would be sent out to faculty (adjuncts included?) and a link provided.
Much of the discussion focused on how we will proceed to create the questions and then run the survey. Jim pointed out that faculty probably shouldn't have input on drafting the questions that they will then be asked to complete (under #2 on the outline). Dan suggested perhaps some faculty expert in running surveys might be consulted (particularly, Linda had previously mentioned a Professor Stefanakis at SED). David added that perhaps faculty liaisons might serve a role as a possible pilot group, as they are naturally interested in library issues. Sarah noted that department secretaries might be better able to provide lists of adjunct faculty than deans, if the email lists we can get from the Provost do not adequately cover them.
To proceed to creating questions, Linda has broken out the types of information we may wish to gather into categories on the outline. Tim seconded this approach. Dan B. suggested we may just want to go through the Univ. of Washington's survey, and possibly other surveys, to cherry-pick questions we think apply to us, and then review those to see if anything is missing.
We decided that for demographic purposes (the first bullet under Categories), we would like the first bulleted item underneath that begins "Primary School, College, Institute..." to be reflected on the survey as multiple-select boxes, so that faculty can select as many as are appropriate. We also decided that we do not need to collect the fifth bullet item, that begins "Status - full, associate, assistant..." but that we do wish to learn how many years someone has been here.
A couple of clarifications for survey purposes were suggested. Megan stated that it is important to be sure to include ebooks as a clear question on the survey, as the Med. School now has quite a collection. And David contributed that under "library services" (bullet three in the categories), we need to be clear when we ask faculty about reference, whether we are asking them to give feedback on their own reference service experience, or whether we are asking them to give feedback on how they feel their students have been helped by reference services. Distinctions like this in other services and other categories are likewise to be duly noted. This led to the decision to ask, under the demographics portion, if the faculty member strictly does only research, teaching, or a combination of both.
Sarah suggested and we decided to draft the questions after deciding what data we would like to receive. Committee members will decide what the data might look like and create the questions that would result for each of the following:
Tim Library Websites
David Library Service
Sarah & Megan Library Resources
Dan P. Library Facilities
Dan B. Other
We agreed that these reports should be posted to the wiki.
Next meeting: September 21st 2:00pm.
Present: Linda (chair), Dan P. (Minutes), Tim, David
The ARL Mapping report that David has worked on can be found under Survey Manipulations.
ARL Mapping: Next Steps:
- A column should be added that would include an example (such as a number that has been collected in the past)
- Next to the example column should be a column of "steps". This would be the process that whoever is collecting the data has to go through to get the number. All queries that are used in Millennium should be written down. Also important are the fields that the queries use.
-This is important because, as Linda said, "Bob will not be here forever" <--- Threat?
-Tim informed the group that IT has scripts for ezproxy data and keeps the data in a MySQL database that needs to be parsed.
-David noted that we should look more closely at the EZProxy data when/if we get counter data.
Library Learning Wiki:
-Tim has placed a report up on the library learning wiki.
Worldcat Local Comparative Stats:
-The group decided to look at the data collected after we have worldcat up and running. This has been decided as a sixth initiative for the assessment group to tackle.
-The group will look at ILL & Circulation Numbers. Did our numbers go up or down with worldcat and do we think there is causation?
Six People, Six Projects:
-The six initiatives of the assessment group have been split among the group members.
-At the next meeting each group member will give a small presentation about their area and what we can do to tackle the initiative (see our charge for more details)
Tim & David Counter & EZProxy
Sarah Worldcat Local
Dan P. Google Analytics
Dan B. Data Website / farm
Linda Faculty Survey
Tim Data Collection Survey
Next meeting: August 6th 2:30pm.
Present: Linda (chair), Dan P., Dan B., David (minutes)
David presented his work on mapping from data collection survey to ARL report (up on wiki). A few corrections were made. We will think about what to do next and discuss it at next meeting.
Next we discussed Linda's proposed Assessment Plan; overall the reaction was positive.
David wondered if there was too much to do. Dan suggested extending the time window.
Danny wondered how we would keep the data farm going. Do it in steps? Start small?
Linda suggested not a full-blown data farm, just web site for reporting data. Simplify.
Linda also suggested changing it to five year plan.
On doing survey, we were still waiting to hear if we can use Steve's survey [we later heard that we could]. Should we use Survey Monkey? General opinion was favorable.
On COUNTER: Need Tim input.
Analytics: Danny's done a fair amount.
Minutes 6/30/09 - Pardee Conference Room, 2:30
Next Meeting 7/14/09, Pardee Conference Room, 2:30
Danny and Dan reported glorious weather on their vacations!
David gave an update on the mapping. It's somewhat tricky mapping from ARL statistics back to various branch and department statistics, but it's coming along. David should have a first draft by 7/14. Linda mentioned a tool that might (or might not) come in handy for the mapping. It's called Vue, is open source, and was developed at Tufts.
Most of the meeting was fleshing out an Assessment Plan. Linda will write a first draft and post on the wiki before 7/14. The Assessment Group will be asked to review and comment first, and then ALC and Refnews. We agreed to propose five projects over the course of the next two years:
Minutes 6/16/09 - Pardee Conference Room
Next Meeting 6/30/09
Present: Dan Benedetti, David Fristrom, Tim Lewontin (minutae), Dan Piekarski, Linda Plunket (chair)
--Linda tabled an agenda motion to re-select chair until a meeting when all members were present.
--Dan B reviewed his revision of the survey. Several future edits were considered, but tabled for the time being:
1) Re-organize columns so that like categories for each library or department match like categories as one reads across the columns.
2) Rationalize items by scope, so that all libraries or departments to which a statistic collection process applies are included for that category.
3) Combine the ARL statistical report with the Annual Report stats.
4) Review Dan's list of "Questionables" in order to ask survey participants for clarification on each list item.
It was decided that it would be better for now, to restrict ourselves to mapping survey results to the ARL Report.
Linda distributed a chart on Goals and Outcomes for the group.
--Tim will post ARL Statistics Report in the Wiki.
--David will map survey results to ARL Statistics Report.
--Group will work on filling out the Goals and Outcomes chart.
Minutes - 5/27/09 - Pardee Conference Room
Present: Dan Benedetti , David Fristrom, Sarah Struble, Dan Piekarski (minutes) , Linda Plunket (chair)
Assessment Committee Minutes 5-11-2009
Present: Dan Benedetti (minutes), David Fristrom, Tim Lewontin, Sarah Struble, Dan Piekarski , Linda Plunket (chair)
Next Meeting: Wednesday, May 27th, 2:30pm, Pardee Conference Room
* David will add his presentation to the Assessment Wiki.
* Sarah may follow up with David Snyder about certain III stats.
* Linda will post the revised charge to the wiki. NOTE: If anyone would like to edit this document, be sure to turn on the "track changes" function (tools - track changes) within your version of Word.
* Linda's revised charge was accepted.
* The group generally agreed to continue analyzing survey results as a whole while simultaneously trying to create an assessment plan as a whole, rather than break into subgroups to accomplish this, at least for the time being.
* Everyone agreed that our plan should focus on what we can reasonably hope to accomplish; on practical things such as identifying how much time it takes a book to go from ordered to placement in the stacks.
1. Minutes, announcements, reports on assessment programs/webinars
A> David reported on the COUNTER webinar. He used a free presentation software available at prezi.com. COUNTER is not a standard but rather a code of practice, explicitly defining items, such as requests for full-text articles per journal, which are then presented in XML or Excel format by COUNTER-compliant vendors. Currently, compliance for things like journals or databases is fairly common, but not so much from ebook vendors yet. This info is desirable as something complementary to journal impact factors for collection development use. Part of the webinar, called "Using Counter Reports", was a presentation of how one librarian manipulates spreadsheets to present data efficiently. David speculated that a relational database would be even better, but would take a lot of work up front. Another part of the webinar focused on SUSHI, a way to automate COUNTER report downloads from different vendors. Serials Solutions offers a product (360) that does just that. ERM could be our depositing place for such reports but there are tough technical issues to resolve. David will add his presentation to the Assessment Wiki.
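Since part of the webinar was about manipulating COUNTER spreadsheets by hand, a tiny sketch of what that manipulation looks like in code may be useful. This is a hedged illustration only: the journal names, month columns, and layout below are invented and far simpler than a real COUNTER JR1 report, which carries vendor headers, reporting-period metadata, and a column per month.

```python
import csv
import io

# A made-up, simplified excerpt in the spirit of a COUNTER JR1 report
# (full-text article requests per journal); NOT the real file layout.
report = io.StringIO(
    "Journal,Jan,Feb,Mar\n"
    "Journal of Examples,10,12,9\n"
    "Annals of Sketches,4,7,5\n"
)

reader = csv.reader(report)
header = next(reader)  # skip the column headings
# Sum the monthly request counts for each journal.
totals = {row[0]: sum(int(x) for x in row[1:]) for row in reader}
print(totals)
```

Loading such totals into a relational database, as David suggested, would make cross-vendor comparison easier than per-spreadsheet manipulation.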
B> Linda reported on a NERCOMP SIG workshop entitled "Assessment into Action" that she attended. There were two presenters, including an excellent one with a focus on quantitative methods of research. The gist will likely be repeated in an upcoming BLC Assessment COI meeting. She will post the url to this presentation as soon as it's posted on the NERCOMP site.
2. Data Collection Survey
Bob's group collects no unique data, and security will likewise not provide a finished survey. David Snyder may have a more thorough response soon, and Sarah may get in touch with him about some particular items after this Wednesday.
We looked at the Cataloging response to the survey, and talked in general as to how we will use the results. Sarah suggested the survey results should inform policy-making, but noted that it would be difficult to really change historical problems such as Law's separate bibliographic records in our catalog, due originally to their RLIN membership. Tim backed that up by saying the other libraries at BU often have good reason to maintain separate bib. records. Linda hoped to be able to compare branches, or other like units, in order to streamline stat gathering - eliminating duplication of effort. David thought we might also be able to see holes in what is currently gathered - opportunities to collect other statistics that might be more meaningful to staff. Dan B. suggested that at least the survey could serve as a means to try to centralize what is currently gathered to an online form (data farm). Dan P. noted that during the process of filling out the data collection survey, Pardee staff were able to identify and eliminate the collection of a few types of data that is no longer needed.
Linda made several changes to the charge, which we all accepted gratefully. Some minor revisions were suggested, and Linda will post it to the Wiki. NOTE: If anyone would like to edit this document, be sure to turn on the "track changes" function (tools - track changes) within your version of Word.
We spoke about the ARL Spec Kit at http://www.arl.org/bm~doc/spec303book.pdf.zip in order to view sample charges and plans. Dan B. liked the idea of red-flagging items (it is USC that does this, page 89) such as "data not stored centrally", and "self-reporting where automation is possible". The group generally agreed to continue analyzing survey results as a whole while simultaneously trying to create an assessment plan as a whole, rather than break into subgroups to accomplish this, at least for the time being. The reason to take this approach was well-put by David when he noted that there are assessment actions we can take which won't be evident from the survey results, like user surveys, counter reports, and the recommendations from the consultants. Everyone agreed that our plan should focus on what we can reasonably hope to accomplish, such as identifying how much time it takes a book to go from ordered to placement in the stacks. At the same time, some policy decisions clearly must be made from a top-down perspective - such as a clear policy on the addition of electronic journals into the catalog. For those types of problems, the assessment process must be carried out, so that focus groups and the like can inform policy-making, as was stated at the beginning of the meeting by Sarah.