This past week I went to the ELUNA conference. I try and send conference notes out when I go to a conference; in this case all of us use Alma to some extent, so I’m sending my notes to TS.
If you haven’t been to ELUNA before, the format is that there are big ‘plenary’ sessions, generally presented by Ex Libris staff, and then concurrent sessions (which means you choose which sessions to attend). Personally, I find the plenary sessions to be a waste of time—they are generally used by Ex Libris as an opportunity to big themselves up and to talk about things they’re hoping to do in the medium/long term. I find the roadmap documents to be more useful. However, the concurrent sessions can be excellent—they are generally presented by people who use Alma, and they can be very practical explanations of how they do certain things, improve their workflows, etc.
I’ll provide a shorter version of my notes from this year’s conference below. If you want more detail, just let me know—I’ve got longer notes for most sessions. Note that even though this is the shorter version of my notes, this is still a very long email. Apologies! (and feel free to ignore it)
The first session I attended was presented by Kun Lin, Sondra Mahserejian and me! It was nice to have the first session, because that meant I didn’t need to stress about it for the rest of the conference! We talked about Alma<>WorkDay integrations, as well as GOBI and student systems.
There were quite a few good questions from the audience. There are a lot of institutions that are either migrating to Alma (and already have WorkDay Finance) or adopting WorkDay Finance (and are already on Alma), so it was a good opportunity to showcase how we do things.
Note that there was a session happening at the same time as ours: “But we’ve always done it this way—challenges and strategies in enacting technical services change”, but sadly I had to attend the session I was presenting at.
The second session I wanted to attend, “Using Analytics to find problems in your catalog”, was too full (standing room only), so I skipped it. I’m going to catch up on that one via the slides.
So instead I went a couple of rooms down and attended “Harnessing the power of BIBFRAME in knowledge graphs for LLMs and generative AI”. The main point of this session was that bibliographic data stored in BIBFRAME (which is linked data) is easier for large language models to harvest, so it can feed the knowledge graphs that generative AI systems draw on.
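To make that concrete, here’s a tiny sketch (my own, not from the session) of what a BIBFRAME description looks like as linked data, built with Python’s rdflib. The URIs and title are invented placeholders:

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

# BIBFRAME vocabulary published by the Library of Congress
BF = Namespace("http://id.loc.gov/ontologies/bibframe/")

g = Graph()
g.bind("bf", BF)

# A hypothetical Work and its Instance (URIs are made up)
work = URIRef("http://example.org/works/1")
instance = URIRef("http://example.org/instances/1")
title = URIRef("http://example.org/titles/1")

g.add((work, RDF.type, BF.Work))
g.add((instance, RDF.type, BF.Instance))
g.add((instance, BF.instanceOf, work))
g.add((title, RDF.type, BF.Title))
g.add((title, BF.mainTitle, Literal("An Example Title")))
g.add((instance, BF.title, title))

# Each statement comes out as an explicit, self-describing triple,
# which is what makes this kind of data easy for a harvesting
# pipeline to consume.
print(g.serialize(format="turtle"))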
A reparative cataloging community of practice in the WRLC (a consortium in Washington DC)
They explained what reparative cataloging is (they noted that the WRLC contains 9 institutions; one is Gallaudet, which serves deaf and hard-of-hearing students, and two are HBCUs)
They established a community of practice for reparative cataloging in 2021; it’s made up of 11 librarians from all 9 member institutions. Their goal is ‘progress not perfection’
Among other things, they write a monthly newsletter that’s distributed to all 9 member libraries, with updates on their activities as well as other cataloging-related news (e.g. new subject headings)
One project they worked on was subject heading remediation: either replacing or supplementing LCSH terms in their catalog
Location, location, location – revamping the request for new location process
This was presented by the systems librarian at the University of Oregon. She went through the process that existed (and the new process) for requesting new Alma locations (e.g. “dml-stk”)
She talked about the complexity in managing locations, and what they did to make it simpler.
Takeaways:
i. This is something I think we could improve at USC: not just the process for creating new locations, but also auditing the locations we already have, and maybe writing up some documentation about what our strategy is. I think I will raise this in the ILS WG.
ii. She tangentially mentioned the Poison Book Project, which I don’t think I’d heard of. It tracks rare books whose bindings have been found to contain arsenic or chromium pigments. I checked our catalog and we have about 10, so Marta is going to look into these and we will discuss in the SC/TS WG to determine if we need to have a strategy.
iii. Finally, one of the central theses of this presentation was that documentation is vitally important, and that it should be treated as a living document. I know we have a lot to improve in this area, and it’s something I’m going to try and continue to work on.
iv. I’m also going to try and think about some specific location problems at USC and consider whether we need to look for a solution for them.
The next session I attended was focused on a specific cataloging workflow in remote storage facilities. I attended because I thought it might be applicable to some of our workflows (either archival or cataloged materials). It sort of was:
Takeaways
i. One theme was that a lot of this cleanup stemmed from decisions made by long-retired colleagues. With today’s knowledge these look like bad decisions, but that doesn’t really matter; the important thing is to focus on the cleanup.
ii. They were working on low-use materials: microfilms, GovDocs, phonodiscs (LPs), and parts (they separate scores and parts!!). To me these all seemed like candidates for either weeding or significant changes (reuniting the scores and parts).
Using new features in Alma to set up easier queries in Analytics
This wasn’t a great presentation.
The main takeaway for me is that I need to ask ILS to set up some widgets for me—I have lots of reports sent to me, but no widgets.
OCLC data sync at Princeton
This was an excellent presentation, and it relates to something that Sandi and I have been working on over the past few months—setting up Alma so that it sends our holdings to OCLC (so catalogers don’t need to manually maintain our holdings) and getting updated OCLC records back (this is called data sync).
He talked through the configuration they used, and noted that they only had data sync turned on briefly, because they hadn’t fully prepared for what they were doing.
i. This was a helpful reminder that when we do this at USC we should roll it out incrementally to make sure we catch any problems before they propagate; see the sketch below for one way we might spot-check records first.
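As an illustration of what “incremental” could look like (this is my own idea, not what Princeton did): before enabling data sync broadly, we could spot-check a small batch of bib records via the Alma REST API and flag any that are missing an OCLC number. The API key and MMS IDs below are placeholders, and I’m assuming the North America API region:

import requests

API_KEY = "REPLACE_WITH_ALMA_API_KEY"  # placeholder
BASE = "https://api-na.hosted.exlibrisgroup.com/almaws/v1"  # NA region assumed

# A small hand-picked batch of MMS IDs to spot-check (placeholders)
mms_ids = ["9912345678901234", "9923456789012345"]

for mms_id in mms_ids:
    resp = requests.get(
        f"{BASE}/bibs/{mms_id}",
        params={"apikey": API_KEY, "format": "json"},
    )
    resp.raise_for_status()
    record = resp.json()
    # Alma's JSON bib response carries the MARC record as XML in the
    # "anies" field; OCLC numbers appear in 035 with an (OCoLC) prefix.
    marc_xml = record.get("anies", [""])[0]
    if "(OCoLC)" in marc_xml:
        print(f"{mms_id}: has an OCLC number, looks OK")
    else:
        print(f"{mms_id}: no OCLC number found; review before data sync")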
Setting up LHR data sync with OCLC – Not just for serials!
Hopefully we will do this at USC, but for us it will just be for serials!
Why do we do things this way?
This was a presentation from an institution that had been on Alma for 13 years, and they were taking the time to reevaluate some of their workflows.
They mentioned some things that were familiar to me from USC (a reluctance to delete lost items, for instance)
i. They did a series of liball training sessions to make sure that everyone who wanted information from Alma had the skills to get it for themselves.
ii. They have turned on a lot more autocollections for electronic resources in an effort to simplify the workflow for themselves
iii. They have done a complete overhaul of borrowing guidelines, significantly reducing the number of policies in use
California Ex Libris User Group – There will be a conference in Long Beach October 24-25
Navigating Change from the middle
This was a presentation by a fairly new e-resources librarian about how to manage change in that (newly created) department. It wasn’t a great presentation.
Faster, Better, Happier Alma management with Scrum
This was the same presenter from Princeton who talked earlier about their OCLC datasync. This was focused on how they achieved that project (and others) using the agile/scrum project management frameworks. It was really interesting.
He mentioned that LinkedIn Learning has an Agile Foundations course; they went through it together and learned a lot. I might try and take this course.
The final session I went to was from an Ex Libris presenter talking about “Maximizing Alma’s impact through library best practices”. This is something that I’m glad Ex Libris is doing (it’s been a real gap in their training and support, I think). Her suggestions were themed around automation (e.g. using APIs with vendors), decision-making (setting up Analytics reports), simplifying workflows (making sure you’re using import profiles), and making use of Alma innovations (like cloud apps).
I will try and go through this a little more closely and see if there’s anything I’m missing here.