How do academic libraries value their electronic resources?

UK academic libraries spend millions of pounds on electronic resources annually to support learning, teaching and research at their institutions. Demonstrating the value of these online resources presents a number of challenges for libraries, especially when it comes to attributing value to e-resources in a consistent and meaningful way. The concept of resource or content “value” is subjective, and it is applied differently not only between HE institutions but also between subject areas within the same institution.

Usage statistics can be a starting point for discussion about the value of e-resources. The financial imperative for HE libraries to maximise value for money from their annual investment in online resources has never been stronger. Well-informed resource decision-making is key, and usage statistics play a crucial role in helping libraries adopt a more data-driven approach to analysing user activity across their online collections. There is a mass of data for libraries to work with and manipulate, though, so it helps if library staff know what questions they want to ask of the data to inform their decision-making processes and the criteria they set for e-resource evaluations.

Drivers for a more data-driven approach to e-resource renewals

DMU Library’s Content Delivery Team ran a “How does the library value its electronic resources?” workshop for DMU Library staff in April 2016 (I was one of the co-presenters on the session). I also attended a UKSG “Usage stats for decision-making” event in June 2016. Both events raised discussion points around how libraries currently measure resource usage and apply value to their online content. The UKSG event featured a panel of presenters from UK academic libraries, and as an information professional who deals with usage statistics on a daily basis at DMU Library, I was fascinated to hear some of the different approaches to collating usage data in UK libraries. It became clear, whilst listening to the individual presentations, that academic libraries review usage statistics in diverse ways; approaches seemed to depend on the size of the library budget and who holds responsibility for spending it, the subjects taught at the institution, the number of library staff working with usage stats, and the technical expertise of those staff when it comes to presenting usage data. Even so, there are a number of common “drivers” that libraries face when it comes to applying value in a more objective, evidence-based manner. In no particular order of importance, these drivers include:

  • Compiling usage metrics and collating the subsequent feedback from different stakeholders can be time-consuming activities for library staff. Automating these processes where possible is crucial to “free up” staff time for clear and concise data analysis (a simple sketch of this kind of automation follows this list)
  • The need for libraries to make financial savings across increasingly stretched budgets
  • Improving online resource provision for under-invested subject areas
  • Higher student expectations of university library services and provision of online content during their studies
  • Anecdotal or arbitrary (subjective) decision-making processes leading to the retention of low-use, non-relevant library resources
  • Different usage analysis is often needed for different audiences (subject teams, academic staff in faculties, management and users). Applying value in a consistent way can be a challenge if different stakeholders are evaluating usage across different contexts
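
On the automation point above, much of the routine collation can be scripted. The sketch below is a minimal, hypothetical example rather than anything in use at DMU: it assumes a folder of monthly COUNTER-style CSV exports, each with a “Resource” and a “Downloads” column, and rolls them up into an annual total per resource. Run once per reporting period, something like this removes the manual copy-and-paste step and leaves staff time for the analysis itself.

```python
import csv
from collections import defaultdict
from pathlib import Path


def collate_annual_usage(folder: str) -> dict[str, int]:
    """Sum monthly download counts per resource from CSV exports.

    Assumes each CSV has 'Resource' and 'Downloads' columns; the column
    names would need changing to match whatever the real exports contain.
    """
    totals: dict[str, int] = defaultdict(int)
    for csv_file in sorted(Path(folder).glob("*.csv")):
        with open(csv_file, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                totals[row["Resource"]] += int(row["Downloads"])
    return dict(totals)


if __name__ == "__main__":
    # "usage_2016" is a placeholder folder of monthly exports
    for resource, downloads in collate_annual_usage("usage_2016").items():
        print(f"{resource}: {downloads} downloads")
```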

Status = under review? 

The key to tackling these challenges seems to lie in asking “what stories do you want the usage data to tell?”. Establishing which criteria need to be included in resource analysis is crucial. These criteria will obviously vary from institution to institution, but also from subject to subject. Cost per use (CPU) seems to be the “go-to” measure of value for e-resources in HE libraries, and I think there was consensus at the UKSG workshop that this is not going to change any time soon: CPU is a simple calculation that relates user activity to the financial investment made by the library year on year.
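
To take a hypothetical example, a database costing £5,000 a year that records 2,000 full-text downloads over that year has a CPU of £2.50 per download; the same subscription with only 500 downloads would work out at £10 per download.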

One of the presenters at the UKSG workshop, Anna Franca from King’s College London, highlighted the potential need for usage “formulas” or “indicators” to help institutions establish more data-driven analysis for e-resource renewal or cancellation decisions. Such indicators may provide more meaningful and helpful insight when reviewing CPU (or any other type of usage data a library may use), and potentially give libraries more time to assess under-used resources on their radar and gauge the reasons behind that poor usage. Usage indicators will differ between institutions, and will need to sit alongside analysis of annual resource subscription costs, but the following example (for a full-text resource) could be used to shape resource renewal or cancellation discussions between library, faculty and management staff:

  • CPU below £1 per download = automatic renewal
  • CPU between £3 to £5 per download = “under review” status
  • CPU over £5 per download = cancellation (or substitution of content) recommended
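
As a rough illustration of how such bands might be applied in practice, here is a minimal sketch (my own hypothetical code, not an existing DMU process) that classifies a resource from its annual cost and download count. The example bands above leave the £1 to £3 range unassigned, so the sketch simply sends anything outside the named bands to a standard renewal discussion; that default is an assumption on my part.

```python
def cpu_recommendation(annual_cost_gbp: float, downloads: int) -> tuple[float, str]:
    """Return (cost per use, recommendation) for a full-text resource.

    Bands follow the illustrative indicators above; the £1-£3 gap is
    treated here as 'standard renewal review', which is an assumption.
    """
    if downloads == 0:
        return float("inf"), "no recorded use - investigate before renewal"
    cpu = annual_cost_gbp / downloads
    if cpu < 1.00:
        return cpu, "automatic renewal"
    if 3.00 <= cpu <= 5.00:
        return cpu, "under review"
    if cpu > 5.00:
        return cpu, "cancellation (or substitution of content) recommended"
    return cpu, "standard renewal review"


# Example: a £5,000 subscription with 2,000 downloads works out at £2.50 per download
print(cpu_recommendation(5000, 2000))  # (2.5, 'standard renewal review')
print(cpu_recommendation(5000, 800))   # (6.25, 'cancellation (or substitution of content) recommended')
```

Anything falling into the “under review” band would then trigger the kind of contextual assessment described below, rather than an automatic decision.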

The indicators above are simply a guide to the kind of banding a library could apply to implement a more data-driven process for resource decision-making. I appreciate that for any renewal or cancellation decision, additional context will always be required to provide a comprehensive overview of the usage and relevance of an e-resource to a particular university faculty or department. Some online resources may be deemed specialist or niche, aimed at, and used by, a smaller cohort of students. The CPU indicators may therefore have to be shifted to take this “specialist” tag into consideration (for example, £3 to £5 per download may represent very good value for a particular specialist database). The UKSG usage stats workshop highlighted some other factors around resource and subject profiles that library staff may need to take into account when reviewing e-resource usage activity:

  • Fluctuating inflation/currency exchange rates over time. These may skew resource CPU figures for individual years
  • Changes to teaching and learning at the institution over time – subjects will have varying “characteristics” that define that area of study (for example, differences in online publication formats and frequencies)
  • Publisher problems – inaccessible resources or poorly designed user interfaces
  • (Lack of) promotion and marketing of the e-resource
  • Unique content versus duplicated content (available via alternative library databases or collections). This “overlap analysis” is an increasing part of my day-to-day library work (see the sketch after this list)
  • Digital rights management (DRM) limitations when looking at e-book usage metrics
  • Potential interlibrary loan costs incurred if an e-resource subscription is cancelled
  • Open access versus paid-for content
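
On the overlap-analysis point, a basic comparison can be made using two title lists exported from the relevant platforms or from the library’s knowledge base (e.g. KBART files). The sketch below is a simplified, hypothetical example: it assumes each export is a CSV with an “ISSN” column and reports how much of one collection is duplicated in the other.

```python
import csv


def load_issns(path: str) -> set[str]:
    """Read a title-list CSV and return the set of non-empty ISSNs.

    Assumes an 'ISSN' column; real exports will need their actual
    column names substituting here.
    """
    with open(path, newline="", encoding="utf-8") as f:
        return {row["ISSN"].strip() for row in csv.DictReader(f) if row["ISSN"].strip()}


# Hypothetical file names for two competing full-text databases
collection_a = load_issns("database_a_titles.csv")
collection_b = load_issns("database_b_titles.csv")

overlap = collection_a & collection_b
unique_to_a = collection_a - collection_b

print(f"Titles in A also available in B: {len(overlap)} "
      f"({len(overlap) / len(collection_a):.0%} of A)")
print(f"Titles unique to A: {len(unique_to_a)}")
```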

The future?

After listening to the different library presenters at the UKSG usage stats workshop, and discussing themes around usage statistics with some of the other attendees, it became clear to me that DMU Library could benefit in a number of ways from progressing towards a more data-driven framework for e-resource renewals. I was keen to take what I had seen and heard at the UKSG event back to DMU and discuss it with colleagues in the Content Delivery Team and the wider department.

I believe a revised approach to usage data would create extra lead time for library staff to prioritise the analysis of under-used databases and collections purchased by the directorate. Content Delivery currently presents usage and cost analysis for the renewal of every subscribed e-resource in the library’s portfolio. This can be time-consuming for Content Delivery staff to assemble and present, especially when several e-resource subscriptions expire at the same time. It would be wiser to concentrate on, and allow more evaluation time for, those e-resources that suffer low usage.

Library subject teams could then comprehensively examine the factors that lie behind an e-resource’s poor use (mirroring some of the additional context around e-resource usage discussed earlier in this post). Action plans could be devised and set in motion to try to increase usage of the e-resource over a period of time (e.g. via targeted promotion to students and staff), and these actions could then be reviewed to see whether they had any effect. The outcome of the e-resource appraisal may result in the outright cancellation of the e-resource (depending on the immediate need to save money), but it may also prompt libraries to check the viability of alternative (but more relevant?) online products on the market to replace or act as a substitute for existing content.
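
Reviewing whether an action plan has had any effect can itself be kept simple. The sketch below is a hypothetical illustration rather than an existing DMU process: given a list of monthly download counts and the month in which targeted promotion began, it compares average monthly usage before and after that point. In practice seasonal patterns in teaching would need accounting for, but even a crude comparison like this gives subject teams a starting point for the review.

```python
def usage_change(monthly_downloads: list[int], action_month_index: int) -> float:
    """Return the percentage change in average monthly downloads after an
    action plan (e.g. targeted promotion) began.

    monthly_downloads : downloads per month, in chronological order
    action_month_index: index of the first month after the action started
    """
    before = monthly_downloads[:action_month_index]
    after = monthly_downloads[action_month_index:]
    if not before or not after:
        raise ValueError("Need usage data on both sides of the action date")
    avg_before = sum(before) / len(before)
    avg_after = sum(after) / len(after)
    return (avg_after - avg_before) / avg_before * 100


# Example: promotion began in month 7 of a 12-month window (illustrative figures)
downloads = [40, 35, 42, 38, 30, 36, 55, 60, 58, 62, 57, 65]
print(f"Average monthly usage changed by {usage_change(downloads, 6):+.1f}%")
```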

Going forward, I can certainly see value in creating an “under review” status for e-resources that display a high cost per use against the usage indicators or banding set by the library. The Content Delivery Team are still at an early stage of working on this framework proposal. The April 2016 in-house DMU event on “how do we value our electronic resources?” reinforced the point that there are many variables involved in reviewing the impact of, and attributing value to, e-resources in different subject areas. That said, I also came away believing that an increased evidence-based focus for e-resource selection and retention would help strengthen the collaboration between Content Delivery (the producers of usage data) and Academic Liaison Teams (the assessors of that data). This improved liaison between library teams can only help the department in delivering one of its key objectives: to “maximise the usage and impact of the information resources available to users” at DMU.

About Mitchell Dunkley

I work in the Content Delivery Team at De Montfort University Library. I manage DMU Library's e-resources portfolio and I am involved in library systems admin, collating resource usage statistics and troubleshooting. All comments are my own.