#Digifest18 reflections…

On 6th-7th March 2018 I attended #Digifest18. The festival is hosted annually by Jisc at the ICC in Birmingham. Digifest embraces all things “digital”, highlighting how digital technologies and online services are used in the HE and FE sectors to support learners. The two-day event is free to attend and the conference programme features a mix of keynote talks, workshops, plenary sessions, panel discussions and vendor exhibitions. As an information professional working at an HE library, I find Digifest an interesting, enthralling, challenging and (ultimately) rewarding event to attend. The event keeps me up to date with emerging digital technologies in the education sector, brings in interesting speakers to discuss current trends and themes affecting digital industries, and gives delegates the chance to hear case studies of best practice and to debate the opportunities and problems that arise with electronic services and strategies.

A tractor simulator being tested in the exhibition hall at #Digifest18

Here are a few of my personal reflections on the #Digifest18 event…

Developing student and staff digital capabilities

As per #Digifest17, developing digital literacies was a theme that regularly cropped up in workshops, plenaries and panel discussions over the two days. The #Digifest18 conference tag line was Harnessing the power of edtech: thriving in a digital world, so it was not surprising to see a lot of debate and discussion amongst speakers and delegates with regard to this theme. A number of sessions focused on the need to support learners (and the staff who teach them) in an ever-changing digital world. Whilst digital technologies can open up learning and education to many students, various challenges remain: widening student participation and engagement with digital systems (removing financial or social barriers to learning), valuing the mental wellbeing of both learners and teachers, and providing a landscape of blended learning and mixed methods that encourages students and staff to embrace digital technologies (both the online systems used and the learning spaces they are used in). The passionate opening #Digifest18 keynote, from National Union of Students (NUS) President Shakira Martin, emphasised the need for organisations to provide flexible and adaptable systems in education to support learners of all ages. It was great to have a student-focused keynote to start the conference – it is vital for students to be asked what they want from their digital learning experience and for institutions to listen attentively and act on this feedback.

I came away from Digifest thinking that assumptions or preconceived ideas made by educational establishments can often be misleading – we think we know what the student journey looks like, but students’ expectations and the technologies they use can change in a very short time. This was echoed by some of the speakers at the event and attendees tweeting during the conference (for example, Ange Fitzpatrick’s tweet below):

Students are assumed to be part of an intrinsically “digital” generation (I heard the well-worn “Digital Natives” concept raised again at this year’s conference) but barriers to digital participation do remain. Some of the workshops I attended highlighted that not all learners are tech-savvy and not all learners have direct access to digital services and electronic devices. That raises the question of how universities/colleges/libraries can interact with students across the board and provide them with fair access to these digital products and tools.

The What do students want? workshop and Going where the wild students are! debate both highlighted a wider discussion around students’ feedback and provided several case studies showing how institutions and libraries can address the development of student digital skills. Learning can take place in the gaps between formal learning (traditional lectures/seminars). Trips to the library, a coffee shop, studying in halls or at home are all learning environments, whether studying independently or collectively. The need for flexibility and adaptability in learning spaces was voiced – the student experience is a wide spectrum and organisations need to cater for different student needs (within obvious financial and space limitations).

The Going where the wild students are! discussion underlined that some students want the “wild rumpus!” of a busy group study learning space that they can customise and experiment with (students having the autonomy to move different forms of furniture and employ different digital tech to support their learning). On the other hand, some students want structured learning spaces that they can use for quiet or silent study. These physical and digital learning environments should be “digitally-rich”, with robust Wi-Fi (now a basic human requirement!), laptops to loan and plenty of sockets for students to power their devices! Institutions need to get their IT departments on board to remove access barriers and make the transition to tech as seamless as possible. Group spaces should encourage creativity and collaboration and be a safe space for students to ask questions about their learning. In terms of libraries, both sessions I mention above called attention to the fact that students can view libraries as a “home” (the social) or as a “workplace” (learning, employment). Libraries and librarians add experience and know-how in using resources, finding information and navigating learners through online content. The University of Liverpool described how they take library services into less well-used spaces on campus with their “Library On Tour” branding.

The What do students want? session emphasised that students like to feel valued. Creating a sense of “community” or “belonging” is important to learners. The session recommended that students should be involved in local conversations around institutional decision making and design – communicate with students in the digital spaces they inhabit and create conversations using different tools to engage. Social media was deemed very important in this conversation, with increasing use of Snapchat and Instagram by younger, prospective students. The expectations of future students can change swiftly and innovations at institutions can often take a while to set up, so libraries and universities have to be agile, evolve quickly and speed up workflows to future-proof learning and their ability to meet students’ demands.

Universal Design for Learning (UDL)

Another topic that featured at #Digifest18 was resource accessibility and usability. Again, this theme fits neatly with the opening keynote talk addressing the need for flexible and adaptable educational systems and spaces. I have been involved in some work at DMU Library looking at online resource access via smartphones and tablets, and at providing support for learners using mobile apps and sites. DMU labels this type of work Universal Design for Learning, or UDL. UDL views the learning environment as the disabling factor, not the individual learner.

The Infiltrating the systems – inclusive policies as a driver for effective practice workshop showcased collaboration between the University of Kent and Jisc as part of the OPERA project. The presenters stated that inclusive policies are a universal and positive concept (they improve learning for all) but that putting these policies into practice can often be difficult. I found this session very enlightening and the presenters demonstrated some useful ideas which provided plenty of food for thought. The University of Kent have re-badged their assistive technologies as productivity tools for all learners – “Smarter tools for study”. These are online tools and apps that make learning better for all students and remove or reduce barriers to online content. A number of presenters throughout the event called for a much more holistic attitude at universities with regard to student support – it is important for university departments to work more closely together to help meet the needs of students.


The Jisc Digifest event will always be a celebration of digital technologies, but I felt this year’s conference was more muted than in previous years. Maybe this was due to the current challenges facing the HE and FE sectors – funding uncertainties, the impact of Brexit and organisations being asked to do more with less. These challenges potentially threaten the ability of organisations to comprehensively support the learning journeys of their students, but simple and inexpensive digital innovations can still significantly improve the learner experience.

The festival promoted the need for organisations to be agile, adaptable (yes, that word again!) and to listen to their learners with regard to their needs and requirements. A statement from Nick Woolley (head of libraries at the University of Northumbria) resonated with me – the relationship between a student and their organisation is a fluid one. It is ever-changing, dependent upon the student’s need(s) at any given time. Students can be learners, partners, co-creators or consumers (the idea of students as “customers” was challenged several times at the event, although most organisations do focus their front-of-house operations on customer service principles).

I am part of a De Montfort University Library working group planning a Learning At Work Week for library staff in May 2018. #Digifest18 prompted many ideas to take back to this group. I think engaging staff in developing their own digital capabilities and awareness is important in supporting learners and in ensuring staff feel they are contributing to a positive student learning experience. This engagement requires strategies that promote training, collaboration and skills sharing.

As ever, the Digifest event was a pleasure to attend and provided me with lots of ideas, content and material to make me reflect on my own work at DMU Library and to share with other library colleagues. The festival also provides a valuable networking opportunity to speak with peers and friends from other libraries/organisations who attend. I look forward to #Digifest19 and would recommend the conference to any librarian or information professional dealing with digital content and systems.




Using Trello to manage DMU Library’s Under Review process for online resources renewals

In August 2016, I wrote a blog post outlining the initial stages of work in trying to establish an “Under Review” framework for online resources purchases and renewals at DMU Library. The aims of the Under Review process are a) to deliver a more data-driven approach to the selection, retention or potential cancellation of the electronic resources that DMU Library purchases on an annual basis and b) to improve communication and collaboration between the DMU Library teams involved in analysing the value of library online subscriptions.

The infographic I created below outlines the basic components of the Under Review workflow:

Infographic highlighting the different stages of the Under Review workflow.

The Under Review process is centred on the creation of usage and cost triggers that highlight if an online library resource is being poorly used or has increased in cost beyond expected budget allocations. If any of these usage or cost markers are activated, the resource is flagged as “Under Review”, meaning that further evaluation of the low usage or high annual cost is required to ensure that the resource is delivering maximum value to students and staff.

Under Review Group

In early 2017, an Under Review group was assembled at DMU Library. The group comprises staff representing the different library teams involved in the selection, acquisition and implementation of library resources – this includes subject librarians, Bibliographic Services staff, Content Delivery colleagues and library senior management. The starting remit of the group was to create some usage and cost markers that would underpin the Under Review process. The group would meet on a monthly basis to evaluate resources that hit any of these triggers.

Content Delivery prepares regular resource usage and cost analysis for library subject teams, and this analysis can be referred to the Under Review group to discuss whether any usage or cost triggers have been hit. The current usage and cost triggers agreed by the Under Review group are listed below (a minimal sketch of how the triggers might be checked follows the list):

  • Resource cost per use (CPU) of more than £15
  • Annual resource cost increase of more than 5%
  • Annual resource usage decrease of 25% or more
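To make these triggers concrete, here is a minimal Python sketch (not part of the library’s actual tooling) of how the three markers above could be checked against a resource record; the dictionary fields and the example figures are invented purely for illustration.

```python
# Hypothetical illustration of the Under Review triggers described above.
# The resource record and its field names are invented for this example.

def under_review_triggers(resource):
    """Return a list of any Under Review markers hit by a resource."""
    triggers = []
    cpu = resource["annual_cost"] / max(resource["annual_uses"], 1)
    if cpu > 15:                                  # cost per use of more than £15
        triggers.append(f"CPU of £{cpu:.2f} exceeds £15")
    if resource["cost_increase_pct"] > 5:         # annual cost increase over 5%
        triggers.append(f"Cost up {resource['cost_increase_pct']}% year on year")
    if resource["usage_change_pct"] <= -25:       # usage fall of 25% or more
        triggers.append(f"Usage down {abs(resource['usage_change_pct'])}%")
    return triggers

example = {"title": "Journal of Examples", "annual_cost": 1800, "annual_uses": 90,
           "cost_increase_pct": 7, "usage_change_pct": -30}
print(under_review_triggers(example))  # all three triggers fire for this dummy record
```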

Using Trello in Under Review meetings

To assist the Under Review workflow, the group decided to use a free online service called Trello from the outset of the process. Content Delivery staff had utilised Trello in a number of other library projects, so the group hoped that Trello would help with the management of Under Review meetings and any actions derived from them. Trello visualises projects and tasks in an easy-to-use, intuitive and flexible way and the working group were keen to take advantage of these benefits to help encourage collaboration between staff involved in the Under Review process.

An Under Review Group Trello Board was created and would be populated as and when meetings and dialogue took place. The Lists feature on Trello would allow the agenda for each monthly meeting to be set up (and added to) in advance of the group getting together. Trello Cards could then be used with the Lists functionality to document discussion and decision-making around e-resources selection in a granular way:

An example of the type of Trello Board used in the Under Review process.

For an individual member of the group to interact with the board, they had to first register with Trello to create a free profile. Once library staff had registered with the service they could then view the board and utilise Trello features. The Under Review triggers were added to the board, using a list, and were assigned different coloured labels. This would allow the group to add the usage and cost triggers to individual resource cards in a simple but visually interesting way. The group also decided to take advantage of several other free Trello features to help organise the work.

An example of some of the Trello features that are used to organise work within the group.

Some of these Trello features include:

  • The usage and cost analysis for any particular online resource (most commonly an Excel spreadsheet) can be added to the relevant Trello Card and viewed/shared by the group. Trello allows you to share files and links on each separate card created.
  • Discussion from group members can be recorded in the card itself by using the Add Comment feature. This reduces the need to send copious emails between group members and stores feedback in one place.
  • A Trello Checklist of actions related to the resource can be created and tasks can be “ticked” when completed (the checklist even shows the % progress if tasks remain outstanding). These actions can also be assigned to individual members of the group, as long as the assignee is registered with the Trello site.
  • A deadline for discussion or related actions to be completed can also be added by using the Trello Due Date feature. A calendar pop-up is displayed, from which a date and time can be selected.

These features are basic (as part of the free registration with Trello) but they are quick to set up and effective in helping to support the administration of the Under Review work. Other features known as Trello Power-Ups are available to explore and use on the site. Power-Ups are applications on Trello that enhance a Board by linking to other online services (e.g. the Google Drive Power-Up).
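For groups wanting to automate some of this admin, Trello also offers a REST API alongside the web interface. The sketch below is only a hypothetical illustration (it is not something the Under Review group uses): it creates a new card on a nominated list using Python’s requests library, with the API key, token and list ID as placeholders you would obtain from your own Trello account.

```python
# Hypothetical sketch: creating an Under Review card via the Trello REST API.
# API_KEY, TOKEN and LIST_ID are placeholders for your own Trello credentials.
import requests

API_KEY = "your-trello-api-key"
TOKEN = "your-trello-token"
LIST_ID = "id-of-the-under-review-list"

def create_under_review_card(title, note, due=None):
    """Create a card on the Under Review list and return Trello's JSON response."""
    params = {"key": API_KEY, "token": TOKEN, "idList": LIST_ID,
              "name": title, "desc": note}
    if due:
        params["due"] = due  # e.g. "2018-05-01"
    response = requests.post("https://api.trello.com/1/cards", params=params)
    response.raise_for_status()
    return response.json()

card = create_under_review_card("Journal of Examples",
                                "CPU over £15 in 2017-18 - discuss at next meeting")
print(card["shortUrl"])  # link to the newly created card
```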

DMU Library’s Under Review process has now been live for approximately a year and has provided the department with an increased evidence-based focus when evaluating library online resources. There is more effective collaboration between library teams and the Under Review meetings act as a forum to raise questions about the perceived value and relevance of library e-resources. The group may suggest additional resource promotion to tackle poor usage, or investigate the development of better access pathways that will improve the discoverability of a resource and hopefully increase subsequent usage. I am sure there will be tweaks to the Under Review process as the library group moves forward, but the underlying principles of the workflow will ensure better informed decisions are made by the library when it comes to selecting resources, services and systems to support students and staff.


Using JUSP COUNTER JR1 usage data to review DMU Library’s ScienceDirect journal titles

I have previously blogged about my work in helping DMU Library to create a framework to move towards a more data-driven process for online resources renewals. The context for this work is to ensure that DMU Library is making well informed, evidence-based decisions relating to its electronic resources subscriptions. It is vital for DMU Library to continually maximise the value of its annual investment in online resources and provide access to relevant electronic content to support learning, teaching and research across the university.

Subscribed v Non-Subscribed titles

One electronic resource that I concentrated on in the summer of 2016 was Elsevier’s ScienceDirect full text journal collection. Like many UK HE institutions, DMU Library subscribes to ScienceDirect via the national Jisc/NESLi agreement. This is a long-standing subscription cost that constitutes a major spend from the library’s annual online resources budget. The construction of the ScienceDirect e-journal agreement means that DMU Library connects to content in two ways:

  • Subscribed (or core) titles. These are journals in the collection that have been selected by library subject teams to best support learning, teaching and research at the institution.
  • Non-Subscribed (or non-core) titles via ScienceDirect Freedom Collection content. These are non-core publications that the library purchases as top-up content to the Subscribed titles in the collection.

As the Jisc/NESLi agreement with ScienceDirect was to be re-negotiated for 2017, July 2016 was a suitable time for the library to re-evaluate the value and relevancy of its ScienceDirect Subscribed titles. The existing Elsevier agreement allowed UK universities (should they wish) to substitute Subscribed journals for Non-Subscribed journals in the collection, as long as the institution’s contractual spend in the deal was maintained. If the ScienceDirect journal usage data for DMU highlighted that individual titles no longer supported academic faculty interests, then the library would act to increase the value of its Subscribed titles before the new 2017 journal agreement was in place. The review also made sense when looking at maintaining post-cancellation access rights for DMU Subscribed titles should the library decide not to renew the Elsevier agreement.

After several days of planning, the objectives for the ScienceDirect review were agreed. These were:

  • Identify Subscribed ScienceDirect journal titles with low usage with a view to replacing them with high use Non-Subscribed titles. Any title substitutions would have to be made in line with the institution maintaining its annual contractual spend with the provider.
  • Base the review on COUNTER JR1 usage data exported from the JUSP service and 2016 journal list prices from Elsevier. This usage and cost data would then be combined to provide a cost-per-use (CPU) metric for each reviewed Subscribed and Non-Subscribed title. As libraries can mark “core” titles in the JUSP service (via access to KB+), DMU Subscribed titles could be easily filtered for display in the exported COUNTER file.
  • Usage and cost analysis to be collated by the library’s Content Delivery Team and disseminated to subject librarians in Academic Liaison for evaluation.
  • Outcomes from the evaluation process to be captured by Content Delivery and candidates for substitution communicated back to the provider.

JUSP allows libraries to mark up “core” titles in their electronic journal collections

Ready, steady, go…

I have mapped out the ScienceDirect review workflow in more detail below:

  1. Exported Elsevier JR1 COUNTER usage data from JUSP to Microsoft Excel. The statistics covered the period from Jan 2012 to Jun 2016. As DMU Library had already identified existing Subscribed (or core) titles in the JUSP service via KB+, I was able to filter the usage data to display Subscribed titles only and then sort the data from low to high usage.
  2. To ensure the review was more effective and up-to-date, the Elsevier usage data was refined further to cover usage between Jan 2015 and Jun 2016. The data was re-sorted: Subscribed titles from low to high usage, Non-Subscribed titles from high to low usage.
  3. Received a breakdown of 2016 journal costs from Elsevier to aid the cost analysis in the review – costs for Subscribed titles in the agreement, and list prices for the top 25 most used Non-Subscribed titles.
  4. This left two Excel files to work with – usage data exported from JUSP and journal costs from Elsevier. Used the VLOOKUP function in Excel to merge the two datasets into one spreadsheet, matching on journal title ISSNs.
  5. Created a CPU metric for current Subscribed titles and the top 25 most used Non-Subscribed titles. Created proposed lists of Subscribed titles with high CPU (to be removed) and Non-Subscribed titles with low CPU (for addition). Analysis sent to subject librarians in Academic Liaison for evaluation (a rough pandas equivalent of steps 4 and 5 is sketched after this list).
  6. Based on the CPU metric, subject teams selected proposed Subscribed titles for substitution. Proposed titles were sent to Elsevier, who ratified the substitutions and made the relevant changes to DMU Library’s Subscribed journals list for 2017.
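Steps 4 and 5 above could equally be reproduced outside Excel. The following is a rough pandas equivalent of the VLOOKUP merge and CPU calculation, offered purely as an illustration: the file names and column names (“ISSN”, “Downloads”, “ListPrice” and a boolean “Subscribed” flag derived from the core-title marking) are assumptions, not the actual DMU spreadsheets.

```python
# Rough pandas equivalent of steps 4-5: merge JUSP usage with Elsevier list
# prices on ISSN and derive a cost-per-use (CPU) figure. File and column
# names are assumed for illustration only.
import pandas as pd

usage = pd.read_excel("jusp_jr1_jan2015_jun2016.xlsx")    # ISSN, Title, Downloads, Subscribed
prices = pd.read_excel("elsevier_2016_list_prices.xlsx")  # ISSN, ListPrice

merged = usage.merge(prices, on="ISSN", how="inner")      # VLOOKUP-style match on ISSN
merged["CPU"] = merged["ListPrice"] / merged["Downloads"].clip(lower=1)

# Subscribed titles with the worst CPU are candidates for removal;
# heavily used Non-Subscribed titles with the best CPU are candidates to add.
worst_subscribed = (merged[merged["Subscribed"]]
                    .sort_values("CPU", ascending=False).head(20))
best_unsubscribed = (merged[~merged["Subscribed"]]
                     .sort_values("CPU").head(20))
print(worst_subscribed[["Title", "Downloads", "ListPrice", "CPU"]])
```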

CPU: what constitutes good value?

CPU provides a basic appraisal of library user activity in line with the financial investment made by the library. CPU may well represent a starting point for library staff discussion with regard to the value of an online resource or collection. CPU does require context though, and one way to achieve this is to set up some usage indicators. I blogged about creating usage indicators to help shape resource renewal or cancellation decisions in my Aug 2016 post asking “How do academic libraries value their electronic resources?“.

Whilst not strictly applied by subject teams at DMU in this example of a journals review, the general usage indicators below could be applied by libraries when approaching this type of journal substitution work (a short sketch applying these bands follows the list). Obviously, other factors may need to be considered, but it does provide a framework for a move towards a more data-driven decision-making process:

  • CPU < £1: excellent value, automatically retain title.
  • CPU between £1-£5: good to fair value, recommend retaining title.
  • CPU between £5-£10: fair to poor value, investigate reasons for low use, potential substitution.
  • CPU > £10: automatic substitution.
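As a simple illustration (again, not DMU’s actual tooling), the bands above could be applied programmatically across a title list; the thresholds mirror the list and the CPU figures are invented.

```python
# Hypothetical application of the CPU bands above to a set of journal titles.
def recommendation(cpu):
    """Map a cost-per-use figure (in £) to one of the bands listed above."""
    if cpu < 1:
        return "excellent value - automatically retain"
    if cpu <= 5:
        return "good to fair value - recommend retaining"
    if cpu <= 10:
        return "fair to poor value - investigate low use, potential substitution"
    return "automatic substitution"

titles = {"Journal A": 0.40, "Journal B": 3.75, "Journal C": 8.20, "Journal D": 16.00}
for title, cpu in titles.items():
    print(f"{title}: £{cpu:.2f} per use -> {recommendation(cpu)}")
```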

Outcomes of the review

After the review was completed, and the revised Subscribed titles list was approved by Elsevier and Jisc, DMU Library had removed approximately 20 low use, high CPU Subscribed titles and replaced them with a similar number of high use, low CPU Non-Subscribed titles for the new 2017 agreement. When analysing DMU’s revised Subscribed titles list, Jisc reported (based on 2015 full year COUNTER usage) that the substituted titles would increase the proportion of DMU core title usage in the Elsevier deal from around 18% to 29%.

I found the ScienceDirect review process to be a worthwhile activity. The process was not without its challenges, but it did provide an opportunity for library staff collaboration and discussion. It also gave the directorate a chance to engage with useful library user activity data and make online resource selection decisions that will hopefully provide more relevant content and support for students and staff at the institution.


Reflections on #Digifest17

This week I attended my third Jisc Digifest conference. The free-to-attend event is hosted annually at the ICC in Birmingham, and is organised by Jisc to celebrate all things “digital” in the education sector. Here are a few of my personal reflections on the #Digifest17 conference.


My #Digifest17 delegate badge and lanyard.

Jisc’s Digifest conference did not disappoint once again – the 2017 event contained a plethora of interesting and insightful talks, workshops, demos and debates. Deciding which of these talks to go to was easily done using the interactive Digifest app on my iPad – you could access the event programme via the app and create your own personal schedule for the two days of the conference.

I think the two main themes for me at #Digifest17 were “visualisation” and “digital literacies”. A number of the Digifest talks covered these topics when highlighting online resources or services.

Digital literacies

The “Building digital expertise in your organisation” discussion featured a number of case studies highlighting different institutional approaches to develop staff digital capabilities. I especially found North Lindsey College’s “DPD Go!” programme to be a fascinating example of creating a framework for increasing digital literacies and core competencies for staff – the idea of an “app club” for library staff is one which I think will resonate with colleagues I work with at DMU. The discussion promoted the need to embed digital capabilities in the workplace and for staff to engage with this process – digital skills can raise staff confidence (staff like to be rewarded and celebrate individual achievements) and directly add value to the learning experience of students. These ideas echoed similar outcomes in the recent 2017 NMC Horizon report. There was a sense though that organisations had to be persistent and consistent in their approach, and use “multi-pronged” tactics to support staff development and training in this area.

Whilst the “Building digital expertise in your organisation” session focused mainly on the development of staff digital skills, Rafe Hallet’s “Surfing in the Shallows” presentation highlighted students’ experiences of using online resources in HE. The talk debated the erosion of old reading and academic practices (a nostalgia for the concept of the “lone scholar”?) in light of new digital tech and resources. Does this mean that the (perceived) scattered, distracted reading processes of current students are in some way associated with the formats of this new digital tech? The talk went on to suggest that knowledge consumption is much more visual in the 21st century and that students employ increasingly creative and non-linear methods to absorb learning online. New digital resources are responding to new forms of reading, and it is important for universities to find a balance between digital play and academic rigour.

Some of the insights in Rafe Hallet’s talk were also promoted in Sara Perry’s “Digital jamming” workshop. Sara Perry uses digital tools in her work with students at the University of York to embrace group creativity and foster collaboration. The workshop was a highly interactive 10 minutes which saw conference attendees drawing stick figures and using these to create an online “meme” for Digifest17.


Stick figure family created during the “Digital jamming” workshop (courtesy of @beccihutchins).


Two sessions which took place on the second day of Digifest17 highlighted the importance of visualising content or data in digital services. The new UK Medical Heritage Library (UKMHL) collection was promoted during the “Historical Texts: visualising digital collections” talk. The UKMHL collection is a new open access digital resource which includes diverse methods for visualising archive content. The resource uses date histograms, image walls and sunburst visualisations for students to discover and engage with online content. Rafe Hallet used the UKMHL collection as a good example of a digital resource experimenting with new forms of visualised knowledge for students. Digital content is layered for students to find, participate in and co-create.

The session on “Business intelligence for higher education” demonstrated student datasets being analysed and used to form interactive dashboards in the Heidi Plus project. The dashboards in the Jisc/HESA project used Tableau software to present the data in a highly visual way, and the project also offered examples of “agile” project management. As I work with usage data in my day-to-day work at DMU Library, I was interested to hear how the dashboards were created and whether any potential efficiencies were achieved as part of the project work. Improving data visualisation for library colleagues is an important part of my library work, and it was beneficial to hear how universities involved in the Heidi Plus project viewed the work as continuing professional development for staff (around data manipulation, data visualisation and creating a shared language for planning projects).

Social media

I think the most valuable session for me during Digifest17 was Eric Stoller’s “Part Deux: why educators can’t live without social media”. The value was partly down to timing (I am due to co-present a social media discussion in May 2017 for a Learning at Work week at DMU Library), but a number of the themes in Stoller’s presentation struck a chord. Stoller covered various points in his discussion – the value of social media (for staff and students) with regard to networking (connecting with others via a shared interest), research, career development and engagement with learners. Stoller also spoke about the balance between “professional” and “social” spaces on social media. Social media is constantly evolving and is fluid in the way it can blur the lines between work space and private life. Stoller shared his belief that it is important for educators and educational establishments to be adaptive and progressive – he suggested this organisational mindset and ethos should be set by leaders at institutions. The key function, to engage with students and learners, must remain even if social media tools come and go.


I found Digifest17 to be an enjoyable and enthralling event. As in previous years, it is an excellent vehicle for current awareness with regard to digital trends, resources and services in the HE and FE sectors. As I tweeted right after the event, your head will be buzzing for days afterwards and you will want to take ideas back to your library/college/workplace and share these insights with colleagues.


Twitter praise!

Roll on Digifest18!



How do academic libraries value their electronic resources?

UK academic libraries spend millions on electronic resources annually to support learning, teaching and research at their institutions. Demonstrating the value of these online resources presents a number of challenges for libraries, especially when it comes to attributing value to e-resources in a consistent and meaningful way. The concept of resource or content “value” is subjective, and will be applied differently between HE institutions, but also between different subject areas.

Usage statistics can be a starting point for discussion with regard to the value of e-resources. The financial imperative for HE libraries to maximise value for money from the annual investment made in online resources has never been more important. Well-informed resource decision-making is key for libraries, so usage statistics play a crucial role in adopting a more data-driven approach when analysing user activity across online collections. There is a mass of data out there for libraries to work with and manipulate, so it is helpful for library staff to know what questions they want to ask of the data to inform their decision-making processes and the criteria they set out for e-resource evaluations.

Drivers for a more data-driven approach to e-resource renewals

DMU Library’s Content Delivery Team ran a “How does the library value its electronic resources?” workshop for DMU Library staff in April 2016 (I was one of the co-presenters on the session). I also attended a UKSG “Usage stats for decision-making” event in June 2016. Both events raised discussion points around how libraries currently measure resource usage and apply value to their online content. The UKSG event featured a panel of presenters from UK academic libraries, and as an information professional who deals with usage statistics on a daily basis at DMU Library, I was fascinated to hear some of the different approaches to collating usage data in UK libraries. Whilst listening to the individual presentations, it became clear that academic libraries review usage statistics in diverse ways – approaches seemed to depend on the size of the library budget and who has responsibility for ensuring money is well spent, the subjects taught at the particular institution, the number of library staff working with usage stats, and the technical expertise of those staff when it comes to presenting usage data. Nevertheless, there are a number of common “drivers” that libraries face when it comes to applying value in a more objective, evidence-based manner. In no particular order of importance, these drivers include:

  • Compiling usage metrics and collating the subsequent feedback from different stakeholders can be time-consuming activities for library staff. The need to automate processes where possible is crucial to “free up” staff time to produce clear and concise data analysis
  • The need for libraries to make financial savings across increasingly stretched budgets
  • Improving online resource provision for under-invested subject areas
  • Higher student expectations of university library services and provision of online content during their studies
  • Anecdotal or arbitrary (subjective) decision-making processes leading to retention of low use, non-relevant library resources
  • Different usage analysis is often needed for different audiences (subject teams, academic staff in faculties, management and users). Applying value in a consistent way can be a challenge if different stakeholders are evaluating usage across different contexts

Status = under review? 

The key to tackling these challenges seems to be in asking “what stories do you want the usage data to tell?”. Acknowledging what criteria need to be included in resource analysis is crucial. These criteria will obviously vary from institution to institution, but also from subject to subject. Cost per use (CPU) seems to be the “go-to” measure of value for e-resources used by HE libraries, and I think there was consensus at the UKSG workshop that this is not going to change any time soon, as CPU gives a basic and simple calculation of user activity in line with the financial investment made by the library year on year (for example, a £3,000 subscription generating 1,500 downloads in a year has a CPU of £2 per download).

One of the presenters at the UKSG workshop, Anna Franca from King’s College London, highlighted the potential need for usage “formulas” or “indicators” to help institutions establish more data-driven analysis for e-resource renewal or cancellation decisions. The indicators may provide more meaningful and helpful insight when reviewing CPU (or any type of usage data a library may use), and potentially give libraries more time to assess under-used resources on their radar and gauge the potential reasons behind this poor usage. Usage indicators will differ between institutions, and will need to sit alongside analysis of annual resource subscription costs, but the following example (for a full text resource) could be employed to shape resource renewal or cancellation discussions between library/faculty/management staff:

  • CPU below £1 per download = automatic renewal
  • CPU between £3 to £5 per download = “under review” status
  • CPU over £5 per download = cancellation (or substitution of content) recommended

The indicators above are a guide as to the type of indicators that could be applied by a library to implement a more data-driven process to resource decision-making. I appreciate that for any renewal or cancellation decision, additional context will always be required to provide a comprehensive overview of usage and relevancy of an e-resource to a particular university faculty or department. Some online resources may be deemed specialist or niche, and will be aimed at, and used by, a smaller cohort of students. The CPU indicators may therefore have to be shifted to take this “specialist” tag into consideration (for example, £3 to £5 per download may represent very good value for a particular specialist database). The UKSG usage stats workshop highlighted some other factors around resource and subject profiles that library staff may need to take into account when reviewing e-resource usage activity:

  • Fluctuating inflation/currency exchange rates over time. These may skew resource CPU figures for individual years
  • Changes to teaching and learning at the institution over time – subjects will have varying “characteristics” that define that area of study (for example, differences in online publication formats and frequencies)
  • Publisher problems – inaccessible resources or poorly designed user interfaces
  • (Lack of) promotion and marketing of the e-resource
  • Unique content versus duplicated content (via alternative library databases or collections). This “overlap analysis” is an increasing part of my day-to-day library work
  • Digital rights management (DRM) limitations when looking at e-book usage metrics
  • Potential interlibrary loans costs incurred if an e-resource subscription is cancelled
  • Open access versus paid-for content

The future?

After listening to the different library presenters at the UKSG usage stats workshop, and discussing themes around usage statistics with some of the other workshop attendees, it became clear to me that DMU Library could benefit in a number of ways from progressing towards a more data-driven framework for e-resources renewals. I was keen to take what I had seen and heard at the UKSG event back to DMU and discuss it with colleagues in the Content Delivery Team and the wider department. I believe a revised approach to usage data would create extra lead time for library staff to prioritise the analysis of under-used databases and collections purchased by the directorate. Content Delivery currently presents usage and cost analysis for all subscribed e-resources renewals in the library’s portfolio. This can be time-consuming for Content Delivery staff to assemble and present, especially when several e-resource subscriptions expire at the same time. It would be wiser to concentrate on, and allow more evaluation time for, those e-resources that suffer low usage.

Library subject teams could then comprehensively examine the factors that lie behind an e-resource’s poor use (this could mirror some of the additional context around e-resource usage I discussed earlier in this post). Action plans could be devised and set in motion to try to increase usage of the e-resource over a period of time (e.g. via targeted promotion to students and staff), and these actions could then be reviewed to see if they had any effect. The outcome of the e-resource appraisal may result in the outright cancellation of the e-resource (depending on the immediate need to save money), but may also prompt libraries to check the viability of alternative (and perhaps more relevant?) online products on the market to act as substitutes for existing content.

Going forward, I can certainly see value in creating an “under review” status for e-resources that display a high cost per use in line with usage indicators or banding set by the library. The Content Delivery Team are still at an early stage of working on this framework proposal. The April 2016 in-house DMU event on “how do we value our electronic resources?” reinforced the concept that there are many variables in reviewing the impact of, and attributing value to, e-resources in different subject areas. That said, I also came away believing that an increased evidence-based focus for e-resource selection and retention would help strengthen the collaboration between Content Delivery (the producers of usage data) and Academic Liaison teams (the assessors of the usage data). This improved liaison between library teams can only help the department in delivering one of its key objectives: to “maximise the usage and impact of the information resources available to users” at DMU.









Using LibAnswers to manage e-resources user enquiries at DMU Library

The Content Delivery Team at DMU Library & Learning Services have recently started to trial Springshare’s LibAnswers service, in the hope that it will help the library manage its e-resource troubleshooting activities more effectively and seamlessly.

Before using LibAnswers, Microsoft Outlook email was used to contact DMU students and staff who reported database/e-journal/e-book access problems to the library. Content Delivery’s E-Resources Mailbox email account, eresources@dmu.ac.uk, was created several years ago and allowed library staff to forward user emails that highlighted access difficulties with online library content. The forwarding of these emails was usually completed by staff working in the library’s Just Ask Team (if the access problem could not be fixed by initial “triage” at the helpdesk) or by DMU subject librarians in teaching/1-2-1 sessions with students and staff. Once the E-Resources Mailbox had received a new email message, Content Delivery staff would investigate the access problem and take action as appropriate.

It soon became clear that there were limitations in using Outlook to manage troubleshooting enquiries efficiently. A number of Content Delivery staff had access to the E-Resources Mailbox. Efforts were made to keep the mailbox uncluttered, but this became an onerous task to keep up with, especially during busy times of term. E-resource enquiries can often be complex and may require several different stages of investigation by different library staff before they are resolved. The library may also need to liaise with content and systems providers to log errors and request support. Tracking the status of these enquiries in Outlook could be difficult and time-consuming (especially if the library was waiting for external providers to communicate progress). The team did experiment with using the “Categories” and “Flags” options in Outlook to try to better organise the administration of responses, but building this into already established workflows was a challenge.

So, the team decided to look at LibAnswers as a potential alternative to Outlook. DMU Library had already purchased LibGuides and LibCal from Springshare, so there was a degree of familiarity with aspects of this service (the look of the site, functionality etc.). The team had already created an e-resources troubleshooting LibGuide, for example. A colleague from Content Delivery and I created LibAnswers accounts and started to see how we could utilise the resource to better manage e-resource troubleshooting.

The E-Resources Mailbox (eresources@dmu.ac.uk) address had become familiar to library staff who regularly reported student/staff access issues, so we wanted to keep this account if at all possible to minimise confusion. LibAnswers allows us to redirect email traffic from the E-Resources Mailbox in Outlook into the LibAnswers dashboard – individual enquiries in LibAnswers are called “tickets”, and each ticket is assigned an ID. Once logged into the LibAnswers site, we can view these tickets and work on them as needed.

We have been trialling the LibAnswers service for several weeks now. Early evaluation of the product has been promising, even if we haven’t had time to review/test every single part of the admin portal. Features/options we currently like are:

  • Ability for library staff to “claim” tickets when they start work on an enquiry (and also potentially “unclaim” at a later date if the ticket is passed on to another member of staff). This “claiming” of the support ticket is then highlighted in the central dashboard, so other colleagues who may be logged in to the dashboard concurrently can see which tickets are being currently worked on, and others which require potential attention.
  • Allow other team members to check and answer LibAnswers tickets in times of staff absences (illness, holidays). This means DMU students and staff can be contacted in a timely manner.
  • Assign or transfer the ticket to relevant library staff if added expertise is needed.
  • Add an internal note to support tickets if certain actions are required (e.g. removal of login details from a library access point). These internal notes are not communicated to the user making the enquiry, but stay logged in the dashboard for library staff to view.
  • Easy-to-view history of replies in any one support ticket (the most up-to-date correspondence between library and enquirer is displayed at the top of the ticket).
  • Set different ticket categories dependent upon the status of the support call (new, open, pending, closed).
  • Closing a support ticket removes the enquiry from the current log of open/pending calls in the dashboard, but closed tickets are still easily accessible in the “Answers” tab if further analysis/comment is required at a later date.
  • Having an alert display when a new ticket/new reply is received.

Problems we have faced so far are:

  • Documents or attachments added to a LibAnswers ticket can only be opened by people who have a LibAnswers account.
  • Replies or correspondence from users or staff who copy (Cc) the eresources@dmu.ac.uk mailbox show up as new (separate) support tickets in the dashboard (there is a “merge” option in the LibAnswers admin site which we are looking at as a potential way around this).
  • If an already created message is resent, there is no log in the system of who this was sent to.

Features we hope to add in the future:

  • Look at using the reusable answers option to save time when replying to tickets that cover already reported or similar access problems.
  • Add category tags to ticket replies to further refine our processes and ability to filter the dashboard (e.g. e-books, e-journals, databases, authentication issues).
  • Use the popular links section to add pre-set URLs which we may need to refer to in ticket replies (DMU Library webpage, Subject Guides, E-Resources troubleshooting LibGuide). This will save time when constructing replies to support tickets.
  • Create a generic Content Delivery signature which we can add to all ticket replies.

We will continue to work with LibAnswers and see if we can fully integrate this service into our daily troubleshooting activities. Watch this space!


Library e-resources usage & cost analysis – creating a template spreadsheet.

Part of my work in the DMU Library Content Delivery Team is to head up the capture, collation and dissemination of library e-resource usage stats. The library buys many different types of online resources, from many different vendors, and all of these resources need to be continually evaluated and assessed to see if they are delivering a return on the financial investment DMU Library has made in purchasing them for DMU students and staff.

DMU Library continues to make extensive use of JUSP. JUSP saves time in journal usage stats workflows by removing the need for library staff to manually access and download journal usage reports from individual publisher admin sites. JUSP acts as a one-stop-shop for library staff to view and export COUNTER journal usage stats from different vendors, in various types of usage report formats (from specific core title usage to more general usage trends over time).

As part of DMU Library workflows, Content Delivery staff create resource usage/cost analysis documents for subject staff to review and evaluate ahead of upcoming renewals. This helps DMU subject librarians make prompt and effective renewal/cancellation decisions for the content the library purchases. The analysis contains raw usage data (e-journal downloads, e-book section requests or database metrics, depending on the type of resource) from previous years of subscription, together with the resource costs paid by the library during those years. To save time and create some uniformity in the analysis, the library harvests and displays annual usage based on the DMU financial (and SCONUL reporting) year – August to July.

Once the raw data is uploaded to the analysis file, Content Delivery staff then attempt to represent the data visually, usually in the form of graphs, charts or tables. This visualisation of usage and cost data makes it easier for library subject or management staff to spot and interpret usage and cost trends, which, in turn, should better inform their resource renewal decisions.

Content Delivery are constantly looking for ways to reduce the time it takes library staff to create these cost and usage analysis files. The files are often quite large, and creating the “visual” aspects of the analysis (charts, graphs) can be time-consuming and require an advanced understanding of Excel. If a number of renewal files need to be created simultaneously, especially around periods in the year when a number of resources expire at the same time, it is difficult to create documents quickly and disseminate to subject librarians in a timely manner (even if we are using added-value services like JUSP to organise metrics). Content Delivery strive to give library subject colleagues more time to digest the analysis we create, but sometimes this is difficult to fulfil due to the number of files being worked on at any one time.

At a recent JUSP Community Advisory Group meeting I attended, librarians on the group spoke about the creation of usage “template” files, and how it would be valuable to have different examples of these templates hosted in the Community Area of the JUSP site. This would showcase how different HE libraries process and evaluate their resource usage stats. It gave me the inspiration to look at DMU processes and see if I could create an analysis file template myself, based on earlier streamlining of usage stats processes by Content Delivery staff. My hope was that DMU Library staff would be able to upload raw data to a template file and then use set Excel formulas within the file to create usage/cost analysis at the touch of a button. As long as the raw data was entered into the correct cells in the file, the formulas should create successful data outputs.

I created tabs in my template Excel file. There would be tabs to “dump” raw usage stats for specific DMU subscription years (2011-12, 2012-13, 2013-14 etc.). I then had a tab for subscription costs (these would have to be transferred from a separate file we keep to record e-resource subscription allocations and costs). The final tab, marked “Usage & cost analysis”, was where the analysis calculations would occur. I set formulas in the analysis tab to create automatic data outputs and visualisations.


Tabs were created to upload annual raw data and subscription costs.

The data outputs in the template are not ground-breaking – I wanted them to be basic and clear to review (in a table format), showing % increase/decrease in use and cost, and also calculating an annual cost per download figure. I also inserted two visual representations of the data contained in the table – a column chart to show cost per download for the years selected, and a line chart to map total usage against total subscription costs. Again, as long as the correct data is entered in the correct cells, the formulas should do the rest and automatically configure the graphs/charts.


Excel formulas create figures for analysis and drive visualisations of usage and cost data.
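For anyone who prefers scripting to spreadsheet formulas, the same summary table can be produced in a few lines of pandas. The sketch below is a hypothetical equivalent of the template’s calculations, using made-up annual usage and cost figures keyed by DMU’s August-July reporting years; it derives the year-on-year % change in use and cost and an annual cost per download.

```python
# Hypothetical pandas version of the template's summary table: year-on-year
# % change in usage and cost, plus an annual cost-per-download figure.
import pandas as pd

data = pd.DataFrame({
    "year": ["2011-12", "2012-13", "2013-14"],  # DMU financial/SCONUL reporting years
    "downloads": [4200, 4650, 3900],            # dummy raw usage totals per year
    "cost": [5000.0, 5250.0, 5500.0],           # dummy subscription costs (£)
}).set_index("year")

data["usage_change_%"] = data["downloads"].pct_change() * 100
data["cost_change_%"] = data["cost"].pct_change() * 100
data["cost_per_download"] = data["cost"] / data["downloads"]

print(data.round(2))
# A column chart of cost_per_download and a line chart of downloads against cost
# could then be drawn with data.plot(), mirroring the charts in the Excel template.
```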

You can see the usage stats template I have created by logging on to JUSP (your institution has to be signed up to the service) and visiting the “Community Area” section of the site. The template contains dummy usage data to show how the template works, but hopefully, you will be able to remove the data and add your own, and the outputs should still work (as long as the raw usage data totals and sub costs are allocated to the correct cells within each tab!).

DMU Library have also recently invested in ProQuest’s Intota assessment tool to manage library online content, so I am hoping this service will provide more online tools for Content Delivery to use to effectively interrogate and review resource usage and costs.

Please do let me know what you think of the usage template. I hope to see more of these types of templates appear on JUSP in the future – it would be great to see how other libraries analyse usage stats and subscription costs and highlight best practice in this area.


P.S. I want to say a big thank you to a former DMU Library colleague, Chris Voss. A lot of Chris’s hard work earlier in 2014 has gone into the creation of this current usage template.


SmartSpaces Awareness Campaign

During February and March 2014, DMU Library and Learning Services hosted two awareness weeks to help raise the profile of a DMU green initiative called SmartSpaces. The SmartSpaces project aims to reduce energy use (electricity, gas and water consumption) in several Leicester public buildings through the use of IT. The project encourages engagement with building users via a SmartSpaces website and online discussion forum. The website displays current energy use in participating buildings as a spectrum of “smiley” (or not so smiley) faces. The faces are colour-coded to visualise current levels of energy use: a sad red face highlights increased energy use, a yellow face means neutral performance, while a smiley green face indicates reduced energy consumption.


SmartSpaces smiley face logo.

As the Kimberlin Library is one of the DMU buildings featured in the SmartSpaces project, the library’s Green Impact Team arranged two awareness weeks a month apart to try to promote the initiative to library users. The Green Impact Team also produced a SmartSpaces LibGuide to coincide with these promotional activities. The first awareness week was held in the week commencing 3rd Feb 2014. Green Impact Team staff set up a SmartSpaces display in the Learning Zone on the ground floor of Kimberlin Library. The display featured the smiley faces spectrum on display boards, posters marketing the SmartSpaces website, red, yellow and green balloons, and a plasma screen presentation with slides highlighting the library’s involvement in the project.


Kimberlin Library SmartSpaces display

The display was then staffed over set lunchtime sessions during the week. Library staff would hand out SmartSpaces flyers and discuss the project with interested users. A library laptop was also on hand, allowing staff to show the SmartSpaces website and online forum. The library’s Green Impact Team worked closely with DMU’s Sustainability Team during the preparation for the awareness weeks. The Sustainability Team kindly sent over some SmartSpaces smiley face cakes which were handed out to library users in the building.


SmartSpaces smiley face cakes

The second awareness week ran from 3rd-7th March 2014. The Green Impact Team repeated the SmartSpaces display in Kimberlin Library’s Learning Zone, but made some changes to the organisation of the publicity display after the first awareness week was reviewed. A single lunchtime session on 4th March 2014 was staffed (rather than having several lunchtime sessions throughout the week), and the plasma screen presentation was removed entirely. The library also tied in its marketing of SmartSpaces with Fairtrade Fortnight. The Fairtrade promotion was already up and running around DMU campus in the first week of March 2014, so it seemed like a good way to increase awareness of two green initiatives at the same time. The theme for this year’s Fairtrade Fortnight was fairtrade bananas. Bananas were given out at the display to library users in Kimberlin’s Learning Zone. Members of staff from both the library and Sustainability teams also dressed up in banana costumes in an attempt to gain the maximum attention of building users – what troopers!

Both SmartSpaces awareness weeks were promoted via DMU Library social media pages, such as the @LibraryDMU Twitter feed and the library’s Facebook page. The DMU Sustainability and SmartSpaces Twitter feeds also joined in with this marketing, using hashtags such as #smartspaces and #dmu to widen exposure of the tweets:

SmartSpaces was promoted via library social media pages.

In the run-up to the first awareness week, the library ran a survey of its users to see if they had already heard about SmartSpaces from posters around DMU campus. DMU’s Sustainability Team held an official campus-wide launch for SmartSpaces on 16th Jan 2014 in the Queens Building, and several large SmartSpaces posters were put up in participating DMU buildings (including the Kimberlin Library). The library’s Green Impact Team was joined by a project assistant from the IESD faculty, who helped create and carry out the survey. The survey was then repeated during the second awareness week in March 2014, to see if the marketing display and events hosted by the library during the two publicity weeks had helped to increase users’ awareness of SmartSpaces in the library and around DMU campus. The Green Impact Team is currently digesting these figures as part of its evaluation of the SmartSpaces awareness campaign and related project work.
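
Purely as an illustration of how the two survey rounds might be compared once the figures are in (the respondent numbers below are placeholders, not the actual DMU survey results):

```python
# Illustrative only: comparing the two survey rounds. The respondent counts
# below are placeholders, not the actual DMU survey figures.

def awareness_rate(aware: int, respondents: int) -> float:
    """Percentage of respondents who said they had heard of SmartSpaces."""
    return 100.0 * aware / respondents

before = awareness_rate(aware=30, respondents=100)  # survey before the first awareness week
after = awareness_rate(aware=55, respondents=100)   # repeat survey during the March week
print(f"Awareness rose by {after - before:.1f} percentage points")
```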

Posted in DMU, Library | Leave a comment

Library Camp 2013

I haven’t blogged for a while, but feel invigorated to do so once again after attending Library Camp 2013 last Saturday (30th Nov 2013). The event was hosted in the very impressive new Library of Birmingham building (brilliant and chaotic all in one – it has a glass elevator for goodness’ sake!), but the real bonus was spending the day with some mighty fine people, all interested in discussing and debating issues central to libraries/librarians/info professionals in the UK. The atmosphere was vibrant and eclectic – it was fantastic to see so many people join together in their own time on a Saturday (lots travelling many miles early in the morning or the night before) to share their experiences, views and feedback on so many diverse library themes. Attendees pitched sessions on catalogues, social media, management styles, the future of public libraries, open access repositories etc. I could go on, so please check the Library Camp wiki for more info about the sessions that were run (Willy Wonka did not pitch, in spite of the elevator)…

A glass elevator. In a library?

I think the essence of Library Camp (apart from the cake; “librarians love the CAKE!”) is the spirit of debate, whether face-to-face discussion in the sessions, quietly chatting over tea and cake in one of many “nooks and crannies” at the venue, or in the online conversations/communication taking place on Twitter, Facebook and other social media in the run up to, during and after the event. Views can be leftfield, challenging and tackle the status quo, but as there is no set agenda to the event, this type of discussion fuels the day. The attendees drive the content. The first session I sat in on was about social media and libraries. This is a topic I have a great interest in, being a “prolific” user of social media myself (stop posting pictures of yourself eating ice cream I hear you cry!), and having responsibility for contributing posts to a number of DMU Library’s official social media pages. You can also learn lots of new stuff at Library Camp (the photo above is now on the Wikimedia Commons site after I heard about this platform at the event!). A later session concentrated on teaching and libraries. This is not something I have direct experience of, but it was valuable to sit in the session and listen to attendees’ personal views, responses and anecdotes. This stuff can be gold for personal development.

I deliberately wanted to keep this blog post short (as I said, visit the Library Camp website, follow what was tweeted via the event’s #libcampuk13 hashtag or read others’ blog posts for more insight into the “flavour” of the day), but will end by saying that I heartily recommend attending next year’s event, even if you only have a passing personal interest in libraries. It is very much worth it…

Posted in DMU, Library, Personal, Social Media | Leave a comment

Troubleshooting LibGuide

Over the past few weeks, DMU Library’s Content Delivery (CD) Team have been working on constructing a LibGuide to assist DMU Library staff with online resources troubleshooting. The objective of the team was to better help library staff recognise online resources problems (e.g. problems affecting library online databases, eJournals and eBooks) that users may report via an information desk, the library’s Just Ask email service or over the phone. The team not only wanted to help staff identify such problems, but aimed to provide general troubleshooting guidance in a “one-stop-shop” portal – guidance which staff could access quickly to improve library interaction & communication with users.

There seemed to be an appetite amongst library staff for such an online guide. Content Delivery hosted team days for library colleagues in late November 2012, and the idea of a troubleshooting guide gained a positive reaction from attendees. I also gave a presentation on online resources troubleshooting with a colleague at a COMPI (Content Management Planning & Innovation) Team away-day in December 2012, and again the feedback from team members about the creation of a CD LibGuide was upbeat.

So, after the Christmas / New Year holiday, the team began the prep to create the guide. The LibGuides structure was already in place, as DMU subject librarians had built new individual subject pages for library users to consult during 2012. Content Delivery referred to these existing guides as a starting point – looking at the types of information other subject guides featured and covered. LibGuides allows you to create separate page tabs within your online guide; each page can then feature boxes of content or information for users to view, read and interact with.

The first job for the team was to attempt to “categorise” online resources problems. This was a challenging task, as there seem to be no general “hard and fast” rules in this area – most access problems can be a mix of errors or systems not working together (authentication, security settings etc.). I have previously blogged on this online resources “definition” difficulty. In the end, the team came up with five general resource problem “types” (a rough sketch of how these categories might be organised follows the list below):

  • Resource error message
  • Subscription problem
  • Authentication error (e.g. DMU Single Sign On)
  • Mobile or tablet access difficulty
  • “User” education – clicking on the wrong link etc
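
As a rough sketch only – not the actual LibGuide content – the five categories might be organised something like this, pairing each problem type with a typical symptom and a sensible first check for frontline staff. The wording is illustrative and based loosely on the examples given later in this post.

```python
# Rough sketch only - not the actual LibGuide content. Each of the five
# problem "types" is paired with a typical symptom and a first check that
# frontline staff might try; the wording is illustrative.

PROBLEM_TYPES = {
    "Resource error message": {
        "symptom": "the database or eJournal platform itself returns an error page",
        "first_check": "note the exact error text and whether it appears on and off campus",
    },
    "Subscription problem": {
        "symptom": "a pay-wall message asks the user to pay for access",
        "first_check": "confirm the title is part of the library's current subscriptions",
    },
    "Authentication error": {
        "symptom": "DMU Single Sign On rejects the login or loops back to the login page",
        "first_check": "check the user is entering the correct Single Sign On login",
    },
    "Mobile or tablet access difficulty": {
        "symptom": "the resource works on campus PCs but not on the user's own device",
        "first_check": "try a different browser or the platform's mobile view",
    },
    "User education": {
        "symptom": "the user has followed the wrong link, e.g. straight to a publisher site",
        "first_check": "walk the user through the catalogue or subject guide route instead",
    },
}

def first_check(problem_type: str) -> str:
    """Return the suggested first check for a reported problem type."""
    entry = PROBLEM_TYPES.get(problem_type)
    return entry["first_check"] if entry else "log the details and refer to the CD Team"

print(first_check("Subscription problem"))
```

Holding the categories in one structure like this mirrors the “one-stop-shop” aim – each page tab in the guide can then expand on the relevant entry.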

The next step was to provide troubleshooting information that library staff could use to advise users at frontline library service points (info desks, roving) in an engaging and easy-to-read format. This might be general technical guidance staff could impart to users to try and fix an access problem, or key questions to ask users to help diagnose resource errors more effectively.

The CD Team decided to create separate page tabs for each of the resource problem types (as mentioned above). These individual page tabs give an outline of the resource problem “type” in brief bullet points (e.g. a subscription problem usually displays a pay-wall message asking the user to pay for access), then go on to explain how staff might go about diagnosing and fixing the problem for the user (e.g. checking that the user is entering the correct Single Sign On login).

The LibGuides format also allowed CD to provide bitesize “snippets” of technical information in small boxes which feature on the LibGuide page tabs (e.g. Clearing Your Internet Cache). This “box” approach seems to de-clutter the pages quite nicely, especially where a page is text-heavy, and filter information to staff in a clear and concise way. LibGuide boxes can also be re-used and copied to other pages within the same LibGuide – this saved a lot of admin time for CD staff when creating the page. A number of the information boxes appear more than once in the guide (e.g. linking to the @LibraryDMU Twitter feed), so the ability to immediately re-use content was a time saving bonus for the team.

The final two page tabs on the troubleshooting LibGuide were dedicated to what the CD Team can do to help if the user is still having access problems, and to future work CD are putting in place to try and improve the way resource problems are dealt with in the library. We also set up a “feedback” box for library staff to engage with and provide comments on the use and value of the troubleshooting LibGuide. As the LibGuide is quite easy to edit, the CD team are very interested to hear about any additional content staff would like to appear on the guide.

I would be interested to hear how other libraries have tackled the issue of troubleshooting online resources access problems. Have you created a LibGuide? Something else for staff to use? Please do feel free to leave comments or tweet me with your feedback / views!

The troubleshooting LibGuide URL is http://libguides.library.dmu.ac.uk/content.php?pid=419498.



Posted in DMU, Library, Troubleshooting | Leave a comment