8 May 2013

Library Impact and Assessment - ANLTC Seminar, 7th May 2013

Demonstrating library impact is not only crucial for communicating our value to stakeholders, but also helps us design and shape our services more effectively. Where do we have the greatest impact and deliver the most value? How can we show causation of - rather than simply correlation with - desired outcomes? Which services have the least impact, and why? How can we address this? Or should we instead focus primarily on those areas where we do add real value and "purposefully abandon" those where we don't? And what about those who don't engage with our services at all - what is causing this non-usage? These were just some of the questions I asked myself throughout the course of yesterday's ANLTC event on Library Impact and Assessment.

The morning session focused on assessing the impact of information literacy: Mary Antonesa (NUIM) presented the CONUL ACIL 2012 survey on curriculum-integrated instruction, and Lorna Dodd (UCD) outlined the process of taking stock of IL instruction to inform future directions. The potential for reusable learning objects to serve as an effective and flexible form of IL support surfaced in both presentations - an idea worth exploring, particularly in the context of falling staff numbers.

Graham Stone's (University of Huddersfield) presentation on the Library Impact Data Project offered a brief glimpse of the vast possibilities open to us in analysing our usage and activity data. As we dig deeper into this data deluge, more and more questions will emerge; ultimately we are limited only by our time and our resources, which unfortunately represent very real constraints for most if not all of us. Fortunately, Huddersfield have made their Impact Data Project toolkit available for others to use and learn from, and you can also read more about the implementation and findings of the project. Graham also provided a taster of the ongoing Library Analytics and Metrics Project (LAMP), which I have been following with interest for the past few months via their blog.

JISC LAMP Dashboard - WIP: http://jisclamp.mimas.ac.uk/2013/04/dashboard-some-first-thoughts/
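
To make the usage-data idea a little more concrete, here is a minimal sketch of the kind of first-pass analysis the Impact Data Project carried out: relating per-student library activity to final degree outcome. The CSV file and column names here are entirely hypothetical, and of course a correlation like this says nothing by itself about causation.

    # Sketch: correlate per-student library usage with degree outcome.
    # Hypothetical data: student_usage.csv with columns
    #   loans, eresource_logins, library_visits, degree_points
    import pandas as pd
    from scipy.stats import spearmanr

    df = pd.read_csv("student_usage.csv")

    # Spearman rank correlation is robust to the skewed, non-normal
    # distributions typical of usage counts.
    for measure in ["loans", "eresource_logins", "library_visits"]:
        rho, p = spearmanr(df[measure], df["degree_points"])
        print(f"{measure}: rho={rho:.2f}, p={p:.3f}")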

Unsurprisingly, LibQual also featured on several occasions throughout the day. Ciara McCaffrey (UL) synthesised the experience of using the survey instrument from the perspective of Irish universities, and Peter Corrigan (NUIG) described the practicalities and workload involved in analysing LibQual comments and qualitative data, including a review of some of the relevant software packages available.
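
As a toy illustration of the sort of first-pass coding that sits behind that comment analysis (well short of what the dedicated qualitative packages offer), one might tally free-text comments against keyword categories. The categories and comments below are invented for the example.

    # Toy first-pass coding of free-text survey comments by keyword.
    # Categories and example comments are invented for illustration.
    from collections import Counter

    categories = {
        "space": ["noise", "quiet", "seating", "crowded"],
        "collections": ["book", "journal", "ebook", "database"],
        "staff": ["helpful", "staff", "desk", "friendly"],
    }

    comments = [
        "More quiet study seating needed at exam time",
        "Staff at the desk are always helpful",
        "Please buy more ebooks for my module",
    ]

    tally = Counter()
    for comment in comments:
        text = comment.lower()
        for category, keywords in categories.items():
            if any(word in text for word in keywords):
                tally[category] += 1

    print(tally)  # Counter({'space': 1, 'staff': 1, 'collections': 1})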

Whilst many of the presentations looked at impact assessment from different perspectives (such as customer service, information literacy, or usage data), some common themes resonated throughout. Firstly, there is a clear need for librarians as a profession to look beyond our safety zone of input- and process-focused metrics, and to build a culture of impact assessment in our libraries that tries to measure the real value we add to our institutions. Collaboration is also key in this regard, be it with faculty to track IL outcomes, or indeed with our users themselves, as Jo Aitkins (University of Leicester) reminded us. In the majority of cases, libraries are part of a broader organisation or institution, and how we assess and demonstrate our impact is deeply rooted in this context.

The ability to benchmark services, giving relative meaning to our metrics, was also highlighted; this is the real advantage of standardised instruments such as LibQual, in spite of their simplifications and limitations compared with local tools. Indeed, perhaps initially at least, simplification is a compromise we will have to make. Often the variables we are trying to capture are covert and complex, and it can be difficult (or in some cases impossible) to extract and eliminate confounding factors. However, if we wait for a 'perfect' measure to surface, demonstrating the impact of our services will always remain out of reach. Instead, assessment is an iterative process: each study and each additional piece of data is simply the next step in a continuous cycle of improvement and refinement, giving us something to build on in the future.
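
To show what 'extracting confounding factors' might look like in practice, here is a hedged sketch of one common approach: a regression that controls for plausible confounders such as discipline and entry qualifications before crediting attainment to library usage. All variable names are hypothetical, and this is only one of several possible techniques.

    # Sketch: a regression that controls for plausible confounders
    # (discipline, entry qualifications) when relating library usage
    # to attainment. All column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("student_usage.csv")

    # Without controls: the raw association between loans and outcome.
    naive = smf.ols("degree_points ~ loans", data=df).fit()

    # With controls: discipline (categorical) and entry points absorb
    # some of the variation that would otherwise be credited to loans.
    adjusted = smf.ols(
        "degree_points ~ loans + C(discipline) + entry_points", data=df
    ).fit()

    print(naive.params["loans"], adjusted.params["loans"])

If the coefficient on loans shrinks substantially once the controls are added, much of the apparent 'library effect' was really a discipline or prior-attainment effect - exactly the kind of caveat that makes a 'perfect' measure so elusive.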
