28 May 2013

The Wrong Twitter Metrics

Assessing the impact of your library's Twitter account is a challenge, especially when there's a temptation to simply reach for the easy statistics, like the number of people who follow you. However, these numbers often tell you very little about the value of using Twitter, and in fact, can even tell you the wrong things.

An organisation's Twitter account is very different to a personal one. Whilst the latter may primarily function as a means of keeping up to date, for organisations like libraries, the point of using a tool like Twitter should be to generate real interaction and engagement with your users. Consequently, this is what you should be trying to capture when evaluating impact, not whether you have 50 or 5,000 followers.

Real life examples, exchanges and vignettes that show evidence of real engagement can offer much greater insight than a simple statistic. As well as this richer qualitative data, some quantitative measures such as how many people are tweeting @you or retweeting you are also useful. This shows that people value the content that you are sharing, and thus are more likely to continue to follow and interact with you, building closer and longer-lasting relationships. You can also measure how many users are being referred to your library website, blog or databases via Twitter, to assess if the tool helps to directly drive traffic to your resources and services. Map the network of your followers and see how it fits with your library's strategic positioning.
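For the quantitative side, even a short script over an exported list of tweets can surface these engagement counts. As a minimal sketch in Python (the field names and figures below are illustrative, not taken from any particular analytics export):

```python
# Hypothetical rows from a tweet-archive export; field names are illustrative.
tweets = [
    {"text": "New database trial announced", "retweets": 4, "replies": 1},
    {"text": "@reader thanks for the feedback!", "retweets": 0, "replies": 2},
    {"text": "Opening hours over the bank holiday", "retweets": 7, "replies": 3},
]

total_retweets = sum(t["retweets"] for t in tweets)
total_replies = sum(t["replies"] for t in tweets)
# Average engagement per tweet: a rough proxy for how much your content resonates.
engagement_per_tweet = (total_retweets + total_replies) / len(tweets)

print(total_retweets, total_replies, round(engagement_per_tweet, 2))
```

Counts like these only become meaningful when tracked over time, and alongside the qualitative vignettes rather than instead of them.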

That said, it is still very difficult to show the value of Twitter in a way that is often required in an annual report - a quick bottom line or a headline statistic. However, at the very least, I would strongly recommend trying to avoid the trap of using the following metrics to demonstrate value and impact.

Image: Twitter

The Wrong Metrics

1. Total Number of Followers:
This is one of the easiest measures to obtain, but also perhaps the most flawed. Firstly, it is an easy statistic to manipulate: simply follow more people. Although following 100 people will probably only result in 20% or so reciprocal follow-backs at most, it is still a quick way to increase your absolute number of followers. Clearly however, an increase of this nature is meaningless. Moreover, the quantity of followers gives no indication of the quality of your followers. For instance, are they actually library users or other stakeholders, i.e. people you are interested in connecting with? Therefore, what might look like a pretty upward-sloping line on a graph really tells you very little about the value of your library's Twitter account.

2. Followers to Following ratio:
But surely you need to compare your number of followers with the number of people you follow in order to provide a relative and meaningful measure? And a higher ratio means your account must be great, right? Well, yes and no. Looking at this ratio is certainly a valuable metric, but if you believe a higher ratio is always a good thing, you may be missing the point of Twitter.

Twitter is about engaging with your users, and that means following them back when they take the time to follow you. This demonstrates that you are actually interested in your users' information and what they have to say. If you aren't following the majority of your followers, what kind of a message does this send out? It is different for personal accounts (particularly celebrities, where an extremely high ratio of followers is expected for obvious reasons), but when you are specifically trying to build relationships with your users as an organisation or a business, showing interest in them is key. This doesn't mean that your ratio has to be exactly one, but it should be close to it: this sends out a signal that you are not just racking up as many followers as possible to broadcast information to, but instead are actually interested in engaging, communicating and sharing with your users as a two-way process.
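The arithmetic itself is trivial, but as an illustration (the account figures are invented): a ratio close to 1 suggests reciprocal following, while a much larger one suggests broadcast-only behaviour.

```python
def follow_ratio(followers: int, following: int) -> float:
    """Followers-to-following ratio; much higher than 1 means few reciprocal follows."""
    if following == 0:
        return float("inf")  # broadcasting only, following nobody back
    return followers / following

# Illustrative figures for a hypothetical library account.
ratio = follow_ratio(followers=1200, following=1100)
print(round(ratio, 2))
```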

3. Number of Tweets:
In the same way that looking at your total number of followers is of limited value, measuring the number of times you tweet is similarly flawed. Tweeting should be about sharing content and information that contributes something to your network. If you are not adding value through your tweets there is little point; in fact, excessive tweeting may even turn people off by filling up their Twitter stream with irrelevant content. Therefore, looking at retweets of, and replies to, your content is a much better measure, as it shows the kind of content that actually engages your users. You can also use this valuable feedback to direct and shape how you use Twitter for your library. What works? What doesn't? Pay attention to your followers in an honest and authentic way and they will tell you.


27 May 2013

Library Camp Ireland, The Chocolate Factory, 25th May 2013

It's difficult to write about the first Library Camp Ireland (#irelibcamp) in a way that captures the unique atmosphere of the event. Even tweets, Storifys and photos can convey very little of the fluidity, energy and idea-sharing throughout the day. A great turn-out (especially for a sunny Saturday afternoon) brought together information professionals from all sectors, creating a real diversity of perspectives and participants.

Image: Giada Gelli, LAI CDG

The first pitch - speed networking facilitated by double-pitcher Helen Kielt - provided a great way to meet new colleagues, two minutes at a time. During the session I spoke with a current MLIS student and a recent graduate from Aberystwyth, as well as finally putting some faces to familiar Twitter-handles. As a pitcher myself, it was a chance (and indeed a challenge!) to try and connect with an audience without using projectors, Powerpoint or laptops. Instead, it was flipcharts, post-it notes and ideas that sparked the discussion.

The unstructured format of the day provided flexibility for pitchers to team up, and I joined forces with Jane Burns for a session on writing for publication in library journals - discussing OALIS and An Leabharlann respectively. I was particularly interested in exploring why LIS professionals read / don't read journals, and similarly why they write or don't write. Some of the reasons for writing suggested by participants included reflecting, recording, sharing and learning, as well as helping to be perceived with greater credibility and authenticity by non-library staff. In spite of such advantages, the barriers - particularly a lack of time - remain tricky obstacles for many of us. Committing to a set time for writing each week can help develop a routine, even if it is only an hour a week, and setting a fixed deadline can motivate you to get the final draft over the line.

Some out-loud thinking from Jane Burns offered suggestions for the future of An Leabharlann, including how LAI members identify with it as a journal, and the possibility of using alternative channels such as advertising to fund the publication.

Image: Giada Gelli, LAI CDG
Three pitches ran simultaneously during each of the two sessions, split by networking, coffee and cake (some superb efforts that included a Queen of Saba from Aoife Connolly, a Victoria Sponge à la Paul Hollywood from John McManus, Ann O'Sullivan's now legendary brownies and some very pretty #irelibcamp fairy cakes from CDG Chair Giada Gelli). Attendees were encouraged to move between pitches as they wished. I was sorry to miss John McManus' MARC session and the Professional Development pitch from Laura Rooney Ferris, Aoife Connolly and Marie Cullen, as they were running during my own session.

However, for the second session I managed to catch parts of all three pitches, moving from Augmented Reality courtesy of DCU, to Wellbeing pitched by Helen Kielt, and finally a very active discussion on the value of LIS qualifications facilitated by Emmet Keoghan. The latter generated input from a wide range of participants: current students, recent and not so recent graduates, as well as those yet to pursue a professional qualification, making for some frank comments and many different views. Should qualifications concentrate on unique LIS skills and theories, or on broader, practical competencies like financial, project and marketing management? Is there enough emphasis on leadership? What will an MLIS look like in ten years' time? A great way to finish the day, before decamping to The Black Sheep for further discussion :)

Thanks to the LAI CDG & A&SL Committees for the hard work in making the event happen, as well as to the pitchers and participants for their honesty, energy and ideas.

24 May 2013

Report on UKSG 2013, Bournemouth, UK -- Part 2

Guest post by Anne Madden

Key themes:
UKSG 2013 was apparently the biggest yet. With 930 delegates, the organising committee are very restricted in the venues open to them. Bournemouth and the Bournemouth International Centre (BIC) proved acceptable and accessible and will be recycled for 2016. In particular, the sponsor hall was well laid out and worked very well (see pics below). Getting to a workshop did occasionally involve a bit of advanced navigation though.

Part 1 of the report is available here

This (final) part covers the following themes:
  • Our clients under the microscope
  • The role of technology

Our clients under the microscope
I have to start this section with the outstanding presentation by Josh Harding – a self-confessed “paperless student”. Josh is a post-grad medical student; in his earlier student days, he was a book and paper-bound stereotypical student. This time round, the tablet has set him free. He requires “lots of information, very quickly, very efficiently”. The iPad, in combination with a carefully selected set of apps, now meets all his information needs. Not only is it more practical, but by allowing searching and switching across all types of content from notes to texts to lectures, it also adds enormous value in terms of speed and efficiency. He predicts it will be the norm among students within 18 months, and wondered whether we, the information providers, would be ready.

He proceeded to list the tools and resources he would use in a typical day. As they are very discipline- (and iPad-) specific, I won’t list them here, but I think his presentation is recommended viewing not just for those in healthcare but for anyone providing services to students of any description.

Some generic apps include Notability, Inkling and GoodReader. He is a fan in particular of Inkling, a “(relatively) smart” interactive textbook source which allows per chapter download. As far as eBooks are concerned, in his view it’s a case of “a lot done, more to do” – smart should mean “adaptation based on learning analytics”. His textbook of the future would record his progress, identify any weaknesses – as well as applaud his successes, compare his performance with that of his peers and allow him to adapt the content to his needs (based on e.g. the amount of time he spends on the different chapters).

Notability is the handwriting app he uses, and into it he pulls his eBooks, lecture notes and other resource apps so that he can add his notes and links. These are then sent to the Cloud in PDF format where he can later access them using GoodReader. Obviously connectivity is key to making this all work and central to this is cloud storage.

The Librarian’s role is to identify how to deliver these services to our students, making students aware of what’s available and how to both access and use it effectively. He suggests that using “early adopter” students to advise their peers may be a way of achieving this. A major challenge is costs – in the same way as core paper textbooks and resources could be loaned from the library, in the future the digital equivalent should be equally freely available. Other challenges include fragmentation – multiple platforms, multiple apps and variable quality; PDFs with DRM restricting their use with 3rd party apps.

This said, his wish list not surprisingly included:
  • Single platform, single source for textbooks which would be smart and interactive.
  • Institutional subscription to core apps; however, students are willing to pay a fee if they feel they are getting what they want, in the format they want, at a reasonable price.

The following speaker should have been Sian Bane, discussing MOOCs, but as she couldn’t make it, Ken Chad stepped into the breach and introduced the concept of "jobs to be done" (JTBD). This is the application of a commercial or business concept to the library world. Once you have carried out your user survey or needs analysis, you then focus on meeting the unmet needs of your clients.

You focus on the “sweet spot”: ignore services that “competitors” can provide, stick within your capabilities, and wherever this intersects with an identified unmet customer need, you have found your niche. Focus on ends, not means: “Not quarter inch drills, quarter inch holes” (Theodore Levitt). The jobs referred to in the title relate to jobs your users need to get done.

It was at this point that we were treated to the Spice Girls’ “I’ll tell you what I want, what I really really want” with special emphasis on the phrases “make it fast” and “don’t go wasting my precious time”. Citing Clayton Christensen, he suggested that students don’t want textbooks, they want to pass exams, preferably without ever having to open a textbook – that is the “JTBD”.

Introducing the JTBD method to your library, you need to identify 3 things:
1. What is the problem/job?
2. Who should solve it?
3. In what circumstances/scenario?

In conjunction with your clients, you need to establish:
- Why this problem is important to them
- How they are currently handling it
- Why they have chosen the solution they have chosen
- Their likes and dislikes of their chosen solution

Back at base, you then need to ask yourself:
- What do I already have that might work
- In what circumstances will it be effective
- Any weaknesses or strengths?
- Why will my clients use this, instead of what they are already doing?
- Does my solution address the full problem?

It may be that you need more context in order to fully understand the JTBD. It’s not just about providing something that will fill a gap; your solution must add value, or be seen as adding value by the client.

The next three presentations relate to recent user research studies, carried out by Lynn Silipigni Connaway of OCLC, Simon Inger of Renew Training and the interesting juxtaposition of Jo Alcock and Mark Brown of Birmingham City University who gave a lightning presentation.

Connaway looked at the “digital student”. Her findings endorse both the Josh Harding and the Clayton Christensen views – the alternative title to her presentation was “I don’t think I have ever picked up a book out of the library to do any research – all I have used is my computer.”

They like Google (“reliable and fast”), and Wikipedia. They believe the following:
- Libraries are quiet places with internet access.  They contain many print books.
- “If two sources say the same thing, then it is most likely the truth”

Student Nirvana would be to find a one-stop shop for everything they need to pass their exams. Is this what we deliver? Not according to this research. Students don’t so much go to the Library web page as stumble across it through their preferred search engine. They (and Faculty) may land on a Library page whose content is out of date and irrelevant, and not feel tempted to stay.

So, in the words of Ken Chad, what are the jobs to be done? Firstly, she suggests, digital students need digital librarians. The “Ask a Librarian” service should be more interactive and personalised. Customised help features should be triggered by scenarios that might otherwise make students turn away e.g. “no results” to their search.

She used the term “the inside-out Library” – i.e. if students are happening on Library pages and resources through generic search engines, then the library must recognise linking as a key role. In an inspired example of this, faculty are now adding key citations to Wikipedia articles as they know this will increase the chances of students spotting them. In general, students feel guilty about using Wikipedia – even while citing and using the references from a Wikipedia article, they rarely mention Wikipedia as the source, and even more rarely will they read these citations. Their behaviour is “squirrelling” – storing to read later, but later rarely comes.

Another key factor is lack of experience with databases and a lack of specialised search engines. The research showed that 62% of all researchers are self-taught. Mentor programmes are common in universities – providing training for mentors on research is a way of propagating awareness.

The lightning presentation by Alcock and Brown served up highlights from their recent survey on Discovery and healthcare students. The full report is available to view here, and even based on their highlights it was obvious that their findings were very much in line with those of the other research surveys presented.
  • Students get familiar with one particular database or search tool and use this for general searching
  • Their choice of search tool is usually scenario-specific
  • Google is used to “scope” a topic – to find buzz words which they then search in a peer-review database. This echoes findings from the NUIG breakout session on “simplifying the search experience”.
  • They showed a number of advanced searching skills, such as the ability to transfer their searching skills across different databases, an awareness of different databases, and the knowledge that they could work with their results to, e.g., limit them by date

Again on the topic of content discovery but on a global scale, Simon Inger gave an overview in 20 minutes of what is obviously a very substantial piece of research: “How readers discover content in scholarly journals”. “Readers” referred to researchers, students and librarians and all had their own network of routes to content.

These routes are of interest to both librarians and publishers – different readers will, he suggests, carry different value. While different sets of analytics are available to publishers and librarians, neither has the full picture. The current research attempts to address this and is a repeat of similar research carried out in 2005 and 2008 but on a far wider scale with 19,000 respondents.

To be fully comprehensive, they wanted representation from all geographical locations, sectors and professions. Given the number of responses, even a small percentage, such as the 1,700 responses (9%) from the medical sector or the 2,500 responses (13%) from students, is still a very significant cohort.

The research covered preferences in search engines, discovery tools, apps and devices – all of which could be analysed by sector, geographical location, profession etc. Superficially, most of the findings come as no surprise:
• Different disciplines arrive at content in different ways.
• Different reader groups arrive at content through different routes

A more in-depth analysis of the data provides more practical and translational results:
  • Social Sciences are more likely to link from a Library web page than any other sector
  • Linking from search results and from targeted email alerts is more popular in Medicine than other sectors
  • Social Sciences prefer journal aggregators (ProQuest/Ebsco) as a starting point in a search; however their most popular starting point is an academic search engine such as Google Scholar.
  • Medical sciences are more likely to use A&I database (PubMed)
Changes since 2005:
  • Library web pages show a fairly significant increase in usage as a starting point in search
  • Journal alerts are declining in popularity as a means of discovering recent articles in a field
  • Bookmarks to journal home pages are on the up
  • The biggest supporters of Library web pages in “search” are Education Research and Humanities; physics and other hard sciences are least likely to use them
Library subscribed discovery resources:
  • A&I resources almost twice as likely to be used in Life Sciences and Medicine as a library web page or full text aggregator; Humanities more likely to use full text aggregators
Google in preference to Google Scholar by subject area:
  • Google Scholar most popular in Social Sciences, Psychology and Education Research with all other subjects preferring Google.
  • Physics & mathematics showed the strongest leaning of all to use Google over Google Scholar.  “Google Scholar only covers academic material”.
Device used:
  • Desktops/laptops still most used across all sectors
  • Tablets and phones small but significant.  Most popular in the medical sector which supports Josh Harding’s experience.

While the results are interesting, understanding the underlying reasons for them is not clear. Do they match our own experiences? Is there a platform that can support all preferences?

"Simplifying the search experience" – a breakout session by Monica Crump and Ronan Kennedy of NUIG – fits neatly here. Armed with a new Library Discovery service (Primo Central), they set out to provide their users with a Library search interface second to none.

This was a very enlightening and engaging presentation; among the “Sacred Cows” to be sacrificed was the dedicated library group. They described how at times the Group itself became the focus rather than the issue it set out to resolve. Secondly, not everyone understands library terminology; “print locations” translates to a student as “where I can find printers”. Finally, selecting features for their pedagogical value is commendable but may just discourage usage.

Once the new interface was finalised to the satisfaction of the Group, it was launched and then followed up by a LibQual survey. They received mixed messages: “website is difficult to use” and “website is easy to use”. The LibQual survey was followed up by user observation studies to clarify the situation.

Some useful findings:
  • They ticked all the boxes that librarians like to tick.  Unfortunately these proved to be the wrong boxes
  • Librarians aim for perfection; users want “good enough”.  Or as Ken Chad would put it, can it just get the job done?
  • “You can please some of the people some of the time etc”.  You will sometimes need to make a radical change and plan to manage any consequences.
  • Just because your system is feature-rich does not mean you should use all of them. Less can be more.
  • Library training makes for better searchers
  • Channel the options – don’t offer them all on the one page. E.g. default to a single search box, with a “More search options” button below it.
  • To get maximum buy-in, choose when to make any major changes (not to coincide with the start of a new academic year).
  • As already mentioned in the Alcock/Brown presentation, Google is generally used to “set context” but once this is done, they move to the library-based resources.
  • Academics had a preference for “point and link” as opposed to locating through a search.
Having revamped the interface on the basis of feedback, they have now repeated the LibQual survey; however at the time of the Conference, the results were not yet available.

The final one in this section is a lightning presentation by Eric Hunter on a current awareness service for his users at the RCSI. The service was developed to address both the need for relevant information at point-of-need and cognitive overload. A number of different prototypes were tested and a bulletin created which contained links to the content online.

The process is very labour-intensive however and given staff shortages, it is not certain whether it is the best use of librarians’ time. The service is currently under evaluation.

The role of technology
I must confess to a lack of familiarity with some of the acronyms, so some of “the science bit” may have gone over my head. I’ve provided links to the presentations wherever they exist and have listed the highlights below.

Anyone involved in Library discovery technology should keep their eyes open for a joint UKSG/JISC study which attempts to identify knowledge gaps and best practice. Its title will be “Assessing the impact of Library discovery technology on content usage” and it is due out in September. UKSG acts as an “incubator” for projects, according to Ed Pentz of CrossRef; an example would be the Usage Factor, which has now been moved to COUNTER.

Liam Earney discussed the topic of Knowledge bases. His main contention is that inaccuracies have crept into systems to the detriment of the information landscape. Some rather surprising examples include the fact that few publishers have a complete and accurate list of all their publications. Equally, ISSNs are sometimes missing or inaccurately portrayed in journals. He made the catchy remark: “set your data free….. but please tidy it up and make it presentable first”!

He spoke of two products which have been developed to capture the information required to manage e-resources: KB+ and GOKb, both available under open licence. Duplication of effort needs to be avoided – only minor points of differentiation exist between the four largest knowledge bases. The focus should instead be on: open data (to ensure one single accurate version is in circulation), collaborative communities (again to avoid duplication and encourage sharing of metadata and data), enriched information (incorporating human awareness) and standards/best practice guidelines – of which there are currently too many, too varied. Some of the partners in the project: KBART, EDItEUR, PIE-J.

The aim of this project, as in many of the others in their own way, is interoperability.

The Journal Usage Statistics Portal (JUSP), and how it is used by the Open University, was the subject of another presentation. JUSP collects COUNTER statistics via the SUSHI protocol. It also gives you options to “interrogate the content” in a number of ways, in order to assist in renewal decisions for example. The benefits of using one interface, as opposed to multiple publisher platforms, are obvious. Other uses include subscription negotiations based on price per use and assessing the value of “big deals”; it can also be used to indicate trends in usage. NESLi 1 & 2 publishers currently participate, while others have been contacted and asked to do so.

Also from the Open University came an account of how their ebook acquisition policy came about. They first establish the purpose of each text (research, core student text etc). For research, a single-user licence may be enough, but a course text might require a 300-concurrent-user licence, which is often not available through regular channels. This content can then end up on a separate platform.

The whole process has become very labour-intensive. They have ended up using 5 different ebook aggregators and their biggest wish is for a one-stop-shop for all ebook suppliers. Their likes include: ability to download to devices; printing rights; individual login options; appropriate formats. Dislikes were pretty much the opposite of these.

Mobile access was addressed by both the University of Surrey and the Taylor & Francis mobile team. T&F gave a short overview of the relatively new T&F online platform, its features and benefits (content indexed to Google Scholar; alerting service etc). Their web app is free and can be downloaded directly from their site. Key features include saving to a device, where content can later be read offline, and sharing via Facebook and Twitter. It is compatible with Android and Blackberry as well as iPhone. A once-off “pairing” of the device is all that’s required for access. The balance of the presentation related to how they promoted this new mobile service.

The University of Surrey gave a brief overview of how they integrated mobile technologies into the academic library. Aim: make their resources available on any device, anywhere. They decided not to go down the app route but instead to use what they currently had as far as possible. Their initial approach was staff-focused, developing familiarity using a hands-on approach – feedback was very positive. iPads were then trialled, to give on-the-go demonstrations to academic departments, and for more interactive and practical roving user support.

They carried out a short focus group study to establish a list of technical changes required for their web page:
  • QR codes, however, not hugely used
  • mobile version of catalogue: positive feedback
  • mobile friendly sites for opening hours and study room bookings (functional changes, not just cosmetic ones)
  • making lists of what is available within subscribed resources, etc
The focus group also emphasised the need for the library to integrate with the institution, as users did not see the library as separate from the institution.

In a very absorbing breakout session, Fred Guy and Adam Rusbridge of the University of Edinburgh discussed the long term availability of eJournals. Following a brief introduction on the archiving infrastructure scene, they introduced a panel of experts to address the questions of who is archiving our journals, how they are doing it, and what recent changes have occurred.

He mentioned a key resource: “The Keepers” (http://thekeepers.org/thekeepers/keepers.asp) – a registry of archives for eJournals. This registry is the result of a scoping study by Loughborough University and Rightscom Ltd. in 2008, “A scoping study for a registry of electronic journals that indicates where they are archived.”

Developments in this area can be tracked on the website of JARVIG (the eJournals Archiving Interest Group). Their Action Plan is being reviewed following a cost/benefit analysis in 2012. Funding for the project is provided by PEPRS and EDINA. The JARVIG role is to develop the infrastructure in partnership with stakeholders such as the ISSN registry (which, as was mentioned in an earlier presentation, is in need of tidying up). Another partnership is that between the Libraries of Cornell and Columbia Universities (http://2cul.org/). “Trusted archives” include LOCKSS, CLOCKSS, Nederland National Library, British Research Library, The Hathi Trust, the National Science Library and Portico.

Libraries should look at the preservation of their local collections; David Prosser and Lorraine Estelle both mentioned the JISC NESLi model licence, which includes archival rights. Licensed subscriptions provide current access to content but, as we have probably all experienced, once you cancel a subscription you frequently no longer have access even to the years you subscribed. The NESLi licence (as mentioned in the earlier JUSP presentation) is being accepted by more publishers, but it is worth requesting its acceptance with all publishers when renewing.

Publishers must be persuaded to be included in the LOCKSS archive. Archives may be “Light” or “Dark”; “light” archives permit not only access but reproduction, lending and general circulation; “dark” archives will only allow local access and reproduction under certain conditions. Archived collections must be compatible with link resolvers and other search technology; content would be available continuously and at a cost that would not exceed that of print.

A side-effect of these online archives is the amount of space they would clear; RLUK has set the clearing of 100 km of shelf space as a target.

For the future, a more joined-up approach is required. There is currently no single global market solution (and there may never be one). Coverage has to be comprehensive – already gaps are appearing.

Finally, a breakout session from Anita Wilcox of UCC: an open-source ERM system. The initial choice was based principally on budgetary constraints – when researching which serials management system to choose, CUFTS emerged as a viable solution.

The system was developed and hosted by Simon Fraser University Library and partners, who provided ready assistance throughout the set-up period and after. For a small fee, they also provided assistance in the development of the interface. So far, it has equalled out-of-the-box solutions in almost all respects:
  • It uses CrossRef for linking
  • It provides a direct-to-article OpenURL link resolver (GODOT)
  • There is a licence tab which identifies licence-specific data (Athens, walk-ins etc.)
  • User guides can be uploaded to the database links
  • The Research Suite Journals portal and the CJDB Database portal can be integrated with the library catalogue
  • There is a statistics portal where you can create your own SUSHI stats
  • Ranking is allowed, so you can decide on the order of results display
  • It creates a full MARC record for all records in the system
  • It permits the uploading and integration of local ILL forms
  • The initial record table allows for input of item-specific data: licence, pricing model and subject information
  • A Provider record allows the source of individual items to be traced
  • Data can be imported in either PDF or doc format
She finished by noting key pros and cons:
Pro: far greater control and rapid response
Con: no automated upgrades when publisher links are changed
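As an aside on how OpenURL resolvers like GODOT work: citation metadata is packed into a standard query string, which the resolver turns into a link to the institution's best copy of the full text. A minimal sketch using only Python's standard library (the resolver base URL is a made-up placeholder, not a real service):

```python
from urllib.parse import urlencode

# The base URL below is a hypothetical institutional link resolver.
BASE = "https://resolver.example.org/openurl"

def build_openurl(metadata: dict) -> str:
    """Build an OpenURL 1.0 (Z39.88-2004) query string from citation metadata."""
    params = {"url_ver": "Z39.88-2004", "ctx_ver": "Z39.88-2004"}
    params.update(metadata)
    return BASE + "?" + urlencode(params)

link = build_openurl({
    "rft.genre": "article",
    "rft.jtitle": "Serials",
    "rft.volume": "26",
    "rft.issue": "1",
    "rft.spage": "5",
})
print(link)
```

Because the key names are standardised, any compliant resolver can interpret the same link, which is what makes archive collections "compatible with link resolvers" as described above.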

The final presentations of the Conference were by Jason Scott and T. Scott Plutchak. I won’t even attempt to cover the presentation given by Jason Scott – you need to view this yourself – it’s worth it, he is a highly entertaining and irreverent presenter!

T. Scott Plutchak, who is a key player in the American OA movement and was part of the Scholarly Publishing Roundtable, finished by echoing the words of Jill Emery – librarians tend to see publishers as adversaries in the battle to achieve democratic access to research. He suggested that we both worked from the same premise: the value of scholarly literature. It was time to dispense with misconceptions about each other and move forward by recognising the contribution of all parties.

His presentation can be viewed here.

And so ended UKSG 2013 – it was a truly enlightening experience and I feel very lucky to have been there.  Sincere thanks to the Acquisitions Group of Ireland for making my trip possible.

23 May 2013

LIR #irelibchat Summary: Mobile Technologies in Libraries, 21st May, 2013


Guest post by Siobhán Dunne, DCU & LIR Committee

The LIR HEAnet Group for Librarians (LIR) was delighted to join forces with #irelibchat on this month’s Twitter chat. At a seminar in November, we had seen that librarians were really animated about mobile technologies and we were keen to continue that conversation. With over two hundred tweets on the night, we weren’t disappointed - not bad for what turned out to be a gorgeous summer's evening!

We began by asking whether libraries were slow to implement new technologies, thereby missing the peak of the trend. Not surprisingly, people weren’t slow to challenge this. Yet there was recognition that there is no point in being mobile just for the sake of it; it has to add value. As @Yffudj pointed out, ‘it’s not about trends, it's about what's useful’. Whilst librarians are often innovators, tweeters agreed that it is more important to be useful innovators.

There was consensus that service delivery for academic ebooks has a long way to go; there are huge discrepancies in licensing models and it is confusing to users. It was noted that NUIM and UL libraries have run successful pilots lending ebook devices and, as @dstokes01 attested, ‘Many public libraries actively engaged with eBook lending and provide access to variety of resources remotely’.

So, are apps on the way out? Do people prefer mobile websites? When I think back to the discussions we had at the LIR/AGI symposium last November, most answered yes to both. Certainly, whilst there was some chat about vendors creating apps for databases, there was very little discussion about libraries developing their own apps. Where they are developed, apps should sing for their supper and add value to existing information channels, as @joeyanne commented: ‘Interesting to note that when University of Sheffield developed an app, they only used info available in other form’.

The challenge is to get the right balance between providing variety to our users in terms of how we package information and ensuring that we don’t privilege some users who happen to use smartphones or other devices: ‘I'm wary of technologies that only work for some, or require user investment’ (@Yffudj)

Not surprisingly, QR codes and augmented reality featured prominently. Some felt that QR codes may have seen their day: ‘wonder if QR codes are being used with any great emphasis anywhere (anymore)? We are using them, but not to a great degree these days’ (@jclark923). However, there were some suggestions as to how they could be deployed for learning – ‘this QR code treasure hunt could be fun for assessing first year undergraduates: http://t.co/4q6ohtMxSp’ (@dunnesiobhan) – and it ticks the gamification box. Augmented reality adds value by harnessing geolocation and educating users at points of need throughout the library – great potential for making orientation more interactive. Another suggestion for orientation came from @trimroy: ‘Best use of mob I have seen is in #urbangames Fun. Great for hunts orientations & recreations of space’

Twitter also featured – both as a communication tool (answering information queries) and as a learning tool (getting students to use it as a research resource for assignments). The former could facilitate librarians in smaller libraries or ‘on the go’, or even be integrated as a roving service in a larger library.

The question of what ‘being mobile’ actually is was raised – ‘I use my tablet to show doctors how to access resources via mob devices - great to have something to demonstrate with around the hospital!’ (@libfocus). However, it’s not just about mobile websites and apps, as ‘conversely people now use mobile devices in a non-mobile setting (e.g. iPad on the couch!)’ (@libfocus)

The bigger question, perhaps, is how effective the deployment of mobile technologies for library services is in practice. Some suggestions for tracking mobile activity included hit counters such as Google Analytics and StatCounter (@Yffudj). Augmented reality products such as Layar and Aurasma also provide usage statistics. There was a sense that catalogues are lacking when it comes to statistics – they are usually very ‘general’. Also, we need to determine how to measure qualitative content – perhaps there’s an app for that?!
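For libraries without a full analytics package in place, even a crude pass over the web server log can show what share of traffic is mobile. A minimal sketch (the user-agent strings below are invented sample data, and a real log would need proper parsing):

```python
# Classify hits as mobile or desktop by a simple user-agent substring check.
MOBILE_MARKERS = ("Mobile", "Android", "iPhone", "iPad")

def is_mobile(user_agent: str) -> bool:
    """Crude heuristic: any common mobile marker in the user-agent string."""
    return any(marker in user_agent for marker in MOBILE_MARKERS)

# Invented sample user agents standing in for parsed web-log entries.
log_user_agents = [
    "Mozilla/5.0 (iPhone; CPU iPhone OS 6_1 like Mac OS X) Mobile/10B141",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) Firefox/20.0",
    "Mozilla/5.0 (Linux; Android 4.2; Nexus 7) Mobile Safari/535.19",
]

mobile_hits = sum(is_mobile(ua) for ua in log_user_agents)
print(f"{mobile_hits} of {len(log_user_agents)} hits were mobile")
```

This is exactly the kind of 'general' statistic the chat complained about: it says how many hits were mobile, but nothing about whether those visits were successful.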

All in all, this was a lively #irelibchat with plenty of food for thought and some great resource recommendations. @joeyanne suggested ‘some ideas for other ways to use mobile technologies available at http://t.co/ouSe2xp6t7 ‘ Finally, @trimroy reminded us that it’s a busy week for Irish librarians ‘am I the only one mixing #irelibcamp with #irelibchat?!’

Indeed, this #irelibchat was a great prelude to #irelibcamp which takes place this Saturday: http://laicdg.wordpress.com/tag/irelibcamp/ 
Watch this space!

20 May 2013

Simplicity is More Difficult Than it Looks

When I saw the following link in my Twitter stream - 8 Must Have Tools for Dropbox - it sparked a realisation. In truth, I probably come across a dozen such links per day through various channels, and I could have chosen any, as the question would be the same: How did we get to a point where there are eight "must have" tools in order to use a single application? Given that so many of us probably rely on at least 5 or 6 such tools in our everyday life (in my own case, Twitter, Gmail, Google Drive, LinkedIn, Slideshare, and Blogger), if each of these requires another handful of productivity add-ons (or even several handfuls - 99 "essential" Twitter tools!) in order to use them effectively, our information workflows (as well as our device home screens) get cluttered pretty quickly. There has to be a better way.

Image: Disk Depot (Wikimedia Commons)
How have we come to a point where using one app 'requires' so many others? Well, for one thing, we now use information in a much more integrated way, and these add-on applications often afford us this luxury by helping us to connect our usage across different contexts and platforms. So there is an aspect of this that is certainly driven by user demand. Another reason is that good design is about simplicity. Yet simplicity is very hard to do. Look at many of the offline information and productivity tools we use every day in the physical world and how simply they are designed. Books are the obvious example. Who doesn't love Post-it notes? There is still something incredibly satisfying about crossing off tasks on a paper-based to-do list. People still use these things because they are incredibly simple yet powerful tools. Indeed, one of the original selling points of Dropbox for many users was its simplicity, built around existing user habits and behaviours: simply drag and drop your files in the same way you always do. So why do we so often think that greater complexity and sophistication are necessarily better?

If we think of many of our library services and resources, they are far from simple. Sometimes things are intrinsically complex and there is no way of avoiding it. But in other cases, the language we use, the processes we create, and the rules we make can seem overly complicated. Sometimes we don't need to do everything all at once. Designers need to realise this, but perhaps so do end users.

Sometimes, less is more.

Posted on Monday, May 20, 2013

17 May 2013

CONUL ACIL Annual Information Literacy Seminar, 11th June 2013

Looking forward to speaking at this!
Via the LAI HSLG Mailing list:

The annual seminar of the CONUL Advisory Committee on Information Literacy will be held on Tuesday 11th June in the Trinity Long Room Hub, Trinity College Dublin. The seminar is aimed at library staff involved in information literacy education and initiatives.

To register, simply email gmccabe [@] rcsi.ie before Thurs 30 May.

The fee for the seminar is €35 for CONUL members and €50 for non-CONUL members.

Programme:

CONUL Advisory Committee on Information Literacy
Annual Information Literacy Seminar
Tuesday June 11th, 2013, Trinity Long Room Hub, TCD


10.15am – 10.45am
Registration, Trinity Long Room Hub entrance lobby
Tea/Coffee, Neill/Hoey Lecture Theatre, Level 2, Trinity Long Room Hub

10.45am – 11.30am
Engagement and the student experience
Dr Noel O’Connor, Director of Student Services, DIT

11.30am – 12.00pm
Pecha Kucha 1
- Librarians as personal tutors. Grainne McCabe, RCSI
- Designing and delivering an information literacy programme to second level students: an NUIM experience.  Elaine Bean, NUIM
- Measuring the impact of IL instruction, Lorna Dodd, UCD

12.00pm – 12.45pm
Being digital: the development of the Digital and Information Literacy framework
Jo Parker, Library Services Manager (Information Literacy), Open University

12.45pm – 13.00pm
ACIL Information Literacy Survey, 2012
Mary Antonesa, ACIL Committee

13.00pm – 14.00pm
Lunch (Tea & Sandwiches Provided), Ideas Space, Level 3, Trinity Long Room Hub

14.00pm – 14.30pm
Pecha Kucha 2 
- Trinity College Dublin, MPhil in Digital Humanities and Culture: an archivist’s perspective. Ellen O’Flaherty, TCD
- Understanding the Information Behaviour of Humanities PhDs on a Graduate Information Literacy Course. Ronan Madden, UCC
- FI:LO Developing online tutorials for law students. Sarah-Anne Kennedy, DIT

14.30pm – 15.00pm
Academic writing: elements of a synthesis
Miriam Fitzpatrick, Lecturer in Architecture, UCD/WIT

15.00pm – 15.30pm
Pecha Kucha 3 
- IL & clinical legal learning and practice. Hugo Kelly, NUIG
- Diagnosing Information Literacy: A Healthcare Lens for the SCONUL Seven Pillars Model. Michelle Dalton, UHL (University Hospital Limerick)
- DCU’s annual training programme for staff and researchers. Lisa Callaghan, DCU

Report on UKSG 2013, Bournemouth, UK -- Part 1

Guest post by Anne Madden

Key themes:
UKSG 2013 was apparently the biggest yet. With 930 delegates, the organising committee are very restricted in the venues open to them. Bournemouth and the Bournemouth International Centre (BIC) proved acceptable and accessible and will be recycled for 2016. In particular, the sponsor hall was well laid out and worked very well (see pics below). Getting to a workshop did occasionally involve a bit of advanced navigation though.

Over the 3 days, my idea that the Conference was focused around Open Access and the Finch report was shot down. Finch was only the launch pad. Very roughly, the themes could be broken down to:
  • Open Access (Part 1)
  • The Research Agenda (Part 1)
  • Our clients under the microscope (Part 2)
  • The role of Technology (Part 2)
The inclusion of Lightning talks and Breakout Sessions further broadened the scope of the Conference, and some of these talks and workshops were real highlights for me.

Day One
Following greetings from Ross MacIntyre, Chair of UKSG, and Bob Boissy, President of NASIG, various awards were presented. It was good to see Ireland represented by Kathryn Walsh from NUI Maynooth, who won one of the early career professional awards.

Open Access
The first presentation was by Phil Sykes, “Open Access gets tough”. As an early aside, he welcomed the HEFCE decision to insist on OA for REF 2020, where decisions are made on how much funding will be provided to each university. This, he claims, will be a game-changer, as the loss from non-compliance could be disastrous. HEFCE have an annual research budget of £2 billion. It was obvious from the outset that Phil Sykes is passionate about OA.

Sykes was frank that he did not believe Institutional Repositories were being adequately supported. He credits David Willetts as the man who breathed new life into the OA movement and the Finch study. Finch proposed Gold OA (immediate access on a CC BY licence). RCUK immediately adopted the recommendations, but also allowed for Green OA, which permits an embargo period, especially for the humanities and social sciences. At least 45% of funded research would have to be published on a Gold basis this year, gradually building to 100%. The APC costs were being deducted at a time when budgets were already very tight, so this was not a popular decision. As a result, 45% must now be published OA, but either Green or Gold is acceptable.

He highlighted how the stars were currently aligned for a major push-through of OA: prominent politicians and key research figures are all very supportive of the movement.  He did chide librarians for their lack-lustre support of OA. As librarians, we should be raising awareness of the current enormous costs of subscribed journals and providing all relevant information, support and advice on the OA alternative to our clients.

Fred Dylla (“The evolving view of public access to the results of publicly funded research in the US”) brought us through the various steps, studies, reports and legislation leading to OA in the US: the Scholarly Publishing Roundtable; the COMPETES Act (signed earlier this year by President Obama); the NSF report; the OSTP Directive; the FERPA Act (2012). There is a nice insight into the process at http://www.aip.org/fyi/2012/049.html. One of the earliest “wins” for OA came in 2007, when the NIH Public Access Mandate was passed. The OSTP has directed that, by September 2013, eleven funding agencies, which between them control a $100m budget, come up with a plan for public access; in it, they must keep the international scene under consideration.

Mr. Dylla highlighted the logistical and technical problems that need to be solved using various partnership approaches:
- Interoperability: for example, would research reports housed in http://science.gov be compatible with publisher platforms and linking resources?
- What about unique identifiers?
- Identifying funding sources?

Already some of these issues are being addressed. In May this year, CrossRef will launch FundRef, which will trace and identify funding sources. ORCID (covered separately) has now launched and will provide a unique identifier for researchers. A trial group of public libraries is making research available free of charge within the participating libraries. There is also an article rental scheme, DeepDyve, which already has 100 publishers on board.

For anyone with an interest in the progress of Open Access in the US over the past 3 years, Dylla’s presentation is an excellent source of information. One slightly disappointing note relates to the apparent acceptance of Green OA with an average 12-month embargo as a target. Also interesting will be how OA journals develop – already, according to one questioner, hundreds of “journals” with titles remarkably similar to leading journals are touting all and sundry for papers.

So what should librarians be doing to support this? Jill Emery quoted from Lankes’ Atlas of New Librarianship “the mission of librarians is to improve society through facilitating knowledge creation in their communities”. It is our duty to champion what is being created locally. We are also responsible, she suggests, for ensuring that OA is not a separate project, but is central to the organisation’s research agenda.  What are we doing with our collections to help to facilitate these major changes in knowledge in our local societies/user groups?

Ms. Emery regretted that the general attitude of librarians is “adversarial” with publishers as the bad guys, and only their greed holding back the democratisation of research. She flagged three important points:
1. Open Access is still going to involve money
2. It is still going to need management and organisation. “This is the golden age of cataloguers”.
3. OA has to be integrated into mainstream library and organisational activity.

The movement is gaining momentum; a recent survey estimated that 17% of journal articles published in 2015 will be OA. Meanwhile, library budgets are being cut and journal costs are soaring. Before proceeding with any OA plans, Jill Emery and her colleagues first decided to survey the current scene. Of the ten major publishers surveyed, all had a significant number of hybrid journals available. While general indications suggested an APC of $660, the average identified in their survey was $3,000. Librarians should challenge these prices – what is the value of the prestige of a publication against its cost?

She next introduced a term briefly touched on earlier by Phil Sykes: “double-dipping”. Some publishers offer discounts to subscription prices to institutions based on the number of APCs they have submitted to the journal/publisher; OUP were by far the most transparent in this respect. Another factor to follow is the type of licence that is being offered.

A major difficulty at present is tracking usage – a key role for librarians is to ensure that any available tools such as FundRef and ORCID are utilised to allow for COUNTER 4 compliance (currently no publishers are COUNTER 4 compliant). Librarians should also be changing mindsets within faculties and departments. Should librarians be dealing with APC charges? Essentially, librarians should be developing the organisational OA policy, and this will require becoming the experts on OA standards and developments and promoting them throughout the organisation.

My next stop was the Mendeley break-out session. At the time of attending, this was a good fit with the OA agenda. The statistics were staggering:
- 381 million documents in the database (not unique, included some duplications)
- 227,000 public or private groups
- 50-60,000 active users per week
- 1,500 Mendeley advisors establishing local communities

It operates as a Desktop client but there is also a web version. The user profile is largely academics and top level researchers, who use it to organise and annotate their collections as well as sharing with interested communities within Mendeley. It is compatible with a wide range of resources – with mobiles via Scholarley or Droideley and with Kindles using Kin-sync. The “Moodley” project is under way with Cambridge University to integrate it into their symplectic VLEs. The words “shared” and “collaborative” were used repeatedly; including user-generated tagging, collaborative filtering, and more than 300 apps which have been generated for Mendeley by the users.

As for altmetrics, there is a fairly accurate correlation between article readership in Mendeley and the citation index – except that the readership data is available much earlier.
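That relationship is, at bottom, a correlation between two series of counts, and it can be checked with a standard Pearson coefficient. A small sketch over invented figures (not real Mendeley data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented example: Mendeley readership now vs citations two years later.
readers = [12, 45, 30, 80, 22]
citations = [2, 9, 5, 15, 3]
r = pearson(readers, citations)
print(f"correlation: {r:.3f}")
```

A coefficient close to 1 for real data would support the claim that early readership predicts later citations; the point of the altmetric is that the left-hand series is available years before the right-hand one.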

But we all know what happened next, so the future of this valued resource is uncertain.

Also falling under the OA umbrella was the Swets session on OA and the role of the intermediary, hosted by Maxim van Gisbergen of Swets. In an evolving scholarly communication world, what assistance can an intermediary provide? Who will be their customer – should the intermediary be funded by publishers or by librarians? Some suggestions included the traditional roles of facilitating visibility of and access to resources, but it was also proposed that they might provide assistance in pulling together “big deals” and APC payments. Once again it was mentioned that librarians should strongly consider whether they want to get involved in the management of APCs within their organisation. The Research Information Network had recently published a paper on how intermediaries could be involved in this process. The key point I took from this break-out session was the disruptive effect that OA will have on all stakeholders.

Two lightning talks finished the section on Open Access. The first was by Carolyn Alderson of JISC, who spoke of a new project launched by JISC Collections to facilitate the payment of APCs. It is a consolidated platform developed in partnership with OAK (Open Access Key), who provide the technical expertise. JISC are already “trusted partners” in managing payments between institutions and publishers. She reiterated Phil Sykes' view that IRs are not being used effectively. Additional benefits of the service include the ability to collect information about APCs for both libraries and publishers, and the opportunity to promote standardisation of process across the industry. The pilot has just been launched: 64 institutions and 60 publishers have shown an interest in becoming involved.

Not quite OA, but the Coventry University textbook project may have been seen as such by their students. As a marketing ploy inter alia, they took the “no hidden extras” route for their 2012/13 first-year students and, in conjunction with Ingram Coutts, supplied each new student with the core textbooks they needed, free of charge, for the duration of their course. Funding was provided by each department, and the entire project was co-ordinated by the Library. A week before the students arrived, Ingram Coutts arrived on scene with their containers of books and set about bundling and marking them with each student’s ID number. Next year they are looking at a more eBook-focused solution, again in partnership with Ingram Coutts. One interesting fact: although borrowing figures were down, footfall in the library increased by 10%.

The Research Agenda
Presentations on research came from Jenny Delasalle of Warwick University; Laurel Haak of ORCID and, in a breakout session, Joanna Ball of Sussex University. There may well have been more break-out sessions on the subject but this is the only one I attended that falls under the research heading.

The topic chosen by Jenny Delasalle was the relevance of librarians to research evaluation and vice versa. Universities are ranked substantially on their research output; this output is evaluated on where they have published, what has been published and the citation index. As libraries regularly provide this information or resources to provide it, the link to the library is already established.

An additional role for the librarian is to provide guidance on using social media to increase the impact of research. For REF 2014, 20% of the research funding decision will be based on impact. RCUK require research reports which are submitted to ROS (Research Outcomes System) – librarians are key partners in this through CRIS (Current Research Information Systems)/IRs and through handling metadata creation.

Another initiative is Snowball Metrics, involving Elsevier and 8 universities including Warwick. This has produced a “recipe book” identifying core units of measurement, or metrics, to allow for the equitable benchmarking of universities on a global basis – for example, counting the value of grants as well as the number of projects, and breaking it down by department, FTE, quarterly figures etc.
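The kind of breakdown described – project counts and grant value per department per quarter – is a simple aggregation once the data is structured. A sketch over invented records (the field names are illustrative assumptions, not the Snowball Metrics schema):

```python
from collections import defaultdict

# Invented grant records; field names are for illustration only.
grants = [
    {"dept": "Physics", "quarter": "2013-Q1", "value": 120_000},
    {"dept": "Physics", "quarter": "2013-Q1", "value": 80_000},
    {"dept": "History", "quarter": "2013-Q1", "value": 15_000},
    {"dept": "Physics", "quarter": "2013-Q2", "value": 50_000},
]

# Aggregate both the number of projects and their total value,
# keyed by (department, quarter).
totals = defaultdict(lambda: {"projects": 0, "value": 0})
for g in grants:
    key = (g["dept"], g["quarter"])
    totals[key]["projects"] += 1
    totals[key]["value"] += g["value"]
```

The hard part of equitable benchmarking is not this arithmetic but agreeing the definitions behind it – what counts as a project, an FTE, a quarter – which is exactly what the "recipe book" sets out.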

So what tools can we point our clients at to measure what is needed? The prestige of the journal in which an article is published is still very influential. For clients looking for information on journals, there are a number of tools including Ulrich’s, JCR, SNIP, and now also Google Scholar metrics.

Article level measures: in addition to the traditional citation index, use any new tools available – tools for counting hits on specific sites; “likes”, reader ratings etc. Librarians need to step in and advise on these tools and how researchers should use them when developing their profiles and submitting their research portfolio.

Two items on her “recommended viewing” list are the http://altmetrics.org site and the recent SURF report “Users, narcissism and control – tracking the impact of scholarly publications in the 21st century”. Future opportunities include tracking, and being the expert on, the changing publication profile: the article-level economy and the non-traditional article.

It had already come up in conversation; now it was the turn of Laurel Haak to give her very animated presentation on ORCID. ORCID has three key benefits: it connects researchers to their output and follows them through their careers; it facilitates and automates the population of IRs; and it avoids any confusion between the output of researchers sharing the same name.

Very simply, ORCID is a register which provides a unique and persistent ID for every researcher. It is an open, not-for-profit initiative and is the “person part” of the system, combined with DOIs and organisational IDs – all designed for interoperability. First, it has to be adopted by both researchers and by information systems, publishers etc.

Currently, there are 8-9,000 new registrants per week, and ORCID is being integrated into manuscript submission systems, IRs and resources such as Scopus. By contacting professional bodies, they are receiving permission to embed ORCID in members’ renewal forms. Researchers are responsible for managing their own accounts – issues of privacy are addressed. Occasionally, a “trusted organisation” may set up the account for the researcher, and then contact them to claim it.

ORCID is being accepted as the standard author identifier tool and it is worth checking out the site in order to recommend to staff that they sign up as early as possible.
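One detail worth knowing when recommending ORCID: an iD is not just a random string. Its final character is a check digit (ISO 7064 mod 11-2), so a mistyped identifier can be caught locally before it ever reaches a submission system. A sketch of the published checksum algorithm, using the sample iD from ORCID's own documentation:

```python
def orcid_checksum(base_digits: str) -> str:
    """ISO 7064 mod 11-2 check character for the first 15 digits of an ORCID iD."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    remainder = (12 - (total % 11)) % 11
    return "X" if remainder == 10 else str(remainder)

def is_valid_orcid(orcid: str) -> bool:
    """Check the final character of a hyphenated ORCID iD against its checksum."""
    digits = orcid.replace("-", "")
    return orcid_checksum(digits[:-1]) == digits[-1]

# Sample identifier used in ORCID's documentation.
print(is_valid_orcid("0000-0002-1825-0097"))  # → True
```

This only guards against typos, of course; confirming that an iD belongs to a given researcher still requires looking it up in the registry.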

The final presentation I have included in this section is the workshop by Joanna Ball, University of Sussex, entitled “Supporting research data management on a shoestring: a practical solution”. The management of research data had previously been a neglected part of the research process, but new standards from funders such as RCUK and EPSRC require that a detailed management plan be submitted at the bidding stage. She undertook to develop a Code of Practice which would identify roles for the Research Office, the Library and IT services. Rather than re-inventing the wheel, she re-purposed previous JISC projects, such as RDMRose. From the outset it was decided that the project would be carried out by postgraduate students (and postdoctoral students for Life Sciences participants, for hierarchical reasons); they were perceived to be more acceptable and less potentially biased than librarians.

Some outcomes: set up a Digital Curation Centre to first up-skill librarians. Twitter Chat was used to address questions on the ethics of sharing. The need for a flexible solution that would take account of a range of different tools and media was highlighted. To work, the Code of Practice would require collaboration and buy-in from all parties.

She put some queries to her audience:
- Should each university/institution follow their own path, or could collaboration and cooperation (maybe using a “best practice lab”) result in a one-size-fits-all, out-of-the-box solution?
- What are the roles for librarians in the research management agenda?

In answer to this last question, she suggested our roles should be ethics, data protection, curation, copyright and legal limitation information, storage, authority on responsibility and ownership of data. In other words, “information literacy”.

The themes 'Our clients under the microscope' and 'The role of Technology' will be covered shortly in a separate blog post.

Sincere thanks to the Acquisitions Group of Ireland for making my trip possible.

*Guest blog posts represent the personal views of the poster and do not represent the official opinions or commentary of libfocus.com

9 May 2013

LIR #irelibchat - Mobile Technologies in Libraries - Tues 21st May 8-9pm

#irelibchat is joining forces with the lovely folk at LIR to bring you our next Twitter chat on Mobile Technologies in Libraries on Tuesday 21st May at 8-9pm.

Libraries have a reputation for thinking long and hard, and debating all of the finer points, before deciding to implement a new technology or technology-based service, sometimes missing the peak of the trend (Jacobs, 2009). Building on the success of LIR’s symposium in November and previous #irelibchat discussions, this chat will consider whether libraries are, in fact, committed to embracing and implementing new mobile technologies.

  • Have you been involved in implementing a mobile catalogue or website?
  • Do you use Skype or more ‘formal’ communication tools for information skills provision?
  • Have you experimented with lending mobile devices?
  • Have you developed a library app or examined the use of QR codes or augmented reality for your library service?
  • Are there other mobile technologies which you implemented in your library? Do your users like them and what did you learn from the experience?

A couple of interesting resources on the topic include a presentation by @joeyanne on Experimenting with Mobile Technologies in Libraries and The Handheld Librarian: Mobile Technology and the Librarian by Thomas A. Peters and Lori Bell.

This is a joint collaboration between #irelibchat and the LIR Group. To join in, simply search for the #irelibchat tag on twitter and remember to tag your own tweets with #irelibchat so that others can find them!

8 May 2013

Library Impact and Assessment - ANLTC Seminar, 7th May 2013

Demonstrating library impact is not only crucial for communicating our value to stakeholders, but also helps us design and shape our services more effectively. Where do we have the greatest impact and deliver most value? How can we show causation of - rather than simply correlation with - desired outcomes? Which services have least impact, and why? How can we address this? Or perhaps we should be primarily focusing on those areas where we do add real value and "purposefully abandoning" those that don't? What about those who don't engage with our services at all - what is causing this non-usage? These were just some of the questions I asked myself throughout the course of yesterday's ANLTC event on Library Impact and Assessment.

The morning session focused on assessing the impact of information literacy, first through the lens of the CONUL ACIL 2012 survey on curriculum-integrated instruction (Mary Antonesa, NUIM), and then Lorna Dodd (UCD) outlined the process of taking stock of IL instruction to inform future directions. The potential for reusable learning objects to serve as an effective and flexible form of IL support surfaced in both presentations, an idea worth exploring, particularly in the context of falling staff numbers. Graham Stone's (University of Huddersfield) presentation on the Library Impact Data Project offered a brief glimpse of the vast possibilities that are open to us regarding the analysis of our usage and activity data. It is likely that as we dig deeper into this data deluge, more and more questions will emerge, and ultimately we are only limited by our time and our resources. However, unfortunately these represent very real constraints for most if not all of us. Fortunately, Huddersfield have made their Impact Data Project toolkit available for others to use and learn from, and you can also read more about the implementation and findings of the project. Graham also provided a taster of the ongoing Library Analytics and Metrics Project (LAMP), which I have been following with interest for the past few months via their blog.

JISC LAMP Dashboard - WIP: http://jisclamp.mimas.ac.uk/2013/04/dashboard-some-first-thoughts/

Unsurprisingly LibQual also featured on several occasions throughout the day. Ciara McCaffrey (UL) synthesised the experiences of using the survey instrument from the perspective of Irish Universities, and Peter Corrigan (NUIG) described the practicalities and workload involved in analysing LibQual comments and qualitative data, including a review of some of the relevant software packages available.

Whilst many of the presentations looked at impact assessment from different perspectives (such as customer service, information literacy, or usage data), some common themes resonated throughout. Firstly, there is a clear need for librarians as a profession to start looking beyond our safety zone of input- and process-focused metrics. We need to build a culture of impact assessment in our libraries that tries to measure the real value we add to our institutions. Collaboration is also key in this regard, be it with faculty for tracking IL outcomes, or indeed with our users themselves, as Jo Aitkins (University of Leicester) reminded us. In the majority of cases, libraries are part of a broader organisation or institution, and how we can assess and demonstrate our impact is deeply rooted in this context.

The value of being able to benchmark services to provide a relative value and meaning for our metrics was also highlighted; this is the real advantage of using standardised instruments such as LibQual, in spite of their simplifications and limitations compared with local tools. Indeed, perhaps initially at least, simplification is a compromise we will have to make. Often the variables we are trying to capture are covert and complex, and it can be difficult (or even impossible in some cases) to extract and eliminate confounding factors. However, if we wait for a 'perfect' measure to surface, demonstrating the impact of our services will always remain out of reach. Instead, assessment is a recursive process: each effort represents the next step in a continuous cycle of improvement and refinement, with every additional study and piece of data providing something we can build on in the future.

7 May 2013

iPad in hand = roving librarians

Continuing from a recent post, which looked at the potential of the tablet-PC to facilitate constructivist instruction and learning approaches in the classroom, this second instalment takes a utilitarian library perspective. Given the increasing prevalence of tablets among library patrons (I see more of them around the library by the day), it makes sense to add tablets to the librarian’s existing toolkit for research and reference services, as well as information literacy instruction.

The idea that librarians arm themselves with service-enhancing technologies (e.g. texting or the now-ubiquitous IM reference service) to reach students in- and outside the physical confines of the library is nothing new. But the idea of roving reference librarians certainly represents a conceptual leap in terms of delivering a targeted service at the point of need, rather than expecting students to physically turn up at the traditional (stationary) reference desk.

Here are some immediate benefits if reference/teaching librarians equip themselves with tablet-PCs:
  • Increased visibility of the library, its staff and resources, as focus is given to the individual needs of the user (teaching and training)
  • Roving outreach activities may enhance the quality of patron-librarian interactions (point-of-need support)
  • Students who avoid the reference desk (for all sorts of different reasons) can be reached by approaching them directly, on their terms
  • Roaming librarians may provide an opportunity for the library to better integrate with the wider institutional culture
Suggested modus operandi:
Roaming librarians invite students to approach them directly in the library and other locations as appropriate (staff with iPad in hand and a t-shirt bearing a slogan such as “reference librarian on duty” act as meaningful signifiers to students in need of help). This can be combined with the library’s IM reference service for the purpose of locating students who happen to be on campus. Effectively, the reference librarian responds to an IM contact and approaches the caller in situ if a) the complexity of the query warrants such a move, and b) the caller happens to be physically in the library or on campus. Adequate IM backup covers for overflows and bottlenecks.

Some challenges....
  • Procurement, maintenance and operating service costs
  • Roving reference should be carefully planned considering service hours, service location(s), approach style, evaluation, technology, training and staffing so that services meet expectations
  • Spatial layout of the library might render this type of service inappropriate as noise levels are likely to increase
  • Student social spaces should not be 'invaded' by roaming librarians; focus instead on academic and academic/social spaces
Below is a short list of iPad apps that represent a good starting point (see the references and resources below for more). Bear in mind that apps are constantly updated and new apps are launched all the time.

Google apps (including Google Search, Maps and Youtube among others)
Chalk Board (allows you to draw on a chalkboard just as you would in a classroom)
Dragon Dictation (voice recognition application that allows you to easily speak and instantly see your text or email messages)
Dictionary (trusted reference content from Dictionary.com & Thesaurus.com - works offline)
Dropbox (cloud based file storage)
Evernote (collect information from anywhere into a single place)
Nook (access over 3 million books, magazines, newspapers, comics, and more)
Jumbo Calculator (large-buttoned calculator for everyone, ages 2-92)
Kindle (optimised for iPad)
Wikipanion (Wikipedia for iPad)
World Factbook (includes extensive information on more than 250 countries and locations around the world)
PDF Reader Pro (very handy)
iSSRN (Social Science Research Network for access to scholarly research in the social sciences and humanities)

EBSCOhost (access EBSCOhost database content, provided courtesy of your library [as applicable])

References and resources:
McCabe, K. & MacDonald, J. (2011). Roaming Reference: Reinvigorating Reference through Point of Need Service. Partnership: The Canadian Journal of Library and Information Practice and Research, 6 (November 2011). Available at: https://journal.lib.uoguelph.ca/index.php/perj/article/view/1496/2241 [Accessed 3 May 2013].
ALA TechSource (2012). Rethinking Reference and Instruction with Tablets. [Online] Available at: http://www.alatechsource.org/taxonomy/term/106/rethinking-reference-and-instruction-with-tablets [Accessed 2 May 2013].
Maloney, M. & Wells, V. (2012). iPads to Enhance User Engagement During Reference Interactions. Library Technology Reports, 48(8), 11-16. Available from Academic Search Complete, Ipswich, MA [Accessed 3 May 2013].
Apple (2013). iPad Support. [Online] Available at: http://www.apple.com/support/ipad/ [Accessed 3 May 2013].
McKiernan, G. (2013). Spectrum > Mobile Learning, Libraries, And Technologies. [Online] Available at: http://mobile-libraries.blogspot.ie/ [Accessed 2 May 2013].
Miller, R.K. et al. (2013). iPads and Tablets in Libraries. [Online] Available at: http://tabletsinlibraries.tumblr.com/ [Accessed 1 May 2013].
Derry, B. (2013). Apps for Librarians. [Online] Available at: http://pinterest.com/wplbillderry/apps-for-librarians/ [Accessed 1 May 2013].
University of Washington, Bothell (2013). Apps for the iPad. [Online] Available at: http://www.bothell.washington.edu/learningtech/help/how-to/ipad/ipad-apps [Accessed 1 May 2013].

2 May 2013

Marketing Your Library's Electronic Resources: A how-to-do-it manual by Marie R. Kennedy & Cheryl LaGuardia (Review)

With libraries typically spending an increasing chunk of their budgets on e-resources, ensuring these products are both visible and accessible to our users has become crucial. As our digital libraries continue to evolve outside of our physical spaces, the links between the library and its resources are sometimes not as obvious as we would like them to be. As Kennedy and LaGuardia put it, the adage of “if we build it [subscribe to it], they will come” cannot necessarily be relied upon in an environment where there is intense competition for our users’ attention.

The first key point the authors make is that one-off marketing events won’t work in isolation; they need to be part of a bigger plan. In this context, a nine-step cycle for developing a marketing plan for e-resources is presented, but in truth the structure could be applied to nearly any aspect of library services. Indeed there is a lot of detail on general issues such as the process of developing a marketing plan, and communicating a deliberate, clear and consistent message to your users. This could be interpreted as either a strength or a weakness of the book, depending on the reader's individual needs and motivations. The activities and processes that the authors discuss will undoubtedly provide a solid grounding in marketing for librarians in most contexts. However, whilst the text frequently links back to the area of e-resources, it often does not extend beyond the general. As a reader looking for specifics to take away, I was somewhat disappointed in this respect. For example, from memory I can’t recall a single mention of Twitter, and a quick look at the index lists a mere two pages under “social networks and marketing techniques”. Discovery interfaces get a similarly brief mention, when it would have been possible to devote an entire chapter to this area alone.

That said, Kennedy & LaGuardia do an excellent job of summarising the relevant literature in the area, although as a result, at times it is hard to hear the authors’ own voices and experiences coming through. Throughout the book there are extracts from various libraries’ marketing plans, and the last third or so of the book comprises sample plans and forms, which provide useful guidance and ideas for those developing their own strategies. There are also some nice quick reference aspects, such as the list of marketing techniques based on Kennedy’s (2010) previous research on pages 52 & 53. This provides some at-a-glance inspiration for practical ideas for promoting your e-resources, including techniques such as calendars, VLEs and user guides.

Whilst the book is certainly well-written and contains high-quality information, at times I question whether the level of detail included about writing a marketing plan is necessary, when there is a wealth of other resources and publications readers could have consulted for such information. A tighter treatment would have freed up some space for a deeper analysis and discussion of specific promotional and marketing techniques for e-resources, and how effective (or otherwise) they are. Instead the book is very much a one-stop shop, and for those already familiar with marketing plans and strategies and looking for highly specific advice, this can be a little frustrating. For those completely new to the area, however, this is certainly a book that will serve as a useful toolkit from start to finish - if there is such a thing as a finish in marketing!