Chapter 6: Impact measures and statistics
Library statistics are an important means of library self-assessment and performance management. Methodologies for gathering and harnessing library statistics have a long track record in the major western countries. The EU project LibEcon2000 (http://www.libecon.org/) set a regional direction for consistent data collection for library statistics, which in turn facilitates measurement of performance at the national level. The Global Centre for ICT in Parliament, in conjunction with the Parliamentary Libraries Section of IFLA, has similarly undertaken regular surveys of parliamentary libraries, whose findings provide valuable feedback for libraries in setting their direction. The recently promulgated Library Statistics Manifesto published by IFLA clearly sets out the importance of statistics in supporting the library within the Parliament. Statistics serve several purposes: to provide evidentiary support of the ways in which the library fulfils its role, to guide the library in decision making about investment in service delivery (and budgetary planning), and to guide progressive improvement of service delivery.
Performance measurement of libraries is not simply about collecting usage statistics, collection sizes and budgetary information on staffing, acquisitions and electronic resources. It also entails a continuous process of assessment that involves eliciting end users' opinions of the library's performance. Roxanne Missingham's paper on the changing role of reference services in libraries, discussed in the introduction, highlights the importance of understanding and adjusting service delivery based on the current needs of parliamentary members and their staff.
The ISO 11620 standard also affirms the importance of measuring the quality and effectiveness of the services delivered against the goals and objectives of the library.
Current standards for the collection of statistics in libraries are maintained by ISO TC46/SC8, a subcommittee of the International Organization for Standardization (ISO 2789 and ISO 11620), and by the National Information Standards Organization (ANSI/NISO Z39.7) in the United States. Major projects such as LibEcon2000 have illustrated the strategic benefit of statistics that are consistent across libraries regionally and nationally, and have informed subsequent efforts toward consistent standards for statistical gathering. Organisations such as the International Coalition of Library Consortia (ICOLC - http://www.library.yale.edu/consortia/webstats.html), the JSTOR Web Statistics Task Force (http://www.library.yale.edu/~kparker/WebStats.html), and the D-Lib Working Group on Digital Library Metrics (http://www.dlib.org/metrics/public/) demonstrate the considerable interest in improving standards for library measurement.
An important resource for the library is the IFLA Library Statistics Manifesto (see Resources below).
Measuring resource use
Most integrated library management systems come with a suite of tools for reporting on collection usage by categories of patrons and items. The types of statistics that should be collated and tracked monthly and annually are:
- Acquisitions by type of patron and type of item against budget
- Circulation statistics (reservations/holds and borrowings)
- In-house usage statistics (many systems allow in-house usage to be tracked by checking in items left on desks and trolleys before they are reshelved; this can provide a valuable measure of consultation that never appears in circulation figures)
- Overdue rates and return rates
- Search statistics (what subject and keywords are searched)
- Web-based statistics - what parts of the library website are most frequently used
- Digital library usage statistics
- Reference queries by client and by type of query and resource used
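Where the library management system can export transaction data, the monthly collation described above can be largely automated. The sketch below is a minimal example only; the CSV column names ("date", "patron_type", "item_type") are assumptions, and should be adjusted to match whatever your own system actually exports.

```python
# Sketch: collating monthly circulation statistics from an ILS export.
# The column names below are assumed -- adapt them to your own system.
import csv
from collections import Counter
from datetime import datetime

def monthly_circulation(csv_path):
    """Count loans per (month, patron type, item type) from an export file."""
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Reduce each transaction date to a "YYYY-MM" month bucket.
            month = datetime.strptime(row["date"], "%Y-%m-%d").strftime("%Y-%m")
            counts[(month, row["patron_type"], row["item_type"])] += 1
    return counts
```

The resulting counter can be tabulated by month for the monthly report, or summed across months for the annual figures.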
Measuring electronic collection usage
The breadth and diversity of electronic systems present specific challenges to gathering common statistics across diverse platforms and services. Different suppliers, where they supply usage statistics at all, may do so in a variety of different ways. Nevertheless, assessing electronic usage as part of statistics gathering is essential for libraries at a critical juncture of transition in the use of library services. An important task of the library is to assemble the best equivalent measures of usage from these different source figures. For instance, while one vendor might provide statistics on searches undertaken and downloads made, another might break this down by collection or title categories.
There is no question that electronic systems can extend library services beyond normal opening hours. Where these services are delivered through an internal library "proxy", some tracking of these resources is possible. Some vendors may provide information on when their services are being used: JSTOR, for instance, provides reports detailing the breakdown of access and services used by hour. The library will probably have to use a combination of information gleaned from web server logs, its own internal systems and vendor-provided reports to build a clear picture of electronic systems usage.
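Where proxy or web server logs are available, even a short script can produce an hour-of-day usage profile of the kind JSTOR reports. The sketch below is a minimal example, assuming logs in the common Apache "combined" format; check what your own server or proxy actually writes before relying on the pattern.

```python
# Sketch: extracting an hour-of-day usage profile from a web server or
# proxy access log. Assumes Apache "combined" format timestamps, e.g.
#   10.0.0.1 - - [12/Mar/2011:14:03:55 +0000] "GET /catalogue HTTP/1.1" 200 512
import re
from collections import Counter

TIMESTAMP = re.compile(r"\[\d{2}/\w{3}/\d{4}:(\d{2}):")

def usage_by_hour(lines):
    """Return a Counter mapping hour of day (0-23) to number of requests."""
    hours = Counter()
    for line in lines:
        m = TIMESTAMP.search(line)
        if m:
            hours[int(m.group(1))] += 1
    return hours
```

Run over a month of log lines, this shows at a glance how much use falls outside the library's staffed opening hours.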
Website usage statistics
Usage statistics can give you a good indication over time of the important parts of your websites (intranet, extranet and Internet). There are many tools that allow you to analyse these usage statistics (see the resources section below). The following are some guidelines on reviewing these statistics:
- Page hits are useful as a relative measure over time and to measure the relative popularity of different pages and sections of the site. They are not an indication of the number of users, since much usage may be masked by web site caches.
- Customer visit figures are an approximation of the number of unique customers visiting your site. They rely on the web server logs to give sufficient information to estimate which pages a single customer has used over time (as distinct from usage by discrete different customers).
- Usage by hour of day - this information can be very useful in measuring times during which your website receives peak usage during the week.
- Referring information - when provided by your web logs can give useful statistics on where customers came *from* to reach your website
- Search keywords and phrases - gives an indication of topic areas used by customers using your site
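The visit and referrer figures above can be approximated directly from the raw logs when a full analytics tool is not in place. The sketch below is illustrative only: it assumes Apache "combined" format lines, and treats each distinct (IP address, user agent) pair as one visitor, which is a rough heuristic given the caching caveats noted above.

```python
# Sketch: approximating unique visits and referrer counts from a
# combined-format access log. The (IP, user agent) visitor heuristic is
# an approximation only -- caches and shared proxies mask real users.
import re
from collections import Counter

LINE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ "([^"]*)" "([^"]*)"'
)

def visits_and_referrers(lines):
    """Return (approximate visitor count, Counter of referring URLs)."""
    visitors = set()
    referrers = Counter()
    for line in lines:
        m = LINE.match(line)
        if not m:
            continue
        ip, referrer, agent = m.groups()
        visitors.add((ip, agent))
        if referrer not in ("", "-"):  # "-" means no referrer was sent
            referrers[referrer] += 1
    return len(visitors), referrers
```

The referrer counter directly answers the "where did customers come from" question, and search-engine referrers usually carry the query keywords as well.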
It is particularly worth reviewing changes in usage patterns before and after major website changes.
Measuring customer satisfaction
The measurement of customer satisfaction is newer territory for libraries, but provides important feedback that can be particularly valuable in judging strategic directions. Quantitative and qualitative research methods are part of the basic research toolkit. Quantitative methods are applied to the analysis of population data, controlled trials, surveys, census taking, econometrics, ratings analysis and many other areas. Quantitative research involves population sampling techniques which give the capacity to analyse data and to generalise from the findings. There are many texts on the most effective approaches to quantitative research.
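As a concrete illustration of the sampling techniques involved, the sketch below applies the standard normal-approximation sample-size formula with a finite-population correction. The 95% confidence level (z = 1.96) and worst-case proportion p = 0.5 are conventional defaults, not requirements.

```python
# Sketch: how many survey responses are needed to generalise to the
# whole user population. Standard formula n = z^2 * p * (1-p) / e^2,
# then the finite-population correction for small populations.
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Responses needed for the given margin of error, at 95% confidence."""
    n = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    adjusted = n / (1 + (n - 1) / population)  # finite-population correction
    return math.ceil(adjusted)
```

For a parliamentary user population of around 1,000, this gives a required sample of roughly 280 responses at a 5% margin of error, which is why response rates matter as much as questionnaire design.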
The mission of Qualitative Research is the discovery of new phenomena through careful in-depth examination of the results of non-quantitative investigation. The scope can be anything from the detailed study of a single case to the textual analysis of large amounts of free-form survey data. Approaches to Qualitative Research include:
- Focus groups
- Case Studies
- Delphi Method
- Content Analysis
- Action Research
The in-depth analysis of a particular organisation, situation or environment can highlight possible cause-and-effect relationships that are not otherwise apparent. Case studies represent Max Weber's "typification": the realisation of generalised models through the detailed understanding of specific cases. They are, by their very nature, open to interpretation, and subjective. A case study may involve re-interpreting existing data in a new way.
Quantitative analysis can be an effective tool for the analysis of specific opinions and issues. Focus groups, however, can be an efficient way of gathering many different opinions in a relatively short space of time. A selected panel of users discussing issues in an environment controlled by an interviewer, potentially involving several iterations on questions, can provide immediate feedback on the questions being tested. With an experienced interviewer, follow-up questions can be posed immediately to reveal aspects of a question that have not yet been considered, and in this way key issues can be identified quite early.
The risk of focus groups is the potential for strong individuals to dominate the group, their opinions tending to occupy the discussion space. Similarly, interviewer bias can be subtly communicated to the participants. Typically, the focus group is useful for measuring consumer reaction, evaluating consumer purchasing decisions, and measuring the use of products and services. Focus groups can be an effective approach to measuring the target audience's reaction to a proposed idea.
Question design for focus groups yields the best results when the target group is taken through several phases in the development of their ideas, leading up to the central question or questions. Characteristically the focus group will go through four phases:
- Introductory questions. These introduce the broad interest area. Their main purpose is to stimulate the initial discussion among the participants.
- Transitional questions. The group should be led through more concrete questions, examples or case studies, which focus the discussion in the interest area.
- Key Questions. The key focus group questions are introduced by the moderator when the group has reached a suitable level of discussion and engagement in the interest area. The key interest areas should be directly addressed. Feedback, discussion and the following of interesting aspects of the discussion are a key role of the moderator.
- Final Questions. A final series of questions can be used to wrap up the discussion and give a sense of closure, as well as exploring ancillary topics of interest arising from the key questions of the focus group.
Results are gathered from four or more focus groups and compiled using a qualitative data analysis tool such as N5.
The Delphi Method
The Delphi method is an approach to forecasting using expert panels. Like a focus group, discussion and panel sessions are used to elicit opinion and ideas regarding developments that may be on the horizon. This is an iterative process that may see several groups exchange their ideas as they work towards a consensus on key future trends, issues or research directions. Given the nature of these panels, very strong facilitation is necessary to avoid an early convergence to consensus or the domination of one individual or theme. In the final round of a Delphi session, questions are often ranked by priority or probability. Such techniques are often a useful approach to formulating options in cases of high uncertainty. The work by Linstone and Turoff (1975) presents a comprehensive appraisal of the Delphi approach.
In many cases researchers already have a rich resource of content available for textual mining. Content analysis looks at trends, occurrences and meanings in such texts. Word frequency, contextual analysis, semantic analysis of texts, clustering and other analysis methods now rely heavily on information systems. Software tools such as ATLAS.ti and NUD*IST are particularly strong in methods for content analysis using Grounded Theory. Other packages focus on thesaurus-based and probabilistic analysis of texts, Semio Taxonomy and Intelligent Miner for Text being two examples. Hamlet is a software tool that focuses on various techniques for word frequency analysis. Linguistic analysts also have at their command a range of software applications focused specifically on lexical analysis, such as Interlinear Text Processor and Shoebox.
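The simplest of these techniques, a word-frequency pass over free-form survey responses, can be sketched in a few lines without any of the specialist packages named above. The stop-word list here is illustrative only; a real study would use a fuller list and probably stemming as well.

```python
# Sketch: minimal word-frequency content analysis of free-form survey
# text. The stop-word list is a small illustrative sample only.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "it", "for"}

def word_frequencies(texts, top=10):
    """Return the most common non-stop-words across a list of texts."""
    counts = Counter()
    for text in texts:
        for word in re.findall(r"[a-z']+", text.lower()):
            if word not in STOP_WORDS:
                counts[word] += 1
    return counts.most_common(top)
```

Even this crude count can surface recurring themes in comment fields that would otherwise only emerge from a full manual reading.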
Finally, Action Research is an immensely popular method for situationally based research. Rather than attempting to compartmentalise the researcher and the subjects of the research, Action Research assumes the active engagement of the researcher in the problem and its resolution. It is focused on applied research, and continual refinement.
Consistency over time is important in the use of both qualitative and quantitative statistics, particularly where they are used to measure key performance indicators and strategic decisions for the library.
Reporting and Key Performance Indicators
Libraries in most organisations are now subject to a level of scrutiny of their role and relevance unlike any time before. Parliamentary libraries are not exempt from this scrutiny. It is important, therefore, that the library prepares the statistics that demonstrate its utility in the daily life of the parliament. For management reporting these statistics are often presented in terms of KPIs, or Key Performance Indicators. Groundwork with management is needed to ensure these indicators reinforce the relevance of the library service; groundwork is equally needed within the library to ensure the KPI results truly reflect the breadth of service delivery.
The purpose of gathering these statistics is to keep the Parliament informed of the ongoing contribution and value of the library and to facilitate the direction of resources where they are most needed. The library should prepare an annual report which brings together its achievements and activities for the year. KPIs should be developed in conjunction with the parliamentary management to reflect priorities for the library in supporting the work of the Parliament. An example of the annual report from the House of Commons Library in the United Kingdom can be found in the case study below.
The annual report can present:
- the Key Performance Indicators - these might include:
- collection usage statistics
- collection development statistics
- research service statistics (requests, reports)
- training delivered
- website usage statistics
- the major projects and achievements for the previous year
- the major tasks facing the library for the forthcoming year
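As a minimal illustration of turning such figures into report-ready indicators, the sketch below computes year-on-year percentage movements. The KPI names and figures are placeholders only, not drawn from any actual library's report.

```python
# Sketch: year-on-year percentage movement per KPI for an annual report.
# KPI names and figures below are illustrative placeholders.
def kpi_changes(previous, current):
    """Percentage change per KPI between two years of figures."""
    return {
        name: round(100.0 * (current[name] - previous[name]) / previous[name], 1)
        for name in current
        if name in previous and previous[name]
    }

year_2010 = {"loans": 12000, "research requests": 3400, "website visits": 85000}
year_2011 = {"loans": 11400, "research requests": 3570, "website visits": 99450}
```

Presenting the movement alongside the raw figure helps management see at a glance where demand is shifting, for example from loans towards web delivery.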
Case study
An annual report example from the House of Commons Library in the United Kingdom is outlined below.
Missingham, Roxanne. "Parliamentary library and research services in the 21st century: A Delphi study". IFLA Journal, March 2011, vol. 37, no. 1, pp. 52-61. doi:10.1177/0340035210396783. http://ifl.sagepub.com/content/37/1/52.full.pdf+html
Message from the Librarian and Director General Research and Information Services for Members
- What We Said We Would Do
- Activity and Performance
- Other developments
Parliamentary Office of Science and Technology (POST) Public Information Directorate
- What we said we would do
- Activity and Performance
- Developments in 2010/11
- Other developments
Information Management Directorate
- What We Said We Would Do
- Activity and Performance
- Developments in 2010/11
- Other developments
Departmental Services Directorate
- What We Said We Would Do
- Activity and Performance
- DIS Structure (April 2011)
Resources
- Reference and User Services Guidelines for Introducing Electronic Information Resources to Users. http://www.ala.org/ala/mgrps/divs/rusa/resources/guidelines/guidelinesintroduction.cfm. These guidelines from the ALA usefully describe some of the procedures needed to help users make the most of the electronic services provided by the library. The ALA also publishes guidelines on Implementing and Maintaining Virtual Reference Services (http://www.ala.org/ala/mgrps/divs/rusa/resources/guidelines/virtual-reference-se.pdf).
- ISO 11620:2008 Performance Indicators. http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=37853. This standard defines performance indicators for libraries and defines steps for establishing these performance indicators.
- ISO 2789:2006 International library statistics. http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=39181. Defines categories and definitions for the international reporting of library statistics.
- Piwik analytics - http://piwik.org/ - An open source alternative to Google analytics. This may require a little extra installation time on your servers, but has quite powerful reporting capability similar to Google analytics.
- AWStats open source log analyser - http://awstats.sourceforge.net/ - AWStats is a powerful open source log analyser. Your IT area should be able to assist you in providing log files from your intranet and extranet. The tool requires some initial configuration time, but has strong reporting capabilities, including keyword search usage. Since it operates on your log files, it is not restricted to showing only public website usage statistics.
Examples of statistics gathering approaches
- LibEcon. http://www.libecon.org/. A European initiative to gather consistent individual and regional statistics on libraries.
- LIBQUAL. http://www.libqual.org/home. LIBQUAL is a not-for-profit structured set of services to "solicit, track, understand, and act upon users' opinions of service quality". The merit of the system is its widespread base and the ability to assess individual library results against a large performance base gathered over time. It is managed by members of the Association of Research Libraries (principally large university libraries). The approach and methodology are strong, and could form the basis for similar regional evaluation arrangements for parliamentary libraries.
- International Collections of Statistics. http://www.caul.edu.au/caul-programs/caul-statistics/interstats. A useful reference resource provided by the Council of Australian University Libraries.
The IFLA Statistics and Evaluation Section has published the IFLA Library Statistics Manifesto (http://www.ifla.org/publications/ifla-library-statistics-manifesto), which includes a model questionnaire.