Library statistics are an important means for library self-assessment and performance management. Methodologies for gathering and harnessing library statistics have a long track record in major western countries. The EU project LibEcon2000 (http://www.libecon.org/) set a regional direction for consistent data collection for library statistics, which in turn facilitates measurement of performance at the national level. The Global Centre for ICT, in conjunction with the Parliamentary Libraries Section of IFLA, has similarly undertaken regular surveys of Parliamentary libraries, whose findings provide valuable feedback for libraries in setting their direction. The recently promulgated Library Statistics Manifesto published by IFLA clearly sets out the importance of statistics in supporting the library within the Parliament. Statistics serve several purposes: to provide evidentiary support for the ways in which the library fulfils its role, to guide the library in decision making about investment in service delivery (and budgetary planning), and to guide progressive improvement of service delivery.
Performance measurement of libraries is not simply about collecting usage statistics, collection sizes and budgetary information on staffing, acquisitions and electronic resources. It also entails a continuous process of assessment that involves eliciting end users' opinions of the library's performance. Roxanne Missingham's paper on the changing role of reference services in libraries, discussed in the introduction, highlights the importance of understanding and adjusting service delivery based on the current needs of Parliamentary members and their staff.
The ISO 11620 standard also affirms the measurement of the quality and effectiveness of the services delivered against the goals and objectives of the library.
Current standards for the collection of statistics in libraries are maintained by ISO TC46/SC8, a subcommittee of the International Organization for Standardization (ISO 2789 and ISO 11620), and by the National Information Standards Organization (ANSI/NISO Z39.7) in the United States. Major projects such as LibEcon2000 have illustrated the strategic benefit of having statistics that are consistent across libraries regionally and nationally, and have informed subsequent efforts toward consistent standards for statistical gathering. Organisations such as the International Coalition of Library Consortia (ICOLC - http://www.library.yale.edu/consortia/webstats.html), the JSTOR Web Statistics Task Force (http://www.library.yale.edu/~kparker/WebStats.html), and the D-Lib Working Group’s Digital Library Metrics (http://www.dlib.org/metrics/public/) demonstrate the considerable interest in improving standards for library measurement.
An important resource for the library is the IFLA Library Statistics Manifesto (see Resources below).
Most integrated library management systems come with a suite of tools for reporting on collection usage by categories of patrons and items. The types of statistics that should be collated and tracked monthly and annually are:
The breadth and diversity of electronic systems present specific challenges to gathering common statistics across diverse platforms and services. Different suppliers, where they supply usage statistics at all, may do so in a variety of different ways. Nevertheless, assessing electronic usage as part of statistics gathering is especially important for libraries at a juncture of transition in the use of library services. An important task of the library is to assemble best-equivalence measures of usage from the different source figures. For instance, while one vendor might provide statistics on searches undertaken and downloads made, another might break these down by collection or title categories.
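As an illustrative sketch of assembling equivalence measures, the following maps each supplier's figures onto a common pair of measures. The vendor names and report fields are hypothetical; real reports will need their own mapping rules.

```python
# Hypothetical vendor reports: each supplier exposes usage under
# different field names and levels of aggregation.
vendor_reports = {
    "VendorA": {"searches": 1200, "downloads": 340},
    "VendorB": {"queries": 800,
                "title_requests": {"Journal X": 90, "Journal Y": 60}},
}

def normalise(reports):
    """Map each vendor's figures onto two common measures:
    searches run and full-text retrievals."""
    totals = {}
    for vendor, data in reports.items():
        # Treat "queries" as equivalent to "searches".
        searches = data.get("searches", data.get("queries", 0))
        retrievals = data.get("downloads")
        if retrievals is None:
            # Some vendors only report per-title figures: sum them.
            retrievals = sum(data.get("title_requests", {}).values())
        totals[vendor] = {"searches": searches, "retrievals": retrievals}
    return totals
```

The value of such a script is less the arithmetic than the fact that the equivalence decisions are recorded explicitly and applied the same way every month.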
There is no question that electronic systems can extend library services beyond normal opening hours. Where these services are delivered through an internal library “proxy”, some tracking of these resources is possible. Some vendors may provide information on when their services are being used; JSTOR, for instance, provides reports detailing the breakdown by hour of access and services used. The library will probably have to use a combination of information elicited from web server logs, its own internal systems and vendor-provided reports to glean a clear picture of electronic systems usage.
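A minimal sketch of mining web server logs for out-of-hours usage, assuming the server writes the widely used Common Log Format (the sample lines below are invented):

```python
import re
from collections import Counter

# Matches the hour field of a Common Log Format timestamp, e.g.
# 10.0.0.1 - - [10/Oct/2023:21:15:02 +0000] "GET /opac HTTP/1.1" 200 880
TIMESTAMP = re.compile(r'\[\d{2}/\w{3}/\d{4}:(\d{2}):')

def hourly_usage(log_lines):
    """Count requests per hour of day across web server log lines."""
    hours = Counter()
    for line in log_lines:
        match = TIMESTAMP.search(line)
        if match:
            hours[int(match.group(1))] += 1
    return hours

sample = [
    '10.0.0.1 - - [10/Oct/2023:21:15:02 +0000] "GET /jstor HTTP/1.1" 200 512',
    '10.0.0.2 - - [10/Oct/2023:21:40:11 +0000] "GET /opac HTTP/1.1" 200 880',
    '10.0.0.3 - - [11/Oct/2023:09:05:45 +0000] "GET /opac HTTP/1.1" 200 880',
]
```

Run over a month of logs, a tally like this shows how much traffic falls outside staffed opening hours, and can be set alongside the hourly breakdowns some vendors supply.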
Usage statistics can give you a good indication over time of the important parts of your websites (intranet, extranet and Internet). There are many tools that allow you to analyse these usage statistics (see the resources section below). The following are some guidelines on reviewing these statistics:
It is particularly worth reviewing changes in usage patterns before and after major website changes.
The measurement of customer satisfaction is newer territory for libraries, but provides important feedback that can be particularly valuable in judging strategic directions. Quantitative and Qualitative research methods are part of the basic research toolkit. Quantitative methods are applied to the analysis of population data, controlled trials, surveys, census taking, econometrics, ratings analysis and many other areas. Quantitative research involves population sampling techniques which provide the capacity to analyse data and to generalise from the findings. There are many texts on the most effective approaches to Quantitative research.
The mission of Qualitative Research is the discovery of new phenomena through careful in-depth examination of the results of non-quantitative investigation. The scope can be anything from the detailed study of a single case to the textual analysis of large amounts of free-form survey data. Approaches to Qualitative Research include:
The in-depth analysis of a particular organisation, situation or environment can highlight possible cause/effect relationships that are not otherwise apparent. Case studies represent Max Weber's “typification” - the realisation of generalised models through the detailed understanding of specific cases. They are, by their very nature, subjective and open to interpretation. A case study may involve re-interpreting existing data in a new way.
Quantitative analysis can be an effective tool for examining specific opinions and issues, but Focus Groups can be an efficient way of gathering many different opinions in a relatively short space of time. A selected panel of users discussing issues in an environment controlled by an interviewer, potentially involving a series of iterations on questions, can provide immediate feedback on the questions being tested. With an experienced interviewer, follow-up questions can be posed immediately to reveal aspects of a question that have not yet been considered, and in this way key issues can be identified quite early.
The risk of focus groups is the potential domination within a selected group of strong individuals, whose opinion tends to occupy the discussion space. Similarly, interviewer bias can subtly be communicated to the participants. Typically, the Focus Group is useful for measuring consumer reaction, evaluating consumer-purchasing decisions, and measuring the use of products and services. They can be an effective approach to measuring the potential target audience reaction to a proposed idea.
Question design for focus groups yields the best results when the target group is taken through several phases in the development of their ideas, leading to the central question or questions. Characteristically the focus group will go through four phases:
Results are gathered from four or more focus groups and compiled using a qualitative data analysis tool such as N5.
The Delphi method is an approach to forecasting using expert panels. As with a focus group, discussion and panel sessions are used to elicit opinion and ideas regarding developments that may be on the horizon. This is an iterative process that may see several groups exchange their ideas as they work towards a consensus on key future trends, issues or research directions. Given the nature of these panels, very strong facilitation is necessary to avoid an early convergence to consensus or the domination of one individual or theme. In the final round of a Delphi session, questions are often ranked in priority or probability. Such techniques are often a useful approach to formulating options in cases of high uncertainty. The work by Linstone and Turoff (1975) presents a comprehensive appraisal of the Delphi approach.
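The final-round ranking step can be tallied very simply. The sketch below (panelists and issues are invented for illustration) orders issues by the median rank the panel assigned, the median being less sensitive than the mean to a single outlying panelist:

```python
from statistics import median

# Invented final-round rankings: each panelist ranks the same
# candidate issues, 1 being the highest priority.
rankings = {
    "panelist_1": {"digital preservation": 1, "staffing": 2, "outreach": 3},
    "panelist_2": {"digital preservation": 2, "staffing": 1, "outreach": 3},
    "panelist_3": {"digital preservation": 1, "staffing": 3, "outreach": 2},
}

def consensus_order(rankings):
    """Order issues by the median rank assigned across the panel."""
    issues = next(iter(rankings.values()))
    medians = {issue: median(r[issue] for r in rankings.values())
               for issue in issues}
    return sorted(medians, key=medians.get)
```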
In many cases researchers already have a rich resource of content available for textual mining. Content analysis looks at trends, occurrences and meanings in such texts. Word frequency, contextual analysis, semantic analysis of texts, clustering and other analysis methods now rely heavily on Information Systems. Software tools such as ATLAS*TI and NUD*IST are particularly strong in methods for content analysis using Grounded Theory. Other packages focus on thesaurus-based and probabilistic analysis of texts, Semio Taxonomy and Intelligent Miner for Text being two examples. Hamlet is a software tool that focuses on various techniques for word frequency analysis. Linguistic analysts also have at their command a range of software applications focused specifically on lexical analysis, such as Interlinear Text Processor and Shoebox.
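A word-frequency pass of the simplest kind these tools perform can be sketched in a few lines; the stopword list and sample survey text below are placeholders, not part of any of the packages named above:

```python
import re
from collections import Counter

# A tiny illustrative stopword list; real analyses use much larger ones.
STOPWORDS = {"the", "and", "of", "to", "a", "in", "for", "were", "would", "more"}

def word_frequencies(text, top=5):
    """Count word occurrences in free-form text, ignoring case
    and the stopword list."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(w for w in words if w not in STOPWORDS).most_common(top)

responses = (
    "The library staff were helpful and the research briefings timely. "
    "More electronic journals would help research for committee work."
)
```

Even this crude count surfaces recurring themes (here, "research") that a qualitative coding pass can then examine in context.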
Finally, Action Research is an immensely popular method for situationally based research. Rather than attempting to compartmentalise the researcher and the subjects of the research, Action Research assumes the active engagement of the researcher in the problem and its resolution. It is focused on applied research, and continual refinement.
Consistency over time is important in the use of both qualitative and quantitative statistics, particularly where they are used to measure key performance indicators and strategic decisions for the library.
Libraries in most organisations are now subject to a level of scrutiny as to their role and relevance unlike any time before. Parliamentary libraries are not exempt from this scrutiny. It is important therefore that the library begins to prepare the statistics that demonstrate its utility in the daily life of the parliament. For management reporting these statistics are often presented in terms of KPIs, or Key Performance Indicators. Groundwork with management is needed to ensure these indicators reinforce the relevance of the library service, and groundwork is needed in the library to ensure the KPI results truly reflect the breadth of service delivery.
The purpose of gathering these statistics is to keep the Parliament informed of the ongoing contribution and value of the library, and to facilitate the direction of resources where they are most needed. The library should prepare an annual report that brings together its achievements and activities for the year. KPIs should be developed in conjunction with Parliamentary management to reflect priorities for the library in supporting the work of the Parliament. An example of the annual report from the House of Commons Parliamentary Library in the United Kingdom can be found in the case study below.
The annual report can present:
An example of an annual report is presented in the Case Study below.
<blockquote>Missingham, Roxanne. “Parliamentary library and research services in the 21st century: A Delphi study”. IFLA Journal, vol. 37, no. 1 (March 2011), pp. 52-61. doi:10.1177/0340035210396783. http://ifl.sagepub.com/content/37/1/52.full.pdf+html
Message from the Librarian and Director General Research and Information Services for Members
Parliamentary Office of Science and Technology (POST) Public Information Directorate
Information Management Directorate
Departmental Services Directorate
Examples of statistics gathering approaches
The IFLA Statistics and Evaluation section has published the:
IFLA Statistics Manifesto. http://www.ifla.org/publications/ifla-library-statistics-manifesto
The manifesto includes a model questionnaire.