Introduction
Libraries all over the world face constant reductions and closures, and there is therefore a growing need for libraries to use reliable and accessible data to show how well they perform and how they compare with other libraries. The competition has become genuinely global (Laitinen, 2019: 263). “Modern academic [as well as public] libraries need reliable and accessible data in order to be able to measure and assess the quality of their services and the satisfaction of their users. Efficient and effective tools are essential in order to make better business and service decisions, and to make the library more visible” (Jilovsky, 2011: 48). This includes internal statistics and reports, annual reports for the public, as well as data collected by the state or the national statistical providers.
Today, it is not enough for the library simply to quantify the extent of the resources or how they are used; one also must be able to show that this investment and availability of resources produces better results [e.g.] in terms of research and education in the university and that these services are being delivered in an efficient manner. This means that libraries must diversify their indices with which they are evaluated. The traditional library statistics represent a sound basis to do this analysis, but more is needed. (Laitinen and Saarti, 2012: 254.)
That is, it is “[…] very evident that it is extremely important to segment and analyse segments of users investigating their behaviour, purposes and preferences” (Pors, 2010: 23). Furthermore, comparing the USA’s and China’s national library statistical systems, Liu emphasized as early as 1997 that “[t]he library statistics generated are expected to describe and compare the effectiveness and efficiency of libraries” (193).
However, it is necessary to recognize the difference between longitudinal data for one specific library, which shows development and trends over time, and benchmarking, which requires comparable data from other entities (in this case, libraries).
Benchmarking, as defined in the Encyclopedia of Evaluation,
is a method used by organizations to compare their performance, processes, or practices with peer organizations or those in other business sectors. This method can focus on performance, in which case one identifies the most important indicators of success and then compares one’s own performance with that of other organizations. The focus can also be a particular process, such as billing or information technology. This method is common in business and industry, and it has seen growing acceptance in the public sector, where it is most associated with the identification of best practices. Often, benchmarking is a collaboration among like (similar) organizations that attempts both to identify important indicators of success and to provide data to the collaborating organizations about performance. (Mathison, 2005.)
For this process to be successful, there must be data for comparison: comparable data collected in the same way, and data from entities that are relevant to compare with.
For libraries, one important resource for data would be national library statistics, even if “[i]t is awkward and tedious to compare results […] when they are contained in separate publications” (Sumsion, 1997: 163). Here it is important to have national statistics that offer internationally comparable data and key performance indicators.
Librarians have been involved in collecting and disseminating statistics for many years. Utilising statistics to describe and assess the operation of library activities as a part of library tradition has received growing attention from researchers, policymakers, library managers, and professionals. […] The results […] may also help the national statistical providers to obtain an understanding of the characteristics and pattern of use of their products. (Liu, 2001: 187)
and show them the importance of national statistics, as well as giving them an idea of the importance of comparable data. Already in 2014, while testing the usability of national collection data in Norwegian academic libraries, Landøy and Raade found that “[o]ne conclusion is that benchmarking through use of national indicators is full of pitfalls, and that those pitfalls will only be discovered through actual usage and trials” (819).
While researching the effects of New Public Management (NPM) and austerity in European public and academic libraries in five countries (Finland, Germany, Great Britain, Norway and Romania), comparing 22 public and academic libraries, the researchers had difficulty finding key performance indicators in the national statistics that could be used for this comparison (Düren et al., 2019a).
Therefore, it was decided to start another research project, comparing selected indicators from the five national statistics, describing the indicators, comparing their quality, and analysing which indicators are missing in some countries. The national library statistics as well as the IFLA Library Map of the World will be analysed with reference to the ISO standards 11620 (ISO, 2014a) and 16439 (ISO, 2014b). The paper ends with conclusions and the need for further research.
For both research projects, a document analysis (archival research) was conducted. In addition to the basic methods of data collection, document analysis represents a further independent process group for obtaining and interpreting empirical data (Döring and Bortz, 2016: 533). In such a genuine document analysis, existing or found documents (extant documents) are used; they were produced completely independently of the research process (Döring and Bortz, 2016: 533), as is the case with the national library statistics and the IFLA Library Map of the World in these research projects. The documents are publicly accessible and could be used without having to consider copyright and data protection issues (Döring and Bortz, 2016: 535). Qualitative content analysis is one way to evaluate this data material (Döring and Bortz, 2016: 535).
The national library statistics additionally underwent a quantitative content analysis (Schnell, Hill, and Esser, 2013: 397).
What did we find in the first research project?
For the research project on the effects of New Public Management (NPM) and austerity in European public and academic libraries in five countries, comparing 22 public and academic libraries, the original idea included the comparison of data regarding information literacy for all 22 libraries (Düren et al., 2019a). The set of indicators required from the national statistics included:
Visitors / visits,
Virtual visits,
Loans,
Number of activities and events,
Number of user trainings, and
Participants in user trainings.
The idea was to compare the data over an eleven-year period, from 2007 to 2017. In the following, some examples from the different national library statistics are presented to show the difficulty of using these data for the above-mentioned research project.
All researchers were asked to provide the above-mentioned statistical data for the libraries in which they had conducted expert interviews with library leaders.
Finland
F1 | Library visits | User training | Number of attendances at user training | Events |
2007 | 1,255,964 | * | * | * |
2008 | 1,279,959 | * | * | * |
2009 | 1,218,304 | * | * | * |
2010 | 1,232,223 | * | * | * |
2011 | 1,216,734 | * | * | * |
2012 | 1,251,702 | 311 | 5,956 | 863 |
2013 | 1,163,322 | 274 | 5,052 | 842 |
2014 | 1,131,740 | 337 | 6,41 | 945 |
2015 | 1,134,023 | 484 | 9,073 | 869 |
2016 | 1,115,274 | 521 | 9,439 | 1,008 |
F1-Public library
* No data
F2 | Number of exhibitions | Number of events | Number of participants |
2007 | 15 | 6 | 6,291 |
2008 | 24 | 8 | 6,067 |
2009 | 33 | 10 | 5,375 |
2010 | 27 | 15 | 5,901 |
2011 | 41 | 14 | 6,550 |
2012 | 51 | 16 | 6,885 |
2013 | 41 | 23 | 4,667 |
2014 | 28 | 19 | 4,940 |
2015 | 21 | 19 | 4,176 |
2016 | 33 | 13 | 6,546 |
F2-Academic library
In Finland, some data was not available due to mergers. Library F2, for example, started in 2010 as a merger of three small universities, so for 2007-2009 the numbers of the three university libraries had to be added together manually.
Some indicators are missing altogether, such as “Virtual visits” and “Loans” (in F1), and the year 2017 was not yet available at the time of data collection. In F2, events and exhibitions are separate indicators, and the indicator “Number of participants” shows the number of attendants in the library’s user education. In Finland, attendance at exhibitions and events is not counted, because they take place in open spaces where counting attendants is impossible. This again makes cross-country comparison of the data difficult.
Germany
G1 | Visitors | Virtual visits | Loans | No. of activities and events |
2007 | 1,178,502 | 932,978 | 3,340,513 | 1,814 |
2008 | 1,269,334 | 1,245,417 | 3,478,557 | 1,854 |
2009 | 1,313,782 | 1,907,626 | 3,583,553 | 1,866 |
2010 | 1,141,154 | * | 3,498,198 | 1,898 |
2011 | 1,217,576 | * | 3,565,870 | 2,062 |
2012 | 1,113,693 | 757,659 | 3,651,663 | 2,018 |
2013 | 1,090,025 | 1,000,160 | 3,687,125 | 2,149 |
2014 | 1,023,645 | 907,109 | 3,527,157 | 2,290 |
2015 | 972,137 | 951,525 | 3,429,575 | 2,281 |
2016 | 977,982 | * | 3,319,472 | 2,349 |
2017 | 1,082,547 | * | 3,281,494 | 2,323 |
G1-Public library
* No data
G2 | No. of events | Visitors | Virtual visits | No. of user trainings | User trainings-participants |
2007 | 0 | 294,125 | 284,321 | 350 | * |
2008 | 0 | 316,180 | 313,634 | 343 | * |
2009 | 2 | 328,112 | 362,097 | 349 | 2,303 |
2010 | 0 | 309,317 | 293,432 | 226 | 1,479 |
2011 | 1 | 314,069 | 294,000 | 222 | 2,646 |
2012 | 1 | 323,797 | 259,438 | 199 | 3,171 |
2013 | 1 | 244,072 | 264,473 | 163 | 2,156 |
2014 | 0 | 244,463 | 254,935 | 1,440 | 4,543 |
2015 | 2 | 275,542 | 269,962 | 304 | 3,433 |
2016 | 4 | 291,733 | * | 340 | 3,064 |
2017 | 2 | 234,041 | * | 288 | 3,863 |
G2-Academic library
* No data
The German national library statistics, the Deutsche Bibliotheksstatistik (DBS), follow the definitions of the ISO Standard 2789 (ISO, 2013) (Hochschulbibliothekszentrum des Landes Nordrhein-Westfalen (HBZ), 2019).
However, as can be seen, several data points are missing in some years. In one case, quality control after the collection of the statistical data appears to be lacking: it cannot be correct that 1,440 user trainings were conducted in a single year, compared to the other years, especially when set against the number of participants in user trainings. Such implausible values could be caught with even a simple automated check, as sketched below.
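The following is a minimal sketch of such a plausibility check, applied to the G2 “No. of user trainings” series from the table above; the threshold rule is a deliberately crude assumption of ours, not a method prescribed by any of the cited standards.

```python
# Crude plausibility check for annual library statistics: flag any year
# whose value deviates from the median of all other years by more than
# a chosen factor. The figures are the G2 "No. of user trainings" series.

user_trainings = {
    2007: 350, 2008: 343, 2009: 349, 2010: 226, 2011: 222, 2012: 199,
    2013: 163, 2014: 1440, 2015: 304, 2016: 340, 2017: 288,
}

def flag_outliers(series, factor=3.0):
    flagged = []
    for year, value in series.items():
        others = sorted(v for y, v in series.items() if y != year)
        median = others[len(others) // 2]  # median of the remaining years
        if value > factor * median or factor * value < median:
            flagged.append((year, value, median))
    return flagged

for year, value, median in flag_outliers(user_trainings):
    print(f"{year}: reported {value}, median of other years {median} - check!")
# Prints: 2014: reported 1440, median of other years 304 - check!
```

A check like this would not prove the 2014 figure wrong, but it would prompt the statistics provider to query the library before publication.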
Great Britain
UK1 | Visits | Loans | Downloads |
2007 | 1,237,496 | 1,333,161 | * |
2008 | 1,191,603 | 1,328,975 | * |
2009 | 1,093,726 | 1,373,597 | * |
2010 | 1,087,592 | 1,370,902 | * |
2011 | 1,069,910 | 1,366,464 | * |
2012 | 1,075,137 | 1,333,912 | 9,766 |
2013 | 986,174 | 1,275,718 | 18,266 |
2014 | 1,026,277 | 1,186,870 | 52,938 |
2015 | 1,058,781 | 1,108,625 | 66,101 |
2016 | 1,166,735 | 989,014 | 109,116 |
UK1-Public library
* No data
UK2 | Visits | Loans | Full text article requests | E-book section requests |
2007 | 2,097,336 | 1,841,236 | 4,859,439 | 449,897 |
2008 | 1,853,336 | 1,971,187 | 5,600,000 | 500,000 |
2009 | 1,844,336 | 1,983,269 | 5,376,000 | 420,000 |
2010 | 1,906,595 | 2,014,650 | 7,487,950 | 382,062 |
2011 | 1,929,637 | 2,100,671 | 6,846,509 | * |
2012 | 2,158,446 | 1,829,995 | 8,000,000 | 600,000 |
2013 | 2,678,134 | 1,855,178 | 7,353,310 | 1,062,840 |
2014 | 2,827,945 | 386,781 | 7,584,918 | 1,280,815 |
2015 | 2,801,548 | 362,073 | 8,240,223 | 1,952,969 |
2016 | 2,895,423 | 331,985 | 11,298,954 | 5,296,125 |
UK2-Academic library
* No data
In the UK, too, much data is missing, and the most recent year was not available at the time of data collection.
Norway
N1 | Visits | Loans | Downloads |
2013 | 149,942 | 134,978 | 0 |
2014 | 130,012 | 127,672 | 1,332 |
2015 | 145,724 | 145,745 | 4,557 |
2016 | 148,013 | 149,169 | 4,622** |
2017 | 156,246 | 141,715 | 4,398** |
N1-Public library
** Incomplete; data on PressReader usage is missing
N2 | Visits | Loans |
2013 | 530,476 | 500,389 |
2014 | 518,252 | 492,890 |
2015 | 481,027 | 532,196 |
2016 | 421,931 | 515,111 |
2017 | 314,020 | 478,171 |
N2-Academic library
No comparable data for electronic downloads available
Here, most of the requested data was not easily available at the time of the above-mentioned research project.
In Norway, the National Library of Norway collects data from school, public and academic libraries. The data are aggregated before being published on the National Library webpage and at Statistics Norway, but it is possible to access the “raw data” - the data submitted by the libraries over a five-year time span, though separately for each library type. The researcher may quite easily combine the data as needed (a sketch of such a combination follows below). Unfortunately, not all libraries submit good-quality data. In addition, it needs to be considered that many academic libraries were in the middle of a change of library systems in 2015, so the data was either not available or not reliable.
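As an illustration only, combining the per-library-type raw data might look like the following sketch. The file names and column names are hypothetical, since the actual export format of the Norwegian raw data is not described here.

```python
# Hypothetical sketch: combine per-library-type raw data exports into one
# table for analysis. File and column names are invented for illustration.
import pandas as pd

frames = []
for library_type, path in [
    ("public", "public_libraries_2013_2017.csv"),
    ("academic", "academic_libraries_2013_2017.csv"),
]:
    df = pd.read_csv(path)
    df["library_type"] = library_type  # remember which export each row came from
    frames.append(df)

combined = pd.concat(frames, ignore_index=True)

# Example question: total reported loans per year across both library types.
print(combined.groupby("year")["loans"].sum())
```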
The lack of comparable data is also influenced by the fact that Norwegian public libraries are small (half of them had fewer than two members of staff in 2018, see https://bibliotekutvikling.no/statistikk/forside/statistikk-for-folkebibliotek/historisk-statistikk-for-folkebibliotek/ accessed: 02.04.2020). This obviously influences the statistical accuracy and interest, and it will be interesting to see whether the widespread mergers of public libraries that follow the reorganisation of municipalities and regions in Norway will help. The national library statistics for public libraries are also said to be moving from the municipal statistics reported for financial purposes to Statistics Norway. In addition, the National Library of Norway has promised to implement a better and more intuitive statistics tool.
At the same time it is necessary to remember that as long as the input data are flawed, the statistical analyses and benchmarking will be less useful.
Romania
At the time of the above-mentioned research project, no national library statistics could be used. The library leaders who had been interviewed provided some data.
R1 | Visits | Loans | Activities |
2017 | 105,509 | 162,841 | - |
The last five years | 549,003 | 852,134 | - |
No data (year unknown) | - | - | 33 |
R1-Public library
There is no individual yearly data for visits and loans in this library for the last five years. It is also unclear in which year the 33 activities were carried out.
In the meantime, Romania has developed its statistics and the indicators have been improved, though not with regard to the indicators needed for the above-mentioned research project. The historical data is still difficult to find. Data is gathered by the National Institute of Statistics (NIS; Romanian: Institutul Național de Statistică, INS), “a Romanian government agency which is responsible for collecting national statistics, in fields such as geography, the economy, demographics and society” (Wikipedia, 2020). The NIS has data from 1995 onwards for the number of libraries by type of library, the number of books in the libraries, and the number of active users. It also has statistics on loans from 2011 onwards, for each county, as well as the number of librarians employed (INSSE, 2020 http://statistici.insse.ro:8077/tempo-online/#/pages/tables/insse-table accessed: 02.04.2020).
Discussion
In retrospect, it seemed naive to think that there would be comparable data in the different national library statistics. But at least the two indicators “Visitors / visits” and “Loans” should have been available in every country. Yet “Visitors / visits” is always a difficult indicator: some libraries count by hand (regularly or at random), while others have an electronic device installed at the door which counts either only incoming visitors or both incoming and outgoing visitors; the sketch below illustrates how much the counting method alone can matter. “Loans”, too, is an indicator that seems to be measured differently in every country, as the researcher from Great Britain, for example, reported the indicators “Loans”, “Full text article requests”, and “E-book section requests” (UK2 - Academic library).
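To make the definitional point concrete, here is a small hypothetical calculation (the raw count is invented): a gate counter that registers both incoming and outgoing passages reports roughly twice the figure of one that registers entries only.

```python
# Hypothetical illustration of how the gate-counting method alone changes
# the "visits" indicator. The raw count below is invented.

raw_gate_events = 2_200_000  # annual passages registered by the door counter

# If the counter registers both directions, each visit produces two events
# (one entry + one exit), so the figure must be halved.
visits_bidirectional_counter = raw_gate_events // 2   # 1,100,000 visits

# If the counter registers entries only, each event already is one visit.
visits_entry_only_counter = raw_gate_events           # 2,200,000 visits

print(f"Bidirectional counter: {visits_bidirectional_counter:,} visits")
print(f"Entry-only counter:    {visits_entry_only_counter:,} visits")
# Same footfall, a factor-of-two difference in the reported indicator.
```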
However, even when data was available and possible to find, it was not always reported to the research group. This was evident from our informants, especially in Norway and Romania. We see this as an indication of a lack of statistical skills in the libraries. In libraries all over Europe, those responsible for delivering annual statistics sometimes change annually, and often there are no national training sessions (lectures and workshops) for the libraries. In addition, the human resources devoted to evaluation have in some libraries been cut to a minimum, allowing quality control on the macro level only and leaving in the shadow “small” discrepancies that do not show at the national level but do matter at the library level.
At the same time, it is necessary to remember that as long as the input data are flawed, the statistical analyses will be less useful. It will be difficult to benchmark within the nation, and impossible to benchmark with similar institutions abroad. This undermines the work that libraries can carry out to strengthen their position within their institutions, and also reduces the enthusiasm and incentive for learning from others.
We see no quick fix to this conundrum and the dilemma it reveals. Problems with the use of statistical data were reported as early as 2009, when, e.g., Lau stated, “[…] measurement of progress in Information Literacy is not easy to carry out due to the lack of sufficient and meaningful statistics” (2009: 108). And Poll underlined, also in 2009, that the “[c]omparison of statistical results between institutions or countries will never be possible, if the data and the data collection methods have not been defined and fixed carefully” (27).
National Library Statistics and the IFLA Library Map of the World in Reference to the ISO Standards 2789, 11620, 16439, and 21248
Internationally, IFLA (the International Federation of Library Associations and Institutions) has long seen the challenges that follow from the fact that libraries globally lack comparable data for benchmarking and advocacy. Several initiatives have been set up, the latest being the IFLA Library Map of the World.
“Along with [the new] development, new evaluation methods and indicators will be needed to show the value and impact of the operations and services in cultural heritages organizations” (Laitinen, 2019: 263), and this applies to academic as well as public libraries. It includes numerical indicators, but also indicators that capture the strengths and the development of each library, in order to produce and measure benefit for the libraries’ customers (Laitinen, 2019: 263). Melo and Repanovici likewise emphasize that “[…] it is particularly important for libraries to be able to show that they function efficiently, but also that they provide services with impact and value to the success of institutional goals” (2018: 431) and that “[t]hroughout the world it has already been recognized that these data are essential for management, decision making and Library Advocacy” (2018: 438). Landøy and Raade state that “[a]cademic libraries may find results in benchmarking that may be good arguments in their budgetary discussions with their universities” (2014: 819). This again shows the importance of good-quality national library statistics data for enabling benchmarking between libraries.
All in all, “[l]ibrary statistics in a vacuum are of limited usefulness” (Heaney, 2009: 24). In South Africa, for example, “[t]here seems to be consensus among academic libraries […] that statistics have to be collected and made available nationally. The majority of these institutions have started using the statistics database” (Chiware and Becker, 2015: 6). As another example, a Russian national standard has been developed that “[…] creates conditions not only for drawing comparisons but also for accurate analysis to be used in planning, accounting, and forecasting” (Dzhigo, 2015: 148).
The idea for the IFLA Library Map of the World started, “[w]hen IFLA needed reliable data about libraries and their services worldwide, [and] it became apparent that there are no such data. […] The final goal is that these statistics should be collected regularly on a national basis, so that there will be reliable and internationally comparable data of library services and library use” (Ellis et al., 2009: 123).
The IFLA Library Map of the World describes itself as follows: “Selected library performance metrics provide national level library data across all types of libraries in all regions of the world” (IFLA, 2019: Home). The Library Map of the World aims to collect the most recent data from existing sources, if and where available, and from as many countries as possible, and covering all types of libraries.
The initial set of performance metrics is the same for all types of libraries (IFLA, 2019: Home):
Number of libraries (library service points)
Number of libraries (library service points) providing internet access
Number of full-time equivalent (FTE) staff
Number of volunteers (headcount)
Number of registered users
Number of visitors
Number of loans and downloads
As good as the general idea of a worldwide library map is, the IFLA Library Map of the World currently does not provide comparable longitudinal data for each country. In addition, it only offers a comparison between countries, not between individual libraries from different countries.
The framework for collecting library statistics and for evaluating library operations is given in the current standards of the library field: the general library statistics standard ISO 2789 (ISO, 2013), the standard for evaluating the impact of libraries ISO 16439 (ISO, 2014b), the standard ISO 11620 (ISO, 2014a) for library performance indicators, and the standard for the evaluation of national libraries ISO 21248 (ISO, 2019). These standards are not yet used in all of the national statistics described in the section “What did we find in the first research project?”, nor are they at present the basis for the IFLA Library Map of the World.
Conclusions and Further Research
A big challenge in producing common national statistics is the lack of resources for sufficient quality control at the national level, which shifts the responsibility for checking and revisions to the local level. Despite the seemingly simple task of checking the annual statistics, those in charge of the statistics in their libraries may not always have the skills and routine needed. At the national level, the resources for quality control are sufficient on the macro level only, leaving individual libraries in the shadow.
We emphasise the need for support and training for the library staff who compile and register data from the individual libraries. Our research has also shown a need to clarify the actual indicators: what is included in and excluded from the different categories must be made very clear to avoid confusion and misunderstandings. Examples include: “What kind of staff is to be counted as library staff?” and “What is a loan?”. An indicator like “loans per member of library staff”, which can be used to signal efficiency, may differ if, for example, janitors and cleaners are counted as library staff in some libraries but not in others, as the sketch below illustrates.
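The following worked example uses invented figures to show how strongly the staff definition alone can shift such a derived indicator.

```python
# Hypothetical illustration (all figures invented) of how the definition
# of "library staff" changes the derived indicator "loans per staff FTE".

loans = 500_000        # annual loans
librarian_fte = 40.0   # professional library staff, full-time equivalents
support_fte = 10.0     # janitors, cleaners, etc., full-time equivalents

loans_per_fte_narrow = loans / librarian_fte                 # 12,500
loans_per_fte_broad = loans / (librarian_fte + support_fte)  # 10,000

print(f"Narrow staff definition: {loans_per_fte_narrow:,.0f} loans per FTE")
print(f"Broad staff definition:  {loans_per_fte_broad:,.0f} loans per FTE")
# The same library appears 25% "more efficient" under the narrow definition.
```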
It is important to have data available that enables academic as well as public libraries to benchmark their performance against libraries from their own country as well as internationally. For this, more research has to be done to define indicators that can be used worldwide.
Due to the demands of the libraries’ modern environment, the principle of measuring library performance should, in addition to traditional assessment, emphasize the impact and value of the libraries, both inside their parent organizations (university, college, school, municipality, research institute, etc.) and in society, as well as the additional value perceived by the clientele. Therefore, in the future compilation of library statistics at both the local and national level, one of the growing needs will be the ability to show this impact of libraries.
As neither the national library statistics nor the IFLA Library Map of the World currently support all questions regarding a library’s own performance in comparison with other libraries, other ideas have come up, such as using the Net Promoter Score (NPS) to measure library users’ satisfaction with library services (Laitinen, 2018). Using the NPS and other possible quality indicators would be one way of turning qualitative data, e.g. from user surveys, into numeric values, as introduced in the standard ISO 16439 (ISO, 2014b); a minimal sketch of the NPS calculation follows below. The next step might also be to include such quality indicators in the published national statistics to ease benchmarking of libraries of the same type.
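For readers unfamiliar with the measure, the standard NPS calculation is sketched below on invented survey answers; the scoring thresholds (promoters 9-10, detractors 0-6) follow the conventional NPS definition, not anything specified by ISO 16439.

```python
# Minimal sketch of the standard Net Promoter Score calculation, applied
# to invented answers to "How likely are you to recommend this library?"
# (scored 0-10).

ratings = [10, 9, 9, 8, 7, 10, 6, 3, 9, 8, 10, 5, 9, 7, 10]

promoters = sum(1 for r in ratings if r >= 9)   # scores 9-10
detractors = sum(1 for r in ratings if r <= 6)  # scores 0-6

nps = 100 * (promoters - detractors) / len(ratings)
print(f"NPS = {nps:.0f}")  # share of promoters minus share of detractors
# For these ratings: 8 promoters, 3 detractors, 15 answers -> NPS = 33
```

Because the result is a single number on a fixed scale, it lends itself to exactly the kind of cross-library comparison in national statistics that this paper argues for.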