DOI: 10.1111/hir.12040

A bibliometric approach demonstrates the impact of a social care data set on research and policy

Christine Urquhart* & Sara Dunn†

*Department of Information Studies, Aberystwyth University, Aberystwyth, UK, and †SaraDunn Associates, Dorchester, UK

Abstract

Background: The National Minimum Dataset for Social Care (NMDS-SC) has provided detailed data since 2006 on the workforce for adult social care services in England. In 2012, the organisation responsible for the data set commissioned an evaluation into the impact of the data set on researchers and policymakers.

Objective: To discuss how a novel, wide-ranging bibliometric approach, complemented by interviews of key informants, demonstrated the developing impact of the data set.

Methods: The evaluation comprised web metrics to assess NMDS-SC-related traffic on relevant websites; bibliometrics to assess the use of NMDS-SC data in scholarly publications and grey literature; telephone interviews with 12 key users of the data set; and an online survey completed by 24 key users of the data set. A theoretical framework for research impact was used.

Results: The web metrics demonstrated an increase in traffic on the relevant pages of the organisation's website. There were references to the data set in 175 separate publications (15% from academic journals, 50% as policy/practice reports and 35% as media communications). Interviews evidenced many impacts, for example provision of robust data for secondary analysis that challenged conventional views about the social care workforce.

Conclusion: Bibliometrics plus interviews provided a rounded picture of the data set's impact.

Keywords: bibliometrics; data set; qualitative evaluation; research impact; social care

Key Messages

• Librarians supporting health and social services research should use a set of bibliometric/altmetric indicators to assess research impact.
• Qualitative methods help to assess important conceptual uses of research.
• Librarians should provide better guidance on how to cite a data set.
• Frameworks of research impact and citation typologies need to reflect the use of social media by different disciplines.

Correspondence: Christine Urquhart, Department of Information Studies, Aberystwyth University, Llanbadarn Fawr, Aberystwyth SY23 3AS, UK. E-mail: [email protected]

Introduction

Skills for Care is an English sector skills agency. It supports employers in the care sector to develop the knowledge and skills of nearly 1.56 million workers and to plan using data from the National Minimum Data Set for Social Care (NMDS-SC). Social care in the UK is a mixed economy, with services provided by local authorities, by voluntary and not-for-profit organisations, and by private companies. The NMDS-SC has collected information about organisations providing care services and their employees since early 2006, and now covers more than half the providers of social care in England. As well as working with employers in the care sector, Skills for Care also works closely with government departments, including the Departments for Health; for Business, Innovation and Skills; and for Education. The NHS Information Centre for Health and Social Care, the Care Quality Commission and other bodies with responsibility for safety and quality in social care in England are also important working partners.


Data sets produced by public authorities in the UK are subject to legislation on access to the data. The Protection of Freedoms Act (2012)1 extends the previous Freedom of Information legislation by requiring public data sets to be available in a reusable format. A data set2 is defined as a collection of information held in electronic form where all or most of the information meets four criteria concerning (i) the purpose of data collection, (ii) the factual nature of the data (raw data), (iii) not being associated with official statistics (covered by other legislation) and (iv) the presentation of the data.

The NMDS-SC is intended to help employers with workforce and service planning, including recruitment and retention, by providing benchmarking data. Nationally, the data set provides a picture of the skills, qualifications and composition of the social care workforce in England. The Social Care Workforce Research Unit at King's College London is the principal academic unit using the data set for secondary analysis, producing reports, academic journal articles and the Social Care Workforce Periodical, an online open access journal. Skills for Care also produces further analyses of the data set in regular briefings and reports such as the bi-annual State of the Adult Social Care Workforce.

The evaluation discussed in this article aimed to identify the impact of the NMDS-SC on the policy and research communities in social care and related areas. It complemented another evaluation strand that focused on the impacts of the data set on social care employers. A summary report of the overall evaluation was published by Skills for Care.3

Literature review

The first theme in this brief literature review discusses the use of bibliometric techniques to assess the development of scholarly and non-scholarly communication about a data set and its associated publications. The second, and interlinked, theme is the assessment of research impact, in other words putting research findings into practice. Only the main theoretical principles underpinning the evaluation are described here.

Bibliometrics

Bibliometrics is usually defined as the quantitative analysis of the characteristics of documents published by researchers, but the use of qualitative research in science and technology studies is also increasing.4 The metrics of bibliometrics require attention to the units of analysis,5,6 and normally this might be an author, a research group, a published paper or a report. A data set is a more unusual unit of analysis for bibliometrics in library and information science, but the increasing availability of public data sets on the Web, and their use by researchers for secondary analysis, means that a data set can be a unit of analysis (provided it is cited as such) for bibliometric studies. For example, the Scholarly Database contains several data sets about publications and research awards, and an evaluation7 suggests that more use will be made of data sets in scientometric research, particularly with the development of more associated data visualisation techniques.

Scientometrics usually refers to the policy and research productivity of countries or regions; bibliometrics usually (but not always) refers to the research productivity or characteristics of individuals; and altmetrics, a newer term, usually refers to the diversity of sources that may be counted – not just articles, but blogs and smaller units of publication.8 The growth of webometrics,9 the study of web-based content with quantitative methods, often with similar social science research goals to the bibliometric and scientometric studies, means that the scope of bibliometric research is much wider now, and the range of techniques wider (e.g. link analysis in webometrics or Web 2.0 studies with altmetrics). However, interpretation needs care, as indicated in a web intelligence analysis of the National electronic Library for Health10 that explored how link data might be used with transaction log files to provide more insights. Webometric methods including link analysis have been used to assess academic networks, and one evaluation has examined the impact of digitised scholarly resources.11 Citation analysis is possible with Scopus and Google Scholar as well as Web of Science, and comparisons indicate that Google Scholar may pick up literature that is not well covered by WoS for the social sciences.12–14


Research impact

According to a typology developed by Sandra Nutley's team at the Research Unit for Research Utilisation,15 research has conceptual and instrumental uses. Instrumental use refers to the direct impact of research on policy and practice decisions. It identifies the influence of a specific piece of research in making a specific decision or defining the solution to a specific problem. Conceptual use is fuzzier, comprising the complex and often indirect ways in which research can have an impact on the knowledge, understanding and attitudes of policymakers. Such uses of research may be less demonstrable but are no less important than more instrumental forms of use.

Landry and colleagues16 defined a 'ladder' of research use, based on their assessment of the utilisation of social science research knowledge in Canada. The first rung of the ladder is 'transmission', where the researchers have transmitted key findings to relevant policymakers and practitioners. Second comes 'cognition', where the research findings have been read and understood by their recipients. The third stage is 'reference', where the findings have been cited in reports. The fourth stage comprises 'effort and influence', where efforts have been made to ensure the findings influence decisions. The 'top of the ladder' is 'application', whereby the findings led to applications and extension within the policy or practice communities. This ladder is a useful schematic, but it assumes a staged and linear approach to levels of research use. There is evidence that the real-life process is much more muddied, and the trail of conceptual use (as defined above) is not easy to follow.17 We also need to remember stakeholders' interests in making research 'reports' appear important factual evidence, although they may really be 'a little hot, a little cold, a little old, a bit of this and a bit of that' (translated from the original).18
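As a brief illustration of the ordering the ladder described above implies, the sketch below encodes the five rungs as an ordered enumeration with a hypothetical helper for recording the highest rung evidenced for an item of research. Only the rung names come from Landry and colleagues; the helper and example evidence are assumptions for illustration.

```python
from enum import IntEnum

# The Landry et al. 'ladder' of research use as an ordered enumeration.
# Only the rung names are from the framework; everything else is a
# hypothetical illustration.

class LadderRung(IntEnum):
    TRANSMISSION = 1  # findings transmitted to policymakers/practitioners
    COGNITION = 2     # findings read and understood by recipients
    REFERENCE = 3     # findings cited in reports
    EFFORT = 4        # efforts made to ensure findings influence decisions
    APPLICATION = 5   # findings applied/extended in policy or practice

def highest_rung(observed):
    """Return the highest rung for which evidence was observed.

    Note the framework's staged, linear assumption: real research use
    is messier, so this is a simplification by construction.
    """
    return max(observed) if observed else None

evidence = {LadderRung.TRANSMISSION, LadderRung.REFERENCE}  # hypothetical
print(highest_rung(evidence).name)  # REFERENCE
```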

The value of a new data set may be uncertain. Relations may develop among those who contribute to the data set, those who sponsor it and those who use it for research. As a tool (good or bad), the data set is evolving, and its use may change interpretations and usage of other similar data sets. Citation analysis assumes, basically, that references to data or articles in published items indicate use of the data or article: 'Metaphorically speaking, citations are frozen footprints in the landscape of scholarly achievement'.19 The extent and purpose of citation may vary, and various typologies of citations have been proposed.20 With the growth of the Web, the scope of citation analysis can be broader, and capable of tracking some research use at Landry's transmission stage, as altmetrics proponents suggest. While bibliometrics is usually carried out purely as quantitative analysis, one study of digital repository management used link analysis, interviews with repository managers and an online survey of repository users to evaluate five different UK public repositories, although the interviews did not contribute to the bibliometric analysis directly.21

For the 2014 Research Excellence Framework (REF) administered by the Higher Education Funding Council for England (HEFCE),22 the panel responsible for social work and social policy assessment lists types of impact relevant to social work and social policy research. The list includes: influence on professional standards and guidelines; influence on planning or management of services; challenge to conventional wisdom among stakeholders; and improved public understanding of social issues.

Objectives and scope

The aim of the evaluation was to identify and quantify the observable impacts of NMDS-SC on research and policy. An additional output (not reported in this article) was a set of recommendations for Skills for Care on enhancing the impact of NMDS-SC. In this assessment, we took 'research settings' to cover both formal research from academic bodies and non-academic research undertaken by knowledge organisations, think tanks, trade organisations and the mainstream media. We defined 'policy settings' as primarily national (for England), and we were interested in how NMDS-SC interacted with policy thinking and influenced policy directions and choices. The research did not cover in detail the impact of the data set on local decision-making – including local policy and commissioning decisions – undertaken by individual employer organisations, although some more local processes were noted in passing.


Methods

The research approach had four strands, intended to complement each other and to provide some triangulation of data. It incorporated:
• Web metrics – a quantitative assessment of NMDS-SC-related traffic on the Skills for Care and NMDS-SC websites
• Bibliometrics – a primarily quantitative assessment of how NMDS-SC and associated publications manifest in scholarly and professional publications, and in grey literature (an approach with similarities to contextual citation analysis23,24)
• Interviews – telephone interviews based on a semi-structured discussion guide with 12 key users of the data set in the policy and research fields
• Short online survey – 24 selected individuals gave their views on whether specific types of policy impacts could be ascribed to NMDS-SC.

The research was conducted between March and May 2012.

The web metrics analysis focused on data provided by Skills for Care showing levels of use of various HTML pages and PDF document downloads directly related to NMDS-SC, either from the Skills for Care website or from the NMDS-SC website. The data were generated by Google Analytics and show trends in total number of visits, unique visitors, new and returning visitors, and some data on referral sources. The drawbacks of Google Analytics include the potential for inaccurate data due to website users blocking cookies, leading to underestimations of traffic volumes, and variations in the ways site sessions are recorded, leading to overestimations of total numbers of visits. Unique visitors are generally a more robust metric when using Google Analytics.
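To illustrate the kind of comparison involved, a minimal sketch follows showing how monthly visits and unique visitors might be contrasted. It is not the actual Skills for Care analysis: the file name and column headings are assumptions for a hypothetical page-level CSV export of analytics data.

```python
import csv
from collections import defaultdict

# Minimal sketch: read a hypothetical CSV export of page-level analytics
# with assumed columns month, page, visits, unique_visitors, and compare
# the two monthly totals.

def monthly_trends(path):
    visits, uniques = defaultdict(int), defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            visits[row["month"]] += int(row["visits"])
            uniques[row["month"]] += int(row["unique_visitors"])
    return visits, uniques

if __name__ == "__main__":
    visits, uniques = monthly_trends("nmds_sc_pages.csv")  # hypothetical file
    for month in sorted(visits):
        # Session counting can overestimate visits and cookie blocking can
        # underestimate visitors, so both figures are reported; note that
        # summing page-level uniques also overstates site-level uniques.
        print(f"{month}: visits={visits[month]}, unique_visitors={uniques[month]}")
```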

For the bibliometrics, we used several approaches to searching published outputs in this study. First, Thomson Reuters Web of Science was used to assess the patterns of citation of the National Minimum Data Set for Social Care and the main compilations based on NMDS-SC – the State of the Adult Social Care Workforce reports and the NMDS-SC Briefings series. The Thomson Reuters Journal Citation Reports provided information on the Impact Factors of the academic journals in which the citing articles appear. The drawback of WoS for the UK social science literature is the dominance of American journals, but nevertheless, this is a standard approach.25

For the second approach, Google Scholar provided entry points into both the academic literature and the grey literature in the form of reports, important for social policy research publication.26 Google Scholar search results records include 'cited by' data for some items. Google Scholar is constructed in a different way from WoS, and the range of material included is much greater. Other databases were also used to identify any appearance of NMDS-SC in the legal, business, health service, social services and local authority literature. These included NHS Evidence, LG Search, HeinLaw Online, EBSCO Business Source Complete, Nexis and Emerald Journals.

We then conducted an extensive targeted 'grey literature' search, by manually searching and, if necessary, browsing the websites of organisations considered likely to be publishing materials drawing on NMDS-SC. These organisations were identified from existing knowledge of the sector, by the client and ourselves, and from the initial findings of the bibliometric survey. A total of 24 organisational websites were manually searched and browsed, including UK government departments, sectoral bodies, knowledge intermediary organisations such as independent research organisations (e.g. Social Care Institute for Excellence, Centre for Workforce Intelligence), campaigning organisations, think tanks, trade/employer organisations and the professional and mainstream press. We also conducted a limited search of social media, using social media aggregator sites.
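Because items retrieved from Web of Science, Google Scholar and the other sources can overlap, the result lists needed checking for duplicates (a point returned to in the Discussion). The sketch below shows one way records might be merged and deduplicated by a normalised title key; the record layout is an assumption for illustration, not the format of real WoS or Google Scholar exports.

```python
import re

# Minimal sketch: merge citation records from several sources and drop
# duplicates using a normalised (title, year) key. Record layout is
# assumed for illustration.

def normalise(title):
    # Lower-case, strip punctuation and collapse whitespace so that
    # small formatting differences do not hide duplicates.
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", title.lower())).strip()

def merge(*source_lists):
    seen = {}
    for records in source_lists:
        for rec in records:
            key = (normalise(rec["title"]), rec.get("year"))
            # Keep the first occurrence of each key.
            seen.setdefault(key, rec)
    return list(seen.values())

wos = [{"title": "The expected working life of a social worker", "year": 2010}]
scholar = [{"title": "The Expected Working Life of a Social Worker.", "year": 2010}]
print(merge(wos, scholar))  # one record after deduplication
```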


The NMDS-SC data set has been established for sufficient time for all five rungs of the Landry ladder to be identified, and the interviews were intended to clarify some of the problems that might occur around assessing how conceptual or instrumental the use of the data set was, something that might be hard to identify from citation analysis on its own. Phone interviews used a semi-structured interview schedule to elicit why and how the data set was used, any complementary types of data used, the perceived impacts of the data set, the alternatives (if any) if the data set had not been available, and possible improvements or enhancements. The 12 interviewees were identified by Skills for Care as key stakeholders and users of the data set. They were a mix of researchers, policymakers and knowledge intermediaries.

To further investigate the policy impacts of the data set in particular, a short online survey was developed to focus on this specific area. The survey considered eight types of impact that research can have on policy, as identified in the research utilisation literature and also taking account of definitions of research impact from the UK HEFCE Research Excellence Framework 2012. A total of 63 individuals who had been identified as key stakeholders or relevant experts were asked to complete the survey. Of these 63 invitees, 24 responded, a response rate of 38%.

Results

Web metrics on the NMDS-SC website showed a steady increase in traffic from 2008 to 2011 on those pages offering research outputs from the data set. Traffic on these pages continued to increase when traffic to other pages on the site had reached a plateau. The principal compendium report produced by Skills for Care based on the data set, the bi-annual State of the Adult Social Care Workforce, showed a large increase in downloads – 230% between 2008 and 2010. (Some of this increase, however, may be attributable to an increase in promotion of all digital products by the organisation during the same period.)

The bibliometric research identified a total of 175 documents that made mention of NMDS-SC during the period January 2006–March 2012 (Fig. 1). Figure 1 illustrates that documents from academic sources were the most numerous, making up about one-third of the total (54 of 175). They included articles in journals (both peer-reviewed and non-peer-reviewed), as well as reports from academic institutions. Next most numerous (36) were government documents, which include documents from central government and local government, including policy, strategy and practice guidance. The 30 documents described as 'press' included both the professional press, such as Community Care, and the mainstream press, such as the Guardian, New Statesman and BBC. 'Knowledge intermediary' documents (21) included reports from sectoral bodies such as SCIE and CQC (and its predecessor, the Commission for Social Care Inspection (CSCI)), as well as research organisations like Joseph Rowntree, many of which were the subject of the targeted grey literature searches. Skills for Care's own output (16) was identified separately. Campaigning documents (seven) included those from Unison and Age Concern, and the six think tank documents include those from the King's Fund and IPPR (Institute for Public Policy Research). Finally, the five 'care trade' documents were from ADASS, some of which are joint publications with Skills for Care.

[Figure 1. Documents retrieved (total 175), period 2006–2012, by organisation type: academic 54; government 36; press 30; knowledge intermediary 21; Skills for Care 16; campaigning 7; think tank 6; care trade 5.]




As well as organisational provenance, we assigned each item we discovered a document type:
• Twenty-seven of 175 items (15%) were classified as Articles in academic journals – intended for other academic researchers; these included journals with impact factors (from Web of Science) and without
• Eighty-five of 175 items – almost half – were classified as Reports – intended for funders, academic researchers, policymakers and practitioners; this included government reports and Skills for Care's own outputs
• Sixty-three of 175 items (36%) were classified as Communications – including content on web pages and in the professional and mainstream press – intended for social care practitioners or the general public.

The quantity of document types retrieved was dependent on how easy they were to find in either databases or search engines. Ensuring discoverability requires some knowledge of metadata and general web informatics, not necessarily something smaller organisations such as trade or professional bodies would have access to.

For each document retrieved, we made an estimate of how central the use of NMDS-SC data was to the content of the document:
Level 1: Listing – news announcement, catalogue entry or other listing concerning an NMDS-SC service or output
Level 2: Mention-background – NMDS-SC data or service is part of the background in the document
Level 3: Mention-central – NMDS-SC data or service is evaluated or discussed in the document
Level 4: Citation-background – NMDS-SC data are formally referenced but use of the data is part of the background to the document
Level 5: Citation-central – NMDS-SC data are formally referenced and use of the data is central to the document.

The largest groups were 'Level 4' and 'Level 5' documents, that is, those in which NMDS-SC data were formally referenced, either as background or as a central data source (Fig. 2). The use of NMDS-SC in these documents was usually in the form of The State of the Adult Social Care Workforce (SOASC 2008 or 2010) or the NMDS-SC Briefings, rather than references to raw data from the data set itself. One interviewee mentioned that they were unsure how to reference the data set and so preferred to reference the SOASC reports instead.
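The five-level scheme is simple enough to encode directly. The sketch below shows how documents coded in this way might be tallied into a distribution of the kind reported in Fig. 2; the example documents are hypothetical, and only the level definitions come from the study.

```python
from enum import IntEnum
from collections import Counter

# The five-level 'centrality of use' scheme from the evaluation, encoded
# as an ordered enumeration. The example documents below are invented;
# only the level definitions are from the study.

class UseLevel(IntEnum):
    LISTING = 1              # news announcement or catalogue entry
    MENTION_BACKGROUND = 2   # data/service part of the background
    MENTION_CENTRAL = 3      # data/service discussed, not formally referenced
    CITATION_BACKGROUND = 4  # formally referenced, background use
    CITATION_CENTRAL = 5     # formally referenced, central use

documents = [  # hypothetical coded documents
    ("government workforce strategy", UseLevel.MENTION_CENTRAL),
    ("journal article on retention", UseLevel.CITATION_CENTRAL),
    ("press item on care pay", UseLevel.MENTION_BACKGROUND),
]

tally = Counter(level for _, level in documents)
total = sum(tally.values())
for level in UseLevel:
    share = 100 * tally.get(level, 0) / total
    print(f"Level {level.value} ({level.name}): {share:.0f}%")
```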

[Figure 2. Levels of NMDS-SC use in the 175 documents retrieved: Level 1, 3%; Level 2, 11%; Level 3, 20%; Level 4, 33%; Level 5, 33%.]

The variation of citation formats we found for the data set suggested this could be a common problem. Figure 3 illustrates that almost half of the academic uses of the data set were at level 5; in other words, the data set was central to the content of the article or report. Over time, the academic references increased, although at first they formed a small proportion of the uses of the data set. This reflects the length of time required to get articles peer reviewed and published.

It is interesting to note that the government documents tended to use the data set at level 3; in other words, the data set was central to the document but not formally referenced. This illustrates clearly the challenge posed in the evaluation of use of the data set: most of these items would not have been discovered through a standard bibliometric citation search, yet they are central to the picture of NMDS-SC impact. We can also see that the press items contained the most level 2 mentions, where the data set is (unsurprisingly) not formally referenced and forms part of the background to the items.

Level 4 items were the most varied – government documents, reports by knowledge intermediary/sectoral bodies and journal articles. A wide range of non-academic organisations provided level 4 citations in their reports, and sometimes the report was the result of collaboration between various organisations. Several of the Skills for Care reports also featured work with local authorities or with a research consultancy. Separating academic research from policy research or practice-based evidence in the field of social care, as with other parts of public policy, is extremely difficult, as research is partly driven by the needs of funders, and funders are most often interested in research that has a practical or policy-driven outcome.


[Figure 3. Document types (organisational sources) retrieved by levels of NMDS-SC use.]

It was possible to produce Web of Science citation maps for some of the articles. These maps show all the previous works the authors make reference to when writing an article, and then show the subsequent works that go on to cite that article once published (downstream citations). For one example (Curtis, L., Moriarty, J. & Netten, A. (2010) 'The expected working life of a social worker'. British Journal of Social Work, 40, 1628–1643), there were five Web of Science downstream citations (at April 2012). Google Scholar had 13 downstream citations: the five shown in Web of Science, plus several more journal article citations, a doctoral thesis citation and a slide pack presentation. The greatest number of citations for a single item was for a report on migrant care workers: 50 Google Scholar downstream citations. The contribution by NMDS-SC to the policy debate around migrant workers was identified by many interviewees as an important example of its impact.
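Downstream citation tracking of this kind amounts to following incoming edges in a citation graph. The sketch below, using the networkx library and invented identifiers, shows the idea; it is not the Web of Science citation map itself.

```python
import networkx as nx

# Minimal sketch of downstream citation counting on a directed graph,
# where an edge A -> B means 'A cites B'. Identifiers are invented.

g = nx.DiGraph()
g.add_edges_from([
    ("citing_article_1", "curtis_2010"),
    ("citing_article_2", "curtis_2010"),
    ("doctoral_thesis", "curtis_2010"),
    ("citing_article_1", "some_other_work"),
])

# Downstream citations of an item = works that cite it = its in-edges.
downstream = list(g.predecessors("curtis_2010"))
print(len(downstream), downstream)  # 3 ['citing_article_1', ...]
```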

Tracking mentions of NMDS-SC in the press was difficult – indexing was variable in consistency and depth. Visibility in the social media was low, but this was not surprising, as the data set appeared to be used by a relatively small group of researchers.

Twenty-one different kinds of use of the data set were identified by the interviewees and survey respondents, with the purpose of the usage falling broadly into five groups: research; policy development; service management and planning; inspection or quality assurance; or communications. The availability of the data set had made research on the social care workforce more robust, and the data set complemented other data sets available. A majority of the respondents to the survey felt that NMDS-SC had 'alerted policymakers to a particular issue in adult social care on several occasions' (19 of 24) and 'had identified priorities for policy action (which may include further research) on several occasions' (18 of 24). Interviewees and survey respondents pointed to the increased awareness for operational planning, and several provided examples where the data set had challenged conventional wisdom (e.g. on recruitment and retention). While all the researchers said that sources would always be publicly acknowledged, policymaker interviewees suggested that a number of types of use of the data set in policy work would not be publicly recorded, including provision of internal briefings; informing internal policy discussions with no public outputs; and informing policy summary documents that did not have a formal reference style.

Discussion

No other recent example could be found in the literature of this mixed-method approach to determining the research and policy impact of a data set, although there is some related research.


Considerable effort has been made on visualisation techniques (e.g. citation maps in Web of Science), and Google Scholar, for example, offers researchers facilities for calculating their comparative impact and research productivity through their h-index. A study by the Cybermetrics Research Group at the University of Wolverhampton (in conjunction with Loughborough University) used interviews and link analysis to evaluate the development of some digital repositories,21 but the function of the interviews was only to obtain the views of the repository managers on the results of the link analysis. Comparative link analysis is very helpful in assessing the spread of awareness of a resource and the type of users, but trying to distinguish between conceptual and instrumental impact would be very difficult: both are important, and interviews help to uncover some of the impact trails.15,17

The various methods for the study thus complemented each other well. The web metrics indications were confirmed by the bibliometric analysis. The 'levels of mention' schematic worked, although perhaps numbering the levels in the way we did values academic research uses over policy uses, which we noted were more likely to be at level 3. In terms of impact on policy and practice, there may be little difference between level 2 and level 4, apart from the norms of citation practice. It was important to distinguish between level 1 and level 2, as there was a clear gradation in expected impact between them. Similarly for level 4 and level 5: a mention in the background literature review (level 4) provides much less evidence of impact than level 5 use of the data set within the methods, analysis or discussion. Level 3 was easy to label (once found), but interviews with the people responsible for writing the document are necessary to assess the impact of the data set, and even then, we have to realise that a research report or a policy document is, to some extent, a sanitised account of discussion, debate, checking, verification and finding a middle way – a way of doing and presenting social science.18 The difficulty we had in identifying citations to the data set, and the variations found, even from the same author, in itself indicated that authors are sometimes unsure how to acknowledge the contribution of the data set and cite it.

It is therefore no surprise that the trail may run cold in policy documents that are not accompanied by many references to the evidence. Figure 3, document types (organisational sources) by level of mention, demonstrates that it is important to distinguish between the purpose of use (as indicated by the source of the publication) and the level of mention. They are not synonymous; it is not simply a distinction between formal and informal publication, and that is why it is important to use several measures to help to understand conceptual and instrumental use.

Google Scholar was invaluable in assessing the impact of the data set in the grey literature, such as government and other agency reports, and in the non-WoS journals, as expected.25,26 Generally, the number of Google Scholar citations exceeded the number of Web of Science citations, as other studies in the social sciences indicated they would,11–13 although our interest was less in the number of citations than in where they occurred. The lists should be checked to avoid any duplication. The main area of uncertainty in identifying the mentions was in the press and media reports, which included blogs by journalists working for the popular Community Care. Altmetrics8 is developing in this area.

Conclusion

The mixed-method approach to assessing the impact of the NMDS-SC on policy, practice and research helped to explain conceptual and instrumental uses of the data set. The interviews and online survey complemented the bibliometric analysis and the web metrics analysis. Identifying the main users at a national level was relatively easy, as the names were mostly already known to Skills for Care, but for an impact evaluation that studied an international audience, a different approach should be used to identify data set users, as their perspectives would be different. The Landry framework worked well, but it is important to identify the source (and purpose) of a mention as well, to avoid conflating impact with implied use from the citation format. Altmetrics is a developing area. Librarians involved in supporting research impact assessments should use a framework that suits the discipline being assessed.


For the social sciences, Google Scholar seems a necessary tool in bibliometric analysis, to complement web metrics and Web of Science bibliometric analysis.

Acknowledgements

The authors thank Skills for Care for their assistance and the reviewers for constructive comments.

References

1 Great Britain. Protection of Freedoms Act. Accessible at: http://services.parliament.uk/bills/2010-12/protectionoffreedoms.html
2 Hasan, I. Freedom of Information and datasets. June 16, 2011. Accessible at: http://www.lawgazette.co.uk/in-practice/freedom-information-and-datasets
3 Skills for Care. An Evaluation of the National Minimum Dataset for Adult Social Care. Leeds: Skills for Care, 2012. www.skillsforcare.org.uk
4 Larivière, V. The decade of metrics? Examining the evolution of metrics within and outside LIS. Bulletin of the American Society for Information Science and Technology, 2012, 38, 12–17.
5 Borgman, C. L. & Furner, J. Scholarly communication and bibliometrics. Annual Review of Information Science and Technology, 2002, 36, 2–72.
6 Vinkler, P. The Evaluation of Research by Scientometric Indicators. Oxford: Chandos Publishing, 2010.
7 LaRowe, G., Ambre, S., Burgoon, J., Ke, W. & Börner, K. The Scholarly Database and its utility for scientometrics research. Scientometrics, 2009, 79, 219–234.
8 Priem, J. Altmetrics: a manifesto. Accessible at: http://www.altmetrics.org/manifesto
9 Thelwall, M. A history of webometrics. Bulletin of the American Society for Information Science and Technology, 2012, 38, 18–23.
10 Zuccala, A., Thelwall, M., Oppenheim, C. & Dhiensa, R. Web intelligence analysis of digital libraries. Journal of Documentation, 2007, 63, 558–589.
11 Eccles, K. E., Thelwall, M. & Meyer, E. T. Measuring the web impact of digitised scholarly resources. Journal of Documentation, 2012, 68, 512–526.
12 Mingers, J. & Lipitakis, E. A. E. C. G. Counting the citations: a comparison of Web of Science and Google Scholar in the field of business and management. Scientometrics, 2010, 85, 613–625.

13 Kousha, K. & Thelwall, M. Google Scholar citations and Google Web/URL citations: a multi-discipline exploratory analysis. Journal of the American Society for Information Science and Technology, 2007, 58, 1055–1065.
14 Levine-Clark, M. & Gil, E. A comparative analysis of social sciences citation tools. Online Information Review, 2009, 33, 986–996.
15 Nutley, S. M., Walter, I. & Davies, H. T. O. Using Evidence: How Research Can Inform Public Services. London: Policy Press, 2007: 36.
16 Landry, R., Amara, N. & Lamari, M. Utilization of social science research knowledge in Canada. Research Policy, 2001, 30, 333–349.
17 Davies, H., Nutley, S. & Walter, I. Assessing the impact of social science research: conceptual, methodological and practical issues. A background discussion paper for the ESRC Symposium on Assessing Non-Academic Impact of Research, May 2005. Accessible at: http://www.odi.org.uk/rapid/Events/ESRC/docs/background_paper.pdf
18 Latour, B. Le métier de chercheur: regard d'un anthropologue, 2nd edn. Versailles: Quae, 2010 (original 1994). www.quae.com (p. 167 of e-version).
19 Cronin, B. The need for a theory of citing. Journal of Documentation, 1981, 37, 16–24.
20 Swales, J. Citation analysis and discourse analysis. Applied Linguistics, 1986, 7, 39–56.
21 Zuccala, A., Thelwall, M., Oppenheim, C. & Dhiensa, R. Digital Repository Management Practices, User Needs and Potential Users: An Integrated Analysis. Bristol: JISC, 2006. Accessible at: http://repository.jisc.ac.uk/139/1/FinalReport%5b1%5d.pdf
22 Higher Education Funding Council for England (HEFCE). Panel Criteria and Working Methods, REF 2014 (Table C1, pp. 69–70). Bristol: HEFCE, 2012.
23 Spasser, M. The enacted fate of undiscovered public knowledge. Journal of the American Society for Information Science and Technology, 1997, 48, 707–717.
24 Kousha, K. & Thelwall, M. The role of online videos in research communication: a content analysis of YouTube videos cited in academic publications. Journal of the American Society for Information Science and Technology, 2012, 63, 1710–1727.
25 Van Leeuwen, T. The application of bibliometric analyses in the evaluation of social science research: who benefits from it and why it is still feasible. Scientometrics, 2006, 66, 133–154.
26 Nederhof, A. J. Bibliometric monitoring of research performance in the social sciences and humanities: a review. Scientometrics, 2006, 66, 81–100.

Received 12 December 2012; Accepted 26 June 2013

