DOI: 10.1111/hir.12054

A checklist to assess database-hosting platforms for designing and running searches for systematic reviews

Alison Bethel & Morwenna Rogers
PenCLAHRC, University of Exeter Medical School, Exeter, UK

Abstract

Background: Systematic reviews require literature searches that are precise, sensitive and often complex. Database-hosting platforms need to facilitate this type of searching in order to minimise errors and the risk of bias in the results.

Objectives: The main objective of the study was to create a generic checklist of criteria to assess the ability of host platforms to cope with complex searching, for example, for systematic reviews, and to test the checklist against three host platforms (EBSCOhost, OvidSP and ProQuest).

Method: The checklist was developed as usual review work was carried out and through discussion between the two authors. Attributes on the checklist were designated as 'desirable' or 'essential'. The authors tested the checklist independently against three host platforms and graded their performance from 1 (insufficient) to 3 (performs well).

Results: Fifty-five desirable or essential attributes were identified for the checklist. None of the platforms performed well for all of the attributes on the checklist.

Conclusions: Not all database-hosting platforms are designed for complex searching. Librarians and other decision-makers who work in health research settings need to be aware of the different limitations of host platforms for complex searching when they are making purchasing decisions or training others.

Keywords: bibliographic databases; database searching; information retrieval; literature searching; searching; review, systematic

Key Messages

• Librarians who make purchasing decisions should consider subscribing to database-hosting platforms that allow for complex searching, if available.
• The checklist developed by the authors could be used to assess the suitability of platforms for designing and running complex searches for systematic reviews.
• Database host companies should consider the complex search needs of systematic reviewers when designing or updating their platforms.
• Database owners should be aware that host platforms might not cope well with highly evolved search strategies, such as those for systematic reviews, before they provide a particular vendor with the sole access rights to their database.
• Library and information service professionals should be responsible for determining what tools we use for searching, and for speaking to suppliers and budget holders if these tools are inadequate.

Correspondence: Morwenna Rogers, Information Specialist, PenCLAHRC, University of Exeter Medical School, Veysey Building, Salmon Pool Lane, Exeter EX2 4SF, UK. E-mail: morwenna.rogers@exeter.ac.uk

Introduction

Information specialists within systematic review teams perform a number of key roles including scoping searches, designing search strategies, advising on resources, translating and running

© 2014 The authors. Health Information and Libraries Journal © 2014 Health Libraries Group Health Information & Libraries Journal, 31, pp. 43–53


searches across different databases and downloading results for the reviewers. Invariably, the information specialist is responsible for ensuring that all relevant data are retrieved: failure to do so could result in a biased review. There are many reasons why data may be missed, for example, poor indexing of references on databases, failure to search across multiple resources or an insufficient search strategy.1–3 However, the ability of database-hosting platforms to facilitate this type of search technique is seldom cited as a reason for missing data.

This issue is important because some host platforms have gained sole rights to key health databases. For example, the British Nursing Index is currently only commercially available through ProQuest, and CINAHL (Cumulative Index to Nursing and Allied Health) is only commercially available via EBSCOhost. These databases form an important source of health-related literature, and they need to be searched comprehensively and exhaustively for systematic reviews, particularly if the research area is in nursing. With only one platform providing access to the literature on these databases, it is vital to ensure that each platform can cope with the complex level of searching needed for systematic reviews.

Search strategies for systematic reviews should be highly sensitive to capture as much relevant information as possible.4 In addition, a search strategy should be as transparent as possible and documented in a way that enables it to be evaluated and reproduced.5 Evidence indicates that comprehensive search strategies are required for systematic reviews, as basic or intermediate searches could miss key papers.1,6 Consequently, search strategies for systematic reviews are often complex and can be hundreds of lines long, with many combinations of Boolean logic, wildcards and adjacency/proximity instructions.
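As an illustration (this fragment is our own constructed example in Ovid command-line syntax; the terms, line numbers and combining shortcuts are ours and are not taken from any published strategy), a few lines of such a strategy might look like:

```text
1. exp Attention Deficit Disorder with Hyperactivity/
2. (adhd or hyperkinet* or "attention deficit*").ti,ab.
3. or/1-2
4. ((school* or classroom*) adj3 (setting* or environment*)).ti,ab.
5. 3 and 4
```

Line 1 explodes a controlled-vocabulary (MeSH) heading; line 2 combines truncated free-text terms restricted to the title and abstract fields; line 3 uses a line-combining shortcut; line 4 nests parentheses with an adjacency operator. A full review strategy repeats this pattern across many concepts, which is why checklist features such as line-by-line editing and saved search histories matter.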
The challenge faced by information specialists in producing a search strategy that is balanced between sensitivity and precision is understood.4,5 However, the difficulties faced in translating or running a complex search across different platforms are not well documented. Previous studies focus either on the performance of individual databases on different platforms or on how alterations in search strategies translated

across different platforms affect the results.6–10 For example, Kimball7 produced an analysis of three platforms (OvidSP, EBSCO and Engineering Village) for searching one database (GeoRef) but did not examine the specific requirements for complex searching or produce a usable checklist.

In addition, several studies have examined the results of individual searches carried out on the same database but on different platforms.6,8–10 Younger and Boddy8 assessed the performance of the database AMED across three platforms (DIALOG DataStar, OVID and EBSCOhost) and found that the number of hits can vary considerably even with basic searches. Sewell9 and Casler10 compared the performance and features of various host platforms for searching CAB Abstracts and AGRICOLA, respectively. Sewell evaluated CAB Direct, EBSCOhost, ISI and OvidSP and found no statistically significant differences in precision or recall; the study concluded that the user population and cost, as well as performance, should be considerations in the purchasing decision-making process. Casler produced a table of features of five host platforms available at the time, but no overall evaluation of comparative data. Bandyopadhyay6 evaluated the information retrieval of Biological Abstracts on two platforms (SilverPlatter and EBSCOhost) using both novice and complex queries on each one. This study found that more complex searching generated the best results and concluded that database providers were developing user-friendly interfaces without simplifying the underlying search mechanisms, such that unskilled searching would not find the desired information.

However, to date, there have been no published studies detailing the development and validation of a checklist for assessing the performance of platforms. The aim of this study, therefore, was to produce a valid checklist for assessing the abilities of database host platforms for carrying out complex searches such as those used in a systematic review.
Furthermore, this study aimed to assess the performance, using the checklist, of three host platforms for this type of work. It was anticipated that the development of such a checklist could build on previous research highlighting differences between sets of results from databases when run across different platforms. A checklist might also help with


the purchasing decision-making processes carried out in medical libraries and research institutions dependent on these platforms for carrying out complex searches.

Method

Producing the checklist

We searched LISA, LISTA, Medline, EMBASE, Web of Knowledge and ERIC for studies assessing the searching capacity of host platforms. As we found no studies assessing the performance of database platforms using a checklist, we developed our own checklist based on independent suggestions and follow-up discussions between the two authors. Both authors developed their own performance criteria, and these were then compared and merged into one checklist. Features commonly used for basic searching (such as simple search functions, reference lists, automatic search filters and links to full text) were discounted from the list, as these were either not necessary for a systematic search or rarely used for complex searching.

Each criterion was classified as essential (E) or desirable (D) by mutual agreement between the authors, and the criteria were categorised according to type. An individual criterion was considered essential if its absence would render a complex search impossible, or extremely difficult and time-consuming, or would severely affect the results of the search. Results of a search were severely affected if records could not be saved or downloaded, if it was not clear how many records had been retrieved (e.g. if there was automatic deduplication) or if numbers were not consistent between runs of identical searches on the same database. Desirable attributes were those which generally made the search process easier or more time-efficient but would not significantly affect the overall performance of the search. The two authors independently rated each criterion, and discrepancies were resolved by discussion (see Table 1).
Assessing the performance of the platforms

Complex searches (more than 30 lines long and featuring extensive use of Boolean logic, proximity terms, combinations of MeSH and free text and multiple field headings) were carried out by the authors on databases via three platforms: EBSCOhost (CINAHL), OvidSP (Medline) and ProQuest (ASSIA). These three platforms were chosen because they were all available via Exeter University Medical School Library and were commonly used by the authors for searching.

Testing the platforms

As stated, the three host platforms/databases tested for this study were as follows:

Platform 1: OvidSP/Medline
Platform 2: ProQuest/ASSIA
Platform 3: EBSCOhost/CINAHL

Three more columns were added to the checklist for the testing process, covering the availability of the particular function, the grade given by the information specialist (IS) and any explanatory notes (see Appendix 1). Review topics were selected from the projects the authors were currently working on, which meant that the host platforms were tested under the conditions of normal use. The subject areas selected were 'child-reported health outcome measures' and 'ADHD in school settings'. These were broad enough to necessitate the use of various databases located on different platforms, and the searches were designed as part of a real systematic review on these topics. Both strategies were over 50 lines long on each of the three databases and used combinations of controlled vocabulary and free text, field codes, Boolean terms and proximity syntax. The search strategies used are available from the authors on request.

The checklist was used to rate the performance of the individual platforms against the criteria. Scores from 1 to 3 were assigned on the basis of the performance of each platform, as follows:

1 - Did not perform the function, or the function was so difficult to find or use that it was deemed ineffective
2 - Performed the function, but it was not intuitive or the terminology was confusing
3 - Performed the function well

The tests were carried out in May 2012 on days with no known adverse circumstances such as server downtime or routine upgrades. After being reviewed by the two authors, two sets
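One reason platform behaviour has to be tested rather than assumed is that the same concept must be rewritten in each platform's syntax when a strategy is translated. As a purely illustrative sketch (the operator spellings are standard for each platform, but defaults and field-code behaviour vary by database and local configuration), a single proximity line from the child-health topic might be expressed as:

```text
OvidSP:    (child* adj3 (outcome* or measure*)).ti,ab.
EBSCOhost: TI (child* N3 (outcome* or measure*)) OR AB (child* N3 (outcome* or measure*))
ProQuest:  ti,ab(child* NEAR/3 (outcome* or measure*))
```

Each line asks for the truncated terms within three words of each other in the title or abstract; a platform that mishandles nested parentheses or proximity operators (criteria 2c–2d and 2j–2n on the checklist) will silently change what such a line retrieves.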


Table 1 List of criteria

Searching (functions)
  Essential: Command line searches
Searching (syntax)
  Essential: Boolean terms; Phrase searching; Adjacency terms; Proximity terms; Right truncation; Parenthesis; Combining parentheses within strings with Boolean; Combining parentheses within strings with adjacency; Combining parentheses within strings with proximity; Combining parentheses with single field codes; Combining parentheses with multiple field codes
  Desirable: Single character truncation; Left truncation; Masking within a word; Short cut to combining strings with AND/OR (e.g. OR/1-10)
Field codes
  Essential: Available to use; Ability to combine
  Desirable: Easily accessible
Controlled vocabulary
  Essential: Subject headings e.g. MeSH; Thesaurus available (displayed in hierarchy); Ability to explode headings; Scope note available; Ability to combine controlled vocabulary terms with free text; Ability to choose a narrower term
  Desirable: Ability to choose multiple terms from the thesaurus
Display (Search)
  Essential: Option to view search history while using search screen; Build up searches line-by-line with the number of hits visible for each string; Combine searches (with Boolean)
  Desirable: Ability to edit previous lines of search as it develops; Ability to insert new lines of search into existing search; Ability to move search lines around within search; Renumber searches after deletion; Refine search by update code
Display (Records)
  Essential: Option to change the number of hits viewed per page; Option to view search history on record display screen
  Desirable: Option to choose fields to display; Can move onto next record when in full record display; Search term highlighted
Downloading
  Essential: Select all results from complete set of records rather than page by page; A wide choice of export/download options
  Desirable: Able to download large numbers of records (500+) in one go
Search History
  Essential: Can save search history; Re-run saved searches
  Desirable: Can share saved searches; Export search history; Edit saved searches
Performance
  Essential: Can handle long and complex searches, >50 lines long; Can handle large numbers of records >1000; Is compatible with major reference management systems: EndNote and Reference Manager; Compatible with major web browsers: IE, Firefox and Google Chrome
Other
  Essential: Help facility is easy to locate and informative; Results are consistent; Turn off any deduplication


of results were combined, and an average score was produced.

Results

Producing the checklist

The authors identified 10 basic features of platforms representing the main functions required to carry out and process a search. These features incorporated search syntax, display, processing of results and overall performance. Within these categories, 56 individual criteria were identified as being either essential (38) or desirable (18) for systematic searches. Table 1 shows the list of criteria grouped by feature, together with the classification decisions made by the authors.

Agreements/Disagreements

The level of agreement between the grades recorded by the two reviewers for each platform was compared. Agreement was highest for OvidSP, with 85% agreement between the two reviewers on grade, followed by EBSCOhost (67%) and ProQuest (65%) (Table 2). The complete set of grades assigned by the authors for each platform is given in Appendix 2.

[Graph 1. Combined results for the 3 platforms]

Interpretation of results

All three host platforms performed poorly (grading of 1.5 or 1) on three of the criteria: the ability to insert new lines of search into the existing search history; the ability to move search lines around within an existing search strategy; and having an option to choose which fields to display in the results screen. It is important to note that we refer to the existing (rather than saved) search strategy here. OvidSP, for example, allows comprehensive editing of strategies once they are saved, but until the search is re-run, the impact of the changes (i.e. the number of hits generated) cannot be seen. This approach can lead to problems with version control, as the search needs to be saved under a new name and re-run each time it is edited to see how many hits are generated.

Two of the platforms (ProQuest and EBSCOhost) also performed poorly on other functional characteristics, such as the ability to select all results from the complete set of records rather than page by page, and the ability to save a search history (so that the entire strategy can be re-run). In addition, both these platforms performed poorly when handling long and complex searches of 50 lines or more, and with saving and exporting large numbers of records (500+).

However, for 17 of the checklist criteria, all three platforms scored 2.5 or 3. Using complex search syntax was a particular strength across all three platforms, as was the use of controlled vocabulary (e.g. MeSH), a wide range of download options and compatibility with major reference management systems. In addition, the help facility was easy to locate and use within all three platforms.

Table 2 Combined results of IS grading of host platforms across all the checklist criteria

Grade*   OvidSP               ProQuest             EBSCOhost
         No. of criteria (%)  No. of criteria (%)  No. of criteria (%)
1        2 (3.6)              11 (20.0)            12 (21.8)
1.5      3 (5.5)              7 (12.7)             4 (7.3)
2        4 (7.3)              8 (14.5)             9 (16.4)
2.5      4 (7.3)              8 (14.5)             6 (10.9)
3        42 (76.4)            21 (38.2)            24 (43.6)

*Average score awarded (for example, 1.5 represents where the first reviewer awarded grade 1 and the second reviewer awarded grade 2).

Discussion

Searches for systematic reviews are necessarily complex and involve designing strategies that are often run across multiple databases on different host platforms. Information specialists and researchers need these host platforms to be both intuitive and capable of facilitating complex searching. We have found through experience that owners of host platforms are not always responsive to these needs and focus primarily on functions that allow fast retrieval of references, basic search functions and links to full-text articles. It is possible that database providers do not consider the ability to conduct searches for systematic reviews as part of their function, or that it is not economically viable for them to do so.


In the absence of similar studies, we relied on our own knowledge and experience to produce and refine the list of criteria that we felt were important for the searching performance of database host platforms. It is important to note that each platform has many more functions than those in the checklist; we focused only on the criteria essential for carrying out complex searching.

The topics chosen for testing the platforms (a review of child health outcome measures and ADHD in school settings) were broad enough for relevant information to be located on the three databases chosen (CINAHL/EBSCOhost, Medline/OvidSP and ASSIA/ProQuest). The topics warranted complex searching on all three platforms, with combinations of subject headings and free text, proximity syntax and use of field codes. Therefore, the topics chosen would not have favoured any individual platform. For the purposes of this study, we were not interested in how alterations in search strategy affected the results of the search, but in the actual performance of the platforms in carrying out the tasks identified on the checklist.

The results showed that no platform performed well for all the functions required for a systematic review search and that there was wide variation between the platforms. The lower grading for some of the criteria could be down to individual preference, to how the system has been configured by our own institution or to a lack of clear understanding about the complex functions of the platforms. In addition, there was not complete agreement between the authors about the performance of the individual platforms against the list of criteria. This shows that the assessment process was partly subjective and that at least two assessors are required to minimise bias. Furthermore, the grading results are only accurate for the time the searches were run; it is important to note that platforms are constantly releasing upgrades and making improvements.
However, the list of criteria is still relevant, and we anticipate it will be of use to information specialists working in the field of systematic reviews, as well as to procurement teams deciding which platform to purchase and to database developers who wish to meet the needs of review teams. We focused on the three major database providers that we could access at the

University of Exeter Medical School; however, there are other hosts, as well as stand-alone databases, that could also be assessed using the checklist.

The checklist is not intended to be a complete and final list of all functions we require from host platforms. It could incorporate other useful functions, such as the ability to share search strategies between different information specialists within the same team. This would allow searches to be run via different accounts, which would be useful for running update searches (for instance, if the information specialist changes) or for reusing and amending previous search strategies for closely related topics carried out by different information specialists. Another useful function would be the ability to group related search histories into folders. We intend to update and publish the criteria periodically, bearing in mind that methods and technologies adapt and change over time.

Previous studies in this area predominantly looked at the variations in results generated from the same database hosted on different platforms, and the reasons for these variations.6–10 This project bridges the gap between that understanding and the practical assessment of host platforms for complex searches, such as would be expected in a systematic review. The resulting checklist could be used by any information professional involved in either carrying out this type of searching or evaluating databases before making a procurement decision.

Future research could investigate more widely the experience of academics carrying out searches for systematic reviews. It would be useful to investigate whether databases are commonly selected for reviews based on ease of searching. If so, the results might encourage hosts to develop more review-friendly functions, or owners of databases to consider the broader requirements of systematic review researchers when selecting a host platform for their resource.
Conclusion

There is little, if any, published debate about the limitations of database host providers in facilitating searches for systematic reviews. It is possible, therefore, that host providers, database owners and information professionals do not consider this to be an important issue when upgrading systems,


choosing hosts or purchasing database packages, and instead focus primarily on ease of use, convenience and cost. Systematic reviews are increasingly being carried out by students as part of a higher degree course or dissertation: this issue therefore warrants discussion throughout the academic information profession, not just among information specialists in healthcare research.

Of the three platforms examined, only one (OvidSP) performed well on the majority of functions required for complex searching. As the other two host providers have sole commercial rights to key health databases, it is imperative that their platforms are capable of running complex searches for systematic reviews. This is required to minimise the risk of key databases being searched unsystematically, or worse, being omitted from the review process, and thereby increasing the risk of bias in systematic reviews. Librarians and procurement teams should be aware of the needs of systematic reviewers and information specialists when making decisions about database packages and host platforms.

Funding

This article presents independent research funded by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC) for the South West Peninsula. The views expressed in this publication are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health in England.

References

1 Brettle, A. J., Long, A. F., Grant, M. J. & Greenhalgh, J. Searching for information on outcomes: do you need to be comprehensive? Quality in Health Care 1998, 7, 163-167.
2 Papaioannou, D., Sutton, A., Carroll, C., Booth, A. & Wong, R. Literature searching for social science systematic reviews: consideration of a range of search techniques. Health Information and Libraries Journal 2010, 27, 114-122.
3 Savoie, I., Helmer, D., Green, C. J. & Kazanjian, A. Beyond Medline: reducing bias through extended systematic review search. International Journal of Technology Assessment in Health Care 2003, 19, 168-178.
4 Higgins, J. P. T. & Green, S. (eds). Cochrane Handbook for Systematic Reviews of Interventions, version 5.1.0 (updated March 2011). The Cochrane Collaboration, 2011. Available at: http://www.cochrane-handbook.org/
5 Centre for Reviews and Dissemination (CRD). Systematic Reviews: CRD's Guidance for Undertaking Reviews in Healthcare. York, UK: University of York, 2009.
6 Bandyopadhyay, A. Examining Biological Abstracts on two platforms: what do end users need to know? Science & Technology Libraries 2010, 29, 34-52.
7 Kimball, R. The GeoRef database: a detailed comparison and analysis of three platforms. Science & Technology Libraries 2010, 29, 111-129.
8 Younger, P. & Boddy, K. When is a search not a search? A comparison of searching the AMED complementary health database via EBSCOhost, OVID and DIALOG. Health Information and Libraries Journal 2009, 26, 126-135.
9 Sewell, R. R. Comparing four CAB Abstracts platforms from a veterinary medicine perspective. Journal of Electronic Resources in Medical Libraries 2011, 8, 134-149.
10 Casler, C. L., Herring, E., Smith, H., Moberly, H. K., Flood, S. & Perry, V. Comparing AGRICOLA by vendor. Journal of Agricultural & Food Information 2003, 4, 33.

Received 28 January 2013; Accepted 20 November 2013

Appendix 1 Full checklist

In the template below, the Available (Y/N), IS Grade (1-3) and Notes columns are completed during testing and are blank here.

No. | Criteria | E/D
1 | Searching (functions) |
1a | Command line searches | E
2 | Searching (syntax) |
2a | Boolean terms | E
2b | Phrase searching | E
2c | Adjacency terms | E
2d | Proximity terms | E
2e | Right truncation | E
2f | Left truncation | D
2g | Single character truncation | D
2h | Masking within a word | D
2i | Parenthesis | E
2j | Combining parentheses within strings with Boolean | E
2k | Combining parentheses within strings with adjacency | E
2l | Combining parentheses within strings with proximity | E
2m | Combining parentheses with single field codes | E
2n | Combining parentheses with multiple field codes | E
2o | Short cut to combining strings with AND/OR (e.g. OR/1-10) | D
3 | Field codes |
3a | Available to use | E
3b | Easily accessible | D
3c | Ability to combine (e.g. ti,ab) | E
4 | Controlled vocabulary |
4a | Subject headings e.g. MeSH | E
4b | Thesaurus available (displayed in hierarchy) | E
4c | Ability to choose multiple terms from the thesaurus | D
4d | Ability to combine controlled vocabulary terms with free-text | E
4e | Ability to explode headings | E
4f | Ability to choose a narrower term | E
4g | Scope note available | E
5 | Display (Search) |
5a | Option to view search history while using search screen | E
5b | Build up searches line-by-line with the number of hits visible for each string | E
5c | Ability to edit previous lines of search as it develops | D
5d | Ability to insert new lines of search into existing search | D
5e | Ability to move search lines around within search | D
5f | Combine searches (with Boolean) | E
5g | Renumber searches after deletion | D
5h | Refine search by update code | D
6 | Display (Records) |
6a | Option to choose fields to display | D
6b | Option to change the number of hits viewed per page | E
6c | Option to view search history on record display screen | E
6d | Ability to choose records and not lose this choice when you move onto the next page | E
6e | Can move onto next record when in full record display | D
6f | Search term highlighted | D
7 | Downloading |
7a | Select all results from complete set of records rather than page-by-page | E
7b | Able to download large numbers of records (500+) in one go | D
7c | A wide choice of export/download options | E
8 | Search History |
8a | Can save search history | E
8b | Can share saved searches | D
8c | Export search history | D
8d | Edit saved searches | D
8e | Re-run saved searches | E
9 | Performance |
9a | Can handle long and complex searches, >50 lines long | E
9b | Can handle large numbers of records >1000 | E
9c | Is compatible with major reference management systems | E
9d | Compatible with major web browsers: IE, Firefox and Google Chrome | E
10 | Other |
10a | Help facility is easy to locate and informative | E
10b | Results are consistent | E
10c | Turn off any deduplication | E

Appendix 2 Checklist with grading

Grades for each platform are shown as Reviewer 1 / Reviewer 2 / Combined.

No. | Criteria | E/D | OvidSP | ProQuest | EBSCOhost
1 | Searching (functions) | | | |
1a | Command line searches | E | 3/3/3 | 3/3/3 | 1/3/2
2 | Searching (syntax) | | | |
2a | Boolean terms | E | 3/3/3 | 3/3/3 | 3/3/3
2b | Phrase searching | E | 3/3/3 | 2/2/2 | 3/3/3
2c | Adjacency terms | E | 3/3/3 | 2/3/2.5 | 1/1/1
2d | Proximity terms | E | 3/3/3 | 3/3/3 | 3/3/3
2e | Right truncation | E | 3/3/3 | 3/3/3 | 3/3/3
2f | Left truncation | D | 1/1/1 | 3/3/3 | 1/1/1
2g | Single character truncation | D | 3/3/3 | 3/3/3 | 3/3/3
2h | Masking within a word | D | 3/3/3 | 3/3/3 | 3/3/3
2i | Parenthesis | E | 3/3/3 | 3/3/3 | 3/3/3
2j | Combining parentheses within strings with Boolean | E | 3/3/3 | 3/3/3 | 3/3/3
2k | Combining parentheses within strings with adjacency | E | 3/3/3 | 3/3/3 | 1/1/1
2l | Combining parentheses within strings with proximity | E | 3/3/3 | 3/3/3 | 3/3/3
2m | Combining parentheses with single field codes | E | 3/3/3 | 1/3/2.5 | 3/3/3
2n | Combining parentheses with multiple field codes | E | 3/3/3 | 2/3/2.5 | 1/1/1
2o | Short cut to combining strings with AND/OR (e.g. OR/1-10) | D | 3/3/3 | 1/1/1 | 1/3/2
3 | Field codes | | | |
3a | Available to use | E | 3/3/3 | 3/3/3 | 3/1/2
3b | Easily accessible | D | 2/3/2.5 | 3/3/3 | 2/1/1.5
3c | Ability to combine (e.g. ti,ab) | E | 3/3/3 | 3/3/3 | 1/1/1
4 | Controlled vocabulary | | | |
4a | Subject headings e.g. MeSH | E | 3/3/3 | 3/2/2.5 | 2/3/2.5
4b | Thesaurus available (displayed in hierarchy) | E | 3/3/3 | 1/2/1.5 | 1/3/2
4c | Ability to choose multiple terms from the thesaurus | D | 3/3/3 | 3/3/3 | 3/3/3
4d | Ability to combine controlled vocabulary terms with free-text | E | 3/3/3 | 2/2/2 | 3/1/2
4e | Ability to explode headings | E | 3/3/3 | 3/2/2.5 | 3/3/3
4f | Ability to choose a narrower term | E | 3/3/3 | 3/2/2.5 | 3/2/2.5
4g | Scope note available | E | 3/3/3 | 3/1/2 | 2/2/2
5 | Display (Search) | | | |
5a | Option to view search history while using search screen | E | 3/3/3 | 3/1/2 | 3/3/3
5b | Build up searches line-by-line with the number of hits visible for each string | E | 3/3/3 | 1/2/1.5 | 3/3/3
5c | Ability to edit previous lines of search as it develops | D | 1/2/1.5 | 1/1/1 | 3/3/3
5d | Ability to insert new lines of search into existing search | D | 1/2/1.5 | 2/1/1.5 | 1/1/1
5e | Ability to move search lines around within search | D | 1/2/1.5 | 1/1/1 | 1/1/1
5f | Combine searches (with Boolean) | E | 2/3/2.5 | 3/2/2.5 | 2/3/2.5
5g | Renumber searches after deletion | D | 3/3/3 | 1/1/1 | 3/3/3
5h | Refine search by update code | D | 2/2/2 | 1/1/1 | 1/1/1
6 | Display (Records) | | | |
6a | Option to choose fields to display | D | 1/1/1 | 1/1/1 | 1/1/1
6b | Option to change the number of hits viewed per page | E | 3/3/3 | 3/3/3 | 2/2/2
6c | Option to view search history on record display screen | E | 3/3/3 | 1/1/1 | 3/3/3
6d | Ability to choose records and not lose this choice when you move onto the next page | E | 3/3/3 | 3/3/3 | 1/3/2
6e | Can move onto next record when in full record display | D | 3/3/3 | 3/3/3 | 3/3/3
6f | Search term highlighted | D | 2/2/2 | 3/3/3 | 3/3/3
7 | Downloading | | | |
7a | Select all results from complete set of records rather than page-by-page | E | 3/3/3 | 3/2/2.5 | 1/1/1
7b | Able to download large numbers of records (500+) in one go | D | 3/3/3 | 3/3/3 | 2/1/1.5
7c | A wide choice of export/download options | E | 3/3/3 | 3/3/3 | 3/3/3
8 | Search History | | | |
8a | Can save search history | E | 3/3/3 | 1/1/1 | 1/1/1
8b | Can share saved searches | D | 1/3/2 | 1/1/1 | 1/1/1
8c | Export search history | D | 2/2/2 | 2/2/2 | 3/1/2
8d | Edit saved searches | D | 2/3/2.5 | 3/1/2 | 1/3/2
8e | Re-run saved searches | E | 3/3/3 | 1/1/1 | 3/3/3
9 | Performance | | | |
9a | Can handle long and complex searches, >50 lines long | E | 3/3/3 | 1/2/1.5 | 2/1/1.5
9b | Can handle large numbers of records >1000 | E | 3/3/3 | 1/2/1.5 | 2/1/1.5
9c | Is compatible with major reference management systems | E | 3/3/3 | 3/3/3 | 3/2/2.5
9d | Compatible with major web browsers: IE, Firefox and Google Chrome | E | 3/3/3 | 2/2/2 | 3/3/3
10 | Other | | | |
10a | Help facility is easy to locate and informative | E | 2/3/2.5 | 3/2/2.5 | 2/3/2.5
10b | Results are consistent | E | 3/3/3 | 1/1/1 | 3/3/3
10c | Turn off any deduplication | E | 3/3/3 | 2/2/2 | 3/3/3
