This article was downloaded by: [Trent University] on 10 October 2014, at 18:55. Publisher: Routledge. Informa Ltd, registered in England and Wales, registered number 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Medical Reference Services Quarterly: publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/wmrs20

Development of a Web-Based Repository for Sharing Biomedical Terminology From Systematic Review Searches: A Case Study

Ahlam A. Saleh (a), Melissa A. Ratajeski (b) & John LaDue (b)

(a) Arizona Health Sciences Library, University of Arizona, Tucson, Arizona, USA
(b) Health Sciences Library System, University of Pittsburgh, Pittsburgh, Pennsylvania, USA

Published online: 15 Apr 2014.

To cite this article: Ahlam A. Saleh, Melissa A. Ratajeski & John LaDue (2014) Development of a Web-Based Repository for Sharing Biomedical Terminology From Systematic Review Searches: A Case Study, Medical Reference Services Quarterly, 33:2, 167-178, DOI: 10.1080/02763869.2014.897518

To link to this article: http://dx.doi.org/10.1080/02763869.2014.897518


Medical Reference Services Quarterly, 33(2):167–178, 2014
Published with license by Taylor & Francis
ISSN: 0276-3869 print / 1540-9597 online
DOI: 10.1080/02763869.2014.897518

Development of a Web-Based Repository for Sharing Biomedical Terminology From Systematic Review Searches: A Case Study

AHLAM A. SALEH
Arizona Health Sciences Library, University of Arizona, Tucson, Arizona, USA


MELISSA A. RATAJESKI and JOHN LADUE
Health Sciences Library System, University of Pittsburgh, Pittsburgh, Pennsylvania, USA

Requests for comprehensive searches, such as searches to support systematic reviews, seem to be evolving into routine practice in the health sciences library environment. Collecting terminology for these searches is often a time-consuming process. This case study reports on the development of a searchable web-based repository, MedTerm Search Assist, as a means for librarians to share biomedical terminology from systematic review searches.

KEYWORDS: Collaboration, database development, informatics, MedTerm Search Assist, systematic reviews, terminology, usability testing

INTRODUCTION

A systematic review attempts to identify, appraise, and synthesize all the empirical evidence that meets pre-specified eligibility criteria to answer a given research question. Researchers conducting systematic reviews use explicit methods aimed at minimizing bias, in order to produce more reliable findings that can be used to inform decision making.1

© Ahlam A. Saleh, Melissa A. Ratajeski, and John LaDue
Received: November 18, 2013; Revised: January 8, 2014; Accepted: January 27, 2014.
This article is based on a paper presented at the Medical Library Association annual meeting, Minneapolis, MN, May 17, 2011, and a paper presented at the Medical Library Association Mid-Atlantic Chapter annual conference, Richmond, VA, October 12, 2011.
Address correspondence to Ahlam A. Saleh, Arizona Health Sciences Library, University of Arizona, 1501 N. Campbell Avenue, P.O. Box 245079, Tucson, AZ 85724. E-mail: asaleh@ahsl.arizona.edu


Because of their importance, publications of systematic reviews are on the rise: a run of the Clinical Queries systematic review search filter in PubMed shows that the number of results nearly doubled over the five-year span of 2008–2012. With more systematic reviews being produced each year, conducting literature searches for systematic reviews is evolving into a routine responsibility for many information specialists in special, academic, and clinical settings, especially in light of the Institute of Medicine standards for systematic reviews, which recommend in standard 3.1 to "work with a librarian or other information specialist trained in performing systematic reviews."2

Because the goal of a systematic review is to summarize all the available research on a specific topic, the database searches to support these reviews need to be quite extensive. The searches should include a comprehensive list of variants of keywords and subject headings for a topic; such thorough search methodology helps minimize the chance that relevant studies will be missed. Gathering the keywords and subject headings for a systematic review search can be time-consuming, however, requiring librarians to obtain input from review authors, scan relevant articles, examine the indexing of citations in various databases, and look at bibliographies and published search strategies within the methods sections of existing systematic reviews, when available (see Figure 1). The Centre for Reviews and Dissemination systematic review guidance handbook describes a similar workflow process for identifying appropriate terms.3

FIGURE 1 Example of a workflow process for gathering search terminology.


Once the keywords and subject headings have been gathered for a topic, they are typically saved in a format accessible only by the creator and/or the systematic review team. Such formats may include tables, spreadsheets, or written notes. Unless these terms are eventually documented in the systematic review publication or shared elsewhere, they are not accessible to other information specialists. Recent standards such as the PRISMA statement and the Institute of Medicine standards state that search strategies should be included in the text of the published systematic review, though this does not always occur.2,4 Even if the searches are planned for documentation in a review, publication can take years, and the search terminology would therefore not be accessible in a timely fashion.

Considering the effort and time it takes to gather the comprehensive terminology required for systematic review searches, it seems reasonable to propose that librarians collaborate and share their work with one another, as the terminology collected for one search could be used by another individual working on a similar topic. Furthermore, a librarian may receive another systematic review request on a similar topic, allowing previously collected terms to be reused. There are several examples of initiatives that demonstrate development of a means to share work in the information specialist field:

- The Association of College & Research Libraries hosts Peer-Reviewed Instructional Materials Online (PRIMO), a resource to "share peer-reviewed instructional materials created by librarians."5
- The Medical Library Association DocKits include materials from a number of institutions on health sciences library management topics.6
- The PubMed Search Strategies blog provides entries of PubMed search strategies for various topics.7

These projects demonstrate that librarians are willing to share their work and that the development of a database for librarians to share terminology collected for systematic reviews with one another may be a feasible idea. If such a database were available, information specialists could review the terminology collected by their colleagues as a starting point to develop their searches for systematic reviews.

OBJECTIVES

This project sought to develop a publicly available database as a repository for librarians and information specialists that would allow them to:

1. share biomedical terminology collected for systematic reviews with their colleagues; and
2. identify potential terminology to use in a systematic review literature search.

DATABASE DEVELOPMENT

Database Structure

A project team was formed that included two subject expert librarians and an information technology librarian responsible for aspects such as coding and database design. The tool's overall aesthetic design was based on the library's existing web pages, utilizing multiple tabs. The latest release, MedTerm Search Assist 1.0, is publicly available; a screenshot of the latest version of the home page is provided in Figure 2. The core structure of the database was designed to include the following:

- a main record, to display data entered for terms in the database;
- a form for librarians to add a new term into the database; and
- a form for librarians to add additional terminology to an existing term record.

Figure 3 depicts the form used to request a new term addition; this is the latest version of the form, incorporating modifications from usability testing. The fields included in the main record of entered terms were decided by the subject expert librarians (see Table 1 for the final listing of fields). Systematic review searches are typically conducted in multiple databases, and each database may require specific controlled vocabulary (subject terminology). Keywords, however, can be searched as free text in any database, including MEDLINE, making them the most versatile and useful; therefore, the Keyword field was designated as mandatory. Because only one field is required, there may be variation in record content.

FIGURE 2 MedTerm Search Assist home page (color figure available online).


FIGURE 3 Add a new term form.

Each field contains an accompanying Notes section. Examples of content appropriate for the Notes section include providing a citation that discusses challenges or tips for searching in a specific topic area, noting that certain keywords return an exorbitant number of irrelevant citations because of different meanings of the term, or noting that truncation at a certain point in the term would be problematic. An example of a record for a term


TABLE 1 Outline of the Fields That Comprise a Standard Record for a Term

Main field: Keywords (mandatory)
Subfields: Synonyms, term variations, and related terms

Main field: Subject terminology (optional)
Subfields: MEDLINE: MeSH; MEDLINE: Pharmacological Action; MEDLINE: Floating Subheadings; MEDLINE: Substance Names

Main field: Search strategies (optional)
Subfields: PubMed; Ovid MEDLINE

(post-usability testing) that is approved and included in the database is shown in Figure 4.

Although the focus of the database was on sharing terminology, a decision was made to also include search strategies, because this would allow users to quickly copy and paste to complete a preliminary scope search on the topic. Additionally, users would be able to see how terms were combined in search statements with Boolean operators and advanced features such as adjacency. These search strategy entries should not be confused with studied search filters, which typically provide sensitivity and/or specificity data and may be validated.

The Cochrane Handbook for Systematic Reviews of Interventions recommends that, at minimum, the databases CENTRAL and MEDLINE be searched for Cochrane systematic reviews.8 Typically, MEDLINE is the first database for which a search is devised in health sciences systematic review searching, and subsequent database searches are then based on this first search. Therefore, the team decided to limit the database to MEDLINE subject terminology and search strategies, which would be the most useful to librarians beginning a systematic review search.

The MedTerm Search Assist database was built using the MySQL database system running on a Linux operating system. The front end was developed using PHP and is hosted on an in-house Apache server. MySQL and PHP were both chosen over equivalent products because of familiarity, ease of use, cost, open source code, cross-platform interoperability, and widely available support through a large user and developer community. Table 2 describes the overall structure of the database. Vivisimo was implemented as the main search mechanism; the University of Pittsburgh had prior experience with successful integration of Vivisimo search engine technology, which made it a suitable choice.
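The relational structure summarized in Table 2, a main term record joined to per-vocabulary tables that each carry their own approval status, can be sketched in a few lines. This is an illustrative sketch using SQLite rather than the production MySQL system, and the table and column names are simplified stand-ins, not the actual schema.

```python
import sqlite3

# Sketch of the core structure described in Table 2, using SQLite for
# portability (the production system used MySQL with a PHP front end).
# Table and column names are illustrative, not the actual schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE term (
    id INTEGER PRIMARY KEY,
    term TEXT NOT NULL,
    approval_status TEXT DEFAULT 'pending',
    creation_date TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE keyword (
    id INTEGER PRIMARY KEY,
    keyword TEXT NOT NULL
);
-- The "accompanying table": links keywords to term records so a keyword
-- can be reused across records, each link with its own approval status.
CREATE TABLE term_keyword (
    id INTEGER PRIMARY KEY,
    term_id INTEGER REFERENCES term(id),
    keyword_id INTEGER REFERENCES keyword(id),
    approval_status TEXT DEFAULT 'pending'
);
""")

# Example: an "antidepressants" record with two keyword synonyms.
conn.execute(
    "INSERT INTO term (id, term, approval_status) "
    "VALUES (1, 'antidepressants', 'approved')")
conn.executemany("INSERT INTO keyword (id, keyword) VALUES (?, ?)",
                 [(1, "antidepressive agents"), (2, "SSRI")])
conn.executemany(
    "INSERT INTO term_keyword (term_id, keyword_id, approval_status) "
    "VALUES (?, ?, 'approved')",
    [(1, 1), (1, 2)])

# Retrieve only the approved keywords attached to record 1.
rows = conn.execute("""
    SELECT k.keyword FROM keyword k
    JOIN term_keyword tk ON tk.keyword_id = k.id
    WHERE tk.term_id = 1 AND tk.approval_status = 'approved'
    ORDER BY k.id
""").fetchall()
```

The link-table design mirrors the repeated "accompanying table" pattern of Table 2: every vocabulary type (keyword, MeSH term, pharmacological action, and so on) gets its own value table plus a join table carrying the approval status.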
The database was designed so that users could search for a term or browse a list of available terms (with direct links to individual term records). The search features available are basic. When a search is executed, the fields searched include Keywords and Subject Terminology. Plurals are automatically retrieved if the singular form of a term is used in a search. Truncation is not permitted, and quotation marks may be used for phrases. An "About" page describes the project purpose and scope, and instructions for using the database are provided through a Help link and on each form.

FIGURE 4 Example of a term record.
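The search behavior described above (automatic plural retrieval, no truncation, quoted phrases) can be approximated with a few lines of code. This is a rough illustration only; the actual matching was handled by the Vivisimo engine, and the pluralization rule here is a deliberate simplification.

```python
import re

def normalize(term):
    """Map a word to a crude singular form so that a search for the
    singular also retrieves plurals (a simplification; the production
    system relied on Vivisimo's own term handling)."""
    word = term.lower()
    for suffix, repl in (("ies", "y"), ("es", "e"), ("s", "")):
        if word.endswith(suffix) and len(word) > len(suffix) + 1:
            return word[: -len(suffix)] + repl
    return word

def matches(query, field_text):
    """Return True if every query token appears in the field text.
    Quoted phrases are matched literally; bare words are compared on
    their normalized singular forms (so no truncation is needed)."""
    phrases = re.findall(r'"([^"]+)"', query)
    words = re.sub(r'"[^"]+"', " ", query).split()
    text = field_text.lower()
    norm_tokens = {normalize(t) for t in re.findall(r"[a-z]+", text)}
    if any(p.lower() not in text for p in phrases):
        return False
    return all(normalize(w) in norm_tokens for w in words)
```

For example, a search for the singular "antidepressant" would match a record whose Keywords field contains "Antidepressants", while a quoted query such as "nursing homes" is matched as an exact phrase.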

Usability Testing

Institutional Review Board (IRB) approval was obtained to conduct usability testing. For testing, the project team populated the database with several data sets of terminology and search strategies from their own work. The team used iterative usability testing, conducting two phases with changes made to the database between them. The literature suggests that usability testing does not require large sample sizes in early testing; therefore, the team decided on a target recruitment of at least 10 participants for each phase.9–11 The first phase of testing was conducted through an online survey comprised of 13 questions and tasks with a response format


TABLE 2 Description of the MySQL Tables in the MedTerm Search Assist Database

1. Term (fields: Unique ID, term, approval status, modification approval status, creation date, last modified date, creator, modifier)
2. Keyword (fields: Unique ID, keyword)
3. Keyword accompanying table (fields: Unique ID, term ID, keyword ID, approval status)
4. Keyword note (fields: Unique ID, term ID, note, approval status)
5. MeSH term (fields: Unique ID, MeSH term)
6. MeSH term accompanying table (fields: Unique ID, term ID, MeSH ID, approval status)
7. MeSH term note (fields: Unique ID, term ID, note, approval status)
8. Pharmacological action (fields: Unique ID, pharm term)
9. Pharmacological action accompanying table (fields: Unique ID, term ID, pharm ID, approval status)
10. Pharmacological action note (fields: Unique ID, term ID, note, approval status)
11. Floating subheading (fields: Unique ID, subheading term)
12. Floating subheading accompanying table (fields: Unique ID, term ID, subheading ID, approval status)
13. Floating subheading note (fields: Unique ID, term ID, note, approval status)
14. Substance name (fields: Unique ID, substance term)
15. Substance name accompanying table (fields: Unique ID, term ID, substance ID, approval status)
16. Substance name note (fields: Unique ID, term ID, note, approval status)
17. Search strategy: PubMed (fields: Unique ID, term ID, strategy, note, approval status)
18. Search strategy: Ovid MEDLINE (fields: Unique ID, term ID, strategy, note, approval status)

consisting of a Likert scale accompanied by options to comment. A 5-point Likert scale, ranging from 1 (strongly disagree) to 5 (strongly agree), was used. The survey tasks and questions are detailed in Table 3.

The participants for the first phase of testing were 25 information specialists recruited through an announcement posted on the "Expertsearching" Medical Library Association list-serv. Aside from age restrictions imposed by the IRB, no specific requisites were set when recruiting this population of users. An incentive was provided to participants. The LimeSurvey web application was used to administer the questionnaire. Results were tabulated on a spreadsheet and color coded to identify patterns of problem areas with the product. During testing, some remote testers initially encountered an error in the tool, which asked them to enter a username and password. This prevented some testers from completing all the tasks and consequently may have influenced their Likert ratings.

Based on feedback from phase 1 testing, revisions were made to:

- make clear the purpose and audience of the database;
- facilitate navigation; and
- make the link to access the "Add terms to an existing record" form more visible.
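The spreadsheet tabulation used in phase 1, computing ratings per question and flagging patterns of problem areas, can be sketched as follows. The response data and the flagging threshold (a mean below the scale midpoint of 3) are invented for illustration and are not the study's actual results or criteria.

```python
from statistics import mean

# Hypothetical 5-point Likert responses (1 = strongly disagree,
# 5 = strongly agree), keyed by questionnaire statement number.
# These values are illustrative, not the study's actual data.
responses = {
    2: [4, 5, 3, 4, 4],   # "The purpose of this database is clear"
    6: [2, 3, 2, 4, 3],   # "It was easily apparent how to enter a new term"
    12: [4, 4, 5, 3, 4],  # "This database will save me time"
}

def flag_problem_areas(responses, threshold=3.0):
    """Return {question: mean score} for questions whose mean rating
    falls below the threshold, mimicking the color coding used to
    spot patterns of problem areas on the spreadsheet."""
    return {q: round(mean(scores), 2)
            for q, scores in responses.items()
            if mean(scores) < threshold}

print(flag_problem_areas(responses))
```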


TABLE 3 Usability Testing Questionnaire

Task: Please utilize the following URL to access the Medical Terminology Database.
  1. What are your first impressions of the database homepage (comment only)?
  2. The purpose of this database is clear (Likert and comment).

Task: Search for the term "antidepressants."
  3. I was easily able to search for the term antidepressants and identify whether there were any results (Likert and comment).

Task: Search for the term "nursing homes," view the record, and glance over the content.
  4. The record is easy to read and the heading titles are clear (Likert and comment).
  5. I would be able to easily take the information from this record to create a search strategy on the topic Nursing Homes (Likert and comment).

Task: Add the term "professional relationship" into the database, using the below terms to fill out the "add a new term" form. Keywords: relationship, collaboration, interpersonal relations, interprofessional, teamwork, healthcare team. MeSH terms: interpersonal relations, interprofessional relations, group processes, partnership practice.
  6. It was easily apparent how to enter a new term into the database (Likert and comment).
  7. The "add a new term" form is easy to complete (Likert and comment).
  8. The instructions for the "add a new term" form are easy to understand (Likert and comment).
  9. Once a term was entered, it was easy to get back to the main database page (Likert and comment).

Task: Add the following MeSH terms to the existing "antidepressant" record: Depression/drug therapy, Depressive Disorders/drug therapy, Antidepressive Agents (explode).
  10. It was intuitive how to edit an existing record (Likert and comment).
  11. This database will be useful to me (Likert and comment).
  12. This database will save me time when preparing searches (Likert only).
  13. Please provide any additional feedback you would like to share (comment only).

Note: The format available to respond to questions or statements in the questionnaire is noted in parentheses.

The second phase of testing was conducted in person on the revised database. The participants included eleven internally recruited librarians from the University of Pittsburgh and one individually selected nonaffiliated librarian, none of whom had experience with the MedTerm Search Assist database prior to testing. The same questionnaire devised for online testing was used, although the questions and tasks were read aloud to the participants. The two subject expert librarians were present in the room to administer the test: one read the questions aloud to the participants while the other took detailed notes, since there would be no audio or video recording of the session. Each participant was instructed to verbally express what he or she was thinking while completing tasks. A laptop was


projected onto a large-screen display so that both test administrators could see the steps taken by the testers to complete given tasks. Based on feedback from phase 2 testing, revisions were made to:

- further clarify the purpose and audience of the database;
- simplify the instructions;
- notate which fields were required; and
- increase readability by changing the shades of colors used.


Database Maintenance

As terms are added by librarians into the MedTerm Search Assist database, submissions are monitored by way of a direct alert to an e-mail account accessible only by the development team. Submitted terms are reviewed for obvious errors such as misspellings but are not reviewed for thoroughness, quality, or accuracy prior to publication in the database. No additional review of term records will occur each year during the annual update of MeSH headings; given staff limitations and the need to plan a realistic allocation of time for maintenance of the product, this was the best approach that could be managed. The database's administrative interface provides the capability to accept, reject, and/or modify all submitted terminology. The database is routinely crawled to index newly added terms.
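The submission and review workflow described above amounts to a simple moderation queue. The sketch below illustrates the idea; the class, method, and status names are invented for illustration, and the actual system used an e-mail alert plus a PHP administrative interface.

```python
from dataclasses import dataclass

# Minimal sketch of the moderation workflow described above: each
# submission waits in a pending state until an administrator accepts,
# rejects, or modifies it. Names are illustrative, not taken from
# the actual PHP implementation.
@dataclass
class Submission:
    term: str
    keywords: list
    status: str = "pending"

class ModerationQueue:
    def __init__(self):
        self.items = []

    def submit(self, submission):
        """Record a new submission; the real system also fired a
        direct e-mail alert to the development team at this point."""
        self.items.append(submission)

    def pending(self):
        return [s for s in self.items if s.status == "pending"]

    def review(self, submission, accept, corrections=None):
        """Accept or reject a submission, optionally fixing obvious
        errors such as misspellings before publication."""
        if corrections:
            submission.keywords = corrections
        submission.status = "approved" if accept else "rejected"

# Example: a submission with a misspelled keyword is corrected,
# then approved for publication in the database.
queue = ModerationQueue()
queue.submit(Submission("antidepressants", ["antidepresants", "SSRI"]))
sub = queue.pending()[0]
queue.review(sub, accept=True, corrections=["antidepressants", "SSRI"])
```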

DISCUSSION

This development project was an attempt to create a free, web-accessible database as a promising tool to facilitate search preparation for systematic review searches. MedTerm Search Assist was not developed to be the only place to identify terms for a search, but rather a starting point, potentially minimizing the time spent using the various means described in the introduction, especially because those means have limitations. For example, not all published systematic reviews include search documentation, or the documentation is available only to information professionals who have a subscription to the journal or organization where the review is published. Additionally, there may be instances where no systematic review has been published on the topic. The project team also recognized that the Entry Terms in the MEDLINE MeSH browser would be a helpful starting point for collecting synonyms for a search; the terms provided are not inclusive, however.

The MedTerm Search Assist database was made available for public use on May 17, 2011. A preliminary Webtrends analytic report for 2012–2013 indicates 1,603 visits during that time period. The analytics also indicate an international presence, with the United States being the most


active, followed by Australia and Canada as the top three active countries. As of September 2013, 26 terms had been submitted by librarians and approved for addition to the database. This number is much lower than expected, considering how the team envisioned the resource being used. Preliminary speculation about the low usage includes the acknowledgement that the resource may not integrate seamlessly into an information professional's workflow, lack of awareness of the database's existence, and the likelihood that some database design issues were not identified in the initial usability testing. To bolster usage, the project team plans to increase marketing of the database across targeted list-servs whose subscribers are librarians or information specialists who complete systematic reviews. In addition, it is anticipated that dissemination through publication and presentation will increase awareness of the MedTerm Search Assist database and spark the audience's interest in investigating and contributing terminology.

CONCLUSION

The MedTerm Search Assist database was developed for information professionals to share keywords, MEDLINE search strategies, and search tips with one another. Librarians and information professionals are encouraged to submit their search terminology to the MedTerm Search Assist database. Along with increased advertising, next steps include possible modifications of the database and more targeted, user-focused usability testing.

REFERENCES

1. Cochrane Collaboration. "About Cochrane Systematic Reviews and Protocols." Accessed September 2, 2013. http://www.thecochranelibrary.com/view/0/AboutCochraneSystematicReviews.html.
2. Institute of Medicine. Finding What Works in Health Care: Standards for Systematic Reviews. Washington, DC: The National Academies Press, 2011.
3. Centre for Reviews and Dissemination. Systematic Reviews: CRD's Guidance for Undertaking Reviews in Healthcare. January 2009. http://www.york.ac.uk/inst/crd/pdf/Systematic_Reviews.pdf.
4. Moher, D., A. Liberati, J. Tetzlaff, D. Altman, and The PRISMA Group. "Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement." PLoS Medicine 6, no. 7 (2009): e1000097. doi:10.1371/journal.pmed.1000097.
5. Association of College & Research Libraries. "Peer-Reviewed Instructional Materials Online." Accessed February 2, 2012. http://www.ala.org/acrl/aboutacrl/directoryofleadership/sections/is/iswebsite/projpubs/primo.
6. Medical Library Association. "DocKits." September 6, 2013. http://www.mlanet.org/publications/dockits/index.html.


7. Schmidt, C. PubMed Search Strategies. Accessed January 13, 2012. http://pubmedsearches.blogspot.com/.
8. Cochrane Collaboration. Cochrane Handbook for Systematic Reviews of Interventions. March 2011. http://www.cochrane-handbook.org.
9. United States Department of Health and Human Services, and United States General Services Administration. Research-Based Web Design & Usability Guidelines. Washington, DC: U.S. Government Printing Office, 2006.
10. Tullis, T., and J. Stetson. "Comparison of Questionnaires for Assessing Website Usability." In Proceedings of the Usability Professionals' Association Conference, 1–12. Minneapolis, MN: Usability Professionals' Association, 2004. http://home.comcast.net/~tomtullis/publications/UPA2004TullisStetson.pdf.
11. Krug, S. Don't Make Me Think!: A Common Sense Approach to Web Usability. 2nd ed. Berkeley, CA: New Riders, 2006.

ABOUT THE AUTHORS

Ahlam A. Saleh, MD, MLS ([email protected]) is Information Services Librarian, Arizona Health Sciences Library, University of Arizona, 1501 N. Campbell Avenue, P.O. Box 245079, Tucson, AZ 85724. Melissa A. Ratajeski, MLIS, AHIP, RLAT ([email protected]) is Reference Librarian; and John LaDue, MLIS ([email protected]) is Knowledge Integration Librarian; both at Health Sciences Library System, University of Pittsburgh, 200 Scaife Hall, 3550 Terrace Street, Pittsburgh, PA 15261.
