
A Methodology to Analyze the Quality of Health Information on the Internet: The Example of Diabetic Neuropathy

Sundeep Chumber, BMBS, BSc; Jörg Huber, PhD, MSc; Pietro Ghezzi, PhD

From the Brighton & Sussex Medical School, Falmer, Brighton, UK (Mr Chumber, Dr Ghezzi), and the University of Brighton, School of Health Sciences, Falmer, Brighton, UK (Dr Huber).

Correspondence to Pietro Ghezzi, Brighton & Sussex Medical School, BSMS-Trafford Centre, Falmer, Brighton, BN1 9RY, UK ([email protected]).

DOI: 10.1177/0145721714560772
© 2014 The Author(s)

Purpose

The purpose of this work was to evaluate the criteria used to assess the quality of information on diabetic neuropathy on the Internet.

Methods

Different search engines (Google, Yahoo, Bing, and Ask) and 1 governmental health website (MedlinePlus) were studied. The websites returned (200 for each search engine) were classified according to their affiliation (eg, commercial, professional, patient groups). A scoring system was devised from the literature to assess quality of information. Websites were also analyzed using the 2 most widely used instruments for assessing the quality of health information, the Journal of the American Medical Association (JAMA) scoring system and the Health On the Net Foundation (HON) certification.

Results

Professional websites and health portals scored better according to most criteria. Google and MedlinePlus returned results scoring significantly higher than the other engines on some of the criteria. The use of different instruments gave different results, indicating that the JAMA score and the HON certification may not be sufficient on their own.


Conclusions

This methodology could be used to evaluate the reliability and trustworthiness of information on the Internet on different topics and to identify topic areas or websites where the available information is not appropriate.

In 2012, 72% of US Internet users searched for health information on the Internet.1 The Internet offers a wide range of online sources, including health-related websites, blogs, and social networking sites. While social media are particularly important for sharing experiences and providing emotional support and advice to patients, few people use them to find health information.2,3 Although there are several specialized websites aimed specifically at providing the public with health information, 77% of users rely primarily on generic search engines, and only 13% use specialized health websites or portals1; interestingly, this pattern has not changed in the past 10 years.4

Reliance on popular search engines for health-related information is not restricted to patients. Studies have shown that clinical staff use the Internet, including generic search engines such as Google, to search for health information.5 In this context, it has been suggested that even nonspecialized search engines such as Google, and not only PubMed, may be useful to medical students and young doctors in making a diagnosis.6,7

However, there is a concern that websites might mislead patients into non-evidence-based practices or treatments. Websites could potentially encourage self-diagnosis and become a substitute for medical advice. It is often thought that the amount of information available on the web is overwhelming and that patients, who are not medically trained, may not be able to recognize inaccurate information.8,9 In addition, people seldom look beyond the first 10 websites offered by default by Google and other search engines.9 On the other hand, there is evidence that use of the Internet can improve patient participation in medical decision making.10

These concerns have raised the issue of measuring the quality of information available on the web, and a number of criteria and instruments have been devised for this purpose. The most commonly used are the Journal of the American Medical Association (JAMA) criteria and the Health On the Net Foundation (HON) code. The JAMA criteria, originally devised in 1997, score the quality of a website on 4 main criteria that must be met: disclosure of authorship, attribution of sources (references), disclosure of commercial interest and ownership of the website, and currency (indication of the date of update).11 The HON code is a quality certification from the Health On the Net Foundation, a Swiss-based nonprofit organization. Accreditation is granted to websites that adhere to 8 ethical principles: authorship, complementarity, privacy, attribution, justifiability, transparency, financial disclosure, and advertising policy.12 Other criteria often taken into account are whether the website is user-friendly in terms of interactivity13 and whether the text is readable by a layperson. Health information inevitably uses medical language and terminology that may challenge patients; highly reliable websites may prove inaccessible to individuals with low literacy, thereby limiting their use.14,15

The purpose of this study was to assess the quality of health information available on the Internet for a specific pathology, as measured by 2 instruments, the HON code and the JAMA criteria, comparing the results returned by different search engines. Given the rising prevalence of diabetes and the high likelihood that patients with diabetes will search the Internet,16 the quality of information about this disease is important. In fact, among people with chronic conditions, patients with diabetes are the most likely to use the Internet, with an adjusted rate of Internet use for health information of 52%,16 followed by patients with depression.16,17

Research on online information related to diabetes and its complications is limited.18,19 One study evaluated the usability, content, and reliability of 47 websites returned by a web search for diabetes mellitus.18 Most sites were commercial organizations, while the remainder were either institutional or governmental. Reliability of websites was assessed according to authorship, expert availability, and HON code accreditation. Only 8 (17%) sites adhered to all 3 criteria, with authorship and the HON code logo each being displayed on a single site. Commercial sites had better overall reliability than institutional or governmental sites, while there was no correlation between usability of sites and reliability.18 Thus, a limited number of the websites analyzed for diabetes mellitus met the criteria for quality, with the 2 best websites being the American Diabetes Association (www.diabetes.org) and the Joslin Diabetes Center (www.joslin.org).18


Methods

This is a quantitative study evaluating the reliability of online information related to diabetic neuropathy. Quality criteria were identified first, followed by comparison of the results with those based on the HON code and the JAMA criteria. Then 200 websites from each of the main search engines (Google, Bing, Yahoo, Ask) and 1 specialized health portal (the US government's MedlinePlus) were scored against the set criteria, and the data were analyzed qualitatively and quantitatively using statistical methods.

To assess the reliability or trustworthiness of websites, a scoring system was devised based on a combination of the quality evaluation tools used in the HON code; the JAMA benchmarks; the Health-Related Website Evaluation Form (HRWEF), which assesses content, accuracy, authorship, currency, audience, navigation, external links, and structure; and the Quality Component Scoring System (QCSS), which assesses ownership, purpose, authorship, author qualification, attribution, interactivity, and currency.11,20-23 Sites were evaluated for authorship, attribution, disclosure, currency, complementarity, justifiability, transparency, and advertising. A score of 1 or 2 was given if a specific criterion was met and 0 if the site failed to meet it, giving a maximum score of 10, as summarized in Table 1. Sites were also evaluated using the JAMA benchmarks, obtaining a score from 0 to 4,11 and assessed for HON code certification, as indicated by the presence of the HON seal on the website.

If a particular criterion, such as authorship, was not visible on the initial web page, the 3-click rule was applied. This is an unofficial yet widely adopted website navigation rule, which suggests that users should be able to find the information they are looking for within 3 clicks.24 If a specific criterion was not visible within 3 clicks, it was considered not to be listed and given a score of 0. The methodology and workflow used in this study could be applied to analyze the quality of information on other diseases or conditions.

Table 1
Derived Reliability Scoring System, Used to Assess the Quality of Online Health Information

Authorship: no authorship = 0; authorship without affiliations and credentials = 1; full authorship, affiliations, and credentials = 2.
Attribution: (I) no references or (II) references unclear = 0; references listed clearly = 1.
Disclosure: no disclosure/ownership = 0; full disclosure/ownership = 1.
Currency: no indication of the date content was posted and updated = 0; indication of the date content was posted and updated = 1.
Complementarity: (I) information does not support, or (II) site does not clearly state that information supports, the doctor-patient relationship = 0; site clearly states that information supports the doctor-patient relationship = 1.
Justifiability: site does not back up claims relating to benefit and performance = 0; site contains some evidence to back up such claims, but no referencing/sources = 1; site contains trial-based evidence to back up such claims = 2.
Transparency: no contact details = 0; contact details = 1.
Advertising: (I) advertising not distinct from editorial content or (II) advertising of own product = 0; (I) no advertising or (II) advertising distinct from editorial content = 1.
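To make the rubric concrete, the following is a minimal sketch (not the authors' code) of how the Table 1 criteria and the JAMA benchmarks could be scored; the Website structure and its field names are hypothetical illustrations.

```python
# Hypothetical sketch of the derived 10-point reliability rubric (Table 1)
# and the 4-point JAMA benchmark score; not the instrument used in the study.
from dataclasses import dataclass

@dataclass
class Website:
    authorship: int        # 0 = none, 1 = author only, 2 = author + affiliations/credentials
    attribution: int       # 0 = no/unclear references, 1 = references listed clearly
    disclosure: int        # 0 = none, 1 = full disclosure/ownership
    currency: int          # 0 = no dates, 1 = posting/update dates shown
    complementarity: int   # 0/1: states it supports the doctor-patient relationship
    justifiability: int    # 0 = unsupported claims, 1 = some evidence, 2 = trial-based evidence
    transparency: int      # 0 = no contact details, 1 = contact details
    advertising: int       # 0 = ads not distinct/own product, 1 = no ads or clearly distinct

def reliability_score(site: Website) -> int:
    """Sum the eight criterion scores of Table 1 (maximum 10)."""
    return (site.authorship + site.attribution + site.disclosure +
            site.currency + site.complementarity + site.justifiability +
            site.transparency + site.advertising)

def jama_score(site: Website) -> int:
    """JAMA benchmarks: 1 point each for authorship, attribution,
    disclosure, and currency (maximum 4)."""
    return sum(1 for v in (site.authorship, site.attribution,
                           site.disclosure, site.currency) if v > 0)

example = Website(authorship=2, attribution=1, disclosure=1, currency=1,
                  complementarity=0, justifiability=2, transparency=1,
                  advertising=1)
print(reliability_score(example), jama_score(example))  # 9 4
```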


Identification and Selection of Websites

The 4 most popular search engines (google.com, yahoo.com, bing.com, and ask.com) and 1 health portal (nlm.nih.gov/MedlinePlus/)25 were searched for the term diabetic nerve pain, as patients would most likely use this expression.26 The searches were conducted under the "private browsing" setting so that they would not be influenced by previous browsing history. The first 200 websites were independently visited and screened. Websites were excluded if they met one of the following criteria: nonfunctioning/nonaccessible links, denied direct access through a password requirement, were not in English, or failed to provide any information related to diabetic neuropathy. Different web pages listed from the same website were not excluded. The workflow is shown in Figure 1.

Figure 1.  Flowchart of data processing.

Classification of Website Affiliations

Websites were classified according to affiliation as professional (ie, government, major medical centers, libraries, universities, online journals, and other educational institutions), commercial (ie, pharmaceutical companies), health portal, patient group (ie, blogs and forums written by patients, chat rooms, and support groups), or other/nonprofit organization (ie, charitable organizations, associations), as described in Table 2.

Data Analysis

When average values are shown, data are reported as the mean ± standard deviation (SD). Multiple comparisons for nonparametric variables were performed using the Kruskal-Wallis test followed by Dunn's test, using GraphPad Prism software (GraphPad Software, La Jolla, California).
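For illustration, the same comparison could be run with open-source tools. Below is a minimal sketch assuming per-engine lists of reliability scores, using SciPy for the Kruskal-Wallis test and the scikit-posthocs package for Dunn's test; the study itself used GraphPad Prism, and the scores shown here are hypothetical.

```python
# Hypothetical re-implementation of the paper's statistics in Python,
# not the analysis actually run (which used GraphPad Prism).
from scipy.stats import kruskal
import scikit_posthocs as sp
import pandas as pd

scores = {  # hypothetical reliability scores per search engine
    "Google": [8, 7, 9, 6, 8],
    "Yahoo": [5, 6, 4, 7, 5],
    "Bing": [6, 5, 5, 6, 4],
    "Ask": [7, 6, 8, 5, 6],
    "MedlinePlus": [7, 8, 6, 7, 9],
}

# Omnibus nonparametric comparison across all engines
h, p = kruskal(*scores.values())
print(f"Kruskal-Wallis H = {h:.2f}, P = {p:.4f}")

# Dunn's post hoc test with multiple-comparison correction
long = pd.DataFrame(
    [(engine, s) for engine, vals in scores.items() for s in vals],
    columns=["engine", "score"],
)
print(sp.posthoc_dunn(long, val_col="score", group_col="engine",
                      p_adjust="bonferroni"))
```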

Results

Distribution of Websites

Websites derived from each search engine were categorized according to affiliation: patient group (PG), commercial (C), health portal (HP), professional (P), or other/nonprofit (O/NP). The distribution of website types was similar across the different search engines, as shown in Table 3, with the notable exception of MedlinePlus, which did not return any PG or C websites.

Reliability Scoring of Websites

The mean reliability scores of websites identified by the different search engines are shown in Figure 2. It is interesting to note that Google returned websites with a significantly higher average reliability score than Yahoo, Bing, and, surprisingly, MedlinePlus. Figure 3 reports the breakdown of the mean reliability score by affiliation for each of the 5 search engines. HP and P sites had statistically significantly higher reliability scores than PG, C, and O/NP sites. This was true for all 4 generic search engines, whereas among the websites returned by MedlinePlus, only P sites had a significantly higher reliability score.

Reliability of Websites According to JAMA Benchmark Criteria

The same websites were analyzed again using the JAMA criteria. Figure 4 shows the mean JAMA quality scores for websites analyzed according to search engine and affiliation. Similar to what was observed with the composite score used in Figure 3, HP and P sites obtained the highest percentage of reliable sites for all 4 main search engines. There were also differences in the reliability of websites returned by the different engines: the total score of the websites returned by Google was significantly higher than that of websites identified with Yahoo or Bing. The percentage of sites achieving a JAMA quality criteria score of ≥3 was 51% for Google, 49% for Ask, 37% for Bing, and 34% for both Yahoo and MedlinePlus.

Number of HON Code–Certified Sites


Table 2
Classification of Website Affiliations

Professional (P): Site published by a person or organization with professional knowledge of the data (eg, government, institutions, libraries, universities, publishers, online journals, and other educational institutions). Examples: joslin.org, cdc.gov, medicalcenter.osu.edu.

Commercial (C): A site that buys, sells, or provides a service for a fee (eg, for-profit organizations). Examples: cymbalta.com, lyrica.com, pfizer.co.za.

Health portal (HP): A website or search engine that provides information on a wide scope of health and medical subjects (eg, health blogs). Examples: webmd.com, mayoclinic.com, medicinenet.com.

Patient group (PG): A website designed specifically for or by patients (eg, patient blogs, patient forums, chat rooms, and support groups). Examples: diabetesforum.com, neuropathy-treatment.org, diabeticnervepainleg.blogspot.com.

Other/nonprofit (O/NP): A site that exists purely for educational or charitable reasons, with no financial beneficiaries, which does not fit into any of the other affiliations (eg, charitable organizations, associations, and news portals). Examples: diabeteshome.ca, diabetespainhelp.com, diabetes.org.

Table 3
Distribution of Websites by Affiliation and Search Engine

                 Google, %   Yahoo, %   Bing, %   Ask, %   Medline, %
Patient group        7          18         16       10         0
Commercial          20          13         14       18         0
Health portal       30          33         30       34        50
Professional        29          20         19       19        32
Other               14          16         21       19        18

Total number of websites analyzed was as follows: Google, 187; Yahoo, 194; Bing, 196; Ask, 186; and MedlinePlus, 198.

Figure 5 shows the number of HON code–affiliated websites for the different search engines and their breakdown by affiliation. MedlinePlus and Google had the highest proportions of HON-accredited sites (white bars; 23% and 20%, respectively), while Yahoo, Bing, and Ask had 13%, 12%, and 13%, respectively. With all search engines, HP sites accounted for the greatest number of HON-certified sites. The HON code–accredited sites returned by MedlinePlus, however, were evenly split between HP and P sites, indicating that the professional sites identified by MedlinePlus are more likely to be HON compliant than those identified by the other search engines.

Relationship Between JAMA Quality Criteria and HON Code Certification


Figure 2.  Mean reliability scores by search engine. Values bearing the same label are significantly different from each other. Uppercase letters, P < .0001; lowercase letters, P < .001; numbers, P < .01.

Websites were evaluated to investigate a potential correlation between the JAMA quality criteria score and HON code certification (Table 4). For all of the search engines, the mean JAMA score was significantly higher in the HON code–accredited sites than in those without HON code certification (for all search engines, P < .0001 by the Mann-Whitney test comparing HON– and HON+ means).
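As an illustration, a minimal sketch of this two-group comparison using SciPy's Mann-Whitney U test follows; the analysis in the study was done in GraphPad Prism, and the score lists below are hypothetical.

```python
# Hypothetical sketch of the HON+ vs HON- comparison of JAMA scores.
from scipy.stats import mannwhitneyu

jama_hon_plus = [4, 4, 3, 4, 4, 3, 4]   # JAMA scores of HON-certified sites
jama_hon_minus = [2, 1, 3, 2, 2, 3, 1]  # JAMA scores of non-certified sites

u, p = mannwhitneyu(jama_hon_plus, jama_hon_minus, alternative="two-sided")
print(f"Mann-Whitney U = {u}, P = {p:.4f}")
```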

Subgroup Analysis: Reliability Scores

Research indicates that approximately 75% of users do not look beyond the first page (ie, the first 10 sites) of search results, while 90% view only the first or second page (ie, the first 20 sites).27 Given this, the first 20 results, derived from the first 2 pages of each search engine, were analyzed separately. Table 5 shows the mean reliability score, the number of sites with a high JAMA score (≥3), and the number of sites displaying the HON seal among the top 20 sites returned by each search engine. There was no statistically significant difference among the search engines, although it should be noted that the smaller number of websites (20) in this subanalysis, compared with the 200 websites analyzed in Figure 2, makes the statistical analysis more conservative.
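A minimal sketch of this subgroup analysis follows, assuming a hypothetical `results` mapping from each engine to its websites in ranked order with per-site scores (not the authors' code).

```python
# Hypothetical sketch: restrict each engine's ranked result list to the
# first 20 sites (the first 2 pages) and summarize, as in Table 5.
from statistics import mean, stdev

def summarize_top(results, n=20):
    """Summarize the first n ranked results per engine: mean +/- SD of the
    reliability score, count of JAMA scores >= 3, and count of HON seals."""
    summary = {}
    for engine, sites in results.items():
        top = sites[:n]  # first 2 pages of results, roughly 10 per page
        scores = [s["reliability"] for s in top]
        summary[engine] = {
            "reliability_mean": mean(scores),
            "reliability_sd": stdev(scores),
            "jama_ge_3": sum(s["jama"] >= 3 for s in top),
            "hon_certified": sum(s["hon"] for s in top),
        }
    return summary
```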

Discussion

Reliability and Website Affiliation

This study provides an insight into the reliability of online information related to diabetic neuropathy, with a particular focus on website affiliation and search engine type. Specifically, this study evaluated the reliability, or trustworthiness, of the diabetic neuropathy–associated websites most likely to be viewed by patients. Overall, reliability scores varied between search engines and, in particular, between affiliations. The predominant finding was the link between website affiliation and reliability. Health portals and professional sites were found to be the most reliable, obtaining significantly higher reliability scores, based on both the derived reliability scoring system and the JAMA benchmarks, than patient group, commercial, and other/nonprofit sites. Our study also revealed that HP sites were more likely to be HON code accredited than the other groups.

Previous studies have also assessed possible relationships between website quality and affiliation, with varied results. For instance, in a quality assessment of online Hebrew-language health information about oral contraceptives, health management organizations and health portals were found to have higher credibility scores than other groups, based on the HON code principles.28 Bedell et al18 reported diabetes-specific commercial sites as more reliable than institutional or governmental sites, according to 3 reliability markers (author identification, available experts, and HON code logo). In a separate study, of online breast cancer information, commercial sites were more likely to display the HON code seal than professional groups or nonprofit organizations.29 However, although the seal is an indicator of reliability, this does not necessarily mean that commercial sites were more reliable overall, as HON code approval is a voluntary process that must be initiated by the website publisher. Sites that do not display the HON code seal may simply not have applied for approval, as opposed to failing to adhere to the principles.29

HON Code and JAMA Benchmarks

An additional observation of this study is the correlation between HON code certification and JAMA quality criteria score. Although the two are distinct quality assessment tools, the HON code encompasses all 4 of the JAMA criteria. One would therefore expect HON code–accredited sites to comply with all 4 JAMA criteria, or at least 3. Although most HON code–certified sites obtained a maximum JAMA score, a fair proportion failed to do so. This is consistent with other studies, in which numerous sites displaying a HON code seal were observed to fail to comply with all 8 HON code principles.29,30


Figure 3.  Mean reliability scores by search engine and affiliation. Values bearing the same label are significantly different from each other. Uppercase letters, P < .0001; lowercase letters, P < .001; numbers, P < .01.

Figure 4.  Mean JAMA scores by search engine and affiliation. Values bearing the same label are significantly different from each other. Total number of websites analyzed was as follows: Google, 187; Yahoo, 194; Bing, 196; Ask, 186; and MedlinePlus, 198. Uppercase letters, P < .0001; lowercase letters, P < .001; numbers, P < .01; symbols, P < .05.

Assessor subjectivity and variation in the interpretation of the criteria may provide a possible explanation for this apparent failing. Furthermore, websites that display the HON code seal are not routinely assessed, and compliance is not systematically enforced; thus, these sites may not continue to strictly adhere to the principles following approval.29


Figure 5.  Number of Health On the Net Foundation (HON) code–affiliated websites by search engine and affiliation. Total number of websites analyzed was as follows: Google, 187; Yahoo, 194; Bing, 196; Ask, 186; and MedlinePlus, 198.

Table 4
Mean JAMA Quality Criteria Score in Websites With and Without HON Code Accreditation

               HON Code −    HON Code +
Google         2.3 ± 1.2     3.7 ± 0.6 (38)
Yahoo          2.0 ± 1.1     3.4 ± 0.7 (25)
Bing           2.0 ± 1.1     3.5 ± 0.7 (24)
Ask            2.3 ± 1.2     3.6 ± 0.5 (24)
MedlinePlus    2.2 ± 0.8     3.8 ± 0.6 (46)

Data are mean ± SD. The number of HON+ sites is shown in parentheses. Total number of websites analyzed was as follows: Google, 187; Yahoo, 194; Bing, 196; Ask, 186; and MedlinePlus, 198. The JAMA score ranges from 0 to 4, with 4 being the highest.
Abbreviations: HON, Health On the Net Foundation; JAMA, Journal of the American Medical Association; SD, standard deviation.

The JAMA benchmarks, however, provide a condensed and relatively easy tool for assessing the reliability of websites and have been shown to correlate with higher levels of accuracy.29,31,32

Reliability of Websites Derived From Different Search Engines

There is limited research on the reliability of health-related websites derived from different search engines. One study of note, by Maloney et al,33 appraised the quality and validity of online information about osteoarthritis, with a focus on search engine type. A total of 15 search engines, representing medical, general, and meta-search engines, were analyzed and compared. Medical portals, such as MedlinePlus, were found to retrieve higher-quality information according to the DISCERN rating instrument, which appraises the quality of health information related to treatment choices, including the description of their benefits and their scientific basis, while there was no significant difference between search engine types in the relevance of the information sourced.33 It was therefore expected that the medical search engine in the present study would return a higher number of quality websites, being a more concise and relevant database.33

A degree of overlap was noted between the 4 general search engines, particularly between Yahoo and Bing, as noted in previous studies.18,33 Of all the engines, Google returned higher-quality websites in most of the analyses performed in this study. This is a reassuring finding, given that Google is the most popular search engine.25 Interestingly, websites derived from MedlinePlus had reliability scores similar to those from Yahoo and Bing and significantly lower JAMA scores than those from Google and Ask. This is an unexpected observation, particularly since MedlinePlus returned mainly health portals and professional sites, which were generally found to have higher reliability scores.


Table 5
Reliability of the First 20 Websites From Each Search Engine

                                Google      Yahoo       Bing        Ask         Medline
Reliability score, mean ± SD    6.5 ± 2.0   5.9 ± 2.3   5.6 ± 2.6   6.0 ± 3.0   7.5 ± 2.4
JAMA score ≥3, No. (%)          9 (45)      9 (45)      8 (40)      11 (65)     11 (55)
HON code +, No. (%)             4 (20)      5 (25)      4 (20)      2 (12)      2 (10)

Abbreviations: HON, Health On the Net Foundation; JAMA, Journal of the American Medical Association; SD, standard deviation.

Such a finding may be attributed to the large number of similar web pages, from the same website, listed in the MedlinePlus results. Nonetheless, MedlinePlus did have the highest number of HON code–accredited sites, although one should bear in mind the limitations of the HON code, as discussed earlier. Reliability scores were also shown to be similar across the 5 search engines in the subgroup analysis. This is perhaps more significant, given that patients are more likely to access these top-ranked sites.27

Conclusions and Limitations of the Study

The derived reliability scoring system used in this study may provide the basis for a website evaluation tool that can be used by patients and health professionals alike. The instrument aims to use objective criteria to assess the reliability of online information and encompasses the key criteria of other reliability assessment tools. Our study suggests that the derived scoring system is successful in identifying reliable websites and is comparable to other validated assessment tools (ie, the JAMA benchmarks). Further research would be needed to assess the relevance of the scores obtained and the overall validity of this instrument. Additional assessment tools would be required to cover areas such as readability, accuracy, and usability.

It is important to note that this study was not intended to evaluate the scientific value of the information provided, such as whether the reader is directed toward evidence-based treatments supported by randomized controlled clinical trials and/or approved by regulatory agencies. In fact, most studies dealing with the "quality of health information" on the web, as well as scoring or accreditation systems such as the JAMA criteria or the HON accreditation, do not take into consideration the actual medical validity of the information. It would be important, for instance, to know whether a website recommends a treatment for which there is scientific evidence of efficacy or one of unproven efficacy. This is probably the main limitation of this approach; a more sophisticated analysis would require examining the text presented by the websites and comparing it with a "gold standard," as done in other studies.18,29 Future research should be designed to evaluate in depth what information the websites provide and to correlate it with the existing "quality" criteria.

Another limitation of this approach is that a website may have more than 1 page returned by the search engine: if a particularly low-scoring site was listed multiple times in the search results, this would lower the overall reliability score for that search engine, while a repeatedly listed high-scoring site would have the opposite effect. Furthermore, differences in study design may account for the variation seen between previous studies. Eliminating repeated websites, as well as comparing additional medical search engines, would strengthen the comparisons.

Future studies should extend this type of analysis, which was limited here to the reliability/trustworthiness of the websites, to other dimensions of health information quality. These should include accessibility, both in terms of readability and of open access; an analysis of the scientific validity of the information provided, such as whether a website suggests the use of treatments whose efficacy has been demonstrated scientifically; and the emerging issue of whether a commercial website is marketing counterfeit medicinal products.

Implications for Educators

Web searches for health issues are frequently considered unreliable.


Our searches on neuropathic pain using several widely used search engines and a government-sponsored health portal showed (1) that health and professional portals are more trustworthy and (2) that Google, through its way of ranking websites, provides the most reliable information, even compared with the government-sponsored MedlinePlus. This is, of course, true for the basic level of reliability assessed by our quality assessment tool, which extends existing tools (JAMA, HON). The research will need to be extended to assess the validity of the information provided by the websites and the way end users rely on these websites (ie, to explore actual user behavior).

This initial study suggests that some of the criticism of web-based information retrieved from commercial search engines is unfounded and that such information is, at least according to the criteria used here, not particularly biased. Further research confirming these findings for other conditions is required. In addition, the findings imply that Google in particular is just as likely, or even more likely, to find high-quality information, something that is just as important for health professionals as it is for patients. These findings, including the different results obtained with different search engines, underline the importance of educating patients to assess the reliability of health-related websites, and thus of avoiding the confusion between computer literacy (being able to use hardware and software) and information literacy, which enables them to identify high-quality information on their condition.

References

1. Fox S, Jones S. Health Online 2013. Pew Research Center. http://pewinternet.org/Reports/2013/Health-online.aspx. Accessed June 2, 2014.
2. Fox S. The Social Life of Health Information. Washington, DC: Pew Internet & American Life Project; 2011.
3. Fox S, Jones S. The Social Life of Health Information. Washington, DC: Pew Internet & American Life Project; 2009.
4. Fox S. Search engines. Pew Research Center. http://www.pewinternet.org/2002/07/03/search-engines. Accessed June 2, 2014.
5. Hider PN, Griffin G, Walker M, Coughlan E. The information-seeking behavior of clinical staff in a large health care organization. J Med Libr Assoc. 2009;97(1):47-50.
6. Falagas ME, Ntziora F, Makris GC, Malietzis GA, Rafailidis PI. Do PubMed and Google searches help medical students and young doctors reach the correct diagnosis? A pilot study. Eur J Intern Med. 2009;20(8):788-790.
7. Nalliah S, Chan SL, Ong CL, et al. Effectiveness of the use of Internet search by third year medical students to establish a clinical diagnosis. Singapore Med J. 2010;51(4):332-338.

8. Berland GK, Elliott MN, Morales LS, et al. Health information on the Internet: accessibility, quality, and readability in English and Spanish. JAMA. 2001;285(20):2612-2621.
9. Eysenbach G, Kohler C. How do consumers search for and appraise health information on the World Wide Web? Qualitative study using focus groups, usability tests, and in-depth interviews. BMJ. 2002;324(7337):573-577.
10. Lee C, Gray S, Lewis N. Internet use leads cancer patients to be active health care consumers. Patient Educ Couns. 2010;81(suppl):S63-S69.
11. Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: Caveant lector et viewor—let the reader and viewer beware. JAMA. 1997;277(15):1244-1245.
12. Health On the Net Foundation. The HON code of conduct for medical and health web sites (HONcode). http://www.hon.ch. Accessed June 2, 2014.
13. Lustria MLA. Can interactivity make a difference? Effects of interactivity on the comprehension of and attitudes toward online health content. J Am Soc Inf Sci Technol. 2007;58(6):766-776.
14. McInnes N, Haglund BJ. Readability of online health information: implications for health literacy. Inform Health Soc Care. 2011;36(4):173-189.
15. Leroy G, Helmreich S, Cowie JR, Miller T, Zheng W. Evaluating online health information: beyond readability formulas. AMIA Annu Symp Proc. 2008;2008:394-398.
16. Wagner T, Baker L, Bundorf M, Singer S. Use of the Internet for health information by the chronically ill. Prev Chronic Dis. 2004;1(4):A13.
17. Haviland MG, Pincus HA, Dial TH. Type of illness and use of the Internet for health information. Psychiatr Serv. 2003;54(9):1198.
18. Bedell SE, Agrawal A, Petersen LE. A systematic critique of diabetes on the World Wide Web for patients and their physicians. Int J Med Inform. 2004;73(9-10):687-694.
19. Hansberry DR, Suresh R, Agarwal N, Heary RF, Goldstein IM. Quality assessment of online patient education resources for peripheral neuropathy. J Peripher Nerv Syst. 2013;18(1):44-47.
20. Health On the Net Foundation. The HON code of conduct for medical and health web sites (HONcode). February 4, 2013. http://www.hon.ch/HONcode/Conduct.html. Accessed February 10, 2013.
21. Teach L. Health-related web site evaluation form. Rollins School of Public Health, Emory University. http://www.sph.emory.edu/WELLNESS/instrument.html. Accessed February 12, 2013.
22. Peterlin BL, Gambini-Suarez E, Lidicker J, Levin M. An analysis of cluster headache information provided on Internet websites. Headache. 2008;48(3):378-384.
23. Martins EN, Morse LS. Evaluation of Internet websites about retinopathy of prematurity patient education. Br J Ophthalmol. 2005;89(5):565-568.
24. Zeldman J. Taking Your Talent to the Web: A Guide for the Transitioning Designer. Indianapolis, IN: New Riders Publishing; 2001.
25. Alexa. The top 500 sites on the web. http://www.alexa.com/topsites. Accessed November 3, 2012.
26. Lerner EB, Jehle DV, Janicke DM, Moscati RM. Medical communication: do our patients understand? Am J Emerg Med. 2000;18(7):764-766.
27. King A. Website Optimization. Sebastopol, CA: O'Reilly; 2008.


28. Neumark Y, Flum L, Lopez-Quintero C, Shtarkshall R. Quality of online health information about oral contraceptives from Hebrew-language websites. Isr J Health Policy Res. 2012;1(1):38.
29. Meric F, Bernstam EV, Mirza NQ, et al. Breast cancer on the World Wide Web: cross sectional survey of quality of information and popularity of websites. BMJ. 2002;324(7337):577-581.
30. Laversin S, Baujard V, Gaudinat A, Simonet MA, Boyer C. Improving the transparency of health information found on the Internet through the HONcode: a comparative study. Stud Health Technol Inform. 2011;169:654-658.
31. Ni Riordain R, McCreary C. Head and neck cancer information on the Internet: type, accuracy and content. Oral Oncol. 2009;45(8):675-677.
32. Lopez-Jornet P, Camacho-Alonso F. The quality of Internet sites providing information relating to oral cancer. Oral Oncol. 2009;45(9):e95-e98.
33. Maloney S, Ilic D, Green S. Accessibility, nature and quality of health information on the Internet: a survey on osteoarthritis. Rheumatology (Oxford). 2005;44(3):382-385.

