Health Care Policy and Quality • Original Research

Downloaded from www.ajronline.org by NYU Langone Med Ctr-Sch of Med on 05/31/15 from IP address 128.122.253.228. Copyright ARRS. For personal use only; all rights reserved


Health Literacy and Online Educational Resources: An Opportunity to Educate Patients

David R. Hansberry1, Nitin Agarwal2, Stephen R. Baker1

OBJECTIVE. Given the increasing accessibility of material on the Internet and its use by patients as a source of health care information, the purpose of this study was to quantitatively evaluate the readability of resources on the European Society of Radiology (ESR) website and to determine whether these materials meet the health literacy needs of the general public as set forth in the guidelines of the U.S. National Institutes of Health (NIH) and the American Medical Association (AMA).

MATERIALS AND METHODS. All 41 patient education articles created by the ESR were downloaded and analyzed with the following 10 quantitative readability scales: the Coleman-Liau Index, Flesch-Kincaid Grade Level, Flesch Reading Ease, FORCAST Formula, Fry Graph, Gunning Fog Index, New Dale-Chall, New Fog Count, Raygor Reading Estimate, and the Simple Measure of Gobbledygook.

RESULTS. The 41 articles were written collectively at a mean grade level of 13.0 ± 1.6 (range, 10.8–17.2). For full understanding of the material, 73.2% of the articles required the reading comprehension level of, at minimum, a high school graduate (12th grade).

CONCLUSION. The patient education resources on the ESR website are written at a comprehension level well above that of the average Internet viewer and fail to meet the NIH and AMA guidelines that patient education material be written between the third and seventh grade levels. Recasting these resources in a simpler format would probably lead to greater comprehension by ESR website viewers.

Keywords: health literacy, Internet, patient education, radiology, readability

DOI:10.2214/AJR.14.13086

Received April 28, 2014; accepted after revision June 30, 2014.

1 Department of Radiology, New Jersey Medical School, Rutgers University, 150 Bergen St, Rm C320, Newark, NJ 07101-1709. Address correspondence to D. R. Hansberry ([email protected]).

2 Department of Neurological Surgery, New Jersey Medical School, Rutgers University, Newark, NJ.

AJR 2015; 204:111–116 0361–803X/15/2041–111 © American Roentgen Ray Society

The use of the Internet as a source of health care information has become routine for many individuals. According to several studies, 63% of U.S. Internet users and 71% of European Internet users rely on it as a source of health care information [1–3]. Those who seek health care information on the Internet do so for a variety of reasons. For example, in one study [4], 81% of participants knew someone who had recently received the diagnosis of a medical condition; 58% had themselves received the diagnosis of a medical condition; and 56% had been given a prescription for a new medication or treatment regimen. As physicians, we encourage shared clinical decision making with our patients and want them to be informed about their health and their health care options. Studies have shown a positive effect on the physician-patient relationship when the patient actively participates, a circumstance that tends to result in better patient compliance [5–8]. It is the general expectation that our ability to inform patients using online education resources is commensurate with their ability to comprehend such information.

Among adults, 35% of Americans and 47% of European Union residents have limited health literacy, manifested as struggling to understand the directions on a prescription bottle [9, 10]. The American Medical Association (AMA) has cited health literacy as the strongest independent predictor of health status and noted that poor health literacy correlates with increased risk of hospitalization [11]. This deficiency in comprehension has economic consequences. The U.S. Department of Health and Human Services Office of Disease Prevention and Health Promotion [12] has estimated that limited health literacy costs the U.S. economy $106–236 billion annually. In addition, by one estimate [13], the annual cost of health care of Medicaid enrollees with limited health literacy in the United States was 437% of that of all Medicaid enrollees.

The AMA and the U.S. National Institutes of Health (NIH) have recommended that patient education materials be composed at a third to seventh grade level to meet the needs of the average American, who reads at the eighth grade level [14, 15]. Previous studies have shown a disparity between the readability of the text in the patient education sections of major medical societies' web pages and the health literacy of the general public [16–26]. A 2013 article in JAMA Internal Medicine showed that major medical societies uniformly fail to meet the needs of average individuals using the Internet because the text offered is pitched at too high a level of comprehension. Studies of radiologic societies' web pages have found a similar discrepancy between the sophistication of the textual narrative and the joint AMA and NIH guidelines, including patient education resources from a website cosponsored by the American College of Radiology and the Radiological Society of North America [25], the American Society of Neuroradiology website [26], the Cardiovascular and Interventional Radiology Society of Europe website, and the Society of Interventional Radiology website [26]. Given the complex nature of diagnostic and interventional radiology, it is critical that the resources presented to patients and their caregivers be appropriately written to maximize the potential comprehension of the material. In this study, we reviewed and quantitatively evaluated the readability of patient education resources readily available at the European Society of Radiology (ESR) website.

Materials and Methods
In September 2013, all of the patient education materials available at the ESR website were downloaded as Microsoft Word (version 2010) files for readability analysis (Readability Studio Professional Edition, version 2012.1, Oleander Software) with 10 readability scales. Before analysis of the 41 patient education articles, all images, references, and copyright information were removed from the text. The 10 readability scales used to analyze the articles were the Coleman-Liau Index [27], Flesch-Kincaid Grade Level [28], Flesch Reading Ease [29], FORCAST Formula (FORCAST is an acronym derived from the developers' surnames: Ford, Caylor, and Sticht) [30], Fry Graph [31], Gunning Fog Index [32], New Dale-Chall [33], New Fog Count [28], Raygor Reading Estimate [34], and the Simple Measure of Gobbledygook (SMOG) [35]. On the Flesch Reading Ease scale, text is graded on a 0–100 scale, with scores of 0–30 corresponding to very difficult; 30–50, difficult; 50–60, fairly difficult; 60–70, standard; 70–80, fairly easy; 80–90, easy; and 90–100, very easy. On the nine other readability scales, the score correlates with the academic grade level required to understand the material (Table 1).

TABLE 1: Formulas for the 10 Readability Scales

Coleman-Liau Index
  Elements: average no. of letters per 100 words (L), average no. of sentences per 100 words (S)
  Formula: (0.0588 × L) − (0.296 × S) − 15.8

Flesch-Kincaid Grade Level
  Elements: average no. of syllables per word (SY), average no. of words per sentence (W)
  Formula: (0.39 × W) + (11.8 × SY) − 15.59

Flesch Reading Ease
  Elements: no. of syllables (B), no. of words (W), no. of sentences (S)
  Formula: 206.835 − [84.6 × (B / W)] − [1.015 × (W / S)]

FORCAST Formula
  Elements: no. of single-syllable words in a 150-word sample (SS)
  Formula: 20 − (SS / 10)

Fry Graph
  Elements: average no. of sentences and syllables per 100 words
  Procedure: 1. Extract a 100-word passage from the selection. 2. Count the number of sentences in each passage (count a half sentence as 0.5). 3. Count the number of syllables in each passage. 4. Find the point on the chart (3 samples recommended for best results).

Gunning Fog Index
  Elements: no. of sentences (S), no. of words (W), no. of words with ≥ 3 syllables (C)
  Formula: 0.4 × {(W / S) + [(C / W) × 100]}

New Dale-Chall
  Elements: average no. of words per sentence (AW), percentage of unfamiliar words (%U)
  Formula: (0.1579 × %U) + (0.0496 × AW)

New Fog Count
  Elements: no. of complex words (C), no. of easy words (E), no. of sentences (S)
  Formula: ({[E + (3 × C)] / S} − 3) / 2

Raygor Readability Estimate
  Elements: average no. of sentences and long (≥ 6 characters) words per 100 words
  Procedure: 1. Select a 100-word passage from the selection. 2. Count the number of sentences, estimated to the nearest tenth. 3. Count the number of words that are ≥ 6 letters. 4. Find the point on the chart (3 samples recommended for best results).

SMOG Readability Formula
  Elements: no. of words with ≥ 3 syllables (C), no. of sentences (S)
  Formula: 1.043 × √(C × (30 / S)) + 3.1291

Note—SMOG = Simple Measure of Gobbledygook.
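The closed-form formulas in Table 1 can be computed directly from token counts. The sketch below is a minimal illustration, not the study's actual tooling (the authors used Readability Studio's calibrated parsers): it assumes a naive vowel-group syllable counter and a regex sentence splitter, so its scores will differ somewhat from the published ones.

```python
import re

def counts(text):
    """Naive token counts: sentences, words, and per-word syllable counts.
    A rough stand-in for the calibrated text parsing in commercial tools."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    # Approximate syllables as runs of vowels (at least 1 per word).
    syllables = [max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words]
    return sentences, words, syllables

def flesch_kincaid_grade(text):
    # (0.39 x W) + (11.8 x SY) - 15.59, per Table 1
    s, words, syl = counts(text)
    w = len(words)
    return 0.39 * (w / s) + 11.8 * (sum(syl) / w) - 15.59

def gunning_fog(text):
    # 0.4 x [(W / S) + 100 x (C / W)], C = words with >= 3 syllables
    s, words, syl = counts(text)
    w = len(words)
    complex_words = sum(1 for k in syl if k >= 3)
    return 0.4 * ((w / s) + 100 * complex_words / w)

def smog(text):
    # 1.043 x sqrt(C x 30 / S) + 3.1291
    s, _, syl = counts(text)
    poly = sum(1 for k in syl if k >= 3)
    return 1.043 * (poly * 30 / s) ** 0.5 + 3.1291

def flesch_reading_ease_band(score):
    """Map a 0-100 Flesch Reading Ease score to its difficulty label."""
    bands = [(90, "very easy"), (80, "easy"), (70, "fairly easy"),
             (60, "standard"), (50, "fairly difficult"), (30, "difficult")]
    for cutoff, label in bands:
        if score >= cutoff:
            return label
    return "very difficult"
```

For instance, the mean Flesch Reading Ease score of 37.2 reported in the Results maps through `flesch_reading_ease_band` to "difficult," the same label the text assigns.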

Results
Collectively, when averaged across the nine readability scales that yield a grade level, the 41 articles at the ESR website were found to be written at a mean grade level of 13.0 ± 1.6 (Fig. 1 and Table 2). The individual articles ranged from grade level 10.8 to grade level 17.2 (Fig. 1 and Table 2). Thirty of the 41 articles (73.2%) required the comprehension level of a high school graduate (12th grade). None of the articles met the AMA and NIH recommendations that patient education resources be written between the third and seventh grade levels.

In evaluating the 41 articles with the individual readability scales, we did not find much variation between scales. The Fry Graph, Gunning Fog Index, and SMOG yielded the highest grade levels, with mean scores of 15.2 ± 1.8, 14.6 ± 1.4, and 14.6 ± 1.4 (Fig. 2 and Table 2). The articles had an average New Fog Count score of 10.2 ± 1.9. Of the 41 articles analyzed with the New Fog Count, only 11 (26.8%) had a readability level below ninth grade, and the lowest was still overly high at the 8.0 grade level. Results with the FORCAST Formula showed the articles to be written at the 11.5 ± 0.8 grade level. The New Dale-Chall formula showed the 41 articles to be written at a 12.5 ± 2.0 grade level; Flesch-Kincaid Grade Level, 12.6 ± 2.1; Coleman-Liau Index, 12.7 ± 1.9; and Raygor Reading Estimate, 12.9 ± 2.4. Results with the Flesch Reading Ease scale showed the 41 articles to have a score of 37.2 ± 12.3, which corresponds to difficult; 92.7% (38/41) of the articles were written at a difficult or very difficult level.

Fig. 1 (not reproduced)—Graph shows reading grade level of 41 individual articles downloaded from European Society of Radiology (ESR) patient education website. Wide vertical rectangle represents third to seventh grade level recommended in guidelines of U.S. National Institutes of Health and American Medical Association. Thin vertical line represents average grade level of all patient education articles.

Fig. 2 (not reproduced)—Reading scales for 41 articles from European Society of Radiology patient education website. A, Fry Graph. B, Raygor Reading Estimate.

TABLE 2: European Society of Radiology Patient Education Articles (n = 41) and Corresponding Scores on 10 Readability Scales

Topic | FRE | CLI | NDC | FKGL | FORCAST | Fry | GFI | NFC | Raygor | SMOG | Average | SD
Abdomen | 29 | 15 | 16+ | 14.1 | 12.1 | 17 | 16 | 12.5 | 17 | 15.2 | 14.9 | 1.8
Bone density imaging | 37 | 12 | 11.5 | 12.2 | 11.4 | 17 | 14 | 8.8 | 12 | 14.3 | 12.6 | 2.3
Chest and thorax | 28 | 15 | 14 | 13.9 | 12 | 17 | 14 | 10.6 | 17 | 15.3 | 14.3 | 2.1
CT | 47 | 11 | 11.5 | 10.9 | 11 | 13 | 13 | 8.6 | 11 | 13.4 | 11.5 | 1.5
CT of the brain | 45 | 11 | 11.5 | 11.5 | 10.8 | 14 | 14 | 9.7 | 12 | 13.9 | 12.0 | 1.5
CT of the chest | 46 | 11 | 11.5 | 11.1 | 11.1 | 14 | 13 | 8.8 | 12 | 13.6 | 11.8 | 1.6
CT of the heart | 46 | 11 | 11.5 | 11.2 | 10.9 | 14 | 13 | 8.1 | 11 | 13.8 | 11.6 | 1.8
CT of the large thoracic blood vessels | 45 | 11 | 11.5 | 11.2 | 10.9 | 14 | 14 | 9.3 | 11 | 13.7 | 11.8 | 1.6
CT of the upper abdominal organs | 40 | 12 | 11.5 | 12 | 11.7 | 16 | 14 | 9.4 | 13 | 14.2 | 12.7 | 1.9
Genitourinary tract | 9 | 17 | 16+ | 15.9 | 13.6 | 17 | 16 | 8.6 | 17+ | 15.2 | 15.1 | 2.6
Glossary | 37 | 13 | 14 | 12.4 | 11.7 | 17 | 14 | 9.4 | 13 | 14.4 | 13.2 | 2.1
Head and neck | 37 | 14 | 16+ | 12.9 | 11.3 | 16 | 16 | 12.1 | 17 | 15.3 | 14.5 | 2.0
Interventional radiology | 1 | 19+ | 16+ | 19+ | 13.3 | 17 | 18 | 10.6 | 17 | 19+ | 16.5 | 2.9
Lung scan | 48 | 11 | 11.5 | 12.1 | 10.1 | 12 | 15 | 12.4 | 10 | 14 | 12.0 | 1.6
MRI | 47 | 12 | 11.5 | 11.1 | 11.1 | 13 | 13 | 9.4 | 11 | 13.4 | 11.7 | 1.3
MRI of the brain | 45 | 12 | 11.5 | 11.8 | 11 | 13 | 14 | 10.1 | 11 | 13.9 | 12.0 | 1.3
MRI of the joints | 45 | 12 | 11.5 | 11.3 | 11.1 | 14 | 13 | 9.4 | 11 | 13.5 | 11.9 | 1.5
MRI of the renal arteries | 44 | 12 | 11.5 | 11.9 | 11.1 | 14 | 14 | 10.2 | 11 | 13.9 | 12.1 | 1.4
MRI of the spine | 45 | 12 | 11.5 | 11.2 | 11.1 | 14 | 13 | 9.2 | 11 | 13.5 | 11.8 | 1.5
Molecular imaging | 16 | 16 | 16+ | 16.5 | 12.7 | 17 | 18 | 13.4 | 17 | 18.3 | 16.1 | 1.9
Musculoskeletal | 24 | 14 | 16+ | 16.5 | 13 | 17 | 16 | 15.3 | 17 | 15 | 15.6 | 1.3
Neurologic | 0 | 18 | 16+ | 19+ | 12.7 | 17 | 19+ | 17.5 | 17 | 19+ | 17.2 | 2.0
Nuclear medicine | 31 | 14 | 14 | 13.8 | 11.7 | 17 | 15 | 11.2 | 13 | 15.1 | 13.8 | 1.8
Radiography | 36 | 13 | 11.5 | 12.4 | 11.9 | 17 | 14 | 8.7 | 13 | 14.1 | 12.8 | 2.3
Radiography of bone fractures | 38 | 12 | 11.5 | 12.6 | 11.8 | 16 | 16 | 10.9 | 13 | 14.7 | 13.1 | 1.8
Radiography of the abdomen | 31 | 13 | 14 | 13.5 | 11.8 | 17 | 16 | 9.9 | 13 | 15.4 | 13.7 | 2.1
Radiography of the breast (mammography) | 37 | 13 | 11.5 | 13.1 | 11.2 | 16 | 15 | 10.7 | 17 | 15.1 | 13.6 | 2.3
Radiography of the joints | 39 | 12 | 11.5 | 12 | 11.5 | 16 | 15 | 10.3 | 13 | 14.5 | 12.9 | 1.9
Radiography of the kidneys | 42 | 12 | 11.5 | 11.7 | 11.5 | 15 | 13 | 8.9 | 13 | 13.6 | 12.3 | 1.7
Radiography of the lung | 43 | 12 | 11.5 | 12 | 11.2 | 15 | 14 | 9.3 | 12 | 14.2 | 12.3 | 1.8
Radiography of the spine | 32 | 13 | 14 | 13.3 | 12.1 | 17 | 16 | 10.7 | 13 | 15.2 | 13.8 | 1.9
Radiography of the upper abdominal organs | 43 | 12 | 11.5 | 11.8 | 11.1 | 14 | 14 | 10.3 | 12 | 14.3 | 12.3 | 1.5
Radiography of the upper gastrointestinal tract | 33 | 14 | 14 | 13.3 | 11.8 | 17 | 15 | 11 | 13 | 14.9 | 13.8 | 1.8
Ultrasound | 43 | 12 | 11.5 | 11.7 | 11.1 | 15 | 14 | 9.9 | 11 | 14.3 | 12.3 | 1.8
Ultrasound of the breast | 51 | 11 | 9.5 | 10.3 | 10.2 | 12 | 13 | 8.8 | 10 | 13.5 | 10.9 | 1.6
Ultrasound of the abdomen in children | 38 | 12 | 11.5 | 12.7 | 11.2 | 16 | 15 | 11 | 12 | 15 | 13.0 | 1.9
Ultrasound of the arteries in the neck | 52 | 11 | 9.5 | 10.2 | 10.4 | 11 | 13 | 8.9 | 10 | 13.4 | 10.8 | 1.6
Ultrasound of the female lower abdomen | 41 | 12 | 11.5 | 11.4 | 11.4 | 16 | 14 | 9.2 | 12 | 13.9 | 12.4 | 2.0
Ultrasound of the thyroid | 51 | 11 | 9.5 | 9.8 | 10.8 | 12 | 13 | 8 | 10 | 12.8 | 10.8 | 1.6
Ultrasound of the veins | 49 | 11 | 9.5 | 10.5 | 10.8 | 13 | 13 | 8.9 | 11 | 13.5 | 11.3 | 1.6
Virtual endoscopy of the large intestine (CT colonography) | 35 | 13 | 14 | 12.8 | 11.5 | 17 | 15 | 9.8 | 13 | 14.7 | 13.4 | 2.1

Note—FRE = Flesch Reading Ease; CLI = Coleman-Liau Index; NDC = New Dale-Chall; FKGL = Flesch-Kincaid Grade Level; GFI = Gunning Fog Index; NFC = New Fog Count; SMOG = Simple Measure of Gobbledygook.

Discussion
The 41 articles available at the ESR website for Internet consumers uniformly fail to meet the guidelines set forth in the joint AMA and NIH recommendations and probably prevent more widespread understanding of the information. The reading scores suggest that, to comprehend 73.2% of the articles, a viewer must read at the level of a high school graduate (12th grade) or higher. This discrepancy is further magnified by the fact that the level of education completed is usually higher than an individual's level of health literacy and, consequently, his or her ability to comprehend health care material [14]. Another study [36] showed that more than 70% of patients have their treatment decision making affected by resources they view on the Internet. Even those with higher health literacy can have problems understanding health care-related information [37]. Given the increasing availability of the Internet and its widespread use as a source of medical information for patients and their caregivers, it is critical to provide material written in a sufficiently simple textual format. Despite this need, many radiologic, medical, and surgical society web pages have failed to provide text written at an appropriate level [23–26]. However, these educational resources can be redrafted to improve their readability and thereby reach and educate a more widespread audience. The AMA, the NIH, the U.S. Centers for Disease Control and Prevention, and others have developed guidelines on how to write health care materials at a level appropriate for general readership [14, 15, 38–41].

There were several limitations to our study. Our analysis relied on 10 readability scales to determine the readability of text. Although these indexes are well established, each is narrowly defined. The formulas rely on metrics, such as average number of words per sentence and average number of syllables per word, that may not reflect the difficulty of medical terminology (Table 1). For example, the use of a short word such as "stent," as opposed to a polysyllabic alternative, would decrease a readability score, but the term may not be common nomenclature for the general public. Conversely, a longer word such as "emergency" would increase a readability score even though the general public presumably understands its meaning. We minimized the inherent restriction of each index by using 10 distinct scales, each entailing different criteria in various combinations. Results with each showed that the patient education articles on the ESR website are written at a level higher than that called for in the AMA and NIH guidelines. Furthermore, the readability indexes fail to consider the effect of multimedia tools, such as images, videos, and audio supplementation. Nevertheless, multimedia displays, although helpful for patient understanding, do not by themselves overcome a textual component written at an overly elevated level of comprehension.

Inasmuch as well-educated health care professionals design and write the patient education materials on many web pages, it is perhaps not surprising that the material is written at a correspondingly high grade level. This appears to be the case for resources written for patients interested in diagnostic and interventional radiology. The ESR web page devoted to patient education is written at a level too complex for the general public, particularly persons with low health literacy, to use fully. If this material were rewritten in a simpler narrative, the general readership that would benefit from the material would likely increase.
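The headline Results statistics can be cross-checked from the Average column of Table 2. The sketch below hard-codes those per-article averages (transcribed from the table, in row order) and recomputes the summary figures; note that a sample standard deviation over the rounded table values comes out near 1.5, slightly below the published 1.6, presumably because the authors computed it from unrounded scores.

```python
from statistics import mean, stdev

# Per-article average grade levels from the "Average" column of Table 2,
# in row order (Abdomen ... virtual endoscopy of the large intestine).
averages = [
    14.9, 12.6, 14.3, 11.5, 12.0, 11.8, 11.6, 11.8, 12.7, 15.1,
    13.2, 14.5, 16.5, 12.0, 11.7, 12.0, 11.9, 12.1, 11.8, 16.1,
    15.6, 17.2, 13.8, 12.8, 13.1, 13.7, 13.6, 12.9, 12.3, 12.3,
    13.8, 12.3, 13.8, 12.3, 10.9, 13.0, 10.8, 12.4, 10.8, 11.3,
    13.4,
]

grand_mean = mean(averages)        # reported as 13.0
sample_sd = stdev(averages)        # reported as 1.6 (rounded inputs give ~1.5)
# Articles requiring at least 12th-grade comprehension: reported as 30/41 (73.2%)
at_or_above_12th = sum(a >= 12 for a in averages)

print(f"mean grade level: {grand_mean:.1f}")
print(f"articles at >= 12th grade: {at_or_above_12th}/{len(averages)} "
      f"({100 * at_or_above_12th / len(averages):.1f}%)")
```

Run as-is, this reproduces the mean grade level of 13.0 and the 30 of 41 articles (73.2%) at or above the 12th grade level.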

References
1. Andreassen HK, Bujnowska-Fedak MM, Chronaki CE, et al. European citizens' use of E-health services: a study of seven countries. BMC Public Health 2007; 7:53
2. Hesse BW, Nelson DE, Kreps GL, et al. Trust and sources of health information: the impact of the Internet and its implications for health care providers: findings from the first Health Information National Trends Survey. Arch Intern Med 2005; 165:2618–2624
3. Fox S. The social life of health information, 2011. Pew Research Internet Project website. www.pewinternet.org/2011/05/12/the-social-life-of-health-information2011. May 12, 2011. Accessed September 12, 2014
4. Rice RE. Influences, usage, and outcomes of Internet health information searching: multivariate results from the Pew surveys. Int J Med Inform 2006; 75:8–28
5. McMullan M. Patients using the Internet to obtain health information: how this affects the patient–health professional relationship. Patient Educ Couns 2006; 63:24–28
6. Roter DL. Patient participation in the patient-provider interaction: the effects of patient question asking on the quality of interaction, satisfaction and compliance. Health Educ Monogr 1977; 5:281–315
7. Simpson M, Buckman R, Stewart M, et al. Doctor-patient communication: the Toronto consensus statement. BMJ 1991; 303:1385–1387
8. Brown JB, Weston WW, Stewart MA. Patient-centred interviewing. Part II. Finding common ground. Can Fam Physician 1989; 35:153–157
9. U.S. Department of Health and Human Services website. America's health literacy: why we need accessible health information. www.health.gov/communication/literacy/issuebrief. 2008. Accessed September 12, 2014
10. HLS-EU Consortium. Comparative report of health literacy in eight EU member states: the European health literacy survey HLS-EU. www.health-literacy.eu. 2012. Accessed September 12, 2014
11. [No authors listed]. Health literacy: report of the Council on Scientific Affairs, Ad Hoc Committee on Health Literacy for the Council on Scientific Affairs, American Medical Association. JAMA 1999; 281:552–557
12. U.S. Department of Health and Human Services, Office of Disease Prevention and Health Promotion. National action plan to improve health literacy. Washington, DC: U.S. Department of Health and Human Services, 2010
13. Weiss BD, Blanchard JS, McGee DL, et al. Illiteracy among Medicaid recipients and its relationship to health care costs. J Health Care Poor Underserved 1994; 5:99–111
14. Weiss BD. Health literacy: a manual for clinicians. Chicago, IL: American Medical Association, American Medical Foundation, 2003
15. U.S. National Library of Medicine, National Institutes of Health. How to write easy to read health materials. MedlinePlus website. www.nlm.nih.gov/medlineplus/etr.html. Updated February 13, 2013. Accessed September 12, 2014
16. Agarwal N, Chaudhari A, Hansberry DR, Tomei KL, Prestigiacomo CJ. A comparative analysis of neurosurgical online education materials to assess patient comprehension. J Clin Neurosci 2013; 20:1357–1361
17. Eloy JA, Li S, Kasabwala K, et al. Readability assessment of patient education materials on major otolaryngology association websites. Otolaryngol Head Neck Surg 2012; 147:466–471
18. Hansberry DR, Suresh R, Agarwal N, Heary RF, Goldstein IM. Quality assessment of online patient education resources for peripheral neuropathy. J Peripher Nerv Syst 2013; 18:44–47
19. Kasabwala K, Agarwal N, Hansberry DR, Baredes S, Eloy JA. Readability assessment of patient education materials from the American Academy of Otolaryngology—Head and Neck Surgery Foundation. Otolaryngol Head Neck Surg 2012; 147:466–471
20. Agarwal N, Sarris C, Hansberry DR, Lin MJ, Barrese JC, Prestigiacomo CJ. Quality of patient education materials for rehabilitation after neurological surgery. NeuroRehabilitation 2013; 32:817–821
21. Kasabwala K, Misra P, Hansberry DR, et al. Readability assessment of the American Rhinologic Society patient education materials. Int Forum Allergy Rhinol 2013; 3:325–333
22. Misra P, Agarwal N, Kasabwala K, Hansberry DR, Setzen M, Eloy JA. Readability analysis of healthcare-oriented education resources from the American Academy of Facial Plastic and Reconstructive Surgery (AAFPRS). Laryngoscope 2013; 123:90–96
23. Hansberry DR, Agarwal N, Shah R, et al. Analysis of the readability of patient education materials from surgical subspecialties. Laryngoscope 2014; 124:405–412
24. Agarwal N, Hansberry DR, Sabourin V, Tomei KL, Prestigiacomo CJ. A comparative analysis of the quality of patient education materials from medical specialties. JAMA Intern Med 2013; 173:1257–1259
25. Hansberry DR, John A, John E, Agarwal N, Gonzales SF, Baker SR. A critical review of the readability of online patient education resources from RadiologyInfo.org. AJR 2014; 202:566–575
26. Hansberry DR, Kraus C, Agarwal N, Baker SR, Gonzales SF. Health literacy in vascular and interventional radiology: a comparative analysis of online patient education resources. Cardiovasc Intervent Radiol 2014; 37:1034–1040



27. Coleman M, Liau TL. A computer readability formula designed for machine scoring. J Appl Psychol 1975; 60:283–284
28. Kincaid EH. The Medicare program: exploring federal health care policy. N C Med J 1992; 53:596–601
29. Flesch R. A new readability yardstick. J Appl Psychol 1948; 32:221–233
30. Caylor JS, Sticht TG, Fox LC, Ford JP. Methodologies for determining reading requirements of military occupational specialties: technical report 73-5. Alexandria, VA: Human Resources Research Organization, 1973
31. Fry E. A readability formula that saves time. J Read 1968; 11:513–516
32. Gunning R. The technique of clear writing. New York, NY: McGraw-Hill, 1952
33. Chall JS. Readability revisited: the new Dale-Chall readability formula. Cambridge, MA: Brookline Books, 1995
34. Raygor AL. The Raygor readability estimate: a quick and easy way to determine difficulty. In: Pearson PD, ed. Reading: theory, research and practice: twenty-sixth yearbook of the National Reading Conference. Clemson, SC: National Reading Conference, 1977
35. McLaughlin GH. SMOG grading: a new readability formula. J Read 1969; 12:639–646
36. Penson RT, Benson RC, Parles K, Chabner BA, Lynch TJ Jr. Virtual connections: Internet health care. Oncologist 2002; 7:555–568
37. Kandula S, Zeng-Treitler Q. Creating a gold standard for the readability measurement of health texts. AMIA Annu Symp Proc 2008; Nov 6:353–357
38. Doak CC, Doak LG, Root JH. Teaching patients with low literacy skills. Philadelphia, PA: J. B. Lippincott, 1996
39. Osborne H. Overcoming communication barriers in patient education. Gaithersburg, MD: Aspen Publishers, 2001
40. National Literacy and Health Program. Easy does it! Plain language and clear verbal communication: training manual. Ottawa, ON, Canada: Canadian Public Health Association, 1998
41. Centers for Disease Control and Prevention, Office of the Associate Director for Communication, Strategic and Proactive Communication Branch. Simply put: a guide for creating easy-to-understand materials, 3rd ed. Atlanta, GA: U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, 2009

