Assessment of Online Patient Education Materials from Major Dermatologic Associations

Ann M. John, MD^a; Elizabeth S. John, MD^b; David R. Hansberry, MD, PhD^c; William Clark Lambert, MD, PhD^a

^aDepartment of Dermatology, Rutgers New Jersey Medical School, Newark, New Jersey; ^bDepartment of Medicine, Rutgers Robert Wood Johnson Medical School, New Brunswick, New Jersey; ^cDepartment of Medicine, Hahnemann University Hospital, Drexel University College of Medicine, Philadelphia, Pennsylvania

Disclosure: The authors report no relevant conflicts of interest.


Abstract

Objective: Patients increasingly use the internet to find medical information regarding their conditions and treatments, and physicians often supplement visits with written education materials. Online patient education materials from major dermatologic associations should be written at appropriate reading levels to optimize their utility for patients. The purpose of this study is to assess online patient education materials from major dermatologic associations and determine whether they are written at the fourth to sixth grade level recommended by the American Medical Association and National Institutes of Health. Design: Descriptive and correlational study. Setting: Academic institution. Participants/measurements: Patient education materials from eight major dermatology websites were downloaded and assessed using 10 readability scales. A one-way analysis of variance and Tukey’s Honestly Significant Difference post hoc analysis were performed to determine the difference in readability levels between websites. Results: Two hundred and sixty patient education materials were assessed. Collectively, patient education materials were written at a mean grade level of 11.13, with 65.8 percent of articles written above a tenth grade level and no articles written at the American Medical Association/National Institutes of Health recommended grade levels. Analysis of variance demonstrated a significant difference between websites for each reading scale (p<0.001), which was confirmed with Tukey’s Honestly Significant Difference post hoc analysis. Conclusion: Online patient education materials from major dermatologic association websites are written well above recommended reading levels. Associations should consider revising patient education materials to allow more effective patient comprehension. (J Clin Aesthet Dermatol. 2016;9(9):23–28.)


Medical diagnoses and treatments are often difficult for patients to understand. With the advent of the internet, patients increasingly rely on online health information to clarify information provided by physicians.[1] Of the 78 percent of Americans who use the internet, nearly 80 percent use it to search for health information.[2],[3] Patients rely on internet resources to research diseases, understand the viewpoints of patients with similar diseases, review the therapeutic and side effect profiles of treatments, and view rankings of medical providers.[3] According to one study, 69 percent of surveyed patients used online health information as a second opinion, and 11 percent relied solely on internet sources instead of visiting a physician.[4]

Currently, numerous websites are funded by national physician organizations to provide patient education materials (PEMs). However, these internet-based PEMs are often too complex for the average patient. Readability is defined as the ease of comprehension of written material;[5–7] it serves as an objective measure of how useful an article is to readers. Readability and patient literacy are intrinsically linked: a patient with poor literacy skills will not benefit from articles written at an advanced readability level.[8],[9] According to the National Center for Education Statistics, in 2003, only 12 percent of Americans had proficient health literacy.[10] Given the association of low health literacy with poorer health outcomes, it is crucial for healthcare providers to ensure that PEMs are written at a level suited to the literacy of the general public.[8],[9]

The average patient at a public hospital reads about four grade levels below their completed grade level.[10] To account for this diverse patient body, the National Institutes of Health (NIH) and American Medical Association (AMA) recommend that patient information be written between a fourth and sixth grade level.[11],[12] However, studies in specialties including radiology, radiation oncology, nephrology, family medicine, anesthesiology, and orthopedic surgery have continued to show that health information is written at too advanced a level for the average American to fully comprehend.[13–18] A previous study in dermatology examined the readability of PEMs on 15 selected topics from the American Academy of Dermatology (AAD), Wikipedia, MedicineOnline, and WebMD; none of the analyzed PEMs were written at levels within NIH/AMA recommended guidelines.[19] Another study, focused specifically on psoriasis, found that PEMs are too advanced for average patient comprehension.[20]

This study investigates the readability of online patient education articles from major dermatologic associations, including the AAD, American Melanoma Foundation (AMF), American Society for Dermatologic Surgery (ASDS), American Society for Mohs Surgery (ASMS), Melanoma Research Foundation (MRF), National Psoriasis Foundation (NPF), Society for Pediatric Dermatology (SPD), and Skin Cancer Foundation (SCF). The purpose of this study is to determine the utility of PEMs from these physician-associated websites.

Methods

In May 2015, internet-based PEMs from eight major dermatologic associations were downloaded into Microsoft Word documents. These organizations, along with the number of articles downloaded from each association, are listed in Table 1. In total, 260 internet-based PEMs were included in the analysis. Material specifically directed toward patient education was included. Media articles, articles directed toward clinicians, and nonmedical information, including copyright notices, authors, author information, references, and acknowledgements, were excluded. In addition, figures and captions were excluded. This study qualifies as nonhuman subjects research per the Institutional Review Board of Rutgers University–New Jersey Medical School.

The readability assessment was performed with Readability Studio, Professional Edition Version 2012.1, Oleander Software, Ltd (Vandalia, Ohio). Articles were analyzed with eight numerical assessments and two graphical assessments: Flesch Reading Ease (FRE), Flesch-Kincaid Grade Level (FKGL), Simple Measure of Gobbledygook (SMOG) test, Coleman-Liau Index (CLI), Gunning Fog Index (GFI), New Fog Count (NFC), New Dale-Chall Readability Formula (NDC), FORCAST, Raygor Readability Estimate (RRE) graph, and Fry Readability Graph. The FRE scale evaluates readability by generating scores between 0 and 100, with the following score associations: 0–30 = Very Difficult, 31–50 = Difficult, 51–60 = Fairly Difficult, 61–70 = Standard, 71–80 = Fairly Easy, 81–90 = Easy, and 91–100 = Very Easy. The remaining tools, based on the algorithms listed in Table 2, report the academic grade level necessary to understand the material; for example, scores above 12 correspond to post-high school grade levels. These tools use sentence length, number of words, number of characters per word, number of syllables per word, and other characteristics to generate readability scores. Statistical differences between the groups were measured by a one-way analysis of variance (ANOVA) test and confirmed with Tukey’s Honestly Significant Difference (HSD) post hoc analysis. A significance level of p<0.05 was used.
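For readers who wish to reproduce the core calculations, the sketch below illustrates two of the formulas (FRE and FKGL, using their published coefficients) and the group comparison described above. It is a minimal Python illustration using SciPy and statsmodels, not the Readability Studio implementation used in this study; the naive_syllables helper and the sample grade-level values are simplifications introduced here for illustration only.

```python
import re
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

def naive_syllables(word: str) -> int:
    # Crude heuristic: count runs of vowels as syllables. Readability
    # Studio uses a validated counter, so its scores will differ.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    wps = len(words) / len(sentences)  # words per sentence
    spw = sum(naive_syllables(w) for w in words) / len(words)  # syllables per word
    return {
        # Flesch Reading Ease: 0-100 scale, higher = easier to read.
        "FRE": 206.835 - 1.015 * wps - 84.6 * spw,
        # Flesch-Kincaid Grade Level: approximate U.S. school grade.
        "FKGL": 0.39 * wps + 11.8 * spw - 15.59,
    }

# Group comparison as described in Methods: one-way ANOVA across
# associations, then Tukey's HSD post hoc test. Values are hypothetical.
grades = {
    "AAD":  [8.1, 8.5, 8.9, 7.8],
    "ASMS": [13.2, 14.0, 13.9, 13.6],
    "NPF":  [11.0, 11.8, 10.9, 11.4],
}
f_stat, p_value = stats.f_oneway(*grades.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

labels = [g for g, vals in grades.items() for _ in vals]
values = [v for vals in grades.values() for v in vals]
print(pairwise_tukeyhsd(values, labels))
```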

Results

The mean FRE score of the 260 PEMs was 41.7, ranging from 25 to 60 (Figure 1); this mean corresponds to a Difficult level of reading. Overall, the mean grade level across the remaining nine readability assessments was 11.13±2.2 (Figure 2). Of the PEMs, 65.8 percent were written above a tenth grade level, and no articles had a mean grade level between the fourth and sixth grade. Mean FKGL grade level was 10.7, ranging from 5.1 to 18.4. Mean SMOG grade level was 12.7, ranging from 7.5 to 18.3. Mean CLI grade level was 11.6, ranging from 6.6 to 16.2. Mean GFI was 11.9, ranging from 5.7 to 18.4. Mean NFC grade level was 8.0, ranging from 2.4 to 18.4. Mean NDC grade level was 10.8, ranging from 5.5 to 16.0. Mean FORCAST grade level was 11.0, ranging from 9.0 to 12.6. Mean grade level determined from the Fry graph was 13.1, ranging from 6.0 to 17.0 (Figure 3), and mean grade level determined from the RRE was 11.8, ranging from 5.0 to 17.0 (Figure 4). The highest mean grade level of PEMs was from the ASMS (13.75), and the lowest was from the AAD (8.3).

ANOVA results indicated a significant difference between the dermatologic associations (p<0.001), both overall and for each individual readability assessment. Tukey’s HSD post hoc analysis confirmed these results.

Discussion

Patients increasingly rely on the internet to understand personal health diagnoses and treatments. Patients who access the internet rank it as their second most valued resource, after information obtained directly from physicians.[4] According to one survey, 48 percent of patients said online information facilitated management of their health, while 47 percent said that online information affected how they make personal health decisions.[21] Physicians also tend to refer patients to written education articles to further elucidate diseases and treatments. Despite this significant reliance on online information, PEMs in various fields are written well above the average American reading level.[13–20]

This study specifically investigates the readability of PEMs provided by major dermatologic associations, as these come from physician-associated websites. The articles surveyed were collectively written at an 11.13 grade level (Figure 2), far above the AMA/NIH-recommended fourth to sixth grade level; in fact, none of the articles complied with the recommended guidelines. The mean FRE score corresponded to a Difficult reading level (Figure 1). There were also differences in readability among the associations, which may reflect the varying subjects addressed by each website and the differing numbers of PEMs downloaded from each. Overall, PEMs from major dermatologic associations were too advanced for the average American patient’s comprehension (Figures 3 and 4).

Online health information can provide several benefits to the physician-patient relationship. For one, patient use of online health information reduces the time a physician must spend with each patient. In a study conducted at Johns Hopkins Medical Center, patients who previewed online information required only 15 minutes to understand information that patients without such a preview took one hour to comprehend.[22] In addition, an informed patient alleviates some of the physician’s responsibility. One study found that physicians believed the informed patient had either a beneficial (38%) or neutral (54%) effect on the physician-patient relationship.[23] However, if an article is written at a level beyond patient comprehension, it can instead confuse the patient, increase anxiety, and lengthen patient-doctor visits. To ensure that patient education material is a beneficial supplement to the patient-physician interaction, articles should be written at levels that will elucidate health information for the average American patient.

This study has certain limitations intrinsic to the readability assessments. Several of the scales (e.g., FRE, FKGL, SMOG, and GFI) rely on the number of syllables in a word rather than its content. Thus, short words that are medical jargon, such as “lesion,” “macule,” and “wheal,” would be considered less complex, while lengthy words that are commonly known to the general public, such as “generalized” and “symmetrical,” could inappropriately inflate the calculated reading level. In addition, if an author defines a word, such as “lichenification,” and then uses it again, it will still be classified as a difficult word. The readability tools also do not take into account a variety of other factors that influence patient comprehension, including sentence structure, use of images and diagrams, layout of websites, and content design. The Suitability Assessment of Materials may be a useful future tool to quantify the effectiveness of charts, videos, and audio supplements.[24]
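This blind spot is easy to demonstrate. The snippet below, a minimal sketch using the same naive vowel-group heuristic as the Methods example (a simplification introduced here, not part of the study’s methodology), shows how syllable-driven formulas rate short jargon as easy and longer, familiar words as hard:

```python
import re

def naive_syllables(word: str) -> int:
    # Crude heuristic: count runs of vowels as syllables.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

# Syllable-based formulas see only word length, never familiarity:
# "wheal" scores as a one-syllable (easy) word despite being jargon,
# while well-known words like "generalized" score as polysyllabic (hard).
for word in ["wheal", "macule", "lesion", "generalized", "symmetrical"]:
    print(f"{word}: ~{naive_syllables(word)} syllable(s)")
```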

Recommendations for improving the readability of articles are as follows. Authors should use shorter sentences and, when possible, words of fewer than three syllables. While this may be impossible with certain terms in dermatology, authors should clearly define difficult words and concepts. Authors should consider relying on illustrations, diagrams, tables, and lists to break up long blocks of text. In addition, brief summaries should be provided to recapitulate the concepts addressed. Authors should also avoid category words, such as “beta-blockers,” which are well understood by physicians but not by most patients. Audio and video supplements should be incorporated where possible. Future directions of this study include surveying patients who read PEMs from dermatologic associations to obtain a subjective measure of patient comprehension. By improving the readability of online PEMs, dermatologists may find a positive impact on the physician-patient relationship, patient understanding, and patient adherence.

Conclusion

PEMs from major dermatologic associations are written at levels too advanced for the average American patient’s comprehension. As patients increasingly rely on internet sources to elucidate diagnoses and treatments, it is important that physician-associated websites provide PEMs that patients can understand. Revision of PEMs should be considered to optimize patient comprehension and subsequent benefit to the patient-physician dynamic. In doing so, both patients and physicians can experience a more streamlined and focused interaction.

References

1. Covering Kids & Families. Health Literacy Style Manual. October 2005.

2. Pew Internet & American Life Project. Demographics of Internet Users. Washington, DC: Pew Research Center; 2011.

3. Fox S. The Social Life of Health Information. Washington, DC: Pew Research Center; 2011.

4. Diaz JA, Griffith RA, Ng JJ, et al. Patients’ use of the internet for medical information. J Gen Intern Med. 2002;17:180–185.

5. Hargis G. Readability and computer documentation. ACM Comput Doc. 2000;24:122–131.

6. Klare GR. Measurement of Readability. Ames, IA: Iowa State University Press; 1963.

7. DuBay W. The Principles of Readability. Costa Mesa, CA: Impact Information; 2004.

8. Wolf MS, Gazmararian JA, Baker DW. Health literacy and functional health status among older adults. Arch Intern Med. 2005;165:1946–1952.

9. DeWalt DA, Berkman ND, Sheridan S, et al. Literacy and health outcomes. J Gen Intern Med. 2004;19:1228–1239.

10. Fox S. The Social Life of Health Information 2011. http://www.pewinternet.org/Reports/2011/Social-Life-of-Health-Info/Summary-of-Findings.aspx. Accessed June 1, 2015.

11. National Institutes of Health. How to write easy to read health materials. National Library of Medicine website. http://www.nlm.nih.gov/medlineplus/etr.html. Accessed June 1, 2015.

12. Weiss BD. Health Literacy: A Manual for Clinicians. Chicago, IL: American Medical Association, American Medical Foundation; 2003.

13. Byun J, Golden DW. Readability of patient education materials from professional societies in radiation oncology: are we meeting the national standard? Int J Radiat Oncol Biol Phys. 2015;91:1108–1109.

14. De Oliveira GS Jr, Jung M, McCaffery KJ, et al. Readability evaluation of internet-based patient education materials related to the anesthesiology field. J Clin Anesth. 2015;27(5):401–405.

15. Eltorai AE, Sharma P, Wang J, Daniels AH. Most American Academy of Orthopaedic Surgeons’ online patient education material exceeds average patient reading level. Clin Orthop Relat Res. 2015;473:1181–1186.

16. Hansberry DR, John A, John E, et al. A critical review of the readability of online patient education resources from RadiologyInfo.Org. AJR Am J Roentgenol. 2014;202:566–575.

17. Morony S, Flynn M, McCaffery KJ, et al. Readability of written materials for CKD patients: a systematic review. Am J Kidney Dis. 2015;65:842–850.

18. Schoof ML, Wallace LS. Readability of American Academy of Family Physicians patient education materials. Fam Med. 2014;46:291–293.

19. Tulbert BH, Snyder CW, Brodell RT. Readability of patient-oriented online dermatology resources. J Clin Aesthet Dermatol. 2011;4:27–33.

20. Smith GP. The readability of patient education materials designed for patients with psoriasis: what have we learned in 20 years? J Am Acad Dermatol. 2015;72:737–738.

21. Rice RE. Influences, usage, and outcomes of internet health information search: multivariate results from the Pew surveys. Int J Med Inform. 2006;75:8–28.

22. Ferguson T. Online patient-helpers and physicians working together: a new partnership for high quality health care. BMJ. 2000;321:1129–1132.

23. Murray E, Lo B, Pollack L, et al. The impact of health information on the internet on health care and the physician-patient relationship: national U.S. survey among 1,050 U.S. physicians. J Med Internet Res. 2003;5:e17.

24. Doak CC, Doak LG, Root JH. Teaching Patients With Low Literacy Skills. 2nd ed. Philadelphia, PA: JB Lippincott Co; 1996.