JMIR Form Res. 2025 Oct 22;9:e68000.
Background: As internet usage continues to rise, an increasing number of individuals rely on online resources for health-related information. However, prior research has shown that much of this information is written at a reading level exceeding national recommendations, which may hinder patient comprehension and decision-making. The American Medical Association (AMA) recommends that patient-directed health materials be written at or below a 6th-grade reading level to ensure accessibility and promote health literacy. Despite these guidelines, studies indicate that many online health resources fail to meet this standard. The exercise stress test is a widely used diagnostic tool in cardiovascular medicine, yet no prior studies have assessed the readability and quality of online health information specific to this topic.
Objective: This study aimed to evaluate the readability and quality of online resources on exercise stress testing and compare these metrics between academic and nonacademic sources.
Methods: A cross-sectional readability and quality analysis was conducted using Google and Bing to identify web-based patient resources related to exercise stress testing. Eighteen relevant websites were categorized as academic (n=7) or nonacademic (n=11). Readability was assessed using four established readability formulas: Flesch-Kincaid Grade Level (FKGL), Flesch Reading Ease (FRE), Simple Measure of Gobbledygook (SMOG), and Gunning Fog (GF). Website quality and reliability were evaluated using the modified DISCERN (mDISCERN) tool. Statistical comparisons between academic and nonacademic sources were performed using independent samples t tests.
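The four readability indices named in the Methods are standard published formulas computed from simple text counts (words, sentences, syllables, and polysyllabic words). As a minimal sketch of how such scores are derived — assuming the counts are already available, and not representing the authors' actual analysis tooling — they can be written as:

```python
import math

def fkgl(words: int, sentences: int, syllables: int) -> float:
    """Flesch-Kincaid Grade Level: approximate US school grade."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def fre(words: int, sentences: int, syllables: int) -> float:
    """Flesch Reading Ease: 0-100 scale, higher scores = easier text."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def smog(sentences: int, polysyllables: int) -> float:
    """SMOG grade; polysyllables = count of words with 3+ syllables."""
    return 1.0430 * math.sqrt(polysyllables * (30 / sentences)) + 3.1291

def gunning_fog(words: int, sentences: int, complex_words: int) -> float:
    """Gunning Fog index; complex_words = count of words with 3+ syllables."""
    return 0.4 * ((words / sentences) + 100 * (complex_words / words))

# Illustrative counts for a hypothetical webpage, not data from the study:
w, s, syl, poly = 1000, 60, 1500, 100
print(round(fkgl(w, s, syl), 2))        # grade-level estimate
print(round(fre(w, s, syl), 2))         # reading-ease score
print(round(smog(s, poly), 2))          # SMOG grade
print(round(gunning_fog(w, s, poly), 2))
```

Note that FKGL, SMOG, and Gunning Fog all rise with longer sentences and more polysyllabic words, while FRE moves in the opposite direction — which is why the academic sources in the Results show higher grade-level scores alongside lower FRE scores.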
Results: The average FKGL, SMOG, and GF scores for all websites were 8.36, 8.28, and 10.14, respectively, exceeding the AMA-recommended 6th-grade reading level. Academic sources had significantly higher FKGL (9.1 vs. 7.9, P=.03), SMOG (8.9 vs. 7.9, P=.04), and lower FRE scores (57.6 vs. 65.3, P=.006) than nonacademic sources, indicating greater reading difficulty. The average GF scores for academic and nonacademic sources were 10.68 and 9.81, respectively, but this difference was not statistically significant. The quality of web resources, as assessed by mDISCERN, was classified as fair overall, with an average score of 29.44 out of 40 (74%). While academic and nonacademic websites had similar mDISCERN scores, areas such as source citation, publication dates, and acknowledgment of uncertainty were consistently lacking across all resources.
Conclusions: Online resources on exercise stress testing are, on average, written at a reading level that exceeds the AMA's 6th-grade reading guideline, potentially limiting patient comprehension. Academic sources were significantly more difficult to read than nonacademic sources, though neither category met the recommended readability standard. The quality of web-based resources was found to be fair but could be improved by ensuring transparency in sourcing and providing clearer, more comprehensive information. These findings underscore the need for improved accessibility and readability in online health information to support patient education and informed decision-making.
Keywords: DISCERN; FKGL; FRE; Flesch reading ease; Flesch-Kincaid grade level; GF; Gunning Fog index; SMOG; Simple Measure of Gobbledygook; cross-sectional study; exercise; exercise stress test; health information; health literacy; mDISCERN; medical information; patient outcomes; physical activity; quality; quality analysis; readability; search engines; websites