Abstract
Many quality criteria have been developed to rate the quality of online health information. However, few instruments have been validated for inter-observer reliability. We therefore assessed the degree to which two raters agreed on the presence or absence of information corresponding to 22 commonly cited quality criteria, using a sample of 21 complementary and alternative medicine websites. Our preliminary analysis showed poor inter-rater agreement on 10 of the 22 quality criteria. We therefore created operational definitions for each criterion, reduced the number of allowed response choices, and specified where on a site to look for the information. As a result, 15 of the 22 quality criteria reached a kappa > 0.6. We conclude that even with precise definitions, some commonly used quality criteria for assessing the quality of online health information cannot be reliably assessed. However, inter-rater agreement can be improved by providing precise operational definitions.
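The agreement statistic used here is Cohen's kappa, which corrects observed agreement for the agreement two raters would reach by chance. The sketch below is illustrative only (not the authors' analysis code): it computes kappa for two raters scoring one binary criterion (information present/absent) across a sample of websites; the rating vectors are invented for demonstration.

```python
# Minimal sketch, assuming binary present/absent ratings (1 = present, 0 = absent).
# Cohen's kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion
# of agreement and p_e is the agreement expected by chance from each rater's
# marginal rates.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    assert n == len(rater_b) and n > 0
    # Observed proportion of agreement.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal probability of scoring 1.
    pa1 = sum(rater_a) / n
    pb1 = sum(rater_b) / n
    p_e = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    if p_e == 1.0:
        return float("nan")  # both raters constant; kappa is undefined
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters assessing one criterion on 21 websites.
a = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1]
b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 1, 0, 0, 1, 1]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # ~0.70; above the 0.6 threshold used in the abstract
```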
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1308-1312 |
| Number of pages | 5 |
| Journal | MEDINFO |
| Volume | 11 |
| Issue number | Pt 2 |
| State | Published - 2004 |