Background: Trauma clinical decision support systems improve adherence to evidence-based practice but suffer from poor usability and a lack of user-centered design. The objective of this study was to compare the effectiveness of user-driven and expert-driven usability testing methods for detecting usability issues in a rib fracture clinical decision support system, and to identify guiding principles for trauma clinical decision support systems.

Methods: User-driven and expert-driven usability investigations were conducted on a clinical decision support system developed for patients with rib fractures. For the user-driven evaluation, 10 clinicians were recruited by snowball sampling for simulation-based usability testing; each clinician completed 3 simulations over a video-conferencing platform. End-users participated in a novel team-based approach that simulated realistic clinical workflows. For the expert-driven evaluation, 2 usability experts conducted a heuristic evaluation of the clinical decision support system using 10 common usability heuristics. Usability issues were identified, cataloged, and ranked for severity on a 4-level ordinal scale. Thematic analysis was used to categorize the identified usability issues.

Results: Seventy-nine usability issues were identified: 63% by experts and 48% by end-users, with only 11% identified by both methods. Notably, 58% of severe usability issues were identified by experts alone. Five themes emerged that could guide the design of clinical decision support systems: transparency, functionality and integration into workflow, automated and noninterruptive behavior, flexibility, and layout and appearance. Themes were preferentially identified by different methods.

Conclusion: A dual-method usability evaluation involving both usability experts and end-users substantially improved detection of usability issues over either method alone. We identified 5 themes to guide trauma clinical decision support system design. Performing usability testing via a remote video-conferencing platform facilitated multi-site involvement despite a global pandemic.
Funding Information:
This research was supported by the Agency for Healthcare Research and Quality and Patient-Centered Outcomes Research Institute under grant K12HS026379 (CJT).
© 2022 Elsevier Inc.