Abstract
OBJECTIVE: Video-based performance assessments provide essential feedback to surgical residents, but in-person and remote video-based assessment by trained proctors incurs significant cost. We aimed to determine the reliability, accuracy, and perceived difficulty for untrained attending surgeon raters completing video-based assessments of a basic laparoscopic skill. Secondarily, we aimed to compare reliability and accuracy between 2 different types of assessment tools.

DESIGN: An anonymous survey was distributed electronically to surgical attendings via a national organizational listserv. Survey items included demographics, a rating of video-based assessment experience (1 = have never completed video-based assessments, 5 = often complete video-based assessments), and a rating of favorability toward video-based and in-person assessments (0 = not favorable, 100 = favorable). Participants watched 2 laparoscopic peg transfer performances, then rated each performance using an Objective Structured Assessment of Technical Skill (OSATS) form and the McGill Inanimate System for Training and Evaluation of Laparoscopic Skills (MISTELS). Participants then rated the ease of assessment completion (1 = Very Easy, 5 = Very Difficult).

SETTING: National survey of practicing surgeons.

PARTICIPANTS: Sixty-one attending surgeons with experience in laparoscopic surgery from 10 institutions participated as untrained raters. Six experienced laparoscopic skills proctors participated as expert raters.

RESULTS: Inter-rater reliability was substantial for both OSATS (κ = 0.75) and MISTELS (κ = 0.85). MISTELS accuracy was significantly higher than that of OSATS (κ: MISTELS = 0.18, 95% CI [0.06, 0.29]; OSATS = 0.02, 95% CI [-0.01, 0.04]). While participants were inexperienced with completing video-based assessments (median = 1/5), they perceived video-based assessments favorably (mean = 73.4/100) and rated assessment completion as “Easy” on average.

CONCLUSIONS: We demonstrate that faculty raters untrained in simulation-based assessment can successfully complete video-based assessments of basic laparoscopic skills with substantial inter-rater reliability and without marked difficulty. These findings suggest an opportunity to increase trainees' access to feedback through video-based assessment of fundamental laparoscopic skills.
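The reliability and accuracy figures above are chance-corrected agreement statistics (kappa). As a minimal illustrative sketch of how such a statistic is computed for two raters (the study's exact reliability method is not specified in this record, and the rater scores below are hypothetical, not study data):

```python
# Minimal sketch of Cohen's kappa, a chance-corrected agreement statistic
# like those reported above. All rater data here is hypothetical.
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same set of items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items where raters gave the same score.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters scoring 10 peg transfer videos pass/fail.
a = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "pass"]
b = ["pass", "pass", "fail", "pass", "pass", "pass", "pass", "fail", "pass", "fail"]
print(f"kappa = {cohen_kappa(a, b):.2f}")
# kappa = 0.47 here ("moderate" on the Landis-Koch scale; 0.61-0.80 is "substantial")
```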
Original language | English (US) |
---|---|
Pages (from-to) | 850-857 |
Number of pages | 8 |
Journal | Journal of Surgical Education |
Volume | 81 |
Issue number | 6 |
DOIs | |
State | Published - Jun 2024 |
Externally published | Yes |
Bibliographical note
Publisher Copyright: © 2024 Association of Program Directors in Surgery
Keywords
- laparoscopy
- reliability
- validity
- video-based assessment
- virtual assessment
PubMed: MeSH publication types
- Journal Article
- Comparative Study