Development and Pilot Testing of a Programmatic System for Competency Assessment in US Anesthesiology Residency Training

Glenn E. Woodworth, Zachary T. Goldstein, Aditee P. Ambardekar, Mary E. Arthur, Caryl F. Bailey, Gregory J. Booth, Patricia A. Carney, Fei Chen, Michael J. Duncan, Ilana R. Fromer, Matthew R. Hallman, Thomas Hoang, Robert Isaak, Lisa L. Klesius, Beth L. Ladlie, Sally Ann Mitchell, Amy K. Miller Juve, John D. Mitchell, Brian J. McGrath, John A. Shepler, Charles R. Sims, Christina M. Spofford, Pedro P. Tanaka, Robert B. Maniker

Research output: Contribution to journal › Article › peer-review



BACKGROUND: In 2018, a set of entrustable professional activity (EPA) and procedural skills assessments was developed for anesthesiology training, but it did not assess all the Accreditation Council for Graduate Medical Education (ACGME) milestones. The aims of this study were to (1) remap the 2018 EPA and procedural skills assessments to the revised ACGME Anesthesiology Milestones 2.0, (2) develop new assessments that, combined with the original assessments, create a system of assessment addressing all level 1 to 4 milestones, and (3) provide evidence for the validity of the assessments.

METHODS: Using a modified Delphi process, a panel of anesthesiology education experts remapped the original assessments developed in 2018 to the Anesthesiology Milestones 2.0 and developed new assessments to create a system that assessed all level 1 through 4 milestones. Following a 24-month pilot at 7 institutions, the number of EPA and procedural skill assessments and mean scores were computed at the end of the academic year. Milestone achievement and subcompetency data for assessments from a single institution were compared to scores assigned by the institution's clinical competency committee (CCC).

RESULTS: New assessment development, 2 months of testing and feedback, and revisions resulted in 5 new EPAs, 11 nontechnical skills assessments (NTSAs), and 6 objective structured clinical examinations (OSCEs). Combined with the original 20 EPA and procedural skills assessments, the new system of assessment addresses 99% of level 1 to 4 Anesthesiology Milestones 2.0. During the 24-month pilot, aggregate mean EPA and procedural skill scores increased significantly with year in training. System subcompetency scores correlated significantly with 15 of 23 (65.2%) corresponding CCC scores at a single institution, but 8 correlations (36.4%) were <0.30, indicating poor correlation.
CONCLUSIONS: A panel of experts developed a set of EPAs, procedural skill assessments, NTSAs, and OSCEs to form a programmatic system of assessment for anesthesiology residency training in the United States. The method used to develop and pilot test the assessments, the progression of assessment scores with time in training, and the correlation of assessment scores with CCC scoring of milestone achievement provide evidence for the validity of the assessments.

Original language: English (US)
Pages (from-to): 1081-1093
Number of pages: 13
Journal: Anesthesia and Analgesia
Issue number: 5
State: Published - May 1, 2024

Bibliographical note

Publisher Copyright:
© 2024 Lippincott Williams and Wilkins. All rights reserved.

