Development and Pilot Testing of Entrustable Professional Activities for US Anesthesiology Residency Training

Glenn E. Woodworth, Adrian P. Marty, Pedro P. Tanaka, Aditee P. Ambardekar, Fei Chen, Michael J. Duncan, Ilana R. Fromer, Matthew R. Hallman, Lisa L. Klesius, Beth L. Ladlie, Sally Ann Mitchell, Amy K. Miller Juve, Brian J. McGrath, John A. Shepler, Charles Sims, Christina M. Spofford, Wil Van Cleve, Robert B. Maniker

Research output: Contribution to journal › Article › peer-review


Abstract

BACKGROUND: Modern medical education requires frequent competency assessment. The Accreditation Council for Graduate Medical Education (ACGME) provides a descriptive framework of competencies and milestones but does not provide standardized instruments to assess and track trainee competency over time. Entrustable professional activities (EPAs) represent a workplace-based method for assessing the achievement of competency milestones at the point of care that can be applied to anesthesiology training in the United States.

METHODS: Experts in education and competency assessment were recruited to participate in a 6-step process using a modified Delphi method with iterative rounds to reach consensus on an entrustment scale, a list of EPAs and procedural skills, detailed definitions for each EPA, a mapping of the EPAs to the ACGME milestones, and a target entrustment level for each EPA and procedural skill for graduating US anesthesiology residents. The defined EPAs and procedural skills were implemented using a website and mobile app. The assessment system was piloted at 7 anesthesiology residency programs. After 2 months, faculty were surveyed on their attitudes toward the usability and utility of the assessment system. The number of evaluations submitted per month was collected for 1 year.

RESULTS: Participants in EPA development included 18 education experts from 11 different programs. The Delphi rounds produced a final list of 20 EPAs, each differentiated as simple or complex; a defined entrustment scale; a mapping of the EPAs to milestones; and graduation entrustment targets. A list of 159 procedural skills was developed in a similar manner. The faculty survey demonstrated favorable ratings on all questions regarding app usability as well as the utility of the app and the EPA assessments. Over the 2-month pilot period, 1636 EPA and 1427 procedure assessments were submitted. All programs continued to use the app for the remainder of the academic year, resulting in 12,641 submitted assessments.

CONCLUSIONS: A list of 20 anesthesiology EPAs and 159 procedural skill assessments was developed using a rigorous methodology to reach consensus among education experts. The assessments were pilot tested at 7 US anesthesiology residency programs, demonstrating the feasibility of implementation using a mobile app and the ability to collect assessment data. Adoption at the pilot sites was variable; however, use of the system was not mandatory for faculty or trainees at any site.

Original language: English (US)
Pages (from-to): 1579-1591
Number of pages: 13
Journal: Anesthesia and Analgesia
Volume: 132
Issue number: 6
DOIs
State: Published - Jun 1 2021

Bibliographical note

Publisher Copyright:
© 2021 Lippincott Williams and Wilkins. All rights reserved.

Keywords

  • Anesthesiology/education
  • Humans
  • Internship and Residency/standards
  • Pilot Projects
  • Professional Role
  • Program Development/standards
  • Surveys and Questionnaires
  • United States

PubMed: MeSH publication types

  • Research Support, Non-U.S. Gov't
  • Journal Article
