AEM Educ Train. 2026 May 11;10:e70172. doi: 10.1002/aet2.70172. eCollection 2026 Jun.
ABSTRACT
BACKGROUND: The American Board of Emergency Medicine (ABEM) has recently introduced a new Certifying Exam, which incorporates assessment of performance in a simulated clinical environment. Simulation-based assessments allow for reproducible, controlled evaluations. Residency programs have a responsibility to prepare residents for the Certifying Exam and to verify resident competency. We sought to determine the feasibility of a simulation-based assessment program encompassing three domains (resuscitation, procedural skills, and communication), with performance assessed against predetermined Minimum Competency Scores (MCS) to provide evidence for program director attestations of competence.
METHODS: Eight stations were chosen based on ABEM's Certifying Exam content, the Accreditation Council for Graduate Medical Education (ACGME) Milestones, and proposed Entrustable Professional Activities. Simulation-based case scenarios and assessment checklists were created, with standard setting performed using the Mastery Angoff process. The stations were categorized into three domains: resuscitation (Adult Medical, Pediatric Medical, Trauma, and Neonatal), procedural skills (Direct Laryngoscopy, Cricothyrotomy, and Central Venous Catheter Placement), and communication (Breaking Bad News). Results were summarized with descriptive statistics.
RESULTS: Thirty emergency medicine residents at an academic Midwestern residency program underwent assessments over one year. None of the residents achieved the MCS on all eight assessments; residents achieved the MCS on an average of 4.24 stations. Performance varied widely, particularly for the Pediatric Medical (27%-100% of items correct), Neonatal Resuscitation (30%-100%), and Breaking Bad News (31%-100%) scenarios. A greater percentage of residents met the MCS for procedural assessments.
CONCLUSIONS: This pilot study demonstrated the feasibility of a simulation-based assessment program designed to provide objective evidence of clinical competency. Assessment outcomes and associated feedback can be used by learners to guide the development of educational plans for performance improvement. Educators can use the resulting data to assess readiness for independent practice, prepare for the Certifying Exam, and identify content areas that may require additional program-wide education.
PMID:42125624 | PMC:PMC13159544 | DOI:10.1002/aet2.70172