More human, less machine: How one medical board is evolving its certifying examination.

By: Diane Gorgas, Felix Ankel

This post is published in advance of the Council of Emergency Medicine Residency Directors Annual Assembly to be held in Seattle, WA, March 1-5, 2025

Recently, one of us discussed the concept of quantum medical education, which emphasizes both the “particle” of medical knowledge and the “waveform” of shared wisdom. This reflects shifts in how we interact with medical knowledge, professional identity, and structure. With the rise of artificial intelligence, the value of clinicians as navigators of wisdom will increase. Clinicians must develop fusion skills to optimize patient care, blending deep expertise with adaptability and situational awareness. This “more human, less machine” shift presents a challenge for certifying boards: how do we develop national standards that assess candidates beyond medical knowledge? The following is an exemplar of how one board is moving beyond assessing medical knowledge and modernizing its certifying assessment to meet the needs of the future.

The American Board of Emergency Medicine’s (ABEM) new certifying exam (CE)

Currently, candidates seeking board certification in emergency medicine must pass a written qualifying exam consisting of approximately 305 multiple-choice questions (MCQs) based on the Model of EM, as well as an oral board exam with five single-patient cases and two structured interview cases. In 2026, the oral board exam will be replaced with a new certifying exam, a shift that represents a fundamental rethinking of assessment in emergency medicine.

ABEM employs a cyclic, iterative approach to exam development and revision to meet the needs of the public. In 2016, ABEM revolutionized continuing certification by introducing MyEMCert, a longitudinal, modular, formative assessment that replaced a summative, high-stakes 10-year MCQ exam. This marked a philosophical shift that emphasized ongoing learning over episodic testing.

The Becoming Certified Initiative (BCI)

Building on the success of MyEMCert, ABEM launched the Becoming Certified Initiative (BCI) in 2021 to modernize initial certification. This was driven by a recognition that the landscape of emergency medicine is changing and that critical skill sets were underrepresented in assessment, both at the residency program level and by the national certifying body. A certifying exam that goes beyond medical knowledge offers an opportunity to set the highest standard for emergency medicine and to offer a credential that differentiates board-certified emergency physicians from others providing care in the emergency department.

The BCI incorporated broad input from the EM community through surveys, focus groups, and interviews. A national summit focusing on the future critical skill set for emergency physicians was held, with representatives from the public, all major EM organizations, and the American Board of Medical Specialties (ABMS). Stakeholders included:

  1. ABEM-certified physicians
  2. EM program directors
  3. Employers and hospital/health system leaders
  4. EM residents, academic and community physicians, national physician groups
  5. Public representatives

A longitudinal Stakeholder Advisory Group was created as an external voice to the Board of Directors regarding initial certification.

Over 4,300 individuals gave input into the current and future skill set required for the successful practice of emergency medicine. This feedback revealed three key areas of focus:

  1. Maintaining the highest standards in the field, ensuring both a written qualifying exam for medical knowledge and a fair, valid, reliable certifying exam to assess additional competencies.
  2. Aligning the exam with clinical practice, moving from traditional assessments towards real-world scenarios.  
  3. Assessing additional competencies, going beyond medical knowledge to evaluate the skills emergency physicians need in practice.

New Case Types

The current testing schema was seen as underrepresenting several specific areas: demonstration of team management and leadership, the ability to task shift and prioritize, high-stakes and difficult communications, and procedural skills. The BCI working group created case types that could optimally evaluate these domains while retaining a strong anchor in requisite medical knowledge. These are divided into three categories: clinical decision, communication, and procedural skills. This yielded eight case types for assessment:

Clinical Cases:

  • Clinical Decision Making
  • Prioritization

Communication Cases:

  • Patient-Centered Communication
  • Conflict Management
  • Difficult Conversations (Breaking Bad News)
  • Reassessment/Troubleshooting

Procedural:

  • Critical Procedures
  • Clinical Ultrasound

Case Development Teams

The new exam requires new constructs for case development. Traditionally, ABEM drew on its most senior physician volunteers to create case content by populating static templates. The new CE involves more early-career physicians with the agency to be generative in creating evaluations in their given domains. This has involved intentional teaming: diversifying team leadership and composition while maximizing team building and autonomy. Another novel approach has been to include public members and Standardized Patient Actors in the case development process. Both groups have been instrumental in providing realism and authenticity in case development, as well as a non-physician perspective on assessment.

Scoring

The traditional oral board examination was highly structured and rubric-driven in its scoring. For the new CE, a balance was sought between maintaining a reliable, standardized assessment for every candidate and introducing holistic, subjective examiner scoring. Clear criteria for competence have been attached to every case type, serving as the structure for the development of scoring rubrics. In addition, each case has the potential for an overall strength-of-performance rating, which has been psychometrically validated as providing increased reliability.

A sharp focus on an unbiased and fair examination has been central throughout case development and will continue through exam administration. Particularly given the potential for bias in moving toward holistic, and therefore more subjective, scoring, every effort is being made by ABEM to eliminate potential bias. This includes assurances that all case developers and future examiners complete implicit bias training, and that a separate review panel of practicing emergency physician stakeholders screens cases for potential bias in any form. ABEM will use multiple scorers for a selection of cases identified as high risk for implicit bias, and a randomized sample of all cases will receive multiple reviews. Per ABEM's usual practice, extensive examiner training will be provided before each exam administration to avoid administration bias, and senior examiners will be onsite to provide in-person continuous quality improvement (CQI) and examiner coaching.

Traditionally, ABEM’s oral board examination has consistently shown high levels of validity, reliability, and fairness, with reliability scores greater than 0.95 for years. The new certifying exam seeks to duplicate this high level of performance while expanding the scope of assessment and increasing clinical relevance. Critical to understanding how this is accomplished is ABEM’s commitment to criterion referencing. There is no curve and no scale for ABEM examinations: 100% of candidates on a given examination could fail, or 100% could pass. Rather, the passing score is set based on the established performance of the expected qualified candidate, as judged by clinically active emergency physicians through the process of standard setting. The board of directors recognizes the need to conduct standard setting for the new CE, and ABEM has committed to doing so to maintain fairness.

The Assessment Center

A novel assessment required ABEM to seek a novel assessment center. This meant finding a facility that could house not only exam rooms for tabletop evaluations (clinical cases) but also numerous clinical simulation rooms where procedures and communication could be observed. Although simulation centers are commonplace in medical schools and residency training programs, finding a center that can accommodate 4,000 candidates annually, the expected volume of candidates seeking assessment in EM within the next few years, is exceedingly difficult. After visits to multiple assessment centers nationally during the BCI process, only the AIME center was able to accommodate this volume. Equally important to the structure and size of the assessment center were the experience of the staff running the center and the professional standardized patients available regionally. ABEM is employing staff from the American Board of Anesthesiology (ABA), who run the AIME center in Raleigh, NC, and have seven years of experience administering the ABA's certifying exam, which utilizes both clinical and communication cases. The key to ABEM's success to date, though, has been the integration of Standardized Patient Actors (SPs) in the case development process. Their insight into the pitfalls and keys to success in developing a highly reliable, standardized assessment for every candidate has been invaluable. In addition to serving a critical role in script development for the communication cases, these professional patient actors provide the AIME center with a diverse group of actors who can truly represent the patients seen in any ED daily.

Assessment-as-system continuum

Assessment has evolved along a continuum from assessment-as-measurement to assessment-as-judgment to assessment-as-system. For most clinicians, this “system” includes self-assessments, workplace-based assessments (WBAs) in residency, and independent assessments such as board examinations.

We believe that the value of an assessment system is influenced by the value of the connections between its parts. We hope that a national standard assessing candidates in areas such as clinical decision making, prioritization, patient-centered communication, conflict management and negotiation, difficult conversations and breaking bad news, reassessment and troubleshooting, clinical procedures, and ultrasound will exert a forcing function on workplace-based assessments in residencies and on self-assessment by learners. Likewise, we hope to update the independent assessments of our certifying exam based on the needs of the public and advances in competency-based medical education. Assessment drives learning. Learning transforms care.

Image from iStock

About the Authors

Diane Gorgas, MD serves as the president of the American Board of Emergency Medicine. 

Felix Ankel serves as the Secretary-Treasurer and Chair of the Academic Affairs Committee of the American Board of Emergency Medicine.

The views and opinions expressed in this post are those of the author(s) and do not necessarily reflect the official policy or position of The University of Ottawa. For more details on our site disclaimers, please see our ‘About’ page.