AAMC Core EPAs Pilot: Lessons Learned and Implications for CBME

By: Jonathan Amiel (@jmamd)

Ten years ago, the Association of American Medical Colleges (AAMC) convened a group of North American medical educators and education scholars to propose a set of EPAs that medical school graduates should be able to perform with indirect supervision when they begin residency. The AAMC published these Core EPAs for Entering Residency in 2014, and they include:

  1. Gather a history and perform a physical examination.
  2. Prioritize a differential diagnosis following a clinical encounter.
  3. Recommend and interpret common diagnostic and screening tests.
  4. Enter and discuss orders and prescriptions.
  5. Document a clinical encounter in the patient record.
  6. Provide an oral presentation of a clinical encounter.
  7. Form clinical questions and retrieve evidence to advance patient care.
  8. Give or receive a patient handover to transition care responsibility.
  9. Collaborate as a member of an interprofessional team.
  10. Recognize a patient requiring urgent or emergent care and initiate evaluation and management.
  11. Obtain informed consent for tests and/or procedures.
  12. Perform general procedures of a physician.
  13. Identify system failures and contribute to a culture of safety and improvement.

The AAMC then commissioned the Core EPA Pilot to test the feasibility and utility of the Core EPAs at a diverse set of U.S. medical schools. A robust, enthusiastic, and highly collaborative group of educators came together with one another and with their learners and embarked on what turned out to be a seven-year journey. The group wrapped up its work last year, having reported its findings over the course of the pilot. We thought we’d share some of the lessons learned and their potential implications for CBME.

Guiding Principles Help Encourage Fidelity

As CBME implementers have found over the years, the context in which programs exist has a tremendous impact on how well the programs align with the core components of CBME. In piloting the Core EPAs, it was exceptionally helpful for each of the 10 pilot schools to agree to the following guiding principles:

  1. Employ a systematic approach to map educational opportunities and assessments for each EPA.
  2. Explicitly measure the attributes of trustworthiness in addition to the specific knowledge, skills, and attitudes required for each EPA.
  3. Create a longitudinal view of each learner’s performance via, at minimum, aggregated performance evidence, and consider the added value of longitudinal relationships and formal coaching structures in informing entrustment decisions.
  4. Gather multimodal performance evidence from multiple assessors about each learner for each EPA.
  5. Include global professional judgments about the entrustment of each learner in the body of evidence that supports summative entrustment decisions.
  6. Ensure a process for formative feedback along the trajectory to entrustment to provide opportunities for both remediation and potential acceleration of responsibilities.
  7. Create a process to render and maintain formal entrustment decisions by a trained group (entrustment committee) that reviews performance evidence for each learner.
  8. Ensure that each learner is an active participant in the entrustment process — aware of expectations, engaged in gathering and reviewing performance evidence, and generating individualized learning plans to attain entrustment.
  9. Align formal entrustment decisions regarding individual learners with nationally established performance expectations, as currently described in the Core EPAs Curriculum Developers’ Guide.

These guiding principles aligned well with the Core Components Framework for CBME with extra emphasis on data visualization, coaching, and the role of trustworthiness in undergirding entrustment decisions.

What Did We Learn About the Core EPAs?

When the Core EPAs were published, there was considerable discussion among implementers and CBME scholars about which of the EPAs were consistent with ten Cate’s EPA construct and which might not be, and also about which were truly sufficiently present in curricula and which might not be. The pilot collected robust data on the feasibility and utility of implementing each of the EPAs and, not surprisingly, they performed differently. A few clusters emerged:

  • Cluster A: EPAs 1, 2, 5, 6, 7, and 9 align well with existing curricula, meaning that there exist ample opportunities for learners to practice them with direct observation and feedback. → These are the “Core of the Core.”
  • Cluster B: EPAs 3, 4, and 8 may be represented most prominently in senior UME curricula, where learners may have opportunities to practice them, but perhaps in limited volume, with supervision that may be inconsistent or not sufficiently intentional to collect evidence robust enough to substantiate entrustment decisions. → These call for curriculum/assessment improvement.
  • Cluster C: EPAs 10, 11, 12, and 13 appear to be absent, or meaningfully and unfortunately underdeveloped, in most of our participating schools’ UME curricula. These are the “aspirational EPAs.” → These call for systems improvement.

In addition, the group (and colleagues across the country) has been thinking about which EPAs were not included in the originally published list but are nevertheless core to the medical student role and may be considered for inclusion in UME CBME curricula. Some examples include using telemedicine, discharge planning, engaging in patient-centered decision-making, and practicing healthcare in community settings. → These call for innovation.

What Does This Mean for CBME?

First, the work of the pilot helped to accelerate the implementation of CBME at the institutions that participated while many other institutions were also innovating. Collectively, we know much more now about mediators and mitigators of Core EPA implementation (a series of case studies is forthcoming – stay tuned!). Additionally, we have contributed to the growing momentum for making more and better use of assessment data to facilitate learning, with specific emphasis on visualizing data longitudinally to inform coaching discussions and higher-stakes progression decisions.

Importantly, we have data to inform the effort and call to action to define a set of foundational competencies for undergraduate medical education. This is the effort that, at least in the United States, aims to align curriculum and assessment in medical schools with the frameworks used in residency programs and fellowships so that the transition can be more effective, safe, and equitable.

About the author: Jonathan Amiel, MD, is Professor of Psychiatry and Senior Associate Dean for Innovation in Health Professions Education at Columbia University Irving Medical Center and a former Associate Project Lead of the AAMC Core EPA Pilot.

The views and opinions expressed in this post are those of the author(s) and do not necessarily reflect the official policy or position of The Royal College of Physicians and Surgeons of Canada. For more details on our site disclaimers, please see our ‘About’ page.