Supporting Team Tortoise: Grant Funding and the Advancement of CBME

By: Michael A. Barone (@BaroneMichael); John Moore

Sometimes the pace of progress to widespread adoption of Competency Based Medical Education (CBME) can feel more like the deliberate tortoise than the speedy hare…but we all know how that story ends.1

Those of us who advocate for CBME naturally want change to happen faster, but let’s pause and reflect on our restlessness. The “slow and steady” pace of CBME advancement reflects the time essential to answer critical research questions and build the validity argument for medical education transformation. It also reflects the time needed to adapt existing educational and assessment systems to support CBME.

Given the number and depth of the questions to be answered, and the complexity of redesigning educational systems embedded in healthcare delivery systems, research studies and implementation pilots are critical. This ICE Blog post will focus on the importance of funding to support these efforts – i.e., providing the essential energy for the determined tortoise to win the race.

Why this blog post at this time? In our daily work at NBME, we are fortunate to oversee one of medical education’s all-too-rare sources of funding – the Stemmler Fund.2 In January 2023, the physician educator for whom this fund is named passed away.3 Dr. Stemmler’s death provided an opportunity for us to look back on nearly three decades of funding important work in medical education and assessment.

When reviewing the projects funded by Stemmler grants since the program’s inception in 1995, one can’t help thinking of the way Drs. Lambert Schuwirth and Cees van der Vleuten described five decades of evolution in assessment in their paper, “A history of assessment in medical education.”4 The authors describe assessment initially being viewed as a measurement problem, designed to “tell people apart” and to minimize human judgment. This era rested on assumptions that “competence could and even should be captured purely quantitatively and that it could be expressed as a (single) score.” Beginning in the 1960s, this phase lasted about 30 years.

In the next stage of evolution (the 1990s), the importance of context and a desire to measure constructs beyond medical knowledge led to the embrace of human judgment and gradually moved assessment toward methods such as workplace-based assessment. This period coincides with the beginning of the Stemmler Fund program (established 1995), and the emerging interests in assessment at that time aligned with some funding decisions in the program’s early cycles. One of the first grants was awarded to Dr. Patricia O’Sullivan for a “Demonstration of Portfolio Assessment in Residency Education”.5

Schuwirth and van der Vleuten next describe assessment evolving into “assessment as a system,” in which assessment moves from a “methods-oriented approach to a whole-systems approach.” This period has unfolded over the past 15-20 years and coincides with the emerging views of assessment-for-learning and programmatic assessment.6 In reviewing Stemmler funding decisions, we start to see evidence of projects that aggregate different data sources to inform decision making. This is demonstrated in a grant awarded (2002) to Dr. Maxine Papadakis on an Investigation of Professionalism Problems in Medical School as a Risk Factor for Physician Discipline.7

More recently, the Stemmler Fund has seen funding decisions align with this “systems” view of assessment as well as with the implementation of CBME programs, including assessment projects leveraging large and dynamic data sets.

Grants during this most recent period include:

  • Dr. Bob Englander and colleagues (2017) on Mental Models and Learner Outcomes: Gap Filling of Validity Evidence to Support Time-Variable Competency-Based Advancement8
  • Dr. Dan Schumacher and colleagues (2019) on Assessing Residents’ Clinical Performance Using Resident-Sensitive Quality Measures10
  • Drs. Stefanie Sebok-Syer and Lorelei Lingard (2020) on Conceptualizing and Assessing Interdependent Performance in Collaborative Clinical Environments11
  • Drs. Verity Schaye, Jesse Burk-Rafel and Sally Santen (2022) on the Development and Validation of a Machine Learning Model for Automated Workplace-Based Assessment of Resident Clinical Reasoning Documentation12

In the classic fable, the tortoise was described as dogged. As we review the impact of one grant funding program, we see that the dogged determination of talented investigators, combined with financial support for ambitious, resource-intensive projects, leads to important results that advance CBME.

It’s also worth mentioning that supporting CBME is a team sport, built on numerous intramural and extramural (organizational) funding sources that have prioritized medical education reform. A number of these organizations and programs come to mind, and we’ve listed them below – but we know that our view is not comprehensive and is admittedly North American-centric. Please help create awareness of local and global funding opportunities by adding them in the Blog Chat below.

  • The Royal College of Physicians and Surgeons of Canada13
  • The American Medical Association’s (AMA) CBME-related initiatives for UME and GME14-16
  • The AAMC (Association of American Medical Colleges), through its Groups on Educational Affairs (GEAs) and through funding large-scale pilots and initiatives17-21
  • The Josiah Macy Jr. Foundation21,22
  • The American Board of Pediatrics Foundation23

Throughout this blog post, our reference to the Tortoise and the Hare fable is not intended to imply that CBME implementation is a race. It is, however, intended to convey that the CBME movement, with its pluckiness and “stick-to-it-ness,” is dedicated to implementing an educational and assessment system based on outcomes meaningful to the public, and is confident and prepared for the long game. Recall that in some versions of the fable, it is actually the tortoise who challenges the hare to the race, not the other way around. Perhaps that serves as a reminder never to underestimate the impact of a “slow and steady” movement, especially when it is well supported.

About the authors:
Michael A. Barone, MD, MPH is Vice President, Competency-Based Assessment, NBME, Philadelphia, PA, USA.
John Moore, MA is Director, Assessment Data Initiatives, NBME, Philadelphia, PA, USA.


1. The Hare and the Tortoise. Available at:

2. About the Stemmler Fund.  Available at: 

3. Edward J. Stemmler.  Available at:

4. Schuwirth LWT, van der Vleuten CPM. A history of assessment in medical education. Adv Health Sci Educ Theory Pract. 2020;25(5):1045-1056.

5. O’Sullivan PS, Cogbill KK, McClain T, Reckase MD, Clardy JA. Portfolios as a novel approach for residency evaluation. Acad Psychiatry. 2002;26(3):173-9.

6. van der Vleuten CPM, Schuwirth LWT, Driessen EW, Govaerts MJB, Heeneman S. Twelve Tips for programmatic assessment. Med Teach. 2015;37(7):641-646.

7. Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behavior in medical school is associated with subsequent disciplinary action by a state medical board. Acad Med. 2004;79(3):244-9.

8. Schwartz A, Balmer DF, Borman-Shoap E, Chin A, Henry D, Herman BE, Hobday P, Lee JH, Multerer S, Myers RE, Ponitz K, Rosenberg A, Soep JB, West DC, Englander R. Shared Mental Models Among Clinical Competency Committees in the Context of Time-Variable, Competency-Based Advancement to Residency. Acad Med. 2020;95(11S Association of American Medical Colleges Learn Serve Lead: Proceedings of the 59th Annual Research in Medical Education Presentations):S95-S102.

9. Variability in Trainee Autonomy and Learning in Surgery (VITALS) Trial. Available at:

10. Smirnova A, Chahine S, Milani C, Schuh A, Sebok-Syer SS, Swartz JL, Wilhite JA, Kalet A, Durning SJ, Lombarts KMJMH, van der Vleuten CPM, Schumacher DJ. Using Resident-Sensitive Quality Measures Derived From Electronic Health Record Data to Assess Residents’ Performance in Pediatric Emergency Medicine. Acad Med. 2023;98(3):367-375.

11. Sebok-Syer SS, Lingard L, Panza M, Van Hooren TA, Rassbach CE. Supportive and collaborative interdependence: Distinguishing residents’ contributions within health care teams. Med Educ. 2023 Feb 23. Online ahead of print.

12. Development & Validation of a Machine Learning Model for Automated Workplace-Based Assessment of Resident Clinical Reasoning Documentation. Available at:

13. Awards and Grants. Available at:

14. Accelerating Change in Medical Education Innovation Grant Program. Available at:

15. Lomis KD, Santen SA, Dekhtyar M, Elliott VS, Richardson J, Hammoud MM, Hawkins R, Skochelak SE. The Accelerating Change in Medical Education Consortium: Key Drivers of Transformative Change. Acad Med. 2021;96(7):979-988.

16. AMA Reimagining Residency initiative. Available at:

17. GEA National Grant Award Call for Educational Research Grant Proposals. Available at:

18. Competency-Based Medical Education (CBME). Available at:

19. Education in Pediatrics Across the Continuum (The EPAC Project). Available at:

20. Core Entrustable Professional Activities for Entering Residency. Available at:

21. Andrews JS, Bale JF Jr, Soep JB, Long M, Carraccio C, Englander R, Powell D; EPAC Study Group. Education in Pediatrics Across the Continuum (EPAC): First Steps Toward Realizing the Dream of Competency-Based Education. Acad Med. 2018;93(3):414-420.

22. Josiah Macy Jr. Foundation. Apply for a Grant. Available at:

23. Schumacher DJ, Michelson C, Winn AS, Turner DA, Elshoff E, Kinnear B. Making prospective entrustment decisions: Knowing limits, seeking help and defaulting. Med Educ. 2022;56(9):892-900.

The views and opinions expressed in this post are those of the author(s) and do not necessarily reflect the official policy or position of The Royal College of Physicians and Surgeons of Canada. For more details on our site disclaimers, please see our ‘About’ page.