Learning-Performance Distinction: Implications for Assessment and CBME

By Dilshan Pieris (@DilshanPieris_)

Competency-based medical education (CBME) is an outcomes-based approach to post-graduate training that is rapidly rising to prominence in the medical education community (1). In fact, the Royal College of Physicians and Surgeons of Canada (RCPSC) reports that by 2022, every residency programme in Canada will have transitioned to a CBME model (2). In light of this upcoming transition, it is important to understand the relationship between CBME and behaviourism, and how this relationship, if not carefully accounted for, can impact the educational development of residents.

Behaviourism, most famously championed by B. F. Skinner, holds that human behaviour is externally motivated; put simply, “the environment selects behaviour” (3). Indeed, any given context contains stimuli that either encourage or discourage particular behaviours. This is the underlying premise of CBME: desired behaviours (“competencies” and “milestones”) are encouraged by manipulating the educational milieu (i.e. curricula, assessments, evaluations) within which residents are trained (1). Consequently, assessing whether residents have achieved these behavioural outcomes to an acceptable level of proficiency relies on direct observation by faculty (1). However, caution must be taken when drawing conclusions about competency from such observations, as learning (retained changes in ability) cannot always be inferred from performance, a phenomenon known as the learning-performance distinction (4,5). This effect is best illustrated by comparing the effects of different practice schedules (i.e. distributed versus massed) on immediate and delayed test performance. (Distributed practice spaces sessions out over time; massed practice leaves little to no spacing between sessions (6).)

In 2013, Dunlosky et al. amalgamated reviews exploring these schedules and found that, when given an equal number of sessions, massed practice outperforms distributed practice on immediate tests (6). However, when retention (i.e. learning) is measured through delayed tests, the effect reverses: distributed practice outperforms massed practice (6). The same pattern emerges when comparing mixed practice (interleaving topics and problem types within a single session) with blocked practice (topics and problem types practiced in succession within a single session); blocked practice outperforms mixed practice on immediate tests, but not on delayed tests (6). As such, learning cannot be inferred from performance on immediate tests: those trained with a distributed and/or mixed schedule will perform poorly and be prematurely deemed “incompetent” (a false negative), while those trained with a massed and/or blocked schedule will perform well and be prematurely deemed “competent” (a false positive) (6). Making such false judgements can alter the trajectory of a resident’s training. For instance, false-negative judgements can lead to the unnecessary remediation of residents who are actually competent in the particular technical or nontechnical skills being examined. Conversely, false-positive judgements can result in residents not being offered the remedial training they need to become competent in certain skills. By prioritizing delayed assessments, and thereby minimizing the likelihood of false-negative and false-positive judgements, residency programmes can better ensure that each resident receives an appropriate amount of training for their level of competence.

Overall, the shift toward CBME is a step in the right direction, though its dependence on observable behavioural outcomes warrants changes to how residents are observed, assessed, and trained so that inaccurate judgements are avoided. Specifically, it is imperative that assessments of competence be based on delayed tests of retention rather than immediate performance tests, to ensure that learning has actually occurred. Furthermore, residency programmes will benefit from evaluating their curricula to ensure that each technical or nontechnical skill is taught, where feasible, using distributed and mixed practice schedules. In closing, by timing assessments more appropriately and approaching curricula with a critical eye, post-graduate medical educators will be well-equipped to accept and optimally deliver CBME to their residents.


  1. Iobst WF, Sherbino J, Cate OT, Richardson DL, Dath D, Swing SR, Harris P, Mungroo R, Holmboe ES, Frank JR, International CBME Collaborators. Competency-based medical education in postgraduate medical education. Medical Teacher. 2010 Aug 1;32(8):651-6.
  2. Royal College of Physicians and Surgeons of Canada [RCPSC]. The Royal College of Physicians and Surgeons of Canada :: CBD implementation [Internet]. Royalcollege.ca. 2018 [cited 27 April 2018]. Available from: http://www.royalcollege.ca/rcsite/cbd/cbd-implementation-e
  3. Skinner BF. Cognitive science and behaviourism. British Journal of Psychology. 1985 Aug 1;76(3):291-301.
  4. Kantak SS, Winstein CJ. Learning–performance distinction and memory processes for motor skills: A focused review and perspective. Behavioural Brain Research. 2012 Mar 1;228(1):219-31.
  5. Wulf G, Shea C, Lewthwaite R. Motor skill learning and performance: a review of influential factors. Medical Education. 2010 Jan 1;44(1):75-84.
  6. Dunlosky J, Rawson KA, Marsh EJ, Nathan MJ, Willingham DT. Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychological Science in the Public Interest. 2013 Jan;14(1):4-58.
