Simplifying EPA Assessment in the Workplace

By: Daniel Nel MD, PhD, Brent Thoma MD, PhD

Competency-Based Education (CBE) has been criticized for increasing assessment burden. Within health professions education (HPE), CBE assessment systems frequently incorporate Workplace-Based Assessment (WBA) of Entrustable Professional Activities (EPAs) that must be completed multiple times by multiple observers in multiple contexts.1,2 These data support programmatic assessment, but collecting them in busy clinical environments without overburdening assessors is challenging, threatening the sustainability, utility, and viability of WBA and of CBE itself.3,4

For sustainable engagement in CBE assessment, the perceived benefits of participation and the experience of authentic engagement must outweigh the burden of taking part. While it is essential to articulate the educational value of WBAs and to apply robust change-management strategies that draw on both intrinsic and extrinsic motivation, it is equally important to design assessment systems that are feasible and minimise burden.5 This can be achieved in several ways, including selecting fewer EPAs to assess, reducing required assessment quotas, or lowering the number of observations needed for summative decision-making.6 However, one particularly effective strategy is to simplify the assessment tool used for WBA.

WBA tools have been designed for various purposes. For example, the mini-CEX (mini-Clinical Evaluation Exercise) was designed to assess patient presentations, while DOPS (Direct Observation of Procedural Skills) and OSATS (Objective Structured Assessment of Technical Skills) were developed to evaluate procedural performance.7 Similarly, there are bespoke assessment tools for a wide range of other clinical activities, such as producing radiology or pathology reports. Many of these tools predate the EPA era, and, although they have since been modified to accommodate contemporary thinking around entrustment (e.g., replacement of numerical rating scales with entrustment-focused supervision scales), original design elements remain. While these elements may add detail to the assessment encounter, they also add to the complexity and time required for a supervisor to complete a WBA.

Before describing a simplified tool for WBA, it is important to remember that within the context of CBE in HPE, assessments of EPAs are designed to serve a dual purpose:

  • First, they are intended to provide feedback to the trainee. While in-person feedback and a meaningful learning conversation are critical, we know that these often do not occur because of the pressures and complexities of the clinical environment. As a result, the written feedback captured in a WBA may be the only feedback the trainee receives for that activity. This feedback is an opportunity for trainees to gain a better understanding of their current level of ability. By prompting trainees to compare a supervisor’s rating with their own expectations, this feedback supports the development of self-assessment skills—an essential attribute for independent practice.
  • Second, they provide a snapshot of trainee competence. This snapshot is crucial for educational decision-making, including identifying the need for additional support, monitoring progression through training, and, importantly, determining readiness for certification as an independent practitioner. While such decisions are often made by a competency committee, access to assessment data also enables program directors to make timely, ad hoc decisions and provide targeted support when needed.

To meet these requirements, a simplified WBA tool needs to capture, at a minimum, three elements, similar to the CPR2 (Context, Performance, Recommendation/Reinforcement) tool previously summarized on the ICENet blog:8

  1. The context of the observation
  2. The trainee’s performance
  3. Feedback to the trainee

Contextual information may include key identifiers such as the trainee’s name, their current rotation, the date of the activity, the specific EPA observed, and a simple difficulty rating for the task. This provides essential context for interpreting the assessment. Importantly, much of this information (e.g., the date, trainee, rotation) can be captured by the assessment systems themselves as ‘metadata’ that is attached to or prepopulated within an assessment, minimizing the number of questions asked of the assessor and simplifying the form.

Performance assessment can use a prospective entrustment-supervision scale that provides a global rating, as evidence suggests that such global ratings are as accurate as multiple sub-ratings with the benefit of reducing the time required to complete the assessment. An important complementary element to the performance rating is a brief narrative description or list of the components of the activity the trainee can perform independently. This is particularly valuable because prospective entrustment is not always well understood by all faculty, contributing to “hawk and dove” variability. In some cases, a supervisor may rate a trainee as requiring indirect supervision, yet their narrative description clearly indicates that the trainee performed, and can perform, all components independently. This narrative, therefore, helps contextualise the entrustment rating and, to some extent, replaces detailed generic sub-criteria by providing more specific, actionable guidance for improvement.

The final essential element is feedback to the trainee, captured through simple narrative fields, reinforcing what the trainee did well and making recommendations regarding what they could do differently. Framing feedback in this way is important to avoid superficial comments such as “did well” or “keep it up,” and to ensure meaningful formative input.
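For readers who maintain electronic assessment platforms, the three elements described above could be sketched as a minimal data model. This is purely an illustrative assumption, not a prescribed standard: the field names, the four-point supervision scale, and the example EPA below are hypothetical, and a real platform would prepopulate the context fields as metadata rather than asking the assessor for them.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class SupervisionLevel(Enum):
    # Illustrative prospective entrustment-supervision anchors
    OBSERVE_ONLY = 1
    DIRECT_SUPERVISION = 2
    INDIRECT_SUPERVISION = 3
    UNSUPERVISED = 4

@dataclass
class SimplifiedWBA:
    # 1. Context: largely capturable as prepopulated metadata
    trainee: str
    rotation: str
    epa: str
    activity_date: date
    difficulty: str  # e.g., "low", "average", "high"
    # 2. Performance: one global entrustment rating plus a brief
    #    narrative of what the trainee performed independently
    supervision_level: SupervisionLevel
    independent_components: str
    # 3. Feedback: reinforcement and recommendation
    did_well: str
    do_differently: str

# Hypothetical completed form: only five fields require the
# assessor's input (difficulty, rating, and three narratives).
wba = SimplifiedWBA(
    trainee="Trainee A",
    rotation="General Surgery",
    epa="Performing an appendicectomy",
    activity_date=date(2025, 3, 14),
    difficulty="average",
    supervision_level=SupervisionLevel.INDIRECT_SUPERVISION,
    independent_components="Port placement, dissection, and closure performed independently.",
    did_well="Maintained a clear operative plan throughout.",
    do_differently="Verbalise anatomy checks before dividing structures.",
)
```

Keeping the model this small is the point: a competency committee can aggregate the structured fields (EPA, date, rating) while the narrative fields carry the formative weight.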

What emerges from a focus on these three key elements of a WBA form is a streamlined tool with powerful advantages. Such a tool can be applied to almost any focused clinical task (EPA), whether presenting patients on ward rounds, managing a clinic, compiling a radiology or pathology report, performing a single operation, or running the anaesthetic components of an entire theatre list. A simple tool enables simpler faculty development that can be applied across specialties and institutions while facilitating the efficient review of aggregated data by competency committees tasked with making high-stakes decisions.

Innovation and the development of new tools should not be discouraged, and it is important to acknowledge that different contexts have different resources and needs. However, most clinicians—particularly supervisors—are chronically overwhelmed. It is therefore unsurprising that WBA implementation struggles in many settings, and without effective WBA, CBE is significantly weakened.

In most cases, simpler is better. This holds true for many aspects of life—and we would argue, certainly, for WBA as well.


References:

  1. Bok HGJ, Teunissen PW, Favier RP, Rietbroek NJ, Theyse LFH, Brommer H, et al. Programmatic assessment of competency-based workplace learning: when theory meets practice. BMC Med Educ. 2013;13:123.
  2. Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J; International Competency-Based Medical Education Collaborators. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 2019;94(7):1002–1009.
  3. Massie J, Ali JM. Workplace-based assessment: a review of user perceptions and strategies to address the identified shortcomings. Adv Health Sci Educ Theory Pract. 2016;21(2):455–473.
  4. Anderson HL, Kurtz J, West DC. Implementation and use of workplace-based assessment in clinical learning environments: a scoping review. Acad Med. 2021;96(11 Suppl):S164–S174.
  5. Implementing workplace-based assessment is about change management [Internet]. ICENet blog; 2023 Oct 12 [cited 2026 Mar 24]. Available from: https://icenet.blog/2023/10/12/implementing-workplace-based-assessment-is-about-change-management/
  6. Assessment quotas for workplace-based assessment [Internet]. ICENet blog; 2024 Sep 19 [cited 2026 Mar 24]. Available from: https://icenet.blog/2024/09/19/assessment-quotas-for-workplace-based-assessment/
  7. Burch VC. The changing landscape of workplace-based assessment. J Appl Test Technol. 2019;20:37–59.
  8. Context, performance, recommendation and reinforcement (CPR2): bringing supervisor narrative comments to life in competency-based medical education [Internet]. ICENet blog; 2024 Feb 6 [cited 2026 Mar 24]. Available from: https://icenet.blog/2024/02/06/contextperformancerecommendation-and-reinforcement-cpr2-bringing-supervisor-narrative-comments-to-life-in-competency-based-medical-education/

About the Author:

Daniel Nel is a general surgeon and surgical educator at the University of Cape Town and Groote Schuur Hospital, with special interests in Workplace-Based Assessment and Competency-Based Education implementation.

Brent Thoma is an emergency physician and clinician educator in Saskatchewan, Canada, with an interest in learning analytics and the establishment of new medical schools.

The views and opinions expressed in this post are those of the author(s) and do not necessarily reflect the official policy or position of The University of Ottawa or the American Medical Association. For more details on our site disclaimers, please see our ‘About’ page.