By: Brent Thoma (@Brent_Thoma) and Stefanie Sebok-Syer (@StefSebok)
The widespread implementation of competency-based medical education (CBME) has unveiled a new problem: a vast number of completed workplace-based assessments that are not being effectively aggregated, analyzed, or visualized. Symptoms of this challenge within your own CBME programs could include:
- Learners not integrating the feedback from their assessments because they struggle to easily access and/or make sense of them.1
- Competence Committees reverting to a ‘problem identification’ lens2 that deprives learners of developmental feedback because reviewing their assessment data is so onerous and time-consuming.3
- Faculty rarely, if ever, receiving feedback on the quality of their assessments or guidance on how they can improve.4
- Program Directors being unable to use their aggregate assessment data to inform their program’s evaluation and improvement.5
If any of these symptoms resonate, know that you are not alone. Many institutions have updated their assessment systems without upgrading their technological infrastructure. This has hampered CBME’s implementation as stakeholders burdened by antiquated technology blame the new assessment system for their woes. Perhaps worse, the rich insights into learners, faculty, programs, and systems that are contained within these assessment data are being lost.6
Unfortunately, this challenge will not be addressed easily or quickly. At scale, even the relatively simple analytics and visualizations needed to address the symptoms described above depend on intentional data governance and sophisticated data architecture. Despite their foundational importance, investment in these areas is perceived to have a poor return because it does not address the immediate needs of key stakeholders.
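To make that foundation concrete, here is a minimal sketch in Python of the kind of aggregation that sits beneath even a simple learner or Competence Committee dashboard. The schema (resident, EPA, date, and entrustment columns) is entirely hypothetical, not any platform’s actual export format; the point is that these few lines of analysis become trivial once, and only once, governance and architecture deliver clean, complete records into a single queryable table.

```python
import pandas as pd

# Synthetic stand-in for an assessment-platform export; the column
# names and 1-5 entrustment scale are assumptions for illustration.
records = pd.DataFrame({
    "resident": ["A", "A", "A", "B", "B", "B"],
    "epa": ["EPA-1", "EPA-1", "EPA-2", "EPA-1", "EPA-2", "EPA-2"],
    "date": pd.to_datetime(["2024-01-05", "2024-02-10", "2024-02-20",
                            "2024-01-15", "2024-01-30", "2024-03-01"]),
    "entrustment": [2, 3, 3, 2, 4, 4],
})

# Per-resident, per-EPA summary: assessment volume, mean entrustment,
# and recency. This is the kind of aggregate a dashboard would plot.
summary = (records
           .groupby(["resident", "epa"])
           .agg(n_assessments=("entrustment", "size"),
                mean_entrustment=("entrustment", "mean"),
                last_assessed=("date", "max"))
           .reset_index())
print(summary)
```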
This problem is important to acknowledge because, consistent with other industries, data is becoming the currency of medical education. Leaders in this area employ data engineers, architects, analysts, and scientists to build sophisticated systems that allow them to amalgamate, store, analyze, and visualize their data. Such efforts support the core use cases for educational assessment data described above while also preparing organizations for advanced work such as:
- The amalgamation of educational and clinical data to incorporate attributable patient care outcomes into programmatic assessment.7,8
- The use of sophisticated analytical techniques to incorporate predictive and prescriptive analytics into educational programs (see the sketch after this list).9
- The sharing of assessment data within and between institutions10 to support advanced program evaluation, quality improvement, accreditation, and research efforts.6
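To illustrate the predictive bullet above in the simplest possible terms, the sketch below fits a linear trend to one resident’s synthetic entrustment scores and projects when a target threshold might be crossed. It reuses the hypothetical data shape from the earlier sketch and is a toy stand-in for the multilevel learning-curve models described in the literature,9 not a validated method.

```python
import numpy as np

# Synthetic trajectory for one resident: days into training versus
# entrustment scores (1-5); real data would come from the same
# warehouse feeding the summary table in the earlier sketch.
days = np.array([10.0, 40.0, 70.0, 100.0, 130.0, 160.0])
scores = np.array([2.0, 2.5, 2.5, 3.0, 3.5, 4.0])

# Ordinary least-squares trend line: score ~ slope * day + intercept.
slope, intercept = np.polyfit(days, scores, deg=1)

target = 4.5  # hypothetical entrustment threshold for this EPA
if slope > 0:
    projected_day = (target - intercept) / slope
    print(f"Projected to reach {target} around training day {projected_day:.0f}")
else:
    print("No upward trend; flag for Competence Committee review")
```

Even a toy like this makes the dependency clear: predictions are only as trustworthy as the completeness and quality of the assessment data flowing into them.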
Historically, educational technology has been a subdomain of medical education in which most are content if things are ‘working.’ Moving forward, educational leaders will need to take a more active role in engaging with stakeholders to shape how educational data are collected, transported, stored, and used. In some instances, this may mean articulating the importance of educational data and positioning its value on par with that of clinical data. Our leaders should be emboldened to ask questions about their data infrastructure and how it could, and should, be leveraged to support our learners, faculty, programs, and systems.
About the authors:
Brent Thoma, MD MA MSc FRCPC, is an emergency and trauma physician with the Saskatchewan Health Authority and a Professor in the Department of Emergency Medicine.
Stefanie Sebok-Syer, PhD, is an Assistant Professor of Emergency Medicine and Senior Scientist in The PEARL (Precision Education and Assessment Research Lab) at Stanford University School of Medicine.
References:
1. Carey R, Wilson G, Bandi V, Mondal D, Martin L, Woods R, et al. Developing a dashboard to meet the needs of residents in a competency-based training program: A design-based research project. Can Med Educ J. 2020;11(6):e31–5.
2. Hauer KE, Chesluk B, Iobst W, Holmboe E, Baron RB, Boscardin CK, et al. Reviewing Residents’ Competence: A Qualitative Study of the Role of Clinical Competency Committees in Performance Assessment. Acad Med. 2015;90(8):1084–92.
3. Thoma B, Bandi V, Carey R, Mondal D, Woods R, Martin L, et al. Developing a dashboard to meet Competence Committee needs: a design-based research project. Can Med Educ J. 2020;11(1):e16–34.
4. Yilmaz Y, Carey R, Chan T, Bandi V, Wang S, Woods RA, et al. Developing a dashboard for faculty development in competency-based training programs: a design-based research project. Can Med Educ J. 2021;12(4):48–64.
5. Yilmaz Y, Carey R, Chan T, Bandi V, Wang S, Woods RA, et al. Developing a dashboard for program evaluation in competency-based training programs: a design-based research project. Can Med Educ J. 2022;13(5):14–27.
6. Thoma B, Caretta-Weyer H, Schumacher D, Warm EJ, Hall AK, Hamstra SJ, et al. Becoming a Deliberately Developmental Organization: Using competency-based assessment data for organizational development. Med Teach. 2021;43(7):801–9.
7. Smirnova A, Chahine S, Milani C, Schuh A, Sebok-Syer SS, Swartz J, et al. Using Resident-Sensitive Quality Measures Derived From Electronic Health Record Data to Assess Residents’ Performance in Pediatric Emergency Medicine. Acad Med. 2023;98(3):367–75.
8. Schaye V, Guzman B, Burk-Rafel J, Marin M, Reinstein I, Kudlowitz D, et al. Development and Validation of a Machine Learning Model for Automated Assessment of Resident Clinical Reasoning Documentation. J Gen Intern Med. 2022;37(9):2230–8.
9. Reinstein I, Hill J, Cook DA, Lineberry M, Pusic MV. Multi-level longitudinal learning curve regression models integrated with item difficulty metrics for deliberate practice of visual diagnosis: groundwork for adaptive learning. Adv Health Sci Educ. 2021;26(3):881–912.
10. Abbott KL, Krumm AE, Kelley JK, Kendrick DE, Clark MJ, Chen X, et al. Surgical Trainee Performance and Alignment With Surgical Program Director Expectations. Ann Surg. 2022;276(6):e1095–100.
The views and opinions expressed in this post are those of the author(s) and do not necessarily reflect the official policy or position of The Royal College of Physicians and Surgeons of Canada. For more details on our site disclaimers, please see our ‘About’ page.