Appl Clin Inform 2023; 14(05): 996-1007
DOI: 10.1055/s-0043-1777103
Research Article

User-Centered Evaluation and Design Recommendations for an Internal Medicine Resident Competency Assessment Dashboard

Scott Vennemeyer
1   Department of Biomedical Informatics, College of Medicine, University of Cincinnati, Ohio, United States
,
Benjamin Kinnear
2   Department of Pediatrics, College of Medicine, University of Cincinnati, Ohio, United States
5   Department of Internal Medicine, College of Medicine, University of Cincinnati, Ohio, United States
,
Andy Gao
1   Department of Biomedical Informatics, College of Medicine, University of Cincinnati, Ohio, United States
3   Medical Sciences Baccalaureate Program, College of Medicine, University of Cincinnati, Ohio, United States
,
Siyi Zhu
1   Department of Biomedical Informatics, College of Medicine, University of Cincinnati, Ohio, United States
4   School of Design, College of Design, Architecture, Art, and Planning (DAAP), University of Cincinnati, Ohio, United States
,
Anunita Nattam
1   Department of Biomedical Informatics, College of Medicine, University of Cincinnati, Ohio, United States
3   Medical Sciences Baccalaureate Program, College of Medicine, University of Cincinnati, Ohio, United States
,
Michelle I. Knopp
5   Department of Internal Medicine, College of Medicine, University of Cincinnati, Ohio, United States
6   Division of Hospital Medicine, Cincinnati Children's Hospital Medical Center, Department of Pediatrics, College of Medicine, University of Cincinnati, Ohio, United States
,
Eric Warm
5   Department of Internal Medicine, College of Medicine, University of Cincinnati, Ohio, United States
,
Danny T.Y. Wu
1   Department of Biomedical Informatics, College of Medicine, University of Cincinnati, Ohio, United States
2   Department of Pediatrics, College of Medicine, University of Cincinnati, Ohio, United States
3   Medical Sciences Baccalaureate Program, College of Medicine, University of Cincinnati, Ohio, United States
4   School of Design, College of Design, Architecture, Art, and Planning (DAAP), University of Cincinnati, Ohio, United States

Abstract

Objectives Clinical Competency Committee (CCC) members employ varied approaches to the review process, which makes it difficult to design a competency assessment dashboard that fits the needs of all members. This work details a user-centered evaluation of a dashboard currently utilized by the Internal Medicine Clinical Competency Committee (IM CCC) at the University of Cincinnati College of Medicine and presents the resulting design recommendations.

Methods Eleven members of the IM CCC participated in semistructured interviews with the research team. These interviews were recorded and transcribed for analysis. The three design research methods used in this study were process mapping (workflow diagrams), affinity diagramming, and a ranking experiment.

Results Through affinity diagramming, the research team identified and organized the opportunities for improvement in the current system expressed by study participants. These areas included a time-consuming preprocessing step, a lack of integration of data from multiple sources, and differing workflows for each step of the review process. Finally, the research team categorized nine dashboard components based on rankings provided by the participants.

Conclusion We successfully conducted a user-centered evaluation of an IM CCC dashboard and generated four recommendations. Programs should integrate quantitative and qualitative feedback, create multiple views of these data tailored to user roles, work with designers to create a usable, interpretable dashboard, and develop a strong informatics pipeline to manage the system. To our knowledge, this type of user-centered evaluation has rarely been attempted in the medical education domain. This study therefore provides best practices for other residency programs to evaluate current competency assessment tools and to develop new ones.

Protection of Human and Animal Subjects

The study protocol was reviewed by the University of Cincinnati Institutional Review Board and determined to be "nonhuman subjects" research (#2019-1418). All research data were de-identified.




Publication History

Received: 03 June 2023

Accepted: 25 October 2023

Article published online: 20 December 2023

© 2023. Thieme. All rights reserved.

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany
