
Improving Alignment Between an Infant Sepsis Prediction Model and User Expectations Using Human-Centered Design Methods

Background: We developed a machine learning model using electronic health record data to improve neonatal sepsis recognition. In developing system designs to study the presentation of model output to clinicians, we applied two human-centered design methods: clinician interviews and rapid prototyping. Methods: The dynamic rapid prototypes, which drew on nearly 60 patient and model data elements per hour over 72 hours, allowed us to visualize patient data, model predictions, and feature importance. Visualizing these data revealed anomalies such as shifts in model output and discrepancies between feature importance and the results of the clinician interviews. A multidisciplinary team with expertise in neonatology, data science, and human-computer interaction reviewed these anomalies using clinician interview analysis, patient chart reviews, electronic health record data analysis, and model code reviews. Results: The review process identified three categories of anomalies: feature selection, feature importance, and model stability. This process resulted in over 40 changes to the model. Conclusion: Although these anomalies were discovered ad hoc, our experience suggests more rigorous strategies for applying human-centered design methods beyond the presentation of machine learning model output to the development and testing of models.
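The kind of rapid prototype described above can be approximated with a simple time-series visualization. Below is a minimal sketch in Python, assuming hourly model predictions and per-feature importance values have been exported to a pandas DataFrame; the column names and synthetic data are illustrative only and are not the study's actual pipeline or model.

    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    # Illustrative stand-in for an hourly export: 72 hours of model output and
    # per-feature importance values (all column names are hypothetical).
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "hour": np.arange(72),
        "risk_score": rng.uniform(0, 1, 72),
        "imp_heart_rate": rng.uniform(0, 0.5, 72),
        "imp_temperature": rng.uniform(0, 0.5, 72),
        "imp_perfusion": rng.uniform(0, 0.5, 72),
    })

    fig, (ax_risk, ax_imp) = plt.subplots(2, 1, sharex=True, figsize=(10, 6))

    # Top panel: predicted risk over time. Abrupt, unexplained shifts in this
    # curve are the kind of model-stability anomaly a prototype can surface.
    ax_risk.plot(df["hour"], df["risk_score"])
    ax_risk.set_ylabel("Predicted sepsis risk")

    # Bottom panel: feature importance over time. Features that dominate here
    # but were not prioritized by clinicians in interviews point to
    # feature-selection or feature-importance anomalies.
    for col in [c for c in df.columns if c.startswith("imp_")]:
        ax_imp.plot(df["hour"], df[col], label=col.removeprefix("imp_"))
    ax_imp.set_ylabel("Feature importance")
    ax_imp.set_xlabel("Hour")
    ax_imp.legend()

    plt.tight_layout()
    plt.show()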

Learning Objectives

  • Apply human-centered design (HCD) methods, such as end-user interviews and rapid prototyping with low-cost data visualizations, to identify anomalies in a NICU sepsis machine learning model’s output.

Speaker

  • Alex Ruan, MD (Children's Hospital of Philadelphia)

Formative Usability Testing of Designs to Present Machine Learning Output for Improving Sepsis Recognition in Critical Infants

Background: Traditional usability testing evaluates system usability but does not address explainability, which is crucial for systems using machine learning or other forms of artificial intelligence (AI). We developed a machine learning model to improve sepsis recognition in the neonatal intensive care unit (NICU). Methods: In an ongoing study to explore model representation and use, we developed system mockups using patient and model data from four patients. The mockups utilized nearly 200 data elements per patient and were tested iteratively in a format designed to observe user-system problems, assess usability using the Post-Study System Usability Questionnaire (PSSUQ), and assess explainability using published methods that extend the Technology Acceptance Model (TAM) with new constructs such as trust and understandability. Previous work interviewing 30 NICU clinicians identified NICU nurses and advanced practice providers (APPs) as potentially benefiting most from the model. Thirty clinicians (15 nurses, 15 APPs) from two level IV NICUs participated in the testing. Results: Formative testing resulted in seven iterative versions of the system. Testing revealed and addressed usability problems with format, layout, labeling, and support content. Explainability problems identified and addressed included data science terminology, model feature importance, and the presentation of model data over 24 hours. PSSUQ scores were positive and consistent across all seven versions, and overall responses to the TAM-based questionnaire indicated high agreement with the explainability of the system. Conclusion: This study demonstrates that adapting usability testing to include explainability effectively identifies and resolves issues in AI-based systems.
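As context for how questionnaire results can be compared across iterative design versions, the sketch below shows one way to summarize item responses by prototype version in Python. The data layout and column names are assumptions for illustration only; this is not the study's instrument, dataset, or analysis code.

    import pandas as pd

    # Hypothetical long-format responses: one row per participant, version, and
    # questionnaire item, each rated on a 1-7 agreement scale.
    responses = pd.DataFrame({
        "participant": ["p01", "p01", "p02", "p02", "p03", "p03"],
        "version":     [1,     2,     1,     2,     1,     2],
        "item":        ["q1",  "q1",  "q1",  "q1",  "q2",  "q2"],
        "rating":      [6,     6,     5,     6,     7,     6],
    })

    # Mean rating per prototype version: stable means across versions are one
    # simple signal that scores remained consistent as the design changed.
    by_version = (
        responses.groupby("version")["rating"]
        .agg(["mean", "std", "count"])
        .round(2)
    )
    print(by_version)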

Learning Objectives

  • Develop a system that presents machine learning output for user testing

Speaker

  • Alex Ruan, MD (Children's Hospital of Philadelphia)

Usability of integrated care pathways at a freestanding children’s hospital

Integrated care pathways (ICPs) are evidence-based, structured care plans used to improve the quality of care and patient outcomes for individuals presenting with a specified clinical problem. While ICPs can be valuable, usability has historically been a barrier for the healthcare team because pathways are often not integrated into their workflows. AgileMD is an ICP application that integrates into the electronic health record (EHR) and can be used to create ICPs for various conditions. In June 2023, Children’s Nebraska, a freestanding children’s hospital, began rolling out ICPs using AgileMD. However, the tool's usability had not been assessed among nurses. Therefore, this project aims to describe the usability of the Junctional Ectopic Tachycardia (JET) pathway and the Chylothorax pathway by comparing outcome metrics between the AgileMD pathway and the standard workflow for nurses within the CCU. The Clinical Effectiveness (CE) team developed a nursing task-driven simulation case for the Chylothorax pathway and the JET pathway. Two test patients were developed in the EHR playground. After completing both case simulations, each nurse completed the NASA Task Load Index (NASA-TLX) and the System Usability Scale (SUS). A total of 16 CCU nurses completed the study. NASA-TLX results demonstrated significantly lower cognitive load across all domains for the AgileMD workflow compared with the standard workflow. The SUS score of 91.85 corresponds to an A+ letter grade, indicating “Best imaginable” usability. This study will be expanded to assess task completion, click burden, and eye fixation between the two modalities.
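For reference, the System Usability Scale is scored by rescaling its ten 1-5 items to a 0-100 range. The sketch below shows that standard calculation in Python; the example responses are invented and are not data from this study.

    def sus_score(responses):
        """Compute a System Usability Scale score from ten 1-5 item responses.

        Odd-numbered items are positively worded (contribution = response - 1);
        even-numbered items are negatively worded (contribution = 5 - response).
        The summed contributions are multiplied by 2.5 to give a 0-100 score.
        """
        if len(responses) != 10:
            raise ValueError("SUS requires exactly 10 item responses")
        total = 0
        for i, r in enumerate(responses, start=1):
            total += (r - 1) if i % 2 == 1 else (5 - r)
        return total * 2.5

    # Invented example: a strongly positive response pattern.
    print(sus_score([5, 1, 5, 1, 5, 2, 5, 1, 5, 1]))  # 97.5

On commonly used SUS grading scales, scores in the low 90s fall in the top grade band, consistent with the A+ reported above.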

Learning Objectives

  • Develop a system that presents machine learning output for user testing

Speaker

  • Kelsey Zindel, DNP, APRN-NP, CPNP-AC/PC (Children's Nebraska)

Patterns in Viewing of Pediatric Portal Notes

The 21st Century Cures Act mandated the sharing of all clinical notes with patients unless exempted under a limited set of allowable exceptions. However, little is known about who is accessing notes and what types of notes are being viewed. IRB exemption was granted for review of metadata from all notes for all patients under 25 years of age from a multi-state health system between July 2022 and June 2023. We collected information on patient demographics, note types, and author specialties. Continuous variables were summarized using medians and categorical variables as percentages. Further statistical analysis using generalized estimating equation (GEE) models is pending, with significance defined as p < 0.05. A total of 1,578,188 unique notes were collected from 419,136 individual patients, with 1,269,828 shared on activated patient portals. Less than 1% of notes were blocked by providers based on acceptable exemptions. Notes were more likely to be viewed for younger patients (0-2 years, 24.5%) than older patients (18+ years, 20.4%), for English-speaking patients (24.5%) versus Spanish-speaking (14.0%) or other languages (19.1%), and for patients who were privately insured or considered medically complex. Notes from outpatient specialists were viewed most often (n=156,982, 33%), followed by outpatient primary care (n=103,572, 27%), emergency department (n=11,430, 11%), and inpatient (n=28,923, 9%) notes. Viewership by outpatient note specialty ranged from 3% (Radiology) to 45% (Genetics). Viewership by inpatient note specialty ranged from 3% (Neonatology, Rehabilitation Medicine) to 22% (Anesthesia). The data suggest that demographic and clinical factors influence the viewing of notes. These findings can inform strategies to improve access to information among families of all backgrounds and to address the digital divide.
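A GEE analysis of this kind could, for example, be specified with statsmodels in Python. The sketch below is a hypothetical logistic GEE of note viewing as a function of patient age group and language, clustered on patient; the dataset, column names, and covariates are illustrative assumptions and do not reflect the study's actual model specification.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical note-level dataset: one row per shared note, with a binary
    # "viewed" outcome and patient-level covariates (synthetic values).
    rng = np.random.default_rng(0)
    n = 500
    notes = pd.DataFrame({
        "patient_id": rng.integers(0, 100, n),
        "age_group": rng.choice(["0-2", "3-17", "18+"], n),
        "language": rng.choice(["English", "Spanish", "Other"], n),
        "viewed": rng.integers(0, 2, n),
    })

    # Logistic GEE with an exchangeable working correlation, clustering notes
    # within patients to account for repeated notes per patient.
    model = smf.gee(
        "viewed ~ C(age_group) + C(language)",
        groups="patient_id",
        data=notes,
        family=sm.families.Binomial(),
        cov_struct=sm.cov_struct.Exchangeable(),
    )
    result = model.fit()
    print(result.summary())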

Learning Objectives

  • Identify practice changes related to the 21st Century Cures Act
  • Describe note blocking, sharing, and viewing rates between visit types
  • Describe odds ratios of access rates across demographics, note types, and specialties

Speaker

  • Gift Kopsombut, MD (Nemours)

 


About CME/CNE Credit

The following information pertains to individual sessions included in the AMIA 2025 Clinical Informatics Conference On Demand product. A total of 16.75 CME/CNE credits may be earned if all sessions are completed.

Continuing Education Credit

Physicians

The American Medical Informatics Association is accredited by the Accreditation Council for Continuing Medical Education (ACCME) to provide continuing medical education for physicians.

The American Medical Informatics Association designates this online enduring material for 16.75 AMA PRA Category 1 Credits™. Physicians should claim only the credit commensurate with the extent of their participation in the activity.

Claim credit within two years of the release date or within one year of your purchase date, whichever is sooner.

ANCC Accreditation Statement

The American Medical Informatics Association is accredited as a provider of nursing continuing professional development by the American Nurses Credentialing Center's Commission on Accreditation.

  • Nurse Planner (Content): Robin Austin, PhD, DNP, DC, RN, NI-BC, FAMIA, FAAN
  • Approved Contact Hours: 16.75 CME/CNE (maximum per participant)

ACHIPs™

AMIA Health Informatics Certified Professionals™ (ACHIPs™) can earn 1 professional development unit (PDU) per contact hour.

ACHIPs™ may use CME/CNE certificates or the ACHIPs™ Recertification Log to report 2025 CIC sessions attended for ACHIPs™ Recertification.

Claim credit within two years of the release date or within one year of your purchase date, whichever is sooner.

FAQs

Content was recorded live at AMIA's Informatics Summit, March 10-13, 2025, in Pittsburgh, PA, and at AMIA's Annual Symposium, November 9-13, 2024, in San Francisco, CA.

Plan now to join us for the next Annual Symposium or Informatics Summit!

CME or CNE credit must be claimed no later than two years from the release date or within one year of your purchase date, whichever is sooner. No credit will be issued after that time.

Yes! AMIA On Demand is available for anyone to purchase. Become an AMIA member before you purchase to receive exclusive member discounts. Join AMIA today.

We’re glad you asked! AMIA offers a variety of membership options, all with exclusive benefits and abundant networking opportunities. Choose the membership that’s right for you.

The Audio-only format of all sessions is available free of charge exclusively to AMIA members. 

Access the audio recordings now (login required):

Join us at the next AMIA event and engage with leaders from across the health informatics field. 

Yes! You can claim Self-Study credit when you complete AMIA On Demand sessions, in addition to claiming Live credit for attending the live event. View the full details on self-study accreditation for this product.

Yes, the AMIA 2024 Annual Symposium On Demand Bundle (Presenter, Slides, and Audio) may be purchased for 8 educational credits using your health system’s code at checkout. Individual sessions (Presenter, Slides, and Audio) may be purchased for 1 educational credit per session using your health system’s code at checkout.

Available On:
Dates and Times:
Available Until:
Type: AMIA On Demand
Course Format(s): On Demand
Credits: 0.75 CME, 0.75 CNE
Price: Member: $60, Nonmember: $85