Beyond electronic health record data: leveraging natural language processing and machine learning to uncover cognitive insights from patient-nurse verbal communications
Statement of Purpose
Alzheimer's disease and related dementias (ADRD) are escalating public health challenges, with many cases remaining undiagnosed until advanced stages. While previous research has focused on Electronic Health Record (EHR) data and structured cognitive assessments, early and subtle indicators, such as linguistic and interaction cues during routine patient-clinician conversations, often go unnoticed. Building on emerging evidence that speech features (e.g., semantic coherence, syntactic complexity, and turn-taking patterns) are sensitive markers of early cognitive decline, our study leverages natural language processing (NLP) and machine learning (ML) methods to analyze patient-nurse verbal communication in home healthcare.
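As an illustrative sketch only, and not the study's actual feature-extraction pipeline, the Python snippet below shows how simple proxies for lexical diversity, utterance length, and turn-taking could be computed from a diarized patient-nurse transcript. The Utterance structure, the "patient"/"nurse" speaker labels, and the feature names are assumptions introduced for illustration.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Utterance:
    speaker: str  # assumed diarization label: "patient" or "nurse"
    text: str


def linguistic_features(transcript: List[Utterance]) -> Dict[str, float]:
    """Compute simple proxies for lexical diversity, utterance length,
    and turn-taking from a diarized patient-nurse transcript (illustrative only)."""
    patient_tokens: List[str] = []
    patient_utterances = 0
    turn_switches = 0
    prev_speaker = None

    for utt in transcript:
        # Count how often the conversational floor changes hands.
        if prev_speaker is not None and utt.speaker != prev_speaker:
            turn_switches += 1
        prev_speaker = utt.speaker

        if utt.speaker == "patient":
            patient_utterances += 1
            patient_tokens.extend(utt.text.lower().split())

    n_tokens = len(patient_tokens)
    return {
        # Type-token ratio: a crude proxy for lexical diversity.
        "patient_type_token_ratio": len(set(patient_tokens)) / n_tokens if n_tokens else 0.0,
        # Mean words per patient utterance: a crude proxy for syntactic complexity.
        "patient_mean_utterance_len": n_tokens / patient_utterances if patient_utterances else 0.0,
        # Simple turn-taking measure.
        "turn_switch_count": float(turn_switches),
    }


# Toy usage with a made-up transcript fragment.
demo = [
    Utterance("nurse", "How have you been sleeping this week?"),
    Utterance("patient", "Not too bad I think it was fine"),
    Utterance("nurse", "Any trouble remembering your medications?"),
    Utterance("patient", "Sometimes I forget the small white one"),
]
print(linguistic_features(demo))
```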
This approach advances the field by integrating real-world, audio-recorded patient-clinician verbal communications with EHR data (including NLP-driven risk factors extracted from clinical notes), enabling more comprehensive detection of early-stage cognitive decline. Our findings demonstrate that linguistic and interaction cues from verbal communications provide valuable insights that significantly improve screening accuracy when combined with informative features from EHR data. By offering a scalable pipeline to record, analyze, and fuse these data sources, this work paves the way for earlier interventions, potentially improving patient outcomes and reducing the healthcare costs associated with late-stage diagnosis.
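In its minimal form, the fusion step could amount to concatenating speech-derived and EHR-derived feature vectors before a standard classifier. The sketch below uses synthetic data and scikit-learn; the feature layout, the logistic-regression model, and the evaluation setup are assumptions for illustration rather than the reported method or results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic illustration: each row concatenates speech-derived features
# (e.g., type-token ratio, mean utterance length, turn switches) with
# EHR-derived features (e.g., age, an NLP-derived risk-factor count).
rng = np.random.default_rng(0)
n = 200
speech_features = rng.normal(size=(n, 3))
ehr_features = rng.normal(size=(n, 2))
X = np.hstack([speech_features, ehr_features])  # early fusion by concatenation
y = rng.integers(0, 2, size=n)                  # synthetic labels: 1 = flagged for cognitive decline

# Baseline classifier on the fused feature vector, evaluated with cross-validated AUC.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("cross-validated AUC:", scores.mean())
```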