Zero-shot medical event prediction using a generative pretrained transformer on electronic health records.
Longitudinal data in electronic health records (EHRs) represent an individual's clinical history through a sequence of codified concepts, including diagnoses, procedures, medications, and laboratory tests. Generative pretrained transformers (GPT) can leverage these data to predict future events. While fine-tuning of these models can enhance task-specific performance, it becomes costly when applied to many clinical prediction tasks. In contrast, a pretrained foundation model can be used in a zero-shot forecasting setting, offering [...]
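To make the zero-shot setup concrete, below is a minimal sketch of how a decoder-only transformer over codified EHR events could score candidate future events directly from its pretrained next-event distribution, without task-specific fine-tuning. The `EHRTransformer` class, the toy vocabulary, and all code names are hypothetical illustrations, not the authors' implementation.

```python
# Hypothetical sketch: zero-shot next-event scoring with a GPT-style model
# over a sequence of codified EHR concepts. Not the paper's actual code.
import torch
import torch.nn as nn

class EHRTransformer(nn.Module):
    """Toy decoder-only model: embeds event codes, applies causally masked
    self-attention, and outputs next-code logits over the vocabulary."""
    def __init__(self, vocab_size, d_model=64, n_layers=2, n_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, codes):
        seq_len = codes.size(1)
        # Causal mask so each position only attends to earlier events.
        causal = torch.triu(torch.full((seq_len, seq_len), float("-inf")),
                            diagonal=1)
        h = self.encoder(self.embed(codes), mask=causal)
        return self.head(h)  # (batch, seq_len, vocab_size)

# Hypothetical vocabulary of codified concepts (diagnoses, labs, drugs, procedures).
vocab = ["<pad>", "DX:E11.9", "LAB:HbA1c", "RX:metformin",
         "DX:I10", "PR:echocardiogram"]
code_to_id = {c: i for i, c in enumerate(vocab)}

# In practice this would be a pretrained foundation model; here it is untrained.
model = EHRTransformer(vocab_size=len(vocab)).eval()

# One patient's history as an ordered sequence of codes.
history = ["DX:E11.9", "LAB:HbA1c", "RX:metformin"]
ids = torch.tensor([[code_to_id[c] for c in history]])

with torch.no_grad():
    logits = model(ids)[0, -1]        # logits for the event following the history
    probs = logits.softmax(dim=-1)

# Zero-shot prediction: the probability assigned to a target event by the
# pretrained next-event distribution, with no fine-tuning on that task.
for target in ["DX:I10", "PR:echocardiogram"]:
    print(target, float(probs[code_to_id[target]]))
```

In this framing, each clinical prediction task reduces to reading off (or aggregating) probabilities of task-relevant future codes from the same pretrained model, which is what makes the zero-shot setting attractive compared with maintaining one fine-tuned model per task.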
Author(s): Redekop, Ekaterina, Wang, Zichen, Kulkarni, Rushikesh, Pleasure, Mara, Chin, Aaron, Hassanzadeh, Hamid Reza, Hill, Brian L, Emami, Melika, Speier, William F, Arnold, Corey W
DOI: 10.1093/jamia/ocaf160