Programmatic Theme: Data Science

Abstract: To explicitly learn patient representations from longitudinal clinical notes, we propose a hierarchical attention-based recurrent neural network (RNN) with greedy segmentation to distinguish shorter gaps between notes from longer, more meaningful ones. The proposed model is evaluated both on a direct clinical prediction task (mortality) and as a pre-training model for transfer learning to a downstream evaluation (phenotype prediction of obesity and its comorbidities). Experimental results first show that the proposed model with appropriate segmentation achieves the best performance on mortality prediction, indicating the effectiveness of hierarchical RNNs in dealing with longitudinal clinical text. Attention weights from the models highlight the parts of notes with the largest impact on mortality prediction and provide a degree of interpretability. Following the transfer learning approach, we also demonstrate the effectiveness and generalizability of pre-trained patient representations on target phenotyping tasks.
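The greedy segmentation mentioned above is not detailed in this abstract; a minimal sketch, assuming notes carry timestamps and a new segment is started greedily whenever the gap between consecutive notes exceeds a threshold (the threshold value and function name below are illustrative, not from the paper):

```python
from datetime import datetime, timedelta

def greedy_segment(note_times, gap_threshold=timedelta(days=7)):
    """Greedily split chronologically sorted note timestamps into segments,
    starting a new segment whenever the gap between consecutive notes
    exceeds gap_threshold (illustrative default)."""
    segments, current, prev = [], [], None
    for t in note_times:
        if prev is not None and t - prev > gap_threshold:
            segments.append(current)  # long gap: close the current segment
            current = []
        current.append(t)
        prev = t
    if current:
        segments.append(current)
    return segments

# Toy timeline: three clustered notes, a 12-day gap, two notes, a 14-day gap.
times = [datetime(2020, 1, d) for d in (1, 2, 3, 15, 16, 30)]
print([len(s) for s in greedy_segment(times)])  # → [3, 2, 1]
```

Segments built this way supply the note-level boundaries over which the hierarchical RNN operates, so that short within-visit gaps and long between-visit gaps are treated differently.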

Learning Objective: 1. Understand that the HAN model learns a hierarchical representation for a patient from long-term dependencies across clinical notes by utilizing multiple levels of RNNs, and that the greedy segmentation algorithm surfaces the true sequential structure between notes.
2. Understand how the attention mechanism enables the model to identify critical portions of clinical notes at the sentence, document, and patient levels. Analyzing the attention weights yields insight into which portions of clinical notes contribute most to predicting a patient's condition.
3. Understand how encoding downstream notes with the pre-trained model to generate patient representations demonstrates the effectiveness of transfer learning from pre-training to a small dataset of phenotyping tasks.
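The multi-level attention described in objective 2 can be sketched as attention-weighted pooling applied hierarchically: word vectors pool into a sentence vector, sentence vectors into a note vector, and note vectors into a patient representation. This is a toy sketch with a fixed dot-product context vector; the RNN encoders and learned parameters of the actual HAN are omitted, and the embeddings are illustrative.

```python
import math

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(vectors, context):
    """Pool equal-length vectors into one vector via dot-product attention
    against a context vector (learned in the real model, fixed here)."""
    scores = [sum(v_i * c_i for v_i, c_i in zip(v, context)) for v in vectors]
    weights = softmax(scores)
    dim = len(vectors[0])
    return [sum(w * v[i] for w, v in zip(weights, vectors)) for i in range(dim)]

# Hierarchy: words -> sentence -> note -> patient (toy 2-d embeddings).
context = [1.0, 0.0]
sentences = [[[0.2, 0.1], [0.9, 0.3]],   # sentence 1: two word vectors
             [[0.5, 0.5], [0.1, 0.8]]]   # sentence 2
note = attention_pool([attention_pool(s, context) for s in sentences], context)
patient = attention_pool([note], context)  # with more notes, pool the same way
print(len(patient))  # → 2: a 2-d patient representation
```

The attention weights computed at each level are what the paper inspects for interpretability: larger weights mark the words, sentences, or notes that contribute most to the prediction.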


Yuqi Si (Presenter)
University of Texas Health Science Center at Houston

Kirk Roberts, University of Texas Health Science Center at Houston
