
Self-Attention Long-Term Dependency Modelling in Electroencephalography Sleep Stage Prediction

EasyChair Preprint no. 6910

12 pages · Date: October 20, 2021

Abstract

Complex sleep stage transition rules pose a challenge for learning inter-epoch context with Deep Neural Networks (DNNs) in ElectroEncephaloGraphy (EEG) based sleep scoring. While DNNs have overcome the limits of expert systems, the dominant bidirectional Long Short-Term Memory (LSTM) still inherits the limitations of Recurrent Neural Networks. We propose a sleep Self-Attention Model (SAM) that replaces LSTMs for inter-epoch context modelling in a sleep scoring DNN. By accessing distant EEG as easily as adjacent EEG, we aim to improve long-term dependency learning for critical sleep stages such as Rapid Eye Movement (REM). Restricting attention to a local scope reduces computational complexity to linear in the recording duration. We evaluate SAM on two public sleep EEG datasets, MASS-SS3 and SEDF-78, and compare it to the literature and an LSTM baseline model via a paired t-test. On MASS-SS3, SAM achieves kappa = 0.80, equivalent to the best reported result, with no significant difference to the baseline. On SEDF-78, SAM achieves kappa = 0.78, surpassing the previous best results by a statistically significant margin, with a +4% F1-score improvement for REM. Strikingly, SAM achieves these results with a model at least 50 times smaller than the baseline.
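For intuition, below is a minimal sketch of the local-scope self-attention idea in PyTorch: each epoch embedding attends only to epochs within a fixed window around it, which is what bounds the inter-epoch context cost linearly in recording length. The class name, dimensions, and window size are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of local (windowed) self-attention over per-epoch embeddings.
# All names, dimensions, and the window size are illustrative assumptions;
# this is NOT the paper's actual implementation.
import torch
import torch.nn as nn


class LocalSelfAttention(nn.Module):
    """Self-attention where each epoch attends only to a local window of
    neighbouring epochs, so cost grows linearly with recording length."""

    def __init__(self, dim: int = 128, num_heads: int = 4, window: int = 10):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.window = window  # epochs visible on each side of the query epoch

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_epochs, dim) sequence of epoch embeddings
        n = x.size(1)
        idx = torch.arange(n, device=x.device)
        # Boolean mask, True = blocked: only pairs with |i - j| <= window attend.
        # Note this sketch still materialises an n-by-n mask; a genuinely
        # linear-cost version would process windowed chunks instead.
        mask = (idx[None, :] - idx[:, None]).abs() > self.window
        out, _ = self.attn(x, x, x, attn_mask=mask)
        return out


if __name__ == "__main__":
    epochs = torch.randn(2, 100, 128)  # two recordings, 100 epochs each
    ctx = LocalSelfAttention()(epochs)
    print(ctx.shape)  # torch.Size([2, 100, 128])
```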

Keyphrases: attention, inter-epoch context, sleep scoring

BibTeX entry
BibTeX does not have the right entry type for preprints. The following entry is a workaround that produces the correct reference:
@Booklet{EasyChair:6910,
  author       = {Georg Brandmayr and Manfred Hartmann and Franz Fürbass and Georg Dorffner},
  title        = {Self-Attention Long-Term Dependency Modelling in Electroencephalography Sleep Stage Prediction},
  howpublished = {EasyChair Preprint no. 6910},
  year         = {EasyChair, 2021}}