Parallel Spatial-Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI
Copyright © 2020 Liu, Shen, Liu, Yang, Xiong and Lin.
Motor imagery (MI) electroencephalography (EEG) classification is an important part of the brain-computer interface (BCI), allowing people with mobility problems to communicate with the outside world via assistive devices. However, EEG decoding is a challenging task because of its complexity, dynamic nature, and low signal-to-noise ratio. Designing an end-to-end framework that fully extracts the high-level features of EEG signals remains a challenge. In this study, we present a parallel spatial-temporal self-attention-based convolutional neural network for four-class MI EEG signal classification. This study is the first to define a new spatial-temporal representation of raw EEG signals that uses the self-attention mechanism to extract distinguishable spatial-temporal features. Specifically, we use the spatial self-attention module to capture the spatial dependencies between the channels of MI EEG signals. This module updates each channel by aggregating features over all channels with a weighted summation, thus improving the classification accuracy and eliminating the artifacts caused by manual channel selection. Furthermore, the temporal self-attention module encodes the global temporal information into features for each sampling time step, so that the high-level temporal features of the MI EEG signals can be extracted in the time domain. Quantitative analysis shows that our method outperforms state-of-the-art methods for intra-subject and inter-subject classification, demonstrating its robustness and effectiveness. In terms of qualitative analysis, we perform a visual inspection of the new spatial-temporal representation estimated from the learned architecture. Finally, the proposed method is employed to realize control of drones based on EEG signals, verifying its feasibility in real-time applications.
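The abstract's core mechanism, updating each channel (or time step) by a weighted summation of features over all channels (or time steps), can be illustrated with a minimal NumPy sketch. This is an illustration of generic scaled dot-product self-attention, not the authors' implementation: the paper's learned projections, parallel branches, and CNN backbone are omitted, and the 22-channel, 256-sample trial shape is an assumption for demonstration only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    """Scaled dot-product self-attention over the rows of x.

    Each row is updated as a weighted summation of all rows, with
    weights derived from pairwise similarity (identity projections
    stand in for the learned query/key/value maps, for brevity).
    """
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)        # pairwise similarities
    weights = softmax(scores, axis=-1)   # each row of weights sums to 1
    return weights @ x                   # weighted summation over rows

# Toy MI-EEG trial: 22 channels x 256 time samples (hypothetical shape).
rng = np.random.default_rng(0)
eeg = rng.standard_normal((22, 256))

spatial = self_attention(eeg)        # channels attend to all channels
temporal = self_attention(eeg.T).T   # time steps attend to all time steps

assert spatial.shape == eeg.shape and temporal.shape == eeg.shape
```

Running both modules on the same input and fusing their outputs downstream mirrors the "parallel" arrangement named in the title; here the two calls simply transpose the trial so the attention acts over channels in one branch and over sampling time steps in the other.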
Media type: E-article
Year of publication: 2020
Published: 2020
Contained in: Complete record - volume:14
Contained in: Frontiers in neuroscience - 14(2020), from: 01., page 587520
Language: English
Contributors: Liu, Xiuling [author]
Subjects: BCI
Notes: Date Revised 29.12.2020; published: Electronic-eCollection; Citation Status: PubMed-not-MEDLINE
DOI: 10.3389/fnins.2020.587520
PPN (catalogue ID): NLM31926761X
LEADER 01000naa a22002652 4500
001 NLM31926761X
003 DE-627
005 20231225171107.0
007 cr uuu---uuuuu
008 231225s2020 xx |||||o 00| ||eng c
024 7  |a 10.3389/fnins.2020.587520 |2 doi
028 52 |a pubmed24n1064.xml
035    |a (DE-627)NLM31926761X
035    |a (NLM)33362458
040    |a DE-627 |b ger |c DE-627 |e rakwb
041    |a eng
100 1  |a Liu, Xiuling |e verfasserin |4 aut
245 10 |a Parallel Spatial-Temporal Self-Attention CNN-Based Motor Imagery Classification for BCI
264  1 |c 2020
336    |a Text |b txt |2 rdacontent
337    |a Computermedien |b c |2 rdamedia
338    |a Online-Ressource |b cr |2 rdacarrier
500    |a Date Revised 29.12.2020
500    |a published: Electronic-eCollection
500    |a Citation Status PubMed-not-MEDLINE
520    |a Copyright © 2020 Liu, Shen, Liu, Yang, Xiong and Lin.
650  4 |a Journal Article
650  4 |a BCI
650  4 |a EEG
650  4 |a deep learning
650  4 |a motor imagery
650  4 |a spatial-temporal self-attention
700 1  |a Shen, Yonglong |e verfasserin |4 aut
700 1  |a Liu, Jing |e verfasserin |4 aut
700 1  |a Yang, Jianli |e verfasserin |4 aut
700 1  |a Xiong, Peng |e verfasserin |4 aut
700 1  |a Lin, Feng |e verfasserin |4 aut
773 08 |i Enthalten in |t Frontiers in neuroscience |d 2007 |g 14(2020) vom: 01., Seite 587520 |w (DE-627)NLM184194563 |x 1662-4548 |7 nnns
773 18 |g volume:14 |g year:2020 |g day:01 |g pages:587520
856 40 |u http://dx.doi.org/10.3389/fnins.2020.587520 |3 Volltext
912    |a GBV_USEFLAG_A
912    |a GBV_NLM
951    |a AR
952    |d 14 |j 2020 |b 01 |h 587520