What are we really decoding? Unveiling biases in EEG-based decoding of the spatial focus of auditory attention

© 2024 IOP Publishing Ltd.

Objective. Spatial auditory attention decoding (Sp-AAD) refers to the task of identifying the direction of the speaker to which a person is attending in a multi-talker setting, based on the listener's neural recordings, e.g. electroencephalography (EEG). The goal of this study is to thoroughly investigate potential biases when training such Sp-AAD decoders on EEG data, particularly eye-gaze biases and latent trial-dependent confounds, which may result in Sp-AAD models that decode eye-gaze or trial-specific fingerprints rather than spatial auditory attention. Approach. We designed a two-speaker audiovisual Sp-AAD protocol in which spatial auditory and visual attention were enforced to be either congruent or incongruent, and we recorded EEG data from sixteen participants undergoing several trials recorded at distinct timepoints. We trained a simple linear model for Sp-AAD based on common spatial patterns (CSP) filters in combination with either linear discriminant analysis (LDA) or k-means clustering, and evaluated them both across- and within-trial. Main results. We found that even a simple linear Sp-AAD model is susceptible to overfitting to confounding signal patterns such as eye-gaze and trial fingerprints (e.g. due to feature shifts across trials), resulting in artificially high decoding accuracies. Furthermore, we found that changes in the EEG signal statistics across trials deteriorate the trial generalization of the classifier, even when the latter is retrained on the test trial with an unsupervised algorithm. Significance. Collectively, our findings confirm that there exist subtle biases and confounds that can strongly interfere with the decoding of spatial auditory attention from EEG. It is expected that more complicated non-linear models based on deep neural networks, which are often used for Sp-AAD, are even more vulnerable to such biases. Future work should perform experiments and model evaluations that avoid and/or control for such biases in Sp-AAD tasks.
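
The Approach describes a classical CSP-plus-LDA decoding pipeline. The snippet below is a minimal, hypothetical sketch of such a pipeline using MNE-Python's CSP implementation and scikit-learn's LDA on placeholder data; the array shapes, variable names, and cross-validation setup are illustrative assumptions, not the authors' actual code.

```python
# Minimal sketch of a CSP + LDA Sp-AAD classifier (hypothetical data and shapes).
# Assumes MNE-Python's CSP and scikit-learn's LDA; shown only to illustrate the
# general technique named in the abstract, not the study's exact pipeline.
import numpy as np
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)
n_epochs, n_channels, n_times = 200, 64, 128                # hypothetical EEG dimensions
X = rng.standard_normal((n_epochs, n_channels, n_times))    # placeholder EEG windows
y = rng.integers(0, 2, n_epochs)                             # attended direction: left vs. right

clf = Pipeline([
    ("csp", CSP(n_components=6, log=True)),                  # spatial filters + log-variance features
    ("lda", LinearDiscriminantAnalysis()),                   # linear classifier on CSP features
])

# The study evaluates both across- and within-trial; here a plain k-fold
# cross-validation stands in for brevity. An unsupervised variant would replace
# the LDA step with k-means clustering on the CSP features.
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f}")
```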

Media type:

E-article

Year of publication:

2024

Published:

2024

Contained in:

To the complete record - volume: 21

Contained in:

Journal of neural engineering - 21(2024), 1, dated: 06 Feb.

Language:

English

Contributors:

Rotaru, Iustina [Author]
Geirnaert, Simon [Author]
Heintz, Nicolas [Author]
Van de Ryck, Iris [Author]
Bertrand, Alexander [Author]
Francart, Tom [Author]

Links:

Full text

Topics:

Audiovisual stimulation
Eye-gaze biases
Feature drifts
Journal Article
Research Support, Non-U.S. Gov't
Spatial auditory attention decoding (Sp-AAD)

Notes:

Date Completed 07.02.2024

Date Revised 09.04.2024

published: Electronic

Citation Status MEDLINE

doi:

10.1088/1741-2552/ad2214

funding:

Funding institution / project title:

PPN (catalogue ID):

NLM367567059