EEG-based Emotion Recognition Using Spatio-Temporal-Spectral Cross-Attention Learning

Abstract: Electroencephalogram (EEG)-based emotion recognition is an important intelligent technique for health assessment and the clinical intervention of mental disorders. However, EEG features exhibit complex, multi-dimensional non-linear couplings across the spatial, temporal, and spectral domains, which makes it difficult to learn emotion-related representations effectively and degrades downstream emotion recognition performance. To address this challenge, this paper proposes an Emotional Spatio-Temporal-Spectral Cross-Attention Network (ESTSCA-Net). The model adopts a dual-branch feature fusion framework. In the spatio-temporal branch, a multi-scale 2D convolutional network processes spatial and temporal information in a serial manner, adaptively capturing the spatio-temporal contextual patterns of neural activity. In the spatio-spectral branch, a 3D bottleneck residual network with dual cross-channel and frequency-band attention is designed to precisely weight the key spatio-spectral oscillation patterns of neural activity. A bidirectional multi-head cross-attention interaction strategy is further introduced to achieve deep fusion of the spatio-temporal-spectral features, yielding the final emotion representation classifier. Experimental results on the public DEAP and MEEG datasets show that ESTSCA-Net fully exploits the spatio-temporal-spectral characteristics of EEG signals under different emotional states and outperforms mainstream baseline models on both the arousal and valence dimensions.

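The abstract describes a bidirectional multi-head cross-attention step that fuses the outputs of the spatio-temporal and spatio-spectral branches. Below is a minimal PyTorch sketch of what such a fusion module might look like; the class name, feature dimensions, token counts, pooling scheme, and classifier head are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch (not the authors' code): bidirectional multi-head cross-attention
# fusing a spatio-temporal branch with a spatio-spectral branch, as described in
# the abstract. Dimensions, token counts, and the classifier head are assumptions.
import torch
import torch.nn as nn


class BidirectionalCrossAttentionFusion(nn.Module):
    def __init__(self, dim: int = 128, num_heads: int = 4, num_classes: int = 2):
        super().__init__()
        # Spatio-temporal tokens query spatio-spectral tokens, and vice versa.
        self.st_to_ss = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.ss_to_st = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm_st = nn.LayerNorm(dim)
        self.norm_ss = nn.LayerNorm(dim)
        # Simple head over the pooled, concatenated fused representations.
        self.classifier = nn.Linear(2 * dim, num_classes)

    def forward(self, st_feat: torch.Tensor, ss_feat: torch.Tensor) -> torch.Tensor:
        # st_feat: (batch, n_st_tokens, dim) from the spatio-temporal branch
        # ss_feat: (batch, n_ss_tokens, dim) from the spatio-spectral branch
        st_attended, _ = self.st_to_ss(query=st_feat, key=ss_feat, value=ss_feat)
        ss_attended, _ = self.ss_to_st(query=ss_feat, key=st_feat, value=st_feat)
        st_out = self.norm_st(st_feat + st_attended)  # residual connection + norm
        ss_out = self.norm_ss(ss_feat + ss_attended)
        fused = torch.cat([st_out.mean(dim=1), ss_out.mean(dim=1)], dim=-1)
        return self.classifier(fused)  # e.g. binary arousal/valence logits


if __name__ == "__main__":
    fusion = BidirectionalCrossAttentionFusion()
    st = torch.randn(8, 32, 128)  # hypothetical spatio-temporal branch output
    ss = torch.randn(8, 20, 128)  # hypothetical spatio-spectral branch output
    print(fusion(st, ss).shape)   # -> torch.Size([8, 2])
```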