Affective brain-computer interfaces: Insights from psychology and neuroscience

  • Jingjing CHEN 1,
  • Xin HU 2,
  • Xinke SHEN 3,
  • Dan ZHANG 4,*
  • 1. School of Education, Tsinghua University, Beijing 100084, China
  • 2. School of Medicine, University of Pittsburgh, Pittsburgh 15261, USA
  • 3. Department of Biomedical Engineering, Southern University of Science and Technology, Shenzhen 518055, China
  • 4. Department of Psychological and Cognitive Sciences, Tsinghua University, Beijing 100084, China

Received date: 2025-07-08

Online published: 2025-12-26


Abstract

Significance: Empowering machines to understand human emotions remains one of the primary challenges in developing artificial intelligence (AI). The affective brain-computer interface (BCI), which decodes emotional states from brain signals, is an emerging field combining psychology, neuroscience, and AI. Brain signals are inherently involuntary, contain rich emotion-specific information, and provide a promising physiological basis for computing systems that support continual emotion monitoring. Since its inception, affective BCI research has required close collaboration across disciplines: it draws on information science for feature engineering and algorithm development, on psychology for theoretical frameworks of emotion, and on neuroscience for revealing the neural mechanisms underlying emotional processes. This demand for multidisciplinary cooperation forms the core focus of this review. Specifically, this paper focuses on the ways in which psychology- and neuroscience-based insights can inspire and advance affective BCI research. Progress: We summarize current progress at three levels: theoretical, technical, and applied. At the theoretical level, recent advances in affective science offer new perspectives for shaping affective BCI paradigms. The traditional discrete and dimensional frameworks have laid the groundwork for emotion decoding but often overlook positive emotions and the dynamic intensity of affective experiences. Recent emotion theories emphasizing refined positive emotions, mixed emotions, and context-dependent emotions provide valuable directions for improving emotion representation. Affective computing should align with these developments, integrating them into computational models to enhance ecological validity. In turn, affective BCI research may also contribute to psychology by offering evidence to test and refine emotion theories, fostering reciprocal progress across disciplines.
At the technical level, neuroscience provides crucial insights for building more robust affective BCIs. Findings on emotional valence lateralization and distributed emotion-associated brain representations can inform the design of models that better capture the complexity of emotional processing. Moreover, research on inter-subject brain synchronization has revealed mechanisms that enhance model generalizability across users, suggesting that incorporating neuroscientific findings can substantially improve the performance and reliability of affective BCIs. At the application level, affective BCIs are expanding beyond emotion recognition toward understanding emotion-related individual differences. Variability between individuals, often treated as noise, may instead offer meaningful information about personality traits or mental health conditions. In the long term, the goal of affective BCI systems may evolve from accurately identifying emotions to comprehensively understanding each individual's psychological tendencies and dynamic affective patterns from multimodal neural and behavioral data. We advocate stronger integration between affective BCI technologies and practical domains; such integration allows practical demands to drive technological development, ensuring that affective BCI remains human-centered. Conclusions and Prospects: Finally, we discuss the technical challenges of affective BCI, including extending algorithms from controlled laboratory settings to real-world scenarios, advancing sensor technology for more convenient and reliable brain-signal acquisition, and leveraging large models to enhance affective BCI performance. In particular, we emphasize the vital role of ethical considerations: as affective BCIs move from passive emotion detection toward active emotional support or intervention, the responsibility of humans as rational moral agents in a future era of man-computer symbiosis must be considered, ensuring the autonomy of human emotions.

Cite this article

Jingjing CHEN, Xin HU, Xinke SHEN, Dan ZHANG. Affective brain-computer interfaces: Insights from psychology and neuroscience[J]. Journal of Tsinghua University (Science and Technology), 2025, 65(12): 2341-2350. DOI: 10.16511/j.cnki.qhdxxb.2025.21.049

AI pioneer Minsky famously observed: "The question is not whether intelligent machines can have emotions, but whether machines can be intelligent without emotions" [1]. This remark affirms the central value of emotion in the architecture of intelligent systems: as a key dimension of human cognition, emotion not only drives high-level functions such as decision making, social interaction, and adaptive learning, but is also a necessary foundation for truly human-like intelligent interaction. Against this background, Professor Picard at MIT pioneered the field of "affective computing" [2]. The establishment of this interdisciplinary field marked the point at which researchers from information science, cognitive science, psychology, and related disciplines began to systematically explore how to endow computers with the ability to recognize, understand, express, and even simulate emotions, laying a critical foundation for a new generation of emotionally intelligent human-computer interaction paradigms.
The affective brain-computer interface (affective BCI), an emerging direction within affective computing, recognizes emotional states by directly decoding brain activity and offers distinctive technical advantages [3-4]. Collecting emotion-related cues is the prerequisite for affective computing. Traditional approaches based on audiovisual information such as facial expressions and speech are easy to acquire and technologically mature, but their cues come from directly observable external behavior. On the one hand, external behavior can be deliberately suppressed or disguised (the "poker face" phenomenon) [5]; on the other hand, it may be intermittent, so continuous recognition would require the user to keep speaking or emoting. Methods based on peripheral physiological signals such as heart rate and skin conductance enable continuous monitoring [6], but these signals mainly reflect emotional arousal and are limited in emotion-specific recognition. By contrast, central nervous system signals such as electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS) [7-8] offer three notable advantages. First, brain activity is inherently involuntary and thus highly resistant to disguise. Second, neural signals carry rich emotion-specific information that can distinguish different emotional states. Third, ongoing central nervous activity provides a reliable physiological basis for continuous monitoring of emotional states. These properties make the affective BCI, which takes central nervous signals as its source of emotional cues, an ideal technical route for probing the neural mechanisms of emotion and developing new affective computing systems.
The affective BCI has been interdisciplinary since its inception. On the one hand, its computational architecture follows a typical supervised learning paradigm: emotion-labeled neural data (e.g., EEG, fNIRS) are collected, a mapping between neural features and emotion labels is established, and after model training and parameter optimization the system performs real-time emotion decoding on newly acquired brain signals. Progress in affective BCI has therefore benefited greatly from advances in machine learning and deep learning. On the other hand, the recognition target of the affective BCI is emotion itself. As a complex psychophysiological phenomenon, the subjective experience of emotion and its neural representation exhibit pronounced individual variability and context dependence, which poses unique challenges for conventional machine learning paradigms in affective BCI applications. This cross-dimensional complexity dictates that affective BCI research must take a deeply interdisciplinary path [9]: it requires continuous innovation in feature engineering and algorithm architecture from information science, theoretical frameworks of emotion from psychology, and the elucidation of the neural mechanisms of emotional processing from neuroscience. The necessity of such multidisciplinary collaboration is the central thesis of this review.
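The supervised decoding pipeline described above can be sketched in a few lines. The toy example below uses synthetic data, simple band-power features, and a nearest-centroid classifier; these choices are illustrative stand-ins for brevity, not taken from the studies cited here, and real systems use richer features and stronger models.

```python
import numpy as np

rng = np.random.default_rng(0)

def bandpower_features(epoch, sfreq=128.0):
    """Mean log power per channel in classic EEG bands, via FFT."""
    freqs = np.fft.rfftfreq(epoch.shape[-1], d=1.0 / sfreq)
    psd = np.abs(np.fft.rfft(epoch, axis=-1)) ** 2
    bands = [(4, 8), (8, 13), (13, 30), (30, 45)]  # theta, alpha, beta, gamma
    feats = [np.log(psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1) + 1e-12)
             for lo, hi in bands]
    return np.concatenate(feats)  # length = n_channels * n_bands

# Toy "labeled recording": 40 epochs (4 channels x 256 samples), 2 emotion
# labels; class-1 epochs get larger amplitude so the classes are separable.
labels = rng.integers(0, 2, 40)
X = np.stack([bandpower_features(rng.standard_normal((4, 256)) * (1 + y))
              for y in labels])

# "Training": map each emotion label to its mean feature vector (nearest centroid)
centroids = {c: X[labels == c].mean(axis=0) for c in (0, 1)}

def predict(feat):
    """Decode an epoch's features as the label of the nearest centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(feat - centroids[c]))

train_acc = np.mean([predict(x) == y for x, y in zip(X, labels)])
```

The three steps, feature extraction, fitting the feature-label mapping, and decoding new signals, mirror the paradigm described in the text.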
This review discusses the field at the theoretical, technical, and application levels in turn, and then examines the future development of affective BCIs from an interdisciplinary perspective.

1 Emotion psychology reshapes the computational paradigm

The core task of an affective BCI is to establish a stable, separable mapping between different emotional states and neural data. Establishing this mapping first requires a basic framework of emotion that provides a label system for quantitative emotion annotation. Quantifying human subjective experience is a major research problem in psychology, and, emotion being one of the most important subjective experiences, emotion labeling has been studied extensively and influentially. In affective BCI practice, researchers typically adopt one representative theory from emotion psychology as the annotation framework. For example, the discrete framework represented by Ekman's basic emotion theory [10] (Fig. 1a) describes emotion with a finite set of discrete categories (e.g., anger, sadness, joy), whereas the dimensional framework represented by Russell's core affect theory [11] (Fig. 1b) quantifies emotional experience along continuous axes such as calm-excited (arousal) and negative-positive (valence). For an affective BCI following the supervised learning paradigm, the emotion annotation framework is both the starting point of computational modeling and the target of model recognition, so early findings in emotion psychology have profoundly shaped the computational paradigm of affective BCIs. At present, the vast majority of studies design and optimize algorithms on the basis of discrete or dimensional annotation frameworks [12-15].
Fig. 1 Representative emotion annotation frameworks
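As a concrete illustration of how the two frameworks become label systems for supervised learning, the sketch below represents one trial under both schemes; the valence-arousal coordinates are illustrative placements for this example, not normative values from the cited theories.

```python
# Discrete scheme: each trial gets one categorical label
discrete_label = "joy"  # e.g. one of Ekman's basic emotion categories

# Dimensional scheme: the same trial as (valence, arousal) in [-1, 1];
# the coordinates below are illustrative placements, not normative values.
VA_SPACE = {
    "joy":     ( 0.8,  0.5),
    "anger":   (-0.7,  0.8),
    "sadness": (-0.6, -0.4),
    "calm":    ( 0.3, -0.7),
}

valence, arousal = VA_SPACE[discrete_label]
quadrant = (("high" if arousal > 0 else "low") + "-arousal / "
            + ("positive" if valence > 0 else "negative") + "-valence")
```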
However, current applications of these annotation frameworks have limitations. For instance, affective BCI studies based on the discrete framework usually address only a few basic emotion categories, whereas emotional states in everyday life are far richer [16-17]; a limited set of basic categories cannot cover real-world needs and restricts the applicability of the resulting affective computing models. Meanwhile, the discrete framework focuses on distinguishing between categories and gives limited attention to dynamic changes of intensity within a category. The valence-arousal dimensional theory offers, in principle, a continuous quantitative emotion space, but practical applications often fail to exploit this granularity, concentrating mainly on distinguishing high versus low valence and high versus low arousal.
In recent years, emotion psychology has extended classical theory with respect to fine-grained positive emotions, mixed (co-occurring) emotion structures, and context specificity, gradually forming frameworks with greater real-world explanatory power. Whether a discrete or a dimensional framework is adopted, existing computational paradigms have room to integrate these new findings. The discrete framework chiefly suffers from insufficient attention to positive emotions. Among the six basic emotions proposed by Ekman et al., only "joy" qualifies as positive, yet Ref. [18] shows that people experience positive emotions more often than negative ones in daily life. This imbalance in the current taxonomy limits the effectiveness of affective computing applications and has drawn scholarly attention: positive psychology calls on researchers to recognize the distinctive value of positive emotions and to refine their categories [19], and empirical neuroscience has distinguished multiple positive emotions from neural features, confirming the feasibility of fine-grained positive emotion recognition [20-22]. Developing computational paradigms for fine-grained positive emotion recognition should therefore be a priority for affective BCIs. For the dimensional framework, the single valence dimension assumes that positive and negative are opposite, mutually exclusive poles of one axis [23], but Ref. [24] shows that human emotional experience can be mixed: positive and negative emotions can be felt simultaneously. For example, when one's football team narrowly loses, one may feel sadness and anger alongside joy at, and admiration for, the players' effort. Researchers have begun to address this issue: recent work using frameworks such as emotion distribution learning [25] has achieved preliminary EEG-based mixed emotion recognition, and a mixed-emotion dataset has been released to support further exploration [26]. Studies of naturalistic settings in particular should consider the possible co-occurrence of positive and negative emotions and, by reconstructing them as two independent dimensions, enable accurate modeling and recognition of mixed emotional states.
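The reconstruction of valence as two independent dimensions can be made concrete in a small sketch; the class names and the threshold below are hypothetical illustrations, not an established coding scheme.

```python
from dataclasses import dataclass

@dataclass
class BipolarValence:
    """Classic single axis: positive and negative are opposite poles."""
    valence: float  # -1 (fully negative) .. +1 (fully positive)

@dataclass
class UnipolarAffect:
    """Two independent axes, so co-occurring emotions are representable."""
    positive: float  # 0 .. 1
    negative: float  # 0 .. 1

    @property
    def is_mixed(self) -> bool:
        # Hypothetical threshold: both axes clearly activated at once
        return self.positive > 0.5 and self.negative > 0.5

# The bittersweet state after a narrow defeat of one's team: sadness and
# anger together with joy at the players' effort. This is inexpressible
# on a single bipolar axis but natural with two unipolar axes.
bittersweet = UnipolarAffect(positive=0.7, negative=0.8)
```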
In addition, although emotion theory has generally pursued frameworks that generalize across contexts, the context specificity of emotional experience is becoming a research focus [27]. Empirical studies show that the typical emotions in different contexts (including both emotion-elicitation tasks and ordinary tasks in which emotions arise spontaneously) can differ: video watching can elicit 27 emotion categories including admiration, sympathy, distress, amusement, awe, and embarrassment, whereas music listening yields 13 categories such as triumph and dreaminess [16, 28]; compared with interpersonal interaction, human-computer interaction produces negative emotions more frequently and with distinctive mixed patterns [29]; and academic settings feature emotions such as boredom and anxiety [30-31]. These findings demonstrate the context dependence of emotion, indicating that emotional experience varies with the situation across different events. They also suggest that, beyond developing cross-context frameworks, researchers may focus on emotion recognition within specific contexts. Exploratory studies of contextualized emotion categories (e.g., confusion in academic settings [32], "moe" in media communication [33], and flow in gaming [34]) provide an important basis for context-adapted computational paradigms and support the needs of concrete application scenarios [35].
Compared with machine learning tasks such as image recognition that have a relative "gold standard", a defining feature of the affective BCI is that the emotion annotation system itself remains uncertain and incomplete, with ample room for extension. This affects both the quality of training labels and the evaluation of recognition performance. Resolving this issue depends on psychology's progress toward a general theoretical framework of emotion. Application-oriented affective BCI research need not stall because the question remains open; it can select a descriptive framework suited to the specific research context [2]. At the same time, affective BCI researchers should closely follow the emerging consensus in emotion psychology and integrate new advances to extend and restructure the computational paradigm. This disciplinary synergy is bidirectional: fine-grained emotion taxonomies proposed by psychology (e.g., mixed emotions and context-specific emotions) can broaden the emotional coverage of BCIs and advance their practical deployment, while BCI research can, from a computational perspective, provide interdisciplinary evidence for validating and refining emotion theories.

2 Neuroscience inspires computational model architectures

That brain activity carries emotion-related information is the basic assumption underlying affective BCI models: there must be a specific mapping between neural activity and emotional states, i.e., particular patterns of brain activity correspond to particular emotions. Only under this premise can researchers recognize emotional states from neural activity [9, 36]. Bidirectional affective BCIs aimed at emotion regulation or intervention impose an even stronger requirement: neural activity must be not merely a concomitant of emotional experience but its material substrate. Testing these assumptions is precisely the province of affective neuroscience [37]. Moreover, because neural signals are the information source of affective BCIs, making good use of this distinctive data type requires deep knowledge of its characteristics. The development of computational modeling for affective BCIs therefore depends on neuroscience's elucidation of the neural mechanisms of emotion.
Recent advances in the neuroscience of emotion have provided important biological grounding for computational architecture design. For example, the phenomenon of emotional valence lateralization [38-40] has informed feature engineering and model design in affective BCIs. Building on this finding, researchers have introduced lateralization-aware features such as left-right hemispheric power differences [41], bilateral hemispheric interaction features [42], hemispheric asymmetry features [43], and fine-grained spectral differences between hemispheres [44], significantly improving emotion recognition performance. Likewise, findings on the distributed representation of emotion [45-48] have prompted researchers to incorporate global brain information and to explore the potential of graph networks, yielding a series of emotion decoding methods based on brain network representations with strong recognition results [49-53]. Feature engineering and model architectures grounded in neuroscientific findings are thus of substantial value for improving affective BCI performance.
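As an illustration of lateralization-aware feature engineering, the sketch below computes two asymmetry features widely used in the EEG emotion literature, differential asymmetry (left minus right log band power) and rational asymmetry (left-to-right power ratio), on toy data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy log band-power features: each channel has 2 values (e.g. alpha, beta)
channels = ["F3", "F4", "C3", "C4"]  # left/right frontal and central sites
log_power = {ch: rng.normal(size=2) for ch in channels}
pairs = [("F3", "F4"), ("C3", "C4")]  # (left, right) symmetric electrode pairs

# Differential asymmetry (DASM): left minus right log band power
dasm = np.concatenate([log_power[l] - log_power[r] for l, r in pairs])

# Rational asymmetry (RASM): left-to-right ratio of (linear) band power
rasm = np.concatenate([np.exp(log_power[l]) / np.exp(log_power[r])
                       for l, r in pairs])
```

Since linear power is the exponential of log power, RASM here equals exp(DASM) exactly, which serves as a built-in sanity check for the two definitions.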
Meanwhile, neuroscience research on inter-subject brain synchronization has advanced the generalizability of affective BCIs. In 2004, Hasson et al. [54] found that when different subjects watched the same film, their brains showed widely distributed, inter-subject consistent responses associated with emotionally arousing scenes. Dmochowski et al. [55-56] further showed that inter-subject similarity extracted from EEG reflects emotion-mediated attention and predicts preferences, and Ding et al. [57] found that EEG inter-subject similarity can track changes in valence and arousal in real time. These studies indicate that emotional stimuli evoke consistent neural responses across individuals, providing an empirical basis for learning subject-invariant emotion representations and for domain generalization. Following this idea, Shen et al. [58] proposed CLISA, which uses contrastive learning to align different individuals' latent neural representations under the same emotional videos, achieving cross-subject domain generalization accuracies of 45.7% on the nine-class THU-EP task and 86.4% on the three-class SEED task, the best domain generalization performance reported at the time. Subsequently, MSHCL [59] introduced a hyperbolic loss to model the hierarchical structure of emotions and stimuli, CL-CS [60] strengthened EEG spectral feature extraction, and DAEST [61] introduced a dynamic attention mechanism to model EEG state transitions, further improving performance. These studies demonstrate the potential of representation learning based on neural synchrony for improving cross-subject generalization. In addition, neural signals such as EEG and fNIRS reflect different aspects of brain activity, have distinct spatiotemporal and spectral characteristics, and may express emotional information differently; how to fuse such multimodal information is also an active research topic [62]. For example, Si et al. [63] performed emotion recognition using EEG microstate temporal dynamics, fNIRS spatial patterns, and joint EEG-fNIRS spatiotemporal features, providing complementary multimodal features for the development of affective BCIs.
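The inter-subject synchronization idea underlying this line of work can be illustrated with a minimal inter-subject correlation computation on synthetic time courses (a shared stimulus-driven component plus subject-specific noise); this is a conceptual sketch, not the actual CLISA procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Three "subjects" watch the same video: each time course is a shared
# stimulus-driven component plus subject-specific noise.
T = 200
shared = np.sin(np.linspace(0, 8 * np.pi, T))  # stimulus-driven signal
subjects = [shared + 0.5 * rng.standard_normal(T) for _ in range(3)]

def inter_subject_correlation(tcs):
    """Mean pairwise Pearson correlation across subjects' time courses."""
    n = len(tcs)
    pair_r = [np.corrcoef(tcs[i], tcs[j])[0, 1]
              for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(pair_r))

isc = inter_subject_correlation(subjects)  # high: responses are synchronized
```

The larger the stimulus-driven component relative to idiosyncratic noise, the higher the inter-subject correlation, which is why shared responses offer a handle for subject-invariant representation learning.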
In summary, a deeper understanding of the brain's representation of emotion and of the basic properties of neural signals helps researchers mine the emotional information embedded in brain activity more efficiently and generalizably, and thus build better affective BCI methods. Looking ahead, by further integrating neuroscientific findings, researchers can expect progress in directions such as cross-modal emotion recognition [64] and real-time emotion tracking and feedback [57, 65-66], contributing to practical application goals.

3 Application expansion: from state computing to trait computing

At the application level, affective BCIs initially focused on concrete scenarios such as user experience assessment [67-68] and human-computer interaction [69-70], with the core goal of building universal emotion recognition models from neural signals that generalize to all users. As noted above, under this goal individual differences are typically treated as "noise" that degrades model generalization and accuracy [9]. This "noise" has two main sources. The first is variability in individuals' responses to the same emotional stimulus [71]. For example, if all subjects watch material designed to elicit "sadness" but some do not experience that emotion, the emotion labels themselves are biased, impairing model training. Such variability affects affective BCI tasks that derive emotion labels from standardized stimuli (e.g., standardized affective pictures, sounds, or videos). The second is variability in individuals' ability to report their own emotional experience. Researchers have found that emotional awareness differs across people, and some struggle to identify their emotional states promptly and accurately [72-73], degrading the quality of self-report-based labels. This affects affective computing tasks that rely on subjects' self-reported experience (e.g., Likert-scale ratings of how positive an emotion felt). Furthermore, both sources of variability amplify individual differences in the mapping between subjective experience and neural response, undermining cross-subject transfer in affective BCIs. In recent years, researchers have countered this "noise" with strategies such as cross-subject feature extraction [57], feature normalization [74], transfer learning [75-77], and contrastive learning [58, 78] to improve cross-subject performance.
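Among the normalization strategies mentioned above, per-subject standardization is the simplest to illustrate. The sketch below z-scores each subject's features by that subject's own statistics, in the spirit of (though far simpler than) the cited Personal-zscore approach; the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two subjects performing the same task, with different baselines and scales
subj_a = rng.normal(loc=10.0, scale=2.0, size=(50, 4))  # trials x features
subj_b = rng.normal(loc=-3.0, scale=0.5, size=(50, 4))

def per_subject_zscore(feats):
    """Standardize each feature by the subject's own mean and std."""
    return (feats - feats.mean(axis=0)) / (feats.std(axis=0) + 1e-12)

# After normalization both subjects live on a comparable scale, removing
# subject-specific offsets before pooling data for cross-subject training.
za, zb = per_subject_zscore(subj_a), per_subject_zscore(subj_b)
```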
However, as affective BCIs converge with applied fields such as psychological assessment and psychiatry, researchers have begun to reexamine these "individual differences", recognizing that emotion-related individual variability may itself carry real-world meaning and application value. Psychological research has found that differential responses to the same emotional stimulus can reflect personality traits: individuals higher in neuroticism are more prone to intense negative emotions [79], whereas more extraverted individuals tend to experience stronger positive emotions [80]. Inspired by this, studies have recorded stimulus-evoked EEG responses to effectively predict the Big Five, a representative set of personality traits [81-82]. Similarly, psychiatric research has found that neural responses to emotional stimuli may serve as potential neural markers of mental disorders [83]; for example, patients with depression typically show abnormal EEG/ERP responses to emotional stimuli [84-85]. Refs. [86-88] have built automatic depression recognition systems on such neural responses, with promise for assisting the assessment and screening of mental disorders.
Deep interaction with the application domains of different disciplines can thus accelerate the iteration of the affective BCI methodology itself, strengthening the technology's "depth", while also extending the research scope of affective BCIs from "recognizing the current emotional state" to "recognizing emotion-related individual traits", broadening their applications. In this broadening, individual differences once regarded as modeling nuisances become the core modeling target. A key challenge for such trait-focused affective BCIs lies in experimental paradigm design: how to elicit emotional responses sensitive to the target trait. For instance, to identify depression risk, one might design tasks eliciting positive and negative emotions [89]; to identify social anxiety, task paradigms involving social interaction are appropriate [90]. It should also be stressed that emotional states and emotional traits are not independent: traits shape an individual's state responses across situations, and differential states in the same situation can in turn reflect trait tendencies. Integrating state recognition with trait assessment is therefore another important direction for affective BCIs. Future research could model states and traits jointly at the architectural level, for example via a multi-task learning framework containing both a module that recognizes the current emotional state and a module that assesses trait levels, with shared features between the modules or with each module's output feeding the other. Although direct exploration of this direction in affective BCIs remains limited, successful precedents exist in affective computing with other modalities (e.g., speech and facial expression). For example, one study of facial expression videos used a multi-task deep network to recognize both the emotional states and the personality traits of the people in the videos [91]. Other studies, while not adopting multi-task architectures, have used trait information to improve emotion-state recognition models or, conversely, used emotion recognition tasks to improve trait models, demonstrating the mutual benefit of state prediction and trait recognition. One speech emotion recognition study incorporated individuals' emotional granularity as a model parameter, effectively improving recognition accuracy [92]; another multimodal (speech and text) study incorporated personality traits into modeling and improved emotion recognition performance [93]; and a study identifying personality traits from facial expression videos used an emotion-guided encoder during model pretraining, improving personality recognition accuracy [94].
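The proposed multi-task architecture, a shared encoder feeding a state head and a trait head, can be sketched structurally as follows; all dimensions and weights here are hypothetical placeholders for illustrating the information flow, not a tested model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Shared encoder feeding two heads: a momentary emotion-state classifier
# and a stable trait regressor. Dimensions are arbitrary placeholders.
D_IN, D_HID, N_STATES = 32, 16, 3

W_enc = 0.1 * rng.standard_normal((D_IN, D_HID))        # shared encoder
W_state = 0.1 * rng.standard_normal((D_HID, N_STATES))  # state head
w_trait = 0.1 * rng.standard_normal(D_HID)              # trait head (scalar)

def forward(x):
    h = np.tanh(x @ W_enc)            # shared representation for both heads
    state_logits = h @ W_state        # e.g. scores over discrete emotion states
    trait_score = float(h @ w_trait)  # e.g. a personality or risk dimension
    return state_logits, trait_score

logits, trait = forward(rng.standard_normal(D_IN))
```

In a trained version, both heads' losses would backpropagate into the shared encoder, which is how state and trait supervision can inform each other.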
In summary, the affective BCI can serve not only as a state monitoring tool but also as a trait recognition and assessment system. Extending affective BCIs from emotional state recognition to emotional trait recognition promises new technical routes and theoretical support for personalized human-computer interaction, mental disorder identification, and intelligent psychological intervention. In the long run, the goal of affective BCI systems may be not merely to "recognize emotions" more accurately, but to understand an individual's psychological tendencies and their dynamics from multimodal neural and behavioral cues. This shift will also move affective BCIs from passive, reactive recognition toward active, comprehension-based perception and prediction, laying a deeper and more human-centered foundation of emotional intelligence for AI.

4 Future research prospects

Interdisciplinary interaction among psychology, neuroscience, computer science, and related fields provides essential support for the development of affective BCI technology (Fig. 2) and points the way for future research.
Fig. 2 Conceptual framework of the affective BCI from an interdisciplinary perspective
First, affective BCI technology itself needs further improvement. Current research concentrates on laboratory settings in which emotions are elicited by video watching, whereas real-world applications involve more complex emotional dynamics, such as fluctuations during social interaction, reflection, or recollection. How current algorithms can generalize effectively to complex, variable, everyday naturalistic settings still requires exploration. Recently, foundation models pretrained on large-scale EEG corpora have made important progress: EEG representations learned by self-supervised learning over large pooled datasets generalize across downstream tasks, creating new opportunities for emotion recognition models that generalize across contexts [95]. Integrating the latest advances in large models to improve affective BCI performance will be an important research direction. Meanwhile, affective BCIs depend on acquiring neural data, and the design and optimization of sensor hardware have become a major bottleneck for practical deployment. Although wearable neuroimaging devices are developing rapidly and innovations such as in-ear electrodes have greatly improved usability [96], the balance between cost and performance still needs optimization, and solving these problems will require sustained investment from the materials and electronics fields.
Second, the affective BCI is in essence a general-purpose technology: it allows users to express their emotional states in real time without relying on conventional interaction channels, and thus stands to contribute in any application domain concerned with individual emotional states. However, turning a general-purpose technology into solutions for real-world problems requires deep interaction between the technology and its application domains, with practical needs driving technological deepening and extension. The discussion in Section 3 of extending affective BCIs from personal states to trait recognition is a case in point. At the same time, the shift from passive emotion sensing to active emotional support is another important opportunity for expanding affective BCI applications. For everyday scenarios, Ref. [66] achieved emotion regulation by providing audiovisual feedback based on real-time emotion recognition. As the empathetic dialogue capabilities of large models strengthen further, researchers can expect high-quality personalized closed-loop feedback in settings such as education, serving regulation and enhancement needs; designing effective feedback strategies will, of course, require thorough study of each specific scenario. Beyond audiovisual feedback, direct intervention on emotion-related brain regions via electrical or magnetic stimulation to build closed-loop affective BCIs is also an important research topic [97]. Ref. [98] used implanted electrodes to detect a symptom-specific biomarker of depression and built a biomarker-driven closed-loop therapy with local electrical stimulation, improving treatment-resistant depressive symptoms and validating the feasibility of this approach.
Finally, the rapid development of AI is bringing within reach the concept of "man-computer symbiosis" proposed by Licklider [99] in 1960. As an important tool for deciphering the mysteries of the human mind, BCI technology may play a major role in realizing such symbiosis [100], and has prompted in-depth ethical discussion among stakeholders including researchers, users, and government agencies [101-103]. Discussions of personal privacy, data security, and related concerns apply equally to affective BCIs. In particular, emotion, as a core component of human subjective experience, drives high-level functions such as decision making and social interaction. Ref. [104] argues that closed-loop brain stimulation could interfere with the autonomous decision processes of the emotion system and deprive a person of their status as a rational moral agent. Although such research currently focuses on patients with treatment-resistant depression whose emotion systems are dysfunctional, the question of human responsibility as rational moral agents does need thorough discussion before an era of man-computer symbiosis arrives, in order to protect and promote human emotional agency.

5 Conclusions

As a quintessentially interdisciplinary field, affective BCI research has always been driven by multidisciplinary collaboration. This review has surveyed, at the theoretical, technical, and application levels, the ideas and cases through which psychology and neuroscience have enabled key advances in affective BCIs, aiming to inspire future research. Overall, the affective BCI has completed its feasibility validation, demonstrated its distinctive value through extensive empirical studies, and is continually expanding its research scope, exploring application prospects in user experience, human-computer interaction, mental health, and beyond.
The field has also developed a healthy open-source culture, with public datasets and code appearing steadily [15, 105], providing vital support for faster research iteration and for testing the stability and reproducibility of algorithms. With the continued investment of multiple disciplines, the affective BCI field will keep achieving breakthroughs.
References

[1] MINSKY M. The society of mind[M]. New York: Simon and Schuster, 1986.
[2] PICARD R W. Affective computing[M]. Cambridge: The MIT Press, 2000.
[3] WU D R, LU B L, HU B, et al. Affective brain-computer interfaces (aBCIs): A tutorial[J]. Proceedings of the IEEE, 2023, 111(10): 1314-1332.
[4] CHEN H Y, LI J X, HE H H, et al. Toward the construction of affective brain-computer interface: A systematic review[J]. ACM Computing Surveys, 2025, 57(6): 156.
[5] ZHAO G Z, SONG J J, GE Y, et al. Advances in emotion recognition based on physiological big data[J]. Journal of Computer Research and Development, 2016, 53(1): 80-92. (in Chinese)
[6] UDOVIČIĆ G, DEREK J, RUSSO M, et al. Wearable emotion recognition system based on GSR and PPG signals[C]//Proceedings of the 2nd International Workshop on Multimedia for Personal Health and Health Care. Mountain View, USA: Association for Computing Machinery, 2017: 53-59.
[7] ALARCÃO S M, FONSECA M J. Emotions recognition using EEG signals: A survey[J]. IEEE Transactions on Affective Computing, 2019, 10(3): 374-393.
[8] SI X P, HE H, YU J Y, et al. Cross-subject emotion recognition brain-computer interface based on fNIRS and DBJNet[J]. Cyborg and Bionic Systems, 2023, 4: 0045.
[9] HU X, CHEN J J, WANG F, et al. Ten challenges for EEG-based affective computing[J]. Brain Science Advances, 2019, 5(1): 1-20.
[10] EKMAN P. An argument for basic emotions[J]. Cognition and Emotion, 1992, 6(3-4): 169-200.
[11] RUSSELL J A. A circumplex model of affect[J]. Journal of Personality and Social Psychology, 1980, 39(6): 1161-1178.
[12] KOELSTRA S, MUHL C, SOLEYMANI M, et al. DEAP: A database for emotion analysis using physiological signals[J]. IEEE Transactions on Affective Computing, 2012, 3(1): 18-31.
[13] WANG Y, SONG W, TAO W, et al. A systematic review on affective computing: Emotion models, databases, and recent advances[J]. Information Fusion, 2022, 83-84: 19-52.
[14] ZHENG W L, LU B L. Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks[J]. IEEE Transactions on Autonomous Mental Development, 2015, 7(3): 162-175.
[15] JIANG W B, LIU X H, ZHENG W L, et al. SEED-Ⅶ: A multimodal dataset of six basic emotions with continuous labels for emotion recognition[J]. IEEE Transactions on Affective Computing, 2025, 16(2): 969-985.
[16] COWEN A S, KELTNER D. Self-report captures 27 distinct categories of emotion bridged by continuous gradients[J]. Proceedings of the National Academy of Sciences of the United States of America, 2017, 114(38): E7900-E7909.
[17] RUSSELL J A. Culture and the categorization of emotions[J]. Psychological Bulletin, 1991, 110(3): 426-450.
[18] TRAMPE D, QUOIDBACH J, TAQUET M. Emotions in everyday life[J]. PLoS One, 2015, 10(12): e0145450.
[19] FREDRICKSON B L. Positive emotions broaden and build[J]. Advances in Experimental Social Psychology, 2013, 47: 1-53.
[20] HU X, YU J W, SONG M D, et al. EEG correlates of ten positive emotions[J]. Frontiers in Human Neuroscience, 2017, 11: 26.
[21] HU X, ZHUANG C, WANG F, et al. fNIRS evidence for recognizably different positive emotions[J]. Frontiers in Human Neuroscience, 2019, 13: 120.
[22] ZHAO G Z, ZHANG Y L, ZHANG G H, et al. Multi-target positive emotion recognition from EEG signals[J]. IEEE Transactions on Affective Computing, 2023, 14(1): 370-381.
[23] BRIESEMEISTER B B, KUCHINKE L, JACOBS A M. Emotional valence: A bipolar continuum or two independent dimensions?[J]. SAGE Open, 2012, 2(4): 1-12.
[24] BERRIOS R, TOTTERDELL P, KELLETT S. Eliciting mixed emotions: A meta-analysis comparing models, types, and measures[J]. Frontiers in Psychology, 2015, 6: 428.
[25] LIU F, YANG P, SHU Y Z, et al. Emotion dictionary learning with modality attentions for mixed emotion exploration[J]. IEEE Transactions on Affective Computing, 2024, 15(3): 1289-1302.
[26] YANG P, LIU N Q, LIU X G, et al. A multimodal dataset for mixed emotion recognition[J]. Scientific Data, 2024, 11(1): 847.
[27] BARRETT L F. The conceptual act theory: A précis[J]. Emotion Review, 2014, 6(4): 292-297.
[28] COWEN A S, FANG X, SAUTER D, et al. What music makes us feel: At least 13 dimensions organize subjective experiences associated with music across different cultures[J]. Proceedings of the National Academy of Sciences, 2020, 117(4): 1924-1934.
[29] TANG L L, YUAN P J, ZHANG D. Emotional experience during human-computer interaction: A survey[J]. International Journal of Human-Computer Interaction, 2024, 40(8): 1845-1855.
[30] PEKRUN R, LINNENBRINK-GARCIA L. Academic emotions and student engagement[M]//CHRISTENSON S L, RESCHLY A L, WYLIE C. Handbook of Research on Student Engagement. Boston, USA: Springer, 2012: 259-282.
[31] HE W H, LUO H F, ZHANG D, et al. Student's subjective feelings during classroom learning[J]. Learning and Instruction, 2024, 91: 101891.
[32] XU T, WANG J B, ZHANG G T, et al. Confused or not: Decoding brain activity and recognizing confusion in reasoning learning using EEG[J]. Journal of Neural Engineering, 2023, 20(2): 026018.
[33] HU X, JIANG Q L, SONG L, et al. The media effect of "Moe": EEG-based affective computing of the emotions evoked by "Moe" videos[J]. Global Journal of Media Studies, 2021, 8(6): 27-44. (in Chinese)
[34] WU S F, LU Y L, LIEN C J. Detecting students' flow states and their construct through electroencephalogram: Reflective flow experiences, balance of challenge and skill, and sense of control[J]. Journal of Educational Computing Research, 2021, 58(8): 1515-1540.
[35] CHEN J J, WANG F, GAO X R, et al. Brain-computer interface applications in education: Trends and challenges[J]. Science & Technology Review, 2022, 40(12): 90-101. (in Chinese)
[36] FAIRCLOUGH S H. Fundamentals of physiological computing[J]. Interacting with Computers, 2009, 21(1-2): 133-145.
[37] KRAGEL P A, LABAR K S. Decoding the nature of emotion in the brain[J]. Trends in Cognitive Sciences, 2016, 20(6): 444-455.
[38] KELLEY N J, HORTENSIUS R, SCHUTTER D J L G, et al. The relationship of approach/avoidance motivation and asymmetric frontal cortical activity: A review of studies manipulating frontal asymmetry[J]. International Journal of Psychophysiology, 2017, 119: 19-30.
[39] SUTTON S K, DAVIDSON R J. Prefrontal brain asymmetry: A biological substrate of the behavioral approach and inhibition systems[J]. Psychological Science, 1997, 8(3): 204-210.
[40] OCKLENBURG S, PETERBURS J, MUNDORF A. Hemispheric asymmetries in the amygdala: A comparative primer[J]. Progress in Neurobiology, 2022, 214: 102283.
[41] LIN Y P, WANG C H, JUNG T P, et al. EEG-based emotion recognition in music listening[J]. IEEE Transactions on Biomedical Engineering, 2010, 57(7): 1798-1806.
[42] LI Y, WANG L, ZHENG W M, et al. A novel bi-hemispheric discrepancy model for EEG emotion recognition[J]. IEEE Transactions on Cognitive and Developmental Systems, 2021, 13(2): 354-367.
[43] CUI H, LIU A P, ZHANG X, et al. EEG-based emotion recognition using an end-to-end regional-asymmetric convolutional neural network[J]. Knowledge-Based Systems, 2020, 205: 106243.
[44] YAN R F, LU N, YAN Y X, et al. A fine-grained hemispheric asymmetry network for accurate and interpretable EEG-based emotion classification[J]. Neural Networks, 2025, 184: 107127.
[45] LINDQUIST K A, SATPUTE A B, WAGER T D, et al. The brain basis of positive and negative affect: Evidence from a meta-analysis of the human neuroimaging literature[J]. Cerebral Cortex, 2016, 26(5): 1910-1922.
[46] WAGER T D, KANG J, JOHNSON T D, et al. A Bayesian model of category-specific emotional brain responses[J]. PLoS Computational Biology, 2015, 11(4): e1004066.
[47] SAARIMÄKI H, EJTEHADIAN L F, GLEREAN E, et al. Distributed affective space represents multiple emotion categories across the human brain[J]. Social Cognitive and Affective Neuroscience, 2018, 13(5): 471-482.
[48] PESSOA L. A network model of the emotional brain[J]. Trends in Cognitive Sciences, 2017, 21(5): 357-371.
[49] LI P Y, LIU H, SI Y J, et al. EEG based emotion recognition by combining functional connectivity network and local activations[J]. IEEE Transactions on Biomedical Engineering, 2019, 66(10): 2869-2881.
[50] WU X, ZHENG W L, LI Z Y, et al. Investigating EEG-based functional connectivity patterns for multimodal emotion recognition[J]. Journal of Neural Engineering, 2022, 19(1): 016012.
[51] SONG T F, ZHENG W M, SONG P, et al. EEG emotion recognition using dynamical graph convolutional neural networks[J]. IEEE Transactions on Affective Computing, 2020, 11(3): 532-541.
[52] LI C B, TANG T, PAN Y, et al. An efficient graph learning system for emotion recognition inspired by the cognitive prior graph of EEG brain network[J]. IEEE Transactions on Neural Networks and Learning Systems, 2025, 36(4): 7130-7144.
[53] DING Y, TONG C X, ZHANG S L, et al. EmT: A novel transformer for generalized cross-subject EEG emotion recognition[J]. IEEE Transactions on Neural Networks and Learning Systems, 2025, 36(6): 10381-10393.
[54] HASSON U, NIR Y, LEVY I, et al. Intersubject synchronization of cortical activity during natural vision[J]. Science, 2004, 303(5664): 1634-1640.
[55] DMOCHOWSKI J P, SAJDA P, DIAS J, et al. Correlated components of ongoing EEG point to emotionally laden attention: A possible marker of engagement?[J]. Frontiers in Human Neuroscience, 2012, 6: 112.
[56] DMOCHOWSKI J P, BEZDEK M A, ABELSON B P, et al. Audience preferences are predicted by temporal reliability of neural processing[J]. Nature Communications, 2014, 5(1): 4567.
[57] DING Y, HU X, XIA Z Y, et al. Inter-brain EEG feature extraction and analysis for continuous implicit emotion tagging during video watching[J]. IEEE Transactions on Affective Computing, 2021, 12(1): 92-102.
[58] SHEN X K, LIU X G, HU X, et al. Contrastive learning of subject-invariant EEG representations for cross-subject emotion recognition[J]. IEEE Transactions on Affective Computing, 2023, 14(3): 2496-2511.
[59] CHANG J, ZHANG Z X, QIAN Y H, et al. Multi-scale hyperbolic contrastive learning for cross-subject EEG emotion recognition[J]. IEEE Transactions on Affective Computing, 2025, 16(3): 1716-1731.
[60] HU M T, XU D, HE K J, et al. Cross-subject emotion recognition with contrastive learning based on EEG signal correlations[J]. Biomedical Signal Processing and Control, 2025, 104: 107511.
[61] SHEN X K, GAN R M, WANG K X, et al. Dynamic-attention-based EEG state transition modeling for emotion recognition[J/OL]. IEEE Transactions on Affective Computing. (2025-07-29)[2025-07-30]. https://ieeexplore.ieee.org/document/11098886.
[62] QUAN X L, ZENG Z G, JIANG J H, et al. Physiological signals based affective computing: A systematic review[J]. Acta Automatica Sinica, 2021, 47(8): 1769-1784. (in Chinese)
[63] SI X P, HUANG H, YU J Y, et al. EEG microstates and fNIRS metrics reveal the spatiotemporal joint neural processing features of human emotions[J]. IEEE Transactions on Affective Computing, 2024, 15(4): 2128-2138.
[64] GAO C G, UCHITOMI H, MIYAKE Y. Cross-sensory EEG emotion recognition with filter bank Riemannian feature and adversarial domain adaptation[J]. Brain Sciences, 2023, 13(9): 1326.
[65] LIU Y J, YU M J, ZHAO G Z, et al. Real-time movie-induced discrete emotion recognition from EEG signals[J]. IEEE Transactions on Affective Computing, 2018, 9(4): 550-562.
[66] HUANG W C, WU W, LUCAS M V, et al. Neurofeedback training with an electroencephalogram-based brain-computer interface enhances emotion regulation[J]. IEEE Transactions on Affective Computing, 2023, 14(2): 998-1011.
[67] CANO S, ARAUJO N, GUZMAN C, et al. Low-cost assessment of user experience through EEG signals[J]. IEEE Access, 2020, 8: 158475-158487.
[68] FREY J, DANIEL M, CASTET J, et al. Framework for electroencephalography-based evaluation of user experience[C]//Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. San Jose, USA: Association for Computing Machinery, 2016: 2283-2294.
[69] KUMAR N, KUMAR J. Measurement of cognitive load in HCI systems using EEG power spectrum: An experimental study[J]. Procedia Computer Science, 2016, 84: 70-78.
[70] ZHAO M R, GAO H N, WANG W, et al. Research on human-computer interaction intention recognition based on EEG and eye movement[J]. IEEE Access, 2020, 8: 145824-145832.
[71] HAMANN S, CANLI T. Individual differences in emotion processing[J]. Current Opinion in Neurobiology, 2004, 14(2): 233-238.
[72] BODEN M T, THOMPSON R J. Facets of emotional awareness and associations with emotion regulation and depression[J]. Emotion, 2015, 15(3): 399-410.
[73] LANE R D, SCHWARTZ G E. Levels of emotional awareness: A cognitive-developmental theory and its application to psychopathology[J]. The American Journal of Psychiatry, 1987, 144(2): 133-143.
[74] CHEN H Y, SUN S T, LI J X, et al. Personal-zscore: Eliminating individual difference for EEG-based cross-subject emotion recognition[J]. IEEE Transactions on Affective Computing, 2023, 14(3): 2077-2088.
[75] LI J P, QIU S, SHEN Y Y, et al. Multisource transfer learning for cross-subject EEG emotion recognition[J]. IEEE Transactions on Cybernetics, 2020, 50(7): 3281-3293.
[76] LI W, HUAN W, HOU B W, et al. Can emotion be transferred? A review on transfer learning for EEG-based emotion recognition[J]. IEEE Transactions on Cognitive and Developmental Systems, 2022, 14(3): 833-846.
[77] LIN Y P, JUNG T P. Improving EEG-based emotion classification using conditional transfer learning[J]. Frontiers in Human Neuroscience, 2017, 11: 334.
[78] DAI S, LI M, WU X, et al. Contrastive learning of EEG representation of brain area for emotion recognition[J]. IEEE Transactions on Instrumentation and Measurement, 2025, 74: 2506913.
[79] MCCRAE R R, JOHN O P. An introduction to the five-factor model and its applications[J]. Journal of Personality, 1992, 60(2): 175-215.
[80] WATSON D, CLARK L A. Extraversion and its positive emotional core[M]//HOGAN R, JOHNSON J, BRIGGS S. Handbook of Personality Psychology. San Diego, USA: Academic Press, 1997: 767-793.
[81] LI W Y, WU C P, HU X, et al. Quantitative personality predictions from a brief EEG recording[J]. IEEE Transactions on Affective Computing, 2022, 13(3): 1514-1527.
[82] LI W Y, HU X, LONG X F, et al. EEG responses to emotional videos can quantitatively predict big-five personality traits[J]. Neurocomputing, 2020, 415: 368-381.
[83] ABI-DARGHAM A, MOELLER S J, ALI F, et al. Candidate biomarkers in psychiatric disorders: State of the field[J]. World Psychiatry, 2023, 22(2): 236-262.
[84] BYLSMA L M, MORRIS B H, ROTTENBERG J. A meta-analysis of emotional reactivity in major depressive disorder[J]. Clinical Psychology Review, 2008, 28(4): 676-691.
[85] SHESTYUK A Y, DELDIN P J, BRAND J E, et al. Reduced sustained brain activity during processing of positive emotional stimuli in major depression[J]. Biological Psychiatry, 2005, 57(10): 1089-1096.
[86] CAI H S, HAN J S, CHEN Y F, et al. A pervasive approach to EEG-based depression detection[J]. Complexity, 2018, 2018: 5238028.
[87] LI X W, HU B, SUN S T, et al. EEG-based mild depressive detection using feature selection methods and classifiers[J]. Computer Methods and Programs in Biomedicine, 2016, 136: 151-161.
[88] WU C T, DILLON D G, HSU H C, et al. Depression detection using relative EEG power induced by emotionally positive images and a conformal kernel support vector machine[J]. Applied Sciences, 2018, 8(8): 1244.
[89] LAI C H. Promising neuroimaging biomarkers in depression[J]. Psychiatry Investigation, 2019, 16(9): 662-670.
[90] AL-EZZI A, KAMEL N, FAYE I, et al. Review of EEG, ERP, and brain connectivity estimators as predictive biomarkers of social anxiety disorder[J]. Frontiers in Psychology, 2020, 11: 730.
[91] ZHANG L, PENG S Y, WINKLER S. PersEmoN: A deep network for joint analysis of apparent personality, emotion and their relationship[J]. IEEE Transactions on Affective Computing, 2022, 13(1): 298-305.
[92] SHEPSTONE S E, TAN Z H, JENSEN S H. Audio-based granularity-adapted emotion classification[J]. IEEE Transactions on Affective Computing, 2018, 9(2): 176-190.
[93] LI J L, LEE C C. Attention learning with retrievable acoustic embedding of personality for emotion recognition[C]//2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII). Cambridge, UK: IEEE, 2019: 171-177.
[94] SONG S Y, JAISWAL S, SANCHEZ E, et al. Self-supervised learning of person-specific facial dynamics for automatic personality recognition[J]. IEEE Transactions on Affective Computing, 2023, 14(1): 178-195.
[95] JIANG W B, ZHAO L M, LU B L. Large brain model for learning generic representations with tremendous EEG data in BCI[EB/OL]. (2024-05-29)[2025-07-04]. http://arxiv.org/abs/2405.18765.
[96] WANG Z H, SHI N L, ZHANG Y C, et al. Conformal in-ear bioelectronics for visual and auditory brain-computer interfaces[J]. Nature Communications, 2023, 14(1): 4213.
[97] SHANECHI M M. Brain-machine interfaces from motor to mood[J]. Nature Neuroscience, 2019, 22(10): 1554-1564.
[98] SCANGOS K W, KHAMBHATI A N, DALY P M, et al. Closed-loop neuromodulation in an individual with treatment-resistant depression[J]. Nature Medicine, 2021, 27(10): 1696-1700.
[99] LICKLIDER J C R. Man-computer symbiosis[J]. IRE Transactions on Human Factors in Electronics, 1960, HFE-1(1): 4-11.
[100] ZHANG D, LI J W. Exploring the power of mind: Status and prospect of brain-computer interfaces[J]. Science & Technology Review, 2017, 35(9): 62-67. (in Chinese)
[101] YUSTE R, GOERING S, ARCAS B A Y, et al. Four ethical priorities for neurotechnologies and AI[J]. Nature, 2017, 551(7679): 159-163.
[102] IENCA M, HASELAGER P, EMANUEL E J. Brain leaks and consumer neurotechnology[J]. Nature Biotechnology, 2018, 36(9): 805-810.
[103] GORDON E C, SETH A K. Ethical considerations for the use of brain-computer interfaces for cognitive enhancement[J]. PLoS Biology, 2024, 22(10): e3002899.
[104] KE X Y. Reflections on responsibility in emotional brain-computer intelligence[N]. Social Sciences Weekly, 2025-04-10(6). (in Chinese)
[105] CHEN J J, WANG X B, HUANG C, et al. A large finer-grained affective computing EEG dataset[J]. Scientific Data, 2023, 10(1): 740.