Journal of Tsinghua University (Science and Technology), 2020, Vol. 60, Issue 5: 422-429    DOI: 10.16511/j.cnki.qhdxxb.2020.21.002
Special Topic: Computational Linguistics
结合规则蒸馏的情感原因发现
巫继鹏, 鲍建竹, 蓝恭强, 徐睿峰
哈尔滨工业大学(深圳)计算机科学与技术学院, 深圳 518055
Emotion cause extraction using rule distillation
WU Jipeng, BAO Jianzhu, LAN Gongqiang, XU Ruifeng
School of Computer Science and Technology, Harbin Institute of Technology (Shenzhen), Shenzhen 518055, China
Full text: PDF (2916 KB)
Export: BibTeX | EndNote (RIS)
Abstract: Most existing deep learning methods for emotion cause extraction do not model the latent semantic relationships between the clauses of a text; in addition, their learning processes are hard to control, they are difficult to interpret, and they rely heavily on high-quality annotated data. This paper presents an emotion cause extraction method that combines rule distillation with a hierarchical attention network. The hierarchical attention network uses position encoding and residual connections to capture the latent semantic relationships within each clause and between the clauses and the emotion expression clause. A knowledge distillation framework based on adversarial learning then introduces linguistic rules related to emotion cause expressions into the model, so that the final system combines a deep neural network with linguistic rules. Experiments on a Chinese emotion cause extraction dataset show that this method improves the F1 score by about 0.02 over the previous state-of-the-art method, the best known result.
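The abstract only outlines the architecture, so here is a minimal PyTorch-style sketch of the kind of hierarchical attention network it describes; the framework, module names, layer sizes, and single-layer structure are our illustrative assumptions, not the authors' released implementation. Word-level attention pools each clause into a vector, a relative-position embedding and a residual clause-level attention layer relate the clauses to one another, and each clause is scored as a cause candidate against the emotion clause.

```python
# Minimal sketch of a hierarchical attention network for clause-level emotion
# cause scoring: word-level attention builds clause vectors, relative-position
# embeddings and a residual clause-level attention layer relate each clause to
# the others, and a linear layer scores every clause against the emotion clause.
import torch
import torch.nn as nn


class HierAttnCauseScorer(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=128,
                 max_rel_pos=20, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # word-level encoder + attention pooling -> one vector per clause
        self.word_rnn = nn.GRU(emb_dim, hid_dim // 2, batch_first=True,
                               bidirectional=True)
        self.word_attn = nn.Linear(hid_dim, 1)
        # relative clause position w.r.t. the emotion clause, shifted to be >= 0
        self.pos_embed = nn.Embedding(2 * max_rel_pos + 1, hid_dim)
        self.max_rel_pos = max_rel_pos
        # clause-level multi-head attention wrapped in a residual connection
        self.clause_attn = nn.MultiheadAttention(hid_dim, num_heads,
                                                 batch_first=True)
        self.norm = nn.LayerNorm(hid_dim)
        self.scorer = nn.Linear(2 * hid_dim, 1)  # scores [clause; emotion clause]

    def forward(self, token_ids, rel_pos, emotion_idx):
        """token_ids:   (batch, n_clauses, n_words) word ids of each clause
        rel_pos:     (batch, n_clauses) clause offset from the emotion clause
        emotion_idx: (batch,) index of the emotion clause in each document
        returns:     (batch, n_clauses) one cause logit per clause
        """
        b, c, w = token_ids.shape
        x = self.embed(token_ids.view(b * c, w))              # (b*c, w, emb)
        h, _ = self.word_rnn(x)                               # (b*c, w, hid)
        a = torch.softmax(self.word_attn(h), dim=1)           # word attention
        clauses = (a * h).sum(dim=1).view(b, c, -1)           # (b, c, hid)

        pos = rel_pos.clamp(-self.max_rel_pos, self.max_rel_pos) + self.max_rel_pos
        clauses = clauses + self.pos_embed(pos)               # position encoding

        ctx, _ = self.clause_attn(clauses, clauses, clauses)  # inter-clause attention
        clauses = self.norm(clauses + ctx)                     # residual structure

        emo = clauses[torch.arange(b), emotion_idx]            # emotion clause vectors
        emo = emo.unsqueeze(1).expand_as(clauses)
        return self.scorer(torch.cat([clauses, emo], dim=-1)).squeeze(-1)


if __name__ == "__main__":
    model = HierAttnCauseScorer()
    toks = torch.randint(1, 10000, (2, 5, 12))   # 2 documents, 5 clauses, 12 words
    rel = torch.tensor([[-2, -1, 0, 1, 2], [-2, -1, 0, 1, 2]])
    emo = torch.tensor([2, 2])                   # clause 2 holds the emotion
    print(model(toks, rel, emo).shape)           # torch.Size([2, 5])
```

A companion sketch of the rule-distillation step appears after the article links below.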
Key words: emotion cause extraction; hierarchical attention network; rule distillation
Received: 2019-11-14      Published: 2020-04-26
Corresponding author: XU Ruifeng, professor, E-mail: xuruifeng@hit.edu.cn
Cite this article:
巫继鹏, 鲍建竹, 蓝恭强, 徐睿峰. 结合规则蒸馏的情感原因发现[J]. 清华大学学报(自然科学版), 2020, 60(5): 422-429.
WU Jipeng, BAO Jianzhu, LAN Gongqiang, XU Ruifeng. Emotion cause extraction using rule distillation. Journal of Tsinghua University (Science and Technology), 2020, 60(5): 422-429.
Link to this article:
http://jst.tsinghuajournals.com/CN/10.16511/j.cnki.qhdxxb.2020.21.002  or  http://jst.tsinghuajournals.com/CN/Y2020/V60/I5/422
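As a rough illustration of the rule distillation described in the abstract above, the sketch below is an assumption-laden simplification rather than the paper's training code: a hand-written rule teacher turns a single causal-connective cue (a stand-in for the paper's linguistic rules) into soft labels, and the student clause scorer is trained with a mix of supervised, rule-imitation, and adversarial terms.

```python
# Minimal sketch of rule distillation with an adversarial term: a rule-based
# "teacher" turns simple linguistic cues into soft cause labels and the student
# clause scorer is trained to fit the gold labels, imitate the teacher, and fool
# a small discriminator that tries to tell student scores from teacher scores.
import torch
import torch.nn as nn
import torch.nn.functional as F

CAUSAL_CONNECTIVES = {"因为", "由于"}  # toy rule lexicon ("because", "due to")


def rule_teacher(clause_tokens, rel_pos):
    """Soft cause probabilities from a hand-written rule.
    clause_tokens: list (one entry per clause) of token strings for a document
    rel_pos:       (n_clauses,) tensor of clause offsets from the emotion clause
    """
    probs = []
    for toks, offset in zip(clause_tokens, rel_pos.tolist()):
        has_cue = any(t in CAUSAL_CONNECTIVES for t in toks)
        # rule: a clause adjacent to the emotion clause with a causal connective
        probs.append(0.9 if has_cue and abs(offset) <= 1 else 0.1)
    return torch.tensor(probs)


# tiny discriminator over per-clause cause probabilities
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))


def student_loss(student_logits, teacher_probs, gold, alpha=0.5):
    """Supervised + rule-imitation + adversarial terms for the student."""
    p_student = torch.sigmoid(student_logits)
    supervised = F.binary_cross_entropy_with_logits(student_logits, gold.float())
    imitation = F.binary_cross_entropy(p_student, teacher_probs)  # match the rules
    # adversarial term: make student scores look like teacher scores to the
    # discriminator (the discriminator itself would be trained separately)
    d_out = discriminator(p_student.unsqueeze(-1)).squeeze(-1)
    adversarial = F.binary_cross_entropy_with_logits(d_out, torch.ones_like(d_out))
    return (1 - alpha) * supervised + alpha * imitation + 0.1 * adversarial


if __name__ == "__main__":
    clauses = [["他", "很", "难过"], ["因为", "比赛", "输", "了"], ["大家", "安慰", "他"]]
    rel = torch.tensor([0, 1, 2])        # clause 0 is the emotion clause
    gold = torch.tensor([0, 1, 0])       # clause 1 is the annotated cause
    teacher_probs = rule_teacher(clauses, rel)
    student_logits = torch.randn(3, requires_grad=True)  # stand-in for model output
    loss = student_loss(student_logits, teacher_probs, gold)
    loss.backward()
    print(float(loss))
```

In a full adversarial setup the discriminator would be updated in alternating steps with the opposite objective; the mixing weight alpha and the 0.1 weight on the adversarial term are arbitrary placeholders that control how strongly the rules constrain the network.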