Journal of Tsinghua University (Science and Technology), 2023, Vol. 63, Issue (9): 1326-1338    DOI: 10.16511/j.cnki.qhdxxb.2023.21.009
Section: Big Data
Chinese-oriented entity recognition method of character vocabulary combination sequence
WANG Qingren, WANG Yinzi, ZHONG Hong, ZHANG Yiwen
College of Computer Science and Technology, Anhui University, Hefei 230093, China
Full text: PDF (5316 KB)
Abstract: As the core task of information extraction, named entity recognition (NER) identifies named entities of different types in text. Thanks to the application of deep learning to character and word representation and to feature extraction, Chinese NER has produced rich research results. However, the task still suffers from a lack of lexical information, mainly in three respects: 1) word-boundary information and contextual semantic information are not fully exploited; 2) the semantic information between characters and their self-matching words is not effectively captured; 3) the differing importance of the information from different interaction graphs in the output of graph attention networks is not considered. This paper proposes a Chinese-oriented entity recognition method based on character-word combination sequences. A character-word combination sequence embedding structure fully captures word-boundary information and the semantic information between characters and words, and a multigraph attention fusion architecture differentiates the importance of the features extracted by different graph neural networks. Experiments show that, compared with existing classical methods, the proposed method clearly improves F1 on the Weibo, Resume, OntoNotes4.0, and MSRA datasets, demonstrating its feasibility for Chinese NER.
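The gains reported above are entity-level F1 scores. As a point of reference, here is a minimal sketch of how entity-level (span) F1 is typically computed for NER; the paper's exact matching rules are not stated here, so the tuple representation and micro-averaging below are illustrative assumptions:

```python
def entity_f1(gold, pred):
    """Entity-level micro F1. Entities are (type, start, end) tuples;
    a prediction counts as correct only on an exact type-and-span match."""
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

gold = [("LOC", 0, 3), ("LOC", 3, 7)]
pred = [("LOC", 0, 3), ("ORG", 3, 7)]   # second entity has the wrong type
print(round(entity_f1(gold, pred), 2))  # → 0.5
```

Because matching is exact, a boundary or type error costs both a false positive and a false negative, which is why span F1 is a strict metric for NER.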
Abstract: [Objective] As the core task of information extraction, named entity recognition (NER) identifies named entities of various types in text. Chinese NER has benefited from the application of deep learning to character and word representation, feature extraction, and other aspects, achieving rich results. However, the task still faces a shortage of lexical information, which has been regarded as one of the primary impediments to building a high-performance Chinese NER system. Although an automatically constructed dictionary contains rich word-boundary and word-semantic information, integrating this word knowledge into Chinese NER remains challenging: the semantic information of self-matching words and of their context must be effectively fused into the Chinese characters. Furthermore, although graph neural networks can extract features from various character-word interaction graphs, how to fuse those features into the original input sequence according to the importance of each interaction graph remains unsolved. [Methods] This paper proposes a Chinese-oriented entity recognition method based on character-word combination sequences. (1) First, the method introduces a character-word combination sequence embedding structure, which primarily replaces characters in the character sequence under consideration with their self-matching words. To make full use of the self-matching words, we also construct a sequence for them and vectorize both the words and the characters.
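The "self-matching words" above are lexicon words whose character span covers a given character in the sentence. A simple dictionary scan illustrates the idea; this is an illustrative assumption — the paper builds its lexicon and interaction graphs with its own procedure:

```python
def self_matching_words(sentence, lexicon, max_len=4):
    """Return every lexicon word occurring in the sentence as
    (word, start, end) with a half-open character span [start, end)."""
    matches = []
    n = len(sentence)
    for i in range(n):
        # try every candidate substring starting at i, up to max_len characters
        for j in range(i + 1, min(i + max_len, n) + 1):
            if sentence[i:j] in lexicon:
                matches.append((sentence[i:j], i, j))
    return matches

lexicon = {"南京", "南京市", "长江", "长江大桥", "大桥"}
print(self_matching_words("南京市长江大桥", lexicon))
```

For the classic example sentence "南京市长江大桥" (Nanjing Yangtze River Bridge), the character "江" is covered by the self-matching words "长江" and "长江大桥"; it is exactly this overlap of candidate words that the combination-sequence embedding is designed to exploit.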
At the encoding level, we obtain the context information of the character sequence, the word sequence, and the character-word combination sequence with a BiLSTM model, and then fuse the information of the words in the character-word combination sequence into the corresponding words in the word sequence. A graph neural network then extracts features from the different character-word interaction graphs so that the enhanced word information can be integrated into the characters. This not only makes full use of word-boundary information but also injects the context of the self-matching word sequence into the characters while capturing the semantic information between characters and words, further enriching the character features. Finally, a conditional random field decodes and labels the entities. (2) Because different character-word interaction graphs are not equally important to the original input character sequence, the method introduces a multigraph attention fusion structure: it scores the correlation between the character sequence and the information of each interaction graph, differentiates the structural features by importance, and fuses the interaction-graph information into the character sequence in proportion to these scores. [Results] The F1 value of the new method exceeded that of the baseline methods on the Weibo, Resume, OntoNotes4.0, and MSRA datasets by 3.17% (Weibo_all), 1.21%, 1.33%, and 0.43%, respectively, verifying the feasibility of the method for Chinese NER. [Conclusions] The experiments show that the proposed method is more effective than the original method.
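The multigraph attention fusion step can be pictured as scoring each interaction graph's features against the character sequence, normalizing the scores across graphs, and taking a weighted sum. The sketch below is an assumed simplification (a dot-product relevance score and a softmax over graphs), not the paper's exact formulation:

```python
import numpy as np

def multigraph_fusion(char_feats, graph_feats):
    """char_feats: (T, d) character-sequence features.
    graph_feats: list of G arrays, each (T, d), one per interaction graph.
    Returns the fused (T, d) features and the G attention weights."""
    # relevance of each graph's features to the character sequence
    scores = np.array([float(np.sum(char_feats * g)) for g in graph_feats])
    # softmax over the G graphs, so the weights reflect relative importance
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # fuse the graphs in proportion to their weights
    fused = sum(w * g for w, g in zip(weights, graph_feats))
    return fused, weights
```

A graph whose features align more strongly with the character sequence receives a larger weight, which is the behavior the abstract describes: interaction-graph information is merged into the character sequence "based on their proportions" rather than uniformly.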
Key words: natural language processing; named entity recognition; graph attention network; character-word combination embedding; multigraph attention
Received: 2023-03-20      Published online: 2023-08-19
Supported by: Key Program of the National Natural Science Foundation of China (U1936220); Young Scientists Fund of the National Natural Science Foundation of China (62006003)
Corresponding author: ZHONG Hong, Professor, E-mail: zhongh@ahu.edu.cn
About the author: WANG Qingren (1986-), male, lecturer.
Cite this article:
WANG Qingren, WANG Yinzi, ZHONG Hong, ZHANG Yiwen. Chinese-oriented entity recognition method of character vocabulary combination sequence. Journal of Tsinghua University(Science and Technology), 2023, 63(9): 1326-1338.
Link to this article:
http://jst.tsinghuajournals.com/CN/10.16511/j.cnki.qhdxxb.2023.21.009  or  http://jst.tsinghuajournals.com/CN/Y2023/V63/I9/1326
Copyright © Editorial Office of Journal of Tsinghua University (Science and Technology)