[1] 琚生根, 李天宁, 孙界平. 基于关联记忆网络的中文细粒度命名实体识别[J]. 软件学报, 2021, 32(8): 2545-2556. JU S G, LI T N, SUN J P. Chinese fine-grained named entity recognition based on associated memory networks[J]. Journal of Software, 2021, 32(8): 2545-2556. (in Chinese)
[2] SUN J, GAO J F, ZHANG L, et al. Chinese named entity identification using class-based language model[C]// COLING 2002: The 19th International Conference on Computational Linguistics. Taipei, China: Association for Computational Linguistics, 2002: 1-7.
[3] BUNESCU R, MOONEY R. A shortest path dependency kernel for relation extraction[C]// Proceedings of Human Language Technology Conference and Conference on Empirical Methods in Natural Language Processing. Vancouver, Canada: Association for Computational Linguistics, 2005: 724-731.
[4] 叶育鑫, 薛环, 王璐, 等. 基于带噪观测的远监督神经网络关系抽取[J]. 软件学报, 2020, 31(4): 1025-1038. YE Y X, XUE H, WANG L, et al. Distant supervision neural network relation extraction based on noisy observation[J]. Journal of Software, 2020, 31(4): 1025-1038. (in Chinese)
[5] CHEN Y B, XU L H, LIU K, et al. Event extraction via dynamic multi-pooling convolutional neural networks[C]// Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Beijing, China: Association for Computational Linguistics, 2015: 167-176.
[6] 贺瑞芳, 段绍杨. 基于多任务学习的中文事件抽取联合模型[J]. 软件学报, 2019, 30(4): 1015-1030. HE R F, DUAN S Y. Joint Chinese event extraction based on multi-task learning[J]. Journal of Software, 2019, 30(4): 1015-1030. (in Chinese)
[7] MOLLÁ D, VAN ZAANEN M, SMITH D. Named entity recognition for question answering[C]// Proceedings of Australasian Language Technology Workshop 2006. Sydney, Australia: ALTA, 2006: 51-58.
[8] BOSSELUT A, RASHKIN H, SAP M, et al. COMET: Commonsense transformers for automatic knowledge graph construction[C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence, Italy: Association for Computational Linguistics, 2019: 4762-4779.
[9] 杨东华, 何涛, 王宏志, 等. 面向知识图谱的图嵌入学习研究进展[J]. 软件学报, 2022, 33(9): 3370-3390. YANG D H, HE T, WANG H Z, et al. Survey on knowledge graph embedding learning[J]. Journal of Software, 2022, 33(9): 3370-3390. (in Chinese)
[10] 王鑫, 邹磊, 王朝坤, 等. 知识图谱数据管理研究综述[J]. 软件学报, 2019, 30(7): 2139-2174. WANG X, ZOU L, WANG C K, et al. Research on knowledge graph data management: A survey[J]. Journal of Software, 2019, 30(7): 2139-2174. (in Chinese)
[11] YANG J, TENG Z Y, ZHANG M S, et al. Combining discrete and neural features for sequence labeling[C]// 17th International Conference on Computational Linguistics and Intelligent Text Processing. Konya, Turkey: Springer, 2018: 140-154.
[12] HE H F, SUN X. A unified model for cross-domain and semi-supervised named entity recognition in Chinese social media[C]// Proceedings of the 31st AAAI Conference on Artificial Intelligence. San Francisco, USA: AAAI Press, 2017: 3216-3222.
[13] LI H B, HAGIWARA M, LI Q, et al. Comparison of the impact of word segmentation on name tagging for Chinese and Japanese[C]// Proceedings of the Ninth International Conference on Language Resources and Evaluation. Reykjavik, Iceland: LREC, 2014: 2532-2536.
[14] HE J Z, WANG H F. Chinese named entity recognition and word segmentation based on character[C]// Proceedings of the Sixth SIGHAN Workshop on Chinese Language Processing. Hyderabad, India: ACL, 2008: 128-132.
[15] ZHANG Y, YANG J. Chinese NER using lattice LSTM[C]// Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Melbourne, Australia: Association for Computational Linguistics, 2018: 1554-1564.
[16] SUI D B, CHEN Y B, LIU K, et al. Leverage lexical knowledge for Chinese named entity recognition via collaborative graph network[C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Hong Kong, China: Association for Computational Linguistics, 2019: 3830-3840.
[17] 胡滨, 耿天玉, 邓赓, 等. 基于知识蒸馏的高效生物医学命名实体识别模型[J]. 清华大学学报(自然科学版), 2021, 61(9): 936-942. HU B, GENG T Y, DENG G, et al. Faster biomedical named entity recognition based on knowledge distillation[J]. Journal of Tsinghua University (Science and Technology), 2021, 61(9): 936-942. (in Chinese)
[18] 谭红叶, 郑家恒, 刘开瑛. 基于变换的中国地名自动识别研究[J]. 软件学报, 2001, 12(11): 1608-1613. TAN H Y, ZHENG J H, LIU K Y. Research on method of automatic recognition of Chinese place name based on transformation[J]. Journal of Software, 2001, 12(11): 1608-1613. (in Chinese)
[19] TSAI T H, WU S H, LEE C W, et al. Mencius: A Chinese named entity recognizer using the maximum entropy-based hybrid model[J]. International Journal of Computational Linguistics & Chinese Language Processing, 2004, 9(1): 65-82.
[20] ISOZAKI H, KAZAWA H. Efficient support vector classifiers for named entity recognition[C]// COLING 2002: The 19th International Conference on Computational Linguistics. Taipei, China: Association for Computational Linguistics, 2002: 1-7.
[21] BIKEL D M, MILLER S, SCHWARTZ R, et al. Nymble: A high-performance learning name-finder[C]// Proceedings of the Fifth Conference on Applied Natural Language Processing. Washington, USA: Association for Computational Linguistics, 1997: 194-201.
[22] LAFFERTY J D, MCCALLUM A, PEREIRA F C N. Conditional random fields: Probabilistic models for segmenting and labeling sequence data[C]// Proceedings of the Eighteenth International Conference on Machine Learning. San Francisco, USA: Morgan Kaufmann Publishers Inc., 2001: 282-289.
[23] 尹学振, 赵慧, 赵俊保, 等. 多神经网络协作的军事领域命名实体识别[J]. 清华大学学报(自然科学版), 2020, 60(8): 648-655. YIN X Z, ZHAO H, ZHAO J B, et al. Multi-neural network collaboration for Chinese military named entity recognition[J]. Journal of Tsinghua University (Science and Technology), 2020, 60(8): 648-655. (in Chinese)
[24] MA X Z, HOVY E. End-to-end sequence labeling via Bi-directional LSTM-CNNs-CRF[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). Berlin, Germany: Association for Computational Linguistics, 2016: 1064-1074.
[25] CHIU J P C, NICHOLS E. Named entity recognition with bidirectional LSTM-CNNs[J]. Transactions of the Association for Computational Linguistics, 2016, 4: 357-370.
[26] COLLOBERT R, WESTON J, BOTTOU L, et al. Natural language processing (almost) from scratch[J]. Journal of Machine Learning Research, 2011, 12: 2493-2537.
[27] YAN H, SUN Y, LI X N, et al. An embarrassingly easy but strong baseline for nested named entity recognition[J/OL]. (2022-09-15)[2023-03-20]. https://arxiv.org/abs/2208.04534.
[28] LIU L Y, SHANG J B, REN X, et al. Empower sequence labeling with task-aware neural language model[C]// Proceedings of the 32nd AAAI Conference on Artificial Intelligence. New Orleans, USA: AAAI Press, 2018: 5253-5260.
[29] HUANG Z H, XU W, YU K. Bidirectional LSTM-CRF models for sequence tagging[J/OL]. (2015-08-09)[2023-03-20]. https://arxiv.org/abs/1508.01991.
[30] LAMPLE G, BALLESTEROS M, SUBRAMANIAN S, et al. Neural architectures for named entity recognition[C]// Proceedings of 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. San Diego, USA: Association for Computational Linguistics, 2016: 260-270.
[31] GUI T, MA R T, ZHANG Q, et al. CNN-based Chinese NER with lexicon rethinking[C]// Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence. Macao, China: IJCAI, 2019: 4982-4988.
[32] ZHANG Y, WALLACE B. A sensitivity analysis of (and practitioners' guide to) convolutional neural networks for sentence classification[C]// Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers). Taipei, China: Asian Federation of Natural Language Processing, 2017: 253-263.
[33] LIU P F, QIU X P, HUANG X J. Recurrent neural network for text classification with multi-task learning[C]// Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence. New York, USA: AAAI Press, 2016: 2873-2879.
[34] LIU W, XU T G, XU Q H, et al. An encoding strategy based word-character LSTM for Chinese NER[C]// Proceedings of 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). Minneapolis, USA: Association for Computational Linguistics, 2019: 2379-2389.
[35] CHO K, VAN MERRIËNBOER B, BAHDANAU D, et al. On the properties of neural machine translation: Encoder-decoder approaches[C]// 8th Workshop on Syntax, Semantics and Structure in Statistical Translation. Doha, Qatar: Association for Computational Linguistics, 2014: 103-111.
[36] STRUBELL E, VERGA P, BELANGER D, et al. Fast and accurate entity recognition with iterated dilated convolutions[C]// Proceedings of 2017 Conference on Empirical Methods in Natural Language Processing. Copenhagen, Denmark: Association for Computational Linguistics, 2017: 2670-2680.
[37] VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks[C]// 6th International Conference on Learning Representations. Vancouver, Canada: ICLR, 2018.
[38] GRAVES A. Supervised sequence labelling with recurrent neural networks[M]. Berlin, Germany: Springer, 2012.
[39] NIE Y Y, TIAN Y H, SONG Y, et al. Improving named entity recognition with attentive ensemble of syntactic information[C]// Findings of the Association for Computational Linguistics: EMNLP 2020. Online: Association for Computational Linguistics, 2020: 4231-4245.
[40] VITERBI A. Error bounds for convolutional codes and an asymptotically optimum decoding algorithm[J]. IEEE Transactions on Information Theory, 1967, 13(2): 260-269.
[41] WEISCHEDEL R, PALMER M, MARCUS M, et al. OntoNotes release 4.0[EB/OL]. (2011-02-15)[2023-03-20]. https://doi.org/10.35111/gfjf-7r50.
[42] LEVOW G A. The third international Chinese language processing bakeoff: Word segmentation and named entity recognition[C]// Proceedings of the Fifth SIGHAN Workshop on Chinese Language Processing. Sydney, Australia: Association for Computational Linguistics, 2006: 108-117.
[43] PENG N Y, DREDZE M. Named entity recognition for Chinese social media with jointly trained embeddings[C]// Proceedings of 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon, Portugal: Association for Computational Linguistics, 2015: 548-554.
[44] LI S, ZHAO Z, HU R F, et al. Analogical reasoning on Chinese morphological and semantic relations[C]// Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers). Melbourne, Australia: Association for Computational Linguistics, 2018: 138-143.
[45] SRIVASTAVA N, HINTON G, KRIZHEVSKY A, et al. Dropout: A simple way to prevent neural networks from overfitting[J]. The Journal of Machine Learning Research, 2014, 15(1): 1929-1958.
[46] KINGMA D P, BA J. Adam: A method for stochastic optimization[C]// 3rd International Conference on Learning Representations. San Diego, USA: ICLR, 2015.
[47] MA R T, PENG M L, ZHANG Q, et al. Simplify the usage of lexicon in Chinese NER[C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Online: Association for Computational Linguistics, 2020: 5951-5960.
[48] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. Long Beach, USA: Curran Associates Inc., 2017: 6000-6010.
[49] LI X N, YAN H, QIU X P, et al. FLAT: Chinese NER using flat-lattice transformer[C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Online: Association for Computational Linguistics, 2020: 6836-6842.