
Performance comparison of neural machine translation systems in Uyghur-Chinese translation

  • Tsinghua National Laboratory for Information Science and Technology, State Key Laboratory of Intelligent Technology and Systems, Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China

Received date: 2017-02-23

  Online published: 2017-08-15


Abstract

Neural machine translation based on deep learning has significantly surpassed traditional statistical machine translation on many language pairs and has become the mainstream machine translation technology. This paper analyzes and compares six influential neural machine translation methods, distinguished by their word granularity, on the Uyghur-Chinese translation task: the attention-based method (GroundHog), vocabulary expansion (LV-groundhog), subword units for both source and target (subword-nmt), mixed word-character representations (nmt.hybrid), subword-to-character translation (dl4mt-cdec), and fully character-level translation (dl4mt-c2c). The experimental results show that the method that segments the source language into subword units and represents the target language as characters (dl4mt-cdec) performs best on Uyghur-Chinese neural machine translation. This paper is the first to apply neural machine translation to Uyghur-Chinese machine translation, and the first to compare different neural machine translation methods on the same corpus. This work is an important reference both for Uyghur-Chinese machine translation and for further research on neural machine translation.
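The subword approach evaluated here (subword-nmt) is based on byte-pair encoding (BPE): starting from characters, the most frequent adjacent symbol pair in the training corpus is merged repeatedly, so frequent words become single units while rare words decompose into smaller pieces. A minimal sketch of the idea follows; it is an illustration only, not the actual subword-nmt tool, and it omits that tool's end-of-word markers and frequency thresholds.

```python
from collections import Counter

def learn_bpe(words, num_merges):
    """Learn BPE merge operations from a dict mapping words to corpus frequencies.

    Each word starts as a tuple of characters; on every iteration the most
    frequent adjacent symbol pair is merged into a single new symbol."""
    vocab = {tuple(w): f for w, f in words.items()}
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for sym, freq in vocab.items():
            for a, b in zip(sym, sym[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        merged = {}
        for sym, freq in vocab.items():
            out, i = [], 0
            while i < len(sym):
                if i < len(sym) - 1 and (sym[i], sym[i + 1]) == best:
                    out.append(sym[i] + sym[i + 1])  # merge the best pair
                    i += 2
                else:
                    out.append(sym[i])
                    i += 1
            merged[tuple(out)] = freq
        vocab = merged
    return merges

def apply_bpe(word, merges):
    """Segment one word by replaying the learned merges in order."""
    sym = list(word)
    for a, b in merges:
        out, i = [], 0
        while i < len(sym):
            if i < len(sym) - 1 and sym[i] == a and sym[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(sym[i])
                i += 1
        sym = out
    return sym

# Toy corpus: "lowest" never occurs, yet it segments into known subwords.
corpus = {"low": 5, "lower": 2, "newest": 6, "widest": 3}
merges = learn_bpe(corpus, 10)
print(apply_bpe("lowest", merges))  # → ['low', 'est']
```

An unseen word is thus covered by the subword vocabulary instead of becoming an out-of-vocabulary token, which is why the approach suits a morphologically rich source language such as Uyghur; dl4mt-cdec pairs this source-side segmentation with a character-level decoder on the target side.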

Cite this article

ABUDUKELIMU Halidanmu, LIU Yang, SUN Maosong. Performance comparison of neural machine translation systems in Uyghur-Chinese translation[J]. Journal of Tsinghua University (Science and Technology), 2017, 57(8): 878-883. DOI: 10.16511/j.cnki.qhdxxb.2017.22.054

