Conference Topic

Adapting Attention-based Neural Network to Low-resource Mongolian-Chinese Machine Translation

  Neural machine translation (NMT) has shown very promising results for resource-rich language pairs such as En-Fr and En-De. This success partly relies on the availability of large-scale, high-quality parallel corpora. We investigate how to adapt NMT to very low-resource Mongolian-Chinese machine translation by introducing an attention mechanism, sub-word translation, monolingual data, and an NMT correction model. We propose a sub-word model to address the out-of-vocabulary (OOV) problem in the attention-based NMT model, and use monolingual data to help alleviate the low-resource problem. In addition, we explore a Chinese NMT correction model to further enhance translation performance. Experiments show that the adapted attention-based Mongolian-Chinese NMT system obtains an improvement of 1.70 BLEU points over a phrase-based statistical machine translation baseline and 3.86 BLEU points over a standard NMT baseline on an open training set.
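  The abstract does not detail the sub-word segmentation scheme used to handle OOV words. A common choice for this purpose in NMT is byte-pair encoding (BPE), which learns a vocabulary of frequent symbol merges from training data; the minimal Python sketch below illustrates the idea on a toy vocabulary. The toy corpus, function names, and the number of merge operations are illustrative assumptions, not taken from the paper.

import re
from collections import Counter

def get_pair_counts(vocab):
    """Count adjacent symbol pairs across a vocabulary of space-separated symbol strings."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    """Merge every occurrence of the given symbol pair into a single new symbol."""
    pattern = re.compile(r"(?<!\S)" + re.escape(" ".join(pair)) + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in vocab.items()}

# Toy vocabulary (hypothetical): each word is a character sequence plus an end-of-word marker.
vocab = {"l o w </w>": 5, "l o w e r </w>": 2, "n e w e s t </w>": 6, "w i d e s t </w>": 3}
for _ in range(10):  # learn 10 merge operations
    pairs = get_pair_counts(vocab)
    if not pairs:
        break
    best = max(pairs, key=pairs.get)
    vocab = merge_pair(best, vocab)
    print("merged:", best)

  At decoding time, a rare or unseen word is segmented into these learned sub-word units, so it can still be translated piece by piece rather than mapped to an unknown-word token.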

Low-resource Mongolian-Chinese machine translation; SMT; NMT

Jing Wu, Hongxu Hou, Zhipeng Shen, Jian Du, Jinting Li

College of Computer Science, Inner Mongolia University, Hohhot, China

International Conference

The Fifth Conference on Natural Language Processing and Chinese Computing (NLPCC-ICCPOL 2016)

Kunming

English

1-10

2016-12-02 (date first posted on the Wanfang platform; not the paper's publication date)