This is a brief summary of the paper Improving Neural Machine Translation Models with Monolingual Data (Sennrich et al., ACL 2016), written to organize what I read and studied.
This paper proposes a new data augmentation method they call back-translation.
They note that NMT has obtained state-of-the-art performance for several language pairs, and that target-side monolingual data plays an important role in boosting fluency for phrase-based statistical machine translation.
So, they investigated the use of monolingual data for NMT.
They experiment with two different methods to fill the source side of a monolingual training instance.
One is to use a dummy source sentence, i.e., they pair monolingual sentences on the target side with a single-word dummy source side, so that both parallel and monolingual training examples can be processed with the same network graph.
The other is to use a source sentence obtained via back-translation: a synthetic source sentence produced by automatically translating the target sentence into the source language.
They find that the latter is more effective.
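To make the two methods concrete, here is a minimal sketch (not the authors' code) of how monolingual target-side sentences could be turned into extra training pairs; the `reverse_model` object and the loader in the usage comment are hypothetical stand-ins.

```python
# Sketch of the two ways to fill the source side of monolingual training instances,
# assuming a hypothetical target->source NMT model `reverse_model` with a
# translate(sentence) -> sentence method.

def augment_with_dummy_source(monolingual_targets, dummy_token="<null>"):
    """Method 1: pair each monolingual target sentence with a single-word dummy source."""
    return [(dummy_token, tgt) for tgt in monolingual_targets]

def augment_with_back_translation(monolingual_targets, reverse_model):
    """Method 2: pair each monolingual target sentence with a synthetic source
    obtained by automatically translating it back into the source language."""
    return [(reverse_model.translate(tgt), tgt) for tgt in monolingual_targets]

# Usage (hypothetical): mix the synthetic pairs with the true parallel data
# and train the NMT model on the combined set as if it were all parallel data.
# parallel_pairs  = load_parallel_corpus("train.en-de")          # hypothetical loader
# synthetic_pairs = augment_with_back_translation(mono_en, en_to_de_reverse_model)
# training_data   = parallel_pairs + synthetic_pairs
```

The point of both methods is that no change to the network architecture is needed; only the training data is augmented.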
Note(Abstract):
Neural Machine Translation (NMT) has obtained state-of-the-art performance for several language pairs, while only using parallel data for training. Target-side monolingual data plays an important role in boosting fluency for phrase-based statistical machine translation, and they investigate the use of monolingual data for NMT. In contrast to previous work, which combines NMT models with separately trained language models, they note that encoder-decoder NMT architectures already have the capacity to learn the same information as a language model, and they explore strategies to train with monolingual data without changing the neural network architecture. By pairing monolingual training data with an automatic back-translation, they can treat it as additional parallel training data, and they obtain substantial improvements on the WMT 15 task English↔German (+2.8–3.7 BLEU), and for the low-resourced IWSLT 14 task Turkish→English (+2.1–3.4 BLEU), obtaining new state-of-the-art results. They also show that fine-tuning on in-domain monolingual and parallel data gives substantial improvements for the IWSLT 15 task English→German.
Reference
- Paper: Improving Neural Machine Translation Models with Monolingual Data (Sennrich et al., ACL 2016)
- How to use HTML for alerts