Neural Machine Translation by Minimising the Bayes-risk with Respect to Syntactic Translation Lattices

“Neural Machine Translation by Minimising the Bayes-risk with Respect to Syntactic Translation Lattices” by Felix Stahlberg, Adrià de Gispert, Eva Hasler, and Bill Byrne. In Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics, 2017.

Abstract

We present a novel scheme to combine neural machine translation (NMT) with traditional statistical machine translation (SMT). Our approach borrows ideas from linearised lattice minimum Bayes-risk decoding for SMT. The NMT score is combined with the Bayes-risk of the translation according to the SMT lattice. This makes our approach much more flexible than n-best list or lattice rescoring, as the neural decoder is not restricted to the SMT search space. We show an efficient and simple way to integrate risk estimation into the NMT decoder which is suitable for word-level as well as subword-unit-level NMT. We test our method on English-German and Japanese-English and report significant gains over lattice rescoring on several data sets for both single and ensembled NMT. The MBR decoder produces entirely new hypotheses far beyond simply rescoring the SMT search space or fixing UNKs in the NMT output.
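The score combination the abstract describes can be sketched roughly as follows. This is a minimal illustration, not the paper's exact formulation: the n-gram posterior table, the interpolation weight `lam`, and the function names are all assumptions made for the example; in the paper the posteriors come from an SMT translation lattice and the combination happens inside NMT beam search.

```python
import math

# Hypothetical n-gram posteriors, standing in for quantities that would
# be extracted from an SMT translation lattice: p(g) is the probability
# mass of lattice paths containing the n-gram g.
ngram_posteriors = {
    ("the",): 0.9,
    ("house",): 0.8,
    ("the", "house"): 0.7,
}

def mbr_gain(hypothesis, posteriors, max_order=2):
    """Linearised MBR gain of a hypothesis: sum of lattice n-gram
    posteriors over every n-gram the hypothesis contains (a common
    linearisation; details here are illustrative)."""
    gain = 0.0
    for n in range(1, max_order + 1):
        for i in range(len(hypothesis) - n + 1):
            gram = tuple(hypothesis[i:i + n])
            gain += posteriors.get(gram, 0.0)
    return gain

def combined_score(nmt_logprob, hypothesis, posteriors, lam=0.5):
    # Interpolate the NMT log-probability with the lattice-derived gain
    # (equivalently, the negated Bayes-risk); lam is a free weight.
    return nmt_logprob + lam * mbr_gain(hypothesis, posteriors)

hyp = ["the", "house"]
score = combined_score(math.log(0.6), hyp, ngram_posteriors)
```

Because the decoder scores hypotheses with this combined objective rather than rescoring a fixed list, it can propose strings that never appear in the SMT search space, which is the flexibility the abstract emphasises.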

BibTeX entry:

@inproceedings{eacl17:nmtmbr,
   author = {Felix Stahlberg and Adri{\`a} de Gispert and Eva Hasler and
	Bill Byrne},
   title = {Neural Machine Translation by Minimising the Bayes-risk with
	Respect to Syntactic Translation Lattices},
   booktitle = {Proceedings of the 15th Conference of the European Chapter
	of the Association for Computational Linguistics},
   year = {2017},
   url = {https://arxiv.org/abs/1612.03791}
}

Back to Bill Byrne publications.