Hi, everyone,
Junyoung Chung will talk today about recent work on character-level neural machine translation:

"A Character-Level Decoder without Explicit Segmentation for Neural Machine Translation"
Who: Junyoung Chung
When: Fri. May 6th, 14:30
Where: AA3195
Abstract: Existing machine translation systems, whether phrase-based or
neural, have relied almost exclusively on word-level modelling with
explicit segmentation. In this paper, we ask a fundamental question: can
neural machine translation generate a character sequence without any
explicit segmentation? To answer this question, we evaluate an
attention-based encoder-decoder with a subword-level encoder and a
character-level decoder on four language pairs (En-Cs, En-De, En-Ru, and
En-Fi) using the parallel corpora from WMT'15. Our experiments show that
the models with a character-level decoder outperform those with a
subword-level decoder on all four language pairs. Furthermore, ensembles
of neural models with a character-level decoder outperform the
state-of-the-art non-neural machine translation systems on En-Cs, En-De,
and En-Fi, and perform comparably on En-Ru.
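For intuition ahead of the talk, below is a minimal sketch (in PyTorch) of the core idea: the decoder emits the target sentence one character at a time from a small character vocabulary, so no target-side word segmentation is needed. This is not the authors' code; the class name, dimensions, and the plain GRU cell are illustrative simplifications standing in for the paper's attention-based decoder.

```python
# Illustrative character-level decoder sketch (PyTorch), NOT the paper's
# implementation. A plain GRU stands in for the paper's decoder to show the
# key point: the target side is modelled character by character, so no
# explicit word segmentation is required.
import torch
import torch.nn as nn

class CharDecoder(nn.Module):
    def __init__(self, char_vocab_size, emb_dim=64, hid_dim=128, ctx_dim=128):
        super().__init__()
        self.embed = nn.Embedding(char_vocab_size, emb_dim)
        # Input at each step: previous character embedding + source context
        # (in the paper the context comes from attention over the encoder).
        self.gru = nn.GRU(emb_dim + ctx_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, char_vocab_size)

    def forward(self, prev_chars, context, hidden=None):
        # prev_chars: (batch, steps) character ids; context: (batch, ctx_dim)
        emb = self.embed(prev_chars)
        ctx = context.unsqueeze(1).expand(-1, emb.size(1), -1)
        output, hidden = self.gru(torch.cat([emb, ctx], dim=-1), hidden)
        return self.out(output), hidden  # logits over the next character

# Toy usage: greedy decoding, one character at a time, from an (untrained)
# model with a fixed stand-in context vector.
vocab = ["<pad>", "<bos>", "<eos>"] + list("abcdefghijklmnopqrstuvwxyz ")
decoder = CharDecoder(len(vocab))
context = torch.zeros(1, 128)   # stands in for the encoder/attention context
token = torch.tensor([[1]])     # <bos>
hidden = None
for _ in range(20):
    logits, hidden = decoder(token, context, hidden)
    token = logits[:, -1].argmax(-1, keepdim=True)
    if token.item() == 2:       # <eos>
        break
```

One consequence worth noting, which the talk should make concrete: a character vocabulary is tiny (dozens of symbols rather than tens of thousands of words), which removes the out-of-vocabulary problem on the target side at the cost of much longer output sequences.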
---