
Search in the Catalogues and Directories

Hits 1 – 4 of 4

1. Scientific Credibility of Machine Translation Research: A Meta-Evaluation of 769 Papers (BASE)
2. Synthesizing Parallel Data of User-Generated Texts with Zero-Shot Neural Machine Translation (BASE)
3. Softmax Tempering for Training Neural Machine Translation Models (BASE)
Dabre, Raj; Fujita, Atsushi. arXiv, 2020
4. Exploiting Out-of-Domain Parallel Data through Multilingual Transfer Learning for Low-Resource Neural Machine Translation (BASE)
Abstract: This paper proposes a novel multilingual multistage fine-tuning approach for low-resource neural machine translation (NMT), taking a challenging Japanese-Russian pair for benchmarking. Although there are many solutions for low-resource scenarios, such as multilingual NMT and back-translation, we have empirically confirmed their limited success when restricted to in-domain data. We therefore propose to exploit out-of-domain data through transfer learning: we first train a multilingual NMT model on it and then apply multistage fine-tuning on in-domain parallel and back-translated pseudo-parallel data. Our approach, which combines domain adaptation, multilingualism, and back-translation, improves translation quality by more than 3.7 BLEU points over a strong baseline in this extremely low-resource scenario. (Accepted at the 17th Machine Translation Summit.)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.1907.03060
https://arxiv.org/abs/1907.03060
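
The abstract above describes a three-stage recipe: train a multilingual NMT model on large out-of-domain data, fine-tune it on the small in-domain parallel corpus, then fine-tune again on a mixture of in-domain and back-translated pseudo-parallel data. The following is a minimal sketch of that ordering only; the class, function names (ToyNMTModel, train, back_translate), and toy datasets are hypothetical stand-ins, not the paper's code or any specific toolkit's API.

```python
# Hypothetical sketch of the multistage fine-tuning pipeline described in the abstract.

class ToyNMTModel:
    """Stand-in for a multilingual NMT model; records what it was trained on."""
    def __init__(self):
        self.history = []

    def update(self, src, tgt):
        self.history.append((src, tgt))      # one "gradient step" per sentence pair (stub)

    def translate(self, tgt_sentence):
        return f"<back-translation of: {tgt_sentence}>"   # stub translation

def train(model, parallel_corpus, epochs=1):
    """Train (or continue training) on a parallel corpus."""
    for _ in range(epochs):
        for src, tgt in parallel_corpus:
            model.update(src, tgt)
    return model

def back_translate(model, monolingual_targets):
    """Create pseudo-parallel pairs by translating target-side monolingual text."""
    return [(model.translate(t), t) for t in monolingual_targets]

# Toy data standing in for the corpora named in the abstract.
out_of_domain_multilingual = [("src-xx", "tgt-yy")] * 4   # large, out-of-domain, several pairs
in_domain_ja_ru = [("ja-sent", "ru-sent")]                # small in-domain Japanese-Russian data
in_domain_ru_mono = ["ru-mono-sent"]                      # in-domain monolingual Russian text

# Stage 1: multilingual NMT training on out-of-domain data.
model = train(ToyNMTModel(), out_of_domain_multilingual, epochs=2)

# Stage 2: first fine-tuning pass on the in-domain parallel data.
model = train(model, in_domain_ja_ru)

# Stage 3: build back-translated pseudo-parallel data, then fine-tune on the mixture.
pseudo_parallel = back_translate(model, in_domain_ru_mono)
model = train(model, in_domain_ja_ru + pseudo_parallel)

print(len(model.history), "sentence-pair updates applied across the three stages")
```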

Hits by source: Catalogues 0; Bibliographies 0; Linked Open Data catalogues 0; Online resources 0; Open access documents 4