1. SMDT: Selective Memory-Augmented Neural Document Translation
3. Zero-shot Cross-lingual Transfer of Prompt-based Tuning with a Unified Multilingual Prompt
4. Towards Making the Most of Multilingual Pretraining for Zero-Shot Neural Machine Translation
5. Zero-shot Cross-lingual Transfer of Neural Machine Translation with Multilingual Pretrained Encoders
6. Multilingual Machine Translation Systems from Microsoft for WMT21 Shared Task
Yang, Jian; Ma, Shuming; Huang, Haoyang; Zhang, Dongdong; Dong, Li; Huang, Shaohan; Muzio, Alexandre; Singhal, Saksham; Awadalla, Hany Hassan; Song, Xia; Wei, Furu. arXiv, 2021

Abstract: This report describes Microsoft's machine translation systems for the WMT21 shared task on large-scale multilingual machine translation. We participated in all three evaluation tracks, the Large Track and two Small Tracks, where the former is unconstrained and the latter two are fully constrained. Our model submissions to the shared task were initialized with DeltaLM (https://aka.ms/deltalm), a generic pre-trained multilingual encoder-decoder model, and fine-tuned on the vast collected parallel data and the data sources allowed by each track's settings, applying progressive learning and iterative back-translation to further improve performance. Our final submissions ranked first on three tracks in terms of the automatic evaluation metric.

Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences

URL: https://arxiv.org/abs/2111.02086
DOI: https://dx.doi.org/10.48550/arxiv.2111.02086
7. DeltaLM: Encoder-Decoder Pre-training for Language Generation and Translation by Augmenting Pretrained Multilingual Encoders
8. How Does Distilled Data Complexity Impact the Quality and Confidence of Non-Autoregressive Machine Translation?
9. Syntax-aware Data Augmentation for Neural Machine Translation
10. M3P: Learning Universal Representations via Multitask Multilingual Multimodal Pre-training
11. XLM-T: Scaling up Multilingual Machine Translation with Pretrained Cross-lingual Transformer Encoders
12. A Novel Task-Oriented Text Corpus in Silent Speech Recognition and its Natural Language Generation Construction Method
13. Learning Unsupervised Word Mapping by Maximizing Mean Discrepancy