3. AUTOLEX: An Automatic Framework for Linguistic Exploration ... (BASE)
4. MCoNaLa: A Benchmark for Code Generation from Multiple Natural Languages ... (BASE)
5. A Systematic Evaluation of Large Language Models of Code ... (BASE)
6. Expanding Pretrained Models to Thousands More Languages via Lexicon-based Adaptation ... (BASE)
7. Attention-Passing Models for Robust and Data-Efficient End-to-End Speech Translation
   In: Transactions of the Association for Computational Linguistics, 7, 313–325; ISSN: 2307-387X (2019) (BASE)
9. MasakhaNER: Named entity recognition for African languages
   In: Transactions of the Association for Computational Linguistics, The MIT Press, 2021; EISSN: 2307-387X; DOI: ⟨10.1162/tacl⟩; https://hal.inria.fr/hal-03350962 (BASE)
10. Phoneme Recognition through Fine Tuning of Phonetic Representations: a Case Study on Luhya Language Varieties ... (BASE)
11. Few-shot Language Coordination by Modeling Theory of Mind ... (BASE)
12. Systematic Inequalities in Language Technology Performance across the World's Languages ... (BASE)
13. Multilingual Multimodal Pre-training for Zero-Shot Cross-Lingual Transfer of Vision-Language Models ... (BASE)
15. MetaXL: Meta Representation Transformation for Low-resource Cross-lingual Learning ... (BASE)
16. XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation ... (BASE)
17. When Does Translation Require Context? A Data-driven, Multilingual Exploration ... (BASE)
19. Efficient Test Time Adapter Ensembling for Low-resource Language Varieties ... (BASE)
20. Distributionally Robust Multilingual Machine Translation ...

   Abstract: Multilingual neural machine translation (MNMT) learns to translate multiple language pairs with a single model, potentially improving both the accuracy and the memory-efficiency of deployed models. However, the heavy data imbalance between languages hinders the model from performing uniformly across language pairs. In this paper, we propose a new learning objective for MNMT based on distributionally robust optimization, which minimizes the worst-case expected loss over the set of language pairs. We further show how to practically optimize this objective for large translation corpora using an iterated best response scheme, which is both effective and incurs negligible additional computational cost compared to standard empirical risk minimization. We perform extensive experiments on three sets of languages from two datasets and show that our method consistently outperforms strong baseline methods in terms of average and per-language performance under both many-to-one and one-to-many translation settings. Note: Long paper accepted at the EMNLP 2021 main conference.
   Keywords: Artificial Intelligence (cs.AI); Computation and Language (cs.CL); FOS: Computer and information sciences; Machine Learning (cs.LG)
   URL: https://arxiv.org/abs/2109.04020 ; DOI: https://dx.doi.org/10.48550/arxiv.2109.04020
   (BASE)
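The worst-case objective described in entry 20's abstract can be sketched in a few lines: minimize over the model parameters the maximum, over language-pair weights on the simplex, of the weighted loss, alternating a (smoothed) best-response update of the weights with a gradient step on the reweighted loss. The scalar quadratic "losses", centers, learning rate, and softmax temperature below are illustrative assumptions for a toy demonstration, not the paper's actual setup.

```python
import math

# Toy stand-ins for per-language-pair translation losses: quadratics in a
# single scalar parameter theta. The centers are illustrative, not from
# the paper.
CENTERS = [0.0, 2.0, 5.0]

def pair_loss(theta, i):
    """Loss of language pair i at parameter theta."""
    return (theta - CENTERS[i]) ** 2

def pair_grad(theta, i):
    """Derivative of pair_loss with respect to theta."""
    return 2.0 * (theta - CENTERS[i])

def dro_train(steps=2000, lr=0.01, temp=1.0, theta=0.0):
    """Minimize the worst-case pair loss by iterated best response:
    (1) reweight pairs toward the currently worst-off ones -- here a
        softmax over losses, a smoothed best response whose temp -> 0
        limit is the hard argmax of the worst-case objective;
    (2) take a gradient step on the reweighted loss."""
    n = len(CENTERS)
    for _ in range(steps):
        losses = [pair_loss(theta, i) for i in range(n)]
        m = max(losses)  # subtract the max for numerical stability
        w = [math.exp((l - m) / temp) for l in losses]
        q = [x / sum(w) for x in w]                       # best-response weights
        g = sum(q[i] * pair_grad(theta, i) for i in range(n))
        theta -= lr * g                                   # parameter step
    return theta
```

With the centers above, empirical risk minimization (uniform weights) settles at the mean 7/3 ≈ 2.33, whereas the worst-case objective is minimized near 2.5, where the losses of the two extreme "pairs" are balanced; the extra cost per step over ERM is just the reweighting, matching the abstract's claim of negligible overhead.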