1. XTREME-S: Evaluating Cross-lingual Speech Representations
2. mSLAM: Massively multilingual joint pre-training for speech and text
3. Larger-Scale Transformers for Multilingual Masked Language Modeling
4. Multilingual Speech Translation from Efficient Finetuning of Pretrained Models
5. Unsupervised Cross-lingual Representation Learning for Speech Recognition
6. Multilingual Speech Translation with Efficient Finetuning of Pretrained Models
   Li, Xian; Wang, Changhan; Tang, Yun; Tran, Chau; Tang, Yuqing; Pino, Juan; Baevski, Alexei; Conneau, Alexis; Auli, Michael. arXiv, 2020.
   Abstract: We present a simple yet effective approach to build multilingual speech-to-text (ST) translation by efficient transfer learning from pretrained speech encoder and text decoder. Our key finding is that a minimalistic LNA (LayerNorm and Attention) finetuning can achieve zero-shot crosslingual and cross-modality transfer ability by only finetuning less than 10% of the pretrained parameters. This enables effectively leveraging large pretrained models with low training cost. Using wav2vec 2.0 for acoustic modeling, and mBART for multilingual text generation, our approach advanced the new state-of-the-art for 34 translation directions (and surpassing cascaded ST for 23 of them) on large-scale multilingual ST benchmark CoVoST 2 (+6.4 BLEU on average across 15 En-X directions and +5.1 BLEU on average across 19 X-En directions). Our approach demonstrates strong zero-shot performance in a many-to-many multilingual model (+5.7 BLEU on average across 18 non-English directions), making it an appealing approach for ...
   Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
   URL: https://arxiv.org/abs/2010.12829 ; https://dx.doi.org/10.48550/arxiv.2010.12829
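   The LNA recipe this abstract summarizes, unfreezing only the LayerNorm and attention parameters of a pretrained encoder/decoder, is easy to sketch. Below is a minimal, hypothetical PyTorch illustration: the substring matching on parameter names ("layer_norm", "attn") assumes HuggingFace/fairseq-style naming and is not the authors' released code, and the exact parameter subset the paper finetunes (reported as under 10% of all parameters) may differ from what this heuristic selects.

       import torch.nn as nn

       def mark_lna_trainable(model: nn.Module) -> float:
           """Freeze all parameters except LayerNorm and attention ones.

           A sketch of LNA-style selective finetuning; the name-based
           matching is a hypothetical heuristic, not the paper's exact
           parameter selection.
           """
           trainable = total = 0
           for name, param in model.named_parameters():
               keep = any(key in name for key in
                          ("layer_norm", "layernorm", "attn", "attention"))
               param.requires_grad = keep  # only LN/attention get gradients
               total += param.numel()
               trainable += param.numel() if keep else 0
           return trainable / total  # fraction of parameters left trainable

   Passing only the parameters with requires_grad=True to the optimizer, e.g. filter(lambda p: p.requires_grad, model.parameters()), then reproduces the low-training-cost setup the abstract describes.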
7. Unsupervised Cross-lingual Representation Learning at Scale
8. Emerging Cross-lingual Structure in Pretrained Language Models
9. Specializing distributional vectors of all words for lexical entailment
10. What you can cram into a single $&!#* vector: Probing sentence embeddings for linguistic properties
   In: ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Jul 2018, Melbourne, Australia, pp. 2126-2136. https://hal.archives-ouvertes.fr/hal-01898412
11. XNLI: Evaluating Cross-lingual Sentence Representations
12. What you can cram into a single $&!#* vector: Probing sentence embeddings for linguistic properties
13. Very Deep Convolutional Networks for Text Classification
   In: European Chapter of the Association for Computational Linguistics (EACL'17), 2017, Valencia, Spain. https://hal.archives-ouvertes.fr/hal-01454940
15. What you can cram into a single $&!#* vector: probing sentence embeddings for linguistic properties