
Search in the Catalogues and Directories

Hits 1 – 20 of 38

1. XTREME-S: Evaluating Cross-lingual Speech Representations
2. One Country, 700+ Languages: NLP Challenges for Underrepresented Languages and Dialects in Indonesia
3. Expanding Pretrained Models to Thousands More Languages via Lexicon-based Adaptation
4. MasakhaNER: Named entity recognition for African languages
In: Transactions of the Association for Computational Linguistics, The MIT Press, 2021. EISSN: 2307-387X. ⟨10.1162/tacl⟩. https://hal.inria.fr/hal-03350962
5. Charformer: Fast Character Transformers via Gradient-based Subword Tokenization
6. Multi-view Subword Regularization
7. XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation
8. Efficient Test Time Adapter Ensembling for Low-resource Language Varieties
9. Analogy Training Multilingual Encoders
Garneau, Nicolas; Hartmann, Mareike; Sandholm, Anders. Apollo - University of Cambridge Repository, 2021
10. XTREME-R: Towards More Challenging and Nuanced Multilingual Evaluation
11. A Call for More Rigor in Unsupervised Cross-lingual Learning
12. Rethinking embedding coupling in pre-trained language models
13. MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
14. How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models
15. UNKs Everywhere: Adapting Multilingual Language Models to New Scripts
16. MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Pfeiffer, Jonas; Vulic, Ivan; Gurevych, Iryna; Ruder, Sebastian. Apollo - University of Cambridge Repository, 2020
Abstract: The main goal behind state-of-the-art pretrained multilingual models such as multilingual BERT and XLM-R is enabling and bootstrapping NLP applications in low-resource languages through zero-shot or few-shot cross-lingual transfer. However, due to limited model capacity, their transfer performance is the weakest exactly on such low-resource languages and languages unseen during pretraining. We propose MAD-X, an adapter-based framework that enables high portability and parameter-efficient transfer to arbitrary tasks and languages by learning modular language and task representations. In addition, we introduce a novel invertible adapter architecture and a strong baseline method for adapting a pretrained multilingual model to a new language. MAD-X outperforms the state of the art in cross-lingual transfer across a representative set of typologically diverse languages on named entity recognition and causal commonsense reasoning, and achieves competitive results on question answering. ...
URL: https://dx.doi.org/10.17863/cam.62211
https://www.repository.cam.ac.uk/handle/1810/315104
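
The abstract above describes adapters as small modules learned on top of a frozen multilingual encoder, with per-language and per-task adapters stacked so that the language module can be swapped at inference time for zero-shot transfer. Below is a minimal PyTorch sketch of that stacking idea only; the class names, bottleneck size, and the toy base layer are illustrative assumptions, and the invertible adapters the abstract mentions are omitted. It is not the authors' released implementation (their code was published via the AdapterHub framework).

import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual add."""
    def __init__(self, hidden_size: int, bottleneck: int = 48):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual connection keeps the frozen representation intact.
        return x + self.up(self.act(self.down(x)))

class AdaptedLayer(nn.Module):
    """One frozen base layer with a swappable language adapter and a task adapter stacked on top."""
    def __init__(self, layer: nn.Module, hidden_size: int):
        super().__init__()
        self.layer = layer
        for p in self.layer.parameters():
            p.requires_grad = False  # pretrained weights stay fixed; only adapters train
        self.lang_adapter = Adapter(hidden_size)  # trained per language (e.g., with masked language modelling)
        self.task_adapter = Adapter(hidden_size)  # trained per task on top of the source-language adapter

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.layer(x)
        h = self.lang_adapter(h)  # swap in the target language's adapter for zero-shot transfer
        return self.task_adapter(h)

# Toy usage: a Linear stands in for a real transformer layer.
layer = AdaptedLayer(nn.Linear(768, 768), hidden_size=768)
out = layer(torch.randn(2, 16, 768))
print(out.shape)  # torch.Size([2, 16, 768])

The point of the design is parameter efficiency: training a new language costs only one small adapter rather than a full model, and any task adapter can be reused with any language adapter.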
17. Morphologically Aware Word-Level Translation
18. Morphologically Aware Word-Level Translation
In: Proceedings of the 28th International Conference on Computational Linguistics (2020)
19. Morphologically Aware Word-Level Translation
20. XTREME: A Massively Multilingual Multi-task Benchmark for Evaluating Cross-lingual Generalization


Catalogues: 0 · Bibliographies: 0 · Linked Open Data catalogues: 0 · Online resources: 0 · Open access documents: 38