
Search in the Catalogues and Directories

Hits 1 – 20 of 45

1. Delving Deeper into Cross-lingual Visual Question Answering ...
2. Cross-Lingual Dialogue Dataset Creation via Outline-Based Generation ...
3. Improving Word Translation via Two-Stage Contrastive Learning ...
4. Towards Zero-shot Language Modeling ...
5. Multilingual and Cross-Lingual Intent Detection from Spoken Data ...
6. Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems ...
7. Modelling Latent Translations for Cross-Lingual Transfer ...
8. Prix-LM: Pretraining for Multilingual Knowledge Base Construction ...
9. Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking ...
10. xGQA: Cross-Lingual Visual Question Answering ...
11. On Cross-Lingual Retrieval with Multilingual Text Encoders ...
12. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
13. Evaluating Multilingual Text Encoders for Unsupervised Cross-Lingual Retrieval ...
14. AM2iCo: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples ...
15. Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ...
16. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning ...
17. Emergent Communication Pretraining for Few-Shot Machine Translation ...
18. Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer ...
19. MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer ...
Abstract: The main goal behind state-of-the-art pre-trained multilingual models such as multilingual BERT and XLM-R is enabling and bootstrapping NLP applications in low-resource languages through zero-shot or few-shot cross-lingual transfer. However, due to limited model capacity, their transfer performance is the weakest exactly on such low-resource languages and languages unseen during pre-training. We propose MAD-X, an adapter-based framework that enables high portability and parameter-efficient transfer to arbitrary tasks and languages by learning modular language and task representations. In addition, we introduce a novel invertible adapter architecture and a strong baseline method for adapting a pre-trained multilingual model to a new language. MAD-X outperforms the state of the art in cross-lingual transfer across a representative set of typologically diverse languages on named entity recognition and causal commonsense reasoning, and achieves competitive results on question answering. Our code and adapters are ...
Comment: EMNLP 2020
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2005.00052
DOI: https://dx.doi.org/10.48550/arxiv.2005.00052
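The abstract above describes the core MAD-X recipe: keep the multilingual backbone frozen and train small bottleneck adapters, stacking a language adapter and a task adapter after each transformer layer, so that zero-shot transfer amounts to swapping the language adapter at inference time. The following is a minimal PyTorch sketch of that pattern, not the authors' released implementation; the class names, bottleneck size, and the stand-in transformer block are illustrative assumptions, and the invertible adapters mentioned in the abstract are omitted.

import torch
import torch.nn as nn

class Adapter(nn.Module):
    # Bottleneck adapter: down-project, nonlinearity, up-project, residual.
    def __init__(self, hidden_dim: int, bottleneck_dim: int):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))

class MadXLayer(nn.Module):
    # Wraps one frozen transformer layer with stacked language + task
    # adapters. Only the adapters are trained; the backbone stays frozen.
    # Zero-shot transfer: keep the task adapter, swap the language adapter.
    # (Hypothetical class; MAD-X's invertible adapters are omitted here.)
    def __init__(self, transformer_layer: nn.Module,
                 language_adapters: dict, task_adapter: Adapter):
        super().__init__()
        self.layer = transformer_layer
        for p in self.layer.parameters():
            p.requires_grad = False          # backbone is frozen
        self.language_adapters = nn.ModuleDict(language_adapters)
        self.task_adapter = task_adapter

    def forward(self, x: torch.Tensor, lang: str) -> torch.Tensor:
        h = self.layer(x)
        h = self.language_adapters[lang](h)  # language adapter first
        return self.task_adapter(h)          # then the shared task adapter

# Usage sketch: train with lang="en", then evaluate the same task adapter
# with lang="qu" for zero-shot cross-lingual transfer.
hidden, bottleneck = 768, 48
layer = MadXLayer(
    nn.Linear(hidden, hidden),               # stand-in for a transformer block
    {"en": Adapter(hidden, bottleneck), "qu": Adapter(hidden, bottleneck)},
    Adapter(hidden, bottleneck),
)
out = layer(torch.randn(2, 16, hidden), lang="en")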
20. How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models ...


Source facets: all 45 hits are open access documents (BASE); no hits in catalogues, bibliographies, Linked Open Data catalogues, or other online resources.