
Search in the Catalogues and Directories

Hits 1 – 16 of 16

1. Delving Deeper into Cross-lingual Visual Question Answering (BASE)
2. IGLUE: A Benchmark for Transfer Learning across Modalities, Tasks, and Languages (BASE)
3. Smelting Gold and Silver for Improved Multilingual AMR-to-Text Generation (BASE)
4. xGQA: Cross-Lingual Visual Question Answering (BASE)
5. Smelting Gold and Silver for Improved Multilingual AMR-to-Text Generation (BASE)
6. UNKs Everywhere: Adapting Multilingual Language Models to New Scripts (BASE)
7. How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models (BASE)
8. MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer (BASE)
9. How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models (BASE)
10. UNKs Everywhere: Adapting Multilingual Language Models to New Scripts (BASE)
11. MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Pfeiffer, Jonas; Vulić, Ivan; Gurevych, Iryna. Apollo - University of Cambridge Repository, 2020 (BASE)
12. MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer
Vulić, Ivan; Pfeiffer, Jonas; Ruder, Sebastian; Gurevych, Iryna. Association for Computational Linguistics, Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), 2020 (BASE)
Abstract: The main goal behind state-of-the-art pretrained multilingual models such as multilingual BERT and XLM-R is enabling and bootstrapping NLP applications in low-resource languages through zero-shot or few-shot cross-lingual transfer. However, due to limited model capacity, their transfer performance is the weakest exactly on such low-resource languages and languages unseen during pretraining. We propose MAD-X, an adapter-based framework that enables high portability and parameter-efficient transfer to arbitrary tasks and languages by learning modular language and task representations. In addition, we introduce a novel invertible adapter architecture and a strong baseline method for adapting a pretrained multilingual model to a new language. MAD-X outperforms the state of the art in cross-lingual transfer across a representative set of typologically diverse languages on named entity recognition and causal commonsense reasoning, and achieves competitive results on question answering.
URL: https://www.repository.cam.ac.uk/handle/1810/315104
DOI: https://doi.org/10.17863/CAM.62211
A minimal code sketch of the adapter stacking described in this abstract follows the result list.
13. AdapterHub: A Framework for Adapting Transformers
Pfeiffer, Jonas; Rücklé, Andreas; Poth, Clifton. Association for Computational Linguistics, Proceedings of the Conference on Empirical Methods in Natural Language Processing: System Demonstrations (EMNLP 2020), 2020 (BASE)
14. Specialising Distributional Vectors of All Words for Lexical Entailment
Kamath, Aishwarya; Pfeiffer, Jonas; Ponti, Edoardo. Apollo - University of Cambridge Repository, 2019 (BASE)
15. Specializing distributional vectors of all words for lexical entailment
Ponti, Edoardo Maria; Kamath, Aishwarya; Pfeiffer, Jonas. Association for Computational Linguistics, 2019 (BASE)
16. A neural autoencoder approach for document ranking and query refinement in pharmacogenomic information retrieval
Broscheit, Samuel; Pfeiffer, Jonas; Gemulla, Rainer. Association for Computational Linguistics, 2018 (BASE)
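
Entry 12's abstract describes MAD-X as learning modular language and task representations: small adapter modules inserted into a frozen multilingual encoder, with a language adapter stacked under a task adapter. The sketch below is not the authors' implementation (that is released through AdapterHub, entry 13); it is a minimal PyTorch illustration of the generic bottleneck-adapter pattern under stated assumptions: the hidden size, reduction factor, and ReLU activation are illustrative choices, and the invertible adapter mentioned in the abstract is omitted.

import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Generic bottleneck adapter: down-project, nonlinearity, up-project,
    plus a residual connection. Sizes and activation are assumptions here,
    not the exact MAD-X configuration."""

    def __init__(self, hidden_size: int, reduction_factor: int = 16):
        super().__init__()
        bottleneck = hidden_size // reduction_factor
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.ReLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # The residual keeps the frozen backbone's representation intact;
        # only the small down/up projections are trained.
        return hidden_states + self.up(self.act(self.down(hidden_states)))

# Per transformer layer, MAD-X-style transfer stacks a language adapter
# (trained with masked language modelling on target-language text) under a
# task adapter (trained on source-language task data), backbone frozen.
hidden_size = 768  # e.g. mBERT-base hidden size (illustrative)
lang_adapter = BottleneckAdapter(hidden_size)  # one per language
task_adapter = BottleneckAdapter(hidden_size)  # one per task

x = torch.randn(2, 10, hidden_size)  # (batch, seq_len, hidden) from a frozen layer
out = task_adapter(lang_adapter(x))

For zero-shot cross-lingual transfer, the task adapter is trained through the source-language adapter; at inference the source-language adapter is swapped for the target-language one, so only small per-language and per-task modules are ever trained while the multilingual backbone stays fixed.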

Sources: all 16 hits are open access documents (via BASE); no hits in catalogues, bibliographies, Linked Open Data catalogues, or other online resources.