
Search in the Catalogues and Directories

Hits 1 – 20 of 46

1
Parameter-Efficient Neural Reranking for Cross-Lingual and Multilingual Retrieval ...
BASE
2
On cross-lingual retrieval with multilingual text encoders
Litschko, Robert; Vulić, Ivan; Ponzetto, Simone Paolo. Springer Science+Business Media, 2022
BASE
3
Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems ...
BASE
4
On Cross-Lingual Retrieval with Multilingual Text Encoders ...
BASE
5
Evaluating Multilingual Text Encoders for Unsupervised Cross-Lingual Retrieval ...
BASE
6
RedditBias: A Real-World Resource for Bias Evaluation and Debiasing of Conversational Language Models ...
BASE
7
LexFit: Lexical Fine-Tuning of Pretrained Language Models ...
BASE
8
Verb Knowledge Injection for Multilingual Event Processing ...
BASE
9
Is supervised syntactic parsing beneficial for language understanding tasks? An empirical investigation
Glavaš, Goran; Vulić, Ivan. Association for Computational Linguistics, 2021
BASE
10
Evaluating multilingual text encoders for unsupervised cross-lingual retrieval
BASE
11
XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning ...
BASE
12
Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer ...
BASE
13
From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers ...
Abstract: Massively multilingual transformers pretrained with language modeling objectives (e.g., mBERT, XLM-R) have become a de facto default transfer paradigm for zero-shot cross-lingual transfer in NLP, offering unmatched transfer performance. Current downstream evaluations, however, verify their efficacy predominantly in transfer settings involving languages with sufficient amounts of pretraining data, and with lexically and typologically close languages. In this work, we analyze their limitations and show that cross-lingual transfer via massively multilingual transformers, much like transfer via cross-lingual word embeddings, is substantially less effective in resource-lean scenarios and for distant languages. Our experiments, encompassing three lower-level tasks (POS tagging, dependency parsing, NER), as well as two high-level semantic tasks (NLI, QA), empirically correlate transfer performance with linguistic similarity between the source and target languages, but also with the size of pretraining corpora of ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2005.00633
https://dx.doi.org/10.48550/arxiv.2005.00633
BASE
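
The abstract above describes the standard zero-shot transfer setup: fine-tune a pretrained massively multilingual encoder on source-language (typically English) task data, then apply it unchanged to other languages. Below is a minimal sketch of that setup, assuming the Hugging Face transformers and datasets libraries; the choice of XNLI as the task, xlm-roberta-base as the encoder, and Swahili as the distant, resource-lean target language is illustrative, not the paper's exact configuration.

# Zero-shot cross-lingual transfer sketch (illustrative assumptions noted above):
# fine-tune XLM-R on English NLI only, then evaluate directly on Swahili.
import numpy as np
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-base", num_labels=3  # entailment / neutral / contradiction
)

def encode(batch):
    # Tokenize premise/hypothesis pairs into fixed-length inputs.
    return tokenizer(batch["premise"], batch["hypothesis"],
                     truncation=True, padding="max_length", max_length=128)

# English-only training data; a subsample keeps the sketch cheap to run.
train_en = load_dataset("xnli", "en", split="train").select(range(20_000))
train_en = train_en.map(encode, batched=True)
# Target-language test data the model never sees during fine-tuning.
test_sw = load_dataset("xnli", "sw", split="test").map(encode, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="xnli-en-zeroshot",
                           num_train_epochs=1,
                           per_device_train_batch_size=32),
    train_dataset=train_en,
)
trainer.train()  # fine-tune on English only

# Zero-shot evaluation: accuracy on the unseen target language.
preds = trainer.predict(test_sw)
accuracy = (np.argmax(preds.predictions, axis=-1) == preds.label_ids).mean()
print(f"Zero-shot XNLI accuracy on Swahili: {accuracy:.3f}")

Repeating this evaluation across target languages and relating accuracy to pretraining-corpus size or linguistic distance from English gives the kind of correlation analysis the abstract reports.
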
14
Verb Knowledge Injection for Multilingual Event Processing ...
BASE
15
Probing Pretrained Language Models for Lexical Semantics ...
BASE
16
SemEval-2020 Task 2: Predicting Multilingual and Cross-Lingual (Graded) Lexical Entailment ...
BASE
17
Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity ...
BASE
18
Specializing unsupervised pretraining models for word-level semantic similarity
Ponti, Edoardo Maria; Korhonen, Anna; Vulić, Ivan. Association for Computational Linguistics (ACL), 2020
BASE
19
Non-linear instance-based cross-lingual mapping for non-isomorphic embedding spaces
Glavaš, Goran; Vulić, Ivan. Association for Computational Linguistics, 2020
BASE
20
Classification-based self-learning for weakly supervised bilingual lexicon induction
Vulić, Ivan; Korhonen, Anna; Glavaš, Goran. Association for Computational Linguistics, 2020
BASE


All 46 hits are open access documents; the catalogues, bibliographies, Linked Open Data catalogues, and other online resources report no matches.