
Search in the Catalogues and Directories

Hits 1 – 20 of 85

1. Parameter-Efficient Neural Reranking for Cross-Lingual and Multilingual Retrieval ... (BASE)
2. Geographic Adaptation of Pretrained Language Models ... (BASE)
3. On cross-lingual retrieval with multilingual text encoders (BASE)
   Litschko, Robert; Vulić, Ivan; Ponzetto, Simone Paolo. Springer Science + Business Media, 2022.
4. Data for paper: "Evaluating Resource-Lean Cross-Lingual Embedding Models in Unsupervised Retrieval" ... (BASE)
   Litschko, Robert; Glavaš, Goran. Mannheim University Library, 2021.
5. Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems ... (BASE)
6. On Cross-Lingual Retrieval with Multilingual Text Encoders ... (BASE)
7. Evaluating Multilingual Text Encoders for Unsupervised Cross-Lingual Retrieval ... (BASE)
   Abstract: Pretrained multilingual text encoders based on neural Transformer architectures, such as multilingual BERT (mBERT) and XLM, have achieved strong performance on a myriad of language understanding tasks. Consequently, they have been adopted as a go-to paradigm for multilingual and cross-lingual representation learning and transfer, rendering cross-lingual word embeddings (CLWEs) effectively obsolete. However, questions remain to which extent this finding generalizes 1) to unsupervised settings and 2) for ad-hoc cross-lingual IR (CLIR) tasks. Therefore, in this work we present a systematic empirical study focused on the suitability of the state-of-the-art multilingual encoders for cross-lingual document and sentence retrieval tasks across a large number of language pairs. In contrast to supervised language understanding, our results indicate that for unsupervised document-level CLIR -- a setup with no relevance judgments for IR-specific fine-tuning -- pretrained encoders fail to significantly outperform models ... : accepted at ECIR'21 (preprint) ...
   Keywords: Computation and Language (cs.CL); Information Retrieval (cs.IR); FOS: Computer and information sciences; H.3.3; I.2.7
   URL: https://dx.doi.org/10.48550/arxiv.2101.08370
        https://arxiv.org/abs/2101.08370
8. RedditBias: A Real-World Resource for Bias Evaluation and Debiasing of Conversational Language Models ... (BASE)
9. LexFit: Lexical Fine-Tuning of Pretrained Language Models ... (BASE)
10. Verb Knowledge Injection for Multilingual Event Processing ... (BASE)
11. Is supervised syntactic parsing beneficial for language understanding tasks? An empirical investigation (BASE)
    Glavaš, Goran; Vulić, Ivan. Association for Computational Linguistics, 2021.
12. Evaluating multilingual text encoders for unsupervised cross-lingual retrieval (BASE)
13. Training and domain adaptation for supervised text segmentation (BASE)
    Glavaš, Goran; Ganesh, Ananya; Somasundaran, Swapna. Association for Computational Linguistics, 2021.
14. AraWEAT: Multidimensional Analysis of Biases in Arabic Word Embeddings ... (BASE)
15. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning ... (BASE)
16. On the Limitations of Cross-lingual Encoders as Exposed by Reference-Free Machine Translation Evaluation ... (BASE)
17. Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer ... (BASE)
18. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning ... (BASE)
    Ponti, Edoardo; Glavaš, Goran; Majewska, Olga. Apollo - University of Cambridge Repository, 2020.
19. From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers ... (BASE)
20. Verb Knowledge Injection for Multilingual Event Processing ... (BASE)


Sources: all 85 hits are open access documents; the catalogues, bibliographies, Linked Open Data catalogues, and other online resources return no matches for this query.
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings