
Search in the Catalogues and Directories

Page 3 of 6
Hits 41–60 of 112

41
From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers
Ravishankar, Vinit; Glavaš, Goran; Lauscher, Anne. - : Association for Computational Linguistics, 2020. : Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), 2020
BASE
42
XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning
Liu, Qianchu; Korhonen, Anna-Leena; Majewska, Olga. - : Association for Computational Linguistics, 2020. : Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), 2020
BASE
43
XHate-999: Analyzing and Detecting Abusive Language Across Domains and Languages
Glavaš, Goran; Karan, Mladen; Vulić, Ivan. - : International Committee on Computational Linguistics, 2020. : https://www.aclweb.org/anthology/2020.coling-main.559, 2020. : Proceedings of the 28th International Conference on Computational Linguistics (COLING 2020), 2020
BASE
44
Specializing unsupervised pretraining models for word-level semantic similarity
Ponti, Edoardo Maria; Korhonen, Anna; Vulić, Ivan. - : Association for Computational Linguistics, ACL, 2020
BASE
45
Non-linear instance-based cross-lingual mapping for non-isomorphic embedding spaces
Glavaš, Goran; Vulić, Ivan. - : Association for Computational Linguistics, 2020
BASE
46
Classification-based self-learning for weakly supervised bilingual lexicon induction
Vulić, Ivan; Korhonen, Anna; Glavaš, Goran. - : Association for Computational Linguistics, 2020
BASE
47
AraWEAT: Multidimensional analysis of biases in Arabic word embeddings
Lauscher, Anne; Takieddin, Rafik; Ponzetto, Simone Paolo. - : Association for Computational Linguistics, 2020
BASE
48
Probing pretrained language models for lexical semantics
Vulić, Ivan; Korhonen, Anna; Litschko, Robert. - : Association for Computational Linguistics, 2020
BASE
49
Common sense or world knowledge? Investigating adapter-based knowledge injection into pretrained transformers
Lauscher, Anne; Majewska, Olga; Ribeiro, Leonardo F. R.. - : Association for Computational Linguistics, 2020
BASE
50
XHate-999: analyzing and detecting abusive language across domains and languages
Glavaš, Goran; Karan, Mladen; Vulić, Ivan. - : Association for Computational Linguistics, 2020
BASE
51
On the limitations of cross-lingual encoders as exposed by reference-free machine translation evaluation
Zhao, Wei; Glavaš, Goran; Peyrard, Maxime. - : Association for Computational Linguistics, 2020
BASE
52
XCOPA: A multilingual dataset for causal commonsense reasoning
Ponti, Edoardo Maria; Majewska, Olga; Liu, Qianchu. - : Association for Computational Linguistics, 2020
BASE
53
Improving bilingual lexicon induction with unsupervised post-processing of monolingual word vector spaces
Glavaš, Goran; Korhonen, Anna; Vulić, Ivan. - : Association for Computational Linguistics, 2020
BASE
54
From zero to hero: On the limitations of zero-shot language transfer with multilingual transformers
Ravishankar, Vinit; Glavaš, Goran; Lauscher, Anne. - : Association for Computational Linguistics, 2020
BASE
55
SemEval-2020 Task 2: Predicting multilingual and cross-lingual (graded) lexical entailment
Glavaš, Goran; Vulić, Ivan; Korhonen, Anna. - : Association for Computational Linguistics, 2020
BASE
56
Towards instance-level parser selection for cross-lingual transfer of dependency parsers
Litschko, Robert; Vulić, Ivan; Agić, Željko. - : Association for Computational Linguistics, 2020
BASE
57
Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity ...
BASE
58
Do We Really Need Fully Unsupervised Cross-Lingual Embeddings? ...
BASE
59
How to (Properly) Evaluate Cross-Lingual Word Embeddings: On Strong Baselines, Comparative Analyses, and Some Misconceptions ...
Abstract: Cross-lingual word embeddings (CLEs) enable multilingual modeling of meaning and facilitate cross-lingual transfer of NLP models. Despite their ubiquitous usage in downstream tasks, recent increasingly popular projection-based CLE models are almost exclusively evaluated on a single task only: bilingual lexicon induction (BLI). Even BLI evaluations vary greatly, hindering our ability to correctly interpret performance and properties of different CLE models. In this work, we make the first step towards a comprehensive evaluation of cross-lingual word embeddings. We thoroughly evaluate both supervised and unsupervised CLE models on a large number of language pairs in the BLI task and three downstream tasks, providing new insights concerning the ability of cutting-edge CLE models to support cross-lingual NLP. We empirically demonstrate that the performance of CLE models largely depends on the task at hand and that optimizing CLE models for BLI can result in deteriorated downstream performance. We indicate the ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/1902.00508
https://dx.doi.org/10.48550/arxiv.1902.00508
BASE
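The abstract above (hit 59) concerns projection-based cross-lingual word embedding (CLE) models and their evaluation on bilingual lexicon induction (BLI). As a rough illustration of what such an evaluation involves, here is a minimal Python sketch, not taken from the paper: an orthogonal-Procrustes projection learned from a seed dictionary, scored with BLI precision@1 via cosine nearest-neighbour retrieval. All names and data structures are illustrative assumptions.

import numpy as np

def procrustes_projection(X_src, Y_tgt):
    # Orthogonal Procrustes: learn W so that X_src @ W approximates Y_tgt.
    # X_src, Y_tgt: (n, d) arrays of source/target vectors for n aligned
    # seed translation pairs.
    U, _, Vt = np.linalg.svd(X_src.T @ Y_tgt)
    return U @ Vt  # (d, d) orthogonal mapping

def bli_precision_at_1(W, src_vecs, tgt_vecs, test_pairs):
    # BLI evaluation: for each source test word, check whether the nearest
    # target neighbour of its projected vector is the gold translation.
    # src_vecs, tgt_vecs: dicts mapping word -> (d,) vector
    # test_pairs: list of (source_word, gold_target_word)
    tgt_words = list(tgt_vecs)
    T = np.stack([tgt_vecs[w] for w in tgt_words])
    T = T / np.linalg.norm(T, axis=1, keepdims=True)  # unit-normalise targets

    hits = 0
    for src_word, gold in test_pairs:
        v = src_vecs[src_word] @ W
        v = v / np.linalg.norm(v)
        pred = tgt_words[int(np.argmax(T @ v))]  # cosine nearest neighbour
        hits += int(pred == gold)
    return hits / len(test_pairs)

The paper's point, per the abstract, is that a score such as the precision@1 above does not necessarily predict how the same CLE model performs on downstream cross-lingual tasks, so BLI-only evaluation can be misleading.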
60
Specialising Distributional Vectors of All Words for Lexical Entailment ...
Kamath, Aishwarya; Pfeiffer, Jonas; Ponti, Edoardo. - : Apollo - University of Cambridge Repository, 2019
BASE


Hits by source: Catalogues 6 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 106