
Search in the Catalogues and Directories

Hits 1 – 15 of 15

1
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
BASE
2
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
Liu, Qianchu; Liu, Fangyu; Collier, Nigel. - : Apollo - University of Cambridge Repository, 2021
BASE
3
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
BASE
4
Context vs Target Word: Quantifying Biases in Lexical Semantic Datasets ...
BASE
5
AM2iCo: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples ...
BASE
6
AM2iCo: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples ...
Abstract: Capturing word meaning in context and distinguishing between correspondences and variations across languages is key to building successful multilingual and cross-lingual text representation models. However, existing multilingual evaluation datasets that evaluate lexical semantics "in-context" have various limitations. In particular, 1) their language coverage is restricted to high-resource languages and skewed in favor of only a few language families and areas, 2) a design that makes the task solvable via superficial cues, which results in artificially inflated (and sometimes super-human) performances of pretrained encoders on many target languages, which limits their usefulness for model probing and diagnostics, and 3) little support for cross-lingual evaluation. In order to address these gaps, we present AM2iCo (Adversarial and Multilingual Meaning in Context), a wide-coverage cross-lingual and multilingual evaluation set; it ...
Anthology paper link: https://aclanthology.org/2021.emnlp-main.571/
URL: https://underline.io/lecture/37450-am2ico-evaluating-word-meaning-in-context-across-low-resource-languages-with-adversarial-examples
DOI: https://dx.doi.org/10.48448/gty8-be19
BASE
7
Improving Machine Translation of Rare and Unseen Word Senses ...
BASE
8
XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning ...
BASE
9
XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning ...
Ponti, Edoardo; Glavaš, Goran; Majewska, Olga. - : Apollo - University of Cambridge Repository, 2020
BASE
10
XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning
Liu, Qianchu; Korhonen, Anna-Leena; Majewska, Olga. - : Association for Computational Linguistics, 2020. : Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), 2020
BASE
11
XCOPA: A multilingual dataset for causal commonsense reasoning
Ponti, Edoardo Maria; Majewska, Olga; Liu, Qianchu. - : Association for Computational Linguistics, 2020
BASE
12
Investigating cross-lingual alignment methods for contextualized embeddings with Token-level evaluation ...
Liu, Qianchu; McCarthy, D; Vulić, I. - : Apollo - University of Cambridge Repository, 2019
BASE
13
Second-order contexts from lexical substitutes for few-shot learning of word representations ...
Liu, Qianchu; McCarthy, D; Korhonen, Anna-Leena. - : Apollo - University of Cambridge Repository, 2019
BASE
14
Second-order contexts from lexical substitutes for few-shot learning of word representations
Liu, Qianchu; McCarthy, D; Korhonen, Anna-Leena. - : *SEM@NAACL-HLT 2019 - 8th Joint Conference on Lexical and Computational Semantics, 2019
BASE
15
Investigating cross-lingual alignment methods for contextualized embeddings with Token-level evaluation
Liu, Qianchu; McCarthy, D; Vulić, I. - : CoNLL 2019 - 23rd Conference on Computational Natural Language Learning, Proceedings of the Conference, 2019
BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 15