
Search in the Catalogues and Directories

Hits 1 – 11 of 11

1
XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning ...
BASE
2
Probing Pretrained Language Models for Lexical Semantics ...
BASE
3
Specializing unsupervised pretraining models for word-level semantic similarity
Ponti, Edoardo Maria; Korhonen, Anna; Vulić, Ivan. - : Association for Computational Linguistics, ACL, 2020
BASE
4
Probing pretrained language models for lexical semantics
Vulić, Ivan; Korhonen, Anna; Litschko, Robert. - : Association for Computational Linguistics, 2020
BASE
5
XCOPA: A multilingual dataset for causal commonsense reasoning
Ponti, Edoardo Maria; Majewska, Olga; Liu, Qianchu. - : Association for Computational Linguistics, 2020
BASE
6
Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity ...
BASE
7
Specializing distributional vectors of all words for lexical entailment
Ponti, Edoardo Maria; Kamath, Aishwarya; Pfeiffer, Jonas. - : Association for Computational Linguistics, 2019
BASE
8
Cross-lingual semantic specialization via lexical relation induction
Glavaš, Goran; Vulić, Ivan; Korhonen, Anna. - : Association for Computational Linguistics, 2019
BASE
9
Informing unsupervised pretraining with external linguistic knowledge
Abstract: Unsupervised pretraining models have been shown to facilitate a wide range of downstream applications. These models, however, still encode only distributional knowledge, acquired through language modeling objectives. In this work, we complement the encoded distributional knowledge with external lexical knowledge. We generalize the recently proposed (state-of-the-art) unsupervised pretraining model BERT to a multi-task learning setting: we couple BERT's masked language modeling and next sentence prediction objectives with an auxiliary binary word-relation classification objective, through which we inject clean linguistic knowledge into the model. Our initial experiments suggest that our "linguistically informed" BERT (LIBERT) yields performance gains over the linguistically blind "vanilla" BERT on several language understanding tasks. (A minimal code sketch of this multi-task setup follows the hit list below.)
Keyword: 004 Informatik (computer science)
URL: https://madoc.bib.uni-mannheim.de/51956/
https://arxiv.org/pdf/1909.02339.pdf
BASE
10
Adversarial Propagation and Zero-Shot Cross-Lingual Transfer of Word Vector Specialization ...
BASE
11
Adversarial propagation and zero-shot cross-lingual transfer of word vector specialization
Ponti, Edoardo Maria; Vulić, Ivan; Glavaš, Goran. - : Association for Computational Linguistics, 2018
BASE
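
The abstract of hit 9 describes LIBERT's training setup only at a high level: a shared BERT encoder trained jointly on masked language modeling and an auxiliary binary word-relation classification task fed by an external lexical resource. The sketch below illustrates that multi-task coupling in plain PyTorch; it is not the authors' implementation. The tiny encoder, head shapes, hyperparameters, and toy batches are illustrative assumptions, and BERT's next-sentence-prediction objective is omitted for brevity.

import torch
import torch.nn as nn

MASK_ID = 0          # hypothetical id of the [MASK] token in the toy vocabulary
VOCAB, DIM = 1000, 64

class SharedEncoder(nn.Module):
    # Stand-in for the shared BERT encoder: a tiny Transformer over token ids.
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, DIM)
        layer = nn.TransformerEncoderLayer(d_model=DIM, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, token_ids):                    # (batch, seq) -> (batch, seq, DIM)
        return self.encoder(self.embed(token_ids))

class LinguisticallyInformedLM(nn.Module):
    # One shared encoder, two heads: masked-token prediction and
    # binary classification of word pairs (related vs. unrelated).
    def __init__(self):
        super().__init__()
        self.encoder = SharedEncoder()
        self.mlm_head = nn.Linear(DIM, VOCAB)        # predicts the masked tokens
        self.rel_head = nn.Linear(2 * DIM, 2)        # word pair -> related / not related

    def mlm_loss(self, masked_ids, labels):          # labels use -100 at unmasked positions
        logits = self.mlm_head(self.encoder(masked_ids))
        return nn.functional.cross_entropy(
            logits.view(-1, VOCAB), labels.view(-1), ignore_index=-100)

    def relation_loss(self, pair_ids, pair_labels):  # pair_ids: (batch, 2) word ids
        reps = self.encoder(pair_ids)                # encode the two-word sequence
        pair_repr = torch.cat([reps[:, 0], reps[:, 1]], dim=-1)
        return nn.functional.cross_entropy(self.rel_head(pair_repr), pair_labels)

model = LinguisticallyInformedLM()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

# Toy MLM batch: mask position 3 of every sequence and remember the original token.
tokens = torch.randint(1, VOCAB, (8, 16))
labels = torch.full((8, 16), -100, dtype=torch.long)
labels[:, 3] = tokens[:, 3]
masked = tokens.clone()
masked[:, 3] = MASK_ID

# Toy relation batch: word pairs with 0/1 labels, as would come from an external lexicon.
pairs = torch.randint(1, VOCAB, (8, 2))
related = torch.randint(0, 2, (8,))

# Joint objective: sum the two task losses (the paper's weighting or alternation may differ).
loss = model.mlm_loss(masked, labels) + model.relation_loss(pairs, related)
opt.zero_grad()
loss.backward()
opt.step()

In this sketch the lexical knowledge enters only through the word-pair labels; any resource of typed word relations (e.g. synonym pairs) could supply them, which is the "clean linguistic knowledge" the abstract refers to.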

Hit distribution by source type: Catalogues: 0; Bibliographies: 0; Linked Open Data catalogues: 0; Online resources: 0; Open access documents: 11