
Search in the Catalogues and Directories

Hits 1 – 20 of 44

1
Learning to Borrow -- Relation Representation for Without-Mention Entity-Pairs for Knowledge Graph Completion
2
Learning Meta Word Embeddings by Unsupervised Weighted Concatenation of Source Embeddings
Bollegala, Danushka. - : arXiv, 2022
3
Sense Embeddings are also Biased -- Evaluating Social Biases in Static and Contextualised Sense Embeddings
4
I Wish I Would Have Loved This One, But I Didn't -- A Multilingual Dataset for Counterfactual Detection in Product Reviews
5
Detect and Classify – Joint Span Detection and Classification for Health Outcomes
6
Unsupervised Abstractive Opinion Summarization by Generating Sentences with Tree-Structured Topic Guidance
7
Fine-Tuning Word Embeddings for Hierarchical Representation of Data Using a Corpus and a Knowledge Base for Various Machine Learning Applications
In: Computational and Mathematical Methods in Medicine (2021)
8
RelWalk - A Latent Variable Model Approach to Knowledge Graph Embedding.
Bollegala, Danushka; Kawarabayashi, Ken-ichi; Yoshida, Yuichi. - : Association for Computational Linguistics, 2021
9
Dictionary-based Debiasing of Pre-trained Word Embeddings.
Bollegala, Danushka; Kaneko, Masahiro. - : Association for Computational Linguistics, 2021
10
Unsupervised Abstractive Opinion Summarization by Generating Sentences with Tree-Structured Topic Guidance
Sakata, Ichiro; Mori, Junichiro; Bollegala, Danushka. - : Massachusetts Institute of Technology Press, 2021
11
Unsupervised Abstractive Opinion Summarization by Generating Sentences with Tree-Structured Topic Guidance
12
Debiasing Pre-trained Contextualised Embeddings.
Kaneko, Masahiro; Bollegala, Danushka. - : Association for Computational Linguistics, 2021
13
Autoencoding Improves Pre-trained Word Embeddings
Abstract: Prior work investigating the geometry of pre-trained word embeddings has shown that word embeddings are distributed in a narrow cone, and that by centering and projecting using principal component vectors one can increase the accuracy of a given set of pre-trained word embeddings. However, theoretically this post-processing step is equivalent to applying a linear autoencoder to minimise the squared l2 reconstruction error. This result contradicts prior work (Mu and Viswanath, 2018) that proposed to remove the top principal components from pre-trained embeddings. We experimentally verify our theoretical claims and show that retaining the top principal components is indeed useful for improving pre-trained word embeddings, without requiring access to additional linguistic resources or labelled data. (COLING 2020)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2010.13094
https://arxiv.org/abs/2010.13094
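For context on this record, the following is a minimal sketch, assuming NumPy and a random matrix as a stand-in for real pre-trained embeddings, of the post-processing the abstract describes: centre the embeddings, take principal component directions, and either keep the top components (the rank-k linear-autoencoder view argued for here) or remove them (Mu and Viswanath, 2018). The sizes (10,000 x 300, k = 10) are illustrative choices, not values from the paper.

# Hypothetical illustration, not the authors' code.
import numpy as np

rng = np.random.default_rng(0)
E = rng.normal(size=(10_000, 300))   # placeholder for a pre-trained embedding matrix (words x dims)

# Centre each dimension of the embedding matrix.
E_centred = E - E.mean(axis=0, keepdims=True)

# Principal component directions from the SVD of the centred matrix.
_, _, Vt = np.linalg.svd(E_centred, full_matrices=False)

k = 10            # illustrative number of top components
top = Vt[:k]      # shape (k, dims)

# Keep only the top-k principal subspace: encode with `top`, decode with `top.T`,
# i.e. reconstruct through a rank-k linear autoencoder minimising squared l2 error.
E_kept = E_centred @ top.T @ top

# The alternative post-processing (Mu and Viswanath, 2018) instead removes
# those top components from every embedding.
E_removed = E_centred - E_kept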
14
Autoencoding Improves Pre-trained Word Embeddings
15
Graph Convolution over Multiple Dependency Sub-graphs for Relation Extraction
16
Language-Independent Tokenisation Rivals Language-Specific Tokenisation for Word Similarity Prediction
17
Graph Convolution over Multiple Dependency Sub-graphs for Relation Extraction.
Mandya, Angrosh; Coenen, Frans; Bollegala, Danushka. - : International Committee on Computational Linguistics, 2020
18
Multi-Source Attention for Unsupervised Domain Adaptation.
Bollegala, Danushka; Cui, Xia. - : Association for Computational Linguistics, 2020
19
Learning to Compose Relational Embeddings in Knowledge Graphs
Hakami, Huda; Chen, Wenye; Bollegala, Danushka. - : Springer Singapore, 2020
20
Tree-Structured Neural Topic Model

