1. Learning Functional Distributional Semantics with Visual Data ...

5. IAPUCP at SemEval-2021 Task 1: Stacking Fine-Tuned Transformers is Almost All You Need for Lexical Complexity Prediction

8. Autoencoding Pixies: Amortised Variational Inference with Graph Convolutions for Functional Distributional Semantics ...

9. Linguists Who Use Probabilistic Models Love Them: Quantification in Functional Distributional Semantics ...

11. Investigating Cross-Linguistic Adjective Ordering Tendencies with a Latent-Variable Model
In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020.

12. Words are vectors, dependencies are matrices: Learning word embeddings from dependency graphs
Copestake, Ann; Czarnowska, P.; Emerson, Guy. In: IWCS 2019, Proceedings of the 13th International Conference on Computational Semantics, Long Papers. Association for Computational Linguistics, 2019. https://aclanthology.org/volumes/W19-04/

13. Functional Distributional Semantics: Learning Linguistically Informed Representations from a Precisely Annotated Corpus ...
Emerson, Guy. Apollo, University of Cambridge Repository, 2018.

14. Functional Distributional Semantics: Learning Linguistically Informed Representations from a Precisely Annotated Corpus
Emerson, Guy. PhD thesis, University of Cambridge, Department of Computer Science and Technology, Trinity College, 2018.

Abstract:
The aim of distributional semantics is to design computational techniques that can automatically learn the meanings of words from a body of text. The twin challenges are: how do we represent meaning, and how do we learn these representations? The current state of the art is to represent meanings as vectors – but vectors do not correspond to any traditional notion of meaning. In particular, there is no way to talk about truth, a crucial concept in logic and formal semantics. In this thesis, I develop a framework for distributional semantics which answers this challenge. The meaning of a word is not represented as a vector, but as a function, mapping entities (objects in the world) to probabilities of truth (the probability that the word is true of the entity). Such a function can be interpreted both in the machine learning sense of a classifier, and in the formal semantic sense of a truth-conditional function. This simultaneously allows both the use of machine learning techniques to exploit large datasets, and also the use of formal semantic techniques to manipulate the learnt representations. I define a probabilistic graphical model, which incorporates a probabilistic generalisation of model theory (allowing a strong connection with formal semantics), and which generates semantic dependency graphs (allowing it to be trained on a corpus). This graphical model provides a natural way to model logical inference, semantic composition, and context-dependent meanings, where Bayesian inference plays a crucial role. I demonstrate the feasibility of this approach by training a model on WikiWoods, a parsed version of the English Wikipedia, and evaluating it on three tasks. The results indicate that the model can learn information not captured by vector space models.

Funding: Schiff Fund Studentship

Keywords:
distributional semantics; formal semantics; machine learning

URL: https://www.repository.cam.ac.uk/handle/1810/284882
DOI: https://doi.org/10.17863/CAM.32253
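The abstract above frames a word's meaning as a truth-conditional classifier: a function mapping entities to probabilities of truth. A minimal sketch of that idea (illustrative only, not the thesis's actual graphical model; the feature dimensions and weights here are invented):

```python
import math

def make_word_classifier(weights, bias):
    """A word's meaning as a function: entity features -> probability of truth."""
    def prob_true(entity):
        # Logistic classifier: weighted sum of entity features, squashed by a sigmoid.
        score = sum(w * x for w, x in zip(weights, entity)) + bias
        return 1.0 / (1.0 + math.exp(-score))
    return prob_true

# Hypothetical 3-dimensional entity features: [furriness, size, animacy]
cat = make_word_classifier([4.0, -2.0, 3.0], -1.0)
elephant = make_word_classifier([-1.0, 5.0, 3.0], -4.0)

tabby = [0.9, 0.2, 1.0]  # a small, furry, animate entity
print(cat(tabby))        # high probability: "cat" is likely true of this entity
print(elephant(tabby))   # low probability: "elephant" is unlikely true of it
```

The same object works in both senses the abstract names: as a machine-learning classifier over entity features, and as a (probabilistic) truth-conditional function in the formal-semantics sense.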
16. Functional Distributional Semantics
Emerson, Guy; Copestake, Ann. In: Proceedings of the 1st Workshop on Representation Learning for NLP. Association for Computational Linguistics, 2016.

17. Lacking integrity: HPSG as a morphosyntactic theory
Emerson, Guy; Copestake, Ann. In: Proceedings of the International Conference on Head-Driven Phrase Structure Grammar. University Library J. C. Senckenberg, 2015. http://web.stanford.edu/group/cslipublications/cslipublications/HPSG/2015/emerson-copestake.pdf

18. Leveraging a semantically annotated corpus to disambiguate prepositional phrase attachment
Emerson, Guy; Copestake, Ann. In: IWCS 2015, Proceedings of the 11th International Conference on Computational Semantics. Association for Computational Linguistics, 2015. https://aclanthology.org/volumes/W15-01/