
Search in the Catalogues and Directories

Hits 1 – 14 of 14

1
The Language Model Understood the Prompt was Ambiguous: Probing Syntactic Uncertainty Through Generation
BASE
2
Does referent predictability affect the choice of referential form? A computational approach using masked coreference resolution
Abstract: It is often posited that more predictable parts of a speaker's meaning tend to be made less explicit, for instance using shorter, less informative words. Studying these dynamics in the domain of referring expressions has proven difficult, with existing studies, both psycholinguistic and corpus-based, providing contradictory results. We test the hypothesis that speakers produce less informative referring expressions (e.g., pronouns vs. full noun phrases) when the context is more informative about the referent, using novel computational estimates of referent predictability. We obtain these estimates by training an existing coreference resolution system for English on a new task, masked coreference resolution, giving us a probability distribution over referents that is conditioned on the context but not the referring expression. The resulting system retains standard coreference resolution performance while yielding a better estimate of human-derived referent predictability than previous attempts. A statistical ...
Keyword: Computational Creativity; Computational Linguistics; Machine Learning; Machine Learning and Data Mining; Natural Language Processing
URL: https://underline.io/lecture/39866-does-referent-predictability-affect-the-choice-of-referential-formquestion-a-computational-approach-using-masked-coreference-resolution
https://dx.doi.org/10.48448/nk3m-8a80
BASE
3
Does referent predictability affect the choice of referential form? A computational approach using masked coreference resolution
BASE
4
Similarity is closeness: using distributional semantic spaces to model similarity in visual and linguistic metaphors
In: Corpus Linguistics and Linguistic Theory 15 (2019) 1, 101-137. Berlin; New York: Mouton de Gruyter
BLLDB
5
What do Entity-Centric Models Learn? Insights from Entity Linking in Multi-Party Dialogue
BASE
6
Putting words in context: LSTM language models and lexical ambiguity
BASE
7
Negated adjectives and antonyms in distributional semantics: not similar?
Aina, Laura; Bernardi, Raffaella; Fernández, Raquel. - : Associazione Italiana di Linguistica Computazionale
BASE
8
Putting words in context: LSTM language models and lexical ambiguity
Boleda, Gemma; Gulordava, Kristina; Aina, Laura. - : ACL (Association for Computational Linguistics)
BASE
9
Modeling word interpretation with deep language models: the interaction between expectations and lexical information
Aina, Laura; Brochhagen, Thomas; Boleda, Gemma. - : Cognitive Science Society
BASE
10
A distributional study of negated adjectives and antonyms
Fernández, Raquel; Bernardi, Raffaella; Aina, Laura. - : CEUR Workshop Proceedings
BASE
11
AMORE-UPF at SemEval-2018 Task 4: BiLSTM with entity library
Westera, Matthijs; Silberer, Carina; Aina, Laura. - : ACL (Association for Computational Linguistics)
BASE
12
How to represent a word and predict it, too: improving tied architectures for language modelling
Gulordava, Kristina; Aina, Laura; Boleda, Gemma. - : ACL (Association for Computational Linguistics)
BASE
13
How to represent a word and predict it, too: improving tied architectures for language modelling
Boleda, Gemma; Aina, Laura; Gulordava, Kristina. - : ACL (Association for Computational Linguistics)
BASE
14
What do entity-centric models learn? Insights from entity linking in multi-party dialogue
Westera, Matthijs; Silberer, Carina; Aina, Laura. - : ACL (Association for Computational Linguistics)
BASE

Hits by source type
Catalogues: 0
Bibliographies: 1
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 13