
Search in the Catalogues and Directories

Hits 1 – 20 of 115

1
Optimality Theory: Constraint Interaction in Generative Grammar ...
Smolensky, Paul; Prince, Alan S. - Rutgers University, 2022
BASE
2
Compositional processing emerges in neural networks solving math problems
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol 43, iss 43 (2021)
BASE
3
Infinite use of finite means? Evaluating the generalization of center embedding learned from an artificial grammar
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol 43, iss 43 (2021)
BASE
4
Compositional Processing Emerges in Neural Networks Solving Math Problems ...
BASE
5
Distributed neural encoding of binding to thematic roles ...
BASE
6
Infinite use of finite means? Evaluating the generalization of center embedding learned from an artificial grammar ...
BASE
7
Compositional processing emerges in neural networks solving math problems ...
BASE
8
How much do language models copy from their training data? Evaluating linguistic novelty in text generation using RAVEN ...
BASE
9
Compositional Processing Emerges in Neural Networks Solving Math Problems
In: Cogsci (2021)
BASE
10
Emergent Gestural Scores in a Recurrent Neural Network Model of Vowel Harmony
In: Proceedings of the Society for Computation in Linguistics (2021)
BASE
11
Testing for Grammatical Category Abstraction in Neural Language Models
In: Proceedings of the Society for Computation in Linguistics (2021)
BASE
12
Universal linguistic inductive biases via meta-learning ...
BASE
13
Tensor Product Decomposition Networks: Uncovering Representations of Structure Learned by Neural Networks
In: Proceedings of the Society for Computation in Linguistics (2020)
BASE
14
Learning a gradient grammar of French liaison
In: Proceedings of the 2019 Annual Meeting on Phonology; ISSN 2377-3324 (2020)
BASE
15
RNNs Implicitly Implement Tensor Product Representations
In: ICLR 2019 - International Conference on Learning Representations, May 2019, New Orleans, United States; https://hal.archives-ouvertes.fr/hal-02274498 (2019)
BASE
16
Quantum Language Processing ...
BASE
17
Transient blend states and discrete agreement-driven errors in sentence production
In: Proceedings of the Society for Computation in Linguistics (2019)
Abstract: Errors in subject-verb agreement are common in everyday language production. This has been studied using a preamble completion task in which a participant hears or reads a preamble containing inflected nouns and forms a complete English sentence ("The key to the cabinets" could be completed as "The key to the cabinets is gold"). Existing work has focused on errors arising in selecting the correct verb form for production in the presence of a more 'local' noun with different number features ("The key to the cabinets are gold"). However, the same paradigm elicits substantial numbers of preamble errors ("The key to the cabinets" repeated as "The key to the cabinet") that existing theories have largely failed to address. We propose a Gradient Symbolic Computation (GSC) account of agreement and preamble errors. Sentence processing is modeled as a continuous-time, continuous-state stochastic dynamical system. Within this continuous representational space, a subset of states reflect discrete symbolic structures. The remainder are blend states where multiple symbols are simultaneously partially active. Initial phases of computation prefer blend states; an additional dynamic control parameter, commitment strength, pushes the model to discrete structures. This process, combined with stochastic gradient ascent dynamics respecting grammatical constraints on syntactic structures, yields discrete sentence outputs. We propose that transient blend states allow portions of target and non-target syntactic structures to interact, yielding both verb and preamble errors.
Keyword: Computational Linguistics; dynamical systems; neural networks; Gradient Symbolic Computation; Psycholinguistics and Neurolinguistics; sentence production
URL: https://scholarworks.umass.edu/scil/vol2/iss1/54
https://scholarworks.umass.edu/cgi/viewcontent.cgi?article=1067&context=scil
BASE
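The GSC dynamics described in the abstract above can be pictured with a minimal toy sketch: a single blend dimension x in [0, 1] represents partial activation of two competing symbols, noisy gradient ascent on a harmony function drives the state, and a ramping commitment-strength parameter q makes the discrete endpoints (0 and 1) attractors over time. This is an illustrative assumption-laden reduction, not the authors' model; the harmony function, parameter values, and names (`harmony_grad`, `run`) are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def harmony_grad(x, q):
    # Toy grammar harmony: a gradient (1 - x) favoring the target symbol
    # (a stand-in for grammatical constraints on the structure).
    g = 1.0 - x
    # Gradient of the quantization term -q * x^2 (1-x)^2, which penalizes
    # blend states; its pull toward the discrete endpoints 0 and 1
    # scales with the commitment strength q.
    dq = -q * (2 * x * (1 - x)**2 - 2 * x**2 * (1 - x))
    return g + dq

def run(q_rate=0.05, noise=0.3, steps=400, dt=0.05):
    x = 0.5   # start in a maximal blend state: both symbols half active
    q = 0.0   # commitment strength, ramped up during processing
    for _ in range(steps):
        # Euler step of stochastic gradient ascent on harmony.
        x += dt * harmony_grad(x, q) + np.sqrt(dt) * noise * rng.normal()
        x = np.clip(x, 0.0, 1.0)
        q += q_rate   # ramp commitment: discrete states become attractors
    return x

print(run())
```

With noise switched off the trajectory settles on the discrete target (x = 1); with noise on, transient blend states can wander before commitment locks the state in, which is the mechanism the abstract invokes to explain both verb and preamble errors.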
18
Augmenting Compositional Models for Knowledge Base Completion Using Gradient Representations
In: Proceedings of the Society for Computation in Linguistics (2019)
BASE
19
Augmenting Compositional Models for Knowledge Base Completion Using Gradient Representations ...
BASE
20
A Simple Recurrent Unit with Reduced Tensor Product Representations ...
BASE


© 2013 - 2024 Lin|gu|is|tik