3 | Supplementary material from "Simpler grammar, larger vocabulary: How population size affects language" ...
Source: BASE

5 | Additional Variations of Simulation Parameters from "Simpler grammar, larger vocabulary: How population size affects language" ...
Source: BASE

7 | Simpler grammar, larger vocabulary: How population size affects language
Source: BASE

13 | Identification of probabilities

Abstract:
Within psychology, neuroscience and artificial intelligence, there has been increasing interest in the proposal that the brain builds probabilistic models of sensory and linguistic input: that is, that it infers a probabilistic model from a sample. The practical problems of such inference are substantial: the brain has limited data and restricted computational resources. But there is a more fundamental question: is it possible, even in principle, to infer a probabilistic model from a sample? We explore this question and find some surprisingly positive and general results. First, for a broad class of probability distributions characterized by computability restrictions, we specify a learning algorithm that will almost surely identify a probability distribution in the limit, given a finite i.i.d. sample of sufficient but unknown length. This is similarly shown to hold for sequences generated by a broad class of Markov chains, subject to computability assumptions. The technical tool is the strong law of large numbers. Second, for a large class of dependent sequences, we specify an algorithm which identifies in the limit a computable measure for which the sequence is typical, in the sense of Martin-Löf (there may be more than one such measure). The technical tool is the theory of Kolmogorov complexity. We analyze the associated predictions in both cases. We also briefly consider special cases, including language learning, and wider theoretical implications for psychology.

Keywords: BF Psychology; QA Mathematics; RC0321 Neuroscience. Biological psychiatry. Neuropsychiatry; TA Engineering (General). Civil engineering (General)

URLs:
http://wrap.warwick.ac.uk/84258/1/WRAP_0973769-wbs-291116-identnick.pdf
http://wrap.warwick.ac.uk/84258/
http://wrap.warwick.ac.uk/84258/7/WRAP_1-s2.0-S0022249616301432-main.pdf
https://doi.org/10.1016/j.jmp.2016.11.004

Source: BASE

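The abstract's first result, almost-sure identification of a distribution in the limit via the strong law of large numbers, can be illustrated with a toy sketch. This is not the paper's algorithm, just a minimal Gold-style identification-by-enumeration example for coin biases: enumerate the rationals in (0, 1) and return the first candidate within a shrinking tolerance of the empirical frequency. Because the tolerance shrinks more slowly than the almost-sure fluctuations of the empirical mean, the learner's guess eventually settles on the true bias.

```python
from fractions import Fraction
import random

def enumerate_rationals():
    """Enumerate the rationals in (0, 1) by increasing denominator."""
    q = 2
    while True:
        for p in range(1, q):
            f = Fraction(p, q)
            if f.denominator == q:  # skip non-reduced duplicates such as 2/4
                yield f
        q += 1

def guess(sample):
    """Return the first enumerated rational within a shrinking tolerance
    of the empirical mean of a 0/1 sample."""
    n = len(sample)
    emp = sum(sample) / n
    tol = n ** (-1 / 3)        # shrinks slower than the SLLN fluctuations
    gen = enumerate_rationals()
    for _ in range(10 * n):    # bounded search, enough for small denominators
        q = next(gen)
        if abs(emp - q) < tol:
            return q
    return None

# Simulate an i.i.d. sample from a coin with true bias 1/3.
random.seed(0)
true_p = Fraction(1, 3)
sample = [1 if random.random() < true_p else 0 for _ in range(20000)]
print(guess(sample))  # with enough data, the guess stabilizes on 1/3
```

Any candidate earlier in the enumeration than the true bias is eventually rejected (its fixed distance from the truth exceeds the shrinking tolerance), while the true bias itself is eventually accepted forever; this mirrors, very loosely, the role the strong law of large numbers plays in the paper's argument.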
15 | Instantaneous conventions: the emergence of flexible communicative signals
Source: BASE

18 | The language faculty that wasn't: a usage-based account of natural language recursion
Source: BASE