1 | Learning Stress Patterns with a Sequence-to-Sequence Neural Network
    In: Proceedings of the Society for Computation in Linguistics (2022)
2 | Learning Repetition, but not Syllable Reversal
    In: Proceedings of the 2020 Annual Meeting on Phonology; ISSN 2377-3324 (2021)

    Abstract: Reduplication is common, but analogous reversal processes are rare, even though reversal, which involves nested rather than crossed dependencies, is less complex on the Chomsky hierarchy. We hypothesize that the explanation is that repetitions can be recognized when they match and reactivate a stored trace in short-term memory, whereas recognizing a reversal requires rearranging the input in working memory before attempting to match it to the stored trace. Repetitions can thus be recognized, and repetition patterns learned, implicitly, whereas reversals require explicit, conscious awareness. To test these hypotheses, participants were trained to recognize either a reduplication or a syllable-reversal pattern, and then asked to state the rule. In two experiments, above-chance classification performance on the Reversal pattern was confined to Correct Staters, whereas above-chance performance on the Reduplication pattern was found with or without correct rule-stating. Final proportion correct was positively correlated with final response time for the Reversal Correct Staters, but for no other group. These results support the hypothesis that reversal, unlike reduplication, requires conscious, time-consuming computation.

    Keywords: explicit learning; formal language theory; implicit learning; learning; memory; reduplication; reversal

    URL: http://journals.linguisticsociety.org/proceedings/index.php/amphonology/article/view/4912
    DOI: https://doi.org/10.3765/amp.v9i0.4912
4 | French schwa and gradient cumulativity
    In: Glossa: a journal of general linguistics, Vol 5, No 1, Article 24; ISSN 2397-1835 (2020)
5 | Assimilation triggers metathesis in Balantak: Implications for theories of possible repair in Optimality Theory
    In: University of Massachusetts Occasional Papers in Linguistics (2020)
8 | Learning Reduplication with a Neural Network without Explicit Variables
    Joe Pater (2019)
9 | Phonological typology in Optimality Theory and Formal Language Theory: Goals and future directions
    Joe Pater (2019)
10 | Learning syntactic parameters without triggers by assigning credit and blame
    Joe Pater (2019)
11 | Generative linguistics and neural networks at 60: foundation, friction, and fusion
    Joe Pater (2019)
12 | Preface: SCiL 2019 Editors' Note
    In: Proceedings of the Society for Computation in Linguistics (2019)
13 | Substance matters: A reply to Jardine 2016
    Joe Pater (2018)
14 | Seq2Seq Models with Dropout can Learn Generalizable Reduplication
    Joe Pater (2018)
15 | Preface: SCiL 2018 Editors' Note
    In: Proceedings of the Society for Computation in Linguistics (2018)
20 | Gradient Exceptionality in Maximum Entropy Grammar with Lexically Specific Constraints