1. How Efficiency Shapes Human Language
   In: https://hal.archives-ouvertes.fr/hal-03552539 (2022)

2. A verb-frame frequency account of constraints on long-distance dependencies in English
   In: Prof. Gibson (2022)

3. Dependency locality as an explanatory principle for word order
   In: Prof. Levy (2022)
   Abstract: © 2020 Printed with the permission of Richard Futrell, Roger P. Levy, & Edward Gibson. This work focuses on explaining both grammatical universals of word order and quantitative word-order preferences in usage by means of a simple efficiency principle: dependency locality. In its simplest form, dependency locality holds that words linked in a syntactic dependency (any head–dependent relationship) should be close in linear order. We give large-scale corpus evidence that dependency locality predicts word order in both grammar and usage, beyond what would be expected from independently motivated principles, and demonstrate a means for dissociating grammar and usage in corpus studies. Finally, we discuss previously undocumented variation in dependency length and how it correlates with other linguistic features such as head direction, providing a rich set of explananda for future linguistic theories.
   URL: https://hdl.handle.net/1721.1/138802.2

4. When classifying grammatical role, BERT doesn't care about word order... except when it matters ...

5. Grammatical cues are largely, but not completely, redundant with word meanings in natural language ...

6. Learning Constraints on Wh-Dependencies by Learning How to Efficiently Represent Wh-Dependencies: A Developmental Modeling Investigation With Fragment Grammars
   In: Proceedings of the Society for Computation in Linguistics (2022)

7. When Classifying Arguments, BERT Doesn't Care About Word Order. Except When It Matters
   In: Proceedings of the Society for Computation in Linguistics (2022)

8. Word order affects the frequency of adjective use across languages
   In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)

9. Syntactic dependencies correspond to word pairs with high mutual information
   In: Association for Computational Linguistics (2021)

10. Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations
    In: Association for Computational Linguistics (2021)

11. Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models
    In: Association for Computational Linguistics (2021)

12. Syntactic dependencies correspond to word pairs with high mutual information
    In: Association for Computational Linguistics (2021)

13. Structural Supervision Improves Learning of Non-Local Grammatical Dependencies
    In: Association for Computational Linguistics (2021)

14. Maze Made Easy: Better and easier measurement of incremental processing difficulty
    In: Other repository (2021)

15. An Information-Theoretic Characterization of Morphological Fusion ...

16. Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT ...

17. Multilingual BERT, Ergativity, and Grammatical Subjecthood ...

18. Sensitivity as a Complexity Measure for Sequence Classification Tasks ...

19. What do RNN Language Models Learn about Filler–Gap Dependencies?
    In: Association for Computational Linguistics (2021)

20. Language Learning and Processing in People and Machines
    In: Association for Computational Linguistics (2021)