1. How Efficiency Shapes Human Language
   In: https://hal.archives-ouvertes.fr/hal-03552539 (2022)

2. A verb-frame frequency account of constraints on long-distance dependencies in English
   In: Prof. Gibson (2022)

3. Dependency locality as an explanatory principle for word order
   In: Prof. Levy (2022)

4. When classifying grammatical role, BERT doesn't care about word order... except when it matters ...

5. Grammatical cues are largely, but not completely, redundant with word meanings in natural language ...
   Abstract: The combinatorial power of language has historically been argued to be enabled by syntax: rules that allow words to combine hierarchically to convey complex meanings. But how important are these rules in practice? We performed a broad-coverage cross-linguistic investigation of the importance of grammatical cues for interpretation. First, English and Russian speakers (n=484) were presented with subjects, verbs, and objects (in random order and with morphological markings removed) extracted from naturally occurring sentences, and were asked to identify which noun is the agent of the action. Accuracy was high in both languages (~89% in English, ~87% in Russian), suggesting that word meanings strongly constrain who is doing what to whom. Next, we trained a neural-network classifier on a similar task: predicting which nominal in a subject-verb-object triad is the subject. Across 30 languages from eight language families, performance was consistently high: a median accuracy of 87%, comparable to the ...
   Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
   URL: https://dx.doi.org/10.48550/arxiv.2201.12911 ; https://arxiv.org/abs/2201.12911

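The abstract above describes a concrete task: given a subject-verb-object triad with word order randomized and morphology stripped, identify which noun is the agent. The paper trains neural-network classifiers for this; as a minimal sketch of the task setup only, the toy heuristic below picks the most agent-like noun from a hypothetical animacy lexicon. All words, scores, and test triads are invented for illustration and are not the paper's data or model.

```python
# Toy illustration of the agent-identification task from the abstract:
# order is destroyed by shuffling, so only lexical meaning (here, a
# hypothetical animacy score) is available to recover "who did what".
import random

ANIMACY = {  # invented lexical feature: higher = more agent-like
    "dog": 0.9, "chef": 0.95, "girl": 0.95,
    "bone": 0.1, "soup": 0.05, "ball": 0.1,
}

def guess_agent(triad):
    """Pick the most animate noun from a shuffled (noun, verb, noun) triad."""
    nouns = [w for w in triad if w in ANIMACY]  # the verb has no animacy entry
    return max(nouns, key=lambda w: ANIMACY[w])

examples = [  # (gold agent, verb, patient) - invented test items
    ("dog", "chews", "bone"),
    ("chef", "stirs", "soup"),
    ("girl", "kicks", "ball"),
]

correct = 0
for agent, verb, patient in examples:
    triad = [agent, verb, patient]
    random.shuffle(triad)  # remove word-order information, as in the study
    if guess_agent(triad) == agent:
        correct += 1
print(f"accuracy: {correct}/{len(examples)}")
```

On these invented items the heuristic is order-invariant by construction, which is the point of the task design: any accuracy above chance must come from word meanings, not from position.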
6. Learning Constraints on Wh-Dependencies by Learning How to Efficiently Represent Wh-Dependencies: A Developmental Modeling Investigation With Fragment Grammars
   In: Proceedings of the Society for Computation in Linguistics (2022)

7. When Classifying Arguments, BERT Doesn't Care About Word Order. Except When It Matters
   In: Proceedings of the Society for Computation in Linguistics (2022)

8. Word order affects the frequency of adjective use across languages
   In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)

9. Syntactic dependencies correspond to word pairs with high mutual information
   In: Association for Computational Linguistics (2021)

10. Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations
    In: Association for Computational Linguistics (2021)

11. Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models
    In: Association for Computational Linguistics (2021)

13. Structural Supervision Improves Learning of Non-Local Grammatical Dependencies
    In: Association for Computational Linguistics (2021)

14. Maze Made Easy: Better and easier measurement of incremental processing difficulty
    In: Other repository (2021)

15. An Information-Theoretic Characterization of Morphological Fusion ...

16. Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT ...

17. Multilingual BERT, Ergativity, and Grammatical Subjecthood ...

18. Sensitivity as a Complexity Measure for Sequence Classification Tasks ...

19. What do RNN Language Models Learn about Filler–Gap Dependencies?
    In: Association for Computational Linguistics (2021)

20. Language Learning and Processing in People and Machines
    In: Association for Computational Linguistics (2021)