
Search in the Catalogues and Directories

Hits 1 – 20 of 58

1
How Efficiency Shapes Human Language
In: https://hal.archives-ouvertes.fr/hal-03552539 (2022)
BASE
2
A verb-frame frequency account of constraints on long-distance dependencies in English
In: Prof. Gibson (2022)
BASE
3
Dependency locality as an explanatory principle for word order
In: Prof. Levy (2022)
BASE
4
When classifying grammatical role, BERT doesn't care about word order... except when it matters ...
BASE
5
Grammatical cues are largely, but not completely, redundant with word meanings in natural language ...
BASE
6
Learning Constraints on Wh-Dependencies by Learning How to Efficiently Represent Wh-Dependencies: A Developmental Modeling Investigation With Fragment Grammars
In: Proceedings of the Society for Computation in Linguistics (2022)
BASE
7
When Classifying Arguments, BERT Doesn't Care About Word Order. Except When It Matters
In: Proceedings of the Society for Computation in Linguistics (2022)
BASE
8
Word order affects the frequency of adjective use across languages
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol 43, iss 43 (2021)
BASE
9
Syntactic dependencies correspond to word pairs with high mutual information
In: Association for Computational Linguistics (2021)
BASE
10
Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations
In: Association for Computational Linguistics (2021)
BASE
11
Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models
In: Association for Computational Linguistics (2021)
BASE
12
Syntactic dependencies correspond to word pairs with high mutual information
In: Association for Computational Linguistics (2021)
BASE
13
Structural Supervision Improves Learning of Non-Local Grammatical Dependencies
In: Association for Computational Linguistics (2021)
BASE
14
Maze Made Easy: Better and easier measurement of incremental processing difficulty
In: Other repository (2021)
BASE
15
An Information-Theoretic Characterization of Morphological Fusion ...
BASE
16
Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT ...
BASE
17
Multilingual BERT, Ergativity, and Grammatical Subjecthood ...
Papadimitriou, Isabel; Chi, Ethan A.; Futrell, Richard. University of Massachusetts Amherst, 2021
BASE
18
Sensitivity as a Complexity Measure for Sequence Classification Tasks ...
Abstract: We introduce a theoretical framework for understanding and predicting the complexity of sequence classification tasks, using a novel extension of the theory of Boolean function sensitivity. The sensitivity of a function, given a distribution over input sequences, quantifies the number of disjoint subsets of the input sequence that can each be individually changed to change the output. We argue that standard sequence classification methods are biased towards learning low-sensitivity functions, so that tasks requiring high sensitivity are more difficult. To that end, we show analytically that simple lexical classifiers can only express functions of bounded sensitivity, and we show empirically that low-sensitivity functions are easier to learn for LSTMs. We then estimate sensitivity on 15 NLP tasks, finding that sensitivity is higher on challenging tasks collected in GLUE than on simple text classification tasks, and that sensitivity predicts the performance both of simple lexical classifiers and of vanilla ...
Note: Accepted by TACL. This is a pre-MIT Press publication version ...
Keywords: Computation and Language (cs.CL); Computational Complexity (cs.CC); FOS: Computer and information sciences; Machine Learning (cs.LG)
URL: https://dx.doi.org/10.48550/arxiv.2104.10343
https://arxiv.org/abs/2104.10343
BASE
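The sensitivity measure described in the abstract of hit 18 can be made concrete with a small example. The Python sketch below is only an illustration under simplifying assumptions, not the paper's estimator: it computes plain single-position sensitivity (how many individual positions can be flipped to change the output), averaged over uniformly random binary strings, whereas the paper generalizes to disjoint subsets of positions and to distributions over natural-language inputs. All function names and parameters here are chosen for the example.

# Minimal sketch (illustrative, not the paper's estimator): average
# single-position sensitivity of two toy sequence classifiers over
# uniformly random binary strings.
import random

def parity(seq):
    # High-sensitivity function: label is the XOR of all bits.
    return sum(seq) % 2

def contains_one(seq):
    # Low-sensitivity function: label is whether any 1 occurs.
    return int(any(seq))

def avg_sensitivity(f, length=16, samples=2000, rng=random):
    # For each sampled input, count how many single positions can be
    # flipped to change f's output; return the average of that count.
    total = 0
    for _ in range(samples):
        x = [rng.randint(0, 1) for _ in range(length)]
        y = f(x)
        flips = 0
        for i in range(length):
            x[i] ^= 1          # flip position i
            if f(x) != y:
                flips += 1
            x[i] ^= 1          # restore position i
        total += flips
    return total / samples

print("parity      :", avg_sensitivity(parity))        # close to 16: every flip matters
print("contains_one:", avg_sensitivity(contains_one))  # close to 0: flips rarely matter

Parity is the classic high-sensitivity case, while the existence check is nearly insensitive; on the paper's account, tasks of the former kind should be harder for standard sequence classifiers to learn.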
19
What do RNN Language Models Learn about Filler–Gap Dependencies?
In: Association for Computational Linguistics (2021)
BASE
20
Language Learning and Processing in People and Machines
In: Association for Computational Linguistics (2021)
BASE


Catalogues: 2
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 56