
Search in the Catalogues and Directories

Hits 1 – 20 of 58

1. How Efficiency Shapes Human Language
In: https://hal.archives-ouvertes.fr/hal-03552539 (2022)
2. A verb-frame frequency account of constraints on long-distance dependencies in English
In: Prof. Gibson (2022)
3. Dependency locality as an explanatory principle for word order
In: Prof. Levy (2022)
4. When classifying grammatical role, BERT doesn't care about word order... except when it matters ...
5. Grammatical cues are largely, but not completely, redundant with word meanings in natural language ...
6. Learning Constraints on Wh-Dependencies by Learning How to Efficiently Represent Wh-Dependencies: A Developmental Modeling Investigation With Fragment Grammars
In: Proceedings of the Society for Computation in Linguistics (2022)
7. When Classifying Arguments, BERT Doesn't Care About Word Order. Except When It Matters
In: Proceedings of the Society for Computation in Linguistics (2022)
Abstract: We probe nouns in BERT's contextual embedding space for grammatical role (subject vs. object of a clause), and examine how probing results vary between prototypical examples, where the role matches what we would expect from seeing that word in the context, and non-prototypical examples, where the role is mostly imparted by the context. In this way, we engage with the contrast that has arisen in the literature between studies showing that contextual models are grammatically sensitive and others showing that these models are robust to changes in word order. Our experiments yield three results: 1) grammatical role is recovered in later layers for difficult non-prototypical cases, while prototypical cases are accurate without many layers of context; 2) when we switch the subject and the object of a sentence (e.g., "The chef cut the onion" vs. "The onion cut the chef"), the same word (e.g., "onion") can be fluently identified as both a subject and an object; 3) subjecthood probing breaks if we ablate local word order by shuffling words locally, breaking grammaticality.
Keywords: BERT; computational linguistics; contextual embeddings; grammatical role; prototype; subjecthood; verb arguments; word order
URL: https://scholarworks.umass.edu/scil/vol5/iss1/18
https://scholarworks.umass.edu/cgi/viewcontent.cgi?article=1244&context=scil
8. Word order affects the frequency of adjective use across languages
In: Proceedings of the Annual Meeting of the Cognitive Science Society, vol. 43, iss. 43 (2021)
9. Syntactic dependencies correspond to word pairs with high mutual information
In: Association for Computational Linguistics (2021)
10. Hierarchical Representation in Neural Language Models: Suppression and Recovery of Expectations
In: Association for Computational Linguistics (2021)
11. Structural Supervision Improves Few-Shot Learning and Syntactic Generalization in Neural Language Models
In: Association for Computational Linguistics (2021)
12. Syntactic dependencies correspond to word pairs with high mutual information
In: Association for Computational Linguistics (2021)
13. Structural Supervision Improves Learning of Non-Local Grammatical Dependencies
In: Association for Computational Linguistics (2021)
14. Maze Made Easy: Better and easier measurement of incremental processing difficulty
In: Other repository (2021)
15. An Information-Theoretic Characterization of Morphological Fusion ...
16. Deep Subjecthood: Higher-Order Grammatical Features in Multilingual BERT ...
17. Multilingual BERT, Ergativity, and Grammatical Subjecthood ...
Papadimitriou, Isabel; Chi, Ethan A.; Futrell, Richard. University of Massachusetts Amherst, 2021
18. Sensitivity as a Complexity Measure for Sequence Classification Tasks ...
19. What do RNN Language Models Learn about Filler–Gap Dependencies?
In: Association for Computational Linguistics (2021)
20. Language Learning and Processing in People and Machines
In: Association for Computational Linguistics (2021)


Hits by source type:
Catalogues: 2
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 56
© 2013 – 2024 Lin|gu|is|tik | Imprint | Privacy Policy