
Search in the Catalogues and Directories

Hits 1 – 14 of 14

1
Learning Argument Structures with Recurrent Neural Network Grammars
In: Proceedings of the Society for Computation in Linguistics (2022)
Abstract: In targeted syntactic evaluations, the syntactic competence of LMs has been investigated through various syntactic phenomena, among which one of the important domains has been argument structure. The previous literature has tested argument structures exclusively in head-initial languages, where they may be readily predicted from the lexical information of verbs, potentially overestimating the syntactic competence of LMs. In this paper, we explore whether argument structures can be learned by LMs in head-final languages, which could be more challenging given that argument structures must be predicted before verbs are encountered during incremental sentence processing, so that syntactic information should carry more weight than lexical information. Specifically, we examined the double accusative constraint and the double dative constraint in Japanese with sequential and hierarchical LMs: an n-gram model, an LSTM, GPT-2, and an RNNG. Our results demonstrated that the double accusative constraint is captured by all LMs, whereas the double dative constraint is successfully explained only by the hierarchical model. In addition, we probed incremental sentence processing by LMs through the lens of surprisal, and suggested that the hierarchical model may capture the deep semantic roles that verbs assign to arguments, while the sequential models seem to be influenced by surface case alignments.
Keywords: acceptability; argument structure; Computational Linguistics; grammaticality; Japanese; language model; probability; structure
URL: https://scholarworks.umass.edu/cgi/viewcontent.cgi?article=1258&context=scil
https://scholarworks.umass.edu/scil/vol5/iss1/9
BASE
2
Cross-linguistic patterns of morpheme order reflect cognitive biases: An experimental study of case and number morphology ...
BASE
3
Modeling Human Sentence Processing with Left-Corner Recurrent Neural Network Grammars ...
BASE
5
Effective Batching for Recurrent Neural Network Grammars ...
BASE
6
Lower Perplexity is Not Always Human-Like ...
BASE
8
Modeling Human Morphological Competence
In: Front Psychol (2020)
BASE
9
Modeling Morphological Processing in Human Magnetoencephalography
In: Proceedings of the Society for Computation in Linguistics (2020)
BASE
10
Dual suppletion in Japanese
In: Proceedings of the 14th Workshop on Altaic Formal Linguistics (WAFL14) ([2019]), pp. 193-204
Leibniz-Zentrum Allgemeine Sprachwissenschaft
11
Case-Number morpheme order ...
Saldana, Carmen; Oseki, Yohei; Culbertson, Jennifer. Open Science Framework, 2019
BASE
12
Some consequences of simplest Merge and defectiveness in Japanese
In: Proceedings of the 10th Workshop on Altaic Formal Linguistics (WAFL10) ([2018]), pp. 217-228
Leibniz-Zentrum Allgemeine Sprachwissenschaft
13
The reliability of acceptability judgments across languages
In: Glossa: a journal of general linguistics, Vol. 3, No. 1 (2018), Art. 100; ISSN 2397-1835
BASE
14
Wh-Concord in Okinawan = Syntactic Movement + Morphological Merger
In: University of Pennsylvania Working Papers in Linguistics (2016)
BASE

Catalogues: 2 · Bibliographies: 0 · Linked Open Data catalogues: 0 · Online resources: 0 · Open access documents: 12
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy