1. Disentangling Syntax and Semantics in the Brain with Deep Networks
   In: ICML 2021 - 38th International Conference on Machine Learning, Jul 2021, online conference, France. https://hal.archives-ouvertes.fr/hal-03361421
2. The Mapping of Deep Language Models on Brain Responses Primarily Depends on their Performance
   In: https://hal.archives-ouvertes.fr/hal-03361439 (2021)
4. Can RNNs learn Recursive Nested Subject-Verb Agreements?
   Abstract: One of the fundamental principles of contemporary linguistics states that language processing requires the ability to extract recursively nested tree structures. However, it remains unclear whether and how this code could be implemented in neural circuits. Recent advances in Recurrent Neural Networks (RNNs), which achieve near-human performance in some language tasks, provide a compelling model to address such questions. Here, we present a new framework to study recursive processing in RNNs, using subject-verb agreement as a probe into the representations of the neural network. We trained six distinct types of RNNs on a simplified probabilistic context-free grammar designed to independently manipulate the length of a sentence and the depth of its syntactic tree. All RNNs generalized to subject-verb dependencies longer than those seen during training. However, none systematically generalized to deeper tree structures, even those with a structural bias towards learning nested trees (i.e., stack-RNNs). In ...
   Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
   URL: https://arxiv.org/abs/2101.02258 ; https://dx.doi.org/10.48550/arxiv.2101.02258
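To make the abstract's setup concrete, here is a minimal sketch of the kind of stimulus a nested subject-verb agreement probe uses. This is an illustrative toy grammar, not the paper's actual probabilistic context-free grammar: center-embedded sentences are built to a chosen nesting depth, and each noun must agree in number with its matching verb.

```python
import random

# Toy lexicon (hypothetical, for illustration only): singular and
# plural noun-verb pairs that must agree in number.
SING = [("dog", "runs"), ("cat", "sees")]
PLUR = [("dogs", "run"), ("cats", "see")]

def nested_sentence(depth, rng=random):
    """Build a center-embedded sentence of the given nesting depth,
    e.g. depth=2 -> 'the dog the cats see runs'."""
    nouns, verbs = [], []
    for _ in range(depth):
        noun, verb = rng.choice(SING + PLUR)
        nouns.append("the " + noun)
        verbs.append(verb)
    # Center embedding: nouns appear outside-in, verbs inside-out,
    # so the i-th noun agrees with the i-th verb from the end.
    return " ".join(nouns + verbs[::-1])
```

Increasing `depth` deepens the syntactic tree without changing the lexicon, which is the kind of independent manipulation of depth (vs. linear length) that the abstract describes.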
5. Neural dynamics of phoneme sequencing in real speech jointly encode order and invariant content
   In: https://hal.archives-ouvertes.fr/hal-03089733 (2020)
6. Language processing in brains and deep neural networks: computational convergence and its limits
   In: https://hal.archives-ouvertes.fr/hal-03089737 (2020)
7. What Limits Our Capacity to Process Nested Long-Range Dependencies in Sentence Comprehension?
   In: Entropy, MDPI, 2020, 22 (4), pp. 446. ISSN 1099-4300. DOI: 10.3390/e22040446. https://hal.archives-ouvertes.fr/hal-03089730
9. Opportunities and challenges for a maturing science of consciousness
   In: Nature Human Behaviour, Nature Research, 2019, 3 (2), pp. 104-107. EISSN 2397-3374. DOI: 10.1038/s41562-019-0531-8. https://hal.archives-ouvertes.fr/hal-02355093