
Search in the Catalogues and Directories

Hits 1 – 20 of 41

1
Cross-Situational Learning Towards Robot Grounding
In: https://hal.archives-ouvertes.fr/hal-03628290 (2022)
2
What does the Canary Say? Low-Dimensional GAN Applied to Birdsong
In: https://hal.inria.fr/hal-03244723 (2021)
3
Which Hype for my New Task? Hints and Random Search for Reservoir Computing Hyperparameters
In: ICANN 2021 - 30th International Conference on Artificial Neural Networks, Sep 2021, Bratislava, Slovakia ; https://hal.inria.fr/hal-03203318 (2021)
4
Canary Song Decoder: Transduction and Implicit Segmentation with ESNs and LTSMs
In: ICANN 2021 - 30th International Conference on Artificial Neural Networks, Sep 2021, Bratislava, Slovakia, pp. 71-82, ⟨10.1007/978-3-030-86383-8_6⟩ ; https://hal.inria.fr/hal-03203374 ; https://link.springer.com/chapter/10.1007/978-3-030-86383-8_6 (2021)
5
Hierarchical-Task Reservoir for Online Semantic Analysis from Continuous Speech
In: IEEE Transactions on Neural Networks and Learning Systems (ISSN 2162-237X), IEEE, 2021, ⟨10.1109/TNNLS.2021.3095140⟩ ; https://hal.inria.fr/hal-03031413 ; https://ieeexplore.ieee.org/abstract/document/9548713/metrics#metrics (2021)
6
Editorial: Language and Robotics
In: Frontiers in Robotics and AI (ISSN 2296-9144), Frontiers Media S.A., 2021, 8, ⟨10.3389/frobt.2021.674832⟩ ; https://hal.inria.fr/hal-03533733 (2021)
7
Learning to Parse Sentences with Cross-Situational Learning using Different Word Embeddings Towards Robot Grounding ...
8
Cross-Situational Learning with Reservoir Computing for Language Acquisition Modelling
In: 2020 International Joint Conference on Neural Networks (IJCNN 2020), Jul 2020, Glasgow, Scotland, United Kingdom ; https://hal.inria.fr/hal-02594725 ; https://wcci2020.org/ (2020)
9
Hierarchical-Task Reservoir for Anytime POS Tagging from Continuous Speech
In: 2020 International Joint Conference on Neural Networks (IJCNN 2020), Jul 2020, Glasgow, Scotland, United Kingdom ; https://hal.inria.fr/hal-02594495 ; https://wcci2020.org/ (2020)
10
Language Acquisition with Echo State Networks: Towards Unsupervised Learning
In: ICDL 2020 - IEEE International Conference on Development and Learning, Oct 2020, Valparaiso / Virtual, Chile ; https://hal.inria.fr/hal-02926613 (2020)
11
A Journey in ESN and LSTM Visualisations on a Language Task
In: https://hal.inria.fr/hal-03030248 (2020)
12
Recurrent Neural Networks Models for Developmental Language Acquisition: Reservoirs Outperform LSTMs
In: SNL 2020 - 12th Annual Meeting of the Society for the Neurobiology of Language, Oct 2020, Virtual Edition, Canada ; https://hal.inria.fr/hal-03146558 (2020)
Abstract: We previously developed cortico-striatal models for sentence comprehension (Hinaut & Dominey 2013) and sentence production (Hinaut et al. 2015). The sentence comprehension model is based on the reservoir computing principle: a random recurrent neural network (a reservoir, e.g. a piece of prefrontal cortex) provides a rich recombination of sequential word inputs, and an output layer (e.g. striatum) learns to "read out" the roles of words in the sentence from the internal recurrent dynamics. The model has several interesting properties, like the ability to predict the semantic roles of words during online processing. Additionally, we demonstrated its robustness to various corpus complexities, in different languages, and even its ability to work with bilingual inputs. In this study, we propose to (1) use the model in a new task related to developmental language acquisition (i.e. Cross-Situational Learning), (2) provide a quantitative comparison with one of the best performing neural networks for sequential tasks (an LSTM), and (3) provide a qualitative analysis of the way reservoirs and LSTMs solve the task. This new Cross-Situational Learning task is as follows: for a given sentence, the target output provided often contains more detailed features than what is available in the sentence. Thus, the models not only have to learn how to parse sentences to extract useful information, but also to statistically infer which word is associated with which feature. While reservoir units are modelled as leaky average firing rate neurons, LSTM units are engineered to gate information using a costly and biologically implausible learning algorithm (Back-Propagation Through Time). We found that both models were able to successfully learn the task: the LSTM reached slightly better performance on the basic corpus, but the reservoir significantly outperformed LSTMs on more challenging corpora with increasing vocabulary sizes (for a given set of hyperparameters). We analyzed the hidden activations of internal units of both models. Despite the deep differences between the two models (trained vs. fixed internal weights), we were able to uncover similar inner dynamics: the most useful units (those with the strongest weights to the output layer) seemed tuned to keep track of several specific words in the sentence. Because of its learning algorithm, such behavior is expected in an LSTM but not in a reservoir; in fact, the LSTM contained more of these tuned units than the reservoir. These differences between LSTMs and reservoirs highlight differences between classical Deep Learning approaches (based on the back-propagation algorithm) and more plausible brain learning mechanisms. First, the reservoir is more efficient in terms of training time and cost: the LSTM needs several passes over the training data, while the reservoir uses it only once. Second, only the reservoir model seems to scale to larger corpora without the need to specifically adapt its hyperparameters. Finally, the presence of more tuned units in the LSTM compared to the reservoir might explain why the LSTM tends to overfit the training data and shows limited generalization capabilities when the available training data becomes limited. (A minimal code sketch of this reservoir principle follows the results list below.)
Keyword: [INFO.INFO-AI]Computer Science [cs]/Artificial Intelligence [cs.AI]; [INFO.INFO-LG]Computer Science [cs]/Machine Learning [cs.LG]; [INFO.INFO-NE]Computer Science [cs]/Neural and Evolutionary Computing [cs.NE]; [SDV.NEU]Life Sciences [q-bio]/Neurons and Cognition [q-bio.NC]; Echo State Networks; Reservoir computing
URL: https://hal.inria.fr/hal-03146558
https://hal.inria.fr/hal-03146558/document
https://hal.inria.fr/hal-03146558/file/HinautVariengien_SNL2020_poster.pdf
13
Learning to Parse Grounded Language using Reservoir Computing
In: ICDL-Epirob 2019 - Joint IEEE 9th International Conference on Development and Learning and Epigenetic Robotics, Aug 2019, Oslo, Norway, ⟨10.1109/devlrn.2019.8850718⟩ ; https://hal.inria.fr/hal-02422157 ; https://ieeexplore.ieee.org/abstract/document/8850718 (2019)
14
Teach Your Robot Your Language! Trainable Neural Parser for Modelling Human Sentence Processing: Examples for 15 Languages
In: IEEE Transactions on Cognitive and Developmental Systems (ISSN 2379-8920, EISSN 2379-8939), Institute of Electrical and Electronics Engineers, Inc., 2019, ⟨10.1109/TCDS.2019.2957006⟩ ; https://hal.inria.fr/hal-01964541 ; https://doi.org/10.1109/tcds.2019.2957006 (2019)
15
A Reservoir Model for Intra-Sentential Code-Switching Comprehension in French and English
In: CogSci'19 - 41st Annual Meeting of the Cognitive Science Society, Jul 2019, Montréal, Canada ; https://hal.inria.fr/hal-02432831 ; https://cognitivesciencesociety.org/cogsci-2019/ (2019)
16
Replication of Laje & Mindlin's model producing synthetic syllables
In: European Birdsong Meeting, Apr 2018, Odense, Denmark ; https://hal.inria.fr/hal-01964522 (2018)
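Several of the entries above (e.g. items 3, 5, and 12) rely on the reservoir computing / Echo State Network principle summarized in the abstract of item 12: a fixed random recurrent network of leaky integrator units feeds a linear readout, and only the readout is trained. Below is a minimal Python sketch of that principle. All sizes, parameter values, function names, and the toy data are illustrative assumptions, not taken from the papers; the actual experiments rely on dedicated tooling (e.g. the authors' ReservoirPy library).

import numpy as np

rng = np.random.default_rng(42)

# Illustrative sizes and hyperparameters (assumptions, not from the papers).
n_inputs, n_units, n_outputs = 50, 300, 10
leak_rate, spectral_radius, ridge = 0.3, 0.9, 1e-6

# Fixed random weights: in reservoir computing only the readout is trained.
W_in = rng.uniform(-1.0, 1.0, (n_units, n_inputs))
W = rng.normal(0.0, 1.0, (n_units, n_units))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # scale recurrence

def run_reservoir(inputs):
    # Collect the reservoir state after each input vector (one word per step).
    x = np.zeros(n_units)
    states = []
    for u in inputs:
        # Leaky integration: the "leaky average firing rate" units of the abstract.
        x = (1.0 - leak_rate) * x + leak_rate * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

def train_readout(states, targets):
    # Ridge regression: the output layer "reads out" word roles from the dynamics.
    return np.linalg.solve(states.T @ states + ridge * np.eye(n_units),
                           states.T @ targets)

# Toy usage: a "sentence" of 7 one-hot word vectors with random target features.
sentence = np.eye(n_inputs)[rng.integers(0, n_inputs, size=7)]
targets = rng.normal(size=(7, n_outputs))
states = run_reservoir(sentence)
W_out = train_readout(states, targets)
predictions = states @ W_out  # shape (7, n_outputs)

Because W_in and W stay fixed, training reduces to a single linear solve over the collected states, which is consistent with the abstract's observation that the reservoir needs only one pass over the training data while the LSTM needs several.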

