
Search in the Catalogues and Directories

Hits 1 – 19 of 19

1
Cross-Situational Learning Towards Robot Grounding
In: https://hal.archives-ouvertes.fr/hal-03628290 ; 2022 (2022)
BASE
2
Cross-Situational Learning Towards Robot Grounding
In: https://hal.archives-ouvertes.fr/hal-03628290 ; 2022 (2022)
Abstract: How do children acquire language through unsupervised or noisy supervision? How do their brains process language? We take this perspective to machine learning and robotics, where part of the problem is understanding how language models can perform grounded language acquisition through noisy supervision, and discussing how they can account for brain learning dynamics. Most prior work has tracked the co-occurrence between single words and referents to model how infants learn word-referent mappings. This paper studies cross-situational learning (CSL) with full sentences: we want to understand the brain mechanisms that enable children to learn mappings between words and their meanings from full sentences in early language learning. We investigate the CSL task on a few training examples with two sequence-based models: (i) Echo State Networks (ESNs) and (ii) Long Short-Term Memory networks (LSTMs). Most importantly, we explore several word representations, including One-Hot, GloVe, pretrained BERT, and fine-tuned BERT representations (last-layer token representations), to perform the CSL task. We apply our approach to three diverse datasets (two grounded-language datasets and a robotic dataset) and observe that (1) One-Hot, GloVe, and pretrained BERT representations are less efficient than representations obtained from fine-tuned BERT; (2) ESN online with final learning (FL) yields superior performance over ESN online continual learning (CL), offline learning, and LSTMs, indicating the higher biological plausibility of ESNs with respect to the cognitive process of sentence reading; (3) LSTMs with fewer hidden units show higher performance on small datasets, but LSTMs with more hidden units are needed to perform reasonably well on larger corpora; (4) ESNs demonstrate better generalization than LSTM models for increasingly large vocabularies.
Overall, these models are able to learn from scratch to link complex relations between words and their corresponding meaning concepts, handling polysemous and synonymous words. Moreover, we argue that such models can be extended to support current human-robot interaction studies on language grounding and to better understand children's developmental language acquisition. We make the code publicly available.
Keyword: [INFO.INFO-AI]Computer Science [cs]/Artificial Intelligence [cs.AI]; [INFO.INFO-CL]Computer Science [cs]/Computation and Language [cs.CL]; [INFO.INFO-LG]Computer Science [cs]/Machine Learning [cs.LG]; [INFO.INFO-NE]Computer Science [cs]/Neural and Evolutionary Computing [cs.NE]; [INFO.INFO-RB]Computer Science [cs]/Robotics [cs.RO]; [SDV.NEU]Life Sciences [q-bio]/Neurons and Cognition [q-bio.NC]; BERT; cross-situational learning; echo state networks; grounded language; LSTM
URL: https://hal.archives-ouvertes.fr/hal-03628290/document
https://hal.archives-ouvertes.fr/hal-03628290/file/Journal_of_Social_and_Robotics.pdf
https://hal.archives-ouvertes.fr/hal-03628290
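The setup the abstract describes (an ESN reading a sentence word by word, with a readout trained on the final reservoir state to predict meaning concepts) can be sketched as follows. This is a minimal illustration with an invented toy vocabulary and dataset, not the paper's published code; the "final learning" readout is approximated here by an offline ridge regression on end-of-sentence states.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy cross-situational setup: sentences (sequences of one-hot words) paired
# with sets of meaning concepts. Vocabulary and data are invented for illustration.
vocab = ["the", "cat", "dog", "eats", "sleeps"]
concepts = ["CAT", "DOG", "EAT", "SLEEP"]
w2i = {w: i for i, w in enumerate(vocab)}
c2i = {c: i for i, c in enumerate(concepts)}

sentences = [
    (["the", "cat", "eats"], ["CAT", "EAT"]),
    (["the", "dog", "sleeps"], ["DOG", "SLEEP"]),
    (["the", "dog", "eats"], ["DOG", "EAT"]),
    (["the", "cat", "sleeps"], ["CAT", "SLEEP"]),
]

n_in, n_res, n_out = len(vocab), 100, len(concepts)

# Fixed random input and recurrent weights; the recurrent matrix is rescaled
# to spectral radius 0.9 so the reservoir has the echo state property.
W_in = rng.uniform(-1.0, 1.0, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def final_state(words):
    """Drive the reservoir with the sentence; keep only the last state
    (the readout is trained on end-of-sentence states only)."""
    x = np.zeros(n_res)
    for w in words:
        u = np.zeros(n_in)
        u[w2i[w]] = 1.0
        x = np.tanh(W_in @ u + W @ x)
    return x

X = np.stack([final_state(s) for s, _ in sentences])     # (n_sentences, n_res)
Y = np.zeros((len(sentences), n_out))                     # multi-hot concept targets
for i, (_, cs) in enumerate(sentences):
    for c in cs:
        Y[i, c2i[c]] = 1.0

# Ridge-regression readout: the only trained part of an ESN.
ridge = 1e-6
W_out = Y.T @ X @ np.linalg.inv(X.T @ X + ridge * np.eye(n_res))

# Predict concepts for a sentence; threshold the readout activations.
pred = final_state(["the", "cat", "eats"]) @ W_out.T
print([concepts[i] for i in np.where(pred > 0.5)[0]])
```

Only `W_out` is learned; the reservoir stays fixed, which is what makes ESN training cheap compared with backpropagation-through-time in LSTMs.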
BASE
3
Which Hype for my New Task? Hints and Random Search for Reservoir Computing Hyperparameters
In: ICANN 2021 - 30th International Conference on Artificial Neural Networks ; https://hal.inria.fr/hal-03203318 ; ICANN 2021 - 30th International Conference on Artificial Neural Networks, Sep 2021, Bratislava, Slovakia (2021)
BASE
4
Canary Song Decoder: Transduction and Implicit Segmentation with ESNs and LSTMs
In: https://hal.inria.fr/hal-03203374 ; 2021 (2021)
BASE
5
Which Hype for my New Task? Hints and Random Search for Reservoir Computing Hyperparameters
In: https://hal.inria.fr/hal-03203318 ; 2021 (2021)
BASE
6
Canary Song Decoder: Transduction and Implicit Segmentation with ESNs and LSTMs
In: ICANN 2021 - 30th International Conference on Artificial Neural Networks ; https://hal.inria.fr/hal-03203374 ; ICANN 2021 - 30th International Conference on Artificial Neural Networks, Sep 2021, Bratislava, Slovakia. pp. 71-82, ⟨10.1007/978-3-030-86383-8_6⟩ ; https://link.springer.com/chapter/10.1007/978-3-030-86383-8_6 (2021)
BASE
7
Cross-Situational Learning with Reservoir Computing for Language Acquisition Modelling
In: 2020 International Joint Conference on Neural Networks (IJCNN 2020) ; https://hal.inria.fr/hal-02594725 ; 2020 International Joint Conference on Neural Networks (IJCNN 2020), Jul 2020, Glasgow, Scotland, United Kingdom ; https://wcci2020.org/ (2020)
BASE
8
Language Acquisition with Echo State Networks: Towards Unsupervised Learning
In: ICDL 2020 - IEEE International Conference on Development and Learning ; https://hal.inria.fr/hal-02926613 ; ICDL 2020 - IEEE International Conference on Development and Learning, Oct 2020, Valparaiso / Virtual, Chile (2020)
BASE
9
Recurrent Neural Networks Models for Developmental Language Acquisition: Reservoirs Outperform LSTMs
In: SNL 2020 - 12th Annual Meeting of the Society for the Neurobiology of Language ; https://hal.inria.fr/hal-03146558 ; SNL 2020 - 12th Annual Meeting of the Society for the Neurobiology of Language, Oct 2020, Virtual Edition, Canada (2020)
BASE
10
A Reservoir Model for Intra-Sentential Code-Switching Comprehension in French and English
In: CogSci'19 - 41st Annual Meeting of the Cognitive Science Society ; https://hal.inria.fr/hal-02432831 ; CogSci'19 - 41st Annual Meeting of the Cognitive Science Society, Jul 2019, Montréal, Canada ; https://cognitivesciencesociety.org/cogsci-2019/ (2019)
BASE
11
An Empirical Study on Bidirectional Recurrent Neural Networks for Human Motion Recognition
Tanisaro, Pattreeya; Heidemann, Gunther. - : Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik, 2018. : LIPIcs - Leibniz International Proceedings in Informatics. 25th International Symposium on Temporal Representation and Reasoning (TIME 2018), 2018
BASE
12
Teach Your Robot Your Language! Trainable Neural Parser for Modelling Human Sentence Processing: Examples for 15 Languages
In: https://hal.inria.fr/hal-01665807 ; 2017 (2017)
BASE
13
Recurrent Neural Network for Syntax Learning with Flexible Predicates for Robotic Architectures
In: The Sixth Joint IEEE International Conference Developmental Learning and Epigenetic Robotics (ICDL-EPIROB) ; https://hal.inria.fr/hal-01417697 ; The Sixth Joint IEEE International Conference Developmental Learning and Epigenetic Robotics (ICDL-EPIROB), Sep 2016, Cergy, France ; http://icdl-epirob.org/ (2016)
BASE
14
Recurrent Neural Network Sentence Parser for Multiple Languages with Flexible Meaning Representations for Home Scenarios
In: IROS Workshop on Bio-inspired Social Robot Learning in Home Scenarios ; https://hal.inria.fr/hal-01417667 ; IROS Workshop on Bio-inspired Social Robot Learning in Home Scenarios, Oct 2016, Daejon, South Korea ; https://www.informatik.uni-hamburg.de/wtm/SocialRobotsWorkshop2016/index.php (2016)
BASE
15
Echo State Networks For Arabic Phoneme Recognition ...
Hmad, Nadia; Allen, Tony. - : Zenodo, 2013
BASE
16
Echo State Networks For Arabic Phoneme Recognition ...
Hmad, Nadia; Allen, Tony. - : Zenodo, 2013
BASE
17
On-Line Processing of Grammatical Structure Using Reservoir Computing
In: In A. E. P. Villa, et al.: Artificial Neural Networks and Machine Learning - ICANN 2012 - 22nd International Conference on Artificial Neural Networks ; https://hal.inria.fr/hal-02561301 ; In A. E. P. Villa, et al.: Artificial Neural Networks and Machine Learning - ICANN 2012 - 22nd International Conference on Artificial Neural Networks, Sep 2012, Lausanne, Switzerland. pp.596-603, ⟨10.1007/978-3-642-33269-2_75⟩ (2012)
BASE
18
On-Line Processing of Grammatical Structure Using Reservoir Computing *
In: http://www.sbri.fr/files/publications/hinaut 12 icann.pdf
BASE
19
Learning grammatical structure with Echo State Networks (2007 Special Issue)
In: http://cs.ucsd.edu/%7Eechristiansen/papers/grammar_esn.pdf
BASE

Catalogues: 0 · Bibliographies: 0 · Linked Open Data catalogues: 0 · Online resources: 0 · Open access documents: 19
© 2013 - 2024 Lin|gu|is|tik