
Search in the Catalogues and Directories

Hits 1 – 20 of 973

1
An Overview of Indian Spoken Language Recognition from Machine Learning Perspective
In: ACM Transactions on Asian and Low-Resource Language Information Processing, ACM, in press, 2022 (ISSN 2375-4699, eISSN 2375-4702). DOI: 10.1145/3523179. https://hal.inria.fr/hal-03616853
2
Assessing the impact of OCR noise on multilingual event detection over digitised documents
In: International Journal on Digital Libraries, Springer Verlag, 2022 (ISSN 1432-5012, eISSN 1432-1300). DOI: 10.1007/s00799-022-00325-2. https://hal.archives-ouvertes.fr/hal-03635985
3
Introducing the HIPE 2022 Shared Task: Named Entity Recognition and Linking in Multilingual Historical Documents
In: Matthias Hagen, Suzan Verberne, Craig Macdonald, Christin Seifert, Krisztian Balog, Kjetil Nørvåg, Vinay Setty (eds.), Advances in Information Retrieval. 44th European Conference on IR Research, ECIR 2022, Stavanger, Norway, April 10–14, 2022, Proceedings, Part II. Lecture Notes in Computer Science 13186, Springer International Publishing, pp. 347–354, 2022. ISBN 978-3-030-99738-0. DOI: 10.1007/978-3-030-99739-7_44. https://hal.archives-ouvertes.fr/hal-03635971
4
Can Character-based Language Models Improve Downstream Task Performance in Low-Resource and Noisy Language Scenarios?
In: Seventh Workshop on Noisy User-generated Text (W-NUT 2021, co-located with EMNLP 2021), Jan 2022, Punta Cana, Dominican Republic. https://aclanthology.org/2021.wnut-1.47/ ; https://hal.inria.fr/hal-03527328
5
Cross-lingual few-shot hate speech and offensive language detection using meta learning
In: IEEE Access, IEEE, 2022, 10, pp. 14880–14896 (ISSN/eISSN 2169-3536). DOI: 10.1109/ACCESS.2022.3147588. https://hal.archives-ouvertes.fr/hal-03559484
6
Cross-Situational Learning Towards Robot Grounding
In: https://hal.archives-ouvertes.fr/hal-03628290 ; 2022
Abstract: How do children acquire language through unsupervised or noisy supervision? How do their brains process language? We take this perspective to machine learning and robotics, where part of the problem is understanding how language models can perform grounded language acquisition through noisy supervision, and discussing how they can account for brain learning dynamics. Most prior works have tracked the co-occurrence between single words and referents to model how infants learn word-referent mappings. This paper studies cross-situational learning (CSL) with full sentences: we want to understand brain mechanisms that enable children to learn mappings between words and their meanings from full sentences in early language learning. We investigate the CSL task on a few training examples with two sequence-based models: (i) Echo State Networks (ESN) and (ii) Long Short-Term Memory networks (LSTM). Most importantly, we explore several word representations, including One-Hot, GloVe, pretrained BERT, and fine-tuned BERT representations (last-layer token representations), to perform the CSL task. We apply our approach to three diverse datasets (two grounded-language datasets and a robotic dataset) and observe that (1) One-Hot, GloVe, and pretrained BERT representations are less efficient than representations obtained from fine-tuned BERT; (2) ESN online with final learning (FL) yields superior performance over ESN online continual learning (CL), offline learning, and LSTMs, indicating the greater biological plausibility of ESNs and of the cognitive process of sentence reading; (3) LSTMs with fewer hidden units show higher performance on small datasets, but LSTMs with more hidden units are needed to perform reasonably well on larger corpora; (4) ESNs demonstrate better generalization than LSTM models for increasingly large vocabularies.
Overall, these models are able to learn from scratch to link complex relations between words and their corresponding meaning concepts, handling polysemous and synonymous words. Moreover, we argue that such models can extend to help current human-robot interaction studies on language grounding and better understand children's developmental language acquisition. We make the code publicly available.
Keyword: [INFO.INFO-AI]Computer Science [cs]/Artificial Intelligence [cs.AI]; [INFO.INFO-CL]Computer Science [cs]/Computation and Language [cs.CL]; [INFO.INFO-LG]Computer Science [cs]/Machine Learning [cs.LG]; [INFO.INFO-NE]Computer Science [cs]/Neural and Evolutionary Computing [cs.NE]; [INFO.INFO-RB]Computer Science [cs]/Robotics [cs.RO]; [SDV.NEU]Life Sciences [q-bio]/Neurons and Cognition [q-bio.NC]; BERT; cross-situational learning; echo state networks; grounded language; LSTM
URL: https://hal.archives-ouvertes.fr/hal-03628290
https://hal.archives-ouvertes.fr/hal-03628290v2/file/Journal_of_Social_and_Robotics.pdf
https://hal.archives-ouvertes.fr/hal-03628290v2/document
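The abstract above describes training sequence models to map full sentences onto meaning concepts from only a few examples. As a rough illustration of the ESN variant, here is a minimal sketch (not the authors' released code): a fixed random reservoir reads a sentence word by word, and only a ridge-regression readout is trained to predict a multi-hot concept vector. The class name, toy vocabulary, and hyperparameters are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class EchoStateNetwork:
    """Minimal leaky-integrator ESN with a ridge-regression readout."""

    def __init__(self, n_in, n_out, n_res=100, spectral_radius=0.9, leak=0.3, ridge=1e-3):
        self.W_in = rng.uniform(-1.0, 1.0, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # echo-state scaling
        self.W, self.leak, self.ridge, self.n_res = W, leak, ridge, n_res
        self.W_out = np.zeros((n_out, n_res))

    def _final_state(self, sentence):
        # Read word vectors one by one; keep only the state after the last word.
        x = np.zeros(self.n_res)
        for u in sentence:
            x = (1 - self.leak) * x + self.leak * np.tanh(self.W_in @ u + self.W @ x)
        return x

    def fit(self, sentences, concepts):
        X = np.stack([self._final_state(s) for s in sentences])  # (n_samples, n_res)
        Y = np.asarray(concepts, dtype=float)                    # (n_samples, n_out)
        # Closed-form ridge regression: only the readout weights are trained.
        self.W_out = Y.T @ X @ np.linalg.inv(X.T @ X + self.ridge * np.eye(self.n_res))

    def predict(self, sentence):
        return self.W_out @ self._final_state(sentence)


# Toy cross-situational setup: sentences paired with multi-hot concept vectors.
VOCAB = ["the", "cat", "dog", "runs", "sleeps"]
CONCEPTS = ["CAT", "DOG", "RUN", "SLEEP"]
onehot = np.eye(len(VOCAB))

def sent(*words):
    return [onehot[VOCAB.index(w)] for w in words]

train = [
    (sent("the", "cat", "runs"),   [1, 0, 1, 0]),
    (sent("the", "dog", "runs"),   [0, 1, 1, 0]),
    (sent("the", "cat", "sleeps"), [1, 0, 0, 1]),
    (sent("the", "dog", "sleeps"), [0, 1, 0, 1]),
]

esn = EchoStateNetwork(n_in=len(VOCAB), n_out=len(CONCEPTS))
esn.fit([s for s, _ in train], [c for _, c in train])
scores = esn.predict(sent("the", "cat", "runs"))
```

Because the reservoir is fixed and only a linear readout is fit, this sketch mirrors the appeal of ESNs noted in the abstract: very few trainable parameters, making learning from a handful of sentence-meaning pairs feasible.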
7
Cross-Situational Learning Towards Robot Grounding
In: https://hal.archives-ouvertes.fr/hal-03628290 ; 2022
8
A Neural Pairwise Ranking Model for Readability Assessment ...
Lee, Justin; Vajjala, Sowmya. arXiv, 2022
9
A Transformer-Based Contrastive Learning Approach for Few-Shot Sign Language Recognition ...
10
pNLP-Mixer: an Efficient all-MLP Architecture for Language ...
11
Frame Shift Prediction ...
12
MASSIVE: A 1M-Example Multilingual Natural Language Understanding Dataset with 51 Typologically-Diverse Languages ...
13
Adapting BigScience Multilingual Model to Unseen Languages ...
14
Does Corpus Quality Really Matter for Low-Resource Languages? ...
15
A New Generation of Perspective API: Efficient Multilingual Character-level Transformers ...
Lees, Alyssa; Tran, Vinh Q.; Tay, Yi. arXiv, 2022
16
Agreement ...
Tal, Shira. Open Science Framework, 2022
17
Agreement ...
Tal, Shira. Open Science Framework, 2022
18
Natural Language Descriptions of Deep Visual Features ...
19
Learning Bidirectional Translation between Descriptions and Actions with Small Paired Data ...
20
Cross-view Brain Decoding ...


Hits by source type — Catalogues: 25; Bibliographies: 49; Linked Open Data catalogues: 0; Online resources: 0; Open access documents: 923
© 2013 – 2024 Lin|gu|is|tik