Search in the Catalogues and Directories

Hits 21–40 of 1,420

21
Cross-Lingual Transfer Learning for Arabic Task-Oriented Dialogue Systems Using Multilingual Transformer Model mT5
In: Mathematics; Volume 10; Issue 5; Pages: 746 (2022)
BASE
22
Measuring Terminology Consistency in Translated Corpora: Implementation of the Herfindahl-Hirshman Index
In: Information; Volume 13; Issue 2; Pages: 43 (2022)
BASE
23
Comparative Study of Multiclass Text Classification in Research Proposals Using Pretrained Language Models
In: Applied Sciences; Volume 12; Issue 9; Pages: 4522 (2022)
BASE
24
The Role of Task Complexity and Dominant Articulatory Routines in the Acquisition of L3 Spanish
In: Languages; Volume 7; Issue 2; Pages: 90 (2022)
BASE
25
Leveraging Frozen Pretrained Written Language Models for Neural Sign Language Translation
In: Information; Volume 13; Issue 5; Pages: 220 (2022)
BASE
26
Analyzing COVID-19 Medical Papers Using Artificial Intelligence: Insights for Researchers and Medical Professionals
In: Big Data and Cognitive Computing; Volume 6; Issue 1; Pages: 4 (2022)
BASE
27
The Effects of Event Depictions in Second Language Phrasal Vocabulary Learning
Nguyen, Huong Thi Thu. - : Humboldt-Universität zu Berlin, 2022
BASE
28
Practical English speaking language ...
Mamatkulova, Sh. J. - : Zenodo, 2022
BASE
30
Ethnocultural and Sociolinguistic Factors in Teaching Russian as a Foreign Language ...
Abdumuratova, Sanobar Saidovna. - : Academic research in educational sciences, 2022
BASE
32
The Rise and Fall of Linguistic Transfer ...
Ozernyi, Daniil M. - : Zenodo, 2022
BASE
34
Toward an Epistemic Web
In: RatSWD Working Paper Series; 197; 22 (2022)
BASE
35
StaResGRU-CNN with CMedLMs: a stacked residual GRU-CNN with pre-trained biomedical language models for predictive intelligence
Ni, Pin; Li, Gangmin; Hung, Patrick C.K. - : Elsevier Ltd, 2022
BASE
36
Competence management in the UK heritage railway industry
Baughan, Robert Henry Edward. - : The University of Edinburgh, 2022
BASE
37
An Empirical Study of Factors Affecting Language-Independent Models
BASE
38
„A Hund is er scho’“ (roughly: 'He sure is a sly dog'). The Migration of an Expression and Its Bavarian-Hungarian Transfer History
Weithmann, Michael. - : Universität Tübingen, 2022
BASE
39
Neural-based Knowledge Transfer in Natural Language Processing
Wang, Chao. - 2022
Abstract: In Natural Language Processing (NLP), neural-based knowledge transfer, that is, transferring out-of-domain (OOD) knowledge into task-specific neural networks, has been applied to many NLP tasks. To explore it further, this dissertation considers both structured and unstructured OOD knowledge and addresses several representative NLP tasks.

For structured OOD knowledge, we study neural-based knowledge transfer in Machine Reading Comprehension (MRC). In single-passage MRC tasks, to bridge the gap between MRC models and human beings, which mainly shows in the hunger for data and the robustness to noise, we integrate the neural networks of MRC models with the general human knowledge embodied in knowledge bases. On the one hand, we propose a data enrichment method that uses WordNet to extract inter-word semantic connections as general knowledge from each given passage-question pair (a hypothetical sketch of this step follows the entry). On the other hand, we propose a novel MRC model named Knowledge Aided Reader (KAR), which explicitly uses the extracted general knowledge to assist its attention mechanisms. According to the experimental results, KAR is comparable in performance with the state-of-the-art MRC models and significantly more robust to noise than they are. On top of that, when only a subset (20%-80%) of the training examples is available, KAR outperforms the state-of-the-art MRC models by a large margin while remaining reasonably robust to noise.

In multi-hop MRC tasks, to probe the strength of Graph Neural Networks (GNNs), we propose a novel multi-hop MRC model named Graph Aided Reader (GAR), which uses GNN methods to perform multi-hop reasoning but is free of any pre-trained language model and completely end-to-end. For graph construction, GAR utilizes the topic-referencing relations between passages and the entity-sharing relations between sentences, so as to obtain the most sensible reasoning clues (see the second sketch after the entry). For message passing, GAR simulates both top-down and bottom-up reasoning, so as to make the best use of those clues. According to the experimental results, GAR outperforms several competitors that rely on pre-trained language models and filter-reader pipelines, which implies that GAR benefits greatly from its GNN methods. On this basis, GAR can benefit further from applying pre-trained language models, although these mainly facilitate its within-passage rather than its cross-passage reasoning. Moreover, compared with the competitors constructed as filter-reader pipelines, GAR is not only easier to train but also more applicable to low-resource cases.

For unstructured OOD knowledge, we study neural-based knowledge transfer in Natural Language Understanding (NLU), focusing on the transfer between languages, also known as Cross-Lingual Transfer Learning (CLTL). To facilitate the CLTL of NLU models, especially between distant languages, we propose a novel CLTL model named Translation Aided Language Learner (TALL), in which CLTL is integrated with Machine Translation (MT). Specifically, we adopt a pre-trained multilingual language model as our baseline model and construct TALL by appending a decoder to it (see the third sketch after the entry). We then directly fine-tune the baseline model as an NLU model to conduct CLTL, but put TALL through an MT-oriented pre-training before its NLU-oriented fine-tuning.
To make use of unannotated data, we implement the recently proposed Unsupervised Machine Translation (UMT) technique in the MT-oriented pre-training of TALL. According to the experimental results, the application of UMT enables TALL to consistently achieve better CLTL performance than the baseline model without using more annotated data, and the performance gain is relatively prominent in the case of distant languages.
Keywords: Cross-lingual transfer learning; Graph neural network; Information technology; Knowledge base; Knowledge graph; Knowledge transfer; Machine reading comprehension; Multi-hop reasoning; Natural language processing; Natural language understanding; Neural network; Unsupervised machine translation
URL: http://hdl.handle.net/10315/39096
BASE
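The data enrichment step described in the abstract above can be pictured with a short sketch: use WordNet to mark passage-question word pairs that are semantically connected. This is a hypothetical illustration, not the dissertation's code; the NLTK tooling, the hop limit, and the connection test are all assumptions.

    # Hypothetical sketch of WordNet-based inter-word semantic connections,
    # as described for KAR's data enrichment step. Requires NLTK and a prior
    # nltk.download("wordnet"); hop limit and relations are illustrative.
    from itertools import product
    from nltk.corpus import wordnet as wn

    def connected(word_a, word_b, max_hops=2):
        # True if some synset of word_a reaches a synset of word_b within
        # max_hops hypernym/hyponym steps (hop 0 catches shared synsets).
        targets = set(wn.synsets(word_b))
        frontier = set(wn.synsets(word_a))
        for _ in range(max_hops + 1):
            if frontier & targets:
                return True
            frontier = {n for s in frontier
                        for n in s.hypernyms() + s.hyponyms()}
        return False

    def semantic_connections(passage_words, question_words):
        # Collect (passage word, question word) pairs that are connected.
        return [(p, q) for p, q in product(passage_words, question_words)
                if connected(p, q)]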
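Similarly, GAR's graph construction over entity-sharing relations between sentences can be approximated as below. The entity recognizer here (shared capitalized tokens) is deliberately naive and purely illustrative; the abstract does not specify how entities are extracted.

    # Hypothetical sketch of GAR-style graph construction: add an edge
    # between two sentences whenever they share an "entity", approximated
    # here by shared capitalized tokens.
    import re
    from itertools import combinations

    def sentence_graph(sentences):
        entities = [set(re.findall(r"\b[A-Z][A-Za-z]+\b", s))
                    for s in sentences]
        return [(i, j) for i, j in combinations(range(len(sentences)), 2)
                if entities[i] & entities[j]]

    # Example: sentences 0 and 2 share "Vienna"; 1 and 2 share "Danube".
    print(sentence_graph([
        "Vienna is the capital of Austria.",
        "The Danube flows through many cities.",
        "Vienna lies on the Danube.",
    ]))  # -> [(0, 2), (1, 2)]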
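Finally, the TALL construction (a pre-trained multilingual language model with a decoder appended) can be sketched with the Hugging Face transformers library; the library choice and the xlm-roberta-base checkpoint are assumptions for illustration, not the dissertation's actual setup.

    # Hedged sketch of the TALL idea: append a decoder to a pre-trained
    # multilingual encoder, yielding an encoder-decoder that can be
    # pre-trained on (unsupervised) machine translation before NLU
    # fine-tuning. Checkpoint and library are illustrative assumptions.
    from transformers import AutoTokenizer, EncoderDecoderModel

    tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
    model = EncoderDecoderModel.from_encoder_decoder_pretrained(
        "xlm-roberta-base", "xlm-roberta-base"
    )  # encoder weights kept; decoder gains freshly initialized cross-attention
    model.config.decoder_start_token_id = tokenizer.cls_token_id
    model.config.pad_token_id = tokenizer.pad_token_id
    # MT-oriented pre-training would train this model on translation pairs
    # (or UMT objectives); NLU-oriented fine-tuning then reuses the encoder.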
40
Chinese Idioms: Stepping Into L2 Student’s Shoes
In: Acta Linguistica Asiatica, Vol 12, Iss 1 (2022)
BASE


Hits by source: Catalogues (50; 5; 1; 0; 0; 0; 6) · Bibliographies (105; 0; 0; 0; 0; 0; 0; 2; 8) · Linked Open Data catalogues (0) · Online resources (0; 0; 0; 0) · Open access documents (1,281; 2; 0; 0; 0)