
Search in the Catalogues and Directories

Hits 1 – 20 of 33

1. Continual Mixed-Language Pre-Training for Extremely Low-Resource Neural Machine Translation (BASE)
2. Language Models are Few-shot Multilingual Learners (BASE)
3. BiToD: A Bilingual Multi-Domain Dataset For Task-Oriented Dialogue Modeling (BASE)
4. Are Multilingual Models Effective in Code-Switching? (BASE)
5. Adapting High-resource NMT Models to Translate Low-resource Related Languages without Parallel Data (BASE)
6. Learning Fast Adaptation on Cross-Accented Speech Recognition (BASE)
7. Exploring Fine-tuning Techniques for Pre-trained Cross-lingual Models via Continual Learning (BASE)
8. XPersona: Evaluating Multilingual Personalized Chatbot (BASE)
9. Meta-Transfer Learning for Code-Switched Speech Recognition (BASE)
10. On the Importance of Word Order Information in Cross-lingual Sequence Labeling (BASE)
11. Attention-Informed Mixed-Language Training for Zero-shot Cross-lingual Task-oriented Dialogue Systems (BASE)
12. Zero-shot Cross-lingual Dialogue Systems with Transferable Latent Variables (BASE)
Abstract: Despite the surging demand for multilingual task-oriented dialogue systems (e.g., Alexa, Google Home), there has been little research on multilingual or cross-lingual scenarios. Hence, we propose a zero-shot adaptation of task-oriented dialogue systems to low-resource languages. To tackle this challenge, we first use a set of very few parallel word pairs to refine the aligned cross-lingual word-level representations. We then employ a latent variable model to cope with the variance of similar sentences across different languages, which is induced by imperfect cross-lingual alignments and inherent differences between languages. Finally, the experimental results show that, even though we utilize far fewer external resources, our model achieves better adaptation performance on natural language understanding tasks (i.e., intent detection and slot filling) than the current state-of-the-art model in the zero-shot scenario.
Comment: Accepted at EMNLP 2019
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; Machine Learning (cs.LG)
URL: https://dx.doi.org/10.48550/arxiv.1911.04081
https://arxiv.org/abs/1911.04081
13. Towards Universal End-to-End Affect Recognition from Multilingual Speech by ConvNets (BASE)
14. Code-Switched Language Models Using Neural Based Synthetic Data from Parallel Sentences (BASE)
15. Hierarchical Meta-Embeddings for Code-Switching Named Entity Recognition (BASE)
16. GlobalTrait: Personality Alignment of Multilingual Word Embeddings (BASE)
17. Learn to Code-Switch: Data Augmentation using Copy Mechanism on Language Modeling (BASE)
18. Mem2Seq: Effectively Incorporating Knowledge Bases into End-to-End Task-Oriented Dialog Systems (BASE)
19. Bilingual Character Representation for Efficiently Addressing Out-of-Vocabulary Words in Code-Switching Named Entity Recognition (BASE)
20. Code-Switching Language Modeling using Syntax-Aware Multi-Task Learning (BASE)


© 2013 - 2024 Lin|gu|is|tik