1. Quality Assurance of Generative Dialog Models in an Evolving Conversational Agent Used for Swedish Language Practice ...
2. Slangvolution: A Causal Analysis of Semantic Change and Frequency Dynamics in Slang ...
3. Similarity between person roles in a card sorting experiment ...
4. SPT-Code: Sequence-to-Sequence Pre-Training for Learning Source Code Representations ...
6. Ensemble of Opinion Dynamics Models to Understand the Role of the Undecided in the Vaccination Debate ...
9. Generating Authentic Adversarial Examples beyond Meaning-preserving with Doubly Round-trip Translation ...
10. Pirá: A Bilingual Portuguese-English Dataset for Question-Answering about the Ocean ...
11. A comparative study of several parameterizations for speaker recognition ...
12. A Neural Pairwise Ranking Model for Readability Assessment ...
13. A bilingual approach to specialised adjectives through word embeddings in the karstology domain ...
14. Speaker verification in mismatched training and testing conditions ...
15. Universal Conditional Masked Language Pre-training for Neural Machine Translation ...
16. SMDT: Selective Memory-Augmented Neural Document Translation ...
Abstract: Existing document-level neural machine translation (NMT) models have extensively explored different context settings to guide target generation. However, little attention has been paid to drawing on more diverse contexts for richer context information. In this paper, we propose a Selective Memory-augmented Neural Document Translation model to handle documents with a large hypothesis space of context. Specifically, we retrieve similar bilingual sentence pairs from the training corpus to augment the global context, and then extend the two-stream attention model with a selective mechanism to capture the local context and diverse global contexts. This unified approach allows our model to be trained elegantly on three publicly available document-level machine translation datasets, and it significantly outperforms previous document-level NMT models. ...
Keywords: Artificial Intelligence (cs.AI); Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2201.01631 https://arxiv.org/abs/2201.01631
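The retrieval step the abstract describes, fetching similar bilingual sentence pairs from the training corpus to serve as translation memory, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the bag-of-words cosine similarity, the `retrieve_memory` function, and the toy corpus are all assumptions made for the example.

```python
# Sketch of retrieving similar bilingual pairs as "memory" for document NMT.
# Similarity metric and all names here are illustrative assumptions, not SMDT's.
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve_memory(source: str, corpus: list, k: int = 2) -> list:
    """Return the k (source, target) pairs most similar to the query sentence."""
    q = Counter(source.lower().split())
    scored = [(cosine(q, Counter(s.lower().split())), s, t) for s, t in corpus]
    scored.sort(key=lambda x: x[0], reverse=True)
    return [(s, t) for score, s, t in scored[:k] if score > 0]

# Toy training corpus of (source, target) sentence pairs.
corpus = [
    ("the cat sat on the mat", "le chat est assis sur le tapis"),
    ("machine translation is hard", "la traduction automatique est difficile"),
    ("the dog sat on the rug", "le chien est assis sur le tapis"),
]
memory = retrieve_memory("the cat sat on the mat today", corpus, k=2)
```

A full model would then attend over these retrieved pairs (the global context) alongside neighbouring sentences in the document (the local context) via the selective two-stream attention the abstract mentions.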
17. Learning How to Translate North Korean through South Korean ...
18. When do Contrastive Word Alignments Improve Many-to-many Neural Machine Translation? ...
19. Conditional Bilingual Mutual Information Based Adaptive Training for Neural Machine Translation ...