
Search in the Catalogues and Directories

Hits 1 – 20 of 32

1
Self-Training Sampling with Monolingual Data Uncertainty for Neural Machine Translation ...
2
Self-Training Sampling with Monolingual Data Uncertainty for Neural Machine Translation ...
3
On the Copying Behaviors of Pre-Training for Neural Machine Translation ...
4
Multi-Task Learning with Shared Encoder for Non-Autoregressive Machine Translation ...
5
On the Inference Calibration of Neural Machine Translation ...
Wang, Shuo; Tu, Zhaopeng; Shi, Shuming. - : arXiv, 2020
6
EmpDG: Multi-resolution Interactive Empathetic Dialogue Generation ...
7
On the Sparsity of Neural Machine Translation Models ...
Abstract: Modern neural machine translation (NMT) models employ a large number of parameters, which leads to serious over-parameterization and typically causes the underutilization of computational resources. In response to this problem, we empirically investigate whether the redundant parameters can be reused to achieve better performance. Experiments and analyses are systematically conducted on different datasets and NMT architectures. We show that: 1) the pruned parameters can be rejuvenated to improve the baseline model by up to +0.8 BLEU points; 2) the rejuvenated parameters are reallocated to enhance the ability of modeling low-level lexical information. (EMNLP 2020)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2010.02646
https://arxiv.org/abs/2010.02646
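The abstract above mentions pruning redundant NMT parameters and then "rejuvenating" them. Purely as an illustration of what such a pipeline can look like (a minimal hypothetical numpy sketch, not the paper's implementation; the function names, the magnitude-pruning criterion, and the re-initialization scheme are all assumptions), one might write:

```python
# Illustrative sketch only: magnitude-based pruning of a weight matrix,
# followed by "rejuvenation", i.e. re-initializing the pruned slots so they
# can be trained again instead of staying at zero.
import numpy as np

rng = np.random.default_rng(0)

def magnitude_prune(weights: np.ndarray, prune_ratio: float) -> np.ndarray:
    """Return a boolean mask that zeroes out the smallest-magnitude weights."""
    k = int(prune_ratio * weights.size)
    if k == 0:
        return np.ones_like(weights, dtype=bool)
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return np.abs(weights) > threshold

def rejuvenate(weights: np.ndarray, mask: np.ndarray, scale: float = 0.02) -> np.ndarray:
    """Re-initialize pruned positions with small random values (assumed init scheme)."""
    fresh = rng.normal(0.0, scale, size=weights.shape)
    return np.where(mask, weights, fresh)

# Toy example on a single weight matrix.
W = rng.normal(0.0, 0.1, size=(4, 4))
mask = magnitude_prune(W, prune_ratio=0.5)    # keep the largest-magnitude 50%
W_pruned = W * mask                           # pruned model
W_rejuvenated = rejuvenate(W_pruned, mask)    # pruned slots receive fresh values
print(f"zeros after pruning: {(W_pruned == 0).sum()}, after rejuvenation: {(W_rejuvenated == 0).sum()}")
```

In a real training loop the rejuvenated parameters would then be fine-tuned together with the retained ones; the sketch only shows the masking and re-initialization step.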
8
Assessing the Bilingual Knowledge Learned by Neural Machine Translation Models ...
He, Shilin; Wang, Xing; Shi, Shuming. - : arXiv, 2020
9
Understanding and Improving Lexical Choice in Non-Autoregressive Translation ...
Ding, Liang; Wang, Longyue; Liu, Xuebo. - : arXiv, 2020
10
Information Aggregation for Multi-Head Attention with Routing-by-Agreement ...
Li, Jian; Yang, Baosong; Dou, Zi-Yi. - : arXiv, 2019
11
Neuron Interaction Based Representation Composition for Neural Machine Translation ...
Li, Jian; Wang, Xing; Yang, Baosong. - : arXiv, 2019
12
Multi-Granularity Self-Attention for Neural Machine Translation ...
Hao, Jie; Wang, Xing; Shi, Shuming. - : arXiv, 2019
13
Towards Understanding Neural Machine Translation with Word Importance ...
He, Shilin; Tu, Zhaopeng; Wang, Xing. - : arXiv, 2019
14
Towards Better Modeling Hierarchical Structure for Self-Attention with Ordered Neurons ...
Hao, Jie; Wang, Xing; Shi, Shuming. - : arXiv, 2019
15
Translating pro-drop languages with reconstruction models
Wang, Longyue; Tu, Zhaopeng; Shi, Shuming; Zhang, Tong; Graham, Yvette; Liu, Qun (2018). Translating pro-drop languages with reconstruction models. In: Thirty-Second AAAI Conference on Artificial Intelligence (AAAI-18), 2–7 Feb 2018, New Orleans, LA, USA. ISBN 978-1-57735-800-8.
16
Translating pro-drop languages with reconstruction models
Wang, Longyue; Tu, Zhaopeng; Shi, Shuming; Zhang, Tong; Graham, Yvette; Liu, Qun (2018). Translating pro-drop languages with reconstruction models. In: 32nd AAAI Conference on Artificial Intelligence (AAAI 2018), 2–7 Feb 2018, New Orleans, LA, USA. ISBN 978-1-57735-800-8.
17
Translating Pro-Drop Languages with Reconstruction Models ...
18
Exploiting Deep Representations for Neural Machine Translation ...
Dou, Zi-Yi; Tu, Zhaopeng; Wang, Xing. - : arXiv, 2018
19
A novel and robust approach for pro-drop language translation [Journal]
Wang, Longyue [Author]; Tu, Zhaopeng [Contributor]; Zhang, Xiaojun [Contributor].
20
Exploiting cross-sentence context for neural machine translation
Wang, Longyue; Tu, Zhaopeng; Way, Andy; Liu, Qun (2017). Exploiting cross-sentence context for neural machine translation. In: 2017 Conference on Empirical Methods in Natural Language Processing, 7–8 Sept 2017, Copenhagen, Denmark.
