
Search in the Catalogues and Directories

Page: 1 2
Hits 1 – 20 of 27

1. The FLORES-101 Evaluation Benchmark for Low-Resource and Multilingual Machine Translation
2. LAWDR: Language-Agnostic Weighted Document Representations from Pre-trained Models
3. Classification-based Quality Estimation: Small and Efficient Models for Real-world Applications
4. Improving Zero-Shot Translation by Disentangling Positional Information
   Liu, Danni; Niehues, Jan; Cross, James. Association for Computational Linguistics, 2021
5. As Easy as 1, 2, 3: Behavioural Testing of NMT Systems for Numerical Translation
6. Putting Words into the System's Mouth: A Targeted Attack on Neural Machine Translation Using Monolingual Data Poisoning
7. Detecting Hallucinated Content in Conditional Neural Sequence Generation
8. Alternative Input Signals Ease Transfer in Multilingual Machine Translation
   Sun, Simeng; Fan, Angela; Cross, James. arXiv, 2021
9. Improving Zero-Shot Translation by Disentangling Positional Information
   Read paper: https://www.aclanthology.org/2021.acl-long.101
   Abstract: Multilingual neural machine translation has shown the capability of directly translating between language pairs unseen in training, i.e. zero-shot translation. Despite being conceptually attractive, it often suffers from low output quality. The difficulty of generalizing to new translation directions suggests the model representations are highly specific to those language pairs seen in training. We demonstrate that a main factor causing the language-specific representations is the positional correspondence to input tokens. We show that this can be easily alleviated by removing residual connections in an encoder layer. With this modification, we gain up to 18.5 BLEU points on zero-shot translation while retaining quality on supervised directions. The improvements are particularly prominent between related languages, where our proposed model outperforms pivot-based translation. Moreover, our approach allows easy integration of new languages, ...
   Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
   URL: https://dx.doi.org/10.48448/3d82-wz22
   https://underline.io/lecture/25990-improving-zero-shot-translation-by-disentangling-positional-information
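The abstract's core claim — that the residual connection in an encoder layer is what keeps each token's representation tied to its input position — can be illustrated with a toy numpy sketch. This is not the paper's actual Transformer; uniform attention stands in for learned attention, and `encoder_layer` and its weights are hypothetical names for illustration only:

```python
import numpy as np

def encoder_layer(x, w, use_residual=True):
    """Toy encoder layer: uniform self-attention plus an optional residual."""
    # Uniform attention averages over all positions, so its output alone
    # carries no position-specific correspondence to the input tokens.
    attn_out = x.mean(axis=0, keepdims=True).repeat(x.shape[0], axis=0) @ w
    # The residual re-injects each token's own (position-specific) vector;
    # dropping it is the modification the abstract describes.
    return attn_out + x if use_residual else attn_out

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))   # 4 tokens, 8-dim embeddings
w = rng.normal(size=(8, 8))   # toy projection weights

with_res = encoder_layer(x, w, use_residual=True)
without_res = encoder_layer(x, w, use_residual=False)

# Without the residual, every position collapses to the same vector here,
# i.e. no positional identity survives the layer; with it, positions differ.
print(np.allclose(without_res[0], without_res[1]))  # True
print(np.allclose(with_res[0], with_res[1]))        # False
```

In the real model the attention is learned rather than uniform, so positions do not literally collapse; the sketch only shows the mechanism by which the residual path, not the attention output, preserves positional correspondence.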
10. XLEnt: Mining a Large Cross-lingual Entity Dataset with Lexical-Semantic-Phonetic Word Alignment
11. Adapting High-resource NMT Models to Translate Low-resource Related Languages without Parallel Data
12. Improving Zero-Shot Translation by Disentangling Positional Information
    Liu, Danni; Li, Xian; Niehues, Jan. Association for Computational Linguistics, 2021
13. Massively Multilingual Document Alignment with Cross-lingual Sentence-Mover's Distance
14. MLQE-PE: A Multilingual Quality Estimation and Post-Editing Dataset
15. Improving Zero-Shot Translation by Disentangling Positional Information
    Liu, Danni; Niehues, Jan; Cross, James. arXiv, 2020
16. Unsupervised quality estimation for neural machine translation
    In: 8, pp. 539–555 (2020)
17. An exploratory study on multilingual quality estimation
    In: pp. 366–377 (2020)
18. BERGAMOT-LATTE submissions for the WMT20 quality estimation shared task
    In: pp. 1010–1017 (2020)
19. Findings of the WMT 2020 shared task on quality estimation
    In: pp. 743–764 (2020)
20. MLQE-PE: A multilingual quality estimation and post-editing dataset


Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 27
© 2013 – 2024 Lin|gu|is|tik