
Search in the Catalogues and Directories

Hits 1 – 20 of 88

1. NMTScore: A Multilingual Analysis of Translation-based Text Similarity Measures ...
Vamvas, Jannis; Sennrich, Rico. - : arXiv, 2022
2. Improving Zero-shot Cross-lingual Transfer between Closely Related Languages by injecting Character-level Noise ...
Aepli, Noëmi; Sennrich, Rico. - : arXiv, 2021
3. On Biasing Transformer Attention Towards Monotonicity ...
4. Wino-X: Multilingual Winograd Schemas for Commonsense Reasoning and Coreference Resolution ...
Emelin, Denis; Sennrich, Rico. - : Association for Computational Linguistics, 2021
5. ELITR Multilingual Live Subtitling: Demo and Strategy ...
6. Share or Not? Learning to Schedule Language-Specific Capacity for Multilingual Translation ...
7. Edinburgh’s End-to-End Multilingual Speech Translation System for IWSLT 2021 ...
Zhang, Biao; Sennrich, Rico. - : ACL Anthology, 2021
8. On Biasing Transformer Attention Towards Monotonicity ...
Rios, Annette; Amrhein, Chantal; Aepli, Noëmi. - : Association for Computational Linguistics, 2021
9. Revisiting Negation in Neural Machine Translation ...
10. Understanding the Properties of Minimum Bayes Risk Decoding in Neural Machine Translation ...
11. Analyzing the Source and Target Contributions to Predictions in Neural Machine Translation ...
Read paper: https://www.aclanthology.org/2021.acl-long.91
Abstract: In Neural Machine Translation (and, more generally, conditional language modeling), the generation of a target token is influenced by two types of context: the source and the prefix of the target sequence. While many attempts to understand the internal workings of NMT models have been made, none of them explicitly evaluates relative source and target contributions to a generation decision. We argue that this relative contribution can be evaluated by adopting a variant of Layerwise Relevance Propagation (LRP). Its underlying 'conservation principle' makes relevance propagation unique: differently from other methods, it evaluates not an abstract quantity reflecting token importance, but the proportion of each token's influence. We extend LRP to the Transformer and conduct an analysis of NMT models which explicitly evaluates the source and target relative contributions to the generation process. We analyze changes in these contributions when ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://underline.io/lecture/25975-analyzing-the-source-and-target-contributions-to-predictions-in-neural-machine-translation
https://dx.doi.org/10.48448/29jz-9466
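The 'conservation principle' mentioned in this abstract has a concrete reading: total relevance is preserved during propagation, so at each generation step the scores assigned to the source tokens and to the target prefix can be renormalized into proportions that sum to one. A minimal Python sketch of that final normalization step, with illustrative placeholder values (this is not the authors' LRP implementation):

def relative_contributions(source_relevance, target_relevance):
    """Fraction of total relevance carried by the source vs. the target prefix."""
    total = sum(source_relevance) + sum(target_relevance)
    src_share = sum(source_relevance) / total
    return src_share, 1.0 - src_share

# Hypothetical per-token relevance scores for one target-token decision:
src = [0.30, 0.25, 0.15]  # one score per source token
tgt = [0.20, 0.10]        # one score per target-prefix token

src_share, tgt_share = relative_contributions(src, tgt)
print(f"source: {src_share:.2f}, target prefix: {tgt_share:.2f}")
# -> source: 0.70, target prefix: 0.30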
12. Vision Matters When It Should: Sanity Checking Multimodal Machine Translation Models ...
13. Wino-X: Multilingual Winograd Schemas for Commonsense Reasoning and Coreference Resolution ...
14. Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT ...
15. Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT ...
Voita, Elena; Sennrich, Rico; Titov, Ivan. - : ACL Anthology, 2021
16. Contrastive Conditioning for Assessing Disambiguation in MT: A Case Study of Distilled Bias ...
17. Language Modeling, Lexical Translation, Reordering: The Training Process of NMT through the Lens of Classical SMT ...
18. On Biasing Transformer Attention Towards Monotonicity ...
NAACL 2021; Aepli, Noëmi; Amrhein, Chantal. - : Underline Science Inc., 2021
19. Universal rewriting via machine translation
Mallinson, Jonathan. - : The University of Edinburgh, 2021
20. Share or Not? Learning to Schedule Language-Specific Capacity for Multilingual Translation
Zhang, Biao; Bapna, Ankur; Sennrich, Rico; Firat, Orhan. - : International Conference on Learning Representations (ICLR), Virtual, 3–7 May 2021


Hits by source type:
Catalogues: 2
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 86