
Search in the Catalogues and Directories

Hits 1 – 14 of 14

1. Challenges and Strategies in Cross-Cultural NLP (BASE)
2. IGLUE: A Benchmark for Transfer Learning across Modalities, Tasks, and Languages (BASE)
3. Vision-and-Language or Vision-for-Language? On Cross-Modal Influence in Multimodal Transformers (BASE)
4. Multimodal Pretraining Unmasked: A Meta-Analysis and a Unified Framework of Vision-and-Language BERTs (BASE)
5. Multimodal pretraining unmasked: A meta-analysis and a unified framework of vision-and-language berts (BASE)
6. Visually Grounded Reasoning across Languages and Cultures (BASE)
7. On Language Models for Creoles (BASE)
8. Visually Grounded Reasoning across Languages and Cultures (BASE)
9. Multimodal pretraining unmasked: A meta-analysis and a unified framework of vision-and-language berts (BASE)
   In: Transactions of the Association for Computational Linguistics, 9 (2021)
10. The Role of Syntactic Planning in Compositional Image Captioning (BASE)
11. On Language Models for Creoles (BASE)
12. Visually Grounded Reasoning across Languages and Cultures (BASE)
13. It’s Easier to Translate out of English than into it: Measuring Neural Translation Difficulty by Cross-Mutual Information (BASE)
14. It’s Easier to Translate out of English than into it: Measuring Neural Translation Difficulty by Cross-Mutual Information (BASE)
    In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (2020)
    Abstract: The performance of neural machine translation systems is commonly evaluated in terms of BLEU. However, due to its reliance on target language properties and generation, the BLEU metric does not allow an assessment of which translation directions are more difficult to model. In this paper, we propose cross-mutual information (XMI): an asymmetric information-theoretic metric of machine translation difficulty that exploits the probabilistic nature of most neural machine translation models. XMI allows us to better evaluate the difficulty of translating text into the target language while controlling for the difficulty of the target-side generation component independent of the translation task. We then present the first systematic and controlled study of cross-lingual translation difficulties using modern neural translation systems. Code for replicating our experiments is available online at https://github.com/e-bug/nmt-difficulty.
    URL: https://hdl.handle.net/20.500.11850/462891
    DOI: https://doi.org/10.3929/ethz-b-000462309
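The abstract above describes XMI only at a high level. The following is a minimal illustrative sketch, assuming XMI is computed as the target text's cross-entropy under an unconditional target-side language model minus its cross-entropy under the translation model; the function names and toy log-probabilities are hypothetical and are not taken from the paper or from its code release at https://github.com/e-bug/nmt-difficulty.

def cross_entropy(token_log_probs):
    # Average negative log-probability per target token (nats/token).
    total_tokens = sum(len(sentence) for sentence in token_log_probs)
    total_log_prob = sum(sum(sentence) for sentence in token_log_probs)
    return -total_log_prob / total_tokens

def xmi(lm_log_probs, mt_log_probs):
    # XMI-style score: how many nats/token the source text saves the
    # translation model relative to an unconditional target language model.
    # Larger values suggest the translation direction is easier to model.
    return cross_entropy(lm_log_probs) - cross_entropy(mt_log_probs)

# Hypothetical per-token log-probabilities for two target sentences.
lm_scores = [[-2.1, -3.0, -1.2], [-2.5, -0.9]]   # target LM: log p(t_i | t_<i)
mt_scores = [[-1.0, -1.4, -0.6], [-1.1, -0.5]]   # NMT model: log p(t_i | s, t_<i)

print(f"XMI ≈ {xmi(lm_scores, mt_scores):.3f} nats per token")

Because the score is a difference of cross-entropies on the same target text, the difficulty of generating the target language itself cancels out, which is what lets XMI compare translation directions rather than just target languages.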

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 14