
Search in the Catalogues and Directories

Hits 1 – 6 of 6

1
Investigating the Helpfulness of Word-Level Quality Estimation for Post-Editing Machine Translation Output ...
BASE
2
Multi-Head Highly Parallelized LSTM Decoder for Neural Machine Translation ...
Abstract: One of the reasons Transformer translation models are popular is that self-attention networks for context modelling can be easily parallelized at sequence level. However, the computational complexity of a self-attention network is O(n^2), growing quadratically with sequence length, whereas the complexity of LSTM-based approaches is only O(n). In practice, however, LSTMs are much slower to train than self-attention networks because they cannot be parallelized at sequence level: to model context, the current LSTM state relies on the full LSTM computation of the preceding state, which has to be computed n times for a sequence of length n. The linear transformations involved in the LSTM gate and state computations are the major cost factor. To enable sequence-level parallelization of LSTMs, we approximate full LSTM context modelling by computing hidden states and gates with the current input and a simple bag-of-words representation ... (Read the paper: https://www.aclanthology.org/2021.acl-long.23)
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://dx.doi.org/10.48448/fcc7-e373
https://underline.io/lecture/25374-multi-head-highly-parallelized-lstm-decoder-for-neural-machine-translation
BASE
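The core idea sketched in the abstract above is that the expensive sequential LSTM recurrence can be replaced by computing each step's gates and hidden state from the current input plus a bag-of-words summary of the preceding inputs, which can be computed for all timesteps at once. The following is a minimal numpy sketch of that idea, not the paper's actual MHPLSTM architecture; the function name, weight shapes (`Wx`, `Wb`, `b`), and the choice of a mean over preceding tokens as the bag-of-words summary are all illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def parallel_lstm_step(X, Wx, Wb, b):
    """Hypothetical sketch: instead of the sequential recurrence
    h_t = LSTM(x_t, h_{t-1}), approximate the context at step t with a
    bag-of-words summary of inputs x_1..x_{t-1}, so every timestep can
    be computed in parallel over the sequence dimension.

    X: (n, d) input sequence; Wx, Wb: (d, d) weights; b: (d,) bias.
    Returns h: (n, d) hidden states for all timesteps.
    """
    n, _ = X.shape
    # Bag-of-words context: mean of all strictly preceding inputs
    # (zeros at t = 0); a cumulative sum makes this fully parallel.
    prefix = np.cumsum(X, axis=0) - X              # sum of x_1..x_{t-1}
    counts = np.maximum(np.arange(n), 1).reshape(-1, 1)
    bow = prefix / counts                          # (n, d) parallel context
    # Gate and candidate state from current input + BoW context,
    # computed for all timesteps in one batched matrix product:
    pre = X @ Wx + bow @ Wb + b
    h = sigmoid(pre) * np.tanh(pre)
    return h
```

Because `cumsum` and the matrix products operate over the whole sequence at once, there is no step-by-step dependency: this is the sequence-level parallelism the abstract contrasts with the O(n) sequential chain of a standard LSTM.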
3
Comparing Feature-Engineering and Feature-Learning Approaches for Multilingual Translationese Classification ...
BASE
4
Modeling Task-Aware MIMO Cardinality for Efficient Multilingual Neural Machine Translation ...
BASE
5
A Bidirectional Transformer Based Alignment Model for Unsupervised Word Alignment ...
BASE
6
A Computational Model of the Referential Semantics of Projective Prepositions
In: Conference papers (2006)
BASE

Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 6
© 2013 – 2024 Lin|gu|is|tik