1 |
Le modèle Transformer : un « couteau suisse » pour le traitement automatique des langues [The Transformer model: a "Swiss army knife" for natural language processing]
In: Techniques de l'Ingénieur, 2022, ⟨10.51257/a-v1-in195⟩. https://hal.archives-ouvertes.fr/hal-03619077 ; https://www.techniques-ingenieur.fr/base-documentaire/innovation-th10/innovations-en-electronique-et-tic-42257210/transformer-des-reseaux-de-neurones-pour-le-traitement-automatique-des-langues-in195/
2 |
Investigating alignment interpretability for low-resource NMT
In: Machine Translation (ISSN 0922-6567, EISSN 1573-0573), Springer Verlag, 2021, ⟨10.1007/s10590-020-09254-w⟩. https://hal.archives-ouvertes.fr/hal-03139744
3 |
Impact of Encoding and Segmentation Strategies on End-to-End Simultaneous Speech Translation
In: INTERSPEECH 2021, Aug 2021, Brno, Czech Republic. https://hal.archives-ouvertes.fr/hal-03372487
4 |
cushLEPOR uses LABSE distilled knowledge to improve correlation with human translation evaluations
In: Erofeev, Gleb; Sorokina, Irina; Han, Lifeng (ORCID 0000-0002-3221-2185); Gladkoff, Serge (2021). Machine Translation Summit 2021, 16-20 Aug 2021, USA (online). (In Press)
5 |
Meta-evaluation of machine translation evaluation methods
|
In: Han, Lifeng (ORCID 0000-0002-3221-2185) (2021). Workshop on Informetric and Scientometric Research (SIG-MET), 23-24 Oct 2021, Online.
7 |
BERT, mBERT, or BiBERT? A Study on Contextualized Embeddings for Neural Machine Translation ...
Anthology paper link: https://aclanthology.org/2021.emnlp-main.534/
Abstract: The success of bidirectional encoders using masked language models, such as BERT, on numerous natural language processing tasks has prompted researchers to attempt to incorporate these pre-trained models into neural machine translation (NMT) systems. However, the proposed methods for incorporating pre-trained models are non-trivial and mainly focus on BERT, lacking a comparison of the impact that other pre-trained models may have on translation performance. In this paper, we demonstrate that simply using the output (contextualized embeddings) of a tailored and suitable bilingual pre-trained language model (dubbed BiBERT) as the input of the NMT encoder achieves state-of-the-art translation performance. Moreover, we also propose a stochastic layer selection approach and a concept of dual-directional translation model to ensure the sufficient utilization of contextualized embeddings. Without using back translation, our ...
Keywords: Computational Linguistics; Language Models; Machine Learning; Machine Learning and Data Mining; Machine Translation; Natural Language Processing
URL: https://dx.doi.org/10.48448/j1py-kc88 ; https://underline.io/lecture/37793-bert,-mbert,-or-bibertquestion-a-study-on-contextualized-embeddings-for-neural-machine-translation
8 |
Sinhala-English Code-mixed and Code-switched Data Classification ...
10 |
Exploring Pre-Trained Transformers and Bilingual Transfer Learning for Arabic Coreference Resolution ...
11 |
Comparing Feature-Engineering and Feature-Learning Approaches for Multilingual Translationese Classification ...
12 |
UNKs Everywhere: Adapting Multilingual Language Models to New Scripts ...
14 |
Role of Language Relatedness in Multilingual Fine-tuning of Language Models: A Case Study in Indo-Aryan Languages ...
16 |
Wino-X: Multilingual Winograd Schemas for Commonsense Reasoning and Coreference Resolution ...
17 |
ERNIE-M: Enhanced Multilingual Representation by Aligning Cross-lingual Semantics with Monolingual Corpora ...
18 |
A Massively Multilingual Analysis of Cross-linguality in Shared Embedding Space ...
19 |
Improved Multilingual Language Model Pretraining for Social Media Text via Translation Pair Prediction ...
20 |
Don't Go Far Off: An Empirical Study on Neural Poetry Translation ...