
Search in the Catalogues and Directories

Hits 1 – 8 of 8

1. Multilingual Language Model Adaptive Fine-Tuning: A Study on African Languages ... (BASE)
2. MasakhaNER: Named entity recognition for African languages
   In: Transactions of the Association for Computational Linguistics, The MIT Press, 2021 (EISSN 2307-387X), ⟨10.1162/tacl⟩. https://hal.inria.fr/hal-03350962 (BASE)
3. The effect of domain and diacritics in Yorùbá-English neural machine translation
   In: 18th Biennial Machine Translation Summit, Aug 2021, Orlando, United States. https://hal.inria.fr/hal-03350967
   Abstract: Massively multilingual machine translation (MT) has shown impressive capabilities, including zero- and few-shot translation between low-resource language pairs. However, these models are often evaluated on high-resource languages, with the assumption that they generalize to low-resource ones. The difficulty of evaluating MT models on low-resource pairs is often due to the lack of standardized evaluation datasets. In this paper, we present MENYO-20k, the first multi-domain parallel corpus with a special focus on clean orthography for Yorùbá-English, with standardized train-test splits for benchmarking. We provide several neural MT benchmarks and compare them to the performance of popular pre-trained (massively multilingual) MT models, both on the heterogeneous test set and on its subdomains. Since these pre-trained models use huge amounts of data of uncertain quality, we also analyze the effect of diacritics, a major characteristic of Yorùbá, in the training data. We investigate how and when this training condition affects the final quality and intelligibility of a translation. Our models outperform massively multilingual models such as Google (+8.7 BLEU) and Facebook M2M (+9.1 BLEU) when translating to Yorùbá, setting a high-quality benchmark for future research.
   Keyword: [INFO.INFO-CL] Computer Science [cs] / Computation and Language [cs.CL]
   URL: https://hal.inria.fr/hal-03350967
        https://hal.inria.fr/hal-03350967/document
        https://hal.inria.fr/hal-03350967/file/adelani_MTSummit2021.pdf
   (BASE)
4. The Effect of Domain and Diacritics in Yorùbá-English Neural Machine Translation ... (BASE)
5. EdinSaar@WMT21: North-Germanic Low-Resource Multilingual NMT ... (BASE)
6. Transfer learning and distant supervision for multilingual Transformer models: A study on African languages
   In: 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), Nov 2020, Punta Cana, Dominican Republic. https://hal.inria.fr/hal-03350901 (BASE)
7. Transfer Learning and Distant Supervision for Multilingual Transformer Models: A Study on African Languages ... (BASE)
8. Massive vs. Curated Word Embeddings for Low-Resourced Languages. The Case of Yorùbá and Twi ... (BASE)

Results by source:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 8