
Search in the Catalogues and Directories

Hits 1 – 8 of 8

1
Multilingual Language Model Adaptive Fine-Tuning: A Study on African Languages ...
BASE
2
MasakhaNER: Named entity recognition for African languages
In: Transactions of the Association for Computational Linguistics, The MIT Press, 2021 ; EISSN: 2307-387X ; https://hal.inria.fr/hal-03350962 ; ⟨10.1162/tacl⟩ (2021)
BASE
3
The effect of domain and diacritics in Yorùbá-English neural machine translation
In: 18th Biennial Machine Translation Summit, Aug 2021, Orlando, United States ; https://hal.inria.fr/hal-03350967 (2021)
BASE
4
The Effect of Domain and Diacritics in Yorùbá-English Neural Machine Translation ...
BASE
5
EdinSaar@WMT21: North-Germanic Low-Resource Multilingual NMT ...
BASE
6
Transfer learning and distant supervision for multilingual Transformer models: A study on African languages
In: 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), Nov 2020, Punta Cana, Dominican Republic ; https://hal.inria.fr/hal-03350901 (2020)
Abstract: Multilingual transformer models like mBERT and XLM-RoBERTa have achieved substantial improvements on many NLP tasks across a variety of languages. However, recent work has also shown that results from high-resource languages do not transfer easily to realistic, low-resource scenarios. In this work, we study performance trends under varying amounts of available resources for three African languages, Hausa, isiXhosa, and Yorùbá, on both NER and topic classification. We show that, in combination with transfer learning or distant supervision, these models can match the performance of baselines trained on much more supervised data with as few as 10 or 100 labeled sentences. However, we also find settings where this does not hold. Our discussions and additional experiments on assumptions such as time and hardware restrictions highlight challenges and opportunities in low-resource learning.
Keyword: [INFO.INFO-CL] Computer Science [cs] / Computation and Language [cs.CL]
URL: https://hal.inria.fr/hal-03350901
https://hal.inria.fr/hal-03350901/file/hedderich_EMNLP2020.pdf
https://hal.inria.fr/hal-03350901/document
BASE
7
Transfer Learning and Distant Supervision for Multilingual Transformer Models: A Study on African Languages ...
BASE
8
Massive vs. Curated Word Embeddings for Low-Resourced Languages. The Case of Yorùbá and Twi ...
BASE

Catalogues: 0 · Bibliographies: 0 · Linked Open Data catalogues: 0 · Online resources: 0 · Open access documents: 8
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings