
Search in the Catalogues and Directories

Hits 1 – 6 of 6

1
ANEA: Distant Supervision for Low-Resource Named Entity Recognition ...
BASE
2
On the Correlation of Context-Aware Language Models With the Intelligibility of Polish Target Words to Czech Readers
In: Front Psychol (2021)
BASE
3
Transfer learning and distant supervision for multilingual Transformer models: A study on African languages
In: 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), Nov 2020, Punta Cana, Dominican Republic; https://hal.inria.fr/hal-03350901 (2020)
BASE
4
Distant supervision and noisy label learning for low resource named entity recognition: A study on Hausa and Yorùbá
In: ICLR Workshops (AfricaNLP & PML4DC 2020), Apr 2020, Addis Ababa, Ethiopia; https://hal.archives-ouvertes.fr/hal-03359111 (2020)
BASE
5
Transfer Learning and Distant Supervision for Multilingual Transformer Models: A Study on African Languages ...
Abstract: Multilingual transformer models like mBERT and XLM-RoBERTa have achieved large improvements on many NLP tasks across a variety of languages. However, recent work has also shown that results from high-resource languages do not transfer easily to realistic, low-resource scenarios. In this work, we study how performance varies with the amount of available resources for three African languages, Hausa, isiXhosa and Yorùbá, on both NER and topic classification. We show that, in combination with transfer learning or distant supervision, these models can match baselines trained on much more supervised data with as few as 10 or 100 labeled sentences. However, we also find settings where this does not hold. Our discussion and additional experiments on assumptions such as time and hardware restrictions highlight challenges and opportunities in low-resource learning. (Accepted at EMNLP'20.)
Keywords: Computation and Language (cs.CL); Machine Learning (cs.LG); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2010.03179
https://dx.doi.org/10.48550/arxiv.2010.03179
BASE
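The abstract above relies on distant supervision to obtain labels when little annotated data exists. As a minimal sketch, and not the implementation used in the papers listed here, distant supervision for NER can be approximated by matching token spans against entity gazetteers and emitting noisy BIO tags; the gazetteer contents, tag set, and function names below are illustrative assumptions.

```python
# Minimal sketch of gazetteer-based distant supervision for NER.
# The gazetteers and tag set are illustrative assumptions, not the
# resources used in the papers listed above.

from typing import Dict, List

GAZETTEERS: Dict[str, set] = {
    "LOC": {"addis ababa", "punta cana", "lagos"},
    "ORG": {"emnlp", "iclr"},
}


def distant_labels(tokens: List[str], max_span: int = 3) -> List[str]:
    """Assign noisy BIO tags by longest-match lookup against the gazetteers."""
    tags = ["O"] * len(tokens)
    i = 0
    while i < len(tokens):
        matched = False
        # Try the longest candidate span first so multi-word entities win.
        for length in range(min(max_span, len(tokens) - i), 0, -1):
            span = " ".join(tokens[i:i + length]).lower()
            for label, names in GAZETTEERS.items():
                if span in names:
                    tags[i] = f"B-{label}"
                    for j in range(i + 1, i + length):
                        tags[j] = f"I-{label}"
                    i += length
                    matched = True
                    break
            if matched:
                break
        if not matched:
            i += 1
    return tags


if __name__ == "__main__":
    sentence = "The workshop met in Addis Ababa".split()
    print(list(zip(sentence, distant_labels(sentence))))
    # [('The', 'O'), ('workshop', 'O'), ('met', 'O'), ('in', 'O'),
    #  ('Addis', 'B-LOC'), ('Ababa', 'I-LOC')]
```

The noisy sentence/tag pairs produced this way would then serve as extra training data for a multilingual transformer, optionally combined with a noise-handling method as studied in hit 4.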
6
On the Interplay Between Fine-tuning and Sentence-level Probing for Linguistic Knowledge in Pre-trained Transformers ...
BASE

Catalogues: 0 | Bibliographies: 0 | Linked Open Data catalogues: 0 | Online resources: 0 | Open access documents: 6