
Search in the Catalogues and Directories

Hits 1 – 11 of 11

1. Differentiable Generative Phonology ... (BASE)
2. Wu, Shijie; Cotterell, Ryan; Hulden, Mans: Applying the Transformer to Character-level Transduction ... ETH Zurich, 2021. (BASE)
3. Everything Is All It Takes: A Multipronged Strategy for Zero-Shot Cross-Lingual Information Extraction ... (BASE)
4. Applying the Transformer to Character-level Transduction. In: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume (2021). (BASE)
5. Wu, Shijie; Dredze, Mark: Do Explicit Alignments Robustly Improve Multilingual Encoders? ... arXiv, 2020. (BASE)
6. SIGMORPHON 2020 Shared Task 0: Typologically Diverse Morphological Inflection ... (BASE)
7. Wu, Shijie; Dredze, Mark: Are All Languages Created Equal in Multilingual BERT? ... arXiv, 2020.
Abstract: Multilingual BERT (mBERT), trained on 104 languages, has shown surprisingly good cross-lingual performance on several NLP tasks, even without explicit cross-lingual signals. However, these evaluations have focused on cross-lingual transfer with high-resource languages, covering only a third of the languages covered by mBERT. We explore how mBERT performs on a much wider set of languages, focusing on the quality of representation for low-resource languages, measured by within-language performance. We consider three tasks: Named Entity Recognition (99 languages), Part-of-Speech Tagging, and Dependency Parsing (54 languages each). mBERT performs better than or comparably to baselines on high-resource languages but does much worse for low-resource languages. Furthermore, monolingual BERT models for these languages do even worse. Paired with similar languages, the performance gap between monolingual BERT and mBERT can be narrowed. We find that better models for low-resource languages require more efficient pretraining ...
Comment: RepL4NLP Workshop 2020 (Best Long Paper)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2005.09093
DOI: https://dx.doi.org/10.48550/arxiv.2005.09093
(BASE)
8. Erdmann, Alexander; Elsner, Micha; Wu, Shijie: The Paradigm Discovery Problem ... ETH Zurich, 2020. (BASE)
9. The Paradigm Discovery Problem. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics (2020). (BASE)
10. Emerging Cross-lingual Structure in Pretrained Language Models ... (BASE)
11. The SIGMORPHON 2019 Shared Task: Morphological Analysis in Context and Cross-Lingual Transfer for Inflection ... (BASE)

Sources: Catalogues 0 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 11
© 2013 – 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings