8. WikiBERT models: deep transfer learning for many languages ...

9. Dependency parsing of biomedical text with BERT
   In: BMC Bioinformatics (2020)

12. Is Multilingual BERT Fluent in Language Generation? ...
   Abstract: The multilingual BERT model is trained on 104 languages and meant to serve as a universal language model and tool for encoding sentences. We explore how well the model performs on several languages across several tasks: a diagnostic classification probing the embeddings for a particular syntactic property, a cloze task testing the language modelling ability to fill in gaps in a sentence, and a natural language generation task testing for the ability to produce coherent text fitting a given context. We find that the currently available multilingual BERT model is clearly inferior to the monolingual counterparts, and cannot in many cases serve as a substitute for a well-trained monolingual model. We find that the English and German models perform well at generation, whereas the multilingual model is lacking, in particular, for Nordic languages. ...
   Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; Machine Learning (cs.LG)
   URL: https://arxiv.org/abs/1910.03806 ; https://dx.doi.org/10.48550/arxiv.1910.03806
   (A minimal cloze-probe sketch with this model appears after the listing.)

14. Universal Dependencies 2.2
   In: https://hal.archives-ouvertes.fr/hal-01930733 (2018)

19. Universal Dependencies 2.1
   In: https://hal.inria.fr/hal-01682188 (2017)
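
The cloze task described in the abstract of entry 12 amounts to masking one token of a sentence and asking the model to fill the gap. The sketch below is only an illustration of such a probe, not the evaluation code from the paper; it assumes the Hugging Face transformers library and the publicly released bert-base-multilingual-cased checkpoint, and the two probe sentences (one English, one Finnish) are invented for the example.

# Illustrative cloze ("fill-mask") probe with multilingual BERT.
# Assumes: `pip install transformers torch`; the model checkpoint is
# downloaded from the Hugging Face hub on first use.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-multilingual-cased")

# Hypothetical probe sentences; BERT's mask token is [MASK].
sentences = [
    "Helsinki is the capital of [MASK].",
    "Helsinki on [MASK] pääkaupunki.",
]

for sentence in sentences:
    print(sentence)
    # The pipeline returns the top candidates for the masked position,
    # each with a probability score.
    for candidate in fill_mask(sentence, top_k=3):
        print(f"  {candidate['token_str']!r}  (score={candidate['score']:.3f})")

Swapping the model name for a monolingual checkpoint (for example, a dedicated Finnish BERT) would give the kind of monolingual-versus-multilingual comparison the abstract reports, though the paper's own setup may differ.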