
Search in the Catalogues and Directories

Page: 1 2
Hits 1 – 20 of 37

1. Universal Dependencies 2.9
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021
BASE

2. Universal Dependencies 2.8.1
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021
BASE

3. Universal Dependencies 2.8
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021
BASE

4. Semantic Search as Extractive Paraphrase Span Detection ...
BASE

5. Deep learning for sentence clustering in essay grading support ...
BASE

6. Morpho-syntactically annotated corpora provided for the PARSEME Shared Task on Semi-Supervised Identification of Verbal Multiword Expressions (edition 1.2)
BASE

7. Universal Dependencies 2.7
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2020
BASE

8. Universal Dependencies 2.6
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2020
BASE

9. Towards Fully Bilingual Deep Language Modeling ...
BASE

10. WikiBERT models: deep transfer learning for many languages ...
BASE

11. Universal Dependencies v2: An Evergrowing Multilingual Treebank Collection ...
BASE

12. Dependency parsing of biomedical text with BERT
In: BMC Bioinformatics (2020)
BASE

13. Universal Dependencies 2.5
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2019
BASE

14. Universal Dependencies 2.4
Nivre, Joakim; Abrams, Mitchell; Agić, Željko. Universal Dependencies Consortium, 2019
BASE

15. Is Multilingual BERT Fluent in Language Generation? ...
BASE
16. Multilingual is not enough: BERT for Finnish ...
Abstract: Deep learning-based language models pretrained on large unannotated text corpora have been demonstrated to allow efficient transfer learning for natural language processing, with recent approaches such as the transformer-based BERT model advancing the state of the art across a variety of tasks. While most work on these models has focused on high-resource languages, in particular English, a number of recent efforts have introduced multilingual models that can be fine-tuned to address tasks in a large number of different languages. However, we still lack a thorough understanding of the capabilities of these models, in particular for lower-resourced languages. In this paper, we focus on Finnish and thoroughly evaluate the multilingual BERT model on a range of tasks, comparing it with a new Finnish BERT model trained from scratch. The new language-specific model is shown to systematically and clearly outperform the multilingual. While the multilingual model largely fails to reach the performance of previously ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/1912.07076
https://dx.doi.org/10.48550/arxiv.1912.07076
BASE
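The abstract above describes comparing a language-specific Finnish BERT against the multilingual BERT model. As a rough illustration of that comparison (not code from the catalogued paper), the following Python sketch probes both models with a Finnish fill-mask query via the Hugging Face transformers library; the model IDs TurkuNLP/bert-base-finnish-cased-v1 (the FinBERT checkpoint associated with this work) and bert-base-multilingual-cased (mBERT) are assumptions about the published checkpoints.

    # Hedged sketch: probe a language-specific and a multilingual BERT
    # with the same Finnish fill-mask query. Model IDs are assumed public
    # Hugging Face checkpoints, not taken from the catalogued record.
    from transformers import pipeline

    sentence = "Helsinki on Suomen [MASK]."  # "Helsinki is Finland's [MASK]."

    for model_id in ("TurkuNLP/bert-base-finnish-cased-v1",  # assumed FinBERT ID
                     "bert-base-multilingual-cased"):         # Google's mBERT
        fill = pipeline("fill-mask", model=model_id)
        print(model_id)
        for pred in fill(sentence, top_k=3):
            # Each prediction carries the filled token and its probability.
            print(f"  {pred['token_str']!r}  score={pred['score']:.3f}")

A probe like this only illustrates the interface; the paper's actual finding, that the language-specific model systematically outperforms the multilingual one, rests on its benchmark evaluation across Finnish tasks.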
17. Universal Dependencies 2.2
In: https://hal.archives-ouvertes.fr/hal-01930733 (2018)
BASE

18. Universal Dependencies 2.3
Nivre, Joakim; Abrams, Mitchell; Agić, Željko. Universal Dependencies Consortium, 2018
BASE

19. Universal Dependencies 2.2
Nivre, Joakim; Abrams, Mitchell; Agić, Željko. Universal Dependencies Consortium, 2018
BASE

20. Artificial Treebank with Ellipsis
Droganova, Kira; Zeman, Daniel; Kanerva, Jenna. Charles University, Faculty of Mathematics and Physics, Institute of Formal and Applied Linguistics (ÚFAL), 2018
BASE


Hits by source type:
Catalogues: 2
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 1
Open access documents: 34