
Search in the Catalogues and Directories

Hits 1 – 20 of 52 (page 1 of 3)

1. Universal Dependencies 2.9 (BASE)
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021.
2. Universal Dependencies 2.8.1 (BASE)
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021.
3. Universal Dependencies 2.8 (BASE)
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021.
4. Beyond the English Web: Zero-Shot Cross-Lingual and Lightweight Monolingual Classification of Registers ... (BASE)
5. Deep learning for sentence clustering in essay grading support ... (BASE)
6. Universal Dependencies 2.7 (BASE)
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2020.
7. Universal Dependencies 2.6 (BASE)
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2020.
8. Towards Fully Bilingual Deep Language Modeling ... (BASE)
9. The birth of Romanian BERT ... (BASE)
10. WikiBERT models: deep transfer learning for many languages ... (BASE)
11. Exploring Cross-sentence Contexts for Named Entity Recognition with BERT ... (BASE)
12. Universal Dependencies v2: An Evergrowing Multilingual Treebank Collection ... (BASE)
13. Dependency parsing of biomedical text with BERT (BASE)
In: BMC Bioinformatics (2020).
14. Universal Dependencies 2.5 (BASE)
Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2019.
15. Universal Dependencies 2.4 (BASE)
Nivre, Joakim; Abrams, Mitchell; Agić, Željko. Universal Dependencies Consortium, 2019.
16. Multilingual is not enough: BERT for Finnish ... (BASE)
Abstract: Deep learning-based language models pretrained on large unannotated text corpora have been demonstrated to allow efficient transfer learning for natural language processing, with recent approaches such as the transformer-based BERT model advancing the state of the art across a variety of tasks. While most work on these models has focused on high-resource languages, in particular English, a number of recent efforts have introduced multilingual models that can be fine-tuned to address tasks in a large number of different languages. However, we still lack a thorough understanding of the capabilities of these models, in particular for lower-resourced languages. In this paper, we focus on Finnish and thoroughly evaluate the multilingual BERT model on a range of tasks, comparing it with a new Finnish BERT model trained from scratch. The new language-specific model is shown to systematically and clearly outperform the multilingual model. While the multilingual model largely fails to reach the performance of previously ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/1912.07076
DOI: https://dx.doi.org/10.48550/arxiv.1912.07076
17. A neural classification method for supporting the creation of BioVerbNet ... (BASE)
Chiu, Billy; Majewska, Olga; Pyysalo, Sampo. Figshare, 2019.
18. A neural classification method for supporting the creation of BioVerbNet ... (BASE)
Chiu, Billy; Majewska, Olga; Pyysalo, Sampo. Figshare, 2019.
19. A neural classification method for supporting the creation of BioVerbNet ... (BASE)
Chiu, Billy; Majewska, Olga; Pyysalo, Sampo. Apollo - University of Cambridge Repository, 2019.
20. A Neural Classification Method for Supporting the Creation of BioVerbNet ... (BASE)
Chiu, Hon Wing; Majewska, Olga; Pyysalo, Sampo. Apollo - University of Cambridge Repository, 2019.


Hits by source type: Catalogues 2; Bibliographies 0; Linked Open Data catalogues 0; Online resources 1; Open access documents 49.