
Search in the Catalogues and Directories

Page: 1 2 3
Hits 1 – 20 of 52

1. Universal Dependencies 2.9. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021. (BASE)
2. Universal Dependencies 2.8.1. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021. (BASE)
3. Universal Dependencies 2.8. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021. (BASE)
4. Beyond the English Web: Zero-Shot Cross-Lingual and Lightweight Monolingual Classification of Registers ... (BASE)
5. Deep learning for sentence clustering in essay grading support ... (BASE)
6. Universal Dependencies 2.7. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2020. (BASE)
7. Universal Dependencies 2.6. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2020. (BASE)
8. Towards Fully Bilingual Deep Language Modeling ... (BASE)
9. The birth of Romanian BERT ... (BASE)
10. WikiBERT models: deep transfer learning for many languages ... (BASE)
Abstract: Deep neural language models such as BERT have enabled substantial recent advances in many natural language processing tasks. Due to the effort and computational cost involved in their pre-training, language-specific models are typically introduced only for a small number of high-resource languages such as English. While multilingual models covering large numbers of languages are available, recent work suggests that monolingual training can produce better models, and our understanding of the tradeoffs between mono- and multilingual training is incomplete. In this paper, we introduce a simple, fully automated pipeline for creating language-specific BERT models from Wikipedia data and introduce 42 new such models, most for languages up to now lacking dedicated deep neural language models. We assess the merits of these models using the state-of-the-art UDify parser on Universal Dependencies data, contrasting performance with results using the multilingual BERT model. We find that UDify using WikiBERT models ... (7 pages, 1 figure)
Keywords: Computation and Language (cs.CL); Computer and information sciences (FOS); Machine Learning (cs.LG)
URL: https://arxiv.org/abs/2006.01538
DOI: https://dx.doi.org/10.48550/arxiv.2006.01538
11. Exploring Cross-sentence Contexts for Named Entity Recognition with BERT ... (BASE)
12. Universal Dependencies v2: An Evergrowing Multilingual Treebank Collection ... (BASE)
13. Dependency parsing of biomedical text with BERT. In: BMC Bioinformatics (2020). (BASE)
14. Universal Dependencies 2.5. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2019. (BASE)
15. Universal Dependencies 2.4. Nivre, Joakim; Abrams, Mitchell; Agić, Željko. Universal Dependencies Consortium, 2019. (BASE)
16. Multilingual is not enough: BERT for Finnish ... (BASE)
17. A neural classification method for supporting the creation of BioVerbNet ... Chiu, Billy; Majewska, Olga; Pyysalo, Sampo. Figshare, 2019. (BASE)
18. A neural classification method for supporting the creation of BioVerbNet ... Chiu, Billy; Majewska, Olga; Pyysalo, Sampo. Figshare, 2019. (BASE)
19. A neural classification method for supporting the creation of BioVerbNet ... Chiu, Billy; Majewska, Olga; Pyysalo, Sampo. Apollo - University of Cambridge Repository, 2019. (BASE)
20. A Neural Classification Method for Supporting the Creation of BioVerbNet ... Chiu, Hon Wing; Majewska, Olga; Pyysalo, Sampo. Apollo - University of Cambridge Repository, 2019. (BASE)
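The Universal Dependencies releases listed above distribute their treebanks in the CoNLL-U format: one token per line with ten tab-separated fields (ID, FORM, LEMMA, UPOS, XPOS, FEATS, HEAD, DEPREL, DEPS, MISC), sentences separated by blank lines, and comment lines starting with "#". As a minimal sketch (not taken from any of the listed works; the sample sentence is invented for illustration), such files can be read in a few lines of Python:

```python
# Minimal CoNLL-U reader for Universal Dependencies treebanks.
# Format: one token per line, 10 tab-separated fields; sentences are
# separated by blank lines; lines starting with '#' carry metadata.

FIELDS = ["id", "form", "lemma", "upos", "xpos",
          "feats", "head", "deprel", "deps", "misc"]

def parse_conllu(text):
    """Yield each sentence as a list of token dicts."""
    sentence = []
    for line in text.splitlines():
        if not line.strip():
            if sentence:          # blank line ends the current sentence
                yield sentence
                sentence = []
        elif line.startswith("#"):
            continue              # sentence-level metadata, e.g. '# text = ...'
        else:
            cols = line.split("\t")
            sentence.append(dict(zip(FIELDS, cols)))
    if sentence:                  # file may lack a trailing blank line
        yield sentence

# Invented two-word example; '_' marks unused fields.
sample = ("# text = UDify parses sentences.\n"
          "1\tUDify\tUDify\tPROPN\t_\t_\t2\tnsubj\t_\t_\n"
          "2\tparses\tparse\tVERB\t_\t_\t0\troot\t_\t_\n"
          "3\tsentences\tsentence\tNOUN\t_\t_\t2\tobj\t_\t_\n"
          "4\t.\t.\tPUNCT\t_\t_\t2\tpunct\t_\t_\n")

sentences = list(parse_conllu(sample))
```

Real UD treebanks also use multiword-token ranges (e.g. an ID of "3-4") and empty nodes ("3.1"), which a full reader would need to handle separately.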


Results by source type: Catalogues 2; Bibliographies 0; Linked Open Data catalogues 0; Online resources 1; Open access documents 49.
© 2013 - 2024 Lin|gu|is|tik