
Search in the Catalogues and Directories

Hits 1 – 20 of 37

1. Universal Dependencies 2.9
   Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021

2. Universal Dependencies 2.8.1
   Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021

3. Universal Dependencies 2.8
   Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021

4. Semantic Search as Extractive Paraphrase Span Detection ...

5. Deep learning for sentence clustering in essay grading support ...

6. Morpho-syntactically annotated corpora provided for the PARSEME Shared Task on Semi-Supervised Identification of Verbal Multiword Expressions (edition 1.2)

7. Universal Dependencies 2.7
   Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2020

8. Universal Dependencies 2.6
   Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2020

9. Towards Fully Bilingual Deep Language Modeling ...
   Abstract: Language models based on deep neural networks have facilitated great advances in natural language processing and understanding tasks in recent years. While models covering a large number of languages have been introduced, their multilinguality has come at a cost in terms of monolingual performance, and the best-performing models at most tasks not involving cross-lingual transfer remain monolingual. In this paper, we consider the question of whether it is possible to pre-train a bilingual model for two remotely related languages without compromising performance at either language. We collect pre-training data, create a Finnish-English bilingual BERT model and evaluate its performance on datasets used to evaluate the corresponding monolingual models. Our bilingual model performs on par with Google's original English BERT on GLUE and nearly matches the performance of monolingual Finnish BERT on a range of Finnish NLP tasks, clearly outperforming multilingual BERT. We find that when the model vocabulary size is ...
   Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
   URL: https://dx.doi.org/10.48550/arxiv.2010.11639
   https://arxiv.org/abs/2010.11639

10. WikiBERT models: deep transfer learning for many languages ...

11. Universal Dependencies v2: An Evergrowing Multilingual Treebank Collection ...

12. Dependency parsing of biomedical text with BERT
    In: BMC Bioinformatics (2020)

13. Universal Dependencies 2.5
    Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2019

14. Universal Dependencies 2.4
    Nivre, Joakim; Abrams, Mitchell; Agić, Željko. Universal Dependencies Consortium, 2019

15. Is Multilingual BERT Fluent in Language Generation? ...

16. Multilingual is not enough: BERT for Finnish ...

17. Universal Dependencies 2.2
    In: https://hal.archives-ouvertes.fr/hal-01930733 ; 2018

18. Universal Dependencies 2.3
    Nivre, Joakim; Abrams, Mitchell; Agić, Željko. Universal Dependencies Consortium, 2018

19. Universal Dependencies 2.2
    Nivre, Joakim; Abrams, Mitchell; Agić, Željko. Universal Dependencies Consortium, 2018

20. Artificial Treebank with Ellipsis
    Droganova, Kira; Zeman, Daniel; Kanerva, Jenna. Charles University, Faculty of Mathematics and Physics, Institute of Formal and Applied Linguistics (UFAL), 2018


Hits by source type:
Catalogues: 2
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 1
Open access documents: 34
© 2013 - 2024 Lin|gu|is|tik