
Search in the Catalogues and Directories

Hits 1 – 20 of 52 (page 1 of 3)

1. Universal Dependencies 2.9. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021. [BASE]
2. Universal Dependencies 2.8.1. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021. [BASE]
3. Universal Dependencies 2.8. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021. [BASE]
4. Beyond the English Web: Zero-Shot Cross-Lingual and Lightweight Monolingual Classification of Registers ... [BASE]
5. Deep learning for sentence clustering in essay grading support ... [BASE]
6. Universal Dependencies 2.7. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2020. [BASE]
7. Universal Dependencies 2.6. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2020. [BASE]
8. Towards Fully Bilingual Deep Language Modeling ... [BASE]
9. The birth of Romanian BERT ... [BASE]
10. WikiBERT models: deep transfer learning for many languages ... [BASE]
11. Exploring Cross-sentence Contexts for Named Entity Recognition with BERT ... [BASE]
Abstract: Named entity recognition (NER) is frequently addressed as a sequence classification task in which each input consists of one sentence of text. It is nevertheless clear that useful information for NER is often also found elsewhere in the text. Recent self-attention models like BERT can both capture long-distance relationships in the input and represent inputs consisting of several sentences. This creates opportunities for adding cross-sentence information in natural language processing tasks. This paper presents a systematic study exploring the use of cross-sentence information for NER using BERT models in five languages. We find that adding context as additional sentences to the BERT input systematically increases NER performance. Multiple sentences in input samples also allow us to study the predictions of the same sentences in different contexts. We propose a straightforward method, Contextual Majority Voting (CMV), to combine these different predictions, and demonstrate that it further increases NER performance. Evaluation on ...
Keywords: Computer and Information Science; Natural Language Processing; Neural Network
URL: https://underline.io/lecture/6385-exploring-cross-sentence-contexts-for-named-entity-recognition-with-bert
DOI: https://dx.doi.org/10.48448/c1he-ey84
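The Contextual Majority Voting (CMV) idea described in the abstract — tagging the same sentence inside several context windows and combining the per-token predictions by majority vote — can be sketched roughly as follows. The function name and input format here are illustrative assumptions, not the paper's actual interface:

```python
from collections import Counter

def contextual_majority_vote(predictions_per_context):
    """Combine per-token NER label predictions for one sentence that was
    tagged several times inside different context windows.

    predictions_per_context: list of label sequences, one per context
    window, all aligned to the same tokens (hypothetical input format).
    Returns one label sequence chosen by per-token majority vote.
    """
    n_tokens = len(predictions_per_context[0])
    combined = []
    for i in range(n_tokens):
        # Count the labels all context windows assigned to token i
        votes = Counter(preds[i] for preds in predictions_per_context)
        combined.append(votes.most_common(1)[0][0])
    return combined

# Example: three context windows predicting labels for a 4-token sentence
ctx_preds = [
    ["B-PER", "I-PER", "O", "O"],
    ["B-PER", "I-PER", "O", "B-ORG"],
    ["B-PER", "O",     "O", "B-ORG"],
]
print(contextual_majority_vote(ctx_preds))  # → ['B-PER', 'I-PER', 'O', 'B-ORG']
```

In the paper's setting, each context window would come from running a BERT NER model over a multi-sentence input; the vote then smooths out predictions that flip depending on surrounding context.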
12. Universal Dependencies v2: An Evergrowing Multilingual Treebank Collection ... [BASE]
13. Dependency parsing of biomedical text with BERT. In: BMC Bioinformatics (2020). [BASE]
14. Universal Dependencies 2.5. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2019. [BASE]
15. Universal Dependencies 2.4. Nivre, Joakim; Abrams, Mitchell; Agić, Željko. Universal Dependencies Consortium, 2019. [BASE]
16. Multilingual is not enough: BERT for Finnish ... [BASE]
17. A neural classification method for supporting the creation of BioVerbNet ... Chiu, Billy; Majewska, Olga; Pyysalo, Sampo. Figshare, 2019. [BASE]
18. A neural classification method for supporting the creation of BioVerbNet ... Chiu, Billy; Majewska, Olga; Pyysalo, Sampo. Figshare, 2019. [BASE]
19. A neural classification method for supporting the creation of BioVerbNet ... Chiu, Billy; Majewska, Olga; Pyysalo, Sampo. Apollo - University of Cambridge Repository, 2019. [BASE]
20. A Neural Classification Method for Supporting the Creation of BioVerbNet ... Chiu, Hon Wing; Majewska, Olga; Pyysalo, Sampo. Apollo - University of Cambridge Repository, 2019. [BASE]


Hits by source type:
Catalogues: 2
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 1
Open access documents: 49
© 2013 - 2024 Lin|gu|is|tik