
Search in the Catalogues and Directories

Page: 1 2 3 4 5...18
Hits 1 – 20 of 360

1. Graph Neural Networks for Multiparallel Word Alignment ... (BASE)
2. CaMEL: Case Marker Extraction without Labels ... (BASE)
3. Listening to Affected Communities to Define Extreme Speech: Dataset and Experiments ... (BASE)
4. Differentiable Multi-Agent Actor-Critic for Multi-Step Radiology Report Summarization ... (BASE)
5. Geographic Adaptation of Pretrained Language Models ... (BASE)
6. Distributed representations for multilingual language processing
Dufter, Philipp [author]; Schütze, Hinrich [academic supervisor]. - München: Universitätsbibliothek der Ludwig-Maximilians-Universität, 2021 (DNB Subject Category: Language)
7. Domain adaptation in Natural Language Processing
Sedinkina, Marina [author]; Schütze, Hinrich [academic supervisor]. - München: Universitätsbibliothek der Ludwig-Maximilians-Universität, 2021 (DNB Subject Category: Language)
8. Combining contextualized and non-contextualized embeddings for domain adaptation and beyond
Pörner, Nina Mareike [author]; Schütze, Hinrich [academic supervisor]. - München: Universitätsbibliothek der Ludwig-Maximilians-Universität, 2021 (DNB Subject Category: Language)
9. Graph Algorithms for Multiparallel Word Alignment
In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), Association for Computational Linguistics, Nov 2021, Punta Cana, Dominican Republic. https://hal.archives-ouvertes.fr/hal-03424044 ; https://2021.emnlp.org/ (BASE)
10. Static Embeddings as Efficient Knowledge Bases? ...
Abstract: Recent research investigates factual knowledge stored in large pretrained language models (PLMs). Instead of structured knowledge base (KB) queries, masked sentences such as "Paris is the capital of [MASK]" are used as probes. The good performance on this task has been interpreted as evidence that PLMs are becoming repositories of factual knowledge. In experiments across ten linguistically diverse languages, we study the knowledge contained in static embeddings. We show that, when the output space is restricted to a candidate set, simple nearest-neighbor matching using static embeddings performs better than PLMs; e.g., static embeddings perform 1.6 percentage points better than BERT while using just 0.3% of the energy for training. One important factor in their good comparative performance is that static embeddings are typically learned for a large vocabulary, whereas BERT relies on its more sophisticated, but expensive, ability to compose meaningful representations from a much smaller subword vocabulary. (NAACL 2021 CRV; first two authors contributed equally.)
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2104.07094 ; https://arxiv.org/abs/2104.07094 (BASE)
11. Does He Wink or Does He Nod? A Challenging Benchmark for Evaluating Word Understanding of Language Models ... (BASE)
12. Superbizarre Is Not Superb: Derivational Morphology Improves BERT's Interpretation of Complex Words ... (BASE)
13. Discrete and Soft Prompting for Multilingual Models ...
Zhao, Mengjie; Schütze, Hinrich. - arXiv, 2021 (BASE)
14. ParCourE: A Parallel Corpus Explorer for a Massively Multilingual Corpus ... (BASE)
15. Multilingual LAMA: Investigating Knowledge in Multilingual Pretrained Language Models ... (BASE)
16. Wine is Not v i n. -- On the Compatibility of Tokenizations Across Languages ... (BASE)
17. Graph Algorithms for Multiparallel Word Alignment ... (BASE)
18. Locating Language-Specific Information in Contextualized Embeddings ... (BASE)
19. Dynamic Contextualized Word Embeddings ... (BASE)
20. Measuring and Improving Consistency in Pretrained Language Models ... (BASE)
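The candidate-restricted probing described in the abstract of hit 10 (pick the candidate whose static embedding lies nearest to the query word's embedding) can be sketched as follows. The toy vectors and vocabulary below are invented for illustration; the paper's actual setup uses pretrained static embeddings over a large vocabulary.

```python
import numpy as np

# Toy static embeddings, invented for illustration only; a real
# system would load pretrained vectors (e.g. fastText-style).
embeddings = {
    "Paris":   np.array([0.9, 0.1, 0.0]),
    "France":  np.array([0.8, 0.2, 0.1]),
    "Germany": np.array([0.1, 0.9, 0.2]),
    "Japan":   np.array([0.0, 0.2, 0.9]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def predict(query_word, candidates):
    """Restrict the output space to `candidates` and return the one
    whose static embedding is nearest (by cosine) to the query word's
    embedding -- the nearest-neighbor matching the abstract describes."""
    q = embeddings[query_word]
    return max(candidates, key=lambda c: cosine(q, embeddings[c]))

# Probe analogous to "Paris is the capital of [MASK]" with a
# restricted candidate set:
print(predict("Paris", ["France", "Germany", "Japan"]))  # → France
```

Because the output space is restricted to a small candidate set, no generation or masked-token scoring is needed; a single similarity lookup per candidate suffices, which is one reason the static-embedding baseline is so cheap.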


© 2013 - 2024 Lin|gu|is|tik