
Search in the Catalogues and Directories

Hits 1–20 of 85 (page 1 of 5)

1. Parameter-Efficient Neural Reranking for Cross-Lingual and Multilingual Retrieval ... (BASE)
2. Geographic Adaptation of Pretrained Language Models ... (BASE)
3. On cross-lingual retrieval with multilingual text encoders. Litschko, Robert; Vulić, Ivan; Ponzetto, Simone Paolo. Springer Science + Business Media, 2022. (BASE)
4. Data for paper: "Evaluating Resource-Lean Cross-Lingual Embedding Models in Unsupervised Retrieval" ... Litschko, Robert; Glavaš, Goran. Mannheim University Library, 2021. (BASE)
5. Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems ... (BASE)
6. On Cross-Lingual Retrieval with Multilingual Text Encoders ... (BASE)
7. Evaluating Multilingual Text Encoders for Unsupervised Cross-Lingual Retrieval ... (BASE)
8. RedditBias: A Real-World Resource for Bias Evaluation and Debiasing of Conversational Language Models ... (BASE)
9. LexFit: Lexical Fine-Tuning of Pretrained Language Models ... (BASE)
Read paper: https://www.aclanthology.org/2021.acl-long.410
Abstract: Transformer-based language models (LMs) pretrained on large text collections implicitly store a wealth of lexical semantic knowledge, but it is non-trivial to extract that knowledge effectively from their parameters. Inspired by prior work on semantic specialization of static word embedding (WE) models, we show that it is possible to expose and enrich lexical knowledge from the LMs, that is, to specialize them to serve as effective and universal "decontextualized" word encoders even when fed input words "in isolation" (i.e., without any context). Their transformation into such word encoders is achieved through a simple and efficient lexical fine-tuning procedure (termed LexFit) based on dual-encoder network structures. Further, we show that LexFit can yield effective word encoders even with limited lexical supervision and, via cross-lingual transfer, in different languages without any readily available external knowledge. Our evaluation ...
(An illustrative code sketch of this dual-encoder fine-tuning setup follows the result list below.)
Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://dx.doi.org/10.48448/2skf-gv34
https://underline.io/lecture/25829-lexfit-lexical-fine-tuning-of-pretrained-language-models
10. Verb Knowledge Injection for Multilingual Event Processing ... (BASE)
11. Is supervised syntactic parsing beneficial for language understanding tasks? An empirical investigation. Glavaš, Goran; Vulić, Ivan. Association for Computational Linguistics, 2021. (BASE)
12. Evaluating multilingual text encoders for unsupervised cross-lingual retrieval (BASE)
13. Training and domain adaptation for supervised text segmentation. Glavaš, Goran; Ganesh, Ananya; Somasundaran, Swapna. Association for Computational Linguistics, 2021. (BASE)
14. AraWEAT: Multidimensional Analysis of Biases in Arabic Word Embeddings ... (BASE)
15. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning ... (BASE)
16. On the Limitations of Cross-lingual Encoders as Exposed by Reference-Free Machine Translation Evaluation ... (BASE)
17. Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer ... (BASE)
18. XCOPA: A Multilingual Dataset for Causal Commonsense Reasoning ... Ponti, Edoardo; Glavaš, Goran; Majewska, Olga. Apollo - University of Cambridge Repository, 2020. (BASE)
19. From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers ... (BASE)
20. Verb Knowledge Injection for Multilingual Event Processing ... (BASE)
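Entry 9 above describes the LexFit procedure: a pretrained language model is lexically fine-tuned in a dual-encoder setup on word pairs fed in isolation, so that it can afterwards serve as a decontextualized word encoder. As a rough illustration of that idea only (not the paper's released code), the Python sketch below assumes a BERT-style encoder loaded through the Hugging Face transformers library, mean-pooling over subword states, and an in-batch-negatives cross-entropy loss; the model name, temperature, and toy synonym pairs are placeholder assumptions.

# Hypothetical illustration of dual-encoder lexical fine-tuning (not the LexFit release):
# a shared pretrained LM encodes single words without any sentence context, and synonym
# pairs are pulled together with an in-batch-negatives ranking loss.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"  # assumption: any BERT-style encoder
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)
optimizer = torch.optim.AdamW(encoder.parameters(), lr=2e-5)

def encode_words(words):
    """Encode words fed in isolation and mean-pool their subword hidden states."""
    batch = tokenizer(words, padding=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state              # (batch, seq_len, dim)
    mask = batch["attention_mask"].unsqueeze(-1).float()     # zero out padding positions
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)      # (batch, dim)

def lexfit_style_step(pairs):
    """One update on a batch of (word, synonym) pairs using in-batch negatives."""
    left, right = zip(*pairs)
    a = F.normalize(encode_words(list(left)), dim=-1)
    b = F.normalize(encode_words(list(right)), dim=-1)
    logits = a @ b.T / 0.05                                   # 0.05 temperature is a placeholder
    labels = torch.arange(len(pairs))                         # i-th left word matches i-th right word
    loss = F.cross_entropy(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy lexical supervision; real training would draw pairs from a lexical resource.
print(lexfit_style_step([("cheap", "inexpensive"), ("big", "large"), ("car", "automobile")]))

In this sketch both sides of each pair share the same encoder weights; after fine-tuning, encode_words() alone yields static-style word vectors for downstream lexical tasks.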


Hit distribution by source type: Open access documents 85; Catalogues 0; Bibliographies 0; Linked Open Data catalogues 0; Online resources 0.