
Search in the Catalogues and Directories

Hits 1 – 20 of 75

1. Delving Deeper into Cross-lingual Visual Question Answering ... (BASE)
2. Cross-Lingual Dialogue Dataset Creation via Outline-Based Generation ... (BASE)
3. Improving Word Translation via Two-Stage Contrastive Learning ... (BASE)
4. Towards Zero-shot Language Modeling ... (BASE)
5. Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems ... (BASE)
6. Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking ... (BASE)
7. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ... (BASE)
8. Parameter space factorization for zero-shot learning across tasks and languages ... (BASE)
9. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
Liu, Qianchu; Liu, Fangyu; Collier, Nigel. - Apollo - University of Cambridge Repository, 2021 (BASE)
10. Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking ... (BASE)
11. MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ... (BASE)
12. Semantic Data Set Construction from Human Clustering and Spatial Arrangement ...
Majewska, Olga; McCarthy, Diana; Van Den Bosch, Jasper JF. - Apollo - University of Cambridge Repository, 2021 (BASE)
13. AM2iCo: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples ... (BASE)
14. Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ... (BASE)
15. Parameter space factorization for zero-shot learning across tasks and languages
In: Transactions of the Association for Computational Linguistics, 9 (2021) (BASE)
16. AM2iCo: Evaluating Word Meaning in Context across Low-Resource Languages with Adversarial Examples ... (BASE)
17. Fast, Effective, and Self-Supervised: Transforming Masked Language Models into Universal Lexical and Sentence Encoders ...
Anthology paper link: https://aclanthology.org/2021.emnlp-main.109/
Abstract: Pretrained Masked Language Models (MLMs) have revolutionised NLP in recent years. However, previous work has indicated that off-the-shelf MLMs are not effective as universal lexical or sentence encoders without further task-specific fine-tuning on NLI, sentence similarity, or paraphrasing tasks using annotated task data. In this work, we demonstrate that it is possible to turn MLMs into effective universal lexical and sentence encoders even without any additional data and without any supervision. We propose an extremely simple, fast and effective contrastive learning technique, termed Mirror-BERT, which converts MLMs (e.g., BERT and RoBERTa) into such encoders in 20-30 seconds without any additional external knowledge. Mirror-BERT relies on fully identical or slightly modified string pairs as positive (i.e., synonymous) fine-tuning examples, and aims to maximise their similarity during identity fine-tuning. We report huge gains ...
Keywords: Language Models; Natural Language Processing; Semantic Evaluation; Sociolinguistics
URL: https://dx.doi.org/10.48448/vrhm-zd20
https://underline.io/lecture/37527-fast,-effective,-and-self-supervised-transforming-masked-language-models-into-universal-lexical-and-sentence-encoders
(BASE)
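The abstract in entry 17 describes how Mirror-BERT works: identical (or slightly modified) string pairs serve as positives in a brief round of contrastive identity fine-tuning, with no labels or external knowledge. The following is a minimal sketch of that idea in PyTorch with the Hugging Face transformers library, not the authors' implementation; the checkpoint name, temperature, learning rate, mean pooling, and the reliance on dropout alone to make the two views of each string differ are illustrative assumptions.

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "bert-base-uncased"   # assumed checkpoint; the abstract also mentions RoBERTa
TAU = 0.04                         # softmax temperature (assumed value)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

def encode(texts):
    # Mean-pool the final hidden states into one vector per string, then L2-normalise.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = model(**batch).last_hidden_state             # (batch, tokens, dim)
    mask = batch["attention_mask"].unsqueeze(-1).float()  # (batch, tokens, 1)
    pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
    return F.normalize(pooled, dim=-1)

def mirror_step(texts):
    # One contrastive update: two forward passes in train mode differ only via
    # dropout, so each string acts as its own positive; InfoNCE pulls the two
    # views of a string together and pushes the other strings in the batch away.
    model.train()
    view_a = encode(texts)
    view_b = encode(texts)
    logits = view_a @ view_b.T / TAU    # scaled cosine similarities
    targets = torch.arange(len(texts))  # positives lie on the diagonal
    loss = F.cross_entropy(logits, targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage: a handful of raw strings, no labels and no external knowledge.
loss = mirror_step(["cell membrane", "heart attack", "word embedding", "contrastive learning"])
print(f"contrastive loss: {loss:.4f}")

In the paper's framing the positive pairs may also be slightly modified strings (for example with random spans masked); the sketch keeps the strings identical and lets dropout supply the variation between the two views.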
18. LexFit: Lexical Fine-Tuning of Pretrained Language Models ... (BASE)
19. Verb Knowledge Injection for Multilingual Event Processing ... (BASE)
20. A Closer Look at Few-Shot Crosslingual Transfer: The Choice of Shots Matters ... (BASE)


Results by source type:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 75