
Search in the Catalogues and Directories

Hits 1 – 20 of 34

1
XHate-999: analyzing and detecting abusive language across domains and languages
Glavaš, Goran [author]; Karan, Mladen [author]; Vulić, Ivan [author]. - Mannheim : Universitätsbibliothek Mannheim, 2021
DNB Subject Category Language
2
Specializing unsupervised pretraining models for word-level semantic similarity
Lauscher, Anne [author]; Vulić, Ivan [author]; Ponti, Edoardo Maria [author]. - Mannheim : Universitätsbibliothek Mannheim, 2021
DNB Subject Category Language
3
Towards instance-level parser selection for cross-lingual transfer of dependency parsers
Litschko, Robert [author]; Vulić, Ivan [author]; Agić, Željko [author]. - Mannheim : Universitätsbibliothek Mannheim, 2021
DNB Subject Category Language
4
SimLex-999 Slovenian translation SimLex-999-sl 1.0
Pollak, Senja; Vulić, Ivan; Pelicon, Andraž. - : University of Ljubljana, 2021
BASE
5
Towards Zero-shot Language Modeling ...
BASE
6
Multilingual and Cross-Lingual Intent Detection from Spoken Data ...
BASE
7
Crossing the Conversational Chasm: A Primer on Natural Language Processing for Multilingual Task-Oriented Dialogue Systems ...
BASE
8
Modelling Latent Translations for Cross-Lingual Transfer ...
Abstract: While achieving state-of-the-art results in multiple tasks and languages, translation-based cross-lingual transfer is often overlooked in favour of massively multilingual pre-trained encoders. Arguably, this is due to its main limitations: 1) translation errors percolating to the classification phase and 2) the insufficient expressiveness of the maximum-likelihood translation. To remedy this, we propose a new technique that integrates both steps of the traditional pipeline (translation and classification) into a single model, by treating the intermediate translations as a latent random variable. As a result, 1) the neural machine translation system can be fine-tuned with a variant of Minimum Risk Training where the reward is the accuracy of the downstream task classifier. Moreover, 2) multiple samples can be drawn to approximate the expected loss across all possible translations during inference. We evaluate our novel latent translation-based model on a series of multilingual NLU tasks, including commonsense ...
Keyword: Computation and Language cs.CL; FOS Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2107.11353
https://arxiv.org/abs/2107.11353
BASE
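The abstract above describes the method only in prose, so the following is a minimal, hypothetical Python sketch of the two ideas it names: a Minimum-Risk-Training-style objective in which sampled translations are weighted by the downstream classifier's reward (task accuracy), and inference that averages classifier predictions over several sampled translations to approximate the expectation over possible translations. The functions sample_translations and classify are toy stand-ins introduced here for illustration; they are not the authors' models or any real NMT/NLU API.

```python
# Hedged sketch, not the authors' code: illustrates the two ideas from the
# abstract of "Modelling Latent Translations for Cross-Lingual Transfer":
#   1) MRT-style training: weight sampled translations by the downstream
#      classifier's reward (accuracy) and minimise the expected error.
#   2) Inference: average classifier scores over several sampled translations
#      instead of committing to a single 1-best translation.
# The translator and classifier below are toy stand-ins, not real models.

import math
import random


def sample_translations(source_sentence, k):
    """Toy stand-in: sample k translations with placeholder log-probabilities."""
    samples = []
    for i in range(k):
        translation = f"{source_sentence} [sampled translation #{i}]"
        log_prob = math.log(random.uniform(0.1, 1.0))  # placeholder score
        samples.append((translation, log_prob))
    return samples


def classify(translation):
    """Toy stand-in for the downstream task classifier: returns P(label = 1)."""
    return random.uniform(0.0, 1.0)


def mrt_loss(source_sentence, gold_label, k=8):
    """Minimum-Risk-Training-style loss: expected (1 - reward) under the
    translator's sampling distribution, where reward = classifier accuracy."""
    samples = sample_translations(source_sentence, k)
    # Renormalise the sample probabilities over the k draws.
    probs = [math.exp(log_prob) for _, log_prob in samples]
    z = sum(probs)
    loss = 0.0
    for (translation, _), p in zip(samples, probs):
        prediction = 1 if classify(translation) >= 0.5 else 0
        reward = 1.0 if prediction == gold_label else 0.0
        loss += (p / z) * (1.0 - reward)
    return loss


def predict(source_sentence, k=8):
    """Inference: average classifier scores over k sampled translations to
    approximate the expected prediction across possible translations."""
    samples = sample_translations(source_sentence, k)
    mean_score = sum(classify(t) for t, _ in samples) / k
    return 1 if mean_score >= 0.5 else 0


if __name__ == "__main__":
    print("MRT-style loss:", mrt_loss("Das ist ein Beispiel.", gold_label=1))
    print("Prediction:", predict("Das ist ein Beispiel."))
```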
9
Prix-LM: Pretraining for Multilingual Knowledge Base Construction ...
BASE
10
Learning Domain-Specialised Representations for Cross-Lingual Biomedical Entity Linking ...
BASE
11
xGQA: Cross-Lingual Visual Question Answering ...
BASE
12
On Cross-Lingual Retrieval with Multilingual Text Encoders ...
BASE
13
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
BASE
14
Evaluating Multilingual Text Encoders for Unsupervised Cross-Lingual Retrieval ...
BASE
15
RedditBias: A Real-World Resource for Bias Evaluation and Debiasing of Conversational Language Models ...
BASE
16
Parameter space factorization for zero-shot learning across tasks and languages ...
BASE
17
Analogy Training Multilingual Encoders ...
Garneau, Nicolas; Hartmann, Mareike; Sandholm, Anders. - : Apollo - University of Cambridge Repository, 2021
BASE
18
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models ...
Liu, Qianchu; Liu, Fangyu; Collier, Nigel. - : Apollo - University of Cambridge Repository, 2021
BASE
19
UNKs Everywhere: Adapting Multilingual Language Models to New Scripts ...
BASE
20
How Good is Your Tokenizer? On the Monolingual Performance of Multilingual Language Models ...
BASE


Hit distribution: Catalogues 3 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 31