
Search in the Catalogues and Directories

Hits 1 – 20 of 39

1. IGLUE: A Benchmark for Transfer Learning across Modalities, Tasks, and Languages ... (BASE)
2. Universal Dependencies 2.9. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021. (BASE)
3. Universal Dependencies 2.8.1. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021. (BASE)
4. Universal Dependencies 2.8. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2021. (BASE)
5. Modelling Latent Translations for Cross-Lingual Transfer ... (BASE)
6. Minimax and Neyman–Pearson Meta-Learning for Outlier Languages ... (BASE)
7. Mind the Context: The Impact of Contextualization in Neural Module Networks for Grounding Visual Referring Expressions ... (BASE)
8. Back-Training excels Self-Training at Unsupervised Domain Adaptation of Question Generation and Passage Retrieval ... (BASE)
9. Minimax and Neyman–Pearson Meta-Learning for Outlier Languages ... (BASE)
   Abstract: Model-agnostic meta-learning (MAML) has been recently put forth as a strategy to learn resource-poor languages in a sample-efficient fashion. Nevertheless, the properties of these languages are often not well represented by those available during training. Hence, we argue that the i.i.d. assumption ingrained in MAML makes it ill-suited for cross-lingual NLP. In fact, under a decision-theoretic framework, MAML can be interpreted as minimising the expected risk across training languages (with a uniform prior), which is known as Bayes criterion. To increase its robustness to outlier languages, we create two variants of MAML based on alternative criteria: Minimax MAML reduces the maximum risk across languages, while Neyman–Pearson MAML constrains the risk in each language to a maximum threshold. Both criteria constitute fully differentiable two-player games. In light of this, we propose a new adaptive optimiser solving for a local ...
   Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
   URLs: https://www.aclanthology.org/2021.findings-acl.106
   https://underline.io/lecture/26197-minimax-and-neyman-pearson-meta-learning-for-outlier-languages
   https://dx.doi.org/10.48448/1h13-vc36
10. Visually Grounded Reasoning across Languages and Cultures ... (BASE)
11. Visually Grounded Reasoning across Languages and Cultures ... (BASE)
12. Visually Grounded Reasoning across Languages and Cultures ... (BASE)
13. Universal Dependencies 2.7. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2020. (BASE)
14. Universal Dependencies 2.6. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2020. (BASE)
15. Words aren't enough, their order matters: On the Robustness of Grounding Visual Referring Expressions ... (BASE)
16. MeDAL ... Wen, Zhi; Lu, Xing Han; Reddy, Siva. Zenodo, 2020. (BASE)
17. Universal Dependencies 2.5. Zeman, Daniel; Nivre, Joakim; Abrams, Mitchell. Universal Dependencies Consortium, 2019. (BASE)
18. Universal Dependencies 2.4. Nivre, Joakim; Abrams, Mitchell; Agić, Željko. Universal Dependencies Consortium, 2019. (BASE)
19. CoQA: A Conversational Question Answering Challenge. In: Transactions of the Association for Computational Linguistics, Vol 7, Pp 249-266 (2019). (BASE)
20. Universal Dependencies 2.2. In: https://hal.archives-ouvertes.fr/hal-01930733 ; 2018. (BASE)


Open access documents: 39