
Search in the Catalogues and Directories

Hits 1 – 20 of 59

1. Ara-Women-Hate: The first Arabic Hate Speech corpus regarding Women ...
2. Towards the Early Detection of Child Predators in Chat Rooms: A BERT-based Approach ...
3. STaCK: Sentence Ordering with Temporal Commonsense Knowledge ...
4. Searching for an Effective Defender: Benchmarking Defense against Adversarial Word Substitution ...
5. Graphine: A Dataset for Graph-aware Terminology Definition Generation ...
6. End-to-end style-conditioned poetry generation: What does it take to learn from examples alone? ...
7. To what extent do human explanations of model behavior align with actual model behavior? ...
8. Time-aware Graph Neural Network for Entity Alignment between Temporal Knowledge Graphs ...
9. What’s Hidden in a One-layer Randomly Weighted Transformer? ...
10. Finetuning Pretrained Transformers into RNNs ...
11. Sometimes We Want Ungrammatical Translations ...
12. Pruning Neural Machine Translation for Speed Using Group Lasso ...
13. Elementary-Level Math Word Problem Generation using Pre-Trained Transformers ...
14. Does External Knowledge Help Explainable Natural Language Inference? Automatic Evaluation vs. Human Ratings ...
15. The Low-Resource Double Bind: An Empirical Study of Pruning for Low-Resource Machine Translation ...
16. Knowledge Graph Representation Learning using Ordinary Differential Equations ...
17. What Models Know About Their Attackers: Deriving Attacker Information From Latent Representations ...
18. Mind the Context: The Impact of Contextualization in Neural Module Networks for Grounding Visual Referring Expressions ...
Anthology paper link: https://aclanthology.org/2021.emnlp-main.516/
Abstract: Neural module networks (NMN) are a popular approach for grounding visual referring expressions. Prior implementations of NMN use pre-defined and fixed textual inputs in their module instantiation. This necessitates a large number of modules, as they lack the ability to share weights and exploit associations between similar textual contexts (e.g. 'dark cube on the left' vs. 'black cube on the left'). In this work, we address these limitations and evaluate the impact of contextual clues in improving the performance of NMN models. First, we address the problem of fixed textual inputs by parameterizing the module arguments. This substantially reduces the number of modules in NMN, by up to 75%, without any loss in performance. Next, we propose a method to contextualize our parameterized model to enhance the module's capacity for exploiting visiolinguistic associations. Our model outperforms the state-of-the-art NMN model on CLEVR-Ref+ ...
Keyword: Computational Linguistics; Machine Learning; Machine Learning and Data Mining; Natural Language Processing; Neural Network
URL: https://dx.doi.org/10.48448/c8vt-s207
https://underline.io/lecture/37933-mind-the-context-the-impact-of-contextualization-in-neural-module-networks-for-grounding-visual-referring-expressions
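The parameterization idea in the abstract — one shared module that takes the textual argument as an input, instead of a separate fixed module per phrase — can be illustrated with a minimal numpy sketch. This is not the authors' code; the class name, dimensions, and dot-product grounding are hypothetical choices made only to show how similar phrases ('dark cube on the left' vs. 'black cube on the left') can reuse the same weights.

```python
import numpy as np

rng = np.random.default_rng(0)

class ParameterizedFindModule:
    """One shared 'Find' module: the phrase embedding is an argument,
    so similar textual contexts reuse the same weights instead of
    each phrase instantiating its own module."""
    def __init__(self, visual_dim, text_dim, hidden_dim):
        # Shared projection weights, reused for every textual argument.
        self.W_vis = rng.standard_normal((visual_dim, hidden_dim)) * 0.01
        self.W_txt = rng.standard_normal((text_dim, hidden_dim)) * 0.01

    def __call__(self, visual_feats, phrase_emb):
        v = visual_feats @ self.W_vis   # (num_regions, hidden_dim)
        t = phrase_emb @ self.W_txt     # (hidden_dim,)
        scores = v @ t                  # grounding score per image region
        e = np.exp(scores - scores.max())
        return e / e.sum()              # attention over regions

module = ParameterizedFindModule(visual_dim=512, text_dim=300, hidden_dim=256)
regions = rng.standard_normal((10, 512))       # 10 candidate image regions
att_dark = module(regions, rng.standard_normal(300))   # e.g. "dark cube ..."
att_black = module(regions, rng.standard_normal(300))  # e.g. "black cube ..."
```

Because both calls go through the same `W_vis`/`W_txt`, only the phrase embedding differs between them — this weight sharing is what lets a parameterized design get by with far fewer modules than one-module-per-phrase.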
19. EM ALBERT: a step towards equipping Manipuri for NLP ...
20. ProtoInfoMax: Prototypical Networks with Mutual Information Maximization for Out-of-Domain Detection ...


Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 59
© 2013 - 2024 Lin|gu|is|tik