1. Developing Conversational Data and Detection of Conversational Humor in Telugu ... (source: BASE)
4. Not All Negatives are Equal: Label-Aware Contrastive Loss for Fine-grained Text Classification ...
5. Open Aspect Target Sentiment Classification with Natural Language Prompts ...
6. SYSML: StYlometry with Structure and Multitask Learning: Implications for Darknet Forum Migrant Analysis ...
7. Connecting Attributions and QA Model Behavior on Realistic Counterfactuals ...
8. End-to-end style-conditioned poetry generation: What does it take to learn from examples alone? ...
9. Solving Aspect Category Sentiment Analysis as a Text Generation Task ...
10. CLASSIC: Continual and Contrastive Learning of Aspect Sentiment Classification Tasks ...
11. Improving Multimodal Fusion via Mutual Dependency Maximisation ...
12. Perceived and Intended Sarcasm Detection with Graph Attention Networks ...
14. Improving Federated Learning for Aspect-based Sentiment Analysis via Topic Memories ...
15. How much coffee was consumed during EMNLP 2019? Fermi Problems: A New Reasoning Challenge for AI ...
16. LMSOC: An Approach for Socially Sensitive Pretraining ...

    Abstract: While large-scale pretrained language models have been shown to learn effective linguistic representations for many NLP tasks, there remain many real-world contextual aspects of language that current approaches do not capture. For instance, consider the cloze test "I enjoyed the _______ game this weekend": the correct answer depends heavily on where the speaker is from, when the utterance occurred, and the speaker's broader social milieu and preferences. Although language depends heavily on the geographical, temporal, and other social contexts of the speaker, these elements have not been incorporated into modern transformer-based language models. We propose a simple but effective approach to incorporate speaker social context into the learned representations of large-scale language models. Our method first learns dense representations of social contexts using graph representation learning algorithms and then primes language model pretraining with these social context representations. We evaluate our approach ...

    Keywords: Language Models; Machine Learning; Natural Language Processing; Sentiment Analysis

    URL: https://dx.doi.org/10.48448/g0tj-ns83
    URL: https://underline.io/lecture/39396-lmsoc-an-approach-for-socially-sensitive-pretraining
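The LMSOC abstract above describes a two-stage recipe: learn dense social-context embeddings from a graph, then prime language model pretraining with them. A minimal sketch of that shape, assuming a toy spectral embedding in place of the paper's (unspecified here) graph representation learning algorithm, and assuming the context vector is prepended to the token-embedding sequence as a "context token":

```python
import numpy as np

# Stage 1 (assumption: a simple spectral embedding stands in for
# node2vec-style graph representation learning).
def spectral_context_embeddings(adj: np.ndarray, dim: int) -> np.ndarray:
    """Embed graph nodes via the top eigenvectors of the
    symmetrically normalized adjacency matrix."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    norm_adj = d_inv_sqrt @ adj @ d_inv_sqrt
    _, vecs = np.linalg.eigh(norm_adj)   # eigenvalues ascending
    return vecs[:, -dim:]                # one row of `dim` features per node

# Stage 2 (assumption: prime the LM by prepending a projected
# context vector to the utterance's token embeddings).
def prepend_context(token_embs: np.ndarray,
                    ctx_vec: np.ndarray,
                    proj: np.ndarray) -> np.ndarray:
    """Map a social-context vector into the model's embedding
    space and prepend it as an extra position."""
    ctx_tok = ctx_vec @ proj             # (ctx_dim,) @ (ctx_dim, model_dim)
    return np.vstack([ctx_tok, token_embs])

# Toy social graph: 4 speakers, symmetric "connected-to" edges (assumed).
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)

ctx = spectral_context_embeddings(adj, dim=2)   # (4, 2): one vector per speaker
rng = np.random.default_rng(0)
proj = rng.normal(size=(2, 8))                  # learned jointly in practice
tokens = rng.normal(size=(5, 8))                # a 5-token utterance, model_dim=8
primed = prepend_context(tokens, ctx[0], proj)  # speaker 0's context prepended
print(primed.shape)                             # (6, 8)
```

In the paper's setting the context embeddings would come from a real social graph and the projection would be trained with the language model; the sketch only illustrates the data flow of "embed context, then prime the sequence".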
18. The Effect of Round-Trip Translation on Fairness in Sentiment Analysis ...
20. MATE: Multi-view Attention for Table Transformer Efficiency ...