Catalogue search
Hits 1 – 2 of 2
1. "We will Reduce Taxes" - Identifying Election Pledges with Language Models ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021
Adhikari (pankaj.adhikari@monash.edu); Fornaciari, Tommaso; Hovy, Dirk; Naurin, Elin; Runeson, Julia; Thomson, Robert
Underline Science Inc., 2021
Abstract:
Read paper: https://www.aclanthology.org/2021.findings-acl.301
In an election campaign, political parties pledge to implement various projects -- should they be elected. But do they follow through? To track election pledges from parties' election manifestos, we need to distinguish between pledges and general statements. In this paper, we use election manifestos of Swedish and Indian political parties to learn neural models that distinguish actual pledges from generic political positions. Since pledges might vary by election year and party, we implement a Multi-Task Learning (MTL) setup, predicting election year and manifesto's party as auxiliary tasks. Pledges can also span several sentences, so we use hierarchical models that incorporate contextual information. Lastly, we evaluate the models in a Zero-Shot Learning (ZSL) framework across countries and languages. Our results indicate that year and party have predictive power even in ZSL, while context introduces some noise. We finally discuss the ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL:
https://underline.io/lecture/26392-'we-will-reduce-taxes'---identifying-election-pledges-with-language-models
https://dx.doi.org/10.48448/x2b1-ym56
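The abstract above describes a Multi-Task Learning setup in which election year and manifesto party are predicted as auxiliary tasks alongside the main pledge/non-pledge decision, all sharing one encoder. A minimal sketch of that head structure, using plain NumPy and hypothetical dimensions in place of the neural models actually used in the paper:

```python
import numpy as np

# Hypothetical dimensions and label-space sizes -- not taken from the paper.
EMB_DIM, HID_DIM = 768, 128
N_YEARS, N_PARTIES = 6, 10

rng = np.random.default_rng(0)

# Shared encoder: one linear layer standing in for the sentence encoder.
W_shared = rng.normal(size=(EMB_DIM, HID_DIM))

# Task-specific heads: the main task (pledge vs. generic statement)
# plus the two auxiliary tasks named in the abstract.
W_pledge = rng.normal(size=(HID_DIM, 2))
W_year = rng.normal(size=(HID_DIM, N_YEARS))
W_party = rng.normal(size=(HID_DIM, N_PARTIES))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def forward(sentence_embeddings):
    """One shared representation feeds all three classification heads."""
    h = np.tanh(sentence_embeddings @ W_shared)
    return {
        "pledge": softmax(h @ W_pledge),
        "year": softmax(h @ W_year),
        "party": softmax(h @ W_party),
    }

batch = rng.normal(size=(4, EMB_DIM))  # four sentence embeddings
outputs = forward(batch)
print({name: probs.shape for name, probs in outputs.items()})
```

In an MTL setup like this, the losses from all three heads would be summed (possibly weighted) during training, so the shared encoder is pushed to represent year- and party-specific signal as well as pledge cues.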
BASE
2. SemEval 2021 Task 12: Learning with Disagreement ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021
Chamberlain, Jon; Dumitrache, Anca
Underline Science Inc., 2021
BASE
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings