
Search in the Catalogues and Directories

Hits 21–40 of 1,423

21. HIT - A Hierarchically Fused Deep Attention Network for Robust Code-mixed Language Representation ... (BASE)
22. Minimally-Supervised Morphological Segmentation using Adaptor Grammars with Linguistic Priors ... (BASE)
23. Bridging Subword Gaps in Pretrain-Finetune Paradigm for Natural Language Generation ... (BASE)
24. LearnDA: Learnable Knowledge-Guided Data Augmentation for Event Causality Identification ... (BASE)
25. Quotation Recommendation and Interpretation Based on Transformation from Queries to Quotations ... (BASE)
26. How Did This Get Funded?! Automatically Identifying Quirky Scientific Achievements ... (BASE)
27. Minimax and Neyman–Pearson Meta-Learning for Outlier Languages ... (BASE)
28. CLINE: Contrastive Learning with Semantic Negative Examples for Natural Language Understanding ... (BASE)
29. Towards Protecting Vital Healthcare Programs by Extracting Actionable Knowledge from Policy ... (BASE)
30. DYPLOC: Dynamic Planning of Content Using Mixed Language Models for Text Generation ... (BASE)
Abstract: We study the task of long-form opinion text generation, which faces at least two distinct challenges. First, existing neural generation models fall short of coherence, thus requiring efficient content planning. Second, diverse types of information are needed to guide the generator to cover both subjective and objective content. To this end, we propose DYPLOC, a generation framework that conducts dynamic planning of content while generating the output based on a novel design of mixed language models. To enrich the generation with diverse content, we further propose to use large pre-trained models to predict relevant concepts and to generate claims. We experiment with two challenging tasks on newly collected datasets: (1) argument generation with Reddit ChangeMyView, and (2) writing articles using New York Times' Opinion section. Automatic evaluation shows that our model significantly outperforms competitive comparisons. Human judges further ...
Read paper: https://www.aclanthology.org/2021.acl-long.501
Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL: https://dx.doi.org/10.48448/xtrk-d266
https://underline.io/lecture/25720-dyploc-dynamic-planning-of-content-using-mixed-language-models-for-text-generation
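Note: the DYPLOC record above describes generation with "mixed language models" under dynamic content planning, i.e. the next-token distribution is a mixture of per-content-item distributions whose weights can be re-estimated at every decoding step. The following is a minimal, hypothetical numpy sketch of that mixing step only; all sizes, names, and the random stand-in distributions are invented for illustration and are not the authors' implementation, which builds on large pre-trained models.

import numpy as np

# Toy illustration of mixed language models with dynamic planning:
# p(y_t | y_<t) = sum_k w_k * p(y_t | y_<t, c_k), where c_k are content items
# (e.g. predicted concepts or generated claims) and w_k are plan weights
# recomputed at each step. Everything below is a hypothetical stand-in.
rng = np.random.default_rng(0)
vocab_size = 10          # toy vocabulary
num_content_items = 3    # toy number of content items

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Stand-ins for per-item next-token distributions from a conditional LM.
per_item_probs = softmax(rng.normal(size=(num_content_items, vocab_size)))

# Stand-ins for plan scores over content items at the current step.
plan_weights = softmax(rng.normal(size=num_content_items))

# Mix the per-item distributions into a single next-token distribution.
mixed_probs = plan_weights @ per_item_probs
next_token = int(mixed_probs.argmax())
print("plan weights:", np.round(plan_weights, 3))
print("next token id:", next_token)

The sketch only shows why the mixture stays a valid distribution (weights and per-item rows each sum to one); how the plan weights and per-item distributions are actually parameterized and trained is described in the paper linked above.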
31. Automated Concatenation of Embeddings for Structured Prediction ... (BASE)
32. QASR: QCRI Aljazeera Speech Resource A Large Scale Annotated Arabic Speech Corpus ... (BASE)
33. Code Generation from Natural Language with Less Prior Knowledge and More Monolingual Data ... (BASE)
34. On the Distribution, Sparsity, and Inference-time Quantization of Attention Values in Transformers ... (BASE)
35. Learning Disentangled Latent Topics for Twitter Rumour Veracity Classification ... (BASE)
36. Sequence Models for Computational Etymology of Borrowings ... (BASE)
37. Scaling Within Document Coreference to Long Texts ... (BASE)
38. How to Split: the Effect of Word Segmentation on Gender Bias in Speech Translation ... (BASE)
39. Prefix-Tuning: Optimizing Continuous Prompts for Generation ... (BASE)
40. Chase: A Large-Scale and Pragmatic Chinese Dataset for Cross-Database Context-Dependent Text-to-SQL ... (BASE)


Hits by resource type: Catalogues 0 | Bibliographies 0 | Linked Open Data catalogues 0 | Online resources 0 | Open access documents 1,423