1. Le modèle Transformer : un « couteau suisse » pour le traitement automatique des langues [The Transformer model: a "Swiss Army knife" for natural language processing]
   In: Techniques de l'Ingénieur, 2022. DOI: 10.51257/a-v1-in195. HAL: https://hal.archives-ouvertes.fr/hal-03619077 ; Publisher: https://www.techniques-ingenieur.fr/base-documentaire/innovation-th10/innovations-en-electronique-et-tic-42257210/transformer-des-reseaux-de-neurones-pour-le-traitement-automatique-des-langues-in195/
3. Navigating the Kaleidoscope of COVID-19 Misinformation Using Deep Learning
   ACL Anthology: https://aclanthology.org/2021.emnlp-main.485/
   Abstract: Irrespective of the success of the deep learning-based mixed-domain transfer learning approach for solving various Natural Language Processing tasks, it does not lend a generalizable solution for detecting misinformation from COVID-19 social media data. Due to the inherent complexity of this type of data, caused by its dynamic (context evolves rapidly), nuanced (misinformation types are often ambiguous), and diverse (skewed, fine-grained, and overlapping categories) nature, it is imperative for an effective model to capture both the local and global context of the target domain. By conducting a systematic investigation, we show that: (i) the deep Transformer-based pre-trained models, used via mixed-domain transfer learning, are only good at capturing the local context and thus exhibit poor generalization, and (ii) a combination of shallow network-based domain-specific models and convolutional neural networks can efficiently ...
   Keywords: Computational Linguistics; Covid-19; Deep Learning; Language Models; Machine Learning; Machine Learning and Data Mining; Natural Language Processing
   URL: https://dx.doi.org/10.48448/yyza-sr36 ; https://underline.io/lecture/37959-navigating-the-kaleidoscope-of-covid-19-misinformation-using-deep-learning
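   The architecture that this abstract contrasts with Transformer transfer learning (shallow domain-specific models combined with convolutional neural networks) can be illustrated with a short, hedged PyTorch sketch. This is not the paper's code: the class name, hyperparameters, and the idea of plugging in domain-specific pretrained embeddings are illustrative assumptions; the sketch only shows the generic shallow-CNN-over-embeddings pattern the abstract refers to.

      # Hedged sketch (not the paper's actual model): a shallow CNN text
      # classifier over domain-specific pretrained embeddings. All names
      # and hyperparameters below are illustrative assumptions.
      import torch
      import torch.nn as nn
      import torch.nn.functional as F

      class ShallowCNNClassifier(nn.Module):
          def __init__(self, vocab_size, embed_dim=300, num_classes=4,
                       kernel_sizes=(2, 3, 4), num_filters=100,
                       pretrained_embeddings=None):
              super().__init__()
              # Domain-specific embeddings (e.g. trained on COVID-19 tweets)
              # would be passed in here; random initialization is the fallback.
              self.embedding = nn.Embedding(vocab_size, embed_dim)
              if pretrained_embeddings is not None:
                  self.embedding.weight.data.copy_(pretrained_embeddings)
              # Parallel 1D convolutions over the token axis capture local
              # n-gram features at several widths.
              self.convs = nn.ModuleList(
                  nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes
              )
              self.dropout = nn.Dropout(0.5)
              self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

          def forward(self, token_ids):
              # token_ids: (batch, seq_len) integer tensor
              x = self.embedding(token_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
              feats = []
              for conv in self.convs:
                  c = F.relu(conv(x))                # (batch, num_filters, seq_len - k + 1)
                  feats.append(c.max(dim=2).values)  # max-over-time pooling per filter
              out = self.dropout(torch.cat(feats, dim=1))
              return self.fc(out)                    # (batch, num_classes) logits

      # Toy usage: a batch of 8 token-id sequences of length 64.
      model = ShallowCNNClassifier(vocab_size=30000)
      logits = model(torch.randint(0, 30000, (8, 64)))
      print(logits.shape)  # torch.Size([8, 4])

   Max-over-time pooling is what lets such a shallow model aggregate local n-gram evidence into a sequence-level summary, which is one plausible reading of the abstract's local-plus-global framing.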
4. HittER: Hierarchical Transformers for Knowledge Graph Embeddings
6. HETFORMER: Heterogeneous Transformer with Sparse Attention for Long-Text Extractive Summarization
7. Not All Negatives are Equal: Label-Aware Contrastive Loss for Fine-grained Text Classification
9. Unsupervised Multi-View Post-OCR Error Correction With Language Models
10. AttentionRank: Unsupervised Keyphrase Extraction using Self and Cross Attentions
11. Automatic Fact-Checking with Document-level Annotations using BERT and Multiple Instance Learning
12. Towards the Early Detection of Child Predators in Chat Rooms: A BERT-based Approach
13. Semantic Categorization of Social Knowledge for Commonsense Question Answering
14. Pre-train or Annotate? Domain Adaptation with a Constrained Budget
15. Stepmothers are mean and academics are pretentious: What do pretrained language models learn about you?
16. CLIFF: Contrastive Learning for Improving Faithfulness and Factuality in Abstractive Summarization
17. Automatic Text Evaluation through the Lens of Wasserstein Barycenters
18. Combining sentence and table evidence to predict veracity of factual claims using TaPaS and RoBERTa
19. Meta Distant Transfer Learning for Pre-trained Language Models