2. How transfer learning impacts linguistic knowledge in deep NLP models?

Abstract:
Transfer learning from pre-trained neural language models towards downstream tasks has been a predominant theme in NLP recently. Several researchers have shown that deep NLP models learn a non-trivial amount of linguistic knowledge, captured at different layers of the model. We investigate how fine-tuning towards downstream NLP tasks impacts the learned linguistic knowledge. We carry out a study across the popular pre-trained models BERT, RoBERTa and XLNet using layer- and neuron-level diagnostic classifiers. We found that for some GLUE tasks the network relies on the core linguistic information and preserves it deeper in the network, while for others this information is forgotten. Linguistic information is distributed in the pre-trained language models but becomes localized to the lower layers post fine-tuning, reserving the higher layers for task-specific knowledge. The pattern varies across architectures, with BERT retaining linguistic information relatively deeper in the network compared to RoBERTa and XLNet, where it is ...

In: Findings of the ACL 2021
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences

URL: https://arxiv.org/abs/2105.15179
DOI: https://dx.doi.org/10.48550/arxiv.2105.15179
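The diagnostic classifiers mentioned in the abstract are simple probes: the pre-trained model is frozen, representations are extracted from each layer, and a lightweight classifier is trained per layer to predict a linguistic property; probe accuracy then indicates how much of that property the layer encodes. What follows is a minimal Python sketch of such a layer-wise probe, assuming the Hugging Face transformers and scikit-learn libraries; the toy sentences, binary labels, and mean pooling are illustrative placeholders, not the paper's actual data or setup.

import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.linear_model import LogisticRegression

# Frozen pre-trained encoder; output_hidden_states exposes every layer.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

# Hypothetical toy data: sentences labeled with a binary linguistic property.
sentences = ["The cat sat on the mat .", "Dogs barked loudly all night ."]
labels = [0, 1]

def layer_representation(sentence, layer):
    """Mean-pooled representation of a sentence at one layer."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # hidden_states is a tuple of (num_layers + 1) tensors of shape
    # [batch, seq_len, hidden]; index 0 is the embedding layer.
    return outputs.hidden_states[layer][0].mean(dim=0).numpy()

# One probe per layer: higher accuracy suggests the property is more
# linearly recoverable from that layer's representations.
for layer in range(model.config.num_hidden_layers + 1):
    X = [layer_representation(s, layer) for s in sentences]
    probe = LogisticRegression(max_iter=1000).fit(X, labels)
    print(f"layer {layer:2d}: train accuracy = {probe.score(X, labels):.2f}")

In practice one would train on held-out annotations (e.g. POS or chunking labels) and compare per-layer probe accuracy before and after fine-tuning, which is the kind of comparison the abstract describes.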
4. Effect of Post-processing on Contextualized Word Representations
5. Similarity Analysis of Contextual Word Representation Models
6. AraBench: Benchmarking Dialectal Arabic-English Machine Translation
7. A Clustering Framework for Lexical Normalization of Roman Urdu
8. Analyzing Individual Neurons in Pre-trained Language Models
9. On the Linguistic Representational Power of Neural Machine Translation Models
   In: Computational Linguistics, Vol 46, Iss 1, Pp 1-52 (2020)
11. What Is One Grain of Sand in the Desert? Analyzing Individual Neurons in Deep NLP Models
12. Identifying and Controlling Important Neurons in Neural Machine Translation
15. Challenging Language-Dependent Segmentation for Arabic: An Application to Machine Translation and Part-of-Speech Tagging
16. The SUMMA Platform Prototype
    In: http://infoscience.epfl.ch/record/233575 (2017)
17. Egyptian Arabic to English Statistical Machine Translation System for NIST OpenMT'2015
18. QCMUQ@QALB-2015 Shared Task: Combining Character level MT and Error-tolerant Finite-State Recognition for Arabic Spelling Correction