
Search in the Catalogues and Directories

Hits 1 – 20 of 39

1
Probing for the Usage of Grammatical Number ...
Abstract: A central quest of probing is to uncover how pre-trained models encode a linguistic property within their representations. An encoding, however, might be spurious, i.e., the model might not rely on it when making predictions. In this paper, we try to find encodings that the model actually uses, introducing a usage-based probing setup. We first choose a behavioral task which cannot be solved without using the linguistic property. Then, we attempt to remove the property by intervening on the model's representations. We contend that, if an encoding is used by the model, its removal should harm the performance on the chosen behavioral task. As a case study, we focus on how BERT encodes grammatical number, and on how it uses this encoding to solve the number agreement task. Experimentally, we find that BERT relies on a linear encoding of grammatical number to produce the correct behavioral output. We also find that BERT uses a separate encoding of grammatical number for nouns and verbs. Finally, we identify in ...
Note: ACL 2022 (Main Conference). The discussion section had been inadvertently removed before the article was published on arXiv ...
Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://arxiv.org/abs/2204.08831
https://dx.doi.org/10.48550/arxiv.2204.08831
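The abstract above describes a usage-based probing setup: learn a linear encoding of a property, erase it by intervening on the representations, and check whether behavior degrades. The following toy sketch (not the paper's code; all data and the mean-difference "probe" are illustrative assumptions) shows the core intervention, projecting representations onto the hyperplane orthogonal to a probe direction:

```python
import numpy as np

# Toy illustration of removing a linear encoding of a binary property
# (e.g., grammatical number) from representations by projecting out a
# probe direction, then measuring how much a refit probe degrades.

rng = np.random.default_rng(0)

# Synthetic "representations": 200 vectors in R^10 whose first axis
# carries the property (singular vs. plural) plus Gaussian noise.
labels = rng.integers(0, 2, size=200)          # 0 = singular, 1 = plural
X = rng.normal(size=(200, 10))
X[:, 0] += np.where(labels == 1, 2.0, -2.0)    # inject a linear encoding

# Probe direction via the class-mean difference (a simple stand-in
# for a trained linear probe).
w = X[labels == 1].mean(axis=0) - X[labels == 0].mean(axis=0)
w /= np.linalg.norm(w)

# Intervention: project every vector onto the hyperplane orthogonal
# to w, erasing the linear encoding along that direction.
X_removed = X - np.outer(X @ w, w)

def probe_accuracy(Z, y):
    """Accuracy of a mean-difference linear probe refit on Z."""
    d = Z[y == 1].mean(axis=0) - Z[y == 0].mean(axis=0)
    scores = Z @ d
    return ((scores > scores.mean()).astype(int) == y).mean()

print(f"before removal: {probe_accuracy(X, labels):.2f}")   # near 1.0
print(f"after removal:  {probe_accuracy(X_removed, labels):.2f}")  # near chance
```

Note that a single projection nulls this particular probe direction exactly; iterative methods repeat the learn-and-project loop to remove residual linear encodings.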
BASE
2
On Homophony and Rényi Entropy ...
BASE
3
On Homophony and Rényi Entropy ...
BASE
4
Revisiting the Uniform Information Density Hypothesis ...
BASE
5
Revisiting the Uniform Information Density Hypothesis ...
BASE
6
Modeling the Unigram Distribution ...
BASE
7
A Bayesian Framework for Information-Theoretic Probing ...
BASE
8
A surprisal--duration trade-off across and within the world's languages ...
BASE
9
Revisiting the Uniform Information Density Hypothesis ...
BASE
10
What About the Precedent: An Information-Theoretic Analysis of Common Law ...
BASE
11
Modeling the Unigram Distribution ...
BASE
12
Finding Concept-specific Biases in Form–Meaning Associations ...
BASE
13
Modeling the Unigram Distribution
In: Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (2021)
BASE
14
What About the Precedent: An Information-Theoretic Analysis of Common Law
In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2021)
BASE
15
Finding Concept-specific Biases in Form–Meaning Associations
In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2021)
BASE
16
A Non-Linear Structural Probe
In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2021)
BASE
17
Disambiguatory Signals are Stronger in Word-initial Positions
In: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume (2021)
BASE
18
How (Non-)Optimal is the Lexicon?
In: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (2021)
BASE
19
A Bayesian Framework for Information-Theoretic Probing
In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (2021)
BASE
20
How (Non-)Optimal is the Lexicon? ...
NAACL 2021; Blasi, Damián; Cotterell, Ryan. Underline Science Inc., 2021
BASE


Sources:
Catalogues: 0
Bibliographies: 0
Linked Open Data catalogues: 0
Online resources: 0
Open access documents: 39
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy | Change privacy settings