Search in the Catalogues and Directories

Hits 1 – 20 of 22

1. Backtranslation in Neural Morphological Inflection (BASE)
2. To POS Tag or Not to POS Tag: The Impact of POS Tags on Morphological Learning in Low-Resource Settings (BASE)
3. Applying the Transformer to Character-level Transduction. Wu, Shijie; Cotterell, Ryan; Hulden, Mans. ETH Zurich, 2021 (BASE)
4. Do RNN States Encode Abstract Phonological Alternations? Hulden, Mans; Nicolai, Garrett. NAACL 2021; Underline Science Inc., 2021 (BASE)
5. Do RNN States Encode Abstract Phonological Processes? (BASE)
6. Applying the Transformer to Character-level Transduction. In: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume (2021) (BASE)
7. Can a Transformer Pass the Wug Test? Tuning Copying Bias in Neural Morphological Inflection Models. Liu, Ling; Hulden, Mans. arXiv, 2021 (BASE)
8. SIGMORPHON 2020 Shared Task 0: Typologically Diverse Morphological Inflection (BASE)
9. Analogy Models for Neural Word Inflection (BASE)
Abstract: Analogy is assumed to be the cognitive mechanism speakers resort to in order to inflect an unknown form of a lexeme based on knowledge of other words in a language. In this process, an analogy is formed between word forms within an inflectional paradigm but also across paradigms. As neural network models for inflection are typically trained only on lemma-target form pairs, we propose three new ways to provide neural models with additional source forms to strengthen analogy formation, and compare our methods to other approaches in the literature. We show that the proposed methods of providing a Transformer sequence-to-sequence model with additional analogy sources in the input are consistently effective, and improve upon recent state-of-the-art results on 46 languages, particularly in low-resource settings. We also propose a method to combine the analogy-motivated approach with data hallucination or augmentation. We find that the two approaches are complementary to each other and combining the two approaches ...
Keywords: Computer and Information Science; Natural Language Processing; Neural Network
URL: https://dx.doi.org/10.48448/vx2m-6395
https://underline.io/lecture/6355-analogy-models-for-neural-word-inflection
10. UniMorph 3.0: Universal Morphology. In: Proceedings of the 12th Language Resources and Evaluation Conference (2020) (BASE)
11. UniMorph 3.0: Universal Morphology (BASE)
12. The SIGMORPHON 2019 Shared Task: Morphological Analysis in Context and Cross-Lingual Transfer for Inflection (BASE)
13. RNN Classification of English Vowels: Nasalized or Not. In: Proceedings of the Society for Computation in Linguistics (2019) (BASE)
14. On the Complexity and Typology of Inflectional Morphological Systems. In: Transactions of the Association for Computational Linguistics, Vol. 7, pp. 327-342 (2019) (BASE)
15. Marrying Universal Dependencies and Universal Morphology (BASE)
16. On the Complexity and Typology of Inflectional Morphological Systems (BASE)
17. Sound Analogies with Phoneme Embeddings. In: Proceedings of the Society for Computation in Linguistics (2018) (BASE)
18. Quantifying the Trade-off Between Two Types of Morphological Complexity. In: Proceedings of the Society for Computation in Linguistics (2018) (BASE)
19. A Comparison of Feature-Based and Neural Scansion of Poetry (BASE)
20. Foma: a finite-state compiler and library. In: Proceedings of the 12th Conference of the European Chapter of the Association for Computational Linguistics. Menlo Park, Calif.: ACL (2009), pp. 29-32 (BLLDB)


Sources: Open access documents (21); Bibliographies (1)