
Search in the Catalogues and Directories

Hits 1 – 20 of 22

1. Backtranslation in Neural Morphological Inflection ...
2. To POS Tag or Not to POS Tag: The Impact of POS Tags on Morphological Learning in Low-Resource Settings ...
3. Applying the Transformer to Character-level Transduction ...
Wu, Shijie; Cotterell, Ryan; Hulden, Mans. - : ETH Zurich, 2021
4. Do RNN States Encode Abstract Phonological Alternations? ...
NAACL 2021; Hulden, Mans; Nicolai, Garrett. - : Underline Science Inc., 2021
5. Do RNN States Encode Abstract Phonological Processes? ...
6. Applying the Transformer to Character-level Transduction
In: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume (2021)
Abstract: The transformer has been shown to outperform recurrent neural network-based sequence-to-sequence models in various word-level NLP tasks. Yet for character-level transduction tasks, e.g. morphological inflection generation and historical text normalization, there are few works that outperform recurrent models using the transformer. In an empirical study, we uncover that, in contrast to recurrent sequence-to-sequence models, the batch size plays a crucial role in the performance of the transformer on character-level tasks, and we show that with a large enough batch size, the transformer does indeed outperform recurrent models. We also introduce a simple technique to handle feature-guided character-level transduction that further improves performance. With these insights, we achieve state-of-the-art performance on morphological inflection and historical text normalization. We also show that the transformer outperforms a strong baseline on two other character-level transduction tasks: grapheme-to-phoneme conversion and transliteration.
URL: https://hdl.handle.net/20.500.11850/518998
DOI: https://doi.org/10.3929/ethz-b-000518998
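The abstract above mentions a technique for feature-guided character-level transduction, i.e. generating an inflected form from a lemma conditioned on morphosyntactic tags. As a rough illustration only, and not necessarily the exact scheme used in the paper, one common way to frame this for a sequence-to-sequence model is to feed the feature tags alongside the lemma's characters and have the model emit the inflected form character by character. The minimal Python sketch below shows that input formatting; the function name and tag format are assumptions for the example.

```python
# Minimal sketch of feature-guided inflection framed as character-level
# transduction. The tag-prepending format is a common convention, assumed
# here for illustration; it is not claimed to be the paper's exact technique.

def make_example(lemma: str, features: str, inflected: str):
    """Build (source, target) token sequences for a seq2seq inflection model.

    `features` is a UniMorph-style tag bundle such as "V;PST".
    """
    src = features.split(";") + list(lemma)   # e.g. ['V', 'PST', 'w', 'a', 'l', 'k']
    tgt = list(inflected)                     # e.g. ['w', 'a', 'l', 'k', 'e', 'd']
    return src, tgt


if __name__ == "__main__":
    print(make_example("walk", "V;PST", "walked"))
```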
7. Can a Transformer Pass the Wug Test? Tuning Copying Bias in Neural Morphological Inflection Models ...
Liu, Ling; Hulden, Mans. - : arXiv, 2021
8. SIGMORPHON 2020 Shared Task 0: Typologically Diverse Morphological Inflection ...
9. Analogy Models for Neural Word Inflection ...
10. UniMorph 3.0: Universal Morphology
In: Proceedings of the 12th Language Resources and Evaluation Conference (2020)
11. UniMorph 3.0: Universal Morphology ...
12. The SIGMORPHON 2019 Shared Task: Morphological Analysis in Context and Cross-Lingual Transfer for Inflection ...
13. RNN Classification of English Vowels: Nasalized or Not
In: Proceedings of the Society for Computation in Linguistics (2019)
14. On the Complexity and Typology of Inflectional Morphological Systems
In: Transactions of the Association for Computational Linguistics, Vol 7, Pp 327-342 (2019)
15. Marrying Universal Dependencies and Universal Morphology ...
16. On the Complexity and Typology of Inflectional Morphological Systems ...
17. Sound Analogies with Phoneme Embeddings
In: Proceedings of the Society for Computation in Linguistics (2018)
18. Quantifying the Trade-off Between Two Types of Morphological Complexity
In: Proceedings of the Society for Computation in Linguistics (2018)
19. A Comparison of Feature-Based and Neural Scansion of Poetry ...
20. Foma: a finite-state compiler and library
In: Association for Computational Linguistics / European Chapter. Conference of the European Chapter of the Association for Computational Linguistics. - Menlo Park, Calif. : ACL 12 (2009), 29-32


Hits by source: Open access documents: 21; Bibliographies: 1; no hits in catalogues, linked open data catalogues, or online resources.