1 | What you can cram into a single $&!#* vector: Probing sentence embeddings for linguistic properties
In: ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Jul 2018, Melbourne, Australia. pp.2126-2136 (2018). https://hal.archives-ouvertes.fr/hal-01898412
BASE
4 | Fader Networks: Manipulating Images by Sliding Attributes
In: 31st Conference on Neural Information Processing Systems (NIPS 2017), Dec 2017, Long Beach, CA, United States. pp.5969-5978 (2017). https://hal.archives-ouvertes.fr/hal-02275215
BASE

7 | Polyglot Neural Language Models: A Case Study in Cross-Lingual Phonetic Representation Learning
Tsvetkov, Yulia; Sitaram, Sunayana; Faruqui, Manaal; Lample, Guillaume; Littell, Patrick; Mortensen, David; Black, Alan W.; Levin, Lori; Dyer, Chris. arXiv, 2016. Proceedings of NAACL 2016; 10 pages.

Abstract: We introduce polyglot language models, recurrent neural network models trained to predict symbol sequences in many different languages using shared representations of symbols and conditioning on typological information about the language to be predicted. We apply these to the problem of modeling phone sequences, a domain in which universal symbol inventories and cross-linguistically shared feature representations are a natural fit. Intrinsic evaluation on held-out perplexity, qualitative analysis of the learned representations, and extrinsic evaluation in two downstream applications that make use of phonetic features show (i) that polyglot models better generalize to held-out data than comparable monolingual models and (ii) that polyglot phonetic feature representations are of higher quality than those learned monolingually.

Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.1605.03832 ; https://arxiv.org/abs/1605.03832
BASE