1. Input Representations for Parsing Discourse Representation Structures: Comparing English with Chinese ... (Source: BASE)
2. On the Difficulty of Translating Free-Order Case-Marking Languages ...
3. UDapter: Language Adaptation for Truly Universal Dependency Parsing ...
4. Understanding Cross-Lingual Syntactic Transfer in Multilingual Recurrent Neural Networks ...
6. Zero-shot Dependency Parsing with Pre-trained Multilingual Sentence Representations ...
7. Learning Topic-Sensitive Word Representations ...

Abstract: Distributed word representations are widely used for modeling words in NLP tasks. Most existing models generate one representation per word and do not account for a word's different meanings. We present two approaches to learning multiple topic-sensitive representations per word using the Hierarchical Dirichlet Process. We observe that by modeling topics and integrating the topic distribution of each document, we obtain representations that can distinguish between the different meanings of a given word. Our models yield statistically significant improvements on the lexical substitution task, indicating that commonly used single word representations, even when combined with contextual information, are insufficient for this task.

Comments: 5 pages, 1 figure, accepted at ACL 2017

Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences

URL: https://dx.doi.org/10.48550/arxiv.1705.00441
     https://arxiv.org/abs/1705.00441
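The abstract of entry 7 describes giving each word one vector per topic and weighting those vectors by a document's topic distribution to produce a context-specific embedding. A minimal sketch of that weighting step, assuming the per-topic vectors and the HDP-style topic distribution are already available (all names and values here are hypothetical, not from the paper's released code):

```python
# Hedged sketch: combine per-topic word vectors with a document-level
# topic distribution p(k | d) into one topic-sensitive representation.

def topic_sensitive_embedding(topic_vectors, topic_dist):
    """topic_vectors: K per-topic vectors for one word (each of length D);
    topic_dist: document-level topic probabilities p(k | d), length K.
    Returns the probability-weighted sum of the per-topic vectors."""
    dim = len(topic_vectors[0])
    out = [0.0] * dim
    for weight, vec in zip(topic_dist, topic_vectors):
        for i, x in enumerate(vec):
            out[i] += weight * x
    return out

# Toy example: "bank" with two topic-specific vectors (finance vs. river).
bank_vectors = [
    [1.0, 0.0],  # topic 0: finance sense
    [0.0, 1.0],  # topic 1: river sense
]
finance_doc = [0.9, 0.1]  # p(topic | document) from a topic model such as HDP
print(topic_sensitive_embedding(bank_vectors, finance_doc))  # prints [0.9, 0.1]
```

In a finance-heavy document the resulting vector sits close to the finance-sense vector, which is the behavior the abstract attributes to topic-sensitive representations.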
9. Neural versus Phrase-Based Machine Translation Quality: a Case Study ...
10. A Survey of Word Reordering in Statistical Machine Translation: Computational Models and Language Phenomena ...