1 | Parallel processing in speech perception with local and global representations of linguistic context
In: eLife (2022)
2 | Using surprisal and fMRI to map the neural bases of broad and local contextual prediction during natural language comprehension ...
3 | Community-level Research on Suicidality Prediction in a Secure Environment: Overview of the CLPsych 2021 Shared Task
4 | Connecting Documents, Words, and Languages Using Topic Models
5 | Assessing Composition in Sentence Vector Representations ...
6 | Relating lexical and syntactic processes in language: Bridging research in humans and machines
7 | Guided Probabilistic Topic Models for Agenda-setting and Framing
11 | Modeling Dependencies in Natural Languages with Latent Variables
14 | Structured local exponential models for machine translation
Abstract:
This thesis proposes a synthesis and generalization of local exponential translation models, the subclass of feature-rich translation models that associate probability distributions with individual rewrite rules used by the translation system, such as synchronous context-free rules, or with other individual aspects of translation hypotheses, such as word pairs or reordering events. Unlike other authors, we use these estimates to replace the traditional phrase models and lexical scores rather than in addition to them, thereby demonstrating that local exponential phrase models can be regarded as a generalization of standard methods in practical as well as theoretical terms. We further introduce a form of local translation model that combines features associated with the surface forms of rules and features associated with less specific representations -- including those based on lemmas, inflections, and reordering patterns -- such that surface-form estimates are recovered as a special case of the model. Crucially, the proposed approach allows the parameters of the latter type of feature to be estimated from training sets that include multiple source phrases, thereby overcoming an important training-set fragmentation problem that hampers previously proposed local translation models. These proposals are validated experimentally. Conditioning all phrase-based probabilities in a hierarchical phrase-based system on source-side contextual information produces significant performance improvements. Extending the contextually sensitive estimates with features modeling source-side morphology and reordering patterns yields consistent additional improvements, and further experiments show significant gains from modeling observed and unobserved inflections for a morphologically rich target language.
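The core idea in this abstract -- a log-linear ("exponential") distribution over the rewrite candidates of a single source phrase, with surface-form features combined with less specific (e.g. lemma-based) features so that surface estimates fall out as a special case -- can be illustrated with a minimal sketch. This is not the thesis's implementation; the feature names, weights, and candidates below are hypothetical.

```python
import math

def local_exponential_model(weights, candidates, featurize):
    """Log-linear distribution over rewrite candidates for one source phrase:
    p(c) proportional to exp(w . f(c))."""
    scores = [sum(weights.get(name, 0.0) * value
                  for name, value in featurize(c).items())
              for c in candidates]
    z = sum(math.exp(s) for s in scores)  # local normalization over this rule set
    return {c: math.exp(s) / z for c, s in zip(candidates, scores)}

def featurize(candidate):
    # Hypothetical features: a specific surface-form feature plus a less
    # specific lemma feature shared across inflected variants, so weights for
    # the latter can be estimated from multiple source phrases.
    target, lemma = candidate
    return {f"surface={target}": 1.0, f"lemma={lemma}": 1.0}

weights = {"surface=house": 1.2, "lemma=hous": 0.5, "surface=home": 0.3}
dist = local_exponential_model(weights,
                               [("house", "hous"), ("home", "hous")],
                               featurize)
```

If all lemma-feature weights are zero, the distribution depends only on the surface features, which is the sense in which surface-form estimates are recovered as a special case.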
Keyword:
Artificial intelligence; Computer science; Exponential models; Linguistics; Machine learning; Machine translation; Maximum entropy
URL: http://hdl.handle.net/1903/12150
15 | A Formal Model of Ambiguity and its Applications in Machine Translation
16 | Extending Phrase-Based Decoding with a Dependency-Based Reordering Model
In: DTIC (2009)
18 | Computational Analysis of the Conversational Dynamics of the United States Supreme Court
19 | Fine-Grained Linguistic Soft Constraints on Statistical Natural Language Processing Models