
Search in the Catalogues and Directories

Page: 1 2 3 4
Hits 1 – 20 of 79

1. Parallel processing in speech perception with local and global representations of linguistic context. In: eLife (2022). (BASE)
2. Using surprisal and fMRI to map the neural bases of broad and local contextual prediction during natural language comprehension ... (BASE)
3. Community-level Research on Suicidality Prediction in a Secure Environment: Overview of the CLPsych 2021 Shared Task. (BASE)
4. Connecting Documents, Words, and Languages Using Topic Models. Yang, Weiwei, 2019.
Abstract: Topic models discover latent topics in documents and summarize documents at a high level. To improve topic models' topic quality and extrinsic performance, external knowledge is often incorporated as part of the generative story. One form of external knowledge is weighted text links that indicate similarity or relatedness between the connected objects. This dissertation 1) uncovers the latent structures in observed weighted links and integrates them into topic modeling, and 2) learns latent weighted links from other external knowledge to improve topic modeling. We consider incorporating links at three different levels: documents, words, and topics.

We first look at binary document links, e.g., citation links of papers. Document links indicate topic similarity of the connected documents. Past methods model the document links separately, ignoring the entire link density. We instead uncover latent document blocks in which documents are densely connected and tend to talk about similar topics. We introduce LBH-RTM, a relational topic model with lexical weights, block priors, and hinge loss. It extracts informative topic priors from the document blocks for documents' topic generation. It predicts unseen document links with block and lexical features and hinge loss, in addition to topical features. It outperforms past methods in link prediction and gives more coherent topics.

Like document links, words are also linked, but usually with real-valued weights. Word links are known as word associations and indicate the semantic relatedness of the connected words. They provide more information about word relationships in addition to the co-occurrence patterns in the training corpora. To extract and incorporate the knowledge in word associations, we introduce methods to find the most salient word pairs. The methods organize the words in a tree structure, which serves as a prior (i.e., tree prior) for tree LDA. The methods are straightforward but effective, yielding more coherent topics than vanilla LDA, and slightly improving the extrinsic classification performance.

Weighted topic links are different. Topics are latent, so it is difficult to obtain ground-truth topic links, but learned weighted topic links could bridge the topics across languages. We introduce a multilingual topic model (MTM) that assumes each language has its own topic distributions over the words only in that language and learns weighted topic links based on word translations and words' topic distributions. It does not force the topic spaces of different languages to be aligned and is more robust than previous MTMs that do. It outperforms past MTMs in classification while still giving coherent topics on less comparable and smaller corpora.
Keywords: Artificial intelligence; Computer science; Topic Model
URL: http://hdl.handle.net/1903/25003
https://doi.org/10.13016/jiwk-1bh3
(BASE)
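The abstract above uses vanilla LDA as its baseline (and entry 13 below, "Gibbs Sampling for the Uninitiated", covers the underlying inference method). As a hedged illustration only, not code from the dissertation, a minimal collapsed Gibbs sampler for vanilla LDA can be sketched as follows; the function name, toy corpus, and all parameter values are invented for this example.

```python
import random

def lda_gibbs(docs, K, V, alpha=0.1, beta=0.01, iters=200, seed=0):
    """Collapsed Gibbs sampler for vanilla LDA.

    docs: list of documents, each a list of integer word ids in [0, V).
    Returns (ndk, nkw): per-document topic counts and per-topic word counts.
    """
    rng = random.Random(seed)
    ndk = [[0] * K for _ in docs]      # document -> topic counts
    nkw = [[0] * V for _ in range(K)]  # topic -> word counts
    nk = [0] * K                       # total tokens assigned to each topic
    z = []                             # topic assignment of every token
    # random initialisation of topic assignments
    for d, doc in enumerate(docs):
        zd = []
        for w in doc:
            k = rng.randrange(K)
            zd.append(k)
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
        z.append(zd)
    # Gibbs sweeps: resample each token's topic from its full conditional
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # remove the token's current assignment from the counts
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # p(topic t) ~ (n_dt + alpha) * (n_tw + beta) / (n_t + V*beta)
                weights = [(ndk[d][t] + alpha) * (nkw[t][w] + beta) / (nk[t] + V * beta)
                           for t in range(K)]
                r = rng.random() * sum(weights)
                k = K - 1
                for t, wt in enumerate(weights):
                    r -= wt
                    if r <= 0:
                        k = t
                        break
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return ndk, nkw

# toy corpus with two disjoint vocabularies (word ids 0-2 vs 3-5)
docs = [[0, 1, 2, 0, 1, 2]] * 3 + [[3, 4, 5, 3, 4, 5]] * 3
ndk, nkw = lda_gibbs(docs, K=2, V=6)
```

The tree priors and weighted links the abstract describes would replace the symmetric `beta` term in the sampling weights with structured, link-informed priors; this sketch shows only the unmodified baseline.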
5. Assessing Composition in Sentence Vector Representations ... (BASE)
6. Relating lexical and syntactic processes in language: Bridging research in humans and machines. (BASE)
7. Guided Probabilistic Topic Models for Agenda-setting and Framing. Nguyen, Viet An, 2015. (BASE)
8. Soft syntactic constraints for Arabic-English hierarchical phrase-based translation. In: Machine translation. Dordrecht [u.a.]: Springer Science + Business Media 26 (2012) 1-2, 137-157. (BLLDB; OLC Linguistik)
9. Crowdsourced Monolingual Translation. Hu, Chang, 2012. (BASE)
10. Decision Tree-based Syntactic Language Modeling. (BASE)
11. Modeling Dependencies in Natural Languages with Latent Variables. (BASE)
12. Exploiting syntactic relationships in a phrase-based decoder: an exploration. In: Machine translation. Dordrecht [u.a.]: Springer Science + Business Media 24 (2010) 2, 123-140. (BLLDB; OLC Linguistik)
13. Gibbs Sampling for the Uninitiated. In: DTIC (2010). (BASE)
14. Structured local exponential models for machine translation. (BASE)
15. A Formal Model of Ambiguity and its Applications in Machine Translation. (BASE)
16. Extending Phrase-Based Decoding with a Dependency-Based Reordering Model. In: DTIC (2009). (BASE)
17. Extending Phrase-Based Decoding with a Dependency-Based Reordering Model. (BASE)
18. Computational Analysis of the Conversational Dynamics of the United States Supreme Court. Hawes, Timothy, 2009. (BASE)
19. Fine-Grained Linguistic Soft Constraints on Statistical Natural Language Processing Models. (BASE)
20. Generalizing Word Lattice Translation. In: DTIC (2008). (BASE)


Hits by source type:
Catalogues: 6, 0, 7, 0, 0, 0, 0
Bibliographies: 18, 0, 0, 0, 0, 0, 0, 0, 6
Linked Open Data catalogues: 0
Online resources: 0, 0, 0, 0
Open access documents: 50, 0, 0, 0, 0
© 2013 - 2024 Lin|gu|is|tik