3 |
P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks ...
BASE
4 |
Zero-Shot Information Extraction as a Unified Text-to-Triple Translation ...
7 |
A Study on Mandarin Utterance Particles Translation: A Preliminary Translation Pattern of Mandarin Utterance Particle Ba in Teahouse
Tang, Jie. - : The University of Queensland, School of Languages and Cultures, 2016
8 |
Word Embedding based Correlation Model for Question/Answer Matching ...
Abstract:
With the development of community-based question answering (Q&A) services, large-scale Q&A archives have accumulated and have become an important information and knowledge resource on the web. Question and answer matching has attracted much attention for its ability to reuse the knowledge stored in these systems: it can enhance the user experience for recurrent questions. In this paper, we try to improve matching accuracy by overcoming the lexical gap between question and answer pairs. We propose a Word Embedding based Correlation (WEC) model that integrates the advantages of both the translation model and word embeddings: given a random pair of words, WEC can score their co-occurrence probability in Q&A pairs, and it can also leverage the continuity and smoothness of continuous-space word representations to handle new word pairs that are rare in the training parallel text. An experimental study on the Yahoo! Answers dataset and the Baidu Zhidao dataset shows this new method's ... : 8 pages, 2 figures ...
Keywords:
Artificial Intelligence cs.AI; Computation and Language cs.CL; FOS Computer and information sciences
URL: https://arxiv.org/abs/1511.04646
DOI: https://dx.doi.org/10.48550/arxiv.1511.04646
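The abstract above only sketches the WEC model; the actual method learns a correlation score from Q&A parallel text. As a rough illustration of the underlying idea (ranking answers by how well their words pair with the question's words in embedding space), here is a minimal toy sketch with made-up embeddings, using plain cosine similarity as a stand-in for WEC's learned co-occurrence score:

```python
import math

# Toy 3-d embeddings with hypothetical values; a real system would load
# pretrained vectors. Cosine similarity here is only a stand-in for the
# learned word-pair correlation score described in the abstract.
EMB = {
    "install": [0.9, 0.1, 0.0],
    "setup":   [0.8, 0.2, 0.1],
    "python":  [0.1, 0.9, 0.2],
    "banana":  [0.0, 0.1, 0.9],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def match_score(question_words, answer_words):
    """Average pairwise embedding similarity between question and answer words."""
    pairs = [(q, a) for q in question_words for a in answer_words
             if q in EMB and a in EMB]
    if not pairs:
        return 0.0
    return sum(cosine(EMB[q], EMB[a]) for q, a in pairs) / len(pairs)

good = match_score(["install", "python"], ["setup", "python"])
bad = match_score(["install", "python"], ["banana"])
print(good > bad)  # an answer with related words should score higher
```

Because the embeddings place related words (install/setup) near each other, the relevant answer outscores the unrelated one even though the surface words differ, which is the lexical-gap effect the paper targets.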