Abstract
Semantic transformation of a natural language question into its corresponding logical form is crucial for knowledge-based question answering systems. Most previous methods have tried to achieve this goal using syntax-based grammar formalisms and rule-based logical inference. However, these approaches are usually limited by the coverage of the lexical trigger, which maps words to the logical properties of the knowledge base, and so they tend to miss implicit or indirect relations between properties because they do not interpret the full knowledge base. In this study, our goal is to answer questions in any domain by using a semantic embedding space in which the embeddings encode the semantics of both words and logical properties. In this latent space, the semantic associations between existing features can be exploited based on their embeddings, without a manually produced lexicon or rules. This embedding-based inference approach allows factoid questions posed in natural language to be mapped onto logical representations of the correct answers, guided by the knowledge base. Our experimental results and examples demonstrate that, in terms of overall question answering performance, the proposed method outperforms previous knowledge-based question answering baselines on a publicly released evaluation dataset, WebQuestions.
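As a rough illustration of the embedding-based matching the abstract describes (a minimal sketch, not the paper's actual model), the snippet below scores candidate knowledge-base properties against question words by cosine similarity in a shared vector space. All vectors, the Freebase-style property names, and the `rank_properties` helper are hypothetical, invented here for illustration.

```python
import numpy as np

# Toy shared embedding space: question words and KB properties as dense
# vectors. Values are illustrative only, not from the paper.
EMB = {
    # question words
    "born": np.array([0.9, 0.1, 0.0]),
    # KB properties (hypothetical Freebase-style identifiers)
    "people.person.place_of_birth": np.array([0.85, 0.15, 0.05]),
    "people.person.nationality":    np.array([0.30, 0.70, 0.20]),
}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_properties(question_words, properties):
    """Rank KB properties by their best cosine match to any question word,
    standing in for a lexicon-free word-to-property mapping."""
    scores = {
        p: max((cosine(EMB[w], EMB[p]) for w in question_words if w in EMB),
               default=0.0)
        for p in properties
    }
    return sorted(scores.items(), key=lambda kv: -kv[1])

if __name__ == "__main__":
    ranked = rank_properties(
        ["born"],
        ["people.person.place_of_birth", "people.person.nationality"],
    )
    print(ranked)  # place_of_birth should rank highest for "born"
```

In this sketch the embedding similarity alone selects the relation, which is the point the abstract makes: no hand-built lexicon or trigger rules are consulted when mapping question words onto knowledge-base properties.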
| Original language | English |
| --- | --- |
| Article number | 10144 |
| Pages (from-to) | 9086-9104 |
| Number of pages | 19 |
| Journal | Expert Systems With Applications |
| Volume | 42 |
| Issue number | 23 |
| DOIs | |
| Publication status | Published - 2015 Dec 15 |
Keywords
- Distributional semantics
- Embedding model
- Knowledge base
- Labeled-LDA
- Neural networks
- Question answering
ASJC Scopus subject areas
- Engineering (all)
- Computer Science Applications
- Artificial Intelligence