A Two-Level Recurrent Neural Network Language Model Based on the Continuous Bag-of-Words Model for Sentence Classification

Yo Han Lee, Dong W. Kim, Myo Taeg Lim

Research output: Contribution to journal › Article

Abstract

In this paper, a new two-level recurrent neural network language model (RNNLM) based on the continuous bag-of-words (CBOW) model is presented for application to sentence classification. The vector representations of words learned by a neural network language model have been shown to carry semantic and sentiment information and are useful in various natural language processing tasks. A disadvantage of CBOW is that it considers only a fixed-length context, because its underlying structure is a neural network with a fixed-length input. In contrast, the RNNLM places no limit on context size but considers only the words that precede the current position. The advantages of the RNNLM are therefore complementary to the disadvantages of CBOW. Herein, the proposed model encodes many linguistic patterns and improves on previously reported results for sentiment analysis and question classification benchmarks.
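
To make the complementary structure concrete, below is a minimal, hypothetical sketch (not the authors' published implementation) of a two-level model in PyTorch: a CBOW-style fixed-window average over word embeddings feeds a recurrent layer that carries unbounded left context, and the final recurrent state drives a sentence classifier. The class name TwoLevelCbowRnnClassifier, all layer sizes, the window radius, and the choice of a GRU are illustrative assumptions.

```python
# Hypothetical sketch of a two-level CBOW + RNN sentence classifier.
# All names and hyperparameters are assumptions, not the paper's settings.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoLevelCbowRnnClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128,
                 window=2, num_classes=2):
        super().__init__()
        self.window = window  # CBOW-style context radius (fixed length)
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Level 2: the RNN removes the fixed-length limit by carrying
        # all preceding context in its hidden state.
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        emb = self.embed(token_ids)                      # (B, T, E)
        # Level 1: CBOW-style encoding = average of a fixed window around
        # each position, computed via 1-D average pooling over time.
        k = 2 * self.window + 1
        pooled = F.avg_pool1d(emb.transpose(1, 2), kernel_size=k,
                              stride=1, padding=self.window)
        pooled = pooled.transpose(1, 2)                  # back to (B, T, E)
        _, h_last = self.rnn(pooled)                     # h_last: (1, B, H)
        return self.out(h_last.squeeze(0))               # (B, num_classes)


if __name__ == "__main__":
    model = TwoLevelCbowRnnClassifier(vocab_size=10_000)
    batch = torch.randint(1, 10_000, (4, 20))            # 4 sentences, 20 tokens
    print(model(batch).shape)                            # torch.Size([4, 2])
```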

Original language: English
Article number: 1950002
Journal: International Journal on Artificial Intelligence Tools
Volume: 28
Issue number: 1
DOIs
Publication status: Published - 2019 Feb 1

Keywords

  • continuous bag-of-words
  • language model
  • recurrent neural network
  • sentence classification

ASJC Scopus subject areas

  • Artificial Intelligence
