Neural dialog state tracker for large ontologies by attention mechanism

Youngsoo Jang, Jiyeon Ham, Byung Jun Lee, Youngjae Chang, Kee Eung Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

9 Citations (Scopus)

Abstract

This paper presents the details of a dialog state tracker submitted to the Fifth Dialog State Tracking Challenge (DSTC 5). To tackle the challenging cross-language, human-human dialog state tracking task with limited training data, we propose a tracker that focuses on words with meaningful context, based on an attention mechanism and a bi-directional long short-term memory (LSTM) network. The vocabulary, which includes many proper nouns, is vectorized using a large amount of related text crawled from the web, so as to learn good embeddings for words that do not appear in the training dialogs. Despite its simplicity, the proposed tracker achieves high accuracy without sophisticated pre- or post-processing.
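The core idea the abstract describes, attending to words with meaningful context and pooling their representations, can be illustrated with a minimal sketch. The snippet below shows generic attention-weighted pooling over per-word vectors; it is not the paper's implementation, and the random vectors stand in for the contextual outputs a bi-directional LSTM would produce. All names and dimensions are illustrative.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(word_vecs, query):
    """Score each word vector against a query vector, then return the
    attention-weighted sum of the word vectors and the weights."""
    scores = word_vecs @ query          # one relevance score per word, shape (T,)
    weights = softmax(scores)           # attention distribution, sums to 1
    pooled = weights @ word_vecs        # weighted combination, shape (D,)
    return pooled, weights

rng = np.random.default_rng(0)
T, D = 6, 8                             # 6 words, 8-dim vectors (illustrative)
word_vecs = rng.normal(size=(T, D))     # stand-in for bi-LSTM hidden states
query = rng.normal(size=D)              # stand-in for a learned query vector

pooled, weights = attention_pool(word_vecs, query)
```

In a full tracker, the pooled vector would be fed to a classifier over ontology slot values; words irrelevant to the slot receive low attention weight and contribute little to the pooled representation.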

Original language: English
Title of host publication: 2016 IEEE Workshop on Spoken Language Technology, SLT 2016 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 531-537
Number of pages: 7
ISBN (Electronic): 9781509049035
DOIs
Publication status: Published - 2017 Feb 7
Externally published: Yes
Event: 2016 IEEE Workshop on Spoken Language Technology, SLT 2016 - San Diego, United States
Duration: 2016 Dec 13 - 2016 Dec 16

Publication series

Name: 2016 IEEE Workshop on Spoken Language Technology, SLT 2016 - Proceedings

Conference

Conference: 2016 IEEE Workshop on Spoken Language Technology, SLT 2016
Country/Territory: United States
City: San Diego
Period: 16/12/13 - 16/12/16

Keywords

  • Attention mechanism
  • Dialog state tracking
  • DSTC5
  • Recurrent Neural Network
  • Word embedding

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Artificial Intelligence
  • Language and Linguistics
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
