Efficient exact inference with loss augmented objective in structured learning

Alexander Bauer, Shinichi Nakajima, Klaus-Robert Müller

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

The structural support vector machine (SVM) is an elegant approach for building complex and accurate models with structured outputs. However, its applicability relies on the availability of efficient inference algorithms: the state-of-the-art training algorithms repeatedly perform inference to compute a subgradient or to find the most violating configuration. In this paper, we propose an exact inference algorithm for maximizing nondecomposable objectives that arise from a special type of high-order potential with a decomposable internal structure. As an important application, our method covers loss augmented inference, which enables the slack and margin scaling formulations of the structural SVM with a variety of dissimilarity measures, e.g., Hamming loss, precision and recall, Fβ-loss, intersection over union, and many other functions that can be computed efficiently from the contingency table. We demonstrate the advantages of our approach in natural language parsing and sequence segmentation applications.
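The abstract notes that many of the supported dissimilarity measures can be computed from the contingency table of a predicted labeling against the ground truth. A minimal illustrative sketch of that idea (not the paper's inference algorithm; the function names and binary-sequence setup are assumptions for the example) might look like:

```python
# Illustrative sketch: the losses named in the abstract (Hamming, F-beta,
# intersection over union) computed from the contingency table (TP, FP, FN, TN)
# of a binary predicted labeling vs. the ground truth. This is NOT the paper's
# exact inference method, only the loss-from-contingency-table observation.

def contingency_table(y_true, y_pred):
    """Count (TP, FP, FN, TN) for two equal-length binary label sequences."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def hamming_loss(tp, fp, fn, tn):
    """Fraction of mislabeled positions."""
    return (fp + fn) / (tp + fp + fn + tn)

def f_beta_loss(tp, fp, fn, tn, beta=1.0):
    """1 - F_beta, with F_beta = (1+b^2)*TP / ((1+b^2)*TP + b^2*FN + FP)."""
    b2 = beta * beta
    denom = (1 + b2) * tp + b2 * fn + fp
    # Convention: a perfect all-negative prediction counts as F_beta = 1.
    return 1.0 - ((1 + b2) * tp / denom if denom else 1.0)

def iou_loss(tp, fp, fn, tn):
    """1 - intersection over union of the positive labels."""
    denom = tp + fp + fn
    return 1.0 - (tp / denom if denom else 1.0)
```

For example, with `y_true = [1,1,1,1,1,0,0,0,0,0]` and `y_pred = [1,1,1,0,0,1,0,0,0,0]`, the table is (TP, FP, FN, TN) = (3, 1, 2, 4), giving Hamming loss 0.3, F1-loss 1/3, and IoU loss 0.5. Loss augmented inference then maximizes the model score plus such a loss over all output configurations, which is nontrivial precisely because these losses do not decompose over individual labels.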

Original language: English
Article number: 7547945
Pages (from-to): 2566-2579
Number of pages: 14
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 28
Issue number: 11
Publication status: Published - Nov 2017

Keywords

  • Dynamic programming
  • Graphical models
  • High-order potentials
  • Inference
  • Margin scaling (MS)
  • Slack scaling (SS)
  • Structural support vector machines (SVMs)
  • Structured output

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

