Efficient Exact Inference With Loss Augmented Objective in Structured Learning

Alexander Bauer, Shinichi Nakajima, Klaus Müller

Research output: Contribution to journal › Article

Abstract

Structural support vector machine (SVM) is an elegant approach for building complex and accurate models with structured outputs. However, its applicability relies on the availability of efficient inference algorithms: the state-of-the-art training algorithms repeatedly perform inference to compute a subgradient or to find the most violating configuration. In this paper, we propose an exact inference algorithm for maximizing nondecomposable objectives that arise from a special type of high-order potential having a decomposable internal structure. As an important application, our method covers loss-augmented inference, which enables the slack and margin scaling formulations of structural SVM with a variety of dissimilarity measures, e.g., Hamming loss, precision and recall, Fβ-loss, intersection over union, and many other functions that can be efficiently computed from the contingency table. We demonstrate the advantages of our approach in natural language parsing and sequence segmentation applications.
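As an illustrative sketch (not taken from the paper): the dissimilarity measures the abstract lists can all be computed from a binary contingency table (tp, fp, fn, tn) between a predicted labeling and the ground truth. The function names below are hypothetical helpers for illustration.

```python
def contingency(y_true, y_pred):
    """Count (tp, fp, fn, tn) for two equal-length binary label sequences."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def hamming_loss(tp, fp, fn, tn):
    """Fraction of positions where prediction and ground truth disagree."""
    return (fp + fn) / (tp + fp + fn + tn)

def f_beta_loss(tp, fp, fn, tn, beta=1.0):
    """1 - F_beta, where F_beta trades off precision against recall."""
    b2 = beta * beta
    denom = (1 + b2) * tp + b2 * fn + fp
    return 1.0 - ((1 + b2) * tp / denom if denom else 0.0)

def iou_loss(tp, fp, fn, tn):
    """1 - intersection over union of the positive labels."""
    denom = tp + fp + fn
    return 1.0 - (tp / denom if denom else 1.0)
```

Because each loss depends on the labeling only through the four table counts, a dynamic program can track those counts while decoding, which is the kind of decomposable internal structure the abstract refers to.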

Original language: English
Journal: IEEE Transactions on Neural Networks and Learning Systems
DOI: 10.1109/TNNLS.2016.2598721
Publication status: Accepted/In press - 2016 Aug 19

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

Cite this

Efficient Exact Inference With Loss Augmented Objective in Structured Learning. / Bauer, Alexander; Nakajima, Shinichi; Müller, Klaus.

In: IEEE Transactions on Neural Networks and Learning Systems, 19.08.2016.

Research output: Contribution to journal › Article

@article{d02c2a8984f64ff7985fc4da2f83e4cc,
title = "Efficient Exact Inference With Loss Augmented Objective in Structured Learning",
abstract = "Structural support vector machine (SVM) is an elegant approach for building complex and accurate models with structured outputs. However, its applicability relies on the availability of efficient inference algorithms: the state-of-the-art training algorithms repeatedly perform inference to compute a subgradient or to find the most violating configuration. In this paper, we propose an exact inference algorithm for maximizing nondecomposable objectives that arise from a special type of high-order potential having a decomposable internal structure. As an important application, our method covers loss-augmented inference, which enables the slack and margin scaling formulations of structural SVM with a variety of dissimilarity measures, e.g., Hamming loss, precision and recall, Fβ-loss, intersection over union, and many other functions that can be efficiently computed from the contingency table. We demonstrate the advantages of our approach in natural language parsing and sequence segmentation applications.",
author = "Alexander Bauer and Shinichi Nakajima and Klaus M{\"u}ller",
year = "2016",
month = "8",
day = "19",
doi = "10.1109/TNNLS.2016.2598721",
language = "English",
journal = "IEEE Transactions on Neural Networks and Learning Systems",
issn = "2162-237X",
publisher = "IEEE Computational Intelligence Society",

}

TY - JOUR

T1 - Efficient Exact Inference With Loss Augmented Objective in Structured Learning

AU - Bauer, Alexander

AU - Nakajima, Shinichi

AU - Müller, Klaus

PY - 2016/8/19

Y1 - 2016/8/19

N2 - Structural support vector machine (SVM) is an elegant approach for building complex and accurate models with structured outputs. However, its applicability relies on the availability of efficient inference algorithms: the state-of-the-art training algorithms repeatedly perform inference to compute a subgradient or to find the most violating configuration. In this paper, we propose an exact inference algorithm for maximizing nondecomposable objectives that arise from a special type of high-order potential having a decomposable internal structure. As an important application, our method covers loss-augmented inference, which enables the slack and margin scaling formulations of structural SVM with a variety of dissimilarity measures, e.g., Hamming loss, precision and recall, Fβ-loss, intersection over union, and many other functions that can be efficiently computed from the contingency table. We demonstrate the advantages of our approach in natural language parsing and sequence segmentation applications.

AB - Structural support vector machine (SVM) is an elegant approach for building complex and accurate models with structured outputs. However, its applicability relies on the availability of efficient inference algorithms: the state-of-the-art training algorithms repeatedly perform inference to compute a subgradient or to find the most violating configuration. In this paper, we propose an exact inference algorithm for maximizing nondecomposable objectives that arise from a special type of high-order potential having a decomposable internal structure. As an important application, our method covers loss-augmented inference, which enables the slack and margin scaling formulations of structural SVM with a variety of dissimilarity measures, e.g., Hamming loss, precision and recall, Fβ-loss, intersection over union, and many other functions that can be efficiently computed from the contingency table. We demonstrate the advantages of our approach in natural language parsing and sequence segmentation applications.

UR - http://www.scopus.com/inward/record.url?scp=84983036054&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84983036054&partnerID=8YFLogxK

U2 - 10.1109/TNNLS.2016.2598721

DO - 10.1109/TNNLS.2016.2598721

M3 - Article

AN - SCOPUS:84983036054

JO - IEEE Transactions on Neural Networks and Learning Systems

JF - IEEE Transactions on Neural Networks and Learning Systems

SN - 2162-237X

ER -