Two-dimensional solution surface for weighted support vector machines

Seung Jun Shin, Yichao Wu, Hao Helen Zhang

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

The support vector machine (SVM) is a popular learning method for binary classification. Standard SVMs treat all the data points equally, but in some practical problems it is more natural to assign different weights to observations from different classes. This leads to a broader class of learning, the so-called weighted SVMs (WSVMs), and one of their important applications is to estimate class probabilities besides learning the classification boundary. There are two parameters associated with the WSVM optimization problem: one is the regularization parameter and the other is the weight parameter. In this article, we first establish that the WSVM solutions are jointly piecewise-linear with respect to both the regularization and weight parameters. We then develop a state-of-the-art algorithm that can compute the entire trajectory of the WSVM solutions for every pair of the regularization parameter and the weight parameter at a feasible computational cost. The derived two-dimensional solution surface provides theoretical insight into the behavior of the WSVM solutions. Numerically, the algorithm can greatly facilitate the implementation of the WSVM and automate the selection process of the optimal regularization parameter. We illustrate the new algorithm on various examples. This article has online supplementary materials.
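
The abstract refers to two tuning quantities: a regularization parameter controlling the penalty on the classifier, and a weight parameter controlling how heavily each class's hinge losses count. As a concrete picture of the WSVM idea, the sketch below refits a linear SVM on a grid of weight values pi and estimates P(Y = +1 | x) as the largest pi at which the point is still classified as +1. This is a brute-force illustration, not the authors' path-following algorithm; the scikit-learn SVC call, the C parameterization, and the weighting convention (1 - pi on the positive class, pi on the negative class) are assumptions made for this sketch.

# Minimal sketch (assumed setup, not the paper's algorithm): fit a weighted
# SVM on a toy problem for several weight values pi and estimate class
# probabilities from the weight at which the predicted label flips.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy two-class data: class +1 centered at (1, 1), class -1 at (-1, -1).
n = 100
X = np.vstack([rng.normal(1.0, 1.0, size=(n, 2)),
               rng.normal(-1.0, 1.0, size=(n, 2))])
y = np.hstack([np.ones(n, dtype=int), -np.ones(n, dtype=int)])

def fit_wsvm(X, y, C, pi):
    # Weight the hinge loss by 1 - pi on the positive class and pi on the
    # negative class; C plays the role of the inverse regularization parameter.
    clf = SVC(kernel="linear", C=C, class_weight={1: 1.0 - pi, -1: pi})
    return clf.fit(X, y)

def estimate_probability(x, X, y, C, grid=np.linspace(0.05, 0.95, 19)):
    # The population minimizer of the weighted hinge loss has sign(p(x) - pi),
    # so the largest pi at which x is still predicted +1 approximates
    # p(x) = P(Y = +1 | x). Brute-force refit at every grid value.
    x = np.asarray(x, dtype=float).reshape(1, -1)
    kept = [pi for pi in grid if fit_wsvm(X, y, C, pi).predict(x)[0] == 1]
    return max(kept) if kept else 0.0

print(estimate_probability([1.0, 1.0], X, y, C=1.0))    # close to 1
print(estimate_probability([0.0, 0.0], X, y, C=1.0))    # near 0.5
print(estimate_probability([-1.0, -1.0], X, y, C=1.0))  # close to 0

A path-following algorithm such as the one developed in the article delivers the exact breakpoints of the solution in both parameters without this repeated refitting, which is what makes the full two-dimensional solution surface computable at a feasible cost.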

Original language: English
Pages (from-to): 383-402
Number of pages: 20
Journal: Journal of Computational and Graphical Statistics
Volume: 23
Issue number: 2
DOIs: 10.1080/10618600.2012.761139
Publication status: Published - 2014 Jan 1
Externally published: Yes

Keywords

  • Binary classification
  • Coefficient path algorithm
  • Probability estimation
  • SVM

ASJC Scopus subject areas

  • Statistics and Probability
  • Discrete Mathematics and Combinatorics
  • Statistics, Probability and Uncertainty

Cite this

Two-dimensional solution surface for weighted support vector machines. / Shin, Seung Jun; Wu, Yichao; Zhang, Hao Helen.

In: Journal of Computational and Graphical Statistics, Vol. 23, No. 2, 01.01.2014, p. 383-402.

Research output: Contribution to journal › Article

@article{84d4f673231e4df0a9950f4b48a4a92e,
title = "Two-dimensional solution surface for weighted support vector machines",
abstract = "The support vector machine (SVM) is a popular learning method for binary classification. Standard SVMs treat all the data points equally, but in some practical problems it is more natural to assign different weights to observations from different classes. This leads to a broader class of learning, the so-called weighted SVMs (WSVMs), and one of their important applications is to estimate class probabilities besides learning the classification boundary. There are two parameters associated with the WSVM optimization problem: one is the regularization parameter and the other is the weight parameter. In this article, we first establish that the WSVM solutions are jointly piecewise-linear with respect to both the regularization and weight parameters. We then develop a state-of-the-art algorithm that can compute the entire trajectory of the WSVM solutions for every pair of the regularization parameter and the weight parameter at a feasible computational cost. The derived two-dimensional solution surface provides theoretical insight into the behavior of the WSVM solutions. Numerically, the algorithm can greatly facilitate the implementation of the WSVM and automate the selection process of the optimal regularization parameter. We illustrate the new algorithm on various examples. This article has online supplementary materials.",
keywords = "Binary classification, Coefficient path algorithm, Probability estimation, SVM",
author = "Shin, {Seung Jun} and Yichao Wu and Zhang, {Hao Helen}",
year = "2014",
month = "1",
day = "1",
doi = "10.1080/10618600.2012.761139",
language = "English",
volume = "23",
pages = "383--402",
journal = "Journal of Computational and Graphical Statistics",
issn = "1061-8600",
publisher = "American Statistical Association",
number = "2",

}

TY - JOUR

T1 - Two-dimensional solution surface for weighted support vector machines

AU - Shin, Seung Jun

AU - Wu, Yichao

AU - Zhang, Hao Helen

PY - 2014/1/1

Y1 - 2014/1/1

N2 - The support vector machine (SVM) is a popular learning method for binary classification. Standard SVMs treat all the data points equally, but in some practical problems it is more natural to assign different weights to observations from different classes. This leads to a broader class of learning, the so-called weighted SVMs (WSVMs), and one of their important applications is to estimate class probabilities besides learning the classification boundary. There are two parameters associated with the WSVM optimization problem: one is the regularization parameter and the other is the weight parameter. In this article, we first establish that the WSVM solutions are jointly piecewise-linear with respect to both the regularization and weight parameters. We then develop a state-of-the-art algorithm that can compute the entire trajectory of the WSVM solutions for every pair of the regularization parameter and the weight parameter at a feasible computational cost. The derived two-dimensional solution surface provides theoretical insight into the behavior of the WSVM solutions. Numerically, the algorithm can greatly facilitate the implementation of the WSVM and automate the selection process of the optimal regularization parameter. We illustrate the new algorithm on various examples. This article has online supplementary materials.

AB - The support vector machine (SVM) is a popular learning method for binary classification. Standard SVMs treat all the data points equally, but in some practical problems it is more natural to assign different weights to observations from different classes. This leads to a broader class of learning, the so-called weighted SVMs (WSVMs), and one of their important applications is to estimate class probabilities besides learning the classification boundary. There are two parameters associated with the WSVM optimization problem: one is the regularization parameter and the other is the weight parameter. In this article, we first establish that the WSVM solutions are jointly piecewise-linear with respect to both the regularization and weight parameters. We then develop a state-of-the-art algorithm that can compute the entire trajectory of the WSVM solutions for every pair of the regularization parameter and the weight parameter at a feasible computational cost. The derived two-dimensional solution surface provides theoretical insight into the behavior of the WSVM solutions. Numerically, the algorithm can greatly facilitate the implementation of the WSVM and automate the selection process of the optimal regularization parameter. We illustrate the new algorithm on various examples. This article has online supplementary materials.

KW - Binary classification

KW - Coefficient path algorithm

KW - Probability estimation

KW - SVM

UR - http://www.scopus.com/inward/record.url?scp=84901752896&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84901752896&partnerID=8YFLogxK

U2 - 10.1080/10618600.2012.761139

DO - 10.1080/10618600.2012.761139

M3 - Article

AN - SCOPUS:84901752896

VL - 23

SP - 383

EP - 402

JO - Journal of Computational and Graphical Statistics

JF - Journal of Computational and Graphical Statistics

SN - 1061-8600

IS - 2

ER -