### Abstract

The support vector machine (SVM) is a popular learning method for binary classification. Standard SVMs treat all the data points equally, but in some practical problems it is more natural to assign different weights to observations from different classes. This leads to a broader class of learning, the so-called weighted SVMs (WSVMs), and one of their important applications is to estimate class probabilities besides learning the classification boundary. There are two parameters associated with the WSVM optimization problem: one is the regularization parameter and the other is the weight parameter. In this article, we first establish that the WSVM solutions are jointly piecewise-linear with respect to both the regularization and weight parameters. We then develop a state-of-the-art algorithm that can compute the entire trajectory of the WSVM solutions for every pair of the regularization parameter and the weight parameter at a feasible computational cost. The derived two-dimensional solution surface provides theoretical insight into the behavior of the WSVM solutions. Numerically, the algorithm can greatly facilitate the implementation of the WSVM and automate the selection process of the optimal regularization parameter. We illustrate the new algorithm on various examples. This article has online supplementary materials.
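The probability-estimation idea sketched in the abstract can be illustrated without the authors' path algorithm: fit a weighted SVM at each value of the weight parameter π on a grid, weighting the positive class by 1 − π and the negative class by π, so that the population classifier is sign(p(x) − π); the estimated probability for a point is then the value of π at which its predicted label flips. The sketch below is a brute-force grid approximation (not the solution-surface algorithm of the paper) using scikit-learn's `SVC` with `class_weight`; the toy data and grid are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy two-class data: class +1 centered at (1, 1), class -1 at (-1, -1).
X = np.vstack([rng.normal(1, 1, (50, 2)), rng.normal(-1, 1, (50, 2))])
y = np.hstack([np.ones(50), -np.ones(50)])

def wsvm_probability(x, C=1.0, grid=np.linspace(0.05, 0.95, 19)):
    """Estimate P(y = +1 | x) as the largest weight pi at which the
    weighted SVM still predicts +1.  Weighting the +1 class by 1 - pi
    and the -1 class by pi makes the population minimizer
    sign(p(x) - pi), so the label flips near pi = p(x)."""
    prob = 0.0
    for pi in grid:
        clf = SVC(kernel="linear", C=C, class_weight={1: 1 - pi, -1: pi})
        clf.fit(X, y)
        if clf.predict(x.reshape(1, -1))[0] == 1:
            prob = pi
    return prob

# A point deep in the +1 class should get a large estimate,
# and a point deep in the -1 class a small one.
print(wsvm_probability(np.array([1.0, 1.0])))
print(wsvm_probability(np.array([-1.0, -1.0])))
```

Each grid value requires refitting the SVM from scratch, which is exactly the cost the paper's piecewise-linear solution surface avoids: the path algorithm recovers the solutions for all (C, π) pairs without the grid.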

| Original language | English |
| --- | --- |
| Pages (from-to) | 383-402 |
| Number of pages | 20 |
| Journal | Journal of Computational and Graphical Statistics |
| Volume | 23 |
| Issue number | 2 |
| DOIs | https://doi.org/10.1080/10618600.2012.761139 |
| Publication status | Published - 2014 Jan 1 |
| Externally published | Yes |


### Keywords

- Binary classification
- Coefficient path algorithm
- Probability estimation
- SVM

### ASJC Scopus subject areas

- Statistics and Probability
- Discrete Mathematics and Combinatorics
- Statistics, Probability and Uncertainty

### Cite this

Shin, S. J., Wu, Y., & Zhang, H. H. (2014). Two-dimensional solution surface for weighted support vector machines. *Journal of Computational and Graphical Statistics*, *23*(2), 383-402. https://doi.org/10.1080/10618600.2012.761139

Research output: Contribution to journal › Article
