Principal weighted support vector machines for sufficient dimension reduction in binary classification

Seung Jun Shin, Yichao Wu, Hao Helen Zhang, Yufeng Liu

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

Sufficient dimension reduction is popular for reducing data dimensionality without stringent model assumptions. However, most existing methods may work poorly for binary classification. For example, sliced inverse regression (Li, 1991) can estimate at most one direction if the response is binary. In this paper we propose principal weighted support vector machines, a unified framework for linear and nonlinear sufficient dimension reduction in binary classification. Its asymptotic properties are studied, and an efficient computing algorithm is proposed. Numerical examples demonstrate its performance in binary classification.
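
The abstract above describes the method only briefly, so a minimal, illustrative sketch of the linear version of the idea is given below: fit weighted support vector machines over a grid of class weights and take the leading eigenvectors of the accumulated outer products of the fitted hyperplane normals as the estimated dimension-reduction directions. The weight grid, the use of scikit-learn's SVC, and the helper name pwsvm_directions are assumptions made for illustration only; this is a sketch, not the authors' implementation.

import numpy as np
from sklearn.svm import SVC

def pwsvm_directions(X, y, n_directions=1, weight_grid=None, C=1.0):
    """Illustrative linear sufficient dimension reduction from binary y in {-1, +1}."""
    if weight_grid is None:
        weight_grid = np.linspace(0.1, 0.9, 9)       # grid of class weights pi in (0, 1)
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize predictors
    M = np.zeros((X.shape[1], X.shape[1]))
    for pi in weight_grid:
        # Weighted hinge loss: errors on class +1 are weighted by pi, on class -1 by 1 - pi.
        svm = SVC(kernel="linear", C=C, class_weight={1: pi, -1: 1.0 - pi})
        svm.fit(Xs, y)
        psi = svm.coef_.ravel()
        psi /= np.linalg.norm(psi) + 1e-12           # unit normal of the fitted hyperplane
        M += np.outer(psi, psi)                      # accumulate outer products across weights
    # Leading eigenvectors of M span the estimated dimension-reduction subspace.
    _, eigvec = np.linalg.eigh(M)                    # eigh sorts eigenvalues in ascending order
    return eigvec[:, ::-1][:, :n_directions]

# Toy check: y depends on X only through its first coordinate.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = np.where(X[:, 0] + 0.3 * rng.normal(size=500) > 0, 1, -1)
print(pwsvm_directions(X, y, n_directions=1).ravel())  # should load mainly on the first coordinate

Roughly speaking, varying the class weight plays a role analogous to slicing in inverse-regression methods: each weight targets a different level set of pr(Y = 1 | X), which is what allows more than one direction to be recovered even though sliced inverse regression cannot do so for a binary response.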

Original language: English
Pages (from-to): 67-81
Number of pages: 15
Journal: Biometrika
Volume: 104
Issue number: 1
DOIs: 10.1093/biomet/asw057
Publication status: Published - 2017 Mar 1

Keywords

  • Fisher consistency
  • Hyperplane alignment
  • Reproducing kernel Hilbert space
  • Weighted support vector machine

ASJC Scopus subject areas

  • Statistics and Probability
  • Mathematics (all)
  • Agricultural and Biological Sciences (miscellaneous)
  • Agricultural and Biological Sciences (all)
  • Statistics, Probability and Uncertainty
  • Applied Mathematics

Cite this

Principal weighted support vector machines for sufficient dimension reduction in binary classification. / Shin, Seung Jun; Wu, Yichao; Zhang, Hao Helen; Liu, Yufeng.

In: Biometrika, Vol. 104, No. 1, 01.03.2017, p. 67-81.

@article{96f7bf2a3ed44012a2172c4b6a008f5f,
title = "Principal weighted support vector machines for sufficient dimension reduction in binary classification",
abstract = "Sufficient dimension reduction is popular for reducing data dimensionality without stringent model assumptions. However, most existing methods may work poorly for binary classification. For example, sliced inverse regression (Li, 1991) can estimate at most one direction if the response is binary. In this paper we propose principal weighted support vector machines, a unified framework for linear and nonlinear sufficient dimension reduction in binary classification. Its asymptotic properties are studied, and an efficient computing algorithm is proposed. Numerical examples demonstrate its performance in binary classification.",
keywords = "Fisher consistency, Hyperplane alignment, Reproducing kernel Hilbert space, Weighted support vector machine",
author = "Shin, {Seung Jun} and Yichao Wu and Zhang, {Hao Helen} and Yufeng Liu",
year = "2017",
month = "3",
day = "1",
doi = "10.1093/biomet/asw057",
language = "English",
volume = "104",
pages = "67--81",
journal = "Biometrika",
issn = "0006-3444",
publisher = "Oxford University Press",
number = "1",

}

TY - JOUR
T1 - Principal weighted support vector machines for sufficient dimension reduction in binary classification
AU - Shin, Seung Jun
AU - Wu, Yichao
AU - Zhang, Hao Helen
AU - Liu, Yufeng
PY - 2017/3/1
Y1 - 2017/3/1
N2 - Sufficient dimension reduction is popular for reducing data dimensionality without stringent model assumptions. However, most existing methods may work poorly for binary classification. For example, sliced inverse regression (Li, 1991) can estimate at most one direction if the response is binary. In this paper we propose principal weighted support vector machines, a unified framework for linear and nonlinear sufficient dimension reduction in binary classification. Its asymptotic properties are studied, and an efficient computing algorithm is proposed. Numerical examples demonstrate its performance in binary classification.
AB - Sufficient dimension reduction is popular for reducing data dimensionality without stringent model assumptions. However, most existing methods may work poorly for binary classification. For example, sliced inverse regression (Li, 1991) can estimate at most one direction if the response is binary. In this paper we propose principal weighted support vector machines, a unified framework for linear and nonlinear sufficient dimension reduction in binary classification. Its asymptotic properties are studied, and an efficient computing algorithm is proposed. Numerical examples demonstrate its performance in binary classification.
KW - Fisher consistency
KW - Hyperplane alignment
KW - Reproducing kernel Hilbert space
KW - Weighted support vector machine
UR - http://www.scopus.com/inward/record.url?scp=85019930767&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85019930767&partnerID=8YFLogxK
U2 - 10.1093/biomet/asw057
DO - 10.1093/biomet/asw057
M3 - Article
AN - SCOPUS:85019930767
VL - 104
SP - 67
EP - 81
JO - Biometrika
JF - Biometrika
SN - 0006-3444
IS - 1
ER -