Probability-enhanced sufficient dimension reduction for binary classification

Seung Jun Shin, Yichao Wu, Hao Helen Zhang, Yufeng Liu

Research output: Contribution to journal › Article › peer-review

23 Citations (Scopus)


In high-dimensional data analysis, it is of primary interest to reduce the data dimensionality without loss of information. Sufficient dimension reduction (SDR) arises in this context, and many successful SDR methods have been developed since the introduction of sliced inverse regression (SIR) [Li (1991) Journal of the American Statistical Association 86, 316-327]. Despite this rapid progress, most existing methods target regression problems with a continuous response. For binary classification problems, SIR suffers from the limitation of estimating at most one direction, since only two slices are available. In this article, we develop a new and flexible probability-enhanced SDR method for binary classification problems using the weighted support vector machine (WSVM). The key idea is to slice the data based on the conditional class probabilities of the observations rather than on their binary responses. We first show that the central subspace based on the conditional class probability is the same as that based on the binary response. This important result justifies the proposed slicing scheme from a theoretical perspective and ensures no loss of information. In practice, the true conditional class probability is generally not available, and probability estimation can be challenging for data with high-dimensional inputs. We observe that, to implement the new slicing scheme, exact probability values are not needed; the only required information is the relative order of the probability values. Motivated by this fact, our new SDR procedure bypasses the probability-estimation step and employs the WSVM to estimate the order of the probability values directly, based on which the slicing is performed. The performance of the proposed probability-enhanced SDR scheme is evaluated on both simulated and real data examples.
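The procedure described in the abstract can be sketched in a few steps: fit weighted SVMs over a grid of class weights π (by Fisher consistency, the sign of the weighted-SVM decision function at weight π roughly indicates whether P(Y=1|x) exceeds π), use the count of positive decisions as a proxy for the probability order, slice the data by that order, and run ordinary SIR on the slices. The sketch below is a minimal illustration under assumed simulated data, not the authors' implementation; it assumes scikit-learn's `SVC` with `class_weight` for the WSVM step, and all parameter choices (grid of π values, number of slices, sample size) are illustrative.

```python
# Minimal sketch of probability-enhanced SIR for binary classification.
# Assumptions: scikit-learn is available; a single-index logistic model is
# simulated so the true sufficient direction `beta` is known for checking.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, p, H = 400, 5, 4                       # samples, dimension, slice count
beta = np.array([1.0, -1.0, 0.0, 0.0, 0.0])
beta /= np.linalg.norm(beta)

X = rng.standard_normal((n, p))
prob = 1.0 / (1.0 + np.exp(-2.0 * (X @ beta)))   # true P(Y=1 | X)
y = (rng.random(n) < prob).astype(int)

# Step 1: estimate the ORDER of the class probabilities with a grid of
# weighted SVMs; f_pi(x) > 0 roughly when P(Y=1|x) > pi, so counting
# positive decisions over the grid ranks the observations.
rank_score = np.zeros(n)
for pi in np.linspace(0.1, 0.9, 9):
    clf = SVC(kernel="linear", C=1.0, class_weight={0: 1.0 - pi, 1: pi})
    clf.fit(X, y)
    rank_score += (clf.decision_function(X) > 0)

# Step 2: slice observations by the estimated probability order.
cuts = np.quantile(rank_score, np.linspace(0, 1, H + 1)[1:-1])
slices = np.digitize(rank_score, cuts)

# Step 3: standard SIR on the probability-based slices.
Xc = X - X.mean(axis=0)
Sigma = np.cov(Xc, rowvar=False)
evalS, evecS = np.linalg.eigh(Sigma)
Sig_inv_sqrt = evecS @ np.diag(evalS ** -0.5) @ evecS.T
Z = Xc @ Sig_inv_sqrt                      # standardized predictors

M = np.zeros((p, p))
for h in range(H):
    idx = slices == h
    if idx.any():
        mh = Z[idx].mean(axis=0)           # slice mean on the Z scale
        M += idx.mean() * np.outer(mh, mh)

# Leading eigenvector of M (mapped back to the X scale) estimates the
# single central-subspace direction.
_, V = np.linalg.eigh(M)
beta_hat = Sig_inv_sqrt @ V[:, -1]
beta_hat /= np.linalg.norm(beta_hat)

cos = abs(float(beta_hat @ beta))          # alignment with the true direction
```

In this sketch, only the ordering of `rank_score` matters, mirroring the paper's observation that exact probability values are unnecessary; with more slices than the two available from the raw binary response, SIR can in principle recover more than one direction.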

Original language: English
Pages (from-to): 546-555
Number of pages: 10
Issue number: 3
Publication status: Published - 2014 Sep 1
Externally published: Yes


Keywords

  • Binary classification
  • Conditional class probability
  • Fisher consistency
  • Sufficient dimension reduction
  • Weighted support vector machines (WSVMs)

ASJC Scopus subject areas

  • Statistics and Probability
  • Biochemistry, Genetics and Molecular Biology (all)
  • Immunology and Microbiology (all)
  • Agricultural and Biological Sciences (all)
  • Applied Mathematics


