Penalized principal logistic regression for sparse sufficient dimension reduction

Seung Jun Shin, Andreas Artemiou

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

Sufficient dimension reduction (SDR) is a successful tool for reducing the dimensionality of predictors by finding the central subspace, a minimal subspace of predictors that preserves all the regression information. When the predictor dimension is large, it is often assumed that only a small number of predictors are informative. In this regard, sparse SDR is desirable because it achieves variable selection and dimension reduction simultaneously. We propose principal logistic regression (PLR) as a new SDR tool and further develop its penalized version for sparse SDR. Asymptotic analysis shows that the penalized PLR enjoys the oracle property. Numerical investigation supports the advantageous performance of the proposed methods.
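The slicing-and-regression idea behind the method can be illustrated with a minimal sketch. This is not the authors' implementation: it slices the response into binary indicators, fits a sparsity-inducing logistic regression per slice (an L1 penalty stands in for the paper's max-SCAD penalty), and takes the leading right singular vector of the stacked coefficient matrix as the estimated SDR direction. All variable names and the data-generating model are illustrative assumptions.

```python
# Hedged sketch of a principal-logistic-regression-style SDR estimate.
# NOT the paper's algorithm: L1 substitutes for the max-SCAD penalty,
# and the single-index data model below is purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 500, 10
X = rng.standard_normal((n, p))
# Assumed single-index model: y depends on X only through X @ beta,
# so the central subspace is span(beta).
beta = np.zeros(p)
beta[:2] = [1.0, -1.0]
y = X @ beta + 0.1 * rng.standard_normal(n)

# Slice the response at several quantiles; each slice gives one
# binary label vector and one penalized logistic fit.
coefs = []
for q in (0.25, 0.5, 0.75):
    z = (y > np.quantile(y, q)).astype(int)
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    clf.fit(X, z)
    coefs.append(clf.coef_.ravel())

# Stack the slice-wise coefficients (slices x predictors); the leading
# right singular vector estimates a basis direction of the subspace.
B = np.vstack(coefs)
_, _, Vt = np.linalg.svd(B)
direction = Vt[0]
print(np.round(direction, 2))
```

Under this monotone single-index model the estimated direction should align (up to sign) with beta, with near-zero loadings on the eight uninformative predictors, which is the sparse-SDR behavior the abstract describes.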

Original language: English
Pages (from-to): 48-58
Number of pages: 11
Journal: Computational Statistics and Data Analysis
Volume: 111
DOI: 10.1016/j.csda.2016.12.003
Publication status: Published - 2017 Jul 1


Keywords

  • Max-SCAD penalty
  • Principal logistic regression
  • Sparse sufficient dimension reduction
  • Sufficient dimension reduction

ASJC Scopus subject areas

  • Statistics and Probability
  • Computational Theory and Mathematics
  • Computational Mathematics
  • Applied Mathematics

Cite this

Penalized principal logistic regression for sparse sufficient dimension reduction. / Shin, Seung Jun; Artemiou, Andreas.

In: Computational Statistics and Data Analysis, Vol. 111, 01.07.2017, p. 48-58.


@article{c29c4537aeb748e5800fbc671fd174e3,
title = "Penalized principal logistic regression for sparse sufficient dimension reduction",
abstract = "Sufficient dimension reduction (SDR) is a successful tool for reducing the dimensionality of predictors by finding the central subspace, a minimal subspace of predictors that preserves all the regression information. When the predictor dimension is large, it is often assumed that only a small number of predictors are informative. In this regard, sparse SDR is desirable because it achieves variable selection and dimension reduction simultaneously. We propose principal logistic regression (PLR) as a new SDR tool and further develop its penalized version for sparse SDR. Asymptotic analysis shows that the penalized PLR enjoys the oracle property. Numerical investigation supports the advantageous performance of the proposed methods.",
keywords = "Max-SCAD penalty, Principal logistic regression, Sparse sufficient dimension reduction, Sufficient dimension reduction",
author = "Shin, {Seung Jun} and Andreas Artemiou",
year = "2017",
month = "7",
day = "1",
doi = "10.1016/j.csda.2016.12.003",
language = "English",
volume = "111",
pages = "48--58",
journal = "Computational Statistics and Data Analysis",
issn = "0167-9473",
publisher = "Elsevier",

}
