Abundant inverse regression using sufficient reduction and its applications

Hyun Woo Kim, Brandon M. Smith, Nagesh Adluru, Charles R. Dyer, Sterling C. Johnson, Vikas Singh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Statistical models such as linear regression drive numerous applications in computer vision and machine learning. The landscape of practical deployments of these formulations is dominated by forward regression models that estimate the parameters of a function mapping a set of p covariates, x, to a response variable, y. The less known alternative, Inverse Regression, offers various benefits that are much less explored in vision problems. The goal of this paper is to show how Inverse Regression in the “abundant” feature setting (i.e., many subsets of features are associated with the target label or response, as is the case for images), together with a statistical construction called Sufficient Reduction, yields highly flexible models that are a natural fit for model estimation tasks in vision. Specifically, we obtain formulations that provide relevance of individual covariates used in prediction, at the level of specific examples/samples — in a sense, explaining why a particular prediction was made. With no compromise in performance relative to other methods, an ability to interpret why a learning algorithm is behaving in a specific way for each prediction, adds significant value in numerous applications. We illustrate these properties and the benefits of Abundant Inverse Regression on three distinct applications.
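To make the forward/inverse distinction in the abstract concrete, the toy sketch below (not from the paper; variable names and the rank-one data model are illustrative assumptions) contrasts a forward fit of y on the covariates x with an inverse fit of each covariate on y, which is then inverted to predict y:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: p covariates, each driven by a scalar response y
# (the "inverse" generative view: x_j depends on y).
n, p = 200, 5
y = rng.normal(size=n)
beta = rng.normal(size=p)                  # hypothetical per-covariate loading
X = np.outer(y, beta) + 0.1 * rng.normal(size=(n, p))

# Forward regression: fit y ~ x directly via ordinary least squares.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat_forward = X @ w

# Inverse regression: fit each covariate on y, i.e. E[x_j | y] = b_j * y,
# then predict y from a new x by least-squares inversion of x ≈ b * y.
b = X.T @ y / (y @ y)                      # slope of x_j regressed on y
y_hat_inverse = X @ b / (b @ b)            # invert the inverse model

corr_fwd = np.corrcoef(y, y_hat_forward)[0, 1]
corr_inv = np.corrcoef(y, y_hat_inverse)[0, 1]
```

In this sketch the per-covariate slopes `b` expose how strongly each covariate tracks the response, which is the kind of covariate-level relevance information the abstract attributes to the inverse formulation.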

Original language: English
Title of host publication: Computer Vision - 14th European Conference, ECCV 2016, Proceedings
Editors: Jiri Matas, Nicu Sebe, Max Welling, Bastian Leibe
Publisher: Springer Verlag
Pages: 570-584
Number of pages: 15
ISBN (Print): 9783319464862
DOI: 10.1007/978-3-319-46487-9_35
Publication status: Published - 2016 Jan 1
Externally published: Yes

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 9907 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Keywords

  • Abundant regression
  • Age estimation
  • Alzheimer’s disease
  • Inverse regression
  • Kernel regression
  • Temperature prediction

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

Cite this

Kim, H. W., Smith, B. M., Adluru, N., Dyer, C. R., Johnson, S. C., & Singh, V. (2016). Abundant inverse regression using sufficient reduction and its applications. In J. Matas, N. Sebe, M. Welling, & B. Leibe (Eds.), Computer Vision - 14th European Conference, ECCV 2016, Proceedings (pp. 570-584). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 9907 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-319-46487-9_35
