### Abstract

Statistical models such as linear regression drive numerous applications in computer vision and machine learning. The landscape of practical deployments of these formulations is dominated by forward regression models that estimate the parameters of a function mapping a set of p covariates, x, to a response variable, y. The lesser-known alternative, Inverse Regression, offers various benefits that remain largely unexplored in vision problems. The goal of this paper is to show how Inverse Regression in the "abundant" feature setting (i.e., many subsets of features are associated with the target label or response, as is the case for images), together with a statistical construction called Sufficient Reduction, yields highly flexible models that are a natural fit for model estimation tasks in vision. Specifically, we obtain formulations that provide the relevance of individual covariates used in prediction at the level of specific examples/samples; in a sense, they explain why a particular prediction was made. With no compromise in performance relative to other methods, the ability to interpret why a learning algorithm behaves in a specific way for each prediction adds significant value in numerous applications. We illustrate these properties and the benefits of Abundant Inverse Regression on three distinct applications.
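The paper's own estimators are not reproduced here, but the inverse-regression idea the abstract builds on can be illustrated with classic sliced inverse regression (Li, 1991), which estimates a sufficient-reduction subspace by modeling x as a function of y rather than y as a function of x. The sketch below is a minimal NumPy implementation of that generic technique, not of the authors' method; the function name and parameters are illustrative.

```python
import numpy as np

def sliced_inverse_regression(X, y, n_slices=10, n_directions=2):
    """Sliced inverse regression: recover directions spanning a
    sufficient-reduction subspace by averaging whitened covariates
    within slices of the response (the "inverse" direction)."""
    n, p = X.shape
    # Whiten the covariates via the inverse square root of their covariance.
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ inv_sqrt
    # Partition samples into slices by sorted response value.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Accumulate the between-slice covariance of the slice means.
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Top eigenvectors of M span the reduction subspace (in whitened
    # coordinates); map them back to the original coordinates.
    _, w_evecs = np.linalg.eigh(M)
    return inv_sqrt @ w_evecs[:, ::-1][:, :n_directions]
```

On data generated as y = Xβ + noise, the leading estimated direction should align closely with β, which is the sense in which the reduction retains all the information the covariates carry about the response.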

Original language | English
---|---
Title of host publication | Computer Vision - 14th European Conference, ECCV 2016, Proceedings
Editors | Jiri Matas, Nicu Sebe, Max Welling, Bastian Leibe
Publisher | Springer Verlag
Pages | 570-584
Number of pages | 15
ISBN (Print) | 9783319464862
DOIs | https://doi.org/10.1007/978-3-319-46487-9_35
Publication status | Published - 2016 Jan 1
Externally published | Yes

### Publication series

Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
---|---
Volume | 9907 LNCS
ISSN (Print) | 0302-9743
ISSN (Electronic) | 1611-3349

### Keywords

- Abundant regression
- Age estimation
- Alzheimer’s disease
- Inverse regression
- Kernel regression
- Temperature prediction

### ASJC Scopus subject areas

- Theoretical Computer Science
- Computer Science (all)

### Cite this

Kim, H. W., Smith, B. M., Adluru, N., Dyer, C. R., Johnson, S. C., & Singh, V. (2016). **Abundant inverse regression using sufficient reduction and its applications.** In *Computer Vision - 14th European Conference, ECCV 2016, Proceedings* (pp. 570-584). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 9907 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-319-46487-9_35

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


TY - GEN

T1 - Abundant inverse regression using sufficient reduction and its applications

AU - Kim, Hyun Woo

AU - Smith, Brandon M.

AU - Adluru, Nagesh

AU - Dyer, Charles R.

AU - Johnson, Sterling C.

AU - Singh, Vikas

PY - 2016/1/1

Y1 - 2016/1/1

N2 - Statistical models such as linear regression drive numerous applications in computer vision and machine learning. The landscape of practical deployments of these formulations is dominated by forward regression models that estimate the parameters of a function mapping a set of p covariates, x, to a response variable, y. The less known alternative, Inverse Regression, offers various benefits that are much less explored in vision problems. The goal of this paper is to show how Inverse Regression in the “abundant” feature setting (i.e., many subsets of features are associated with the target label or response, as is the case for images), together with a statistical construction called Sufficient Reduction, yields highly flexible models that are a natural fit for model estimation tasks in vision. Specifically, we obtain formulations that provide relevance of individual covariates used in prediction, at the level of specific examples/samples — in a sense, explaining why a particular prediction was made. With no compromise in performance relative to other methods, an ability to interpret why a learning algorithm is behaving in a specific way for each prediction, adds significant value in numerous applications. We illustrate these properties and the benefits of Abundant Inverse Regression on three distinct applications.

KW - Abundant regression

KW - Age estimation

KW - Alzheimer’s disease

KW - Inverse regression

KW - Kernel regression

KW - Temperature prediction

UR - http://www.scopus.com/inward/record.url?scp=84990026591&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84990026591&partnerID=8YFLogxK

U2 - 10.1007/978-3-319-46487-9_35

DO - 10.1007/978-3-319-46487-9_35

M3 - Conference contribution

AN - SCOPUS:84990026591

SN - 9783319464862

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 570

EP - 584

BT - Computer Vision - 14th European Conference, ECCV 2016, Proceedings

A2 - Matas, Jiri

A2 - Sebe, Nicu

A2 - Welling, Max

A2 - Leibe, Bastian

PB - Springer Verlag

ER -