Generalizing analytic shrinkage for arbitrary covariance structures

Daniel Bartz, Klaus Muller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

11 Citations (Scopus)

Abstract

Analytic shrinkage is a statistical technique that offers a fast alternative to cross-validation for the regularization of covariance matrices and has appealing consistency properties. We show that the proof of consistency requires bounds on the growth rates of eigenvalues and their dispersion, which are often violated in data. We prove consistency under assumptions which do not restrict the covariance structure and therefore better match real-world data. In addition, we propose an extension of analytic shrinkage, orthogonal complement shrinkage, which adapts to the covariance structure. Finally, we demonstrate the superior performance of our novel approach on data from the domains of finance, spoken letter and optical character recognition, and neuroscience.
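The analytic shrinkage the abstract refers to is the Ledoit-Wolf estimator, which shrinks the sample covariance toward a scaled identity using a closed-form (rather than cross-validated) intensity. A minimal NumPy sketch of that baseline estimator (function name is illustrative; the paper's orthogonal complement extension is not shown):

```python
import numpy as np

def analytic_shrinkage(X):
    """Ledoit-Wolf analytic shrinkage of the sample covariance matrix.

    X : (n, p) data matrix, rows are observations.
    Returns the shrunk covariance estimate and the shrinkage intensity.
    """
    n, p = X.shape
    X = X - X.mean(axis=0)                        # center the data
    S = X.T @ X / n                               # sample covariance (MLE)

    mu = np.trace(S) / p                          # scale of the target mu * I
    d2 = np.sum((S - mu * np.eye(p)) ** 2) / p    # dispersion of S around the target

    # average squared distance of per-sample outer products from S
    b2_bar = sum(np.sum((np.outer(x, x) - S) ** 2) for x in X) / (n ** 2 * p)
    b2 = min(b2_bar, d2)

    lam = b2 / d2                                 # analytic shrinkage intensity in [0, 1]
    S_shrunk = lam * mu * np.eye(p) + (1 - lam) * S
    return S_shrunk, lam

# Example: n < p, so the sample covariance is singular but the shrunk one is not
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 100))
S_shrunk, lam = analytic_shrinkage(X)
```

Note that the shrinkage preserves the trace of the sample covariance exactly, while lifting the smallest eigenvalues away from zero, which is what makes the estimate well-conditioned when n < p.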

Original language: English
Title of host publication: Advances in Neural Information Processing Systems
Publisher: Neural information processing systems foundation
Publication status: Published - 2013
Event: 27th Annual Conference on Neural Information Processing Systems, NIPS 2013 - Lake Tahoe, NV, United States
Duration: 2013 Dec 5 - 2013 Dec 10

Other

Other: 27th Annual Conference on Neural Information Processing Systems, NIPS 2013
Country: United States
City: Lake Tahoe, NV
Period: 13/12/5 - 13/12/10

Fingerprint

  • Optical character recognition
  • Finance
  • Covariance matrix

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Bartz, D., & Muller, K. (2013). Generalizing analytic shrinkage for arbitrary covariance structures. In Advances in Neural Information Processing Systems Neural information processing systems foundation.

@inproceedings{e94eaba2d51843c187a0b5967b28e4b9,
title = "Generalizing analytic shrinkage for arbitrary covariance structures",
abstract = "Analytic shrinkage is a statistical technique that offers a fast alternative to cross-validation for the regularization of covariance matrices and has appealing consistency properties. We show that the proof of consistency requires bounds on the growth rates of eigenvalues and their dispersion, which are often violated in data. We prove consistency under assumptions which do not restrict the covariance structure and therefore better match real-world data. In addition, we propose an extension of analytic shrinkage, orthogonal complement shrinkage, which adapts to the covariance structure. Finally, we demonstrate the superior performance of our novel approach on data from the domains of finance, spoken letter and optical character recognition, and neuroscience.",
author = "Daniel Bartz and Klaus Muller",
year = "2013",
language = "English",
booktitle = "Advances in Neural Information Processing Systems",
publisher = "Neural information processing systems foundation",

}

TY - GEN

T1 - Generalizing analytic shrinkage for arbitrary covariance structures

AU - Bartz, Daniel

AU - Muller, Klaus

PY - 2013

Y1 - 2013

N2 - Analytic shrinkage is a statistical technique that offers a fast alternative to cross-validation for the regularization of covariance matrices and has appealing consistency properties. We show that the proof of consistency requires bounds on the growth rates of eigenvalues and their dispersion, which are often violated in data. We prove consistency under assumptions which do not restrict the covariance structure and therefore better match real-world data. In addition, we propose an extension of analytic shrinkage, orthogonal complement shrinkage, which adapts to the covariance structure. Finally, we demonstrate the superior performance of our novel approach on data from the domains of finance, spoken letter and optical character recognition, and neuroscience.

AB - Analytic shrinkage is a statistical technique that offers a fast alternative to cross-validation for the regularization of covariance matrices and has appealing consistency properties. We show that the proof of consistency requires bounds on the growth rates of eigenvalues and their dispersion, which are often violated in data. We prove consistency under assumptions which do not restrict the covariance structure and therefore better match real-world data. In addition, we propose an extension of analytic shrinkage, orthogonal complement shrinkage, which adapts to the covariance structure. Finally, we demonstrate the superior performance of our novel approach on data from the domains of finance, spoken letter and optical character recognition, and neuroscience.

UR - http://www.scopus.com/inward/record.url?scp=84898987337&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84898987337&partnerID=8YFLogxK

M3 - Conference contribution

BT - Advances in Neural Information Processing Systems

PB - Neural information processing systems foundation

ER -