Layer-wise analysis of deep networks with Gaussian kernels

Grégoire Montavon, Mikio L. Braun, Klaus Muller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

Deep networks can potentially express a learning problem more efficiently than local learning machines. While deep networks outperform local learning machines on some problems, it is still unclear how their nice representation emerges from their complex structure. We present an analysis based on Gaussian kernels that measures how the representation of the learning problem evolves layer after layer as the deep network builds higher-level abstract representations of the input. We use this analysis to show empirically that deep networks build progressively better representations of the learning problem and that the best representations are obtained when the deep network discriminates only in the last layers.
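The abstract describes the analysis only in words. The sketch below illustrates one plausible reading of it, under the assumption that the layer-wise analysis amounts to building a Gaussian (RBF) kernel on each layer's representation and measuring how well the labels are captured by the leading kernel principal components. This is an illustration rather than the authors' exact procedure; the function names and the hyperparameters d and sigma are assumptions introduced here.

# Minimal sketch (assumed, not the paper's exact method): forward-propagate the data,
# then at each layer build an RBF kernel on that layer's representation, project the
# labels onto the top-d kernel principal components, and record the residual error.
# A decreasing error across layers corresponds to the claim that the representation
# of the learning problem improves layer after layer.

import numpy as np

def rbf_kernel(X, sigma):
    """Gaussian (RBF) kernel matrix between the rows of X."""
    sq_norms = np.sum(X ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def kernel_residual(X, Y, d, sigma):
    """Error left after projecting labels Y onto the top-d kernel principal components."""
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    H = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    Kc = H @ K @ H                            # centered kernel matrix
    _, eigvecs = np.linalg.eigh(Kc)           # eigenvalues in ascending order
    U = eigvecs[:, -d:]                       # top-d kernel principal components
    Yc = Y - Y.mean(axis=0)                   # centered labels (one-hot or real-valued)
    residual = Yc - U @ (U.T @ Yc)            # part of the labels the subspace misses
    return np.mean(residual ** 2)

def layerwise_analysis(layer_outputs, Y, d=10, sigma=1.0):
    """Residual error of the learning problem at each layer's representation."""
    return [kernel_residual(X, Y, d, sigma) for X in layer_outputs]

Here layer_outputs would be, for example, the list [X, h1(X), h2(h1(X)), ...] obtained by forward-propagating the dataset through the trained network; the choice of d and sigma, and all identifiers above, are illustrative and not taken from the paper.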

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
Publication status: Published - 2010 Dec 1
Externally published: Yes
Event: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 - Vancouver, BC, Canada
Duration: 2010 Dec 6 → 2010 Dec 9

Other

Other: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
Country: Canada
City: Vancouver, BC
Period: 10/12/6 → 10/12/9

Fingerprint

Learning systems

ASJC Scopus subject areas

  • Information Systems

Cite this

Montavon, G., Braun, M. L., & Muller, K. (2010). Layer-wise analysis of deep networks with Gaussian kernels. In Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010

Layer-wise analysis of deep networks with Gaussian kernels. / Montavon, Grégoire; Braun, Mikio L.; Muller, Klaus.

Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010. 2010.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Montavon, G, Braun, ML & Muller, K 2010, Layer-wise analysis of deep networks with Gaussian kernels. in Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010. 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010, Vancouver, BC, Canada, 10/12/6.
Montavon G, Braun ML, Muller K. Layer-wise analysis of deep networks with Gaussian kernels. In Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010. 2010
Montavon, Grégoire; Braun, Mikio L.; Muller, Klaus. / Layer-wise analysis of deep networks with Gaussian kernels. Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010. 2010.
@inproceedings{0f784a31352943e28e943db01e70e2b0,
title = "Layer-wise analysis of deep networks with Gaussian kernels",
abstract = "Deep networks can potentially express a learning problem more efficiently than local learning machines. While deep networks outperform local learning machines on some problems, it is still unclear how their nice representation emerges from their complex structure. We present an analysis based on Gaussian kernels that measures how the representation of the learning problem evolves layer after layer as the deep network builds higher-level abstract representations of the input. We use this analysis to show empirically that deep networks build progressively better representations of the learning problem and that the best representations are obtained when the deep network discriminates only in the last layers.",
author = "Gr{\'e}goire Montavon and Braun, {Mikio L.} and Klaus Muller",
year = "2010",
month = "12",
day = "1",
language = "English",
isbn = "9781617823800",
booktitle = "Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010",

}

TY - GEN

T1 - Layer-wise analysis of deep networks with Gaussian kernels

AU - Montavon, Grégoire

AU - Braun, Mikio L.

AU - Muller, Klaus

PY - 2010/12/1

Y1 - 2010/12/1

N2 - Deep networks can potentially express a learning problem more efficiently than local learning machines. While deep networks outperform local learning machines on some problems, it is still unclear how their nice representation emerges from their complex structure. We present an analysis based on Gaussian kernels that measures how the representation of the learning problem evolves layer after layer as the deep network builds higher-level abstract representations of the input. We use this analysis to show empirically that deep networks build progressively better representations of the learning problem and that the best representations are obtained when the deep network discriminates only in the last layers.

AB - Deep networks can potentially express a learning problem more efficiently than local learning machines. While deep networks outperform local learning machines on some problems, it is still unclear how their nice representation emerges from their complex structure. We present an analysis based on Gaussian kernels that measures how the representation of the learning problem evolves layer after layer as the deep network builds higher-level abstract representations of the input. We use this analysis to show empirically that deep networks build progressively better representations of the learning problem and that the best representations are obtained when the deep network discriminates only in the last layers.

UR - http://www.scopus.com/inward/record.url?scp=84860614044&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84860614044&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:84860614044

SN - 9781617823800

BT - Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010

ER -