Layer-wise analysis of deep networks with Gaussian kernels

Grégoire Montavon, Mikio L. Braun, Klaus-Robert Müller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

Deep networks can potentially express a learning problem more efficiently than local learning machines. While deep networks outperform local learning machines on some problems, it is still unclear how their nice representation emerges from their complex structure. We present an analysis based on Gaussian kernels that measures how the representation of the learning problem evolves layer after layer as the deep network builds higher-level abstract representations of the input. We use this analysis to show empirically that deep networks build progressively better representations of the learning problem and that the best representations are obtained when the deep network discriminates only in the last layers.
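
The abstract describes measuring, layer by layer, how well the learning problem can be read out from a Gaussian-kernel view of each layer's representation. As a rough illustration only (not the authors' exact procedure; `layer_error_curve`, `sigma`, and `max_dim` are hypothetical names), such an analysis could compute a Gaussian kernel on a layer's activations, extract the leading kernel principal components, and record the error of a least-squares fit of the labels as a function of the number of components:

```python
import numpy as np

def gaussian_kernel(X, sigma):
    # Pairwise squared Euclidean distances -> Gaussian (RBF) kernel matrix.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def layer_error_curve(X_layer, Y, sigma, max_dim):
    """Error of fitting targets Y (shape (n,) or (n, k)) from the leading
    d kernel principal components of one layer's representation X_layer."""
    n = len(X_layer)
    K = gaussian_kernel(X_layer, sigma)
    # Center the kernel matrix in feature space.
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    # Eigendecomposition; columns of U are kernel PCA directions.
    w, U = np.linalg.eigh(Kc)
    U = U[:, np.argsort(w)[::-1]]  # sort by decreasing eigenvalue
    errors = []
    for d in range(1, max_dim + 1):
        P = U[:, :d]
        Y_hat = P @ (P.T @ Y)      # least-squares fit on d components
        errors.append(np.mean((Y - Y_hat) ** 2))
    return errors
```

Under these assumptions, comparing the resulting error curves across successive layers would indicate whether the task becomes expressible with fewer kernel principal components as depth increases, which is the kind of layer-wise evolution the abstract refers to.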

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
Publication status: Published - 2010 Dec 1
Externally published: Yes
Event: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 - Vancouver, BC, Canada
Duration: 2010 Dec 6 – 2010 Dec 9

ASJC Scopus subject areas

  • Information Systems

  • Cite this

    Montavon, G., Braun, M. L., & Müller, K.-R. (2010). Layer-wise analysis of deep networks with Gaussian kernels. In Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010.