Insights from classifying visual concepts with multiple kernel learning

Alexander Binder, Shinichi Nakajima, Marius Kloft, Christina Müller, Wojciech Samek, Ulf Brefeld, Klaus Müller, Motoaki Kawanabe

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

Combining information from various image features has become a standard technique in concept recognition tasks. However, the optimal way of fusing the resulting kernel functions is usually unknown in practical applications. Multiple kernel learning (MKL) techniques allow one to determine an optimal linear combination of such similarity matrices. Classical approaches to MKL promote sparse mixtures. Unfortunately, 1-norm regularized MKL variants are often observed to be outperformed by an unweighted sum kernel. The main contributions of this paper are the following: we apply a recently developed non-sparse MKL variant to state-of-the-art concept recognition tasks from the application domain of computer vision. We provide insights into the benefits and limits of non-sparse MKL and compare it against its direct competitors, the sum-kernel SVM and sparse MKL. We report empirical results for the PASCAL VOC 2009 Classification and ImageCLEF2010 Photo Annotation challenge data sets. Data sets (kernel matrices) as well as further information are available at http://doc.ml.tuberlin.de/image_mkl/ (accessed 2012 Jun 25).
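
To make the kernel-combination step in the abstract concrete, the following is a minimal sketch, assuming scikit-learn and NumPy and using toy data: it contrasts the unweighted sum kernel (the baseline mentioned above) with a fixed weighted mixture of precomputed kernel matrices. The feature matrices, labels, and mixture weights beta are hypothetical placeholders; actually learning beta under an lp-norm constraint requires a dedicated MKL solver, which is not shown here.

```python
import numpy as np
from sklearn.svm import SVC  # assumes scikit-learn is available

# Toy setup: M hypothetical image-feature representations for n training images.
rng = np.random.default_rng(0)
n, M = 40, 3
features = [rng.standard_normal((n, 5)) for _ in range(M)]
kernels = [F @ F.T for F in features]              # one linear kernel matrix per feature
y = np.where(rng.standard_normal(n) > 0, 1, -1)    # toy binary concept labels

# Baseline discussed in the paper: the unweighted sum kernel (equal weights 1/M).
K_sum = sum(kernels) / M
svm_sum = SVC(kernel="precomputed", C=1.0).fit(K_sum, y)

# MKL instead learns non-negative mixture weights beta under a norm constraint,
#   K(beta) = sum_m beta_m * K_m,  beta >= 0,  ||beta||_p <= 1,
# where p = 1 yields the classical sparse MKL and p > 1 the non-sparse variant
# studied in the paper. The weights below are hand-picked for illustration only;
# obtaining them would require an MKL solver, which scikit-learn does not provide.
beta = np.array([0.5, 0.3, 0.2])
K_mix = sum(b * K for b, K in zip(beta, kernels))
svm_mkl = SVC(kernel="precomputed", C=1.0).fit(K_mix, y)

# At test time, the same weights are applied to the (n_test, n_train) kernel
# blocks before calling svm_mkl.predict(...).
```

The design point is that the SVM only ever sees a single combined Gram matrix, so the choice between the sum kernel, sparse MKL, and non-sparse MKL comes down entirely to how the weights beta are chosen.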

Original language: English
Article number: e38897
Journal: PLoS One
Volume: 7
Issue number: 8
DOI: https://doi.org/10.1371/journal.pone.0038897
Publication status: Published - 2012 Aug 24

ASJC Scopus subject areas

  • Agricultural and Biological Sciences (all)
  • Biochemistry, Genetics and Molecular Biology (all)
  • Medicine (all)

Cite this

Binder, A., Nakajima, S., Kloft, M., Müller, C., Samek, W., Brefeld, U., ... Kawanabe, M. (2012). Insights from classifying visual concepts with multiple kernel learning. PLoS One, 7(8), [e38897]. https://doi.org/10.1371/journal.pone.0038897

@article{7d0bfbf5b5484db095c449606e4ba161,
title = "Insights from classifying visual concepts with multiple kernel learning",
abstract = "Combining information from various image features has become a standard technique in concept recognition tasks. However, the optimal way of fusing the resulting kernel functions is usually unknown in practical applications. Multiple kernel learning (MKL) techniques allow to determine an optimal linear combination of such similarity matrices. Classical approaches to MKL promote sparse mixtures. Unfortunately, 1-norm regularized MKL variants are often observed to be outperformed by an unweighted sum kernel. The main contributions of this paper are the following: we apply a recently developed non-sparse MKL variant to state-of-the-art concept recognition tasks from the application domain of computer vision. We provide insights on benefits and limits of non-sparse MKL and compare it against its direct competitors, the sum-kernel SVM and sparse MKL. We report empirical results for the PASCAL VOC 2009 Classification and ImageCLEF2010 Photo Annotation challenge data sets. Data sets (kernel matrices) as well as further information are available at http://doc.ml.tuberlin.de/image_mkl/(Accessed 2012 Jun 25).",
author = "Alexander Binder and Shinichi Nakajima and Marius Kloft and Christina M{\"u}ller and Wojciech Samek and Ulf Brefeld and Klaus Muller and Motoaki Kawanabe",
year = "2012",
month = "8",
day = "24",
doi = "10.1371/journal.pone.0038897",
language = "English",
volume = "7",
journal = "PLoS One",
issn = "1932-6203",
publisher = "Public Library of Science",
number = "8",

}
