The joint submission of the TU Berlin and Fraunhofer FIRST (TUBFI) to the ImageCLEF 2011 Photo Annotation Task

Alexander Binder, Wojciech Samek, Marius Kloft, Christina Müller, Klaus-Robert Müller, Motoaki Kawanabe

Research output: Contribution to journal › Conference article › peer-review

17 Citations (Scopus)

Abstract

In this paper we present details on the joint submission of TU Berlin and Fraunhofer FIRST to the ImageCLEF 2011 Photo Annotation Task. We sought to experiment with extensions of Bag-of-Words (BoW) models at several levels and to apply several kernel-based learning methods recently developed in our group. For classifier training we used non-sparse multiple kernel learning (MKL) and an efficient multi-task learning (MTL) heuristic based on MKL over kernels computed from classifier outputs. For multi-modal fusion we used a smoothing method on tag-based features inspired by Bag-of-Words soft mappings and Markov random walks. We submitted one multi-modal run that incorporated user tags and four purely visual runs based on Bag-of-Words models. Our best visual result, which used the MTL method, was ranked first according to mean average precision (MAP) among the purely visual submissions. Our multi-modal submission achieved the first rank by MAP among the multi-modal submissions and the best MAP among all submissions. Submissions by other groups, such as BPACAD, CAEN, UvA-ISIS, and LIRIS, followed closely.
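
For readers unfamiliar with non-sparse MKL, the sketch below illustrates only the kernel-combination idea under stated assumptions: several precomputed base kernels (here chi-square kernels on toy BoW histograms) are mixed with non-negative weights whose l_p-norm (p > 1) is fixed to 1, and a support vector machine is trained on the combined kernel. This is not the authors' implementation; the submission described above learns the weights jointly with the classifier (l_p-norm MKL), and all data, feature names, and parameter values below are hypothetical.

    # Minimal sketch of combining base kernels with non-sparse (l_p-normalised)
    # weights; NOT the authors' implementation, which learns the weights jointly
    # with the SVM. All data and parameters below are hypothetical.
    import numpy as np
    from sklearn.metrics.pairwise import chi2_kernel
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    # Toy stand-ins for BoW histograms computed from two different visual features.
    X_sift  = rng.random((100, 50))   # e.g. grey SIFT BoW histograms (hypothetical)
    X_color = rng.random((100, 30))   # e.g. colour-histogram BoW features (hypothetical)
    y = rng.integers(0, 2, 100)       # binary labels for one visual concept

    # Precomputed chi-square base kernels, a common choice for histogram features.
    kernels = [chi2_kernel(X_sift, X_sift), chi2_kernel(X_color, X_color)]

    # Non-sparse weighting: beta >= 0 with ||beta||_p = 1 for some p > 1, so no
    # base kernel is discarded entirely (uniform weights used as a placeholder).
    p = 2.0
    beta = np.ones(len(kernels))
    beta /= np.linalg.norm(beta, ord=p)

    # Combined kernel and SVM trained on it as a precomputed Gram matrix.
    K = sum(b * Km for b, Km in zip(beta, kernels))
    clf = SVC(kernel="precomputed", C=1.0).fit(K, y)
    print("training accuracy:", clf.score(K, y))

The point of the p > 1 constraint (as opposed to the sparse l_1 constraint of classical MKL) is that every feature channel keeps a non-zero contribution to the combined kernel, which is what "non-sparse" refers to in the abstract.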

Original language: English
Journal: CEUR Workshop Proceedings
Volume: 1177
Publication status: Published - 2011
Event: 2011 Cross Language Evaluation Forum Conference, CLEF 2011 - Amsterdam, Netherlands
Duration: 2011 Sep 19 - 2011 Sep 22

Keywords

  • Bag-of-words
  • Image classification
  • ImageCLEF
  • Multi-task learning
  • Multiple kernel learning
  • Photo annotation
  • Theseus

ASJC Scopus subject areas

  • Computer Science (all)
