Active in-hand object recognition on a humanoid robot

Björn Browatzki, Vadim Tikhanoff, Giorgio Metta, Heinrich Bülthoff, Christian Wallraven

Research output: Contribution to journal › Article

11 Citations (Scopus)

Abstract

For any robot, the ability to recognize and manipulate unknown objects is crucial for working successfully in natural environments. Object recognition and categorization are challenging problems, as 3-D objects often give rise to ambiguous 2-D views. Here, we present a perception-driven exploration and recognition scheme for in-hand object recognition implemented on the iCub humanoid robot. In this setup, the robot actively seeks out object views to optimize the exploration sequence. This is achieved by treating object recognition as a localization problem: we search for the most likely viewpoint position on the viewspheres of all objects. This problem can be solved efficiently using a particle filter that fuses visual cues with the associated motor actions. Based on the state of the filter, we predict the next-best viewpoint after each recognition step by searching for the action that yields the highest expected information gain. We conduct extensive evaluations of the proposed system in simulation as well as on the actual robot, and show the benefit of perception-driven exploration over passive, vision-only processing in discriminating between highly similar objects. We demonstrate that objects are recognized faster and, at the same time, with higher accuracy.
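
To make the approach concrete, below is a minimal, self-contained Python sketch of the two ideas described in the abstract: a particle filter over joint (object, viewpoint) hypotheses that fuses simulated visual observations with the motor actions that rotate the object, and a next-best-view step that picks the rotation with the highest expected information gain. This is not the authors' implementation; the object set, descriptors, noise levels, and candidate actions are hypothetical placeholders chosen only for illustration.

import numpy as np

rng = np.random.default_rng(0)

N_VIEWS = 36                       # discretised viewpoints per object (10-degree steps)
OBJECTS = ["cup", "mug"]           # two highly similar objects (hypothetical stand-ins)

# Placeholder appearance models: one 8-D descriptor per (object, viewpoint).
MODELS = {o: rng.normal(size=(N_VIEWS, 8)) for o in OBJECTS}
# Make the two objects nearly identical except at a few distinctive viewpoints.
MODELS["mug"] = MODELS["cup"] + 0.05 * rng.normal(size=(N_VIEWS, 8))
MODELS["mug"][5:9] += 2.0

def observe(obj, view):
    """Simulated noisy view descriptor (stands in for the robot's visual features)."""
    return MODELS[obj][view] + 0.3 * rng.normal(size=8)

def likelihood(desc, obj, view):
    """Observation likelihood of a descriptor under an (object, viewpoint) hypothesis."""
    d = np.linalg.norm(desc - MODELS[obj][view])
    return np.exp(-0.5 * d ** 2) + 1e-12

# Particles: (object index, viewpoint index) hypotheses with uniform weights.
N_P = 500
particles = [(rng.integers(len(OBJECTS)), rng.integers(N_VIEWS)) for _ in range(N_P)]
weights = np.ones(N_P) / N_P

def motion_update(particles, action):
    """Shift every particle by the executed rotation (in viewpoint steps), plus noise."""
    return [(o, (v + action + rng.integers(-1, 2)) % N_VIEWS) for o, v in particles]

def measurement_update(particles, weights, desc):
    """Fuse the current visual observation into the belief and resample."""
    w = np.array([likelihood(desc, OBJECTS[o], v) for o, v in particles]) * weights
    w /= w.sum()
    idx = rng.choice(N_P, size=N_P, p=w)
    return [particles[i] for i in idx], np.ones(N_P) / N_P

def object_posterior(particles, weights):
    """Marginalise the belief over viewpoints to get P(object)."""
    p = np.zeros(len(OBJECTS))
    for (o, _), w in zip(particles, weights):
        p[o] += w
    return p / p.sum()

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def expected_entropy(particles, weights, action, n_samples=20):
    """Monte-Carlo estimate of the object-posterior entropy after executing `action`,
    averaged over hypotheses drawn from the current belief."""
    shifted = [(o, (v + action) % N_VIEWS) for o, v in particles]
    h = 0.0
    for i in rng.choice(N_P, size=n_samples, p=weights):
        o_true, v_true = shifted[i]                  # pretend this hypothesis is true
        desc = observe(OBJECTS[o_true], v_true)      # simulate what we would then see
        w = np.array([likelihood(desc, OBJECTS[o], v) for o, v in shifted]) * weights
        w /= w.sum()
        h += entropy(object_posterior(shifted, w))
    return h / n_samples

def next_best_action(particles, weights, actions=(1, 3, 6, 9, 12)):
    """Pick the rotation with the lowest expected posterior entropy,
    i.e. the highest expected information gain about the object identity."""
    return min(actions, key=lambda a: expected_entropy(particles, weights, a))

# Active recognition loop: the simulated robot holds a "mug" at viewpoint 0.
true_obj, true_view = "mug", 0
for step in range(6):
    desc = observe(true_obj, true_view)
    particles, weights = measurement_update(particles, weights, desc)
    post = object_posterior(particles, weights)
    print(f"step {step}: P(object) =", dict(zip(OBJECTS, post.round(3))))
    if post.max() > 0.95:
        break
    a = next_best_action(particles, weights)         # perception-driven exploration
    true_view = (true_view + a) % N_VIEWS            # execute the in-hand rotation
    particles = motion_update(particles, a)

In a real system, the placeholder descriptors would be replaced by visual features extracted from the robot's cameras and the simulated rotation by the iCub's actual in-hand manipulation, but the overall structure of the filter and the information-gain criterion would remain similar.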

Original language: English
Article number: 6840371
Pages (from-to): 1260-1269
Number of pages: 10
Journal: IEEE Transactions on Robotics
Volume: 30
Issue number: 5
DOIs: 10.1109/TRO.2014.2328779
Publication status: Published - 2014 Oct 1

Fingerprint

Object recognition
End effectors
Robots

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Control and Systems Engineering
  • Computer Science Applications

Cite this

Active in-hand object recognition on a humanoid robot. / Browatzki, Björn; Tikhanoff, Vadim; Metta, Giorgio; Bülthoff, Heinrich; Wallraven, Christian.

In: IEEE Transactions on Robotics, Vol. 30, No. 5, 6840371, 01.10.2014, pp. 1260-1269.

Research output: Contribution to journal › Article

Browatzki, Björn ; Tikhanoff, Vadim ; Metta, Giorgio ; Bülthoff, Heinrich ; Wallraven, Christian. / Active in-hand object recognition on a humanoid robot. In: IEEE Transactions on Robotics. 2014 ; Vol. 30, No. 5. pp. 1260-1269.
@article{7891d6b341a74a5ea00e2b8a43f81bc9,
title = "Active in-hand object recognition on a humanoid robot",
abstract = "For any robot, the ability to recognize and manipulate unknown objects is crucial for working successfully in natural environments. Object recognition and categorization are challenging problems, as 3-D objects often give rise to ambiguous 2-D views. Here, we present a perception-driven exploration and recognition scheme for in-hand object recognition implemented on the iCub humanoid robot. In this setup, the robot actively seeks out object views to optimize the exploration sequence. This is achieved by treating object recognition as a localization problem: we search for the most likely viewpoint position on the viewspheres of all objects. This problem can be solved efficiently using a particle filter that fuses visual cues with the associated motor actions. Based on the state of the filter, we predict the next-best viewpoint after each recognition step by searching for the action that yields the highest expected information gain. We conduct extensive evaluations of the proposed system in simulation as well as on the actual robot, and show the benefit of perception-driven exploration over passive, vision-only processing in discriminating between highly similar objects. We demonstrate that objects are recognized faster and, at the same time, with higher accuracy.",
author = "Bj{\"o}rn Browatzki and Vadim Tikhanoff and Giorgio Metta and Heinrich B{\"u}lthoff and Christian Wallraven",
year = "2014",
month = "10",
day = "1",
doi = "10.1109/TRO.2014.2328779",
language = "English",
volume = "30",
pages = "1260--1269",
journal = "IEEE Transactions on Robotics",
issn = "1552-3098",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "5",

}

TY - JOUR

T1 - Active in-hand object recognition on a humanoid robot

AU - Browatzki, Björn

AU - Tikhanoff, Vadim

AU - Metta, Giorgio

AU - Bülthoff, Heinrich

AU - Wallraven, Christian

PY - 2014/10/1

Y1 - 2014/10/1

N2 - For any robot, the ability to recognize and manipulate unknown objects is crucial for working successfully in natural environments. Object recognition and categorization are challenging problems, as 3-D objects often give rise to ambiguous 2-D views. Here, we present a perception-driven exploration and recognition scheme for in-hand object recognition implemented on the iCub humanoid robot. In this setup, the robot actively seeks out object views to optimize the exploration sequence. This is achieved by treating object recognition as a localization problem: we search for the most likely viewpoint position on the viewspheres of all objects. This problem can be solved efficiently using a particle filter that fuses visual cues with the associated motor actions. Based on the state of the filter, we predict the next-best viewpoint after each recognition step by searching for the action that yields the highest expected information gain. We conduct extensive evaluations of the proposed system in simulation as well as on the actual robot, and show the benefit of perception-driven exploration over passive, vision-only processing in discriminating between highly similar objects. We demonstrate that objects are recognized faster and, at the same time, with higher accuracy.

AB - For any robot, the ability to recognize and manipulate unknown objects is crucial for working successfully in natural environments. Object recognition and categorization are challenging problems, as 3-D objects often give rise to ambiguous 2-D views. Here, we present a perception-driven exploration and recognition scheme for in-hand object recognition implemented on the iCub humanoid robot. In this setup, the robot actively seeks out object views to optimize the exploration sequence. This is achieved by treating object recognition as a localization problem: we search for the most likely viewpoint position on the viewspheres of all objects. This problem can be solved efficiently using a particle filter that fuses visual cues with the associated motor actions. Based on the state of the filter, we predict the next-best viewpoint after each recognition step by searching for the action that yields the highest expected information gain. We conduct extensive evaluations of the proposed system in simulation as well as on the actual robot, and show the benefit of perception-driven exploration over passive, vision-only processing in discriminating between highly similar objects. We demonstrate that objects are recognized faster and, at the same time, with higher accuracy.

UR - http://www.scopus.com/inward/record.url?scp=84907856102&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84907856102&partnerID=8YFLogxK

U2 - 10.1109/TRO.2014.2328779

DO - 10.1109/TRO.2014.2328779

M3 - Article

VL - 30

SP - 1260

EP - 1269

JO - IEEE Transactions on Robotics

JF - IEEE Transactions on Robotics

SN - 1552-3098

IS - 5

M1 - 6840371

ER -