Object recognition for SLAM in floor environments using a depth sensor

Hee Won Chae, Chansoo Park, Hyejun Yu, Jae-Bok Song

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

Depth sensors have been increasingly used for object recognition in recent years. However, it is challenging for simultaneous localization and mapping (SLAM) to exploit the forward-facing scenes captured by a depth sensor. To this end, we introduce an object recognition framework for SLAM in indoor environments based on the extraction of an object-level descriptor. The proposed object-level descriptor is computed from surface appearances acquired by a depth sensor, without any training. To express the surface normal distribution, a well-known descriptor, the fast point feature histogram (FPFH), with a small sampling radius is used to define basic shape elements: a plane, a cylinder, and a sphere. The object-level descriptor used to recognize objects is then built from these shape elements. Several experiments on arbitrary objects on the floor show that the proposed scheme is useful for object recognition and for generating the feature map.
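The shape-element idea in the abstract can be illustrated with a short, hypothetical NumPy sketch (not the authors' implementation, which uses the full FPFH descriptor): even the simplest FPFH-style angular feature, the distribution of angles between nearby surface normals, already separates a flat patch from a curved one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample a flat patch (constant normal) and a spherical patch
# (radial normals), each with 500 points.
plane_pts = np.column_stack([rng.uniform(-1, 1, (500, 2)), np.zeros(500)])
plane_normals = np.tile([0.0, 0.0, 1.0], (500, 1))

v = rng.normal(size=(500, 3))
sphere_pts = v / np.linalg.norm(v, axis=1, keepdims=True)
sphere_normals = sphere_pts.copy()  # on a unit sphere, normal == position

def neighbor_normal_angles(points, normals, k=10):
    """Angles between each point's normal and those of its k nearest
    neighbours -- a crude stand-in for one FPFH angular feature."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    idx = np.argsort(d, axis=1)[:, 1:k + 1]  # skip self at index 0
    cosang = np.einsum('ij,ikj->ik', normals, normals[idx])
    return np.arccos(np.clip(cosang, -1.0, 1.0))

ang_plane = neighbor_normal_angles(plane_pts, plane_normals)
ang_sphere = neighbor_normal_angles(sphere_pts, sphere_normals)

# A flat patch has parallel normals everywhere, so all angles collapse
# into the first bin; a curved patch spreads across several bins.
h_plane, _ = np.histogram(ang_plane, bins=16, range=(0, np.pi / 2), density=True)
h_sphere, _ = np.histogram(ang_sphere, bins=16, range=(0, np.pi / 2), density=True)
```

The actual FPFH bins three Darboux-frame angles per point pair rather than this single normal-to-normal angle, but the discriminative principle, that surface curvature shows up in the normal-angle statistics within a small sampling radius, is the same.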

Original language: English
Title of host publication: 2016 13th International Conference on Ubiquitous Robots and Ambient Intelligence, URAI 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 405-410
Number of pages: 6
ISBN (Electronic): 9781509008216
DOIs: 10.1109/URAI.2016.7734070
Publication status: Published - 2016 Oct 21
Event: 13th International Conference on Ubiquitous Robots and Ambient Intelligence, URAI 2016 - Xian, China
Duration: 2016 Aug 19 - 2016 Aug 22



Keywords

  • Depth sensor
  • Object recognition
  • SLAM

ASJC Scopus subject areas

  • Modelling and Simulation
  • Artificial Intelligence
  • Control and Optimization

Cite this

Chae, H. W., Park, C., Yu, H., & Song, J-B. (2016). Object recognition for SLAM in floor environments using a depth sensor. In 2016 13th International Conference on Ubiquitous Robots and Ambient Intelligence, URAI 2016 (pp. 405-410). [7734070] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/URAI.2016.7734070
