Mobile robot localization using fusion of object recognition and range information

Byung Doo Yim, Yong Ju Lee, Jae-Bok Song, Woo Jin Chung

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

7 Citations (Scopus)

Abstract

Most present localization algorithms are either range- or vision-based. In many environments, a single type of sensor often cannot ensure successful localization; furthermore, using low-priced range sensors instead of expensive but accurate laser scanners often leads to poor performance. This paper proposes an MCL-based localization method that robustly estimates the robot pose by fusing the range information from a low-cost IR scanner with SIFT-based visual information gathered by a mono camera. With sensor fusion, the rough pose estimate from the range sensor is compensated by the vision sensor, and the slowness of object recognition is overcome by frequent updates of the range information. To synchronize the two sensors, which have different bandwidths, the encoder information gathered during object recognition is exploited. This paper also suggests a method for evaluating localization performance based on the normalized probability of a vision sensor model. Various experiments show that the proposed algorithm estimates the robot pose reasonably well and accurately evaluates the localization performance.
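
The abstract describes an MCL update in which a fast but rough IR range likelihood is applied every cycle, a slow SIFT-based object-recognition likelihood is applied whenever a recognition result arrives, and encoder odometry bridges the recognition delay. The sketch below illustrates that fusion pattern only; the particle representation, the Gaussian sensor models, and every function and parameter name are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the fusion pattern summarized in the abstract.
# All data structures, noise models, and names are illustrative assumptions.
import math
import random
from dataclasses import dataclass

@dataclass
class Particle:
    x: float
    y: float
    theta: float
    w: float = 1.0  # importance weight

def motion_update(particles, d_trans, d_rot, trans_noise=0.02, rot_noise=0.01):
    """Propagate particles with encoder odometry plus Gaussian noise.
    Also used to bridge the delay of slow object recognition: apply the
    odometry accumulated during recognition before the vision update."""
    for p in particles:
        p.theta += d_rot + random.gauss(0.0, rot_noise)
        p.x += (d_trans + random.gauss(0.0, trans_noise)) * math.cos(p.theta)
        p.y += (d_trans + random.gauss(0.0, trans_noise)) * math.sin(p.theta)

def gaussian(err, sigma):
    return math.exp(-0.5 * (err / sigma) ** 2)

def range_update(particles, measured_ranges, predict_ranges, sigma=0.10):
    """Fast, every-cycle update from the low-cost IR scanner.
    predict_ranges(p) is assumed to ray-cast the map from a particle pose."""
    for p in particles:
        for z, z_hat in zip(measured_ranges, predict_ranges(p)):
            p.w *= gaussian(z - z_hat, sigma)

def vision_update(particles, observed_bearing, object_xy, sigma=0.15):
    """Slow update from SIFT object recognition: compare the bearing at which
    a known map object was recognized with the bearing each particle predicts."""
    ox, oy = object_xy
    for p in particles:
        predicted = math.atan2(oy - p.y, ox - p.x) - p.theta
        err = math.atan2(math.sin(observed_bearing - predicted),
                         math.cos(observed_bearing - predicted))
        p.w *= gaussian(err, sigma)

def normalize(particles):
    """Normalize weights; the pre-normalization sum is a crude stand-in for the
    abstract's normalized-probability measure of localization quality."""
    total = sum(p.w for p in particles) or 1.0
    for p in particles:
        p.w /= total
    return total
```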

Original language: English
Title of host publication: Proceedings - IEEE International Conference on Robotics and Automation
Pages: 3533-3538
Number of pages: 6
ISBN (Print): 1424406021, 9781424406029
DOIs: https://doi.org/10.1109/ROBOT.2007.364019
Publication status: Published - 2007 Nov 27
Event: 2007 IEEE International Conference on Robotics and Automation, ICRA'07 - Rome, Italy
Duration: 2007 Apr 10 - 2007 Apr 14

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering

Cite this

Yim, B. D., Lee, Y. J., Song, J-B., & Chung, W. J. (2007). Mobile robot localization using fusion of object recognition and range information. In Proceedings - IEEE International Conference on Robotics and Automation (pp. 3533-3538). [4209637] https://doi.org/10.1109/ROBOT.2007.364019

