Monocular vision and odometry-based SLAM using position and orientation of ceiling lamps

Seo Yeon Hwang, Jae Bok Song

Research output: Contribution to journal › Article › peer-review

9 Citations (Scopus)


This paper proposes a novel monocular vision-based SLAM (Simultaneous Localization and Mapping) method that uses both the position and orientation of ceiling lamps. Conventional approaches used corner or line features as landmarks in their SLAM algorithms, but these methods often failed to achieve stable navigation due to a lack of reliable visual features on the ceiling. Since lamps are usually installed at some distance from one another in indoor environments, they can be robustly detected and used as reliable landmarks. We use both the position and the orientation of a lamp feature to accurately estimate the robot pose; the orientation is obtained by computing the principal axis of the pixel distribution of the lamp area. Both corner and lamp features are used as landmarks in the EKF (Extended Kalman Filter) to increase the stability of the SLAM process. Experimental results show that the proposed scheme works successfully in various indoor environments.
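The orientation step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the lamp area has already been segmented into a set of pixel coordinates, and recovers the principal axis from the second central moments of that pixel distribution (equivalently, the dominant eigenvector of the 2×2 pixel covariance).

```python
import numpy as np

def lamp_orientation(pixels):
    """Estimate a lamp blob's orientation from its pixel coordinates.

    pixels: (N, 2) array of (x, y) image coordinates belonging to the
    segmented lamp area. Returns the angle (radians) of the principal
    axis, measured from the image x-axis.
    """
    pts = np.asarray(pixels, dtype=float)
    centered = pts - pts.mean(axis=0)
    # 2x2 covariance of the pixel distribution (second central moments)
    cov = centered.T @ centered / len(pts)
    mu20, mu02, mu11 = cov[0, 0], cov[1, 1], cov[0, 1]
    # Closed-form principal-axis angle from the second central moments
    return 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)

# Synthetic example: an elongated "lamp" region tilted by about 30 degrees
theta = np.deg2rad(30.0)
rng = np.random.default_rng(0)
axis_aligned = rng.normal(size=(500, 2)) * [10.0, 1.0]  # long in x, thin in y
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
blob = axis_aligned @ R.T + [320.0, 240.0]
print(np.rad2deg(lamp_orientation(blob)))  # close to 30
```

Because the principal axis is defined only up to 180°, a SLAM update using this measurement must handle the resulting angle ambiguity when associating it with the robot heading.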

Original language: English
Pages (from-to): 164-170
Number of pages: 7
Journal: Journal of Institute of Control, Robotics and Systems
Issue number: 2
Publication status: Published - 2011 Feb


Keywords

  • Ceiling
  • Mobile robot
  • Monocular camera
  • SLAM

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Applied Mathematics


