Vision SLAM using omni-directional visual scan matching

Fu Sheng Huang*, Kai-Tai Song

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

7 Scopus citations

Abstract

Because of its 360° field of view, an omni-directional camera is well suited to detecting and tracking environmental features in mobile robot navigation applications. This study investigates simultaneous localization and mapping (SLAM) of a mobile robot using omni-directional images. A switching method for visual reference scans is proposed to enable fast visual scan matching in the SLAM design. In this method, new reference scans can be added to a database, and an existing reference scan can automatically be switched in as the current reference scan during the SLAM calculation. Visual reference scans can be reused to reduce the computational complexity of the extended Kalman filter (EKF) in the SLAM algorithm. Experimental results show a correct matching rate of 92.6% for landmark features. Indoor navigation experiments validate the proposed localization algorithm: an average localization error of 10 cm was achieved over a 30 m travel in an indoor environment using omni-directional images.
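
The sketch below illustrates the reference-scan switching idea described in the abstract: keep matching against the current reference scan, switch back to a stored reference when it matches better, and add a new reference only when no stored scan matches well. All class and function names, the matching score, and the thresholds are illustrative assumptions for exposition, not the authors' implementation.

```python
import numpy as np

# Assumed minimum match ratio for keeping or reusing a reference scan.
MATCH_THRESHOLD = 0.6

class ReferenceScanDatabase:
    """Hypothetical database of visual reference scans (feature-descriptor arrays)."""

    def __init__(self):
        self.scans = []            # stored reference scans, each an (N, D) descriptor array
        self.current_index = None  # index of the current reference scan

    def match_ratio(self, scan, reference):
        """Placeholder matching score: fraction of scan descriptors whose nearest
        reference descriptor lies within a fixed distance."""
        if len(reference) == 0:
            return 0.0
        dists = np.linalg.norm(scan[:, None, :] - reference[None, :, :], axis=2)
        return float(np.mean(dists.min(axis=1) < 0.5))

    def update(self, scan):
        """Return the index of the reference scan to match against, switching to a
        stored reference or adding a new one when the current reference no longer
        matches well."""
        if self.current_index is None:
            self.scans.append(scan)
            self.current_index = 0
            return self.current_index

        # Keep the current reference while it still matches the incoming scan.
        if self.match_ratio(scan, self.scans[self.current_index]) >= MATCH_THRESHOLD:
            return self.current_index

        # Otherwise try to switch to the best existing reference (reuse), and only
        # add a new reference scan if none of the stored scans match well enough.
        scores = [self.match_ratio(scan, ref) for ref in self.scans]
        best = int(np.argmax(scores))
        if scores[best] >= MATCH_THRESHOLD:
            self.current_index = best
        else:
            self.scans.append(scan)
            self.current_index = len(self.scans) - 1
        return self.current_index
```

Reusing stored reference scans in this way keeps the number of active landmarks bounded, which is what limits the growth of the EKF state and keeps the SLAM update tractable.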

Original language: English
Title of host publication: 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS
Pages: 1588-1593
Number of pages: 6
DOIs
State: Published - 1 Dec 2008
Event: 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS - Nice, France
Duration: 22 Sep 2008 - 26 Sep 2008

Publication series

Name: 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS

Conference

Conference: 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS
Country: France
City: Nice
Period: 22/09/08 - 26/09/08
