Path-Tracking Control Based on Deep ORB-SLAM2

Kai Tai Song, Song Qing Ou

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper proposes a mobile robot navigation control design based on a deep vSLAM approach. The deep vSLAM system integrates ORB-SLAM2 with a deep neural network (DNN) to enhance the localization performance of an embedded navigation system. An RGB-D camera is utilized to realize the deep vSLAM system. In the proposed deep vSLAM algorithm, the DNN accelerates the feature detection and feature matching steps of the original ORB-SLAM2. Further, we developed a path-tracking controller that navigates the robot along the planned path based on the vSLAM localization. Experiments are presented to validate the performance of the proposed method. The experimental results on a mobile robot show that the computation time of the DNN-based feature matching is reduced to 45.16% of that of the original ORB-SLAM2, and the path-tracking error on a 19.8 m square path is within 20 mm.
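The abstract does not specify the form of the path-tracking controller. As an illustration only, the sketch below assumes a standard pure-pursuit tracker that consumes the pose estimate from the vSLAM localization and outputs velocity commands for a differential-drive base; the function name and parameters (`lookahead`, `v_max`) are hypothetical and not taken from the paper.

```python
import math

def pure_pursuit_step(pose, path, lookahead=0.3, v_max=0.3):
    """One step of a simple pure-pursuit path tracker (illustrative sketch).

    pose : (x, y, theta) robot pose, e.g. the estimate from vSLAM localization
    path : list of (x, y) waypoints of the planned path
    returns (v, omega) linear/angular velocity commands for a differential drive
    """
    x, y, theta = pose
    # Find the waypoint nearest to the robot, then search forward for the
    # first waypoint at least `lookahead` metres away (the goal point).
    nearest = min(range(len(path)),
                  key=lambda i: math.hypot(path[i][0] - x, path[i][1] - y))
    goal = path[-1]
    for wx, wy in path[nearest:]:
        if math.hypot(wx - x, wy - y) >= lookahead:
            goal = (wx, wy)
            break
    # Express the goal point in the robot frame.
    dx, dy = goal[0] - x, goal[1] - y
    gx = math.cos(theta) * dx + math.sin(theta) * dy    # forward offset
    gy = -math.sin(theta) * dx + math.cos(theta) * dy   # lateral offset
    # Pure-pursuit curvature kappa = 2*gy / L^2 steers the lateral error to zero.
    L = max(math.hypot(gx, gy), 1e-6)
    kappa = 2.0 * gy / (L * L)
    v = v_max
    omega = v * kappa
    return v, omega
```

In practice such a controller would be called at a fixed rate with the latest vSLAM pose; the lookahead distance trades off tracking smoothness against corner-cutting on the square path described in the experiments.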

Original language: English
Title of host publication: 2020 International Automatic Control Conference, CACS 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728171982
DOIs
State: Published - 4 Nov 2020
Event: 2020 International Automatic Control Conference, CACS 2020 - Hsinchu, Taiwan
Duration: 4 Nov 2020 - 7 Nov 2020

Publication series

Name: 2020 International Automatic Control Conference, CACS 2020

Conference

Conference: 2020 International Automatic Control Conference, CACS 2020
Country: Taiwan
City: Hsinchu
Period: 4/11/20 - 7/11/20

Keywords

  • Mobile robots
  • path tracking control
  • RGB-D camera
  • Visual SLAM
