Dynamic visual tracking control of a mobile robot with image noise and occlusion robustness

Chi-Yi Tsai*, Kai-Tai Song

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

16 Scopus citations

Abstract

This paper presents a robust visual tracking control design for a nonholonomic mobile robot equipped with a tilt camera. The design aims to allow the mobile robot to keep track of a dynamic moving target within the camera's field of view, even when the target is temporarily and fully occluded. To achieve this, a control system consisting of a visual tracking controller (VTC) and a visual state estimator (VSE) is proposed. A novel visual interaction model is derived to facilitate the design of both the VTC and the VSE. The VSE estimates the optimal target state and target image velocity in the image space; the VTC then computes the corresponding command velocities for the mobile robot in the world coordinate frame. The proposed VSE is not only robust against image noise but also overcomes the temporary occlusion problem. Computer simulations and practical experiments of a mobile robot tracking a moving target have been carried out to validate the performance and robustness of the proposed system.
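The abstract only outlines the VSE/VTC architecture and does not reproduce the paper's visual interaction model or estimator equations. The sketch below is a minimal illustration of that pipeline under stated assumptions: a constant-velocity Kalman filter stands in for the VSE (prediction-only updates during full occlusion), and a simple proportional mapping from image error to robot commands stands in for the VTC. All class names, gains, image size, and the convention of passing None for an occluded frame are hypothetical and are not taken from the paper.

```python
import numpy as np

class VisualStateEstimator:
    """Hypothetical VSE: constant-velocity Kalman filter on the target's
    image state [u, v, du, dv] (pixel position and image velocity)."""

    def __init__(self, dt, process_noise=1e-2, measurement_noise=4.0):
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)   # constant-velocity model
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)     # we only measure (u, v)
        self.Q = process_noise * np.eye(4)
        self.R = measurement_noise * np.eye(2)
        self.x = np.zeros(4)
        self.P = 10.0 * np.eye(4)

    def step(self, measurement=None):
        # Prediction step.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        if measurement is not None:
            # Target visible: correct the prediction with the noisy image measurement.
            z = np.asarray(measurement, dtype=float)
            y = z - self.H @ self.x
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(4) - K @ self.H) @ self.P
        # During temporary full occlusion (measurement is None) only the
        # prediction is kept, so the controller can track through the gap.
        return self.x.copy()


def visual_tracking_controller(x_est, image_center=(320.0, 240.0),
                               k_v=0.004, k_w=0.002, k_tilt=0.002):
    """Hypothetical VTC: proportional mapping from image-space error to
    linear velocity, angular velocity, and camera tilt-rate commands."""
    u, v, _du, _dv = x_est
    err_u = u - image_center[0]        # horizontal error -> robot heading
    err_v = v - image_center[1]        # vertical error -> approach / tilt
    velocity = -k_v * err_v            # linear velocity command [m/s]
    omega = -k_w * err_u               # angular velocity command [rad/s]
    tilt_rate = -k_tilt * err_v        # camera tilt rate command [rad/s]
    return velocity, omega, tilt_rate


if __name__ == "__main__":
    vse = VisualStateEstimator(dt=0.1)
    # Noisy synthetic track with a temporary full occlusion (frames 10-14).
    track = [(300.0 + 2 * k, 250.0 + k) for k in range(30)]
    for k, (u, v) in enumerate(track):
        z = None if 10 <= k < 15 else (u + np.random.randn(), v + np.random.randn())
        x_est = vse.step(z)
        v_cmd, w_cmd, tilt_cmd = visual_tracking_controller(x_est)
        print(f"frame {k:2d}  v={v_cmd:+.3f}  w={w_cmd:+.3f}  tilt={tilt_cmd:+.3f}")
```

The occlusion handling here is the generic prediction-only behavior of a Kalman filter; the paper's actual VSE is derived from its visual interaction model and may differ substantially.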

Original language: English
Pages (from-to): 1007-1022
Number of pages: 16
Journal: Image and Vision Computing
Volume: 27
Issue number: 8
DOIs
State: Published - 2 Jul 2009

Keywords

  • Nonholonomic mobile robots
  • Temporary partial/full occlusion
  • Visual interaction model
  • Visual state estimation
  • Visual tracking control
