Robust visual tracking control system of a mobile robot based on a dual-Jacobian visual interaction model

Chi-Yi Tsai, Kai-Tai Song*, Xavier Dutoit, Hendrik Van Brussel, Marnix Nuttin

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

23 Scopus citations


This paper presents a novel design of a robust visual tracking control system consisting of a visual tracking controller and a visual state estimator. The system facilitates human-robot interaction of a unicycle-modeled mobile robot equipped with a tilt camera. Based on a novel dual-Jacobian visual interaction model, a robust visual tracking controller is proposed to track a dynamic moving target. The proposed controller not only possesses a degree of robustness against system model uncertainties, but also tracks the target without requiring its 3D velocity information. The visual state estimator estimates the optimal system state and the target image velocity used by the visual tracking controller. To this end, a self-tuning Kalman filter is proposed to estimate the parameters of interest and to overcome the temporary occlusion problem. Furthermore, because the proposed method works entirely in image space, both the computational complexity and the sensor/camera modeling errors are reduced. Experimental results validate the effectiveness of the proposed method in terms of tracking performance, system convergence, and robustness.
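To make the estimator's role concrete, the following is a minimal sketch of a Kalman filter tracking a target's image position and velocity, with prediction-only coasting through a temporary occlusion. It is an illustration under assumptions, not the authors' design: the paper's filter is self-tuning (it adapts its noise statistics online), whereas this sketch uses a fixed constant-velocity model with hand-set noise covariances.

```python
# Illustrative sketch: a constant-velocity Kalman filter in image space,
# one plausible reading of the visual state estimator described above.
# State layout, noise values, and occlusion handling are assumptions for
# demonstration; the paper's filter additionally tunes its parameters online.
import numpy as np

class ImageSpaceKF:
    def __init__(self, dt=1.0, q=1.0, r=2.0):
        # State: [u, v, du, dv] -- target image position and velocity (pixels).
        self.x = np.zeros(4)
        self.P = np.eye(4) * 100.0          # large initial uncertainty
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt    # constant-velocity motion model
        self.H = np.eye(2, 4)               # only image position is measured
        self.Q = np.eye(4) * q              # process noise (assumed, fixed)
        self.R = np.eye(2) * r              # measurement noise (assumed, fixed)

    def step(self, z=None):
        # Predict.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update only when a measurement is available; during a temporary
        # occlusion (z is None) the filter coasts on its own prediction.
        if z is not None:
            y = np.asarray(z, float) - self.H @ self.x
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x.copy()
```

A typical use is to feed per-frame target detections in pixel coordinates, passing `None` on frames where the target is occluded; the estimated image velocity `(du, dv)` is exactly the quantity the abstract says the tracking controller consumes in place of the target's 3D velocity.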

Original language: English
Pages (from-to): 652-664
Number of pages: 13
Journal: Robotics and Autonomous Systems
Issue number: 6-7
State: Published - 30 Jun 2009


  • Kalman filter
  • Visual estimation
  • Visual interaction model
  • Visual tracking control

