Visual tracking control of a mobile robot using a new model in image plane

Chi Yi Tsai*, Kai-Tai Song

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

This paper presents a novel visual tracking model in the image plane for autonomous mobile robots. Based on this control model, a single visual tracking controller is proposed for both static and moving objects in a two-dimensional plane. In this design, the mobile robot is equipped with a fixed camera for environment detection, and the visual feedback is used to control the robot pose so that it tracks a moving object of interest. A novel state-space representation of the camera-object visual interaction model, fully defined in the image plane, is first derived; a closed-loop stabilizing control law is then designed. The visual tracking control scheme achieves exponential convergence of the closed-loop visual tracking system through the pole-placement method. Simulation results verify the effectiveness of the proposed control scheme, both in terms of tracking performance and system convergence.
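The abstract does not reproduce the paper's actual camera-object interaction model, so the following is only an illustrative sketch of the pole-placement idea it describes: for a hypothetical two-state discrete-time tracking-error system (matrices `A`, `B` and desired poles chosen here purely for illustration), a state-feedback gain computed via Ackermann's formula places the closed-loop poles inside the unit circle, which yields exponential convergence of the tracking error.

```python
import numpy as np

# Hypothetical discrete-time tracking-error dynamics (position, velocity);
# NOT the paper's image-plane model, just a minimal stand-in.
dt = 0.1
A = np.array([[1.0, dt],
              [0.0, 1.0]])
B = np.array([[0.0],
              [dt]])

# Desired closed-loop poles strictly inside the unit circle
# -> exponential decay of the error state.
desired_poles = (0.8, 0.7)

# Ackermann's formula for single-input pole placement:
#   K = [0 1] @ inv([B, A@B]) @ phi(A),
# where phi is the desired characteristic polynomial evaluated at A.
ctrb = np.hstack([B, A @ B])
phi_A = (A - desired_poles[0] * np.eye(2)) @ (A - desired_poles[1] * np.eye(2))
K = np.array([[0.0, 1.0]]) @ np.linalg.inv(ctrb) @ phi_A

# Closed-loop system x_{k+1} = (A - B K) x_k now has the desired poles.
A_cl = A - B @ K
print(np.sort(np.linalg.eigvals(A_cl).real))  # poles placed near 0.7 and 0.8

# Simulate: the tracking error converges exponentially to zero.
x = np.array([[1.0], [0.0]])
for _ in range(100):
    x = A_cl @ x
print(np.linalg.norm(x))  # small residual error after 100 steps
```

With both poles at magnitude below one, the error norm decays roughly like 0.8^k, which is the exponential-convergence property the abstract claims for the closed-loop visual tracking system.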

Original language: English
Title of host publication: 2005 International Conference on Advanced Robotics, ICAR '05, Proceedings
Pages: 540-545
Number of pages: 6
DOIs
State: Published - 1 Dec 2005
Event: 12th International Conference on Advanced Robotics, 2005. ICAR '05 - Seattle, WA, United States
Duration: 18 Jul 2005 - 20 Jul 2005

Publication series

Name: 2005 International Conference on Advanced Robotics, ICAR '05, Proceedings
Volume: 2005

Conference

Conference: 12th International Conference on Advanced Robotics, 2005. ICAR '05
Country: United States
City: Seattle, WA
Period: 18/07/05 - 20/07/05

Keywords

  • Autonomous robots
  • Human-robot interaction
  • Image-based robot control
  • Visual servoing
  • Visual tracking control

