Robust visual tracking control of mobile robots based on an error model in image plane

Chi-Yi Tsai*, Kai-Tai Song

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Scopus citations

Abstract

This paper presents a novel visual tracking control scheme that is robust to the quantization errors encountered in practical implementations. The proposed scheme is based on an error model of camera-object visual interaction in the image plane for unicycle mobile robots. To overcome quantization errors in practical systems, a necessary condition for ensuring global asymptotic stability of the closed-loop visual tracking system is derived through Lyapunov's direct method. A robust control law, based on Lyapunov theory, is then proposed to guarantee that the visual tracking system satisfies this stability condition. Experimental results verify the effectiveness of the proposed control scheme, both in terms of tracking performance and system convergence.
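The abstract summarizes the approach only at a high level; as a minimal, hypothetical sketch of the general idea (not the error model or control law derived in the paper), the Python snippet below maps a quantized image-plane error to unicycle velocity commands and applies a dead zone of one quantization step so that pixel quantization noise cannot drive the loop. All names, gains, and thresholds (`quantize`, `control_law`, `K_V`, `K_W`, `DEAD_ZONE`) are illustrative assumptions.

```python
import numpy as np

# Illustrative only: a generic image-plane tracking loop for a unicycle robot.
# The error model, gains, and quantization handling here are assumptions for
# illustration and are NOT the control law proposed in the paper.

PIXEL_SIZE = 1.0        # quantization step of the image sensor (pixels)
K_V = 0.002             # hypothetical translational gain
K_W = 0.01              # hypothetical rotational gain
DEAD_ZONE = PIXEL_SIZE  # ignore errors below the quantization bound

def quantize(value, step=PIXEL_SIZE):
    """Model the pixel quantization introduced by the camera."""
    return np.round(np.asarray(value, dtype=float) / step) * step

def control_law(target_px, desired_px):
    """Map the quantized image-plane error to unicycle commands (v, w).

    target_px, desired_px: (u, v) pixel coordinates of the tracked object
    and of its desired location in the image.
    """
    error = quantize(target_px) - np.asarray(desired_px, dtype=float)
    # Dead zone: do not react to errors smaller than one quantization step,
    # so quantization noise alone cannot excite the closed loop.
    error = np.where(np.abs(error) < DEAD_ZONE, 0.0, error)
    v = -K_V * error[1]   # vertical image error ~ distance to the object
    w = -K_W * error[0]   # horizontal image error ~ heading to the object
    return v, w

if __name__ == "__main__":
    v, w = control_law(target_px=(352.4, 198.7), desired_px=(320.0, 240.0))
    print(f"v = {v:.4f} m/s, w = {w:.4f} rad/s")
```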

Original language: English
Title of host publication: IEEE International Conference on Mechatronics and Automation, ICMA 2005
Pages: 1218-1223
Number of pages: 6
DOIs
State: Published - 17 Nov 2005
Event: IEEE International Conference on Mechatronics and Automation, ICMA 2005 - Niagara Falls, ON, Canada
Duration: 29 Jul 2005 - 1 Aug 2005

Publication series

Name: IEEE International Conference on Mechatronics and Automation, ICMA 2005

Conference

Conference: IEEE International Conference on Mechatronics and Automation, ICMA 2005
Country: Canada
City: Niagara Falls, ON
Period: 29/07/05 - 1/08/05

Keywords

  • Human-robot interaction
  • Mobile robots
  • Visual servoing
  • Visual tracking control
