Face tracking interaction control of a nonholonomic mobile robot

Chi-Yi Tsai*, Kai-Tai Song

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

10 Scopus citations

Abstract

This paper presents a novel face tracking control scheme for human-robot interaction control in the image plane. The scheme is robust to the velocity quantization error encountered in practical implementations. A visual tracking controller is designed to ensure global asymptotic stability of the closed-loop visual tracking system, based on an error-state control model in the image plane. To overcome the quantization uncertainty present in practical systems, an image-based robust control law is proposed that guarantees the stability of the robotic control system based on Lyapunov theory. This design provides a useful solution for smooth visual tracking control of slow-motion robots in a home setting. Simulation and experimental results verify the effectiveness of the proposed visual tracking control scheme, in terms of both tracking performance and system convergence.
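To make the abstract's idea of image-plane tracking with quantized velocity commands concrete, the following is a minimal sketch of such a loop. The gains, error model, image resolution, and quantization steps are illustrative assumptions, not the controller or robust control law from the paper.

```python
# Illustrative sketch only: a simple image-plane face-tracking loop with
# quantized velocity commands. Gains, the error model, and the quantization
# steps are assumptions for illustration, not the paper's control law.

IMG_W, IMG_H = 320, 240          # assumed image resolution
TARGET_FACE_WIDTH = 60.0         # desired face width in pixels (depth proxy)
K_V, K_W = 0.004, 0.01           # assumed proportional gains
V_STEP, W_STEP = 0.02, 0.05      # assumed velocity quantization steps (m/s, rad/s)


def quantize(value, step):
    """Round a commanded velocity to the nearest step the drive can realize."""
    return step * round(value / step)


def image_plane_control(face_cx, face_width):
    """Map image-plane tracking errors to quantized (v, w) commands.

    face_cx:    horizontal face-center coordinate in pixels
    face_width: detected face width in pixels
    """
    # Error state in the image plane: horizontal offset and size (depth) error.
    e_x = face_cx - IMG_W / 2.0
    e_s = TARGET_FACE_WIDTH - face_width

    # Simple proportional law (a stand-in for the paper's tracking controller).
    v = K_V * e_s        # move forward/backward to regulate apparent face size
    w = -K_W * e_x       # rotate to center the face horizontally

    # Quantization models the discrete velocity commands of a real drive unit,
    # i.e. the uncertainty the paper's robust control law is designed to handle.
    return quantize(v, V_STEP), quantize(w, W_STEP)


if __name__ == "__main__":
    # Example: face detected to the right of center and slightly too small.
    v_cmd, w_cmd = image_plane_control(face_cx=200.0, face_width=50.0)
    print(f"v = {v_cmd:.3f} m/s, w = {w_cmd:.3f} rad/s")
```

In this sketch the quantization step is what introduces the command error the abstract refers to; the paper's contribution is a Lyapunov-based law that keeps the closed loop stable despite it.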

Original language: English
Title of host publication: 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2006
Pages: 3319-3324
Number of pages: 6
DOIs
State: Published - 1 Dec 2006
Event: 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2006 - Beijing, China
Duration: 9 Oct 2006 - 15 Oct 2006

Publication series

Name: IEEE International Conference on Intelligent Robots and Systems

Conference

Conference: 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2006
Country: China
City: Beijing
Period: 9/10/06 - 15/10/06

Keywords

  • Human-robot interaction
  • Interaction control
  • Real-time face tracking
  • Visual tracking control

