Real-Time Human Movement Retrieval and Assessment With Kinect Sensor

Min-Chun Hu, Chi-Wen Chen, Wen-Huang Cheng, Che-Han Chang, Jui-Hsin Lai, Ja-Ling Wu

Research output: Contribution to journal › Article

45 Scopus citations

Abstract

The difficulty of vision-based posture estimation has been greatly reduced with the aid of commercial depth cameras such as the Microsoft Kinect. However, much remains to be done to bridge the gap between human posture estimation and the understanding of human movements. Human movement assessment is an important technique for exercise learning in the field of healthcare. In this paper, we propose an action tutor system that enables the user to interactively retrieve a learning exemplar of the target action movement and to immediately acquire motion instructions while learning it in front of the Kinect. The proposed system is composed of two stages. In the retrieval stage, nonlinear time warping algorithms are designed to retrieve video segments similar to the query movement roughly performed by the user. In the learning stage, the user learns according to the selected video exemplar, and a motion assessment covering both static and dynamic differences is presented to the user in an effective and organized way, helping him/her perform the action movement correctly. Experiments are conducted on videos of ten action types, and the results show that the proposed human action descriptor is representative for action video retrieval and that the tutor system effectively helps the user learn action movements.
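The abstract names the techniques but not their details. As a rough sketch of how the two stages could fit together, the following Python assumes Kinect skeleton frames given as NumPy arrays of 3-D joint positions and uses classic dynamic time warping as a generic stand-in for the paper's nonlinear time warping algorithms; all names here (frame_descriptor, dtw_align, retrieve, assess, database) are hypothetical, and the descriptor and assessment measures are simplified illustrations, not the authors' method.

```python
import numpy as np


def frame_descriptor(joints, torso_idx=0):
    # Hypothetical per-frame pose descriptor: 3-D joint positions expressed
    # relative to a torso joint and scale-normalized, then flattened.
    # The paper's actual human action descriptor is not reproduced here.
    rel = joints - joints[torso_idx]
    scale = max(np.linalg.norm(rel, axis=1).max(), 1e-8)
    return (rel / scale).ravel()


def dtw_align(query, exemplar):
    # Classic dynamic time warping, a standard instance of nonlinear time
    # warping: fill a cumulative-cost table, then backtrack the optimal
    # frame-to-frame alignment path.
    n, m = len(query), len(exemplar)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(query[i - 1] - exemplar[j - 1])
            cost[i, j] = d + min(cost[i - 1, j - 1],  # match
                                 cost[i - 1, j],      # skip a query frame
                                 cost[i, j - 1])      # skip an exemplar frame
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = int(np.argmin([cost[i - 1, j - 1],
                              cost[i - 1, j],
                              cost[i, j - 1]]))
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return cost[n, m] / (n + m), path[::-1]


def retrieve(query_frames, database):
    # Retrieval stage: rank candidate exemplar segments by length-normalized
    # warped distance to the roughly performed query movement. `database`
    # maps segment names to lists of per-frame joint arrays (a hypothetical
    # layout).
    q = [frame_descriptor(f) for f in query_frames]
    scored = []
    for name, seg in database.items():
        dist, _ = dtw_align(q, [frame_descriptor(f) for f in seg])
        scored.append((dist, name))
    return sorted(scored)


def assess(user, exemplar, path):
    # Learning stage: illustrative stand-ins for the two kinds of feedback.
    # The static difference compares aligned poses; the dynamic difference
    # compares frame-to-frame motion along the warping path.
    static = np.mean([np.linalg.norm(user[i] - exemplar[j])
                      for i, j in path])
    dynamic = np.mean([np.linalg.norm((user[i2] - user[i1]) -
                                      (exemplar[j2] - exemplar[j1]))
                       for (i1, j1), (i2, j2) in zip(path, path[1:])])
    return static, dynamic
```

Length-normalizing the warped distance keeps segments of different durations comparable, which matters because the query movement is only roughly performed by the user.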

Original language: English
Article number: 6862031
Pages (from-to): 742-753
Number of pages: 12
Journal: IEEE Transactions on Cybernetics
Volume: 45
Issue number: 4
DOIs
State: Published - 1 Apr 2015

Keywords

  • Feature extraction
  • human action
  • human skeleton
  • motion assessment
  • nonlinear time warping
  • video retrieval
