Frame difference history image for gait recognition

Chun Chieh Lee*, Chi Hung Chuang, Jun-Wei Hsieh, Ming Xuan Wu, Kuo Chin Fan

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

14 Scopus citations

Abstract

In this paper, we propose a simple but effective human identification method based on gait features using the frame difference history image (FDHI). Before constructing the FDHI feature, a sequence-based silhouette normalization scheme and an alignment pre-processing step are applied. A post-processing step is then devised to obtain more representative gait signatures for human identification. Two types of FDHI templates are extracted and represented more compactly via dimensionality reduction, i.e., Principal Component Analysis (PCA) followed by Linear Discriminant Analysis (LDA). The transformed feature vectors are then classified by individual K-Nearest Neighbor (KNN) classifiers, and the final classification decision is made by a fusion technique. Experimental results are provided to demonstrate the superiority of the proposed method.
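The pipeline described in the abstract (a motion-history-style template from frame differences, compressed by PCA then LDA, classified by KNN) can be sketched as follows. This is an illustrative sketch only: the paper's exact FDHI formulation, normalization, post-processing, and fusion rule are not given here, so the `fdhi` function below simply accumulates absolute frame differences over a silhouette sequence, and the data is synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

def fdhi(silhouettes):
    """Toy frame-difference history image: accumulate absolute
    differences between consecutive silhouette frames into one
    template. (The paper's exact FDHI definition may differ.)"""
    seq = np.asarray(silhouettes, dtype=float)     # (T, H, W)
    diffs = np.abs(np.diff(seq, axis=0))           # frame differences
    return diffs.mean(axis=0)                      # single (H, W) template

# Synthetic gallery: 5 subjects, 8 sequences each, 10 frames of 32x22 silhouettes.
rng = np.random.default_rng(0)
X, y = [], []
for subject in range(5):
    base = (rng.random((32, 22)) > 0.5).astype(float)  # subject-specific pattern
    for _ in range(8):
        # Simulated walking: horizontal shift per frame, plus per-sequence noise.
        seq = [np.roll(base, shift=t, axis=1)
               + 0.05 * rng.standard_normal((32, 22)) for t in range(10)]
        X.append(fdhi(seq).ravel())
        y.append(subject)
X, y = np.array(X), np.array(y)

# Dimensionality reduction as in the paper: PCA followed by LDA.
pca = PCA(n_components=20).fit(X)
lda = LinearDiscriminantAnalysis().fit(pca.transform(X), y)
Z = lda.transform(pca.transform(X))

# One KNN classifier per template type in the paper; one suffices for the sketch.
knn = KNeighborsClassifier(n_neighbors=1).fit(Z, y)
```

The paper extracts two FDHI template types and fuses the decisions of their separate KNN classifiers; here a single template type stands in for that two-branch design.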

Original language: English
Title of host publication: Proceedings of 2011 International Conference on Machine Learning and Cybernetics, ICMLC 2011
Pages: 1785-1788
Number of pages: 4
DOIs
State: Published - 7 Nov 2011
Event: 2011 International Conference on Machine Learning and Cybernetics, ICMLC 2011 - Guilin, Guangxi, China
Duration: 10 Jul 2011 - 13 Jul 2011

Publication series

Name: Proceedings - International Conference on Machine Learning and Cybernetics
Volume: 4
ISSN (Print): 2160-133X
ISSN (Electronic): 2160-1348

Conference

Conference: 2011 International Conference on Machine Learning and Cybernetics, ICMLC 2011
Country: China
City: Guilin, Guangxi
Period: 10/07/11 - 13/07/11

Keywords

  • Gait recognition
  • PCA/LDA
  • Video Surveillance

