Maximum confidence hidden Markov modeling for face recognition

Jen-Tzung Chien*, Chih Pin Liao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

41 Scopus citations

Abstract

This paper presents a hybrid framework of feature extraction and hidden Markov modeling (HMM) for two-dimensional pattern recognition. Importantly, we explore a new discriminative training criterion to ensure model compactness and discriminability. This criterion is derived from hypothesis-test theory by maximizing the confidence of accepting the hypothesis that observations are generated by target HMM states rather than competing HMM states. Accordingly, we develop maximum confidence hidden Markov modeling (MC-HMM) for face recognition. Under this framework, we incorporate a transformation matrix to extract discriminative facial features. Closed-form solutions for the continuous-density HMM parameters are formulated. Attractively, the hybrid MC-HMM parameters are estimated under the same criterion and converge through an expectation-maximization procedure. From experiments on the FERET and GTFD facial databases, we find that the proposed method obtains robust segmentation in the presence of different facial expressions, orientations, etc. In comparison with maximum likelihood and minimum classification error HMMs, the proposed MC-HMM achieves higher recognition accuracies with lower feature dimensions.
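The confidence measure described in the abstract can be illustrated, in simplified form, as a log-likelihood ratio between a target HMM and a competing HMM. The sketch below is not the paper's actual MC-HMM estimation; it only shows the hypothesis-test flavor of the criterion, assuming scalar observations, Gaussian emissions, and hypothetical parameter names (`pi`, `A`, `means`, `vars_`):

```python
import numpy as np

def forward_loglik(obs, pi, A, means, vars_):
    """Log-likelihood of a 1-D observation sequence under a
    Gaussian-emission HMM, via the scaled forward algorithm."""
    T = len(obs)
    # Emission densities b[t, j] = N(obs[t]; means[j], vars_[j])
    b = np.exp(-0.5 * (obs[:, None] - means) ** 2 / vars_) \
        / np.sqrt(2.0 * np.pi * vars_)
    alpha = pi * b[0]
    loglik = 0.0
    for t in range(T):
        if t > 0:
            alpha = (alpha @ A) * b[t]
        c = alpha.sum()        # scaling factor avoids numerical underflow
        loglik += np.log(c)
        alpha /= c
    return loglik

def confidence(obs, target, competitor):
    """Log-likelihood ratio: positive values favor the target model,
    echoing the accept/reject hypothesis test behind MC-HMM."""
    return forward_loglik(obs, *target) - forward_loglik(obs, *competitor)
```

In MC-HMM this kind of ratio is not merely evaluated at test time; the training criterion maximizes it jointly over the HMM parameters and the feature-extraction transform.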

Original language: English
Pages (from-to): 606-616
Number of pages: 11
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 30
Issue number: 4
DOIs
State: Published - 1 Apr 2008

Keywords

  • Classifier design and evaluation
  • Confidence measure
  • Discriminative feature extraction
  • Discriminative training
  • Face and gesture recognition
  • Face recognition
  • Hidden Markov model
  • Parameter learning
  • Pattern classification
  • Statistical

