Online Hierarchical Transformation of Hidden Markov Models for Speech Recognition

Jen-Tzung Chien*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

41 Scopus citations

Abstract

This paper proposes a novel framework of online hierarchical transformation of hidden Markov model (HMM) parameters for adaptive speech recognition. Our goal is to incrementally transform (or adapt) all the HMM parameters to a new acoustic environment even though most HMM units are unseen in the observed adaptation data. We establish a hierarchical tree of HMM units and apply the tree to dynamically search for the transformation parameters of individual HMM mixture components. In this paper, the transformation framework is formulated according to the approximate Bayesian estimate, in which the prior statistics and the transformation parameters can be jointly and incrementally refreshed after each consecutive block of adaptation data is presented. Under this formulation, only the refreshed prior statistics and the current block of data are needed for online transformation. In a series of speaker adaptation experiments on the recognition of 408 Mandarin syllables, we examine the effects of constructing various types of hierarchical trees. The efficiency and effectiveness of the proposed method for incremental adaptation of all HMM units are also confirmed. In addition, we demonstrate the superiority of the proposed online transformation over Huo's on-line adaptation [16] across a wide range of adaptation data amounts.
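The core idea sketched in the abstract — sharing transformation parameters across a tree of HMM units so that unseen mixtures borrow the transform of an ancestor cluster — can be illustrated with a minimal example. This is not the paper's implementation; the node names, the bias-vector (mean-shift) form of the transform, and the data-count threshold are all illustrative assumptions.

```python
# Illustrative sketch (not the paper's method): hierarchical lookup of
# transformation parameters for HMM mixture means. Each tree node holds a
# shared mean-shift vector estimated from the adaptation data that reached it;
# a mixture with too little data borrows the transform of its deepest
# sufficiently trained ancestor. Threshold and tree layout are hypothetical.

class Node:
    def __init__(self, name, bias=None, count=0, parent=None):
        self.name = name      # label of this cluster of HMM units
        self.bias = bias      # shared mean-shift transform (vector), or None
        self.count = count    # frames of adaptation data seen at this node
        self.parent = parent

def effective_transform(node, min_count=10):
    """Walk up the tree to the deepest ancestor with enough data."""
    while node is not None:
        if node.count >= min_count and node.bias is not None:
            return node.bias
        node = node.parent
    return [0.0, 0.0]  # no adaptation anywhere: identity (zero shift)

def adapt_mean(mean, node, min_count=10):
    """Shift a mixture mean by the transform found via its tree node."""
    bias = effective_transform(node, min_count)
    return [m + b for m, b in zip(mean, bias)]

# Toy 3-level tree: root -> broad phone class -> individual mixture component.
root = Node("all-units", bias=[0.5, -0.2], count=100)
vowels = Node("vowels", bias=[0.8, 0.1], count=30, parent=root)
seen_mix = Node("mix/a/1", bias=[1.0, 0.3], count=15, parent=vowels)
unseen_mix = Node("mix/o/2", bias=None, count=0, parent=vowels)

print(adapt_mean([2.0, 2.0], seen_mix))    # uses its own transform
print(adapt_mean([2.0, 2.0], unseen_mix))  # falls back to the vowel cluster
```

In the paper's online setting, the per-node statistics (here caricatured as `count` and `bias`) would be refreshed after each block of adaptation data, so the tree search naturally migrates from coarse shared transforms toward unit-specific ones as data accumulates.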

Original language: English
Pages (from-to): 656-667
Number of pages: 12
Journal: IEEE Transactions on Speech and Audio Processing
Volume: 7
Issue number: 6
DOIs
State: Published - 1 Dec 1999

Keywords

  • Approximate Bayesian estimate
  • EM algorithm
  • Hidden Markov models
  • Online hierarchical transformation
  • Speaker adaptation
  • Speech recognition

