Association pattern language modeling

Jen-Tzung Chien*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

26 Scopus citations


Statistical n-gram language modeling is popular for speech recognition and many other applications. The conventional n-gram, however, cannot adequately model long-distance language dependencies. This paper presents a novel approach that mines long-distance word associations and incorporates these features into language models based on linear interpolation and maximum entropy (ME) principles. We highlight the discovery of associations among multiple distant words from the training corpus. A mining algorithm recursively merges frequent word subsets to efficiently construct the set of association patterns. By combining the features of association patterns with n-gram models, the association pattern n-grams are estimated, with the trigger-pair n-gram as a special case in which only associations between two distant words are considered. In experiments on Chinese language modeling, we find that incorporating association patterns significantly reduces the perplexities of n-gram models. Incorporation using ME outperforms that using linear interpolation, and the association pattern n-gram is superior to the trigger-pair n-gram. Perplexities are further reduced using more association steps. Moreover, the proposed association pattern n-grams not only elevate document classification accuracies but also improve speech recognition rates.
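The two ingredients described above can be sketched in a minimal form: an Apriori-style pass that recursively merges frequent word subsets into larger association patterns, and a linear interpolation of a baseline n-gram probability with an association-pattern probability. This is an illustrative sketch under assumed simplifications (sentence-level co-occurrence, a toy corpus, a hypothetical `min_support` threshold), not the authors' exact algorithm.

```python
from collections import Counter

def mine_association_patterns(sentences, min_support=2, max_size=3):
    """Apriori-style sketch: keep word sets that co-occur in at least
    min_support sentences, growing patterns one word at a time."""
    # Level 1: frequent single words.
    counts = Counter(w for s in sentences for w in set(s))
    level = {frozenset([w]) for w, c in counts.items() if c >= min_support}
    patterns = set(level)
    for _ in range(max_size - 1):
        # Merge frequent sets that differ by exactly one word.
        candidates = {a | b for a in level for b in level
                      if len(a | b) == len(a) + 1}
        level = set()
        for cand in candidates:
            support = sum(1 for s in sentences if cand <= set(s))
            if support >= min_support:
                level.add(cand)
        patterns |= level
    return patterns

def interpolate(p_ngram, p_assoc, lam=0.7):
    # Linear interpolation of the baseline n-gram probability with the
    # association-pattern probability; lam is a tunable weight.
    return lam * p_ngram + (1 - lam) * p_assoc

# Toy corpus (hypothetical data for illustration only).
corpus = [
    ["stock", "market", "price", "rose"],
    ["stock", "price", "fell"],
    ["market", "price", "report"],
    ["stock", "market", "report"],
]
patterns = mine_association_patterns(corpus)
```

In this toy corpus, pairs such as {stock, price} survive the support threshold while rarer combinations are pruned; the ME alternative in the paper would instead train feature weights for these patterns rather than fix a single interpolation weight.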

Original language: English
Pages (from-to): 1719-1728
Number of pages: 10
Journal: IEEE Transactions on Audio, Speech and Language Processing
Issue number: 5
State: Published - 1 Sep 2006


  • Association pattern
  • Data mining
  • Language model
  • Long distance association
  • Maximum entropy and trigger pairs

