A low memory zerotree coding for arbitrarily shaped objects

Chorng Yann Su*, Bing-Fei Wu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

20 Scopus citations

Abstract

The Set Partitioning In Hierarchical Trees (SPIHT) algorithm is a computationally simple and efficient zerotree coding technique for image compression. However, its high working-memory requirement is the main drawback for hardware realization. In this study, we present a low memory zerotree coder (LMZC), which requires much less working memory than SPIHT. The LMZC coding algorithm abandons the use of lists, defines a different tree structure, and merges the sorting pass and the refinement pass. The main techniques of LMZC are recursive programming and a top-bit scheme (TBS). In TBS, the top bits of the transformed coefficients are used to store the coding status of the coefficients, in place of the lists used in SPIHT. To achieve high coding efficiency, shape-adaptive discrete wavelet transforms are used to transform arbitrarily shaped objects. A compact emplacement of the transformed coefficients is also proposed to further reduce working memory. LMZC carefully treats "don't care" nodes in the wavelet tree and spends no bits coding such nodes. A comparison of LMZC with SPIHT shows that for coding a 768 × 512 color image, LMZC saves at least 5.3 MBytes of memory at the cost of a small increase in execution time and a minor reduction in peak signal-to-noise ratio (PSNR), making it highly promising for memory-limited applications.
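The top-bit scheme described in the abstract can be sketched as follows: coding status flags are packed into the high bits of each stored coefficient word rather than kept in auxiliary lists. This is a minimal illustrative sketch only, assuming 32-bit words with a sign bit, one status bit, and a 30-bit magnitude; the bit layout and names are hypothetical and not the paper's exact format.

```python
# Hypothetical top-bit scheme (TBS) sketch: store a coefficient's coding
# status in the top bits of its own word, avoiding SPIHT's separate lists.
SIGN_BIT = 1 << 31        # sign of the coefficient
STATUS_BIT = 1 << 30      # assumed status flag: 1 = already found significant
MAG_MASK = (1 << 30) - 1  # low 30 bits hold the magnitude

def pack(value: int) -> int:
    """Store a signed coefficient with the status flag cleared."""
    word = abs(value) & MAG_MASK
    if value < 0:
        word |= SIGN_BIT
    return word

def mark_significant(word: int) -> int:
    """Set the status flag, in place of adding the node to a list."""
    return word | STATUS_BIT

def is_significant(word: int) -> bool:
    """Test the status flag during later coding passes."""
    return bool(word & STATUS_BIT)

def magnitude(word: int) -> int:
    """Recover the coefficient magnitude, ignoring flag and sign bits."""
    return word & MAG_MASK

# Example: pack a coefficient, flag it, and read it back.
w = pack(-1234)
assert not is_significant(w)
w = mark_significant(w)
assert is_significant(w)
assert magnitude(w) == 1234
```

Because the flags live inside the coefficient array itself, the coder needs no list storage that grows with the number of significant coefficients, which is consistent with the memory savings the abstract reports.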

Original language: English
Pages (from-to): 271-282
Number of pages: 12
Journal: IEEE Transactions on Image Processing
Volume: 12
Issue number: 3
DOIs
State: Published - 1 Mar 2003

Keywords

  • Arbitrarily shaped image coding
  • Image compression
  • Low memory
  • Recursive programming
  • Shape adaptive zerotree coding

