Fast disparity estimation for 3DTV applications

Yu-Cheng Tseng*, Tian-Sheuan Chang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The depth estimation reference software (DERS) algorithm developed by the MPEG 3-D Video Coding group can produce high-quality disparity maps for 3DTV applications, but it suffers from high computational complexity due to its complicated graph-cut optimization. This paper therefore proposes a new fast disparity estimation algorithm that significantly reduces the computational complexity through a downsampled matching cost method. To address temporal consistency in videos, the paper proposes no-motion registration for the foreground-copy artifact and still-edge preservation for the flicker artifact. In addition, the proposed algorithm also handles the occlusion problem. Experimental results show that our algorithm generates disparity maps comparable to those of the DERS algorithm while taking only 10.8% of its execution time.
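The abstract does not spell out the downsampled matching cost method, so the following is only a minimal sketch of the general idea, assuming a simple sum-of-absolute-differences (SAD) cost volume computed at reduced resolution with winner-takes-all selection; the function name, decimation scheme, and upsampling step are illustrative and not the authors' implementation.

```python
import numpy as np

def downsampled_matching_cost(left, right, max_disp, factor=2):
    """Illustrative sketch only: build a SAD cost volume on
    downsampled grayscale views, pick disparities by winner-takes-all,
    and upsample the result back to full resolution. The paper's
    actual cost function, sampling, and refinement may differ."""
    # Downsample both views by plain decimation (an assumption;
    # a filtered subsampling could equally be used).
    l_small = left[::factor, ::factor].astype(np.float32)
    r_small = right[::factor, ::factor].astype(np.float32)
    h, w = l_small.shape
    n_disp = max_disp // factor + 1

    cost = np.full((h, w, n_disp), np.inf, dtype=np.float32)
    for d in range(n_disp):
        # Absolute-difference cost for low-resolution disparity d.
        cost[:, d:, d] = np.abs(l_small[:, d:] - r_small[:, :w - d])

    # Winner-takes-all at low resolution; scale disparities and the
    # map itself back to the original grid (nearest-neighbor).
    disp_small = cost.argmin(axis=2).astype(np.float32) * factor
    disp_full = np.repeat(np.repeat(disp_small, factor, axis=0),
                          factor, axis=1)
    return disp_full[:left.shape[0], :left.shape[1]]
```

In this sketch, factor=2 shrinks the cost volume by roughly 8x (2x per spatial axis and 2x in disparity range), which is where the bulk of the complexity reduction would come from. The temporal-consistency tools named in the abstract are not detailed there either; no-motion registration might, for instance, reuse the previous frame's disparities wherever the inter-frame difference is near zero, but that reading is an assumption.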

Original language: English
Title of host publication: 2012 IEEE Visual Communications and Image Processing, VCIP 2012
DOIs: https://doi.org/10.1109/VCIP.2012.6410760
State: Published - 1 Dec 2012
Event: 2012 IEEE Visual Communications and Image Processing, VCIP 2012 - San Diego, CA, United States
Duration: 27 Nov 2012 - 30 Nov 2012

Publication series

Name: 2012 IEEE Visual Communications and Image Processing, VCIP 2012

Conference

Conference: 2012 IEEE Visual Communications and Image Processing, VCIP 2012
Country: United States
City: San Diego, CA
Period: 27/11/12 - 30/11/12

Keywords

  • Disparity estimation
  • Stereoscopic video


Cite this

Tseng, Y.-C., & Chang, T.-S. (2012). Fast disparity estimation for 3DTV applications. In 2012 IEEE Visual Communications and Image Processing, VCIP 2012 [6410760] (2012 IEEE Visual Communications and Image Processing, VCIP 2012). https://doi.org/10.1109/VCIP.2012.6410760