Multipass stereo matching algorithm using high-curvature points on image profiles

Yuan Chih Peng*, Sheng-Jyh Wang

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review


In this paper, we propose a new algorithm for establishing correspondence between stereo images. The algorithm first applies two passes of feature-based matching to build a coarse disparity map; a dense disparity map is then generated by carefully matching the intensity information. Instead of the commonly used 'edge' points, the high-curvature points of the image profiles are chosen as the feature points to be matched. These high-curvature points can be easily extracted from the images by checking the 2nd derivatives of the intensity profiles. Because they faithfully capture the major characteristics of the profile shape, they avoid some of the ambiguities in feature matching. A dissimilarity measure, closely related to the profile shape, is then defined over these feature points. To reduce the ambiguity in local matching, the dynamic programming technique is used to achieve a globally optimal correspondence. After the feature matching, an intensity-based approach is used to establish the dense disparity map; both the sum-of-squared-differences (SSD) method and the dynamic programming method are used. By carefully checking the consistency between intensity continuity and disparity continuity, a fairly accurate disparity map can be generated efficiently even when the images lack texture.
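The feature-extraction step described above can be illustrated with a short sketch. The abstract states only that high-curvature points are found by checking the 2nd derivatives of the intensity profiles; the discrete derivative kernel, the magnitude threshold, and the local-maximum selection rule below are assumptions, not the paper's exact procedure.

```python
import numpy as np

def high_curvature_points(profile, threshold=10.0):
    """Find high-curvature points on a 1-D image intensity profile.

    Sketch of the idea in the paper: flag locations where the 2nd
    derivative of the profile is large. The threshold value and the
    local-maximum criterion here are illustrative assumptions.
    """
    profile = np.asarray(profile, dtype=float)
    # Discrete 2nd derivative: f''(x) ~ f(x-1) - 2 f(x) + f(x+1)
    d2 = np.convolve(profile, [1.0, -2.0, 1.0], mode="same")
    mag = np.abs(d2)
    points = []
    for x in range(1, len(profile) - 1):
        # Keep local maxima of |f''| above the threshold
        if mag[x] >= threshold and mag[x] >= mag[x - 1] and mag[x] >= mag[x + 1]:
            points.append(x)
    return points

# A ramp between two flat regions: curvature peaks at the two corners,
# while the interior of the ramp (constant slope) yields no features.
profile = [10] * 8 + [10, 40, 70, 100] + [100] * 8
print(high_curvature_points(profile))  # → [8, 11]
```

Note that the corners of the ramp, not its edge-like interior, are detected; this reflects the abstract's point that such features track the profile shape more faithfully than plain edge points.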

Original language: English
Pages (from-to): 219-230
Number of pages: 12
Journal: Proceedings of SPIE - The International Society for Optical Engineering
State: Published - 1 Jan 1999
Event: Proceedings of the 1999 Stereoscopic Displays and Virtual Reality Systems VI - San Jose, CA, USA
Duration: 25 Jan 1999 - 28 Jan 1999

