One-Eyed Stereo: A General Approach to Modeling 3-D Scene Geometry


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence. - 1979. - Vol. 8 (1986), No. 6, 1 June, pp. 730-741
Main author: Strat, T M (author)
Other authors: Fischler, M A
Format: Article
Language: English
Published: 1986
Access to the parent work: IEEE Transactions on Pattern Analysis and Machine Intelligence
Keywords: Journal Article
Description
Abstract: A single two-dimensional image is an ambiguous representation of the three-dimensional world (many different scenes could have produced the same image), yet the human visual system is extremely successful at recovering a qualitatively correct depth model from this type of representation. Workers in the field of computational vision have devised a number of distinct schemes that attempt to emulate this human capability; these schemes are collectively known as "shape from..." methods (e.g., shape from shading, shape from texture, or shape from contour). In this paper we contend that the distinct assumptions made in each of these schemes are tantamount to providing a second (virtual) image of the original scene, and that each of these approaches can be translated into a conventional stereo formalism. In particular, we show that it is frequently possible to structure the problem as one of recovering depth from a stereo pair consisting of the supplied perspective image (the original image) and a hypothesized orthographic image (the virtual image). We present a new algorithm of the form required to accomplish this type of stereo reconstruction task.
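The central geometric idea in the abstract (triangulating depth from a stereo pair made of the real perspective image and a hypothesized orthographic "virtual" image) can be illustrated with a minimal sketch. The sketch below is not the authors' published algorithm; it assumes the simplest configuration, in which both views share the same optical axis, the perspective camera has focal length f, and a point correspondence (u, v) <-> (x_o, y_o) between the two images is already given. Under those assumptions the orthographic image supplies X and Y directly, and the perspective projection equation u = f*X/Z then fixes the depth Z. The function name depth_from_one_eyed_stereo is invented here purely for illustration.

# Illustrative sketch only (assumed setup, not the published method).
# Perspective projection:   u = f * X / Z,   v = f * Y / Z
# Orthographic projection:  x_o = X,         y_o = Y
# Eliminating X (or Y):     Z = f * x_o / u  =  f * y_o / v

def depth_from_one_eyed_stereo(u, v, x_o, y_o, f):
    """Triangulate a 3-D point from one perspective/orthographic correspondence."""
    if abs(u) < 1e-9 and abs(v) < 1e-9:
        raise ValueError("point lies on the optical axis: depth is unconstrained")
    # Use whichever image coordinate is farther from the axis for better conditioning.
    z = f * x_o / u if abs(u) >= abs(v) else f * y_o / v
    return (x_o, y_o, z)   # reconstructed (X, Y, Z)

# Example: (u, v) = (2.0, 1.0) with f = 1.0 and a hypothesized orthographic
# position (x_o, y_o) = (4.0, 2.0) gives Z = 1.0 * 4.0 / 2.0 = 2.0.
print(depth_from_one_eyed_stereo(2.0, 1.0, 4.0, 2.0, 1.0))   # (4.0, 2.0, 2.0)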
Description: Date Completed 02.10.2012
Date Revised 12.11.2019
Published: Print
Citation Status: PubMed-not-MEDLINE
ISSN: 1939-3539