Shape from contours and multiple stereo - A hierarchical, mesh-based approach

Hendrik Kück*, Wolfgang Heidrich, Christian Vogelgsang

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Scopus citations

Abstract

We present a novel method for 3D shape recovery based on a combination of visual hull information and multi-image stereo. We start from a coarse triangle mesh extracted from visual hull information. The mesh is then hierarchically refined, and its vertex positions are optimized based on multi-image stereo information. This optimization procedure uses 3D graphics hardware to evaluate the quality of vertex positions, and takes color consistency, occlusion effects, and silhouette information into account. By working directly on a triangle mesh, we obtain more spatial coherence than algorithms based entirely on point information, such as voxel-based methods. This allows us to deal with objects that have very little structure in some places, as well as with small specular patches.
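
As a rough illustration of the vertex optimization step described in the abstract, the following is a minimal CPU-side sketch in Python/NumPy: each vertex is perturbed along its normal and scored with a simple photo-consistency cost, here the variance of the colors the candidate position projects to across the calibrated input views. The function names (photo_consistency, refine_vertex), the 3x4 projection-matrix convention, and the normal-line search are illustrative assumptions; the paper's actual method evaluates vertex quality on 3D graphics hardware and additionally handles occlusion and silhouette constraints, which this sketch omits.

```python
import numpy as np

def photo_consistency(point, images, cameras):
    """Hypothetical cost: variance of the colors that a candidate 3D
    point projects to across the input views (lower = more consistent).
    `cameras` are assumed to be 3x4 projection matrices, `images`
    HxWx3 arrays."""
    colors = []
    for img, P in zip(images, cameras):
        uvw = P @ np.append(point, 1.0)   # project to homogeneous pixel coords
        if uvw[2] <= 0:                   # behind the camera: skip this view
            continue
        u, v = uvw[:2] / uvw[2]
        h, w = img.shape[:2]
        if 0 <= u < w and 0 <= v < h:     # ignore views where the point is off-screen
            colors.append(img[int(v), int(u)])
    if len(colors) < 2:
        return np.inf                     # too few observations to score
    return float(np.var(np.asarray(colors, dtype=np.float64)))

def refine_vertex(vertex, normal, images, cameras, step=0.01, samples=7):
    """Search along the vertex normal for the most photo-consistent
    position within +/- `step` of the current estimate."""
    offsets = np.linspace(-step, step, samples)
    candidates = [vertex + t * normal for t in offsets]
    costs = [photo_consistency(c, images, cameras) for c in candidates]
    return candidates[int(np.argmin(costs))]
```

In the hierarchical scheme the abstract describes, a loop of this kind would run over all mesh vertices at each subdivision level, so coarse levels fix the overall shape before finer levels recover detail.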

Original language: English (US)
Title of host publication: Proceedings - 1st Canadian Conference on Computer and Robot Vision
Pages: 76-83
Number of pages: 8
State: Published - 2004
Externally published: Yes
Event: Proceedings - 1st Canadian Conference on Computer and Robot Vision - London, Ont., Canada
Duration: May 17 2004 - May 19 2004

Publication series

Name: Proceedings - 1st Canadian Conference on Computer and Robot Vision

Other

Other: Proceedings - 1st Canadian Conference on Computer and Robot Vision
Country/Territory: Canada
City: London, Ont.
Period: 05/17/04 - 05/19/04

Keywords

  • Hardware accelerated
  • Image based modeling
  • Mesh deformation
  • Multi view stereo
  • Surface reconstruction
  • Triangle meshes
  • Visual hull

ASJC Scopus subject areas

  • General Engineering
