The scale of a texture and its application to segmentation

Byung Woo Hong*, Stefano Soatto, Kangyu Ni, Tony Chan

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

20 Scopus citations

Abstract

This paper examines the issue of scale in modeling texture for the purpose of segmentation. We propose a scale descriptor for texture and an energy minimization model to find the scale of a given texture at each location. For each pixel, we use the intensity distribution in a local patch around that pixel to determine the smallest size of the domain that can be used to generate neighboring patches. The energy functional we propose to minimize comprises three terms: the first measures the dissimilarity between neighboring patch distributions using the Wasserstein distance or the Kullback-Leibler divergence; the second maximizes the entropy of the local patch; and the third penalizes larger patch sizes at equal fidelity. Our experiments show that the proposed scale model successfully captures the intrinsic scale of texture at each location. We also apply our scale descriptor to improve texture segmentation based on histogram matching [15].
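The abstract's three-term energy can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the authors' implementation: it uses the Kullback-Leibler variant of the dissimilarity term, compares each candidate patch against its four axis-aligned neighbors, and picks the radius minimizing dissimilarity minus entropy plus a linear size penalty. The function names (`patch_hist`, `local_scale`) and the weights `lam`, `mu` are hypothetical choices for this sketch.

```python
import numpy as np

def patch_hist(img, cy, cx, r, bins=16):
    """Normalized intensity histogram of the (2r+1)x(2r+1) patch at (cy, cx)."""
    patch = img[max(cy - r, 0):cy + r + 1, max(cx - r, 0):cx + r + 1]
    h, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    h = h.astype(float) + 1e-8          # regularize to avoid log(0)
    return h / h.sum()

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

def entropy(p):
    """Shannon entropy of a discrete distribution."""
    return float(-np.sum(p * np.log(p)))

def local_scale(img, cy, cx, radii=(2, 4, 8, 16), lam=0.05, mu=1.0):
    """Pick the patch radius minimizing a sketch of the three-term energy:
       E(r) = sum_neighbors KL(p_r || q_r)  -  mu * H(p_r)  +  lam * r
    (dissimilarity to neighboring patches, negative entropy, size penalty)."""
    best_r, best_e = radii[0], np.inf
    for r in radii:
        p = patch_hist(img, cy, cx, r)
        # dissimilarity to the four neighboring patches, offset by r
        d = sum(kl(p, patch_hist(img, cy + dy, cx + dx, r))
                for dy, dx in ((r, 0), (-r, 0), (0, r), (0, -r)))
        e = d - mu * entropy(p) + lam * r
        if e < best_e:
            best_r, best_e = r, e
    return best_r
```

The Wasserstein variant mentioned in the abstract would replace `kl` with an earth-mover's distance between the two histograms; the rest of the energy is unchanged.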

Original language: English (US)
Title of host publication: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR
State: Published - 2008
Externally published: Yes
Event: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR - Anchorage, AK, United States
Duration: Jun 23, 2008 - Jun 28, 2008

Publication series

Name: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Control and Systems Engineering
