On defocus, diffusion and depth estimation

Vinay P. Namboodiri, Subhasis Chaudhuri

Research output: Contribution to journal › Article › peer-review

63 Citations (SciVal)

Abstract

An intrinsic property of real aperture imaging is that the observations tend to be defocused. Researchers have exploited this artifact for depth estimation, since the amount of defocus varies with depth in the scene. Various methods have been proposed to model the defocus blur. We model the defocus process using the model of diffusion of heat. The diffusion process has traditionally been used in low-level vision problems such as smoothing, segmentation and edge detection. In this paper, the diffusion principle is applied in a novel way to generate the defocus space of the scene. The defocus space is the set of all possible observations of a given scene that can be captured using a physical lens system. Using the notion of defocus space, we estimate the depth in the scene and also generate the corresponding fully focused equivalent pin-hole image. The algorithm described here also brings out the equivalence of the two modalities, viz. depth from focus and depth from defocus, for structure recovery.
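The core idea, modeling defocus blur as heat diffusion, rests on a standard equivalence: evolving the isotropic heat equation for time t is the same as convolving with a Gaussian of standard deviation sqrt(2t), so larger diffusion time corresponds to greater blur (and hence different depth). The sketch below illustrates this equivalence with an explicit finite-difference scheme; it is an assumed illustrative implementation, not the authors' algorithm.

```python
import numpy as np

def diffuse(img, t, dt=0.1):
    """Blur an image by evolving the isotropic heat equation
    u_t = laplacian(u) for total time t, using explicit finite
    differences with periodic boundaries. Diffusing for time t is
    equivalent to convolving with a Gaussian of sigma = sqrt(2*t),
    so t plays the role of the (depth-dependent) blur parameter.
    Note: dt must satisfy dt <= 0.25 for numerical stability."""
    u = img.astype(float).copy()
    steps = int(round(t / dt))
    for _ in range(steps):
        # 5-point discrete Laplacian via circular shifts
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
               + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        u += dt * lap
    return u

# A step edge: diffusion smooths it, mimicking defocus blur that
# grows with diffusion time (i.e., varies with scene depth).
edge = np.zeros((32, 32))
edge[:, 16:] = 1.0
blurred = diffuse(edge, t=2.0)
```

Because the scheme uses periodic boundaries, total image intensity is conserved, mirroring the fact that defocus redistributes light rather than removing it.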

Original language: English
Pages (from-to): 311-319
Number of pages: 9
Journal: Pattern Recognition Letters
Volume: 28
Issue number: 3
Early online date: 5 Jun 2006
DOIs
Publication status: Published - 1 Feb 2007

Keywords

  • Depth from defocus
  • Diffusion
  • Shape estimation
  • Spectral method

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
