Abstract
An intrinsic property of real-aperture imaging is that the observations tend to be defocused. Researchers have exploited this artifact in an innovative manner for depth estimation, since the amount of defocus varies with depth in the scene. Various methods have been proposed to model defocus blur; we model the defocus process as the diffusion of heat. The diffusion process has traditionally been used in low-level vision problems such as smoothing, segmentation and edge detection. In this paper, a novel application of the diffusion principle is made to generate the defocus space of a scene: the set of all possible observations of that scene that can be captured with a physical lens system. Using the notion of defocus space, we estimate depth in the scene and also generate the corresponding fully focused, equivalent pin-hole image. The algorithm described here also brings out the equivalence of the two modalities, viz. depth from focus and depth from defocus, for structure recovery.
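The link between defocus and diffusion rests on a standard identity: evolving an image under the isotropic heat equation ∂u/∂t = ∇²u for time t is equivalent to convolving it with a Gaussian of standard deviation σ = √(2t), which is the common model for defocus blur. The sketch below illustrates this numerically in one dimension; it is an assumption-laden illustration of the general principle, not the paper's algorithm, and the function names, step size `dt`, and signal are chosen here for demonstration only.

```python
import numpy as np

def diffuse(u, t, dt=0.1):
    """Evolve a 1-D signal under the heat equation u_t = u_xx using an
    explicit finite-difference scheme with periodic boundaries.
    Stable for dt <= 0.5 when the grid spacing is 1."""
    steps = int(round(t / dt))
    for _ in range(steps):
        lap = np.roll(u, -1) - 2.0 * u + np.roll(u, 1)  # discrete Laplacian
        u = u + dt * lap
    return u

def gaussian_blur(u, sigma):
    """Circular convolution with a Gaussian kernel of std `sigma`, via FFT."""
    n = u.size
    x = np.arange(n)
    x = np.minimum(x, n - x)          # circular distance from index 0
    k = np.exp(-x**2 / (2.0 * sigma**2))
    k /= k.sum()
    return np.real(np.fft.ifft(np.fft.fft(u) * np.fft.fft(k)))

# Diffusing for time t should match a Gaussian blur with sigma = sqrt(2 t).
rng = np.random.default_rng(0)
signal = rng.standard_normal(256)
t = 4.0
diffused = diffuse(signal.copy(), t)
blurred = gaussian_blur(signal, np.sqrt(2.0 * t))
```

Because blur grows monotonically with diffusion time, depth (which sets the blur radius) can be read off as the diffusion time needed to match an observation, which is the intuition behind using diffusion to span the defocus space.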
Original language | English |
---|---|
Pages (from-to) | 311-319 |
Number of pages | 9 |
Journal | Pattern Recognition Letters |
Volume | 28 |
Issue number | 3 |
Early online date | 5 Jun 2006 |
DOIs | |
Publication status | Published - 1 Feb 2007 |
Keywords
- Depth from defocus
- Diffusion
- Shape estimation
- Spectral method
ASJC Scopus subject areas
- Software
- Signal Processing
- Computer Vision and Pattern Recognition
- Artificial Intelligence