University of Warwick
Publications service & WRAP

Accurate depth from defocus estimation with video-rate implementation


Raj, Alex Noel Joseph (2009) Accurate depth from defocus estimation with video-rate implementation. PhD thesis, University of Warwick.

PDF: WRAP_THESIS_Raj_2009.pdf (6 MB)
Official URL: http://webcat.warwick.ac.uk/record=b2317955~S9


Abstract

The science of measuring depth from images at video rate using 'defocus' has been investigated. The method required two differently focused images acquired from a single viewpoint using a single camera. The relative blur between the images was used to determine the in-focus axial point of each pixel and hence its depth.
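The principle can be illustrated with a minimal sketch (not the thesis algorithm): once the near- and far-focused images are registered, a per-pixel comparison of local high-frequency energy indicates which image is sharper at each point, and it is this relative blur measure that a depth-from-defocus method maps to depth. The function name and the Laplacian-energy sharpness measure below are illustrative assumptions.

    import numpy as np
    from scipy.ndimage import laplace, uniform_filter

    def relative_blur_map(near, far, window=7, eps=1e-6):
        """Per-pixel measure in [-1, 1]: positive where the near-focused image
        is sharper, negative where the far-focused image is sharper."""
        # Local energy of the Laplacian is a simple proxy for local sharpness.
        sharp_near = uniform_filter(laplace(near.astype(float)) ** 2, window)
        sharp_far = uniform_filter(laplace(far.astype(float)) ** 2, window)
        return (sharp_near - sharp_far) / (sharp_near + sharp_far + eps)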
The depth estimation algorithm researched by Watanabe and Nayar was employed to recover the depth estimates, but the broadband filters, referred to as Rational filters, were designed using a new procedure: the Two Step Polynomial Approach. The filters designed by the new model were largely insensitive to object texture and were shown to model the blur more precisely than the previous method. Experiments with real planar images demonstrated a maximum RMS depth error of 1.18% for the proposed filters, compared to 1.54% for the previous design.
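As a hedged sketch of how such precomputed broadband filter kernels might be applied: in Watanabe and Nayar's published formulation, one filter is applied to the difference of the two images and another to their sum, and the per-pixel ratio of the responses varies monotonically with the normalized in-focus position. The kernel arrays gm_kernel and gp_kernel below are placeholders for a chosen design and are not taken from the thesis.

    import numpy as np
    from scipy.signal import convolve2d

    def normalized_depth(near, far, gm_kernel, gp_kernel, eps=1e-6):
        """Ratio of filter responses on the difference and sum images,
        used as a normalized (monotonic) depth measure."""
        diff = near.astype(float) - far.astype(float)
        summ = near.astype(float) + far.astype(float)
        m = convolve2d(diff, gm_kernel, mode="same")   # blur-difference response
        p = convolve2d(summ, gp_kernel, mode="same")   # normalizing response
        return m / (p + eps)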
The researched software program required five 2D convolutions to be processed in parallel, and these convolutions were effectively implemented on an FPGA using a two-channel, five-stage pipelined architecture; however, the precision of the filter coefficients and the variables had to be limited within the processor. The number of multipliers required for each convolution was reduced from 49 to 10 (a 79.5% reduction) using a Triangular design procedure. Experimental results suggested that the pipelined processor provided depth estimates comparable in accuracy to the full-precision Matlab output, and generated depth maps of 400 x 400 pixels in 13.06 ms, which is faster than video rate.
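The 49-to-10 reduction follows from symmetry: an 8-fold symmetric 7 x 7 kernel has only 10 distinct coefficients (a triangular wedge of the kernel), so pixels sharing a coefficient can be summed before multiplying. The sketch below is a software analogue of that counting argument, not the thesis's FPGA Triangular design procedure; the helper names are assumptions.

    import numpy as np

    def fold_window(window):
        """Sum the 8-fold-symmetric partners in a 7x7 window so that a
        symmetric kernel needs one multiply per distinct coefficient."""
        folded, c = {}, 3
        for dy in range(-c, c + 1):
            for dx in range(-c, c + 1):
                key = tuple(sorted((abs(dy), abs(dx))))   # canonical symmetric index
                folded[key] = folded.get(key, 0.0) + window[c + dy, c + dx]
        return folded   # 10 entries: (0,0), (0,1), ..., (3,3)

    def apply_symmetric_kernel(window, coeffs):
        """coeffs maps the 10 canonical indices to kernel values."""
        folded = fold_window(window)
        return sum(coeffs[k] * folded[k] for k in folded)   # 10 multiplications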
The defocused images (near- and far-focused) were optically registered for magnification using telecentric optics. A frequency-domain approach based on phase correlation was employed to measure the radial shifts due to magnification and also to optimally position the external aperture. The telecentric optics ensured correct pixel-to-pixel registration between the defocused images and provided more accurate depth estimates.
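Phase correlation itself is a standard frequency-domain registration technique; a generic sketch is given below. It recovers a single integer translation between two images, whereas the thesis applies the idea to measuring radial shifts caused by magnification, so this should be read as an illustration of the principle only.

    import numpy as np

    def phase_correlation_shift(a, b, eps=1e-12):
        """Return the integer translation (dy, dx) that best aligns b to a."""
        cross = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
        cross /= np.abs(cross) + eps                 # keep phase information only
        corr = np.fft.ifft2(cross).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # Map peak coordinates to signed shifts.
        if dy > a.shape[0] // 2:
            dy -= a.shape[0]
        if dx > a.shape[1] // 2:
            dx -= a.shape[1]
        return dy, dx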

Item Type: Thesis (PhD)
Subjects: T Technology > TA Engineering (General). Civil engineering (General)
T Technology > TR Photography
Library of Congress Subject Headings (LCSH): Image processing -- Digital techniques, Depth of field (Photography), Algorithms, Texture mapping, Measurement -- Research, Video recordings -- Research
Official Date: September 2009
Dates: September 2009 (Submitted)
Institution: University of Warwick
Theses Department: School of Engineering
Thesis Type: PhD
Publication Status: Unpublished
Supervisor(s)/Advisor: Staunton, Richard C.
Sponsors: University of Warwick (UoW)
Format of File: pdf
Extent: 191 leaves : ill., charts
Language: eng
