A Unified Approach for Registration and Depth in Depth from Defocus

Rami Ben-Ari
Orbotech Ltd.
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2014
@article{ben2014unified,
  title={A Unified Approach for Registration and Depth in Depth from Defocus},
  author={Ben-Ari, Rami},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  publisher={IEEE},
  year={2014}
}


Depth from Defocus (DFD) suggests a simple optical set-up for recovering the shape of a scene through imaging with a shallow depth of field. Although numerous methods have been proposed for DFD, less attention has been paid to the particular problem of alignment between the captured images. The inherent shift-variant defocus often prevents standard registration techniques from achieving the accuracy needed for successful shape reconstruction. In this paper, we address the DFD and registration problems in a unified framework, exploiting their mutual relation to reach a better solution for both cues. We draw a formal connection between registration and defocus blur, find its limitations, and reveal the weakness of the standard isolated approaches to registration and depth estimation. The solution is approached by energy minimization. The efficiency of the associated numerical scheme is justified by showing its equivalence to the celebrated Newton-Raphson method and by a proof of convergence of the emerging linear system. The computationally intensive approach of DFD, newly combined with simultaneous registration, is handled by GPU computing. Experimental results demonstrate the high sensitivity of the recovered shapes to slight errors in registration and validate the superior performance of the suggested approach over two alternatives that apply registration and DFD separately.
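The abstract's key numerical idea is a Newton-Raphson-type scheme for minimizing a joint energy over alignment and blur parameters. The sketch below is not the paper's scheme; it is a minimal, self-contained toy in the same spirit: a Gaussian pulse with unknown shift `t` and blur width `s` (both hypothetical stand-ins for the registration and defocus unknowns) is fitted to observations by full Newton iterations with finite-difference gradients and Hessians.

```python
import math

def model(x, t, s):
    """Gaussian pulse with shift t and blur width s (toy stand-ins
    for the registration and defocus unknowns)."""
    return math.exp(-(x - t) ** 2 / (2.0 * s * s))

# Synthetic observations from ground-truth parameters t*=2.0, s*=1.5.
xs = [i * 0.1 - 5.0 for i in range(101)]
ys = [model(x, 2.0, 1.5) for x in xs]

def energy(t, s):
    """Sum-of-squares data term E(t, s) = sum_x (model - observation)^2."""
    return sum((model(x, t, s) - y) ** 2 for x, y in zip(xs, ys))

def newton_fit(t, s, iters=15, h=1e-3):
    """Joint Newton-Raphson minimization of E over (t, s), using
    central finite differences for the gradient and 2x2 Hessian."""
    for _ in range(iters):
        E = energy(t, s)
        # Gradient by central differences.
        gt = (energy(t + h, s) - energy(t - h, s)) / (2 * h)
        gs = (energy(t, s + h) - energy(t, s - h)) / (2 * h)
        # Hessian entries by second-order central differences.
        Htt = (energy(t + h, s) - 2 * E + energy(t - h, s)) / (h * h)
        Hss = (energy(t, s + h) - 2 * E + energy(t, s - h)) / (h * h)
        Hts = (energy(t + h, s + h) - energy(t + h, s - h)
               - energy(t - h, s + h) + energy(t - h, s - h)) / (4 * h * h)
        # Solve the 2x2 Newton system H * d = -g by Cramer's rule.
        det = Htt * Hss - Hts * Hts
        dt = -(Hss * gt - Hts * gs) / det
        ds = -(Htt * gs - Hts * gt) / det
        t, s = t + dt, s + ds
    return t, s

t_hat, s_hat = newton_fit(1.8, 1.4)
print(t_hat, s_hat)  # converges near the ground truth (2.0, 1.5)
```

Starting from a reasonable initial guess, the quadratic convergence of Newton's method recovers both parameters jointly; the paper's point is that solving for alignment and defocus together in one energy avoids the accuracy loss of estimating them in isolation.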

HGPU group © 2010-2016 hgpu.org

All rights belong to the respective authors
