Document Type: Original Research Paper
Authors
1 Department of Computer Engineering, Hakim Sabzevari University, Sabzevar, Iran.
2 Computer Engineering Department, Ferdowsi University of Mashhad, Mashhad, Iran.
Abstract
Background and Objectives: Depth from defocus and defocus deblurring from a single image are two challenging problems caused by the finite depth of field of conventional cameras. Coded aperture imaging is a branch of computational imaging that is used to overcome these two problems. To date, various methods have been proposed to improve the results of either defocus deblurring or depth estimation. In this paper, an asymmetric coded aperture is proposed that improves the results of both depth estimation and defocus deblurring from a single input image.
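For background (this is the standard model in the depth-from-defocus literature, not a formula quoted from the paper), defocus blur is commonly written as a depth-dependent convolution:

$$ y = k_d \ast x + n $$

where $y$ is the captured image, $x$ is the latent sharp image, $n$ is sensor noise, and $k_d$ is the aperture pattern scaled by the defocus radius at scene depth $d$. Since the aperture shape determines $k_d$, choosing a coded (here, asymmetric) pattern directly influences how well the blur can be inverted and how distinguishable different depths are.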
Methods: To this end, a multi-objective optimization function is proposed that takes into account both deblurring quality and depth discrimination ability. Since aperture throughput affects image quality, our optimization function is defined based on the illumination conditions and camera specifications, which yields an aperture with optimized throughput. Because the designed pattern is asymmetric, defocused objects on the two sides of the focal plane can be distinguished. Depth estimation is performed using a new algorithm that is based on perceptual image quality assessment criteria and can discern blurred objects lying in front of or behind the focal plane.
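To make the depth-selection idea concrete, the following is a minimal, hypothetical sketch, not the authors' implementation: it deconvolves the input with a bank of candidate PSFs, one per depth hypothesis on either side of the focal plane (the two sides give mirrored PSFs because the aperture is asymmetric), and picks the hypothesis whose restoration scores best. The function names, the Wiener deconvolution, and the ringing-based score are all illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only -- not the paper's exact algorithm.
# Assumes the coded-aperture PSF is known at each candidate depth;
# the depth is chosen as the hypothesis whose deconvolution contains
# the fewest artifacts under a simple stand-in quality score.
import numpy as np
from numpy.fft import fft2, ifft2

def wiener_deconv(blurred, psf, snr=1e-2):
    """Frequency-domain Wiener deconvolution with a flat SNR prior."""
    H = fft2(psf, s=blurred.shape)
    return np.real(ifft2(np.conj(H) * fft2(blurred) / (np.abs(H) ** 2 + snr)))

def ringing_score(img):
    """Crude stand-in for a perceptual IQA criterion:
    penalize strong Laplacian energy (ringing / over-sharpening)."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return np.mean(np.abs(lap))

def estimate_depth(blurred, psf_bank):
    """psf_bank: {depth_label: PSF scaled for that depth, mirrored for
    hypotheses behind the focal plane (asymmetric aperture)}."""
    scores = {d: ringing_score(wiener_deconv(blurred, k))
              for d, k in psf_bank.items()}
    best = min(scores, key=scores.get)  # most artifact-free hypothesis
    return best, wiener_deconv(blurred, psf_bank[best])
```

In the paper's setting, the ringing-based score above would be replaced by the proposed perceptual image quality assessment criterion, and the PSF bank would be derived from the designed asymmetric aperture pattern.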
Results: Extensive simulations, as well as experiments on a variety of real scenes, are conducted to compare the performance of our aperture with that of previously proposed ones.
Conclusion: Our aperture has been designed for indoor illumination settings. However, the proposed method can be used to design and evaluate appropriate aperture patterns for other imaging conditions.
Open Access
This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit: http://creativecommons.org/licenses/by/4.0/
Publisher’s Note
JECEI Publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Publisher
Shahid Rajaee Teacher Training University