Description
Coded aperture photography is a computational imaging technique in which a patterned mask is inserted at the aperture of the camera while imaging the scene. The mask shapes the camera's Point Spread Function (PSF), and because the blur induced by the PSF varies with scene depth, a relative depth map of the scene can be estimated from a single image. This depth information can in turn be used to deblur or refocus the captured image.
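To make the underlying image-formation assumption concrete, the sketch below simulates a coded-aperture capture with a simplified layered-scene model (NumPy/SciPy; the mask pattern, blur radii, and occlusion-free layering are illustrative assumptions, not part of the original setup):

```python
# Minimal sketch of the image-formation model behind coded-aperture depth
# estimation, assuming a fronto-parallel layered scene with no occlusion.
# The mask pattern, blur radii, and test images below are illustrative only.
import numpy as np
from scipy.ndimage import zoom
from scipy.signal import fftconvolve

def psf_from_mask(mask, blur_radius_px):
    """Resize the aperture mask to the defocus blur size of one depth layer."""
    if blur_radius_px == 0:                       # in-focus layer: no blur
        return np.ones((1, 1))
    size = 2 * blur_radius_px + 1
    psf = np.clip(zoom(mask.astype(float), size / mask.shape[0], order=1), 0.0, None)
    return psf / max(psf.sum(), 1e-8)             # PSF energy sums to 1

def render_coded_capture(layers, mask, blur_radii):
    """Sum of depth layers, each blurred by its own depth-scaled coded PSF."""
    capture = np.zeros_like(layers[0])
    for layer, radius in zip(layers, blur_radii):
        capture += fftconvolve(layer, psf_from_mask(mask, radius), mode="same")
    return capture

rng = np.random.default_rng(0)
mask = (rng.random((7, 7)) > 0.5).astype(float)   # a toy binary coded aperture
foreground, background = rng.random((128, 128)), rng.random((128, 128))
capture = render_coded_capture([foreground, background], mask, blur_radii=[0, 6])
```

Because the blur kernel in this model is a depth-scaled copy of the mask, identifying the local blur scale in the captured image is what makes single-image depth estimation possible.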
A natural question is which patterned mask one should use; the literature shows that not all masks perform equally well. Previous works have posed the search for an optimal mask as an optimization problem, typically under an assumed analytical prior over images. A mask obtained this way is optimal only for images that follow the assumed prior distribution (see the sketch of such a criterion below).
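To make "optimal under a prior" concrete, a generic sketch of the kind of criterion used in prior work (not the exact objective of any particular paper) selects the mask whose depth-dependent blurred-image distributions are most distinguishable under the assumed prior:

$$
M^{\star} = \arg\max_{M} \; \min_{d_1 \neq d_2} \; D_{\mathrm{KL}}\!\left( p(y \mid M, d_1) \,\Vert\, p(y \mid M, d_2) \right),
$$

where $p(y \mid M, d)$ is the distribution, induced by the analytical image prior, of images blurred by mask $M$ at depth $d$. A mask that scores well under such a criterion is, by construction, tied to that prior.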
Unlike previous approaches, which treat patterned mask design and depth estimation as two separate problems, we propose a joint design. Motivated by Chakrabarti's work on sensor multiplexing design, we propose a two-stage neural network that designs the aperture code and estimates depth jointly. We also do not assume any analytical prior distribution over images; instead, we let the network learn it from the training data.
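As a rough illustration of the two-stage idea, the PyTorch sketch below pairs a differentiable coded-aperture layer, whose learnable mask blurs each depth plane of a layered scene during training, with a small CNN depth estimator. All layer sizes, the sigmoid relaxation of the mask, and the layered blur model are our own assumptions, not the project's actual architecture.

```python
# Sketch of a two-stage coded-aperture + depth-estimation network in PyTorch.
# Sizes, the sigmoid mask relaxation, and the layered blur model are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CodedApertureLayer(nn.Module):
    """Stage 1: a learnable aperture mask that simulates the coded capture.

    The scene is approximated as a stack of fronto-parallel depth layers;
    each layer is blurred by the mask resized to that layer's defocus size.
    """
    def __init__(self, mask_size=11, num_depths=4):
        super().__init__()
        self.mask_logits = nn.Parameter(torch.zeros(mask_size, mask_size))
        # Assumed blur diameters (in pixels) for each discrete depth plane.
        self.blur_sizes = [1 + 2 * d for d in range(num_depths)]

    def forward(self, layers):                         # layers: (B, num_depths, H, W)
        mask = torch.sigmoid(self.mask_logits)         # relax mask values to [0, 1]
        blurred_layers = []
        for d, s in enumerate(self.blur_sizes):
            psf = F.interpolate(mask[None, None], size=(s, s),
                                mode="bilinear", align_corners=False)
            psf = psf / psf.sum().clamp_min(1e-8)      # normalize PSF energy
            blurred = F.conv2d(layers[:, d:d+1], psf, padding=s // 2)
            blurred_layers.append(blurred)
        return torch.stack(blurred_layers, dim=1).sum(dim=1)  # simulated coded capture

class DepthEstimator(nn.Module):
    """Stage 2: a small CNN that predicts a per-pixel depth-class map."""
    def __init__(self, num_depths=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, num_depths, 1),              # logits over depth classes
        )

    def forward(self, capture):
        return self.net(capture)

# Both stages share one optimizer, so gradients from the depth loss
# also shape the learned aperture code.
coded, estimator = CodedApertureLayer(), DepthEstimator()
optimizer = torch.optim.Adam(
    list(coded.parameters()) + list(estimator.parameters()), lr=1e-4)
```

In this sketch, training would pass a layered rendering of each scene through `CodedApertureLayer`, feed the simulated capture to `DepthEstimator`, and backpropagate a depth-classification loss through both stages, so the learned mask is the one that makes depth easiest to infer from a single capture.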