Abstract
In this paper, we present a generic performance model for evaluating the accuracy of depth estimation using depth from defocus (DFD). The model requires only the sensor point spread function at a given depth to evaluate the theoretical depth estimation accuracy. Hence, it can be applied to any conventional or unconventional system, using either a single image or several images. The model is validated experimentally on two unconventional DFD cameras, one using a coded aperture and the other a lens with chromatic aberration. We then use the proposed model for the end-to-end design of a 3D camera based on an unconventional lens with chromatic aberration, for the specific use case of small unmanned aerial vehicle navigation.
© 2021 Optical Society of America
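As an illustration of the kind of PSF-based accuracy evaluation the abstract describes, the sketch below computes a Cramér-Rao-style lower bound on depth precision from two PSFs at nearby depths. This is not the paper's actual criterion or code; the Gaussian PSF, the mapping from blur width to depth, the known-scene and Gaussian-noise assumptions, and the helper names `gaussian_psf` and `depth_crb` are all hypothetical choices made for this toy example.

```python
# Hypothetical sketch (not the paper's model): a Cramér-Rao-bound-style estimate
# of achievable depth accuracy from the sensor PSF, assuming a known scene,
# additive Gaussian noise, and a finite-difference PSF derivative w.r.t. depth.
import numpy as np
from scipy.signal import fftconvolve

def gaussian_psf(sigma, size=15):
    """Toy defocus PSF: an isotropic Gaussian whose width stands in for the blur at a given depth."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def depth_crb(psf, psf_plus, dz, scene, noise_std):
    """Lower bound on the depth standard deviation from PSFs at depths z and z + dz."""
    # Finite-difference derivative of the blurred image with respect to depth.
    d_image = (fftconvolve(scene, psf_plus, mode="same")
               - fftconvolve(scene, psf, mode="same")) / dz
    fisher = np.sum(d_image**2) / noise_std**2   # Fisher information for the depth parameter
    return 1.0 / np.sqrt(fisher)                 # Cramér-Rao lower bound on the depth std

rng = np.random.default_rng(0)
scene = rng.standard_normal((64, 64))            # toy textured scene
sigma_z = depth_crb(gaussian_psf(2.0), gaussian_psf(2.1), dz=0.01,
                    scene=scene, noise_std=0.01)
print(f"Lower bound on depth std: {sigma_z:.4f} (arbitrary depth units)")
```

In this toy setup, the only inputs are the PSFs at two nearby depths, the scene texture, and the noise level, which mirrors the abstract's point that a PSF model at a given depth suffices to predict theoretical depth accuracy for any imaging configuration.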