Abstract
The Perona–Malik model has been very successful at restoring images from noisy input. In this paper, we reinterpret the Perona–Malik model in the language of Gaussian scale mixtures and derive some extensions of the model. Specifically, we show that the expectation–maximization (EM) algorithm applied to Gaussian scale mixtures leads to the lagged-diffusivity algorithm for computing stationary points of the Perona–Malik diffusion equations. Moreover, we show how mean field approximations to these Gaussian scale mixtures lead to a modification of the lagged-diffusivity algorithm that better captures the uncertainties in the restoration. Since this modification can be hard to compute in practice, we propose relaxations to the mean field objective to make the algorithm computationally feasible. Our numerical experiments show that this modified lagged-diffusivity algorithm often performs better at restoring textured areas and fuzzy edges than the unmodified algorithm. As a second application of the Gaussian scale mixture framework, we show how an efficient sampling procedure can be obtained for the probabilistic model, making the computation of the conditional mean and other expectations algorithmically feasible. Again, the resulting algorithm has a strong resemblance to the lagged-diffusivity algorithm. Finally, we show that a probabilistic version of the Mumford–Shah segmentation model can be obtained in the same framework with a discrete edge prior.
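The lagged-diffusivity iteration mentioned above freezes the nonlinear diffusivity at the previous iterate, so each step only requires solving a linear system. The following is a minimal 1-D denoising sketch of that idea; the parameter values and the rational diffusivity \(g(s) = 1/(1 + s/K^2)\) are illustrative choices, not the specific model or settings used in the paper.

```python
import numpy as np

def lagged_diffusivity_1d(v, lam=4.0, K=0.2, n_iter=20):
    """Denoise a 1-D signal with a lagged-diffusivity fixed-point iteration.

    At each iteration the nonlinear diffusivity g(s) = 1 / (1 + s / K**2)
    is evaluated ("lagged") at the previous iterate, which leaves the
    linear system (lam*I + D^T diag(g) D) u = lam*v to solve.
    The parameters lam, K, n_iter are illustrative, not values from the paper.
    """
    n = len(v)
    u = np.asarray(v, dtype=float).copy()
    D = np.diff(np.eye(n), axis=0)           # forward-difference matrix, (n-1) x n
    for _ in range(n_iter):
        grad = D @ u                          # gradient of the previous iterate
        g = 1.0 / (1.0 + (grad / K) ** 2)    # lagged diffusivity weights
        # Dense assembly is fine for a sketch; use sparse solvers for images.
        A = lam * np.eye(n) + D.T @ (g[:, None] * D)
        u = np.linalg.solve(A, lam * np.asarray(v, dtype=float))
    return u
```

On a noisy step signal the iteration smooths the flat regions (where the lagged diffusivity is large) while leaving the jump nearly untouched (where it is small), which is the edge-preserving behavior the Perona–Malik model is known for.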
Notes
Strictly speaking, p(u) is not a proper probability density, as it is not normalizable; in Bayesian statistics, such densities are referred to as improper priors. In practice, only the posterior density \(p(u \mid v_n)\) enters the calculations, and it generally is a normalizable probability density.
By a simple induction argument one gets \(\Psi ^{(n)}(t) = (-1)^{n}\sum _{k=0}^{n-1}\left( {\begin{array}{c}n\\ k\end{array}}\right) (-1)^{k}\Psi ^{(k)}(t)\exp (-t)\) from which we obtain \((-1)^{n}\Psi ^{(n)}\ge 0\) as desired.
For two probability distributions p and q, the Kullback–Leibler divergence is \({{\mathrm{KL}}}(q,p) = \int q\log \Big (\tfrac{q}{p}\Big )\).
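For discrete distributions the integral reduces to a sum, which the following small sketch computes (the helper name `kl` is ours):

```python
import numpy as np

def kl(q, p):
    """KL(q, p) = sum_i q_i * log(q_i / p_i) for probability vectors.

    Terms with q_i = 0 contribute 0 (the usual 0 * log 0 = 0 convention);
    p is assumed positive wherever q is.
    """
    q = np.asarray(q, dtype=float)
    p = np.asarray(p, dtype=float)
    mask = q > 0
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))
```

Note that KL(q, p) vanishes exactly when q = p and is asymmetric in its two arguments, which is why the order of the arguments matters in mean field objectives.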
It is possible that \(\xi _0^{(k)}\) and \(\eta _0^{(k+1)}\) lie outside the range of z and \(-\frac{1}{2} |\nabla u|^2\), respectively (e.g., if z is a binary random variable). Formally, however, \(q_1^{(k+1)}\) and \(q_2^{(k+1)}\) still behave like the indicated conditional distributions.
Acknowledgements
We would like to thank Sebastian Nowozin from Microsoft Research for some helpful literature hints.
This material was based upon work partially supported by the National Science Foundation under Grant DMS-1127914 to the Statistical and Applied Mathematical Sciences Institute. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
Cite this article
Mescheder, L.M., Lorenz, D.A. An Extended Perona–Malik Model Based on Probabilistic Models. J Math Imaging Vis 60, 128–144 (2018). https://doi.org/10.1007/s10851-017-0746-0