
An Extended Perona–Malik Model Based on Probabilistic Models

Published in: Journal of Mathematical Imaging and Vision

Abstract

The Perona–Malik model has been very successful at restoring images from noisy input. In this paper, we reinterpret the Perona–Malik model in the language of Gaussian scale mixtures and derive some extensions of the model. Specifically, we show that the expectation–maximization (EM) algorithm applied to Gaussian scale mixtures leads to the lagged-diffusivity algorithm for computing stationary points of the Perona–Malik diffusion equations. Moreover, we show how mean field approximations to these Gaussian scale mixtures lead to a modification of the lagged-diffusivity algorithm that better captures the uncertainties in the restoration. Since this modification can be hard to compute in practice, we propose relaxations to the mean field objective to make the algorithm computationally feasible. Our numerical experiments show that this modified lagged-diffusivity algorithm often performs better at restoring textured areas and fuzzy edges than the unmodified algorithm. As a second application of the Gaussian scale mixture framework, we show how an efficient sampling procedure can be obtained for the probabilistic model, making the computation of the conditional mean and other expectations algorithmically feasible. Again, the resulting algorithm has a strong resemblance to the lagged-diffusivity algorithm. Finally, we show that a probabilistic version of the Mumford–Shah segmentation model can be obtained in the same framework with a discrete edge prior.
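The lagged-diffusivity iteration referred to in the abstract can be sketched concretely: at each sweep the diffusivity is frozen at the current iterate, turning the nonlinear Perona–Malik stationarity condition into a linear system. The following is a minimal illustrative sketch in one dimension (the function names, the 1D setting, and all parameter values are our own choices for illustration, not taken from the paper):

```python
import numpy as np

def pm_diffusivity(s2, K=0.1):
    """Perona-Malik diffusivity g(s^2) = 1 / (1 + s^2 / K^2)."""
    return 1.0 / (1.0 + s2 / K**2)

def lagged_diffusivity_1d(v, lam=0.5, K=0.1, iters=20):
    """Denoise a 1D signal v by the lagged-diffusivity fixed point:
    freeze g at the current iterate u_k, then solve the linear system
        (I + lam * D^T diag(g(|D u_k|^2)) D) u_{k+1} = v,
    where D is the forward-difference matrix.
    """
    n = len(v)
    u = np.asarray(v, dtype=float).copy()
    D = np.diff(np.eye(n), axis=0)            # (n-1) x n forward differences
    for _ in range(iters):
        g = pm_diffusivity((D @ u) ** 2, K)   # lagged (frozen) diffusivity
        A = np.eye(n) + lam * D.T @ (g[:, None] * D)
        u = np.linalg.solve(A, v)             # one SPD linear solve per sweep
    return u
```

Each sweep solves a symmetric positive definite linear system, which keeps the iteration stable even though the underlying Perona–Malik energy is nonconvex.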


Figures 1–4 (not reproduced in this extract)

Notes

  1. Strictly speaking, p(u) is not a proper probability density, as it is not normalizable. In Bayesian statistics, such probability densities are often referred to as improper probability densities. In practice, only the posterior density \(p(u \mid v_n)\) is used for calculations, which generally defines a normalizable probability density.

  2. By a simple induction argument one gets \(\Psi ^{(n)}(t) = (-1)^{n}\sum _{k=0}^{n-1}\left( {\begin{array}{c}n\\ k\end{array}}\right) (-1)^{k}\Psi ^{(k)}(t)\exp (-t)\) from which we obtain \((-1)^{n}\Psi ^{(n)}\ge 0\) as desired.

  3. For two probability distributions p and q, the Kullback–Leibler divergence is \({{\mathrm{KL}}}(q,p) = \int q\log \Big (\tfrac{q}{p}\Big )\).

  4. It is possible that \(\xi _0^{(k)}\) and \(\eta _0^{(k+1)}\) lie outside the ranges of z and \(-\frac{1}{2} |\nabla u|^2\), respectively (e.g., if z is a binary random variable). Formally, however, \(q_1^{(k+1)}\) and \(q_2^{(k+1)}\) still behave like the indicated conditional distributions.
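The Kullback–Leibler divergence defined in Note 3 has a straightforward discrete analogue, which the following minimal sketch computes (our own illustration, not code from the paper), using the convention that terms with \(q_i = 0\) contribute zero:

```python
import math

def kl_divergence(q, p):
    """Discrete Kullback-Leibler divergence
        KL(q, p) = sum_i q_i * log(q_i / p_i),
    with the convention that terms where q_i == 0 contribute zero."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)
```

Note that KL(q, p) is nonnegative, vanishes exactly when q = p, and is not symmetric in its arguments, which is why the order of q and p in the mean field objective matters.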


Acknowledgements

We would like to thank Sebastian Nowozin from Microsoft Research for helpful pointers to the literature.

Author information

Correspondence to L. M. Mescheder.

Additional information

This material was based upon work partially supported by the National Science Foundation under Grant DMS-1127914 to the Statistical and Applied Mathematical Sciences Institute. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

About this article

Cite this article

Mescheder, L.M., Lorenz, D.A. An Extended Perona–Malik Model Based on Probabilistic Models. J Math Imaging Vis 60, 128–144 (2018). https://doi.org/10.1007/s10851-017-0746-0

