
Exploring a novel facial animation technique using numerical traced algorithm

Published in: Multimedia Tools and Applications

Abstract

Facial animation is a fundamental challenge that requires mathematical and computational strategies. In this paper, a novel facial animation technique using a numerically traced algorithm is introduced. The homotopy-based animation methodology (HAM) follows the homotopy curve path to generate intermediate frames for different λ values, thereby representing the deformations from the starting image to the ending image. These deformations use a system of equations embedded into a single homotopy equation to represent the intermediate frames. Moreover, a hyperspherical tracking method produces deformations with visually consistent and smooth changes. Experimental results reveal intermediate frames that can be interpreted as facial animation. Furthermore, histogram plots, homotopic trajectories, and pixel variation tables confirm that different pixel positions vary at different rates of change as the original image is transformed into the target image. Finally, these frames do not require external filters to correct visual interpretation errors, so the homotopy-based animation method can be considered a useful alternative for animating facial images in different applications.
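As a rough illustration of the homotopy idea described in the abstract, the sketch below evaluates a simple convex-combination homotopy H(p, λ) = (1 − λ)·I_start(p) + λ·I_end(p) at evenly spaced λ values to produce intermediate frames. It does not reproduce the paper's embedded system of equations or its hyperspherical curve-tracing step, and the function name homotopy_frames is a hypothetical choice for this example only.

```python
import numpy as np

def homotopy_frames(img_start, img_end, num_frames=10):
    """Generate intermediate frames from a convex-combination homotopy.

    Each frame is H(p, lam) = (1 - lam) * img_start(p) + lam * img_end(p),
    evaluated pixel-wise at evenly spaced lam values in [0, 1]. This is only
    the basic homotopy interpolation idea; the paper's HAM additionally embeds
    a system of equations into a single homotopy equation and traces its curve
    with a hyperspherical step, which is not reproduced here.
    """
    a = np.asarray(img_start, dtype=np.float64)
    b = np.asarray(img_end, dtype=np.float64)
    frames = []
    for lam in np.linspace(0.0, 1.0, num_frames):
        # lam = 0 returns the starting image, lam = 1 the ending image
        frame = (1.0 - lam) * a + lam * b
        frames.append(np.clip(frame, 0, 255).astype(np.uint8))
    return frames
```

Given two aligned face images of equal size, the returned list plays back as a coarse morph from the first image to the second, one frame per sampled λ.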





Author information

Corresponding author

Correspondence to Delia Torres-Muñoz.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Torres-Muñoz, D., Hernández-Mejía, C., Maldonado-Mendez, C. et al. Exploring a novel facial animation technique using numerical traced algorithm. Multimed Tools Appl 81, 30961–30976 (2022). https://doi.org/10.1007/s11042-022-12944-7

