Capacitive facial movement detection for human–computer interaction to click by frowning and lifting eyebrows

Assistive technology

Published in Medical & Biological Engineering & Computing

Abstract

A capacitive facial movement detection method designed for human–computer interaction is presented. Some point-and-click interfaces use facial electromyography for clicking; the presented method provides a contactless alternative. Electrodes with no galvanic coupling to the face are used to form electric fields. Changes in the electric fields due to facial movements are detected by measuring capacitances between the electrodes. A prototype device for measuring a capacitance signal affected by frowning and lifting eyebrows was constructed. A commercial integrated circuit for capacitive touch sensors is used in the measurement. The applied movement detection algorithm uses an adaptive approach to provide operation capability in noisy and dynamic environments. Experiments with 10 test subjects showed that, under controlled conditions, the movements are detected reliably, but classifying them as frowns or eyebrow lifts is more problematic. Integration with a two-dimensional (2D) pointing solution and further experiments are still required.
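The adaptive detection approach mentioned in the abstract can be illustrated with a minimal sketch. The code below is not the authors' algorithm; it assumes a generic scheme in which a baseline mean and spread of the capacitance signal are estimated from a sliding window of recent samples, so that slow environmental drift is tracked while abrupt deviations (candidate facial movements) trigger detections. The window length and threshold factor are illustrative parameters.

```python
import statistics
from collections import deque

def detect_movements(samples, window=50, k=4.0):
    """Flag sample indices that deviate from an adaptive baseline.

    The baseline statistics are computed over a sliding window of
    recent samples, so slow drift in the capacitance signal is
    absorbed into the baseline while abrupt changes stand out.
    """
    baseline = deque(maxlen=window)
    events = []
    for i, x in enumerate(samples):
        if len(baseline) == window:
            mu = statistics.fmean(baseline)
            sigma = statistics.pstdev(baseline)
            if abs(x - mu) > k * max(sigma, 1e-9):
                events.append(i)  # deviation too large: candidate movement
                continue          # keep the event out of the baseline estimate
        baseline.append(x)
    return events
```

Excluding detected samples from the baseline update is one common design choice: it prevents a sustained movement from being absorbed into the baseline before it ends, at the cost of slower adaptation if the signal genuinely shifts to a new level.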


(Figs. 1–5 appear in the full article; captions are not available in this preview.)



Acknowledgements

This study was carried out in a project called Face Interface. The project is funded by the Academy of Finland, funding decision no. 115997.

Author information

Corresponding author

Correspondence to Ville Rantanen.



Cite this article

Rantanen, V., Niemenlehto, PH., Verho, J. et al. Capacitive facial movement detection for human–computer interaction to click by frowning and lifting eyebrows. Med Biol Eng Comput 48, 39–47 (2010). https://doi.org/10.1007/s11517-009-0565-6
