Abstract
This paper describes a new control system interface that uses the user’s eye gaze to enable severely disabled individuals to control electronic devices easily. The system is built on a novel human–computer interface that predicts and responds to the user’s likely intentions, inferred intuitively from their point of gaze. When the user looks at a specific device, the interface automatically pre-selects and offers only the controls appropriate to that device, in a simple and accessible manner. The user is thus given a conscious choice among the relevant control actions, which can be executed by simple means and without manually navigating potentially complex control menus. Two systems, using a head-mounted and a remote eye tracker respectively, are introduced, compared and evaluated in this paper.
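The core idea of the abstract can be sketched in code: the interface identifies which device the gaze point falls on and then offers only that device’s control set. The following is a minimal illustrative sketch, not the authors’ implementation; the device names, region format, and control lists are all hypothetical.

```python
# Illustrative sketch (not the authors' system): map the device at the user's
# point of gaze to the subset of controls appropriate to that device, so the
# user never has to navigate a full control menu. All names are hypothetical.

DEVICE_CONTROLS = {
    "television": ["power", "volume_up", "volume_down", "channel_up", "channel_down"],
    "lamp": ["power", "dim", "brighten"],
    "door": ["open", "close", "lock"],
}

def identify_gazed_device(gaze_point, device_regions):
    """Return the device whose scene region contains the gaze point, else None."""
    x, y = gaze_point
    for device, (left, top, right, bottom) in device_regions.items():
        if left <= x <= right and top <= y <= bottom:
            return device
    return None

def controls_for_gaze(gaze_point, device_regions):
    """Pre-select only the controls relevant to the device being looked at."""
    device = identify_gazed_device(gaze_point, device_regions)
    return DEVICE_CONTROLS.get(device, [])

# Example: two devices located in the scene image by bounding boxes.
regions = {"television": (0, 0, 100, 100), "lamp": (150, 0, 200, 50)}
print(controls_for_gaze((40, 60), regions))
```

In the paper’s remote-tracker setting, `identify_gazed_device` would be replaced by recognition of the gazed device in the scene image (the keyword list suggests SIFT features are used for this), but the pre-selection logic is the same.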
Keywords
- Spinal Muscular Atrophy
- Assistive Technology
- Scale Invariant Feature Transform
- Scene Image
- Disabled Individual
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Shi, F., Gale, A., Purdy, K. (2007). A New Gaze-Based Interface for Environmental Control. In: Stephanidis, C. (eds) Universal Access in Human-Computer Interaction. Ambient Interaction. UAHCI 2007. Lecture Notes in Computer Science, vol 4555. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-73281-5_109
Print ISBN: 978-3-540-73280-8
Online ISBN: 978-3-540-73281-5