EyeK: an efficient dwell-free eye gaze-based text entry system

Published: 24 September 2013

Abstract

Over the last three decades, eye gaze has become an important text entry modality on both large- and small-display digital devices, serving people with disabilities as well as able-bodied users. Although many tools have been developed, issues such as minimizing dwell time, visual search time, and interface area, and stabilizing eye-controlled mouse movement remain points of concern in making gaze typing interfaces more user-friendly, accurate, and robust. In this paper, we propose EyeK, a gaze-based text entry system that eliminates dwell time and helps mitigate visual search time. Performance evaluation shows that the proposed interface achieves, on average, a 15% higher text entry rate than existing interfaces. By design, the proposed interface fits comfortably on medium-sized display devices such as Tablet PCs and PDAs, and the system can also be used by people with motor disabilities.
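For context, conventional dwell-based gaze typing (the mechanism EyeK is designed to avoid) selects a key once the gaze rests on it continuously for a fixed threshold. A minimal sketch of that baseline, assuming a stream of (timestamp in milliseconds, key) gaze samples; the function name and sample format are illustrative assumptions, not taken from the paper:

```python
# Illustrative sketch of dwell-time key selection, the baseline that
# dwell-free systems like EyeK aim to replace. A key is "typed" once
# gaze samples stay on it continuously for at least `dwell_ms`.
# The (timestamp_ms, key) sample format is an assumption for illustration.

def dwell_select(samples, dwell_ms=500):
    typed = []
    cur_key, start = None, None
    for t, key in samples:
        if key != cur_key:
            # Gaze moved to a new key: restart the dwell timer.
            cur_key, start = key, t
        elif t - start >= dwell_ms:
            # Gaze rested long enough: select the key, restart the timer.
            typed.append(key)
            start = t
    return typed

# Gaze rests on 'a' for 550 ms, then on 'b' for 600 ms.
samples = [(0, 'a'), (200, 'a'), (550, 'a'),
           (600, 'b'), (800, 'b'), (1200, 'b')]
print(dwell_select(samples))  # → ['a', 'b']
```

Every selection here costs at least `dwell_ms`, which puts a hard ceiling on the entry rate; removing that fixed per-key wait is what motivates dwell-free designs such as the one proposed in this paper.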


    Published In

    APCHI '13: Proceedings of the 11th Asia Pacific Conference on Computer Human Interaction
    September 2013
    420 pages
    ISBN:9781450322539
    DOI:10.1145/2525194
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Qualifiers

    • Research-article

    Conference

    APCHI '13



    Cited By

    • (2024) 40 Years of Eye Typing: Challenges, Gaps, and Emergent Strategies. Proceedings of the ACM on Human-Computer Interaction 8(ETRA), 1-19. DOI: 10.1145/3655596. Online publication date: 28 May 2024.
    • (2024) SkiMR: Dwell-free Eye Typing in Mixed Reality. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 439-449. DOI: 10.1109/VR58804.2024.00065. Online publication date: 16 March 2024.
    • (2024) Eye-Hand Typing: Eye Gaze Assisted Finger Typing via Bayesian Processes in AR. IEEE Transactions on Visualization and Computer Graphics 30(5), 2496-2506. DOI: 10.1109/TVCG.2024.3372106. Online publication date: 19 March 2024.
    • (2023) FlexType: Flexible Text Input with a Small Set of Input Gestures. Proceedings of the 28th International Conference on Intelligent User Interfaces, 584-594. DOI: 10.1145/3581641.3584077. Online publication date: 27 March 2023.
    • (2023) Gaze Speedup: Eye Gaze Assisted Gesture Typing in Virtual Reality. Proceedings of the 28th International Conference on Intelligent User Interfaces, 595-606. DOI: 10.1145/3581641.3584072. Online publication date: 27 March 2023.
    • (2023) Characterizing information access needs in gaze-adaptive augmented reality interfaces: implications for fast-paced and dynamic usage contexts. Human-Computer Interaction 39(5-6), 553-583. DOI: 10.1080/07370024.2023.2260788. Online publication date: 13 October 2023.
    • (2022) Performance Analysis of Saccades for Primary and Confirmatory Target Selection. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, 1-12. DOI: 10.1145/3562939.3565619. Online publication date: 29 November 2022.
    • (2022) GazeBreath: Input Method Using Gaze Pointing and Breath Selection. Proceedings of the Augmented Humans International Conference 2022, 1-9. DOI: 10.1145/3519391.3519405. Online publication date: 13 March 2022.
    • (2022) TapGazer: Text Entry with Finger Tapping and Gaze-directed Word Selection. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3491102.3501838. Online publication date: 29 April 2022.
    • (2022) Eye Typing - Vision Based Human Activity Control. 2022 IEEE International Conference on Distributed Computing and Electrical Circuits and Electronics (ICDCECE), 1-5. DOI: 10.1109/ICDCECE53908.2022.9792928. Online publication date: 23 April 2022.
