DOI: 10.1145/2807442.2807499

Orbits: Gaze Interaction for Smart Watches using Smooth Pursuit Eye Movements

Published: 05 November 2015

Abstract

We introduce Orbits, a novel gaze interaction technique that enables hands-free input on smart watches. The technique relies on moving controls to leverage the smooth pursuit movements of the eyes, detecting whether the user is looking at a control and, if so, at which one. Orbits controls consist of targets that move along a circular trajectory on the watch face and can be selected by following the desired target with the eyes for a short time. We conducted two user studies to assess the technique's recognition accuracy and robustness, which demonstrated that Orbits is robust against false positives triggered by natural eye movements and that it offers a hands-free, high-accuracy way of interacting with smart watches using off-the-shelf devices. Finally, we developed three example interfaces built with Orbits: a music player, a notifications face plate and a missed call menu. Despite relying on moving controls, which are very unusual in current HCI interfaces, these were generally well received by participants in a third and final study.
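
At its core, this style of detection needs no gaze-to-screen calibration: over a sliding window, the recorded gaze trajectory is compared against the known trajectory of each orbiting target, and a target is selected once the gaze tracks its motion closely enough. The Python sketch below illustrates one common way to do this comparison, a per-axis Pearson correlation in the spirit of earlier smooth pursuit interfaces; the function names, the 0.8 threshold, the window length and the toy demo values are assumptions for illustration, not the implementation evaluated in the paper.

import numpy as np

# Minimal sketch of pursuit-style matching, assuming a fixed-length window
# of gaze samples and targets that orbit at a constant angular speed. All
# names and constants here are illustrative, not from the paper.

def orbit_position(t, cx, cy, radius, period, phase=0.0):
    """Position at time t (seconds) of a target on a circular orbit."""
    angle = 2.0 * np.pi * np.asarray(t) / period + phase
    return cx + radius * np.cos(angle), cy + radius * np.sin(angle)

def match_orbit(times, gaze_x, gaze_y, targets, threshold=0.8):
    """Return the index of the orbit the gaze is following, or None.

    times          : timestamps (seconds) of the gaze samples in the window
    gaze_x, gaze_y : gaze coordinates over the same window
    targets        : list of (cx, cy, radius, period, phase) tuples
    threshold      : minimum per-axis Pearson correlation (assumed value)
    """
    best_idx, best_score = None, threshold
    for i, (cx, cy, radius, period, phase) in enumerate(targets):
        tx, ty = orbit_position(times, cx, cy, radius, period, phase)
        # Smooth pursuit tracks the target on both axes at once, so require
        # a high correlation on x and y simultaneously.
        score = min(np.corrcoef(gaze_x, tx)[0, 1],
                    np.corrcoef(gaze_y, ty)[0, 1])
        if score > best_score:
            best_idx, best_score = i, score
    return best_idx

# Toy demo: two counter-phased targets on the same orbit; simulated gaze
# follows the second one with some noise, so match_orbit returns 1.
times = np.linspace(0.0, 1.0, 30)                  # one-second window, 30 Hz
targets = [(0, 0, 18, 2.0, 0.0), (0, 0, 18, 2.0, np.pi)]
gx, gy = orbit_position(times, 0, 0, 18, 2.0, np.pi)
rng = np.random.default_rng(0)
print(match_orbit(times, gx + rng.normal(0, 1, 30), gy + rng.normal(0, 1, 30), targets))

Lengthening the window or raising the threshold makes such a matcher more conservative, trading selection speed for robustness against natural eye movements, which is the kind of trade-off the paper's recognition and robustness studies examine.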

Supplementary Material

Supplemental video: suppl.mov (uist4210-file4.mp4)
MP4 file: p457.mp4


    Published In

    UIST '15: Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology
    November 2015
    686 pages
    ISBN: 9781450337793
    DOI: 10.1145/2807442
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. eye tracking
    2. gaze input
    3. gaze interaction
    4. pursuits
    5. small devices
    6. small displays
    7. smart watches
    8. wearable computing

    Qualifiers

    • Research-article

    Conference

    UIST '15

    Acceptance Rates

    UIST '15 Paper Acceptance Rate: 70 of 297 submissions, 24%.
    Overall Acceptance Rate: 561 of 2,567 submissions, 22%.



    Cited By

    • (2024) Bi-Directional Gaze-Based Communication: A Review. Multimodal Technologies and Interaction 8(12), 108. DOI: 10.3390/mti8120108. Online publication date: 4-Dec-2024.
    • (2024) PrivateGaze: Preserving User Privacy in Black-box Mobile Gaze Tracking Services. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(3), 1-28. DOI: 10.1145/3678595. Online publication date: 9-Sep-2024.
    • (2024) EyeWithShut: Exploring Closed Eye Features to Estimate Eye Position. Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 157-161. DOI: 10.1145/3675094.3677605. Online publication date: 5-Oct-2024.
    • (2024) Between Wearable and Spatial Computing: Exploring Four Interaction Techniques at the Intersection of Smartwatches and Head-mounted Displays. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3649902.3656365. Online publication date: 4-Jun-2024.
    • (2024) Hand Me This: Exploring the Effects of Gaze-driven Animations and Hand Representations in Users’ Sense of Presence and Embodiment. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3649902.3656362. Online publication date: 4-Jun-2024.
    • (2024) Investigating the Effects of Eye-Tracking Interpolation Methods on Model Performance of LSTM. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-6. DOI: 10.1145/3649902.3656353. Online publication date: 4-Jun-2024.
    • (2024) FocusFlow: 3D Gaze-Depth Interaction in Virtual Reality Leveraging Active Visual Depth Manipulation. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3613904.3642589. Online publication date: 11-May-2024.
    • (2024) GazePointAR: A Context-Aware Multimodal Voice Assistant for Pronoun Disambiguation in Wearable Augmented Reality. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-20. DOI: 10.1145/3613904.3642230. Online publication date: 11-May-2024.
    • (2024) Uncovering and Addressing Blink-Related Challenges in Using Eye Tracking for Interactive Systems. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-23. DOI: 10.1145/3613904.3642086. Online publication date: 11-May-2024.
    • (2024) Filtering on the Go: Effect of Filters on Gaze Pointing Accuracy During Physical Locomotion in Extended Reality. IEEE Transactions on Visualization and Computer Graphics 30(11), 7234-7244. DOI: 10.1109/TVCG.2024.3456153. Online publication date: 1-Nov-2024.
