DOI: 10.1145/3126594.3126616 · Research Article · UIST Conference Proceedings

SmoothMoves: Smooth Pursuits Head Movements for Augmented Reality

Published: 20 October 2017

Abstract

SmoothMoves is an interaction technique for augmented reality (AR) based on smooth pursuits head movements. It works by computing correlations between the movements of on-screen targets and the movements of the user's head as it tracks those targets. The paper presents three studies. The first suggests that head-based input can act as an easier and more affordable surrogate for eye-based input in many smooth pursuits interface designs. A follow-up study grounds the technique in the domain of augmented reality and captures error rates and acquisition times on two types of AR device: head-mounted (2.6% error, 1965 ms) and hand-held (4.9% error, 2089 ms). Finally, the paper presents an interactive lighting system prototype that demonstrates the benefits of using smooth pursuits head movements to interact with AR interfaces; a concluding qualitative study reports positive feedback on the technique's suitability for this scenario. Together, these results show that SmoothMoves is viable, efficient, and immediately available for a wide range of wearable devices that feature embedded motion sensing.
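The motion-matching principle the abstract describes, selecting whichever target's trajectory best correlates with the head's movement over a short window, can be illustrated with a minimal sketch. The code below is not the authors' implementation: the window length, the 0.8 selection threshold, the use of Pearson correlation, and the mapping of horizontal target motion to head yaw and vertical motion to pitch are all assumptions made for illustration.

```python
# Minimal motion-matching sketch in the spirit of SmoothMoves (illustrative,
# not the paper's implementation). Assumes head orientation arrives as
# (yaw, pitch) samples from an IMU and each on-screen target reports its
# (x, y) position at the same rate. WINDOW and THRESHOLD are placeholders.
from collections import deque

WINDOW = 30        # samples in the sliding window (assumed)
THRESHOLD = 0.8    # minimum per-axis correlation to select (assumed)

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    da = [x - mean_a for x in a]
    db = [y - mean_b for y in b]
    cov = sum(x * y for x, y in zip(da, db))
    norm_a = sum(x * x for x in da) ** 0.5
    norm_b = sum(y * y for y in db) ** 0.5
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return cov / (norm_a * norm_b)

class MotionMatcher:
    def __init__(self, target_ids):
        # One (x, y) history per target, plus the head (yaw, pitch) history.
        self.head = deque(maxlen=WINDOW)
        self.targets = {t: deque(maxlen=WINDOW) for t in target_ids}

    def update(self, head_yaw, head_pitch, target_positions):
        """Feed one synchronized sample; return the selected target or None."""
        self.head.append((head_yaw, head_pitch))
        for tid, pos in target_positions.items():
            self.targets[tid].append(pos)
        if len(self.head) < WINDOW:
            return None
        yaw = [h[0] for h in self.head]
        pitch = [h[1] for h in self.head]
        best, best_score = None, THRESHOLD
        for tid, hist in self.targets.items():
            xs = [p[0] for p in hist]
            ys = [p[1] for p in hist]
            # Correlate horizontal target motion with yaw and vertical
            # motion with pitch; both axes must exceed the threshold.
            score = min(pearson(xs, yaw), pearson(ys, pitch))
            if score > best_score:
                best, best_score = tid, score
        return best
```

Fed with synchronized samples (e.g., at the display's refresh rate), the matcher emits a selection once a single target's motion has been followed closely for the full window.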

Supplementary Material

MP4 File (p167-esteves.mp4)



    Published In

    UIST '17: Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology
    October 2017
    870 pages
    ISBN:9781450349819
    DOI:10.1145/3126594

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 20 October 2017


    Author Tags

    1. AR
    2. HMD
    3. augmented reality
    4. eye tracking
    5. input technique
    6. motion matching
    7. smooth pursuits
    8. wearable computing

    Qualifiers

    • Research-article

    Funding Sources

    • MSIP/IITP
    • Ministry of Science, ICT and Future Planning

    Conference

    UIST '17

    Acceptance Rates

    UIST '17 Paper Acceptance Rate: 73 of 324 submissions, 23%
    Overall Acceptance Rate: 561 of 2,567 submissions, 22%


