Eye, Head and Torso Coordination During Gaze Shifts in Virtual Reality

Published: 14 December 2019

Abstract

Humans naturally perform gaze shifts through a combination of eye, head and body movements. Although gaze has long been studied as an input modality for interaction, prior work has largely ignored the coordination of the eyes, head and body. This article reports a study of gaze shifts in virtual reality that aims to address this gap and inform design. We identify general eye, head and torso coordination patterns and analyse the movements’ relative contributions and temporal alignment. We quantify the effects of target distance, direction and user posture, describe preferred eye-in-head motion ranges, and identify high variability in head movement tendency. The study’s insights lead us to propose gaze zones that reflect different levels of contribution from eye, head and body. We discuss design implications for HCI and VR, and conclude that gaze should be treated as multimodal input, with eye, head and body movement as synergetic in interaction design.


      Published In

      ACM Transactions on Computer-Human Interaction, Volume 27, Issue 1 (February 2020), 206 pages
      ISSN: 1073-0516
      EISSN: 1557-7325
      DOI: 10.1145/3372746

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 14 December 2019
      Accepted: 01 September 2019
      Revised: 01 July 2019
      Received: 01 December 2018
      Published in TOCHI Volume 27, Issue 1

      Author Tags

      1. Eye gaze
      2. eye
      3. eye tracking
      4. eye-head coordination
      5. gaze interaction
      6. gaze shifts
      7. head and body movement
      8. multimodal interaction

      Qualifiers

      • Research-article
      • Research
      • Refereed

