DOI: 10.1007/978-3-319-92141-9_2

User Defined Eye Movement-Based Interaction for Virtual Reality

Published: 15 July 2018

Abstract

Most applications of eye movement-based interaction in VR are currently limited to blinking and gaze, while gaze gestures have been neglected; the potential of eye movement-based interaction in VR is therefore far from being realized. In addition, many researchers have tried to define special eye movements as input instructions, but these definitions are almost always empirical and neglect users' habits and cultural backgrounds. In this paper, we focus on how Chinese users interact in VR using eye movements without relying on a graphical user interface. We present a guessability study of intuitive eye movement-based interaction for common commands across 30 tasks in 3 categories in VR. A total of 360 eye movements were collected from 12 users, yielding a consensus set of eye movements in VR that best matches users' cognition. This set can be applied to the design of eye movement-based interaction in VR, helping designers develop user-centered, intuitive eye movement-based interaction. It can also be transferred to other interactive media and user interfaces, such as a post-WIMP interface based on eye movement-based interaction, as a design reference.


Published In

Cross-Cultural Design. Methods, Tools, and Users: 10th International Conference, CCD 2018, Held as Part of HCI International 2018, Las Vegas, NV, USA, July 15-20, 2018, Proceedings, Part I
Jul 2018
462 pages
ISBN:978-3-319-92140-2
DOI:10.1007/978-3-319-92141-9

Publisher

Springer-Verlag

Berlin, Heidelberg


Author Tags

  1. Eye movement-based interaction
  2. Gaze gesture
  3. Virtual reality
  4. Guessability
  5. Intuitive interaction

