
A tutorial: Analyzing eye and head movements in virtual reality

  • Original Manuscript
  • Published in Behavior Research Methods

Abstract

This tutorial provides instruction on how to use the eye tracking technology built into virtual reality (VR) headsets, emphasizing the analysis of head and eye movement data when an observer is situated in the center of an omnidirectional environment. We begin with a brief description of how VR eye movement research differs from previous forms of eye movement research, as well as identifying some outstanding gaps in the current literature. We then introduce the basic methodology used to collect VR eye movement data both in general and with regard to the specific data that we collected to illustrate different analytical approaches. We continue with an introduction of the foundational ideas regarding data analysis in VR, including frames of reference, how to map eye and head position, and event detection. In the next part, we introduce core head and eye data analyses focusing on determining where the head and eyes are directed. We then expand on what has been presented, introducing several novel spatial, spatio-temporal, and temporal head–eye data analysis techniques. We conclude with a reflection on what has been presented, and how the techniques introduced in this tutorial provide the scaffolding for extensions to more complex and dynamic VR environments.
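The abstract mentions mapping eye and head position across frames of reference, i.e., expressing a head-centered gaze direction in world coordinates and describing it as angular position. The following is a minimal sketch of that idea, not the authors' code: the coordinate convention (right-handed, +z forward, +y up) and the function names are assumptions for illustration, and real VR engines use differing conventions.

```python
import math

def direction_to_angles(v):
    """Convert a 3D direction vector (x, y, z) to (azimuth, elevation)
    in degrees. Assumes a right-handed frame with +z forward and +y up."""
    x, y, z = v
    azimuth = math.degrees(math.atan2(x, z))  # left/right of straight ahead
    elevation = math.degrees(math.asin(y / math.sqrt(x * x + y * y + z * z)))
    return azimuth, elevation

def rotate_yaw(v, yaw_deg):
    """Rotate vector v about the vertical (+y) axis by yaw_deg degrees.
    Stands in for the full head-orientation rotation (usually a quaternion)."""
    x, y, z = v
    t = math.radians(yaw_deg)
    return (x * math.cos(t) + z * math.sin(t),
            y,
            -x * math.sin(t) + z * math.cos(t))

# Eyes straight ahead in the head frame, head turned 30 degrees to the side:
gaze_head = (0.0, 0.0, 1.0)
gaze_world = rotate_yaw(gaze_head, 30.0)
az, el = direction_to_angles(gaze_world)  # world-referenced gaze direction
```

In a full analysis the yaw rotation would be replaced by the headset's reported orientation quaternion, but the principle is the same: gaze-in-world is the head orientation applied to gaze-in-head.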


[Figs. 1–19: figure thumbnails omitted]


Data availability

The data underlying the results presented in the study are available in the repository Open Science Framework (https://doi.org/10.17605/OSF.IO/THR89). None of the experiments was preregistered.

Code availability

The code of the main analysis programs is available in the repository Open Science Framework (https://doi.org/10.17605/OSF.IO/THR89).


Funding

Partial financial support was received from the Natural Sciences and Engineering Research Council of Canada (NCA: Postdoctoral Fellowship; AK: RGPIN-2022-03079).


Author information


Corresponding author

Correspondence to Walter F. Bischof.

Ethics declarations

Conflict of interest / Competing interests

All authors report no conflicts of interest.

Ethics approval

The study was approved by the ethics board of the University of British Columbia (H10-00527). The procedures used in this study adhere to the tenets of the Declaration of Helsinki.

Consent to participate

All participants provided informed consent prior to participation.

Consent for publication

We confirm that this work is original and has not been published elsewhere, nor is it currently under consideration for publication elsewhere.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Bischof, W.F., Anderson, N.C. & Kingstone, A. A tutorial: Analyzing eye and head movements in virtual reality. Behav Res 56, 8396–8421 (2024). https://doi.org/10.3758/s13428-024-02482-5

