  • McAnally K, Grove P and Wallis G. (2024). Vergence eye movements in virtual reality. Displays. 10.1016/j.displa.2024.102683. 83. (102683). Online publication date: 1-Jul-2024.

    https://linkinghub.elsevier.com/retrieve/pii/S0141938224000477

  • Urbas L, Pelzer F, Lorenz S and Herlitzius T. (2024). Förderlicher Entwurf cyber-physischer Produktionssysteme [Conducive Design of Cyber-Physical Production Systems]. Handbuch Industrie 4.0. 10.1007/978-3-662-58528-3_132. (189-223).

    https://link.springer.com/10.1007/978-3-662-58528-3_132

  • Urbas L, Pelzer F, Lorenz S and Herlitzius T. (2023). Förderlicher Entwurf cyber-physischer Produktionssysteme [Conducive Design of Cyber-Physical Production Systems]. Handbuch Industrie 4.0. 10.1007/978-3-662-45537-1_132-1. (1-36).

    https://link.springer.com/10.1007/978-3-662-45537-1_132-1

  • Wagner P, Ho A and Kim J. (2022). Estimating 3D spatiotemporal point of regard: a device evaluation. Journal of the Optical Society of America A. 10.1364/JOSAA.457663. 39:8. (1343). Online publication date: 1-Aug-2022.

    https://opg.optica.org/abstract.cfm?URI=josaa-39-8-1343

  • Arefin M, Swan II J, Cohen Hoffing R and Thurman S. Estimating Perceptual Depth Changes with Eye Vergence and Interpupillary Distance using an Eye Tracker in Virtual Reality. 2022 Symposium on Eye Tracking Research and Applications. (1-7).

    https://doi.org/10.1145/3517031.3529632

  • Wang Z, Zhang A, Xia X, Zhang S, Li H, Wang J and Gao S. (2021). A Visual Feedback Supported Intelligent Assistive Technique for Amyotrophic Lateral Sclerosis Patients. Advanced Intelligent Systems. 10.1002/aisy.202100097. 4:4. Online publication date: 1-Apr-2022.

    https://onlinelibrary.wiley.com/doi/10.1002/aisy.202100097

  • Sun J, Wu Z, Wang H, Jing P and Liu Y. A Novel Integrated Eye-Tracking System with Stereo Stimuli for 3D Gaze Estimation. IEEE Transactions on Instrumentation and Measurement. 10.1109/TIM.2022.3225009. (1-1).

    https://ieeexplore.ieee.org/document/9964289/

  • Oney S, Rodrigues N, Becher M, Ertl T, Reina G, Sedlmair M and Weiskopf D. Evaluation of Gaze Depth Estimation from Eye Tracking in Augmented Reality. ACM Symposium on Eye Tracking Research and Applications. (1-5).

    https://doi.org/10.1145/3379156.3391835

  • Wahl S, Dragneva D and Rifai K. Digitalization versus immersion: performance and subjective evaluation of 3D perception with emulated accommodation and parallax in digital microsurgery. Journal of Biomedical Optics. 10.1117/1.JBO.24.10.106501. 24:10. (1).

    https://www.spiedigitallibrary.org/journals/journal-of-biomedical-optics/volume-24/issue-10/106501/Digitalization-versus-immersion--performance-and-subjective-evaluation-of-3D/10.1117/1.JBO.24.10.106501.full

  • Mardanbegi D, Clarke C and Gellersen H. Monocular gaze depth estimation using the vestibulo-ocular reflex. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. (1-9).

    https://doi.org/10.1145/3314111.3319822

  • Mardanbegi D, Langlotz T and Gellersen H. Resolving Target Ambiguity in 3D Gaze Interaction through VOR Depth Estimation. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. (1-12).

    https://doi.org/10.1145/3290605.3300842

  • Weier M, Roth T, Hinkenjann A and Slusallek P. (2018). Foveated Depth-of-Field Filtering in Head-Mounted Displays. ACM Transactions on Applied Perception. 15:4. (1-14). Online publication date: 31-Oct-2018.

    https://doi.org/10.1145/3238301

  • Weber S, Schubert R, Vogt S, Velichkovsky B and Pannasch S. (2017). Gaze3DFix: Detecting 3D fixations with an ellipsoidal bounding volume. Behavior Research Methods. 10.3758/s13428-017-0969-4. 50:5. (2004-2015). Online publication date: 1-Oct-2018.

    https://link.springer.com/10.3758/s13428-017-0969-4

  • Elmadjian C, Shukla P, Tula A and Morimoto C. 3D gaze estimation in the scene volume with a head-mounted eye tracker. Proceedings of the Workshop on Communication by Gaze Interaction. (1-9).

    https://doi.org/10.1145/3206343.3206351

  • Weier M, Roth T, Hinkenjann A and Slusallek P. Predicting the gaze depth in head-mounted displays using multiple feature regression. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. (1-9).

    https://doi.org/10.1145/3204493.3204547

  • Lee Y, Shin C, Piumsomboon T, Lee G and Billinghurst M. Automated enabling of head mounted display using gaze-depth estimation. SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications. (1-4).

    https://doi.org/10.1145/3132787.3139201

  • Mantiuk R. (2017). Accuracy of High-End and Self-build Eye-Tracking Systems. Hard and Soft Computing for Artificial Intelligence, Multimedia and Security. 10.1007/978-3-319-48429-7_20. (216-227).

    https://link.springer.com/10.1007/978-3-319-48429-7_20

  • Wang X, Lindlbauer D, Lessig C and Alexa M. (2017). Accuracy of Monocular Gaze Tracking on 3D Geometry. Eye Tracking and Visualization. 10.1007/978-3-319-47024-5_10. (169-184).

    https://link.springer.com/10.1007/978-3-319-47024-5_10

  • Tong I, Mohareri O, Tatasurya S, Hennessey C and Salcudean S. (2015). A retrofit eye gaze tracker for the da Vinci and its integration in task execution using the da Vinci Research Kit. 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). 10.1109/IROS.2015.7353648. 978-1-4799-9994-1. (2043-2050).

    https://ieeexplore.ieee.org/document/7353648/

  • Lanata A, Greco A, Valenza G and Scilingo E. (2015). On the tridimensional estimation of the gaze point by a stereoscopic wearable eye tracker. 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 10.1109/EMBC.2015.7318848. 978-1-4244-9271-8. (2283-2286).

    https://ieeexplore.ieee.org/document/7318848/

  • Ma B, Zhou J, Gu X, Wang M, Zhang Y and Guo X. (2015). A new approach to create 3D fixation density maps for stereoscopic images. 2015 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video (3DTV-CON 2015). 10.1109/3DTV.2015.7169360. 978-1-4673-8090-4. (1-4).

    https://ieeexplore.ieee.org/document/7169360/

  • Fornalczyk K, Napieralski P, Szajerman D, Wojciechowski A, Sztoch P and Wawrzyniak J. (2015). Stereoscopic image perception quality factors. 2015 MIXDES - 22nd International Conference "Mixed Design of Integrated Circuits & Systems". 10.1109/MIXDES.2015.7208495. 978-8-3635-7807-7. (129-133).

    https://ieeexplore.ieee.org/document/7208495/

  • Vogel J, Haddadin S, Jarosiewicz B, Simeral J, Bacher D, Hochberg L, Donoghue J and van der Smagt P. (2015). An assistive decision-and-control architecture for force-sensitive hand–arm systems driven by human–machine interfaces. The International Journal of Robotics Research. 10.1177/0278364914561535. 34:6. (763-780). Online publication date: 1-May-2015.

    https://journals.sagepub.com/doi/10.1177/0278364914561535

  • Duchowski A, House D, Gestring J, Wang R, Krejtz K, Krejtz I, Mantiuk R and Bazyluk B. Reducing visual discomfort of 3D stereoscopic displays with gaze-contingent depth-of-field. Proceedings of the ACM Symposium on Applied Perception. (39-46).

    https://doi.org/10.1145/2628257.2628259

  • Wang R, Pelfrey B, Duchowski A and House D. (2014). Online 3D Gaze Localization on Stereoscopic Displays. ACM Transactions on Applied Perception. 11:1. (1-21). Online publication date: 1-Apr-2014.

    https://doi.org/10.1145/2593689

  • Bernhard M, Dell'mour C, Hecher M, Stavrakis E and Wimmer M. The effects of fast disparity adjustment in gaze-controlled stereoscopic applications. Proceedings of the Symposium on Eye Tracking Research and Applications. (111-118).

    https://doi.org/10.1145/2578153.2578169

  • Duchowski A, House D, Gestring J, Congdon R, Świrski L, Dodgson N, Krejtz K and Krejtz I. Comparing estimated gaze depth in virtual and physical environments. Proceedings of the Symposium on Eye Tracking Research and Applications. (103-110).

    https://doi.org/10.1145/2578153.2578168

  • Hansen J, Alapetite A, MacKenzie I and Møllenbach E. The use of gaze to control drones. Proceedings of the Symposium on Eye Tracking Research and Applications. (27-34).

    https://doi.org/10.1145/2578153.2578156

  • Alt F, Schneegass S, Auda J, Rzayev R and Broy N. Using eye-tracking to support interaction with layered 3D interfaces on stereoscopic displays. Proceedings of the 19th international conference on Intelligent User Interfaces. (267-272).

    https://doi.org/10.1145/2557500.2557518

  • Bertrand J, Ebrahimi E, Wachter A, Luo J, Babu S, Duchowski A, Meehan N and Gramopadhye A. Visual attention to wayfinding aids in virtual environments. Proceedings of the 5th Joint Virtual Reality Conference. (9-16).

    https://dl.acm.org/doi/10.5555/2600262.2600265

  • Vogel J, Bayer J and van der Smagt P. (2013). Continuous robot control using surface electromyography of atrophic muscles. 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013). 10.1109/IROS.2013.6696449. 978-1-4673-6358-7. (845-850).

    https://ieeexplore.ieee.org/document/6696449/

  • Inoue T, Bounyong S, Kato Y and Ozawa J. (2013). Fixation distance estimation using vergence eye movement for automatic focusing glasses. 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 10.1109/EMBC.2013.6610590. 978-1-4577-0216-7. (4674-4677).

    https://ieeexplore.ieee.org/document/6610590/

  • Mantiuk R, Bazyluk B and Mantiuk R. (2013). Gaze-driven Object Tracking for Real Time Rendering. Computer Graphics Forum. 10.1111/cgf.12036. 32:2pt2. (163-173). Online publication date: 1-May-2013.

    https://onlinelibrary.wiley.com/doi/10.1111/cgf.12036

  • Wang R, Pelfrey B, Duchowski A and House D. Online Gaze Disparity via Binocular Eye Tracking on Stereoscopic Displays. Proceedings of the 2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission. (184-191).

    https://doi.org/10.1109/3DIMPVT.2012.37

  • Bulling A, Dachselt R, Duchowski A, Jacob R, Stellmach S and Sundstedt V. Gaze interaction in the post-WIMP world. CHI '12 Extended Abstracts on Human Factors in Computing Systems. (1221-1224).

    https://doi.org/10.1145/2212776.2212428

  • Ouzts A and Duchowski A. Comparison of eye movement metrics recorded at different sampling rates. Proceedings of the Symposium on Eye Tracking Research and Applications. (321-324).

    https://doi.org/10.1145/2168556.2168626