Work in Progress

GazeSync: Eye Movement Transfer Using an Optical Eye Tracker and Monochrome Liquid Crystal Displays

Published: 22 March 2022

Abstract

Can we see the world through the eyes of somebody else? We present early work on transferring eye gaze from one person to another. Imagine following the eye gaze of an instructor as they explain a complex work step, or experiencing a painting as an expert would: your gaze is directed to the important parts and follows the appropriate steps. In this work, we explore the possibility of transmitting eye-gaze information in a subtle, unobtrusive fashion between two individuals. We present an early prototype consisting of an optical eye tracker for the leader (the person who shares their eye gaze) and two monochrome see-through displays for the follower (the person who follows the leader's eye gaze). We report the results of an initial user test and discuss future work.
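The abstract implies a leader-to-follower relay: normalized gaze points from the leader's optical tracker must be mapped onto the follower's see-through display panels. As a minimal sketch of that mapping step only, assuming a tracker that reports normalized coordinates in [0, 1] with a bottom-left origin (the function name, panel size, and coordinate conventions are illustrative assumptions, not the authors' implementation):

```python
def gaze_to_display(gaze_xy, display_w, display_h):
    """Map a normalized gaze point (0..1, origin bottom-left, as many
    trackers report) to integer pixel coordinates (origin top-left)."""
    x, y = gaze_xy
    # Clamp to the valid range in case the tracker reports slight overshoot.
    x = min(max(x, 0.0), 1.0)
    y = min(max(y, 0.0), 1.0)
    px = round(x * (display_w - 1))
    py = round((1.0 - y) * (display_h - 1))  # flip the vertical axis
    return px, py

# Example: gaze at the tracker's center lands at the center of a
# hypothetical 64x32 monochrome panel.
print(gaze_to_display((0.5, 0.5), 64, 32))
```

In a real pipeline this mapping would sit between the tracker's streaming API on the leader's side and the display driver on the follower's side; the clamping guards against the slight out-of-range samples eye trackers commonly emit near the edges of the calibrated area.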


Cited By

  • (2024) Mapping Gaze and Head Movement via Salience Modulation and Hanger Reflex. Adjunct Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1–3. https://doi.org/10.1145/3672539.3686349. Online publication date: 13 Oct 2024.
  • (2022) Seeing our Blind Spots: Smart Glasses-based Simulation to Increase Design Students' Awareness of Visual Impairment. Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, 1–14. https://doi.org/10.1145/3526113.3545687. Online publication date: 29 Oct 2022.
  • (2022) Low Cost Real-time Eye Tracking System for Motorsports. 2022 29th IEEE International Conference on Electronics, Circuits and Systems (ICECS), 1–4. https://doi.org/10.1109/ICECS202256217.2022.9970888. Online publication date: 24 Oct 2022.



    Published In

    IUI '22 Companion: Companion Proceedings of the 27th International Conference on Intelligent User Interfaces
    March 2022
142 pages
ISBN: 978-1-4503-9145-0
DOI: 10.1145/3490100
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. eye tracking
    2. gaze communication
    3. gaze detection
    4. gaze synchronization

    Qualifiers

    • Work in progress
    • Research
    • Refereed limited

    Funding Sources

    • JPMJSP2123

    Conference

    IUI '22

    Acceptance Rates

    Overall Acceptance Rate 746 of 2,811 submissions, 27%

    Article Metrics

• Downloads (last 12 months): 29
• Downloads (last 6 weeks): 3
    Reflects downloads up to 11 Dec 2024
