DOI: 10.1145/3649902.3655652
Enhancing User Gaze Prediction in Monitoring Tasks: The Role of Visual Highlights

Published: 04 June 2024

Abstract

This study examines the role of visual highlights in guiding user attention during drone-monitoring tasks, using a simulated monitoring interface. The experimental results show that such highlights significantly speed up the shift of visual attention to the highlighted area. Building on this observation, we leverage both the temporal and spatial information carried by a highlight to develop a new saliency model, the highlight-informed saliency model (HISM), which predicts how visual attention changes under the highlight condition. Our findings confirm the effectiveness of visual highlights in guiding user attention and demonstrate the potential of incorporating these cues into saliency prediction models.
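The abstract does not spell out how HISM combines the two cues. As one purely illustrative sketch (not the authors' implementation; every name and parameter below is an assumption), a baseline saliency map could be blended with a spatial prior centred on the highlighted region, whose weight decays with the time elapsed since the highlight appeared:

```python
import numpy as np

def highlight_informed_saliency(base_saliency, highlight_center, t_since_onset,
                                sigma=20.0, peak_weight=0.6, decay=0.5):
    """Hypothetical sketch only -- not the authors' HISM.

    Spatial cue: a Gaussian bump at the highlighted region.
    Temporal cue: the highlight's pull on attention decays with the
    time since its onset.
    """
    h, w = base_saliency.shape
    ys, xs = np.mgrid[0:h, 0:w]
    cx, cy = highlight_center
    # Spatial cue: Gaussian prior centred on the highlight.
    prior = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))
    # Temporal cue: influence is strongest right after the highlight appears.
    weight = peak_weight * np.exp(-decay * t_since_onset)
    return (1 - weight) * base_saliency + weight * prior
```

Under this toy model, predicted saliency at the highlighted location is high immediately after onset and relaxes back toward the baseline map as time passes, which matches the qualitative behaviour the abstract reports.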



Published In
    ETRA '24: Proceedings of the 2024 Symposium on Eye Tracking Research and Applications
    June 2024
    525 pages
    ISBN:9798400706073
    DOI:10.1145/3649902

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Eye Tracking
2. Human-computer interaction (HCI)
    3. Human-centered Computing
    4. Visualization

    Qualifiers

    • Abstract
    • Research
    • Refereed limited

    Funding Sources

    • DFG grant 389792660

    Conference

    ETRA '24

    Acceptance Rates

    Overall Acceptance Rate 69 of 137 submissions, 50%
