Research article
DOI: 10.1145/1394281.1394288

A psychophysical study of fixation behavior in a computer game

Published: 09 August 2008

Abstract

Prediction of gaze behavior in gaming environments can be a tremendously useful asset to game designers, enabling them to improve gameplay, selectively increase visual fidelity, and optimize the distribution of computing resources. The use of saliency maps is currently being advocated as the method of choice for predicting visual attention, crucially under the assumption that no specific task is present. This is achieved by analyzing images for low-level features such as motion, contrast, luminance, etc. However, the majority of computer games are designed to be easily understood and pose a task readily apparent to most players. Our psychophysical experiment shows that in a task-oriented context such as gaming, the predictive power of saliency maps at design time can be weak. Thus, we argue that a more involved protocol utilizing eye tracking, as part of the computer game design cycle, can be sufficiently robust to succeed in predicting fixation behavior of players.
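To make the saliency-map approach discussed above concrete, the following is a minimal illustrative sketch of a bottom-up, Itti-Koch-style conspicuity computation: a luminance channel is filtered at several center-surround scale pairs, the absolute differences are summed, and the result is normalized. It assumes NumPy and SciPy; the function name, the scale choices, and the use of a single luminance feature are illustrative assumptions, not the model evaluated in the paper.

import numpy as np
from scipy.ndimage import gaussian_filter

def saliency_map(rgb):
    """Hypothetical bottom-up saliency sketch for an HxWx3 float image."""
    # Luminance channel (simple mean of RGB; a weighted sum would also do).
    intensity = rgb.mean(axis=2)

    # Center-surround differences at a few scales approximate local contrast.
    conspicuity = np.zeros_like(intensity)
    for center_sigma, surround_sigma in [(1, 4), (2, 8), (4, 16)]:
        center = gaussian_filter(intensity, center_sigma)
        surround = gaussian_filter(intensity, surround_sigma)
        conspicuity += np.abs(center - surround)

    # Normalize to [0, 1] so maps from other feature channels could be combined.
    conspicuity -= conspicuity.min()
    if conspicuity.max() > 0:
        conspicuity /= conspicuity.max()
    return conspicuity

# Example: predict a fixation as the location of maximum saliency.
frame = np.random.rand(240, 320, 3)   # stand-in for a rendered game frame
s = saliency_map(frame)
y, x = np.unravel_index(np.argmax(s), s.shape)
print(f"most salient pixel: ({x}, {y})")

A designer could compare the peak of such a map against recorded fixations; the experiment reported here suggests that in task-driven gameplay this purely bottom-up prediction is often weak, which is why the authors argue for eye tracking within the design cycle.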




    Published In

APGV '08: Proceedings of the 5th Symposium on Applied Perception in Graphics and Visualization
August 2008
209 pages
ISBN: 9781595939814
DOI: 10.1145/1394281

    Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. electronic games
    2. eye movements
    3. eye tracking
    4. psychophysics
    5. saliency
    6. visual attention

    Qualifiers

    • Research-article

    Conference

    APGV08

    Acceptance Rates

    Overall Acceptance Rate 19 of 33 submissions, 58%


    Cited By

    • (2023) NPF-200: A Multi-Modal Eye Fixation Dataset and Method for Non-Photorealistic Videos. Proceedings of the 31st ACM International Conference on Multimedia, 2294-2304. DOI: 10.1145/3581783.3611839. Online publication date: 26-Oct-2023
    • (2022) Advancing dignity for adaptive wheelchair users via a hybrid eye tracking and electromyography training game. 2022 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3517031.3529612. Online publication date: 8-Jun-2022
    • (2022) The task-attention theory of game learning: a theory and research agenda. Human–Computer Interaction 39(5-6), 257-287. DOI: 10.1080/07370024.2022.2047971. Online publication date: 19-Apr-2022
    • (2021) Statistical modeling of dynamic eye-tracking experiments: Relative importance of visual stimulus elements for gaze behavior in the multi-group case. Behavior Research Methods 53(6), 2650-2667. DOI: 10.3758/s13428-021-01576-8. Online publication date: 23-May-2021
    • (2021) Emulating Foveated Path Tracing. Proceedings of the 14th ACM SIGGRAPH Conference on Motion, Interaction and Games, 1-9. DOI: 10.1145/3487983.3488295. Online publication date: 10-Nov-2021
    • (2020) Nothing else matters: Video games create sustained attentional selection away from task-irrelevant features. Attention, Perception, & Psychophysics 82(8), 3907-3919. DOI: 10.3758/s13414-020-02122-y. Online publication date: 11-Sep-2020
    • (2020) To Add or Not to Add Game Elements? Exploring the Effects of Different Cognitive Task Designs Using Eye Tracking. IEEE Transactions on Learning Technologies 13(4), 847-860. DOI: 10.1109/TLT.2020.3031644. Online publication date: 1-Oct-2020
    • (2018) A group-based approach for gaze behavior of virtual crowds incorporating personalities. Computer Animation and Virtual Worlds 29(5). DOI: 10.1002/cav.1806. Online publication date: 17-Apr-2018
    • (2016) Gaze analysis of BRDF distortions. Proceedings of the Eurographics 2016 Workshop on Material Appearance Modeling, 23-26. DOI: 10.5555/3061347.3061356. Online publication date: 22-Jun-2016
    • (2016) A visualisation course in a game development curriculum. Proceedings of the 37th Annual Conference of the European Association for Computer Graphics: Education Papers, 9-16. DOI: 10.5555/3059068.3059070. Online publication date: 9-May-2016
    • Show More Cited By
