DOI: 10.1145/3343036.3343130

Perceptual Comparison of Procedural and Data-Driven Eye Motion Jitter

Published: 19 September 2019

Abstract

Research has shown that keyframed eye motions are perceived as more realistic when some noise is added to eyeball motions and to pupil size changes. We investigate whether this noise, rather than being motion captured, can be synthesized with standard techniques such as procedural or data-driven approaches. In a two-alternative forced-choice task, we compare eye animations created with four techniques: motion-captured, procedural, data-driven, and keyframed (lacking noise). Our perceptual experiment uses three character models with different levels of realism and two motions. Our results suggest that procedural and data-driven noise can produce animations with perceived naturalness similar to that of our motion-captured approach. Participants' eye movements while viewing the animations show that animations without jitter yielded fewer fixations, suggesting they were more easily dismissed as unnatural.
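
To make the procedural approach concrete, the following is a minimal sketch (not taken from the paper) of one standard way to synthesize such jitter: shaping white noise into 1/f (pink) noise and adding it to eye rotation and pupil size. The function name pink_noise and the amplitudes and sampling rate below are illustrative assumptions, not values from the study.

    import numpy as np

    def pink_noise(n_samples, rng=None):
        # Shape white Gaussian noise in the frequency domain so its power
        # spectrum falls off as 1/f (amplitude as 1/sqrt(f)).
        rng = np.random.default_rng() if rng is None else rng
        spectrum = np.fft.rfft(rng.standard_normal(n_samples))
        freqs = np.fft.rfftfreq(n_samples)
        freqs[0] = freqs[1]                 # avoid dividing by zero at the DC bin
        noise = np.fft.irfft(spectrum / np.sqrt(freqs), n_samples)
        return noise / noise.std()          # normalize to unit standard deviation

    # Hypothetical usage: jitter a 5-second fixation sampled at 60 Hz.
    fps, seconds = 60, 5
    n = fps * seconds
    gaze_jitter_deg = 0.1 * pink_noise(n)       # assumed jitter amplitude in degrees
    pupil_size_mm = 3.5 + 0.05 * pink_noise(n)  # assumed baseline and noise amplitude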


Cited By

(2022) "Is the avatar scared? Pupil as a perceptual cue." Computer Animation and Virtual Worlds 33(2). DOI: 10.1002/cav.2040. Online publication date: 3 February 2022.

Published In

SAP '19: ACM Symposium on Applied Perception 2019
September 2019, 188 pages

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

1. Character animation
2. Eye motion
3. Perceptual evaluation

Qualifiers

• Short-paper
• Research
• Refereed limited

Conference

SAP '19: ACM Symposium on Applied Perception 2019
September 19-20, 2019
Barcelona, Spain

Acceptance Rates

Overall Acceptance Rate: 43 of 94 submissions (46%)
