DOI: 10.1145/3208031.3208032

The art of pervasive eye tracking: unconstrained eye tracking in the Austrian Gallery Belvedere

Published: 15 June 2018

Abstract

Pervasive mobile eye tracking provides a rich data source for investigating natural human behavior with a high degree of ecological validity. However, the challenges and limitations intrinsic to unconstrained mobile eye tracking make its development and usage, to some extent, an art. Nonetheless, researchers are pushing the boundaries of this technology to assess museum visitors' attention not only between the exhibited works but also within particular pieces, providing significantly more detailed insights than traditional timing-and-tracking or external-observer approaches. In this paper, we present in detail the eye-tracking system developed for a large-scale, fully unconstrained study in the Austrian Gallery Belvedere, providing useful information for eye-tracking system designers. Furthermore, we describe the study and report on usability and real-time performance metrics. Our results suggest that, although the system is comfortable enough, further eye-tracker improvements are necessary to make it less conspicuous. Additionally, for the majority of users, real-time accuracy already suffices for simple applications such as audio guides, even in the absence of eye-tracker slippage compensation.




Published In

PETMEI '18: Proceedings of the 7th Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction
June 2018
50 pages
ISBN:9781450357890
DOI:10.1145/3208031

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. calibration
  2. embedded
  3. eye tracking
  4. gaze estimation
  5. mobile
  6. pervasive
  7. pupil detection
  8. pupil tracking
  9. real-time
  10. system

Qualifiers

  • Research-article

Conference

ETRA '18

Article Metrics

  • Downloads (Last 12 months)83
  • Downloads (Last 6 weeks)3
Reflects downloads up to 18 Dec 2024


Cited By

  • (2024) Gaze-based detection of mind wandering during audio-guided panorama viewing. Scientific Reports 14:1. DOI: 10.1038/s41598-024-79172-x. Online publication date: 14-Nov-2024.
  • (2023) Noise estimation for head-mounted 3D binocular eye tracking using Pupil Core eye-tracking goggles. Behavior Research Methods 56:1 (53-79). DOI: 10.3758/s13428-023-02150-0. Online publication date: 27-Jun-2023.
  • (2023) How Do We Move in Front of Art? How Does This Relate to Art Experience? Linking Movement, Eye Tracking, Emotion, and Evaluations in a Gallery-Like Setting. Empirical Studies of the Arts 42:1 (86-146). DOI: 10.1177/02762374231160000. Online publication date: 13-Mar-2023.
  • (2023) Comparing the Perception of In-Person and Digital Monitor Viewing of Paintings. Empirical Studies of the Arts 41:2 (465-496). DOI: 10.1177/02762374231158520. Online publication date: 27-Feb-2023.
  • (2023) Feeling the Temperature of the Room. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7:1 (1-21). DOI: 10.1145/3580820. Online publication date: 28-Mar-2023.
  • (2022) The Open Gallery for Arts Research (OGAR): An open-source tool for studying the psychology of virtual art museum visits. Behavior Research Methods 55:2 (824-842). DOI: 10.3758/s13428-022-01857-w. Online publication date: 25-Apr-2022.
  • (2022) How do people distribute their attention while observing The Night Watch? Perception 51:11 (763-788). DOI: 10.1177/03010066221122697. Online publication date: 29-Sep-2022.
  • (2022) Two-Step Gaze Guidance. Proceedings of the 2022 International Conference on Multimodal Interaction (299-309). DOI: 10.1145/3536221.3556612. Online publication date: 7-Nov-2022.
  • (2022) Limitations and ethical reflection on the application of big data in museum visitor research. Museum Management and Curatorship 38:4 (416-427). DOI: 10.1080/09647775.2022.2111333. Online publication date: 17-Aug-2022.
  • (2022) Art Perception and Power. A Plea for Relational Sociological Aesthetics. Arts and Power (87-102). DOI: 10.1007/978-3-658-37429-7_5. Online publication date: 6-Sep-2022.
