DOI: 10.1145/2857491.2888591

Hybrid model and appearance based eye tracking with Kinect

Published: 14 March 2016

Abstract

Existing gaze estimation methods rely mainly on either a 3D eye model or 2D eye appearance. While both approaches have proven effective across many fields and applications, they remain limited in practice, for example in building portable, non-intrusive systems and in tracking eye gaze robustly across different environments. To this end, we investigate combining an eye model with eye appearance to perform gaze estimation and eye gaze tracking. Specifically, unlike traditional 3D model based methods that rely on corneal reflections, we plan to recover 3D information from a depth sensor (e.g., Kinect). Kinect integrates the camera sensor and IR illumination into a single device, thus enabling more flexible system settings. We further propose to use appearance information to support the basic model based method: appearance information helps detect gaze related features (e.g., the pupil center) more reliably. In turn, the eye model and eye appearance can benefit each other to enable robust and accurate gaze estimation.
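The abstract only outlines the hybrid idea; as a rough illustration (not the authors' implementation), a model-based gaze ray can be formed from a depth-derived 3D eyeball center and an appearance-detected 3D pupil center. All function names and coordinate values below are hypothetical:

```python
import numpy as np

def estimate_gaze(eyeball_center, pupil_center):
    """Gaze direction as the unit vector from the 3D eyeball center
    (e.g., recovered from a depth sensor such as Kinect) through the
    3D pupil center (e.g., localized by an appearance-based detector).
    Both points are assumed to be in the same camera coordinate frame."""
    d = np.asarray(pupil_center, dtype=float) - np.asarray(eyeball_center, dtype=float)
    return d / np.linalg.norm(d)

# Hypothetical measurements in camera coordinates (metres): the pupil lies
# slightly closer to the camera than the eyeball center, so the gaze ray
# points along the negative z axis (toward the camera).
gaze = estimate_gaze([0.03, 0.02, 0.55], [0.03, 0.02, 0.538])
```

In a full system, the appearance-based pupil detection and the model-based geometry would feed back into each other, as the abstract suggests; this sketch shows only the final geometric step.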





      Published In

      ETRA '16: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications
      March 2016
      378 pages
ISBN: 9781450341257
DOI: 10.1145/2857491
      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. 3D eye model
      2. depth sensor
      3. eye appearance
      4. learning

      Qualifiers

      • Abstract

      Conference

      ETRA '16
      ETRA '16: 2016 Symposium on Eye Tracking Research and Applications
      March 14 - 17, 2016
Charleston, South Carolina

      Acceptance Rates

      Overall Acceptance Rate 69 of 137 submissions, 50%


      Article Metrics

• Downloads (last 12 months): 2
• Downloads (last 6 weeks): 1
      Reflects downloads up to 10 Dec 2024


      Cited By

• (2024) Wearable Near-Eye Tracking Technologies for Health: A Review. Bioengineering 11(7), 738. DOI: 10.3390/bioengineering11070738. Online publication date: 22-Jul-2024.
• (2024) Automatic Gaze Analysis: A Survey of Deep Learning Based Approaches. IEEE Transactions on Pattern Analysis and Machine Intelligence 46(1), 61-84. DOI: 10.1109/TPAMI.2023.3321337. Online publication date: Jan-2024.
• (2024) Robust Gaze Point Estimation for Metaverse With Common Mode Features Suppression Network. IEEE Transactions on Consumer Electronics 70(1), 2090-2098. DOI: 10.1109/TCE.2024.3351190. Online publication date: Feb-2024.
• (2019) Generalizing Eye Tracking With Bayesian Adversarial Learning. 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 11899-11908. DOI: 10.1109/CVPR.2019.01218. Online publication date: Jun-2019.
• (2019) Neuro-Inspired Eye Tracking With Eye Movement Dynamics. 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 9823-9832. DOI: 10.1109/CVPR.2019.01006. Online publication date: Jun-2019.
• (2018) Accurate Model-Based Point of Gaze Estimation on Mobile Devices. Vision 2(3), 35. DOI: 10.3390/vision2030035. Online publication date: 24-Aug-2018.
