
Synthetic Gaze Data Augmentation for Improved User Calibration

  • Conference paper
  • First Online:
Pattern Recognition. ICPR International Workshops and Challenges (ICPR 2021)

Abstract

In this paper, we focus on the calibration possibilities of a deep-learning-based gaze estimation process applying transfer learning, comparing its performance when the pretrained model uses a general dataset versus a gaze-specific dataset. Subject calibration has been shown to improve gaze accuracy in high-performance eye trackers. Hence, we investigate the potential of a deep learning gaze estimation model for subject calibration using fine-tuning procedures. A pretrained ResNet-18 network, which performs well in many computer vision tasks, is fine-tuned with user-specific data in a few-shot adaptive gaze estimation approach. We study the impact of pretraining the model with a synthetic dataset, U2Eyes, before addressing gaze estimation calibration on a real dataset, I2Head. The results show that the success of individual calibration largely depends on the balance between fine-tuning and standard supervised learning procedures, and that pretraining the model on a gaze-specific dataset improves accuracy when few images are available for calibration. This paper shows that calibration is feasible in low-resolution scenarios, providing outstanding accuracies below 1.5\(^\circ \) of error.


Notes

  1. https://github.com/GonzaloGardeL/Synthetic-gaze-data-augmentation-for-improved-user-calibration.



Author information

Correspondence to Gonzalo Garde, Andoni Larumbe-Bergera, Sonia Porta, Rafael Cabeza or Arantxa Villanueva.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Garde, G., Larumbe-Bergera, A., Porta, S., Cabeza, R., Villanueva, A. (2021). Synthetic Gaze Data Augmentation for Improved User Calibration. In: Del Bimbo, A., et al. Pattern Recognition. ICPR International Workshops and Challenges. ICPR 2021. Lecture Notes in Computer Science, vol 12663. Springer, Cham. https://doi.org/10.1007/978-3-030-68796-0_27

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-68796-0_27

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-68795-3

  • Online ISBN: 978-3-030-68796-0

  • eBook Packages: Computer Science, Computer Science (R0)
