DOI: 10.1145/2857491.2857525
ETRA Conference Proceedings · Short paper

Gaze-based moving target acquisition in real-time full motion video

Published: 14 March 2016

Abstract

Real-time moving target acquisition in full motion video is a challenging task. Mouse input can fail when targets move fast, move unpredictably, or are visible only for a short period of time. In this paper, we describe an experiment with expert video analysts (N=26) who performed moving target acquisition by selecting targets in a full motion video sequence presented on a desktop computer. The results show that with gaze input (gaze pointing plus manual key press), participants achieved significantly shorter completion times than with mouse input. Error rates (represented by target misses) and acquisition precision were similar. Subjective ratings of user satisfaction were similar or even better for the gaze interaction.
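
To make the gaze pointing plus manual key press technique concrete, the minimal Python sketch below shows one plausible way such a selection could be resolved: when the confirming key is pressed, the moving target nearest the current gaze estimate is chosen, provided it lies within a small tolerance radius that absorbs eye-tracker noise. All names, the 60-pixel radius, and the nearest-target rule are illustrative assumptions; the paper does not specify its selection logic in the abstract.

    # Illustrative sketch only; not the authors' implementation.
    # Gaze coordinates would come from an eye tracker; here they are plain numbers.
    import math
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Target:
        name: str
        x: float  # current on-screen position in pixels
        y: float

    def select_on_keypress(gaze_x: float, gaze_y: float,
                           targets: List[Target],
                           radius_px: float = 60.0) -> Optional[Target]:
        """On the confirming key press, return the target closest to the current
        gaze estimate, but only if it lies within a tolerance radius chosen to
        absorb eye-tracker noise; otherwise the attempt counts as a miss (None)."""
        best, best_dist = None, float("inf")
        for t in targets:
            d = math.hypot(t.x - gaze_x, t.y - gaze_y)
            if d < best_dist:
                best, best_dist = t, d
        return best if best_dist <= radius_px else None

    # Example: two moving vehicles; the key press occurs while gaze rests near "car-2".
    vehicles = [Target("car-1", 120.0, 340.0), Target("car-2", 610.0, 385.0)]
    print(select_on_keypress(598.0, 392.0, vehicles))  # -> Target(name='car-2', ...)

A tolerance radius of this kind trades acquisition precision against target misses, the two error measures reported in the abstract.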

Information

Published In

ETRA '16: Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications
March 2016
378 pages
ISBN:9781450341257
DOI:10.1145/2857491
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 14 March 2016

Author Tags

  1. evaluation
  2. expert video analysis
  3. gaze input
  4. input device
  5. moving target acquisition
  6. user study

Qualifiers

  • Short-paper

Conference

ETRA '16
ETRA '16: 2016 Symposium on Eye Tracking Research and Applications
March 14 - 17, 2016
Charleston, South Carolina

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%

Cited By

  • (2023) Pilot Study on Interaction with Wide Area Motion Imagery Comparing Gaze Input and Mouse Input. Human Interface and the Management of Information, 352-369. DOI: 10.1007/978-3-031-35132-7_26. Online publication date: 9-Jul-2023
  • (2022) Feasibility of Longitudinal Eye-Gaze Tracking in the Workplace. Proceedings of the ACM on Human-Computer Interaction 6(ETRA), 1-21. DOI: 10.1145/3530889. Online publication date: 13-May-2022
  • (2022) Automated gaze-based mind wandering detection during computerized learning in classrooms. User Modeling and User-Adapted Interaction 29(4), 821-867. DOI: 10.1007/s11257-019-09228-5. Online publication date: 11-Mar-2022
  • (2022) Gaze-Enhanced User Interface for Real-Time Video Surveillance. HCI International 2022 – Late Breaking Posters, 46-53. DOI: 10.1007/978-3-031-19679-9_7. Online publication date: 24-Nov-2022
  • (2020) A Survey of Digital Eye Strain in Gaze-Based Interactive Systems. ACM Symposium on Eye Tracking Research and Applications, 1-12. DOI: 10.1145/3379155.3391313. Online publication date: 2-Jun-2020
  • (2018) Evaluating User Interfaces Supporting Change Detection in Aerial Images and Aerial Image Sequences. Human Interface and the Management of Information. Information in Applications and Services, 383-402. DOI: 10.1007/978-3-319-92046-7_33. Online publication date: 7-Jun-2018
  • (2018) A Pilot Study on Gaze-Based Control of a Virtual Camera Using 360°-Video Data. Engineering Psychology and Cognitive Ergonomics, 419-428. DOI: 10.1007/978-3-319-91122-9_34. Online publication date: 15-Jul-2018
  • (2016) Intervention-free selection using EEG and eye tracking. Proceedings of the 18th ACM International Conference on Multimodal Interaction, 153-160. DOI: 10.1145/2993148.2993199. Online publication date: 31-Oct-2016
  • (2016) Interacting with target tracking algorithms in a gaze-enhanced motion video analysis system. Geospatial Informatics, Fusion, and Motion Video Analytics VI, 98410K. DOI: 10.1117/12.2223726. Online publication date: 13-May-2016
