DOI: 10.1145/3148456.3148493
research-article

Eyejam: a gaze-controlled musical interface

Published: 03 November 2015

Abstract

Gaze-based interfaces use eye trackers as input devices to replace, for example, the keyboard and mouse of a conventional graphical user interface. For people with severe motor disabilities, gaze is one of the few alternatives for interacting with the world. The most common gaze interaction paradigm is dwell-time, in which keys are selected by fixations longer than a certain "dwell" threshold. Though simple, this mechanism does not permit selecting keys at precise moments, a requirement for playing music and certain video games. In this paper we introduce two gaze interaction techniques that overcome this limitation. The first is an extension of dwell-time in which selection is triggered when the gaze exits the focused key. The second is based on the Context Switching paradigm, which uses saccades for selection instead of fixations. Results of a user experiment show that the Context Switching paradigm permits more accurate timing control while playing music at 45 and 70 beats per minute.
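The abstract contrasts three trigger mechanisms: classic dwell-time, exit-triggered dwell, and Context Switching. This page gives no implementation details, so the Python sketch below is only an illustration of how the three triggers differ on a stream of timestamped gaze samples. Everything in it is an assumption: GazeSample, DWELL_MS, and the three generator functions are hypothetical names rather than the authors' code, and the context-switching logic follows the general paradigm named in the abstract, not EyeJam itself.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, Optional, Tuple

DWELL_MS = 450  # hypothetical dwell threshold; the abstract does not specify one


@dataclass
class GazeSample:
    t_ms: int                  # timestamp in milliseconds
    key: Optional[str] = None  # key under the gaze point, or None if off-keyboard
    ctx: int = 0               # which duplicated on-screen context the gaze is in


def dwell_selections(samples: Iterable[GazeSample]) -> Iterator[Tuple[int, str]]:
    """Classic dwell-time: fire (time, key) once the gaze has rested on a key
    for DWELL_MS. The onset is fixed by the threshold, so the player cannot
    choose the exact moment a note sounds."""
    focus: Optional[str] = None
    since = 0
    fired = False
    for s in samples:
        if s.key != focus:
            focus, since, fired = s.key, s.t_ms, False
        elif focus is not None and not fired and s.t_ms - since >= DWELL_MS:
            fired = True
            yield s.t_ms, focus


def exit_selections(samples: Iterable[GazeSample]) -> Iterator[Tuple[int, str]]:
    """Sketch of the paper's first technique (exit-triggered dwell): the dwell
    only arms the key; the note fires when the gaze leaves it, so the player
    times the onset with the exiting saccade."""
    focus: Optional[str] = None
    since = 0
    for s in samples:
        if s.key != focus:
            if focus is not None and s.t_ms - since >= DWELL_MS:
                yield s.t_ms, focus  # fire on exit, not when the dwell completes
            focus, since = s.key, s.t_ms


def context_switch_selections(samples: Iterable[GazeSample]) -> Iterator[Tuple[int, str]]:
    """Sketch of the second technique (Context Switching): the keyboard is
    duplicated in two contexts, and a saccade crossing from one context to the
    other selects the key the gaze departed from. No dwell threshold is
    involved, which is what allows precise timing."""
    last_key: Optional[str] = None
    last_ctx: Optional[int] = None
    for s in samples:
        if s.key is None:
            continue  # ignore samples that fall between the contexts
        if last_ctx is not None and s.ctx != last_ctx and last_key is not None:
            yield s.t_ms, last_key  # the cross-context saccade is the trigger
        last_key, last_ctx = s.key, s.ctx
```

In the dwell-time variant the note onset always lands DWELL_MS after the fixation begins; in the other two the onset coincides with a saccade the player initiates, which is the property the paper exploits for musical timing.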

Published In

IHC '15: Proceedings of the 14th Brazilian Symposium on Human Factors in Computing Systems
November 2015, 514 pages
ISBN: 9781450353625
DOI: 10.1145/3148456

Publisher

Association for Computing Machinery, New York, NY, United States

Author Tags

  1. accessibility
  2. gaze-based interaction
  3. musical interface

Conference

IHC 2015

Acceptance Rates

Overall Acceptance Rate: 331 of 973 submissions, 34%

Cited By

  • (2023) Kiroll: A Gaze-Based Instrument for Quadriplegic Musicians Based on the Context-Switching Paradigm. Proceedings of the 18th International Audio Mostly Conference, 59-62. DOI: 10.1145/3616195.3616225
  • (2023) DuoRhythmo: Design and remote user experience evaluation (UXE) of a collaborative accessible digital musical interface (CADMI) for people with ALS (PALS). Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3544548.3581285
  • (2023) A Case Study on Netychords: Crafting Accessible Digital Musical Instrument Interaction for a Special Needs Scenario. Computer-Human Interaction Research and Applications, 353-372. DOI: 10.1007/978-3-031-49425-3_22
  • (2020) Hands-Free Accessible Digital Musical Instruments: Conceptual Framework, Challenges, and Perspectives. IEEE Access 8, 163975-163995. DOI: 10.1109/ACCESS.2020.3019978
  • (2019) Dueto: Accessible, Gaze-Operated Musical Expression. Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility, 513-515. DOI: 10.1145/3308561.3354603
  • (2018) Playing music with the eyes through an isomorphic interface. Proceedings of the Workshop on Communication by Gaze Interaction, 1-5. DOI: 10.1145/3206343.3206350
