DOI:10.1145/3308561.3354603
poster

Dueto: Accessible, Gaze-Operated Musical Expression

Published: 24 October 2019

Abstract

Gaze-tracking technologies can enable computer access for users who are unable to use standard input devices. However, using gaze as input poses challenges for interactions that require visual planning, such as playing a digital instrument. We explore how multimodality can support eye-controlled musical expression by designing different multimodal gaze interactions around a digital instrument we call Dueto. We tackle three design goals: creating an instrument that is explorable, easy to learn, and offers control over its musical features. We showcase three multimodal interactions for music playing: eye gaze only, gaze + switch, and gaze + partner mode.
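The three modes differ mainly in what confirms a selection once gaze has targeted a note. As a rough sketch of that distinction, assuming hypothetical read_gaze() and read_switch() callbacks, a stand-in Note region type, and an arbitrary 600 ms dwell threshold (none of which come from the paper), the two single-user modes could look like this:

```python
import time
from dataclasses import dataclass

DWELL_TIME = 0.6  # assumed dwell threshold in seconds, not a value from the paper


@dataclass
class Note:
    """A note mapped to a rectangular on-screen region (hypothetical layout)."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


def note_under_gaze(notes, gaze_x, gaze_y):
    """Return the note whose region contains the gaze point, or None."""
    for note in notes:
        if note.contains(gaze_x, gaze_y):
            return note
    return None


def select_note(notes, read_gaze, read_switch, mode="gaze_only"):
    """One selection cycle for two of the modes named in the abstract.

    read_gaze()   -> (x, y) current gaze point  (hypothetical tracker API)
    read_switch() -> True while a switch is held (hypothetical switch API)

    "Gaze + partner mode" would follow the gaze_switch branch, with the
    confirming input supplied by a partner rather than a switch.
    """
    fixated = None        # note currently under sustained gaze, if any
    fixation_start = 0.0  # when the current fixation began
    while True:
        target = note_under_gaze(notes, *read_gaze())
        if mode == "gaze_only":
            # Dwell selection: holding gaze on one note long enough plays it.
            if target is not fixated:
                fixated, fixation_start = target, time.monotonic()
            elif target is not None and time.monotonic() - fixation_start >= DWELL_TIME:
                return target
        elif mode == "gaze_switch":
            # The eyes target, the switch confirms: no dwell timeout, so
            # scanning the interface never plays notes by accident.
            if target is not None and read_switch():
                return target
        time.sleep(0.01)  # poll at roughly 100 Hz
```

The trade-off the abstract's design goals point at is visible here: dwell selection needs no device beyond the tracker but conflates looking with playing, which works against free exploration, while a switch or partner separates targeting from triggering.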




Published In

ASSETS '19: Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility
October 2019
730 pages
ISBN:9781450366762
DOI:10.1145/3308561
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. accessibility
  2. eye tracking
  3. gaze input
  4. motor impairments
  5. multimodal interaction
  6. musical interfaces

Qualifiers

  • Poster

Conference

ASSETS '19

Acceptance Rates

ASSETS '19 Paper Acceptance Rate: 41 of 158 submissions, 26%
Overall Acceptance Rate: 436 of 1,556 submissions, 28%



Article Metrics

  • Downloads (Last 12 months): 11
  • Downloads (Last 6 weeks): 0
Reflects downloads up to 22 Dec 2024

Cited By

  • (2024) Eye-tracking digital music creation and performance: disability and ableism. International Journal of Performance Arts and Digital Media, 1-17. https://doi.org/10.1080/14794713.2024.2329829. Online publication date: 29-Mar-2024
  • (2024) Leyenes. International Journal of Human-Computer Studies 184:C. https://doi.org/10.1016/j.ijhcs.2023.103204. Online publication date: 1-Apr-2024
  • (2023) Gaze-Based Human–Computer Interaction for Museums and Exhibitions: Technologies, Applications and Future Perspectives. Electronics 12(14), 3064. https://doi.org/10.3390/electronics12143064. Online publication date: 13-Jul-2023
  • (2023) Music Segmentation and Similarity Estimation Applied to a Gaze-Controlled Musical Interface. Revista Vórtex 11(1), 1-25. https://doi.org/10.33871/23179937.2023.11.1.7068. Online publication date: 2-May-2023
  • (2022) Designing Gestures for Digital Musical Instruments: Gesture Elicitation Study with Deaf and Hard of Hearing People. Proceedings of the 24th International ACM SIGACCESS Conference on Computers and Accessibility, 1-8. https://doi.org/10.1145/3517428.3544828. Online publication date: 23-Oct-2022
  • (2021) A gaze-based interactive system to explore artwork imagery. Journal on Multimodal User Interfaces 16(1), 55-67. https://doi.org/10.1007/s12193-021-00373-z. Online publication date: 21-May-2021
