DOI: 10.1145/3206343.3206350

Playing music with the eyes through an isomorphic interface

Published: 15 June 2018

Abstract

Playing music with the eyes is a challenging task. In this paper, we propose a virtual digital musical instrument, usable by both motor-impaired and able-bodied people, controlled through an eye tracker and a "switch". Musically speaking, the layout of the graphical interface is isomorphic, since the harmonic relations between notes have the same geometrical shape regardless of the key signature of the music piece. Four main design principles guided our choices: (1) minimization of eye movements, especially in the case of large note intervals; (2) use of a grid layout whose "nodes" (keys) are connected to each other by segments (employed as guides for the gaze); (3) no need for smoothing filters or time thresholds; and (4) strategic use of color to facilitate gaze shifts. Preliminary tests, also involving another eye-controlled musical instrument, have shown that the developed system allows "correct" execution of music pieces, even those characterized by complex melodies.
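The isomorphism mentioned above can be illustrated with a minimal sketch (not the authors' implementation): on an isomorphic grid, each node's pitch is a linear function of its coordinates, so a given interval always corresponds to the same geometric offset, wherever it occurs on the grid. The step sizes and base note below are hypothetical choices for illustration only.

```python
RIGHT_STEP = 2   # semitones gained per column (hypothetical choice)
UP_STEP = 7      # semitones gained per row (hypothetical choice)
BASE_MIDI = 48   # MIDI note number of the origin node (C3)

def pitch(col, row):
    """MIDI pitch of the grid node at (col, row)."""
    return BASE_MIDI + RIGHT_STEP * col + UP_STEP * row

# The same offset yields the same interval anywhere on the grid,
# which is why a melody's shape is independent of the key signature:
assert pitch(3, 1) - pitch(2, 1) == pitch(5, 4) - pitch(4, 4) == 2
```

Because pitch is linear in (col, row), transposing a piece simply translates its gaze path on the grid without changing its shape.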




Published In

COGAIN '18: Proceedings of the Workshop on Communication by Gaze Interaction
June 2018, 69 pages
ISBN: 9781450357906
DOI: 10.1145/3206343
General Chairs: Carlos Morimoto, Thies Pfeiffer

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. eye tracking
  2. gaze communication
  3. musical isomorphic interface

Qualifiers

  • Short-paper

Conference

ETRA '18



