DOI: 10.1145/3430524.3440641

Surface Electromyography for Sensing Performance Intention and Musical Imagery in Vocalists

Published: 14 February 2021

Abstract

Through experience, the techniques used by professional vocalists become highly ingrained, and much of the fine muscular control needed for healthy singing is executed through well-refined mental imagery. In this paper, we provide a method for observing intention and embodied practice using surface electromyography (sEMG) to detect muscular activation, in particular in the laryngeal muscles. Because it senses the electrical neural impulses that cause muscular contraction, sEMG provides a unique measurement of user intention, whereas other sensors reflect only the results of movement. In this way, we are able to measure preparatory movement, vocalised singing, and the use of imagery during mental rehearsal, where no sound is produced. We present a circuit developed for use with the low-voltage activations of the laryngeal muscles; by sonifying these activations, we further provide feedback that lets vocalists investigate and experiment with their own intuitive movements and intentions in creative vocal practice.
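The pipeline the abstract describes, sensing weak muscular activations and sonifying them as feedback, can be illustrated with a minimal signal-processing sketch. This is an illustrative assumption, not the authors' circuit or software: it rectifies a raw sEMG-like signal, smooths it with a single-pole low-pass filter to estimate activation level, and maps that envelope to an oscillator frequency for sonification.

```python
import numpy as np

def emg_envelope(samples, fs=1000.0, cutoff_hz=5.0):
    """Estimate muscular activation: full-wave rectify, then
    single-pole IIR low-pass (y[n] = y[n-1] + alpha*(x[n] - y[n-1]))."""
    rectified = np.abs(samples)
    alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / fs)
    env = np.empty_like(rectified)
    acc = 0.0
    for i, x in enumerate(rectified):
        acc += alpha * (x - acc)
        env[i] = acc
    return env

def envelope_to_pitch(env, f_lo=110.0, f_hi=880.0):
    """Map the normalised envelope onto a frequency range for a
    sonification oscillator (louder activation -> higher pitch)."""
    norm = env / max(env.max(), 1e-12)
    return f_lo + norm * (f_hi - f_lo)

# Synthetic stand-in for an sEMG recording: low-level noise with a
# raised-amplitude middle section representing a muscular activation.
fs = 1000.0
sig = np.random.default_rng(0).normal(0.0, 0.1, 3000)
sig[1000:2000] *= 5.0

env = emg_envelope(sig, fs)
freqs = envelope_to_pitch(env)
```

In a real-time system the same envelope follower would run sample-by-sample on the amplified electrode signal and drive the oscillator directly; the cutoff and frequency range here are arbitrary placeholders.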




Published In

TEI '21: Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction
February 2021, 908 pages
ISBN: 9781450382137
DOI: 10.1145/3430524

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. Biofeedback
      2. Electromyography
      3. First-Person Perspectives
      4. Mental Imagery
      5. Performer Intention

      Qualifiers

      • Research-article
      • Research
      • Refereed limited

      Funding Sources

      • EPSRC

      Conference

      TEI '21

      Acceptance Rates

TEI '21 Paper Acceptance Rate: 40 of 136 submissions, 29%
Overall Acceptance Rate: 393 of 1,367 submissions, 29%


Cited By

• (2024) Shifting Ambiguity, Collapsing Indeterminacy: Designing with Data as Baradian Apparatus. ACM Transactions on Computer-Human Interaction 31, 6 (2024), 1–41. https://doi.org/10.1145/3689043
• (2024) Sonic Entanglements with Electromyography: Between Bodies, Signals, and Representations. In Proc. 2024 ACM Designing Interactive Systems Conference (DIS '24), 2691–2707. https://doi.org/10.1145/3643834.3661572
• (2024) Liminal Space: A Performance with RaveNET. In Proc. Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '24), 1–4. https://doi.org/10.1145/3623509.3635337
• (2024) RaveNET: Connecting People and Exploring Liminal Space through Wearable Networks in Music Performance. In Proc. Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '24), 1–8. https://doi.org/10.1145/3623509.3635270
• (2024) Enhancing Vocal Performance using Variational Onsager Neural Network and Optimized with Golden Search Optimization Algorithm. Applied Artificial Intelligence 38, 1 (2024). https://doi.org/10.1080/08839514.2024.2340389
• (2023) The Body as Sound: Unpacking Vocal Embodiment through Auditory Biofeedback. In Proc. Seventeenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '23), 1–15. https://doi.org/10.1145/3569009.3572738
• (2023) Corsetto. In Proc. 2023 CHI Conference on Human Factors in Computing Systems (CHI '23), 1–23. https://doi.org/10.1145/3544548.3581294
• (2023) Haptic Servos: Self-Contained Vibrotactile Rendering System for Creating or Augmenting Material Experiences. In Proc. 2023 CHI Conference on Human Factors in Computing Systems (CHI '23), 1–17. https://doi.org/10.1145/3544548.3580716
• (2023) A Guide to Evaluating the Experience of Media and Arts Technology. In Creating Digitally, 267–300. https://doi.org/10.1007/978-3-031-31360-8_10
• (2022) Singing Knit: Soft Knit Biosensing for Augmenting Vocal Performances. In Proc. Augmented Humans International Conference 2022 (AHs '22), 170–183. https://doi.org/10.1145/3519391.3519412
