DOI: 10.1145/3462244.3479946

Towards Sound Accessibility in Virtual Reality

Published: 18 October 2021

Abstract

Virtual reality (VR) leverages the senses of sight, hearing, and touch to convey virtual experiences. For d/Deaf and hard of hearing (DHH) people, however, information conveyed through sound may not be accessible. While prior work has explored making everyday sounds accessible to DHH users, the context of VR is, as yet, unexplored. In this paper, we provide a first comprehensive investigation of sound accessibility in VR. Our primary contributions include a design space for developing visual and haptic substitutes of VR sounds to support DHH users, and prototypes illustrating several points within the design space. We also characterize sound accessibility in commonly used VR apps and discuss findings from early evaluations of our prototypes with 11 DHH users and 4 VR developers.
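To make the idea of a visual substitute for a VR sound concrete, the sketch below shows one minimal way such a substitution could be wired up. It is purely illustrative and is not taken from the paper's prototypes: it assumes a Unity-based VR scene, and the component and field names (SoundSubstituteIndicator, visualIndicator) are hypothetical.

```csharp
using UnityEngine;

// Illustrative only: a hypothetical component that mirrors an AudioSource's
// activity with a simple visual cue. This is NOT the paper's prototype; the
// component and field names are assumptions made for this sketch.
[RequireComponent(typeof(AudioSource))]
public class SoundSubstituteIndicator : MonoBehaviour
{
    // An icon, halo, or other visual placed near the sound-emitting object.
    public GameObject visualIndicator;

    private AudioSource source;
    private readonly float[] samples = new float[256];

    void Awake()
    {
        source = GetComponent<AudioSource>();
    }

    void Update()
    {
        // Show the visual substitute only while the source is audible.
        bool playing = source.isPlaying;
        visualIndicator.SetActive(playing);

        if (!playing) return;

        // Scale the indicator with a rough loudness estimate (RMS of the
        // current output buffer), so louder sounds produce a larger cue.
        source.GetOutputData(samples, 0);
        float sumOfSquares = 0f;
        foreach (float s in samples) sumOfSquares += s * s;
        float rms = Mathf.Sqrt(sumOfSquares / samples.Length);
        visualIndicator.transform.localScale = Vector3.one * (0.1f + rms);
    }
}
```

A haptic substitute could be wired up analogously, for example by routing the same loudness estimate to a controller or wearable actuator instead of (or in addition to) a visual object.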

Supplementary Material

MP4 File (1216_Jain.mp4)
ICMI 2021 Talk for the paper: "Towards Sound Accessibility in Virtual Reality"






Published In

ICMI '21: Proceedings of the 2021 International Conference on Multimodal Interaction
October 2021
876 pages
ISBN: 9781450384810
DOI: 10.1145/3462244
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 18 October 2021


Author Tags

  1. Deaf
  2. accessibility
  3. haptic
  4. hard of hearing
  5. sound
  6. virtual reality
  7. visualization

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ICMI '21
Sponsor: ICMI '21: International Conference on Multimodal Interaction
October 18-22, 2021
Montréal, QC, Canada

Acceptance Rates

Overall Acceptance Rate 453 of 1,080 submissions, 42%

Contributors

Bibliometrics & Citations

Article Metrics

  • Downloads (Last 12 months): 385
  • Downloads (Last 6 weeks): 39
Reflects downloads up to 12 Dec 2024

Citations

Cited By

  • (2024) Exploring the SoundVizVR Plugin in the Development of Sound-Accessible Virtual Reality Games: Insights from Game Developers and Players. ACM Transactions on Accessible Computing 17(4), 1-20. https://doi.org/10.1145/3698882. Online publication date: 5-Oct-2024.
  • (2024) SoundModVR: Sound Modifications in Virtual Reality for Sound Accessibility. Adjunct Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-4. https://doi.org/10.1145/3672539.3686754. Online publication date: 13-Oct-2024.
  • (2024) Supporting Sound Accessibility by Exploring Sound Augmentations in Virtual Reality. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-5. https://doi.org/10.1145/3663548.3688525. Online publication date: 27-Oct-2024.
  • (2024) SoundModVR: Sound Modifications in Virtual Reality to Support People who are Deaf and Hard of Hearing. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-15. https://doi.org/10.1145/3663548.3675653. Online publication date: 27-Oct-2024.
  • (2024) SoundHapticVR: Head-Based Spatial Haptic Feedback for Accessible Sounds in Virtual Reality for Deaf and Hard of Hearing Users. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility, 1-17. https://doi.org/10.1145/3663548.3675639. Online publication date: 27-Oct-2024.
  • (2024) SonifyAR: Context-Aware Sound Generation in Augmented Reality. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-13. https://doi.org/10.1145/3654777.3676406. Online publication date: 13-Oct-2024.
  • (2024) SoundShift: Exploring Sound Manipulations for Accessible Mixed-Reality Awareness. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 116-132. https://doi.org/10.1145/3643834.3661556. Online publication date: 1-Jul-2024.
  • (2024) Is Your Family Ready for VR? Ethical Concerns and Considerations in Children's VR Usage. Proceedings of the 23rd Annual ACM Interaction Design and Children Conference, 436-454. https://doi.org/10.1145/3628516.3655804. Online publication date: 17-Jun-2024.
  • (2024) SocialCueSwitch: Towards Customizable Accessibility by Representing Social Cues in Multiple Senses. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1-7. https://doi.org/10.1145/3613905.3651109. Online publication date: 11-May-2024.
  • (2024) Accessibility Feature Implementation Within Free VR Experiences. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1-9. https://doi.org/10.1145/3613905.3650935. Online publication date: 11-May-2024.
  • Show More Cited By
