
Around-the-Head Tactile System for Supporting Micro Navigation of People with Visual Impairments

Published: 23 July 2021

Abstract

Tactile patterns are a means to convey navigation instructions to pedestrians and are especially helpful for people with visual impairments. This article presents a concept for providing precise micro-navigation instructions through a tactile around-the-head display. Our system presents four tactile patterns for fundamental navigation instructions in conjunction with continuous directional guidance. We followed an iterative, user-centered approach to design the patterns for the fundamental navigation instructions, combined them with a continuous directional guidance stimulus, and tested our system with 13 sighted (blindfolded) and 2 blind participants in an obstacle course that included stairs. We optimized the patterns and validated the final prototype with another five blind participants in a follow-up study. The system steered our participants successfully, with an average absolute deviation of 5.7 cm from the optimal path. Our guidance is only slightly less precise than the natural shoulder sway during normal walking and an order of magnitude more precise than previous tactile navigation systems. Our system enables various new micro-navigation use cases for people with visual impairments, e.g., preventing collisions on a sidewalk or serving as an anti-veering tool. It also has applications in other areas, such as personnel working in low-vision environments (e.g., firefighters).
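As an illustration only (this is not the authors' implementation; the actuator count, equal angular spacing, and the funneling-style interpolation between neighboring tactors are assumptions), continuous directional guidance on a ring of head-worn vibration motors could be sketched as a mapping from a desired walking direction to per-actuator intensities:

```python
def actuator_intensities(target_bearing_deg, num_actuators=8, max_intensity=1.0):
    """Map a desired walking direction (bearing relative to the user's head
    orientation, in degrees) to vibration intensities for a ring of equally
    spaced tactors around the head.

    The two tactors adjacent to the target bearing are driven with intensities
    weighted by angular proximity, approximating a single 'phantom' stimulus
    between them (a funneling-illusion-style interpolation).
    """
    spacing = 360.0 / num_actuators
    bearing = target_bearing_deg % 360.0
    lower = int(bearing // spacing) % num_actuators   # tactor just below the bearing
    upper = (lower + 1) % num_actuators               # next tactor clockwise
    frac = (bearing - lower * spacing) / spacing      # 0.0 at lower, 1.0 at upper

    intensities = [0.0] * num_actuators
    intensities[lower] = max_intensity * (1.0 - frac)
    intensities[upper] = max_intensity * frac
    return intensities


# Example: a target 22.5 degrees to the right of straight ahead falls midway
# between the front tactor and its clockwise neighbor, so both vibrate equally.
print(actuator_intensities(22.5))
```

In a real guidance loop, the bearing would come from comparing head orientation (e.g., from a tracking system) against the planned path, and intensities would be resent to the motor driver at a fixed update rate.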




Published In

ACM Transactions on Computer-Human Interaction, Volume 28, Issue 4 (August 2021), 297 pages
ISSN: 1073-0516
EISSN: 1557-7325
DOI: 10.1145/3477419
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 23 July 2021
Accepted: 01 March 2021
Revised: 01 March 2021
Received: 01 July 2020
Published in TOCHI Volume 28, Issue 4


Author Tags

  1. Visually impaired
  2. obstacle avoidance
  3. tactile guidance
  4. tactile navigation
  5. tactile patterns

Qualifiers

  • Research-article
  • Research
  • Refereed

Article Metrics

  • Downloads (Last 12 months)148
  • Downloads (Last 6 weeks)27
Reflects downloads up to 09 Jan 2025

Cited By

  • (2025) A comparison of vibrotactile patterns in an early warning system for obstacle detection using a haptic vest. Applied Ergonomics 122 (Jan. 2025), 104396. DOI: 10.1016/j.apergo.2024.104396
  • (2024) Principles of user interface design enabling people with blindness professional work in administration of energy systems in intelligent buildings comparable to sighted workers. IEEE Access 12 (2024), 94176–94196. DOI: 10.1109/ACCESS.2024.3425330
  • (2024) MyWay: a 3D and audio-enhanced transportation learning kit for the visually impaired teenagers. CCF Transactions on Pervasive Computing and Interaction (23 Jul. 2024). DOI: 10.1007/s42486-024-00163-y
  • (2024) The user experience of distal arm-level vibrotactile feedback for interactions with virtual versus physical displays. Virtual Reality 28, 2 (22 Mar. 2024). DOI: 10.1007/s10055-024-00977-2
  • (2023) DrivingVibe: Enhancing VR driving experience using inertia-based vibrotactile feedback around the head. Proceedings of the ACM on Human-Computer Interaction 7, MHCI (13 Sep. 2023), 1–22. DOI: 10.1145/3604253
  • (2023) Investigating apparent tactile motion and cutaneous rabbit illusion to support cyclists' navigation. Proceedings of Mensch und Computer 2023 (3 Sep. 2023), 300–306. DOI: 10.1145/3603555.3608523
  • (2023) Survey and analysis of the current status of research in the field of outdoor navigation for the blind. Disability and Rehabilitation: Assistive Technology 19, 4 (4 Jul. 2023), 1657–1675. DOI: 10.1080/17483107.2023.2227224
  • (2022) Perception accuracy of a multi-channel tactile feedback system for assistive technology. Sensors 22, 22 (19 Nov. 2022), 8962. DOI: 10.3390/s22228962
  • (2022) SeeWay: Vision-language assistive navigation for the visually impaired. 2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (9 Oct. 2022), 52–58. DOI: 10.1109/SMC53654.2022.9945087
  • (2022) Comparison of the static and dynamic vibrotactile interactive perception of walking navigation assistants for limited vision people. IEEE Access 10 (2022), 42261–42267. DOI: 10.1109/ACCESS.2022.3167407