Comprehensive Review: High-Performance Positioning Systems for Navigation and Wayfinding for Visually Impaired People
Figure 1. Methodological framework for the comprehensive literature review.
Figure 2. Standard of dimensions of TWSI [17]. Key 1: flat-topped elongated bars, height 4 mm to 5 mm, beveled; s: spacing of ribs; b: width at base; L: minimum 270 mm; W: minimum 250 mm; d: minimum 30 mm.
Figure 3. Transitioning from mobility capacity to the development of new requirements [5].
Figure 4. Mobility course configuration and walking path on the platform at PAMELA [5]: (a) sections of the PAMELA platform; (b) proposed walking paths of VIPs.
Figure 5. Average (avg.) effective speed for all VIPs when traversing the platform in individual, unidirectional, and opposing flows [5].
Figure 6. Average effective speed across sections for each VIP in each scenario, ordered by increasing visual function [5].
Figure 7. Components of the DeepNAVI navigation assistant system [49].
Abstract
1. Introduction
- Review Justification: Section 1 delves into the navigation challenges faced by Visually Impaired People (VIPs);
- Precise Review Objective: The objective is to develop a high-performance positioning system for VIPs, addressing the segregation of visual impairment research between the medical and engineering disciplines;
- Inclusion and Exclusion Criteria: The included literature covers topics such as ‘classification of visual impairment’, ‘standards of navigation systems for visually impaired people’, ‘navigation assistants for visually impaired people’, ‘mobility aids for visually impaired people’, ‘positioning and wayfinding of navigation systems’, ‘artificial intelligence assistants for visually impaired people’, and ‘impact of COVID-19 on visually impaired people’. The excluded literature consists of outdated technological interventions and studies lacking validation;
- Explicit Literature Search: Key search terms include ‘visually impaired’, ‘mobility aids’, ‘navigation’, ‘positioning’, and ‘wayfinding’. This review encompasses publicly available documents, academic journals, conference papers, reports, and university theses from general academic engines, public scholar databases, and the university’s internal database up to the beginning of 2024;
- Efforts to Reduce Selection Bias and Identify All the Relevant Literature: Multiple reviewers assess the literature for reliability. A systematic search of the ‘grey literature’ is conducted using internet search engines;
- Quality Ranking of the Reviewed Literature: The literature with higher international impact and research significance is given higher importance;
- Suitability of Included Studies: Tables summarize the studies, providing information on authors, system performance, main system components, advantages, and disadvantages;
- Results and Interpretations: Findings are discussed at the end of each section, comparing them to other published works and related topics;
- Review Limitations: Limitations and future research directions are addressed in each section;
- Conclusion: The final section offers a concise summary of the primary review findings and the objectives of this review.
2. Literature Review of Types of Vision Impairment
2.1. WHO Classification of Visual Impairment
2.2. UK Classification of Visual Impairment (CVI)
- Sight impaired (partially sighted);
- Severely sight impaired (blind).
2.3. Classification of Visual Impairment in International Research
3. Literature Review of International Standards for Navigation for Visually Impaired People
3.1. ISO Standards of Navigation for Visually Impaired People
3.2. Standards of Navigation for Visually Impaired People in International and Regional Institutes
- (i) user-interface elements;
- (ii) user preference settings;
- (iii) accessibility adjustments;
- (iv) controls and operations, e.g., switching of input/output alternatives, optimization of the number of steps;
- (v) compatibility with assistive techniques, e.g., enabling concurrent operation of multiple assistive technologies.
3.3. Summary of International Standards for Navigation for Visually Impaired People
4. Literature Review of the Requirements of Visually Impaired People
4.1. Mobility Capacity, Mobility Performance, Environmental and User Requirements
- Obstacle and hazard awareness;
- Orientation and wayfinding (‘Where am I?’).
4.2. System and User Requirements
- Electronic Travel Aids (ETAs), designed for obstacle detection and hazard awareness;
- Electronic Orientation Aids (EOAs), focused on orientation and wayfinding (‘Where am I?’);
- Binary Electronic Mobility Systems (BEMS), which seek to combine the advantages of both ETAs and EOAs.
- Smaller Size and Lightweight (higher scalability): Navigation systems with considerable dimensions and weight hinder their adoption by VIPs for navigation purposes;
- Higher Affordability (cost-effectiveness) and Lower Learning Time: The substantial cost and learning time associated with existing systems discourage users from dedicating significant energy to familiarising themselves with the navigation technology;
- Less Infrastructure Implementation: Navigation systems requiring significant environmental changes, such as the installation of BLE beacons at Points of Interest, pose challenges and entail additional infrastructure investments;
- Multimodal Feedback Design: Many assistant aids incorporate only an audio feedback system, which may prove ineffective in noisy environments. Given the critical role of the feedback system in navigation, it should be designed to offer multi-modal feedback options;
- Real-time Information Delivery: The complexity of object detection operations results in delays in real-time information delivery. Any delay poses a risk of exposing users to hazardous situations;
- Privacy Considerations: Digital systems may put VIPs at risk of privacy leaks. None of the mentioned systems have adequately addressed data management, including audio and image data, during and after navigation. Establishing ethical professional standards for system development is crucial to safeguarding users’ data [29];
- Coverage Area/Limitations: Assessing the system’s range from single-room to global capabilities, considering specific environmental limitations that might impact visually impaired users;
- Market Maturity: Examining the development stage of assistive tools for VIPs, from concept to product availability, to determine their market maturity;
- Output Data: Evaluating the types of information provided by the system, including 2D or 3D coordinates, relative or absolute positions, and dynamic parameters such as speed, heading, uncertainty, and variances [30];
- Update Rate: Determining how frequently the system updates information, whether on-event, on request, or at periodic intervals, to meet the needs of VIPs [30];
- Interface: Considering the various interaction interfaces, including man-machine interfaces such as text-based, graphical, and audio, as well as electrical interfaces such as USB, fiber channels, or wireless communications, for optimal accessibility [30];
- Scalability: Assessing the scalability of the system, taking into account its adaptability with area-proportional node deployment and any potential accuracy loss;
- Approval: Considering the legal and regulatory aspects regarding the system’s operation, including its certification by relevant authorities, to ensure compliance with standards and enhance user trust;
- Intrusiveness/User Acceptance: Evaluating the impact of the system on VIPs, distinguishing between disturbing and imperceptible levels of intrusiveness to ensure high user acceptance;
- Number of Users: Determining the system’s capacity for the visually impaired group, ranging from single-user setups (e.g., total station) to those supporting an unlimited number of users (e.g., passive mobile sensors), ensuring inclusivity [30].
- Long-term Performance: Integrating longitudinal studies to assess navigation systems’ usability and effectiveness for VIPs is crucial for understanding their adaptation to changing conditions and ensuring sustainability.
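The evaluation criteria above can be captured as a simple screening structure. The sketch below is illustrative only: the dataclass fields mirror a subset of the criteria (coverage, accuracy, update mode and rate, output dimensionality, number of users, intrusiveness), and `meets_requirements` is a hypothetical helper for comparing a candidate system against VIP needs, not a method from the cited taxonomy [30].

```python
from dataclasses import dataclass
from enum import Enum

class UpdateMode(Enum):
    ON_EVENT = "on-event"
    ON_REQUEST = "on-request"
    PERIODIC = "periodic"

@dataclass
class PositioningSystemProfile:
    """Illustrative record of the evaluation criteria discussed above."""
    name: str
    coverage_m: float        # nominal coverage radius, metres
    accuracy_m: float        # reported positioning accuracy, metres
    update_mode: UpdateMode
    update_rate_hz: float    # meaningful only for PERIODIC systems
    output_3d: bool          # 3D coordinates vs. 2D
    max_users: int           # -1 for effectively unlimited (passive sensing)
    intrusive: bool          # perceptible to the user?

def meets_requirements(p: PositioningSystemProfile,
                       required_accuracy_m: float,
                       min_rate_hz: float) -> bool:
    """Simple screening test: accuracy and update rate against VIP needs."""
    rate_ok = (p.update_mode is not UpdateMode.PERIODIC) or (p.update_rate_hz >= min_rate_hz)
    return p.accuracy_m <= required_accuracy_m and rate_ok

# Example: a hypothetical BLE-beacon system screened against a 0.5 m / 1 Hz need
ble = PositioningSystemProfile("BLE beacons", 30.0, 2.0, UpdateMode.PERIODIC,
                               1.0, False, -1, False)
print(meets_requirements(ble, 0.5, 1.0))  # accuracy too coarse for this need
```

A record like this makes side-by-side comparisons of candidate systems (such as those in the table of Section 6.3) mechanical rather than ad hoc.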
5. PAMELA Experimentation for Investigating and Comparing the Behavior of Visually Impaired People
5.1. Overview of PAMELA Experimentation
- individual;
- unidirectional group flow restricted;
- unidirectional group flow unrestricted;
- opposing group flow restricted;
- opposing group flow unrestricted.
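Effective speed, the metric compared across these scenarios, is simply the distance actually covered divided by the time taken. A minimal sketch of how per-section and per-walk averages might be computed is shown below; the section lengths and timings are invented for illustration, not PAMELA data.

```python
def effective_speed(section_length_m, traversal_time_s):
    """Effective speed = distance covered / time taken (m/s)."""
    return section_length_m / traversal_time_s

def average_effective_speed(sections):
    """Average over one walk: total distance over total time for (length, time) pairs."""
    total_dist = sum(length for length, _ in sections)
    total_time = sum(t for _, t in sections)
    return total_dist / total_time

# Hypothetical walk: three platform sections of 4 m each, timed separately
walk = [(4.0, 5.2), (4.0, 6.1), (4.0, 4.8)]
print(round(average_effective_speed(walk), 2))  # → 0.75
```

Averaging total distance over total time (rather than averaging per-section speeds) weights slower sections by the time spent in them, which matches how an overall traversal speed is experienced.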
5.2. Summary of Performance of PAMELA Experimentation
5.3. Conclusions of PAMELA Experimentation
5.4. Limitations and Recommendations of PAMELA Experimentation
6. Literature Review of Current Positioning/Navigation Systems for Visually Impaired People
- (i) vision-based navigation systems that utilize vision sensors (e.g., cameras) to detect obstacles and ensure safe navigation for VIPs;
- (ii) non-visual sensor-based navigation systems that guide VIPs using technologies such as Bluetooth beacons, infrared, ultrasonic sensors, maps, sound, and smartphones;
- (iii) smart systems based on technologies such as artificial intelligence and machine learning.
6.1. Visually Impaired Navigation with Vision Systems
- (i) enhancing battery life and exploring alternative power sources;
- (ii) improving E-glass design for comfort and aesthetics;
- (iii) further investigation of the real-world requirements of VIPs, particularly in complex outdoor environments.
- (i) there is a need to improve the user interface, such as better speech recognition to achieve robustness in noisy places;
- (ii) the audio frequency should be adjustable and customizable;
- (iii) the system cannot navigate in complex, cluttered environments such as underground stations.
- (i) the system has yet to operate in indoor environments;
- (ii) although sonification can notify users about dangerous objects, the lack of a decision on a direction to walk increases the risk of danger;
- (iii) the dimensions and weight of the bulky helmet and backpack must be minimized to address concerns of portability.
6.2. Visually Impaired Navigation without Vision Systems
6.3. Visually Impaired Navigation with Smart Technologies (Artificial Intelligence)
| System Proposed By | Data Network Undependability | Coverage (Indoor/Outdoor) | Obstacle Recognition | Distance Estimation | Position Estimation | Scene Recognition | Motion Detection | Multimodal Output |
|---|---|---|---|---|---|---|---|---|
| [66] | ✓ | ✓ | ✓ | ✓ | | | | |
| [67] | ✓ | ✓ | ✓ | | | | | |
| [68] | ✓ | ✓ | ✓ | ✓ | | | | |
| [69] | ✓ | ✓ | | | | | | |
| [70] | ✓ | ✓ | ✓ | ✓ | | | | |
| [58] | ✓ | ✓ | ✓ | | | | | |
| [57] | ✓ | ✓ | | | | | | |
| [60] | ✓ | | | | | | | |
| [49] | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
- (i) a MobileNet-V2 model (fine-tuned by transfer learning) to extract features on overlapping grids;
- (ii) a Single Shot MultiBox Detector (SSD) to detect TWSI;
- (iii) a Score Voting algorithm to determine users’ positions.
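The pipeline above ends with a Score Voting step to localize the user from the TWSI detections. The cited work does not spell the rule out here, so the following is a minimal sketch under an assumed rule: each detection votes for the spatial grid cell it falls in, weighted by its confidence, and the position estimate is the centre of the winning cell. Function and variable names are illustrative, not the cited implementation.

```python
from collections import defaultdict

def score_voting(detections, grid_size):
    """
    Toy score-voting localization: each detection (x, y, confidence)
    votes for its grid cell with weight = confidence; the position
    estimate is the centre of the highest-scoring cell.
    The voting rule here is an assumption, not the cited algorithm.
    """
    votes = defaultdict(float)
    for x, y, conf in detections:
        cell = (int(x // grid_size), int(y // grid_size))
        votes[cell] += conf
    best = max(votes, key=votes.get)
    # Centre of the winning cell as the final (x, y) estimate
    return ((best[0] + 0.5) * grid_size, (best[1] + 0.5) * grid_size)

# Two confident detections agree on one cell; a weak outlier is outvoted
dets = [(1.2, 0.8, 0.9), (1.4, 0.6, 0.8), (5.1, 5.0, 0.4)]
print(score_voting(dets, grid_size=2.0))  # → (1.0, 1.0)
```

Weighting votes by confidence lets a cluster of agreeing detections outweigh a single spurious one, which is the practical appeal of voting over picking the single highest-scoring detection.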
- (i) the system should detect more obstacles;
- (ii) adjustable feedback settings such as rate, pitch, or voice;
- (iii) multiple feedback modes, such as haptic in addition to audio;
- (iv) avoiding interruptions by other smartphone applications while navigating.
- (i) an information moderation filter can provide cognitive power to process a considerable amount of information and further refine navigation systems;
- (ii) provision of customization options for users to obtain the information they need during navigation;
- (iii) privacy protection: Ref. [74] has emphasized the challenges and requirements of VIPs in understanding privacy-related information under different techniques;
- (iv) the ability to navigate in areas where data traffic and network coverage are low;
- (v) a trade-off between high accuracy and low latency in some lightweight deep-learning models [75].
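Point (i) above proposes an information moderation filter so that users are not flooded during navigation. One minimal way to sketch the idea, under the assumption that messages are tagged as either safety-critical or ambient, is a rate limiter that always passes hazard warnings and caps ambient messages per time window. The class name and thresholds are hypothetical.

```python
import time
from collections import deque

class ModerationFilter:
    """
    Sketch of an information moderation filter: safety-critical messages
    always pass; ambient messages (landmarks, scenery) are rate-limited.
    The cap and window below are illustrative assumptions.
    """
    def __init__(self, max_ambient_per_window=3, window_s=10.0):
        self.window_s = window_s
        self.max_ambient = max_ambient_per_window
        self.recent = deque()  # delivery timestamps of recent ambient messages

    def admit(self, message, critical=False, now=None):
        # `message` content is unused in this sketch; only timing matters
        now = time.monotonic() if now is None else now
        if critical:
            return True  # obstacle/hazard warnings always go through
        # Drop timestamps that have left the sliding window
        while self.recent and now - self.recent[0] > self.window_s:
            self.recent.popleft()
        if len(self.recent) < self.max_ambient:
            self.recent.append(now)
            return True
        return False  # suppress: user already has enough ambient information

f = ModerationFilter()
print([f.admit(f"landmark {i}", now=float(i)) for i in range(5)])
# The first three ambient messages pass; the rest are suppressed within the window
```

A filter of this shape also gives a natural hook for the customization options in point (ii): the cap and window become user-adjustable preferences.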
6.4. Summary of Navigation Systems for Visually Impaired People
7. Adaptation of Navigation Systems in Response to a Global Pandemic
7.1. Impact of COVID-19 on Visually Impaired People
- (i) visual limitations hinder their ability to maintain safe distances [77];
- (ii) they cannot detect directional arrows on floors;
- (iii) guide dogs cannot assess new distances;
- (iv) canes are only forward-facing probes.
7.2. Technology to Help Visually Impaired People Navigate
8. Conclusions and Future Research
- Quantifying the navigation system requirements, given that current knowledge has limitations in defining these for VIPs. This includes accuracy, integrity risk, alarm limits, availability, and continuity. These factors can be derived from user requirements, VIP behaviors, infrastructure environments, and risk analysis. This process may lead to various requirement categories based on the visual function of the VIP and/or their operating environments;
- Quantifying the situational awareness requirements to ensure safety, including the integrity budget;
- Implementing advanced carrier phase GNSS positioning technologies, such as RTK (Real-Time Kinematic) and Precise Point Positioning (PPP), as the current understanding of requirements indicates that a centimeter to decimeter accuracy level is necessary for VIPs’ navigation;
- Developing an integrity monitoring algorithm that supports carrier phase positioning, namely Carrier Phase Receiver Autonomous Integrity Monitoring (CRAIM);
- Developing an integrity monitoring layer for indoor positioning systems. In addition to Fault Detection and Exclusion (FDE) and computing the protection level, this improvement involves identifying faulty modes and models and overbounding the error distribution to ensure safety;
- Conducting a feasibility assessment for indoor positioning systems that can meet VIPs’ navigation requirements. This assessment should follow the system requirements and use the protection level value of each indoor positioning system;
- Developing an integrity monitoring algorithm for SLAM methods in case the feasibility assessment suggests using SLAM;
- Performing a cost–benefit analysis for the candidate indoor positioning systems;
- Developing a situational awareness layer that ensures safety.
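The roadmap above calls for Fault Detection and Exclusion (FDE) on top of carrier phase positioning. As a flavour of the snapshot residual test such a layer builds on, here is a hedged sketch: pseudorange residuals are projected into the parity space of the geometry matrix and the resulting norm is compared against a threshold. The geometry, residuals, and threshold are illustrative assumptions, not a CRAIM implementation or an integrity-budget-derived value.

```python
import numpy as np

def detect_fault(G, residuals, threshold=5.0):
    """
    Snapshot RAIM-style fault detection sketch: project the residual
    vector onto the parity space of the geometry matrix G (n satellites
    x 4 unknowns) and compare its norm to a threshold. The threshold is
    an illustrative constant, not derived from an integrity budget.
    """
    # Least-squares projection matrix; (I - P) maps residuals to parity space
    P = G @ np.linalg.inv(G.T @ G) @ G.T
    parity = (np.eye(len(residuals)) - P) @ residuals
    test_statistic = float(np.linalg.norm(parity))
    return test_statistic, test_statistic > threshold

rng = np.random.default_rng(0)      # fixed seed: an arbitrary example geometry
G = rng.normal(size=(6, 4))         # 6 satellites, 4 unknowns (x, y, z, clock)
clean = np.zeros(6)                 # consistent measurements leave zero residuals
faulty = clean.copy()
faulty[0] = 100.0                   # 100 m bias injected on one pseudorange
print(detect_fault(G, clean)[1], detect_fault(G, faulty)[1])
# Clean geometry passes the test; the biased pseudorange trips it
```

A real FDE layer would additionally exclude the faulty measurement, recompute the solution, and publish a protection level bounding the residual position error, which is the part the review identifies as open work for indoor systems.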
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
Methodology Category | Methodology Name | Techniques | Limitation(s) | Reference |
---|---|---|---|---|
Bluetooth-related | Hybrid BLE beacons and Google Tango | 1. ‘Fingerprinting’ method for reducing effects from BLE signal attenuation (with KNN algorithm) 2. Tango (with RGB-D camera) models the 3D indoor environment and performs accurate indoor localization in real-time 3. Close-loop approach to reduce drift errors by Tango 4. BLE beacons provide coarse location detections to find the correct ADF to be loaded in Tango to provide correct users’ positions 5. The app finds the shortest path for navigation between the origin and the selected destinations using Dijkstra’s algorithm | 1. Predicted locations jump around 2. A small algorithm variation leads to varying accuracy in results 3. Tango has drift errors over time in estimating the camera’s positions 4. The size of ADF by Tango is limited to one floor | [40] |
Bluetooth-related | ASSIST | 1. Body camera (GoPro) captures and sends images to detection server 2. YOLO head detection 3. Detection server to perform facial and emotion recognition 4. Navigation server processes BLE localization 5. The phone performs BLE and Tango localization, provides feedback to users | 1. Easily attenuated BLE signals 2. Loading of a large ADF (exceeds 60 MB) triggers the internal timeout/crash within Tango SDK; the size of the ADF is limited to one floor | [86] |
Bluetooth-related | Bluetooth indoor positioning based on RSSI and Kalman filtering | 1. Achieve an accurate Bluetooth transmission model: traditional wireless signal propagation model + more training with environmental parameters and standards RSSI values in 1 m 2. Apply Kalman filtering to smooth Bluetooth signals to reduce problems such as signal drift and shock 3. A more accurate positioning by applying the weighted least square algorithm + weight to each element of the error vector 4. Four-border positioning method to calculate the coordinates of the unknown target device | 1. Some positioning points (near the obstacles) have more significant errors than others, far from the real positioning point, meaning the system is not precise 2. Filtering can only lower part of RSSI drifts; a more stable signal is needed 3. The superposition problem should be solved | [87] |
GNSS-related | Atmospheric augmentation PPP algorithm | 1. Doppler-smoothing reduces the errors/noise of pseudorange measurements and figures out the gross errors 2. Doppler integration method detects carrier phase outliers 3. A stochastic model based on carrier-on-noise can weigh GNSS observation 4. International GNSS Service (IGS) products reduce satellite-related errors 5. Square Root Information Filtering (SRIF) linearizes multi-GNSS and dual-frequency PPP equations for high-robustness positioning results | 1. The phase center offset and variation (in the smartphone GNSS antenna) are not considered due to the lack of information on the patch antenna in correcting receiver errors 2. The future challenge lies in achieving fast, high-precision positioning in GNSS-denial environments, which requires combining PPP technology with other sensors such as IMU, cameras, etc. | [88] |
Vision-related | VSLAM, PoI-graph, wayfinding, obstacle detection, and route-following algorithm | 1. VSLAM utilizes RGB and depth images to generate an offline ‘virtual blind road’ and localize users online 2. PoI graph is important in wayfinding, where the system plans the global shortest path (A* based algorithm) 3. Route finding: combines all information and applies the dynamic subgoal selection method to build the guiding information | Lack of positioning in outdoor and more complex environments | [38] |
IMU-related | GNSS, IMU, visual navigation data fusion system | 1. The error compensation module in MSDF can correct scale-factor errors in IMU data and remove bias in IMU data. 2. VO pose estimations converted to E-Frame before data fusion to resolve the earth rotations about I-Frame 3. Kalman filtering or Extended Kalman filtering for data fusion | 1. This architecture is designed for UAVs; further investigations are needed to prove the applicability for VIPs’ navigation 2. Further reduction in the drift errors of IMU and improvement in the sampling rate of vision sensors (to achieve lower computation time for processing the captured images) | [89]
Appendix B
Appendix C
References
- World Health Organization. Blindness and Vision Impairment. 2022. Available online: https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment (accessed on 19 August 2024).
- World Health Organization. International Classification of Diseases (11th Revision); World Health Organization: Geneva, Switzerland, 2019. [Google Scholar]
- Kumaran, N.; Moore, A.T.; Weleber, R.G.; Michaelides, M. Leber Congenital Amaurosis/Early-Onset Severe Retinal Dystrophy: Clinical Features, Molecular Genetics and Therapeutic Interventions. Br. J. Ophthalmol. 2017, 101, 1147–1154. [Google Scholar] [CrossRef] [PubMed]
- Dandona, L.; Dandona, R. Revision of Visual Impairment Definitions in the International Statistical Classification of Diseases. BMC Med. 2006, 4, 7. [Google Scholar] [CrossRef] [PubMed]
- Feghali, J.M. Function to Functional: Investigating the Fundamental Human, Environmental, and Technological Factors Affecting the Mobility of Visually Impaired People. 2022.
- Great Britain Department of Health and Social Care. Certificate of Vision Impairment for People Who are Sight Impaired (Partially Sighted) or Severely Sight Impaired (Blind). 2018. Available online: https://assets.publishing.service.gov.uk/media/6318b725d3bf7f77d2995a5c/certificate-of-vision-impairment-form.pdf (accessed on 19 August 2024).
- Samoshin, D.; Istratov, R. The Characteristics of Blind and Visually Impaired People Evacuation in Case of Fire. Fire Saf. Sci. 2014, 11, 1160–1169. [Google Scholar] [CrossRef]
- Sørensen, J.G.; Dederichs, A.S. Evacuation Characteristics of Visually Impaired People—A Qualitative and Quantitative Study. Fire Mater. 2015, 39, 385–395. [Google Scholar] [CrossRef]
- Manduchi, R.; Kurniawan, S. Mobility-Related Accidents Experienced by People with Visual Impairment. AER J. Res. Pract. Vis. Impair. Blind. 2011, 4, 44–54. [Google Scholar]
- Long, R.G.; Rieser, J.J.; Hill, E.W. Mobility in Individuals with Moderate Visual Impairments. J. Vis. Impair. Blind. 1990, 84, 111–118. [Google Scholar] [CrossRef]
- Swenor, B.K.; Muñoz, B.; West, S.K. Does Visual Impairment Affect Mobility Over Time? the Salisbury Eye Evaluation Study. Invest. Ophthalmol. Vis. Sci. 2013, 54, 7683–7690. [Google Scholar] [CrossRef]
- Wright, M.S.; Cook, G.K.; Webber, G.M.B. Emergency Lighting and Wayfinding Provision Systems for Visually Impaired People: Phase I of a Study. Int. J. Light. Res. Technol. 1999, 31, 35–42. [Google Scholar]
- Salive, M.E.; Guralnik, J.; Glynn, R.J.; Christen, W.; Wallace, R.B.; Ostfeld, A.M. Association of Visual Impairment with Mobility and Physical Function. J. Am. Geriatr. Soc. 1994, 42, 287–292. [Google Scholar] [CrossRef]
- Zhang, S.; Zeng, J.; Liu, X.; Ding, S. Effect of Obstacle Density on the Travel Time of the Visually Impaired People. Fire Mater. 2019, 43, 162–168. [Google Scholar] [CrossRef]
- Leat, S.J.; Lovie-Kitchin, J. Visual Function, Visual Attention, and Mobility Performance in Low Vision. Optom. Vis. Sci. 2008, 85, 1049–1056. [Google Scholar] [CrossRef]
- Brouwer, D.M.; Sadlo, G.; Winding, K.; Hanneman, M.I.G. Limitations in Mobility: Experiences of Visually Impaired Older People. Br. J. Occup. Ther. 2008, 71, 414–421. [Google Scholar] [CrossRef]
- BS ISO 21542; Building Construction. Accessibility and Usability of the Built Environment. International Organisation of Standards: Geneva, Switzerland, 2020; pp. 90–117.
- BS EN ISO 9241-171:2008; Ergonomics of Human-System Interaction: Guidance on Software Accessibility. International Organisation of Standards: Geneva, Switzerland, 2009.
- EN 301 549; European Standard for Digital Accessibility. European Telecommunications Standards Institute: Sophia Antipolis, France, 2021.
- United States Department of Justice. 2010 ADA Standards for Accessible Design; United States Department of Justice: Washington, DC, USA, 2010. [Google Scholar]
- BS 8300-1:2018; Design of an Accessible and Inclusive Built Environment—External Environment. Code of Practice. The British Standards Institution: London, UK, 2018.
- BS 8300-2:2018; Design of an Accessible and Inclusive Built Environment—Building. Code of Practice. The British Standards Institution: London, UK, 2018.
- Strothotte, T.; Fritz, S.; Michel, R.; Raab, A.; Petrie, H.; Johnson, V.; Reichert, L.; Schalt, A. Development of Dialogue Systems for the Mobility Aid for Blind People: Initial Design and Usability Testing. In Proceedings of the Second Annual ACM Conference on Assistive Technologies, New York, NY, USA, 11–12 April 1996; pp. 139–144. [Google Scholar]
- Loomis, J.M.; Golledge, R.D.; Klatzky, R.L. GPS-Based Navigation Systems for the Visually Impaired; Lawrence Erlbaum Associates Publishers: Mahwah, NJ, USA, 2001; pp. 429–446. [Google Scholar]
- Spinks, R.; Worsfold, J.; Di Bon-Conyers, L.; Williams, D.; Ochieng, W.Y.; Mericliler, G. Interviewed by: Feghali, J.M. 2021. Available online: https://www.youtube.com/watch?v=vLgWKestjEE (accessed on 19 August 2024).
- Wayfindr. Open Standard for Audio-Based Wayfinding—Recommendation 2.0. 2018. Available online: http://www.wayfindr.net/wp-content/uploads/2018/07/Wayfindr-Open-Standard-Rec-2.0.pdf (accessed on 19 August 2024).
- El-Sheimy, N.; Lari, Z. GNSS Applications in Surveying and Mobile Mapping; Wiley: Hoboken, NJ, USA, 2021; pp. 1711–1733. [Google Scholar]
- Feghali, J.M.; Mericliler, G.; Penrod, W.; Lee, D.B.; Kaiser, J. How Much Is Too Much? Understanding the Appropriate Amount of Information from ETAs and EOAs for Effective Mobility; Berndtsson, I., IMC Executive Committee, Eds.
- Kuriakose, B.; Shrestha, R.; Sandnes, F.E. Tools and Technologies for Blind and Visually Impaired Navigation Support: A Review. IETE Tech. Rev. 2022, 39, 3–18. [Google Scholar] [CrossRef]
- Mautz, R. Indoor Positioning Technologies; ETH Zurich: Zürich, Switzerland, 2012; pp. 15–16. [Google Scholar]
- Patel, I.; Turano, K.A.; Broman, A.T.; Bandeen-Roche, K.; Munoz, B.; West, S.K. Measures of Visual Function and Percentage of Preferred Walking Speed in Older Adults: The Salisbury Eye Evaluation Project. Invest. Ophthalmol. Vis. Sci. 2006, 47, 65–71. [Google Scholar] [CrossRef]
- Moussaïd, M.; Perozo, N.; Garnier, S.; Helbing, D.; Theraulaz, G. The Walking Behaviour of Pedestrian Social Groups and its Impact on Crowd Dynamics. PLoS ONE 2010, 5, e10047. [Google Scholar] [CrossRef]
- Weidmann, U. Transporttechnik Der Fussgänger; Institut für Verkehrsplanung: Zürich, Switzerland, 1992; Volume 90. [Google Scholar]
- Song, Y.; Li, Z.; Li, G.; Wang, B.; Zhu, M.; Shi, P. Multi-Sensory Visual-Auditory Fusion of Wearable Navigation Assistance for People With Impaired Vision. IEEE Trans. Autom. Sci. Eng. 2023, 1–13. [Google Scholar] [CrossRef]
- Li, B.; Munoz, J.P.; Rong, X.; Chen, Q.; Xiao, J.; Tian, Y.; Arditi, A.; Yousuf, M. Vision-Based Mobile Indoor Assistive Navigation Aid for Blind People. IEEE Trans. Mob. Comput. 2019, 18, 702–714. [Google Scholar] [CrossRef]
- Masud, U.; Saeed, T.; Malaikah, H.M.; Islam, F.U.; Abbas, G. Smart Assistive System for Visually Impaired People Obstruction Avoidance through Object Detection and Classification. IEEE Access 2022, 10, 13428–13441. [Google Scholar] [CrossRef]
- Schwarze, T.; Lauer, M.; Schwaab, M.; Romanovas, M.; Böhm, S.; Jürgensohn, T. A Camera-Based Mobility Aid for Visually Impaired People. Künstl. Intell. 2016, 30, 29–36. [Google Scholar] [CrossRef]
- Bai, J.; Lian, S.; Liu, Z.; Wang, K.; Liu, D. Virtual-Blind-Road Following-Based Wearable Navigation Device for Blind People. IEEE Trans. Consum. Electron. 2018, 64, 136–143. [Google Scholar] [CrossRef]
- Ton, C.; Omar, A.; Szedenko, V.; Tran, V.H.; Aftab, A.; Perla, F.; Bernstein, M.J.; Yang, Y. LIDAR Assist Spatial Sensing for the Visually Impaired and Performance Analysis. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 1727–1734. [Google Scholar] [CrossRef]
- Nair, V.; Tsangouri, C.; Xiao, B.; Olmschenk, G.; Seiple, W.H.; Zhu, Z. A Hybrid Indoor Positioning System for Blind and Visually Impaired using Bluetooth and Google Tango. J. Technol. Pers. Disabil. 2018, 6, 61–81. [Google Scholar]
- Martinez-Sala, A.; Losilla, F.; Sánchez-Aarnoutse, J.; García-Haro, J. Design, Implementation and Evaluation of an Indoor Navigation System for Visually Impaired People. Sensors 2015, 15, 32168–32187. [Google Scholar] [CrossRef]
- Sammouda, R.; Alrjoub, A. Mobile Blind Navigation System using RFID. In Proceedings of the 2015 Global Summit on Computer & Information Technology (GSCIT), Sousse, Tunisia, 11–13 June 2015; pp. 1–4. [Google Scholar]
- Patil, K.; Jawadwala, Q.; Shu, F.C. Design and Construction of Electronic Aid for Visually Impaired People. IEEE Trans. Hum. Mach. Syst. 2018, 48, 172–182. [Google Scholar] [CrossRef]
- Bindhu, V.; Tavares, J.M.R.S.; Boulogeorgos, A.A.; Vuppalapati, C. Indoor Navigation Assistant for Visually Impaired (INAVI); Springer Singapore Pte. Limited: Singapore, 2021; pp. 239–253. [Google Scholar]
- Kammoun, S.; Bouaziz, R.; Saeed, F.; Qasem, S.N.; Al-Hadhrami, T. HaptiSole: Wearable Haptic System in Vibrotactile Guidance Shoes for Visually Impaired Wayfinding. KSII Trans. Internet Inf. Syst. 2023, 17, 3064–3082. [Google Scholar]
- Sato, D.; Oh, U.; Naito, K.; Takagi, H.; Kitani, K.; Asakawa, C. NavCog3: An Evaluation of a Smartphone-Based Blind Indoor Navigation Assistant with Semantic Features in a Large-Scale Environment. In Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS ‘17), Baltimore, MD, USA, 20 October–1 November 2017; pp. 270–279. [Google Scholar]
- Marzec, P.; Kos, A. Low Energy Precise Navigation System for the Blind with Infrared Sensors. In Proceedings of the 2019 MIXDES—26th International Conference “Mixed Design of Integrated Circuits and Systems”, Rzeszow, Poland, 27–29 June 2019; pp. 394–397. [Google Scholar]
- Guerreiro, J.A.; Ahmetovic, D.; Sato, D.; Kitani, K.; Asakawa, C. Airport Accessibility and Navigation Assistance for People with Visual Impairments. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland UK, 2 May 2019; pp. 1–14. [Google Scholar]
- Kuriakose, B.; Shrestha, R.; Sandnes, F.E. DeepNAVI: A Deep Learning Based Smartphone Navigation Assistant for People with Visual Impairments. Expert Syst. Appl. 2023, 212, 118720. [Google Scholar] [CrossRef]
- Castillo Guerrero, J.; Quezada-V, C.; Chacon-Troya, D. Design and Implementation of an Intelligent Cane, with Proximity Sensors, GPS Localization and GSM Feedback. In Proceedings of the 2018 IEEE Canadian Conference on Electrical & Computer Engineering (CCECE), Quebec, QC, Canada, 13–16 May 2018; pp. 1–4. [Google Scholar]
- Saaid, M.F.; Mohammad, A.M.; Megat Ali, M.S.A.M. Smart Cane with Range Notification for Blind People. In Proceedings of the 2016 IEEE International Conference on Automatic Control and Intelligent Systems (I2CACIS), Selangor, Malaysia, 22 October 2016; pp. 225–229. [Google Scholar]
- Elmannai, W.; Elleithy, K. Sensor-Based Assistive Devices for Visually-Impaired People: Current Status, Challenges, and Future Directions. Sensors 2017, 17, 565. [Google Scholar] [CrossRef]
- Ali, Z.A. Design and Evaluation of Two Obstacle Detection Devices for Visually Impaired People. J. Eng. Res. 2023, 11, 100–105. [Google Scholar] [CrossRef]
- Shinde, S.H.; Munot, M.V.; Kumar, P.; Boob, S. Intelligent Companion for Blind: Smart Stick. Int. J. Innov. Technol. Explor. Eng. 2019, 8, 2278–3075. [Google Scholar] [CrossRef]
- De Alwis, D.; Samarawickrama, Y.C. Low Cost Ultrasonic Based Wide Detection Range Smart Walking Stick for Visually Impaired. Int. J. Multidiscip. Stud. 2016, 3, 123. [Google Scholar] [CrossRef]
- Yusof, Z.M.; Billah, M.M.; Kadir, K.; Rohim, M.A.; Nasir, H.; Izani, M.; Razak, A. Design and Analysis of a Smart Blind Stick for Visual Impairment. Indones. J. Electr. Eng. Comput. Sci. 2018, 11, 848–856. [Google Scholar]
- Mukhiddinov, M.; Cho, J. Smart Glass System using Deep Learning for the Blind and Visually Impaired. Electronics 2021, 10, 2756. [Google Scholar] [CrossRef]
- Rao, S.U.; Ranganath, S.; Ashwin, T.S.; Reddy, G.R.M. A Google Glass Based Real-Time Scene Analysis for the Visually Impaired. IEEE Access 2021, 9, 166351–166369. [Google Scholar]
- Kuriakose, B.; Shrestha, R.; Sandnes, F.E. Exploring the User Experience of an AI-Based Smartphone Navigation Assistant for People with Visual Impairments. In Proceedings of the 15th Biannual Conference of the Italian SIGCHI Chapter, Torino, Italy, 20–22 September 2023. [Google Scholar]
- Fusco, G.; Coughlan, J.M. Indoor Localization for Visually Impaired Travelers using Computer Vision on a Smartphone. In Proceedings of the 17th International Web for all Conference, Taipei, Taiwan, 20–21 April 2020. [Google Scholar]
- Chen, W.; Xie, Z.; Yuan, P.; Wang, R.; Chen, H.; Xiao, B. A Mobile Intelligent Guide System for Visually Impaired Pedestrian. J. Syst. Softw. 2023, 195, 111546. [Google Scholar] [CrossRef]
- Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation. In Proceedings of the 2014 IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 580–587. [Google Scholar]
- Dai, J.; Li, Y.; He, K.; Sun, J. R-FCN: Object Detection Via Region-Based Fully Convolutional Networks. In Proceedings of the Advances in Neural Information Processing Systems, Barcelona, Spain, 5–10 December 2016. [Google Scholar]
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.; Berg, A.C. SSD: Single Shot MultiBox Detector. In Proceedings of the Computer Vision–ECCV 2016: 14th European Conference, Amsterdam, The Netherlands, 11–14 October 2016; pp. 21–37. [Google Scholar]
- Bochkovskiy, A.; Wang, C.; Liao, H.M. YOLOv4: Optimal Speed and Accuracy of Object Detection. arXiv 2020, arXiv:2004.10934. [Google Scholar]
- Moharkar, L.; Varun, S.; Patil, A.; Pal, A. A Scene Perception System for Visually Impaired Based on Object Detection and Classification using CNN. ITM Web Conf. 2020, 32, 03039. [Google Scholar] [CrossRef]
- Ashiq, F.; Asif, M.; Ahmad, M.B.; Zafar, S.; Masood, K.; Mahmood, T.; Mahmood, M.T.; Lee, I.H. CNN-Based Object Recognition and Tracking System to Assist Visually Impaired People. IEEE Access 2022, 10, 14819–14834. [Google Scholar] [CrossRef]
- Joshi, R.C.; Yadav, S.; Dutta, M.K.; Travieso-Gonzalez, C.M. Efficient Multi-Object Detection and Smart Navigation using Artificial Intelligence for Visually Impaired People. Entropy 2020, 22, 941. [Google Scholar] [CrossRef]
- Kahraman, M.; Turhan, C. An Intelligent Indoor Guidance and Navigation System for the Visually Impaired. Assist. Technol. 2022, 34, 478–486. [Google Scholar] [CrossRef]
- Barontini, F.; Catalano, M.G.; Pallottino, L.; Leporini, B.; Bianchi, M. Integrating Wearable Haptics and Obstacle Avoidance for the Visually Impaired in Indoor Navigation: A User-Centered Approach. IEEE Trans. Haptics 2021, 14, 109–122. [Google Scholar] [CrossRef]
- Kuriakose, B.; Shrestha, R.; Sandnes, F.E. SceneRecog: A Deep Learning Scene Recognition Model for Assisting Blind and Visually Impaired Navigate using Smartphones. In Proceedings of the 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Melbourne, Australia, 17–20 October 2021; pp. 2464–2470. [Google Scholar]
- Plikynas, D.; Žvironas, A.; Budrionis, A.; Gudauskis, M. Indoor Navigation Systems for Visually Impaired Persons: Mapping the Features of Existing Technologies to User Needs. Sensors 2020, 20, 636. [Google Scholar] [CrossRef]
- Theodorou, P.; Meliones, A. Gaining Insight for the Design, Development, Deployment and Distribution of Assistive Navigation Systems for Blind and Visually Impaired People through a Detailed User Requirements Elicitation. Univ. Access Inf. Soc. 2022, 22, 841–867. [Google Scholar] [CrossRef]
- Akter, T.; Dosono, B.; Ahmed, T.; Kapadia, A.; Semaan, B. I Am Uncomfortable Sharing what I Can See: Privacy Concerns of the Visually Impaired with Camera Based Assistive Applications. In Proceedings of the 29th USENIX Security Symposium (USENIX Security 20), Berkeley, CA, USA, 12 August 2020; pp. 1929–1948. [Google Scholar]
- Ran, X.; Chen, H.; Zhu, X.; Liu, Z.; Chen, J. DeepDecision: A Mobile Deep Learning Framework for Edge Video Analytics. In Proceedings of the IEEE INFOCOM 2018—IEEE Conference on Computer Communications, Honolulu, HI, USA, 16–19 April 2018; pp. 1421–1429. [Google Scholar]
- Khan, H.M.; Abbas, K.; Khan, H.N. Investigating the Impact of COVID-19 on Individuals with Visual Impairment. Br. J. Vis. Impair. 2023. [Google Scholar] [CrossRef]
- Lourens, H. The politics of touch-based help for visually impaired persons during the COVID-19 pandemic: An autoethnographic account. In The COVID-19 Crisis; Routledge: Abingdon, UK, 2021; pp. 67–76. [Google Scholar]
- Rizzo, J.; Beheshti, M.; Fang, Y.; Flanagan, S.; Giudice, N.A. COVID-19 and Visual Disability: Can’t Look and Now Don’t Touch. PM&R 2021, 13, 415–421. [Google Scholar]
- Bernard, A.; Weiss, S.; Rahman, M.; Ulin, S.S.; D’Souza, C.; Salgat, A.; Panzer, K.; Stein, J.D.; Meade, M.A.; McKee, M.M. The Impact of COVID-19 and Pandemic Mitigation Measures on Persons with Sensory Impairment. Am. J. Ophthalmol. 2022, 234, 49–58. [Google Scholar] [CrossRef]
- Shalaby, W.S.; Odayappan, A.; Venkatesh, R.; Swenor, B.K.; Ramulu, P.Y.; Robin, A.L.; Srinivasan, K.; Shukla, A.G. The Impact of COVID-19 on Individuals Across the Spectrum of Visual Impairment. Am. J. Ophthalmol. 2021, 227, 53–65. [Google Scholar] [CrossRef]
- Gombas, J.; Csakvari, J. Experiences of Individuals with Blindness Or Visual Impairment during the COVID-19 Pandemic Lockdown in Hungary. Br. J. Vis. Impair. 2021, 40, 378–388. [Google Scholar] [CrossRef]
- Mahfuz, S.; Sakib, M.N.; Husain, M.M. A Preliminary Study on Visually Impaired Students in Bangladesh during the COVID-19 Pandemic. Policy Futures Educ. 2022, 20, 402–416. [Google Scholar] [CrossRef]
- Shrestha, S.; Lu, D.; Tian, H.; Cao, Q.; Liu, J.; Rizzo, J.; Seiple, W.H.; Porfiri, M.; Fang, Y. Active Crowd Analysis for Pandemic Risk Mitigation for Blind Or Visually Impaired Persons. In Proceedings of the Computer Vision—ECCV 2020 Workshops, Glasgow, UK, 23–28 August 2020; pp. 422–439. [Google Scholar]
- Luo, G.; Pundlik, S. Influence of COVID-19 Lockdowns on the Usage of a Vision Assistance App among Global Users with Visual Impairment: Big Data Analytics Study. J. Med. Internet Res. 2021, 23, e26283. [Google Scholar] [CrossRef]
- Bellomo, N.; Chaplain, M.A. Predicting Pandemics in a Globally Connected World, Volume 1: Toward a Multiscale, Multidisciplinary Framework Through Modeling and Simulation; Springer: Berlin/Heidelberg, Germany, 2022. [Google Scholar]
- Nair, V.; Budhai, M.; Olmschenk, G.; Seiple, W.H.; Zhu, Z. ASSIST: Personalized Indoor Navigation Via Multimodal Sensors and High-Level Semantic Information; Springer: Berlin/Heidelberg, Germany, 2019. [Google Scholar]
- Zhou, C.; Yuan, J.; Liu, H.; Qiu, J. Bluetooth Indoor Positioning Based on RSSI and Kalman Filter. Wirel. Pers. Commun. 2017, 96, 4115–4130. [Google Scholar] [CrossRef]
- Wang, J.; Zheng, F.; Hu, Y.; Zhang, D.; Shi, C. Instantaneous Sub-Meter Level Precise Point Positioning of Low-Cost Smartphones. Navigation 2023, 70, navi.597. [Google Scholar] [CrossRef]
- Balamurugan, G.; Jayaraman, V.; Naidu, D.V. Survey on UAV Navigation in GPS Denied Environments. In Proceedings of the 2016 International Conference on Signal Processing, Communication, Power and Embedded System (SCOPES), Paralakhemundi, India, 3–5 October 2016; pp. 198–204. [Google Scholar]
Classification of Visual Impairment | Visual Acuity Worse Than |
---|---|
Distance—Mild | 6/12 |
Distance—Moderate | 6/18 |
Distance—Severe | 6/60 |
Distance—Blindness | 3/60 |
Near Vision Impairment | N6 or M.08 with existing correction |
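The distance-vision categories above are defined by simple acuity thresholds, so they can be encoded directly. A minimal sketch in Python, using decimal acuity notation (Snellen 6/12 = 0.5); the function name is illustrative, not from the cited standard:

```python
def classify_distance_impairment(acuity: float) -> str:
    """Map a decimal visual acuity (Snellen 6/12 = 0.5, 6/18 ≈ 0.33,
    6/60 = 0.1, 3/60 = 0.05) to a distance-vision impairment category,
    following the 'visual acuity worse than' thresholds in the table."""
    if acuity < 3 / 60:
        return "Blindness"
    if acuity < 6 / 60:
        return "Severe"
    if acuity < 6 / 18:
        return "Moderate"
    if acuity < 6 / 12:
        return "Mild"
    return "No distance impairment"

print(classify_distance_impairment(0.08))  # 6/75 is worse than 6/60 → "Severe"
```

Near-vision impairment (N6 or M.08 with existing correction) is assessed separately and is not covered by this sketch.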
Reference | Title | Location | Test Sample |
---|---|---|---|
[7] | The Characteristics of Blind and Visually Impaired People Evacuation in Case of Fire | Russia | 201 visually impaired participants (68.2% over 40 years of age) classified according to Russian medical legislation, under which all people with disabilities are divided into three groups depending on the functional system of the body: the most severe problems fall into group I and the least severe into group III. |
[8] | Evacuation characteristics of visually impaired people—a qualitative and quantitative study | Denmark | 40 visually impaired participants aged 10–69 years were classified according to Danish classification (based on visual acuity ‘x’) as follows: A—Visually Impaired (0.1 < x ≤ 0.3) B—Social blindness (0.01 < x ≤ 0.1) C—Practical blindness (0.001 < x ≤ 0.01) D—Total blindness (x ≤ 0.001) |
[9] | Mobility-related accidents experienced by people with visual impairment | United States | 307 visually impaired survey participants aged 18 years and older were asked to describe their level of vision loss as either ‘blind’ (at most some light perception) or ‘legally blind’ (legally blind but not blind). |
[10] | Mobility in Individuals with Moderate Visual Impairment | United States | 22 ‘low vision’ participants aged 19–58 were tested on visual acuity, visual field, and contrast sensitivity. |
[11] | Does Visual Impairment Affect Mobility Over Time? The Salisbury Eye Evaluation Study | United States | 2520 Salisbury Eye Evaluation Study participants aged 65 years and older followed over 2, 6, and 8 years after baseline. Visual impairment was defined as best-corrected visual acuity worse than 20/40 or a visual field of approximately less than 20°. |
[12] | Emergency lighting and wayfinding provision systems for visually impaired people: Phase I of a study | United Kingdom | 30 participants aged 36–73 years were classified only according to their cause of visual impairment, including RP, Macular degeneration, Glaucoma, and Diabetic retinopathy. |
[13] | Association of Visual Impairment with Mobility and Physical Function | United States | Interview of 5143 older residents aged 65 years and older from three communities (Established Populations for the Epidemiologic Studies of the Elderly), classified according to visual acuity screening, self-reported activities of daily living and mobility, and objective physical performance measures of balance, walking, and rising from a chair. |
[14] | Effect of obstacle density on the travel time of visually impaired people | China | 8 sighted and 32 visually impaired participants aged 28–55 years were classified as ‘near blindness’ according to Chinese national standards. The range of vision loss (based on visual acuity) of the visually impaired people was <0.05, and they were described as being unable to walk without a walking stick or assistance from family members. |
[15] | Visual function, visual attention, and mobility performance in low vision | United States | 35 visually impaired participants aged 20–80 years with low vision due to various visual disorders classified according to UFV and clinical measures of contrast sensitivity, visual field, and visual acuity. Two models were considered; series 1 used the UFV scores as measured, and series 2 used the UFV scores corrected for visual field loss (only counting errors in areas of intact visual field). |
[16] | Limitations in mobility: experiences of visually impaired older people | The Netherlands | 10 visually impaired older participants aged 63–95 at a Dutch center for visual impairment with the following inclusion criteria: 1. Had applied for mobility training at the center 2. Had visual deficiencies 3. Were experiencing problems in mobility as a pedestrian 4. Had never before received mobility treatment 5. Had received treatment from someone other than the first researcher 6. Did not have any of the following problems: deafness or secondary psychiatric disabilities, or severe physical, diabetic or neurological disabilities 7. Spoke and understood Dutch 8. Were willing to participate in the study |
Visual Task | Minimum Reflectance of the Lighter Surface (CIE Y) [%] | Minimum Luminance Contrast (Michelson) [%] | Minimum Luminance Contrast (Weber) [%]
---|---|---|---|
Large surface areas (i.e., walls, floors, doors, ceiling), elements and components to facilitate orientation (i.e., handrails, door furniture, tactile walking surface indicators, and visual indicators on glazed areas) | ≥40 | ≥30 | ≥45 |
Potential hazards (i.e., visual indicators on steps, glazed areas), small items (i.e., switches and controls), and self-contrasting markings | ≥50 | ≥60 | ≥75 |
Text information, i.e., signage | ≥70 | ≥60 | ≥75 |
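The two contrast thresholds in each row are mutually consistent with the Michelson and Weber contrast definitions (a Michelson contrast of 60% corresponds to a Weber contrast of 75% for the same surface pair). A short sketch of both definitions, assuming reflectances are given as CIE Y values in percent:

```python
def michelson_contrast(lrv_light: float, lrv_dark: float) -> float:
    """Michelson contrast (%) between two light reflectance values:
    (L_max - L_min) / (L_max + L_min)."""
    return (lrv_light - lrv_dark) / (lrv_light + lrv_dark) * 100

def weber_contrast(lrv_light: float, lrv_dark: float) -> float:
    """Weber contrast (%) relative to the lighter surface:
    (L_max - L_min) / L_max."""
    return (lrv_light - lrv_dark) / lrv_light * 100

# A 4:1 reflectance ratio, e.g., an 80% surface against a 20% surface:
print(michelson_contrast(80, 20))  # 60.0
print(weber_contrast(80, 20))      # 75.0
```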
Different Areas | Illuminance [lx]
---|---|
Outdoor environments | 20 |
Horizontal surfaces indoors | 100 |
Stairs, ramps, escalators, moving walkways | 150–200 |
Habitable spaces | 300–500 |
Visual tasks with small details or low contrast | 1000 |
Standard Title | Key Details for VIPs | Limitations | Reference |
---|---|---|---|
ISO 21542: Building construction. Accessibility and usability of the built environment. | Standards in tactile, visual, and audible information for VIPs orientation, emergency warning system, minimization of obstacles and hazards in the pathway, standards of relief facilities for assistance dogs | 1. Lack of standards for elements of external environments 2. Lack of discussion of regional variations in VIP requirements and local regulations 3. Complex and costly to implement all standards 4. It does not account for the rapid emergence of navigation aids for VIPs 5. Insufficient incorporation of VIP feedback on real navigation needs | [17] |
ISO 9241-171: Ergonomics of human-system interaction | Guidelines on names and labels for user-interface elements, user preference settings, special considerations for accessibility adjustments, controls and operations, compatibility with assistive technologies, etc. | 1. Lack of detailed discussion on real-world requirements of VIPs 2. Lack of guidance on the integration of physical aids such as tactile maps | [18]
European Standard EN 301 549 | Comprehensive guidance on improving the accessibility of ICT products and services, alignment with WCAG | 1. Neglects considerations for physical aids 2. Struggles to match the pace of evolving technologies for accessibility | [19]
2010 ADA Standards for Accessible Design | Outlines both scoping and technical requirements for accessibility; crucial standards such as the removal of barriers at lift call buttons, speech-output-enabled machines, tactile signage, and accessible routes | 1. Irregular revision frequency 2. Lack of detailed exploration of how navigation needs change with different levels of vision impairment 3. Impracticality in international regions | [20]
2018 BS-8300: Design of an accessible and inclusive built environment | Provides comprehensive guidelines for designing accessible environments for VIPs in the UK; BS 8300-1 for external environments, BS 8300-2 for buildings | 1. Significant implementation costs 2. Requires ongoing training and maintenance effort | [21,22]
Feature | Literature Gap | Hard User Requirement | Soft User Requirement |
---|---|---|---|
Entrance/Exit | - Entrance/exit type (such as a revolving or push/pull door) | - Accessibility of entrance/exit according to user ability - Usability as an entrance, exit, or both | - Prerequisites to accessing these (such as a key card) |
Pathway | - Walking position (left/right) in one-way systems - Texture and gradient | - Directionality and walking position | - Texture information for initial user planning |
Crossing | - Length of crossing - Green time | - Controlled, uncontrolled, or spontaneous crossings | - Length of crossing - Green time |
Decision Point | - Accessibility according to user ability - Emergency egress scenarios | - Accessibility of decision points according to user ability | - Egress routes and nearest exit upon alarm |
Tactile Surfaces | - Variability and inconsistency of existing tactile surface standards and applications | - Do not identify tactile surfaces to the user | - Surface textures, such as a smooth surface, to indicate entry points and routes |
Escalator | - Escalators may change movement direction (up/down/static) - Up/down/static escalators may vary in position | - Escalator movement direction - Escalator position | - Escalator width as a user preference - Handrail position - Distance to landing |
Stairs | - Stairs incorporate different tactile surfaces and edges - Navigating stairs is a repeated action forming tacit knowledge | - Do not identify the number of stairs or tactile surfaces | - Availability of handrail and its position - Head height obstacles or landing height - Alert for open riser staircase |
Lift Lobby | - New lifts with a centralized lift call | - Number of lifts - Route guidance to the arriving lift | - Connections between lift systems (call lift) |
Required Lift | - Not considered in [26] standard | - Positioning and ordering of buttons | - Presence of an accessibility button |
Toilet | - Not considered in [26] standard | - Toilet type and accessibility (such as a changing places toilet) | - Locks or interactivity (such as a radar key or code) |
Queuing Point | - Not considered in [26] standard | - Queuing point structure (structured/unstructured) | - Typical queue length |
Gateline | - Span of gateline and position of entry/exit gates | - Span of gateline and position of entry/exit gates | - Points at which not to stand or wait |
Platform | - Platform width - Seating location - Shelter availability - Help point location | - Platform width and edge warning - Seating location - Help point location | - Shelter availability - Customer information display location - Availability of platform edge doors - Illumination level |
ID | CVI | PVL | Impairment | VA | CS | VF | Avg. |
---|---|---|---|---|---|---|---|
A1 | SI | Y | Cataracts, Nystagmus, Detached Retina, Glaucoma | 2 | 2 | 1 | 1.7 |
A2 | SSI | Y | Still’s Disease | 1 | 1 | 1 | 1.0 |
B1 | SSI | Y | Retinopathy of Prematurity—Retinal Fibroplasia | 1 | 1 | 1 | 1.0 |
B2 | SSI | Y | Optic Nerve Atrophy | 1 | 1 | 1 | 1.0 |
C1 | SSI | Y | Retinitis Pigmentosa | 1 | 1 | 1 | 1.0 |
C2 | SSI | Y | Retinitis Pigmentosa | 1 | 1 | 1 | 1.0 |
D1 | SSI | N | Congenital Heredity Endothelial Dystrophy | 1 | 1 | 1 | 1.0 |
E1 | SSI | Y | Retinitis Pigmentosa | 1 | 1 | 1 | 1.0 |
E2 | SI | N | Glaucoma, Detached Retina | 1 | 2 | 1 | 1.3 |
F1 | SSI | Y | Retinitis Pigmentosa | 1 | 1 | 1 | 1.0 |
G1 | SSI | Y | Glaucoma | 1 | 1 | 2 | 1.3 |
H1 | SSI | N | Stargardt Disease | 1 | 1 | 2 | 1.3 |
Avg. | | | | 1.1 | 1.2 | 1.2 | 1.1
SD | | | | 0.3 | 0.4 | 0.4 | 0.2
CV (%) | | | | 26.6 | 33.4 | 33.4 | 19.6
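The Avg., SD, and CV rows can be reproduced from the per-participant scores with sample statistics (SD divides by n − 1; CV = SD/mean × 100). A sketch using the VA column:

```python
from statistics import mean, stdev

def summarize(scores):
    """Average, sample standard deviation, and coefficient of variation (%)
    for one column of participant scores, each rounded to 1 decimal place."""
    avg = mean(scores)
    sd = stdev(scores)  # sample SD (divides by n - 1), matching the table
    return round(avg, 1), round(sd, 1), round(sd / avg * 100, 1)

va_scores = [2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]  # VA column, A1..H1
print(summarize(va_scores))  # (1.1, 0.3, 26.6)
```

The same function applied to the CS column ([2, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 1]) returns (1.2, 0.4, 33.4), matching the table's summary rows.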
Metric | Individual | Unrestricted Unidirectional | Restricted Unidirectional | Unrestricted Opposing | Restricted Opposing
---|---|---|---|---|---|
Avg. (m/s) | 0.8 | 0.9 | 0.8 | 0.8 | 0.7 |
SD (m/s) | 0.2 | 0.1 | 0.2 | 0.2 | 0.2 |
CV (%) | 21.7 | 11.1 | 19.9 | 19.9 | 22.9 |
System Category | System Name | Sensors/Main Components | Performance: Range/Accuracy/Other | Advantage(s) | Limitation(s) | Reference |
---|---|---|---|---|---|---|
RGB-D camera (vision) | Not Applicable | Glasses with RealSense D435i camera, embedded computer, headphones, microphone | Target detection success rate of 95.46%, optimal path prediction success rate of 92.60% | 1. Novel fusion neural network aids VIPs in spatial target localization 2. The path planning method combines neural networks to find the optimal route based on fusion data 3. Wearable navigation system showcases visual-auditory fusion and navigation for VIPs | 1. Enhancement of battery life and exploration of alternative power sources 2. Emphasis on design improvements for E-glasses, prioritizing comfort and aesthetics 3. Limited investigation into the real-world requirements of VIPs | [34]
RGB-D camera (vision) | ISANA | Embedded RGB-D camera, wide-angle camera (visual motion tracking), 9-axis IMU | Accuracy: ISANA localization error smaller than 1 m without drifts (Closed loop) | 1. Indoor wayfinding for VIPs with environmental awareness, obstacle detection, and avoidance, robust HMI 2. Average navigation guidance error decreased significantly by adding SmartCane for accurate heading measure, and traveling time decreased for the same travel | 1. User interface to be improved, such as better speech recognition to achieve robustness in noisy places; audio frequency should be adjustable and customized 2. Lack of ability to navigate in complex, cluttered environments such as underground stations | [35] |
Raspberry Pi camera | Not Applicable | Raspberry Pi 4 (4 GB RAM), ultrasonic sensor (HC-SR04), Raspberry Pi camera (V2), servo motor (SG90), battery | High object detection probability (e.g., 0.97 for a suitcase), average efficiency of 91% | 1. Efficient in terms of battery life, night vision, and weight 2. Simple and easy-to-use algorithms do not require internet access to operate 3. Clear image captured with minimum noise 4. Detection of the clear demarcation between the hindered object and the surroundings for VIPs to walk easily 5. Successful detection of the periphery of objects (e.g., transparent bottles) | 1. Possible faulty object detection when the object lies partly outside the frame 2. Distortion in the captured image when the VIP is moving 3. Hard to differentiate transparent objects from the background | [36]
Stereo camera (vision) | Not Applicable | Binocular camera (environment perception), visual odometry (head tracking), IMU | Sensing range: 10–20 m; Positioning accuracy | 1. Environmental perception (through modeling the geometric scene background with the binocular camera) 2. A robust head tracking algorithm (combining IMU and visual odometry) obtains frequent and robust head pose estimations (head orientation and position) with small latency 3. Sonification of objects in users’ vicinity to overcome limitations of other assistive aids; semantically meaningful (different sounds distinguish object categories) and comfortable 4. Enables users to walk in (unknown) urban areas and avoid obstacles 5. Wearable, lightweight, unobtrusive | 1. Although sonification can notify users about dangerous objects, the lack of guidance on which direction to walk increases the risk of danger 2. The bulky helmet with camera and backpack must be minimized 3. Lack of significant evidence for operations indoors 4. Sonification ignores the needs of deafblind people | [37]
Visual SLAM | Virtual-Blind-Road Following-Based Wearable Navigation Device | Depth camera, fisheye camera, ultrasonic rangefinder | Accuracy: Average deviation distance less than 0.3 m with a small variance | 1. Route following (PoI graph + dynamic subgoal selecting-based route following algorithm) and obstacle avoidance problems solved simultaneously 2. Low cost, small dimensions, relatively easy integration | 1. Higher accuracy is required in applications other than walking 2. Reduced continuity from relying only on cameras (e.g., worse performance at nighttime) | [38]
LiDAR (vision) | LASS | LiDAR device, stereo headphone set (Audio Technica AUD ATHAD500X), scanning laser rangefinder (Hokuyo URG-04LX) | LASS performs better (higher obstacle hit rate) for closer obstacles and for obstacles at a 90-degree angle | 1. Compared with ultrasound as the signal source, LiDAR is better due to its shorter wavelength and focused beam, and thus higher spatial resolution 2. LiDAR can constantly scan the surrounding environment without head movements 3. LASS translates the distance information of the object to sounds with different pitches, which can be optimized to users’ reactions (closer objects with higher frequencies) 4. Wearable LiDAR device 5. Less training time 6. LASS can generate and receive signals so users can focus on interpreting spatial information | 1. Difficult to use the laser pointer to point at targets; longer training time is needed for the users 2. Long time cost for full scans 3. The complex layout of obstacles in the real world | [39]
System Category | System Name | Sensors/Main Components | Performance: Range/Accuracy/Other | Advantage(s) | Limitation(s) | Reference |
---|---|---|---|---|---|---|
Bluetooth | Hybrid Indoor Positioning System for VIPs Using Bluetooth and Google Tango | BLE system (Samsung Galaxy S4), Tango system (Lenovo Phab 2 Pro) | (Hybrid system): The hybrid system requires less aid from researchers than a BLE-only navigation system (p = 0.0096) but similar travel duration | 1. Able to create detailed 3D models of the indoor environment by utilizing the Tango system (which has an RGB-D camera for feature-based indoor localization and pose estimation) 2. Sensors of Tango are sensitive to noise 3. Tango can try to correct the accumulated drift errors 4. Vibrotactile aid (vibration speed depends on how close the object is) | 1. BLE signals are extremely noisy due to attenuation by materials inside the building; the fingerprinting method compares the current Received Signal Strength Indicator (RSSI) against pre-built snapshots of the area’s radio landscape, but the beacons can only produce a coarse location for users, so greater accuracy is required for localization 2. The Tango system can only provide feature maps for around one floor at a time (BLE can help determine which Tango ADF to load to report an accurate position to users) 3. Future work on supplements such as vibrotactile feedback | [40]
UWB | SUGAR | Wall-mounted Ubisense UWB-based sensors, smartphone, UWB tag embedded in headphones, server with various modules | Accuracy: positioning errors up to 20 cm in most cases | 1. UWB can distinguish direct-path signals from reflected signals 2. UWB combines the position estimation method to diminish the effects of obstacles 3. High accuracy for obtaining the user’s location 4. Voice commands and acoustic signals guide users to the selected destinations | High expense even in small-scale deployments; the cost can limit its usage in private homes | [41]
RFID | Mobile Blind Navigation System Using RFID | Android-operated mobile, RFID reader on white cane, GPS, RFID tags | High performance in obstacle avoidance and blind guidance | 1. Lightweight of the RFID handheld reader 2. Voice recognition and text-to-speech to communicate with VIPs 3. High performance in obstacle avoidance | 1. Cost of deployment of the RFID tags 2. Lack of detection of moving obstacles 3. Lack of experimentation to test the efficiency of the system in different environments | [42] |
Ultrasonic | Navguide | Six ultrasonic sensors (all are wide beam ping sensors for detecting floor-level and knee-level obstacles), one wet floor detector sensor, one step-down button, micro-controller circuits, four vibration motors, one battery | Under most experiment sets, Navguide reduces the rate of collision with obstacles; participants with Navguide can complete the tasks faster compared with participants using a white cane | 1. Main goal is to supply logical maps of environments and feedback on obstacles to users 2. Ability to detect wet floors to prevent slipping accidents 3. Detections of floor-level and knee-level obstacles 4. Tactile and auditory sensing are available for simple and priority information 5. Cheap and light to carry | 1. Unable to detect any pit and downhill 2. Unable to detect downstairs, higher risk of falling 3. Only sense the wet floor when users step on it | [43] |
Wi-Fi | INAVI | Wi-Fi signal scanner (NodeMCU ESP8266), a speaker | Location recognition accuracy of 96%, an average localization accuracy of 1.5 m (with more checkpoints) | 1. Leverage the Wi-Fi infrastructure within most public buildings to reduce cost 2. Ensure maximum portability with minimal hardware requirements (only ESP8266 and a speaker) | 1. Long response times occur without Wi-Fi signals 2. Wi-Fi signal fluctuations affect RSSIs 3. Trilateration needs three signals, limiting coverage in smaller areas 4. Users lack obstacle guidance, posing challenges | [44] |
Haptic | HaptiSole | Four vibrator motors (links to the board of Arduino) on the sole, Bluetooth transmission module, battery | An average accuracy rate of 94% in identifying four main directions (e.g., up, forward, right, left) | 1. Users can accurately recognize directions displayed on their feet 2. Blindfolded users can reach destinations 3. The system is embedded in shoes, and directions are communicated by vibration motors 4. An evaluation platform that helps EOA designers evaluate user interfaces without positioning errors 5. Considerations for ergonomics, self-sufficiency, resolution, aesthetics | 1. Users faced challenges in recognizing up-right and up-left directions 2. Multiple outages occurred during testing 3. The performance under diverse environmental and ground conditions remains unexplored | [45] |
Smartphone-based | NavCog3 | Smartphone (with built-in IMU, BLE beacons) | Accuracy: average accuracy is 1.65 m; Performance: 93.8% of 260 turns are successful; turning by 45 degrees is more difficult than by 90 degrees | 1. Provides turning instructions to users 2. Instant audio feedback when the orientation is incorrect 3. Supplies information such as distance, location, and direction of landmarks and points of interest 4. Compared with LiDAR, ultrasonic, UWB, etc., NavCog3 does not need additional signal receivers | 1. For large-scale environments, there is a high deployment cost, and it is time-consuming to collect landmark information indoors 2. Lack of control over users’ exposure to earlier versions of the system, which can affect accuracy or subjective responses | [46]
Infrared | Low Energy Precise Navigation System for the Blind with Infrared Sensors | 1. Device with two infrared sensors (MLX90614, with 90- and 35-degree FOV) (measures infrared radiation) on VIPs’ arms 2. Gyroscope, accelerometer, magnetometer sensors (MPU9250 module) (for compensating the deflection of the system) | Accuracy: 20 cm (average accuracy compared with tape measurements) | 1. Takes advantage of the different temperatures of walls and floors (or objects made of varied materials) to continuously find the distance between VIPs and objects such as walls and buildings; larger temperature differences when far from the object 2. Low power consumption 3. No need to emit signals | 1. The accuracy depends on the resolution of the infrared sensors or cameras 2. Accuracy should be improved for more complex environments and weather | [47]
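Several of the RF-based systems above reduce to two steps: converting signal measurements into ranges, then solving for position (e.g., INAVI's trilateration from three Wi-Fi RSSIs). A minimal sketch under a log-distance path-loss model; the 1 m reference RSSI and path-loss exponent are illustrative calibration values, not parameters reported by the cited systems:

```python
import math

def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -59.0,
                     path_loss_exp: float = 2.0) -> float:
    """Range (m) from one RSSI sample via the log-distance path-loss model.
    tx_power_dbm is the calibrated RSSI at 1 m (an assumed value here)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(a1, a2, a3, d1, d2, d3):
    """2-D position fix from three anchor points and ranges: subtract the
    first circle equation from the other two to get a linear pair, then
    solve it with Cramer's rule (exact when the ranges are consistent)."""
    ax, ay = a1; bx, by = a2; cx, cy = a3
    A1, B1 = 2 * (bx - ax), 2 * (by - ay)
    C1 = d1**2 - d2**2 + bx**2 - ax**2 + by**2 - ay**2
    A2, B2 = 2 * (cx - ax), 2 * (cy - ay)
    C2 = d1**2 - d3**2 + cx**2 - ax**2 + cy**2 - ay**2
    det = A1 * B2 - A2 * B1  # zero if the anchors are collinear
    return ((C1 * B2 - C2 * B1) / det, (A1 * C2 - A2 * C1) / det)

# Anchors at (0,0), (10,0), (0,10); true position (3,4):
d = [math.dist((3, 4), p) for p in [(0, 0), (10, 0), (0, 10)]]
print(trilaterate((0, 0), (10, 0), (0, 10), *d))  # ≈ (3.0, 4.0)
```

In practice RSSI fluctuations (a limitation noted for both the BLE and Wi-Fi rows) make the raw ranges noisy, which is why deployed systems add filtering or fingerprinting on top of this geometry.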
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Feghali, J.M.; Feng, C.; Majumdar, A.; Ochieng, W.Y. Comprehensive Review: High-Performance Positioning Systems for Navigation and Wayfinding for Visually Impaired People. Sensors 2024, 24, 7020. https://doi.org/10.3390/s24217020