Search Results (17)

Search Parameters:
Keywords = smart walker

20 pages, 5560 KiB  
Article
Development of Assistance Level Adjustment Function for Variable Load on a Forearm-Supported Robotic Walker
by Yuto Mori, Soichiro Yokoyama, Tomohisa Yamashita, Hidenori Kawamura and Masato Mori
Sensors 2024, 24(19), 6456; https://doi.org/10.3390/s24196456 - 6 Oct 2024
Viewed by 1164
Abstract
With the progression of an aging society, the importance of walking assistance technology has been increasing, and the research and development of robotic walkers for individuals requiring walking support are advancing. However, a conventional, constant amount of assistance does not provide the propulsion force required for the walking speed that users want. To solve this problem, we propose an algorithm for determining the amount of assistance needed to maintain walking speed when each user's average walking speed is set as the target speed. A robotic walker was developed by attaching BLDC motors to an actual walker, together with a control algorithm for assistance based on sampling-type PID control. The effectiveness of the assistance determination algorithm and the usefulness of its parameters were demonstrated through experiments using weights loaded on the forearm support and different target speeds. Subject experiments were then conducted to verify the ability to maintain target speeds, and a questionnaire survey confirmed that the assistance did not interfere with actual walking. The proposed algorithm for determining assistance levels maintained the target speeds and allowed the necessary level of assistance to be adjusted.
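The assistance controller described above is a sampling-type PID loop that drives the walker speed toward a per-user target. The sketch below is a minimal, generic illustration of that control pattern; the gains, sampling period, duty-cycle limits, and example values are assumptions, not the parameters used in the paper.

```python
# Minimal sketch of a sampled PID speed controller for a motorized walker.
# All gains and limits below are illustrative placeholders.

class SampledPID:
    def __init__(self, kp, ki, kd, dt, duty_min=0.0, duty_max=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.duty_min, self.duty_max = duty_min, duty_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_speed, measured_speed):
        """Return a motor duty ratio computed from the speed error at this sample."""
        error = target_speed - measured_speed
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        duty = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(self.duty_min, min(self.duty_max, duty))

# Example: regulate toward 0.46 m/s, one of the target speeds reported above.
pid = SampledPID(kp=0.8, ki=0.3, kd=0.05, dt=0.05)
print(pid.update(target_speed=0.46, measured_speed=0.40))
```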
Figures:
Figure 1. Main forces when walking with a forearm-supported walker.
Figure 2. Appearance of the robotic walker when not in use.
Figure 3. Appearance of the robotic walker when in use.
Figure 4. Architecture of equipment on a robotic walker.
Figure 5. Output of speed and duty ratio at a load capacity of 8 kg and a target speed of 0.46 m/s.
Figure 6. Output of speed and duty ratio at a load capacity of 14 kg and a target speed of 0.46 m/s.
Figure 7. Output of speed and duty ratio at a load capacity of 20 kg and a target speed of 0.46 m/s.
Figure 8. Output of speed and duty ratio at a load capacity of 8 kg and a target speed of 0.79 m/s.
Figure 9. Output of speed and duty ratio at a load capacity of 14 kg and a target speed of 0.79 m/s.
Figure 10. Output of speed and duty ratio at a load capacity of 20 kg and a target speed of 0.79 m/s.
Figure 11. Subject with weights attached.
Figure 12. A single trial where walking speed stabilized at the target walking speed. The red dotted line represents Subject A's average normal walking speed (0.601 m/s).
Figure 13. A trial where walking speed stabilized above the target speed. The red dotted line represents Subject C's average normal walking speed (0.629 m/s).
Figure 14. A trial with a slow initial speed. The red dotted line represents Subject E's average normal walking speed (0.656 m/s).
Figure 15. A trial of the circular route. Based on the pre-measurement results of Subject C, the ideal velocity for the left wheel was calculated to be 0.504 m/s, and for the right wheel 0.714 m/s.
24 pages, 7040 KiB  
Article
Virtual Obstacle Avoidance Strategy: Navigating through a Complex Environment While Interacting with Virtual and Physical Elements
by Fabiana Machado, Matheus Loureiro, Marcio Bezerra, Carla Zimerer, Ricardo Mello and Anselmo Frizera
Sensors 2024, 24(19), 6212; https://doi.org/10.3390/s24196212 - 25 Sep 2024
Viewed by 871
Abstract
Robotic walking devices can be used for intensive exercises to enhance gait rehabilitation therapies, and Mixed Reality (MR) techniques may improve engagement through immersive and interactive environments. This article introduces an MR-based multimodal human–robot interaction strategy designed to enable shared control with a Smart Walker. The MR system integrates virtual and physical sensors to (i) enhance safe navigation and (ii) facilitate intuitive mobility training in personalized virtual scenarios, using an interface with three elements: an arrow to indicate where to go, laser lines to indicate nearby obstacles, and an ellipse to show the activation zone. The multimodal interaction is context-based: the presence of nearby individuals and obstacles modulates the robot's behavior during navigation to simplify collision avoidance while allowing for proper social navigation. An experiment was conducted to evaluate the proposed strategy and the self-explanatory nature of the interface. The volunteers were divided into four groups, each navigating under different conditions, and three evaluation methods were employed: task performance, self-assessment, and observational measurement. The analysis revealed that participants enjoyed the MR system and understood most of the interface elements without prior explanation: volunteers who did not receive any introductory explanation of the interface elements were mostly able to guess their purpose, and volunteers who interacted with the interface in the first session provided more correct answers. In future research, virtual elements will be integrated with the physical environment to enhance user safety during navigation, and the control strategy will be improved to consider both physical and virtual obstacles.
(This article belongs to the Special Issue Mobile Robots for Navigation: 2nd Edition)
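The activation-zone check described in the Figure 4 caption below compares each laser return, taken at angle β with range l, against the distance r(β) from the ellipse center to its boundary. A minimal sketch of that comparison follows; the semi-axis values and the assumption that β is measured from the walker's forward axis are illustrative, not the paper's exact geometry.

```python
import math

def ellipse_boundary_distance(a, b, beta):
    """Distance from the ellipse center to its edge at angle beta (polar form of
    an ellipse with semi-axes a along the forward axis and b laterally)."""
    return (a * b) / math.sqrt((b * math.cos(beta)) ** 2 + (a * math.sin(beta)) ** 2)

def returns_inside_zone(scan, a=1.2, b=0.8):
    """scan: list of (beta, l) laser returns with beta in [-pi/2, pi/2].
    Returns the subset of returns that fall inside the elliptical activation zone."""
    inside = []
    for beta, l in scan:
        if -math.pi / 2 <= beta <= math.pi / 2 and l < ellipse_boundary_distance(a, b, beta):
            inside.append((beta, l))
    return inside

# Example: one return well inside the zone, one outside it.
print(returns_inside_zone([(0.0, 0.5), (0.8, 2.0)]))
```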
Figures:
Figure 1. UFES vWalker (right) and its digital twin (left) with their virtual and physical sensors.
Figure 2. Block diagram containing the proposed MR system together with the obstacle avoidance strategy. The system is divided into four main blocks: the UFES vWalker, digital twin, mixed reality, and control strategy.
Figure 3. Elements in the proposed MR system, showing the digital twin and the ellipse zone. The white rectangles represent the virtualization of physical elements near the user. The first-person perspective is displayed inside the orange rectangle, highlighting part of the yellow ellipse, a collectible item (a star), and the surrounding white rectangles. The red lines indicate virtual boundaries, which are interpreted as virtual obstacles to prevent the user from crossing them.
Figure 4. Operation of block 3 (zone check) in the block diagram. For each laser, we verify whether the identification occurs inside or outside the ellipse by comparing the distance r from the center of the ellipse to the edge with the identification distance l for the specific angle β. This check is performed from the limit of −π/2 to the limit of π/2. The figure shows three identifications: i and k inside the ellipse and j outside the ellipse.
Figure 5. Situation where the UFES vWalker can get stuck between two obstacles. When the obstacle is identified in (a), the UFES vWalker facilitates movement to the right. However, when identifying the obstacle to the right in (b), the UFES vWalker facilitates movement to the left and the situation in (a) is repeated. To ensure safety, the UFES vWalker checks the space between obstacles; if it is not possible to pass through safely, the system instantiates a virtual wall, as seen in (c).
Figure 6. Images depicting the user interface. In (a), a static obstacle on the left side of the UFES vWalker is within the yellow ellipse, and the green arrow indicates a suggested path to the right. In (b), a person is represented by the blue ellipse, and the arrow advises the user to proceed to the left.
Figure 7. Physical environment in which the experiment took place, consisting of (a) a corridor, (b) a door with a narrow passage, and (c) a room.
Figure 8. The experiments took place within a virtual environment depicted in (a–d), showing the scenarios in which the users performed the tasks. Notably, this virtual environment replicates the physical environment composed of a corridor, a narrow doorway, and a room.
24 pages, 9505 KiB  
Article
Analysis of Gait Kinematics in Smart Walker-Assisted Locomotion in Immersive Virtual Reality Scenario
by Matheus Loureiro, Arlindo Elias, Fabiana Machado, Marcio Bezerra, Carla Zimerer, Ricardo Mello and Anselmo Frizera
Sensors 2024, 24(17), 5534; https://doi.org/10.3390/s24175534 - 27 Aug 2024
Viewed by 1493
Abstract
The decline in the neuromusculoskeletal capabilities of older adults can affect motor control, independence, and locomotion. Because the elderly population is increasing worldwide, assisting independent mobility and improving rehabilitation therapies have become priorities. The combination of rehabilitation robotic devices and virtual reality (VR) tools can be used in gait training to improve clinical outcomes, motivation, and treatment adherence. Nevertheless, VR tools may be associated with cybersickness and changes in gait kinematics. This paper analyzes the gait parameters of fourteen elderly participants across three experimental tasks: free walking (FW), smart walker-assisted gait (AW), and smart walker-assisted gait combined with VR assistance (VRAW). The kinematic parameters of both lower limbs were captured by a 3D wearable motion capture system. This research aims to assess the kinematic adaptations that occur when using a smart walker and how the integration between this robotic device and the VR tool can influence such adaptations. Additionally, cybersickness symptoms were investigated using a questionnaire for virtual rehabilitation systems after the VRAW task. The experimental data indicate significant differences between FW and both AW and VRAW. Specifically, there was an overall reduction in sagittal motion of 16%, 25%, and 38% in the hip, knee, and ankle, respectively, for both AW and VRAW compared to FW. However, no significant differences between the AW and VRAW kinematic parameters and no adverse symptoms related to VR were identified. These results indicate that VR technology can be used in walker-assisted gait rehabilitation without compromising kinematic performance, while presenting potential benefits related to motivation and treatment adherence.
(This article belongs to the Collection Sensors for Gait, Human Movement Analysis, and Health Monitoring)
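The 16%, 25%, and 38% figures above are reductions in sagittal range of motion between conditions. Below is a minimal sketch of how such a reduction could be computed from joint-angle traces; the arrays and numbers are illustrative, not the study's data.

```python
import numpy as np

def sagittal_rom(angle_deg):
    """Range of motion of a joint-angle trace over one gait cycle (degrees)."""
    angle_deg = np.asarray(angle_deg, dtype=float)
    return angle_deg.max() - angle_deg.min()

def rom_reduction_percent(rom_free, rom_assisted):
    """Percent reduction of assisted-walking ROM relative to free walking."""
    return 100.0 * (rom_free - rom_assisted) / rom_free

# Illustrative traces (not study data): hip flexion/extension during FW vs. AW.
cycle = np.linspace(0, 2 * np.pi, 101)
hip_fw = 20 * np.sin(cycle) + 10
hip_aw = 16 * np.sin(cycle) + 10
print(round(rom_reduction_percent(sagittal_rom(hip_fw), sagittal_rom(hip_aw)), 1))  # ~20.0
```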
Figures:
Figure 1. A participant from the experiment using the UFES vWalker and the five subsystems: Odometry and Control (OC), Human–Robot–Environment Interaction (HREI), Human–Robot Interaction (HRI), Motion Capture (MC), and Virtual Reality Integration (VRI).
Figure 2. An example of the joint angles of the hip, knee, and ankle, respectively, during a gait cycle from a participant in the FW task.
Figure 3. The boxplot of all the spatiotemporal parameters divided by the FW, AW, and VRAW tasks. The boxplots for AW and VRAW, marked with an asterisk (*), indicate that statistically significant differences were found between these tasks and FW.
Figure 4. The boxplot of the kinematic parameters of the right hip joint in the sagittal plane, and both sides in the coronal and transverse planes, divided by the FW, AW, and VRAW tasks. The boxplots for AW and VRAW, marked with an asterisk (*), indicate that statistically significant differences were found between these tasks and FW.
Figure 5. The boxplot of the kinematic parameters of the right knee joint in the sagittal plane, and both sides in the coronal and transverse planes, divided by the FW, AW, and VRAW tasks. The boxplots for AW and VRAW, marked with an asterisk (*), indicate that statistically significant differences were found between these tasks and FW.
Figure 6. The boxplot of the kinematic parameters of the right ankle joint in the sagittal plane, and both sides in the coronal plane, divided by the FW, AW, and VRAW tasks. The boxplots for AW and VRAW, marked with an asterisk (*), indicate that statistically significant differences were found between these tasks and FW.
15 pages, 3438 KiB  
Article
Automatic Gait Classification Model Empowered by Machine Learning for People with and without Osteoporosis Using Smart Walker
by Nazia Ejaz, Saad Jawaid Khan, Fahad Azim, Muhammad Asif, Emil Teuțan, Alin Pleșa, Răzvan Păcurar and Sergiu-Dan Stan
Appl. Sci. 2024, 14(9), 3874; https://doi.org/10.3390/app14093874 - 30 Apr 2024
Viewed by 1499
Abstract
Osteoporosis constitutes a significant public health concern necessitating proactive prevention, treatment, and monitoring efforts. Timely identification is of paramount importance in averting fractures and alleviating the overall disease burden. The field of osteoporosis diagnosis has witnessed a surge of interest in machine learning applications; this technology excels at recognizing patterns and forecasting the onset of osteoporosis, paving the way for more efficacious preventive and therapeutic interventions. Smart walkers emerge as valuable tools in this context, serving as data acquisition platforms for datasets tailored to machine learning techniques, and models trained on such data to discern patterns indicative of osteoporosis play a pivotal role in enhancing diagnostic accuracy. In this study, data were gathered from force sensors embedded in the handlebars of conventional walkers for 40 participants: 20 in robust health and 20 diagnosed with osteoporosis. A windowing operation was used to increase the size of the dataset. The data were normalized, and k-fold cross-validation was applied to assess how well our model performs on untrained data. We used multiple machine learning algorithms to create an accurate model for automatic monitoring of users' gait, with the Random Forest classifier performing best at 95.40% accuracy. To achieve the best classification accuracy on the validation dataset, the hyperparameters of the Random Forest classifier were further tuned on the training data. The results suggest that machine learning-based automatic monitoring of gait parameters could lead to accurate, non-laborious, cost-effective, and efficient diagnostic tools for osteoporosis and other musculoskeletal disorders. Further research is needed to validate these findings.
(This article belongs to the Section Mechanical Engineering)
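A minimal sketch of the pipeline the abstract outlines follows: windowed features from the handlebar force signals, normalization, k-fold cross-validation, and a Random Forest classifier. The window size, feature set, hyperparameters, and synthetic data are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def window_features(force, window=100, step=50):
    """Slice a 1-D force signal into overlapping windows and compute simple
    per-window features (mean, std, min, max)."""
    feats = []
    for start in range(0, len(force) - window + 1, step):
        w = force[start:start + window]
        feats.append([w.mean(), w.std(), w.min(), w.max()])
    return np.array(feats)

# Synthetic stand-ins for the recorded handlebar forces of the two groups.
rng = np.random.default_rng(0)
healthy = window_features(rng.normal(20, 2, 5000))
osteo = window_features(rng.normal(25, 4, 5000))
X = np.vstack([healthy, osteo])
y = np.array([0] * len(healthy) + [1] * len(osteo))

# Normalization + Random Forest evaluated with 5-fold cross-validation.
model = make_pipeline(StandardScaler(), RandomForestClassifier(n_estimators=200, random_state=0))
scores = cross_val_score(model, X, y, cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0))
print(scores.mean())
```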
Figures:
Figure 1. Protocol of the 10 m walk test with a smart walker.
Figure 2. Flow chart showing data processing, where ML is machine learning.
Figure 3. Comparison of forces exerted by (a) healthy participants and (b) osteoporosis patients.
Figure 4. Distribution of forces exerted by healthy participants on the left-side handlebar (HealthyFLS), forces applied by healthy participants on the right-side handlebar (HealthyFRS), forces applied by osteoporosis patients on the left-side handlebar (OsteoporosisFLS), and forces applied by osteoporosis patients on the right-side handlebar (OsteoporosisFRS).
Figure 5. Force sensor readings at every heel strike during the 10 m walk test. (a) People without osteoporosis; (b) people with osteoporosis [31].
Figure 6. Vector graph for Random Forest classifier.
Figure 7. Confusion matrix of our classification model after k-fold cross-validation using the Random Forest classifier.
17 pages, 3835 KiB  
Article
Examining Gait Characteristics in People with Osteoporosis Utilizing a Non-Wheeled Smart Walker through Spatiotemporal Analysis
by Nazia Ejaz, Saad Jawaid Khan, Fahad Azim, Mehwish Faiz, Emil Teuțan, Alin Pleșa, Alexandru Ianosi-Andreeva-Dimitrova and Sergiu-Dan Stan
Appl. Sci. 2023, 13(21), 12017; https://doi.org/10.3390/app132112017 - 3 Nov 2023
Cited by 1 | Viewed by 1557
Abstract
Fragility fractures, caused by low-energy trauma, are a significant global health concern, with 158 million people aged 50 and over at risk. Hip fractures, a common issue in elderly patients, are often linked to underlying conditions such as osteoporosis. This study proposed a cost-effective solution using a non-wheeled smart walker with load sensors to measure gait parameters, addressing the high cost of traditional gait analysis equipment. The prototype used PASCO PS2200 load cells for force measurement, eliminating the need for Arduino UNO or other microcontroller-based hardware. A lightweight PS2198 amplifier amplified the signal, which was transmitted via USB to a personal computer, and PASCO Capstone software was used for data recording and visualization. The smart walker was tested on forty volunteers divided into two equal groups, those with osteoporosis and those without, each performing a 10 m walk test three times. ANOVA comparing the spatiotemporal parameters (TSPs) of the two participant groups (α = 0.05) showed significant differences in the time taken to complete the walk test (p < 0.01), left step length (p = 0.03), walking speed (p = 0.02), and stride length (p < 0.02). The results indicate that this smart walker is a reliable tool for assessing gait patterns in individuals with osteoporosis. The proposed system can be an alternative to time-consuming and costly methods such as motion capture and to socially stigmatizing devices such as exoskeletons, and it can also be used to identify risk factors for osteoporosis.
(This article belongs to the Special Issue Mechatronics System Design in Medical Engineering)
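The group comparison above is a one-way ANOVA on each spatiotemporal parameter at α = 0.05. A minimal sketch with made-up walk-test times rather than the study's measurements:

```python
from scipy import stats

# Illustrative 10 m walk-test completion times in seconds; not the study's data.
time_without_osteoporosis = [9.8, 10.2, 11.0, 9.5, 10.7, 10.1]
time_with_osteoporosis = [12.4, 13.1, 11.9, 12.8, 13.5, 12.2]

# One-way ANOVA with two groups (equivalent to an independent t-test here).
f_stat, p_value = stats.f_oneway(time_without_osteoporosis, time_with_osteoporosis)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # significant at alpha = 0.05 if p < 0.05
```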
Figures:
Figure 1. Smart Walker with on board sensors.
Figure 2. PASPortal window, software operation during data collection.
Figure 3. Validation of sensor measurement after calibration. (a) The sensor on the right handlebar, and (b) the sensor on the left handlebar.
Figure 4. Force sensor reading. Force sensor on rear right side (FsRRs (N)), force sensor on rear left side (FsRLs (N)), force sensor on front right side (FsFRs (N)), and force sensor on front left side (FsFLs (N)).
Figure 5. Sum of forces measured by sensor on rear and front (left) and on rear and front (right) during walk test using smart walker.
Figure 6. A 55-year-old female volunteer from the osteoporosis group performing a walk test.
Figure 7. Force sensor readings at every heel strike during the 10 m walk test. (a) People without osteoporosis, (b) people with osteoporosis.
Figure 8. Graphical representation of spatiotemporal gait parameters of people with and without osteoporosis. (a) Average right step time, (b) average left step time, (c) stride time, and (d) average time taken to complete walk test.
Figure 9. Graphical representation of spatiotemporal gait parameters of people with and without osteoporosis. (a) Average right step length, (b) average left step length, (c) average stride length, and (d) average cadence.
Figure 10. Average walking speed of people with and without osteoporosis.
10 pages, 1721 KiB  
Article
Taking a Load Off: User Perceptions of Smart Offloading Walkers for Diabetic Foot Ulcers Using the Technology Acceptance Model
by M. G. Finco, Gozde Cay, Myeounggon Lee, Jason Garcia, Elia Salazar, Tze-Woei Tan, David G. Armstrong and Bijan Najafi
Sensors 2023, 23(5), 2768; https://doi.org/10.3390/s23052768 - 2 Mar 2023
Cited by 8 | Viewed by 4258
Abstract
People with diabetic foot ulcers (DFUs) are commonly prescribed offloading walkers, but inadequate adherence to prescribed use can be a barrier to ulcer healing. This study examined user perspectives of offloading walkers to provide insight into ways to help promote adherence. Participants were randomized to wear (1) irremovable, (2) removable, or (3) smart removable walkers (smart boot) that provided feedback on adherence and daily walking. Participants completed a 15-item questionnaire based on the Technology Acceptance Model (TAM). Spearman correlations assessed associations between TAM ratings and participant characteristics, and chi-squared tests compared TAM ratings between ethnicities as well as 12-month retrospective fall status. A total of 21 adults with DFU (age 61.5 ± 11.8 years) participated. Smart boot users reported that learning how to use the boot was easy (ρ = −0.82, p ≤ 0.001). Regardless of group, people who identified as Hispanic or Latino, compared to those who did not, reported that they liked using the smart boot (p = 0.05) and would use it in the future (p = 0.04). Non-fallers, compared to fallers, reported that the design of the smart boot made them want to wear it longer (p = 0.04) and that it was easy to take on and off (p = 0.04). Our findings can help inform considerations for patient education and the design of offloading walkers for DFUs.
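A minimal sketch of the two statistics named above follows: a Spearman correlation between a TAM item rating and a participant characteristic, and a chi-squared test on a group-by-response contingency table. All values are illustrative, not the study's responses.

```python
import numpy as np
from scipy import stats

# Illustrative TAM item ratings (1 = strongly agree ... 5 = strongly disagree)
# and a participant characteristic (e.g., a cognitive-score proxy).
ratings = np.array([1, 2, 1, 3, 2, 1, 4, 2])
characteristic = np.array([28, 25, 27, 22, 24, 29, 20, 26])

rho, p_rho = stats.spearmanr(ratings, characteristic)

# Chi-squared test on agree/disagree counts for two groups.
contingency = np.array([[6, 2],   # group A: agree, disagree
                        [3, 5]])  # group B: agree, disagree
chi2, p_chi2, dof, expected = stats.chi2_contingency(contingency)
print(rho, p_rho, chi2, p_chi2)
```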
Figures:
Figure 1. Consolidated Standards of Reporting Trials (CONSORT) diagram for inclusion and exclusion of participants.
Figure 2. Participants were randomized to wear (1) an irremovable cast walker (iRCW), (2) an original removable cast walker that is standard of care (oRCW; OG indicates original gadget), or (3) a smart removable cast walker designed to provide feedback on adherence via a sensor and smartwatch (sRCW).
Figure 3. The overall smart offloading system, used by participants assigned to use the smart removable cast walker (sRCW). The system consists of a sensor that attaches to the cast walker, as well as a watch that provides the participant notifications regarding their adherence.
Figure 4. Significant Spearman correlations between participant ratings on Technology Acceptance Model (TAM) questionnaire items and participant characteristics. Ratings were coded as 1 = strongly agree, 2 = agree, 3 = neutral, 4 = disagree, and 5 = strongly disagree. Significance was considered p ≤ 0.05. p-values with asterisks (*) and dark blue shading denote significance of p ≤ 0.05. Bold text with two asterisks (**) and light blue shading denotes significance of p ≤ 0.001, which is discussed in the main text. Non-significant correlations are listed in Supplementary Table S1. Abbreviations: MoCA = Montreal Cognitive Assessment, PROMIS = Patient-Reported Outcome Measurement Information System, FES-I = Falls Efficacy Scale International.
16 pages, 1252 KiB  
Review
The Use of Sports Rehabilitation Robotics to Assist in the Recovery of Physical Abilities in Elderly Patients with Degenerative Diseases: A Literature Review
by Fangyuan Ju, Yujie Wang, Bin Xie, Yunxuan Mi, Mengyun Zhao and Junwei Cao
Healthcare 2023, 11(3), 326; https://doi.org/10.3390/healthcare11030326 - 21 Jan 2023
Cited by 8 | Viewed by 3570
Abstract
The increase in the number of elderly patients with degenerative diseases has brought additional medical and financial pressures, adding to the burden on society. The development of sports rehabilitation robotics (SRR) is becoming increasingly sophisticated at the technical level of its application; however, few studies have analyzed how it works and how effective it is in aiding rehabilitation, and few individualized exercise rehabilitation programs have been developed for elderly patients. The purpose of this study was to analyze the working methods and the effects of different types of SRR and then to suggest the feasibility of applying SRR to enhance the physical abilities of elderly patients with degenerative diseases. The research team searched 633 English-language journal articles published over the past five years and selected 38 of them for a narrative literature review. Our summary found the following: (1) The current types of SRR are generally classified as end-effector robots, smart walkers, intelligent robotic rollators, and exoskeleton robots, with exoskeleton robots being the most widely used. (2) The current working methods include assistive tools as the main intermediaries (robots assist patients to participate), patients as the main intermediaries (patients drive the assistive tools to participate), and sensors as intermediaries (myoelectric-driven robots promote patient participation). (3) Better recovery was perceived for elderly patients using SRR than is generally achieved through traditional single-movement recovery methods, especially in strength, balance, endurance, and coordination; however, there was no significant improvement in speed or agility after using SRR.
Figures:
Figure 1. Flowchart detailing the systematic search, screening, eligibility, and inclusion procedure.
Figure 2. (a) Number of publications by disease; (b) number of publications by type of rehabilitation robot.
Figure 3. Rehabilitation effects of different types of robots.
27 pages, 4373 KiB  
Article
Application of EMGB to Study Impacts of Public Green Space on Active Transport Behavior: Evidence from South Korea
by Myung Ja Kim and C. Michael Hall
Int. J. Environ. Res. Public Health 2022, 19(12), 7459; https://doi.org/10.3390/ijerph19127459 - 17 Jun 2022
Cited by 6 | Viewed by 2943
Abstract
Public green spaces (e.g., parks, green trails, greenways) and motivations to engage in active transport are essential for encouraging walking and cycling. However, how these key factors influence walker and cyclist behavior is potentially being increasingly shaped by the use of smart apps as they become more ubiquitous in everyday practices. To fill this research gap, this work creates and tests a theoretically integrated study framework grounded in an extended model of goal-directed behavior, including public green space and motivation together with the perceived usefulness of smart apps. To accomplish the purpose of this study, we conducted an online survey of Korean walkers (n = 325) and cyclists (n = 326) between 10 and 25 July 2021 and applied partial least squares structural equation modeling and multi-group analysis to validate the research model. Results revealed that active transport users' awareness of public green space positively influences attitude toward (γ = 0.163), as well as behavioral intention of (γ = 0.159), walking and cycling. Additionally, motivation (extrinsic and intrinsic) greatly influences attitude (γ = 0.539) and behavioral intention (γ = 0.535). Subjective norms (γ = 0.137) and positive (γ = 0.466) and negative anticipated emotions (γ = 0.225) have a significant impact on the desire that leads to behavioral intention. High and low perceived smart app usefulness also significantly moderates the relationships between public green space and attitude (t-value = 25.705), public green space and behavioral intention (t-value = 25.726), motivation and attitude (t-value = −25.561), and motivation and behavioral intention (t-value = −15.812). Consequently, the findings are useful to academics and practitioners by providing new knowledge and insights.
Figures:
Figure 1. Proposed research model.
Figure 2. Results of path analysis.
16 pages, 3533 KiB  
Article
Gait Phase Estimation Based on User–Walker Interaction Force
by Pengcheng Li, Yasuhiro Akiyama, Xianglong Wan, Kazunori Yamada, Mayu Yokoya and Yoji Yamada
Appl. Sci. 2021, 11(17), 7888; https://doi.org/10.3390/app11177888 - 26 Aug 2021
Cited by 2 | Viewed by 1897
Abstract
Smart walkers have been developed for the assistance and rehabilitation of elderly people and patients with physical health conditions. A force sensor mounted under the handle is widely used in smart walkers to establish a human–machine interface. The interaction force can be used to control the walker and to estimate gait parameters using methods such as the Kalman filter for real-time estimation. However, estimation performance decreases when the peaks of the interaction force are not captured. To improve the stability and accuracy of gait parameter estimation, we propose an online estimation method that continuously estimates the gait phase and cadence. A multiple-model switching mechanism is introduced to improve estimation performance when the gait is asymmetric, and an adaptive rule is proposed to improve estimation robustness and accuracy. Simulations and experiments demonstrate the effectiveness and accuracy of the proposed gait parameter estimation method: the average estimation error for the gait phase is 0.691 rad when the gait is symmetric and 0.722 rad when it is asymmetric.
(This article belongs to the Section Robotics and Automation)
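The figure captions below compare the proposed estimator against an adaptive oscillator (AO) baseline. For reference, here is a minimal sketch of a generic adaptive frequency (phase) oscillator locking onto a roughly periodic interaction-force signal; it is not the paper's multiple-model method, and the coupling gain, initial frequency, and test signal are assumptions.

```python
import numpy as np

def adaptive_phase_oscillator(force, dt=0.01, k=10.0, omega0=2 * np.pi * 0.8):
    """Generic adaptive frequency oscillator: the phase and frequency are driven
    by the periodic input so that omega converges toward the input's angular
    frequency. Returns the phase and frequency time series."""
    phi, omega = 0.0, omega0
    phases, omegas = [], []
    for f in force:
        coupling = k * f * np.sin(phi)
        phi += (omega - coupling) * dt
        omega += -coupling * dt
        phases.append(phi % (2 * np.pi))
        omegas.append(omega)
    return np.array(phases), np.array(omegas)

# Illustrative quasi-sinusoidal interaction force at 1 Hz (not recorded data).
t = np.arange(0, 30, 0.01)
force = np.cos(2 * np.pi * 1.0 * t)
phase, omega = adaptive_phase_oscillator(force)
print(omega[-1] / (2 * np.pi))  # converges toward ~1 Hz
```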
Figures:
Figure 1. Interaction force along vertical direction. Black line represents the raw data of the interaction force, blue line represents the signal after BMFLC, and red line represents the reconstructed signal of the AOs.
Figure 2. Architecture of proposed gait phase estimation method.
Figure 3. Simulation results with different frequency combinations: (a) ω_s1 = 1 Hz and ω_s2 = 1 Hz; (b) ω_s1 = 1.2 Hz and ω_s2 = 1 Hz; (c) ω_s1 = 1.4 Hz and ω_s2 = 1 Hz; (d) ω_s1 = 1.6 Hz and ω_s2 = 1 Hz. S_1 is the simulated signal; the black solid line represents the signal with ω_s1 and the black dotted line represents the signal with ω_s2. φ_s is the phase of the simulated signal. ω_s is the estimated frequency of the AO (red) and the proposed method (blue). e_s is the phase estimation error of the AO (red) and the proposed method (blue).
Figure 4. Simulated signal with transient changing amplitude.
Figure 5. Phase estimation results. The black line is the reference phase, the blue line is the estimated phase of the proposed method, and the red line is the estimated phase of the AO.
Figure 6. The value of σ.
Figure 7. Walker used in experiment.
Figure 8. Gait phase estimation error and Heel Strike (HS) estimation error (red circles = estimated HS; green triangles = HS detected by pressure sensor).
Figure 9. HS misdetections.
Figure 10. Interaction force under HS misdetection (red circle).
18 pages, 5116 KiB  
Article
Semi-Remote Gait Assistance Interface: A Joystick with Visual Feedback Capabilities for Therapists
by Daniel E. Garcia A., Sergio D. Sierra M., Daniel Gomez-Vargas, Mario F. Jiménez, Marcela Múnera and Carlos A. Cifuentes
Sensors 2021, 21(10), 3521; https://doi.org/10.3390/s21103521 - 19 May 2021
Cited by 3 | Viewed by 3499
Abstract
The constant growth of pathologies affecting human mobility has led to the development of different assistive devices that provide physical and cognitive assistance. Smart walkers are a particular type of these devices, since they integrate navigation systems, path-following algorithms, and user interaction modules to ensure natural and intuitive interaction. Although these functionalities are often implemented in rehabilitation scenarios, there is a need to actively involve healthcare professionals in the interaction loop while guaranteeing safety for them and for patients. This work presents the validation of two visual feedback strategies for the teleoperation of a simulated robotic walker during an assisted navigation task. For this purpose, a group of 14 clinicians from the rehabilitation area formed the validation group. A simple path-following task was proposed, and the feedback strategies were assessed through the kinematic estimation error (KTE) and a usability survey. A KTE of 0.28 m was obtained for the feedback strategy on the joystick. Additionally, significant differences were found through a Mann–Whitney–Wilcoxon test for the perception of behavior and confidence toward the joystick according to the modes of interaction (p-values of 0.04 and 0.01, respectively). The use of visual feedback with this tool contributes to research areas such as the remote management of therapies and the monitoring of mobility rehabilitation.
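The usability comparison above relies on a Mann–Whitney–Wilcoxon test between interaction modes. A minimal sketch with illustrative Likert-style ratings, not the study's survey data:

```python
from scipy import stats

# Illustrative 5-point ratings of confidence toward the joystick under the two
# feedback modes (FS = on screen, FJ = on joystick); not the study's responses.
ratings_fs = [3, 2, 3, 4, 3, 2, 3]
ratings_fj = [4, 5, 4, 4, 5, 4, 5]

u_stat, p_value = stats.mannwhitneyu(ratings_fs, ratings_fj, alternative="two-sided")
print(u_stat, p_value)
```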
Figures:
Figure 1. System proposed to provide visual feedback in teleoperation applications of smart walkers.
Figure 2. Illustration of the interaction system constituted by the feedback strategies, the path following task, and the simulation environment. x is the joystick position, τ is the virtual torque, F is the impulse force, v is the linear velocity, ω is the angular velocity, X_ω is the x coordinate of the walker's position, Y_ω is the y coordinate of the walker's position, and θ_ω is the walker's orientation. FJ refers to feedback on the joystick and FS to feedback on screen.
Figure 3. Visual feedback strategies applied in the robotic walker teleoperation. The upper figures show the simulation environment (Gazebo) and the robotic walker used in the system. The central figures illustrate the ideal path and the proposed strategies with their characteristics in the graphic interface. The lower part exhibits the action on the joystick for each method.
Figure 4. Illustration of the feedback strategy on the joystick. Three LEDs placed on the base of the device indicate the existence or absence of a path following error. The arrows indicate how to move the joystick to correct the error. The desired path is shown in green. The achieved path is shown in red.
Figure 5. Path following task examples for one subject. Training and validation trials for (a) feedback on the screen and (b) feedback on the joystick are shown.
Figure 6. Distribution of acceptance and usability questionnaire answers. Feedback on screen (FS), feedback on joystick (FJ). Asterisks indicate that there are significant differences between modes.
16 pages, 10947 KiB  
Article
A Smart Walker for People with Both Visual and Mobility Impairment
by Nafisa Mostofa, Christopher Feltner, Kelly Fullin, Jonathan Guilbe, Sharare Zehtabian, Salih Safa Bacanlı, Ladislau Bölöni and Damla Turgut
Sensors 2021, 21(10), 3488; https://doi.org/10.3390/s21103488 - 17 May 2021
Cited by 18 | Viewed by 8737
Abstract
In recent years, significant work has been done on technological enhancements for mobility aids (smart walkers). However, most of this work does not cover the millions of people who have both mobility and visual impairments. In this paper, we design and study four different configurations of smart walkers that are specifically targeted to the needs of this population. We investigated different sensing technologies (ultrasound-based sensors, infrared depth cameras, and RGB cameras with advanced computer vision processing), software configurations, and user interface modalities (haptic and audio-signal based). Our experiments show that there are several engineering choices that can be used in the design of such assistive devices. Furthermore, we found that a holistic evaluation of the end-to-end performance of the systems is necessary, as the quality of the user interface often has a larger impact on overall performance than increases in sensing accuracy beyond a certain point.
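The Figure 3 caption below describes reducing the Kinect depth image to a one-dimensional array along a central column and scanning it for a drop that signals an obstacle or missing floor. A minimal sketch of that idea follows; the threshold, orientation, and units are assumptions rather than the paper's processing pipeline.

```python
import numpy as np

def detect_depth_discontinuity(depth_column_m, max_jump_m=0.5):
    """Scan a 1-D depth profile (ordered from the bottom of the image upward,
    distances in meters, NaN where depth is unavailable) and return the index
    of the first jump larger than max_jump_m, or None if the path looks smooth."""
    d = np.asarray(depth_column_m, dtype=float)
    valid = ~np.isnan(d)
    diffs = np.abs(np.diff(d[valid]))
    jumps = np.where(diffs > max_jump_m)[0]
    return int(jumps[0]) if jumps.size else None

# Illustrative profile: ~4 m of smoothly increasing free floor, then a drop.
column = np.concatenate([np.linspace(0.8, 4.0, 300), np.full(50, 1.2)])
print(detect_depth_discontinuity(column))  # index of the discontinuity
```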
Figures:
Figure 1. The rollator configured with nine HC-SR04 ultrasonic sensors. Seven sensors are facing forward to capture obstacles in a wide area in front of the walker. To facilitate navigational guidance, two sensors (one on the left and the other on the right) are capturing obstacles.
Figure 2. Object detection of door and car with TensorFlow.
Figure 3. Top: the depth component of an image recorded by a Kinect camera in configuration (C). Darker colors show closer points, with black pixels representing points for which depth information is not available. The blue central column is the area processed for obstacle detection. Bottom: the one-dimensional array extracted from the depth map. Pixel 500 refers to the bottom of the image. The smooth increase in distance from 500 to 200 shows an approximately 4 m free area in front of the walker, with a drop starting after that.
Figure 4. The smart walker in configuration D.
Figure 5. The stages of processing the point cloud: (top) the high-resolution point cloud extracted from the Kinect sensor; (middle) the point cloud after the elimination of non-relevant points and downsampling; and (bottom) the points of an obstacle, after the floor plane was identified and removed from the image using RANSAC.
Figure 6. (top) Efficacy of the navigational guidance system for configurations A and B, measured as the percentage of the obstacles in the environment that were hit during the navigation (lower is better); and (bottom) time needed to perform a navigation task (lower is better).
Figure 7. A complex navigation task in an indoor environment. The disabled user needs to navigate from the bedroom to the bathroom, avoid obstacles such as the bed and the chair, and must find and open the appropriate doors.
19 pages, 2354 KiB  
Article
Evaluation of Physical Interaction during Walker-Assisted Gait with the AGoRA Walker: Strategies Based on Virtual Mechanical Stiffness
by Sergio D. Sierra M, Marcela Múnera, Thomas Provot, Maxime Bourgain and Carlos A. Cifuentes
Sensors 2021, 21(9), 3242; https://doi.org/10.3390/s21093242 - 7 May 2021
Cited by 11 | Viewed by 3489
Abstract
Smart walkers are commonly used as potential gait assistance devices to provide physical and cognitive assistance within rehabilitation and clinical scenarios. To understand such rehabilitation processes, several biomechanical studies have been conducted to assess human gait with passive and active walkers. In this work, several sessions were conducted with 11 healthy volunteers to assess three interaction strategies based on passive, low, and high mechanical stiffness values on the AGoRA Smart Walker. The trials were carried out in a motion analysis laboratory. Kinematic data were also collected from the smart walker sensory interface, and the interaction force between users and the device was recorded. The force required under the passive and low stiffness modes was 56.66% and 67.48% smaller, respectively, than under the high stiffness mode. An increase of 17.03% in hip range of motion, as well as the highest trunk inclination, was obtained under the resistive mode, suggesting a compensating motion to exert a higher impulse force on the device. Kinematic and physical interaction data suggested that the high stiffness mode significantly affected the users' gait pattern: users compensated their kinematics, tilting their trunk and lower limbs to exert higher impulse forces on the device.
(This article belongs to the Section Sensors and Robotics)
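The three interaction modes above differ in the virtual mechanical stiffness rendered by the walker. Below is a minimal admittance-style sketch of how a measured user force could be mapped to a commanded walker velocity under a virtual mass, damper, and spring; the model form and parameter values are assumptions, not the AGoRA controller.

```python
class VirtualImpedanceWalker:
    """Admittance-style sketch: m*a + b*v + k*x = F_user, integrated at step dt."""

    def __init__(self, mass=10.0, damping=25.0, stiffness=0.0, dt=0.01):
        # stiffness = 0 approximates a passive mode; larger values resist motion.
        self.m, self.b, self.k, self.dt = mass, damping, stiffness, dt
        self.x, self.v = 0.0, 0.0

    def step(self, user_force):
        accel = (user_force - self.b * self.v - self.k * self.x) / self.m
        self.v += accel * self.dt
        self.x += self.v * self.dt
        return self.v  # commanded forward velocity

# Same push, different virtual stiffness: the stiffer mode yields less motion.
for k in (0.0, 20.0, 80.0):
    walker = VirtualImpedanceWalker(stiffness=k)
    for _ in range(300):          # 3 s of a constant 15 N push
        v = walker.step(15.0)
    print(k, round(walker.x, 2), round(v, 2))
```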
Figures:
Figure 1. AGoRA Smart Walker illustration, a robotic platform for gait assistance and rehabilitation.
Figure 2. Description of the system's architecture to provide multiple assistance levels.
Figure 3. (a) Markers' setup on subject. (b) Markers' setup on the SW.
Figure 4. Reference paths for the experimental trials in the motion analysis laboratory. The area that the cameras were able to capture was 6 × 6 m².
Figure 5. Illustration of force and torque signals for one subject: Assistance Mode (AM), Passive Mode (PM), Resistance Mode (RM).
Figure 6. Comparison of sagittal plane joint angles for the assistance levels. Each graph was generated using average gait cycles and standard deviations within mode.
18 pages, 32837 KiB  
Article
An Introduction to Patterns for the Internet of Robotic Things in the Ambient Assisted Living Scenario
by Bruno Andò, Luciano Cantelli, Vincenzo Catania, Ruben Crispino, Dario Calogero Guastella, Salvatore Monteleone and Giovanni Muscato
Robotics 2021, 10(2), 56; https://doi.org/10.3390/robotics10020056 - 4 Apr 2021
Cited by 4 | Viewed by 4756
Abstract
The Internet of Things paradigm envisions interoperation among objects, people, and their surrounding environment. In the last decade, the spread of IoT-based solutions has been supported in various domains and scenarios by academia, industry, and standards-setting organizations. The wide variety of applications and the need for a higher level of autonomy and interaction with the environment have recently led to the rise of the Internet of Robotic Things (IoRT), where smart objects become autonomous robotic systems. As noted in the recent literature, many of the proposed solutions in the IoT field have to tackle similar challenges regarding the management of resources, interoperation among objects, and interaction with users and the environment. Given that, the concept of the IoT pattern has recently been introduced. In software engineering, a pattern is defined as a general solution that can be applied to a class of common problems; it is a template suggesting a solution for the same problem occurring in different contexts. Similarly, an IoT pattern provides a guide to designing an IoT solution, with the difference that software is not the only element involved. Starting from this idea, we propose the novel concept of the IoRT pattern. To the authors' knowledge, this is the first attempt at pattern authoring in the Internet of Robotic Things context. We focus on pattern identification by abstracting examples in the Ambient Assisted Living (AAL) scenario. A case study providing an implementation of the proposed patterns in the AAL context is also presented and discussed.
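The Figure 7 caption below describes the case-study walker's braking behavior: no obstacle, no braking; an obstacle on one side, brake that side; a frontal obstacle, brake both. A minimal sketch of that decision logic with an assumed distance threshold and sensor layout, not the project's implementation:

```python
def brake_command(left_dist_m, front_dist_m, right_dist_m, threshold_m=0.6):
    """Map three range readings to (left_brake, right_brake) booleans."""
    if front_dist_m < threshold_m:
        return True, True            # frontal obstacle: actuate both brakes
    if left_dist_m < threshold_m:
        return True, False           # obstacle on the left: brake the left wheel
    if right_dist_m < threshold_m:
        return False, True           # obstacle on the right: brake the right wheel
    return False, False              # clear path: brakes released

print(brake_command(0.4, 2.0, 1.5))  # (True, False)
```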
Figures:
Figure 1. Representation of the obstacle avoidance pattern detailing hardware (HW), software (SW), local, and local/remote components.
Figure 2. Representation of the indoor localization pattern detailing hardware (HW), software (SW), local, and local/remote components.
Figure 3. Representation of the inertial monitoring pattern detailing hardware (HW), software (SW), local, and local/remote components.
Figure 4. The smart walker developed for the SUMMIT project.
Figure 5. A block diagram showing the interconnection between the implemented patterns.
Figure 6. A simulation session during the testing of the obstacle avoidance algorithm.
Figure 7. Different operating conditions of the smart walker: (a) no obstacle is present: brakes are not actuated; (b) an obstacle on the left is detected: the left brake is actuated; (c) a frontal obstacle is detected: both brakes are actuated.
19 pages, 6421 KiB  
Article
ROS-Based Smart Walker with Fuzzy Posture Judgement and Power Assistance
by Yeong-Hwa Chang, Nilima Sahoo, Jing-Yuan Chen, Shang-Yi Chuang and Hung-Wei Lin
Sensors 2021, 21(7), 2371; https://doi.org/10.3390/s21072371 - 29 Mar 2021
Cited by 9 | Viewed by 2956
Abstract
In recent years, population aging has become an increasingly serious issue. With aging, the elderly often face problems that lead to slow walking, unstable or weak limbs, and even fall-related injuries, so it is important to develop assistive aid devices. In this study, a fuzzy controller-based smart walker with a distributed robot operating system (ROS) framework is designed to assist independent walking. The combination of a Raspberry Pi and a PIC microcontroller acts as the control kernel of the proposed device. In addition, environmental information and user postures can be recognized through the integration of sensors; the sensing data include the road slope, the velocity of the walker, and the user's grip forces. According to the sensing data, the fuzzy controller produces an assistive force that makes the walker move more smoothly and safely. In addition, a mobile application (app) is designed that allows the user's guardian to view the current status of the smart walker and to track the user's location.
(This article belongs to the Section Wearables)
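The controller above maps sensed slope and grip force to an assistive output through linguistic terms (the S/M/L labels in the figure captions below). A minimal sketch of triangular memberships with a weighted-rule, Sugeno-style inference step; the membership breakpoints and rule outputs are assumptions, not the paper's design.

```python
def tri(x, a, b, c):
    """Triangular membership with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def assistive_force(slope_deg, grip_force_n):
    # Linguistic terms for slope (downhill/flat/uphill) and grip force (S/M/L).
    slope = {"down": tri(slope_deg, -10, -5, 0), "flat": tri(slope_deg, -3, 0, 3),
             "up": tri(slope_deg, 0, 5, 10)}
    grip = {"S": tri(grip_force_n, 0, 5, 15), "M": tri(grip_force_n, 10, 20, 30),
            "L": tri(grip_force_n, 25, 40, 60)}
    # Sugeno-style rules: (slope term, grip term) -> crisp assist level (N).
    rules = {("down", "S"): -5, ("down", "M"): -10, ("down", "L"): -15,
             ("flat", "S"): 0, ("flat", "M"): 5, ("flat", "L"): 10,
             ("up", "S"): 10, ("up", "M"): 20, ("up", "L"): 30}
    num = den = 0.0
    for (s, g), out in rules.items():
        w = min(slope[s], grip[g])   # rule firing strength (AND = min)
        num += w * out
        den += w
    return num / den if den else 0.0

print(round(assistive_force(4.0, 35.0), 1))  # larger assist for an uphill push with a strong grip
```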
Figures:
Figure 1. Proposed smart walker architecture.
Figure 2. Real design of the proposed smart walker.
Figure 3. Real design of the proposed smart walker: (a) SolidWorks-drawn coupling, (b) 3D-printer-made coupling, (c) L-shaped bracket with SolidWorks drawing and 3D printing, (d) wheel with coupling device and rotary encoder disc.
Figure 4. Execution flowchart of the smart walker.
Figure 5. Robot operating system (ROS) framework of the smart walker.
Figure 6. Input membership functions.
Figure 7. Output membership functions.
Figure 8. Posture judgment (f_f = S): (a) f_d = S; (b) f_d = M; (c) f_d = L.
Figure 9. Posture judgment (f_f = M): (a) f_d = S; (b) f_d = M; (c) f_d = L.
Figure 10. Posture judgment (f_f = L): (a) f_d = S; (b) f_d = M; (c) f_d = L.
Figure 11. Snapshots of the smart walker assisting downhill.
Figure 12. The degree of slope, sensing force, and the assistive motor output (downhill).
Figure 13. Snapshots of the smart walker assisting on a flat surface.
Figure 14. The degree of slope, sensing force, and the assistive motor output (flat surface).
Figure 15. Snapshots of the smart walker assisting on a steep uphill.
Figure 16. The degree of slope, sensing force, and the assistive motor output (steeper uphill).
Figure 17. App showing the user's location, the user's health status, and environmental information encountered by the smart walker.
30 pages, 5622 KiB  
Review
Polymer Optical Fiber Sensors in Healthcare Applications: A Comprehensive Review
by Arnaldo G. Leal-Junior, Camilo A.R. Diaz, Letícia M. Avellar, Maria José Pontes, Carlos Marques and Anselmo Frizera
Sensors 2019, 19(14), 3156; https://doi.org/10.3390/s19143156 - 18 Jul 2019
Cited by 165 | Viewed by 12039
Abstract
Advances in medicine and improvements in quality of life have led to an increase in the life expectancy of the general population. An ageing world population has placed demands on the use of assistive technology and, in particular, on novel healthcare devices and sensors. Besides electromagnetic field immunity, polymer optical fiber (POF) sensors have additional advantages due to their material features, such as high flexibility, a lower Young's modulus (enabling high sensitivity to mechanical parameters), higher elastic limits, and impact resistance. Such advantages are well aligned with the instrumentation requirements of many healthcare devices and of movement analysis. Aiming at these advantages, this review paper presents the state-of-the-art developments of POF sensors for healthcare applications. A plethora of healthcare applications are discussed, including movement analysis, physiological parameter monitoring, instrumented insoles, and the instrumentation of healthcare robotic devices such as exoskeletons, smart walkers, actuators, prostheses, and orthoses. This review shows the feasibility of using POF sensors in healthcare applications and, given the aforementioned advantages, it is possible to envisage a further widespread use of such sensors in this research field in the next few years.
(This article belongs to the Special Issue Wearable Sensors and Devices for Healthcare Applications)
Figures:
Figure 1. World population ageing throughout the years and predictions for the next 10 years, adapted from [18].
Figure 2. Wearable healthcare devices and parameters for POF sensor applications. The figure also indicates which section each application is in.
Figure 3. Schematic representation of a multisensory architecture for remote health monitoring.
Figure 4. (a) Properties of different polymer optical fiber (POF) materials: (I) glass transition temperature, (II) optical attenuation, (III) Young's modulus, (IV) temperature sensitivity (considering fiber Bragg gratings (FBGs)), and (V) humidity sensitivity (considering FBGs). (b) Interrogation setup for intensity-variation-based and FBG sensors. The figure inset shows microscopic images of POFs.
Figure 5. Typical response curves from POFs. (a) Temperature response with storage modulus (E′) and loss factor (tan δ), (b) frequency response, and (c) creep response indicating the viscous and elastic parts of the polymer response.
Figure 6. POF-based sensors in wearable robot applications.
Figure 7. POF sensors on the instrumentation of smart walkers (SWs).
Figure 8. POF sensor applications in human movement.
Figure 9. Plantar pressure instrumentation systems using POF sensors.