Advanced Sensors/Devices for Ambient Assisted Living

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Wearables".

Deadline for manuscript submissions: closed (31 May 2023) | Viewed by 26033

Special Issue Editor

Department of Electrical, Electronic and Computer Engineering (DIEEI), University of Catania, Catania, Italy
Interests: smart sensing systems and readout electronics; assistive technologies; nano- and micro-sensors; µ-fluidics; bio-sensors; inkjet-printed and flexible sensors; sensors exploiting innovative materials (ferrofluids, ferroelectrics and multiferroics); sensor networks; smart signal processing and nonlinear techniques (including stochastic resonance and dithering) to improve the performance of sensors and bio-receptors
Special Issues, Collections and Topics in MDPI journals

Special Issue Information

Dear Colleagues,

The main aim of this Special Issue is to offer a forum for the professional exchange of knowledge on assistive technologies (AT). AT is a rapidly growing area of increasing strategic relevance, given its impact on the quality of life of older adults and people with various kinds of impairments (primary end-users), on their relatives and caregivers (secondary end-users), and on the economy and society as a whole. In this sector, measurement systems, sensors, smart embedded systems and related methodologies play a role of primary importance, and interest in the subject is strong in both the scientific and the industrial community. Despite the numerous results already available, there is still a large need for further research and novel solutions. In particular, multisensor platforms, body area sensor networks, and wireless sensor networks promise innovative contributions to this area, where a suitable amount of data must be acquired to properly monitor users' status, their habits, and their interaction with living environments. At the same time, advanced signal processing, artificial intelligence, and machine learning are strategic tools to improve data reliability and to optimize the amount and quality of information provided to users.

The scope of this Special Issue ranges from fundamental research to advanced technologies and applications of all kinds. Submissions are welcome on:

- Overviews of the state of the art in "Ambient Assisted Living" solutions, with particular regard to wellbeing and active aging;

- Innovative sensors for AT;

- Smart multisensor systems for AT;

- Wireless (body area) sensor networks for AT;

- Robotics for AT;

- Measurement methodologies, algorithms, and advanced signal processing for AT;

- AI, predictive analytics, and decision-making;

- Applications, such as activities of daily living (ADL) monitoring, fall detection, postural instability assessment, and habit monitoring;

- Design of new materials, circuits and systems, and presentation of recent research advances in the field.

Prof. Dr. Bruno Andò
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Assistive technologies
  • Wireless sensor networks
  • Multisensor systems
  • Artificial intelligence
  • Signal processing
  • Human activity monitoring

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (8 papers)


Research

17 pages, 6086 KiB  
Article
Novel Deep Learning Network for Gait Recognition Using Multimodal Inertial Sensors
by Ling-Feng Shi, Zhong-Ye Liu, Ke-Jun Zhou, Yifan Shi and Xiao Jing
Sensors 2023, 23(2), 849; https://doi.org/10.3390/s23020849 - 11 Jan 2023
Cited by 29 | Viewed by 4625
Abstract
Some recent studies use a convolutional neural network (CNN) or long short-term memory (LSTM) to extract gait features, but methods based on the CNN and LSTM have a high loss rate of time-series and spatial information, respectively. Gait has obvious time-series characteristics, while a CNN captures only waveform characteristics, so using a CNN alone for gait recognition loses time-series information. An LSTM can capture time-series characteristics, but its performance degrades on long sequences; a CNN, however, can compress the length of the feature vectors. In this paper, a sequential convolution LSTM network for gait recognition using multimodal wearable inertial sensors, called SConvLSTM, is proposed. Based on a 1D-CNN and a bidirectional LSTM network, the method automatically extracts features from the raw acceleration and gyroscope signals without manual feature design. The 1D-CNN is first used to extract high-dimensional features from the inertial sensor signals; while retaining the time-series structure of the data, it expands the feature dimension and compresses the length of the feature vectors. The bidirectional LSTM network is then used to extract the time-series features of the data. The proposed method uses fixed-length data frames as the input and does not require gait cycle detection, which avoids the impact of cycle detection errors on the recognition accuracy. We performed experiments on three public benchmark datasets: UCI-HAR, HuGaDB, and WISDM. The results show that SConvLSTM outperforms most state-of-the-art methods on all three datasets. Full article
(This article belongs to the Special Issue Advanced Sensors/Devices for Ambient Assisted Living)
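The fixed-length framing the abstract describes (sliding windows instead of gait-cycle detection) amounts to segmenting the raw inertial streams before they reach the 1D-CNN. A minimal NumPy sketch; the window length, stride, and channel count here are illustrative assumptions, not the paper's actual settings:

```python
import numpy as np

def sliding_windows(signal, window, stride):
    """Segment a (T, C) multichannel inertial signal into fixed-length
    frames of shape (window, C); trailing samples that do not fill a
    whole window are dropped."""
    n = (len(signal) - window) // stride + 1
    return np.stack([signal[i * stride : i * stride + window]
                     for i in range(n)])

# 6-channel stream (3-axis accelerometer + 3-axis gyroscope), 500 samples
stream = np.random.randn(500, 6)
frames = sliding_windows(stream, window=128, stride=64)
print(frames.shape)  # (6, 128, 6): 6 frames of 128 samples, 6 channels
```

Each frame would then be fed to the network as-is, which is why cycle-detection errors cannot propagate into the classifier.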
Show Figures

Figure 1: Main steps of IMU-based human activity recognition.
Figure 2: Comparison between raw and filtered IMU data: (a) 3D acceleration; (b) 3D angular velocity.
Figure 3: Schematic diagram of the sliding window.
Figure 4: Main architecture of the proposed SConvLSTM network for human activity recognition.
Figure 5: Main architecture of the 1D-CNN network.
Figure 6: Schematic diagram of the memory cell.
Figure 7: Schematic diagram of the bidirectional LSTM network.
Figure 8: Confusion matrices for the UCI HAR (a), HuGaDB (b), and WISDM (c) datasets.
Figure 9: ROC curves of the proposed method and several deep-learning-based methods on the UCI HAR dataset.
19 pages, 3065 KiB  
Article
A Comparison among Different Strategies to Detect Potential Unstable Behaviors in Postural Sway
by Bruno Andò, Salvatore Baglio, Salvatore Graziani, Vincenzo Marletta, Valeria Dibilio, Giovanni Mostile and Mario Zappia
Sensors 2022, 22(19), 7106; https://doi.org/10.3390/s22197106 - 20 Sep 2022
Cited by 6 | Viewed by 1890
Abstract
Assistive Technology helps to assess the daily living and safety of frail people, with particular regard to the detection and prevention of falls. In this paper, a comparison is provided among different strategies to analyze postural sway, with the aim of detecting unstable postural status in the standing condition as a precursor of potential falls. Three approaches are considered: (i) a threshold algorithm on time-based features, (ii) a Neuro-Fuzzy inference system fed by time-based features, and (iii) a Neuro-Fuzzy inference system fed by Discrete-Wavelet-Transform-based features. The analysis was performed on a wide dataset and exploited performance indexes aimed at assessing the accuracy and reliability of the predictions provided by the above-mentioned strategies. The results obtained demonstrate valuable performance of the three strategies in correctly distinguishing between stable and unstable postural status. However, the analysis of robustness against noisy data highlights better performance of the Neuro-Fuzzy inference systems with respect to the threshold-based algorithm. Full article
(This article belongs to the Special Issue Advanced Sensors/Devices for Ambient Assisted Living)
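The threshold strategy on time-based features (approach (i)) can be illustrated with a toy sketch: compute a couple of common sway features from the anterior-posterior (AP) and medio-lateral (ML) displacement traces and flag a trial when either exceeds a threshold. Both the feature set and the threshold values here are illustrative assumptions, not the paper's actual indexes:

```python
import numpy as np

def sway_features(ap, ml):
    """Two common time-domain sway features (hypothetical choice):
    total sway path length and radial RMS displacement."""
    path = np.sum(np.hypot(np.diff(ap), np.diff(ml)))  # total path
    rms = np.sqrt(np.mean(ap**2 + ml**2))              # radial RMS
    return path, rms

def is_unstable(ap, ml, path_thr=5.0, rms_thr=0.8):
    """Threshold rule: flag the trial as potentially unstable when
    either feature exceeds its (illustrative) threshold."""
    path, rms = sway_features(ap, ml)
    return path > path_thr or rms > rms_thr

t = np.linspace(0, 10, 1000)
stable = (0.1 * np.sin(2 * np.pi * 0.3 * t),
          0.1 * np.cos(2 * np.pi * 0.2 * t))
print(is_unstable(*stable))  # False for this low-amplitude trace
```

A Neuro-Fuzzy system (approaches (ii)-(iii)) would replace the hard thresholds with learned membership functions over the same kind of features, which is what gives it the noise robustness reported above.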
Show Figures

Figure 1: (a) The architecture mimicking standing postural sway behaviors; (b) the equivalent node position and the main quantities used to reconstruct the AP and ML dynamics.
Figure 2: The threshold algorithm considered in this work to analyze postural sway and detect potentially unstable behaviors [22].
Figure 3: The Neuro-Fuzzy inference system considered in this work to analyze postural sway and detect potentially unstable behaviors [39].
Figure 4: Behavior of the threshold-based algorithm for postural status detection on the (a) training and (b) test datasets.
Figure 5: Behavior of the NF inference system for postural status detection on the (a) training and (b) test datasets.
Figure 6: Features (8)-(10) estimated by the DWT for each of the computed levels d1-d5.
Figure 7: Performance indexes (12)-(13) as a function of the range of influence.
Figure 8: Results of the NF inference system fed with DWT-based features for the optimal range of influence, on the (a) training and (b) test datasets.
Figure 9: Behavior of the threshold algorithm as a function of the level of noise added to the dataset; results for indexes (3), (13) and (14) calculated for reliability index (4).
Figure 10: Behavior of the Neuro-Fuzzy system fed by time-based features as a function of noise level; indexes (6), (13) and (14) calculated for reliability index (7).
Figure 11: Behavior of the Neuro-Fuzzy system fed by DWT-based features as a function of noise level; indexes (11), (13) and (14) calculated for reliability index (12).
23 pages, 8353 KiB  
Article
Edge-Computing Meshed Wireless Acoustic Sensor Network for Indoor Sound Monitoring
by Selene Caro-Via, Ester Vidaña-Vila, Gerardo José Ginovart-Panisello, Carme Martínez-Suquía, Marc Freixes and Rosa Ma Alsina-Pagès
Sensors 2022, 22(18), 7032; https://doi.org/10.3390/s22187032 - 17 Sep 2022
Cited by 2 | Viewed by 2578
Abstract
This work presents the design of a wireless acoustic sensor network (WASN) that monitors indoor spaces. The proposed network would enable the acquisition of valuable information on the behavior of the inhabitants of the space. This WASN has been conceived to work in any type of indoor environment, including houses, hospitals, universities or even libraries, where the tracking of people can give relevant insight, with a focus on ambient assisted living environments. The proposed WASN has several priorities and differences compared to the literature: (i) a low-cost, flexible sensor able to monitor wide indoor areas; (ii) a balance between acoustic quality and microphone cost; and (iii) good communication between nodes to increase connectivity coverage. A potential application of the proposed network could be the generation of a sound map of a certain location (house, university, offices, etc.) or, in the future, the acoustic detection of events, giving information about the behavior of the inhabitants of the place under study. Each node of the network comprises an omnidirectional microphone and a computation unit, which processes acoustic information locally following the edge-computing paradigm, avoiding the transmission of raw data to a cloud server mainly for privacy and connectivity reasons. Moreover, this work explores the placement of acoustic sensors in a real scenario, following acoustic coverage criteria. The proposed network aims to encourage the use of real-time non-invasive devices to obtain behavioral and environmental information, in order to make decisions in real time with minimum intrusiveness in the location under study. Full article
(This article belongs to the Special Issue Advanced Sensors/Devices for Ambient Assisted Living)
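The edge-computing idea described above, where each node reduces raw audio locally and only compact descriptors leave the device, can be sketched as follows. The per-frame equivalent level (Leq) used here is one plausible descriptor, not necessarily the one the paper computes:

```python
import numpy as np

def leq_db(frame, ref=1.0):
    """Equivalent continuous sound level (Leq, in dB) of one audio
    frame, relative to an arbitrary reference amplitude."""
    return 10 * np.log10(np.mean(frame**2) / ref**2)

def edge_summary(frames):
    """Edge-computing step: the node reduces raw audio to per-frame
    Leq scalars, so only these values (never the recording itself)
    are transmitted -- the privacy rationale described above."""
    return [round(leq_db(f), 1) for f in frames]

rng = np.random.default_rng(0)
audio = rng.normal(0, 0.1, size=(3, 16000))  # three 1-s frames at 16 kHz
print(edge_summary(audio))
```

Accumulating such summaries per node and location is what would let the network build the sound map of a house or office mentioned in the abstract.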
Show Figures

Figure 1: Set-up to measure the linearity of the two selected USB microphones in the anechoic chamber: (a) speaker, reference microphone and evaluated microphones; (b) reference microphone (center) and evaluated USB microphones surrounding it.
Figure 2: Temperature response over time of the RPi with different setups (heat sink and enclosure) during a stress test.
Figure 3: Raspberry Pi 4B (a) without and (b) with heat sink.
Figure 4: Box designs (perspective view of the assembled box, top view of the base and perspective view of the cover): (a) Large Non-holed PLA box (LNP); (b) Small Non-holed PLA box (SNP); (c) Small Slot-holed PLA box (SSP); (d) Small Honeycomb-holed PLA (SHP) or TPU (SHT) box.
Figure 5: Conceptual integration of the hardware elements of the sensor; the 3D model of the Raspberry Pi is retrieved from [41].
Figure 6: Example of WASN deployment in an indoor space.
Figure 7: Example of a logical network design with five nodes (red), one core (yellow) and one router (green).
23 pages, 22898 KiB  
Article
Ambient and Wearable Sensor Technologies for Energy Expenditure Quantification of Ageing Adults
by Alessandro Leone, Gabriele Rescio, Giovanni Diraco, Andrea Manni, Pietro Siciliano and Andrea Caroppo
Sensors 2022, 22(13), 4893; https://doi.org/10.3390/s22134893 - 29 Jun 2022
Cited by 5 | Viewed by 2694
Abstract
COVID-19 has affected daily life in unprecedented ways, with dramatic changes in mental health, sleep time and level of physical activity. These changes have been especially relevant in the elderly population, with important health-related consequences. In this work, two different sensor technologies were used to quantify the energy expenditure of ageing adults. To this end, a technological platform based on a Raspberry Pi 4, as the elaboration unit, was designed and implemented. It integrates an ambient sensor node, a wearable sensor node and a coordinator node that combines the information provided by the two sensor technologies. Ambient and wearable sensors are used for the real-time recognition of four human postures (standing, sitting, bending and lying down) and of walking activity, and for energy expenditure quantification. An important first aim of this work was to realize a platform with a high level of user acceptability: through the use of two unobtrusive sensors and a low-cost processing unit, the solution is easily accessible and usable in the domestic environment; moreover, it is versatile, since it can be used by end-users who accept being monitored by a specific sensor. Another added value of the platform is its ability to abstract from the sensing technologies: because energy expenditure is quantified from human posture and walking activity, a wide set of devices can be integrated, provided that they can reproduce the same set of features. The obtained results showed the ability of the proposed platform to automatically quantify energy expenditure, both with each sensing technology alone and with the combined version. Specifically, for posture and walking activity classification, an average accuracy of 93.8% and 93.3% was obtained with the wearable and ambient sensor, respectively, whereas an improvement of approximately 4% was reached using data fusion. Consequently, when the high-level information (postures and walking activities) was classified with the combined version of the platform, the estimated energy expenditure always had a relative error of less than 3.2% for each end-user involved in the experimentation stage, justifying the proposed overall architecture from both a hardware and a software point of view. Full article
(This article belongs to the Special Issue Advanced Sensors/Devices for Ambient Assisted Living)
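The final step the abstract describes (turning classified postures and walking activities into energy expenditure via MET lookup tables and the user's weight) follows the standard rule EE[kcal] = MET x weight[kg] x duration[h]. A sketch with illustrative MET values; the paper's actual values come from standard MET compendia and are not reproduced here:

```python
# Hypothetical MET lookup table (illustrative values only).
MET = {"lying": 1.0, "sitting": 1.3, "standing": 1.6, "bending": 2.5,
       "walking_slow": 2.8, "walking_normal": 3.5, "walking_fast": 4.3}

def energy_expenditure(intervals, weight_kg):
    """EE in kcal from classified (activity, duration_s) intervals,
    using the standard rule EE = MET * weight[kg] * duration[h]."""
    return sum(MET[act] * weight_kg * (dur_s / 3600.0)
               for act, dur_s in intervals)

# 30 min sitting, 10 min standing, 10 min normal walking for a 70-kg user
day = [("sitting", 1800), ("standing", 600), ("walking_normal", 600)]
print(round(energy_expenditure(day, weight_kg=70), 1))  # 105.0 kcal
```

Because the rule only needs an activity label and a duration, any sensor able to produce the same labels can feed it, which is the technology-abstraction argument made above.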
Show Figures

Figure 1: Schematic representation of the proposed platform for EE quantification; two sensor technologies (ambient and wearable) transmit high-level information (e.g., posture label with timestamp) to a coordinator node.
Figure 2: Intel® RealSense™ model D435i, integrating an RGB sensor, a stereo image sensor and an infrared projector for depth data collection.
Figure 3: Proposed pipeline for posture and walking activity classification using the ambient sensor: pre-processing of the acquired images, feature extraction and reduction, and a classification block returning four postures and walking activity at different speeds.
Figure 4: The RGB (a) and depth (b) frames and the 33-landmark BlazePose model (c) used to define postural features.
Figure 5: Wearable system integrating an elastic band (left) and a Shimmer3 IMU inertial device (right) equipped with a tri-axial accelerometer, magnetometer, pressure and temperature sensors and a tri-axial gyroscope.
Figure 6: Proposed pipeline for posture and walking activity classification using the wearable sensor: a calibration stage to verify that the device is worn correctly, pre-processing of the accelerometer signals, feature selection/extraction, and a classification block returning four postures and walking activity at different speeds.
Figure 7: Elaboration unit (Raspberry Pi 4 Model B) for acquiring and processing sensory data and fusing the high-level information classified by the sensor nodes (case on the left, electronic board on the right).
Figure 8: Total EE composition: physical activity (PA), the internal effects of food, and Resting Metabolic Rate (RMR).
Figure 9: Flowchart of the implemented platform's operation: after data fusion, the platform classifies four postures and walking activities at three speeds; using MET lookup tables and the end-user's weight, EE is quantified.
Figure 10: Experimental setup: the laboratory area used for experimentation (left) and a detail of the ambient sensory node (right).
Figure 11: Confusion matrices for seven classes of postures and walking activities for the ambient (a-c) and wearable (d-f) sensors.
Figure 12: Confusion matrices for seven classes of postures and walking activities for the integrated platform using RF (a), SVM (b) and KNN (c).
16 pages, 2634 KiB  
Article
Estimation of Steering and Throttle Angles of a Motorized Mobility Scooter with Inertial Measurement Units for Continuous Quantification of Driving Operation
by Jun Suzurikawa, Shunsuke Kurokawa, Haruki Sugiyama and Kazunori Hase
Sensors 2022, 22(9), 3161; https://doi.org/10.3390/s22093161 - 20 Apr 2022
Cited by 2 | Viewed by 3309
Abstract
With the growing demand from elderly persons for alternative mobility solutions, motorized mobility scooters (MMSs) have been gaining importance as an essential assistive technology for independent living in local communities. The increased use of MMSs, however, has raised safety issues during driving and magnified the necessity to evaluate and improve user driving skills. This study develops a novel quantitative monitoring method for MMS driving operation using inertial measurement units (IMUs). The proposed method uses coordinate transformations around the rotational axes of the steering wheel and the throttle lever to estimate the steering and throttle operating angles from the gravitational accelerations measured by the IMUs. Consequently, these operating angles can be monitored simply with an IMU attached to the throttle lever. Validation experiments with a test MMS in the stationary state confirmed the consistency of the proposed coordinate transformation with the MMS's geometrical structure. The driving test also demonstrated that the operating angles were estimated correctly on various terrains and that the effects of terrain inclination were compensated using an additional IMU attached to the scooter body. This method will be applicable to the quantitative monitoring of driving behavior and can act as a complementary tool to existing skill evaluation methods. Full article
(This article belongs to the Special Issue Advanced Sensors/Devices for Ambient Assisted Living)
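The core idea of recovering an operating angle from the direction of gravity in a static IMU reading reduces, in the single-axis case, to an atan2 of two acceleration components. This sketch is a deliberate simplification of the paper's full coordinate-transformation chain and dual-IMU inclination compensation:

```python
import math

def tilt_angle_deg(ax, ay, az):
    """Rotation angle about the sensor's x-axis recovered from the
    gravity direction in a static accelerometer reading (single-axis
    simplification of angle-from-gravity estimation)."""
    return math.degrees(math.atan2(ay, az))

# Device level: gravity entirely along +z -> 0 degrees.
print(round(tilt_angle_deg(0.0, 0.0, 9.81), 1))   # 0.0
# Device rolled 30 degrees about x: gravity splits between y and z.
print(round(tilt_angle_deg(0.0,
                           9.81 * math.sin(math.radians(30)),
                           9.81 * math.cos(math.radians(30))), 1))  # 30.0
```

In the paper's setting, the same principle is applied around the steering and throttle rotation axes, and a second IMU on the scooter body supplies the terrain inclination to subtract out.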
Show Figures

Figure 1: Motorized mobility scooter tested in this work: (a) whole body structure; (b) driving operation interfaces, with the movable directions of the steering wheel and throttle lever indicated by red and blue arrows, respectively.
Figure 2: Coordinates and rotational order for the estimation of the operating angles (steering and throttle angles).
Figure 3: Measurement equipment used for the evaluation experiments: (a) block diagram of the operating-angle estimation and validation; (b) measurement of the true values using wire displacement sensors and winding disks mounted on the control panel for the steering and throttle angles, with both winding-disk designs shown; (c) IMUs used to estimate the operating angles, mounted on the throttle lever (angle calculation) and the front luggage basket (compensation for body inclination and movement).
Figure 4: Evaluation conditions in the static state: combinations of five body inclinations of the scooter with two throttle and three steering statuses.
Figure 5: Test courses used for accuracy evaluation during driving: (a) straight 20-m path; (b) small curves; (c) upward (7°) and downward (5°) slopes; (d) nine-meter side slope at 7°; (e) rough surface with tactile paving.
Figure 6: Estimated versus true angles from the static evaluation: (a) steering and (b) throttle angles estimated with (i) a single IMU and (ii) dual IMUs, for all conditions in Table 1; the no-inclination conditions and the condition with the maximum MAE per panel are highlighted.
Figure 7: Typical time series of the true and estimated operating angles during test driving over the five courses: (a) steering; (b) throttle. True values (red) overlaid on single-IMU (light blue) and dual-IMU (deep blue) estimates.
Figure 8: Estimated versus true angles during test driving over the five courses: 2D histograms of recorded durations for the (a) steering and (b) throttle angles with single and dual IMUs, using 2° bins and log-scale color coding.
Figure 9: Distributions of the estimation errors during test driving: histograms of the absolute errors and cumulative plots for the (a) steering and (b) throttle angles with single (blue) and dual (red) IMUs, with 90th-percentile values shown.
21 pages, 1070 KiB  
Article
Discovering Daily Activity Patterns from Sensor Data Sequences and Activity Sequences
by Mirjam Sepesy Maučec and Gregor Donaj
Sensors 2021, 21(20), 6920; https://doi.org/10.3390/s21206920 - 19 Oct 2021
Cited by 9 | Viewed by 2866
Abstract
The necessity of caring for elderly people is increasing. Great efforts are being made to enable the elderly population to remain independent for as long as possible. Technologies are being developed to monitor the daily activities of a person to detect their state. Approaches that recognize activities from simple environment sensors have been shown to perform well. It is also important to know the habits of a resident to distinguish between common and uncommon behavior. In this paper, we propose a novel approach to discover a person’s common daily routines. The approach consists of sequence comparison and a clustering method to obtain partitions of daily routines. Such partitions are the basis to detect unusual sequences of activities in a person’s day. Two types of partitions are examined. The first partition type is based on daily activity vectors, and the second type is based on sensor data. We show that daily activity vectors are needed to obtain reasonable results. We also show that partitions obtained with generalized Hamming distance for sequence comparison are better than partitions obtained with the Levenshtein distance. Experiments are performed with two publicly available datasets. Full article
(This article belongs to the Special Issue Advanced Sensors/Devices for Ambient Assisted Living)
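The sequence-comparison step can be illustrated with a plain Hamming distance between two daily activity vectors (one label per time slot); the generalized variant used in the paper refines how mismatches are scored, but the plain form conveys the idea. The slot count and labels below are purely illustrative:

```python
def hamming(day_a, day_b):
    """Plain Hamming distance between two equal-length daily activity
    vectors: the number of time slots whose labels differ."""
    return sum(a != b for a, b in zip(day_a, day_b))

# Two days discretized into 6 time slots (illustrative labels)
d1 = ["sleep", "sleep", "breakfast", "out", "out", "dinner"]
d2 = ["sleep", "sleep", "breakfast", "tv", "out", "dinner"]
print(hamming(d1, d2))  # 1: the days differ in a single slot
```

Computing this distance for every pair of days yields exactly the kind of distance matrix shown in the figures below the abstract, which the clustering method then partitions into routine groups; the Levenshtein alternative instead allows slots to be inserted or deleted when aligning two days.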
Figure 1
<p>Flowchart of the proposed framework.</p>
Figure 2
<p>Excerpt from the preprocessed dataset. The first column denotes the day in the dataset, the following columns denote sensor values (one column per sensor), and the last columns denote activity values (one column per activity). Each line represents one data point and corresponds to one time slot. A value of 1 denotes an active sensor or a present activity.</p>
Figure 3
<p>The entropy of activities at different times of the day for the Kasteren dataset, the first resident of the CASAS 11 dataset, and the second resident of the CASAS 11 dataset. A day starts at 4 a.m. on one calendar day and ends at 4 a.m. on the next calendar day. In the CASAS 11 dataset, entropy was calculated separately for each of the two residents. It was calculated every half minute.</p>
Figure 4
<p>The conditional entropy of activities at different times of the day for the Kasteren dataset, the first resident of the CASAS 11 dataset, and the second resident of the CASAS 11 dataset.</p>
Figure 5
<p>Hamming distances between daily activity vectors for consecutive days for the Kasteren dataset, the first resident of the CASAS 11 dataset, and the second resident of the CASAS 11 dataset.</p>
Figure 6
<p>Distance matrices of the H3 metric between daily activity vectors for (<b>a</b>) the Kasteren dataset, (<b>b</b>) the first resident of the CASAS 11 dataset, and (<b>c</b>) the second resident of the CASAS 11 dataset. Values are in thousands. The background color shows gradient changes in values, with red tones indicating low values and green tones indicating high values.</p>
Figure 7
<p>Distance matrix of the Levenshtein metric between daily activity vectors for the Kasteren dataset. Values are in thousands. The background color shows gradient changes in values, with red tones indicating low values and green tones indicating high values.</p>
Figure 8
<p>Distance matrix based on sensor data with <math display="inline"><semantics> <mrow> <mi>ε</mi> <mo>=</mo> <mn>0.7</mn> </mrow> </semantics></math> for the Kasteren dataset. Values are in thousands. The background color shows gradient changes in values, with red tones indicating low values and green tones indicating high values.</p>
Figure 9
<p>Daily activity representations of the resident in the (<b>a</b>) Kasteren dataset, consecutive days; (<b>b</b>) Kasteren dataset, partitioned on daily activity vectors; (<b>c</b>) CASAS 11 dataset, first resident, consecutive days; (<b>d</b>) CASAS 11 dataset, first resident, partitioned on daily activity vectors; (<b>e</b>) CASAS 11 dataset, second resident, consecutive days; and (<b>f</b>) CASAS 11 dataset, second resident, partitioned on daily activity vectors.</p>
Figure 10
<p>Daily activity representations of the resident in the Kasteren dataset, partitioned according to the clustering results based on sensor data.</p>
15 pages, 3021 KiB  
Article
Tom Pouce III, an Electronic White Cane for Blind People: Ability to Detect Obstacles and Mobility Performances
by Aya Dernayka, Michel-Ange Amorim, Roger Leroux, Lucas Bogaert and René Farcy
Sensors 2021, 21(20), 6854; https://doi.org/10.3390/s21206854 - 15 Oct 2021
Cited by 6 | Viewed by 3279
Abstract
We present a protocol for evaluating the efficiency of an electronic white cane for improving the mobility of blind people. The electronic cane used during the test is the Tom Pouce III, which is based on LIDAR (light detection and ranging) sensors with tactile feedback. The protocol comprises two parts. The first part, the “detection test”, evaluates the efficiency of the Tom Pouce III sensors in detecting obstacles found in everyday life (thin and large poles, apertures) under different environmental conditions (darkness, sunlight, rain). The second part, the “mobility test”, compares the ability of blind participants to cross a 25 m path while avoiding obstacles, using either a simple white cane or the electronic cane. The 12 blind participants had between 2 and 20 years of everyday experience with Tom Pouce devices. The results show a significant improvement in the capacity to avoid obstacles with the electronic cane relative to the simple white cane, with no difference in walking speed. There was no correlation between the results and the users’ years of experience.
(This article belongs to the Special Issue Advanced Sensors/Devices for Ambient Assisted Living)
Figure 1
<p>Illustration of Tom Pouce III detection ranges. <b>Top panel</b>: lateral view of the protection area of the Tom Pouce. <b>Bottom panel</b>: top view of the protection area.</p>
Figure 2
<p>A descriptive illustration of the movement of the tip of the cane while walking.</p>
Figure 3
<p>A sighted, blindfolded participant sweeping the cane between the two bars at a given cadence to detect (<b>a</b>) a large dark obstacle under intermediate luminosity conditions; (<b>b</b>) a white thin post in full sunlight; (<b>c</b>) a dark thin post in full sunlight; and (<b>d</b>) an aperture.</p>
Figure 4
<p>A participant walking down a 25 m long, 2.4 m wide path; 7 cm square wooden bars were positioned on the left side of the path, and a wall was on the right side. Obstacles made of lab coats on coat racks were placed along the path.</p>
Figure 5
<p>Detection performance for apertures, with black points representing points of measurement from different distances. The red parallelepiped contains points indicating apertures detected at 0.8 rad/s; those inside the yellow parallelepiped indicate apertures detected at 0.4 rad/s; and the green parallelepiped indicates full aperture detection at 0.2 rad/s.</p>
Figure 6
<p>Schematic top view of the experimental setup, illustrating the parameters used for computing the probability of passing the apertures without collisions by chance.</p>
Figure 7
<p>Effect of using the simple white cane and Tom Pouce III on the mobility score (±SD = error bar; ±SE = square).</p>
Figure 8
<p>Effects of the simple white cane and Tom Pouce III on mean walking speed.</p>
22 pages, 4948 KiB  
Article
Self-Organizing IoT Device-Based Smart Diagnosing Assistance System for Activities of Daily Living
by Yu Jin Park, Seol Young Jung, Tae Yong Son and Soon Ju Kang
Sensors 2021, 21(3), 785; https://doi.org/10.3390/s21030785 - 25 Jan 2021
Cited by 4 | Viewed by 3244
Abstract
Activities of daily living (ADL) are a criterion for evaluating a person’s ability to perform everyday tasks by recognizing the various activity events that occur in real life. However, most of the data necessary for ADL evaluation are collected only through observation and questionnaires answered by the patient or the patient’s caregiver. Recently, Internet of Things (IoT) devices with various environmental sensors have been studied for ADL data collection and analysis. In this paper, we propose an IoT device platform for measuring ADL capability. Wearable devices and stationary devices recognize activity events in real environments and perform user identification through various sensors. The user’s ADL data are sent to a network hub for analysis. The proposed platform devices support many sensors, such as acceleration, flame, temperature, and humidity sensors, in order to recognize various activities in real life. In addition, an ADL measurement test was performed on hospital patients using the implemented platform, and the accuracy and reliability of the platform were analyzed.
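The abstract describes a reporting pattern: sensor devices recognize an activity event, identify the user, and forward the event to a network hub for analysis. A minimal sketch of that pattern (all class names, fields, and example values below are hypothetical illustrations, not the paper's actual protocol or packet format):

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class ActivityEvent:
    user_id: str      # resident identified via the wearable device
    device_id: str    # stationary or external sensor that fired
    activity: str     # e.g. "washing", "sleeping", "eating"
    timestamp: float  # seconds since epoch

class NetworkHub:
    """Collects reported events and tallies activity counts per user."""

    def __init__(self):
        self.log = []

    def report(self, event: ActivityEvent):
        # In the real platform this would arrive over the radio link;
        # here we simply append to an in-memory log.
        self.log.append(event)

    def activity_counts(self, user_id: str) -> Counter:
        return Counter(e.activity for e in self.log if e.user_id == user_id)

# Hypothetical usage: two faucet events and one bed event for one patient.
hub = NetworkHub()
hub.report(ActivityEvent("patient-01", "faucet-1", "washing", 1.0))
hub.report(ActivityEvent("patient-01", "faucet-1", "washing", 2.0))
hub.report(ActivityEvent("patient-01", "bed-3", "sleeping", 3.0))
print(hub.activity_counts("patient-01"))  # counts per activity for patient-01
```

Aggregated counts like these are the kind of per-user summary an ADL analysis would start from; the actual platform additionally handles user identification, location registration, and power-constrained radio reporting.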
(This article belongs to the Special Issue Advanced Sensors/Devices for Ambient Assisted Living)
Figure 1
<p>Activity of Daily Living (ADL) Device Layout for Pilot Test.</p>
Figure 2
<p>Hardware Block Diagram of Location Anchor Node.</p>
Figure 3
<p>Implementation of Resource Device.</p>
Figure 4
<p>Software Architecture of Stationary Resource Device.</p>
Figure 5
<p>Hardware Block Diagram of Mobile Identification Device.</p>
Figure 6
<p>Cell Management and Location Registration Protocol.</p>
Figure 7
<p>Sequence Diagram of Activity Event Report.</p>
Figure 8
<p>Low Frequency (LF) Signal Packet Format.</p>
Figure 9
<p>Sequence Diagram of Entrance Recognition System.</p>
Figure 10
<p>Sequence Diagram of Activity Event Report (External Sensor Device).</p>
Figure 11
<p>S/W Architecture of External Sensor Device.</p>
Figure 12
<p>Implementation of Mobile Identification Device (Smart Band Type).</p>
Figure 13
<p>Installation of Resource Device for Pilot Test.</p>
Figure 14
<p>Hardware Block Diagram of Stationary Resource.</p>
Figure 15
<p>External Sensor Device (Faucet).</p>
Figure 16
<p>Hardware Block Diagram of External Sensor Device.</p>
Figure 17
<p>Response Time of ADL Report Protocol.</p>
Figure 18
<p>Average Power Consumption of ADL Report Protocol (Mobile Identification Device (MID)).</p>
Figure 19
<p>Response Time of Entrance Recognition System.</p>
Figure 20
<p>Power Consumption of Entrance Recognition System.</p>