Article

Definition of Motion and Biophysical Indicators for Home-Based Rehabilitation through Serious Games

Department of Naval, Electrical, Electronics, and Telecommunications Engineering (DITEN), Università degli Studi di Genova, Via all’Opera Pia 11, Genova 16145, Italy
* Author to whom correspondence should be addressed.
Information 2018, 9(5), 105; https://doi.org/10.3390/info9050105
Submission received: 10 March 2018 / Revised: 23 April 2018 / Accepted: 26 April 2018 / Published: 1 May 2018
(This article belongs to the Special Issue Selected Papers from ICBRA 2017)
Figure 1. Summary of the Remote Monitoring Validation Engineering System (ReMoVES) architecture and list of serious games and their corresponding movement: (a) OwlNest for arm Flexion–Extension and Abduction–Adduction; (b) ShelfCans for arm Flexion–Extension and Abduction–Adduction; (c) FlappyCloud for leg Abduction–Adduction; (d) EquilibriumPaint for Sit to Stand; (e) HotAir for Balance Shift; (f) EndlessZig for hand Radial–Ulnar Deviation; (g) CityCar for Flexion–Extension; (h) WineBottle for hand Pronation–Supination; (i) FloatingTrap for Grasping.
Figure 2. Relaxation exercise: the patient inhales as the animation grows, then exhales as it shrinks.
Figure 3. (a) Abduction–Adduction exercise scheme; (b) Hand trajectory in ShelfCans game; (c) Heatmap to detect neglected areas.
Figure 4. Plot of Score–Compensation and Barycenter–Target position features in OwlNest game. Data were labelled as “Good” or “Other” by the therapist according to patients’ performance.
Figure 5. Plot of Score–Compensation and Hand trajectory–Compensation features in ShelfCans game. Data were labelled as “Good” or “Other” by the therapist according to patients’ performance.
Figure 6. Plot of Score–Compensation and Elevation–Compensation features in the FlappyCloud game. Data were labelled as “Good” or “Other” by the therapist according to patients’ performance.
Figure 7. (a) Trunk tilt on sagittal and coronal planes; (b) Shoulder tilt detection.
Figure 8. Plot of Repetition rate–Coronal shift and Repetition trend–Shoulder tilt features in the EquilibriumPaint game. Data were labelled as “Good” or “Other” by the therapist according to patients’ performance.
Figure 9. (a) Tracked joints for Centre of Pressure (CoP) estimation; (b) Gameplay during a serious game session; (c) Erroneous balance shift detection.
Figure 10. Normalised CoP oscillation acquired both with Microsoft Kinect V2 and Wii Balance Board during a comparison test. Time is on the x-axis (2-min-long trial).
Figure 11. Plot of Score–CoP variation and CoP variation–Compensation features in HotAir game. Data were labelled as “Good” or “Other” by the therapist according to patients’ performance.
Figure 12. Movements detected during hand rehabilitation with Leap Motion: (a) Radial–Ulnar deviation movement in EndlessZig game; (b) Flexion–Extension movement in CityCar game; (c) Pronation–Supination movement in WineBottle game; (d) Grasping movement in FloatingTrap game.
Figure 13. The emotional index graph: time is on the x-axis and the emotional index, in terms of emotion experienced during the execution of the rehabilitation exercise, is on the y-axis.
Figure 14. Valence–Arousal circumplex chart: the valence is on the x-axis and the arousal of emotions on the y-axis. A concentration of data in the left part of the graph can indicate that the type of exercise assigned by the therapist is too difficult.
Figure 15. Demographic characteristics of the patients.
Figure 16. Classification of the diseases affecting the patients that tested the ReMoVES platform. The patients affected by hemiparesis are a subcategory of those affected by stroke (red box).
Figure 17. Classification of available serious games and frequency of use.

Abstract

In this paper, we describe the Remote Monitoring Validation Engineering System (ReMoVES), a newly developed platform for motion rehabilitation through serious games and biophysical sensors. The main features of the system are the following: motion tracking through Microsoft Kinect V2 and Leap Motion is described and compared with other solutions, and the emotional state of the patient is evaluated through heart rate measurements and electrodermal activity monitored by Microsoft Band 2 during the execution of the functional exercises planned by the therapist. The ReMoVES platform is conceived for home-based rehabilitation after the hospitalisation period, and the system will deploy machine learning techniques to provide an automated evaluation of the patient’s performance during the training. The algorithms should deliver effective reports to the therapist about the training performance while the patient exercises on their own. The game features described in this manuscript represent the input for the training set, while the feedback provided by the therapist is the output. To address this supervised learning problem, we describe the most significant features to be used as key indicators of the patient’s performance, along with an evaluation of their accuracy in discriminating between good and bad patient actions.

1. Introduction

In recent years, the impact of telemedicine has kept growing thanks to the adoption of the concept of “continuity of care” by the health community. Operators have access to all the information they need, regardless of where they are located, and this helps to improve access to health services for patients and end users [1].
Societal changes drive the evolution of the healthcare sector, both in Europe and in the rest of the world. Some of the big issues, including the ageing population, chronic diseases, and the spread of disabilities (e.g., stroke and all degenerative diseases), have a great impact on health care strategies and care solutions. Thanks to new technological solutions, the patient is increasingly attentive to and aware of their state of health, trying to maintain the highest quality of life, which requires continuous assistance and care [2].
Digital innovation is involved in many areas of daily life. It plays a fundamental role in the field of medical rehabilitation, and more and more facilities consider it a fundamental part of their therapeutic and commercial offering. Several gaming consoles on the market are used for fitness exercising. In particular, the Nintendo Wii is used in several centres as a rehabilitation device, to help the recovery of patients in a more enjoyable way [3,4]. Nevertheless, commercial games are not designed for people with disabilities, so patients may experience demotivation when failing to achieve goals that are too difficult for them.
In this scenario, we present the Remote Monitoring Validation Engineering System (ReMoVES). We have developed a platform that addresses the problem of continuity of care in a smart and cost-effective way. It relies on serious games with which the patient interacts through Microsoft Kinect and Leap Motion [5].
Many researchers have developed similar solutions, but they focused on algorithms to detect body and hand gestures with a certain level of accuracy [6,7,8] and on the development of serious games for health [9]. In this article, which is an extended version of [10], we present a novel approach to automatically evaluating the performance of the patient during the execution of rehabilitation exercises when unsupervised by therapists. The methods we will use to process the data collected during the entire activity session are discussed in detail.
This paper is structured as follows. Section 2 introduces the system architecture and presents the characteristics of the serious games that have been developed and included in the ReMoVES platform. Section 3 explains the methods used to process the data acquired during the activity sessions carried out by patients. Section 4 presents some preliminary results with a sample of the processed data. Finally, Section 5 presents a discussion of the results.

2. Description of the System

2.1. Overall Architecture

The ReMoVES platform employs three off-the-shelf devices for motion tracking and biophysical data acquisition which are activated during the execution of functional exercises. On the back-end, a cloud architecture was designed and deployed to provide web-services and data processing.
The idea behind the proposed architecture consists of providing a personal rehabilitation program that is performed at home by the patient themselves, while the therapist can track the performance and effectiveness of the training from any Internet-connected device. Among other things, the monitoring of possible compensation movements allows evaluating whether an exercise is accomplished correctly.
In detail, the built-in algorithms aim to provide a clear and concise report to the therapist, in order to facilitate the interpretation of the evolution of the therapy. As shown in Figure 1, the following technologies are deployed by the ReMoVES platform.
  • Serious games are digital games that were completely developed in our labs exclusively for the ReMoVES platform. The system currently includes 10 main serious games and 60 variations: they encourage the patient to carry out functional exercises autonomously along with the traditional motion rehabilitation. A representative screenshot for each serious game is given in Figure 1 (from a to i) and in Figure 2. The creation of these activities involved different processes, technologies, and specialists. In the case of serious games, special attention must be paid to the specifications given by physiotherapists and physiatrists, who shared their expertise to define game parameters such as the level of difficulty. The interface was designed using the Unity3D platform, which is a popular engine often used for developing games and virtual reality applications. The games are scripted in C#. All art assets in both 2D and 3D formats were downloaded from different online sources under Creative Commons licenses [11]. The whole set of games shares a core library that was developed to allow easy and straightforward communication with the Application Programming Interfaces (APIs) exposed by the cloud back-end that will be presented later.
  • Microsoft Kinect V2 is a motion sensing input device based on a high-resolution colour camera and an infrared emitter for depth analysis that can simultaneously 3D-track up to 25 fundamental joints of the framed human body. It offers a wide field of view (70 × 60 degrees) and recognition up to 4.5 m from the device [12]. Several studies have demonstrated that spatiotemporal parameters can be validly obtained by Microsoft Kinect V2 [13,14]. The set-up of the interface between the Microsoft Kinect V2 and the Unity3D engine is effortless because the manufacturer provides a Software Development Kit (SDK) and a Unity add-on, which gives developers access to body joint positions and orientations that can be used directly in rehabilitation game development.
  • Leap Motion is explicitly targeted at hand gesture recognition and computes the position of the fingertips and the hand orientation. Its interaction zone is limited to a semi-sphere of radius 0.60 m around the device, with a theoretical accuracy of 0.01 mm according to the manufacturer (or 0.7 mm under real conditions as measured by [15]). Examples of clinical studies that have used the Leap Motion controller are [16,17]. The manufacturer provides a Unity Assets bundle to allow a straightforward interface with the development engine.
  • Microsoft Band 2 is a physiological sensor used to collect the two signals chosen as a measure of involvement during the rehabilitation program: heart rate (HR) and electrodermal activity (EDA). It also includes a wide range of additional built-in sensors: accelerometer, gyroscope, barometer, GPS, skin temperature sensor, ambient light, and ultraviolet sensor. Microsoft Band 2 offers the opportunity to collect real-time data in a continuous manner for at least 4 h without recharging [18]. Our interface with Microsoft Band 2 is implemented using the Band SDK for the Universal Windows Platform in C#.
  • Cloud back-end was developed using state-of-the-art techniques to provide scalable, secure, and efficient data processing and storage. In Figure 1 it is schematically represented as the Data Processing cloud. The architecture of this component is not the subject of this paper, but it plays a fundamental role since it exposes REpresentational State Transfer (REST) APIs for all-around management (authentication, initialization, data storage) of the activity sessions, which can take place remotely and simultaneously in different locations (rehabilitation centres, hospitals, or patients’ homes). In detail, the ReMoVES cloud back-end was developed with PHP 5 and is hosted on an Apache/2.4.6 web server running CentOS 7. The JavaScript Object Notation (JSON) standard has been adopted for encrypted (over SSL) message communication between the cloud back-end and the other components. Data are stored in a MySQL relational database running on the same machine: it is made up of tables for user management, rehabilitation session scheduling, and game log storage. The motion and biophysical values originating from the serious games and the peripherals are stored in the database at 2 Hz in an integer or float numeric data type (four bytes, single-precision), along with a timestamp to identify when each event occurred (it is generated on the machine running the serious games, not on the cloud back-end, since the latter would be affected by unpredictable delay). The raw measures from the sensors are provided with different data types and are stored without rounding, as reported in Table 1. A minimal client-side sketch of this logging flow is given after this list.
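As an illustration of the logging flow just described (samples posted from the game machine to the REST back-end as JSON over SSL), the following minimal Python sketch shows what a client-side call might look like. The endpoint path, field names, and token handling are assumptions made for this example, not the actual ReMoVES API, which is not documented in this paper.

```python
# Illustrative sketch only: the endpoint URL, field names, and token handling
# are placeholders, not the actual ReMoVES back-end API.
import json
import time
import requests

API_BASE = "https://example-removes-backend/api"  # hypothetical REST base URL

def send_game_sample(session_id: str, token: str, payload: dict) -> None:
    """Post one 2 Hz motion/biophysical sample for a running game session."""
    record = {
        "session_id": session_id,
        # Timestamp generated on the machine running the serious game,
        # not on the cloud back-end, to avoid network-induced delay.
        "timestamp": time.time(),
        "data": payload,
    }
    resp = requests.post(
        f"{API_BASE}/session/log",           # hypothetical endpoint
        data=json.dumps(record),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        timeout=5,
    )
    resp.raise_for_status()

# Example 2 Hz sample from an upper-limb game (field names are illustrative)
send_game_sample("demo-session", "demo-token",
                 {"game": "OwlNest", "score": 12,
                  "hand_x": 0.31, "hand_y": 1.12, "heart_rate": 78})
```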

2.2. Serious Games

The rehabilitation activity proposed through the ReMoVES platform is used together with the traditional rehabilitation program. The system proposes a set of exercises from which the therapist can choose the most useful ones according to the patient’s disease and disability level.
These activities are presented to the patient as serious games in the specific sequence defined by the personalised rehabilitation plan designed by the therapist.
The system currently includes 10 different exercises that can be tweaked according to the requirements of the patient: level, duration, movement pattern, speed, and other parameters can be changed, for a total of 60 game variations. They are reported in Table 2 and briefly described in the following.
  • BreathBall This activity (see Figure 2) helps the patient to relax and focus on their breathing. It is not an interactive game. The patient is led to breathe with a regular rhythm by following a “breath ball” displayed on the monitor. The BreathBall exercise makes the subject comfortable and brings back their heartbeat to a basal value before the actual rehabilitation session starts.
  • ShelfCans This serious game (Figure 1a) introduces the patient to a virtual environment resembling a kitchen. With the arm movement, the patient grabs one of the colourful drink cans appearing in the middle of the screen and drags it to the corresponding shelf. This game is appealing because it requires the user to be attentive in dropping off the drink can on the correct shelf according to its colour.
  • OwlNest The patient is encouraged to reach an on-screen target (Figure 1b) with the arm motion (Reaching Task) in order to achieve a high in-game score. Many colourful owls appear randomly at any position on the screen for a given time-frame: the user should carry them to the nest before they disappear.
  • FlappyCloud This is a functional exercise for the lower limb. The leg Abduction–Adduction movement reflects the position of a cloud object on the game screen: the patient should make it move forward without hitting obstacles (Figure 1c).
  • HotAir This is an activity to improve control of the patient’s body balance. The user controls the direction of a hot-air balloon floating in the sky with their balance shift: in-game points are collected when it is led towards the bonus targets (Figure 1d).
  • EquilibriumPaint This serious game is an interactive version of the Sit to Stand (STS) exercise, typically used in traditional rehabilitation to evaluate patient performance. The user should stand up and sit down repeatedly within a predefined amount of time (usually 30 or 60 s [19]); their trunk must remain erect: an erroneous lateral shift causes the fall of the cans of paint leaning on an unstable wooden beam (Figure 1e).
  • WineBottle This exercise mimics a real-world scenario: pouring liquids from a bottle. With the Pronation–Supination movement of the hand, the patient should control the rotation of a bottle of wine appearing on the screen. They must fill a glass over and over again to collect as many points as possible (Figure 1f).
  • EndlessZig In this activity, the patient drives a marble along a zigzag path appearing on the screen. Going out of the boundaries causes score loss; in addition, some bonus gems appear on the path. The patient controls the marble movement with Radial–Ulnar deviation (Figure 1g).
  • CityCar In this game, the patient drives a car along a randomly-generated road. The user should steer in the presence of curves and crossroads with the movement of Flexion/Extension of the wrist. Penalties are introduced when the user goes off-track (Figure 1h).
  • FloatingTrap In this serious game, the patient is led to open their hand and make a fist alternately. This exercise requires a good level of concentration: in fact, the user moves a floating raft to the left or to the right according to the finger Flexion/Extension in order to avoid some objects in the scene (Figure 1i).

3. Features Description

The core of the ReMoVES platform consists of the remote delivery of a report to the therapist, through which she/he can understand whether the home-based rehabilitation process proposed to the patient is effective and carried out correctly [10].
To achieve this goal, we here introduce a study of the most significant indicators that can be processed by machine learning algorithms for the automatic evaluation of a patient’s performance in a rehabilitation session.
In this section, we focus on the definition of appropriate indicators aiming to highlight the patient engagement and performance during the execution of the functional exercises.
To this end, therapist feedback judging the performance of each session carried out by patients was collected at rehabilitation facilities. This enabled the preparation of a labelled dataset which was used to test the significance and validity of the defined features.
In detail, the graphs in this section display data provided as examples dealing with the problem of discriminating between sessions that were correctly performed (labelled by the therapist as “Good”) and those that were inadequately executed (labelled as “Other”). Thanks to such examples, we are able to evaluate whether the selected features have discriminant power, as expected from their definition. Such a preliminary phase gives an idea of the best features, although a larger dataset must be created for a more robust statistical evaluation.
Even though more sophisticated methods can be exploited for the feature normalisation step, in the present study we found that simple feature scaling and mean normalisation were sufficient. The standardisation of each feature range has the purpose of giving the same weight to the various indicators. Mean normalisation involves subtracting the average from the values of an input variable, resulting in a new variable with zero mean. At the end, we expect values between −1 and +1.
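As a minimal sketch of the normalisation step described above (feature scaling plus mean normalisation); the function name and sample values are illustrative assumptions:

```python
import numpy as np

def normalise_feature(values) -> np.ndarray:
    """Mean normalisation with feature scaling: subtract the mean and divide
    by the value range, giving a zero-mean feature that lies within [-1, +1]."""
    values = np.asarray(values, dtype=float)
    spread = values.max() - values.min()
    if spread == 0:            # constant feature: return zeros
        return np.zeros_like(values)
    return (values - values.mean()) / spread

# e.g. raw Compensation values collected over several sessions
print(normalise_feature([0.02, 0.10, 0.05, 0.08]))
```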
In the following, the indicators specifically designed for each exercise are described along with an evaluation of their classification power. A summary is provided in Table 2.

3.1. Relaxation

To make the subject comfortable and to return their heartbeat to a basal value before the rehabilitation session, the patient is led to breathe with a regular rhythm by following a “breath ball” displayed on the monitor. The ball changes its size, simulating inspiration and expiration (Figure 2): when it grows, the patient should inhale, then exhale when it shrinks. When a patient had difficulty following the ball rhythm, we asked them to relax and not strain themselves.
During the execution of this exercise, the heart rate (HR) signal is acquired. A trend analysis is performed, and the angular coefficient $\beta_{HR}$ of the straight line that best approximates the heart rate is computed, in order to understand the general pattern of the signal. The patient can be considered ready for the execution of physical rehabilitation if $\beta_{HR}$ is equal to or less than zero. In fact, a positive coefficient indicates a state of fatigue and excitement, not suitable for the aim of the rehabilitation process.
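A minimal sketch of this readiness check, assuming HR samples logged at the platform's 2 Hz rate; the function name and sample values are illustrative:

```python
import numpy as np

def hr_trend_slope(hr_samples, fs=2.0):
    """Angular coefficient (slope) of the least-squares line fitted to the
    heart-rate signal acquired during the BreathBall exercise."""
    hr = np.asarray(hr_samples, dtype=float)
    t = np.arange(len(hr)) / fs              # time axis in seconds (2 Hz sampling)
    beta_hr, _intercept = np.polyfit(t, hr, deg=1)
    return beta_hr

hr = [82, 81, 80, 80, 79, 78, 78, 77]        # illustrative samples
beta = hr_trend_slope(hr)
print("ready for rehabilitation" if beta <= 0 else "still fatigued/excited")
```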

3.2. Upper Limb

Microsoft Kinect V2 was deployed in this analysis: the patient interacting with digital games was encouraged to reach an on-screen target with the arm motion (reaching task) in order to collect a high in-game score. The main movements at the shoulder joint (glenohumeral joint) tracked during the execution of functional exercises were: Flexion–Extension and Abduction–Adduction.
Two serious games for upper limb are available in the ReMoVES platform: ShelfCans and OwlNest.
During the execution of the rehabilitation exercise, we extract a large amount of data, but we are currently focusing on the following features for the upper limb (a short computational sketch is given after the list):
  • Score: the score collected in-game is an abstract quantity associated with the player’s performance and achievements. It acts as a quantitative indicator of success in the game, introduced to motivate patients to attain a better score than their opponents or than their own previous attempts.
    The Score feature is the ratio between the score collected by the user and the theoretical maximum achievable in the game:
    $Score = score_{user} / score_{max}$.
  • Hand trajectory: analysis of the hand trajectory is an indicator of the precision and the level of movement control that the patient can keep during the activity. We calculate the distance between the trajectory (N samples over time, $hand_i = (x_i, y_i)$, where x and y are the horizontal/vertical coordinates on the coronal plane) and the straight line passing through the two points $target = (x_t, y_t)$ (target spawn position) and $origin = (x_o, y_o)$ (Figure 3b):
    $Hand\;trajectory\;distance = \frac{1}{N}\sum_{i=1}^{N} \frac{|(y_t - y_o)\,x_i - (x_t - x_o)\,y_i + x_t y_o - y_t x_o|}{\sqrt{(y_t - y_o)^2 + (x_t - x_o)^2}}$.
  • Compensation: compensation is an undesired movement strategy adopted by the patient during the execution of the exercise. In detail, the distance d between the hand and the shoulder (Figure 3a) while performing the movement is evaluated in order to assess whether the patient reaches the targets by performing a correct Flexion–Extension or Abduction–Adduction movement or moves the whole body instead, implying that the hand-shoulder distance remains constant.
    Let $\mu$ be the average of the distances $d_i$ over the N observations in a single session; the Compensation parameter, an index of incorrect movement, is then extracted as follows:
    $Compensation = \sqrt{\frac{1}{N}\sum_{i=1}^{N} (d_i - \mu)^2}$.
  • Barycenter: this indicator deals with the detection and measurement of spatial negligence in hemiparetic patients, who ignore the targets offered to their attention in a certain part of the in-game space. Spatial negligence affects approximately 30–40% of patients with acute stroke and consists of ignoring part of the space as a result of brain damage. The hand coordinates on the coronal plane, acquired over time, are used to populate a visual heatmap from which the therapist can evaluate the neglected areas; the spatial map is presented with a colour scale ranging from dark blue, where no movements were performed, to red, where there was the greatest occurrence (Figure 3c). This indicator is provided by the horizontal offset of the barycenter of the hand position observations over the session duration. In the evaluation of this indicator, it is also important to consider that in some variations of OwlNest the targets are not equally distributed on both sides (e.g., to force a patient affected by left hemiparesis to focus their attention on the left side of the screen only).
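The short sketch below illustrates one possible computation of the Score, Hand trajectory, and Compensation indicators defined above; array shapes, function names, and sample values are assumptions made for this example.

```python
import numpy as np

def hand_trajectory_distance(hand_xy, target, origin):
    """Average distance between the tracked hand samples (coronal plane) and
    the straight line joining the target spawn point and the origin point."""
    hand_xy = np.asarray(hand_xy, dtype=float)          # shape (N, 2)
    (xt, yt), (xo, yo) = target, origin
    num = np.abs((yt - yo) * hand_xy[:, 0] - (xt - xo) * hand_xy[:, 1]
                 + xt * yo - yt * xo)
    den = np.hypot(yt - yo, xt - xo)
    return float(np.mean(num / den))

def compensation(hand_shoulder_dist):
    """Dispersion of the hand-shoulder distance over one session."""
    d = np.asarray(hand_shoulder_dist, dtype=float)
    return float(np.sqrt(np.mean((d - d.mean()) ** 2)))

def score_feature(score_user, score_max):
    """Ratio between the collected score and the theoretical maximum."""
    return score_user / score_max

# Illustrative values only
traj = [(0.10, 0.20), (0.18, 0.42), (0.27, 0.61)]
print(hand_trajectory_distance(traj, target=(0.30, 0.70), origin=(0.0, 0.0)))
print(compensation([0.52, 0.55, 0.50, 0.54]))
print(score_feature(12, 20))
```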
For each serious game, we will build a dataset that allows detecting when the patient is achieving good performances during the rehabilitation process. The data collection process takes place in a rehabilitation facility, where a therapist provides live feedback about the patient’s performance. This provides a dataset for supervised learning.
In the OwlNest game, we are evaluating these relations: Score–Compensation and Barycenter–Target positions (Figure 4). In the former case, a good performance is achieved with a high Score and low Compensation. In the latter, a good performance is achieved when there is a linear dependency between the position of the hand and the position where the target on the screen spawned. With this analysis, we can discern whether a shift from 0 in the Barycenter feature is caused by the patient’s syndrome or merely by a bias in the positions where the in-game targets (the coloured owls) are generated. For example, when most of the targets are generated in the left part of the screen, it is entirely correct to obtain a negative (left) Barycenter value, and this is not evidence of spatial negligence.
For the ShelfCans game, we are evaluating the relations: Score–Compensation and Hand trajectory–Compensation (Figure 5). The first case was already described for the OwlNest game. The latter highlights good performances when the hand follows a straight trajectory matching the shortest path as closely as possible, while the functional exercise is carried out without compensation strategies.

3.3. Lower Limb

Microsoft Kinect V2 is deployed in this kind of activity. The user must accomplish the Abduction–Adduction movement with their impaired leg while standing. The therapist can suggest that the patient hold on to a physical support, such as the back of a chair, in order to prevent falls or equilibrium instability. The elevation of the leg and additional body joint positions are tracked with the Microsoft Kinect V2 to detect the balance status of the user’s body: to accomplish the exercise correctly, the patient must keep the trunk erect and straight.
FlappyCloud is currently the only serious game included in the ReMoVES platform that is developed around a functional exercise for the lower limbs.
During the execution of the rehabilitation exercise, we extract a large amount of data, but we are currently focusing on the following features for the lower limbs (a short computational sketch is given after the list):
  • Score: same as previous activities.
  • Elevation: it is the maximum elevation of the impaired leg calculated as the angle between legs during the Abduction–Adduction movement. The Elevation feature is the ratio between the maximum angle reached by the user and the maximum reached on average by healthy subjects:
    $Elevation = elevation_{user} / elevation_{healthy}$.
  • Compensation: to accomplish the exercise correctly, the patient must keep the trunk erect and straight. In this case, the angle between the patient’s trunk and the ideal vertical attitude is considered:
    $Compensation = \angle(trunk, vertical)$.
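A minimal sketch of the lower-limb indicators, assuming 3D joint positions from the Kinect with the camera-space y-axis pointing upwards; the joint coordinates and function names are illustrative.

```python
import numpy as np

def angle_between(u, v):
    """Angle in degrees between two 3D vectors."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def leg_elevation(hip_centre, ankle_impaired, ankle_sound):
    """Per-frame angle between the two legs (vectors from the hip centre to
    each ankle); the Elevation feature takes the session maximum divided by
    a healthy reference value."""
    return angle_between(np.subtract(ankle_impaired, hip_centre),
                         np.subtract(ankle_sound, hip_centre))

def trunk_compensation(spine_base, spine_shoulder):
    """Angle between the trunk segment and the ideal vertical attitude
    (the Kinect camera-space y-axis is assumed to point upwards)."""
    return angle_between(np.subtract(spine_shoulder, spine_base), [0.0, 1.0, 0.0])

# Illustrative joint positions in metres (Kinect camera space)
print(leg_elevation([0.0, 0.9, 2.0], [0.35, 0.05, 2.0], [-0.05, 0.0, 2.0]))
print(trunk_compensation([0.0, 0.9, 2.0], [0.03, 1.45, 2.02]))
```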
For the FlappyCloud game, we are evaluating these relations: Score–Compensation and Elevation–Compensation (Figure 6). The first case was already described for other games. In the latter case, a good performance is achieved when the patient reaches a high Elevation value during the Abduction–Adduction movement while keeping their trunk erect and straight.

3.4. Sit to Stand

Microsoft Kinect V2 was deployed in this analysis: the patient must stand up and sit down repeatedly, keeping the trunk as straight as possible. The serious game developed around the STS functional exercise is EquilibriumPaint.
Among the possible parameters we extracted, the following features were taken into account for Sit to Stand evaluation (a short computational sketch is given after the list):
  • Repetition rate: number of repetitions in a given time; it is proportional to the in-game score. The Repetition rate is the ratio between the rate maintained by the user and the average kept by healthy subjects:
    $Repetition\;rate = rate_{user} / rate_{healthy}$.
  • Repetition trend: the trend of the time required by the user to accomplish a single STS cycle. The analysis of the variability during a single session is, for example, important to detect the fatigue level of the patient. Given the repetition number i and the cycle duration $t_i$, the Repetition trend is the slope (m) of the linear regression of $t_i$ with respect to i:
    $(m, q) = LinearRegression(i, t_i), \quad Repetition\;trend = m$.
  • Coronal shift: verification of the correct strategy used by the patient to stand up, considering the trunk tilt on the coronal plane (Figure 7a). Even though the movement trajectory projected onto the sagittal plane must show some forward/backward displacement, the correct coronal trajectory must be linear and constant around the zero value.
  • Shoulder tilt: to accomplish the exercise correctly, the patient must stand up and sit down without twisting or tilting the shoulders. A correct movement provides very small angles of inclination with a tolerance of about 10 degrees, and therefore the average value must be around 0 degrees (Figure 7b).
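The following sketch shows one possible computation of the Repetition rate and Repetition trend features; the healthy reference rate and the sample cycle durations are illustrative assumptions.

```python
import numpy as np

def sts_features(cycle_durations, session_seconds, rate_healthy):
    """Repetition rate and Repetition trend for one Sit-to-Stand session.
    cycle_durations holds the duration t_i (in seconds) of each STS cycle;
    rate_healthy is an assumed healthy reference (repetitions per minute)."""
    t = np.asarray(cycle_durations, dtype=float)
    i = np.arange(1, len(t) + 1)
    rate_user = len(t) / (session_seconds / 60.0)   # repetitions per minute
    repetition_rate = rate_user / rate_healthy
    m, q = np.polyfit(i, t, deg=1)                  # linear regression of t_i against i
    return repetition_rate, m                       # Repetition trend is the slope m

# Illustrative 30 s session whose cycles progressively slow down
print(sts_features([2.1, 2.2, 2.4, 2.6, 2.9], session_seconds=30, rate_healthy=20))
```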
For the EquilibriumPaint game, we are evaluating these relations: Repetition rate–Coronal shift and Repetition trend–Shoulder tilt (Figure 8). In the former case, a good performance is achieved when the patient maintains a high Repetition rate (and thus a high in-game score) while keeping the Coronal shift around zero during the STS exercise. In the latter, a good performance is achieved when the Repetition trend remains constant (or even increases) during the whole session, while the average value of the Shoulder tilt remains around 0 degrees.

3.5. Balance Shift

The Nintendo Wii Balance Board (WBB) has already been validated as a suitable alternative to a laboratory-grade force platform for measuring the Centre of Pressure (CoP) in contexts where a clinical assessment of balance is required without an elevated precision level [20,21]. The ReMoVES platform does not employ the WBB despite its adequate performance in the serious games context: according to the preliminary studies we have carried out, the Microsoft Kinect V2 can be exploited to estimate the oscillation of the CoP on the mediolateral axis, so the WBB would be redundant and uneconomical. The following procedure describes the method for CoP estimation using the Microsoft Kinect V2 (a computational sketch is given after the procedure):
  • Track the position in space of these fundamental body joints provided by the Kinect SDK: right ankle ($A_r$), left ankle ($A_l$), the base of the spine ($S_{bottom}$), and the spine at shoulder height ($S_{top}$) (Figure 9a).
  • Compute the position in space of the middle point (M) between the right ankle and the left ankle.
  • Compute the difference between the angles related to the shift of the lower and upper sections of the trunk:
    $Tilt_{bottom} = \angle(A_r, M, S_{bottom}) - \angle(A_l, M, S_{bottom}), \quad Tilt_{top} = \angle(A_r, M, S_{top}) - \angle(A_l, M, S_{top})$.
  • Calculate the average of $Tilt_{bottom}$ and $Tilt_{top}$ to get the estimated CoP:
    $CoP = \frac{Tilt_{bottom} + Tilt_{top}}{2}$.
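A minimal sketch of this CoP estimation procedure, assuming 3D joint positions in Kinect camera space; the example joint coordinates and function names are illustrative.

```python
import numpy as np

def _angle(a, vertex, b):
    """Angle (degrees) at `vertex` between the segments vertex->a and vertex->b."""
    u = np.asarray(a, float) - np.asarray(vertex, float)
    v = np.asarray(b, float) - np.asarray(vertex, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def estimate_cop(ankle_r, ankle_l, spine_base, spine_shoulder):
    """Estimate the mediolateral CoP oscillation from four Kinect joints,
    following the procedure above (all joints are 3D camera-space positions)."""
    mid = (np.asarray(ankle_r, float) + np.asarray(ankle_l, float)) / 2.0
    tilt_bottom = _angle(ankle_r, mid, spine_base) - _angle(ankle_l, mid, spine_base)
    tilt_top = _angle(ankle_r, mid, spine_shoulder) - _angle(ankle_l, mid, spine_shoulder)
    return (tilt_bottom + tilt_top) / 2.0

# Illustrative frame: patient leaning slightly to one side
print(estimate_cop([0.15, 0.0, 2.0], [-0.15, 0.0, 2.0],
                   [0.03, 0.45, 2.0], [0.06, 0.95, 2.0]))
```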
We used the following experimental setup to evaluate the performance of the proposed algorithm: three subjects repeated a 2-min-long serious game for postural assessment ten times. Data were synchronously recorded from the WBB and the Microsoft Kinect V2, and the CoP oscillations on the mediolateral axis were normalised and compared by calculating the Pearson correlation coefficient. The average of the coefficients calculated over all test sessions was ∼0.89. Figure 10 shows a sample of synchronised and normalised data in a 2-min-long test session.
The serious game focused around the balance shift functional exercise is HotAir. During the execution of the rehabilitation exercise, we extract a large amount of data, but we are currently focusing on these features for balance shift:
  • Score: same as for upper limb games.
  • Centre of pressure variation: during the Balance shift functional exercise, the patient must load weight on the two legs alternately, and thus the variation of CoP on the mediolateral axis is calculated.
    $CoP\;variation = \sqrt{\frac{1}{N}\sum_{i=1}^{N} (CoP_i - \mu)^2}$.
  • Compensation: to accomplish the exercise correctly, the patient must alternate the load on the lower left and right limbs without compensating the movement by shifting the upper part of the body. This erroneous strategy can be detected by computing the difference between $Tilt_{bottom}$ and $Tilt_{top}$ (Figure 9c):
    $Compensation = Tilt_{bottom} - Tilt_{top}$.
For the HotAir game, we are evaluating these relations: Score–CoP variation and CoP variation–Compensation (Figure 11). In the first case, a good performance is achieved when the patient collects a high in-game Score while their effort in shifting the load between left and right limbs is high. In the second analysis, the correct performance is achieved with high CoP variation and low Compensation.

3.6. Hand Movements

Leap Motion was deployed in this analysis: the digital games encourage the patient to control in-game objects with hand, wrist, and finger motion. The main movements tracked during the execution of functional exercises are: Radial–Ulnar deviation (Figure 12a), Flexion–Extension (Figure 12b), Pronation–Supination (Figure 12c), and Grasping (Figure 12d).
From the motion data, we extract these fundamental features (a brief computational sketch is given after the list):
  • Repetition rate: same as previous activities.
  • Maximum angle: maximum angle reached in each of the proposed movements (corresponding to yaw, pitch, and roll). These values are tracked in the EndlessZig, CityCar, and WineBottle games, respectively.
  • Grab strength: evaluation of the grasping exercise with the Grab strength function provided by the Leap Motion SDK, which returns values between 0 (open hand) and 1 (hand closed in a grabbing pose). This feature is currently tracked in the FloatingTrap serious game.
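A brief sketch of these hand features, operating on already-logged yaw/pitch/roll and grab-strength samples rather than on live Leap Motion SDK calls; the open/closed thresholds are assumptions made for this example.

```python
import numpy as np

def max_angle(angle_samples):
    """Maximum angle (degrees) reached during the session for the tracked
    movement (yaw, pitch, or roll, depending on the game)."""
    return float(np.max(np.abs(angle_samples)))

def grasp_repetitions(grab_strength, closed_thr=0.9, open_thr=0.1):
    """Count open-hand -> fist transitions from the logged grab-strength
    stream (values in [0, 1]); the thresholds are illustrative assumptions."""
    count, hand_open = 0, True
    for g in grab_strength:
        if hand_open and g >= closed_thr:
            count, hand_open = count + 1, False
        elif not hand_open and g <= open_thr:
            hand_open = True
    return count

print(max_angle([-5.0, 12.3, 30.1, 22.8]))
print(grasp_repetitions([0.05, 0.4, 0.95, 0.92, 0.08, 0.5, 0.97]))
```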

3.7. Heart Rate and Electrodermal Activity

During daily rehabilitation therapy, it is very important to monitor the patient’s emotional involvement in order to inform the therapist about the proper performance of the assigned plan of care [22]. For this reason, heart rate (HR) and electrodermal activity (EDA) were recorded and analysed to monitor major changes during the different phases of a rehabilitation session. These data were compared with those from motion sensors to monitor the progress of therapy and the patient’s involvement in the rehabilitation session.
The HR is an informative biological parameter that is easy to record and therefore easy to introduce in the home setting. In other words, a simple bracelet sensor is sufficient to understand the patient’s condition during the rehabilitation program. By analysing the heartbeat signal, it is also possible to detect whether the patient is too fatigued because the activity is excessively difficult or the duration established by the therapist is inappropriate. In the literature, several works indicate that HR and heart rate variability (HRV) may be useful for measuring the valence of emotions, such as joy, sadness, and fear [23,24,25].
Both HR and EDA were chosen as biophysical signals to be correlated with activity performance in order to detect a particular state of stress during the rehabilitation process.
The EDA is defined as a change in the electrical properties of the skin [26]. The EDA signal is composed of both tonic and phasic components. The slowly varying base signal is the tonic EDA part, also called the skin conductance level (SCL). The faster-changing part (phasic activity) is related to external stimuli or non-specific activation, and it is called the skin conductance response (SCR). Recently, EDA has been used as a biofeedback mechanism to teach meditation or emotional control and to treat anxiety disorder [27]. The EDA signal has also been widely used as an indicator of the current state, related in particular to changes in the arousal level. Many studies report that the magnitude of electrodermal change and the intensity of emotional experience are almost linearly associated along the arousal dimension. For example, in [28], the authors used EDA and HR to evaluate an immersive virtual reality environment, finding significant correlations between the two signals and the environment.
In this work, by fixing an observation window W, corresponding to each exercise duration, standard statistical parameters, such as mean, variance, maximum and minimum value, and the Pearson correlation coefficient between HR and EDA, were computed.
For the EDA signal, in order to obtain the phasic data by removing the tonic level of the signal, which is unrelated to arousal, a median filtering process was applied. For each sample, the median EDA value of the surrounding samples in a ±4 s interval centred on the current sample was computed and subtracted from it. After obtaining the phasic data, a peak detection algorithm was applied in order to find skin conductance responses to external stimuli that can influence patients during the execution of the rehabilitation activity. A value of 0.01 μS was chosen as a significant threshold to discriminate skin conductance variations. The same processing method was applied to the HR signal.
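A minimal sketch of this phasic extraction and peak detection, assuming the 2 Hz sampling rate and the 0.01 μS threshold reported above; the use of SciPy's median filter and peak finder is an implementation choice for this example, not necessarily the one adopted in ReMoVES.

```python
import numpy as np
from scipy.ndimage import median_filter
from scipy.signal import find_peaks

def eda_phasic_peaks(eda, fs=2.0, window_s=4.0, threshold=0.01):
    """Remove the tonic level with a running median over a +/- window_s
    interval, then detect skin conductance responses above `threshold` (uS)."""
    eda = np.asarray(eda, dtype=float)
    win = int(2 * window_s * fs) + 1            # +/- 4 s at 2 Hz -> 17 samples
    tonic = median_filter(eda, size=win, mode="nearest")
    phasic = eda - tonic
    peaks, _ = find_peaks(phasic, height=threshold)
    return phasic, peaks

# Synthetic 2-min trace with one skin conductance response around sample 102
eda = np.full(240, 2.0)
eda[100:105] += np.array([0.01, 0.04, 0.06, 0.04, 0.01])
print(eda_phasic_peaks(eda)[1])
```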
To evaluate the general pattern of the two signals, a trend analysis was computed and evaluated by applying the least-squares method. After applying it, the slopes of the straight lines, $\beta_{HR}$ and $\beta_{EDA}$, were calculated.
As explained above, HR and EDA are, respectively, indices of valence and arousal. Once the two signals were normalised between −1 and 1, the β variable and the θ angle were used to define the emotional index ($I_e$) as follows:
$HR_{norm} = \{hr_1, hr_2, \ldots, hr_n\}, \quad EDA_{norm} = \{eda_1, eda_2, \ldots, eda_n\}, \quad n = |HR| = |EDA|$,
$\theta_i = \arctan(hr_i, eda_i), \quad i = 1, \ldots, n$,
$\beta_i = |\theta_i| + \pi/2$ if $\theta_i < 0$; $\beta_i = \pi/2 - \theta_i$ if $0 \le \theta_i \le \pi/2$; $\beta_i = \frac{5}{2}\pi - \theta_i$ if $\theta_i > \pi/2$,
$ie_i = 1 - \beta_i/\pi, \quad I_e = \{ie_1, ie_2, \ldots, ie_n\}$.
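A direct transcription of the emotional index definition above into a short sketch; the arctangent argument order follows the formula as printed, and the sample values are illustrative.

```python
import numpy as np

def emotional_index(hr_norm, eda_norm):
    """Emotional index I_e computed from the normalised HR (valence) and EDA
    (arousal) signals, following the piecewise definition given above."""
    hr = np.asarray(hr_norm, dtype=float)
    eda = np.asarray(eda_norm, dtype=float)
    theta = np.arctan2(hr, eda)            # argument order as in the formula
    beta = np.where(theta < 0, np.abs(theta) + np.pi / 2,
           np.where(theta <= np.pi / 2, np.pi / 2 - theta,
                    5 * np.pi / 2 - theta))
    return 1.0 - beta / np.pi              # one value per second of acquisition

print(emotional_index([0.2, -0.4, 0.7], [0.1, 0.5, -0.3]))
```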
The emotional index was represented as a graphical parameter (Figure 13) by evaluating how strongly the emotion was experienced in a given second relative to all the emotions perceived throughout the whole rehabilitation activity.
The two normalised signals $HR_{norm}$ and $EDA_{norm}$ were also represented in the Valence–Arousal circumplex chart. The valence (HR variation) is represented on the x-axis and the arousal (EDA variation) on the y-axis (Figure 14). Emotions can in fact be mapped out by classifying the arousal (from high to low) and valence (from pleasure to displeasure) experienced during a particular task. In the right part of the graph, positive emotions are related to a patient who is experiencing happiness, serenity, calmness, and relaxation. On the contrary, the left part contains negative emotions such as sadness, depression, fatigue, and tension. The aim of this part of the study is to detect and classify the emotions felt by the patient during the rehabilitation session, in order to personalise the plan of care and, for example, to simplify an exercise if the patient feels fatigued from not achieving the goal. For each second of the acquisition, a point that represents the perceived emotion is plotted. The dots are printed with a colour that varies from dark blue to yellow in order to give a clear indication of the start and end points of the recording, and therefore of the trend over time. The classification analysis of the emotions related to the rehabilitation exercises will be the subject of future studies, once a larger dataset for the training phase has been collected.

4. Results

The ReMoVES system is in the test phase at the Complex Recovery and Rehabilitation Facility of the Colletta Hospital in Arenzano (Genoa, Italy) and at the Rehabilitation Centre of Don Carlo Gnocchi Foundation in Fivizzano (Massa-Carrara, Italy).
Up to this point, research efforts have been focused on the definition of the indicators aiming to highlight the patient’s performance during the execution of the functional exercises. This process requires mutual collaboration between the ReMoVES platform developers and therapists. Moreover, the feedback given by the patients plays a crucial role in this testing phase: the system was tried by 41 participants (19 male, 22 female) aged between 30 and 86 (Figure 15) who were hospitalised due to various causes (stroke, fractures, prostheses, and Parkinson’s disease) (Figure 16); informed consent was collected from the involved subjects, and data were recorded anonymously.
Our test group was roughly made up of half post-stroke patients (11 hemiparetic and 8 with general diseases caused by stroke) and half orthopaedic patients, except for one participant affected by Parkinson’s disease and a young tetraplegic man (whose upper limb was only slightly involved). The therapist integrated the patients’ traditional rehabilitation program with a set of additional exercises of their choice from among the ones provided by the ReMoVES platform. In Figure 17, an overview of the overall number of serious game sessions accomplished is provided. In accordance with Section 3.1, the most frequent exercise was the “BreathBall”, since it is usually carried out before the actual rehabilitation session to induce relaxation.
In this paper, we are providing preliminary results. We want to offer an overall glimpse of the method we have defined to introduce machine learning systems for the automatic evaluation of patient activity performance. At the moment, we have performed the initial steps required by the application of supervised learning methods:
  • Determine the training examples. We have decided which features will be used as the training set for each of our serious games.
  • Gather a training set. A set of input objects is gathered, and the corresponding outputs are gathered according to the care providers. In detail, the “labelled” information consists of data flagged by the therapist as “Good” when the patient’s performance is valid, and “Other” when the patient carried out the exercise erroneously or inadequately.
Unfortunately, despite the considerable number of sessions carried out by patients, the amount of data actually available for analysis is limited. In fact, during this initial testing phase, the focus of the operators (technicians and therapists) was on the stability of the system, the user experience, and the definition of the indicators of each individual game, rather than on the labelling phase regarding the evaluation of the patient’s performance.
Most of the game sessions in our database are unlabelled and nearly useless for supervised analysis. Figure 18 shows the results of the classification process achieved with the implementation of a logistic regression model to predict whether a patient performed well during the rehabilitation activity through serious games.
The results are not conclusive because of the reduced dataset (we have nearly 20 labelled sessions for each game), but they highlight that it is possible to separate the “Good” sessions from those labelled as “Other”, showing that the choice of the selected feature pairs was effective. We will revisit this analysis when more data become available.
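For illustration, the sketch below shows how such a logistic regression classifier can be trained and cross-validated on a small set of labelled feature pairs; the data are synthetic placeholders, not results from the ReMoVES dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic data: one row per labelled session, two normalised features
# (e.g. Score and Compensation); 1 = "Good", 0 = "Other".
X = np.array([[0.8, -0.3], [0.7, -0.4], [0.9, -0.2], [0.2, 0.5],
              [0.1, 0.6], [0.3, 0.4], [0.6, -0.1], [0.25, 0.45]])
y = np.array([1, 1, 1, 0, 0, 0, 1, 0])

clf = LogisticRegression()
# With ~20 labelled sessions per game, cross-validation gives a rough estimate
# of how well the selected feature pair separates "Good" from "Other".
scores = cross_val_score(clf, X, y, cv=4)
print("mean accuracy:", scores.mean())
clf.fit(X, y)
print("predicted label for a new session:", clf.predict([[0.75, -0.25]]))
```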

5. Discussion

As reported in the paper, the system had a good impact on the rehabilitation procedures of the individuals involved in the study. Patients and therapists were deeply involved in the assessment and improvement of the overall structure of the system, with particular attention paid to usability and ease of use of the platform.
The general aim was to determine which parameters are useful to the therapist in order to evaluate the patients’ performances. Thanks to the professional involvement of the medical and therapeutic personnel, it was possible to set up a rough dataset that will be used by supervised learning algorithms.
Further steps planned are related to an improvement of usability and an extension of use to a larger and more diversified number of patients, in order to make the dataset larger and more solid. To improve the session labelling process, a companion web application for the therapist is under development: this tool is accessible from any device, including a smartphone, and displays a minimal user interface with just two buttons corresponding to the “Good” and “Other” labels. The therapist will no longer have to note down their opinion about the patient’s performance manually, but will label the game session in real time through the companion web application while the patient is performing the rehabilitation activity. This solution will also bring a remarkable increase in data collection, since the therapist will be able to provide multiple feedback points over time (for example, every 20 s) instead of just one overall final opinion. More granularity will also deliver more precise data for classification, since the patient’s performance can differ meaningfully throughout a single game session, and a single label for the entire session could be misleading under certain conditions.
Despite the limited data collected so far, we are confident that our work will bring innovation to rehabilitation processes. For this reason, the team is starting to move the system to a domestic, home-based environment, simplifying and lightening the general rehabilitation process in order to provide a better service to both patients and therapists and, in the near future, to expand the possibility of improving real home-based care.

Author Contributions

Matteo Morando developed the ReMoVES platform (including all the serious games) and conceived the machine learning approach to the problem; Serena Ponte contributed the definition of all the serious game indicators. All the authors participated in writing the manuscript and revised the final version. All authors read and approved the final manuscript.

Funding

This research received no external funding.

Acknowledgments

The authors are grateful to Dott.ssa M. Simonini and all medical and physiotherapy staff of Colletta Hospital for supporting the experimental procedures and recruitment of patients. The authors are also grateful to Dott.ssa F. Cecchi and all medical and physiotherapist staff of Don Carlo Gnocchi Foundation-Fivizzano Rehabilitation Centre for supporting the experimental procedures and recruitment of patients. The authors are finally grateful to Prof. Giovanni Regesta for his support.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Maule, J.; Chestnutt, L. Telemedicine in the 21st Century, Opportunities for Citizens, Society and Industry. In Proceedings of an International Space University Workshop, Strasbourg, France, 4–5 November 1999. [Google Scholar]
  2. Ferrara, E.; Nardotto, S.; Ponte, S.; Dellepiane, S.G. Infrastructure for data management and user centered rehabilitation in Rehab@Home project. In Proceedings of the 7th International Conference on PErvasive Technologies Related to Assistive Environments, Island of Rhodes, Greece, 27–30 May 2014; p. 21. [Google Scholar]
  3. Deutsch, J.E.; Brettler, A.; Smith, C.; Welsh, J.; John, R.; Guarrera-Bowlby, P.; Kafri, M. Nintendo Wii Sports and Wii Fit Game Analysis, Validation, and Application to Stroke Rehabilitation. Top. Stroke Rehabil. 2011, 18, 701–719. [Google Scholar] [CrossRef] [PubMed]
  4. Mouawad, M.R.; Doust, C.G.; Max, M.D.; McNulty, P.A. Wii-based movement therapy to promote improved upper extremity function post-stroke: A pilot study. J. Rehabil. Med. 2011, 43, 527–533. [Google Scholar] [PubMed]
  5. Ponte, S.; Gabrielli, S.; Jonsdottir, J.; Morando, M.; Dellepiane, S. Monitoring game-based motor rehabilitation of patients at home for better plans of care and quality of life. In Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 3941–3944. [Google Scholar]
  6. Vamsikrishna, K.; Dogra, D.P.; Desarkar, M.S. Computer-vision-assisted palm rehabilitation with supervised learning. IEEE Trans. Biomed. Eng. 2016, 63, 991–1001. [Google Scholar] [CrossRef] [PubMed]
  7. Cary, F.; Postolache, O.; Girao, P.S. Kinect based system and artificial neural networks classifiers for physiotherapy assessment. In Proceedings of the 2014 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Lisboa, Portugal, 11–12 June 2014; pp. 1–6. [Google Scholar]
  8. Bragaglia, S.; Di Monte, S.; Mello, P. A distributed system using MS kinect and event calculus for adaptive physiotherapist rehabilitation. In Proceedings of the Eighth International Conference on Complex, Intelligent and Software Intensive Systems (CISIS), Birmingham, UK, 2–4 July 2014; pp. 531–538. [Google Scholar]
  9. Wattanasoontorn, V.; Boada, I.; García, R.; Sbert, M. Serious games for health. Entertain. Comput. 2013, 4, 231–247. [Google Scholar] [CrossRef]
  10. Morando, M.; Ponte, S.; Ferrara, E.; Dellepiane, S. Biophysical and Motion Features Extraction for an Effective Home-Based Rehabilitation. In Proceedings of the International Conference on Bioinformatics Research and Applications ICBRA 2017, Barcelona, Spain, 8–10 December 2017; pp. 79–85. [Google Scholar]
  11. Pixabay Home Page. Available online: https://pixabay.com (accessed on 20 April 2018).
  12. Lachat, E.; Macher, H.; Landes, T.; Grussenmeyer, P. Assessment and calibration of a RGB-D camera (kinect v2 sensor) towards a potential use for close-range 3D modeling. Remote Sens. 2015, 7, 13070–13097. [Google Scholar] [CrossRef]
  13. Geerse, D.J.; Coolen, B.H.; Roerdink, M. Kinematic validation of a multi-Kinect v2 instrumented 10-m walkway for quantitative gait assessments. PLoS ONE 2015, 10, e0139913. [Google Scholar] [CrossRef] [PubMed]
  14. Pöhlmann, S.T.; Harkness, E.F.; Taylor, C.J.; Astley, S.M. Evaluation of Kinect 3d sensor for healthcare imaging. J. Med. Biol. Eng. 2016, 36, 857–870. [Google Scholar] [CrossRef] [PubMed]
  15. Weichert, F.; Bachmann, D.; Rudak, B.; Fisseler, D. Analysis of the accuracy and robustness of the leap motion controller. Sensors 2013, 13, 6380–6393. [Google Scholar] [CrossRef] [PubMed]
  16. Khademi, M.; Mousavi Hondori, H.; McKenzie, A.; Dodakian, L.; Lopes, C.V.; Cramer, S.C. Free-hand interaction with leap motion controller for stroke rehabilitation. In Proceedings of the CHI’14 Extended Abstracts on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; pp. 1663–1668. [Google Scholar]
  17. Bassily, D.; Georgoulas, C.; Guettler, J.; Linner, T.; Bock, T. Intuitive and adaptive robotic arm manipulation using the leap motion controller. In Proceedings of the ISR/robotik 2014, 41st International Symposium on Robotics, Munich, Germany, 2–3 June 2014; pp. 1–7. [Google Scholar]
  18. Al-Naffakh, N.; Clarke, N.; Dowland, P.; Li, F. Activity Recognition using wearable computing. In Proceedings of the 2016 11th International Conference on Internet Technology and Secured Transactions (ICITST), Barcelona, Spain, 5–7 December 2016; pp. 189–195. [Google Scholar]
  19. Britton, E.; Harris, N.; Turton, A. An exploratory randomized controlled trial of assisted practice for improving sit-to-stand in stroke patients in the hospital setting. Clin. Rehabil. 2008, 22, 458–468. [Google Scholar] [CrossRef] [PubMed]
  20. Clark, R.A.; Bryant, A.L.; Pua, Y.; McCrory, P.; Bennell, K.; Hunt, M. Validity and reliability of the Nintendo Wii Balance Board for assessment of standing balance. Gait Posture 2010, 31, 307–310. [Google Scholar] [CrossRef] [PubMed]
  21. Bartlett, H.L.; Ting, L.H.; Bingham, J.T. Accuracy of force and center of pressure measures of the Wii Balance Board. Gait Posture 2014, 39, 224–228. [Google Scholar] [CrossRef] [PubMed]
  22. Ponte, S.; Ferrara, E.; Dellepiane, S. Home-based system for rehabilitation: improving quality of life through engineering solutions. In Proceedings of the Engineering 4 Society, Raising Awareness for the Societal and Environmental Role of Engineering and (Re)training Engineers for Participatory Design, Leuven, Belgium, 18–19 June 2015; pp. 126–130. [Google Scholar]
  23. Hamdi, H.; Richard, P.; Suteau, A.; Allain, P. Emotion assessment for affective computing based on physiological responses. In Proceedings of the 2012 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), Brisbane, Australia, 10–15 June 2012; pp. 1–8. [Google Scholar]
  24. Gu, Y.; Wong, K.J.; Tan, S.L. Analysis of physiological responses from multiple subjects for emotion recognition. In Proceedings of the 2012 IEEE 14th International Conference on e-Health Networking, Applications and Services (Healthcom), Beijing, China, 10–13 October 2012; pp. 178–183. [Google Scholar]
  25. Fleureau, J.; Guillotel, P.; Huynh-Thu, Q. Physiological-based affect event detector for entertainment video applications. IEEE Trans. Affect. Comput. 2012, 3, 379–385. [Google Scholar] [CrossRef]
  26. Boucsein, W. Electrodermal Activity; Springer: Boston, MA, USA, 2012. [Google Scholar]
  27. Moore, M.M.; Dua, U. A Galvanic Skin Response Interface for People with Severe Motor Disabilities. SIGACCESS Access. Comput. 2003, 48–54. [Google Scholar] [CrossRef]
  28. Egan, D.; Brennan, S.; Barrett, J.; Qiao, Y.; Timmerer, C.; Murray, N. An evaluation of Heart Rate and Electrodermal Activity as an objective QoE evaluation method for immersive virtual reality environments. In Proceedings of the 2016 Eighth International Conference on Quality of Multimedia Experience (QoMEX), Lisbon, Portugal, 6–8 June 2016; pp. 1–6. [Google Scholar]
Figure 1. Summary of the Remote Monitoring Validation Engineering System (ReMoVES) architecture and list of serious games with their corresponding movements: (a) OwlNest for arm Flexion–Extension and Abduction–Adduction; (b) ShelfCans for arm Flexion–Extension and Abduction–Adduction; (c) FlappyCloud for leg Abduction–Adduction; (d) EquilibriumPaint for Sit to Stand; (e) HotAir for Balance Shift; (f) EndlessZig for hand Radial–Ulnar Deviation; (g) CityCar for hand Flexion–Extension; (h) WineBottle for hand Pronation–Supination; (i) FloatingTrap for Grasping.
Figure 2. Relaxation exercise: the patient inhales as the animation grows, then exhales as it shrinks.
Figure 3. (a) Abduction–Adduction exercise scheme; (b) Hand trajectory in the ShelfCans game; (c) Heatmap to detect neglected areas.
Figure 4. Plot of the Score–Compensation and Barycenter–Target position features in the OwlNest game. Data were labelled as “Good” or “Other” by the therapist according to the patients’ performance.
Figure 5. Plot of the Score–Compensation and Hand trajectory–Compensation features in the ShelfCans game. Data were labelled as “Good” or “Other” by the therapist according to the patients’ performance.
Figure 6. Plot of the Score–Compensation and Elevation–Compensation features in the FlappyCloud game. Data were labelled as “Good” or “Other” by the therapist according to the patients’ performance.
Figure 7. (a) Trunk tilt on the sagittal and coronal planes; (b) Shoulder tilt detection.
Figure 8. Plot of the Repetition rate–Coronal shift and Repetition trend–Shoulder tilt features in the EquilibriumPaint game. Data were labelled as “Good” or “Other” by the therapist according to the patients’ performance.
Figure 9. (a) Tracked joints for Centre of Pressure (CoP) estimation; (b) Gameplay during a serious game session; (c) Erroneous balance shift detection.
Figure 10. Normalised CoP oscillation acquired with both the Microsoft Kinect V2 and the Wii Balance Board during a comparison test. Time is on the x-axis (2-min-long trial).
Figure 11. Plot of the Score–CoP variation and CoP variation–Compensation features in the HotAir game. Data were labelled as “Good” or “Other” by the therapist according to the patients’ performance.
Figure 12. Movements detected during hand rehabilitation with the Leap Motion: (a) Radial–Ulnar Deviation movement in the EndlessZig game; (b) Flexion–Extension movement in the CityCar game; (c) Pronation–Supination movement in the WineBottle game; (d) Grasping movement in the FloatingTrap game.
Figure 13. The emotional index graph: time is on the x-axis and the emotional index, representing the emotion experienced during the rehabilitation exercise, is on the y-axis.
Figure 14. Valence–Arousal circumplex chart: valence is on the x-axis and arousal is on the y-axis. A concentration of data in the left part of the chart can indicate that the exercise assigned by the therapist is too difficult.
Figure 15. Demographic characteristics of the patients.
Figure 16. Classification of the diseases affecting the patients who tested the ReMoVES platform. The patients affected by hemiparesis are a subcategory of those affected by stroke (red box).
Figure 17. Classification of the available serious games and their frequency of use.
Figure 18.
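Figure 9a and Figure 10 refer to a CoP estimate obtained from the tracked Kinect joints rather than from a force plate. The fragment below is a minimal sketch of one way such an estimate can be formed: a mass-weighted average of joint positions projected onto the ground plane, with the resulting time series centred on its mean as in the Figure 10 comparison. The joint set, the segment weights, and the function names are illustrative assumptions, not the exact ReMoVES procedure.

```python
# Illustrative sketch (not the ReMoVES implementation): estimate the Centre of
# Pressure (CoP) from Kinect V2 joint positions as a mass-weighted barycentre
# projected onto the ground plane (X, Z coordinates, metres).
# The anthropometric weights below are rough, assumed values summing to 1.0.

SEGMENT_WEIGHTS = {
    "SpineBase": 0.50,          # trunk and pelvis
    "Head": 0.08,
    "KneeLeft": 0.10, "KneeRight": 0.10,
    "AnkleLeft": 0.06, "AnkleRight": 0.06,
    "HandLeft": 0.05, "HandRight": 0.05,
}

def estimate_cop(joints):
    """joints: dict mapping Kinect joint name -> (x, y, z) in metres.
    All joints listed in SEGMENT_WEIGHTS are expected to be tracked.
    Returns the (x, z) ground-plane coordinates of the weighted barycentre."""
    x = sum(w * joints[name][0] for name, w in SEGMENT_WEIGHTS.items())
    z = sum(w * joints[name][2] for name, w in SEGMENT_WEIGHTS.items())
    return x, z

def normalise(series):
    """Centre a CoP time series on its mean, as in the Figure 10 comparison."""
    mean = sum(series) / len(series)
    return [v - mean for v in series]
```

In a comparison such as Figure 10, the lateral component of this normalised series would be plotted against the trace recorded by the Wii Balance Board over the same two-minute trial.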
Table 1. Details about the motion and biophysical data collected during the activity: unit of measurement, typical range of values, and data type in the database.

Description | Unit | Range | Data Type | Notes
Heart Rate | seconds | from 2 to 0.24 | float: 5 dec. | inter-beat interval
Electrodermal activity | kohms | from 0 to INF | integer |
Score | | | integer |
Joint position | meters | from −5 to +5 | float: 7 dec. | for each X, Y, Z coordinate
Yaw, roll, pitch | radians | from 0 to 2π | float: 7 dec. |
Grab strength | | from 0 to 1 | float: 7 dec. | 0 palm open, 1 hand closed
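As a concrete illustration of the schema above, the sketch below shows one hypothetical sample that respects the units, ranges, and data types of Table 1; the field names and the nesting are assumptions made for readability, not the actual ReMoVES database layout.

```python
# Hypothetical example of a single acquisition record consistent with Table 1.
# Field names and structure are illustrative assumptions, not the ReMoVES schema.
sample_record = {
    "inter_beat_interval_s": round(0.85714, 5),   # Heart Rate: seconds, 2 .. 0.24, 5 decimals
    "electrodermal_activity_kohm": 312,           # integer, 0 .. INF
    "score": 1450,                                 # integer game score
    "joint_position_m": {                          # metres, -5 .. +5, 7 decimals per axis
        "HandRight": (round(0.1234567, 7), round(1.0213456, 7), round(2.3456789, 7)),
    },
    "orientation_rad": {                           # radians, 0 .. 2*pi, 7 decimals
        "yaw": round(0.5235988, 7),
        "roll": round(0.0174533, 7),
        "pitch": round(6.1086524, 7),
    },
    "grab_strength": round(0.73, 7),               # 0 = palm open, 1 = hand closed
}
```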
Table 2. Serious games list including the purpose of the activity, the employed devices, and the indicators that describe the patient’s performance during the exercise.

Activity Name | Activity Purpose | Devices | Performance Indicators
BreathBall | Relaxation | Microsoft Band 2 | Heart Rate, Angular coefficient β_HR
OwlNest | Upper limb rehab | Microsoft Kinect V2 | Score, Compensation, Barycenter, Target position
ShelfCans | Upper limb rehab | Microsoft Kinect V2 | Score, Compensation, Hand trajectory
FlappyCloud | Lower limb rehab | Microsoft Kinect V2 | Score, Compensation, Elevation
EquilibriumPaint | Sit to Stand | Microsoft Kinect V2 | Repetition rate, Repetition trend, Coronal shift, Shoulder tilt
HotAir | Balance shift | Microsoft Kinect V2 | Score, CoP variation, Compensation
EndlessZig | Hand rehab | Leap Motion | Score, Repetition rate, Range of Motion: yaw
CityCar | Hand rehab | Leap Motion | Score, Repetition rate, Range of Motion: pitch
WineBottle | Hand rehab | Leap Motion | Score, Repetition rate, Range of Motion: roll
FloatingTrap | Hand rehab | Leap Motion | Score, Repetition rate, Grab strength
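To illustrate how the Range of Motion and Repetition rate indicators of the Leap Motion games could be derived from the angle streams of Table 1, the sketch below processes a sampled angle series (yaw for EndlessZig, pitch for CityCar, roll for WineBottle). The midpoint-crossing rule and the function names are assumptions for illustration, not the ReMoVES implementation.

```python
# Illustrative sketch (assumed, not the ReMoVES implementation): derive two of
# the Table 2 hand-rehabilitation indicators from a sampled angle series in radians.

def range_of_motion(angles):
    """Range of Motion: span between the largest and smallest angle reached."""
    return max(angles) - min(angles)

def repetition_rate(angles, duration_s):
    """Count upward crossings of the mid-range angle (roughly one per full
    movement cycle) and express them as repetitions per minute."""
    midpoint = (max(angles) + min(angles)) / 2.0
    crossings = sum(1 for prev, curr in zip(angles, angles[1:])
                    if prev < midpoint <= curr)
    return crossings / (duration_s / 60.0)
```

For a 90-second WineBottle session, range_of_motion(rolls) would give the roll excursion in radians and repetition_rate(rolls, 90.0) the number of Pronation–Supination cycles per minute.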
