Abstract
Cerebral palsy is a motor disability that occurs in early childhood. Conventional therapy methods have proven useful for upper extremity rehabilitation, but can lead to non-compliance when children grow bored with the repetition of exercises. Virtual reality and game-like simulations of conventional methods have been shown to yield higher rates of compliance, greater patient engagement during exercise, and better performance on the exercises themselves. Most games are good at keeping players engaged, but do not focus on exercising fine motor control. In this paper, we present an analysis of classification techniques for static hand gestures. We also present a prototype of a game-like simulation in which players match static hand gestures in order to improve motor control of the hand.
1 Introduction
Cerebral palsy (CP) is a condition directly related to a lesion in the brain that occurs early in a child's life. Children with CP have permanent issues with posture and movement that impact participation in daily activities. They may also experience musculoskeletal changes, cognitive impairment, and communication and behavioral concerns [1]. There are several subtypes of CP based on the type of muscle tone (spasticity, dyskinetic, hypotonia, and mixed tone) and the location of impairment (quadriplegia, hemiplegia, diplegia, and others) [2]. Hemiplegia is a type of CP in which the child experiences limitations in posture and movement on one side of the body. A child with hemiplegia does not use the impaired arm as often as the unaffected arm due to repeated experiences of failure in using that arm. Human-computer interaction (HCI) can supplement traditional rehabilitation therapy, such as occupational and physical therapy, by creating experiences and environments that give children successful opportunities to use their affected hand or limb without a sense of failure. Research has shown that specially designed computer games can motivate children to use the affected limb more, while also strengthening the involved muscles and any related affected functionality [3–5].
The introduction of low-cost, off-the-shelf sensors such as the Leap Motion [6] has increased the accessibility and usability of equipment that was previously too expensive for many applications. The Leap Motion was specifically designed to detect hand motions and gestures. It operates over a small range with high precision due to its use of infrared optics. Rehabilitation therapists indicate that the Leap Motion has potential for rehabilitation and that it could be an effective motivational tool for young people within a home environment without a therapist present [7].
The purpose of this paper is twofold. First, we compare three classification techniques (decision trees, support vector machines (SVM), and k-nearest neighbors (KNN)) for recognizing and classifying static gestures from the Leap Motion, based on the position of the hand and fingers as well as the joint angles. Second, we create a game that detects these gestures, evaluated by both student volunteers and occupational therapy experts in the field.
2 Background
Evidence for rehabilitation in the area of cerebral palsy has expanded in recent years due to new technologies and methodologies. Specific interventions have been researched to ensure efficacy, cost-effectiveness, and safety [8]. In addition, the World Health Organization (WHO) has established the International Classification of Functioning, Disability, and Health (ICF), intended to serve as a collaborative global framework and scientific tool to measure health and disability. The ICF has shifted the focus from disability and impairment to function and participation within the context of the social and physical environment [9]. For these reasons, it is important to develop rehabilitation interventions that are evidence-based and that consider contextual factors and participation.
Reference [8] completed a systematic review of smaller systematic reviews of interventions for children with cerebral palsy. Therapeutic interventions with the strongest evidence included: bimanual training; Botulinum toxin (Botox) injections; context-focused therapy; goal-directed functional training using a motor-learning approach; therapeutic home programs; and Botox followed by occupational therapy. Human-computer interaction is an area within rehabilitation therapy that provides a new and innovative method for intervention. Virtual reality (VR) games are used in rehabilitation to promote movement and strengthening within a motivating environment [10]. Evidence is emerging on the efficacy of VR and its influence on functional outcomes for children with cerebral palsy [11]. Moreover, VR aligns well with the most strongly supported interventions, since it can be designed to emphasize motor learning, bimanual training, and goal-directed training within the home environment [10, 12]. The child's participation in rehabilitation through highly motivational VR games in the context of their home also supports WHO initiatives.
Commercially available gaming systems and robotic arm systems are most commonly used in the clinic setting. Children with CP are often unable to use commercially available systems due to movement restrictions in the upper extremities. Additionally, such systems are of limited benefit to the occupational therapist because they do not specifically measure small upper extremity movements such as finger extension, wrist extension, ulnar/radial deviation, and forearm supination [13].
3 Related Work
VR has been used in the treatment of CP with greater success than conventional exercises. The authors of [4] show that children with and without CP found VR exercises more interesting than conventional exercises. These children were also able to hold exercise positions longer and showed an increased range of motion during VR exercises compared to conventional exercises. The children's parents also noticed their children having more fun during VR exercises and believed their children would continue the exercises at home. The authors of [14] agree, stating that a VR training program has the potential to improve reaching abilities and control in children with CP.
The Leap Motion controller has been used in game-based physical therapy. Reference [7] evaluated the usefulness of the Leap Motion controller in a clinical environment by developing game-like versions of existing rehabilitation activities, which were then evaluated by clinicians. The results of their trial show that the Leap Motion does have potential to be used in place of some traditional techniques, especially in the home and for young people. Reference [15] focused on the responses from patients. The patients in that study said the game presented to them was very engaging and addressed a need to practice movements related to daily functions. The patients also said they would play the game if it were provided as part of a home therapy program.
Work has also been done on gesture recognition with the Leap Motion controller. Reference [16] presents classification techniques using the Leap Motion controller for both static and dynamic gestures. Reference [17] presents a gesture recognition system using the Leap Motion controller built for therapy applications, including a list of gestures created with the help of therapists. Both of these systems, however, lack a game component to keep the patient involved. They are a good starting point, but the missing game component could lead to non-compliance similar to that of conventional therapy exercises.
4 Experimental Procedure
4.1 Equipment
This paper focuses on the use of the Leap Motion controller. As shown in Fig. 1, the Leap Motion consists of three infrared (IR) light emitters and two IR cameras. Since the system uses stereo vision, it can be categorized as an optical tracking system rather than a depth-based tracking system [18]. The Leap Motion controller provides detailed information about a user's hand, including the position of the wrist, palm, and finger digits in Cartesian space, as well as the direction of the hand and finger digits. This information can be used to determine the joint angles of the wrist and knuckles. The Leap Motion controller can also provide other information, such as which fingers are extended, the normal vector to the palm, information about the forearm, and any tools being used within the controller's field of view.
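As a concrete example of the joint-angle computation, the angle between two reported direction vectors can be recovered with a dot product. The following is a minimal MATLAB sketch; the sample vectors are hypothetical, and the paper does not state its exact formula.

    % Minimal sketch: estimate a wrist angle from the direction vectors the
    % Leap Motion reports (the vectors below are hypothetical sample values).
    forearmDir = [0.05 -0.10 -0.99];   % approximate unit vector along the forearm
    handDir    = [0.02  0.45 -0.89];   % approximate unit vector along the hand

    % Angle between the two direction vectors, in degrees
    wristAngle = acosd(dot(forearmDir, handDir) / (norm(forearmDir) * norm(handDir)));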
4.2 Data Collection
A gesture library was created for this prototype with the help of an occupational therapist. The goal was to recognize gestures that are actually used as therapy exercises. Figure 2 shows the gestures chosen for this library.
The training data for the classification procedures was gathered by having student volunteers perform the gestures in a controlled environment. A visualization tool was developed using the Unity Game Engine [19] and the Leap Motion API. UI controls were placed in the top left corner to let the administrator of the data collection easily start and stop recording and save the data. The volunteer can also see a visualization of their hand on the screen, allowing both the administrator and the volunteer to verify that the Leap Motion controller sees the gesture correctly. Figure 3 shows this UI design.
The volunteer was given photos of the gestures before data collection began so they knew what to do. The volunteer then placed their right hand above the Leap Motion controller and made the first gesture. Once it appeared correctly on the screen, the administrator collected approximately 5 s of data at a sample rate of 50 Hz and saved it. The volunteer then made the second gesture, which was recorded the same way. This task was repeated until all gestures were recorded, and the whole process was repeated two more times. We did not record every data point produced by the Leap Motion controller. Instead, we recorded only the features useful for determining the gesture: which fingers were extended, the directions of the forearm and the hand, the normal vector of the palm, and the joint angles of the wrist and knuckles.
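To make the scale of the data concrete: at 50 Hz for roughly 5 s, each recording yields about 250 samples, and each volunteer repeats every gesture three times. A minimal MATLAB sketch of this bookkeeping follows; the joint-angle count and variable names are our own assumptions, not the paper's exact storage format.

    fs = 50;           % sample rate (Hz)
    duration = 5;      % seconds per recording
    reps = 3;          % repetitions per gesture per volunteer
    samplesPerGesture = fs * duration * reps;   % roughly 750 samples per gesture

    % One row per sample: 5 finger-extension flags, forearm direction (3),
    % hand direction (3), palm normal (3), plus wrist and knuckle joint
    % angles (count assumed as 6; the paper does not give an exact number).
    nJointAngles = 6;
    X = zeros(samplesPerGesture, 5 + 3 + 3 + 3 + nJointAngles);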
5 Analysis
We used three different methods of classification: decision trees, k-nearest neighbors (KNN), and support vector machines (SVM). First, we had to determine which features distinguish which gestures. For example, only the Supination of the Forearm gesture has the palm's normal vector pointing in the positive Y direction, so this feature can be ignored for all other gestures in the library. A full list of the features used to classify each gesture is shown in Table 1.
We built ten different decision trees to classify the ten different gestures. The reason for this was based on gameplay: we can assume which gesture someone is supposed to make from where they are in the game, since they would only make certain gestures at certain points. With this assumption, we can classify the gesture with a decision tree that has fewer levels than a single tree classifying all gestures at once. One of the decision trees is shown in Fig. 4. The tolerances for the joint angles and directional vectors were determined by inspecting the training data, verifying that a majority of the training data would be classified correctly while allowing some error in live input from a game.
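To illustrate the kind of test a node in one of these trees performs, the sketch below checks the supination example given above against a tolerance. The 0.8 threshold and sample vector are illustrative, not the tuned values.

    % Sketch of a single tolerance test in the spirit of the per-gesture
    % trees: supination requires the palm normal to point in +Y.
    % The 0.8 threshold is illustrative, not one of the tuned tolerances.
    palmNormal = [0.05 0.92 -0.12];        % hypothetical Leap Motion sample
    isSupination = palmNormal(2) > 0.8;    % true if palm normal is close to +Y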
The KNN analysis was developed using built-in MATLAB functions. The number of neighbors was set to 294, which still yielded a very low error; no other parameters were changed. Unlike the decision tree approach above, this approach used a single model to classify all gestures, because KNN handles multiple classes easily without excessive computation time. It was also the only approach to use all collected features to classify a gesture.
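The paper does not list the exact calls used; assuming MATLAB's Statistics Toolbox, the model likely resembles the following sketch, where X, Y, and newSample are our placeholder names for the feature matrix, gesture labels, and a live sample.

    % Sketch of the single KNN model using MATLAB's built-in fitcknn
    % (X is the n-by-d feature matrix, Y the gesture labels; names are ours).
    knnModel = fitcknn(X, Y, 'NumNeighbors', 294);

    % Classify a new 1-by-d sample from the Leap Motion (newSample is a placeholder)
    predictedGesture = predict(knnModel, newSample);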
Lastly, we used ten SVMs, also developed with built-in MATLAB functions, using the Gaussian radial basis function as the kernel. We built ten different SVMs for the same reason given for the decision trees. The feature vector for each SVM comes from Table 1.
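A sketch of the per-gesture SVMs with MATLAB's built-in fitcsvm, framed one-vs-rest; the featureIdx indexing into the Table 1 feature subsets is our own assumption, not the paper's code.

    % Sketch of the ten per-gesture SVMs using fitcsvm with an RBF kernel.
    % featureIdx{g} (the Table 1 feature subset for gesture g) is assumed.
    svmModels = cell(10, 1);
    for g = 1:10
        Xg = X(:, featureIdx{g});   % features relevant to gesture g
        Yg = double(Y == g);        % gesture g versus all other gestures
        svmModels{g} = fitcsvm(Xg, Yg, 'KernelFunction', 'rbf');
    end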
Table 2 shows a comparison of the three methods. The models were verified by resubstitution: each model was given back the data samples it was trained to match, producing the results below. As shown, KNN yielded the best results. Most of the decision tree models were above 90% accuracy; further tuning of the tolerances for middle, ring, and pinky finger extension should improve the remaining models.
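Resubstitution error is available directly on fitted MATLAB models via resubLoss; a sketch of the check described above, continuing the variable names from the earlier sketches:

    % Resubstitution: each model is scored on its own training samples.
    % resubLoss returns the misclassified fraction, so accuracy is 1 - loss.
    knnAccuracy = 1 - resubLoss(knnModel);
    svmAccuracy = zeros(10, 1);
    for g = 1:10
        svmAccuracy(g) = 1 - resubLoss(svmModels{g});
    end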
6 Game
For the game prototype, we once again used the Unity Game Engine and the Leap Motion API. The game consists of two phases. The first is a fifteen-second rest period during which the player is not required to do anything; a picture of the next gesture is shown so the player can prepare. During the second phase, the player must match the gesture shown on the screen. Both a picture of the gesture and a visualization of the hand as seen by the Leap Motion are shown, so players can see what they are doing with respect both to real life and to the Leap Motion itself. The top left corner turns green when the gesture is matched and red when it is not, and the score increments by one for every second the gesture is held; this holding is intended to help strengthen the hand. A screen capture of the game is shown in Fig. 5. We used the decision tree models for this game due to the lack of open-source classification software readily available for Unity. Recognition responsiveness did not appear to suffer from using decision trees, and most gestures were recognized reliably, although certain gestures required more exact positioning than the authors expected.
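The prototype itself is implemented in Unity; purely to make the two-phase loop concrete, the following MATLAB-style sketch mirrors it, with showGesture, readLeapSample, and matchesGesture as hypothetical helpers standing in for the Unity and Leap Motion calls, and an assumed match-phase length.

    restTime = 15;  matchTime = 30;   % phase lengths in seconds (matchTime assumed)
    dt = 1/50;  score = 0;            % score accumulates one point per second held
    for g = 1:numel(gestureList)
        showGesture(gestureList(g));          % rest phase: preview the next gesture
        pause(restTime);
        for t = 0:dt:matchTime                % match phase
            sample = readLeapSample();        % current hand features (hypothetical)
            if matchesGesture(sample, gestureList(g))   % decision tree check
                score = score + dt;
            end
            pause(dt);
        end
    end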
Student volunteers played the game and then filled out a three-question survey. The questions used a Likert scale of 1–5, with 5 being the most positive and 1 the most negative. The mean response to "I feel the overall control interface is easy to use" was 3.67, but the mean response to "I feel that with practice, I could become proficient in using the control interface" was 4.83. This suggests that volunteers felt more play would lead to higher scores, which in turn would improve range of motion. The mean response to "The tasks presented on the screen are easy to understand" was 4.33. The only comment from the volunteers was that one of the pictures was rotated relative to the Leap Motion model, which caused some confusion.
A video of one of the authors playing the game was made and sent to two area hospitals to be evaluated by pediatric physical and occupational therapists. The prototype received mixed reviews. The mean of "The Leap Motion appears easy to use" and "I feel that I could become proficient in using the Leap Motion" was 3.78, while the mean of "The tasks on the screen are easy to understand" was 4. When asked about an improved version of the prototype, the most telling response was to "I feel patients would be motivated to use an improved version of this prototype," which had a mean of 2.87. In their comments, these therapists said the game needs to be more engaging, fun, and interactive to hold a patient's attention.
7 Conclusion
In this paper, we have presented an analysis of classification techniques on data gathered from the Leap Motion controller. Decision trees provided over 90% accuracy for the majority of gestures, but KNN and SVM provided much more accurate results. We believe this is because the tolerances chosen for the joint angles in the decision trees did not allow a wide enough variance to properly classify certain gestures. Further adjustment of the tolerances should yield better classification results.
A game prototype was also presented. Student volunteers who played the prototype reported that the interface was easy to use and that they could easily become proficient with it. Therapists viewing a demo of the game also gave positive feedback on the use of the Leap Motion controller and the way tasks were presented to the user. However, the therapists commented on the engagement level of the game, indicating that patients might not feel motivated to use the current or an improved version of this prototype, and that the game needs more features to hold the patient's attention and keep them motivated to use the system. Based on the feedback from the student volunteers and the therapists, there is enough evidence to develop a new version of the game that incorporates other gestures and data modalities, as well as a more engaging interface.
8 Future Work
The next phase of this prototype is to expand the gesture library. Adding gestures will increase the number of features needed to distinguish gestures from one another. The added gestures could be either static or dynamic; the latter would require further analysis of gesture recognition algorithms to determine the best approach for dynamic gestures with the Leap Motion.
Also, a therapist user interface will be added. This will allow therapists to view any important data gathered during gameplay sessions. This will also enable the potential for telerehabilitation, since the therapist can then view the data from sessions the patient does at home. This interface will also allow the therapist to control the exercises, such as the order of the gestures and the difficulty of the exercises, or how accurate the gesture has to be.
Lastly, a more engaging game will be developed. The current game is very basic, as the therapists noted. A more engaging game might convince therapists that patients, especially in pediatrics, would feel motivated to play.
References
Rosenbaum, P., Paneth, N., Leviton, A., Goldstein, M., Bax, M., Damiano, D., Dan, B., Jacobsson, B.: A report: the definition and classification of cerebral palsy. Dev. Med. Child Neurol. 49(supplement 109), 8–14 (2007)
Shevell, M., Dagenais, L., Hall, N.: The relationship of cerebral palsy subtype and functional motor impairment: a population based study. Dev. Med. Child Neurol. 51(11), 872–877 (2009)
Aarts, P.B., Hartingsveldt, M., Anderson, P.G., Tillar, I., Burg, J., Geurts, A.C.: The pirate group intervention protocol: description and a case report of a modified constraint-induced movement therapy combined with bimanual training for young children with unilateral spastic cerebral palsy. Occup. Therapy Int. 19(2), 76–87 (2012)
Bryanton, C., Bosse, J., Brien, M., Mclean, J., McCormick, A., Sveistrup, H.: Feasibility, motivation and selective motor control: virtual reality compared to conventional home exercise in children with cerebral palsy. Cyberpsychology Behav. 19(2), 123–128 (2006)
Snider, L., Majnemer, A., Darsaklis, V.: Virtual reality as a therapeutic modality for children with cerebral palsy. Dev. Neurorehabil. 13(2), 120–128 (2010)
Leap Motion Controller SDK. https://developer.leapmotion.com/
Charles, D., Pedlow, K., McDonough, S., Shek, K., Charles, T.: Close range depth sensing camera for virtual reality based hand rehabilitation. J. Assist. Technol. 8(3), 138–149 (2014)
Novak, I., McIntyre, S., Morgan, C., Campbell, L., Dark, L., Morton, N., Stumbles, E., Wilson, S., Goldsmith, S.: A systematic review of interventions for children with cerebral palsy: state of the evidence. Dev. Med. Child Neurol. 55(10), 885–910 (2013)
World Health Organization: International Classification of Functioning, Disability and Health. WHO, Geneva (2001)
Fluet, G.G., Qiu, Q., Kelly, D., Parikh, H.D., Ramirez, D., Saleh, S., Adamovich, S.V.: Interfacing a haptic robotic system with complex virtual environments to treat impaired upper extremity motor function in children with cerebral palsy. Dev. Neurorehabil. 13(5), 335–345 (2010)
Yin, C.W., Sien, N.Y., Ying, L.A., Chung, S.F.M., Tan, M.L.: Virtual reality for upper extremity rehabilitation in early stroke: a pilot randomized controlled trial. Clin. Rehabil. 28(11), 1107–1114 (2014)
Lewis, G.N., Rosie, J.A.: Virtual reality games for movement rehabilitation in neurological conditions: How do we meet the needs and expectations of the users? Disabil. Rehabil. 34(22), 1880–1886 (2012)
Chen, Y., Caldwell, M., Dickerhoof, E., Hall, A., Odakura, B., Morelli, K., Fanchiang, H.: Game analysis, validation, and potential application of EyeToy play and play 2 to upper-extremity rehabilitation. Rehabil. Res. Pract. 2014, 1–13 (2014)
Chen, Y., Kang, L., Chuang, T., Doong, J., Lee, S., Tsai, M., Jeng, S., Sung, W.: Use of virtual reality to improve upper-extremity control in children with cerebral palsy: a single subject design. Phys. Ther. 87(11), 1441–1457 (2007)
Khademi, M., Hondori, H.M., McKenzie, A., Dodakian, L., Lopes, C.V., Cramer, S.C.: Free hand interaction with leap motion controller for stroke rehabilitation. In: Proceedings of the Extended Abstracts of the 32nd Annual ACM Conference on Human Factors in Computing Systems, pp. 1663–1668. ACM, New York (2014)
Nowicki, M., Pilarczyk, O., Wasikowski, J., Zjawin, K., Jaskowski, W.: Gesture Recognition Library for Leap Motion Controller. Bachelor thesis. Poznan University of Technology, Poland (2014)
Rahman, R.A.: Multimedia non-invasive hand therapy monitoring system. In: 2014 IEEE International Symposium on Medical Measurements and Applications, pp. 1–5. IEEE Press, New York (2014)
Weichert, F., Bachmann, D., Rudak, B., Fissler, D.: Analysis of the accuracy and robustness of the leap motion controller. Sensors 13(5), 6380–6393 (2013)
Unity Game Engine. http://unity3d.com/5
Acknowledgements
Special thanks to the students of the Heracleia Human Centered Computing Laboratory who volunteered to play the game and provide feedback. This work has been partially supported by the following NSF grants: CNS: 1338118, CNS: 1035913, and IIS: 1041637.