Article

Gesture Recognition with a 3-D Accelerometer

Published: 06 July 2009

Abstract

Gesture-based interaction, as a natural form of human-computer interaction, has a wide range of applications in ubiquitous computing environments. This paper presents an acceleration-based gesture recognition approach, called FDSVM (Frame-based Descriptor and multi-class SVM), which requires only a wearable 3-dimensional accelerometer. With FDSVM, the acceleration data of a gesture are first collected and represented by a frame-based descriptor that extracts the discriminative information. An SVM-based multi-class gesture classifier is then built for recognition in the nonlinear gesture feature space. Extensive experiments on a data set of 3360 samples of 12 gestures, collected over several weeks, demonstrate that the proposed FDSVM approach significantly outperforms four other methods: DTW, Naïve Bayes, C4.5, and HMM. In the user-dependent case, FDSVM achieves a recognition rate of 99.38% for the 4 direction gestures and 95.21% for all 12 gestures. In the user-independent case, it obtains 98.93% for 4 gestures and 89.29% for 12 gestures. Compared to other accelerometer-based gesture recognition approaches reported in the literature, FDSVM gives the best results for both the user-dependent and user-independent cases.
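The pipeline the abstract describes (split a variable-length 3-axis acceleration trace into frames, compute per-frame features, and classify the resulting fixed-length descriptor) can be illustrated with a minimal sketch. The specific per-frame features below (per-axis mean and energy) and the frame count are illustrative assumptions, not the paper's exact descriptor:

```python
import numpy as np

def frame_descriptor(accel, n_frames=8):
    """Illustrative frame-based descriptor for a gesture.

    accel: array of shape (T, 3) -- T samples of (x, y, z) acceleration.
    The trace is split into n_frames roughly equal frames; for each
    frame we compute per-axis mean and per-axis energy (mean of the
    squared signal), yielding a fixed-length vector regardless of T.
    Returns a 1-D feature vector of length n_frames * 3 * 2.
    """
    accel = np.asarray(accel, dtype=float)
    frames = np.array_split(accel, n_frames, axis=0)
    feats = []
    for f in frames:
        feats.extend(f.mean(axis=0))         # per-axis mean
        feats.extend((f ** 2).mean(axis=0))  # per-axis energy
    return np.array(feats)
```

The fixed-length descriptor can then be fed to any multi-class classifier; the paper builds a multi-class SVM on such vectors to handle the nonlinear gesture feature space.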



    Published In

    UIC '09: Proceedings of the 6th International Conference on Ubiquitous Intelligence and Computing
    July 2009
    388 pages
    ISBN: 9783642028298
    Editors: Daqing Zhang, Marius Portmann, Ah-Hwee Tan, Jadwiga Indulska

    Publisher

    Springer-Verlag, Berlin, Heidelberg



    Cited By

    • (2024) CARDinality: Interactive Card-shaped Robots with Locomotion and Haptics using Vibration. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, pp. 1-14, doi:10.1145/3654777.3676421. Online publication date: 13-Oct-2024
    • (2022) Application of Knowledge Model in Dance Teaching Based on Wearable Device Based on Deep Learning. Mobile Information Systems, vol. 2022, doi:10.1155/2022/3299592. Online publication date: 1-Jan-2022
    • (2022) MyoSpring: 3D Printing Mechanomyographic Sensors for Subtle Finger Gesture Recognition. Proceedings of the Sixteenth International Conference on Tangible, Embedded, and Embodied Interaction, pp. 1-13, doi:10.1145/3490149.3501321. Online publication date: 13-Feb-2022
    • (2020) Extend, Push, Pull: Smartphone Mediated Interaction in Spatial Augmented Reality via Intuitive Mode Switching. Proceedings of the 2020 ACM Symposium on Spatial User Interaction, pp. 1-10, doi:10.1145/3385959.3418456. Online publication date: 31-Oct-2020
    • (2019) Scratch Nodes ML. Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, pp. 1-6, doi:10.1145/3290607.3312894. Online publication date: 2-May-2019
    • (2019) Probabilistic Hierarchical Model Using First Person Vision for Scenario Recognition. Wireless Personal Communications 106(4), 2179-2193, doi:10.1007/s11277-018-5933-9. Online publication date: 1-Jun-2019
    • (2018) Class-balanced siamese neural networks. Neurocomputing 273(C), 47-56, doi:10.5555/3162631.3162749. Online publication date: 17-Jan-2018
    • (2018) Pocket6. Proceedings of the 2018 ACM Symposium on Spatial User Interaction, pp. 2-10, doi:10.1145/3267782.3267785. Online publication date: 13-Oct-2018
    • (2018) !FTL, an Articulation-Invariant Stroke Gesture Recognizer with Controllable Position, Scale, and Rotation Invariances. Proceedings of the 20th ACM International Conference on Multimodal Interaction, pp. 125-134, doi:10.1145/3242969.3243032. Online publication date: 2-Oct-2018
    • (2018) TouchCam. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 1(4), 1-23, doi:10.1145/3161416. Online publication date: 8-Jan-2018
