3D Human-Gesture Interface for Fighting Games Using Motion Recognition Sensor


Abstract

As augmented reality technologies are commercialized in response to demand for 3D content, a usage pattern is emerging in which users consume 3D content for its realism and immersion. In this paper, we propose a 3D human-gesture interface for fighting games that uses a motion recognition sensor. Because the skeleton coordinates measured by such a sensor can vary with the length and direction of the arm even when the same gesture is performed, the pattern characteristics of gestures are extracted relative to body proportions around the shoulders rather than from absolute position information. The motion characteristics of gestures are extracted from the joint information provided by the sensor, and 3D human motion is modeled mathematically. Each motion is expressed in a low-dimensional space via principal component analysis and matched against new input to the gesture interface. We also propose an advanced pattern matching algorithm that reduces the motion constraints imposed by the recognition system. Finally, based on the recognition results, we present an example in which the interface drives a 3D fighting action game. By capturing high-quality 3D motion and processing it in real time, the developed technology delivers more realistic 3D content.
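
To make the pipeline concrete, the following Python sketch illustrates the two core steps described above: normalizing the skeleton relative to the shoulders so that absolute position and arm length cancel out, and projecting a fixed-length motion sequence into a low-dimensional space via principal component analysis, where new input is matched by nearest neighbour. This is a minimal sketch, not the authors' implementation; the joint indices, feature set, and rejection threshold are assumptions.

import numpy as np

# Indices into a Kinect-style skeleton array of shape (J, 3).
# These indices and the chosen feature joints are assumptions; the
# paper does not fix an exact joint set here.
SHOULDER_C, SHOULDER_L, SHOULDER_R = 2, 4, 8
ELBOW_L, HAND_L, ELBOW_R, HAND_R = 5, 7, 9, 11
FEATURE_JOINTS = [ELBOW_L, HAND_L, ELBOW_R, HAND_R]

def normalize_pose(skeleton):
    """Express joints relative to the shoulder center, scaled by
    shoulder width, so absolute position and body proportions cancel."""
    center = skeleton[SHOULDER_C]
    width = np.linalg.norm(skeleton[SHOULDER_L] - skeleton[SHOULDER_R])
    return (skeleton[FEATURE_JOINTS] - center) / max(width, 1e-6)

def gesture_vector(frames):
    """Flatten a fixed-length sequence of skeleton frames (T x J x 3)
    into one feature vector describing the whole motion."""
    return np.concatenate([normalize_pose(f).ravel() for f in frames])

class PCAGestureMatcher:
    """Project training gestures into a low-dimensional eigenspace and
    classify new input by nearest neighbour (a simple stand-in for the
    paper's advanced pattern matching step)."""

    def __init__(self, n_components=8):
        self.n_components = n_components

    def fit(self, gestures, labels):
        X = np.stack([gesture_vector(g) for g in gestures])
        self.mean_ = X.mean(axis=0)
        # Principal axes from the SVD of the centered training matrix.
        _, _, vt = np.linalg.svd(X - self.mean_, full_matrices=False)
        self.components_ = vt[: self.n_components]
        self.train_proj_ = (X - self.mean_) @ self.components_.T
        self.labels_ = list(labels)

    def predict(self, frames, reject_dist=5.0):
        z = (gesture_vector(frames) - self.mean_) @ self.components_.T
        d = np.linalg.norm(self.train_proj_ - z, axis=1)
        i = int(np.argmin(d))
        # Reject motions far from every trained gesture (threshold assumed).
        return self.labels_[i] if d[i] < reject_dist else None

In a game interface, each gesture label would be bound to a fighting move (for example punch, kick, or guard), and predict would run over a sliding window of the live skeleton stream.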

Acknowledgments

This study was supported by research funds from Gwangju University in 2015.

Author information

Corresponding author

Correspondence to Kyungyong Chung.

About this article

Cite this article

Kim, J., Jung, H., Kang, M. et al. 3D Human-Gesture Interface for Fighting Games Using Motion Recognition Sensor. Wireless Pers Commun 89, 927–940 (2016). https://doi.org/10.1007/s11277-016-3294-9
