
3D finger tracking and recognition image processing for real-time music playing with depth sensors

Published in: Multimedia Tools and Applications

Abstract

In this research, we propose a state-of-the-art 3D finger gesture tracking and recognition method that uses depth sensors to track both hands for real-time music playing. In line with the development of 3D depth cameras, we implemented a set of 3D gesture-based instruments, such as a Virtual Cello and a Virtual Piano, which require precise finger tracking in 3D space. For hand tracking, we propose model-based tracking for the left hand and appearance-based tracking for the right hand. For the Virtual Cello, finger gestures are detected through a series of systematic steps, including noise reduction in the depth map and geometric processing. For the Virtual Piano, we introduce a Neural Network (NN) method to detect special hand gestures; it has a Multilayer Perceptron (MLP) structure trained with back propagation. Prior work offers few examples beyond touch screens with fixed coordinates and 2D gestures used to control MIDI input. With our approach, end users no longer need to wear or carry anything on their hands. We use the Senz3D and Leap Motion sensors because of several technical benefits: both operate at a close distance to the hands, so detailed finger gestures can be identified precisely. In past years, we have presented a set of virtual musical instruments and the MINE Virtual Band. Our work has been tested both in a lab environment and on a professional theatrical stage. More information and demonstrations of the proposed method can be accessed at: http://video.minelab.tw/DETS/VMIB/.
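To make the MLP-based gesture detector concrete, the following is a minimal sketch of a one-hidden-layer perceptron trained with back propagation, in the spirit of the Virtual Piano detector described above. The feature layout (ten fingertip-derived values per frame), layer sizes, gesture classes, and the synthetic training data are all illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 10   # e.g. two depth-derived values per finger (assumed layout)
N_HIDDEN = 16
N_GESTURES = 4    # e.g. press, lift, chord, rest (hypothetical classes)

# weight matrices for the two-layer MLP
W1 = rng.normal(0, 0.5, (N_FEATURES, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0, 0.5, (N_HIDDEN, N_GESTURES))
b2 = np.zeros(N_GESTURES)

def forward(X):
    """Hidden tanh layer followed by a softmax output layer."""
    h = np.tanh(X @ W1 + b1)
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)

def train_step(X, y, lr=0.1):
    """One full-batch back-propagation step; returns the cross-entropy loss."""
    global W1, b1, W2, b2
    h, p = forward(X)
    n = len(X)
    # gradient of cross-entropy w.r.t. the logits (softmax - one-hot)
    d_logits = p.copy()
    d_logits[np.arange(n), y] -= 1.0
    d_logits /= n
    # backpropagate through the output layer, then the tanh hidden layer
    dW2 = h.T @ d_logits
    db2 = d_logits.sum(axis=0)
    d_h = (d_logits @ W2.T) * (1 - h ** 2)  # tanh derivative
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    return -np.log(p[np.arange(n), y] + 1e-12).mean()

# synthetic training data: each gesture class clusters around a prototype
prototypes = rng.normal(0, 1, (N_GESTURES, N_FEATURES))
y = rng.integers(0, N_GESTURES, 200)
X = prototypes[y] + rng.normal(0, 0.2, (200, N_FEATURES))

losses = [train_step(X, y) for _ in range(300)]
```

In a real pipeline the feature vectors would come from the depth sensor's per-frame fingertip positions after noise reduction, and the predicted class would be mapped to a MIDI event.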




Author information

Corresponding author

Correspondence to Timothy K. Shih.


About this article


Cite this article

Togootogtokh, E., Shih, T.K., Kumara, W.G.C.W. et al. 3D finger tracking and recognition image processing for real-time music playing with depth sensors. Multimed Tools Appl 77, 9233–9248 (2018). https://doi.org/10.1007/s11042-017-4784-9
