Abstract
In this research, we propose a 3D finger gesture tracking and recognition method that uses depth sensors to track both hands for real-time music playing. Following the development of 3D depth cameras, we implemented a set of 3D gesture-based instruments, such as a Virtual Cello and a Virtual Piano, which require precise finger tracking in 3D space. For hand tracking, we propose model-based tracking for the left hand and appearance-based tracking for the right hand. For the Virtual Cello, finger gestures are detected through a series of systematic steps, including depth-map noise reduction and geometric processing. For the Virtual Piano, we introduce a neural network (NN) method, a multilayer perceptron (MLP) trained with back-propagation, to detect specific hand gestures. The literature offers only a few examples that use a touch screen with fixed coordinates and 2D gestures to control MIDI input; with our approach, end users no longer need to wear or hold anything on their hands. We use the Senz3D and Leap Motion sensors for their technical benefits: both operate at close range to the hands, so detailed finger gestures can be precisely identified. In past years, we have presented a set of virtual musical instruments and the MINE Virtual Band. Our work has been tested in a lab environment and on a professional theatrical stage. More information and demonstrations of the proposed method can be accessed at: http://video.minelab.tw/DETS/VMIB/.
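To make the gesture-recognition pipeline described above concrete, the sketch below shows a minimal, self-contained version of the two technical ingredients named in the abstract: median filtering to reduce noise in a raw depth map, and an MLP trained with back-propagation to classify hand-gesture feature vectors. This is not the authors' implementation; the feature dimensionality, layer widths, learning rate, and gesture labels are hypothetical placeholders chosen only for illustration.

```python
# Minimal sketch (assumed parameters, not the paper's implementation):
# depth-map denoising with a median filter, followed by a small MLP
# gesture classifier trained with plain back-propagation.
import numpy as np
from scipy.ndimage import median_filter


def denoise_depth(depth_map, kernel=3):
    """Reduce speckle noise in a raw depth map with a median filter."""
    return median_filter(depth_map, size=kernel)


class MLP:
    """Two-layer perceptron (tanh hidden layer, softmax output)."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def _forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)            # hidden activations
        logits = self.h @ self.W2 + self.b2
        e = np.exp(logits - logits.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)             # softmax probabilities

    def train_step(self, X, y_onehot):
        """One back-propagation step on a cross-entropy loss."""
        p = self._forward(X)
        d_logits = (p - y_onehot) / len(X)                  # dL/dlogits
        dW2 = self.h.T @ d_logits
        db2 = d_logits.sum(axis=0)
        d_h = d_logits @ self.W2.T * (1.0 - self.h ** 2)    # tanh derivative
        dW1 = X.T @ d_h
        db1 = d_h.sum(axis=0)
        for param, grad in ((self.W1, dW1), (self.b1, db1),
                            (self.W2, dW2), (self.b2, db2)):
            param -= self.lr * grad                          # gradient descent update

    def predict(self, X):
        return self._forward(X).argmax(axis=1)


# Hypothetical usage: X holds hand-geometry features (e.g., fingertip/joint
# positions extracted from a denoised depth frame); y holds gesture labels.
if __name__ == "__main__":
    X = np.random.rand(200, 15)            # 15 features per frame (assumed)
    y = np.random.randint(0, 5, 200)       # 5 gesture classes (assumed)
    y_onehot = np.eye(5)[y]
    net = MLP(n_in=15, n_hidden=32, n_out=5)
    for _ in range(500):
        net.train_step(X, y_onehot)
    print("training accuracy:", (net.predict(X) == y).mean())
```

In practice, the feature vectors would come from the fingertip and palm positions reported by the Senz3D or Leap Motion device after depth-map denoising, rather than from the random data used here for illustration.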
Cite this article
Togootogtokh, E., Shih, T.K., Kumara, W.G.C.W. et al. 3D finger tracking and recognition image processing for real-time music playing with depth sensors. Multimed Tools Appl 77, 9233–9248 (2018). https://doi.org/10.1007/s11042-017-4784-9