
Vision-based 3D finger interactions for mixed reality games with physics simulation

Published: 08 December 2008

Abstract

Mixed reality applications can provide users with enhanced interaction experiences by integrating virtual and real-world objects in a mixed environment. Through the mixed reality interface, a more realistic and immersive control style is achieved than with traditional keyboard and mouse input devices. The interface proposed in this paper uses a stereo camera to track the user's hands and fingers robustly and accurately in 3D space. To enable a physically realistic interaction experience, a physics engine is adopted to simulate the physics of virtual object manipulation. Objects can be picked up and tossed with physical characteristics, such as gravity and collisions, as they would occur in the real world. Detection and interaction in our system are fully computer-vision based, without any markers or additional sensors. We demonstrate this gesture-based interface with two mixed reality game implementations: finger fishing, in which a player fishes for virtual objects with his/her fingers as in a real environment, and Jenga, a simulation of the well-known tower-building game. A user study is conducted and reported to demonstrate the accuracy, effectiveness, and comfort of this interactive interface.
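The core geometric step behind stereo-camera fingertip tracking like this is triangulating a fingertip's 3D position from its pixel coordinates in a rectified image pair. The following is a minimal sketch of that computation; the calibration values (focal length, baseline, principal point) are illustrative assumptions, not the parameters used in the paper.

```python
# Sketch: recover a fingertip's 3D position from a rectified stereo pair.
# Depth follows from disparity (Z = f * B / d); X and Y are then obtained
# by back-projecting the left-image pixel into the camera frame.
# All calibration constants below are assumed, for illustration only.

def triangulate_fingertip(u_left, v_left, u_right,
                          focal_px=700.0,    # focal length in pixels (assumed)
                          baseline_m=0.06,   # camera baseline in meters (assumed)
                          cx=320.0, cy=240.0):  # principal point (assumed)
    """Return (X, Y, Z) in meters for a fingertip detected at
    (u_left, v_left) in the left image and column u_right in the right."""
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("fingertip must have positive disparity")
    z = focal_px * baseline_m / disparity   # depth from disparity
    x = (u_left - cx) * z / focal_px        # back-project to camera frame
    y = (v_left - cy) * z / focal_px
    return x, y, z

if __name__ == "__main__":
    x, y, z = triangulate_fingertip(350.0, 260.0, 280.0)
    print(f"fingertip at X={x:.3f} m, Y={y:.3f} m, Z={z:.3f} m")
```

The resulting 3D point can then be fed to a physics engine (the paper uses one for gravity and collision response) as the position of a kinematic "finger" body that pushes or grasps virtual objects.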

Supplementary Material

JPG File (a7-song.jpg)
WMV File (a7-song.wmv)



Published In

VRCAI '08: Proceedings of the 7th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry
December 2008, 223 pages
ISBN: 9781605583358
DOI: 10.1145/1477862

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. finger interaction
  2. finger tracking
  3. mixed reality
  4. physics simulation

Qualifiers

  • Research-article

Conference

VRCAI '08

Acceptance Rates

Overall Acceptance Rate 51 of 107 submissions, 48%

