
Pinch gesture interaction in the peripersonal space using VST smartphone-based HMDs

Published in Multimedia Tools and Applications.

Abstract

Augmented Reality (AR) applications are known for using unconventional devices and for enabling different ways of interacting with the virtual elements of the environment. Head-mounted displays (HMDs), for example, enrich the user experience by providing greater immersion than other display devices. Most conventional HMDs are expensive or have restrictions, such as a limited field of view, that make them inappropriate for many applications. With the increased computational power of mobile devices, some HMD models integrate a conventional smartphone into the equipment. Smartphone-based HMDs have the advantage of being cheaper, so they can reduce the cost of AR systems. In addition, they offer the processing power, field of view, and camera quality required for most AR applications. Many applications suited to smartphone-based HMDs use the peripersonal space (distances of up to 1 meter from the user) as the region in which users interact with the virtual objects of the environment. Because it is a specific region, it is essential to evaluate whether the device's limitations affect the performance of interactions within this space. In this work, we built a system that combines a depth sensor and an HMD to evaluate whether users' performance with pinch gestures inside the peripersonal space is uniform when using a smartphone-based HMD in AR applications. The results showed that such equipment is suitable for applications in which interaction occurs under these conditions, as users' performance in similar tasks is not affected when the interaction occurs within the peripersonal space.
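The two conditions the abstract describes, detecting a pinch gesture and confining interaction to the peripersonal space, can be illustrated with a minimal sketch. All names and thresholds below are assumptions for illustration (the paper does not publish its detection code): a pinch is registered when the tracked thumb and index fingertips come within a small distance of each other, and the interaction counts only if it happens within 1 meter of the user's head.

```python
import math

PINCH_THRESHOLD_M = 0.03      # fingertip separation treated as a pinch (assumed value)
PERIPERSONAL_RADIUS_M = 1.0   # peripersonal-space bound used in the article (1 meter)


def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))


def is_pinch(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_M):
    """True when the fingertips are close enough to count as a pinch."""
    return distance(thumb_tip, index_tip) < threshold


def in_peripersonal_space(point, head, radius=PERIPERSONAL_RADIUS_M):
    """True when `point` lies within `radius` meters of the head position."""
    return distance(point, head) <= radius


# Example frame: thumb and index tips 2 cm apart, 60 cm in front of the head.
head = (0.0, 0.0, 0.0)
thumb = (0.00, 0.0, 0.6)
index = (0.02, 0.0, 0.6)
if is_pinch(thumb, index) and in_peripersonal_space(index, head):
    print("pinch inside peripersonal space")
```

In a real system, the fingertip positions would come from a hand-tracking sensor such as the Leap Motion controller used in the study, sampled every frame; the gating logic, however, reduces to the two checks above.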


Figures 1–8 (available in the full article)


Availability of data and materials

Data sharing is not applicable to this article, as no datasets were generated or analyzed during the current study.


Funding

The authors declare that no funds, grants, or other support were received during the preparation of this manuscript.

Author information

Authors and Affiliations

Authors

Contributions

All authors contributed to the study conception and design.

Corresponding author

Correspondence to Silvio Ricardo Rodrigues Sanches.

Ethics declarations

Conflicts of interest

The authors have no conflicts of interest to declare in connection with this publication.

Ethics approval

Ethics approval is not applicable as there was no volunteer participation.

Authors' Consent

The manuscript is submitted with the consent of all authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Batistão, L.V., Sementille, A.C., Corrêa, C.G. et al. Pinch gesture interaction in the peripersonal space using VST smartphone-based HMDs. Multimed Tools Appl 83, 80873–80887 (2024). https://doi.org/10.1007/s11042-024-18736-5

