
Understanding freehand gestures: a study of freehand gestural interaction for immersive VR shopping applications

Published: 10 December 2019

Abstract

Unlike retail stores, which require the user to be physically present and active during restricted opening hours, online shops can be more convenient, functional, and efficient. However, traditional online shops often offer only a narrow bandwidth for product visualization and interaction and lack a compelling shopping context. In this paper, we report a study on eliciting user-defined gestures for shopping tasks in an immersive VR (virtual reality) environment. We make a methodological contribution by introducing varied practice into the elicitation procedure, which produces more usable freehand gestures than traditional elicitation studies. Using our method, we developed a gesture taxonomy and generated a user-defined gesture set. To validate the usability of the derived gesture set, we conducted a comparative study that examined the performance, error count, user preference, and effort required of end-users when using freehand gestures compared with traditional immersive VR interaction techniques such as the virtual handle controller and ray-casting. Experimental results show that the freehand-gesture-based interaction technique was rated best in terms of task load, user experience, and presence, without loss of performance (i.e., speed and error count). Based on our findings, we also propose several design guidelines for gestural interaction.





Published In

Human-centric Computing and Information Sciences, Volume 9, Issue 1
December 2019
898 pages
ISSN: 2192-1962
EISSN: 2192-1962

Publisher

Springer-Verlag, Berlin, Heidelberg

Publication History

Published: 10 December 2019
Accepted: 03 December 2019
Received: 31 July 2019

Author Tags

  1. Immersive virtual reality
  2. Elicitation study
  3. User-defined gestures
  4. Online shopping
  5. User experience

Qualifiers

  • Research-article



Cited By

  • (2024) Do users desire gestures for in-vehicle interaction? Towards the subjective assessment of gestures in a high-fidelity driving simulator. Computers in Human Behavior, 156:C. https://doi.org/10.1016/j.chb.2024.108189. Online publication date: 9 Jul 2024.
  • (2023) Creating a Customer Journey for Immersive Virtual Reality Shopping Environments: Investigating Customer Touchpoints and Purchase Phases. Proceedings of the 35th Australian Computer-Human Interaction Conference, pp. 294–305. https://doi.org/10.1145/3638380.3638383. Online publication date: 2 Dec 2023.
  • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys, 56:5, pp. 1–55. https://doi.org/10.1145/3636458. Online publication date: 7 Dec 2023.
  • (2023) A Community-Based Exploration Into Gesture-driven Locomotion Control in Virtual Reality. Proceedings of the 4th African Human Computer Interaction Conference, pp. 180–189. https://doi.org/10.1145/3628096.3629055. Online publication date: 27 Nov 2023.
  • (2023) Grab It, While You Can: A VR Gesture Evaluation of a Co-Designed Traditional Narrative by Indigenous People. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, pp. 1–13. https://doi.org/10.1145/3544548.3580894. Online publication date: 19 Apr 2023.
  • (2023) Immersive and desktop virtual reality in virtual fashion stores: a comparison between shopping experiences. Virtual Reality, 27:3, pp. 2281–2296. https://doi.org/10.1007/s10055-023-00806-y. Online publication date: 16 May 2023.
  • (2023) Using Virtual Reality to Overcome Legacy Bias in Remote Gesture Elicitation Studies. Human-Computer Interaction, pp. 200–225. https://doi.org/10.1007/978-3-031-35596-7_14. Online publication date: 23 Jul 2023.
  • (2022) Influence of Digital Technology on Ideological and Political Education in Colleges and Universities under 5G Era. Scientific Programming, 2022. https://doi.org/10.1155/2022/9004385. Online publication date: 1 Jan 2022.
  • (2022) 3DeformR. Proceedings of the 13th ACM Multimedia Systems Conference, pp. 52–61. https://doi.org/10.1145/3524273.3528180. Online publication date: 14 Jun 2022.
  • (2022) Designing Gestures for Digital Musical Instruments: Gesture Elicitation Study with Deaf and Hard of Hearing People. Proceedings of the 24th International ACM SIGACCESS Conference on Computers and Accessibility, pp. 1–8. https://doi.org/10.1145/3517428.3544828. Online publication date: 23 Oct 2022.
