DOI: 10.1145/3526114.3558691

Poster

Towards using Involuntary Body Gestures for Measuring the User Engagement in VR Gaming

Published: 28 October 2022

Abstract

Understanding the degree of user engagement in a VR game is vital to providing a better gaming experience. While prior work has suggested self-reports and biological-signal-based methods, measuring game engagement remains a challenge due to its complex nature. In this work, we provide a preliminary exploration of using involuntary body gestures to measure user engagement in VR gaming. Based on data collected from 27 participants playing multiple VR games, we demonstrate a relationship between foot-gesture-based models for measuring arousal and physiological responses while engaging in VR games. Our findings show the possibility of using involuntary body gestures to measure engagement.


Cited By

  • (2024) EmoFoot: Can Your Foot Tell How You Feel when Playing Virtual Reality Games? In Adjunct Proceedings of the 26th International Conference on Mobile Human-Computer Interaction, 1–6. DOI: 10.1145/3640471.3680234. Online publication date: 21 September 2024.
  • (2024) Towards Understanding Player Experience in Virtual Reality Games through Physiological Computing. In 2024 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), 405–408. DOI: 10.1109/PerComWorkshops59983.2024.10503181. Online publication date: 11 March 2024.
  • (2023) Exploring User Engagement in Immersive Virtual Reality Games through Multimodal Body Movements. In Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology, 1–8. DOI: 10.1145/3611659.3615687. Online publication date: 9 October 2023.


    Published In

    UIST '22 Adjunct: Adjunct Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology
    October 2022
    413 pages
    ISBN:9781450393218
    DOI:10.1145/3526114
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Body Gestures
    2. Component Process Model
    3. Emotions
    4. Foot Gestures
    5. Gaming
    6. IMU
    7. Play
    8. Stress
    9. VR
    10. VR Games

    Qualifiers

    • Poster
    • Research
    • Refereed limited

    Conference

    UIST '22

    Acceptance Rates

    Overall Acceptance Rate 355 of 1,733 submissions, 20%



Article Metrics

  • Downloads (last 12 months): 49
  • Downloads (last 6 weeks): 5

Reflects downloads up to 31 December 2024.

