Characterizing the Coordinating Behaviors in a Three vs. One Ball Possession Game using Gaze Interaction

Published: 03 November 2020

Abstract

Coordinating behavior in group activities is a fundamental skill in social life. However, people with impairments and disabilities may have difficulty performing whole-body movements and manipulating devices; consequently, their opportunities for coordinating behaviors are considerably reduced, especially in virtual environments. In this study, we adopt gaze interaction as the primary interaction method and investigate coordinating behaviors in a virtual environment. We implement a three vs. one ball possession game controlled entirely by gaze. As a preliminary study, we examine the coordinating behavioral patterns of game players without impairments or disabilities. Our experimental results demonstrate that a three vs. one ball possession game can be played using gaze interaction alone. We also observe the distribution of angles among the three players, with modes at 56, 72, and 50 degrees for players 1, 2, and 3, respectively, which reveals the approximate shape of the triangle formed by the players. With respect to the skill levels of real-world players, the resulting triangle allowed us to categorize our players as intermediate level.
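To make the reported angle analysis concrete, the sketch below shows one way such angles could be computed from logged player positions. This is a minimal illustration and not the authors' implementation: it assumes each player's position is recorded as a 2D coordinate per frame, and the names interior_angles and angle_mode are hypothetical.

```python
import numpy as np

def interior_angles(p1, p2, p3):
    """Interior angles (degrees) of the triangle formed by three player positions."""
    pts = [np.asarray(p, dtype=float) for p in (p1, p2, p3)]
    angles = []
    for i in range(3):
        # Angle at vertex pts[i], between the vectors toward the other two players.
        a, b, c = pts[i], pts[(i + 1) % 3], pts[(i + 2) % 3]
        v1, v2 = b - a, c - a
        cos_t = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        angles.append(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))
    return angles  # the three values sum to ~180 degrees

def angle_mode(angle_series, bin_width=1.0):
    """Modal angle of one player across frames, via a fixed-width histogram."""
    counts, edges = np.histogram(angle_series,
                                 bins=np.arange(0.0, 180.0 + bin_width, bin_width))
    return edges[np.argmax(counts)]

# Example: one frame of made-up 2D positions on the virtual court.
print(interior_angles((0.0, 0.0), (4.0, 0.0), (1.0, 3.0)))
```

Clipping the cosine keeps floating-point round-off from leaving the domain of arccos, and the one-degree histogram is one simple way to extract a modal angle per player over a trial, in the spirit of the per-player modes reported in the abstract.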

Supplementary Material

SRT File (cpwp1020vfc.srt): Video figure captions
MP4 File (cpwp1020vf.mp4): Supplemental video


Cited By

  • (2023) Leap to the Eye: Implicit Gaze-based Interaction to Reveal Invisible Objects for Virtual Environment Exploration. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 214-222. https://doi.org/10.1109/ISMAR59233.2023.00036. Online publication date: 16 October 2023.




Information

Published In

CHI PLAY '20: Extended Abstracts of the 2020 Annual Symposium on Computer-Human Interaction in Play
November 2020
435 pages
ISBN:9781450375870
DOI:10.1145/3383668
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 03 November 2020


Author Tags

  1. coordinating behaviors in virtual space
  2. eye-tracking
  3. gaze control
  4. gaze interaction

Qualifiers

  • Short-paper

Conference

CHI PLAY '20

Acceptance Rates

Overall Acceptance Rate 421 of 1,386 submissions, 30%


