
Multimodal multiplayer tabletop gaming

Published: 01 April 2007

Abstract

There is a large disparity between the rich physical interfaces of co-located arcade games and the generic input devices seen in most home console systems. In this article we argue that a digital table is a conducive form factor for general co-located home gaming, as it affords: (a) seating in collaboratively relevant positions that give all players an equal opportunity to reach into the surface and share a common view; (b) rich whole-handed gesture input usually seen only when handling physical objects; (c) the ability to monitor how others use space and access objects on the surface; and (d) the ability to communicate with each other and interact on top of the surface via gestures and verbal utterances. Our thesis is that multimodal gesture and speech input benefits collaborative interaction over such a digital table. To investigate this thesis, we designed a multimodal, multiplayer gaming environment that allows players to interact directly atop a digital table via speech and rich whole-hand gestures. We transform two commercial single-player computer games, representing the strategy and simulation game genres, to work within this setting.
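The abstract describes an architecture in which speech and whole-hand gestures are fused into commands that drive otherwise unmodified single-player games on the tabletop. As a rough illustration only, the Python sketch below shows one way such speech-and-gesture fusion could work; the event classes, the command vocabulary, and the timing threshold are assumptions made for this sketch and are not taken from the article.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical event records; a real system would receive these from a
# speech recognizer and from a multi-user touch table (e.g. DiamondTouch).
@dataclass
class SpeechEvent:
    player: int
    phrase: str   # recognized phrase, e.g. "move here"
    t: float      # time of recognition, in seconds

@dataclass
class GestureEvent:
    player: int
    kind: str     # e.g. "point", "flat-hand"
    x: float      # touch position in normalized table coordinates (0..1)
    y: float
    t: float

@dataclass
class GameCommand:
    player: int
    action: str
    x: float
    y: float

# Illustrative speech+gesture vocabulary; the phrases and actions are
# invented for this sketch, not taken from the article.
COMMANDS = {
    ("move here", "point"): "move_unit",
    ("attack", "point"): "attack_location",
    ("pan", "flat-hand"): "scroll_map",
}

def fuse(speech: SpeechEvent, gesture: GestureEvent,
         max_skew_s: float = 1.5) -> Optional[GameCommand]:
    """Fuse one speech phrase with one gesture from the same player.

    The pair is accepted only if both events come from the same player,
    occur within max_skew_s seconds of each other, and match an entry in
    the command vocabulary.
    """
    if speech.player != gesture.player:
        return None
    if abs(speech.t - gesture.t) > max_skew_s:
        return None
    action = COMMANDS.get((speech.phrase, gesture.kind))
    if action is None:
        return None
    return GameCommand(speech.player, action, gesture.x, gesture.y)

if __name__ == "__main__":
    s = SpeechEvent(player=1, phrase="move here", t=10.2)
    g = GestureEvent(player=1, kind="point", x=0.42, y=0.77, t=10.6)
    print(fuse(s, g))  # GameCommand(player=1, action='move_unit', ...)
```

A wrapper around each game would then translate a fused command into the mouse and keyboard input the single-player game already understands; that translation is game-specific and is not shown here.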





    Published In

    Computers in Entertainment, Volume 5, Issue 2: Interactive TV
    April/June 2007, 156 pages
    EISSN: 1544-3574
    DOI: 10.1145/1279540
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 01 April 2007
    Accepted: 01 June 2006
    Received: 01 April 2006
    Published in CIE Volume 5, Issue 2


    Author Tags

    1. computer supported cooperative work
    2. multimodal speech and gesture interfaces
    3. tabletop interaction
    4. visual-spatial displays

    Qualifiers

    • Research-article
    • Research
    • Refereed


    Cited By

    • (2023) Controlling Your Voice in a Shouting Match: A Preliminary Study on Fostering Self-Moderation among Gamers through Embodied Play. Companion Proceedings of the Annual Symposium on Computer-Human Interaction in Play, 29-35. DOI: 10.1145/3573382.3616072. Online publication date: 6-Oct-2023.
    • (2022) Reducing the Cognitive Load of Playing a Digital Tabletop Game with a Multimodal Interface. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3491102.3502062. Online publication date: 29-Apr-2022.
    • (2021) Multicraft: A Multimodal Interface for Supporting and Studying Learning in Minecraft. HCI in Games: Serious and Immersive Games, 113-131. DOI: 10.1007/978-3-030-77414-1_10. Online publication date: 24-Jul-2021.
    • (2021) Emerging Applications. Touch-Based Human-Machine Interaction, 179-229. DOI: 10.1007/978-3-030-68948-3_7. Online publication date: 26-Mar-2021.
    • (2020) Geno. Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology, 1169-1181. DOI: 10.1145/3379337.3415848. Online publication date: 20-Oct-2020.
    • (2019) Minuet: Multimodal Interaction with an Internet of Things. Symposium on Spatial User Interaction, 1-10. DOI: 10.1145/3357251.3357581. Online publication date: 19-Oct-2019.
    • (2019) Modeling Multimodal-Multiuser Interactions in Declarative Multimedia Languages. Proceedings of the ACM Symposium on Document Engineering 2019, 1-10. DOI: 10.1145/3342558.3345400. Online publication date: 23-Sep-2019.
    • (2019) MIMOSE: multimodal interaction for music orchestration sheet editors. Multimedia Tools and Applications 78(23), 33041-33068. DOI: 10.1007/s11042-019-07838-0. Online publication date: 21-Jun-2019.
    • (2017) Technological Tools and Interventions to Enhance Learning in Children with Autism. Supporting the Education of Children with Autism Spectrum Disorders, 204-224. DOI: 10.4018/978-1-5225-0816-8.ch011. Online publication date: 2017.
    • (2016) Investigating the Impact of Cooperative Communication Mechanics on Player Performance in Portal 2. Proceedings of the 42nd Graphics Interface Conference, 41-48. DOI: 10.5555/3076132.3076143. Online publication date: 1-Jun-2016.
