DOI: 10.1145/274497.274505
Article
Free access

The use of gestures in multimodal input

Published: 01 January 1998

Abstract

For users with motion impairments, the standard keyboard and mouse arrangement for computer access often presents problems, so alternative input methods have to be adopted.
In this paper, we describe the development of a prototype multimodal input system based on two gestural input channels and present results from extensive user trials of the system. These trials showed that the physical and cognitive loads on the user can quickly become excessive and detrimental to the interaction. Designers of multimodal input systems need to be aware of this and should perform regular user trials to minimize the problem.
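
This page does not reproduce the prototype's design, but the abstract's core idea, fusing two gestural input channels into a single interaction, can be sketched minimally. The sketch below is entirely hypothetical: the channel names ("head", "hand"), the GestureEvent shape, and the fuse loop are assumptions for illustration, not the authors' implementation. One channel drives the pointer; the other issues a command.

    # Hypothetical sketch (Python) of fusing two gestural input channels,
    # in the spirit of the system the abstract describes. The actual
    # prototype's architecture is not specified on this page.
    import queue
    from dataclasses import dataclass

    @dataclass
    class GestureEvent:
        channel: str    # "head" or "hand" - assumed channel names
        kind: str       # "move" (pointing) or "select" (command)
        payload: tuple  # (dx, dy) for "move", () for "select"

    def fuse(events: queue.Queue) -> None:
        # Merge both channels into one pointer state. A real system
        # would add recognition, smoothing, and pacing, since the
        # paper's trials found that user load grows quickly.
        x, y = 0, 0
        while True:
            ev = events.get()
            if ev.kind == "move":
                dx, dy = ev.payload
                x, y = x + dx, y + dy
            elif ev.kind == "select":
                print(f"select at ({x}, {y})")
                break  # stop after one command in this sketch

    q: queue.Queue = queue.Queue()
    q.put(GestureEvent("head", "move", (5, -2)))
    q.put(GestureEvent("hand", "select", ()))
    fuse(q)

Run as-is, the example consumes the two queued events and reports a selection at (5, -2); the break keeps the sketch from blocking on an empty queue.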




Published In

Assets '98: Proceedings of the third international ACM conference on Assistive technologies
January 1998
209 pages
ISBN: 1581130201
DOI: 10.1145/274497
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. gesture recognition
  2. multimodal input
  3. user trials

Qualifiers

  • Article

Conference

ASSETS98: The 3rd ACM SIGCAPH Conference on Assistive Technologies
April 15-17, 1998
Marina del Rey, California, USA

Acceptance Rates

Overall acceptance rate: 436 of 1,556 submissions, 28%


Bibliometrics & Citations

Article Metrics

  • Downloads (last 12 months): 51
  • Downloads (last 6 weeks): 9
Reflects downloads up to 11 Jan 2025

Cited By

  • (2024) GoalTrack: Supporting Personalized Goal-Setting in Stroke Rehabilitation with Multimodal Activity Journaling. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(4), 1-29. DOI: 10.1145/3699723. Online publication date: 21-Nov-2024.
  • (2023) Bring-Your-Own Input: Context-Aware Multi-Modal Input for More Accessible Virtual Reality. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1-5. DOI: 10.1145/3544549.3577056. Online publication date: 19-Apr-2023.
  • (2022) Understanding How People with Limited Mobility Use Multi-Modal Input. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3491102.3517458. Online publication date: 29-Apr-2022.
  • (2021) A User-based Mid-air Hand Gesture Set for Spreadsheets. Proceedings of the Asian CHI Symposium 2021, 122-128. DOI: 10.1145/3429360.3468193. Online publication date: 8-May-2021.
  • (2021) It’s a Joint Effort: Understanding Speech and Gesture in Collaborative Tasks. Human-Computer Interaction. Interaction Techniques and Novel Applications, 159-178. DOI: 10.1007/978-3-030-78465-2_13. Online publication date: 3-Jul-2021.
  • (2019) Approach for Intuitive and Touchless Interaction in the Operating Room. J, 2(1), 50-64. DOI: 10.3390/j2010005. Online publication date: 23-Jan-2019.
  • (2018) Collaborative Immersive Analytics. Immersive Analytics, 221-257. DOI: 10.1007/978-3-030-01388-2_8. Online publication date: 16-Oct-2018.
  • (2016) User centered gesture development for smart lighting. Proceedings of HCI Korea, 146-150. DOI: 10.17210/hcik.2016.01.146. Online publication date: 27-Jan-2016.
  • (2014) Wearable and Non-Invasive Assistive Technologies. Wearable Sensors, 563-590. DOI: 10.1016/B978-0-12-418662-0.00009-X. Online publication date: 2014.
  • (2013) A Dual-Mode Human Computer Interface Combining Speech and Tongue Motion for People with Severe Disabilities. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 21(6), 979-991. DOI: 10.1109/TNSRE.2013.2248748. Online publication date: Nov-2013.
