Abstract
In this work, we present the results of a comparative user study evaluating multimodal user interaction in two different operation scenarios: a desktop Virtual-Reality application (DVA) and an automotive infotainment application (AIA). In addition to classical tactile input devices, such as a touch screen and a key console, both systems can be controlled by natural speech as well as by hand and head gestures. In both domains, we found that experts tend to use the tactile devices, whereas average users and beginners prefer combinations of the more advanced input modalities. Complementary actions occurred most often in the DVA, whereas in the AIA, redundant input clearly dominates the set of multimodal interactions. Concerning time relations, the individual interaction length of speech- and gesture-based input was below 1.5 seconds on average, and staggered intermodal overlap occurred most often. Additionally, we observed that the test users tend to stay within a chosen interaction form. With regard to overall subjective user experience, the interfaces were rated very positively.
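The abstract distinguishes temporal integration patterns between the unimodal parts of a multimodal command, with staggered overlap reported as the most frequent case. As a minimal illustrative sketch (not the authors' actual analysis tooling, whose logging format is not described here), the following Python snippet shows one way such time relations could be classified from hypothetical timestamped input events:

```python
from dataclasses import dataclass

# Hypothetical event record for illustration only; the study's real log
# format and modality labels are assumptions, not taken from the paper.
@dataclass
class ModalityEvent:
    modality: str   # e.g. "speech", "hand_gesture", "head_gesture", "touch"
    start: float    # onset time in seconds
    end: float      # offset time in seconds

def temporal_relation(a: ModalityEvent, b: ModalityEvent) -> str:
    """Classify how two unimodal inputs of one multimodal command relate in time."""
    first, second = (a, b) if a.start <= b.start else (b, a)
    if second.start >= first.end:
        return "sequential"        # no temporal overlap between the inputs
    if second.end <= first.end:
        return "nested"            # the later input lies entirely inside the earlier one
    return "staggered overlap"     # partial overlap with shifted onsets and offsets

# Example: speech slightly precedes and overlaps a pointing gesture.
speech = ModalityEvent("speech", 0.0, 1.2)
gesture = ModalityEvent("hand_gesture", 0.6, 1.5)
print(temporal_relation(speech, gesture))  # -> "staggered overlap"
```

In this reading, "staggered intermodal overlap" corresponds to the last case: both inputs are active at the same time for part of their duration, but neither fully contains the other.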