Knowing where and when to look in a time-critical multimodal dual task

Published: 10 April 2010

Abstract

Human-computer systems intended for time-critical multitasking need to be designed with an understanding of how humans can coordinate and interleave perceptual, memory, and motor processes. This paper presents human performance data for a highly practiced time-critical dual task. In the first of the two interleaved tasks, participants tracked a target with a joystick. In the second, participants keyed in responses to objects moving across a radar display. Task manipulations included the peripheral visibility of the secondary display (visible or not) and the presence or absence of auditory cues to assist with the radar task. Eye movement analyses reveal extensive coordination and overlapping of human information processes, and the extent to which the task manipulations helped or hindered dual task performance. For example, auditory cues helped only a little when the secondary display was peripherally visible, but they helped a lot when it was not.

Supplementary Material

JPG File (p2103.jpg)
MOV File (p2103.mov)



Reviews

James H. Bradford

We live in a multitasking world, and yet we still do not have a clear idea of the design disciplines that best support rapid changes in focus. This paper describes an experiment that evaluates the effectiveness of color, motion, and audio cues in improving high-performance multitasking.

The experiment was based on a military application in which subjects needed to switch their attention between two displays. In the first display, subjects were required to track a moving target with a joystick. In the second, moving color-coded targets needed to be classified, and responses keyed in for each kind of target. The display screens were placed far enough apart that subjects had to switch their visual focus between the screens in order to accomplish both tasks. Because both displays featured moving targets, subjects could not focus for long on either screen.

Subjects showed a marked improvement in performance over several days, which suggests that they continuously refined their multitasking strategies. This phenomenon has not been well studied in the literature, and more research into the design features that support strategic improvement is needed. Other results indicated (not surprisingly) that auditory cues helped subjects decide when to switch attention to a competing task.

The results are intriguing, but we do not know how well findings from a single multitasking environment generalize to other applications. The discipline as a whole needs to do more research of this sort and eventually develop a taxonomy of multitasking applications. This paper will be of interest to scientists conducting basic research on the human factors of effective task switching.

Online Computing Reviews Service


Published In

CHI '10: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2010, 2690 pages
ISBN: 9781605589299
DOI: 10.1145/1753326


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. auditory displays
  2. cognitive strategies
  3. eye tracking
  4. multimodal
  5. multitasking
  6. visual displays

Qualifiers

  • Research-article

Conference

CHI '10

Acceptance Rates

Overall Acceptance Rate 6,199 of 26,314 submissions, 24%



Cited By

  • (2024) Shake it or light it! The effects of cueing in desktop-VR learning environments on search time and learning. Journal of Computer Assisted Learning, 40(3), 1201-1217. DOI: 10.1111/jcal.12945
  • (2022) Spatial Audio for Multimodal Location Monitoring. Interacting with Computers, 33(5), 564-582. DOI: 10.1093/iwc/iwac009
  • (2017) COMMIT. Proceedings of the 8th ACM on Multimedia Systems Conference, 349-360. DOI: 10.1145/3083187.3084022
  • (2017) Dividing Attention Between Tasks: Testing Whether Explicit Payoff Functions Elicit Optimal Dual-Task Performance. Cognitive Science, 42(3), 820-849. DOI: 10.1111/cogs.12513
  • (2016) Effect of Redundant Haptic Information on Task Performance during Visuo-Tactile Task Interruption and Recovery. Frontiers in Psychology, 7. DOI: 10.3389/fpsyg.2016.01924
  • (2014) Supporting Novice to Expert Transitions in User Interfaces. ACM Computing Surveys, 47(2), 1-36. DOI: 10.1145/2659796
  • (2014) HCI over multiple screens. CHI '14 Extended Abstracts on Human Factors in Computing Systems, 665-674. DOI: 10.1145/2559206.2578869
  • (2014) Understanding multitasking through parallelized strategy exploration and individualized cognitive modeling. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 3885-3894. DOI: 10.1145/2556288.2557351
  • (2013) Sound sample detection and numerosity estimation using auditory display. ACM Transactions on Applied Perception, 10(1), 1-18. DOI: 10.1145/2422105.2422109
  • (2011) Sensing cognitive multitasking for a brain-based adaptive user interface. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 383-392. DOI: 10.1145/1978942.1978997
