
RealTourist: a study of augmenting human-human and human-computer dialogue with eye-gaze overlay

Published: 12 September 2005

Abstract

We developed and studied an experimental system, RealTourist, which lets a user plan a conference trip with the help of a remote tourist consultant who could view the tourist's eye-gaze superimposed onto a shared map. Data collected from the experiment were analyzed in conjunction with a review of the literature on speech and eye-gaze patterns. This exploratory research identified several functions of gaze overlay on shared spatial material, including accurate and direct display of the partner's eye-gaze, implicit deictic referencing, interest detection, common focus and topic switching, increased redundancy and reduced ambiguity, and greater assurance, confidence, and understanding. The study serves two purposes. The first is to identify patterns that can serve as a basis for designing multimodal human-computer dialogue systems with eye-gaze locus as a contributing channel. The second is to investigate how computer-mediated communication can be supported by displaying the partner's eye-gaze.
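
The paper reports findings rather than implementation details, but the mechanism the abstract describes (streaming one partner's gaze onto a shared map) implies a small, well-understood pipeline: raw gaze samples are filtered into fixations, and fixation events are sent to the partner's client for rendering. The Python sketch below illustrates one such pipeline; the dispersion-threshold filter, the threshold values, and the JSON message format are illustrative assumptions, not details taken from RealTourist.

import json
from dataclasses import dataclass
from typing import Iterator, List

@dataclass
class GazeSample:
    t: float   # timestamp, seconds
    x: float   # shared-map coordinates, pixels
    y: float

def fixations(samples: List[GazeSample],
              max_dispersion: float = 30.0,  # pixels; assumed threshold
              min_duration: float = 0.10     # seconds; assumed threshold
              ) -> Iterator[dict]:
    """Dispersion-threshold fixation filter: grow a window of consecutive
    samples while their bounding box stays small; once the window has
    lasted long enough, emit its centroid as a fixation event."""
    i, n = 0, len(samples)
    while i < n:
        xs, ys, j = [], [], i
        while j < n:
            xs.append(samples[j].x)
            ys.append(samples[j].y)
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                xs.pop()  # sample j broke the dispersion limit; exclude it
                ys.pop()
                break
            j += 1
        if j > i and samples[j - 1].t - samples[i].t >= min_duration:
            yield {"type": "fixation",
                   "x": sum(xs) / len(xs),
                   "y": sum(ys) / len(ys),
                   "t": samples[i].t}
            i = j  # continue after the emitted window
        else:
            i += 1

if __name__ == "__main__":
    # Synthetic samples hovering near one map location; the consultant's
    # client would receive each JSON message and draw a ring at (x, y).
    raw = [GazeSample(0.01 * k, 400 + k % 3, 300 + k % 2) for k in range(30)]
    for event in fixations(raw):
        print(json.dumps(event))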


Cited By

  • (2021) Learners Learn More and Instructors Track Better with Real-time Gaze Sharing. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW1), 1-23. DOI: 10.1145/3449208. Online publication date: 22 April 2021.
  • (2019) Gaze-Guided Narratives. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1-12. DOI: 10.1145/3290605.3300721. Online publication date: 2 May 2019.
  • (2018) I see what you see. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, 1-9. DOI: 10.1145/3204493.3204542. Online publication date: 14 June 2018.
  • (2018) An Eye For Design. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 1-12. DOI: 10.1145/3173574.3173923. Online publication date: 21 April 2018.
  • (2017) Gaze-informed multimodal interaction. The Handbook of Multimodal-Multisensor Interfaces, 365-402. DOI: 10.1145/3015783.3015794. Online publication date: 24 April 2017.
  • (2017) An automated approach to estimate human interest. Applied Intelligence, 47(4), 1186-1207. DOI: 10.1007/s10489-017-0947-7. Online publication date: 1 December 2017.
  • (2016) Gaze Augmentation in Egocentric Video Improves Awareness of Intention. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 1573-1584. DOI: 10.1145/2858036.2858127. Online publication date: 7 May 2016.
  • (2016) GazeTorch. Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, 1151-1158. DOI: 10.1145/2851581.2892459. Online publication date: 7 May 2016.
  • (2014) Improving automatic speech recognition through head pose driven visual grounding. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 3235-3238. DOI: 10.1145/2556288.2556957. Online publication date: 26 April 2014.
  • (2010) Using dual eye-tracking to unveil coordination and expertise in collaborative Tetris. Proceedings of the 24th BCS Interaction Specialist Group Conference, 36-44. DOI: 10.5555/2146303.2146309. Online publication date: 6 September 2010.



    Published In

INTERACT'05: Proceedings of the 2005 IFIP TC13 International Conference on Human-Computer Interaction
    September 2005
    1157 pages
ISBN: 3540289437

    Sponsors

    • PHILIPS Research
    • AICA: Associazione Italiana per l'Informatica ed il Calcolo Automatico
    • SAP
• Microsoft Research
• Nokia Connecting People

    Publisher

    Springer-Verlag

    Berlin, Heidelberg


