DOI: 10.1145/3125739.3125745
Research article · Open access

Designing Elements for a Gaze Sensitive Object: Meet the CoffeePet

Published: 27 October 2017

Abstract

It is not difficult to design an object that is sensitive to our gaze; the challenging part is to make the user realize it. Since people are not used to interacting with an object by simply moving their eyes (we use our eyes to see, not to trigger something), the interaction itself is unfamiliar to them. Based on the gaze behavior of people socializing with others, we believe that the feeling of being looked back at when interacting with a gaze sensitive object is crucial to overcome this unfamiliarity and to make people naturally realize that the object is sensitive to their gaze behavior. To achieve this feeling, we conclude that eyes need to be present in the user's view. In this paper, we present an anthropomorphized coffee machine called the CoffeePet, fitted with two small OLED screens that display animated eyes. These eyes are responsive to the user's gaze behavior. Furthermore, the CoffeePet automatically starts to brew a drink if the user maintains prolonged eye contact with it. In three experiments, we investigated the impact of the animated eyes in helping participants become familiar with the CoffeePet and realize that their gaze behavior influences how it reacts. Without being told how to interact with the CoffeePet, participants were randomly assigned to one of three conditions: 1) CoffeePet with watching eyes (eyes with direct gaze), 2) CoffeePet with an interactive gaze model, and 3) CoffeePet with interactive gaze following. The results showed that interactive shared gaze did, in fact, lead participants to become familiar with the object and to realize that they can interact with it by simply moving their eyes.
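The brewing trigger the abstract describes is essentially a dwell-time mechanism: an action fires only once eye contact has been held continuously for some interval. The paper does not publish its implementation, so the following Python sketch is purely illustrative; the 2-second threshold, the boolean gaze signal, and the class name are all assumptions.

```python
DWELL_SECONDS = 2.0  # assumed threshold for "prolonged eye contact"

class EyeContactTrigger:
    """Hypothetical dwell-time trigger for a gaze sensitive object."""

    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self.contact_since = None  # timestamp when eye contact began

    def update(self, looking_at_device, now):
        """Feed one gaze sample; return True once dwell time is reached."""
        if not looking_at_device:
            self.contact_since = None  # contact broken: reset the timer
            return False
        if self.contact_since is None:
            self.contact_since = now  # contact just established
        return (now - self.contact_since) >= self.dwell

# Simulated gaze samples at 10 Hz: look away briefly, then hold contact.
trigger = EyeContactTrigger()
samples = [False] * 5 + [True] * 25
fired = [trigger.update(s, t * 0.1) for t, s in enumerate(samples)]
print(fired.index(True))  # → 25: first sample at which brewing would start
```

Resetting the timer on any break in contact is what distinguishes deliberate, sustained eye contact from a passing glance, which is the familiarity cue the experiments probe.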


Index Terms

  1. Designing Elements for a Gaze Sensitive Object: Meet the CoffeePet

    Recommendations

    Comments

    Please enable JavaScript to view thecomments powered by Disqus.

    Information & Contributors

    Information

Published In

HAI '17: Proceedings of the 5th International Conference on Human Agent Interaction
October 2017, 550 pages
ISBN: 9781450351133
DOI: 10.1145/3125739


Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. anthropomorphism
    2. emotional connection
    3. eye tracking
    4. gaze sensitive object
    5. human-object interaction


    Funding Sources

    • Ministry of Higher Education Malaysia

Conference

HAI '17

    Acceptance Rates

    Overall Acceptance Rate 121 of 404 submissions, 30%

