DOI: 10.1145/985921.985998
Article

Active eye contact for human-robot communication

Published: 24 April 2004

Abstract

Eye contact is an effective means of controlling communication between humans, for example to initiate an interaction. It may seem that eye contact is established as soon as two people look at each other; however, this alone is not sufficient. Each person must also be aware of being looked at by the other. We propose a method of active eye contact for human-robot communication that satisfies both conditions. The robot changes its facial expressions according to its observations of the human in order to make eye contact. We then present a robot that recognizes hand gestures after establishing eye contact with the human, demonstrating the effectiveness of eye contact as a means of controlling communication.
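The interaction loop described in the abstract can be read as a small state machine: the robot looks at the human, signals awareness through a facial expression, treats eye contact as established only once the human acknowledges that signal, and only then accepts hand gestures. The sketch below illustrates that reading; all state names, inputs, and transitions are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the active eye-contact loop: eye contact requires
# both that the human looks at the robot AND that the human is aware of
# being looked at (acknowledges the robot's expression change).
from enum import Enum, auto

class State(Enum):
    IDLE = auto()         # robot not engaged with anyone
    GAZING = auto()       # robot has turned its gaze to the human's face
    SIGNALING = auto()    # robot changes expression to show it is looking
    EYE_CONTACT = auto()  # mutual awareness established

def step(state: State, human_looking: bool, human_acknowledged: bool) -> State:
    """One transition of the (assumed) eye-contact state machine."""
    if state is State.IDLE and human_looking:
        return State.GAZING
    if state is State.GAZING and human_looking:
        return State.SIGNALING        # e.g. raise eyebrows, change expression
    if state is State.SIGNALING and human_acknowledged:
        return State.EYE_CONTACT      # both conditions now hold
    if not human_looking:
        return State.IDLE             # contact broken; start over
    return state

def gestures_enabled(state: State) -> bool:
    # Gesture recognition is gated on established eye contact.
    return state is State.EYE_CONTACT
```

Gating gesture recognition on the `EYE_CONTACT` state reflects the paper's claim that eye contact serves as a means of controlling communication: gestures performed before mutual awareness is reached are simply ignored.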


Published In

CHI EA '04: CHI '04 Extended Abstracts on Human Factors in Computing Systems
April 2004, 975 pages
ISBN: 1581137036
DOI: 10.1145/985921

Publisher: Association for Computing Machinery, New York, NY, United States

Author Tags

  1. embodied agent
  2. eye contact
  3. gaze
  4. gesture recognition
  5. human-robot interface
  6. nonverbal behavior


Acceptance Rates

Overall acceptance rate: 6,164 of 23,696 submissions (26%)

Cited By

  • (2024) Enhancing emotional expression in cat-like robots: strategies for utilizing tail movements with human-like gazes. Frontiers in Robotics and AI 11. DOI: 10.3389/frobt.2024.1399012. Published 15 Jul 2024.
  • (2022) Enhancing therapeutic processes in videoconferencing psychotherapy: a qualitative study on psychologists' technological perspective. JMIR Formative Research. DOI: 10.2196/40542. Published 25 Jun 2022.
  • (2020) Design of Effective Robotic Gaze-Based Social Cueing for Users in Task-Oriented Situations: How to Overcome In-Attentional Blindness? Applied Sciences 10, 16 (5413). DOI: 10.3390/app10165413. Published 5 Aug 2020.
  • (2020) A Perspective on Client-Psychologist Relationships in Videoconferencing Psychotherapy: A Literature Review. JMIR Mental Health. DOI: 10.2196/19004. Published 31 Mar 2020.
  • (2020) The Human–Robot Interaction in Robot-Aided Medical Care. Human Centred Intelligent Systems, 233-242. DOI: 10.1007/978-981-15-5784-2_19. Published 30 May 2020.
  • (2017) Familiar and Strange: Gender, Sex, and Love in the Uncanny Valley. Multimodal Technologies and Interaction 1, 1 (2). DOI: 10.3390/mti1010002. Published 4 Jan 2017.
  • (2016) Introduction. Data Mining for Social Robotics, 1-31. DOI: 10.1007/978-3-319-25232-2_1. Published 9 Jan 2016.
  • (2014) Initiating interactions in order to get help: Effects of social framing on people's responses to robots' requests for assistance. The 23rd IEEE International Symposium on Robot and Human Interactive Communication, 999-1005. DOI: 10.1109/ROMAN.2014.6926383. Published Aug 2014.
  • (2013) The acoustics of eye contact. Proceedings of the 6th workshop on Eye gaze in intelligent human machine interaction: gaze in multimodal interaction, 7-12. DOI: 10.1145/2535948.2535949. Published 13 Dec 2013.
  • (2013) Recognition of request through hand gesture for mobile care robots. IECON 2013 - 39th Annual Conference of the IEEE Industrial Electronics Society, 8312-8316. DOI: 10.1109/IECON.2013.6700525. Published Nov 2013.
