DOI: 10.1145/2593968.2610455
short-paper

Affective communication aid using wearable devices based on biosignals

Published: 17 June 2014

Abstract

We propose a novel wearable interface for sharing facial expressions between children with autism spectrum disorders (ASD) and their parents, therapists, and caregivers. The developed interface recognizes facial expressions from patterns in facial bioelectrical signals and displays the results in real time. The physiological signals are measured from the forehead and both sides of the head. We verified that the proposed classification method is robust against facial movements, blinking, and head posture. This compact interface can support the perception of facial expressions between children with ASD and the people around them, helping to improve their communication.
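The abstract does not spell out the recognition pipeline, so the sketch below is only a rough, hypothetical illustration of how a system of this general shape could be assembled: a few distal facial-EMG channels (forehead and both sides of the head, per the abstract), windowed root-mean-square features, and an off-the-shelf classifier. The sampling rate, window length, channel count, classifier choice, and the synthetic training data are all assumptions for illustration, not the method reported in the paper.

```python
# Minimal sketch (not the authors' implementation): classify facial expressions
# from a few bioelectrical (EMG-like) channels using windowed RMS features and
# a standard SVM. All parameters and data below are illustrative assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 1000        # assumed sampling rate (Hz)
WINDOW = 200     # 200 ms analysis window (assumed)
CHANNELS = 3     # forehead + left/right side of the head, per the abstract

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel for one window of shape (WINDOW, CHANNELS)."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def segment(signal: np.ndarray, step: int = WINDOW // 2) -> np.ndarray:
    """Slice a continuous multichannel recording into overlapping feature windows."""
    starts = range(0, len(signal) - WINDOW + 1, step)
    return np.stack([rms_features(signal[s:s + WINDOW]) for s in starts])

# --- training on labelled recordings (synthetic stand-in data) --------------
rng = np.random.default_rng(0)
smile_rec = rng.normal(0.0, 1.5, size=(FS * 10, CHANNELS))    # stand-in "smile" EMG
neutral_rec = rng.normal(0.0, 0.5, size=(FS * 10, CHANNELS))  # stand-in "neutral" EMG

smile_X = segment(smile_rec)
neutral_X = segment(neutral_rec)
X = np.vstack([smile_X, neutral_X])
y = np.array([1] * len(smile_X) + [0] * len(neutral_X))

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# --- real-time use: classify each incoming window as it arrives -------------
incoming = rng.normal(0.0, 1.5, size=(WINDOW, CHANNELS))
label = clf.predict(rms_features(incoming).reshape(1, -1))[0]
print("smile" if label else "neutral")
```

In an actual device the windows would come from the EMG front end rather than synthetic arrays, and the classifier would be trained on labelled recordings of the expressions to be shared; the classification result would then drive the wearable display described in the paper.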

Published In

IDC '14: Proceedings of the 2014 conference on Interaction design and children
June 2014
378 pages
ISBN: 9781450322720
DOI: 10.1145/2593968
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. autism spectrum disorder
  2. facial expression
  3. smile sharing

Qualifiers

  • Short-paper

Conference

IDC '14: Interaction Design and Children 2014
June 17-20, 2014
Aarhus, Denmark

Acceptance Rates

IDC '14 paper acceptance rate: 18 of 60 submissions, 30%.
Overall acceptance rate: 172 of 578 submissions, 30%.

Article Metrics

  • Downloads (last 12 months): 27
  • Downloads (last 6 weeks): 4
Reflects downloads up to 11 Dec 2024

Cited By

  • (2022) Consistent Smile Intensity Estimation from Wearable Optical Sensors. 2022 10th International Conference on Affective Computing and Intelligent Interaction (ACII), pp. 1-8. DOI: 10.1109/ACII55700.2022.9953867. Online publication date: 18-Oct-2022.
  • (2021) Faces Don’t Lie: Analysis of Children’s Facial expressions during Collaborative Coding. FabLearn Europe / MakeEd 2021 - An International Conference on Computing, Design and Making in Education, pp. 1-10. DOI: 10.1145/3466725.3466757. Online publication date: 2-Jun-2021.
  • (2021) Information flow and children’s emotions during collaborative coding: A causal analysis. Proceedings of the 20th Annual ACM Interaction Design and Children Conference, pp. 350-362. DOI: 10.1145/3459990.3460731. Online publication date: 24-Jun-2021.
  • (2021) Smile Action Unit detection from distal wearable Electromyography and Computer Vision. 2021 16th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2021), pp. 1-8. DOI: 10.1109/FG52635.2021.9667047. Online publication date: 15-Dec-2021.
  • (2019) Sock-Type Wearable Sensor for Estimating Lower Leg Muscle Activity Using Distal EMG Signals. Sensors, 19(8):1954. DOI: 10.3390/s19081954. Online publication date: 25-Apr-2019.
  • (2019) Human perception and biosignal-based identification of posed and spontaneous smiles. PLOS ONE, 14(12):e0226328. DOI: 10.1371/journal.pone.0226328. Online publication date: 12-Dec-2019.
  • (2019) Joint Emotional State of Children and Perceived Collaborative Experience in Coding Activities. Proceedings of the 18th ACM International Conference on Interaction Design and Children, pp. 133-145. DOI: 10.1145/3311927.3323145. Online publication date: 12-Jun-2019.
  • (2019) The Invisible Potential of Facial Electromyography. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, pp. 1-9. DOI: 10.1145/3290605.3300379. Online publication date: 2-May-2019.
  • (2019) Emotion recognition system for autism disordered people. Journal of Ambient Intelligence and Humanized Computing. DOI: 10.1007/s12652-019-01492-y. Online publication date: 14-Sep-2019.
  • (2018) From Research to Practice. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pp. 1-16. DOI: 10.1145/3173574.3173676. Online publication date: 21-Apr-2018.
