Incorporating Measures of Intermodal Coordination in Automated Analysis of Infant-Mother Interaction

Published: 22 October 2020
DOI: 10.1145/3382507.3418870

Abstract

Interactions between infants and their mothers can provide meaningful insight into the dyad's health and well-being. Previous work has shown that infant-mother coordination, within a single modality, varies significantly with age and interaction quality. However, as infants are still developing their motor, language, and social skills, they may differ from their mothers in the modes they use to communicate. This work examines how infant-mother coordination across modalities can expand researchers' ability to observe meaningful trends in infant-mother interactions. Using automated feature extraction tools, we analyzed the head position, arm position, and vocal fundamental frequency of mothers and their infants during the Face-to-Face Still-Face (FFSF) procedure. A de-identified dataset including these features was made available online as a contribution of this work. Analysis of infant behavior over the course of the FFSF indicated that the amount and modality of infant behavior change evolve with age. Evaluating the interaction dynamics, we found that infant and mother behavioral signals are coordinated both within and across modalities, and that levels of both intramodal and intermodal coordination vary significantly with age and across stages of the FFSF. These results support the significance of intermodal coordination when assessing changes in infant-mother interaction across conditions.
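
The abstract describes the coordination analysis only at a high level. Windowed cross-correlation is one common way to quantify coordination between two behavioral time series, and the short Python sketch below illustrates the idea; the function name and the window, step, and lag values are illustrative assumptions, not the authors' implementation. Intermodal coordination would pair signals from different modalities (for example, infant head movement against mother fundamental frequency), while intramodal coordination pairs the same modality across partners.

# A minimal sketch, assuming two uniformly sampled, equally long behavioral
# signals. Keeping the peak correlation within each window credits
# coordination even when one partner leads or lags the other.
import numpy as np

def windowed_peak_correlation(x, y, win=100, step=50, max_lag=25):
    """Peak signed Pearson correlation over lags of +/- max_lag, per window."""
    peaks = []
    for start in range(0, len(x) - win + 1, step):
        xw = x[start:start + win]
        best = 0.0
        for lag in range(-max_lag, max_lag + 1):
            lo = start + lag
            if lo < 0 or lo + win > len(y):
                continue  # skip lags that run past the signal boundaries
            r = np.corrcoef(xw, y[lo:lo + win])[0, 1]
            if np.isfinite(r) and abs(r) > abs(best):
                best = r
        peaks.append(best)
    return np.array(peaks)

# Synthetic check: y is a noisy copy of x delayed by 10 samples, so the
# per-window peak correlations should be consistently high.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.roll(x, 10) + 0.5 * rng.standard_normal(2000)
print(windowed_peak_correlation(x, y).mean())

Comparing the distribution of such per-window values across FFSF stages or infant ages is the kind of analysis the abstract describes; the paper's actual coordination measure may differ.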

Supplementary Material

MP4 File (3382507.3418870.mp4)
Infant-mother interactions can provide meaningful insight into the dyad's well-being. Previous work has shown that infant-mother coordination, within a single modality, varies significantly with age and interaction quality. However, as infants are still developing motor and social skills, they may differ from their mothers in the modes they use to communicate. This work examines how coordination across modalities can be used to identify meaningful trends in infant-mother interactions. Using automated feature extraction, we analyzed the head position, arm position, and vocal fundamental frequency of mothers and their infants during the Face-to-Face Still-Face (FFSF) procedure. Results indicated that infant and mother behavioral signals were coordinated both within and across modalities, and that levels of coordination varied significantly across age and stages of the FFSF, supporting the significance of intermodal coordination when assessing changes in infant-mother interaction across conditions.
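
The video summary, like the abstract, names vocal fundamental frequency (F0) as one of the automatically extracted features without giving extraction settings. The sketch below shows one way to obtain an F0 track in Python using the praat-parselmouth bindings to Praat; the library choice, the file name, and the pitch-range parameters are assumptions for illustration rather than the authors' pipeline.

# A minimal sketch, assuming a mono WAV recording of the interaction audio.
import numpy as np
import parselmouth  # pip install praat-parselmouth

def extract_f0(wav_path, time_step=0.01, floor=75.0, ceiling=600.0):
    """Return (times, f0) arrays; unvoiced frames are reported as 0 Hz."""
    # Note: infant vocalizations may call for a higher ceiling than the
    # adult-oriented default pitch range assumed here.
    sound = parselmouth.Sound(wav_path)
    pitch = sound.to_pitch(time_step=time_step,
                           pitch_floor=floor,
                           pitch_ceiling=ceiling)
    return pitch.xs(), pitch.selected_array['frequency']

times, f0 = extract_f0("dyad_recording.wav")  # hypothetical file name
voiced = f0 > 0
print(f"{voiced.mean():.0%} voiced frames, median F0 {np.median(f0[voiced]):.1f} Hz")

Because unvoiced frames come back as 0 Hz, a coordination analysis would typically mask or interpolate them before correlating F0 with the head- and arm-position signals.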




      Published In

      ICMI '20: Proceedings of the 2020 International Conference on Multimodal Interaction
      October 2020
      920 pages
      ISBN: 9781450375818
      DOI: 10.1145/3382507

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 22 October 2020


      Author Tags

      1. behavioral signal processing
      2. infant-mother interaction
      3. intermodal correlation

      Qualifiers

      • Research-article

      Funding Sources

      • JPB Foundation
      • National Science Foundation

      Conference

      ICMI '20: International Conference on Multimodal Interaction
      October 25 - 29, 2020
      Virtual Event, Netherlands

      Acceptance Rates

      Overall Acceptance Rate: 453 of 1,080 submissions, 42%

      Article Metrics

      • Downloads (Last 12 months): 63
      • Downloads (Last 6 weeks): 13
      Reflects downloads up to 21 Dec 2024

      Cited By

      • (2024) Towards Automated Annotation of Infant-Caregiver Engagement Phases with Multimodal Foundation Models. Proceedings of the 26th International Conference on Multimodal Interaction, 428-438. DOI: 10.1145/3678957.3685704. Online publication date: 4-Nov-2024.
      • (2024) Survey of Automated Methods for Nonverbal Behavior Analysis in Parent-Child Interactions. 2024 IEEE 18th International Conference on Automatic Face and Gesture Recognition (FG), 1-11. DOI: 10.1109/FG59268.2024.10582009. Online publication date: 27-May-2024.
      • (2023) Quantifying the Quality of Parent-Child Interaction Through Machine-Learning Based Audio and Video Analysis: Towards a Vision of AI-assisted Coaching Support for Social Workers. ACM Journal on Computing and Sustainable Societies 2(1), 1-21. DOI: 10.1145/3617693. Online publication date: 1-Sep-2023.
      • (2022) Evaluating Temporal Patterns in Applied Infant Affect Recognition. 2022 10th International Conference on Affective Computing and Intelligent Interaction (ACII), 1-8. DOI: 10.1109/ACII55700.2022.9953842. Online publication date: 18-Oct-2022.
      • (2021) Dynamic Mode Decomposition with Control as a Model of Multimodal Behavioral Coordination. Proceedings of the 2021 International Conference on Multimodal Interaction, 25-33. DOI: 10.1145/3462244.3479916. Online publication date: 18-Oct-2021.
      • (2021) BabyNet: A Lightweight Network for Infant Reaching Action Recognition in Unconstrained Environments to Support Future Pediatric Rehabilitation Applications. 2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), 461-467. DOI: 10.1109/RO-MAN50785.2021.9515507. Online publication date: 8-Aug-2021.
