DOI: 10.1145/2909132.2909262

Analysis of Intrapersonal Synchronization in Full-Body Movements Displaying Different Expressive Qualities

Published: 07 June 2016

Abstract

Intrapersonal synchronization of limb movements is a relevant feature for assessing the coordination of motor behavior. In this paper, we show that it can also distinguish between full-body movements performed with different expressive qualities, namely rigidity, fluidity, and impulsivity. For this purpose, we collected a dataset of movements performed by professional dancers and annotated the perceived movement qualities with the help of a group of experts in expressive movement analysis. We computed intrapersonal synchronization by applying the Event Synchronization algorithm to the time series of the speed of the arms and hands. Results show that movements performed with different qualities display significantly different amounts of intrapersonal synchronization: impulsive movements are the most synchronized, fluid movements show the lowest synchronization values, and rigid movements lie in between.
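The pipeline described above can be sketched as follows. This is a minimal illustration of the Event Synchronization measure (Quiroga, Kreuz, and Grassberger, 2002) applied to speed time series; the function names, the peak-based event detection, and the window parameter `tau` are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def detect_events(speed, min_prominence=0.0):
    # Hypothetical event detector: an "event" is a local maximum of the
    # speed profile (e.g. of a hand or arm) exceeding a prominence threshold.
    s = np.asarray(speed, dtype=float)
    return np.array([i for i in range(1, len(s) - 1)
                     if s[i] > s[i - 1] and s[i] >= s[i + 1]
                     and s[i] - min(s[i - 1], s[i + 1]) >= min_prominence])

def event_synchronization(tx, ty, tau):
    # Event Synchronization strength Q (Quiroga et al., 2002):
    #   Q = (c(x|y) + c(y|x)) / sqrt(m_x * m_y)
    # where c(x|y) counts events in x occurring within tau after an event
    # in y (simultaneous events count 1/2), and m_x, m_y are event counts.
    def c(a, b):
        total = 0.0
        for t in a:
            d = t - np.asarray(b, dtype=float)
            total += np.sum((d > 0) & (d <= tau)) + 0.5 * np.sum(d == 0)
        return total
    mx, my = len(tx), len(ty)
    if mx == 0 or my == 0:
        return 0.0
    return (c(tx, ty) + c(ty, tx)) / np.sqrt(mx * my)
```

Perfectly co-occurring event trains yield Q = 1, while unrelated trains yield values near 0, which is what makes the measure usable for comparing synchronization across movement qualities.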




Published In

AVI '16: Proceedings of the International Working Conference on Advanced Visual Interfaces
June 2016, 400 pages
ISBN: 9781450341318
DOI: 10.1145/2909132

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. Analysis
  2. Dance
  3. Event Synchronization
  4. Intrapersonal synchronization
  5. Movement

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • European Union's Horizon 2020 Programme

Conference

AVI '16

Acceptance Rates

AVI '16 Paper Acceptance Rate: 20 of 96 submissions, 21%
Overall Acceptance Rate: 128 of 490 submissions, 26%


Article Metrics

  • Downloads (Last 12 months): 21
  • Downloads (Last 6 weeks): 5
Reflects downloads up to 27 Jan 2025


Cited By

  • A Roadmap for Craft Understanding, Education, Training, and Preservation. Heritage 6(7):5305-5328, 2023. DOI: 10.3390/heritage6070280
  • Modelling the "transactive memory system" in multimodal multiparty interactions. Journal on Multimodal User Interfaces 18(1):103-117, 2023. DOI: 10.1007/s12193-023-00426-5
  • Geocultural Precarities in Canonizing Computing Research Involving Dance. Proceedings of the 8th International Conference on Movement and Computing, pages 1-14, 2022. DOI: 10.1145/3537972.3537988
  • Human Observer and Automatic Assessment of Movement Related Self-Efficacy in Chronic Pain: From Exercise to Functional Activity. IEEE Transactions on Affective Computing 11(2):214-229, 2020. DOI: 10.1109/TAFFC.2018.2798576
  • Salsa dance learning evaluation and motion analysis in gamified virtual reality environment. Multimedia Tools and Applications 79(33-34):24621-24643, 2020. DOI: 10.1007/s11042-020-09192-y
  • Increased affect-arousal in VR can be detected from faster body motion with increased heart rate. Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games, pages 1-6, 2019. DOI: 10.1145/3306131.3317022
  • Evaluating Movement Quality Through Intrapersonal Synchronization. IEEE Transactions on Human-Machine Systems 49(4):304-313, 2019. DOI: 10.1109/THMS.2019.2912498
  • The role of respiration audio in multimodal analysis of movement qualities. Journal on Multimodal User Interfaces 14(1):1-15, 2019. DOI: 10.1007/s12193-019-00302-1
  • Motion analysis and classification of salsa dance using music-related motion features. Proceedings of the 11th ACM SIGGRAPH Conference on Motion, Interaction and Games, pages 1-10, 2018. DOI: 10.1145/3274247.3274514
  • Using the Audio Respiration Signal for Multimodal Discrimination of Expressive Movement Qualities. Human Behavior Understanding, pages 102-115, 2016. DOI: 10.1007/978-3-319-46843-3_7
