Synthesis and acquisition of Laban Movement Analysis qualitative parameters for communicative gestures
Publisher:
  • University of Pennsylvania
  • Computer and Information Science Dept., 2000 South 33rd St., Philadelphia, PA
  • United States
ISBN: 978-0-493-25765-5
Order Number: AAI3015399
Pages: 147
Abstract

Humans use gestures in most communicative acts. How are these gestures initiated and performed? What kinds of communicative roles do they play, and what kinds of meanings do they convey? How do listeners extract and understand these meanings? Will it be possible to build computerized communicating agents that can extract and understand these meanings and accordingly simulate and display expressive gestures on the computer in such a way that they can be effective conversational partners? All these questions are easy to ask, but far more difficult to answer. In this thesis we address these questions with respect to the synthesis and acquisition of communicative gestures.

Our approach to gesture is based on the principles of movement observation science, specifically Laban Movement Analysis (LMA) and its Effort and Shape components. LMA, developed in the dance community over the past seventy years, is an effective method for observing, describing, notating, and interpreting human movement to enhance communication and expression in everyday and professional life. Its Effort and Shape components provide us with a comprehensive and valuable set of parameters for characterizing gesture formation. The computational model we have built (the EMOTE system) offers the power and flexibility to procedurally synthesize gestures from predefined key pose and timing information plus Effort and Shape qualities.
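As a rough illustration of this kind of parameterized synthesis (not the EMOTE implementation itself), the Python sketch below interpolates between key poses while an Effort-like Time quality (sustained versus sudden) reshapes the timing of each transition. The pose representation, the exponent mapping, and all function names are illustrative assumptions.

    # Minimal sketch, assuming key poses are arrays of joint angles and that a
    # single Effort-like "Time" quality in [-1, 1] modulates transition timing.
    # This is not the EMOTE system; it only illustrates the idea of procedural
    # synthesis from key poses, timing, and a movement-quality parameter.
    import numpy as np

    def timing_curve(u, time_quality):
        """Map normalized time u in [0, 1] through an ease curve.

        time_quality: -1 ~ sustained (slow start), +1 ~ sudden (fast start),
        0 gives plain linear timing. The exponential mapping is an assumption.
        """
        exponent = 2.0 ** (-2.0 * time_quality)
        return u ** exponent

    def synthesize(key_poses, key_times, time_quality, fps=30):
        """Interpolate consecutive key poses at the given key times, with the
        quality parameter reshaping how each transition is timed."""
        frames = []
        for (p0, p1), (t0, t1) in zip(zip(key_poses, key_poses[1:]),
                                      zip(key_times, key_times[1:])):
            n = max(int((t1 - t0) * fps), 1)
            for i in range(n):
                u = timing_curve(i / n, time_quality)
                frames.append((1 - u) * np.asarray(p0) + u * np.asarray(p1))
        frames.append(np.asarray(key_poses[-1]))
        return np.stack(frames)

    # Example: two joint angles moving through three key poses, rendered once
    # with a "sudden" quality and once with a "sustained" quality.
    poses = [[0.0, 0.0], [0.6, -0.3], [0.2, 0.4]]
    times = [0.0, 0.5, 1.2]
    sudden = synthesize(poses, times, time_quality=+0.8)
    sustained = synthesize(poses, times, time_quality=-0.8)

The same key poses and timing yield visibly different motion depending on the quality parameter, which is the intuition behind driving synthesis with Effort and Shape qualities rather than with fully hand-specified trajectories.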

To provide real quantitative foundations for a complete communicative gesture model, we have built a computational framework in which the observable characteristics of gestures—not only key pose and timing but also the underlying motion qualities—can be extracted from live performance, either from 3D motion capture data or from 2D video data, and correlated with observations validated by LMA notators. Experiments of this sort have not been conducted before; they should be of interest to the computer animation and computer vision communities and provide a powerful and valuable methodological tool for creating personalized, communicating agents.
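As a hedged sketch of what the acquisition side might look like under simple assumptions (it is not the framework described in the thesis, and it omits the correlation with LMA notators), the Python fragment below computes basic kinematic statistics from a 3D marker trajectory as crude proxies for Effort-like qualities; the feature choices and names are illustrative.

    # Rough sketch: given a 3D wrist trajectory from motion capture, compute
    # simple kinematic statistics (peak speed, acceleration burstiness) as
    # crude stand-ins for Effort-like qualities such as sudden vs. sustained.
    import numpy as np

    def kinematic_features(positions, dt):
        """positions: (T, 3) array of marker positions sampled every dt seconds."""
        velocity = np.gradient(positions, dt, axis=0)      # (T, 3) finite differences
        speed = np.linalg.norm(velocity, axis=1)           # (T,) path speed
        acceleration = np.gradient(speed, dt)              # (T,) change in speed

        return {
            "peak_speed": float(speed.max()),
            "mean_speed": float(speed.mean()),
            # Large, brief acceleration bursts suggest a "sudden" Time quality;
            # low, even acceleration suggests "sustained".
            "accel_peak_to_mean": float(np.abs(acceleration).max()
                                        / (np.abs(acceleration).mean() + 1e-8)),
        }

    # Example with synthetic data: a short reach sampled at 120 Hz.
    t = np.linspace(0.0, 1.0, 120)
    trajectory = np.stack([t**3, np.zeros_like(t), 0.2 * t], axis=1)
    print(kinematic_features(trajectory, dt=1.0 / 120.0))

A real system would compute such features per gesture segment from tracked markers or video-based trajectories and then validate them against notators' Effort and Shape annotations, which is the correlation step the thesis emphasizes.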

Cited By

  1. Mckendrick Z, Somin L, Finn P and Sharlin E Virtual Rehearsal Suite: An Environment and Framework for Virtual Performance Practice Proceedings of the 2023 ACM International Conference on Interactive Media Experiences, (27-39)
  2. Otterbein R, Jochum E, Overholt D, Bai S and Dalsgaard A Dance and Movement-Led Research for Designing and Evaluating Wearable Human-Computer Interfaces Proceedings of the 8th International Conference on Movement and Computing, (1-9)
  3. Volioti C, Manitsaris S, Hemery E, Hadjidimitriou S, Charisis V, Hadjileontiadis L, Katsouli E, Moutarde F and Manitsaris A (2018). A Natural User Interface for Gestural Expression and Emotional Elicitation to Access the Musical Intangible Cultural Heritage, Journal on Computing and Cultural Heritage, 11:2, (1-20), Online publication date: 7-Jun-2018.
  4. Santos O and Eddy M Modeling Psychomotor Activity Adjunct Publication of the 25th Conference on User Modeling, Adaptation and Personalization, (305-310)
  5. Yang Y, Shum H, Aslam N and Zeng L Temporal clustering of motion capture data with optimal partitioning Proceedings of the 15th ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry - Volume 1, (479-482)
  6. Dardard F, Gnecco G and Glowinski D (2016). Automatic Classification of Leading Interactions in a String Quartet, ACM Transactions on Interactive Intelligent Systems, 6:1, (1-27), Online publication date: 5-May-2016.
  7. Lockyer M, Bartram L, Schiphorst T and Studd K Extending computational models of abstract motion with movement qualities Proceedings of the 2nd International Workshop on Movement and Computing, (92-99)
  8. Zacharatos H, Gatzoulis C, Chrysanthou Y and Aristidou A Emotion Recognition for Exergames using Laban Movement Analysis Proceedings of Motion on Games, (61-66)
  9. Baird B and Izmirli O (2011). Motion capture in a CS curriculum, Journal of Computing Sciences in Colleges, 26:6, (165-167), Online publication date: 1-Jun-2011.
  10. Sundström P and Höök K Hand in hand with the material Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, (463-472)
  11. Santos L, Prado J and Dias J Human robot interaction studies on Laban human movement analysis and dynamic background segmentation Proceedings of the 2009 IEEE/RSJ international conference on Intelligent robots and systems, (4984-4989)
  12. Swaminathan D, Thornburg H, Mumford J, Rajko S, James J, Ingalls T, Campana E, Qian G, Sampath P and Peng B (2009). A dynamic Bayesian approach to computational Laban shape quality analysis, Advances in Human-Computer Interaction, 2009, (1-17), Online publication date: 1-Jan-2009.
  13. Deng Z, Gu Q and Li Q Perceptually consistent example-based human motion retrieval Proceedings of the 2009 symposium on Interactive 3D graphics and games, (191-198)
  14. Sheppard R, Kamali M, Rivas R, Tamai M, Yang Z, Wu W and Nahrstedt K Advancing interactive collaborative mediums through tele-immersive dance (TED) Proceedings of the 16th ACM international conference on Multimedia, (579-588)
  15. Rett J and Dias J Human robot interaction based on Bayesian analysis of human movements Proceedings of the 13th Portuguese conference on Progress in artificial intelligence, (530-541)
  16. Schiphorst T, Nack F, KauwATjoe M, de Bakker S, Stock, Aroyo L, Rosillio A, Schut H and Jaffe N PillowTalk Proceedings of the 1st international conference on Tangible and embedded interaction, (23-30)
  17. Bhuyan M, Ghosh D and Bora P Continuous hand gesture segmentation and co-articulation detection Proceedings of the 5th Indian conference on Computer Vision, Graphics and Image Processing, (564-575)
  18. Moen J Towards people based movement interaction and kinaesthetic interaction experiences Proceedings of the 4th decennial conference on Critical computing: between sense and sensibility, (121-124)
  19. Camurri A, Lagerlöf I and Volpe G (2003). Recognizing emotion from dance movement, International Journal of Human-Computer Studies, 59:1-2, (213-225), Online publication date: 1-Jul-2003.
  20. Allbeck J, Kipper K, Adams C, Schuler W, Zoubanova E, Badler N, Palmer M and Joshi A ACUMEN Proceedings of the first international joint conference on Autonomous agents and multiagent systems: part 1, (191-198)
Contributors
  • University of Pennsylvania
  • University of Pennsylvania