DOI: 10.1145/1056808.1056997

Multimodal presentation method for a dance training system

Published: 02 April 2005

Abstract

This paper presents a multimodal information presentation method for a basic dance training system. The system targets beginners and enables them to learn the basics of dances easily. One of the most effective ways of learning dances is to watch a video showing the performance of dance masters. However, some information cannot be conveyed well through video. One is translational motion, especially motion in the depth direction: we cannot tell exactly how far the dancer moves forward or backward. Another is timing information: although we can tell from video how to move our arms or legs, it is difficult to know when to start moving them. We solve the first issue by introducing an image display mounted on a mobile robot, so that learners can grasp the amount of translation simply by following the robot. For the second issue we introduce active devices, composed of several vibro-motors, that give action-starting cues through vibration. Experimental results show the effectiveness of our multimodal information presentation method.
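As a rough illustration of how such action-starting cues could be driven in software (the paper includes no code; the cue format, the 0.3 s lead time, and the pulse() hardware call below are all assumptions made for this sketch), a minimal Python scheduler might look like the following:

import time

# Minimal sketch (assumed, not from the paper): fire a short vibration on the
# relevant limb's vibro-motor shortly before each action-start time so the
# learner knows when to begin the move. Action-start times are assumed to be
# extracted beforehand, e.g. from motion-capture data of the master's dance.

LEAD_TIME = 0.3   # seconds of warning before the move starts (assumed value)
PULSE_LEN = 0.15  # length of each vibration pulse in seconds (assumed value)

def pulse(limb: str, duration: float) -> None:
    # Placeholder for whatever driver the vibro-motor hardware actually exposes.
    print(f"vibrate {limb} for {duration:.2f}s")
    time.sleep(duration)

def run_timing_cues(cues: list[tuple[float, str]]) -> None:
    # cues: (action_start_time_in_seconds_from_music_start, limb_name) pairs.
    start = time.monotonic()
    for action_time, limb in sorted(cues):
        fire_at = max(0.0, action_time - LEAD_TIME)
        delay = fire_at - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        pulse(limb, PULSE_LEN)

# Example: right arm starts moving at 2.0 s, left leg at 3.5 s into the music.
run_timing_cues([(2.0, "right_arm"), (3.5, "left_leg")])

A real system would drive the motors over a serial or wireless link and keep this clock synchronized with the video and the mobile robot, but the scheduling idea is the same.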




Published In

CHI EA '05: CHI '05 Extended Abstracts on Human Factors in Computing Systems
April 2005
1358 pages
ISBN: 1595930027
DOI: 10.1145/1056808
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. active vibro-device
  2. dance training
  3. digital archives of dances
  4. mobile robot
  5. motion capture

Qualifiers

  • Article

Conference

CHI05

Acceptance Rates

Overall acceptance rate: 6,164 of 23,696 submissions (26%)

Article Metrics

  • Downloads (last 12 months): 40
  • Downloads (last 6 weeks): 6
Reflects downloads up to 01 Jan 2025


Cited By

  • (2024) Experiencing the body as play: Cultivating older adults' exergame experiences using embodied metaphors and multimodal feedback. Computers in Human Behavior, 158:108280. DOI: 10.1016/j.chb.2024.108280. Online publication date: Sep-2024.
  • (2024) Dance Information Processing: Computational Approaches for Assisting Dance Composition. New Generation Computing, 42(5):1049-1064. DOI: 10.1007/s00354-024-00273-2. Online publication date: 1-Dec-2024.
  • (2023) Dancing on the inside: A qualitative study on online dance learning with teacher-AI cooperation. Education and Information Technologies, 28(9):12111-12141. DOI: 10.1007/s10639-023-11649-0. Online publication date: 4-Mar-2023.
  • (2023) Card Type Device to Support Acquirement of Card Techniques. Distributed, Ambient and Pervasive Interactions, 444-455. DOI: 10.1007/978-3-031-34609-5_32. Online publication date: 9-Jul-2023.
  • (2022) Geocultural Precarities in Canonizing Computing Research Involving Dance. Proceedings of the 8th International Conference on Movement and Computing, 1-14. DOI: 10.1145/3537972.3537988. Online publication date: 22-Jun-2022.
  • (2022) Online Dance Lesson Support System Using Flipped Classroom. Advances in Mobile Computing and Multimedia Intelligence, 133-142. DOI: 10.1007/978-3-031-20436-4_13. Online publication date: 21-Nov-2022.
  • (2022) Slow, Repeat, Voice Guidance: Automatic Generation of Practice Charts to Improve Rhythm Action Games. HCI in Games, 119-138. DOI: 10.1007/978-3-031-05637-6_8. Online publication date: 26-Jun-2022.
  • (2021) SyncUp. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 5(3):1-25. DOI: 10.1145/3478120. Online publication date: 14-Sep-2021.
  • (2020) Real-time Snowboard Training System for a Novice Using Visual and Auditory Feedback. 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 4230-4235. DOI: 10.1109/SMC42975.2020.9283160. Online publication date: 11-Oct-2020.
  • (2018) Takes Tutu to Ballet. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2(1):1-30. DOI: 10.1145/3191770. Online publication date: 26-Mar-2018.
