Towards Multimodal MIR: Predicting individual differences from music-induced movement
Abstract
As the field of Music Information Retrieval grows, it is important to consider the multimodality of music and how aspects of musical engagement such as movement and gesture might be incorporated. Bodily movement is universally associated with music and reflective of important individual traits related to music preference such as personality, mood, and empathy. Future multimodal MIR systems may benefit from accounting for these aspects. The current study addresses this by identifying individual differences, specifically Big Five personality traits and scores on the Empathy and Systemizing Quotients (EQ/SQ), from participants' free dance movements. Our model successfully generalized to unseen data for personality as well as EQ and SQ; the latter two had not previously been predicted from movement. R² scores for personality, EQ, and SQ were 76.3%, 77.1%, and 86.7%, respectively. As a follow-up, we investigated which bodily joints were most important in defining these traits. We discuss how further research may explore how the mapping of these traits to movement patterns can be used to build a more personalized, multimodal recommendation system, as well as potential therapeutic applications.
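To make the prediction setup concrete, the sketch below frames it as multi-output regression from motion-capture-derived joint features to trait scores (Big Five, EQ, SQ), evaluated with R² on held-out participants. This is a minimal illustration only, using synthetic data and an assumed random-forest regressor; the feature definitions, model choice, and names here are hypothetical, not the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-ins: one feature vector per participant (e.g., per-joint
# speed/acceleration statistics from motion capture) and seven trait scores
# (Big Five + EQ + SQ). The paper's actual features and model differ.
rng = np.random.default_rng(0)
n_participants, n_features, n_traits = 60, 40, 7
X = rng.normal(size=(n_participants, n_features))
y = rng.normal(size=(n_participants, n_traits))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Multi-output regression: one model jointly predicts all seven trait scores.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# R^2 per trait on unseen participants, analogous to the reported scores.
print(r2_score(y_test, model.predict(X_test), multioutput="raw_values"))

# Feature importances hint at which joint-derived features drive the
# predictions, in the spirit of the paper's follow-up analysis of joints.
print(np.argsort(model.feature_importances_)[::-1][:5])
```

With real motion-capture features rather than random data, the per-trait R² values printed above would be the direct analogue of the personality, EQ, and SQ scores reported in the abstract.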
- Publication: arXiv e-prints
- Pub Date: July 2020
- DOI: 10.48550/arXiv.2007.10695
- arXiv: arXiv:2007.10695
- Bibcode: 2020arXiv200710695A
- Keywords: Computer Science - Machine Learning; Computer Science - Multimedia; Statistics - Machine Learning
- E-Print: Appearing in the proceedings of the 21st International Society for Music Information Retrieval Conference (ISMIR 2020) (camera-ready version)