
Gaussian process inference modelling of dynamic robot control for expressive piano playing

PLoS One. 2020 Aug 14;15(8):e0237826. doi: 10.1371/journal.pone.0237826. eCollection 2020.

Abstract

The piano is a complex instrument that humans learn to play only after many years of practice. This paper investigates the complex dynamics of the embodied interactions between a human and the piano, in order to gain insights into the nature of human physical dexterity and adaptability. In this context, the dynamic interactions become particularly crucial for the delicate expressions often present in advanced music pieces, which are the main focus of this paper. This paper hypothesises that the relationship between the motor control for key-pressing and the generated sound is a manifold problem that is highly nonlinear in nature. We employ a minimalistic experimental platform based on a robotic arm equipped with a single elastic finger to systematically investigate the motor control and the resulting piano sounds. The robot was programmed to execute 3125 key-presses on a physical digital piano with varied control parameters. The obtained data were used to train a Gaussian Process (GP) inference model over 10 playing styles, corresponding to different expressions generated via the Musical Instrument Digital Interface (MIDI). Analysis of the robot control parameters and the output sounds confirmed that the relationship is highly nonlinear, especially when rich expressions (such as a broad range of sound dynamics) are necessary. Furthermore, this relationship was difficult and time-consuming to learn with linear regression models compared to the developed GP-based approach. The performance of the robot controller was also compared to that of an experienced human player. The analysis shows that the robot is able to generate sounds closer to the human's for some expressions, but requires further investigation for others.
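
As a rough illustration of the GP-based approach described in the abstract, the sketch below fits a Gaussian process regression model mapping key-press control parameters to MIDI velocity. This is not the authors' implementation: the five control parameters, their 5-level grid (5^5 = 3125 key-presses, matching the count in the abstract), the synthetic velocity mapping, and the kernel choice are all illustrative assumptions, built on scikit-learn's GaussianProcessRegressor.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)

    # Hypothetical design: 5 control parameters at 5 levels each gives
    # 5**5 = 3125 key-presses, matching the experiment size in the abstract.
    levels = np.linspace(0.0, 1.0, 5)
    X = np.array(np.meshgrid(*[levels] * 5)).reshape(5, -1).T   # (3125, 5)

    # Hypothetical stand-in for the measured sound: a saturating nonlinear
    # map from control parameters to MIDI velocity (0-127), plus noise.
    w = np.array([0.5, 0.3, 0.1, 0.07, 0.03])
    y = 127.0 / (1.0 + np.exp(-6.0 * (X @ w - 0.5)))
    y += rng.normal(scale=1.0, size=y.shape)

    # RBF kernel captures a smooth nonlinear response surface; WhiteKernel
    # absorbs measurement noise.
    gp = GaussianProcessRegressor(
        kernel=RBF(length_scale=0.3) + WhiteKernel(),
        normalize_y=True,
    )

    # Subsample the grid: exact GP training is O(n^3) in the number of points.
    idx = rng.choice(len(X), size=400, replace=False)
    gp.fit(X[idx], y[idx])

    # Predictive mean and uncertainty for an unseen control setting.
    x_new = np.array([[0.2, 0.8, 0.5, 0.1, 0.9]])
    mean, std = gp.predict(x_new, return_std=True)
    print(f"predicted MIDI velocity: {mean[0]:.1f} +/- {std[0]:.1f}")

The predictive standard deviation is what distinguishes a GP from a plain regression fit here: it quantifies where the control-to-sound mapping is well characterised by the sampled key-presses and where it is not.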

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Feedback
  • Humans
  • Models, Theoretical*
  • Music*
  • Normal Distribution
  • Robotics*
  • Sound

Grants and funding

This work was funded by the UK Agriculture and Horticulture Development Board (CP 172) and the Engineering and Physical Sciences Research Council (EPSRC) MOTION grant [EP/T00519X/1]. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.