MelodicBrush: a novel system for cross-modal digital art creation linking calligraphy and music

Published: 11 June 2012

Abstract

MelodicBrush is a novel system that connects two ancient art forms: Chinese ink-brush calligraphy and Chinese music. The system uses vision-based techniques to create a digitized ink-brush calligraphic writing surface with enhanced interaction capabilities. Music generation combines cross-modal stroke-note mapping with statistical language modeling in a hybrid model that produces music as real-time auditory feedback on the user's calligraphic strokes.
MelodicBrush is thus a new cross-modal musical system that endows the ancient art of calligraphy with a novel auditory representation, giving users a natural and engaging artistic experience. Evaluations with real users suggest that MelodicBrush is intuitive and realistic, and can readily be used to exercise creativity and support art creation.

Supplementary Material

JPG File (p418-huang.jpg)
MP4 File (p418-huang.mp4)





      Published In

      DIS '12: Proceedings of the Designing Interactive Systems Conference
      June 2012
      828 pages
      ISBN:9781450312103
      DOI:10.1145/2317956

      Publisher

Association for Computing Machinery, New York, NY, United States



      Author Tags

      1. Chinese guqin music
      2. Chinese calligraphy
      3. cross-modal interaction

      Qualifiers

      • Research-article

      Conference

      DIS '12
Sponsor: DIS '12: Designing Interactive Systems Conference 2012
      June 11 - 15, 2012
      Newcastle Upon Tyne, United Kingdom

      Acceptance Rates

      Overall Acceptance Rate 1,158 of 4,684 submissions, 25%



Cited By

• (2024) Exploring User Preferences and Acceptance of Digital Art Therapy Among Older Chinese Adults with Mild Cognitive Impairment. Human Aspects of IT for the Aged Population, 10.1007/978-3-031-61546-7_1, 3-21. Online publication date: 1-Jun-2024.
• (2023) XCreation: A Graph-based Crossmodal Generative Creativity Support Tool. Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 10.1145/3586183.3606826, 1-15. Online publication date: 29-Oct-2023.
• (2023) Art in the Machine: Value Misalignment and AI “Art”. Cooperative Design, Visualization, and Engineering, 10.1007/978-3-031-43815-8_4, 31-42. Online publication date: 18-Sep-2023.
• (2022) Birdbox: Exploring the User Experience of Crossmodal, Multisensory Data Representations. Proceedings of the 21st International Conference on Mobile and Ubiquitous Multimedia, 10.1145/3568444.3568455, 12-21. Online publication date: 27-Nov-2022.
• (2021) We Can Do More to Save Guqin: Design and Evaluate Interactive Systems to Make Guqin More Accessible to the General Public. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 10.1145/3411764.3445175, 1-12. Online publication date: 6-May-2021.
• (2019) Gesture-Ink-Sound. Proceedings of the 6th International Conference on Movement and Computing, 10.1145/3347122.3347136, 1-8. Online publication date: 10-Oct-2019.
• (2018) Enhancing the Appreciation of Traditional Chinese Painting Using Interactive Technology. Multimodal Technologies and Interaction, 10.3390/mti2020016, 2(2), 16. Online publication date: 16-Apr-2018.
• (2017) AVUI. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 10.1145/3025453.3026042, 1093-1104. Online publication date: 2-May-2017.
• (2016) Tap the ShapeTones. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 10.1145/2858036.2858456, 1055-1066. Online publication date: 7-May-2016.
• (2014) From Writing to Painting. Proceedings of the 22nd ACM international conference on Multimedia, 10.1145/2647868.2654911, 57-66. Online publication date: 3-Nov-2014.
