Bimodal perception of audio-visual material properties for virtual environments

Published: 18 January 2010

Abstract

High-quality rendering of both audio and visual material properties is very important in interactive virtual environments, since convincingly rendered materials increase realism and the sense of immersion. We studied how the levels of detail of auditory and visual stimuli interact in the perception of audio-visual material rendering quality. Our study is based on a material discrimination task in which we vary the level of detail of modal synthesis for sound and of bidirectional reflectance distribution functions for graphics. We performed an experiment for two different models (a Dragon and a Bunny model) and two material types (plastic and gold). The results show a significant interaction between auditory and visual level of detail in the perception of material similarity when comparing approximate levels of detail to a high-quality audio-visual reference rendering. We show how this result can contribute to significant savings in computation time in an interactive audio-visual rendering system. To our knowledge, this is the first study to show an interaction between audio and graphics representations in a material perception task.
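As context for the abstract, the sketch below illustrates the kind of audio level-of-detail control the study varies: modal synthesis represents an impact sound as a sum of exponentially decaying sinusoids, and the number of modes retained acts as the audio level of detail. This is a minimal sketch under that general assumption, not the authors' implementation; the function name and the mode parameters are hypothetical.

    import numpy as np

    def modal_impact_sound(freqs, dampings, amps, n_modes, duration=1.0, sr=44100):
        """Synthesize an impact sound as a sum of exponentially decaying
        sinusoids (modal synthesis). Keeping only the n_modes strongest
        modes is one simple way to lower the audio level of detail."""
        t = np.arange(int(duration * sr)) / sr
        keep = np.argsort(amps)[::-1][:n_modes]   # indices of the strongest modes
        signal = np.zeros_like(t)
        for k in keep:
            signal += amps[k] * np.exp(-dampings[k] * t) * np.sin(2 * np.pi * freqs[k] * t)
        return signal

    # Hypothetical mode parameters, for illustration only; a real system would
    # derive them from a vibration model of the object and its material.
    freqs    = np.array([450.0, 1210.0, 2380.0, 3900.0])  # mode frequencies (Hz)
    dampings = np.array([8.0, 12.0, 20.0, 35.0])          # decay rates (1/s)
    amps     = np.array([1.0, 0.6, 0.35, 0.2])            # excitation amplitudes

    reference = modal_impact_sound(freqs, dampings, amps, n_modes=4)  # high audio LOD
    approx    = modal_impact_sound(freqs, dampings, amps, n_modes=2)  # reduced audio LOD

The visual level of detail in the study plays the analogous role on the graphics side, where the bidirectional reflectance distribution function is replaced by progressively coarser approximations.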

Published In

ACM Transactions on Applied Perception, Volume 7, Issue 1
January 2010
154 pages
ISSN: 1544-3558
EISSN: 1544-3965
DOI: 10.1145/1658349

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 18 January 2010
Accepted: 01 December 2008
Revised: 01 December 2008
Received: 01 July 2008
Published in TAP Volume 7, Issue 1

Author Tags

  1. audio-visual rendering
  2. bimodal perception
  3. crossmodal perception
  4. material perception

Qualifiers

  • Research-article
  • Research
  • Refereed

Cited By

  • (2023) Understanding virtual drilling perception using sound, and kinesthetic cues obtained with a mouse and keyboard. Journal on Multimodal User Interfaces 17(3), 151-163. DOI: 10.1007/s12193-023-00407-8. Online publication date: 4-Aug-2023.
  • (2021) Unified Approach to Augmented Reality. In Implementing Augmented Reality Into Immersive Virtual Learning Environments, 74-88. DOI: 10.4018/978-1-7998-4222-4.ch004. Online publication date: 2021.
  • (2021) MovEcho. ACM Transactions on Applied Perception 18(3), 1-19. DOI: 10.1145/3464692. Online publication date: 20-Aug-2021.
  • (2021) A Texture Superpixel Approach to Semantic Material Classification for Acoustic Geometry Tagging. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, 1-7. DOI: 10.1145/3411763.3451657. Online publication date: 8-May-2021.
  • (2021) Investigating Textual Visual Sound Effects in a Virtual Environment and their impacts on Object Perception and Sound Perception. In 2021 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 320-328. DOI: 10.1109/ISMAR52148.2021.00048. Online publication date: Oct-2021.
  • (2021) Psychometric Mapping of Audio Features to Perceived Physical Characteristics of Virtual Objects. In 2021 IEEE Conference on Games (CoG), 1-4. DOI: 10.1109/CoG52621.2021.9619046. Online publication date: 17-Aug-2021.
  • (2020) Immersive Virtual Reality Audio Rendering Adapted to the Listener and the Room. In Real VR – Immersive Digital Reality, 293-318. DOI: 10.1007/978-3-030-41816-8_13. Online publication date: 3-Mar-2020.
  • (2019) Immersive Spatial Audio Reproduction for VR/AR Using Room Acoustic Modelling from 360° Images. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 120-126. DOI: 10.1109/VR.2019.8798247. Online publication date: Mar-2019.
  • (2019) Pseudo-haptics: leveraging cross-modal perception in virtual environments. The Senses and Society 14(3), 313-329. DOI: 10.1080/17458927.2019.1619318. Online publication date: 14-Oct-2019.
  • (2019) Crossmodal perception in virtual reality. Multimedia Tools and Applications 79(5-6), 3311-3331. DOI: 10.1007/s11042-019-7331-z. Online publication date: 26-Feb-2019.
