
Evaluating physical and rendered material appearance

Original Article · The Visual Computer

Abstract

Many representations and rendering techniques have been proposed for presenting material appearance in computer graphics. One outstanding problem is evaluating their accuracy. In this paper, we propose assessing accuracy by comparing human judgements of material attributes made when viewing a computer graphics rendering to those made when viewing a physical sample of the same material. We demonstrate this approach using 16 diverse physical material samples distributed to researchers at the MAM 2014 workshop. We performed two psychophysical experiments. In the first experiment, we examined how consistently subjects rate a set of twelve visual, tactile, and subjective attributes of individual physical material specimens. In the second experiment, we asked subjects to assess the same attributes for identical materials rendered as bidirectional texture functions (BTFs) under point-light and environment illumination. By analyzing the obtained data, we identified which material attributes and material types are judged consistently, and to what extent the computer graphics representation conveyed the experience of viewing physical material appearance.





Funding

This research has been supported by the Czech Science Foundation Grant 17-18407S and the US National Science Foundation Grant IIS-1218515.

Author information

Correspondence to Jiří Filip.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Appendix: BIG data format description

The format stores image data by providing a list of image files to be included (currently PNG and EXR (half/float) image files are supported), together with optional metadata such as lists of corresponding incoming and outgoing directions, color space, spatial resolution, measured material name, and descriptions. The stored binary data can either be loaded into RAM or, for large datasets, the data file can be kept open and the requested data seeked from the hard drive. The latter option is considerably slower but still acceptable for many off-line rendering scenarios. Once the file is loaded or opened, a standard “get-pixel” query function returns the RGB triplet for a given spatial UV coordinate and image index. The transformation between image index and incoming/outgoing angles is left to the user, as it depends on the ordering of files chosen during the saving process. We also do not compress the data, since compression could impact visual quality and rendering speed; compression can easily be added as an extension of the format.
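
As an illustration only, the following C++ sketch shows the kind of access pattern this implies once a dataset is resident in memory. The class and method names (BigImageSet, getPixel) are hypothetical, chosen here for clarity; they are not the published API, for which see the source code at http://btf.utia.cas.cz.

```cpp
// Minimal sketch of a "get-pixel" style query over an in-memory image set.
// All names are illustrative, not the authors' actual interface.
#include <cstddef>
#include <vector>

struct RGB { float r, g, b; };

class BigImageSet {
public:
    BigImageSet(std::size_t images, std::size_t width, std::size_t height)
        : images_(images), width_(width), height_(height),
          data_(images * width * height) {}

    // "get-pixel" query: RGB triplet for an image index and UV coordinate.
    // Mapping image index -> incoming/outgoing angles is the caller's job;
    // it depends on the file ordering used when the dataset was saved.
    RGB getPixel(std::size_t image, std::size_t u, std::size_t v) const {
        return data_[(image * height_ + v) * width_ + u];
    }

    RGB& at(std::size_t image, std::size_t u, std::size_t v) {
        return data_[(image * height_ + v) * width_ + u];
    }

private:
    std::size_t images_, width_, height_;
    std::vector<RGB> data_;  // uncompressed, as the format stores it
};
```

A disk-backed variant would keep only the file handle and perform a seek-and-read per query instead of indexing into data_, trading speed for memory, as described above.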

Since the proposed format is universal (it can hold any LDR/HDR data), it allows a unified representation of any image-based information, e.g., movies or dynamic textures. The format also simplifies the management of numerous scattered files that are difficult to handle without metadata. The source code for saving data to and loading data from the format is publicly available (http://btf.utia.cas.cz) to promote wide usage and allow easy adoption in visualization and data-analysis software packages. The format is composed of data chunks, each consisting of a chunk ID, its size, and its data; the current data chunks and their brief descriptions are listed in Fig. 15, and a parsing sketch follows the figure.

Fig. 15: A list of data chunks available in the BIG format
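
The following C++ sketch walks the chunk layout just described: each chunk is an ID, a byte size, and raw data, read sequentially until end of file. The field widths (32-bit ID, 64-bit size) are assumptions for illustration; the authoritative definitions are in the published source code.

```cpp
// Sequentially reads ID/size/data chunks until EOF.
// Field widths are assumed here, not taken from the BIG specification.
#include <cstdint>
#include <cstdio>
#include <utility>
#include <vector>

struct Chunk {
    std::uint32_t id;                // chunk identifier
    std::vector<std::uint8_t> data;  // raw payload, 'size' bytes
};

// Returns false if the file ends mid-chunk (truncated file).
bool readChunks(std::FILE* f, std::vector<Chunk>& out) {
    std::uint32_t id;
    std::uint64_t size;
    while (std::fread(&id, sizeof id, 1, f) == 1) {
        if (std::fread(&size, sizeof size, 1, f) != 1) return false;
        Chunk c;
        c.id = id;
        c.data.resize(size);
        if (size && std::fread(c.data.data(), 1, size, f) != size)
            return false;
        out.push_back(std::move(c));
    }
    return true;
}
```

Because each chunk carries its own size, a reader can skip unknown chunk IDs, which is what makes the format extensible (e.g., for the optional compression chunks mentioned above).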


Cite this article

Filip, J., Kolafová, M., Havlíček, M. et al. Evaluating physical and rendered material appearance. Vis Comput 34, 805–816 (2018). https://doi.org/10.1007/s00371-018-1545-3

