Research article · DOI: 10.1145/3550340.3564234 · SIGGRAPH Asia Conference Proceedings

Sampling Neural Radiance Fields for Refractive Objects

Published: 22 November 2022

Abstract

Differentiable volume rendering with neural radiance fields (NeRF) has recently gained popularity, and its variants have attained many impressive results. However, existing methods usually assume the scene is a homogeneous volume, so rays are cast along straight paths. In this work, the scene is instead a heterogeneous volume with a piecewise-constant refractive index, where a ray bends whenever it crosses an interface between different refractive indices. For novel view synthesis of refractive objects, our NeRF-based framework aims to optimize the radiance fields of the bounded volume and its boundary from multi-view posed images with refractive-object silhouettes. To tackle this challenging problem, the refractive index of the scene is first reconstructed from the silhouettes. Given the refractive index, we extend the stratified and hierarchical sampling techniques in NeRF to draw samples along curved paths tracked by the eikonal equation. The results indicate that our framework outperforms the state-of-the-art method both quantitatively and qualitatively, with better perceptual-similarity scores and a clear improvement in rendering quality on several synthetic and real scenes.
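The curved-path tracking the abstract describes follows the eikonal ray equation d/ds (n dx/ds) = ∇n. The sketch below is not the authors' code; it is a minimal illustration of how sample positions along a bent ray can be generated by forward-Euler integration of that ODE. The Gaussian refractive-index blob, step size, and step count are illustrative assumptions (the paper uses a piecewise-constant index; a smooth field is substituted here so that ∇n is well defined everywhere).

```python
import numpy as np

def refractive_index(x, center=np.array([0.0, 0.0, 2.0]), n_glass=1.5, radius=0.5):
    """Smooth (Gaussian-blended) index field: ~n_glass near the blob center, 1.0 far away.
    Stands in for a piecewise-constant index so the gradient is defined everywhere."""
    r2 = np.sum((x - center) ** 2)
    return 1.0 + (n_glass - 1.0) * np.exp(-r2 / (2.0 * radius ** 2))

def grad_n(x, eps=1e-4):
    """Central-difference gradient of the index field."""
    g = np.zeros(3)
    for i in range(3):
        dx = np.zeros(3)
        dx[i] = eps
        g[i] = (refractive_index(x + dx) - refractive_index(x - dx)) / (2 * eps)
    return g

def eikonal_march(origin, direction, ds=0.01, n_steps=600):
    """Integrate d/ds (n dx/ds) = grad n with forward-Euler steps.
    Tracks d = n * (unit direction); returns sample positions along the curved path."""
    x = origin.astype(float)
    d = refractive_index(x) * direction / np.linalg.norm(direction)
    samples = [x.copy()]
    for _ in range(n_steps):
        d = d + ds * grad_n(x)              # the index gradient bends the ray
        x = x + ds * d / refractive_index(x)  # advance along the current direction
        samples.append(x.copy())
    return np.array(samples)

# A ray aimed slightly off the blob's center bends toward the higher index,
# while a ray far from the blob travels straight.
path = eikonal_march(np.array([0.3, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]))
```

The returned positions play the role of the sample points that NeRF's stratified and hierarchical sampling would otherwise place along a straight ray.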

Supplementary Material

• MP4 File (3550340.3564234.mp4): presentation
• MP4 File (SampleNeRFRO_long.mp4): presentation video




Published In

SA '22: SIGGRAPH Asia 2022 Technical Communications
December 2022, 91 pages
ISBN: 9781450394659
DOI: 10.1145/3550340
Editors: Soon Ki Jung, Neil Dodgson

Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. eikonal rendering
      2. neural radiance fields

      Qualifiers

      • Research-article
      • Research
      • Refereed limited


Conference

SA '22: SIGGRAPH Asia 2022
December 6–9, 2022, Daegu, Republic of Korea

Acceptance Rates

Overall acceptance rate: 178 of 869 submissions (20%)


Article Metrics

• Downloads (last 12 months): 64
• Downloads (last 6 weeks): 0

Reflects downloads up to 01 Jan 2025


Cited By

• (2024) Refracting Once is Enough: Neural Radiance Fields for Novel-View Synthesis of Real Refractive Objects. Proceedings of the 2024 International Conference on Multimedia Retrieval, 694–703. DOI: 10.1145/3652583.3658000
• (2024) CrystalNet: Texture-Aware Neural Refraction Baking for Global Illumination. Computer Graphics Forum 43(7). DOI: 10.1111/cgf.15227
• (2024) Ray Deformation Networks for Novel View Synthesis of Refractive Objects. IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 3106–3116. DOI: 10.1109/WACV57701.2024.00309
• (2024) REF²-NeRF: Reflection and Refraction aware Neural Radiance Field. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 7196–7203. DOI: 10.1109/IROS58592.2024.10801830
• (2024) DerainNeRF: 3D Scene Estimation with Adhesive Waterdrop Removal. IEEE International Conference on Robotics and Automation (ICRA), 2787–2793. DOI: 10.1109/ICRA57147.2024.10609981
• (2024) Differentiable Neural Surface Refinement for Modeling Transparent Objects. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 20268–20277. DOI: 10.1109/CVPR52733.2024.01916
• (2024) Neural Fields as Distributions: Signal Processing Beyond Euclidean Space. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 4274–4283. DOI: 10.1109/CVPR52733.2024.00409
• (2024) Multi-view 3D reconstruction based on deep learning. Neurocomputing 582. DOI: 10.1016/j.neucom.2024.127553
• (2024) Flying with Photons: Rendering Novel Views of Propagating Light. Computer Vision – ECCV 2024, 333–351. DOI: 10.1007/978-3-031-72664-4_19
