Progressive Model-Driven Approach for 3D Modeling of Indoor Spaces
Figures

- Figure 1. Different types of BIM: (a) an example of a 3D model of structural elements (architectural BIM), reprinted with permission from Ref. [8], 2023, Elsevier; (b) an example of a 3D model of a building's mechanical facilities (adapted from srinsofttech.com, accessed on 23 August 2022).
- Figure 2. Graph optimization examples (colors show separate closed spaces): (a) 2D graph optimization, reprinted with permission from Ref. [12], 2023, Elsevier; (b) 3D graph optimization, reprinted with permission from Ref. [16], 2023, Elsevier.
- Figure 3. Indoor spaces LoD definition [27].
- Figure 4. General flow diagram of the proposed approach.
- Figure 5. Initial seed points (red dots) extraction procedure, from left to right.
- Figure 6. Dynamic graphical presentation of the model-driven idea used in this study.
- Figure 7. Histogram of changes in the number of points enclosed in the rectangle when passing through the boundary of a room.
- Figure 8. Point cloud challenges in determining the correct stopping condition: (a) interior vertical furniture taking priority over the main wall; (b) sparse points at the entrance of a narrow space.
- Figure 9. Stop conditions during the model-growing procedure.
- Figure 10. Remaining parts and secondary seed points (red dots) extraction.
- Figure 11. An example of a 2D Manhattan model.
- Figure 12. The process of meaningful-parts detection: (a) occupancy map; (b) projected 2D model; (c) difference of (b) and (a); (d) meaningful parts.
- Figure 13. The process of extracting and modeling non-Manhattan intrusions using CMBR.
- Figure 14. 2D models: (a) initial 2D Manhattan model; (b) final 2D non-Manhattan model after intrusion removal.
- Figure 15. The ISPRS indoor modeling benchmark point clouds and the 3D models reconstructed by the proposed approach: (a) TUB1, (b) TUB2 first floor, (c) TUB2 second floor, (d) UoM, and (e) UVigo.
- Figure 16. Examples of the ability of the proposed method to deal with point cloud gaps caused by (a) open doors and (b,c) unscannable materials or obstacles.
- Figure 17. Results evaluation on the ISPRS benchmark datasets (b = 10 cm).
- Figure 18. Results comparison for TUB1: Bassier & Vergauwen [20]; Lim et al. [17]; Ahmed et al. [18]; Su et al. [19]; Tran & Khoshelham [2]; Maset et al. [33]; Previtali et al. [13]; Ochmann et al. [16]; Tran et al. [21].
- Figure 19. Results comparison for TUB2: Bassier & Vergauwen [20]; Lim et al. [17]; Ahmed et al. [18]; Su et al. [19]; Maset et al. [33]; Ochmann et al. [16]; Cui et al. [34]; Tran et al. [21].
- Figure 20. Results comparison for UoM: Bassier & Vergauwen [20]; Lim et al. [17]; Su et al. [19]; Tran & Khoshelham [2]; Maset et al. [33]; Ochmann et al. [16]; Tran et al. [21].
- Figure 21. Results comparison for UVigo: Bassier & Vergauwen [20]; Lim et al. [17]; Tran & Khoshelham [2]; Maset et al. [33]; Ochmann et al. [16]; Tran et al. [21].
Abstract
1. Introduction
2. Related Work
3. Methodology
3.1. Data Pre-Processing
3.2. Initial Seed Points Extraction
3.3. Model-Driven Modeling
Main Parts Modeling
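The figure captions describe the core of this step: a rectangle grows outward from a seed point while the number of points it encloses is monitored, and a sharp change in that histogram signals that the rectangle edge has reached a wall (the boundary of the room). The sketch below is only a minimal illustration of that idea, not the authors' implementation; the function names, the growth step, and the jump-ratio threshold are all assumptions.

```python
import numpy as np

def points_in_rect(pts, xmin, xmax, ymin, ymax):
    """Count 2D points falling inside an axis-aligned rectangle."""
    inside = (pts[:, 0] >= xmin) & (pts[:, 0] <= xmax) & \
             (pts[:, 1] >= ymin) & (pts[:, 1] <= ymax)
    return int(inside.sum())

def grow_until_wall(pts, seed, step=0.1, jump_ratio=5.0, max_iter=1000):
    """Expand a square around a seed point, recording the histogram of
    enclosed-point counts; stop when the count jumps sharply, which
    indicates the rectangle edge has swept into a dense wall."""
    x0, y0 = seed
    half = step
    counts = []
    for _ in range(max_iter):
        n = points_in_rect(pts, x0 - half, x0 + half, y0 - half, y0 + half)
        # A sharp relative increase over the previous count marks the wall.
        if counts and counts[-1] > 0 and n / counts[-1] > jump_ratio:
            break
        counts.append(n)
        half += step
    return half, counts
```

In practice the growth would be per-edge rather than symmetric, and the threshold would need tuning to point density; this only shows the histogram-based stop condition.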
3.4. Secondary Seed Points Extraction
3.5. Modeling Remaining Quadrilateral Parts
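One caption describes meaningful-parts detection as the difference between the projected 2D model and the occupancy map ("b - a"), with only the meaningful remnants kept. A rough sketch of that kind of boolean-grid differencing with a connected-component size filter follows; the grid representation, function names, and cell threshold are assumptions for illustration, not the paper's actual procedure.

```python
import numpy as np

def meaningful_parts(model, occupancy, min_cells=4):
    """Cells covered by the projected 2D model but empty in the
    occupancy map (model minus occupancy); 4-connected components
    smaller than min_cells cells are discarded as noise."""
    residual = model & ~occupancy
    h, w = residual.shape
    seen = np.zeros_like(residual, dtype=bool)
    parts = []
    for i in range(h):
        for j in range(w):
            if residual[i, j] and not seen[i, j]:
                # Flood-fill one connected component of residual cells.
                stack, comp = [(i, j)], []
                seen[i, j] = True
                while stack:
                    a, b = stack.pop()
                    comp.append((a, b))
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        na, nb = a + da, b + db
                        if 0 <= na < h and 0 <= nb < w and \
                                residual[na, nb] and not seen[na, nb]:
                            seen[na, nb] = True
                            stack.append((na, nb))
                if len(comp) >= min_cells:
                    parts.append(comp)
    return parts
```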
3.6. Model Refinement
3.7. 2D to 3D Extrusion
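The extrusion step turns the refined 2D floor plan into a 3D model. A minimal sketch of a prismatic extrusion, assuming a single floor and ceiling height per room (function and variable names are illustrative, not from the paper):

```python
def extrude_polygon(polygon, z_floor, z_ceiling):
    """Extrude a 2D polygon (list of (x, y) vertices) into a prism:
    a floor face, a ceiling face, and one quadrilateral wall face
    per polygon edge."""
    floor = [(x, y, z_floor) for x, y in polygon]
    ceiling = [(x, y, z_ceiling) for x, y in polygon]
    walls = []
    n = len(polygon)
    for i in range(n):
        j = (i + 1) % n  # wrap around to close the polygon
        walls.append([floor[i], floor[j], ceiling[j], ceiling[i]])
    return floor, ceiling, walls
```

For example, extruding a unit square from z = 0 to z = 3 yields four vertical wall quads plus the two horizontal faces.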
4. Experiment and Results
5. Conclusions and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Jung, J.; Hong, S.; Jeong, S.; Kim, S.; Cho, H.; Hong, S.; Heo, J. Productive Modeling for Development of As-Built BIM of Existing Indoor Structures. Autom. Constr. 2014, 42, 68–77.
2. Tran, H.; Khoshelham, K. Procedural Reconstruction of 3D Indoor Models from Lidar Data Using Reversible Jump Markov Chain Monte Carlo. Remote Sens. 2020, 12, 838.
3. Kang, Z.; Yang, J.; Yang, Z.; Cheng, S. A Review of Techniques for 3D Reconstruction of Indoor Environments. ISPRS Int. J. Geo-Inf. 2020, 9, 330.
4. Hübner, P.; Weinmann, M.; Wursthorn, S.; Hinz, S. Automatic Voxel-Based 3D Indoor Reconstruction and Room Partitioning from Triangle Meshes. ISPRS J. Photogramm. Remote Sens. 2021, 181, 254–278.
5. Gourguechon, C.; Macher, H.; Landes, T. Automation of As-Built BIM Creation from Point Cloud: An Overview of Research Works Focused on Indoor Environment. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2022, 43, 193–200.
6. Thomson, C.; Boehm, J. Automatic Geometry Generation from Point Clouds for BIM. Remote Sens. 2015, 7, 11753–11775.
7. Adan, A.; Huber, D. 3D Reconstruction of Interior Wall Surfaces under Occlusion and Clutter. In Proceedings of the 2011 International Conference on 3D Imaging, Modeling, Processing, Visualization and Transmission, Washington, DC, USA, 16–19 May 2011; pp. 275–281.
8. Jung, J.; Stachniss, C.; Ju, S.; Heo, J. Automated 3D Volumetric Reconstruction of Multiple-Room Building Interiors for As-Built BIM. Adv. Eng. Inform. 2018, 38, 811–825.
9. Zavar, H.; Arefi, H.; Malihi, S.; Maboudi, M. Topology-Aware 3D Modelling of Indoor Spaces from Point Clouds. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2021, XLIII-B4-2, 267–274.
10. Hong, S.; Jung, J.; Kim, S.; Cho, H.; Lee, J.; Heo, J. Semi-Automated Approach to Indoor Mapping for 3D As-Built Building Information Modeling. Comput. Environ. Urban Syst. 2015, 51, 34–46.
11. Mura, C.; Mattausch, O.; Villanueva, A.J.; Gobbetti, E.; Pajarola, R. Robust Reconstruction of Interior Building Structures with Multiple Rooms under Clutter and Occlusions. In Proceedings of the 2013 International Conference on Computer-Aided Design and Computer Graphics, Guangzhou, China, 16–18 November 2013; pp. 52–59.
12. Ochmann, S.; Vock, R.; Wessel, R.; Klein, R. Automatic Reconstruction of Parametric Building Models from Indoor Point Clouds. Comput. Graph. 2016, 54, 94–103.
13. Previtali, M.; Díaz-Vilariño, L.; Scaioni, M. Indoor Building Reconstruction from Occluded Point Clouds Using Graph-Cut and Ray-Tracing. Appl. Sci. 2018, 8, 1529.
14. Nikoohemat, S.; Diakité, A.A.; Zlatanova, S.; Vosselman, G. Indoor 3D Reconstruction from Point Clouds for Optimal Routing in Complex Buildings to Support Disaster Management. Autom. Constr. 2020, 113, 103109.
15. Macher, H.; Landes, T.; Grussenmeyer, P. From Point Clouds to Building Information Models: 3D Semi-Automatic Reconstruction of Indoors of Existing Buildings. Appl. Sci. 2017, 7, 1030.
16. Ochmann, S.; Vock, R.; Klein, R. Automatic Reconstruction of Fully Volumetric 3D Building Models from Oriented Point Clouds. ISPRS J. Photogramm. Remote Sens. 2019, 151, 251–262.
17. Lim, G.; Doh, N. Automatic Reconstruction of Multi-Level Indoor Spaces from Point Cloud and Trajectory. Sensors 2021, 21, 3493.
18. Shi, W.; Ahmed, W.; Li, N.; Fan, W.; Xiang, H.; Wang, M. Semantic Geometric Modelling of Unstructured Indoor Point Cloud. ISPRS Int. J. Geo-Inf. 2019, 8, 9.
19. Wang, S.; Cai, G.; Cheng, M.; Marcato Junior, J.; Huang, S.; Wang, Z.; Su, S.; Li, J. Robust 3D Reconstruction of Building Surfaces from Point Clouds Based on Structural and Closed Constraints. ISPRS J. Photogramm. Remote Sens. 2020, 170, 29–44.
20. Bassier, M.; Vergauwen, M. Topology Reconstruction of BIM Wall Objects from Point Cloud Data. Remote Sens. 2020, 12, 1800.
21. Tran, H.; Khoshelham, K.; Kealy, A.; Díaz-Vilariño, L. Shape Grammar Approach to 3D Modeling of Indoor Environments Using Point Clouds. J. Comput. Civ. Eng. 2019, 33, 04018055.
22. Kutzner, T.; Chaturvedi, K.; Kolbe, T.H. CityGML 3.0: New Functions Open Up New Applications. PFG-J. Photogramm. Remote Sens. Geoinf. Sci. 2020, 88, 43–61.
23. Liu, G.; Wei, S.; Zhong, S.; Huang, S.; Zhong, R. Reconstruction of Indoor Navigation Elements for Point Cloud of Buildings with Occlusions and Openings by Wall Segment Restoration from Indoor Context Labeling. Remote Sens. 2022, 14, 4275.
24. Jung, J.; Stachniss, C.; Kim, C. Automatic Room Segmentation of 3D Laser Data Using Morphological Processing. ISPRS Int. J. Geo-Inf. 2017, 6, 206.
25. Ambrus, R.; Claici, S.; Wendt, A. Automatic Room Segmentation from Unstructured 3-D Data of Indoor Environments. IEEE Robot. Autom. Lett. 2017, 2, 749–756.
26. Nikoohemat, S.; Peter, M.; Elberink, S.O.; Vosselman, G. Semantic Interpretation of Mobile Laser Scanner Point Clouds in Indoor Scenes Using Trajectories. Remote Sens. 2018, 10, 1754.
27. Tang, L.; Li, L.; Ying, S. A Full Level-of-Detail Specification for 3D Building Models Combining Indoor and Outdoor Scenes. ISPRS Int. J. Geo-Inf. 2018, 7, 419.
28. Khezrabad, M.A.; Zoej, M.J.V.; Safdarinezhad, A. A Method for Detection of Doors in Building Indoor Point Cloud through Multi-Layer Thresholding and Histogram Analysis. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2023, X-4/W1-202, 19–22.
29. Xiong, X.; Adan, A.; Akinci, B.; Huber, D. Automatic Creation of Semantically Rich 3D Building Models from Laser Scanner Data. Autom. Constr. 2013, 31, 325–337.
30. Khoshelham, K.; Díaz-Vilariño, L. 3D Modelling of Interior Spaces: Learning the Language of Indoor Architecture. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 40, 321–326.
31. Arefi, H.; Reinartz, P. Building Reconstruction Using DSM and Orthorectified Images. Remote Sens. 2013, 5, 1681–1703.
32. Khoshelham, K.; Tran, H.; Acharya, D.; Vilariño, L.D.; Kang, Z.; Dalyot, S. Results of the ISPRS Benchmark on Indoor Modelling. ISPRS Open J. Photogramm. Remote Sens. 2021, 2, 100008.
33. Maset, E.; Magri, L.; Fusiello, A. Improving Automatic Reconstruction of Interior Walls from Point Cloud Data. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 849–855.
34. Cui, Y.; Li, Q.; Yang, B.; Xiao, W.; Chen, C.; Dong, Z. Automatic 3-D Reconstruction of Indoor Environment with Mobile Laser Scanning Point Clouds. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 3117–3130.
| Dataset | Sensor | Number of Points (Million) | Mean Point Spacing (cm) | Sensor Trajectory | Clutter | Multi-Story | Manhattan World |
|---|---|---|---|---|---|---|---|
| TUB1 | Viametris IMS3D | 33.6 | 0.5 | Yes | Low | No | Yes |
| TUB2 | Zeb Revo | 21.6 | 0.8 | Yes | Low | Yes | Yes |
| UVigo | UVigo backpack | 14.9 | 1.0 | Yes | Moderate | No | Yes * |
| UoM | Zeb 1 | 13.9 | 0.7 | No | Moderate | No | Yes * |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Abdollahi, A.; Arefi, H.; Malihi, S.; Maboudi, M. Progressive Model-Driven Approach for 3D Modeling of Indoor Spaces. Sensors 2023, 23, 5934. https://doi.org/10.3390/s23135934