A Self-Assembly Portable Mobile Mapping System for Archeological Reconstruction Based on VSLAM-Photogrammetric Algorithm
Figure 1. (a) 3D printing process of the prototype case; (b) cameras A and B with their placement inside the case; and (c) the final portable mobile mapping system prototype.
Figure 2. In-field data capture. The white arrow indicates the direction of the user's movement, parallel to the monument façade, followed during this test.
Figure 3. General workflow of the algorithm implemented in C++ for the computations.
Figure 4. Divergent self-rotation in the (a) X-axis; and (b) Z-axis.
Figure 5. (a) Scheme of the data capture trajectories and (b) the areas covered by a frame at prototype–monument distances of 5, 12 and 20 m. The figure in (b) is a 3D model (mesh) generated with the software MeshLab [61] from the 20 m point cloud, made only for visualization purposes.
Figure 6. (a,c) Reference points spread over the two fronts of the monument. (b) Target model used in the test and (d) total station Pentax V-227N used to measure the network coordinates.
Figure 7. Evolution of the average errors and RMSEs for camera–monument distances of 5, 12 and 20 m. Results are shown for both systems: the prototype with the VSLAM-photogrammetric algorithm, and the Canon camera with Agisoft Metashape software.
Figure 8. Comparison of the point clouds produced by both systems (data capture distance of 12 m): (a) prototype and VSLAM-photogrammetric algorithm; and (b) Canon EOS 1300D camera with Agisoft Metashape software.
Figure 9. Point clouds obtained at the distances established in the experimental test: (a) 5 m; (b) 12 m; and (c) 20 m. The images show the central part of the colored point clouds resulting from the test; the point clouds have not been filtered or edited.
Abstract
1. Introduction
2. Materials and Methods
Workflow of the Proposed Algorithm for the Computation
3. Accuracy Assessment and Results
4. Conclusions
- The accuracy results of both methods are similar, as can be seen in Table 5. Although the average error is slightly higher with the proposed approach, its RMSE is slightly lower than that obtained with the Agisoft Metashape methodology. In other words, the point errors of the proposed approach are somewhat more dispersed around their mean, but this greater dispersion does not translate into a higher RMSE, which remains slightly below that of the Agisoft software.
- The processing time was slightly higher with the proposed approach for the distances of 5 and 12 m, but slightly lower for 20 m. In our opinion, these differences are not significant; they indicate that the proposed method optimizes the number of images extracted and the photogrammetric process, matching well-established procedures such as the use of a reflex camera with the Agisoft Metashape software.
- In our opinion, the greatest improvement occurred in the data capture in the field. The user does not need to worry about how to use the camera or where to take the pictures because, in the proposed approach, the capture is continuous and the system selects the images automatically, as explained in Section 2. The learning curve therefore changes significantly, since the user needs no previous knowledge of photography or photogrammetry. For this reason, the proposed approach significantly reduces the time spent in the field, as can be seen in Table 4.
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Remondino, F.; El-Hakim, S. Image-Based 3D Modelling: A Review. Photogramm. Rec. 2006, 21, 269–291. [Google Scholar] [CrossRef]
- Raza, K.; Khan, T.A.; Abbas, N. Kinematic Analysis and Geometrical Improvement of an Industrial Robotic Arm. J. King Saud Univ. Eng. Sci. 2018, 30, 218–223. [Google Scholar] [CrossRef]
- Rayna, T.; Striukova, L. From Rapid Prototyping to Home Fabrication: How 3D Printing Is Changing Business Model Innovation. Technol. Forecast. Soc. Chang. 2016, 102, 214–224. [Google Scholar] [CrossRef]
- Wu, P.; Wang, J.; Wang, X. A Critical Review of the Use of 3-D Printing in the Construction Industry. Autom. Constr. 2016, 68, 21–31. [Google Scholar] [CrossRef]
- Tay, Y.W.D.; Panda, B.; Paul, S.C.; Mohamed, N.A.N.; Tan, M.J.; Leong, K.F. 3D Printing Trends in Building and Construction Industry: A Review. Virtual Phys. Prototyp. 2017, 12, 261–276. [Google Scholar] [CrossRef]
- Yan, Q.; Dong, H.; Su, J.; Han, J.; Song, B.; Wei, Q.; Shi, Y. A Review of 3D Printing Technology for Medical Applications. Engineering 2018, 4, 729–742. [Google Scholar] [CrossRef]
- Moruno, L.; Rodríguez Salgado, D.; Sánchez-Ríos, A.; González, A.G. An Ergonomic Customized-Tool Handle Design for Precision Tools Using Additive Manufacturing: A Case Study. Appl. Sci. 2018, 8, 1200. [Google Scholar] [CrossRef]
- Liang, X.; Wang, Y.; Jaakkola, A.; Kukko, A.; Kaartinen, H.; Hyyppä, J.; Honkavaara, E.; Liu, J. Forest Data Collection Using Terrestrial Image-Based Point Clouds from a Handheld Camera Compared to Terrestrial and Personal Laser Scanning. IEEE Trans. Geosci. Remote Sens. 2015, 53. [Google Scholar] [CrossRef]
- Behmann, J.; Mahlein, A.-K.; Paulus, S.; Kuhlmann, H.; Oerke, E.-C.; Plümer, L. Calibration of Hyperspectral Close-Range Pushbroom Cameras for Plant Phenotyping. ISPRS J. Photogramm. Remote Sens. 2015, 106, 172–182. [Google Scholar] [CrossRef]
- Abellán, A.; Oppikofer, T.; Jaboyedoff, M.; Rosser, N.J.; Lim, M.; Lato, M.J. Terrestrial Laser Scanning of Rock Slope Instabilities. Earth Surf. Process. Landf. 2014, 39, 80–97. [Google Scholar] [CrossRef]
- Ghuffar, S.; Székely, B.; Roncat, A.; Pfeifer, N. Landslide Displacement Monitoring Using 3D Range Flow on Airborne and Terrestrial LiDAR Data. Remote Sens. 2013, 5, 2720–2745. [Google Scholar] [CrossRef]
- Lotsari, E.; Wang, Y.; Kaartinen, H.; Jaakkola, A.; Kukko, A.; Vaaja, M.; Hyyppä, H.; Hyyppä, J.; Alho, P. Gravel Transport by Ice in a Subarctic River from Accurate Laser Scanning. Geomorphology 2015, 246, 113–122. [Google Scholar] [CrossRef]
- Harpold, A.; Marshall, J.; Lyon, S.; Barnhart, T.; Fisher, A.B.; Donovan, M.; Brubaker, K.; Crosby, C.; Glenn, F.N.; Glennie, C.; et al. Laser Vision: Lidar as a Transformative Tool to Advance Critical Zone Science. Hydrol. Earth Syst. Sci. 2015, 19, 2881–2897. [Google Scholar] [CrossRef]
- Cacciari, I.; Nieri, P.; Siano, S. 3D Digital Microscopy for Characterizing Punchworks on Medieval Panel Paintings. J. Comput. Cult. Herit. 2014, 7, 19. [Google Scholar] [CrossRef]
- Jaklič, A.; Erič, M.; Mihajlović, I.; Stopinšek, Ž.; Solina, F. Volumetric Models from 3D Point Clouds: The Case Study of Sarcophagi Cargo from a 2nd/3rd Century AD Roman Shipwreck near Sutivan on Island Brač, Croatia. J. Archaeol. Sci. 2015, 62, 143–152. [Google Scholar] [CrossRef]
- Camburn, B.; Viswanathan, V.; Linsey, J.; Anderson, D.; Jensen, D.; Crawford, R.; Otto, K.; Wood, K. Design Prototyping Methods: State of the Art in Strategies, Techniques, and Guidelines. Des. Sci. 2017, 3, 1–33. [Google Scholar] [CrossRef]
- Luhmann, T.; Robson, S.; Kyle, S.; Harley, I. Close Range Photogrammetry: Principles, Techniques and Applications; Whittles Publishing: Dunbeath, UK, 2006. [Google Scholar]
- Ciarfuglia, T.A.; Costante, G.; Valigi, P.; Ricci, E. Evaluation of Non-Geometric Methods for Visual Odometry. Robot. Auton. Syst. 2014, 62, 1717–1730. [Google Scholar] [CrossRef]
- Yousif, K.; Bab-Hadiashar, A.; Hoseinnezhad, R. An Overview to Visual Odometry and Visual SLAM: Applications to Mobile Robotics. Intell. Ind. Syst. 2015, 1, 289–311. [Google Scholar] [CrossRef]
- Strobl, K.H.; Mair, E.; Bodenmüller, T.; Kielhöfer, S.; Wüsthoff, T.; Suppa, M. Portable 3-D Modeling Using Visual Pose Tracking. Comput. Ind. 2018, 99, 53–68. [Google Scholar] [CrossRef]
- Kim, P.; Chen, J.; Cho, Y. SLAM-Driven Robotic Mapping and Registration of 3D Point Clouds. Autom. Constr. 2018, 89, 38–48. [Google Scholar] [CrossRef]
- Balsa-Barreiro, J.; Fritsch, D. Generation of Visually Aesthetic and Detailed 3D Models of Historical Cities by Using Laser Scanning and Digital Photogrammetry. Digit. Appl. Archaeol. Cult. Herit. 2018, 8, 57–64. [Google Scholar] [CrossRef]
- Balsa-Barreiro, J.; Fritsch, D. Generation of 3D/4D Photorealistic Building Models. The Testbed Area for 4D Cultural Heritage World Project: The Historical Center of Calw (Germany). In Proceedings of the International Symposium on Visual Computing, Las Vegas, NV, USA, 14–16 December 2015; pp. 361–372. [Google Scholar] [CrossRef]
- Dupuis, J.; Paulus, S.; Behmann, J.; Plümer, L.; Kuhlmann, H. A Multi-Resolution Approach for an Automated Fusion of Different Low-Cost 3D Sensors. Sensors 2014, 14, 7563–7579. [Google Scholar] [CrossRef]
- Sirmacek, B.; Lindenbergh, R. Accuracy Assessment of Building Point Clouds Automatically Generated from iPhone Images. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 45, 547–552. [Google Scholar] [CrossRef]
- Lachat, E.; Macher, H.; Landes, T.; Grussenmeyer, P. Assessment and Calibration of a RGB-D Camera (Kinect v2 Sensor) Towards a Potential Use for Close-Range 3D Modeling. Remote Sens. 2015, 7, 13070–13097. [Google Scholar] [CrossRef]
- Sánchez, A.; Gómez, J.M.; Jiménez, A.; González, A.G. Analysis of Uncertainty in a Middle-Cost Device for 3D Measurements in BIM Perspective. Sensors 2016, 16, 1557–1574. [Google Scholar] [CrossRef]
- Zlot, R.; Bosse, M.; Greenop, K.; Jarzab, Z.; Juckes, E.; Roberts, J. Efficiently Capturing Large, Complex Cultural Heritage Sites with a Handheld Mobile 3D Laser Mapping System. J. Cult. Herit. 2014, 15, 670–678. [Google Scholar] [CrossRef]
- Pollefeys, M.; Nistér, D.; Frahm, J.-M.; Akbarzadeh, A.; Mordohai, P.; Clipp, B.; Engels, C.; Gallup, D.; Kim, S.-J.; Merrell, P.; et al. Detailed Real-Time Urban 3D Reconstruction from Video. Int. J. Comput. Vis. 2008, 78, 143–167. [Google Scholar] [CrossRef]
- Zingoni, A.; Diani, M.; Corsini, G.; Masini, A. Real-Time 3D Reconstruction from Images Taken from an UAV. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 313–319. [Google Scholar] [CrossRef]
- Sapirstein, P. Accurate Measurement with Photogrammetry at Large Sites. J. Archaeol. Sci. 2016, 66, 137–145. [Google Scholar] [CrossRef]
- O’Driscoll, J. Landscape Applications of Photogrammetry Using Unmanned Aerial Vehicles. J. Archaeol. Sci. Rep. 2018, 22, 32–44. [Google Scholar] [CrossRef]
- Campi, M.; di Luggo, A.; Monaco, S.; Siconolfi, M.; Palomba, D. Indoor and Outdoor Mobile Mapping Systems for Architectural Surveys. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 201–208. [Google Scholar] [CrossRef]
- Petrie, G. Mobile Mapping Systems: An Introduction to the Technology. Geoinformatics 2010, 13, 32–43. [Google Scholar]
- Kopsida, M.; Brilakis, I.; Vela, P.A. A Review of Automated Construction Progress Monitoring and Inspection Methods. In Proceedings of the 32nd CIB W78 Conference, Eindhoven, The Netherlands, 27–29 October 2015. [Google Scholar]
- Omar, T.; Nehdi, M.L. Data Acquisition Technologies for Construction Progress Tracking. Autom. Constr. 2016, 70, 143–155. [Google Scholar] [CrossRef]
- Dai, F.; Rashidi, A.; Brilakis, I.; Vela, P. Comparison of Image-Based and Time-of-Flight-Based Technologies for Three-Dimensional Reconstruction of Infrastructure. J. Constr. Eng. Manag. 2013, 139, 69–79. [Google Scholar] [CrossRef]
- El-Omari, S.; Moselhi, O. Integrating 3D Laser Scanning and Photogrammetry for Progress Measurement of Construction Work. Autom. Constr. 2008, 18, 1–9. [Google Scholar] [CrossRef]
- Rebolj, D.; Pučko, Z.; Babič, N.Č.; Bizjak, M.; Mongus, D. Point Cloud Quality Requirements for Scan-vs-BIM Based Automated Construction Progress Monitoring. Autom. Constr. 2017, 84, 323–334. [Google Scholar] [CrossRef]
- Wu, P. Integrated Building Information Modelling; Li, H., Wang, X., Eds.; Bentham Science Publishers: Sharjah, UAE, 2017. [Google Scholar] [CrossRef]
- U.S. General Services Administration, Public Buildings Service. GSA Building Information Modeling Guide Series: 03—GSA BIM Guide for 3D Imaging; General Services Administration: Washington, DC, USA, 2019.
- Akca, D.; Freeman, M.; Sargent, I.; Gruen, A. Quality Assessment of 3D Building Data: Quality Assessment of 3D Building Data. Photogramm. Rec. 2010, 25, 339–355. [Google Scholar] [CrossRef]
- Tran, H.; Khoshelham, K.; Kealy, A. Geometric Comparison and Quality Evaluation of 3D Models of Indoor Environments. ISPRS J. Photogramm. Remote Sens. 2019, 149, 29–39. [Google Scholar] [CrossRef]
- Zhang, C.; Kalasapudi, V.S.; Tang, P. Rapid Data Quality Oriented Laser Scan Planning for Dynamic Construction Environments. Adv. Eng. Inf. 2016, 30, 218–232. [Google Scholar] [CrossRef]
- Tang, P.; Alaswad, F.S. Sensor Modeling of Laser Scanners for Automated Scan Planning on Construction Jobsites. In Construction Research Congress 2012; American Society of Civil Engineers: West Lafayette, IN, USA, 2012; pp. 1021–1031. [Google Scholar] [CrossRef]
- Soudarissanane, S.; Lindenbergh, R.; Menenti, M.; Teunissen, P. Scanning Geometry: Influencing Factor on the Quality of Terrestrial Laser Scanning Points. ISPRS J. Photogramm. Remote Sens. 2011, 66, 389–399. [Google Scholar] [CrossRef]
- Shanoer, M.M.; Abed, F.M. Evaluate 3D Laser Point Clouds Registration for Cultural Heritage Documentation. Egypt. J. Remote Sens. Space Sci. 2018, 21, 295–304. [Google Scholar] [CrossRef]
- Zhang, Z. A Flexible New Technique for Camera Calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334. [Google Scholar] [CrossRef]
- Rublee, E.; Rabaud, V.; Konolige, K.; Bradski, G. ORB: An Efficient Alternative to SIFT or SURF. In Proceedings of the 2011 International Conference on Computer Vision, Barcelona, Spain, 6–13 November 2011; pp. 2564–2571. [Google Scholar] [CrossRef]
- Mur-Artal, R.; Montiel, J.M.M.; Tardós, J.D. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Trans. Robot. 2015, 31, 1147–1163. [Google Scholar] [CrossRef]
- Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision, 2nd ed.; Cambridge University Press: Cambridge, UK, 2004. [Google Scholar] [CrossRef]
- Stewénius, H.; Engels, C.; Nistér, D. Recent Developments on Direct Relative Orientation. ISPRS J. Photogramm. Remote Sens. 2006, 60, 284–294. [Google Scholar] [CrossRef]
- Pierrot Deseilligny, M.; Clery, I. Apero, an Open Source Bundle Adjusment Software for Automatic Calibration and Orientation of Set of Images. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 269–277. [Google Scholar] [CrossRef]
- Georgantas, A.; Brédif, M.; Pierrot-Desseilligny, M. An Accuracy Assessment of Automated Photogrammetric Techniques for 3D Modeling of Complex Interiors. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 39, 23–28. [Google Scholar] [CrossRef]
- Cerrillo-Cuenca, E.; Ortiz-Coder, P.; Martínez-del-Pozo, J.-Á. Computer Vision Methods and Rock Art: Towards a Digital Detection of Pigments. Archaeol. Anthropol. Sci. 2014, 6, 227–239. [Google Scholar] [CrossRef]
- Triggs, B.; McLauchlan, P.; Hartley, R.; Fitzgibbon, A. Bundle Adjustment—A Modern Synthesis. In Proceedings of the International Workshop on Vision Algorithms, Corfu, Greece, 21–22 September 1999; pp. 298–372. [Google Scholar]
- Fischler, M.A.; Bolles, R.C. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Commun. ACM 1981, 24, 381–395. [Google Scholar] [CrossRef]
- Rupnik, E.; Daakir, M.; Pierrot Deseilligny, M. MicMac—A Free, Open-Source Solution for Photogrammetry. Open Geospat. Data Softw. Stand. 2017, 2, 14. [Google Scholar] [CrossRef]
- Deseilligny, M.; Paparoditis, N. A Multiresolution and Optimization-Based Image Matching Approach: An Application to Surface Reconstruction from SPOT5-HRS Stereo Imagery. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 36, 1–5. [Google Scholar]
- Agisoft Metashape. Available online: https://www.agisoft.com/ (accessed on 28 August 2019).
- Meshlab. Available online: http://www.meshlab.net/ (accessed on 27 May 2015).
- Hong, S.; Jung, J.; Kim, S.; Cho, H.; Lee, J.; Heo, J. Semi-Automated Approach to Indoor Mapping for 3D as-Built Building Information Modeling. Comput. Environ. Urban Syst. 2015, 51, 34–46. [Google Scholar] [CrossRef]
- Koutsoudis, A.; Vidmar, B.; Ioannakis, G.; Arnaoutoglou, F.; Pavlidis, G.; Chamzas, C. Multi-Image 3D Reconstruction Data Evaluation. J. Cult. Herit. 2014, 15, 73–79. [Google Scholar] [CrossRef]
Model | Resolution (Pixels) | Megapixels | Pixel Size (µm) | Frame Rate (fps) | Sensor | Sensor Size | A/D (bit) |
---|---|---|---|---|---|---|---|
DFK 42AUC03 | 1280 × 960 | 1.2 | 3.75 | 25 | Aptina MT9M021 C | 1/3" CMOS | 8 |
DFK 33UX264 | 2448 × 2048 | 5 | 3.45 | 38 | Sony IMX264 | 2/3" CMOS | 8/12 |
Model | Focal Length (mm) | Iris Range | Angle of view (H × V) |
---|---|---|---|
TIS-TBL 2.1 C | 2.1 | 2 | 97° × 81.2° |
Fujinon HF6XA–5M | 6 | 1.9–16 | 74.7° × 58.1° |
Level of Detail (LOD) | Level of Accuracy (LOA, Tolerance, mm) | Resolution (mm × mm) | Areas of Interest (Coordinate Frame, c. f.) |
---|---|---|---|
Level 1 | ±51 | 152 × 152 | Total Project area (Local or State c. f.) |
Level 2 | ±13 | 25 × 25 | e.g., building (local or project c. f.) |
Level 3 | ±6 | 13 × 13 | e.g., floor level (project or instrument c. f.) |
Level 4 | ±3 | 13 × 13 | e.g., room or artifact (instrument c. f.) |
System | Data Capture Distance (m) | Data Capture Time (min) | Processing Time (min) |
---|---|---|---|
Prototype and Visual SLAM (VSLAM)-Photogrammetric Algorithm | 5 | 4.25 | 80 |
 | 12 | 4.53 | 85 |
 | 20 | 4.65 | 99 |
Canon Camera and Agisoft Metashape Software | 5 | 7.83 | 72 |
 | 12 | 9.08 | 80 |
 | 20 | 9.50 | 89 |
Methodology | Distance | Statistic | Error Vector X (mm) | Error Vector Y (mm) | Error Vector Z (mm) | Error (mm) |
---|---|---|---|---|---|---|
Prototype and VSLAM-Photogrammetric Algorithm | 5 m | Average error | | | | 12 |
 | 5 m | RMSE | 5 | 8 | 8 | 8 |
 | 12 m | Average error | | | | 26 |
 | 12 m | RMSE | 21 | 16 | 10 | 16 |
 | 20 m | Average error | | | | 46 |
 | 20 m | RMSE | 32 | 30 | 38 | 33 |
Canon Camera and Agisoft Metashape Software | 5 m | Average error | | | | 11 |
 | 5 m | RMSE | 4 | 9 | 8 | 12 |
 | 12 m | Average error | | | | 23 |
 | 12 m | RMSE | 12 | 18 | 17 | 28 |
 | 20 m | Average error | | | | 35 |
 | 20 m | RMSE | 18 | 24 | 24 | 39 |
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Ortiz-Coder, P.; Sánchez-Ríos, A. A Self-Assembly Portable Mobile Mapping System for Archeological Reconstruction Based on VSLAM-Photogrammetric Algorithm. Sensors 2019, 19, 3952. https://doi.org/10.3390/s19183952