Automated Aircraft Dent Inspection via a Modified Fourier Transform Profilometry Algorithm
<p><b>Figure 1.</b> System configuration: the camera and projector are freely positioned (optical axes in solid blue). Without loss of generality, the camera is placed at the world origin and a virtual reference plane <math display="inline"><semantics> <mi mathvariant="script">R</mi> </semantics></math> lies in front of the camera.</p>
<p><b>Figure 2.</b> A fringe pattern with period <math display="inline"><semantics> <msub> <mi>T</mi> <mi>p</mi> </msub> </semantics></math>, with a central red-coloured stripe.</p>
<p><b>Figure 3.</b> Magnitude of the FFT of both the reference (dotted red) and object (solid blue) images along the middle row for a real dent sample. The spectrum around the fundamental frequency (solid vertical) is isolated with the automatic band-pass filter (dotted vertical).</p>
<p><b>Figure 4.</b> Virtually built reference image with sinusoidal fringes. Perspective affects the frequency components.</p>
<p><b>Figure 5.</b> (<b>Top</b>) Illustrative reference <math display="inline"><semantics> <msub> <mi>Φ</mi> <mn>0</mn> </msub> </semantics></math> (blue) and object <math display="inline"><semantics> <mi>Φ</mi> </semantics></math> (red) phases. (<b>Middle</b>) The wrapped object phase <math display="inline"><semantics> <msup> <mi>Φ</mi> <mi>w</mi> </msup> </semantics></math> presents many more <math display="inline"><semantics> <mrow> <mn>2</mn> <mi>π</mi> </mrow> </semantics></math>-jumps compared to the (<b>Bottom</b>) wrapped phase shift <math display="inline"><semantics> <mrow> <mo>Δ</mo> <msup> <mi>Φ</mi> <mi>w</mi> </msup> </mrow> </semantics></math>.</p>
<p><b>Figure 6.</b> (<b>Top</b>) The wrapped object phase <math display="inline"><semantics> <msup> <mi>Φ</mi> <mi>w</mi> </msup> </semantics></math> and (<b>Bottom</b>) wrapped phase shift <math display="inline"><semantics> <mrow> <mo>Δ</mo> <msup> <mi>Φ</mi> <mi>w</mi> </msup> </mrow> </semantics></math> with the same white noise (<math display="inline"><semantics> <mrow> <mi>σ</mi> <mo>=</mo> <mn>0.6</mn> </mrow> </semantics></math>) added.</p>
<p><b>Figure 7.</b> The unwrapped object phase <math display="inline"><semantics> <mrow> <msup> <mover accent="true"> <mi>Φ</mi> <mo>˜</mo> </mover> <mi>r</mi> </msup> <mrow> <mo>(</mo> <msub> <mi>u</mi> <mi>c</mi> </msub> <mo>,</mo> <msub> <mi>v</mi> <mi>c</mi> </msub> <mo>)</mo> </mrow> </mrow> </semantics></math> using (<b>top</b>) Equation (<a href="#FD11-sensors-22-00433" class="html-disp-formula">11</a>) and (<b>bottom</b>) Equation (<a href="#FD13-sensors-22-00433" class="html-disp-formula">13</a>) with respect to the original phase (dashed blue). In the first case, the noise causes the unwrapping algorithm to discard a genuine jump right after <math display="inline"><semantics> <mrow> <msub> <mi>u</mi> <mi>c</mi> </msub> <mo>=</mo> <mn>20</mn> </mrow> </semantics></math>, introducing an error. This does not happen in the second case, owing to the reduced presence of <math display="inline"><semantics> <mrow> <mn>2</mn> <mi>π</mi> </mrow> </semantics></math>-jumps.</p>
<p><b>Figure 8.</b> Section of the reference dent function. The maximum depth is <math display="inline"><semantics> <mrow> <mn>0.368</mn> </mrow> </semantics></math>, and just <math display="inline"><semantics> <mrow> <mn>0.005</mn> </mrow> </semantics></math> at <math display="inline"><semantics> <mrow> <mi>x</mi> <mo>=</mo> <mn>0.9</mn> </mrow> </semantics></math>.</p>
<p><b>Figure 9.</b> Compared to the (<b>a</b>) object phase <math display="inline"><semantics> <msup> <mi>Φ</mi> <mi>w</mi> </msup> </semantics></math>, the slower trend of the (<b>b</b>) phase shift <math display="inline"><semantics> <mrow> <mo>Δ</mo> <msup> <mi>Φ</mi> <mi>w</mi> </msup> </mrow> </semantics></math> is clearly visible in a simulated dent with added white noise (<math display="inline"><semantics> <mrow> <mi>σ</mi> <mo>=</mo> <mn>0.6</mn> </mrow> </semantics></math>).</p>
<p><b>Figure 10.</b> (<b>Top</b>) Wrapped phase <math display="inline"><semantics> <msup> <mi>Φ</mi> <mi>w</mi> </msup> </semantics></math> and (<b>Bottom</b>) phase shift <math display="inline"><semantics> <mrow> <mo>Δ</mo> <msup> <mi>Φ</mi> <mi>w</mi> </msup> </mrow> </semantics></math> at the middle row. Unwrapping of <math display="inline"><semantics> <msup> <mi>Φ</mi> <mi>w</mi> </msup> </semantics></math> has to handle many more <math display="inline"><semantics> <mrow> <mn>2</mn> <mi>π</mi> </mrow> </semantics></math>-jumps.</p>
<p><b>Figure 11.</b> Unwrapping of the simulated dent signal with added white noise (<math display="inline"><semantics> <mrow> <mi>σ</mi> <mo>=</mo> <mn>0.6</mn> </mrow> </semantics></math>) using Itoh’s algorithm. Detail of one of the several genuine <math display="inline"><semantics> <mrow> <mn>2</mn> <mi>π</mi> </mrow> </semantics></math>-jumps missed using Equation (<a href="#FD11-sensors-22-00433" class="html-disp-formula">11</a>) (<b>top</b>) but not using Equation (<a href="#FD13-sensors-22-00433" class="html-disp-formula">13</a>) (<b>bottom</b>).</p>
<p><b>Figure 12.</b> Comparison of 3D reconstruction of the simulated dent (as heatmap of MAE) using Itoh’s algorithm with the (<b>a</b>) direct mapping and the (<b>b</b>) virtual reference methods.</p>
<p><b>Figure 13.</b> Comparison of 3D reconstruction of the simulated dent (as heatmap of MAE) using Estrada’s noise-robust algorithm (<math display="inline"><semantics> <mrow> <mi>τ</mi> <mo>=</mo> <mn>0.8</mn> </mrow> </semantics></math>) with the (<b>a</b>) direct mapping method and the (<b>b</b>) proposed virtual reference method.</p>
<p><b>Figure 14.</b> Input images for sample A. (<b>a</b>) Captured object image. Red values are converted to greyscale before the FFT. (<b>b</b>) Virtual reference image.</p>
<p><b>Figure 15.</b> Calculated (solid) and ground truth (dashed) depth of sample A (<b>top</b>) and error (<b>bottom</b>) along the middle section, using direct mapping (<b>left</b>) and the proposed virtual reference method (<b>right</b>). The overall MAE was <math display="inline"><semantics> <mrow> <mn>0.158</mn> <mrow> <mo> </mo> <mi mathvariant="normal">m</mi> <mi mathvariant="normal">m</mi> </mrow> </mrow> </semantics></math> and <math display="inline"><semantics> <mrow> <mn>0.126</mn> <mrow> <mo> </mo> <mi mathvariant="normal">m</mi> <mi mathvariant="normal">m</mi> </mrow> </mrow> </semantics></math>, respectively.</p>
<p><b>Figure 16.</b> Comparison of 3D reconstruction for sample C (as heatmap of MAE) using Estrada’s unwrapping algorithm with the (<b>a</b>) direct mapping and the (<b>b</b>) proposed virtual reference methods. Blue areas correspond to lower error than green ones. The border artefacts of the Fourier transform are visible as high-error areas (yellow and red).</p>
<p><b>Figure 17.</b> Comparison of 3D reconstruction for sample C acquired during movement using Estrada’s unwrapping algorithm with the (<b>a</b>) direct mapping and the (<b>b</b>) proposed virtual reference methods. Blue areas correspond to lower error than green ones. The border artefacts of the Fourier transform are visible as high-error areas (yellow and red).</p>
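The unwrapping behaviour illustrated in the figures above can be reproduced with a minimal sketch of Itoh's 1-D algorithm, which integrates wrapped phase differences (equivalent to `numpy.unwrap`). This is an illustrative assumption-laden sketch, not the paper's implementation: `itoh_unwrap` is a hypothetical helper name and the phase ramp below is synthetic.

```python
import numpy as np

def itoh_unwrap(wrapped):
    """1-D Itoh phase unwrapping: integrate wrapped phase differences.

    Equivalent to numpy.unwrap. A noise spike that pushes a single
    difference past pi is misread as a genuine 2*pi-jump (or masks one),
    which is why unwrapping the slowly varying phase *shift*, with far
    fewer jumps, is more robust than unwrapping the object phase itself.
    """
    d = np.diff(wrapped)
    # Wrap each difference back into (-pi, pi].
    d_wrapped = (d + np.pi) % (2 * np.pi) - np.pi
    return np.concatenate(([wrapped[0]], wrapped[0] + np.cumsum(d_wrapped)))

# A noise-free steep ramp (like the object phase) wraps many times but is
# still recovered exactly, because every true difference stays below pi.
true_phase = np.linspace(0, 12 * np.pi, 400)
wrapped = (true_phase + np.pi) % (2 * np.pi) - np.pi
recovered = itoh_unwrap(wrapped)
```

Adding white noise with a standard deviation comparable to the figures (σ = 0.6) makes some single-sample differences exceed π, and the recovery fails exactly as shown for Equation (11).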
Abstract
1. Introduction
- Automatic band-pass estimation (Section 2.2), eliminating user interaction in the filtering process.
- Reduction of errors in the phase unwrapping process via a virtual reference image (Section 2.4).
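The band-pass idea behind the first contribution can be sketched on a single image row: locate the fundamental lobe in the FFT, keep a band around it, and take the angle of the inverse transform. This is a simplified illustration of classic Fourier transform profilometry; the band-sizing heuristic (half the fundamental-to-DC distance) and the names `extract_phase`/`band_halfwidth` are assumptions, not the paper's exact automatic criterion.

```python
import numpy as np

def extract_phase(fringe_row, band_halfwidth=None):
    """Estimate the wrapped phase of a 1-D fringe signal via FFT filtering."""
    n = len(fringe_row)
    spectrum = np.fft.fft(fringe_row)
    # Locate the fundamental: strongest positive-frequency bin (skip DC).
    f0 = 1 + int(np.argmax(np.abs(spectrum[1:n // 2])))
    if band_halfwidth is None:
        band_halfwidth = max(1, f0 // 2)  # assumed heuristic, not the paper's
    # Band-pass: keep only the lobe around +f0 (single-sided filter).
    mask = np.zeros(n)
    lo, hi = max(1, f0 - band_halfwidth), min(n // 2, f0 + band_halfwidth + 1)
    mask[lo:hi] = 1.0
    analytic = np.fft.ifft(spectrum * mask)
    return np.angle(analytic)  # wrapped phase in (-pi, pi]

# Example: recover the phase of a synthetic fringe with 8 periods.
x = np.arange(256)
row = 0.5 + 0.4 * np.cos(2 * np.pi * 8 * x / 256 + 0.3)
phase = extract_phase(row)
```

For this noise-free sinusoid the filter isolates the single fundamental bin, so the recovered wrapped phase matches the true phase up to floating-point error.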
Structured-Light Codification Strategies
2. Proposed Method
2.1. Notation and System Geometry
2.2. Automatic Band-Pass Filter
2.3. Virtual Reference Image
2.4. Reduction of Noise Effect over Phase Jumps
2.5. Stereo Correspondence
3. Computer Simulations
4. Experiments
4.1. Static Scenario
4.2. Dynamic Scenario
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Conflicts of Interest
References
- Latorella, K.; Prabhu, P. A review of human error in aviation maintenance and inspection. Int. J. Ind. Ergon. 2000, 26, 133–161.
- Papa, U.; Ponte, S. Preliminary design of an unmanned aircraft system for aircraft general visual inspection. Electronics 2018, 7, 435.
- Airbus Launches Advanced Indoor Inspection Drone. Available online: https://www.airbus.com/en/newsroom/press-releases/2018-04-airbus-launches-advanced-indoor-inspection-drone-to-reduce-aircraft (accessed on 19 December 2021).
- Drones Finally Approved For Lightning Checks. Available online: https://aviationweek.com/mro/drones-finally-approved-lightning-checks-dent-measurement-next (accessed on 19 December 2021).
- Lafiosca, P.; Fan, I.S. Review of non-contact methods for automated aircraft inspections. Insight-Non-Destr. Test. Cond. Monit. 2020, 62, 692–701.
- Deane, S.; Avdelidis, N.P.; Ibarra-Castanedo, C.; Zhang, H.; Nezhad, H.Y.; Williamson, A.A.; Mackley, T.; Davis, M.J.; Maldague, X.; Tsourdos, A. Application of NDT thermographic imaging of aerospace structures. Infrared Phys. Technol. 2019, 97, 456–466.
- The Boeing Company. Structural Repair Manual B737-400; The Boeing Company: Chicago, IL, USA, 2015.
- Civil Aviation Authority. CAP 716—Aviation Maintenance Human Factors; Civil Aviation Authority: Crawley, UK, 2003.
- Civil Aviation Authority. Paper 2013/03—Reliability of Damage Detection in Advanced Composite Aircraft Structures; Civil Aviation Authority: Crawley, UK, 2013.
- Jovančević, I.; Pham, H.H.; Orteu, J.J.; Gilblas, R.; Harvent, J.; Maurice, X.; Brèthes, L. 3D point cloud analysis for detection and characterization of defects on airplane exterior surface. J. Nondestruct. Eval. 2017, 36, 1–17.
- Doğru, A.; Bouarfa, S.; Arizar, R.; Aydoğan, R. Using convolutional neural networks to automate aircraft maintenance visual inspection. Aerospace 2020, 7, 171.
- Guehring, J. Dense 3-D surface acquisition by structured light using off-the-shelf components. In Proceedings of the SPIE—The International Society for Optical Engineering, San Jose, CA, USA, 21–24 January 2001.
- Salvi, J.; Pages, J.; Batlle, J. Pattern codification strategies in structured light systems. Pattern Recognit. 2004, 37, 827–849.
- Zuo, C.; Tao, T.; Feng, S.; Huang, L.; Asundi, A.; Chen, Q. Micro Fourier Transform Profilometry (μFTP): 3D shape measurement at 10,000 frames per second. Opt. Lasers Eng. 2017, 102, 70–91.
- An, Y.; Hyun, J.S.; Zhang, S. Pixel-wise absolute phase unwrapping using geometric constraints of structured light system. Opt. Express 2016, 24, 18445–18459.
- Chen, Y.; Song, B.; He, R.; Hu, H.; Chen, S. Fast two-step phase-shifting method for measuring the three-dimensional contour of objects. Opt. Eng. 2021, 60, 94104.
- Zhang, Z. Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques. Opt. Lasers Eng. 2012, 50, 1097–1106.
- Hügli, H.; Maître, G. Generation and Use of Color Pseudo Random Sequences for Coding Structured Light in Active Ranging. In Proceedings of the 1988 International Congress on Optical Science and Engineering, Hamburg, Germany, 19–23 September 1988.
- Vuylsteke, P.; Oosterlinck, A. Range image acquisition with a single binary-encoded light pattern. IEEE Trans. Pattern Anal. Mach. Intell. 1990, 12, 148–164.
- Ulusoy, A.; Calakli, F.; Taubin, G. One-shot scanning using De Bruijn spaced grids. In Proceedings of the 2009 IEEE 12th International Conference on Computer Vision Workshops, Kyoto, Japan, 27 September–4 October 2009.
- Zhang, L.; Curless, B.; Seitz, S. Rapid Shape Acquisition Using Color Structured Light and Multi-pass Dynamic Programming. In Proceedings of the First International Symposium on 3D Data Processing Visualization and Transmission, Padova, Italy, 19–21 June 2002.
- Takeda, M.; Mutoh, K. Fourier transform profilometry for the automatic measurement of 3-D object shapes. Appl. Opt. 1983, 22, 3977–3982.
- Salvi, J.; Fernandez, S.; Pribanic, T.; Llado, X. A state of the art in structured light patterns for surface profilometry. Pattern Recognit. 2010, 43, 2666–2680.
- Kemao, Q.; Wang, H.; Gao, W. Windowed Fourier transform for fringe pattern analysis: Theoretical analyses. Appl. Opt. 2008, 47, 5408–5419.
- Huang, L.; Kemao, Q.; Pan, B.; Asundi, A. Comparison of Fourier transform, windowed Fourier transform, and wavelet transform methods for phase extraction from a single fringe pattern in fringe projection profilometry. Opt. Lasers Eng. 2010, 48, 141–148.
- Watkins, L. Phase Recovery Using the Wavelet Transform. In Proceedings of the AIP Conference Proceedings, Penang, Malaysia, 21–23 December 2010.
- Zhong, J.; Zeng, H. Multiscale windowed Fourier transform for phase extraction of fringe patterns. Appl. Opt. 2007, 46, 2670–2675.
- Fernandez, S.; Gdeisat, M.; Salvi, J.; Burton, D. Automatic window size selection in Windowed Fourier Transform for 3D reconstruction using adapted mother wavelets. Opt. Commun. 2011, 284, 2797–2807.
- Xu, J.; Zhang, S. Status, challenges, and future perspectives of fringe projection profilometry. Opt. Lasers Eng. 2020, 135, 106193.
- Feng, S.; Zuo, C.; Zhang, L.; Tao, T.; Hu, Y.; Yin, W.; Qian, J.; Chen, Q. Calibration of fringe projection profilometry: A comparative review. Opt. Lasers Eng. 2021, 143, 106622.
- Zhang, S.; Huang, P. Novel method for structured light system calibration. Opt. Eng. 2006, 45, 83601.
- Li, B.; Zhang, S. Structured light system calibration method with optimal fringe angle. Appl. Opt. 2014, 53, 7942–7950.
- Wen, Y.; Li, S.; Cheng, H.; Su, X.; Zhang, Q. Universal calculation formula and calibration method in Fourier transform profilometry. Appl. Opt. 2010, 49, 6563–6569.
- Zhang, Z. A Flexible New Technique for Camera Calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
- Zhang, S. High-speed 3D shape measurement with structured light methods: A review. Opt. Lasers Eng. 2018, 106, 119–131.
- Li, Z.; Shi, Y.; Wang, C.; Wang, Y. Accurate calibration method for a structured light system. Opt. Eng. 2008, 47, 53604.
- Martynov, I.; Kamarainen, J.K.; Lensu, L. Projector Calibration by “Inverse Camera Calibration”. In Proceedings of the Scandinavian Conference on Image Analysis, Ystad, Sweden, 1 May 2011.
- Moreno, D.; Taubin, G. Simple, Accurate, and Robust Projector-Camera Calibration. In Proceedings of the 2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission, Zurich, Switzerland, 13–15 October 2012.
- Gdeisat, M.; Lilley, F. One-Dimensional Phase Unwrapping Problem. Available online: https://www.ljmu.ac.uk/~/media/files/ljmu/about-us/faculties-and-schools/fet/geri/onedimensionalphaseunwrapping_finalpdf.pdf (accessed on 17 November 2021).
- Zhang, S. Absolute phase retrieval methods for digital fringe projection profilometry: A review. Opt. Lasers Eng. 2018, 107, 28–37.
- Itoh, K. Analysis of the phase unwrapping algorithm. Appl. Opt. 1982, 21, 2470.
- Dickin, F.; Pollard, S.; Adams, G. Mapping and correcting the distortion of 3D structured light scanners. Precis. Eng. 2021, 72, 543–555.
- Estrada, J.C.; Servin, M.; Quiroga, J.A. Noise robust linear dynamic system for phase unwrapping and smoothing. Opt. Express 2011, 19, 5126–5133.
- Kompanets, I.; Zalyapin, N. Methods and Devices of Speckle-Noise Suppression. Opt. Photonics J. 2020, 10, 219–250.
Camera | Projector | Stereo |
---|---|---|
0.477 | 1.231 | 2.686 |
Sample | Length (mm) | Width (mm) | Depth (mm) |
---|---|---|---|
A | 60 | 40 | 2 |
B | 120 | 100 | 2 |
C | 100 | 80 | 3 |
D | 120 | 80 | 1 |
Sample | Direct Mapping MAE (std) in mm | Proposed MAE (std) in mm | Difference in mm (%)
---|---|---|---
A | 0.158 (0.176) | 0.126 (0.115) | 0.032 (22.5%)
B | 0.167 (0.202) | 0.144 (0.151) | 0.023 (10.1%)
C | 0.259 (0.616) | 0.154 (0.494) | 0.105 (50.8%)
D | 0.158 (0.566) | 0.144 (0.537) | 0.014 (9.3%)
Sample | Direct Mapping MAE (std) in mm | Proposed MAE (std) in mm | Difference in mm (%)
---|---|---|---
A | 0.153 (0.174) | 0.120 (0.110) | 0.033 (24.2%)
B | 0.165 (0.190) | 0.135 (0.130) | 0.030 (20.0%)
C | 0.202 (0.162) | 0.115 (0.114) | 0.087 (54.9%)
D | 0.213 (0.368) | 0.124 (0.131) | 0.089 (52.8%)
Camera | Projector | Stereo |
---|---|---|
0.282 | 0.772 | 2.625 |
Sample | Type | Direct Mapping MAE (std) in mm | Proposed MAE (std) in mm
---|---|---|---
A | Dynamic | 0.381 (0.456) | 0.280 (0.307)
A | Static | 0.504 (0.486) | 0.285 (0.217)
B | Dynamic | 0.419 (0.438) | 0.309 (0.396)
B | Static | 0.432 (0.329) | 0.287 (0.255)
C | Dynamic | 0.436 (0.443) | 0.271 (0.265)
C | Static | 0.493 (0.546) | 0.320 (0.468)
D | Dynamic | 0.411 (0.394) | 0.295 (0.294)
D | Static | 0.370 (0.357) | 0.293 (0.401)
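The MAE (std) figures reported in the tables are summary statistics over a per-pixel absolute-error map between the reconstructed and ground-truth depth, as also visualised in the heatmap figures. A minimal sketch, assuming depth maps are given as NumPy arrays in millimetres; `mae_map` and the sample values below are hypothetical:

```python
import numpy as np

def mae_map(depth, ground_truth):
    """Per-pixel absolute error map plus the summary statistics
    (mean absolute error and its standard deviation) used in the tables."""
    err = np.abs(np.asarray(depth, dtype=float) - np.asarray(ground_truth, dtype=float))
    return err, float(err.mean()), float(err.std())

# Hypothetical 2x3 reconstructed depth map against a flat 2 mm ground truth.
recon = np.array([[2.1, 1.9, 2.0],
                  [2.2, 2.0, 1.8]])
truth = np.full((2, 3), 2.0)
err_map, mae, std = mae_map(recon, truth)  # mae = 0.1 mm
```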
Approach | Pros | Cons
---|---|---
Traditional | No special hardware requirements; straightforward | Hazardous; subjective output; time-consuming
Handheld 3D scanner | High accuracy; repeatable | Hazardous; time-consuming
Monocular camera UAV | Fast; repeatable; safe | No measures; shallow dents not detectable
UAV equipped with FTP | Fast; repeatable; safe; measures collected | Complex light control; smoothness assumption
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Lafiosca, P.; Fan, I.-S.; Avdelidis, N.P. Automated Aircraft Dent Inspection via a Modified Fourier Transform Profilometry Algorithm. Sensors 2022, 22, 433. https://doi.org/10.3390/s22020433