Communication

A Low-Cost Modulated Laser-Based Imaging System Using Square Ring Laser Illumination for Depressing Underwater Backscatter

1 Qilu Aerospace Information Research Institute, Jinan 250100, China
2 College of Photonics and Optical Engineering, Aerospace Information Technology University, Jinan 250299, China
3 Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100094, China
4 Key Laboratory of Computational Optical Imaging Technology, Chinese Academy of Sciences, Beijing 100094, China
* Author to whom correspondence should be addressed.
Photonics 2024, 11(11), 1070; https://doi.org/10.3390/photonics11111070
Submission received: 3 October 2024 / Revised: 4 November 2024 / Accepted: 13 November 2024 / Published: 14 November 2024
Figure 1. Schematics of the underwater optical imaging process.
Figure 2. (a) Schematics (top) and actual figure (bottom) of the modulated laser illumination system for underwater imaging. (b) Underwater experiment field figure of the modulated laser illumination system for underwater imaging (bottom) and the square ring laser spot (top).
Figure 3. (a) The block diagram of the optoelectronic system. (b) The diagram (left) and actual figure (right) of the electrical control system based on the STM32. (c) Flow chart of the dedicated firmware.
Figure 4. Comparison of original images captured by the camera under illumination by the modulated laser (top) and the diverging laser (bottom) at different distances.
Figure 5. Effects of the relationship between the FOV and MLDA on imaging: (a) FOV < MLDA, (b) FOV = MLDA, and (c) FOV > MLDA.
Figure 6. Comparison of original images captured by the DS-2XC6244F and the MLIS at different distances.
Figure 7. Comparison of images captured by the MLIS and enhanced with the optimized algorithm, with the average UCIQE improved from 0.428 to 0.925.

Abstract

Underwater vision data facilitate a variety of underwater operations, including underwater ecosystem monitoring, topographical mapping, mariculture, and marine resource exploration. Conventional laser-based underwater imaging systems with complex architectures rely on high-cost, high-power laser systems, and software-based methods cannot enrich the physical information captured by cameras. In this manuscript, a low-cost modulated laser-based imaging system is proposed with a spot in the shape of a square ring to eliminate the overlap between the illumination light path and the imaging path, which reduces the negative effect of backscatter on the imaging process and enhances imaging quality. The imaging system can achieve underwater imaging at long distances (e.g., 10 m) at turbidities in the range of 2.49 to 7.82 NTU, and the adjustable divergence angle of the laser tubes allows the proposed system to be flexibly adapted to application requirements, such as capturing the overall view or partial details of targets. Compared with a conventional underwater imaging camera (DS-2XC6244F, Hikvision, Hangzhou, China), the developed system provides better imaging performance in terms of both visual effects and quantitative evaluation (e.g., UCIQE and IE). Through integration with a CycleGAN-based method, the imaging results can be further improved, with the UCIQE increased by 0.4. The proposed low-cost imaging system, with its compact structure and low energy consumption, could be mounted on platforms such as underwater robots and AUVs to facilitate real-world underwater applications.

1. Introduction

The rapid development of unmanned underwater vehicles (UUVs) has expanded the capacity of human beings to explore the underwater environment, and visual information is essential for performing a number of underwater operations, such as underwater ecosystem monitoring [1,2], topographical mapping [3,4,5], mariculture [6,7,8], and marine resource exploration [9,10]. The selective absorption properties of water, namely the strong absorption of red light and the comparatively weak attenuation of green and blue light, result in the color distortion of captured underwater images. Moreover, the particles suspended in water reflect light from the sun and artificial light sources, which introduces scattering at the image sensor and degrades the captured images. The underwater optical imaging process is described in Figure 1. The camera receives the direct component, forward scattering, and backward scattering during the imaging process. The direct component refers to light that travels directly from the light source (e.g., the sun) to the target and then to the camera without scattering, which forms the basis of the image of the target. Forward scattering refers to light scattered at small angles as it travels through the water. Backscattering happens when light is scattered by particles in the water and redirected to the camera, lowering the quality of the captured image; most underwater imaging solutions try to reduce the negative effects of backscattering [11].
Current underwater vision technologies can be grouped into hardware-based and software-based methods according to their principles of operation. Hardware-based methods mainly focus on improving the physical process of imaging to increase the information captured by cameras, while software-based methods rely on the enhancement and restoration of the information contained in images with well-established algorithms. The utilization of lasers in hardware-based imaging systems enhances the ability to obtain high-resolution images of objects; representative laser-enhanced underwater imaging methods include range-gated imaging, structured light imaging, polarimetric imaging, and laser scanning. Herein, we mainly discuss range-gated imaging and laser scanning methods; detailed information about other laser-based methods can be found in the literature [12]. A range-gated imaging system employs a synchronously controlled pulsed laser and a gated camera with a small gate width to reduce backward scattering for high-resolution imaging. To further remove the backward scattering in gated images, depth-noise maps were calculated with a water attenuation coefficient and a reference image and then subtracted from target gate images to obtain new gate images containing less noise, achieving high range resolution and high-accuracy 3D imaging [13]. Moreover, it is possible to lessen backscattering by analyzing scene depth from a single gated image, combined with the time delay, laser pulse width, and gate pulse width, for dehazing, with a 134.78% improvement in PSNR (peak signal-to-noise ratio) in the underwater environment [14]. Laser scanning methods can capture high-resolution underwater images by reducing the common volume between the light paths of the light source and the camera to eliminate backscatter.
A laser field synchronous scanning system achieved underwater imaging at a range of 15 m with a CSNR (contrast signal-to-noise ratio) improvement of 1.67 times compared with an LED-based imaging system [15]. The performance of the abovementioned range-gated imaging and laser scanning methods is affected by laser parameters, such as laser power and pulse width, and these methods cannot fully separate the illumination and imaging paths. A range-gated imaging system with high pulse energy and a short exposure time can image long-distance objects, but at the cost of a bulky architecture and high power consumption. In addition, the field of view (FOV) and imaging resolution of laser scanning methods are restricted by the scanning mirror; moreover, motion during the imaging process can distort high-density point clouds and degrade imaging accuracy.
Software-based methods improve raw underwater image quality using developed algorithms and can be classified into image enhancement methods, image restoration methods, and deep learning-based methods. Image enhancement methods process raw underwater images with different functions to improve the visual quality of the image without considering the physical imaging model; in contrast, image restoration methods restore underwater images by reconstructing the physical image formation model of the underwater environment. Typical image enhancement methods include Retinex and fusion-based methods. Retinex aims to recover intrinsic images by eliminating the effects of illumination; it can achieve single underwater image enhancement using multicolor gradient priors of reflectance and illumination to support specific underwater applications, such as underwater keypoint detection, underwater saliency detection, and underwater depth map estimation [16]. Fusion-based methods are able to reduce noise, expose dark regions, and enhance the contrast of raw images without the assistance of hardware or underwater environment parameters. Improvements to the fusion pipeline, such as reducing the color cast of input images with white balance processing, can enhance the attenuation features and edge information of raw images [17]. Image restoration methods employ the inverse operation of the underwater image formation model to restore a high-quality image given an accurate estimation of the model parameters [18]. Deep learning methods, such as CNN- and GAN-based methods, use established datasets and networks to learn the information in images and produce images with good visual results [19,20].
CNN-based methods can learn input image features and provide required outputs, such as transmission maps and image formation model parameters, which can be integrated with the abovementioned enhancement and restoration methods to dehaze degraded underwater images [21]. In GAN-based networks, a generator produces fake images from input images in an attempt to deceive the discriminator, while the discriminator tries to distinguish the fake images from real ones. Due to the lack of datasets with paired images, CycleGAN was proposed to enhance underwater images without the requirement for paired images [22]. However, the performance of these methods is highly dependent on the input images and on accurate estimation of the optical model and its parameters, which restricts their generalization ability; moreover, the complex network architectures of deep learning-based methods demand high computational power, making it difficult to develop image dehazing methods compatible with different underwater conditions.
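The cycle-consistency idea behind CycleGAN [22] can be sketched in a few lines. In the toy example below, the "generators" G and F are simple linear stand-ins (assumptions for illustration, not real networks): G maps degraded images toward clear ones, F maps them back, and training penalizes the L1 distance between F(G(x)) and x, which is what removes the need for paired images.

```python
import numpy as np

# Toy illustration of the cycle-consistency constraint used by CycleGAN:
# G maps domain X (degraded) -> Y (clear), F maps Y -> X. Training pushes
# F(G(x)) back toward x without needing paired images. These linear
# "generators" are illustrative stand-ins, not real networks.

def G(x):  # degraded -> clear (toy: invert a known linear degradation)
    return (x - 0.1) / 0.8

def F(y):  # clear -> degraded (toy: apply the degradation)
    return 0.8 * y + 0.1

def cycle_consistency_loss(x):
    # L1 cycle loss: how far F(G(x)) lands from the original x
    return float(np.abs(F(G(x)) - x).mean())

x = np.linspace(0.0, 1.0, 16).reshape(4, 4)  # stand-in "image"
loss = cycle_consistency_loss(x)  # ~0 here because F inverts G exactly
```

In a real CycleGAN this cycle loss is minimized jointly with the adversarial losses of both discriminators, so the learned mappings approximate mutual inverses rather than being exact ones as in this sketch.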
In this study, a low-cost, low-power-consumption modulated laser-based underwater imaging method was developed with a compact system architecture and long-range, high-quality imaging ability. This system reduces backscatter by modulating the laser to match the field of view, improving the SNR (signal-to-noise ratio); moreover, the concentrated distribution of laser power in the form of a square ring extends the imaging range to about 10 m at a water turbidity of 7.82 NTU under starlight illumination conditions. Furthermore, the performance of this system can be further improved using a CycleGAN-based method, with an obvious improvement in UCIQE (underwater color image quality evaluation) and IE (information entropy) at different imaging distances. It is also convenient to adjust the field of view (FOV) and the divergence angle of the modulated laser to image the overall view or partial details of distant objects.

2. Methods

2.1. Modulated Laser-Based Imaging System (MLIS)

In order to reduce backscatter, a square-ring modulated laser illumination system was established that matches the rectangular imaging field of view through beam modulation. By utilizing the forward-scattered light of the laser to illuminate the imaging field of view, the interference of backward-scattered light with the imaging process is reduced, enhancing the imaging quality and the achievable imaging distance in the underwater environment. The light source of the system adopts low-cost semiconductor green laser tubes (wavelength of 532 nm) with special lenses that modulate each spot into a linear laser beam. Compared to traditional divergent illumination with a circular beam, the square ring laser produces linear beams with more concentrated energy, which effectively reduces the overlap between the illumination light path and the imaging field of view and avoids the interference of backward-scattered light with imaging, thereby improving image quality. In addition, the divergence angle of the laser beam can be adjusted to match the field of view, so that different information about the target and its surrounding environment can be obtained for different applications. The camera (JZC-N81820S, resolution 1920 × 1080, frame rate 25 fps, minimum illumination 0.01 lux, Xiongmaitech, Hangzhou, China), consisting of a low-light imaging detector and a zoom imaging lens, was placed in the center of the laser illumination source and collects target information under different imaging fields of view. The modulated laser system can be found in Figure 2.

2.2. Electrical Control System

The system mainly consists of a lighting and imaging system and an embedded control system (see Figure 3a). The lighting and imaging system includes 2 groups of driver circuits to drive 4 laser tubes with a wavelength of 532 nm (connected in series), 4 shaping lenses, a low-illumination imaging module, and an imaging lens. The control system consists of a microcontroller module, a signal amplification module, an Ethernet module, and a power module. A PWM signal is generated and amplified to control the driver circuits, realizing the adjustment of the optical power of the 4 laser tubes. The Ethernet module is responsible for signal transmission between the microcontroller and the external device and transmits the data of the imaging module to the external device for further analysis. An STM32F103C8T6, with rich peripheral functions such as timers and USARTs, was used as the embedded controller in this study. With a 32-bit Cortex-M3 core and a main frequency of up to 72 MHz, this microcontroller balances high performance with low power consumption, meeting the underwater application requirements of low cost and low power consumption. The diagram and actual figure of the electrical control system based on the STM32 can be found in Figure 3b.
The firmware was developed using Keil MDK5. PWM was used to modulate the constant-current driver power supply of the laser source, which in turn modulated the output optical power. PWM regulation was commanded through USART communication, with the baud rate set to 115,200. The controller used a timer to output two PWM signals to control the two driver circuits; each driver circuit controlled two laser tubes. The prescaler of this timer was set to 9, and the duty cycle of the output PWM was changed by adjusting the values of the timer registers. A detailed flowchart of the firmware can be found in Figure 3c. The overall power consumption of the system was about 9.26 W with the laser illumination turned on and 3.26 W without illumination; this low overall power consumption makes the system suitable for long-duration underwater applications. The data stream was mainly generated by the imaging module, with a frame rate of 24 fps and an image size of 1920 × 1080, and was transmitted to the external device through the Ethernet module with an overall data bandwidth of about 142.38 MB/s.
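The timing and bandwidth figures above can be sanity-checked with a short calculation. This is a sketch under stated assumptions (not from the paper): frames are 24-bit RGB, "MB" is interpreted as MiB (2^20 bytes), and the STM32 convention that a prescaler value PSC divides the clock by PSC + 1 applies.

```python
# Back-of-envelope checks for the control and data path described above.
# Assumptions: 24-bit RGB frames, MB = MiB = 2**20 bytes, and the STM32
# convention that prescaler value PSC divides the 72 MHz clock by PSC + 1.

SYSCLK_HZ = 72_000_000
PSC = 9                                      # prescaler value from the text
timer_tick_hz = SYSCLK_HZ // (PSC + 1)       # counter clock feeding the PWM

WIDTH, HEIGHT, BYTES_PER_PIXEL, FPS = 1920, 1080, 3, 24
bandwidth_mib_s = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS / 2**20

print(f"timer tick: {timer_tick_hz / 1e6:.1f} MHz")    # 7.2 MHz
print(f"video bandwidth: {bandwidth_mib_s:.2f} MB/s")  # 142.38 MB/s
```

Under these assumptions the raw video rate works out to exactly the 142.38 MB/s quoted in the text, which suggests the reported bandwidth is the uncompressed frame stream.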

2.3. Imaging Quality Evaluation Metrics

In this study, we used UCIQE and IE (information entropy) to quantitatively evaluate the imaging performance of the MLIS. UCIQE is a linear combination of chroma, saturation, and contrast and is a commonly used metric for the quantitative evaluation of underwater images in terms of non-uniform color cast, blurring, and low contrast. UCIQE is a no-reference (i.e., no ground truth) metric expressed as Equation (1), where c1, c2, and c3 are weighting factors, σc is the standard deviation of chroma, conl is the luminance contrast, and μs is the mean saturation.
UCIQE = c1·σc + c2·conl + c3·μs        (1)
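Equation (1) can be sketched in code. Note this is a rough approximation only: the reference metric computes its statistics in the CIELab color space, whereas chroma and saturation below are crude RGB proxies, and the weights c1–c3 are the commonly cited values rather than values taken from this paper.

```python
import numpy as np

# Simplified UCIQE-style score (Equation (1)) in pure NumPy. A sketch,
# not the reference implementation: the original metric works in CIELab,
# while chroma/saturation here are approximated directly from RGB.

C1, C2, C3 = 0.4680, 0.2745, 0.2576  # commonly cited weighting factors

def uciqe_like(rgb):
    """rgb: float array in [0, 1], shape (H, W, 3)."""
    mx, mn = rgb.max(axis=2), rgb.min(axis=2)
    chroma = mx - mn                                  # crude chroma proxy
    sat = np.where(mx > 0, chroma / np.maximum(mx, 1e-8), 0.0)
    lum = rgb.mean(axis=2)                            # luminance proxy
    sigma_c = chroma.std()                            # std of chroma
    # luminance contrast: spread between top and bottom 1% of luminance
    con_l = np.percentile(lum, 99) - np.percentile(lum, 1)
    mu_s = sat.mean()                                 # mean saturation
    return C1 * sigma_c + C2 * con_l + C3 * mu_s
```

A flat gray image scores 0 (no chroma spread, no contrast, no saturation), while colorful, high-contrast images score higher, matching the qualitative behavior described in the text.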
IE (information entropy) describes the complexity and richness of detail of a given image and can be calculated with Equation (2), where L is the number of possible gray levels of the image and p(i) is the probability of the ith gray level appearing in the image, calculated by dividing the number of pixels at the ith gray level by the total number of pixels in the image.
E = −Σ_{i=0}^{L−1} p(i) log₂ p(i)        (2)
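Equation (2) maps directly to a few lines of code; the sketch below assumes 8-bit grayscale images (L = 256).

```python
import numpy as np

def information_entropy(gray, levels=256):
    """Shannon entropy (Equation (2)) of a grayscale image.

    gray: integer array with values in [0, levels - 1].
    """
    hist = np.bincount(gray.ravel(), minlength=levels)
    p = hist / hist.sum()        # p(i): fraction of pixels at gray level i
    p = p[p > 0]                 # skip empty bins (0 * log 0 := 0)
    return float(-(p * np.log2(p)).sum())
```

A constant image has entropy 0, and an image split evenly between two gray levels has entropy 1 bit; richer detail spreads the histogram and raises E, which is why higher IE values in Tables 1 and 3 indicate more captured information.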

3. Experiments

A water pool with dimensions of 10 m × 2 m × 1 m (length × width × depth) was established to simulate the underwater environment. The modulated laser illumination system and camera were packaged in a sealed waterproof compartment and fixed at a depth of 0.5 m under the water surface on one end of the pool. The target was a standard calibration board with a size of 520 mm × 400 mm attached to a movable signpost. In order to remove the effects of sunlight on the imaging results, all of the experiments were performed at night or with the water tank covered with blackout fabric to ensure imaging quality.
The turbidity of the underwater environment could be adjusted in the range of 2.49 to 7.82 NTUs by adding milk to imitate real underwater environment scattering because the milk contained different sizes of spherical particles (e.g., casein molecules of 0.04–0.3 μm, fat globular molecules of 1–20 μm) [23].
Different experiments were carried out to evaluate the performance of the MLIS, including a comparison of the modulated laser and the diverging laser, the matching effect of the FOV and the modulated laser divergence angle (MLDA), a comparison of the MLIS with a conventional underwater imaging camera (DS-2XC6244F), and enhancement with the CycleGAN-based method.

4. Results and Discussion

4.1. Comparison of the Modulated Laser and the Diverging Laser

To analyze the effects of the modulated laser on imaging, an underwater experiment was performed to compare the imaging results under illumination by the modulated laser and by the diverging laser. The experimental conditions were a water turbidity of 7.82 NTU and a camera FOV of 15.658° × 8.845°. The choice of laser power should take the imaging environment (e.g., distance) into account, since insufficient laser energy may fail to illuminate the target and instead degrade the imaging results; during testing, a laser power of 35.8 mW was almost unable to illuminate the target at a distance of 8 m. Therefore, both the modulated laser and the diverging laser were operated at 118 mW in the measurements to ensure illumination of the target.
The results show that modulated laser illumination could provide images with good detail preservation inside the target, uniform image brightness, and clear details of the target and its surrounding background. Under diverging laser illumination, the illumination path overlapped with the imaging path, and a large amount of backscattered light entered the imaging field of view; the acquired image was therefore mixed with the diverging illumination component, and the background brightness was uneven, especially for remote imaging. This can be seen in Figure 4, where the bottom of the target and the surrounding environment are blurred. This finding demonstrates that modulated laser illumination in the shape of a square ring is suitable for long-range underwater imaging given a reasonable adjustment of the FOV and the modulated laser divergence angle, which is investigated in the next section. The proposed system was only evaluated at an imaging distance of up to 10 m because of the 10 m length of the water tank; we believe that the imaging distance could be further extended under suitable experimental conditions, as the laser at 10 m could still illuminate the target clearly and remain stable, which will be investigated in a future study.

4.2. Matching Effect of FOV and Modulated Laser Divergence Angle (MLDA)

This section investigates the matching effect of the FOV and the MLDA, with the FOV fixed at 20.96° × 11.7° and the MLDA adjusted to be smaller than the FOV (10.01° × 5.15°), matched with the FOV (20.96° × 11.7°), or larger than the FOV (31.28° × 18.55°). During the experiments, the water turbidity was 7.82 NTU, the imaging resolution was 1920 × 1280, and the power of the modulated laser was 118 mW. When the FOV was smaller than the MLDA, a large amount of illuminated information was missed in long-distance imaging, such as imaging at a distance of 9 m. When the FOV was larger than the MLDA, the target was imaged with sufficient detail in terms of the subjective visual effect, but the backscatter of incident light was introduced into the image, resulting in the accumulation of noise in the background. A matched FOV and MLDA preserved the information of the target scene and removed the background noise well (middle row in Figure 5), demonstrating the performance of the MLIS. Moreover, the results imply that the relationship between the FOV and MLDA can be chosen based on application requirements to obtain the required imaging information.
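The geometric relationship between the FOV and MLDA can be illustrated by computing the rectangular footprint each angular field covers at a given distance. This is a simple sketch assuming straight-line propagation in a uniform medium, ignoring refraction at the housing window; the 9 m distance and the angle values are taken from the text above.

```python
import math

# Footprint of a rectangular angular field at distance d: each side
# subtends half its angle on either side of the optical axis.

def footprint(h_deg, v_deg, d):
    """Width and height (m) covered by angles h_deg x v_deg at distance d."""
    w = 2 * d * math.tan(math.radians(h_deg) / 2)
    h = 2 * d * math.tan(math.radians(v_deg) / 2)
    return w, h

d = 9.0  # m, the long-distance case discussed above
fov = footprint(20.96, 11.7, d)          # camera field of view
mlda_small = footprint(10.01, 5.15, d)   # MLDA < FOV
mlda_large = footprint(31.28, 18.55, d)  # MLDA > FOV
```

At 9 m the FOV covers roughly a 3.3 m wide region, while the small MLDA illuminates only about 1.6 m and the large MLDA about 5.0 m, which makes concrete why a mismatched divergence angle either leaves part of the scene unlit or wastes light outside the imaged region.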

4.3. Comparison of the MLIS with a Conventional Underwater Imaging Camera (DS-2XC6244F)

The imaging performance of our imaging system and the Hikvision product were compared at imaging distances from 2 m to 10 m under the same imaging conditions (water turbidity of 7.82 NTU, FOV of 8.4° × 4.9°, and resolution of 1920 × 1080). As shown in Figure 6, the conventional underwater imaging camera (DS-2XC6244F, Hikvision, Hangzhou, China) could image the target at a maximum distance of 7 m with poor imaging quality, and nothing could be identified in the image captured at a distance of 8 m. In comparison, the images captured by the MLIS provided better visual effects at all distances; even the edge of the calibration board is visible at an imaging distance of 10 m. To quantitatively evaluate the imaging performance of our imaging system and the conventional underwater imaging camera, the IE of the images obtained at each distance was calculated; the results can be found in Table 1. Every IE value of the images captured by the MLIS was higher than that of the conventional underwater imaging camera (DS-2XC6244F). Please note that the IE value of the image obtained at 10 m by our system was higher than the IE values at short distances; this might have resulted from the captured background information and does not affect the evaluation of imaging capacity.

4.4. Enhancement with the CycleGAN-Based Method

The comparison with the conventional underwater imaging camera (DS-2XC6244F) suggested that it was possible to further improve the imaging quality of the MLIS with compatible algorithms. In this study, we chose the CycleGAN-based method to enhance the raw underwater images captured by the MLIS without requiring paired images [24]. The enhanced results in Figure 7 show that the CycleGAN-based method could greatly improve the raw image quality with outstanding visual effects, with the average UCIQE improved by 0.497. Targets such as seahorses (Hippocampus japonicus), a starfish, and a crab could be recognized clearly, and the background seaweed could also be displayed vividly, implying the potential of the MLIS integrated with the CycleGAN-based method for a number of underwater applications, such as target recognition and mariculture.
To quantitatively assess the enhanced results, the UCIQE and IE of the raw and enhanced images were analyzed, with the results shown in Table 2 and Table 3. The evaluation results revealed that the UCIQE of images captured by the MLIS could be greatly enhanced by the CycleGAN-based method, with an average improvement of 0.4. In particular, for images captured at long distances, such as 7 m and 10 m, the enhancement improved the UCIQE by about 0.7 to 0.9. The comparison of IE gives direct results about the complexity and richness of details in the captured images, and the enhancement method greatly improved the IE of images captured using the MLIS: the IE of images captured at distances of 8 m, 9 m, and 10 m was improved by 4.091, 3.622, and 3.125, respectively. The quantitative comparison results of UCIQE and IE verify the underwater imaging ability of the proposed system and provide a possible way to improve the performance of the MLIS by integrating it with advanced deep learning-based methods. Moreover, the obvious improvement of UCIQE and IE for images captured at long distances with the CycleGAN-based method opens the possibility of remote detection with the proposed MLIS for a variety of underwater activities requiring adequate imaging distances, such as underwater topography surveying with a large FOV enabled by an AUV.

5. Conclusions

In this manuscript, a modulated laser-based imaging system is proposed in which the spot is shaped into linear laser beams forming a square ring to eliminate the overlap between the illumination light path and the imaging path, reducing the negative effect of backscatter on the imaging process. This imaging system, integrated with low-cost laser tubes and cameras, provides a method for underwater imaging with long-distance demands (e.g., 10 m), and the adjustable MLDA satisfies the imaging requirements of different scenes for obtaining the overall view or partial details of the target. Compared with a conventional underwater imaging camera (DS-2XC6244F), the developed system provides better imaging performance in terms of visual effects and quantitative evaluation (e.g., UCIQE and IE); moreover, it is possible to further improve system performance by integrating it with the CycleGAN-based method, with the UCIQE increasing by 0.4. The proposed system could be improved with well-designed image enhancement algorithms in the future, and we hope that this system can be equipped on platforms such as underwater robots and AUVs to facilitate real-world underwater applications.

Author Contributions

Y.H. performed the experiments and wrote the manuscript. Y.Y. proposed the idea and performed the experiments. H.Z. established the optical system and performed the experiments. S.Z. performed the experiments. Z.Z. supervised the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (Grant No. 12304323).

Data Availability Statement

The data supporting the findings of this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no competing financial interests.

References

  1. Aguzzi, J.; Thomsen, L.; Flögel, S.; Robinson, N.J.; Picardi, G.; Chatzievangelou, D.; Bahamon, N.; Stefanni, S.; Grinyó, J.; Fanelli, E.; et al. New technologies for monitoring and upscaling marine ecosystem restoration in deep-sea environments. Engineering 2024, 34, 195–211. [Google Scholar] [CrossRef]
  2. Mariani, P.; Quincoces, I.; Haugholt, K.H.; Chardard, Y.; Visser, A.W.; Yates, C.; Piccinno, G.; Reali, G.; Risholm, P.; Thielemann, J.T. Range-gated imaging system for underwater monitoring in ocean environment. Sustainability 2018, 11, 162. [Google Scholar] [CrossRef]
  3. Noguchi, Y.; Sekimori, Y.; Maki, T. Guidance method of underwater vehicle for rugged seafloor observation in close proximity. J. Field Robot. 2024, 41, 314–326. [Google Scholar] [CrossRef]
  4. Leng, Z.; Zhang, J.; Ma, Y.; Zhang, J. Underwater Topography Inversion in Liaodong Shoal Based on GRU Deep Learning Model. Remote Sens. 2020, 12, 4068. [Google Scholar] [CrossRef]
  5. Mandlburger, G. A review of active and passive optical methods in hydrography. Int. Hydrogr. Rev. 2022, 28, 8–52. [Google Scholar] [CrossRef]
  6. Darodes de Tailly, J.B.; Keitel, J.; Owen, M.A.; Alcaraz-Calero, J.M.; Alexander, M.E.; Sloman, K.A. Monitoring methods of feeding behaviour to answer key questions in penaeid shrimp feeding. Rev. Aquac. 2021, 13, 1828–1843. [Google Scholar] [CrossRef]
  7. Wang, Y.; Xiaoning, Y.; An, D.; Wei, Y. Underwater image enhancement and marine snow removal for fishery based on integrated dual-channel neural network. Comput. Electron. Agric. 2021, 186, 106182. [Google Scholar] [CrossRef]
  8. Liu, X.; Wang, Z.; Yang, X.; Liu, Y.; Liu, B.; Zhang, J.; Gao, K.; Meng, D.; Ding, Y. Mapping China’s offshore mariculture based on dense time-series optical and radar data. Int. J. Digit. Earth 2022, 15, 1326–1349. [Google Scholar] [CrossRef]
  9. Raveendran, S.; Patil, M.D.; Birajdar, G.K. Underwater image enhancement: A comprehensive review, recent trends, challenges and applications. Artif. Intell. Rev. 2021, 54, 5413–5467. [Google Scholar] [CrossRef]
Figure 1. Schematics of the underwater optical imaging process.
Figure 2. (a) Schematics (top) and actual figure (bottom) of the modulated laser illumination system for underwater imaging. (b) Underwater experiment field figure of the modulated laser illumination system for underwater imaging (bottom) and square ring laser spot (top).
Figure 3. (a) The block diagram of the optoelectronic system. (b) The diagram (left) and actual figure (right) of the electrical control system based on STM32. (c) Flow chart of the dedicated firmware.
Figure 4. Comparison of original images captured by the camera with the illumination of the modulated laser (top) and the diverging laser (bottom) at different distances.
Figure 5. Effects of the relationship between the FOV and MLDA on imaging: (a) FOV < MLDA, (b) FOV = MLDA, and (c) FOV > MLDA.
Figure 6. Comparison of original images captured by DS-2XC6244F and the MLIS at different distances.
Figure 7. Comparison of images captured by the MLIS and those enhanced with the optimized algorithm, with the average UCIQE improved from 0.428 to 0.925.
Table 1. IE of a conventional underwater imaging camera (DS-2XC6244F) and the MLIS.

| Methods | 2 m | 4 m | 6 m | 7 m | 8 m | 9 m | 10 m |
|---|---|---|---|---|---|---|---|
| DS-2XC6244F | 15.588 | 13.815 | 9.41 | 6.923 | 6.589 | 6.861 | 6.946 |
| MLIS | 18.253 | 15.263 | 11.287 | 9.4 | 8.078 | 7.67 | 9.295 |
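Since the IE (image information entropy) scores in Table 1 exceed 8 bits, they are presumably accumulated over the color channels rather than computed on a single 8-bit grayscale plane. A minimal sketch of that reading, which sums the per-channel Shannon entropies, is below; the function names and the 8-bit, summed-channel definition are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def channel_entropy(channel: np.ndarray, bins: int = 256) -> float:
    """Shannon entropy (bits) of one 8-bit channel's gray-level histogram."""
    hist, _ = np.histogram(channel.ravel(), bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

def image_entropy(img: np.ndarray) -> float:
    """Sum of per-channel entropies for an H x W x C color image."""
    return sum(channel_entropy(img[..., c]) for c in range(img.shape[-1]))

# A channel with a uniform histogram attains the 8-bit maximum of log2(256) = 8,
# so a three-channel image tops out at 24 bits under this definition.
flat = np.arange(256, dtype=np.uint8).repeat(4).reshape(32, 32)
rgb = np.stack([flat, flat, flat], axis=-1)
print(round(image_entropy(rgb), 3))  # → 24.0
```

Under this definition, higher IE indicates a richer gray-level distribution, which is consistent with the MLIS rows scoring above the conventional camera at every distance.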
Table 2. UCIQE of the MLIS and enhanced MLIS.

| Methods | 2 m | 4 m | 6 m | 7 m | 8 m | 9 m | 10 m |
|---|---|---|---|---|---|---|---|
| MLIS | 0.408 | 0.364 | 0.451 | 0.477 | 0.481 | 0.468 | 0.538 |
| Enhanced MLIS | 0.768 | 0.637 | 0.693 | 1.195 | 0.895 | 0.898 | 1.099 |
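For reference, UCIQE is commonly defined (following Yang and Sowmya's linear model) as a weighted sum of three CIELab statistics; it is an assumption here that the paper uses this standard variant with the original coefficients:

```latex
\mathrm{UCIQE} = c_1\,\sigma_c + c_2\,\mathrm{con}_l + c_3\,\mu_s,
\quad c_1 = 0.4680,\; c_2 = 0.2745,\; c_3 = 0.2576,
```

where $\sigma_c$ is the standard deviation of chroma, $\mathrm{con}_l$ is the luminance contrast, and $\mu_s$ is the mean saturation. Larger values indicate better-balanced underwater color and contrast.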
Table 3. IE of the MLIS and enhanced MLIS.

| Methods | 2 m | 4 m | 6 m | 7 m | 8 m | 9 m | 10 m |
|---|---|---|---|---|---|---|---|
| MLIS | 18.253 | 15.263 | 11.287 | 9.4 | 8.078 | 7.67 | 9.295 |
| Enhanced MLIS | 18.192 | 16.269 | 13.745 | 11.414 | 12.169 | 11.292 | 12.42 |

Share and Cite

MDPI and ACS Style

Hao, Y.; Yuan, Y.; Zhang, H.; Zhang, S.; Zhang, Z. A Low-Cost Modulated Laser-Based Imaging System Using Square Ring Laser Illumination for Depressing Underwater Backscatter. Photonics 2024, 11, 1070. https://doi.org/10.3390/photonics11111070
