ULOTrack: Underwater Long-Term Object Tracker for Marine Organism Capture
Figure 1. (a,b) Object tracking failure due to water weed occlusion. The yellow boxes denote the fish tracking results.
Figure 2. A flowchart of the COTS micro-AUV design.
Figure 3. Processing flowchart of the underwater long-term object tracker.
Figure 4. Overall architecture of the underwater long-term object tracker.
Figure 5. Overall architecture of the lightweight underwater object detection model.
Figure 6. Examples of multiple peak counts for target loss discrimination. (a) Frame 5. (b) Frame 99. (c) Frame 131. (d) Frame 143. The yellow boxes denote the fish tracking results.
Figure 7. (a) The result of maximum response score discrimination. (b) The result of average peak-to-correlation energy.
Figure 8. (a) OPE precision plots for the underwater visual data. (b) OPE success rate plots for the underwater visual data.
Figure 9. Visualization of the proposed model's perception results in various scenes; the boxes denote the tracked marine organisms. (a) Low-light challenge (fish 1). (b) Fast-motion challenge (fish 2). (c) Motion-deformation challenge (octopus). (d) Target-size challenge (turtle). (e) Complex scene containing several potential tracking targets (fish 3).
Figure 10. The tracking results of different models in various scenes.
Figure 11. Examples of feature visualization on different sequences. The red box is the ground truth of the target. (a) Underwater image and ground truth. (b) Ours. (c) EFSCF. (d) AS2RCF. The red areas represent high correlations.
Abstract
1. Introduction
- (1) We propose ULOTrack, a novel underwater long-term object tracking architecture that enables consistent and accurate tracking on a low-power AUV computing platform.
- (2) We propose a multi-layer object tracking performance discriminator that evaluates the stability of the current tracking state and suppresses the model drift caused by rapid target movement. The layers are structured as follows: the top layer measures the maximum response score, the second layer evaluates the average peak-to-correlation energy, and the third layer counts the number of response-map peaks.
- (3) We design a multiscale spatial filter and calculate scale responses to address the significant scale variations of marine organisms. Extensive experiments on real-world datasets demonstrate that our algorithm is not only more robust across various target types but also outperforms competing algorithms in tracking performance.
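The three discriminator layers named in contribution (2) operate on the correlation response map. The following is a minimal sketch of how such metrics are commonly computed; the function names are ours, and the APCE formula follows the standard definition from the correlation-filter literature rather than being taken verbatim from this paper.

```python
import numpy as np

def max_response_score(response: np.ndarray) -> float:
    """Top layer: the peak value F_max of the 2-D correlation response map."""
    return float(response.max())

def apce(response: np.ndarray) -> float:
    """Second layer: average peak-to-correlation energy.

    APCE = |F_max - F_min|^2 / mean((F - F_min)^2)
    A sharp single peak yields a high APCE; when the target is occluded
    or lost, the map flattens and APCE drops sharply.
    """
    f_max, f_min = response.max(), response.min()
    return float(abs(f_max - f_min) ** 2 / np.mean((response - f_min) ** 2))

def count_peaks(response: np.ndarray, ratio: float = 0.5) -> int:
    """Third layer: count strict local maxima above ratio * F_max.

    Multiple strong peaks suggest distractors or an unreliable response.
    """
    h, w = response.shape
    # Pad with -inf so border cells are compared only against real neighbors.
    r = np.pad(response, 1, mode="constant", constant_values=-np.inf)
    center = r[1:-1, 1:-1]
    neighbors = np.stack([
        r[i:i + h, j:j + w]
        for i in (0, 1, 2) for j in (0, 1, 2)
        if (i, j) != (1, 1)
    ])
    is_peak = np.all(center > neighbors, axis=0)
    return int(np.sum(is_peak & (center > ratio * center.max())))
```

A well-localized target produces one dominant peak and a high APCE, which is exactly the regime the discriminator treats as stable tracking.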
2. Related Work
2.1. Object Tracking
2.2. Underwater Object Tracking for AUVs
3. Experimental Autonomous Underwater Vehicle Platform
4. Underwater Long-Term Object Tracker
4.1. Lightweight Object Detection Method for Marine Organisms
4.2. Underwater Long-Term Object Tracker
4.2.1. Circulant Matrix
4.2.2. The Training of Classifier
4.2.3. Kernel Correlation Filter
4.3. Design of the Multi-Layer Target Loss Discriminator
5. Experimental Analysis
5.1. Lightweight Object Detection Experiment
5.1.1. Dataset and Evaluation Metrics
5.1.2. Experimental Settings
5.1.3. Experimental Results and Analysis
5.2. Long-Term Tracking Experiment
5.2.1. Evaluation Metrics
5.2.2. Experiment Analysis of the Multi-Layer Target Loss Discriminator
- (1) In the initial stable state of object tracking, the maximum response score (Fmax) and average peak-to-correlation energy (APCE) are relatively high. In the 5th frame, the experiment yields Fmax = 0.391 and APCE = 29.07.
- (2) In subsequent frames where tracking remains successful, Fmax and APCE stay high. In the 99th frame, Fmax = 0.397 and APCE = 27.84.
- (3) When the target is lost, both Fmax and APCE drop sharply and fluctuate significantly. In the 131st frame, where the target is lost, Fmax = 0.287 and APCE = 17.78.
- (4) After the target is lost, the correlation filter tracker absorbs erroneous background information, treats the background as the target, and continues tracking, which drives the values even lower. In the 143rd frame, Fmax = 0.237 and APCE = 13.34.
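The four observations above suggest a simple loss test: flag the target as lost when both Fmax and APCE fall well below their historical averages, and freeze the history once loss is declared so the corrupted frames do not pollute it. The sketch below illustrates this idea; the threshold ratios and class name are illustrative assumptions, not values from the paper.

```python
from collections import deque

class TargetLossDiscriminator:
    """Flag target loss when F_max and APCE both drop well below their
    historical averages. The ratios below are illustrative, not the
    paper's tuned values."""

    def __init__(self, fmax_ratio: float = 0.8, apce_ratio: float = 0.7,
                 history: int = 50):
        self.fmax_ratio = fmax_ratio
        self.apce_ratio = apce_ratio
        self.fmax_hist = deque(maxlen=history)
        self.apce_hist = deque(maxlen=history)

    def update(self, fmax: float, apce: float) -> bool:
        """Return True if the target is judged lost in this frame."""
        if not self.fmax_hist:              # first frame: trust the init box
            self.fmax_hist.append(fmax)
            self.apce_hist.append(apce)
            return False
        fmax_avg = sum(self.fmax_hist) / len(self.fmax_hist)
        apce_avg = sum(self.apce_hist) / len(self.apce_hist)
        lost = (fmax < self.fmax_ratio * fmax_avg
                and apce < self.apce_ratio * apce_avg)
        if not lost:                        # only stable frames update history
            self.fmax_hist.append(fmax)
            self.apce_hist.append(apce)
        return lost
```

Fed the values reported above (frames 5, 99, 131, 143), such a rule accepts the first two frames and flags the last two as lost, matching the experiment's interpretation.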
5.2.3. Tracking Model Performance Comparison
5.2.4. Ablation Experiment of Different Modules
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- He, J.; Xu, H.; Li, S.; Yu, Y. Efficient SonarNet: Lightweight CNN Grafted Vision Transformer Embedding Network for Forward-Looking Sonar Image Segmentation. IEEE Trans. Geosci. Remote Sens. 2024, 62, 4210317.
- Whitt, C.; Pearlman, J.; Polagye, B.; Caimi, F.; Muller-Karger, F.; Copping, A.; Spence, H.; Madhusudhana, S.; Kirkwood, W.; Grosjean, L.; et al. Future vision for autonomous ocean observations. Front. Mar. Sci. 2020, 7, 697.
- Zhang, J.; Liu, M.; Zhang, S.; Zheng, R.; Dong, S. Multi-AUV adaptive path planning and cooperative sampling for ocean scalar field estimation. IEEE Trans. Instrum. Meas. 2022, 71, 1–14.
- Yu, J.; Wu, Z.; Yang, X.; Yang, Y.; Zhang, P. Underwater target tracking control of an untethered robotic fish with a camera stabilizer. IEEE Trans. Syst. Man Cybern. Syst. 2020, 51, 6523–6534.
- Prasad, D.K.; Rajan, D.; Rachmawati, L.; Rajabally, E.; Quek, C. Video processing from electro-optical sensors for object detection and tracking in a maritime environment: A survey. IEEE Trans. Intell. Transp. Syst. 2017, 18, 1993–2016.
- Melo, J.; Matos, A. Survey on advances on terrain based navigation for autonomous underwater vehicles. Ocean Eng. 2017, 139, 250–264.
- Li, J.; Zhang, G.; Jiang, C.; Zhang, W. A survey of maritime unmanned search system: Theory, applications and future directions. Ocean Eng. 2023, 285, 115359.
- Xu, H.; Zhang, X.; He, J.; Geng, Z.; Yu, Y.; Cheng, Y. Panoptic Water Surface Visual Perception for USVs Using Monocular Camera Sensor. IEEE Sens. J. 2024, 15, 24263–24274.
- Xu, H.; Zhang, X.; He, J.; Yu, Y.; Cheng, Y. Real-Time Volumetric Perception for Unmanned Surface Vehicles Through Fusion of Radar and Camera. IEEE Trans. Instrum. Meas. 2024, 73, 1–12.
- Xu, H.; Zhang, X.; He, J.; Geng, Z.; Pang, C.; Yu, Y. Surround-View Water Surface BEV Segmentation for Autonomous Surface Vehicles: Dataset, Baseline and Hybrid-BEV Network. IEEE Trans. Intell. Veh. 2024, 10, 1–15.
- Panetta, K.; Kezebou, L.; Oludare, V.; Agaian, S. Comprehensive underwater object tracking benchmark dataset and underwater image enhancement with GAN. IEEE J. Ocean. Eng. 2021, 47, 59–75.
- Sun, C.; Wan, Z.; Huang, H.; Zhang, G.; Bao, X.; Li, J.; Sheng, M.; Yang, X. Intelligent target visual tracking and control strategy for open frame underwater vehicles. Robotica 2021, 39, 1791–1805.
- Wu, X.; Han, X.; Zhang, Z.; Wu, H.; Yang, X.; Huang, H. A hybrid excitation model based lightweight siamese network for underwater vehicle object tracking missions. J. Mar. Sci. Eng. 2023, 11, 1127.
- Li, X.; Wei, Z.; Huang, L.; Nie, J.; Zhang, W.; Wang, L. Real-time underwater fish tracking based on adaptive multi-appearance model. In Proceedings of the 2018 25th IEEE International Conference on Image Processing (ICIP), Athens, Greece, 7–10 October 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 2710–2714.
- Chuang, M.C.; Hwang, J.N.; Ye, J.H.; Huang, S.C.; Williams, K. Underwater fish tracking for moving cameras based on deformable multiple kernels. IEEE Trans. Syst. Man Cybern. Syst. 2016, 47, 2467–2477.
- Lu, Y.; Wang, H.; Chen, Z.; Zhang, Z. Multi-scale underwater object tracking by adaptive feature fusion. In Proceedings of the International Symposium on Artificial Intelligence and Robotics 2021, Fukuoka, Japan, 21–22 August 2021; SPIE: Bellingham, WA, USA, 2021; Volume 11884, pp. 346–357.
- Mayer, C.; Danelljan, M.; Paudel, D.P.; Van Gool, L. Learning target candidate association to keep track of what not to track. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, QC, Canada, 11–17 October 2021; pp. 13444–13454.
- Huang, Y.; Huang, H.; Niu, M.; Miah, M.S.; Wang, H.; Gao, T. UAV Complex-Scene Single-Target Tracking Based on Improved Re-Detection Staple Algorithm. Remote Sens. 2024, 16, 1768.
- Gao, Z.; Zhuang, Y.; Gu, J.; Yang, B.; Nie, Z. A joint local–global search mechanism for long-term tracking with dynamic memory network. Expert Syst. Appl. 2023, 223, 119890.
- Li, G.; Nai, K. Robust tracking via coarse-to-fine redetection and spatial-temporal reliability evaluation. Expert Syst. Appl. 2024, 256, 124927.
- Liu, C.; Zhao, J.; Bo, C.; Li, S.; Wang, D.; Lu, H. LGTrack: Exploiting Local and Global Properties for Robust Visual Tracking. IEEE Trans. Circuits Syst. Video Technol. 2024, 34, 8161–8171.
- Fan, B.; Cong, Y.; Du, Y. Discriminative multi-task objects tracking with active feature selection and drift correction. Pattern Recognit. 2014, 47, 3828–3840.
- Zhang, Y.; Gao, X.; Chen, Z.; Zhong, H.; Li, L.; Yan, C.; Shen, T. Learning salient features to prevent model drift for correlation tracking. Neurocomputing 2020, 418, 1–10.
- Kalman, R.E. A new approach to linear filtering and prediction problems. J. Basic Eng. 1960, 82, 35–45.
- Zhou, S.K.; Chellappa, R.; Moghaddam, B. Visual tracking and recognition using appearance-adaptive models in particle filters. IEEE Trans. Image Process. 2004, 13, 1491–1506.
- Comaniciu, D.; Meer, P. Mean shift: A robust approach toward feature space analysis. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 603–619.
- Bolme, D.S.; Beveridge, J.R.; Draper, B.A.; Lui, Y.M. Visual object tracking using adaptive correlation filters. In Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Francisco, CA, USA, 13–18 June 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 2544–2550.
- Henriques, J.F.; Caseiro, R.; Martins, P.; Batista, J. Exploiting the circulant structure of tracking-by-detection with kernels. In Proceedings of the Computer Vision–ECCV 2012: 12th European Conference on Computer Vision, Florence, Italy, 7–13 October 2012; Springer: Berlin/Heidelberg, Germany, 2012; pp. 702–715.
- Danelljan, M.; Häger, G.; Khan, F.; Felsberg, M. Accurate scale estimation for robust visual tracking. In Proceedings of the British Machine Vision Conference, Nottingham, UK, 1–5 September 2014; BMVA Press: Durham, UK, 2014.
- Henriques, J.F.; Caseiro, R.; Martins, P.; Batista, J. High-speed tracking with kernelized correlation filters. IEEE Trans. Pattern Anal. Mach. Intell. 2014, 37, 583–596.
- Danelljan, M.; Hager, G.; Shahbaz Khan, F.; Felsberg, M. Learning spatially regularized correlation filters for visual tracking. In Proceedings of the IEEE International Conference on Computer Vision, Santiago, Chile, 7–13 December 2015; pp. 4310–4318.
- Lukezic, A.; Vojir, T.; Cehovin Zajc, L.; Matas, J.; Kristan, M. Discriminative correlation filter with channel and spatial reliability. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 6309–6318.
- Kiani Galoogahi, H.; Fagg, A.; Lucey, S. Learning background-aware correlation filters for visual tracking. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 1135–1143.
- Li, F.; Tian, C.; Zuo, W.; Zhang, L.; Yang, M.H. Learning spatial-temporal regularized correlation filters for visual tracking. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 4904–4913.
- Wen, J.; Chu, H.; Lai, Z.; Xu, T.; Shen, L. Enhanced robust spatial feature selection and correlation filter learning for UAV tracking. Neural Netw. 2023, 161, 39–54.
- Zhang, J.; He, Y.; Wang, S. Learning adaptive sparse spatially-regularized correlation filters for visual tracking. IEEE Signal Process. Lett. 2023, 30, 11–15.
- Ma, S.; Zhao, B.; Hou, Z.; Yu, W.; Pu, L.; Yang, X. SOCF: A correlation filter for real-time UAV tracking based on spatial disturbance suppression and object saliency-aware. Expert Syst. Appl. 2024, 238, 122131.
- Xia, R.; Chen, Y.; Ren, B. Improved anti-occlusion object tracking algorithm using Unscented Rauch-Tung-Striebel smoother and kernel correlation filter. J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 6008–6018.
- Cui, S.; Wang, Y.; Wang, S.; Wang, R.; Wang, W.; Tan, M. Real-time perception and positioning for creature picking of an underwater vehicle. IEEE Trans. Veh. Technol. 2020, 69, 3783–3792.
- Lee, D.; Kim, G.; Kim, D.; Myung, H.; Choi, H.T. Vision-based object detection and tracking for autonomous navigation of underwater robots. Ocean Eng. 2012, 48, 59–68.
- Chuang, M.C.; Hwang, J.N.; Williams, K.; Towler, R. Tracking live fish from low-contrast and low-frame-rate stereo videos. IEEE Trans. Circuits Syst. Video Technol. 2014, 25, 167–179.
- Rout, D.K.; Subudhi, B.N.; Veerakumar, T.; Chaudhury, S. Walsh–Hadamard-kernel-based features in particle filter framework for underwater object tracking. IEEE Trans. Ind. Inform. 2019, 16, 5712–5722.
- Bhat, P.G.; Subudhi, B.N.; Veerakumar, T.; Laxmi, V.; Gaur, M.S. Multi-feature fusion in particle filter framework for visual tracking. IEEE Sens. J. 2019, 20, 2405–2415.
- Li, Y.; Wang, B.; Li, Y.; Liu, Z.; Huo, W.; Li, Y.; Cao, J. Underwater object tracker: UOSTrack for marine organism grasping of underwater vehicles. Ocean Eng. 2023, 285, 115449.
- He, J.; Chen, J.; Xu, H.; Yu, Y. SonarNet: Hybrid CNN-Transformer-HOG Framework and Multifeature Fusion Mechanism for Forward-Looking Sonar Image Segmentation. IEEE Trans. Geosci. Remote Sens. 2024, 62, 1–17.
- Li, W.; Chen, C.; Su, H.; Du, Q. Local Binary Patterns and Extreme Learning Machine for Hyperspectral Imagery Classification. IEEE Trans. Geosci. Remote Sens. 2015, 53, 3681–3693.
- Wang, M.; Liu, Y.; Huang, Z. Large margin object tracking with circulant feature maps. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 4021–4029.
- Liu, C.; Li, H.; Wang, S.; Zhu, M.; Wang, D.; Fan, X.; Wang, Z. A dataset and benchmark of underwater object detection for robot picking. In Proceedings of the 2021 IEEE International Conference on Multimedia & Expo Workshops (ICMEW), Shenzhen, China, 5–9 July 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–6.
- Fu, C.; Liu, R.; Fan, X.; Chen, P.; Fu, H.; Yuan, W.; Zhu, M.; Luo, Z. Rethinking general underwater object detection: Datasets, challenges, and solutions. Neurocomputing 2023, 517, 243–256.
- Zaidi, S.S.A.; Ansari, M.S.; Aslam, A. A survey of modern deep learning based object detection models. Digit. Signal Process. 2022, 126, 103514.
- Wang, C.Y.; Bochkovskiy, A.; Liao, H.Y.M. YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Vancouver, BC, Canada, 18–22 June 2023; pp. 7464–7475.
- Jocher, G.; Chaurasia, A.; Qiu, J. YOLO by Ultralytics. 2023. Available online: https://github.com/ultralytics/ultralytics (accessed on 1 June 2023).
- Rezatofighi, H.; Tsoi, N.; Gwak, J.; Sadeghian, A.; Reid, I.; Savarese, S. Generalized intersection over union: A metric and a loss for bounding box regression. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 658–666.
- Lee, M.F.R.; Chen, Y.C. Artificial intelligence based object detection and tracking for a small underwater robot. Processes 2023, 11, 312.
- Yue, W.; Xu, F.; Yang, J. Tracking-by-Detection Algorithm for Underwater Target Based on Improved Multi-Kernel Correlation Filter. Remote Sens. 2024, 16, 323.
| Network | AP (Echinus) | AP (Holothurian) | AP (Scallop) | AP (Starfish) | mAP |
|---|---|---|---|---|---|
| YOLOv5-n | 0.8651 | 0.6465 | 0.6116 | 0.7421 | 0.716 |
| YOLOv7-tiny | 0.8592 | 0.6639 | 0.6003 | 0.7074 | 0.708 |
| YOLOv8-n | 0.8891 | 0.6832 | 0.6237 | 0.7281 | 0.731 |
| Ours | 0.9002 | 0.7035 | 0.6355 | 0.7730 | 0.753 |
| Network | AP (Holothurian) | AP (Echinus) | AP (Scallop) | AP (Starfish) | AP (Fish) | AP (Corals) | AP (Diver) | AP (Cuttlefish) | AP (Turtle) | AP (Jellyfish) | mAP |
|---|---|---|---|---|---|---|---|---|---|---|---|
| YOLOv5-n | 0.61 | 0.89 | 0.84 | 0.86 | 0.68 | 0.62 | 0.73 | 0.65 | 0.80 | 0.91 | 0.759 |
| YOLOv7-tiny | 0.63 | 0.91 | 0.79 | 0.83 | 0.65 | 0.63 | 0.73 | 0.63 | 0.74 | 0.88 | 0.742 |
| YOLOv8-n | 0.68 | 0.95 | 0.93 | 0.82 | 0.71 | 0.72 | 0.71 | 0.68 | 0.82 | 0.93 | 0.795 |
| Ours | 0.71 | 0.96 | 0.92 | 0.86 | 0.75 | 0.70 | 0.75 | 0.70 | 0.85 | 0.94 | 0.813 |
| Tracker | Accuracy (Acc) | Success Rate (SR) | Frame Rate (FPS) |
|---|---|---|---|
| CF2 [2015] | 0.855 | 0.590 | 43 (GPU) |
| ECO [2017] | 0.850 | 0.632 | 50 (GPU) |
| BACF [2017] | 0.784 | 0.581 | 35 (CPU) |
| SiamFC [2016] | 0.776 | 0.579 | 58 (GPU) |
| CSR_DCF [2017] | 0.768 | 0.562 | 13 (CPU) |
| SRDCF [2018] | 0.759 | 0.572 | 5 (CPU) |
| KCF [2015] | 0.719 | 0.523 | 172 (CPU) |
| EFSCF [2023] | 0.823 | 0.637 | 18 (CPU) |
| AS2RCF [2023] | 0.826 | 0.599 | 20 (GPU) |
| OTFUR [2023] | 0.857 | 0.624 | 61 (GPU) |
| SOCF [2024] | 0.845 | 0.621 | 48 (GPU) |
| IMKCF [2024] | 0.861 | 0.629 | 16 (CPU) |
| Ours | 0.883 | 0.642 | 42 (CPU) |
| Model | Re-Detection | Multi-Layer Tracking Discriminator (MTD) | Adaptive Template Updates (ATUs) | Multi-Feature Fusion (MFF) | Acc | SR |
|---|---|---|---|---|---|---|
| Without re-detection | ✘ | ✔ | ✔ | ✔ | 0.802 | 0.568 |
| Without MTD | ✔ | ✘ | ✔ | ✔ | 0.821 | 0.617 |
| Without ATU | ✔ | ✔ | ✘ | ✔ | 0.856 | 0.629 |
| Without MFF | ✔ | ✔ | ✔ | ✘ | 0.834 | 0.590 |
| Ours | ✔ | ✔ | ✔ | ✔ | 0.883 | 0.642 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
He, J.; Yu, Y.; Wei, H.; Xu, H. ULOTrack: Underwater Long-Term Object Tracker for Marine Organism Capture. J. Mar. Sci. Eng. 2024, 12, 2092. https://doi.org/10.3390/jmse12112092