Application of the Infrared Thermography and Unmanned Ground Vehicle for Rescue Action Support in Underground Mine—The AMICOS Project
Figure 1. Mobile platform.
Figure 2. Mobile platform control system.
Figure 3. Vision sensors on the rotation module.
Figure 4. Robot equipped with the sensory system and lighting.
Figure 5. Block diagram of the sensory system.
Figure 6. The map of the Złoty Stok ore mining area (from [38]). Legend: 1—mining area, 2—limestones, 3—ore nests, 4—mining waste heaps, 5—slants, galleries and adits, 6—shafts.
Figure 7. Scheme of the old arsenic and gold mine in Złoty Stok (based on the figure from geoportal.pgi.gov.pl).
Figure 8. Exemplary scenarios of the experiments.
Figure 9. The scheme of the presented procedure.
Figure 10. Human in standing position—correct detection with the You Only Look Once (YOLO) algorithm.
Figure 11. Human in standing position—correct detection with the Histogram of Oriented Gradients (HOG) algorithm only in the infrared (IR) image.
Figure 12. Correct detection in both images with the YOLO algorithm.
Figure 13. Correct detection in the RGB image and doubled detection in the IR image with the HOG algorithm.
Figure 14. Human in squatting position—correct detection with the YOLO algorithm.
Figure 15. Human in squatting position—no detection in the RGB image, correct detection in the IR image with the HOG algorithm.
Figure 16. Human obscured—no detection in the RGB image, correct detection in the IR image with the YOLO algorithm.
Figure 17. Human obscured—no detection in the RGB image, correct detection in the IR image with the HOG algorithm.
Figure 18. Human lying—no detection in the RGB image, correct detection in the IR image with the YOLO algorithm.
Figure 19. Human lying—correct (doubled) detection in the RGB image, false detection in the IR image with the HOG algorithm.
Figure 20. Human lying—correct detection with the YOLO algorithm.
Figure 21. Human lying—correct (doubled) detection in the RGB image, incomplete detection in the IR image with the HOG algorithm.
Figure 22. Human standing, rear view—no detection with the YOLO algorithm.
Figure 23. Correct human detection with the YOLO algorithm after data correction (see Figure 22).
Figure 24. Human obscured—wrong detection in the RGB image with the HOG algorithm.
Abstract
1. Introduction
2. State-of-the-Art
3. UGV Platform
3.1. UGV Platform—A Brief Description
3.2. Control System
3.3. Sensory System
4. Use Case Description
4.1. The “Złoty Stok” Gold and Arsenic Underground Historic Mine
4.2. The Scenarios of Experiments
- person standing (front and rear views),
- person squatting (crouching),
- person obscured,
- person lying.
5. Inspection Data Processing for Human Detection
5.1. General Concept
5.2. HOG Algorithm
5.3. YOLO Algorithm
5.4. Decision-Making
- If the procedure recognizes the detected object as a human.
- If the procedure recognizes the detected object as probably a human.
- If the result of the procedure cannot be classified. The results are then transmitted to the remote operator, who makes the decision.
- If the procedure did not detect any object. (A minimal sketch of one possible fusion rule follows this list.)
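The paper defines these four output classes but does not spell out the exact fusion rule. A minimal sketch of one plausible implementation, assuming a boolean detection flag per detector/image pair and an illustrative vote threshold (the function name `fuse_detections` and the thresholds are assumptions, not the authors' method), could look as follows:

```python
from enum import Enum, auto

class Decision(Enum):
    HUMAN = auto()           # detected object recognized as a human
    PROBABLE_HUMAN = auto()  # detected object recognized as probably a human
    UNCLASSIFIED = auto()    # ambiguous result; refer to the remote operator
    NO_OBJECT = auto()       # no object detected

def fuse_detections(yolo_ir: bool, yolo_rgb: bool,
                    hog_ir: bool, hog_rgb: bool) -> Decision:
    """Hypothetical majority-vote fusion of the four detector/image pairs.

    The vote thresholds below are illustrative assumptions; the paper
    only defines the four output classes, not the exact rule.
    """
    votes = sum([yolo_ir, yolo_rgb, hog_ir, hog_rgb])
    if votes >= 3:
        return Decision.HUMAN            # strong cross-detector agreement
    if votes == 2:
        return Decision.PROBABLE_HUMAN   # partial agreement
    if votes == 1:
        return Decision.UNCLASSIFIED     # single unconfirmed detection
    return Decision.NO_OBJECT
```

In practice the operator would also receive the candidate regions of interest (ROIs) themselves, since, as the results below show, a positive flag can still be a false detection.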
6. Results
- Human standing in the front view. Figures 10 and 11 present the detection results for a person standing and facing the camera. Both algorithms correctly detected the standing person in the IR image. With the RGB image, however, only the YOLO algorithm was successful; HOG was unable to detect the human in this case.
- Human squatting. Figures 14 and 15 show the detection results for a squatting person. As in the first case, both techniques succeeded on the IR images, while detection on the RGB image was correct only for the YOLOv3 algorithm.
- Human partially obscured. Figures 16 and 17 show the detection results for a partially obscured person. Both techniques succeeded on the IR images, but neither detected the human in the RGB image. This is one of the reasons for using cameras of different types in parallel.
- Human lying—case 1. Figures 18 and 19 present the detection of a lying person (case 1). On the IR image, only the YOLO algorithm detected the human; HOG produced a false detection, generating an ROI that did not contain a human. Notably, the situation was reversed for the RGB image: YOLO did not detect the human, while HOG identified two ROIs, both containing the human.
- Human lying—case 2. Figures 20 and 21 show the detection results for a lying person (case 2). Both algorithms ultimately identified correct ROIs in both image types (RGB and IR). These results need a caveat, however: the algorithms were trained to detect humans in the most typical (standing) position and initially failed on the lying person. To overcome this, the methodology presented in this paper rotates the image by 90°, 180°, and 270°; after this transformation the human could be detected (see the sketch after this list). The HOG algorithm located the human, although the identified ROI did not fully cover the object.
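The rotation step described above is straightforward to reproduce. The sketch below uses OpenCV's stock HOG pedestrian detector as a stand-in for the detectors used in the paper; the `winStride` value and the input file name `frame.png` are illustrative assumptions:

```python
import cv2

# Stock OpenCV HOG + linear SVM pedestrian detector (trained on upright people).
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

# Rotations applied before re-running detection, as in the methodology.
ROTATIONS = {
    0: None,
    90: cv2.ROTATE_90_CLOCKWISE,
    180: cv2.ROTATE_180,
    270: cv2.ROTATE_90_COUNTERCLOCKWISE,
}

def detect_with_rotations(image):
    """Run the detector on the frame and its 90/180/270 degree rotations.

    A detector trained on upright pedestrians can then pick up a lying
    person in the 90- or 270-degree rotated frame.
    """
    hits = []
    for angle, code in ROTATIONS.items():
        frame = image if code is None else cv2.rotate(image, code)
        boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
        if len(boxes) > 0:
            hits.append((angle, boxes))
    return hits

if __name__ == "__main__":
    img = cv2.imread("frame.png")  # hypothetical input frame
    for angle, boxes in detect_with_rotations(img):
        print(f"rotation {angle} deg: {len(boxes)} candidate ROI(s)")
```

Any box found in a rotated frame must still be mapped back to the original image coordinates before being drawn or reported.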
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Polish State Mining Authority—Ventilation-Climatic Hazard. Available online: http://nd.wug.gov.pl/download/5712.pdf (accessed on 5 November 2020).
2. Mori, M.; Tanaka, J.; Suzumori, K.; Kanda, T. Field test for verifying the capability of two high-powered hydraulic small robots for rescue operations. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 3492–3497.
3. Guarnieri, M.; Kurazume, R.; Masuda, H.; Inoh, T.; Takita, K.; Debenest, P.; Hodoshima, R.; Fukushima, E.; Hirose, S. HELIOS system: A team of tracked robots for special urban search and rescue operations. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 10–15 October 2009; pp. 2795–2800.
4. Murphy, R.R.; Stover, S. Rescue robots for mudslides: A descriptive study of the 2005 La Conchita mudslide response. J. Field Robot. 2008, 25, 3–16.
5. Murphy, R.R.; Kravitz, J.; Stover, S.L.; Shoureshi, R. Mobile robots in mine rescue and recovery. IEEE Robot. Autom. Mag. 2009, 16, 91–103.
6. Redmon, J.; Farhadi, A. YOLO9000: Better, faster, stronger. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 7263–7271.
7. Redmon, J.; Farhadi, A. YOLOv3: An Incremental Improvement. CoRR 2018, abs/1804.02767.
8. McConnell, R.K. Method of and Apparatus for Pattern Recognition. U.S. Patent 4567610, 28 January 1986.
9. Liu, J. Current Research, Key Performances and Future Development of Search and Rescue Robot. Chin. J. Mech. Eng. 2006, 42.
10. Qian, S.H.; Ge, S.R.; Wang, Y.S.; Wang, Y.; Liu, C.Q. Research status of the disaster rescue robot and its applications to the mine rescue. Robot 2006, 28, 350–354.
11. Murphy, R.R. Human-robot interaction in rescue robotics. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 2004, 34, 138–153.
12. Thrun, S.; Thayer, S.; Whittaker, W.; Baker, C.; Burgard, W.; Ferguson, D.; Hähnel, D.; Montemerlo, M.; Morris, A.; Omohundro, Z.; et al. Autonomous Exploration and Mapping of Abandoned Mines. IEEE Robot. Autom. Mag. 2005.
13. Guo, T.; Zheng, C.; Du, Z.; Xu, F. Research on obstacles avoidance technique of mine exploration and rescue robot based on fuzzy reasoning. In Proceedings of the 2009 Sixth International Conference on Fuzzy Systems and Knowledge Discovery, Tianjin, China, 14–16 August 2009; Volume 6, pp. 396–399.
14. Gangfeng, L.; Lei, Z.; Zhenfeng, H.; Jie, Z. Distribution and communication of multi-robot system for detection in the underground mine disasters. In Proceedings of the 2009 IEEE International Conference on Robotics and Biomimetics (ROBIO), Guilin, China, 18–22 December 2009; pp. 1439–1444.
15. Green, J. Underground mining robot: A CSIR project. In Proceedings of the 2012 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), College Station, TX, USA, 5–8 November 2012; pp. 1–6.
16. Tian, Z.; Zhang, L.; Chen, W. Improved algorithm for navigation of rescue robots in underground mines. Comput. Electr. Eng. 2013, 39, 1088–1094.
17. Alla, H.; Kalyan, B.; Murthy, C. Mine Rescue Robot System—A Review. Procedia Earth Planet. Sci. 2015, 11, 457–462.
18. Moczulski, W.; Przystałka, P.; Sikora, M.; Zimroz, R. Modern ICT and mechatronic systems in contemporary mining industry. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); 9920 LNAI; Springer: Berlin/Heidelberg, Germany, 2016; pp. 33–42.
19. Wang, Y.; Tian, P.; Zhou, Y.; Chen, Q. The Encountered Problems and Solutions in the Development of Coal Mine Rescue Robot. J. Robot. 2018, 2018, 8471503.
20. Szrek, J.; Wodecki, J.; Blazej, R.; Zimroz, R. An Inspection Robot for Belt Conveyor Maintenance in Underground Mine—Infrared Thermography for Overheated Idlers Detection. Appl. Sci. 2020, 10, 4984.
21. Zimroz, R.; Hutter, M.; Mistry, M.; Stefaniak, P.; Walas, K.; Wodecki, J. Why Should Inspection Robots be used in Deep Underground Mines? In Proceedings of the 27th International Symposium on Mine Planning and Equipment Selection—MPES 2018, Santiago, Chile, 20–22 November 2018; Widzyk-Capehart, E., Hekmat, A., Singhal, R., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 497–507.
22. Szrek, J.; Wojtowicz, P. Idea of wheel-legged robot and its control system design. Bull. Pol. Acad. Sci. Tech. Sci. 2010, 58, 43–50.
23. Green, J. Mine rescue robots requirements: Outcomes from an industry workshop. In Proceedings of the 2013 6th Robotics and Mechatronics Conference (RobMech), Durban, South Africa, 30–31 October 2013; pp. 111–116.
24. Mansouri, S.S.; Kanellakis, C.; Georgoulas, G.; Nikolakopoulos, G. Towards MAV Navigation in Underground Mine Using Deep Learning. In Proceedings of the 2018 IEEE International Conference on Robotics and Biomimetics (ROBIO), Kuala Lumpur, Malaysia, 12–15 December 2018; pp. 880–885.
25. Kim, D.; Lee, K. Segment-based region of interest generation for pedestrian detection in far-infrared images. Infrared Phys. Technol. 2013, 61, 120–128.
26. Li, J.; Gong, W.; Li, W.; Liu, X. Robust pedestrian detection in thermal infrared imagery using the wavelet transform. Infrared Phys. Technol. 2010, 53, 267–273.
27. Qi, B.; John, V.; Liu, Z.; Mita, S. Pedestrian detection from thermal images with a scattered difference of directional gradients feature descriptor. In Proceedings of the 17th International IEEE Conference on Intelligent Transportation Systems (ITSC), Qingdao, China, 8–11 October 2014; pp. 2168–2173.
28. Xu, F.; Liu, X.; Fujimura, K. Pedestrian detection and tracking with night vision. IEEE Trans. Intell. Transp. Syst. 2005, 6, 63–71.
29. Navarro-Serment, L.E.; Mertz, C.; Vandapel, N.; Hebert, M. LADAR-Based Pedestrian Detection and Tracking; Carnegie Mellon University: Pittsburgh, PA, USA, 2008.
30. Bertozzi, M.; Broggi, A.; Del Rose, M.; Felisa, M.; Rakotomamonjy, A.; Suard, F. A pedestrian detector using histograms of oriented gradients and a support vector machine classifier. In Proceedings of the 2007 IEEE Intelligent Transportation Systems Conference, Seattle, WA, USA, 30 September–3 October 2007; pp. 143–148.
31. Bajracharya, M.; Moghaddam, B.; Howard, A.; Matthies, L.H. Detecting personnel around UGVs using stereo vision. In Proceedings of Unmanned Systems Technology X, International Society for Optics and Photonics, Orlando, FL, USA, 16 April 2008; Volume 6962, p. 696202.
32. Bajracharya, M.; Moghaddam, B.; Howard, A.; Brennan, S.; Matthies, L.H. A fast stereo-based system for detecting and tracking pedestrians from a moving vehicle. Int. J. Robot. Res. 2009, 28, 1466–1485.
33. Rankin, A.; Huertas, A.; Matthies, L.; Bajracharya, M.; Assad, C.; Brennan, S.; Bellutta, P.; Sherwin, G.W. Unmanned ground vehicle perception using thermal infrared cameras. In Proceedings of Unmanned Systems Technology XIII, International Society for Optics and Photonics, Orlando, FL, USA, 27–29 April 2011; Volume 8045, p. 804503.
34. Xu, Y.; Xu, D.; Lin, S.; Han, T.X.; Cao, X.; Li, X. Detection of sudden pedestrian crossings for driving assistance systems. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2011, 42, 729–739.
35. Wang, G.; Liu, Q. Far-infrared based pedestrian detection for driver-assistance systems based on candidate filters, gradient-based feature and multi-frame approval matching. Sensors 2015, 15, 32188–32212.
36. Heo, D.; Lee, E.; Ko, B.C. Pedestrian detection at night using deep neural networks and saliency maps. Electron. Imaging 2018, 2018, 060403-1.
37. Chen, Y.; Shin, H. Pedestrian detection at night in infrared images using an attention-guided encoder-decoder convolutional neural network. Appl. Sci. 2020, 10, 809.
38. Muszer, A. Gold at Złoty Stok–history, exploitation, characteristic and perspectives. In Gold in Poland; Kozłowski, A., Mikulski, S.Z., Eds.; Arch. Mineral. Monogr. 2011, 2, 45–61.
39. Głowacki, T. Przebitka umożliwiająca turystyczne wykorzystanie Sztolni Czarnej w byłej kopalni złota w Złotym Stoku [A crosscut enabling the tourist use of the Czarna Adit in the former gold mine in Złoty Stok]. In WUG: Bezpieczeństwo Pracy i Ochrona Środowiska w Górnictwie; Wyższy Urząd Górniczy: Katowice, Poland, 2007; pp. 17–18.
40. OpenCV. Available online: https://opencv.org/ (accessed on 23 August 2020).
41. Sidla, O.; Rosner, M. HOG pedestrian detection applied to scenes with heavy occlusion. Proc. SPIE Int. Soc. Opt. Eng. 2007.
42. Lee, J.; Lee, J.; Hong, H. Improved HOG Pedestrian Detection Means Using V-disparity Map. TECHART J. Arts Imaging Sci. 2016, 3, 39.
43. Li, Y.H.; Huang, S.S.; Lu, C.H.; Chang, F.C. Weighted HOG for Thermal Pedestrian Detection. In Proceedings of the 2018 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-TW), Taichung, Taiwan, 19–21 May 2018; pp. 1–2.
44. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1137–1149.
45. Miller, G.; Beckwith, R.; Fellbaum, C.; Gross, D.; Miller, K. Introduction to WordNet: An on-line lexical database. Int. J. Lexicogr. 1990, 3, 235–244.
46. Najafi Kajabad, E.; Ivanov, S. People Detection and Finding Attractive Areas by the use of Movement Detection Analysis and Deep Learning Approach. Procedia Comput. Sci. 2019, 156, 327–337.
|     | YOLO | HOG |
|-----|------|-----|
| IR  |      |     |
| RGB |      |     |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).