Abstract
Monocular Visual Simultaneous Localisation and Mapping (VSLAM) systems are widely used by intelligent mobile robots operating in unknown environments. However, complex and varying illumination significantly challenges the accuracy and robustness of VSLAM systems. Existing feature-based VSLAM methods often fail in such challenging illumination environments because too few feature points can be extracted. This paper therefore proposes an improved ORB-SLAM algorithm based on an adaptive FAST threshold and image enhancement (AFE-ORB-SLAM), designed for environments with complex lighting conditions. An improved truncated Adaptive Gamma Correction (AGC) is combined with unsharp masking to reduce the effects of varying illumination. Moreover, an improved ORB feature extraction method with an adaptive FAST threshold is proposed to obtain more reliable feature points. To verify the performance of AFE-ORB-SLAM, three public datasets are used: the extended Imperial College London and National University of Ireland Maynooth (ICL-NUIM) dataset with different lighting conditions, the Onboard Illumination Visual-Inertial Odometry (OIVIO) dataset, and the European Robotics Challenge (EuRoC) dataset. The results are compared with other state-of-the-art monocular VSLAM methods. The experiments demonstrate that AFE-ORB-SLAM achieves the highest average localisation accuracy and robust performance in environments with complex lighting conditions, while maintaining comparable performance in normal lighting scenarios.
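The enhancement stage described above (adaptive gamma correction followed by unsharp masking) can be sketched as follows. This is an illustrative simplification, not the paper's exact formulation: the gamma is derived from the image's mean brightness and truncated to a fixed range (the truncation bounds and the sharpening amount are assumed values for illustration), and a plain 3x3 box blur stands in for whatever smoothing kernel the authors use.

```python
import numpy as np

def enhance(gray, amount=1.0):
    """Illustrative sketch: mean-driven truncated gamma correction,
    then unsharp masking. Parameters are assumptions, not the paper's."""
    g = gray.astype(np.float64) / 255.0
    mean = np.clip(g.mean(), 1e-6, 1.0 - 1e-6)

    # Adaptive gamma: choose gamma so the mean intensity maps towards 0.5
    # (dark images get gamma < 1 and are brightened), then truncate it.
    gamma = np.log(0.5) / np.log(mean)
    gamma = np.clip(gamma, 0.5, 2.0)  # assumed truncation bounds
    corrected = g ** gamma

    # 3x3 box blur via shifted slices of an edge-padded image.
    p = np.pad(corrected, 1, mode="edge")
    blurred = sum(
        p[i:i + corrected.shape[0], j:j + corrected.shape[1]]
        for i in range(3) for j in range(3)
    ) / 9.0

    # Unsharp masking: add back the high-frequency residual.
    sharp = np.clip(corrected + amount * (corrected - blurred), 0.0, 1.0)
    return (sharp * 255.0).astype(np.uint8)
```

A uniformly dark frame passed through `enhance` comes out brighter (the gamma term) with its edges accentuated (the unsharp term), which is the property the abstract relies on: more intensity contrast for the FAST detector to find corners in low-light frames.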
Availability of data and materials
The datasets used are publicly available in:
- OIVIO dataset: https://arpg.github.io/oivio/
- ICL-NUIM dataset with simulated lighting changes: https://cvg.ethz.ch/research/illumination-change-robust-dslam/
- EuRoC dataset: https://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets
Code Availability
The code used to obtain the experimental results can be made available upon reasonable request.
Acknowledgements
The authors thank the Strathclyde NZTC robotics research team for their support, especially Dr Gordon Dobie, Dr Charles MacLeod, Mr Mark Robertson and Prof Xiutian Yan.
Funding
This work is supported in part by the UK Net Zero Technology Centre (NZTC) under the LOCUST research project (2019-2021, Grant No.: AI-P-028). It is also supported partially by the Royal Society under the MOEA/D-PPR research project (2022-2024, Grant No.: IECNSFC211434). Mr. Leijian Yu is funded by the China Scholarship Council and the International Fees Only Studentship from the University of Strathclyde (2018-2021).
Author information
Authors and Affiliations
Contributions
Idea conception, L.Y.; project supervision, E.Y.; formal analysis, L.Y., E.Y. and B.Y.; original draft writing, L.Y.; review and editing, L.Y., E.Y. and B.Y.
Corresponding author
Ethics declarations
Conflict of Interests
The authors declare that they have no conflict of interest.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Yu, L., Yang, E. & Yang, B. AFE-ORB-SLAM: Robust Monocular VSLAM Based on Adaptive FAST Threshold and Image Enhancement for Complex Lighting Environments. J Intell Robot Syst 105, 26 (2022). https://doi.org/10.1007/s10846-022-01645-w