The Indoor Localization of a Mobile Platform Based on Monocular Vision and Coding Images
List of Figures

- Figure 1: Technical flowchart.
- Figure 2: (a) The coding graphic example. (b) The template graphic.
- Figure 3: Flowchart of coding graphics identification and localization.
- Figure 4: Coding image.
- Figure 5: Contour matching results.
- Figure 6: Interference contour culling results.
- Figure 7: Coding graphic centroid.
- Figure 8: The ρ and φ functions of Tukey.
- Figure 9: Experiment images. (a–d) show the images from group 1 to group 4.
- Figure 10: Error and weight of the observation values.
- Figure 11: (a) Relationship between v/m0 and weight P, with v less than m0. (b) Relationship between v/m0 and weight P, with v greater than m0.
- Figure 12: A panoramic photo of the environment of experiment 3.
- Figure 13: The moving trajectory of the platform and the accuracy comparison between the two methods.
- Figure 14: Experimental environment.
- Figure 15: The vehicle trajectory calculated by the Tukey weight method.
- Figure 16: Difference between the coordinates of the orbit and the calculation results.
- Figure 17: The cumulative distribution function (CDF) curve of the residual errors.
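Figures 8, 10 and 11 concern Tukey's ρ and φ functions and the relationship between the normalized residual v/m0 and the weight P. As an illustration only, the standard Tukey biweight weight function can be sketched as below; the cutoff constant `c` and the exact functional form used in the paper are assumptions here, not taken from this excerpt.

```python
def tukey_weight(v: float, m0: float, c: float = 4.685) -> float:
    """Standard Tukey biweight: weight P for a residual v, given the
    unit-weight error m0. c = 4.685 is the conventional cutoff constant
    (~95% Gaussian efficiency); the paper's own cutoff may differ."""
    u = abs(v) / (c * m0)
    if u >= 1.0:
        # Residual beyond the cutoff: the observation is rejected outright.
        return 0.0
    # Inside the cutoff: smooth down-weighting from 1 (at v = 0) toward 0.
    return (1.0 - u * u) ** 2
```

Consistent with the behavior shown in Figure 11, small residuals (v ≪ m0) keep weights near 1, while large residuals are driven smoothly to 0 and then cut off entirely.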
Abstract
1. Introduction
2. Materials and Methods
2.1. Coding Graphic Localization
2.1.1. Coding Graphic Design
2.1.2. Coding Graphic Identification and Localization
2.2. Position Calculation Methods
2.2.1. A Method Based on Unit Weight
2.2.2. A Method Based on Tukey Weight
3. Experiments and Discussion
3.1. Experiment 1
3.2. Experiment 2
3.3. Experiment 3
4. Conclusions and Outlook
Author Contributions
Funding
Conflicts of Interest
References
| Coding Sequence | Image x (mm) | Image y (mm) | World X (mm) | World Y (mm) | World Z (mm) |
|---|---|---|---|---|---|
| 001111111111111111110000 | −7.63765 | 4.82016 | 6354 | 9630 | 8447 |
| 110000000111000111000001 | 9.37556 | 2.77728 | 9383 | 9407 | 7701 |
| … | … | … | … | … | … |
| 000011111111100000000000 | −0.66985 | −5.67835 | 7344 | 7807 | 7947 |
| Group | Results | X [mm] | Y [mm] | Z [mm] | ex [mm] | ey [mm] | ez [mm] |
|---|---|---|---|---|---|---|---|
| 1 | Method 1 | 6932.40 | 9410.20 | 12062.00 | −78.60 | −23.80 | 48.00 |
| | Method 2 | 6957.00 | 9416.00 | 12048.00 | −54.00 | −18.00 | 34.00 |
| | True value | 7011.00 | 9434.00 | 12014.00 | exy increase 30.69% | | ez increase 29.17% |
| 2 | Method 1 | 6991.10 | 9246.10 | 12189.00 | 79.10 | −28.90 | −12.00 |
| | Method 2 | 6876.00 | 9235.90 | 12206.00 | −36.00 | −39.10 | 5.00 |
| | True value | 6912.00 | 9275.00 | 12201.00 | exy increase 36.89% | | ez increase 58.33% |
| 3 | Method 1 | 7436.20 | 9432.00 | 12041.00 | −9.80 | −37.00 | −27.00 |
| | Method 2 | 7435.10 | 9485.00 | 12075.00 | −10.90 | 16.00 | 7.00 |
| | True value | 7446.00 | 9469.00 | 12068.00 | exy increase 49.42% | | ez increase 74.07% |
| 4 | Method 1 | 8506.30 | 9533.40 | 15264.00 | 90.30 | −52.60 | 16.00 |
| | Method 2 | 8342.60 | 9612.50 | 15242.00 | −73.40 | 26.50 | −6.00 |
| | True value | 8416.00 | 9586.00 | 15248.00 | exy increase 29.76% | | ez increase 62.50% |
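The quoted percentages can be checked against the per-axis errors. Assuming (this formula is inferred, not stated in this excerpt) that the plane error is exy = sqrt(ex² + ey²) and the improvement is 1 − exy(Method 2)/exy(Method 1), group 1 reproduces the quoted 30.69% and 29.17%:

```python
import math

def plane_error(ex: float, ey: float) -> float:
    """Horizontal (plane) error combined from the per-axis errors."""
    return math.hypot(ex, ey)

def improvement(e1: float, e2: float) -> float:
    """Relative accuracy improvement of Method 2 over Method 1, in percent."""
    return (1.0 - abs(e2) / abs(e1)) * 100.0

# Group 1 errors taken from the table above.
exy1 = plane_error(-78.60, -23.80)  # Method 1 plane error
exy2 = plane_error(-54.00, -18.00)  # Method 2 plane error
print(round(improvement(exy1, exy2), 2))   # plane: 30.69
print(round(improvement(48.00, 34.00), 2)) # elevation: 29.17
```

The same computation reproduces the plane and elevation figures for groups 2 and 3 as well.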
| Station | Results | X [mm] | Y [mm] | Z [mm] | Accuracy Improvement in Plane | Accuracy Improvement in Elevation |
|---|---|---|---|---|---|---|
| 1 | Method 1 | 2073.00 | −1489.00 | −1313.70 | Increase 59.81% | Increase 71.05% |
| | Method 2 | 2141.30 | −1275.30 | −1353.70 | | |
| | True value | 2120.00 | −1336.00 | −1370.00 | | |
| 2 | Method 1 | 2767.20 | −1339.00 | −860.03 | Increase 17.40% | Increase 25.98% |
| | Method 2 | 2761.70 | −1336.80 | −857.42 | | |
| | True value | 2735.00 | −1336.00 | −850.00 | | |
| 3 | Method 1 | 1550.40 | −1320.10 | −854.52 | Increase 15.96% | Increase 18.45% |
| | Method 2 | 1546.60 | −1325.90 | −853.69 | | |
| | True value | 1518.00 | −1336.00 | −850.00 | | |
| 4 | Method 1 | 310.14 | −1359.40 | −877.33 | Increase 21.97% | Increase 43.55% |
| | Method 2 | 307.82 | −1354.30 | −865.42 | | |
| | True value | 300.00 | −1336.00 | −850.00 | | |
| 5 | Method 1 | −835.93 | −1320.50 | −867.53 | Increase 39.08% | Increase 53.26% |
| | Method 2 | −869.97 | −1313.70 | −858.19 | | |
| | True value | −911.00 | −1336.00 | −850.00 | | |
| 6 | Method 1 | −2115.20 | −1270.30 | −865.79 | Increase 51.02% | Increase 36.42% |
| | Method 2 | −2095.80 | −1323.70 | −839.96 | | |
| | True value | −2126.00 | −1336.00 | −850.00 | | |
| 7 | Method 1 | −3364.50 | −1229.40 | −861.68 | Increase 66.34% | Increase 52.13% |
| | Method 2 | −3311.50 | −1314.10 | −855.59 | | |
| | True value | −3341.00 | −1336.00 | −850.00 | | |
| 8 | Method 1 | −3348.20 | −1303.40 | −881.37 | Increase 65.53% | Increase 9.40% |
| | Method 2 | −3344.90 | −1346.50 | −808.24 | | |
| | True value | −3340.00 | −1336.00 | −843.00 | | |
| 9 | Method 1 | −3283.90 | −1378.00 | −1699.10 | Increase 21.78% | Increase 53.75% |
| | Method 2 | −3306.00 | −1379.00 | −1603.90 | | |
| | True value | −3340.00 | −1336.00 | −1522.00 | | |
| 10 | Method 1 | −3336.80 | −1429.00 | −1435.10 | Increase 50.03% | Increase 58.80% |
| | Method 2 | −3350.90 | −1290.80 | −1557.80 | | |
| | True value | −3340.00 | −1336.00 | −1522.00 | | |
| 11 | Method 1 | −3567.90 | −1407.90 | −844.55 | Increase 31.74% | Increase 58.88% |
| | Method 2 | −3579.00 | −1363.60 | −847.76 | | |
| | True value | −3646.00 | −1336.00 | −850.00 | | |
| 12 | Method 1 | −3363.80 | −1259.40 | 471.51 | Increase 65.35% | Increase 42.39% |
| | Method 2 | −3359.80 | −1355.50 | 485.28 | | |
| | True value | −3340.00 | −1336.00 | 504.00 | | |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Liu, F.; Zhang, J.; Wang, J.; Li, B. The Indoor Localization of a Mobile Platform Based on Monocular Vision and Coding Images. ISPRS Int. J. Geo-Inf. 2020, 9, 122. https://doi.org/10.3390/ijgi9020122