Integrated IMU with Faster R-CNN Aided Visual Measurements from IP Cameras for Indoor Positioning
<p><b>Figure 1.</b> The concept of the proposed inertial measurement unit (IMU)/monocular vision relatively measuring (MVRM) integrated system.</p>
<p><b>Figure 2.</b> Schematic view of fine-tuning Faster R-CNN (Region Convolutional Neural Network).</p>
<p><b>Figure 3.</b> The monocular vision-based ranging model.</p>
<p><b>Figure 4.</b> The monocular vision-based angulation model.</p>
<p><b>Figure 5.</b> The performance of the fine-tuned Faster R-CNN. (<b>a</b>) The precision-recall curve; (<b>b</b>) the cumulative distribution function (CDF) of IoU.</p>
<p><b>Figure 6.</b> A person captured in four phases. (<b>a</b>) Phase 1, markers 1–8; (<b>b</b>) phase 2, markers 9–12; (<b>c</b>) phase 3, markers 13–19; (<b>d</b>) phase 4, markers 20–23.</p>
<p><b>Figure 7.</b> The contour maps of measurement errors. (<b>a</b>) Ranging errors; (<b>b</b>) angulation errors.</p>
<p><b>Figure 8.</b> The cumulative distribution functions of errors for the four phases. (<b>a</b>) CDF of ranging errors; (<b>b</b>) CDF of angulation errors.</p>
<p><b>Figure 9.</b> Indoor obstacle environments. (<b>a</b>) Stationary obstacles; (<b>b</b>) pedestrian blockage.</p>
<p><b>Figure 10.</b> The effect of obstruction by various stationary obstacles. (<b>a</b>) Ranging error; (<b>b</b>) angulation error.</p>
<p><b>Figure 11.</b> The effect of obstruction under five overlap ratios between the target and the pedestrian. (<b>a</b>) Ranging error; (<b>b</b>) angulation error.</p>
<p><b>Figure 12.</b> The experimental setup in the test field. (<b>a</b>) The carried tag; (<b>b</b>) the setup of the tested indoor positioning system.</p>
<p><b>Figure 13.</b> The horizontal trajectory in the armchair scenario. (<b>a</b>) The armchair blocked the camera's view of the test person at marker 10, and the person was not detected; (<b>b</b>) the armchair blocked the camera's view of the test person near marker 22, and the person was not detected.</p>
<p><b>Figure 14.</b> The horizontal trajectory in the cabinet scenario. (<b>a</b>) The cabinet blocked the camera's view of the test person at marker 10, and the person was not detected; (<b>b</b>) the cabinet slightly blocked the camera's view of the test person near marker 21; the person was detected, but with image-height errors.</p>
<p><b>Figure 15.</b> The horizontal trajectory in the stool scenario. (<b>a</b>) The stool slightly blocked the camera's view of the test person at marker 10; the person was detected, but with image-height errors; (<b>b</b>) the stool did not block the camera's view of the test person near marker 21, and the person was detected precisely.</p>
<p><b>Figure 16.</b> The horizontal trajectory in the garbage can scenario. (<b>a</b>) The garbage can slightly blocked the camera's view of the test person at marker 10; the person was detected, but with small image-height errors; (<b>b</b>) the garbage can did not block the camera's view of the test person near marker 21, and the person was detected precisely.</p>
<p><b>Figure 17.</b> The horizontal trajectory in the pedestrian scenario. (<b>a</b>) The pedestrian blocked the camera's view of the test person at marker 10, and the person was not detected; (<b>b</b>) the pedestrian slightly blocked the camera's view of the test person near marker 21; the person was detected, but with larger image-height errors.</p>
Abstract
1. Introduction
2. System Overview
3. Methods
3.1. Faster R-CNN Based Object Detection
3.2. Monocular Vision-Based Relatively Measuring Method
3.2.1. Ranging Model
3.2.2. Angulation Model
3.3. IMU/MVRM Integrated System
3.3.1. Dynamical Model
3.3.2. Observation Model
4. Tests and Results
4.1. Experiments Preparations
4.2. Performance Evaluation
4.2.1. The Fine-Tuned Faster R-CNN
4.2.2. Analysis of Ranging and Angulation Model in Obstacle-Free Environments
4.2.3. Analysis of Ranging and Angulation Model in Obstacle Environments
Case 1: Stationary Obstacles
Case 2: Pedestrian Blockage
4.2.4. Positioning Results and Analysis
5. Conclusions and Future Work
Author Contributions
Funding
Conflicts of Interest
References
| Phases | | P1 | P2 | P3 | P4 |
|---|---|---|---|---|---|
| By Ranging/m | RMSE | 0.33 | 0.14 | 0.29 | 0.42 |
| | STD | 0.26 | 0.10 | 0.28 | 0.39 |
| | Max. | 1.01 | 0.33 | 1.07 | 1.35 |
| By Heading/m | RMSE | 0.03 | 0.02 | 0.03 | 0.04 |
| | STD | 0.01 | 0.01 | 0.02 | 0.04 |
| | Max. | 0.07 | 0.07 | 0.09 | 0.14 |
| Blockage Categories | | Armchair | Cabinet | Stool | Garbage | Pedestrian (5%) | Pedestrian (30%) | Pedestrian (50%) | Pedestrian (70%) | Pedestrian (90%) |
|---|---|---|---|---|---|---|---|---|---|---|
| Range/m | RMSE | 0.08 | 1.45 | 0.53 | 0.64 | 0.10 | 0.61 | 0.59 | 1.36 | 1.53 |
| | STD | 0.08 | 0.23 | 0.52 | 0.15 | 0.09 | 0.34 | 0.50 | 0.11 | 0.24 |
| | Max. | 0.17 | 1.70 | 1.12 | 0.82 | 0.15 | 0.88 | 1.57 | 1.73 | 1.62 |
| Heading | RMSE | 0.46 | 0.11 | 0.24 | 0.10 | 0.59 | 0.54 | 0.65 | 0.91 | 1.00 |
| | STD | 0.05 | 0.02 | 0.24 | 0.09 | 0.05 | 0.40 | 0.41 | 0.48 | 0.42 |
| | Max. | 0.50 | 0.13 | 0.73 | 0.12 | 0.70 | 1.00 | 1.09 | 1.25 | 1.60 |
| Detection Failure Rates | | 72% | 90% | 0 | 0 | 0 | 0 | 0 | 0 | 8% |
| Horizontal Errors/m | | Scenario 1 East | Scenario 1 North | Scenario 2 East | Scenario 2 North | Scenario 3 East | Scenario 3 North | Scenario 4 East | Scenario 4 North | Scenario 5 East | Scenario 5 North |
|---|---|---|---|---|---|---|---|---|---|---|---|
| UWB | RMSE | 0.16 | 0.12 | 0.20 | 0.10 | 0.35 | 0.14 | 0.13 | 0.33 | 0.15 | 0.40 |
| | STD | 0.09 | 0.05 | 0.10 | 0.04 | 0.32 | 0.11 | 0.10 | 0.22 | 0.10 | 0.20 |
| | Max. | 0.58 | 0.92 | 0.22 | 0.89 | 0.22 | 0.88 | 0.72 | 1.38 | 0.62 | 1.52 |
| MVRM | RMSE | 0.76 | 0.20 | 0.51 | 0.11 | 0.20 | 0.06 | 0.19 | 0.08 | 0.35 | 0.18 |
| | STD | 0.10 | 0.02 | 0.11 | 0.01 | 0.08 | 0.02 | 0.10 | 0.02 | 0.13 | 0.02 |
| | Max. | 3.56 | 0.92 | 1.49 | 0.77 | 0.60 | 0.11 | 0.59 | 0.13 | 3.70 | 0.17 |
| IMU/MVRM | RMSE | 0.24 | 0.07 | 0.12 | 0.07 | 0.20 | 0.06 | 0.18 | 0.08 | 0.17 | 0.12 |
| | STD | 0.08 | 0.01 | 0.07 | 0.01 | 0.04 | 0.01 | 0.05 | 0.02 | 0.08 | 0.03 |
| | Max. | 0.99 | 0.23 | 0.59 | 0.08 | 0.54 | 0.14 | 0.46 | 0.13 | 0.81 | 0.25 |
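The tables above report each error series with the same three statistics: RMSE, STD, and maximum. As a minimal sketch (the error samples below are made up for illustration, not taken from the paper), these statistics can be computed from raw error samples like so:

```python
import math

def error_stats(errors):
    """Compute the statistics used in the tables from signed error samples
    (e.g., ranging errors in metres): RMSE, standard deviation, and
    maximum absolute error."""
    n = len(errors)
    mean = sum(errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)          # root-mean-square error
    std = math.sqrt(sum((e - mean) ** 2 for e in errors) / n)  # population std. dev.
    max_abs = max(abs(e) for e in errors)                      # worst-case error
    return rmse, std, max_abs

# Hypothetical ranging-error samples in metres:
rmse, std, max_abs = error_stats([0.1, -0.2, 0.3, -0.1])
print(rmse, std, max_abs)
```

Note that RMSE and STD coincide only for zero-mean errors; a systematic bias (e.g., the image-height error introduced by partial occlusion) inflates RMSE relative to STD.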
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhang, L.; Zhou, T.; Lian, B. Integrated IMU with Faster R-CNN Aided Visual Measurements from IP Cameras for Indoor Positioning. Sensors 2018, 18, 3134. https://doi.org/10.3390/s18093134