Abstract
For markerless indoor Augmented Reality Navigation (ARN), the camera pose is the fundamental quantity for both position and orientation estimation, and the floor plane serves as the essential fiducial target for image registration. This paper proposes optical-flow-scene indoor positioning and wall-floor-boundary image registration to make ARN more accurate, reliable, and responsive. Experimental results show that both methods achieve higher accuracy and lower latency than well-known conventional ARN methods. In addition, both methods are implemented on a handheld Android embedded platform and verified to work well on a handheld indoor augmented reality navigation device.
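As a rough illustration of the optical-flow side of such an approach, the sketch below estimates frame-to-frame image motion with pyramidal Lucas-Kanade tracking in OpenCV. It is a minimal sketch under assumed inputs (the sample frame files, feature-detector parameters, and the simple mean-displacement summary are illustrative choices), not the positioning algorithm described in the paper.

```python
# Minimal sketch: sparse Lucas-Kanade optical flow between two consecutive
# camera frames. Frame file names, parameters, and the mean-displacement
# summary are illustrative assumptions, not the paper's algorithm.
import cv2
import numpy as np

prev = cv2.imread("frame_prev.png", cv2.IMREAD_GRAYSCALE)  # assumed sample frames
curr = cv2.imread("frame_curr.png", cv2.IMREAD_GRAYSCALE)

# Detect corner features in the previous frame.
pts_prev = cv2.goodFeaturesToTrack(prev, maxCorners=200,
                                   qualityLevel=0.01, minDistance=8)

# Track the features into the current frame with pyramidal Lucas-Kanade.
pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, pts_prev, None)

# Keep successfully tracked points and average their displacement as a
# crude per-frame scene-motion estimate.
ok = status.flatten() == 1
flow = pts_curr.reshape(-1, 2)[ok] - pts_prev.reshape(-1, 2)[ok]
dx, dy = np.mean(flow, axis=0)
print(f"mean optical flow: dx={dx:.2f}px, dy={dy:.2f}px")
```

In an actual positioning pipeline, such per-frame displacements would be fused with the camera pose and the wall-floor-boundary registration to update the user's indoor position; that integration is beyond the scope of this sketch.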
Acknowledgments
This work was supported in part by the Ministry of Science and Technology, Taiwan, under Grant MOST 106-2221-E-224-053.
Copyright information
© 2019 Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Lin, WS., Ho, C.C. (2019). Markerless Indoor Augmented Reality Navigation Device Based on Optical-Flow-Scene Indoor Positioning and Wall-Floor-Boundary Image Registration. In: Chang, CY., Lin, CC., Lin, HH. (eds) New Trends in Computer Technologies and Applications. ICS 2018. Communications in Computer and Information Science, vol 1013. Springer, Singapore. https://doi.org/10.1007/978-981-13-9190-3_10
DOI: https://doi.org/10.1007/978-981-13-9190-3_10
Publisher Name: Springer, Singapore
Print ISBN: 978-981-13-9189-7
Online ISBN: 978-981-13-9190-3
eBook Packages: Computer Science (R0)